
Markov chain

[mahr-kawf] /ˈmɑr kɔf/
noun, Statistics.
1.
a Markov process restricted to discrete random events or to discontinuous time sequences.
Also, Markoff chain.
Origin
1940-45; see Markov process
Dictionary.com Unabridged
Based on the Random House Dictionary, © Random House, Inc. 2014.
British Dictionary definitions for Markov chain

Markov chain

/ˈmɑːkɒf/
noun
1.
(statistics) a sequence of events the probability for each of which is dependent only on the event immediately preceding it
Word Origin
C20: named after Andrei Markov (1856–1922), Russian mathematician
Collins English Dictionary - Complete & Unabridged 2012 Digital Edition
© William Collins Sons & Co. Ltd. 1979, 1986 © HarperCollins
Publishers 1998, 2000, 2003, 2005, 2006, 2007, 2009, 2012
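The Collins definition above — each event's probability depends only on the event immediately preceding it — can be sketched in code. This is a minimal illustration, not from any of the dictionaries cited here; the two-state weather chain and its transition probabilities are invented for the example.

```python
import random

# Hypothetical two-state chain. Each row gives the distribution of the
# next state conditioned ONLY on the current state (the Markov property):
# no earlier history is consulted.
transitions = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def next_state(current, rng=random):
    """Draw the next state using only the current state's row."""
    states = list(transitions[current])
    weights = [transitions[current][s] for s in states]
    return rng.choices(states, weights=weights)[0]
```

Note that each row sums to 1, as the probabilities of all possible next events must.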
Markov chain in Technology

probability
(Named after Andrei Markov) A model of a sequence of events in which the probability of each event depends only on which event immediately preceded it.
A Markov process is governed by a Markov chain.
In simulation, the principle of the Markov chain is applied to the selection of samples from a probability density function to be applied to the model. Simscript II.5 uses this approach for some modelling functions.
(1995-02-23)

The Free On-line Dictionary of Computing, © Denis Howe 2010 http://foldoc.org
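The FOLDOC entry's point about simulation — selecting samples step by step according to the chain's probabilities — can be sketched as a random walk over a transition table. A minimal sketch, assuming a chain represented as a dict of per-state distributions; the function name and representation are illustrative, not from Simscript II.5 or any cited source.

```python
import random

def simulate(chain, start, steps, seed=None):
    """Walk a Markov chain for `steps` transitions.

    Each sample is drawn from the distribution attached to the current
    state alone, so the walk depends on no earlier history.
    """
    rng = random.Random(seed)  # seeded for a reproducible sample path
    state, path = start, [start]
    for _ in range(steps):
        candidates, weights = zip(*chain[state].items())
        state = rng.choices(candidates, weights=weights)[0]
        path.append(state)
    return path
```

For example, `simulate({"a": {"a": 0.5, "b": 0.5}, "b": {"a": 1.0}}, "a", 10)` returns an 11-element sample path beginning at `"a"`.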
