
Markov chain or Markoff chain
[mahr-kawf] /ˈmɑr kɔf/
noun, Statistics.
a Markov process restricted to discrete random events or to discontinuous time sequences.
Origin of Markov chain
1940-45; see Markov process
Dictionary.com Unabridged
Based on the Random House Dictionary, © Random House, Inc. 2015.
British Dictionary definitions for Markov chain

Markov chain

(statistics) a sequence of events the probability for each of which is dependent only on the event immediately preceding it
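Stated formally (a standard formulation added here for reference, not part of the Collins entry): for a sequence of random variables X_1, X_2, X_3, ..., the defining Markov property can be written in LaTeX notation as
P(X_{n+1} = x \mid X_n = x_n, \ldots, X_1 = x_1) = P(X_{n+1} = x \mid X_n = x_n),
that is, the distribution of the next state depends only on the current state, not on the rest of the history.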
Word Origin
C20: named after Andrei Markov (1856–1922), Russian mathematician
Collins English Dictionary - Complete & Unabridged 2012 Digital Edition
© William Collins Sons & Co. Ltd. 1979, 1986 © HarperCollins Publishers 1998, 2000, 2003, 2005, 2006, 2007, 2009, 2012
Markov chain in Technology

(Named after Andrei Markov) A model of sequences of events in which the probability of each event depends only on the event immediately preceding it.
A Markov process is governed by a Markov chain.
In simulation, the principle of the Markov chain is applied to the selection of samples from a probability density function to be applied to the model. Simscript II.5 uses this approach for some modelling functions.
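As an illustration of that sampling principle, here is a minimal sketch in Python (not Simscript II.5 code; the two-state "weather" chain and all names in it are hypothetical examples):

import random

# Hypothetical two-state chain. Each row of the transition table is the
# probability distribution of the next state given the current state.
transition = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def simulate(start, steps):
    """Draw a state sequence; each step depends only on the current state."""
    sequence = [start]
    current = start
    for _ in range(steps):
        row = transition[current]
        # Sample the next state from the row for the current state only.
        current = random.choices(list(row), weights=list(row.values()))[0]
        sequence.append(current)
    return sequence

print(simulate("sunny", 10))

Each draw uses only the current state's row of probabilities, which is the "memoryless" dependence the definitions above describe.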

The Free On-line Dictionary of Computing, © Denis Howe 2010
