Markov chain

[mahr-kawf]
noun Statistics.
a Markov process restricted to discrete random events or to discontinuous time sequences.
Also, Markoff chain.


Origin:
1940–45; see Markov process

Dictionary.com Unabridged
Based on the Random House Dictionary, © Random House, Inc. 2014.
Collins World English Dictionary
Markov chain (ˈmɑːkɒf)

n
statistics a sequence of events in which the probability of each event depends only on the event immediately preceding it

[C20: named after Andrei Markov (1856–1922), Russian mathematician]

Collins English Dictionary - Complete & Unabridged 10th Edition
2009 © William Collins Sons & Co. Ltd. 1979, 1986 © HarperCollins
Publishers 1998, 2000, 2003, 2005, 2006, 2007, 2009
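
To make the definition concrete, here is a minimal Python sketch of the Markov property: the next state is chosen using only the current state. The two weather states and their transition probabilities are illustrative assumptions, not part of the dictionary entries above.

```python
import random

# Illustrative transition probabilities (assumed for this sketch):
# each row lists the possible next states and their probabilities,
# conditioned only on the current state.
TRANSITIONS = {
    "sunny": [("sunny", 0.8), ("rainy", 0.2)],
    "rainy": [("sunny", 0.4), ("rainy", 0.6)],
}

def next_state(current):
    """Pick the next state using only the current state (the Markov property)."""
    states, weights = zip(*TRANSITIONS[current])
    return random.choices(states, weights=weights)[0]

# Walk the chain for ten steps from an arbitrary starting state.
state = "sunny"
chain = [state]
for _ in range(10):
    state = next_state(state)
    chain.append(state)
print(" -> ".join(chain))
```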
FOLDOC Computing Dictionary

Markov chain definition

probability
(Named after Andrei Markov) A model of sequences of events in which the probability of each event depends only on the event that immediately preceded it.
A Markov process is a random process governed by such a chain.
In simulation, the Markov-chain principle is applied to draw samples from a probability density function that drives the model; Simscript II.5 uses this approach for some modelling functions (see the sketch after this entry).
(1995-02-23)

The Free On-line Dictionary of Computing, © Denis Howe 2010 http://foldoc.org
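
The sampling idea mentioned in the FOLDOC entry can be sketched generically. What follows is a hedged illustration of a Metropolis-style random walk, a standard way of building a Markov chain whose long-run distribution matches a target density; the target density and proposal width are assumptions for the example, and this is not Simscript II.5 code.

```python
import math
import random

def target_density(x):
    # Unnormalized standard normal density (an assumption for this sketch).
    return math.exp(-0.5 * x * x)

def metropolis(n_samples, step=1.0, start=0.0):
    """Draw samples via a Markov chain whose stationary
    distribution is proportional to target_density."""
    x = start
    samples = []
    for _ in range(n_samples):
        proposal = x + random.uniform(-step, step)  # depends only on x
        # Accept the move with probability min(1, density ratio).
        if random.random() < target_density(proposal) / target_density(x):
            x = proposal
        samples.append(x)
    return samples

samples = metropolis(10_000)
print(sum(samples) / len(samples))  # sample mean; should be near 0 here
```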