Markov chain

noun Statistics.
a Markov process restricted to discrete random events or to discontinuous time sequences.
Also, Markoff chain.

1940–45; see Markov process
Based on the Random House Dictionary, © Random House, Inc. 2014.
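
To make the definition concrete, here is a minimal Python sketch of a discrete-time Markov chain. The two weather states and their transition probabilities are invented for illustration; they do not come from either dictionary entry.

    import random

    # Hypothetical two-state chain; states and probabilities are
    # made up purely for illustration.
    TRANSITIONS = {
        "sunny": [("sunny", 0.8), ("rainy", 0.2)],
        "rainy": [("sunny", 0.4), ("rainy", 0.6)],
    }

    def step(state):
        """Sample the next state; it depends only on the current state."""
        names, weights = zip(*TRANSITIONS[state])
        return random.choices(names, weights=weights)[0]

    def simulate(start, n):
        """Generate a chain of n states from the given start state."""
        chain = [start]
        for _ in range(n - 1):
            chain.append(step(chain[-1]))
        return chain

    print(simulate("sunny", 10))

Note that step looks only at the current state, never at earlier history; that restriction is exactly what the definitions above describe.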
World English Dictionary
Markov chain (ˈmɑːkɒf)
statistics a sequence of events, the probability of each of which depends only on the event immediately preceding it
[C20: named after Andrei Markov (1856–1922), Russian mathematician]

Collins English Dictionary - Complete & Unabridged 10th Edition
2009 © William Collins Sons & Co. Ltd. 1979, 1986 © HarperCollins
Publishers 1998, 2000, 2003, 2005, 2006, 2007, 2009
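
The Collins wording can also be read operationally: because each event depends only on the one immediately before it, the one-step transition probabilities of a chain can be estimated from adjacent pairs in an observed sequence alone. A minimal Python sketch, using a made-up observation sequence (the function name transition_estimates is hypothetical):

    from collections import Counter, defaultdict

    def transition_estimates(sequence):
        """Estimate P(next | current) from consecutive pairs only.

        Counting adjacent pairs suffices precisely because each event
        depends solely on the event immediately preceding it.
        """
        counts = defaultdict(Counter)
        for current, nxt in zip(sequence, sequence[1:]):
            counts[current][nxt] += 1
        return {
            state: {nxt: c / sum(nexts.values()) for nxt, c in nexts.items()}
            for state, nexts in counts.items()
        }

    # A made-up observation sequence, purely for illustration.
    observed = list("AABABBBABAABBB")
    print(transition_estimates(observed))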