Markov chain

[mahr-kawf]
noun Statistics.
a Markov process restricted to discrete random events or to discontinuous time sequences.
Also, Markoff chain.


Origin:
1940–45; see Markov process

Dictionary.com Unabridged
Based on the Random House Dictionary, © Random House, Inc. 2014.
Collins
World English Dictionary
Markov chain (ˈmɑːkɒf)
 
n
statistics a sequence of events, the probability of each of which depends only on the event immediately preceding it
 
[C20: named after Andrei Markov (1856–1922), Russian mathematician]
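
The defining property above — each event's probability depends only on the immediately preceding event — can be sketched in a few lines of Python. The states and transition probabilities here are illustrative assumptions, not part of either dictionary definition:

```python
import random

# Illustrative two-state chain: each row gives the possible next
# states and their probabilities, conditioned only on the current state.
TRANSITIONS = {
    "sunny": [("sunny", 0.8), ("rainy", 0.2)],
    "rainy": [("sunny", 0.4), ("rainy", 0.6)],
}

def step(state):
    """Draw the next event using only the current state (the Markov property)."""
    states, weights = zip(*TRANSITIONS[state])
    return random.choices(states, weights=weights)[0]

def simulate(start, n):
    """Generate a chain of n successive events starting from `start`."""
    chain = [start]
    for _ in range(n):
        chain.append(step(chain[-1]))
    return chain
```

Because the draw in `step` consults nothing but the current state, the sequence produced by `simulate` is a Markov chain in the sense of both definitions: a Markov process over discrete events in discrete time steps.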

Collins English Dictionary - Complete & Unabridged 10th Edition
2009 © William Collins Sons & Co. Ltd. 1979, 1986 © HarperCollins
Publishers 1998, 2000, 2003, 2005, 2006, 2007, 2009