
Markov chain

[mahr-kawf] /ˈmɑr kɔf/
noun, Statistics.
1.
a Markov process restricted to discrete random events or to discontinuous time sequences.
Also, Markoff chain.
Origin
1940-45; see Markov process
Dictionary.com Unabridged
Based on the Random House Dictionary, © Random House, Inc. 2014.
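
As a rough illustration of this sense (a Markov process restricted to discrete events and discrete time steps), here is a minimal simulation sketch in Python. The two weather states, the transition probabilities, and the helper names next_state and simulate are illustrative assumptions, not part of the dictionary entry.

import random

# Illustrative two-state chain: the transition probabilities are invented
# for this sketch; only the current state influences the next one.
TRANSITIONS = {
    "sunny": {"sunny": 0.8, "snowy": 0.2},
    "snowy": {"sunny": 0.4, "snowy": 0.6},
}

def next_state(current):
    # Draw the next state using only the current state (the Markov property).
    states = list(TRANSITIONS[current])
    weights = [TRANSITIONS[current][s] for s in states]
    return random.choices(states, weights=weights)[0]

def simulate(start, steps):
    # Simulate a discrete-time Markov chain for a fixed number of steps.
    chain = [start]
    for _ in range(steps):
        chain.append(next_state(chain[-1]))
    return chain

print(simulate("sunny", 10))

Running the sketch prints one possible sequence of states, each drawn using only the state immediately before it.
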
British Dictionary definitions for Markoff chain

Markov chain

/ˈmɑːkɒf/
noun
1.
(statistics) a sequence of events the probability for each of which is dependent only on the event immediately preceding it
Word Origin
C20: named after Andrei Markov (1856–1922), Russian mathematician
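
In symbols, the requirement in this definition that each event's probability depend only on the event immediately preceding it is commonly written (an illustrative formulation, not part of the entry) as:

P(X_{n+1} = x \mid X_n = x_n, X_{n-1} = x_{n-1}, \ldots, X_0 = x_0) = P(X_{n+1} = x \mid X_n = x_n)
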
Collins English Dictionary - Complete & Unabridged 2012 Digital Edition
© William Collins Sons & Co. Ltd. 1979, 1986 © HarperCollins
Publishers 1998, 2000, 2003, 2005, 2006, 2007, 2009, 2012