Markov chain

[mahr-kawf] /ˈmɑr kɔf/
noun, Statistics.
a Markov process restricted to discrete random events or to discontinuous time sequences.
Also, Markoff chain.
Origin of Markov chain
1940-45; see Markov process
Based on the Random House Dictionary, © Random House, Inc. 2015.
British Dictionary definitions for Markov chain

Markov chain

(statistics) a sequence of events, the probability of each of which depends only on the event immediately preceding it
Word Origin
C20: named after Andrei Markov (1856–1922), Russian mathematician
Collins English Dictionary - Complete & Unabridged 2012 Digital Edition
© William Collins Sons & Co. Ltd. 1979, 1986 © HarperCollins
Publishers 1998, 2000, 2003, 2005, 2006, 2007, 2009, 2012
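The defining property in the Collins definition — each event depends only on the one immediately before it — can be sketched in a few lines of code. The two-state weather chain and its transition probabilities below are purely illustrative assumptions, not part of either dictionary entry.

```python
import random

# Hypothetical two-state chain (states and probabilities are illustrative).
# Each row gives P(next state | current state): the next step is sampled
# using only the current state, which is the Markov property.
TRANSITIONS = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def step(state, rng):
    """Sample the next state from the current state alone."""
    probs = TRANSITIONS[state]
    return rng.choices(list(probs), weights=list(probs.values()))[0]

def simulate(start, n_steps, seed=0):
    """Run the chain for n_steps, returning the full sequence of states."""
    rng = random.Random(seed)
    chain = [start]
    for _ in range(n_steps):
        chain.append(step(chain[-1], rng))
    return chain

print(simulate("sunny", 5))
```

Because each call to `step` looks only at `chain[-1]`, the earlier history has no influence on the next draw — the "discrete random events" restriction in the Random House definition corresponds to the finite state set here.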
