
Markov process

or Markoff process

noun, Statistics.
a process in which future values of a random variable are statistically determined by present events and dependent only on the event immediately preceding.
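The defining ("Markov") property described in this definition can be written as a conditional-independence statement. The notation below, with X_n for the state at step n, is added here for illustration and is not part of the original entry.

```latex
% Markov property: the distribution of the next state depends only on the
% present state, not on the full history of earlier states.
\[
  \Pr\bigl(X_{n+1} = x \mid X_n = x_n,\, X_{n-1} = x_{n-1},\, \dots,\, X_0 = x_0\bigr)
  = \Pr\bigl(X_{n+1} = x \mid X_n = x_n\bigr)
\]
```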
Origin of Markov process
1935-40; after the Russian mathematician Andreĭ Andreevich Markov (1856-1922), who developed it.
Dictionary.com Unabridged, based on the Random House Dictionary, © Random House, Inc. 2015.
Markov process in Technology

probability, simulation
A process in which the sequence of events can be described by a Markov chain.

The Free On-line Dictionary of Computing, © Denis Howe 2010
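As an illustrative sketch only (the state names, transition probabilities, and function names below are invented for this example and do not come from either dictionary entry), a Markov chain can be simulated by repeatedly sampling the next state using probabilities that depend only on the current state.

```python
import random

# Hypothetical two-state "weather" chain: each row gives the possible
# next states and their probabilities, conditioned only on the current state.
TRANSITIONS = {
    "sunny": [("sunny", 0.8), ("rainy", 0.2)],
    "rainy": [("sunny", 0.4), ("rainy", 0.6)],
}

def step(state):
    """Draw the next state from the current state's transition row."""
    next_states, weights = zip(*TRANSITIONS[state])
    return random.choices(next_states, weights=weights)[0]

def simulate(start, n_steps):
    """Generate a sequence of states; earlier history is never consulted."""
    states = [start]
    for _ in range(n_steps):
        states.append(step(states[-1]))
    return states

if __name__ == "__main__":
    print(simulate("sunny", 10))
```

The key point the sketch illustrates is that `step` receives only the current state, which is exactly the dependence structure the definitions above describe.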