
Markov process

noun, Statistics.
1.
a process in which future values of a random variable are statistically determined by present events and dependent only on the event immediately preceding.
Origin of Markov process
1935-40; after Russian mathematician Andreĭ Andreevich Markov (1856-1922), who developed it
Dictionary.com Unabridged
Based on the Random House Dictionary, © Random House, Inc. 2015.
Markov process in Technology

probability, simulation
A process in which the sequence of events can be described by a Markov chain.
(1995-02-23)

The Free On-line Dictionary of Computing, © Denis Howe 2010 http://foldoc.org
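The defining property in both entries above — that the next state depends only on the current state, not on the earlier history — can be illustrated with a minimal sketch. The two-state weather model and its transition probabilities below are hypothetical, chosen purely for illustration:

```python
import random

# Hypothetical two-state model: each row gives the probabilities of the
# next state given only the current one (the Markov property).
TRANSITIONS = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def step(state, rng):
    """Draw the next state using only the current state."""
    r = rng.random()
    cumulative = 0.0
    for nxt, p in TRANSITIONS[state].items():
        cumulative += p
        if r < cumulative:
            return nxt
    return nxt  # guard against floating-point rounding

def simulate(start, n, seed=0):
    """Generate a length-n state sequence: one realization of a Markov chain."""
    rng = random.Random(seed)
    chain = [start]
    for _ in range(n - 1):
        chain.append(step(chain[-1], rng))
    return chain
```

Note that `step` never looks at anything but the current state; the sequence produced by `simulate` is therefore a Markov chain in the sense of the FOLDOC entry.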
