
Markov process

noun Statistics.
a process in which future values of a random variable are statistically determined by present events and dependent only on the event immediately preceding.
Also, Markoff process.


Origin:
1935–40; after Russian mathematician Andreĭ Andreevich Markov (1856–1922), who developed it

Dictionary.com Unabridged
Based on the Random House Dictionary, © Random House, Inc. 2014.
WordNet
Markov process

noun
a simple stochastic process in which the distribution of future states depends only on the present state and not on how it arrived in the present state 
WordNet® 3.0, © 2006 by Princeton University.
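Both definitions describe the Markov property: the distribution of the next state depends only on the present state, not on how that state was reached. Below is a minimal sketch in Python, assuming a hypothetical two-state weather chain with made-up transition probabilities, intended purely as an illustration of the property rather than any particular application.

```python
import random

# Hypothetical two-state chain used only for illustration: the probability of
# tomorrow's state depends solely on today's state (the Markov property).
TRANSITIONS = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def next_state(current):
    """Sample the next state using only the current state."""
    states = list(TRANSITIONS[current])
    weights = list(TRANSITIONS[current].values())
    return random.choices(states, weights=weights)[0]

def simulate(start, steps):
    """Generate a trajectory; no state earlier than the current one is consulted."""
    state = start
    path = [state]
    for _ in range(steps):
        state = next_state(state)
        path.append(state)
    return path

print(simulate("sunny", 10))
```

Note that simulate never looks at the history list when choosing the next state; that independence from the past, given the present, is exactly what the definitions above single out.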