
Markov process

noun Statistics.
a process in which future values of a random variable are statistically determined by present events and depend only on the event immediately preceding it.
Also, Markoff process.
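
In symbols (an illustrative formalization, not part of the dictionary's wording), this "memoryless" property says that the conditional distribution of the next value depends only on the present value, not on the earlier history:

P(X_{n+1} = x \mid X_n = x_n, X_{n-1} = x_{n-1}, \ldots, X_0 = x_0) = P(X_{n+1} = x \mid X_n = x_n)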


Origin:
1935–40; after Russian mathematician Andreĭ Andreevich Markov (1856–1922), who developed it

Source: Dictionary.com Unabridged, based on the Random House Dictionary, © Random House, Inc. 2014.