Markov chain
(Named after Andrei Markov) A model of sequences of events in which the probability of an event occurring depends only on the event that immediately preceded it. A Markov process is governed by a Markov chain.
In simulation, the principle of the Markov chain is applied to the selection of samples from a probability density function to be applied to the model. Simscript II.5 uses this approach for some modelling functions.
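The Markov property described above can be illustrated with a minimal sketch (a hypothetical two-state example, not drawn from Simscript II.5), in which each state transition is sampled using only the current state:

```python
import random

# Hypothetical two-state weather model: the transition probabilities
# depend only on the current state (the Markov property).
TRANSITIONS = {
    "sunny": [("sunny", 0.8), ("rainy", 0.2)],
    "rainy": [("sunny", 0.4), ("rainy", 0.6)],
}

def next_state(state):
    """Sample the next event using only the current state."""
    r = random.random()
    cumulative = 0.0
    for target, p in TRANSITIONS[state]:
        cumulative += p
        if r < cumulative:
            return target
    return target  # guard against floating-point rounding

def simulate(start, steps):
    """Generate a sequence of states: one realisation of the chain."""
    sequence = [start]
    for _ in range(steps):
        sequence.append(next_state(sequence[-1]))
    return sequence

print(simulate("sunny", 10))
```

Because each call to `next_state` consults only the current state, no earlier history influences the outcome, which is exactly the dependence on "the preceding event" that the definition describes.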