Markov Process

A stochastic process in which the behavior of the variable over a short period of time depends solely on the value of the variable at the beginning of the period, not on its past history. Equivalently, it can be described as a process over a finite set of ‘‘states’’ in which the probability of the next state depends solely on the current state. A transition matrix model is an example of a Markov process.
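The transition matrix idea can be sketched in a few lines of Python. The states and probabilities below are illustrative assumptions, not taken from the text: each row of the matrix gives the probabilities of moving to each next state, conditioned only on the current state.

```python
import random

# Hypothetical two-state chain; the states and probabilities
# are illustrative assumptions.
states = ["sunny", "rainy"]
# transition[i][j] = probability of moving to state j given current state i
transition = [
    [0.8, 0.2],  # from "sunny"
    [0.4, 0.6],  # from "rainy"
]

def step(current: int, rng: random.Random) -> int:
    """Pick the next state using only the current state's row
    of the transition matrix (the Markov property)."""
    r = rng.random()
    cumulative = 0.0
    for j, p in enumerate(transition[current]):
        cumulative += p
        if r < cumulative:
            return j
    return len(transition[current]) - 1  # guard against rounding

rng = random.Random(0)
state = 0  # start in "sunny"
path = [states[state]]
for _ in range(5):
    state = step(state, rng)  # next state depends only on `state`
    path.append(states[state])
print(path)
```

Note that `step` consults only the current state's row, never the earlier history, which is exactly the defining property above.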