Markov process
From Wikipedia, the free encyclopedia
In probability theory, a Markov process is a stochastic process that satisfies the Markov property: the conditional distribution of future states depends only on the present state, not on the sequence of states that preceded it.
The term Markov chain is often used to mean a discrete-time Markov process; see also continuous-time Markov process.
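The Markov property described above can be sketched with a small simulation. This is a minimal illustration, not from the article; the two weather states and their transition probabilities are hypothetical, chosen only to show that each next state is sampled from a distribution that depends solely on the current state.

```python
import random

# Hypothetical two-state chain: transition probabilities out of each state.
# Each row sums to 1; these numbers are illustrative, not from the article.
TRANSITIONS = {
    "sunny": {"sunny": 0.8, "rainy": 0.2},
    "rainy": {"sunny": 0.4, "rainy": 0.6},
}

def step(state, rng):
    """Sample the next state given only the current state (the Markov property)."""
    r = rng.random()
    cumulative = 0.0
    nxt = state
    for nxt, p in TRANSITIONS[state].items():
        cumulative += p
        if r < cumulative:
            return nxt
    return nxt  # guard against floating-point rounding

def simulate(start, n_steps, seed=0):
    """Run the chain for n_steps, returning the full state path."""
    rng = random.Random(seed)
    path = [start]
    for _ in range(n_steps):
        path.append(step(path[-1], rng))
    return path
```

Because the chain is memoryless, `step` needs only the current state as input; the history of the path plays no role in sampling the next state.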
See also
- Markov chain
- Shift of finite type
- Markov decision process
- Semi-Markov process
- Continuous-time Markov process
External links
Markov process from MathWorld