Markov process

From Wikipedia, the free encyclopedia

In probability theory, a Markov process is a stochastic process that satisfies the Markov property: conditioned on the present state of the process, its future is independent of its past.

The term Markov chain is often used to mean a discrete-time Markov process; see also continuous-time Markov process.
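The Markov property of a discrete-time chain can be sketched in a few lines: the next state is sampled from a distribution that depends only on the current state. The two-state "weather" chain below, with its state names and transition probabilities, is purely illustrative and not drawn from the article.

```python
import random

# Transition probabilities for a hypothetical two-state chain
# (states and numbers are illustrative assumptions).
TRANSITIONS = {
    "sunny": [("sunny", 0.9), ("rainy", 0.1)],
    "rainy": [("sunny", 0.5), ("rainy", 0.5)],
}

def step(state, rng):
    """Sample the next state; it depends only on the current
    state, never on earlier history (the Markov property)."""
    states, weights = zip(*TRANSITIONS[state])
    return rng.choices(states, weights=weights, k=1)[0]

def simulate(start, n, seed=0):
    """Generate a trajectory of n steps from the given start state."""
    rng = random.Random(seed)
    chain = [start]
    for _ in range(n):
        chain.append(step(chain[-1], rng))
    return chain

print(simulate("sunny", 5))
```

With a fixed seed the trajectory is reproducible, which is convenient for checking simulations of this kind.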


External links

Markov process from MathWorld
