Markov property
From Wikipedia, the free encyclopedia
In probability theory, a stochastic process has the Markov property if the conditional probability distribution of future states of the process, given the present state and all past states, depends only upon the present state: the future is conditionally independent of the past states (the path of the process) given the present state. A process with the Markov property is usually called a Markov process, and may be described as Markovian.
Mathematically, if X(t), t > 0, is a stochastic process, the Markov property states that

Pr[X(t + h) = y | X(s) = x(s), for all s ≤ t] = Pr[X(t + h) = y | X(t) = x(t)], for all h > 0.
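The defining property can be made concrete in code. The sketch below (the two-state space and the transition matrix are invented for illustration) samples a chain whose next state is drawn using only the current state, so the process is Markovian by construction:

```python
import random

# Invented transition matrix: P[i][j] = Pr(next state = j | current state = i).
P = [[0.9, 0.1],
     [0.4, 0.6]]

def step(state, rng):
    """Sample the next state using only the current state (the Markov property)."""
    return 0 if rng.random() < P[state][0] else 1

rng = random.Random(0)
path = [0]
for _ in range(10):
    path.append(step(path[-1], rng))
# Note that step() never inspects path[:-1]: the earlier history is irrelevant
# once the current state is known.
```

The point of the sketch is structural: the sampler receives only `path[-1]`, never the full path, which is exactly the conditional-independence statement above.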
Markov processes are typically termed (time-) homogeneous if

Pr[X(t + h) = y | X(t) = x] = Pr[X(h) = y | X(0) = x], for all t, h > 0,
and otherwise are termed (time-) inhomogeneous (or (time-) nonhomogeneous). Homogeneous Markov processes, usually being simpler than inhomogeneous ones, form the most important class of Markov processes.
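The distinction can be illustrated with two one-step samplers on a two-state space (the state space, probabilities, and the functional form of the time dependence below are invented for the example):

```python
import random

def step_homogeneous(state, rng):
    # Transition probabilities are fixed: they do not depend on the time index t.
    p_stay = 0.8
    return state if rng.random() < p_stay else 1 - state

def step_inhomogeneous(state, t, rng):
    # Transition probabilities depend explicitly on t: here the chain becomes
    # "stickier" over time, so Pr(X(t+1) = y | X(t) = x) varies with t.
    p_stay = 1.0 - 1.0 / (t + 2)
    return state if rng.random() < p_stay else 1 - state

rng = random.Random(0)
s_h = s_i = 0
for t in range(20):
    s_h = step_homogeneous(s_h, rng)
    s_i = step_inhomogeneous(s_i, t, rng)
```

Both processes are Markov; only the homogeneous one has transition probabilities that can be described by a single, time-independent rule.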
In some cases, apparently non-Markovian processes may still have Markovian representations, constructed by expanding the concept of the 'current' and 'future' states. For example, let X be a non-Markovian process. Then define a process Y, such that each state of Y represents a time-interval of states of X, i.e. mathematically,

Y(t) = (X(s) : s ∈ [a(t), b(t)]) for suitable interval endpoints a(t) ≤ b(t).
If Y has the Markov property, then it is a Markovian representation of X. When each state of Y consists of two consecutive states of X, X is called a second-order Markov process; higher-order Markov processes are defined analogously.
An example of a non-Markovian process with a Markovian representation is a moving average time series.
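This example can be sketched directly (the parameter theta = 0.5 is invented for illustration). A first-order moving-average series X(t) = e(t) + theta·e(t−1) is not Markov in X alone, because predicting X(t+1) requires the unobserved shock e(t); the augmented state Y(t) = (X(t), e(t)) is Markov, since the next Y depends only on the current Y plus fresh noise:

```python
import random

theta = 0.5  # invented MA(1) coefficient

def step_y(y, rng):
    """Advance the Markovian representation Y(t) = (X(t), e(t)) by one step."""
    _, e_prev = y
    e_new = rng.gauss(0.0, 1.0)      # fresh shock, independent of the past
    x_new = e_new + theta * e_prev   # MA(1) recursion
    return (x_new, e_new)

rng = random.Random(0)
y = (0.0, 0.0)
xs = []
for _ in range(100):
    y = step_y(y, rng)
    xs.append(y[0])
```

Carrying the latest shock along in the state is the whole trick: once e(t) is part of the state, the past shocks become irrelevant.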
The most famous Markov processes are Markov chains, but many other processes are Markovian as well, including the Wiener process, the mathematical model of Brownian motion; physical Brownian motion satisfies the Markov property only to a close approximation.
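A standard discrete approximation of Brownian motion is a scaled symmetric random walk, sketched below (the step count and time increment are invented parameters). Each increment uses only fresh randomness, so the walk, like Brownian motion, is Markovian:

```python
import random

def random_walk(n_steps, dt, rng):
    """Scaled symmetric random walk: a discrete approximation of Brownian motion."""
    w = [0.0]
    for _ in range(n_steps):
        # Each +/- sqrt(dt) step depends only on fresh noise, never on the path so far.
        w.append(w[-1] + (dt ** 0.5) * (1 if rng.random() < 0.5 else -1))
    return w

rng = random.Random(42)
w = random_walk(1000, 0.001, rng)
```

As dt → 0 with the total time n_steps·dt held fixed, such walks converge in distribution to Brownian motion.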