Gauss–Markov process


This article is not about the Gauss–Markov theorem of mathematical statistics.

Gauss–Markov stochastic processes (named after Carl Friedrich Gauss and Andrey Markov) are stochastic processes that satisfy the requirements for both Gaussian processes and Markov processes.

Every Gauss–Markov process X(t) possesses the following three properties:

  1. If h(t) is a non-zero scalar function of t, then Z(t) = h(t)X(t) is also a Gauss–Markov process.
  2. If f(t) is a non-decreasing scalar function of t, then Z(t) = X(f(t)) is also a Gauss–Markov process.
  3. There exists a non-zero scalar function h(t) and a non-decreasing scalar function f(t) such that X(t) = h(t)W(f(t)), where W(t) is the standard Wiener process.

Property (3) means that every Gauss–Markov process can be synthesized from the standard Wiener process (SWP).
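As an illustration (not part of the list above, and assuming the stationary case with variance \sigma^{2} and time constant \beta^{-1} treated in the next section), one admissible choice in property (3) is h(t) = \sigma e^{-\beta t} and f(t) = e^{2\beta t}:

X(t) = \sigma e^{-\beta t}\, W\!\left(e^{2\beta t}\right), \qquad \textbf{E}\bigl[X(t)X(s)\bigr] = \sigma^{2} e^{-\beta(t+s)}\min\!\left(e^{2\beta t}, e^{2\beta s}\right) = \sigma^{2} e^{-\beta|t-s|},

which reproduces the exponential autocorrelation given below.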

Properties

A stationary Gauss–Markov process with variance \textbf{E}(X^{2}(t)) = \sigma^{2} and time constant \beta^{-1} has the following properties.

Exponential autocorrelation:

\textbf{R}_{x}(\tau) = \sigma^{2}e^{-\beta |\tau|}.\,
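The exponential autocorrelation can be checked numerically. The following is a minimal sketch (not from the article) that simulates the stationary Gauss–Markov (Ornstein–Uhlenbeck) process by its exact AR(1) discretization and compares the sample autocovariance with \sigma^{2}e^{-\beta|\tau|}; the parameter values are arbitrary examples.

  # Minimal sketch: simulate a stationary Gauss-Markov process and compare its
  # sample autocovariance with sigma^2 * exp(-beta * |tau|).
  import numpy as np

  rng = np.random.default_rng(0)

  sigma, beta = 1.5, 0.8        # process standard deviation and inverse time constant
  dt, n = 0.01, 100_000         # time step and number of samples

  phi = np.exp(-beta * dt)                  # exact one-step transition factor
  q = sigma**2 * (1.0 - phi**2)             # exact driving-noise variance

  x = np.empty(n)
  x[0] = rng.normal(0.0, sigma)             # start in the stationary distribution
  for k in range(n - 1):
      x[k + 1] = phi * x[k] + rng.normal(0.0, np.sqrt(q))

  for tau in (0.0, 0.5, 1.0, 2.0):
      lag = int(round(tau / dt))
      sample = np.mean(x[: n - lag] * x[lag:])        # sample autocovariance at lag tau
      theory = sigma**2 * np.exp(-beta * tau)         # sigma^2 * exp(-beta * |tau|)
      print(f"tau={tau:4.1f}  sample={sample:6.3f}  theory={theory:6.3f}")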

(Power) spectral density function:

\textbf{S}_{x}(j\omega) = \frac{2\sigma^{2}\beta}{\omega^{2} + \beta^{2}}.\,

The above yields the following spectral factorisation:

\textbf{S}_{x}(s) = \frac{2\sigma^{2}\beta}{-s^{2} + \beta^{2}}
                  = \frac{\sqrt{2\beta}\,\sigma}{(s + \beta)}\cdot\frac{\sqrt{2\beta}\,\sigma}{(-s + \beta)}.
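As an illustration of how this factorisation is used (a standard shaping-filter construction, stated here for context rather than taken from the text above), the causal, minimum-phase factor corresponds to driving a first-order linear system with unit-intensity white noise w(t):

H(s) = \frac{\sqrt{2\beta}\,\sigma}{s + \beta} \quad\Longleftrightarrow\quad \dot{X}(t) = -\beta X(t) + \sqrt{2\beta}\,\sigma\, w(t),

whose stationary variance (\sqrt{2\beta}\,\sigma)^{2}/(2\beta) = \sigma^{2} and autocorrelation \sigma^{2}e^{-\beta|\tau|} agree with the expressions above.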