Stationary process

In the mathematical sciences, a stationary process (or strict(ly) stationary process) is a stochastic process whose joint probability distribution does not change when shifted in time or position. Consequently, parameters such as the mean and variance, if they exist, also do not change over time or position.
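
Formally, writing F_X(x_{t_1}, \ldots, x_{t_n}) for the joint cumulative distribution function of the process sampled at times t_1, \ldots, t_n, one standard way to state this requirement is that shifting all sampling times by \tau leaves every finite-dimensional distribution unchanged:

F_X(x_{t_1 + \tau}, \ldots, x_{t_n + \tau}) = F_X(x_{t_1}, \ldots, x_{t_n}) \quad \forall \tau, \; \forall n, \; \forall t_1, \ldots, t_n.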

As an example, white noise is stationary. However, the sound of a cymbal crashing is not stationary because the acoustic power of the crash (and hence its variance) diminishes with time.
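
A minimal numerical sketch of this contrast (the exponentially decaying envelope standing in for the cymbal is an assumption, not an acoustic model) compares the windowed variance of white noise with that of a decaying noise burst:

import numpy as np

rng = np.random.default_rng(0)
n = 100_000
white = rng.normal(0.0, 1.0, n)                   # stationary white noise
decay = white * np.exp(-np.arange(n) / 20_000)    # assumed decaying envelope ("cymbal-like" burst)

# Variance estimated over successive windows: roughly constant for the white
# noise, but shrinking over time for the decaying burst.
for name, x in [("white", white), ("decaying", decay)]:
    window_vars = [x[i:i + 10_000].var() for i in range(0, n, 10_000)]
    print(name, np.round(window_vars, 3))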

Stationarity is used as a tool in time series analysis, where the raw data are often transformed to become stationary; for example, economic data are often seasonal and/or dependent on the price level. Processes are described as trend-stationary if they are a linear combination of a stationary process and one or more processes exhibiting a trend. Transforming such data to leave a stationary data set for analysis is referred to as de-trending.
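
As an illustrative sketch (the linear trend model and the least-squares fit are assumptions; in practice the transformation depends on the kind of trend present), de-trending a trend-stationary series can look like this:

import numpy as np

rng = np.random.default_rng(1)
t = np.arange(500, dtype=float)
series = 0.05 * t + rng.normal(0.0, 1.0, t.size)  # assumed linear trend plus stationary noise

# Fit the assumed trend by least squares and subtract it; the residual is the
# stationary data set left for analysis (the "de-trended" series).
slope, intercept = np.polyfit(t, series, deg=1)
detrended = series - (slope * t + intercept)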

A discrete-time stationary process where the sample space is also discrete (so that the random variable may take one of N possible values) is known as a Bernoulli scheme. When N = 2, the process is called a Bernoulli process.
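
For example (a minimal sketch, with equiprobable outcomes assumed for simplicity), a Bernoulli process can be simulated as a sequence of independent, identically distributed draws from {0, 1}; its distribution is the same at every index, so it is stationary:

import numpy as np

rng = np.random.default_rng(2)
# i.i.d. draws from {0, 1} with assumed probability 1/2 each: a Bernoulli process (N = 2).
flips = rng.integers(0, 2, size=1000)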

Weak or wide-sense stationarity

A weaker form of stationarity commonly employed in signal processing is known as weak-sense stationarity, wide-sense stationarity (WSS), or covariance stationarity. WSS random processes only require that the first and second moments do not vary with respect to time. Any strictly stationary process which has a finite mean and covariance is also WSS.

So, a continuous-time random process x(t) which is WSS has the following restrictions on its mean function

1. \mathbb{E}\{x(t)\} = m_x(t) = m_x(t + \tau) \,\, \forall \, \tau \in \mathbb{R}

and correlation function

2. \mathbb{E}\{x(t_1)x(t_2)\} = R_x(t_1, t_2) = R_x(t_1 + \tau, t_2 + \tau) = R_x(t_1 - t_2, 0) \,\, \forall \, \tau \in \mathbb{R}.

The first property implies that the mean function m_x(t) must be constant. The second property implies that the correlation function depends only on the difference between t_1 and t_2 and only needs to be indexed by one variable rather than two variables. Thus, instead of writing,

R_x(t_1 - t_2, 0)

we usually abbreviate the notation and write

R_x(\tau) \quad \text{where } \tau = t_1 - t_2.
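
These two properties can be checked roughly on sampled data; the sketch below (window length, maximum lag, and the simple biased autocorrelation estimator are all arbitrary choices) estimates the mean over successive windows and the autocorrelation as a function of the lag alone:

import numpy as np

rng = np.random.default_rng(3)
x = rng.normal(0.0, 1.0, 50_000)  # candidate WSS signal (white noise used here as a stand-in)

# Property 1: the sample mean should not depend on where in the record we look.
window_means = [x[i:i + 5_000].mean() for i in range(0, x.size, 5_000)]

# Property 2: the correlation should depend only on the lag tau, so a single
# one-dimensional estimate R[k] ~ R_x(tau = k) suffices.
def autocorr(x, max_lag):
    x0 = x - x.mean()
    return np.array([np.dot(x0[:x0.size - k], x0[k:]) / x0.size
                     for k in range(max_lag + 1)])

R = autocorr(x, max_lag=20)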

When processing WSS random signals with linear, time-invariant (LTI) filters, it is helpful to think of the correlation function as a linear operator. Since it is a circulant operator (depends only on the difference between the two arguments), its eigenfunctions are the Fourier complex exponentials. Additionally, since the eigenfunctions of LTI operators are also complex exponentials, LTI processing of WSS random signals is highly tractable—all computations can be performed in the frequency domain. Thus, the WSS assumption is widely employed in signal processing algorithms.
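
As a brief numerical sketch of this tractability (the Butterworth filter and the Welch PSD estimator below are illustrative choices, not part of any particular algorithm), the power spectral density of the filter output can be predicted in the frequency domain as |H(f)|^2 times the input PSD:

import numpy as np
from scipy import signal

rng = np.random.default_rng(4)
x = rng.normal(0.0, 1.0, 200_000)      # WSS white-noise input
b, a = signal.butter(4, 0.2)           # an example LTI filter (4th-order Butterworth, assumed)
y = signal.lfilter(b, a, x)            # time-domain LTI filtering

# In the frequency domain the output PSD is |H(f)|^2 times the input PSD,
# so the measured output spectrum should match the prediction below.
f, Sx = signal.welch(x, nperseg=1024)
_, Sy = signal.welch(y, nperseg=1024)
_, H = signal.freqz(b, a, worN=2 * np.pi * f)   # frequency response at the same normalized frequencies
predicted_Sy = np.abs(H) ** 2 * Sx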

See also