Wold's theorem

In statistics, Wold's theorem, or the Wold representation theorem, named after Herman Wold, says that every covariance-stationary time series can be written as an infinite moving-average (MA(\infty)) process of its forecast errors, plus a deterministic component. This is also called the Wold decomposition.

Formally

y_{t}=\sum_{j=1}^{\infty}\theta_{j}\varepsilon_{t-j}+\varepsilon_{t}+\eta_{t},

where y_{t} is the observed time series, \varepsilon_{t} is the forecast error (the actual minus the predicted value, also called the innovation), the coefficients \theta_{j} are square-summable (\sum_{j=1}^{\infty}\theta_{j}^{2}<\infty), and \eta_{t} is a deterministic component, uncorrelated with the \varepsilon_{t}, that can be predicted exactly from the past of the series.
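
For example, a stationary AR(1) process y_{t}=\phi y_{t-1}+\varepsilon_{t} with |\phi|<1 has Wold coefficients \theta_{j}=\phi^{j}, so its MA(\infty) representation can be checked numerically. The following Python sketch (assuming only NumPy; the values \phi=0.7 and the truncation lag J=50 are illustrative choices, not part of the theorem) simulates the AR(1) recursion and rebuilds the same series from its truncated Wold sum.

import numpy as np

rng = np.random.default_rng(0)

# Stationary AR(1): y_t = phi * y_{t-1} + eps_t with |phi| < 1.
# Its Wold representation has theta_j = phi**j and eta_t = 0,
# i.e. y_t = sum_{j>=0} phi**j * eps_{t-j}.
phi = 0.7                      # illustrative coefficient, |phi| < 1
n = 500                        # series length
eps = rng.standard_normal(n)   # white-noise forecast errors

# Build y_t from the AR(1) recursion.
y = np.zeros(n)
for t in range(1, n):
    y[t] = phi * y[t - 1] + eps[t]

# Rebuild y_t from the Wold (MA(infinity)) sum, truncated at J lags;
# phi**j decays geometrically, so the truncation error is negligible.
J = 50
theta = phi ** np.arange(J)    # theta_0 = 1, theta_1 = phi, ...
y_wold = np.array([
    sum(theta[j] * eps[t - j] for j in range(min(J, t + 1)))
    for t in range(n)
])

# After the first J observations the two constructions agree
# up to the (tiny) truncation error.
print(np.max(np.abs(y[J:] - y_wold[J:])))

In this example \eta_{t} is identically zero, so the series is purely non-deterministic; for a general covariance-stationary series the deterministic part \eta_{t} is simply carried along unchanged.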