Wold's theorem
From Wikipedia, the free encyclopedia
In statistics, Wold's theorem or the Wold representation theorem, named after Herman Wold, says that every covariance-stationary time series can be written as an infinite moving average (MA(∞)) process of its forecast errors, plus a deterministic component. This is also called the Wold decomposition.
Formally,

    y_t = ε_t + θ_1 ε_{t−1} + θ_2 ε_{t−2} + ⋯ + η_t

where y_t is the observed time series, ε_t is the forecast error (the actual value minus the predicted value), the θ_j are moving-average coefficients (square-summable, with the associated roots lying inside the unit circle, i.e. between −1 and 1), and η_t is a deterministic component uncorrelated with the forecast errors.
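The decomposition can be illustrated numerically with a simple stationary process. The sketch below, which assumes numpy is available, uses a hypothetical AR(1) example y_t = φ·y_{t−1} + ε_t with |φ| < 1; its Wold coefficients are known in closed form (θ_j = φ^j), so the truncated MA(∞) sum of past forecast errors can be checked against the process generated recursively.

```python
import numpy as np

# Hypothetical example: AR(1) with phi = 0.6 is covariance-stationary,
# and its Wold (MA(infinity)) coefficients are theta_j = phi**j.
phi = 0.6
n_terms = 50                              # truncation point for the MA sum
theta = phi ** np.arange(n_terms)          # truncated Wold coefficients

# Generate the process recursively from white-noise forecast errors.
rng = np.random.default_rng(0)
eps = rng.standard_normal(2000)
y_ar = np.zeros_like(eps)
y_ar[0] = eps[0]
for t in range(1, len(eps)):
    y_ar[t] = phi * y_ar[t - 1] + eps[t]

# Reconstruct the same series as a (truncated) moving average of the
# forecast errors: y_t ~= sum_j theta_j * eps_{t-j}.
y_ma = np.convolve(eps, theta)[: len(eps)]

# After a short burn-in the two representations agree to high precision,
# since the truncated tail theta_50 = 0.6**50 is negligible.
print(np.max(np.abs(y_ar[100:] - y_ma[100:])))
```

Because this process has no deterministic part, η_t is identically zero here; for a general covariance-stationary series the MA(∞) sum captures only the stochastic component.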