Statistical signal processing

Statistical signal processing is an area of signal processing dealing with signals and their statistical properties (e.g., mean, covariance). It is primarily studied within electrical and computer engineering, although important applications exist in almost all scientific fields.

Statistical signal processing is founded on the principle that signals are not deterministic functions. Rather, signals are modeled as functions consisting of both deterministic and stochastic components. A simple example, and also a common model of many statistical systems, is a signal x(t) that consists of a deterministic part s(t) with added Gaussian noise w(t): x(t) = s(t) + w(t), where w(t) \sim \mathcal{N}(\mu,\sigma^2). Given information about a statistical system and the random variables from which it is derived, we can increase our knowledge of the output signal; conversely, given the statistical properties of the output signal, we can infer the properties of the underlying random variables.
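As a minimal sketch of this model, the Python/NumPy example below simulates a signal of the form x(t) = s(t) + w(t) and recovers the noise statistics from the residual x(t) - s(t). The sinusoidal s(t), the sampling rate, and the noise parameters are illustrative assumptions, not part of the definition above.

    import numpy as np

    rng = np.random.default_rng(0)

    # Assumed deterministic component: a 5 Hz sinusoid sampled at 1 kHz for 1 second.
    fs = 1000.0
    t = np.arange(0.0, 1.0, 1.0 / fs)
    s = np.sin(2 * np.pi * 5.0 * t)

    # Assumed Gaussian noise parameters (mean mu, standard deviation sigma).
    mu, sigma = 0.2, 0.5

    # Observed signal: deterministic part plus additive Gaussian noise.
    x = s + rng.normal(mu, sigma, size=t.shape)

    # Knowing s(t), the residual x(t) - s(t) exposes the noise statistics.
    residual = x - s
    print("estimated noise mean:    ", residual.mean())       # close to 0.2
    print("estimated noise variance:", residual.var(ddof=1))  # close to 0.25

With the deterministic part known, estimating the noise mean and variance reduces to ordinary sample statistics of the residual, which is the simplest instance of inferring the underlying random variables from the observed signal.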

These statistical techniques are developed in estimation theory, detection theory, and related areas, which use statistical models of signals and noise to extract as much information as possible from the observed data.
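As a hedged illustration of the detection-theory side, the sketch below implements a matched-filter (correlation) detector for a known waveform in Gaussian noise, with the threshold set for a chosen false-alarm probability in Neyman-Pearson fashion. The template, noise level, and false-alarm rate are assumed values chosen for the example.

    import numpy as np
    from scipy.stats import norm

    rng = np.random.default_rng(1)

    # Assumed known template s(t) and assumed noise standard deviation.
    s = np.sin(2 * np.pi * 5.0 * np.arange(0.0, 1.0, 1e-3))
    sigma = 0.5

    def matched_filter_detect(x, template, noise_std, p_false_alarm=1e-3):
        """Declare the signal present when the correlation statistic exceeds a
        threshold chosen for the desired false-alarm probability."""
        stat = np.dot(x, template)  # correlate the data with the known template
        # Under noise only, stat is zero-mean Gaussian with std noise_std * ||template||.
        threshold = noise_std * np.linalg.norm(template) * norm.isf(p_false_alarm)
        return stat > threshold

    # One trial with the signal present, one with noise only.
    print(matched_filter_detect(s + rng.normal(0.0, sigma, s.shape), s, sigma))  # True
    print(matched_filter_detect(rng.normal(0.0, sigma, s.shape), s, sigma))      # False

Here the statistical model of the noise is what makes the threshold meaningful: knowing the distribution of the test statistic under "noise only" lets the detector trade false alarms against missed detections in a principled way.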