Kernel regression
From Wikipedia, the free encyclopedia
Kernel regression is a non-parametric technique in statistics to estimate the conditional expectation of a random variable. The objective is to find a non-linear relation between a pair of random variables X and Y.
In any nonparametric regression, the conditional expectation of a variable Y relative to a variable X may be written:

    \operatorname{E}(Y \mid X) = m(X)

where m is a non-parametric function.
Nadaraya-Watson kernel regression
Nadaraya (1964) and Watson (1964) proposed to estimate m as a locally weighted average, using a kernel as a weighting function. The Nadaraya-Watson estimator is:

    \widehat{m}_h(x) = \frac{\sum_{i=1}^n K_h(x - x_i)\, y_i}{\sum_{i=1}^n K_h(x - x_i)}

where K is a kernel with a bandwidth h, and K_h(t) = h^{-1} K(t/h) denotes the scaled kernel.
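As a concrete illustration, the estimator above can be sketched in Python with NumPy. The Gaussian kernel, bandwidth, and sine test function below are illustrative assumptions, not part of the original formulation:

```python
import numpy as np

def nadaraya_watson(x_query, x, y, h):
    """Nadaraya-Watson estimator with a Gaussian kernel:
    m_hat(x) = sum_i K_h(x - x_i) y_i / sum_i K_h(x - x_i)."""
    u = (np.asarray(x_query, dtype=float)[:, None] - x[None, :]) / h
    w = np.exp(-0.5 * u**2)  # Gaussian kernel; normalising constants cancel in the ratio
    return (w * y).sum(axis=1) / w.sum(axis=1)

# Noisy samples from y = sin(x)
rng = np.random.default_rng(0)
x = np.linspace(0.0, 2.0 * np.pi, 200)
y = np.sin(x) + rng.normal(0.0, 0.2, size=x.size)

# Evaluate the fit on an interior grid, away from boundary effects
grid = np.linspace(0.5, 2.0 * np.pi - 0.5, 50)
m_hat = nadaraya_watson(grid, x, y, h=0.3)
```

Because the weights sum to one at every query point, the fit is a convex combination of the observed y_i, which is why the estimator behaves as a locally weighted average.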
Derivation
Starting from the definition of the conditional expectation,

    \operatorname{E}(Y \mid X = x) = \int y\, f(y \mid x)\, dy = \int y\, \frac{f(x,y)}{f(x)}\, dy ,

and using the kernel density estimates of the joint distribution f(x,y) and of f(x) with a kernel K,

    \hat{f}(x,y) = \frac{1}{n} \sum_{i=1}^n K_h(x - x_i)\, K_h(y - y_i), \qquad \hat{f}(x) = \frac{1}{n} \sum_{i=1}^n K_h(x - x_i),

we get

    \int y\, \hat{f}(x,y)\, dy = \frac{1}{n} \sum_{i=1}^n K_h(x - x_i)\, y_i

(since each kernel K_h(y - y_i) integrates to one and is centred at y_i), and dividing by \hat{f}(x) we obtain the Nadaraya-Watson estimator.
Priestley-Chao kernel estimator

    \widehat{m}_{PC}(x) = h^{-1} \sum_{i=2}^n (x_i - x_{i-1})\, K\!\left(\frac{x - x_i}{h}\right) y_i
Gasser-Müller kernel estimator

    \widehat{m}_{GM}(x) = h^{-1} \sum_{i=1}^n \left[ \int_{s_{i-1}}^{s_i} K\!\left(\frac{x - u}{h}\right) du \right] y_i

where si = (xi-1 + xi)/2.
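Both convolution-type estimators can be sketched in Python under the assumptions of a Gaussian kernel and sorted design points x_1 < … < x_n; the boundary convention s_0 = -inf, s_n = +inf for the Gasser-Müller intervals is an illustrative choice:

```python
import math
import numpy as np

# Standard normal CDF, used to integrate the Gaussian kernel in closed form
_phi = np.vectorize(lambda z: 0.5 * (1.0 + math.erf(z / math.sqrt(2.0))))

def priestley_chao(x_query, x, y, h):
    """Priestley-Chao: h^{-1} * sum_i (x_i - x_{i-1}) K((x - x_i)/h) y_i."""
    dx = np.diff(x)                                   # spacings x_i - x_{i-1}
    u = (np.asarray(x_query)[:, None] - x[None, 1:]) / h
    k = np.exp(-0.5 * u**2) / math.sqrt(2 * math.pi)  # Gaussian kernel K
    return (k * (dx * y[1:])).sum(axis=1) / h

def gasser_muller(x_query, x, y, h):
    """Gasser-Muller: sum_i [ integral over (s_{i-1}, s_i) of K_h(x - u) du ] y_i."""
    # Midpoints between consecutive design points, extended to cover the line
    s = np.concatenate(([-np.inf], (x[:-1] + x[1:]) / 2.0, [np.inf]))
    z = (np.asarray(x_query)[:, None] - s[None, :]) / h
    w = _phi(z[:, :-1]) - _phi(z[:, 1:])              # exact kernel integrals
    return (w * y).sum(axis=1)

# Noise-free sanity check on m(x) = x^2, evaluated away from the boundaries
x = np.linspace(0.0, 1.0, 200)
y = x**2
grid = np.linspace(0.2, 0.8, 25)
m_pc = priestley_chao(grid, x, y, h=0.05)
m_gm = gasser_muller(grid, x, y, h=0.05)
```

Unlike the Nadaraya-Watson ratio, both are direct weighted sums: Priestley-Chao approximates the kernel integral by a Riemann sum over the spacings, while Gasser-Müller integrates the kernel exactly over the intervals between design points, so its weights sum to one by construction.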
Kernel regression for image processing
Applications of kernel regression for image processing include, but are not limited to, denoising, deblurring, interpolation, and super-resolution [1].
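As a minimal sketch of the denoising use case, zeroth-order (Nadaraya-Watson) kernel regression over the pixel grid reduces to a kernel-weighted local average. The window radius, Gaussian spatial kernel, and flat test image below are illustrative assumptions; the cited work [1] uses more elaborate, data-adaptive kernels:

```python
import numpy as np

def kernel_denoise(img, h=1.5, radius=3):
    """Zeroth-order kernel regression over pixel coordinates: each output
    pixel is a Gaussian-weighted average of its (2r+1)x(2r+1) neighbourhood."""
    H, W = img.shape
    ax = np.arange(-radius, radius + 1)
    gx, gy = np.meshgrid(ax, ax)
    w = np.exp(-0.5 * (gx**2 + gy**2) / h**2)   # spatial Gaussian weights
    pad = np.pad(img, radius, mode="reflect")   # mirror the borders
    out = np.zeros((H, W))
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            out += w[dy + radius, dx + radius] * pad[
                radius + dy : radius + dy + H, radius + dx : radius + dx + W
            ]
    return out / w.sum()                        # weights sum to one per pixel

rng = np.random.default_rng(0)
noisy = 0.5 + rng.normal(0.0, 0.1, size=(64, 64))  # flat image plus noise
denoised = kernel_denoise(noisy)
```

On a flat image the local average suppresses the noise variance while leaving the mean intensity essentially unchanged; near edges this fixed spatial kernel blurs, which is what the adaptive kernels of [1] are designed to avoid.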
References
- ^ H. Takeda, S. Farsiu, and P. Milanfar. Kernel Regression for Image Processing and Reconstruction. IEEE Trans. on Image Processing, vol. 16, no. 2, pp. 349-366, Feb. 2007.
Nadaraya, E. A. (1964). "On Estimating Regression". Theory of Probability and its Applications 9 (1): 141–142.
Simonoff, Jeffrey S. (1996). Smoothing Methods in Statistics. Springer. ISBN 0-387-94716-7.
Statistical implementation

In Stata, kernel regression is available through the user-written kernreg2 command, for example:

kernreg2 y x, bwidth(.5) kercode(3) npoint(500) gen(kernelprediction gridofpoints)
External links
- Scale-adaptive kernel regression (with Matlab software).
- Teknomo's Kernel Regression Tutorial (with MS Excel).
- An online kernel regression demonstration (requires .NET 3.0 or later and IE 7.0 or later).