Two-dimensional singular value decomposition
Two-dimensional singular value decomposition (2DSVD) computes the low-rank approximation of a set of matrices, such as 2D images or weather maps, in a manner almost identical to SVD (singular value decomposition), which computes the low-rank approximation of a single matrix (or a set of 1D vectors).
SVD
Let the matrix $X = (x_1, \ldots, x_n)$ contain the set of 1D vectors. In SVD, we construct the covariance and Gram matrices
- $F = XX^T, \quad G = X^TX,$
and compute their eigenvectors $U = (u_1, \ldots, u_n)$ and $V = (v_1, \ldots, v_n)$. Since $VV^T = I$ and $UU^T = I$, we have
- $X = UU^TXVV^T = U(U^TXV)V^T = U\Sigma V^T.$
If we retain only the $K$ principal eigenvectors in $U$ and $V$, this gives a low-rank approximation of $X$.
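As an illustration, here is a minimal NumPy sketch of this construction (the matrix size, the value of $K$, and the random data are assumptions for the example, not from the article). It builds $U$ and $V$ from the eigenvectors of $XX^T$ and $X^TX$ and checks that the retained-eigenvector reconstruction matches a directly truncated SVD:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.standard_normal((50, 30))   # X = (x_1, ..., x_n): 30 column vectors in R^50
K = 5                               # number of principal eigenvectors retained

# Eigenvectors of F = X X^T and G = X^T X give U and V; eigh returns
# eigenvalues in ascending order, so the last K columns are the principal ones.
_, U = np.linalg.eigh(X @ X.T)
_, V = np.linalg.eigh(X.T @ X)
U_K, V_K = U[:, -K:], V[:, -K:]

# Rank-K approximation: X ~ U_K (U_K^T X V_K) V_K^T
X_K = U_K @ (U_K.T @ X @ V_K) @ V_K.T

# Same result (up to floating point) as truncating np.linalg.svd directly.
Us, s, Vt = np.linalg.svd(X, full_matrices=False)
print(np.allclose(X_K, Us[:, :K] @ np.diag(s[:K]) @ Vt[:K, :]))  # True
```

Because $U_KU_K^T$ and $V_KV_K^T$ are projections onto the principal subspaces, the sign ambiguity of individual eigenvectors cancels in the product.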
2DSVD
In 2DSVD, we deal with a set of 2D matrices $(X_1, \ldots, X_n)$. We construct the row-row and column-column covariance matrices
- $F = \sum_i X_i X_i^T, \quad G = \sum_i X_i^T X_i,$
in exactly the same manner as in SVD, and compute their eigenvectors $U$ and $V$. We approximate $X_i$ as
- $X_i = UU^TX_iVV^T = U(U^TX_iV)V^T = UM_iV^T$
in identical fashion to SVD. This gives a near-optimal low-rank approximation of $(X_1, \ldots, X_n)$ with the objective function
- $J = \sum_i \| X_i - UM_iV^T \|^2.$
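The recipe carries over to a set of matrices directly. Below is a minimal sketch under assumed shapes and ranks (the set size, image dimensions, and retained ranks are illustrative, not from the article); it accumulates the two covariance matrices over the whole set, keeps their principal eigenvectors, and represents each $X_i$ only through its small core matrix $M_i = U^TX_iV$:

```python
import numpy as np

rng = np.random.default_rng(1)
Xs = [rng.standard_normal((40, 30)) for _ in range(100)]  # X_1, ..., X_n
k_row, k_col = 6, 6   # numbers of row and column eigenvectors retained

# Row-row and column-column covariance matrices, summed over the set.
F = sum(X @ X.T for X in Xs)   # shape (40, 40)
G = sum(X.T @ X for X in Xs)   # shape (30, 30)

# Keep the principal eigenvectors of F and G (eigh sorts ascending).
_, U = np.linalg.eigh(F)
_, V = np.linalg.eigh(G)
U, V = U[:, -k_row:], V[:, -k_col:]

# Each X_i is stored as the small core matrix M_i = U^T X_i V.
Ms = [U.T @ X @ V for X in Xs]         # each of shape (k_row, k_col)
Xs_approx = [U @ M @ V.T for M in Ms]  # low-rank reconstructions

# Value of the objective J = sum_i ||X_i - U M_i V^T||^2 (Frobenius norm).
J = sum(np.linalg.norm(X - Xa) ** 2 for X, Xa in zip(Xs, Xs_approx))
print(J)
```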
Error bounds similar to those of the Eckart–Young theorem also exist.
2DSVD is mostly used in image compression and representation.
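To see the compression, consider a rough storage count (an illustrative calculation, not from the article): storing $(X_1, \ldots, X_n)$ directly, with each $X_i$ of size $r \times c$, takes $nrc$ numbers, whereas the 2DSVD representation takes $rk + ck + nk^2$ numbers for $U$, $V$, and the cores $M_i$ when $k$ eigenvectors are retained on each side, a large saving when $k \ll r, c$.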
References
- Chris Ding and Jieping Ye. "Two-dimensional Singular Value Decomposition (2DSVD) for 2D Maps and Images". Proc. SIAM Int'l Conf. Data Mining (SDM'05), pp. 32–43, April 2005.
- Jieping Ye. "Generalized Low Rank Approximations of Matrices". Machine Learning, Vol. 61, pp. 167–191, 2005.