Two-dimensional singular value decomposition

Two-dimensional singular value decomposition (2DSVD) computes the low-rank approximation of a set of matrices, such as 2D images or weather maps, in a manner almost identical to SVD (singular value decomposition), which computes the low-rank approximation of a single matrix (or a set of 1D vectors).

SVD

Let the matrix X = (x_1, ..., x_n) contain the set of 1D vectors as its columns. In SVD, we construct the covariance and Gram matrices

F = X X^T , \quad G = X^T X ,

and compute their eigenvectors U = (u_1, ..., u_n) and V = (v_1, ..., v_n). Since VV^T = I and UU^T = I, we have

X = U U^T X V V^T = U (U^T X V) V^T = U \Sigma V^T .

If we retain only the K principal eigenvectors in U and V, this gives a rank-K approximation of X.
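
Below is a minimal NumPy sketch of this construction (not part of the original article; the matrix size and the rank K are arbitrary illustrative choices). It builds F and G, keeps the top K eigenvectors of each, and forms the rank-K approximation:

    import numpy as np

    # Low-rank approximation of a single matrix X via the eigenvectors
    # of F = X X^T and G = X^T X, as described above.
    rng = np.random.default_rng(0)
    X = rng.standard_normal((8, 6))     # X = (x_1, ..., x_n), columns are the 1D vectors
    K = 3                               # number of principal eigenvectors to keep

    F = X @ X.T                         # covariance matrix
    G = X.T @ X                         # Gram matrix

    # eigh returns eigenvalues in ascending order, so reverse and truncate.
    _, U = np.linalg.eigh(F)
    _, V = np.linalg.eigh(G)
    U = U[:, ::-1][:, :K]               # top-K eigenvectors of F
    V = V[:, ::-1][:, :K]               # top-K eigenvectors of G

    Sigma = U.T @ X @ V                 # K x K core (diagonal up to signs)
    X_K = U @ Sigma @ V.T               # rank-K approximation of X
    print(np.linalg.norm(X - X_K))      # Frobenius-norm approximation error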

2DSVD

In 2DSVD, we deal with a set of 2D matrices (X_1, ..., X_n). We construct the row-row and column-column covariance matrices

F = \sum_i X_i X_i^T , \quad G = \sum_i X_i^T X_i ,

in exactly the same manner as in SVD, and compute their eigenvectors U and V. We approximate X_i as

X_i \approx U U^T X_i V V^T = U (U^T X_i V) V^T = U M_i V^T

in a fashion identical to SVD. This gives a near-optimal low-rank approximation of (X_1, ..., X_n) with the objective function

J = \sum_i \| X_i - U M_i V^T \|^2 .
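
A minimal NumPy sketch of 2DSVD follows (again not from the original article; n, the matrix sizes, and K are arbitrary illustrative choices). It accumulates F and G over the whole set, extracts the shared factors U and V, and evaluates the objective J:

    import numpy as np

    # 2DSVD of a set of matrices X_1, ..., X_n, following the
    # construction above.
    rng = np.random.default_rng(0)
    n, r, c, K = 20, 10, 8, 3                # set size, matrix shape, rank (assumed)
    Xs = [rng.standard_normal((r, c)) for _ in range(n)]

    F = sum(X @ X.T for X in Xs)             # row-row covariance matrix
    G = sum(X.T @ X for X in Xs)             # column-column covariance matrix

    _, U = np.linalg.eigh(F)
    _, V = np.linalg.eigh(G)
    U = U[:, ::-1][:, :K]                    # top-K eigenvectors of F
    V = V[:, ::-1][:, :K]                    # top-K eigenvectors of G

    Ms = [U.T @ X @ V for X in Xs]           # K x K core matrices M_i
    J = sum(np.linalg.norm(X - U @ M @ V.T)**2 for X, M in zip(Xs, Ms))
    print(J)                                 # value of the objective function above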

Error bounds similar to those of the Eckart–Young theorem also exist.

2DSVD is mostly used in image compression and representation.
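
As an illustration of the compression (with sizes chosen arbitrarily here, not taken from the article): storing n = 1000 images of size 100 × 100 directly takes n · r · c = 10,000,000 numbers, whereas a rank K = 10 2DSVD stores U (100 × 10), V (100 × 10), and the core matrices M_i (10 × 10 each), i.e. 1,000 + 1,000 + 100,000 = 102,000 numbers, roughly a 98-fold reduction.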

References

  • Chris Ding and Jieping Ye. "Two-dimensional Singular Value Decomposition (2DSVD) for 2D Maps and Images". Proc. SIAM Int'l Conf. Data Mining (SDM'05), pp. 32–43, April 2005.
  • Jieping Ye. "Generalized Low Rank Approximations of Matrices". Machine Learning Journal, Vol. 61, pp. 167–191, 2005.