SVD (Singular Value Decomposition) is one of my favorite topics in linear algebra. It’s almost magical to factorize any matrix into a product of two orthogonal matrices and a diagonal matrix.
Here is a proof of the existence of the SVD, in my poor handwriting :p
In other words, any A can be decomposed into the product of U, Σ and V, where Σ contains the singular values along its diagonal, and U and V are orthogonal (unitary, in the complex case) matrices. A few of its many applications:
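This factorization is a one-liner in NumPy. A minimal sketch (the matrix here is just a random example) that computes the thin SVD and verifies the reconstruction:

```python
import numpy as np

# A small real-valued example, so U and V are orthogonal.
A = np.random.rand(4, 3)

# full_matrices=False returns the "thin" SVD:
# U is 4x3, s holds the 3 singular values, Vt is 3x3.
U, s, Vt = np.linalg.svd(A, full_matrices=False)

# Singular values come back sorted in descending order.
assert np.all(s[:-1] >= s[1:])

# Reconstruct A from the factors: A = U @ diag(s) @ V^T.
A_rec = U @ np.diag(s) @ Vt
assert np.allclose(A, A_rec)
```

Note that `np.linalg.svd` returns Vᵀ (here `Vt`) rather than V itself.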
- Image compression
- Low rank approximations
- Rank determination
- Least squares
and a million more…
Let’s look at one such application in computer vision:
What if we delete the small singular values from Σ and the corresponding columns from U and V? We then obtain the projection of A onto a lower-dimensional subspace. This technique can compress an image at the cost of some high-frequency information.
Using this code, we can reconstruct the image using the first N components
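The original code isn't reproduced here, so this is my own sketch of rank-N reconstruction via truncated SVD (the function name `reconstruct` is mine):

```python
import numpy as np

def reconstruct(A, n):
    """Return the best rank-n approximation of A (Eckart-Young)
    by keeping only the n largest singular values and the
    corresponding columns of U and rows of V^T."""
    U, s, Vt = np.linalg.svd(A, full_matrices=False)
    return U[:, :n] @ np.diag(s[:n]) @ Vt[:n, :]
```

For a grayscale image, pass the 2D pixel array directly; for an RGB image, apply this per channel and stack the results.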
There are 942 nonzero singular values for the original image, since it's a 942x942 image with linearly independent pixel rows.
We can recover most of the low-frequency information in the image using only the first 15-30 SVD components. Even finer details like eyeballs are present in the reconstructed image, with a compression rate of ~2000-4000.
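One way to quantify how much of the image a rank-n approximation retains is the relative error in the Frobenius norm. An illustrative sketch, where `img` is a random placeholder standing in for the post's 942x942 grayscale image:

```python
import numpy as np

# Placeholder image; substitute the real 942x942 pixel array here.
rng = np.random.default_rng(0)
img = rng.random((128, 128))

U, s, Vt = np.linalg.svd(img, full_matrices=False)

# Relative Frobenius error shrinks as we keep more components.
for n in (5, 15, 30):
    approx = U[:, :n] @ np.diag(s[:n]) @ Vt[:n, :]
    rel_err = np.linalg.norm(img - approx) / np.linalg.norm(img)
    print(f"rank {n}: relative error {rel_err:.3f}")
```

By the Eckart-Young theorem this truncation is optimal: no other rank-n matrix gets closer to the image in the Frobenius norm.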