PCA, as mentioned earlier, is a dimensionality reduction technique. It has numerous applications, such as visualization of high-dimensional data, facial recognition, computer vision, image compression, finding patterns in data, data mining, bioinformatics, psychology, and analyzing and forecasting stock data. Here we discuss image compression as one of these applications.
Subsection 10.2.1 Image compression with PCA
Similar to SVD, we can also compress images using PCA. Given an image, we first separate it into its RGB channels and apply PCA separately to the red, green, and blue channels. Next, we project each channel onto its first \(k\) principal components and then combine the three reconstructed channels to obtain the transformed image with \(k\) principal components.
Example 10.2.1.
Consider an image of a rose as shown in Figure 10.2.2. This image is a \(600\times 800\times 3\) array.
After applying PCA, taking the first 5, 20, and 50 principal components, and combining the three channels, we get the approximate images shown in Figures 10.2.6, 10.2.7, and 10.2.8, respectively. Each channel is of size \(600\times 800\text{.}\)
Figure 10.2.6. 5 components
Figure 10.2.7. 20 components
Figure 10.2.8. 50 components
We can see from the images that the first 50 components already give a very good approximation to the original image.
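The following is a minimal sketch of this channel-wise procedure in Python, assuming scikit-learn and matplotlib are available and that the rose image is stored in a file such as rose.png; the file name and helper names are only illustrative, not part of the example above.

import numpy as np
from sklearn.decomposition import PCA

def compress_channel(channel, k):
    """Project one 2D channel onto its first k principal components
    and reconstruct it, giving a lossy approximation of the channel."""
    pca = PCA(n_components=k)
    scores = pca.fit_transform(channel)      # shape (rows, k)
    return pca.inverse_transform(scores)     # shape (rows, cols)

def compress_image(img, k):
    """Apply channel-wise PCA compression to an RGB image of shape
    (rows, cols, 3) with values in [0, 1], clipping the result back
    into [0, 1]."""
    channels = [compress_channel(img[:, :, c], k) for c in range(3)]
    return np.clip(np.dstack(channels), 0.0, 1.0)

# Illustrative usage (assumes 'rose.png' is the 600x800 RGB image):
# from matplotlib import pyplot as plt
# img = plt.imread("rose.png").astype(float)
# img = img / 255.0 if img.max() > 1.0 else img
# approx = compress_image(img, k=50)
# plt.imshow(approx)
# plt.show()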
Subsection 10.2.2 Relation Between SVD and PCA
Consider a matrix \(X\) of size \(n\times d\) whose columns are mean-centered. We can apply both SVD and PCA to \(X\text{.}\) Suppose the SVD of \(X\) is given by
\begin{equation*}
X = U\Sigma V^T\text{.}
\end{equation*}
Let \(U=[u_1~\ldots~ u_n]\) and \(V^T=\begin{bmatrix}v_1^T\\v_2^T\\\vdots\\v_d^T \end{bmatrix}\text{.}\) Then we have the following relations.
The covariance matrix of \(X\) is \(S=\frac{1}{n-1}X^TX\text{.}\) Thus \(S\) and \(X^TX\) differ only by the scalar factor \(\frac{1}{n-1}\text{,}\) so they have the same eigenvectors. If \(\lambda_1,\ldots, \lambda_r\) are the non-zero eigenvalues of \(S\) and \(\sigma_1,\ldots, \sigma_r\) are the non-zero singular values of \(X\text{,}\) then they are related by
\begin{equation*}
\sigma_i^2=(n-1)\lambda_i, i = 1, 2,\ldots, r\text{.}
\end{equation*}
The relation \(X^TX = V\left( \Sigma^T\Sigma\right) V^T\) shows that the right singular vectors are the same as the principal components. The left singular vectors are given by
\begin{equation*}
u_i = \frac{1}{\sigma_i}Xv_i, \quad i = 1, 2,\ldots, r\text{.}
\end{equation*}
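A short numerical sketch of these relations, using a randomly generated mean-centered data matrix (all variable names here are only illustrative), is given below.

import numpy as np

rng = np.random.default_rng(0)
n, d = 100, 5
X = rng.normal(size=(n, d))
X = X - X.mean(axis=0)                 # mean-center the columns

# SVD of the centered data matrix
U, sigma, Vt = np.linalg.svd(X, full_matrices=False)

# Eigen-decomposition of the covariance matrix S = X^T X / (n - 1)
S = X.T @ X / (n - 1)
lam, W = np.linalg.eigh(S)
lam, W = lam[::-1], W[:, ::-1]         # sort eigenvalues in decreasing order

# sigma_i^2 = (n - 1) * lambda_i
print(np.allclose(sigma**2, (n - 1) * lam))       # True

# Right singular vectors = principal components (up to sign)
print(np.allclose(np.abs(Vt), np.abs(W.T)))       # True

# Left singular vectors: u_i = X v_i / sigma_i
print(np.allclose(U, X @ Vt.T / sigma))           # True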