Code360 powered by Coding Ninjas X Naukri.com
Table of contents
1.
Introduction
2.
Mathematical Equation
2.1.
Example
3.
Applications of SVD
3.1.
Calculating pseudo-inverse
3.2.
Rank, Range, and Null Space
3.3.
Curve Fitting Problems
4.
Implementation
4.1.
Image compression using SVD
5.
Frequently Asked Questions
6.
Key Takeaways
Last Updated: Mar 27, 2024

Singular Value Decomposition

Author: Soham Medewar

Introduction

Singular Value Decomposition (SVD) is the factorization of a matrix: a given matrix is broken down into three matrices whose product equals the original matrix. SVD is the most widely used method for matrix decomposition and has applications in data reduction, data compression, and more. In this article, we will discuss its mathematical method, a worked example, and its implementation in code.

Mathematical Equation

Let us consider a matrix A of dimension m×n. As shown below, we can break matrix A into three matrices U, Σ, and V.

A = U · Σ · V^T

(V^T denotes the transpose of the matrix V)

Here the dimensions of U are m×m, the dimensions of Σ are m×n, and the dimensions of V are n×n.

The columns of U are called the left singular vectors of matrix A, and the columns of V are called the right singular vectors of matrix A. Σ is known as the sigma matrix; it is a diagonal matrix.

  • U is a matrix of the orthonormal eigenvectors of AA^T.
  • V^T is the transpose of a matrix of the orthonormal eigenvectors of A^TA.
  • Σ is a diagonal matrix of singular values, which are the square roots of the eigenvalues of A^TA.
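These shapes and properties can be checked directly with scipy. A minimal sketch with a hypothetical 2×3 matrix (note that scipy.linalg.svd returns the singular values as a 1-D array, which we place on the diagonal of an m×n matrix):

```python
import numpy as np
from scipy.linalg import svd

# a hypothetical 2x3 matrix to illustrate the shapes of U, Sigma, and V^T
A = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0]])

U, s, V_T = svd(A)          # full SVD: U is 2x2, V_T is 3x3
Sigma = np.zeros(A.shape)   # Sigma has the same m x n shape as A
np.fill_diagonal(Sigma, s)  # singular values sit on the diagonal

# U and V are orthogonal, and U . Sigma . V^T reconstructs A
print(U.shape, Sigma.shape, V_T.shape)
print(np.allclose(A, U @ Sigma @ V_T))
```

The reconstruction check at the end confirms that the three factors multiply back to the original matrix.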

Example

Let us understand SVD with an example.

Consider the matrix A below and find its SVD.

A =
[ 3   2   2 ]
[ 2   3  -2 ]

The first step in finding the SVD is to calculate the singular values σi by finding the eigenvalues of AA^T.

AA^T =
[ 17   8 ]
[  8  17 ]

The characteristic polynomial is obtained from:

det(AA^T − λI) = 0

Therefore, we will get:

λ² − 34λ + 225 = 0

The eigenvalues are 25 and 9.

Singular values = sqrt(25), sqrt(9), i.e., 5 and 3.

σ1 = 5, σ2 = 3.

Next, we find the right singular vectors by finding an orthonormal set of eigenvectors of A^TA. Its eigenvalues are 0, 9, and 25, and since A^TA is symmetric, the eigenvectors will be orthogonal.

A^TA =
[ 13  12   2 ]
[ 12  13  -2 ]
[  2  -2   8 ]

For λ = 25, we have

A^TA − 25I =
[ -12   12    2 ]
[  12  -12   -2 ]
[   2   -2  -17 ]

which is row-reduced to:

[ 1  -1   0 ]
[ 0   0   1 ]
[ 0   0   0 ]

A unit-length vector in that direction is:

v1 = (1/√2, 1/√2, 0)

Similarly, we calculate the unit-length vector for λ = 9.

A^TA − 9I =
[  4  12   2 ]
[ 12   4  -2 ]
[  2  -2  -1 ]

which is row-reduced to:

[ 1   0  -1/4 ]
[ 0   1   1/4 ]
[ 0   0    0  ]

A unit-length vector in that direction is:

v2 = (1/√18, -1/√18, 4/√18)

For the last eigenvector (λ = 0), we can find the unit vector perpendicular to v1 and v2:

v1^T · v3 = 0

v2^T · v3 = 0

Solving these equations, we get:

v3 = (2/3, -2/3, -1/3)

So far, we know Σ and V^T:

Σ =
[ 5   0   0 ]
[ 0   3   0 ]

V^T =
[  1/√2    1/√2     0    ]
[  1/√18  -1/√18   4/√18 ]
[  2/3    -2/3    -1/3   ]

We can calculate U by using the formula ui = (1/σi) · A · vi, which gives:

U =
[ 1/√2    1/√2 ]
[ 1/√2   -1/√2 ]

Hence, matrix A is factorized into three matrices such that A = U · Σ · V^T.
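The hand computation can be verified numerically. Assuming the example matrix is A = [[3, 2, 2], [2, 3, -2]] (consistent with the characteristic polynomial λ² − 34λ + 225 of AA^T above), a quick check:

```python
import numpy as np
from scipy.linalg import svd

# the example matrix, assumed from the characteristic polynomial above
A = np.array([[3.0, 2.0, 2.0],
              [2.0, 3.0, -2.0]])

# scipy returns the singular values in descending order
U, s, V_T = svd(A)
print(s)  # expected: 5 and 3
```

The computed singular values agree with σ1 = 5 and σ2 = 3 from the derivation (the signs of individual singular vectors may differ, since each column of U and V is only determined up to sign).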


Applications of SVD

Calculating pseudo-inverse

SVD is used to calculate the pseudo-inverse, or Moore-Penrose inverse, of a matrix. The pseudo-inverse is a generalization of the matrix inverse to matrices that may not be invertible. If a matrix is not invertible, its pseudo-inverse still exists, and if the matrix is invertible, its inverse is equal to its pseudo-inverse. It is denoted by A^+.

Method to compute the pseudo-inverse of a matrix:

Let us consider a matrix A.

SVD of a matrix can be written as:

A = U · Σ · V^T

Multiplying both sides by A^-1:

A^-1 · A = A^-1 · U · Σ · V^T

I = A^-1 · U · Σ · V^T

Now, multiply both sides on the right by V (since V is orthogonal, V^T · V = I):

V = A^-1 · U · Σ · V^T · V

V = A^-1 · U · Σ

Now, multiply by Σ^-1. As Σ is a diagonal matrix, its inverse is obtained by inverting each diagonal entry:

diag(a1, a2, a3, a4, …)^-1 = diag(1/a1, 1/a2, 1/a3, 1/a4, …)

V · Σ^-1 = A^-1 · U · Σ · Σ^-1

V · Σ^-1 = A^-1 · U

Now, multiply on the right by U^T (since U is orthogonal, U · U^T = I):

V · Σ^-1 · U^T = A^-1 · U · U^T

V · Σ^-1 · U^T = A^-1

The above equation gives the pseudo-inverse: A^+ = V · Σ^-1 · U^T.
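The formula V · Σ^-1 · U^T can be sketched in code. Here A is a hypothetical full-row-rank matrix so that every singular value is non-zero; for rank-deficient matrices, the reciprocals of zero singular values are replaced by zero instead:

```python
import numpy as np
from scipy.linalg import svd

# a hypothetical full-row-rank matrix
A = np.array([[3.0, 2.0, 2.0],
              [2.0, 3.0, -2.0]])

# reduced SVD, so Sigma is square and invertible here
U, s, V_T = svd(A, full_matrices=False)

# pseudo-inverse: V . Sigma^-1 . U^T
A_pinv = V_T.T @ np.diag(1.0 / s) @ U.T

# agrees with NumPy's built-in Moore-Penrose pseudo-inverse
print(np.allclose(A_pinv, np.linalg.pinv(A)))
```

For a matrix with full row rank like this one, A · A^+ equals the identity, which is a quick sanity check on the result.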

Rank, Range, and Null Space

  • The rank of a matrix can be calculated using SVD: it equals the number of non-zero singular values.
  • The range of matrix A is spanned by the left singular vectors in U corresponding to the non-zero singular values.
  • The null space of matrix A is spanned by the right singular vectors in V corresponding to the zero singular values.
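For instance, counting the singular values above a small tolerance (floating-point "zeros" are rarely exact) recovers the rank. A short sketch with a hypothetical rank-deficient matrix:

```python
import numpy as np
from scipy.linalg import svd

# a hypothetical rank-deficient matrix: the third row is the sum of the first two
A = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0],
              [5.0, 7.0, 9.0]])

s = svd(A, compute_uv=False)  # singular values only
rank = int(np.sum(s > 1e-10)) # count singular values above a tolerance
print(rank, np.linalg.matrix_rank(A))
```

Both counts agree, since np.linalg.matrix_rank itself uses the SVD with a similar tolerance internally.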

Curve Fitting Problems

SVD is used in curve fitting to minimize the least-squares error: the pseudo-inverse gives the least-squares solution to an overdetermined system of equations.
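As a small sketch, a straight-line fit with made-up data points (np.linalg.pinv computes the pseudo-inverse via SVD internally, so M^+ · y is the least-squares solution):

```python
import numpy as np

# hypothetical data lying near the line y = 2x + 1
x = np.array([0.0, 1.0, 2.0, 3.0])
y = np.array([1.1, 2.9, 5.2, 6.9])

# design matrix for the model y = m*x + c
M = np.column_stack([x, np.ones_like(x)])

# least-squares solution via the pseudo-inverse
m, c = np.linalg.pinv(M) @ y
print(m, c)  # slope and intercept close to 2 and 1
```

The fitted slope and intercept minimize the sum of squared residuals over the four points.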

Other than this, SVD is also used in digital signal processing and image processing.

Implementation

We will implement Singular Value Decomposition in Python.

import numpy as np
from scipy.linalg import svd

# define a matrix
X = np.array([[3, 3, 2], [2, 3, -2]])
print(X)

# performing SVD
U, sigma, V_transpose = svd(X)

#printing matrices
print("Matrix U: ",U)
print("Sigma matrix: ",sigma)
print("Transpose of V: ",V_transpose)

 

[[ 3  3  2]
 [ 2  3 -2]]
Matrix U:  [[ 0.7815437 -0.6238505]
 [ 0.6238505  0.7815437]]
Sigma matrix:  [5.54801894 2.86696457]
Transpose of V:  [[ 0.64749817  0.7599438   0.05684667]
 [-0.10759258  0.16501062 -0.9804057 ]
 [-0.75443354  0.62869461  0.18860838]]

Image compression using SVD

import numpy as np
from scipy.linalg import svd
import matplotlib.pyplot as plt
from skimage import data
from skimage.color import rgb2gray

astronaut = data.astronaut()

# displaying the astronaut image
plt.imshow(astronaut)

# converting to grayscale
gray_astronaut = rgb2gray(astronaut)

# calculate the SVD and plot the rank-r approximations
U, S, V_T = svd(gray_astronaut, full_matrices=False)
S = np.diag(S)
fig, ax = plt.subplots(4, 2, figsize=(5, 15))

curr_fig = 0
for r in [10, 50, 100, 500]:
    # rank-r approximation: keep only the r largest singular values
    ast = U[:, :r] @ S[0:r, :r] @ V_T[:r, :]
    ax[curr_fig][0].imshow(ast, cmap='gray')
    ax[curr_fig][0].set_title("k = " + str(r))
    ax[curr_fig][0].axis('off')
    ax[curr_fig][1].set_title("Original Image")
    ax[curr_fig][1].imshow(gray_astronaut, cmap='gray')
    ax[curr_fig][1].axis('off')
    curr_fig += 1
plt.show()

Frequently Asked Questions

  1. What is SVD used for?
    Singular Value Decomposition (SVD) is a widely used technique to decompose a matrix into several component matrices, exposing many of the useful and interesting properties of the original matrix.
     
  2. Does SVD always exist?
    Yes, the SVD exists for any matrix, square or rectangular.
     
  3. What is an advantage of SVD?
    SVD helps simplify data and remove noise from it; it is also used to represent a dataset in a simple, reduced form.

Key Takeaways

In this article, we have discussed the following topics:

  • Method to calculate singular value decomposition
  • Application of SVD
  • Image compression using SVD




Happy Coding!
