Singular Value Decomposition (SVD):
Singular Value Decomposition (SVD) is a fundamental matrix factorization technique in linear algebra that decomposes any m×n matrix into three component matrices: U, Σ, and Vᵀ. It's widely used in signal processing, statistics, and machine learning.
The SVD decomposes a matrix A into:
A = UΣVᵀ
Where:
U is an m×m orthogonal matrix whose columns are the left singular vectors,
Σ is an m×n diagonal matrix whose non-negative diagonal entries are the singular values,
Vᵀ is the transpose of an n×n orthogonal matrix V whose columns are the right singular vectors.
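As a quick numerical sketch (assuming NumPy, which this page itself does not use), the decomposition can be computed and verified directly:

import numpy as np

# An arbitrary 2x3 example matrix
A = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0]])

# full_matrices=True returns U (2x2), the singular values s, and Vt (3x3)
U, s, Vt = np.linalg.svd(A, full_matrices=True)

# Rebuild the m x n diagonal matrix Sigma from the singular values
Sigma = np.zeros(A.shape)
Sigma[:len(s), :len(s)] = np.diag(s)

# Confirm A = U Sigma Vt up to floating-point error
print(np.allclose(A, U @ Sigma @ Vt))  # True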
Key Properties: The singular values in Σ are always non-negative and, by convention, are ordered from largest to smallest. The number of non-zero singular values equals the rank of the matrix.
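A short sketch of the rank property, again assuming NumPy; the matrix and tolerance below are chosen purely for illustration:

import numpy as np

# A rank-deficient 3x3 matrix: the third row is the sum of the first two
A = np.array([[1.0, 2.0, 3.0],
              [4.0, 5.0, 6.0],
              [5.0, 7.0, 9.0]])

s = np.linalg.svd(A, compute_uv=False)  # singular values, largest first

# Count singular values above a small tolerance to estimate the rank
tol = max(A.shape) * np.finfo(float).eps * s[0]
print(np.sum(s > tol))  # 2, matching np.linalg.matrix_rank(A)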
Common Uses: Principal Component Analysis (PCA), data compression, noise reduction, solving linear systems, pseudoinverse computation, and recommendation systems.
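To make the compression and noise-reduction uses concrete, here is a hedged NumPy sketch of low-rank approximation; the signal-plus-noise data is synthetic, invented only for the demo:

import numpy as np

rng = np.random.default_rng(0)
# Synthetic data: a rank-2 signal plus small noise
signal = rng.standard_normal((50, 2)) @ rng.standard_normal((2, 30))
A = signal + 0.01 * rng.standard_normal((50, 30))

U, s, Vt = np.linalg.svd(A, full_matrices=False)

# Keeping only the k largest singular values gives the best rank-k
# approximation in the least-squares sense (Eckart-Young theorem)
k = 2
A_k = (U[:, :k] * s[:k]) @ Vt[:k, :]
print(np.linalg.norm(A - A_k))  # small: the truncation mostly discards noise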
Instructions: Enter your matrix using commas or spaces to separate elements and semicolons to separate rows. For example, "1,2,3;4,5,6" represents a 2×3 matrix.
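If you want to script against the same input format, a hypothetical parser (parse_matrix is illustrative, not part of this tool) might look like:

import numpy as np

def parse_matrix(text):
    # Commas or spaces separate elements; semicolons separate rows
    rows = [row.replace(",", " ").split() for row in text.split(";")]
    return np.array(rows, dtype=float)

print(parse_matrix("1,2,3;4,5,6"))
# [[1. 2. 3.]
#  [4. 5. 6.]]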
Q1: What's the difference between SVD and eigenvalue decomposition?
A: SVD works for any matrix, including rectangular ones, while eigenvalue decomposition applies only to square matrices (and not every square matrix is diagonalizable). The SVD always exists and can be computed in a numerically stable way.
Q2: How are singular values related to eigenvalues?
A: The singular values of A are the square roots of the eigenvalues of AᵀA (or AAᵀ).
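A quick numerical check of this relationship, assuming NumPy; the example matrix is arbitrary:

import numpy as np

A = np.array([[3.0, 1.0],
              [1.0, 3.0],
              [0.0, 2.0]])

s = np.linalg.svd(A, compute_uv=False)

# Eigenvalues of AᵀA, sorted largest-first to match the SVD convention
eigvals = np.linalg.eigvalsh(A.T @ A)[::-1]

print(np.allclose(s, np.sqrt(eigvals)))  # True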
Q3: What do the U and V matrices represent?
A: U's columns (the left singular vectors) are eigenvectors of AAᵀ, and V's columns (the right singular vectors) are eigenvectors of AᵀA. The columns paired with non-zero singular values form orthonormal bases for the column space and row space of A, respectively.
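This, too, can be checked numerically (a NumPy sketch with an arbitrary matrix):

import numpy as np

A = np.array([[2.0, 0.0, 1.0],
              [0.0, 1.0, 3.0]])

U, s, Vt = np.linalg.svd(A, full_matrices=False)
V = Vt.T

# Each column v_i of V satisfies (AᵀA) v_i = σ_i² v_i
print(np.allclose(A.T @ A @ V, V * s**2))  # True

# Likewise, each column u_i of U satisfies (AAᵀ) u_i = σ_i² u_i
print(np.allclose(A @ A.T @ U, U * s**2))  # True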
Q4: When is SVD unique?
A: When the singular values are distinct, the SVD is unique up to simultaneous sign flips of paired left and right singular vectors. When a singular value is repeated, the corresponding singular subspaces can be rotated freely, so those vectors are not unique.
Q5: How is SVD used in data science?
A: In PCA for dimensionality reduction, in recommender systems (collaborative filtering), and for low-rank approximations that reduce noise in data.
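As an illustrative PCA-via-SVD sketch in NumPy (the dataset is random, purely for demonstration):

import numpy as np

rng = np.random.default_rng(1)
X = rng.standard_normal((200, 5))  # hypothetical data: 200 samples, 5 features

# PCA via SVD: center the data, then decompose
Xc = X - X.mean(axis=0)
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)

# Rows of Vt are the principal directions; project onto the first two
scores = Xc @ Vt[:2].T
print(scores.shape)  # (200, 2): the reduced representation

# Each component's share of the total variance
explained = s**2 / np.sum(s**2)
print(explained)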