CS115 SVD QR Decompositions

MATRIX FACTORIZATIONS

 A factorization of a matrix A is an equation that expresses A as a product of two or more matrices.
 Whereas matrix multiplication involves a synthesis of data (combining the effects of two or more linear transformations into a single matrix), matrix factorization is an analysis of data.
 Example: diagonalizing a matrix, 𝐴 = 𝑃𝐷𝑃⁻¹ (the eigendecomposition).
THE SINGULAR VALUE DECOMPOSITION

 Theorem (The Singular Value Decomposition): Let A be an 𝑚 × 𝑛 matrix with rank r. Then there exists an 𝑚 × 𝑛 "diagonal" matrix Σ whose diagonal entries are non-negative (the first r of them are the singular values of A, 𝜎1 ≥ 𝜎2 ≥ ⋯ ≥ 𝜎𝑟 > 0, and the rest are zero), and there exist an 𝑚 × 𝑚 orthogonal matrix U and an 𝑛 × 𝑛 orthogonal matrix V such that

𝐴 = 𝑈Σ𝑉ᵀ
 The columns of U in such a decomposition are called the left singular vectors of A, and the columns of V are called the right singular vectors of A.
 The diagonal entries of Σ are called the singular values of A.
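The decomposition above can be computed numerically; a minimal sketch using NumPy's `numpy.linalg.svd` (the matrix shown here is the one used in the worked example later in these notes):

```python
import numpy as np

# A small rectangular matrix to decompose
A = np.array([[1.0, 1.0, 0.0],
              [0.0, 0.0, 1.0]])

# full_matrices=True returns U (m x m) and V^T (n x n), matching the theorem
U, s, Vt = np.linalg.svd(A, full_matrices=True)

# Rebuild the m x n Sigma from the vector of singular values
Sigma = np.zeros(A.shape)
Sigma[:len(s), :len(s)] = np.diag(s)

# U @ Sigma @ Vt reproduces A
print(np.allclose(U @ Sigma @ Vt, A))  # True
```

Note that `svd` returns 𝑉ᵀ (not V) and the singular values as a vector rather than the full matrix Σ.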
 Proof: Let 𝜆𝑖 and 𝑣𝑖 be as in the preceding theorem, so that {𝐴𝑣1, . . . , 𝐴𝑣𝑟} is an orthogonal basis for Col A.
 Normalize each 𝐴𝑣𝑖 to obtain an orthonormal basis {𝑢1, . . . , 𝑢𝑟}, where

𝑢𝑖 = (1/‖𝐴𝑣𝑖‖) 𝐴𝑣𝑖 = (1/𝜎𝑖) 𝐴𝑣𝑖

 and therefore

𝐴𝑣𝑖 = 𝜎𝑖𝑢𝑖 (1 ≤ i ≤ r)    (*)

 Now extend {𝑢1, . . . , 𝑢𝑟} to an orthonormal basis {𝑢1, . . . , 𝑢𝑚} of ℝ𝑚, and let

𝑈 = [𝑢1 𝑢2 . . . 𝑢𝑚] and 𝑉 = [𝑣1 𝑣2 . . . 𝑣𝑛]

 By construction, U and V are orthogonal matrices.

 Also, from (*),

𝐴𝑉 = [𝐴𝑣1 . . . 𝐴𝑣𝑟 0 . . . 0] = [𝜎1𝑢1 . . . 𝜎𝑟𝑢𝑟 0 . . . 0]

 Let D be the 𝑟 × 𝑟 diagonal matrix with diagonal entries 𝜎1, . . . , 𝜎𝑟, and let Σ be the 𝑚 × 𝑛 matrix with D in its upper-left corner and zeros elsewhere. Then 𝐴𝑉 = 𝑈Σ.
 Since V is an orthogonal matrix, 𝑈Σ𝑉ᵀ = 𝐴𝑉𝑉ᵀ = 𝐴. ∎

 Example: Construct a singular value decomposition of

    [1 1 0]
𝐴 = [0 0 1]

 Solution: The construction can be divided into three steps.
 Step 1. Find an orthogonal diagonalization of 𝐴ᵀ𝐴. That is, find the eigenvalues of 𝐴ᵀ𝐴 and a corresponding orthonormal set of eigenvectors.
 Step 2. Set up V and Σ. Arrange the eigenvalues of 𝐴ᵀ𝐴 in decreasing order.
 Step 3. Construct U. When A has rank r, the first r columns of U are the normalized vectors obtained from 𝐴𝑣1, . . . , 𝐴𝑣𝑟.
 Step 1. Find an orthogonal diagonalization of 𝐴ᵀ𝐴.

        [1 0]            [1 1 0]
𝐴ᵀ𝐴 =   [1 0] [1 1 0] =  [1 1 0]
        [0 1] [0 0 1]    [0 0 1]

 Solving det(𝐴ᵀ𝐴 − 𝜆𝐼) = 0 gives the eigenvalues 𝜆 = 2, 𝜆 = 1, and 𝜆 = 0.
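Step 1 can be checked numerically; a quick sketch with NumPy's `numpy.linalg.eigh`, which returns the eigenvalues of a symmetric matrix in ascending order:

```python
import numpy as np

A = np.array([[1.0, 1.0, 0.0],
              [0.0, 0.0, 1.0]])

# A^T A is symmetric, so eigh applies; eigenvalues come back ascending
eigvals, eigvecs = np.linalg.eigh(A.T @ A)

# In decreasing order they match the hand computation: 2, 1, 0
print(np.allclose(sorted(eigvals, reverse=True), [2.0, 1.0, 0.0]))  # True
```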


 The corresponding unit eigenvectors are:

 Basis for 𝜆 = 2: 𝑣1 = [1/√2, 1/√2, 0]ᵀ
 Basis for 𝜆 = 1: 𝑣2 = [0, 0, 1]ᵀ
 Basis for 𝜆 = 0: 𝑣3 = [−1/√2, 1/√2, 0]ᵀ
 Step 2. Set up V and Σ. Arrange the eigenvalues of 𝐴ᵀ𝐴 in decreasing order. The corresponding unit eigenvectors 𝑣1, 𝑣2, and 𝑣3 are the right singular vectors of A.

                  [1/√2  0  −1/√2]
𝑉 = [𝑣1 𝑣2 𝑣3] =  [1/√2  0   1/√2]
                  [0     1   0   ]

 The square roots of the eigenvalues are the singular values:

𝜎1 = √2, 𝜎2 = 1, 𝜎3 = 0
 The matrix Σ has the same shape as A (here 2 × 3), with the singular values on its diagonal:

    [√2  0  0]
Σ = [0   1  0]

 Step 3. Construct U. When A has rank r, the first r columns of U are the normalized vectors obtained from 𝐴𝑣1, . . . , 𝐴𝑣𝑟. Since 𝐴 = 𝑈Σ𝑉ᵀ and V is orthogonal, 𝐴𝑉 = 𝑈Σ.
 Thus

𝑢1 = (1/𝜎1) 𝐴𝑣1 = [1, 0]ᵀ
𝑢2 = (1/𝜎2) 𝐴𝑣2 = [0, 1]ᵀ

 Note that {𝑢1, 𝑢2} is already a basis for ℝ², so no additional vectors are needed for U, and U = [𝑢1 𝑢2]. The singular value decomposition of A is

    [1  0] [√2  0  0] [ 1/√2  1/√2  0]
𝐴 = [0  1] [0   1  0] [ 0     0     1]
                      [−1/√2  1/√2  0]
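The hand-computed factors can be verified by direct multiplication; a small NumPy sketch:

```python
import numpy as np

s2 = np.sqrt(2.0)

# The factors found by hand: U (2x2), Sigma (2x3), V^T (3x3)
U = np.array([[1.0, 0.0],
              [0.0, 1.0]])
Sigma = np.array([[s2, 0.0, 0.0],
                  [0.0, 1.0, 0.0]])
Vt = np.array([[ 1/s2, 1/s2, 0.0],
               [ 0.0,  0.0,  1.0],
               [-1/s2, 1/s2, 0.0]])

A = np.array([[1.0, 1.0, 0.0],
              [0.0, 0.0, 1.0]])
print(np.allclose(U @ Sigma @ Vt, A))  # True
```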
 Example (exercise): Construct a singular value decomposition of

    [4  11  14]
𝐴 = [8   7  −2]

 Application: image compression. Keeping only the k largest singular values of an image matrix yields its best rank-k approximation, which can be stored far more compactly than the original.
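The idea behind SVD-based image compression can be sketched as follows; this is a minimal illustration on a random matrix standing in for a grayscale image (the shape and the choice k = 10 are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
img = rng.random((64, 48))  # stand-in for a grayscale image

U, s, Vt = np.linalg.svd(img, full_matrices=False)

k = 10  # number of singular values to keep
img_k = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]

# Storage drops from 64*48 values to k*(64 + 48 + 1) values
print(img_k.shape, np.linalg.matrix_rank(img_k) <= k)  # (64, 48) True
```

Larger k gives a closer approximation at the cost of more storage; for real images, most of the "energy" is concentrated in the first few singular values.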
QR FACTORIZATION OF MATRICES

 Theorem (The QR Factorization): If A is an m × n matrix with linearly independent columns, then A can be factored as A = QR, where Q is an m × n matrix whose columns form an orthonormal basis for Col A and R is an n × n upper triangular invertible matrix with positive entries on its diagonal.

 Proof: The columns of A form a basis {𝑥1, . . . , 𝑥𝑛} for Col A. Construct an orthonormal basis {𝑢1, . . . , 𝑢𝑛} for W = Col A with property (1) of the Gram-Schmidt theorem. This basis may be constructed by the Gram-Schmidt process or some other means.
 Let

𝑄 = [𝐮1 𝐮2 . . . 𝐮𝑛]

 For k = 1, . . . , n, 𝑥𝑘 is in Span{𝑥1, . . . , 𝑥𝑘} = Span{𝐮1, . . . , 𝐮𝑘}. So there are constants 𝑟1𝑘, . . . , 𝑟𝑘𝑘 such that

𝑥𝑘 = 𝑟1𝑘𝐮1 + ⋯ + 𝑟𝑘𝑘𝐮𝑘 + 0 ∙ 𝐮𝑘+1 + ⋯ + 0 ∙ 𝐮𝑛

 We may assume that 𝑟𝑘𝑘 ≥ 0 (otherwise, multiply both 𝑟𝑘𝑘 and 𝐮𝑘 by −1). This shows that 𝑥𝑘 is a linear combination of the columns of Q using as weights the entries in the vector
𝐫𝑘 = [𝑟1𝑘, . . . , 𝑟𝑘𝑘, 0, . . . , 0]ᵀ

 That is, 𝑥𝑘 = 𝑄𝐫𝑘 for k = 1, . . . , n. Let R = [𝐫1 . . . 𝐫𝑛]. Then

A = [𝑥1 . . . 𝑥𝑛] = [𝑄𝐫1 . . . 𝑄𝐫𝑛] = QR

 The fact that R is invertible follows easily from the fact that the columns of A are linearly independent. Since R is clearly upper triangular, its nonnegative diagonal entries must be positive. ∎
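NumPy computes this factorization directly; a sketch with `numpy.linalg.qr`, using the matrix from the example below. Note that NumPy does not guarantee positive diagonal entries in R, so a per-column sign flip is applied to match the theorem's convention:

```python
import numpy as np

A = np.array([[1.0, 0.0, 0.0],
              [1.0, 1.0, 0.0],
              [1.0, 1.0, 1.0],
              [1.0, 1.0, 1.0]])

Q, R = np.linalg.qr(A)  # "reduced" QR: Q is 4x3, R is 3x3

# Flip signs so that R has a positive diagonal, as in the theorem;
# (Q S)(S R) = Q R because S is diagonal with entries +-1
signs = np.sign(np.diag(R))
Q, R = Q * signs, signs[:, None] * R

print(np.allclose(Q @ R, A))             # True
print(np.allclose(Q.T @ Q, np.eye(3)))   # True: columns of Q orthonormal
print(np.allclose(R, np.triu(R)))        # True: R is upper triangular
```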
THE GRAM-SCHMIDT PROCESS

 The Gram-Schmidt Process: Given a basis {𝑥1, . . . , 𝑥𝑝} for a nonzero subspace W of ℝⁿ, define

𝑣1 = 𝑥1
𝑣2 = 𝑥2 − ((𝑥2 ∙ 𝑣1)/(𝑣1 ∙ 𝑣1)) 𝑣1
𝑣3 = 𝑥3 − ((𝑥3 ∙ 𝑣1)/(𝑣1 ∙ 𝑣1)) 𝑣1 − ((𝑥3 ∙ 𝑣2)/(𝑣2 ∙ 𝑣2)) 𝑣2
⋮
𝑣𝑝 = 𝑥𝑝 − ((𝑥𝑝 ∙ 𝑣1)/(𝑣1 ∙ 𝑣1)) 𝑣1 − ((𝑥𝑝 ∙ 𝑣2)/(𝑣2 ∙ 𝑣2)) 𝑣2 − ⋯ − ((𝑥𝑝 ∙ 𝑣𝑝−1)/(𝑣𝑝−1 ∙ 𝑣𝑝−1)) 𝑣𝑝−1

 Then {𝑣1, . . . , 𝑣𝑝} is an orthogonal basis for W. In addition,

Span{𝑣1, . . . , 𝑣𝑘} = Span{𝑥1, . . . , 𝑥𝑘} for 1 ≤ 𝑘 ≤ 𝑝    (1)
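The recurrence above translates directly into code; a minimal sketch (the function name `gram_schmidt` is mine, and linear independence of the columns is assumed):

```python
import numpy as np

def gram_schmidt(X):
    """Orthogonalize the columns of X, assumed linearly independent."""
    V = []
    for x in X.T:                 # iterate over columns of X
        v = x.astype(float)
        for u in V:
            v = v - (x @ u) / (u @ u) * u  # subtract projection onto each earlier v
        V.append(v)
    return np.column_stack(V)

# Columns of X are x1, x2, x3 from the QR example below
X = np.array([[1.0, 0.0, 0.0],
              [1.0, 1.0, 0.0],
              [1.0, 1.0, 1.0],
              [1.0, 1.0, 1.0]])
V = gram_schmidt(X)

# Off-diagonal entries of V^T V vanish: the columns are orthogonal
G = V.T @ V
print(np.allclose(G - np.diag(np.diag(G)), 0))  # True
```

This is the classical formulation from the theorem; in floating point, the "modified" Gram-Schmidt variant (projecting the running v rather than x) is numerically more stable.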
 Example: Find a QR factorization of

    [1 0 0]
A = [1 1 0]
    [1 1 1]
    [1 1 1]

 Solution: The columns of A are the vectors 𝑥1, 𝑥2, and 𝑥3. Applying the Gram-Schmidt process gives an orthogonal basis for Col A = Span{𝑥1, 𝑥2, 𝑥3}:

𝑣1 = [1, 1, 1, 1]ᵀ, 𝑣2 = [−3, 1, 1, 1]ᵀ, 𝑣3 = [0, −2/3, 1/3, 1/3]ᵀ
 To simplify the arithmetic that follows, rescale 𝑣3 by letting 𝑣3 = 3𝑣3 = [0, −2, 1, 1]ᵀ. Then normalize the three vectors to obtain 𝑢1, 𝑢2, and 𝑢3, and use these vectors as the columns of Q:

    [1/2  −3/√12   0    ]
Q = [1/2   1/√12  −2/√6 ]
    [1/2   1/√12   1/√6 ]
    [1/2   1/√12   1/√6 ]

 By construction, the first k columns of Q are an orthonormal basis of Span{𝑥1, . . . , 𝑥𝑘}.
 From the proof of the theorem, A = QR for some R. To find R, observe that 𝑄ᵀ𝑄 = 𝐼, because the columns of Q are orthonormal. Hence

𝑄ᵀ𝐴 = 𝑄ᵀ(𝑄𝑅) = 𝐼𝑅 = 𝑅

 and

    [ 1/2     1/2    1/2    1/2  ] [1 0 0]   [2  3/2    1    ]
R = [−3/√12  1/√12  1/√12  1/√12 ] [1 1 0] = [0  3/√12  2/√12]
    [ 0     −2/√6   1/√6   1/√6  ] [1 1 1]   [0  0      2/√6 ]
                                   [1 1 1]
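The entries of R can be checked numerically; a short NumPy sketch (note the (3,3) entry works out to 2/√6 = 𝑢3 ∙ 𝑥3):

```python
import numpy as np

# The hand-computed Q and the original A
Q = np.array([[0.5, -3/np.sqrt(12),  0.0],
              [0.5,  1/np.sqrt(12), -2/np.sqrt(6)],
              [0.5,  1/np.sqrt(12),  1/np.sqrt(6)],
              [0.5,  1/np.sqrt(12),  1/np.sqrt(6)]])
A = np.array([[1.0, 0.0, 0.0],
              [1.0, 1.0, 0.0],
              [1.0, 1.0, 1.0],
              [1.0, 1.0, 1.0]])

R = Q.T @ A
expected = np.array([[2.0, 1.5,           1.0],
                     [0.0, 3/np.sqrt(12), 2/np.sqrt(12)],
                     [0.0, 0.0,           2/np.sqrt(6)]])
print(np.allclose(R, expected))  # True
print(np.allclose(Q @ R, A))     # True
```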
 Application: QR Decomposition-Based Algorithm for Background Subtraction.
