
Lecture 10: Vector Algebra: Orthogonal Basis

• Orthogonal Basis of a subspace


• Computing an orthogonal basis for a subspace using Gram-Schmidt
Orthogonalization Process

Orthogonal Set
• Any set of vectors that are mutually orthogonal is an orthogonal set.
Orthonormal Set
• Any set of unit vectors that are mutually orthogonal is an orthonormal set.
• In other words, an orthogonal set is an orthonormal set if all the vectors in the set are unit vectors.
• Example: $\{\hat{\mathbf{u}}_1, \hat{\mathbf{u}}_2, \hat{\mathbf{u}}_3\}$ is an orthonormal set, where

$\hat{\mathbf{u}}_1 = \frac{1}{\sqrt{11}}\begin{bmatrix} 3 \\ 1 \\ 1 \end{bmatrix}, \quad \hat{\mathbf{u}}_2 = \frac{1}{\sqrt{6}}\begin{bmatrix} -1 \\ 2 \\ 1 \end{bmatrix}, \quad \hat{\mathbf{u}}_3 = \frac{1}{\sqrt{66}}\begin{bmatrix} -1 \\ -4 \\ 7 \end{bmatrix}$
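A quick numerical check of this example (a minimal NumPy sketch; the three vectors are the ones given above):

```python
import numpy as np

# The three vectors from the example above.
u1 = np.array([3, 1, 1]) / np.sqrt(11)
u2 = np.array([-1, 2, 1]) / np.sqrt(6)
u3 = np.array([-1, -4, 7]) / np.sqrt(66)

# Mutually orthogonal: every pairwise dot product is (numerically) zero.
print(np.isclose(u1 @ u2, 0), np.isclose(u1 @ u3, 0), np.isclose(u2 @ u3, 0))

# Unit length: every vector has norm 1, so the set is orthonormal.
print(np.isclose(np.linalg.norm(u1), 1),
      np.isclose(np.linalg.norm(u2), 1),
      np.isclose(np.linalg.norm(u3), 1))
```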
An orthogonal set is Linearly Independent
• If the vectors are orthogonal (and nonzero), taking the dot product of $c_1\mathbf{v}_1 + \dots + c_k\mathbf{v}_k = \mathbf{0}$ with any $\mathbf{v}_i$ leaves only $c_i(\mathbf{v}_i \cdot \mathbf{v}_i) = 0$, so every $c_i = 0$.
Projection of vector b on vector a
• $\mathbf{a} \cdot \mathbf{b} = \|\mathbf{a}\|\,\|\mathbf{b}\| \cos\theta$
• Vector $\mathbf{c}$ is the perpendicular projection (image) of $\mathbf{b}$ on $\mathbf{a}$.
• The direction of $\mathbf{c}$ is the same as $\mathbf{a}$.
• The magnitude of $\mathbf{c}$ is $\|\mathbf{c}\| = \|\mathbf{b}\| \cos\theta = \dfrac{\mathbf{a} \cdot \mathbf{b}}{\|\mathbf{a}\|} = \hat{\mathbf{a}} \cdot \mathbf{b}$
• If $\hat{\mathbf{a}}$ is the unit vector of $\mathbf{a}$, then

$\mathbf{c} = \dfrac{\mathbf{a} \cdot \mathbf{b}}{\|\mathbf{a}\|}\,\hat{\mathbf{a}} = \dfrac{\mathbf{a} \cdot \mathbf{b}}{\|\mathbf{a}\|}\,\dfrac{\mathbf{a}}{\|\mathbf{a}\|} = \dfrac{\mathbf{a} \cdot \mathbf{b}}{\mathbf{a} \cdot \mathbf{a}}\,\mathbf{a}$

[Figure: b, its projection c along a with length ‖b‖ cos θ, and the perpendicular component of length ‖b‖ sin θ]
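The projection formula above translates directly into code. A minimal NumPy sketch; the vectors a and b here are made-up illustrative values, and the function name is my own:

```python
import numpy as np

def project_onto_vector(b, a):
    """Perpendicular projection of b onto a: c = (a.b / a.a) a."""
    return (a @ b) / (a @ a) * a

# Made-up example vectors.
a = np.array([2.0, 0.0, 1.0])
b = np.array([1.0, 3.0, 2.0])

c = project_onto_vector(b, a)
print(c)                            # the projection vector, parallel to a
print(np.linalg.norm(c))            # its magnitude, equal to a.b / |a| = |b| cos(theta)
print(np.isclose((b - c) @ a, 0))   # the residual b - c is perpendicular to a
```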
Orthogonal Basis
• An orthogonal basis for a subspace $W$ of $\mathbb{R}^n$ is a basis for $W$ that is also an orthogonal set.

• Example:
$\left\{ \begin{bmatrix} 1 \\ 0 \\ 0 \end{bmatrix}, \begin{bmatrix} 0 \\ 1 \\ 0 \end{bmatrix}, \begin{bmatrix} 0 \\ 0 \\ 1 \end{bmatrix} \right\}$ is basically the $x$, $y$, and $z$ axes. It is an orthogonal basis in $\mathbb{R}^3$, and it spans the whole $\mathbb{R}^3$ space. It is also an orthogonal set.
Orthogonal Basis
• We know that given a basis of a subspace, any vector in that subspace will be a linear
combination of the basis vectors.
• For example, if $\mathbf{u}$ and $\mathbf{v}$ are linearly independent and form the basis for a subspace $S$, then any vector $\mathbf{y}$ in $S$ can be expressed as:

$\mathbf{y} = c_1 \mathbf{u} + c_2 \mathbf{v}$

• But computing $c_1$ and $c_2$ is not straightforward.

• On the other hand, if $\mathbf{u}$ and $\mathbf{v}$ form an orthogonal basis, then

$c_1 = \dfrac{\mathbf{y} \cdot \mathbf{u}}{\|\mathbf{u}\|^2} = \dfrac{\mathbf{y} \cdot \mathbf{u}}{\mathbf{u} \cdot \mathbf{u}}$ and $c_2 = \dfrac{\mathbf{y} \cdot \mathbf{v}}{\|\mathbf{v}\|^2} = \dfrac{\mathbf{y} \cdot \mathbf{v}}{\mathbf{v} \cdot \mathbf{v}}$

[Figure: y in the plane of u and v, with coordinate c1 along u and coordinate c2 along v]
This does not work if the basis is not orthogonal

• $\mathbf{y} = c_1 \mathbf{u} + c_2 \mathbf{v}$
• But computing $c_1$ and $c_2$ is not straightforward (yet).
• What is computed instead is:

$d_1 = \dfrac{\mathbf{y} \cdot \mathbf{u}}{\|\mathbf{u}\|^2} = \dfrac{\mathbf{y} \cdot \mathbf{u}}{\mathbf{u} \cdot \mathbf{u}}$ and $d_2 = \dfrac{\mathbf{y} \cdot \mathbf{v}}{\|\mathbf{v}\|^2} = \dfrac{\mathbf{y} \cdot \mathbf{v}}{\mathbf{v} \cdot \mathbf{v}}$

• For a non-orthogonal basis, $d_1 \neq c_1$ and $d_2 \neq c_2$, so $d_1 \mathbf{u} + d_2 \mathbf{v} \neq \mathbf{y}$.

[Figure: y with its true coordinates c1, c2 along u and v, and the different values d1, d2 produced by the dot-product formula]
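A small sketch contrasting the two cases, with basis vectors of my own choosing: for an orthogonal basis the dot-product formula recovers c1 and c2 exactly, while for a non-orthogonal basis it returns d1 and d2 that do not reconstruct y.

```python
import numpy as np

def coords_by_dot(y, b1, b2):
    """Apply the formula c_i = (y.b_i)/(b_i.b_i) to a pair of basis vectors."""
    return (y @ b1) / (b1 @ b1), (y @ b2) / (b2 @ b2)

# Orthogonal basis of a plane in R^3 (illustrative values).
u = np.array([1.0, 1.0, 0.0])
v = np.array([1.0, -1.0, 0.0])
y = 2 * u + 3 * v                        # so c1 = 2, c2 = 3 by construction

c1, c2 = coords_by_dot(y, u, v)
print(c1, c2)                            # 2.0 3.0 -- the formula recovers the coefficients

# Non-orthogonal basis of the same plane.
w = np.array([1.0, 0.0, 0.0])
z = np.array([1.0, 1.0, 0.0])
y2 = 2 * w + 3 * z                       # true coefficients are 2 and 3

d1, d2 = coords_by_dot(y2, w, z)
print(d1, d2)                            # not 2 and 3
print(np.allclose(d1 * w + d2 * z, y2))  # False: d1, d2 do not reconstruct y2
```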
Orthogonal Decomposition Theorem

• Let $W$ be a subspace of $\mathbb{R}^n$ with an orthogonal basis $\{\mathbf{u}_1, \dots, \mathbf{u}_p\}$. Then each $\mathbf{y}$ in $\mathbb{R}^n$ can be written uniquely as $\mathbf{y} = \mathbf{y}' + \mathbf{z}$, where $\mathbf{y}'$ is in $W$ and $\mathbf{z}$ is orthogonal to $W$, with

$\mathbf{y}' = \dfrac{\mathbf{y} \cdot \mathbf{u}_1}{\mathbf{u}_1 \cdot \mathbf{u}_1}\,\mathbf{u}_1 + \dots + \dfrac{\mathbf{y} \cdot \mathbf{u}_p}{\mathbf{u}_p \cdot \mathbf{u}_p}\,\mathbf{u}_p$

Example
Projection of a vector on a Subspace

• $\mathbf{u}$ and $\mathbf{v}$ are orthogonal 3D vectors.
• They span a plane (the green plane in the figure) in 3D.
• $\mathbf{y}$ is an arbitrary 3D vector out of the plane.
• $\mathbf{y}'$ is the projection of $\mathbf{y}$ onto the plane:

$\mathbf{y}' = \dfrac{\mathbf{y} \cdot \mathbf{u}}{\mathbf{u} \cdot \mathbf{u}}\,\mathbf{u} + \dfrac{\mathbf{y} \cdot \mathbf{v}}{\mathbf{v} \cdot \mathbf{v}}\,\mathbf{v}$

• The "point" $\mathbf{y}'$ is also the closest point to $\mathbf{y}$ on the plane.
• $\mathbf{y} - \mathbf{y}'$ is perpendicular to $\mathbf{y}'$, to Span$\{\mathbf{u}, \mathbf{v}\}$, and hence to $\mathbf{u}$ and $\mathbf{v}$.

[Figure: y above the plane spanned by u and v, its projection y' in the plane, and y − y' perpendicular to the plane]
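A minimal sketch of this projection, assuming the two spanning vectors are orthogonal (the example vectors are my own):

```python
import numpy as np

def project_onto_plane(y, u, v):
    """Projection of y onto Span{u, v}, assuming u and v are orthogonal."""
    return (y @ u) / (u @ u) * u + (y @ v) / (v @ v) * v

# Illustrative orthogonal vectors spanning a plane in R^3.
u = np.array([1.0, 2.0, 2.0])
v = np.array([2.0, 1.0, -2.0])
assert np.isclose(u @ v, 0)              # they really are orthogonal

y = np.array([3.0, -1.0, 4.0])           # an arbitrary vector off the plane
y_proj = project_onto_plane(y, u, v)

print(y_proj)                            # the closest point to y on the plane
# The residual y - y' is perpendicular to the plane, hence to both u and v.
print(np.isclose((y - y_proj) @ u, 0), np.isclose((y - y_proj) @ v, 0))
```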
Closest point of a vector to a span does not depend on the basis of that span

• $\mathbf{y}$ is an arbitrary 3D vector out of the plane.
• $\mathbf{y}'$ is the projection of $\mathbf{y}$ onto the plane.
• $\mathbf{y}' = x_1 \mathbf{u}_1 + x_2 \mathbf{u}_2$
• $\mathbf{y}' = x_3 \mathbf{u}_3 + x_4 \mathbf{u}_4$
• $\mathbf{y}' = x_5 \mathbf{u}_1 + x_6 \mathbf{u}_3$
• ...
• The coordinates of $\mathbf{y}'$ change if the basis changes.
• But the vector $\mathbf{y}'$ itself does not change.
• Hence the closest point to $\mathbf{y}$ on the plane does not change.
• Even here, $\mathbf{y} - \mathbf{y}'$ is perpendicular to $\mathbf{y}'$ and to the span.

[Figure: y above a plane, with several basis pairs u1, u2, u3, u4 spanning the same plane and the single projection y']
Closest point of a vector to a span does not depend on the basis of that span

• FINDING THE CLOSEST POINT OF A VECTOR TO A SPAN means: "finding the coordinates of the projection of the vector."
• So, if you want to compute the closest point of a vector to a span, find an appropriate basis with which you can compute the coordinates of the projection easily.
• What would be that basis?
• Answer: An orthogonal basis!
Projection on a span of non-orthogonal vectors

• How to find the projection of an arbitrary 3D vector onto the span of two non-orthogonal, linearly independent vectors?
• $\mathbf{u}_1$ and $\mathbf{u}_2$ are non-orthogonal but linearly independent vectors in 3D.
• $\mathbf{y}$ is an arbitrary 3D vector.
• Find the projection of $\mathbf{y}$ in the space spanned by $\mathbf{u}_1$ and $\mathbf{u}_2$:
• a) First, find an orthogonal set of vectors $\mathbf{v}_1$ and $\mathbf{v}_2$ that spans the same subspace as $\mathbf{u}_1$ and $\mathbf{u}_2$. In other words, find an orthogonal basis.
• b) Project $\mathbf{y}$ onto the space spanned by the orthogonal vectors $\mathbf{v}_1$ and $\mathbf{v}_2$, as we did earlier.

[Figure: y above the plane spanned by u1 and u2, its projection y', and the orthogonal basis v1, v2 of the same plane]
How to find an orthogonal basis?
• Assume that the first vector $\mathbf{u}_1$ is in the orthogonal basis. The other vector(s) of the basis are computed so that they are perpendicular to $\mathbf{u}_1$.
• Let $\mathbf{v}_1 = \mathbf{u}_1$
• Let $\mathbf{v}_2 = \mathbf{u}_2 - \dfrac{\mathbf{u}_2 \cdot \mathbf{v}_1}{\mathbf{v}_1 \cdot \mathbf{v}_1}\,\mathbf{v}_1$
• We know that $\mathbf{v}_2$ is perpendicular to $\mathbf{v}_1$.
• $\mathbf{v}_2$ is in Span$\{\mathbf{u}_1, \mathbf{u}_2\}$ (Why?)
• So Span$\{\mathbf{u}_1, \mathbf{u}_2\}$ = Span$\{\mathbf{v}_1, \mathbf{v}_2\}$
• And $\{\mathbf{v}_1, \mathbf{v}_2\}$ is an orthogonal basis.
• Projection of $\mathbf{y}$ onto Span$\{\mathbf{u}_1, \mathbf{u}_2\}$:

$\mathbf{y}' = \dfrac{\mathbf{y} \cdot \mathbf{v}_1}{\mathbf{v}_1 \cdot \mathbf{v}_1}\,\mathbf{v}_1 + \dfrac{\mathbf{y} \cdot \mathbf{v}_2}{\mathbf{v}_2 \cdot \mathbf{v}_2}\,\mathbf{v}_2$

[Figure: y, its projection y' onto the plane, the original vectors u1, u2, and the orthogonal basis v1, v2]
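The two steps (a) and (b) put together, as a small NumPy sketch with illustrative vectors of my own choosing:

```python
import numpy as np

# Illustrative non-orthogonal but linearly independent vectors in R^3.
u1 = np.array([1.0, 1.0, 0.0])
u2 = np.array([1.0, 2.0, 1.0])
y  = np.array([2.0, 0.0, 3.0])           # arbitrary vector to project

# Step (a): build an orthogonal basis {v1, v2} of Span{u1, u2}.
v1 = u1
v2 = u2 - (u2 @ v1) / (v1 @ v1) * v1     # subtract the part of u2 along v1
assert np.isclose(v1 @ v2, 0)            # v2 is perpendicular to v1

# Step (b): project y onto the span using the orthogonal basis.
y_proj = (y @ v1) / (v1 @ v1) * v1 + (y @ v2) / (v2 @ v2) * v2
print(y_proj)
# The residual is perpendicular to the span, hence to both original vectors.
print(np.isclose((y - y_proj) @ u1, 0), np.isclose((y - y_proj) @ u2, 0))
```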
Gram-Schmidt Orthogonalization Process

• The same construction extends to any number of linearly independent vectors $\mathbf{u}_1, \dots, \mathbf{u}_p$: set $\mathbf{v}_1 = \mathbf{u}_1$, and for each subsequent vector subtract its components along all previously computed $\mathbf{v}_j$:

$\mathbf{v}_k = \mathbf{u}_k - \dfrac{\mathbf{u}_k \cdot \mathbf{v}_1}{\mathbf{v}_1 \cdot \mathbf{v}_1}\,\mathbf{v}_1 - \dots - \dfrac{\mathbf{u}_k \cdot \mathbf{v}_{k-1}}{\mathbf{v}_{k-1} \cdot \mathbf{v}_{k-1}}\,\mathbf{v}_{k-1}$

• Then $\{\mathbf{v}_1, \dots, \mathbf{v}_p\}$ is an orthogonal basis of Span$\{\mathbf{u}_1, \dots, \mathbf{u}_p\}$.
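A minimal sketch of the general process (the function name and test vectors are my own, not from the lecture):

```python
import numpy as np

def gram_schmidt(vectors):
    """Turn a list of linearly independent vectors into an orthogonal basis
    of the same span: v_k = u_k minus its components along earlier v_j."""
    basis = []
    for u in vectors:
        v = np.array(u, dtype=float)
        for w in basis:
            v -= (v @ w) / (w @ w) * w    # remove the component along w
        basis.append(v)
    return basis

# Illustrative input: three linearly independent vectors in R^3.
vs = gram_schmidt([[1, 1, 0], [1, 2, 1], [0, 1, 3]])
for i in range(len(vs)):
    for j in range(i):
        print(np.isclose(vs[i] @ vs[j], 0))   # all pairwise dot products are ~0
```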
Example:
Solving Inconsistent Systems
Example 1:
A trader buys and/or sells tomatoes and potatoes. (A negative number means buys, a positive number means sells.) In the process, he either makes a profit (positive number) or a loss (negative number). A week's transactions are shown; find the approximate cost of tomatoes and potatoes.
Tomatoes (tons) | Potatoes (tons) | Profit/Loss (in thousands)
       1        |       -6        |          -1
       1        |       -2        |           2
       1        |        1        |           1
       1        |        7        |           6
Solving Inconsistent Systems
• $1t - 6p = -1$
• $1t - 2p = 2$
• $1t + 1p = 1$
• $1t + 7p = 6$

$\begin{bmatrix} 1 \\ 1 \\ 1 \\ 1 \end{bmatrix} t + \begin{bmatrix} -6 \\ -2 \\ 1 \\ 7 \end{bmatrix} p = \begin{bmatrix} -1 \\ 2 \\ 1 \\ 6 \end{bmatrix}$

The above equation might not have a solution (values of t and p that would satisfy it). So the best we can do is to find the values of t and p for which the combination on the left-hand side is as close as possible to the desired right-hand side vector.
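A sketch of this computation in NumPy. The two columns (1, 1, 1, 1) and (-6, -2, 1, 7) happen to be orthogonal, so the projection coefficients can be read off with the dot-product formula; a general least-squares solver gives the same answer.

```python
import numpy as np

# Columns of the system: coefficient of t, coefficient of p, and the target b.
a1 = np.array([1.0, 1.0, 1.0, 1.0])      # tons of tomatoes in each transaction
a2 = np.array([-6.0, -2.0, 1.0, 7.0])    # tons of potatoes in each transaction
b  = np.array([-1.0, 2.0, 1.0, 6.0])     # observed profit/loss

# The two columns are orthogonal, so project b onto each one independently.
print(a1 @ a2)                           # 0.0
t = (b @ a1) / (a1 @ a1)                 # approximate cost of tomatoes (per ton)
p = (b @ a2) / (a2 @ a2)                 # approximate cost of potatoes (per ton)
print(t, p)                              # 2.0 0.5

# Cross-check with a general least-squares solver on A x = b.
A = np.column_stack([a1, a2])
x, *_ = np.linalg.lstsq(A, b, rcond=None)
print(x)                                 # [2.  0.5]
```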
Solving Inconsistent Systems
Example 2:
