Guru
(An Autonomous Institute Affiliated to VTU, Belagavi, Accredited by NAAC with ‘A’ Grade)
Shavige Malleshwara Hills, Kumaraswamy Layout, Bengaluru-560111
Topic Report
Linear Algebra – Orthogonal Projections and Gram-Schmidt Process

Submitted by
Name USN
Sudhanva S P 1DS23EE105
T Veerabala 1DS23EE119
Hemanth 1DS23EE406
Pradyumna N 1DS23EE408
In linear algebra, orthogonal projections and the Gram-Schmidt process are fundamental tools for working with
orthogonal and orthonormal bases in vector spaces.
1. Orthogonal Projection: An orthogonal projection maps a vector onto a subspace in such a way that the difference
between the original vector and its projection is orthogonal to the subspace. The Gram-Schmidt process builds an
orthogonal basis by repeatedly applying such projections, and it has the following key properties:
1. Preserves Span: The orthogonal (or orthonormal) vectors produced by Gram-Schmidt span the same
subspace as the original vectors.
2. Uniqueness (up to scaling): Without normalization, the orthogonal basis is unique only up to scaling of each
vector; with normalization, each orthonormal basis vector is determined up to its sign.
3. Numerical Instability: The classical Gram-Schmidt process can suffer from loss of orthogonality due to
rounding errors in floating-point arithmetic. Modified versions (e.g., Modified Gram-Schmidt) improve
stability.
4. QR Factorization Connection: Gram-Schmidt is related to the QR decomposition of a matrix,
where A = QR (Q has orthonormal columns, R is upper triangular).
5. Orthonormal Output: If the process is carried out with normalization, the resulting basis vectors are
orthonormal (unit length and mutually orthogonal).
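The properties above can be illustrated with a minimal sketch of classical Gram-Schmidt with normalization, written here in NumPy (the function name and the test matrix are illustrative, not from the report):

```python
import numpy as np

def gram_schmidt(A):
    """Classical Gram-Schmidt on the columns of A.

    Returns Q (orthonormal columns spanning the same subspace as A's
    columns) and upper-triangular R such that A = Q @ R.
    """
    A = np.asarray(A, dtype=float)
    m, n = A.shape
    Q = np.zeros((m, n))
    R = np.zeros((n, n))
    for j in range(n):
        v = A[:, j].copy()
        for i in range(j):
            # Subtract the projection of column j onto the i-th orthonormal vector.
            R[i, j] = Q[:, i] @ A[:, j]
            v -= R[i, j] * Q[:, i]
        R[j, j] = np.linalg.norm(v)  # assumes columns of A are independent
        Q[:, j] = v / R[j, j]
    return Q, R

A = np.array([[1.0, 1.0],
              [1.0, 0.0],
              [0.0, 1.0]])
Q, R = gram_schmidt(A)
# Q.T @ Q is the identity (orthonormal columns) and Q @ R reconstructs A.
```

Note that this is the *classical* variant; as point 3 above warns, the modified variant (subtracting each projection from the running vector immediately) is preferred in floating-point work.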
Applications of Orthogonal Projections
Orthogonal projections involve projecting a vector onto a subspace such that the residual (the difference between the
vector and its projection) is orthogonal (perpendicular) to the subspace. Applications include:
a. Least Squares and Data Fitting
• Used in regression analysis (linear, polynomial) to minimize the error between observed data and predicted values.
• Helps solve overdetermined systems Ax = b by projecting b onto the column space of A.
b. Orthogonal Polynomials
c. Signal Processing
• Principal Component Analysis (PCA) uses projections to reduce dimensionality while preserving variance.
• Feature extraction by projecting data onto meaningful subspaces.
d. Quantum Mechanics
e. Control Theory
• State estimation and filtering (e.g., Kalman filters use projections to estimate system states).
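As a sketch of the least-squares application, the following NumPy fragment solves an overdetermined system Ax = b by projecting b onto the column space of A via the normal equations (the matrix and data here are made-up illustrative values):

```python
import numpy as np

# Overdetermined system A x = b (3 equations, 2 unknowns),
# e.g. fitting a line y = c0 + c1 * t to three data points.
A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0]])
b = np.array([1.0, 2.0, 2.0])

# Normal equations: (A^T A) x_hat = A^T b
x_hat = np.linalg.solve(A.T @ A, A.T @ b)

proj = A @ x_hat        # orthogonal projection of b onto col(A)
residual = b - proj     # orthogonal to every column of A
```

The residual being orthogonal to col(A) is exactly what makes x_hat the least-squares solution; in practice one would use a QR-based routine such as `np.linalg.lstsq`, since forming A^T A squares the condition number.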
Orthogonal projections and the Gram-Schmidt process are essential in linear algebra, with applications in data
fitting (least squares), computer graphics, signal processing, machine learning (PCA), and quantum mechanics.
These tools simplify computations, enhance numerical stability, and enable efficient solutions across science and
engineering. Their versatility makes them fundamental in both theory and real-world problem-solving.