MAMT01
15.0 Objectives
15.1 Introduction
15.5 Summary
15.7 Exercises
15.0 Objectives
This unit continues the earlier units on real inner product spaces. After reading this unit you will be able to understand the importance of an orthogonal linear transformation, which preserves the length of a vector and the angle between two vectors, the corresponding orthogonal matrix, and a very important result known as the principal axis theorem.
15.1 Introduction
This unit introduces the concept of an orthogonal linear transformation from one inner product space to another, and many results based on this concept. We shall also study orthogonal matrices and the theorems relating orthogonal linear transformations to orthogonal matrices. Finally, we shall study results on the eigenvalues of a self-adjoint linear transformation and the principal axis theorem.
A linear transformation t : V → V′ from an inner product space V into an inner product space V′ is said to be orthogonal if
< t(u), t(v) > = < u, v > for all u, v ∈ V.
Theorem 1. Let V and V′ be inner product spaces. Then every orthogonal linear transformation t : V → V′ preserves the length of a vector and the angle between two vectors.
Proof. For any u, v ∈ V we have
< t(u), t(v) > = < u, v >.
In particular, < t(u), t(u) > = < u, u >, i.e.
|| t(u) ||² = || u ||²
⇒ || t(u) || = || u || for all u ∈ V.
Thus an orthogonal linear transformation preserves length.
If θ is the angle between two nonzero vectors u and v of V, then
cos θ = < u, v > / ( || u || || v || ) = < t(u), t(v) > / ( || t(u) || || t(v) || ),
so θ is also the angle between t(u) and t(v). Therefore an orthogonal linear transformation preserves the angle also.
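As a numerical sketch of Theorem 1 (not part of the original text), the map x ↦ Qx for a rotation matrix Q is an orthogonal linear transformation of R², and one can check directly that it preserves inner products, lengths, and angles; the angle and the vectors below are arbitrary choices.

```python
import numpy as np

# A rotation of R^2 through 35 degrees: an orthogonal linear transformation.
theta = np.radians(35)
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])

u = np.array([3.0, 1.0])
v = np.array([-2.0, 5.0])
tu, tv = Q @ u, Q @ v

# < t(u), t(v) > = < u, v >: the inner product is preserved.
assert np.isclose(tu @ tv, u @ v)

# Hence the length is preserved: || t(u) || = || u ||.
assert np.isclose(np.linalg.norm(tu), np.linalg.norm(u))

# Hence the angle is preserved: both cosines are built from preserved quantities.
cos_before = (u @ v) / (np.linalg.norm(u) * np.linalg.norm(v))
cos_after = (tu @ tv) / (np.linalg.norm(tu) * np.linalg.norm(tv))
assert np.isclose(cos_before, cos_after)
```

The three assertions mirror the three steps of the proof: inner products first, then lengths as the special case u = v, then angles as a quotient of preserved quantities.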
Theorem 2. Let V and V′ be inner product spaces. Then every orthogonal linear transformation t : V → V′ is a monomorphism of vector spaces.
Proof. In order to prove that t : V → V′ is a monomorphism, it is sufficient to prove that
Ker(t) = {0}.
Let v ∈ Ker(t). Then t(v) = 0 ∈ V′.
Now for any u ∈ V, we have
< u, v > = < t(u), t(v) >   (since t is orthogonal)
= < t(u), 0 >
= 0.
Thus v ∈ Ker(t) ⇒ < u, v > = 0 for all u ∈ V.
In particular, < v, v > = 0 ⇒ v = 0.
Hence v ∈ Ker(t) ⇒ v = 0.
Therefore Ker(t) = {0}.
Hence t is a monomorphism.
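Theorem 2 can be checked numerically in matrix form: an orthogonal matrix sends only the zero vector to zero, i.e. its null space is trivial. A small sketch, with an arbitrarily chosen orthogonal matrix (a rotation of the first two coordinates of R³):

```python
import numpy as np

# An orthogonal matrix: its columns are orthonormal in R^3.
Q = np.array([[0.0, -1.0, 0.0],
              [1.0,  0.0, 0.0],
              [0.0,  0.0, 1.0]])
assert np.allclose(Q.T @ Q, np.eye(3))

# Ker(t) = {0}: Q has full rank, so Qx = 0 forces x = 0.
assert np.linalg.matrix_rank(Q) == 3

# Equivalently, det(Q) != 0, so x -> Qx is injective (a monomorphism).
assert not np.isclose(np.linalg.det(Q), 0.0)
```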
The following theorem proves that an orthogonal linear transformation carries an orthonormal list of vectors to an orthonormal list.
Theorem 3. Let V and V′ be inner product spaces, and let t : V → V′ be an orthogonal linear transformation. If (u1, u2, ..., un) is an orthonormal list of vectors in V, then the list (t(u1), t(u2), ..., t(un)) is orthonormal in V′.
Proof. Since the list (u1, u2, ..., un) is orthonormal in V,
< ui, uj > = δij = 1 if i = j and 0 if i ≠ j,  for i, j = 1, 2, ..., n.
Since t is orthogonal, < t(ui), t(uj) > = < ui, uj > = δij for all i, j = 1, 2, ..., n, and hence the list (t(u1), t(u2), ..., t(un)) is orthonormal in V′.
More generally, suppose t : V → V′ is any map (not assumed linear) which preserves inner products and carries an orthonormal basis (u1, u2, ..., un) of V to the orthonormal list (t(u1), t(u2), ..., t(un)). Let
u = α1 u1 + α2 u2 + ... + αn un = Σi αi ui,  αi ∈ R,
where Σi denotes summation over i = 1, 2, ..., n. Then
|| u ||² = < Σi αi ui, Σj αj uj >
= Σi Σj αi αj < ui, uj >
= Σi Σj αi αj δij
= Σi αi²   (on summing over j)   .....(1)
Similarly, since < t(ui), t(uj) > = δij,
|| Σi αi t(ui) ||² = < Σi αi t(ui), Σj αj t(uj) >
= Σi Σj αi αj < t(ui), t(uj) >
= Σi Σj αi αj δij
= Σi αi²   (on summing over j)   .....(2)
Now the list (t(u1), t(u2), ..., t(un)) is orthonormal, so the coordinates of t(u) relative to it are < t(u), t(ui) >, i.e.
t(u) = Σi < t(u), t(ui) > t(ui)
= Σi < u, ui > t(ui)   (since t preserves inner products)
= Σi αi t(ui).
Thus t(Σi αi ui) = Σi αi t(ui).
Hence t is linear.
15.3 Orthogonal matrix
A square matrix A of order n over R is said to be orthogonal if its columns are orthonormal in the standard inner product on Rn.
Thus a square matrix A is orthogonal if Ci · Cj = δij for all i, j = 1, 2, ..., n, where Ci stands for the ith column of A, regarded as a vector in Rn.
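The defining condition Ci · Cj = δij can be tested column by column; it is exactly the statement that ATA = I entry by entry. A minimal sketch, with an arbitrarily chosen matrix whose columns are orthonormal:

```python
import numpy as np

# The columns of A form an orthonormal set in R^2 (A is a reflection matrix).
A = np.array([[0.6,  0.8],
              [0.8, -0.6]])

n = A.shape[0]
for i in range(n):
    for j in range(n):
        delta_ij = 1.0 if i == j else 0.0
        # C_i . C_j = delta_ij for every pair of columns.
        assert np.isclose(A[:, i] @ A[:, j], delta_ij)

# The n^2 column conditions are exactly the entries of A^T A = I.
assert np.allclose(A.T @ A, np.eye(n))
```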
Theorem 10. Let V be a finite dimensional inner product space. Then a linear transformation t : V → V is orthogonal if and only if its matrix relative to an orthonormal basis is orthogonal.
Proof. Let B = {u1, u2, ..., un} be an orthonormal basis of V and let A = [aij] over R be the matrix of t relative to the basis B, so that
t(ui) = Σk aki uk,  1 ≤ i ≤ n,
where every sum below runs from 1 to n.
Now t is orthogonal iff
< t(u), t(v) > = < u, v > for all u, v ∈ V,
which holds iff
< t(ui), t(uj) > = < ui, uj > = δij,  i, j = 1, 2, ..., n.
For i = j this gives
< t(ui), t(ui) > = < Σk aki uk, Σj aji uj > = Σj Σk aji aki < uk, uj > = Σj Σk aji aki δjk = Σj aji² = 1   (on summing over k)   .....(1)
and for i ≠ j
< t(ui), t(uj) > = < Σk aki uk, Σr arj ur > = Σk Σr aki arj < uk, ur > = Σk Σr aki arj δkr = Σk aki akj = 0.   .....(2)
Equations (1) and (2) imply that ATA = I, since the (ik)th entry of AT is aki. Similarly, AAT = I.
Conversely, ATA = I implies that the conditions in (1) and (2) are satisfied, and hence {t(u1), t(u2), ..., t(un)} is an orthonormal set, which is a basis of V. Thus t takes an orthonormal basis to an orthonormal basis, so t is an orthogonal linear transformation.
Theorem 13. The determinant of an orthogonal matrix is ±1.
Proof : Let A be an orthogonal matrix. Then
AAT = I
⇒ Det(AAT) = Det(I)
⇒ Det(A) Det(AT) = 1
⇒ (Det(A))² = 1   [since Det(AT) = Det(A)]
Hence Det(A) = ±1.
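Both signs in Theorem 13 occur: rotations have determinant +1 and reflections have determinant −1. A quick numerical check, with example matrices chosen arbitrarily:

```python
import numpy as np

theta = np.radians(40)
rotation = np.array([[np.cos(theta), -np.sin(theta)],
                     [np.sin(theta),  np.cos(theta)]])
reflection = np.array([[0.6,  0.8],
                       [0.8, -0.6]])

for A in (rotation, reflection):
    # Both matrices are orthogonal: A A^T = I ...
    assert np.allclose(A @ A.T, np.eye(2))
    # ... so det(A)^2 = 1, i.e. det(A) = +1 or -1.
    assert np.isclose(abs(np.linalg.det(A)), 1.0)

# The rotation realises +1 and the reflection realises -1.
assert np.isclose(np.linalg.det(rotation), 1.0)
assert np.isclose(np.linalg.det(reflection), -1.0)
```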
Let t be a self-adjoint linear transformation on a finite dimensional inner product space, so that the matrix A of t relative to an orthonormal basis is a real symmetric matrix. We show that every eigenvalue of t is real. Let λ be an eigenvalue of t; then λ is an eigenvalue of A. Let X ≠ 0 be an eigenvector of A corresponding to λ, and write X̄ for the entrywise complex conjugate of X. We have
AX = λX.
Premultiplying by X̄T,
X̄T A X = λ X̄T X.   .....(1)
Taking the conjugate transpose of AX = λX gives X̄T ĀT = λ̄ X̄T, i.e.
X̄T A = λ̄ X̄T   (since A is symmetric and real, ĀT = A),
and postmultiplying by X,
X̄T A X = λ̄ X̄T X.   .....(2)
So, from (1) and (2), we have
λ X̄T X = λ̄ X̄T X
⇒ (λ − λ̄) X̄T X = 0
⇒ λ = λ̄.
Hence λ is real, because X ≠ 0 and so X̄T X ≠ 0.
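The argument above can be checked numerically: the eigenvalues of a real symmetric matrix, computed by a general (complex-capable) eigenvalue routine, have zero imaginary part up to rounding. A sketch with an arbitrary symmetric matrix:

```python
import numpy as np

# An arbitrary real symmetric matrix (A = A^T).
A = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])
assert np.allclose(A, A.T)

# General eigenvalue routine, which could return complex values;
# for a symmetric A every imaginary part vanishes.
eigenvalues = np.linalg.eigvals(A)
assert np.allclose(eigenvalues.imag, 0.0)
```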
Self-learning exercise-1
1. An orthogonal linear transformation between inner product spaces preserves lengths, but not the angle between two vectors. [T/F]
2. If Ci and Cj are columns of an orthogonal matrix, then Ci · Cj = ..... .
3. If a matrix A is orthogonal, then AT = A–1. [T/F]
4. If a matrix A is orthogonal, then AAT = ...... .
Theorem 14. (Principal axis theorem). Let t be a self-adjoint linear transformation on a finite dimensional inner product space V. Then the matrix of t with respect to some orthonormal basis is a diagonal matrix whose diagonal elements are the eigenvalues of t; these are real, and each appears on the diagonal as many times as its multiplicity.
Proof : We shall prove the theorem by induction on the dimension n of the inner product space V.
For n = 1, let B = {v1} be an orthonormal basis. Then
t(v1) = λ1 v1, for some λ1 ∈ R.
Therefore v1 is an eigenvector of t and λ1 is the corresponding eigenvalue, which is real, and the matrix of t is the 1 × 1 diagonal matrix [λ1]. Hence the theorem is true for n = 1.
Now assume that it is true for every inner product space of dimension (n – 1). We shall show that it is true for V of dimension n.
Since t is self-adjoint, its eigenvalues are real; let λ1 be an eigenvalue of t and v1 a corresponding eigenvector with || v1 || = 1.
Now let W = {v1}⊥ = {u ∈ V | < u, v1 > = 0} be the orthogonal complement of v1. It is a subspace of V. We shall now show that W is invariant under t. To do so, let v ∈ W. Then
< v, v1 > = 0.   .....(1)
We have < t(v), v1 > = < v, t(v1) >, since t is self-adjoint
= < v, λ1 v1 >
= λ1 < v, v1 >
= 0.   [from (1)]
Thus t(v) ∈ W.
Hence v ∈ W ⇒ t(v) ∈ W.
Thus t maps vectors of W to vectors of W only, and therefore the restriction of t to W, tW : W → W, is also a self-adjoint linear transformation.
Now, dim W + dim W⊥ = n
⇒ dim W = n – dim W⊥
⇒ dim W = n – 1, because W⊥ = < {v1} > and dim W⊥ = 1.
Hence tW : W → W is a self-adjoint linear transformation on the (n – 1) dimensional inner product space W, so by the induction assumption there is an orthonormal basis of eigenvectors of tW, say {v2, v3, ...., vn}, for which the matrix of tW is diagonal. Adjoining v1 to the basis {v2, v3, ...., vn} of W, we obtain an orthonormal basis of eigenvectors {v1, v2, ...., vn} of V, for which the matrix of the self-adjoint t is a diagonal matrix; each diagonal element is an eigenvalue of t, is real, and appears as many times as its multiplicity.
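In matrix form, the principal axis theorem says that a real symmetric matrix A satisfies QT A Q = D with Q orthogonal (its columns are an orthonormal basis of eigenvectors) and D diagonal with the real eigenvalues on the diagonal. `numpy.linalg.eigh` returns exactly such a pair; the matrix below is an arbitrary example.

```python
import numpy as np

# An arbitrary real symmetric matrix.
A = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])

# eigh: real eigenvalues (ascending) and orthonormal eigenvectors
# of a symmetric matrix.
eigenvalues, Q = np.linalg.eigh(A)

# Q is orthogonal: its columns form an orthonormal basis of eigenvectors.
assert np.allclose(Q.T @ Q, np.eye(3))

# Relative to this basis the matrix of the transformation is diagonal,
# with the eigenvalues on the diagonal.
assert np.allclose(Q.T @ A @ Q, np.diag(eigenvalues))
```

Each eigenvalue appears in `eigenvalues` as many times as its multiplicity, matching the statement of the theorem.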
15.5 Summary
In this unit we have studied orthogonal linear transformation between inner product spaces,
orthogonal matrices related to orthogonal linear transformations and their properties, and a very useful
and fundamental result known as the principal axis theorem.
15.7 Exercises
1. Prove that if W1 and W2 are subspaces of V such that dim W1 = dim W2, then there exists an
orthogonal transformation t such that t (W1) = W2.
2. If A is an orthogonal matrix, show that AT and A–1 are orthogonal matrices.