MAMT01

This document discusses orthogonal linear transformations between inner product spaces. It defines an orthogonal linear transformation as one that preserves the inner product of any two vectors. Several key results are proven, including: 1) Orthogonal linear transformations preserve the length and angle between any two vectors. 2) The composition and inverse of an orthogonal linear transformation are also orthogonal. 3) If a linear transformation maps an orthonormal basis to an orthogonal set, it is orthogonal. The document also introduces orthogonal matrices and the principal axis theorem, which characterizes the eigenvectors and eigenvalues of a self-adjoint linear transformation.


UNIT 15 : Real Inner Product Space-III

Structure of the Unit


15.0 Objectives

15.1 Introduction

15.2 Orthogonal linear transformation

15.3 Orthogonal matrix

15.4 Principal axis theorem

15.5 Summary

15.6 Answers to self-learning exercises

15.7 Exercises

15.0 Objectives

This unit is in continuation of the earlier units on real inner product spaces. After reading this unit
you will be able to understand the importance of orthogonal linear transformations, which preserve the
length of and the angle between two vectors, the corresponding orthogonal matrices, and a very important
result known as the principal axis theorem.

15.1 Introduction

This unit introduces the concept of an orthogonal linear transformation from one inner product
space to another, and many results based on this concept. We shall also study orthogonal matrices and
the theorems connecting orthogonal linear transformations with orthogonal matrices. We shall also
study results on the eigenvalues of a self-adjoint linear transformation and the principal axis theorem.

15.2 Orthogonal linear transformation

A linear transformation t : V → V′ from an inner product space V into an inner product space V′
is said to be orthogonal if
< t(u), t(v) > = < u, v > for all u, v ∈ V.
Theorem 1. Let V and V′ be inner product spaces. Then every orthogonal linear trans-
formation t : V → V′ preserves the length of, and the angle between, any two vectors.
Proof. For any u, v ∈ V we have
< t(u), t(v) > = < u, v >.
In particular, < t(u), t(u) > = < u, u >
⇒ || t(u) ||² = || u ||²
⇒ || t(u) || = || u || for all u ∈ V.
Thus an orthogonal linear transformation preserves length.
If θ is the angle between two vectors u and v of V, then
cos θ = < u, v > / (|| u || || v ||)
or cos θ = < t(u), t(v) > / (|| t(u) || || t(v) ||),
and the two expressions are equal, since t preserves both inner products and lengths.
Therefore an orthogonal linear transformation preserves the angle also.
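As a concrete sketch (our own illustration, not part of the original text), a rotation of R² with the standard inner product is an orthogonal transformation; a short numerical check confirms that it preserves both lengths and inner products, hence angles:

```python
import math

def rotate(theta, v):
    # Apply the 2x2 rotation matrix R(theta) to the vector v = (x, y).
    c, s = math.cos(theta), math.sin(theta)
    return (c * v[0] - s * v[1], s * v[0] + c * v[1])

def dot(u, v):
    # Standard inner product on R^2.
    return u[0] * v[0] + u[1] * v[1]

def norm(u):
    return math.sqrt(dot(u, u))

u, v = (3.0, 4.0), (1.0, 2.0)
tu, tv = rotate(0.7, u), rotate(0.7, v)

# Length is preserved: ||t(u)|| = ||u||
print(abs(norm(tu) - norm(u)) < 1e-12)       # True
# Inner product (hence the angle) is preserved: <t(u), t(v)> = <u, v>
print(abs(dot(tu, tv) - dot(u, v)) < 1e-12)  # True
```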
Theorem 2. Let V and V′ be inner product spaces. Then every orthogonal linear transfor-
mation t : V → V′ is a monomorphism of vector spaces.
Proof. In order to prove that t : V → V′ is a monomorphism, it is sufficient to prove that
Ker(t) = {0}.
Let v ∈ Ker(t). Then t(v) = 0 ∈ V′.
Now for any u ∈ V, we have
< u, v > = < t(u), t(v) > (∵ t is orthogonal)
= < t(u), 0 >
= 0.
Thus v ∈ Ker(t) ⇒ < u, v > = 0 for all u ∈ V.
In particular, < v, v > = 0 ⇒ v = 0.
Hence v ∈ Ker(t) ⇒ v = 0.
Therefore Ker(t) = {0}.
Hence t is a monomorphism.
The following theorem proves that an orthogonal linear transformation carries an orthonormal list
of vectors to an orthonormal list.
Theorem 3. Let V and V′ be inner product spaces, and let t : V → V′ be an orthogonal
linear transformation. If (u1, u2, ..., un) is an orthonormal list of vectors in V, then the list
(t(u1), t(u2), ..., t(un)) is orthonormal in V′.
Proof. Since the list (u1, u2, ..., un) is orthonormal in V, we have
< ui, uj > = δij = 1 if i = j, and 0 if i ≠ j,
for i, j = 1, 2, ..., n.
Since t is orthogonal,
< t(ui), t(uj) > = < ui, uj > = δij for i, j = 1, 2, ..., n.
Therefore (t(u1), ..., t(un)) is an orthonormal list of vectors in V′.
Theorem 4. Let V and V′ be inner product spaces. Then a linear transformation t : V → V′
is orthogonal if and only if
|| t(u) || = || u || for all u ∈ V.
Proof. Let t : V → V′ be an orthogonal linear transformation. Then
< t(u), t(v) > = < u, v > for all u, v ∈ V.
Thus < t(u), t(u) > = < u, u >,
i.e. || t(u) ||² = || u ||².
Hence || t(u) || = || u || for all u ∈ V.
Conversely, let t : V → V′ be a linear transformation such that || t(u) || = || u || for all u ∈ V.
We have to show that t is an orthogonal linear transformation.
Let u, v ∈ V. Then
|| t(u + v) || = || u + v ||
⇒ || t(u + v) ||² = || u + v ||²
⇒ < t(u + v), t(u + v) > = < u + v, u + v >
⇒ < t(u) + t(v), t(u) + t(v) > = < u + v, u + v >, since t is linear
⇒ < t(u), t(u) > + < t(v), t(v) > + 2 < t(u), t(v) > = < u, u > + < v, v > + 2 < u, v >
⇒ || t(u) ||² + || t(v) ||² + 2 < t(u), t(v) > = || u ||² + || v ||² + 2 < u, v >
⇒ < t(u), t(v) > = < u, v > for all u, v ∈ V.
Hence t is an orthogonal transformation.
Theorem 5. The composite of two orthogonal transformations, when defined, is an or-
thogonal transformation.
Proof. Let V, V′ and V″ be finite dimensional inner product spaces and let t : V → V′ and
s : V′ → V″ be orthogonal linear transformations.
Then for any u, v ∈ V, we have
< (s ∘ t)(u), (s ∘ t)(v) > = < s(t(u)), s(t(v)) >
= < t(u), t(v) > (∵ s is orthogonal)
= < u, v > (∵ t is orthogonal)
Hence s ∘ t is an orthogonal transformation.
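For instance (a hypothetical numerical check of Theorem 5, not from the original text), composing two plane rotations still preserves the standard inner product on R²:

```python
import math

def rotation(theta):
    # 2x2 rotation matrix, an orthogonal transformation of R^2.
    c, s = math.cos(theta), math.sin(theta)
    return [[c, -s], [s, c]]

def apply(A, v):
    return (A[0][0] * v[0] + A[0][1] * v[1],
            A[1][0] * v[0] + A[1][1] * v[1])

def dot(u, v):
    return u[0] * v[0] + u[1] * v[1]

s_map, t_map = rotation(0.3), rotation(1.1)
u, v = (2.0, -1.0), (0.5, 3.0)

# (s o t)(u) applied step by step; the composite still preserves <u, v>.
su = apply(s_map, apply(t_map, u))
sv = apply(s_map, apply(t_map, v))
print(abs(dot(su, sv) - dot(u, v)) < 1e-12)  # True
```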
Theorem 6. The inverse of an orthogonal linear transformation, when defined, is an or-
thogonal transformation.
Proof. Let V be a finite dimensional inner product space and t : V → V be an orthogonal linear
transformation. By Theorem 2, t is a monomorphism and hence, V being finite dimensional, it is an
isomorphism. Therefore t is invertible.
For u, v ∈ V, we have
< (t ∘ t⁻¹)(u), (t ∘ t⁻¹)(v) > = < t(t⁻¹(u)), t(t⁻¹(v)) >
⇒ < u, v > = < t⁻¹(u), t⁻¹(v) > (∵ t is orthogonal)
Hence t⁻¹ is an orthogonal transformation.
Theorem 7. Let V be a finite dimensional inner product space. Then the set of all orthogonal
transformations on V (all automorphisms of the inner product space) is a group.
Proof. Let A(V) be the set of all automorphisms of the inner product space V. By Theorem 5, for
any t1, t2 ∈ A(V), t1 ∘ t2 ∈ A(V). Therefore A(V) is closed under composition.
Since orthogonal transformations are functions and composition of functions is always associa-
tive, for any
t1, t2, t3 ∈ A(V), (t1 ∘ t2) ∘ t3 = t1 ∘ (t2 ∘ t3). .....(1)
Since
< I_V(u), I_V(u) > = < u, u > for all u ∈ V,
the identity map I_V is an orthogonal transformation. Therefore
I_V ∈ A(V). .....(2)
Further,
t ∈ A(V) ⇒ t is an orthogonal transformation on V
⇒ t is a monomorphism from V into itself [by Theorem 2]
⇒ t is an isomorphism [∵ t : V → V and V is finite dimensional]
⇒ t⁻¹ : V → V exists.
For any u, v ∈ V,
< u, v > = < t(t⁻¹(u)), t(t⁻¹(v)) >
= < t⁻¹(u), t⁻¹(v) > (∵ t is orthogonal)
⇒ t⁻¹ is an orthogonal transformation
⇒ t⁻¹ ∈ A(V). .....(3)
Hence A(V) is a group. A(V) is called the orthogonal group of V.
Theorem 8. Let B = {u1, u2, ..., un} be an orthonormal basis of an inner product space V.
Then a linear transformation t from V to an inner product space V′ is orthogonal if and only if
the set {t(u1), t(u2), ..., t(un)} is orthonormal in V′.
Proof. First suppose that t : V → V′ is an orthogonal transformation. Then we have to prove
that {t(u1), t(u2), ..., t(un)} is an orthonormal set.
For all u, v ∈ V,
< t(u), t(v) > = < u, v >.
Thus for all i, j = 1, 2, ..., n,
< t(ui), t(uj) > = < ui, uj > = δij,
since B is an orthonormal basis of V.
Hence {t(u1), t(u2), ..., t(un)} is an orthonormal set in V′.
Conversely, let t be a linear transformation such that {t(u1), t(u2), ..., t(un)} is an orthonormal
set in V′. Then we have to show that t is an orthogonal transformation. In order to prove this it is suffi-
cient to show that || t(u) || = || u || for all u ∈ V. Let u ∈ V be any vector; then for some αi ∈ R we have

u = Σ_{i=1}^n αi ui
⇒ t(u) = Σ_{i=1}^n αi t(ui).
Now,
|| u ||² = < u, u > = < Σ_{i=1}^n αi ui, Σ_{j=1}^n αj uj >
= Σ_{i=1}^n Σ_{j=1}^n αi αj < ui, uj >
= Σ_{i=1}^n Σ_{j=1}^n αi αj δij
= Σ_{i=1}^n αi² (on summing over j) .....(1)
and
|| t(u) ||² = < t(u), t(u) > = < Σ_{i=1}^n αi t(ui), Σ_{j=1}^n αj t(uj) >
= Σ_{i=1}^n Σ_{j=1}^n αi αj < t(ui), t(uj) >
= Σ_{i=1}^n Σ_{j=1}^n αi αj δij
= Σ_{i=1}^n αi² (on summing over j) .....(2)
So from (1) and (2), we have || t(u) || = || u || for all u ∈ V.
Hence t is orthogonal.
Theorem 9. If t : V → V is any map from an inner product space V to itself such that
(i) t(0) = 0, and
(ii) || t(u) – t(v) || = || u – v || for all u, v ∈ V,
then t is an orthogonal linear transformation.
Proof. Using (i) and (ii), we have
|| t(u) || = || t(u) – 0 || = || t(u) – t(0) || = || u – 0 || = || u ||.
That is, || t(u) || = || u || for all u ∈ V.
Also, using (ii), we get
|| t(u) – t(v) ||² = || u – v ||² = < u – v, u – v >
= < u, u > + < v, v > – 2 < u, v > .....(1)
and || t(u) – t(v) ||² = < t(u) – t(v), t(u) – t(v) >
= < t(u), t(u) > + < t(v), t(v) > – 2 < t(u), t(v) >
= || t(u) ||² + || t(v) ||² – 2 < t(u), t(v) >
= || u ||² + || v ||² – 2 < t(u), t(v) > .....(2)
Thus from (1) and (2), we obtain
< t(u), t(v) > = < u, v > for all u, v ∈ V.
So t preserves the inner product.
It remains to show that t is linear. Let {u1, u2, ..., un} be an orthonormal basis of V. Since t
preserves the inner product, {t(u1), t(u2), ..., t(un)} is an orthonormal basis too (as in Theorem 3,
whose computation uses only the preservation of inner products).
Let u = Σ_{i=1}^n αi ui, with αi ∈ R.
Then the coordinates of t(u) relative to the orthonormal basis {t(u1), ..., t(un)} are < t(u), t(ui) >, i.e.
t(u) = Σ_{i=1}^n < t(u), t(ui) > t(ui)
= Σ_{i=1}^n < u, ui > t(ui)
= Σ_{i=1}^n αi t(ui).
Thus t(Σ_{i=1}^n αi ui) = Σ_{i=1}^n αi t(ui).
Hence t is linear, and being linear and inner-product-preserving, it is orthogonal.
15.3 Orthogonal matrix

A square matrix A of order n over R is said to be orthogonal if its columns are orthonormal in the
standard inner product of Rⁿ.
Thus a square matrix A = [aij] is orthogonal if Ciᵀ Cj = δij for all i, j = 1, 2, ..., n, where Ci
stands for the ith column of A regarded as a vector in Rⁿ, or equivalently,
Σ_{k=1}^n aki akj = δij for all i, j = 1, 2, ..., n.

Theorem 10. Let V be a finite dimensional inner product space. Then a linear transforma-
tion t : V → V is orthogonal if and only if its matrix relative to an orthonormal basis is orthogo-
nal.
Proof. Let B = {u1, u2, ..., un} be an orthonormal basis of V and let A = [aij] over R be the matrix
of t relative to the basis B, so that
t(ui) = Σ_{k=1}^n aki uk, 1 ≤ i ≤ n.
Now, t is orthogonal
⇔ < t(u), t(v) > = < u, v > for all u, v ∈ V
⇔ < t(ui), t(uj) > = < ui, uj > = δij, i, j = 1, 2, ..., n
⇔ < Σ_{k=1}^n aki uk, Σ_{r=1}^n arj ur > = δij
⇔ Σ_{k=1}^n Σ_{r=1}^n aki arj < uk, ur > = δij
⇔ Σ_{k=1}^n Σ_{r=1}^n aki arj δkr = δij (∵ B is an orthonormal basis)
⇔ Σ_{k=1}^n aki akj = δij (on summing over r)
⇔ Ciᵀ Cj = δij for all i, j = 1, 2, ..., n
⇔ A is an orthogonal matrix.
Theorem 11. An orthogonal matrix is always non-singular.
Proof. Let A be an orthogonal matrix of order n. Then the columns of A are orthonormal in the
standard inner product of Rⁿ. We know that a list of orthonormal vectors in an inner product space is
linearly independent. Consequently rank(A) = n, which implies that | A | ≠ 0. Hence A is non-singular.
229
Theorem 12. Let V be a finite dimensional inner product space and t ∈ Hom(V, V). Then
the linear transformation t is orthogonal if and only if the matrix A of t with respect to an orthonor-
mal basis satisfies the conditions AᵀA = I and AAᵀ = I.
Proof. Let t be orthogonal, and let A = [aij] be its matrix relative to an orthonormal basis
{u1, u2, ..., un}, so that
t(ui) = Σ_{j=1}^n aji uj, 1 ≤ i ≤ n.
Since t is orthogonal, < t(ui), t(uj) > = < ui, uj > = δij. Now,
< t(ui), t(ui) > = < Σ_{j=1}^n aji uj, Σ_{k=1}^n aki uk >
= Σ_{j=1}^n Σ_{k=1}^n aji aki < uj, uk >
= Σ_{j=1}^n Σ_{k=1}^n aji aki δjk
= Σ_{j=1}^n aji² = 1 (on summing over k) .....(1)
and, for i ≠ j,
< t(ui), t(uj) > = < Σ_{k=1}^n aki uk, Σ_{r=1}^n arj ur >
= Σ_{k=1}^n Σ_{r=1}^n aki arj < uk, ur >
= Σ_{k=1}^n Σ_{r=1}^n aki arj δkr
= Σ_{k=1}^n aki akj = 0 (on summing over r) .....(2)
Equations (1) and (2) imply that AᵀA = I, since the (i, k)th entry of Aᵀ is aki. Similarly, AAᵀ = I.
Conversely, AᵀA = I implies that the conditions in (1) and (2) are satisfied, and hence
{t(u1), t(u2), ..., t(un)} is an orthonormal set, which is a basis of V. Thus t takes an orthonormal basis to
an orthonormal basis, so by Theorem 8, t is an orthogonal linear transformation.
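The two conditions AᵀA = I and AAᵀ = I can be checked numerically. The sketch below is illustrative only (the helper names transpose, matmul and is_identity are our own), verifying both products for a rotation matrix:

```python
import math

def transpose(A):
    n = len(A)
    return [[A[j][i] for j in range(n)] for i in range(n)]

def matmul(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

def is_identity(M, tol=1e-12):
    n = len(M)
    return all(abs(M[i][j] - (1.0 if i == j else 0.0)) <= tol
               for i in range(n) for j in range(n))

t = 0.4
A = [[math.cos(t), -math.sin(t)],
     [math.sin(t),  math.cos(t)]]

print(is_identity(matmul(transpose(A), A)))  # True  (A^T A = I)
print(is_identity(matmul(A, transpose(A))))  # True  (A A^T = I)
```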
Theorem 13. The determinant of an orthogonal matrix is ± 1.
Proof. Let A be an orthogonal matrix. Then
AAᵀ = I
⇒ Det(AAᵀ) = Det(I)
⇒ Det(A) Det(Aᵀ) = 1
⇒ (Det(A))² = 1 [∵ Det(Aᵀ) = Det(A)]
Hence Det(A) = ± 1.
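As a quick numerical confirmation (our own example): a plane rotation has determinant +1 and a reflection has determinant –1, and both values satisfy Det(A) = ± 1:

```python
import math

def det2(A):
    # Determinant of a 2x2 matrix.
    return A[0][0] * A[1][1] - A[0][1] * A[1][0]

t = 0.9
rotation = [[math.cos(t), -math.sin(t)],
            [math.sin(t),  math.cos(t)]]   # orthogonal, det = cos^2 + sin^2 = 1
reflection = [[1.0, 0.0],
              [0.0, -1.0]]                 # orthogonal, det = -1

print(round(det2(rotation), 12))  # 1.0
print(det2(reflection))           # -1.0
```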

15.4 Principal axis theorem

Lemma : The eigenvalues of a self-adjoint linear transformation are real.
Proof : Let V be a finite dimensional real inner product space. Let t : V → V be a self-adjoint
linear transformation. Then the matrix A of t with respect to an orthonormal basis of V is a real symmetric
matrix. Let λ be an eigenvalue of t; then it is an eigenvalue of A. Also let X (with possibly complex
entries) be an eigenvector of A corresponding to the eigenvalue λ. Then we have
AX = λX .....(1)
Taking the conjugate transpose of (1), and using the fact that A is real and symmetric (so that its
conjugate transpose is A itself), we get
X̄ᵀA = λ̄X̄ᵀ .....(2)
Multiplying (1) on the left by X̄ᵀ gives
X̄ᵀAX = λ X̄ᵀX,
while multiplying (2) on the right by X gives
X̄ᵀAX = λ̄ X̄ᵀX.
So from (1) and (2), we have
(λ – λ̄) X̄ᵀX = 0.
Hence λ = λ̄, i.e. λ is real, because X ≠ 0 and so X̄ᵀX > 0.
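The lemma can be seen concretely in the 2×2 case: for a real symmetric matrix [[a, b], [b, d]], the characteristic equation λ² – (a + d)λ + (ad – b²) = 0 has discriminant (a – d)² + 4b² ≥ 0, so its roots are always real. A small sketch (our own illustration, not from the original text):

```python
import math

def symmetric_2x2_eigenvalues(a, b, d):
    # Eigenvalues of the real symmetric matrix [[a, b], [b, d]] solve
    # L^2 - (a + d) L + (a d - b^2) = 0. Its discriminant is
    # (a - d)^2 + 4 b^2 >= 0, so both roots are real, as the lemma asserts.
    disc = (a - d) ** 2 + 4 * b ** 2
    r = math.sqrt(disc)
    return ((a + d - r) / 2, (a + d + r) / 2)

# For [[2, 1], [1, 2]]: trace 4, determinant 3, eigenvalues 1 and 3.
print(symmetric_2x2_eigenvalues(2.0, 1.0, 2.0))  # (1.0, 3.0)
```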

Self-learning exercise-1

1. An orthogonal linear transformation between inner product spaces preserves lengths, but not
the angle between two vectors. [T/F]
2. If Ci and Cj are columns of an orthogonal matrix, then Ciᵀ Cj = ..... .
3. If a matrix A is orthogonal, then Aᵀ = A⁻¹. [T/F]
4. If a matrix A is orthogonal, then AAᵀ = ...... .
Theorem 14. (Principal axis theorem). Let t be a self-adjoint linear transformation on a
finite dimensional inner product space V. Then the matrix of t with respect to some orthonormal
basis is a diagonal matrix whose diagonal elements are the eigenvalues of t; these are real, and each
appears on the diagonal as many times as its multiplicity.
Proof : We shall prove the theorem by induction on the dimension n of the inner product space V.
For n = 1, let B = {v1} be an orthonormal basis.
So t(v1) = λ1 v1, for some λ1 ∈ R.
Therefore v1 is an eigenvector of t and λ1 is the corresponding eigenvalue, which is real, and
A = [λ1]. Hence the theorem is true for n = 1.
Now assume that it is true for every inner product space whose dimension is (n – 1). We shall
show that it is true for V whose dimension is n. By the lemma above, the eigenvalues of t are real, so t
has an eigenvector in V; let v1 be such an eigenvector of unit length, with t(v1) = λ1 v1.
Now let W = {v1}⊥ = {u ∈ V | < u, v1 > = 0} be the orthogonal complement of v1. It is a sub-
space of V. We shall now show that W is invariant under t. To do so, let v ∈ W. Then
< v, v1 > = 0 .....(1)
We have < t(v), v1 > = < v, t(v1) >, since t is self-adjoint
= < v, λ1 v1 >
= λ1 < v, v1 >
= 0 [from (1)]
Thus t(v) ∈ W.
Hence v ∈ W ⇒ t(v) ∈ W.
Thus t maps vectors of W to vectors of W only. Therefore the restriction of t to W,
tW : W → W, is also a self-adjoint linear transformation.
Now, dim W⊥ + dim W = n
⇒ dim W = n – dim W⊥
⇒ dim W = n – 1, because W⊥ = < {v1} > and dim W⊥ = 1.
Hence tW : W → W is a self-adjoint linear transformation on the (n – 1)-dimensional inner
product space W, so by the induction assumption there is an orthonormal basis of eigenvectors, say
{v2, v3, ..., vn}, of W for which the matrix of tW is a diagonal matrix. Combining the basis {v1} of W⊥
with the basis {v2, v3, ..., vn} of W, we obtain an orthonormal basis of eigenvectors {v1, v2, ..., vn} of V,
for which the matrix of the self-adjoint transformation t is a diagonal matrix; each diagonal element is
an eigenvalue of t, appearing as many times as its multiplicity, and the eigenvalues of t are real.
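As a worked instance of the principal axis theorem (our own example, not from the original text), the symmetric matrix A = [[2, 1], [1, 2]] has eigenvalues 1 and 3 with orthonormal eigenvectors (1, –1)/√2 and (1, 1)/√2; conjugating by the orthogonal matrix P whose columns are these eigenvectors diagonalizes A:

```python
import math

def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

s = 1 / math.sqrt(2)
A = [[2.0, 1.0], [1.0, 2.0]]   # real symmetric (self-adjoint in standard basis)
P = [[s, s], [-s, s]]          # columns: orthonormal eigenvectors of A
Pt = [[P[j][i] for j in range(2)] for i in range(2)]  # P transpose = P inverse

# P^T A P is diagonal, with the eigenvalues of A on the diagonal.
D = matmul(matmul(Pt, A), P)
print([[round(x, 10) for x in row] for row in D])  # [[1.0, 0.0], [0.0, 3.0]]
```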

15.5 Summary

In this unit we have studied orthogonal linear transformations between inner product spaces, the
orthogonal matrices related to such transformations and their properties, and a very useful and
fundamental result known as the principal axis theorem.

15.6 Answers to self learning exercises

Self learning exercise-1


1. False 2. δij 3. True 4. I

15.7 Exercises

1. Prove that if W1 and W2 are subspaces of V such that dim W1 = dim W2, then there exists an
orthogonal transformation t such that t (W1) = W2.
2. If A is an orthogonal matrix, show that AT and A–1 are orthogonal matrices.



Reference Books

1. Contemporary Abstract Algebra, Joseph A. Gallian, Narosa Publishing House, New Delhi, 1998.

2. A First Course in Abstract Algebra, John B. Fraleigh, Addison-Wesley Publishing Company, 1970.

3. Topics in Algebra, I.N. Herstein, Vani Educational Books, New Delhi, 1975.

4. Studies in Algebra, Dileep S. Chauhan and K.N. Singh, Jaipur Publishing House, Jaipur, 2009.
