
SIMILAR MATRICES

1. Similar Matrices
Fix a linear transformation T : R^n → R^n and an ordered basis, B = (~v1 , . . . , ~vn ), of R^n.
The standard matrix [T] and the matrix of T with respect to B, [T]_B, are related by
[T]S = S[T]_B,   [T] = S[T]_B S^{-1},   and   [T]_B = S^{-1}[T]S,
where
S = [ ~v1 | ··· | ~vn ]
is the change of basis matrix of the basis. In order to understand this relationship
better, it is convenient to take it as a definition and then study it abstractly.
Given two n × n matrices, A and B, we say that A is similar to B if there exists
an invertible n × n matrix, S, so that
AS = SB.
Observe that this is equivalent to
A = SBS^{-1} or B = S^{-1}AS.
EXAMPLE: If A is similar to In , then A = In . (Indeed, A = S In S^{-1} = S S^{-1} = In .)
EXAMPLE: A = \begin{pmatrix} 2 & -3 \\ 1 & -2 \end{pmatrix} is similar to B = \begin{pmatrix} 1 & 0 \\ 0 & -1 \end{pmatrix}. To see this, let
S = \begin{pmatrix} 3 & 1 \\ 1 & 1 \end{pmatrix},
and compute
AS = \begin{pmatrix} 3 & -1 \\ 1 & -1 \end{pmatrix} = SB.
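As a quick numerical sanity check of this example, here is a minimal sketch using NumPy (an assumption of these notes' examples is only pencil-and-paper computation; the arrays below are just the matrices A, B, and S above):

import numpy as np

A = np.array([[2, -3],
              [1, -2]])
B = np.array([[1, 0],
              [0, -1]])
S = np.array([[3, 1],
              [1, 1]])

# A is similar to B via S: AS = SB, or equivalently B = S^{-1} A S.
print(np.allclose(A @ S, S @ B))                 # True
print(np.allclose(np.linalg.inv(S) @ A @ S, B))  # True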
EXAMPLE: If A is similar to B, then A^2 is similar to B^2 . To see this, observe
that, by definition, A = SBS^{-1} for some invertible S. Hence,
A^2 = (SBS^{-1})(SBS^{-1}) = SB(S^{-1}S)BS^{-1} = SB In BS^{-1} = SB^2 S^{-1},
and so A^2 is similar to B^2 .
EXAMPLE: If A is similar to B and one is invertible, then both are, and A^{-1}
is similar to B^{-1} . Indeed, suppose B is invertible; then A = SBS^{-1} for invertible
S, and so A is also invertible as it is the product of three invertible matrices. A
similar argument shows that B is invertible if A is. Finally, one has
A^{-1} = (SBS^{-1})^{-1} = (S^{-1})^{-1} B^{-1} S^{-1} = SB^{-1} S^{-1} ,
which completes the proof. In general, A^t is similar to B^t for any integer t (one
needs A, B invertible when t < 0).
EXAMPLE: A = \begin{pmatrix} 2 & -3 \\ 1 & -2 \end{pmatrix} has A^{100} = I2 . This is because A is similar to
B = \begin{pmatrix} 1 & 0 \\ 0 & -1 \end{pmatrix}. As B is diagonal,
B^{100} = \begin{pmatrix} 1^{100} & 0 \\ 0 & (-1)^{100} \end{pmatrix} = I2 ,
and so A^{100} is similar to I2 and hence is equal to I2 . Similarly,
A^{101} = SB^{101} S^{-1} = SBS^{-1} = A.
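This computation is easy to reproduce numerically; a minimal NumPy sketch, reusing the A, B, and S from this example:

import numpy as np

A = np.array([[2, -3],
              [1, -2]])
B = np.diag([1, -1])
S = np.array([[3, 1],
              [1, 1]])

# Large powers via similarity: A^k = S B^k S^{-1}, and powers of the diagonal B are easy.
B100 = np.diag(np.diag(B) ** 100)        # B^100 = I2
A100 = S @ B100 @ np.linalg.inv(S)
print(np.allclose(A100, np.eye(2)))                              # True
print(np.allclose(np.linalg.matrix_power(A, 100), np.eye(2)))    # True (direct check)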
2. Properties of Similar Matrices

Being similar is an equivalence relation. That is:
(1) (Reflexive): A is similar to A.
(2) (Symmetric): A is similar to B ⇐⇒ B is similar to A
(3) (Transitive): if A is similar to B and B is similar to C, then A is similar
to C.
This is relatively straightforward to check:
(1) Follows by taking S = In .
(2) Follows by observing AS = SB means BS^{-1} = S^{-1}A, i.e. BS' = S'A for
S' = S^{-1} .
(3) Follows by observing AS = SB and BT = TC means that AST = SBT =
STC and so A is similar to C by using S' = ST . Here we used that S' is
the product of two invertible matrices and so is invertible.
It is not easy, in general, to tell whether two matrices are similar and this is a
question we will return to later in the class. It can be easy to tell when they are
not similar.
Theorem 2.1. If A and B are similar, then null(A) = null(B) (and so rank(A) =
rank(B)).
Proof. As A is similar to B, we have AS = SB for some invertible S. Now suppose
that ~z ∈ ker(B). It is clear that S~z ∈ ker(A). Indeed,
A(S~z) = (AS)~z = (SB)~z = S(B~z) = S~0 = ~0.
Moreover, if ~z1 , . . . , ~zp are a basis of ker(B), then S~z1 , . . . , S~zp are linearly inde-
pendent. Indeed,
c1 S~z1 + · · · + cp S~zp = ~0 ⇒ S(c1 ~z1 + · · · + cp ~zp ) = ~0.
Hence,
c1 ~z1 + · · · + cp ~zp ∈ ker(S) ⇒ c1 ~z1 + · · · + cp ~zp = ~0,
as S is invertible. Since ~z1 , . . . , ~zp are a basis, this forces c1 = · · · = cp = 0, so the
only linear relation among S~z1 , . . . , S~zp is the trivial one, which proves the claim.
Hence, we have p linearly independent vectors in ker(A) and so
null(A) = dim ker(A) ≥ p = dim(ker(B)) = null(B).
As this argument is symmetric in A and B we conclude also that null(B) ≥ null(A)
which proves the result. Finally, by the rank-nullity theorem
rank(A) = n − null(A) = n − null(B) = rank(B)
where n is the number of columns in A and B. 
EXAMPLE: \begin{pmatrix} 1 & -1 \\ 2 & -2 \end{pmatrix} is not similar to \begin{pmatrix} 1 & 2 \\ 0 & -1 \end{pmatrix}. By inspection, the first matrix
has rank 1 and the second has rank 2.
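This rank obstruction is easy to check numerically; a minimal NumPy sketch, where the two arrays are the matrices above:

import numpy as np

M1 = np.array([[1, -1],
               [2, -2]])
M2 = np.array([[1, 2],
               [0, -1]])

# Similar matrices have equal rank, so different ranks rule out similarity.
print(np.linalg.matrix_rank(M1))   # 1
print(np.linalg.matrix_rank(M2))   # 2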
3. Diagonal Matrices
A matrix is diagonal if its only non-zero entries are on the diagonal. For instance,
B = \begin{pmatrix} k1 & 0 & 0 \\ 0 & k2 & 0 \\ 0 & 0 & k3 \end{pmatrix},
is a 3 × 3 diagonal matrix. Geometrically, a diagonal matrix acts by “stretching”
each of the standard vectors. Algebraically, this means B is diagonal if and only if
B~ei = ki~ei
for each standard vector ~ei . As we have seen, diagonal matrices and matrices that
are similar to diagonal matrices are extremely useful for computing large powers
of the matrix. As such, it is natural to ask when a given matrix is similar to a
diagonal matrix.
We have the following complete answer:
Theorem 3.1. A matrix A is similar to a diagonal matrix if and only if there is
an ordered basis B = (~v1 , . . . , ~vn ) so that
A~vi = ki~vi
for some ki ∈ R.
That is, A stretches each ~vi by a factor of ki . It is worth mentioning that the ~vi are
examples of eigenvectors of A (a topic we will study later).
Proof. Suppose first that A is similar to a diagonal matrix B. That is, AS = SB
for a diagonal matrix B and invertible matrix S. As B is diagonal, B~ei = ki~ei , where
ki is the ith entry on the diagonal.
If ~vi = S~ei , then B = (~v1 , . . . , ~vn ) is an ordered basis, as the ~vi are the columns of
the invertible matrix S. Moreover,
A~vi = AS~ei = SB~ei = S(B~ei ) = S(ki~ei ) = ki S~ei = ki~vi .
This verifies one direction of the theorem.
Conversely, if one has the ordered basis B = (~v1 , . . . , ~vn ) so that A~vi = ki~vi , and
S is the change of basis matrix of B, then
(S^{-1}AS)~ei = S^{-1}(A(S~ei )) = S^{-1}(A~vi ) = S^{-1}(ki~vi ) = ki S^{-1}~vi = ki~ei .
Hence, B = S^{-1}AS is diagonal and is similar to A by definition. 
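For a concrete feel for the theorem, here is a minimal NumPy sketch; it assumes np.linalg.eig, which returns the stretch factors ki together with a matrix S whose columns are corresponding vectors ~vi whenever such a basis exists:

import numpy as np

A = np.array([[2., -3.],
              [1., -2.]])

# eig returns the stretch factors (eigenvalues) and a matrix S whose columns
# are corresponding eigenvectors, so that A S = S diag(k1, ..., kn).
k, S = np.linalg.eig(A)

D = np.linalg.inv(S) @ A @ S                # should be diagonal with the ki on the diagonal
print(np.round(D, 10))                      # diag(1, -1), up to the ordering chosen by eig
print(np.allclose(A @ S, S @ np.diag(k)))   # True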
This may be rephrased in terms of linear transformations as follows:
Theorem 3.2. A linear transformation T : R^n → R^n given by T(~x) = A~x has a
basis B = (~v1 , . . . , ~vn ) so that [T]_B is diagonal if and only if T(~vi ) = λi~vi for some
scalars λi ∈ R.
Proof. Take A = [T] and B = [T]_B and apply the preceding theorem. 
EXAMPLE: Let R_{π/2} : R^2 → R^2 be rotation counterclockwise by 90°. For any
~v ≠ ~0, R_{π/2}(~v) is perpendicular to ~v, so there are no non-zero vectors that get
stretched. Hence, there is no basis B for which [R_{π/2}]_B is diagonal.
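Numerically, the obstruction shows up as complex eigenvalues; a minimal NumPy sketch, where the array below is the standard matrix of this rotation:

import numpy as np

# Standard matrix of counterclockwise rotation by 90 degrees.
R = np.array([[0., -1.],
              [1.,  0.]])

eigenvalues, _ = np.linalg.eig(R)
print(eigenvalues)                       # [0.+1.j 0.-1.j] -- purely imaginary
print(np.all(np.isreal(eigenvalues)))    # False: no real stretch factors, so no real diagonalization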

EXAMPLE: Let P : R^2 → R^2 be projection onto the line y = x. This means
P\begin{pmatrix} 1 \\ 1 \end{pmatrix} = \begin{pmatrix} 1 \\ 1 \end{pmatrix}   and   P\begin{pmatrix} 1 \\ -1 \end{pmatrix} = \begin{pmatrix} 0 \\ 0 \end{pmatrix}.
Hence, if
B = \left( \begin{pmatrix} 1 \\ 1 \end{pmatrix}, \begin{pmatrix} 1 \\ -1 \end{pmatrix} \right),
then
[P]_B = \begin{pmatrix} 1 & 0 \\ 0 & 0 \end{pmatrix}.
As a consequence, if
S = \begin{pmatrix} 1 & 1 \\ 1 & -1 \end{pmatrix}
is the change of basis matrix of B, then
[P] = S[P]_B S^{-1} = S \begin{pmatrix} 1 & 0 \\ 0 & 0 \end{pmatrix} S^{-1}.
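Carrying out that multiplication numerically, a minimal NumPy sketch using the S and [P]_B above:

import numpy as np

# Change of basis matrix whose columns are (1, 1) and (1, -1), and the matrix
# of P with respect to that basis.
S = np.array([[1., 1.],
              [1., -1.]])
PB = np.array([[1., 0.],
               [0., 0.]])

# Standard matrix of the projection: [P] = S [P]_B S^{-1}.
P = S @ PB @ np.linalg.inv(S)
print(P)   # [[0.5 0.5]
           #  [0.5 0.5]] -- the projection onto the line y = x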