Ma1522 Impt Info Cheatsheet

Uploaded by Calvin Chan

Drake Does MA1522
Topic: Matrix General Properties and True / False

- The RREF of a matrix is unique (a REF is not); the matrix need not be square.
- Vectors used to form a basis must be linearly independent.
Topic: Linear Algebra basics

Properties of Transpose
- (AB)^T = B^T A^T
- (A^T)^T = A
- (cA)^T = c A^T
- (A + B)^T = A^T + B^T
- (A^k)^T = (A^T)^k
- A^T is A flipped over its main diagonal.

Triangular matrices
- The transpose of an upper triangular matrix is lower triangular, and vice versa.

Symmetric matrices
- A is symmetric <-> A^T = A

Row Equivalent Matrices
- Two (augmented) matrices are row equivalent when one can be obtained from the
  other by a series of elementary row operations: A -> ... -> B.
Invertible Properties and Matrix Math
- A^(-k) = (A^(-1))^k
- A^(m+n) = A^m A^n
- (A^m)^n = A^(mn)
- (cA)^(-1) = (1/c) A^(-1)
- (AB)^(-1) = B^(-1) A^(-1)
- Remark: if AM = I, then M = A^(-1).
LU Decomp
Algorithm:
1. Reduce A to an upper triangular matrix U using elementary row operations
   (no row swaps).
2. Use the same elementary operations in reverse, with signs inverted, on the
   identity matrix to get the lower triangular L.

Solving systems with LU
Ax = b => L(Ux) = b
1. Can find y: solve Ly = b first (forward substitution).
2. Now find x: solve Ux = y (back substitution).
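The two steps above can be sketched in numpy. This is a minimal illustration on a made-up 3x3 system, assuming no row swaps are needed during elimination:

```python
import numpy as np

def lu_no_pivot(A):
    """Doolittle LU factorisation; assumes no row swaps are needed."""
    n = A.shape[0]
    L = np.eye(n)
    U = A.astype(float)
    for j in range(n):
        for i in range(j + 1, n):
            m = U[i, j] / U[j, j]            # multiplier that zeroes entry (i, j)
            U[i, :] = U[i, :] - m * U[j, :]  # elementary row op applied to U
            L[i, j] = m                      # same op, sign inverted, recorded in L
    return L, U

def lu_solve(L, U, b):
    y = np.linalg.solve(L, b)   # step 1: Ly = b (forward)
    x = np.linalg.solve(U, y)   # step 2: Ux = y (backward)
    return x

A = np.array([[2.0, 1.0, 1.0],
              [4.0, 3.0, 3.0],
              [8.0, 7.0, 9.0]])
b = np.array([1.0, 2.0, 5.0])
L, U = lu_no_pivot(A)
x = lu_solve(L, U, b)
```

The payoff of LU: once L and U are known, solving for a new b is just two cheap triangular solves.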


Topic: Invertibility (see next topic for props of inverse!)

Key Formulas
- Ax = b -> A^(-1)Ax = A^(-1)b -> x = A^(-1)b
- A A^(-1) = A^(-1) A = I
- (AB)^(-1) = B^(-1) A^(-1)
- (A^(-1))^T = inverse of A^T
- (A1 A2 ... Ak)^(-1) = Ak^(-1) ... A2^(-1) A1^(-1)

How to find the Inverse
1. Form the augmented matrix (A | I).
2. RREF it: (A | I) -> (I | A^(-1)). Turning A into I turns I into A^(-1),
   because the row operations correspond to products of elementary matrices.

Main Theorem for Invertibility
All n x n matrices are either:
1. Invertible (non-singular)
2. Non-invertible (singular)

A singular matrix:
- is not row equivalent to I
- does not have n pivot positions
- has linearly dependent columns

Key Ideas (equivalent statements for a square matrix A):
1. A is invertible.
2. Ax = b has a unique solution for every b.
3. Ax = 0 has only the trivial solution.
4. The RREF of A is the identity matrix.
5. A is a product of elementary matrices.

The inverse A^(-1) is unique, and if BA = I (or AB = I) for square A,
then B = A^(-1).

For a 2x2 matrix A = [a b; c d]:
- A is invertible iff ad - bc != 0, and then A^(-1) = 1/(ad - bc) [d -b; -c a]
- if ad - bc = 0, A is not invertible.

Note: the zero matrix is not invertible. In fact, any matrix with a zero
row/column is not invertible due to linear dependence.
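The (A | I) -> (I | A^(-1)) procedure above can be sketched directly. A minimal Gauss-Jordan implementation on a made-up 2x2 example (partial pivoting added so zero pivots don't break it):

```python
import numpy as np

def inverse_gauss_jordan(A):
    """Invert A by row reducing the augmented matrix (A | I) to (I | A^-1)."""
    n = A.shape[0]
    M = np.hstack([A.astype(float), np.eye(n)])        # (A | I)
    for col in range(n):
        pivot = col + np.argmax(np.abs(M[col:, col]))  # best pivot row
        M[[col, pivot]] = M[[pivot, col]]              # row swap
        M[col] = M[col] / M[col, col]                  # scale pivot to 1
        for row in range(n):
            if row != col:
                M[row] = M[row] - M[row, col] * M[col] # clear rest of column
    return M[:, n:]                                    # right half is A^-1

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])
A_inv = inverse_gauss_jordan(A)
```

For this A, ad - bc = 1*4 - 2*3 = -2, so the 2x2 formula gives the same answer as the row reduction.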

Topic: Determinants, Adjoint Matrix and Cramer's Rule

Key Formulas
- For a 2x2 matrix [a b; c d]: det = ad - bc.
- For n > 2, use cofactor expansion along any row i (or column j):
  det(A) = a_i1 det(A_i1) C-sign + ... written with cofactors:
  det(A) = a_i1 C_i1 + a_i2 C_i2 + ... + a_in C_in,
  where the cofactor C_ij = (-1)^(i+j) M_ij, the (i,j)-minor M_ij is the
  determinant of A with row i and column j ignored, and (-1)^(i+j) decides
  the + or - sign.
- If A is triangular: det(A) = product of the main diagonal entries
  (n = order of the matrix).

Adjoint Matrix
- The adjoint matrix is the transpose of the matrix of cofactors:
  if A = (a_ij)_{n x n}, then adj(A) = (C_ji)_{n x n}.

Adjoint Props (A square):
- A [adj(A)] = [adj(A)] A = det(A) I
- A^(-1) = [det(A)]^(-1) adj(A)
- adj(AB) = adj(B) adj(A)
- adj(A^T) = (adj A)^T; adj(A) is symmetric if A is symmetric
- det(adj A) = (det A)^(n-1)
- adj(adj A) = (det A)^(n-2) A

Properties
Row ops applied (A -> B) and how det changes:
- R1 <-> R2 (swap): det(B) = -det(A)
- k x Ri (scale): det(B) = k det(A)
- Ri + k Rj (add a multiple of another row): det(B) = det(A)
Other properties:
- det(A^T) = det(A)
- det(AB) = det(A) det(B)
- det(cA) = c^n det(A) for an n x n matrix (NOT c det(A))
- if A has a zero row or zero column, then det(A) = 0
- similar matrices have equal determinants: B = P^(-1) A P -> det(B) = det(A)

Cramer's Rule
Recall: the point of Cramer's is to find the value of one unknown x_i directly.
Let A = (a_ij)_{n x n} be invertible and b = (b_i) in R^n. For any b in R^n,
the unique solution of Ax = b has entries:
  x_i = det(A_i(b)) / det(A)
where A_i(b) is A with its i-th column replaced by b.
Equivalently, x = [det(A)]^(-1) adj(A) b.

Broken Diagonal (Sarrus) Method for 3x3 determinants: sum the products of the
three "down" diagonals and subtract the products of the three "up" diagonals.
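Cramer's rule as stated above is easy to sketch with numpy's determinant. A minimal example on a made-up 2x2 system:

```python
import numpy as np

def cramer_solve(A, b):
    """Solve Ax = b via Cramer's rule: x_i = det(A_i(b)) / det(A)."""
    n = A.shape[0]
    d = np.linalg.det(A)           # must be nonzero (A invertible)
    x = np.empty(n)
    for i in range(n):
        Ai = A.astype(float)
        Ai[:, i] = b               # replace i-th column with b
        x[i] = np.linalg.det(Ai) / d
    return x

# 2x + y = 3, x + 3y = 5
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
b = np.array([3.0, 5.0])
x = cramer_solve(A, b)
```

Note the exam use case: to find a single x_i you only need two determinants, not the full solution.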

Topic: Vector spaces and subspaces

Key Formulas
- ||v|| = sqrt(v . v) = sqrt(v_1^2 + ... + v_n^2)
- Implicit form: { v in R^n | v fulfills some conditions }
- Explicit form: the general solution to the linear system.

Key Ideas
- Can use linear algebra to represent vector spaces!
- Implicit -> Explicit: find the general solution of the implicit linear
  system. If the solution is unique, then the explicit form = the unique
  solution.
- Explicit -> Implicit: write the parameter t in terms of x, y and z. This
  forms a linear system in t; RREF and solve to find the "0 = ..." equations.
  These become your implicit conditions.

Example: the line { (t - 2, 2t + 3, t + 1) : t in R }.
Step 1: t in terms of x, y, z: x = t - 2, y = 2t + 3, z = t + 1.
Step 2: eliminate t: t = x + 2 gives y = 2x + 7 and z = x + 3, i.e. the
implicit form 0 = 2x - y + 7 and 0 = -x + z - 3.

Subspace info!
A subset V of R^n is a subspace if:
(a) it contains the origin (zero vector): 0 in V
(b) it is closed under addition: u + v in V for any u, v in V
(c) it is closed under scalar multiplication: cu in V for any u in V, c in R

What is a homogeneous linear system?
For Ax = b, if b = 0 then the system is homogeneous. Its solution set is a
subspace (it contains 0, since A0 = 0). If b != 0 the zero vector is not part
of the solution set, so solution sets of non-homogeneous systems do not
contain the origin and are not subspaces. You can only write the solution
space as a span for homogeneous systems, as they contain the zero vector.
The solution space/set of a linear system is the set of solution vectors to
the system.

True/False shit
- The intersection of any collection of subspaces of V is also a subspace of V.
- The union of 2 subspaces of V is NOT always a subspace of V.
- For subspaces U and W of V: dim(U + W) = dim(U) + dim(W) - dim(U ∩ W).
- Every subspace of finite dimension has a finite basis.
- V is a linear span of some finite set S.

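The explicit vs implicit forms of the worked line example can be cross-checked numerically. A tiny pure-Python sanity check (the two implicit equations are the ones derived in the example):

```python
def satisfies_implicit(x, y, z):
    """Implicit form of the line: 0 = 2x - y + 7 and 0 = -x + z - 3."""
    return 2 * x - y + 7 == 0 and -x + z - 3 == 0

# explicit (parametric) form: (t - 2, 2t + 3, t + 1) for t in R
checks = [satisfies_implicit(t - 2, 2 * t + 3, t + 1) for t in range(-5, 6)]
origin_on_line = satisfies_implicit(0, 0, 0)   # the origin is NOT on this line
```

Every point generated by the explicit form satisfies both implicit equations; the origin does not, which is also why this line is not a subspace.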

Topic: Linear Spans, Linear independence and Basis

Key Formulas
- S = Span(u_1, u_2, ..., u_n), where each u_i can be a column or row vector.
- Linear combinations are expressions a v_1 + b v_2 + c v_3 + ...
  (a, b, c are constants).

Key Definitions
- A span is a collection of vectors which are the 'building' blocks for the
  space.
- v is in the span if it is a linear combination of the spanning vectors.
- If you RREF the corresponding linear system and it is consistent, then v is
  a linear combination.
- The dim of the span of a set of vectors equals the number of linearly
  independent vectors in that set.

Key Ideas
- A span can have many vectors; span vectors may not all be independent of
  each other.
- When forming a basis, check whether any vectors in the span are a linear
  combination of the other vectors.
- Linearly independent means NO vector can be formed as a linear combination
  of the other vectors.

Span properties and shit
- If S spans R^3 then dim(S) = 3; if S spans a plane in R^3 then dim(S) = 2.
- If w is in Span(S), then Span(S ∪ {w}) = Span(S).

To show linear independence
1. Write the linear combination equation c_1 v_1 + ... + c_k v_k = 0.
2. Solve for the unknown constants c_i.
3. Check for trivial: if the only solution is c_1 = ... = c_k = 0, then the
   vectors are linearly independent.
4. Equivalently, the matrix of the vectors has no non-pivot (free) columns.
   This shows they aren't linear combos of the others.

Basis shit
- To check a square set is a basis, just RREF to the identity matrix.

True/False shit:
- Every vector space has at least one basis.
- The standard basis is the one where each vector has a single non-zero entry
  (equal to 1); R^n has a standard basis.
- Given a basis, you can deduce another one (e.g. by invertible combinations).
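The pivot-column test above can be sketched with a rank check: independence holds exactly when every column is a pivot column, i.e. rank equals the number of vectors. Made-up example vectors:

```python
import numpy as np

def is_independent(vectors):
    """Vectors are linearly independent iff the matrix with them as columns
    has a pivot in every column, i.e. rank == number of vectors."""
    M = np.column_stack(vectors)
    return np.linalg.matrix_rank(M) == len(vectors)

ind = is_independent([np.array([1.0, 0.0, 1.0]),
                      np.array([0.0, 1.0, 1.0])])      # independent pair
dep = is_independent([np.array([1.0, 2.0]),
                      np.array([2.0, 4.0])])           # second = 2 * first
```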
Topic: Dimensions, Rank and Nullity

Key Formulas (A is m x n, so number of cols of A = n)
- n = Rank(A) + Nullity(A)
- Rank(A) = n if A is full column rank
- n - rank(A) = dim(solution space of Ax = 0)
- dim(row space) = dim(col space) = rank(A)
- rank(A) = rank(A^T)
- rank(A) = 0 <-> A = 0
- rank(A) <= min{m, n}; rank(A) = min{m, n} is full rank
- Square matrix: is full rank <-> invertible

How to extend a basis
- Just add non-zero vectors not already in the span (e.g. standard basis
  vectors), keeping the set independent. If the basis is orthogonal, add
  vectors from its orthogonal complement (like in QR decomp).

Consistency Theorem
Ax = b is consistent <-> rank(A) = rank(A | b) <-> b is a linear combination
of the columns of A (b is "reducible" by the pivots).

Rank of a product
- Col(AB) is a subspace of Col(A), so rank(AB) <= min{rank(A), rank(B)}.

True/False shit
- A system Ax = b is consistent for every b iff the coefficient matrix has
  full row rank.
- The null space of the transpose of a matrix equals the orthogonal
  complement of the column space (and Null(A) is the orthogonal complement
  of the row space).
- A linear transformation is injective iff its null space contains only the
  zero vector.
- Null(A) lives in the same R^n as the row space (it is its orthogonal
  complement); it is not a subspace of the column space, which sits in R^m.
- Rank deficiency = not full rank.
- Nullity = 0 means the null space is {0}, which is 0-dimensional (NOT
  1-dimensional just because it contains the zero vector).


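The rank-nullity formula n = rank(A) + nullity(A) can be verified numerically. A sketch on a made-up 3x3 matrix with one dependent row, counting nullity as the number of (near-)zero singular values:

```python
import numpy as np

A = np.array([[1.0, 2.0, 3.0],
              [2.0, 4.0, 6.0],   # = 2 x row 1, so A is rank deficient
              [1.0, 0.0, 1.0]])
n = A.shape[1]                                  # number of columns

rank = np.linalg.matrix_rank(A)
sing = np.linalg.svd(A, compute_uv=False)       # singular values
nullity = int(np.sum(sing < 1e-10))             # zero singular values
```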
Topic: Transition Matrix and Coordinate Vectors

Note: a coordinate vector is unique only if S is a basis.

Key formulas:
- [w]_S = P [w]_T, where P = transition matrix from T to S.

Key Definitions:
- The transition matrix expresses Basis(T) in terms of Basis(S). It can be
  seen as a geometrical transformation, as a change of basis represents a
  change in axes!
- A coordinate vector represents a vector in the vector space as a linear
  combination of the basis, like a coordinate in this space. It's basically
  the constants, like how (x, y) are coordinates of 2D space.

Transition matrix math def
Suppose S = {u_1, ..., u_k} and T = {v_1, ..., v_k} are bases for a subspace
V of R^n. Write each v_j = a_1j u_1 + ... + a_kj u_k, so
[v_j]_S = (a_1j, ..., a_kj). The transition matrix from T to S collects all
these coordinate vectors as columns: P = (a_ij)_{k x k}, and [v]_S = P [v]_T.

If P is the transition matrix S -> T and Q is the transition matrix T -> S,
then PQ = QP = I, i.e. Q = P^(-1).

How to find transition matrix P from T to S:
RREF the augmented matrix ( [S] | [T] ) (basis vectors as columns) to
( I | P ).

Row and Column Space
- Row space = the vector space spanned by the rows: Span{r_1, ..., r_m}
  (r_i = i-th row vector).
- Col space = the vector space spanned by the cols: Span{c_1, ..., c_n}
  (c_j = j-th column vector).
- Row space is orthogonal to the null space; col space is orthogonal to the
  null space of the transpose.

Let R be the RREF of A:
- Pivot cols of R tell you which cols of A form a basis for the column space
  of A.
- Row ops DO NOT preserve column space: col space of A != col space of R.
- Row ops DO preserve linear relations between columns.
- Row ops do not preserve linear relations between rows.
- Row ops DO preserve row space: row space of A = row space of R.
- Non-zero (pivot) rows of the RREF of A form a basis of the row space of A.
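The formula [w]_S = P [w]_T can be sketched numerically. A minimal example on two made-up bases of R^2, using the fact that with basis vectors as matrix columns, S P = T gives P = S^(-1) T:

```python
import numpy as np

S = np.column_stack([[1.0, 0.0], [1.0, 1.0]])   # basis S = {u1, u2}
T = np.column_stack([[2.0, 1.0], [0.0, 1.0]])   # basis T = {v1, v2}

P = np.linalg.solve(S, T)        # transition matrix T -> S (solves S P = T)

v_T = np.array([1.0, 2.0])       # coordinates of some vector w relative to T
w = T @ v_T                      # the actual vector w
v_S = P @ v_T                    # [w]_S = P [w]_T
```

Going the other way just uses Q = P^(-1), matching the PQ = QP = I fact above.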

Topic: Dot product and vector formulas

- Distance: d(u, v) = ||u - v||
- Triangle inequality for distance: d(u, w) <= d(u, v) + d(v, w)
- Cauchy-Schwarz: |u . v| <= ||u|| ||v||
- Triangle inequality: ||u + v|| <= ||u|| + ||v||
Topic: Orthogonality and Orthonormal sets

An orthogonal set of nonzero vectors is linearly independent.

Key formulas
- u . v = 0 <-> u and v are orthogonal
- A^T A = I <-> the columns of A are orthonormal; for square A this means A
  is an orthogonal matrix, A is invertible and A^(-1) = A^T
- If S contains n non-zero pairwise-orthogonal vectors in R^n, it is an
  orthogonal basis of R^n; likewise k such vectors in a k-dimensional V form
  an orthogonal basis of V.

Key definitions
- Orthogonal Set: a set of nonzero vectors which are all orthogonal
  (perpendicular) to one another.
- Orthonormal Set: an orthogonal set in which all vectors are unit vectors.
- Orthogonal Matrix: a square matrix whose columns (equivalently rows) form
  an orthonormal set.

How to find an orthogonal basis?
Gram-Schmidt process! Basis -> orthogonal basis; normalize each vector to get
an orthonormal basis. Alt method of finding a basis: form a homogeneous
system from the dot products and take its general solution.

Coordinate vector relative to an orthogonal basis S = {u_1, ..., u_k}:
w = (w . u_1)/(u_1 . u_1) u_1 + ... + (w . u_k)/(u_k . u_k) u_k

Orthogonal Matrix
Suppose A is orthogonal:
- A^(-1) = A^T and A A^T = A^T A = I
- det(A) = +-1
- A^T is also orthogonal
- If B is also orthogonal, then AB is also orthogonal
- Cols of A form an orthonormal basis for R^n; rows of A form an orthonormal
  basis for R^n

More on orthogonal matrix props
- Orthogonal matrices represent linear transformations that preserve lengths
  and angles of vectors.
- Pre-multiplication by an orthogonal matrix converts an orthonormal set into
  an orthonormal set (no change).
- The transition matrix between orthonormal bases is an orthogonal matrix.
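The Gram-Schmidt process named above can be sketched in a few lines: each new vector has its projections onto the already-built vectors subtracted off. Made-up example basis in R^3:

```python
import numpy as np

def gram_schmidt(vectors):
    """Turn a basis into an orthogonal basis: subtract from each vector its
    projection onto every previously built vector."""
    ortho = []
    for v in vectors:
        w = v.astype(float)
        for u in ortho:
            w = w - (w @ u) / (u @ u) * u   # remove component along u
        ortho.append(w)
    return ortho

basis = [np.array([1.0, 1.0, 0.0]),
         np.array([1.0, 0.0, 1.0])]
u1, u2 = gram_schmidt(basis)
```

Normalizing each u_i (dividing by its norm) would upgrade the orthogonal basis to an orthonormal one.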
Topic: Projection

Key Formulas
Let w be a vector in R^n and {v_1, ..., v_k} an orthogonal basis of V:
  p = (w . v_1)/(v_1 . v_1) v_1 + ... + (w . v_k)/(v_k . v_k) v_k
If the basis is orthonormal, this simplifies to
  p = (w . v_1) v_1 + ... + (w . v_k) v_k.

Key Ideas
- Projection is defining a vector as a linear combination of the basis of a
  span.
- Useful for decomps and Least Squares problems.
- Projection can be used to find the vector orthogonal to the basis: the
  normal w - p from the projection to the vector is orthogonal to the basis,
  so it's in the orthogonal complement also lmao.

Find distance from a point to a subspace:
simply find p, then calculate ||w - p||.

The (orthogonal) projection is the best approximation of w within the
subspace.
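The projection formula and the ||w - p|| distance trick can be sketched together. A deliberately simple made-up example where the subspace is the xy-plane:

```python
import numpy as np

# V = span{v1, v2}, with v1, v2 orthogonal
v1 = np.array([1.0, 0.0, 0.0])
v2 = np.array([0.0, 1.0, 0.0])
w = np.array([3.0, 4.0, 12.0])

# p = (w.v1)/(v1.v1) v1 + (w.v2)/(v2.v2) v2
p = (w @ v1) / (v1 @ v1) * v1 + (w @ v2) / (v2 @ v2) * v2
dist = np.linalg.norm(w - p)     # distance from w to the subspace V
```

Here p drops the z-component, and the leftover normal w - p is orthogonal to both basis vectors, as claimed above.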
Topic: Least Squares Solutions

Key formulas
- Normal equations: A^T A x = A^T b
- RREF (A^T A | A^T b) to find u = the least squares solution (L.S.S.)
- u is a least squares solution to Ax = b iff A^T A u = A^T b
- Au = projection of b onto V = Col(A)
- If A has linearly independent columns: x = (A^T A)^(-1) A^T b, and
  A (A^T A)^(-1) A^T is the projection matrix: A (A^T A)^(-1) A^T w =
  projection of w.

Key Ideas
- The projection is the best approx: it minimizes ||Ax - b||.
- To find u is to minimize the squares in the distance formula, so it's
  called least squares: a sum of squares to minimize!

How to find this shit
Method 1 (usually if no equation given): Ax = b here is inconsistent!
1. Find an orthonormal basis {v_1, v_2} of Col(A) via Gram-Schmidt.
2. Find the projection p = (b . v_1) v_1 + (b . v_2) v_2.
3. Solve Ax = p.

Method 2 (usually if given a shit ton of data and an equation): the A^T A
approach. Typically problems give an equation which "minimizes" the square.
The constants in the equation are the unknowns and the variables are known,
so roles are reversed! Fill a column of 1's for the constant term, then solve
A^T A x = A^T b for the unknown coefficients (e.g. c, d and e).

True/False type shit
- An LSS may not be unique!
- For any choice of LSS u, the projection Au is always unique.
- Ax = b has a unique LSS iff det(A^T A) != 0, i.e. A has linearly
  independent columns (using det(A A^T) doesn't work).
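Method 2 above can be sketched end to end. A made-up data-fitting example for a hypothetical line y = c + d*x: the column of 1's carries the constant c, and the normal equations produce the coefficients:

```python
import numpy as np

x = np.array([0.0, 1.0, 2.0, 3.0])      # known data (the "variables")
b = np.array([1.0, 3.0, 5.0, 8.0])      # observed y values

# design matrix: 1's for the constant c, x values for the slope d
A = np.column_stack([np.ones_like(x), x])

# normal equations: A^T A u = A^T b
u = np.linalg.solve(A.T @ A, A.T @ b)   # u = (c, d)

residual = b - A @ u                    # b minus its projection Au
```

The residual is orthogonal to Col(A), which is exactly the projection picture: Au is the closest point of Col(A) to b.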
Topic: QR Decomp Deep Dive

Key Formulas
- A = QR (Q has orthonormal columns, R is upper triangular and invertible)
- Rx = Q^T b

Key Ideas
- Can use the orthonormal property to get an upper triangular matrix via dot
  products: Q^T A = Q^T Q R = R.
- Rx = Q^T b has a unique solution, so Ax = b has a unique LSS, which is the
  unique solution to this equation.
- Good for geometric representation of transformations: orthonormal basis ->
  new axes -> relative coordinate vectors.

How to extend a basis {w_1, w_2} to span bigger stuff?
Take any vector not in the span, like e_1 = (1, 0, 0), and subtract its
projections:
  e_1 - (e_1 . w_1) w_1 - (e_1 . w_2) w_2     (w_i orthonormal)
then normalize it. OR: find the orthogonal complement and add it.

How to find the orthogonal complement?
If the spanning vectors of W are the rows of A, then W^⊥ = Null(A).
E.g. W = span{ (1, 7, 2), (-2, 3, 1) }: RREF the matrix with these rows and
solve Ax = 0; the free variable gives the complement vector.

Additional information on QR decomps
- A = QR => Q^T A = Q^T Q R = R
- If A is an m x n matrix with linearly independent columns:
  -> A^T A is invertible
  -> A has a left inverse B = (A^T A)^(-1) A^T, with BA = I
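The Rx = Q^T b route to the least squares solution can be sketched with numpy's built-in QR. Made-up overdetermined system:

```python
import numpy as np

A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [1.0, 2.0]])
b = np.array([1.0, 2.0, 4.0])

Q, R = np.linalg.qr(A)            # Q: orthonormal columns, R: upper triangular
x = np.linalg.solve(R, Q.T @ b)   # Rx = Q^T b gives the unique LSS
```

Since R is triangular, this solve is cheap; numerically it is also better conditioned than forming A^T A explicitly.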
Topic: Diagonalization and Eigenvalues

Key Formulas
- A^k = P D^k P^(-1); D = P^(-1) A P
- Av = λv: eigenvector v is associated with A and eigenvalue λ
- det(λI - A) is the char poly; char eq: det(λI - A) = 0
- P = [u_1 u_2 ... u_n], where each column u_i is an eigenvector (A u_i = λ_i u_i)
- x_n = A x_{n-1} = A^2 x_{n-2} = ... = A^n x_0 (recursive formula, stats)

Key Definitions
- Eigenvalue: root of the char equation.
- Eigenvector: vector associated with an eigenvalue via the equation Av = λv.
- Eigenspace: the null space of (λI - A) after inputting the eigenvalue;
  denoted E_λ. Eigenvectors are all the non-zero vectors in the eigenspace.

NOTE: if A is invertible then 0 is not an eigenvalue.

- The motivation is to find a P that can diagonalize A.
- Does not require an ortho basis; just go straight into the eigenvalues.
- Only square matrices are considered for diagonalization.

Algebraic and Geometric multiplicities
If det(λI - A) = (λ - 1)^2 (λ - 3), then for λ = 1 the Algebraic Multiplicity
(AM) is 2 and for λ = 3 the AM is 1. AM = the power of the root = the number
of diagonal entries of D equal to λ. The Geometric Multiplicity (GM) is the
dim of the eigenspace: dim(E_λ) = nullity(λI - A), aka the number of
independent eigenvectors for λ. Always 1 <= G(λ) <= A(λ).

Key Ideas:
1. Diagonal entries of D are the eigenvalues of A.
2. Cols of P are eigenvectors of A.
3. P collects a basis of eigenvectors (from the eigenspaces).
4. To show NOT diagonalizable, show P is singular: det(P) = 0.
5. If A is (upper) triangular, then the eigenvalues == the diagonal entries.
6. Compare algebraic and geometric multiplicity to see if you can
   diagonalize: if G(λ) < A(λ) for some λ, then not diagonalizable.
   Criterion: the char poly splits into linear factors AND G(λ) = A(λ) for
   every eigenvalue.
7. Define x_n in terms of A and x_0 to solve recursive problems.
8. When dealing with n -> infinity in recursive problems, drop the terms
   whose eigenvalue has |λ| < 1 (e.g. λ = 0.7 gives 0.7^n -> 0).
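The A^k = P D^k P^(-1) trick can be sketched with numpy's eigendecomposition. Made-up 2x2 example with distinct (hence diagonalizable) eigenvalues:

```python
import numpy as np

A = np.array([[4.0, 1.0],
              [2.0, 3.0]])       # eigenvalues 5 and 2 (trace 7, det 10)

eigvals, P = np.linalg.eig(A)    # columns of P are eigenvectors
D = np.diag(eigvals)             # D = P^-1 A P

# compute A^5 via P D^5 P^-1 instead of five matrix multiplications
A5 = P @ np.diag(eigvals**5) @ np.linalg.inv(P)
```

This is exactly why diagonalization helps with recursive formulas: powering D is just powering scalars on the diagonal.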
Topic: Orthogonal Diag + SVD

Key formulas
- Orthogonal diagonalization (symmetric matrix A): A = P D P^T with P
  orthogonal, P^(-1) = P^T
- SVD: A = U Σ V^T

Key ideas
- If A is orthogonally diagonalizable THEN A^T = A, so A must be symmetric:
  A^T = (P D P^T)^T = P D^T P^T = P D P^T = A. (Conversely, every symmetric A
  is orthogonally diagonalizable.)
- If A is symmetric, its eigenspaces are orthogonal to each other:
  eigenvectors associated to distinct eigenvalues are orthogonal.
- A symmetric matrix's eigenvalues are all real numbers, and for each
  eigenvalue of A, a(λ) = g(λ) = dim(E_λ).

Algorithm: Orthogonal Diagonalization (A symmetric)
1. Find the eigenvalues of A.
2. For each λ, find the corresponding eigenspace.
3. Orthogonalize (Gram-Schmidt within any eigenspace of dim > 1) and
   normalize each eigenvector; combine to form the new orthogonal matrix P.

What is Singular Value Decomposition?
Non-square matrices are non-diagonalizable, so we use SVD to factorize
(and so "diagonalize") rectangular matrices. The singular values are the
square roots of the eigenvalues of A^T A; they are non-negative because
A^T A is square and symmetric with eigenvalues λ >= 0.

A = U Σ V^T where, for A of size m x n and rank r:
- U = order m orthogonal matrix
- V = order n orthogonal matrix
- Σ = [ D, 0_{r x (n-r)} ; 0_{(m-r) x r}, 0_{(m-r) x (n-r)} ]
- D = order r diagonal matrix of the nonzero singular values, r <= min(m, n)

SVD Algorithm (sketch)
1. Compute A^T A, its eigenvalues sorted in decreasing order, and orthonormal
   eigenvectors: these give the singular values σ_i = sqrt(λ_i) and V.
2. For each σ_i != 0, u_i = (1/σ_i) A v_i; extend to an orthonormal basis of
   R^m to fill out U if needed.

Why singular values matter: they are the scaling factors of the
transformation along the principal axes.
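The "singular values = square roots of eigenvalues of A^T A" fact can be checked directly with numpy. Made-up rectangular example:

```python
import numpy as np

A = np.array([[3.0, 0.0],
              [0.0, 2.0],
              [0.0, 0.0]])               # 3 x 2, rank 2

U, s, Vt = np.linalg.svd(A)              # full SVD: U is 3x3, Vt is 2x2

# eigenvalues of A^T A, sorted decreasingly, should be the squares of s
lam = np.linalg.eigvalsh(A.T @ A)[::-1]

# rebuild Sigma (3 x 2) and reconstruct A
Sigma = np.zeros(A.shape)
Sigma[:2, :2] = np.diag(s)
A_rebuilt = U @ Sigma @ Vt
```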
Topic: Linear Transformations

Key formulas
- T: R^n -> R^m is a linear transformation; T(x) = Ax, where A is the
  standard matrix for T.
- T(0) = A0 = 0
- T(u + v) = T(u) + T(v)
- T(cv) = cT(v)
- T(c_1 v_1 + ... + c_k v_k) = c_1 T(v_1) + ... + c_k T(v_k)
- nullity(T) = dim(Ker(T)) = dim(null space of A) = nullity(A)
- Change of basis: B = P^(-1) A P (B is similar to A)

Key Ideas
- A linear transformation is like a map: it maps each vector to another
  vector via matrix multiplication, similar to a change of basis.
- T is onto iff R(T) = R^m: for every vector b in R^m, the equation T(x) = b
  has at least one solution in R^n.
- A linear operator is a transformation R^n -> R^n (m = n).
- The standard matrix is the matrix A with T(x) = Ax; its columns are the
  images of the standard basis vectors: A = [T(e_1) ... T(e_n)]. Use this to
  find the standard matrix when no formula is given, noting how the question
  gave inputs for the parameters.
- Identity operator: I(x) = x, the map x -> x. Zero transformation: T(x) = 0
  for all x; the zero matrix is its standard matrix.
- Note: you cannot have a constant term in a linear transformation.
- The two conditions T(u + v) = T(u) + T(v) and T(cv) = cT(v) must both hold
  for T to be a linear transformation. Use the theory to prove a map is NOT
  linear (one counterexample suffices); checking a few cases cannot prove it
  IS linear.
- If you linearly transform a basis (by an invertible T), it is still a
  basis (recall spans).
- Changing basis: T(v) = redefine the vector as a linear combination, so
  think about defining a basis in terms of another basis ([v]_T kind of
  vibes).
- This is why all linear transformations are completely determined by their
  values on the basis they manipulate.

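The "columns are images of the standard basis vectors" recipe can be sketched directly. T here is a hypothetical made-up map, T(x, y) = (x + 2y, 3x):

```python
import numpy as np

def T(v):
    """A hypothetical linear map R^2 -> R^2: T(x, y) = (x + 2y, 3x)."""
    x, y = v
    return np.array([x + 2 * y, 3 * x])

e1, e2 = np.eye(2)                      # standard basis vectors of R^2
A = np.column_stack([T(e1), T(e2)])     # standard matrix A = [T(e1) T(e2)]

v = np.array([2.0, -1.0])               # any test vector
```

Once A is known, T(v) and A @ v agree for every v, which is the whole point of the standard matrix.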
Topic: Function composition, range and kernel shit

Key formulas
- (T ∘ S)(u) = T(S(u)) = T(Au) = B(Au) = BA(u), where A and B are the
  standard matrices of S and T
- T ∘ S != S ∘ T in general
- R(T) = Span{T(v_1), ..., T(v_k)} for a basis {v_1, ..., v_k}
- rank(T) + nullity(T) = n

Key Ideas
- Linear transformations can be seen as functions. As such, they have a
  range and a codomain.
- The range is like the new span after the transform, as a span is the
  building blocks of a vector space. R(T) is the set of all images of T,
  i.e. the column space of the standard matrix (basically the collection
  {T(v_1), ..., T(v_k)}).
- The kernel is the null space of the standard matrix: whatever values of x
  map to 0. How to find Ker(T): solve T(v) = Av = 0, i.e. find the null
  space.
- Onto and one-to-one type beat: onto means R(T) is the whole codomain;
  one-to-one means Ker(T) = {0}.

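The composition formula (T ∘ S)(u) = BA(u) and the kernel idea can be sketched together. Made-up maps: S projects onto the x-axis, T scales by 2:

```python
import numpy as np

A = np.array([[1.0, 0.0],
              [0.0, 0.0]])       # standard matrix of S (project onto x-axis)
B = np.array([[2.0, 0.0],
              [0.0, 2.0]])       # standard matrix of T (scale by 2)

C = B @ A                        # standard matrix of T∘S is BA

u = np.array([3.0, 4.0])
composed = B @ (A @ u)           # T(S(u)), should equal C @ u
```

Here (0, 1) is in Ker(T∘S): the projection S kills it before T ever acts, so the composed map sends it to 0.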