
Checklist for Math 235 – Section to Section

This list is only a summary, so it does not contain every key point. In general, students must be able to
solve all problems (exercises and review exercises) in the textbook and apply their knowledge to other
related problems. See the course schedule for the sections covered on each exam.

Chapter 1
Section 1.1

• Definition of a system of linear equations.


• Understand the solutions of linear systems and their geometric interpretation.
• Knowledge of the types of linear systems and the different types of solutions.
• Elementary row operations and how they can be used to solve linear systems.

Section 1.2

• Understand the key terms echelon form and reduced echelon form.


• Ability to reduce matrices to both echelon form and reduced echelon form.
• Understand the uniqueness of reduced echelon form.
• Identify pivot positions.
• Identify pivot columns and free variable columns, hence pivot (basic) variables and free
variables.
• Determine conditions necessary for a linear system to have different types of solutions.
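
The row-reduction items above can be checked with a minimal Python sketch (SymPy assumed available; the matrix entries are made up for illustration):

    from sympy import Matrix

    # Augmented matrix of a small linear system, chosen only for illustration.
    A = Matrix([[1, 2, -1, 3],
                [2, 4,  1, 0],
                [0, 0,  3, -6]])

    R, pivot_cols = A.rref()   # unique reduced echelon form and the pivot column indices
    print(R)
    print(pivot_cols)          # (0, 2): columns 0 and 2 are pivot columns; column 1 is a free variable column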

Section 1.3

• Definition of vectors and their properties.


• Understand vector operations.
• Representation of linear systems as vector equations and its geometric interpretation.
• Understand the concept of linear combinations. Ex. u=2v+3w, u is a linear combination of v and
w.
• Understand the concept of spanning and the definition of a spanning set. Ex. if u=2v+3w, then u is in
Span{v,w}. The span of {v,w}, written Span{v,w}, is the set of all linear combinations of v and w,
that is, all c1 v + c2 w where c1, c2 are any real numbers.
• Understand the geometric description of spanning sets.
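
As a concrete check of the linear combination and span items above, here is a minimal sketch (SymPy assumed available; the vectors are made up for illustration):

    from sympy import Matrix, symbols, linsolve

    v = Matrix([1, 0, 2])
    w = Matrix([0, 1, -1])
    u = 2*v + 3*w                    # u = 2v + 3w, so u should lie in Span{v, w}

    # u is in Span{v, w} exactly when c1*v + c2*w = u has a solution.
    c1, c2 = symbols('c1 c2')
    aug = Matrix.hstack(v, w, u)     # augmented matrix [v  w | u]
    print(linsolve(aug, [c1, c2]))   # {(2, 3)}: the weights recover u = 2v + 3w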

Section 1.4

• Understand the Matrix-Vector product Ax and its relation to linear systems.


• Ability to represent linear systems as Ax=b.
• Understand that Ax is a linear combination of the columns of A, that is Ax=x1 a1+x2 a2+…+xn an.
• Understand the necessary conditions for the existence of solutions for a linear system.
• Understand the statement “The columns of A span R^m” and the equivalent conditions. Ex. every row
of A must have a pivot position, or Ax=b must have a solution for every b in R^m.
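
A minimal NumPy sketch of the points above (the matrix and vector are made up for illustration):

    import numpy as np

    A = np.array([[1., 2., 0.],
                  [0., 1., 3.]])
    x = np.array([2., -1., 4.])

    # Ax is the linear combination x1*a1 + x2*a2 + x3*a3 of the columns of A.
    print(np.allclose(A @ x, x[0]*A[:, 0] + x[1]*A[:, 1] + x[2]*A[:, 2]))   # True

    # "The columns of A span R^m" <=> A has a pivot position in every row
    # <=> rank A equals the number of rows m.
    print(np.linalg.matrix_rank(A) == A.shape[0])                           # True here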

Section 1.5

• Knowledge of homogeneous systems Ax=0.


• Know how to represent every solution of a linear system Ax=b as x = xp + xh, and understand what
each term represents.
• Knowledge of how to write solutions in the parametric vector form.
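
A minimal sketch of x = xp + xh and the parametric vector form (SymPy assumed available; the system is made up for illustration):

    from sympy import Matrix

    A = Matrix([[1, 2, -1],
                [0, 0,  2]])
    b = Matrix([3, 4])

    # Every solution of Ax = b is x = xp + xh: one particular solution plus a
    # solution of the homogeneous system Ax = 0.
    sol, params = A.gauss_jordan_solve(b)   # general solution in parametric vector form
    print(sol)                              # particular part plus free-parameter terms
    print(A.nullspace())                    # basis vector(s) generating the xh part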

Section 1.7

• Knowledge of the key terms linear dependence and linear independence.


• Understand the geometric interpretation of linear dependence and independence.
• Ability to determine whether a set is linearly independent.
• Ability to determine whether the columns of a matrix are linearly independent. Ex. if there is a
free variable column, there is always a nontrivial solution for Ax=0, hence the columns will be
linearly dependent (not independent).
• Relation between linear independence and the zero vector.
• Interpretation of linear independence and spanning sets. Ex. If u=2v+3w, then u is in Span{v,w},
hence the set {u,v,w} is not linearly independent since u depends on v and w.
• Relation between linear (in)dependence and the number of entries of each vector.
• Relation between linear (in)dependence and the number of rows and columns of a matrix.
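
The independence tests above can be mirrored in a short NumPy sketch (the vectors are made up for illustration):

    import numpy as np

    v = np.array([1., 0., 2.])
    w = np.array([0., 1., -1.])
    u = 2*v + 3*w                     # u depends on v and w

    # Columns are linearly independent exactly when Ax = 0 has only the trivial
    # solution, i.e. there is no free variable (rank = number of columns).
    A = np.column_stack([u, v, w])
    print(np.linalg.matrix_rank(A) == A.shape[1])   # False: {u, v, w} is dependent
    B = np.column_stack([v, w])
    print(np.linalg.matrix_rank(B) == B.shape[1])   # True: {v, w} is independent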

Section 1.8 and 1.9

• Definition of linear transformations.


• Understand the key terms domain, codomain, image, pre-image, and range.
• Understand why matrix multiplication Ax defines a linear transformation called Matrix
Transformation.
• Ability to write every linear transformation as a matrix transformation T(x)=Ax, by finding the
standard matrix A.
• Understand that any map that can be written in the form T(x)=Ax is a linear transformation.
• Properties of linear transformations. Ex. T(0)=? T(-u)=?
• Linear transformations and linear combinations. Ex. If u=2v+3w, then
T(u)=T(2v+3w)=2T(v)+3T(w).
• How to find the standard matrix for any transformation.
• Definitions of onto and one-to-one.
• Knowledge of all the conditions equivalent to a map being onto. Ex. one of them is: if T(x)=Ax,
then every row of A must have a pivot. Linked to spanning sets.
• Knowledge of all the conditions equivalent to a map being one-to-one. Ex. one of them is: if
T(x)=Ax, then every column of A must have a pivot. Linked to linear independence.
• Given a standard matrix A, ability to determine the domain, codomain, range, onto-ness, one-to-
one-ness, etc.
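
A minimal NumPy sketch of finding a standard matrix and testing onto/one-to-one (the map T below is a made-up example, not one from the textbook):

    import numpy as np

    def T(x):
        # A hypothetical linear map T: R^3 -> R^2, T(x1, x2, x3) = (x1 + 2*x3, x2 - x3).
        return np.array([x[0] + 2*x[2], x[1] - x[2]])

    # The standard matrix A has columns T(e1), T(e2), T(e3).
    e = np.eye(3)
    A = np.column_stack([T(e[:, j]) for j in range(3)])

    rank = np.linalg.matrix_rank(A)
    print(rank == A.shape[0])   # True:  pivot in every row, so T is onto R^2
    print(rank == A.shape[1])   # False: a free variable exists, so T is not one-to-one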

Chapter 2
Section 2.1

• Basic matrix operations (sum, scalar multiplication, matrix multiplication) and their properties.
• Solving matrix equations or computing matrix expressions. Example: Given B=2A-3C^T, find A.
• Understand matrix multiplication as composition of linear maps.
• Understand the relation between the columns of A and the columns of AB.
• Powers of matrices.
• Matrix transpose and its properties. Example: (A-3B^2)^T=?
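
A minimal NumPy sketch of the matrix operations above (the matrices are made up for illustration):

    import numpy as np

    A = np.array([[1., 2.], [3., 4.]])
    B = np.array([[0., 1.], [-1., 2.]])

    print(A + 2*B)                          # sum and scalar multiplication
    print(A @ B)                            # matrix multiplication (in general A @ B != B @ A)
    print(np.linalg.matrix_power(A, 3))     # the power A^3

    # Transpose reverses products: (A - 3B^2)^T = A^T - 3(B^T)^2.
    lhs = (A - 3*np.linalg.matrix_power(B, 2)).T
    rhs = A.T - 3*np.linalg.matrix_power(B.T, 2)
    print(np.allclose(lhs, rhs))            # True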

Section 2.2 and 2.3

• Computation of matrix inverse (2x2 using formula, 3x3 or higher using row operations).
• Knowledge of terms such as invertible, singular, non-singular, square, and elementary
matrix.
• Properties of invertible matrices. Example: (-2AB^T)^(-1)=?
• Understand the conditions for invertibility.
• Knowledge of the invertible matrix theorem and how the statements imply each other.
• Invertible linear transformations and their standard matrices.
• Solving linear systems using matrix inverse.
• Solving matrix equations. Example: Given AB=C^2, find B (for a given singular A or non-singular
A).
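
A minimal NumPy sketch of inverses and their properties (the invertible matrices are made up for illustration):

    import numpy as np

    A = np.array([[2., 1.], [1., 1.]])
    B = np.array([[1., 2.], [0., 1.]])
    b = np.array([3., 1.])

    A_inv = np.linalg.inv(A)
    print(np.allclose(A_inv @ b, np.linalg.solve(A, b)))   # x = A^(-1) b solves Ax = b

    # Property of inverses: (-2 A B^T)^(-1) = (-1/2) (B^T)^(-1) A^(-1).
    lhs = np.linalg.inv(-2 * A @ B.T)
    rhs = -0.5 * np.linalg.inv(B.T) @ A_inv
    print(np.allclose(lhs, rhs))                           # True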

Chapter 3
Section 3.1 and 3.2

• Computation of matrix determinants (2x2 using the formula, nxn using cofactor expansion along any
row or column; optional: 3x3 using the trick highlighted in Section 3.1, Exercise 15).
• Triangular (upper, lower, diagonal) matrices and determinants.
• Properties of determinants. Example: det(-2AB^T)=?
• Understand how elementary row operations affect determinants.
• Computing determinants using elementary row operations.
• The invertible matrix theorem and determinants. Example: Relation between determinants and
linear dependence, relation between determinants and the span of the columns of A, the
implications of a non-zero determinant, etc.
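
A minimal NumPy sketch of the determinant properties above (the matrices are made up for illustration):

    import numpy as np

    A = np.array([[2., 1.], [1., 1.]])
    B = np.array([[1., 3.], [0., 2.]])
    n = A.shape[0]

    # det(-2 A B^T) = (-2)^n * det(A) * det(B) for n x n matrices.
    lhs = np.linalg.det(-2 * A @ B.T)
    rhs = (-2)**n * np.linalg.det(A) * np.linalg.det(B)
    print(np.isclose(lhs, rhs))                  # True

    # Invertible matrix theorem: A is invertible <=> det(A) != 0
    # <=> the columns of A are linearly independent and span R^n.
    print(not np.isclose(np.linalg.det(A), 0))   # True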

Section 3.3

• Determinants and linear systems: Cramer’s rule. Understand the necessary conditions for Cramer’s
rule and its advantages over Gaussian elimination.
• Determinants and matrix inverses: the matrix inverse formula using the adjugate (adjoint).
Knowledge of terms such as cofactor and adjugate. Advantages of this method over the Gauss-Jordan
method of finding matrix inverses.
• Determinants as Area or Volume: In R2, the area of a parallelogram given the vectors connecting
the vertices, the area of a parallelogram given the vertices as points, the area of a triangle as
half the area of a parallelogram, the area of a quadrilateral as the sum of the area of two
triangles.
• Determinants and linear transformations: how do linear transformations affect the
area/volume?
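
A minimal NumPy sketch of Cramer’s rule and the determinant-as-area idea (the system and vectors are made up for illustration):

    import numpy as np

    A = np.array([[2., 1.], [1., 3.]])
    b = np.array([3., 5.])

    # Cramer's rule (requires det(A) != 0): x_i = det(A_i(b)) / det(A),
    # where A_i(b) is A with column i replaced by b.
    detA = np.linalg.det(A)
    x = np.empty(2)
    for i in range(2):
        Ai = A.copy()
        Ai[:, i] = b
        x[i] = np.linalg.det(Ai) / detA
    print(np.allclose(A @ x, b))     # True: Cramer's rule solved Ax = b

    # The area of the parallelogram determined by the columns of A is |det(A)|.
    print(abs(detA))                 # 5.0 here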

Chapter 4
Section 4.1
• Understand the axioms that make a set a vector space.
• Knowledge of some common vector spaces and their description. Example: Rn, Pn, M2x2.
• Definition of a subspace.
• Subspaces and the concept of spanning sets.
• Determine if a subset of a vector space is a subspace (using the definition of a subspace,
writing the set as a spanning set, or relating the set to the null space of some matrix, after 4.2).
• Finding a spanning set of a subspace.

Section 4.2

• Definition of the null space, column space and row space.


• Finding Nul A, Col A and Row A for an mxn matrix.
• Relation (or contrast) between these subspaces.
• Relation between these subspaces and linear transformations: Kernel and Range.
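
A minimal SymPy sketch of Nul A, Col A and Row A (the matrix is made up for illustration):

    from sympy import Matrix

    A = Matrix([[1, 2, 0],
                [2, 4, 1]])       # a 2 x 3 matrix

    print(A.nullspace())      # basis for Nul A (solutions of Ax = 0), a subspace of R^3
    print(A.columnspace())    # basis for Col A (the pivot columns of A), a subspace of R^2
    print(A.rowspace())       # basis for Row A, spanned by the nonzero rows of an echelon form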

Section 4.3

• Understand the concept of spanning sets and linear independence.


• Definition of basis.
• Knowledge of the standard bases of common vector spaces. Example: Rn, Pn, M2x2.
• How to derive a basis for a vector space or subspace: The spanning set theorem.
• Bases for Row A, Col A and Nul A.
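
A minimal SymPy sketch of the spanning set theorem in action (the vectors are made up for illustration):

    from sympy import Matrix

    v1 = Matrix([1, 0, 2])
    v2 = Matrix([0, 1, -1])
    v3 = 2*v1 + 3*v2                 # redundant: v3 already lies in Span{v1, v2}

    A = Matrix.hstack(v1, v2, v3)
    _, pivot_cols = A.rref()
    # Keep only the pivot columns: they form a basis for Col A = Span{v1, v2, v3}.
    basis = [A.col(j) for j in pivot_cols]
    print(pivot_cols)                # (0, 1): v1 and v2 stay; the dependent v3 is dropped
    print(basis)                     # the retained columns v1 and v2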

Section 4.4

• The unique representation theorem.


• Definition of coordinates.
• Finding the coordinates of a vector relative to different bases.
• Coordinate mapping and the concept of isomorphism.
• Using coordinates to check linear independence of vectors in Pn.
• Finding the coordinates of vectors in Pn or Rn.
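
A minimal SymPy sketch of finding coordinates relative to a basis (the basis and vector are made up for illustration):

    from sympy import Matrix

    b1 = Matrix([1, 1])
    b2 = Matrix([1, -1])
    x = Matrix([3, 1])

    # By the unique representation theorem, [x]_B is the unique solution of
    # P_B * [x]_B = x, where P_B = [b1 b2] has the basis vectors as columns.
    P_B = Matrix.hstack(b1, b2)
    print(P_B.solve(x))              # Matrix([[2], [1]]): x = 2*b1 + 1*b2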

Section 4.5

• Definition of the dimension of a vector space.


• Relation between the dimension and linear independence.
• Relation between the dimension and the spanning set.
• Relation between the dimension of a vector space and the dimensions of its subspaces.
• Finding the dimension of subspaces.
• Dimension and basis: the Basis theorem.
• Description of 0-dimensional, 1-dimensional, etc. subspaces.
• Dimensions of Row A, Nul A and Col A and their relation.
• The invertible matrix theorem and the dimensions of Row A, Nul A and Col A.
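
A minimal SymPy sketch of the dimension relations above (the matrix is made up for illustration):

    from sympy import Matrix

    A = Matrix([[1, 2, 0, 1],
                [2, 4, 1, 0],
                [3, 6, 1, 1]])           # a 3 x 4 matrix (third row = first + second)

    rank = A.rank()                      # dim Col A = dim Row A = rank A
    nullity = len(A.nullspace())         # dim Nul A = number of free variables
    print(rank, nullity)                 # 2 2
    print(rank + nullity == A.cols)      # True: rank A + dim Nul A = n (number of columns)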

Section 4.6

• Finding the change-of-coordinates matrix and its properties.


• Changing the coordinates of a vector from one basis to another.
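
A minimal SymPy sketch of a change-of-coordinates matrix (the bases are made up for illustration):

    from sympy import Matrix

    # Two bases B and C of R^2, stored as the columns of P_B and P_C.
    P_B = Matrix([[1, 1], [1, -1]])
    P_C = Matrix([[1, 0], [2, 1]])

    # Change-of-coordinates matrix from B to C: P_{C<-B} = P_C^(-1) * P_B,
    # so that [x]_C = P_{C<-B} * [x]_B.
    P_CB = P_C.inv() * P_B
    x_B = Matrix([2, 1])                  # [x]_B for some vector x
    x = P_B * x_B                         # the vector x in standard coordinates
    print(P_CB * x_B == P_C.inv() * x)    # True: both computations give [x]_C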

Chapter 5
Section 5.1 and 5.2

• Understand the definition of eigenvalues and eigenvectors (algebraically and geometrically).


• Finding the eigenvalues and eigenvectors.
• Understanding the difference between the set of eigenvectors for an eigenvalue and the corresponding eigenspace.
• Knowledge of characteristic polynomial and characteristic equation.
• Eigenvectors and linear independence.
• Eigenvalues of triangular matrices.
• How to test if a scalar is an eigenvalue.
• How to test if a vector is an eigenvector.
• Relation between eigenvalues/eigenvectors of A and of the powers of A.
• Eigenvalues and the invertible matrix theorem.
• Knowledge of similar matrices and their properties.
• Discrete dynamical systems: xk+1 = A xk.
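
A minimal SymPy sketch of the eigenvalue items above (the matrix is made up for illustration):

    from sympy import Matrix, eye, symbols

    A = Matrix([[4, 1],
                [2, 3]])
    lam = symbols('lam')

    print((A - lam*eye(2)).det())    # characteristic polynomial lam^2 - 7*lam + 10
    print(A.eigenvals())             # eigenvalues 2 and 5, each with algebraic multiplicity 1
    print(A.eigenvects())            # each eigenvalue with a basis of its eigenspace

    # Testing whether a vector is an eigenvector: Av must be a scalar multiple of v.
    v = Matrix([1, 1])
    print(A*v)                       # Matrix([[5], [5]]) = 5*v, so v is an eigenvector for 5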

Section 5.3

• Definition of Diagonalization and the Diagonalization theorem.


• Knowledge of how to diagonalize a matrix.
• Understanding the necessary conditions for a matrix to be diagonalizable.
• Relation between diagonalization and distinct/non-distinct eigenvalues.
• Relation between eigenvalues and determinants.
• Computing powers of A, A^k, when A is diagonalizable.
• Solving discrete dynamical systems when A is diagonalizable (simplifying A^k x0).
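
A minimal SymPy sketch of diagonalization and matrix powers (the same kind of made-up example matrix):

    from sympy import Matrix

    A = Matrix([[4, 1],
                [2, 3]])             # distinct eigenvalues 2 and 5, so A is diagonalizable

    P, D = A.diagonalize()           # A = P D P^(-1), with the eigenvalues on the diagonal of D
    print(P*D*P.inv() == A)          # True

    # Powers become easy: A^k = P D^k P^(-1), and D^k just raises each diagonal entry.
    k = 5
    print(P * D**k * P.inv() == A**k)    # True
    # For a discrete dynamical system xk+1 = A xk, this gives xk = P D^k P^(-1) x0.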

Chapter 6
Section 6.1

• Understand the inner (dot) product and its properties.


• Finding the length of a vector.
• Knowledge of unit vectors and how to normalize vectors.
• Finding the distance between two vectors as a length.
• Definition and condition for orthogonality.
• Understanding orthogonal complements and how to find them.
• Orthogonal complements of Row A, Col A, etc.
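
A minimal NumPy sketch of the inner product items above (the vectors are made up for illustration):

    import numpy as np

    u = np.array([1., 2., 2.])
    v = np.array([2., 0., -1.])

    print(np.dot(u, v))              # inner (dot) product: 0.0, so u and v are orthogonal
    print(np.linalg.norm(u))         # length ||u|| = sqrt(u . u) = 3.0
    print(u / np.linalg.norm(u))     # unit vector obtained by normalizing u
    print(np.linalg.norm(u - v))     # distance between u and v
    # Orthogonal complements: (Row A)^perp = Nul A and (Col A)^perp = Nul A^T.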

Section 6.2

• Definition of orthogonal sets. (Note: an orthogonal set may contain the zero vector.)
• Relation between orthogonal sets and linear independence.
• Definition of orthogonal basis for Rn.
• Writing vectors as a linear combination of orthogonal vectors.
• Understand orthogonal projections and how to project vectors onto lines.
• Definition of orthonormal sets and how to create them from orthogonal sets.
• Properties of a matrix U whose columns are orthonormal.
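
A minimal NumPy sketch of projecting onto a line and of orthonormal columns (the vectors are made up for illustration):

    import numpy as np

    y = np.array([7., 6.])
    u = np.array([4., 2.])

    # Orthogonal projection of y onto the line spanned by u: yhat = (y.u / u.u) * u.
    y_hat = (np.dot(y, u) / np.dot(u, u)) * u
    print(y_hat)                                  # [8. 4.]
    print(np.isclose(np.dot(y - y_hat, u), 0))    # True: y - yhat is orthogonal to u

    # A matrix U with orthonormal columns satisfies U^T U = I.
    w = np.array([-2., 4.])                       # orthogonal to u
    U = np.column_stack([u / np.linalg.norm(u), w / np.linalg.norm(w)])
    print(np.allclose(U.T @ U, np.eye(2)))        # True
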
Section 6.3

• Writing a vector as a sum of two orthogonal vectors (using orthogonal basis or orthogonal
projections).
• How to project onto subspaces given an orthogonal basis (as a sum of individual orthogonal
projections).
• Overall understanding of the properties of orthogonal projections.
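
A minimal NumPy sketch of projecting onto a subspace using an orthogonal basis (the vectors and the small helper proj are made up for illustration):

    import numpy as np

    u1 = np.array([1., 1., 0.])
    u2 = np.array([1., -1., 2.])       # u1 . u2 = 0, so {u1, u2} is an orthogonal basis of W
    y = np.array([3., 1., 4.])

    def proj(y, u):
        # Orthogonal projection of y onto the line spanned by u.
        return (np.dot(y, u) / np.dot(u, u)) * u

    # The projection onto W is the sum of the projections onto the orthogonal basis vectors.
    y_hat = proj(y, u1) + proj(y, u2)
    z = y - y_hat                      # the component of y orthogonal to W
    print(y_hat, z)                    # y = y_hat + z with y_hat in W and z perpendicular to W
    print(np.isclose(np.dot(z, u1), 0) and np.isclose(np.dot(z, u2), 0))   # True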
