Algebra Reviewer Finals
ALG 101
MATRICES AND SYSTEMS OF LINEAR EQUATIONS
TRANSFORMING SYSTEMS OF EQUATIONS INTO AN AUGMENTED MATRIX

Systems of Linear Equations

1.)  x − 2y + 3z =  9
    −x + 3y      = −4
    2x − 5y + 5z = 17

Augmented matrix:
[  1  −2  3 |  9 ]
[ −1   3  0 | −4 ]
[  2  −5  5 | 17 ]

Row-echelon form (after elementary row operations):
[ 1  −2  3 | 9 ]
[ 0   1  3 | 5 ]
[ 0   0  1 | 2 ]

2.)  x1 + 2x2 −  x3        =   2
    2x1 + 4x2 +  x3 − 3x4  =  −2
     x1 − 4x2 − 7x3 −  x4  = −19
          x2 +   x3 − 2x4  =  −3

Augmented matrix:
[ 1   2  −1   0 |   2 ]
[ 2   4   1  −3 |  −2 ]
[ 1  −4  −7  −1 | −19 ]
[ 0   1   1  −2 |  −3 ]
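As a quick self-check, here is a small Python sketch (my illustration, not part of the original reviewer; it assumes numpy is available) that builds the augmented matrix of Example 1 and solves the system directly:

import numpy as np

A = np.array([[1, -2, 3],
              [-1, 3, 0],
              [2, -5, 5]], dtype=float)   # coefficient matrix (x, y, z)
b = np.array([9, -4, 17], dtype=float)    # right-hand-side constants

augmented = np.hstack([A, b.reshape(-1, 1)])  # append b as the last column
print(augmented)
print(np.linalg.solve(A, b))   # should give approximately [1., -1., 2.]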
6.)  x1      − 3x3 = −1
          x2 −  x3 =  0
                 0 =  0

Augmented matrix:
[ 1  0  −3 | −1 ]
[ 0  1  −1 |  0 ]
[ 0  0   0 |  0 ]

The system is consistent, with infinitely many solutions.
Row-Echelon Form
1) All rows consisting entirely of zeros occur at the bottom of the matrix.
2) For each row that does not consist entirely of zeros, the first nonzero entry is 1 (called a leading 1).
3) For each nonzero row, the leading 1 appears to the right of and below any leading 1s in preceding rows.

Examples (row-echelon form):
[ 1  a  b  d ]        [ 1  a  b  d ]
[ 0  1  c  e ]        [ 0  1  c  e ]
[ 0  0  0  0 ]        [ 0  0  1  f ]

Note: a, b, c, d, e, and f are real numbers.
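The three conditions above can be checked mechanically. The following Python sketch (mine, not from the reviewer) tests a list-of-lists matrix against each condition:

def is_row_echelon(M, tol=1e-12):
    lead_cols = []
    seen_zero_row = False
    for row in M:
        nonzero = [j for j, v in enumerate(row) if abs(v) > tol]
        if not nonzero:                       # zero rows must sit at the bottom (condition 1)
            seen_zero_row = True
            continue
        if seen_zero_row:                     # nonzero row found below a zero row
            return False
        lead = nonzero[0]
        if abs(row[lead] - 1) > tol:          # first nonzero entry must be 1 (condition 2)
            return False
        if lead_cols and lead <= lead_cols[-1]:   # leading 1 must move to the right (condition 3)
            return False
        lead_cols.append(lead)
    return True

print(is_row_echelon([[1, -2, 3, 9], [0, 1, 3, 5], [0, 0, 1, 2]]))   # True
print(is_row_echelon([[0, 1], [1, 0]]))                              # False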
Row-Echelon Form
Example 1.4. The following matrices are in row-echelon form:

1. [ 1  1   2    9 ]
   [ 0  1  −4  −10 ]
   [ 0  0   1    3 ]

2. [ 1  4  −3  7 ]
   [ 0  1   6  2 ]
   [ 0  0   1  5 ]

3. [ 1  1  0 ]
   [ 0  1  0 ]
   [ 0  0  0 ]

4. [ 0  1  2   6  0 ]
   [ 0  0  1  −1  0 ]
   [ 0  0  0   0  1 ]
Classify each of the following matrices as being in reduced row-echelon form (RREF), row-echelon form only (REF), or neither:

1. [ 1  2  0   4 ]
   [ 0  0  1  −3 ]
   [ 0  0  0   0 ]     RREF

2. [ 1  0   3  4 ]
   [ 0  1  −2  5 ]
   [ 0  1   2  2 ]     Neither

3. [ 1  2  0  0  1 ]
   [ 0  0  1  2  3 ]
   [ 0  0  0  0  0 ]   RREF

4. [ 1  0   3  −1 ]
   [ 0  1  −4   2 ]
   [ 0  0   0   0 ]    RREF

5. [ 1  1 ]
   [ 0  1 ]            REF
Gaussian Elimination
1. Write the augmented matrix of the system of linear equations.
2. Use elementary row operations to rewrite the augmented matrix in row-echelon form.
3. Write the system of linear equations corresponding to the matrix in row-echelon form, and use back-substitution to find the solution.
x1 + x2 + 2 x3 = 9
2 x1 + 4 x2 − 3 x3 = 1
3 x1 + 6 x2 − 5 x3 = 0
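Below is a rough Python sketch (my own, assuming the pivots never become zero, as in this example) of steps 1-3: forward elimination to row-echelon form followed by back-substitution:

def gaussian_elimination(aug):
    n = len(aug)
    M = [row[:] for row in aug]
    # Step 2: elementary row operations to reach row-echelon form.
    for i in range(n):
        M[i] = [v / M[i][i] for v in M[i]]            # scale row i to get a leading 1
        for k in range(i + 1, n):
            factor = M[k][i]
            M[k] = [a - factor * b for a, b in zip(M[k], M[i])]   # clear entries below the pivot
    # Step 3: back-substitution.
    x = [0.0] * n
    for i in range(n - 1, -1, -1):
        x[i] = M[i][n] - sum(M[i][j] * x[j] for j in range(i + 1, n))
    return x

aug = [[1, 1, 2, 9],      # step 1: augmented matrix of the system above
       [2, 4, -3, 1],
       [3, 6, -5, 0]]
print(gaussian_elimination(aug))   # approximately [1.0, 2.0, 3.0]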
Row-Echelon Form
Note: A matrix in row echelon form has zeros below each leading 1, whereas a matrix in
reduced row echelon form has zeros below and above each leading 1. Thus, with
any real numbers substituted for the *'s, all matrices of the following types are in row
echelon form:
1. [ 1  *  *  * ]
   [ 0  1  *  * ]
   [ 0  0  1  * ]
   [ 0  0  0  1 ]

2. [ 1  *  *  * ]
   [ 0  1  *  * ]
   [ 0  0  1  * ]
   [ 0  0  0  0 ]

3. [ 1  *  *  * ]
   [ 0  1  *  * ]
   [ 0  0  0  0 ]
   [ 0  0  0  0 ]

4. [ 0  1  *  *  *  *  *  *  *  * ]
   [ 0  0  0  1  *  *  *  *  *  * ]
   [ 0  0  0  0  1  *  *  *  *  * ]
   [ 0  0  0  0  0  1  *  *  *  * ]
   [ 0  0  0  0  0  0  0  0  1  * ]
Systems of Linear Equations (Gauss-Jordan Elimination)
Solve for the unknowns using the elementary row operations.

Solution: starting from the row-echelon form of the augmented matrix,

[ 1  1   2    9 ]
[ 0  1  −4  −10 ]
[ 0  0   1    3 ]

−R2 + R1 → R1:
[ 1  0   6   19 ]
[ 0  1  −4  −10 ]
[ 0  0   1    3 ]

4R3 + R2 → R2:
[ 1  0  6  19 ]
[ 0  1  0   2 ]
[ 0  0  1   3 ]

−6R3 + R1 → R1:
[ 1  0  0  1 ]
[ 0  1  0  2 ]
[ 0  0  1  3 ]    (reduced row-echelon form)

Clearly, x1 = 1, x2 = 2, and x3 = 3.
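The same reduced row-echelon form can be checked with sympy (an optional sketch of mine, assuming sympy is installed; Matrix.rref returns the RREF together with the pivot columns):

from sympy import Matrix

aug = Matrix([[1, 1, 2, 9],
              [2, 4, -3, 1],
              [3, 6, -5, 0]])
rref_matrix, pivot_columns = aug.rref()
print(rref_matrix)      # Matrix([[1, 0, 0, 1], [0, 1, 0, 2], [0, 0, 1, 3]])
print(pivot_columns)    # (0, 1, 2)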
3.1 Introduction to Determinants
Definition
The determinant of a 2 × 2 matrix A is denoted |A| and is given by

| a11  a12 |
| a21  a22 |  = a11 a22 − a12 a21

Observe that the determinant of a 2 × 2 matrix is the difference of the products of the two diagonals of the matrix. The notation det(A) is also used for the determinant of A.
Example 1
A = [  2  4 ]
    [ −3  1 ]

det(A) = |  2  4 |  = (2)(1) − (4)(−3) = 2 + 12 = 14
         | −3  1 |
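A one-line numeric check of Example 1 (my sketch; numpy assumed):

import numpy as np

A = np.array([[2, 4],
              [-3, 1]], dtype=float)
print(2 * 1 - 4 * (-3))      # the 2 x 2 formula: 14
print(np.linalg.det(A))      # numpy agrees, approximately 14.0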
Definition
Let A be a square matrix.
The minor of the element aij is denoted Mij and is the determinant
of the matrix that remains after deleting row i and column j of A.
The cofactor of aij is denoted Cij and is given by
Cij = (−1)^(i+j) Mij
Note that Cij = Mij or −Mij .
Example 2
Determine the minors and cofactors of the elements a11 and a32 of
the following matrix A.
A = [ 1   0  3 ]
    [ 4  −1  2 ]
    [ 0  −2  1 ]
Solution
Minor of a11: delete row 1 and column 1 of A:
M11 = | −1  2 |
      | −2  1 |  = (−1)(1) − (2)(−2) = −1 + 4 = 3

Cofactor of a11: C11 = (−1)^(1+1) M11 = (−1)^2 (3) = 3

Minor of a32: delete row 3 and column 2 of A:
M32 = | 1  3 |
      | 4  2 |  = (1)(2) − (3)(4) = 2 − 12 = −10

Cofactor of a32: C32 = (−1)^(3+2) M32 = (−1)^5 (−10) = 10
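The minor and cofactor definitions translate directly into code. The sketch below (mine, not the reviewer's; numpy assumed) deletes row i and column j and reproduces the values found in Example 2:

import numpy as np

def minor(A, i, j):
    sub = np.delete(np.delete(A, i, axis=0), j, axis=1)   # remove row i and column j
    return np.linalg.det(sub)

def cofactor(A, i, j):
    return (-1) ** (i + j) * minor(A, i, j)

A = np.array([[1, 0, 3],
              [4, -1, 2],
              [0, -2, 1]], dtype=float)
# Indices are 0-based here, so a11 is (0, 0) and a32 is (2, 1).
print(minor(A, 0, 0), cofactor(A, 0, 0))   # approximately 3 and 3
print(minor(A, 2, 1), cofactor(A, 2, 1))   # approximately -10 and 10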
Definition
The determinant of a square matrix is the sum of the products of
the elements of the first row and their cofactors.
If A is 3 × 3, |A| = a11C11 + a12C12 + a13C13
If A is 4 × 4, |A| = a11C11 + a12C12 + a13C13 + a14C14
If A is n × n, |A| = a11C11 + a12C12 + a13C13 + ⋯ + a1nC1n
These equations are called cofactor expansions of |A|.
Example 3
Evaluate the determinant of the following matrix A.
A = [ 1  2  −1 ]
    [ 3  0   1 ]
    [ 4  2   1 ]
Solution
|A| = a11C11 + a12C12 + a13C13

    = 1(−1)^2 | 0  1 |  + 2(−1)^3 | 3  1 |  + (−1)(−1)^4 | 3  0 |
              | 2  1 |            | 4  1 |               | 4  2 |

    = [(0)(1) − (1)(2)] − 2[(3)(1) − (1)(4)] − [(3)(2) − (0)(4)]
    = −2 + 2 − 6
    = −6
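The cofactor expansion along the first row can also be coded recursively. This sketch (mine; numpy assumed) reproduces the value −6 from Example 3:

import numpy as np

def det_by_cofactors(A):
    A = np.asarray(A, dtype=float)
    n = A.shape[0]
    if n == 1:
        return A[0, 0]
    total = 0.0
    for j in range(n):
        sub = np.delete(np.delete(A, 0, axis=0), j, axis=1)      # minor of the entry in row 1, column j+1
        total += A[0, j] * (-1) ** j * det_by_cofactors(sub)     # a_1j times its cofactor
    return total

A = [[1, 2, -1],
     [3, 0, 1],
     [4, 2, 1]]
print(det_by_cofactors(A))   # -6.0, matching Example 3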
Computing Determinants of 2 × 2 and 3 × 3 Matrices

A = [ a11  a12 ]
    [ a21  a22 ],    |A| = a11 a22 − a12 a21
INVERTIBLE OR NOT
2 −1
−3 1
INVERTIBLE OR NOT
6 −2
−3 1
INVERTIBLE OR NOT
1 −2 3
3 5 2
−1 3 −4
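Recall that a square matrix is invertible if and only if its determinant is nonzero, so each of these exercises can be settled by computing the determinant. A short Python sketch (mine; numpy assumed) that applies the test:

import numpy as np

matrices = [
    [[2, -1], [-3, 1]],
    [[6, -2], [-3, 1]],
    [[1, -2, 3], [3, 5, 2], [-1, 3, -4]],
]
for M in matrices:
    d = np.linalg.det(np.array(M, dtype=float))
    print(d, "invertible" if abs(d) > 1e-12 else "not invertible")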