Math 311-101 — Test I Review — Summer I, 2012
General Information
Test 1 (Wednesday, June 13) will have 5 to 7 questions, some with
multiple parts. It will cover sections 1.1-1.4, 2.1-2.3, 3.1-3.4, 3.6, and
4.1 in Leon's Linear Algebra (Part I). Please bring an
8½×11 bluebook. Problems will be similar to ones
done for homework. I will have extra office hours on Tuesday
afternoon, 11:45 am-4 pm, and Wednesday morning, 8:30-9:30 am.
- Calculators. You may use scientific calculators to do numerical
calculations (logs, exponentials, and so on). You
may not use any calculator that has the capability of doing
algebra or calculus, or of storing course material.
- Other devices. You may not use cell phones, computers, or any
other device capable of storing, sending, or receiving information.
Topics Covered
Systems & matrices
- Linear systems
- Solving systems via row reduction.
- Augmented matrix form. Convert a system to and from augmented
matrix form.
- Row operations and equivalent systems. Be able to define the term
equivalent system. Know the three types of row operations and
that they result in an equivalent system.
- Row echelon form of a matrix and reduced row echelon form. Be
able to use Gauss elimination to put a matrix in row echelon form. Be
able to identify the lead variables and free
variables. Be able to use Gauss-Jordan reduction to put a matrix
in reduced row echelon form. (This form makes the connection
between lead variables and free variables explicit.)
Be able to find all solutions of a linear system by row reducing its
augmented matrix and reading off the solution to the resulting
equivalent system.
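The row-reduction procedure above can be spot-checked with a CAS; a minimal sketch using sympy (the system itself is a made-up example):

```python
from sympy import Matrix

# Hypothetical system: x + 2y + z = 3,  2x + 5y + z = 4.
# Augmented matrix [A|b]:
aug = Matrix([[1, 2, 1, 3],
              [2, 5, 1, 4]])

# Gauss-Jordan reduction: rref() returns the reduced row echelon
# form and the indices of the pivot (lead-variable) columns.
R, pivots = aug.rref()
print(R)       # rows [1, 0, 3, 7] and [0, 1, -1, -2]
print(pivots)  # (0, 1): x and y are lead variables, z is free
```

Reading off the equivalent system gives x = 7 - 3z, y = -2 + z, with z free.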
- Special types of systems: homogeneous, overdetermined,
underdetermined.
- Homogeneous systems. Know the connection with solutions to a
general system and the corresponding homogeneous system.
- Matrices
- Matrix algebra. Sum, product, scalar multiples, row vectors,
column vectors, transpose, symmetric matrix, identity matrix, zero
matrix, size of a matrix, (i,j) entry, notation. Know the "basic matrix trick"
Ax = x1a1 + x2a2 + ... + xnan,
where the aj's are the columns of A.
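The "basic matrix trick" is easy to verify numerically; a small sketch with sympy (the matrices are arbitrary examples):

```python
from sympy import Matrix

A = Matrix([[1, 2],
            [3, 4],
            [5, 6]])
x = Matrix([10, -1])  # x1 = 10, x2 = -1

# Ax as a linear combination of A's columns: x1*a1 + x2*a2
combo = x[0] * A.col(0) + x[1] * A.col(1)
print(A * x == combo)  # True
```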
- Inverse of a matrix.
- Know how the inverse is defined. Also, know the terms invertible,
nonsingular, and singular. Be able to find the inverse of a matrix
or show that a matrix is singular via row reducing [A|I].
- Know that these are equivalent conditions for A to be invertible:
- A is nonsingular. (That is, A−1 exists.)
- Ax = 0 has only x = 0 as a
solution.
- A is row equivalent to I.
- det(A) ≠ 0.
- The columns of A are linearly independent.
- Know that these are equivalent conditions for A to be singular:
- A is singular. (That is, A−1 doesn't exist.)
- Ax = 0 has a nontrivial solution x ≠ 0.
- A is row equivalent to a matrix with 0's in the last row.
- det(A) = 0.
- The columns of A are linearly dependent.
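The [A|I] method mentioned above can be sketched with sympy (the 2×2 matrix is an arbitrary example):

```python
from sympy import Matrix, eye

A = Matrix([[2, 1],
            [5, 3]])

# Row reduce [A | I]; if the left block becomes I, the right
# block is the inverse.  If a zero row appears on the left
# instead, A is singular.
aug = Matrix.hstack(A, eye(2))
R, _ = aug.rref()
A_inv = R[:, 2:]            # right block of the reduced matrix
print(A_inv)                # the inverse: [[3, -1], [-5, 2]]
print(A * A_inv == eye(2))  # True
```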
- Elementary matrices
- Three types of elementary matrices and correspondence to row
operations.
- Definition of row equivalence of matrices.
Determinants
- Basic properties. Know the basic properties for
determinants. Be able to calculate the determinant of a matrix via its
cofactor expansion about a row or a column.
- Determinants of special matrices. The determinant of an
upper triangular, lower triangular, or diagonal matrix is the product
of the diagonal entries.
- Row and column operations. Be able to use row operations
to find a determinant.
- Row reduction of a matrix A and det(A). Be able to read
off the determinant of a matrix from the row operations used to reduce
it and its row echelon form.
- Elementary matrices. Know the determinant of the three
types of elementary matrices.
- Inverses. Be able to determine whether an n×n
matrix A is invertible from knowing det A.
- Cramer's rule. Classical adjoint adj(A).
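These determinant facts are easy to spot-check with sympy (the matrices are made-up examples):

```python
from sympy import Matrix

A = Matrix([[1, 2, 3],
            [0, 4, 5],
            [1, 0, 6]])
print(A.det())   # 22, which is nonzero, so A is invertible

# Triangular matrix: det is the product of the diagonal entries.
U = Matrix([[2, 7, 1],
            [0, 3, 5],
            [0, 0, 4]])
print(U.det())   # 24 = 2*3*4
```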
Vector spaces
- Properties and examples. Vector spaces have operations
of addition and multiplication by scalars.
- The algebra of vectors is exactly the same as it is in 2D and 3D.
- Closure axioms. Addition: If u and v are
vectors, then so is u + v. Multiplication by
scalars: If c is a scalar and v is a vector, then
c·v is a vector.
- Special vector spaces: Rn,
Rm×n, Pn, C[a,b],
Ck[a,b]. (Pn is the set of polynomials of
degree less than n. So, for example, P3 is the set
of quadratics.)
- Subspaces
- Know the test to determine whether a subset S of a vector space
is a subspace: (i) Is 0 in S? (ii) Is S closed under vector
addition? (iii) Is S closed under multiplication by a scalar?
- Span(v1, v2, ...,
vn). Linear combinations. Spanning sets. Be able to
determine whether or not S is a spanning set for a vector space
— Rn and Pn only.
- Null space of a matrix A, N(A) = {x in
Rn | Ax = 0}.
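A null space computation, sketched with sympy (A is an arbitrary rank-1 example):

```python
from sympy import Matrix

A = Matrix([[1, 2, 3],
            [2, 4, 6]])   # second row is twice the first

# nullspace() returns a basis for N(A) = {x | Ax = 0}.
basis = A.nullspace()
print(len(basis))   # 2: the dimension of N(A)
print(all(A * v == Matrix([0, 0]) for v in basis))  # True
```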
- Linear Independence and Linear Dependence
- Definition and test for LI and LD sets of vectors. To test
whether a set S =
{v1, v2, ...,
vk} is LI or LD, start with the equation
(∗) c1v1 +
c2v2 + ... +
ckvk = 0
- If the only scalars for which equation (∗) holds are
c1 = c2 = ... = ck = 0, then S is
LI.
- If there are nonzero scalars for which (∗) holds, then
S is LD.
- Matrix test for vectors in Rn. Let S =
{a1, a2, ...,
ak} be vectors in Rn. Let A be an
n×k matrix with the aj's as columns.
- S is LI if and only if N(A) = {0}. Equivalently, S is LD
if and only if N(A) contains a nonzero vector.
- Let k = n. S is LI if and only if A is invertible. Equivalently, S is LD
if and only if A is singular. In terms of determinants, S is LI if det(A)
is not 0, and S is LD if det(A) = 0.
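The matrix test can be run directly; a sketch with sympy, using made-up vectors in R3 with a built-in dependence:

```python
from sympy import Matrix

a1 = Matrix([1, 0, 1])
a2 = Matrix([2, 1, 0])
a3 = a1 + a2              # forces S = {a1, a2, a3} to be LD
A = Matrix.hstack(a1, a2, a3)

# k = n = 3 here, so all the equivalent tests apply:
print(A.det())             # 0: A is singular, so S is LD
print(len(A.nullspace()))  # 1: N(A) contains nonzero vectors
```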
- Basis, Dimension and Coordinates
- Definition of basis. A set S = {v1,
v2, ..., vn} is a basis for a
vector space V if and only if (i) S is LI and (ii) S spans V.
- Know the standard bases for Pn,
Rn, Rm×n
- (Corollary 3.4.2) Any two bases for a vector space have the same number of vectors
in them.
- Definition of dimension. If V has a basis with n>0 vectors in
it, then dim(V) = n. If V = {0}, dim(V) = 0. If V contains
arbitrarily large LI sets, dim(V) is infinite.
- "Counting" Theorems
- (Theorem 3.4.1) Suppose that V = span(S) and that S′ has more
vectors than S has. Then S′ is linearly dependent.
- (Theorem 3.4.3) Suppose that dim(V) = n and that S is a set with
n vectors in it.
- If S is linearly independent, then it also spans V and is a basis.
- If S spans V, then it is also linearly independent and is a
basis.
- (Theorem 3.4.4) Suppose that dim(V) = n.
- No set with fewer than n vectors can span V.
- Every linearly independent set with fewer than n vectors
may be extended to be a basis.
- Every linearly dependent set that spans V may be pared
down to be a basis.
- Coordinates
- Coordinate Theorem. (Be able to show this. See your class
notes for Friday, 6/8/12.)
Let F = {v1, v2, ...,
vn} be a basis for a vector space V. For
every v ∈ V, v can be written in one and only one
way as a linear combination of vectors in F: That is, there are unique
scalars c1, c2, ... cn such that
v = c1v1 +
c2v2 + ... +
cnvn.
- Coordinate vectors. Let F = [v1,
v2, ..., vn] be an
ordered basis for a vector space V. There is a one-to-one
correspondence between vectors in V and column vectors in
Rn given by
v = c1v1 +
c2v2 + ... +
cnvn <-->
[v]F = (c1, c2, ...,
cn)T
The vector [v]F is called the coordinate vector for
v relative to the basis F. The c's are called coordinates.
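Finding a coordinate vector amounts to solving a linear system; a sketch with sympy (the basis F is a made-up example in R2):

```python
from sympy import Matrix

v1 = Matrix([1, 1])
v2 = Matrix([1, -1])
F = Matrix.hstack(v1, v2)   # ordered basis as matrix columns

v = Matrix([3, 1])
# [v]_F solves F*c = v; the solution is unique because F is a basis.
c = F.solve(v)
print(c)  # coordinates 2 and 1, i.e. v = 2*v1 + 1*v2
```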
- Isomorphism between V and Rn.
Coordinate vectors preserve sums and multiples of vectors in V. That
is, the correspondence between v and [v]F is
an isomorphism between V and Rn. (See
problem 3.1.16 in the text.) Specifically, one has that
u + v <-->
[u]F + [v]F or,
equivalently, [u + v]F =
[u]F + [v]F
cu <--> c[u]F or,
equivalently, [cu]F =
c[u]F.
- Null, Row, and Column Spaces
- Null space. N(A) = {x in
Rn | Ax = 0}.
- Row space. This is the span of all of the rows of A,
Span(r1, r2, ...,
rm).
- Column space. This is the set of all y such that
y = Ax; it is the span of the columns of A,
Span(a1, a2, ...,
an).
- Dimensions of subspaces
- Nullity. nullity(A) := dim(null space(A)).
- Rank. rank(A) := dim(row space(A)) = dim(column space(A)).
- The Rank-Nullity Theorem: rank(A) + nullity(A) = # of
columns of A.
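The Rank-Nullity Theorem is easy to verify on examples; a sketch with sympy (A is an arbitrary matrix with one dependent row):

```python
from sympy import Matrix

A = Matrix([[1, 2, 0, 1],
            [0, 1, 1, 0],
            [1, 3, 1, 1]])   # row 3 = row 1 + row 2

rank = A.rank()
nullity = len(A.nullspace())
print(rank, nullity)              # 2 2
print(rank + nullity == A.cols)   # True: equals # of columns
```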
- Bases. Know how to find bases for the subspaces
associated with a matrix. See my notes, Methods for finding bases.
- Consistency Theorems. Know and be able to apply these
consistency theorems. Theorems 3.6.2, 3.6.3, and Corollary 3.6.4.
Updated: 6/11/2012 (fjn)