Math 311h - Test II Review
General Information
Test II (Wednesday, October 29) will have 5 to 7 questions, some with
multiple parts. It will cover chapter 3, sections 3.2B through
3.6B. In addition, it will cover the material contained in my notes,
Methods
for Finding Bases. Please bring an 8½×11
bluebook. Problems will be similar to ones done for homework
or as examples in class. You may use calculators to do arithmetic,
although you will not need them. You may not use any
calculator that has the capability of doing either calculus or linear
algebra.
Topics Covered
Vector Spaces
- Basic notation. R^n, M_{m,n} (also called R^{m×n}), P_n,
C[a,b], C^(k)[a,b].
- Subspaces
- Know the definition of a subspace.
- Test for a subspace. Know the test to determine whether a
subset S of a vector space V is a subspace: (i) Is 0 in S? (ii)
Is S closed under + ? (iii) Is S closed under · ? Be able to
use it to determine whether subsets are subspaces.
- Span(v_1, v_2, ..., v_n) = {all linear combinations of the v_j's}.
- C^(k) is a subspace of C^(k-1).
- Subspaces of R^3, R^n
- Subspaces associated with matrices
- Null space of a matrix A, N(A) = {x in R^n | Ax = 0}.
- Column space of a matrix. This is the set of all y such that
y = Ax; it is the span of the columns of A,
Span(a_1, a_2, ..., a_n).
- Row space of a matrix. This is the span of all of the rows of A,
Span(r_1, r_2, ..., r_m).
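As a quick illustration of these three subspaces, here is a minimal sympy sketch; the matrix A below is just an illustrative rank-2 example, not one from class.

```python
# Bases for the null space, column space, and row space of a matrix.
# The matrix A is an illustrative example with rank 2.
from sympy import Matrix

A = Matrix([[1, 2, 3],
            [2, 4, 6],
            [1, 1, 1]])

null_basis = A.nullspace()     # basis for N(A) = {x | Ax = 0}
col_basis = A.columnspace()    # basis for Span of the columns of A
row_basis = A.rowspace()       # basis for Span of the rows of A

print(len(null_basis), len(col_basis), len(row_basis))
```

Note that dim N(A) + dim(column space) equals the number of columns, as expected.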
- Subspaces associated with a linear function L:V->W
- Domain, V.
- Range, W
- Null space of L, N(L) = {v in V | L[v] = 0_W}
- Image of L, Image(L) = {w in W | w = L[v] for some v in V}
- Linear functions
- Definition of a linear function. (These are also called linear
transformations and linear operators.) Let V, W be vector spaces. We
say that a function L:V->W is linear if L(au+bv) =
aL(u) + bL(v). Equivalently, L is additive,
L(u+v) = L(u) + L(v), and
homogeneous, L(cu) = cL(u).
- Subspaces associated with a linear transformation: domain, range,
null space, image.
- One-to-one linear transformations. L is one-to-one if and only if
L(v) = 0 implies that v = 0.
- Simple properties.
- L(0_V) = 0_W
- L(c_1 v_1 + c_2 v_2 + ... + c_n v_n) =
c_1 L(v_1) + c_2 L(v_2) + ... + c_n L(v_n)
- Combinations of linear transformations.
- Composition. If L:U->V, K:V->W are linear, then
K·L(v) := K(L(v)) is a linear transformation.
- Sum. If L:V->W, K:V->W are linear, then K+L is linear.
- Scalar multiple. If L:V->W is linear and if c is a scalar, then
cL is a linear transformation.
- Inverse of a linear transformation. If L is one-to-one, then
L^(-1) : Image(L) -> V is linear.
- Matrix representation of a linear transformation. If L:V->W is
linear, and if E = {v_1, v_2, ..., v_n} is a basis for V and
F = {w_1, w_2, ..., w_m} is a basis for W, then the matrix
representing L is
A = [ [L(v_1)]_F, [L(v_2)]_F, ..., [L(v_n)]_F ]
Know how to find matrix representations for linear
transformations. Be able to work problems similar to ones done in
class and to ones in the homework. Specific examples are matrix
representations for rotations, simple differential operators, etc.
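For example, the matrix of the differentiation operator D: P_3 -> P_2 relative to the monomial bases can be built column by column, each column being [D(x^j)]_F. A minimal numpy sketch (the sample polynomial is illustrative):

```python
import numpy as np

# D: P_3 -> P_2 (differentiation), with monomial bases
# E = {1, x, x^2, x^3} for P_3 and F = {1, x, x^2} for P_2.
# Column j of the matrix is [D(x^j)]_F, and D(x^j) = j x^(j-1).
A = np.zeros((3, 4))
for j in range(1, 4):
    A[j - 1, j] = j

# Illustrative check: p(x) = 5 + 2x - x^3 has [p]_E = (5, 2, 0, -1)^T,
# and p'(x) = 2 - 3x^2, so A @ [p]_E should be (2, 0, -3)^T.
p = np.array([5.0, 2.0, 0.0, -1.0])
print(A @ p)
```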
- Matrices for combinations of linear transformations. Assume that
the matrix for K is A and the matrix for L is B.
- Composition: K·L <-> AB
- Sum: K+L <-> A+B
- Scalar product: cK <-> cA
- Inverse: (Image(L) = W) K^(-1) <-> A^(-1)
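The composition rule can be checked numerically: composing two plane rotations is rotation by the summed angle, and its matrix is the product AB. A small numpy sketch (the angles are illustrative):

```python
import numpy as np

# Matrix of a rotation of R^2 by angle t (standard basis).
def rot(t):
    return np.array([[np.cos(t), -np.sin(t)],
                     [np.sin(t),  np.cos(t)]])

B = rot(np.pi / 6)    # matrix of L (rotation by 30 degrees)
A = rot(np.pi / 3)    # matrix of K (rotation by 60 degrees)

# K.L rotates by 90 degrees, and its matrix is the product AB.
assert np.allclose(A @ B, rot(np.pi / 2))
```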
- Linear Independence and Linear Dependence
- Definition and test for LI and LD sets of vectors. To test
whether a set S of vectors is LI or LD, where
S = {v_1, v_2, ..., v_k}, start with the equation
(*) c_1 v_1 + c_2 v_2 + ... + c_k v_k = 0
- If the only scalars for which equation (*) holds are
c_1 = c_2 = ... = c_k = 0, then S is LI.
- If there are nonzero scalars for which (*) holds, then S is LD.
- Matrix test for vectors in R^n. Let S = {a_1, a_2, ..., a_k} be
vectors in R^n. Let A be the n×k matrix with the a_j's as columns.
- S is LI if and only if N(A) = {0}. Equivalently, S is LD
if and only if N(A) contains a nonzero vector.
- Let k = n. S is LI if and only if A is invertible. Equivalently,
S is LD if and only if A is singular. In terms of determinants,
S is LI if det(A) is not 0, and S is LD if det(A) = 0.
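The k = n determinant test looks like this in numpy; the vectors in S are illustrative, chosen so that the third becomes the sum of the first two in the LD case:

```python
import numpy as np

# Columns of A are the vectors of S in R^3 (illustrative vectors).
A = np.array([[1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0],
              [0.0, 0.0, 1.0]])

# Here k = n = 3, so S is LI iff A is invertible, i.e. det(A) != 0.
d1 = np.linalg.det(A)
print(d1)                     # nonzero, so S is LI

# Replace the third vector by the sum of the first two: now S is LD.
A[:, 2] = A[:, 0] + A[:, 1]
d2 = np.linalg.det(A)
print(d2)                     # zero, so S is LD
```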
- Representation Theorem. (Be able to show this.) Let
S = {v_1, v_2, ..., v_n} be an LI subset of a vector space V. If v
belongs to Span(S), then v can be written in one and only one way
as a linear combination of vectors in S. That is, the scalars
c_1, c_2, ..., c_n in the equation below are unique.
v = c_1 v_1 + c_2 v_2 + ... + c_n v_n
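Uniqueness of the scalars can be seen computationally: when the v_j's form the columns of a matrix and are LI, the linear system for the coefficients has exactly one solution. A numpy sketch with an illustrative LI set:

```python
import numpy as np

# Columns of V_mat form an LI set S = {v_1, v_2, v_3} (illustrative).
V_mat = np.array([[1.0, 1.0, 0.0],
                  [0.0, 1.0, 1.0],
                  [0.0, 0.0, 1.0]])
v = np.array([3.0, 1.0, 2.0])     # a vector in Span(S)

# Because S is LI, the scalars c with c_1 v_1 + c_2 v_2 + c_3 v_3 = v
# are unique: they are the one solution of V_mat @ c = v.
c = np.linalg.solve(V_mat, v)
print(c)
assert np.allclose(V_mat @ c, v)
```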
- Basis and Dimension
- Definition of basis. A set S = {v_1, v_2, ..., v_n} is a basis
for a vector space V if and only if (i) S is LI and (ii) S spans V.
- Any two bases for a vector space have the same number of vectors
in them.
- Definition of dimension. If V has a basis with n>0 vectors in it,
then dim(V) = n. If V ={0}, dim(V) = 0. If V has arbitrarily
large LI sets in it, dim(V) is infinity.
- Important theorems.
- If S = {v_1, v_2, ..., v_n} spans V, then there is a subset of S
that is a basis for V.
- Let V be finite dimensional. If S = {v_1, v_2, ..., v_n} is LI
but does not span V, then we can add vectors to S to make it a
basis for V.
- Know Theorems 5.9 and 5.10.
- Know how to find bases for the subspaces associated with a
matrix. See my notes, Methods
for finding bases.
- Coordinates for Finite Dimensional Vector Spaces
- Coordinate vectors. Let E = [v_1, v_2, ..., v_n] be an ordered
basis for a vector space V. There is a one-to-one correspondence
between vectors in V and column vectors in R^n given by
v = c_1 v_1 + c_2 v_2 + ... + c_n v_n <--> [v]_E = (c_1, c_2, ..., c_n)^T
The vector [v]_E is called the coordinate vector for v relative to
the basis E.
- Properties of coordinate vectors. Coordinate vectors preserve
sums and scalar multiples of vectors in V. Specifically, they satisfy
u + v <--> [u]_E + [v]_E
cu <--> c[u]_E
That is,
[u + v]_E = [u]_E + [v]_E
[cu]_E = c[u]_E.
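These two properties say the map v -> [v]_E is itself linear. A small numpy check, taking an illustrative basis of R^2 stored as the columns of P:

```python
import numpy as np

# Ordered basis E for R^2, stored as the columns of P (illustrative).
P = np.array([[1.0, 1.0],
              [0.0, 1.0]])

def coord(v):
    # [v]_E: the coordinates c solve P @ c = v.
    return np.linalg.solve(P, v)

u = np.array([2.0, 1.0])
v = np.array([1.0, 3.0])

# [u + v]_E = [u]_E + [v]_E  and  [cu]_E = c [u]_E
assert np.allclose(coord(u + v), coord(u) + coord(v))
assert np.allclose(coord(5 * u), 5 * coord(u))
```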
- Bases for subspaces of a linear transformation L. By using the
matrix associated with a linear transformation and the correspondence
between vectors in V and column vectors, be able to find bases for the
null space and image of L.
- Eigenvalues and Eigenvectors
- Definition. Let L:V->V be linear. We say that a scalar λ is an
eigenvalue of L if and only if there is a nonzero vector v for
which L[v] = λv. The vector v is called an eigenvector of L
corresponding to λ.
- Matrix formulation. If dim(V) = n and E = {v_1, v_2, ..., v_n} is
a basis for V, then the eigenvalue problem for L reduces to the
eigenvalue problem for A, the matrix of L relative to E.
- Characteristic polynomial. p_A(λ) = det(A - λI). Know that p_A
is a polynomial of degree n. Be able to show that the eigenvalues
of A are the roots of p_A.
- Be able to find the eigenvalues and eigenvectors for a matrix.
Be able to do simple ODE systems and simple normal modes problems.
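Here is a minimal numpy sketch of the eigenvalue computation; the matrix is an illustrative symmetric example whose characteristic polynomial factors nicely:

```python
import numpy as np

# Illustrative symmetric matrix; p_A(lambda) = lambda^2 - 4*lambda + 3,
# so the eigenvalues (roots of p_A) are 1 and 3.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

vals, vecs = np.linalg.eig(A)
print(np.sort(vals))

# Each column of vecs is an eigenvector: A v = lambda v.
for lam, v in zip(vals, vecs.T):
    assert np.allclose(A @ v, lam * v)
```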
- Bases of eigenvectors.
- Diagonalizable. A linear transformation L:V->V is said to be
diagonalizable if and only if there is a basis for V relative to which
the matrix for L is diagonal.
- Know that eigenvectors corresponding to distinct eigenvalues are
linearly independent, and that L is diagonalizable if and only if
there is a basis for V comprising eigenvectors of L. Be able to
determine whether or not L is diagonalizable.
- Know that if the eigenvalues of L are distinct, then L is
diagonalizable. The converse is false, however.
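A numpy sketch of both facts (the matrices are illustrative): a matrix with distinct eigenvalues factors as S D S^(-1), while a 2×2 Jordan block has a repeated eigenvalue but only one LI eigenvector, so it is not diagonalizable:

```python
import numpy as np

# Distinct eigenvalues -> diagonalizable: A = S D S^(-1), where the
# columns of S are eigenvectors and D is diagonal.
A = np.array([[2.0, 1.0],
              [0.0, 3.0]])       # eigenvalues 2 and 3 (distinct)
vals, S = np.linalg.eig(A)
D = np.diag(vals)
assert np.allclose(A, S @ D @ np.linalg.inv(S))

# A repeated eigenvalue may fail to give enough LI eigenvectors.
# This Jordan block has eigenvalue 1 twice but only one LI eigenvector,
# so it is not diagonalizable.
J = np.array([[1.0, 1.0],
              [0.0, 1.0]])
vals_J, S_J = np.linalg.eig(J)
print(np.linalg.matrix_rank(S_J))   # eigenvectors of J span only a line
```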
- Change of coordinates. (Not on the test this time.)
- Let V be a vector space with bases E = {v_1, v_2, ..., v_n} and
F = {w_1, w_2, ..., w_n}. The transition matrix
S_{E->F} = [ [v_1]_F [v_2]_F ... [v_n]_F ]
has as columns the F coordinate vectors for the E basis vectors.
Know that
[v]_F = S_{E->F} [v]_E.
In addition, S_{F->E} = (S_{E->F})^(-1).
- If A_E and A_F are the matrices representing L relative to the
bases E and F, then
A_F = S_{E->F} A_E S_{F->E}
This is usually simplified by dropping subscripts. So, let
S = S_{F->E}. Then,
A_F = S^(-1) A_E S
The last equation is called a similarity transformation.
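Although this topic is not on the test, the transition-matrix identities can be verified numerically. In the sketch below the two bases of R^2 are illustrative, stored as the columns of P_E and P_F, so that [v]_E solves P_E c = v:

```python
import numpy as np

# Two illustrative bases of R^2, stored as columns: E = columns of P_E,
# F = columns of P_F.  Then [v]_E solves P_E @ c = v.
P_E = np.array([[1.0, 1.0],
                [0.0, 1.0]])
P_F = np.array([[2.0, 0.0],
                [0.0, 1.0]])

# Transition matrix S_{E->F}: columns are the F-coordinates of the
# E basis vectors, i.e. the solution of P_F @ S_EF = P_E.
S_EF = np.linalg.solve(P_F, P_E)

# Check [v]_F = S_{E->F} [v]_E on a sample vector.
v = np.array([3.0, 4.0])
v_E = np.linalg.solve(P_E, v)
v_F = np.linalg.solve(P_F, v)
assert np.allclose(v_F, S_EF @ v_E)

# Similarity transformation A_F = S^(-1) A_E S with S = S_{F->E}.
A = np.array([[1.0, 2.0],              # matrix of L in standard coordinates
              [3.0, 4.0]])
A_E = np.linalg.solve(P_E, A @ P_E)    # matrix of L relative to E
A_F = np.linalg.solve(P_F, A @ P_F)    # matrix of L relative to F
S = np.linalg.inv(S_EF)                # S_{F->E} = (S_{E->F})^(-1)
assert np.allclose(A_F, np.linalg.inv(S) @ A_E @ S)
```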