Math 311-501 Test II Review
General Information
Test II (Tuesday, 4/10/07) will have 6 to 8 questions, some
with multiple parts. It will cover chapter 3, sections 3.2 through
3.6. In addition, it will cover the material contained in my notes,
Methods
for Finding Bases. Please bring an 8½×11
bluebook. Problems will be similar to ones done for homework
or as examples in class. You may use calculators to do arithmetic,
although you will not need them. You may not use any
calculator that has the capability of doing linear algebra. You may
not use cell phones, computers, or any other device capable of
storing, sending, or receiving information.
Topics Covered
Vector Spaces
- Standard vector spaces. For each of these, know what the space is and what the operations of vector addition (+) and scalar multiplication (·) are:
R^n, M_{m,n}, P_n, C[a,b], C^(k)[a,b], span{v_1, v_2, ..., v_n}.
- Subspaces
- Be able to determine whether a subset S of a vector space V is a
subspace. Here is the test: (i) Is 0 in S? (ii) Is S closed
under + ? (iii) Is S closed under · ?
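The three-part test can be spot-checked numerically. Below is a minimal Python sketch (the two subsets of R^2 are hypothetical examples, not from the notes); checking closure on a few sample vectors illustrates the test but does not prove closure in general:

```python
# Subspace test applied to two hypothetical subsets of R^2:
#   S  = {(x, y) : y = 2x}      (a line through the origin)
#   S2 = {(x, y) : y = 2x + 1}  (a shifted line)

def in_S(v):  return v[1] == 2 * v[0]
def in_S2(v): return v[1] == 2 * v[0] + 1

# (i) Is 0 in the set?
print(in_S((0, 0)))       # True
print(in_S2((0, 0)))      # False: S2 fails test (i), so it is not a subspace

# (ii) closure under +, (iii) closure under ·, spot-checked on S:
u, v, c = (1, 2), (3, 6), 5
print(in_S((u[0] + v[0], u[1] + v[1])))   # True
print(in_S((c * u[0], c * u[1])))         # True
```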
- Subspaces associated with matrices
- Null space of a matrix A, N(A) = {x in R^n | Ax = 0}.
- Column space of a matrix. This is the span of the columns of A, span(a_1, a_2, ..., a_n).
- Row space of a matrix. This is the span of all of the rows of A, span(r_1, r_2, ..., r_m).
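As a quick numerical check (a minimal sketch; the 2×3 matrix below is just an illustrative example), a vector lies in N(A) exactly when Ax = 0:

```python
# Check membership in the null space N(A) = {x | Ax = 0}
# for an illustrative 2x3 example matrix A.

A = [[1, 2, 3],
     [2, 4, 6]]          # second row = 2 * first row, so rank(A) = 1

def matvec(M, x):
    """Multiply a matrix (list of rows) by a vector."""
    return [sum(m * xj for m, xj in zip(row, x)) for row in M]

def in_null_space(M, x):
    """True if Mx = 0, i.e. x is in N(M)."""
    return all(entry == 0 for entry in matvec(M, x))

# Two independent solutions of Ax = 0:
print(in_null_space(A, [-2, 1, 0]))   # True
print(in_null_space(A, [-3, 0, 1]))   # True
print(in_null_space(A, [1, 0, 0]))    # False: the first column is nonzero
```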
- Subspaces associated with a linear function L:V->W
- Domain of L, V.
- Range of L, W.
- Null space of L, N(L) = {v in V | L[v] = 0_W}.
- Image of L, Image(L) = {w in W | w = L[v] for some v in V}.
- Linear independence and linear dependence
- Definition and test for LI and LD sets of vectors. To determine whether a set S = {v_1, v_2, ..., v_k} of vectors is LI or LD, start with the equation
(*) c_1 v_1 + c_2 v_2 + ... + c_k v_k = 0.
- If the only scalars for which equation (*) holds are c_1 = c_2 = ... = c_k = 0, then S is LI.
- If there are scalars, not all zero, for which (*) holds, then S is LD.
- Matrix test for vectors in R^n. Let S = {a_1, a_2, ..., a_k} be vectors in R^n. Let A be the n×k matrix with the a_j's as columns.
- S is LI if and only if N(A) = {0}. Equivalently, S is LD if and only if N(A) contains a nonzero vector.
- Let k = n. S is LI if and only if A is invertible. Equivalently, S is LD if and only if A is singular. In terms of determinants, S is LI if det(A) is not 0, and S is LD if det(A) = 0.
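For the k = n case, the determinant test is easy to carry out by hand or by machine. A short Python sketch (the two 3×3 matrices are illustrative assumptions; the determinant is expanded by cofactors along the first row):

```python
# Determinant test for linear independence of n vectors in R^n
# (illustrative 3x3 examples; the vectors are the columns of the matrix).

def det3(M):
    """Determinant of a 3x3 matrix by cofactor expansion along row 0."""
    a, b, c = M[0]
    d, e, f = M[1]
    g, h, i = M[2]
    return a*(e*i - f*h) - b*(d*i - f*g) + c*(d*h - e*g)

# Columns a1 = (1,0,0), a2 = (1,1,0), a3 = (1,1,1):
A = [[1, 1, 1],
     [0, 1, 1],
     [0, 0, 1]]
print(det3(A))          # 1, nonzero => the columns are LI

# Replace a3 by a1 + a2 = (2,1,0):
B = [[1, 1, 2],
     [0, 1, 1],
     [0, 0, 0]]
print(det3(B))          # 0 => the columns are LD
```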
- Basis and Coordinates
- Definition of basis. A set S = {v_1, v_2, ..., v_n} is a basis for a vector space V if and only if (i) S is LI and (ii) S spans V.
- Matrix subspaces. Be able to find bases for these. See my notes,
Methods for Finding Bases.
- Coordinates. If B = {v_1, v_2, ..., v_n} is a basis for V, then there is a 1:1 correspondence between V and R^n,
v = x_1 v_1 + x_2 v_2 + ... + x_n v_n  <-->  [v]_B = (x_1, x_2, ..., x_n)^T.
The vector [v]_B is called the coordinate vector for v relative to B. Coordinate vectors preserve sums, multiples, and general linear combinations of vectors:
[u + v]_B = [u]_B + [v]_B,
[cu]_B = c[u]_B,
[c_1 u_1 + c_2 u_2 + ... + c_m u_m]_B = c_1 [u_1]_B + c_2 [u_2]_B + ... + c_m [u_m]_B.
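Finding [v]_B amounts to solving a linear system. A minimal Python sketch for R^2 (the basis B = {(1,1), (1,-1)} and the vectors u, v are illustrative choices; a 2×2 system is solved by Cramer's rule):

```python
from fractions import Fraction

# Coordinates relative to an illustrative basis B = {v1, v2} of R^2:
# solve x1*v1 + x2*v2 = v for (x1, x2) by Cramer's rule.

def coords(v1, v2, v):
    """Return [v]_B = (x1, x2) with x1*v1 + x2*v2 = v."""
    det = v1[0]*v2[1] - v2[0]*v1[1]           # nonzero, since B is a basis
    x1 = Fraction(v[0]*v2[1] - v2[0]*v[1], det)
    x2 = Fraction(v1[0]*v[1] - v[0]*v1[1], det)
    return (x1, x2)

v1, v2 = (1, 1), (1, -1)                      # basis B for R^2
u, v = (3, 1), (0, 2)

cu, cv = coords(v1, v2, u), coords(v1, v2, v)
print(cu)           # x1 = 2, x2 = 1, i.e. u = 2*v1 + 1*v2

# Coordinate vectors preserve sums: [u + v]_B = [u]_B + [v]_B
s = coords(v1, v2, (u[0] + v[0], u[1] + v[1]))
print(s == (cu[0] + cv[0], cu[1] + cv[1]))    # True
```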
- Change of coordinates.
- Vectors. Let V be a vector space with bases B = {v_1, v_2, ..., v_n} and C = {w_1, w_2, ..., w_n}. The transition matrix
S_{B->C} = [ [v_1]_C  [v_2]_C  ...  [v_n]_C ] = [B basis in C coordinates]
has as columns the C coordinate vectors for the B basis vectors. Know that
[v]_C = S_{B->C} [v]_B.
In addition, S_{C->B} = (S_{B->C})^{-1}.
- Matrices. If A_B and A_C are the matrices representing L relative to the bases B and C, then
A_C = S_{B->C} A_B S_{C->B}.
This is usually simplified by dropping subscripts. So, let S = S_{C->B}. Then
A_C = S^{-1} A_B S.
The last equation is called a similarity transformation.
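A small numerical sketch of change of coordinates in R^2 (the bases here are illustrative assumptions: B is the standard basis and C = {(1,1), (1,-1)}):

```python
from fractions import Fraction as F

# Transition matrices between two bases of R^2 (an illustrative choice):
# B = standard basis {e1, e2}, C = {w1, w2} with w1 = (1,1), w2 = (1,-1).
# Columns of S_{B->C} are [e1]_C and [e2]_C; columns of S_{C->B} are w1, w2.

S_BC = [[F(1, 2), F(1, 2)],
        [F(1, 2), F(-1, 2)]]
S_CB = [[1, 1],
        [1, -1]]

def matvec(M, x):
    """Multiply a matrix (list of rows) by a vector."""
    return [sum(m * xj for m, xj in zip(row, x)) for row in M]

# [v]_C = S_{B->C} [v]_B for v = (3, 1):
v_B = [3, 1]
v_C = matvec(S_BC, v_B)
print(v_C)          # equals [2, 1]: (3,1) = 2*(1,1) + 1*(1,-1)

# S_{C->B} = (S_{B->C})^{-1}: a round trip returns the original coordinates.
x = [5, -2]
print(matvec(S_BC, matvec(S_CB, x)))   # equals [5, -2]
```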
- Dimension
- Any two bases for a vector space have the same number of vectors
in them.
- Definition of dimension. If V has a basis with n > 0 vectors in it, then dim(V) = n. If V = {0}, dim(V) = 0. If V has arbitrarily large LI sets in it, dim(V) is infinite.
- Dimensions of matrix subspaces. (See Methods
for Finding Bases). The dimension of the column space of an
m×n matrix A is called rank(A). This is also the dimension of
the row space of A. The dimension of the null space of A is called
nullity(A).
- Important theorems.
- If S = {v_1, v_2, ..., v_n} spans V, then there is a subset of S that is a basis for V.
- Let V be finite dimensional. If S = {v_1, v_2, ..., v_n} is LI but does not span V, then we can add vectors to S to make it a basis for V.
- Let dim(V) = n. S is a basis for V if any two of these hold: (i)
S has exactly n vectors in it. (ii) S is LI. (iii) V = span(S).
(Theorem 5.9, p. 141.)
- For any matrix A, rank(A)+nullity(A) = # of columns of A
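The rank-nullity theorem can be verified on a concrete matrix. A minimal Python sketch (the 3×4 matrix is an illustrative assumption; rank is counted as the number of pivots after Gaussian elimination):

```python
from fractions import Fraction

# Rank-nullity check: rank(A) + nullity(A) = number of columns of A.
# rank is computed by Gaussian elimination (illustrative 3x4 example).

def rank(M):
    """Row-reduce a copy of M and count the pivot rows."""
    M = [[Fraction(x) for x in row] for row in M]
    rows, cols = len(M), len(M[0])
    r = 0                                   # index of the next pivot row
    for c in range(cols):
        piv = next((i for i in range(r, rows) if M[i][c] != 0), None)
        if piv is None:
            continue                        # no pivot in this column
        M[r], M[piv] = M[piv], M[r]
        for i in range(r + 1, rows):
            f = M[i][c] / M[r][c]
            M[i] = [a - f * b for a, b in zip(M[i], M[r])]
        r += 1
    return r

A = [[1, 2, 0, 1],
     [2, 4, 1, 1],
     [3, 6, 1, 2]]          # row 3 = row 1 + row 2, so rank(A) = 2
n_cols = 4
print(rank(A))              # 2
print(n_cols - rank(A))     # nullity(A) = 2, and 2 + 2 = 4 = # of columns
```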
Linear Functions (Transformations/Operators)
- Definition of a linear function. (These are also called linear
transformations and linear operators.) Let V, W be vector spaces. We
say that a function L:V->W is linear if L is additive ("sum
rule"), L(u+v) = L(u) + L(v), and
homogeneous ("constant multiplier rule"), L(cu) =
cL(u). These are equivalent to L(au+bv) =
aL(u) + bL(v).
- Simple properties.
- L(0_V) = 0_W
- L(c_1 v_1 + c_2 v_2 + ... + c_n v_n) = c_1 L(v_1) + c_2 L(v_2) + ... + c_n L(v_n)
- One-to-one. A linear function L is one-to-one if and only if
L(v) = 0 implies that v = 0.
- Inverse of a linear transformation. If L is one-to-one, then L^{-1} : Image(L) -> V is linear.
- Matrix representation. If L:V->W is linear, and if B = {v_1, v_2, ..., v_n} is a basis for V and C = {w_1, w_2, ..., w_m} is a basis for W, then the matrix representing L is
A = [ [L(v_1)]_C  [L(v_2)]_C  ...  [L(v_n)]_C ] = [C coords. of L applied to basis B].
Know how to find matrix representations for linear
transformations. Specific examples are matrix representations for
rotations, simple differential operators, reflections, etc.
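For a simple differential operator, the recipe above can be carried out directly. A Python sketch (assuming the space of polynomials of degree at most 2 with ordered basis B = {1, x, x^2}, both illustrative choices):

```python
# Matrix representation of the derivative operator D = d/dx on the space
# of polynomials of degree <= 2, relative to the basis B = {1, x, x^2}.

# D(1) = 0, D(x) = 1, D(x^2) = 2x, so the columns of A are
# [D(1)]_B = (0,0,0), [D(x)]_B = (1,0,0), [D(x^2)]_B = (0,2,0):
A = [[0, 1, 0],
     [0, 0, 2],
     [0, 0, 0]]

def matvec(M, x):
    """Multiply a matrix (list of rows) by a vector."""
    return [sum(m * xj for m, xj in zip(row, x)) for row in M]

# p(x) = 3 + 5x + 4x^2 has coordinate vector (3, 5, 4) relative to B.
p = [3, 5, 4]
print(matvec(A, p))     # [5, 8, 0], i.e. p'(x) = 5 + 8x
```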
- Bases for subspaces of a linear transformation L. By using the
matrix associated with a linear transformation and the correspondence
between vectors in V and column vectors, be able to find bases for the
null space and image of L.
- The rank of L = dimension of Image(L), and the nullity of L =
dimension of the null space of L. As is the case for matrices, rank +
nullity = dimension of V.
Eigenvalues and Eigenvectors
- Definition. Let L:V->V be linear. We say that a scalar λ is an eigenvalue of L if and only if there is a vector v ≠ 0 for which L[v] = λv. The vector v is called an eigenvector corresponding to λ.
- Matrix formulation. If dim(V) = n and E = {v_1, v_2, ..., v_n} is a basis for V, then the eigenvalue problem for L becomes the eigenvalue problem for A, the matrix of L relative to E.
- Characteristic polynomial. p_A(λ) = det(A − λI). Know that the eigenvalues of A are exactly the roots of p_A.
- Be able to find the eigenvalues and eigenvectors for a matrix.
- Be able to determine whether a matrix can be diagonalized. If it can, be able to find the matrix S such that A = SΛS^{-1}, where Λ = diag(λ_1, ..., λ_n).
- Be able to do simple ODE systems and normal-mode (spring) problems.
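A worked 2×2 example of the eigenvalue procedure (the symmetric matrix A is an illustrative assumption; for a 2×2 matrix, p_A(λ) = λ^2 − tr(A)λ + det(A), so the quadratic formula gives the eigenvalues):

```python
import math

# Eigenvalues of a 2x2 matrix from its characteristic polynomial
# p_A(lam) = lam^2 - tr(A)*lam + det(A)   (illustrative symmetric example).

A = [[2, 1],
     [1, 2]]

tr  = A[0][0] + A[1][1]
det = A[0][0]*A[1][1] - A[0][1]*A[1][0]
disc = math.sqrt(tr*tr - 4*det)          # real here, since A is symmetric
lam1, lam2 = (tr - disc) / 2, (tr + disc) / 2
print(lam1, lam2)                        # 1.0 3.0

def matvec(M, x):
    """Multiply a matrix (list of rows) by a vector."""
    return [sum(m * xj for m, xj in zip(row, x)) for row in M]

# Check A v = lam v for the eigenvectors v1 = (1, -1), v2 = (1, 1):
print(matvec(A, [1, -1]))                # [1, -1] = 1 * (1, -1)
print(matvec(A, [1, 1]))                 # [3, 3]  = 3 * (1, 1)
```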
Updated 4/4/07