Math 304-200 Summer II, 2010
Review Sheet - Final Exam
The final exam is comprehensive. It will have six to eight problems,
some with multiple parts. Questions will be similar to those on the
quizzes, homework, and examples done in class. Below is a list of the
sections and notes that were covered.
- Sections 1.1-1.4 and
Notes on Row Reduction. We did application 1 in 1.2, but
skipped the others.
- Sections 2.1-2.3. Be sure to know the "basic matrix trick": If
A = [a_1 ... a_n], then Ax = x_1 a_1 + ... + x_n a_n. We skipped
pp. 108-109.
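The trick is easy to verify numerically. A minimal numpy sketch (the matrix and vector below are made up for illustration):

```python
import numpy as np

# Illustrative data, not from the text.
A = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 6.0]])
x = np.array([2.0, -1.0])

# Ax as a matrix-vector product...
Ax = A @ x
# ...and as the linear combination x_1 a_1 + x_2 a_2 of the columns of A.
combo = x[0] * A[:, 0] + x[1] * A[:, 1]

print(np.allclose(Ax, combo))  # the two agree
```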
- Sections 3.1-3.6 and Methods for Finding Bases. (You don't need
to memorize the axioms of a vector space.)
- Sections 4.1-4.3. We skipped pp. 189 (bottom)-191 (top), and
pp. 194-196.
- Sections 6.1 (all), 6.2 (harmonic motion only), 6.3 (skipped
pp. 331-340). Concerning whether a matrix is diagonalizable or
defective, we have the following (cf. class notes, Aug. 3):
Theorem. Let A be an n×n matrix and suppose its characteristic
polynomial factors as
    p_A(λ) = (λ_1 − λ)^(α_1) (λ_2 − λ)^(α_2) ... (λ_r − λ)^(α_r),
where r is the number of distinct eigenvalues. Then A is
diagonalizable if and only if nullity(A − λ_k I) = α_k for
k = 1, ..., r. Otherwise, A is defective.
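The diagonalizable-versus-defective test can be carried out numerically: compare nullity(A − λ_k I) with the algebraic multiplicity α_k. A small sketch, assuming numpy, using a made-up defective matrix:

```python
import numpy as np

# Illustrative matrix: eigenvalue 1 has algebraic multiplicity 2,
# but the eigenspace is only one-dimensional, so A is defective.
A = np.array([[1.0, 1.0],
              [0.0, 1.0]])

lam = 1.0
# nullity(A - lam*I) = n - rank(A - lam*I)
n = A.shape[0]
nullity = n - np.linalg.matrix_rank(A - lam * np.eye(n))
print(nullity)  # 1, which is less than alpha = 2, so A is defective
```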
- Section 5.1. Inner (scalar) product in R^n; scalar and vector
projections; length and angle. Skip pp. 217-223.
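The Section 5.1 formulas (scalar projection, vector projection, angle) can be sketched with numpy; the vectors here are made up for illustration:

```python
import numpy as np

# Illustrative vectors.
x = np.array([3.0, 4.0])
y = np.array([1.0, 0.0])

# Scalar projection of x onto y: <x, y> / ||y||
alpha = (x @ y) / np.linalg.norm(y)
# Vector projection of x onto y: (<x, y> / <y, y>) y
p = (x @ y) / (y @ y) * y
# Angle between x and y: cos(theta) = <x, y> / (||x|| ||y||)
theta = np.arccos((x @ y) / (np.linalg.norm(x) * np.linalg.norm(y)))
```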
- Section 5.4. Inner product (know the definition), inner product
spaces, Cauchy-Schwarz inequality, angle, orthogonal vectors, norm
(length), distance. Skip pp. 250 (bottom)-252. Here are two very
important inner products:
    Space: R^n; inner product: < x, y > = y^T x.
    Space: C[a,b]; inner product: < f, g > = ∫_a^b f(x) g(x) dx.
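Both inner products can be computed (the second one approximately) in numpy. A sketch with made-up vectors and functions, approximating the integral by the trapezoid rule on [0, 1]:

```python
import numpy as np

# R^n inner product: <x, y> = y^T x (illustrative vectors).
x = np.array([1.0, 2.0])
y = np.array([3.0, -1.0])
rn_ip = y.T @ x  # 3*1 + (-1)*2 = 1.0

# C[a,b] inner product: <f, g> = integral_a^b f(x) g(x) dx,
# approximated on [a, b] = [0, 1] with f(x) = x, g(x) = 1.
t = np.linspace(0.0, 1.0, 10001)
h = t * 1.0  # the product f(t) g(t)
dt = t[1] - t[0]
cab_ip = np.sum((h[:-1] + h[1:]) / 2.0) * dt  # trapezoid rule, ≈ 0.5
```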
- Least squares. Section 5.3 is very confusing. Here is a
summary of the lectures on the subject.
Least-squares problem. Suppose that we have an inner product space
V, a vector v in V, and a subspace S of V. The least-squares
problem is to find the minimum of || v − u || over all u in S,
together with a minimizer p in S. That is, we want to find p in S
such that
    || v − p || = min_{u ∈ S} || v − u ||.
Theorem. Solution to the least-squares problem. Let V be a vector
space with an inner product < u, v >, and let S be a subspace of
V. A vector p in S minimizes || v − u || over u in S if and only
if p satisfies the condition
    < v − p, u > = 0, for all u in S.
That is, v − p is orthogonal to the whole space
S. In addition, p is unique; it is called the orthogonal
projection of v onto S.
To actually find p, we need a basis for S. So suppose that
B = {u_1, ..., u_n} is such a basis. The condition in the theorem
becomes v − p ⊥ u_i for i = 1, ..., n. Since p ∈ S = span(B), we
have p = c_1 u_1 + ... + c_n u_n. This and v − p ⊥ u_i result in
the system of (normal) equations
    < v − p, u_i > = < v, u_i > − c_1 < u_1, u_i > − ... − c_n < u_n, u_i > = 0,
for i = 1, ..., n.
The normal equations imply that the coefficients c_j satisfy the
matrix equation
    G c = d,
where G_ij = < u_j, u_i >, d = (< v, u_1 >, ..., < v, u_n >)^T, and
c = (c_1, ..., c_n)^T.
The matrix G is called the Gram matrix for the basis of u's; it is
always invertible. For an orthonormal basis, < u_j, u_i > = δ_ij,
and G = I. Examples may be found in the notes on Least Squares
Problems.
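The whole recipe (Gram matrix, right-hand side d, solve Gc = d, check that v − p is orthogonal to S) can be run in R^m with the standard inner product. A minimal numpy sketch with made-up data, where the basis vectors u_1, u_2 are the columns of U:

```python
import numpy as np

# Illustrative data: S = span of the columns of U, v a vector in R^3.
U = np.array([[1.0, 1.0],
              [1.0, 0.0],
              [0.0, 1.0]])
v = np.array([2.0, 0.0, 1.0])

# Gram matrix G_ij = <u_j, u_i> and right-hand side d_i = <v, u_i>.
# With the standard inner product, both reduce to matrix products.
G = U.T @ U
d = U.T @ v

c = np.linalg.solve(G, d)  # G is invertible since the u's are a basis
p = U @ c                  # orthogonal projection of v onto S

# The theorem's condition: v - p is orthogonal to each basis vector.
print(np.allclose(U.T @ (v - p), 0.0))  # True
```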
- Section 5.5. Orthogonal and orthonormal sets of vectors;
orthogonal matrices; connection with least squares. Skip pp. 265-270.