Chapter 4: General Vector Spaces

  • 4.5 Dimension
  • Problem 5
  • Find a basis for the solution space of a homogeneous linear system, and determine the dimension of that space.
  • Problem 9
  • Determine the dimension of certain subspaces of square matrices (symmetric, diagonal, upper triangular, etc.)
  • Problem 15
  • Given a linearly independent set of vectors, enlarge it to a basis.
  • Problem 25
  • Explain why any subspace of a finite-dimensional vector space is also finite dimensional.
  • 4.6 Change of Basis
  • Problems 1,3, and 5
  • Find transition matrices from one basis into another.
  • Be able to compute coordinate vectors.
  • Use transition matrices to find a coordinate vector in a different basis.
  • Problem 11
  • This problem tested whether you knew how to find transition matrices, but it also involved reflecting vectors about a line and taking the transpose of a matrix.
  • Problem 15
  • This problem gave a matrix and asked: if it were a transition matrix into the standard basis, what would the original basis be? It also asked: if it were a transition matrix from the standard basis, what would the new basis be?
  • Problem 17
  • This problem transformed the standard basis vectors, and then asked what the transition matrix back to the standard basis vectors would be.
  • 4.7 Row Space, Column Space, and Null Space
  • Problem 1
  • This asks to express a matrix-vector product (Ax) as a linear combination of the columns of A.
  • Problem 3
  • Determine whether a vector is in the column space of a matrix.
  • Problem 9
  • Find bases for the null space and row space of a matrix.
  • Problem 13
  • Find bases for the row space and column space of a matrix
  • Part b: find a basis for the row space consisting of only rows of the original matrix (not the row reduced one)
  • Problem 15
  • Gives 4 vectors and asks for a basis for their span.
  • 4.8 Rank, Nullity, and the Fundamental Matrix Spaces
  • Problem 1
  • Asks for the rank and nullity of a matrix.
  • Problem 5
  • Given the reduced row echelon form of a matrix
  • Find its rank
  • Find its nullity
  • Verify that the number of columns in the matrix equals the sum of the rank and the nullity.
  • Be able to count the number of leading variables (pivot positions)
  • Be able to count the number of parameters in solutions to the homogeneous equation (Ax = 0).
  • Problems 14 & 15
  • Gives a matrix with variables in some of the entries, and then asks how the variables affect the rank.
  • Problem 25
  • Gives a matrix with 4 rows and 6 columns, and then asks us to show that the null space of the transpose is orthogonal to the column space of the original matrix.
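
The rank-nullity bookkeeping in 4.8 can be checked numerically. Here is a minimal sketch in Python (assuming NumPy is available; the matrix is invented for illustration, not taken from the textbook):

```python
import numpy as np

# Invented 3x4 matrix for illustration (not from the textbook)
A = np.array([[1., 2., 0., 1.],
              [2., 4., 1., 3.],
              [3., 6., 1., 4.]])

rank = np.linalg.matrix_rank(A)   # number of leading variables (pivot positions)
nullity = A.shape[1] - rank       # number of parameters in solutions to Ax = 0

# Rank-nullity theorem: rank + nullity = number of columns
assert rank + nullity == A.shape[1]
```

For this particular matrix the rank comes out to 2 and the nullity to 2, matching the two pivot columns you would see after row reducing by hand.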

Chapter 5: Eigenvalues and Eigenvectors

  • 5.1 Eigenvalues and Eigenvectors
  • Problem 3
  • Gives a matrix and a vector. Asks us to confirm that the vector is an eigenvector, and to find its eigenvalue.
  • Problems 5,7,11
  • Gives a matrix, and asks for
  • The characteristic equation
  • The eigenvalues
  • A basis for each eigenspace.
  • Problems 28-29
  • These problems proved that the characteristic equation of a 2 by 2 matrix A can be expressed as λ² − tr(A)λ + det(A) = 0. Then they worked through how you could determine whether the eigenvalues of A were real, repeated, or imaginary using the discriminant from the quadratic formula.
  • Problem 33
  • This asked us to show that the eigenvalues of the inverse of a matrix are the reciprocals of the eigenvalues of the original matrix.
  • True/False
  • Know that an eigenvector of a square matrix A is a non-zero vector x such that Ax = λx for some scalar λ.
  • Know that an eigenvalue λ will have the property that there are non-zero vectors x where (A-λI)x = 0.
  • Know that if a square matrix does not have 0 as an eigenvalue, then it is invertible.
  • Know that the eigenspace of a matrix A corresponding to λ is the set of all (potentially zero) vectors x such that Ax = λx.
  • (This is tricky. Eigenvectors by definition cannot be zero. However, eigenspaces include the zero vector. This might seem kind of annoying, but it really comes from two conflicting demands we have. First, we want eigenvectors to not be zero because otherwise every scalar would be an eigenvalue with eigenvector 0. And, we want eigenspaces to actually be subspaces, so they have to include 0.)
  • Know that row reducing a matrix generally changes its eigenvalues.
  • Be familiar with the Equivalent Statements theorem (Theorem 5.1.5)
  • 5.2 Diagonalization
  • Problem 3
  • Know that determinants, invertibility, rank, nullity, trace, characteristic polynomials, eigenvalues, and the dimensions of each eigenspace are preserved when two matrices are similar. So, if two matrices do not share all of these things, they cannot be similar.
  • Problem 9
  • This problem tested whether you know that a matrix is diagonalizable only when the algebraic multiplicity equals the geometric multiplicity for each eigenvalue.
  • Problem 11
  • Find geometric and algebraic multiplicities of eigenvalues. Use this information to determine if a matrix is diagonalizable. If it is diagonalizable, know how to diagonalize the matrix.
  • Problem 21
  • Use diagonalization to find a formula for A^n that is faster than multiplying A by itself n times.
  • Problem 37
  • This proof asked you to show that if A was diagonalizable, then A^k is also diagonalizable for all whole number values of k.
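
The 5.2 workflow (find the eigenvalues, build P and D, then use A = P D P⁻¹ to compute powers) can be sketched numerically. The 2 by 2 matrix below is an invented example, not one of the book's:

```python
import numpy as np

A = np.array([[4., 1.],
              [2., 3.]])          # invented example; eigenvalues are 5 and 2

evals, P = np.linalg.eig(A)      # columns of P are eigenvectors
D = np.diag(evals)

# Diagonalization: A = P D P^{-1}
assert np.allclose(A, P @ D @ np.linalg.inv(P))

# Fast powers: A^n = P D^n P^{-1}, so only the diagonal gets raised to n
n = 5
An = P @ np.diag(evals ** n) @ np.linalg.inv(P)
assert np.allclose(An, np.linalg.matrix_power(A, n))
```

This is exactly why diagonalization speeds up computing A^n: raising a diagonal matrix to a power only requires raising its diagonal entries to that power.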

Chapter 6: Inner Product Spaces

  • 6.1 Inner Products
  • Problem 1
  • Given a formula for an inner product, and two vectors u and v, know how to calculate:
  • <u, v> (Their inner product)
  • ||u|| (Length, with respect to this inner product)
  • d(u,v) (Distance between them, with respect to this inner product)
  • Problem 9
  • Know the standard inner product on matrices. (Note to Logan: Later classes call this the Frobenius inner product.)
  • Problem 11
  • Know the standard inner product for polynomials (Not the integral one, but the one where you just multiply and add coefficients)
  • Problem 19
  • Be able to find ‘lengths’ of polynomials and ‘distances between’ polynomials using the standard inner product for polynomials.
  • Problem 37
  • This problem was one where the inner product was an integral. Otherwise, the stuff you needed to know was identical to problem 1.
  • 6.2 Angle and Orthogonality in Inner Product Spaces
  • Problems 1,3,5
  • Given an inner product and two vectors, be able to find the cosine of the ‘angle’ between them.
  • Problems 7,9,11
  • Given two vectors and an inner product, be able to determine if they are orthogonal.
  • Problem 27
  • Given some vectors in R^n, be able to find a basis for the orthogonal complement of their span.
  • Problem 33
  • Know how to compute integral inner products, and use them to find ‘lengths’ of functions.
  • Problem 41
  • This was a proof. It asked you to show that if a vector w was orthogonal to each of the vectors v1, …, vk, then it was orthogonal to their span.
  • Problem 43
  • This was along the same lines as 41. It asked us to show why the orthogonal complement of a subspace W was the set of all vectors orthogonal to a basis for W.
  • 6.3 Gram-Schmidt Process; QR-Decomposition
  • Problems 29 & 31
  • Both of these problems just had you apply the Gram-Schmidt process to a set of vectors. In both cases, the inner product you used was the dot product (the book called it the inner product)
  • 6.4 Best Approximation; Least Squares
  • Problems 3&5
  • Both of these problems gave you a matrix A and a vector b, then asked you to find the least squares solution to the equation Ax = b.
  • Problem 17
  • This problem asked you to find the projection of a vector onto a span of a set of other vectors.
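
The 6.4 problems reduce to the normal equations A^T A x = A^T b. Here is a small sketch with an invented A and b, which also recovers the projection of b onto the column space of A:

```python
import numpy as np

A = np.array([[1., 1.],
              [1., 2.],
              [1., 3.]])          # invented example (full column rank)
b = np.array([1., 2., 2.])

# Least-squares solution from the normal equations A^T A x = A^T b
x = np.linalg.solve(A.T @ A, A.T @ b)
assert np.allclose(x, np.linalg.lstsq(A, b, rcond=None)[0])

# A x is the projection of b onto the column space of A,
# so the residual b - A x is orthogonal to every column of A
proj = A @ x
assert np.allclose(A.T @ (b - proj), 0.0)
```

The orthogonality check at the end is the whole point of least squares: the best approximation to b from the column space leaves a residual perpendicular to that space.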

Chapter 7: Diagonalization and Quadratic Forms

  • 7.1 Orthogonal Matrices
  • Problem 3
  • This problem tested whether or not you could identify an orthogonal matrix.
  • Problem 26
  • This was a pretty tough question. It asked you to show that a 2 by 2 orthogonal matrix has one of these forms: [cos θ, −sin θ; sin θ, cos θ] (a rotation) or [cos θ, sin θ; sin θ, −cos θ] (a reflection).
  • 7.2 Orthogonal Diagonalization
  • Problem 3
  • Find the characteristic polynomial of a symmetric matrix, and determine the dimension of each of its eigenspaces without finding bases for them. (This problem wants you to take advantage of the fact that symmetric matrices always have geometric multiplicity equal to algebraic multiplicity.)
  • Problems 7 & 11
  • These problems have you orthogonally diagonalize a matrix.
  • Problem 26
  • This was a proof. It had a very similar flavor to the spectral decomposition of a matrix.
  • 7.3 Quadratic Forms
  • Problem 1
  • This gave you a function that was a quadratic form, and then asked you to find a symmetric matrix A where we could express the quadratic form as x^T A x.
  • Problems 5 & 7
  • These problems gave a quadratic form, and then asked you to make an orthogonal change of variables so that it no longer had cross product terms.
  • Problems 13 & 15
  • These problems gave you an equation for a conic section centered at the origin. They asked you to rotate its principal axes so that it is in standard position. Then, they asked you to identify what kind of conic section it is (circle, ellipse, parabola, or hyperbola). They also asked for the angle of rotation.
  • 7.4 Optimization Using Quadratic Forms
  • Problems 13&15
  • Both of these problems gave a function f, and then asked you to
  • Find the critical points of f.
  • Classify the critical points as either relative maxima, relative minima, or saddle points.

Chapter 8: General Linear Transformations

  • 8.1 General Linear Transformations
  • Problems 3, 5, 7, and 9
  • These problems give a transformation between vector spaces, and then ask if it’s linear. If it’s linear, they ask for the kernel.
  • Problem 23
  • This problem gave a transformation from 3rd degree polynomials to 2nd degree polynomials. It asked us to
  • Show it was linear.
  • Find a basis for its kernel.
  • Find a basis for its range.
  • Problem 28
  • This asked for the dimension of the kernel of the trace function.
  • 8.2 Compositions of Linear Transformations
  • Problem 1
  • This exercise had you find the kernel of a transformation to determine if the transformation was one-to-one.
  • Problem 3
  • This exercise had you find the nullity of a matrix, and then use that to determine if multiplication by that matrix was a one-to-one transformation.
  • Problem 13
  • This gave you three linear transformations, and then asked you to give a formula for the transformation that was the composition of all of them.
  • Problem 21
  • This gave a transformation T defined in terms of some constants, then it asked what conditions on the constants would make the transformation one-to-one. Then, it asked for a formula for the inverse of T.
  • Problem 29
  • This problem had you
  • Show differentiation and integration are linear functions.
  • Explain why integration is not technically an inverse to differentiation if the domain of the derivative includes constants.
  • Explain how to restrict the domain of the derivative so that integration does become an inverse.
  • Problem 31
  • This gave a transformation that was a definite integral. It asked if it was a one-to-one transformation.
  • 8.3 Isomorphism
  • Problems 3 and 5
  • These problems gave a transformation, and asked if it was an isomorphism.
  • Problem 9
  • This problem asked you to create an isomorphism between two different vector spaces.
  • Problem 23
  • This was a proof. It asked you to show that if U was isomorphic to V and V was isomorphic to W, then U was isomorphic to W.
  • 8.4 Matrices for General Linear Transformations
  • Problem 1
  • This gave a transformation from second degree polynomials to 3rd degree polynomials. It asked for the matrix representation with respect to the power basis.
  • It then asked you to verify that if you first found the coordinate vector of a polynomial and then multiplied it by the matrix representation of the transformation, you got the same thing as if you first transformed the polynomial and then found the coordinate vector.
  • Problem 3
  • This was another transformation like problem 1, but this time it was an operator on polynomials of degree 2
  • Problem 5
  • This gave a transformation, and then asked for its matrix representation in a super-funky basis.
  • Afterwards, it still wanted you to check that if you first found the coordinate vector and then multiplied by the matrix representation, you would get the same thing as applying the transformation first and then finding the coordinate vector.
  • Problems 9 and 11
  • These problems gave the matrix representation of a linear transformation relative to a funky basis. They first asked for the coordinate vectors of the images of each of the basis vectors. Then, they asked for the images themselves. Then they asked for an explicit formula for the transformation. Then they asked you to use the explicit formula on one specific vector.
  • Problem 15
  • This gave a transformation from polynomials to 2 by 2 matrices. It asked for matrix representations in different bases, and then did the same old thing where it has you find a coordinate vector first, multiply by the matrix representation, and then use the resulting coordinate vector to find the actual image of the vector. Then, it had you do the transformation directly to see if you get the same result.
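
The 8.4 recipe (the columns of the matrix representation are the coordinate vectors of the images of the basis vectors) can be sketched with differentiation from 3rd degree to 2nd degree polynomials in the power bases. The sample polynomial is invented:

```python
import numpy as np

# Differentiation D: P3 -> P2, with coordinates relative to the power bases
# {1, x, x^2, x^3} and {1, x, x^2}. Since D(1) = 0, D(x) = 1, D(x^2) = 2x,
# and D(x^3) = 3x^2, the columns of M are those coordinate vectors.
M = np.array([[0., 1., 0., 0.],
              [0., 0., 2., 0.],
              [0., 0., 0., 3.]])

# Invented example: p(x) = 2 + 5x - x^2 + 4x^3, coordinate vector:
p = np.array([2., 5., -1., 4.])

# Multiplying the coordinate vector by M gives the coordinates of
# p'(x) = 5 - 2x + 12x^2, the same thing as differentiating first
# and then taking coordinates.
assert np.allclose(M @ p, [5., -2., 12.])
```

This is the "coordinate vector first, then multiply" check that problems 1, 3, 5, and 15 keep repeating, done once in code.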