Instructions: You Are Allowed To Use 1 Sheet (8 1/2 by 11 Inches, Both Sides) of Notes
NAME (1 pt): TA (1 pt): Name of Neighbor to your left (1 pt): Name of Neighbor to your right (1 pt):
Instructions: You are allowed to use 1 sheet (8 1/2 by 11 inches, both sides) of notes. Otherwise this is a closed book, closed notes, closed calculator, closed computer, closed PDA, closed cellphone, closed mp3 player, closed network, open brain exam. You get one point each for filling in the 4 lines at the top of this page. All other questions are worth 10 points. Fill in the questions at the top of the page. Then stop and wait until we tell you to turn the page and start the rest of the exam. Do not start reading the rest of the exam until we tell you to start. After you start, read all the questions on the exam before you answer any of them, so you do the ones you find easier first. Write all your answers on this exam. If you need scratch paper, ask for it, write your name on each sheet, and attach it when you turn it in (we have a stapler).
1 2 3 4 5 6 7 8 Total
Question 1. (10 points) (version 1) Let $P(\mathbb{R})$ be the vector space of all real polynomials. Define the linear map $T : P(\mathbb{R}) \to P(\mathbb{R})$ by $T(f) = f''$, the second derivative. Part 1: Show that $T$ is onto but not one-to-one. Answer: If $g(x) = \sum_{i=0}^{d} g_i x^i$, then $T(f) = g$ where $f(x) = f_0 + f_1 x + \sum_{i=0}^{d} \frac{g_i x^{i+2}}{(i+1)(i+2)}$ (so $T$ is onto), and $f_0$ and $f_1$ are arbitrary (so $T$ is not one-to-one). Part 2: Describe all eigenvectors of $T$, i.e. nonzero polynomials $f$ such that $f''(x) = \lambda f(x)$ for some $\lambda$. Answer: Since the degree of $f''$ is less than the degree of $f$, $f''$ cannot be a nonzero multiple of $f$. Therefore $\lambda = 0$, and $f''(x) = 0$ is satisfied by all linear polynomials $f(x) = f_0 + f_1 x$. Question 1. (10 points) (version 2) Let $P(\mathbb{C})$ be the vector space of all complex polynomials. Define the linear map $S : P(\mathbb{C}) \to P(\mathbb{C})$ by $S(g) = g''$, the second derivative. Part 1: Show that $S$ is onto but not one-to-one. Answer: If $f(x) = \sum_{i=0}^{d} f_i x^i$, then $S(g) = f$ where $g(x) = g_0 + g_1 x + \sum_{i=0}^{d} \frac{f_i x^{i+2}}{(i+1)(i+2)}$ (so $S$ is onto), and $g_0$ and $g_1$ are arbitrary (so $S$ is not one-to-one). Part 2: Describe all eigenvectors of $S$, i.e. nonzero polynomials $g$ such that $g''(x) = \lambda g(x)$ for some $\lambda$. Answer: Since the degree of $g''$ is less than the degree of $g$, $g''$ cannot be a nonzero multiple of $g$. Therefore $\lambda = 0$, and $g''(x) = 0$ is satisfied by all linear polynomials $g(x) = g_0 + g_1 x$.
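The onto/not-one-to-one argument of Part 1 can be checked exactly with coefficient lists (`c[i]` holds the coefficient of $x^i$); the helper names below are illustrative, not from the exam:

```python
from fractions import Fraction

def second_derivative(c):
    # T(f) = f'': the coefficient of x^i in f'' is (i+1)(i+2) * c[i+2]
    return [Fraction((i + 1) * (i + 2)) * c[i + 2] for i in range(len(c) - 2)]

def preimage(g, f0=0, f1=0):
    # f(x) = f0 + f1*x + sum_i g[i] x^(i+2) / ((i+1)(i+2)) satisfies f'' = g
    return [Fraction(f0), Fraction(f1)] + \
           [Fraction(g[i], (i + 1) * (i + 2)) for i in range(len(g))]

g = [Fraction(n) for n in (5, -3, 7)]   # g(x) = 5 - 3x + 7x^2
f = preimage(g, f0=1, f1=2)             # f0, f1 arbitrary: T is not one-to-one
assert second_derivative(f) == g        # T(f) = g: T is onto
```

Running with any other choice of `f0`, `f1` produces a different preimage of the same `g`, which is exactly the failure of injectivity.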
Question 2. (10 points) (version 1) Let $A$ be an m-by-n complex matrix, and let $B$ be an n-by-m complex matrix. Show that $I_m + AB$ is invertible if and only if $I_n + BA$ is invertible. Answer: Solution 1: Suppose $I_m + AB$ is invertible, and $(I_n + BA)v = 0$; we need to show $v = 0$. Multiply by $A$ to get $0 = A(I_n + BA)v = Av + ABAv = (I_m + AB)(Av)$, so $Av = 0$ since $I_m + AB$ is invertible. But then $v = -BAv = 0$ as desired. Thus $I_m + AB$ invertible implies $I_n + BA$ invertible. The converse follows by the same argument with the roles of $A$ and $B$ exchanged. Solution 2: From the practice final, we know $AB$ and $BA$ have the same nonzero eigenvalues. Therefore, $-1$ is an eigenvalue of $AB$ if and only if it is an eigenvalue of $BA$, implying $0$ is an eigenvalue of $I_m + AB$ if and only if it is an eigenvalue of $I_n + BA$, so that $I_m + AB$ is singular if and only if $I_n + BA$ is singular. Question 2. (10 points) (version 2) Let $X$ be an m-by-n real matrix, and let $Y$ be an n-by-m real matrix. Show that $I_m + XY$ is invertible if and only if $I_n + YX$ is invertible. Answer: Solution 1: Suppose $I_m + XY$ is invertible, and $(I_n + YX)v = 0$; we need to show $v = 0$. Multiply by $X$ to get $0 = X(I_n + YX)v = Xv + XYXv = (I_m + XY)(Xv)$, so $Xv = 0$ since $I_m + XY$ is invertible. But then $v = -YXv = 0$ as desired. Thus $I_m + XY$ invertible implies $I_n + YX$ invertible. The converse follows by the same argument with the roles of $X$ and $Y$ exchanged. Solution 2: From the practice final, we know $XY$ and $YX$ have the same nonzero eigenvalues. Therefore, $-1$ is an eigenvalue of $XY$ if and only if it is an eigenvalue of $YX$, implying $0$ is an eigenvalue of $I_m + XY$ if and only if it is an eigenvalue of $I_n + YX$, so that $I_m + XY$ is singular if and only if $I_n + YX$ is singular.
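A numerical illustration, using the standard (and stronger) fact known as Sylvester's determinant identity, $\det(I_m + AB) = \det(I_n + BA)$; the example matrices below are arbitrary:

```python
def det(M):
    # Laplace expansion along the first row (fine for tiny matrices)
    if len(M) == 1:
        return M[0][0]
    return sum((-1) ** j * M[0][j] * det([row[:j] + row[j + 1:] for row in M[1:]])
               for j in range(len(M)))

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def plus_identity(M):
    return [[M[i][j] + (1 if i == j else 0) for j in range(len(M))]
            for i in range(len(M))]

A = [[1, 2, 0], [3, -1, 4]]        # 2-by-3
B = [[2, 1], [0, 5], [-1, 3]]      # 3-by-2
d1 = det(plus_identity(matmul(A, B)))   # det(I_2 + AB)
d2 = det(plus_identity(matmul(B, A)))   # det(I_3 + BA)
assert d1 == d2 != 0               # nonsingular together, as the question asserts
```

The question only asks for invertibility of one to imply the other, which is the special case "one determinant is nonzero iff the other is."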
Question 3. (10 points) (version 1) Let $A = P_R L U P_C$ be an LU decomposition of the m-by-n real matrix $A$ of rank $r > 0$. Thus $P_R$ and $P_C$ are permutation matrices, $L$ is m-by-r and unit lower triangular, and $U$ is r-by-n and upper triangular with $U_{ii}$ nonzero. Show how to express an LU decomposition of $A^t$ using simple modifications of the parts of this LU decomposition of $A$. Answer: Write $U = D\bar{U}$, where $D$ is r-by-r and diagonal with $D_{ii} = U_{ii}$, so $\bar{U}$ is unit upper triangular. Then $A = P_R L U P_C = P_R L D \bar{U} P_C$, so $A^t = P_C^t\, \bar{U}^t\, (D L^t)\, P_R^t$. Here $P_C^t$ and $P_R^t$ are permutation matrices, $\bar{U}^t$ is n-by-r and unit lower triangular, and $D L^t$ is r-by-m and upper triangular with nonzero diagonal entries $D_{ii}$. This is an LU decomposition of $A^t$. Question 3. (10 points) (version 2) Let $B = P_R L U P_C$ be an LU decomposition of the m-by-n complex matrix $B$ of rank $r > 0$. Thus $P_R$ and $P_C$ are permutation matrices, $L$ is m-by-r and unit lower triangular, and $U$ is r-by-n and upper triangular with $U_{ii}$ nonzero. Show how to express an LU decomposition of $B^*$ using simple modifications of the parts of this LU decomposition of $B$. Answer: Write $U = D\bar{U}$, where $D$ is r-by-r and diagonal with $D_{ii} = U_{ii}$, so $\bar{U}$ is unit upper triangular. Then $B = P_R L U P_C = P_R L D \bar{U} P_C$, so $B^* = P_C^t\, \bar{U}^*\, (D^* L^*)\, P_R^t$, using $P^* = P^t$ for permutation matrices. Here $\bar{U}^*$ is n-by-r and unit lower triangular, and $D^* L^*$ is r-by-m and upper triangular with nonzero diagonal entries $\overline{D_{ii}}$. This is an LU decomposition of $B^*$.
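The identity $A^t = P_C^t \bar{U}^t (D L^t) P_R^t$ can be checked exactly on a small concrete example (the factors below are made up for illustration):

```python
from fractions import Fraction

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def transpose(M):
    return [list(row) for row in zip(*M)]

PR = [[1, 0, 0], [0, 0, 1], [0, 1, 0]]     # row permutation
L  = [[1, 0], [2, 1], [3, 4]]              # 3-by-2 unit lower triangular
U  = [[Fraction(5), 1, 2], [0, Fraction(3), 1]]  # 2-by-3, U11, U22 nonzero
PC = [[0, 1, 0], [1, 0, 0], [0, 0, 1]]     # column permutation
A  = matmul(matmul(PR, L), matmul(U, PC))

D    = [[Fraction(5), 0], [0, Fraction(3)]]          # D_ii = U_ii
Ubar = [[1, Fraction(1, 5), Fraction(2, 5)],
        [0, 1, Fraction(1, 3)]]                      # unit upper triangular
assert matmul(D, Ubar) == U                          # U = D * Ubar

# LU decomposition of A^t: row perm P_C^t, unit-lower factor Ubar^t,
# upper-triangular factor D * L^t, column perm P_R^t
At = matmul(matmul(transpose(PC), transpose(Ubar)),
            matmul(matmul(D, transpose(L)), transpose(PR)))
assert At == transpose(A)
```

Note that transposing swaps the roles of the two permutations and of the unit-triangular and nonzero-diagonal factors, which is why the rescaling $U = D\bar{U}$ is needed first.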
Question 4. (10 points) (version 1) Let $T : V \to V$ be a linear operator. Suppose that $T(v_i) = \lambda_i v_i$ for $i = 1, \dots, m$, and all the $\lambda_i$ are distinct. If $W$ is an invariant subspace of $T$ and includes the vector $\sum_{i=1}^{m} a_i v_i$, where all the $a_i \neq 0$, then prove that $W$ contains $v_i$ for $i = 1, \dots, m$. Answer: Let $w_1 = \sum_{i=1}^{m} a_i v_i$. Since $w_1 \in W$, so are $w_2 = T w_1$ through $w_m = T^{m-1} w_1$, since $W$ is invariant; note that $w_j = \sum_{i=1}^{m} a_i \lambda_i^{j-1} v_i$. Let $V = [v_1, \dots, v_m]$ and $W = [w_1, \dots, w_m]$ be the matrices with these columns. Let $A_{ij} = a_i \lambda_i^{j-1}$ be m-by-m. Then we can express the dependence of all the $w_j$ on all the $v_i$ by $W = V A$. Now $A = \mathrm{diag}(a_1, a_2, \dots, a_m) \cdot B$, where $B_{ij} = \lambda_i^{j-1}$. Thus $B$ is the m-by-m Vandermonde matrix with distinct $\lambda_i$, and so nonsingular by homework question 4.3.22(c). Thus $A$ is the product of nonsingular matrices and also nonsingular. Thus $W A^{-1} = V$, so all the columns of $V$, namely the $v_i$, are linear combinations of the $w_j$, and so lie within the subspace $W$ as desired. Question 4. (10 points) (version 2) Let $S : X \to X$ be a linear operator. Suppose that $S(x_i) = \lambda_i x_i$ for $i = 1, \dots, m$, and all the $\lambda_i$ are distinct. If $Y$ is an invariant subspace of $S$ and includes the vector $\sum_{i=1}^{m} b_i x_i$, where all the $b_i \neq 0$, then prove that $Y$ contains $x_i$ for $i = 1, \dots, m$. Answer: Let $y_1 = \sum_{i=1}^{m} b_i x_i$. Since $y_1 \in Y$, so are $y_2 = S y_1$ through $y_m = S^{m-1} y_1$, since $Y$ is invariant; note that $y_j = \sum_{i=1}^{m} b_i \lambda_i^{j-1} x_i$. Let $X = [x_1, \dots, x_m]$ and $Y = [y_1, \dots, y_m]$ be the matrices with these columns. Let $A_{ij} = b_i \lambda_i^{j-1}$ be m-by-m. Then we can express the dependence of all the $y_j$ on all the $x_i$ by $Y = X A$. Now $A = \mathrm{diag}(b_1, b_2, \dots, b_m) \cdot B$, where $B_{ij} = \lambda_i^{j-1}$. Thus $B$ is the m-by-m Vandermonde matrix with distinct $\lambda_i$, and so nonsingular by homework question 4.3.22(c). Thus $A$ is the product of nonsingular matrices and also nonsingular. Thus $Y A^{-1} = X$, so all the columns of $X$, namely the $x_i$, are linear combinations of the $y_j$, and so lie within the subspace $Y$ as desired.
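The crux of the proof is that $A = \mathrm{diag}(a_1, \dots, a_m)\, B$ is nonsingular whenever the $a_i$ are nonzero and the $\lambda_i$ distinct. A small exact check (the values are chosen arbitrarily):

```python
from fractions import Fraction

def det(M):
    # Laplace expansion along the first row (fine for tiny matrices)
    if len(M) == 1:
        return M[0][0]
    return sum((-1) ** j * M[0][j] * det([row[:j] + row[j + 1:] for row in M[1:]])
               for j in range(len(M)))

lam = [1, 2, 3]     # distinct eigenvalues
a = [2, -1, 5]      # all nonzero
m = len(lam)
# A_ij = a_i * lam_i^(j-1): diag(a) times the Vandermonde matrix B_ij = lam_i^(j-1)
A = [[Fraction(a[i] * lam[i] ** j) for j in range(m)] for i in range(m)]
assert det(A) != 0  # so V = W A^{-1}: each v_i is a combination of the w_j
```

Here $\det A = (\prod_i a_i)\,\det B = (2)(-1)(5)\cdot(2-1)(3-1)(3-2) = -20$, matching the Vandermonde determinant formula.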
Question 5. (10 points) (version 1) Let $A = X \Lambda X^{-1}$ be diagonalizable ($\Lambda$ is diagonal). Let $X = QR$ be the QR decomposition of $X$, so that $Q$ is unitary and $R$ upper triangular. Show that $T = Q^* A Q$ is upper triangular. What is the name we gave to the matrix factorization $A = Q T Q^*$? Answer: $A = X \Lambda X^{-1} = Q R \Lambda R^{-1} Q^*$, since $Q^{-1} = Q^*$, and so $Q^* A Q = R \Lambda R^{-1} = T$. Since $R$ is upper triangular and nonsingular, so is $R^{-1}$, and so $R \Lambda R^{-1}$ is a product of upper triangular matrices and so upper triangular as desired. This is the Schur decomposition. Question 5. (10 points) (version 2) Let $B = Z \Lambda Z^{-1}$ be diagonalizable ($\Lambda$ is diagonal). Let $Z = QR$ be the QR decomposition of $Z$, so that $Q$ is unitary and $R$ upper triangular. Show that $T = Q^* B Q$ is upper triangular. What is the name we gave to the matrix factorization $B = Q T Q^*$? Answer: $B = Z \Lambda Z^{-1} = Q R \Lambda R^{-1} Q^*$, since $Q^{-1} = Q^*$, and so $Q^* B Q = R \Lambda R^{-1} = T$. Since $R$ is upper triangular and nonsingular, so is $R^{-1}$, and so the product $R \Lambda R^{-1}$ is a product of upper triangular matrices and so upper triangular as desired. This is the Schur factorization.
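A numerical illustration on a small diagonalizable matrix, using classical Gram–Schmidt for the QR step (the example matrix is arbitrary, with eigenvalues 2 and 3 and eigenvectors $(1,2)$ and $(1,1)$):

```python
import math

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def transpose(M):
    return [list(row) for row in zip(*M)]

def qr_gram_schmidt(X):
    # Orthonormalize the columns of X; returns (Q, R) with X = Q R
    cols = transpose(X)
    qcols, n = [], len(cols)
    R = [[0.0] * n for _ in range(n)]
    for j in range(n):
        v = list(cols[j])
        for i, q in enumerate(qcols):
            R[i][j] = sum(q[k] * cols[j][k] for k in range(len(v)))
            v = [v[k] - R[i][j] * q[k] for k in range(len(v))]
        R[j][j] = math.sqrt(sum(x * x for x in v))
        qcols.append([x / R[j][j] for x in v])
    return transpose(qcols), R

A = [[4.0, -1.0], [2.0, 1.0]]    # A = X diag(2,3) X^{-1}
X = [[1.0, 1.0], [2.0, 1.0]]     # eigenvector columns (1,2) and (1,1)
Q, R = qr_gram_schmidt(X)
T = matmul(transpose(Q), matmul(A, Q))   # T = Q* A Q = R Lambda R^{-1}
assert abs(T[1][0]) < 1e-9               # T is upper triangular
assert abs(T[0][0] - 2) < 1e-9 and abs(T[1][1] - 3) < 1e-9  # eigenvalues on diagonal
```

The diagonal of $T$ carries the eigenvalues, in the order in which the eigenvector columns of $X$ were given.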
Question 6. (10 points) (version 1) Let $\langle x, y \rangle = \sum_{i=1}^{m} x_i y_i = x^t y$ be the standard dot product on $\mathbb{R}^m$. An m-by-m real symmetric matrix $T = T^t$ is called positive definite if $\langle T x, x \rangle$ is positive for all nonzero vectors $x \in \mathbb{R}^m$. We define the function $\langle \cdot, \cdot \rangle_T : \mathbb{R}^m \times \mathbb{R}^m \to \mathbb{R}$ by $\langle x, y \rangle_T = \langle T x, y \rangle$. Show that $\langle x, y \rangle_T$ is an inner product on $\mathbb{R}^m$ if and only if $T$ is symmetric and positive definite. Answer: First assume $T$ is symmetric and positive definite; we need to confirm that $\langle \cdot, \cdot \rangle_T$ satisfies the axioms of an inner product. Linearity: $\langle \alpha_1 x_1 + \alpha_2 x_2, y \rangle_T = \langle T(\alpha_1 x_1 + \alpha_2 x_2), y \rangle = \langle \alpha_1 T x_1 + \alpha_2 T x_2, y \rangle = \alpha_1 \langle T x_1, y \rangle + \alpha_2 \langle T x_2, y \rangle = \alpha_1 \langle x_1, y \rangle_T + \alpha_2 \langle x_2, y \rangle_T$ as desired. Symmetry: $\langle x, y \rangle_T = (Tx)^t y = x^t T^t y = x^t T y$, and since a scalar equals its own transpose, $x^t T y = (x^t T y)^t = y^t T^t x = y^t T x = (Ty)^t x = \langle y, x \rangle_T$. Positivity: $x \neq 0$ implies $\langle x, x \rangle_T = \langle T x, x \rangle > 0$ as desired. Now assume $\langle x, y \rangle_T$ is an inner product. Then $T_{ij} = e_i^t T e_j = \langle T e_j, e_i \rangle = \langle e_j, e_i \rangle_T = \langle e_i, e_j \rangle_T = \langle T e_i, e_j \rangle = e_j^t T e_i = T_{ji}$, so $T$ is symmetric. Also $x \neq 0$ implies $0 < \langle x, x \rangle_T = \langle T x, x \rangle$, so $T$ is positive definite.
Question 6. (10 points) (version 2) Let $\langle u, v \rangle = \sum_{i=1}^{n} u_i v_i = u^t v$ be the standard dot product on $\mathbb{R}^n$. An n-by-n real symmetric matrix $X = X^t$ is called positive definite if $\langle X u, u \rangle$ is positive for all nonzero vectors $u \in \mathbb{R}^n$. We define the function $\langle \cdot, \cdot \rangle_X : \mathbb{R}^n \times \mathbb{R}^n \to \mathbb{R}$ by $\langle u, v \rangle_X = \langle X u, v \rangle$. Show that $\langle u, v \rangle_X$ is an inner product on $\mathbb{R}^n$ if and only if $X$ is symmetric and positive definite. Answer: First assume $X$ is symmetric and positive definite; we need to confirm that $\langle \cdot, \cdot \rangle_X$ satisfies the axioms of an inner product. Linearity: $\langle \alpha_1 u_1 + \alpha_2 u_2, v \rangle_X = \langle X(\alpha_1 u_1 + \alpha_2 u_2), v \rangle = \langle \alpha_1 X u_1 + \alpha_2 X u_2, v \rangle = \alpha_1 \langle X u_1, v \rangle + \alpha_2 \langle X u_2, v \rangle = \alpha_1 \langle u_1, v \rangle_X + \alpha_2 \langle u_2, v \rangle_X$ as desired. Symmetry: $\langle u, v \rangle_X = (Xu)^t v = u^t X^t v = u^t X v$, and since a scalar equals its own transpose, $u^t X v = (u^t X v)^t = v^t X^t u = v^t X u = (Xv)^t u = \langle v, u \rangle_X$. Positivity: $u \neq 0$ implies $\langle u, u \rangle_X = \langle X u, u \rangle > 0$ as desired. Now assume $\langle u, v \rangle_X$ is an inner product. Then $X_{ij} = e_i^t X e_j = \langle X e_j, e_i \rangle = \langle e_j, e_i \rangle_X = \langle e_i, e_j \rangle_X = \langle X e_i, e_j \rangle = e_j^t X e_i = X_{ji}$, so $X$ is symmetric. Also $u \neq 0$ implies $0 < \langle u, u \rangle_X = \langle X u, u \rangle$, so $X$ is positive definite.
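The three axioms can be checked exactly for one sample symmetric positive definite matrix (the matrix and vectors below are arbitrary test values, not from the exam):

```python
def ip_T(x, y, T):
    # <x, y>_T = <T x, y> = sum_i (T x)_i y_i
    Tx = [sum(T[i][k] * x[k] for k in range(len(x))) for i in range(len(x))]
    return sum(Tx[i] * y[i] for i in range(len(x)))

T = [[2, 1], [1, 2]]   # symmetric, eigenvalues 3 and 1 > 0, so positive definite
x, y, z = [3, -1], [4, 7], [1, 5]

# linearity in the first argument
lhs = ip_T([2 * x[i] + 3 * z[i] for i in range(2)], y, T)
assert lhs == 2 * ip_T(x, y, T) + 3 * ip_T(z, y, T)
# symmetry
assert ip_T(x, y, T) == ip_T(y, x, T)
# positivity
assert ip_T(x, x, T) > 0
```

Of course, passing on sample vectors only illustrates the axioms; the proof above is what establishes them for all vectors.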
Question 7. (10 points) (version 1) Find a matrix $B$ such that
$$B^2 = A = \begin{pmatrix} 10 & 6 & 6 \\ 6 & 10 & 9 \\ 0 & 0 & 1 \end{pmatrix}.$$
What are all possible sets of eigenvalues of $B$?
Answer: The eigenvalues of $A$ are $16$, $4$, and $1$. The squares of the eigenvalues of $B$ are the eigenvalues of $A$, so the eigenvalues of $B$ can be $\{\pm 4, \pm 2, \pm 1\}$, where each sign can be chosen independently. Thus there are $2^3 = 8$ possible triples in all, for all possible choices of signs.
One square root of $A$, with positive eigenvalues, is
$$B = \begin{pmatrix} 3 & 1 & 1 \\ 1 & 3 & 2 \\ 0 & 0 & 1 \end{pmatrix}.$$
We can compute it in (at least) two ways.
First way: Diagonalize $A = V \Lambda V^{-1}$ with $\Lambda = \mathrm{diag}(16, 4, 1)$,
$$V = \begin{pmatrix} 1 & 1 & 0 \\ 1 & -1 & -1 \\ 0 & 0 & 1 \end{pmatrix} \quad \text{and} \quad V^{-1} = \frac{1}{2} \begin{pmatrix} 1 & 1 & 1 \\ 1 & -1 & -1 \\ 0 & 0 & 2 \end{pmatrix}.$$
Then $B = V \Lambda^{1/2} V^{-1} = V\,\mathrm{diag}(4, 2, 1)\,V^{-1}$.
Second way: Write
$$A = \begin{pmatrix} \hat{A} & b \\ 0 & 1 \end{pmatrix}, \quad b = \begin{pmatrix} 6 \\ 9 \end{pmatrix}, \qquad B = \begin{pmatrix} \hat{B} & c \\ 0 & b_{33} \end{pmatrix}, \quad c = \begin{pmatrix} b_{13} \\ b_{23} \end{pmatrix},$$
where $\hat{A}$ and $\hat{B}$ are the leading 2-by-2 submatrices, and then note that
$$B \cdot B = \begin{pmatrix} \hat{B}^2 & (\hat{B} + b_{33} I_2)\,c \\ 0 & b_{33}^2 \end{pmatrix}.$$
Thus $B^2 = A$ requires $\hat{B}^2 = \hat{A}$, $b_{33}^2 = 1$, and $(\hat{B} + b_{33} I_2)\,c = b$. We solve for $\hat{B}$ by diagonalizing $\hat{A} = U \hat{\Lambda} U^{-1}$ where $U = \begin{pmatrix} 1 & 1 \\ 1 & -1 \end{pmatrix}$ and $\hat{\Lambda} = \mathrm{diag}(16, 4)$, so
$$\hat{B} = U\,\mathrm{diag}(4, 2)\,U^{-1} = \begin{pmatrix} 3 & 1 \\ 1 & 3 \end{pmatrix}.$$
Then $b_{33}^2 = 1$, so we take $b_{33} = 1$. Finally, we solve $(\hat{B} + I_2)\begin{pmatrix} b_{13} \\ b_{23} \end{pmatrix} = \begin{pmatrix} 6 \\ 9 \end{pmatrix}$ for $b_{13} = 1$ and $b_{23} = 2$ as desired.
The set of all possible answers consists of
$$\pm \begin{pmatrix} 3 & 1 & 1 \\ 1 & 3 & 2 \\ 0 & 0 & 1 \end{pmatrix}, \;\; \pm \begin{pmatrix} 3 & 1 & 1 \\ 1 & 3 & 4 \\ 0 & 0 & -1 \end{pmatrix}, \;\; \pm \begin{pmatrix} 1 & 3 & 3 \\ 3 & 1 & 2 \\ 0 & 0 & -1 \end{pmatrix}, \;\; \pm \begin{pmatrix} 1 & 3 & 3 \\ 3 & 1 & 0 \\ 0 & 0 & 1 \end{pmatrix}.$$
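The four candidate square roots in the answer (and their negatives) can be verified directly with exact integer arithmetic:

```python
def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

A = [[10, 6, 6], [6, 10, 9], [0, 0, 1]]
roots = [
    [[3, 1, 1], [1, 3, 2], [0, 0, 1]],
    [[3, 1, 1], [1, 3, 4], [0, 0, -1]],
    [[1, 3, 3], [3, 1, 2], [0, 0, -1]],
    [[1, 3, 3], [3, 1, 0], [0, 0, 1]],
]
for B in roots:
    assert matmul(B, B) == A
    negB = [[-x for x in row] for row in B]
    assert matmul(negB, negB) == A   # (-B)^2 = B^2, so negatives work too
```

This confirms all 8 sign choices yield genuine square roots of $A$.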
Question 7. (10 points) (version 2) Find a matrix $X$ such that
$$X^2 = Y = \begin{pmatrix} 13 & 12 & 7 \\ 12 & 13 & 7 \\ 0 & 0 & 4 \end{pmatrix}.$$
What are all possible sets of eigenvalues of $X$?
Answer: The eigenvalues of $Y$ are $25$, $1$, and $4$. The squares of the eigenvalues of $X$ are the eigenvalues of $Y$, so the eigenvalues of $X$ can be $\{\pm 5, \pm 1, \pm 2\}$, where each sign can be chosen independently. Thus there are $2^3 = 8$ possible triples in all, for all possible choices of signs.
One square root, with positive eigenvalues, is
$$X = \begin{pmatrix} 3 & 2 & 1 \\ 2 & 3 & 1 \\ 0 & 0 & 2 \end{pmatrix}.$$
We can compute it in (at least) two ways.
First way: Diagonalize $Y = V \Lambda V^{-1}$ with $\Lambda = \mathrm{diag}(25, 1, 4)$,
$$V = \begin{pmatrix} 1 & 1 & 1 \\ 1 & -1 & 1 \\ 0 & 0 & -3 \end{pmatrix} \quad \text{and} \quad V^{-1} = \frac{1}{6} \begin{pmatrix} 3 & 3 & 2 \\ 3 & -3 & 0 \\ 0 & 0 & -2 \end{pmatrix}.$$
Then $X = V \Lambda^{1/2} V^{-1} = V\,\mathrm{diag}(5, 1, 2)\,V^{-1}$.
Second way: Write
$$Y = \begin{pmatrix} \hat{Y} & b \\ 0 & 4 \end{pmatrix}, \quad b = \begin{pmatrix} 7 \\ 7 \end{pmatrix}, \qquad X = \begin{pmatrix} \hat{X} & c \\ 0 & x_{33} \end{pmatrix}, \quad c = \begin{pmatrix} x_{13} \\ x_{23} \end{pmatrix},$$
where $\hat{Y}$ and $\hat{X}$ are the leading 2-by-2 submatrices, and then note that
$$X \cdot X = \begin{pmatrix} \hat{X}^2 & (\hat{X} + x_{33} I_2)\,c \\ 0 & x_{33}^2 \end{pmatrix}.$$
Thus $X^2 = Y$ requires $\hat{X}^2 = \hat{Y}$, $x_{33}^2 = 4$, and $(\hat{X} + x_{33} I_2)\,c = b$. We solve for $\hat{X}$ by diagonalizing $\hat{Y} = U \hat{\Lambda} U^{-1}$ where $U = \begin{pmatrix} 1 & 1 \\ 1 & -1 \end{pmatrix}$ and $\hat{\Lambda} = \mathrm{diag}(25, 1)$, so
$$\hat{X} = U\,\mathrm{diag}(5, 1)\,U^{-1} = \begin{pmatrix} 3 & 2 \\ 2 & 3 \end{pmatrix}.$$
Then $x_{33}^2 = 4$, so we take $x_{33} = 2$. Finally, we solve $(\hat{X} + 2 I_2)\begin{pmatrix} x_{13} \\ x_{23} \end{pmatrix} = \begin{pmatrix} 7 \\ 7 \end{pmatrix}$ for $x_{13} = 1$ and $x_{23} = 1$ as desired.
The set of all possible answers consists of
$$\pm \begin{pmatrix} 3 & 2 & 1 \\ 2 & 3 & 1 \\ 0 & 0 & 2 \end{pmatrix}, \;\; \pm \begin{pmatrix} 2 & 3 & 1 \\ 3 & 2 & 1 \\ 0 & 0 & 2 \end{pmatrix}, \;\; \pm \begin{pmatrix} 3 & 2 & 7/3 \\ 2 & 3 & 7/3 \\ 0 & 0 & -2 \end{pmatrix}, \;\; \pm \begin{pmatrix} 2 & 3 & 7/3 \\ 3 & 2 & 7/3 \\ 0 & 0 & -2 \end{pmatrix}.$$
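As in version 1, the four candidate square roots listed in the answer can be verified exactly (using rationals for the $7/3$ entries):

```python
from fractions import Fraction

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

F = Fraction
Y = [[13, 12, 7], [12, 13, 7], [0, 0, 4]]
roots = [
    [[3, 2, 1], [2, 3, 1], [0, 0, 2]],
    [[2, 3, 1], [3, 2, 1], [0, 0, 2]],
    [[3, 2, F(7, 3)], [2, 3, F(7, 3)], [0, 0, -2]],
    [[2, 3, F(7, 3)], [3, 2, F(7, 3)], [0, 0, -2]],
]
for X in roots:
    assert matmul(X, X) == Y
    negX = [[-x for x in row] for row in X]
    assert matmul(negX, negX) == Y   # negatives are square roots as well
```

Together with their negatives these give all 8 sign patterns of the eigenvalues $\{\pm 5, \pm 1, \pm 2\}$.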
Question 8. (10 points) (version 1) Determine all possible Jordan canonical forms for matrices with the characteristic polynomial $(x - 1)^3 (x - 2)^2$. In other words, list the sets of Jordan blocks that could appear in each Jordan canonical form. Answer: Let $J_i(\lambda)$ denote an i-by-i Jordan block with eigenvalue $\lambda$. For the triple eigenvalue at 1, the possible Jordan structures are $\{J_3(1)\}$, $\{J_2(1), J_1(1)\}$, and $\{J_1(1), J_1(1), J_1(1)\}$. For the double eigenvalue at 2, the possible Jordan structures are $\{J_2(2)\}$ and $\{J_1(2), J_1(2)\}$. Since the Jordan structures for the 2 eigenvalues can be chosen independently, and the order in which they appear does not matter, there are $3 \cdot 2 = 6$ possible structures in all, for all choices from the first set and the second set. Question 8. (10 points) (version 2) Determine all possible Jordan canonical forms for matrices with the characteristic polynomial $(x - 3)^2 (x - 4)^3$. In other words, list the sets of Jordan blocks that could appear in each Jordan canonical form. Answer: Let $J_i(\lambda)$ denote an i-by-i Jordan block with eigenvalue $\lambda$. For the triple eigenvalue at 4, the possible Jordan structures are $\{J_3(4)\}$, $\{J_2(4), J_1(4)\}$, and $\{J_1(4), J_1(4), J_1(4)\}$. For the double eigenvalue at 3, the possible Jordan structures are $\{J_2(3)\}$ and $\{J_1(3), J_1(3)\}$. Since the Jordan structures for the 2 eigenvalues can be chosen independently, and the order in which they appear does not matter, there are $3 \cdot 2 = 6$ possible structures in all, for all choices from the first set and the second set.
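The count of 6 is (number of partitions of 3) times (number of partitions of 2), since the Jordan block sizes for an eigenvalue of algebraic multiplicity $k$ are exactly the partitions of $k$. A short enumeration confirms it:

```python
def partitions(n, largest=None):
    # Partitions of n into nonincreasing parts: the block sizes for one eigenvalue
    if n == 0:
        yield []
        return
    for k in range(min(n, largest or n), 0, -1):
        for rest in partitions(n - k, k):
            yield [k] + rest

triple = list(partitions(3))   # [[3], [2, 1], [1, 1, 1]]
double = list(partitions(2))   # [[2], [1, 1]]
forms = [(p, q) for p in triple for q in double]
assert len(triple) == 3 and len(double) == 2 and len(forms) == 6
```

Each pair in `forms` corresponds to one Jordan canonical form, e.g. `([2, 1], [2])` is $\{J_2(1), J_1(1), J_2(2)\}$ in version 1.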