n separately. Sorting the classes by their ranks may also help. (There may be many ways to do this, but a nice one that my son came up with about a decade ago counts the shapes of the staircases in the rref's - these shapes are in one-to-one correspondence with these classes.) [*My "Challenge Problems" may be more challenging than Strang's, so don't worry about turning these in right away - or at all! ;-]

02/01 Strang 2.5 #7,17,34,40,[513bonus]42,[513bonus]45

02/03 [01/25-27 HW due before class or to Dragos]
Strang 2.6 #12,13
Strang 2.7 #2
MATLAB/Octave: Please re-read Moler's Preface and 2.1-2.5 and do Computer Problem Set 1
http://www.gang.umass.edu/~kusner/class/513hw1.pdf

02/08 Strang 3.1#2,10,15,18,22,25,[513bonus]30,[513bonus]31

* * * [For all] Use the 8 vector space axioms (I called them A1-4 & S1-4) to show:
0) Multiplying any vector v in a vector space V by the scalar 0 yields the zero vector O in V: 0v=O, i.e. 0v+w=w for any w in V.
-1) Multiplying any vector v in a vector space V by the scalar -1 yields its additive inverse -v in V: (-1)v=-v, i.e. v+(-1)v=O.

02/10 [Last week's HW due before class or to Dragos]
Strang 3.2#5,7,10,18,[513bonus]36,[513bonus]37

02/15 Strang 3.3#3,10,17,18,26,[513bonus]27 [cf. 28 & my earlier problem]

02/17 [Last week's HW due before class or to Dragos]
Strang 3.4#2,12,16,17,[513bonus]34,[513bonus]36
MATLAB/Octave: Please read Strang's 9.1, review Moler and do Computer Problem Set 2 (this part due Tuesday 03/01)
http://www.gang.umass.edu/~kusner/class/513hw2.pdf

02/22 Strang 3.5#2,10,19,22,26,[513bonus]42[how about dim(S)=2?]

* * * [For all, adapted from another "CAR GUYS" puzzler] Suppose you want to spell out your room number TWELVE with reflective metallic letters that stick to your door. You go to the hardware store and find that three of your housemates want to do the same thing with their doors.
The different letters have different prices, and you want to predict your cost based upon the total costs for theirs:
ONE = $2
TWO = $3
ELEVEN = $5
So... what does your TWELVE cost?!?!?!?
[Hint: One systematic way to do this is to think of each word as a vector in R^26. With this in mind, try to express the vector TWELVE as a linear combination of ONE, TWO and ELEVEN.]
[Comment: There is not enough information to pin down the price of each letter, but as long as you come up with a consistent guess for such prices, you will arrive at the same cost for your TWELVE - explain! Try to generalize this puzzler in some interesting way: for instance, what is the smallest collection of spelled-out NUMBERS (with their total costs) needed to compute the prices of each letter in (a subset of) the alphabet?]

02/24 [Last week's HW due before class or to Dragos]
Strang 3.6#1,3,11,[513bonus]31

03/01 Review [Computer Problem Set 2 due]
[Here's something to ponder over the break - if you want to understand linear maps and their matrices better, try to explore this and write up what you figure out. These ideas are also discussed in sections 7.1 and 7.2 of Strang, in case you wish to read ahead....]
While reviewing, we recalled that a map A: V -> W is linear if it preserves linear combinations: A(x1v1 + ... + xkvk) = x1A(v1) + ... + xkA(vk) for any vectors v1, ..., vk in V and scalars x1, ..., xk. This means that A is completely determined by where it maps a basis B for V. If we order the bases B (for V) and C (for W), we can make a matrix [A]_CB that represents A. We do this by requiring [A]_CB[v]_B=[Av]_C for any v in V. Here [v]_B denotes the coordinates of v with respect to B, that is, [v]_B is the sequence of coefficients expressing v (uniquely!) as a linear combination of elements of B; similarly for [.]_C. The jth column of the matrix for A is simply the coordinate vector [A(v_j)]_C, where v_j is the jth vector in the basis B for V.
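Going back to the TWELVE puzzler above: the hint can be carried out numerically before (or after) doing it by hand. Here is a Python/NumPy sketch - the encoding and all variable names are my own choices, not part of the puzzler - that turns each word into a letter-count vector in R^26 and asks numpy.linalg.lstsq for coefficients expressing TWELVE in terms of ONE, TWO and ELEVEN:

```python
import numpy as np

def word_vector(word):
    """Letter-count vector in R^26 (index 0 = 'A', ..., 25 = 'Z')."""
    v = np.zeros(26)
    for ch in word:
        v[ord(ch) - ord('A')] += 1
    return v

# Columns are the known words; the right-hand side is the target word.
A = np.column_stack([word_vector(w) for w in ("ONE", "TWO", "ELEVEN")])
b = word_vector("TWELVE")

# Least squares finds the (here unique) coefficients; a zero residual
# confirms TWELVE really lies in the span of the three known words.
x, residual, rank, _ = np.linalg.lstsq(A, b, rcond=None)
assert np.allclose(A @ x, b)  # exact linear combination, not just a fit

cost = x @ np.array([2.0, 3.0, 5.0])  # combine the known dollar totals
print("coefficients:", np.round(x, 6), "-> TWELVE costs $", round(cost))
```

Note that the coefficients are unique here only because the three word-vectors happen to be linearly independent; the letter prices themselves stay underdetermined, exactly as the comment says.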
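The column-by-column recipe just described - the jth column of [A]_CB is [A(v_j)]_C - can be tried out concretely. In this Python/NumPy sketch the map (a 90-degree rotation of R^2) and the two ordered bases are illustrative choices of mine, not from the course:

```python
import numpy as np

# An illustrative linear map A: R^2 -> R^2 (rotation by 90 degrees).
def A(v):
    x, y = v
    return np.array([-y, x])

# Ordered bases B (for the domain) and C (for the codomain).
B = [np.array([1.0, 1.0]), np.array([1.0, -1.0])]
C = [np.array([1.0, 0.0]), np.array([1.0, 1.0])]

B_mat = np.column_stack(B)
C_mat = np.column_stack(C)

def coords(v, basis_mat):
    """[v]_basis: the coefficients expressing v in the ordered basis."""
    return np.linalg.solve(basis_mat, v)

# jth column of [A]_CB is [A(v_j)]_C, where v_j is the jth vector of B.
A_CB = np.column_stack([coords(A(v_j), C_mat) for v_j in B])

# Check the defining property [A]_CB [v]_B = [Av]_C on a sample vector.
v = np.array([2.0, 5.0])
assert np.allclose(A_CB @ coords(v, B_mat), coords(A(v), C_mat))
print(A_CB)
```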
We usually do this when V and W are finite-dimensional, but it is instructive to consider this infinite-dimensional example: let V=P=R[x]=W be the vector space of real polynomials and let B=C={1,x,x^2,...} be its standard basis of monomials. Consider the linear maps D=d/dx and S=(integration from 0 to x). What are the semi-infinite matrices [D]_BB and [S]_BB? Their products (in each order)? How does this relate to the fundamental theorem of calculus? (If you don't like semi-infinite matrices, you can try V=P_d and W=P_{d-1}, so that D: V -> W and S: W -> V are represented by d x (d+1) and (d+1) x d matrices, respectively.)

03/03 [Last week's HW due before class or to Dragos]
Midterm Examination [in class]

[Spring Break - please read Strang 4! :-]

03/15 Strang 4.1#5,6,7,9,10,23
Strang 4.2#1,3,17,[513bonus:21-26, showing P^2=P=P^T]

* * * We talked about the L^2-inner product on the vector space P = R[x] of polynomials, and saw that the subspaces P_even and P_odd are orthogonal complements. Work out formulas for the lengths of the various monomials x^k and for the angles between x^k and x^l with respect to this inner product. The same inner product also works for the space of Fourier polynomials F = span{c_m = cos(m \pi x), s_n = sin(n \pi x)}, where m and n run over the (non-negative and positive) integers. What do you get for their L^2-inner products? (You should find these are orthogonal, but need to be rescaled to make them orthonormal - please work this out in detail.)

03/17 [All remaining pre-midterm HW to Dragos - no class today, but a substitute lecture by Strang himself, to view at your leisure: http://academicearth.org/lectures/orthogonal-matrices-and-gram-schmidt]

03/22 We motivated orthogonal projection with the example of best (say, linear) approximation of a function f(x) on the interval [-1,1] using the L^2-inner product.
Let V be (a subspace of) the vector space of (square-integrable) functions on [-1,1], and let P_1 be the subspace of linear polynomials {p(x) = a_0 + a_1 x}. We claim the orthogonal projection p_f = proj_{P_1}(f) of f minimizes the L^2-distance ||f-p|| of f to P_1. Here's the argument (that I hoped to give in class today ;-): picturing f, p, p_f as vectors in V, note that f-p_f is orthogonal to any vector in P_1, such as p_f-p. Write f-p = (f-p_f)+(p_f-p), and so (by Pythagoras) ||f-p||^2=||f-p_f||^2+||p-p_f||^2 is smallest (and equal to ||f-p_f||^2) exactly when ||p-p_f||^2 = 0, i.e. when p=p_f. (No calculus needed - can you find a simpler argument?!)

03/24 [Last week's HW due before class or to Dragos]
Strang 4.4#4,6,8,10,13,14,18,34 [yes, we switched the order of sections!]

* * * [513bonus] Suppose {u,v,w} is an orthonormal basis for R^3 and let U, V, W be the reflections in the planes orthogonal to u,v,w, respectively. What is the composition (matrix product) UVW (in any order)? Explain! (This fact is the basis of a surveying device I'll describe next week.)

03/29 Strang 4.3#2,12,13,14,[513bonus]29

* * * [513bonus] In class we saw the transpose A':W->V of a linear map A:V->W between inner product spaces V, W is defined by the formula <Av,w>_W = <v,A'w>_V, where <.,.>_V is the inner product on V, and <.,.>_W on W. Work out the transpose D' for the derivative map D=d/dx on the space V=W=P=R[x] of real polynomials with respect to the L^2-inner product <p,q> = \int_[-1,1] p(x)q(x)dx (integral from -1 to 1).

* * * [513bonus] How would we adapt least-squares data fitting in case the data points cluster around an n-dimensional (affine) hyperplane in R^{n+k}? Try expressing this plane as the graph of an affine function R^n -> R^k and then see what shape matrices should go into the AX=B and A'AX=A'B equations. [It may be easier to try the case n=1 first, so we are just fitting to a line, as we discussed in class.]
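For the n=1 warm-up just suggested (fitting a line), the normal-equation machinery A'AX = A'B looks like this in a Python/NumPy sketch; the data points are made-up numbers of mine, purely for illustration:

```python
import numpy as np

# Made-up data points clustering around a line y = c0 + c1*x.
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([1.1, 2.9, 5.2, 7.1, 8.8])

# A has a column of ones (for the constant term) and the x-coordinates.
A = np.column_stack([np.ones_like(x), x])

# Solve the normal equations A'A X = A'y for the best-fit coefficients.
X = np.linalg.solve(A.T @ A, A.T @ y)
c0, c1 = X
print(f"best-fit line: y = {c0:.3f} + {c1:.3f} x")

# Sanity check: the residual y - A X is orthogonal to the columns of A,
# exactly the projection picture from 03/22.
assert np.allclose(A.T @ (y - A @ X), 0.0)
```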
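The reflection bonus from 03/24 above also invites a numerical experiment before you try to prove anything. This Python/NumPy sketch is my own setup, not from class: it uses the standard Householder formula I - 2uu^T for the reflection in the plane orthogonal to a unit vector u, and gets a random orthonormal basis from a QR factorization. Run it and compare the products in different orders:

```python
import numpy as np

def reflection(u):
    """Householder reflection in the plane orthogonal to the unit vector u."""
    u = u / np.linalg.norm(u)
    return np.eye(3) - 2.0 * np.outer(u, u)

# A random orthonormal basis {u, v, w} of R^3 via QR factorization.
rng = np.random.default_rng(0)
Q, _ = np.linalg.qr(rng.standard_normal((3, 3)))
u, v, w = Q.T

U, V, W = reflection(u), reflection(v), reflection(w)

# Experiment: the product comes out the same in every order - which
# familiar matrix is it, and why?
print(np.round(U @ V @ W, 6))
print(np.round(W @ V @ U, 6))
```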
03/31 [Last week's HW due before class or to Dragos]
5.1#2,4,5,16,18,22,[Matlab challenge problem]33

04/05 5.2#2,4,12,23(look also at 24-26),[513bonus]35
5.3#5,6,[513bonus]11(look also at 12),20

04/07 [Last week's HW due before class or to Dragos]
6.1#1,5,6,[challenge]11

04/12 6.1#2,3,4,9,14,23,[513bonus]34

04/14 [Last week's HW due before class or to Dragos]
6.2#1,2,4,7,15,16,[513bonuses]26,33,35

* * * [challenge] We saw that for a symmetric 2x2 matrix
|a b|
|b c|
its characteristic polynomial is t^2 - (a+c)t + ac - b^2, which I claimed has real roots. Why? [Hint: what is the sign of (a+c)^2 - 4(ac-b^2) = (a-c)^2 + 4b^2? We'll come back to this next week....]

04/19 6.3#1,4,5,11,19,26

04/21 [Last week's HW due before class or to Dragos]
6.4#4,11,13,21,[513bonuses]10,23,[challenge]22,29
6.5#7,20,[challenge]32,33

04/26 6.6#5,17
6.7#1,2,3
[For a taste of what we didn't cover in class - the discussion of Google's PageRank algorithm on pages 368-369 of Strang is interesting, but check out Moler for more. I also highly recommend studying Example 5 on page 371 of Strang - it illustrates how the singular value decomposition compares with eigen' decomposition - for numerical problems the latter can get you in trouble, while the former will save your @$$! Try to explain Example 5 geometrically (hint: there's a very thin parallelogram...).]

04/28 [Last week's HW due before Review or to Dragos]
Review Session in our regular classroom/classtime!!!
Practice final exam problems: [to be posted Thursday AM]
http://www.gang.umass.edu/~kusner/class/513pracfin.pdf

05/02-06 Exam week! [Last HW due to me or to Dragos]
Final Exam: 12-2PM Friday 05/06 in DRL 4C6 (1 floor above class)
[A single hand-written sheet of notes is OK, but no calculators]
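Finally, the symmetric 2x2 challenge from 04/14 above can be sanity-checked numerically. In this Python/NumPy sketch the entries a, b, c are arbitrary choices of mine; it compares the quadratic-formula roots of t^2 - (a+c)t + (ac - b^2) with numpy's eigenvalues:

```python
import numpy as np

a, b, c = 2.0, 3.0, -1.0  # arbitrary entries for the symmetric matrix
M = np.array([[a, b],
              [b, c]])

# Characteristic polynomial t^2 - (a+c)t + (ac - b^2); its discriminant
# (a+c)^2 - 4(ac - b^2) equals (a-c)^2 + 4b^2 >= 0, so the roots are real.
disc = (a - c) ** 2 + 4 * b ** 2
roots = np.array([(a + c) - np.sqrt(disc), (a + c) + np.sqrt(disc)]) / 2

# numpy's symmetric eigensolver should agree.
eigvals = np.linalg.eigvalsh(M)
print(np.sort(roots), np.sort(eigvals))
assert np.allclose(np.sort(roots), np.sort(eigvals))
```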