
17.2 Singular Value Decomposition


Thus LL∗ : W → W and L∗L : V → V and both have eigenvalue problems.
Moreover, as is shown in Chapter 15, both L∗L and LL∗ have orthonormal
bases of eigenvectors, and both M M^T and M^T M can be diagonalized.
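As a quick numerical illustration (a minimal NumPy sketch, not part of the text; the matrix M below is an arbitrary example standing in for the matrix of L, and all names here are our own), both M^T M and M M^T are symmetric, so numpy.linalg.eigh returns an orthonormal basis of eigenvectors for each:

    import numpy as np

    # Arbitrary example matrix M (m = 4 rows, n = 3 columns), standing in for
    # the matrix of L : V -> W; chosen at random, so ker L = {0} almost surely.
    rng = np.random.default_rng(0)
    M = rng.standard_normal((4, 3))

    MtM = M.T @ M      # acts on V, symmetric 3 x 3
    MMt = M @ M.T      # acts on W, symmetric 4 x 4

    # eigh is for symmetric matrices; the columns of U and Q are orthonormal
    # bases of eigenvectors of M^T M and M M^T respectively.
    lam, U = np.linalg.eigh(MtM)
    mu,  Q = np.linalg.eigh(MMt)

    print(np.allclose(U.T @ U, np.eye(3)))   # True: orthonormal columns
    print(np.allclose(Q.T @ Q, np.eye(4)))   # True: orthonormal columns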
                      Next, let us make a simplifying assumption, namely ker L = {0}. This
                   is not necessary, but will make some of our computations simpler. Now
suppose we have found an orthonormal basis (u_1, . . . , u_n) for V composed of
eigenvectors for L∗L. That is

        L∗L u_i = λ_i u_i .
                   Then multiplying by L gives

        LL∗L u_i = λ_i L u_i .

I.e., Lu_i is an eigenvector of LL∗. The vectors (Lu_1, . . . , Lu_n) are linearly
independent, because ker L = {0} (this is where we use our simplifying
assumption, but you can try and extend our analysis to the case where it no
longer holds).
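One can check this numerically with the sketch above (M, U and lam are the assumed names from that snippet): each column of MU is an eigenvector of M M^T with the corresponding eigenvalue, and the columns are linearly independent.

    # Column i of MU represents L u_i; it is an eigenvector of M M^T with
    # eigenvalue lam[i], and the columns have full rank.
    MU = M @ U
    print(np.allclose((M @ M.T) @ MU, MU * lam))   # True: M M^T (M U_i) = lam_i (M U_i)
    print(np.linalg.matrix_rank(MU) == 3)          # True: linearly independent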
Let's compute the angles between and lengths of these vectors. For that
we express the vectors u_i in the bases used to compute the matrix M of L.
Denoting these column vectors by U_i we then compute

        (MU_i) · (MU_j) = U_i^T M^T M U_j = λ_j U_i^T U_j = λ_j U_i · U_j = λ_j δ_ij .
We see that the vectors (Lu_1, . . . , Lu_n) are orthogonal but not orthonormal.
Moreover, the length of Lu_i is √λ_i. Normalizing gives the orthonormal and
linearly independent ordered set

        Lu_1/√λ_1 , . . . , Lu_n/√λ_n .
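Continuing the numerical sketch (MU and lam as assumed above), the Gram matrix of the columns of MU is diag(λ_1, . . . , λ_n), and dividing column i by √λ_i yields orthonormal vectors:

    # (MU_i) . (MU_j) = lam_j delta_ij: the Gram matrix of the columns is diagonal.
    print(np.allclose(MU.T @ MU, np.diag(lam)))    # True

    # Normalize; this needs ker L = {0}, i.e. every lam[i] > 0.
    Vn = MU / np.sqrt(lam)                         # column i divided by sqrt(lam[i])
    print(np.allclose(Vn.T @ Vn, np.eye(3)))       # True: orthonormal set in W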
In general, this cannot be a basis for W : since ker L = {0}, dim L(V ) =
dim V, and because L(V ) ⊆ W this forces dim V ≤ dim W, so n ≤ m; when
n < m, these n vectors are too few to span W.
However, it is a subset of the eigenvectors of LL∗ so there is an orthonormal
basis of eigenvectors of LL∗ of the form

        O' = ( Lu_1/√λ_1 , . . . , Lu_n/√λ_n , v_{n+1}, . . . , v_m ) =: (v_1, . . . , v_m) .
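In the numerical sketch the extra vectors v_{n+1}, . . . , v_m can be taken to be orthonormal eigenvectors of M M^T for the eigenvalue 0; with the assumed names mu, Q and Vn from above, one way to assemble such a basis is:

    # Eigenvectors of M M^T with (numerically) zero eigenvalue span the
    # orthogonal complement of the image of L, so they complete the set.
    extra = Q[:, np.isclose(mu, 0.0)]              # here a single column, since m - n = 1
    Vfull = np.hstack([Vn, extra])                 # candidate (v_1, ..., v_m)
    print(np.allclose(Vfull.T @ Vfull, np.eye(4))) # True: orthonormal basis of W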
Now let's compute the matrix of L with respect to the orthonormal basis
O = (u_1, . . . , u_n) for V and the orthonormal basis O' = (v_1, . . . , v_m) for W.
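Anticipating that computation, here is a hedged numerical preview using the assumed names from the sketches above: the matrix of L with respect to O and O' is Vfull^T M U, and it should carry √λ_i in its top n × n block with zero rows below.

    # Change of basis: the matrix of L with respect to O and O' is Vfull^T M U.
    Sigma = Vfull.T @ M @ U
    print(np.allclose(Sigma[:3, :], np.diag(np.sqrt(lam))))   # True: sqrt(lam_i) on the diagonal
    print(np.allclose(Sigma[3:, :], 0.0))                     # True: remaining row is zero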
