Gram-Schmidt Orthogonalization
Pradosh K. Roy
Asia Pacific Institute of Information Technology, Panipat 132103, India. [email protected]
In Linear Algebra and Matrix Analysis, Gram-Schmidt Orthogonalization is a process for orthonormalizing a set of vectors in an inner product space, usually the Euclidean space $\mathbb{R}^n$. The method is named after Jørgen Pedersen Gram (1850-1916) and Erhard Schmidt (1876-1959); Schmidt also played an important role in the development of modern functional analysis. However, the process appeared earlier in the works of Pierre-Simon Laplace (1749-1827), A. Cauchy and I. J. Bienaymé [3]. We develop the Gram-Schmidt Orthogonalization ab initio, defining norms, inner product spaces, the Fourier expansion, Parseval's identity, Bessel's inequality and orthonormal sets.
2. Definitions
Given a vector $x \neq 0$, it is frequently convenient to define a unit vector that points in the same direction as $x$ but has unit length. The vector $x$ is normalized by setting $y = x/\|x\|_2$. The scalars defined by $x^T y = \sum_{i=1}^{n} x_i y_i$ (and $x^{*} y$ for complex vectors) are called the standard inner products for $\mathbb{R}^n$ (and $\mathbb{C}^n$). The standard inner product is also denoted by $\langle x, y \rangle$.
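As a brief numerical illustration of these definitions, the following minimal Python/NumPy sketch (the vector values are illustrative choices, not taken from the text) normalizes a vector and evaluates the standard inner product.

import numpy as np

x = np.array([3.0, 4.0])          # arbitrary example vector
y = x / np.linalg.norm(x)         # normalization: y = x / ||x||_2
print(y, np.linalg.norm(y))       # unit vector [0.6, 0.8], length 1.0

a = np.array([1.0, 2.0, 3.0])
b = np.array([4.0, 5.0, 6.0])
print(np.dot(a, b))               # standard inner product a^T b = 32.0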
2.1 Inner Product Space
An inner product on a real vector space $V$ is a function that maps each ordered pair of vectors $x, y$ to a real scalar $\langle x, y \rangle$ such that
(i) $\langle x, x \rangle \ge 0$, with $\langle x, x \rangle = 0$ if and only if $x = 0$;
(ii) $\langle x, \alpha y \rangle = \alpha \langle x, y \rangle$ for all scalars $\alpha$;
(iii) $\langle x, y + z \rangle = \langle x, y \rangle + \langle x, z \rangle$;
(iv) $\langle x, y \rangle = \langle y, x \rangle$.
A vector space together with an inner product is called an inner product space, and the norm induced by the inner product is $\|x\| = \sqrt{\langle x, x \rangle}$.
3. Cauchy-Schwarz Inequality
The general Cauchy-Schwarz inequality, relating the inner product and the norm it induces, is written as
$|\langle x, y \rangle| \le \|x\| \, \|y\|$ for all $x, y \in V$,
with equality if and only if one vector is a scalar multiple of the other.
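A small sketch, using randomly generated vectors (an illustrative assumption, not data from the text), confirms the inequality numerically:

import numpy as np

rng = np.random.default_rng(0)
for _ in range(5):
    x = rng.standard_normal(4)
    y = rng.standard_normal(4)
    lhs = abs(np.dot(x, y))                       # |<x, y>|
    rhs = np.linalg.norm(x) * np.linalg.norm(y)   # ||x|| ||y||
    print(lhs <= rhs + 1e-12)                     # True every time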
In an inner product space, two vectors $x, y$ are orthogonal to each other, written $x \perp y$, whenever $\langle x, y \rangle = 0$. In other words, for $x, y \in \mathbb{R}^n$ with the standard inner product, $x \perp y$ if and only if $x^T y = \sum_{i} x_i y_i = 0$. For example, $(1, 2)^T$ is orthogonal to $(-2, 1)^T$, since $1 \cdot (-2) + 2 \cdot 1 = 0$.
4. Law of Cosines
Recalling the Law of Cosines, rather than the Pythagorean Theorem, in $\mathbb{R}^2$ or $\mathbb{R}^3$,
$\|x - y\|^2 = \|x\|^2 + \|y\|^2 - 2\|x\|\|y\|\cos\theta$,
we get
$\cos\theta = \dfrac{\langle x, y \rangle}{\|x\|\,\|y\|}$.
We can easily extend the law to higher dimensions. If $x, y$ are non-zero vectors from any inner product space, the radian measure of the angle between $x$ and $y$ is defined to be the number $\theta \in [0, \pi]$ such that
$\cos\theta = \dfrac{\langle x, y \rangle}{\|x\|\,\|y\|}$.
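The following minimal NumPy sketch (example vectors are illustrative choices) computes this angle and shows that orthogonal vectors subtend a right angle:

import numpy as np

def angle(x, y):
    # theta in [0, pi] with cos(theta) = <x, y> / (||x|| ||y||)
    c = np.dot(x, y) / (np.linalg.norm(x) * np.linalg.norm(y))
    return np.arccos(np.clip(c, -1.0, 1.0))

print(angle(np.array([1.0, 0.0]), np.array([1.0, 1.0])))   # pi/4 ~ 0.7854
print(angle(np.array([1.0, 2.0]), np.array([-2.0, 1.0])))  # pi/2 ~ 1.5708 (orthogonal)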
5. Orthonormal Sets
A set $\{u_1, u_2, \dots, u_n\}$ is called orthonormal whenever $\langle u_i, u_j \rangle = 0$ for $i \neq j$ and $\|u_i\| = 1$ for each $i$.
Theorem 5.1. Every orthonormal set is linearly independent.
Theorem 5.2. Every orthonormal set of $n$ vectors from an $n$-dimensional space $V$ is an orthonormal basis for $V$.
For example, the set $\{(1,1,0)^T,\ (1,-1,0)^T,\ (0,0,2)^T\}$ is orthogonal but not orthonormal, because each vector does not have unit length. However, the normalized set $\{\tfrac{1}{\sqrt{2}}(1,1,0)^T,\ \tfrac{1}{\sqrt{2}}(1,-1,0)^T,\ (0,0,1)^T\}$ is orthonormal.
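A quick way to verify orthonormality numerically is to place the vectors as columns of a matrix $U$ and check that $U^T U = I$; the sketch below does this for the normalized set above.

import numpy as np

# Columns are the normalized vectors from the example above.
U = np.column_stack([
    np.array([1.0, 1.0, 0.0]) / np.sqrt(2),
    np.array([1.0, -1.0, 0.0]) / np.sqrt(2),
    np.array([0.0, 0.0, 1.0]),
])
# For an orthonormal set, U^T U is the identity matrix.
print(np.allclose(U.T @ U, np.eye(3)))   # True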
6. Fourier Expansion
If $B = \{u_1, u_2, \dots, u_n\}$ is an orthonormal basis for an inner product space $V$, then each $x \in V$ can be expressed as
$x = \langle u_1, x \rangle u_1 + \langle u_2, x \rangle u_2 + \dots + \langle u_n, x \rangle u_n$.
This is called the Fourier expansion of $x$. The coefficients in this representation are readily available because of orthonormality: taking the inner product of both sides with $u_i$ leaves only the term $\xi_i = \langle u_i, x \rangle$. The scalars $\xi_i = \langle u_i, x \rangle$ are the coordinates of $x$ with respect to $B$, and they are called the Fourier coefficients. Geometrically, the Fourier expansion resolves $x$ into $n$ mutually orthogonal vectors $\langle u_i, x \rangle u_i$, each of which represents the orthogonal projection of $x$ onto the one-dimensional space spanned by $u_i$.
Problem.
Determine the Fourier expansion of a vector $x \in \mathbb{R}^3$ with respect to the standard inner product and an orthonormal basis $\{u_1, u_2, u_3\}$.
Solution.
The Fourier coefficients are $\xi_i = \langle u_i, x \rangle = u_i^T x$ for $i = 1, 2, 3$. Therefore
$x = \xi_1 u_1 + \xi_2 u_2 + \xi_3 u_3$.
A numerical sketch with illustrative data follows.
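As a hedged illustration (the basis and the vector below are illustrative choices, not the data of the original worked problem), the Fourier coefficients and the reconstruction of $x$ can be computed as follows:

import numpy as np

# An orthonormal basis of R^3 (illustrative choice).
u1 = np.array([1.0, 1.0, 0.0]) / np.sqrt(2)
u2 = np.array([1.0, -1.0, 0.0]) / np.sqrt(2)
u3 = np.array([0.0, 0.0, 1.0])

x = np.array([2.0, 4.0, 3.0])               # illustrative vector

xi = [np.dot(u, x) for u in (u1, u2, u3)]   # Fourier coefficients <u_i, x>
reconstruction = xi[0]*u1 + xi[1]*u2 + xi[2]*u3
print(xi)                                    # [4.2426..., -1.4142..., 3.0]
print(np.allclose(reconstruction, x))        # True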
6.2 Bessel's Inequality
If $\{u_1, u_2, \dots, u_k\}$ is an orthonormal set in an $n$-dimensional inner product space $V$ and $x \in V$, then Bessel's inequality states that
$\sum_{i=1}^{k} |\langle u_i, x \rangle|^2 \le \|x\|^2$,
with equality if and only if $x$ lies in the span of $\{u_1, \dots, u_k\}$; in particular, equality holds for every $x$ (Parseval's identity) when the set is an orthonormal basis.
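A minimal sketch, using an illustrative orthonormal set of two vectors in $\mathbb{R}^3$ (my own choice of data), shows the inequality in action:

import numpy as np

# Orthonormal set of two vectors in R^3 (illustrative choice).
u1 = np.array([1.0, 0.0, 0.0])
u2 = np.array([0.0, 1.0, 0.0])

x = np.array([1.0, 2.0, 2.0])
bessel_sum = np.dot(u1, x)**2 + np.dot(u2, x)**2   # sum of squared Fourier coefficients
print(bessel_sum, np.dot(x, x))                     # 5.0 <= 9.0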
Orthonormal bases possess significant advantages over non-orthonormal bases. The pertinent questions in this context are whether every finite-dimensional inner product space possesses an orthonormal basis and, if so, how one can be produced. The Gram-Schmidt procedure answers these questions.
7. Gram-Schmidt Orthogonalization
Let $\{x_1, x_2, \dots, x_n\}$ be an arbitrary basis for an $n$-dimensional inner product space $S$. It is required to construct an orthonormal basis $\{u_1, u_2, \dots, u_n\}$ for $S$ such that $\mathrm{span}\{u_1, \dots, u_k\} = \mathrm{span}\{x_1, \dots, x_k\}$ for $k = 1, 2, \dots, n$.
Theorem: If $\{x_1, x_2, \dots, x_n\}$ is a basis for a general inner product space $S$, then the Gram-Schmidt sequence defined by
$u_1 = \dfrac{x_1}{\|x_1\|}$
and
$u_k = \dfrac{x_k - \sum_{i=1}^{k-1} \langle u_i, x_k \rangle u_i}{\left\| x_k - \sum_{i=1}^{k-1} \langle u_i, x_k \rangle u_i \right\|}$ for $k = 2, \dots, n$
is an orthonormal basis for $S$. The proof is straightforward; Eric Carlen has provided a simple and elegant proof [5].
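A minimal Python/NumPy sketch of the Gram-Schmidt sequence above (a classical, textbook-style implementation written for this note, not an excerpt from the references) is:

import numpy as np

def gram_schmidt(X):
    """Classical Gram-Schmidt: the columns of X are the basis x_1, ..., x_n.
    Returns a matrix whose columns u_1, ..., u_n are orthonormal."""
    n = X.shape[1]
    U = np.zeros_like(X, dtype=float)
    for k in range(n):
        v = X[:, k].astype(float)
        # subtract the projections of x_k onto the previously computed u_i
        for i in range(k):
            v = v - np.dot(U[:, i], X[:, k]) * U[:, i]
        U[:, k] = v / np.linalg.norm(v)
    return U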
Example 7.1
Let us take the space spanned by three linearly independent vectors $x_1, x_2, x_3$ and apply the Gram-Schmidt sequence.
For $k = 1$,
$u_1 = \dfrac{x_1}{\|x_1\|}$.
For $k = 2$,
$u_2 = \dfrac{x_2 - \langle u_1, x_2 \rangle u_1}{\|x_2 - \langle u_1, x_2 \rangle u_1\|}$.
For $k = 3$,
$u_3 = \dfrac{x_3 - \langle u_1, x_3 \rangle u_1 - \langle u_2, x_3 \rangle u_2}{\|x_3 - \langle u_1, x_3 \rangle u_1 - \langle u_2, x_3 \rangle u_2\|}$.
Thus $\{u_1, u_2, u_3\}$ is an orthonormal basis for the spanned space; a numerical run with illustrative vectors is given below.
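Assuming the gram_schmidt sketch above is in scope, and using three illustrative linearly independent vectors (my own choice of data), the computation proceeds as:

import numpy as np

X = np.array([[1.0, 1.0, 0.0],
              [1.0, 0.0, 1.0],
              [0.0, 1.0, 1.0]]).T       # columns x_1, x_2, x_3

U = gram_schmidt(X)                     # from the sketch above
print(np.round(U, 4))
print(np.allclose(U.T @ U, np.eye(3)))  # True: the columns are orthonormal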
The Gram-Schmidt procedure is a powerful theoretical tool, but it is not a good numerical algorithm when implemented in a straightforward or classical sense. When floating-point arithmetic is used, the classical algorithm applied to a set of vectors that is already not close to being an orthogonal set can produce a computed set of vectors that is far from being orthogonal [1]. A numerical illustration of this loss of orthogonality, and of the modified Gram-Schmidt remedy, follows Problem 7.1 below.
Problem 7.1 [1]
Let $\{x_1, x_2, x_3\}$ be a linearly independent set of vectors in $\mathbb{R}^3$ (see [1] for the original data). Determine the orthonormal basis $\{u_1, u_2, u_3\}$ produced by the Gram-Schmidt sequence and verify that it is indeed an orthonormal basis for $\mathrm{span}\{x_1, x_2, x_3\}$.
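To illustrate the numerical caveat mentioned above, the following hedged sketch (the nearly dependent test vectors and the value of eps are illustrative choices, not the data of Problem 7.1) compares the classical Gram-Schmidt process with the modified variant discussed in [1] and [4]:

import numpy as np

def classical_gs(X):
    # Classical Gram-Schmidt: project each x_k against the already computed u_i.
    U = np.zeros_like(X)
    for k in range(X.shape[1]):
        v = X[:, k] - sum(np.dot(U[:, i], X[:, k]) * U[:, i] for i in range(k))
        U[:, k] = v / np.linalg.norm(v)
    return U

def modified_gs(X):
    # Modified Gram-Schmidt: orthogonalize the remaining columns immediately.
    U = X.astype(float).copy()
    for k in range(X.shape[1]):
        U[:, k] /= np.linalg.norm(U[:, k])
        for j in range(k + 1, X.shape[1]):
            U[:, j] -= np.dot(U[:, k], U[:, j]) * U[:, k]
    return U

eps = 1e-8                                   # makes the columns nearly dependent
X = np.array([[1.0, 1.0, 1.0],
              [eps, 0.0, 0.0],
              [0.0, eps, 0.0],
              [0.0, 0.0, eps]])

for name, gs in (("classical", classical_gs), ("modified", modified_gs)):
    U = gs(X)
    print(name, np.linalg.norm(U.T @ U - np.eye(3)))   # loss of orthogonality

On such nearly dependent data the classical variant loses orthogonality badly (the off-diagonal entries of $U^T U$ are of order 0.5), while the modified variant stays close to orthogonal, which is the behaviour analysed in [4].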
References
1. Carl D. Meyer, Matrix Analysis and Applied Linear Algebra, SIAM, 2000, pp. 269-313.
2. Frank Ayres Jr., Matrices, Schaum's Outline Series, 1974, pp. 100-109.
3. Steven Leon, Gram-Schmidt Orthogonalization: 100 Years and More, 2009, http://www.math.uri.edu/~tombella/nenad2009/nenad_leon.pdf
4. L. Giraud, J. Langou and M. Rozložník, The Loss of Orthogonality in the Gram-Schmidt Orthogonalization Process, Computers & Mathematics with Applications, vol. 50, pp. 1069-1075, Elsevier, 2005, http://www.math.ucdenver.edu/~langou/publications/2005-CMA-giraud-langou-rozloznik.pdf
5. Eric Carlen, Notes on the Gram-Schmidt Procedure for Constructing Orthonormal Bases, www.ecs.umass.edu/ece/ece313/Online_help/gram.pdf