Orthogonal Matrix - Wikipedia

An orthogonal matrix is a real square matrix whose columns and rows are orthogonal unit vectors. It has the properties that its transpose is equal to its inverse, it preserves the dot product of vectors, and its determinant is either +1 or -1. The set of orthogonal matrices forms the orthogonal group O(n) under matrix multiplication.

Orthogonal matrix

In linear algebra, an orthogonal matrix, or orthonormal matrix, is a real square matrix whose columns and rows are orthonormal vectors.

One way to express this is

  Q^T Q = Q Q^T = I,

where Q^T is the transpose of Q and I is the identity matrix.

This leads to the equivalent characterization: a matrix Q is orthogonal if its transpose is equal to its inverse:

  Q^T = Q^−1,

where Q^−1 is the inverse of Q.

An orthogonal matrix Q is necessarily invertible (with inverse Q^−1 = Q^T), unitary (Q^−1 = Q^*), where Q^* is the Hermitian adjoint (conjugate transpose) of Q, and therefore normal (Q^* Q = Q Q^*) over the real numbers. The determinant of any orthogonal matrix is either +1 or −1. As a linear transformation, an orthogonal matrix preserves the inner product of vectors, and therefore acts as an isometry of Euclidean space, such as a rotation, reflection or rotoreflection. In other words, it is a unitary transformation.

The set of n × n orthogonal matrices, under multiplication, forms the group O(n), known as the orthogonal group. The subgroup SO(n) consisting of orthogonal matrices with determinant +1 is called the special orthogonal group, and each of its elements is a special orthogonal matrix. As a linear transformation, every special orthogonal matrix acts as a rotation.

Overview

[Figure] Visual understanding of multiplication by the transpose of a matrix. If A is an orthogonal matrix and B is its transpose, the ij-th element of the product AB = AA^T will vanish if i ≠ j, because the i-th row of A is orthogonal to the j-th row of A.

An orthogonal matrix is the real specialization of a unitary matrix, and thus always a normal matrix. Although we consider only real matrices here, the definition can be used for matrices with entries from any field. However, orthogonal matrices arise naturally from dot products, and for matrices of complex numbers that leads instead to the unitary requirement. Orthogonal matrices preserve the dot product,[1] so, for vectors u and v in an n-dimensional real Euclidean space

  u · v = (Qu) · (Qv),

where Q is an orthogonal matrix. To see the inner product connection, consider a vector v in an n-dimensional real Euclidean space. Written with respect to an orthonormal basis, the squared length of v is v^T v. If a linear transformation, in matrix form Qv, preserves vector lengths, then

  v^T v = (Qv)^T (Qv) = v^T Q^T Q v.
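The preservation of dot products is easy to check numerically. A minimal sketch (using NumPy, which is not part of the article; the angle and test vectors are arbitrary choices) with a 2 × 2 rotation matrix as Q:

```python
import numpy as np

theta = 0.7
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])  # a rotation, hence orthogonal

u = np.array([1.0, 2.0])
v = np.array([-3.0, 0.5])

lhs = u @ v                  # u . v
rhs = (Q @ u) @ (Q @ v)      # (Qu) . (Qv)
assert np.isclose(lhs, rhs)

# Equivalently, Q^T Q = I.
assert np.allclose(Q.T @ Q, np.eye(2))
```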

Thus finite-dimensional linear isometries—rotations, reflections, and their combinations—produce orthogonal matrices. The converse is also true: orthogonal matrices imply orthogonal transformations. However, linear algebra includes orthogonal transformations between spaces which may be neither finite-dimensional nor of the same dimension, and these have no orthogonal matrix equivalent.

Orthogonal matrices are important for a number of reasons, both theoretical and practical. The n × n orthogonal matrices form a group under matrix multiplication, the orthogonal group denoted by O(n), which—with its subgroups—is widely used in mathematics and the physical sciences. For example, the point group of a molecule is a subgroup of O(3). Because floating point versions of orthogonal matrices have advantageous properties, they are key to many algorithms in numerical linear algebra, such as QR decomposition. As another example, with appropriate normalization the discrete cosine transform (used in MP3 compression) is represented by an orthogonal matrix.
Examples
Below are a few examples of small orthogonal matrices and possible interpretations.

  [ 1  0 ]
  [ 0  1 ]    (identity transformation)

  [ cos θ  −sin θ ]
  [ sin θ   cos θ ]    (rotation about the origin)

  [ 1   0 ]
  [ 0  −1 ]    (reflection across x-axis)

  [ 0  0  1 ]
  [ 1  0  0 ]
  [ 0  1  0 ]    (permutation of coordinate axes)

Elementary constructions

Lower dimensions
The simplest orthogonal matrices are the 1 × 1 matrices [1] and [−1], which we can interpret as the identity and a reflection of the real line across the origin.

The 2 × 2 matrices have the form

  Q = [ p  t ]
      [ q  u ],

which orthogonality demands satisfy the three equations

  1 = p² + q²,
  1 = t² + u²,
  0 = pt + qu.

In consideration of the first equation, without loss of generality let p = cos θ, q = sin θ; then either t = −q, u = p or t = q, u = −p. We can interpret the first case as a rotation by θ (where θ = 0 is the identity), and the second as a reflection across a line at an angle of θ/2:

  rotation:  [ cos θ  −sin θ ]      reflection:  [ cos θ   sin θ ]
             [ sin θ   cos θ ]                   [ sin θ  −cos θ ]

The special case of the reflection matrix with θ = 90° generates a reflection about the line at 45° given by y = x and therefore exchanges x and y; it is a permutation matrix, with a single 1 in each column and row (and otherwise 0):

  [ 0  1 ]
  [ 1  0 ]

The identity is also a permutation matrix.

A reflection is its own inverse, which implies that a reflection matrix is symmetric (equal to its transpose) as well as orthogonal. The product of two rotation matrices is a rotation matrix, and the product of two reflection matrices is also a rotation matrix.
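The claim that two reflections compose to a rotation can be verified directly. A sketch (NumPy and the particular angles are my choices, not from the article); the product of reflections across lines at angles a/2 and b/2 works out to a rotation by a − b:

```python
import numpy as np

def rotation(theta):
    # 2x2 rotation by theta.
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s], [s, c]])

def reflection(theta):
    # 2x2 reflection across the line at angle theta/2.
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, s], [s, -c]])

R = reflection(0.4) @ reflection(1.1)
assert np.isclose(np.linalg.det(R), 1.0)   # determinant +1: a rotation
assert np.allclose(R, rotation(0.4 - 1.1)) # rotation by the angle difference
```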

Higher dimensions
Regardless of the dimension, it is always possible to classify orthogonal matrices as purely rotational or not, but for 3 × 3 matrices and larger the non-rotational matrices can be more complicated than reflections. For example,

  [ −1   0   0 ]        [ 0  −1   0 ]
  [  0  −1   0 ]  and   [ 1   0   0 ]
  [  0   0  −1 ]        [ 0   0  −1 ]

represent an inversion through the origin and a rotoinversion, respectively, about the z-axis.

Rotations become more complicated in higher dimensions; they can no longer be completely characterized by one angle, and may affect more than one planar subspace. It is common to describe a 3 × 3 rotation matrix in terms of an axis and angle, but this only works in three dimensions. Above three dimensions two or more angles are needed, each associated with a plane of rotation.

However, we have elementary building blocks for permutations, reflections, and rotations that apply in general.

Primitives
The most elementary permutation is a transposition, obtained from the identity matrix by exchanging two rows. Any n × n permutation matrix can be constructed as a product of no more than n − 1 transpositions.

A Householder reflection is constructed from a non-null vector v as

  Q = I − 2 (v v^T)/(v^T v).

Here the numerator is a symmetric matrix while the denominator is a number, the squared magnitude of v. This is a reflection in the hyperplane perpendicular to v (negating any vector component parallel to v). If v is a unit vector, then Q = I − 2vv^T suffices. A Householder reflection is typically used to simultaneously zero the lower part of a column. Any orthogonal matrix of size n × n can be constructed as a product of at most n such reflections.
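A sketch of that column-zeroing use (the textbook construction, coded with NumPy; the matrix and sign convention are my choices, not from the article):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 3))

# Build a Householder reflection that zeros A[1:, 0].
x = A[:, 0]
v = x.copy()
v[0] += np.sign(x[0]) * np.linalg.norm(x)   # sign chosen to avoid cancellation
Q = np.eye(4) - 2.0 * np.outer(v, v) / (v @ v)

QA = Q @ A
assert np.allclose(Q.T @ Q, np.eye(4))      # Q is orthogonal (and symmetric)
assert np.allclose(QA[1:, 0], 0.0)          # lower part of column 0 is zeroed
```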
A Givens rotation acts on a two-dimensional (planar) subspace spanned by two coordinate axes, rotating by a chosen angle. It is typically used to zero a single subdiagonal entry. Any rotation matrix of size n × n can be constructed as a product of at most n(n − 1)/2 such rotations. In the case of 3 × 3 matrices, three such rotations suffice; and by fixing the sequence we can thus describe all 3 × 3 rotation matrices (though not uniquely) in terms of the three angles used, often called Euler angles.

A Jacobi rotation has the same form as a Givens rotation, but is used to zero both off-diagonal entries of a 2 × 2 symmetric submatrix.
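A sketch of a Givens rotation chosen to zero one subdiagonal entry (NumPy code and the target entry are my choices for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((3, 3))

# Rotate in the plane of rows 1 and 2 so that A[2, 0] becomes zero.
a, b = A[1, 0], A[2, 0]
r = np.hypot(a, b)
c, s = a / r, b / r
G = np.eye(3)
G[1, 1], G[1, 2] = c, s
G[2, 1], G[2, 2] = -s, c

GA = G @ A
assert np.allclose(G.T @ G, np.eye(3))  # G is orthogonal
assert np.isclose(GA[2, 0], 0.0)        # targeted entry is zeroed
assert np.isclose(GA[1, 0], r)          # its mass moves into the row above
```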

Properties

Matrix properties
A real square matrix is orthogonal if and only if its columns form an orthonormal basis of the Euclidean space R^n with the ordinary Euclidean dot product, which is the case if and only if its rows form an orthonormal basis of R^n. It might be tempting to suppose a matrix with orthogonal (not orthonormal) columns would be called an orthogonal matrix, but such matrices have no special interest and no special name; they only satisfy M^T M = D, with D a diagonal matrix.

The determinant of any orthogonal matrix is +1 or −1. This follows from basic facts about determinants, as follows:

  1 = det(I) = det(Q^T Q) = det(Q^T) det(Q) = (det Q)².

The converse is not true; having a determinant of ±1 is no guarantee of orthogonality, even with orthogonal columns, as shown by the following counterexample:

  [ 2    0  ]
  [ 0   1/2 ]
With permutation matrices the determinant matches the signature, being +1 or −1 as the parity of the permutation is even or odd, for the determinant is an alternating function of the rows.

Stronger than the determinant restriction is the fact that an orthogonal matrix can always be diagonalized over the complex numbers to exhibit a full set of eigenvalues, all of which must have (complex) modulus 1.
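Both facts are easy to observe numerically. A sketch (NumPy; the rotoreflection used as a test case is my choice, not from the article):

```python
import numpy as np

theta = 1.2
Q = np.array([[np.cos(theta), -np.sin(theta), 0.0],
              [np.sin(theta),  np.cos(theta), 0.0],
              [0.0,            0.0,          -1.0]])  # a rotoreflection

assert np.allclose(Q.T @ Q, np.eye(3))           # orthogonal
assert np.isclose(abs(np.linalg.det(Q)), 1.0)    # determinant is +1 or -1
eigvals = np.linalg.eigvals(Q)
assert np.allclose(np.abs(eigvals), 1.0)         # all eigenvalues on unit circle
```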

Group properties
The inverse of every orthogonal matrix is again orthogonal, as is the matrix product of two orthogonal matrices. In fact, the set of all n × n orthogonal matrices satisfies all the axioms of a group. It is a compact Lie group of dimension n(n − 1)/2, called the orthogonal group and denoted by O(n).

The orthogonal matrices whose determinant is +1 form a path-connected normal subgroup of O(n) of index 2, the special orthogonal group SO(n) of rotations. The quotient group O(n)/SO(n) is isomorphic to O(1), with the projection map choosing [+1] or [−1] according to the determinant. Orthogonal matrices with determinant −1 do not include the identity, and so do not form a subgroup but only a coset; it is also (separately) connected. Thus each orthogonal group falls into two pieces; and because the projection map splits, O(n) is a semidirect product of SO(n) by O(1). In practical terms, a comparable statement is that any orthogonal matrix can be produced by taking a rotation matrix and possibly negating one of its columns, as we saw with 2 × 2 matrices. If n is odd, then the semidirect product is in fact a direct product, and any orthogonal matrix can be produced by taking a rotation matrix and possibly negating all of its columns. This follows from the property of determinants that negating a column negates the determinant, and thus negating an odd (but not even) number of columns negates the determinant.

Now consider (n + 1) × (n + 1) orthogonal matrices with bottom right entry equal to 1. The remainder of the last column (and last row) must be zeros, and the product of any two such matrices has the same form. The rest of the matrix is an n × n orthogonal matrix; thus O(n) is a subgroup of O(n + 1) (and of all higher groups).

Since an elementary reflection in the form of a Householder matrix can reduce any orthogonal matrix to this constrained form, a series of such reflections can bring any orthogonal matrix to the identity; thus an orthogonal group is a reflection group. The last column can be fixed to any unit vector, and each choice gives a different copy of O(n) in O(n + 1); in this way O(n + 1) is a bundle over the unit sphere S^n with fiber O(n).

Similarly, SO(n) is a subgroup of SO(n + 1); and any special orthogonal matrix can be generated by Givens plane rotations using an analogous procedure. The bundle structure persists: SO(n) ↪ SO(n + 1) → S^n. A single rotation can produce a zero in the first row of the last column, and a series of n − 1 rotations will zero all but the last row of the last column of an n × n rotation matrix. Since the planes are fixed, each rotation has only one degree of freedom, its angle. By induction, SO(n) therefore has

  n(n − 1)/2

degrees of freedom, and so does O(n).


Permutation matrices are simpler still; they form, not a Lie group, but only a finite group, the order n! symmetric group Sn. By the same kind of argument, Sn is a subgroup of Sn+1. The even permutations produce the subgroup of permutation matrices of determinant +1, the order n!/2 alternating group.
Canonical form
More broadly, the effect of any orthogonal matrix separates into independent actions on orthogonal two-dimensional subspaces. That is, if Q is special orthogonal then one can always find an orthogonal matrix P, a (rotational) change of basis, that brings Q into block diagonal form:

  P^T Q P = diag(R1, ..., Rk)        (n even),
  P^T Q P = diag(R1, ..., Rk, 1)     (n odd),

where the matrices R1, ..., Rk are 2 × 2 rotation matrices, and with the remaining entries zero. Exceptionally, a rotation block may be diagonal, ±I. Thus, negating one column if necessary, and noting that a 2 × 2 reflection diagonalizes to a +1 and −1, any orthogonal matrix can be brought to the form

  P^T Q P = diag(R1, ..., Rk, ±1, ..., ±1).

The matrices R1, ..., Rk give conjugate pairs of eigenvalues lying on the unit circle in the complex plane; so this decomposition confirms that all eigenvalues have absolute value 1. If n is odd, there is at least one real eigenvalue, +1 or −1; for a 3 × 3 rotation, the eigenvector associated with +1 is the rotation axis.

Lie algebra
Suppose the entries of Q are differentiable functions of t, and that t = 0 gives Q = I. Differentiating the orthogonality condition

  Q^T Q = I

yields

  (dQ/dt)^T Q + Q^T (dQ/dt) = 0.

Evaluation at t = 0 (Q = I) then implies

  (dQ/dt)^T = −(dQ/dt).

In Lie group terms, this means that the Lie algebra of an orthogonal matrix group consists of skew-symmetric matrices. Going the other direction, the matrix exponential of any skew-symmetric matrix is an orthogonal matrix (in fact, special orthogonal).

For example, the three-dimensional object physics calls angular velocity is a differential rotation, thus a vector in the Lie algebra tangent to SO(3). Given ω = (xθ, yθ, zθ), with v = (x, y, z) being a unit vector, the correct skew-symmetric matrix form of ω is

  Ω = [  0   −zθ   yθ ]
      [  zθ    0  −xθ ]
      [ −yθ   xθ    0 ]

The exponential of this is the orthogonal matrix for rotation around axis v by angle θ; setting c = cos(θ/2), s = sin(θ/2),

  exp(Ω) = [ 1 − 2s² + 2x²s²     2xys² − 2zsc       2xzs² + 2ysc    ]
           [ 2xys² + 2zsc        1 − 2s² + 2y²s²    2yzs² − 2xsc    ]
           [ 2xzs² − 2ysc        2yzs² + 2xsc       1 − 2s² + 2z²s² ]
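The skew-to-orthogonal direction can be checked numerically. A sketch (using SciPy's matrix exponential, an implementation choice not named in the article; the axis and angle are arbitrary):

```python
import numpy as np
from scipy.linalg import expm

theta = 0.9
v = np.array([1.0, 2.0, 2.0]) / 3.0   # unit axis vector
x, y, z = theta * v                    # components of omega = theta * v
Omega = np.array([[0.0, -z,   y],
                  [z,    0.0, -x],
                  [-y,   x,   0.0]])   # skew-symmetric form of omega

R = expm(Omega)
assert np.allclose(R.T @ R, np.eye(3))   # orthogonal
assert np.isclose(np.linalg.det(R), 1.0) # special orthogonal
assert np.allclose(R @ v, v)             # v is fixed: it is the rotation axis
```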

Numerical linear algebra

Benefits
Numerical analysis takes advantage of many of the properties of orthogonal matrices for numerical linear algebra, and they arise naturally. For example, it is often desirable to compute an orthonormal basis for a space, or an orthogonal change of bases; both take the form of orthogonal matrices. Having determinant ±1 and all eigenvalues of magnitude 1 is of great benefit for numeric stability. One implication is that the condition number is 1 (which is the minimum), so errors are not magnified when multiplying with an orthogonal matrix. Many algorithms use orthogonal matrices like Householder reflections and Givens rotations for this reason. It is also helpful that, not only is an orthogonal matrix invertible, but its inverse is available essentially free, by exchanging indices.

Permutations are essential to the success of many algorithms, including the workhorse Gaussian elimination with partial pivoting (where permutations do the pivoting). However, they rarely appear explicitly as matrices; their special form allows more efficient representation, such as a list of n indices.

Likewise, algorithms using Householder and Givens matrices typically use specialized methods of multiplication and storage. For example, a Givens rotation affects only two rows of a matrix it multiplies, changing a full multiplication of order n³ to a much more efficient order n. When uses of these reflections and rotations introduce zeros in a matrix, the space vacated is enough to store sufficient data to reproduce the transform, and to do so robustly. (Following Stewart (1976), we do not store a rotation angle, which is both expensive and badly behaved.)
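The two-row update can be sketched directly (NumPy; the helper name `apply_givens` and the test sizes are my own, for illustration). Only the two affected rows are touched, an O(n) operation per row pair, rather than a full matrix product:

```python
import numpy as np

def apply_givens(A, i, j, c, s):
    # Replace rows i and j of A by their rotation; all other rows untouched.
    Ai, Aj = A[i].copy(), A[j].copy()
    A[i] = c * Ai + s * Aj
    A[j] = -s * Ai + c * Aj
    return A

rng = np.random.default_rng(2)
A = rng.standard_normal((5, 5))
theta = 0.3
c, s = np.cos(theta), np.sin(theta)

# Full-matrix reference computation, for comparison only.
G = np.eye(5)
G[1, 1], G[1, 3] = c, s
G[3, 1], G[3, 3] = -s, c
expected = G @ A

result = apply_givens(A.copy(), 1, 3, c, s)
assert np.allclose(result, expected)
```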

Decompositions
A number of important matrix decompositions (Golub & Van Loan 1996) involve orthogonal matrices, including especially:

QR decomposition
  M = QR, Q orthogonal, R upper triangular
Singular value decomposition
  M = UΣV^T, U and V orthogonal, Σ diagonal matrix
Eigendecomposition of a symmetric matrix (decomposition according to the spectral theorem)
  S = QΛQ^T, S symmetric, Q orthogonal, Λ diagonal
Polar decomposition
  M = QS, Q orthogonal, S symmetric positive-semidefinite
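The orthogonal factors in the first three decompositions can be inspected with NumPy's built-in routines (a sketch; the random test matrix is my choice):

```python
import numpy as np

rng = np.random.default_rng(3)
M = rng.standard_normal((4, 4))
S = M + M.T                      # a symmetric matrix

Q, R = np.linalg.qr(M)           # QR: Q orthogonal, R upper triangular
U, sigma, Vt = np.linalg.svd(M)  # SVD: U and V orthogonal
lam, Qs = np.linalg.eigh(S)      # spectral theorem: S = Qs diag(lam) Qs^T

for W in (Q, U, Vt, Qs):
    assert np.allclose(W.T @ W, np.eye(4))       # each factor is orthogonal
assert np.allclose(np.triu(R), R)                # R is upper triangular
assert np.allclose(Qs @ np.diag(lam) @ Qs.T, S)  # eigendecomposition rebuilds S
```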

Examples
Consider an overdetermined system of linear equations, as might occur with repeated measurements of a physical phenomenon to compensate for experimental errors. Write Ax = b, where A is m × n, m > n. A QR decomposition reduces A to upper triangular R. For example, if A is 5 × 3 then R has the form

  R = [ r11  r12  r13 ]
      [  0   r22  r23 ]
      [  0    0   r33 ]
      [  0    0    0  ]
      [  0    0    0  ]

The linear least squares problem is to find the x that minimizes ‖Ax − b‖, which is equivalent to projecting b to the subspace spanned by the columns of A. Assuming the columns of A (and hence R) are independent, the projection solution is found from A^T Ax = A^T b. Now A^T A is square (n × n) and invertible, and also equal to R^T R. But the lower rows of zeros in R are superfluous in the product, which is thus already in lower-triangular upper-triangular factored form, as in Gaussian elimination (Cholesky decomposition). Here orthogonality is important not only for reducing A^T A = (R^T Q^T)QR to R^T R, but also for allowing solution without magnifying numerical problems.
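A sketch of the QR route for this least squares problem (NumPy; the dimensions match the 5 × 3 example above, the random data is my own):

```python
import numpy as np

rng = np.random.default_rng(4)
A = rng.standard_normal((5, 3))          # m = 5 equations, n = 3 unknowns
b = rng.standard_normal(5)

Q, R = np.linalg.qr(A)                   # reduced QR: Q is 5x3, R is 3x3
x = np.linalg.solve(R, Q.T @ b)          # solve R x = Q^T b

# Agrees with the normal-equations solution A^T A x = A^T b.
x_ne = np.linalg.solve(A.T @ A, A.T @ b)
assert np.allclose(x, x_ne)
```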

In the case of a linear system which is underdetermined, or an otherwise non-invertible matrix, singular value decomposition (SVD) is equally useful. With A factored as UΣV^T, a satisfactory solution uses the Moore–Penrose pseudoinverse, VΣ^+U^T, where Σ^+ merely replaces each non-zero diagonal entry with its reciprocal. Set x to VΣ^+U^T b.

The case of a square invertible matrix also holds interest. Suppose, for example, that A is a 3 × 3 rotation matrix which has been computed as the composition of numerous twists and turns. Floating point does not match the mathematical ideal of real numbers, so A has gradually lost its true orthogonality. A Gram–Schmidt process could orthogonalize the columns, but it is not the most reliable, nor the most efficient, nor the most invariant method. The polar decomposition factors a matrix into a pair, one of which is the unique closest orthogonal matrix to the given matrix, or one of the closest if the given matrix is singular. (Closeness can be measured by any matrix norm invariant under an orthogonal change of basis, such as the spectral norm or the Frobenius norm.) For a near-orthogonal matrix, rapid convergence to the orthogonal factor can be achieved by a "Newton's method" approach due to Higham (1986, 1990), repeatedly averaging the matrix with its inverse transpose. Dubrulle (1999) has published an accelerated method with a convenient convergence test.

For example, consider a non-orthogonal matrix for which the simple averaging algorithm takes seven steps, and which acceleration trims to two steps (with γ = 0.353553, 0.565685).

Gram–Schmidt yields an inferior solution, shown by a Frobenius distance of 8.28659 instead of the minimum 8.12404.
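The averaging iteration is short enough to sketch in full (NumPy; the function name, tolerances, and the perturbed rotation used as test input are my own, not the article's worked example):

```python
import numpy as np

def orthogonalize(M, tol=1e-12, max_iter=50):
    # Repeatedly average Q with its inverse transpose, as described above.
    Q = M.astype(float).copy()
    for _ in range(max_iter):
        Q_next = 0.5 * (Q + np.linalg.inv(Q).T)
        if np.linalg.norm(Q_next - Q, 'fro') < tol:
            return Q_next
        Q = Q_next
    return Q

# A rotation matrix perturbed by small accumulated errors.
theta = 0.5
R = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])
A = R + 1e-4 * np.array([[0.3, -0.7], [0.2, 0.5]])

Q = orthogonalize(A)
assert np.allclose(Q.T @ Q, np.eye(2), atol=1e-10)  # orthogonality restored
```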

Randomization
Some numerical applications, such as Monte Carlo methods and exploration of high-dimensional data spaces, require generation of uniformly distributed random orthogonal matrices. In this context, "uniform" is defined in terms of Haar measure, which essentially requires that the distribution not change if multiplied by any freely chosen orthogonal matrix. Orthogonalizing matrices with independent uniformly distributed random entries does not result in uniformly distributed orthogonal matrices, but the QR decomposition of independent normally distributed random entries does, as long as the diagonal of R contains only positive entries (Mezzadri 2006). Stewart (1980) replaced this with a more efficient idea that Diaconis & Shahshahani (1987) later generalized as the "subgroup algorithm" (in which form it works just as well for permutations and rotations). To generate an (n + 1) × (n + 1) orthogonal matrix, take an n × n one and a uniformly distributed unit vector of dimension n + 1. Construct a Householder reflection from the vector, then apply it to the smaller matrix (embedded in the larger size with a 1 at the bottom right corner).
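A sketch of the QR recipe attributed above to Mezzadri (2006): QR-factor a matrix of independent standard normals, then flip column signs so the diagonal of R is positive (NumPy; the function name and seed are my own):

```python
import numpy as np

def random_orthogonal(n, rng):
    Z = rng.standard_normal((n, n))
    Q, R = np.linalg.qr(Z)
    # Scale each column of Q by the sign of the matching diagonal entry of R,
    # which is equivalent to making the diagonal of R positive.
    Q *= np.sign(np.diag(R))
    return Q

rng = np.random.default_rng(5)
Q = random_orthogonal(4, rng)
assert np.allclose(Q.T @ Q, np.eye(4))
```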

Nearest orthogonal matrix


The problem of finding the orthogonal matrix Q nearest a given matrix M is related to the Orthogonal Procrustes problem. There are several different ways to get the unique solution, the simplest of which is taking the singular value decomposition of M and replacing the singular values with ones. Another method expresses Q explicitly but requires the use of a matrix square root:[2]

  Q = M (M^T M)^(−1/2)

This may be combined with the Babylonian method for extracting the square root of a matrix to give a recurrence which converges to an orthogonal matrix quadratically:

  Q_{n+1} = 2 M (Q_n^{−1} M + M^T Q_n)^{−1},

where Q_0 = M.
These iterations are stable provided the condition number of M is less than three.[3]

Using a first-order approximation of the inverse and the same initialization results in the modified iteration:

  N_n = Q_n^T Q_n
  P_n = (1/2) Q_n N_n
  Q_{n+1} = 2 Q_n + P_n N_n − 3 P_n
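The SVD route mentioned above, replacing the singular values with ones, amounts to Q = UV^T. A sketch (NumPy; the test matrix is my own, chosen symmetric positive definite so the square-root formula applies directly):

```python
import numpy as np

def nearest_orthogonal(M):
    # Replace the singular values of M with ones: Q = U V^T.
    U, _, Vt = np.linalg.svd(M)
    return U @ Vt

M = np.array([[3.0, 1.0], [1.0, 2.0]])
Q = nearest_orthogonal(M)
assert np.allclose(Q.T @ Q, np.eye(2))

# Agrees with the matrix-square-root formula Q = M (M^T M)^(-1/2).
w, V = np.linalg.eigh(M.T @ M)
inv_sqrt = V @ np.diag(w ** -0.5) @ V.T
assert np.allclose(Q, M @ inv_sqrt)
```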
Spin and pin
A subtle technical problem afflicts some uses of orthogonal matrices. Not only are the group components with determinant +1 and −1 not connected to each other, even the +1 component, SO(n), is not simply connected (except for SO(1), which is trivial). Thus it is sometimes advantageous, or even necessary, to work with a covering group of SO(n), the spin group, Spin(n). Likewise, O(n) has covering groups, the pin groups, Pin(n). For n > 2, Spin(n) is simply connected and thus the universal covering group for SO(n). By far the most famous example of a spin group is Spin(3), which is nothing but SU(2), or the group of unit quaternions.

The Pin and Spin groups are found within Clifford algebras, which themselves can be built from orthogonal matrices.

Rectangular matrices
If Q is not a square matrix, then the conditions Q^T Q = I and QQ^T = I are not equivalent. The condition Q^T Q = I says that the columns of Q are orthonormal. This can only happen if Q is an m × n matrix with n ≤ m (due to linear dependence). Similarly, QQ^T = I says that the rows of Q are orthonormal, which requires n ≥ m.

There is no standard terminology for these matrices. They are variously called "semi-orthogonal matrices", "orthonormal matrices", "orthogonal matrices", and sometimes simply "matrices with orthonormal rows/columns".

For the case n ≤ m, matrices with orthonormal columns may be referred to as orthogonal k-frames and they are elements of the Stiefel manifold.
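The asymmetry between the two conditions shows up immediately for a tall matrix (a sketch in NumPy; the 3 × 2 example is my choice):

```python
import numpy as np

Q = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [0.0, 0.0]])                   # 3x2, orthonormal columns

assert np.allclose(Q.T @ Q, np.eye(2))       # columns are orthonormal
assert not np.allclose(Q @ Q.T, np.eye(3))   # rows cannot all be orthonormal
```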
See also

Biorthogonal system

Notes

1. "Paul's online math notes" (http://tutorial.math.lamar.edu/Classes/LinAlg/OrthogonalMatrix.aspx), Paul Dawkins, Lamar University, 2008. Theorem 3(c)

2. "Finding the Nearest Orthonormal Matrix" (http://people.csail.mit.edu/bkph/articles/Nearest_Orthonormal_Matrix.pdf), Berthold K.P. Horn, MIT.

3. "Newton's Method for the Matrix Square Root" (http://www.maths.manchester.ac.uk/~nareports/narep91.pdf), archived at the Wayback Machine (https://web.archive.org/web/20110929131330/http://www.maths.manchester.ac.uk/~nareports/narep91.pdf) 2011-09-29, Nicholas J. Higham, Mathematics of Computation, Volume 46, Number 174, 1986.

References

Diaconis, Persi; Shahshahani, Mehrdad (1987), "The subgroup algorithm for generating uniform random variables", Probability in the Engineering and Informational Sciences, 1: 15–32, doi:10.1017/S0269964800000255, ISSN 0269-9648

Dubrulle, Augustin A. (1999), "An Optimum Iteration for the Matrix Polar Decomposition" (http://etna.mcs.kent.edu/), Electronic Transactions on Numerical Analysis, 8: 21–25

Golub, Gene H.; Van Loan, Charles F. (1996), Matrix Computations (3rd ed.), Baltimore: Johns Hopkins University Press, ISBN 978-0-8018-5414-9

Higham, Nicholas (1986), "Computing the Polar Decomposition—with Applications" (http://eprints.maths.manchester.ac.uk/694/1/high86p.pdf) (PDF), SIAM Journal on Scientific and Statistical Computing, 7 (4): 1160–1174, doi:10.1137/0907079, ISSN 0196-5204

Higham, Nicholas; Schreiber, Robert (July 1990), "Fast polar decomposition of an arbitrary matrix", SIAM Journal on Scientific and Statistical Computing, 11 (4): 648–655, doi:10.1137/0911038, ISSN 0196-5204

Stewart, G. W. (1976), "The Economical Storage of Plane Rotations", Numerische Mathematik, 25 (2): 137–138, doi:10.1007/BF01462266, ISSN 0029-599X

Stewart, G. W. (1980), "The Efficient Generation of Random Orthogonal Matrices with an Application to Condition Estimators", SIAM Journal on Numerical Analysis, 17 (3): 403–409, doi:10.1137/0717034, ISSN 0036-1429

Mezzadri, Francesco (2006), "How to generate random matrices from the classical compact groups", Notices of the American Mathematical Society, 54, arXiv:math-ph/0609050
External links

Wikiversity introduces the orthogonal matrix.

"Orthogonal matrix" (https://www.encyclopediaofmath.org/index.php?title=Orthogonal_matrix), Encyclopedia of Mathematics, EMS Press, 2001 [1994]

Tutorial and Interactive Program on Orthogonal Matrix (http://people.revoledu.com/kardi/tutorial/LinearAlgebra/MatrixOrthogonal.html)

Retrieved from "https://en.wikipedia.org/w/index.php?title=Orthogonal_matrix&oldid=1202509034"
This page was last edited on 2 February 2024, at 22:52 (UTC).
Content is available under CC BY-SA 4.0 unless otherwise noted.
