
Term Paper

MTH 101

Prove that the product of two orthogonal matrices is orthogonal, and so is the
inverse of an orthogonal matrix. What does this mean in terms of rotations?

Submitted to: Miss Nivedita
Submitted by: Sudhanshu (Roll No. 90)

Orthogonal Matrices
Definition: A matrix Q is orthogonal if Q^T Q = I, where I is the identity matrix and Q^T
denotes the transpose of Q.

Equivalently, Q is the inverse of its own transpose, so that any two distinct rows, or any two
distinct columns, are orthogonal unit vectors.

A square matrix A over F is said to be orthogonal if AA^T = I; here and hereafter I denotes an
identity matrix of appropriate dimension. Equivalently, A is orthogonal just when each row of A is
orthogonal to every other row of A but has a scalar product of 1 with itself. Orthogonal matrices are
also characterized by their inverses.

A regular matrix (one whose determinant is not equal to zero) M is said to be
orthogonal if multiplying it by its transpose yields the identity matrix I; i.e.,
M·M^T = I.

How to recognize an Orthogonal Matrix.

An n × n matrix A is orthogonal if and only if its columns form an orthonormal basis of R^n. Concretely, A is
orthogonal if and only if

⟨Ae_i, Ae_j⟩ = 1 if i = j, and 0 if i ≠ j.

If M is orthogonal:

1. its transpose and inverse are identical: M^T = M^(-1).
2. it commutes with its transpose: M·M^T = M^T·M = I.
3. its transpose is also an orthogonal matrix.
4. when multiplied by an orthogonal matrix, the product is an orthogonal matrix.
5. its determinant is +/- 1. The reverse is not necessarily true; i.e., not all matrices whose
determinant is +/- 1 are orthogonal.
6. the sum of the squares of the elements in any given row or column is equal to 1.
7. the dot product of any two distinct rows, or of any two distinct columns, is equal to zero.

A square matrix (one with the same number of rows and columns) is orthogonal if both of the
following conditions hold:

1. the sum of the squares of the elements in every row and column is equal to 1.
2. the dot product of every pair of distinct rows (or of distinct columns) is equal to zero.

Example of an Orthogonal Matrix

            [ -7   4   4 ]
A = (1/9) · [  4  -1   8 ]
            [  4   8  -1 ]
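The two conditions above can be checked for this matrix numerically. The sketch below uses plain Python with ad hoc helper functions (`transpose`, `matmul` are not library calls) to verify that A·A^T = I up to floating-point rounding:

```python
# Check that the example matrix A satisfies A * A^T = I.
def transpose(m):
    return [list(row) for row in zip(*m)]

def matmul(a, b):
    return [[sum(x * y for x, y in zip(row, col)) for col in zip(*b)]
            for row in a]

A = [[-7/9, 4/9, 4/9],
     [4/9, -1/9, 8/9],
     [4/9, 8/9, -1/9]]

P = matmul(A, transpose(A))
# Every entry should match the identity matrix up to rounding error.
ok = all(abs(P[i][j] - (1.0 if i == j else 0.0)) < 1e-12
         for i in range(3) for j in range(3))
print(ok)  # True
```

Each row has squared norm (49 + 16 + 16)/81 = 1, and distinct rows have dot product zero, so the check passes.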

If A is an orthogonal matrix, then the determinant of A is either 1 or -1.

Proof:

By definition, A is orthogonal means that A^T = A^(-1) (read as: the
transpose of A is equal to its inverse).

Of course, from the definition of inverse of a matrix,

A * A^(-1) = I,

where I denotes the usual identity matrix.

It follows that for this orthogonal matrix A,

A * A^T = I.

Now taking determinants on both sides,

det(A * A^T) = det I.

The determinant is multiplicative, det(AB) = det A · det B, so det A · det A^T = 1 (since det I = 1; recall
that the determinant of a diagonal matrix is the product of its
diagonal terms).

Next, since det A^T = det A, this gives det A · det A = 1.

It follows that (det A)^2 = 1. Hence, det A = 1 or -1. (proven)
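For the 3 × 3 example matrix above, this can be confirmed directly. The `det3` helper below is an ad hoc sketch (cofactor expansion along the first row), not a library function:

```python
# Determinant of a 3x3 matrix by cofactor expansion along the first row.
def det3(m):
    return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
          - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
          + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))

A = [[-7/9, 4/9, 4/9],      # the orthogonal example matrix from above
     [4/9, -1/9, 8/9],
     [4/9, 8/9, -1/9]]

d = det3(A)
print(round(d, 10))  # 1.0, consistent with det A = +/-1
```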


The product AB of two orthogonal n × n matrices A and B is orthogonal.

Proof: We use the characterization of orthogonal matrices as those whose inverse
equals their transpose. Thus, if A and B are orthogonal, then AA^T = I and
BB^T = I, so we have

(AB)(AB)^T = ABB^T A^T = AIA^T = AA^T = I,

so AB is orthogonal. Likewise,

(A^(-1))(A^(-1))^T = (A^(-1))(A^T)^T = A^(-1) A = I,

so A^(-1) is orthogonal.
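A quick numerical illustration of the product claim, using two plane rotations as the orthogonal matrices (the helper functions are ad hoc, not from any library):

```python
import math

def rot(t):
    # 2x2 rotation by angle t: always orthogonal.
    return [[math.cos(t), -math.sin(t)],
            [math.sin(t),  math.cos(t)]]

def transpose(m):
    return [list(row) for row in zip(*m)]

def matmul(a, b):
    return [[sum(x * y for x, y in zip(row, col)) for col in zip(*b)]
            for row in a]

A, B = rot(0.7), rot(-1.3)
AB = matmul(A, B)
P = matmul(AB, transpose(AB))            # should be the identity
ok = all(abs(P[i][j] - (i == j)) < 1e-12
         for i in range(2) for j in range(2))
print(ok)  # True
```

Here the product of the two rotations is itself a rotation (by the summed angle), which is why it passes the orthogonality check.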

The inverse A−1 of an orthogonal n×n matrix A is orthogonal.

Proof:

A matrix M is orthogonal if M^T = M^(-1).

Multiply both sides by M and you have

1) MM^T = I
or
2) M^T M = I,

where I is the identity matrix.

So our definition tells us a matrix is orthogonal if its transpose equals its inverse, or if the product
(left or right) of the matrix and its transpose is the identity.

Now we want to show why the inverse of an orthogonal matrix is also orthogonal.
Let A be orthogonal. We are assuming it is square since it has an inverse.

Now we want to show that A^(-1) is orthogonal.


We need to show that A^(-1), multiplied by its own transpose, gives the identity.
Since A is orthogonal, A^(-1) = A^T.
Now compute:

A^(-1) (A^(-1))^T = A^(-1) (A^T)^T = A^(-1) A = I.

Compare this to the definition above in 1) (MM^T = I):
do you see how A^(-1) now fits the definition of orthogonal, with M = A^(-1)?
Of course, we could have multiplied in the other order and arrived at 2) above.
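Numerically, since A^(-1) = A^T for an orthogonal A, checking that A^T is orthogonal checks the inverse as well. A sketch with the example matrix from earlier (helper names are ad hoc):

```python
# For orthogonal A, A^-1 = A^T; verify that A^-1 is itself orthogonal.
def transpose(m):
    return [list(row) for row in zip(*m)]

def matmul(a, b):
    return [[sum(x * y for x, y in zip(row, col)) for col in zip(*b)]
            for row in a]

A = [[-7/9, 4/9, 4/9],
     [4/9, -1/9, 8/9],
     [4/9, 8/9, -1/9]]

A_inv = transpose(A)                 # equals A^-1 because A is orthogonal
P = matmul(A_inv, transpose(A_inv))  # should be the identity
ok = all(abs(P[i][j] - (i == j)) < 1e-12
         for i in range(3) for j in range(3))
print(ok)  # True
```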

In terms of rotations

• An orthogonal matrix with determinant +1 is a rotation. However, a rotation
in three dimensions is not determined by its angle of rotation alone; one
must also determine the axis of rotation, and specify from which direction
the angle is measured.

• Orthogonal matrices with determinant −1 are not rotations, but most of
them are not reflections either.
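The two determinant cases can be seen numerically in the plane: a rotation has determinant +1, while a reflection (here, an assumed example reflecting across the x-axis) has determinant −1. The helpers below are ad hoc sketches:

```python
import math

def rot(t):
    # 2x2 rotation by angle t.
    return [[math.cos(t), -math.sin(t)],
            [math.sin(t),  math.cos(t)]]

def det2(m):
    return m[0][0] * m[1][1] - m[0][1] * m[1][0]

R = rot(0.9)              # rotation: orthogonal, det +1
F = [[1.0, 0.0],
     [0.0, -1.0]]         # reflection across the x-axis: orthogonal, det -1
print(round(det2(R), 10), det2(F))  # 1.0 -1.0
```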

Some Properties and Applications of Simple Orthogonal Matrices


Conditions are found for a general transformation in the plane of two vectors u and v to be
orthogonal. The results characterize a rotation in the (u, v)-plane by the angle φ between u and v
and the angle of rotation. When φ = π/2 the Jacobi rotation matrix is a special case, but other
choices of φ are interesting. The rotation that maps a single vector x into a vector y of the same
size, by rotating in the (x, y)-plane, is found, and this may be used in much the same way that
Householder transforms are used. If (x1, y1) and (x2, y2) are pairs of vectors compatible in size
and angle, the orthogonal matrix that rotates in a suitably chosen plane so that x1 → x2 and y1 →
y2 is found. This has applications in mapping two columns of a matrix to a simple form, similar
to Householder operations on a single column.

