
Module 1: Matrices and Linear Algebra

Lesson 6

Eigenvalues and Eigenvectors of Matrices

6.1 Introduction

The concept of eigenvalues and eigenvectors of matrices is fundamental and has
wide applications in science and engineering. Eigenvalues are useful in studying
differential equations and continuous dynamical systems. They provide critical
information in engineering design and also arise naturally in fields such as physics
and chemistry.

6.2 Eigenvalues and Eigenvectors

Let A be a square matrix of size n over a real or complex field F. An element λ in F
is called an eigenvalue of A if there exists a non-zero vector x in Fⁿ (or an n × 1
matrix) such that Ax = λx.

If λ is an eigenvalue of A, then all the non-zero vectors x satisfying Ax = λx are
called eigenvectors corresponding to λ. A single eigenvalue may have
several eigenvectors associated with it. In fact, these eigenvectors, together
with the zero vector, form a subspace, as we shall see below.
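The defining relation Ax = λx can be checked numerically. Below is a minimal sketch using a hypothetical 2 × 2 matrix chosen for illustration (it does not appear in the text); the second assertion illustrates that scalar multiples of an eigenvector are again eigenvectors, consistent with the subspace property stated above.

```python
# Illustrative example only: A is a hypothetical matrix, not from the text.
def matvec(A, v):
    """Multiply a matrix (list of rows) by a vector (list of entries)."""
    return [sum(a * c for a, c in zip(row, v)) for row in A]

A = [[2, 1],
     [1, 2]]
x = [1, 1]        # each row of A sums to 3, so A x = 3 x
lam = 3

# The defining property Ax = lambda * x for a non-zero vector x.
assert matvec(A, x) == [lam * c for c in x]

# A non-zero scalar multiple of x is again an eigenvector for the same
# eigenvalue, illustrating the subspace property of the eigenspace.
assert matvec(A, [4, 4]) == [lam * 4, lam * 4]
```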

Theorem 6.2.1: Let A be an n × n matrix, λ be an eigenvalue of A, and S be the set
of all eigenvectors corresponding to λ. Then S ∪ {0} is a subspace of Fⁿ.

Proof: Let x1, x2 be eigenvectors corresponding to λ and let α be a scalar. Then

A(x1 + x2) = Ax1 + Ax2 = λx1 + λx2 = λ(x1 + x2)

and A(αx1) = αAx1 = αλx1 = λ(αx1). So x1 + x2, αx1 ∈ S ∪ {0}, and hence the result.

If S is the set of all eigenvectors corresponding to an eigenvalue λ, then the
subspace S ∪ {0} is called the eigenspace corresponding to the eigenvalue λ.
5 4
Example 6.2.1: For the matrix A =   over the real field , 6 is an
1 2
 4
eigenvalue because for the vector x =   in 2
1

 5 4  4   24   4
Ax =    =   = 6  1  = 6x.
 1 2  1   6   

2
Similarly y =  1
 
 is also an eigenvector of A corresponding to the eigenvalue 6.
2

Next we shall find all eigenvalues and associated eigenvectors of a matrix


systematically.

6.2.1 Method to find Eigenvalues and Eigenvectors

If λ is an eigenvalue of A and x is a corresponding eigenvector then

Ax = λx or (A − λI) x = 0 (6.1)

where I is the n × n identity matrix. Note that (6.1) is a homogeneous system of
linear equations. If (6.1) has a non-zero solution, then rank (A − λI) < n. Then A −
λI is not invertible, and one gets that det (A − λI) = 0. Therefore if λ is an
eigenvalue of A then it satisfies the equation det (A − λI) = 0 (because it has a
non-zero eigenvector). Since det (A − λI) is a polynomial in λ of degree n, we
obtain all values of λ by solving det (A − λI) = 0, and this equation has n
solutions, counted with multiplicity. We summarize the above discussion as
follows:

1. Eigenvalues of A are the solutions of det (A − λI) = 0.

2. If A is of size n, then A has n eigenvalues, counted with multiplicity.

3. If λ is an eigenvalue of A, then the non-zero solutions of the system (A − λI) x =
0 are the eigenvectors of A corresponding to λ; here x = (x1, x2, . . . , xn)ᵀ.

Eigenvalues of matrices are sometimes called characteristic values. The equation


det (A − λI) = 0 is called the characteristic equation and det (A − λI) is called the
characteristic polynomial associated with A.
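For a 2 × 2 matrix the characteristic equation det (A − λI) = 0 reduces to λ² − (trace A)λ + det A = 0, so both eigenvalues come from the quadratic formula. The sketch below (not part of the original text) applies this to the matrix of Example 6.2.1, whose eigenvalues turn out to be 1 and 6, consistent with 6 being an eigenvalue there.

```python
import math

# Matrix of Example 6.2.1.
A = [[5, 4],
     [1, 2]]

# For a 2x2 matrix: det(A - lambda*I) = lambda^2 - (trace)*lambda + det.
trace = A[0][0] + A[1][1]                     # 5 + 2 = 7
det = A[0][0] * A[1][1] - A[0][1] * A[1][0]   # 10 - 4 = 6

# Solve lambda^2 - 7*lambda + 6 = 0 by the quadratic formula.
disc = math.sqrt(trace ** 2 - 4 * det)        # sqrt(49 - 24) = 5
eigenvalues = sorted([(trace - disc) / 2, (trace + disc) / 2])

assert eigenvalues == [1.0, 6.0]
```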

We explain this method of finding eigenvalues and eigenvectors of a matrix


through an example below.

Example 6.2.2: Find all eigenvalues and their corresponding eigenvectors of the
matrix
5 4 2
 
A = 4 5 2 .
2 2 
 2

Solution: The characteristic polynomial of A is

                  | 5−λ   4     2  |
    det (A − λI) = |  4   5−λ    2  |  =  −(λ − 10)(λ − 1)².
                  |  2    2    2−λ |

So the characteristic equation is (λ − 10)(λ − 1)² = 0 and the eigenvalues are

λ = 10, 1, 1.

Eigenvectors corresponding to λ = 10: Here we solve the system (A − 10I) x = 0,

    | −5   4   2 | | x1 |
or  |  4  −5   2 | | x2 |  =  0,
    |  2   2  −8 | | x3 |

where x = (x1, x2, x3)ᵀ.

The echelon form of the coefficient matrix is

    | −5   4   2 |
    |  0  −9  18 | .
    |  0   0   0 |

So the given system of equations will be

−5x1 + 4x2 + 2x3 = 0,
−9x2 + 18x3 = 0.

Here x3 is a free variable. So let x3 = α, α ≠ 0, α ∈ ℝ. Then we get x2 = 2α and
x1 = 2α. So the set of all eigenvectors corresponding to λ = 10 is {(2α, 2α, α) : α ∈
ℝ, α ≠ 0}.
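The λ = 10 eigenvectors can be checked by substitution: taking α = 1 gives the vector (2, 2, 1), which should satisfy Ax = 10x for the matrix of Example 6.2.2. A quick check (not part of the original text):

```python
# Verifying the lambda = 10 eigenvector of Example 6.2.2 with alpha = 1.
def matvec(A, v):
    """Multiply a matrix (list of rows) by a vector (list of entries)."""
    return [sum(a * c for a, c in zip(row, v)) for row in A]

A = [[5, 4, 2],
     [4, 5, 2],
     [2, 2, 2]]
x = [2, 2, 1]

assert matvec(A, x) == [10 * c for c in x]
```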

Eigenvectors corresponding to λ = 1: Here we have to solve the system


(A - I) x = 0.

 4 4 2   x1 
  
or  4 4 2   x 2  = 0 .
 2 2 1 x 
  3 

 4 4 2
 
Echelon form of the co-efficient matrix is  0 0 0  . So the system will be
0 0 0
 
4x1 + 4x2 + 2x3 = 0.
or 2x1 + 2x2 + 2x3 = 0.

Here x2 and x3 are both free variables. So let x2 = α, x3 = β, with α, β ∈ ℝ not
both zero. Then x1 = −(2α + β)/2. The set of all eigenvectors corresponding to
λ = 1 is {(−(2α + β)/2, α, β) : α, β ∈ ℝ, α and β not both zero}.
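Membership in the λ = 1 eigenspace of Example 6.2.2 can also be checked directly: two concrete vectors satisfying 2x1 + 2x2 + x3 = 0 are (−1, 1, 0) and (−1, 0, 2), and each should satisfy Ax = x. A quick check (not part of the original text):

```python
# Verifying two vectors in the lambda = 1 eigenspace of Example 6.2.2.
def matvec(A, v):
    """Multiply a matrix (list of rows) by a vector (list of entries)."""
    return [sum(a * c for a, c in zip(row, v)) for row in A]

A = [[5, 4, 2],
     [4, 5, 2],
     [2, 2, 2]]

for v in ([-1, 1, 0], [-1, 0, 2]):
    assert matvec(A, v) == v   # A x = 1 * x
```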

6.2.2 Properties of Eigenvalues and Eigenvectors

In the following we present some properties of eigenvalues and eigenvectors of


matrices:

(1) The sum of the eigenvalues of a matrix A is equal to the sum of the diagonal
entries of A (called the trace of A). This property provides a procedure for
checking eigenvalues.
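For instance, property (1) applied to the matrix of Example 6.2.2: the eigenvalues 10, 1, 1 should sum to the trace of A. A quick check (not part of the original text):

```python
# Property (1) on Example 6.2.2: sum of eigenvalues equals the trace.
A = [[5, 4, 2],
     [4, 5, 2],
     [2, 2, 2]]

trace = sum(A[i][i] for i in range(3))   # 5 + 5 + 2 = 12
assert trace == 10 + 1 + 1
```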

(2) A matrix is invertible if and only if all its eigenvalues are non-zero.

This can be verified easily, as det A = det (A − 0I) = 0 if and only if 0 is an
eigenvalue of A. Also recall that det A = 0 if and only if A is not invertible.

(3) The eigenvalues of an upper (or lower) triangular matrix are the elements on
the main diagonal.

This is true because A − λI is again upper (or lower) triangular, and the
determinant of a triangular matrix is equal to the product of its (main) diagonal
entries, so det (A − λI) = (a11 − λ)(a22 − λ) · · · (ann − λ).

(4) If λ is an eigenvalue of A and if A is invertible, then 1/λ is an eigenvalue of A⁻¹.
Further, if x is an eigenvector of A corresponding to λ, then it is also an
eigenvector of A⁻¹ corresponding to 1/λ.

The above is true because if x is an eigenvector of A corresponding to the
eigenvalue λ, then Ax = λx. Multiplying both sides by A⁻¹ gives x = λA⁻¹x, or

A⁻¹x = (1/λ)x.

(Note that λ ≠ 0 by property (2), since A is invertible.)
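Property (4) can be checked on the matrix of Example 6.2.1: x = (4, 1) is an eigenvector of A for λ = 6, so it should be an eigenvector of A⁻¹ for 1/6. The sketch below (not part of the original text) builds A⁻¹ by the standard 2 × 2 adjugate formula and compares with floating-point tolerance.

```python
# Property (4) on Example 6.2.1: A_inv x = (1/6) x for x = (4, 1).
A = [[5, 4],
     [1, 2]]
det = 5 * 2 - 4 * 1   # 6

# Inverse of a 2x2 matrix via the adjugate formula.
A_inv = [[ 2 / det, -4 / det],
         [-1 / det,  5 / det]]

x = [4, 1]
Ainv_x = [sum(a * c for a, c in zip(row, x)) for row in A_inv]

assert all(abs(a - c / 6) < 1e-12 for a, c in zip(Ainv_x, x))
```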

(5) If λ is an eigenvalue of A then αλ is an eigenvalue of αA where α is any real or


complex number. Further if x is an eigenvector of A corresponding to the
eigenvalue λ then x is also an eigenvector of αA corresponding to eigenvalue
αλ. This is true because (αA) x = (αλ) x.

(6) If λ is an eigenvalue of A, then λᵏ is an eigenvalue of Aᵏ for any positive
integer k. Further, if x is an eigenvector of A corresponding to the eigenvalue λ,
then x is also an eigenvector of Aᵏ corresponding to the eigenvalue λᵏ. This is
true because if x is an eigenvector of A corresponding to the eigenvalue λ then

Aᵏx = Aᵏ⁻¹(Ax) = Aᵏ⁻¹(λx) = λ(Aᵏ⁻¹x) = λ²(Aᵏ⁻²x) = . . . = λᵏx.
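Property (6) with k = 2 on the matrix of Example 6.2.1: since Ax = 6x for x = (4, 1), applying A twice should give A²x = 36x. A quick check (not part of the original text):

```python
# Property (6) on Example 6.2.1: A^2 x = 6^2 x = 36 x.
def matvec(A, v):
    """Multiply a matrix (list of rows) by a vector (list of entries)."""
    return [sum(a * c for a, c in zip(row, v)) for row in A]

A = [[5, 4],
     [1, 2]]
x = [4, 1]

assert matvec(A, matvec(A, x)) == [36 * c for c in x]
```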

(7) If λ is an eigenvalue of A, then for any real or complex number c, λ − c is an
eigenvalue of A − cI. Further, if x is an eigenvector of A corresponding to the
eigenvalue λ, then x is also an eigenvector of A − cI corresponding to the
eigenvalue λ − c.

This is true because (A − cI) x = Ax – cx = λx – cx = (λ − c) x for an


eigenvalue λ and its corresponding eigenvector x of A.

(8) Every eigenvalue of A is also an eigenvalue of Aᵀ. One verifies this from the
fact that the determinant of a matrix is the same as the determinant of its transpose:

| A − λI | = | (Aᵀ)ᵀ − λIᵀ | = | (Aᵀ − λI)ᵀ | = | Aᵀ − λI |.

(9) The product of all the eigenvalues (with counting multiplicity) of a matrix
equals the determinant of the matrix.

(10) Eigenvectors corresponding to distinct eigenvalues are linearly independent.
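Property (9) can be checked on the matrix of Example 6.2.2: the product of the eigenvalues 10 · 1 · 1 should equal det A. The sketch below (not part of the original text) computes the determinant by cofactor expansion along the first row.

```python
# Property (9) on Example 6.2.2: product of eigenvalues equals det A.
A = [[5, 4, 2],
     [4, 5, 2],
     [2, 2, 2]]

# Cofactor expansion along the first row of the 3x3 determinant.
det = (A[0][0] * (A[1][1] * A[2][2] - A[1][2] * A[2][1])
       - A[0][1] * (A[1][0] * A[2][2] - A[1][2] * A[2][0])
       + A[0][2] * (A[1][0] * A[2][1] - A[1][1] * A[2][0]))

assert det == 10 * 1 * 1
```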

6.3 Conclusions

Some more properties of eigenvalues and eigenvectors will be discussed in the


next lecture. In a subsequent lecture we shall show that eigenvalues and
eigenvectors are used for diagonalization of matrices.

Keywords: Characteristic equation, eigenvalues, eigenvectors, properties of


eigenvalues and eigenvectors.

Suggested Readings:

Linear Algebra, Kenneth Hoffman and Ray Kunze, PHI Learning Pvt. Ltd., New
Delhi, 2009.

Linear Algebra, A. R. Rao and P. Bhimasankaram, Hindustan Book Agency, New
Delhi, 2000.

Linear Algebra and Its Applications, Fourth Edition, Gilbert Strang, Thomson
Books/Cole, 2006.

Matrix Methods: An Introduction, Second Edition, Richard Bronson, Academic
Press, 1991.
