
UCSD ECE269 Handout #18

Prof. Young-Han Kim Monday, March 19, 2018

Final Examination
(Total: 130 points)

There are 5 problems, each with multiple parts. Your answer should be as clear and succinct as
possible. You may include a simple justification to your answer in the space provided, but please
note that an erroneous argument could count against an otherwise correct answer.

1. True or false (5 points for each correct answer, -5 points for each wrong answer, 0 points for
each blank). Fill in each blank with “true” or “false”. A statement is considered “true” if it
holds for all matrices satisfying the conditions stated, and “false” otherwise. For example,

• A nonsingular matrix is invertible. True


• A strictly tall matrix is onto. False

Here the matrix dimensions are such that each expression makes sense, but they are otherwise
unspecified.

(a) A diagonalizable matrix A is nonsingular. False


As a counterexample, the matrix A = 0 is diagonalizable but singular.

(b) A nonsingular matrix A is diagonalizable. False


 
As a counterexample, the matrix A = [1 1; 0 1] is nonsingular but not diagonalizable.

(c) A positive square matrix A is positive definite. False


 
As a counterexample, the matrix A = [3 5; 5 3] is positive, but for x = [−1/√2 1/√2]T,
xT Ax = −2. Thus, A is not positive definite.
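Not part of the original solutions: the counterexample can be checked numerically with NumPy.

```python
# Check of the counterexample in (c): an entrywise-positive matrix
# need not be positive definite.
import numpy as np

A = np.array([[3.0, 5.0],
              [5.0, 3.0]])
x = np.array([-1.0, 1.0]) / np.sqrt(2)

q = x @ A @ x                  # quadratic form x^T A x
print(q)                       # -2.0
print(np.linalg.eigvalsh(A))   # [-2.  8.]: one negative eigenvalue
```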

(d) A positive definite and Hermitian matrix A is invertible. True


By the spectral theorem, a positive definite Hermitian matrix can be written A = U ΛU H with U unitary and Λ diagonal with strictly positive entries; hence Λ−1 exists and A−1 = U Λ−1 U H .

(e) A square matrix A with real and positive eigenvalues is positive definite. False

As a counterexample, consider A = [1 −10; 0 1]. The only eigenvalue of A is 1 (with
multiplicity two), but for x = [1 1]T, xT Ax = −8. Hence A is not positive definite.
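A similar numerical check (again not in the original solutions) confirms that both eigenvalues are positive while the quadratic form is not:

```python
# Check for (e): both eigenvalues of A equal 1, yet x^T A x < 0 for x = [1, 1]^T.
import numpy as np

A = np.array([[1.0, -10.0],
              [0.0,   1.0]])
x = np.array([1.0, 1.0])

print(np.linalg.eigvals(A))  # [1. 1.]
print(x @ A @ x)             # -8.0
```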

(f) For a square matrix A, rank(A) ≤ rank(eA ). True
Using properties of the matrix exponential from Homework #5, e−A eA = e−A+A = e0 = I
(which holds since A and −A commute). Thus, eA is invertible, hence full-rank, and so
rank(eA ) = n ≥ rank(A).

(g) For a square matrix A, ||eA − I|| ≤ e||A|| − 1. True

Using the definition of eA , eA − I = A + A2 /2! + · · · . From the triangle inequality and
submultiplicativity of the spectral norm it follows that

||eA − I|| = ||A + A2 /2! + · · · || ≤ ||A|| + ||A2 ||/2! + · · · ≤ ||A|| + ||A||2 /2! + · · · = e||A|| − 1.
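The inequality can be spot-checked numerically. The sketch below (not part of the exam) approximates eA with a truncated Taylor series so the check stays NumPy-only; scipy.linalg.expm would be the standard tool.

```python
# Spot check of ||e^A - I|| <= e^||A|| - 1 (spectral norm) on random matrices.
import numpy as np

def expm_taylor(A, terms=40):
    """Matrix exponential via truncated Taylor series; fine for moderate ||A||."""
    E = np.eye(len(A))
    term = np.eye(len(A))
    for k in range(1, terms):
        term = term @ A / k   # now equals A^k / k!
        E = E + term
    return E

rng = np.random.default_rng(0)
ok = True
for _ in range(100):
    A = rng.standard_normal((4, 4)) * 0.5
    lhs = np.linalg.norm(expm_taylor(A) - np.eye(4), 2)
    rhs = np.exp(np.linalg.norm(A, 2)) - 1.0
    ok = ok and lhs <= rhs + 1e-9
print(ok)  # True
```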

(h) If A is nilpotent, then I − A is invertible. True


The only eigenvalue of a nilpotent matrix A is 0 (cf. Question 4, Homework #6),
and thus the only eigenvalue of I − A is 1. This implies that I − A has no zero
eigenvalue and is thus invertible.
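For nilpotent A the inverse of I − A is even explicit: a finite Neumann series. A quick check (not in the original solutions):

```python
# If A^k = 0, then (I - A)(I + A + ... + A^{k-1}) = I - A^k = I.
import numpy as np

A = np.array([[0.0, 2.0, 5.0],
              [0.0, 0.0, 3.0],
              [0.0, 0.0, 0.0]])  # strictly upper triangular, hence nilpotent

inv = np.eye(3) + A + A @ A      # finite Neumann series
print(np.allclose(np.linalg.matrix_power(A, 3), 0))   # True: A^3 = 0
print(np.allclose((np.eye(3) - A) @ inv, np.eye(3)))  # True
```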

(i) If A ∈ Cn×n such that ||A|| < 1, then I − A is invertible. True

For I − A to be non-invertible, A must have 1 as an eigenvalue. Suppose 1 is an
eigenvalue of A with associated eigenvector v. Then

||A|| = max_{x≠0} ||Ax||/||x|| ≥ ||Av||/||v|| = ||v||/||v|| = 1,

which contradicts the fact that ||A|| < 1. Thus, A cannot have an eigenvalue of 1,
implying that I − A is invertible.

(j) Every eigenvalue λ of a complex square matrix A satisfies |λ| ≤ ||A||. True

Consider an eigenvalue λ of A with associated eigenvector v. By definition of ||A||,

||A|| = max_{x≠0} ||Ax||/||x|| ≥ ||Av||/||v|| = |λ| ||v||/||v|| = |λ|.

Thus |λ| ≤ ||A|| for every eigenvalue λ of A.
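A numerical illustration (not in the original solutions) of this spectral-radius bound:

```python
# Every eigenvalue of A satisfies |lambda| <= ||A|| (spectral norm).
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((6, 6)) + 1j * rng.standard_normal((6, 6))

spectral_radius = max(abs(np.linalg.eigvals(A)))
spectral_norm = np.linalg.norm(A, 2)   # largest singular value
print(spectral_radius <= spectral_norm + 1e-10)  # True
```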

2. Perpendicular distances (30 points). For v, w ∈ Rn with ||w|| = 1, define the “line” L in Rn
that goes through v in the direction w as

L = {v + wt : t ∈ R}.

(a) Let z ∈ Rn . Which point x∗ on the line L is the closest to z? Your answer should be in
terms of v, w, and z.
Solution:

x∗ = v + wwT (z − v) = v + ⟨w, z − v⟩w.

This can be formulated as a least-squares problem

min_t ||wt − (z − v)||2,

whose solution is

t∗ = (wT w)−1 wT (z − v) = (1/||w||2) wT (z − v) = wT (z − v),

which leads to

x∗ = v + wt∗ = v + wwT (z − v).
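The formula can be sketched in code (the specific vectors below are assumed for illustration):

```python
# Closest point on L = {v + wt} to z, via x* = v + w w^T (z - v), with ||w|| = 1.
import numpy as np

def closest_point(v, w, z):
    return v + w * (w @ (z - v))

v = np.array([1.0, 0.0, 0.0])
w = np.array([0.0, 1.0, 0.0])
z = np.array([2.0, 3.0, 4.0])

x_star = closest_point(v, w, z)
print(x_star)            # [1. 3. 0.]
print(w @ (z - x_star))  # 0.0: the residual is orthogonal to the direction w
```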

(b) Find the distance from z to the closest point x∗ on L in terms of v, w, and z.
Solution:
d(L, z) = √((z − v)T (I − wwT )(z − v)).

We have

d(L, z)2 = ||z − x∗ ||2
         = ||z − v − wwT (z − v)||2
         = ||z − v||2 − (z − v)T wwT (z − v)
         = (z − v)T (I − wwT )(z − v).
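As a sanity check (example vectors assumed), the closed form agrees with a brute-force minimization over t:

```python
# Compare d(L, z)^2 = (z-v)^T (I - w w^T)(z-v) with a grid search over t.
import numpy as np

v = np.array([1.0, 0.0, 0.0])
w = np.array([0.0, 1.0, 0.0])
z = np.array([2.0, 3.0, 4.0])

d2_formula = (z - v) @ (np.eye(3) - np.outer(w, w)) @ (z - v)
d2_brute = min(np.sum((v + w * t - z) ** 2) for t in np.linspace(-10, 10, 20001))

print(d2_formula)                                   # 17.0
print(np.isclose(d2_formula, d2_brute, atol=1e-6))  # True
```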


Now consider two lines
L1 = {v1 + w1 t1 : t1 ∈ R}
and
L2 = {v2 + w2 t2 : t2 ∈ R},
where ||w1 || = ||w2 || = 1.

(c) Find the distance between the two lines, that is, the smallest distance between a point
x1 on L1 and a point x2 on L2 , in terms of the vectors v1 , v2 , w1 , w2 , and any matrices
involving them. (Hint: You can consider the case w1 = ±w2 separately.)
Solution:

d(L1 , L2 ) = √((v2 − v1 )T (I − W (W T W )−1 W T )(v2 − v1 )) if w1 ≠ ±w2 ,
d(L1 , L2 ) = √((v2 − v1 )T (I − w1 w1T )(v2 − v1 )) if w1 = ±w2 ,

where W := [w1 −w2 ].

The square of the distance can be found as the optimal value for the following least-
squares problem:
min ||W t − b||2 ,
t∈R2

where W := [w1 −w2 ], t := [t1 t2 ]T , and b := v2 − v1 . Therefore, we have

d(L1 , L2 )2 = ||b − ProjR(W ) (b)||2 = ||b − W W + b||2 = bT (I − W W + )b.

Now, if w1 ≠ ±w2 , then W is tall and full-rank, therefore W W + = W (W T W )−1 W T ,
whence

d(L1 , L2 )2 = bT (I − W (W T W )−1 W T )b.

If w1 = ±w2 , then W + = (1/2)W T , so that W W + = (1/2)W W T , whence

d(L1 , L2 )2 = bT (I − (1/2)W W T )b.
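A quick numerical instance of the generic case w1 ≠ ±w2 (the two lines below, the x-axis and a shifted y-axis, are assumed for illustration):

```python
# Distance between L1 = {v1 + w1 t} and L2 = {v2 + w2 t} for w1 != ±w2.
import numpy as np

v1, w1 = np.array([0.0, 0.0, 0.0]), np.array([1.0, 0.0, 0.0])  # the x-axis
v2, w2 = np.array([0.0, 0.0, 1.0]), np.array([0.0, 1.0, 0.0])  # y-axis lifted by 1

W = np.column_stack([w1, -w2])
b = v2 - v1
d2 = b @ (np.eye(3) - W @ np.linalg.inv(W.T @ W) @ W.T) @ b
print(np.sqrt(d2))  # 1.0
```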


3. Matrix limits (20 points).
(a) Let

A = [1/3 2/3; 2/3 1/3].

Find the limit lim_{n→∞} An .
Solution:

lim_{n→∞} An = [1/2 1/2; 1/2 1/2].

For any primitive matrix A with Perron–Frobenius eigenvalue λ, lim_{n→∞} (A/λ)n = vwT ,
where v and w are respectively the right and left eigenvectors associated with λ, nor-
malized so that v T w = 1. In this case, since A is the transition matrix of an irreducible
and aperiodic Markov chain, λ = 1, v is the all-ones vector [1 1]T , and w is the stationary
distribution [1/2 1/2]T . Then,

lim_{n→∞} An = lim_{n→∞} (A/1)n = [1; 1][1/2 1/2] = [1/2 1/2; 1/2 1/2].
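The limit is easy to confirm numerically (not part of the exam):

```python
# A^n approaches the rank-one matrix v w^T = [[1/2, 1/2], [1/2, 1/2]].
import numpy as np

A = np.array([[1/3, 2/3],
              [2/3, 1/3]])
An = np.linalg.matrix_power(A, 50)
print(np.allclose(An, 0.5))  # True: every entry tends to 1/2
```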

(b) For 0 < α, β, 1 − α − β < 1, let

B = [α β 1−α−β; 1−α−β α β; β 1−α−β α].

Find lim_{n→∞} B n .
Solution:

lim_{n→∞} B n = [1/3 1/3 1/3; 1/3 1/3 1/3; 1/3 1/3 1/3].

Following the reasoning in Part (a), B is also the transition matrix of an irreducible,
aperiodic Markov chain with a state space of size 3. By symmetry, it is straightforward
to verify that [1/3 1/3 1/3] is the stationary distribution. Hence, in this case λ = 1,
v = [1 1 1]T , and w = [1/3 1/3 1/3]T . Then,

lim_{n→∞} B n = [1; 1; 1][1/3 1/3 1/3] = [1/3 1/3 1/3; 1/3 1/3 1/3; 1/3 1/3 1/3].
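Again a numerical confirmation, for one choice of α and β (values assumed):

```python
# B^n approaches the matrix with every entry 1/3, for any valid alpha, beta.
import numpy as np

alpha, beta = 0.2, 0.5
g = 1 - alpha - beta
B = np.array([[alpha, beta,  g],
              [g,     alpha, beta],
              [beta,  g,     alpha]])
Bn = np.linalg.matrix_power(B, 200)
print(np.allclose(Bn, 1/3))  # True
```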

4. Jordan canonical form (20 points). Suppose that a matrix A ∈ Rn×n has distinct eigenvalues
λ1 , . . . , λn . Let

B = [A I; 0 A].

(a) Find the eigenvalues of B (including multiplicity).


Solution:

B has eigenvalues λ1 , . . . , λn , each with multiplicity 2.

Since A has distinct eigenvalues, A is diagonalizable and we can write

A = T ΛT −1 ,

where Λ := diag(λ1 , . . . , λn ). We therefore have

B = [A I; 0 A] = [T 0; 0 T][Λ I; 0 Λ][T 0; 0 T]−1 .

Therefore, B is similar to the upper-triangular matrix [Λ I; 0 Λ] and hence has the
eigenvalues λ1 , . . . , λn , each with multiplicity 2.

(b) Find the Jordan blocks of B.


Solution:

J = diag(J1 , J2 , . . . , Jn ), with

Ji := [λi 1; 0 λi ].

We have

B = [T 0; 0 T][Λ I; 0 Λ][T 0; 0 T]−1 .

Letting v1 , . . . , vn be the columns of T, this relation can also be written as

B = T̃ J T̃ −1 ,

where

T̃ = [v1 0 v2 0 · · · vn 0; 0 v1 0 v2 · · · 0 vn ]

and J = diag(J1 , . . . , Jn ) with Ji := [λi 1; 0 λi ].
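The Jordan structure can be verified numerically on a small example (the eigenvalues and random eigenvector matrix below are assumed for illustration):

```python
# Build B = [[A, I], [0, A]] from A = T diag(lams) T^{-1}, interleave the columns
# of T as in T~, and check that T~^{-1} B T~ consists of 2x2 Jordan blocks.
import numpy as np

n = 3
lams = np.array([1.0, 2.0, 3.0])        # distinct eigenvalues of A
rng = np.random.default_rng(2)
T = rng.standard_normal((n, n))
A = T @ np.diag(lams) @ np.linalg.inv(T)
B = np.block([[A, np.eye(n)], [np.zeros((n, n)), A]])

Tt = np.zeros((2 * n, 2 * n))           # columns [v_i; 0], [0; v_i], interleaved
for i in range(n):
    Tt[:n, 2 * i] = T[:, i]
    Tt[n:, 2 * i + 1] = T[:, i]

J = np.linalg.inv(Tt) @ B @ Tt
expected = np.zeros((2 * n, 2 * n))
for i in range(n):
    expected[2 * i:2 * i + 2, 2 * i:2 * i + 2] = [[lams[i], 1.0],
                                                  [0.0, lams[i]]]
print(np.allclose(J, expected))  # True
```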

5. Matrix polynomial (10 points). Is there a real square matrix A of any size such that

A2 + 2A + 5I = 0?

If yes, find one such matrix. If not, prove that no such A exists.
Solution:

Yes, real square matrices satisfying A2 + 2A + 5I = 0 exist. One example is

A = [−1 −2; 2 −1].

By the Cayley–Hamilton theorem, any matrix satisfies its own characteristic equation. For
a 2 × 2 matrix A, the characteristic polynomial is λ2 − tr(A)λ + det(A), so A satisfies
A2 + 2A + 5I = 0 whenever tr(A) = −2 and det(A) = 5. One such matrix is

A = [−1 −2; 2 −1].

It can be verified that for this A, A2 + 2A + 5I = 0. Note that the solution is not
unique; for example, A = [−1 −4; 1 −1] also works.
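Both candidate matrices are easily verified in code:

```python
# Verify A^2 + 2A + 5I = 0 for the two example matrices above.
import numpy as np

candidates = (np.array([[-1.0, -2.0], [2.0, -1.0]]),
              np.array([[-1.0, -4.0], [1.0, -1.0]]))
ok = all(np.allclose(A @ A + 2 * A + 5 * np.eye(2), 0) for A in candidates)
print(ok)  # True
```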
