
Theorem 1. Let Σ be a k × k real symmetric matrix with eigendecomposition QΛQ′, where the diagonal
elements of Λ are ranked in decreasing order. Let Qn be a sequence of k × k matrices and Λn be a
sequence of diagonal k × k matrices with elements ranked in decreasing order. Suppose that
(i) ΣQn − Qn Λn → 0 as n → ∞
(ii) Q′n Qn → Ik as n → ∞
where convergence is with respect to the Frobenius norm. Then Λn → Λ as n → ∞. Moreover, if the
eigenvalues of Σ are distinct, then given a choice of sign for the columns of Q, there exists a sequence
of diagonal matrices Dn with elements ±1 such that Qn Dn → Q as n → ∞.
Proof. I will use the following facts from real analysis.
Fact 1. Let (xn) be a bounded sequence in Rk. Then (xn) has at least one cluster point x ∈ Rk. By
cluster point I mean that each neighborhood of x contains xn for infinitely many n. Equivalently,
there exists a subsequence (xm) of (xn) that converges to x.
Fact 2. Let (xn) be a bounded sequence in Rk and suppose that (xn) has exactly one cluster point
x. Then (xn) converges to x.
Let Qjn denote the jth column of Qn and λjn denote the jth diagonal element of Λn. From (ii) we
have ‖Qjn‖ → 1, and so the sequence Qjn is bounded for each j. From (i) we have ΣQjn − λjn Qjn → 0,
and so the sequence ΣQjn − λjn Qjn is bounded for each j. Also, ‖ΣQjn‖ ≤ ‖Σ‖ ‖Qjn‖, and Qjn
bounded implies that the sequence ΣQjn is bounded for each j. Next, using the triangle and
reverse triangle inequalities we have

‖λjn Qjn‖ ≤ ‖ΣQjn − λjn Qjn‖ + ‖ΣQjn‖                (1)

and so we get that the sequence λjn Qjn is bounded for each j. Since ‖Qjn‖ → 1, we have that
|λjn| = ‖λjn Qjn‖ / ‖Qjn‖ is bounded for all sufficiently large n, and so we conclude that the sequence
λjn is bounded for each j. We have shown that the matrix sequences Λn and Qn are bounded.
By fact 1, we know that Λn has at least one cluster point, say Λ∗ , a diagonal matrix. Let Λm be
a subsequence of Λn converging to Λ∗ , and consider the corresponding subsequence Qm of Qn . The
sequence Qm is bounded, and so again by fact 1 has at least one cluster point, say Q∗ . Let Ql be a
subsequence of Qm converging to Q∗ . Then we have that Λl → Λ∗ and Ql → Q∗ .
Now, using continuity of matrix multiplication we get from (i) and (ii) that

ΣQ∗ − Q∗ Λ∗ = lim (ΣQl − Ql Λl) = lim (ΣQn − Qn Λn) = 0                (2)

Q∗′ Q∗ = lim Q′l Ql = lim Q′n Qn = Ik                (3)

and so Λ∗ = Q∗−1 ΣQ∗. We deduce that the diagonal elements of Λ∗ are the eigenvalues of Σ.
Moreover, since the diagonal elements of Λl are ranked in decreasing order, the diagonal elements of
Λ∗ must be ranked in decreasing order as well. Hence Λ∗ = Λ. We have shown that the only cluster
point of the sequence Λn is Λ, and so by fact 2 we have shown that Λn → Λ.
Next we show convergence of Qn . Suppose the eigenvalues of Σ are distinct. Then the columns
of Q are unique up to a sign. Choose a sign for each column and let Q be the resulting matrix.
Since Λn − Λ → 0 and ‖Qn‖ → √k, we have Qn (Λn − Λ) → 0. Therefore (i) implies

ΣQn − Qn Λ = ΣQn − Qn Λn + Qn (Λn − Λ) → 0

Next, for each n, let Rn = Q−1 Qn . Then Rn is a sequence of k × k matrices. Using the eigende-
composition of Σ, continuity of matrix multiplication and the previous result we get

ΛRn − Rn Λ = ΛQ−1 Qn − Q−1 Qn Λ = Q−1 ΣQn − Q−1 Qn Λ = Q−1 (ΣQn − Qn Λ) → 0

Let Rnij with i ≠ j be any off-diagonal element of the matrix Rn in position (i, j). Then the previous
calculation shows that
(λi − λj )Rnij → 0
and, because eigenvalues are assumed to be distinct, we conclude Rnij → 0. Also, by (ii) we have
that
R′n Rn = Q′n (Q−1)′ Q−1 Qn = Q′n Qn → Ik
The (i, i) element of R′n Rn equals the sum over j of (Rnji)2, and since Rnji → 0 for every j ≠ i, we
deduce that, for each i,

(Rnii)2 → 1
Now define the ith diagonal element of Dn , denoted din , as follows

din = 1{Rnii ≥0} − 1{Rnii <0}


Then din = ±1. Furthermore, continuity of the square root function implies
din Rnii = |Rnii| = √((Rnii)2) → 1

for each i. We have shown that Rn Dn → Ik . Finally, continuity of matrix multiplication allows us
to conclude

Qn Dn = QRn Dn → Q
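
Theorem 1 can be illustrated numerically. The following NumPy sketch is only an illustration under assumed choices (a small k, and Qn, Λn taken as exact eigendecompositions of a perturbed matrix Σ + E/n, so that conditions (i) and (ii) hold by construction); the sign matrix Dn is built exactly as in the proof, with Rn = Q−1 Qn = Q′Qn.

```python
import numpy as np

rng = np.random.default_rng(0)
k = 4

# A symmetric matrix whose eigenvalues are (generically) distinct.
A = rng.standard_normal((k, k))
Sigma = A @ A.T + np.diag(np.arange(k, 0, -1.0))

# Reference eigendecomposition Sigma = Q Lam Q', eigenvalues in decreasing order.
lam, Q = np.linalg.eigh(Sigma)          # eigh returns increasing order
lam, Q = lam[::-1], Q[:, ::-1]

E = rng.standard_normal((k, k))
E = (E + E.T) / 2                       # fixed symmetric perturbation direction

for n in [10, 100, 1000, 10000]:
    # Exact eigendecomposition of Sigma + E/n, so that
    # Sigma Qn - Qn Lam_n = -(E/n) Qn -> 0 and Qn' Qn = I_k hold by construction.
    lam_n, Q_n = np.linalg.eigh(Sigma + E / n)
    lam_n, Q_n = lam_n[::-1], Q_n[:, ::-1]

    # Sign correction from the proof: d_n^i = +1 if R_n^ii >= 0, else -1,
    # with R_n = Q^{-1} Q_n = Q' Q_n (Q is orthogonal).
    R_n = Q.T @ Q_n
    D_n = np.diag(np.where(np.diag(R_n) >= 0.0, 1.0, -1.0))

    print(n,
          np.linalg.norm(lam_n - lam),        # distance between eigenvalue vectors
          np.linalg.norm(Q_n @ D_n - Q))      # ||Q_n D_n - Q||
```

As n grows, both reported norms should shrink toward zero, matching Λn → Λ and Qn Dn → Q.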

Theorem 2. Let Σ be a k × k real symmetric matrix with eigendecomposition QΛQ′, where the diagonal
elements of Λ are ranked in decreasing order. Let Qn be a sequence of random k × k matrices and Λn
be a sequence of random k × k diagonal matrices with elements ranked in decreasing order. Suppose
that
(i) ΣQn − Qn Λn → 0 almost surely
(ii) Q′n Qn → Ik almost surely
where convergence is with respect to the Frobenius norm. Then Λn → Λ almost surely. Moreover,
if the eigenvalues of Σ are distinct, then given a choice of sign for the columns of Q, there exists a
sequence of random diagonal matrices Dn with elements ±1 such that Qn Dn → Q almost surely.
Proof. ΣQn − Qn Λn → 0 almost surely and Q′n Qn → Ik almost surely imply (ΣQn − Qn Λn, Q′n Qn) →
(0, Ik) jointly almost surely. Therefore, by Theorem 1, we get Λn → Λ almost surely. Suppose the
eigenvalues of Σ are distinct and choose a sign for the columns of Q. Let Dn be as in Theorem 1
for each state in which ΣQn − Qn Λn → 0 and Q′n Qn → Ik hold, and let Dn = Ik in other states.
Then Qn Dn → Q almost surely.
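
A natural source of random sequences satisfying (i) and (ii) almost surely, offered here only as an illustration and not taken from the text, is the eigendecomposition of a sample covariance matrix: by the strong law of large numbers Sn → Σ almost surely, and since Sn Qn = Qn Λn exactly, ΣQn − Qn Λn = (Σ − Sn) Qn → 0 almost surely while Q′n Qn = Ik holds by construction. A minimal NumPy sketch along these lines:

```python
import numpy as np

rng = np.random.default_rng(1)
k = 3
Sigma = np.array([[4.0, 1.0, 0.5],
                  [1.0, 2.0, 0.3],
                  [0.5, 0.3, 1.0]])     # assumed population covariance, distinct eigenvalues

lam, Q = np.linalg.eigh(Sigma)
lam, Q = lam[::-1], Q[:, ::-1]          # decreasing order

L = np.linalg.cholesky(Sigma)
for n in [100, 10_000, 1_000_000]:
    X = rng.standard_normal((n, k)) @ L.T   # iid rows with covariance Sigma
    S_n = X.T @ X / n                       # sample covariance, S_n -> Sigma a.s.

    lam_n, Q_n = np.linalg.eigh(S_n)        # S_n Q_n = Q_n Lam_n and Q_n' Q_n = I_k
    lam_n, Q_n = lam_n[::-1], Q_n[:, ::-1]

    # Sign correction as in Theorem 1 (Q is orthogonal, so Q^{-1} = Q').
    D_n = np.diag(np.where(np.diag(Q.T @ Q_n) >= 0.0, 1.0, -1.0))
    print(n, np.max(np.abs(lam_n - lam)), np.linalg.norm(Q_n @ D_n - Q))
```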

Theorem 3. Let Σ be a k × k real symmetric matrix with eigendecomposition QΛQ′, where the diagonal
elements of Λ are ranked in decreasing order. Let Qn be a sequence of random k × k matrices and Λn
be a sequence of random k × k diagonal matrices with elements ranked in decreasing order. Suppose
that
(i) ΣQn − Qn Λn → 0 in probability
(ii) Q′n Qn → Ik in probability
where convergence is with respect to the Frobenius norm. Then Λn → Λ in probability. Moreover,
if the eigenvalues of Σ are distinct, then given a choice of sign for the columns of Q, there exists a
sequence of random diagonal matrices Dn with elements ±1 such that Qn Dn → Q in probability.
Proof. I will use the following characterization of convergence in probability.
Fact 3. Xn converges to X in probability if and only if every subsequence of Xn has a sub-
subsequence that converges to X almost surely.
ΣQn − Qn Λn → 0 in probability and Q′n Qn → Ik in probability imply (ΣQn − Qn Λn, Q′n Qn) →
(0, Ik) jointly in probability. Let Λm be a subsequence of Λn, and consider the corresponding
joint subsequence (ΣQm − Qm Λm, Q′m Qm). Then fact 3 implies that there exists a sub-subsequence
(ΣQl − Ql Λl, Q′l Ql) such that (ΣQl − Ql Λl, Q′l Ql) → (0, Ik) almost surely. Then Theorem 2 implies
Λl → Λ almost surely. We conclude by fact 3 that Λn → Λ in probability. Suppose the eigenvalues
of Σ are distinct and choose a sign for the columns of Q. We now reuse the arguments of Theorem
1, with all limits taken as probability limits. Since Λn − Λ → 0 and ‖Qn‖ → √k in probability, we
have Qn (Λn − Λ) → 0 in probability. Therefore (i) implies

ΣQn − Qn Λ = ΣQn − Qn Λn + Qn (Λn − Λ) → 0


in probability. For each n, let Rn = Q−1 Qn . Then Rn is a sequence of random k × k matrices.
Using the eigendecomposition of Σ, continuity of matrix multiplication and the previous result we
get

ΛRn − Rn Λ = ΛQ−1 Qn − Q−1 Qn Λ = Q−1 ΣQn − Q−1 Qn Λ = Q−1 (ΣQn − Qn Λ) → 0

in probability. Let Rnij with i ≠ j be any off-diagonal element of the matrix Rn in position (i, j).
Then the previous calculation shows that

(λi − λj )Rnij → 0

in probability and, because eigenvalues are assumed to be distinct, we conclude Rnij → 0 in proba-
bility. Also, by (ii) we have that
R′n Rn = Q′n (Q−1)′ Q−1 Qn = Q′n Qn → Ik
so we deduce that, for each i,

(Rnii )2 → 1
in probability. Now define the ith diagonal element of Dn , denoted din , as follows

din = 1{Rnii ≥0} − 1{Rnii <0}

Then din = ±1 in any state. Furthermore, continuity of the square root function implies
din Rnii = |Rnii| = √((Rnii)2) → 1

in probability for each i. We have shown that Rn Dn → Ik in probability. Finally, continuity of
matrix multiplication allows us to conclude

Qn Dn = QRn Dn → Q
in probability.

Theorem 4. Let Σ be a k × k real symmetric matrix with eigendecomposition QΛQ′, where the diagonal
elements of Λ are ranked in decreasing order. Let Qn,T be a double sequence of random k × k matrices
and Λn,T be a double sequence of random k × k diagonal matrices with elements ranked in decreasing
order. Suppose that
(i) ΣQn,T − Qn,T Λn,T → 0 in probability
(ii) Q′n,T Qn,T → Ik in probability
where convergence is with respect to the Frobenius norm. Then Λn,T → Λ in probability. Moreover,
if the eigenvalues of Σ are distinct, then given a choice of sign for the columns of Q, there exists a
double sequence of random diagonal matrices Dn,T with elements ±1 such that Qn,T Dn,T → Q in
probability.
Proof. I will use the following characterization of convergence in probability for double sequences.
Fact 4. Xn,T converges to X in probability as n, T → ∞ if and only if for any two strictly increasing
sequences of natural numbers (nl ) and (Tl ) we have that Xnl ,Tl converges to X in probability as
l → ∞.
Let (nl) and (Tl) be strictly increasing sequences of natural numbers. Then (i), (ii) and fact 4
imply that ΣQnl,Tl − Qnl,Tl Λnl,Tl → 0 and Q′nl,Tl Qnl,Tl → Ik in probability as l → ∞. Therefore by
Theorem 3 we deduce that Λnl,Tl → Λ in probability as l → ∞. By fact 4 again we conclude that
Λn,T → Λ in probability as n, T → ∞. Suppose the eigenvalues of Σ are distinct and choose a sign
for the columns of Q. We now reuse the arguments of Theorem 3, with all probability limits taken
as n, T → ∞. Since Λn,T − Λ → 0 and ‖Qn,T‖ → √k in probability, we have Qn,T (Λn,T − Λ) → 0
in probability. Therefore (i) implies

ΣQn,T − Qn,T Λ = ΣQn,T − Qn,T Λn,T + Qn,T (Λn,T − Λ) → 0


in probability. For each n, T , let Rn,T = Q−1 Qn,T . Then Rn,T is a double sequence of random k × k
matrices. Using the eigendecomposition of Σ, continuity of matrix multiplication and the previous
result we get

ΛRn,T − Rn,T Λ = ΛQ−1 Qn,T − Q−1 Qn,T Λ = Q−1 ΣQn,T − Q−1 Qn,T Λ = Q−1 (ΣQn,T − Qn,T Λ) → 0
in probability. Let Rn,Tij with i ≠ j be any off-diagonal element of the matrix Rn,T in position (i, j).
Then the previous calculation shows that

(λi − λj) Rn,Tij → 0

in probability and, because eigenvalues are assumed to be distinct, we conclude Rn,Tij → 0 in
probability. Also, by (ii) we have that
R′n,T Rn,T = Q′n,T (Q−1)′ Q−1 Qn,T = Q′n,T Qn,T → Ik
so we deduce that, for each i,
(Rn,Tii)2 → 1
in probability. Now define the ith diagonal element of Dn,T , denoted din,T , as follows

din,T = 1{Rn,Tii ≥ 0} − 1{Rn,Tii < 0}

Then din,T = ±1 in any state. Furthermore, continuity of the square root function implies
din,T Rn,Tii = |Rn,Tii| = √((Rn,Tii)2) → 1

in probability for each i. We have shown that Rn,T Dn,T → Ik in probability. Finally, continuity of
matrix multiplication allows us to conclude

Qn,T Dn,T = QRn,T Dn,T → Q


in probability.
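
For the double-index setting, a simple way to generate Qn,T and Λn,T satisfying (i) and (ii), again only an illustrative construction and not one taken from the text, is to eigendecompose Σ perturbed by a term of order 1/min(n, T), which vanishes exactly when n and T grow jointly. A minimal NumPy sketch:

```python
import numpy as np

rng = np.random.default_rng(2)
k = 3
A = rng.standard_normal((k, k))
Sigma = A @ A.T + np.diag([3.0, 2.0, 1.0])   # symmetric, generically distinct eigenvalues

lam, Q = np.linalg.eigh(Sigma)
lam, Q = lam[::-1], Q[:, ::-1]               # decreasing order

E = rng.standard_normal((k, k))
E = (E + E.T) / 2

for n, T in [(10, 10), (100, 50), (1000, 2000)]:
    # Perturbation of size O(1/min(n, T)): it vanishes only as n and T grow jointly.
    lam_nT, Q_nT = np.linalg.eigh(Sigma + E / min(n, T))
    lam_nT, Q_nT = lam_nT[::-1], Q_nT[:, ::-1]

    # Sign correction D_{n,T} built as in the proof, with R_{n,T} = Q' Q_{n,T}.
    D_nT = np.diag(np.where(np.diag(Q.T @ Q_nT) >= 0.0, 1.0, -1.0))
    print((n, T), np.max(np.abs(lam_nT - lam)), np.linalg.norm(Q_nT @ D_nT - Q))
```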
