
4. MULTIVARIATE NORMAL DISTRIBUTION (Part I)

Lecture 3 Review:

Random vectors: vectors of random variables.

The expectation of a random vector is the vector of expectations.

$\mathrm{cov}(X, Y)$ is a matrix with $(i, j)$ entry $\mathrm{cov}(X_i, Y_j)$.

$\mathrm{cov}(AX, BY) = A\,\mathrm{cov}(X, Y)B'$.

We introduced quadratic forms $X'AX$, where $X$ is a random vector and $A$ is a matrix. More to come . . .
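The covariance identity from the review can be checked numerically. The sketch below (illustrative dimensions and matrices, using numpy) exploits the fact that the identity holds exactly for sample cross-covariances, since they are bilinear in their arguments:

```python
import numpy as np

rng = np.random.default_rng(0)

# Samples of a joint random vector: X and Y share a coordinate, so they
# are dependent and cov(X, Y) is non-trivial.
n = 1_000
W = rng.standard_normal((n, 5))
X = W[:, :3]                     # X is 3-dimensional
Y = W[:, 2:]                     # Y is 3-dimensional

A = rng.standard_normal((2, 3))  # arbitrary fixed matrices
B = rng.standard_normal((4, 3))

def cross_cov(U, V):
    """Sample cross-covariance matrix with (i, j) entry cov(U_i, V_j)."""
    Uc = U - U.mean(axis=0)
    Vc = V - V.mean(axis=0)
    return Uc.T @ Vc / (len(U) - 1)

lhs = cross_cov(X @ A.T, Y @ B.T)  # cov(AX, BY) from transformed samples
rhs = A @ cross_cov(X, Y) @ B.T    # A cov(X, Y) B'

print(np.max(np.abs(lhs - rhs)))   # zero up to floating-point error
```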

4.1 Definition of the Multivariate Normal Distribution


The following are equivalent definitions of the multivariate normal distribution (MVN). Given a vector $\mu$ and p.s.d. matrix $\Sigma$, $Y \sim N_n(\mu, \Sigma)$ if:

Definition 1: For p.d. $\Sigma$, the density function of $Y$ is

$$f_Y(y) = (2\pi)^{-n/2} |\Sigma|^{-1/2} \exp\{-\tfrac{1}{2}(y - \mu)'\Sigma^{-1}(y - \mu)\}.$$

Definition 2: The moment generating function (m.g.f.) of $Y$ is

$$M_Y(t) \equiv E[e^{t'Y}] = \exp\{\mu't + \tfrac{1}{2}t'\Sigma t\}.$$

Definition 3: $Y$ has the same distribution as

$$AZ + \mu,$$

where $Z = (Z_1, \ldots, Z_k)'$ is a vector of independent $N(0, 1)$ random variables and $A_{n \times k}$ satisfies $AA' = \Sigma$.
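Definition 3 is also a recipe for simulation: pick any $A$ with $AA' = \Sigma$ and set $Y = AZ + \mu$. A minimal numpy sketch (the Cholesky factor is one valid choice of $A$ when $\Sigma$ is p.d.; the particular $\mu$ and $\Sigma$ are illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)

mu = np.array([1.0, -2.0])
Sigma = np.array([[2.0, 0.6],
                  [0.6, 1.0]])         # positive definite

# Definition 3: Y = A Z + mu with A A' = Sigma. The Cholesky factor is
# one valid choice of A (any factor with A A' = Sigma works).
A = np.linalg.cholesky(Sigma)
assert np.allclose(A @ A.T, Sigma)

Z = rng.standard_normal((100_000, 2))  # rows are iid N(0, I) vectors
Y = Z @ A.T + mu                       # rows are draws of Y ~ N_2(mu, Sigma)

print(Y.mean(axis=0))                  # approximately mu
print(np.cov(Y, rowvar=False))         # approximately Sigma
```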

COMMENT: You may be inclined to focus on Definition 1, but the others are more useful.

Theorem: Definitions 1, 2, and 3 are equivalent for $\Sigma > 0$. Definitions 2 and 3 are equivalent for $\Sigma \geq 0$.

Proof of Def 3 $\Rightarrow$ Def 2:

For $Z_i \sim N(0, 1)$,

$$M_{Z_i}(t_i) = E[e^{t_i Z_i}] = \int e^{z_i t_i} \frac{e^{-z_i^2/2}}{\sqrt{2\pi}}\, dz_i = e^{t_i^2/2} \int \frac{e^{-(z_i - t_i)^2/2}}{\sqrt{2\pi}}\, dz_i = e^{t_i^2/2}.$$

If $Z = (Z_1, \ldots, Z_k)'$ is a random sample from $N(0, 1)$, then

$$M_Z(t) = E[e^{\sum_i Z_i t_i}] = E\Big[\prod_{i=1}^k e^{Z_i t_i}\Big] \stackrel{\text{ind}}{=} \prod_{i=1}^k E[e^{Z_i t_i}] = \prod_{i=1}^k M_{Z_i}(t_i) = \exp\Big\{\sum_{i=1}^k t_i^2/2\Big\} = \exp\{t't/2\}.$$

If $Y = AZ + \mu$,

$$\begin{aligned}
M_Y(t) &\equiv E[\exp\{Y't\}] \\
&= E[\exp\{(AZ + \mu)'t\}] \\
&= \exp\{\mu't\} E[\exp\{(AZ)'t\}] \\
&= \exp\{\mu't\} M_Z(A't) \\
&= \exp\{\mu't\} \exp\{\tfrac{1}{2}(A't)'(A't)\} \\
&= \exp\{\mu't\} \exp\{\tfrac{1}{2}t'(AA')t\} \\
&= \exp\{\mu't + \tfrac{1}{2}t'\Sigma t\}.
\end{aligned}$$

Proof of Def 2 $\Rightarrow$ Def 3:

Since $\Sigma \geq 0$ (and $\Sigma = \Sigma'$), there exists an orthogonal matrix $T_{n \times n}$ such that $T'\Sigma T = \Lambda$, where $\Lambda$ is diagonal with non-negative elements. Therefore,

$$\begin{aligned}
\Sigma &= T\Lambda T' \\
&= T\Lambda^{1/2}\Lambda^{1/2}T' \\
&= (T\Lambda^{1/2})(T\Lambda^{1/2})' \\
&= AA'.
\end{aligned}$$

In other words, let $A = T\Lambda^{1/2}$. Now, in the previous proof we showed the m.g.f. of $AZ + \mu$ is

$$\exp\{\mu't + \tfrac{1}{2}t'\Sigma t\},$$

the same as $Y$. Because the m.g.f. uniquely determines the distribution (when the m.g.f. exists in a neighbourhood of $t = 0$), $Y$ has the same distribution as $AZ + \mu$.
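The spectral construction $A = T\Lambda^{1/2}$ is easy to carry out numerically, and, unlike the Cholesky factor, it works even when $\Sigma$ is only p.s.d. A minimal sketch (the singular $\Sigma$ below is illustrative):

```python
import numpy as np

# Spectral construction of A with A A' = Sigma: write Sigma = T Lambda T'
# with T orthogonal and Lambda diagonal with non-negative entries, then
# take A = T Lambda^{1/2}. Works even for singular (p.s.d.) Sigma.
Sigma = np.array([[ 0.5, -0.5],
                  [-0.5,  0.5]])   # rank 1, so Cholesky would fail

lam, T = np.linalg.eigh(Sigma)     # T' Sigma T = diag(lam)
lam = np.clip(lam, 0.0, None)      # guard against tiny negative round-off
A = T @ np.diag(np.sqrt(lam))

print(A @ A.T)                     # recovers Sigma
```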

Proof of Def 3 $\Rightarrow$ Def 1 (for p.d. $\Sigma$):

Because $\Sigma$ is positive definite, there is a non-singular $A_{n \times n}$ such that $AA' = \Sigma$ (lecture notes # 2, page 10). Let $Y = AZ + \mu$, where $Z = (Z_1, \ldots, Z_n)'$ is a random sample from $N(0, 1)$. The density of $Z$ is

$$f_Z(z) = \prod_{i=1}^n (2\pi)^{-1/2} \exp\{-\tfrac{1}{2}z_i^2\} = (2\pi)^{-n/2} \exp\{-\tfrac{1}{2}z'z\}.$$

The density function of $Y$ is

$$f_Y(y) = f_Z(z(y))\,|J|,$$

where $J$ is the Jacobian

$$J = \left|\frac{\partial z_i}{\partial y_j}\right| = |A^{-1}| = |A|^{-1},$$

because $z = A^{-1}(y - \mu)$. Therefore,

$$\begin{aligned}
f_Y(y) &= f_Z(A^{-1}(y - \mu))\,|A|^{-1} \\
&= (2\pi)^{-n/2} |A|^{-1} \exp\{-\tfrac{1}{2}[A^{-1}(y - \mu)]'[A^{-1}(y - \mu)]\} \\
&= (2\pi)^{-n/2} |AA'|^{-1/2} \exp\{-\tfrac{1}{2}(y - \mu)'(AA')^{-1}(y - \mu)\} \\
&= (2\pi)^{-n/2} |\Sigma|^{-1/2} \exp\{-\tfrac{1}{2}(y - \mu)'\Sigma^{-1}(y - \mu)\}.
\end{aligned}$$

(Using: $|A|^{-1} = |A|^{-1/2}|A|^{-1/2} = |A|^{-1/2}|A'|^{-1/2} = (|A||A'|)^{-1/2} = |AA'|^{-1/2}$.)

Proof of Def 1 $\Rightarrow$ Def 2 (for p.d. $\Sigma$): Exercise: use the pdf in Def 1 and solve directly for the m.g.f.

4.2 Properties of the Multivariate Normal Distribution

1. $E[Y] = \mu$, $\mathrm{cov}(Y) = \Sigma$ (verify using Definition 3 and properties of means and covariances of random vectors).

2. If $Z = (Z_1, \ldots, Z_n)'$ is a random sample from $N(0, 1)$ then $Z$ has the $N_n(0_n, I_{n \times n})$ distribution (use Definition 3).

3. If $\Sigma$ is not p.d. then $Y$ has a singular MVN distribution and no density function exists.

Example: A singular MVN distribution. Let $Z = (Z_1, Z_2)' \sim N_2(0, I)$, and let $A$ be the linear transformation matrix

$$A = \begin{pmatrix} \tfrac{1}{2} & -\tfrac{1}{2} \\ -\tfrac{1}{2} & \tfrac{1}{2} \end{pmatrix}.$$

Let $Y = (Y_1, Y_2)'$ be the linear transformation

$$Y = AZ = \begin{pmatrix} (Z_1 - Z_2)/2 \\ (Z_2 - Z_1)/2 \end{pmatrix}.$$

By Definition 3, $Y \sim N(0, \Sigma)$, where

$$\Sigma = AA' = \begin{pmatrix} \tfrac{1}{2} & -\tfrac{1}{2} \\ -\tfrac{1}{2} & \tfrac{1}{2} \end{pmatrix} \begin{pmatrix} \tfrac{1}{2} & -\tfrac{1}{2} \\ -\tfrac{1}{2} & \tfrac{1}{2} \end{pmatrix} = \begin{pmatrix} \tfrac{1}{2} & -\tfrac{1}{2} \\ -\tfrac{1}{2} & \tfrac{1}{2} \end{pmatrix},$$

$$\mathrm{corr} = \begin{pmatrix} 1 & -1 \\ -1 & 1 \end{pmatrix}.$$

Makes sense, since $Y_2 = -Y_1$!
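The singularity is easy to see in simulation: $\Sigma$ has zero determinant, and every draw of $Y$ lands exactly on the line $y_2 = -y_1$, so no density on $\mathbb{R}^2$ can exist. A short numpy sketch of this example:

```python
import numpy as np

rng = np.random.default_rng(3)

A = np.array([[ 0.5, -0.5],
              [-0.5,  0.5]])
Sigma = A @ A.T                           # [[0.5, -0.5], [-0.5, 0.5]]

Z = rng.standard_normal((10_000, 2))
Y = Z @ A.T                               # draws from the singular N_2(0, Sigma)

print(np.linalg.det(Sigma))               # 0: Sigma is not invertible
print(np.max(np.abs(Y[:, 0] + Y[:, 1])))  # 0: all mass on the line y2 = -y1
```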

4.3 Linear Transformations of MVN Vectors

1. If $Y \sim N_n(\mu, \Sigma)$ and $C_{p \times n}$ is a matrix of rank $p$, then $CY \sim N_p(C\mu, C\Sigma C')$.

Proof: By Def 3, $Y = AZ + \mu$, where $AA' = \Sigma$. Then

$$\begin{aligned}
CY &= C(AZ + \mu) \\
&= CAZ + C\mu \\
&\sim N(C\mu, CA(CA)') \quad \text{(by Def 3)} \\
&= N(C\mu, C(AA')C') \\
&= N(C\mu, C\Sigma C').
\end{aligned}$$
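This property can be checked empirically: apply a fixed matrix $C$ to MVN draws and compare the sample mean and covariance of $CY$ with $C\mu$ and $C\Sigma C'$. A sketch with arbitrary illustrative values:

```python
import numpy as np

rng = np.random.default_rng(4)

mu = np.array([0.0, 1.0, -1.0])
Sigma = np.array([[2.0, 0.5, 0.0],
                  [0.5, 1.0, 0.3],
                  [0.0, 0.3, 1.5]])
C = np.array([[1.0, -1.0, 0.0],
              [0.0,  2.0, 1.0]])         # 2x3 matrix of rank 2

A = np.linalg.cholesky(Sigma)
Y = rng.standard_normal((200_000, 3)) @ A.T + mu   # draws of N_3(mu, Sigma)
X = Y @ C.T                                        # draws of CY

print(X.mean(axis=0))                    # approximately C mu
print(np.cov(X, rowvar=False))           # approximately C Sigma C'
```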

2. $Y$ is MVN if and only if $a'Y$ is normally distributed for all non-zero vectors $a$.

Proof: If $Y \sim N_n(\mu, \Sigma)$ then $a'Y \sim N(a'\mu, a'\Sigma a)$ by 4.3.1 (above). Conversely, assume that $X = a'Y$ is univariate normal for all non-zero $a$. In other words, $X \sim N(a'\mu, a'\Sigma a)$, where $\mu = E[Y]$ and $\Sigma = \mathrm{cov}(Y)$. Using the form of the m.g.f. of a univariate normal random variable, the m.g.f. of $X$ is

$$E[\exp(Xt)] = M_X(t) = \exp\{(a'\mu)t + \tfrac{1}{2}(a'\Sigma a)t^2\}$$

for all $t$. Setting $t = 1$ in $M_X(t)$ gives $M_Y$:

$$E[\exp(a'Y)] = M_X(1) = \exp\{a'\mu + \tfrac{1}{2}a'\Sigma a\} = M_Y(a),$$

which is the m.g.f. of $N_n(\mu, \Sigma)$. Therefore, $Y \sim N_n(\mu, \Sigma)$ by Def 2.

In words, a random vector is MVN iff every linear combination of its components is a (univariate) normal random variable.
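The forward direction can be illustrated numerically: project MVN draws onto an arbitrary non-zero $a$ and check that $a'Y$ has the mean $a'\mu$, variance $a'\Sigma a$, and the moment signature of a normal distribution (skewness 0, kurtosis 3). A sketch with illustrative values:

```python
import numpy as np

rng = np.random.default_rng(5)

mu = np.array([1.0, 2.0])
Sigma = np.array([[1.0, 0.8],
                  [0.8, 2.0]])
A = np.linalg.cholesky(Sigma)
Y = rng.standard_normal((500_000, 2)) @ A.T + mu   # draws of N_2(mu, Sigma)

a = np.array([3.0, -1.0])              # an arbitrary non-zero vector
X = Y @ a                              # draws of a'Y

# a'Y should be N(a'mu, a'Sigma a): check the first four moments.
s = (X - X.mean()) / X.std()
print(X.mean(), a @ mu)                # approximately equal
print(X.var(), a @ Sigma @ a)          # approximately equal
print((s**3).mean(), (s**4).mean())    # approximately 0 and 3 (normal shape)
```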
