MODULE 3
FUNCTION OF A RANDOM VARIABLE AND ITS
DISTRIBUTION
LECTURE 14
Topics
3.3 EXPECTATION AND MOMENTS OF A RANDOM
VARIABLE
Some special kinds of expectations which are frequently used are defined below.
Definition 3.2
(i) $\mu_1' = E(X)$, provided it is finite, is called the mean of the (distribution of) random variable $X$;
(ii) For $r \in \{1, 2, \ldots\}$, $\mu_r' = E(X^r)$, provided it is finite, is called the $r$-th moment of the (distribution of) random variable $X$;
(iii) For $r \in \{1, 2, \ldots\}$, $E(|X|^r)$, provided it is finite, is called the $r$-th absolute moment of the (distribution of) random variable $X$;
(iv) For $r \in \{1, 2, \ldots\}$, $\mu_r = E((X - \mu_1')^r)$, provided it is finite, is called the $r$-th central moment of the (distribution of) random variable $X$;
(v) $\mu_2 = E((X - \mu_1')^2)$, provided it is finite, is called the variance of the (distribution of) random variable $X$. The variance of a random variable $X$ is denoted by $\operatorname{Var}(X)$. The quantity $\sigma = \sqrt{\mu_2} = \sqrt{E((X - \mu_1')^2)}$ is called the standard deviation of the (distribution of) random variable $X$;
(vi) Suppose that the distribution function $F_X$ of a random variable $X$ can be decomposed as
$$F_X(x) = \alpha F_d(x) + (1 - \alpha) F_{AC}(x), \quad x \in \mathbb{R}, \ \alpha \in (0, 1),$$
where $F_d$ is a distribution function of a discrete type random variable (say $X_d$) and $F_{AC}$ is a distribution function of an absolutely continuous type random variable (say $X_{AC}$). Then, for a Borel function $h: \mathbb{R} \to \mathbb{R}$, the expectation of $h(X)$ is defined by
$$E(h(X)) = \alpha E(h(X_d)) + (1 - \alpha) E(h(X_{AC})).$$
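The quantities of Definition 3.2 can be computed directly for a discrete distribution. The sketch below uses a hypothetical three-point p.m.f. (an arbitrary illustration, not from the text) and evaluates the raw moments, central moments, variance and standard deviation:

```python
from math import isclose

# A hypothetical discrete p.m.f. (arbitrary illustration): P(X = x), x in {0, 1, 2}.
pmf = {0: 0.2, 1: 0.5, 2: 0.3}

def raw_moment(pmf, r):
    """mu'_r = E(X^r), the r-th moment of Definition 3.2 (ii)."""
    return sum(x ** r * p for x, p in pmf.items())

def central_moment(pmf, r):
    """mu_r = E((X - mu'_1)^r), the r-th central moment of Definition 3.2 (iv)."""
    mean = raw_moment(pmf, 1)
    return sum((x - mean) ** r * p for x, p in pmf.items())

mean = raw_moment(pmf, 1)      # mu'_1, the mean
var = central_moment(pmf, 2)   # mu_2, the variance
sd = var ** 0.5                # sigma, the standard deviation

assert isclose(mean, 1.1) and isclose(var, 0.49) and isclose(sd, 0.7)
```

For this p.m.f. the mean is $1.1$, the variance $0.49$ and the standard deviation $0.7$; the same routine applies to any finite-support p.m.f.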
Theorem 3.3
(i) If $h_1$ and $h_2$ are Borel functions such that $P(\{h_1(X) \le h_2(X)\}) = 1$, then $E(h_1(X)) \le E(h_2(X))$, provided the involved expectations are finite;
(ii) If, for real constants $a$ and $b$ with $a \le b$, $P(\{a \le X \le b\}) = 1$, then $a \le E(X) \le b$;
(iii) If $P(\{X \ge 0\}) = 1$ and $E(X) = 0$, then $P(\{X = 0\}) = 1$;
(iv) If $E(X)$ is finite, then $|E(X)| \le E(|X|)$;
(v) For real constants $a$ and $b$, $E(aX + b) = aE(X) + b$, provided the involved expectations are finite;
(vi) If $h_1, \ldots, h_m$ are Borel functions, then
$$E\left(\sum_{i=1}^{m} h_i(X)\right) = \sum_{i=1}^{m} E(h_i(X)),$$
provided the involved expectations are finite.
Proof. We will provide the proof for the situation when $X$ is of absolutely continuous type. The proof for the discrete case is analogous and is left as an exercise. Also, assertions (iv)-(vi) follow directly from the definition of the expectation of a random variable and elementary properties of integrals. Therefore we will provide the proofs of only the first three assertions.
(i) Let $f_X$ be a p.d.f. of $X$ with support $S_X$, let $A = \{x \in \mathbb{R}: h_1(x) \le h_2(x)\}$, $S_X^* = S_X \cap A$, and define $g(x) = f_X(x)\, I_{S_X^*}(x)$, $x \in \mathbb{R}$. Then $g(x) \ge 0$, $\forall x \in \mathbb{R}$, and, since $P(\{h_1(X) \le h_2(X)\}) = 1$, we have $P(\{X \in A^c\}) = 0$ and hence $P(\{X \in S_X \cap A^c\}) = 0$. Consequently,
$$P(\{X \in S_X^*\}) = P(\{X \in S_X \cap A\}) = P(\{X \in S_X \cap A\}) + P(\{X \in S_X \cap A^c\}) = P(\{X \in S_X\}) = 1,$$
$$\int_{-\infty}^{\infty} g(x)\,dx = \int_{-\infty}^{\infty} f_X(x)\, I_{S_X^*}(x)\,dx = P(\{X \in S_X^*\}) = 1$$
and, for any $B \in \mathcal{B}_1$,
$$P(\{X \in B\}) = P(\{X \in S_X \cap B\}) \qquad (\text{since } P(\{X \in S_X\}) = 1)$$
$$= P(\{X \in S_X \cap A \cap B\}) \qquad (\text{since } P(\{X \in S_X \cap A^c \cap B\}) = 0)$$
$$= P(\{X \in S_X^* \cap B\}) = \int_{-\infty}^{\infty} g(x)\, I_B(x)\,dx.$$
It follows that $g$ is also a p.d.f. of $X$, with support $S_X^* = S_X \cap A \subseteq A$. The above discussion suggests that, without loss of generality, we may take $S_X \subseteq A = \{x \in \mathbb{R}: h_1(x) \le h_2(x)\}$ (otherwise replace $f_X(\cdot)$ by $g(\cdot)$ and $S_X$ by $S_X^*$). Then
$$h_1(x)\, I_{S_X}(x)\, f_X(x) \le h_2(x)\, I_{S_X}(x)\, f_X(x), \quad \forall x \in \mathbb{R}$$
$$\Rightarrow E(h_1(X)) = \int_{-\infty}^{\infty} h_1(x)\, I_{S_X}(x)\, f_X(x)\,dx \le \int_{-\infty}^{\infty} h_2(x)\, I_{S_X}(x)\, f_X(x)\,dx = E(h_2(X)).$$
(ii) This follows from (i): taking $h_1(x) = a$, $h_2(x) = x$ gives $a \le E(X)$, and taking $h_1(x) = x$, $h_2(x) = b$ gives $E(X) \le b$.
(iii) Suppose that $P(\{X \ge 0\}) = 1$ and $E(X) = 0$. Then
$$0 = E(X) = \int_{-\infty}^{0} x f_X(x)\,dx + \int_{0}^{\infty} x f_X(x)\,dx = \int_{0}^{\infty} x f_X(x)\,dx$$
(the first integral vanishes since $P(\{X \ge 0\}) = 1$ allows us to take $S_X \subseteq [0, \infty)$), and, for every $n \in \{1, 2, \ldots\}$,
$$\int_{0}^{\infty} x f_X(x)\,dx \ge \int_{1/n}^{\infty} x f_X(x)\,dx \ge \frac{1}{n} \int_{1/n}^{\infty} f_X(x)\,dx = \frac{1}{n}\, P\left(\left\{X \ge \frac{1}{n}\right\}\right)$$
$$\Rightarrow P\left(\left\{X \ge \frac{1}{n}\right\}\right) = 0, \quad \forall n \in \{1, 2, \ldots\}$$
$$\Rightarrow P\left(\bigcup_{n=1}^{\infty} \left\{X \ge \frac{1}{n}\right\}\right) = \lim_{n \to \infty} P\left(\left\{X \ge \frac{1}{n}\right\}\right) = 0 \quad \left(\text{since } \left\{X \ge \frac{1}{n}\right\} \uparrow \{X > 0\}\right)$$
$$\Rightarrow P(\{X > 0\}) = 0,$$
and, since $P(\{X \ge 0\}) = 1$, we conclude that $P(\{X = 0\}) = 1$.
Corollary 3.1
Let $X$ be a random variable with finite first two moments and let $E(X) = \mu$. Then,
(i) $\operatorname{Var}(X) = E(X^2) - (E(X))^2$;
(ii) $\operatorname{Var}(X) \ge 0$. Moreover, $\operatorname{Var}(X) = 0$ if, and only if, $P(\{X = \mu\}) = 1$;
(iii) $E(X^2) \ge (E(|X|))^2$ (Cauchy-Schwarz inequality);
(iv) For real constants $a$ and $b$, $\operatorname{Var}(aX + b) = a^2 \operatorname{Var}(X)$.
Proof.
(i) Note that $\mu = E(X)$ is a fixed real number. Therefore, using Theorem 3.3 (v)-(vi), we have
$$\operatorname{Var}(X) = E((X - \mu)^2) = E(X^2 - 2\mu X + \mu^2) = E(X^2) - 2\mu E(X) + \mu^2 = E(X^2) - \mu^2 = E(X^2) - (E(X))^2.$$
(iv) Let $Y = aX + b$. Then $E(Y) = aE(X) + b$, so that
$$Y - E(Y) = a(X - E(X))$$
and
$$\operatorname{Var}(Y) = E\left((Y - E(Y))^2\right) = E\left(a^2 (X - E(X))^2\right) = a^2 E\left((X - E(X))^2\right) = a^2 \operatorname{Var}(X). \;▄$$
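Parts (i) and (iv) of Corollary 3.1 can be checked numerically. The sketch below uses a small hypothetical p.m.f. and arbitrary constants $a$, $b$ (illustrative choices, not from the text):

```python
from math import isclose

# A small hypothetical p.m.f. (arbitrary illustration, not from the text).
pmf = {-1: 0.25, 0: 0.25, 2: 0.5}

def E(h, pmf):
    """E(h(X)) for a discrete random variable X."""
    return sum(h(x) * p for x, p in pmf.items())

mu = E(lambda x: x, pmf)
var = E(lambda x: (x - mu) ** 2, pmf)

# (i): Var(X) = E(X^2) - (E(X))^2
assert isclose(var, E(lambda x: x * x, pmf) - mu ** 2)

# (iv): Var(aX + b) = a^2 Var(X); a = 3, b = -2 are arbitrary constants.
a, b = 3.0, -2.0
pmf_y = {a * x + b: p for x, p in pmf.items()}
mu_y = E(lambda y: y, pmf_y)
var_y = E(lambda y: (y - mu_y) ** 2, pmf_y)
assert isclose(var_y, a * a * var)
```

Note that the additive constant $b$ shifts the mean but leaves the variance unchanged, exactly as (iv) states.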
Example 3.5
Let $X$ be a random variable with p.d.f.
$$f_X(x) = \begin{cases} \dfrac{1}{2}, & \text{if } -2 < x < -1 \\[4pt] \dfrac{x}{9}, & \text{if } 0 < x < 3 \\[4pt] 0, & \text{otherwise,} \end{cases}$$
and let $Y_1 = \max(X, 0)$ and $Y_2 = 2X + 3e^{-\max(X, 0)} + 4$.
For $r \in \{1, 2, \ldots\}$,
$$E(Y_1^r) = E\left((\max(X, 0))^r\right) = \int_{-\infty}^{\infty} (\max(x, 0))^r f_X(x)\,dx = \int_{0}^{3} \frac{x^{r+1}}{9}\,dx = \frac{3^r}{r+2}.$$
(iii) We have
$$E(X) = \int_{-\infty}^{\infty} x f_X(x)\,dx = \int_{-2}^{-1} \frac{x}{2}\,dx + \int_{0}^{3} \frac{x^2}{9}\,dx = \frac{1}{4}$$
and
$$E\left(e^{-\max(X, 0)}\right) = \int_{-\infty}^{\infty} e^{-\max(x, 0)} f_X(x)\,dx = \int_{-2}^{-1} \frac{1}{2}\,dx + \int_{0}^{3} \frac{x}{9}\, e^{-x}\,dx = \frac{11 - 8e^{-3}}{18}.$$
Therefore,
$$E(Y_2) = E\left(2X + 3e^{-\max(X, 0)} + 4\right) = 2E(X) + 3E\left(e^{-\max(X, 0)}\right) + 4 = \frac{19 - 4e^{-3}}{3}. \;▄$$
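The closed-form answers of Example 3.5 can be cross-checked by numerical integration. The midpoint-rule integrator below is a plain sketch (not from the text); it integrates over each support piece separately, so the jump points of the density are never sampled:

```python
from math import exp, isclose

def f_X(x):
    """The p.d.f. of Example 3.5."""
    if -2 < x < -1:
        return 0.5
    if 0 < x < 3:
        return x / 9
    return 0.0

def midpoint(h, a, b, n=20_000):
    """Composite midpoint rule on (a, b); midpoints stay strictly interior,
    so the jump points of f_X at the endpoints are never sampled."""
    w = (b - a) / n
    return sum(h(a + (i + 0.5) * w) * w for i in range(n))

def E(h):
    """E(h(X)), integrating separately over the two support pieces."""
    g = lambda x: h(x) * f_X(x)
    return midpoint(g, -2.0, -1.0) + midpoint(g, 0.0, 3.0)

assert isclose(E(lambda x: x), 1 / 4, abs_tol=1e-6)            # E(X) = 1/4
assert isclose(E(lambda x: 2 * x + 3 * exp(-max(x, 0)) + 4),   # E(Y_2)
               (19 - 4 * exp(-3)) / 3, abs_tol=1e-6)
```

The grid size 20,000 is an arbitrary numerical choice; it keeps the quadrature error well below the stated tolerance for these smooth piecewise integrands.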
Example 3.6
Let $X$ be a random variable with the binomial p.m.f.
$$f_X(x) = \begin{cases} \dbinom{n}{x} p^x q^{n-x}, & \text{if } x \in \{0, 1, \ldots, n\} \\[4pt] 0, & \text{otherwise,} \end{cases}$$
where $n \in \{1, 2, \ldots\}$, $p \in (0, 1)$ and $q = 1 - p$.
Solution. For $r \in \{1, 2, \ldots, n\}$,
$$E(X(X-1)\cdots(X-r+1)) = \sum_{x=r}^{n} x(x-1)\cdots(x-r+1)\, \frac{n!}{x!\,(n-x)!}\, p^x q^{n-x}$$
$$= n(n-1)\cdots(n-r+1)\, p^r \sum_{x=r}^{n} \binom{n-r}{x-r} p^{x-r} q^{(n-r)-(x-r)}$$
$$= n(n-1)\cdots(n-r+1)\, p^r \sum_{x=0}^{n-r} \binom{n-r}{x} p^x q^{n-r-x}$$
$$= n(n-1)\cdots(n-r+1)\, p^r\, (q + p)^{n-r} = n(n-1)\cdots(n-r+1)\, p^r.$$
Also, for $t \in \mathbb{R}$,
$$M_X(t) = E(e^{tX}) = \sum_{x=0}^{n} \binom{n}{x} (pe^t)^x q^{n-x} = (q + pe^t)^n.$$
Therefore, for $T = e^X + 2e^{-X} + 6X(X-1) + 3X + 4$,
$$E(T) = E(e^X) + 2E(e^{-X}) + 6E(X(X-1)) + 3E(X) + 4$$
$$= (q + pe)^n + 2e^{-n}(qe + p)^n + 6n(n-1)p^2 + 3np + 4, \;▄$$
where $E(e^{\pm X}) = M_X(\pm 1)$, and $E(X) = np$ and $E(X(X-1)) = n(n-1)p^2$ follow from the factorial moment formula with $r = 1, 2$.
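The factorial-moment formula and the m.g.f. of Example 3.6 can be verified by direct enumeration over the binomial support. The parameters $n = 8$, $p = 0.3$ below are arbitrary illustrative choices:

```python
from math import comb, exp, isclose

n, p = 8, 0.3   # arbitrary illustrative parameters
q = 1 - p
pmf = {x: comb(n, x) * p ** x * q ** (n - x) for x in range(n + 1)}

def falling(m, r):
    """m (m-1) ... (m-r+1)."""
    out = 1
    for i in range(r):
        out *= m - i
    return out

# r-th factorial moment: E(X(X-1)...(X-r+1)) = n(n-1)...(n-r+1) p^r
for r in range(1, 5):
    lhs = sum(falling(x, r) * pr for x, pr in pmf.items())
    assert isclose(lhs, falling(n, r) * p ** r)

# m.g.f.: E(e^{tX}) = (q + p e^t)^n
for t in (-1.0, 0.0, 0.5):
    assert isclose(sum(exp(t * x) * pr for x, pr in pmf.items()),
                   (q + p * exp(t)) ** n)
```

Note that `falling(x, r)` vanishes for $x < r$, so summing over the whole support automatically reproduces the sum starting at $x = r$.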
We are familiar with the Laplace transform of a given real-valued function defined on $\mathbb{R}$. We also know that, under certain conditions, the Laplace transform of a function determines the function almost uniquely. In probability theory the Laplace transform of a p.d.f./p.m.f. of a random variable $X$ plays an important role and is referred to as the moment generating function (of the probability distribution) of the random variable $X$.
Definition 3.3
Let $X$ be a random variable and let $A = \{t \in \mathbb{R}: E(e^{tX}) \text{ is finite}\}$. Define
$$M_X(t) = E(e^{tX}), \quad t \in A.$$
(i) We call the function $M_X(\cdot)$ the moment generating function (m.g.f.) (of the probability distribution) of the random variable $X$;
(ii) We say that the m.g.f. of a random variable $X$ exists if there exists a positive real number $a$ such that $(-a, a) \subseteq A$ (i.e., if $M_X(t) = E(e^{tX})$ is finite in an interval containing $0$). ▄
The name moment generating function for the transform $M_X$ derives from the fact that $M_X(\cdot)$ can be used to generate the moments of the random variable $X$, as illustrated in the following theorem.
Theorem 3.4
Let $X$ be a random variable with m.g.f. $M_X$ that is finite on an interval $(-a, a)$, for some $a > 0$ (i.e., the m.g.f. of $X$ exists). Then,
(i) for every $r \in \{1, 2, \ldots\}$, $\mu_r' = E(X^r)$ is finite, and $\mu_r' = M_X^{(r)}(0)$, the $r$-th derivative of $M_X$ evaluated at $0$;
(ii) $M_X(t) = \displaystyle\sum_{r=0}^{\infty} \frac{t^r}{r!}\, \mu_r'$, $t \in (-a, a)$.
Proof. We will provide the proof for the case where $X$ is of absolutely continuous type. The proof for the case of discrete type $X$ follows in a similar fashion, with integral signs replaced by summation signs.
(i) Since $M_X(t) = E(e^{tX})$ is finite for every $t \in (-a, a)$, and $e^{tx} = e^{-t|x|}$ for $x < 0$ while $e^{tx} = e^{t|x|}$ for $x \ge 0$,
$$\int_{-\infty}^{0} e^{-t|x|} f_X(x)\,dx < \infty, \ \forall t \in (-a, a) \quad \text{and} \quad \int_{0}^{\infty} e^{t|x|} f_X(x)\,dx < \infty, \ \forall t \in (-a, a).$$
Since the interval $(-a, a)$ is symmetric about $0$, it follows that
$$\int_{-\infty}^{\infty} e^{|tx|} f_X(x)\,dx < \infty, \quad \forall t \in (-a, a).$$
Now fix $t \in (-a, a)$, $t \ne 0$, and $r \in \{1, 2, \ldots\}$. Since $|x|^r e^{-|tx|} \to 0$ as $|x| \to \infty$, there exists a constant $A_{r,t} > 0$ such that $|x|^r \le e^{|tx|}$ whenever $|x| > A_{r,t}$. Therefore,
$$E(|X|^r) = \int_{\{|x| \le A_{r,t}\}} |x|^r f_X(x)\,dx + \int_{\{|x| > A_{r,t}\}} |x|^r f_X(x)\,dx$$
$$\le A_{r,t}^{r} \int_{\{|x| \le A_{r,t}\}} f_X(x)\,dx + \int_{\{|x| > A_{r,t}\}} e^{|tx|} f_X(x)\,dx \le A_{r,t}^{r} + \int_{-\infty}^{\infty} e^{|tx|} f_X(x)\,dx < \infty.$$
Thus every moment $\mu_r' = E(X^r)$ is finite. Moreover,
$$M_X(t) = \int_{-\infty}^{\infty} e^{tx} f_X(x)\,dx \quad \text{and} \quad M_X^{(r)}(t) = \frac{d^r}{dt^r} \int_{-\infty}^{\infty} e^{tx} f_X(x)\,dx = \int_{-\infty}^{\infty} x^r e^{tx} f_X(x)\,dx,$$
where the differentiation under the integral sign can be justified for $t \in (-a, a)$. Hence
$$M_X^{(r)}(0) = \int_{-\infty}^{\infty} x^r f_X(x)\,dx = E(X^r).$$
(ii) We have
$$M_X(t) = \int_{-\infty}^{\infty} e^{tx} f_X(x)\,dx = \int_{-\infty}^{\infty} \left(\sum_{r=0}^{\infty} \frac{t^r x^r}{r!}\right) f_X(x)\,dx.$$
Interchanging the order of integration and summation (which can be justified for $t \in (-a, a)$), we get
$$M_X(t) = \sum_{r=0}^{\infty} \frac{t^r}{r!} \int_{-\infty}^{\infty} x^r f_X(x)\,dx = \sum_{r=0}^{\infty} \frac{t^r}{r!}\, \mu_r'. \;▄$$
Corollary 3.2
Under the assumptions of Theorem 3.4, let $\psi_X(t) = \ln M_X(t)$, $t \in (-a, a)$. Since $M_X(0) = 1$, we have
$$\psi_X^{(1)}(0) = \frac{M_X^{(1)}(0)}{M_X(0)} = E(X)$$
and
$$\psi_X^{(2)}(0) = \frac{M_X(0)\, M_X^{(2)}(0) - \left(M_X^{(1)}(0)\right)^2}{\left(M_X(0)\right)^2} = E(X^2) - (E(X))^2 = \operatorname{Var}(X). \;▄$$
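Corollary 3.2 can be illustrated numerically: for a bounded random variable the m.g.f. is finite everywhere, and central finite differences of $\psi_X = \ln M_X$ at $0$ should reproduce the mean and variance. The three-point p.m.f. below is an arbitrary hypothetical example:

```python
from math import exp, log, isclose

# A hypothetical discrete X (arbitrary three-point p.m.f.) illustrating Corollary 3.2.
pmf = {0: 0.3, 1: 0.4, 3: 0.3}

def M(t):
    """m.g.f. M_X(t) = E(e^{tX}); finite for every t since X is bounded."""
    return sum(exp(t * x) * p for x, p in pmf.items())

def psi(t):
    """psi_X(t) = ln M_X(t)."""
    return log(M(t))

h = 1e-5
psi1 = (psi(h) - psi(-h)) / (2 * h)              # central difference for psi'(0)
psi2 = (psi(h) - 2 * psi(0) + psi(-h)) / h ** 2  # central difference for psi''(0)

mean = sum(x * p for x, p in pmf.items())
var = sum(x * x * p for x, p in pmf.items()) - mean ** 2

assert isclose(psi1, mean, abs_tol=1e-6)   # psi'(0) = E(X)
assert isclose(psi2, var, abs_tol=1e-4)    # psi''(0) = Var(X)
```

The looser tolerance on the second derivative reflects the usual round-off amplification of second-order finite differences, not any slack in the corollary itself.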
Example 3.7
Let $X$ be a random variable with the Poisson p.m.f.
$$f_X(x) = \begin{cases} \dfrac{e^{-\lambda} \lambda^x}{x!}, & \text{if } x \in \{0, 1, 2, \ldots\} \\[4pt] 0, & \text{otherwise,} \end{cases}$$
where $\lambda > 0$.
Solution.
(i) We have
$$M_X(t) = E(e^{tX}) = \sum_{x=0}^{\infty} e^{tx}\, \frac{e^{-\lambda} \lambda^x}{x!} = e^{-\lambda} \sum_{x=0}^{\infty} \frac{(\lambda e^t)^x}{x!} = e^{-\lambda} e^{\lambda e^t} = e^{\lambda (e^t - 1)}, \quad \forall t \in \mathbb{R}.$$
Since $A = \{s \in \mathbb{R}: E(e^{sX}) < \infty\} = \mathbb{R}$, by Theorem 3.4 (i), for every $r \in \{1, 2, \ldots\}$, $\mu_r' = E(X^r)$ is finite. Clearly
$$M_X^{(1)}(t) = \lambda e^t e^{\lambda (e^t - 1)} \quad \text{and} \quad M_X^{(2)}(t) = \lambda e^t e^{\lambda (e^t - 1)} (1 + \lambda e^t), \quad t \in \mathbb{R}.$$
Therefore,
$$E(X) = M_X^{(1)}(0) = \lambda, \quad E(X^2) = M_X^{(2)}(0) = \lambda (1 + \lambda),$$
and $\operatorname{Var}(X) = E(X^2) - (E(X))^2 = \lambda$.
(ii) We have, for $t \in \mathbb{R}$,
$$\psi_X(t) = \ln M_X(t) = \lambda (e^t - 1) \;\Rightarrow\; \psi_X^{(1)}(t) = \psi_X^{(2)}(t) = \lambda e^t.$$
Therefore, $E(X) = \psi_X^{(1)}(0) = \lambda$ and $\operatorname{Var}(X) = \psi_X^{(2)}(0) = \lambda$.
(iii) We have
$$M_X^{(3)}(t) = \lambda e^t e^{\lambda (e^t - 1)} \left(\lambda^2 e^{2t} + 3\lambda e^t + 1\right), \quad t \in \mathbb{R}$$
$$\Rightarrow \mu_3' = E(X^3) = M_X^{(3)}(0) = \lambda (\lambda^2 + 3\lambda + 1).$$
Accordingly, by Theorem 3.4 (ii),
$$M_X(t) = 1 + \mu_1' t + \mu_2' \frac{t^2}{2!} + \mu_3' \frac{t^3}{3!} + \cdots = 1 + \lambda t + \lambda (\lambda + 1) \frac{t^2}{2!} + \lambda (\lambda^2 + 3\lambda + 1) \frac{t^3}{3!} + \cdots, \quad t \in \mathbb{R}. \;▄$$
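The Poisson results of Example 3.7 are easy to confirm by summing the truncated series directly. The value $\lambda = 2.5$ and the truncation point are arbitrary numerical choices:

```python
from math import exp, isclose

lam = 2.5  # an arbitrary illustrative value of lambda > 0

def pois_pmf(x):
    """Poisson p.m.f. e^{-lam} lam^x / x!, computed iteratively to avoid overflow."""
    p = exp(-lam)
    for k in range(1, x + 1):
        p *= lam / k
    return p

N = 200  # truncation point; the tail mass beyond it is negligible for lam = 2.5

# m.g.f.: E(e^{tX}) should equal e^{lam (e^t - 1)}
M = lambda t: sum(exp(t * x) * pois_pmf(x) for x in range(N))
assert isclose(M(0.7), exp(lam * (exp(0.7) - 1)))

# first three moments against the closed forms of Example 3.7
mean = sum(x * pois_pmf(x) for x in range(N))
second = sum(x ** 2 * pois_pmf(x) for x in range(N))
third = sum(x ** 3 * pois_pmf(x) for x in range(N))
assert isclose(mean, lam)                              # E(X) = lam
assert isclose(second - mean ** 2, lam)                # Var(X) = lam
assert isclose(third, lam * (lam ** 2 + 3 * lam + 1))  # mu'_3
```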
Example 3.8
Let $X$ be a random variable with p.d.f.
$$f_X(x) = \begin{cases} e^{-x}, & \text{if } x > 0 \\ 0, & \text{otherwise.} \end{cases}$$
Solution.
(i) We have
$$M_X(t) = E(e^{tX}) = \int_{0}^{\infty} e^{tx} e^{-x}\,dx = \int_{0}^{\infty} e^{-(1-t)x}\,dx < \infty, \quad \text{if } t < 1.$$
Clearly $A = \{s \in \mathbb{R}: E(e^{sX}) < \infty\} = (-\infty, 1) \supset (-1, 1)$ and $M_X(t) = (1-t)^{-1}$, $t < 1$. By Theorem 3.4 (i), for every $r \in \{1, 2, \ldots\}$, $\mu_r'$ is finite. Clearly
$$M_X^{(1)}(t) = (1-t)^{-2} \quad \text{and} \quad M_X^{(2)}(t) = 2(1-t)^{-3}, \quad t < 1,$$
so that
$$E(X) = M_X^{(1)}(0) = 1, \quad E(X^2) = M_X^{(2)}(0) = 2,$$
and $\operatorname{Var}(X) = E(X^2) - (E(X))^2 = 1$.
(ii) We have
$$\psi_X(t) = \ln M_X(t) = -\ln(1-t), \quad t < 1$$
$$\Rightarrow \psi_X^{(1)}(t) = \frac{1}{1-t} \quad \text{and} \quad \psi_X^{(2)}(t) = \frac{1}{(1-t)^2}, \quad t < 1$$
$$\Rightarrow E(X) = \psi_X^{(1)}(0) = 1 \quad \text{and} \quad \operatorname{Var}(X) = \psi_X^{(2)}(0) = 1.$$
(iii) We have
$$M_X(t) = (1-t)^{-1} = \sum_{r=0}^{\infty} t^r = \sum_{r=0}^{\infty} \frac{t^r}{r!}\, r!, \quad t \in (-1, 1).$$
Comparing this with the expansion $M_X(t) = \sum_{r=0}^{\infty} \frac{t^r}{r!}\, \mu_r'$ of Theorem 3.4 (ii), we get $\mu_r' = E(X^r) = r!$, $r \in \{1, 2, \ldots\}$. ▄
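Example 3.8 obtained $\mu_r' = E(X^r) = r!$ from the m.g.f.; the same values emerge from direct numerical integration of $x^r e^{-x}$. The midpoint-rule integrator, the truncation point $60$, and the grid size below are arbitrary numerical choices for this sketch:

```python
from math import exp, factorial, isclose

def midpoint(h, a, b, n):
    """Composite midpoint rule for the integral of h over (a, b)."""
    w = (b - a) / n
    return sum(h(a + (i + 0.5) * w) * w for i in range(n))

# mu'_r = E(X^r) = integral of x^r e^{-x} over (0, infinity); the integral
# is truncated at x = 60, where the remaining tail is negligible for small r.
for r in range(1, 6):
    mu_r = midpoint(lambda x: x ** r * exp(-x), 0.0, 60.0, 100_000)
    assert isclose(mu_r, factorial(r), rel_tol=1e-6)
```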
Example 3.9
Let $X$ be a random variable with the Cauchy p.d.f.
$$f_X(x) = \frac{1}{\pi} \cdot \frac{1}{1 + x^2}, \quad -\infty < x < \infty.$$
Show that the m.g.f. of 𝑋 does not exist.
Solution. From Example 3.4 we know that the expected value of 𝑋 is not finite.
Therefore, using Theorem 3.4 (i), we conclude that the m.g.f. of 𝑋 does not exist. ▄