1.10 Two-Dimensional Random Variables
Definition 1.14. Let $\Omega$ be a sample space and $X_1, X_2$ be functions, each assigning a real number $X_1(\omega)$, $X_2(\omega)$ to every outcome $\omega \in \Omega$, that is, $X_1 : \Omega \to \mathcal{X}_1 \subseteq \mathbb{R}$ and $X_2 : \Omega \to \mathcal{X}_2 \subseteq \mathbb{R}$. Then the pair $\mathbf{X} = (X_1, X_2)$ is called a two-dimensional random variable. The induced sample space (range) of the two-dimensional random variable is
$$\mathcal{X} = \{(x_1, x_2) : x_1 \in \mathcal{X}_1,\; x_2 \in \mathcal{X}_2\} \subseteq \mathbb{R}^2.$$

We will denote two-dimensional (bivariate) random variables by bold capital letters.
Definition 1.15. The cumulative distribution function of a two-dimensional rv $\mathbf{X} = (X_1, X_2)$ is
$$F_X(x_1, x_2) = P(X_1 \le x_1,\, X_2 \le x_2). \tag{1.10}$$

1.10.1 Discrete Two-Dimensional Random Variables


If all values of $\mathbf{X} = (X_1, X_2)$ are countable, i.e., the values are in the range
$$\mathcal{X} = \{(x_{1i}, x_{2j}) : i = 1, 2, \ldots,\; j = 1, 2, \ldots\},$$
then the variable is discrete. The cdf of a discrete rv $\mathbf{X} = (X_1, X_2)$ is
$$F_X(x_1, x_2) = \sum_{x_{2j} \le x_2} \sum_{x_{1i} \le x_1} p_X(x_{1i}, x_{2j}),$$
where $p_X(x_{1i}, x_{2j})$ denotes the joint probability mass function and
$$p_X(x_{1i}, x_{2j}) = P(X_1 = x_{1i},\, X_2 = x_{2j}).$$
As in the univariate case, the joint pmf satisfies the following conditions:

1. $p_X(x_{1i}, x_{2j}) \ge 0$ for all $i, j$;
2. $\sum_{\mathcal{X}_2} \sum_{\mathcal{X}_1} p_X(x_{1i}, x_{2j}) = 1$.
Example 1.18. Consider an experiment of tossing two fair dice and noting the outcome on each die. The whole sample space consists of 36 elements, i.e.,
$$\Omega = \{\omega_{ij} = (i, j) : i, j = 1, \ldots, 6\}.$$
Now, with each of these 36 elements associate values of two random variables, $X_1$ and $X_2$, such that

$X_1$ = sum of the outcomes on the two dice,
$X_2$ = |difference of the outcomes on the two dice|.

That is,
$$\mathbf{X}(\omega_{ij}) = (X_1(\omega_{ij}), X_2(\omega_{ij})) = (i + j,\, |i - j|), \quad i, j = 1, 2, \ldots, 6.$$
Then, the bivariate rv $\mathbf{X} = (X_1, X_2)$ has the following joint probability mass function (empty cells mean that the pmf is equal to zero at the relevant values of the rvs).

x2 \ x1 |    2     3     4     5     6     7     8     9    10    11    12
--------+------------------------------------------------------------------
      0 | 1/36        1/36        1/36        1/36        1/36        1/36
      1 |       1/18        1/18        1/18        1/18        1/18
      2 |             1/18        1/18        1/18        1/18
      3 |                   1/18        1/18        1/18
      4 |                         1/18        1/18
      5 |                               1/18

Expectations of functions of bivariate random variables are calculated in the same way as for univariate rvs. Let $g(x_1, x_2)$ be a real-valued function defined on $\mathcal{X}$. Then $g(\mathbf{X}) = g(X_1, X_2)$ is a rv and its expectation is
$$E[g(\mathbf{X})] = \sum_{\mathcal{X}} g(x_1, x_2)\, p_X(x_1, x_2).$$
Example 1.19. Let $X_1$ and $X_2$ be random variables as defined in Example 1.18. Then, for $g(X_1, X_2) = X_1 X_2$ we obtain
$$E[g(\mathbf{X})] = 2 \cdot 0 \cdot \tfrac{1}{36} + \ldots + 7 \cdot 5 \cdot \tfrac{1}{18} = \tfrac{245}{18}.$$
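
Both the table in Example 1.18 and this expectation can be verified by brute-force enumeration. Below is a minimal Python sketch (an illustration, not part of the original text) that builds the joint pmf from the 36 equally likely outcomes and evaluates $E[X_1 X_2]$.

```python
# Illustrative sketch: joint pmf of (X1, X2) from Example 1.18 and the
# expectation E[X1*X2] from Example 1.19.
from fractions import Fraction
from collections import defaultdict

pmf = defaultdict(Fraction)                 # joint pmf p_X(x1, x2)
for i in range(1, 7):
    for j in range(1, 7):
        # X1 = sum of the dice, X2 = |difference|; each outcome has prob 1/36
        pmf[(i + j, abs(i - j))] += Fraction(1, 36)

# E[g(X)] = sum of g(x1, x2) * p_X(x1, x2) over the support, with g = x1*x2
print(sum(x1 * x2 * p for (x1, x2), p in pmf.items()))   # 245/18
```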

Marginal pmfs
Each component of the two-dimensional rv is a random variable in its own right, and so we may be interested in calculating its probabilities, for example $P(X_1 = x_1)$. Such a univariate pmf is derived in the context of the joint distribution with the other random variable. We call it the marginal pmf.
Theorem 1.12. Let $\mathbf{X} = (X_1, X_2)$ be a discrete bivariate random variable with joint pmf $p_X(x_1, x_2)$. Then the marginal pmfs of $X_1$ and $X_2$, $p_{X_1}$ and $p_{X_2}$, are given respectively by
$$p_{X_1}(x_1) = P(X_1 = x_1) = \sum_{\mathcal{X}_2} p_X(x_1, x_2) \quad \text{and} \quad p_{X_2}(x_2) = P(X_2 = x_2) = \sum_{\mathcal{X}_1} p_X(x_1, x_2).$$

Proof. For $X_1$: let us denote $A_{x_1} = \{(x_1, x_2) : x_2 \in \mathcal{X}_2\}$. Then, for any $x_1 \in \mathcal{X}_1$ we may write
$$P(X_1 = x_1) = P(X_1 = x_1,\, x_2 \in \mathcal{X}_2) = P\big((X_1, X_2) \in A_{x_1}\big)
= \sum_{(x_1, x_2) \in A_{x_1}} P(X_1 = x_1,\, X_2 = x_2) = \sum_{\mathcal{X}_2} p_X(x_1, x_2).$$
For $X_2$ the proof is similar. $\square$

Example 1.20. The marginal distributions of the variables $X_1$ and $X_2$ defined in Example 1.18 are as follows.

x1         |    2     3     4     5     6     7     8     9    10    11    12
P(X1 = x1) | 1/36  1/18  1/12   1/9  5/36   1/6  5/36   1/9  1/12  1/18  1/36

x2         |    0     1     2     3     4     5
P(X2 = x2) |  1/6  5/18   2/9   1/6   1/9  1/18
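
These marginals can be reproduced numerically by summing the joint pmf over the other variable, as in Theorem 1.12. A short Python sketch (illustrative, not from the text):

```python
# Illustrative sketch: marginal pmfs of Example 1.20 via Theorem 1.12.
from fractions import Fraction
from collections import defaultdict

p1, p2 = defaultdict(Fraction), defaultdict(Fraction)
for i in range(1, 7):
    for j in range(1, 7):
        p1[i + j] += Fraction(1, 36)        # p_{X1}(x1): sum over x2
        p2[abs(i - j)] += Fraction(1, 36)   # p_{X2}(x2): sum over x1

print(sorted(p1.items()))   # [(2, 1/36), (3, 1/18), ..., (7, 1/6), ..., (12, 1/36)]
print(sorted(p2.items()))   # [(0, 1/6), (1, 5/18), (2, 2/9), (3, 1/6), (4, 1/9), (5, 1/18)]
```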

Exercise 1.13. Students in a class of 100 were classified according to gender (G) and smoking status (S) as follows:

                   S
               s    q    n
G   male      20   32    8 |  60
    female    10    5   25 |  40
              30   37   33 | 100

where s, q and n denote the smoking status: now smokes, did smoke but quit, and never smoked, respectively. Find the probability that a randomly selected student

1. is a male;
2. is a male smoker;
3. is either a smoker or did smoke but quit;
4. is a female who is a smoker or did smoke but quit.
1.10.2 Continuous Two-Dimensional Random Variables
If the values of $\mathbf{X} = (X_1, X_2)$ are elements of an uncountable set in the Euclidean plane, then the variable is jointly continuous. For example, the values might be in the range
$$\mathcal{X} = \{(x_1, x_2) : a \le x_1 \le b,\; c \le x_2 \le d\}$$
for some real $a, b, c, d$.
The cdf of a continuous rv $\mathbf{X} = (X_1, X_2)$ is defined as
$$F_X(x_1, x_2) = \int_{-\infty}^{x_2} \int_{-\infty}^{x_1} f_X(t_1, t_2)\, dt_1\, dt_2, \tag{1.11}$$
where $f_X(x_1, x_2)$ is the joint probability density function such that

1. $f_X(x_1, x_2) \ge 0$ for all $(x_1, x_2) \in \mathbb{R}^2$;
2. $\int_{-\infty}^{\infty} \int_{-\infty}^{\infty} f_X(x_1, x_2)\, dx_1\, dx_2 = 1$.

Equation (1.11) implies that
$$\frac{\partial^2 F_X(x_1, x_2)}{\partial x_1\, \partial x_2} = f_X(x_1, x_2). \tag{1.12}$$
Also,
$$P(a \le X_1 \le b,\; c \le X_2 \le d) = \int_c^d \int_a^b f_X(x_1, x_2)\, dx_1\, dx_2.$$
The marginal pdfs of $X_1$ and $X_2$ are defined similarly as in the discrete case, here using integrals:
$$f_{X_1}(x_1) = \int_{-\infty}^{\infty} f_X(x_1, x_2)\, dx_2, \quad \text{for } -\infty < x_1 < \infty,$$
$$f_{X_2}(x_2) = \int_{-\infty}^{\infty} f_X(x_1, x_2)\, dx_1, \quad \text{for } -\infty < x_2 < \infty.$$
Example 1.21. Calculate $P(\mathbf{X} \in A)$, where $A = \{(x_1, x_2) : x_1 + x_2 \ge 1\}$ and the joint pdf of $\mathbf{X} = (X_1, X_2)$ is defined by
$$f_X(x_1, x_2) = \begin{cases} 6 x_1 x_2^2 & \text{for } 0 < x_1 < 1,\; 0 < x_2 < 1, \\ 0 & \text{otherwise.} \end{cases}$$
The probability is a double integral of the pdf over the region $A$. The region is, however, limited by the domain in which the pdf is positive. We can write
$$A = \{(x_1, x_2) : x_1 + x_2 \ge 1,\; 0 < x_1 < 1,\; 0 < x_2 < 1\}
= \{(x_1, x_2) : x_1 \ge 1 - x_2,\; 0 < x_1 < 1,\; 0 < x_2 < 1\}
= \{(x_1, x_2) : 1 - x_2 < x_1 < 1,\; 0 < x_2 < 1\}.$$
Hence, the probability is
$$P(\mathbf{X} \in A) = \iint_A f_X(x_1, x_2)\, dx_1\, dx_2 = \int_0^1 \int_{1-x_2}^1 6 x_1 x_2^2\, dx_1\, dx_2 = 0.9.$$
Also, we can calculate the marginal pdfs:
$$f_{X_1}(x_1) = \int_0^1 6 x_1 x_2^2\, dx_2 = 2 x_1 x_2^3 \Big|_0^1 = 2 x_1,$$
$$f_{X_2}(x_2) = \int_0^1 6 x_1 x_2^2\, dx_1 = 3 x_1^2 x_2^2 \Big|_0^1 = 3 x_2^2.$$
These functions allow us to calculate probabilities involving only one variable. For example,
$$P\left(\tfrac{1}{4} < X_1 < \tfrac{1}{2}\right) = \int_{1/4}^{1/2} 2 x_1\, dx_1 = \tfrac{3}{16}.$$
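
The integrals above are easy to confirm numerically. The following SciPy sketch (illustrative, not part of the original text) integrates the joint pdf over the region $A$ and the marginal pdf over $(1/4, 1/2)$:

```python
# Illustrative sketch: numerical checks for Example 1.21 with SciPy.
from scipy.integrate import dblquad, quad

joint = lambda x1, x2: 6 * x1 * x2**2

# dblquad integrates its first argument (x1) on the inner limits; here
# x1 runs from 1 - x2 to 1 while x2 runs from 0 to 1, matching the set A.
p_A, _ = dblquad(joint, 0, 1, lambda x2: 1 - x2, lambda x2: 1)
print(p_A)                                   # 0.9

# P(1/4 < X1 < 1/2) from the marginal pdf f_{X1}(x1) = 2*x1
p, _ = quad(lambda x1: 2 * x1, 0.25, 0.5)
print(p)                                     # 0.1875 = 3/16
```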

Analogously to the discrete case, the expectation of a function $g(\mathbf{X})$ is given by
$$E[g(\mathbf{X})] = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} g(x_1, x_2)\, f_X(x_1, x_2)\, dx_1\, dx_2.$$
Similarly as in the case of univariate rvs, the following linearity property of the expectation holds for bivariate rvs:
$$E[a\, g(\mathbf{X}) + b\, h(\mathbf{X}) + c] = a\, E[g(\mathbf{X})] + b\, E[h(\mathbf{X})] + c, \tag{1.13}$$
where $a$, $b$ and $c$ are constants and $g$ and $h$ are some functions of the bivariate rv $\mathbf{X} = (X_1, X_2)$.
1.10.3 Conditional Distributions
Definition 1.16. Let $\mathbf{X} = (X_1, X_2)$ denote a discrete bivariate rv with joint pmf $p_X(x_1, x_2)$ and marginal pmfs $p_{X_1}(x_1)$ and $p_{X_2}(x_2)$. For any $x_1$ such that $p_{X_1}(x_1) > 0$, the conditional pmf of $X_2$ given that $X_1 = x_1$ is the function of $x_2$ defined by
$$p_{X_2|x_1}(x_2) = \frac{p_X(x_1, x_2)}{p_{X_1}(x_1)}.$$
Analogously, we define the conditional pmf of $X_1$ given $X_2 = x_2$:
$$p_{X_1|x_2}(x_1) = \frac{p_X(x_1, x_2)}{p_{X_2}(x_2)}.$$

It is easy to check that these functions are indeed pmfs. For example,
$$\sum_{\mathcal{X}_2} p_{X_2|x_1}(x_2) = \sum_{\mathcal{X}_2} \frac{p_X(x_1, x_2)}{p_{X_1}(x_1)} = \frac{\sum_{\mathcal{X}_2} p_X(x_1, x_2)}{p_{X_1}(x_1)} = \frac{p_{X_1}(x_1)}{p_{X_1}(x_1)} = 1.$$
Example 1.22. Let $X_1$ and $X_2$ be defined as in Example 1.18. The conditional pmf of $X_2$ given $X_1 = 5$ is

x2                |    0     1     2     3     4     5
p_{X2|X1=5}(x2)   |    0   1/2     0   1/2     0     0
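
This can be checked directly: conditioning on $X_1 = 5$ restricts attention to the four equally likely outcomes with $i + j = 5$. A Python sketch (illustrative only):

```python
# Illustrative sketch: conditional pmf of X2 given X1 = 5 (Example 1.22).
from fractions import Fraction
from collections import Counter

pairs = [(i, j) for i in range(1, 7) for j in range(1, 7) if i + j == 5]
counts = Counter(abs(i - j) for i, j in pairs)       # values of X2 on {X1 = 5}
print({x2: Fraction(c, len(pairs)) for x2, c in counts.items()})  # {3: 1/2, 1: 1/2}
```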

Exercise 1.14. Let S and G denote the smoking status and gender as defined in Exercise 1.13. Calculate the probability that a randomly selected student is

1. a smoker, given that he is a male;
2. a female, given that the student smokes.

Analogously to the conditional distribution for discrete rvs, we define the conditional distribution for continuous rvs.
Definition 1.17. Let $\mathbf{X} = (X_1, X_2)$ denote a continuous bivariate rv with joint pdf $f_X(x_1, x_2)$ and marginal pdfs $f_{X_1}(x_1)$ and $f_{X_2}(x_2)$. For any $x_1$ such that $f_{X_1}(x_1) > 0$, the conditional pdf of $X_2$ given that $X_1 = x_1$ is the function of $x_2$ defined by
$$f_{X_2|x_1}(x_2) = \frac{f_X(x_1, x_2)}{f_{X_1}(x_1)}.$$
Analogously, we define the conditional pdf of $X_1$ given $X_2 = x_2$:
$$f_{X_1|x_2}(x_1) = \frac{f_X(x_1, x_2)}{f_{X_2}(x_2)}.$$

Here too, it is easy to verify that these functions are pdfs. For example,
$$\int_{\mathcal{X}_2} f_{X_2|x_1}(x_2)\, dx_2 = \int_{\mathcal{X}_2} \frac{f_X(x_1, x_2)}{f_{X_1}(x_1)}\, dx_2 = \frac{\int_{\mathcal{X}_2} f_X(x_1, x_2)\, dx_2}{f_{X_1}(x_1)} = \frac{f_{X_1}(x_1)}{f_{X_1}(x_1)} = 1.$$
Example 1.23. For the random variables defined in Example 1.21, the conditional pdfs are
$$f_{X_1|x_2}(x_1) = \frac{f_X(x_1, x_2)}{f_{X_2}(x_2)} = \frac{6 x_1 x_2^2}{3 x_2^2} = 2 x_1 \quad \text{and} \quad f_{X_2|x_1}(x_2) = \frac{f_X(x_1, x_2)}{f_{X_1}(x_1)} = \frac{6 x_1 x_2^2}{2 x_1} = 3 x_2^2.$$

The conditional pdfs allow us to calculate conditional expectations. The conditional expected value of a function $g(X_2)$ given that $X_1 = x_1$ is defined by
$$E[g(X_2)|x_1] = \begin{cases} \sum_{\mathcal{X}_2} g(x_2)\, p_{X_2|x_1}(x_2) & \text{for a discrete rv,} \\[4pt] \int_{\mathcal{X}_2} g(x_2)\, f_{X_2|x_1}(x_2)\, dx_2 & \text{for a continuous rv.} \end{cases} \tag{1.14}$$
Example 1.24. The conditional mean and variance of $X_2$ given a value of $X_1$, for the variables defined in Example 1.21, are
$$\mu_{X_2|x_1} = E(X_2|x_1) = \int_0^1 x_2 \cdot 3 x_2^2\, dx_2 = \frac{3}{4},$$
and
$$\sigma^2_{X_2|x_1} = \operatorname{var}(X_2|x_1) = E(X_2^2|x_1) - [E(X_2|x_1)]^2 = \int_0^1 x_2^2 \cdot 3 x_2^2\, dx_2 - \left(\frac{3}{4}\right)^2 = \frac{3}{80}.$$
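
Both integrals can be verified symbolically, for instance with SymPy (an illustration, not part of the original text):

```python
# Illustrative sketch: conditional mean and variance of Example 1.24 in SymPy.
import sympy as sp

x2 = sp.symbols('x2')
f_cond = 3 * x2**2                              # f_{X2|x1}(x2) from Example 1.23

mean = sp.integrate(x2 * f_cond, (x2, 0, 1))    # E(X2 | x1)
var = sp.integrate(x2**2 * f_cond, (x2, 0, 1)) - mean**2
print(mean, var)                                # 3/4 3/80
```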

Lemma 1.2. For random variables $X$ and $Y$ defined on supports $\mathcal{X}$ and $\mathcal{Y}$, respectively, and a function $g(\cdot)$ whose expectation exists, the following result holds:
$$E[g(Y)] = E\{E[g(Y)|X]\}.$$

Proof. From the definition of conditional expectation we can write
$$E[g(Y)|X = x] = \int_{\mathcal{Y}} g(y)\, f_{Y|x}(y)\, dy.$$
This is a function of $x$ whose expectation is
$$E_X\{E_Y[g(Y)|X]\} = \int_{\mathcal{X}} \left[ \int_{\mathcal{Y}} g(y)\, f_{Y|x}(y)\, dy \right] f_X(x)\, dx
= \int_{\mathcal{X}} \int_{\mathcal{Y}} g(y) \underbrace{f_{Y|x}(y)\, f_X(x)}_{= f_{(X,Y)}(x,y)}\, dy\, dx
= \int_{\mathcal{Y}} g(y) \underbrace{\int_{\mathcal{X}} f_{(X,Y)}(x, y)\, dx}_{= f_Y(y)}\, dy = E[g(Y)]. \qquad \square$$
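
The lemma can be illustrated numerically with the dice variables of Example 1.18: averaging the conditional means $E(X_2 | X_1 = x_1)$ over the distribution of $X_1$ recovers $E(X_2)$. The Python sketch below is an illustration, not part of the original text.

```python
# Illustrative sketch: E(X2) = E[E(X2 | X1)] for the dice of Example 1.18.
from fractions import Fraction
from collections import defaultdict

outcomes = [(i + j, abs(i - j)) for i in range(1, 7) for j in range(1, 7)]

# direct expectation of X2 over the 36 equally likely outcomes
e_direct = Fraction(sum(x2 for _, x2 in outcomes), 36)

# iterated expectation: group by the value of X1, take the conditional mean
# within each group, then weight by P(X1 = x1) = len(group)/36
groups = defaultdict(list)
for x1, x2 in outcomes:
    groups[x1].append(x2)
e_iter = sum(Fraction(sum(g), len(g)) * Fraction(len(g), 36) for g in groups.values())

print(e_direct, e_iter)                          # 35/18 35/18
```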

Exercise 1.15. Show the following two equalities, which result from the above lemma:

1. $E(Y) = E\{E[Y|X]\}$;
2. $\operatorname{var}(Y) = E[\operatorname{var}(Y|X)] + \operatorname{var}(E[Y|X])$.
1.10.4 Independence of Random Variables
Definition 1.18. Let $\mathbf{X} = (X_1, X_2)$ denote a continuous bivariate rv with joint pdf $f_X(x_1, x_2)$ and marginal pdfs $f_{X_1}(x_1)$ and $f_{X_2}(x_2)$. Then $X_1$ and $X_2$ are called independent random variables if, for every $x_1 \in \mathcal{X}_1$ and $x_2 \in \mathcal{X}_2$,
$$f_X(x_1, x_2) = f_{X_1}(x_1)\, f_{X_2}(x_2). \tag{1.15}$$

We define independent discrete random variables analogously.

If $X_1$ and $X_2$ are independent, then the conditional pdf of $X_2$ given $X_1 = x_1$ is
$$f_{X_2|x_1}(x_2) = \frac{f_X(x_1, x_2)}{f_{X_1}(x_1)} = \frac{f_{X_1}(x_1)\, f_{X_2}(x_2)}{f_{X_1}(x_1)} = f_{X_2}(x_2),$$
regardless of the value of $x_1$. An analogous property holds for the conditional pdf of $X_1$ given $X_2 = x_2$.
Example 1.25. It is easy to notice that for the variables defined in Example 1.21 we have
$$f_X(x_1, x_2) = 6 x_1 x_2^2 = 2 x_1 \cdot 3 x_2^2 = f_{X_1}(x_1)\, f_{X_2}(x_2).$$
So the variables $X_1$ and $X_2$ are independent.

In fact, two rvs are independent if and only if there exist functions $g(x_1)$ and $h(x_2)$ such that for every $x_1 \in \mathcal{X}_1$ and $x_2 \in \mathcal{X}_2$,
$$f_X(x_1, x_2) = g(x_1)\, h(x_2),$$
and the support of one variable does not depend on the value of the other variable.
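
For Example 1.21 this factorization is immediate, and it can also be confirmed symbolically (a SymPy sketch, illustrative only), computing the marginals by integration and checking that their product recovers the joint pdf:

```python
# Illustrative sketch: the pdf of Example 1.21 factors into its marginals.
import sympy as sp

x1, x2 = sp.symbols('x1 x2', positive=True)
joint = 6 * x1 * x2**2

f1 = sp.integrate(joint, (x2, 0, 1))            # marginal of X1: 2*x1
f2 = sp.integrate(joint, (x1, 0, 1))            # marginal of X2: 3*x2**2
print(f1, f2, sp.expand(joint - f1 * f2) == 0)  # 2*x1 3*x2**2 True
```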
Theorem 1.13. Let $X_1$ and $X_2$ be independent random variables. Then:

1. For any $A \subseteq \mathbb{R}$ and $B \subseteq \mathbb{R}$,
$$P(X_1 \in A,\, X_2 \in B) = P(X_1 \in A)\, P(X_2 \in B),$$
that is, $\{X_1 \in A\}$ and $\{X_2 \in B\}$ are independent events.

2. For $g(X_1)$, a function of $X_1$ only, and for $h(X_2)$, a function of $X_2$ only, we have
$$E[g(X_1)\, h(X_2)] = E[g(X_1)]\, E[h(X_2)].$$
Proof. Assume that $X_1$ and $X_2$ are continuous random variables. To prove the theorem for discrete rvs we follow the same steps with sums instead of integrals.

1. We have
$$P(X_1 \in A,\, X_2 \in B) = \int_B \int_A f_X(x_1, x_2)\, dx_1\, dx_2
= \int_B \int_A f_{X_1}(x_1)\, f_{X_2}(x_2)\, dx_1\, dx_2
= \int_B \left[ \int_A f_{X_1}(x_1)\, dx_1 \right] f_{X_2}(x_2)\, dx_2$$
$$= \int_A f_{X_1}(x_1)\, dx_1 \int_B f_{X_2}(x_2)\, dx_2 = P(X_1 \in A)\, P(X_2 \in B).$$

2. Similar arguments as in Part 1 give
$$E[g(X_1)\, h(X_2)] = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} g(x_1)\, h(x_2)\, f_X(x_1, x_2)\, dx_1\, dx_2
= \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} g(x_1)\, h(x_2)\, f_{X_1}(x_1)\, f_{X_2}(x_2)\, dx_1\, dx_2$$
$$= \int_{-\infty}^{\infty} \left[ \int_{-\infty}^{\infty} g(x_1)\, f_{X_1}(x_1)\, dx_1 \right] h(x_2)\, f_{X_2}(x_2)\, dx_2
= \left[ \int_{-\infty}^{\infty} g(x_1)\, f_{X_1}(x_1)\, dx_1 \right] \left[ \int_{-\infty}^{\infty} h(x_2)\, f_{X_2}(x_2)\, dx_2 \right]
= E[g(X_1)]\, E[h(X_2)]. \qquad \square$$
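
Part 2 can be illustrated on the independent variables of Example 1.21, taking both $g$ and $h$ to be the identity function (a SymPy sketch, not from the original text):

```python
# Illustrative sketch: E[X1*X2] = E[X1]*E[X2] for Example 1.21 (Theorem 1.13).
import sympy as sp

x1, x2 = sp.symbols('x1 x2', positive=True)
joint = 6 * x1 * x2**2

lhs = sp.integrate(x1 * x2 * joint, (x1, 0, 1), (x2, 0, 1))  # E[X1*X2]
rhs = (sp.integrate(x1 * 2 * x1, (x1, 0, 1)) *               # E[X1] = 2/3
       sp.integrate(x2 * 3 * x2**2, (x2, 0, 1)))             # E[X2] = 3/4
print(lhs, rhs)                                              # 1/2 1/2
```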

In the following theorem we apply this result to the moment generating function of a sum of independent random variables.
Theorem 1.14. Let $X_1$ and $X_2$ be independent random variables with moment generating functions $M_{X_1}(t)$ and $M_{X_2}(t)$, respectively. Then the moment generating function of the sum $Y = X_1 + X_2$ is given by
$$M_Y(t) = M_{X_1}(t)\, M_{X_2}(t).$$

Proof. By the definition of the mgf and by Theorem 1.13, part 2, we have
$$M_Y(t) = E e^{tY} = E e^{t(X_1 + X_2)} = E\big[e^{tX_1} e^{tX_2}\big] = E\big[e^{tX_1}\big]\, E\big[e^{tX_2}\big] = M_{X_1}(t)\, M_{X_2}(t). \qquad \square$$

Note that this result can be easily extended to a sum of any number of mutually
independent random variables.
Example 1.26. Let $X_1 \sim N(\mu_1, \sigma_1^2)$ and $X_2 \sim N(\mu_2, \sigma_2^2)$ be independent. What is the distribution of $Y = X_1 + X_2$?

Using Theorem 1.14 we can write
$$M_Y(t) = M_{X_1}(t)\, M_{X_2}(t) = \exp\{\mu_1 t + \sigma_1^2 t^2/2\} \exp\{\mu_2 t + \sigma_2^2 t^2/2\}
= \exp\{(\mu_1 + \mu_2) t + (\sigma_1^2 + \sigma_2^2) t^2/2\}.$$
This is the mgf of a normal rv with $E(Y) = \mu_1 + \mu_2$ and $\operatorname{var}(Y) = \sigma_1^2 + \sigma_2^2$.
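
A quick Monte Carlo check of this result (a sketch; the parameter values below are arbitrary illustrations, not from the text):

```python
# Illustrative sketch: the sum of independent N(1, 2^2) and N(-3, 1.5^2) draws
# has mean 1 + (-3) = -2 and variance 4 + 2.25 = 6.25 (Example 1.26).
import numpy as np

rng = np.random.default_rng(0)
y = rng.normal(1.0, 2.0, 1_000_000) + rng.normal(-3.0, 1.5, 1_000_000)
print(y.mean(), y.var())   # approximately -2.0 and 6.25
```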
Exercise 1.16. A part of an electronic system has two types of components in joint operation. Denote by $X_1$ and $X_2$ the random length of life (measured in hundreds of hours) of the component of type I and of type II, respectively. Assume that the joint density function of the two rvs is given by
$$f_X(x_1, x_2) = \frac{1}{8}\, x_1 \exp\left\{-\frac{x_1 + x_2}{2}\right\} I_{\mathcal{X}}(x_1, x_2),$$
where $\mathcal{X} = \{(x_1, x_2) : x_1 > 0,\; x_2 > 0\}$.

1. Calculate the probability that both components will have a life length longer than 100 hours, that is, find $P(X_1 > 1,\, X_2 > 1)$.
2. Calculate the probability that a component of type II will have a life length longer than 200 hours, that is, find $P(X_2 > 2)$.
3. Are $X_1$ and $X_2$ independent? Justify your answer.
4. Calculate the expected value of the so-called relative efficiency of the two components, which is expressed by $E\left[\dfrac{X_2}{X_1}\right]$.
