Multiple Random Variables
Michael Akritas

Outline

Joint Distribution Functions
    Independent Random Variables
    Moment Generating Functions
    Conditional Distributions
    Hierarchical Models
Covariance and Correlation: Quantifying Dependence
The joint cumulative distribution function of X and Y is

F(a, b) = P[X \le a, Y \le b], \quad -\infty < a, b < \infty.

Not as handy to use as in the univariate case. For example,

P(a_1 < X \le b_1, a_2 < Y \le b_2) = F(b_1, b_2) - F(a_1, b_2) - F(b_1, a_2) + F(a_1, a_2),

while

P\big( (X, Y) \in \{(x, y) : x^2 + y^2 \le r^2\} \big)

is not expressible in terms of F(x, y).
If X and Y are jointly continuous, such probabilities can be given in terms of the joint probability density function, which is a function f : \mathbb{R}^2 \to \mathbb{R} such that

P((X, Y) \in A) = \iint_A f(x, y)\, dx\, dy \quad \text{holds for all } A \subseteq \mathbb{R}^2.

In particular,

F(x, y) = \int_{-\infty}^{x} \int_{-\infty}^{y} f(s, t)\, ds\, dt, \quad \text{and by the F.T.C.,} \quad \frac{\partial^2 F(x, y)}{\partial x\, \partial y} = f(x, y)

at continuity points of f.
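A quick numerical check of these relations (a sketch, not part of the slides; scipy is assumed, and the pdf f(x, y) = 6e^{-2x}e^{-3y} from a later example is used for concreteness):

```python
# A sketch (not from the slides): check numerically that f integrates to 1
# over R^2 and compute P((X, Y) in A) for a rectangle A. The pdf
# f(x, y) = 6 exp(-2x - 3y), x, y > 0, reappears in a later example.
import numpy as np
from scipy import integrate

def f(y, x):  # dblquad integrates its first argument innermost
    return 6.0 * np.exp(-2.0 * x - 3.0 * y)

total, _ = integrate.dblquad(f, 0, np.inf, lambda x: 0, lambda x: np.inf)
print(total)  # ~1.0, so f is a proper joint pdf

# P(0 < X <= 1, 0 < Y <= 2) = (1 - e^{-2})(1 - e^{-6}) by direct integration
p, _ = integrate.dblquad(f, 0, 1, lambda x: 0, lambda x: 2)
print(p)
```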
If X and Y are jointly discrete, probabilities P((X, Y) \in A) can be given in terms of the joint probability mass function, which is defined as

p(x, y) = P(X = x, Y = y).

The marginal distributions of X and Y are given by:
1. F_X(a) = F(a, \infty), \quad F_Y(b) = F(\infty, b).
2. p_X(x) = \sum_{y : p(x,y) > 0} p(x, y), \quad p_Y(y) = \sum_{x : p(x,y) > 0} p(x, y).
3. f_X(x) = \int_{-\infty}^{\infty} f(x, y)\, dy, \quad f_Y(y) = \int_{-\infty}^{\infty} f(x, y)\, dx.
Example
45% of customers will purchase a basic flat screen TV, 15% will purchase a high resolution flat screen, and the rest will either just browse or buy something else. Out of the next 5 customers, let X_1 denote the number who buy the basic model, and X_2 the number who buy the high resolution model.
1. Find the joint distribution of X_1, X_2.
2. Find the marginal distributions of X_1 and X_2.

This is an example of a Multinomial Distribution. See Example 1f, page 240.
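One way to verify the answers (a sketch assuming scipy; the trinomial cell probabilities (0.45, 0.15, 0.40) come from the problem statement):

```python
# A sketch assuming scipy: (X1, X2, 5 - X1 - X2) is trinomial with n = 5 and
# cell probabilities (.45, .15, .40); each marginal is binomial.
from scipy.stats import multinomial, binom

n, p1, p2 = 5, 0.45, 0.15
joint = multinomial(n, [p1, p2, 1 - p1 - p2])

print(joint.pmf([2, 1, 2]))   # e.g. P(X1 = 2, X2 = 1)

# Marginal of X1: sum the joint over x2 and compare with Bin(5, .45)
for x1 in range(n + 1):
    m = sum(joint.pmf([x1, x2, n - x1 - x2]) for x2 in range(n - x1 + 1))
    print(x1, m, binom.pmf(x1, n, p1))   # the two columns agree
```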
Example
Customers enter a department store according to a Poisson process with rate \lambda per hour. Two thirds of the customers are female and 1/3 are male. Let X_1, X_2 denote the number of female and male customers entering the department store during the next hour.
1. Find the joint distribution of X_1, X_2.
2. Find the marginal distributions of X_1 and X_2.
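A simulation sketch of the claimed answer (illustrative rate lam = 6; numpy assumed): thinning a Poisson count by independent coin flips should give independent Poisson(2\lambda/3) and Poisson(\lambda/3) counts.

```python
# A simulation sketch (illustrative rate lam = 6): thin each arrival
# independently, female with probability 2/3. The claim to check is that
# X1 ~ Poisson(2*lam/3) and X2 ~ Poisson(lam/3), independent.
import numpy as np

rng = np.random.default_rng(0)
lam, reps = 6.0, 100_000
N = rng.poisson(lam, size=reps)      # total arrivals in one hour
X1 = rng.binomial(N, 2.0 / 3.0)      # female count given N
X2 = N - X1                          # male count

print(X1.mean(), X1.var())           # both ~ 4 = 2*lam/3
print(X2.mean(), X2.var())           # both ~ 2 = lam/3
print(np.corrcoef(X1, X2)[0, 1])     # ~ 0, consistent with independence
```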
Example
5 transistors, 2 of which are defective, are tested one at a time until both defective transistors are identified. Let N_1 be the number of tests until the 1st defective is found, and N_2 the number of additional tests performed until the second defective is found.
1. Find the joint distribution of N_1, N_2.
2. Find the marginal distributions of N_1 and N_2.
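Since only C(5, 2) = 10 equally likely orderings of the defectives exist, the joint pmf can be tabulated by brute force (a sketch; standard library only):

```python
# A brute-force sketch (standard library only): every arrangement of the two
# defectives (D) among five positions is equally likely; tabulate (N1, N2).
from itertools import permutations
from collections import Counter

patterns = set(permutations("DDGGG"))       # 10 equally likely orderings
counts = Counter()
for seq in patterns:
    n1 = seq.index("D") + 1                 # tests until the 1st defective
    n2 = seq.index("D", n1) + 1 - n1        # additional tests for the 2nd
    counts[(n1, n2)] += 1

for (n1, n2), c in sorted(counts.items()):
    print(n1, n2, c / len(patterns))        # each possible pair has prob 1/10
```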
Definition
X, Y are independent if, for all sets A and B,

P(X \in A, Y \in B) = P(X \in A) P(Y \in B).

This is equivalent to:
1. F(a, b) = F_X(a) F_Y(b) for all a, b.
2. p(x, y) = p_X(x) p_Y(y) if X, Y jointly discrete.
3. f(x, y) = f_X(x) f_Y(y) if X, Y jointly continuous.
Proposition
X, Y are independent iff the joint pdf factors as

f(x, y) = h(x) g(y)

for some functions h and g.

Example
1. f(x, y) = 6 e^{-2x} e^{-3y}\, I(0 < x < \infty,\ 0 < y < \infty)
2. f(x, y) = 24 x y\, I(x > 0,\ y > 0,\ x + y < 1)
Sums of Independent Random Variables
If X, Y are independent, then

F_{X+Y}(a) = \int_{-\infty}^{\infty} F_X(a - y)\, f_Y(y)\, dy

f_{X+Y}(a) = \int_{-\infty}^{\infty} f_X(a - y)\, f_Y(y)\, dy

F_{X+Y} is called the convolution of F_X, F_Y.

Example
If X \sim U(0, 1), Y \sim U(0, 1) are independent, then

f_{X+Y}(a) = a\, I(0 \le a \le 1) + (2 - a)\, I(1 < a \le 2).
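A Monte Carlo check of this triangular density (a sketch with numpy; the sample size is arbitrary):

```python
# A Monte Carlo sketch (numpy assumed): the histogram of X + Y for two
# independent U(0,1) samples should match the triangular density above.
import numpy as np

rng = np.random.default_rng(1)
s = rng.uniform(size=1_000_000) + rng.uniform(size=1_000_000)

hist, edges = np.histogram(s, bins=40, range=(0, 2), density=True)
mids = (edges[:-1] + edges[1:]) / 2
f = np.where(mids <= 1, mids, 2 - mids)   # a on [0,1], 2 - a on (1,2]
print(np.max(np.abs(hist - f)))           # small, up to Monte Carlo error
```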
Definition (The Moment Generating Function of X)
M_X(t) = E(e^{tX}).

The basic facts about moment generating functions are:

- If M_X(t) and M_Y(t) are finite and equal for all -h < t < h, some h > 0, then F_X = F_Y.
- E X^n = M_X^{(n)}(0) = \frac{d^n}{dt^n} M_X(t) \Big|_{t=0}
- M_{aX+b}(t) = e^{bt} M_X(at)
- If X, Y are independent, M_{X+Y}(t) = M_X(t) M_Y(t)
Example
1. If X \sim Bin(n, p) then M_X(t) = [p e^t + (1 - p)]^n.
2. If X \sim Poisson(\lambda) then M_X(t) = \exp[\lambda(e^t - 1)].
3. If X \sim Gamma(\alpha, \lambda), i.e.

   f_X(x) = \frac{\lambda^{\alpha}}{\Gamma(\alpha)} x^{\alpha - 1} e^{-\lambda x}, \quad x > 0, \ \alpha > 0, \ \lambda > 0,

   then M_X(t) = \left( \frac{\lambda}{\lambda - t} \right)^{\alpha}, for t < \lambda.
4. If X \sim Exp(\lambda) then M_X(t) = \lambda/(\lambda - t), for t < \lambda.
5. If X \sim N(0, 1) then M_X(t) = e^{t^2/2}.

See also pages 358, 359.
Example
1. The sum of two independent binomial rvs (with the same success probability) is a binomial rv.
2. The sum of two independent Poisson rvs is a Poisson rv.
3. The sum of two independent Normal rvs is a Normal rv.

Example
1. Find the first two moments of a binomial random variable.
2. Find the first two moments of an exponential random variable.
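A symbolic sketch of both exercises (sympy assumed), using E X^n = M_X^{(n)}(0) with the mgfs listed above:

```python
# A symbolic sketch (sympy assumed): differentiate the mgfs listed above
# at t = 0 to get the first two moments.
import sympy as sp

t = sp.symbols("t")
n, p, lam = sp.symbols("n p lambda", positive=True)

mgfs = {
    "binomial":    (p * sp.exp(t) + 1 - p) ** n,   # Bin(n, p)
    "exponential": lam / (lam - t),                # Exp(lambda), t < lambda
}
for name, M in mgfs.items():
    m1 = sp.diff(M, t).subs(t, 0)                  # E X = M'(0)
    m2 = sp.diff(M, t, 2).subs(t, 0)               # E X^2 = M''(0)
    print(name, sp.simplify(m1), sp.simplify(m2))
# binomial: n*p and n*p*(n*p - p + 1); exponential: 1/lambda and 2/lambda**2
```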
Example
The joint pmf of X = amount of drug administered to a randomly selected rat, and Y = the number of tumors the rat develops, is:

                          y
    p(x, y)          0      1      2
       0.0 mg/kg   .388   .009   .003
    x  1.0 mg/kg   .485   .010   .005
       2.0 mg/kg   .090   .008   .002

For jointly discrete X, Y,

P(Y = y | X = x) = \frac{P(X = x, Y = y)}{P(X = x)} = \frac{p(x, y)}{p_X(x)}.
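The table can be encoded directly (a numpy sketch; the rows below are the joint pmf from the example):

```python
# A numpy sketch that encodes the joint pmf table above and produces each
# conditional pmf row p_{Y|X}(y|x) = p(x, y)/p_X(x).
import numpy as np

p = np.array([[.388, .009, .003],    # x = 0.0 mg/kg
              [.485, .010, .005],    # x = 1.0 mg/kg
              [.090, .008, .002]])   # x = 2.0 mg/kg

p_X = p.sum(axis=1)                  # marginal of X: [.4, .5, .1]
cond = p / p_X[:, None]              # row x = conditional pmf of Y given X = x
print(cond)                          # x = 0 gives [.97, .0225, .0075],
                                     # x = 2 gives [.9, .08, .02]
```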
Example
Find the conditional pmf of the number of tumors when the dosage is 0 mg and when the dosage is 2 mg.

Solution:

    y                      0               1                2
    p_{Y|X}(y | X = 0)   .388/.4 = .97   .009/.4 = .0225   .003/.4 = .0075
    p_{Y|X}(y | X = 2)   .090/.1 = .9    .008/.1 = .08     .002/.1 = .02
Example
Let X_1 \sim Poisson(\lambda_1) and X_2 \sim Poisson(\lambda_2) be independent. Find the conditional distribution of X_1 given X_1 + X_2 = n.

Answer:

P(X_1 = k \mid X_1 + X_2 = n) = \binom{n}{k} \left( \frac{\lambda_1}{\lambda_1 + \lambda_2} \right)^{k} \left( \frac{\lambda_2}{\lambda_1 + \lambda_2} \right)^{n-k}
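A numeric spot-check of this answer (scipy assumed; lam1 = 2, lam2 = 3, n = 5 are illustrative values):

```python
# A numeric spot-check (scipy assumed; lam1, lam2, n illustrative): the
# conditional pmf should match Bin(n, lam1/(lam1 + lam2)).
from scipy.stats import poisson, binom

lam1, lam2, n = 2.0, 3.0, 5
for k in range(n + 1):
    num = poisson.pmf(k, lam1) * poisson.pmf(n - k, lam2)
    den = poisson.pmf(n, lam1 + lam2)   # X1 + X2 ~ Poisson(lam1 + lam2)
    print(k, num / den, binom.pmf(k, n, lam1 / (lam1 + lam2)))  # columns agree
```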
Conditional Expected Value and Variance

- The conditional pmf is a proper pmf. Thus, p_{Y|X}(y_j | x) \ge 0 for all j = 1, 2, \ldots, and \sum_j p_{Y|X}(y_j | x) = 1.
- The conditional expected value of Y given X = x is the mean of the conditional pmf of Y given X = x. It is denoted by E(Y | X = x) or \mu_{Y|X=x} or \mu_{Y|X}(x).
- The conditional variance of Y given X = x is the variance of the conditional pmf of Y given X = x. It is denoted by \sigma^2_{Y|X=x} or \sigma^2_{Y|X}(x).
Example
Find the conditional expected value and variance of the number of tumors when X = 2.

Solution. Using the conditional pmf that we found before, we have

E(Y | X = 2) = 0(.9) + 1(.08) + 2(.02) = .12.

Compare this with E(Y) = .047. Next,

E(Y^2 | X = 2) = 0(.9) + 1(.08) + 2^2(.02) = .16,

so that \sigma^2_{Y|X}(2) = .16 - .12^2 = .1456.
The Regression Function

- For a bivariate r.v. (X, Y), the regression function of Y on X shows how the conditional mean of Y changes with the observed value of X. More precisely,

Definition
The conditional expected value of Y given that X = x, i.e.

\mu_{Y|X}(x) = E(Y | X = x),

when considered as a function of x, is called the regression function of Y on X.

- The regression function of Y on X is the fundamental ingredient for predicting Y from an observed value of X.
Example
In the example where X = amount of drug administered and Y = number of tumors developed we have

    y               0     1       2
    p_{Y|X=0}(y)   .97   .0225   .0075
    p_{Y|X=1}(y)   .97   .02     .01
    p_{Y|X=2}(y)   .9    .08     .02

Find the regression function of Y on X.

Solution: Here, E(Y | X = 0) = 0.0375, E(Y | X = 1) = 0.04, E(Y | X = 2) = 0.12. Thus the regression function of Y on X is

    x                0        1      2
    \mu_{Y|X}(x)    0.0375   0.04   0.12
Proposition (Law of Total Probability for Expectations)

E(Y) = \sum_{x \in S_X} E(Y | X = x)\, p_X(x).

Example
In the previous example we have

    x         0    1    2
    p_X(x)   .4   .5   .1

Find \mu_Y.

Solution: 0.0375(.4) + 0.04(.5) + 0.12(.1) = .047.
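A sketch verifying the proposition on the running example (numpy assumed; the conditional pmf rows are the ones computed earlier):

```python
# A numpy sketch on the running example: the regression function times the
# marginal of X recovers E(Y) = .047.
import numpy as np

cond = np.array([[.97, .0225, .0075],   # p_{Y|X}(y | x = 0)
                 [.97, .02,   .01],     # p_{Y|X}(y | x = 1)
                 [.90, .08,   .02]])    # p_{Y|X}(y | x = 2)
p_X = np.array([.4, .5, .1])
y = np.array([0, 1, 2])

reg = cond @ y           # E(Y | X = x) for x = 0, 1, 2: [.0375, .04, .12]
print(reg @ p_X)         # 0.047 = E(Y)
```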
- If (X, Y) have joint pdf f(x, y),

  f_{Y|X}(y | x) = \frac{f(x, y)}{f_X(x)}.

- For each fixed x, f_{Y|X}(y | x) is a proper pdf:

  P(Y \in A | X = x) = \int_A f_{Y|X}(y | x)\, dy

  F_{Y|X}(a | x) = P(Y \le a | X = x) = \int_{-\infty}^{a} f_{Y|X}(y | x)\, dy

Note: We have just defined conditional probability given an event of probability zero!!
Example
If

f(x, y) = \frac{e^{-x/y}\, e^{-y}}{y}\, I(x > 0)\, I(y > 0),

find P(X > 1 | Y = y) and E(X | Y = y).

Answer:

P(X > 1 | Y = y) = e^{-1/y}, \quad E(X | Y = y) = y.
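A symbolic sketch of this computation (sympy assumed): dividing by the marginal shows X | Y = y is exponential with mean y.

```python
# A symbolic sketch (sympy assumed): f_{X|Y}(x|y) = f(x, y)/f_Y(y) turns out
# to be an Exp(1/y) density, from which both answers follow.
import sympy as sp

x, y = sp.symbols("x y", positive=True)
f = sp.exp(-x / y) * sp.exp(-y) / y                 # joint pdf on x, y > 0

f_Y = sp.integrate(f, (x, 0, sp.oo))                # marginal: exp(-y)
f_cond = sp.simplify(f / f_Y)                       # exp(-x/y)/y

print(sp.integrate(f_cond, (x, 1, sp.oo)))          # P(X > 1 | Y = y) = exp(-1/y)
print(sp.integrate(x * f_cond, (x, 0, sp.oo)))      # E(X | Y = y) = y
```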
- The joint distribution can be specified from the conditional and marginal distributions:

  p(x, y) = p_{Y|X}(y | x)\, p_X(x), \quad f(x, y) = f_{Y|X}(y | x)\, f_X(x)

  This is called hierarchical modeling.

- With hierarchical modeling the conditional and marginal distributions do not have to be both continuous or both discrete; the joint can be built as

  p_{Y|X}(y | x)\, f_X(x), \quad \text{or} \quad f_{Y|X}(y | x)\, p_X(x)
Example
Let X be the number of eggs an insect lays and Y the number of eggs that survive. Suppose each egg survives with probability p. The joint distribution of X, Y can be specified hierarchically:

Y | X \sim Bin(X, p), \quad X \sim Poisson(\lambda)

Then the distribution of the number Y of surviving eggs can be found from

P(Y = y) = \sum_{x=y}^{\infty} \binom{x}{y} p^{y} (1 - p)^{x-y}\, e^{-\lambda} \frac{\lambda^{x}}{x!} = e^{-\lambda p} \frac{(\lambda p)^{y}}{y!}
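A simulation sketch of the hierarchy (numpy assumed; lam = 10 and p = 0.3 are illustrative values):

```python
# A simulation sketch (lam = 10 and p = 0.3 are illustrative): drawing
# X ~ Poisson(lam) and then Y | X ~ Bin(X, p) should give Y ~ Poisson(lam*p).
import numpy as np

rng = np.random.default_rng(2)
lam, p, reps = 10.0, 0.3, 200_000

X = rng.poisson(lam, size=reps)
Y = rng.binomial(X, p)        # binomial with a random number of trials

print(Y.mean(), Y.var())      # both ~ lam*p = 3, as Poisson(3) requires
```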
Cov(X, Y) = E[(X - \mu_X)(Y - \mu_Y)] = E(XY) - \mu_X \mu_Y

Properties:
1. If X, Y are independent, Cov(X, Y) = 0.
2. Cov(X, Y) = Cov(Y, X)
3. Cov(X, X) = Var(X)
4. Cov(aX + b, cY + d) = ac\, Cov(X, Y)
5. Cov\left( \sum_{i=1}^{n} X_i, \sum_{j=1}^{m} Y_j \right) = \sum_{i=1}^{n} \sum_{j=1}^{m} Cov(X_i, Y_j)

\rho_{X,Y} = Cov(X, Y) / (\sigma_X \sigma_Y)

Properties:
1. \rho_{aX+b, cY+d} = \rho_{X,Y} (for ac > 0)
2. -1 \le \rho_{X,Y} \le 1
3. \rho = 1 (or -1) iff P(Y = \alpha + \beta X) = 1 with \beta > 0 (or \beta < 0).
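A numeric sketch of property 4 and the correlation invariance (numpy assumed; the data and constants are arbitrary):

```python
# A numeric sketch (numpy assumed; constants arbitrary): check
# Cov(aX + b, cY + d) = ac Cov(X, Y) and that |rho| is unchanged by
# linear transformations (the sign flips here because ac < 0).
import numpy as np

rng = np.random.default_rng(3)
X = rng.normal(size=500_000)
Y = 0.5 * X + rng.normal(size=500_000)    # dependent on X by construction

a, b, c, d = 2.0, 1.0, -3.0, 4.0
def cov(u, v):
    return np.cov(u, v)[0, 1]

print(cov(a * X + b, c * Y + d), a * c * cov(X, Y))        # agree
print(np.corrcoef(a * X + b, c * Y + d)[0, 1],
      np.corrcoef(X, Y)[0, 1])                             # opposite signs
```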
- In order to show that if X, Y are independent then Cov(X, Y) = 0 (in which case \rho_{X,Y} = 0 too), we made use of:

Proposition
If X, Y are independent then

E[g(X) h(Y)] = E[g(X)]\, E[h(Y)].

- In order to show that \rho = 1 (or -1) iff P(Y = \alpha + \beta X) = 1 with \beta > 0 (or \beta < 0), we made use of:

Proposition
Let X be a random variable with mean \mu. If Var(X) = 0, then P(X = \mu) = 1, i.e. X is a constant.
Proposition (Consequences of a Linear Relationship)
If E(Y | X) = \alpha + \beta X then
1. \alpha = E(Y) - \beta E(X)
2. Cov(X, Y) = \beta\, Var(X)
3. \beta = \rho_{X,Y}\, \sigma_Y / \sigma_X

Proof: 1. It follows from E(Y) = E[E(Y|X)].
2. Use E(W) = E[E(W|V)] with W = XY and V = X to get

E(XY) = E[E(XY|X)] = E[X\, E(Y|X)]
      = E[X(\alpha + \beta X)] = \alpha E(X) + \beta E(X^2)
      = E(Y)E(X) - \beta E(X)^2 + \beta E(X^2)
      = E(Y)E(X) + \beta\, Var(X)