
Joint probability distributions:

1) Problem:
Table: Joint probability distribution of X and Y

            Y = 30    Y = 60    Y = 90    P[X = x_i]
X = 0       10/63     0         0         10/63
X = 30      19/63     18/63     0         37/63
X = 60      0         6/63      9/63      15/63
X = 90      0         0         1/63      1/63
P[Y = y_j]  29/63     24/63     10/63     1

X takes the values 0, 30, 60, 90; Y takes the values 30, 60, 90.

P_ij = P[X = x_i, Y = y_j] = (i, j)-th cell value

P[X = x_i]: marginal distribution of X
P[Y = y_j]: marginal distribution of Y
Example:
Let x_i = 0. Thus, P[X = 0] = P[X = 0, Y = 30] + P[X = 0, Y = 60] + P[X = 0, Y = 90]
= 10/63 + 0 + 0 = 10/63

Q1. Find the following:
P[X=30], P[X=60], P[X=90]

Marginal distribution of Y; marginal expectation of X; marginal expectation of Y;
marginal variance of X; marginal variance of Y.
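A minimal Python sketch for Q1 (not part of the original problem; it encodes the table above with exact fractions from the standard library):

```python
from fractions import Fraction as F

# Joint pmf from the table: joint[(x, y)] = P[X = x, Y = y]
xs, ys = [0, 30, 60, 90], [30, 60, 90]
joint = {(0, 30): F(10, 63), (30, 30): F(19, 63), (30, 60): F(18, 63),
         (60, 60): F(6, 63), (60, 90): F(9, 63), (90, 90): F(1, 63)}
p = lambda x, y: joint.get((x, y), F(0))

# Marginal distributions: sum the joint pmf over the other variable
px = {x: sum(p(x, y) for y in ys) for x in xs}
py = {y: sum(p(x, y) for x in xs) for y in ys}

# Marginal expectations and variances
EX = sum(x * q for x, q in px.items())
EY = sum(y * q for y, q in py.items())
VX = sum(x**2 * q for x, q in px.items()) - EX**2
VY = sum(y**2 * q for y, q in py.items()) - EY**2

print(px, py)          # marginal distributions of X and Y
print(EX, EY, VX, VY)  # marginal means and variances
```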




Joint pmf of X and Y:
If the tabular representation of the joint distribution is inadequate (e.g. with an infinite number of values of X or Y or both), then the cell probabilities can be generated from a function f(x, y) = P[X = x, Y = y] such that

Σ_x Σ_y f(x, y) = 1,  f(x, y) ≥ 0

Example:
1. Is the following function a pmf? f(x, y) = …
Solution: f(x, y) ≥ 0 for all (x, y), and Σ_x Σ_y f(x, y) = 1. Hence it is a pmf.
2. f(x, y) = … (Work yourself. Hint: p^x q^(1-x) is the pmf of the Bernoulli distribution.)
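As a check of the two pmf conditions, consider (for illustration only) the joint Bernoulli pmf f(x, y) = p^(x+y) q^(2-x-y) for x, y in {0, 1}, i.e. two independent Bernoulli(p) trials suggested by the hint above; a minimal sketch:

```python
from itertools import product

def is_pmf(f, support, tol=1e-12):
    """A pmf must be non-negative and sum to 1 over its support."""
    vals = [f(x, y) for x, y in support]
    return all(v >= 0 for v in vals) and abs(sum(vals) - 1.0) < tol

p, q = 0.3, 0.7                                    # hypothetical parameter, q = 1 - p
f = lambda x, y: p**(x + y) * q**(2 - x - y)       # assumed joint Bernoulli pmf
print(is_pmf(f, list(product([0, 1], repeat=2))))  # True
```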
Marginal pmf:
The marginal pmf of Y is obtained from the joint pmf as follows:
f_Y(y) = Σ_x f(x, y) = P[Y = y]
Similarly, f_X(x) can be defined.
Marginal expectation of X:
E[X] = Σ_x x f_X(x), where f_X(x) is the marginal pmf of X.


Example:
For the joint Bernoulli pmf above, the marginal distribution of X is
f_X(x) = p^x q^(1-x), x = 0, 1.
Hence E[X] = 0·q + 1·p = p.
Conditional Probability:
P[X = x | Y = y] = P[X = x, Y = y] / P[Y = y]


Example: From the table above we observe the following:
P[X = x | Y = 30] = 10/29, if x = 0
                  = 19/29, if x = 30
                  = 0,     if x = 60
                  = 0,     if x = 90
Q2. Find the following:
P[X = x | Y = 60], P[X = x | Y = 90] (Work out on your own; compare with the sketch below.)
Conditional Mean and Variance:
E(X | Y = y) = m(y) = Σ_x x P[X = x | Y = y]
V(X | Y = y) = h(y) = Σ_x x² P[X = x | Y = y] - [m(y)]²
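A Python sketch for Q2 and these definitions (again not from the notes; the table is re-encoded as in the Q1 sketch):

```python
from fractions import Fraction as F

xs, ys = [0, 30, 60, 90], [30, 60, 90]
joint = {(0, 30): F(10, 63), (30, 30): F(19, 63), (30, 60): F(18, 63),
         (60, 60): F(6, 63), (60, 90): F(9, 63), (90, 90): F(1, 63)}
p = lambda x, y: joint.get((x, y), F(0))
py = {y: sum(p(x, y) for x in xs) for y in ys}   # marginal pmf of Y

def conditional_pmf(y):
    """P[X = x | Y = y] = P[X = x, Y = y] / P[Y = y] for each x."""
    return {x: p(x, y) / py[y] for x in xs}

def conditional_mean_var(y):
    cond = conditional_pmf(y)
    m = sum(x * q for x, q in cond.items())             # m(y) = E(X | Y = y)
    h = sum(x**2 * q for x, q in cond.items()) - m**2   # h(y) = V(X | Y = y)
    return m, h

print(conditional_pmf(30))   # {0: 10/29, 30: 19/29, 60: 0, 90: 0}, as above
for y in ys:
    print(y, conditional_mean_var(y))
```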
Joint distribution characteristics:
E(XY) = Σ_x Σ_y xy P[X = x, Y = y]
= 0·[30·P[X=0, Y=30] + …] + 30·[30·P[X=30, Y=30] + …] + 60·[60·P[X=60, Y=60] + …] + 90·[90·P[X=90, Y=90] + …]
COVARIANCE:
Motivation: If all the points lie in quadrants I and III (i.e. the two coordinates have the same sign), then their product xy > 0. This indicates that X and Y change in the same direction. Similarly, if they lie in quadrants II and IV, then they change in opposite directions.
Fig.: scatter of (X, Y) points illustrating how X and Y correspond to each other.






If X and Y are centred about their means, then the expectation of the product of the centred variables is known as the covariance.
Cov(X, Y) = σ_xy = Σ_x Σ_y [x - E(X)][y - E(Y)] f(x, y)
= Σ_x Σ_y xy f(x, y) - E(X)E(Y) = E(XY) - E(X)E(Y)


Example:
1. Calculate Cov(X, Y) in the example in the table given (use EXCEL or the sketch below). (Work on your own)
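The same computation as a Python sketch (an alternative to EXCEL, not part of the original exercise):

```python
from fractions import Fraction as F

xs, ys = [0, 30, 60, 90], [30, 60, 90]
joint = {(0, 30): F(10, 63), (30, 30): F(19, 63), (30, 60): F(18, 63),
         (60, 60): F(6, 63), (60, 90): F(9, 63), (90, 90): F(1, 63)}
p = lambda x, y: joint.get((x, y), F(0))

EX = sum(x * p(x, y) for x in xs for y in ys)        # E(X)
EY = sum(y * p(x, y) for x in xs for y in ys)        # E(Y)
EXY = sum(x * y * p(x, y) for x in xs for y in ys)   # E(XY)
cov_xy = EXY - EX * EY                               # Cov(X, Y) = E(XY) - E(X)E(Y)
print(EXY, cov_xy)
```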

Correlation Coefficient:
If X and Y are standardized to make them unit-free and comparable, then the covariance between the standardized X and Y is called the correlation coefficient (ρ_xy).

ρ_xy = Cov(X, Y) / (σ_x σ_y)

where σ_x and σ_y are the standard deviations of X and Y respectively.
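Continuing the covariance sketch above, ρ_xy in Python (math.sqrt turns the exact fractions into floats):

```python
import math

# Continues the covariance sketch (xs, ys, p, EX, EY, cov_xy)
VX = sum(x**2 * p(x, y) for x in xs for y in ys) - EX**2   # V(X)
VY = sum(y**2 * p(x, y) for x in xs for y in ys) - EY**2   # V(Y)
rho = cov_xy / math.sqrt(VX * VY)                          # correlation coefficient
print(rho)
```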


Important Results:
a. ρ_xy is symmetric: ρ_xy = ρ_yx
b. ρ_xy ranges within [-1, 1]
c. |ρ_xy| measures the strength of the linear relationship:
|ρ_xy| = 1 ⟺ P[Y = a + bX] = 1 for some constants a and b,
i.e. a perfect linear relationship.

INDEPENDENCE OF RANDOM VARIABLES:
X and Y are independent if
P[X = x, Y = y] = P[X = x] P[Y = y],
i.e. f(x, y) = f_X(x) f_Y(y).
Consequences:
E(XY) = Σ_x Σ_y xy P[X = x, Y = y]
= Σ_x Σ_y xy P[X = x] P[Y = y]
= (Σ_x x P[X = x]) (Σ_y y P[Y = y])
= E(X) E(Y)
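A numerical check of this consequence, assuming two independent Bernoulli marginals with hypothetical parameters:

```python
# Hypothetical marginal pmfs of two independent variables
fx = {0: 0.7, 1: 0.3}   # P[X = x]
fy = {0: 0.4, 1: 0.6}   # P[Y = y]

# Under independence the joint pmf factorizes: P[X=x, Y=y] = P[X=x] P[Y=y]
EXY = sum(x * y * fx[x] * fy[y] for x in fx for y in fy)
EX = sum(x * q for x, q in fx.items())
EY = sum(y * q for y, q in fy.items())
print(EXY, EX * EY)     # both equal 0.18
```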

Sum law of expectation: E(aX + bY) = aE(X) + bE(Y)
Extension to n variables:
E(C_1 X_1 + C_2 X_2 + C_3 X_3 + … + C_n X_n) = Σ_{i=1}^n C_i E(X_i)   (H.W.)
Variance of sum:
V(X + Y) = V(X) + V(Y) + 2 Cov(X, Y)   (H.W.)
Extension:
V(C_1 X_1 + C_2 X_2 + C_3 X_3 + … + C_n X_n) = Σ_{i=1}^n C_i² V(X_i) + 2 Σ_{i<j} C_i C_j Cov(X_i, X_j)
If X and Y are independent, then Cov(X, Y) = 0, but not the converse.
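The converse fails, e.g. for the standard counterexample X uniform on {-1, 0, 1} with Y = X² (a hypothetical illustration, not from the table):

```python
from fractions import Fraction as F

# X uniform on {-1, 0, 1}, Y = X^2: Y is a function of X, so they are dependent
pts = [(-1, 1), (0, 0), (1, 1)]            # support of (X, Y)
w = F(1, 3)                                # each point has probability 1/3

EX = sum(x * w for x, _ in pts)            # 0
EY = sum(y * w for _, y in pts)            # 2/3
EXY = sum(x * y * w for x, y in pts)       # 0
print(EXY - EX * EY)                       # Cov(X, Y) = 0 despite dependence
```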
Continuous case for multiple variables:
Joint pdf, marginal pdf and conditional pdfs are defined in a way similar to the joint pmf and its discrete counterparts. Here the sum is replaced by integration, and the pdf does not itself indicate a probability: it is just a function which, integrated over a set A, gives the probability of that set.
Joint pdf of X and Y:
Suppose X and Y are continuous and A is any set. The probability that (X, Y) ∈ A can be obtained by integrating a function f(x, y) such that

f(x, y) ≥ 0 and P[(X, Y) ∈ A] = ∫∫_A f(x, y) dx dy

Similarly, other results from discrete distributions can be extended here.
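A sketch of the continuous case, assuming for illustration the joint pdf f(x, y) = e^(-x-y) for x, y ≥ 0 (two independent exponentials), with scipy doing the double integration:

```python
import math
from scipy.integrate import dblquad

# dblquad integrates func(y, x): inner variable first
f = lambda y, x: math.exp(-x - y)

total, _ = dblquad(f, 0, math.inf, 0, math.inf)  # integral over the support: 1
pA, _ = dblquad(f, 0, 1, 0, 1)                   # P[(X, Y) in [0,1] x [0,1]]
print(total, pA)                                 # ~1.0 and (1 - e**-1)**2 ~ 0.40
```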


Likelihood of independent rvs:
Likelihood is the joint pdf/pmf of the variables. For independent X_1, X_2, …, X_n,

f(x_1, x_2, …, x_n) = Π_{i=1}^n f(x_i)
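For instance, the likelihood of an iid Bernoulli(p) sample, using the pmf p^x q^(1-x) from earlier (the sample values here are hypothetical):

```python
from math import prod

def bernoulli_likelihood(sample, p):
    """Likelihood of an iid sample: the product of the individual pmfs."""
    return prod(p**x * (1 - p)**(1 - x) for x in sample)

sample = [1, 0, 1, 1, 0]                  # hypothetical observations
print(bernoulli_likelihood(sample, 0.6))  # 0.6**3 * 0.4**2
```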
Function of a Random Variable:
E[g(X)] = Σ_x g(x) P[X = x]   (discrete case)
E[g(X)] = ∫ g(x) f(x) dx      (continuous case)

V[g(X)] = E[g²(X)] - (E[g(X)])²
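A closing sketch: E[g(X)] and V[g(X)] for the (hypothetical) choice g(x) = x², using the marginal pmf of X from the table:

```python
from fractions import Fraction as F

# Marginal pmf of X from the table, and a hypothetical g for illustration
px = {0: F(10, 63), 30: F(37, 63), 60: F(15, 63), 90: F(1, 63)}
g = lambda x: x**2

Eg = sum(g(x) * q for x, q in px.items())             # E[g(X)]
Vg = sum(g(x)**2 * q for x, q in px.items()) - Eg**2  # V = E[g^2] - (E[g])^2
print(Eg, Vg)
```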
