
UNIVERSITY OF CALIFORNIA, DAVIS

Department of Electrical and Computer Engineering


EEC161 Probabilistic Analysis of Electrical and Computer Systems Spring 2012
Midterm #2 Solutions
Problem 1
a) In order to ensure that f_{XY}(x, y) is a valid PDF, we must have
\[
1 = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} f_{XY}(x, y)\,dx\,dy
  = C \int_0^1 x \left[ \int_0^x y^2\,dy \right] dx
  = \frac{C}{3} \int_0^1 x^4\,dx = \frac{C}{15},
\]
so C = 15.
b) The marginal probability density f_X(x) of X is given by
\[
f_X(x) = \int_0^x f_{XY}(x, y)\,dy = 15\,\frac{x^4}{3} = 5x^4
\]
for 0 \le x \le 1 and f_X(x) = 0 otherwise. Similarly, the marginal PDF of Y is given by
\[
f_Y(y) = \int_y^1 f_{XY}(x, y)\,dx = \frac{15}{2}(1 - y^2)y^2
\]
for 0 \le y \le 1 and f_Y(y) = 0 otherwise.
c) Since
\[
f_X(x) f_Y(y) = \frac{75}{2} x^4 (1 - y^2) y^2
\]
for 0 \le x, y \le 1, we conclude that
\[
f_X(x) f_Y(y) \neq f_{XY}(x, y),
\]
so X and Y are not independent.
d) We have
\[
f_{Y|X}(y|x) = \frac{f_{XY}(x, y)}{f_X(x)} = \frac{3y^2}{x^3}
\]
for 0 \le y \le x and f_{Y|X}(y|x) = 0 otherwise.
e) The conditional expectation is
\[
E[Y|X = x] = \int_{-\infty}^{\infty} y f_{Y|X}(y|x)\,dy
  = \frac{3}{x^3} \int_0^x y^3\,dy = \frac{3}{4}x .
\]
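The results of Problem 1 can be verified numerically. The sketch below (not part of the official solution; the grid size and the test point x = 0.8 are arbitrary choices) checks that C = 15 normalizes the density f_{XY}(x, y) = 15 x y^2 on 0 \le y \le x \le 1 and that E[Y | X = 0.8] = 3(0.8)/4 = 0.6.

```python
# Midpoint-rule check of the normalization constant and the conditional mean.
C = 15.0

def f_xy(x, y):
    # Joint density from Problem 1, zero outside 0 <= y <= x <= 1.
    return C * x * y * y if 0.0 <= y <= x <= 1.0 else 0.0

n = 400
h = 1.0 / n
# Double integral of the PDF over the unit square should be 1.
total = sum(f_xy((i + 0.5) * h, (j + 0.5) * h)
            for i in range(n) for j in range(n)) * h * h
assert abs(total - 1.0) < 0.01

# Conditional mean E[Y | X = x] at x = 0.8 should equal 3x/4 = 0.6.
x = 0.8
ys = [(j + 0.5) * h for j in range(n)]
num = sum(y * f_xy(x, y) for y in ys) * h
den = sum(f_xy(x, y) for y in ys) * h
assert abs(num / den - 0.6) < 1e-3
```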
Problem 2:
a) For n \ge 1, N = n visits to replacement-parts stores are needed if the first n-1 stores do not carry the desired replacement part (each with probability 1-p), and the n-th store has the needed part (with probability p). Thus N is a type-1 geometric random variable with
\[
P[N = n] = p(1 - p)^{n-1}
\]
for n \ge 1. The moment generating function is
\[
G_N(z) = pz \sum_{n=1}^{\infty} \bigl((1 - p)z\bigr)^{n-1} = \frac{pz}{1 - (1 - p)z}
\]
for (1 - p)|z| < 1. Its first and second derivatives are given by
\[
\frac{dG_N}{dz} = \frac{p(1 - (1 - p)z) + pz(1 - p)}{(1 - (1 - p)z)^2} = \frac{p}{(1 - (1 - p)z)^2},
\qquad
\frac{d^2 G_N}{dz^2} = \frac{2p(1 - p)}{(1 - (1 - p)z)^3},
\]
so
\[
m_N = \frac{dG_N}{dz}(1) = \frac{1}{p}, \qquad
E[N^2] = \frac{d^2 G_N}{dz^2}(1) + \frac{dG_N}{dz}(1) = \frac{2(1 - p)}{p^2} + \frac{1}{p},
\]
and
\[
K_N = E[N^2] - m_N^2 = \frac{1 - p}{p^2} .
\]
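A quick numerical check of part a) (illustrative only; p = 0.3 and z = 0.9 are arbitrary values satisfying (1-p)|z| < 1): the partial sums of the series match the closed form of G_N(z), and direct summation of the PMF reproduces E[N] = 1/p and K_N = (1-p)/p^2.

```python
# Verify the closed-form PGF and the moments of the geometric distribution.
p = 0.3
z = 0.9

# Truncated series vs. the closed form pz / (1 - (1-p)z).
series = sum(p * (1 - p) ** (n - 1) * z ** n for n in range(1, 200))
closed = p * z / (1 - (1 - p) * z)
assert abs(series - closed) < 1e-9

# Moments by direct summation of the PMF p(1-p)^(n-1).
mean = sum(n * p * (1 - p) ** (n - 1) for n in range(1, 2000))
second = sum(n * n * p * (1 - p) ** (n - 1) for n in range(1, 2000))
assert abs(mean - 1 / p) < 1e-9                       # E[N] = 1/p
assert abs(second - mean ** 2 - (1 - p) / p ** 2) < 1e-6  # K_N = (1-p)/p^2
```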
b) The generating function of X is
\[
M_X(s) = \int_{-\infty}^{\infty} e^{sx} f_X(x)\,dx
       = \int_0^{\infty} \lambda e^{-(\lambda - s)x}\,dx
       = \frac{\lambda}{\lambda - s}
\]
for \operatorname{Re}(s) < \lambda. The first and second derivatives of M_X(s) are given by
\[
\frac{dM_X}{ds} = \frac{\lambda}{(\lambda - s)^2}, \qquad
\frac{d^2 M_X}{ds^2} = \frac{2\lambda}{(\lambda - s)^3},
\]
so
\[
m_X = \frac{dM_X}{ds}(0) = \frac{1}{\lambda}, \qquad
E[X^2] = \frac{d^2 M_X}{ds^2}(0) = \frac{2}{\lambda^2},
\]
and
\[
K_X = E[X^2] - m_X^2 = \frac{1}{\lambda^2} .
\]
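As a sanity check of part b) (a sketch; \lambda = 2 is an arbitrary choice), integrating the exponential density numerically reproduces the MGF-derived moments m_X = 1/\lambda and E[X^2] = 2/\lambda^2.

```python
# Midpoint-rule integration of the Exponential(lam) density out to 20 means.
import math

lam = 2.0
n = 200000
h = (20.0 / lam) / n  # integrate x in [0, 20/lam]; the tail beyond is negligible

xs = [(i + 0.5) * h for i in range(n)]
mean = sum(x * lam * math.exp(-lam * x) for x in xs) * h
second = sum(x * x * lam * math.exp(-lam * x) for x in xs) * h

assert abs(mean - 1 / lam) < 1e-6        # m_X = 1/lam
assert abs(second - 2 / lam ** 2) < 1e-6  # E[X^2] = 2/lam^2
```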
c) If Charles visits N = n stores until he finds the desired part, the time spent is
\[
Y = \sum_{i=1}^{n} X_i ,
\]
and since the random variables X_i are independent,
\[
M_{Y|N}(s|n) = E[e^{sY} | N = n] = \prod_{i=1}^{n} E[e^{sX_i}] = (M_X(s))^n .
\]
d) By using the principle of total probability,
\[
M_Y(s) = \sum_{n=1}^{\infty} E[e^{sY} | N = n]\, P[N = n]
       = p M_X(s) \sum_{n=1}^{\infty} \bigl(M_X(s)(1 - p)\bigr)^{n-1}
       = \frac{p M_X(s)}{1 - (1 - p) M_X(s)}
\]
for values of s such that (1 - p) M_X(s) < 1. This gives
\[
M_Y(s) = \frac{p\lambda/(\lambda - s)}{1 - (1 - p)\lambda/(\lambda - s)}
       = \frac{p\lambda}{p\lambda - s} .
\]
e) Comparing M_Y(s) with M_X(s), we conclude that Y is an exponential random variable with parameter p\lambda. Its mean m_Y and variance K_Y are therefore
\[
m_Y = \frac{1}{p\lambda}, \qquad K_Y = \frac{1}{(p\lambda)^2} .
\]
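The conclusion of part e) can be checked by simulation. The sketch below (not part of the solution; p = 0.25 and \lambda = 2 are arbitrary parameters) draws a geometric number of Exponential(\lambda) store-visit times and confirms that their sum behaves like an Exponential(p\lambda) random variable, with E[Y] = 1/(p\lambda) and K_Y = 1/(p\lambda)^2.

```python
# Monte-Carlo check: geometric sum of exponentials is exponential.
import random

random.seed(1)
p, lam = 0.25, 2.0

samples = []
for _ in range(200000):
    y = random.expovariate(lam)          # time at the first store
    while random.random() > p:           # store lacks the part (prob 1-p)
        y += random.expovariate(lam)     # visit one more store
    samples.append(y)

m = sum(samples) / len(samples)
v = sum((y - m) ** 2 for y in samples) / len(samples)
assert abs(m - 1 / (p * lam)) < 0.05       # E[Y] = 1/(p*lam) = 2
assert abs(v - 1 / (p * lam) ** 2) < 0.2   # Var[Y] = 1/(p*lam)^2 = 4
```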
Problem 3:
a) Since X and Y are independent, their joint density is given by
\[
f_{XY}(x, y) = f_X(x) f_Y(y) = \exp(-(x + y))\, u(x)\, u(y) .
\]
b) The transformation can be written in matrix form as
\[
\begin{bmatrix} U \\ V \end{bmatrix} = A \begin{bmatrix} X \\ Y \end{bmatrix}, \tag{1}
\]
where
\[
A = \begin{bmatrix} 1 & 1 \\ 1 & -1 \end{bmatrix} .
\]
The transformation (1) is one-to-one and its inverse is given by
\[
\begin{bmatrix} X \\ Y \end{bmatrix}
= \begin{bmatrix} h_1(U, V) \\ h_2(U, V) \end{bmatrix}
= A^{-1} \begin{bmatrix} U \\ V \end{bmatrix}
\]
with
\[
A^{-1} = \frac{1}{2} \begin{bmatrix} 1 & 1 \\ 1 & -1 \end{bmatrix} .
\]
The Jacobian J(u, v) of the transformation (1) is given by
\[
J = |\det A^{-1}| = \frac{1}{2},
\]
and the joint density of U and V can be expressed as
\[
f_{UV}(u, v) = J(u, v)\, f_{XY}(h_1(u, v), h_2(u, v))
= \frac{1}{2} \exp(-u)\, u(u + v)\, u(u - v)
= \begin{cases} \frac{1}{2} \exp(-u) & u > |v| \\ 0 & \text{otherwise.} \end{cases}
\]
c) The marginal density of U is
\[
f_U(u) = \int_{-\infty}^{\infty} f_{UV}(u, v)\,dv
       = \frac{1}{2} \exp(-u)\, u(u) \int_{-u}^{u} dv
       = u \exp(-u)\, u(u) ,
\]
which is an Erlang distribution of order 2 and parameter \lambda = 1, as expected since U is the sum of two exponential random variables with parameter \lambda = 1. The marginal density of V is
\[
f_V(v) = \int_{-\infty}^{\infty} f_{UV}(u, v)\,du
       = \frac{1}{2} \int_{|v|}^{\infty} \exp(-u)\,du
       = \frac{1}{2} \exp(-|v|) ,
\]
so that V has a Laplace distribution with parameter \lambda = 1.
d) The joint density satisfies
\[
f_{U,V}(u, v) \neq f_U(u) f_V(v) ,
\]
so U and V are not independent.
e) The random variables U and V have mean vector
\[
\begin{bmatrix} m_U \\ m_V \end{bmatrix}
= A \begin{bmatrix} m_X \\ m_Y \end{bmatrix}
= \begin{bmatrix} 1 & 1 \\ 1 & -1 \end{bmatrix}
  \begin{bmatrix} 1 \\ 1 \end{bmatrix}
= \begin{bmatrix} 2 \\ 0 \end{bmatrix},
\]
and covariance matrix
\[
\begin{bmatrix} K_U & K_{UV} \\ K_{VU} & K_V \end{bmatrix}
= E\left[ \begin{bmatrix} U - m_U \\ V - m_V \end{bmatrix}
          \begin{bmatrix} U - m_U & V - m_V \end{bmatrix} \right]
= A \begin{bmatrix} 1 & 0 \\ 0 & 1 \end{bmatrix} A^T
= \begin{bmatrix} 2 & 0 \\ 0 & 2 \end{bmatrix} .
\]
Since K_{UV} = E[(U - m_U)(V - m_V)] = 0, we conclude that U and V are uncorrelated even though they are not independent.
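Problem 3 can be confirmed by simulation (a sketch, not part of the solution; the sample size is arbitrary): with X, Y i.i.d. Exponential(1), the pair U = X + Y, V = X - Y should have means (2, 0), variances (2, 2), zero covariance, and yet be dependent, since V is confined to the region |V| \le U.

```python
# Monte-Carlo check of the moments and support of (U, V).
import random

random.seed(2)
n = 200000
us, vs = [], []
for _ in range(n):
    x, y = random.expovariate(1.0), random.expovariate(1.0)
    us.append(x + y)   # U = X + Y
    vs.append(x - y)   # V = X - Y

mu = sum(us) / n
mv = sum(vs) / n
var_u = sum((u - mu) ** 2 for u in us) / n
var_v = sum((v - mv) ** 2 for v in vs) / n
cov = sum((u - mu) * (v - mv) for u, v in zip(us, vs)) / n

assert abs(mu - 2.0) < 0.05 and abs(mv) < 0.05   # means (2, 0)
assert abs(var_u - 2.0) < 0.1 and abs(var_v - 2.0) < 0.1  # variances (2, 2)
assert abs(cov) < 0.05                           # uncorrelated
# ...but clearly dependent: every sample lies in the support |V| <= U.
assert all(abs(v) <= u for u, v in zip(us, vs))
```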
Problem 4
a) By the central limit theorem, for large n, the random variable
\[
Y_n = \frac{S_n - m_X n}{\sqrt{K_X n}} = \frac{S_n - 3n}{3\sqrt{n}}
\]
is N(0, 1) distributed. For n = 100, we therefore find
\[
P[S_n > 360] = P\left[ Y_n > \frac{60}{30} \right] = P[Y_n > 2]
= Q(2) = 2.275 \times 10^{-2},
\]
so there is roughly only a 2% chance the parking lot capacity will be exceeded after 100 days.
b) On day N, we have
\[
P[S_N > 360] = P\left[ Y_N > \frac{360 - 3N}{3\sqrt{N}} \right]
= Q\left( \frac{120 - N}{\sqrt{N}} \right) = \frac{1}{2},
\]
so that
\[
\frac{120 - N}{\sqrt{N}} = Q^{-1}(1/2) = 0
\]
and thus N = 120.
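The crossing day found in part b) can be verified directly: the tail probability equals 1/2 exactly when the argument of Q is zero, i.e. at N = 120 (a sketch using the same Q(x) = erfc(x/\sqrt{2})/2 identity as above).

```python
# Verify that N = 120 solves Q((120 - N)/sqrt(N)) = 1/2.
import math

def Q(x):
    return math.erfc(x / math.sqrt(2.0)) / 2.0

N = 120
assert abs(Q((120 - N) / math.sqrt(N)) - 0.5) < 1e-12
# A day earlier or later, the probability is on the wrong side of 1/2:
assert Q((120 - 119) / math.sqrt(119)) < 0.5
assert Q((120 - 121) / math.sqrt(121)) > 0.5
```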