
Solutions to Homework Set #3

(Prepared by Yu Xiang)

1. Time until the n-th arrival. Let the random variable N(t) be the number of packets arriving during time (0, t]. Suppose N(t) is Poisson with pmf

    pN(n) = ((λt)^n / n!) e^{−λt} for n = 0, 1, 2, . . . .

Let the random variable Y be the time to get the n-th packet. Find the pdf of Y.

Solution: To find the pdf fY(t) of the random variable Y, note that the event {Y ≤ t} occurs iff the time of the n-th packet is in (0, t], that is, iff the number N(t) of packets arriving in (0, t] is at least n. Equivalently, {Y > t} occurs iff {N(t) < n}. Hence, the cdf FY(t) of Y is given by

    FY(t) = P{Y ≤ t} = P{N(t) ≥ n} = Σ_{k=n}^∞ ((λt)^k / k!) e^{−λt}.

Differentiating FY (t) with respect to t, we get the pdf fY (t) as


    fY(t) = Σ_{k=n}^∞ [ −λe^{−λt} (λt)^k/k! + λe^{−λt} (λt)^{k−1}/(k−1)! ]
          = λe^{−λt} (λt)^{n−1}/(n−1)! − Σ_{k=n}^∞ λe^{−λt} (λt)^k/k! + Σ_{k=n+1}^∞ λe^{−λt} (λt)^{k−1}/(k−1)!
          = λe^{−λt} (λt)^{n−1}/(n−1)!

for t > 0, where the last two sums cancel after reindexing the second one. This is the Erlang pdf of order n.
Alternatively, since the interarrival time T between packets is an exponential random variable with pdf

    fT(t) = λe^{−λt} for t ≥ 0, and 0 otherwise,

and the interarrival times Ti are i.i.d., we have Y = T1 + T2 + · · · + Tn. Convolving fT(t) with itself n − 1 times, which can also be computed via its Fourier transform (characteristic function), shows that the pdf of Y is

    fY(t) = λe^{−λt} (λt)^{n−1}/(n−1)! for t ≥ 0, and 0 otherwise.
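
A quick numerical sanity check of this result (a sketch in Python, assuming numpy is available; the rate λ = 2 and order n = 5 are illustrative choices, not from the problem):

    import math
    import numpy as np

    rng = np.random.default_rng(0)
    lam, n = 2.0, 5                      # illustrative parameters
    # Y = T1 + ... + Tn with Ti i.i.d. Exp(lam)
    samples = rng.exponential(scale=1/lam, size=(200_000, n)).sum(axis=1)

    # Empirical density near t versus the Erlang formula derived above
    t, h = 2.0, 0.05
    empirical = np.mean(np.abs(samples - t) < h) / (2 * h)
    theory = lam * math.exp(-lam * t) * (lam * t)**(n - 1) / math.factorial(n - 1)
    print(f"empirical {empirical:.3f}  vs  Erlang pdf {theory:.3f}")

The two numbers should agree to within Monte Carlo error.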

2. Diamond distribution. Consider the random variables X and Y with the joint pdf
    fX,Y(x, y) = c if |x| + |y| ≤ 1/√2, and 0 otherwise,

where c is a constant.

(a) Find c.
(b) Find fX (x) and fX|Y (x|y).
(c) Are X and Y independent random variables? Justify your answer.

Solution:

(a) The integral of the pdf fX,Y(x, y) over the plane is c times the area of the diamond. The diamond has both diagonals of length √2, so its area is (1/2) · √2 · √2 = 1, and the integral equals c. Therefore, by the definition of a joint density,
c = 1.
(b) The marginal pdf is obtained by integrating the joint pdf with respect to y. For 0 ≤ x ≤ 1/√2,

    fX(x) = ∫_{−1/√2+x}^{1/√2−x} c dy = 2(1/√2 − x),

and for −1/√2 ≤ x ≤ 0,

    fX(x) = ∫_{−1/√2−x}^{1/√2+x} c dy = 2(1/√2 + x).

So the marginal pdf may be written as

    fX(x) = √2 − 2|x| if |x| ≤ 1/√2, and 0 otherwise.

Now since fX,Y(x, y) is symmetric in x and y, fY(y) = fX(y). Thus,

    fX|Y(x|y) = fX,Y(x, y)/fY(y) = 1/(√2 − 2|y|) if |x| + |y| ≤ 1/√2 and |y| < 1/√2, and 0 otherwise.

(c) X and Y are not independent since

    fX,Y(x, y) ≠ fX(x) fY(y).

Alternatively, X and Y are not independent since fX|Y (x|y) depends on the value of y.
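
The marginal can also be checked numerically (a sketch, assuming numpy; points are drawn uniformly on the diamond by rejection from the enclosing square, and x0 = 0.25 is an arbitrary test point):

    import numpy as np

    rng = np.random.default_rng(1)
    r = 1 / np.sqrt(2)
    pts = rng.uniform(-r, r, size=(500_000, 2))
    pts = pts[np.abs(pts[:, 0]) + np.abs(pts[:, 1]) <= r]   # keep the diamond

    # Empirical marginal of X near x0 versus sqrt(2) - 2|x0|
    x0, h = 0.25, 0.01
    empirical = np.mean(np.abs(pts[:, 0] - x0) < h) / (2 * h)
    print(empirical, np.sqrt(2) - 2 * abs(x0))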

3. First available teller. Consider a bank with two tellers. The service times for the tellers are
independent exponentially distributed random variables X1 ∼ Exp(λ1 ) and X2 ∼ Exp(λ2 )
respectively. You arrive at the bank and find that both tellers are busy but that nobody else
is waiting to be served. You are served by the first available teller once he/she is free. What
is the probability that you are served by the first teller?

Solution: By the memoryless property of the exponential distribution, the remaining service times of the two tellers are also independent exponentially distributed random variables with parameters λ1 and λ2, respectively. The probability that you will be served by the first teller
is the probability that the first teller finishes the service before the second teller does. Thus,
    P{X1 < X2} = ∫∫_{x2 > x1} fX1,X2(x1, x2) dx2 dx1
               = ∫_{x1=0}^∞ ∫_{x2=x1}^∞ λ1 e^{−λ1 x1} λ2 e^{−λ2 x2} dx2 dx1
               = ∫_{x1=0}^∞ λ1 e^{−(λ1+λ2) x1} dx1
               = λ1/(λ1 + λ2).
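
A short simulation confirming P{X1 < X2} = λ1/(λ1 + λ2) (a sketch, assuming numpy; the rates are illustrative):

    import numpy as np

    rng = np.random.default_rng(2)
    lam1, lam2 = 1.0, 3.0                # illustrative rates
    x1 = rng.exponential(1/lam1, 1_000_000)
    x2 = rng.exponential(1/lam2, 1_000_000)
    print(np.mean(x1 < x2), lam1 / (lam1 + lam2))   # both ~ 0.25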

4. Coin with random bias. You are given a coin but are not told what its bias (probability
of heads) is. You are told instead that the bias is the outcome of a random variable P ∼
Unif[0, 1]. To get more information about the coin bias, you flip it independently 10 times.
Let X be the number of heads you get, so X ∼ B(10, P). Given that X = 9, find and sketch the a posteriori pdf of P, i.e., fP|X(p|9).
Solution: In order to find the conditional pdf of P, apply Bayes’ rule for mixed random
variables to get

    fP|X(p|x) = (pX|P(x|p)/pX(x)) fP(p) = pX|P(x|p) fP(p) / ∫_0^1 pX|P(x|p′) fP(p′) dp′.

Now it is given that X = 9. The binomial coefficient C(10, 9) appears in both the numerator and the denominator and cancels, so for 0 ≤ p ≤ 1,

    fP|X(p|9) = p^9 (1 − p) / ∫_0^1 p^9 (1 − p) dp = p^9 (1 − p) / (1/110) = 110 p^9 (1 − p).

Figure 1 compares the unconditional and the conditional pdfs of P. Given the information that 10 independent tosses resulted in 9 heads, the pdf shifts its mass toward the value 9/10.

Figure 1: Comparison of a priori and a posteriori pdfs of P
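
Note that 110 p^9 (1 − p) is the Beta(10, 2) density, so the normalization and the location of the mode can be verified directly (a sketch, assuming numpy; the mean over [0, 1] approximates the integral since the interval has length 1):

    import numpy as np

    p = np.linspace(0, 1, 100_001)
    post = 110 * p**9 * (1 - p)
    print(post.mean())           # ~ 1.0: the density integrates to 1 over [0, 1]
    print(p[np.argmax(post)])    # mode at p = 9/10, matching the sketch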

5. Optical communication channel. Let the signal input to an optical channel be:

    X = λ0 with probability 1/2,  λ1 with probability 1/2.

The output Y of the channel is conditionally Poisson: Y | {X = λ0} ∼ Poisson(λ0), i.e., Poisson with intensity λ0, and Y | {X = λ1} ∼ Poisson(λ1).
Show that the MAP rule reduces to

    D(y) = λ0 if y < y∗, and λ1 otherwise.

Find y ∗ and the corresponding probability of error.

Solution: The MAP rule

    D(y) = λ0 if pX|Y(λ0|y) > pX|Y(λ1|y), and λ1 otherwise,

minimizes the probability of decoding error. Since the a priori probabilities of the two X values are equal, the MAP rule is equivalent to the ML rule

    D(y) = λ0 if pY|X(y|λ0)/pY|X(y|λ1) > 1, and λ1 otherwise.

Now,

    pY|X(y|λ0)/pY|X(y|λ1) = (e^{−λ0} λ0^y/y!) / (e^{−λ1} λ1^y/y!) = e^{λ1 − λ0 − y ln(λ1/λ0)}.

This ratio is greater than 1 iff y < (λ1 − λ0)/(ln λ1 − ln λ0). Therefore, when λ0 = 1 and λ1 = 2, we have

    D(y) = 1 if y < 1/ln 2, and 2 otherwise,

and

    y∗ = 1/ln 2 ≈ 1.44.

The probability of error is

    Pe = P{D(Y) ≠ X}
       = P{Y > y∗ | X = 1} P{X = 1} + P{Y < y∗ | X = 2} P{X = 2}
       = Σ_{y=2}^∞ (e^{−1}/y!) × 0.5 + Σ_{y=0}^1 (e^{−2} 2^y/y!) × 0.5
       = 0.335.

Repeating this for λ0 = 1 and λ1 = 100, we have

    D(y) = 1 if y < 99/ln 100, and 100 otherwise,

and

    y∗ = 99/ln 100 ≈ 21.497.
The probability of error is

    Pe = P{D(Y) ≠ X}
       = Σ_{y=22}^∞ (e^{−1}/y!) × 0.5 + Σ_{y=0}^{21} (e^{−100} 100^y/y!) × 0.5
       ≤ 0.5 ( P(Y ≥ 22 | X = 1) + 22 × e^{−100} 100^{21}/21! )
       ≈ 0.5 (1/22 + 1.6 × 10^{−20}) ≲ 0.025,

where the first term is bounded via the Markov inequality, P(Y ≥ 22) ≤ E[Y]/22 = 1/22, and the second sum is bounded by 22 copies of its largest term, since the Poisson(100) pmf is increasing before k∗ ≈ 100. The bound can be further tightened using Chebyshev's inequality:

    Pe ≤ 0.5 ( P(|Y − 1| ≥ 21 | X = 1) + 22 × e^{−100} 100^{21}/21! ) ≈ 0.5 (1/21² + 1.6 × 10^{−20}) ≲ 0.0011.
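
The exact error probabilities for both parameter pairs can be computed from the Poisson cdf (a sketch, assuming scipy is available):

    import math
    from scipy.stats import poisson

    def p_error(lam0, lam1):
        # threshold from the likelihood-ratio test above
        y_star = (lam1 - lam0) / (math.log(lam1) - math.log(lam0))
        k = math.floor(y_star)                  # decide lam0 iff y <= k
        return 0.5 * poisson.sf(k, lam0) + 0.5 * poisson.cdf(k, lam1)

    print(p_error(1, 2))      # ~ 0.335
    print(p_error(1, 100))    # astronomically small, far below the 0.0011 bound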

6. Iocane or Sennari. An absent-minded chemistry professor forgets to label two identical-looking bottles. One contains a chemical named “Iocane” and the other contains a chemical named “Sennari”. It is well known that the radioactivity level of “Iocane” has the Unif[0, 1] distribution, while the radioactivity level of “Sennari” has the Exp(1) distribution.

(a) Let X be the radioactivity level measured from one of the bottles. What is the optimal
decision rule (based on the measurement X) that maximizes the chance of correctly
identifying the content of the bottle?
(b) What is the associated probability of error?

Solution: Let Θ = 0 denote the case in which the content of the bottle is “Iocane” and let
Θ = 1 denote the case in which the content of the bottle is “Sennari”. Implicit in the problem
statement is that P(Θ = 0) = P(Θ = 1) = 1/2.

(a) The optimal MAP rule is equivalent to the ML rule

    D(x) = 0 if fX|Θ(x|0) > fX|Θ(x|1), and 1 otherwise.

Since the Unif[0, 1] pdf fX|Θ(x|0) = 1 is larger than the Exp(1) pdf fX|Θ(x|1) = e^{−x} for 0 < x < 1, we have

    D(x) = 0 if 0 < x < 1, and 1 otherwise.

(b) The probability of error is given by

    P(Θ ≠ D(X)) = (1/2) P(Θ ≠ D(X) | Θ = 0) + (1/2) P(Θ ≠ D(X) | Θ = 1)
                = (1/2) P(X ≥ 1 | Θ = 0) + (1/2) P(0 < X < 1 | Θ = 1)
                = (1/2)(1 − e^{−1}),

since P(X ≥ 1 | Θ = 0) = 0 for X ∼ Unif[0, 1].
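
A Monte Carlo check of this error probability (a sketch, assuming numpy):

    import numpy as np

    rng = np.random.default_rng(3)
    n = 1_000_000
    theta = rng.integers(0, 2, n)                 # 0: Iocane, 1: Sennari
    x = np.where(theta == 0,
                 rng.uniform(0, 1, n),            # Unif[0,1] radioactivity
                 rng.exponential(1.0, n))         # Exp(1) radioactivity
    decision = np.where((x > 0) & (x < 1), 0, 1)
    print(np.mean(decision != theta))             # ~ (1 - e^{-1})/2 ~ 0.316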

7. Two independent uniform random variables.


Let X and Y be independently and uniformly drawn from the interval [0, 1].

(a) Find the pdf of U = max(X, Y ).


(b) Find the pdf of V = min(X, Y ).
(c) Find the pdf of W = U − V .
(d) Find the probability P{|X − Y | ≥ 1/2}.

Solution:

(a) We have

    FU(u) = P{U ≤ u} = P{max(X, Y) ≤ u} = P{X ≤ u, Y ≤ u} = P{X ≤ u} P{Y ≤ u} = u²

for 0 ≤ u ≤ 1. Hence,

    fU(u) = 2u for 0 ≤ u ≤ 1, and 0 otherwise.

(b) Similarly,

    1 − FV(v) = P{V > v} = P{min(X, Y) > v} = P{X > v, Y > v} = P{X > v} P{Y > v} = (1 − v)²,

or equivalently, FV(v) = 1 − (1 − v)² for 0 ≤ v ≤ 1. Hence,

    fV(v) = 2(1 − v) for 0 ≤ v ≤ 1, and 0 otherwise.

(c) First note that W = U − V = |X − Y |. (Why?) Hence,

P{W ≤ w} = P{|X − Y | ≤ w}
= P (−w ≤ X − Y ≤ w).

Since X and Y are uniformly distributed over [0, 1], this probability equals the area of the band {(x, y) : |x − y| ≤ w} inside the unit square (the shaded region in the figure below).

[Figure: the unit square with the band of width w around the diagonal y = x shaded.]

The area is easily calculated as 1 − (1 − w)² for 0 ≤ w ≤ 1. Hence FW(w) = 1 − (1 − w)² and

    fW(w) = 2(1 − w) for 0 ≤ w ≤ 1, and 0 otherwise.

(d) From the figure above,

P{|X − Y | ≥ 1/2} = P{W ≥ 1/2} = 1/4.
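
Parts (c) and (d) are easy to confirm by simulation (a sketch, assuming numpy; w0 = 0.3 is an arbitrary test point):

    import numpy as np

    rng = np.random.default_rng(4)
    x, y = rng.uniform(size=(2, 1_000_000))
    w = np.abs(x - y)
    print(np.mean(w >= 0.5))                      # ~ 1/4, as in part (d)

    # Empirical density of W near w0 versus 2(1 - w0)
    w0, h = 0.3, 0.01
    print(np.mean(np.abs(w - w0) < h) / (2 * h), 2 * (1 - w0))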

8. Waiting time at the bank. Consider a bank with two tellers. The service times for the
tellers are independent exponentially distributed random variables X1 ∼ Exp(λ1 ) and X2 ∼
Exp(λ2 ), respectively. You arrive at the bank and find that both tellers are busy but that
nobody else is waiting to be served. You are served by the first available teller once he/she
becomes free. Let the random variable Y denote your waiting time. Find the pdf of Y .
Solution: First observe that Y = min(X1 , X2 ). Since

    P{Y > y} = P{X1 > y, X2 > y} = P{X1 > y} P{X2 > y} = e^{−λ1 y} × e^{−λ2 y} = e^{−(λ1+λ2) y}

for y ≥ 0, Y is an exponential random variable with pdf

    fY(y) = (λ1 + λ2) e^{−(λ1+λ2) y} for y ≥ 0, and 0 otherwise.
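
A one-line check that min(X1, X2) behaves like Exp(λ1 + λ2) (a sketch, assuming numpy; the rates are illustrative):

    import numpy as np

    rng = np.random.default_rng(5)
    lam1, lam2 = 1.0, 3.0
    y = np.minimum(rng.exponential(1/lam1, 1_000_000),
                   rng.exponential(1/lam2, 1_000_000))
    print(y.mean(), 1 / (lam1 + lam2))            # both ~ 0.25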

Additional Exercises
Do not turn in solutions to these problems.

1. Independence. Let X ∈ X and Y ∈ Y be two independent discrete random variables.

(a) Show that any two events A ⊆ X and B ⊆ Y are independent.


(b) Show that any two functions of X and Y separately are independent; that is, if U = g(X)
and V = h(Y ) then U and V are independent.

Solution:
(a) Recall that the probability of any event A ⊆ X is given by P{X ∈ A} = Σ_{x∈A} pX(x). Because of the independence of X and Y, we have

    P{X ∈ A, Y ∈ B} = Σ_{x∈A} Σ_{y∈B} pX,Y(x, y)
                    = Σ_{x∈A} Σ_{y∈B} pX(x) pY(y)
                    = (Σ_{x∈A} pX(x)) (Σ_{y∈B} pY(y))
                    = P{X ∈ A} P{Y ∈ B}.

Therefore, A and B are independent.


(b) Let Au = {x : g(x) ≤ u} and Bv = {y : h(y) ≤ v}. Then the joint cdf of U and V is

    FU,V(u, v) = P{U ≤ u, V ≤ v} = P{g(X) ≤ u, h(Y) ≤ v} = P{X ∈ Au, Y ∈ Bv}.

However, because of the independence of X and Y,

    FU,V(u, v) = P{X ∈ Au} P{Y ∈ Bv}
               = P{g(X) ≤ u} P{h(Y) ≤ v}
               = P{U ≤ u} P{V ≤ v}
               = FU(u) FV(v),

so that U and V are independent.

2. Family planning. Alice and Bob choose a number X at random from the set {2, 3, 4} (so the
outcomes are equally probable). If the outcome is X = x, they decide to have children until
they have a girl or x children, whichever comes first. Assume that each child is a girl with
probability 1/2 (independent of the number of children and gender of other children). Let Y
be the number of children they will have.

(a) Find the conditional pmf pY |X (y|x) for all possible values of x and y.
(b) Find the pmf of Y .

Solution:

(a) Note that Y ∈ {1, 2, 3, 4}. The conditional pmf is as follows:

    pY|X(1|2) = 1/2,  pY|X(2|2) = 1/2,
    pY|X(1|3) = 1/2,  pY|X(2|3) = 1/4,  pY|X(3|3) = 1/4,
    pY|X(1|4) = 1/2,  pY|X(2|4) = 1/4,  pY|X(3|4) = 1/8,  pY|X(4|4) = 1/8.

(b) The pmf of Y is

    pY(1) = Σ_{x=2}^4 pX(x) pY|X(1|x) = 1/2,   pY(2) = Σ_{x=2}^4 pX(x) pY|X(2|x) = 1/3,
    pY(3) = Σ_{x=3}^4 pX(x) pY|X(3|x) = 1/8,   pY(4) = pX(4) pY|X(4|4) = 1/24.
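
The total-probability computation in part (b) can be reproduced exactly with rational arithmetic (a sketch in Python):

    from fractions import Fraction

    p_cond = {2: {1: Fraction(1, 2), 2: Fraction(1, 2)},
              3: {1: Fraction(1, 2), 2: Fraction(1, 4), 3: Fraction(1, 4)},
              4: {1: Fraction(1, 2), 2: Fraction(1, 4),
                  3: Fraction(1, 8), 4: Fraction(1, 8)}}

    p_y = {y: sum(Fraction(1, 3) * p_cond[x].get(y, Fraction(0)) for x in (2, 3, 4))
           for y in (1, 2, 3, 4)}
    print(p_y)   # {1: 1/2, 2: 1/3, 3: 1/8, 4: 1/24}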

3. Joint cdf or not. Consider the function


    G(x, y) = 1 if x + y ≥ 0, and 0 otherwise.

Can G be a joint cdf for a pair of random variables? Justify your answer.

Solution: No. Note that for every x,

    lim_{y→∞} G(x, y) = 1,

which would force the marginal cdf to satisfy FX(x) = 1 for all x. But for any genuine cdf,

    lim_{x→−∞} FX(x) = 0 ≠ 1.
Therefore G(x, y) is not a cdf. Alternatively, suppose G(x, y) were a joint cdf for X and Y. Then

    P{−1 < X ≤ 2, −1 < Y ≤ 2} = G(2, 2) − G(−1, 2) − G(2, −1) + G(−1, −1) = 1 − 1 − 1 + 0 = −1,

which violates the property that the probability of any event must be nonnegative.

Figure 2: The conditional pdfs fY|S(y|−1), fY|S(y|0), and fY|S(y|+1) for λ = 1

4. Ternary signaling. Let the signal S be a random variable defined as follows:



    S = −1 with probability 1/3,
         0 with probability 1/3,
        +1 with probability 1/3.

The signal is sent over a channel with additive Laplacian noise Z, i.e., Z is a Laplacian
random variable with pdf
    fZ(z) = (λ/2) e^{−λ|z|},  −∞ < z < ∞.
The signal S and the noise Z are assumed to be independent and the channel output is their
sum Y = S + Z.

(a) Find fY |S (y|s) for s = −1, 0, +1 . Sketch the conditional pdfs on the same graph.
(b) Find the optimal decoding rule D(Y ) for deciding whether S is −1, 0 or +1. Give your
answer in terms of ranges of values of Y .
(c) Find the probability of decoding error for D(y) in terms of λ.

Solution:

(a) We use a trick here that is used several times in the lecture notes. Since Y = S + Z and
Z and S are independent, the conditional pdf is

    fY|S(y|s) = fZ(y − s) = (λ/2) e^{−λ|y−s|}.

The conditional pdfs are shown for λ = 1 in Figure 2.

(b) The optimal decoding rule is the MAP rule: D(y) = s, where s maximizes

    p(s|y) = f(y|s) p(s) / f(y).

Since pS(s) is the same for s = −1, 0, +1, the MAP rule becomes the maximum-likelihood decoding rule D(y) = arg max_s f(y|s). The conditional pdfs are plotted in Figure 2. By inspection, the ML rule reduces to

    D(y) = −1 if y < −1/2,
            0 if −1/2 < y < +1/2,
           +1 if y > +1/2.

(c) Inspection of Figure 2 shows how to calculate the probability of error:

    Pe = Σ_i P{error | i sent} P{i sent}
       = (1/3) Σ_i P{error | i sent}
       = (1/3)(1 − P{−1/2 < S + Z < 1/2 | S = 0}) + (1/3) P{S + Z > −1/2 | S = −1} + (1/3) P{S + Z < 1/2 | S = +1}
       = (1/3)(1 − P{−1/2 < Z < 1/2}) + (1/3) P{Z > 1/2} + (1/3) P{Z < −1/2}
       = (1/3)(P{Z < −1/2} + P{Z > 1/2}) + (1/3) P{Z > 1/2} + (1/3) P{Z < −1/2}
       = (2/3)(P{Z < −1/2} + P{Z > 1/2})
       = (4/3) P{Z > 1/2}          (by symmetry)
       = (4/3) ∫_{1/2}^∞ (λ/2) e^{−λz} dz = (2/3) e^{−λ/2}.
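
The formula Pe = (2/3) e^{−λ/2} is easy to verify by simulation (a sketch, assuming numpy; λ = 1 is the value used in Figure 2):

    import numpy as np

    rng = np.random.default_rng(6)
    lam, n = 1.0, 1_000_000
    s = rng.choice([-1, 0, 1], size=n)
    z = rng.laplace(scale=1/lam, size=n)          # pdf (lam/2) e^{-lam|z|}
    y = s + z
    d = np.where(y < -0.5, -1, np.where(y < 0.5, 0, 1))
    print(np.mean(d != s), (2/3) * np.exp(-lam/2))   # both ~ 0.404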

5. Signal or no signal. Consider a communication system that is operated only from time to
time. When the communication system is in the “normal” mode (denoted by M = 1), it
transmits a random signal S = X with
    X = +1 with probability 1/2,  −1 with probability 1/2.

When the system is in the “idle” mode (denoted by M = 0), it does not transmit any signal
(S = 0). Both normal and idle modes occur with equal probability. Thus
    S = X with probability 1/2,  0 with probability 1/2.

The receiver observes Y = S + Z, where the ambient noise Z ∼ U[−1, 1] is independent of S.

(a) Find and sketch the conditional pdf fY |M (y|1) of the receiver observation Y given that
the system is in the normal mode.
(b) Find and sketch the conditional pdf fY |M (y|0) of the receiver observation Y given that
the system is in the idle mode.
(c) Find the optimal decoder D(y) for deciding whether the system is normal or idle. Provide
the answer in terms of intervals of y.
(d) Find the associated probability of error.

Solution:

(a) If M = 1,

    Y = 1 + Z with probability 1/2,  −1 + Z with probability 1/2.

Hence, we have

    fY|M(y|1) = (1/2) fZ(y − 1) + (1/2) fZ(y + 1) = 1/4 for −2 ≤ y ≤ 2, and 0 otherwise.

(b) If M = 0, Y = Z, so

    fY|M(y|0) = fZ(y) = 1/2 for −1 ≤ y ≤ 1, and 0 otherwise.

(c) Since both modes are equally likely, the optimal MAP decoding rule reduces to the ML rule

    D(y) = 0 if fY|M(y|0) > fY|M(y|1), and 1 otherwise,

which gives

    D(y) = 0 if −1 < y < 1, and 1 otherwise.

(d) The only error event is M = 1 with −1 < Y < 1 (given M = 0, Y = Z always lies in [−1, 1] and is decoded correctly). Hence the probability of error is

    P{M ≠ D(Y)} = P{M = 1, −1 < Y < 1} = P{M = 1} P{−1 < Y < 1 | M = 1} = (1/2)(1/2) = 1/4.
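
A final Monte Carlo check that the decoder's error probability is 1/4 (a sketch, assuming numpy):

    import numpy as np

    rng = np.random.default_rng(7)
    n = 1_000_000
    m = rng.integers(0, 2, n)                         # mode: 0 idle, 1 normal
    s = np.where(m == 1, rng.choice([-1, 1], n), 0)   # transmitted signal
    y = s + rng.uniform(-1, 1, n)                     # add U[-1, 1] noise
    d = np.where(np.abs(y) < 1, 0, 1)
    print(np.mean(d != m))                            # ~ 0.25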

