
# Homework 8

SS-1. Problem 4.11 in Stark and Woods (see problems at end of PDF)
Ans: 5.8
SS-2. Problem 4.12 in Stark and Woods (see problems at end of PDF)
Ans:
SS-3. Problem 4.23 in Stark and Woods (see problems at end of PDF)
1. Consider the relation between $E[f(X)]$ and $f(E[X])$. Are they equal?
(a) First conduct the following experiment. Generate 100,000 random variables $X$ that are exponentially distributed with parameter $\lambda = 10$. Create new variables $Y = \sqrt{X}$ and $Z = X^3$. Estimate $E[X]$, $E[Y]$, and $E[Z]$ using the sample mean (i.e., the average of the data). Compare $E[Y]$ with $\sqrt{E[X]}$ and $E[Z]$ with $(E[X])^3$.
(b) Find E[Y ] and E[Z] analytically and compare with your answers to part (a).
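A minimal sketch of the experiment in part (a), assuming NumPy and interpreting the parameter $\lambda = 10$ as the rate of the exponential (pass `scale=10` instead if the parameter is meant as the mean):

```python
import numpy as np

rng = np.random.default_rng(0)

# 100,000 exponential samples; lambda = 10 is taken here as the rate,
# so the scale (mean) passed to NumPy is 1/10.
X = rng.exponential(scale=1/10, size=100_000)
Y = np.sqrt(X)
Z = X**3

print(X.mean())                        # estimate of E[X]
print(Y.mean(), np.sqrt(X.mean()))     # compare E[Y] with sqrt(E[X])
print(Z.mean(), X.mean()**3)           # compare E[Z] with (E[X])^3
```

The printed pairs make the comparison in part (a) concrete.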
2. Let $X$ be a Gaussian random variable with mean $\mu$ and variance $\sigma^2$. Determine $E[(2X + 3)^2]$ in terms of $\mu$ and $\sigma^2$.
SS-4. (Kay, Prob. 11.25) Determine the mean and variance for the indicator random variable $I_A(X)$ as a function of $P[A]$, where
$$ I_A(x) = \begin{cases} 1, & x \in A \\ 0, & x \notin A. \end{cases} $$
SS-5. (Modified from Kay, Prob. 11.26) Let the input to a rectifier circuit be a Gaussian random variable with mean zero and variance $\sigma^2$. Determine the expected output power (if the output is $Y$, determine the expected output power as $E[Y^2]$) for the following two scenarios.
(a) The rectifier is a half-wave rectifier that passes positive voltages undisturbed but outputs
zero when the input voltage is negative.
(b) The rectifier is a full-wave rectifier that outputs the absolute value of the input voltage.
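A hedged Monte Carlo sanity check for both scenarios (not the analytical derivation the problem asks for), assuming NumPy; `sigma = 1.5` is an arbitrary illustrative value:

```python
import numpy as np

rng = np.random.default_rng(1)
sigma = 1.5                               # illustrative input standard deviation
x = rng.normal(0.0, sigma, size=500_000)  # zero-mean Gaussian input voltage

half = np.maximum(x, 0.0)  # (a) half-wave rectifier: negative inputs -> 0
full = np.abs(x)           # (b) full-wave rectifier: absolute value

print((half**2).mean())    # empirical E[Y^2], half-wave case
print((full**2).mean())    # empirical E[Y^2], full-wave case
```

Comparing the two printed values against each other (and against $\sigma^2 = 2.25$) is a useful check on the analytical answers.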
3. (Kay, Prob. 11.32) Provide a counterexample to disprove that $\mathrm{Var}[g_1(X) + g_2(X)] = \mathrm{Var}[g_1(X)] + \mathrm{Var}[g_2(X)]$.
SS-6. Consider a traffic light that uses a sensor to detect the presence of cars at the light. Once
the light turns green in one particular direction, it stays green until all of the cars that are
queued in that direction have passed through the light. Given that there are n cars at the
light, the time in seconds that the light remains green is an exponential random variable with
parameter 1/(10n). The number of cars at the light is Poisson with mean 20.
Let T be the time the light stays green. Find the mean and variance of T .
Ans: $E[T] = 200$, $\mathrm{Var}(T) = 4.4 \times 10^4$
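The stated answers can be sanity-checked by simulation. A sketch assuming NumPy, treating the parameter $1/(10n)$ as the rate (so the mean green time is $10n$ seconds) and simply taking $T = 0$ in the negligible case $N = 0$:

```python
import numpy as np

rng = np.random.default_rng(2)
n = rng.poisson(lam=20, size=200_000)  # number of cars at the light
t = rng.exponential(scale=10.0 * n)    # green time: exponential with mean 10n
                                       # (scale 0 when n = 0 gives t = 0)
print(t.mean())   # compare with E[T] = 200
print(t.var())    # compare with Var(T) = 4.4e4
```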

SS-7. Problem 4.14 in Stark and Woods (see problems at end of PDF)
SS-8. Show that for optimal linear prediction of $X$ given $Y$, the mean-square error is $\sigma_X^2(1 - \rho^2)$.
4. Let $X$ and $Y$ be random variables with joint density
$$ f_{XY}(x,y) = \begin{cases} 15x^2 y, & 0 \le x \le y \le 1 \\ 0, & \text{otherwise.} \end{cases} $$
(a) Find the best MMSE estimator for X given Y .
(b) Find the best linear MMSE estimator for X given Y .
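The estimators from parts (a) and (b) can be checked numerically. A rejection-sampling sketch, assuming NumPy; the slice width 0.01 and the point `y0 = 0.8` are arbitrary choices:

```python
import numpy as np

rng = np.random.default_rng(3)
N = 400_000
# Rejection sampling from f(x, y) = 15 x^2 y on 0 <= x <= y <= 1:
# propose uniformly on the unit square and accept with probability f/15.
x = rng.uniform(0, 1, N)
y = rng.uniform(0, 1, N)
u = rng.uniform(0, 15, N)
keep = (x <= y) & (u <= 15 * x**2 * y)
x, y = x[keep], y[keep]

# The empirical mean of X over a thin slice of Y approximates E[X | Y = y0],
# which a candidate MMSE estimator from part (a) should reproduce.
y0 = 0.8
print(x[np.abs(y - y0) < 0.01].mean())
```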
SS-9. (Leon-Garcia, Problem 5.111) Let $X$ and $Y$ be jointly Gaussian random variables with pdf
$$ f_{XY}(x,y) = \frac{4}{\pi\sqrt{7}} \exp\left\{ -\tfrac{8}{7}\left( x^2 + 4y^2 - 3xy + 3y - 2x + 1 \right) \right\} \quad \text{for all } x, y. $$
Find $E[X]$, $E[Y]$, $\mathrm{Var}[X]$, $\mathrm{Var}[Y]$, $\mathrm{Cov}(X,Y)$.
Answers: $E[X] = 1$, $E[Y] = 0$, $\mathrm{Var}[X] = 1$, $\mathrm{Var}[Y] = \tfrac{1}{4}$, $\mathrm{Cov}(X,Y) = \tfrac{3}{8}$.
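A quick consistency check on the stated answers, assuming NumPy: inverting the covariance matrix they imply yields the precision matrix, whose entries can be compared against the quadratic-form coefficients in the pdf's exponent (up to the overall $-\tfrac{1}{2}$ factor):

```python
import numpy as np

# Covariance matrix implied by the stated answers:
# Var[X] = 1, Var[Y] = 1/4, Cov(X, Y) = 3/8.
C = np.array([[1.0, 3/8],
              [3/8, 1/4]])
Lam = np.linalg.inv(C)  # precision matrix: exponent is -(1/2)(v - m)^T Lam (v - m)
print(Lam)
```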
5. (Leon-Garcia, Problem 5.112) Let $X$ and $Y$ be jointly Gaussian random variables with $E[Y] = 0$, $\sigma_X = 1$, $\sigma_Y = 2$, and $E[X|Y] = Y/4 + 1$. Find the joint density function for $X$ and $Y$.

## Stark and Woods Problems

4.11. A particular color TV model is manufactured in three different plants, say, A, B, and C of the same company. Because the workers at A, B, and C are not equally experienced, the quality of the sets differs from plant to plant. The pdf's of the time-to-failure, $X$, in years are
$$ f_X(x) = \tfrac{1}{5}\, e^{-x/5}\, u(x) \quad \text{for A}, $$
$$ f_X(x) = \tfrac{1}{6.5}\, e^{-x/6.5}\, u(x) \quad \text{for B}, $$
$$ f_X(x) = \tfrac{1}{10}\, e^{-x/10}\, u(x) \quad \text{for C}, $$
where $u(x)$ is the unit step. Plant A produces three times as many sets as B, which produces twice as many as C. The sets are all sent to a central warehouse, intermingled, and shipped to retail stores all around the country. What is the expected lifetime of a set purchased at random?
4.12. A source transmits a signal $\Theta$ with pdf
$$ f_\Theta(\theta) = \begin{cases} (2\pi)^{-1}, & 0 \le \theta \le 2\pi \\ 0, & \text{otherwise.} \end{cases} $$
Because of additive Gaussian noise, the pdf of the received signal $Y$ when $\Theta = \theta$ is
$$ f_{Y|\Theta}(y|\theta) = \frac{1}{\sqrt{2\pi}\,\sigma} \exp\left[ -\frac{1}{2}\left( \frac{y - \theta}{\sigma} \right)^2 \right]. $$
Compute $E[Y]$.
4.13. Compute the variance of $X$ if $X$ is (a) Bernoulli; (b) binomial; (c) Poisson; (d) Gaussian; (e) Rayleigh.
4.14. Let $X$ and $Y$ be independent r.v.'s, each $N(0,1)$. Find the mean and variance of $Z = \sqrt{X^2 + Y^2}$.
4.15. (Papoulis [4-3]). Let $Y = h(X)$. We wish to compute approximations to $E[h(X)]$ and $E[h^2(X)]$. Assume that $h(x)$ admits a power series expansion, that is, all derivatives exist. Assume further that all derivatives above the second are small enough to be omitted. Show that (with $\bar{X} = \mu$, $\mathrm{Var}(X) = \sigma^2$):
(a) $E[h(X)] \approx h(\mu) + h''(\mu)\,\sigma^2/2$;
(b) $E[h^2(X)] \approx h^2(\mu) + \left( [h'(\mu)]^2 + h(\mu)\,h''(\mu) \right)\sigma^2$.
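A numeric spot check of approximation (a), assuming NumPy and the illustrative choice $h(x) = e^x$ (so $h'' = h$):

```python
import numpy as np

rng = np.random.default_rng(4)
mu, sigma = 0.5, 0.1
x = rng.normal(mu, sigma, size=1_000_000)

exact  = np.exp(x).mean()                        # Monte Carlo E[h(X)] for h(x) = e^x
approx = np.exp(mu) + np.exp(mu) * sigma**2 / 2  # h(mu) + h''(mu) sigma^2 / 2
print(exact, approx)
```

For this small $\sigma$ the two printed values agree to within Monte Carlo noise, as the approximation predicts.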
4.16. Let $f_{XY}(x,y)$ be given by
$$ f_{XY}(x,y) = \frac{1}{2\pi\sigma^2\sqrt{1-\rho^2}} \exp\left[ -\frac{x^2 + y^2 - 2\rho xy}{2\sigma^2(1-\rho^2)} \right], $$
where $|\rho| < 1$. Show that $E[Y] = 0$ but $E[Y|X = x] = \rho x$. What does this result say about predicting the value of $Y$ upon observing the value of $X$?

4.17. Show that in the joint Gaussian pdf with $\bar{X} = \bar{Y} = 0$ and $\sigma_X = \sigma_Y = \sigma$, when $\rho \to 1$,
$$ f_{XY}(x,y) \to \frac{1}{\sqrt{2\pi}\,\sigma} \exp\left[ -\frac{1}{2}\left( \frac{x}{\sigma} \right)^2 \right] \delta(y - x). $$
4.18. Consider a probability space $(\Omega, \mathcal{F}, P)$. Let $\Omega = \{\zeta_1, \ldots, \zeta_5\} = \{-1, -\tfrac{1}{2}, 0, \tfrac{1}{2}, 1\}$ with $P[\{\zeta_i\}] = \tfrac{1}{5}$, $i = 1, \ldots, 5$. Define two r.v.'s on $\Omega$ as follows:
$$ X(\zeta) = \zeta \quad \text{and} \quad Y(\zeta) = \zeta^2. $$
(a) Show that $X$ and $Y$ are dependent r.v.'s.
(b) Show that $X$ and $Y$ are uncorrelated.
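The five-point sample space is small enough to check part (b) by direct enumeration (plain Python, no libraries):

```python
# Five equally likely outcomes; X is the outcome, Y its square.
vals = [-1.0, -0.5, 0.0, 0.5, 1.0]

EX  = sum(v for v in vals) / 5
EY  = sum(v**2 for v in vals) / 5
EXY = sum(v * v**2 for v in vals) / 5

print(EX, EY, EXY)
print(EXY - EX * EY)   # covariance: 0.0, yet Y is a deterministic function of X
```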
4.19. We wish to estimate the pdf of $X$ with a function $p(x)$ that maximizes the entropy
$$ H = -\int_{-\infty}^{\infty} p(x) \ln p(x)\, dx. $$
It is known from measurements that $E[X] = \mu$ and $\mathrm{Var}[X] = \sigma^2$. Find the maximum entropy estimate of the pdf of $X$.
4.20. Let $X \colon N(0, \sigma^2)$. Show that
$$ m_n = E[X^n] = \begin{cases} 1 \cdot 3 \cdots (n-1)\,\sigma^n, & n \text{ even} \\ 0, & n \text{ odd.} \end{cases} $$

4.21. With $\bar{X} = E[X]$ and $\bar{Y} = E[Y]$, show that if $m_{11}^2 = m_{20}\, m_{02}$, then
$$ E\left[ \left( \frac{m_{11}}{m_{20}} (X - \bar{X}) - (Y - \bar{Y}) \right)^{\!2} \right] = 0. $$
Use this result to show that when $|\rho| = 1$, then $Y$ is a linear function of $X$, that is, $Y = \alpha X + \beta$. Relate $\alpha$, $\beta$ to the moments of $X$ and $Y$.
4.22. Show that in the optimum linear predictor in Example 4.3-4 the smallest mean-square error is
$$ \epsilon_{\min}^2 = \sigma_Y^2 (1 - \rho^2). $$
Explain why $\epsilon_{\min}^2 = 0$ when $|\rho| = 1$.
4.23. Let $E[X_i] = \mu$, $\mathrm{Var}[X_i] = \sigma^2$. We wish to estimate $\mu$ with the sample mean
$$ \hat{\mu}_N = \frac{1}{N} \sum_{i=1}^{N} X_i. $$
Compute the mean and variance of $\hat{\mu}_N$ assuming the $X_i$ for $i = 1, \ldots, N$ are independent.
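The sample-mean result can be spot-checked by simulation. A sketch assuming NumPy, with arbitrary illustrative values `mu = 2`, `sigma = 3`, `N = 25`:

```python
import numpy as np

rng = np.random.default_rng(5)
mu, sigma, N = 2.0, 3.0, 25

# 100,000 independent realizations of the sample mean of N i.i.d. draws.
means = rng.normal(mu, sigma, size=(100_000, N)).mean(axis=1)

print(means.mean())  # compare with mu
print(means.var())   # compare with sigma**2 / N = 0.36
```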