
ECE 534 RANDOM PROCESSES FALL 2011

PROBLEM SET 7 Due Tuesday, December 6


7. Random Processes in Linear Systems and Spectral Analysis
Assigned Reading: Sections 8.1-8.4 and 9.1 of the notes.
Reminder: The final exam will be on Friday, December 16, 7-10 p.m., in Room 103 Talbot.
Problems to be handed in:
1 Some filtering
Suppose X is a continuously m.s. differentiable WSS random process with total power 10. Let Y
be the solution to $Y' = -Y + X' - X$.
(a) Find the total power of Y.
(b) For the given information, what is the maximum possible value of $E[(X_t - Y_t)^2]$?
(c) For the given information, what is the infimum of possible values of $E[(X_t - Y_t)^2]$?
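For reference, two standard facts about WSS processes in LTI systems that part (a) turns on, stated for a general WSS input X to a system with transfer function H (identifying H for the differential equation above is part of the problem):
\[
S_Y(\omega) = |H(\omega)|^2\, S_X(\omega),
\qquad
\text{total power of } Y = R_Y(0) = \int_{-\infty}^{\infty} S_Y(\omega)\,\frac{d\omega}{2\pi}.
\]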
2 Synthesizing a random process with specified spectral density
This problem deals with Monte Carlo simulation of a Gaussian stationary random process with a
specified power spectral density function. Give a representation of a random process X with the
power spectral density function $S_X$ shown in the figure below, using independent N(0, 1) random
variables and linear operations such as linear filtering and addition, as in the Nyquist sampling
theorem representation of baseband processes. You don't need to address the fact that in practice
a truncation to a finite sum would be used to approximately simulate the process over a finite time
interval, but do try to minimize the number of N(0, 1) variables you use per unit time of simulation.
Identify explicitly any functions you use, and also identify how many N(0, 1) random variables you
use per unit of time simulated.

[Figure: plot of the power spectral density $S_X(2\pi f)$ versus $f$.]
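For concreteness, here is a minimal Python sketch of the kind of construction the problem has in mind, written for a generic flat baseband density of height P0 on |f| <= B rather than for the specific $S_X$ in the figure; B, P0, and the simulated duration are placeholder values. The point is the mechanism: i.i.d. N(0, 1) variables consumed at the Nyquist rate, scaled and sinc-interpolated.

    # Sketch (not the requested answer): synthesize a stationary Gaussian process whose
    # PSD is flat with height P0 on |f| <= B and zero elsewhere, via the Nyquist sampling
    # representation X_t = sum_k X(kT) sinc((t - kT)/T) with T = 1/(2B). For this flat
    # PSD the samples X(kT) are uncorrelated with variance 2*B*P0 (the total power).
    import numpy as np

    rng = np.random.default_rng(0)
    B, P0 = 10.0, 1.0                  # placeholder bandwidth (Hz) and PSD height
    T = 1.0 / (2.0 * B)                # Nyquist spacing: 2B N(0,1) variables per unit time
    duration = 5.0
    K = int(duration / T)              # number of N(0,1) variables consumed

    samples = np.sqrt(2.0 * B * P0) * rng.standard_normal(K)   # X(kT), k = 0, ..., K-1
    kT = np.arange(K) * T

    t = np.linspace(0.0, duration, 2001)
    X = np.array([np.dot(samples, np.sinc((ti - kT) / T)) for ti in t])

    print("empirical power:", X.var(), "  target 2*B*P0 =", 2 * B * P0)

In this flat-spectrum case the construction uses 2B independent N(0, 1) variables per unit of simulated time; for the $S_X$ in the figure, the analogous bookkeeping is exactly what the problem asks you to report.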
3 Some linear transformations of some random processes
Let $U = (U_n : n \in \mathbb{Z})$ be a random process such that the variables $U_n$ are independent, identically
distributed, with $E[U_n] = \mu$ and $\mathrm{Var}(U_n) = \sigma^2$, where $\mu \neq 0$ and $\sigma^2 > 0$. Please keep in mind that
$\mu \neq 0$. Let $X = (X_n : n \in \mathbb{Z})$ be defined by $X_n = \sum_{k=0}^{\infty} U_{n-k}\, a^k$, for a constant $a$ with $0 < a < 1$.
(a) Is X stationary? Find the mean function $\mu_X$ and autocovariance function $C_X$ for X.
(b) Is X a Markov process? (Hint: X is not necessarily Gaussian. Does X have a state representation driven by U?)
(c) Is X mean ergodic in the m.s. sense?
Let U be as before, and let $Y = (Y_n : n \in \mathbb{Z})$ be defined by $Y_n = \sum_{k=0}^{\infty} U_{n-k}\, A^k$, where A is a
random variable distributed on the interval (0, 0.5) (the exact distribution is not specified), and A
is independent of the random process U.
(d) Is Y stationary? Find the mean function $\mu_Y$ and autocovariance function $C_Y$ for Y. (Your
answer may include expectations involving A.)
(e) Is Y a Markov process? (Give a brief explanation.)
(f) Is Y mean ergodic in the m.s. sense?
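To make the definitions above concrete, here is a minimal simulation sketch of X and Y with the infinite sums truncated (the neglected terms decay geometrically). The values of $\mu$, $\sigma$, a, the distribution of the $U_n$, and the distribution of A on (0, 0.5) are illustrative choices, not specified by the problem.

    # Sketch: simulate X_n = sum_{k>=0} a^k U_{n-k} and Y_n = sum_{k>=0} A^k U_{n-k},
    # truncating both sums at k < K_TRUNC. All parameter values are illustrative.
    import numpy as np

    rng = np.random.default_rng(1)
    mu, sigma, a = 1.0, 1.0, 0.4       # E[U_n] = mu != 0, Var(U_n) = sigma^2 > 0, 0 < a < 1
    A = rng.uniform(0.0, 0.5)          # one draw of A, shared by every Y_n
    N, K_TRUNC = 2000, 60              # number of time samples and truncation depth

    # i.i.d. driving sequence; Gaussian is just one choice with the required mean and variance
    U = mu + sigma * rng.standard_normal(N + K_TRUNC)

    k = np.arange(K_TRUNC)
    X = np.array([np.dot(a ** k, U[n - k]) for n in range(K_TRUNC, K_TRUNC + N)])
    Y = np.array([np.dot(A ** k, U[n - k]) for n in range(K_TRUNC, K_TRUNC + N)])

    print("time average of X:", X.mean(), "  time average of Y:", Y.mean())

Comparing these time averages across independent runs of the sketch is one way to build intuition for the mean-ergodicity questions in parts (c) and (f).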
4 A standard noncausal estimation problem
(a) Derive the Fourier transform of the function $g(t) = \exp(-|t|)$.
(b) Find $\int_{-\infty}^{\infty} \frac{1}{a + b\omega^2}\,\frac{d\omega}{2\pi}$ for $a, b > 0$. (Hint: Use the result of part (a) and the fact, which follows
from the inverse Fourier transform, that $\int_{-\infty}^{\infty} \hat{g}(\omega)\,\frac{d\omega}{2\pi} = g(0) = 1$.)
(c) Suppose Y = X + N, where X and N are each WSS random processes with mean zero, and
X and N are uncorrelated with each other. The observation process is Y = X + N. Suppose
$R_X(\tau) = \exp(-|\tau|)$ and $R_N(\tau) = \sigma^2 \delta(\tau)$, so that N is a white noise process with two-sided power
spectral density $\sigma^2$. Identify the transfer function H and impulse response function h of the filter
for producing $\widehat{X}_t = \widehat{E}[X_t \mid Y]$, the MMSE estimator of $X_t$ given $Y = (Y_s : s \in \mathbb{R})$.
(d) Find the resulting MMSE for the estimator you found in part (c). Check that the limits of your
answer as $\sigma^2 \to 0$ or $\sigma^2 \to \infty$ make sense.
(e) Let $D_t = X_t - \widehat{X}_t$. Find the cross covariance function $C_{D,Y}$.
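As a sanity check on the Fourier-inversion fact quoted in the hint of part (b), the sketch below evaluates $\hat{g}$ numerically (rather than using the closed form asked for in part (a)) and verifies that its integral against $d\omega/(2\pi)$ is close to $g(0) = 1$. Truncating the $t$-integral at 40 is an assumption justified by the exponential decay of g.

    # Sketch: numerically verify integral of g_hat(w) dw/(2*pi) = g(0) = 1 for g(t) = exp(-|t|).
    import numpy as np
    from scipy.integrate import quad

    def g_hat(w):
        # g is real and even, so g_hat(w) = 2 * integral_0^infinity exp(-t) cos(w*t) dt.
        # weight='cos' lets QUADPACK handle the oscillation at large |w|; truncating the
        # upper limit at 40 is harmless because exp(-t) is then below 5e-18.
        val, _ = quad(lambda t: 2.0 * np.exp(-t), 0.0, 40.0, weight="cos", wvar=abs(w))
        return val

    lhs, _ = quad(lambda w: g_hat(w) / (2.0 * np.pi), -np.inf, np.inf)
    print("integral of g_hat(w) dw/(2*pi) =", lhs, "  (should be close to g(0) = 1)")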
5 Linear and nonlinear filtering
Let $Z = (Z_t : t \in \mathbb{R})$ be a stationary Markov process with state space $S = \{-3, -1, 1, 3\}$ and
generator matrix $Q = (q_{i,j})$ with $q_{i,j} = \lambda$ if $i \neq j$ and $q_{i,i} = -3\lambda$, for $i, j \in S$. Let $Y = (Y_t : t \in \mathbb{R})$
be a random process defined by $Y_t = Z_t + N_t$, where N is a white Gaussian noise process with
$R_N(\tau) = \sigma^2 \delta(\tau)$, for some $\sigma^2 > 0$.
(a) Find the stationary distribution $\pi$, the transition probabilities $p_{i,j}(\tau)$, the mean $\mu_Z$, and
autocorrelation function $R_Z$ for Z.
(b) Find the transfer function H, so that if $\widehat{Z}$ is the output of the linear system with transfer
function H, then $\widehat{Z}_t = \widehat{E}[Z_t \mid Y]$. Express the mean square error, $E[(Z_t - \widehat{Z}_t)^2]$, in terms of $\lambda$ and $\sigma^2$.
(c) For t fixed, find a nonlinear function $\widehat{Z}^{(NL)}_t$ of Y such that $E[(Z_t - \widehat{Z}^{(NL)}_t)^2]$ is strictly smaller
than the MSE found in part (b). (You don't need to compute the MSE of your estimator.)
(d) Derive an estimation procedure using the fact that (Z, Y) is a continuous-time version of the
hidden Markov model. Specifically, let $\delta > 0$ be small and let $t_0 = K\delta$ for some large integer K.
Let $\widetilde{Y}_k = \int_{(k-1)\delta}^{k\delta} Y_t\, dt$ and $\widetilde{Z}_k = Z_{k\delta}$. Then $(\widetilde{Z}_k, \widetilde{Y}_k : 1 \leq k \leq K)$ is approximately a hidden Markov
model with observation space $\mathbb{R}$ instead of a finite observation space. Identify the (approximate)
parameter $(\pi, A, B)$ of this Markov model (note that $b_{i,y}$ for i fixed should be a pdf as a function
of y). (Using this model, the forward-backward algorithm could be used to approximately compute
the conditional pmf of Z at a fixed time given Y, which becomes asymptotically exact as $\delta \to 0$.
An alternative to this approach is to simply start with a discrete-time model. Another alternative
is to derive a continuous-time version of the forward-backward algorithm.)
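As a complement to part (d), here is a minimal simulation sketch of the discretized pair $(\widetilde{Z}_k, \widetilde{Y}_k)$; it only generates data for the approximate hidden Markov model, while identifying $(\pi, A, B)$ and running the forward-backward algorithm are left as the problem asks. The values of $\lambda$, $\sigma^2$, $\delta$, and $t_0$ are illustrative, and the approximation $\int_{(k-1)\delta}^{k\delta} Z_t\,dt \approx Z_{k\delta}\,\delta$ is used for small $\delta$.

    # Sketch: simulate Z (4 states, q_{ij} = lam for i != j, q_{ii} = -3*lam) and form
    #   Y_tilde_k = integral_{(k-1)*delta}^{k*delta} (Z_t + N_t) dt
    #            ~= Z_{k*delta}*delta + Normal(0, sigma2*delta),
    # since integrating white noise with R_N(tau) = sigma2*delta(tau) over an interval of
    # length delta yields a N(0, sigma2*delta) variable. Parameter values are illustrative.
    import numpy as np

    rng = np.random.default_rng(2)
    states = np.array([-3, -1, 1, 3])
    lam, sigma2 = 1.0, 0.5
    delta, t0 = 0.01, 20.0
    K = int(t0 / delta)

    # Jump-chain simulation of Z: exponential(3*lam) holding times, then a jump to one of
    # the other three states chosen uniformly (all off-diagonal rates are equal).
    jump_times, jump_states = [0.0], [0]          # start in an arbitrary state
    while jump_times[-1] < t0:
        jump_times.append(jump_times[-1] + rng.exponential(1.0 / (3.0 * lam)))
        others = [s for s in range(4) if s != jump_states[-1]]
        jump_states.append(others[rng.integers(3)])

    grid = np.arange(1, K + 1) * delta            # sampling times k*delta
    idx = np.searchsorted(jump_times, grid, side="right") - 1
    Z_tilde = states[np.array(jump_states)[idx]]
    Y_tilde = Z_tilde * delta + np.sqrt(sigma2 * delta) * rng.standard_normal(K)

    print("empirical state frequencies:", [float(np.mean(Z_tilde == s)) for s in states])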