CHAPTER SIX

RANDOM PROCESSES

6.0 INTRODUCTION

In the real world of engineering and science, it is necessary that we be able to deal with time waveforms. Indeed, we frequently encounter random time waveforms in practical systems. More often than not, a desired signal in some system is random. For example, the bit stream in a binary communication system is a random message because each bit in the stream occurs randomly. On the other hand, a desired signal is often accompanied by an undesired random waveform, noise. The noise interferes with the message and ultimately limits the performance of the system. Thus, any hope we have of determining the performance of systems with random waveforms hinges on our ability to describe and deal with such waveforms. In this chapter we introduce concepts that allow the description of random waveforms in a probabilistic sense.

6.1 THE RANDOM PROCESS CONCEPT

The concept of a random process is based on enlarging the random variable concept to include time. Since a random variable X is, by its definition, a function of the possible outcomes s of an experiment, it now becomes a function of both s and time. In other words, we assign, according to some rule, a time function

    x(t, s)    (6.1-1)

to every outcome s. The family of all such functions, denoted X(t, s), is called a random process. As with random variables, where x was denoted as a specific value of the random variable X, we shall often use the convenient short-form notation x(t) to represent a specific waveform of a random process denoted by X(t).

[Figure 6.1-1 A continuous random process. Reproduced from Peebles (1976) with permission of publishers Addison-Wesley, Advanced Book Program.]

Clearly, a random process X(t, s) represents a family or ensemble of time functions when t and s are variables. Figure 6.1-1 illustrates a few members of an ensemble. Each member time function is called a sample function, ensemble member, or sometimes a realization of the process. Thus, a random process also represents a single time function when t is a variable and s is fixed at a specific value (outcome).

A random process also represents a random variable when t is fixed and s is a variable. For example, the random variable X(t₁, s) = X(t₁) is obtained from the process when time is "frozen" at the value t₁. We often use the notation X₁ to denote the random variable associated with the process X(t) at time t₁. X₁ corresponds to a vertical "slice" through the ensemble at time t₁, as illustrated in Figure 6.1-1. The statistical properties of X₁ = X(t₁) describe the statistical properties of the random process at time t₁. The expected value of X₁ is called the ensemble average, as well as the expected or mean value, of the random process (at time t₁). Since t₁ may have various values, the mean value of a process may not be constant; in general, it may be a function of time. We easily visualize any
number of random variables X_i derived from a random process X(t) at times t_i, i = 1, 2, ...:

    X_i = X(t_i, s) = X(t_i)    (6.1-2)

A random process can also represent a mere number when t and s are both fixed.

Classification of Processes

It is convenient to classify random processes according to the characteristics of t and the random variable X = X(t) at time t. We shall consider only four cases based on t and X having values in the ranges −∞ < t < ∞ and −∞ < x < ∞.†

If X is continuous and t can have any of a continuum of values, then X(t) is called a continuous random process. Figure 6.1-1 is an illustration of this class of process. Thermal noise generated by any realizable network is a practical example of a waveform that is modeled as a sample function of a continuous random process. In this example, the network is the outcome in the underlying random experiment of selecting a network. (The presumption is that many networks are available from which to choose; this may not be the case in the real world, but it should not prevent us from imagining a production line producing any number of similar networks.) Each network establishes a sample function, and all sample functions form the process.‡

A second class of random process, called a discrete random process, corresponds to the random variable X having only discrete values while t is continuous. Figure 6.1-2 illustrates such a process, derived by heavily limiting the sample functions shown in Figure 6.1-1. The sample functions have only two discrete values: the positive level is generated whenever a sample function in Figure 6.1-1 is positive, and the negative level occurs for other times.

† Other cases can be defined based on a definition of random processes on a finite time interval (see for example: Rosenblatt (1974), p. 91; Prabhu (1965), p. 1; Miller (1974), p. 31; Parzen (1962), p. 7; Dubes (1968), p. 320; Ross (1972), p. 56). Other recent texts on random processes are Helstrom (1984), and Gray and Davisson (1986).
‡ Note that finding the mean value of the process at any time t is equivalent to finding the average voltage that would be produced by all the various networks at time t.
[Figure 6.1-2 A discrete random process formed by heavily limiting the waveforms of Figure 6.1-1. Reproduced from Peebles (1976) with permission of publishers Addison-Wesley, Advanced Book Program.]

[Figure 6.1-3 A continuous random sequence formed by sampling the waveforms of Figure 6.1-1. Reproduced from Peebles (1976) with permission of publishers Addison-Wesley, Advanced Book Program.]
A random process for which X is continuous but time has only discrete values is called a continuous random sequence (Thomas, 1969, p. 80). Such a sequence can be formed by periodically sampling the ensemble members of Figure 6.1-1. The result is illustrated in Figure 6.1-3.

A fourth class of random process, called a discrete random sequence, corresponds to both time and the random variable being discrete. Figure 6.1-4 illustrates a discrete random sequence developed by sampling the sample functions of Figure 6.1-2. In this text we are concerned almost entirely with discrete and continuous random processes.

[Figure 6.1-4 A discrete random sequence formed by sampling the waveforms of Figure 6.1-2. Adapted from Peebles (1976) with permission of publishers Addison-Wesley, Advanced Book Program.]
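To make the four classes concrete, the short sketch below generates one sample function of each kind from a common smooth waveform: a continuous process, its hard-limited (discrete) version, and periodically sampled versions of both. It is an illustrative sketch only; the smoothing kernel and sampling period are arbitrary choices, not anything specified in the text.

```python
import numpy as np

rng = np.random.default_rng(0)

# Dense time axis standing in for continuous time.
t = np.linspace(0.0, 10.0, 2001)

# Continuous random process: smoothed gaussian noise (one sample function).
noise = rng.standard_normal(t.size)
kernel = np.exp(-np.linspace(-3.0, 3.0, 101) ** 2)   # arbitrary smoothing kernel
x_cont = np.convolve(noise, kernel / kernel.sum(), mode="same")

# Discrete random process: hard-limit the waveform to two levels (cf. Fig. 6.1-2).
x_disc = np.where(x_cont >= 0.0, 1.0, -1.0)

# Continuous random sequence: periodic samples of the continuous process (Fig. 6.1-3).
idx = np.arange(0, t.size, 100)                      # every 100th point (Ts = 0.5)
seq_cont = x_cont[idx]

# Discrete random sequence: periodic samples of the limited process (Fig. 6.1-4).
seq_disc = x_disc[idx]

print(seq_cont[:5])
print(seq_disc[:5])
```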
Deterministic and Nondeterministic Processes

In addition to the classes described above, a random process can be described by the form of its sample functions. If future values of any sample function cannot be predicted exactly from observed past values, the process is called nondeterministic. The process of Figure 6.1-1 is one example.

A process is called deterministic if future values of any sample function can be predicted from past values. An example is the random process defined by

    X(t) = A cos(ω₀t + Θ)    (6.1-3)

Here A, Θ, or ω₀ (or all) may be random variables. Any one sample function corresponds to (6.1-3) with particular values of these random variables. Therefore, knowledge of the sample function prior to any time instant automatically allows prediction of the sample function's future values because its form is known.

6.2 STATIONARITY AND INDEPENDENCE

As previously stated, a random process becomes a random variable when time is fixed at some particular value. The random variable will possess statistical properties, such as a mean value, moments, variance, etc., that are related to its density function. If two random variables are obtained from the process for two time instants, they will have statistical properties (means, variances, joint moments, etc.) related to their joint density function. More generally, N random variables will possess statistical properties related to their N-dimensional joint density function.

Broadly speaking, a random process is said to be stationary if all its statistical properties do not change with time. Other processes are called nonstationary. These statements are not intended as definitions of stationarity but are meant to convey only a general meaning. More concrete definitions follow. Indeed, there are several "levels" of stationarity, all of which depend on the density functions of the random variables of the process.

Distribution and Density Functions

To define stationarity, we must first define distribution and density functions as they apply to a random process X(t). For a particular time t₁, the distribution function associated with the random variable X₁ = X(t₁) will be denoted F_X(x₁; t₁). It is defined as†

    F_X(x₁; t₁) = P{X(t₁) ≤ x₁}    (6.2-1)

for any real number x₁. This is the same definition used all along for the distribution function of one random variable. Only the notation has been altered to reflect the fact that it is possibly now a function of the time choice t₁.

† F_X(x₁; t₁) is known as the first-order distribution function of the process X(t).
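The quantity just defined is an ensemble statistic: it is taken across sample functions at a fixed time. As a numerical illustration, the sketch below estimates the first-order distribution of a process at one time instant from a finite ensemble by counting; the particular process (a randomly phased cosine) and the grid are arbitrary illustrative choices, not anything specified in the text.

```python
import numpy as np

rng = np.random.default_rng(1)

# Ensemble of sample functions of X(t) = cos(2*pi*t + Theta), Theta ~ U(0, 2*pi).
n_members = 50_000
theta = rng.uniform(0.0, 2.0 * np.pi, n_members)

t1 = 0.3                                   # fixed observation time
x1 = np.cos(2.0 * np.pi * t1 + theta)      # the random variable X1 = X(t1)

# Empirical first-order distribution F_X(x; t1) = P{X(t1) <= x}.
x_grid = np.linspace(-1.2, 1.2, 13)
F_hat = [(x1 <= x).mean() for x in x_grid]

# Empirical first-order density via a histogram (a discrete derivative of F).
f_hat, edges = np.histogram(x1, bins=50, range=(-1.0, 1.0), density=True)

print(np.round(F_hat, 3))
```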

For two random variables X₁ = X(t₁) and X₂ = X(t₂), the second-order joint distribution function is the two-dimensional extension of (6.2-1):

    F_X(x₁, x₂; t₁, t₂) = P{X(t₁) ≤ x₁, X(t₂) ≤ x₂}    (6.2-2)

In a similar manner, for N random variables X_i = X(t_i), i = 1, 2, ..., N, the Nth-order joint distribution function is

    F_X(x₁, ..., x_N; t₁, ..., t_N) = P{X(t₁) ≤ x₁, ..., X(t_N) ≤ x_N}    (6.2-3)

Joint density functions of interest are found from appropriate derivatives of the above three relationships:†

    f_X(x₁; t₁) = dF_X(x₁; t₁)/dx₁    (6.2-4)
    f_X(x₁, x₂; t₁, t₂) = ∂²F_X(x₁, x₂; t₁, t₂)/(∂x₁ ∂x₂)    (6.2-5)
    f_X(x₁, ..., x_N; t₁, ..., t_N) = ∂^N F_X(x₁, ..., x_N; t₁, ..., t_N)/(∂x₁ ··· ∂x_N)    (6.2-6)

† Analogous to distribution functions, these are called first-, second-, and Nth-order density functions, respectively.

Statistical Independence

Two processes X(t) and Y(t) are statistically independent if the random variable group X(t₁), X(t₂), ..., X(t_N) is independent of the group Y(t′₁), Y(t′₂), ..., Y(t′_M) for any choice of times t₁, t₂, ..., t_N, t′₁, t′₂, ..., t′_M. Independence requires that the joint density be factorable by groups:

    f_{X,Y}(x₁, ..., x_N, y₁, ..., y_M; t₁, ..., t_N, t′₁, ..., t′_M)
        = f_X(x₁, ..., x_N; t₁, ..., t_N) f_Y(y₁, ..., y_M; t′₁, ..., t′_M)    (6.2-7)

First-Order Stationary Processes

A random process is called stationary to order one if its first-order density function does not change with a shift in time origin. In other words,

    f_X(x₁; t₁) = f_X(x₁; t₁ + Δ)    (6.2-8)

must be true for any t₁ and any real number Δ if X(t) is to be a first-order stationary process.

Consequences of (6.2-8) are that f_X(x₁; t₁) is independent of t₁ and the process mean value E[X(t)] is a constant:

    E[X(t)] = X̄ = constant    (6.2-9)

To prove (6.2-9), we find mean values of the random variables X₁ = X(t₁) and X₂ = X(t₂). For X₁:

    E[X₁] = E[X(t₁)] = ∫_{−∞}^{∞} x₁ f_X(x₁; t₁) dx₁    (6.2-10)

For X₂:‡

    E[X₂] = E[X(t₂)] = ∫_{−∞}^{∞} x₁ f_X(x₁; t₂) dx₁    (6.2-11)

Now by letting t₂ = t₁ + Δ in (6.2-11), substituting (6.2-8), and using (6.2-10), we get

    E[X(t₁ + Δ)] = E[X(t₁)]    (6.2-12)

which must be a constant because t₁ and Δ are arbitrary.

‡ Note that the variable x₂ of integration has been replaced by the alternative variable x₁ for convenience.

Second-Order and Wide-Sense Stationarity

A process is called stationary to order two if its second-order density function satisfies

    f_X(x₁, x₂; t₁, t₂) = f_X(x₁, x₂; t₁ + Δ, t₂ + Δ)    (6.2-13)

for all t₁, t₂, and Δ. After some thought, the reader will conclude that (6.2-13) is a function of the time difference t₂ − t₁ and not absolute time (let the arbitrary Δ = −t₁). A second-order stationary process is also first-order stationary, because the second-order density function determines the lower, first-order, density.

Now the correlation E[X₁X₂] = E[X(t₁)X(t₂)] of a random process will, in general, be a function of t₁ and t₂. Let us denote this function by R_XX(t₁, t₂) and call it the autocorrelation function of the random process X(t):

    R_XX(t₁, t₂) = E[X(t₁)X(t₂)]    (6.2-14)

A consequence of (6.2-13), however, is that the autocorrelation function of a second-order stationary process is a function only of time differences and not absolute time; that is, if

    t₂ = t₁ + τ    (6.2-15)

then (6.2-14) becomes

    R_XX(t₁, t₁ + τ) = E[X(t₁)X(t₁ + τ)] = R_XX(τ)    (6.2-16)

Proof of (6.2-16) uses (6.2-13); it is left as a reader exercise (see Problem 6-6).

Many practical problems require that we deal with the autocorrelation function and mean value of a random process. Problem solutions are greatly simplified if these quantities are not dependent on absolute time.
Of course, second-order stationarity is sufficient to guarantee these characteristics. However, it is often more restrictive than necessary, and a more relaxed form of stationarity is desirable. The most useful form is the wide-sense stationary process, defined as that for which two conditions are true:

    E[X(t)] = X̄ = constant    (6.2-17a)
    E[X(t)X(t + τ)] = R_XX(τ)    (6.2-17b)

A process stationary to order 2 is clearly wide-sense stationary. However, the converse is not necessarily true.

Example 6.2-1 We show that the random process

    X(t) = A cos(ω₀t + Θ)

is wide-sense stationary if it is assumed that A and ω₀ are constants and Θ is a uniformly distributed random variable on the interval (0, 2π). The mean value is

    E[X(t)] = ∫₀^{2π} A cos(ω₀t + θ) (1/2π) dθ = 0

The autocorrelation function, from (6.2-14) with t₁ = t and t₂ = t + τ, becomes

    R_XX(t, t + τ) = E[A cos(ω₀t + Θ) A cos(ω₀t + ω₀τ + Θ)]
        = (A²/2) E[cos(ω₀τ) + cos(2ω₀t + ω₀τ + 2Θ)]
        = (A²/2) cos(ω₀τ) + (A²/2) E[cos(2ω₀t + ω₀τ + 2Θ)]

The second term easily evaluates to 0. Thus, the autocorrelation function depends only on τ and the mean value is a constant, so X(t) is wide-sense stationary.
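As a quick numerical check of this example, the minimal sketch below draws many values of Θ, forms the ensemble, and estimates E[X(t)] and R_XX(t, t + τ) at several absolute times t; the estimates should be near 0 and (A²/2)cos(ω₀τ) regardless of t. The specific constants are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(2)

A, w0 = 2.0, 2.0 * np.pi          # arbitrary amplitude and angular frequency
theta = rng.uniform(0.0, 2.0 * np.pi, 200_000)

def ensemble_stats(t, tau):
    """Estimate E[X(t)] and R_XX(t, t+tau) across the ensemble."""
    x_t = A * np.cos(w0 * t + theta)
    x_t_tau = A * np.cos(w0 * (t + tau) + theta)
    return x_t.mean(), (x_t * x_t_tau).mean()

tau = 0.1
for t in (0.0, 0.37, 1.9):        # several absolute times
    mean, corr = ensemble_stats(t, tau)
    print(f"t={t:4.2f}  mean~{mean:+.3f}  R~{corr:.3f}")

print("theory:", 0.0, (A**2 / 2) * np.cos(w0 * tau))
```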
When we are concerned with two random processes X(t) and Y(t), we say they are jointly wide-sense stationary if each satisfies (6.2-17) and their cross-correlation function, defined in general by

    R_XY(t, t + τ) = E[X(t)Y(t + τ)]    (6.2-18)

is a function only of the time difference τ = t₂ − t₁ and not absolute time; that is, if

    R_XY(t, t + τ) = E[X(t)Y(t + τ)] = R_XY(τ)    (6.2-19)

N-Order and Strict-Sense Stationarity

By extending the above reasoning to N random variables X_i = X(t_i), i = 1, 2, ..., N, we say a random process is stationary to order N if its Nth-order density function is invariant to a time origin shift; that is, if

    f_X(x₁, ..., x_N; t₁, ..., t_N) = f_X(x₁, ..., x_N; t₁ + Δ, ..., t_N + Δ)    (6.2-20)

for all t₁, ..., t_N and Δ. Stationarity of order N implies stationarity to all orders k ≤ N. A process stationary to all orders N = 1, 2, ... is called strict-sense stationary.

Time Averages and Ergodicity

The time average of a quantity is defined as

    A[·] = lim_{T→∞} (1/2T) ∫_{−T}^{T} [·] dt    (6.2-21)

Here A is used to denote time average in a manner analogous to E for the statistical average. The time average is taken over all time because, as applied to random processes, sample functions of processes are presumed to exist for all time.

Specific averages of interest are the mean value x̄ = A[x(t)] of a sample function (a lowercase letter is used to imply a sample function), and the time autocorrelation function, denoted ℛ_xx(τ) = A[x(t)x(t + τ)]. These functions are defined by

    x̄ = A[x(t)] = lim_{T→∞} (1/2T) ∫_{−T}^{T} x(t) dt    (6.2-22)

    ℛ_xx(τ) = A[x(t)x(t + τ)] = lim_{T→∞} (1/2T) ∫_{−T}^{T} x(t)x(t + τ) dt    (6.2-23)

For any one sample function of the process X(t), these last two integrals simply produce two numbers (for a fixed value of τ). However, when all sample functions are considered, we see that x̄ and ℛ_xx(τ) are actually random variables. By taking the expected value on both sides of (6.2-22) and (6.2-23), and assuming the expectation can be brought inside the integrals, we obtain†

    E[x̄] = X̄    (6.2-24)
    E[ℛ_xx(τ)] = R_XX(τ)    (6.2-25)

Now suppose by some theorem the random variables x̄ and ℛ_xx(τ) could be made to have zero variances; that is, x̄ and ℛ_xx(τ) actually become constants.

† We assume also that X(t) is a stationary process so that the mean and the autocorrelation function are not time-dependent.
Then we could write

    x̄ = X̄    (6.2-26)
    ℛ_xx(τ) = R_XX(τ)    (6.2-27)

In other words, the time averages x̄ and ℛ_xx(τ) equal the statistical averages X̄ and R_XX(τ), respectively. The ergodic theorem allows the validity of (6.2-26) and (6.2-27). Stated in loose terms, it more generally allows all time averages to equal the corresponding statistical averages. Processes that satisfy the ergodic theorem are called ergodic processes.

Ergodicity is a very restrictive form of stationarity, and it may be difficult to prove that it constitutes a reasonable assumption in any physical situation. Nevertheless, we shall often assume a process is ergodic to simplify problems. In the real world, we are usually forced to work with only one sample function of a process and therefore must, like it or not, derive the mean value, correlation functions, etc., from the time waveform. By assuming ergodicity, we may infer the similar statistical characteristics of the process. The reader may feel that our theory is on shaky ground based on these comments. However, it must be remembered that all our theory only serves to model real-world conditions. Therefore, what difference do our assumptions really make, provided the assumed model does truly reflect real conditions?

Two random processes are called jointly ergodic if they are individually ergodic and also have a time cross-correlation function that equals the statistical cross-correlation function:†

    ℛ_xy(τ) = lim_{T→∞} (1/2T) ∫_{−T}^{T} x(t)y(t + τ) dt = R_XY(τ)    (6.2-28)

† As in ordinary stationarity, there are various orders of ergodic stationarity. For more detail on ergodic processes, the reader is referred to Papoulis (1965), pp. 323-332.
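Under an ergodicity assumption, time averages over one long record stand in for ensemble averages. The sketch below estimates x̄ and ℛ_xx(τ) from a single randomly phased cosine record and compares them with the ensemble results of Example 6.2-1; the record length and constants are arbitrary, and the finite record makes the result only approximate.

```python
import numpy as np

rng = np.random.default_rng(3)

A, w0 = 2.0, 2.0 * np.pi
dt = 0.001
t = np.arange(0.0, 200.0, dt)                        # one long (finite) record
x = A * np.cos(w0 * t + rng.uniform(0.0, 2.0 * np.pi))   # a single sample function

x_bar = x.mean()                                     # time-average mean, cf. (6.2-22)

def time_autocorr(x, lag):
    """Finite-record version of the time autocorrelation, cf. (6.2-23)."""
    n = x.size - lag
    return np.dot(x[:n], x[lag:]) / n

tau = 0.1
lag = int(tau / dt)
print("time mean       ~", round(x_bar, 4))
print("time R(0.1)     ~", round(time_autocorr(x, lag), 4))
print("ensemble R(0.1) =", round((A**2 / 2) * np.cos(w0 * tau), 4))
```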

6.3 CORRELATION FUNCTIONS

The autocorrelation and cross-correlation functions were introduced in the previous section. These functions are examined further in this section, along with their properties. In addition, other correlation-type functions are introduced that are important to the study of random processes.

Autocorrelation Function and Its Properties

Recall that the autocorrelation function of a random process X(t) is the correlation E[X₁X₂] of two random variables X₁ = X(t₁) and X₂ = X(t₂) defined by the process at times t₁ and t₂. Mathematically,

    R_XX(t₁, t₂) = E[X(t₁)X(t₂)]    (6.3-1)

For the time assignments t₁ = t and t₂ = t₁ + τ, with τ a real number, (6.3-1) assumes the convenient form

    R_XX(t, t + τ) = E[X(t)X(t + τ)]    (6.3-2)

If X(t) is at least wide-sense stationary, it was noted in Section 6.2 that R_XX(t, t + τ) must be a function only of the time difference τ = t₂ − t₁. Thus, for wide-sense stationary processes

    R_XX(τ) = E[X(t)X(t + τ)]    (6.3-3)

For such processes the autocorrelation function exhibits the following properties:

    (1) |R_XX(τ)| ≤ R_XX(0)    (6.3-4)
    (2) R_XX(−τ) = R_XX(τ)    (6.3-5)
    (3) R_XX(0) = E[X²(t)]    (6.3-6)

The first property shows that R_XX(τ) is bounded by its value at the origin, while the third property states that this bound is equal to the mean-squared value, called the power in the process. The second property indicates that an autocorrelation function has even symmetry.

Other properties of stationary processes may also be stated [see Cooper and McGillem (1971), p. 113, and Melsa and Sage (1973), pp. 207-208]:

    (4) If E[X(t)] = X̄ ≠ 0 and X(t) has no periodic components, then
            lim_{|τ|→∞} R_XX(τ) = X̄²    (6.3-7)
    (5) If X(t) has a periodic component, then R_XX(τ) will have a periodic component with the same period.    (6.3-8)
    (6) If X(t) is ergodic, zero-mean, and has no periodic component, then
            lim_{|τ|→∞} R_XX(τ) = 0    (6.3-9)
    (7) R_XX(τ) cannot have an arbitrary shape.    (6.3-10)

Properties 4 through 6 are more or less self-explanatory. Property 7 simply says that an arbitrary function cannot be an autocorrelation function. This fact will be more apparent when the power density spectrum is introduced in Chapter 7. It will be shown there that R_XX(τ) is related to the power density spectrum through the Fourier transform, and the form of the spectrum is not arbitrary.

Example 6.3-1 Given that the autocorrelation function for a stationary process is

    R_XX(τ) = 25 + 4/(1 + 6τ²)

we shall find the mean value and variance of the process X(t). From property 4, the mean value is E[X(t)] = X̄ = ±√25 = ±5, since lim_{|τ|→∞} R_XX(τ) = 25 = X̄². The variance is given by (3.2-6), so

    σ_X² = E[X²(t)] − (E[X(t)])²

But E[X²(t)] = R_XX(0) = 25 + 4 = 29 from property 3, so σ_X² = 29 − 25 = 4.
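The sketch below evaluates this R_XX(τ) on a grid and confirms the example numerically: the large-τ limit gives X̄² = 25, the origin value gives E[X²(t)] = 29, and properties 1 and 2 hold on the grid. This is purely an illustrative check with an arbitrary grid.

```python
import numpy as np

def Rxx(tau):
    return 25.0 + 4.0 / (1.0 + 6.0 * tau**2)

tau = np.linspace(-50.0, 50.0, 100_001)
R = Rxx(tau)

print("R(0) = E[X^2]     :", Rxx(0.0))             # 29
print("R(+-50) ~ Xbar^2  :", round(Rxx(50.0), 3))  # -> 25
print("variance estimate :", round(Rxx(0.0) - Rxx(50.0), 3))
print("even symmetry     :", np.allclose(R, R[::-1]))
print("max at the origin :", np.max(np.abs(R)) == Rxx(0.0))
```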
Cross-Correlation Function and Its Properties

The cross-correlation function of two random processes X(t) and Y(t) was defined in (6.2-18). Setting t₁ = t and τ = t₂ − t₁, we may write (6.2-18) as

    R_XY(t, t + τ) = E[X(t)Y(t + τ)]    (6.3-11)

If X(t) and Y(t) are at least jointly wide-sense stationary, R_XY(t, t + τ) is independent of absolute time and we can write

    R_XY(τ) = E[X(t)Y(t + τ)]    (6.3-12)

If

    R_XY(t, t + τ) = 0    (6.3-13)

then X(t) and Y(t) are called orthogonal processes. If the two processes are statistically independent, the cross-correlation function becomes

    R_XY(t, t + τ) = E[X(t)]E[Y(t + τ)]    (6.3-14)

If, in addition to being independent, X(t) and Y(t) are at least wide-sense stationary, (6.3-14) becomes

    R_XY(τ) = X̄Ȳ    (6.3-15)

which is a constant.

We may list some properties of the cross-correlation function applicable to processes that are at least jointly wide-sense stationary:

    (1) R_XY(−τ) = R_YX(τ)    (6.3-16)
    (2) |R_XY(τ)| ≤ √(R_XX(0)R_YY(0))    (6.3-17)
    (3) |R_XY(τ)| ≤ ½[R_XX(0) + R_YY(0)]    (6.3-18)

Property 1 follows from the definition (6.3-12). It describes the symmetry of R_XY(τ). Property 2 can be proven by expanding the inequality

    E[{Y(t + τ) + αX(t)}²] ≥ 0    (6.3-19)

where α is a real number (see Problem 6-27). Properties 2 and 3 both constitute bounds on the magnitude of R_XY(τ). Equation (6.3-17) represents a tighter bound than that of (6.3-18), because the geometric mean of two positive numbers cannot exceed their arithmetic mean; that is,

    √(R_XX(0)R_YY(0)) ≤ ½[R_XX(0) + R_YY(0)]    (6.3-20)

Example 6.3-2 Let two random processes X(t) and Y(t) be defined by

    X(t) = A cos(ω₀t) + B sin(ω₀t)
    Y(t) = B cos(ω₀t) − A sin(ω₀t)

where A and B are random variables and ω₀ is a constant. It can be shown (Problem 6-12) that X(t) is wide-sense stationary if A and B are uncorrelated, zero-mean random variables with the same variance (they may have different density functions, however). With these same constraints on A and B, Y(t) is also wide-sense stationary. We shall now find the cross-correlation function R_XY(t, t + τ) and show that X(t) and Y(t) are jointly wide-sense stationary.

By use of (6.3-11) we have

    R_XY(t, t + τ) = E[X(t)Y(t + τ)]
        = E[AB cos(ω₀t) cos(ω₀t + ω₀τ) + B² sin(ω₀t) cos(ω₀t + ω₀τ)
            − A² cos(ω₀t) sin(ω₀t + ω₀τ) − AB sin(ω₀t) sin(ω₀t + ω₀τ)]
        = E[AB] cos(2ω₀t + ω₀τ) + E[B²] sin(ω₀t) cos(ω₀t + ω₀τ)
            − E[A²] cos(ω₀t) sin(ω₀t + ω₀τ)

Since A and B are assumed to be zero-mean, uncorrelated random variables, E[AB] = 0. Also, since A and B are assumed to have equal variances, E[A²] = E[B²] = σ², and we obtain

    R_XY(t, t + τ) = −σ² sin(ω₀τ)

Thus, X(t) and Y(t) are jointly wide-sense stationary because R_XY(t, t + τ) depends only on τ.

Note from the above result that cross-correlation functions are not necessarily even functions of τ with the maximum at τ = 0, as is the case with autocorrelation functions.
Covariance Functions

The concept of the covariance of two random variables, as defined by (5.1-13), can be extended to random processes. The autocovariance function is defined by

    C_XX(t, t + τ) = E[{X(t) − E[X(t)]}{X(t + τ) − E[X(t + τ)]}]    (6.3-21)
which can also be put in the form

    C_XX(t, t + τ) = R_XX(t, t + τ) − E[X(t)]E[X(t + τ)]    (6.3-22)

The cross-covariance function for two processes X(t) and Y(t) is defined by

    C_XY(t, t + τ) = E[{X(t) − E[X(t)]}{Y(t + τ) − E[Y(t + τ)]}]    (6.3-23)

or, alternatively,

    C_XY(t, t + τ) = R_XY(t, t + τ) − E[X(t)]E[Y(t + τ)]    (6.3-24)

For processes that are at least jointly wide-sense stationary, (6.3-22) and (6.3-24) reduce to

    C_XX(τ) = R_XX(τ) − X̄²    (6.3-25)

and

    C_XY(τ) = R_XY(τ) − X̄Ȳ    (6.3-26)

The variance of a random process is given in general by (6.3-21) with τ = 0. For a wide-sense stationary process, the variance does not depend on time and is given by (6.3-25) with τ = 0:

    σ_X² = E[{X(t) − E[X(t)]}²] = R_XX(0) − X̄²    (6.3-27)

For two random processes, if

    C_XY(t, t + τ) = 0    (6.3-28)

they are called uncorrelated. From (6.3-24) this means that

    R_XY(t, t + τ) = E[X(t)]E[Y(t + τ)]    (6.3-29)

Since this result is the same as (6.3-14), which applies to independent processes, we conclude that independent processes are uncorrelated. The converse is not necessarily true, although it is true for jointly gaussian processes, which we consider in Section 6.5.

6.4 MEASUREMENT OF CORRELATION FUNCTIONS

In the real world, we can never measure the true correlation functions of two random processes X(t) and Y(t), because we never have all sample functions of the ensemble at our disposal. Indeed, we may typically have available for measurements only a portion of one sample function from each process. Thus, our only recourse is to determine time averages based on finite time portions of single sample functions, taken large enough to approximate true results for ergodic processes. Because we are able to work only with time functions, we are forced, like it or not, to presume that given processes are ergodic. This fact should not prove too disconcerting, however, if we remember that assumptions only reflect the details of our mathematical model of a real-world situation. Provided that the model gives consistent agreement with the real situation, it is of little importance whether ergodicity is assumed or not.

[Figure 6.4-1 A time cross-correlation function measurement system. Autocorrelation function measurement is possible by connecting points A and B and applying either x(t) or y(t).]

Figure 6.4-1 illustrates the block diagram of a possible system for measuring the approximate time cross-correlation function of two jointly ergodic random processes X(t) and Y(t). Sample functions x(t) and y(t) are delayed by amounts T and T − τ, respectively, and the product of the delayed waveforms is formed. This product is then integrated to form the output, which equals the integral at time t₁ + 2T, where t₁ is arbitrary and 2T is the integration period. The integrator can be of the integrate-and-dump variety described by Peebles (1976, p. 361).

If we assume x(t) and y(t) exist at least during the interval −T < t and t₁ is an arbitrary time except 0 ≤ t₁, then the output is easily found to be

    R_o(t₁ + 2T) = (1/2T) ∫_{t₁−T}^{t₁+T} x(t)y(t + τ) dt    (6.4-1)

Now if we choose t₁ = 0† and assume T is large, then we have

    R_o(2T) = (1/2T) ∫_{−T}^{T} x(t)y(t + τ) dt ≈ ℛ_xy(τ) = R_XY(τ)    (6.4-2)

Thus, for jointly ergodic processes, the system of Figure 6.4-1 can approximately measure their cross-correlation function (τ is varied to obtain the complete function). Clearly, by connecting points A and B and applying either x(t) or y(t) to the system, we can also measure the autocorrelation functions R_XX(τ) and R_YY(τ).

† Since the processes are assumed jointly ergodic and therefore jointly stationary, the integral (6.4-1) will tend to be independent of t₁ if T is large enough.
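A discrete-time version of this measurement is sketched below: a finite record of two jointly ergodic processes is correlated over a grid of delays, a direct numerical form of (6.4-2). The particular processes used (a common white-noise drive, with one path a pure delay) are arbitrary stand-ins chosen so the expected peak location is known.

```python
import numpy as np

rng = np.random.default_rng(5)

n = 200_000
w = rng.standard_normal(n)                 # common white-noise drive
x = w                                      # x(t)
d = 50                                     # y(t) = x(t) delayed by d samples
y = np.concatenate([np.zeros(d), w[:-d]])

def cross_corr(x, y, lag):
    """Finite-record estimate of R_XY at the given lag, cf. (6.4-2)."""
    if lag >= 0:
        a, b = x[: x.size - lag], y[lag:]
    else:
        a, b = x[-lag:], y[: y.size + lag]
    return np.dot(a, b) / a.size

lags = np.arange(-100, 101)
R = np.array([cross_corr(x, y, k) for k in lags])
print("peak at lag:", lags[np.argmax(R)], "(expected", d, ")")
```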
Example 6.4-1 We connect points A and B together in Figure 6.4-1 and use the system to measure the autocorrelation function of the process X(t) of Example 6.2-1. From (6.4-2),

    R_o(2T) = (1/2T) ∫_{−T}^{T} A² cos(ω₀t + θ) cos(ω₀t + θ + ω₀τ) dt
        = (A²/4T) ∫_{−T}^{T} [cos(ω₀τ) + cos(2ω₀t + 2θ + ω₀τ)] dt

In writing this result, θ represents a specific value of the random variable Θ;
the value that corresponds to the specific ensemble member being used in (6.4-2). On straightforward reduction of the above integral we obtain

    R_o(2T) = R_XX(τ) + ε(T)

where

    R_XX(τ) = (A²/2) cos(ω₀τ)

is the true autocorrelation function of X(t), and

    ε(T) = (A²/2) cos(ω₀τ + 2θ) [sin(2ω₀T)/(2ω₀T)]

is an error term. If we require the error term's magnitude to be at least 20 times smaller than the largest value of the true autocorrelation function, then |ε(T)| < 0.05 R_XX(0) is necessary. Thus, we must have 1/(2ω₀T) ≤ 0.05, or

    T ≥ 10/ω₀

In other words, if T ≥ 10/ω₀ the error in using Figure 6.4-1 to measure the autocorrelation function of the process X(t) = A cos(ω₀t + Θ) will be 5% or less of the largest value of the true autocorrelation function.
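The sketch below verifies this bound numerically: it evaluates the finite-T measurement integral on a dense grid and compares it with (A²/2)cos(ω₀τ), confirming that the relative error falls below 5% once T ≥ 10/ω₀. The constants, delay, and phase value are arbitrary, and the grid-average is only an approximation of the integral.

```python
import numpy as np

A, w0 = 1.0, 2.0 * np.pi
tau, theta = 0.2, 1.1                  # one ensemble member (theta) and one delay

def R_measured(T, npts=200_001):
    """(1/2T) * integral_{-T}^{T} x(t) x(t+tau) dt for x(t) = A cos(w0 t + theta)."""
    t = np.linspace(-T, T, npts)
    integrand = A**2 * np.cos(w0 * t + theta) * np.cos(w0 * t + theta + w0 * tau)
    return integrand.mean()            # uniform grid: mean approximates (1/2T)*integral

R_true = (A**2 / 2) * np.cos(w0 * tau)
for T in (0.8, 10.0 / w0, 5.3, 47.1):
    rel_err = abs(R_measured(T) - R_true) / (A**2 / 2)
    print(f"T = {T:7.3f}   |error| / R(0) = {rel_err:.4f}")
```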
6.5 GAUSSIAN RANDOM PROCESSES

A number of random processes are important enough to have been given names. We shall discuss only the most important of these, the gaussian random process.

Consider a continuous random process such as illustrated in Figure 6.1-1, and define N random variables X₁ = X(t₁), ..., X_i = X(t_i), ..., X_N = X(t_N) corresponding to N time instants t₁, ..., t_i, ..., t_N. If, for any N = 1, 2, ... and any times t₁, ..., t_N, these random variables are jointly gaussian, that is, they have a joint density as given by (5.3-12), the process is called gaussian. Equation (5.3-12) can be written in the form

    f_X(x₁, ..., x_N; t₁, ..., t_N) = exp{−(1/2)[x − X̄]ᵗ[C_X]⁻¹[x − X̄]} / √((2π)^N |[C_X]|)    (6.5-1)

where the matrices [x − X̄] and [C_X] are defined in (5.3-13) through (5.3-15), respectively. The mean values X̄_i of X(t_i) are

    X̄_i = E[X_i] = E[X(t_i)]    (6.5-2)

The elements of the covariance matrix [C_X] are

    C_ik = C_{X_i X_k} = E[(X_i − X̄_i)(X_k − X̄_k)]
         = E[{X(t_i) − E[X(t_i)]}{X(t_k) − E[X(t_k)]}]
         = C_XX(t_i, t_k)    (6.5-3)

which is the autocovariance of X(t_i) and X(t_k) from (6.3-21).

From (6.5-2) and (6.5-3), when used in (6.5-1), we see that the mean and autocovariance functions are all that are needed to completely specify a gaussian random process. By expanding (6.5-3) to get

    C_XX(t_i, t_k) = R_XX(t_i, t_k) − E[X(t_i)]E[X(t_k)]    (6.5-4)

we see that an alternative specification using only the mean and autocorrelation function R_XX(t_i, t_k) is possible.

If the gaussian process is not stationary, the mean and autocovariance functions will, in general, depend on absolute time. However, for the important case where the process is wide-sense stationary, the mean will be constant,

    X̄_i = X̄ (constant)    (6.5-5)

while the autocovariance and autocorrelation functions will depend only on time differences and not absolute time:

    C_XX(t_i, t_k) = C_XX(t_k − t_i)    (6.5-6)
    R_XX(t_i, t_k) = R_XX(t_k − t_i)    (6.5-7)

It follows from the preceding discussions that a wide-sense stationary gaussian process is also strictly stationary. We illustrate some of the above remarks with an example.

Example 6.5-1 A gaussian random process is known to be wide-sense stationary with a mean of X̄ = 4 and autocorrelation function

    R_XX(τ) = 25 e^{−3|τ|}

We seek to specify the joint density function for three random variables X(t_i), i = 1, 2, 3, defined at the times t_i = t₀ + [(i − 1)/2], with t₀ a constant. Here t_k − t_i = (k − i)/2 for i and k = 1, 2, 3, so

    R_XX(t_i, t_k) = R_XX[(k − i)/2] = 25 e^{−3|k−i|/2}

and

    C_XX(t_i, t_k) = 25 e^{−3|k−i|/2} − 16

from (6.5-4) through (6.5-7). Elements of the covariance matrix are found from (6.5-3). Thus,

    [C_X] = [ (25 − 16)           (25e^{−3/2} − 16)   (25e^{−6/2} − 16)
              (25e^{−3/2} − 16)   (25 − 16)           (25e^{−3/2} − 16)
              (25e^{−6/2} − 16)   (25e^{−3/2} − 16)   (25 − 16)        ]

and X̄_i = 4 completely determine (6.5-1) for this case where N = 3.
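The covariance matrix of this example can be assembled directly from the stated mean and autocorrelation function; a minimal sketch (the times are measured from the constant t₀, which drops out for a wide-sense stationary process):

```python
import numpy as np

Xbar = 4.0
Rxx = lambda tau: 25.0 * np.exp(-3.0 * np.abs(tau))

# Times t_i = t0 + (i - 1)/2; only the differences t_k - t_i matter here.
t = np.array([0.0, 0.5, 1.0])
C = Rxx(t[:, None] - t[None, :]) - Xbar**2   # C_XX = R_XX - Xbar^2, eq. (6.5-4)

print(np.round(C, 3))
# diagonal: 25 - 16 = 9;  one step: 25e^{-3/2} - 16;  two steps: 25e^{-3} - 16
```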
Two random processes X(t) and Y(t) are said to be jointly gaussian if the random variables X(t₁), ..., X(t_N), Y(t′₁), ..., Y(t′_M), defined at times t₁, ..., t_N for X(t) and times t′₁, ..., t′_M for Y(t), are jointly gaussian for any N, t₁, ..., t_N, M, t′₁, ..., t′_M.

*6.6 COMPLEX RANDOM PROCESSES

If the complex random variable of Section 5.6 is generalized to include time, the result is a complex random process Z(t) given by

    Z(t) = X(t) + jY(t)    (6.6-1)

where X(t) and Y(t) are real processes. Z(t) is called stationary if X(t) and Y(t) are jointly stationary. If X(t) and Y(t) are jointly wide-sense stationary, then Z(t) is said to be wide-sense stationary. Two complex processes Z₁(t) and Z₂(t) are jointly wide-sense stationary if each is wide-sense stationary and their cross-correlation function (defined below) is a function of time differences only and not absolute time.

We may extend the operations involving process mean value, autocorrelation function, and autocovariance function to include complex processes. The mean value of Z(t) is

    E[Z(t)] = E[X(t)] + jE[Y(t)]    (6.6-2)

The autocorrelation function is defined by

    R_ZZ(t, t + τ) = E[Z*(t)Z(t + τ)]    (6.6-3)

where the asterisk * denotes the complex conjugate. The autocovariance function is defined by

    C_ZZ(t, t + τ) = E[{Z(t) − E[Z(t)]}*{Z(t + τ) − E[Z(t + τ)]}]    (6.6-4)

If Z(t) is at least wide-sense stationary, the mean value becomes a constant,

    Z̄ = X̄ + jȲ    (6.6-5)

and the correlation functions are independent of absolute time:

    R_ZZ(t, t + τ) = R_ZZ(τ)    (6.6-6)
    C_ZZ(t, t + τ) = C_ZZ(τ)    (6.6-7)

For two complex processes Z_i(t) and Z_j(t), cross-correlation and cross-covariance functions are defined by

    R_{Z_i Z_j}(t, t + τ) = E[Z_i*(t)Z_j(t + τ)]    i ≠ j    (6.6-8)

and

    C_{Z_i Z_j}(t, t + τ) = E[{Z_i(t) − E[Z_i(t)]}*{Z_j(t + τ) − E[Z_j(t + τ)]}]    i ≠ j    (6.6-9)

respectively. If the two processes are at least jointly wide-sense stationary, we obtain

    R_{Z_i Z_j}(t, t + τ) = R_{Z_i Z_j}(τ)    i ≠ j    (6.6-10)
    C_{Z_i Z_j}(t, t + τ) = C_{Z_i Z_j}(τ)    i ≠ j    (6.6-11)

Z_i(t) and Z_j(t) are said to be uncorrelated processes if C_{Z_i Z_j}(t, t + τ) = 0, i ≠ j. They are called orthogonal processes if R_{Z_i Z_j}(t, t + τ) = 0, i ≠ j.

Example 6.6-1 A complex random process V(t) is comprised of a sum of N complex signals:

    V(t) = Σ_{n=1}^{N} A_n e^{jω₀t + jΘ_n}

Here ω₀/2π is the (constant) frequency of each signal. A_n is a random variable representing the random amplitude of the nth signal. Similarly, Θ_n is a random variable representing a random phase angle. We assume all the variables A_n and Θ_n, for n = 1, 2, ..., N, are statistically independent and that the Θ_n are uniformly distributed on (0, 2π). We find the autocorrelation function of V(t).

From (6.6-3):

    R_VV(t, t + τ) = E[V*(t)V(t + τ)]
        = E[ Σ_{n=1}^{N} A_n e^{−jω₀t − jΘ_n} Σ_{m=1}^{N} A_m e^{jω₀t + jω₀τ + jΘ_m} ]
        = Σ_{n=1}^{N} Σ_{m=1}^{N} e^{jω₀τ} E[A_n A_m e^{j(Θ_m − Θ_n)}] = R_VV(τ)

From statistical independence:

    R_VV(τ) = e^{jω₀τ} Σ_{n=1}^{N} Σ_{m=1}^{N} E[A_n A_m] E[exp{j(Θ_m − Θ_n)}]

However,

    E[exp{j(Θ_m − Θ_n)}] = E[cos(Θ_m − Θ_n)] + jE[sin(Θ_m − Θ_n)]
        = ∫₀^{2π} ∫₀^{2π} [1/(2π)²][cos(θ_m − θ_n) + j sin(θ_m − θ_n)] dθ_n dθ_m
        = { 0    m ≠ n
            1    m = n }

and so

    R_VV(τ) = e^{jω₀τ} Σ_{n=1}^{N} E[A_n²]
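A Monte Carlo version of this example is sketched below: random amplitudes and independent uniform phases are drawn for N phasors, and the ensemble estimate of E[V*(t)V(t + τ)] is compared with e^{jω₀τ} Σ E[A_n²]. The amplitude distributions are arbitrary illustrative choices (any distributions work, by the derivation above).

```python
import numpy as np

rng = np.random.default_rng(6)

N, trials = 3, 200_000
w0 = 2.0 * np.pi
t, tau = 0.4, 0.25                      # arbitrary absolute time and delay

# Independent amplitudes (any distribution) and uniform phases, per trial.
A = rng.normal([1.0, 2.0, 0.5], 0.3, size=(trials, N))
Theta = rng.uniform(0.0, 2.0 * np.pi, size=(trials, N))

def V(time):
    return (A * np.exp(1j * (w0 * time + Theta))).sum(axis=1)

R_est = np.mean(np.conj(V(t)) * V(t + tau))
R_theory = np.exp(1j * w0 * tau) * (A**2).mean(axis=0).sum()

print("estimate:", np.round(R_est, 3))
print("theory  :", np.round(R_theory, 3))
```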
PROBLEMS

6-1 A random experiment consists of selecting a point on some city street that has two-way automobile traffic. Define and classify a random process for this experiment that is related to traffic flow.

6-2 A 10-meter section of a busy downtown sidewalk is actually the platform of a scale that produces a voltage proportional to the total weight of people on the scale at any time.
(a) Sketch a typical sample function for this process.
(b) What is the underlying random experiment for the process?
(c) Classify the process.

*6-3 An experiment consists of measuring the weight W of some person each 10 minutes. The person is randomly male or female (which is not known, though) with equal probability. A two-level discrete random process X(t) is generated where

    X(t) = ±10

The level −10 is generated in the period following a measurement if the measured weight does not exceed W₀ (some constant). Level +10 is generated if the weight exceeds W₀. Let the weight of men in kilograms be a random variable having the gaussian density

    f_W(w | male) = [1/(√(2π) 11.3)] exp[−(w − 77.1)²/(2(11.3)²)]

Similarly, for women,

    f_W(w | female) = [1/(√(2π) 6.8)] exp[−(w − 54.4)²/(2(6.8)²)]

(a) Find W₀ so that P{W > W₀ | male} is equal to P{W ≤ W₀ | female}.
(b) If the levels ±10 are interpreted as "decisions" about whether the weight measurement of a person corresponds to a male or female, give a physical significance to their generation.
(c) Sketch a possible sample function.

6-4 The two-level semirandom binary process is defined by

    X(t) = A or −A    (n − 1)T < t < nT

where the levels A and −A occur with equal probability, T is a positive constant, and n = 0, ±1, ±2, ....
(a) Sketch a typical sample function.
(b) Classify the process.
(c) Is the process deterministic?

6-5 Sample functions in a discrete random process are constants; that is,

    X(t) = C = constant

where C is a discrete random variable having possible values c₁ = 1, c₂ = 2, and c₃ = 3 occurring with probabilities 0.6, 0.3, and 0.1, respectively.
(a) Is X(t) deterministic?
(b) Find the first-order density function of X(t) at any time t.

6-6 Utilize (6.2-13) to prove (6.2-16).

6-7 A random process X(t) has periodic sample functions as shown in Figure P6-7, where B, T, and t₀ are constants (4t₀ ≤ T) but ε is a random variable uniformly distributed on the interval (0, T).
(a) Find the first-order distribution function of X(t).
(b) Find the first-order density function.
(c) Find E[X(t)], E[X²(t)], and σ_X².

[Figure P6-7: periodic sample-function waveform with amplitude B and period T.]

*6-8 Work Problem 6-7 for the waveform of Figure P6-8. Assume 2t₀ < T.

[Figure P6-8: periodic sample-function waveform with amplitude A and period T.]

*6-9 Work Problem 6-7 for the waveform of Figure P6-9. Assume 4t₀ ≤ T.

[Figure P6-9: periodic sample-function waveform of half-cycle pulses with period T.]
6-10 Given the random process

    X(t) = A sin(ω₀t + Θ)

where A and ω₀ are constants and Θ is a random variable uniformly distributed on the interval (−π, π), define a new random process Y(t) = X²(t).
(a) Find the autocorrelation function of Y(t).
(b) Find the cross-correlation function of X(t) and Y(t).
(c) Are X(t) and Y(t) wide-sense stationary?
(d) Are X(t) and Y(t) jointly wide-sense stationary?

6-11 A random process is defined by

    Y(t) = X(t) cos(ω₀t + Θ)

where X(t) is a wide-sense stationary random process that amplitude-modulates a carrier of constant angular frequency ω₀ with a random phase Θ independent of X(t) and uniformly distributed on (−π, π).
(a) Find E[Y(t)].
(b) Find the autocorrelation function of Y(t).
(c) Is Y(t) wide-sense stationary?

6-12 Given the random process

    X(t) = A cos(ω₀t) + B sin(ω₀t)

where ω₀ is a constant, and A and B are uncorrelated zero-mean random variables having different density functions but the same variances σ², show that X(t) is wide-sense stationary but not strictly stationary.

6-13 If X(t) is a stationary random process having a mean value E[X(t)] = 3 and autocorrelation function R_XX(τ) = 9 + 2e^{−|τ|}, find:
(a) the mean value and
(b) the variance
of the random variable Y formed by integrating X(t) over a fixed, finite time interval. (Hint: Assume the expectation and integration operations are interchangeable.)

6-14 Define a random process by

    X(t) = A cos(πt)

where A is a gaussian random variable with zero mean and variance σ_A².
(a) Find the density functions of X(0) and X(1).
(b) Is X(t) stationary in any sense?

6-15 For the random process of Problem 6-4, calculate:
(a) the mean value E[X(t)], (b) R_XX(t₁ = 0.5T, t₂ = 0.7T), and (c) R_XX(t₁ = 0.2T, t₂ = 1.2T).

6-16 A random process consists of three sample functions X(t, s₁) = 2, X(t, s₂) = 2 cos(t), and X(t, s₃) = 3 sin(t), each occurring with equal probability. Is the process stationary in any sense?

6-17 Statistically independent, zero-mean random processes X(t) and Y(t) have autocorrelation functions

    R_XX(τ) = e^{−|τ|}

and

    R_YY(τ) = cos(2πτ)

respectively.
(a) Find the autocorrelation function of the sum W₁(t) = X(t) + Y(t).
(b) Find the autocorrelation function of the difference W₂(t) = X(t) − Y(t).
(c) Find the cross-correlation function of W₁(t) and W₂(t).

6-18 Define a random process as X(t) = p(t + ε), where p(t) is any periodic waveform with period T and ε is a random variable uniformly distributed on the interval (0, T). Show that

    E[X(t)X(t + τ)] = (1/T) ∫₀^T p(ξ)p(ξ + τ) dξ = R_XX(τ)

*6-19 Use the result of Problem 6-18 to find the autocorrelation function of random processes having periodic sample function waveforms p(t) defined
(a) by Figure P6-7 with ε = 0 and 4t₀ ≤ T, and
(b) by Figure P6-8 with ε = 0 and 2t₀ ≤ T.

6-20 Define two random processes by X(t) = p₁(t + ε) and Y(t) = p₂(t + ε), where p₁(t) and p₂(t) are both periodic waveforms with period T and ε is a random variable uniformly distributed on the interval (0, T). Find an expression for the cross-correlation function E[X(t)Y(t + τ)].

6-21 Prove: (a) (6.3-4) and (b) (6.3-5).

6-22 Give arguments to justify (6.3-9).

6-23 For the random process having the autocorrelation function shown in Figure P6-23, find:
(a) E[X(t)], (b) E[X²(t)], and (c) σ_X².

[Figure P6-23: autocorrelation function with peak value 50 at τ = 0 and value 20 for |τ| ≥ 10.]
6-24 A random process Y(t) = X(t) − X(t + τ) is defined in terms of a process X(t) that is at least wide-sense stationary.
(a) Show that the mean value of Y(t) is 0 even if X(t) has a nonzero mean value.
(b) Show that

    σ_Y² = 2[R_XX(0) − R_XX(τ)]

(c) If Y(t) = X(t) + X(t + τ), find E[Y(t)] and σ_Y². How do these results compare to those of parts (a) and (b)?

6-25 For two zero-mean, jointly wide-sense stationary random processes X(t) and Y(t), it is known that σ_X² = 5 and σ_Y² = 10. Explain why each of the following functions cannot apply to the processes if they have no periodic components.
(a) R_XX(τ) = 6u(τ) exp(−3τ)
(b) R_XX(τ) = 5 sin(5τ)
(c) R_XY(τ) = 9(1 + 2τ²)⁻¹
(d) R_YY(τ) = −cos(6τ) exp(−|τ|)
(e) R_XX(τ) = 5 [sin(3τ)/(3τ)]
(f) R_YY(τ) = 6 + [sin(10τ)/(10τ)]

6-26 Given two random processes X(t) and Y(t), find expressions for the autocorrelation function of W(t) = X(t) + Y(t) if:
(a) X(t) and Y(t) are correlated.
(b) They are uncorrelated.
(c) They are uncorrelated with zero means.

6-27 Use (6.3-19) to prove (6.3-17).

6-28 Let X(t) be a stationary continuous random process that is differentiable. Denote its time derivative by Ẋ(t).
(a) Show that E[Ẋ(t)] = 0.
(b) Find R_XẊ(τ) in terms of R_XX(τ).
(c) Find R_ẊẊ(τ) in terms of R_XX(τ).
(Hint: Use the definition of the derivative,

    Ẋ(t) = lim_{ε→0} [X(t + ε) − X(t)]/ε

and assume the order of the limit and expectation operations can be interchanged.)

6-29 A gaussian random process has an autocorrelation function

    R_XX(τ) = 6 exp(−|τ|/2)

Determine a covariance matrix for the random variables X(t), X(t + 1), X(t + 2), and X(t + 3).

6-30 Work Problem 6-29 if

    R_XX(τ) = 6 sin(πτ)/(πτ)

6-31 An ensemble member of a stationary random process X(t) is sampled at N times t_i, i = 1, 2, ..., N. By treating the samples as random variables X_i = X(t_i), an estimate or measurement X̂ of the mean value X̄ = E[X(t)] of the process is sometimes formed by averaging the samples:

    X̂ = (1/N) Σ_{i=1}^{N} X_i

(a) Show that E[X̂] = X̄.
(b) If the samples are separated far enough in time so that the random variables X_i can be considered statistically independent, show that the variance of the estimate of the process mean is σ_X²/N.

6-32 For the random process and samples defined in Problem 6-31, let an estimate of the variance of the process be defined by

    σ̂_X² = (1/N) Σ_{i=1}^{N} (X_i − X̂)²

Show that the mean value of this estimate is

    E[σ̂_X²] = [(N − 1)/N] σ_X²

6-33 Assume that X(t) of Problem 6-31 is a zero-mean stationary gaussian process and let

    σ̂_X² = (1/N) Σ_{i=1}^{N} X_i²

be an estimate of the variance σ_X² of X(t) formed from the samples. Show that the variance of the estimate is

    variance of σ̂_X² = 2σ_X⁴/N

(Hint: Use the facts that E[X²] = σ_X², E[X³] = 0, and E[X⁴] = 3σ_X⁴ for a gaussian random variable having zero mean.)

6-34 How many samples must be taken in Problem 6-33 if the standard deviation of the estimate of the variance of X(t) is to not exceed 5% of σ_X²?

*6-35 A complex random process Z(t) = X(t) + jY(t) is defined by jointly stationary real processes X(t) and Y(t). Show that

    E[|Z(t)|²] = R_XX(0) + R_YY(0)

*6-36 Let X₁(t), X₂(t), Y₁(t), and Y₂(t) be real random processes and define

    Z₁(t) = X₁(t) + jY₁(t)
    Z₂(t) = X₂(t) + jY₂(t)

Find expressions for the cross-correlation function of Z₁(t) and Z₂(t) if:
(a) All the real processes are correlated.
(b) They are uncorrelated.
(c) They are uncorrelated with zero means.
*6-37 Let Z(t) be a stationary complex random process with an autocorrelation function R_ZZ(τ). Define the random variable

    W = ∫_a^{a+T} Z(t) dt

where T > 0 and a are real numbers. Show that

    E[|W|²] = ∫_{−T}^{T} (T − |τ|) R_ZZ(τ) dτ

ADDITIONAL PROBLEMS

6-38 For a random process X(t) it is known that f_X(x₁, x₂, x₃; t₁, t₂, t₃) = f_X(x₁, x₂, x₃; t₁ + Δ, t₂ + Δ, t₃ + Δ) for any t₁, t₂, t₃ and Δ. Indicate which of the following statements are unequivocally true: X(t) is (a) stationary to order 1, (b) stationary to order 2, (c) stationary to order 3, (d) strictly stationary, (e) wide-sense stationary, (f) not stationary in any sense, and (g) ergodic.

6-39 A random process is defined by X(t) = X₀ + Vt, where X₀ and V are statistically independent random variables uniformly distributed on intervals [X₀₁, X₀₂] and [V₁, V₂], respectively. Find (a) the mean, (b) the autocorrelation, and (c) the autocovariance functions of X(t). (d) Is X(t) stationary in any sense? If so, state the type.

*6-40 (a) Find the first-order density of the random process of Problem 6-39.
(b) Plot the density for t = k(X₀₂ − X₀₁)/(V₂ − V₁) with k = 0, 1/2, 1, and 2. Assume V₂ = 3V₁ in all plots.

6-41 Assume a wide-sense stationary process X(t) has a known mean X̄ and a known autocorrelation function R_XX(τ). Now suppose the process is observed at time t₁ and we wish to estimate, that is, predict, what the process will be at time t₁ + τ with τ > 0. We assume the estimate has the form

    X̂(t₁ + τ) = αX(t₁) + β

where α and β are constants.
(a) Find α and β so that the mean-squared prediction error

    ε̄ = E[{X(t₁ + τ) − X̂(t₁ + τ)}²]

is minimum.
(b) Find the minimum mean-squared error in terms of R_XX(τ). Develop an alternative form in terms of the autocovariance function.

6-42 Find the time average and time autocorrelation function of the random process of Example 6.2-1. Compare these results with the statistical mean and autocorrelation found in the example.

6-43 Assume that an ergodic random process X(t) has an autocorrelation function

    R_XX(τ) = 18 + [2/(6 + τ²)][1 + 4 cos(12τ)]

(a) Find |X̄|.
(b) Does this process have a periodic component?
(c) What is the average power in X(t)?

6-44 Define a random process X(t) as follows: (1) X(t) assumes only one of two possible levels 1 or −1 at any time, (2) X(t) switches back and forth between its two levels randomly with time, (3) the number of level transitions in any time interval τ is a Poisson random variable, that is, the probability of exactly k transitions, when the average rate of transitions is λ, is given by [(λτ)^k/k!] exp(−λτ), (4) transitions occurring in any time interval are statistically independent of transitions in any other interval, and (5) the levels at the start of any interval are equally probable. X(t) is usually called the random telegraph process. It is an example of a discrete random process.
(a) Find the autocorrelation function of the process.
(b) Find probabilities P{X(t) = 1} and P{X(t) = −1} for any t.
(c) What is E[X(t)]?
(d) Discuss the stationarity of X(t).

6-45 Work Problem 6-44 assuming the random telegraph signal has levels 0 and 1.

6-46 X̄ = 6 and R_XX(t, t + τ) = 36 + 25 exp(−|τ|) for a random process X(t). Indicate which of the following statements are true based on what is known with certainty. X(t) (a) is first-order stationary, (b) has total average power of 61 W, (c) is ergodic, (d) is wide-sense stationary, (e) has a periodic component, and (f) has an ac power of 36 W.

6-47 A zero-mean random process X(t) is ergodic, has average power of 24 W, and has no periodic components. Which of the following can be a valid autocorrelation function? If one cannot, state at least one reason why.
(a) 16 + 18 cos(3τ), (b) 24 Sa²(2τ), (c) [1 + 3τ²]⁻¹ exp(−6τ), and (d) 24δ(1 − τ).

6-48 Use the result of Problem 6-18 to find the autocorrelation function of a random process with periodic sample function waveform p(t) defined by

    p(t) = A cos²(2πt/T)

where A and T > 0 are constants.

6-49 An engineer wants to measure the mean value of a noise signal that can be well-modeled as a sample function of a gaussian process. He uses the sampling estimator of Problem 6-31. After 100 samples he wishes his estimate to be within ±0.1 V of the true mean with probability 0.9606. What is the largest variance the process can have such that his wishes will be true?
6-50 Let X(t) be the sum of a deterministic signal s(t) and a wide-sense stationary noise process N(t). Find the mean value, and the autocorrelation and autocovariance functions, of X(t). Discuss the stationarity of X(t).

6-51 Random processes X(t) and Y(t) are defined by

    X(t) = A cos(ω₀t + Θ)
    Y(t) = B cos(ω₀t + Θ)

where A, B, and ω₀ are constants while Θ is a random variable uniform on (0, 2π). By the procedures of Example 6.2-1 it is easy to find that X(t) and Y(t) are zero-mean, wide-sense stationary with autocorrelation functions

    R_XX(τ) = (A²/2) cos(ω₀τ)
    R_YY(τ) = (B²/2) cos(ω₀τ)

(a) Find the cross-correlation function R_XY(t, t + τ) and show that X(t) and Y(t) are jointly wide-sense stationary.
(b) Solve (6.4-2) and show that the response of the system of Figure 6.4-1 equals the true cross-correlation function plus an error term ε(T) that decreases as T increases.
(c) Sketch |ε(T)| versus T to show its behavior. How large must T be to make |ε(T)| less than 1% of the largest value the correct cross-correlation function can have?

6-52 Consider random processes

    X(t) = A cos(ω₀t + Θ)
    Y(t) = B cos(ω₁t + Φ)

where A, B, ω₁, and ω₀ are constants, while Θ and Φ are statistically independent random variables uniform on (0, 2π).
(a) Show that X(t) and Y(t) are jointly wide-sense stationary.
(b) If Θ = Φ, show that X(t) and Y(t) are not jointly wide-sense stationary unless ω₁ = ω₀.

6-53 A zero-mean gaussian random process has an autocorrelation function

    R_XX(τ) = { 13[1 − (|τ|/6)]    |τ| ≤ 6
                0                 elsewhere

Find the covariance function necessary to specify the joint density of random variables defined at times t_i = 2(i − 1), i = 1, 2, ..., 5. Give the covariance matrix for the X_i = X(t_i).

6-54 If the gaussian process of Problem 6-53 is shifted to have a constant mean X̄ = −2 but all else is unchanged, discuss how the autocorrelation function and covariance matrix change. What is the effect on the joint density of the five random variables?

*6-55 Extend Example 6.6-1 to allow the sum of complex-amplitude unequal-frequency phasors. Let Z_i, i = 1, 2, ..., N, be N complex zero-mean, uncorrelated random variables with variances σ_i². Form a random process

    Z(t) = Σ_{i=1}^{N} Z_i e^{jω_i t}

where the ω_i are the frequencies of the phasors.
(a) Show that E[Z(t)] = 0.
(b) Derive the autocorrelation function and show that Z(t) is wide-sense stationary.

*6-56 A complex random process is defined by

    Z(t) = exp(jΩt)

where Ω is a random variable uniformly distributed on the interval from ω₀ − Δω to ω₀ + Δω, where ω₀ and Δω are positive constants. Find:
(a) the mean value, and (b) the autocorrelation function of Z(t).
(c) Is Z(t) wide-sense stationary?

*6-57 Work Problem 6-56 except assume the process

    Z(t) = e^{jΩt} + e^{−jΩt} = 2 cos(Ωt)

*6-58 Let X(t) and Y(t) be statistically independent wide-sense stationary real processes having the same autocorrelation function R(τ). Define the complex process

    Z(t) = X(t) cos(ω₀t) + jY(t) sin(ω₀t)

where ω₀ is a positive constant. Find the autocorrelation function of Z(t). Is Z(t) wide-sense stationary?
CHAPTER SEVEN

SPECTRAL CHARACTERISTICS OF RANDOM PROCESSES

7.0 INTRODUCTION

All of the foregoing discussions concerning random processes have involved the time domain. That is, we have characterized processes by means of autocorrelation, cross-correlation, and covariance functions without any consideration of spectral properties. As is well known, both time domain and frequency domain analysis methods exist for analyzing linear systems and deterministic waveforms. But what about random waveforms? Is there some way to describe random processes in the frequency domain? The answer is yes, and it is the purpose of this chapter to introduce the most important concepts that apply to characterizing random processes in the frequency domain.

The spectral description of a deterministic waveform is obtained by Fourier transforming the waveform, and the reader would be correct in concluding that Fourier transforms play an important role in the spectral characterization of random waveforms. However, the direct transformation approach is not attractive for random waveforms because the transform may not exist. Thus, spectral analysis of random processes requires a bit more subtlety than do deterministic signals.

An appropriate spectrum to be associated with a random process is introduced in the following section. The concepts rely heavily on the theory of Fourier transforms. Readers wishing to refresh their background on Fourier theory are referred to Appendix D, where a short review is given.

7.1 POWER DENSITY SPECTRUM AND ITS PROPERTIES

The spectral properties of a deterministic signal x(t) are contained in its Fourier transform X(ω) given by

    X(ω) = ∫_{−∞}^{∞} x(t) e^{−jωt} dt    (7.1-1)

The function X(ω), sometimes called simply the spectrum of x(t), has the unit of volts per hertz and describes the way in which relative signal voltage is distributed with frequency. The Fourier transform can, therefore, be considered to be a voltage density spectrum applicable to x(t). Both the amplitudes and phases of the frequencies present in x(t) are described by X(ω). For this reason, if X(ω) is known, then x(t) can be recovered by means of the inverse Fourier transform

    x(t) = (1/2π) ∫_{−∞}^{∞} X(ω) e^{jωt} dω    (7.1-2)

In other words, X(ω) forms a complete description of x(t) and vice versa.

In attempting to apply (7.1-1) to a random process, we immediately encounter problems. The principal problem is the fact that X(ω) may not exist for most sample functions of the process. Thus, we conclude that a spectral description of a random process utilizing a voltage density spectrum (Fourier transform) is not feasible because such a spectrum may not exist. Other problems arise with transform-based descriptions as well (Cooper and McGillem, 1971, p. 132). On the other hand, if we turn our attention to the description of the power in the random process as a function of frequency, instead of voltage, it results that such a function does exist. We next proceed to develop this function, called the power density spectrum† of the random process.

The Power Density Spectrum

For a random process X(t), let x_T(t) be defined as that portion of a sample function x(t) that exists between −T and T; that is,

    x_T(t) = { x(t)    −T < t < T
               0       elsewhere    (7.1-3)

Now so long as T is finite, x_T(t) will satisfy

    ∫_{−T}^{T} |x_T(t)| dt < ∞    (7.1-4)

† Many books call this function a power spectral density. We shall occasionally use also the names power density or power spectrum.

172

~··
and will have a Fourier transform (see Appendix D for conditions sufficient for the existence of Fourier transforms), which we denote X_T(ω), given by

    X_T(ω) = ∫_{−∞}^{∞} x_T(t) e^{−jωt} dt = ∫_{−T}^{T} x(t) e^{−jωt} dt    (7.1-5)

The energy contained in x(t) in the interval (−T, T) is†

    E(T) = ∫_{−T}^{T} x_T²(t) dt = ∫_{−T}^{T} x²(t) dt    (7.1-6)

Since x_T(t) is Fourier transformable, its energy must also be related to X_T(ω) by Parseval's theorem. Thus, from (7.1-6) and (D-21) of Appendix D

    E(T) = ∫_{−T}^{T} x²(t) dt = (1/2π) ∫_{−∞}^{∞} |X_T(ω)|² dω    (7.1-7)

By dividing the expressions in (7.1-7) by 2T, we obtain the average power P(T) in x(t) over the interval (−T, T):

    P(T) = (1/2T) ∫_{−T}^{T} x²(t) dt = (1/2π) ∫_{−∞}^{∞} [|X_T(ω)|²/2T] dω    (7.1-8)

At this point we observe that |X_T(ω)|²/2T is a power density spectrum, because power results through its integration. However, it is not the function that we seek, for two reasons. One is the fact that (7.1-8) does not represent the power in an entire sample function; there remains the step of letting T become arbitrarily large so as to include all power in the ensemble member. The second reason is that (7.1-8) is only the power in one sample function and does not represent the process. In other words, P(T) is actually a random variable with respect to the random process. By taking the expected value in (7.1-8), we can obtain an average power P_XX for the random process.‡

From the above discussion it is clear that we must still form the limit as T → ∞ and take the expected value of (7.1-8) to obtain a suitable power density spectrum for the random process. It is important that the limiting operation be done last (Thomas, 1969, p. 98, or Cooper and McGillem, 1971, p. 134). After these operations are performed, (7.1-8) can be written

    P_XX = lim_{T→∞} (1/2T) ∫_{−T}^{T} E[X²(t)] dt = (1/2π) ∫_{−∞}^{∞} lim_{T→∞} {E[|X_T(ω)|²]/2T} dω    (7.1-9)

Equation (7.1-9) establishes two important facts. First, average power P_XX in a random process X(t) is given by the time average of its second moment:

    P_XX = A{E[X²(t)]} = lim_{T→∞} (1/2T) ∫_{−T}^{T} E[X²(t)] dt    (7.1-10)

For a process that is at least wide-sense stationary, E[X²(t)] = \overline{X²}, a constant, and P_XX = \overline{X²}. Second, P_XX can be obtained by a frequency domain integration. If we define the power density spectrum for the random process by

    S_XX(ω) = lim_{T→∞} E[|X_T(ω)|²]/2T    (7.1-11)

the applicable integral is

    P_XX = (1/2π) ∫_{−∞}^{∞} S_XX(ω) dω    (7.1-12)

from (7.1-9). Two examples will illustrate the above concepts.

Example 7.1-1 Consider the random process

    X(t) = A cos(ω₀t + Θ)

where A and ω₀ are real constants and Θ is a random variable uniformly distributed on the interval (0, π/2). We shall find the average power P_XX in X(t) by use of (7.1-10). The mean-squared value is

    E[X²(t)] = E[A² cos²(ω₀t + Θ)] = E[(A²/2) + (A²/2) cos(2ω₀t + 2Θ)]
             = A²/2 + (A²/2) ∫₀^{π/2} (2/π) cos(2ω₀t + 2θ) dθ
             = A²/2 − (A²/π) sin(2ω₀t)

This process is not even wide-sense stationary, since the above function is time-dependent. The time average of the above expression is

    P_XX = lim_{T→∞} (1/2T) ∫_{−T}^{T} E[X²(t)] dt

which easily evaluates to P_XX = A²/2, since the time average of sin(2ω₀t) is zero.

† We assume a real process X(t) and interpret x(t) as either the voltage across a 1-Ω impedance or the current through 1 Ω. In other words, we shall assume a 1-Ω real impedance whenever we discuss energy or power in subsequent work, unless specifically stated otherwise.
‡ In taking the expected value we replace x(t) by X(t) in (7.1-8) since the integral of x²(t) is an operation performed on all sample functions of X(t).
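A short simulation can be used to check Example 7.1-1 numerically. The following Python sketch (all parameter values, grid sizes, and the ensemble size are assumptions made only for illustration, not part of the text) estimates E[X²(t)] over an ensemble of phase outcomes and then time-averages it, as in (7.1-10).

```python
# Minimal numerical check of Example 7.1-1 (parameters are illustrative
# assumptions): estimate P_XX = A{E[X^2(t)]} for X(t) = A cos(w0 t + Theta),
# with Theta uniform on (0, pi/2), and compare with the analytic value A^2/2.
import numpy as np

rng = np.random.default_rng(0)
A, w0 = 2.0, 2.0 * np.pi * 5.0            # assumed amplitude and angular frequency
t = np.linspace(-10.0, 10.0, 2001)        # time grid standing in for (-T, T)
theta = rng.uniform(0.0, np.pi / 2.0, size=(2000, 1))  # ensemble of phase outcomes

x = A * np.cos(w0 * t + theta)            # each row is one sample function x(t)
second_moment = np.mean(x**2, axis=0)     # ensemble estimate of E[X^2(t)] at each t
P_xx = np.mean(second_moment)             # time average, per (7.1-10)

print(P_xx, A**2 / 2.0)                   # both values should be close to 2.0
```

The time-varying term −(A²/π) sin(2ω₀t) found above averages out over the symmetric time window, which is why the printed estimate settles near A²/2.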
Example 7.1-2 We reconsider the process of the above example to find S_XX(ω) and average power P_XX by use of (7.1-11) and (7.1-12), respectively. First we find X_T(ω):

    X_T(ω) = ∫_{−T}^{T} A cos(ω₀t + Θ) exp(−jωt) dt
           = (A/2) exp(jΘ) ∫_{−T}^{T} exp[j(ω₀ − ω)t] dt + (A/2) exp(−jΘ) ∫_{−T}^{T} exp[−j(ω₀ + ω)t] dt
           = AT exp(jΘ) {sin[(ω − ω₀)T]/[(ω − ω₀)T]} + AT exp(−jΘ) {sin[(ω + ω₀)T]/[(ω + ω₀)T]}

Next we determine |X_T(ω)|² = X_T(ω) X_T*(ω) and find its expected value. After some simple algebraic reduction we obtain

    E[|X_T(ω)|²]/2T = (A²π/2) {(T/π) sin²[(ω − ω₀)T]/[(ω − ω₀)T]² + (T/π) sin²[(ω + ω₀)T]/[(ω + ω₀)T]²}

Now it is known that

    lim_{T→∞} (T/π) [sin(αT)/(αT)]² = δ(α)    (7.1-20)

(Lathi, 1968, p. 24), so (7.1-11) and the above result give

    S_XX(ω) = (A²π/2) [δ(ω − ω₀) + δ(ω + ω₀)]    (7.1-21)

Finally, we use this result to obtain average power from (7.1-12):

    P_XX = (1/2π) ∫_{−∞}^{∞} (A²π/2) [δ(ω − ω₀) + δ(ω + ω₀)] dω = A²/2

Thus, P_XX found here agrees with that of the earlier Example 7.1-1.

Properties of the Power Density Spectrum

The power density spectrum possesses a number of important properties:

    (1) S_XX(ω) ≥ 0    (7.1-13)
    (2) S_XX(−ω) = S_XX(ω)    X(t) real    (7.1-14)
    (3) S_XX(ω) is real    (7.1-15)
    (4) (1/2π) ∫_{−∞}^{∞} S_XX(ω) dω = A{E[X²(t)]}    (7.1-16)

Property 1 follows from the definition (7.1-11) and the fact that the expected value of a nonnegative function is nonnegative. Similarly, property 3 is true from (7.1-11) since |X_T(ω)|² is real. Some reflection on the properties of Fourier transforms of real functions will verify property 2 (see Problem 7-9). Property 4 is just another statement of (7.1-9).

Sometimes another property is included in a list of properties:

    (5) S_ẊẊ(ω) = ω² S_XX(ω)    (7.1-17)

It says that the power density spectrum of the derivative Ẋ(t) = dX(t)/dt is ω² times the power spectrum of X(t). Proof of this property is left as a reader exercise (Problem 7-10).

A final property we list is

    (6) (1/2π) ∫_{−∞}^{∞} S_XX(ω) e^{jωτ} dω = A[R_XX(t, t + τ)]    (7.1-18)
        S_XX(ω) = ∫_{−∞}^{∞} A[R_XX(t, t + τ)] e^{−jωτ} dτ    (7.1-19)

It states that the power density spectrum and the time average of the autocorrelation function form a Fourier transform pair. We prove this very important property in Section 7.2. Of course, if X(t) is at least wide-sense stationary, A[R_XX(t, t + τ)] = R_XX(τ), and property 6 indicates that the power spectrum and the autocorrelation function form a Fourier transform pair. Thus

    R_XX(τ) ↔ S_XX(ω)

for a wide-sense stationary process.

Bandwidth of the Power Density Spectrum

Assume that X(t) is a lowpass process; that is, its spectral components are clustered near ω = 0 and have decreasing magnitudes at higher frequencies. Except for the fact that the area of S_XX(ω) is not necessarily unity, S_XX(ω) has characteristics similar to a probability density function (it is nonnegative and real). Indeed, by dividing S_XX(ω) by its area, a new function is formed with area of unity that is analogous to a density function.

Recall that standard deviation is a measure of the spread in a density function. The analogous quantity for the normalized power spectrum is a measure of its spread that we call rms bandwidth,† which we denote W_rms (rad/s).

† The notation rms bandwidth stands for root-mean-squared bandwidth.
Now since S_XX(ω) is an even function for a real process, its "mean value" is zero and its "standard deviation" is the square root of its second moment. Thus, upon normalization, the rms bandwidth is given by

    W_rms² = ∫_{−∞}^{∞} ω² S_XX(ω) dω / ∫_{−∞}^{∞} S_XX(ω) dω    (7.1-22)

Example 7.1-3 Given the power spectrum

    S_XX(ω) = 10/[1 + (ω/10)²]²

where the 6-dB bandwidth is 10 radians per second, we find W_rms. First, using (C-28) from Appendix C,

    ∫_{−∞}^{∞} 10 dω/[1 + (ω/10)²]² = 10⁵ ∫_{−∞}^{∞} dω/(100 + ω²)²
        = 10⁵ {ω/[200(100 + ω²)] + (1/2000) tan⁻¹(ω/10)} |_{−∞}^{∞} = 50π

Next, from (C-30) of Appendix C:

    ∫_{−∞}^{∞} 10 ω² dω/[1 + (ω/10)²]² = 10⁵ ∫_{−∞}^{∞} ω² dω/(100 + ω²)²
        = 10⁵ {−ω/[2(100 + ω²)] + (1/20) tan⁻¹(ω/10)} |_{−∞}^{∞} = 5000π

Thus

    W_rms² = 5000π/50π = 100

so W_rms = 10 rad/s.†

† Although W_rms and the 6-dB bandwidth of S_XX(ω) are equal in this case, they are unequal in general.

The above concept is readily extended to a process that has a bandpass form of power spectrum; that is, its significant spectral components cluster near some frequencies ω̄₀ and −ω̄₀. If we assume that the process X(t) is real, S_XX(ω) will be real and have even symmetry about ω = 0. With this assumption we define a mean frequency ω̄₀ by

    ω̄₀ = ∫₀^{∞} ω S_XX(ω) dω / ∫₀^{∞} S_XX(ω) dω    (7.1-23)

and rms bandwidth by

    W_rms² = 4 ∫₀^{∞} (ω − ω̄₀)² S_XX(ω) dω / ∫₀^{∞} S_XX(ω) dω    (7.1-24)

The reader is encouraged to sketch a few lowpass and bandpass power spectrums and justify for himself why the factor of 4 appears in (7.1-24).

7.2 RELATIONSHIP BETWEEN POWER SPECTRUM AND AUTOCORRELATION FUNCTION

In Section 7.1 it was stated that the inverse Fourier transform of the power density spectrum is the time average of the autocorrelation function; that is

    (1/2π) ∫_{−∞}^{∞} S_XX(ω) e^{jωτ} dω = A[R_XX(t, t + τ)]    (7.2-1)

This expression will now be proved. If we use (7.1-5), which is the definition of X_T(ω), in the defining equation (7.1-11) for the power spectrum, we have†

    S_XX(ω) = lim_{T→∞} E[(1/2T) ∫_{−T}^{T} X(t₁) e^{jωt₁} dt₁ ∫_{−T}^{T} X(t₂) e^{−jωt₂} dt₂]
            = lim_{T→∞} (1/2T) ∫_{−T}^{T} ∫_{−T}^{T} E[X(t₁)X(t₂)] e^{−jω(t₂−t₁)} dt₂ dt₁    (7.2-2)

The expectation in the integrand of (7.2-2) is identified as the autocorrelation function of X(t):

    E[X(t₁)X(t₂)] = R_XX(t₁, t₂)    −T < (t₁ and t₂) < T    (7.2-3)

Thus, (7.2-2) becomes

    S_XX(ω) = lim_{T→∞} (1/2T) ∫_{−T}^{T} ∫_{−T}^{T} R_XX(t₁, t₂) e^{−jω(t₂−t₁)} dt₂ dt₁    (7.2-4)

Suppose we next make the variable changes

    t = t₁    dt = dt₁    (7.2-5a)
    τ = t₂ − t₁ = t₂ − t    dτ = dt₂    (7.2-5b)

in (7.2-4); we obtain

    S_XX(ω) = lim_{T→∞} ∫_{−T−t}^{T−t} [(1/2T) ∫_{−T}^{T} R_XX(t, t + τ) dt] e^{−jωτ} dτ    (7.2-6)

† We use X(t) in (7.1-5), rather than x(t), to imply that the operations performed take place on the process, as opposed to one sample function.
Next, taking the limit with respect to the t integral first will allow us to interchange the limit and τ integral operations to get

    S_XX(ω) = ∫_{−∞}^{∞} {lim_{T→∞} (1/2T) ∫_{−T}^{T} R_XX(t, t + τ) dt} e^{−jωτ} dτ    (7.2-7)

The quantity within braces is recognized as the time average of the process autocorrelation function

    A[R_XX(t, t + τ)] = lim_{T→∞} (1/2T) ∫_{−T}^{T} R_XX(t, t + τ) dt    (7.2-8)

Thus, (7.2-7) becomes

    S_XX(ω) = ∫_{−∞}^{∞} A[R_XX(t, t + τ)] e^{−jωτ} dτ    (7.2-9)

which shows that S_XX(ω) and A[R_XX(t, t + τ)] form a Fourier transform pair:

    A[R_XX(t, t + τ)] ↔ S_XX(ω)    (7.2-10)

This expression implies (7.2-1), which we started out to prove.

For the important case where X(t) is at least wide-sense stationary, A[R_XX(t, t + τ)] = R_XX(τ) and we get

    S_XX(ω) = ∫_{−∞}^{∞} R_XX(τ) e^{−jωτ} dτ    (7.2-11)

    R_XX(τ) = (1/2π) ∫_{−∞}^{∞} S_XX(ω) e^{jωτ} dω    (7.2-12)

or

    R_XX(τ) ↔ S_XX(ω)    (7.2-13)

The expressions (7.2-11) and (7.2-12) are usually called the Wiener-Khinchin relations after the great American mathematician Norbert Wiener (1894-1964) and the Russian mathematician A. I. Khinchin (1894-1959). They form the basic link between the time domain description (correlation functions) of processes and their description in the frequency domain (power spectrum).

From (7.2-13), it is clear that knowledge of the power spectrum of a process allows complete recovery of the autocorrelation function when X(t) is at least wide-sense stationary; for a nonstationary process, only the time average of the autocorrelation function is recoverable from (7.2-10).

Example 7.2-1 The power spectrum will be found for the random process of Example 6.2-1 that has the autocorrelation function

    R_XX(τ) = (A²/2) cos(ω₀τ)

where A and ω₀ are constants. This equation can be written in the form

    R_XX(τ) = (A²/4)(e^{jω₀τ} + e^{−jω₀τ})

Now we note that the inverse transform of a frequency domain impulse function is

    (1/2π) ∫_{−∞}^{∞} δ(ω) e^{jωτ} dω = 1/2π

from (A-2) of Appendix A. Thus

    1 ↔ 2πδ(ω)

and, from the frequency-shifting property of Fourier transforms given by (D-7) of Appendix D, we get

    e^{±jω₀τ} ↔ 2πδ(ω ∓ ω₀)

By using this last result, the Fourier transform of R_XX(τ) becomes

    S_XX(ω) = (A²π/2)[δ(ω − ω₀) + δ(ω + ω₀)]

This function and R_XX(τ) are illustrated in Figure 7.2-1.

Figure 7.2-1 The autocorrelation function (a) and power density spectrum (b) of the wide-sense stationary random process of Example 7.2-1.
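The transform pair (7.2-11) and (7.2-12) is easy to exercise numerically. The short sketch below (a rough illustration; the exponential test autocorrelation, grids, and window sizes are assumptions, not taken from the text) transforms R_XX(τ) = e^{−3|τ|} by direct quadrature and compares the result with the standard exponential transform pair 2a/(a² + ω²) with a = 3; it then evaluates the rms bandwidth integral (7.1-22) for the spectrum of Example 7.1-3, which should come out near the exact value of 10 rad/s.

```python
# Sketch (assumed grids and test functions): numerically check the
# Wiener-Khinchin pair (7.2-11) and the rms bandwidth definition (7.1-22).
import numpy as np

# (7.2-11): S_XX(w) for R_XX(tau) = exp(-3|tau|); the table pair for a
# two-sided exponential gives 6/(9 + w^2).
tau = np.linspace(-20.0, 20.0, 40001)
R = np.exp(-3.0 * np.abs(tau))
w = np.linspace(-10.0, 10.0, 9)
S_num = np.array([np.trapz(R * np.exp(-1j * wk * tau), tau) for wk in w])
print(np.max(np.abs(S_num.real - 6.0 / (9.0 + w**2))))  # small quadrature error

# (7.1-22): rms bandwidth of Example 7.1-3, S_XX(w) = 10/[1 + (w/10)^2]^2.
w = np.linspace(-2000.0, 2000.0, 2000001)
S = 10.0 / (1.0 + (w / 10.0) ** 2) ** 2
W_rms = np.sqrt(np.trapz(w**2 * S, w) / np.trapz(S, w))
print(W_rms)  # approaches the exact value 10 rad/s as the window grows
```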
7.3 CROSS-POWER DENSITY SPECTRUM AND ITS PROPERTIES

Consider a real random process W(t) given by the sum of two other real processes X(t) and Y(t):

    W(t) = X(t) + Y(t)    (7.3-1)

The autocorrelation function of W(t) is

    R_WW(t, t + τ) = E[W(t)W(t + τ)]
                   = E{[X(t) + Y(t)][X(t + τ) + Y(t + τ)]}
                   = R_XX(t, t + τ) + R_YY(t, t + τ) + R_XY(t, t + τ) + R_YX(t, t + τ)    (7.3-2)

Now if we take the time average of both sides of (7.3-2) and Fourier transform the resulting expression by applying (7.2-9), we have

    S_WW(ω) = S_XX(ω) + S_YY(ω) + ℱ{A[R_XY(t, t + τ)]} + ℱ{A[R_YX(t, t + τ)]}    (7.3-3)

where ℱ{·} represents the Fourier transform. It is clear that the left side of (7.3-3) is just the power spectrum of W(t). Similarly, the first two right-side terms are the power spectrums of X(t) and Y(t), respectively. The second two right-side terms are new quantities that are the subjects of this section. It will be shown that they are cross-power density spectrums defined by (7.3-12) and (7.3-14) below.

The Cross-Power Density Spectrum

For two real random processes X(t) and Y(t), we define x_T(t) and y_T(t) as truncated ensemble members; that is

    x_T(t) = x(t)    for −T < t < T
           = 0       elsewhere    (7.3-4)

and

    y_T(t) = y(t)    for −T < t < T
           = 0       elsewhere    (7.3-5)

Both x_T(t) and y_T(t) are assumed to be magnitude integrable over the interval (−T, T) as indicated by (7.1-4). As a consequence, they will possess Fourier transforms that we denote by X_T(ω) and Y_T(ω), respectively:

    x_T(t) ↔ X_T(ω)    (7.3-6)
    y_T(t) ↔ Y_T(ω)    (7.3-7)

We next define the cross power P_XY(T) in the two processes within the interval (−T, T) by

    P_XY(T) = (1/2T) ∫_{−T}^{T} x_T(t) y_T(t) dt = (1/2T) ∫_{−T}^{T} x(t) y(t) dt    (7.3-8)

Since x_T(t) and y_T(t) are Fourier transformable, Parseval's theorem (D-20) applies; its left side is the same as (7.3-8). Thus, we may write

    P_XY(T) = (1/2T) ∫_{−T}^{T} x(t) y(t) dt = (1/2π) ∫_{−∞}^{∞} [X_T*(ω) Y_T(ω)/2T] dω    (7.3-9)

This cross power is a random quantity since its value will vary depending on which ensemble member is considered. We form the average cross power, denoted P̄_XY(T), by taking the expected value in (7.3-9). The result is

    P̄_XY(T) = (1/2T) ∫_{−T}^{T} R_XY(t, t) dt = (1/2π) ∫_{−∞}^{∞} {E[X_T*(ω) Y_T(ω)]/2T} dω    (7.3-10)

Finally, we form the total average cross power P_XY by letting T → ∞:

    P_XY = lim_{T→∞} (1/2T) ∫_{−T}^{T} R_XY(t, t) dt = (1/2π) ∫_{−∞}^{∞} lim_{T→∞} {E[X_T*(ω) Y_T(ω)]/2T} dω    (7.3-11)

It is clear that the integrand involving ω can be defined as a cross-power density spectrum; it is a function of ω which we denote

    S_XY(ω) = lim_{T→∞} E[X_T*(ω) Y_T(ω)]/2T    (7.3-12)

Thus

    P_XY = (1/2π) ∫_{−∞}^{∞} S_XY(ω) dω    (7.3-13)

By repeating the above procedure, we can also define another cross-power density spectrum by

    S_YX(ω) = lim_{T→∞} E[Y_T*(ω) X_T(ω)]/2T    (7.3-14)

Cross power is given by

    P_YX = (1/2π) ∫_{−∞}^{∞} S_YX(ω) dω = P_XY    (7.3-15)

Total cross power P_XY + P_YX can be interpreted as the additional power two processes are capable of generating, over and above their individual powers, due to the fact that they are correlated.
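Because (7.3-12) is built from the same truncated transforms as (7.1-11), the decomposition (7.3-3) can be checked directly by simulation. The discrete-time sketch below (the signal model, lag, and sizes are assumptions chosen only for illustration) estimates averaged periodograms and cross-periodograms with FFTs and confirms that S_WW ≈ S_XX + S_YY + 2 Re{S_XY} for W(t) = X(t) + Y(t).

```python
# Sketch (assumed discrete-time model): verify the spectral decomposition
# (7.3-3) of the sum W = X + Y using averaged (cross-)periodograms per (7.3-12).
import numpy as np

rng = np.random.default_rng(1)
n_avg, n = 400, 1024                      # ensemble members and samples per member
x = rng.standard_normal((n_avg, n))
y = 0.6 * np.roll(x, 5, axis=1) + rng.standard_normal((n_avg, n))  # correlated with x
w_sum = x + y

def spec(a, b):
    # averaged cross-periodogram E[A*(w) B(w)]/(2T), discrete analog of (7.3-12)
    A, B = np.fft.fft(a, axis=1), np.fft.fft(b, axis=1)
    return np.mean(np.conj(A) * B, axis=0) / n

S_xx, S_yy = spec(x, x).real, spec(y, y).real
S_xy = spec(x, y)                          # complex-valued in general
S_ww = spec(w_sum, w_sum).real
print(np.max(np.abs(S_ww - (S_xx + S_yy + 2.0 * S_xy.real))))  # ~ 0 (roundoff)
```

The identity actually holds realization by realization here, since |X + Y|² = |X|² + |Y|² + 2 Re{X*Y}; the ensemble averaging only matters when smooth estimates of the individual spectra themselves are wanted.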
Properties of the Cross-Power Density Spectrum

Some properties of the cross-power spectrum of real random processes X(t) and Y(t) are listed below without formal proofs.

    (1) S_XY(ω) = S_YX(−ω) = S_YX*(ω)    (7.3-16)
    (2) Re[S_XY(ω)] and Re[S_YX(ω)] are even functions of ω (see Problem 7-40).    (7.3-17)
    (3) Im[S_XY(ω)] and Im[S_YX(ω)] are odd functions of ω (see Problem 7-40).    (7.3-18)
    (4) S_XY(ω) = 0 and S_YX(ω) = 0 if X(t) and Y(t) are orthogonal.    (7.3-19)
    (5) If X(t) and Y(t) are uncorrelated and have constant means X̄ and Ȳ,
        S_XY(ω) = S_YX(ω) = 2πX̄Ȳδ(ω)    (7.3-20)
    (6) A[R_XY(t, t + τ)] ↔ S_XY(ω)    (7.3-21)
        A[R_YX(t, t + τ)] ↔ S_YX(ω)    (7.3-22)

In the above properties, Re[·] and Im[·] represent the real and imaginary parts, respectively, and A[·] represents the time average, as usual, defined by (6.2-21).

Property 1 follows from (7.3-12) and (7.3-14). Properties 2 and 3 are proved by considering the symmetry that X_T(ω) and Y_T(ω) must possess for real processes. Properties 4 and 5 may be proved by substituting the integral (Fourier transform) forms for X_T(ω) and Y_T(ω) into E[X_T*(ω) Y_T(ω)] and showing that the function has the necessary behavior under the stated assumptions.

Property 6 states that the cross-power density spectrum and the time average of the cross-correlation function are a Fourier transform pair; its development is given in Section 7.4. For the case of jointly wide-sense stationary processes, (7.3-21) and (7.3-22) reduce to the especially useful forms

    S_XY(ω) = ∫_{−∞}^{∞} R_XY(τ) e^{−jωτ} dτ    (7.3-23)
    S_YX(ω) = ∫_{−∞}^{∞} R_YX(τ) e^{−jωτ} dτ    (7.3-24)
    R_XY(τ) = (1/2π) ∫_{−∞}^{∞} S_XY(ω) e^{jωτ} dω    (7.3-25)
    R_YX(τ) = (1/2π) ∫_{−∞}^{∞} S_YX(ω) e^{jωτ} dω    (7.3-26)

Example 7.3-1 Suppose we are given a cross-power spectrum defined by

    S_XY(ω) = a + jbω/W    for −W < ω < W
            = 0             elsewhere

where W > 0, a, and b are real constants. We use (7.3-25) to find the cross-correlation function. It is

    R_XY(τ) = (1/2π) ∫_{−W}^{W} (a + jbω/W) e^{jωτ} dω
            = (a/2π) ∫_{−W}^{W} e^{jωτ} dω + (jb/2πW) ∫_{−W}^{W} ω e^{jωτ} dω

On using (C-45) and (C-46) this expression will readily reduce to

    R_XY(τ) = (1/πWτ²) [(aWτ − b) sin(Wτ) + bWτ cos(Wτ)]

*7.4 RELATIONSHIP BETWEEN CROSS-POWER SPECTRUM AND CROSS-CORRELATION FUNCTION

In the following discussion we show that

    S_XY(ω) = ∫_{−∞}^{∞} {lim_{T→∞} (1/2T) ∫_{−T}^{T} R_XY(t, t + τ) dt} e^{−jωτ} dτ    (7.4-1)

as indicated in (7.3-21). The development consists of using the transforms of the truncated ensemble members, given by

    X_T(ω) = ∫_{−T}^{T} x(t) e^{−jωt} dt    (7.4-2)
    Y_T(ω) = ∫_{−T}^{T} y(t) e^{−jωt} dt    (7.4-3)

in (7.3-12) and then taking the expected value and limit as indicated to obtain S_XY(ω). From (7.4-2) and (7.4-3):

    X_T*(ω) Y_T(ω) = ∫_{−T}^{T} x(t₁) e^{jωt₁} dt₁ ∫_{−T}^{T} y(t₂) e^{−jωt₂} dt₂
                   = ∫_{−T}^{T} ∫_{−T}^{T} x(t₁) y(t₂) e^{−jω(t₂−t₁)} dt₁ dt₂    (7.4-4)

Now by changing variables according to (7.2-5), dividing by 2T, and taking the expected value, (7.4-4) becomes

    E[X_T*(ω) Y_T(ω)]/2T = ∫_{−T−t}^{T−t} {(1/2T) ∫_{−T}^{T} R_XY(t, t + τ) dt} e^{−jωτ} dτ    (7.4-5)
After the limit is taken:

    S_XY(ω) = lim_{T→∞} E[X_T*(ω) Y_T(ω)]/2T
            = ∫_{−∞}^{∞} {lim_{T→∞} (1/2T) ∫_{−T}^{T} R_XY(t, t + τ) dt} e^{−jωτ} dτ    (7.4-6)

which is the same as (7.4-1). Since (7.4-6) is a Fourier transform, and such transforms are unique, the inverse transform applies:

    lim_{T→∞} (1/2T) ∫_{−T}^{T} R_XY(t, t + τ) dt = (1/2π) ∫_{−∞}^{∞} S_XY(ω) e^{jωτ} dω    (7.4-7)

It should be noted from (7.4-7) that, given the cross-power spectrum, the cross-correlation function cannot in general be recovered; only its time average can. For jointly wide-sense stationary processes, however, the cross-correlation function R_XY(τ) can be found from S_XY(ω), since its time average is just R_XY(τ). Although we shall not give the proof, a development similar to the above shows that (7.3-22) is true.

Example 7.4-1 Let the cross-correlation function of two processes X(t) and Y(t) be

    R_XY(t, t + τ) = (AB/2){sin(ω₀τ) + cos[ω₀(2t + τ)]}

where A, B, and ω₀ are constants. We find the cross-power spectrum by use of (7.4-1). First, the time average is formed:

    lim_{T→∞} (1/2T) ∫_{−T}^{T} R_XY(t, t + τ) dt
        = (AB/2) sin(ω₀τ) + (AB/2) lim_{T→∞} (1/2T) ∫_{−T}^{T} cos[ω₀(2t + τ)] dt

The integral is readily evaluated and is found to be zero. Finally, we Fourier transform the time-averaged cross-correlation function with the aid of pair 12 of Appendix E:

    S_XY(ω) = ℱ{(AB/2) sin(ω₀τ)} = (−jπAB/2)[δ(ω − ω₀) − δ(ω + ω₀)]

7.5 SOME NOISE DEFINITIONS AND OTHER TOPICS

In many practical problems it is helpful to sometimes characterize noise through its power density spectrum. Indeed, in the following discussions we define two forms of noise on the basis of their power spectrums. We also consider the response of a product device when one of its input waveforms is a random signal or noise.

White and Colored Noise

A sample function n(t) of a wide-sense stationary noise random process N(t) is called white noise if the power density spectrum of N(t) is a constant at all frequencies. Thus, we define

    S_NN(ω) = 𝒩₀/2    (7.5-1)

for white noise, where 𝒩₀ is a real positive constant. By inverse Fourier transformation of (7.5-1), the autocorrelation function of N(t) is found to be

    R_NN(τ) = (𝒩₀/2) δ(τ)    (7.5-2)

The above two functions are illustrated in Figure 7.5-1. White noise derives its name by analogy with "white" light, which contains all visible light frequencies in its spectrum.

Figure 7.5-1 (a) The autocorrelation function and (b) the power density spectrum of white noise. [Adapted from Peebles (1976) with permission of publishers Addison-Wesley, Advanced Book Program.]

White noise is unrealizable, as can be seen by the fact that it possesses infinite average power:

    (1/2π) ∫_{−∞}^{∞} S_NN(ω) dω = ∞    (7.5-3)

However, one type of real-world noise closely approximates white noise. Thermal noise generated by thermal agitation of electrons in any electrical conductor has a power spectrum that is constant up to very high frequencies and then decreases. For example, a resistor at temperature T in kelvin produces a noise voltage
across its open-circuited terminals having the power spectrum† (Carlson, 1975, p. 118)

    S_NN(ω) = (𝒩₀/2)(α|ω|/T) / [e^{α|ω|/T} − 1]    (7.5-4)

where α = 7.64(10⁻¹²) kelvin-seconds is a constant. At a temperature of T = 290 K (usually called room temperature although it corresponds to a rather cool room at 63°F), this function remains above 0.9(𝒩₀/2) for frequencies up to 10¹² Hz or 1000 GHz. Thus, thermal noise has a nearly flat spectrum at all frequencies that are likely to ever be used in radio, microwave, or millimeter-wave systems.‡

Noise having a nonzero and constant power spectrum over a finite frequency band and zero everywhere else is called band-limited white noise. Figure 7.5-2a depicts such a power spectrum that is lowpass. Here

    S_NN(ω) = Pπ/W    for −W < ω < W
            = 0        elsewhere    (7.5-5)

Inverse transformation of (7.5-5) gives the autocorrelation function shown in Figure 7.5-2b:

    R_NN(τ) = P sin(Wτ)/(Wτ)    (7.5-6)

The constant P equals the power in the noise.

† The unit of S_NN(ω) is actually volts squared per hertz. According to our convention, we obtain watts per hertz by presuming the voltage exists across a 1-Ω resistor.
‡ This statement must be reexamined for T < 290 K, such as in some superconducting systems or other low-temperature devices (masers).

Figure 7.5-2 Power density spectrum (a) and autocorrelation function (b) of lowpass band-limited white noise.
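The pair (7.5-5) and (7.5-6) is convenient to check by simulation, since band-limited white noise is easy to synthesize in the frequency domain. The sketch below is one way to do this (the synthesis method, sample rate, and sizes are assumptions, not from the text): it builds sample functions whose spectrum is flat for |ω| < W and compares the estimated, normalized autocorrelation with sin(Wτ)/(Wτ).

```python
# Sketch (assumed synthesis and parameters): generate lowpass band-limited
# white noise per (7.5-5) and check its autocorrelation against (7.5-6),
# R_NN(tau) = P sin(W tau)/(W tau).
import numpy as np

rng = np.random.default_rng(2)
fs, n, n_avg = 1000.0, 8192, 200          # sample rate (Hz), length, ensemble size
W = 2.0 * np.pi * 50.0                    # band edge in rad/s
f = np.fft.rfftfreq(n, 1.0 / fs)
mask = (2.0 * np.pi * f) < W              # flat spectrum only inside |w| < W

coef = (rng.standard_normal((n_avg, f.size))
        + 1j * rng.standard_normal((n_avg, f.size))) * mask
x = np.fft.irfft(coef, n=n, axis=1)       # real band-limited noise sample functions

lags = np.arange(200)
R = np.array([np.mean(x[:, : n - 400] * x[:, k : n - 400 + k]) for k in lags])
tau = lags / fs
# np.sinc(u) = sin(pi u)/(pi u), so sin(W tau)/(W tau) = np.sinc(W tau / pi)
print(np.max(np.abs(R / R[0] - np.sinc(W * tau / np.pi))))  # small estimation error
```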
Band-limited white noise can also be bandpass, as illustrated in Figure 7.5-3. The applicable power spectrum and autocorrelation function are:

    S_NN(ω) = Pπ/W    for ω₀ − (W/2) < |ω| < ω₀ + (W/2)
            = 0        elsewhere    (7.5-7)

and

    R_NN(τ) = P [sin(Wτ/2)/(Wτ/2)] cos(ω₀τ)    (7.5-8)

where ω₀ and W are constants and P is the power in the noise.

Figure 7.5-3 Power density spectrum (a) and autocorrelation function (b) for bandpass band-limited white noise. [Adapted from Peebles (1976) with permission of publishers Addison-Wesley, Advanced Book Program.]

Again, by analogy with colored light that has only a portion of the visible light frequencies in its spectrum, we define colored noise as any noise that is not white. An example serves to illustrate colored noise.

Example 7.5-1 A wide-sense stationary noise process N(t) has an autocorrelation function

    R_NN(τ) = P e^{−3|τ|}

where P is a constant. We find its power spectrum. It is

    S_NN(ω) = ∫_{−∞}^{∞} P e^{−3|τ|} e^{−jωτ} dτ
            = P ∫₀^{∞} e^{−(3+jω)τ} dτ + P ∫_{−∞}^{0} e^{(3−jω)τ} dτ

These integrals easily evaluate using (C-45) to give

    S_NN(ω) = P/(3 + jω) + P/(3 − jω) = 6P/(9 + ω²)

This power spectrum is sketched in Figure 7.5-4 along with the preceding autocorrelation function.

Figure 7.5-4 The autocorrelation function (a) and power spectrum (b) of the colored noise of Example 7.5-1. [Adapted from Peebles (1976) with permission of publishers Addison-Wesley, Advanced Book Program.]

Product Device Response to a Random Signal

Product devices are frequently encountered in electrical systems. Often they involve the product of a random waveform X(t) (either signal or noise or the sum of signal and noise) with a cosine (or sine) "carrier" wave as illustrated in Figure 7.5-5. The response is the new process

    Y(t) = X(t) A₀ cos(ω₀t)    (7.5-9)

where A₀ and ω₀ are constants. We seek to find the power spectrum S_YY(ω) of Y(t) in terms of the power spectrum S_XX(ω) of X(t).

The autocorrelation function of Y(t) is

    R_YY(t, t + τ) = E[Y(t)Y(t + τ)]
                   = E[A₀² X(t)X(t + τ) cos(ω₀t) cos(ω₀t + ω₀τ)]
                   = (A₀²/2) R_XX(t, t + τ)[cos(ω₀τ) + cos(2ω₀t + ω₀τ)]    (7.5-10)

Even if X(t) is wide-sense stationary, Y(t) is not, since R_YY(t, t + τ) depends on t. Thus, we apply (7.1-19) to obtain S_YY(ω) after we take the time average of R_YY(t, t + τ). Let X(t) be assumed wide-sense stationary. Then (7.5-10) becomes

    A[R_YY(t, t + τ)] = (A₀²/2) R_XX(τ) cos(ω₀τ)    (7.5-11)

On Fourier transforming (7.5-11) we have

    S_YY(ω) = (A₀²/4)[S_XX(ω − ω₀) + S_XX(ω + ω₀)]    (7.5-12)

A possible power density spectrum of X(t) and that given by (7.5-12) are illustrated in Figure 7.5-6. It presumes that X(t) is a lowpass process, although this is not a constraint in applying (7.5-12).

Figure 7.5-5 A product of interest in electrical systems. [Adapted from Peebles (1976) with permission of publishers Addison-Wesley, Advanced Book Program.]
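Relation (7.5-12) can also be seen directly in a simulation. The following sketch (all parameters are illustrative assumptions; the carrier frequency is placed on an exact FFT bin so the discrete spectral shift is clean) multiplies band-limited lowpass noise by A₀ cos(ω₀t) and compares the averaged periodogram of the product with the shifted and scaled input spectrum.

```python
# Sketch (assumed parameters): demonstrate (7.5-12), i.e., the product device
# quarters the input power spectrum and shifts the two copies to center on +/-w0.
import numpy as np

rng = np.random.default_rng(3)
fs, n, n_avg = 1024.0, 4096, 300
t = np.arange(n) / fs
A0, f0 = 2.0, 200.0                        # carrier amplitude and frequency (Hz)

f = np.fft.rfftfreq(n, 1.0 / fs)
coef = (rng.standard_normal((n_avg, f.size))
        + 1j * rng.standard_normal((n_avg, f.size))) * (f < 40.0)
x = np.fft.irfft(coef, n=n, axis=1)        # lowpass input, band-limited to 40 Hz
y = x * A0 * np.cos(2.0 * np.pi * f0 * t)  # product device output, per (7.5-9)

def psd(a):                                # averaged periodogram, per (7.1-11)
    return np.mean(np.abs(np.fft.fft(a, axis=1)) ** 2, axis=0) / n

S_xx, S_yy = psd(x), psd(y)
k0 = int(f0 * n / fs)                      # f0 falls exactly on FFT bin k0 = 800
S_pred = (A0**2 / 4.0) * (np.roll(S_xx, k0) + np.roll(S_xx, -k0))
print(np.max(np.abs(S_yy - S_pred)))       # ~ 0: the shifted copies do not overlap
```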
Figure 7.5-6 Power density spectrums applicable to Figure 7.5-5: (a) at the input and (b) at the output. [Adapted from Peebles (1976) with permission of publishers Addison-Wesley, Advanced Book Program.]

Example 7.5-2 One important use of the product device is in recovery (demodulation) of the information signal (music, speech, etc.) conveyed in the wave transmitted from a conventional broadcast radio station that uses AM (amplitude modulation). The wave received by a receiver tuned to a station with frequency ω₀/2π is one input to the product device. The other is a "local oscillator" signal A₀ cos(ω₀t) generated within the receiver. The product device output passes through a lowpass filter which has as its output the desired information signal. Unfortunately, this signal also contains noise, because noise is also present at the input to the product device; the input noise is added to the received radio wave. We shall calculate the power in the output noise of the product demodulator.

Let the power spectrum of the input noise, denoted X(t), be approximated by an idealized (rectangular) function with bandwidth W_RF centered at ±ω₀. Thus,

    S_XX(ω) = 𝒩₀/2    for −ω₀ − (W_RF/2) < ω < −ω₀ + (W_RF/2)
            = 𝒩₀/2    for ω₀ − (W_RF/2) < ω < ω₀ + (W_RF/2)
            = 0        elsewhere

where 𝒩₀/2 is the power density within the noise band. By applying (7.5-12), the power density spectrum of the output noise Y(t) of the product device is readily found (by sketch) to be

    S_YY(ω) = 𝒩₀A₀²/8    for −2ω₀ − (W_RF/2) < ω < −2ω₀ + (W_RF/2)
            = 𝒩₀A₀²/4    for −W_RF/2 < ω < W_RF/2
            = 𝒩₀A₀²/8    for 2ω₀ − (W_RF/2) < ω < 2ω₀ + (W_RF/2)
            = 0           elsewhere

Now only the noise in the band −W_RF/2 < ω < W_RF/2 cannot be removed by a lowpass filter (which usually follows the product device to remove unwanted noise and other undesired outputs), because the desired signal is in the same band. This remaining component of S_YY(ω) gives rise to the final output noise power, denoted N_o:

    N_o = (1/2π) ∫_{−W_RF/2}^{W_RF/2} (𝒩₀A₀²/4) dω = 𝒩₀A₀²W_RF/8π

*7.6 POWER SPECTRUMS OF COMPLEX PROCESSES

Power spectrums may readily be defined for complex processes. We consider only those processes that are at least wide-sense stationary. In terms of the autocorrelation function R_ZZ(τ) of a complex random process Z(t), the power density spectrum is defined as its Fourier transform

    S_ZZ(ω) = ∫_{−∞}^{∞} R_ZZ(τ) e^{−jωτ} dτ    (7.6-1)

The inverse transform applies, so

    R_ZZ(τ) = (1/2π) ∫_{−∞}^{∞} S_ZZ(ω) e^{jωτ} dω    (7.6-2)

For two jointly wide-sense stationary complex processes Z_m(t) and Z_n(t), their cross-power density spectrum and cross-correlation function are a Fourier transform pair:

    S_{Z_mZ_n}(ω) = ∫_{−∞}^{∞} R_{Z_mZ_n}(τ) e^{−jωτ} dτ    (7.6-3)
    R_{Z_mZ_n}(τ) = (1/2π) ∫_{−∞}^{∞} S_{Z_mZ_n}(ω) e^{jωτ} dω    (7.6-4)

An equivalent statement is:

    R_{Z_mZ_n}(τ) ↔ S_{Z_mZ_n}(ω)    (7.6-5)
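One feature worth noting is that the power spectrum of a complex process need not have the even symmetry of property 2, since Z(t) is not real. The sketch below (phase model, sample rate, and sizes are assumptions; the complex exponential is placed on an exact FFT bin) illustrates this for Z(t) = A e^{j(ω₀t + Θ)} with Θ uniform on (0, 2π): the estimated autocorrelation is A² e^{jω₀τ}, so the spectrum is a single line at +ω₀ with no mirror at −ω₀.

```python
# Sketch (assumed parameters): spectrum of the complex process
# Z(t) = A exp[j(w0 t + Theta)], Theta uniform on (0, 2*pi). The estimated
# autocorrelation E[Z*(t)Z(t+tau)] equals A^2 exp(j w0 tau), and the averaged
# periodogram shows a line at +w0 only, i.e., S_ZZ need not be even.
import numpy as np

rng = np.random.default_rng(4)
A, f0, fs, n, n_avg = 1.5, 50.0, 1024.0, 2048, 500
t = np.arange(n) / fs
theta = rng.uniform(0.0, 2.0 * np.pi, size=(n_avg, 1))
z = A * np.exp(1j * (2.0 * np.pi * f0 * t + theta))

for k in (0, 3, 10):                       # R_ZZ at a few lags tau = k/fs
    R_k = np.mean(np.conj(z[:, : n - 10]) * z[:, k : n - 10 + k])
    print(R_k, A**2 * np.exp(1j * 2.0 * np.pi * f0 * k / fs))  # these agree

S = np.mean(np.abs(np.fft.fft(z, axis=1)) ** 2, axis=0) / n
freqs = np.fft.fftfreq(n, 1.0 / fs)
print(freqs[np.argmax(S)])                 # +50.0 Hz; no matching line at -50 Hz
```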
Example 7.6-1 We reconsider the complex process V(t) of Example 6.6-1 and find its power spectrum. From the previous example

    R_VV(τ) = e^{jω₀τ} Σ_{n=1}^{N} A_n²

On Fourier transforming this autocorrelation function we obtain

    S_VV(ω) = ℱ{e^{jω₀τ} Σ_{n=1}^{N} A_n²} = 2πδ(ω − ω₀) Σ_{n=1}^{N} A_n²

after using pair 9 of Appendix E.

PROBLEMS

7-1 We are given the random process

    X(t) = A cos(ω₀t + Θ)

where A and ω₀ are constants and Θ is a random variable uniformly distributed on the interval (0, π).
(a) Is X(t) wide-sense stationary?
(b) Find the power in X(t) by using (7.1-10).
(c) Find the power spectrum of X(t) by using (7.1-11) and calculate power from (7.1-12). Do your two powers agree?

7-2 Work Problem 7-1 if the process is defined by

    X(t) = u(t) A cos(ω₀t + Θ)

where u(t) is the unit-step function.

*7-3 Work Problem 7-2 assuming Θ is uniform on the interval (0, π/2).

7-4 Work Problem 7-1 if the random process is given by X(t) = A sin(ω₀t + Θ).

*7-5 Work Problem 7-1 if the random process is

    X(t) = A² cos²(ω₀t + Θ)

7-6 Let A and B be random variables. We form the random process

    X(t) = A cos(ω₀t) + B sin(ω₀t)

where ω₀ is a real constant.
(a) Show that if A and B are uncorrelated with zero means and equal variances, then X(t) is wide-sense stationary.
(b) Find the autocorrelation function of X(t).
(c) Find the power density spectrum.

7-7 A limiting form for the impulse function was given in Example 7.1-2. Give arguments to show that the following are also true:
(a) lim_{T→∞} T exp(−πα²T²) = δ(α)
(b) lim_{T→∞} (T/2) exp(−|α|T) = δ(α)

7-8 Work Problem 7-7 for the following cases:
(a) lim_{T→∞} (T/π) sin(αT)/(αT) = δ(α)
(b) lim_{T→∞} T[1 − |α|T] = δ(α)    |α| < 1/T

7-9 Show that (7.1-14) is true.

7-10 Prove (7.1-17). [Hint: Use (D-6) of Appendix D and the definition of the derivative.]

7-11 A random process is defined by

    Y(t) = X(t) cos(ω₀t + Θ)

where X(t) is a lowpass wide-sense stationary process, ω₀ is a real constant, and Θ is a random variable uniformly distributed on the interval (0, 2π). Find and sketch the power density spectrum of Y(t) in terms of that of X(t). Assume Θ is independent of X(t).

7-12 Determine which of the following functions can and cannot be valid power density spectrums. For those that are not, explain why.
(a) ω²/(ω⁶ + 3ω² + 3)
(b) exp[−(ω − 1)²]
(c) ω²/(ω⁴ + 1) − δ(ω)
(d) ω⁴/(1 + ω² + jω⁶)

7-13 Work Problem 7-12 for the following functions:
(a) cos(3ω)/(1 + ω²)
(b) 1/(1 + ω²)²
(d) 1/(1 − 3ω²)
7-14 Given that X(t) = Σ_{i=1}^{N} α_i X_i(t), where {α_i} is a set of real constants and the processes X_i(t) are stationary and orthogonal, show that

    S_XX(ω) = Σ_{i=1}^{N} α_i² S_{X_iX_i}(ω)

7-15 A random process is given by

    X(t) = A cos(Ωt + Θ)

where A is a real constant, Ω is a random variable with density function f_Ω(·), and Θ is a random variable uniformly distributed on the interval (0, 2π) independent of Ω. Show that the power spectrum of X(t) is

    S_XX(ω) = (πA²/2)[f_Ω(ω) + f_Ω(−ω)]

7-16 If X(t) is a stationary process, find the power spectrum of

    Y(t) = A + BX(t)

in terms of the power spectrum of X(t) if A and B are real constants.

7-17 Find the power density spectrum of the random process for which

    R_XX(τ) = P cos⁴(ω₀τ)

if P and ω₀ are constants. Determine the power in the process by use of (7.1-12).

7-18 A random process has the power density spectrum

    S_XX(ω) = 6ω²/(1 + ω⁴)

Find the average power in the process.

7-19 Work Problem 7-18 for the power spectrum

    S_XX(ω) = 6ω²/(1 + ω²)²

7-20 Work Problem 7-18 for the power spectrum

    S_XX(ω) = 6ω²/(1 + ω²)⁴

7-21 Assume X(t) is a wide-sense stationary process with nonzero mean value X̄ ≠ 0. Show that

    S_XX(ω) = 2πX̄²δ(ω) + ∫_{−∞}^{∞} C_XX(τ) e^{−jωτ} dτ

where C_XX(τ) is the autocovariance function of X(t).

7-22 For a random process X(t), assume that

    R_XX(τ) = P e^{−τ²/2σ²}

where P > 0 and σ > 0 are constants. Find the power density spectrum of X(t). [Hint: Use Appendix E to evaluate the Fourier transform of R_XX(τ).]

7-23 A random process has an autocorrelation function

    R_XX(τ) = P[1 − (2τ/T)]    for 0 < τ ≤ T/2
            = P[1 + (2τ/T)]    for −T/2 ≤ τ ≤ 0
            = 0                 for τ < −T/2 and τ > T/2

Find and sketch its power density spectrum. [Hint: Use Appendix E.]

*7-24 A random process X(t) has a periodic autocorrelation function where the function of Problem 7-23 forms the central period of duration T. Find and sketch the power spectrum.

7-25 If the random processes of Problem 7-14 are stationary, zero-mean, statistically independent processes, show that the power spectrum of the sum is the same as for orthogonal processes. For stationary independent processes with nonzero means, what is S_XX(ω)?

7-26 Given that a process X(t) has the autocorrelation function

    R_XX(τ) = A e^{−α|τ|} cos(ω₀τ)

where A > 0, α > 0, and ω₀ are real constants, find the power spectrum of X(t).

7-27 A random process X(t) having the power spectrum of Problem 7-19 is applied to an ideal differentiator.
(a) Find the power spectrum of the differentiator's output.
(b) What is the power in the derivative?

7-28 Work Problem 7-27 for the power spectrum of Problem 7-20.

7-29 A wide-sense stationary random process X(t) is used to define another process by

    Y(t) = ∫_{−∞}^{∞} h(ξ) X(t − ξ) dξ

where h(t) is some real function having a Fourier transform H(ω). Show that the power spectrum of Y(t) is given by

    S_YY(ω) = S_XX(ω) |H(ω)|²

7-30 A deterministic signal A cos(ω₀t), where A and ω₀ are real constants, is added to a noise process N(t) whose power spectrum S_NN(ω) is specified in terms of a constant W > 0.
(a) Find the ratio of average signal power to average noise power.
(b) What value of W maximizes the signal-to-noise ratio? What is the consequence of choosing this value of W?
7-31 Find the rms bandwidth of the power spectrum

    S_XX(ω) = P/[1 + (ω/W)²]    for |ω| < KW
            = 0                  for |ω| > KW

where P, W, and K are real positive constants. If K → ∞, what happens?

7-32 Find the rms bandwidth of the power spectrum

    S_XX(ω) = P cos(πω/2W)    for |ω| ≤ W
            = 0                for |ω| > W

where W > 0 and P > 0 are constants.

7-33 Determine the rms bandwidth of the power spectrums given by:
(a) S_XX(ω) = P for |ω| < W, and 0 for |ω| > W
(b) S_XX(ω) = P[1 − |ω/W|] for |ω| ≤ W, and 0 for |ω| > W
where P and W are real positive constants.

*7-34 Given the power spectrum

    S_XX(ω) = P/[1 + ((ω − α)/W)²]² + P/[1 + ((ω + α)/W)²]²

where P, α, and W are real positive constants, find the mean frequency and rms bandwidth.

7-35 Show that the rms bandwidth of the power spectrum of a real bandpass process X(t) is given by

    W_rms² = 4[W̄² − ω̄₀²]

where ω̄₀ is given by (7.1-23) and W̄² is given by the right side of (7.1-22).

*7-36 Jointly wide-sense stationary random processes X(t) and Y(t) define a process W(t) by

    W(t) = X(t) cos(ω₀t) + Y(t) sin(ω₀t)

where ω₀ is a real positive constant.
(a) Develop some conditions on the mean values and correlation functions of X(t) and Y(t) such that W(t) is wide-sense stationary.
(b) With the conditions of part (a) applied to W(t), find its power spectrum in terms of the power spectrums of X(t) and Y(t).
(c) If X(t) and Y(t) are also uncorrelated, what is the power spectrum of W(t)?

7-37 A random process is given by

    W(t) = AX(t) + BY(t)

where A and B are real constants and X(t) and Y(t) are jointly wide-sense stationary processes.
(a) Find the power spectrum S_WW(ω) of W(t).
(b) Find S_WW(ω) if X(t) and Y(t) are uncorrelated.
(c) Find the cross-power spectrums S_XW(ω) and S_YW(ω).

*7-38 Define two random processes by

    X(t) = A cos(ω₀t + Θ)
    Y(t) = W(t) cos(ω₀t + Θ)

where A and ω₀ are real positive constants, Θ is a random variable independent of W(t), and W(t) is a random process with a constant mean value W̄. By using (7.3-12), show that

    S_XY(ω) = (AW̄π/2)[δ(ω − ω₀) + δ(ω + ω₀)]

regardless of the form of the probability density function of Θ.

*7-39 Again consider the random processes of Problem 7-38.
(a) Use (6.3-11) to show that the cross-correlation function is given by

    R_XY(t, t + τ) = (AW̄/2){cos(ω₀τ) + E[cos(2Θ)] cos(2ω₀t + ω₀τ) − E[sin(2Θ)] sin(2ω₀t + ω₀τ)}

where the expectation is with respect to Θ only.
(b) Find the time average of R_XY(t, t + τ) and determine the cross-power density spectrum S_XY(ω).

7-40 Decompose the cross-power spectrums into real and imaginary parts according to

    S_XY(ω) = R_XY(ω) + jI_XY(ω)
    S_YX(ω) = R_YX(ω) + jI_YX(ω)

and prove that

    R_XY(ω) = R_YX(−ω) = R_YX(ω)
    I_XY(ω) = I_YX(−ω) = −I_YX(ω)

7-41 From the results of Problem 7-40, prove (7.3-16).

7-42 Show that (7.3-19) and (7.3-20) are true.
7-43 (a) Sketch the power spectrum of (7.5-4) as a function of αω/T.
(b) For what values of ω will S_NN(ω) remain above 0.5(𝒩₀/2) when T = 4.2 K (the value of liquid helium at one atmosphere of pressure)? These values form the region where thermal noise is approximately white in some amplifiers operated at very low temperatures, such as a maser.

7-44 For the power spectrum given in Figure 7.5-2a, show that (7.5-6) defines the corresponding band-limited noise autocorrelation function.

7-45 Show that (7.5-8) gives the autocorrelation function of the bandpass band-limited noise defined by Figure 7.5-3a.

7-46 A lowpass random process X(t) has a continuous power spectrum S_XX(ω) and S_XX(0) ≠ 0. Find the bandwidth W of a lowpass band-limited white-noise power spectrum having a density S_XX(0) and the same total power as in X(t).

7-47 Work Problem 7-46 for a bandpass process assuming S_XX(ω₀) ≠ 0, where ω₀ is some convenient frequency about which the spectral components of X(t) cluster.

*7-48 A complex random process is given by

    Z(t) = A e^{jΩt}

where Ω is a random variable with probability density function f_Ω(·) and A is a complex constant. Show that the power spectrum of Z(t) is

    S_ZZ(ω) = 2π|A|² f_Ω(ω)

ADDITIONAL PROBLEMS

7-49 The autocorrelation function of a random process X(t) is

    R_XX(τ) = 3 + 2 exp(−4τ²)

(a) Find the power spectrum of X(t).
(b) What is the average power in X(t)?
(c) What fraction of the power lies in the frequency band −1/√2 ≤ ω ≤ 1/√2?

7-50 State whether or not each of the following functions can be a valid power density spectrum. For those that cannot, explain why.
(a) |ω| exp(−4ω²)/(1 + jω)
(b) cos(3ω) exp(−ω² + j2ω)
(c) ω⁶/(12 + ω²)⁶
(d) 6 tan[12ω/(1 + ω²)]
(e) cos²(ω) exp(−8ω²)
(f) (−jω)(jω)/[(3 − jω)²(3 + jω)²]

7-51 If S_XX(ω) is a valid power spectrum of a random process X(t), discuss whether the functions dS_XX(ω)/dω and d²S_XX(ω)/dω² can be valid power spectrums.

7-52 (a) Rework Problem 7-15 and show that even if Θ is a constant (not random) the power spectrum is still given by

    S_XX(ω) = (πA²/2)[f_Ω(ω) + f_Ω(−ω)]

[Hint: Time-average the autocorrelation function before Fourier transforming to obtain S_XX(ω).]
(b) Find the total power in X(t) and show that it is independent of the form of the density function f_Ω(ω).

7-53 Find the rms bandwidth of the power spectrum

    S_XX(ω) = 1/[1 + (ω/W)²]³

where W > 0 is a constant.

7-54 Work Problem 7-53 for the power spectrum

    S_XX(ω) = ω²/[1 + (ω/W)²]³

7-55 Work Problem 7-53 for the power spectrum

    S_XX(ω) = 1/[1 + (ω/W)²]⁴

7-56 Work Problem 7-53 for the power spectrum

    S_XX(ω) = ω²/[1 + (ω/W)²]⁴

*7-57 Generalize Problems 7-53 and 7-55 by finding the rms bandwidth of the power spectrum

    S_XX(ω) = 1/[1 + (ω/W)²]^N

where N ≥ 2 is an integer.

*7-58 Generalize Problems 7-54 and 7-56 by finding the rms bandwidth of the power spectrum

    S_XX(ω) = ω²/[1 + (ω/W)²]^N

where N ≥ 3 is an integer.

7-59 Assume a random process has a power spectrum

    S_XX(ω) = 4 − (ω²/9)    for |ω| ≤ 6
            = 0              elsewhere

Find (a) the average power, (b) the rms bandwidth, and (c) the autocorrelation function of the process.

7-60 Show that the rms bandwidth of a lowpass random process X(t), as given by (7.1-22), can also be obtained from

    W_rms² = −[1/R_XX(0)] d²R_XX(τ)/dτ² |_{τ=0}

where R_XX(τ) is the autocorrelation function of X(t).
7-61 A random process has the autocorrelation function

    R_XX(τ) = B cos²(ω₀τ) exp(−W|τ|)

where B, ω₀, and W are positive constants.
(a) Find and sketch the power spectrum of X(t) when ω₀ is at least several times larger than W.
(b) Compute the average power in the lowpass part of the power spectrum. Repeat for the bandpass part. In each case assume ω₀ ≫ W.

*7-62 Generalize Problem 7-61 by replacing cos²(ω₀τ) with cos^N(ω₀τ), where N ≥ 0 is an integer. What is the resulting power spectrum when N is (a) odd, and (b) even?

*7-63 The product of a wide-sense stationary gaussian random process X(t) with itself delayed by T seconds forms a new process Y(t) = X(t)X(t − T). Determine (a) the autocorrelation function, and (b) the power spectrum of Y(t). [Hint: Use the fact that E[X₁X₂X₃X₄] = E[X₁X₂]E[X₃X₄] + E[X₁X₃]E[X₂X₄] + E[X₁X₄]E[X₂X₃] − 2E[X₁]E[X₂]E[X₃]E[X₄] for gaussian random variables X₁, X₂, X₃, and X₄. (Thomas, 1969, p. 64.)]

7-64 Find the cross-correlation function R_XY(t, t + τ) and cross-power spectrum S_XY(ω) for the delay-and-multiply device of Problem 7-63. [Hint: Use the fact that E[X₁X₂X₃] = E[X₁]E[X₂X₃] + E[X₂]E[X₁X₃] + E[X₃]E[X₁X₂] − 2E[X₁]E[X₂]E[X₃] for three gaussian random variables X₁, X₂, and X₃. (Thomas, 1969, p. 64.)]
7-65 If X(t) and Y(t) are real random processes, determine which of the following functions can be valid. For those that are not, state at least one reason why.
(a) R_XX(τ) = exp(−|τ|)
(b) |R_XY(τ)| ≤ √[R_XX(0) R_YY(0)]
(c) R_XY(τ) = 2 sin(3τ)
(d) S_XX(ω) = 6/(6 + 7ω⁴)
(e) S_XX(ω) = 4 exp(−3|τ|)/(1 + ω²)
(f) S_XX(ω) = 18δ(ω)

7-66 Form the product of two statistically independent jointly wide-sense stationary random processes X(t) and Y(t) as

    W(t) = X(t) Y(t)

Find general expressions for the following correlation functions and power spectrums in terms of those of X(t) and Y(t): (a) R_WW(t, t + τ) and S_WW(ω), (b) R_XW(t, t + τ) and S_XW(ω), and (c) R_WX(t, t + τ) and S_WX(ω). (d) If

    R_XX(τ) = (W₁/π) Sa(W₁τ)

and

    R_YY(τ) = (W₂/π) Sa(W₂τ)

with constants W₂ > W₁, find explicit functions for R_WW(t, t + τ) and S_WW(ω).

7-67 An engineer is working with the function

    R_XY(τ) = P(1 + τ) exp(−W²τ²)

where P > 0 and W > 0 are constants. He suspects that the function may not be a valid cross-correlation for two jointly stationary processes X(t) and Y(t), as he has been told. Determine if his suspicions are true. [Hint: Find the cross-power spectrum and see if it satisfies properties (7.3-16) through (7.3-18).]

7-68 A wide-sense stationary process X(t) is applied to an ideal differentiator having the response Y(t) = dX(t)/dt. The cross-correlation of the input-output processes is known to be

    R_XY(τ) = dR_XX(τ)/dτ

(a) Determine S_XY(ω) and S_YX(ω) in terms of the power spectrum S_XX(ω) of X(t).
(b) Since S_XX(ω) must be real, nonnegative, and have even symmetry, what are the properties of S_XY(ω)?

7-69 The cross-correlation of jointly wide-sense stationary processes X(t) and Y(t) is assumed to be

    R_XY(τ) = B u(τ) exp(−Wτ)

where B > 0 and W > 0 are constants.
(a) Find R_YX(τ).
(b) Find S_XY(ω) and S_YX(ω).

7-70 Work Problem 7-69 for the function

    R_XY(τ) = B u(τ) τ exp(−Wτ)

7-71 The cross-power spectrum for random processes X(t) and Y(t) can be written as

    S_XY(ω) = S_XX(ω) H(ω)

where S_XX(ω) is the power spectrum of X(t) and H(ω) is a function with an inverse Fourier transform h(t). Derive expressions for R_XY(τ) and R_YX(τ) in terms of R_XX(τ) and h(t).

7-72 The power spectrum of a bandpass process X(t) is shown in Figure P7-72. X(t) is applied to a product device where the second multiplying input is 3 cos(ω₀t). Plot the power spectrum of the device's output 3X(t) cos(ω₀t).

Figure P7-72 [Power spectrum of the bandpass process X(t); nonzero bands centered near ±ω₀ with edges at ω₀ ± W and −ω₀ ∓ W.]
7-73 Let the "carrier" A₀ cos(ω₀t) in Figure 7.5-5 be modified to add a phase random variable Θ so that Y(t) = A₀ X(t) cos(ω₀t + Θ). If Θ is uniformly distributed on (0, 2π) and is independent of X(t), find R_YY(t, t + τ) and S_YY(ω) when X(t) is wide-sense stationary.

7-74 Assume a stationary bandpass process X(t) is adequately approximated by the power spectrum

    S_XX(ω) = P u(ω − ω₀)(ω − ω₀) exp[−(ω − ω₀)²/b] + P u(−ω − ω₀)(−ω − ω₀) exp[−(ω + ω₀)²/b]

where ω₀, P > 0, and b > 0 are constants. The product Y(t) = X(t) cos(ω₀t) is formed.
(a) Find and sketch the power spectrum of Y(t).
(b) Determine the average power in X(t) and Y(t).

*7-75 Compute the power spectrum of the complex process of Problem 6-55.

7-76 Let X(t) and Y(t) be statistically independent processes with power spectrums

    S_XX(ω) = 2δ(ω) + 1/[1 + (ω/10)²]

and a given S_YY(ω). A complex process

    Z(t) = [X(t) + jY(t)] exp(jω₀t)

is formed, where ω₀ is a constant much larger than 10.
(a) Determine the autocorrelation function of Z(t).
(b) Find and sketch the power spectrum of Z(t).

CHAPTER EIGHT

LINEAR SYSTEMS WITH RANDOM INPUTS

8.0 INTRODUCTION

A large part of our preceding work has been aimed at describing a random signal by modeling it as a sample function of a random process. We have found that time domain methods based on correlation functions, and frequency domain techniques based on power spectrums, constitute powerful ways of defining the behavior of random signals. Our work must not stop here, however, because one of the most important aspects of random signals is how they interact with linear systems. The knowledge of how to describe a random waveform would be of little value to a communication or control system engineer, for example, unless he was also able to determine how such a waveform will alter the desired output of his system.

In this chapter, we explore methods of describing the response of a linear system when the applied waveform is random. We begin by discussing some basic aspects of linear systems in the following section. Those readers well-versed in linear system theory can proceed directly to Section 8.2 without loss. For others, the topics of Section 8.1 should serve as a brief review and summary.

8.1 LINEAR SYSTEM FUNDAMENTALS

In this section, a brief summary of the basic aspects of linear systems is given. Attention will be limited to a system having only one input and one output, or response, as illustrated in Figure 8.1-1. It is assumed that the input signal x(t) and
