
Lecture #3

Review of Lecture 2
Random Signals
Signal Transmission Through Linear Systems
Bandwidth of Digital Data

Random Signals
1. Random Variables

All useful message signals appear random; that is, the receiver does not know, a priori, which of the possible waveforms has been sent. Let a random variable X(A) represent the functional relationship between a random event A and a real number. The (cumulative) distribution function FX(x) of the random variable X is given by

FX(x) = P(X ≤ x) (1.24)

Another useful function relating to the random variable X is the probability density function (pdf):

pX(x) = dFX(x)/dx (1.25)
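As a quick numerical sketch of Eq. (1.25) (not from the lecture; the exponential distribution here is an illustrative choice), the pdf can be recovered by differentiating the CDF:

```python
import math

# Illustrative example: an exponential random variable with rate 1.
# Its CDF is F_X(x) = 1 - exp(-x) for x >= 0, so by Eq. (1.25) the
# pdf should be p_X(x) = dF_X(x)/dx = exp(-x).

def cdf(x):
    return 1.0 - math.exp(-x) if x >= 0 else 0.0

def pdf_numeric(x, h=1e-6):
    # p_X(x) = dF_X(x)/dx, approximated by a central difference
    return (cdf(x + h) - cdf(x - h)) / (2 * h)

# The numerical derivative of the CDF recovers the analytic pdf exp(-x)
for x in (0.5, 1.0, 2.0):
    assert abs(pdf_numeric(x) - math.exp(-x)) < 1e-6
```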

Averages

Time average: the averaged quantity of a single system over a time interval (directly related to the real experiment).
Ensemble average: the averaged quantity of many identical systems at a certain time (a theoretical concept).

Ensemble vs Time average

1.1 Ensemble Averages


The first moment of a probability distribution of a random variable X is called the mean value mX, or expected value, of X:

mX = E{X} = ∫_{−∞}^{∞} x pX(x) dx

The second moment of a probability distribution is the mean-square value of X:

E{X²} = ∫_{−∞}^{∞} x² pX(x) dx

Central moments are the moments of the difference between X and mX; the second central moment is the variance of X:

Var(X) = E{(X − mX)²} = ∫_{−∞}^{∞} (x − mX)² pX(x) dx

The variance is equal to the difference between the mean-square value and the square of the mean:

Var(X) = E{X²} − (E{X})²
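The identity Var(X) = E{X²} − (E{X})² can be checked numerically with the empirical averages of any small data set (the values below are arbitrary, not from the lecture):

```python
# A small numerical check of Var(X) = E{X^2} - (E{X})^2 using the
# empirical (sample) averages of an arbitrary data set.
xs = [1.0, 2.0, 2.0, 3.0, 4.0]
n = len(xs)

mean = sum(xs) / n                         # first moment, E{X}
mean_square = sum(x * x for x in xs) / n   # second moment, E{X^2}
var_central = sum((x - mean) ** 2 for x in xs) / n  # second central moment

# the two expressions for the variance agree
assert abs(var_central - (mean_square - mean ** 2)) < 1e-12
```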

2. Random Processes

A random process X(A, t) can be viewed as a function of two variables: an event A and time.

2.1 Statistical Averages of a Random Process

A random process whose distribution functions are continuous can be described statistically with a probability density function (pdf). A partial description consisting of the mean and autocorrelation function is often adequate for the needs of communication systems.

Mean of the random process X(t):

E{X(tk)} = ∫_{−∞}^{∞} x pX(x) dx = mX(tk) (1.30)

Autocorrelation function of the random process X(t):

RX(t1, t2) = E{X(t1) X(t2)} = ∫_{−∞}^{∞} ∫_{−∞}^{∞} x1 x2 p(x1, x2) dx1 dx2 (1.31)

2.2 Stationarity
A random process X(t) is said to be stationary in the strict sense if none of its statistics are affected by a shift in the time origin. A random process is said to be wide-sense stationary (WSS) if two of its statistics, its mean and autocorrelation function, do not vary with a shift in the time origin:

E{X(t)} = mX = a constant (1.32)
RX(t1, t2) = RX(t1 − t2) (1.33)

2.3 Autocorrelation of a Wide-Sense Stationary Random Process

For a wide-sense stationary process, the autocorrelation function is only a function of the time difference τ = t1 − t2:

RX(τ) = E{X(t) X(t + τ)} for −∞ < τ < ∞ (1.34)

Properties of the autocorrelation function of a real-valued wide-sense stationary process are:
1. RX(τ) = RX(−τ): symmetric in τ about zero
2. |RX(τ)| ≤ RX(0): the maximum value occurs at the origin
3. RX(τ) ↔ GX(f): the autocorrelation and the power spectral density form a Fourier transform pair
4. RX(0) = E{X²(t)}: the value at the origin equals the average power of the signal
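A minimal sketch of the symmetry and maximum-at-origin properties, using an estimated autocorrelation of one sample function (the moving-average process here is an illustrative choice, not from the lecture):

```python
import numpy as np

rng = np.random.default_rng(0)

# One long sample function of an (approximately) WSS zero-mean process:
# Gaussian noise smoothed by a 5-point moving average so it has memory.
x = np.convolve(rng.standard_normal(20000), np.ones(5) / 5, mode="same")
N = len(x)

def autocorr(x, lag):
    # biased estimate of Rx(tau) = E{X(t) X(t + tau)}
    if lag < 0:
        a, b = x[-lag:], x[:N + lag]
    else:
        a, b = x[:N - lag], x[lag:]
    return float(np.dot(a, b)) / N

# Property checks: even symmetry, and the maximum occurs at tau = 0
assert abs(autocorr(x, 3) - autocorr(x, -3)) < 1e-9
assert autocorr(x, 0) >= abs(autocorr(x, 3))
```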

3. Time Averaging and Ergodicity

When a random process belongs to a special class, known as an ergodic process, its time averages equal its ensemble averages. The statistical properties of such processes can be determined by time averaging over a single sample function of the process. A random process is ergodic in the mean if

mX = lim_{T→∞} (1/T) ∫_{−T/2}^{T/2} x(t) dt (1.35a)

It is ergodic in the autocorrelation function if

RX(τ) = lim_{T→∞} (1/T) ∫_{−T/2}^{T/2} x(t) x(t + τ) dt (1.35b)
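Ergodicity in the mean, Eq. (1.35a), can be illustrated numerically: the time average over one long sample function of a noisy process approaches the ensemble mean (the process x(t) = m + noise below is an illustrative stand-in, not from the lecture):

```python
import numpy as np

rng = np.random.default_rng(1)

# Illustrative ergodic-in-the-mean process: x(t) = m + white noise.
# Averaging one long sample function over time should approach the
# ensemble mean m, per Eq. (1.35a).
m = 2.0
t_avg = np.mean(m + rng.standard_normal(100000))

# with 100000 samples the time average is within a few std errors of m
assert abs(t_avg - m) < 0.05
```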

4. Power Spectral Density and Autocorrelation

A random process X(t) can generally be classified as a power signal having a power spectral density (PSD) GX(f). Principal features of PSD functions:
1. GX(f) ≥ 0: always real-valued and nonnegative
2. GX(f) = GX(−f) for X(t) real-valued
3. GX(f) ↔ RX(τ): the PSD and the autocorrelation form a Fourier transform pair
4. PX = ∫_{−∞}^{∞} GX(f) df: the total average power is the area under the PSD

5. Noise in Communication Systems

The term noise refers to unwanted electrical signals that are always present in electrical systems; e.g., spark-plug ignition noise, switching transients, and other radiating electromagnetic signals, or natural sources such as the atmosphere, the sun, and other galactic sources.
Can describe thermal noise as a zero-mean Gaussian random process.

A Gaussian process n(t) is a random function whose amplitude at any arbitrary time t is statistically characterized by the Gaussian probability density function

p(n) = (1/(σ√(2π))) exp[−(1/2)(n/σ)²] (1.40)

where σ² is the variance of n.


The normalized or standardized Gaussian density function of a zero-mean process is obtained by assuming unit variance. The central limit theorem states that the distribution of the sum of many statistically independent random variables approaches the Gaussian distribution, no matter what the individual distributions may be; this is one justification for the Gaussian noise model.

5.1 White Noise

The primary spectral characteristic of thermal noise is that its power spectral density is the same for all frequencies of interest in most communication systems. Such noise is called white noise, with power spectral density

Gn(f) = N0/2 (1.42)

The autocorrelation function of white noise is a delta function, meaning that any two samples of white noise, no matter how close together in time, are uncorrelated:

Rn(τ) = (N0/2) δ(τ) (1.43)

The average power Pn of white noise is infinite:

Pn = ∫_{−∞}^{∞} (N0/2) df = ∞ (1.44)

The effect on the detection process of a channel with additive white Gaussian noise (AWGN) is that the noise affects each transmitted symbol independently. Such a channel is called a memoryless channel. The term additive means that the noise is simply superimposed or added to the signal
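A minimal sketch (illustrative, not from the lecture) of the "uncorrelated samples" property of white noise, using independent Gaussian samples as a discrete-time stand-in:

```python
import numpy as np

rng = np.random.default_rng(2)

# Discrete-time stand-in for white Gaussian noise: independent,
# zero-mean, unit-variance samples. Samples at different times are
# uncorrelated, so the estimated autocorrelation is ~0 off the origin.
n = rng.standard_normal(200000)
N = len(n)

r0 = float(np.dot(n, n)) / N             # Rn(0): the noise power (~1 here)
r5 = float(np.dot(n[:-5], n[5:])) / N    # Rn(5): should be near zero

assert abs(r0 - 1.0) < 0.05
assert abs(r5) < 0.05
```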

Signal Transmission through Linear Systems

A system can be characterized equally well in the time domain or the frequency domain; techniques will be developed in both domains. The system is assumed to be linear and time invariant. It is also assumed that there is no stored energy in the system at the time the input is applied.

1. Impulse Response


The linear time-invariant system or network is characterized in the time domain by its impulse response h(t), the response to a unit impulse δ(t):

h(t) = y(t) when x(t) = δ(t) (1.45)

The response of the network to an arbitrary input signal x(t) is found by the convolution of x(t) with h(t):

y(t) = x(t) * h(t) = ∫_{−∞}^{∞} x(τ) h(t − τ) dτ (1.46)

The system is assumed to be causal, which means that there can be no output prior to the time, t = 0, when the input is applied. The convolution integral can then be expressed as:

y(t) = ∫_{0}^{t} x(τ) h(t − τ) dτ, t ≥ 0 (1.47a)
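The discrete-time analogue of the convolution in Eq. (1.46) can be sketched as follows (the signals are arbitrary illustrative values):

```python
import numpy as np

# Discrete convolution y = x * h, the sampled analogue of Eq. (1.46).
x = np.array([1.0, 2.0, 3.0])    # input signal
h = np.array([1.0, 0.5])         # impulse response of the system

y = np.convolve(x, h)

# Convolving a unit impulse with h returns h itself, as in Eq. (1.45)
delta = np.array([1.0])
assert np.allclose(np.convolve(delta, h), h)
assert np.allclose(y, [1.0, 2.5, 4.0, 1.5])
```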

2. Frequency Transfer Function

The frequency-domain output signal Y(f) is obtained by taking the Fourier transform:

Y(f) = H(f) X(f) (1.48)

The frequency transfer function, or frequency response, is defined as:

H(f) = Y(f) / X(f) (1.49)

H(f) = |H(f)| e^{jθ(f)} (1.50)

The phase response is defined as:

θ(f) = tan⁻¹ [Im{H(f)} / Re{H(f)}] (1.51)
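Eq. (1.48) can be verified numerically with the DFT, where multiplication of transforms corresponds to circular convolution in time (the signals below are arbitrary illustrative values):

```python
import numpy as np

# Checking Y(f) = H(f) X(f), Eq. (1.48), numerically: pointwise
# multiplication of DFTs equals circular convolution in time.
x = np.array([1.0, 2.0, 0.0, -1.0])
h = np.array([0.5, 0.25, 0.0, 0.0])

X, H = np.fft.fft(x), np.fft.fft(h)
Y = H * X
y_freq = np.fft.ifft(Y).real

# same result via direct circular convolution
y_time = np.array([sum(x[k] * h[(n - k) % 4] for k in range(4))
                   for n in range(4)])

assert np.allclose(y_freq, y_time)
```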

2.1. Random Processes and Linear Systems


If a random process forms the input to a time-invariant linear system, the output will also be a random process. The input power spectral density GX(f) and the output power spectral density GY(f) are related as:

GY(f) = GX(f) |H(f)|² (1.53)

3. Distortionless Transmission
What is the required behavior of an ideal transmission line?

The output signal from an ideal transmission line may have some time delay and a different amplitude than the input, but it must have no distortion: it must have the same shape as the input. For ideal distortionless transmission:

Output signal in time domain: y(t) = K x(t − t0) (1.54)
Output signal in frequency domain: Y(f) = K X(f) e^{−j2πf t0} (1.55)
System transfer function: H(f) = K e^{−j2πf t0} (1.56)


The overall system response must have a constant magnitude response.
The phase shift must be linear with frequency.
All of the signal's frequency components must arrive with identical time delay in order to add up correctly.

The time delay t0 is related to the phase shift θ and the radian frequency ω = 2πf by:

t0 (seconds) = θ (radians) / 2πf (radians/second) (1.57a)

Another characteristic often used to measure delay distortion of a signal is called envelope delay or group delay:

τ(f) = −(1/2π) (dθ(f)/df) (1.57b)
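A minimal numerical sketch of the distortionless channel of Eq. (1.56) (illustrative values, not from the lecture): applying H(f) = K e^{−j2πf t0} in the frequency domain yields a scaled, delayed copy of the input with no change of shape.

```python
import numpy as np

# Distortionless channel: constant magnitude K, linear phase -2*pi*f*d.
# For a sampled signal this produces a circular delay of d samples.
N = 8
K, d = 2.0, 3        # gain and delay in samples (illustrative values)
x = np.array([1.0, 4.0, 2.0, -1.0, 0.0, 3.0, -2.0, 5.0])

f = np.fft.fftfreq(N)                    # normalized DFT frequencies
H = K * np.exp(-2j * np.pi * f * d)      # constant |H|, linear phase
y = np.fft.ifft(H * np.fft.fft(x)).real

# output is a scaled, circularly delayed copy: same shape, no distortion
assert np.allclose(y, K * np.roll(x, d))
```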

3.1. Ideal Filters

The transfer function of an ideal low-pass filter with bandwidth Wf = fu hertz can be written as:

H(f) = |H(f)| e^{−jθ(f)} (1.58)

where

|H(f)| = 1 for |f| < fu, 0 for |f| ≥ fu (1.59)

e^{−jθ(f)} = e^{−j2πf t0} (1.60)

Figure 1.11 (b) Ideal low-pass filter


The impulse response of the ideal low-pass filter:

h(t) = F⁻¹{H(f)} = ∫_{−fu}^{fu} H(f) e^{j2πft} df (1.61)

     = ∫_{−fu}^{fu} e^{−j2πf t0} e^{j2πft} df

     = ∫_{−fu}^{fu} e^{j2πf(t − t0)} df

     = 2fu · sin[2πfu(t − t0)] / [2πfu(t − t0)]

     = 2fu sinc[2fu(t − t0)] (1.62)
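The sinc-shaped impulse response of Eq. (1.62) can be sketched directly (the cutoff and delay values below are illustrative; NumPy's `np.sinc(x)` is the normalized sin(πx)/(πx), matching the convention in Eq. (1.62)):

```python
import numpy as np

# Impulse response of the ideal low-pass filter, Eq. (1.62):
# h(t) = 2 fu sinc(2 fu (t - t0)).
fu, t0 = 100.0, 0.01   # cutoff (Hz) and delay (s); illustrative values

def h(t):
    return 2 * fu * np.sinc(2 * fu * (t - t0))

# peak value 2 fu at t = t0, and zeros spaced 1/(2 fu) from the peak
assert np.isclose(h(t0), 2 * fu)
assert np.isclose(h(t0 + 1 / (2 * fu)), 0.0)
```

Note that h(t) is nonzero for t < 0, which is why the ideal low-pass filter is not realizable.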



The transfer functions of the ideal band-pass filter and the ideal high-pass filter are defined analogously, with |H(f)| = 1 over the respective passband and 0 elsewhere.

Figure 1.11 (a) Ideal band-pass filter; (c) Ideal high-pass filter

3.2. Realizable Filters

The simplest example of a realizable low-pass filter is an RC filter:

H(f) = 1 / (1 + j2πfRC) = e^{−jθ(f)} / √(1 + (2πfRC)²) (1.63)
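The magnitude and phase of Eq. (1.63) can be evaluated directly (the component values below are illustrative, not from the lecture):

```python
import math

# Magnitude and phase of the RC low-pass filter, Eq. (1.63).
R, C = 1e3, 1e-6     # 1 kOhm, 1 uF (illustrative component values)

def mag(f):
    return 1.0 / math.sqrt(1.0 + (2 * math.pi * f * R * C) ** 2)

def phase(f):
    # theta(f) from Eq. (1.51): the phase lag of the RC filter
    return -math.atan(2 * math.pi * f * R * C)

f3dB = 1.0 / (2 * math.pi * R * C)   # half-power (3 dB) frequency

# at f3dB the magnitude is 1/sqrt(2) and the phase is -45 degrees
assert abs(mag(f3dB) - 1 / math.sqrt(2)) < 1e-12
assert abs(phase(f3dB) + math.pi / 4) < 1e-12
```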

Figure 1.13: Magnitude and phase characteristics of the RC filter

There are several useful approximations to the ideal low-pass filter characteristic and one of these is the Butterworth filter

|Hn(f)| = 1 / [1 + (f/fu)^{2n}]^{1/2},  n ≥ 1 (1.65)

Butterworth filters are popular because they are the best approximation to the ideal, in the sense of maximal flatness in the filter passband.
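Eq. (1.65) is easy to evaluate directly; this sketch (with an illustrative cutoff frequency) checks the half-power point at f = fu and the faster roll-off of higher orders:

```python
import math

# Butterworth magnitude response of Eq. (1.65):
# |Hn(f)| = 1 / sqrt(1 + (f/fu)^(2n)).
def butterworth_mag(f, fu, n):
    return 1.0 / math.sqrt(1.0 + (f / fu) ** (2 * n))

fu = 1000.0  # illustrative cutoff frequency in Hz

# Every order passes through 1/sqrt(2) at f = fu, and higher orders
# roll off faster beyond the cutoff (closer to the ideal filter).
assert abs(butterworth_mag(fu, fu, 3) - 1 / math.sqrt(2)) < 1e-12
assert butterworth_mag(2 * fu, fu, 4) < butterworth_mag(2 * fu, fu, 1)
```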

4. Bandwidth Of Digital Data


4.1 Baseband versus Bandpass

An easy way to translate the spectrum of a low-pass or baseband signal x(t) to a higher frequency is to multiply, or heterodyne, the baseband signal with a carrier wave cos 2πfct. The resulting xc(t) is called a double-sideband (DSB) modulated signal:

xc(t) = x(t) cos 2πfct (1.70)

From the frequency-shifting theorem:

Xc(f) = ½ [X(f − fc) + X(f + fc)] (1.71)

Generally the carrier frequency is much higher than the bandwidth of the baseband signal, fc >> fm, and therefore WDSB = 2fm.
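The frequency-shifting theorem of Eq. (1.71) can be sketched numerically: multiplying a baseband tone at fm by a carrier at fc produces spectral components at fc ± fm (all frequencies below are illustrative values):

```python
import numpy as np

# Frequency-shifting check for DSB modulation, Eq. (1.71).
fs, N = 1000, 1000                   # sample rate (Hz) and length
t = np.arange(N) / fs
fm, fc = 10, 200                     # baseband and carrier frequencies (Hz)

x = np.cos(2 * np.pi * fm * t)       # baseband signal
xc = x * np.cos(2 * np.pi * fc * t)  # DSB modulated signal

spectrum = np.abs(np.fft.rfft(xc))
peaks = np.argsort(spectrum)[-2:]    # two strongest positive-frequency bins

# the baseband tone has been translated to fc - fm and fc + fm
assert set(peaks) == {fc - fm, fc + fm}
```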

4.2 Bandwidth

Theorems of communication and information theory are based on the assumption of strictly bandlimited channels. However, the mathematical description of a real signal does not permit the signal to be both strictly duration limited and strictly bandlimited.


All bandwidth criteria have in common the attempt to specify a measure of the width, W, of a nonnegative real-valued spectral density defined for all frequencies |f| < ∞. The single-sided power spectral density for a pulse xc(t) takes the analytical form given in Eq. (1.73).

Different Bandwidth Criteria

(a) Half-power bandwidth
(b) Equivalent rectangular or noise equivalent bandwidth
(c) Null-to-null bandwidth
(d) Fractional power containment bandwidth
(e) Bounded power spectral density
(f) Absolute bandwidth
