
CODING THEORY

A Bird's Eye View

Text Books
Shu Lin and Daniel J. Costello Jr., Error Control Coding: Fundamentals and Applications, Prentice Hall Inc.
R. E. Blahut, Theory and Practice of Error Control Coding, McGraw-Hill.

References:
Rolf Johannesson and Kamil Sh. Zigangirov, Fundamentals of Convolutional Coding, Universities Press (India) Ltd., 2001.
J. G. Proakis, Digital Communications, McGraw-Hill.

Introduction
Role of Channel Coding in a Digital Communication System
Block Codes and Convolutional Codes
Channel Models
Decoding Rules
Error Correction Schemes
References


Shannon's Theorem (1948)


Noisy Channel Coding Theorem due to Shannon. Roughly: consider a channel with capacity C. If we are willing to settle for a rate of transmission that is strictly below C, then there is an encoding scheme for the source data that will reduce the probability of a decision error to any desired level. Problem: the proof is not constructive! To this day, no one has found a general way to construct the coding schemes promised by Shannon's theorem.

Shannon's Theorem (1948)-contd.


Additional concerns: Is the coding scheme easy to implement, both in encoding and decoding? Achieving the promised performance may require extremely long codes.

The Shannon-Hartley Theorem


Gives us a theoretical maximum bit-rate that can be transmitted with an arbitrarily small bit-error rate (BER), with a given average signal power, over a channel of bandwidth B Hz affected by AWGN. For any given BER, however small, we can find a coding technique that achieves it; the smaller the given BER, the more complicated the coding technique will be.

Shannon-Hartley Theorem-contd.

Let the channel bandwidth be B Hz and the signal-to-noise ratio be S/N (not in dB).

$$C = B \log_2 \left( 1 + \frac{S}{N} \right) \quad \text{bits/sec}$$
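As a minimal sketch, the capacity formula can be evaluated directly; the 3 kHz bandwidth and 30 dB SNR below are arbitrary example values, not figures from the slides:

```python
import math

def channel_capacity(bandwidth_hz: float, snr_linear: float) -> float:
    """Shannon-Hartley capacity C = B * log2(1 + S/N) in bits/sec."""
    return bandwidth_hz * math.log2(1.0 + snr_linear)

# Assumed example: a 3 kHz telephone-style channel at 30 dB SNR.
snr_linear = 10.0 ** (30.0 / 10.0)           # 30 dB -> linear ratio of 1000
print(channel_capacity(3000.0, snr_linear))  # ~29,900 bits/sec
```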

Shannon-Hartley Theorem-contd.

For a given bandwidth B and a given S/N, we can find a way of transmitting data at a bit-rate R bits/second, with a bit-error rate (BER) as low as we like, as long as R ≤ C. Now assume we wish to transmit at an average energy per bit of Eb, and the AWGN has two-sided power spectral density N0/2 Watts per Hz. It follows that the signal power is S = Eb·R Watts and the noise power is N = N0·B Watts.

Shannon-Hartley Theorem-contd.

The R/B ratio is called the bandwidth efficiency, in bits/sec/Hz: how many bits per second do I get for each Hz of bandwidth? We want this to be as high as possible. Eb/N0 is the normalised average energy per bit, where the normalisation is with respect to the one-sided PSD of the noise. The law gives the following bounds:

Shannon-Hartley Theorem-contd.
$$\frac{R}{B} \le \log_2 \left( 1 + \frac{E_b}{N_0} \cdot \frac{R}{B} \right)$$

$$\left( \frac{E_b}{N_0} \right)_{\min} = \frac{2^{R/B} - 1}{R/B}$$
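A minimal sketch evaluating this bound for a few bandwidth efficiencies (the sample R/B values are arbitrary); it shows (Eb/N0)min falling toward ln 2 ≈ 0.693, i.e. about -1.6 dB, as R/B tends to 0:

```python
import math

def ebn0_min(r_over_b: float) -> float:
    """Minimum Eb/N0 (linear) allowed by the bound, for efficiency R/B."""
    return (2.0 ** r_over_b - 1.0) / r_over_b

for r_over_b in (4.0, 2.0, 1.0, 0.5, 0.1, 0.01):
    lin = ebn0_min(r_over_b)
    db = 10.0 * math.log10(lin)
    print(f"R/B = {r_over_b:5.2f}  (Eb/N0)min = {lin:.4f}  ({db:+.2f} dB)")

# As R/B -> 0 the minimum tends to ln 2 ~ 0.693, i.e. about -1.6 dB.
```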

Shannon Limit

The bound gives the minimum possible normalised energy per bit satisfying the Shannon-Hartley law. If we draw a graph of (Eb/N0)min against R/B, we observe that (Eb/N0)min never goes below about 0.69 (= ln 2), which is about -1.6 dB. Therefore, if our normalised energy per bit is less than -1.6 dB, we can never satisfy the Shannon-Hartley law, however inefficient (in terms of bits/sec/Hz) we are prepared to be.

Shannon Limit-contd.

There exists a limiting value of Eb/N0 below which there cannot be error-free communication at any transmission rate. The curve R = C divides the achievable and non-achievable regions.

Modulation-Coding trade-off

For Pb = 10^-5, BPSK modulation requires Eb/N0 = 9.6 dB (optimum uncoded binary modulation). For this case, Shannon's work promised a performance improvement of 11.2 dB over uncoded binary modulation, through the use of coding techniques. Today, Turbo Codes are capable of achieving an improvement close to this. Turbo Codes are near-Shannon-limit error correcting codes.
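The 9.6 dB figure can be checked with the standard BER formula for uncoded BPSK over AWGN, Pb = Q(sqrt(2 Eb/N0)); the slides do not state this formula, so the sketch below assumes it:

```python
import math

def bpsk_ber(ebn0_db: float) -> float:
    """Uncoded BPSK over AWGN: Pb = Q(sqrt(2*Eb/N0)) = 0.5*erfc(sqrt(Eb/N0))."""
    ebn0_linear = 10.0 ** (ebn0_db / 10.0)
    return 0.5 * math.erfc(math.sqrt(ebn0_linear))

print(bpsk_ber(9.6))  # ~1e-5, the operating point quoted above
# Coding gain promised by Shannon: 9.6 dB - (-1.6 dB) = 11.2 dB.
```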

Coding Theory-Introduction
Main problem:

A stream of source data, in the form of 0s and 1s, is being transmitted over a communication channel, such as a telephone line. Occasionally, disruptions can occur in the channel, causing 0s to turn into 1s and vice versa. Question: How can we tell when the original data has been changed, and when it has, how can we recover the original data?

Coding Theory-Introduction

Easy things to try: Do nothing. If a channel error occurs with probability p, then the probability of making a decision error is p. Or send each bit 3 times in succession; the bit that occurs the majority of the time gets picked (e.g. 010 => 0). Repetition codes!!

Coding Theory-Introduction

Generalize the above: send each bit n times and choose the majority bit. In this way, we can make the probability of making a decision error arbitrarily small, but this is inefficient in terms of transmission rate. As n increases, the achievable BER reduces, at the expense of increased codeword length (reduced code rate). Repetition coding is inefficient, as the sketch below illustrates.
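A minimal sketch of this n-fold repetition scheme: the decision error occurs exactly when more than half of the n copies are flipped, giving the binomial tail below. The channel error probability p = 0.01 is an arbitrary example value:

```python
import math

def majority_decode(copies: list[int]) -> int:
    """Majority vote over the n received copies of one source bit."""
    return 1 if sum(copies) > len(copies) / 2 else 0

def decision_error_prob(n: int, p: float) -> float:
    """P(wrong vote) = P(more than n/2 of the n copies are flipped)."""
    return sum(math.comb(n, k) * p**k * (1.0 - p) ** (n - k)
               for k in range(n // 2 + 1, n + 1))

print(majority_decode([0, 1, 0]))           # -> 0, as in the slide's example
for n in (1, 3, 5, 9):                      # error falls, but rate is only 1/n
    print(n, decision_error_prob(n, 0.01))
```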

Coding Theory Introduction (contd)


Encode source information by adding additional information (redundancy) that can be used to detect, and perhaps correct, errors in transmission. The more redundancy we add, the more reliably we can detect and correct errors, but the less efficient we become at transmitting the source data.
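As a toy illustration of the simplest such redundancy (my example, not from the slides): a single even-parity bit detects any single bit flip but cannot correct it.

```python
def add_parity(bits: list[int]) -> list[int]:
    """Append one even-parity bit so the total number of 1s is even."""
    return bits + [sum(bits) % 2]

def parity_ok(word: list[int]) -> bool:
    """An odd number of 1s reveals that some (single) bit was flipped."""
    return sum(word) % 2 == 0

word = add_parity([1, 0, 1, 1])  # -> [1, 0, 1, 1, 1]
word[2] ^= 1                     # the channel flips one bit
print(parity_ok(word))           # False: error detected, but not correctable
```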

Error control applications


Data communication networks (Ethernet, FDDI, WAN, Bluetooth)
Satellite and deep-space communications
Cellular mobile communications
Modems
Computer buses
Magnetic disks and tapes
CDs, DVDs. Digital sound needs ECC!

Error control categories


The error control problem can be classified in several ways:
Types of error control coding: detection vs. correction
Types of errors: how much clustering (random, burst, etc.)
Types of codes: block vs. convolutional

Error Control Strategies


Error detection. Goal: avoid accepting faulty data. Lost data may be unfortunate; wrong data may be disastrous.

(Forward) error correction (FEC or ECC). Use redundancy in the encoded message to estimate, from the received data, what message was actually sent. The best estimate is usually the "closest" message. The optimal estimate is the message that is most probable given what is received.
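A brute-force sketch of this "closest message" rule, measuring closeness by Hamming distance; the two-word codebook below is an arbitrary toy example:

```python
def hamming_distance(a: str, b: str) -> int:
    """Number of positions where two equal-length binary words differ."""
    return sum(x != y for x, y in zip(a, b))

def nearest_codeword(received: str, codebook: list[str]) -> str:
    """Minimum-distance decoding: pick the closest valid codeword."""
    return min(codebook, key=lambda cw: hamming_distance(received, cw))

# Toy codebook: the two codewords of a 5-fold repetition code.
codebook = ["00000", "11111"]
print(nearest_codeword("01001", codebook))  # -> "00000" (distance 2 vs 3)
```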
