Information Theory and Coding (ECE533)
Advanced Coding Theory (ECEP581)

Nikesh Bajaj
nikesh.14730@lpu.co.in
Asst. Prof., ECE Dept., Lovely Professional University

Overview
- Introduction to Course
- Syllabus & Books
- Course Logistics
- Prerequisite Knowledge
- Information Theory

Introduction to Course: Expectations and Aim
- What are your expectations?
- What is the course about?
- What are the course's expectations?
- What comes to your mind when you hear the word "coding"?

Communication System
- Purpose: transmitting information to a destination through some medium or channel.
- Typical block diagram of a communication system:
  Information Source -> Tx -> Channel -> Rx -> Information User
- Examples: FM radio, telephone, mobile communication, television
- Storage channels: CD, DVD, magnetic tape



Digital Communication System: Block Diagram
- Transmit chain: Information Source -> Formatting -> Source Encoding -> Encryption -> Channel Encoding -> Multiplexing -> Modulation -> Freq. Spreading -> Multiple Access -> Tx
- Other sources feed the multiplexing and multiple-access blocks; synchronization operates alongside the channel.
- Receive chain: Rx -> Multiple Access -> Freq. Despreading -> Demodulation -> Demultiplexing -> Channel Decoding -> Decryption -> Source Decoding -> Formatting -> Information Sink
- The CODING blocks in this chain provide:
  1. Efficiency
  2. Reliability
  3. Security


Philosophy
The basic philosophy of the course:
- Most of the ideas in modern coding are very intuitive and natural.
- If someone had not invented them a few years ago, you could invent them yourself.

Syllabus (check the updated IP)
Units:
1. Information Theory and Source Coding
2. Foundations of Error Correcting Codes
3. Groups and Vector Spaces
4. Linear Block Codes
5. Cyclic Codes
6. Number Theory and Algebra
7. BCH and Reed-Solomon Codes
8. Convolutional Codes
9. Trellis and Turbo Codes
10. Bounds on Codes and Other Codes


Syllabus: Overview of Subject (check the updated IP)
Part 1: Information Theory and Source Coding
- Source coding
- Channel capacity and coding
Part 2: Channel Coding I
- Linear block codes
- Cyclic codes
Part 3: Channel Coding II
- BCH codes
- Convolutional codes
- Trellis coded modulation

Books
Text book:
- Todd K. Moon, Error Correction Coding, Wiley, India, 1st Edition (2005). ISBN: 978-0471648000
Other specific books:
- Ranjan Bose, Information Theory, Coding and Cryptography, TMH Publication, 2005.

Books (continued)
Other specific books:
- Richard E. Blahut, Algebraic Codes for Data Transmission, Cambridge University Press, 2003.
- Thomas M. Cover and Joy A. Thomas, Elements of Information Theory, 2nd ed., Wiley-Interscience, New York, 2006. ISBN: 9780471241959
- Andre Neubauer, Jurgen Freudenberger, and Volker Kuhn, Coding Theory: Algorithms, Architectures and Applications, John Wiley & Sons, Ltd.

Prerequisite Knowledge
- Communication systems
- Mathematics & probability: strong conceptual understanding
- Programming: MATLAB, Maple, Python, Euler, and others


Course Logistics
- Assignment (not to submit) + Test 1
- Programming Assignment + Test 2 (open book)
- Design Problem (unique to each student)

Assignments & Homework
- HW1: 30
- HW2: 30
- HW3: 30

What Else? (Learn with Fun!)
- Online group/forum at tinyurl.com/CodyNyk, for discussion and sharing
- Readings & Exercises
  - Readings: to go through (for better understanding)
  - Exercises: numericals, problems, programming
- QTT! Question to Think!
- Challenge problems
- Open problems of the state of the art
- Contest for code designing (maybe)

Update Yourselves With
- IEEE Information Theory Society: http://www.itsoc.org
- IEEE Communication Society: http://www.comsoc.org
- Google Group/Forum: https://groups.google.com/forum/?fromgroups#!forum/codynyk (http://tinyurl.com/CodyNyk)
- Other online links: http://www.usna.edu/Users/math/wdj/research.php

Aim of Subject
- Strong understanding of various coding techniques
  - Source coding techniques
  - Channel coding techniques
- Ability to implement these techniques in MATLAB, LabVIEW, or any other language
- Develop your own coding techniques (research work)

So What Will You Be?
- Able to reason about coding in communications
- Able to analyze the performance of a communication system
- Able to understand the needs of any communication system
- Aware of the state of the art in the field
- Able to contribute to the field


PART I: Information Theory and Source Coding

Communication System
- Purpose: transmitting information to a destination through some medium or channel.
- Typical block diagram of a communication system:
  Information Source -> Tx -> Channel -> Rx -> Information User
- Examples: FM radio, telephone, mobile communication, television
- Storage channels: CD, DVD, magnetic tape

Digital Communication System: Block Diagram (recap)
- Transmit chain: Information Source -> Formatting -> Source Encoding -> Encryption -> Channel Encoding -> Multiplexing -> Modulation -> Freq. Spreading -> Multiple Access -> Tx -> Channel
- Receive chain: Rx -> Multiple Access -> Freq. Despreading -> Demodulation -> Demultiplexing -> Channel Decoding -> Decryption -> Source Decoding -> Formatting -> Information Sink
- Other sources share the multiplexing and multiple-access blocks; synchronization operates alongside the channel.

Communication Blocks
- Information source/sink
- Tx/Rx
- Channel
- Formatting
- Modulation/Demodulation
- Coding/Decoding
  - Source coding
  - Channel coding
- Multiplexing/Demultiplexing
- Multiple access
- Encryption/Decryption
- Equalization
- Synchronization

Coding/Decoding
- Source coding
  - Block coding
  - Variable length coding
  - Lossless compression
  - Lossy compression
  - Predictive coding
- Channel coding (error correction codes)
  - Waveform coding: M-ary signaling, orthogonal signaling, trellis coded modulation
  - Structured sequences: block, convolutional, and turbo codes

Introduction: Information Theory


Guess Who?
- Claude Elwood Shannon (April 30, 1916 - February 24, 2001)
- Sir Isaac Newton (4 January 1643 - 31 March 1727)
- Jean Baptiste Joseph Fourier (21 March 1768 - 16 May 1830)

Claude E. Shannon
- April 30, 1916 - February 24, 2001
- Father of Information Theory
- 1948: "A Mathematical Theory of Communication"
- Bell Labs, University of Michigan, MIT


Introduction
information from one point to another.

Key issues in evaluating performance of a digital


communication system:
Ba
Communication theory deals with systems for transmitting


Introduction
Information theory was born with the discovery of the
fundamental laws of data compression and transmission.

The information theory deals only with mathematical modeling


and analysis of communication system, rather than with physical
Efficiency with which information from a given source sources and physical channels.
sh
can be transmitted. Purpose:
Rate at which information can be transmitted reliably over given an information source and noisy channel, the
a noisy channel. information theory provide limits on
What is the minimum number of bits per symbol required to

The fundamental limit on these key aspects have their root in fully represent the source?
information theory (or mathematical theory of Answer: The Entropy H.
communication). The minimum rate at which reliable communication can take
place over the channel. Answer: Channel Capacity C.
27 By Nikesh Bajaj 28 By Nikesh Bajaj

Shannon's Considerations
- In the early days it was thought that increasing the transmission rate over a channel increases the error rate.
- Shannon showed that this is not true as long as the rate is below the channel capacity.
- Shannon further showed that random processes have an irreducible complexity below which they cannot be compressed.

Information
- Syntactic
- Semantic
- Pragmatic


Information Source
- Analog sources: speech, temperature variation, natural vision
- Discrete sources: English alphabets, computer files/data, digitized voice, songs, or video
- The source output is random.
- DMS: Discrete Memoryless Source

Uncertainty and Information
Consider these news items:
- "Tomorrow, the sun will rise from the east."
- "There will be a thunderstorm in the afternoon."
- "A group of aliens arrived on the earth this morning."
Which carries the most information, and why?
Information content and probability are inversely related.


Self Information
Information content and probability are inversely related.
The self information of an event X=xi, having
probability P(xi) is defined as:
Ba Self Information

? ? ? ? ?
1
sh
I ( xi ) log 2 log 2 P( xi ) bits
P( xi )

Which means that less probable events need more bits.


Unit: Think of Practical Examples
base 2:- bits
33 base e :- nats By Nikesh Bajaj 34 By Nikesh Bajaj
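As a quick numerical illustration of the definition above, the Python sketch below (Python is among the course's listed tools) evaluates the self information of a few events; the probabilities are made-up illustrative values, not from the lecture.

import math

def self_information(p, base=2):
    """Self information I(x) = -log(p) of an event with probability p."""
    return -math.log(p, base)

# Illustrative probabilities (made-up values):
print(self_information(0.5))    # fair coin flip   -> 1.0 bit
print(self_information(1.0))    # certain event    -> 0.0 bits
print(self_information(0.001))  # very rare event  -> ~9.97 bits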

Self Information Properties of self information


Why base is 2..? I(xm) > I(xn), if Pm < Pn;
I(xk) = 0, if Pk = 1;
Ni

Consider a fair coin, giving output as HEAD or


TAIL. How many bits require to represent the I(xk) 0, since 0 Pk 1;
output?
For two independent message, the total
information is the sum of each
Consider same for block of m binary digit. x1 P1 x2 P2
x x 1 x2 P = P1P2

1 1 1 1
I x log 2 log 2 log 2 log 2 I x1 I x2
P P1P2 P1 P2
35 By Nikesh Bajaj 36 By Nikesh Bajaj
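A minimal numerical check of this additivity property, again with made-up probabilities:

import math

p1, p2 = 0.25, 0.10            # probabilities of two independent messages (illustrative)
I1 = -math.log2(p1)            # information in x1
I2 = -math.log2(p2)            # information in x2
I_joint = -math.log2(p1 * p2)  # information in the joint message x = x1 x2
print(I1 + I2, I_joint)        # both are ~5.32 bits, so I(x) = I(x1) + I(x2)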


Mutual Information
- Consider two random variables X and Y, with outcomes x_i, i = 1, 2, ..., n and y_j, j = 1, 2, ..., m.
- The information about x obtained by observing y is the mutual information.
- Extreme cases:
  - If X and Y are independent, y gives no information about x (and vice versa).
  - If X and Y are fully dependent, x can be determined from y.

Mutual Information (definition)
- The mutual information between the outcomes x_i and y_j is defined as

  I(x_i; y_j) = \log_2 [ P(x_i | y_j) / P(x_i) ]

- Consider the same two extremes: independence gives I(x_i; y_j) = 0, while full dependence gives I(x_i; y_j) = -\log_2 P(x_i) = I(x_i).
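A small sketch of these two extremes; the probability values are illustrative.

import math

def mutual_info(p_x, p_x_given_y, base=2):
    """Mutual information I(x; y) = log( P(x|y) / P(x) )."""
    return math.log(p_x_given_y / p_x, base)

p_x = 0.25
print(mutual_info(p_x, p_x))   # independent: P(x|y) = P(x)  -> 0 bits
print(mutual_info(p_x, 1.0))   # fully dependent: P(x|y) = 1 -> 2 bits = I(x)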

Mutual Information: Conditional Self Information
- The conditional self information of x when y is given is

  I(x_i | y_j) = -\log_2 P(x_i | y_j)

- The mutual information can then be written as

  I(x_i; y_j) = I(x_i) - I(x_i | y_j)

- The information about x obtained from y is identical to the information about y obtained from x: I(x_i; y_j) = I(y_j; x_i).

Think of practical examples.

Average Mutual Information
- Averaging the mutual information over all pairs of outcomes gives

  I(X; Y) = \sum_i \sum_j P(x_i, y_j) \log_2 [ P(x_i | y_j) / P(x_i) ]

Average Self Information
- Averaging the self information over all outcomes of X gives

  H(X) = -\sum_i P(x_i) \log_2 P(x_i)

- This is called the entropy of X.
- The name comes from the analogous quantity in statistical mechanics, which measures disorder.
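A short Python sketch of the entropy formula above; the distributions are illustrative.

import math

def entropy(probs, base=2):
    """Entropy H(X) = -sum p log p, skipping zero-probability symbols."""
    return -sum(p * math.log(p, base) for p in probs if p > 0)

print(entropy([0.5, 0.5]))   # 1.0 bit (fair binary source)
print(entropy([0.9, 0.1]))   # ~0.47 bits (less uncertainty)
print(entropy([0.25] * 4))   # 2.0 bits (four equally likely symbols)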


Entropy: Problems
- Calculate the average information in bits/character in English, assuming each letter is equally likely:

  H = \sum_{i=1}^{26} (1/26) \log_2 26 \approx 4.7 bits/char

- Do solve this, and also consider the case of the individual character frequencies (e.g., from Wikipedia).

Entropy
- If a source has n different letters and each letter has the same probability, the entropy is H = \log_2 n.
- Q. Consider a practical case for English letters:
  P = 0.10 for a, e, o, t
  P = 0.07 for h, i, n, r, s
  P = 0.02 for c, d, f, l, m, p, u, y
  P = 0.01 for b, g, j, k, q, v, w, x, z
  What is the entropy?
- Q. What is the entropy of a random binary source if P(0) = P(1) = q?
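Both questions can be checked numerically; the sketch below uses the letter probabilities listed on the slide.

import math

def entropy(probs):
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Equally likely letters: H = log2(26), about 4.7 bits/character
print(entropy([1 / 26] * 26))

# Practical case from the slide: 4 letters at 0.10, 5 at 0.07, 8 at 0.02, 9 at 0.01
probs = [0.10] * 4 + [0.07] * 5 + [0.02] * 8 + [0.01] * 9
print(sum(probs))      # sanity check: the probabilities sum to 1.0
print(entropy(probs))  # roughly 4.17 bits/character, below the uniform case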

Properties of Entropy
- For a DMS, the entropy is bounded as

  0 \le H(X) \le \log_2 N

  where N is the total number of symbols of the source.
- The lower bound on entropy corresponds to no uncertainty.
- The upper bound corresponds to maximum uncertainty:

  H(X) = \log_2 N  if  P_k = 1/N for all k

Binary Symmetric Channel (BSC)
- Source symbols 0 and 1 (with probabilities p_0 and p_1) are transmitted; each is received correctly with probability 1 - p and flipped with probability p.
- Try an example with more than two symbols from a source and prove the same bounds.
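To see the two bounds for a binary source, here is a small sketch of the binary entropy function, assuming P(0) = q and P(1) = 1 - q: entropy is zero when one symbol is certain and peaks at log2(2) = 1 bit when both symbols are equally likely.

import math

def binary_entropy(q):
    """H(q) = -q log2 q - (1 - q) log2 (1 - q) for a binary source with P(0) = q."""
    if q in (0.0, 1.0):
        return 0.0  # no uncertainty: the lower bound of the entropy
    return -q * math.log2(q) - (1 - q) * math.log2(1 - q)

for q in (0.0, 0.1, 0.5, 0.9, 1.0):
    print(q, binary_entropy(q))  # the maximum, 1 bit, occurs at q = 0.5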

Conditional Entropy
- The average conditional self-information, or conditional entropy, is defined as

  H(X | Y) = -\sum_i \sum_j P(x_i, y_j) \log_2 P(x_i | y_j)

- It is interpreted as the average amount of uncertainty remaining in X after Y is observed.
- The average mutual information can therefore be given as

  I(X; Y) = H(X) - H(X | Y)

- Since I(X; Y) >= 0, it follows that H(X) >= H(X | Y).

Prove:
- Prove this relation and quote one example.
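A minimal sketch computing H(X), H(X|Y), and I(X;Y) from a joint PMF; the joint distribution below is a made-up example, not the one used in the lecture.

import math

# Made-up joint PMF P(x, y) over two binary variables (rows: x, columns: y)
P = [[0.4, 0.1],
     [0.1, 0.4]]

px = [sum(row) for row in P]                             # marginal P(x)
py = [sum(P[i][j] for i in range(2)) for j in range(2)]  # marginal P(y)

H_X = -sum(p * math.log2(p) for p in px)
H_XgY = -sum(P[i][j] * math.log2(P[i][j] / py[j])        # conditional entropy H(X|Y)
             for i in range(2) for j in range(2))
I_XY = H_X - H_XgY                                       # average mutual information

print(H_X, H_XgY, I_XY)  # H(X) >= H(X|Y), so I(X;Y) >= 0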


Example
- What is the entropy of X?
- What is the conditional entropy H(X|Y)?


Summary
Consider a pair X and Y of discrete random variables:
- H(X): average information in observing X
- H(Y): average information in observing Y
- H(X,Y): average information in observing (X,Y)
- H(X|Y): average information in observing X when Y is known
- H(Y|X): average information in observing Y when X is known
- I(X;Y): average mutual information between X and Y

Information Measures for Continuous Random Variables
- If X and Y are random variables with joint PDF p(x,y) and marginal PDFs p(x) and p(y), the average mutual information between X and Y is defined as

  I(X; Y) = \int \int p(x, y) \log_2 [ p(x | y) / p(x) ] dx dy

- The self-information, or differential entropy, of the random variable X is

  H(X) = -\int p(x) \log_2 p(x) dx

- The average conditional entropy of the random variable X given Y is

  H(X | Y) = -\int \int p(x, y) \log_2 p(x | y) dx dy

- Also, the average mutual information between X and Y can be written as

  I(X; Y) = H(X) - H(X | Y) = H(Y) - H(Y | X)
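As a rough numerical illustration of differential entropy, the sketch below estimates H(X) for a Gaussian random variable and compares it with the closed form 0.5*log2(2*pi*e*sigma^2); the Gaussian choice and the integration grid are assumptions made for this example, not part of the lecture.

import math
import numpy as np

sigma = 1.0
x = np.linspace(-10, 10, 200001)
dx = x[1] - x[0]
p = np.exp(-x**2 / (2 * sigma**2)) / math.sqrt(2 * math.pi * sigma**2)  # Gaussian PDF

# Differential entropy H(X) = -integral of p(x) log2 p(x) dx, via a Riemann sum
h_numeric = -np.sum(p * np.log2(p)) * dx
h_closed = 0.5 * math.log2(2 * math.pi * math.e * sigma**2)
print(h_numeric, h_closed)  # both are about 2.05 bits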