Overview
Introduction to Course
Advanced Coding Theory: Syllabus & Books
ECEP581 Course Logistics
Prerequisite Knowledge: Information Theory

Nikesh Bajaj
nikesh.14730@lpu.co.in
Asst. Prof., ECE Dept.
Lovely Professional University
Introduction to Course
Expectations and Aim
What are your Expectations?
Communication System
Purpose: transmitting information to a destination through some medium or channel.

Typical block diagram of a communication system:
Information Source -> Tx -> Channel -> Rx -> Information User
(Other sources also feed the channel; synchronization runs between Tx and Rx.)
Receiver-side blocks: Demultiplexing, Demodulation, Frequency Despreading, Multiple Access

Examples:
Transmission channels: FM Radio, Telephone, Mobile Comm., Television
Storage channels: CD, DVD, Magnetic Tape

What is the course? What are the course's expectations?
What comes to your mind when you hear the word "Coding"?

CODING serves three goals:
1. Efficiency
2. Reliability
3. Security
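The three goals above pull against each other. As a minimal sketch (not from the slides), a rate-1/3 repetition code buys reliability at the cost of efficiency: three channel bits are spent per message bit, and any single bit error per codeword is corrected by majority vote.

```python
# Hypothetical illustration: a rate-1/3 repetition code.
# Efficiency drops (3x more bits sent); reliability rises
# (majority vote corrects one flipped bit per codeword).

def encode(bits):
    """Repeat each message bit three times."""
    return [b for b in bits for _ in range(3)]

def decode(received):
    """Majority vote over each group of three received bits."""
    return [1 if sum(received[i:i + 3]) >= 2 else 0
            for i in range(0, len(received), 3)]

message = [1, 0, 1, 1]
codeword = encode(message)          # 12 channel bits for 4 message bits
codeword[4] ^= 1                    # the channel flips one bit
assert decode(codeword) == message  # the single error is corrected
```

Security is untouched by this scheme; that goal is served by the separate encryption block in the transmitter chain.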
1/12/2013
Philosophy
The basic philosophy of the course: most of the ideas in modern coding are very intuitive and natural. If someone had not invented them a few years ago, you could have invented them yourself.

Syllabus (check updated IP)
Units:
1. Information Theory and Source Coding
2. Foundations of Error-Correcting Codes
3. Groups and Vector Spaces
4. Linear Block Codes
5. Cyclic Codes
6. Number Theory and Algebra
7. BCH and Reed-Solomon Codes
8. Convolutional Codes
9. Trellis and Turbo Codes
10. Bounds on Codes and Other Codes
Syllabus: Overview of Subject
Part 1: Information Theory and Source Coding
- Source Coding
- Channel Capacity and Coding
(Check updated IP)

Books:
- Cover, Thomas, and Joy Thomas. Elements of Information Theory. 2nd ed. New York, NY: Wiley-Interscience, 2006. ISBN: 9780471241959.
- Neubauer, Andre, Jurgen Freudenberger, and Volker Kuhn. Coding Theory: Algorithms, Architectures and Applications. John Wiley & Sons, Ltd.

Programming tools: MATLAB, Maple, Python, Euler, and others.
HW2 (30): Programming Assignment + Test 2 (Open Book)
HW3 (30): Design Problem (unique to each student)

Online Group/Forum for discussion and sharing:
http://tinyurl.com/CodyNyk
https://groups.google.com/forum/?fromgroups#!forum/codynyk

Readings & Exercises:
Readings: to go through (for better understanding)
Exercises: numerical problems and programming
QTT! Question to Think!
Challenge Problem
Open Problem of the State of the Art
Contest for Code Designing (maybe)

Keep yourself updated with:
IEEE Information Theory Society: http://www.itsoc.org
IEEE Communications Society: http://www.comsoc.org
Other online links: http://www.usna.edu/Users/math/wdj/research.php
Typical block diagram of a communication system (recap):
Information Source -> Tx -> Channel -> Rx -> Information User
Examples: FM Radio, Telephone, Mobile Comm., Television (transmission channels); CD, DVD, Magnetic Tape (storage channels)

Transmitter chain:
Information Source -> Formatting -> Source Encoding -> Encryption -> Channel Encoding -> Multiplexing -> Modulation -> Frequency Spreading -> Multiple Access -> Tx

Communication blocks:
Information Source/Sink
Tx/Rx
Channel
Formatting
Modulation/Demodulation
Coding/Decoding: Source Coding, Block Coding
Synchronization
Introduction: Information Theory
Claude Elwood Shannon (April 30, 1916 - February 24, 2001)
Father of Information Theory
1948: "A Mathematical Theory of Communication", Bell System Technical Journal
University of Michigan, MIT
(Also pictured: Sir Isaac Newton, 4 January 1643 - 31 March 1727; Jean Baptiste Joseph Fourier, 21 March 1768 - 16 May 1830)

Introduction
Communication is the transmission of information from one point to another. Information theory was born with the discovery of the fundamental laws of data compression and transmission. The fundamental limits on these key aspects have their roots in information theory (the mathematical theory of communication):
The minimum rate needed to fully represent the source? Answer: the entropy H.
The maximum rate at which reliable communication can take place over the channel? Answer: the channel capacity C.
Which statement conveys more information?
"There will be a thunderstorm in the afternoon."
"A group of aliens arrived on Earth this morning."

Continuous sources: speech, temperature variation, natural vision
Discrete sources: English alphabet, computer files/data, digitized voice, songs, or video
The source output is random. Why?
DMS: Discrete Memoryless Source
Information content and probability are inversely related.
Self-Information
Information content and probability are inversely related. The self-information of an event X = x_i, having probability P(x_i), is defined as:

I(x_i) = log2( 1 / P(x_i) ) = -log2 P(x_i)  bits

For two independent events x1 and x2 with probabilities P1 and P2:

I(x) = log2( 1 / (P1 P2) ) = log2(1/P1) + log2(1/P2) = I(x1) + I(x2)
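The definition and its additivity property can be checked numerically. A minimal sketch in Python (one of the course's listed tools); the probability values are arbitrary examples:

```python
import math

def self_information(p):
    """Self-information I(x) = -log2 P(x), in bits."""
    return -math.log2(p)

# Less probable events carry more information:
assert self_information(0.5) == 1.0    # a fair coin flip yields 1 bit
assert self_information(0.25) == 2.0

# Additivity for independent events: I(x1, x2) = I(x1) + I(x2)
p1, p2 = 0.5, 0.25
assert math.isclose(self_information(p1 * p2),
                    self_information(p1) + self_information(p2))
```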
Mutual Information
Information about x from y: mutual information.
Consider the same two extreme cases:
If X and Y are independent, y gives no information about x, or vice versa.
If X and Y are dependent, information about x can be determined from y.
Think of practical examples.
If each of the 26 letters is equally likely:

H = sum_{i=1}^{26} (1/26) log2(26) = log2(26) ≈ 4.7 bits/char

Q. Consider the practical case:
P = 0.10 for a, e, o, t
P = 0.07 for h, i, n, r, s
P = 0.02 for c, d, f, l, m, p, u, y
P = 0.01 for b, g, j, k, q, v, w, x, z
Q. Entropy?

Q. Entropy of a random binary source, if P(0) = P(1) = q?
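The practical case above can be computed directly. A sketch using the letter probabilities from the slide (which sum exactly to 1.0):

```python
import math

def entropy(probs):
    """H = -sum p log2 p, in bits per symbol."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Letter probabilities from the slide:
probs = ([0.10] * 4      # a, e, o, t
         + [0.07] * 5    # h, i, n, r, s
         + [0.02] * 8    # c, d, f, l, m, p, u, y
         + [0.01] * 9)   # b, g, j, k, q, v, w, x, z
assert math.isclose(sum(probs), 1.0)

H_uniform = entropy([1 / 26] * 26)   # log2(26), about 4.7 bits/char
H_english = entropy(probs)           # non-uniform case, about 4.17 bits/char
assert H_english < H_uniform         # unequal letter probabilities lower the entropy
```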
Properties of Entropy
For a DMS, the entropy is bounded as

0 <= H(X) <= log2 N

where N is the total number of symbols of the source.
The lower bound corresponds to no uncertainty.
H(X) = log2 N if P_k = 1/N for all k.
Try an example with more than two symbols from a source and prove the same.

(Figure: binary channel from Tx to Rx; symbol 0 is received correctly with probability 1-p0 and flipped with probability p0, symbol 1 is received correctly with probability 1-p1 and flipped with probability p1.)
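A quick numerical check of both bounds, using a hypothetical four-symbol source as the slide suggests, plus the binary source H(q) from the earlier question:

```python
import math

def entropy(probs):
    """H = -sum p log2 p, in bits per symbol."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Bounds 0 <= H(X) <= log2 N, checked on a 4-symbol source:
N = 4
assert entropy([1.0, 0.0, 0.0, 0.0]) == 0.0               # no uncertainty: lower bound
assert math.isclose(entropy([1 / N] * N), math.log2(N))   # equiprobable: upper bound

# Binary source with P(0) = q, P(1) = 1 - q: entropy peaks at q = 1/2.
H = lambda q: entropy([q, 1 - q])
assert math.isclose(H(0.5), 1.0)     # one full bit per symbol
assert H(0.1) < H(0.3) < H(0.5)      # entropy grows toward the uniform case
```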
Example
Entropy of X?
Conditional entropy H(X|Y)?
Summary
If X and Y are random variables with joint PMF p(x,y) and marginal PMFs p(x) and p(y), the average mutual information between X and Y is defined as:

I(X;Y) = H(X) - H(X|Y) = H(Y) - H(Y|X)
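Both forms of the identity can be verified numerically, using H(X|Y) = H(X,Y) - H(Y). A sketch with a hypothetical 2x2 joint PMF (the numbers are illustrative, not from the slides):

```python
import math

def H(probs):
    """Entropy in bits of a probability vector."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Hypothetical joint PMF p(x, y) over X in {0,1}, Y in {0,1}:
joint = [[0.3, 0.2],
         [0.1, 0.4]]
px = [sum(row) for row in joint]          # marginal p(x)
py = [sum(col) for col in zip(*joint)]    # marginal p(y)
H_joint = H([p for row in joint for p in row])

# H(X|Y) = H(X,Y) - H(Y) and H(Y|X) = H(X,Y) - H(X):
I1 = H(px) - (H_joint - H(py))   # H(X) - H(X|Y)
I2 = H(py) - (H_joint - H(px))   # H(Y) - H(Y|X)
assert math.isclose(I1, I2)      # both forms of I(X;Y) agree
assert I1 > 0                    # X and Y are dependent here

# The independent extreme (joint = outer product of marginals) gives I(X;Y) = 0:
indep = [[a * b for b in py] for a in px]
H_ind = H([p for row in indep for p in row])
assert math.isclose(H(px) - (H_ind - H(py)), 0.0, abs_tol=1e-9)
```

This also confirms the two extreme cases discussed earlier: dependence yields positive mutual information, independence yields zero.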