
TC-502 INFORMATION THEORY

Books
1. Darrel Hankerson, Greg A. Harris, and Peter D. Johnson, Jr., Introduction to Information Theory and Data Compression, Second Edition, Chapman & Hall/CRC, 2003
2. T. M. Cover and J. A. Thomas, Elements of Information Theory, John Wiley & Sons, 1991
3. Roberto Togneri and Christopher J. S. deSilva, Fundamentals of Information Theory and Coding Design, Chapman & Hall/CRC, 2002
Introduction
Basic Concepts of Probability Theory
Random Experiments & Examples
Sample Space and Events
Mutually Exclusive Events
Axioms of Probability
Conditional Probability
Bayes' Rule
Total Probability Law
Independent Events
Random Variables
Discrete vs. Continuous Random Variables
Probability Distribution
Expected Values
Introduction
Structures
Structure in Randomness
Is it possible to see structure in random phenomena?
Can we quantify random phenomena?
What do we mean by the probability of an event?
Can events surprise us?
Can we quantify that surprise?
How much surprise does an event produce?
Surprise and information
Information provided by an event
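As a concrete illustration of quantifying surprise, here is a minimal Python sketch of the standard self-information measure, $I(x) = -\log_2 p(x)$, which the later "Definition of information" topic develops; the probabilities used are made up for the example.

```python
import math

def self_information(p: float) -> float:
    """Self-information (surprise) of an event with probability p, in bits."""
    return -math.log2(p)

# Rarer events are more surprising, i.e. carry more information.
for p in (0.5, 0.25, 0.01):
    print(f"p = {p:<5} -> I = {self_information(p):.2f} bits")
```

Running this prints 1.00, 2.00, and about 6.64 bits: the less likely the event, the larger the surprise.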
Information Source
An information source is an object that produces an event,
the outcome of which is selected at random according to a
probability distribution.
A practical source in a communication system is a device that produces
messages, and it can be either analog or discrete
A discrete information source is a source that has only a finite set of symbols
as possible outputs. The set of source symbols is called the source alphabet,
and the elements of the set are called symbols or letters
A source with memory is one for which a current symbol depends on the
previous symbols
A memoryless source is one for which a current symbol is independent of the
previous symbols
A discrete memoryless source (DMS) can be characterized by the list of
symbols, the probability assignment to these symbols, and the rate at
which the source generates these symbols.
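Below is a minimal Python sketch of a DMS characterized exactly this way; the alphabet, probability assignment, and rate are hypothetical values chosen only for illustration.

```python
import random

# A hypothetical DMS: source alphabet, probability assignment, and symbol rate.
alphabet = ["a", "b", "c", "d"]
probs    = [0.5, 0.25, 0.125, 0.125]
rate     = 1000  # symbols per second (illustrative)

def emit(n: int) -> list[str]:
    """Draw n symbols independently of one another -- memoryless by construction."""
    return random.choices(alphabet, weights=probs, k=n)

print(emit(10))
```

Because each symbol is drawn independently of all previous ones, the sketch models a memoryless source; a source with memory would instead condition each draw on the preceding symbols.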
Information & Entropy
Definition of information
Why use logarithm to quantify information?
Properties of information
Unit of information
Entropy
Examples of relationship between entropy and
uncertainty / surprise
Maximum and minimum values for entropy
Information rate
$0 \le H(X) \le \log_2(N)$, where $N$ is the size of the source alphabet
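A short Python sketch of these bounds, assuming the standard definition $H(X) = -\sum_x p(x)\log_2 p(x)$: a deterministic source attains the minimum and a uniform source the maximum. The distributions are made up for the example.

```python
import math

def entropy(probs):
    """Shannon entropy in bits; terms with p = 0 contribute nothing."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

N = 4
deterministic = [1.0, 0.0, 0.0, 0.0]  # no uncertainty: H = 0
skewed        = [0.5, 0.25, 0.125, 0.125]
uniform       = [1 / N] * N           # maximum uncertainty: H = log2(N)

for p in (deterministic, skewed, uniform):
    print(f"H = {entropy(p):.3f} bits (upper bound: log2({N}) = {math.log2(N):.3f})")
```

The three entropies come out as 0.000, 1.750, and 2.000 bits, matching the stated bounds.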
Problem
Joint Entropy
Summary:
Entropy:
$H(X) = -\sum_{x} p(x)\,\log_2 p(x)$
Joint Entropy:
$H(X,Y) = -\sum_{x}\sum_{y} p(x,y)\,\log_2 p(x,y)$
Conditional Entropy:
$H(Y|X) = -\sum_{x}\sum_{y} p(x,y)\,\log_2 p(y|x)$