
E2 201: Information Theory (2018)

Homework 1
Instructor: Himanshu Tyagi

Reading Assignment

• Verify that Lemma 2.7 and Lemma 3.2 proved in the class hold even when the discrete
alphabet X is not finite. For instance, they hold for X ∼ Poi(λ).
In his comments on the first version of the notes for this course, Prakash Narayan suggested
the apt name Little-Big Lemmas for these results, since they concern the small cardinality
of large-probability sets. We shall use this terminology to refer to these results.

• Read Chapter 3 of the Cover and Thomas book. Compare with our derivation of the same
results using the Little-Big Lemmas.

• Read the first two pages of Chapter 1 of the Csiszár and Körner book. Compare with our
derivation of the same results using the Little-Big Lemmas.

Homework Questions

Questions marked ∗ are difficult and will not be tested in the quiz.

Q1 A source outputs one of 10 symbols, each with equal probability.

(i) For ε = 0.01, what is the largest value of λ so that

P(h(X) ≥ λ) ≥ 1 − ε?

Recall that here h(x) = − log PX(x) denotes the entropy density of the source X.
(ii) What is Lε(X)?
(iii) If you are using a code of length Lε(X) (as calculated in (ii)), what is the minimum
probability of error you will incur?
(iv) Answer (i), (ii), and (iii) with ε = 0.2, 0.25, 0.5.
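
The quantities in Q1 are small enough to check numerically. Below is a minimal Python sketch, not part of the assignment, that evaluates the entropy density of each symbol and scans for the largest λ in part (i); it assumes base-2 logarithms, so lengths are in bits.

```python
import math

# Uniform source over 10 symbols: PX(x) = 0.1 for every symbol x.
pmf = [0.1] * 10

def entropy_density(p):
    """h(x) = -log2 PX(x), evaluated at a symbol of probability p."""
    return -math.log2(p)

def largest_lambda(pmf, eps):
    """Largest lam among the achieved values of h(X) such that
    P(h(X) >= lam) >= 1 - eps; any lam above max h(X) has tail mass 0."""
    candidates = sorted(entropy_density(p) for p in pmf)
    best = None
    for lam in candidates:
        tail = sum(p for p in pmf if entropy_density(p) >= lam)
        if tail >= 1 - eps:
            best = lam
    return best

for eps in (0.01, 0.2, 0.25, 0.5):
    print(eps, largest_lambda(pmf, eps))  # log2(10) ≈ 3.32 in every case,
                                          # since h(x) is constant here
```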

Q2 Show that for a uniformly distributed source X over any finite alphabet X, the upper and
the lower bounds given by the Little-Big Lemmas (Lemma 2.7 and Lemma 3.2) differ by at most
log(1/(1 − ε)). Furthermore, show that the lower bound is tight.

Q3 Provide upper and lower bounds for Lε(X) using the Little-Big Lemmas for each case below.

(a) A source with symbols having probabilities {0.2, 0.2, 0.15, 0.15, 0.15, 0.14, 0.01} and
ε = 0.01, 0.15.
(b) A source with Binomial distribution Bin(5, 0.25) and ε = 2^−10, 2^−6.
(c) A source with Geometric distribution Geo(0.5) and ε = 2^−10, 2^−6.
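
The exact statements of Lemma 2.7 and Lemma 3.2 are in the class notes, so the sketch below does not reproduce them; it only tabulates, for each pmf in Q3 and each ε, the cardinality of the smallest set of probability at least 1 − ε, whose logarithm is the quantity the two lemmas sandwich. The truncation point for the geometric pmf is an arbitrary choice.

```python
import math

def min_set_size(pmf, eps, tol=1e-12):
    """Cardinality of the smallest set with probability >= 1 - eps
    (greedily collect the largest masses first; tol guards against
    floating-point rounding in the cumulative sum)."""
    total, count = 0.0, 0
    for p in sorted(pmf, reverse=True):
        total += p
        count += 1
        if total >= 1 - eps - tol:
            break
    return count

# (a) The seven-symbol source of Q3(a).
pmf_a = [0.2, 0.2, 0.15, 0.15, 0.15, 0.14, 0.01]
# (b) Bin(5, 0.25): P(k) = C(5, k) 0.25^k 0.75^(5-k), k = 0, ..., 5.
pmf_b = [math.comb(5, k) * 0.25**k * 0.75**(5 - k) for k in range(6)]
# (c) Geo(0.5): P(k) = 2^-k, k >= 1; truncated at k = 60, which discards
#     mass 2^-60, negligible next to eps = 2^-10.
pmf_c = [2.0**-k for k in range(1, 61)]

for name, pmf, eps_list in [("(a)", pmf_a, (0.01, 0.15)),
                            ("(b)", pmf_b, (2**-10, 2**-6)),
                            ("(c)", pmf_c, (2**-10, 2**-6))]:
    for eps in eps_list:
        k = min_set_size(pmf, eps)
        print(name, eps, k, math.log2(k))
```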

Q4 Consider sources X1, X2, and X3, all with a common alphabet X = {0, 1}^10. The first source
produces all sequences with equal probability, i.e.,

PX1(x) = 2^−10, for all x ∈ {0, 1}^10.

The second source only produces sequences starting with 00000, each with equal probability,
i.e.,
PX2(x1, ..., x10) = 2^−5,  if x1 = x2 = x3 = x4 = x5 = 0,
                  = 0,     otherwise.

Let X3 be a uniform mixture of sources X1 and X2 , i.e.,


PX3(x) = (1/2) PX1(x) + (1/2) PX2(x), for all x ∈ X.
(i) Determine the entropies H(X1), H(X2), and H(X3).

(ii) For ε = 0.01, determine Lε(X1) and Lε(X2).

(iii) For ε = 0.01, using Lemma 2.7 and Lemma 3.2 proved in class, find bounds for
Lε(X3). Does Lε(X3) equal H(X3)?

(iv) Denote by X1^n, X2^n, and X3^n discrete memoryless sources (DMSs) with common
distributions PX1, PX2, and PX3, respectively. Let ε = 0.01. For each of the DMSs
above, determine R∗, the least ε-achievable rate of a source code.

Q5∗ A graph is a collection of points, called vertices, and edges between the points. The degree
of a vertex in a graph is the number of edges incident on the vertex. For instance, a triangle
is a graph with 3 vertices and 3 edges, with each vertex having degree 2.
For a graph G with n vertices and e edges, use a packing argument to show that the number
of vertices with degree greater than d is less than or equal to 2e/d.
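
The claim in Q5 is easy to sanity-check by simulation before proving it. The sketch below samples random simple graphs and verifies that the number of vertices of degree greater than d never exceeds 2e/d; the parameter values are arbitrary.

```python
import random

def check_degree_bound(n, e, d, trials=200):
    """Count vertices of degree > d in random simple graphs with
    n vertices and e edges, and compare against the bound 2e/d."""
    all_pairs = [(u, v) for u in range(n) for v in range(u + 1, n)]
    for _ in range(trials):
        deg = [0] * n
        for u, v in random.sample(all_pairs, e):  # e distinct edges
            deg[u] += 1
            deg[v] += 1
        high = sum(1 for x in deg if x > d)
        assert high <= 2 * e / d, (high, 2 * e / d)
    print("bound held in all", trials, "trials")

check_degree_bound(n=20, e=50, d=4)
```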

Q6∗ For a pmf P, the Rényi entropy of P of order α > 0, α ≠ 1, is defined as

Hα(P) := (1/(1 − α)) log Σ_{x∈X} P(x)^α.

Consider a source X with pmf P. Let 0 < ε < 1. Using the Little-Big Lemmas, show that for
every 0 < α < 1,

Lε(X) ≤ Hα(P) + (1/(1 − α)) log(1/ε) + 1.
Furthermore, for every β > 1 and 0 < δ < 1 − ε,

Lε(X) ≥ Hβ(P) − (1/(β − 1)) log(1/δ) − log(1/(1 − ε − δ)).
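
The Rényi entropy itself is a one-liner to compute, which helps when checking the two bounds in Q6 on small examples. A minimal sketch, using an arbitrary dyadic test pmf:

```python
import math

def renyi_entropy(pmf, alpha):
    """H_alpha(P) = (1/(1-alpha)) log2 sum_x P(x)^alpha, for alpha > 0, alpha != 1."""
    assert alpha > 0 and alpha != 1
    return math.log2(sum(p**alpha for p in pmf if p > 0)) / (1 - alpha)

pmf = [0.5, 0.25, 0.125, 0.125]     # arbitrary test pmf with H(P) = 1.75 bits
for alpha in (0.5, 0.9, 1.1, 2.0):
    print(alpha, renyi_entropy(pmf, alpha))
# H_alpha(P) is nonincreasing in alpha and tends to the Shannon
# entropy H(P) as alpha -> 1.
```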
