
1. A) Describe the Shannon theorem for channel capacity. Derive the expression for channel capacity.
B) Find the channel capacity of a binary symmetric channel with crossover probability p = 0.6.
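As a quick numerical check for part B, the BSC capacity C = 1 - H(p) can be evaluated directly (a minimal sketch; the function name is illustrative):

```python
from math import log2

def bsc_capacity(p: float) -> float:
    """Capacity of a binary symmetric channel, C = 1 - H(p) bits/channel use."""
    if p in (0.0, 1.0):
        return 1.0  # a noiseless (or perfectly inverted) channel
    h = -p * log2(p) - (1 - p) * log2(1 - p)  # binary entropy H(p)
    return 1.0 - h

print(round(bsc_capacity(0.6), 4))  # -> 0.029
```

Note that p = 0.6 and p = 0.4 give the same capacity, since the binary entropy H(p) is symmetric about p = 0.5.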

2. A) A discrete source transmits messages X1, X2 and X3 with probabilities 0.3, 0.4 and 0.3. The source is connected to the channel shown in the given figure. Calculate all the entropies.

OR

B) Explain mutual information and prove that the average mutual information satisfies
I(X; Y) = H(X) - H(X|Y) = H(Y) - H(Y|X)
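The entropies in part A depend on the channel figure, which is not reproduced here. The sketch below uses an assumed 3x3 channel matrix purely to illustrate the computation, and also verifies numerically that the two forms of I(X; Y) from part B agree:

```python
from math import log2

def entropy(probs):
    """H = -sum p*log2(p), skipping zero-probability terms."""
    return -sum(p * log2(p) for p in probs if p > 0)

# Source probabilities from the question; the channel matrix below is
# an assumed example, NOT the one in the (missing) figure.
px = [0.3, 0.4, 0.3]
P_y_given_x = [[0.8, 0.1, 0.1],
               [0.1, 0.8, 0.1],
               [0.1, 0.1, 0.8]]

# Joint distribution p(x, y) and output marginal p(y)
joint = [[px[i] * P_y_given_x[i][j] for j in range(3)] for i in range(3)]
py = [sum(joint[i][j] for i in range(3)) for j in range(3)]

H_X = entropy(px)
H_Y = entropy(py)
H_XY = entropy([joint[i][j] for i in range(3) for j in range(3)])

# H(X|Y) = H(X,Y) - H(Y) and H(Y|X) = H(X,Y) - H(X), so both forms
# of the mutual information reduce to H(X) + H(Y) - H(X,Y).
I1 = H_X - (H_XY - H_Y)  # H(X) - H(X|Y)
I2 = H_Y - (H_XY - H_X)  # H(Y) - H(Y|X)
print(round(I1, 4), round(I2, 4))
```

Whatever channel matrix is used, the two printed values coincide, which is exactly the identity part B asks you to prove.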

3. A) Apply the Shannon-Fano coding procedure to the following message ensemble:
[X] = [x1 x2 x3 x4 x5 x6 x7 x8]
[P] = [1/16 1/16 1/16 1/16 1/8 1/8 1/4 1/4]
Find the efficiency and redundancy.
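A sketch of the Shannon-Fano split for part A (the given probabilities are dyadic, so the code should come out 100% efficient):

```python
from math import log2

def shannon_fano(symbols):
    """Recursive Shannon-Fano coding: split the probability-sorted list so
    the two halves' total probabilities are as close as possible, then
    prefix the halves with 0 and 1."""
    if len(symbols) == 1:
        return {symbols[0][0]: ""}
    total = sum(p for _, p in symbols)
    acc, split, best = 0.0, 1, float("inf")
    for i in range(1, len(symbols)):
        acc += symbols[i - 1][1]
        diff = abs(2 * acc - total)
        if diff < best:
            best, split = diff, i
    codes = {}
    for sym, code in shannon_fano(symbols[:split]).items():
        codes[sym] = "0" + code
    for sym, code in shannon_fano(symbols[split:]).items():
        codes[sym] = "1" + code
    return codes

probs = {"x1": 1/16, "x2": 1/16, "x3": 1/16, "x4": 1/16,
         "x5": 1/8, "x6": 1/8, "x7": 1/4, "x8": 1/4}
ordered = sorted(probs.items(), key=lambda kv: -kv[1])
codes = shannon_fano(ordered)

H = -sum(p * log2(p) for p in probs.values())          # source entropy
L = sum(probs[s] * len(c) for s, c in codes.items())   # average code length
print(round(H, 2), round(L, 2), round(H / L, 2))       # -> 2.75 2.75 1.0
```

Because every probability is a power of 1/2, the average length L equals H = 2.75 bits/symbol, giving efficiency H/L = 1 and redundancy 1 - H/L = 0.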
B) Explain entropy. Find the maximum entropy of a source with M equally probable messages.
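For part B, the maximum entropy of a source with M equally probable messages is log2(M); a quick numerical confirmation:

```python
from math import log2

def entropy(probs):
    """H = -sum p*log2(p), skipping zero-probability terms."""
    return -sum(p * log2(p) for p in probs if p > 0)

# For M equiprobable messages the entropy attains its maximum, log2(M)
for M in (2, 4, 8, 16):
    assert abs(entropy([1 / M] * M) - log2(M)) < 1e-12
print("H = log2(M) for M equiprobable messages")
```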
4. A) Draw the channel matrices for a lossless channel, a deterministic channel, a noiseless channel and a binary symmetric channel.

B) For the given probabilities P(Xi) = , , , . Calculate the amount of information and the entropy.
5. A) Describe the process of syndrome testing, error detection and error correction.
B) Explain Hamming distance and the Hamming code for error detection.
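A minimal sketch of syndrome decoding for part A, using one common systematic (7,4) Hamming code (this particular G/H pair is an assumed textbook example, not necessarily the one intended by the question):

```python
import numpy as np

# Systematic (7,4) Hamming code: G = [I | P], H = [P^T | I]
G = np.array([[1, 0, 0, 0, 1, 1, 0],
              [0, 1, 0, 0, 0, 1, 1],
              [0, 0, 1, 0, 1, 1, 1],
              [0, 0, 0, 1, 1, 0, 1]])
H = np.array([[1, 0, 1, 1, 1, 0, 0],
              [1, 1, 1, 0, 0, 1, 0],
              [0, 1, 1, 1, 0, 0, 1]])

def encode(msg):
    """Codeword = message times G, arithmetic over GF(2)."""
    return (np.array(msg) @ G) % 2

def correct(received):
    """Syndrome testing: s = H*r mod 2. A nonzero syndrome equals the H
    column of the flipped bit, so flip that bit back."""
    s = (H @ received) % 2
    if s.any():
        for i in range(7):
            if np.array_equal(s, H[:, i]):
                received = received.copy()
                received[i] ^= 1
                break
    return received

cw = encode([1, 0, 1, 1])
noisy = cw.copy()
noisy[2] ^= 1                              # inject a single-bit error
print(np.array_equal(correct(noisy), cw))  # -> True
```

Every valid codeword gives the all-zero syndrome (G @ H.T is zero mod 2), which is what makes the syndrome usable for error detection before correction.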

6. A) The encoder for a convolutional code is shown below. Find all the codewords for a 4-bit input sequence.


B) The generator polynomial of a (7,4) cyclic code is g(x) = 1 + x + x^3. Find the 16 code words of this code. Consider the message vector D = 0101.
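For part B, reading the generator as g(x) = 1 + x + x^3, the 16 codewords can be produced by non-systematic encoding c(x) = d(x)·g(x) over GF(2). The bit ordering and the mapping of D = 0101 onto d(x) below are assumed conventions:

```python
def polymul_gf2(a, b):
    """Multiply two GF(2) polynomials given as bit lists, LSB first."""
    out = [0] * (len(a) + len(b) - 1)
    for i, ai in enumerate(a):
        if ai:
            for j, bj in enumerate(b):
                out[i + j] ^= bj  # XOR is addition in GF(2)
    return out

g = [1, 1, 0, 1]  # g(x) = 1 + x + x^3, LSB first

# All 16 codewords: c(x) = d(x) * g(x) for every 4-bit message d
codewords = []
for m in range(16):
    d = [(m >> k) & 1 for k in range(4)]
    c = polymul_gf2(d, g)
    c += [0] * (7 - len(c))  # pad to block length n = 7
    codewords.append(c)

print(len(set(map(tuple, codewords))))  # -> 16 distinct codewords

# D = 0101 read LSB-first as d(x) = x + x^3 (an assumed convention)
print(polymul_gf2([0, 1, 0, 1], g))     # -> [0, 1, 1, 1, 0, 0, 1]
```

Systematic encoding (message bits kept intact, parity appended via division by g(x)) would give a different but equivalent set of 16 codewords.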

7. A) Explain why Reed-Solomon codes perform so well in a bursty-noise environment.
B) Write short notes on trellis codes.
8. A) Write short notes on rate-distortion theory for a Gaussian source with memory.
B) How do we describe the performance analysis of a coding system? Give an example.
