1. Bhudev Sharma and Richa Gupta, Directions in Optimal Error Correction Coding:
Variable Length Error Coding, a class of Distances and Reversible Variable Length
Codes, Proceedings of the National Symposium on Mathematical Methods and
Applications 2009, Indian Institute of Technology Madras, Chennai, pp 1-12, 2009.
5.1. INTRODUCTION
Variable length codes (VLCs) are widely used in multimedia applications, such as the
transmission of image, video and speech files in the JPEG, H.261, H.263, MPEG-1,
MPEG-2 and MPEG-4 image and video coding standards [112] [113] [114] [115]
[116] [117] [118]. VLCs convert multimedia data into a binary bit-stream suitable for
transmission over the channel.
Huffman code is traditionally the optimal code for source coding. Almost all the image
coding standards, viz. JPEG (the still image coding standard), the ITU series of H.261 and
H.263 (video coding standards), and the ISO series of MPEG-1 and MPEG-2 standards, use
the Huffman code as an entropy encoder. It is the best code for a noiseless channel. But VLCs
are very sensitive to errors in noisy environments due to their variable length nature. In the
presence of noise and errors, the unique decodability of VLCs may create a serious problem.
In any VLC-based application, even a single bit error may induce error propagation, because
the data received after the bit error position become useless, which may result in total
disruption. This may be overcome by using what are called error resilient codes. A class of
such codes is Reversible Variable Length Codes (RVLCs) [37]. We undertake their study in
this chapter and discuss their role in multimedia applications.
RVLCs are nowadays used in different multimedia coding standards
[37] [119] [120] [121] [122] [123] [124] [125], such as the H.263++ video coder [36] [88] [125]
and the MPEG-4 video coder [35] [126].
In earlier chapters, we have explored the area of VLECs in terms of mathematical
(combinatorial) results, construction algorithms and decoding algorithms. In a practical
channel, when the number of errors introduced by the channel is more than one, RVLCs are
also unsuccessful in correctly decoding the bit-stream. In this chapter, we propose to use a
VLEC as the entropy encoder in image compression, instead of a Huffman code or an RVLC,
to obtain a correctly decoded image at the receiver in a noisy environment.
This chapter discusses the performance of entropy encoders in two multimedia applications:
1. Image compression;
2. Video compression.
In JPEG image compression, three different entropy encoders have been considered: Huffman
codes, RVLCs and VLECs. After evaluating the performance of JPEG image compression
with these entropy encoders, we propose to use VLECs in place of Huffman codes in noisy
environments.
We also propose that in the H.263++ video coding standard, improved performance can be
achieved by replacing the conventional RVLCs (Golomb and Rice RVLCs) by Yan RVLCs [86].
This improvement is achieved in terms of reduction in average codeword length, maximum
codeword length and total number of encoded bits (for a file), with all other perceptual
parameters remaining the same as with the conventional RVLCs.
This chapter is organized as follows. Section 5.1 is introductory. Section 5.2 discusses
RVLCs and their construction algorithms. This section also reports simulation analysis and
results in the form of comparative analysis of the RVLC construction algorithms on the basis
of their average codeword length and the value of maximum codeword length. Section 5.3
describes performance measures which are used to evaluate the performance of a
compression algorithm. Section 5.4 presents the simulation results and analysis of image
compression with three entropy encoders: Huffman codes, RVLCs and VLECs. The scope of
improvement in H.263++ video compression is discussed in Section 5.5 while Section 5.6
concludes the chapter.
Definition 5.1 (Bit Length Vector): The bit length vector of a code C is defined as the
vector V = (n_1, n_2, ..., n_m), where n_i is the number of codewords of length i,
i = 1, 2, ..., m, and m is the maximum codeword length of C.
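As a quick illustration (the function name below is ours, not from the text), the bit length vector can be computed directly from a list of codewords:

```python
from collections import Counter

def bit_length_vector(code):
    """Return the bit length vector (n_1, ..., n_m) of a code,
    where n_i is the number of codewords of length i."""
    lengths = Counter(len(w) for w in code)  # codeword length -> count
    m = max(lengths)                         # maximum codeword length
    return tuple(lengths.get(i, 0) for i in range(1, m + 1))

# The code {00, 010, 101, 111} has one codeword of length 2 and
# three codewords of length 3, so its bit length vector is (0, 1, 3).
print(bit_length_vector(["00", "010", "101", "111"]))  # -> (0, 1, 3)
```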
Definition 5.3 (Affix Index): The affix index of a set at a given level gives the exact
number of symmetrical codewords available for use at higher levels after the
elimination process, considering both the prefix and the suffix conditions [128].
For example, if, in the construction of a code, the codewords 00, 010, 101 and 111 have
already been selected, then the affix index at level 4 is 2, which indicates that
2 codewords (0110 and 1001) are available at level 4 satisfying both the prefix and
the suffix criteria.
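The elimination count in this example can be reproduced with a small sketch (the function names are ours):

```python
from itertools import product

def is_affix_free(word, selected):
    """True if no already-selected codeword is a prefix or a suffix of word."""
    return not any(word.startswith(c) or word.endswith(c) for c in selected)

def affix_index(selected, level):
    """Count the symmetrical (palindromic) words of the given length that
    survive prefix- and suffix-elimination against the selected set."""
    candidates = ("".join(bits) for bits in product("01", repeat=level))
    return sum(1 for w in candidates
               if w == w[::-1] and is_affix_free(w, selected))

# With 00, 010, 101 and 111 already selected, only 0110 and 1001
# remain usable at level 4, so the affix index is 2.
print(affix_index(["00", "010", "101", "111"], 4))  # -> 2
```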
advantages over the other symmetrical RVLC construction methods. The advantages
are:
i. The earlier methods took a longer time in the computation of symmetrical
codewords at every level.
ii. It is independent of the Huffman code; as such, it reduces the complexity of
storing the candidate words, as only the derived codes of the asymmetrical
codewords are pushed into the queue and not all the possible codewords.
iii. The average codeword length is minimized due to the user's choice of the
minimum codeword length; in the earlier methods, the minimum
codeword length of the RVLCs was equal to that of the shortest codeword of the
given Huffman code.
iv. In this algorithm, the value of the minimum codeword length depends on the
source probabilities.
v. The described algorithm simplifies the codeword selection mechanism and is
easier to realize practically.
We used the Yan construction algorithm to generate symmetric RVLCs. The generated
codewords are subsequently used in multimedia applications to evaluate their performance.
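The flavour of such a construction can be illustrated with a simplified greedy sketch. This is not the actual Yan algorithm, only a shortest-first selection of palindromic codewords under the affix condition:

```python
from itertools import product

def greedy_symmetric_rvlc(num_symbols, min_len=2, max_len=12):
    """Greedily pick palindromic codewords, shortest first, such that no
    chosen codeword is a prefix or a suffix of another (a simplified
    sketch, not the actual Yan construction)."""
    code = []
    for level in range(min_len, max_len + 1):
        for bits in product("01", repeat=level):
            w = "".join(bits)
            if w != w[::-1]:
                continue  # keep only symmetrical candidates
            if any(w.startswith(c) or w.endswith(c) for c in code):
                continue  # affix condition against earlier selections
            code.append(w)
            if len(code) == num_symbols:
                return code
    raise ValueError("max_len too small for the requested code size")

code = greedy_symmetric_rvlc(6)
print(code)  # ['00', '11', '010', '101', '0110', '1001']
assert all(w == w[::-1] for w in code)  # every codeword is symmetrical
```

Because every codeword is a palindrome, prefix-freeness of the set automatically implies suffix-freeness, which is what makes such codes decodable in both directions.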
(Columns: Symbols; Probabilities; Huffman code; Takishima, Yan, Tsai and Jeong
symmetric RVLCs; Golomb-Rice, Takishima and Tsai asymmetric RVLCs; the last row
gives the average codeword length of each code.)
Table 5.1: RVLCs generated using different RVLC construction algorithms on Buttigieg
probability distribution
Similar to Table 5.1, Table 5.2 illustrates the reversible variable length codewords generated
using different RVLC construction algorithms on Tsai probability distribution.
(Columns: Symbols; Probabilities; Huffman code; Takishima, Yan, Tsai and Jeong
symmetric RVLCs; Tsai, Golomb-Rice and Takishima asymmetric RVLCs; the last row
gives the average codeword length of each code.)
Table 5.2: RVLCs generated using different RVLC construction algorithms on Tsai
probability distribution
Table 5.3 presents the reversible variable length codewords generated by the
different algorithms on the 1-gram probability distribution.
(Columns: Symbols; Probabilities; Huffman code; Takishima, Tsai, Jeong and Yan
symmetric RVLCs; Tsai, Golomb-Rice and Takishima asymmetric RVLCs; the last row
gives the average codeword length of each code.)
Table 5.3: RVLCs generated using different RVLC construction algorithms on 1-gram
probability distribution
The values of average codeword lengths and maximum codeword lengths obtained for
different construction algorithms are given in Table 5.4.
Average codeword length:

Distribution | Huffman  | Takishima | Tsai       | Jeong      | Yan        | Takishima | Tsai     | Golomb and Rice
             |          | (symm)    | (symm)     | (symm)     | (symm)     | (asymm)   | (asymm)  | (asymm)
1-gram       | 4.1568   | 4.7143    | 4.6157     | 4.4608     | 4.4608     | 4.6168    | 4.3085   | 4.8877
Buttigieg    | 4.2046   | 4.7799    | 4.6841     | 4.5418     | 4.5418     | 4.6848    | 4.3662   | 5.0394
Tsai         | 4.155723 | 4.70029173| 4.60741743 | 4.46463681 | 4.46463681 | 4.607818  | 4.308777 | 4.892179

Maximum codeword length: the least value (10) is obtained for the Tsai, Jeong and Yan
symmetric RVLCs, while the largest (14) is obtained for the Golomb and Rice RVLC.
Table 5.4: Average codeword length and maximum codeword length of RVLCs constructed
using different construction algorithms on different distributions
From Table 5.4, it can be seen that the minimum value of the average codeword length of
RVLCs is obtained by using either the Yan algorithm or the Jeong algorithm over the 1-gram
distribution. Although the Huffman algorithm gives the minimum average codeword length,
it doesn't provide any error resiliency. It may be noted that the 1-gram distribution is a well
known distribution in cryptography [99], but it had never been used before in the analysis of
RVLCs.
The conclusions drawn from Table 5.4 can be summarized as follows:
If we compare on the basis of maximum codeword length, the Golomb and Rice RVLC
construction algorithm is the worst of these algorithms.
The least value of maximum codeword length is obtained for the Tsai, Yan and Jeong
symmetrical RVLC construction algorithms.
Among symmetric RVLCs, the least values of average codeword length are obtained
for the Jeong and Yan symmetric RVLC algorithms.
Thus, it may be concluded that the performance of the Yan and Jeong symmetrical RVLC
algorithms over the 1-gram distribution is better than that of the other construction
algorithms in terms of both minimum average codeword length and minimum value of
maximum codeword length. Yan is a Huffman-independent algorithm, while Jeong is
Huffman dependent. Therefore, we choose the Yan algorithm over the other algorithms to
evaluate the performance of RVLCs in the multimedia applications discussed in the next
section.
In this chapter, our purpose is to evaluate the performance of different entropy encoders. For
this, we require ways to measure it mathematically. The performance of a compression
algorithm can be measured in a number of different ways. We could measure the complexity
of the algorithm, the memory required to implement it, how fast it performs on a given
machine, the amount of compression, and how closely the reconstruction resembles the
original [129]. We have used the following measures of performance for both image coding
and video coding:
1. Compression Ratio;
2. Peak Signal to Noise Ratio (PSNR);
3. Total Encoding Bits;
The PSNR is defined as

PSNR = 10 log10 ( MAX^2 / MSE ),

where MAX is the maximum possible intensity value for a pixel in an image, i.e. 255 for
8-bit images. For two m x n monochrome images I and K, where one of the
images is considered a lossy approximation of the other, the Mean Square Error (MSE) is
defined as:

MSE = (1 / (m n)) * sum_{i=1}^{m} sum_{j=1}^{n} [ I(i, j) - K(i, j) ]^2,

where I(i, j) represents a pixel of the image I at position (i, j). The total number of
encoding bits simply indicates how many bits are generated at the output of the encoder of
the image compression or the video compression.
4. Maximum Codeword Length: The maximum codeword length is an important
performance measure used to compare different coding methods. In practical
systems, a very high value of the maximum codeword length (say, 80 or 90) creates
difficulty in assigning codewords to the message symbols and also creates problems
in decoding.
5. Average Codeword Length: Here we consider a DMS (Discrete Memoryless Source)
emitting M messages with probabilities p_1, p_2, ..., p_M, where the sum of the
probabilities of the symbols equals one. If the messages are encoded into binary
codewords, with the i-th message encoded into a codeword of length l_i bits, the
average codeword length of the code, L, is defined as the average number of bits
required for transmitting a symbol:

L = sum_{i=1}^{M} p_i * l_i.
The above mentioned performance measures have been used to evaluate the performances of
image compression and video compression. Next, we discuss simulation results obtained for
image compression and video compression.
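The measures above can be sketched in a few lines of code; the function names below are ours, and the tiny 2x2 "images" are purely illustrative:

```python
import math

def mse(img1, img2):
    """Mean square error between two equal-sized greyscale images,
    given as lists of rows of pixel intensities."""
    m, n = len(img1), len(img1[0])
    return sum((img1[i][j] - img2[i][j]) ** 2
               for i in range(m) for j in range(n)) / (m * n)

def psnr(img1, img2, max_val=255):
    """Peak signal to noise ratio in dB: 10 log10(MAX^2 / MSE)."""
    return 10 * math.log10(max_val ** 2 / mse(img1, img2))

def average_codeword_length(probs, lengths):
    """L = sum_i p_i * l_i for a discrete memoryless source."""
    return sum(p * l for p, l in zip(probs, lengths))

original = [[0, 0], [0, 0]]
decoded  = [[2, 2], [2, 2]]           # every pixel off by 2
print(round(mse(original, decoded), 2))   # 4.0
print(round(psnr(original, decoded), 2))  # 42.11 dB
print(average_codeword_length([0.5, 0.25, 0.25], [1, 2, 2]))  # 1.5
```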
Q10 =
 80  55  50  80 120 200 255 255
 60  60  70  95 130 255 255 255
 70  65  80 120 200 255 255 255
 70  85 110 145 255 255 255 255
 90 110 185 255 255 255 255 255
120 175 255 255 255 255 255 255
245 255 255 255 255 255 255 255
255 255 255 255 255 255 255 255

Q50 =
 16  11  10  16  24  40  51  61
 12  12  14  19  26  58  60  55
 14  13  16  24  40  57  69  56
 14  17  22  29  51  87  80  62
 18  22  37  56  68 109 103  77
 24  35  55  64  81 104 113  92
 49  64  78  87 103 121 120 101
 72  92  95  98 112 100 103  99

Q90 =
  3   2   2   3   5   8  10  12
  2   2   3   4   5  12  12  11
  3   3   3   5   8  11  14  11
  3   3   4   6  10  17  16  12
  4   4   7  11  14  22  21  15
  5   7  11  13  16  21  23  18
 10  13  16  17  21  24  24  20
 14  18  19  20  22  20  21  20

Figure 5.1: Standard quantization matrices for different quality levels: Q10, Q50 and Q90
The effects of the different quantization matrices are: the use of Q10 produces a large number
of zeros, while Q90 results in very few zeros.
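Q10 and Q90 follow from the base Q50 matrix via the common JPEG quality-scaling rule (as popularized by the IJG reference implementation); the sketch below assumes that rule:

```python
def scale_quant_matrix(q50, quality):
    """Scale the base Q50 matrix to a quality level in 1..100 using the
    common JPEG rule: S = 5000/quality if quality < 50, else 200 - 2*quality;
    each entry becomes floor((S*q + 50)/100), clamped to [1, 255]."""
    s = 5000 / quality if quality < 50 else 200 - 2 * quality
    return [[min(255, max(1, (int(s) * q + 50) // 100)) for q in row]
            for row in q50]

Q50_ROW1 = [16, 11, 10, 16, 24, 40, 51, 61]  # first row of the base matrix

print(scale_quant_matrix([Q50_ROW1], 10)[0])  # [80, 55, 50, 80, 120, 200, 255, 255]
print(scale_quant_matrix([Q50_ROW1], 90)[0])  # [3, 2, 2, 3, 5, 8, 10, 12]
print(scale_quant_matrix([Q50_ROW1], 50)[0])  # unchanged: the base row itself
```

Larger entries divide away more of the high-frequency DCT coefficients, which is why Q10 yields many zeros after quantization and Q90 very few.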
We first select a value of the quality factor that gives good results for JPEG image
compression. Next, using that quality factor, we analyze the performance of the image
compression technique for different entropy encoders. Thus, the steps for analyzing the
performance of image compression are:
1. Step 1: Analyze the performance of the image coder for different quantization matrices
corresponding to different quality factors;
2. Step 2: Using the optimised quantization matrix obtained from the results of Step 1,
evaluate the performance of the image coder for the three entropy encoders: Huffman
codes, RVLCs and VLECs.
For the evaluation of the performance measures of image compression, we take four standard
images from the literature: Cameraman, Lena, Airplane and Pepper [131].
It may be noted from Figure 5.3 that the decoded image is highly degraded if Q10 is used as
the quantization matrix. The quality of the image is worst in regions of low frequency
components, such as the background, and the image shows small blocks of uniform intensity.
There is not much difference in perceptual quality between Figure 5.4 and Figure 5.5.
Figure 5.5 presents a slightly better result than Figure 5.4, due to the higher value of the
quality factor and the lower compression of the original image. Figure 5.4 gives an optimised
result in the sense that an approximate perception of the image is obtained with a smaller
number of transmitted bits. With a quality level of 50, this matrix renders both high
compression and excellent decompressed image quality.
The results for Cameraman are discussed in Figure 5.2 to Figure 5.5. We applied
image compression to several other images; the results for Lena, Pepper
and Airplane [131] with different quantization matrices are given in Appendix B.
Figure 5.4 and Figure 5.5 show the decoded images for quantization matrices Q50 and
Q90 respectively. For a particular value of the quality factor, the same decoded image is
obtained using the different entropy encoders over a noiseless channel.
Simulation results are compiled in Table 5.5. The values of PSNR, compression ratio and
total number of encoding bits are computed for each entropy encoder. The values are
computed for each quality factor: Q10, Q50 and Q90. In the noiseless case (no error
introduced), the PSNR and the compression ratio remain the same for all the entropy
encoders, because the same decoded image is obtained at the decoder.
Quality | Entropy   | PSNR    | Compression | Maximum  | Average  | Total Number of
Level   | Technique | (dB)    | Ratio       | Codeword | Codeword | Encoding Bits
        |           |         |             | Length   | Length   |
Q = 10  | Huffman   | 26.4635 | 9.0082      | 12       | 4.7610   | 8014
        | RVLC      | 26.4635 | 9.0082      | 15       | 5.0660   | 8529
        | VLEC      | 26.4635 | 9.0082      | 23       | 6.1940   | 10421
Q = 50  | Huffman   | 31.6631 | 7.4711      | 14       | 5.2895   | 9656
        | RVLC      | 31.6631 | 7.4711      | 17       | 5.9528   | 10274
        | VLEC      | 31.6631 | 7.4711      | 35       | 6.5558   | 12556
Q = 90  | Huffman   | 34.3669 | 6.4444      | 15       | 5.5220   | 11200
        | RVLC      | 34.3669 | 6.4444      | 21       | 6.1869   | 12559
        | VLEC      | 34.3669 | 6.4444      | 35       | 6.6710   | 13555
Table 5.5: Results of image compression with no error introduction in the channel
For a noiseless channel, we may draw the following conclusions from Table 5.5:
The PSNR and the compression ratio remain the same for the three coding techniques for
a particular quality factor;
The compression ratio decreases, and the PSNR increases, as the quality factor increases
from 10 to 90;
Case 2- Noisy channel with a single bit error introduced into the channel
The advantage of error resilient RVLCs over Huffman codes can be observed in a noisy
environment. If we use Huffman coding as the entropy encoder, then in the presence of an
error, synchronisation is lost in the forward decoding of the bit-stream at the point of error.
Here, we present MATLAB simulated results of image compression for the Cameraman
input image for the different entropy encoders.
Figure 5.6 represents the original image, which was encoded and transmitted using the
Huffman entropy encoder. An error was introduced in the bit-stream at a random position
and the received bit-stream was decoded at the receiver. Figure 5.7 represents the decoded
image.
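The desynchronisation can be demonstrated with a toy prefix code (the code table below is ours, purely for illustration, not the JPEG Huffman table): flipping a single bit causes the greedy decoder to emit wrong symbols from that point on.

```python
CODE = {"a": "0", "b": "10", "c": "110", "d": "111"}  # a toy prefix code

def encode(msg):
    return "".join(CODE[s] for s in msg)

def decode(bits):
    """Greedy forward decoding of a prefix code."""
    inv, out, word = {v: k for k, v in CODE.items()}, [], ""
    for b in bits:
        word += b
        if word in inv:          # a complete codeword has been read
            out.append(inv[word])
            word = ""
    return "".join(out)

bits = encode("abcd")                                  # "010110111"
corrupted = bits[:1] + ("1" if bits[1] == "0" else "0") + bits[2:]
print(decode(bits))       # abcd
print(decode(corrupted))  # aaacd - symbols after the flipped bit are wrong
```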
Case 3- Noisy channel with more than one error introduced into the channel
As observed above, the performance of the RVLC is undoubtedly better than that of the
Huffman code in a noisy environment. In a noisy channel with a single bit error introduced,
the Huffman code is unable to decode completely, but the RVLC decodes the bit-stream
except for a small segment containing the erroneous bit. If more than one error is introduced
in the channel, the RVLC also fails to decode completely. In such a situation, we demonstrate
how VLECs perform better than RVLCs or Huffman codes. The VLEC gives a promising
solution in the noisy channel environment. Let us assume that the noisy channel has
introduced two errors in the entropy-encoded bit-stream. The original image is shown in
Figure 5.12. Figure 5.13 displays the Huffman decoded image. It can be seen that after the
occurrence of the first error, the Huffman decoder fails completely. Figure 5.14 shows the
RVLC decoded image. After the occurrence of the first error, backward decoding is
performed. Combining the results of forward and backward decoding, the RVLC decoded
image is constructed; it demonstrates the loss of bit-stream information between the two
extreme error positions. Figure 5.15 represents the VLEC-decoded image.
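The forward/backward decoding just described can be sketched as follows. The symmetric code table below is our own illustrative example; since symmetric codewords are palindromes, the very same table serves both decoding directions:

```python
RVLC = {"a": "000", "b": "111", "c": "010", "d": "101", "e": "0110"}
INV = {v: k for k, v in RVLC.items()}
MAXLEN = max(len(v) for v in RVLC.values())

def partial_decode(bits):
    """Greedy decoding; stops at the first position where no codeword fits,
    returning the symbols recovered so far and the stop position."""
    out, i = [], 0
    while i < len(bits):
        for length in range(1, MAXLEN + 1):
            if bits[i:i + length] in INV:
                out.append(INV[bits[i:i + length]])
                i += length
                break
        else:
            break  # decoding failure: remaining bits are undecodable
    return out, i

bits = "".join(RVLC[s] for s in "abe")   # 000 111 0110
corrupted = bits[:3] + "0" + bits[4:]    # flip one bit inside "b"
fwd, stop = partial_decode(corrupted)
# The codewords are palindromes, so backward decoding simply reuses
# the same table on the reversed bit-stream.
bwd, _ = partial_decode(corrupted[::-1])
print(fwd, bwd)  # forward recovers 'a' at the head; backward recovers 'e' at the tail
```

Only the segment between the two failure positions is lost, which is exactly the behaviour of the RVLC decoded images described above.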
large values of average codeword length, maximum codeword length and the compression
ratio. One possibility is to replace the Golomb and Rice RVLC by another RVLC to obtain
improved results.
Different algorithms to construct RVLCs have been implemented, and their comparative
analysis was reported in Section 5.2.2. The conclusion drawn from the analysis was that the
Yan construction algorithm for generating symmetric RVLCs is the best among the
RVLC construction algorithms considered. In this section, we evaluate the performance of
H.263++ video compression with three different entropy encoders:
1. Huffman coding method (VLC);
2. Conventional RVLCs used in H.263++ (Golomb and Rice RVLCs);
3. Yan RVLCs.
Huffman is selected because it is traditionally the best source code; the conventional
H.263++ video coder uses Golomb and Rice RVLCs, and Yan symmetric RVLCs are the
proposed replacement. Three videos (named Video-1, Video-2 and Video-3) have been used
in the database to measure the performance of the different coding schemes. The video
details are shown in Table 5.6.
Video Parameters | Video-1 (adobe.avi) | Video-2 (gunman.avi) | Video-3 (ball.avi)
Duration         | 2.141 sec           | 8.942 sec            | 16.667 sec
File size        | 4.13 MB             | 509 KB               | 1.54 MB
Frame height     | 538                 | 240                  | 240
Frame width      | 669                 | 320                  | 320
Number of frames |                     | 268                  | 500
Frame rate (fps) |                     | 29.97                | 30
MSE              |                     | 3.265                | 10.62
PSNR             |                     | 43.71 dB             | 38.02 dB
Table 5.6: Comparison of the videos on different video quality measures: MSE and PSNR
The performance of the different entropy codes (Huffman codes, Golomb and Rice RVLCs
and Yan RVLCs) is compared on the basis of the following indices:
1. Compression Ratio;
2. MSE (Mean Square Error);
3. PSNR (Peak Signal to Noise Ratio);
4. Total encoding bits;
5. Average number of bits per symbol.
Next, we present the MATLAB simulated results of the video coder for the rhino input
video. A few frames of the input video are shown in Figure 5.16.
Three different entropy encoders (Huffman code, Golomb and Rice RVLC and Yan RVLC)
are applied to the H.263++ video coding standard. The values of PSNR and MSE are given in
Table 5.6. The obtained values of maximum codeword length, average codeword length and
total encoding bits for different entropy encoders are given in Table 5.7 for Video-1.
Entropy encoding method                     | Maximum codeword length (L) | Total encoding bits | Average codeword length
Huffman                                     | 17   | 1036396    | 2.99
Yan RVLC                                    | 17   | 1053948    | 3.04
Yan RVLC                                    | 14   | 1106316    | 3.19
Yan RVLC                                    | 13   | >1106316   |
Conventional H.263++ RVLC (Golomb and Rice) | 28   | 1163627    | 3.35
Conventional H.263++ RVLC (Golomb and Rice) | >>28 | >1163627   |
Conventional H.263++ RVLC (Golomb and Rice) | 20   | 1096634    | 3.16
Conventional H.263++ RVLC (Golomb and Rice) | 20   | >1096634   |
Conventional H.263++ RVLC (Golomb and Rice) | 20   | >1096634   |
Conventional H.263++ RVLC (Golomb and Rice) | >87  | Very Large |
Conventional H.263++ RVLC (Golomb and Rice) | 87   | Very Large |
Table 5.7: Comparison table of all entropy encoders for the first video
Similar comparison tables are obtained for Video-2 and Video-3; the details of these tables
are given in Appendix C. Table 5.8 shows the comparison of the different entropy encoders
in terms of average codeword length, maximum codeword length and total encoding bits.
The RVLCs have been generated using the Yan construction algorithm and the Golomb and
Rice construction algorithm for different values of the maximum codeword length.
Video   | Entropy encoding method   | Maximum codeword length | Total encoding bits | Average codeword length (bits/symbol)
VIDEO-1 | Huffman coding            | 17 | 1036396  | 2.99
        | Yan RVLC                  | 17 | 1053948  | 3.04
        | Yan RVLC                  | 14 | 1106316  | 3.19
        | Conventional H.263++ RVLC | 20 | 1096634  | 3.16
        | Conventional H.263++ RVLC | 20 | >1096634 | >3.16
VIDEO-2 | Huffman coding            | 17 | 33845    | 2.99
        | Yan RVLC                  | 17 | 38110    | 3.04
        | Yan RVLC                  | 14 | 39706    | 3.19
        | Conventional H.263++ RVLC | 20 | 41209    | 3.16
        | Conventional H.263++ RVLC | 20 | >41209   | >3.16
VIDEO-3 | Huffman coding            | 17 | 61437    | 2.99
        | Yan RVLC                  | 17 | 38110    | 3.04
        | Yan RVLC                  | 14 | 39706    | 3.19
        | Conventional H.263++ RVLC | 20 | 41209    | 3.16
        | Conventional H.263++ RVLC | 20 | >41209   | >3.16
Table 5.8: Comparison of different source codes on the basis of the encoded file sizes
From Table 5.8, it can be seen that with Yan RVLCs the average codeword length, maximum
codeword length and total encoding bits are smaller than the corresponding values obtained
using the conventional H.263++ RVLCs, for the same perceptual quality of the decoded
video.
5.6. CONCLUSIONS
In coding, in addition to the problems of error control (error detection and correction), there
are other problems for reliable communication, some of which relate to compression and
error resilience. An important class of codes studied here is that of Reversible Variable
Length Codes. RVLCs have been studied in detail by constructing them using seven
different algorithms, and thereafter a comparative analysis has been made among all the
construction algorithms. This analysis shows that the Yan algorithm is the best algorithm for
constructing RVLCs. Through MATLAB simulation of JPEG image compression in a noisy
environment, it has been shown that VLECs perform better than the other entropy encoders
(Huffman codes and RVLCs).
Also, through MATLAB simulation of H.263++ video coding standard, we have shown that
the performance of video coder is improved by using Yan RVLCs in place of conventional
RVLCs (Golomb and Rice RVLCs), for the same values of perceptual quality parameters
(PSNR, MSE and compression ratio).