
Chapter 3:

Multimedia Data
Compression
• Lossy and lossless compression
• Huffman coding
• Entropy coding
• Adaptive coding
• Dictionary-based coding (LZW)

Dr. Zafar Sheikh


Data Compression
• Branch of information theory
– minimize amount of information to be
transmitted
• Transform a sequence of characters into a
new string of bits
– same information content
– length as short as possible

Why Compress
• Raw data are huge.
• Audio:
  – CD-quality music: 44.1 kHz × 16 bit × 2 channels ≈ 1.41 Mbit/s
• Video:
  – near-DVD-quality true-color animation: 640 px × 480 px × 30 fps × 24 bit ≈ 221 Mbit/s
• Impractical for both storage and bandwidth (a quick check of the arithmetic follows)
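A quick sanity check of this arithmetic, as a Python sketch (the function and variable names are just for illustration):

def audio_bitrate_bps(sample_rate_hz, bits_per_sample, channels):
    # Uncompressed PCM bit rate in bits per second.
    return sample_rate_hz * bits_per_sample * channels

def video_bitrate_bps(width, height, fps, bits_per_pixel):
    # Uncompressed video bit rate in bits per second.
    return width * height * fps * bits_per_pixel

cd_audio = audio_bitrate_bps(44_100, 16, 2)       # ~1.41 Mbit/s
animation = video_bitrate_bps(640, 480, 30, 24)   # ~221 Mbit/s
print(f"CD audio : {cd_audio / 1e6:.2f} Mbit/s")
print(f"Raw video: {animation / 1e6:.0f} Mbit/s")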



Compression
Graphic file formats can be grouped into three types.
• The first type stores graphics uncompressed.
  – Windows BMP (.bmp) files are an example of this sort of format.
• The second type uses "non-lossy" or "lossless" compression.
  – Most graphic formats use lossless compression; the GIF format is among them.
• The third type of bitmapped graphic file format uses "lossy" compression.
  – Lossy compression discards fine detail: the detail that keeps areas from being a single uniform color, and hence keeps them from compressing well.
  – The discarded detail is, ideally, too subtle to be discernible by the eye.
Lossless is not enough!
• The best lossless audio and image compression ratios are normally only around 2:1 (half the original size)
• Lossy audio compression such as MP3 or Ogg Vorbis achieves roughly a 1/20 ratio at acceptable quality, and about 1/5 at near-transparent quality
• Lossy video compression can reduce a film to roughly 1/300 of its original size



Lossy Compression
• Aggressively discards information that we do not notice
• Highly content-specific
• Relies on the psychology of human perception



Lossy Audio Compression
• Works in the frequency domain
• Quantization
  – Perceptual importance varies across frequency bands
  – Higher frequencies get a larger quantization step (a small sketch follows this slide)
• Psychoacoustics
  – The pitch resolution of the ear is only about 2 Hz (without beating)
  – The threshold of hearing varies across bands
  – Simultaneous and temporal masking effects

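A minimal sketch of the "higher frequency, larger quantum" idea using NumPy's FFT. The test signal, the 1 kHz band split and the step sizes are invented illustration values, not parameters of any real audio codec:

import numpy as np

# Quantize spectral coefficients more coarsely at higher frequencies,
# where the ear tolerates larger errors.
fs = 8000                                        # sample rate (Hz), arbitrary
t = np.arange(fs) / fs
signal = np.sin(2 * np.pi * 440 * t) + 0.2 * np.sin(2 * np.pi * 3000 * t)

spectrum = np.fft.rfft(signal)
freqs = np.fft.rfftfreq(len(signal), d=1 / fs)

step = np.where(freqs < 1000, 0.5, 8.0)          # fine quanta below 1 kHz, coarse above
quantized = np.round(spectrum / step) * step     # uniform quantization per frequency bin

reconstructed = np.fft.irfft(quantized, n=len(signal))
print("max reconstruction error:", np.max(np.abs(signal - reconstructed)))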


Lossy Image Compression
• Works in the frequency domain
  – Discrete Cosine Transform (used in JPEG)
  – Discrete Wavelet Transform (used in JPEG 2000)
• Quantization
  – Reduces the less important data

Image data → Transform → Quantization → Entropy Coding → Output data

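A minimal sketch of this transform → quantization → entropy-coding pipeline on a single 8×8 block, assuming NumPy. The test block and the single quantization step of 16 are made up for illustration; a real JPEG encoder uses standard quantization tables, zig-zag scanning and Huffman (or arithmetic) coding of the quantized coefficients:

import numpy as np

def dct_matrix(n=8):
    # Orthonormal DCT-II basis matrix, as used for 8x8 block transforms.
    k = np.arange(n).reshape(-1, 1)
    x = np.arange(n).reshape(1, -1)
    c = np.sqrt(2.0 / n) * np.cos(np.pi * (2 * x + 1) * k / (2 * n))
    c[0, :] = np.sqrt(1.0 / n)
    return c

C = dct_matrix(8)
block = np.tile(np.linspace(0, 255, 8), (8, 1))   # smooth 8x8 test block

coeffs = C @ (block - 128) @ C.T                  # 2-D DCT of the level-shifted block
q = 16                                            # single quantization step (illustrative)
quantized = np.round(coeffs / q).astype(int)      # most high-frequency coefficients become 0

print(quantized)                                  # sparse integers: easy for RLE/Huffman
recon = C.T @ (quantized * q) @ C + 128           # dequantize and inverse-transform
print("max pixel error:", np.abs(recon - block).max())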


Broad Classification
• Entropy Coding (statistical)
  – lossless; independent of data characteristics
  – e.g., RLE (Run-Length Encoding), Huffman, LZW, arithmetic coding
• Source Coding
  – lossy; may consider the semantics of the data
  – depends on the characteristics of the data
  – e.g., DCT, DPCM, ADPCM, color model transforms
• Hybrid Coding (used by most multimedia systems)
  – combines entropy coding with source coding
  – e.g., JPEG-2000, H.264, MPEG-2, MPEG-4

Huffman Coding
• Huffman codes can be used to compress information
  – Like WinZip, although the ZIP format's DEFLATE compression actually combines LZ77 with Huffman coding rather than using Huffman alone
  – JPEGs do use Huffman coding as part of their compression process
• The basic idea is that instead of storing each character in a file as an 8-bit ASCII value, we store the more frequently occurring characters using fewer bits and the less frequently occurring characters using more bits
  – On average this should decrease the file size (often to roughly half)



Huffman Coding

• As an example, let's take the string:

  "duke blue devils"

• We first do a frequency count of the characters:
  – e:3, d:2, u:2, l:2, space:2, k:1, b:1, v:1, i:1, s:1
• Next we use a greedy algorithm to build up a Huffman tree
  – We start with a node for each character

e,3 d,2 u,2 l,2 sp,2 k,1 b,1 v,1 i,1 s,1
Huffman Coding
• We then pick the two nodes with the smallest frequency and combine them to form a new node
  – The selection of these nodes is the greedy part
• The two selected nodes are removed from the set, but are replaced by the combined node
• This continues until we have only one node left in the set (a Python sketch of this construction follows)

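A short Python sketch of this greedy construction using a min-heap. Ties between equal weights may be broken differently than on the following slides, so the exact codes can differ, but the code lengths and the total encoded size come out the same:

import heapq
from collections import Counter

def huffman_codes(text):
    # Build a Huffman tree greedily and return {character: bit string}.
    freq = Counter(text)
    # Heap entries: (weight, tie breaker, tree); a tree is a char or a [left, right] pair.
    heap = [(w, i, ch) for i, (ch, w) in enumerate(freq.items())]
    heapq.heapify(heap)
    count = len(heap)
    while len(heap) > 1:                          # repeatedly merge the two lightest trees
        w1, _, t1 = heapq.heappop(heap)
        w2, _, t2 = heapq.heappop(heap)
        heapq.heappush(heap, (w1 + w2, count, [t1, t2]))
        count += 1
    codes = {}
    def walk(tree, prefix=""):
        if isinstance(tree, list):                # internal node: 0 = left, 1 = right
            walk(tree[0], prefix + "0")
            walk(tree[1], prefix + "1")
        else:
            codes[tree] = prefix or "0"           # degenerate single-symbol case
        return codes
    return walk(heap[0][2])

codes = huffman_codes("duke blue devils")
for ch, code in sorted(codes.items(), key=lambda kv: (len(kv[1]), kv[0])):
    print(repr(ch), code)
print("total bits:", sum(len(codes[c]) for c in "duke blue devils"))   # 52 for this string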


Huffman Coding

e,3 d,2 u,2 l,2 sp,2 k,1 b,1 v,1 i,1 s,1



Huffman Coding

[Tree diagram: i,1 and s,1 are merged into a node of weight 2. Roots: e,3  d,2  u,2  l,2  sp,2  k,1  b,1  v,1  (i s),2]


Huffman Coding

[Tree diagram: b,1 and v,1 are merged into a node of weight 2. Roots: e,3  d,2  u,2  l,2  sp,2  k,1  (b v),2  (i s),2]


Huffman Coding

[Tree diagram: k,1 and the (b v) node are merged into a node of weight 3. Roots: e,3  d,2  u,2  l,2  sp,2  (k b v),3  (i s),2]


Huffman Coding

[Tree diagram: l,2 and sp,2 are merged into a node of weight 4. Roots: e,3  d,2  u,2  (l sp),4  (k b v),3  (i s),2]


Huffman Coding

[Tree diagram: d,2 and u,2 are merged into a node of weight 4. Roots: e,3  (d u),4  (l sp),4  (k b v),3  (i s),2]


Huffman Coding

[Tree diagram: the (i s) node and the (k b v) node are merged into a node of weight 5. Roots: e,3  (d u),4  (l sp),4  (i s k b v),5]


Huffman Coding

[Tree diagram: e,3 and the (d u) node are merged into a node of weight 7. Roots: (e d u),7  (l sp),4  (i s k b v),5]


Huffman Coding

[Tree diagram: the (l sp) node and the weight-5 node are merged into a node of weight 9. Roots: (e d u),7  (l sp i s k b v),9]


Huffman Coding

[Tree diagram: the final merge combines the weight-7 and weight-9 nodes into the root, of weight 16. The Huffman tree is complete.]
Huffman Coding
• Now we assign codes to the tree by
placing a 0 on every left branch and a 1 on
every right branch
• A traversal of the tree from root to leaf gives the Huffman code for that particular leaf's character
• Note that no code is the prefix of another
code



Huffman Coding

[Tree diagram: the completed tree, with 0 on every left branch and 1 on every right branch.]

  e   00
  d   010
  u   011
  l   100
  sp  101
  i   1100
  s   1101
  k   1110
  b   11110
  v   11111


Huffman Coding

• These codes are then used to encode the string


• Thus, “duke blue devils” turns into:
010 011 1110 00 101 11110 100 011 00 101 010 00 11111 1100 100 1101

• When grouped into 8-bit bytes:


01001111 10001011 11101000 11001010 10001111 11100100 1101xxxx

• Thus it takes 7 bytes of space, compared to 16 characters × 1 byte/char = 16 bytes uncompressed (a packing sketch follows)

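A small sketch of the encoding and byte-packing step, hard-coding the code table from the previous slide; the "xxxx" padding bits of the last byte are simply filled with zeros here:

# Code table from the tree built on the previous slides.
CODES = {'e': '00', 'd': '010', 'u': '011', 'l': '100', ' ': '101',
         'i': '1100', 's': '1101', 'k': '1110', 'b': '11110', 'v': '11111'}

def huffman_encode(text, codes):
    # Concatenate per-character codes and pack them into zero-padded bytes.
    bits = "".join(codes[ch] for ch in text)
    padded = bits + "0" * (-len(bits) % 8)               # pad the final partial byte
    data = bytes(int(padded[i:i + 8], 2) for i in range(0, len(padded), 8))
    return bits, data

bits, data = huffman_encode("duke blue devils", CODES)
print(len(bits), "bits ->", len(data), "bytes")          # 52 bits -> 7 bytes
print(" ".join(f"{b:08b}" for b in data))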


Huffman Coding
• Uncompressing works by reading in the file bit
by bit
– Start at the root of the tree
– If a 0 is read, head left
– If a 1 is read, head right
– When a leaf is reached, decode that character and start over again at the root of the tree (a decoding sketch follows this slide)
• Thus, we need to save Huffman table
information as a header in the compressed file
– Doesn’t add a significant amount of size to the file for
large files (which are the ones you want to compress
anyway)
– Or we could use a fixed, universal set of codes/frequencies
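A matching decoding sketch: because the code is prefix-free, we can scan the bit stream and emit a character as soon as the accumulated bits match a code, which is equivalent to walking the tree from the root. It assumes the same hard-coded table as the encoding sketch above:

CODES = {'e': '00', 'd': '010', 'u': '011', 'l': '100', ' ': '101',
         'i': '1100', 's': '1101', 'k': '1110', 'b': '11110', 'v': '11111'}

def huffman_decode(bits, codes, n_chars):
    # Decode a bit string using a prefix-free code table.
    lookup = {code: ch for ch, code in codes.items()}
    out, current = [], ""
    for bit in bits:
        current += bit                    # descend one branch per bit
        if current in lookup:             # reached a leaf
            out.append(lookup[current])
            current = ""
            if len(out) == n_chars:       # stop before any padding bits
                break
    return "".join(out)

bits = ("010" "011" "1110" "00" "101" "11110" "100" "011"
        "00" "101" "010" "00" "11111" "1100" "100" "1101")
print(huffman_decode(bits, CODES, 16))    # -> "duke blue devils"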
Entropy Coding Algorithms
(Content Dependent Coding)
• Run-length Encoding (RLE)
– Replaces a run of identical consecutive bytes with the byte and its number of occurrences (a sketch follows below)
– The number of occurrences is introduced by a special flag character (e.g., !)
– Example:
• abcccccccccdeffffggg (20 Bytes)
• abc!9def!4ggg (13 bytes)
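A small Python sketch of this flavor of RLE, using "!" as the flag and only encoding runs of four or more characters (so that "ggg" is left alone, as in the example above). The minimum run length, and the assumption that "!" and digits never occur in the data, are simplifications of this sketch:

import re

def rle_encode(text, flag="!", min_run=4):
    # Replace runs of >= min_run identical characters with <char><flag><count>.
    out = []
    for match in re.finditer(r"(.)\1*", text):   # maximal runs of a single character
        run = match.group(0)
        if len(run) >= min_run:
            out.append(run[0] + flag + str(len(run)))
        else:
            out.append(run)
    return "".join(out)

print(rle_encode("abcccccccccdeffffggg"))   # -> "abc!9def!4ggg" (20 bytes -> 13 bytes)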

Variations of RLE (Zero-suppression
technique)
• Assumes that only one symbol appears
often (blank)
• Replace a sequence of blanks by an M-byte and a byte giving the number of blanks in the sequence (a sketch follows this slide)
– Example: M3, M4, M14,…
• Some other definitions are possible
– Example:
• M4 = 8 blanks, M5 = 16 blanks, M4M5=24 blanks

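A minimal sketch of the basic variant, where each run of blanks is replaced by an M-byte followed by the run length; the "M" marker, the threshold of three blanks and the sample string are illustrative choices only:

import re

def suppress_blanks(text, marker="M", min_run=3):
    # Replace runs of >= min_run blanks with <marker><number of blanks>.
    def repl(match):
        return f"{marker}{len(match.group(0))}"
    return re.sub(r" {%d,}" % min_run, repl, text)

print(suppress_blanks("name:   John      city:    Oslo"))
# -> "name:M3JohnM6city:M4Oslo" (31 characters -> 24)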
Adaptive Coding
Motivations:
– The previous algorithms (e.g., Huffman) require statistical knowledge of the source, which is often not available (e.g., for live audio or video).
– Even when it is available, transmitting it can be a heavy overhead.
– Higher-order models incur still more overhead: a 255-entry probability table would be required for an order-0 model, while an order-1 model would require 255 such tables (an order-1 model considers the probabilities of pairs of symbols).
The solution is to use adaptive algorithms. Adaptive Huffman coding is one such mechanism, and the one we will study here.
The idea of adaptiveness is, however, applicable to other compression algorithms as well.



Adaptive Coding

ENCODER

    initialize_model();
    do {
        c = getc(input);
        encode(c, output);
        update_model(c);
    } while (c != eof);

DECODER

    initialize_model();
    while ((c = decode(input)) != eof) {
        putc(c, output);
        update_model(c);
    }

• The key is that both encoder and decoder use exactly the same initialize_model and update_model routines.



The Sibling Property
The node numbers will be assigned in such a way
that:
1. A node with a higher weight will have a higher node
number
2. A parent node will always have a higher node number
than its children.
In a nutshell, the sibling property requires that the
nodes (internal and leaf) are arranged in order of
increasing weights.
The update procedure swaps nodes in violation of
the sibling property.
– The identification of nodes in violation of the sibling
property is achieved by using the notion of a block.
– All nodes that have the same weight are said to belong
to one block.
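As a small illustration, the sibling property can be checked mechanically if each node carries its node number, weight and parent number; the tiny trees below are invented purely to exercise the check:

def satisfies_sibling_property(nodes):
    # nodes: {node_number: (weight, parent_number)}; the root's parent is None.
    # Checks that weights are non-decreasing in node-number order and that
    # every parent has a higher node number than its children.
    ordered = sorted(nodes)
    weights = [nodes[n][0] for n in ordered]
    non_decreasing = all(a <= b for a, b in zip(weights, weights[1:]))
    parents_higher = all(parent is None or parent > n
                         for n, (_, parent) in nodes.items())
    return non_decreasing and parents_higher

# Leaves 1 and 2 under internal root node 3.
print(satisfies_sibling_property({1: (2, 3), 2: (3, 3), 3: (5, None)}))   # True
print(satisfies_sibling_property({1: (4, 3), 2: (3, 3), 3: (7, None)}))   # False: weights out of order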
Flowchart of the update procedure

The Huffman tree is initialized with a single node, known as the Not-Yet-Transmitted (NYT) or escape code. This code will be sent every time that a new character, which is not in the tree, is encountered, followed by the ASCII encoding of the character. This allows the decompressor to distinguish between a code and a new character. When that happens, the old NYT node "gives birth" to a new NYT node and a new external node for the character.

[Flowchart, summarized: on the first appearance of a symbol, the NYT node gives birth to a new NYT node and an external node for the symbol, the weights of the new external node and the old NYT node are incremented and the node numbers adjusted, and the procedure continues from the old NYT node; otherwise the procedure starts at the symbol's external node. At each node on the way up, if the node does not have the highest node number in its block it is switched with the highest-numbered node in the block, its weight is incremented, and the procedure moves to the parent node, stopping once the root has been processed.]
Example

Counts (number of occurrences): B:2, C:2, D:2, E:10

[Tree diagrams: the initial Huffman tree is a single NYT node (#0). The second tree shows an example Huffman tree after some symbols have been processed, in accordance with the sibling property: the root #8 (W=16) has children #6 (W=6) and E #7 (W=10); node #6 has children #4 (W=2, containing NYT #0 and B #1) and #5 (W=4, containing C #2 and D #3).]
Example

Counts (number of occurrences): A:1, B:2, C:2, D:2, E:10

[Tree diagram: the Huffman tree after the first appearance of symbol A. The old NYT node has given birth to a new NYT node (#0) and an external node A (#1, W=1); their parent is node #2 (W=1), and the weights on the path to the root are incremented (node #6: W=3, node #8: W=7, root #10: W=17).]


Increment

Counts: A:2, B:2, C:2, D:2, E:10

[Tree diagram: an increment in the count for A propagates up to the root; the weights on the path become 2 (A), 2, 4, 8 and 18 (root).]


Swapping

Counts: A:3, B:2, C:2, D:2, E:10

[Tree diagrams: another increment in the count for A would violate the sibling property, so node #1 (A, weight 2) is swapped with node #5 (D, weight 2), the highest-numbered node in its block. After the swap, A's weight is incremented to 3 and the increment propagates up to the root, which becomes W=19.]
Swapping … contd.

Counts: A:4, B:2, C:2, D:2, E:10

[Tree diagram: another increment in the count for A propagates up to the root without any further swap (A: W=4, node #7: W=6, node #8: W=10, root: W=20).]


Swapping … contd.

Counts: A:5, B:2, C:2, D:2, E:10

[Tree diagram: another increment in the count for A causes a swap of sub-trees: node #5 (A, W=4) is swapped with node #6, the internal node of weight 4.]


Swapping … contd.

Counts: A:5, B:2, C:2, D:2, E:10

[Tree diagram: further swapping is needed to fix the tree as the increment propagates upward: nodes #8 and #9 are swapped.]


Swapping … contd.

Counts: A:5, B:2, C:2, D:2, E:10

[Tree diagram: the final tree after all swaps and increments. E (W=10) now sits at node #8, the internal node of weight 11 at node #9, and the root has weight 21.]


Lempel-Ziv-Welch (LZW) Compression Algorithm

• Introduction to the LZW Algorithm
• Example 1: Encoding using LZW
• Example 2: Decoding using LZW
• LZW: Concluding Notes



Introduction to LZW

• As mentioned earlier, static coding schemes require some knowledge about the data before encoding takes place.
• Universal coding schemes, like LZW, do not require advance knowledge and can build such knowledge on the fly.
• LZW is the foremost technique for general-purpose data compression due to its simplicity and versatility.
• It is the basis of many PC utilities that claim to "double the capacity of your hard drive".
• LZW compression uses a code table, with 4096 as a common choice for the number of table entries.
Introduction to LZW (cont'd)

• Codes 0-255 in the code table are always assigned to represent single bytes from the input file.
• When encoding begins, the code table contains only the first 256 entries, with the remainder of the table being blank.
• Compression is achieved by using codes 256 through 4095 to represent sequences of bytes.
• As the encoding continues, LZW identifies repeated sequences in the data and adds them to the code table.
• Decoding is achieved by taking each code from the compressed file and translating it through the code table to find what character or characters it represents.
LZW Encoding Algorithm
Initialize table with single character strings
P = first input character
WHILE not end of input stream
    C = next input character
    IF P + C is in the string table
        P = P + C
    ELSE
        output the code for P
        add P + C to the string table
        P = C
END WHILE
output code for P

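A compact Python sketch of this encoder. The table here is a dict keyed by strings and new codes start at 256, as on the slides; a real implementation would also cap the table at 4096 entries and emit the codes as 12-bit values:

def lzw_encode(text):
    # LZW compression: returns the list of output codes.
    table = {chr(i): i for i in range(256)}      # codes 0-255 = single characters
    next_code = 256
    p = text[0]
    output = []
    for c in text[1:]:
        if p + c in table:
            p = p + c                            # extend the current match
        else:
            output.append(table[p])              # emit the code for the longest match
            table[p + c] = next_code             # learn the new sequence
            next_code += 1
            p = c
    output.append(table[p])                      # emit the code for the final match
    return output

print(lzw_encode("BABAABAAA"))   # -> [66, 65, 256, 257, 65, 260]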


Example 1: Compression using LZW

Example 1: Use the LZW algorithm to compress the string

BABAABAAA



Example 1: LZW Compression Step 1

BABAABAAA P=A
C=empty
ENCODER OUTPUT STRING TABLE
output code representing codeword string
66 B 256 BA



Example 1: LZW Compression Step 2

BABAABAAA P=B
C=empty
ENCODER OUTPUT STRING TABLE
output code representing codeword string
66 B 256 BA
65 A 257 AB



Example 1: LZW Compression Step 3

BABAABAAA P=A
C=empty
ENCODER OUTPUT STRING TABLE
output code representing codeword string
66 B 256 BA
65 A 257 AB
256 BA 258 BAA



Example 1: LZW Compression Step 4

BABAABAAA P=A
C=empty
ENCODER OUTPUT STRING TABLE
output code representing codeword string
66 B 256 BA
65 A 257 AB
256 BA 258 BAA
257 AB 259 ABA



Example 1: LZW Compression Step 5

BABAABAAA P=A
C=A
ENCODER OUTPUT STRING TABLE
output code representing codeword string
66 B 256 BA
65 A 257 AB
256 BA 258 BAA
257 AB 259 ABA
65 A 260 AA



Example 1: LZW Compression Step 6

BABAABAAA P=AA
C=empty
ENCODER OUTPUT STRING TABLE
output code representing codeword string
66 B 256 BA
65 A 257 AB
256 BA 258 BAA
257 AB 259 ABA
65 A 260 AA
260 AA
LZW Decompression

• The LZW decompressor creates the same string table during decompression.
• It starts with the first 256 table entries initialized to single characters.
• The string table is updated for each character in the input stream, except the first one.
• Decoding is achieved by reading codes and translating them through the code table that is being built.


LZW Decompression Algorithm
Initialize table with single character strings
OLD = first input code
output translation of OLD
WHILE not end of input stream
    NEW = next input code
    IF NEW is not in the string table
        S = translation of OLD
        S = S + C
    ELSE
        S = translation of NEW
    output S
    C = first character of S
    add (translation of OLD) + C to the string table
    OLD = NEW
END WHILE

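The matching decoder sketch, including the special case where a code is not yet in the table (such a code can only refer to the string currently being added, i.e. the previous translation plus its own first character):

def lzw_decode(codes):
    # LZW decompression: rebuilds the string table on the fly.
    table = {i: chr(i) for i in range(256)}      # codes 0-255 = single characters
    next_code = 256
    old = codes[0]
    result = table[old]
    for new in codes[1:]:
        if new in table:
            s = table[new]
        else:                                    # code not in the table yet
            s = table[old] + table[old][0]
        result += s
        table[next_code] = table[old] + s[0]     # add translation(OLD) + first char of S
        next_code += 1
        old = new
    return result

print(lzw_decode([66, 65, 256, 257, 65, 260]))   # -> "BABAABAAA"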


Example 2: LZW Decompression

Example 2: Use LZW to decompress the output sequence of Example 1:

<66><65><256><257><65><260>



Example 2: LZW Decompression Step 1
<66><65><256><257><65><260> Old = 65 S=A
New = 65 C = A
DECODER OUTPUT STRING TABLE
string codeword string
B
A 256 BA



Example 2: LZW Decompression Step 2
<66><65><256><257><65><260> Old = 256 S = BA
New = 256 C = B

DECODER OUTPUT STRING TABLE


string codeword string
B
A 256 BA
BA 257 AB



Example 2: LZW Decompression Step 3
<66><65><256><257><65><260> Old = 257 S = AB
New = 257 C = A

DECODER OUTPUT STRING TABLE


string codeword string
B
A 256 BA
BA 257 AB
AB 258 BAA



Example 2: LZW Decompression Step 4
<66><65><256><257><65><260> Old = 65 S = A
New = 65 C = A

DECODER OUTPUT STRING TABLE


string codeword string
B
A 256 BA
BA 257 AB
AB 258 BAA
A 259 ABA



Example 2: LZW Decompression Step 5
<66><65><256><257><65><260> Old = 260 S = AA
New = 260 C = A

DECODER OUTPUT STRING TABLE


string codeword string
B
A 256 BA
BA 257 AB
AB 258 BAA
A 259 ABA
AA 260 AA
LZW: Some Notes

• This algorithm compresses repetitive sequences of data well.
• Since the codewords are 12 bits, any single encoded character will expand the data size rather than reduce it.
• In this example, 72 bits of input are represented with 72 bits of output. After a reasonable string table is built, compression improves dramatically.
• Advantages of LZW over Huffman:
  – LZW requires no prior information about the input data stream.
  – LZW can compress the input stream in one single pass.
  – Another advantage of LZW is its simplicity, allowing fast execution.


LZW: Limitations

• What happens when the dictionary gets too large (i.e., when all 4096 locations have been used)?
• Here are some options usually implemented:
  – Simply forget about adding any more entries and use the table as is.
  – Throw the dictionary away when it reaches a certain size.
  – Throw the dictionary away when it is no longer effective at compression.
  – Clear entries 256-4095 and start building the dictionary again.
  – Some clever schemes rebuild the string table from the last N input characters.
