
Training material of NHK-CTI

Error Detection and


Error Correction
Training material of NHK-CTI

2007.7.24
NHK Communications Training Institute
Hideo Tsuji tsuji@nhk-cti.jp
1
Training material of NHK-CTI

Contents Training material of NHK-CTI

Concept of error correction

Fundamentals of error correction

Various error correction codes

Applications in broadcasting field

2
Training material of NHK-CTI

Concept of error Training material of NHK-CTI

correction

3
Training material of NHK-CTI

Program production to transmission Training material of NHK-CTI

for broadcasting
Recording
Transmission
Editing

Studio

Location Reception

Field pick up

live
4
Training material of NHK-CTI

Error correction on channel Training material of NHK-CTI

Digital Data reduction Convert to suitable signal for


data transmission

Noise
Information Source Channel
(Error)
(Video/audio/ coding Modulation
coding
data etc.
Channel
(Satellite,
Terrestrial)

Received Source Channel


Demodulation
information decoding decoding

5
Training material of NHK-CTI

Example of error in digital signal Training material of NHK-CTI

Original
signal 0 1 0 0 1 0 1 0

Small Threshold
noise level
0 1 0 0 1 0 1 0

Large Threshold
noise level

0 1 0 1 1 0 1 0

Advantage of digitization
Only the distinction between 0 and 1 matters, so the signal is robust against noise → high quality
Supported by error correction technology
6
Training material of NHK-CTI

Error factor generated on the channel Training material of NHK-CTI

Random error
 Occur irregularly, scattered through the transmitted bit sequence
 Caused by thermal noise and waveform distortion

Burst error
 Concentrated locally, occurring as continuous runs of errors
 Caused by loss of synchronization, impulse noise and persistent attenuation such as fading

Quality
 Evaluated by BER (Bit Error Rate)
 e.g. 10^-3, 10^-6

7
Training material of NHK-CTI

Error control method in communication system Training material of NHK-CTI

Methods to reduce the influence of noise

Boost the transmission power
 Wastes energy
 Increases the probability of interference
Add redundancy to the transmitted data according to a rule and transmit it
Roughly two methods are used to control errors on the
channel
 ARQ (Automatic Repeat Request)
 Issue the repeat request when the transmission error is detected on
the reception side.
 FEC (Forward Error Correction)
 Correct the error on the reception side while decoding
8
Training material of NHK-CTI

ARQ & FEC Training material of NHK-CTI

ARQ (Automatic Repeat Request)


If the receiver detects an error, it requests retransmission
from the transmitter
Difficult to handle a large number of receivers
Used in packet communication, including TCP/IP
FEC: Forward Error Correction
If the receiver detects an error, it corrects it using the
redundant data
Used in broadcasting, mobile communication and storage
media (DVD/CD-ROM)

9
Training material of NHK-CTI

Sequence of ARQ & FEC Training material of NHK-CTI

ARQ

TX data-1 data-2 data-2 data-3


ACK NAK ACK

RX data-1 data-2 data-2 data-3 Time

No error Error detect No error No error

FEC
TX data-1 data-2 data-3

RX data-1 data-2 data-3 Time


Error Error Error
correction correction correction
10
Training material of NHK-CTI

Feature of ARQ & FEC Training material of NHK-CTI

              ARQ                                  FEC

Advantage     High reliability                     Return circuit is not needed
              Strong against unexpected errors     Easy to control as a system
              Simple decoder                       Small delay

Disadvantage  Return circuit is needed             Difficult to deal with a very
              Sometimes large delay                long burst error
              Necessary to repeat sending          Large redundancy
              under a bad channel                  Complex decoder
11
Training material of NHK-CTI

Error correction scheme Training material of NHK-CTI

Principle of error correction


Error correction can be performed by adding
redundancy to the transmitted information
Encoder
 Adds redundancy (parity check bits) according to some rule
Decoder
 Detects and corrects errors using the parity check bits

12
Training material of NHK-CTI

What is error detection and error Training material of NHK-CTI

correction?
Errors may occur during transmission,
making it impossible to decode the data
External Noise
Coding Decoding
A 0100 010? ?

Add redundant data before transmission to


detect/correct error when decoding
Error corrective coding Error Correction
A 0100 0100 010 010? 010 0100 A
Decoding
13
Training material of NHK-CTI

Parity check Training material of NHK-CTI

Add a parity bit to information data according to


the following rule
if the number of “1”s in the data is odd, add a parity bit “1”
if the number of “1”s in the data is even, add a parity bit “0”
Able to detect a 1-bit error

Information sequence Transmitted sequence Received sequence

10010001 10010001 1 100110011 !


00101110 00101110 0 101011000 ?
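A minimal sketch of this parity rule in Python (the helper names are illustrative, not part of the training material):

```python
# Minimal sketch of the even-parity rule above; names are illustrative.

def add_parity(bits):
    """Append a parity bit so that the total number of 1s becomes even."""
    return bits + [sum(bits) % 2]

def error_detected(word):
    """A received word with an odd number of 1s must contain an error."""
    return sum(word) % 2 == 1

tx = add_parity([1, 0, 0, 1, 0, 0, 0, 1])   # parity bit 1 is appended
rx = tx.copy()
rx[3] ^= 1                                  # a single bit error on the channel
print(error_detected(rx))                   # True: the single error is detected
```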
14
Training material of NHK-CTI

Example of parity check code (ASCII) Training material of NHK-CTI

ASCII stands for American Standard Code for


Information Interchange
Comprised of 7 bits (decimal values 0 to 127)
Add 1 parity check bit, giving 8 bits
Character “A” is expressed by (100 0001) in ASCII
for even parity (0100 0001)
for odd parity (1100 0001)
The receiver checks the number of “1”s in every 8 bits

15
Training material of NHK-CTI

ASCII code Training material of NHK-CTI

Character Binary Decimal Hexa-decimal


A 100 0001 65 41
B 100 0010 66 42
C 100 0011 67 43
D 100 0100 68 44
E 100 0101 69 45
F 100 0110 70 46

Character Binary Decimal Hexa-decimal


a 110 0001 97 61
b 110 0010 98 62
c 110 0011 99 63
d 110 0100 100 64
e 110 0101 101 65
f 110 0110 102 66
16
Training material of NHK-CTI

Example of checksum (ISBN) Training material of NHK-CTI

ISBN (International Standard Book Number)


The last digit is a check digit

Calculation
(modulo 11, with weights 10 down to 2)
 4-87966-931-?
 4x10+8x9+7x8+9x7+6x6+6x5+9x4+3x3+1x2
= 40 +72 +56 +63 +36 +30 +36 +9 +2
=344
 344 ⁄ 11 = 31 … 3
 11 - 3 = 8 → the check digit is 8
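A small sketch of this check-digit calculation (the function name is illustrative; writing a result of 10 as “X” is standard ISBN-10 practice):

```python
# Sketch of the ISBN-10 check-digit rule (modulo 11, weights 10 down to 2).

def isbn10_check_digit(first_nine_digits):
    total = sum(w * d for w, d in zip(range(10, 1, -1), first_nine_digits))
    check = (11 - total % 11) % 11
    return "X" if check == 10 else str(check)

print(isbn10_check_digit([4, 8, 7, 9, 6, 6, 9, 3, 1]))   # '8', as computed above
```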

17
Training material of NHK-CTI

Example of error detective code Training material of NHK-CTI

 Is “0-440-21861-5” a correct ISBN code?


 0x10+4x9+4x8+0x7+2x6+1x5+8x4+6x3+1x2
= 0 +36 +32 +0 +12 +5 +32 +18 +2
=137
 137 ⁄ 11 = 12 … 5
 11 - 5 = 6 → the calculated check digit is 6, not 5
 → this ISBN is wrong!

18
Training material of NHK-CTI

Example of errors (VTR) Training material of NHK-CTI

Expand a portion of image

Error
correction

Black and white spots are errors


(In case of pixel-wise coding )

19
Training material of NHK-CTI

Example of errors (broadcasting) Training material of NHK-CTI

Expand a portion of image

Error
correction

Error is observed like a rectangular mass (block)

(In case of MPEG-2 TS Video coding)

20
Training material of NHK-CTI

Example of errors Training material of NHK-CTI

Separated spatially / temporally

Original message → Transmission / Recording → Received message

“This is a pen”                                 “This is b pen”



Error
1001101011 1011101011

In digital, 0→1,1→0
21
Training material of NHK-CTI

Error detection (example of sentence) Training material of NHK-CTI

Error
Original Recognized
Received message
message message
“This is a pen” “This is a pbn” ?

No word

Error detect Can not correct

Candidate of pbn is pen, pan, pun,pin etc.


Impossible to correct, because there are several
candidates similar to pbn.
22
Training material of NHK-CTI

Error correction (example of sentence) Training material of NHK-CTI

Error
Original Recognized
Received message
message message
“This is a pen” “This is b pen” “This is a pen”

Grammar error

Error detect Correct

We can correct the sentence because we know the grammar.
23
Training material of NHK-CTI

Error in digital transmission Training material of NHK-CTI

Error
Original Recognized
Received sequence
sequence sequence
1001101011 1011101011 ??

Can not detect error Can not correct

It is unknown whether this is right or not,


because we have no rule to check it.

24
Training material of NHK-CTI

Example of creating codewords Training material of NHK-CTI

(digital transmission)

Rule: Add one bit to the original message so that
the number of “1”s becomes an even number.

Original Add
[Number of “1”] Codeword [number of “1”]
sequence bit
sequence
0101 [2(even)] 0 01010 [2(even)]
+ =
1011 [3(odd)] 1 10111 [4(even)]

25
Training material of NHK-CTI

Error detection (digital transmission) Training material of NHK-CTI

Error
Original Codeword Received Recognized
sequence sequence sequence sequence
[number of “1”]
0101 01010 01110 [3(odd)] ?
1011 10111 10101 [3(odd)] ?

even number of “1” Error detect Can not correct

26
Training material of NHK-CTI

Example of creating codewords Training material of NHK-CTI

(digital transmission)
Rule: Add the same two bits to the original
one bit

Original bit Add bits Codewords


sequence

0 + 00 000
=
1 11 111

27
Training material of NHK-CTI

Error correction (digital transmission) Training material of NHK-CTI

Error
Original Codeword Received Recognized
bit sequence sequence sequence

0 000 001 000 0


1 111 110 111 1
0 000 100 000 0
1 111 101 111 1
1 111 011 111 1
find the
Invalid
nearest
codeword
codeword
000 or 111 Error detect Correct
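This repetition code can be sketched in a few lines of Python (names are illustrative); it also reproduces the miscorrection shown later when two bits are wrong:

```python
# Sketch of the 3-bit repetition code: 0 -> 000, 1 -> 111, decode to the
# nearest codeword (majority vote). Names are illustrative.

def encode_repeat3(bit):
    return [bit] * 3

def decode_repeat3(word):
    """Choose the nearest codeword (000 or 111); corrects any single bit error."""
    return 1 if sum(word) >= 2 else 0

print(decode_repeat3([0, 0, 1]))   # 0: a single error is corrected
print(decode_repeat3([1, 0, 1]))   # 1: two errors lead to a miscorrection
```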

28
Training material of NHK-CTI

Example of error correction Training material of NHK-CTI

X11 X12 X13 | H1      1. Choose each H so that the sum of its row is even
X21 X22 X23 | H2      2. Choose each V so that the sum of its column is even
X31 X32 X33 | H3      3. Choose P so that the sum of H, V and P is even
V1  V2  V3  | P

0 1 0 1 0 1 1 1 ← Odd number

1 1 0 0 1 1 0 0
1 0 1 0 1 0 1 0
0 0 1 0 0 0 1 0

Odd number
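A sketch of this two-dimensional parity scheme in Python, assuming a 3x3 data block as in the rules above (it does not reproduce the exact numbers in the slide's figure; names are illustrative):

```python
# Sketch of horizontal/vertical parity: a single error sits where the
# odd-parity row and the odd-parity column cross. Names are illustrative.

def encode_2d(block):                                    # block: 3x3 list of 0/1
    rows = [r + [sum(r) % 2] for r in block]             # append H (row parities)
    rows.append([sum(r[j] for r in rows) % 2 for j in range(4)])  # V parities and P
    return rows

def correct_single_error(coded):                         # coded: 4x4 list of 0/1
    bad_row = [i for i, r in enumerate(coded) if sum(r) % 2]
    bad_col = [j for j in range(4) if sum(r[j] for r in coded) % 2]
    if len(bad_row) == 1 and len(bad_col) == 1:          # exactly one crossing
        coded[bad_row[0]][bad_col[0]] ^= 1               # flip the erroneous bit
    return coded

coded = encode_2d([[0, 1, 0], [1, 1, 0], [1, 0, 1]])
coded[0][2] ^= 1                                         # one channel error
print(correct_single_error(coded) == encode_2d([[0, 1, 0], [1, 1, 0], [1, 0, 1]]))  # True
```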
29
Training material of NHK-CTI

Key point Training material of NHK-CTI

For error correction or detection,

Create codeword from information bits


(original message) under some rule.

30
Training material of NHK-CTI

Is error correcting Training material of NHK-CTI

code almighty?

31
Training material of NHK-CTI

Miscorrection (example of sentence)

Error
Original Recognized
Received message
message message
“This is a pen” “This is a pbn” “This is a pun”

No word

Error detect Try to correct

Candidate of pbn is pen, pan, pun,pin etc.


Force to correct the nearest word

32
Training material of NHK-CTI

Miscorrection (digital transmission)

Original Codeword Received Recognized


bit sequence sequence sequence

0 000 2 bits 101 Miscorrection 111 1


1 111 1 bit 110 111 1
0 000 1 bit 100 Correct 000 0
1 111 1 bit 101 properly 111 1
1 111 1 bit 011 111 1
find the nearest codeword
Invalid codeword

000 or 111 Error detect


33
Training material of NHK-CTI

Undetected error (example of sentence) Training material of NHK-CTI

Error
Original Recognized
Received message
message message
“This is a pen” “This is a pun” “This is a pun”

The grammar is right

34
Training material of NHK-CTI

Undetected error (digital transmission) Training material of NHK-CTI

Error
Original Codeword Received Recognized
sequence sequence sequence sequence

[number of “1”]
0101 01010 11110 [4(even)] 1111

1011 10111 10100 [2(even)] 1010

Undetected error

even number of “1” number of “1” is even.


These are codewords.
35
Training material of NHK-CTI

Key point Training material of NHK-CTI

Number of correctable errors is finite.


If the number of errors exceeds the error
correction capability,
 Miscorrection
 Undetected error might occur

36
Training material of NHK-CTI

Fundamentals of error correction

37
Training material of NHK-CTI

Example of codewords and Hamming distance Training material of NHK-CTI

Codewords by even parity


Add Codeword
Information [Number of “1”] bit [number of “1”]
00 [0(even)] 0 000 [0(even)]
01 [1(odd)] 1 011 [2(even)]
+ =
10 [1(odd)] 1 101 [2 (even) ]
11 [2(even)] 0 110 [2 (even) ]
Hamming distance: dH
Number of different bits between the two corresponding
codewords.
Codeword 1 0 1
dH=2
Codeword 1 1 0 38
Training material of NHK-CTI

Example of codewords and Hamming distance Training material of NHK-CTI

Minimum Hamming distance


Minimum value of the Hamming distance over all pairs of codewords.

Codewords
All the codewords differ from each other in at least 2 bits.
000
011
Minimum Hamming distance =2
101
110

A one-bit error can never turn one codeword into another.

Therefore a one-bit error can always be detected.
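A tiny sketch that computes the Hamming distance and the minimum distance of the even-parity code above (names are illustrative):

```python
# Hamming distance and minimum distance of the code {000, 011, 101, 110}.
from itertools import combinations

def hamming_distance(a, b):
    return sum(x != y for x, y in zip(a, b))

codewords = ["000", "011", "101", "110"]
print(hamming_distance("101", "110"))                                      # 2
print(min(hamming_distance(a, b) for a, b in combinations(codewords, 2)))  # 2
```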


39
Training material of NHK-CTI

Geometrical allocation of codewords Training material of NHK-CTI

Z
(001) (011)
Codeword
000 011
101 110 (101) (111)

Codeword
Invalid codeword (000) (010) Y
dH:Hamming distance
(100) (110)
X dH=1
In case of a 1-bit error → a word at Hamming distance 1 = an invalid codeword
In case of a 2-bit error → a word at Hamming distance 2 = possibly another codeword
40
Training material of NHK-CTI

Image of code space Training material of NHK-CTI

Code space (finite)

In case of Hamming code (7,4):
 total number of words: 128 (= 2^7)
 codewords: 16 (= 2^4)
 invalid codewords: 112 (= 128 - 16)

Invalid codeword Codeword

Minimum Hamming distance dmin:


Smallest Hamming distance between any pairs of code vectors in the code
Determine error detecting or error correcting capability
41
Training material of NHK-CTI

Error detection and minimum Hamming distance Training material of NHK-CTI

Hamming code (7,4) minimum Hamming distance=3


0000 000 1000 101
0001 011 1001 110
0010 110 1010 011 Can not become codeword within
0011 101 1011 000 2bit errors
0100 111 1100 010
0101 100 1101 001
0110 001 1110 100
0111 010 1111 111 Possible 2bit error detection
Information, Parity

42
Training material of NHK-CTI

Error detection and minimum Hamming distance Training material of NHK-CTI

Cn=(x1,x2,・・,xN)
C3 dmin≧s+1
s: Number of error detecting
capability

Error detection area


Possible to detect s or fewer bit errors
C1 C2
dmin Codeword
Invalid codeword

43
Training material of NHK-CTI

Error correction and minimum Hamming distance Training material of NHK-CTI

Hamming code (7,4) minimum Hamming distance=3


0000 000 1000 101
0001 011 1001 110 The nearest codeword becomes
0010 110 1010 011 the codeword as an original
0011 101 1011 000 codeword in single bit error
0100 111 1100 010
0101 100 1101 001
0110 001 1110 100
0111 010 Possible single error correction
1111 111
Information, Parity

44
Training material of NHK-CTI

Error correction and minimum Hamming distance Training material of NHK-CTI

Cn=(x1,x2,・・,xN)
C3 dmin≧2t+1
t: Number of error
correcting capability
Error correction area
Possible to correct t or fewer bit errors
Codeword
C1 C2
dmin Invalid codeword

45
Training material of NHK-CTI

Various error Training material of NHK-CTI

correction codes

46
Training material of NHK-CTI

Classification of error correction code Training material of NHK-CTI

Hamming code

BCH code
Random error
correction code Golay code

Difference-set cyclic code


Block code
Burst error Reed-Solomon code
correction code Fire code

Wyner-Ash code

Random error Punctured code /


correction code Viterbi decode

Sequential decode
Convolutional code
Burst error Hangelberger code
correction code Iwatare code

Turbo code/
Soft decision decode
Concatenated code
117 Product code 47
Training material of NHK-CTI

Type of error correction code


Block code
 Coding is done independently of other blocks
 The information message is divided into blocks of constant length
 Can be treated as a system based on algebra
 When t is the number of correctable errors, a block code is often written as (n,k,t)

Convolutional code
 Information coded in the past influences the current coding
 The error correction capability is higher than that of block coding
 Viterbi decoding is used for decoding
Training material of NHK-CTI

Block code & Convolutional code


Block codeword c(n,k) n
k n-k
Key word; code rate: k/n
Information bit Check bit

Function
Convolutional coder
Key words: code rate 1/2, constraint length 7
D: 1-bit delay, +: modulo-2 addition
Xoutput

G1=171oct

Input D D D D D D

G2=133oct

113 Youtput 49
Training material of NHK-CTI

Expression of polynomial Training material of NHK-CTI

 The coefficient of the n-th power of x in the polynomial
expression shows the corresponding bit position
in the sequence

A code comprising the n-bit sequence c = (c1, c2, c3, ..., cn)
is expressed by the following polynomial:

c(x) = c1 x^(n-1) + c2 x^(n-2) + ... + c_(n-1) x + c_n

Example: c = (1,1,0,0,0,1)
c(x) = x^5 + x^4 + 1

50
Training material of NHK-CTI

Operation for code processing Training material of NHK-CTI

Operations on 0, 1, ..., x^n (n = 0, 1, 2, 3, ...)

Calculation with 0 and 1
 (Addition) 0+0=0, 0+1=1, 1+0=1, 1+1=0
   (modulo 2, i.e. exclusive OR)
 (Multiplication) 0×0=0, 0×1=0, 1×0=0, 1×1=1

Operations including x
 (Addition) 0+x=x, 1+x=1+x, x+x=0
 (Multiplication) 0×x=0, 1×x=x, x×x=x^2

51
Training material of NHK-CTI

Encoding single error detection Training material of NHK-CTI

Information bit sequence i = (i1, i2, ....., ik)

Codeword sequence c = (c1, c2, ....., ck, ck+1)   (ck+1 is the redundant bit)

Coding:            c1 = i1
                   c2 = i2
                   ...
                   ck = ik
Parity check bit:  ck+1 = c1 + c2 + ..... + ck
Parity check equation:  c1 + c2 + ..... + ck + ck+1 = 0
The parity check equation is the necessary and sufficient
condition for a sequence to be a codeword
52
Training material of NHK-CTI

Hamming code Training material of NHK-CTI

53
Training material of NHK-CTI

Hamming code (7,4) Training material of NHK-CTI

Information bit sequence i = (i1, i2, i3, i4)

Codeword sequence c = (c1, c2, c3, c4, c5, c6, c7)

Coding:             c1 = i1
                    c2 = i2
                    c3 = i3
                    c4 = i4
Parity check bits:  c5 = c1 + c2 + c3
                    c6 = c2 + c3 + c4
                    c7 = c1 + c2 + c4

54
Training material of NHK-CTI

How to find parity check bits? Training material of NHK-CTI

(The table on this slide traces the symbolic division of the codeword
(c1 c2 c3 c4 c5 c6 c7) by the generator bit sequence 1011.)

The codeword must be divisible by the generator bit sequence, therefore
 c1+c2+c3+c5 = 0,  c2+c3+c4+c6 = 0,  c1+c2+c4+c7 = 0
Therefore the parity check bits become
 c5 = c1+c2+c3,  c6 = c2+c3+c4,  c7 = c1+c2+c4
55
Training material of NHK-CTI

Parity check equation Training material of NHK-CTI

Transform the parity check equation


The necessary and sufficient condition to be
a codeword is “parity check equation =0”

Hamming (7,4) parity check equation

c1 + c2 + c3 + c5 =0
c2 + c3 + c4 + c6 = 0
c1 + c2 + c4 + c7 = 0

56
Training material of NHK-CTI

Channel model Training material of NHK-CTI

Noise source Error pattern


e(0,0,1,0,0)
Transmitted sequence Received sequence
c(1,0,1,1,1) r=c+e
c(1,0,0,1,1)

Received sequence r = c + e
= (c1 + e1, c2 + e2, ⋅ ⋅ ⋅ ⋅ ⋅ , ck + 1 + ek + 1)
= (r1, r2, ⋅ ⋅ ⋅ ⋅ ⋅ , rk + 1)

57
Training material of NHK-CTI

Syndrome of Hamming code (7,4) Training material of NHK-CTI

Syndrome s:
The syndrome is obtained by substituting the elements of the
received sequence r into the parity check equations

s1 = r1 + r2 + r3 + r5
= c1 + e1 + c2 + e2 + c3 + e3 + c5 + e5
= e1 + e2 + e3 + e5
s2 = r2 + r3 + r4 + r6
= c2 + e2 + c3 + e3 + c4 + e4 + c6 + e6
= e2 + e3 + e4 + e6
s3 = r1 + r2 + r4 + r7
= c1 + e1 + c2 + e2 + c4 + e4 + c7 + e7
= e1 + e2 + e4 + e7
58
Training material of NHK-CTI

Syndrome for single error Training material of NHK-CTI

Error pattern Syndrome

e1 e2 e3 e4 e5 e6 e7 s1 s2 s3
1 0 0 0 0 0 0 1 0 1
0 1 0 0 0 0 0 1 1 1
0 0 1 0 0 0 0 1 1 0
0 0 0 1 0 0 0 0 1 1
0 0 0 0 1 0 0 1 0 0
0 0 0 0 0 1 0 0 1 0
0 0 0 0 0 0 1 0 0 1
0 0 0 0 0 0 0 0 0 0

Error patterns are substituted into the previous page syndrome S1, S2, S3,
then obtain Syndrome pattern
59
Training material of NHK-CTI

Example of Hamming code (7,4)


Information code:  a1 a2 a3 a4           Parity code:  p1 = a1 ⊕ a2 ⊕ a3
Transmission code: a1 a2 a3 a4 p1 p2 p3                p2 = a2 ⊕ a3 ⊕ a4
                   1  0  1  1  0  0  0                 p3 = a1 ⊕ a2 ⊕ a4

Degradation through the channel (bit a3 is corrupted)

Reception code:    1  0  0  1  0  0  0

Syndrome: recompute q1 = a1 ⊕ a2 ⊕ a3, q2 = a2 ⊕ a3 ⊕ a4, q3 = a1 ⊕ a2 ⊕ a4
from the received bits and add the received parity bits:
s1 = q1 ⊕ p1, s2 = q2 ⊕ p2, s3 = q3 ⊕ p3  →  s1 s2 s3 = 1 1 0

Syndrome → error position:
 s1 s2 s3   error position
 1  0  1    a1
 1  1  1    a2
 1  1  0    a3   ← error detected here, so a3 is corrected
 0  1  1    a4
 1  0  0    p1
 0  1  0    p2
 0  0  1    p3
 0  0  0    none

The bit error position is determined from the syndrome value.
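A sketch of this Hamming (7,4) encoder and syndrome decoder in Python, using the parity equations and syndrome table of this slide (names are illustrative):

```python
# Hamming (7,4): p1=a1+a2+a3, p2=a2+a3+a4, p3=a1+a2+a4 (mod 2); the syndrome
# points at the erroneous bit. Names are illustrative.

SYNDROME_TO_POSITION = {                 # (s1, s2, s3) -> index of the bad bit
    (1, 0, 1): 0, (1, 1, 1): 1, (1, 1, 0): 2, (0, 1, 1): 3,
    (1, 0, 0): 4, (0, 1, 0): 5, (0, 0, 1): 6,
}

def encode(a):                           # a = [a1, a2, a3, a4]
    p1 = (a[0] + a[1] + a[2]) % 2
    p2 = (a[1] + a[2] + a[3]) % 2
    p3 = (a[0] + a[1] + a[3]) % 2
    return a + [p1, p2, p3]

def decode(r):                           # r = 7 received bits
    s1 = (r[0] + r[1] + r[2] + r[4]) % 2
    s2 = (r[1] + r[2] + r[3] + r[5]) % 2
    s3 = (r[0] + r[1] + r[3] + r[6]) % 2
    pos = SYNDROME_TO_POSITION.get((s1, s2, s3))
    if pos is not None:                  # non-zero syndrome: flip that bit
        r[pos] ^= 1
    return r[:4]

c = encode([1, 0, 1, 1])                 # -> [1,0,1,1,0,0,0], as on this slide
c[2] ^= 1                                # the channel corrupts a3
print(decode(c))                         # [1, 0, 1, 1]: syndrome 110 points to a3
```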
60
Training material of NHK-CTI

Example of Hamming coder Training material of NHK-CTI

C4 c4
c3 c3
c2 c2
c1 c1
c7
c6
c5

61
Training material of NHK-CTI

Example of Hamming decoder Training material of NHK-CTI

c4 c4
c3 c3
c2 c2
c1 c1

c7

c6

c5

62
Training material of NHK-CTI

Important point (CRC) Training material of NHK-CTI

As an example of such a rule: if the information
polynomial is divided by a fixed polynomial (the
generator polynomial) and the remainder is appended
to the information, the resulting codeword becomes
divisible by that generator polynomial. This is the
CRC (Cyclic Redundancy Check).

 divisible → no error
 indivisible → error detected

63
Training material of NHK-CTI

Example of CRC Training material of NHK-CTI

Principle (decimal analogy)
The information to transmit is 5,678
As a rule, the divisor 98 is decided beforehand
2 redundant digits are appended: 5,678 → 567,800
Dividing 567,800 by 98 gives remainder 86
Add (98 - 86 = 12) to 567,800
567,800 + 12 = 567,812 is divisible by 98
The transmitted information is 567,812
On the receiver side, divide the received data by 98: if there is
no remainder, there is no error; the data is correct, so discard the
2 check digits at the LSB side.
Otherwise the data is erroneous.
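The decimal analogy can be written out directly (a sketch; the divisor 98 and the helper names are only illustrative):

```python
# Sketch of the decimal CRC analogy above: append 2 check digits, divisor 98.

def append_check(info, divisor=98):
    shifted = info * 100                       # make room for 2 check digits
    return shifted + (divisor - shifted % divisor) % divisor

def error_free(received, divisor=98):
    return received % divisor == 0

tx = append_check(5678)        # 567812
print(tx, error_free(tx))      # 567812 True
print(error_free(567813))      # False: the error is detected
```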
64
Training material of NHK-CTI

Coding and decoding by generator polynomial Training material of NHK-CTI

Encoding
Encode a codeword polynomial c(x) that is divisible by the
generator polynomial g(x):

c(x) = x^(n-k) i(x) + p(x) = q(x) g(x)

Decoding
Divide the received polynomial r(x), which contains the error
polynomial e(x), by g(x):

s(x) = r(x) mod g(x) = { c(x) + e(x) } mod g(x) = e(x) mod g(x)

From the syndrome polynomial s(x), find the error pattern,
correct the received message and recover the right codeword

65
Training material of NHK-CTI

How to create CRC Training material of NHK-CTI

i(x): information polynomial for i = (i1, i2, ..., ik) (k bits)
f(x): i(x) shifted to the left by (n-k) bits to make room for the
(n-k) CRC bits, f(x) = x^(n-k) i(x)
If f(x) is divided by the generator polynomial g(x) with
quotient q(x) and remainder p(x), then
f(x) is expressed as f(x) = q(x)g(x) + p(x)
Add p(x) to both sides to obtain the codeword
polynomial c(x) = f(x) + p(x)
 Left side: f(x) + p(x) = x^(n-k) i(x) + p(x)
 Right side: q(x)g(x) + p(x) + p(x) = q(x)g(x)
 Therefore the codeword polynomial c(x) = x^(n-k) i(x) + p(x) is divisible by g(x)
 Codeword c = (i1, i2, ..., ik, p1, p2, ..., p(n-k)) (n bits)
66
Training material of NHK-CTI

Example of CRC Training material of NHK-CTI

Information code i = (1,0,1,0)

Polynomial expression of the information code
 i(x) = x^3 + x
Shift i(x) to the left by (n-k = 3) bits
 f(x) = x^3 i(x) = x^6 + x^4
Divide f(x) by the generator polynomial g(x) = x^3 + x + 1
and find the remainder p(x), the CRC polynomial:

 f(x) / g(x) = (x^6 + x^4) / (x^3 + x + 1) = x^3 + 1, remainder p(x) = x + 1

Therefore c(x) = f(x) + p(x) = x^6 + x^4 + x + 1
 Converted to the codeword c = (1,0,1,0,0,1,1)
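The same calculation can be sketched in Python, holding each polynomial as an integer whose bit i is the coefficient of x^i (names are illustrative):

```python
# Sketch of CRC encoding by polynomial division over GF(2).

def poly_mod(value, gen):
    """Remainder of value(x) divided by gen(x) over GF(2)."""
    while value.bit_length() >= gen.bit_length():
        value ^= gen << (value.bit_length() - gen.bit_length())
    return value

i = 0b1010                           # i(x) = x^3 + x
g = 0b1011                           # g(x) = x^3 + x + 1
p = poly_mod(i << 3, g)              # p(x) = x + 1  ->  0b011
codeword = (i << 3) | p              # 1010011, as in the example
print(bin(p), poly_mod(codeword, g) == 0)   # 0b11 True
```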
67
Training material of NHK-CTI

CRC generation circuit Training material of NHK-CTI

1 x x2 Xn-1 Xn

g0 g1 g2 gn-1

Input: f(x) Output: q(x)


D0 D1 Dn

gi coefficient Di shift register modulo2

Generator polynomial g(x)=g0+g1x+g2x2+ +gn-1xn-1+xn


Input:f(x), Output:q(x), Remainder:R(x) D0,D1,D2,,,,Dn

124
68
Training material of NHK-CTI

Review of polynomial Training material of NHK-CTI

The coefficient of the n-th power of x in the polynomial
expression shows the corresponding bit position in the
code sequence
Code sequence c = (c1, c2, c3, ..., cn) is expressed by the polynomial
 c(x) = c1 x^(n-1) + c2 x^(n-2) + ... + c_(n-1) x + c_n
Information sequence i = (i1, i2, i3, ..., ik) is expressed by the polynomial
 i(x) = i1 x^(k-1) + i2 x^(k-2) + ... + i_(k-1) x + i_k
Codeword
 c(x) = x^(n-k) i(x) + p(x),  where p(x) is the remainder of
 x^(n-k) i(x) divided by the generator polynomial g(x)
Training material of NHK-CTI

Hamming code Training material of NHK-CTI

A code capable of correcting a single bit error

Code length: n

i(k-1) .... i1 i0 p(m-1) .. p1 p0

Information bit length: k   Parity bit length: m

 Code length n = 2^r - 1 (bits)
 Information bit length k = n - m (bits)
 Parity bit length m = r·t (bits)
 Minimum distance dmin = 2t + 1 = 3 (bits) (t = 1)

r: integer of three or more
t: number of correctable bit errors
70
Training material of NHK-CTI

Hamming code (7,4) Training material of NHK-CTI

「Hamming code (7,4)」


Code length n = 7,information bit length k = 4,parity bit
length m = 3
Generator polynomial: X3+X+1
Information code Codeword
i3 i2 i1 i0 i3 i2 i1 i0 p2 p1 p0

Divide (i3 i2 i1 i0 0 0 0) by 1011, the bit expression of the
generator polynomial; the remainder is p = (p2 p1 p0)


71
Training material of NHK-CTI

How to find parity check bit Training material of NHK-CTI

The information bit sequence (i3 i2 i1 i0) is shifted to the left by
three bits, giving (i3 i2 i1 i0 0 0 0), then divided by the
generator bit sequence to find the remainder.

Example with information bits 1010:

         1 0 0 1          ← quotient
  1011 ) 1 0 1 0 0 0 0
         1 0 1 1
             1 0 0 0
             1 0 1 1
                 0 1 1    ← remainder = parity check bits

The codeword is 1010011.
72
Training material of NHK-CTI

Hamming codeword (7,4) Training material of NHK-CTI

Information bit Codeword


i3 i2 i1 i0 i3 i2 i1 i0 p2 p1 p0
0 0 0 0 0 0 0 0 0 0 0
0 0 0 1 0 0 0 1 0 1 1
0 0 1 0 0 0 1 0 1 1 0
0 0 1 1 0 0 1 1 1 0 1
0 1 0 0 0 1 0 0 1 1 1
0 1 0 1 0 1 0 1 1 0 0
0 1 1 0 0 1 1 0 0 0 1
0 1 1 1 0 1 1 1 0 1 0
1 0 0 0 1 0 0 0 1 0 1
1 0 0 1 1 0 0 1 1 1 0
1 0 1 0 1 0 1 0 0 1 1
1 0 1 1 1 0 1 1 0 0 0
1 1 0 0 1 1 0 0 0 1 0
1 1 0 1 1 1 0 1 0 0 1
1 1 1 0 1 1 1 0 1 0 0
1 1 1 1 1 1 1 1 1 1 1

73
Training material of NHK-CTI

Error example in Hamming code (7,4) Training material of NHK-CTI

Information bit sequence (1010) → codeword sequence (1010011)

If an error occurs in the first bit:
 Codeword (1010011) → (0010011)
 Dividing by the generator (1011) gives remainder (101)

If an error occurs in the second bit:
 Codeword (1010011) → (1110011)
 Dividing by the generator (1011) gives remainder (111)
74
Training material of NHK-CTI

Single bit error in Hamming code (7,4) Training material of NHK-CTI

Codeword Remainder divided by g(x) when one bit error occurs


i3 i2 i1 i0 p2 p1 p0 i3 error i2 error i1error i0error p2 error p1 error p0 error
0 0 0 0 0 0 0 101 111 110 011 100 010 001
0 0 0 1 0 1 1 101 111 110 011 100 010 001
0 0 1 0 1 1 0 101 111 110 011 100 010 001
0 0 1 1 1 0 1 101 111 110 011 100 010 001
0 1 0 0 1 1 1 101 111 110 011 100 010 001
0 1 0 1 1 0 0 101 111 110 011 100 010 001
0 1 1 0 0 0 1 101 111 110 011 100 010 001
0 1 1 1 0 1 0 101 111 110 011 100 010 001
1 0 0 0 1 0 1 101 111 110 011 100 010 001
1 0 0 1 1 1 0 101 111 110 011 100 010 001
1 0 1 0 0 1 1 101 111 110 011 100 010 001
1 0 1 1 0 0 0 101 111 110 011 100 010 001
1 1 0 0 0 1 0 101 111 110 011 100 010 001
1 1 0 1 0 0 1 101 111 110 011 100 010 001
1 1 1 0 1 0 0 101 111 110 011 100 010 001
1 1 1 1 1 1 1 101 111 110 011 100 010 001

75
Training material of NHK-CTI

Error correction by Hamming code Training material of NHK-CTI

 Divide received word by generator polynomial

 if divisible → No error
 if indivisible → Error detected

An erroneous bit position is judged by the form of the


remainder
The error is corrected by reversing (0→1,1→0) the bit of
the erroneous bit position

A single error can be corrected

Two or more bit errors
→ miscorrection or undetected error
76
Training material of NHK-CTI

Example of Hamming code Training material of NHK-CTI

Single error correction code (dmin=3)

Hamming (7,4)
Generator bit: 「1011」 x3+x+1 dmin=3

Hamming (15,11) Generator


Generator bit: 「10011」 x4+x+1 polynomial

Hamming (31,26)
Generator bit: 「100101」 x5+ x2 +1

77
Training material of NHK-CTI

Eb/No vs BER (Hamming code) Training material of NHK-CTI

[Figure: bit error rate versus Eb/No for no coding, Hamming (63,57) and
Hamming (7,4); the coding gains at a bit error rate of 10^-5 are indicated
(0.4 dB and 2.4 dB).]
78
Training material of NHK-CTI

BER before vs after error correction Training material of NHK-CTI

(Hamming code)
[Figure: bit error rate after error correction versus bit error rate before
error correction (10^-3 to 10^-1), for Hamming (63,57) and Hamming (7,4).]

79
Training material of NHK-CTI

BCH code
Training material of NHK-CTI

(Bose-Chaudhuri-Hocquenghem)

80
Training material of NHK-CTI

BCH code Training material of NHK-CTI

A code capable of correcting two or more bit errors
The degree of freedom in encoding is high

Code length: n

i(k-1) .... i1 i0 p(m-1) .. p1 p0

Information bit length: k   Parity bit length: m

Code length n ≦ 2^r - 1 (bits)
Information bit length k = n - m (bits)
Parity bit length m ≦ r·t (bits)
Minimum distance dmin ≧ 2t + 1 (bits)
r: integer of three or more
t: number of correctable bit errors
81
Training material of NHK-CTI

BCH code (15,7) Training material of NHK-CTI

「BCH(15, 7)codeword」 (r=4,t=2)


Code length: n = 15, Information bit length: k = 7, Parity bit
length: m = 8
Generator polynomial:g(x)=g1(x)g3(x)=x8+x7+x6+x4+1
Information bits
i6 i5 i4 i3 i2 i1 i0

Codeword
i6 i5 i4 i3 i2 i1 i0 p7 p6 p5 p4 p3 p2 p1 p0

Divide (i6 i5 i4 i3 i2 i1 i0 0 0 0 0 0 0 0 0) by 111010001, the bit
expression of g(x); the remainder is R = (p7 p6 p5 p4 p3 p2 p1 p0)
82
Training material of NHK-CTI

BCH code (15,7) Training material of NHK-CTI

Bit expression of Generator polynomial:111010001


Generator polynomial: g(x)=x8+x7+x6+x4+1
=(x4+x+1)(x4+x3+x2+x+1)
Where let g1(x), g3(x) be as follows;
g1(x)= x4+x+1
g3(x)=x4+x3+x2+x+1
g(x) becomes as follows;
g(x)=g1(x)g3(x)
「If codeword polynomial is divisible by generator polynomial
g(x) 」 means
「it is also divisible by polynomial g1(x) or g3(x) 」

83
Training material of NHK-CTI

Single bit error in BCH code (15,7) Training material of NHK-CTI

i6 i5 i4 i3 i2 i1 i0 p7 p6 p5 p4 p3 p2 p1 p0

Position of Remainder of Remainder of


error g1(x) g3(x)
Bit error position i6 1001 1111
(single bit error) i5 1101 1000
i4 1111 0100
i3 1110 0010
g1(x)=x4+x+1 i2 0111 0001
i1 1010 1111
「10011」 i0 0101 1000
p7 1011 0100
g3(x)=x4+x3+x2+x+1 p6 1100 0010
p5 0110 0001
「11111」 p4 0011 1111
p3 1000 1000
Remainder divided by p2 0100 0100
p1 0010 0010
g1(x), g3(x) p0 0001 0001 84
Training material of NHK-CTI

2 bits error in BCH code (15,7) Training material of NHK-CTI

i6 i5 i4 i3 i2 i1 i0 p7 p6 p5 p4 p3 p2 p1 p0
Position R of g1(x) R of g3(x) Position R of g1(x) R of g3(x) Position R of g1(x) R of g3(x)
i6, i5 0100 0111 i 4 , p3 0111 1100 i 0 , p6 1001 1010
i6, i4 0110 1011 i 4 , p3 1011 0000 i 0 , p5 0011 1001
i6, i3 0111 1101 i 4 , p1 1101 0110 i 0 , p4 0110 0111
i6, i2 1110 1110 i 4 , p0 1110 0101 i 0 , p3 1101 0000
i6, i1 0011 0000 i3, i2 1001 1011 i 0 , p2 0001 1100
i6, i0 1100 0111 i3, i1 0100 1101 i 0 , p1 0111 1010
i 6 , p7 0010 1011 i3, i0 1011 1010 i 0 , p0 0100 1001
i 6 , p6 0101 1100 i 3 , p7 0101 0101 p7, p6 0111 0110
i 6 , p5 1111 1110 i 3 , p6 0010 0000 p7, p5 1101 0101
i 6 , p4 1010 0000 i 3 , p5 1000 0011 p7, p4 1000 1011
i 6 , p3 0001 0111 i 3 , p4 1101 1101 p7, p3 0011 1100
i 6 , p2 1101 1011 i 3 , p3 0110 1010 p7, p2 1111 0000
i 6 , p1 1011 1101 i 3 , p2 1010 0110 p7, p1 1001 0110
i 6 , p0 1000 1110 i 3 , p1 1100 0000 p7, p0 1010 0101
i5, i4 0010 1100 i 3 , p0 1111 0011 p6, p5 1010 0011
i5, i3 0011 1010 i2, i1 1101 1110 p6, p4 1111 1101
i5, i2 1010 1001 i2, i0 0010 1001 p6, p3 0100 1010
i5, i1 0111 0111 i 2 , p7 1100 0101 p6, p2 1000 0110
i5, i0 1000 0000 i 2 , p6 1011 0011 p6 , p 1 1110 0000
i 5 , p7 0110 1100 i 2 , p5 0001 0000 p6, p0 1101 0011
i 5 , p6 0001 1010 i 2 , p4 0100 1110 p5, p4 0101 1110
i 5 , p5 1011 1001 i 2 , p3 1111 1001 p5, p3 1110 1001
i 5 , p4 1110 0111 i 2 , p2 0011 0001 p5, p2 0010 0101
i 5 , p3 0101 0000 i 2 , p1 0101 0101 p5, p1 0100 0011
i 5 , p2 1001 1100 i 2 , p0 0110 0000 p5, p0 0111 0000
i 5 , p1 1111 1010 i1, i0 1111 0111 p4, p3 1011 0111
i 5 , p0 1100 1001 i 1 , p7 0001 1011 p4, p2 0111 1011
i4, i3 0001 0110 i 1 , p6 0110 1101 p4, p1 0001 1101
i4, i2 1000 0101 i 1 , p5 1100 1110 p4, p0 0010 1110
i4, i1 0101 1011 i 1 , p4 1001 0000 p3, p2 1100 1100
i4, i0 1010 1100 i 1 , p3 0010 0111 p3, p1 1010 1010
i 4 , p7 0100 0000 i 1 , p2 1110 1011 p3, p0 1001 1001
i 4 , p6 0011 0110 i 1 , p1 1000 1101 p2, p1 0110 0110
i 4 , p5 1001 0101 i 1 , p0 1011 1110 p2, p0 0101 0101
i 4 , p4 1100 1011 i 0 , p7 1110 1100 p1, p0 0011 0011 85
Training material of NHK-CTI

Remainder and bit error position in BCH (15,7) Training material of NHK-CTI

i6 i5 i4 i3 i2 i1 i0 p7 p6 p5 p4 p3 p2 p1 p0
Remainder of g3(x)

0000 1000 0100 1100 0010 1010 0110 1110 0001 1001 0101 1101 0011 1011 0111 1111

0000
1000 i5,i0 p3 p6,p2 i6,p0 i4,i2 i1,p1 i3,p5 p7,p4
0100 i4,p7 p2 p6,p3 i2,p4 i0,p0 i3,i1 p5,p1 i6,i5
1100 i3,p1 p3,p2 p6 i1,p5 i5,p0 i2,p7 i4,p4 i6,i0
Remai 0010 i3,p6 i5,i4 p1 p4,p0 i2,i0 p5,p2 i6,p7 i1,p3
nder of 1010 i6,p4 i4,i0 p3,p1 i3,p2 i5,i2 p7,p0 p6,p5 i0,p4 i1
g1(x) 0110 i2,p0 i5,p7 i3,p3 p2,p1 p5 i1,p6 i6,i4 i5,p4
1110 p6,p1 i0,p7 i3 i6,i2 p5,p3 i4,p0 i1,p2 i6,p3
0001 i2,p5 i0,p2 i5,p6 i4,i3 p0 p4,p1 i1,p7
1001 i1,p4 i5,p2 i0,p6 p7,p1 p3,p0 i4,p5 i3,i2 i6
0101 i5,p3 i0 i3,p7 p5,p4 p2,p0 i6,p6 i2,p1 i4,i1
1101 i0,p3 i5 i4,p1 p2,i1 p7,p5 i3,p4 p6,p0 i6,p2
0011 i6,i1 p7,p3 i5,i3 i4,p6 i0,p5 i2,p2 p1,p0 p4
1011 i4,p2 p7 i3,i0 i1,p0 i5,p5 i6,p1 i2,p6 p4,p3
0111 p5,p0 i4,p3 i0,p1 p7,p6 i2 i6,i3 p4,p2 i5,i1
1111 p7,p2 i4 i5,p1 i6,p5 i2,p3 p6,p4 i3,p0 i1,i0

Ex. single bit error: i3 ←→ remainder of g1(x) :1110, remainder of g3(x): 0010
Ex. 2 bits error :i5, i3← → remainder of g1(x) :0011, remainder of g3(x): 1010
86
Training material of NHK-CTI

Example of BCH code Training material of NHK-CTI

t-bit error correction capability (dmin ≧ 2t+1)

BCH (7,4): t=1, dmin=3
 Generator bits: 「1011」
 g(x) = x^3+x+1
BCH (15,7): t=2, dmin=5
 Generator bits: 「111010001」
 g(x) = x^8+x^7+x^6+x^4+1
BCH (31,16): t=3, dmin=7
 Generator bits: 「1000111110101111」
 g(x) = x^15+x^11+x^10+x^9+x^8+x^7+x^5+x^3+x^2+x+1
87
Training material of NHK-CTI

Reed-Solomon code

88
Training material of NHK-CTI

Reed-Solomon code

A symbol error correction code (a symbol is in general one byte)
The degree of freedom in encoding is high

Code length: n

i(k-1) .... i1 i0 p(m-1) .. p1 p0

Information symbol length: k   Parity symbol length: m

Code length n ≦ 2^r - 1 (symbols)
Information symbol length k = n - m (symbols)
Parity symbol length m = 2t (symbols)
Minimum distance dmin ≧ 2t + 1 (symbols)
r: number of bits comprising one symbol
t: number of symbols that can be corrected
Training material of NHK-CTI

Feature of roots of generator polynomial Training material of NHK-CTI

Let α be a primitive element of GF(2^3), constructed from the
primitive polynomial g(x) = x^3 + x + 1.
α satisfies α^3 + α + 1 = 0, i.e. α^3 = 1 + α

Field of 8 elements generated by x^3 + x + 1:

 Power form   Polynomial form   3-tuple form   Symbol
 0            0                 000            0
 α^0 = 1      1                 100            4
 α^1          α                 010            2
 α^2          α^2               001            1
 α^3          1 + α             110            6
 α^4          α + α^2           011            3
 α^5          1 + α + α^2       111            7
 α^6          1 + α^2           101            5
 α^7 = 1      1                 100            4

Derivation of the higher powers:
 α^3 = 1 + α
 α^4 = α(1 + α) = α + α^2
 α^5 = α(α + α^2) = α^2 + α^3 = 1 + α + α^2
 α^6 = α(1 + α + α^2) = α + α^2 + α^3 = α + α^2 + (1 + α) = 1 + α^2
 α^7 = α(1 + α^2) = α + α^3 = α + (1 + α) = 1
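A short sketch that rebuilds this table, representing each field element as a 3-bit integer whose bit i is the coefficient of α^i (so the slide's 3-tuple is read right to left; names are illustrative):

```python
# Build GF(2^3) from x^3 + x + 1 (alpha^3 = alpha + 1) and multiply elements.

def gf8_powers():
    table, elem = [], 1                       # alpha^0 = 1
    for _ in range(7):
        table.append(elem)
        elem <<= 1                            # multiply by alpha
        if elem & 0b1000:                     # reduce with alpha^3 = alpha + 1
            elem ^= 0b1011
    return table

POW = gf8_powers()                            # [1, 2, 4, 3, 6, 7, 5]
LOG = {v: i for i, v in enumerate(POW)}       # element -> exponent

def gf_mul(a, b):                             # multiply via exponent addition
    return 0 if 0 in (a, b) else POW[(LOG[a] + LOG[b]) % 7]

# Print in the slide's 3-tuple order (coefficients of 1, alpha, alpha^2):
print([format(v, "03b")[::-1] for v in POW])  # ['100','010','001','110','011','111','101']
print(gf_mul(0b101, 0b010))                   # alpha^6 * alpha = alpha^7 = 1
```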
90
Training material of NHK-CTI

Addition of roots for generator polynomial Training material of NHK-CTI

Vector expression 000 100 010 001 110 011 111 101
2 3 4 5 6
3-Tuple Power 0 1 α α α α α α
000 0 0 1 α α2 α3 α4 α5 α6
3 6 5 4 2
100 1 1 0 α α α α α α
010 α α α3 0 α4 1 α2 α6 α5
001 α2 α2 α6 α4 0 α5 α α3 1
3 3 5 6 2 4
110 α α α 1 α 0 α α α
011 α4 α4 α5 α2 α α6 0 1 α3
5 5 4 6 3 2
111 α α α α α α 1 0 α
101 α6 α6 α2 α5 1 α4 α3 α 0

Vector expression 000 100 010 001 110 011 111 101
3-Tuple Power 0 1 α α2 α3 α4 α5 α6
000 0 000 100 010 001 110 011 111 101
100 1 100 000 110 101 010 111 011 001
010 α 010 110 000 011 100 001 101 111
2
001 α 001 101 011 000 111 010 110 100
110 α3 110 010 100 111 000 101 001 011
4
011 α 011 111 001 010 101 000 100 110
5
111 α 111 011 101 110 001 100 000 010
6
101 α 101 001 111 100 011 110 010 000
91
Training material of NHK-CTI

Multiplication of roots for generator polynomial Training material of NHK-CTI

Vector expression 000 100 010 001 110 011 111 101
3-Tuple Power 0 1 α α2 α3 α4 α5 α6
000 0 0 0 0 0 0 0 0 0
2 3 4 5 6
100 1 0 1 α α α α α α
010 α 0 α α2 α3 α4 α5 α6 1
2 2 3 4 5 6
001 α 0 α α α α α 1 α
3 3 4 5 6 2
110 α 0 α α α α 1 α α
011 α4 0 α4 α5 α6 1 α α2 α3
5 5 6 2 3 4
111 α 0 α α 1 α α α α
6 6 2 3 4 5
101 α 0 α 1 α α α α α
Vector expression 000 100 010 001 110 011 111 101
2 3 4 5 6
3-Tuple Power 0 1 α α α α α α
000 0 000 000 000 000 000 000 000 000
100 1 000 100 010 001 110 011 111 101
010 α 000 010 001 110 011 111 101 100
001 α2 000 001 110 011 111 101 100 010
110 α3 000 110 011 111 101 100 010 001
4
011 α 000 011 111 101 100 010 001 110
111 α5 000 111 101 100 010 001 110 011
101 α6 000 101 100 010 001 110 011 111
92
Training material of NHK-CTI

RS code (5,3) Training material of NHK-CTI

「RS code (5,3)」 (r=3, 1 symbol = 3 bits)

Code length: n=5, information symbol length: k=3, parity
symbol length: m=2
Information symbols: A B C    Codeword: A B C P Q

α: a root of g(x) = x^3 + x + 1 = 0

Parity check equations:
 A + B + C + P + Q = 0
 αA + α^2 B + α^3 C + α^4 P + α^5 Q = 0
From the above equations, P and Q are as follows:
 P = α^6 A + α^3 B + α^2 C
 Q = α^2 A + α B + α^6 C
93
Training material of NHK-CTI

Characteristics of error in RS code (5,3) Training material of NHK-CTI

Set syndrome S as follows;


S1=A+B+C+P+Q
S2=αA+α2B+α3C+α4P+α5Q Codeword
A B C P Q
No error
S1=0 and S2=0

One symbol error


If symbol A is error, and it becomes A+A’
S1=(A+A’)+B+C+P+Q=A’
S2=α(A+A’)+α2B+α3C+α4P+α5Q=αA’
∴S2=αS1

If symbol B is error, and it becomes B+B’


S1=A+(B+B’)+C+P+Q=B’
S2=αA+α2(B+B’)+α3C+α4P+α5Q= α2B’
∴S2=α2S1 94
Training material of NHK-CTI

Characteristics of error in RS code (5,3) Training material of NHK-CTI

One symbol error


If symbol C is error, and it becomes C+C’
S1=A+B+(C+C’)+P+Q=C’
S2=αA+α2B+α3(C+C’)+α4P+α5Q=α3C’
∴S2=α3S1

Symbol A → S2=αS1
Symbol B → S2=α2S1
Symbol C → S2=α3S1

Detection of error location → from above equations


and S1is error pattern
Error correction → Add S1 to the error detected symbol

Possible to correct one symbol


95
Training material of NHK-CTI

Example of RS code (5,3) encoding Training material of NHK-CTI

Let A, B, C be as follows (1 symbol is 3 bits):
A = 100, B = 010, C = 001, i.e. A = 1, B = α, C = α^2

P = α^6 A + α^3 B + α^2 C          Q = α^2 A + α B + α^6 C
  = α^6·1 + α^3·α + α^2·α^2          = α^2·1 + α·α + α^6·α^2
  = α^6 + α^4 + α^4                  = α^2 + α^2 + α^8
  = α^6 = 101                        = α^8 = α = 010

 A   B   C   P   Q
100 010 001 101 010
96
Training material of NHK-CTI

Error correction example of RS code (5,3) Training material of NHK-CTI

Suppose symbol A is in error:
A = 100 → 011 (3 bits within the symbol are in error), i.e. received A = α^4

S1 = A + B + C + P + Q
   = α^4 + α + α^2 + α^6 + α
   = α^2 + α^4 + α^6 = α + α^6
   = α^5 = 111

S2 = αA + α^2 B + α^3 C + α^4 P + α^5 Q
   = α·α^4 + α^2·α + α^3·α^2 + α^4·α^6 + α^5·α
   = α^5 + α^3 + α^5 + α^10 + α^6
   = α^3 + α^3 + α^6 = α^6 = αS1

S2 = αS1, so the error location is symbol A
A = 011 + S1 = 011 + 111 = 100
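A sketch of the whole RS (5,3) encode/correct cycle in Python. Each symbol is a 3-bit integer whose most significant bit is the coefficient of 1, matching the slides' 3-tuple notation (e.g. α^3 = 110); only single-symbol errors are handled, and the names are illustrative:

```python
# RS (5,3) over GF(2^3): P = a^6 A + a^3 B + a^2 C, Q = a^2 A + a B + a^6 C,
# syndromes S1, S2; a single symbol error at position i gives S2 = a^i * S1.

POW = [0b100, 0b010, 0b001, 0b110, 0b011, 0b111, 0b101]   # alpha^0 .. alpha^6
LOG = {v: i for i, v in enumerate(POW)}

def mul(a, b):
    return 0 if 0 in (a, b) else POW[(LOG[a] + LOG[b]) % 7]

def encode(a, b, c):
    p = mul(POW[6], a) ^ mul(POW[3], b) ^ mul(POW[2], c)
    q = mul(POW[2], a) ^ mul(POW[1], b) ^ mul(POW[6], c)
    return [a, b, c, p, q]

def correct(word):                         # corrects at most one symbol error
    s1 = word[0] ^ word[1] ^ word[2] ^ word[3] ^ word[4]
    s2 = 0
    for i, sym in enumerate(word, start=1):
        s2 ^= mul(POW[i % 7], sym)
    if s1:                                 # S2 = alpha^i * S1 locates the symbol
        pos = (LOG[s2] - LOG[s1]) % 7 - 1
        word[pos] ^= s1                    # add the error pattern S1
    return word

tx = encode(0b100, 0b010, 0b001)           # A B C P Q = 100 010 001 101 010
rx = list(tx)
rx[0] ^= 0b111                             # 3 bit errors inside symbol A
print(correct(rx) == tx)                   # True: one symbol error corrected
```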

97
Training material of NHK-CTI

Examples of RS code Training material of NHK-CTI

 t symbol error correction code (dmin=2t+1)

Suitable for burst error correction

RS(32,30) code, etc. :t=1


RS(64,60) code, etc. :t=2
RS(128,120) code, etc. :t=4
RS(204,188) code, etc. :t=8
RS(207,187) code, etc. :t=10

98
Training material of NHK-CTI

Convolutional codeTraining material of NHK-CTI

99
Training material of NHK-CTI

Trellis diagram of convolutional code Training material of NHK-CTI

:1bit delay 10
+ c1 + :Adder 11/0 01/0
11010
a1 a2 a3 00/0 00 10/0 00/1 11 10/1
Input Output c1c2
a1 a2 a3
+ c2 c1=a1+a2 +a3 11/1 01/1
01
c2=a1 +a3
State (a3 a2) c1c2/a1
State A:(0 0) 00/0
00/1
11/0
State B:(0 1) 11/1
10/0
State C:(1 0) 10/1
01/0
State D:(1 1) 01/1
133 100
Training material of NHK-CTI

Principle of Viterbi decoding

Whenever railway lines join, compare the candidate routes

Keep the route whose fare is cheaper
Repeating this comparison at every junction finally
yields the cheapest route overall
130/440 180/620
Kasumigaseki 160 Shinjyuku 150/310 Simokitazawa
Uehara Seijyou
Seijyo
150/310
160/470
Harajyuku
160 120/430
160/320 210/520
Omotesandou
150/310

160/320 Shibuya

101
Training material of NHK-CTI

Error correction by Viterbi decoding Training material of NHK-CTI

Information message 0 1 0 1 1 0 0 0
Transmitted message 00 11 10 00 01 01 11 00
Received message 10 11 11 00 01 01 11 00
1 (1) 2 (3) (5)
State A:(0 0)
1 (1) (2)
0 (1)
(3)
State B:(0 1)
1 (2) (4)
(2)
State C:(1 0)
(3)
1 (2) (2)
State D:(1 1) (3)
1 (2) c1c2/a1
00/0 10/0
00/1 10/1
Branch metric Path metric 11/0 01/0
11/1 01/1
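The complete hard-decision Viterbi decoder for this code can be sketched compactly in Python (state = the last two input bits; names are illustrative). It reproduces the decoding traced step by step on the later slides:

```python
# Sketch of the rate-1/2 convolutional code (c1 = a1+a2+a3, c2 = a1+a3 mod 2)
# and hard-decision Viterbi decoding with Hamming branch metrics.

def conv_encode(bits):
    a2 = a3 = 0
    out = []
    for a1 in bits:
        out.append(((a1 + a2 + a3) % 2, (a1 + a3) % 2))
        a2, a3 = a1, a2
    return out

def viterbi_decode(pairs):
    states = [(0, 0), (0, 1), (1, 0), (1, 1)]            # (a2, a3)
    metric = {s: 0 if s == (0, 0) else float("inf") for s in states}
    path = {s: [] for s in states}
    for r in pairs:
        new_metric, new_path = {}, {}
        for a2, a3 in states:
            for a1 in (0, 1):                            # hypothesised input bit
                c = ((a1 + a2 + a3) % 2, (a1 + a3) % 2)
                m = metric[(a2, a3)] + (c[0] != r[0]) + (c[1] != r[1])
                nxt = (a1, a2)
                if nxt not in new_metric or m < new_metric[nxt]:
                    new_metric[nxt], new_path[nxt] = m, path[(a2, a3)] + [a1]
        metric, path = new_metric, new_path
    return path[min(metric, key=metric.get)]             # lowest final path metric

tx = conv_encode([0, 1, 0, 1, 1, 0, 0, 0])   # 00 11 10 00 01 01 11 00
rx = list(tx)
rx[0] = (1, 0)                               # one channel error in the first pair
print(viterbi_decode(rx))                    # [0, 1, 0, 1, 1, 0, 0, 0]
```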
145 102
Training material of NHK-CTI

Key point Training material of NHK-CTI

Block code
Hamming code
Single error correction
BCH code
t bit error correction
Reed-Solomon code
t symbol error correction

Convolutional coding + Viterbi decoding

103
Training material of NHK-CTI

Application in Training material of NHK-CTI

broadcasting field

104
Training material of NHK-CTI

Applications of error correcting code Training material of NHK-CTI

Usage            Device                       Type of error      Feature                 Example of code

Transmission     Digital broadcasting         Random & burst     Low error rate          Convolutional + RS
                 receiver

Computer         Semiconductor memory         Random             High speed &            Hamming
                                                                 low error rate

Data record      Magnetic / optical memory    (Random) burst     Low error rate          CRC,
                 (HDD, tape, CD-ROM)                                                     RS product

Video / audio    Magnetic / optical memory    (Random) burst     Relatively higher       RS product
record           (VTR, CD, DVD)                                  error rate than data

RS: Reed-Solomon

105
Training material of NHK-CTI

Applications for broadcasting VTR Training material of NHK-CTI

VTR    Video system    Width of tape (inch)    Inner code    Outer code

D1 SD Component 3/4 RS (64,60) RS (32,30)

D2 SD Composite 3/4 RS (93,85) RS (68,64)

D3 SD Composite 1/2 RS (95,87) RS (136,128)

D5 SD,HD Component 1/2 RS (95,87) RS (128,120)

β CAM SD Component 1/2 RS (178,164) RS (106,96)

D6 HD Component 3/4 RS (227,211) RS (254,240)

HD1 HD Component 1 RS (110,104) RS (65,60)

106
Training material of NHK-CTI

Applications for broadcasting Training material of NHK-CTI

Broadcasting                                   Inner code                       Outer code

Digital terrestrial television broadcasting    Convolutional code               RS (204,188)
(DVB-T, ISDB-T)                                (r=1/2, 2/3, 3/4, 5/6, 7/8)

ATSC                                           Trellis code (r=2/3)             RS (207,187)

Digital satellite television broadcasting      Trellis code (r=2/3) /           RS (204,188)
(ISDB-S)                                       Convolutional code
                                               (r=1/2, 2/3, 3/4, 5/6, 7/8)

Digital communication satellite                Convolutional code               RS (204,188)
television broadcasting                        (r=1/2, 2/3, 3/4, 5/6, 7/8)

107
Training material of NHK-CTI

Application of error correction technique Training material of NHK-CTI

Magnetic tape (RS code)


Magnetic disk (RS code)
Compact disk (RS code)
Digital audio tape (RS code)
Mobile phone (RS & Convolutional coding)
Satellite communication (RS & Convolutional
coding)
Satellite broadcasting (RS & Convolutional coding)
Teletext broadcasting (Cyclic code)
Digital broadcasting (RS & Convolutional coding)
Intercontinental optical communications (RS code)

108
Training material of NHK-CTI

Finally Training material of NHK-CTI

Error correction is indispensable for digital technology.

The importance of error correction technology is increasing
further because of the high compression used in source coding.

109
Training material of NHK-CTI

Informative reference
Training material of NHK-CTI

110
Training material of NHK-CTI

Reed-Solomon generator

Connect for first (k) Bytes


Open for last (N-k) Bytes

Gate

α=174 α=165 α=121 α=121 α=181 α=152

X1 X2 X3 X19 X20

N-k=20 Parity Bytes


Mod(256) add two field elements (Bytes)
Mod(256) multiply a field element with a fixed element α B
N=207 Encoded Data
X Source one element (Byte)
k=187 Data Bytes A
Primitive Field Generator Polynomial (Galois):P(X) Connect A for first (k) Bytes
P(X)=X8+X4+X3+X2+1 Open B for last (N-k) Bytes
Code Generator Polynomial: G(X)
G(X)=X20+152X19+185X18+240X17+5X16+111X15+99X14+6X13+220X12+112X11+150X10+69X9+36X8+187X7+22X6
+228X5+198X4+121X3+121X2+165X1+174

111
Training material of NHK-CTI

Punctured code Training material of NHK-CTI

Convolutional X Convolutional code r=1/1


B0 coder
P1
Punctured
P0
Y
Improvement of coding rate
r=6/8=3/4
P1=X1,Y2,Y3
X P0=Y1,X3,Y4
171octal
+ X:X1,X2,X3,X4
Y:Y1,Y2,Y3,Y4
B0 ● D ● D ● D ● D D ● D ●
r=4/6=2/3
P1=X1,Y2
+ P0=Y1,X3
133octal
Y X:X1,X2,X3
D : 1bit delay + :Exclusive OR Y:Y1,Y2,Y3
112
Training material of NHK-CTI

Linear code and cyclic code Training material of NHK-CTI

(Venn diagram: within block codes there are linear codes and cyclic codes;
non-linear and non-cyclic codes lie outside those sets.)

Linear code
If C1 and C2 are two arbitrary codewords, their sum (C1+C2) is also a codeword

Cyclic code
If a message (n bits in length) is expressed as a polynomial, a cyclic code
consists of all codewords (code polynomials) divisible by the generator
polynomial G(X)

If Cn = (x1, x2, x3, ..., xN) is a codeword, the cyclically shifted word
Cm = (x2, x3, ..., xN, x1) is also a codeword

113
Training material of NHK-CTI

Example Training material of NHK-CTI

Linear code
Hamming code (7,4) 0001 011 (C1)
0000 000 1000 101
+ 0010 110 (C2)
0001 011 1001 110
0010 110 1010 011
0011 101 (C1+C2=C3)
0011 101 1011 000
0100 111
0101 100
1100 010
1101 001
Cyclic code
0110 001 1110 100
0111 010 1111 111 0111 010 (C7)
1bit shift to left
1110 100 (C7→C14)

114
Training material of NHK-CTI

Coding rate Training material of NHK-CTI

Coding rate = (number of information bits k) / (number of codeword bits n)

Block code (n,k): coding rate r = k/n

Example: a (6,4) code such as 1011 01 has coding rate 4/6 = 2/3

[Figure: error correction capability versus coding rate; a lower coding rate
(more redundancy) gives a higher error correction capability.]

115
Training material of NHK-CTI

Appendix Training material of NHK-CTI

Concatenated code

116
Training material of NHK-CTI

Example of concatenated code Training material of NHK-CTI

MPEG-2 source coding & multiplexing, followed by the channel coding chain:

Mux adaptation / energy dispersal → outer coder → outer interleaver →
inner coder → inner interleaver → mapper → frame adaptation → OFDM →
guard interval insertion → D/A → front end → to antenna
(pilot and TPS signals are inserted in the terrestrial channel adapter)

DVB-T functional block diagram


117
Training material of NHK-CTI

Concept of interleave Training material of NHK-CTI

118
Training material of NHK-CTI

Interleaving Training material of NHK-CTI

Interleaving is a way of rearranging data in order to
improve the ability to cope with burst errors:
data are written in one direction (e.g. row by row) and
read out in the other direction (column by column).

Conceptual diagram of interleaving
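A minimal sketch of such a block interleaver (the 3x4 block size is just an illustrative parameter):

```python
# Write row by row into a rows x cols block, read out column by column.

def interleave(data, rows, cols):
    return [data[r * cols + c] for c in range(cols) for r in range(rows)]

def deinterleave(data, rows, cols):
    return [data[c * rows + r] for r in range(rows) for c in range(cols)]

msg = list(range(12))
tx = interleave(msg, rows=3, cols=4)
print(deinterleave(tx, rows=3, cols=4) == msg)   # True: the operations are inverses
# A burst hitting consecutive symbols on the channel is spread over different
# rows after de-interleaving, so the outer code sees isolated errors.
```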


119
Training material of NHK-CTI

Example of concatenated code (Turbo code) Training material of NHK-CTI

Deinterleaver
RSC Additive
White
Gaussian Interleaver DEC2
DEC1
Interleaver Noise
Channel
(AWGN)
Deinterleaver
RSC

Turbo encoder Turbo decoder

RSC: Recursive systematic convolutional encoder


DEC: Soft-in / soft-out decoder

120
Training material of NHK-CTI

Example of concatenated code (Product code) Training material of NHK-CTI

Random and burst errors are corrected effectively.

A k2 × k1 block of information symbols is extended row-wise by the parity of
the C1 (n1,k1) code and column-wise by the parity of the C2 (n2,k2) code,
giving an n2 × n1 product codeword.

121
Training material of NHK-CTI

Example of product code by RS code Training material of NHK-CTI

Error

B11 B12 B13 B14 B15 P11 P12 (1) Can not correct

B21 B22 B13 B24 B25 P21 P22 (2) B21 correct

B31 B32 B33 B34 B35 P31 P32


Bii Information
B41 B42 B43 B44 B45 P41 P42 symbol
Q11 Q12 Q13 Q14 Q15 P51 P52 Pik Qnj Parity symbol

Q21 Q22 Q23 Q24 Q25 P61 P62


(5) B13 correct
(3) B11 correct (6) B14 correct
(4) B12 correct
122
Training material of NHK-CTI

Appendix Training material of NHK-CTI

Cyclic coder

123
Training material of NHK-CTI

Cyclic code by shift register (1) Training material of NHK-CTI

1 1 0 1 0 1 0

Output: q(x) 0 0 0 x6 +x5 +x3 +x


0 0 Input: f(x)

x3 +x +1

+x3+x2+x+1 Output: q(x)


Cyclic code: x3+x+1 x6+x5 +x3 +x
x6 +x4+x3
(x6+x5+x3+x)+x+1= x6+x5+x3+1
x5+x4
x5 +x3+x2
x4+x3+x2+x
x4 +x2+x
x3
x3 +x+1
x+1
Remainder: R(x)
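The register operation traced on slides 124 to 131 can be sketched as follows (a minimal model of the divider circuit; the first three output bits are zero while the register fills, and the names are illustrative):

```python
# Divide f(x) = x^6 + x^5 + x^3 + x (bits 1101010, MSB first) by g(x) = x^3 + x + 1
# with a 3-stage shift register; the remainder stays in the register.

def lfsr_divide(bits, taps=(1, 1, 0)):       # taps: coefficients of 1, x, x^2 in g(x)
    reg = [0, 0, 0]                          # registers for x^0, x^1, x^2
    quotient = []
    for b in bits:
        fb = reg[2]                          # bit shifted out = feedback = quotient bit
        quotient.append(fb)
        reg = [b ^ (fb & taps[0]),
               reg[0] ^ (fb & taps[1]),
               reg[1] ^ (fb & taps[2])]
    return quotient, reg

q, r = lfsr_divide([1, 1, 0, 1, 0, 1, 0])
print(q, r)   # [0, 0, 0, 1, 1, 1, 1] and [1, 1, 0]: quotient x^3+x^2+x+1, remainder x+1
```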
124
Training material of NHK-CTI

Cyclic code by shift register (2) Training material of NHK-CTI

1 0 1 0 1 0
0 0 1 x5 +x3 +x
0 0
x3 +x +1

125
Training material of NHK-CTI

Cyclic code by shift register (3) Training material of NHK-CTI

0 1 0 1 0
0 1 1 +x3 +x
0 0
x3 +x +1

126
Training material of NHK-CTI

Cyclic code by shift register (4) Training material of NHK-CTI

1 0 1 0
1 1 0 +x3 +x
1 1
x3 +x +1

127
Training material of NHK-CTI

Cyclic code by shift register (5) Training material of NHK-CTI

1 0 1 0
1 1 0 +x
1 1
x3 +x +1

1 x3
1011 1101010 x3+x+1 x6+x5 +x3 +x
1011 x6 +x4+x3
1100 x5+x4

128
Training material of NHK-CTI

Cyclic code by shift register (6) Training material of NHK-CTI

11 1 0
1 1 1 +x
1 1
x3 +x +1

11 +x3+x2
1011 1101010 x3+x+1 x6+x5 +x3 +x
1011 x6 +x4+x3
1100 x5+x4
1011 x5 +x3+x2
1111 x4+x3+x2+x

129
Training material of NHK-CTI

Cyclic code by shift register (7) Training material of NHK-CTI

111 0
1 0 0
1 1
x3 +x +1

111 +x3+x2+x
1011 1101010 x3+x+1 x6+x5 +x3 +x
1011 x6 +x4+x3
1100 x5+x4
1011 x5 +x3+x2
1111 x4+x3+x2+x
1011 x4 +x2+x
1000 x3

130
Training material of NHK-CTI

Cyclic code by shift register (8) Training material of NHK-CTI

1111
Output: q(x)
0 1 1
x3 +x +1
Remainder (contents of resisters) : R(x)
1111 +x3+x2+x+1 Output: q(x)
1011 1101010 x3+x+1 x6+x5 +x3 +x
1011 x6 +x4+x3
1100 x5+x4
1011 x5 +x3+x2
1111 x4+x3+x2+x
1011 x4 +x2+x
1000 x3
1011 x3 +x+1
011 x+1
Remainder: R(x)
131
Training material of NHK-CTI

Convolutional Training material of NHK-CTI

coder & decoder

132
Training material of NHK-CTI

Trellis diagram
:1bit delay 10
+ c1 + :Adder 11/0 01/0
11010
a1 a2 a3 00/0 00 10/0 00/1 11 10/1
Input Output c1c2
a1 a2 a3
+ c2 c1=a1+a2 +a3 11/1 01/1
01
c2=a1 +a3
State (a3 a2) c1c2/a1
State A:(0 0) 00/0
00/1
11/0
State B:(0 1) 11/1
10/0
State C:(1 0) 10/1
01/0
State D:(1 1) 01/1
133
Training material of NHK-CTI

Trellis diagram (1)


:1bit delay 10
+ c1 + :Adder 11/0 01/0
11010
0 0 00/0 00 10/0 00/1 11 10/1
Input Output c1c2
a1 a2 a3
+ c2 c1=a1+a2 +a3
11/1 01/1
01
c2=a1 +a3
State (a3 a2) c1c2/a1
00/0
State A:(0 0)
00/1
11/0
State B:(0 1) 11/1
10/0
State C:(1 0) 10/1

01/0
01/1
State D:(1 1)
134
Training material of NHK-CTI

Trellis diagram (2)


:1bit delay 10
+ 0 + :Adder 11/0 01/0
1101
0 0 0 00/0 00 10/0 00/1 11 10/1
Input Output c1c2
a1 a2 a3
+ 0 c1=a1+a2 +a3 0
11/1 01/1
01
0
c2=a1 +a3
State (a3 a2) 0 c1c2/a1
00
00/0
State A:(0 0)
00/1
11/0
State B:(0 1) 11/1
10/0
State C:(1 0) 10/1

01/0
01/1
State D:(1 1)
135
Training material of NHK-CTI

Trellis diagram (3)


:1bit delay 10
+ c1 + :Adder 11/0 01/0
110
1 0 0 00/0 00 10/0 00/1 11 10/1
Input Output c1c2
a1 a2 a3

c2 c1=a1+a2 +a3
11/1 01/1
01
c2=a1 +a3
State (a3 a2) 0 c1c2/a1
00
00/0
State A:(0 0)
00/1
11/0
State B:(0 1) 11/1
10/0
State C:(1 0) 10/1

01/0
01/1
State D:(1 1)
136
Training material of NHK-CTI

Trellis diagram (4)


:1bit delay 10
+ 1 + :Adder 11/0 01/0
110
1 0 0 00/0 00 10/0 00/1 11 10/1
Input Output c1c2
a1 a2 a3
+ 1 c1=a1+a2 +a3 1
11/1 01/1
01
c2=a1 +a3 1
State (a3 a2) 0 1 c1c2/a1
00 11
State A:(0 0) 00/0
00/1
11/0
State B:(0 1) 11/1
10/0
State C:(1 0) 10/1
01/0
State D:(1 1) 01/1
137
Training material of NHK-CTI

Trellis diagram (5)


:1bit delay 10
+ c1 + :Adder 11/0 01/0
11
0 1 0 00/0 00 10/0 00/1 11 10/1
Input Output c1c2
a1 a2 a3
+ c2 c1=a1+a2 +a3 11/1 01/1
01
c2=a1 +a3
State (a3 a2) 0 1 c1c2/a1
00 11
State A:(0 0) 00/0
00/1
11/0
State B:(0 1) 11/1
10/0
State C:(1 0) 10/1
01/0
State D:(1 1) 01/1
138
Training material of NHK-CTI

Trellis diagram (6)


:1bit delay 10
+ 1 + :Adder 11/0 01/0
11
0 1 0 00/0 00 10/0 00/1 11 10/1
Input Output c1c2
a1 a2 a3
+ 0 c1=a1+a2 +a3 1
11/1 01/1
01
c2=a1 +a3 0
State (a3 a2) 0 1 0 c1c2/a1
00 11 10
State A:(0 0) 00/0
00/1
11/0
State B:(0 1) 11/1
10/0
State C:(1 0) 10/1
01/0
State D:(1 1) 01/1
139
Training material of NHK-CTI

Trellis diagram (7)


:1bit delay 10
+ c1 + :Adder 11/0 01/0
1
1 0 1 00/0 00 10/0 00/1 11 10/1
Input Output c1c2
a1 a2 a3
c2 11/1 01/1
+ c1=a1+a2 +a3 01
c2=a1 +a3
State (a3 a2) 0 1 0 c1c2/a1
00 11 10
State A:(0 0) 00/0
00/1
11/0
State B:(0 1) 11/1
10/0
State C:(1 0) 10/1
01/0
State D:(1 1) 01/1
140
Training material of NHK-CTI

Trellis diagram (8)


:1bit delay 10
+ 0 + :Adder 11/0 01/0
1
1 0 1 00/0 00 10/0 00/1 11 10/1
Input Output c1c2
a1 a2 a3
+ 0 c1=a1+a2 +a3 0
11/1 01/1
01
c2=a1 +a3 0
State (a3 a2) 0 1 0 1 c1c2/a1
00 11 01 00
State A:(0 0) 00/0
00/1
11/0
State B:(0 1) 11/1
10/0
State C:(1 0) 10/1
01/0
State D:(1 1) 01/1
141
Training material of NHK-CTI

Trellis diagram (9)


:1bit delay 10
+ c1 + :Adder 11/0 01/0

1 1 0 00/0 00 10/0 00/1 11 10/1


Input Output c1c2
a1 a2 a3
11/1 01/1
+ c2 c1=a1+a2 +a3 01
c2=a1 +a3
State (a3 a2) 0 1 0 1 c1c2/a1
00 11 01 00
State A:(0 0) 00/0
00/1
11/0
State B:(0 1) 11/1
10/0
State C:(1 0) 10/1
01/0
State D:(1 1) 01/1
142
Training material of NHK-CTI

Trellis diagram (10)


:1bit delay 10
+ 0 + :Adder 11/0 01/0

1 1 0 00/0 00 10/0 00/1 11 10/1


Input Output c1c2
a1 a2 a3
+ 1 c1=a1+a2 +a3 0
11/1 01/1
01
c2=a1 +a3 1
State (a3 a2) 0 1 0 1 1 c1c2/a1
00 11 01 00 01
State A:(0 0) 00/0
00/1
11/0
State B:(0 1) 11/1
10/0
State C:(1 0) 10/1
01/0
State D:(1 1) 01/1
143
Training material of NHK-CTI

Trellis diagram (11)


:1bit delay 10
+ c1 + :Adder 11/0 01/0

1 1 00/0 00 10/0 00/1 11 10/1


Input Output c1c2
a1 a2 a3
c2 11/1 01/1
+ c1=a1+a2 +a3 01
c2=a1 +a3
State (a3 a2) 0 1 0 1 1 c1c2/a1
00 11 01 00 01
State A:(0 0) 00/0
00/1
11/0
State B:(0 1) 11/1
10/0
State C:(1 0) 10/1
01/0
State D:(1 1) 01/1
144
Training material of NHK-CTI

Error correction by Viterbi decoding


Information message 0 1 0 1 1 0 0 0
Transmitted message 00 11 10 00 01 01 11 00
Received message 10 11 11 00 01 01 11 00
1 (1) 2 (3) (5)
State A:(0 0)
1 (2)
(1) 0 (1) (3)
State B:(0 1)
1 (2) (4)
(2)
State C:(1 0)
(3)

1 (2) (2)
State D:(1 1)
(3)
1 (2) c1c2/a1
00/0 10/0
00/1 10/1
Branch metric Path metric 11/0 01/0
11/1 01/1 145
Training material of NHK-CTI

Error correction by Viterbi decoding


Information message 0 1 0 1 1 0 0 0
Transmitted message 00 11 10 00 01 01 11 00
Received message 10 11 11 00 01 01 11 00
(1) (3) (2)
State A:(0 0) (2) (4)
(1) (1) (4)
(3)
State B:(0 1) (2)
(2) (2) (4)

State C:(1 0)
(3)
(2)
(2) (4)
State D:(1 1) (3)

c1c2/a1 00/0 10/0


00/1 10/1
11/0 01/0
11/1 01/1
146
Training material of NHK-CTI

Error correction by Viterbi decoding


Information message 0 1 0 1 1 0 0 0
Transmitted message 00 11 10 00 01 01 11 00
Received message 10 11 11 00 01 01 11 00
(1) (3) (2) (3)
State A:(0 0)
(2) (4)
(1) (1) (3)
(3)
State B:(0 1) (2)
(2) (4)
(2)
(4)
State C:(1 0)
(3)
(2) (3)
(2)
State D:(1 1) (3) (5)

c1c2/a1 00/0 10/0


00/1 10/1
11/0 01/0
11/1 01/1 147
Training material of NHK-CTI

Error correction by Viterbi decoding


Information message 0 1 0 1 1 0 0 0
Transmitted message 00 11 10 00 01 01 11 00
Received message 10 11 11 00 01 01 11 00
(1) (2) (3) (4)
State A:(0 0)
(2) (4)
(1) (1) (3) (4)
State B:(0 1) (2)
(2) (4)
(2) (5)
State C:(1 0)
(3) (3) (2)
(2)
(2) (3)
State D:(1 1) (3) (4)

c1c2/a1 00/0 10/0


00/1 10/1
11/0 01/0
11/1 01/1 148
Training material of NHK-CTI

Error correction by Viterbi decoding


Information message 0 1 0 1 1 0 0 0
Transmitted message 00 11 10 00 01 01 11 00
Received message 10 11 11 00 01 01 11 00
(1) (2) (6)
State A:(0 0)
(2) (4) (2)
(1) (1) (3) (4)
State B:(0 1) (2)
(2) (4) (4)
(2)
(5)
State C:(1 0)
(3) (2) (4)
(2)
(2) (3) (5)
State D:(1 1) (3) (4)

c1c2/a1 00/0 10/0


00/1 10/1
11/0 01/0
11/1 01/1 149
Training material of NHK-CTI

Error correction by Viterbi decoding


Information message 0 1 0 1 1 0 0 0
Transmitted message 00 11 10 00 01 01 11 00
Received message 10 11 11 00 01 01 11 00
(1) (2) (2)
State A:(0 0) (6)
(2) (2)
(1) (1) (3) (4)
State B:(0 1) (2)
(2) (4)
(2) (4)
(5)
State C:(1 0)
(2) (4) (5)
(2) (3) (5)
State D:(1 1)
(4) (5)

c1c2/a1 00/0 10/0


00/1 10/1
11/0 01/0
11/1 01/1 150
Training material of NHK-CTI

Error correction by Viterbi decoding


Information message 0 1 0 1 1 0 0 0
Transmitted message 00 11 10 00 01 01 11 00
Received message 10 11 11 00 01 01 11 00
(1) (2) (2)
State A:(0 0)
(2) (2) (6)
(1) (1) (3) (4)
State B:(0 1) (2)
(2) (4)
(2) (5)
State C:(1 0)
(2) (4)
(2) (3) (5)
State D:(1 1)
(4)

c1c2/a1 00/0 10/0


00/1 10/1
11/0 01/0
11/1 01/1 151
Training material of NHK-CTI

Error correction by Viterbi decoding


Information message 0 1 0 1 1 0 0 0
Transmitted message 00 11 10 00 01 01 11 00
Received message 10 11 11 00 01 01 11 00

Error
Corrected message 00 11 10 00 01 01 11 00
(1) (2)
State A:(0 0)
(2)
(1)

State B:(0 1) (2)


(2)

State C:(1 0)
(2)
(2)
State D:(1 1)

152
Training material of NHK-CTI

Appendix Training material of NHK-CTI

Concept of error correction

153
Training material of NHK-CTI
Relationship between error correction, detection and
minimum Hamming distance

Corrected area C3 Cn=(x1,x2,・・,xn)

dmin ≧ 2t* + s* + 1
Possible to correct t* or fewer bit errors
Possible to detect (t* + s*) or fewer bit errors

t* s* 1 t*
C1 C2 Codeword
dmin
Invalid codeword

Detected area 154


Training material of NHK-CTI

Example error correction and detection

With dmin = 5:

t* = 2, s* = 0
 Possible to correct and detect 2 or fewer bit errors

t* = 1, s* = 2
 Possible to correct a 1-bit error and detect 3 or fewer bit errors

t* = 0, s* = 4
 Impossible to correct; possible to detect 4 or fewer bit errors

155
