
Error Control Coding week 2: Encoding of convolutional codes

Solutions to selected problems from Chapter 11


11.1 (a) The structure of the encoder realized in the controller canonical form
(CCF) is shown in Figure 1.
Figure 1: Encoder in the CCF for the rate R = 1/3 convolutional code.
(b) The encoder has memory m = 2 and rate R = 1/3. The generator matrix is

    G(D) = [g^(0)(D)  g^(1)(D)  g^(2)(D)] = [1 + D   1 + D^2   1 + D + D^2].

Equivalently, we can write

    G(D) = G0 + G1 D + G2 D^2

with

    G0 = [1 1 1];   G1 = [1 0 1];   G2 = [0 1 1].

In the time domain, the generator matrix is the semi-infinite matrix with the
structure

        | G0  G1  G2              |
    G = |     G0  G1  G2          |
        |         G0  G1  G2      |
        |             ...         |

which yields

        | 111 101 011             |
    G = |     111 101 011         |
        |         111 101 011     |
        |             ...         |.
(c) The codeword corresponding to the information sequence u = (1 1 1 0 1)
is given by
v = uG = (111 010 001 110 100 101 011 000 ...).
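The product v(D) = u(D)G(D) over GF(2) can be checked with a short script. This is a minimal sketch (helper name is illustrative, not from the book): polynomials are encoded as integer bitmasks with bit i holding the coefficient of D^i, and multiplication is carry-less.

```python
def gf2_mul(a, b):
    """Carry-less multiplication of two GF(2) polynomials (bit i = coeff of D^i)."""
    r = 0
    while b:
        if b & 1:
            r ^= a
        a <<= 1
        b >>= 1
    return r

u = 0b10111                      # u(D) = 1 + D + D^2 + D^4, i.e. u = (1 1 1 0 1)
g = [0b011, 0b101, 0b111]        # g^(0) = 1+D, g^(1) = 1+D^2, g^(2) = 1+D+D^2
v = [gf2_mul(u, gj) for gj in g]

# Read the three output streams bit by bit and interleave into 3-tuples.
frames = ["".join(str((vj >> t) & 1) for vj in v) for t in range(7)]
print(" ".join(frames))          # 111 010 001 110 100 101 011
```

The printed frames reproduce the codeword above (followed by all-zero 3-tuples).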


11.2 (a) The generator sequences of the convolutional encoder in Figure 11.3 on
page 460 are given by expressions (11.21a)-(11.21c) on the same page in
the book. Thus we obtain that the generator matrix of this rate R = 3/4
code is
           | g_1^(0)(D)  g_1^(1)(D)  g_1^(2)(D)  g_1^(3)(D) |   | 1   1       1         1       |
    G(D) = | g_2^(0)(D)  g_2^(1)(D)  g_2^(2)(D)  g_2^(3)(D) | = | 0   1 + D   D         1       |.
           | g_3^(0)(D)  g_3^(1)(D)  g_3^(2)(D)  g_3^(3)(D) |   | 0   D       1 + D^2   1 + D^2 |
Equivalently, we can write

    G(D) = G0 + G1 D + G2 D^2

with

         | 1 1 1 1 |        | 0 0 0 0 |        | 0 0 0 0 |
    G0 = | 0 1 0 1 |;  G1 = | 0 1 1 0 |;  G2 = | 0 0 0 0 |.
         | 0 0 1 1 |        | 0 1 0 0 |        | 0 0 1 1 |

(b) The time-domain generator matrix is given by

        | G0  G1  G2          |   | 1111 0000 0000                |
        |     G0  G1  G2      |   | 0101 0110 0000                |
    G = |         G0  G1  G2  | = | 0011 0100 0011                |
        |             ...     |   |      1111 0000 0000           |
                                  |      0101 0110 0000           |
                                  |      0011 0100 0011           |
                                  |           ...                 |.
(c) The codeword corresponding to the information sequence u = (110 011 101)
is given by
v = uG = (1010 0000 1110 0111 0011 0000 ...).
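For this k = 3 encoder, each output stream is v^(j)(D) = Σ_i u^(i)(D) g_i^(j)(D). A minimal sketch of that matrix product over GF(2) (bitmask polynomial encoding; helper name is illustrative):

```python
def gf2_mul(a, b):
    """Carry-less multiplication of two GF(2) polynomials (bit i = coeff of D^i)."""
    r = 0
    while b:
        if b & 1:
            r ^= a
        a <<= 1
        b >>= 1
    return r

# G(D) entries as bitmasks (bit i = coefficient of D^i).
G = [[0b1, 0b1,  0b1,   0b1],     # row 1: 1,  1,    1,      1
     [0b0, 0b11, 0b10,  0b1],     # row 2: 0,  1+D,  D,      1
     [0b0, 0b10, 0b101, 0b101]]   # row 3: 0,  D,    1+D^2,  1+D^2
u = [0b101, 0b011, 0b110]         # u^(1)=1+D^2, u^(2)=1+D, u^(3)=D+D^2

# v^(j)(D) = sum over i of u^(i)(D) g_i^(j)(D); addition over GF(2) is XOR.
v = [0, 0, 0, 0]
for i in range(3):
    for j in range(4):
        v[j] ^= gf2_mul(u[i], G[i][j])

frames = ["".join(str((vj >> t) & 1) for vj in v) for t in range(5)]
print(" ".join(frames))           # 1010 0000 1110 0111 0011
```

The interleaved 4-tuples match the codeword stated above.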
11.3 (a) See the solution of Problem 11.1(b).
(b) The input sequence u(D) = 1 + D^2 + D^3 + D^4 (which corresponds to the
time-domain sequence u = (1 0 1 1 1)) is encoded as the codeword
v(D) = (v^(0)(D) v^(1)(D) v^(2)(D)) according to

    v(D) = u(D)G(D) = (1 + D^2 + D^3 + D^4) [1 + D   1 + D^2   1 + D + D^2]
         = (1 + D + D^2 + D^5   1 + D^3 + D^5 + D^6   1 + D + D^4 + D^6).
11.4 (a) The memory m = 1, rate R = 2/3 convolutional encoder shown in Figure
11.2 on page 457 in the book has the generator matrix

    G(D) = | 1 + D   D   1 + D |
           | D       1   1     |.

The composite generator polynomials are (cf. expression (11.37) in the book)

    g_1(D) = 1 + D^2 + D^3 + D^4 + D^5;    g_2(D) = D + D^2 + D^3.


(b) The information sequence u(D) = (u^(1)(D) u^(2)(D)) = (1 + D + D^3   1 + D^2 + D^3)
is encoded as the codeword v(D) = (v^(0)(D) v^(1)(D) v^(2)(D)) according to

    v(D) = u(D)G(D) = (1 + D + D^3   1 + D^2 + D^3) | 1 + D   D   1 + D |
                                                    | D       1   1     |
         = (1 + D + D^2   1 + D + D^3 + D^4   D^4),

which corresponds to the time-domain sequence of 3-tuples

    v = (110 110 100 010 011)

and, equivalently, can be written as the composite polynomial

    v(D) = 1 + D + D^3 + D^4 + D^6 + D^10 + D^13 + D^14.
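The composite polynomial is formed by interleaving the n = 3 output streams: coefficient t of stream j lands at position n·t + j, i.e. v(D) = Σ_j D^j v^(j)(D^n). A minimal sketch of this bit-spreading (function name is illustrative):

```python
def interleave(streams):
    """Merge n output streams into the composite polynomial:
    bit t of stream j maps to bit n*t + j, i.e. v(D) = sum_j D^j v^(j)(D^n)."""
    n = len(streams)
    r = 0
    for j, vj in enumerate(streams):
        t = 0
        while vj:
            if vj & 1:
                r |= 1 << (n * t + j)
            vj >>= 1
            t += 1
    return r

v0 = 0b00111    # v^(0)(D) = 1 + D + D^2
v1 = 0b11011    # v^(1)(D) = 1 + D + D^3 + D^4
v2 = 0b10000    # v^(2)(D) = D^4
composite = interleave([v0, v1, v2])

# Exponents of the composite polynomial stated above.
expected = sum(1 << e for e in [0, 1, 3, 4, 6, 10, 13, 14])
print(composite == expected)    # True
```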
11.5 (a) The generator polynomials of this memory m = 5, rate R = 1/3
systematic encoder are, in the D-domain,

    g^(0)(D) = 1;    g^(1)(D) = 1 + D^2 + D^3 + D^5;    g^(2)(D) = 1 + D + D^4 + D^5,

which yields

    G(D) = G0 + G1 D + G2 D^2 + G3 D^3 + G4 D^4 + G5 D^5

with

    G0 = [1 1 1];  G1 = [0 0 1];  G2 = [0 1 0];  G3 = [0 1 0];  G4 = [0 0 1];  G5 = [0 1 1].
Thus, the time-domain generator matrix is

        | 111 001 010 010 001 011             |
    G = |     111 001 010 010 001 011         |
        |         111 001 010 010 001 011     |
        |             ...                     |.
(b) The parity sequences corresponding to the input sequence u = (1 1 0 1),
that is, u(D) = 1 + D + D^3, are

    v^(1)(D) = u(D)g^(1)(D) = 1 + D + D^2 + D^3 + D^4 + D^8
    v^(2)(D) = u(D)g^(2)(D) = 1 + D^2 + D^3 + D^6 + D^7 + D^8

or, in the time domain,

    v^(1) = (1 1 1 1 1 0 0 0 1)
    v^(2) = (1 0 1 1 0 0 1 1 1).
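Both parity products can be verified with the same GF(2) polynomial multiplication as before (a minimal sketch; helper name is illustrative):

```python
def gf2_mul(a, b):
    """Carry-less multiplication of two GF(2) polynomials (bit i = coeff of D^i)."""
    r = 0
    while b:
        if b & 1:
            r ^= a
        a <<= 1
        b >>= 1
    return r

u  = 0b1011      # u(D)     = 1 + D + D^3
g1 = 0b101101    # g^(1)(D) = 1 + D^2 + D^3 + D^5
g2 = 0b110011    # g^(2)(D) = 1 + D + D^4 + D^5

v1 = gf2_mul(u, g1)
v2 = gf2_mul(u, g2)
print([(v1 >> t) & 1 for t in range(9)])   # [1, 1, 1, 1, 1, 0, 0, 0, 1]
print([(v2 >> t) & 1 for t in range(9)])   # [1, 0, 1, 1, 0, 0, 1, 1, 1]
```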


Figure 2: CCF realization of the rate R = 2/3 convolutional encoder.


Figure 3: OCF realization of the rate R = 2/3 convolutional encoder.


11.6 The generator matrix of this memory m = 3, rate R = 2/3 systematic
encoder is given by

    G(D) = | g_1^(0)(D)  g_1^(1)(D)  g_1^(2)(D) |   | 1   0   1 + D^2 + D^3 |
           | g_2^(0)(D)  g_2^(1)(D)  g_2^(2)(D) | = | 0   1   1 + D + D^3   |.
(a) The controller canonical form (CCF) realization of the encoder requires
6 memory elements; it is shown in Figure 2.
(b) The observer canonical form (OCF) realization of the encoder requires
only 3 memory elements; it is shown in Figure 3.
11.8 The OCF realization of the systematic generator matrix

    G(D) = [ 1   (1 + D^2)/(1 + D + D^2)   (1 + D)/(1 + D + D^2) ]

is shown in Figure 4. The overall constraint length of G(D) is ν = 2. Note
that this is the number of memory elements required for the CCF, not for the
OCF!
11.14 (a) The generator polynomials are relatively prime, that is, their
greatest common divisor is GCD = 1.
(b) Since GCD = 1, the right inverse G^(-1)(D) must satisfy

    G(D)G^(-1)(D) = I,

that is,

    [1 + D^2   1 + D + D^2] G^(-1)(D) = 1.


Figure 4: OCF realization of the rate R = 1/3 systematic convolutional encoder.


By inspection, we easily find

    G^(-1)(D) = | 1 + D |
                | D     |.
11.15 (a) The generator polynomials are not relatively prime; their greatest
common divisor is GCD = 1 + D^2. Thus, the encoder is catastrophic, and a
polynomial right inverse of the generator matrix does not exist.
(b) The encoder's state diagram is shown in Figure 5.
Figure 5: State transition diagram of a catastrophic encoder.


(c) The cycles through the states S2 → S5 → S2 and S7 → S7 both produce zero
output weight, while the corresponding inputs are non-zero.
(d)+(e) The encoder is catastrophic; hence, there exists an infinite-weight
input sequence that produces a finite-weight output code sequence. In our
case, input sequences that drive the encoder into one of the two zero-weight
cycles found in part (c) result in finite-weight output sequences. For
example, the infinite-weight sequence

    u = 1 0 1 0 1 0 1 0 1 0 ...

that is,

    u(D) = 1 + D^2 + D^4 + D^6 + D^8 + D^10 + ...

yields the state sequence S0 → S4 → S2 → S5 → S2 → S5 → ..., and the
corresponding output code sequence has weight only 3:

    v = (11 01 00 00 00 ...),

that is,

    v(D) = (v^(0)(D) v^(1)(D)) = (1   1 + D).
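The bounded output weight can be demonstrated numerically. This sketch *assumes* the generators g^(0)(D) = 1 + D^2 and g^(1)(D) = 1 + D + D^2 + D^3 (not restated in the problem text here, but consistent with GCD = 1 + D^2, memory m = 3, and the output v(D) = (1, 1 + D) above): truncations of the input 1 + D^2 + D^4 + ... grow in weight, while the total output weight stays constant.

```python
def gf2_mul(a, b):
    """Carry-less multiplication of two GF(2) polynomials (bit i = coeff of D^i)."""
    r = 0
    while b:
        if b & 1:
            r ^= a
        a <<= 1
        b >>= 1
    return r

# Assumed generators with GCD 1 + D^2 (hypothetical reconstruction, see lead-in).
g0, g1 = 0b101, 0b1111    # 1 + D^2 and 1 + D + D^2 + D^3

for N in (5, 50, 500):
    u = sum(1 << (2 * i) for i in range(N + 1))   # truncated 1 + D^2 + ... + D^(2N)
    weight = bin(gf2_mul(u, g0)).count("1") + bin(gf2_mul(u, g1)).count("1")
    print(N, weight)    # the total output weight stays at 6 however long u grows
```

Only the truncation tail contributes the extra weight beyond 3; for the infinite input sequence the output weight is exactly 3.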
11.16 For a systematic (n, k, ν) encoder, the k × n generator matrix is of
the form

    G(D) = [I_k | P(D)],

where I_k is the k × k identity matrix corresponding to the systematic
symbols and P(D) is a k × (n - k), in general rational, matrix corresponding
to the parity symbols. The polynomial (feedforward) right inverse G^(-1)(D)
with delay l = 0 must satisfy

    G(D)G^(-1)(D) = I_k.

Clearly, the matrix satisfying this condition is

    G^(-1)(D) = | I_k         |
                | 0_(n-k)×k   |,

where 0_(n-k)×k is the all-zero matrix of size (n - k) × k.
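As a concrete check, here is a minimal sketch for k = 2, n = 3, reusing the parity column of the systematic encoder from Problem 11.6 (P(D) = [1 + D^2 + D^3 ; 1 + D + D^3]); the matrix product over GF(2)[D] uses the same bitmask polynomial encoding, and the helper names are illustrative:

```python
def gf2_mul(a, b):
    """Carry-less multiplication of two GF(2) polynomials (bit i = coeff of D^i)."""
    r = 0
    while b:
        if b & 1:
            r ^= a
        a <<= 1
        b >>= 1
    return r

def matmul(A, B):
    """Matrix product where entries are GF(2) polynomials (bitmasks)."""
    rows, inner, cols = len(A), len(B), len(B[0])
    C = [[0] * cols for _ in range(rows)]
    for i in range(rows):
        for j in range(cols):
            for t in range(inner):
                C[i][j] ^= gf2_mul(A[i][t], B[t][j])
    return C

G = [[1, 0, 0b1101],      # [I_2 | P(D)] with P = [1+D^2+D^3 ; 1+D+D^3]
     [0, 1, 0b1011]]
Ginv = [[1, 0],
        [0, 1],
        [0, 0]]           # [I_2 ; 0], the claimed right inverse
print(matmul(G, Ginv))    # [[1, 0], [0, 1]], i.e. I_2
```

Because the parity columns are multiplied by the all-zero block, the product is the identity regardless of P(D), exactly as the argument above states.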

Das könnte Ihnen auch gefallen