1 Codes

Let B = {0, 1}. For n ∈ N, the set of binary words of length n over B is

    B^n = {b1 b2 . . . bn | bi ∈ B}.
Addition and multiplication on B are given by the tables

    + | 0 1        · | 0 1
    0 | 0 1        0 | 0 0
    1 | 1 0        1 | 0 1

Addition extends to B^n componentwise:

    (b1 b2 . . . bn , c1 c2 . . . cn) ↦ d1 d2 . . . dn ,   where di = bi + ci .
i) Verify that (B^n , +) is an abelian group.
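As a quick sketch (plain Python, illustrative names), componentwise addition on B^n is just bitwise XOR, so every word is its own additive inverse:

```python
# Componentwise addition in B^n: bits add mod 2 (XOR).
# Words are represented as strings of '0' and '1'.

def add(u: str, v: str) -> str:
    """Add two binary words of equal length bit by bit, mod 2."""
    assert len(u) == len(v)
    return "".join(str((int(a) + int(b)) % 2) for a, b in zip(u, v))

print(add("0110", "1100"))  # -> 1010
print(add("1011", "1011"))  # each word is its own inverse -> 0000
```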
A message word w is encoded as E(w), transmitted over a noisy channel, and the received word r is decoded as D(r):

    w ↦ E(w) —(noise)→ r ↦ D(r)
Definition 1.1. Let m, n ∈ N be such that m < n. A binary (n, m) code (or simply a code) consists of an encoding function E : B^m → B^n and a decoding function D : B^n → B^m.
In general, the set M of message words need not equal B^m, but for convenience we assume that M = B^m. Then the code is C := E(M) = E(B^m) and |C| = 2^m.
Example 1.1 (Even parity check code). Define E : B^m → B^{m+1} by

    b1 b2 . . . bm ↦ b1 b2 . . . bm bm+1 ,

where

    bm+1 = 0 if the number of 1s in b1 b2 . . . bm is even,
           1 if the number of 1s in b1 b2 . . . bm is odd,

and D : B^{m+1} → B^m by

    b1 b2 . . . bm bm+1 ↦ b1 b2 . . . bm  if the number of 1s in b1 b2 . . . bm bm+1 is even,
                          00 . . . 0      if the number of 1s in b1 b2 . . . bm bm+1 is odd.
For m = 3, the encoding is:

    message word  000   001   010   100   011   101   110   111
    code word     0000  0011  0101  1001  0110  1010  1100  1111
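A runnable sketch of the parity encoder and decoder (assuming the string-based representation; the all-zero output signals a detected error):

```python
# Even parity check code: E appends a bit making the number of 1s even;
# D drops the check bit when the overall parity is even, otherwise it
# returns 00...0 to signal that an error occurred.

def encode(w: str) -> str:
    return w + str(w.count("1") % 2)

def decode(r: str) -> str:
    if r.count("1") % 2 == 0:        # overall parity even: assume no error
        return r[:-1]
    return "0" * (len(r) - 1)        # odd parity: an error occurred

print(encode("011"))    # -> 0110
print(decode("0110"))   # -> 011
print(decode("0111"))   # -> 000 (error detected)
```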
The following received words are decoded as in the table:
Example 1.2 (Triple repetition code). E : B^m → B^{3m} is defined by

    b1 b2 . . . bm ↦ b1 b2 . . . bm b1 b2 . . . bm b1 b2 . . . bm

and D : B^{3m} → B^m is defined by

    x1 x2 . . . xm y1 y2 . . . ym z1 z2 . . . zm ↦ b1 b2 . . . bm ,

where

    bi = 0 if 0 occurs in xi yi zi at least twice,
         1 if 1 occurs in xi yi zi at least twice.
For example, B^3 is encoded as follows:

    message word  000        001        010        100        011        101        110        111
    code word     000000000  001001001  010010010  100100100  011011011  101101101  110110110  111111111

The following received words are decoded by majority vote in each position:

    received word  101 101 101   010 111 110   011 101 110   001 101 001   111 000 101
    message word   101           110           111           001           101
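The repetition encoder and majority-vote decoder can be sketched as follows (illustrative Python, string words):

```python
# Triple repetition code: E sends w to www; D recovers bit i by majority
# vote over the three copies x_i, y_i, z_i.

def encode(w: str) -> str:
    return w * 3

def decode(r: str) -> str:
    m = len(r) // 3
    x, y, z = r[:m], r[m:2 * m], r[2 * m:]
    # bit i is whichever of 0/1 occurs at least twice among x_i, y_i, z_i
    return "".join("1" if int(a) + int(b) + int(c) >= 2 else "0"
                   for a, b, c in zip(x, y, z))

print(encode("101"))         # -> 101101101
print(decode("010111110"))   # -> 110
```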
For u, v ∈ B^n, the distance d(u, v) is the number of positions in which u and v differ, and the weight w(u) is the number of 1s in u. The distance and weight defined above are called the Hamming distance and Hamming weight, respectively.
Lemma 1.1. Let u, v ∈ B^n. Then w(u) = d(u, 0) and d(u, v) = w(u + v).

The Hamming distance is a metric on B^n: for all u, v, w ∈ B^n,

i) d(u, v) ≥ 0, and d(u, v) = 0 if and only if u = v,
ii) d(u, v) = d(v, u),
iii) d(u, w) ≤ d(u, v) + d(v, w).
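Both quantities, and the identities in Lemma 1.1, are easy to spot-check in code (an illustrative sketch):

```python
# Hamming weight and Hamming distance, with a spot check of Lemma 1.1:
# w(u) = d(u, 0) and d(u, v) = w(u + v).

def weight(u: str) -> int:
    return u.count("1")

def distance(u: str, v: str) -> int:
    return sum(a != b for a, b in zip(u, v))

u, v = "1001100", "1101101"
zero = "0" * len(u)
u_plus_v = "".join(str(int(a) ^ int(b)) for a, b in zip(u, v))

print(distance(u, v))                      # -> 2
print(weight(u) == distance(u, zero))      # -> True
print(distance(u, v) == weight(u_plus_v))  # -> True
```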
Example 1.3. Let C = {0000000, 1001100, 1101101, 0110011}. The weights of the code words are:

    code word u  0000000  1001100  1101101  0110011
    w(u)         0        3        5        4

The following table displays the H-distance between any two code words in C:

             0000000  1001100  1101101  0110011
    0000000  0        3        5        4
    1001100  3        0        2        7
    1101101  5        2        0        5
    0110011  4        7        5        0
To decode a received word r, we may:

1. Find the closest code word v ∈ C, that is, v such that d(r, v) ≤ d(r, u) for all u ∈ C.
2. Equivalently (by Lemma 1.1), find the code word v ∈ C minimizing w(r + v).
Assume that 0001001, 1010100, 1001001, 0100101, 1110100, 1111111 are received words. We decode them as follows:

By the 1st method, we list the distances d(r, u) for each u ∈ C:

    received word r  0000000  1001100  1101101  0110011  decoded as
    0001001          2        3        3        4        0000000
    1010100          3        2        4        5        1001100
    1001001          3        2        2        5        1001100 or 1101101
    0100101          3        4        2        3        1101101
    1110100          4        3        3        4        1001100 or 1101101
    1111111          7        4        2        3        1101101
By the 2nd method, we compute the weights w(r + u) for u ∈ C. By Lemma 1.1, w(r + u) = d(r, u), so these weights coincide with the distances computed by the 1st method and the decoded words are the same:

    received word  0001001  1010100  1001001             0100101  1110100             1111111
    decoded as     0000000  1001100  1001100 or 1101101  1101101  1001100 or 1101101  1101101
Definition 1.4. Let C be a code such that |C| ≠ 1. The minimum distance d(C) of C is

    d(C) = min{d(u, v) | u, v ∈ C, u ≠ v}.
The minimum weight w(C) of C is

    w(C) = min{w(u) | u ∈ C, u ≠ 00 . . . 0}.

The minimum distance of a code tells us about its error-correction (and error-detection) capability: a code with minimum distance d can detect up to d − 1 errors and correct up to ⌊(d − 1)/2⌋ errors.
1. The even parity check code in Example 1.1 has minimum distance 2 and hence can detect at most 1 error but cannot correct any error. (Verify!)
Example 1.6. Let C = {00000000, 11101011, 01011110, 10110101} be an (8, 2) code. The distances between any two code words are displayed in the table:

              00000000  11101011  01011110  10110101
    00000000  0         6         5         5
    11101011  6         0         5         5
    01011110  5         5         0         6
    10110101  5         5         6         0

Then C has minimum distance 5. This means that C can correct at most 2 errors.
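For small codes, d(C) can be found by brute force over all pairs; this sketch verifies d(C) = 5 for the code above:

```python
# Brute-force minimum distance: min of d(u, v) over distinct code words.
from itertools import combinations

def distance(u: str, v: str) -> int:
    return sum(a != b for a, b in zip(u, v))

C = ["00000000", "11101011", "01011110", "10110101"]
d_C = min(distance(u, v) for u, v in combinations(C, 2))

print(d_C)             # -> 5
print((d_C - 1) // 2)  # number of correctable errors -> 2
```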
Assume complete nearest neighbor decoding is used. If the words 11111111, 00001011 and 11110000 are received, we decode them as follows:

    received word  distances to the code words (in the order listed above)  decoded as
    11111111       8, 2, 3, 3                                               11101011
    00001011       3, 3, 4, 6                                               00000000 or 11101011 (a tie)
    11110000       4, 4, 5, 3                                               10110101
When the size of a code is large, its minimum distance is hard to compute. Next, we introduce a more efficient family of codes, called linear codes (or group codes).
2 Linear Codes (group codes)
Recall that (B n , +) is an abelian group.
Definition 2.1. An (n, k) code C ⊆ B^n is called a linear code (or group code) if u + v ∈ C for all u, v ∈ C.
1. The even parity check code in Example 1.1 is a linear code with minimum distance 2. Hence it is an [m + 1, m, 2] code. (Verify!)
Example 2.3. Consider the code C = {111111, 100110, 010001, 011010}. Then C has minimum distance d(C) = 3, which is not equal to w(C) = 2. Why?
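The answer can be checked mechanically: for a linear code d(C) = w(C), but this C is not closed under +, hence not linear. An illustrative check:

```python
# A code is linear iff it is closed under componentwise addition.
# This C is not: e.g. 111111 + 100110 = 011001 is not in C, which is
# why d(C) = 3 and w(C) = 2 can differ here.

def add(u: str, v: str) -> str:
    return "".join(str(int(a) ^ int(b)) for a, b in zip(u, v))

C = {"111111", "100110", "010001", "011010"}

def is_linear(code) -> bool:
    return all(add(u, v) in code for u in code for v in code)

print(is_linear(C))             # -> False
print(add("111111", "100110"))  # -> 011001
```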
For any code, we can decode by the methods described in Example 1.3. If C is a linear code, however, we have more efficient methods.
The above table is called the standard decoding array (or standard array).
Theorem 2.2. Coset decoding is nearest neighbor decoding.

Assume that coset decoding is used. If the words 0101, 1010, 1111, 1011, 0111 are received, then we decode each as r + v, where r is the received word and v is the coset leader of the coset containing r:
Example 2.6. Construct the standard array for the linear [6, 3, 3] code

    C = {000000, 001110, 010101, 011011, 100011, 101101, 110110, 111000}.

    C + 000000   C + ______   C + ______   C + ______   C + ______   C + ______   C + ______   C + ______
    000000
    001110
    010101
    011011
    100011
    101101
    110110
    111000
2.2 Generator Matrix, Parity-check Matrix and Decoding
Definition 2.4. Let G be a binary k × n matrix such that k < n and the first k columns form an identity matrix Ik. Define E : B^k → B^n by E(w) = wG. Then C := {wG | w ∈ B^k} is called the code generated by G, and G is called the (standard) generator matrix for C.
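Encoding is a vector-matrix product over B; a sketch with mod-2 arithmetic, using as an assumed example the 4 × 7 matrix G that appears in Example 2.9 below:

```python
# Encoding with a standard generator matrix: E(w) = wG, arithmetic mod 2.
# G here is the 4 x 7 matrix of Example 2.9.

G = [
    [1, 0, 0, 0, 1, 1, 1],
    [0, 1, 0, 0, 1, 1, 0],
    [0, 0, 1, 0, 1, 0, 1],
    [0, 0, 0, 1, 0, 1, 1],
]

def encode(w):
    """Row vector w times G, each entry reduced mod 2."""
    return [sum(w[i] * G[i][j] for i in range(len(w))) % 2
            for j in range(len(G[0]))]

print(encode([1, 0, 1, 1]))  # -> [1, 0, 1, 1, 0, 0, 1]
```

Because G is in standard form, the first k bits of every code word are the message itself.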
Lemma 2.3. If G and H are a generator matrix and a parity-check matrix for a linear code C, respectively, then HG^t = [0].
Theorem 2.4. If G = [Ik A] is a generator matrix for a linear [n, k] code C, then H = [A^t In−k] is a parity-check matrix for C.
Conversely, if H = [B In−k] is a parity-check matrix for a linear [n, k] code C, then G = [Ik B^t] is a generator matrix for C.
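The theorem is easy to verify computationally; a sketch that builds H from G = [Ik A] and checks the identity HG^t = [0] of Lemma 2.3 (mod 2), again using the G of Example 2.9:

```python
# From G = [I_k | A], build H = [A^t | I_{n-k}] and verify H G^t = [0] mod 2.

G = [
    [1, 0, 0, 0, 1, 1, 1],
    [0, 1, 0, 0, 1, 1, 0],
    [0, 0, 1, 0, 1, 0, 1],
    [0, 0, 0, 1, 0, 1, 1],
]
k, n = len(G), len(G[0])

A = [row[k:] for row in G]                          # right k x (n-k) block
H = [[A[i][j] for i in range(k)]                    # A^t part
     + [1 if t == j else 0 for t in range(n - k)]   # I_{n-k} part
     for j in range(n - k)]

HGt = [[sum(H[r][c] * G[s][c] for c in range(n)) % 2 for s in range(k)]
       for r in range(n - k)]

print(H)    # -> [[1, 1, 1, 0, 1, 0, 0], [1, 1, 0, 1, 0, 1, 0], [1, 0, 1, 1, 0, 0, 1]]
print(HGt)  # -> the zero 3 x 4 matrix
```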
Example 2.7. The even parity check code in Example 1.1 is a linear code with the generator matrix

    G = [ Im  1 ],

where 1 denotes the all-ones column vector. Then, for m = 3,

    C := {wG | w ∈ B^3}
       = {                              }.
2. The parity-check matrix

    H = [              ].

    C + 000000   C + ______   C + ______   C + ______   C + ______   C + ______   C + ______   C + ______
Example 2.9. Let

        [ 1 0 0 0 1 1 1 ]
    G = [ 0 1 0 0 1 1 0 ]
        [ 0 0 1 0 1 0 1 ]
        [ 0 0 0 1 0 1 1 ].

Then

    C := {wG | w ∈ B^4}
       = {                              }.
3. All cosets and coset leaders:

    C + 000000   C + ______   C + ______   C + ______   C + ______   C + ______   C + ______   C + ______
Definition 2.6. Let H be the parity-check matrix for a linear [n, k] code C. For each v ∈ B^n, the syndrome S(v) of v is defined by S(v) = Hv^t.

Theorem 2.5. Let H be the parity-check matrix for a linear [n, k] code C and u, v ∈ B^n. Then
i) S(u + v) = S(u) + S(v),
ii) S(v) = [0] if and only if v ∈ C,
iii) S(u) = S(v) if and only if u and v are in the same coset.
Definition 2.7. A table which matches each coset leader e with its syndrome
is called a syndrome look-up table.
Example 2.10. Construct a syndrome look-up table for the [6, 3] code in Example 2.8.

    coset leader v   syndrome S(v)
Assume that syndrome decoding is used. Decode the following received words:
Exercise 2.3. Construct a syndrome look-up table for the [7, 4] code in Example 2.9. Assume that syndrome decoding is used. Then decode the following received words: 0001001, 1010100, 1001001, 0100101, 1110100, 1111111.
1. If S(r) = [0], conclude that no error occurred and accept r as the transmitted code word.
2. If S(r) ≠ [0] and S(r) is column i of H, decode by changing the ith bit of r.
Exercise 2.4. For the [7, 4] code in Example 2.9, assume that parity-check matrix decoding is used. Then decode the following received words: