Channel Coding: Binit Mohanty, Ketan Rajawat

Channel coding techniques allow reliable transmission of information over noisy channels by adding redundant bits. Traditional techniques like repetition coding, parity check codes, and convolutional codes improve reliability but do not achieve the theoretical channel capacity limit. Turbo codes proposed in 1993 combine simple component codes to achieve performance very close to the capacity limit, while keeping decoding complexity feasible. This breakthrough enabled reliable communication at rates previously considered impossible.


Channel Coding

Binit Mohanty
Ketan Rajawat
Recap…
• Information is transmitted through channels (e.g. wires, optical fibres, and even air)
• Channels are noisy, so we do not receive exactly what was transmitted
System Model
• A binary symmetric channel (BSC)
• Each transmitted bit crosses over (is flipped) with probability p, independently of the other bits
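The channel model is easy to simulate: each bit is flipped independently with the crossover probability p. A minimal Python sketch (the helper name bsc and the example bits are illustrative, not from the slides):

import random

def bsc(bits, p):
    # Binary symmetric channel: flip each bit independently with probability p
    return [b ^ 1 if random.random() < p else b for b in bits]

tx = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1]
rx = bsc(tx, 0.01)       # with p = 0.01, roughly 1 bit in 100 is flipped
print(tx)
print(rx)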
Repetition Coding
• Assume rate-1/3 repetition: 0 → 000, 1 → 111
• What is the probability of error (with majority-vote decoding)?

  P_e = C(3,2) p^2 (1 − p) + p^3

• If the crossover probability is p = 0.01, then P_e ≈ 0.0003
• Here the coding rate is R = 1/3. Can we do better? How much better?
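As a sanity check on the expression above, the sketch below (reusing the illustrative bsc helper from the previous slide) encodes with the rate-1/3 repetition code, decodes by majority vote, and compares the simulated error rate against P_e = 3 p^2 (1 − p) + p^3:

import random

def bsc(bits, p):
    return [b ^ 1 if random.random() < p else b for b in bits]

def encode_rep3(bits):
    # 0 -> 000, 1 -> 111
    return [b for b in bits for _ in range(3)]

def decode_rep3(bits):
    # Majority vote over each group of three received bits
    return [int(sum(bits[i:i + 3]) >= 2) for i in range(0, len(bits), 3)]

p, n = 0.01, 200_000
msg = [random.randint(0, 1) for _ in range(n)]
est = decode_rep3(bsc(encode_rep3(msg), p))
simulated = sum(a != b for a, b in zip(msg, est)) / n
predicted = 3 * p**2 * (1 - p) + p**3
print(simulated, predicted)      # both close to 0.0003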
Shannon’s Theorem
• Given
  – a noisy channel (some fixed p)
  – a value of Pe which we want to achieve
• “We can transmit through the channel and achieve this probability of error at a maximum coding rate of C(p)”
• Is it counterintuitive?
• Do such good codes exist?
Channel Capacity
• C(p) is called the channel capacity
• For the binary symmetric channel, C(p = 0.01) ≈ 0.9192
• Can we really design codes that achieve this rate? How?
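The 0.9192 figure follows from the closed form for the binary symmetric channel, C(p) = 1 − H_b(p), where H_b is the binary entropy function. A quick check (function name is illustrative):

from math import log2

def bsc_capacity(p):
    # C(p) = 1 - Hb(p), with Hb the binary entropy function
    if p in (0.0, 1.0):
        return 1.0
    hb = -p * log2(p) - (1 - p) * log2(1 - p)
    return 1 - hb

print(bsc_capacity(0.01))        # ~0.9192 bits per channel use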
Parity Check Codes
• #information bits transmitted = k
• #bits actually transmitted = n = k + 1
• Code rate R = k/n = k/(k + 1)
• Error detecting capability = 1
• Error correcting capability = 0
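A minimal sketch of the single parity-check code described above, assuming even parity (function names are illustrative): one check bit is appended, so a single flip is detected but cannot be located.

def spc_encode(msg):
    # Append one even-parity bit: n = k + 1, rate k / (k + 1)
    return msg + [sum(msg) % 2]

def spc_check(word):
    # True if the overall parity is even (no error, or an even number of errors)
    return sum(word) % 2 == 0

cw = spc_encode([1, 0, 1, 1])    # -> [1, 0, 1, 1, 1]
cw[2] ^= 1                       # a single bit flip
print(spc_check(cw))             # False: detected, but we cannot tell which bit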
2-D Parity Check
• Rate?
• Error detecting capability?
• Error correcting capability?

  1 0 0 1 0 0
  0 1 0 0 0 1
  1 0 0 1 0 0
  1 1 0 1 1 0
  1 0 0 1 1 1

  The last column holds the check bit for each row; the bottom row holds the check bit for each column.
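One way to see what two-dimensional parity buys: every data bit is covered by one row check and one column check, so a single bit error makes exactly one row and one column fail, which pinpoints it. A sketch assuming even parity, as in the array above (function names are illustrative):

def parity_2d(data):
    # Append an even-parity bit to each row, then a row of column parities
    rows = [r + [sum(r) % 2] for r in data]
    rows.append([sum(col) % 2 for col in zip(*rows)])
    return rows

def locate_error(block):
    # A single flipped bit makes exactly one row and one column fail its check
    bad_rows = [i for i, r in enumerate(block) if sum(r) % 2]
    bad_cols = [j for j, c in enumerate(zip(*block)) if sum(c) % 2]
    return (bad_rows[0], bad_cols[0]) if bad_rows and bad_cols else None

block = parity_2d([[1, 0, 0, 1, 0],
                   [0, 1, 0, 0, 0],
                   [1, 0, 0, 1, 0],
                   [1, 1, 0, 1, 1]])   # reproduces the array on the slide
block[2][3] ^= 1                       # introduce a single error
print(locate_error(block))             # (2, 3): flip that bit back to correct it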
Linear Block Codes
• #parity bits = n − k (= 1 for the parity check code)
• Message m = {m1 m2 … mk}
• Transmitted codeword c = {c1 c2 … cn}
• A k × n generator matrix G, with c = mG
• What is G for the repetition code? For the parity check code?
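Encoding c = mG is a vector-matrix product carried out with mod-2 arithmetic. A sketch under that reading, using the rate-1/3 repetition code's generator as the worked case (the generator for the parity check code is left as the slide's question):

def encode(m, G):
    # c = m G over GF(2): ordinary matrix product, reduced mod 2
    n = len(G[0])
    return [sum(m[i] * G[i][j] for i in range(len(m))) % 2 for j in range(n)]

G_rep3 = [[1, 1, 1]]          # k = 1, n = 3: generator of the 1/3 repetition code
print(encode([0], G_rep3))    # [0, 0, 0]
print(encode([1], G_rep3))    # [1, 1, 1]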
Linear Block Codes
• Linearity: if c1 = m1G and c2 = m2G, then c1 + c2 = (m1 + m2)G
• Example: the (7,4) Hamming code (rate 4/7)
  – k = 4, n = 7
  – 4 message bits at positions (3, 5, 6, 7)
  – 3 parity bits at positions (1, 2, 4)
  – Error correcting capability = 1
  – Error detecting capability = 2
• What is G?
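One concrete way to realise the bit placement above (a sketch, not necessarily the generator the lecture has in mind): each parity bit at position 1, 2 or 4 covers the positions whose binary index contains that power of two, so the syndrome of a received word directly reads out the position of a single error.

def hamming74_encode(d):
    # d = [d1, d2, d3, d4]; codeword positions 1..7 are [p1, p2, d1, p4, d2, d3, d4]
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4        # covers positions 3, 5, 7
    p2 = d1 ^ d3 ^ d4        # covers positions 3, 6, 7
    p4 = d2 ^ d3 ^ d4        # covers positions 5, 6, 7
    return [p1, p2, d1, p4, d2, d3, d4]

def hamming74_correct(c):
    # The syndrome (s4 s2 s1), read as a binary number, is the 1-based error position
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s4 = c[3] ^ c[4] ^ c[5] ^ c[6]
    pos = 4 * s4 + 2 * s2 + s1
    if pos:
        c[pos - 1] ^= 1      # correct the single flipped bit
    return c

cw = hamming74_encode([1, 0, 1, 1])
cw[5] ^= 1                                                     # flip position 6
print(hamming74_correct(cw) == hamming74_encode([1, 0, 1, 1]))  # True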
Cyclic Codes
• A special case of linear block codes: every cyclic shift of a codeword is also a codeword
• Easy to encode and decode; can correct contiguous bursts of errors
• Examples: CRC (used in wireless LANs), BCH codes, Hamming codes, Reed-Solomon codes (used in CDs)
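A CRC treats the bit stream as a polynomial over GF(2), divides it by a generator polynomial, and appends the remainder; the receiver repeats the division and accepts the block only if the remainder is zero. A minimal sketch with an illustrative degree-4 generator x^4 + x + 1 (chosen for brevity, not the polynomial of any particular standard):

def gf2_remainder(bits, gen):
    # Remainder of GF(2) polynomial long division (operands are 0/1 lists, MSB first)
    bits = list(bits)
    for i in range(len(bits) - len(gen) + 1):
        if bits[i]:                      # subtract (XOR in) the generator when the lead bit is 1
            for j, g in enumerate(gen):
                bits[i + j] ^= g
    return bits[-(len(gen) - 1):]

gen = [1, 0, 0, 1, 1]                    # x^4 + x + 1, illustrative only
msg = [1, 1, 0, 1, 0, 1, 1, 0]
crc = gf2_remainder(msg + [0] * 4, gen)  # shift the message up by deg(gen), keep the remainder
codeword = msg + crc
print(gf2_remainder(codeword, gen))      # [0, 0, 0, 0]: accepted
codeword[3] ^= 1
print(gf2_remainder(codeword, gen))      # nonzero: corruption detected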
Convolutional Codes
• Block codes require a buffer
• What if data is available serially, bit by bit? Convolutional codes handle this case
• Example: k = 1, n = 2, rate R = 1/2 (a sketch of one such encoder follows below)
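The slide's encoder diagram is not reproduced here; as one concrete rate-1/2 (k = 1, n = 2) example, the sketch below assumes the widely taught constraint-length-3 encoder with generator polynomials (7, 5) in octal. Which encoder the lecture actually uses is not recoverable from the text, so treat this as an illustration only.

def conv_encode(bits):
    # Rate-1/2 convolutional encoder with generators (7, 5) octal:
    # out1 = u ^ s1 ^ s2, out2 = u ^ s2, where (s1, s2) is the shift register
    s1 = s2 = 0
    out = []
    for u in bits:
        out += [u ^ s1 ^ s2, u ^ s2]   # two coded bits per input bit, so R = 1/2
        s1, s2 = u, s1                 # shift the register (a 4-state machine)
    return out

print(conv_encode([1, 0, 1, 1]))       # 8 coded bits for 4 message bits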
Convolutional Codes
• The encoder consists of shift registers forming a finite state machine
• Decoding is also simple: the Viterbi decoder works by tracking these states (see the sketch below)
• First used by NASA in the Voyager space programme
• Extensively used for coding speech data in mobile phones
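Continuing the illustrative (7, 5) encoder from the previous sketch, a hard-decision Viterbi decoder keeps, for each of the four register states, the lowest-Hamming-distance path into that state and extends all of them one received pair at a time:

def conv_encode(bits):
    # Same illustrative (7, 5) rate-1/2 encoder as above
    s1 = s2 = 0
    out = []
    for u in bits:
        out += [u ^ s1 ^ s2, u ^ s2]
        s1, s2 = u, s1
    return out

def viterbi_decode(rx):
    # Hard-decision Viterbi decoding: a state is the register pair (s1, s2),
    # and the path metric is the Hamming distance to the received bits.
    INF = float("inf")
    states = [(a, b) for a in (0, 1) for b in (0, 1)]
    metric = {s: (0 if s == (0, 0) else INF) for s in states}   # encoder starts at 00
    paths = {s: [] for s in states}
    for i in range(0, len(rx), 2):
        r = rx[i:i + 2]
        new_metric = {s: INF for s in states}
        new_paths = {}
        for (s1, s2) in states:
            for u in (0, 1):
                out = [u ^ s1 ^ s2, u ^ s2]
                nxt = (u, s1)
                m = metric[(s1, s2)] + (out[0] != r[0]) + (out[1] != r[1])
                if m < new_metric[nxt]:                 # keep the survivor into nxt
                    new_metric[nxt] = m
                    new_paths[nxt] = paths[(s1, s2)] + [u]
        metric, paths = new_metric, new_paths
    best = min(states, key=lambda s: metric[s])         # best surviving end state
    return paths[best]

rx = conv_encode([1, 0, 1, 1])
rx[2] ^= 1                         # one channel error
print(viterbi_decode(rx))          # [1, 0, 1, 1]: the error is corrected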
Achieving Capacity
• Do block codes and convolutional codes achieve the Shannon capacity? In practice they fall far short of it
• Approaching capacity requires large block lengths k
• Decoder complexity for both code families grows exponentially with k, which is not feasible to implement
Turbo Codes
• Proposed by Berrou & Glavieux in 1993
• Advantages
  – Use very large block lengths
  – Have feasible decoding complexity
  – Perform very close to capacity
• Limitations: delay and complexity
Summary
• There is a limit on how good codes can be
• Linear block codes and convolutional codes have traditionally been used for error detection and correction
• Turbo codes, introduced in 1993, brought a new way of designing very good codes with feasible decoding complexity
