INFORMATION THEORY & CODING
Claude Shannon: Father of Digital Communication
[Block diagram: Information Source (e.g., English symbols) → Encoder (e.g., English to 0,1 sequence) → Communication Channel → Decoder → Destination]
The lower bound corresponds to no uncertainty, which occurs when one symbol has probability P(xi) = 1 and P(xj) = 0 for j ≠ i, so that X emits the same symbol xi all the time.
The upper bound corresponds to the maximum uncertainty which occurs when
𝑃(𝑥𝑖)=1/m for all 𝑖, that is, when all symbols are equally likely to be emitted by X.
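In symbols, for an m-symbol source (the standard bounds, stated here for reference):
0 ≤ H(X) = −Σ P(xi) log2 P(xi) ≤ log2 m, with the sum over i = 1, …, m.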
Consider a source alphabet S = {s1, s2} with probabilities p = {1/256, 255/256}. Find the entropy.
Consider a source alphabet S = {s1, s2} with probabilities p = {7/16, 9/16}. Find the entropy.
Consider a source alphabet S = {s1, s2} with probabilities p = {1/2, 1/2}. Find the entropy.
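The three entropies above can be checked numerically; a minimal Python sketch (the helper name entropy is mine):

```python
from math import log2

def entropy(probs):
    """Shannon entropy, in bits, of a discrete distribution."""
    return -sum(p * log2(p) for p in probs if p > 0)

for probs in [(1/256, 255/256), (7/16, 9/16), (1/2, 1/2)]:
    print(probs, f"H = {entropy(probs):.4f} bits")
# H ≈ 0.0369, 0.9887 and 1.0000 bits: entropy is smallest for the most
# lopsided distribution and peaks at 1 bit for equally likely symbols.
```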
Source Entropy Rate (Information Rate)
V(i) V1 V2 V3 V4 V5 V6
P(i) 1/6 1/3 1/12 1/12 1/6 1/6
A discrete source emits one of six symbols once every msec. The symbol probabilities are 1/2, 1/4, 1/8, 1/16, 1/32 and 1/32 respectively. Find the source entropy and information rate.
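Worked out from the given probabilities:
H = (1/2)(1) + (1/4)(2) + (1/8)(3) + (1/16)(4) + 2 × (1/32)(5) = 1.9375 bits/symbol
With r = 1 symbol/msec = 1000 symbols/s, the information rate is R = rH = 1937.5 bits/s.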
A card is drawn from a deck.
a) How much information did you receive if you are told it is a spade?
b) Repeat (a) if it is an ace.
c) Repeat (a) if it is the ace of spades.
d) Verify that the information obtained in (c) is the sum of the information obtained in (a) and (b).
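Worked out, assuming a standard 52-card deck:
a) I(spade) = log2(52/13) = log2 4 = 2 bits
b) I(ace) = log2(52/4) = log2 13 ≈ 3.70 bits
c) I(ace of spades) = log2 52 ≈ 5.70 bits
d) log2 52 = log2 4 + log2 13, since suit and rank are independent.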
An analog signal is bandlimited to 500 Hz and is sampled at the Nyquist rate. The samples are quantized into 4 levels. The quantization levels are assumed to be independent and occur with probabilities p1 = p4 = 1/8, p2 = p3 = 3/8. Find the information rate.
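Worked out: fs = 2 × 500 = 1000 samples/s;
H = 2(1/8) log2 8 + 2(3/8) log2(8/3) ≈ 0.75 + 1.061 ≈ 1.811 bits/sample;
R = fs H ≈ 1811 bits/s.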
A zero-memory source has a source alphabet S = {s1, s2, s3} with p = {1/2, 1/4, 1/4}. Find the entropy of the second extension and verify that H(S²) = 2H(S).
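A sketch verifying the identity for this source (pair probabilities multiply because the source is zero-memory; variable names are mine):

```python
from math import log2
from itertools import product

p = {'s1': 1/2, 's2': 1/4, 's3': 1/4}
H1 = -sum(v * log2(v) for v in p.values())     # H(S) = 1.5 bits
# Second extension: all ordered pairs of symbols, with product probabilities
p2 = {a + b: p[a] * p[b] for a, b in product(p, p)}
H2 = -sum(v * log2(v) for v in p2.values())    # H(S^2) = 3.0 bits
print(H1, H2)                                  # verifies H(S^2) = 2 H(S)
```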
Entropy
Communication Channel
Interchanging A and B
A DMS has an alphabet with K different symbols, and the kth symbol sk occurs with probability pk, k = 0, 1, …, K − 1.
SOURCE CODING
Symbol   Codeword
S1       0
S2       10
S3       110
S4       1110
S5       1111
Coding efficiency & Coding redundancy
Nth extension
Shannon-Fano encoding algorithm
Divide the whole set of source symbols into two subsets, each containing only consecutive symbols of the list, in such a way that the probabilities of the two subsets are as close as possible. Then assign '1' (respectively '0') to the symbols of the top (respectively bottom) subset.
Apply the process of the previous step to every subset containing at least two symbols.
The algorithm ends when only subsets with one symbol are left (a Python sketch follows the example table below).
Symbol:      S1    S2    S3    S4    S5    S6    S7
Probability: 0.30  0.30  0.12  0.12  0.06  0.06  0.04
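A recursive Python sketch of the procedure, applied to the table above (it assumes the symbol list is already sorted by decreasing probability; function and variable names are mine):

```python
def shannon_fano(symbols):
    """Shannon-Fano codes for (symbol, probability) pairs,
    assumed sorted in decreasing order of probability."""
    if len(symbols) == 1:
        return {symbols[0][0]: ''}
    total = sum(p for _, p in symbols)
    # pick the split that makes the two subset probabilities closest
    run, best_i, best_diff = 0.0, 1, float('inf')
    for i in range(1, len(symbols)):
        run += symbols[i - 1][1]
        diff = abs(2 * run - total)
        if diff < best_diff:
            best_i, best_diff = i, diff
    # per the slide's convention: '1' for the top subset, '0' for the bottom
    codes = {s: '1' + c for s, c in shannon_fano(symbols[:best_i]).items()}
    codes.update({s: '0' + c for s, c in shannon_fano(symbols[best_i:]).items()})
    return codes

src = [('S1', 0.30), ('S2', 0.30), ('S3', 0.12), ('S4', 0.12),
       ('S5', 0.06), ('S6', 0.06), ('S7', 0.04)]
print(shannon_fano(src))
```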
Huffman Coding
For r-ary Huffman coding, the number of source symbols q must satisfy
q = r + (r − 1)α
for a non-negative integer α, i.e. α = (q − r)/(r − 1); for r = 3 this gives α = (q − 3)/2, and for r = 4, α = (q − 4)/3. Where α does not come out an integer, dummy symbols of zero probability are appended so that it does.
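For the binary case (r = 2, so no dummy symbols are needed for any q), a minimal Huffman sketch over the same seven-symbol source (names are mine):

```python
import heapq

def huffman(symbols):
    """Binary Huffman codes for (symbol, probability) pairs."""
    # heap entries carry a tie-breaking counter so dicts are never compared
    heap = [(p, i, {s: ''}) for i, (s, p) in enumerate(symbols)]
    heapq.heapify(heap)
    count = len(heap)
    while len(heap) > 1:
        p1, _, c1 = heapq.heappop(heap)      # two least-probable subtrees
        p2, _, c2 = heapq.heappop(heap)
        merged = {s: '0' + c for s, c in c1.items()}
        merged.update({s: '1' + c for s, c in c2.items()})
        heapq.heappush(heap, (p1 + p2, count, merged))
        count += 1
    return heap[0][2]

src = [('S1', 0.30), ('S2', 0.30), ('S3', 0.12), ('S4', 0.12),
       ('S5', 0.06), ('S6', 0.06), ('S7', 0.04)]
codes = huffman(src)
print(codes, sum(p * len(codes[s]) for s, p in src))  # average length in bits
```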
SPECIAL CHANNELS
4. Noiseless channel
5. Deterministic channel
6. Cascade channel
BINARY SYMMETRIC CHANNEL
p is the probability of error
Channel matrix
On substitution
BANDWIDTH – SNR TRADEOFF
B2 = 3 kHz
Noise power is N = ηB, so N1 = ηB1 = η(4 kHz) and N2 = ηB2 = η(3 kHz).
With S1/N1 = 7, C = 4 kHz × log2(1 + 7) = 12 kbit/s; keeping the same C at B2 = 3 kHz requires log2(1 + S2/N2) = 4, i.e. S2/N2 = 15. Hence
S2/S1 = 15N2/(7N1) = (15 × η × 3 kHz)/(7 × η × 4 kHz) ≈ 1.61
This means a 60% increase in signal power is required to maintain the same channel capacity when the bandwidth is reduced from 4 kHz to 3 kHz.
BANDWIDTH – SNR TRADEOFF
b. SNR is 30 dB.
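For part (b), whose setup is carried over from part (a): an SNR of 30 dB means S/N = 10^(30/10) = 1000, so C = B log2(1 + 1000) ≈ 9.97 B bits/s for bandwidth B.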
Differential entropy
Consider a continuous random variable having the distribution given below. Find the differential entropy H(X).
A continuous random variable X is uniformly distributed in the interval [0, 4]. Find the differential entropy H(X). Suppose that X is a voltage which is applied to an amplifier whose gain is 8. Find the differential entropy of the output of the amplifier.
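Worked out: H(X) = log2(4 − 0) = 2 bits. The amplifier output Y = 8X is uniform on [0, 32], so H(Y) = log2 32 = 5 bits; equivalently H(Y) = H(X) + log2|8| = 2 + 3 bits.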
INTRODUCTION TO ALGEBRA
GROUP
A group is a set G together with a binary operation * that is associative, has an identity element in G, and under which every element of G has an inverse.
FIELDS
A field is a set F with two operations, addition and multiplication, such that F is a commutative group under addition, the nonzero elements of F form a commutative group under multiplication, and multiplication distributes over addition. The binary field GF(2) = {0, 1} is the field most used in coding.
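As a concrete illustration of the smallest field, GF(2) = {0, 1}, a minimal sketch (the class is mine, purely illustrative):

```python
class GF2:
    """Element of the binary field GF(2): addition is XOR, multiplication is AND."""
    def __init__(self, v):
        self.v = v & 1
    def __add__(self, other):
        return GF2(self.v ^ other.v)
    def __mul__(self, other):
        return GF2(self.v & other.v)
    def __repr__(self):
        return str(self.v)

one = GF2(1)
print(one + one)   # 0: in GF(2) every element is its own additive inverse
print(one * one)   # 1: the nonzero elements form the trivial group {1}
```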
CODES FOR ERROR DETECTION AND CORRECTION
Three approaches can be used to cope with data
transmission errors.
1. Using codes to detect errors.
Dataword d = (d1,d2…,dk)
Dataword length k = 4
Codeword length n = 7
That is, the additional bit ensures that there is an even or odd number of '1's in the codeword.
PARITY CODES – EXAMPLE
Even parity
Dataword   Codeword
000        0000
001        0011
010        0101
011        0110
100        1001
101        1010
110        1100
111        1111
PARITY CODES
To decode:
Calculate the sum of the received bits in the block (mod 2).
If the sum is 0 (1) for even (odd) parity, then the dataword is the first k bits of the received codeword; otherwise an error has occurred.
The code can detect single errors, but cannot correct them, since the error could be in any bit of the received word.
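A sketch of the even/odd parity rule just described (function names are mine):

```python
def parity_encode(data, even=True):
    """Append one bit so the codeword has even (or odd) overall parity."""
    p = sum(data) % 2
    return data + [p if even else 1 - p]

def parity_check(codeword, even=True):
    """True if the received block passes the parity check."""
    return sum(codeword) % 2 == (0 if even else 1)

print(parity_encode([0, 1, 1]))       # [0, 1, 1, 0], matching the table
print(parity_check([0, 1, 1, 1]))     # False: a single error is detected
```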
HAMMING DISTANCE
A code with minimum distance dmin can detect up to dmin − 1 errors. The maximum number of correctable errors is given by t = ⌊(dmin − 1)/2⌋.
The coefficients pij are chosen in such a way that the rows of
the generator matrix are linearly independent and the parity-
check equations are unique
LINEAR BLOCK CODES
b = mP
The generator matrix G is in canonical (systematic) form, in that its k rows are linearly independent: it is not possible to express any row of G as a linear combination of the remaining rows.
The full set of codewords, referred to simply as the code, is generated as
c = mG
by letting the message vector m range over the set of all 2^k binary k-tuples (1-by-k vectors).
Sum of any two codewords in the code is another codeword
G = [g0; g1; g2; g3] =
[ 1 1 0 1 0 0 0 ]
[ 0 1 1 0 1 0 0 ]
[ 1 1 1 0 0 1 0 ]
[ 1 0 1 0 0 0 1 ]
Syndrome: s = rH^T
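With the generator matrix above in the form G = [P | I4], a parity-check matrix is H = [I3 | P^T], and the syndrome s = rH^T vanishes exactly on codewords. A minimal sketch (the example message and error position are mine):

```python
import numpy as np

# G = [P | I4] for the (7,4) code, rows g0..g3 as on the slide
G = np.array([[1, 1, 0, 1, 0, 0, 0],
              [0, 1, 1, 0, 1, 0, 0],
              [1, 1, 1, 0, 0, 1, 0],
              [1, 0, 1, 0, 0, 0, 1]])
P = G[:, :3]
H = np.hstack([np.eye(3, dtype=int), P.T])   # parity-check matrix [I3 | P^T]

m = np.array([1, 0, 1, 1])                   # an arbitrary example message
c = (m @ G) % 2                              # codeword c = mG
print(c, (c @ H.T) % 2)                      # syndrome of a codeword is all-zero

r = c.copy()
r[2] ^= 1                                    # flip one bit: a single error
print((r @ H.T) % 2)                         # nonzero syndrome s = rH^T exposes it
```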
SYNDROME: PROPERTIES
From the closure property of linear block codes, the sum (or
difference) of two code vectors is another code vector.
w(e) ≤ t
MINIMUM DISTANCE CONSIDERATIONS
The receiver has the task of partitioning the 2^n possible received vectors into 2^k disjoint subsets in such a way that the ith subset Di corresponds to code vector ci for 1 ≤ i ≤ 2^k.
The received vector r is decoded into ci if it is in the ith subset.
To construct the standard array:
1. The 2^k code vectors are placed in a row with the all-zero code vector c1 as the leftmost element.
Number of single-error patterns = 7 (one per bit position for n = 7).
c6=d1⊕d3
1. Write down the G matrix
2. Construct all possible code words
3. Suppose r = [010111], then find the syndrome.
A (7,4) linear code with n = 7 and k = 4 and the corresponding G matrix:
A cyclic shift (c1 c2 c3 … cn−1 c0) of any codeword is again a codeword:
1010001    1110010
1101000    0111001
0110100    1011100
0011010    0101110
0001101    0010111
1000110    1001011
0100011    1100101
CYCLIC CODES
Notice that the rows are merely cyclic shifts of the basis vector.
A (7,4) cyclic code is generated by g(X) = 1 + X + X³. Find the code polynomial and codeword.
SYSTEMATIC FORM OF GENERATOR MATRIX
In a (7,4) code, g(X) = 1 + X + X³; if m = (1010), find c.
Converting to systematic form:
In a (7,4) code, g(X) = 1 + X + X³ and m(X) = 1 + X³. Find c(X) in systematic form.
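A sketch of systematic cyclic encoding by polynomial division over GF(2) (I assume the convention that bit i holds the coefficient of X^i; function names are mine). For m(X) = 1 + X³ with g(X) = 1 + X + X³ it yields c = (0111001), parity bits first:

```python
def poly_mod(dividend, divisor):
    """Remainder of binary polynomial division; bit i = coefficient of X^i."""
    rem = list(dividend)
    for i in range(len(dividend) - len(divisor), -1, -1):
        if rem[i + len(divisor) - 1]:        # leading term present: cancel it
            for j, b in enumerate(divisor):
                rem[i + j] ^= b
    return rem[:len(divisor) - 1]            # remainder has degree < deg g

def encode_systematic(m, g, n):
    """c(X) = (X^{n-k} m(X) mod g(X)) + X^{n-k} m(X): parity, then message."""
    shifted = [0] * (n - len(m)) + list(m)   # X^{n-k} m(X)
    return poly_mod(shifted, g) + list(m)

g = [1, 1, 0, 1]                             # g(X) = 1 + X + X^3
print(encode_systematic([1, 0, 0, 1], g, 7)) # m(X) = 1 + X^3 -> [0,1,1,1,0,0,1]
```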
ENCODER FOR CYCLIC CODES
Draw the encoder for the (7,4) code. Find c if m = (1011) and g(X) = 1 + X + X³.
CONVOLUTIONAL CODES
In block coding, the encoder accepts k-bit message blocks and generates an n-bit codeword; thus codewords are produced on a block-by-block basis.
Here the encoder contains memory, and the n encoder outputs at any given time unit depend not only on the k inputs but also on the m previous input blocks.
An (n, k, m) convolutional code can be implemented with a k-input, n-output linear sequential circuit with input memory m.
ENCODING OF CONVOLUTIONAL CODES – TIME DOMAIN REPRESENTATION
Two categories: feedforward and feedback.
Each can be systematic or non-systematic.
GENERAL ENCODER
c = (1 1 0 1 0 0 0 1 1 1)
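The codeword above is consistent with a (2,1,2) feedforward encoder having generator sequences g(1) = (1 0 1) and g(2) = (1 1 1) and message m = (1 0 1); the encoder figure itself is not reproduced here, so treat those generators as an assumption. A sketch:

```python
def conv_encode(msg, g1=(1, 0, 1), g2=(1, 1, 1)):
    """Rate-1/2 feedforward convolutional encoder with memory m = 2.
    g1, g2 are assumed generator sequences; the message is flushed with
    m zeros so the encoder returns to the all-zero state."""
    memory = len(g1) - 1
    state = [0] * memory                     # shift-register contents
    out = []
    for bit in list(msg) + [0] * memory:     # message followed by flush bits
        window = [bit] + state
        out.append(sum(a & b for a, b in zip(window, g1)) % 2)
        out.append(sum(a & b for a, b in zip(window, g2)) % 2)
        state = [bit] + state[:-1]
    return out

print(conv_encode([1, 0, 1]))  # -> [1, 1, 0, 1, 0, 0, 0, 1, 1, 1], as above
```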
STATE DIAGRAM REPRESENTATION
The state of an (n,1,m) convolutional encoder is defined as the contents of the first m − 1 shift registers. Thus the encoder can be represented as a finite-state machine. The zero state is the state in which each of the first m − 1 shift registers contains 0. In total there are 2^(m−1) possible states.
BINARY NONSYSTEMATIC FEEDFORWARD ENCODER
THANK YOU