Communication Channels
Course Name : Information Theory and Coding
Course Code : 19ECE312
Semester : 6th Sem – ECE
Communication Channels - Introduction
[Figure 3.1: Block diagram of a data communication system. Binary input → Channel Encoder → Modulator → Transmission Medium → Demodulator → Channel Decoder → Binary output, with electrical noise entering the transmission medium. Numbered points (1)–(7) mark the interfaces between the blocks; the analog "modulation channel" lies between the modulator output and the demodulator input, and the full path forms the discrete data communication channel.]
Communication Channels - Introduction
• The portion between points (2) and (6) in Figure 3.1 represents the discrete channel, referred to as the "coding channel", which accepts a sequence of symbols at its input and produces a sequence of symbols at its output.
Discrete Communication Channels
• A communication channel whose input and output each
have an alphabet of distinct letters, or, in the case of a
physical channel, whose input and output are signals that
are discrete in time and amplitude.
Discrete Communication Channels
• Due to errors in the channel, the output symbol may differ from the input symbol during any given symbol interval.
Representation of a Channel
Channel Matrix
• In total there are r × s conditional probabilities, which are represented in matrix form with the input symbols arranged row-wise and the output symbols column-wise.
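A minimal sketch of this layout, assuming input alphabet A = {a1, …, ar} and output alphabet B = {b1, …, bs}: the channel matrix collects the conditional probabilities p(bj | ai) as

$$
P(B \mid A) =
\begin{bmatrix}
p(b_1 \mid a_1) & p(b_2 \mid a_1) & \cdots & p(b_s \mid a_1) \\
p(b_1 \mid a_2) & p(b_2 \mid a_2) & \cdots & p(b_s \mid a_2) \\
\vdots & \vdots & \ddots & \vdots \\
p(b_1 \mid a_r) & p(b_2 \mid a_r) & \cdots & p(b_s \mid a_r)
\end{bmatrix},
\qquad
\sum_{j=1}^{s} p(b_j \mid a_i) = 1 \ \text{for each } i.
$$

Each row sums to 1 because, whatever symbol is transmitted, some output symbol must be received.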
Communication Channels
Course Name : Information Theory and Coding
Course Code : 19ECE312
Semester : 6th Sem – ECE
Channel Diagram
Example 3.1: Consider a transmitter emitting discrete symbols from an input alphabet A. Let the symbols be {a1, a2, a3}, encoded in binary as 00, 01 and 10 respectively. Due to noise present in the channel, four symbols may be received at the receiver, with output alphabet B given by {b1, b2, b3, b4} and code words 00, 01, 10 and 11. Note that the code word "11" is never transmitted; it appears at the receiver only because of noise. The complete diagram showing all the symbols at the input and output of the channel is shown in Figure 3.3. Such a diagram is called a "Channel Diagram" or "Noise Diagram".
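A small sketch of this channel in code; the transition probabilities below are hypothetical (the slide gives no numbers) and serve only to show the shape of the 3 × 4 channel matrix:

```python
import numpy as np

# Input alphabet A = {a1, a2, a3} (code words 00, 01, 10); output
# alphabet B = {b1, b2, b3, b4} (code words 00, 01, 10, 11).
# Hypothetical transition probabilities: each row lists p(bj | ai)
# for one input symbol and must sum to 1.
P_B_given_A = np.array([
    [0.90, 0.04, 0.04, 0.02],  # p(b1|a1) ... p(b4|a1)
    [0.04, 0.90, 0.02, 0.04],  # p(b1|a2) ... p(b4|a2)
    [0.04, 0.02, 0.90, 0.04],  # p(b1|a3) ... p(b4|a3)
])

# Every row of a channel matrix is a probability distribution.
assert np.allclose(P_B_given_A.sum(axis=1), 1.0)

# b4 ("11") is never transmitted, yet every row assigns it nonzero
# probability: it can appear at the receiver only through noise.
print(P_B_given_A[:, 3])  # [0.02 0.04 0.04]
```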
Joint Probability
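A brief sketch, assuming the usual definitions: the joint probability of transmitting a_i and receiving b_j is

$$
p(a_i, b_j) = p(a_i)\, p(b_j \mid a_i),
$$

and collecting all r × s such terms (inputs row-wise, outputs column-wise) gives the joint probability matrix (JPM) P(A, B).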
Properties of Joint Probability Matrix
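The standard properties, stated compactly: summing a row of the JPM gives the corresponding input probability, summing a column gives the corresponding output probability, and all entries together sum to one:

$$
\sum_{j=1}^{s} p(a_i, b_j) = p(a_i),
\qquad
\sum_{i=1}^{r} p(a_i, b_j) = p(b_j),
\qquad
\sum_{i=1}^{r} \sum_{j=1}^{s} p(a_i, b_j) = 1.
$$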
Example 2
Entropy Functions and Equivocation
Course Name : Information Theory and Coding
Course Code : 19ECE312
Semester : 6th Sem – ECE
A Priori Entropy
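A sketch under the usual definitions: before any symbol is received, the uncertainty about the transmitted symbol is the a priori (source) entropy

$$
H(A) = -\sum_{i=1}^{r} p(a_i) \log_2 p(a_i) \ \text{bits/symbol}.
$$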
A Posteriori (Conditional) Entropy
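After a particular output symbol b_j has been received, the remaining uncertainty about which symbol was transmitted is the a posteriori entropy

$$
H(A \mid b_j) = -\sum_{i=1}^{r} p(a_i \mid b_j) \log_2 p(a_i \mid b_j).
$$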
Equivocation
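Averaging the a posteriori entropy over all received symbols gives the equivocation, the average information about the input that is lost in the channel:

$$
H(A \mid B) = \sum_{j=1}^{s} p(b_j)\, H(A \mid b_j)
            = -\sum_{i=1}^{r} \sum_{j=1}^{s} p(a_i, b_j) \log_2 p(a_i \mid b_j).
$$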
Mutual Information
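The mutual information is the average information actually conveyed across the channel: the a priori uncertainty minus what remains after reception,

$$
I(A;B) = H(A) - H(A \mid B) = H(B) - H(B \mid A) = H(A) + H(B) - H(A,B) \ \text{bits/symbol}.
$$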
Properties of Mutual Information
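The standard properties, for reference:

$$
I(A;B) \ge 0, \qquad
I(A;B) = I(B;A), \qquad
I(A;B) = 0 \iff A,\, B \ \text{independent}, \qquad
I(A;B) \le \min\{H(A), H(B)\}.
$$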
Entropy Functions and Equivocation
Course Name : Information Theory and Coding
Course Code : 19ECE312
Semester : 6th Sem – ECE
Example 1
Compute the various entropies of the channel described by the JPM shown below.
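The JPM itself is not reproduced here; as a method sketch, the matrix below is a hypothetical stand-in (entries sum to 1), and the code computes every entropy this kind of example asks for:

```python
import numpy as np

# Hypothetical JPM P(A,B): rows = input symbols, columns = output
# symbols; all entries sum to 1. Replace with the matrix from the slide.
P_AB = np.array([
    [0.30, 0.05, 0.00],
    [0.00, 0.25, 0.05],
    [0.05, 0.10, 0.20],
])
assert np.isclose(P_AB.sum(), 1.0)

def H(p):
    """Entropy in bits of a probability vector (0*log 0 taken as 0)."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

p_A = P_AB.sum(axis=1)            # row sums -> input probabilities p(a_i)
p_B = P_AB.sum(axis=0)            # column sums -> output probabilities p(b_j)

H_A = H(p_A)                      # a priori entropy H(A)
H_B = H(p_B)                      # output entropy H(B)
H_AB = H(P_AB.flatten())          # joint entropy H(A,B)
H_A_given_B = H_AB - H_B          # equivocation H(A|B)
H_B_given_A = H_AB - H_A          # noise entropy H(B|A)
I_AB = H_A - H_A_given_B          # mutual information I(A;B)

print(f"H(A)={H_A:.3f}  H(B)={H_B:.3f}  H(A,B)={H_AB:.3f}")
print(f"H(A|B)={H_A_given_B:.3f}  H(B|A)={H_B_given_A:.3f}  I(A;B)={I_AB:.3f}")
```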
Example 2
Example 3
Channel Capacity
Course Name : Information Theory and Coding
Course Code : 19ECE312
Semester : 6th Sem – ECE
1
Rate of Information Transmission over a Discrete Channel
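A sketch of the standard definition, writing r_s for the channel's symbol rate: if the channel accepts r_s symbols per second and each symbol conveys I(A;B) bits on average, the rate of information transmission is

$$
R_t = r_s \, I(A;B) \ \text{bits/sec}.
$$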
Capacity of Discrete Memoryless Channel
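For a discrete memoryless channel the transition probabilities are fixed, so I(A;B) depends only on the input distribution; the channel capacity is its maximum:

$$
C = \max_{\{p(a_i)\}} I(A;B) \ \text{bits/symbol},
\qquad
C_t = r_s\, C \ \text{bits/sec}.
$$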
Shannon’s Theorem on Channel Capacity
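Stated informally: if the transmission rate satisfies R ≤ C, there exists a coding scheme that makes the probability of error arbitrarily small; if R > C, no such scheme exists and the error probability is bounded away from zero.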
Channel Efficiency and Redundancy
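As commonly defined in this setting (a hedged sketch): channel efficiency is the fraction of the capacity actually used, and redundancy is what remains,

$$
\eta = \frac{I(A;B)}{C}, \qquad R_\eta = 1 - \eta = \frac{C - I(A;B)}{C}.
$$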
Special Channels
Symmetric / Uniform Channel
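A sketch of the usual definition: a channel is symmetric (uniform) when every row of the channel matrix is a permutation of every other row, and likewise every column of every other column; for example, the binary symmetric channel

$$
P(B \mid A) =
\begin{bmatrix}
1-p & p \\
p & 1-p
\end{bmatrix}.
$$

The noise entropy H(B|A) then equals the entropy of any single row, independent of the input distribution.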
Channel Capacity of Symmetric / Uniform Channel
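The standard result, sketched: since H(B|A) equals the entropy of any row p' = (p'_1, …, p'_s) of the channel matrix, I(A;B) = H(B) − H(p'), and H(B) reaches its maximum log2 s with equiprobable inputs, giving

$$
C = \log_2 s + \sum_{j=1}^{s} p'_j \log_2 p'_j = \log_2 s - H(p') \ \text{bits/symbol}.
$$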
Example 1
Different Types of Channels
Course Name : Information Theory and Coding
Course Code : 19ECE312
Semester : 6th Sem – ECE
Binary Erasure Channel (BEC)
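A compact sketch of the standard BEC: the input is binary, but the receiver may output a third symbol, the erasure e, when a bit is lost. With erasure probability p,

$$
P(B \mid A) =
\begin{bmatrix}
1-p & p & 0 \\
0 & p & 1-p
\end{bmatrix}
\quad \text{(outputs ordered } 0,\, e,\, 1\text{)},
\qquad
C = 1 - p \ \text{bits/symbol}.
$$

Intuitively, a fraction p of the transmitted bits is erased, so at most a fraction 1 − p of the information gets through.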
Noiseless Channel
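In the usual classification: a channel is noiseless when each column of the channel matrix has exactly one nonzero element, so the received symbol identifies the transmitted symbol with certainty. Then H(A|B) = 0, I(A;B) = H(A), and with r input symbols

$$
C = \max H(A) = \log_2 r \ \text{bits/symbol}.
$$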
Example 1
Deterministic Channel
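Dually, a channel is deterministic when each row of the channel matrix has exactly one nonzero element (necessarily equal to 1), so the transmitted symbol fixes the received symbol. Then H(B|A) = 0, I(A;B) = H(B), and with s output symbols

$$
C = \max H(B) = \log_2 s \ \text{bits/symbol}.
$$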
Example 2
Cascaded Channel
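A sketch of the key facts: when two channels are cascaded (the output of the first feeds the second), the overall channel matrix is the product of the individual matrices,

$$
P(C \mid A) = P(B \mid A)\, P(C \mid B),
$$

and information can only be lost at each stage: I(A;C) ≤ I(A;B).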
Example 3
21
Example 3 - Solution
22
Example 3 - Solution
23
Example 3 - Solution
24
Example 3 - Solution
25
Example 4
Example 5
Example 6
36
Example 6 - Solution
37
Example 6 - Solution
38
Example 6 - Solution
39
Example 7
Example 8
Channel Capacity Theorem [Shannon–Hartley Law] & its Implications
Course Name : Information Theory and Coding
Course Code : 19ECE312
Semester : 6th Sem – ECE
Introduction
• The waveform received at the receiver is corrupted by a random process called NOISE.
1. Johnson noise (white noise) – due to the thermal motion of electrons
2. Additive noise – the noise adds to the signal
3. Fading noise – the noise multiplies the signal
"In most channels the noise is nearly white, with a distribution that resembles the Gaussian (normal) distribution with zero mean and some variance. Hence, this noise is also called additive white Gaussian noise (AWGN)."
Statement of Shannon Hartley Law
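The law (this is the Eq. (1) referred to below): for a channel of bandwidth B Hz corrupted by AWGN, with average signal power S and noise power N,

$$
C = B \log_2\!\left(1 + \frac{S}{N}\right) \ \text{bits/sec}. \tag{1}
$$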
Shannon Hartley Law (contd..)
• The significance of Eq. (1) is that it is possible to transmit over a channel of bandwidth B Hz, corrupted by AWGN, at a rate of C bits/sec with an arbitrarily small probability of error, provided the signal is encoded in such a manner that the samples are all Gaussian signals.
1st Implication:
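The first implication is usually the bandwidth limit, sketched here with N = ηB (η the noise power spectral density): capacity grows with bandwidth but saturates, because widening B also admits more noise. As B → ∞,

$$
C = B \log_2\!\left(1 + \frac{S}{\eta B}\right) \longrightarrow \frac{S}{\eta} \log_2 e \approx 1.44\, \frac{S}{\eta},
$$

so infinite bandwidth buys only a finite capacity for a given signal power.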
2nd Implication: Bandwidth – (S/N) Trade-Off
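Solving Eq. (1) for the signal-to-noise ratio shows the trade-off directly: for a fixed capacity C,

$$
\frac{S}{N} = 2^{C/B} - 1,
$$

so bandwidth and SNR can be exchanged, but unevenly: shrinking B forces an exponential increase in the required S/N, while increasing B relaxes it only gradually.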
BW–SNR Trade-Off Curve
Numerical 1:
Numerical 2:
A Gaussian channel has a 10 MHz bandwidth. If the (S/N) ratio is 100, calculate the channel capacity and the maximum information rate.
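A quick worked sketch using Eq. (1):

$$
C = B \log_2\!\left(1 + \frac{S}{N}\right) = 10^{7} \log_2(1 + 100) \approx 10^{7} \times 6.658 \approx 66.6 \ \text{Mbits/sec},
$$

and the maximum error-free information rate equals this capacity.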
Numerical 3:
Numericals on Channel Capacity Theorem [Shannon–Hartley Law] & its Implications
Course Name : Information Theory and Coding
Course Code : 19ECE312
Semester : 6th Sem – ECE
Numerical 1:
A black and white television picture may be viewed as consisting of approximately 3 × 10^5 elements, each of which may occupy one of 10 distinct brightness levels with equal probability. Assume that (a) the rate of transmission is 30 picture frames per second and (b) the signal-to-noise ratio is 30 dB. Using the channel capacity theorem (Shannon–Hartley law), calculate the minimum bandwidth required to support the transmission of the resulting video signal.
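A worked sketch in code (all numbers are taken from the statement above):

```python
import math

elements = 3e5            # picture elements per frame
levels = 10               # equiprobable brightness levels per element
frames = 30               # picture frames per second
snr_db = 30               # signal-to-noise ratio, dB

bits_per_element = math.log2(levels)        # ~3.322 bits
R = frames * elements * bits_per_element    # source rate ~29.9 Mbits/s
snr = 10 ** (snr_db / 10)                   # 30 dB -> 1000

# Shannon-Hartley: C = B log2(1 + S/N). Error-free transmission needs
# C >= R, so the minimum bandwidth is:
B_min = R / math.log2(1 + snr)
print(f"B_min ~ {B_min / 1e6:.2f} MHz")     # ~3.00 MHz
```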
Numerical 2:
An analog signal has a 4 kHz bandwidth. The signal is sampled at 2.5 times the Nyquist rate and each sample is quantized into 256 equally likely levels. Assume that the successive samples are statistically independent.
(i) Find the information rate of this source.
(ii) Can the output of this source be transmitted without errors over a Gaussian channel of bandwidth 50 kHz and (S/N) ratio of 20 dB?
(iii) If the output of this source is to be transmitted without errors over an analog channel having (S/N) of 10 dB, compute the bandwidth requirement of the channel.
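A worked sketch covering all three parts:

```python
import math

W = 4e3                              # signal bandwidth, Hz
fs = 2.5 * (2 * W)                   # 2.5 x Nyquist rate = 20 kS/s
bits_per_sample = math.log2(256)     # 8 bits (equally likely levels)

R = fs * bits_per_sample             # (i) information rate = 160 kbits/s

# (ii) capacity of a 50 kHz Gaussian channel at 20 dB (S/N = 100)
C = 50e3 * math.log2(1 + 10 ** (20 / 10))    # ~332.9 kbits/s
print(R <= C)                                # True: error-free is possible

# (iii) bandwidth needed at 10 dB (S/N = 10) to carry R
B = R / math.log2(1 + 10 ** (10 / 10))       # ~46.25 kHz
print(f"B ~ {B / 1e3:.2f} kHz")
```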
Numerical 3:
A Gaussian channel has a 10 MHz bandwidth. If the (S/N) ratio is 100, calculate the channel capacity and the maximum information rate.
Numerical 4:
A Gaussian channel has a bandwidth of 4 kHz and a two-sided noise power spectral density η/2 of 10^-14 W/Hz. The signal power at the receiver has to be maintained at a level less than or equal to 0.3 mW. Calculate the capacity of the channel.
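A worked sketch: with two-sided PSD η/2, the in-band noise power is N = ηB.

```python
import math

B = 4e3                   # bandwidth, Hz
eta = 2 * 1e-14           # two-sided PSD eta/2 = 1e-14 W/Hz -> eta = 2e-14
N = eta * B               # noise power in band = 8e-11 W
S = 0.3e-3                # maximum allowed signal power, W

C = B * math.log2(1 + S / N)
print(f"C ~ {C / 1e3:.1f} kbits/s")   # ~87.4 kbits/s
```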