The document discusses communication channels and their components. It describes: 1. The basic block diagram of a communication system consisting of a transmitter, physical channel, and receiver. 2. The different channels within the system including the coding channel, modulation channel, and data communication channel. 3. Sources of noise and disturbances that can corrupt signals and limit data transfer rates over channels. 4. Key concepts like channel capacity which represents the maximum error-free data transfer rate over a channel. 5. Representation of discrete channels using input/output alphabets and conditional probability matrices relating channel inputs and outputs.


Communication Channels

Course Name : Information Theory and Coding


Course Code : 19ECE312
Semester : 6th Sem – ECE

Communication Channels - Introduction
Fig. 3.1: Block Diagram of a Communication System. Binary input (1) → channel encoder → (2) → channel modulator → (3) → transmission medium (4), where electrical noise enters → (5) → channel demodulator → (6) → channel decoder → binary output (7). The encoder and modulator form the transmitter, the transmission medium is the physical channel, and the demodulator and decoder form the receiver. Points (2)–(6) span the (discrete) coding channel, points (3)–(5) the (analog) modulation channel, and points (1)–(7) the (discrete) data communication channel.
• A "channel", in the broad sense, is defined as the medium through which the coded signals generated by an information source are transmitted.

• The complete block diagram of a communication system, showing the various channels named in this terminology, is given in figure 3.1.

• A practical communication system consists of a transmitter, a physical channel and a receiver.

• The transmitter consists of an encoder and a modulator, while the receiver consists of a demodulator and a decoder.
• The portion between points (2) and (6) in figure 3.1 represents the discrete channel, referred to as the "coding channel", which accepts a sequence of symbols at its input and produces a sequence of symbols at its output.

• The communication channel between points (3) and (5) provides the electrical connection between the transmitter and the receiver.

❖ The input and output are analog electrical waveforms.

❖ This portion of the channel is often called the "continuous or modulation channel".

❖ Examples are voiceband and wideband telephone systems, high-frequency radio systems and troposcatter systems.
• The portion between points (1) and (7) in the diagram has binary data presented at the input and binary data recovered at the output; hence this channel is called the "data communication channel (discrete)".

• In the channel, there are various kinds of disturbances.

• These disturbances may be due to amplitude and frequency response variations of the channel within its passband, variations in channel characteristics, and non-linearities in the channel.

• In addition, the channel can also corrupt the signal with various types of additive and multiplicative noise.
• All these disturbances introduce errors in data transmission and limit the maximum rate at which data can be transferred over the channel.

• An important characteristic of a data communication system is the "channel capacity", which represents the maximum rate at which data can be transferred across the channel with an arbitrarily small probability of error.
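As one concrete instance of this definition, the binary symmetric channel (BSC) with crossover probability p has capacity C = 1 − H(p), where H is the binary entropy function. The BSC is used here purely as an illustration; it is not part of the original slides:

```python
from math import log2

def binary_entropy(p):
    """Binary entropy H(p) in bits; H(0) = H(1) = 0 by convention."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * log2(p) - (1 - p) * log2(1 - p)

def bsc_capacity(p):
    """Capacity of a binary symmetric channel with crossover probability p,
    in bits per channel use: C = 1 - H(p)."""
    return 1.0 - binary_entropy(p)

print(bsc_capacity(0.0))  # noiseless channel: 1 bit per use
print(bsc_capacity(0.5))  # totally noisy channel: 0 bits per use
```

A noiseless BSC (p = 0) achieves the full 1 bit per use, while at p = 0.5 the output is independent of the input and no error-free rate is possible.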
Discrete Communication Channels
• A discrete communication channel is one whose input and output each have an alphabet of distinct letters or, in the case of a physical channel, whose input and output are signals that are discrete in time and amplitude.

• The size of the alphabet, or the number of amplitude levels, is usually finite.

• The input to the channel is a symbol belonging to an alphabet "A" with "r" symbols.

• The output of the channel is a symbol belonging to some other alphabet "B" with "s" symbols.
• Due to errors in the channel, the output symbol may differ from the input symbol during any symbol interval.

• Errors are mainly due to noise in the analog portion of the channel.
Representation of a Channel

Fig. 3.2: Representation of a channel

• A communication channel is represented by an input alphabet A = {a1, a2, ……, ar} consisting of 'r' symbols, an output alphabet B = {b1, b2, ……, bs} consisting of 's' symbols, and a set of conditional probabilities P(bj/ai), with i = 1, 2, ….., r and j = 1, 2, ….., s, as shown in figure 3.2.
• These conditional probabilities arise due to the presence of noise in the channel.

• Because of this noise, there is some amount of uncertainty about the reception of any symbol.

• For this reason, the number of symbols 's' at the receiver may differ from the number of symbols 'r' at the transmitter.
Channel Matrix
• In total there are r × s conditional probabilities, which are represented in "matrix" form with the input symbols indexed row-wise and the output symbols column-wise.

• Such a matrix is called the "Channel Matrix" or "Noise Matrix", as given below:

Channel Diagram
Example 3.1: Consider a transmitter emitting discrete symbols from an input alphabet A. Let the symbols be {a1, a2, a3}, encoded in binary as 00, 01 and 10 respectively. Due to noise present in the channel, 4 symbols may be received at the receiver, with output alphabet B given by {b1, b2, b3, b4} and code words 00, 01, 10 and 11. Note that the code word "11" is never transmitted; it appears at the receiver only due to noise. The complete diagram showing all the symbols at the input and output of the channel is shown in figure 3.3. Such a diagram is called a "Channel Diagram" or "Noise Diagram".
Fig. 3.3: Illustrating the Channel or Noise diagram
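A numerical sketch of the 3 × 4 channel of Example 3.1 can be written as follows. All probability values here are assumed for illustration only; they are not taken from figure 3.3:

```python
# Hypothetical channel matrix for Example 3.1: rows = inputs {a1, a2, a3},
# columns = outputs {b1, b2, b3, b4}.  Values are assumed, not from Fig. 3.3.
P = [
    [0.90, 0.05, 0.03, 0.02],  # P(bj/a1)
    [0.04, 0.90, 0.04, 0.02],  # P(bj/a2)
    [0.03, 0.04, 0.90, 0.03],  # P(bj/a3)
]
p_a = [0.5, 0.3, 0.2]          # assumed input distribution P(ai)

# Each row of the channel matrix must sum to 1.
for row in P:
    assert abs(sum(row) - 1.0) < 1e-9

# Output distribution: P(bj) = sum_i P(ai) * P(bj/ai)
p_b = [sum(p_a[i] * P[i][j] for i in range(3)) for j in range(4)]
print(p_b)  # b4 (code word 11) gets nonzero probability purely from noise
```

Note that the last output probability is nonzero even though "11" is never transmitted, exactly as the example describes.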


Joint Probability

Properties of Joint Probability Matrix

Example 2
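The standard properties of the joint probability matrix (JPM), whose entries are p(a_i, b_j) = p(a_i) P(b_j/a_i), are:

```latex
\sum_{j=1}^{s} p(a_i, b_j) = p(a_i), \qquad
\sum_{i=1}^{r} p(a_i, b_j) = p(b_j), \qquad
\sum_{i=1}^{r} \sum_{j=1}^{s} p(a_i, b_j) = 1 .
```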
Entropy Functions and Equivocation
A Priori Entropy

A Posteriori (Conditional) Entropy

Equivocation

Mutual Information

Properties of Mutual Information
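These quantities are conventionally defined as follows for a channel with input alphabet A and output alphabet B:

```latex
H(A) = -\sum_{i=1}^{r} p(a_i)\,\log_2 p(a_i)
\quad \text{(a priori entropy)}

H(A/b_j) = -\sum_{i=1}^{r} p(a_i/b_j)\,\log_2 p(a_i/b_j)
\quad \text{(a posteriori entropy)}

H(A/B) = \sum_{j=1}^{s} p(b_j)\,H(A/b_j)
\quad \text{(equivocation)}

I(A;B) = H(A) - H(A/B) = H(B) - H(B/A)
\quad \text{(mutual information)}
```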
Example 1
Compute the various entropies of the channel described by the JPM shown below.
Example 1 - Solution

Example 2

Example 2 - Solution

Example 3

Example 3 - Solution
Channel Capacity
Rate of Information Transmission over a Discrete Channel

Capacity of Discrete Memoryless Channel

Shannon's Theorem on Channel Capacity

Channel Efficiency and Redundancy

Special Channels

Symmetric / Uniform Channel

Channel Capacity of Symmetric / Uniform Channel

Example 1
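For a symmetric (uniform) channel, every row of the channel matrix is a permutation of the same set of probabilities {p_1, …, p_s}, and the capacity has the standard closed form, achieved by a uniform input distribution:

```latex
C = \log_2 s + \sum_{j=1}^{s} p_j \log_2 p_j \quad \text{bits/symbol}
```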
Different Types of Channels
Binary Erasure Channel (BEC)
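A standard result for the BEC with erasure probability p is that its capacity is C = 1 − p bits per use, achieved by a uniform input. The following sketch checks this numerically by computing I(X;Y) = H(Y) − H(Y/X) directly:

```python
from math import log2

def bec_mutual_information(p, q=0.5):
    """I(X;Y) in bits for a binary erasure channel with erasure probability p
    and input distribution P(X=0) = q.  Outputs are {0, 1, erasure}."""
    def h(probs):  # entropy in bits, ignoring zero-probability terms
        return -sum(x * log2(x) for x in probs if x > 0)
    # Output distribution: P(Y=0), P(Y=1), P(Y=erasure)
    p_y = [q * (1 - p), (1 - q) * (1 - p), p]
    # H(Y/X) is the same for either input symbol: {1-p, p}
    return h(p_y) - h([1 - p, p])

print(bec_mutual_information(0.2))  # uniform input: 1 - 0.2 = 0.8 bits/use
```

With q = 0.5 the computed mutual information equals 1 − p for any p, confirming the closed-form capacity.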
Noiseless Channel

Example 1

Example 1 - Solution

Deterministic Channel

Example 2

Example 2 - Solution

Cascaded Channel

Example 3

Example 3 - Solution

Example 4

Example 5

Example 5 - Solution

Example 6

Example 6 - Solution

Example 7

Example 8
Channel Capacity Theorem [Shannon-Hartley Law] and its Implications
Introduction
• The waveform received at the receiver is corrupted by a random process called noise.
1. Johnson noise (white noise) – due to the thermal motion of electrons.
2. Additive noise – when noise adds to the signal.
3. Fading (multiplicative) noise – when noise multiplies the signal.
"In channels, the noise is almost white, with a distribution that resembles a Gaussian (normal) distribution with zero mean and some variance. Hence, this noise is also called additive white Gaussian noise (AWGN)."
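A minimal sketch of this AWGN model, using the standard library's Gaussian generator (zero mean, an assumed standard deviation) added sample-by-sample to an assumed antipodal signal:

```python
import random

random.seed(42)                       # fixed seed so the run is repeatable
sigma = 0.1                           # assumed noise standard deviation
signal = [1.0, -1.0, 1.0, 1.0, -1.0]  # assumed binary antipodal samples

# AWGN: independent zero-mean Gaussian noise added to each sample
received = [s + random.gauss(0.0, sigma) for s in signal]

# With sigma small relative to the signal levels, a simple sign detector
# recovers the transmitted samples.
decoded = [1.0 if r > 0 else -1.0 for r in received]
print(decoded)
```

Because flipping a ±1 sample would require a noise excursion of ten standard deviations here, the detector output matches the transmitted sequence.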
Statement of Shannon Hartley Law

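The Shannon–Hartley law is conventionally stated as follows: for a channel of bandwidth B Hz corrupted by AWGN, with average signal power S and noise power N,

```latex
C = B \log_2\!\left(1 + \frac{S}{N}\right) \quad \text{bits/sec} \qquad (1)
```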
Shannon Hartley Law (contd..)
• The significance of Eq. (1) is that it is possible to transmit over a channel of bandwidth B Hz, corrupted by AWGN, at a rate of 'C' bits/sec with an arbitrarily small probability of error, provided the signal is encoded in such a manner that the samples are all Gaussian signals.
1st Implication
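In standard treatments of the Shannon–Hartley law, the first implication concerns the limit of infinite bandwidth: writing the noise power as N = ηB (with η the one-sided noise power spectral density), the capacity does not grow without bound as B → ∞ but saturates at

```latex
C_\infty = \lim_{B \to \infty} B \log_2\!\left(1 + \frac{S}{\eta B}\right)
         = \frac{S}{\eta}\,\log_2 e \approx 1.44\,\frac{S}{\eta} \quad \text{bits/sec}
```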
2nd Implication: Bandwidth – (S/N) Trade-Off
• An important implication of the Shannon–Hartley law is the exchange of bandwidth for signal-to-noise power ratio, and vice versa, as given below:
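The trade-off can be illustrated numerically: solving Eq. (1) for the signal-to-noise ratio gives S/N = 2^(C/B) − 1, so for a fixed capacity the required S/N grows exponentially as the bandwidth shrinks. The target capacity below is assumed for illustration:

```python
from math import log2

def required_snr(capacity_bps, bandwidth_hz):
    """S/N needed to sustain a given capacity over a given bandwidth,
    from the Shannon-Hartley law: S/N = 2**(C/B) - 1."""
    return 2.0 ** (capacity_bps / bandwidth_hz) - 1.0

C = 10_000.0  # assumed fixed target capacity: 10 kbps
for B in (10_000.0, 5_000.0, 2_500.0):
    print(B, required_snr(C, B))
# Halving the bandwidth repeatedly pushes the required S/N from 1 to 3 to 15.
```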
BW-SNR Trade-Off Curve
Numerical 1

Solution
Numerical 2:
A Gaussian channel has 10 MHz bandwidth. If the (S/N) ratio is 100, calculate the channel capacity and the maximum information rate.

Solution
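A sketch of the computation via the Shannon-Hartley law; the maximum error-free information rate equals the capacity C:

```python
from math import log2

B = 10e6      # bandwidth, Hz
snr = 100.0   # S/N ratio (linear)

C = B * log2(1 + snr)  # Shannon-Hartley capacity, bits/sec
print(C / 1e6)         # ≈ 66.58 Mbps
```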
Numerical 3

Solution
Numericals on Channel Capacity Theorem [Shannon-Hartley Law] and its Implications
Numerical 1:
A black-and-white television picture may be viewed as consisting of approximately 3 × 10^5 elements, each of which may occupy one of 10 distinct brightness levels with equal probability. Assume (a) the rate of transmission is 30 picture frames per second and (b) the signal-to-noise ratio is 30 dB. Using the channel capacity theorem (Shannon-Hartley law), calculate the minimum bandwidth required to support the transmission of the resultant video signal.
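A sketch of the computation: the source information rate is (elements per frame) × (bits per element) × (frames per second), and the minimum bandwidth is the B for which the Shannon-Hartley capacity just equals that rate:

```python
from math import log2

elements = 3e5   # picture elements per frame
levels = 10      # equiprobable brightness levels
frames = 30      # frames per second
snr_db = 30.0

# Information rate of the source, bits/sec
R = elements * log2(levels) * frames   # ≈ 29.9 Mbps

# Minimum bandwidth so that C = B*log2(1 + S/N) >= R
snr = 10 ** (snr_db / 10)              # 30 dB -> 1000 (linear)
B = R / log2(1 + snr)
print(B / 1e6)                         # ≈ 3.0 MHz
```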
Numerical 2:
An analog signal has a 4 kHz bandwidth. The signal is sampled at 2.5 times the Nyquist rate and each sample is quantized into 256 equally likely levels. Assume that successive samples are statistically independent.
(i) Find the information rate of this source.
(ii) Can the output of this source be transmitted without errors over a Gaussian channel of bandwidth 50 kHz and (S/N) ratio of 20 dB?
(iii) If the output of this source is to be transmitted without errors over an analog channel having (S/N) of 10 dB, compute the bandwidth requirement of the channel.
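A sketch of all three parts:

```python
from math import log2

W = 4_000.0                  # signal bandwidth, Hz
fs = 2.5 * (2 * W)           # 2.5 x Nyquist rate = 20,000 samples/sec
bits_per_sample = log2(256)  # 256 equally likely levels -> 8 bits/sample

# (i) information rate of the source
R = fs * bits_per_sample     # 160,000 bits/sec
print(R)

# (ii) capacity of a 50 kHz Gaussian channel with S/N = 20 dB (= 100)
C2 = 50_000 * log2(1 + 10 ** (20 / 10))
print(C2, C2 > R)            # ≈ 332.9 kbps; capacity exceeds R, so yes

# (iii) bandwidth needed at S/N = 10 dB (= 10)
B3 = R / log2(1 + 10 ** (10 / 10))
print(B3)                    # ≈ 46.25 kHz
```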
Numerical 3:
A Gaussian channel has a 10 MHz bandwidth. If the (S/N) ratio is 100, calculate the channel capacity and the maximum information rate.
Numerical 4:
A Gaussian channel has a bandwidth of 4 kHz and a two-sided noise power spectral density η/2 of 10^-14 W/Hz. The signal power at the receiver has to be maintained at a level less than or equal to 0.3 mW. Calculate the capacity of the channel.
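A sketch of the computation: the in-band noise power is N = ηB (twice the two-sided density times the bandwidth), after which Eq. (1) applies directly:

```python
from math import log2

B = 4_000.0            # bandwidth, Hz
psd_two_sided = 1e-14  # eta/2, W/Hz
S = 0.3e-3             # maximum signal power, W

# Noise power in bandwidth B: N = eta * B = 2 * (eta/2) * B
N = 2 * psd_two_sided * B   # 8e-11 W

C = B * log2(1 + S / N)     # Shannon-Hartley capacity
print(C)                    # ≈ 87.4 kbps
```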
