
DEPARTMENT OF ELECTRONICS & COMMUNICATION ENGINEERING

Digital Communication System

(Theory Notes)

Autonomous Course

Prepared by

Prof. Navya Holla K

Module – 1 Information Theory


Digital Communication block diagram, Information, Entropy,
Shannon’s Encoding Algorithm, Huffman coding, Discrete
memoryless channels, BSC, channel capacity, Shannon Hartley
Theorem and its implications.

Dayananda Sagar College of Engineering


Shavige Malleshwara Hills, Kumaraswamy Layout,
Banashankari, Bangalore-560078, Karnataka
Tel : +91 80 26662226 26661104 Extn : 2731 Fax : +91 80 2666 0789
Web : http://www.dayanandasagar.edu Email : [email protected]
(An Autonomous Institute Affiliated to VTU, Approved by AICTE & ISO 9001:2008 Certified)
(Accredited by NBA, National Assessment & Accreditation Council (NAAC) with 'A' grade)

1.1 Digital Communication block diagram

Fig. 1: Elements of digital communication system.


1.1.1 Information source:
The source of information can be analog (e.g. an audio or video signal) or digital (e.g. a teletype signal). In digital communication the signal produced by the source is converted into a digital signal consisting of 1s and 0s.

1.1.2 Source Encoder


The source encoder (or source coder) converts the input symbol sequence into a binary sequence of 0s and 1s by assigning code words to the symbols in the input sequence. For example, if a source set has a hundred symbols, then the number of bits used to represent each symbol will be 7, because $2^7 = 128$ unique combinations are available. The important parameters of a source encoder are block size, code word lengths, average data rate and the efficiency of the coder (i.e. actual output data rate compared to the minimum achievable rate).
At the receiver, the source decoder converts the binary output of the channel decoder into a symbol sequence. The decoder for a system using fixed-length code words is quite simple, but the decoder for a system using variable-length code words will be very complex.
The aim of source coding is to remove the redundancy in the transmitted information, so that the bandwidth required for transmission is minimized. Code words are assigned on the basis of symbol probabilities: the higher the probability, the shorter the code word.
Ex: Huffman coding.
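As a quick sketch of the fixed-length sizing rule above (the helper function is ours, purely for illustration):

```python
import math

# Fixed-length code sizing: a source with q symbols needs ceil(log2(q))
# bits per symbol, since the 2**n code words must cover all q symbols.
def bits_per_symbol(q: int) -> int:
    return math.ceil(math.log2(q))

print(bits_per_symbol(100))  # 7, because 2**7 = 128 >= 100
```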

1.1.3 Channel Encoder



Error control is accomplished by the channel coding operation, which consists of systematically adding extra bits to the output of the source coder. These extra bits do not convey any information but help the receiver to detect and/or correct some of the errors in the information-bearing bits.

There are two methods of channel coding:


 Block Coding: The encoder takes a block of "k" information bits from the source encoder and adds "r" error control bits, where "r" depends on "k" and the error control capabilities desired. A minimal sketch of this idea appears after this subsection.
 Convolutional Coding: The information-bearing message stream is encoded in a continuous fashion by continuously interleaving information bits and error control bits.

The Channel decoder recovers the information bearing bits from the coded binary stream.
Error detection and possible correction is also performed by the channel decoder.

The important parameters of coder / decoder are: Method of coding, efficiency, error control
capabilities and complexity of the circuit.
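To make the block-coding idea concrete, the following is a minimal sketch of a single-parity-check code with k = 3 information bits and r = 1 check bit. It is an illustration of the principle, not a specific code from these notes, and it only detects (never corrects) errors:

```python
# Single-parity-check block code (illustrative): k = 3 information bits
# plus one even-parity bit, giving error detection only.
def encode(block):                    # block: list of k bits
    return block + [sum(block) % 2]   # append parity so the bit-sum is even

def has_error(codeword):              # detects any odd number of bit flips
    return sum(codeword) % 2 != 0

cw = encode([1, 0, 1])                # -> [1, 0, 1, 0]
assert not has_error(cw)
cw[1] ^= 1                            # flip one bit in the channel
assert has_error(cw)                  # the receiver detects the error
```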
1.1.4 Modulator
Modulation is performed for the efficient transmission of the signal over the channel. The modulator operates by keying shifts in the amplitude, frequency or phase of a sinusoidal carrier wave in accordance with the channel encoder output. These digital modulation techniques are referred to as amplitude-shift keying, frequency-shift keying and phase-shift keying respectively. The modulator converts the input bit stream into an electrical waveform suitable for transmission over the communication channel. The modulator can be used to minimize the effects of channel noise, to match the frequency spectrum of the transmitted signal with the channel characteristics, and to provide the capability to multiplex many signals.
The detector performs demodulation, thereby producing a signal that follows the time variations in the channel encoder output. The modulator, channel and detector form a discrete channel (because both its input and output signals are in discrete form).
1.1.5 Channel
The Channel provides the electrical connection between the source and destination. The
different channels are: Pair of wires, Coaxial cable, Optical fibre, Radio channel, Satellite
channel or combination of any of these.
The communication channels have only finite bandwidth and non-ideal frequency response; the signal often suffers amplitude and phase distortion as it travels over the channel. Also, the signal power decreases due to the attenuation of the channel, and the signal is corrupted by unwanted, unpredictable electrical signals referred to as noise.
The important parameters of the channel are the signal-to-noise power ratio (SNR), usable bandwidth, amplitude and phase response, and the statistical properties of noise.
1.2. Information
The output of a discrete information source is a message that consists of a sequence of
symbols. The actual message that is emitted by the source during a message interval is
selected at random from a set of possible messages. The communication system is designed
to reproduce at the receiver either exactly or approximately the message emitted by the
source.
To measure the information content of a message quantitatively, we are required to arrive at
an intuitive concept of the amount of information.
Consider the example of a trip from Minneapolis to Miami, Florida in the winter time; the forecast could be:
 mild and sunny day,
 cold day,
 possible snow flurries.
The amount of information received is obviously different for these messages.
 The first message contains very little information since the weather in Miami is mild
and sunny most of the time.
 The forecast of a cold day contains more information since it is not an event that
occurs often.
 In contrast, the forecast of snow flurries conveys even more information since the
occurrence of snow in Miami is a rare event.
Thus, on an intuitive basis, the amount of information received from the knowledge of occurrence of an event is related to the probability (likelihood) of occurrence of the event. The message associated with the event least likely to occur contains the most information.
The information content of a message can be expressed quantitatively in terms of
probabilities as follows:
Suppose an information source emits one of $q$ possible messages $m_1, m_2, \ldots, m_q$ with probabilities of occurrence $p_1, p_2, \ldots, p_q$. Based on the above intuition, the information content of the $k$-th message can be written as

$$I(m_k) \propto \frac{1}{p_k}$$


Also, to satisfy the intuitive concept of information, $I(m_k)$ must approach zero as $p_k \to 1$. Therefore,

$$I(m_k) \ge 0 \text{ for } 0 \le p_k \le 1, \qquad I(m_k) > I(m_j) \text{ if } p_k < p_j$$

Another requirement is that when two independent messages are received, the total information content is the sum of the information conveyed by each of the messages. Thus the equation becomes

$$I(m_k \text{ and } m_j) = I(m_k) + I(m_j)$$

where $m_k$ and $m_j$ are two independent messages.


A continuous function of $p$ that satisfies the constraints specified in the above equations is the logarithmic function, and we can define a measure of information as

$$I(m_k) = \log\left(\frac{1}{p_k}\right)$$

The base of the logarithm in this equation determines the unit assigned to the information content:
Natural logarithm base : 'nat'
Base 10 : Hartley / decit
Base 2 : bit
Using the binary digit as the unit of information is based on the fact that if two possible binary digits occur with equal probability ($p_1 = p_2 = 1/2$), then the correct identification of the binary digit conveys $I(m_1) = I(m_2) = -\log_2(1/2) = 1$ bit of information. Therefore one bit is the amount of information that we gain when one of two possible and equally likely events occurs.
Ex1: A source puts out one of five possible messages during each message interval. The
probabilities of these messages are P1 =1/2, P2=1/4, P3=1/4, P4=1/16, P5=1/16. What is the
information content of these messages?
Solution:
$$I(m_1) = \log_2\frac{1}{(1/2)} = 1 \text{ bit}$$
$$I(m_2) = \log_2\frac{1}{(1/4)} = 2 \text{ bits}$$
$$I(m_3) = \log_2\frac{1}{(1/4)} = 2 \text{ bits}$$
$$I(m_4) = \log_2\frac{1}{(1/16)} = 4 \text{ bits}$$
$$I(m_5) = \log_2\frac{1}{(1/16)} = 4 \text{ bits}$$
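These values can be checked with a few lines of Python (a direct application of $I = \log_2(1/p)$):

```python
import math

# Self-information of each message in Ex. 1: I(m) = log2(1/p) bits.
probs = [1/2, 1/4, 1/4, 1/16, 1/16]
for k, p in enumerate(probs, start=1):
    print(f"I(m{k}) = {math.log2(1/p):g} bits")   # 1, 2, 2, 4, 4
```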
1.3 Entropy
Suppose a source emits one of $M$ possible symbols $s_1, s_2, \ldots, s_M$ in a statistically independent sequence, and let $p_1, p_2, \ldots, p_M$ be the probabilities of occurrence of the $M$ symbols, respectively. In a long message containing $N$ symbols, the symbol $s_1$ will occur on the average $p_1 N$ times, the symbol $s_2$ will occur $p_2 N$ times, and in general the symbol $s_M$ will occur $p_M N$ times. The information content of the $i$-th symbol is $I(s_i) = \log_2\frac{1}{p_i}$. Therefore the $p_1 N$ occurrences of $s_1$ contain $p_1 N \log_2\frac{1}{p_1}$ bits, and similarly the $p_2 N$ occurrences of $s_2$ contain $p_2 N \log_2\frac{1}{p_2}$ bits.

∴ The total self-information contained in all these messages is

$$I_{total} = p_1 N \log_2\frac{1}{p_1} + p_2 N \log_2\frac{1}{p_2} + \cdots + p_M N \log_2\frac{1}{p_M}$$

$$I_{total} = N \sum_{i=1}^{M} p_i \log_2\frac{1}{p_i} \text{ bits}$$

The average information per symbol is obtained by dividing the total information content of the message by the number of symbols in the message:

$$\text{Entropy} = H = \frac{I_{total}}{N} = \sum_{i=1}^{M} p_i \log_2\frac{1}{p_i} \text{ bits/symbol}$$

1.3.1 Average information rate


If the symbols are emitted by the source at a fixed rate of $r_s$ symbols/sec, then the average information rate $R_s$ is given by $R_s = r_s \cdot H$ bits/sec.
Examples:
1. Consider a discrete memoryless source with a source alphabet $A = (s_0, s_1, s_2)$ with respective probabilities $p_0 = \frac{1}{4}$, $p_1 = \frac{1}{4}$, $p_2 = \frac{1}{2}$. Find the entropy of the source.
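The answer follows directly from the definition; a minimal check in Python gives 1.5 bits/symbol (the symbol rate rs = 2 below is an assumed value, used only to illustrate $R_s = r_s H$):

```python
import math

# Entropy of the example DMS: H = sum p_i * log2(1/p_i).
p = [1/4, 1/4, 1/2]
H = sum(pi * math.log2(1/pi) for pi in p)
print(H)        # 1.5 bits/symbol

# Average information rate for an assumed symbol rate rs = 2 symbols/sec:
rs = 2
print(rs * H)   # Rs = rs * H = 3.0 bits/sec
```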

1.4 Shannon’s Encoding Algorithm


Source encoding is the process by which the output of an information source is converted into an r-ary sequence. Coding is the transformation of the source symbols S = {$s_1, s_2, s_3, \ldots, s_q$} using the symbols of the code alphabet X = {$x_1, x_2, x_3, \ldots, x_r$}, where r is the number of different symbols used in the code alphabet; in binary coding r = 2, so X = (0, 1). In general, if {$s_1, s_2, s_3, \ldots, s_q$} are to be transmitted directly, then q different states are required, whereas in binary coding only 2 states are required. Hence the transmission process becomes much easier and the efficiency of the system can be increased.
 Arrange the source symbols in the order of decreasing probabilities:
S = {$s_1, s_2, s_3, \ldots, s_q$}
P = {$p_1, p_2, p_3, \ldots, p_q$}
$p_1 \ge p_2 \ge p_3 \ge \cdots \ge p_q$
 Compute the sequence
$\alpha_1 = 0$
$\alpha_2 = p_1 = p_1 + \alpha_1$
$\alpha_3 = p_2 + p_1 = p_2 + \alpha_2$
$\alpha_4 = p_3 + p_2 + p_1 = p_3 + \alpha_3$
⋮
$\alpha_{q+1} = p_q + \alpha_q = 1$
 Determine the smallest integer $l_i$ (length of the code word) using the inequality
$2^{l_i} \ge \frac{1}{p_i}$ for all i = 1 to q

 Expand the decimal numbers $\alpha_i$ in binary form up to $l_i$ places, neglecting the expansion beyond $l_i$ places.
 Remove the binary point to get the desired code.
Code efficiency: The average length L of any code is given by $L = \sum_{i=1}^{q} p_i l_i$, where $l_i$ is the length of the code word assigned to symbol $s_i$.
Code efficiency: $\eta_c = \frac{H(S)}{L} \times 100\%$ for binary codes.
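The steps above translate directly into code. The following minimal sketch (the function name is ours) implements the binary procedure and reproduces the code words of Ex. 1 below:

```python
import math

# A sketch of Shannon's binary encoding procedure described above.
def shannon_code(probs):
    probs = sorted(probs, reverse=True)        # step 1: decreasing order
    codes, alpha = [], 0.0
    for p in probs:
        l = math.ceil(math.log2(1 / p))        # smallest l with 2**l >= 1/p
        # steps 3-4: expand alpha in binary to l places, drop the point
        bits, frac = "", alpha
        for _ in range(l):
            frac *= 2
            bits += str(int(frac))
            frac -= int(frac)
        codes.append(bits)
        alpha += p                             # step 2: cumulative probability
    return codes

print(shannon_code([0.4, 0.3, 0.2, 0.1]))      # ['00', '01', '101', '1110']
```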

Ex: 1. Construct the Shannon binary code for the message symbols S = {$s_1, s_2, s_3, s_4$} with probabilities P = (0.4, 0.3, 0.2, 0.1).
Solution:
 $0.4 > 0.3 > 0.2 > 0.1$
 $\alpha_1 = 0$
$\alpha_2 = 0.4$
$\alpha_3 = 0.4 + 0.3 = 0.7$
$\alpha_4 = 0.7 + 0.2 = 0.9$
$\alpha_5 = 0.9 + 0.1 = 1.0$
 The smallest integers satisfying $2^{l_i} \ge 1/p_i$ are $l_1 = 2$, $l_2 = 2$, $l_3 = 3$, $l_4 = 4$; expanding each $\alpha_i$ in binary to $l_i$ places gives the code words.
 The codes are

$s_1$: 00, $s_2$: 01, $s_3$: 101, $s_4$: 1110

The average length of this code is

$$L = 0.4(2) + 0.3(2) + 0.2(3) + 0.1(4) = 2.4 \text{ bits/symbol}$$

$$\%\eta_c = \frac{H(S)}{L} \times 100 = \frac{1.8464}{2.4} \times 100 = 76.93\%$$

Ex: 2: Apply Shannon’s binary encoding procedure to the following set of messages and
obtain code efficiency and redundancy.
1/8, 1/16, 3/16, 1/4, 3/8
Solution:

$$H(S) = \frac{1}{4}\log_2 4 + \frac{3}{8}\log_2\frac{8}{3} + \frac{1}{8}\log_2 8 + \frac{3}{16}\log_2\frac{16}{3} + \frac{1}{16}\log_2 16$$
$$H(S) = 2.1085 \text{ bits/symbol}$$
$$L = \sum_{i=1}^{q} p_i l_i = \frac{1}{4}(2) + \frac{3}{8}(2) + \frac{1}{8}(3) + \frac{3}{16}(3) + \frac{1}{16}(4)$$
$$L = 2.4375 \text{ bits/symbol}$$
$$\%\eta = \frac{H(S)}{L} \times 100 = 86.5\%$$
Redundancy $= 1 - \eta = 100 - 86.5 = 13.5\%$
Ex: 3: Repeat for the messages ($x_1, x_2, x_3$) with P = (1/2, 1/5, 3/10).
Solution:

$$H(S) = \frac{1}{2}\log_2 2 + \frac{3}{10}\log_2\frac{10}{3} + \frac{1}{5}\log_2 5$$
$$H(S) = 1.4855 \text{ bits/symbol}$$
$$L = \sum_{i=1}^{3} p_i l_i = \frac{1}{2}(1) + \frac{3}{10}(2) + \frac{1}{5}(3)$$
$$L = 1.7 \text{ bits/symbol}$$
$$\%\eta = \frac{H(S)}{L} \times 100 = 87.38\%$$
1.5 Huffman Coding

 The source symbols are listed in the decreasing order of probabilities.
 Check whether q = r + α(r − 1) is satisfied and find the integer 'α', where q is the number of source symbols and r is the number of symbols in the code alphabet. If 'α' is not an integer, add a suitable number of dummy symbols with zero probability of occurrence to satisfy the equation. This step is not required if we are to determine binary codes.
 Combine the last 'r' symbols into a single composite symbol whose probability of occurrence is equal to the sum of the probabilities of occurrence of the last r symbols involved in the step.
 Repeat the above steps on the resulting set of symbols until in the final step exactly r symbols are left.
 The last source with 'r' symbols is encoded with 'r' different code symbols 0, 1, 2, 3, …, r − 1.
 In binary coding the last source is encoded with 0 and 1.
 As we pass from source to source working backward, one code word is decomposed each time in order to form r new code words.
 This procedure is repeated till code words are assigned to all the source symbols of the alphabet of source S, discarding the dummy symbols.
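For the binary case (r = 2, so no dummy symbols are needed) the procedure can be sketched in a few lines. This is one possible realization, and it reproduces the code-word lengths of Ex. 1 below:

```python
import heapq, itertools

# Minimal binary Huffman coder following the procedure above: repeatedly
# combine the two least-probable symbols, then work backward by prefixing
# 0 and 1 to the code words of the two parts of each composite symbol.
def huffman_code(probs):
    counter = itertools.count()                 # tie-breaker for equal probs
    heap = [(p, next(counter), {i: ""}) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    while len(heap) > 1:
        p1, _, c1 = heapq.heappop(heap)         # the two least-probable entries
        p2, _, c2 = heapq.heappop(heap)
        merged = {i: "0" + w for i, w in c1.items()}
        merged.update({i: "1" + w for i, w in c2.items()})
        heapq.heappush(heap, (p1 + p2, next(counter), merged))
    return heap[0][2]                           # symbol index -> code word

codes = huffman_code([1/2, 1/4, 1/8, 1/8])
print(sorted(codes.items()))   # lengths 1, 2, 3, 3 as in Ex. 1
```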

Ex: 1. Construct a Huffman code for symbols having probabilities $\{\frac{1}{2}, \frac{1}{4}, \frac{1}{8}, \frac{1}{8}\}$. Also find the efficiency and redundancy.

$q = r + \alpha(r - 1)$

$4 = 2 + \alpha(1) \Rightarrow \alpha = 2 \in \mathbb{Z}$


Symbols Codes Probabilities Length

$S_1$  0    1/2  1
$S_2$  10   1/4  2
$S_3$  110  1/8  3
$S_4$  111  1/8  3

$$H(S) = \frac{1}{2}\log_2 2 + \frac{1}{4}\log_2 4 + \frac{1}{8}\log_2 8 + \frac{1}{8}\log_2 8$$
$$H(S) = 1.75 \text{ bits/symbol}$$
$$L = \sum_{i=1}^{4} p_i l_i = \frac{1}{2}(1) + \frac{1}{4}(2) + \frac{1}{8}(3) + \frac{1}{8}(3)$$
$$L = 1.75 \text{ bits/symbol}$$
$$\%\eta = \frac{H(S)}{L} \times 100 = 100\%$$
Redundancy=0%
Ex. 2: A source has 9 symbols, each occurring with a probability of 1/9. Construct a binary Huffman code. Find the efficiency and redundancy of the coding.
Solution:

$q = r + \alpha(r - 1)$

$9 = 2 + \alpha(1) \Rightarrow \alpha = 7 \in \mathbb{Z}$


Symbols Codes Probabilities Length


𝑆1 001 1/9 3
𝑆2 0000 1/9 4
𝑆3 0001 1/9 4
𝑆4 110 1/9 3
𝑆5 111 1/9 3
𝑆6 100 1/9 3
𝑆7 101 1/9 3
𝑆8 010 1/9 3
𝑆9 011 1/9 3
$$H(S) = \frac{9}{9}\log_2 9 = 3.17 \text{ bits/symbol}$$
$$L = \sum_{i=1}^{9} p_i l_i = \frac{1}{9}(3+4+4+3+3+3+3+3+3) = 3.22 \text{ bits/symbol}$$
$$\%\eta = \frac{H(S)}{L} \times 100 = 98.45\%$$
Redundancy $= 100 - \%\eta = 1.55\%$
Ex. 3: Given the messages $x_1, x_2, x_3, x_4, x_5, x_6$ with probabilities 0.4, 0.2, 0.2, 0.1, 0.07, 0.03, construct binary and ternary codes by applying the Huffman encoding procedure. Also find the efficiency and redundancy.


Solution:
(i) Binary
𝑞 = 𝑟 + 𝛼(𝑟 − 1)

6=2+α(1) => α=4∈ 𝑍

Symbols Codes Probabilities Length


𝑥1 1 0.4 1
𝑥2 01 0.2 2
𝑥3 000 0.2 3
𝑥4 0010 0.1 4
𝑥5 00110 0.07 5
𝑥6 00111 0.03 5

$$H(S) = 0.4\log_2\frac{1}{0.4} + 0.2\log_2\frac{1}{0.2} + 0.2\log_2\frac{1}{0.2} + 0.1\log_2\frac{1}{0.1} + 0.07\log_2\frac{1}{0.07} + 0.03\log_2\frac{1}{0.03}$$
$$H(S) = 2.21 \text{ bits/symbol}$$
$$L = \sum_{i=1}^{6} p_i l_i = 0.4(1) + 0.2(2) + 0.2(3) + 0.1(4) + 0.07(5) + 0.03(5)$$
$$L = 2.3 \text{ bits/symbol}$$
$$\%\eta = \frac{H(S)}{L} \times 100 = 96.1\%$$
Redundancy $= 100 - \%\eta = 3.9\%$


(ii) Ternary
$q = r + \alpha(r - 1)$

$6 = 3 + \alpha(2) \Rightarrow \alpha = 3/2$; let $\alpha = 2$

If $\alpha = 2 \Rightarrow q = 3 + 2(2) = 7$

Hence add a dummy symbol $x_7$ with probability 0.

Symbols Codes Probabilities Length

$x_1$  0    0.4   1
$x_2$  2    0.2   1
$x_3$  10   0.2   2
$x_4$  11   0.1   2
$x_5$  120  0.07  3
$x_6$  121  0.03  3

𝑥7 must be ignored as it is a dummy symbol.

$$H(S) = 0.4\log_3\frac{1}{0.4} + 0.2\log_3\frac{1}{0.2} + 0.2\log_3\frac{1}{0.2} + 0.1\log_3\frac{1}{0.1} + 0.07\log_3\frac{1}{0.07} + 0.03\log_3\frac{1}{0.03}$$
$$H(S) = 1.3944 \text{ ternary units/symbol}$$
$$L = 0.4(1) + 0.2(1) + 0.2(2) + 0.1(2) + 0.07(3) + 0.03(3) = 1.5 \text{ ternary digits/symbol}$$
$$\%\eta = \frac{H(S)}{L} \times 100 = 92.96\%$$

Ex. 4: A zero-memory source has an alphabet of 7 symbols whose probabilities of occurrence are (0.25, 0.25, 0.125, 0.125, 0.125, 0.0625, 0.0625). Compute the Huffman code for this source, moving a combined symbol as high as possible. Evaluate the code efficiency and also construct the code tree.
Solution:
q=r+α(r-1) => α=5

Symbols Pi Code length


S1 0.25 10 2
S2 0.25 11 2
S3 0.125 001 3
S4 0.125 010 3
S5 0.125 011 3
S6 0.0625 0000 4
S7 0.0625 0001 4
H(S) = 2.625 bits/symbol
L = 2.625 bits/symbol
$\%\eta = \frac{H(S)}{L} \times 100 = 100\%$
Code tree:

1.6 Discrete Memoryless Channel:

A channel is defined as the medium through which the coded signals generated by an information source are transmitted. In general, the input to the channel is a symbol belonging to an alphabet 'A' with 'r' symbols, and the output of the channel is a symbol belonging to an alphabet 'B' with 's' symbols.

Due to errors in the channel, the output symbols may differ from the input symbols.


1.6.1 Representation of a channel:

A communication channel may be represented by a set of input alphabets A = ($a_1, a_2, a_3, \ldots, a_r$) consisting of 'r' symbols, a set of output alphabets B = ($b_1, b_2, b_3, \ldots, b_s$) consisting of 's' symbols, and a set of conditional probabilities $P(b_j/a_i)$, with i = 1, 2, …, r and j = 1, 2, …, s.

The conditional probabilities come into existence due to the presence of noise in the channel. Because of noise there will be some amount of uncertainty in the reception of any symbol; for this reason there are 's' possible symbols at the receiver for the 'r' symbols at the transmitter. In total there are r × s conditional probabilities, represented in the form of a matrix called the Channel Matrix or Noise Matrix.

When $a_1$ is transmitted, it can be received as any one of the output symbols ($b_1, b_2, b_3, \ldots, b_s$).

Therefore $P_{11} + P_{12} + P_{13} + \cdots + P_{1s} = 1$

$\Rightarrow P(b_1/a_1) + P(b_2/a_1) + P(b_3/a_1) + \cdots + P(b_s/a_1) = 1$

In general, $\sum_{j=1}^{s} P(b_j/a_i) = 1$ for i = 1 to r.

Thus the sum of all the elements in any row of the channel matrix is equal to UNITY.

1.6.2 Joint Probability:


The joint probability between any input symbol $a_i$ and any output symbol $b_j$ is given by

$$P(a_i \cap b_j) = P(a_i, b_j) = P(b_j/a_i)\,P(a_i)$$

$$P(a_i, b_j) = P(a_i/b_j)\,P(b_j)$$

Properties:

Consider the source alphabet A = ($a_1, a_2, a_3, \ldots, a_r$) and output alphabet B = ($b_1, b_2, b_3, \ldots, b_s$).

 The source entropy is given by $H(A) = \sum_{i=1}^{r} P(a_i)\log_2\frac{1}{P(a_i)}$
 The entropy of the receiver or output is given by $H(B) = \sum_{j=1}^{s} P(b_j)\log_2\frac{1}{P(b_j)}$
 If all the symbols are equiprobable, then the maximum source entropy is $H(A)_{max} = \log_2 r$
 Conditional entropy: The entropy of the input symbols $a_1, a_2, a_3, \ldots, a_r$ after the transmission and reception of a particular output symbol $b_j$ is defined as the conditional entropy $H(A/b_j)$:
$$H(A/b_j) = \sum_{i=1}^{r} P(a_i/b_j)\log_2\frac{1}{P(a_i/b_j)}$$
 Taking the average of this quantity over all $b_j$, as j varies from 1 to s, gives $H(A/B) = \sum_{j=1}^{s} P(b_j)\,H(A/b_j)$
$$= \sum_{j=1}^{s}\sum_{i=1}^{r} P(b_j)\,P(a_i/b_j)\log_2\frac{1}{P(a_i/b_j)}$$
$$H(A/B) = \sum_{j=1}^{s}\sum_{i=1}^{r} P(a_i, b_j)\log_2\frac{1}{P(a_i/b_j)}$$
which is the conditional entropy of the transmitter.
Similarly,
$$H(B/A) = \sum_{i=1}^{r}\sum_{j=1}^{s} P(a_i, b_j)\log_2\frac{1}{P(b_j/a_i)}$$
is the conditional entropy of the receiver.
 $$H(A,B) = \sum_{i=1}^{r}\sum_{j=1}^{s} P(a_i, b_j)\log_2\frac{1}{P(a_i, b_j)}$$
is the joint entropy.


1.6.3 Mutual Information:

When an average amount of information H(X) is transmitted over a noisy channel, an amount of information H(X/Y) is lost in the channel. The balance of information at the receiver is defined as the mutual information I(X,Y):

$$I(X,Y) = H(X) - H(X/Y) = H(Y) - H(Y/X)$$

$I(x_i) = \log\frac{1}{P(x_i)}$ and $I(x_i/y_j) = \log\frac{1}{P(x_i/y_j)}$

The difference between the above two is the information gained through the channel:

$$I(x_i, y_j) = \log\frac{1}{P(x_i)} - \log\frac{1}{P(x_i/y_j)}$$

$$I(x_i, y_j) = \log\frac{P(x_i/y_j)}{P(x_i)}$$

$$I(x_i, y_j) = \log\frac{P(x_i, y_j)}{P(x_i)P(y_j)}$$
Properties:

 The mutual information is symmetric: $I(x_i, y_j) = I(y_j, x_i)$
 $I(X,Y) = H(X) + H(Y) - H(X,Y)$
 $I(X,Y) = H(X) - H(X/Y)$
 $I(X,Y) = H(Y) - H(Y/X)$
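These identities are easy to verify numerically from a joint probability matrix; in the sketch below the JPM values are assumed purely for illustration, not taken from the notes:

```python
import numpy as np

# Entropies and mutual information from an assumed joint probability
# matrix P[i, j] = P(x_i, y_j); all entries sum to 1.
P = np.array([[0.25, 0.00],
              [0.10, 0.30],
              [0.05, 0.30]])

def H(p):                              # entropy of a probability vector
    p = p[p > 0]                       # 0 * log(1/0) terms contribute nothing
    return -np.sum(p * np.log2(p))

Hx, Hy = H(P.sum(axis=1)), H(P.sum(axis=0))   # H(X), H(Y) from the marginals
Hxy = H(P.flatten())                           # joint entropy H(X,Y)
I = Hx + Hy - Hxy                              # I(X,Y) = H(X)+H(Y)-H(X,Y)
# H(X/Y) = H(X,Y) - H(Y), so H(X) - H(X/Y) gives the same value:
print(I, Hx - (Hxy - Hy))
```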

1.6.4 Channel Capacity


It is known that the average information content of the source is $H(X) = \sum_{i=1}^{M} p(x_i)\log_2\frac{1}{p(x_i)}$, so the average information rate going into the channel is $R_{in} = r_s \cdot H(X)$. Due to errors, it is not possible to reconstruct the input symbol sequence with certainty from the recovered sequence; source information is therefore lost due to the errors.
 Therefore the average rate of information transmission is given by $R_t = I(X,Y)\cdot r_s$ bits/sec.


 The capacity of a discrete memoryless noisy channel is defined as the maximum possible rate of information transmission over the channel. The maximum rate of transmission occurs when the source is matched to the channel.
 $\therefore C = \text{Max}(R_t)$
 $= \text{Max}[I(X,Y)\cdot r_s]$
 $C = \text{Max}\{[H(X) - H(X/Y)]\,r_s\}$
1.6.5 Channel Efficiency
$$\%\eta_{ch} = \frac{R_t}{C} \times 100 = \frac{I(X,Y)\cdot r_s}{\text{Max}[I(X,Y)\cdot r_s]} \times 100$$

$$\%\eta_{ch} = \frac{H(X) - H(X/Y)}{\text{Max}[H(X) - H(X/Y)]} \times 100$$

Redundancy $= 1 - \eta_{ch}$
1.6.6 Symmetric Channel

A symmetric channel is defined as a channel in which the second and subsequent rows of the channel matrix contain the same elements as the first row, but in a different order.

$\therefore H(Y/X) = h$, where h is the entropy of any single row. The channel capacity with $r_s = 1$ symbol/sec is given by

$C = \text{Max}(R_t)$

$= \text{Max}[I(X,Y)\cdot r_s]$

$= \text{Max}[I(X,Y)]$

$= \text{Max}[H(Y) - H(Y/X)]$

$= \text{Max}[H(Y)] - h$ (since h is a constant)

H(Y) becomes maximum if and only if all the received symbols are equiprobable.

Since there are 's' output symbols,


$$\text{Max}[H(Y)] = \log_2 s$$

$$\therefore C = \log_2 s - h$$
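A short numerical sketch of $C = \log_2 s - h$; the channel-matrix row below is an assumed example, not taken from the notes:

```python
import math

# Capacity of a symmetric channel: C = log2(s) - h, where h is the entropy
# of any row of the channel matrix (all rows are permutations of each other).
row = [0.8, 0.1, 0.1]                       # one row of P(Y/X), s = 3 outputs
h = sum(p * math.log2(1/p) for p in row if p > 0)
C = math.log2(len(row)) - h                 # bits/symbol, with rs = 1
print(round(C, 4))
```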

Ex.1: A transmitter has an alphabet of five letters {$a_1, a_2, a_3, a_4, a_5$} and the receiver has an alphabet of four letters {$b_1, b_2, b_3, b_4$}. The joint probabilities of the system are given below. Compute the different entropies of this channel.

Solution:


Ex.2: A transmitter transmits 5 symbols with probabilities 0.2, 0.3, 0.2, 0.1 and 0.2. Given the channel matrix $P(B/A)$, calculate H(B) and H(A,B).

Solution:

We know that $P(a_i, b_j) = P(a_i)\,P(b_j/a_i)$.

Adding the elements of each column, we get


Ex.3: For the JPM given below, compute H(X), H(Y), H(X,Y), H(X/Y), H(Y/X), I(X,Y) and the channel capacity if $r_s$ = 1000 symbols/sec. Verify the relationships among these entropies.

Solution:


Verification:


1.7 Binary Symmetric Channel:

The binary symmetric channel is one of the most common and widely used channels; its channel diagram is given below.

From the diagram, the channel matrix can be written as

$$P(Y/X) = \begin{bmatrix} P & 1-P \\ 1-P & P \end{bmatrix} = \begin{bmatrix} P & \bar{P} \\ \bar{P} & P \end{bmatrix}$$

The matrix is a symmetric matrix; hence the channel is a binary symmetric channel.

1.8 Channel Capacity of the BSC

It is known that $C = \text{Max}\{[H(Y) - H(Y/X)]\,r_s\}$.

For a symmetric channel, $H(Y/X) = h = P\log_2\frac{1}{P} + \bar{P}\log_2\frac{1}{\bar{P}}$

Since it is a binary symmetric channel, $H(Y)_{max} = \log_2 s = \log_2 2 = 1$

$\therefore C = 1 - h$ bits/sec (for $r_s = 1$ symbol/sec).
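A minimal sketch of the BSC capacity as a function of the transition probability p; note that p = 1/4 gives h = 0.8113, the value used in Ex.1 below:

```python
import math

# BSC capacity C = 1 - h(p), where h(p) is the binary entropy of the
# transition probability p. C is 1 bit for a noiseless channel (p = 0)
# and drops to 0 at p = 1/2, where the output is independent of the input.
def bsc_capacity(p):
    if p in (0.0, 1.0):
        return 1.0
    h = p * math.log2(1/p) + (1-p) * math.log2(1/(1-p))
    return 1 - h

for p in (0.0, 0.1, 0.25, 0.5):
    print(p, round(bsc_capacity(p), 4))   # 1.0, 0.531, 0.1887, 0.0
```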


Ex.1: A binary symmetric channel has the following noise matrix, with source probabilities $P(x_1) = 2/3$ and $P(x_2) = 1/3$:

$$P(Y/X) = \begin{bmatrix} 3/4 & 1/4 \\ 1/4 & 3/4 \end{bmatrix}$$

Determine H(X), H(Y), H(X,Y), H(Y/X), H(X/Y), I(X,Y), the channel capacity, the channel efficiency and the redundancy.

Solution:

$$H(Y) = \frac{7}{12}\log_2\frac{12}{7} + \frac{5}{12}\log_2\frac{12}{5} = 0.9799 \text{ bits/symbol}$$

$$H(Y/X) = h = P\log_2\frac{1}{P} + \bar{P}\log_2\frac{1}{\bar{P}} = \frac{3}{4}\log_2\frac{4}{3} + \frac{1}{4}\log_2 4 = 0.8113 \text{ bits/symbol}$$


Channel redundancy $= 1 - \eta_{ch} = 10.65\%$

Solution:


Decoding Alphabets: X8, X6, X3, X5, X1, X2, X7,X4



1.9 Shannon Hartley Theorem and its Implications

For a channel of bandwidth B Hz disturbed by additive white Gaussian noise, with average signal power S and average noise power N, the channel capacity is

$$C = B\log_2\left(1 + \frac{S}{N}\right) \text{ bits/sec}$$

Ex.1:


Ex.2: A CRT terminal is used to enter alphanumeric data into a computer. The CRT is connected through a voice-grade telephone line having a usable bandwidth of 3 kHz and an output S/N of 10 dB. Assume that the terminal has 128 characters and that data is sent in an independent manner with equal probability.

(i) Find the average information per character.

(ii) Find the capacity of the channel.
(iii) Find the maximum rate at which data can be sent from the terminal to the computer without error.

Ex.3: A voice-grade channel of the telephone network has a bandwidth of 3.4 kHz.

(a) Calculate the channel capacity of the telephone channel for a signal-to-noise ratio of 30 dB.
(b) Calculate the minimum signal-to-noise ratio required to support information transmission through the telephone channel at the rate of 4800 bits/sec.
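Part (b) simply inverts the Shannon-Hartley formula for S/N; a quick check of both parts:

```python
import math

# Ex.3: B = 3.4 kHz voice-grade channel.
B = 3400
C = B * math.log2(1 + 10**(30/10))    # (a) SNR = 30 dB -> C ~ 33.9 kbits/sec
snr_min = 2**(4800 / B) - 1           # (b) invert C = B*log2(1 + S/N)
print(round(C), round(snr_min, 3), round(10*math.log10(snr_min), 2), "dB")
```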


Ex.4:

Solution:


Ex.5:
