
DIGITAL COMMUNICATION

INTRODUCTION
COMMUNICATION

 Communication is the process of sharing or transferring information from one place to another, i.e. from a source to a destination.

TYPES OF COMMUNICATION
 Analog Communication
 Digital Communication
Analog Communication

 Analog signal is directly transmitted from source to destination.
Digital Communication

 After sampling, quantizing and encoding the signal, the digital signal is transmitted from source to destination.
Communication Systems

 Basic components:

Transmitter

Channel or medium

Receiver

 Noise degrades or interferes with transmitted information.
UNIT I - INFORMATION THEORY

Discrete Memoryless Sources, Information, Entropy, Mutual Information – Discrete Memoryless Channels, Binary Symmetric Channel, Channel Capacity – Hartley-Shannon Law – Source Coding Theorem – Shannon-Fano & Huffman Codes.
Information
 Consider a communication system which emits messages m1, m2, m3, … with probabilities p1, p2, p3, … respectively.

Msg          m1   m2   m3
Probability  p1   p2   p3
Information
 The amount of information Ik carried by the message mk with probability pk is given by

Ik = log2(1/pk)

The unit of information is the bit.
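As an illustration (not from the original slides), here is a minimal Python sketch of this formula; the helper name info_bits is an assumption chosen here, and it relies only on the standard math module:

import math

def info_bits(pk):
    # Self-information Ik = log2(1/pk), in bits, for a message of probability pk
    return math.log2(1.0 / pk)

# Example: a message with probability 1/4 carries 2 bits of information
print(info_bits(0.25))   # 2.0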
Properties of information

 If there is more uncertainty about the message, the information carried is also more.
 If the receiver knows the message being transmitted, the amount of information carried is zero.
 If I1 is the information carried by message m1 and I2 is the information carried by message m2, then the total information carried is I1 + I2 (assuming the messages are independent).
Problems

1. Calculate the amount of information if pk = 1/4

Solution
Amount of information Ik = log2(1/pk)
                         = log10(1/pk) / log10(2)
                         = log10(4) / log10(2)
Ik = 2 bits
Problems
2. Calculate the amount of information if binary digits occur with equal likelihood in binary PCM.

Solution
p1 = (probability of zero) = 1/2
p2 = (probability of one) = 1/2
Amount of information Ik = log2(1/pk)
I1 = log10(1/p1) / log10(2) = log10(2) / log10(2) = 1 bit
I2 = log10(1/p2) / log10(2) = log10(2) / log10(2) = 1 bit

Total information I = I1 + I2 = 2 bits
3. In binary PCM, if '0' occurs with probability 1/4 and '1' occurs with probability 3/4, calculate the amount of information conveyed by each bit.

Solution
Amount of information Ik = log2(1/pk)
p1 = (probability of zero) = 1/4
p2 = (probability of one) = 3/4
I1 = log10(1/p1) / log10(2) = log10(4) / log10(2) = 2 bits
I2 = log10(1/p2) / log10(2) = log10(4/3) / log10(2) = 0.415 bits
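The worked problems above can be checked numerically; a small sketch using the same illustrative info_bits helper (an assumption, not part of the slides):

import math

def info_bits(pk):
    # Self-information Ik = log2(1/pk) in bits
    return math.log2(1.0 / pk)

# Problem 1: pk = 1/4 gives 2 bits
print(info_bits(1/4))                    # 2.0

# Problem 2: equally likely binary digits, p1 = p2 = 1/2, 1 bit each, 2 bits in total
print(info_bits(1/2) + info_bits(1/2))   # 2.0

# Problem 3: p(0) = 1/4 gives 2 bits, p(1) = 3/4 gives about 0.415 bits
print(info_bits(1/4), info_bits(3/4))    # 2.0 0.415...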
4. Prove the statement: "If the receiver knows the message being transmitted, the amount of information carried is zero."
 The receiver knows the message being transmitted.
 This means only one message is transmitted.
 So its probability is pk = 1.
Amount of information Ik = log2(1/pk)
                         = log10(1) / log10(2)
                         = 0
ENTROPY

 Entropy can be defined as a measure of the average information content per source symbol.
 It is denoted by 'H'.
ENTROPY

H = Σ pk log2(1/pk), summed over all source symbols k
PROPERTIES OF ENTROPY
1. Entropy is zero if the event is sure or impossible, i.e. H = 0 if pk = 0 or 1.
2. H = log2 M if all M symbols are equally probable.
3. The upper bound of entropy is Hmax = log2 M.
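A minimal Python sketch of the entropy formula and of properties 2 and 3, assuming the source is given as a list of symbol probabilities (the helper name entropy_bits is an assumption, not from the slides):

import math

def entropy_bits(probs):
    # Entropy H = sum of pk * log2(1/pk), in bits per source symbol
    # Terms with pk = 0 are skipped (they contribute zero in the limit)
    return sum(p * math.log2(1.0 / p) for p in probs if p > 0)

# Properties 2 and 3: with M = 4 equally likely symbols, H reaches its upper bound log2(4) = 2 bits
print(entropy_bits([0.25, 0.25, 0.25, 0.25]))   # 2.0

# Unequal probabilities give H below the upper bound (here Hmax = log2(2) = 1 bit)
print(entropy_bits([0.25, 0.75]))               # ~0.811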
Calculate entropy when pk = 1 and when pk = 0

Entropy when pk = 1
H = pk log2(1/pk)
  = pk [log10(1/pk) / log10(2)]
  = (1) [log10(1) / log10(2)]
H = 0

Entropy when pk = 0
The term pk log2(1/pk) tends to 0 as pk tends to 0, so H = 0.
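These boundary cases can also be checked numerically with the same illustrative entropy_bits helper (again an assumption, using the convention that a pk = 0 term contributes zero):

import math

def entropy_bits(probs):
    # Entropy in bits; pk = 0 terms are skipped by the limiting convention
    return sum(p * math.log2(1.0 / p) for p in probs if p > 0)

# Property 1: a sure event (pk = 1) and an impossible event (pk = 0) both give H = 0
print(entropy_bits([1.0]))        # 0.0
print(entropy_bits([1.0, 0.0]))   # 0.0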
