ITC-03

Lecture-3

Information
Dr.A.Manikandan,
Associate Prof/ECE, Amrita School of Engineering.

Learning Objective

To discuss uncertainty, the measure of information, and entropy.

Modeling of information sources

• Information sources are modeled mathematically as discrete-time random processes, i.e., sequences of random variables. If we consider an event, there are three conditions of occurrence:
• If the event has not occurred, there is a condition of uncertainty.
• If the event has just occurred, there is a condition of surprise.
• If the event occurred some time back, there is a condition of having some information.


Pic Courtesy: Research Gate


Contd…

• Any information source produces an output that is random in nature.

• If the source output had no randomness, i.e., the output were known
exactly, there would be no need to transmit it.
For example, the following statements carry increasing uncertainty, and hence increasing information:
• The sun will rise.
• It may rain at noon.
• It will snow in Delhi next winter.



Measure of information
• The “amount of information,” “amount of uncertainty,” or “amount of surprise” becomes larger as the probability of occurrence of the event becomes smaller.

• Whether viewed as information, uncertainty, or surprise, the quantity is associated with the happening of the event.

• The measure of information is an indication of the “freedom of choice” exercised by the source in selecting a message.

• If the source can freely choose from many different messages, the user is highly uncertain as to which message will be selected for transmission.
Measure of information

• Consider a discrete random variable X with possible outcomes $x_i$, $i = 1, 2, \ldots, n$. The self-information of the event $X = x_i$ is defined as

$I(x_i) = \log \frac{1}{P(x_i)} = -\log P(x_i)$   [Self-information]

• This set of probabilities satisfies $\sum_{i=1}^{n} P(x_i) = 1$.
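As an illustrative sketch (not part of the original slides), the self-information in bits can be computed with a small Python helper; the function name self_information is an assumption:

```python
import math

def self_information(p):
    """Self-information I(x) = -log2 P(x) of an event with probability p, in bits."""
    if not 0.0 < p <= 1.0:
        raise ValueError("probability must be in (0, 1]")
    return 0.0 if p == 1.0 else -math.log2(p)

print(self_information(0.5))    # 1.0 bit: a fair-coin outcome
print(self_information(0.125))  # 3.0 bits: a rarer event carries more information
print(self_information(1.0))    # 0.0 bits: a certain event carries no information
```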



Properties of information

• The more uncertainty there is about a message, the more information it carries.

• If the receiver already knows the message being transmitted, the amount of information carried is zero.

• If $I_1$ is the information carried by message $m_1$ and $I_2$ is the information carried by $m_2$ (with $m_1$ and $m_2$ independent), then the amount of information carried jointly by $m_1$ and $m_2$ is $I_1 + I_2$.

• If there are $M = 2^N$ equally likely messages, then the amount of information carried by each message is $N$ bits (see the sketch below).
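A minimal numerical check of the last two properties, assuming base-2 logarithms; the variable names are placeholders:

```python
import math

# Additivity: two independent messages with probabilities p1 and p2
p1, p2 = 0.5, 0.25
I1 = -math.log2(p1)            # 1 bit
I2 = -math.log2(p2)            # 2 bits
I_joint = -math.log2(p1 * p2)  # information of the joint event
assert I_joint == I1 + I2      # 3 bits = 1 + 2

# M = 2**N equally likely messages carry N bits each
N = 4
M = 2 ** N
assert -math.log2(1 / M) == N
```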
Example

• Consider a binary source that tosses a fair coin and outputs a 1 if a head appears and a 0 if a tail appears. For this source, P(1) = P(0) = 0.5. The information content of each output from the source is

$I(x_i) = -\log_2 P(x_i) = -\log_2 0.5 = 1 \text{ bit}$



Exercise

• Find the amount of information conveyed by the event “The sun will rise.”



Exercise

• A source puts out one of five possible messages during each message interval. The probabilities of these messages are

$P_1 = \frac{1}{2}, \quad P_2 = \frac{1}{4}, \quad P_3 = \frac{1}{8}, \quad P_4 = P_5 = \frac{1}{16}$

Find the information content of each of these messages.
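A brief worked check of this exercise, sketched in Python (variable names assumed):

```python
import math

probabilities = [1/2, 1/4, 1/8, 1/16, 1/16]
for k, p in enumerate(probabilities, start=1):
    # I_k = -log2(P_k)
    print(f"I{k} = {-math.log2(p):.0f} bits")
# Expected: I1 = 1, I2 = 2, I3 = 3, I4 = I5 = 4 bits
```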



Mutual information

• Consider two discrete random variables X and Y with possible outcomes $x_i$ and $y_j$. There are two extreme cases:
• X and Y are independent, in which case the occurrence of $Y = y_j$ provides no information about $X = x_i$.
• X and Y are fully dependent, in which case the occurrence of $Y = y_j$ uniquely determines the occurrence of the event $X = x_i$.



Mutual information

• Mutual information (MI) is a measure of the amount of information that two random variables have in common.

• A suitable measure that satisfies the conditions mentioned in the previous slide is the logarithm of the ratio of the conditional probability to the marginal probability, where

$P(X = x_i \mid Y = y_j) = P(x_i \mid y_j)$

• The mutual information $I(x_i; y_j)$ between $x_i$ and $y_j$ is defined as

$I(x_i; y_j) = \log \frac{P(x_i \mid y_j)}{P(x_i)}$

$I(y_j; x_i) = \log \frac{P(y_j \mid x_i)}{P(y_j)} = I(x_i; y_j)$
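An illustrative sketch of this pointwise quantity in Python; the joint-distribution values below are made up purely for demonstration:

```python
import math

# Assumed toy joint distribution p(x, y) over two binary variables
p_xy = {(0, 0): 0.4, (0, 1): 0.1,
        (1, 0): 0.1, (1, 1): 0.4}
p_x = {x: sum(v for (xx, _), v in p_xy.items() if xx == x) for x in (0, 1)}
p_y = {y: sum(v for (_, yy), v in p_xy.items() if yy == y) for y in (0, 1)}

def pointwise_mi(x, y):
    """I(x; y) = log2 p(x|y)/p(x) = log2 p(x,y)/(p(x) p(y))."""
    return math.log2(p_xy[(x, y)] / (p_x[x] * p_y[y]))

print(pointwise_mi(0, 0))  # positive: observing y = 0 makes x = 0 more likely
print(pointwise_mi(0, 1))  # negative: observing y = 1 makes x = 0 less likely
```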
Properties

• When the random variables X and Y are statistically independent, $P(x_i \mid y_j) = P(x_i)$, which leads to $I(x_i; y_j) = 0$.

• When the occurrence of $Y = y_j$ uniquely determines the occurrence of the event $X = x_i$, $P(x_i \mid y_j) = 1$, and the mutual information becomes

$I(x_i; y_j) = \log \frac{1}{P(x_i)} = -\log P(x_i)$

Also, remember

$\frac{P(x_i \mid y_j)}{P(x_i)} = \frac{P(x_i \mid y_j)\, P(y_j)}{P(x_i)\, P(y_j)} = \frac{P(x_i, y_j)}{P(x_i)\, P(y_j)} = \frac{P(y_j \mid x_i)}{P(y_j)}$
Example

• Consider the binary symmetric channel shown in the figure. Determine the total probability of receiving 0 and of receiving 1. Also calculate the mutual information.
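The channel figure is not reproduced here, so the numbers below are assumptions chosen only for illustration: equiprobable inputs P(X=0) = P(X=1) = 0.5 and crossover probability 0.1. The sketch computes the output probabilities and the average mutual information (defined formally on a later slide):

```python
import math

p_x = {0: 0.5, 1: 0.5}   # input probabilities (assumed)
eps = 0.1                # crossover probability (assumed)
p_y_given_x = {(0, 0): 1 - eps, (1, 0): eps,     # keys are (y, x)
               (0, 1): eps,     (1, 1): 1 - eps}

# Total probability of receiving each output symbol
p_y = {y: sum(p_y_given_x[(y, x)] * p_x[x] for x in (0, 1)) for y in (0, 1)}
print(p_y)  # {0: 0.5, 1: 0.5} for a symmetric channel with equiprobable inputs

# Average mutual information I(X;Y) = sum over x, y of p(x, y) log2( p(y|x) / p(y) )
I = sum(p_x[x] * p_y_given_x[(y, x)] * math.log2(p_y_given_x[(y, x)] / p_y[y])
        for x in (0, 1) for y in (0, 1))
print(f"I(X;Y) = {I:.3f} bits")  # about 0.531 bits when eps = 0.1
```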



Average self-information (Entropy)

• Consider that we have M different messages. Let these messages be $m_1, m_2, \ldots, m_M$ with probabilities $p_1, p_2, \ldots, p_M$. Suppose that a sequence of L messages is transmitted, and L is very large. Then we may say that
• $p_1 L$ messages of $m_1$ are transmitted
• $p_2 L$ messages of $m_2$ are transmitted
• ...
• $p_M L$ messages of $m_M$ are transmitted



Contd…

• Hence the information due to message $m_1$ is $I_1 = \log_2 \frac{1}{p_1}$

• Since there are $p_1 L$ messages of $m_1$, the total information due to all messages of $m_1$ is $I_{1(\text{total})} = p_1 L \log_2 \frac{1}{p_1}$

• Similarly, the total information due to all messages of $m_2$ is $I_{2(\text{total})} = p_2 L \log_2 \frac{1}{p_2}$, and so on.



Contd…
• Thus the total information due to the sequence of L messages will be

$I_{\text{total}} = I_{1(\text{total})} + I_{2(\text{total})} + \cdots + I_{M(\text{total})}$

$I_{\text{total}} = p_1 L \log \frac{1}{p_1} + p_2 L \log \frac{1}{p_2} + \cdots + p_M L \log \frac{1}{p_M}$

The average information per message will be

$\text{Average information} = \frac{\text{Total information}}{\text{Number of messages}} = \frac{I_{\text{total}}}{L}$

Hence

$\text{Entropy } H = \frac{1}{L} I_{\text{total}} = p_1 \log \frac{1}{p_1} + p_2 \log \frac{1}{p_2} + \cdots + p_M \log \frac{1}{p_M}$

$\text{Entropy } H = \sum_{k=1}^{M} p_k \log_2 \frac{1}{p_k}$
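A compact sketch of this entropy formula in Python (the function name entropy and the bits-only convention are assumptions):

```python
import math

def entropy(probs):
    """H = sum_k p_k * log2(1 / p_k), in bits; terms with p_k = 0 contribute 0."""
    return sum(p * math.log2(1 / p) for p in probs if p > 0)

print(entropy([0.25] * 4))  # 2.0 bits: four equally likely messages
print(entropy([1.0]))       # 0.0 bits: a sure message carries no information
```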
Exercise

A source generates four messages m0, m1, m2 and m3 with probabilities 1/3, 1/6, 1/4 and 1/4 respectively. The successive messages emitted by the source are statistically independent. Calculate the entropy of the source.

Entropy = 1.959 bits/message
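As a quick check, the entropy helper sketched above reproduces the stated answer:

```python
print(round(entropy([1/3, 1/6, 1/4, 1/4]), 3))  # 1.959 bits/message
```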



Properties of entropy

• Entropy is zero if the event is sure or impossible, i.e., $H = 0$ if $p_k = 1$ or $0$.

• When $p_k = 1/M$ for all M symbols, the symbols are equally likely. For such a source the entropy is $H = \log_2 M$ (a quick numerical check follows below).

• The upper bound of the entropy is given by $H_{\max} = \log_2 M$.
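For instance, using the entropy helper sketched earlier (assumed to be in scope):

```python
import math

M = 8
print(entropy([1 / M] * M))  # 3.0 bits, which equals log2(8)
print(math.log2(M))          # 3.0
```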



Exercise

• Calculate the entropy when $p_k = 0$ or $p_k = 1$.



Exercise

• A source transmits two independent messages with probabilities p and 1 − p respectively. Prove that the entropy is maximum when both messages are equally likely. Also plot the variation of the entropy H as a function of the message probability p.
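A sketch of the binary entropy function $H(p) = -p \log_2 p - (1-p) \log_2 (1-p)$; the coarse numerical scan below only illustrates that the maximum of 1 bit occurs at p = 0.5 (a proof would set dH/dp = 0):

```python
import math

def binary_entropy(p):
    """H(p) = -p*log2(p) - (1-p)*log2(1-p), with H(0) = H(1) = 0."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

# Scan p over a grid to locate the maximum
points = [i / 100 for i in range(101)]
p_max = max(points, key=binary_entropy)
print(p_max, binary_entropy(p_max))  # 0.5 1.0
```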



Exercise

• Consider a telegraph source having two symbols, dot and dash. The duration of a dash is 3 times that of a dot. Calculate the average information.



Average Mutual information

The average mutual information between two random variables X and Y is given by

$I(X; Y) = \sum_{i=1}^{n} \sum_{j=1}^{m} p(x_i, y_j) \log \frac{p(x_i, y_j)}{p(x_i)\, p(y_j)}$

$= \sum_{i=1}^{n} \sum_{j=1}^{m} p(x_i, y_j) \log \frac{p(x_i \mid y_j)}{p(x_i)}$

$= \sum_{i=1}^{n} \sum_{j=1}^{m} p(x_i, y_j) \log \frac{p(y_j \mid x_i)}{p(y_j)}$
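A short sketch of this double sum in Python, reusing an assumed toy joint distribution (the values are illustrative only):

```python
import math

# Assumed joint distribution p(x, y)
p_xy = {(0, 0): 0.4, (0, 1): 0.1,
        (1, 0): 0.1, (1, 1): 0.4}
p_x = {x: sum(v for (xx, _), v in p_xy.items() if xx == x) for x in (0, 1)}
p_y = {y: sum(v for (_, yy), v in p_xy.items() if yy == y) for y in (0, 1)}

# I(X;Y) = sum_i sum_j p(x_i, y_j) * log2( p(x_i, y_j) / (p(x_i) p(y_j)) )
I_XY = sum(p * math.log2(p / (p_x[x] * p_y[y]))
           for (x, y), p in p_xy.items() if p > 0)
print(f"I(X;Y) = {I_XY:.3f} bits")  # about 0.278 bits for this joint distribution
```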


