ITC-03
Information
Dr. A. Manikandan,
Associate Prof/ECE, Amrita School of Engineering.
Learning Objective
Modeling of information sources
• Before an event occurs there is uncertainty about it; once the event has occurred, that uncertainty is resolved and some information has been gained.
• If the source output had no randomness, i.e., the output were known
exactly, there would be no need to transmit it.
• For example, "The sun will rise" is practically certain, so it conveys essentially no information.
• If the source can freely choose from many different messages, the user
is highly uncertain as to which message will be selected for
transmission.
Measure of information
$I(x_i) = \log \dfrac{1}{P(x_i)} = -\log P(x_i)$  [Self-information]
• Find the amount of information conveyed by the event "The sun will rise".
$P_1 = \dfrac{1}{2}, \quad P_2 = \dfrac{1}{4}, \quad P_3 = \dfrac{1}{8}, \quad P_4 = P_5 = \dfrac{1}{16}$
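A quick numerical check of the self-information of the probabilities listed above, as a minimal Python sketch:

import math

# Self-information I(x) = -log2 P(x) in bits, for the probabilities P1..P5 above
probabilities = [1/2, 1/4, 1/8, 1/16, 1/16]
for k, p in enumerate(probabilities, start=1):
    info_bits = -math.log2(p)          # equivalently log2(1/p)
    print(f"P{k} = {p}  ->  I = {info_bits:.0f} bits")
# Prints 1, 2, 3, 4 and 4 bits: the less probable the event, the more information it carries.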
• X and Y are fully dependent events, in which case the occurrence of Y=yi
determines the occurrence of the event X=xi.
• A suitable measure that satisfies the conditions mentioned in the previous slides is the logarithm of the ratio of the conditional probability to the a priori probability.
$P(X = x_i \mid Y = y_i) = P(x_i \mid y_i)$

$I(x_i; y_i) = \log \dfrac{P(x_i \mid y_i)}{P(x_i)}$

$I(y_i; x_i) = \log \dfrac{P(y_i \mid x_i)}{P(y_i)} = I(x_i; y_i)$
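As a sanity check of the symmetry $I(x_i; y_i) = I(y_i; x_i)$, a minimal Python sketch with assumed (purely illustrative) probability values:

import math

# Assumed example values: P(x) = 0.25, P(y) = 0.5, P(x|y) = 0.4
p_x, p_y, p_x_given_y = 0.25, 0.5, 0.4
p_y_given_x = p_x_given_y * p_y / p_x      # Bayes' rule: P(y|x) = P(x|y) P(y) / P(x)

i_xy = math.log2(p_x_given_y / p_x)        # I(x; y)
i_yx = math.log2(p_y_given_x / p_y)        # I(y; x)
print(i_xy, i_yx)                          # both equal log2(1.6) ≈ 0.678 bits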
Properties
$I(x_i; y_i) = \log \dfrac{1}{P(x_i)} = -\log P(x_i)$
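This property follows from the definition of $I(x_i; y_i)$ on the earlier slide when X and Y are fully dependent, i.e. when $P(x_i \mid y_i) = 1$:

$I(x_i; y_i) = \log \dfrac{P(x_i \mid y_i)}{P(x_i)} = \log \dfrac{1}{P(x_i)} = -\log P(x_i) = I(x_i)$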
• Hence the information due to message $m_1$ is $I_1 = \log_2 \dfrac{1}{p_1}$
• Since there are $p_1 L$ occurrences of $m_1$, the total information due to all messages of $m_1$ is $I_{1(\text{total})} = p_1 L \log_2 \dfrac{1}{p_1}$
$I_{2(\text{total})} = p_2 L \log_2 \dfrac{1}{p_2}$, and so on.
$I_{\text{total}} = p_1 L \log_2 \dfrac{1}{p_1} + p_2 L \log_2 \dfrac{1}{p_2} + \cdots + p_M L \log_2 \dfrac{1}{p_M}$
• Dividing $I_{\text{total}}$ by the total number of messages $L$ gives the average information per message, i.e. the entropy:

$\text{Entropy } H = \sum_{k=1}^{M} p_k \log_2 \dfrac{1}{p_k}$
Exercise
A source generates four messages m0, m1, m2 and m3 with probabilities 1/3, 1/6, 1/4 and 1/4 respectively. The successive messages emitted by the source are statistically independent. Calculate the entropy of the source.
Entropy ≈ 1.959 bits/message
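A minimal Python check of this result, using the probabilities given in the exercise:

import math

# Entropy H = sum_k p_k * log2(1/p_k), in bits per message
probs = [1/3, 1/6, 1/4, 1/4]                   # m0, m1, m2, m3
H = sum(p * math.log2(1 / p) for p in probs)
print(f"H = {H:.3f} bits/message")             # H = 1.959 bits/message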
• When $p_k = 1/M$ for all the M symbols, the symbols are equally likely. For such a source the entropy is $H = \log_2 M$.
• A source transmits two independent messages with probabilities p and (1 - p) respectively. Prove that the entropy is maximum when both messages are equally likely. Also plot the variation of the entropy H as a function of the message probability p (a short numerical sketch follows this list).
• Consider a telegraph source having two symbols, dot and dash. The duration of a dash is 3 times that of a dot. Calculate the average information.
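For the equally-likely exercise above, a minimal Python sketch of the binary entropy $H(p) = -p \log_2 p - (1 - p) \log_2 (1 - p)$; plotting these values (e.g. with matplotlib) gives the required curve, which peaks at p = 0.5:

import math

def binary_entropy(p):
    # H(p) = -p*log2(p) - (1-p)*log2(1-p), with H(0) = H(1) = 0 by convention
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

for p in [0.0, 0.1, 0.25, 0.5, 0.75, 0.9, 1.0]:
    print(f"p = {p:.2f}  H = {binary_entropy(p):.3f} bits")
# The values rise to a maximum of 1 bit at p = 0.5, i.e. when both messages are equally likely.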
Average mutual information:

$I(X; Y) = \sum_{i=1}^{n} \sum_{j=1}^{m} p(x_i, y_j) \log \dfrac{p(x_i \mid y_j)}{p(x_i)} = \sum_{i=1}^{n} \sum_{j=1}^{m} p(x_i, y_j) \log \dfrac{p(y_j \mid x_i)}{p(y_j)}$
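A minimal Python sketch of this double sum, using an assumed 2x2 joint probability matrix purely for illustration (log base 2 gives the result in bits):

import math

# Assumed joint distribution p(x_i, y_j): rows index x, columns index y
p_xy = [[0.4, 0.1],
        [0.1, 0.4]]

p_x = [sum(row) for row in p_xy]               # marginals p(x_i)
p_y = [sum(col) for col in zip(*p_xy)]         # marginals p(y_j)

# I(X;Y) = sum_i sum_j p(x_i, y_j) * log2( p(x_i|y_j) / p(x_i) )
#        = sum_i sum_j p(x_i, y_j) * log2( p(x_i, y_j) / (p(x_i) * p(y_j)) )
I = sum(p_xy[i][j] * math.log2(p_xy[i][j] / (p_x[i] * p_y[j]))
        for i in range(2) for j in range(2)
        if p_xy[i][j] > 0)
print(f"I(X;Y) = {I:.3f} bits")                # ≈ 0.278 bits for this example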