ITC-6 sem -1
IT-6th Sem.
Praveen Yadav
Unit 1: INFORMATION THEORY
• Amount of Information
I = log2(1/P(Outcome)) …(1)
I = -log2P(Outcome) …(2)
When the log base is 2 (binary system), the unit is the bit. When the base is e, the unit is the nat; when the base is 10, the unit is the Hartley.
1 nat = 1.4426 bits, 1 Hartley = 3.3219 bits.
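As a quick illustration of these units, the following sketch evaluates equation (2) with different logarithm bases; the outcome probability 0.25 is an assumed value chosen only for illustration.

```python
import math

def information_content(p, base=2):
    """Self-information I = -log_base(p) of an outcome with probability p."""
    return -math.log(p) / math.log(base)

p = 0.25  # assumed outcome probability, for illustration only
print(information_content(p, 2))              # 2.0 bits
print(information_content(p, math.e))         # ~1.386 nats
print(information_content(p, 10))             # ~0.602 Hartleys
print(information_content(p, math.e) * 1.4426)  # nats converted to bits, ~2.0
```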
Mutual information
• What is Mutual Information?
A measure of the amount of information one variable reveals about
another.
H(Y) = − Σ_{j=1}^{m} P(y_j) log2 P(y_j)   bit/receiver symbol
Entropy
4. Relationship between joint, conditional and marginal entropies:
H(Y|X) = H(X,Y) − H(X)
H(X|Y) = H(X,Y) − H(Y)
Also we have the transinformation (average mutual information):
I(X,Y) = H(X) − H(X|Y)
I(X,Y) = H(Y) − H(Y|X)
Example: The joint probability matrix of a system is given by:
               y1        y2
P(X,Y) = x1 [ 0.5       0.25   ]
         x2 [ 0         0.125  ]
         x3 [ 0.0625    0.0625 ]
Find:
1. Marginal entropies.
2. Joint entropy.
3. Conditional entropies.
4. The transinformation.
Solution:
P(x) = [0.75  0.125  0.125],   P(y) = [0.5625  0.4375]
1- Marginal entropies:
H(x) = − Σ_{i=1}^{n} P(x_i) log2 P(x_i) = −[0.75 ln 0.75 + 2 × 0.125 ln 0.125] / ln 2
     = 1.06127 bit/symbol
H(y) = − Σ_{j=1}^{m} P(y_j) log2 P(y_j) = −[0.5625 ln 0.5625 + 0.4375 ln 0.4375] / ln 2
     = 0.9887 bit/symbol
2- Joint entropy: using P(X,Y) above,
H(x,y) = − Σ_{j=1}^{m} Σ_{i=1}^{n} P(x_i, y_j) log2 P(x_i, y_j)
       = −[0.5 log2 0.5 + 0.25 log2 0.25 + 0.125 log2 0.125 + 2 × 0.0625 log2 0.0625]
       = 1.875 bit/symbol
3- Conditional entropies:
H(X|Y) = H(X,Y) − H(Y) = 1.875 − 0.9887 = 0.8863 bit/symbol
H(Y|X) = H(X,Y) − H(X) = 1.875 − 1.06127 = 0.8137 bit/symbol
4- The transinformation:
I(X,Y) = H(X) − H(X|Y) = 1.06127 − 0.8863 = 0.175 bit/symbol
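The calculations above can be checked numerically. The following sketch (Python, with freely chosen names) recomputes the marginal, joint and conditional entropies and the transinformation directly from the joint probability matrix of the example.

```python
import math

# Joint probability matrix P(X,Y) from the example: rows x1..x3, columns y1, y2
P = [[0.5,    0.25],
     [0.0,    0.125],
     [0.0625, 0.0625]]

def H(probs):
    """Entropy in bits, ignoring zero-probability terms."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

Px = [sum(row) for row in P]                       # P(x) = [0.75, 0.125, 0.125]
Py = [sum(row[j] for row in P) for j in range(2)]  # P(y) = [0.5625, 0.4375]

Hx, Hy = H(Px), H(Py)
Hxy = H([p for row in P for p in row])             # joint entropy H(X,Y)

print(Hx, Hy)         # ~1.0613 and ~0.9887 bit/symbol
print(Hxy)            # ~1.875 bit/symbol
print(Hxy - Hy)       # H(X|Y) ~0.8863
print(Hxy - Hx)       # H(Y|X) ~0.8137
print(Hx + Hy - Hxy)  # transinformation I(X,Y) ~0.175
```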
[Figure: a binary source emitting the symbols a and b (P = 0.5 each), producing a stream such as abaabaababbbaabbabab…]
Intuition on Shannon’s Entropy
Why H = − Σ_{i=1}^{n} p_i log(p_i)?
Suppose you have a long random string of the two binary symbols 0 and 1, and the probabilities of symbols 1 and 0 are p1 and p0 = 1 − p1.
Ex: 00100100101101001100001000100110001…
If the string is long enough, say of length N, it is likely to contain Np0 0's and Np1 1's. The probability that this string pattern occurs is
p = p0^{Np0} · p1^{Np1}
Hence, the # of possible patterns is 1/p = p0^{−Np0} · p1^{−Np1}, and the # of bits needed to represent all possible patterns is
log(p0^{−Np0} · p1^{−Np1}) = − Σ_{i=0}^{1} N p_i log p_i
The average # of bits needed to represent one symbol is therefore − Σ_{i=0}^{1} p_i log p_i.
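A rough numerical check of this counting argument is sketched below, with an assumed p1 = 0.3 and N = 10 000: log2 of the number of strings with the typical symbol counts, divided by N, is compared against the entropy formula.

```python
import math

p1 = 0.3        # assumed probability of symbol '1'
p0 = 1 - p1
N = 10_000      # string length

# Number of strings containing exactly N*p0 zeros and N*p1 ones is C(N, N*p1).
# log2 of that count, per symbol, approaches the entropy as N grows.
k = round(N * p1)
log2_patterns = (math.lgamma(N + 1) - math.lgamma(k + 1)
                 - math.lgamma(N - k + 1)) / math.log(2)
entropy = -(p0 * math.log2(p0) + p1 * math.log2(p1))

print(log2_patterns / N)  # ~0.880 bits/symbol
print(entropy)            # ~0.8813 bits/symbol
```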
More Intuition on Entropy
Assume a binary memoryless source, e.g., a flip of a coin. How
much information do we receive when we are told that the
outcome is heads?
If it’s a fair coin, i.e., P(heads) = P (tails) = 0.5, we say that the
amount of information is 1 bit.
Example 2:
Which logarithm? Pick the one you like! If you pick the natural log,
you’ll measure in nats, if you pick the 10-log, you’ll get Hartleys,
if you pick the 2-log (like everyone else), you’ll get bits.
Self Information
Let p be the probability of an outcome; its self-information is often denoted I(p) = −log2 p.
[Figure: uncertainty plotted against p for 0 ≤ p ≤ 1]
The uncertainty (information) is greatest when p = 0.5.
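A small sketch tabulating the binary entropy confirms that the uncertainty peaks at p = 0.5 and vanishes at p = 0 and p = 1.

```python
import math

def H_binary(p):
    """Binary entropy H(p) = -p*log2(p) - (1-p)*log2(1-p), in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

for p in (0.0, 0.1, 0.25, 0.5, 0.75, 0.9, 1.0):
    print(p, round(H_binary(p), 4))
# H(p) reaches its maximum of 1 bit at p = 0.5.
```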
Example
Three symbols a, b, c with corresponding probabilities:
What is H(P)?
What is H(Q)?
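The slide's actual probability values for P and Q are not preserved in this extract, so the sketch below uses assumed distributions purely to show how H(P) and H(Q) would be computed.

```python
import math

def H(probs):
    """Entropy in bits of a discrete distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Hypothetical distributions over {a, b, c}; the original values are not
# preserved here, so these numbers are for illustration only.
P = {'a': 0.5, 'b': 0.25, 'c': 0.25}
Q = {'a': 1/3, 'b': 1/3, 'c': 1/3}

print(H(P.values()))  # 1.5 bits
print(H(Q.values()))  # log2(3) ~ 1.585 bits, the maximum for three symbols
```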
Entropy: Three properties
1. It can be shown that 0 ≤ H ≤ log N.
(a) How much information (in bits) is gained when the forecast
predicts it will be sunny?
(b) Which weather outcome carries more uncertainty, and why?
C = max{I(X,Y)} = max{H(Y) − H(Y|X)}   (bits/symbol)
• Let t_i be the symbol duration for X_i and t_av be the average time for transmission of a symbol; the channel capacity can also be defined as
C = max{I(X,Y)} / t_av   (bits/second)
Noise-free case: here I(X,Y) = H(X), with
H(X) = − Σ_{i=1}^{q} P(X_i) log2 P(X_i)   (bits/symbol)
C = 1 + (1 − pe) log2(1 − pe) + pe log2 pe
[Figure: BSC capacity C plotted against error probability pe, with pe on a log scale from 10⁻⁴ to 10⁻¹]
P(X0,Y0) = P(X0) P(Y0|X0) = (1 − pe)/2,   P(X0,Y1) = P(X0) P(Y1|X0) = pe/2,
P(X1,Y0) = pe/2,   P(X1,Y1) = (1 − pe)/2
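The BSC capacity formula above is easy to evaluate; this sketch tabulates C over the same range of pe shown in the figure.

```python
import math

def bsc_capacity(pe):
    """Capacity of a binary symmetric channel with error probability pe (bits/symbol)."""
    if pe in (0.0, 1.0):
        return 1.0
    return 1 + (1 - pe) * math.log2(1 - pe) + pe * math.log2(pe)

for pe in (1e-4, 1e-3, 1e-2, 1e-1):
    print(pe, round(bsc_capacity(pe), 4))
# Capacity approaches 1 bit/symbol as pe -> 0 and drops towards 0 as pe -> 0.5.
```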
• Obvious implications:
– Increasing the SNR S_P/N_P increases the channel capacity
– Increasing the channel bandwidth B increases the channel capacity
Bandwidth and SNR Trade-off
• From the definition of channel capacity, we can trade the channel bandwidth B for the SNR or signal power S_P, and vice versa.
We have
C∞ = lim_{B→∞} C = (S_P/N_0) log2 e = 1.44 S_P/N_0
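A quick numerical sketch, with an assumed S_P/N_0 = 1, shows B·log2(1 + S_P/(N_0·B)) approaching the limiting value 1.44·S_P/N_0 as the bandwidth grows.

```python
import math

SP_over_N0 = 1.0  # assumed ratio of signal power to noise power spectral density

def capacity(B):
    """Shannon capacity C = B*log2(1 + SP/(N0*B)) in bits/s."""
    return B * math.log2(1 + SP_over_N0 / B)

for B in (1, 10, 100, 1000, 1e6):
    print(B, round(capacity(B), 4))          # 1.0, 1.375, 1.4355, 1.442, ...
print(SP_over_N0 * math.log2(math.e))        # limiting value ~1.4427
```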
Bandwidth and SNR Trade-off – Example
• Q: A channel has an SNR of 15. If the channel bandwidth is reduced by half, determine the increase in the signal power required to maintain the same channel capacity.
• A: Equate the capacities before and after halving the bandwidth (B' = B/2):
B · log2(1 + S_P/(N_0 B)) = B' · log2(1 + S'_P/(N_0 B'))
With S_P/(N_0 B) = 15, the left-hand side is B log2(16) = 4B, so
4B = (B/2) · log2(1 + (S'_P/S_P) · S_P/(N_0 B/2))
8 = log2(1 + (S'_P/S_P) · 30)
256 = 1 + 30 · (S'_P/S_P)  →  S'_P = 8.5 S_P
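The result can be verified numerically; the sketch below, with the bandwidth normalised to 1, confirms that 8.5 times the signal power at half the bandwidth gives the same capacity.

```python
import math

B = 1.0     # original bandwidth (normalised)
SNR = 15    # original SNR = SP/(N0*B)

C = B * math.log2(1 + SNR)                          # original capacity: 4 bits/s
ratio = 8.5                                          # claimed increase in signal power
C_half = (B / 2) * math.log2(1 + ratio * SNR * 2)    # halving B doubles SP/(N0*B')
print(C, C_half)                                     # both 4.0 bits/s
```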
Summary
• Shannon theorem
Topic 7: Bandwidth and Channel Capacity – Mutual Information, Shannon's Channel Capacity Theorem, Bandwidth and Channel Capacity
L-1: A family is experiencing slow Wi-Fi speeds when multiple
devices are connected.
1. What factors affect the channel capacity of a home Wi-Fi
network?
2. How does bandwidth impact internet speed?
3. Suggest three practical ways to improve Wi-Fi speed based
on Shannon’s Channel Capacity theorem.
L-2: A remote worker faces lag and poor quality during video
calls, especially when multiple apps are running.
4. What is the role of mutual information in determining the
quality of transmitted video and audio?
5. Suggest two optimizations (e.g., adjusting bandwidth
allocation, using adaptive coding) to ensure smooth video
calls.
Cont… Topic 7: Bandwidth and Channel Capacity