Information Theory and Coding - Chapter 4
Part B
Prerequisite: Stat 219
Text Book: B.P. Lathi, “Modern Digital and Analog Communication Systems”, 3rd edition, Oxford University Press, Inc., 1998
Reference: A. Papoulis, Probability, Random Variables, and Stochastic Processes, McGraw-Hill, 2005
[Channel diagram: inputs a1, a2, a3 with probabilities P(a1), P(a2), P(a3) mapped to outputs b1, b2, b3 with probabilities P(b1), P(b2), P(b3); ideal channel vs. practical channel.]
Backward Conditional Channel Matrix: the conditional input probability distribution at the receiver, P(ai/bj), obtained by Bayes' rule: P(ai/bj) = P(bj/ai) P(ai) / P(bj). In matrix form, the output distribution follows from the forward matrix F as B = Fᵀ A.
Problem 2
a) Complete the forward channel matrix: [matrix not shown]
b) Draw the forward channel diagram.
c) Calculate the output probabilities.
d) Calculate the following joint probabilities: [not shown].
e) Calculate the following conditional probabilities: [not shown].
Problem 2 – Solution
a) Completed entries: 0.1, 0.08, 0.8, 0.13, 0.07 [matrix layout not recovered]
Symbol  Code  Prob
a       1100  0.0625
b       10    0.4375
c       0110  0.0625
d       00    0.4375
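If this table feeds a channel-input calculation like the one in Problem 5a (the digit distribution of the encoded stream), it can be checked with a short script. This is a sketch; the dictionary layout is illustrative, but the codewords and probabilities are the ones in the table above.

```python
from fractions import Fraction

# Symbol probabilities and binary codewords from the table above
code = {"a": ("1100", Fraction(1, 16)),
        "b": ("10",   Fraction(7, 16)),
        "c": ("0110", Fraction(1, 16)),
        "d": ("00",   Fraction(7, 16))}

# Expected number of each binary digit per source symbol
exp = {d: sum(p * cw.count(d) for cw, p in code.values()) for d in "01"}
avg_len = sum(exp.values())  # average codeword length (digits/symbol)

# Digit distribution of the encoded stream = channel input distribution
dist = {d: exp[d] / avg_len for d in "01"}
print(avg_len, dist["0"], dist["1"])  # 9/4, 25/36, 11/36
```

Exact fractions avoid the rounding noise that floats would introduce in a hand check.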
Problem 5
If the shown ternary Huffman-coded stream is applied to a ternary channel with the following forward channel matrix: [matrix not shown]
a) Calculate the channel input probability distribution.
b) Calculate the channel output probability distribution.
c) Find the receiver decision rule and its probability of error.
d) Calculate the following conditional probabilities: [not shown].

Prob   Ternary Huffman Code
0.25   1
0.2    2
0.2    00
0.16   02
0.05   011
0.05   012
0.04   0100
0.04   0101
0.01   0102
Problem 5 - Solution
a) The channel input probability distribution is the digit distribution of the ternary Huffman-coded stream. Expected digit counts per source symbol:
E[#0s] = 0.2×2 + 0.16×1 + 0.05×1 + 0.05×1 + 0.04×3 + 0.04×2 + 0.01×2 = 0.88
E[#1s] = 0.25×1 + 0.05×2 + 0.05×1 + 0.04×1 + 0.04×2 + 0.01×1 = 0.53
E[#2s] = 0.2×1 + 0.16×1 + 0.05×1 + 0.01×1 = 0.42
Average codeword length L = 0.88 + 0.53 + 0.42 = 1.83 ternary digits/symbol
P(0) = 0.88/1.83 ≈ 0.481,  P(1) = 0.53/1.83 ≈ 0.290,  P(2) = 0.42/1.83 ≈ 0.230
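The digit-counting computation in part a) can be verified mechanically. A minimal sketch, using the code table from the problem statement:

```python
from fractions import Fraction

# Ternary Huffman code table from Problem 5: (symbol probability, codeword)
table = [(Fraction(25, 100), "1"),   (Fraction(20, 100), "2"),
         (Fraction(20, 100), "00"),  (Fraction(16, 100), "02"),
         (Fraction(5, 100),  "011"), (Fraction(5, 100),  "012"),
         (Fraction(4, 100),  "0100"),(Fraction(4, 100),  "0101"),
         (Fraction(1, 100),  "0102")]

# Expected count of each ternary digit per source symbol
exp = {d: sum(p * cw.count(d) for p, cw in table) for d in "012"}
avg_len = sum(exp.values())                  # average codeword length: 1.83
dist = {d: exp[d] / avg_len for d in "012"}  # channel input distribution
print(avg_len, dist)
```

The exact fractions 88/183, 53/183, and 42/183 match the rounded values 0.481, 0.290, and 0.230 in the solution.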
c) The receiver decision rule:
d(0) = 0
d(1) = 1
d(2) = 2
d) [conditional probabilities not recovered; value in source: 0.599]
Problem 6
Given the following forward ternary channel matrix (entries reconstructed from the solution below):
        b1     b2     b3
a1  [  0.7    0.13   0.17  ]
a2  [  0.08   0.9    0.02  ]
a3  [  0.2    0.3    0.5   ]
where the channel input probability distribution is P(a1) = 0.4, P(a2) = 0.5, P(a3) = 0.1.
1. Draw the forward channel diagram.
2. Calculate the output probability distribution.
3. Calculate the joint channel matrix.
4. Calculate the backward channel matrix.
5. Find the receiver decision rule.
Problem 6 - Solution
2. The output probability distribution
P(b1) = P(b1/a1)P(a1) + P(b1/a2)P(a2) + P(b1/a3) P(a3)
= P(a1,b1) + P(a2,b1) + P(a3,b1)
= 0.4×0.7 + 0.5×0.08 + 0.1×0.2
= 0.28 + 0.04 + 0.02 = 0.34
P(b2) = 0.13×0.4 + 0.9×0.5 + 0.3×0.1 = 0.052 + 0.45 + 0.03 = 0.532
P(b3) = 0.17×0.4 + 0.02×0.5 + 0.5×0.1 = 0.068 + 0.01 + 0.05 = 0.128
3. The joint channel matrix, P(ai, bj) = P(ai) P(bj/ai):
        b1     b2     b3
a1  [  0.28   0.052  0.068 ]
a2  [  0.04   0.45   0.01  ]
a3  [  0.02   0.03   0.05  ]
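Steps 2 through 5 of Problem 6 form one chain (output distribution, joint matrix, backward matrix, decision rule), which can be sketched in a few lines. The matrix values below are taken from the solution arithmetic; the variable names are illustrative.

```python
# Forward matrix: rows are P(bj/ai), from the Problem 6 solution arithmetic
F = [[0.7, 0.13, 0.17],
     [0.08, 0.9, 0.02],
     [0.2, 0.3, 0.5]]
Pa = [0.4, 0.5, 0.1]  # input distribution P(ai)

# Joint matrix: P(ai, bj) = P(ai) * P(bj/ai)
J = [[Pa[i] * F[i][j] for j in range(3)] for i in range(3)]

# Output distribution: column sums of the joint matrix
Pb = [sum(J[i][j] for i in range(3)) for j in range(3)]

# Backward matrix: P(ai/bj) = P(ai, bj) / P(bj)  (Bayes' rule)
B = [[J[i][j] / Pb[j] for j in range(3)] for i in range(3)]

# MAP decision rule: for each output bj, pick the most likely input ai
decision = [max(range(3), key=lambda i: J[i][j]) for j in range(3)]
print([round(x, 3) for x in Pb])  # [0.34, 0.532, 0.128]
print(decision)                   # [0, 1, 0]: d(b1)=a1, d(b2)=a2, d(b3)=a1
```

Note that the rule maps b3 back to a1, not a3, because the joint probability 0.068 for (a1, b3) exceeds 0.05 for (a3, b3).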