Information Theory and Coding - Chapter 4

The document discusses discrete memoryless channels and how they can be represented using forward and backward conditional channel matrices to describe the probabilistic relationships between the input and output alphabets. It also examines how to calculate important channel parameters like the output probability distribution, joint probabilities, and the receiver's decision rule to minimize the probability of error. Examples are provided to illustrate finding the channel input and output distributions, decision rules, and error probabilities.

Mustaqbal University

College of Engineering & Computer Sciences


Electronics and Communication Engineering Department

Course: EE301: Probability Theory and Applications

Part B
Prerequisite: Stat 219

Text Book: B.P. Lathi, "Modern Digital and Analog Communication Systems", 3rd edition, Oxford University Press, Inc., 1998
Reference: A. Papoulis, "Probability, Random Variables, and Stochastic Processes", McGraw-Hill, 2005

Dr. Aref Hassan Kurdali


Discrete Memoryless Channels
The issue of information transmission is now considered, with particular emphasis on reliability (error-free transmission). Consider a discrete memoryless channel: a statistical model with an input A and an output B that is a noisy version of A; both A and B are discrete random variables. At every unit of time, the channel accepts an input symbol ai selected from an input alphabet A = {a1, a2, ..., ar} and, in response, emits an output symbol bj from an output alphabet B = {b1, b2, ..., bs}. The channel is said to be "discrete" when both alphabets A and B have finite sizes. It is said to be "memoryless" when the current output symbol depends only on the current input symbol and not on any of the previous ones. The input alphabet A and the output alphabet B need not have the same size.
A convenient way of describing a discrete memoryless channel is to arrange the various conditional probabilities of the channel at the transmitter in the form of a forward conditional channel matrix FM as follows:

FM = [ p(b1/a1) p(b2/a1) ... p(bs/a1)
       p(b1/a2) p(b2/a2) ... p(bs/a2)
       ...
       p(b1/ar) p(b2/ar) ... p(bs/ar) ]

The ith row represents the output probability distribution given that the input symbol ai is transmitted. Thus, the sum over any row must equal one.
Similarly, at the receiver, the backward conditional channel matrix can be calculated and arranged as follows:

BM = [ p(a1/b1) p(a2/b1) ... p(ar/b1)
       p(a1/b2) p(a2/b2) ... p(ar/b2)
       ...
       p(a1/bs) p(a2/bs) ... p(ar/bs) ]

The jth row represents the input probability distribution given that the output symbol bj is received. Thus, the sum over any row must also equal one.
The joint probability p(ai , bj) = p(bj/ai) p(ai) = p(ai/bj) p(bj) = p(bj , ai)
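A discrete memoryless channel can be simulated directly from its forward matrix: each channel use draws an output symbol from the row of FM selected by the input symbol. A minimal sketch, using the ternary forward matrix that appears later in Problem 6 as a concrete example:

```python
import random

# Forward channel matrix FM: row i is the output distribution given input a_i.
# These numbers are the ternary matrix used in Problem 6.
FM = [[0.7, 0.13, 0.17],
      [0.08, 0.9, 0.02],
      [0.2, 0.3, 0.5]]

# Each row must be a probability distribution, i.e. sum to one.
for row in FM:
    assert abs(sum(row) - 1.0) < 1e-9

def channel_output(i, rng=random):
    """Simulate one channel use for input symbol a_i:
    draw an output index j with probability p(b_j / a_i)."""
    return rng.choices(range(len(FM[i])), weights=FM[i])[0]

# Because the channel is memoryless, repeated uses are independent draws.
samples = [channel_output(0) for _ in range(10000)]
print(samples[:10])
```

The row-sum assertion is the "sum over any row must equal one" property stated above; a matrix failing it is not a valid channel.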
[Figure: forward and backward channel diagrams, linking the transmitter symbols a1, a2, a3 (with probabilities P(a1), P(a2), P(a3)) to the receiver symbols b1, b2, b3 (with probabilities P(b1), P(b2), P(b3)).]


Forward Conditional Channel Matrix: Conditional output probability distribution at the transmitter

[Figure: ideal and practical forward channel matrices.]

The output probabilities can be calculated as:


P(b1) = P(b1/a1)P(a1) + P(b1/a2)P(a2) + P(b1/a3)P(a3) = P(a1,b1) + P(a2,b1) + P(a3,b1)
P(b2) = P(b2/a1)P(a1) + P(b2/a2)P(a2) + P(b2/a3)P(a3) = P(a1,b2) + P(a2,b2) + P(a3,b2)
P(b3) = P(b3/a1)P(a1) + P(b3/a2)P(a2) + P(b3/a3)P(a3) = P(a1,b3) + P(a2,b3) + P(a3,b3)

Backward Conditional Channel Matrix: Conditional input probability distribution at the receiver

       [ P(a1/b1) P(a2/b1) P(a3/b1) ]   [ 0.8  0.15 0.05 ]
  BM = [ P(a1/b2) P(a2/b2) P(a3/b2) ] = [ 0.07 0.9  0.03 ]
       [ P(a1/b3) P(a2/b3) P(a3/b3) ]   [ 0.01 0.29 0.7  ]
The marginal probability distribution of the output random variable B {p(b1), p(b2), ..., p(bs)} can be calculated from the a priori probability distribution of the input random variable A {p(a1), p(a2), ..., p(ar)} and the forward conditional channel matrix, where

P(bj) = Σ p(bj/ai) p(ai), summed over i = 1, 2, ..., r

i.e., B = FM^T A (FM^T is the transpose of FM)

Note that the input probability distribution A and the forward channel matrix FM are independent of each other; in practice, both must be determined separately and given.
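The matrix relation B = FM^T A can be sketched in a few lines of Python. As a concrete check, the sketch below uses the forward matrix and input distribution given later in Problem 6:

```python
# Output distribution B = FM^T A, i.e. P(b_j) = sum_i p(b_j/a_i) p(a_i).
# The numbers are the forward matrix and input distribution of Problem 6.
FM = [[0.7, 0.13, 0.17],
      [0.08, 0.9, 0.02],
      [0.2, 0.3, 0.5]]
A = [0.4, 0.5, 0.1]

B = [sum(FM[i][j] * A[i] for i in range(len(A))) for j in range(len(FM[0]))]
print(B)  # ≈ [0.34, 0.532, 0.128]
```

Since the rows of FM each sum to one and A sums to one, the computed B also sums to one, as any output distribution must.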
BSC forward transition matrix

The binary symmetric channel (BSC) is of great theoretical interest and practical importance. It is a special case of the discrete memoryless channel with r = s = 2. The channel has two input symbols (x0 = 0, x1 = 1) and two output symbols (y0 = 0, y1 = 1). The channel is symmetric because the probability of receiving a 1 if a 0 is sent is the same as the probability of receiving a 0 if a 1 is sent. This conditional probability of error is denoted by p. The forward channel matrix and diagram of a binary symmetric channel are shown above in Figure 9.8.
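The BSC forward matrix has 1 - p on the diagonal and p off the diagonal. A minimal sketch, with p = 0.1 chosen here purely for illustration (the source does not fix a value for this section):

```python
# Forward matrix of a binary symmetric channel with crossover probability p.
# p = 0.1 is an assumed illustrative value.
p = 0.1
FM = [[1 - p, p],
      [p, 1 - p]]

def output_dist(priors):
    """P(y_j) = sum_i p(y_j/x_i) P(x_i)."""
    return [sum(FM[i][j] * priors[i] for i in range(2)) for j in range(2)]

# With equiprobable inputs the outputs are also equiprobable,
# whatever the value of p (the calculation Problem 1(a) asks for).
print(output_dist([0.5, 0.5]))  # ≈ [0.5, 0.5]
```

Changing `priors` to an unequal distribution reproduces the part (b) style of calculation in Problem 1.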
Problem 1
Consider the transition probability diagram of a binary symmetric channel shown
in the figure below. The input binary symbols 0 and 1 occur with equal
probability.
a) Find the probabilities of the binary symbols 0 and 1 appearing at the channel
output.
b) Repeat the calculation in (a), assuming that the input binary symbols 0 and 1
occur with probabilities and , respectively.
Problem 1 - Solution
Problem 1 - Solution
Problem 2
Given the following forward ternary channel matrix:

where the channel input probabilities are:

a) Complete: ,
b) Draw the forward channel diagram.
c) Calculate the output probabilities.
d) Calculate the following joint probabilities: .
e) Calculate the following conditional probabilities: , .
Problem 2 – Solution
=>

a) 0.1 , 0.08

b) The forward channel diagram:
[Diagram: channel transitions with probabilities including 0.8, 0.13, and 0.07.]
Problem 2 – Solution
=>

c) The output probabilities.

d) The joint probabilities:

e) Calculate the following conditional probabilities:


Receiver Decision Rules
Suppose the symbol bj has been received; which input symbol ai should the receiver decide was sent? d(bj) denotes the receiver's decision rule whenever bj is received. Using the backward transition matrix and the maximum a posteriori rule over each row:

d(bj) = a*, where p(a*/bj) ≥ p(ai/bj) for all i

or, in terms of the joint probabilities:

p(a*, bj) ≥ p(ai, bj) for all i

or, in terms of the forward conditional probabilities:

p(bj/a*) p(a*) ≥ p(bj/ai) p(ai) for all i

If the input probability distribution is uniform, this reduces to the maximum likelihood rule:

d(bj) = a*, where p(bj/a*) ≥ p(bj/ai) for all i
Probability of Error Calculations
For a given channel matrix and a given receiver decision rule, the conditional probability of error is p(E/bj) = 1 - p(d(bj)/bj), and the statistical average of error PE can be calculated as:

PE = Σj p(bj) p(E/bj) = 1 - Σj p(d(bj), bj)

If the input probability distribution is uniform, then

PE = 1 - (1/r) Σj p(bj/d(bj))
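The decision rule and the error average above can be sketched together in Python. As a concrete check, the sketch uses the forward matrix of Problem 8 with its input distribution (a), since those numbers are given in this document:

```python
# MAP decision rule and average error probability from a forward matrix
# and input priors. Numbers: Problem 8's matrix with priors from part (a).
FM = [[0.9, 0.03, 0.07],
      [0.08, 0.7, 0.22],
      [0.1, 0.4, 0.5]]
priors = [0.8, 0.15, 0.05]

r, s = len(FM), len(FM[0])
# Joint probabilities p(a_i, b_j) = p(b_j/a_i) p(a_i).
joint = [[FM[i][j] * priors[i] for j in range(s)] for i in range(r)]

# d(b_j) = argmax_i p(a_i, b_j); PE = 1 - sum_j p(d(b_j), b_j).
decision = [max(range(r), key=lambda i: joint[i][j]) for j in range(s)]
PE = 1 - sum(joint[decision[j]][j] for j in range(s))
print(decision, PE)  # [0, 1, 0], PE ≈ 0.119
```

With these priors the rule maps b1 and b3 back to a1 and b2 back to a2; the same function applied with the other prior vectors answers the remaining parts of Problem 8.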
Problem 3
Consider the following backward 4-ary channel matrix

What would the receiver decision rule be for: , , .


Problem 3 - Solution
Consider the following backward 4-ary channel matrix

The receiver decision rule would be


Problem 4
A 4-symbol memoryless source S = {a, b, c, d} is encoded using the following
binary Shannon-Fano encoder:

a 1100
b 10
c 0110
d 00

The source probability distribution is: p(a)=p(c)=0.0625, p(b)=p(d)=0.4375.


If this encoder's output binary stream is applied to a binary channel, calculate the
channel input probability distribution: P(0) and P(1).
Problem 4 - Solution
a 1100
b 10
c 0110
d 00

p(a)=p(c)=0.0625, p(b)=p(d)=0.4375.
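The calculation can be sketched directly from the code table given above: P(0) is the expected number of zeros per codeword divided by the expected codeword length. A minimal sketch:

```python
# Channel input bit distribution induced by the Problem 4 code:
# P(0) = E[#zeros per codeword] / E[codeword length].
codes = {'a': '1100', 'b': '10', 'c': '0110', 'd': '00'}
prob = {'a': 0.0625, 'b': 0.4375, 'c': 0.0625, 'd': 0.4375}

avg_len = sum(prob[s] * len(codes[s]) for s in codes)       # 2.25 bits
avg_zeros = sum(prob[s] * codes[s].count('0') for s in codes)

P0 = avg_zeros / avg_len
P1 = 1 - P0
print(P0, P1)  # ≈ 0.6944, 0.3056
```

The result P(0) = 25/36 ≈ 0.694 and P(1) = 11/36 ≈ 0.306 follows from the heavy use of 0-rich codewords by the high-probability symbols b and d.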
Problem 5
If the shown ternary stream is applied to a ternary channel with the following forward channel matrix:

a) Calculate the channel input probability distribution.
b) Calculate the channel output probability distribution.
c) Find the receiver decision rule and its probability of error.
d) Calculate the following conditional probabilities: ,

Prob   Huffman Ternary Code
0.25   1
0.2    2
0.2    00
0.16   02
0.05   011
0.05   012
0.04   0100
0.04   0101
0.01   0102
Problem 5 - Solution

Prob   Huffman Ternary Code
0.25   1
0.2    2
0.2    00
0.16   02
0.05   011
0.05   012
0.04   0100
0.04   0101
0.01   0102

a) The channel input probability distribution:

b) The channel output probability distribution:
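The input distribution of part (a) follows from the code table by the same counting argument as in Problem 4, now over the three ternary symbols: P(k) is the expected number of occurrences of symbol k per codeword divided by the expected codeword length. A sketch of that calculation (the numeric results are my own evaluation of the given table, not taken from the original slide):

```python
# Ternary channel input distribution induced by the Problem 5 Huffman code:
# P(k) = E[#occurrences of symbol k per codeword] / E[codeword length].
table = [(0.25, '1'), (0.2, '2'), (0.2, '00'), (0.16, '02'),
         (0.05, '011'), (0.05, '012'), (0.04, '0100'),
         (0.04, '0101'), (0.01, '0102')]

avg_len = sum(p * len(c) for p, c in table)                 # ≈ 1.83
P = [sum(p * c.count(k) for p, c in table) / avg_len for k in '012']
print(avg_len, P)  # ≈ 1.83, [0.4809, 0.2896, 0.2295]
```

Part (b) would then follow by applying the forward channel matrix (elided in the source) to this input distribution.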
Problem 5 - Solution

c) The receiver decision rule:
d(0) = 0
d(1) = 1
d(2) = 2

Its probability of error:

d) 0.599
Problem 6
Given the following forward ternary channel matrix:

FM = [ 0.7  0.13 0.17
       0.08 0.9  0.02
       0.2  0.3  0.5 ]

where the channel input probability distribution is [0.4, 0.5, 0.1].
1. Draw the forward channel diagram.
2. Calculate the output probability distribution.
3. Calculate the joint channel matrix.
4. Calculate the backward channel matrix.
5. Find the receiver decision rule.
Problem 6 - Solution
2. The output probability distribution
P(b1) = P(b1/a1)P(a1) + P(b1/a2)P(a2) + P(b1/a3) P(a3)
= P(a1,b1) + P(a2,b1) + P(a3,b1)
= 0.4×0.7 + 0.5×0.08 + 0.1×0.2
= 0.28 + 0.04 + 0.02 = 0.34
P(b2) = 0.13×0.4 + 0.9×0.5 +0.3×0.1 = 0.052 + 0.45 + 0.03 = 0.532
P(b3) = 0.17×0.4 + 0.02×0.5 + 0.5×0.1 = 0.068 + 0.01 + 0.05 = 0.128
 
3. The joint channel matrix, p(ai, bj) = p(bj/ai) p(ai):

JM = [ 0.28  0.052 0.068
       0.04  0.45  0.01
       0.02  0.03  0.05 ]

4. The backward channel matrix, p(ai/bj) = p(ai, bj) / p(bj), with row j conditioned on bj:

BW = [ 0.8235 0.1176 0.0588
       0.0977 0.8459 0.0564
       0.5313 0.0781 0.3906 ]

5. The receiver decision rule (largest joint probability in each output column):

d(b1) = a1, d(b2) = a2 & d(b3) = a1
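The whole Problem 6 chain, output distribution, joint matrix, backward matrix via Bayes' rule, and decision rule, can be sketched from the given numbers:

```python
# Joint and backward matrices for Problem 6, via
# p(a_i, b_j) = p(b_j/a_i) p(a_i) and p(a_i/b_j) = p(a_i, b_j) / p(b_j).
FM = [[0.7, 0.13, 0.17],
      [0.08, 0.9, 0.02],
      [0.2, 0.3, 0.5]]
A = [0.4, 0.5, 0.1]

r, s = len(FM), len(FM[0])
joint = [[FM[i][j] * A[i] for j in range(s)] for i in range(r)]
B = [sum(joint[i][j] for i in range(r)) for j in range(s)]

# Backward matrix: row j is the input distribution given b_j was received.
BW = [[joint[i][j] / B[j] for i in range(r)] for j in range(s)]
for row in BW:
    assert abs(sum(row) - 1.0) < 1e-9  # each row is a distribution

decision = [max(range(r), key=lambda i: joint[i][j]) for j in range(s)]
print(decision)  # [0, 1, 0], i.e. d(b1)=a1, d(b2)=a2, d(b3)=a1
```

Note that the decision rule can be read off the joint matrix directly; normalizing by P(bj) to get the backward matrix does not change which input maximizes each column.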
Problem 7
A source S has six symbols with probability distribution [0.55, 0.1, 0.15, 0.13,
0.03, and 0.04].
 
a) Construct a ternary Huffman code.
b) If the encoder output stream is transmitted via a ternary symmetric channel
with probability of error p = 0.1. Calculate the input probability distribution
of the channel.
c) Write the forward ternary channel matrix.
d) Calculate the output probability distribution.
e) Calculate the joint channel matrix.
f) Calculate the backward channel matrix.
g) Find the receiver decision rule and calculate its probability of error PE.
h) How could you reduce the probability of error PE?
Problem 7 - Solution
Ternary symmetric channel forward diagram:
Problem 7 - Solution
Problem 8
Given the following forward ternary channel matrix:
FM = [ 0.9  0.03 0.07
       0.08 0.7  0.22
       0.1  0.4  0.5 ]

Find the receiver decision rule and its probability of error for each input
probability distribution [p(a1), p(a2), p(a3)]:
a) [0.8, 0.15, 0.05]
b) [0.15, 0.8, 0.05]
c) [0.1, 0.15, 0.8]
d) Comment
