Problems Information Theory
2. A source emits one of four symbols $s_0$, $s_1$, $s_2$, and $s_3$ with probabilities 1/3, 1/6, 1/4,
and 1/4, respectively. The successive symbols emitted by the source are statistically
independent. Calculate the entropy of the source. [Haykin 9.3]
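The solutions below do not cover this problem; as a quick numeric check (a sketch from the definition $H = -\sum_i p_i \log_2 p_i$, not part of the original solutions), the entropy comes out near 1.96 bits/symbol:

    # Numeric check for problem 2: entropy of the four-symbol source.
    from math import log2

    probs = [1/3, 1/6, 1/4, 1/4]          # given symbol probabilities
    H = -sum(p * log2(p) for p in probs)
    print(f"H = {H:.4f} bits/symbol")      # approx. 1.9591 bits/symbol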
3. The sample function of a Gaussian process of zero mean and unit variance is
uniformly sampled and then applied to a uniform quantizer having the input-output
amplitude characteristic shown in Fig. 1. Calculate the entropy of the quantizer output.
[Haykin 9.5]
[Fig. 1: input-output amplitude characteristic of the uniform quantizer; output levels at $\pm 0.5$ and $\pm 1.5$]
Given:
$$\frac{1}{\sqrt{2\pi}} \int_x^{\infty} e^{-y^2/2}\, dy =
\begin{cases} 0.5 & \text{if } x = 0 \\ 0.1611 & \text{if } x = 1 \end{cases}$$
$r_s = 1000$ symbols/sec
6. Find the rate of information transmission over the discrete channel shown in Fig. 2.
[Fig. 2: discrete channel with inputs $X \in \{1, 2, 3\}$ and outputs $Y \in \{1, 2, 3\}$; each input is received correctly with probability 0.8, with crossover probabilities of 0.1 and 0.2 as shown in the diagram]
$r_s = 1000$ symbols/sec
$P(X = 1) = P(X = 2) = P(X = 3) = 1/3$
7. Two binary symmetric channels are connected in cascade. Find the overall channel
capacity of the cascaded connection, assuming that both channels have the same
transition probability $p$. [Haykin 9.22]
8. An analog signal has a 4 kHz bandwidth. The signal is sampled at 2.5 times the
Nyquist rate and each sample is quantized into one of 256 equally likely levels.
Assume that the successive samples are statistically independent.
(a) What is the information rate of this source?
(b) Can the output of this source be transmitted without errors over a Gaussian
channel with a bandwidth of 50 kHz and an S/N ratio of 23 dB?
(c) Can the output of this source still be transmitted without errors if the S/N ratio drops to 10 dB?
9. A black-and-white television picture may be viewed as consisting of approximately
$3 \times 10^5$ elements, each of which may occupy one of 10 distinct brightness levels with
equal probability. Assume that (1) the rate of transmission is 30 picture frames per
second, and (2) the signal-to-noise ratio is 30 dB. Using the information capacity
theorem, calculate the minimum bandwidth required to support the transmission of
the resulting video signal.
(Note: As a matter of interest, commercial television transmissions actually employ a
bandwidth of 4.2 MHz, which fits into an allocated bandwidth of 6 MHz.)
[Haykin 9.31]
Solution
1. $I = \log_2(1/p)$ bits
3. $H = -\sum_{i=0}^{3} p(x_i) \log_2 p(x_i)$
$$\frac{1}{\sqrt{2\pi}} \int_1^{\infty} \exp\left(-\frac{y^2}{2}\right) dy = 0.1611$$
$$\frac{1}{\sqrt{2\pi}} \int_0^{1} \exp\left(-\frac{y^2}{2}\right) dy = 0.5 - 0.1611 = 0.3389$$
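By symmetry the four quantizer levels $\pm 1.5$ and $\pm 0.5$ therefore occur with probabilities 0.1611, 0.3389, 0.3389, and 0.1611. The sheet stops before the final sum; completing it (this last step is arithmetic, not part of the original):

$$H = -2(0.1611 \log_2 0.1611) - 2(0.3389 \log_2 0.3389) \approx 1.91 \text{ bits/symbol}$$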
4. $H(S^2) = 2H(S) = 2.158$ bits/symbol
5. (a) $H(Y|X) = \sum_{i=0}^{1} P(X = i)\, H(Y|X = i) = 0.555$ bits/symbol
(c) $I(X;Y) = H(X) - H(X|Y) = 0.276$ bits/symbol,
or $I(X;Y) = H(Y) - H(Y|X) = 0.277$ bits/symbol (the small difference is rounding).
Therefore, $I(X;Y) \cdot r_s = 276$ bits/second
6.
$P(Y = 1) = P(Y = 1|X = 1)P(X = 1) + P(Y = 1|X = 2)P(X = 2) + P(Y = 1|X = 3)P(X = 3) = \ldots$
$P(Y = 2) = P(Y = 2|X = 1)P(X = 1) + P(Y = 2|X = 2)P(X = 2) + P(Y = 2|X = 3)P(X = 3) = \ldots$
$P(Y = 3) = P(Y = 3|X = 1)P(X = 1) + P(Y = 3|X = 2)P(X = 2) + P(Y = 3|X = 3)P(X = 3) = \ldots$
$H(Y) = -\sum_{i=1}^{3} P(Y = i) \log_2 P(Y = i) = \ldots$
$H(Y|X) = \sum_{i=1}^{3} P(X = i)\, H(Y|X = i) = \ldots$
$I(X;Y) = H(Y) - H(Y|X) = \ldots$ bits/symbol
Information rate $= r_s \, I(X;Y) = \ldots$ bits/second
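As an illustration of these steps, here is a numeric sketch under an assumed transition matrix consistent with the probabilities shown in Fig. 2 (0.8 for correct reception, 0.1 and 0.2 for crossovers); the exact matrix P_Y_given_X is an assumption, not the original figure:

    # Problem 6 sketch: I(X;Y) and information rate for an ASSUMED
    # transition matrix (the original Fig. 2 layout may differ).
    from math import log2

    P_X = [1/3, 1/3, 1/3]          # given: equiprobable inputs
    P_Y_given_X = [                 # assumed rows P(Y=j | X=i)
        [0.8, 0.2, 0.0],
        [0.1, 0.8, 0.1],
        [0.0, 0.2, 0.8],
    ]
    rs = 1000                       # symbols/sec (given)

    P_Y = [sum(P_X[i] * P_Y_given_X[i][j] for i in range(3)) for j in range(3)]
    H_Y = -sum(p * log2(p) for p in P_Y if p > 0)
    H_Y_given_X = sum(
        P_X[i] * -sum(p * log2(p) for p in P_Y_given_X[i] if p > 0)
        for i in range(3)
    )
    I_XY = H_Y - H_Y_given_X
    print(f"I(X;Y) = {I_XY:.3f} bits/symbol, rate = {rs * I_XY:.0f} bits/s")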
7. Using the result for the BSC, we have:
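The derivation itself is missing from the sheet; the standard argument (a sketch) is that a bit traverses the cascade incorrectly only if it is flipped in exactly one of the two channels, so for per-channel crossover probability $p$:

$$p_{\text{cascade}} = p(1-p) + (1-p)p = 2p(1-p), \qquad C = 1 - H_b\!\left(2p(1-p)\right)$$

where $H_b(q) = -q \log_2 q - (1-q) \log_2 (1-q)$ is the binary entropy function.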
8. (a) $p_i = \frac{1}{256}$, $i = 1, 2, \ldots, 256$
Therefore, $H(X) = \log_2 256 = 8$ bits/symbol
The sampling rate is $f_s = 2.5 \times 2 \times 4\text{ kHz} = 20$ kHz, so
Information rate $= f_s H(X) = 160$ kbits/second
(b) Channel capacity $= B \log_2(1 + S/N) = 50\text{k} \times \log_2(1 + 199.5) = 382$ kbps,
where 23 dB corresponds to $S/N = 10^{2.3} \approx 199.5$.
Since the channel capacity exceeds the information rate, the output of this source can be
transmitted without errors.
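A quick numeric check of (a) and (b) (a sketch; the bandwidth, sampling factor, and S/N are the problem's givens):

    # Problem 8 check: source information rate vs. channel capacity.
    from math import log2

    fs = 2.5 * 2 * 4e3                   # 2.5 x Nyquist rate of a 4 kHz signal
    R = fs * log2(256)                   # 8 bits/sample -> 160e3 bits/s
    C = 50e3 * log2(1 + 10**(23/10))     # 23 dB -> S/N = 199.5
    print(f"R = {R/1e3:.0f} kb/s, C = {C/1e3:.0f} kb/s, C > R: {C > R}")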
(c) For an S/N ratio of 10 dB, $S/N = 10$, so channel capacity $= 50\text{k} \times \log_2(1 + 10) = 173$ kbps.
This still exceeds the 160 kbps information rate, so error-free transmission remains possible.
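The sheet gives no worked solution for problem 9; a sketch using the information capacity theorem (these numbers are a reconstruction, not the sheet's):

$$R = 3 \times 10^5 \times \log_2 10 \times 30 \approx 2.99 \times 10^7 \text{ bits/s}$$

With $S/N = 30\text{ dB} = 1000$, the minimum bandwidth satisfies

$$B \ge \frac{R}{\log_2(1 + 1000)} \approx \frac{2.99 \times 10^7}{9.97} \approx 3 \text{ MHz}$$

which is consistent with the 4.2 MHz bandwidth noted in the problem statement.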