
BITS-Pilani, K.K. Birla Goa Campus

Department of Electrical & Electronics Engineering
Information Theory & Coding (ECE-F344)
Midsem Examination, Date: March 04, 2020

Duration: 90 Mins | Weightage: 30% | Max Marks: 60

Note: This is a closed-book examination; all questions are compulsory.

1. State whether the following statements are true or false. If false, give the correct statement.
(2x5=10 marks)
(a) I(X; Y) = D[p(x, y)||p(x)p(y)] ≥ 0, with equality if and only if X and Y are dependent on each other.
(b) Relative entropy D(p||q) is concave in the pair (p, q) while the entropy H(p) is a convex
function of p.
(c) The relationship between the joint entropy & the conditional entropy is exhibited by the fact that the entropy of a pair of random variables is the entropy of one random variable plus the joint entropy of the other random variable.
(d) The correct order of code efficiency is FLC > Binary tree VLC > Lempel-Ziv > Arithmetic > Huffman.
(e) I(X; Y ) = H(X) − H(X|Y ) = H(Y |X) − H(Y ) = H(X) + H(Y ) − H(X, Y ).
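
Each statement can be sanity-checked numerically before answering. The sketch below (Python; the joint distribution is an arbitrary assumed example, not one given in the paper) evaluates the quantities appearing in (a) and (e):

```python
import numpy as np

# Arbitrary example joint pmf p(x, y) (rows: x, cols: y) -- any valid pmf works.
p_xy = np.array([[0.25, 0.10],
                 [0.05, 0.60]])

p_x = p_xy.sum(axis=1)          # marginal p(x)
p_y = p_xy.sum(axis=0)          # marginal p(y)

def H(p):
    """Entropy in bits; terms with p = 0 contribute nothing."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

H_xy = H(p_xy.ravel())                     # joint entropy H(X, Y)
H_x, H_y = H(p_x), H(p_y)
H_x_given_y = H_xy - H_y                   # chain rule: H(X|Y) = H(X,Y) - H(Y)
H_y_given_x = H_xy - H_x

# I(X;Y) computed as the relative entropy D(p(x,y) || p(x)p(y)), as in (a)
prod = np.outer(p_x, p_y)
I = np.sum(p_xy * np.log2(p_xy / prod))

print(I, H_x - H_x_given_y)                # equal
print(I, H_x + H_y - H_xy)                 # equal
print(I, H_y_given_x - H_y)                # NOT equal in general
```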

2. Solve the following short-answer problems: (2x5=10 marks)


(a) Define & prove the Kraft inequality.
(b) Let exponential distributions p(x) and q(x) have respective means λ1 and λ2. Determine the KL distance between p and q (a numerical check is sketched after this question).
(c) For a (23,12,7) binary code, the error-correcting capability is 3. If it is used over a binary symmetric channel (BSC) with bit error probability p = 0.01, determine the word error probability (see the sketch after this question).
(d) For a source with entropy H(X), prove that the entropy of a B-symbol block is B·H(X).
(e) For a (5,3) code over GF(4), the generator matrix is given by:

        | 1 0 0 1 1 |
    G = | 0 1 0 1 2 |
        | 0 0 1 1 3 |

i. Find the parity check matrix.


ii. Determine how many errors the code can detect and correct.
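
For part (b): with p and q parameterized by their means (so p(x) = (1/λ1)·e^(−x/λ1), an assumed reading of the question), the closed form is D(p||q) = ln(λ2/λ1) + λ1/λ2 − 1 nats. A minimal numerical check:

```python
import numpy as np

# Example means, not values from the paper.
lam1, lam2 = 2.0, 5.0

closed_form = np.log(lam2 / lam1) + lam1 / lam2 - 1.0   # in nats

# Numerical check: integrate p(x) * ln(p(x)/q(x)) on a fine grid.
x = np.linspace(0.0, 200.0, 400_000)
p = np.exp(-x / lam1) / lam1
q = np.exp(-x / lam2) / lam2
numeric = np.trapz(p * np.log(p / q), x)

print(closed_form, numeric)   # should agree to several decimal places
```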
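For part (c): a t-error-correcting code used with bounded-distance decoding fails exactly when more than t of the n transmitted bits are flipped; since the (23,12,7) Golay code is perfect, this gives the exact word error probability:

```python
from math import comb

n, t, p = 23, 3, 0.01

# Word error = more than t bit errors out of n, each occurring with prob p.
P_we = sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(t + 1, n + 1))
print(P_we)   # ~7.6e-5, dominated by the 4-error term
```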
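For part (e): G = [I₃ | P] is already systematic, so H = [−Pᵀ | I₂], and in characteristic 2, −x = x. A sketch verifying G·Hᵀ = 0, assuming the paper's symbols 2 and 3 denote α and α + 1 in GF(4):

```python
# GF(4) arithmetic with elements {0, 1, 2, 3}, reading 2 as alpha and 3 as
# alpha + 1 (an assumed notation); addition is XOR of the 2-bit labels, and
# multiplication follows from alpha^2 = alpha + 1.
MUL = [[0, 0, 0, 0],
       [0, 1, 2, 3],
       [0, 2, 3, 1],
       [0, 3, 1, 2]]

G = [[1, 0, 0, 1, 1],
     [0, 1, 0, 1, 2],
     [0, 0, 1, 1, 3]]

# G = [I | P] with P = the last two columns of G, so H = [P^T | I_2].
H = [[1, 1, 1, 1, 0],
     [1, 2, 3, 0, 1]]

for g in G:
    for h in H:
        acc = 0
        for a, b in zip(g, h):
            acc ^= MUL[a][b]          # GF(4) addition is XOR
        assert acc == 0, "row of G not orthogonal to row of H"
print("G * H^T = 0 over GF(4): H is a valid parity-check matrix")
```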

3. (a) An input alphabet (a keyboard on a word processor) consists of 100 characters. (5 marks)
i. If the keystrokes are encoded by a fixed-length code, determine the required number
of bits for the encoding.
ii. We make the simplifying assumption that 10 of the keystrokes are equally likely
and that each occurs with probability 0.05. We also assume that the remaining 90
keystrokes are equally likely. Determine the average number of bits required to encode
this alphabet using a variable-length Huffman code. (Hint: you can consider 90
keystrokes together.)
(b) Consider a DMS with source probabilities {0.2, 0.2, 0.15, 0.15, 0.10, 0.10, 0.05, 0.05}. (5 marks)
i. Determine an efficient fixed-length code for the source.
ii. Determine the Huffman code for this source.
iii. Determine source entropy & compare the two codes in terms of average length.
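
For 3(a)(i), a fixed-length code needs ⌈log2 100⌉ = 7 bits. For 3(b), a minimal heap-based Huffman construction (Python; a standard textbook implementation, not one prescribed by the paper) that reports the entropy and both average lengths:

```python
import heapq
from math import log2

probs = [0.2, 0.2, 0.15, 0.15, 0.10, 0.10, 0.05, 0.05]

# Huffman: repeatedly merge the two least-probable subtrees; every symbol
# inside a merged subtree gains one more bit of codeword length.
heap = [(p, i, [i]) for i, p in enumerate(probs)]
heapq.heapify(heap)
lengths = [0] * len(probs)
tie = len(probs)            # tie-breaker so tuples never compare the lists
while len(heap) > 1:
    p1, _, s1 = heapq.heappop(heap)
    p2, _, s2 = heapq.heappop(heap)
    for s in s1 + s2:
        lengths[s] += 1
    heapq.heappush(heap, (p1 + p2, tie, s1 + s2))
    tie += 1

H = -sum(p * log2(p) for p in probs)                 # entropy, bits/symbol
L_huff = sum(p, l) if False else sum(p * l for p, l in zip(probs, lengths))
L_fixed = 3                                          # ceil(log2(8)) for part i
print(f"H = {H:.3f}, Huffman avg = {L_huff:.3f}, fixed = {L_fixed}")
```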
4. Suppose that in BITS-Goa, 3/4 of the ECE students pass and 1/4 fail. Of those who pass, 10
percent own iPhones, while 50 percent of the failing students own iPhones. All of the iPhone
owning students are Jio subscribers, while 40 percent of those who do not own iPhones but
pass, as well as 40 percent of those who do not own iPhones but fail, are Jio subscribers. (10 marks)
(a) How much information is conveyed about a student’s academic standing by specifying
whether or not he owns an iPhone?
(b) How much information is conveyed about a student’s academic standing by specifying
whether or not he is a Jio subscriber?
(c) If a student’s academic standing, iPhone-owning status, and Jio subscriber status are
transmitted by three successive binary digits, how much information is conveyed by each
digit?
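
A sketch for parts (a) and (b) (Python; the variable names are assumed for illustration, and part (c) follows by the same joint-distribution bookkeeping):

```python
from math import log2

def H(dist):
    """Entropy in bits of a probability list."""
    return -sum(p * log2(p) for p in dist if p > 0)

P_pass, P_fail = 3/4, 1/4

# Joint P(standing, owns iPhone): 10% of passing and 50% of failing
# students own one.
sp = {("pass", 1): P_pass * 0.10, ("pass", 0): P_pass * 0.90,
      ("fail", 1): P_fail * 0.50, ("fail", 0): P_fail * 0.50}

H_S = H([P_pass, P_fail])

def mutual_info(joint):
    """I(S; Z) = H(S) - H(S|Z) for a joint dict keyed by (standing, z)."""
    I = H_S
    for z in (0, 1):
        P_z = joint[("pass", z)] + joint[("fail", z)]
        I -= P_z * H([joint[(s, z)] / P_z for s in ("pass", "fail")])
    return I

print("I(standing; iPhone) =", mutual_info(sp), "bits")

# Joint P(standing, Jio): all iPhone owners subscribe; 40% of non-owners do.
sj = {(s, 1): sp[(s, 1)] + 0.4 * sp[(s, 0)] for s in ("pass", "fail")}
sj.update({(s, 0): 0.6 * sp[(s, 0)] for s in ("pass", "fail")})

print("I(standing; Jio) =", mutual_info(sj), "bits")
```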

5. Consider the following generator matrix over GF(2). (10 marks)

        | 1 0 1 0 0 |
    G = | 1 0 0 1 1 |
        | 0 1 0 1 0 |

(a) Generate all possible codewords using this matrix.


(b) Find the parity check matrix, H.
(c) Find the generator matrix of an equivalent systematic code.
(d) What is the minimum distance of this code?
(e) How many errors can this code detect?
(f) Write down the set of error patterns this code can detect.
(g) How many errors can this code correct?
(h) What is the probability of symbol error if we use this encoding scheme? Compare it with
the uncoded probability of error.
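
A brute-force sketch for parts (a), (d), and (f)–(g) (Python; for a linear code the minimum distance equals the minimum nonzero codeword weight, and an error pattern goes undetected exactly when it is itself a nonzero codeword):

```python
from itertools import product

G = [[1, 0, 1, 0, 0],
     [1, 0, 0, 1, 1],
     [0, 1, 0, 1, 0]]
k, n = len(G), len(G[0])

# All 2^k codewords: every GF(2) linear combination of the rows of G.
codewords = {
    tuple(sum(m[i] * G[i][j] for i in range(k)) % 2 for j in range(n))
    for m in product((0, 1), repeat=k)
}

# Minimum distance = minimum weight of a nonzero codeword (by linearity).
d_min = min(sum(cw) for cw in codewords if any(cw))
print(sorted(codewords))
print("d_min =", d_min)
print("guaranteed detection:", d_min - 1, "error(s);",
      "correction:", (d_min - 1) // 2, "error(s)")
# Undetectable error patterns are exactly the nonzero codewords; any other
# pattern moves the received word off the code and is detected.
```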

6. The input source to a noisy communication channel is a random variable X over the four
symbols a, b, c, d. The output from this channel is a random variable Y over these same four
symbols. The joint distribution of these two random variables is as follows: (10 marks)

            x = a    x = b    x = c    x = d
    y = a    1/8     1/16     1/16     1/4
    y = b    1/16    1/8      1/16      0
    y = c    1/32    1/32     1/16      0
    y = d    1/32    1/32     1/16      0

(a) Write down the marginal distribution for X and compute the marginal entropy H(X) in
bits.
(b) Write down the marginal distribution for Y and compute the marginal entropy H(Y ) in
bits.
(c) What is the joint entropy H(X, Y ) of the two random variables in bits?
(d) What is the conditional entropy H(Y |X) in bits?
(e) What is the mutual information I(X; Y ) between the two random variables in bits?
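
A direct computation for parts (a)–(e) (Python; the closed-form values worked out from the table are noted in the comments):

```python
import numpy as np

# Joint pmf from the table: rows are y = a..d, columns are x = a..d.
P = np.array([[1/8,  1/16, 1/16, 1/4],
              [1/16, 1/8,  1/16, 0  ],
              [1/32, 1/32, 1/16, 0  ],
              [1/32, 1/32, 1/16, 0  ]])

def H(p):
    p = p[p > 0]                       # convention: 0 log 0 = 0
    return -np.sum(p * np.log2(p))

p_x = P.sum(axis=0)                    # column sums -> marginal of X
p_y = P.sum(axis=1)                    # row sums    -> marginal of Y
H_x, H_y, H_xy = H(p_x), H(p_y), H(P.ravel())

print("H(X)   =", H_x)                 # 2 bits (X is uniform)
print("H(Y)   =", H_y)                 # 7/4 bits
print("H(X,Y) =", H_xy)                # 27/8 bits
print("H(Y|X) =", H_xy - H_x)          # 27/8 - 2 = 11/8 bits
print("I(X;Y) =", H_x + H_y - H_xy)    # 2 + 7/4 - 27/8 = 3/8 bits
```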

Best wishes
