Information Theory and Coding: Instructions To Candidates
Total No. of Pages : 02
Total No. of Questions : 09
B.Tech.(ECE/ETE) (E–I 2011 Onwards)
(Sem.–6)
INFORMATION THEORY AND CODING
Subject Code : BTEC-907
M.Code : 71236
Time : 3 Hrs. Max. Marks : 60
INSTRUCTIONS TO CANDIDATES :
1. SECTION-A is COMPULSORY consisting of TEN questions carrying TWO marks each.
2. SECTION-B contains FIVE questions carrying FIVE marks each and students
have to attempt any FOUR questions.
3. SECTION-C contains THREE questions carrying TEN marks each and students have to attempt any TWO questions.
SECTION-A
1. Answer briefly :
a) Define mutual information and its properties.
b) Define Hamming weight and Hamming distance. Find the Hamming weight of 10110 and the Hamming distance between 1111 and 0000.
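The two quantities asked for in (b) are simple bit counts; a minimal sketch of how they are computed (the function names here are illustrative, not from the paper):

```python
def hamming_weight(bits: str) -> int:
    """Hamming weight: the number of 1s in a codeword."""
    return bits.count("1")

def hamming_distance(a: str, b: str) -> int:
    """Hamming distance: the number of positions in which two
    equal-length codewords differ."""
    return sum(x != y for x, y in zip(a, b))

print(hamming_weight("10110"))           # -> 3
print(hamming_distance("1111", "0000"))  # -> 4
```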
c) Define bandwidth efficiency.
d) Explain in brief Go Back N ARQ system.
e) Define code efficiency.
f) Enumerate the properties of a syndrome.
i) What is the significance of a syndrome vector in the context of error control coding?
1 | M-71236 (S2)-1994
SECTION-B
Q2. What do you understand by information? What are its units? How does it relate to entropy?
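The relationship asked about in Q2 can be illustrated numerically: self-information measures the surprise of one outcome, I(x) = -log2 p(x) bits, and entropy is its average over the source. A brief sketch (the function names are illustrative):

```python
import math

def self_information(p: float) -> float:
    """I(x) = -log2 p(x), the information of one outcome, in bits."""
    return -math.log2(p)

def entropy(probs) -> float:
    """H(X) = sum over x of p(x) * log2(1/p(x)),
    the average information per source symbol, in bits."""
    return sum(-p * math.log2(p) for p in probs if p > 0)

print(self_information(0.5))              # -> 1.0 bit
print(entropy([0.25, 0.25, 0.25, 0.25]))  # -> 2.0 bits
```

Note that entropy is maximized when all symbols are equiprobable, as in the four-symbol example above.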
Q4. A BSC has the error probability p = 0.2 and the input to the channel consists of 4 equiprobable messages x1 = 000; x2 = 001; x3 = 011; x4 = 111. Calculate :
Q6. Explain the working of a (2,1,3) convolutional encoder using the transform-domain approach.
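In the transform-domain approach of Q6, each output stream is the product of the message polynomial x(D) with a generator polynomial over GF(2). A minimal sketch, assuming illustrative generators g1(D) = 1 + D + D^2 + D^3 and g2(D) = 1 + D^2 + D^3 (the paper does not specify them):

```python
def poly_mul_gf2(a, b):
    """Multiply two GF(2) polynomials given as coefficient lists
    (lowest degree first); this is convolution with XOR accumulation."""
    out = [0] * (len(a) + len(b) - 1)
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            out[i + j] ^= ai & bj
    return out

# Assumed generators for a (2,1,3) encoder (illustration only):
g1 = [1, 1, 1, 1]  # g1(D) = 1 + D + D^2 + D^3
g2 = [1, 0, 1, 1]  # g2(D) = 1 + D^2 + D^3

msg = [1, 0, 1]    # hypothetical input x(D) = 1 + D^2
v1 = poly_mul_gf2(msg, g1)  # first output stream V1(D) = X(D) g1(D)
v2 = poly_mul_gf2(msg, g2)  # second output stream V2(D) = X(D) g2(D)
print(v1, v2)  # -> [1, 1, 0, 0, 1, 1] [1, 0, 0, 1, 1, 1]
```

The transmitted codeword is then obtained by interleaving the two output streams bit by bit.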
SECTION-C
e
Q7. Discuss the Shannon-Hartley theorem based on channel capacity. How does channel capacity
a) 0011
b) 0100
Show how the cyclic code is decoded to recover the word for the previous case (a).
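The Shannon-Hartley capacity asked about in Q7, C = B log2(1 + S/N) bits per second, can be illustrated with a quick computation (the bandwidth and SNR figures here are illustrative, not from the paper):

```python
import math

def channel_capacity(bandwidth_hz: float, snr_linear: float) -> float:
    """Shannon-Hartley theorem: C = B * log2(1 + S/N), in bits per second.
    snr_linear is the signal-to-noise power ratio (not in dB)."""
    return bandwidth_hz * math.log2(1 + snr_linear)

# Illustrative numbers: a 3 kHz channel with SNR = 30 dB.
snr = 10 ** (30 / 10)  # 30 dB -> 1000 (linear ratio)
print(channel_capacity(3000, snr))  # about 29.9 kbit/s
```

Doubling the bandwidth doubles C, while doubling the SNR adds only about B extra bits per second, which is why capacity is said to grow linearly in B but only logarithmically in S/N.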
Q9. Construct the Huffman code with minimum code variance for the following probabilities
and also determine the code variance and code efficiency :
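The probability list for Q9 is not reproduced here, so as a sketch only, the minimum-variance Huffman construction can be outlined with hypothetical probabilities. Breaking ties in favour of unmerged symbols (so merged nodes sit as high as possible) is what minimizes the code variance:

```python
import heapq

def huffman_lengths(probs):
    """Return Huffman codeword lengths for the given probabilities.
    Tie-breaking: merged nodes get a later sequence number, so among
    equal weights the unmerged symbols combine first (minimum variance)."""
    heap = [(p, i, [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    seq = len(probs)
    while len(heap) > 1:
        p1, _, s1 = heapq.heappop(heap)
        p2, _, s2 = heapq.heappop(heap)
        for sym in s1 + s2:
            lengths[sym] += 1  # every symbol under the merge goes one level deeper
        heapq.heappush(heap, (p1 + p2, seq, s1 + s2))
        seq += 1
    return lengths

# Hypothetical probabilities (the paper's list is not shown here).
probs = [0.4, 0.2, 0.2, 0.1, 0.1]
L = huffman_lengths(probs)
avg_len = sum(p * l for p, l in zip(probs, L))
print(L, avg_len)  # -> [2, 2, 2, 3, 3] 2.2
```

Code efficiency is then H(X) / avg_len, and the code variance is the probability-weighted variance of the lengths about avg_len.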
NOTE : Disclosure of identity by writing a mobile number, or making a request for passing, on any page of the answer sheet will lead to UMC against the student.