DC Expt 6 - Huffman

Digital Communications Huffman experiment writeups


T. Y. B. Tech (ECE)
Trimester: V Subject: DC
Name: Class:
Roll No: Batch:
Experiment No: 6
Marks Teacher’s Signature with date
Performed on:

Submitted on:

AIM: Implementation of an algorithm for source coding and its evaluation.


Generation and evaluation of a variable-length source code using the Huffman algorithm.

Develop MATLAB code for the same using the functions huffmandict, huffmanenco and
huffmandeco.

THEORY:
Procedure for implementation of Huffman algorithm:

1. List the source probabilities in decreasing order.

2. Combine the probabilities of the two symbols having the lowest probabilities, and
record the resultant probability; this step is called reduction 1.
Repeat the same procedure until only two ordered probabilities remain.
3. Start encoding with the last reduction, which consists of exactly two ordered
probabilities. Assign 0 as the first digit in the codewords for all the source symbols
associated with the first probability; assign 1 to the second probability.
4. Now go back and assign 0 and 1 to the second digit for the two probabilities that were
combined in the previous reduction step, retaining all assignments made in step 3.
5. Keep regressing this way until the first column is reached.
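The reduction-and-encoding steps above can be sketched in a few lines. The lab itself targets MATLAB (huffmandict etc.); the following illustrative Python version is an equivalent sketch using a min-heap, where popping the two smallest entries mirrors combining the two lowest probabilities in each reduction (function and variable names are my own):

```python
import heapq
from itertools import count

def huffman_code(probs):
    """Build binary Huffman codewords for a list of symbol probabilities.

    Each heap entry is (probability, tiebreak, symbol indices). Popping the
    two smallest entries mirrors one reduction step of the procedure;
    prepending a bit mirrors the backward assignment of 0s and 1s.
    """
    tiebreak = count()  # deterministic ordering for equal probabilities
    heap = [(p, next(tiebreak), [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    codes = [""] * len(probs)
    while len(heap) > 1:
        p1, _, s1 = heapq.heappop(heap)   # lowest-probability group
        p2, _, s2 = heapq.heappop(heap)   # second-lowest group
        for i in s1:
            codes[i] = "0" + codes[i]     # 0 branch for the first group
        for i in s2:
            codes[i] = "1" + codes[i]     # 1 branch for the second group
        heapq.heappush(heap, (p1 + p2, next(tiebreak), s1 + s2))
    return codes

# Example: the 5-symbol DMS from study question 5
codes = huffman_code([0.2, 0.4, 0.1, 0.2, 0.1])
```

For this source the average codeword length works out to 2.2 bits/symbol; several equally optimal codes exist, differing only in how probability ties are broken.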

User inputs:
1. Discrete memoryless source probabilities.

Program outputs:
1. Source code, i.e. the list of codewords corresponding to all source symbols.
2. Average codeword length
3. Efficiency of the source code
4. Verification that the code is uniquely and instantaneously decodable using the Kraft
inequality
5. Code variance, σ²

Use the following equations:

Efficiency = Entropy H(X) / Avg. codeword length, L

Avg. codeword length L = Σ (Pi × Li), where Li is the length of the ith codeword.

Also calculate the Kraft inequality parameter K = Σ 2^(−Li) and verify that K ≤ 1.

Code variance σ² = Σ Pi × (L − Li)²
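The evaluation formulas above translate directly into code; this is a minimal Python sketch (the function name and argument names are mine, and logarithms are base 2 so all quantities are in bits):

```python
import math

def evaluate_code(probs, lengths):
    """Evaluate a source code: entropy, average length, efficiency,
    Kraft parameter and code variance, per the equations above."""
    H = -sum(p * math.log2(p) for p in probs)        # entropy H(X), bits/symbol
    L = sum(p * l for p, l in zip(probs, lengths))   # avg codeword length
    efficiency = H / L
    K = sum(2.0 ** -l for l in lengths)              # Kraft parameter
    variance = sum(p * (L - l) ** 2 for p, l in zip(probs, lengths))
    return H, L, efficiency, K, variance
```

For the DMS of study question 5 with codeword lengths (2, 2, 3, 2, 3), this gives L = 2.2 bits, K = 1 (the Kraft inequality holds with equality, as expected for a full binary code tree) and efficiency ≈ 96.45%.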

Study questions:
1. What is the disadvantage of Huffman algorithm?
2. What is an Optimum source code?
3. What is the prefix-free condition?
4. State source coding theorem or Shannon’s first theorem.
5. A DMS with 5 symbols and probabilities as Pi = [ 0.2, 0.4, 0.1, 0.2, 0.1] is used.
Generate Huffman code for the DMS and verify your results using the MATLAB code.
