Huffman Coding
With a fixed-length code such as ASCII, 1 char = 1 byte, whether it is an 'e' or an 'x'.
The Basic Algorithm
Are there any savings in tailoring codes to the frequency of the characters?
Code word lengths are no longer fixed, as they are in ASCII.
Code word lengths vary and are shorter for the more frequently used characters.
The (Real) Basic Algorithm
1. Scan the text to be compressed and count the occurrences of all characters.
2. Sort or prioritize the characters based on their number of occurrences in the text.
3. Build the Huffman code tree based on the prioritized list.
4. Perform a traversal of the tree to determine all code words.
5. Scan the text again and create the new file using the Huffman codes.
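As a rough illustration of steps 1 and 2, the sketch below counts character occurrences and loads them into a priority queue; the function names are placeholders, and steps 3 to 5 are sketched alongside the later slides.

```python
from collections import Counter
import heapq

def count_frequencies(text):
    # Step 1: count how often each character occurs.
    return Counter(text)

def build_priority_queue(freqs):
    # Step 2: prioritize characters by occurrence count (least frequent first).
    # Each entry is (frequency, tie-breaker, character); heapq keeps the
    # smallest frequency at the front of the queue.
    heap = [(count, i, ch) for i, (ch, count) in enumerate(freqs.items())]
    heapq.heapify(heap)
    return heap

freqs = count_frequencies("Eerie eyes seen near lake.")
queue = build_priority_queue(freqs)
print(freqs["e"], freqs[" "], freqs["E"])   # 8 4 1
```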
Building a Tree
Scan the original text. Consider the following short text:
Eerie eyes seen near lake.
Its distinct characters are E, e, r, i, space, y, s, n, a, l, k and the period.
What is the frequency of each character in the text?
Char:  E  i  y  l  k  .  r  s  n  a  sp  e
Freq:  1  1  1  1  1  1  2  2  2  2  4   8
Building a Tree
While the priority queue contains two or more nodes:
Create a new node.
Dequeue a node and make it the left subtree.
Dequeue the next node and make it the right subtree.
The frequency of the new node equals the sum of the frequencies of its left and right children.
Enqueue the new node back into the queue.
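A sketch of this loop (step 3), assuming Python's heapq as the priority queue and a simple tuple representation for tree nodes; the names below are illustrative, not the slides' implementation:

```python
import heapq
from collections import Counter

def build_tree(text):
    # A leaf is (char, None, None); an internal node is (None, left, right).
    heap = [(count, i, (ch, None, None))
            for i, (ch, count) in enumerate(Counter(text).items())]
    heapq.heapify(heap)
    next_id = len(heap)                          # tie-breaker for equal frequencies
    while len(heap) > 1:                         # two or more nodes remain
        f_left, _, left = heapq.heappop(heap)    # dequeue -> left subtree
        f_right, _, right = heapq.heappop(heap)  # dequeue -> right subtree
        merged = (None, left, right)
        # The new node's frequency is the sum of its children's frequencies.
        heapq.heappush(heap, (f_left + f_right, next_id, merged))
        next_id += 1
    return heap[0][2]                            # the single remaining node is the root

root = build_tree("Eerie eyes seen near lake.")
```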
Building a Tree
Starting from the single-character nodes
E:1  i:1  y:1  l:1  k:1  .:1  r:2  s:2  n:2  a:2  sp:4  e:8
the two lowest-frequency nodes are repeatedly dequeued and merged (the slides show the tree after each step):
E(1) + i(1) = 2
y(1) + l(1) = 2
k(1) + .(1) = 2
r(2) + s(2) = 4
n(2) + a(2) = 4
(E i)(2) + (y l)(2) = 4
(k .)(2) + sp(4) = 6
(r s)(4) + (n a)(4) = 8
(E i y l)(4) + (k . sp)(6) = 10
e(8) + (r s n a)(8) = 16
10 + 16 = 26
After enqueueing this node there is only one node left in the priority queue.
Building a Tree
Dequeue the single node left in the queue. This tree contains the new code words for each character.
The frequency of the root node should equal the number of characters in the text:
Eerie eyes seen near lake. has 26 characters.
Encoding the File
Traverse the tree for the codes: going left adds a 0 to the code word, going right adds a 1.
Scanning the text and replacing each character with its code word produces a bit string such as
0000101100000110011 1000101011011010011
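Continuing the earlier sketch (steps 4 and 5), a traversal that appends 0 on a left branch and 1 on a right branch yields the code table, and a second scan of the text emits the code words; again, the names are illustrative:

```python
def make_codes(node, prefix="", table=None):
    # Step 4: walk the tree; going left adds '0', going right adds '1'.
    if table is None:
        table = {}
    ch, left, right = node
    if ch is not None:
        table[ch] = prefix or "0"      # leaf: record the accumulated code word
    else:
        make_codes(left, prefix + "0", table)
        make_codes(right, prefix + "1", table)
    return table

def encode(text, table):
    # Step 5: scan the text again and emit the Huffman code words.
    return "".join(table[ch] for ch in text)

codes = make_codes(root)               # root from the build_tree sketch above
bits = encode("Eerie eyes seen near lake.", codes)
# The exact bit string depends on how ties in frequency were broken,
# so it may differ from the string shown on the slide.
```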
Practical considerations
It is not practical to create a Huffman encoding for a single short string, such as ABRACADABRA.
To decode it, you would need the code table.
If you include the code table with the message, the whole thing is bigger than just the ASCII message.
Huffman encoding is practical only if the encoded string is large relative to the code table.
Example
Assume that the relative frequencies of the letters occurring in a certain text are:
A: 40
B: 20
C: 10
D: 10
R: 20
Write down the codes for these letters using the Huffman coding scheme.
A=0
B = 100
C = 1010
D = 1011
R = 11
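These code word lengths (1, 3, 4, 4 and 2 bits for A, B, C, D and R) can be checked with the same sketch; the exact bit patterns and even individual lengths depend on how ties are broken, but the total encoded length, 220 bits for these frequencies, does not:

```python
import heapq

freqs = {"A": 40, "B": 20, "C": 10, "D": 10, "R": 20}

# Build the tree from a frequency table instead of a raw text.
heap = [(f, i, (ch, None, None)) for i, (ch, f) in enumerate(freqs.items())]
heapq.heapify(heap)
next_id = len(heap)
while len(heap) > 1:
    fl, _, left = heapq.heappop(heap)
    fr, _, right = heapq.heappop(heap)
    heapq.heappush(heap, (fl + fr, next_id, (None, left, right)))
    next_id += 1

codes = make_codes(heap[0][2])          # make_codes from the earlier sketch
total_bits = sum(freqs[ch] * len(code) for ch, code in codes.items())
print(codes, total_bits)                # total_bits is 220 for any valid tie-breaking
```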
Huffman Code Construction
Character count in text:
Char:  E    T   A   O   I   N   S   R   H   L   D   C   U
Freq:  125  93  80  76  73  71  65  61  55  41  40  31  27
The slides then repeat the dequeue-and-merge process on this larger alphabet, showing the tree after each step:
C(31) + U(27) = 58
D(40) + L(41) = 81
58 + H(55) = 113
R(61) + S(65) = 126
N(71) + I(73) = 144
O(76) + A(80) = 156
81 + T(93) = 174
113 + E(125) = 238
126 + 144 = 270
156 + 174 = 330
238 + 270 = 508
330 + 508 = 838
The frequency of the root, 838, equals the total character count.
The resulting code words are compared with the original fixed-length codes; for example:
Char  Freq  Fixed  Huff
E     125   0000   110
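As a rough check of the savings, the earlier sketch can compare the 4-bit fixed-length encoding with the Huffman code lengths for these 13 frequencies; individual lengths may vary with tie-breaking, but the totals do not:

```python
import heapq

freqs = {"E": 125, "T": 93, "A": 80, "O": 76, "I": 73, "N": 71, "S": 65,
         "R": 61, "H": 55, "L": 41, "D": 40, "C": 31, "U": 27}

# Build the Huffman tree as in the earlier sketches.
heap = [(f, i, (ch, None, None)) for i, (ch, f) in enumerate(freqs.items())]
heapq.heapify(heap)
next_id = len(heap)
while len(heap) > 1:
    fl, _, left = heapq.heappop(heap)
    fr, _, right = heapq.heappop(heap)
    heapq.heappush(heap, (fl + fr, next_id, (None, left, right)))
    next_id += 1

def code_lengths(node, depth=0):
    # The depth of each leaf equals the length of its Huffman code word.
    ch, left, right = node
    if ch is not None:
        return {ch: depth}
    lengths = code_lengths(left, depth + 1)
    lengths.update(code_lengths(right, depth + 1))
    return lengths

total_chars = sum(freqs.values())                 # 838 characters in the text
fixed_bits = 4 * total_chars                      # 13 symbols need 4 bits each
huffman_bits = sum(freqs[ch] * d for ch, d in code_lengths(heap[0][2]).items())
print(fixed_bits, huffman_bits)                   # 3352 vs 3036 bits
```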
Summary
Huffman coding is a technique used to compress
files for transmission
Uses statistical coding
more frequently used symbols have shorter code words
Works well for text and fax transmissions
An application that uses multiple data structures