Algo-Ch-3 Greedy Algorithms
Algorithms
By: Elsay M.
Outline for Greedy Algorithms (6hr)
Kruskal's algorithm works by repeatedly picking the smallest remaining edge that
does not form a cycle, until all nodes are connected at the least total
cost.
Example:
• Let’s say we have 4 nodes: A, B, C, and D, and these are connected by edges
with the following weights:
• A—B: 4, A—C: 1, B—C: 2, B—D: 5, C—D: 3
• Steps for Kruskal's Algorithm:
• List edges: First, list all edges and their weights:
• A—C: 1, B—C: 2, C—D: 3, A—B: 4, B—D: 5
• Sort edges by weight:
• A—C: 1, B—C: 2, C—D: 3, A—B: 4, B—D: 5
• Pick the smallest edge: Start by adding the smallest edge, A—C with
weight 1.
• Next smallest edge: Add B—C with weight 2. No cycle is formed.
• Next smallest edge: Add C—D with weight 3. No cycle is formed.
• Stop: Now all nodes (A, B, C, and D) are connected. Adding A—B or B—D
would form a cycle, so we stop here.
• Result: The Minimum Spanning Tree consists of the edges A—C, B—C, and
C—D, with a total weight of 1 + 2 + 3 = 6.
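The steps above can be sketched in code. This is a minimal, illustrative Kruskal implementation using a simple union-find structure to detect cycles; the function and variable names are my own, and it is run on the 4-node example from the text:

```python
def kruskal(nodes, edges):
    """edges: list of (weight, u, v). Returns (mst_edges, total_weight)."""
    parent = {n: n for n in nodes}   # union-find: each node starts in its own set

    def find(x):
        # Follow parent pointers (with path halving) to the set representative.
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    mst, total = [], 0
    for w, u, v in sorted(edges):    # consider edges in order of weight
        ru, rv = find(u), find(v)
        if ru != rv:                 # different sets => adding (u, v) forms no cycle
            parent[ru] = rv          # union the two sets
            mst.append((u, v, w))
            total += w
    return mst, total

edges = [(4, 'A', 'B'), (1, 'A', 'C'), (2, 'B', 'C'), (5, 'B', 'D'), (3, 'C', 'D')]
mst, total = kruskal(['A', 'B', 'C', 'D'], edges)
print(mst, total)   # picks A-C, B-C, C-D with total weight 6
```

The union-find check is what makes "no cycle is formed" cheap to test: two endpoints already in the same set means the edge would close a cycle.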
Prim’s Algorithm:
• Prim’s Algorithm is another popular algorithm to find the
Minimum Spanning Tree (MST) of a graph, which connects all
vertices (or nodes) with the least total edge weight without
forming any cycles.
• The main difference between Prim’s Algorithm and Kruskal’s
Algorithm is that Prim’s starts from a single node and grows
the MST by adding the nearest vertex to the existing tree, while
Kruskal’s algorithm adds edges in order of weight.
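Prim's grow-from-one-vertex strategy can be sketched with a binary heap of edges leaving the current tree. This is a minimal illustration on the same 4-node graph used in the Kruskal example (representation and names are my own):

```python
import heapq

def prim(graph, start):
    """graph: {node: [(weight, neighbor), ...]}. Returns total MST weight."""
    visited = {start}
    heap = list(graph[start])              # edges leaving the starting vertex
    heapq.heapify(heap)
    total = 0
    while heap and len(visited) < len(graph):
        w, v = heapq.heappop(heap)         # cheapest edge leaving the tree
        if v in visited:
            continue                       # both endpoints already in the tree
        visited.add(v)                     # grow the tree by the nearest vertex
        total += w
        for edge in graph[v]:
            if edge[1] not in visited:
                heapq.heappush(heap, edge)
    return total

graph = {
    'A': [(4, 'B'), (1, 'C')],
    'B': [(4, 'A'), (2, 'C'), (5, 'D')],
    'C': [(1, 'A'), (2, 'B'), (3, 'D')],
    'D': [(5, 'B'), (3, 'C')],
}
print(prim(graph, 'A'))   # 6, matching Kruskal's result
```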
Huffman tree for frequencies A=5, B=9, C=12, D=13, E=16, F=45 (internal nodes show the combined frequency in brackets; each leaf is labeled character(code)):

                 [100]
                /     \
         F(0)[45]     [55]
                     /    \
                 [25]      [30]
                /    \     /   \
          C(110)  D(111) [14]   E(101)
                         /  \
                  A(1000)    B(1001)
Huffman Tree Example:
• Start by combining the two smallest frequencies, A (5) and B (9), forming a new node with frequency 14.
• Then, combine C (12) and D (13), forming a new node with frequency 25.
• Next, combine the two smallest remaining nodes: A+B (14) and E (16),
forming a new node with frequency 30.
• Combine C+D (25) with A+B+E (30), forming a new node with frequency 55.
• Finally, combine F (45) with that node to form the root with frequency 100.
• Character F gets the shortest code
since it has the highest frequency.
• After assigning binary codes from the tree,
you get variable-length codes that compress the data efficiently.
• This method is used in file compression formats like ZIP and JPEG.
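The tree construction described above can be sketched with Python's heapq. This is a minimal illustration (the exact bit patterns can differ from the figure depending on how ties are broken, but the code lengths always match):

```python
import heapq

def huffman_codes(freqs):
    """freqs: {symbol: frequency}. Returns {symbol: bitstring}."""
    # Heap entries are (frequency, tie_breaker, tree); a tree is either a
    # symbol or a (left, right) pair of subtrees.
    heap = [(f, i, s) for i, (s, f) in enumerate(sorted(freqs.items()))]
    heapq.heapify(heap)
    count = len(heap)
    while len(heap) > 1:
        f1, _, t1 = heapq.heappop(heap)    # pop the two smallest subtrees
        f2, _, t2 = heapq.heappop(heap)
        heapq.heappush(heap, (f1 + f2, count, (t1, t2)))  # merge them
        count += 1
    codes = {}
    def walk(tree, prefix):
        if isinstance(tree, tuple):
            walk(tree[0], prefix + '0')    # left edge contributes a 0
            walk(tree[1], prefix + '1')    # right edge contributes a 1
        else:
            codes[tree] = prefix           # leaf: record the accumulated bits
    walk(heap[0][2], '')
    return codes

codes = huffman_codes({'A': 5, 'B': 9, 'C': 12, 'D': 13, 'E': 16, 'F': 45})
print(codes)
```

With these frequencies, F gets a 1-bit code, C, D, and E get 3-bit codes, and A and B get 4-bit codes, exactly the lengths shown in the tree.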
Example 1: Compressing a Short String
• Suppose you want to compress the string ABBCCCDDDD.
First, calculate the frequency of each character:
• A: 1 occurrence, B: 2 occurrences, C: 3 occurrences, D: 4
occurrences
• Step 1: Build a Min-Heap (Priority Queue)
• Insert each character with its frequency into a min-heap:
• A (1), B (2), C (3), D (4)
• Step 2: Build the Huffman Tree
• Traversing left from a node appends a 0 to the code.
• Traversing right appends a 1 (used when assigning codes in Step 3).
Cont..
• Combine the two smallest elements until one tree remains:
  • Combine A (1) and B (2): cost = 1 + 2 = 3 (new node)
  • Combine C (3) and (A+B) (3): cost = 3 + 3 = 6 (new node)
  • Combine D (4) and (C+(A+B)) (6): cost = 4 + 6 = 10 (root node)
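As a sanity check of the combine steps above, a tiny sketch that replays the merges on the frequencies 1, 2, 3, 4. A useful side fact: the per-step costs sum to 19, which is exactly the length in bits of the final encoding, since each symbol's frequency is counted once for every merge on its path to the root (i.e., once per bit of its code):

```python
import heapq

def merge_costs(freqs):
    """Replay Huffman merges on raw frequencies, returning each step's cost."""
    heap = list(freqs)
    heapq.heapify(heap)
    costs = []
    while len(heap) > 1:
        a = heapq.heappop(heap)            # two smallest frequencies
        b = heapq.heappop(heap)
        costs.append(a + b)                # cost of this combine step
        heapq.heappush(heap, a + b)        # merged node goes back on the heap
    return costs

print(merge_costs([1, 2, 3, 4]))   # [3, 6, 10], summing to 19
```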
Cont.
• D has a code of 0 because it is the root's left child.
• C has a code of 10 (right, then left).
• A has a code of 110 (right, right, left).
• B has a code of 111 (right, right, right).
• Step 3: Assign Huffman Codes
• From the tree, assign binary codes: A: 110, B: 111, C: 10, D: 0
• Step 4: Encode the String
• Original string: ABBCCCDDDD
• A → 110, B → 111, C → 10, D → 0
• Encoded string: 110 111 111 10 10 10 0 0 0 0 (19 bits)
• This is slightly shorter than fixed-length encoding (2 bits per
character = 20 bits), and the savings grow as the frequencies become more skewed.
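The encoding step can be checked directly with the codes derived above:

```python
codes = {'A': '110', 'B': '111', 'C': '10', 'D': '0'}
text = 'ABBCCCDDDD'

# Concatenate each character's code; the prefix-free property means no
# separators are needed between codes.
encoded = ''.join(codes[ch] for ch in text)
print(encoded)        # 1101111111010100000
print(len(encoded))   # 19 bits, vs. 20 bits at a fixed 2 bits per character
```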
Example 2: Compressing a More Complex String