DAA Unit-5
Greedy Algorithms
Pseudo Code of Greedy Algorithm
Algorithm Greedy(a, n)
{
    solution := Ø;   // start with an empty solution
    for i := 1 to n do
    {
        x := select(a);
        if feasible(solution, x) then
            solution := union(solution, x);
    }
    return solution;  // returned only after all candidates are examined
}
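To make the skeleton concrete, here is a minimal C++ sketch of the same control flow. The select and feasible callables and the candidate container are illustrative placeholders, not anything prescribed by the slides.

#include <vector>

// Generic greedy skeleton (illustrative): `select` must remove and return the
// most promising remaining candidate, `feasible` must check whether adding it
// keeps the partial solution valid.
template <typename T, typename SelectFn, typename FeasibleFn>
std::vector<T> greedy(std::vector<T> candidates, SelectFn select, FeasibleFn feasible)
{
    std::vector<T> solution;              // solution := empty
    while (!candidates.empty()) {
        T x = select(candidates);         // x := select(a)
        if (feasible(solution, x))        // if feasible(solution, x)
            solution.push_back(x);        // solution := union(solution, x)
    }
    return solution;
}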
Coin Change Problem
Problem Definition
Suppose the following coins are available in unlimited quantity:
1. 10
2. 5
3. 2
4. 1
5. 50 paisa
Make Change – Greedy Solution
1. Suppose we need to pay an amount of 28/- using the available coins. Here the
candidate (coin) set is {10, 5, 2, 1, 0.50}. The greedy solution is
10 + 10 + 5 + 2 + 1 = 28, i.e. 5 coins.
2. Denominations: d1 = 6, d2 = 4, d3 = 1. Make a change of 8. Greedy picks
6 + 1 + 1, i.e. 3 coins, but the minimum number of coins required is 2 (4 + 4),
so the greedy choice is not always optimal.
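A small C++ sketch of the greedy rule "always take the largest coin that still fits". Amounts are expressed in paise so that the 50-paisa coin stays an integer; the second call demonstrates the {6, 4, 1} case, where greedy returns 3 coins although 2 suffice.

#include <cstdio>
#include <vector>

// Greedy coin change: coins must be given in descending order of value.
int greedyCoinCount(int amount, const std::vector<int>& coins)
{
    int count = 0;
    for (int c : coins) {
        count  += amount / c;   // take as many of this coin as possible
        amount %= c;            // pay the remainder with smaller coins
    }
    return (amount == 0) ? count : -1;  // -1: amount cannot be paid exactly
}

int main()
{
    std::vector<int> indian = {1000, 500, 200, 100, 50};   // Rs.10, 5, 2, 1, 50 paisa
    std::printf("28/- needs %d coins\n", greedyCoinCount(2800, indian));   // 5

    std::vector<int> d = {6, 4, 1};
    std::printf("greedy for 8 uses %d coins (optimal is 2: 4 + 4)\n",
                greedyCoinCount(8, d));                                    // 3
}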
Knapsack Problem
Fractional Knapsack Problem
Introduction
We are given n objects and a knapsack.
Object i has a positive weight w_i and a positive value v_i, for i = 1, 2, ..., n.
The knapsack can carry a weight not exceeding W.
Our aim is to fill the knapsack in a way that maximizes the value of the
included objects, while respecting the capacity constraint.
In a fractional knapsack problem, we assume that the objects can be
broken into smaller pieces.
So we may decide to carry only a fraction x_i of object i, where 0 ≤ x_i ≤ 1.
In this case, object i contributes x_i * w_i to the total weight in the knapsack,
and x_i * v_i to the value of the load.
Symbolic representation of the problem can be given as follows:
maximize  ∑ v_i * x_i   subject to   ∑ w_i * x_i ≤ W
where v_i > 0, w_i > 0 and 0 ≤ x_i ≤ 1 for 1 ≤ i ≤ n.
Fractional Knapsack Problem - Example
We are given n = 5 objects and the weight carrying capacity of the knapsack is W = 100.
For each object, the weight and value (profit) are given in the following table.

Object         1    2    3    4    5
Weight w_i    10   20   30   40   50
Value v_i     20   30   66   40   60

Fill the knapsack with the given objects such that the total value of the knapsack
is maximized.
Fractional Knapsack Problem - Greedy Solution
Three selection functions can be defined:
1. Sort the items in descending order of their value and select items as long as the
weight (capacity) constraint is satisfied.
2. Sort the items in ascending order of their weight and select items as long as the
weight constraint is satisfied.
3. Calculate the ratio value/weight for each item and sort the items by this ratio;
then repeatedly take the item with the highest ratio and add it.
Fractional Knapsack Problem - Greedy Solution
Applying the three selection functions to the example (W = 100):

1. Maximum value first: take object 3 (value 66, weight 30) and object 5 (value 60,
weight 50), i.e. weight 80; only 20/40 = 0.5 of object 4 fits, contributing
40 * 0.5 = 20. Total value = 66 + 60 + 20 = 146.
2. Minimum weight first: take objects 1, 2, 3 and 4 (weight 10 + 20 + 30 + 40 = 100).
Total value = 20 + 30 + 66 + 40 = 156.
3. Maximum value/weight ratio first: take objects 3, 1 and 2 (weight 60, value 116);
the remaining capacity is 40, so take (100 - 60) / 50 = 0.8 of object 5,
contributing 60 * 0.8 = 48. Total value = 66 + 20 + 30 + 48 = 164.

The ratio-based selection gives the best result (164), and for the fractional
knapsack problem it is in fact optimal.
Fractional Knapsack Problem - Algorithm
Algorithm: Greedy-Fractional-Knapsack (w[1..n], p[1..n], W)
    for i = 1 to n do
        x[i] ← 0
    weight ← 0
    while weight < W do
        i ← the best remaining object          // e.g. highest p[i]/w[i] ratio
        if weight + w[i] ≤ W then
            x[i] ← 1                           // take the whole object
            weight ← weight + w[i]
        else
            x[i] ← (W - weight) / w[i]         // take only the fitting fraction
            weight ← W
    return x

Example of the fractional step: if W = 100 and the current weight in the knapsack
is 60, an object of weight 50 is included with fraction (100 - 60) / 50 = 0.8.
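A C++ sketch of the ratio-based greedy (selection function 3). The item data in main() are the ones assumed for the worked example above (W = 100), for which the expected answer is 164.

#include <algorithm>
#include <cstdio>
#include <vector>

struct Item { double weight, value; };

double fractionalKnapsack(std::vector<Item> items, double W)
{
    // Sort by value/weight ratio, highest first.
    std::sort(items.begin(), items.end(), [](const Item& a, const Item& b) {
        return a.value / a.weight > b.value / b.weight;
    });

    double load = 0.0, profit = 0.0;
    for (const Item& it : items) {
        if (load + it.weight <= W) {             // whole object fits
            load   += it.weight;
            profit += it.value;
        } else {                                 // take only a fraction of it
            double fraction = (W - load) / it.weight;
            profit += it.value * fraction;
            break;                               // knapsack is now full
        }
    }
    return profit;
}

int main()
{
    // Data assumed for the worked example: W = 100.
    std::vector<Item> items = {{10, 20}, {20, 30}, {30, 66}, {40, 40}, {50, 60}};
    std::printf("Maximum profit = %.1f\n", fractionalKnapsack(items, 100));  // 164.0
}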
Exercises – Home Work
1. Consider Knapsack capacity , and find the maximum profit using greedy
approach.
2. Consider Knapsack capacity , and . Find the maximum profit using greedy
approach.
Activity Selection Problem
Introduction
The Activity Selection Problem is an optimization problem which deals with
the selection of non-overlapping activities that need to be executed by a
single person or machine in a given time duration.
Activity selection is also applicable to scheduling a resource among several
competing activities.
We are given a set S of n activities, where activity i has a start time s_i and a
finish time f_i. Find a maximum-size set of mutually compatible activities.
Activities i and j are compatible if the half-open intervals [s_i, f_i) and
[s_j, f_j) do not overlap, that is, i and j are compatible if s_i ≥ f_j or s_j ≥ f_i.
S_ij = all the activities which start after activity a_i finishes and finish
before activity a_j starts.
S_14 = ?
Activity Selection Problem - Example: 11 activities are given as a table of
(start time s_i, finish time f_i) pairs, together with the worked solution.
Activity Selection - Algorithm
Algorithm: Activity Selection
Step I: Sort the input activities by increasing finishing time:
        f1 ≤ f2 ≤ . . . ≤ fn
Step II: Select the first activity from the sorted list.
Step III: For each remaining activity, select it if its start time is greater than or
equal to the finish time of the previously selected activity.
Input: N = 7 activities
arr[ ] = {{3, 4}, {2, 5}, {1, 3}, {5, 9}, {0, 7}, {11, 12}, {8, 10}}
Output: (1, 3) (3, 4) (5, 9) (11, 12)   (4 activities can be performed)
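A short C++ sketch of the two greedy steps on the same input; it prints exactly the output shown above.

#include <algorithm>
#include <cstdio>
#include <utility>
#include <vector>

int main()
{
    std::vector<std::pair<int,int>> act = {{3,4},{2,5},{1,3},{5,9},{0,7},{11,12},{8,10}};

    // Step I: sort by increasing finish time.
    std::sort(act.begin(), act.end(),
              [](const auto& a, const auto& b) { return a.second < b.second; });

    // Steps II-III: pick the first activity, then every activity whose start
    // time is not earlier than the finish time of the last one picked.
    int lastFinish = -1;
    for (const auto& [start, finish] : act) {
        if (start >= lastFinish) {
            std::printf("(%d, %d) ", start, finish);
            lastFinish = finish;
        }
    }
    std::printf("\n");   // prints: (1, 3) (3, 4) (5, 9) (11, 12)
}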
Exercise - HW
Given the arrival and departure times of all trains that reach a railway station,
find the minimum number of platforms required for the station so that no train
has to wait. We are given two arrays representing the arrival and departure times
of the trains that stop: arr[ ] = {9:00, 9:40, 9:50, 11:00, 15:00, 18:00},
dep[ ] = {9:10, 12:00, 11:20, 11:30, 19:00, 20:00}.
Complexity Analysis
Case 1: When the provided list of activities is already sorted by finish time,
then no need for sorting it again. Therefore, Time Complexity in such a case
will be O(n).
Case 2: When the provided list of activities is not sorted, then we will have to
either write a sort() function from scratch or use the in-built Standard Template
Library sort. Therefore, the time complexity in this case will be O(n log n).
Job Scheduling with Deadlines
Real-life example
Suppose you are a content writer and have 4 articles to write. Each article
takes a different amount of time to complete and pays a different amount.
You want to earn the maximum amount by writing the articles.
So which article will you write first?
You have two options:
Article with maximum payment: In this case, the article may take a longer
duration to complete, and by writing it you may miss some other articles that
could have been completed in a shorter interval of time; this is unprofitable
if the sum of the payments of the missed articles is greater than the payment
of this single article.
The article which takes less time: In this case, you may manage to write
multiple articles in the given time, but it is also possible that choosing the
article with the maximum payment would have earned you more, even by writing
a single article.
Introduction
We have a set of n jobs to execute, each of which takes unit time.
At any point of time we can execute only one job (uniprocessor, no preemption).
Job i earns profit p_i > 0 if and only if it is executed no later than its
deadline d_i.
We have to find an optimal sequence of jobs such that our total profit is
maximized.
Feasible jobs: A set of jobs is feasible if there exists at least one sequence
that allows all the jobs in the set to be executed no later than their
respective deadlines.
Brute-force approach
The simple and brute-force solution for this problem is to generate all the
sequences of the given set of jobs and find the most optimal sequence
that maximizes the profit.
Suppose, if there are n number of jobs, then to find all the sequences of
jobs we need to calculate all the possible subsets and a total of 2 n subsets
will be created. Thus, the time complexity of this solution would be O(2n).
Greedy algorithm
Sort the jobs in decreasing order of their profit.
Find the highest deadline among all deadlines and draw a Gantt chart up
to that deadline.
Now, we need to assign time slots to individual job ids.
Now pick each job one by one and check if the maximum possible time
slot for the job, i.e., its deadline, is assigned to another job or not. If it is
not filled yet, assign the slot to the current job id.
Otherwise, search for any empty time slot less than the deadline of the
current job. If such a slot is found, assign it to the current job id and move
to the next job id.
Continue this process until all the feasible jobs are allocated to their time
slot.
In the end, we can calculate the profit for all the allocated feasible jobs.
Job Scheduling with Deadlines - Example
Using the greedy algorithm, find an optimal schedule for the following jobs with n = 4
(the profit and deadline of each job are given in the table on the slide).
Solution:
Step 1: Sort the jobs in decreasing order of their profit. After sorting, the order
is J1, J2, J3, J4.
Step 2: Find the total number of positions (time slots). Here the maximum deadline
is 3, so three positions are available:
    P             1    2    3
    Job selected  0    0    0
Step 3: Assign job J1 to position 3 (its deadline):
    P             1    2    3
    Job selected  0    0    J1
Step 4: Assign job J2 to position 1 (its deadline):
    P             1    2    3
    Job selected  J2   0    J1
Step 5: Try to assign job J3 to position 1. But position 1 is already occupied and
two jobs cannot be executed in parallel, so job J3 is rejected.
Step 6: Assign job J4 to position 2, as its own deadline position is not free but
position 2 is free.
    P             1    2    3
    Job selected  J2   J4   J1
Exercises – Home Work
1. Using the greedy algorithm, find an optimal schedule for the following jobs with n = 4.
2. Using the greedy algorithm, find an optimal schedule for the following jobs with n = 5.
Job Scheduling with Deadlines - Algorithm
int result[n]; // To store result (Sequence of jobs)
bool slot[n]; // To keep track of free time slots
// Find a free slot for this job (note that we start from the last possible slot)
for (int j = min(n, arr[i].dead) - 1; j >= 0; j--) {
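The fragment above shows only the inner slot search. Below is a complete, runnable C++ sketch of the whole procedure; the Job structure and the sample data in main() are illustrative, not taken from the slides.

#include <algorithm>
#include <cstdio>
#include <vector>

struct Job { char id; int dead; int profit; };

void printJobScheduling(std::vector<Job> arr)
{
    // Sort all jobs in decreasing order of profit.
    std::sort(arr.begin(), arr.end(),
              [](const Job& a, const Job& b) { return a.profit > b.profit; });

    int n = (int)arr.size();
    std::vector<char> result(n);         // selected job for each slot
    std::vector<bool> slot(n, false);    // free/occupied time slots

    for (const Job& job : arr) {
        // Find a free slot for this job, starting from the last possible one.
        for (int j = std::min(n, job.dead) - 1; j >= 0; j--) {
            if (!slot[j]) {
                slot[j] = true;
                result[j] = job.id;
                break;
            }
        }
    }

    for (int j = 0; j < n; j++)
        if (slot[j]) std::printf("%c ", result[j]);
    std::printf("\n");
}

int main()
{
    // Illustrative data (not from the slides): id, deadline, profit.
    std::vector<Job> jobs = {{'a', 2, 100}, {'b', 1, 19}, {'c', 2, 27},
                             {'d', 1, 25}, {'e', 3, 15}};
    printJobScheduling(jobs);            // prints: c a e
}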
The time complexity of the above job scheduling algorithm is O(n^2) in the worst
case, when we look through all the slots in the Gantt chart for a given job id.
On average, each of the n jobs searches about n/2 slots, so this is still
approximately O(n^2).
Huffman Codes
Intro
When you want to send files or store them on a computer and their size is very
large, we generally compress them before storing or sending.
Prefix Code
Prefix codes are used for encoding (compression) and decoding (decompression).
Prefix Code: a code in which no codeword is a prefix of any other codeword.
Example: with the fixed-length 3-bit code below, encoding the text takes 300 bits.

Characters   Frequency   Code   Bits
a            45          000    135
b            13          111    39
c            12          101    36
d            16          110    48
e            9           011    27
f            5           001    15
                  Total bits    300
Huffman code Introduction
Huffman Coding is a technique of compressing data to reduce its size
without losing any of the details. It was first developed by David Huffman.
Huffman Coding is generally useful for compressing data in which some characters
occur much more frequently than others.
Huffman invented a greedy algorithm that constructs an optimal prefix
code called a Huffman code.
Huffman coding is a lossless data compression algorithm.
It assigns variable-length codes to input characters.
Lengths of the assigned codes are based on the frequencies of
corresponding characters.
The most frequent character gets the smallest code and the least frequent
character gets the largest code.
The variable-length codes assigned to input characters are Prefix Codes.
Huffman Codes
In Prefix codes, the codes are assigned in such a way that the code
assigned to one character is not a prefix of code assigned to any other
character.
For example,
a = 0, b = 1 and c = 01 is not a prefix code (the code of a is a prefix of the code of c).
Now assume that the generated bitstream is 001; while decoding, it can be read in
two different ways:
0 0 1 = aab ..?
0 01 = ac ..?
The encoding is therefore ambiguous. By using prefix codes, Huffman Coding makes
sure that there is no such ambiguity when decoding the generated bit stream.
There are two major parts in Huffman Coding:
1. Build a Huffman Tree from the input characters.
2. Traverse the Huffman Tree and assign codes to the characters.
Huffman Codes - Example
Find the Huffman codes for the following characters.

Characters                a    b    c    d    e    f
Frequency (in thousands)  45   13   12   16   9    5

Step 1: Arrange the characters in ascending order of their frequency.
(These are stored in a priority queue Q / min-heap.)
Huffman Codes - Example
Step 2:
Extract the two nodes with the minimum frequency from the min-heap.
Create a new internal node with frequency equal to the sum of the two nodes' frequencies.
Make the first extracted node its left child and the other extracted node its right child.
Add this node to the min-heap.
(Figure: f:5 and e:9 are extracted and merged into a new internal node of frequency 14;
the remaining nodes are c:12, b:13, d:16 and a:45.)
Huffman Codes - Example
Step 3:
Rearrange the nodes in ascending order of frequency.
Assign 0 to the left branch and 1 to the right branch.
Repeat the process to complete the tree.
(Figure: the internal node 14 has f:5 on its 0-branch and e:9 on its 1-branch;
the remaining nodes are c:12, b:13, d:16 and a:45.)
Huffman Codes - Example
Step 4:
(Figure: c:12 and b:13 are merged into a new internal node of frequency 25;
the queue now contains 14 (f, e), d:16, 25 (c, b) and a:45.)
Huffman Codes - Example
Step 5:
(Figure: the node 14 and d:16 are merged into a new internal node of frequency 30;
the queue now contains 25 (c, b), 30 (14, d) and a:45.)
Huffman Codes - Example
Step 6:
(Figure: the nodes 25 and 30 are merged into a new internal node of frequency 55;
the queue now contains a:45 and 55.)
Huffman Codes - Example
Step 7:
(Figure: a:45 and 55 are merged into the root node of frequency 100, with a:45 as the
left child (0) and the subtree of frequency 55 as the right child (1); the Huffman
tree is now complete.)
Huffman Codes - Example
Step 8: Read the codes from the completed tree.

Characters                a    b     c     d     e      f
Frequency (in thousands)  45   13    12    16    9      5
Code                      0    101   100   111   1101   1100

Total bits: 224
Huffman Codes - Algorithm
Algorithm: HUFFMAN (C)
create a priority queue Q consisting of each unique character,
sorted (keyed) in ascending order of their frequencies
n = |C|
Q = C
for i = 1 to n-1
    allocate a new node z
    z.left = x = EXTRACT-MIN(Q)
    z.right = y = EXTRACT-MIN(Q)
    z.freq = x.freq + y.freq
    INSERT(Q, z)
return EXTRACT-MIN(Q) // return the root of the tree
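A self-contained C++ sketch of the same construction using std::priority_queue as the min-heap. On the example frequencies above it should reproduce the codes a = 0, b = 101, c = 100, d = 111, e = 1101, f = 1100; nodes are never freed, which is acceptable for a one-shot sketch.

#include <cstdio>
#include <queue>
#include <string>
#include <vector>

struct Node {
    int freq; char ch;                       // ch is meaningful only in leaves
    Node *left = nullptr, *right = nullptr;
};
struct Cmp {                                 // greater-than comparison -> min-heap
    bool operator()(const Node* a, const Node* b) const { return a->freq > b->freq; }
};

void assignCodes(const Node* n, const std::string& code)
{
    if (!n) return;
    if (!n->left && !n->right) { std::printf("%c = %s\n", n->ch, code.c_str()); return; }
    assignCodes(n->left,  code + "0");       // 0 on the left branch
    assignCodes(n->right, code + "1");       // 1 on the right branch
}

int main()
{
    std::vector<std::pair<char,int>> freq = {{'a',45},{'b',13},{'c',12},{'d',16},{'e',9},{'f',5}};

    std::priority_queue<Node*, std::vector<Node*>, Cmp> Q;
    for (auto& [c, f] : freq) Q.push(new Node{f, c});

    while (Q.size() > 1) {                   // n - 1 merges in total
        Node* x = Q.top(); Q.pop();          // two minimum-frequency nodes
        Node* y = Q.top(); Q.pop();
        Q.push(new Node{x->freq + y->freq, '\0', x, y});
    }
    assignCodes(Q.top(), "");                // root of the Huffman tree
}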
Greedy Algorithms 55
Complexity analysis
Time complexity: O(N log N)
Sorting / building the heap takes O(N log N).
The time taken for encoding all the characters, based on their frequencies, is
O(N log N), where N is the number of unique characters:
Getting the minimum from the priority queue (min-heap) takes O(log N), since it
calls the minHeapify method.
EXTRACT-MIN() is called 2*(N-1) times, i.e. 2*(N-1)*log N.
INSERT() is called (N-1) times, i.e. (N-1)*log N.
Therefore, the overall complexity of Huffman Coding is O(N log N).
Space complexity: O(N)
The space complexity is O(N) because the N characters are stored in a map/array
while calculating their frequencies.
Exercises – Home Work
Find an optimal Huffman code for the following sets of frequencies.
1. a : 50, b : 20, c : 15, d : 30.
2.
Characters                A    B    C    D    E    F
Frequency (in thousands)  24   12   10   8    8    5
3.
Characters                a    b    c    d    e    f    g
Frequency (in thousands)  37   28   29   13   30   17   6
Important formulae
Formula 01: Average code length = ∑ (frequencyi × code lengthi) / ∑ (frequencyi)
Formula 02: Length of Huffman encoded message
            = total number of characters in the message × average code length per character
1. Huffman Code For Characters-
To write the Huffman code for any character, traverse the Huffman tree from the
root node to the leaf node of that character.
Following this rule, the Huffman code for each character is:
• a = 111   (frequency 10)
• e = 10    (frequency 15)
• i = 00    (frequency 12)
• o = 11001 (frequency 3)
• u = 1101  (frequency 4)
• s = 01    (frequency 13)
• t = 11000 (frequency 1)
From here, we can observe:
• Characters occurring less frequently in the text are assigned longer codes.
• Characters occurring more frequently in the text are assigned shorter codes.
2. Average Code Length-
Using formula 01, we have:
Average code length
= ∑ (frequencyi × code lengthi) / ∑ (frequencyi)
= { (10 × 3) + (15 × 2) + (12 × 2) + (3 × 5) + (4 × 4) + (13 × 2) + (1 × 5) } / (10 + 15 + 12 + 3 + 4 + 13 + 1)
= 146 / 58
= 2.52
3. Length of Huffman Encoded Message-
Using formula 02, we have:
Total number of bits in the Huffman encoded message
= total number of characters in the message × average code length per character
= 58 × 2.52
= 146.16 ≅ 147 bits
(Using the exact average 146/58 instead of the rounded 2.52 gives exactly 146 bits.)
Applications of Huffman Coding
Huffman Coding is frequently used by conventional compression formats
like PKZIP, GZIP, etc.
For data transmission using fax and text, Huffman Coding is used because
it reduces the size and improves the speed of transmission.
Many multimedia storage formats like JPEG, PNG, and MP3 use Huffman
encoding (especially the prefix codes) for compressing the files.
Image compression is often done using Huffman Coding.
It is most useful where there is a series of frequently occurring characters
to be transmitted.
Minimum Spanning Tree
Spanning tree
A spanning tree is a sub-graph that connects all the vertices of a graph
with the minimum possible number of edges. It may or may not be
weighted and does not have cycles.
Let us understand what a spanning tree is with an interesting example.
Spanning tree
A spanning tree always contains n-1 edges, where n is the total number of
vertices in the graph G.
The total number of spanning trees that a complete graph of n vertices can
have is n^(n-2).
We can construct a spanning tree by removing at most e-n+1 edges from a
connected graph G, where e is the number of edges and n is the number of
vertices in graph G.
Spanning tree
A complete undirected graph G can have a maximum of n^(n-2) spanning trees,
where n is the number of nodes in the graph G. Consider a complete graph G with
3 vertices: the total number of spanning trees this graph can have is
3^(3-2) = 3, as shown in the image below.
Introduction to Minimum Spanning Tree (MST)
Let G = (V, E) be a connected, undirected graph where,
1. V is the set of nodes and
2. E is the set of edges.
Each edge has a given positive length or weight.
A spanning tree of a graph G is a sub-graph which is basically a tree: it
contains all the vertices of G but does not contain any cycle.
A minimum spanning tree (MST) of a weighted connected graph is a spanning
tree with the minimum or smallest total weight of edges.
Two algorithms for constructing a minimum spanning tree are,
1. Kruskal's Algorithm
2. Prim's Algorithm
Spanning Tree Examples
(Figure: a graph on vertices A-H and one of its spanning trees.)
Kruskal's Algorithm for MST – Example 1
(Figure: a small weighted graph on vertices A-E. The edges are examined in increasing
order of weight and added as long as they do not create a cycle: Step 2 takes the next
minimum edge (B, C), Step 3 takes the next minimum edge, and Step 4 takes the next
minimum edge (A, B).)
Kruskal's Algorithm for MST – Example 2
Step 2: Repeatedly select the minimum-weight edge that does not form a cycle.
(Figure: a weighted graph on vertices 1-7 with the following edges.)

Edges    Weight
{1,2}    1
{2,3}    2
{4,5}    3
{6,7}    3
{1,4}    4
{2,5}    4
{4,7}    4
{3,5}    5
{2,4}    6
{3,6}    6
{5,7}    7
{5,6}    8
Kruskal's Algorithm for MST – Example 2, Step 3:
The minimum spanning tree for the given graph consists of the edges
{1,2}, {2,3}, {4,5}, {6,7}, {1,4} and {4,7}.
Total cost = 1 + 2 + 3 + 3 + 4 + 4 = 17
Kruskal's Algorithm – Example 2 (trace)

Step    Edge considered {u,v}    Connected components
Init.   -                        {1} {2} {3} {4} {5} {6} {7}
1       {1,2}                    {1,2} {3} {4} {5} {6} {7}
2       {2,3}                    {1,2,3} {4} {5} {6} {7}
3       {4,5}                    {1,2,3} {4,5} {6} {7}
4       {6,7}                    {1,2,3} {4,5} {6,7}
5       {1,4}                    {1,2,3,4,5} {6,7}
6       {2,5}                    Rejected (would form a cycle)
7       {4,7}                    {1,2,3,4,5,6,7}

Total cost = 17
Kruskal’s Algorithm for MST
MAKE-SET(x) creates a new set whose only member (and thus representative) is x. Since the sets are disjoint, we require
that x not already be in some other set.
FIND-SET(x) returns a pointer to the representative of the (unique) set containing x.
UNION(x,y) unites the dynamic sets that contain x and y
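The pseudocode body itself is not reproduced on the slide, so below is a self-contained C++ sketch of Kruskal's algorithm built on these three disjoint-set operations (a simple union-find with path compression). The edge list is the one reconstructed for Example 2 above; the expected output is the MST of cost 17.

#include <algorithm>
#include <cstdio>
#include <numeric>
#include <vector>

struct Edge { int u, v, w; };

struct DisjointSet {
    std::vector<int> parent;
    explicit DisjointSet(int n) : parent(n) { std::iota(parent.begin(), parent.end(), 0); }  // MAKE-SET
    int find(int x) { return parent[x] == x ? x : parent[x] = find(parent[x]); }             // FIND-SET
    void unite(int x, int y) { parent[find(x)] = find(y); }                                  // UNION
};

int main()
{
    int n = 7;
    std::vector<Edge> edges = {
        {1,2,1},{2,3,2},{4,5,3},{6,7,3},{1,4,4},{2,5,4},{4,7,4},
        {3,5,5},{2,4,6},{3,6,6},{5,7,7},{5,6,8}};

    // Sort edges by non-decreasing weight.
    std::sort(edges.begin(), edges.end(),
              [](const Edge& a, const Edge& b){ return a.w < b.w; });

    DisjointSet ds(n + 1);                    // vertices are numbered 1..7
    int cost = 0, taken = 0;
    for (const Edge& e : edges) {
        if (ds.find(e.u) != ds.find(e.v)) {   // accept only if no cycle is formed
            ds.unite(e.u, e.v);
            cost += e.w;
            std::printf("{%d,%d} ", e.u, e.v);
            if (++taken == n - 1) break;      // an MST has n - 1 edges
        }
    }
    std::printf("\nTotal cost = %d\n", cost); // 17 for this graph
}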
Complexity analysis
Implemented using a linear array/list:
Initializing the set A takes O(1) time.
For each vertex we need to call MAKE-SET(), i.e. O(V) in total.
Sorting the edges takes O(E log E).
The FIND-SET() and UNION() operations take O(V), or O(log V) with an improved
(union by rank / path compression) implementation.
Total running time: O(E log E).
(Figures: two weighted practice graphs, 1 and 2, on vertices A-H; the same graphs
appear again in the Prim's home-work exercise later in this unit.)
Prim’s Algorithm
In Prim's algorithm, the minimum spanning tree grows in a natural way,
starting from an arbitrary root.
At each stage we add a new branch (edge) to the tree already constructed; the
algorithm stops when all the nodes have been reached.
The complexity of Prim's algorithm is O(n^2), where n is the total number of
nodes in the graph G.
Prim's Algorithm for MST – Example 1
Step 1: Select an arbitrary node, say node 1.
(Figure: the same 7-vertex weighted graph used for Kruskal's Example 2.)

Node set B    Edges
{1}           -
Prim's Algorithm for MST – Example 1
Step 2: At each stage, find the minimum-weight edge joining a node in B to a node
outside B, add that edge to the tree and its new endpoint to B.
(The table on the slide tracks the growing node set B - {1}, {1,2}, {1,2,3},
{1,2,3,4}, {1,2,3,4,5}, {1,2,3,4,5,7}, {1,2,3,4,5,6,7} - together with the
candidate edges still available at each stage.)
Prim's Algorithm for MST – Example 1
Step 3: The minimum spanning tree for the given graph.

Node set B           Edge selected
{1}                  -
{1,2}                {1,2}
{1,2,3}              {2,3}
{1,2,3,4}            {1,4}
{1,2,3,4,5}          {4,5}
{1,2,3,4,5,7}        {4,7}
{1,2,3,4,5,6,7}      {6,7}

Total cost = 17
Prim's Algorithm – Example 1 (trace)

Step    Edge selected {u,v}    Node set B       Candidate edges
Init.   -                      {1}              --
1       {1,2}                  {1,2}            {1,2} {1,4}
2       {2,3}                  {1,2,3}          {1,4} {2,3} {2,4} {2,5}
3       {1,4}                  {1,2,3,4}        {1,4} {2,4} {2,5} {3,5} {3,6}
4       {4,5}                  {1,2,3,4,5}      {2,4} {2,5} {3,5} {3,6} {4,5} {4,7}
...     (continues with {4,7} and {6,7})

Total cost = 17
Prim's Algorithm
Alternatively (array-based implementation):
1. Create an array parent[] of size V to store the indexes of parent nodes in the MST.
2. Also create an array key[] to store the key values of all vertices.
3. Initialize all key values as INFINITE.
4. Assign key value 0 to the first vertex so that it is picked first.
5. Create a boolean array mstSet[] to represent the set of vertices included in the MST.
6. While mstSet doesn't include all vertices:
   a. Pick a vertex u which is not in mstSet and has the minimum key value.
   b. Include u in mstSet.
   c. Update the key values of all vertices adjacent to u according to the
      respective edge weights.
7. The parent array is the output array that shows the constructed MST.
Prim's Algorithm
1. for each u ∈ V[G], repeat step 2
2.     key[u] = ∞, π(u) = NIL
3. key[r] = 0
4. Q = V[G]
5. while Q ≠ Φ, repeat steps 6 to 10
6.     u = EXTRACT-MIN(Q)
7.     for each v ∈ Adj[u]
8.         if v ∈ Q and w(u,v) < key[v]
9.             then π[v] = u
10.            key[v] = w(u,v)
11. Exit
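As a companion to the pseudocode, here is a minimal adjacency-matrix C++ sketch of the array-based version described above (key[], parent[], mstSet[]). The matrix encodes the 7-vertex example graph, with 0 meaning "no edge" and INT_MAX standing in for infinity; the expected output is 17.

#include <climits>
#include <cstdio>
#include <vector>

int primMST(const std::vector<std::vector<int>>& g)
{
    int V = (int)g.size();
    std::vector<int>  key(V, INT_MAX);     // cheapest edge weight into the tree
    std::vector<int>  parent(V, -1);       // parent[v] in the MST
    std::vector<bool> inMST(V, false);     // mstSet[]

    key[0] = 0;                            // start from vertex 0
    int total = 0;
    for (int count = 0; count < V; count++) {
        int u = -1;
        for (int v = 0; v < V; v++)        // pick the cheapest vertex not yet in the MST
            if (!inMST[v] && (u == -1 || key[v] < key[u])) u = v;
        inMST[u] = true;
        total += key[u];

        for (int v = 0; v < V; v++)        // update keys of u's neighbours
            if (g[u][v] && !inMST[v] && g[u][v] < key[v]) {
                key[v] = g[u][v];
                parent[v] = u;
            }
    }
    return total;                          // weight of the minimum spanning tree
}

int main()
{
    // The 7-vertex example graph (vertices 1..7 mapped to indices 0..6).
    std::vector<std::vector<int>> g = {
        {0,1,0,4,0,0,0}, {1,0,2,6,4,0,0}, {0,2,0,0,5,6,0},
        {4,6,0,0,3,0,4}, {0,4,5,3,0,8,7}, {0,0,6,0,8,0,3}, {0,0,0,4,7,3,0}};
    std::printf("MST cost = %d\n", primMST(g));   // 17
}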
Complexity analysis
Line 1-2: executed for each vertex --> O(V)
Line 3: O(1)
Line 4: build the min-heap: O(V) (at most O(V log V) with repeated inserts)
Line 5: the while loop runs O(V) times
Line 6: O(log V) to extract one minimum, so lines 5-6 together take O(V log V)
Line 7: in the worst case (a complete graph) up to V-1 vertices are adjacent to u, so
lines 5 and 7 can examine O(V^2) adjacencies; in general they examine each edge at
most twice, i.e. O(E) times
Line 10: each key update is a DECREASE-KEY operation on the min-heap, O(log V)
So lines 5, 7 and 10 together contribute O(E log V)
Overall time complexity: O(V + V log V + E log V) = O((E + V) log V)
If the graph is dense, E ≈ V^2; if it is sparse, E ≈ V.
Exercises – Home Work
Write the Prim’s Algorithm to find out Minimum Spanning Tree. Apply the
same and find MST for the graph given below.
(Figures: two weighted graphs, 1 and 2, on vertices A-H, on which Prim's algorithm
is to be applied.)
Exercise
An undirected graph G(V, E) contains n ( n > 2 ) nodes named v1 , v2 ,
….vn. Two nodes vi , vj are connected if and only if 0 < |i – j| <= 2. Each
edge (vi, vj ) is assigned a weight i + j. A sample graph with n = 4 is
shown below
Create a graph for n=4 :
Single Source Shortest Path – Dijkstra's Algorithm
Introduction
Consider now a directed graph G = (N, A), where N is the set of nodes and A is
the set of directed edges of graph G.
Each edge has a positive length.
One of the nodes is designated as the source node.
The problem is to determine the length of the shortest path from the source
to each of the other nodes of the graph.
Dijkstra's Algorithm finds the shortest paths between the nodes of a graph.
For a given source node, the algorithm finds the shortest path between the
source node and every other node.
The algorithm maintains a matrix L which gives the length of each directed edge:
L[i, j] ≥ 0 if the edge (i, j) ∈ A, and L[i, j] = ∞ otherwise.
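The slides trace the algorithm on a figure rather than giving pseudocode, so the following is an adjacency-matrix C++ sketch of Dijkstra's algorithm (the O(V^2) version analysed later). The edge lengths in main() are assumptions chosen to match the comparisons in the worked example that follows (1-5-4 = 20 vs 1-4 = 100, 1-4-2 = 70 vs 1-2 = 50, 1-4-3 = 40 vs 1-3 = 30); they are not read from the slide's figure.

#include <climits>
#include <cstdio>
#include <vector>

std::vector<int> dijkstra(const std::vector<std::vector<int>>& L, int source)
{
    int n = (int)L.size();
    std::vector<int>  dist(n, INT_MAX);
    std::vector<bool> done(n, false);
    dist[source] = 0;

    for (int k = 0; k < n; k++) {
        int u = -1;                                   // closest unfinished node
        for (int v = 0; v < n; v++)
            if (!done[v] && (u == -1 || dist[v] < dist[u])) u = v;
        if (dist[u] == INT_MAX) break;                // remaining nodes unreachable
        done[u] = true;

        for (int v = 0; v < n; v++)                   // relax every edge (u, v)
            if (L[u][v] && !done[v] && dist[u] + L[u][v] < dist[v])
                dist[v] = dist[u] + L[u][v];
    }
    return dist;
}

int main()
{
    // Assumed directed graph on nodes 1..5 (0 means "no edge").
    std::vector<std::vector<int>> L = {
        {0, 50, 30, 100, 10},
        {0,  0,  0,   0,  0},
        {0,  0,  0,   0,  0},
        {0, 50, 20,   0,  0},
        {0,  0,  0,  10,  0}};
    std::vector<int> d = dijkstra(L, 0);
    for (int v = 1; v < 5; v++)
        std::printf("dist(1 -> %d) = %d\n", v + 1, d[v]);  // 50, 30, 20, 10
}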
Dijkstra's Algorithm - Example
(Figure: starting from node 1, node 5 is settled first. Check whether the paths
1-5-2, 1-5-3 and 1-5-4 exist; 1-5-4 does, so compare the cost of 1-5-4 (20) with
the direct edge 1-4 (100) and keep the shorter one.)
Dijkstra's Algorithm - Example
(Figure: node 4 is settled next. Check whether the paths 1-4-2, 1-4-3 and 1-4-5
exist; compare the cost of 1-4-2 (70) with 1-2 (50) and of 1-4-3 (40) with
1-3 (30), keeping the shorter distance in each case.)
Dijkstra's Algorithm - Example
(Figures: two further example graphs - 1. a directed weighted graph on nodes A-E,
and 2. a weighted graph on nodes A-G - on which the algorithm can be traced.)
Dijkstra's Algorithm
(Figure: a weighted graph with source S and nodes A-D used to trace the algorithm.)
Time complexity
Adjacency matrix implementation: O(V^2)
Adding all |V| vertices to Q takes O(V) time.
Removing the node with the minimal dist takes O(V) time, and we only need O(1)
to relax an edge.
For each vertex we identify its adjacent vertices and try to relax them, which
takes O(V × V) = O(V^2) in total.
Optimal merge pattern
Given n sorted files, the task is to find the minimum number of computations
(comparisons) needed to merge them all, i.e. the optimal merge pattern.
When two or more sorted files are merged together to form a single file, the
minimum number of computations needed to produce that file is known as the
optimal merge pattern.
Consider three sorted lists L1, L2 and L3 of sizes 30, 20 and 10 respectively.
A two-way merge compares the elements of two sorted lists and puts them into a
new sorted list.
Case 1:
If we first merge lists L1 and L2, it takes 30 + 20 = 50 comparisons and creates
a new list L' of size 50.
L' and L3 can then be merged with 50 + 10 = 60 comparisons, forming a sorted list
L of size 60.
Thus the total number of comparisons required to merge lists L1, L2 and L3 would
be 50 + 60 = 110.
Case 2:
Alternatively, first merging L2 and L3 takes 20 + 10 = 30 comparisons and creates
a sorted list L' of size 30. L' and L1 are then merged with 30 + 30 = 60
comparisons, giving a total of 30 + 60 = 90 comparisons.
In both cases the final output is identical, but the first approach takes 110
comparisons whereas the second takes only 90. The goal of the optimal merge
pattern is to find the merging sequence which results in the minimum number of
comparisons.
Let S = {s1, s2, …, sn} be the set of sequences to be merged.
Greedy approach selects minimum length sequences si and sj from S. The
new set S’ is defined as, S’ = (S – {si, sj}) ∪ {si + sj}.
This procedure is repeated until only one sequence is left.
Algorithm
Algorithm OPTIMAL_MERGE_PATTERNS(S)
// S is the set of sequences, kept in a min-heap keyed on length
while |S| > 1 do
    INSERT(S, MERGE(DELETE-MIN(S), DELETE-MIN(S)))   // merge the two shortest
end
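A compact C++ sketch of the same procedure using std::priority_queue as the min-heap; run on the list sizes {30, 20, 10} from the example above it reports 90 comparisons, matching Case 2.

#include <cstdio>
#include <functional>
#include <queue>
#include <vector>

// Repeatedly merge the two shortest remaining lists and count the comparisons
// (the cost of each merge equals the size of the merged list).
long long optimalMergeCost(const std::vector<int>& sizes)
{
    std::priority_queue<long long, std::vector<long long>, std::greater<long long>>
        S(sizes.begin(), sizes.end());

    long long total = 0;
    while (S.size() > 1) {
        long long x = S.top(); S.pop();   // two smallest sequences
        long long y = S.top(); S.pop();
        total += x + y;                   // cost of merging them
        S.push(x + y);                    // the merged sequence goes back into S
    }
    return total;
}

int main()
{
    std::printf("%lld\n", optimalMergeCost({30, 20, 10}));   // 90, as in Case 2
}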
Complexity Analysis
In every iteration, two delete-min operations and one insert operation are
performed; each heap operation takes O(log n) time.
Since there are n - 1 merge steps, the total running time of this algorithm is
O(n log n).
Standard recurrence:
T(n) = (n - 1) × max(O(delete-min), O(insert)) = O(n log n)