Vision 2023 Algorithm Chapter 4 Greedy Method 29
byjusexamprep.com
ALGORITHM
4 GREEDY METHOD
● "Greedy Method finds out of many options, but you have to choose the best option."
In this method, we have to find out the best method/option out of many present ways.
In this approach/method we focus on the first stage and decide the output, don't think about the
future.
● Greedy Algorithms solve problems by making the best choice that seems best at the particular
moment. Many optimization problems can be determined using a greedy algorithm. Some issues
have no efficient solution, but a greedy algorithm may provide a solution that is close to optimal.
A greedy algorithm works if a problem exhibits the following two properties:
1. Greedy Choice Property: A globally optimal solution can be arrived at by making locally optimal choices. In other words, an optimal solution can be obtained by making "greedy" choices.
2. Optimal Substructure: Optimal solutions contain optimal sub-solutions. In other words, the answers to the subproblems of an optimal solution are themselves optimal.
Spanning Tree :
Let G = (V, E) be an undirected connected graph. A subgraph T = (V, E′) of G is said to be a spanning tree if T is a tree containing all the vertices of G.
Connected graph :
In a connected graph, between every pair of vertices there exists a path.
Complete graph :
In a complete graph, between every pair of vertices there exists an edge.
e.g. Vertices : 1, 2, 3; counting each edge once from its first endpoint: 2 + 1 + 0 = 3 edges.
→ Total no. of edges in a complete directed graph with n vertices = 2 × n(n − 1)/2 = n(n − 1)
e.g. n = 3 ⇒ 3 × 2 = 6 edges
Theorem :
Prove that the maximum no. of undirected graphs with ‘n’ vertices = 2^(no. of edges) = 2^(n(n − 1)/2).
e.g. Take n = 3 vertices.
⇒ Maximum no. of edges = n(n − 1)/2 = 3(3 − 1)/2 = 3 edges (undirected graph).
Case (i) : Graphs with ‘0’ edges = 3C0 = 1
Case (ii) : Graphs with ‘1’ edge = 3C1 = 3
Case (iii) : Graphs with ‘2’ edges = 3C2 = 3
Case (iv) : Graphs with ‘3’ edges = 3C3 = 1
∴ Total no. of graphs = 1 + 3 + 3 + 1 = 8 = 2^3
Proof :
With n vertices, the maximum no. of edges is m = n(n − 1)/2.
(i) Graphs with ‘0’ edges = mC0
(ii) Graphs with ‘1’ edge = mC1
⋮
Graphs with ‘m’ edges = mCm
Total no. of graphs = mC0 + mC1 + …… + mCm = 2^m = 2^(n(n − 1)/2)
(using nC0 + nC1 + …… + nCn = 2^n)
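The counting argument above can be checked numerically; a minimal sketch (illustrative only):

```python
from math import comb

def num_undirected_graphs(n):
    """Count labelled undirected simple graphs on n vertices."""
    m = n * (n - 1) // 2                           # maximum number of edges
    total = sum(comb(m, k) for k in range(m + 1))  # choose each possible edge subset
    assert total == 2 ** m                         # binomial identity sum mCk = 2^m
    return total

print(num_undirected_graphs(3))  # → 8, matching cases (i)-(iv) above
```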
⇒ weight ⟨vi, vj⟩ = 2 |i – j|, e.g. ⟨v1, v2⟩ = 2 |1 – 2| = 2
Ex:
Let ‘w’ be the minimum weight among all edge weights of an undirected connected graph, and let ‘e’ be a specific edge of weight ‘w’. Which of the following statements is FALSE?
A. There is an MST which contains the edge ‘e’.
B. Every MST contains an edge of weight ‘w’.
C. If ‘e’ is not in an MST ‘T’, then adding ‘e’ to ‘T’ forms a cycle.
D. Every MST contains the edge ‘e’.
Here, statement D is false: when several edges share the minimum weight ‘w’, there can be more than one MST, and some of them need not contain ‘e’.
Kruskal’s Algorithm :
Find the MST for the following graph.
COST = 99
It satisfies all the properties of spanning trees.
1) Edges must be arranged in increasing order of their weight, which takes O(E log E) time.
2) In each iteration, delete the root node from the priority queue in O(log E) time and include it in the partially constructed forest without forming a cycle.
In the worst case we may perform E delete operations, so the time complexity for deletion is O(E log E).
3) Time complexity of Kruskal’s Algorithm = Step 1 + Step 2
= E log E + E log E
= O(E log E)
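The two steps above can be sketched with a min-heap of edges plus a union-find check for cycles; the graph below is a small hypothetical example, not the figure from the question:

```python
import heapq

def kruskal(num_vertices, edges):
    """Return (total_cost, mst_edges) for an undirected weighted graph.

    edges: list of (weight, u, v) tuples; vertices are 0..num_vertices-1.
    """
    parent = list(range(num_vertices))

    def find(x):
        # Find the root of x's component, with path halving.
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    heapq.heapify(edges)                  # step 1: order edges by weight
    cost, mst = 0, []
    while edges and len(mst) < num_vertices - 1:
        w, u, v = heapq.heappop(edges)    # step 2: delete-min, O(log E)
        ru, rv = find(u), find(v)
        if ru != rv:                      # include only if no cycle is formed
            parent[ru] = rv
            cost += w
            mst.append((u, v, w))
    return cost, mst

# Hypothetical 4-vertex example (not the graph from the question above)
edges = [(1, 0, 1), (4, 0, 2), (3, 1, 2), (2, 1, 3), (5, 2, 3)]
print(kruskal(4, edges))  # MST cost = 6
```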
Prim’s Algorithm :
Apply Prim’s algorithm on the following graph, starting at vertex 1.
By using a binary heap, time complexity = O((V + E) log V).
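A minimal sketch of Prim’s algorithm using a binary heap, matching the O((V + E) log V) bound above; the adjacency list is a hypothetical example, not the figure from the question:

```python
import heapq

def prim(adj, start=0):
    """adj: {u: [(weight, v), ...]} for an undirected graph. Returns MST cost."""
    visited = set()
    pq = [(0, start)]              # (weight of edge into vertex, vertex)
    cost = 0
    while pq and len(visited) < len(adj):
        w, u = heapq.heappop(pq)   # extract-min: O(log V)
        if u in visited:
            continue               # skip stale heap entries
        visited.add(u)
        cost += w
        for wt, v in adj[u]:       # push candidate edges out of the tree
            if v not in visited:
                heapq.heappush(pq, (wt, v))
    return cost

# Hypothetical graph (not the figure from the question above)
adj = {
    0: [(1, 1), (4, 2)],
    1: [(1, 0), (3, 2), (2, 3)],
    2: [(4, 0), (3, 1), (5, 3)],
    3: [(2, 1), (5, 2)],
}
print(prim(adj, start=0))  # → 6
```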
Ex :
For the undirected weighted graph given below, which of the following sequences of edges represents a correct execution of Prim’s algorithm to construct the MST?
Difference between Prim’s and Kruskal’s algorithm-
● Prim’s algorithm starts building the Minimum Spanning Tree from any vertex in the graph; Kruskal’s algorithm starts from the minimum-weight edge in the graph.
● Prim’s algorithm may visit a node more than once while searching for the minimum distance; Kruskal’s algorithm considers each node only once.
● Prim’s algorithm has a time complexity of O(V²), V being the number of vertices, which can be improved to O(E + V log V) using Fibonacci heaps; Kruskal’s algorithm’s time complexity is O(E log V).
● Prim’s algorithm always maintains a single connected component and works only on connected graphs; Kruskal’s algorithm can generate a forest (disconnected components) at any instant and can also work on disconnected components.
● Prim’s algorithm runs faster on dense graphs; Kruskal’s algorithm runs faster on sparse graphs.
Conditions-
It is important to note the following points regarding Dijkstra Algorithm-
● The Dijkstra algorithm works only for connected graphs.
● The Dijkstra algorithm works only for those graphs that do not contain any negative weight
edge.
● The actual Dijkstra algorithm does not output the shortest paths.
● It only provides the value or cost of the shortest paths.
● By making minor modifications in the actual algorithm, the shortest paths can be easily
obtained.
● Dijkstra algorithm works for directed as well as undirected graphs.
Case-02:
This case is valid when-
● The given graph G is represented as an adjacency list.
● Priority queue Q is represented as a binary heap.
Here,
● With adjacency list representation, all vertices of the graph can be traversed using BFS in
O(V+E) time.
● In min heap, operations like extract-min and decrease-key value takes O(logV) time.
● So, the overall time complexity becomes O(V + E) × O(log V) = O((V + E) log V) = O(E log V)
● This time complexity can be reduced to O(E+VlogV) using the Fibonacci heap.
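The procedure above can be sketched with Python’s heapq as the binary heap. The graph below is reconstructed from the relaxations shown in the worked example that follows; the exact edges and weights are inferred, so treat them as an assumption:

```python
import heapq

def dijkstra(adj, source):
    """adj: {u: [(weight, v), ...]}. Returns (d, pi): distances and predecessors."""
    d = {u: float('inf') for u in adj}
    pi = {u: None for u in adj}
    d[source] = 0
    pq = [(0, source)]                 # binary min-heap keyed on path estimate
    while pq:
        du, u = heapq.heappop(pq)      # extract-min: O(log V)
        if du > d[u]:
            continue                   # stale entry, already improved
        for w, v in adj[u]:            # relax all outgoing edges of u
            if d[u] + w < d[v]:
                d[v] = d[u] + w        # "decrease-key" via lazy re-insertion
                pi[v] = u
                heapq.heappush(pq, (d[v], v))
    return d, pi

# Graph inferred from the worked example's relaxation steps (assumption)
adj = {
    'S': [(1, 'a'), (5, 'b')],
    'a': [(2, 'c'), (1, 'd'), (2, 'b')],
    'b': [(2, 'd')],
    'c': [(1, 'e')],
    'd': [(2, 'e')],
    'e': [],
}
d, pi = dijkstra(adj, 'S')
print(d)  # {'S': 0, 'a': 1, 'b': 3, 'c': 3, 'd': 2, 'e': 4}
```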
Example-
Using Dijkstra’s Algorithm, find the shortest distance from source vertex ‘S’ to remaining
vertices in the following graph-
Now,
● d[S] + 1 = 0 + 1 = 1 < ∞
∴ d[a] = 1 and Π[a] = S
● d[S] + 5 = 0 + 5 = 5 < ∞
∴ d[b] = 5 and Π[b] = S
After edge relaxation, our shortest path tree is-
Step-04:
● Vertex ‘a’ is chosen.
● This is because shortest path estimate for vertex ‘a’ is least.
● The outgoing edges of vertex ‘a’ are relaxed.
Before Edge Relaxation-
Now,
● d[a] + 2 = 1 + 2 = 3 < ∞
∴ d[c] = 3 and Π[c] = a
● d[a] + 1 = 1 + 1 = 2 < ∞
∴ d[d] = 2 and Π[d] = a
● d[a] + 2 = 1 + 2 = 3 < 5
∴ d[b] = 3 and Π[b] = a
After edge relaxation, our shortest path tree is-
Step-05:
● Vertex ‘d’ is chosen.
● This is because shortest path estimate for vertex ‘d’ is least.
● The outgoing edges of vertex ‘d’ are relaxed.
Before Edge Relaxation-
Now,
● d[d] + 2 = 2 + 2 = 4 < ∞
∴ d[e] = 4 and Π[e] = d
After edge relaxation, our shortest path tree is-
Step-06:
● Vertex ‘b’ is chosen.
● This is because shortest path estimate for vertex ‘b’ is least.
● Vertex ‘c’ may also be chosen, since both vertices have the same least shortest path estimate.
● The outgoing edges of vertex ‘b’ are relaxed.
Before Edge Relaxation-
Now,
● d[b] + 2 = 3 + 2 = 5 > 2
∴ No change
After edge relaxation, our shortest path tree remains the same as in Step-05.
Now, the sets are updated as-
● Unvisited set : {c , e}
● Visited set : {S , a , d , b}
Step-07:
● Vertex ‘c’ is chosen.
● This is because the shortest path estimate for vertex ‘c’ is least.
● The outgoing edges of vertex ‘c’ are relaxed.
Before Edge Relaxation-
Now,
● d[c] + 1 = 3 + 1 = 4 = 4
∴ No change
After edge relaxation, our shortest path tree remains the same as in Step-05.
Now, the sets are updated as-
● Unvisited set : {e}
● Visited set : {S , a , d , b , c}
Step-08:
● Vertex ‘e’ is chosen.
● This is because the shortest path estimate for vertex ‘e’ is least.
● The outgoing edges of vertex ‘e’ are relaxed.
● There are no outgoing edges for vertex ‘e’.
● So, our shortest path tree remains the same as in Step-05.
Now, the sets are updated as-
● Unvisited set : { }
● Visited set : {S , a , d , b , c , e}
Now,
● All vertices of the graph are processed.
● Our final shortest path tree is as shown below.
● It represents the shortest path from source vertex ‘S’ to all other remaining vertices.
Difference Between Bellman Ford’s and Dijkstra’s Algorithm-
● Bellman Ford’s algorithm works even when there is a negative weight edge, and it also detects negative weight cycles; Dijkstra’s algorithm doesn’t work when there is a negative weight edge.
● In Bellman Ford’s result, each vertex carries information only about the vertices it is connected to; in Dijkstra’s result, the vertices carry information about the whole network, not only the vertices they are connected to.
● Bellman Ford’s algorithm can easily be implemented in a distributed way; Dijkstra’s algorithm cannot be implemented easily in a distributed way.
● Bellman Ford’s algorithm is more time consuming, with time complexity O(VE); Dijkstra’s algorithm is less time consuming, with time complexity O(E log V).
● Bellman Ford’s algorithm takes a Dynamic Programming approach; Dijkstra’s algorithm takes a Greedy approach.
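A minimal Bellman-Ford sketch illustrating the O(VE) relaxation scheme and negative-cycle detection; the edge list is a hypothetical example with a negative weight edge (where Dijkstra’s would fail):

```python
def bellman_ford(num_vertices, edges, source):
    """edges: list of (u, v, w). Returns distances, or None on a negative cycle."""
    d = [float('inf')] * num_vertices
    d[source] = 0
    for _ in range(num_vertices - 1):      # relax every edge V-1 times: O(VE)
        for u, v, w in edges:
            if d[u] + w < d[v]:
                d[v] = d[u] + w
    for u, v, w in edges:                  # one extra pass detects negative cycles
        if d[u] + w < d[v]:
            return None
    return d

# Hypothetical graph with a negative weight edge
edges = [(0, 1, 4), (0, 2, 5), (1, 2, -3), (2, 3, 2)]
print(bellman_ford(4, edges, 0))  # → [0, 4, 1, 3]
```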
Ex:
Job        J1  J2  J3  J4  J5  J6  J7  J8
Deadline    6   5   6   6   3   4   4   5
Profit     10   8   9  12   3   6  11  13
Order by decreasing profit : J8 ≥ J4 ≥ J7 ≥ J1 ≥ J3 ≥ J2 ≥ J6 ≥ J5
Max. Profit = 63
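The greedy rule used above (take jobs in decreasing profit order, placing each in the latest free slot on or before its deadline) can be sketched as:

```python
def job_sequencing(jobs):
    """jobs: list of (name, deadline, profit). Returns (max_profit, schedule)."""
    jobs = sorted(jobs, key=lambda j: j[2], reverse=True)  # highest profit first
    max_d = max(d for _, d, _ in jobs)
    slot = [None] * (max_d + 1)            # slot[t] = job scheduled at time t
    profit = 0
    for name, d, p in jobs:
        for t in range(d, 0, -1):          # latest free slot <= deadline
            if slot[t] is None:
                slot[t] = name
                profit += p
                break
    return profit, [s for s in slot[1:] if s is not None]

jobs = [('J1', 6, 10), ('J2', 5, 8), ('J3', 6, 9), ('J4', 6, 12),
        ('J5', 3, 3), ('J6', 4, 6), ('J7', 4, 11), ('J8', 5, 13)]
print(job_sequencing(jobs))  # maximum profit = 63
```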
Ex : Linked question.
Task       T1  T2  T3  T4  T5  T6  T7  T8  T9
Deadline    7   2   5   3   4   5   2   7   3
Profit     15  20  30  18  18  10  23  16  25
Knapsack Problem :
Find the maximum profit by placing the objects below into a knapsack of capacity M = 20.
Objects :              O1   O2   O3
Weights (W1, W2, W3) : 18   15   10
Profits (P1, P2, P3) : 25   24   15
Greedy about weight ⇒ Profit = 0(25) + (10/15)(24) + 1(15) = 31
Greedy about profit ⇒ Profit = 1(25) + (2/15)(24) + 0(15) = 28.2
Greedy about unit weight profit ⇒
P1/W1 = 25/18 ≈ 1.39
P2/W2 = 24/15 = 1.6
P3/W3 = 15/10 = 1.5
⇒ P2/W2 ≥ P3/W3 ≥ P1/W1
∴ Profit = 0(25) + 1(24) + (5/10)(15) = 31.5
Shortcut :
Arrange all objects in decreasing order of unit weight profit (Pi/Wi) and then process them in that order without exceeding the knapsack size.
→ Since we are using a priority queue, where the priority is the unit weight profit, the time complexity = O(n log n).
Note :
Let ‘xi’ denote the decision on the ith object; then 0 ≤ xi ≤ 1 (fractional knapsack problem).
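The shortcut above can be sketched as follows, using the example’s data (capacity M = 20, as inferred from the fractions in the worked example):

```python
def fractional_knapsack(weights, profits, capacity):
    """Greedy on profit/weight ratio; each decision x_i satisfies 0 <= x_i <= 1."""
    items = sorted(zip(weights, profits),
                   key=lambda wp: wp[1] / wp[0], reverse=True)  # best ratio first
    total = 0.0
    for w, p in items:
        take = min(w, capacity)        # whole object if it fits, else a fraction
        total += p * (take / w)
        capacity -= take
        if capacity == 0:
            break
    return total

# The example above: M = 20, weights (18, 15, 10), profits (25, 24, 15)
print(fractional_knapsack([18, 15, 10], [25, 24, 15], 20))  # → 31.5
```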
Continue the process until all files are merged.
→ Here, n = 3 files: F1 (30), F2 (10), F3 (20)
Orderings : 3! = 6 ways; since a merge (I1 + I2) is symmetric, distinct merge patterns = n!/2 = 3!/2 = 3 ways.
In (a), total no. of record movements = (30 + 10) + 60 = 100
In (b), total no. of record movements = (10 + 20) + 60 = 90
In (c), total no. of record movements = (30 + 20) + 60 = 110
Ex : Find the least no. of record movements for merging the following files.
File  F1  F2  F3  F4  F5  F6  F7
Size   4   8  15   3   7   9  10
A. 150
B. 171
C. 169
D. 170
F4 ≤ F1 ≤ F5 ≤ F2 ≤ F6 ≤ F7 ≤ F3
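The optimal merge pattern (always merge the two smallest files, as in the sorted order above) can be sketched with a min-heap:

```python
import heapq

def optimal_merge(sizes):
    """Total record movements when repeatedly merging the two smallest files."""
    heapq.heapify(sizes)
    moves = 0
    while len(sizes) > 1:
        a = heapq.heappop(sizes)
        b = heapq.heappop(sizes)
        moves += a + b             # merging two files moves every record once
        heapq.heappush(sizes, a + b)
    return moves

print(optimal_merge([4, 8, 15, 3, 7, 9, 10]))  # → 150, i.e. option A
```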
Note :
Total no. of record movements = Σ (i = 1 to n) di qi, where di is the depth of file i in the merge tree and qi is its size.
Huffman Encoding :
The objective of Huffman Encoding is to encode letters with the least no. of bits. Let us consider a Gmail application which contains the letters a, b, c, d, e with corresponding frequencies 10, 20, 4, 15, 6 respectively.
With a fixed-length code, since there are 5 letters, we need at least 3 bits per letter.
So, total no. of bits required = (10 + 20 + 4 + 15 + 6) × 3 = 165
No. of bits    No. of letters that can be encoded
1              2 (0 − a, 1 − b)
2              4 (00 − a, 01 − d, 10 − c, 11 − b)
3              8 (000, 001, …, 111)
Step 1
Arrange all letters in increasing order of their frequencies, and then in each iteration construct a binary tree by assigning the 1st least-frequency letter as the left child and the 2nd least-frequency letter as the right child.
Step 2
After constructing the binary tree, on every root-to-leaf path each left branch is assigned 0 and each right branch is assigned 1.
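The two steps can be sketched with a min-heap over frequencies; for counting the total bits only the merge totals are needed (each merge adds one bit to every symbol beneath it):

```python
import heapq

def huffman_total_bits(freqs):
    """Minimum total bits to encode all occurrences (sum of freq * code length)."""
    heap = list(freqs)
    heapq.heapify(heap)
    total = 0
    while len(heap) > 1:
        a = heapq.heappop(heap)    # 1st least-frequent subtree -> left child
        b = heapq.heappop(heap)    # 2nd least-frequent subtree -> right child
        total += a + b             # new internal node lengthens all codes below
        heapq.heappush(heap, a + b)
    return total

# a, b, c, d, e with frequencies 10, 20, 4, 15, 6
print(huffman_total_bits([10, 20, 4, 15, 6]))  # → 120 bits, vs 165 with fixed 3-bit codes
```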
Note :
Total no. of bits required = Σ (i = 1 to n) di qi, where di is the code length (depth) of letter i and qi is its frequency.
****