DAA Unit-3 PPT - Part-1
Greedy Method
Introduction
• A greedy algorithm is the most straightforward design technique.
• Most of these problems have n inputs and require us to obtain a subset that satisfies some constraints. Any subset that satisfies these constraints is called a feasible solution.
• We need to find a feasible solution that either maximizes or minimizes a given objective function. A feasible solution that does this is called an optimal solution.
The Greedy Method
• Greedy algorithms make good local choices in the hope that they lead to an optimal solution.
• They always produce a feasible solution, but not necessarily an optimal one.
• A proof is needed to show that the algorithm finds an optimal solution.
The Greedy Method Strategy
Algorithm: Greedy method control abstraction for the subset paradigm.
A real-time example is loading containers onto a ship.
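A minimal sketch of this control abstraction is given below, assuming the three problem-specific routines select(), feasible(), and combine() are supplied by the caller; the container-loading instance in the driver is an assumed example, not data from these slides.

def greedy(inputs, select, feasible, combine):
    """Generic greedy method: build a solution one input at a time."""
    solution = []                      # start with an empty (feasible) solution
    remaining = list(inputs)
    while remaining:
        x = select(remaining)          # greedy choice: locally best remaining input
        remaining.remove(x)
        if feasible(solution, x):      # keep x only if the constraints still hold
            solution = combine(solution, x)
    return solution

if __name__ == "__main__":
    # Assumed container-loading instance: container weights, ship capacity 400.
    weights = [100, 200, 50, 90, 150, 50, 20, 80]
    load = greedy(
        weights,
        select=min,                                   # lightest remaining container first
        feasible=lambda sol, w: sum(sol) + w <= 400,  # ship capacity constraint
        combine=lambda sol, w: sol + [w],
    )
    print(load, sum(load))             # loads as many containers as possible within capacity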
Knapsack Problem
Solution 1: objects selected at random.
Solution 2: objects considered in non-increasing order of profit.
Solution 3: objects considered in non-decreasing order of weight.
Solution 4: objects considered in non-increasing order of the profit/weight ratio (pi/wi), that is, p1/w1 ≥ p2/w2 ≥ ... ≥ pn/wn; with this ordering, the greedy knapsack generates the optimal solution.
Algorithm GreedyKnapsack(m, n)
// p[1:n] and w[1:n] contain the profits and weights respectively of the n objects,
// ordered such that p[i]/w[i] >= p[i+1]/w[i+1].
// m is the knapsack size and x[1:n] is the solution vector.
{
    for i := 1 to n do
        x[i] := 0.0;              // initialize the solution vector
    U := m;                       // U is the remaining knapsack capacity
    for i := 1 to n do
    {
        if (w[i] > U) then break; // the next object no longer fits completely
        x[i] := 1.0;              // take the whole object
        U := U - w[i];
    }
    if (i <= n) then
        x[i] := U / w[i];         // take a fraction of the first object that did not fit
}
Time Complexity
The dominant step is sorting the items in non-increasing order of their profit/weight ratios, which takes O(n log n) time.
If the items are already arranged in the required order, the for loop takes O(n) time.
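A runnable Python sketch of the same GreedyKnapsack strategy is given below; the three-object instance in the driver (profits 25, 24, 15; weights 18, 15, 10; m = 20) is an assumed illustration, not taken from these slides.

def greedy_knapsack(profits, weights, m):
    n = len(profits)
    # Sort object indices by non-increasing profit/weight ratio (the O(n log n) step).
    order = sorted(range(n), key=lambda i: profits[i] / weights[i], reverse=True)
    x = [0.0] * n          # solution vector: fraction of each object taken
    u = m                  # remaining knapsack capacity
    for i in order:
        if weights[i] > u:             # object no longer fits completely
            x[i] = u / weights[i]      # take the largest possible fraction
            break
        x[i] = 1.0                     # take the whole object
        u -= weights[i]
    total_profit = sum(x[i] * profits[i] for i in range(n))
    return x, total_profit

if __name__ == "__main__":
    # Assumed instance: n = 3, profits (25, 24, 15), weights (18, 15, 10), m = 20.
    x, p = greedy_knapsack([25, 24, 15], [18, 15, 10], 20)
    print(x, p)            # expected: [0.0, 1.0, 0.5] with profit 31.5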
JOB SEQUENCING WITH DEADLINES
Time Complexity: O(n²)
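Briefly: each job takes one unit of time and has a deadline and a profit, and the goal is to select a subset of jobs that can all be completed by their deadlines with maximum total profit. A sketch of the usual greedy rule follows: consider jobs in non-increasing order of profit and place each job in the latest free time slot on or before its deadline; scanning the slots this way gives the O(n²) bound above. The four-job instance in the driver is an assumed example.

def job_sequencing(jobs):
    """jobs: list of (job_id, deadline, profit). Returns (schedule, total_profit)."""
    # Consider jobs in non-increasing order of profit.
    jobs = sorted(jobs, key=lambda j: j[2], reverse=True)
    max_deadline = max(d for _, d, _ in jobs)
    slots = [None] * (max_deadline + 1)      # slots[1..max_deadline]; None = free
    total = 0
    for job_id, deadline, profit in jobs:
        # Place the job in the latest free slot at or before its deadline.
        for t in range(min(deadline, max_deadline), 0, -1):
            if slots[t] is None:
                slots[t] = job_id
                total += profit
                break
    schedule = [j for j in slots[1:] if j is not None]
    return schedule, total

if __name__ == "__main__":
    # Assumed example: 4 jobs given as (id, deadline, profit).
    print(job_sequencing([(1, 2, 100), (2, 1, 10), (3, 2, 15), (4, 1, 27)]))
    # expected: jobs 4 and 1 are scheduled, total profit 127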
MINIMUM-COST SPANNING TREES
There are two algorithms to find a minimum-cost spanning tree:
i) Prim's algorithm
ii) Kruskal's algorithm
Prim’s algorithm.
Cost matrix of the given graph G (∞ means no edge); the arrays t[1:n-1, 1:2] and near[] trace the tree edges selected at each step:

i/j    1    2    3    4    5    6    7
 1     0   28    ∞    ∞    ∞   10    ∞
 2    28    0   16    ∞    ∞    ∞   14
 3     ∞   16    0   12    ∞    ∞    ∞
 4     ∞    ∞   12    0   22    ∞   18
 5     ∞    ∞    ∞   22    0   25   24
 6    10    ∞    ∞    ∞   25    0    ∞
 7     ∞   14    ∞   18   24    ∞    0
Time Complexity: depends on the data structure used to represent G.
1. O(n²) if we use an adjacency matrix to read the graph edges, where n is the number of vertices in G.
2. O((n + |E|) log n) if we use an adjacency list together with a heap.
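A Python sketch of the O(n²) adjacency-matrix version of Prim's algorithm is given below; it uses the 7-vertex cost matrix reconstructed above (INF stands for ∞), and the names in_tree and near are illustrative rather than taken from the slides.

INF = float("inf")

COST = [
    [0, 28, INF, INF, INF, 10, INF],
    [28, 0, 16, INF, INF, INF, 14],
    [INF, 16, 0, 12, INF, INF, INF],
    [INF, INF, 12, 0, 22, INF, 18],
    [INF, INF, INF, 22, 0, 25, 24],
    [10, INF, INF, INF, 25, 0, INF],
    [INF, 14, INF, 18, 24, INF, 0],
]

def prim(cost):
    n = len(cost)
    in_tree = [False] * n
    in_tree[0] = True                 # start the tree at vertex 1 (index 0)
    near = list(cost[0])              # near[j] = cost of cheapest edge from j into the tree
    tree_edges, mincost = [], 0
    for _ in range(n - 1):
        # Pick the non-tree vertex closest to the tree.
        j = min((v for v in range(n) if not in_tree[v]), key=lambda v: near[v])
        # Find the tree vertex that realises near[j] and record the edge (1-based labels).
        u = min((v for v in range(n) if in_tree[v]), key=lambda v: cost[v][j])
        tree_edges.append((u + 1, j + 1))
        mincost += near[j]
        in_tree[j] = True
        for v in range(n):            # update near[] with edges leaving the new tree vertex
            if not in_tree[v]:
                near[v] = min(near[v], cost[j][v])
    return tree_edges, mincost

if __name__ == "__main__":
    print(prim(COST))                 # expected minimum cost: 99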
Kruskal’s algorithm.
Minimum Cost Spanning Tree – KRUSKAL's Algorithm
Algorithm Kruskal(E, cost, n, t)
// E is the set of edges in G; G has n vertices. cost[u, v] is the cost of edge (u, v).
// t is the set of edges in the minimum-cost spanning tree. The final cost is returned.
{
    Construct a min-heap out of the edge costs using Heapify;
    for i := 1 to n do parent[i] := -1;   // each vertex is in a different set
    i := 0; mincost := 0;
    while ((i < n - 1) and (heap not empty)) do
    {
        Delete a minimum-cost edge (u, v) from the heap and reheapify using Adjust;
        j := Find(u); k := Find(v);
        if (j != k) then
        {
            i := i + 1;
            t[i, 1] := u; t[i, 2] := v;
            mincost := mincost + cost[u, v];
            Union(j, k);
        }
    }
    if (i != n - 1) then write("no spanning tree");
    else return mincost;
}

Time Complexity: O(|E| log |E|), where |E| is the number of edges and each heap operation takes O(log |E|) time.
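A runnable Python sketch of the same procedure follows, using heapq as the edge min-heap and a small union-find in place of Find/Union; the edge list is the 7-vertex example graph used for Prim's algorithm.

import heapq

def kruskal(n, edges):
    """n: number of vertices (1..n); edges: list of (cost, u, v). Returns (t, mincost)."""
    heap = list(edges)
    heapq.heapify(heap)               # build the min-heap of edge costs
    parent = list(range(n + 1))       # each vertex starts as its own set representative

    def find(x):                      # find the set representative (with path halving)
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    t, mincost = [], 0
    while heap and len(t) < n - 1:
        c, u, v = heapq.heappop(heap) # delete the minimum-cost edge
        j, k = find(u), find(v)
        if j != k:                    # edge joins two different components: accept it
            t.append((u, v))
            mincost += c
            parent[j] = k             # Union(j, k)
    if len(t) != n - 1:
        raise ValueError("no spanning tree")
    return t, mincost

if __name__ == "__main__":
    edges = [(28, 1, 2), (16, 2, 3), (14, 2, 7), (12, 3, 4), (22, 4, 5),
             (18, 4, 7), (24, 5, 7), (25, 5, 6), (10, 1, 6)]
    print(kruskal(7, edges))          # expected minimum cost: 99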
Min-heap of the edge costs (level order): (1,6)=10; (4,3)=12, (2,7)=14; (2,3)=16, (4,7)=18, (5,7)=24, (5,6)=25; (4,5)=22, (1,2)=28.
Difference between Prim's Algorithm and Kruskal's Algorithm
Single-source shortest paths algorithm (Dijkstra's Algorithm)
Cost matrix of the given graph G (∞ means no edge):

i/j    1    2    3    4    5    6
 1     0   50   45   10    ∞    ∞
 2     ∞    0   10   15    ∞    ∞
 3     ∞    ∞    0    ∞   30    ∞
 4    20    ∞    ∞    0   15    ∞
 5     ∞   20   35    ∞    0    ∞
 6     ∞    ∞    ∞    ∞    3    0

Initially (source vertex 1), every vertex is still unvisited (flag f) and Dist[] holds the direct edge costs from vertex 1:

i    Dist[i]
1    0
2    50
3    45
4    10
5    ∞
6    ∞
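A Python sketch of Dijkstra's algorithm on this cost matrix (the O(n²) adjacency-matrix version, with vertex 1 as the source) is given below; INF stands for ∞.

INF = float("inf")

COST = [
    [0, 50, 45, 10, INF, INF],
    [INF, 0, 10, 15, INF, INF],
    [INF, INF, 0, INF, 30, INF],
    [20, INF, INF, 0, 15, INF],
    [INF, 20, 35, INF, 0, INF],
    [INF, INF, INF, INF, 3, 0],
]

def dijkstra(cost, src):
    n = len(cost)
    dist = list(cost[src])            # dist[j] = length of shortest known path src -> j
    dist[src] = 0
    done = [False] * n
    done[src] = True
    for _ in range(n - 1):
        # Pick the closest vertex not yet finalised.
        u = min((v for v in range(n) if not done[v]), key=lambda v: dist[v])
        done[u] = True
        for w in range(n):            # relax the edges leaving u
            if not done[w] and dist[u] + cost[u][w] < dist[w]:
                dist[w] = dist[u] + cost[u][w]
    return dist

if __name__ == "__main__":
    print(dijkstra(COST, 0))          # expected: [0, 45, 45, 10, 25, inf]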
Bellman-Ford's Algorithm vs. Dijkstra's Algorithm
2. Bellman-Ford: the result contains information about the other vertices each vertex is connected to. Dijkstra: the result contains the whole information about the network, not only the vertices they are connected to.
3. Bellman-Ford can easily be implemented in a distributed way. Dijkstra cannot be implemented easily in a distributed way.
4. Bellman-Ford is more time consuming than Dijkstra's algorithm; its time complexity is O(V·E). Dijkstra is less time consuming; its time complexity is O(E log V).
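For comparison with the table above, a minimal sketch of Bellman-Ford follows: it simply relaxes every edge n-1 times, which is why its running time is O(V·E), and unlike Dijkstra's algorithm it tolerates negative edge weights. The four-vertex edge list in the driver is an assumed example.

def bellman_ford(n, edges, src):
    """n vertices numbered 0..n-1; edges: list of (u, v, w). Returns dist[] from src."""
    INF = float("inf")
    dist = [INF] * n
    dist[src] = 0
    for _ in range(n - 1):            # n-1 rounds of relaxing all edges
        for u, v, w in edges:
            if dist[u] + w < dist[v]:
                dist[v] = dist[u] + w
    for u, v, w in edges:             # one extra pass detects negative cycles
        if dist[u] + w < dist[v]:
            raise ValueError("graph contains a negative-weight cycle")
    return dist

if __name__ == "__main__":
    # Assumed example: 4 vertices; edge (2, 3) has negative weight -4.
    edges = [(0, 1, 6), (0, 2, 7), (1, 3, 5), (2, 3, -4), (1, 2, 8)]
    print(bellman_ford(4, edges, 0))  # expected: [0, 6, 7, 3]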
Dijkstra’s Algorithm is one example of a single-source shortest or SSSP algorithm, i.e., given a source vertex it finds
shortest path from source to all other vertices.
Floyd Warshall Algorithm is an example of all-pairs shortest path algorithm, meaning it computes the shortest path
between all pair of nodes.
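To contrast the two families, a sketch of Floyd-Warshall is given below: it fills an n x n distance matrix for every pair of vertices in O(n³) time, allowing one more intermediate vertex per outer iteration. The four-vertex cost matrix in the driver is an assumed example.

INF = float("inf")

def floyd_warshall(cost):
    n = len(cost)
    dist = [row[:] for row in cost]   # start from the direct edge costs
    for k in range(n):                # allow vertex k as an intermediate stop
        for i in range(n):
            for j in range(n):
                if dist[i][k] + dist[k][j] < dist[i][j]:
                    dist[i][j] = dist[i][k] + dist[k][j]
    return dist

if __name__ == "__main__":
    cost = [
        [0, 5, INF, 10],
        [INF, 0, 3, INF],
        [INF, INF, 0, 1],
        [INF, INF, INF, 0],
    ]
    for row in floyd_warshall(cost):
        print(row)                    # first row becomes [0, 5, 8, 9]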