UNIT-3
The Greedy Method: Introduction, Huffman Trees and codes, Minimum
Coin Change problem, Knapsack problem, Job sequencing with deadlines,
Minimum Cost Spanning Trees, Single Source Shortest paths.
Q) Define the following terms.
i. Feasible solution ii. Objective function iii. Optimal solution
Feasible Solution: Any subset of the inputs that satisfies the given constraints is
called a feasible solution.
Objective Function: The function that a feasible solution is required to maximize
or minimize is called the objective function.
Optimal Solution: A feasible solution that maximizes or minimizes the given
objective function is called an optimal solution.
Q) Describe Greedy technique with an example.
The greedy method constructs a solution to an optimization problem piece
by piece through a sequence of choices that are:
• feasible, i.e., satisfying the problem's constraints;
• locally optimal, i.e., the best choice among all feasible choices available
at that step;
• irrevocable, i.e., once made, a choice cannot be changed on subsequent steps
of the algorithm.
For some problems, it yields a globally optimal solution for every
instance.
The following control abstraction describes the general greedy approach for the
subset paradigm.
Algorithm Greedy(a, n)
// a[1:n] contains the n inputs.
{
    solution := ∅;   // initialize the solution to empty
    for i := 1 to n do
    {
        x := Select(a);
        if Feasible(solution, x) then
            solution := Union(solution, x);
    }
    return solution;
}
Eg. Minimum Coin Change:
Given unlimited amounts of coins of denominations d1 > d2 > … > dm,
give change for amount n using the least number of coins.
Here, d1 = 25c, d2 = 10c, d3 = 5c, d4 = 1c and n = 48c.
Greedy approach: at each step, take the largest-denomination coin that is
less than or equal to the remaining amount.
Step 1: 48 – 25 = 23
Step 2: 23 – 10 = 13
Step 3: 13 – 10 = 03
Step 4: 03 – 01 = 02
Step 5: 02 – 01 = 01
Step 6: 01 – 01 = 00
Solution: <1, 2, 0, 3>, i.e., d1 – 1 coin, d2 – 2 coins, d3 – 0 coins and d4 –
3 coins.
The greedy solution is optimal for any amount with a "normal" set of
denominations such as this one.
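
As an illustration, here is a short Python sketch (not part of the original notes) of
the greedy change-making procedure described above, run on the same example.

    def greedy_change(denominations, amount):
        # Repeatedly take as many coins as possible of the largest remaining denomination.
        counts = []
        for d in sorted(denominations, reverse=True):
            counts.append(amount // d)   # number of coins of denomination d used
            amount %= d                  # amount still to be changed
        return counts

    # Example from above: d1 = 25c, d2 = 10c, d3 = 5c, d4 = 1c and n = 48c
    print(greedy_change([25, 10, 5, 1], 48))   # prints [1, 2, 0, 3]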
Q) Explain Huffman tree and Huffman code with suitable example.
A Huffman tree is a binary tree whose edges are labeled with 0's and 1's; it yields
a prefix-free code for the characters assigned to its leaves.
Huffman coding (a prefix code) is a lossless data compression algorithm.
The idea is to assign variable-length codes to the input characters, where the
lengths of the assigned codes are based on the frequencies of the corresponding
characters.
Algorithm to build Huffman tree:
// Input is an array of unique characters along with their frequency of
occurrences and output is Huffman Tree.
1. Create a leaf node for each unique character and build a min heap of all
leaf nodes.
2. Extract two nodes with the minimum frequency from the min heap.
3. Create a new internal node with a frequency equal to the sum of the two
nodes' frequencies. Make the first extracted node its left child and the
other extracted node its right child. Add this node to the min heap.
4. Repeat steps 2 and 3 until the heap contains only one node. The
remaining node is the root and the tree is complete.
Time complexity: O(n log n), where n is the number of unique characters. If
there are n leaf nodes, extractMin() is called 2(n – 1) times, and each call takes
O(log n) time as it calls minHeapify(). So the overall complexity is O(n log n).
Eg.
Character:  A     B     C     D     _
Frequency:  0.35  0.10  0.20  0.20  0.15
Without Huffman coding, a fixed-length encoding would assign the codewords
001, 010, 011, 100 and 101, i.e., on average we need 3 bits to represent a
character.
Steps 1–5: (the tree-construction figures are omitted; at each step the two
lowest-frequency nodes are merged into a new node whose frequency is their sum.)
Therefore, the codewords obtained with Huffman coding are:
Character:  A     B     C     D     _
Frequency:  0.35  0.10  0.20  0.20  0.15
Codeword:   11    100   00    01    101
Average bits per character using Huffman coding
= 2*0.35 + 3*0.1 + 2*0.2 + 2*0.2 + 3*0.15
= 2.25
Therefore, compression ratio: (3 - 2.25)/3*100% = 25%
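
A compact Python sketch of the heap-based construction is given below; it is an
illustration (not part of the original notes) and uses the standard heapq module.
For the example frequencies above it reproduces an average code length of 2.25
bits; the exact codewords may differ from the table depending on how ties are
broken.

    import heapq

    def huffman_codes(freq):
        # Each heap entry is (frequency, tie-breaker, {character: code built so far}).
        heap = [(f, i, {ch: ""}) for i, (ch, f) in enumerate(freq.items())]
        heapq.heapify(heap)
        counter = len(heap)
        while len(heap) > 1:
            f1, _, left = heapq.heappop(heap)    # two nodes with minimum frequency
            f2, _, right = heapq.heappop(heap)
            merged = {ch: "0" + c for ch, c in left.items()}          # 0 for the left subtree
            merged.update({ch: "1" + c for ch, c in right.items()})   # 1 for the right subtree
            heapq.heappush(heap, (f1 + f2, counter, merged))          # new internal node
            counter += 1
        return heap[0][2]

    freq = {"A": 0.35, "B": 0.1, "C": 0.2, "D": 0.2, "_": 0.15}
    codes = huffman_codes(freq)
    print(codes)
    print(sum(freq[ch] * len(codes[ch]) for ch in freq))   # 2.25 bits per character on average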
Q) Briefly explain about knapsack problem with an example.
Knapsack Problem
Given a set of items, each with a weight and a value, determine a subset of
items to include in a collection so that the total weight is less than or equal
to a given limit and the total value is as large as possible.
Fractional Knapsack
In this case, items can be broken into smaller pieces, hence we can select
fractions of items.
According to the problem statement,
• there are n items in the store,
• the weight of the ith item is wi > 0,
• the profit of the ith item is pi > 0, and
• the capacity of the knapsack is W.
In this version of the knapsack problem, items can be broken into smaller
pieces, so the thief may take only a fraction xi of the ith item, where 0 ≤ xi ≤ 1.
The ith item contributes weight xi·wi to the total weight in the knapsack and
profit xi·pi to the total profit.
Hence, the objective of this algorithm is to
    maximize  Ī£ (i = 1 to n) pi · xi
subject to the constraint
    Ī£ (i = 1 to n) wi · xi ≤ W.
It is clear that an optimal solution must fill the knapsack exactly; otherwise
we could add a fraction of one of the remaining items and increase the
overall profit. Thus, an optimal solution satisfies
    Ī£ (i = 1 to n) wi · xi = W.
Algorithm GreedyKnapsack(m, n)
// p[1:n] and w[1:n] contain the profits and weights respectively.
// All n objects are ordered so that p[i]/w[i] ≄ p[i+1]/w[i+1].
// m is the knapsack size and x[1:n] is the solution vector.
{
    for i := 1 to n do
        x[i] := 0.0;
    u := m;
    for i := 1 to n do
    {
        if (w[i] > u) then break;
        x[i] := 1.0;
        u := u - w[i];
    }
    if (i ≤ n) then
        x[i] := u/w[i];
}
Analysis
If the provided items are already sorted in decreasing order of pi/wi,
then the for loop takes O(n) time; therefore, the total time including
the sort is O(n log n).
Eg. Let us consider that the capacity of the knapsack W = 60 and the list of
provided items are shown in the following table āˆ’
Item A B C D
Profit 280 100 120 120
Weight 40 10 20 24
Step 1: find p/w ratio for each item.
Item A B C D
Profit 280 100 120 120
Weight 40 10 20 24
Ratio pi/wi 7 10 6 5
Step 2:
The provided items are not sorted by pi/wi. After sorting, the items
are as shown in the following table.
Item B A C D
Profit 100 280 120 120
Weight 10 40 20 24
Ratio pi/wi 10 7 6 5
Step 3:
Item B is chosen first, as the weight of B is less than the capacity of the
knapsack.
Remaining capacity = 60 – 10 = 50.
Step 4:
Item A is chosen next, as the available capacity of the knapsack is greater than
the weight of A.
Remaining capacity = 50 – 40 = 10.
Step 5:
Now, C is chosen as the next item. However, the whole item cannot be
taken, as the remaining capacity of the knapsack is less than the weight
of C.
Hence, a fraction of C, namely (60 − 50)/20 = 10/20, is chosen.
Now, the total weight of the selected items equals the capacity of the
knapsack, so no more items can be selected.
The total weight of the selected items is 10 + 40 + 20 * (10/20) = 60,
and the total profit is 100 + 280 + 120 * (10/20) = 380 + 60 = 440.
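
The following Python sketch (an illustration, not part of the original notes) mirrors
GreedyKnapsack: it sorts the items by profit/weight ratio, takes whole items while
they fit, and then a fraction of the first item that does not. On the example above
it returns the solution vector (1, 1, 0.5, 0) for items A, B, C, D and a total profit of 440.

    def greedy_knapsack(profits, weights, capacity):
        # Consider items in decreasing order of profit/weight ratio.
        order = sorted(range(len(profits)), key=lambda i: profits[i] / weights[i], reverse=True)
        x = [0.0] * len(profits)            # solution vector: fraction of each item taken
        remaining = capacity
        for i in order:
            if weights[i] > remaining:
                x[i] = remaining / weights[i]   # take only the fraction that still fits
                break
            x[i] = 1.0                      # take the whole item
            remaining -= weights[i]
        total_profit = sum(x[i] * profits[i] for i in range(len(profits)))
        return x, total_profit

    # Items A, B, C, D from the tables above, with knapsack capacity W = 60
    print(greedy_knapsack([280, 100, 120, 120], [40, 10, 20, 24], 60))
    # prints ([1.0, 1.0, 0.5, 0.0], 440.0)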
Q) Explain job sequencing with deadlines in detail with an example.
We are given a set of n jobs. Associated with job i is an integer deadline di ≄
0 and a profit pi>0. For any job i, the profit pi is earned iff the job is
completed by its deadline.
To complete a job one has to process the job on a machine for one unit of
time. Only one machine is available for processing jobs.
A feasible solution for this problem is a subset J of jobs such that each job
in this subset can be completed by its deadline.
The value of a feasible solution J is the sum of the profits of the jobs in J,
i.e., Ī£ (i ∈ J) pi.
An optimal solution is a feasible solution with maximum value.
Eg. (The table of feasible subsets is omitted.)
This is an exhaustive technique in which we check all feasible one- and two-job
subsets; the optimal is the third sequence, i.e., processing the jobs in the
order 4, 1.
The following algorithm is a high level description of job sequencing:
The following algorithm JS is the detailed implementation of the above
high-level description:
The algorithm assumes that the jobs are already sorted such that p1 ≄ p2 ≄ … ≄ pn.
Further, it assumes that n ≄ 1 and that the deadline d[i] of job i is at least 1.
For the algorithm JS there are two possible parameters in terms of
which its time complexity can be measured:
1. the number of jobs, n, and
2. the number of jobs included in the solution J, which is s.
The while loop in the algorithm is iterated at most k times, and each
iteration takes O(1) time.
The body of the if statement requires O(k − r) time to insert a job i.
Hence the total time for each iteration of the for loop is O(k), and this loop is
iterated n − 1 times.
If s is the final value of k, that is, s is the number of jobs in the final
solution, then the total time needed by the algorithm is O(sn). Since s ≤ n, the
worst-case time complexity is O(n²).
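
A simple Python sketch of the greedy strategy is given below (an illustration, not
the JS algorithm from the omitted figure): jobs are considered in decreasing order
of profit and each job is placed into the latest free time slot on or before its
deadline. The instance used (profits 100, 10, 15, 27 and deadlines 2, 1, 2, 1) is a
hypothetical one, chosen so that the optimal schedule is the 4, 1 sequence
mentioned above.

    def job_sequencing(profits, deadlines):
        n = len(profits)
        order = sorted(range(n), key=lambda i: profits[i], reverse=True)  # highest profit first
        max_d = max(deadlines)
        slot = [None] * (max_d + 1)       # slot[t] = job scheduled in time slot t (1..max_d)
        for i in order:
            t = min(deadlines[i], max_d)
            while t > 0 and slot[t] is not None:
                t -= 1                    # look for the latest free slot on or before the deadline
            if t > 0:
                slot[t] = i
        scheduled = [j for j in slot[1:] if j is not None]
        return scheduled, sum(profits[j] for j in scheduled)

    profits = [100, 10, 15, 27]           # jobs 1..4 (0-indexed in the code)
    deadlines = [2, 1, 2, 1]
    print(job_sequencing(profits, deadlines))   # prints ([3, 0], 127), i.e. job 4 then job 1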
Q) What is minimum spanning tree?
i) Explain Prim’s algorithm with an example.
ii) Explain Kruskal’s algorithm with an example.
A spanning tree of an undirected connected graph is a connected acyclic
subgraph (i.e., a tree) that contains all the vertices of the graph.
If such a graph has weights assigned to its edges, a minimum spanning tree
is a spanning tree of smallest weight, where the weight of a tree is defined as
the sum of the weights of all its edges.
The minimum spanning tree problem is the problem of finding a minimum
spanning tree for a given weighted connected graph.
Eg. (Figure omitted.)
In the figure, (a) is the given graph and (b), (c) are two different spanning
trees; (c) is the minimum spanning tree, as it has lower cost than (b).
i. Prim’s algorithm:
• Start with a tree T1 consisting of a single (arbitrary) vertex and ā€œgrowā€ the
tree one vertex at a time to produce the MST through a series of expanding
subtrees T1, T2, …, Tn.
• On each iteration, construct Ti+1 from Ti by adding the vertex not in Ti
that is closest to those already in Ti (this is the ā€œgreedyā€ step!).
• Stop when all vertices are included.
• A priority queue is needed for locating the closest fringe (not yet visited) vertex.
• Efficiency:
  i. O(n²) for the weight-matrix representation of the graph and an array
     implementation of the priority queue;
  ii. O(m log n) for the adjacency-list representation of a graph with n
      vertices and m edges and a min-heap implementation of the
      priority queue (see the sketch below).
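
The following Python sketch (an illustration, not from the original notes) implements
Prim's algorithm with the min-heap variant described above; the example graph is a
hypothetical one, since the figures for the worked examples are not reproduced here.

    import heapq

    def prim_mst(graph, start):
        # graph: {vertex: [(neighbor, weight), ...]} for an undirected weighted graph.
        visited = {start}
        mst_edges, total = [], 0
        fringe = [(w, start, v) for v, w in graph[start]]
        heapq.heapify(fringe)
        while fringe and len(visited) < len(graph):
            w, u, v = heapq.heappop(fringe)     # cheapest edge leaving the current tree
            if v in visited:
                continue                        # stale entry: v was added via a cheaper edge
            visited.add(v)
            mst_edges.append((u, v, w))
            total += w
            for x, wx in graph[v]:
                if x not in visited:
                    heapq.heappush(fringe, (wx, v, x))
        return mst_edges, total

    graph = {                                   # hypothetical example graph
        "a": [("b", 3), ("c", 5)],
        "b": [("a", 3), ("c", 1), ("d", 4)],
        "c": [("a", 5), ("b", 1), ("d", 6)],
        "d": [("b", 4), ("c", 6)],
    }
    print(prim_mst(graph, "a"))                 # MST edges a-b, b-c, b-d with total weight 8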
(Worked examples of Prim's algorithm on specific graphs are omitted here.)
ii. Kruskal’s algorithm:
• Sort the edges in nondecreasing order of their weights.
• ā€œGrowā€ the tree one edge at a time to produce the MST through a series
of expanding forests F1, F2, …, Fn−1.
• On each iteration, add the next edge on the sorted list unless
this would create a cycle (if it would, skip the edge).
• The algorithm looks easier than Prim’s but is harder to implement
(checking for cycles!).
• Cycle checking: a cycle is created iff the added edge connects vertices in
the same connected component.
• Runs in O(m log m) time, with m = |E|; the time is mostly spent on
sorting. (A union-find sketch follows.)
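
A Python sketch of Kruskal's algorithm follows (an illustration, not from the
original notes), using a simple union-find structure for the cycle check; it is run
on the same hypothetical graph as the Prim's sketch above and finds an MST of the
same total weight.

    def kruskal_mst(vertices, edges):
        # edges: list of (weight, u, v); vertices: list of vertex names.
        parent = {v: v for v in vertices}       # union-find forest, one tree per component

        def find(v):
            while parent[v] != v:
                parent[v] = parent[parent[v]]   # path halving
                v = parent[v]
            return v

        mst_edges, total = [], 0
        for w, u, v in sorted(edges):           # nondecreasing order of edge weights
            ru, rv = find(u), find(v)
            if ru != rv:                        # different components, so no cycle is created
                parent[ru] = rv                 # union the two components
                mst_edges.append((u, v, w))
                total += w
        return mst_edges, total

    edges = [(3, "a", "b"), (5, "a", "c"), (1, "b", "c"), (4, "b", "d"), (6, "c", "d")]
    print(kruskal_mst(["a", "b", "c", "d"], edges))   # total weight 8, same MST cost as Prim's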
Q) Explain in detail the single-source shortest path problem.
Single Source Shortest Paths Problem: Given a weighted connected
(directed) graph G, find shortest paths from source vertex s to each of the
other vertices.
Dijkstra’s algorithm: Similar to Prim’s MST algorithm, with a different way
of computing numerical labels: Among vertices not already in the tree, it
finds vertex u with the smallest sum
dv + w(v,u)
where
  v is a vertex for which the shortest path has already been found
  on preceding iterations (such vertices form a tree rooted at s),
  dv is the length of the shortest path from the source s to v, and
  w(v,u) is the length (weight) of the edge from v to u.
• Does not work for graphs with negative edge weights.
• Applicable to both undirected and directed graphs.
• Efficiency:
  o O(|V|²) for graphs represented by a weight matrix and an array
    implementation of the priority queue;
  o O(|E| log |V|) for graphs represented by adjacency lists and a min-heap
    implementation of the priority queue (see the sketch below).
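
The following Python sketch (an illustration, not from the original notes) is the
min-heap implementation of Dijkstra's algorithm; the edge weights of the example
graph are assumptions chosen to be consistent with the path lengths listed in
Eg. 2 below, since the original graph figure is not reproduced.

    import heapq

    def dijkstra(graph, source):
        # graph: {vertex: [(neighbor, weight), ...]}; all weights must be non-negative.
        dist = {source: 0}
        heap = [(0, source)]
        while heap:
            d, u = heapq.heappop(heap)
            if d > dist.get(u, float("inf")):
                continue                        # stale entry: a shorter path to u is already known
            for v, w in graph.get(u, []):
                if d + w < dist.get(v, float("inf")):
                    dist[v] = d + w             # relax edge (u, v)
                    heapq.heappush(heap, (dist[v], v))
        return dist

    graph = {                                   # hypothetical weights consistent with Eg. 2 below
        "a": [("b", 3), ("c", 8)],
        "b": [("c", 4), ("d", 2)],
        "d": [("e", 4)],
    }
    print(dijkstra(graph, "a"))                 # {'a': 0, 'b': 3, 'c': 7, 'd': 5, 'e': 9}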
Eg. 2: (The graph figure is omitted.)
The shortest paths from a and their lengths are:
From a to b: a – b of length 3
From a to d: a – b – d of length 5
From a to c: a – b – c of length 7
From a to e: a – b – d – e of length 9