
Name: Ria Talsania

UID: 2023300242

Experiment No. 5

Date of Submission: 26/2/25

AIM: Experiment on Graphs: To understand Graphs and implement Prim’s and Kruskal’s
Algorithm to find Minimum Spanning Trees.

Program 1

Objectives: 1. To learn Prim’s and Kruskal’s algorithms.

2. To find MSTs using both algorithms.

3. To find the best, worst, and average cases of each algorithm, and the time and space complexity of each.

Theory: Prim’s Algorithm:

Prim’s algorithm builds the Minimum Spanning Tree (MST) by starting from any
vertex and greedily adding the smallest edge that connects a new vertex to the
growing MST.

Kruskal’s Algorithm:

Kruskal’s algorithm sorts all edges by weight and greedily picks the smallest edge
that does not form a cycle, using the Union-Find data structure to manage
connected components.
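
The Kruskal program later in this report follows parent pointers without compressing them. As a reference, here is a minimal, self-contained sketch of Union-Find with path compression and union by rank, the optimisations referred to in the observations. The identifiers find, unite, rank_ and the small driver are assumptions made for this sketch, not part of the submitted program.

#include <stdio.h>

#define MAXV 100   /* illustrative cap on the number of vertices */

static int parent[MAXV], rank_[MAXV];

/* Return the representative of v's component, compressing the path as we go. */
static int find(int v) {
    if (parent[v] != v)
        parent[v] = find(parent[v]);
    return parent[v];
}

/* Merge the components of a and b, attaching the shorter tree under the
   taller one (union by rank). Returns 1 if a merge actually happened. */
static int unite(int a, int b) {
    int ra = find(a), rb = find(b);
    if (ra == rb) return 0;                 /* already connected */
    if (rank_[ra] < rank_[rb]) { int t = ra; ra = rb; rb = t; }
    parent[rb] = ra;
    if (rank_[ra] == rank_[rb]) rank_[ra]++;
    return 1;
}

int main(void) {
    /* Tiny driver: 4 vertices, merge {0,1} and {2,3}, then join the two sets. */
    for (int v = 0; v < 4; v++) { parent[v] = v; rank_[v] = 0; }
    unite(0, 1);
    unite(2, 3);
    printf("0 and 3 connected? %s\n", find(0) == find(3) ? "yes" : "no");
    unite(1, 2);
    printf("0 and 3 connected? %s\n", find(0) == find(3) ? "yes" : "no");
    return 0;
}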
Prim’s Time and Space Complexity:
Time: O(V²) with an adjacency matrix and a linear scan for the minimum edge, or O(E log V) with an adjacency list and a binary heap. Space: O(V²) when the graph is stored as an adjacency matrix.

Kruskal’s Time and Space Complexity:
Time: O(E log E) (equivalently O(E log V)), dominated by sorting the edges. Space: O(E) for the edge list plus O(V) for the Union-Find parent array.
Program: Code for Prim’s Algorithm:

#include <stdio.h>
#include <stdlib.h>
#include <limits.h>

#define MAX 100

/* Print the adjacency matrix of the graph. */
void print(int size, int matrix[size][size]) {
    for (int i = 0; i < size; i++) {
        for (int j = 0; j < size; j++) {
            printf("%d ", matrix[i][j]);
        }
        printf("\n");
    }
    printf("\n");
}

/* Build the MST with Prim's algorithm and return it as an adjacency matrix.
   Assumes the input graph is connected. */
int** prims(int V, int graph[V][V]) {
    int visited[V];
    for (int i = 0; i < V; i++) visited[i] = 0;

    int prev = 0, next = 0;
    int totalWeight = 0;

    /* Allocate a V x V matrix (initialised to 0) to hold the chosen edges. */
    int** span = (int**)malloc(V * sizeof(int*));
    for (int i = 0; i < V; i++) {
        span[i] = (int*)calloc(V, sizeof(int));
    }

    visited[0] = 1;              /* start the tree from vertex 0 */
    printf("Chosen edges:\n");

    /* An MST on V vertices has exactly V - 1 edges. */
    for (int i = 0; i < V - 1; i++) {
        int minWeight = INT_MAX;

        /* Find the cheapest edge from a visited vertex to an unvisited one. */
        for (int j = 0; j < V; j++) {
            if (visited[j]) {
                for (int k = 0; k < V; k++) {
                    if (graph[j][k] && !visited[k] && graph[j][k] < minWeight) {
                        minWeight = graph[j][k];
                        prev = j;
                        next = k;
                    }
                }
            }
        }

        span[prev][next] = minWeight;
        span[next][prev] = minWeight;
        visited[next] = 1;
        totalWeight += minWeight;
        printf("%d-%d : weight(%d)\n", prev, next, minWeight);
    }

    printf("Total weight of spanning tree: %d\n", totalWeight);
    return span;
}

int main() {
    int V, E;
    printf("Enter number of vertices and edges: ");
    scanf("%d %d", &V, &E);

    int graph[V][V];
    for (int i = 0; i < V; i++) {
        for (int j = 0; j < V; j++) {
            graph[i][j] = 0;
        }
    }

    printf("Enter edges (source destination weight):\n");
    for (int i = 0; i < E; i++) {
        int src, dest, weight;
        scanf("%d %d %d", &src, &dest, &weight);
        graph[src][dest] = weight;   /* undirected graph: store both directions */
        graph[dest][src] = weight;
    }

    printf("Original Graph:\n");
    print(V, graph);

    int** mst = prims(V, graph);

    /* Release the MST matrix allocated in prims(). */
    for (int i = 0; i < V; i++) free(mst[i]);
    free(mst);

    return 0;
}

Example 1

Output:
Dry Run:
Observations:
Prim’s Algorithm:

● Best Case:
  ○ Occurs when the graph is dense (many edges) and a priority queue backed by a Fibonacci heap is used.
  ○ Time Complexity: O(E + V log V).
● Worst Case:
  ○ Happens when the graph is sparse (few edges) and the algorithm is implemented with an adjacency matrix instead of a priority queue.
  ○ Every edge lookup takes O(V), leading to inefficient edge selection.
  ○ Time Complexity: O(V²) (with adjacency matrix).
● Average Case:
  ○ When the graph is neither too dense nor too sparse, the priority queue performs well, balancing edge selection and heap updates (see the heap-based sketch after this list).
  ○ Time Complexity: O(E log V) (using a binary heap).
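
For reference, the following is a minimal, self-contained sketch of the heap-based variant mentioned above, written independently of the submitted program. It uses a lazy-deletion binary min-heap over edges; the hard-coded 5-vertex sample graph, the Item struct, and the push/pop helpers are assumptions made for this illustration only.

#include <stdio.h>
#include <stdlib.h>

#define V 5   /* number of vertices in the sample graph */

/* One heap entry: an edge of the given weight reaching vertex 'to'. */
typedef struct { int weight, to; } Item;

static Item heap[V * V];
static int heapSize = 0;

/* Push an item and sift it up to restore the min-heap property. */
static void push(Item it) {
    int i = heapSize++;
    heap[i] = it;
    while (i > 0 && heap[(i - 1) / 2].weight > heap[i].weight) {
        Item t = heap[i]; heap[i] = heap[(i - 1) / 2]; heap[(i - 1) / 2] = t;
        i = (i - 1) / 2;
    }
}

/* Pop the minimum-weight item and sift the last element down. */
static Item pop(void) {
    Item top = heap[0];
    heap[0] = heap[--heapSize];
    int i = 0;
    for (;;) {
        int l = 2 * i + 1, r = 2 * i + 2, s = i;
        if (l < heapSize && heap[l].weight < heap[s].weight) s = l;
        if (r < heapSize && heap[r].weight < heap[s].weight) s = r;
        if (s == i) break;
        Item t = heap[i]; heap[i] = heap[s]; heap[s] = t;
        i = s;
    }
    return top;
}

int main(void) {
    /* Sample undirected graph as an adjacency matrix (0 = no edge). */
    int g[V][V] = {
        {0, 2, 0, 6, 0},
        {2, 0, 3, 8, 5},
        {0, 3, 0, 0, 7},
        {6, 8, 0, 0, 9},
        {0, 5, 7, 9, 0},
    };
    int visited[V] = {0};
    int total = 0, inTree = 0;

    /* Start from vertex 0: push all its edges, then repeatedly take the
       cheapest edge reaching an unvisited vertex (lazy deletion). */
    visited[0] = 1; inTree = 1;
    for (int k = 0; k < V; k++)
        if (g[0][k]) push((Item){g[0][k], k});

    while (inTree < V && heapSize > 0) {
        Item e = pop();
        if (visited[e.to]) continue;      /* stale entry, skip it */
        visited[e.to] = 1;
        inTree++;
        total += e.weight;
        for (int k = 0; k < V; k++)
            if (g[e.to][k] && !visited[k]) push((Item){g[e.to][k], k});
    }

    printf("Total MST weight: %d\n", total);
    return 0;
}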

Program 2

Problem: Kruskal’s

Program: Code for Kruskal’s Algorithm:

#include <stdio.h>
#include <stdlib.h>

#define MAX 100

int parent[MAX];

/* Follow parent pointers until the root (representative) of v's component is found. */
int findParent(int v) {
    while (parent[v] != v)
        v = parent[v];
    return v;
}

/* Partition edges[low..high] around the weight edges[low][2]; edges heavier
   than the pivot end up on the right. Returns the pivot's final index. */
int partition_array(int edges[MAX][3], int low, int high) {
    int pivot = edges[low][2];
    int i = high + 1;
    for (int j = high; j > low; j--) {
        if (edges[j][2] > pivot) {
            i--;
            int temp0 = edges[i][0], temp1 = edges[i][1], temp2 = edges[i][2];
            edges[i][0] = edges[j][0];
            edges[i][1] = edges[j][1];
            edges[i][2] = edges[j][2];
            edges[j][0] = temp0;
            edges[j][1] = temp1;
            edges[j][2] = temp2;
        }
    }

    /* Place the pivot edge just before the heavier block. */
    int temp0 = edges[i - 1][0], temp1 = edges[i - 1][1], temp2 = edges[i - 1][2];
    edges[i - 1][0] = edges[low][0];
    edges[i - 1][1] = edges[low][1];
    edges[i - 1][2] = edges[low][2];
    edges[low][0] = temp0;
    edges[low][1] = temp1;
    edges[low][2] = temp2;

    return i - 1;
}

/* Recursively quicksort the edge list by weight (ascending). */
void recursive_quick_sort(int edges[MAX][3], int low, int high) {
    if (low >= high) return;
    int pivot = partition_array(edges, low, high);
    recursive_quick_sort(edges, low, pivot - 1);
    recursive_quick_sort(edges, pivot + 1, high);
}

/* Build the MST: sort edges by weight, then add each edge whose endpoints
   lie in different components (checked with Union-Find). */
void kruskals(int edges[MAX][3], int V, int E) {
    int result[MAX][3], e = 0, i = 0, totalCost = 0;

    recursive_quick_sort(edges, 0, E - 1);

    /* Initially every vertex is its own component. */
    for (int v = 0; v < V; v++) {
        parent[v] = v;
    }

    /* Keep taking the cheapest remaining edge until V - 1 edges are chosen. */
    while (e < V - 1 && i < E) {
        int *edge = edges[i++];
        int x = findParent(edge[0]);
        int y = findParent(edge[1]);

        if (x != y) {               /* edge joins two different components */
            result[e][0] = edge[0];
            result[e][1] = edge[1];
            result[e][2] = edge[2];
            totalCost += edge[2];
            e++;
            parent[x] = y;          /* merge the components */
        }
    }

    printf("Minimum Spanning Tree:\n");
    for (i = 0; i < e; i++) {
        printf("%d -- %d == %d\n", result[i][0], result[i][1], result[i][2]);
    }
    printf("Total cost of Minimum Spanning Tree: %d\n", totalCost);
}

int main() {
int V, E, edges[MAX][3];
printf("Enter number of vertices and edges: ");
scanf("%d %d", &V, &E);

printf("Enter edges (source destination weight):\n");


for (int i = 0; i < E; i++) {
scanf("%d %d %d", &edges[i][0], &edges[i][1], &edges[i][2]);
}

kruskals(edges, V, E);

return 0;
}
Example 1

Output:

Dry Run:
Observations:
Kruskal’s Algorithm:

● Best Case:
  ○ Occurs when edges are already sorted by weight, so the algorithm processes them with little additional sorting overhead.
  ○ Time Complexity: O(E log V) (due to Union-Find operations).
● Worst Case:
  ○ Happens when edges are in random order, requiring O(E log E) sorting (a qsort-based sketch of this sorting step follows this list).
  ○ If Union-Find without path compression is used, the merge operations can be slow.
  ○ Time Complexity: O(E log E) (sorting dominates).
● Average Case:
  ○ When edges are randomly distributed, the sorting step and Union-Find operations balance out.
  ○ Time Complexity: O(E log E) (sorting + Union-Find).
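
As a point of comparison for the sorting step, the hand-written quicksort above could also be swapped for the standard library's qsort with a comparator on the weight column. The fragment below is a minimal sketch of that alternative; the comparator name compareByWeight and the tiny sample edge list are illustrative assumptions, not part of the submitted program.

#include <stdio.h>
#include <stdlib.h>

/* Compare two edges (each an int[3]: source, destination, weight) by weight. */
static int compareByWeight(const void *a, const void *b) {
    const int *ea = (const int *)a;
    const int *eb = (const int *)b;
    return (ea[2] > eb[2]) - (ea[2] < eb[2]);
}

int main(void) {
    /* Small sample edge list: {source, destination, weight}. */
    int edges[4][3] = {
        {0, 1, 4},
        {1, 2, 1},
        {0, 2, 3},
        {2, 3, 2},
    };
    int E = 4;

    /* Sort the E edges in place; each element is one row of 3 ints. */
    qsort(edges, E, sizeof(edges[0]), compareByWeight);

    for (int i = 0; i < E; i++)
        printf("%d -- %d == %d\n", edges[i][0], edges[i][1], edges[i][2]);
    return 0;
}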

Conclusion:

Both Prim’s and Kruskal’s algorithms efficiently construct a Minimum Spanning Tree (MST) while making locally optimal choices at each step.

From Prim’s algorithm, I learned that by expanding the MST from a starting node and always
selecting the smallest edge connected to the tree, we can efficiently build the MST, especially in
dense graphs.

From Kruskal’s algorithm, I observed that sorting edges by weight and merging components
using the Union-Find technique allows us to construct the MST efficiently, making it ideal for
sparse graphs.

These applications reinforced how the greedy approach ensures an optimal solution in different
scenarios by prioritizing the smallest possible edge at each step, leading to an efficient MST
construction.
