Assignment 5
Search Algorithms
Example: A* Search Algorithm – Used in pathfinding and graph traversal, it finds the
shortest path from a start node to a goal node using heuristics.
Sorting Algorithms
Example: QuickSort – A divide-and-conquer algorithm that sorts a list by partitioning
it around a pivot element.
Machine Learning Algorithms
Example: Decision Tree – A supervised learning algorithm used for classification and
regression, where data is split based on feature values.
1. A* Search Algorithm
The A* (A-star) search algorithm is an informed search algorithm used for finding the shortest
path in a weighted graph or grid. It is widely used in pathfinding, navigation, and artificial
intelligence for games.
2. Step-by-step process:
1. Initialize: Start with the initial node and add it to an open list (a list of nodes to be
evaluated).
2. Calculate cost: Each node has a cost function: f(n) = g(n) + h(n)
o g(n) = cost from the start node to the current node.
o h(n) = heuristic estimate from the current node to the goal (e.g., straight-line
distance).
3. Expand the best node: Select the node with the lowest f(n) and expand its
neighbors.
4. Update costs: If a new path to a neighbor is shorter, update its cost and add it to the open
list.
5. Repeat until goal is reached: Continue expanding nodes until the goal node is found.
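To make the cost function from step 2 concrete, here is a small sketch computing f(n) with a Manhattan-distance heuristic. The grid positions, the goal at (4, 4), and a g-value of 3 are assumed example values, not part of any particular problem:

```python
def heuristic(pos, goal):
    # Manhattan distance: admissible on a 4-connected grid
    return abs(pos[0] - goal[0]) + abs(pos[1] - goal[1])

# Assumed example: current node (2, 1) was reached in 3 unit-cost steps
g = 3                          # g(n): cost from start to current node
h = heuristic((2, 1), (4, 4))  # h(n): estimated cost to goal = 2 + 3 = 5
f = g + h                      # f(n) = g(n) + h(n) = 8
print(f)  # 8
```

A* always expands the open-list node with the smallest such f value next.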
3. Real-world applications:
GPS navigation, robotics motion planning, and NPC pathfinding in video games.
✅ Strengths:
Guaranteed to find the shortest path when the heuristic is admissible (never overestimates);
typically expands far fewer nodes than uninformed search.
❌ Weaknesses:
Memory usage can grow quickly, since the open list may hold many nodes; performance
depends heavily on the quality of the heuristic.
2. QuickSort Algorithm
QuickSort is a divide-and-conquer sorting algorithm that sorts a list by repeatedly
partitioning it around a pivot element. It is one of the fastest general-purpose sorting
algorithms in practice.
2. Step-by-step process:
1. Choose a pivot: Select an element (often the first, last, or middle) as the pivot.
2. Partition the list: Move all elements smaller than the pivot to its left and all larger
elements to its right.
3. Recursively apply QuickSort: Sort the left and right sublists using the same process.
4. Combine results: No explicit merge is needed; once the left sublist, pivot, and right
sublist are each in order, the whole list is sorted.
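The four steps above can be sketched in Python as follows. This is a minimal, non-in-place version that uses the last element as the pivot; the function name and test list are illustrative:

```python
def quicksort(items):
    # Base case: a list of 0 or 1 elements is already sorted
    if len(items) <= 1:
        return items
    pivot = items[-1]                              # Step 1: choose a pivot
    left = [x for x in items[:-1] if x <= pivot]   # Step 2: partition smaller/equal
    right = [x for x in items[:-1] if x > pivot]   #         and larger elements
    # Step 3: recursively sort each side; Step 4: concatenation combines them
    return quicksort(left) + [pivot] + quicksort(right)

print(quicksort([7, 2, 9, 4, 1]))  # [1, 2, 4, 7, 9]
```

Production implementations usually partition in place to avoid the extra lists built here, but the recursive structure is the same.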
3. Real-world applications:
Standard library sort routines (often as part of hybrid sorts), database query processing,
and general in-memory sorting tasks.
✅ Strengths:
Average-case time complexity of O(n log n); can sort in place with low memory overhead;
very fast in practice.
❌ Weaknesses:
Worst-case time complexity of O(n²) with poor pivot choices (e.g., an already-sorted list
with a first-element pivot); the typical in-place version is not a stable sort.
3. Decision Tree Algorithm
A decision tree is a supervised learning algorithm for classification and regression that
splits the data into branches based on feature values.
2. Step-by-step process:
1. Start with the dataset: Identify the features (input variables) and the target (output).
2. Choose the best feature: Use criteria like Gini Impurity or Information Gain to find
the best feature to split the data.
3. Split the dataset: Divide the dataset into subsets based on the chosen feature.
4. Repeat recursively: Apply the same logic to each subset until all are classified.
5. Stop when conditions are met: If further splitting doesn’t improve accuracy, stop
growing the tree.
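To make step 2 concrete, here is a hedged sketch of how Gini Impurity can be scored for one candidate split. The feature values, labels, and threshold below are assumed example data, and real implementations would scan many thresholds across many features:

```python
from collections import Counter

def gini(labels):
    # Gini impurity: 1 minus the sum of squared class proportions
    n = len(labels)
    if n == 0:
        return 0.0
    return 1.0 - sum((count / n) ** 2 for count in Counter(labels).values())

def split_gini(values, labels, threshold):
    # Weighted Gini impurity after splitting on feature <= threshold
    left = [l for v, l in zip(values, labels) if v <= threshold]
    right = [l for v, l in zip(values, labels) if v > threshold]
    n = len(labels)
    return (len(left) / n) * gini(left) + (len(right) / n) * gini(right)

values = [1, 2, 3, 8, 9]   # example feature values
labels = [0, 0, 0, 1, 1]   # example class labels
print(split_gini(values, labels, 3))  # 0.0 -- this threshold separates the classes perfectly
```

The tree-building loop in step 2 would pick the feature/threshold pair with the lowest weighted impurity (or, equivalently, the highest Information Gain).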
3. Real-world applications:
Credit scoring, medical diagnosis support, and customer churn prediction.
✅ Strengths:
Easy to interpret and visualize; handles both numerical and categorical features; requires
little data preprocessing.
❌ Weaknesses:
Prone to overfitting if grown too deep; small changes in the training data can produce a
very different tree.
Python implementation of A* (the heuristic function, example grid, and driver code are
added so the example is self-contained and runnable):

import heapq

class Node:
    def __init__(self, position, parent=None, g=0, h=0):
        self.position = position  # (x, y) coordinates
        self.parent = parent      # Previous node
        self.g = g                # Cost from start to current node
        self.h = h                # Heuristic cost to goal
        self.f = g + h            # Total cost

    def __lt__(self, other):
        return self.f < other.f

def heuristic(pos, goal):
    # Manhattan distance heuristic
    return abs(pos[0] - goal[0]) + abs(pos[1] - goal[1])

def astar(grid, start, goal):
    open_list = [Node(start, None, 0, heuristic(start, goal))]
    closed_set = set()
    while open_list:
        current_node = heapq.heappop(open_list)  # Get node with lowest f(n)
        if current_node.position == goal:
            path = []
            while current_node:  # Walk parent links back to the start
                path.append(current_node.position)
                current_node = current_node.parent
            return path[::-1]  # Return reversed path
        if current_node.position in closed_set:
            continue  # Already expanded via a cheaper path
        closed_set.add(current_node.position)
        for dx, dy in [(-1, 0), (1, 0), (0, -1), (0, 1)]:  # Possible moves
            neighbor_pos = (current_node.position[0] + dx,
                            current_node.position[1] + dy)
            if (neighbor_pos in closed_set
                    or not (0 <= neighbor_pos[0] < len(grid)
                            and 0 <= neighbor_pos[1] < len(grid[0]))
                    or grid[neighbor_pos[0]][neighbor_pos[1]] == 1):
                continue  # Skip if visited, out of bounds, or an obstacle
            g_cost = current_node.g + 1
            h_cost = heuristic(neighbor_pos, goal)
            heapq.heappush(open_list, Node(neighbor_pos, current_node, g_cost, h_cost))
    return None  # Open list exhausted: no path exists

# Example grid: 0 = free cell, 1 = obstacle
grid = [[0, 0, 0],
        [1, 1, 0],
        [0, 0, 0]]
path = astar(grid, (0, 0), (2, 0))
if path:
    print("Shortest Path:", path)
else:
    print("No path found")