Algorithm Design
STRUCTURE AND ALGORITHM DESIGN TECHNIQUES
In this module we will discuss:
• Hierarchical relationships and hierarchies
• Root node and child node
• Applications of the Tree data structure (e.g. an organization chart)
• Tree representation
A binary tree is one in which each node has a maximum of two children.
Thus a node can have one child, two children, or none at all.
A BST (binary search tree) is a collection of nodes arranged so that they maintain the BST properties.
BST properties:
• Every node in the left subtree has a key whose value is less than its parent node's key value.
• Every node in the right subtree has a key whose value is greater than its parent node's key value.
class TreeNode {
    int data;              // key stored at this node
    TreeNode left, right;  // left and right children (null if absent)
}
Search
Algorithm for searching an element in a BST (k is the key that is searched for, x is the start node):

Iterative-Tree-Search(x, k)
1. while x ≠ NIL and k ≠ key[x]
2.     do if k < key[x]
3.         then x ← left[x]
4.         else x ← right[x]
5. return x

In words: start at the root and repeat until you reach a terminal node. If the value at the node equals the search value, the element is found: return it. If the search value is less than the value at the node, move to the left descendant; else move to the right descendant.

Recursive-Tree-Search(x, k)
1. if x = NIL or k = key[x]
2.     then return x
3. if k < key[x]
4.     then return Recursive-Tree-Search(left[x], k)
5.     else return Recursive-Tree-Search(right[x], k)
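The search pseudocode above can be sketched in Java, reusing the shape of the TreeNode class shown earlier (class and method names here are illustrative, not from the original notes):

```java
// Iterative and recursive BST search, mirroring the pseudocode above.
class BstSearch {
    static class TreeNode {
        int data;
        TreeNode left, right;
        TreeNode(int data) { this.data = data; }
    }

    // Returns the node holding key k, or null if k is absent.
    static TreeNode iterativeSearch(TreeNode x, int k) {
        while (x != null && k != x.data) {
            x = (k < x.data) ? x.left : x.right;  // move to left or right descendant
        }
        return x;
    }

    static TreeNode recursiveSearch(TreeNode x, int k) {
        if (x == null || k == x.data) return x;   // found, or fell off the tree
        return (k < x.data) ? recursiveSearch(x.left, k)
                            : recursiveSearch(x.right, k);
    }
}
```

Both versions visit one node per level, so the running time is proportional to the height of the tree.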
Insertion
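The notes do not spell out the insertion algorithm, so here is a minimal sketch (assuming the same TreeNode shape as above): descend left or right exactly as in a search, then attach the new node at the null position where the search falls off.

```java
// Minimal BST insertion sketch (illustrative, not from the original notes).
class BstInsert {
    static class TreeNode {
        int data;
        TreeNode left, right;
        TreeNode(int data) { this.data = data; }
    }

    // Returns the (possibly new) subtree root after inserting key k.
    static TreeNode insert(TreeNode root, int k) {
        if (root == null) return new TreeNode(k);        // attach here
        if (k < root.data) root.left = insert(root.left, k);
        else if (k > root.data) root.right = insert(root.right, k);
        // k == root.data: duplicate keys are ignored in this sketch
        return root;
    }
}
```

Because insertion follows a single root-to-leaf path, it costs O(h) where h is the tree height.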
Preorder traversal:
The order of visit is: the current node, then the left subtree nodes, then
the right subtree nodes.
Thus the root is visited before the left and right subtrees.
Inorder traversal:
The order of visit is: the left subtree nodes, the current node,
then the right subtree nodes.
Thus the root is visited in-between the left and right subtrees.
Postorder traversal:
The order of visit is: The left subtree nodes, the right subtree
nodes, then the current node
Thus the root is present after the left and right subtree nodes
Traversal algorithms
Preorder Traversal:
• Visit and print the root node
• Recursively traverse left sub-tree(in pre-order)
• Recursively traverse right sub-tree(in pre-order)
Inorder Traversal:
• Recursively traverse left subtree(inorder)
• Visit and print the root node
• Recursively traverse right subtree(inorder)
Postorder Traversal:
• Recursively traverse left subtree (in post order)
• Recursively traverse right subtree (in post order)
• Visit and print the root node
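The three traversal algorithms above can be sketched in Java; instead of printing, this sketch appends each visited value to a list so the visit order is easy to check (names are illustrative):

```java
// Preorder, inorder, and postorder traversal of a binary tree.
import java.util.List;

class Traversals {
    static class TreeNode {
        int data;
        TreeNode left, right;
        TreeNode(int data) { this.data = data; }
    }

    static void preorder(TreeNode n, List<Integer> out) {
        if (n == null) return;
        out.add(n.data);            // visit root first
        preorder(n.left, out);
        preorder(n.right, out);
    }

    static void inorder(TreeNode n, List<Integer> out) {
        if (n == null) return;
        inorder(n.left, out);
        out.add(n.data);            // visit root in-between
        inorder(n.right, out);
    }

    static void postorder(TreeNode n, List<Integer> out) {
        if (n == null) return;
        postorder(n.left, out);
        postorder(n.right, out);
        out.add(n.data);            // visit root last
    }
}
```

On a BST, the inorder traversal emits the keys in sorted order.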
Directed Graphs.
• A graph whose edges are directed (i.e. have a direction). An arrow
from u to v is drawn only if (u,v) is in the Edge set.
• In a directed graph the order of the vertices in the pairs in the edge
set matters.
• The Edge set = {(A,B), (B,C), (D,C), (B,D), (D,B), (E,D), (B,E)}
Graph variations
Cyclic graph: a graph that contains cycles (closed regions).
Acyclic graph: a graph that contains no cycles.
Graph Representations

Adjacency List:
Cons:
• Queries such as whether there is an edge from vertex u to vertex v are not efficient; they can take O(V) time.

Adjacency Matrix:
An adjacency matrix is a 2D array of size V x V, where V is the number of vertices in the graph.
Cons:
• Consumes a huge amount of memory for storing big graphs.
Example:
Let the 2D array be graph[5][5]. A slot graph[i][j] = 1 indicates that there is an edge from vertex i to vertex j.
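Both representations can be sketched for the directed graph given earlier, relabelling vertices A..E as 0..4 for array indexing (the helper names are illustrative):

```java
// Adjacency matrix and adjacency list for the same directed graph.
import java.util.List;

class GraphRep {
    // Matrix form: m[i][j] == 1 means there is an edge from vertex i to j.
    static int[][] buildMatrix(int v, int[][] edges) {
        int[][] m = new int[v][v];
        for (int[] e : edges) m[e[0]][e[1]] = 1;
        return m;
    }

    // List form: adj.get(i) holds the out-neighbors of vertex i.
    static List<List<Integer>> buildList(int v, int[][] edges) {
        List<List<Integer>> adj = new java.util.ArrayList<>();
        for (int i = 0; i < v; i++) adj.add(new java.util.ArrayList<>());
        for (int[] e : edges) adj.get(e[0]).add(e[1]);
        return adj;
    }
}
```

With the matrix, the edge query graph[u][v] is O(1) but storage is O(V²); with the list, storage is O(V + E) but an edge query may scan up to O(V) neighbors.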
Greedy Algorithm
Analogy: a child repeatedly eats his favorite candy among the remaining candies, until
he's satisfied or until his mother tells him to stop!
In a greedy algorithm, the locally optimal (greedy) choice is made first, before solving
further sub-problems. In Dynamic Programming, by contrast, sub-problems are solved first.
Greedy algorithms are simpler and more efficient compared with other solutions to
optimization problems.
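As a concrete sketch of the greedy idea (this example is illustrative, not from the original notes), consider making change with the coin system {25, 10, 5, 1}: repeatedly taking the largest coin that still fits is the locally optimal choice, and for this particular coin system it also happens to give the overall optimum.

```java
// Greedy coin change: always take the largest coin that still fits.
// Optimal for canonical coin systems like {25, 10, 5, 1}; not optimal
// for arbitrary coin systems.
import java.util.List;

class GreedyChange {
    // coins must be sorted in descending order.
    static List<Integer> makeChange(int amount, int[] coins) {
        List<Integer> used = new java.util.ArrayList<>();
        for (int c : coins) {
            while (amount >= c) {   // greedy choice: biggest coin first
                used.add(c);
                amount -= c;
            }
        }
        return used;
    }
}
```

Note the greedy pattern: each choice is made once and never revisited, unlike dynamic programming, which examines sub-problem results before committing.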
Examples for Greedy algorithm
(Diagram omitted: project pipeline — completion, assembly, deployed.)
Examples of Divide and Conquer algorithm
Binary search
Merge sort
Quick sort
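Binary search, the first example above, is a compact illustration of divide and conquer: each step discards half of the remaining range (a minimal sketch; the class name is illustrative):

```java
// Divide and conquer: binary search over a sorted array.
class BinarySearch {
    // Returns the index of key in a, or -1 if key is absent.
    static int search(int[] a, int key) {
        int lo = 0, hi = a.length - 1;
        while (lo <= hi) {
            int mid = lo + (hi - lo) / 2;   // midpoint without int overflow
            if (a[mid] == key) return mid;
            if (a[mid] < key) lo = mid + 1; // discard left half
            else hi = mid - 1;              // discard right half
        }
        return -1;
    }
}
```

Halving the range each step gives O(log n) comparisons; merge sort and quick sort apply the same divide-and-conquer pattern to sorting.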
Dynamic Programming
Like the Greedy approach, Dynamic Programming is used to solve optimization problems.
Like Divide and Conquer, Dynamic Programming is an algorithmic paradigm that solves
a given complex problem by breaking it into similar sub-problems, and it stores the
results of sub-problems to avoid computing the same results again.
Before solving the sub-problem at hand, a dynamic programming algorithm examines the
results of previously solved sub-problems.
The solutions of sub-problems are combined in order to achieve the best solution.
Dynamic Programming
Main concept:
• The problem can be divided into smaller overlapping sub-problems.
• Solve each smaller instance once.
• Record the solutions in an array/table (memoization).
• Extract the solution to the initial instance from that table.
Dynamic programming is comparatively slow, but it can solve many problems that greedy
algorithms cannot, and it yields an optimal solution.
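The main concept above can be sketched with the classic memoized Fibonacci computation (an illustrative example, not from the original notes): overlapping sub-problems fib(n-1) and fib(n-2) are each solved once, recorded in a table, and reused.

```java
// Top-down dynamic programming: solve each sub-problem once and
// record its answer (memoization) so it is never recomputed.
import java.util.HashMap;
import java.util.Map;

class FibMemo {
    static final Map<Integer, Long> memo = new HashMap<>();

    static long fib(int n) {
        if (n <= 1) return n;                  // base cases
        Long cached = memo.get(n);             // examine previously solved sub-problems
        if (cached != null) return cached;
        long result = fib(n - 1) + fib(n - 2); // combine sub-problem solutions
        memo.put(n, result);                   // record in the table
        return result;
    }
}
```

Without the table the recursion takes exponential time; with memoization each value 0..n is computed once, so the whole computation is O(n).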