Lecture 9

The document covers advanced algorithm design topics, focusing on hash tables and binary search trees (BST). It discusses various probing techniques for hash tables, the properties and operations of BSTs, and introduces dynamic programming as a method for solving optimization problems. Key concepts include search, insertion, deletion in BSTs, and the principles of dynamic programming, including the longest common subsequence problem.

CMPE 130

Advanced Algorithm Design

Lecture 9
Today
• Hash Table Continued
• Binary Search Tree
Linear Probing (f(i) = i)

Linear Probing Example

Quadratic Probing

Quadratic Probing Example

Motivation To Use Hash Table
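The probing slides above are figures, so here is a minimal open-addressing sketch (an assumption, not the lecture's exact code) showing how linear probing f(i) = i and quadratic probing f(i) = i² generate their probe sequences; the `probe_insert` helper and the division-method hash are hypothetical.

```python
def probe_insert(table, key, f):
    """Insert key into an open-addressing table using probe-offset function f.
    Probes slots (h(key) + f(i)) mod m for i = 0, 1, 2, ... and returns the
    index where the key was placed."""
    m = len(table)
    h = key % m                      # division-method hash (an assumption)
    for i in range(m):
        idx = (h + f(i)) % m
        if table[idx] is None:       # empty slot found
            table[idx] = key
            return idx
    raise RuntimeError("no empty slot found along the probe sequence")

linear = lambda i: i                 # linear probing: f(i) = i
quadratic = lambda i: i * i          # quadratic probing: f(i) = i^2
```

For example, with m = 10 the keys 18, 28, 38 all hash to slot 8; linear probing places them at 8, 9, 0, while quadratic probing places the third at slot 2 because its second offset is 4 rather than 2.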
Chapter 11

Binary Search Tree


Motivation for binary search trees

• Let's compare linked lists and arrays:

• (Sorted) linked lists

HEAD → 1 → 2 → 3 → 4 → 5 → 7 → 8

• (Sorted) arrays

1 2 3 4 5 7 8
Sorted linked lists: HEAD → 1 → 2 → 3 → 4 → 5 → 7 → 8

• O(1) insert/delete (assuming we have a pointer to the location of the insert/delete), e.g. splicing 6 in between 5 and 7:

HEAD → 1 → 2 → 3 → 4 → 5 → 6 → 7 → 8

• O(n) search/select: must walk the list from HEAD.
Sorted arrays: 1 2 3 4 5 7 8

• O(n) insert/delete, e.g. inserting 4.5 forces shifting the larger elements:

1 2 3 4 4.5 5 7 8

• O(log(n)) search, O(1) select:

Search: binary search to see if 3 is in A.
Select: third smallest is A[3].
The best of both worlds

                 Sorted Arrays   Linked Lists   Binary Search Trees*
Search           O(log(n))       O(n)           O(log(n))
Insert/Delete    O(n)            O(1)           O(log(n))

* assuming the tree is kept balanced, so its height stays O(log(n)).
Review
Binary Search Tree

The binary-search-tree property allows us to print keys in a binary search tree in order, recursively, using an algorithm called an inorder tree walk. Elements are printed in monotonically increasing order.
Binary Search Trees
• Every LEFT descendant of a node has a key ≤ that node's key.
• Every RIGHT descendant of a node has a key ≥ that node's key.

[Figure: two example trees rooted at 5; one satisfies the binary-search-tree property, the other does NOT.]
Inorder Walk

How INORDER-TREE-WALK works:
• Check to make sure that 𝑥 is not NIL.
• Recursively print the keys of the nodes in 𝑥's left subtree.
• Print 𝑥's key.
• Recursively print the keys of the nodes in 𝑥's right subtree.

Time: Intuitively, the walk takes Θ(𝑛) time.
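The walk just described can be sketched in Python (the `Node` class is a hypothetical helper; keys are collected in a list rather than printed, so the order is easy to inspect):

```python
class Node:
    def __init__(self, key, left=None, right=None):
        self.key, self.left, self.right = key, left, right

def inorder_tree_walk(x, out=None):
    """Append the keys of the subtree rooted at x in increasing order."""
    if out is None:
        out = []
    if x is not None:                  # "check that x is not NIL"
        inorder_tree_walk(x.left, out)
        out.append(x.key)              # visit x between its two subtrees
        inorder_tree_walk(x.right, out)
    return out
```

On the example tree rooted at 5 from the earlier slide, the walk yields [1, 2, 3, 4, 5, 7, 8].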
BST Search

Searching

Example
Search for values 𝐸, 𝐿, 𝐹, and 𝐻 in the example tree.

Time
The algorithm recurses, visiting nodes on a downward path from the root. Thus, the running time is O(ℎ), where ℎ is the height of the tree.
BST Search

Iterative version
The iterative version unrolls the recursion into a while loop. It is usually more efficient than the recursive version, since it avoids the overhead of recursive calls.
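Both versions can be sketched as follows (the `Node` class is a hypothetical helper); each follows a single downward path from the root, so both run in O(ℎ) time:

```python
class Node:
    def __init__(self, key, left=None, right=None):
        self.key, self.left, self.right = key, left, right

def tree_search(x, k):
    """Recursive TREE-SEARCH: compare k with x.key and recurse into one side."""
    if x is None or k == x.key:
        return x
    return tree_search(x.left, k) if k < x.key else tree_search(x.right, k)

def iterative_tree_search(x, k):
    """The same search with the recursion unrolled into a while loop."""
    while x is not None and k != x.key:
        x = x.left if k < x.key else x.right
    return x
```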
Maximum and Minimum

The binary-search-tree property guarantees that
• the minimum key of a binary search tree is located at the leftmost node, and
• the maximum key of a binary search tree is located at the rightmost node.

Traverse the appropriate pointers (left or right) until NIL is reached. In the following procedures, the parameter 𝑥 is the root of a subtree. The first call has 𝑥 = 𝑇.𝑟𝑜𝑜𝑡.

Time: Both procedures visit nodes that form a downward path from the root to a leaf, so both run in O(ℎ) time, where ℎ is the height of the tree.
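Both procedures are one-line loops in Python (again with a hypothetical `Node` helper):

```python
class Node:
    def __init__(self, key, left=None, right=None):
        self.key, self.left, self.right = key, left, right

def tree_minimum(x):
    """Follow left pointers until NIL: the leftmost node holds the minimum."""
    while x.left is not None:
        x = x.left
    return x

def tree_maximum(x):
    """Follow right pointers until NIL: the rightmost node holds the maximum."""
    while x.right is not None:
        x = x.right
    return x
```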
Successor and Predecessor

TREE-PREDECESSOR is symmetric to TREE-SUCCESSOR.

Example
Find the successor of 15, 6, and 4.
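TREE-SUCCESSOR can be sketched as follows, assuming nodes carry a parent pointer (the `Node` class is a hypothetical helper): if 𝑥 has a right subtree, the successor is its minimum; otherwise we climb until we arrive from a left child.

```python
class Node:
    def __init__(self, key):
        self.key = key
        self.left = self.right = self.parent = None

def tree_minimum(x):
    while x.left is not None:
        x = x.left
    return x

def tree_successor(x):
    """Return the node with the next-larger key, or None if x is the maximum."""
    if x.right is not None:
        return tree_minimum(x.right)       # leftmost node of the right subtree
    y = x.parent
    while y is not None and x is y.right:  # climb until we come from a left child
        x, y = y, y.parent
    return y
```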
BST Insert
BST Deletion

BST Deletion, Transplant

TRANSPLANT(𝑇, 𝑢, 𝑣) replaces the subtree rooted at 𝑢 by the subtree rooted at 𝑣:
• Makes 𝑢's parent become 𝑣's parent (unless 𝑢 is the root, in which case it makes 𝑣 the root).
• 𝑢's parent gets 𝑣 as either its left or right child, depending on whether 𝑢 was a left or right child.
• Doesn't update 𝑣.𝑙𝑒𝑓𝑡 or 𝑣.𝑟𝑖𝑔ℎ𝑡, leaving that up to TRANSPLANT's caller.
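The three bullets above translate directly into code (the `Node` and `Tree` classes are hypothetical helpers):

```python
class Node:
    def __init__(self, key):
        self.key = key
        self.left = self.right = self.parent = None

class Tree:
    def __init__(self):
        self.root = None

def transplant(T, u, v):
    """Replace the subtree rooted at u by the subtree rooted at v."""
    if u.parent is None:
        T.root = v                 # u was the root, so v becomes the root
    elif u is u.parent.left:
        u.parent.left = v          # u was a left child
    else:
        u.parent.right = v         # u was a right child
    if v is not None:
        v.parent = u.parent
    # v.left and v.right are deliberately NOT updated: that is the caller's job.
```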
BST Deletion

TREE-DELETE(𝑇, 𝑧): deleting node 𝑧 from a binary search tree (cases 1 and 2 from the previous slide).
• If 𝑧 has no left child, replace 𝑧 by its right child. The right child may or may not be NIL. (Case 2)
• If 𝑧's right child is NIL, then this case handles the situation in which 𝑧 has no children. (Case 1)
• If 𝑧 has just one child, and that child is its left child, then replace 𝑧 by its left child. (Case 2)
BST Deletion

Otherwise, 𝑧 has two children. Find 𝑧's successor 𝑦; 𝑦 must lie in 𝑧's right subtree and have no left child (why?). The goal is to replace 𝑧 by 𝑦, splicing 𝑦 out of its current location.
• If 𝑦 is 𝑧's right child, replace 𝑧 by 𝑦 and leave 𝑦's right child alone.
• Otherwise, 𝑦 lies within 𝑧's right subtree but is not the root of this subtree. Replace 𝑦 by its own right child. Then replace 𝑧 by 𝑦.
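All of the cases above fit in one short procedure built on TRANSPLANT and TREE-MINIMUM (a sketch, with hypothetical `Node` and `Tree` helpers):

```python
class Node:
    def __init__(self, key):
        self.key = key
        self.left = self.right = self.parent = None

class Tree:
    def __init__(self):
        self.root = None

def transplant(T, u, v):
    if u.parent is None:
        T.root = v
    elif u is u.parent.left:
        u.parent.left = v
    else:
        u.parent.right = v
    if v is not None:
        v.parent = u.parent

def tree_minimum(x):
    while x.left is not None:
        x = x.left
    return x

def tree_delete(T, z):
    if z.left is None:
        transplant(T, z, z.right)      # no left child (cases 1 and 2)
    elif z.right is None:
        transplant(T, z, z.left)       # only a left child (case 2)
    else:
        y = tree_minimum(z.right)      # z's successor; it has no left child
        if y.parent is not z:
            transplant(T, y, y.right)  # splice y out of its old location
            y.right = z.right
            y.right.parent = y
        transplant(T, z, y)            # replace z by y
        y.left = z.left
        y.left.parent = y
```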
BST Deletion Example
BST Deletion

Note that the last three lines execute when 𝑧 has two children, regardless of whether 𝑦 is 𝑧's right child.
BST Deletion

On this binary search tree 𝑇, run the following 4 examples:
TREE-DELETE(𝑇, 𝐼)
TREE-DELETE(𝑇, 𝐺)
TREE-DELETE(𝑇, 𝐾)
TREE-DELETE(𝑇, 𝐵)

Time
𝑂(ℎ) on a tree of height ℎ. Everything is 𝑂(1) except for the call to TREE-MINIMUM.
Chapter 15

Dynamic Programming
Three Principles to Design Algorithms

Overview
• Dynamic programming is a design method, just like divide-and-conquer, that solves problems by combining the solutions to subproblems.
• D&C algorithms partition the problem into disjoint subproblems, solve the subproblems recursively, and combine their solutions.
• Dynamic programming applies when subproblems overlap
• Subproblems share subsubproblems
Overview
• A DP algorithm solves each subsubproblem just once and then saves its answer in a table, avoiding recomputing the answer every time it solves each subproblem.
• Typically DP is applied to optimization problems, which can have many possible solutions.
• Each solution has a value, and we wish to find an optimal solution (minimum or maximum).
• The solution is "an" optimal solution instead of "the" optimal solution, since there may be several solutions that achieve the same optimal value.
DP vs. Greedy Algorithms
• Dynamic programming (DP) and greedy algorithms (GA) both apply to optimization problems.
• Both break problems into subproblems.
• DP reuses the optimal solutions to the subproblems and combines them into the final solution.
• Can transform exponential-time algorithms into polynomial-time algorithms.
• GA makes each choice in a locally optimal manner and eventually arrives at a solution to the whole problem.
• Can arrive at an optimal solution faster than a DP approach.
Dynamic Programming
• Strategy:
• Solve the subproblems and store the results.
• Large problems are decomposed into subproblems, but solutions to subproblems are looked up (from a table).
• Four-step method:
• Characterize the structure of an optimal solution.
• Recursively define the value of an optimal solution.
• Compute the value of an optimal solution, typically in a bottom-up fashion.
• Construct an optimal solution from computed information.
• Can transform exponential-time algorithms into polynomial-time algorithms.
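As a toy illustration of the "store results in a table" strategy (Fibonacci numbers, an example not taken from the slides): memoizing each subproblem turns an exponential recursion into a linear one.

```python
def fib_memo(n, memo=None):
    """Fibonacci with a memo table: each subproblem n is solved exactly once."""
    if memo is None:
        memo = {}
    if n not in memo:     # table lookup first; compute only on a miss
        memo[n] = n if n < 2 else fib_memo(n - 1, memo) + fib_memo(n - 2, memo)
    return memo[n]
```

Without the table, `fib(50)` would take on the order of 2^50 calls; with it, 51 subproblems suffice.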
Example: Longest Common Subsequence
• A subsequence of a string S is a sequence of characters that appear in left-to-right order, but not necessarily consecutively.
• Example: S = ACTTGCG
• ACT, ATTC, T, ACTTGC are all subsequences
• TTA is not a subsequence
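The definition can be checked mechanically; `is_subsequence` below is a hypothetical helper that scans S left to right.

```python
def is_subsequence(t, s):
    """True if t appears in s in left-to-right order, not necessarily consecutively."""
    it = iter(s)
    # Each `c in it` advances the shared iterator past the match, which forces
    # the matches to occur in left-to-right order.
    return all(c in it for c in t)
```

On the slide's example, S = ACTTGCG: ACT, ATTC, and ACTTGC all pass, while TTA fails because no A remains after the two Ts.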
Example: DNA Sequencing
LCS
• Given two sequences x[1..m] and y[1..n], find a longest subsequence common to both.
• There might be several of them.
• x: A B C B D A B
• y: B D C A B A

• LCS(x, y) = BDAB or BCAB or BCBA
Optimal Substructure

Recursive Algorithm for LCS

Overlapping Subproblems
Memoization
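A memoized version of the recursive LCS formulation can be sketched as follows (`lru_cache` plays the role of the memo table, so each of the O(mn) subproblems is solved only once):

```python
from functools import lru_cache

def lcs_length(x, y):
    """Length of a longest common subsequence of x and y, memoized."""
    @lru_cache(maxsize=None)
    def c(i, j):                       # LCS length of prefixes x[:i] and y[:j]
        if i == 0 or j == 0:
            return 0                   # an empty prefix has an empty LCS
        if x[i - 1] == y[j - 1]:
            return c(i - 1, j - 1) + 1 # matching last characters extend the LCS
        return max(c(i - 1, j), c(i, j - 1))
    return c(len(x), len(y))
```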
Dynamic Programming

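The bottom-up dynamic-programming version fills the same table directly, row by row (step 3 of the four-step method); this is a sketch, since the slides' own pseudocode is not included here.

```python
def lcs_table(x, y):
    """Fill the DP table c, where c[i][j] = LCS length of x[:i] and y[:j]."""
    m, n = len(x), len(y)
    c = [[0] * (n + 1) for _ in range(m + 1)]   # row 0 / column 0 stay 0
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            if x[i - 1] == y[j - 1]:
                c[i][j] = c[i - 1][j - 1] + 1   # extend the diagonal match
            else:
                c[i][j] = max(c[i - 1][j], c[i][j - 1])
    return c
```

The answer is c[m][n]; for x = ABCBDAB and y = BDCABA, c[7][6] = 4. The two nested loops make the Θ(mn) running time explicit.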
Reconstruct LCS

Print LCS
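The Print-LCS slides are figures, so here is one possible reconstruction sketch: fill the table, then walk back from c[m][n], taking matched characters diagonally (the tie-breaking rule when moving up vs. left is an arbitrary choice, so other optimal subsequences are equally valid).

```python
def print_lcs(x, y):
    """Build the c-table, then walk back from c[m][n] to recover one LCS."""
    m, n = len(x), len(y)
    c = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            if x[i - 1] == y[j - 1]:
                c[i][j] = c[i - 1][j - 1] + 1
            else:
                c[i][j] = max(c[i - 1][j], c[i][j - 1])
    out, i, j = [], m, n
    while i > 0 and j > 0:
        if x[i - 1] == y[j - 1]:
            out.append(x[i - 1])          # a matched character is in the LCS
            i, j = i - 1, j - 1
        elif c[i - 1][j] >= c[i][j - 1]:
            i -= 1                        # follow the larger subproblem
        else:
            j -= 1
    return "".join(reversed(out))
```

For x = ABCBDAB and y = BDCABA this returns one of the length-4 answers from the earlier slide (which one depends on the tie-breaking rule).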
