
Design and Analysis of Algorithm

COMP 222

Lecture 5+6

Dr. Huma Ayub


[email protected]

10/16/2024
Design and Optimization Example

Maximum Sum Subsequence


Maximum Sum Subsequence

• Consider an array A[1..n] of both positive and negative integers. The
goal is to find the subsequence of A with the maximum sum.

Example
• A[1..8] = {2, -4, 1, 9, -6, 7, -5, 3}. The subsequence with maximum sum
is {2, 1, 9, 7, 3} and the sum is 22.
Maximum Sum Subsequence

• (1) Brute-force solution


• The brute-force way to solve this problem is to compute all possible
subsequences and determine the one that has the maximum sum.
For n elements, there are 2^n possible subsequences.
• The brute-force solution can be implemented using recursion given as
follows.
• Let MSS(1..n) denote the maximum sum of a subsequence of A[1..n].
Maximum Sum Subsequence

• (1) Brute-force solution
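A minimal Python sketch of this recursion (illustrative names of my own choosing, not the slide's original code):

# Brute-force recursion: mss_brute(A, i) = maximum sum of any subsequence of the first i elements.
def mss_brute(A, i):
    if i == 0:
        return 0                                   # empty prefix: empty subsequence, sum 0
    exclude = mss_brute(A, i - 1)                  # left path: leave A[i-1] out
    include = mss_brute(A, i - 1) + A[i - 1]       # right path: take A[i-1]
    return max(include, exclude)                   # two recursive calls per element => O(2^n)

A = [2, -4, 1, 9, -6, 7, -5, 3]
print(mss_brute(A, len(A)))                        # 22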



Maximum Sum Subsequence
• The recursion tree is shown below. Each call path from the root to a
leaf gives one of the 2^n possible subsequences. Let MSS(1,n) denote
the maximum sum subsequence of A[1..n].
• Time complexity
• The recursive solution takes O(2^n) time


Optimization : Greedy approach
• To apply the greedy strategy we first need to check whether the problem exhibits
(i) optimal substructure property and
(ii) greedy choice property.
• Optimal substructure: The optimal solution to a problem contains
within itself optimal solutions to its subproblems.
• Greedy choice property: Locally optimal choices lead to a globally
optimal solution.
Optimization : Greedy approach
• To apply the greedy strategy we first need to check whether the problem exhibits
(i) optimal substructure property and
(ii) greedy choice property.
• Optimal substructure: The recursive formulation above reveals the
optimal substructure. The problem of finding the maximum sum
subsequence of A[1..n] contains within itself the subproblem of
finding the maximum sum subsequence of A[1..n-1].
• Greedy choice property: To check whether a greedy solution exists,
we need to determine if the greedy choice property is satisfied, i.e., do we
have any way to decide which of the two subproblems needs to be solved
at each recursive step?
Optimization : Greedy approach
• Greedy choice property: To check whether a greedy solution exists, we need to determine if
the greedy choice property is satisfied, i.e., do we have any way to decide which
of the two subproblems needs to be solved at each recursive step?
• Yes. We can make a clear choice by checking whether the element A[i] is positive. If A[i] >
0, it helps improve the sum, so we take the right path, which includes A[i]. Otherwise, we
take the left path, which excludes A[i]. The recurrence equation for the greedy
version reduces to
• T(n) = T(n-1) + 1 = O(n)
• We initially start with sum = 0 and scan through the elements. If an element is > 0, we
add the element to the sum. After the scan is complete we return the sum. Checking whether an
element is > 0 takes O(1) time while the number of iterations is n.
• There is one corner case to be dealt with. If all the elements happen to be negative, as
per our strategy no element will be picked. In this case the answer should be the maximum
element.
Optimization : Greedy approach
• The greedy solution can be implemented in an iterative fashion as
follows.
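A minimal Python sketch of the iterative greedy solution (illustrative names, not the slide's original code):

def mss_greedy(A):
    total = 0
    for x in A:
        if x > 0:              # greedy choice: keep only elements that increase the sum
            total += x
    if total == 0:             # corner case: no positive element was picked
        return max(A)          # answer is the largest (least negative) element
    return total

print(mss_greedy([2, -4, 1, 9, -6, 7, -5, 3]))   # 22
print(mss_greedy([-3, -1, -7]))                  # -1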
Optimization : Divide-and-Conquer Solution
This problem has a divide-and-conquer solution too. The basic idea is to split the array into
roughly equal halves recursively until only one element remains. While merging, pick the
biggest of the three quantities: MSS[1..n/2], MSS[n/2+1..n], and MSS[1..n/2] + MSS[n/2+1..n].
For the given example,
Optimization : Divide-and-Conquer Solution
Algorithm
• The divide-and-conquer algorithm for maximum subsequence sum is
given as follows.
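A minimal Python sketch of this divide-and-conquer algorithm (illustrative names, not the slide's original code):

def mss_dc(A, lo, hi):
    if lo == hi:
        return A[lo]                              # one element left: it is its own MSS
    mid = (lo + hi) // 2
    left = mss_dc(A, lo, mid)                     # MSS of the left half
    right = mss_dc(A, mid + 1, hi)                # MSS of the right half
    return max(left, right, left + right)         # the biggest of the three quantities

A = [2, -4, 1, 9, -6, 7, -5, 3]
print(mss_dc(A, 0, len(A) - 1))                   # 22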

Time complexity
The depth of the recursion tree is log n. The divide phase requires O(1) operations. During the conquer phase, at most
n/2 comparisons are done per level. Hence, the time complexity is O(n log n).
Multiplying large integers

• Let A[n-1…0] and B[n-1…0] be two n-digit integers


• Let m = n/2
• Let W = A[n-1…m]   X = A[m-1…0]
• And Y = B[n-1…m]   Z = B[m-1…0]
          W           X        (A)
          Y           Z        (B)
   --------------------------------
   W·Y    W·Z + X·Y   X·Z      (Product)
Multiplying large integers

• Let W, X, Y, Z be n/2-digit numbers; then


• A = W·10^m + X
• B = Y·10^m + Z

and their product is

• A·B = (W·Y)·10^2m + (W·Z + X·Y)·10^m + X·Z

• Example: A = 1234   W = 12   X = 34


• B = 5678   Y = 56   Z = 78
Multiplying large integers

• Example: A = 1234   W = 12   X = 34


• B = 5678   Y = 56   Z = 78

• A·B = (W·Y)·10^2m + (W·Z + X·Y)·10^m + X·Z


• 1234 × 5678 = (12×56)·10^(2×2) + (12×78 + 34×56)·10^2 + 34×78
•             = 6,720,000 + 284,000 + 2,652 = 7,006,652
ANALYSIS
• A·B = (W·Y)·10^2m + (W·Z + X·Y)·10^m + X·Z
• The middle term W·Z + X·Y can be computed as (W+X)(Y+Z) - W·Y - X·Z, so only three
half-size multiplications are needed: T(n) = 3T(n/2) + cn
• By the iterative method, T(n) = O(3^(log₂ n)) = O(n^(log₂ 3)) ≈ O(n^1.59)
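A minimal Python sketch of the three-multiplication scheme using the identity above (illustrative names, not the slide's original code):

def multiply(a, b):
    if a < 10 or b < 10:                          # base case: a single-digit operand
        return a * b
    m = max(len(str(a)), len(str(b))) // 2
    w, x = divmod(a, 10 ** m)                     # a = w*10^m + x
    y, z = divmod(b, 10 ** m)                     # b = y*10^m + z
    wy = multiply(w, y)
    xz = multiply(x, z)
    middle = multiply(w + x, y + z) - wy - xz     # equals w*z + x*y, using one multiplication
    return wy * 10 ** (2 * m) + middle * 10 ** m + xz

print(multiply(1234, 5678))                       # 7006652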
Logarithm Rule
• b^(log_a n) = n, if b = a
• Else
• b^(log_a n) = n^(log_a b)
ITERATIVE Method
Logarithms
– properties of logarithms:
log_b(xy) = log_b x + log_b y
log_b(x/y) = log_b x - log_b y
log_b(x^a) = a·log_b x
log_b a = log_x a / log_x b
b^(log_a n) = n, if b = a
Else

b^(log_a n) = n^(log_a b)
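A quick numeric check of the last identity (the values a, b, n are chosen only for illustration):

import math

a, b, n = 2, 3, 1024
print(b ** math.log(n, a))    # b^(log_a n) ≈ 59049.0
print(n ** math.log(b, a))    # n^(log_a b) ≈ 59049.0  (both equal 3^10)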
Divide and Conquer

Sorting Algorithms
Sorting algorithms
• Selection and bubble sort have quadratic
best/average/worst-case performance
• Insertion sort has quadratic average-case
and worst-case performance
• Is there a faster comparison-based algorithm?
Yes: O(n log n)

• Mergesort and Quicksort


Merge Sort (Divide and Conquer)
Example
• Partition into lists of size n/2

[10, 4, 6, 3, 8, 2, 5, 7]

[10, 4, 6, 3] [8, 2, 5, 7]

[10, 4] [6, 3] [8, 2] [5, 7]

[4] [10] [3][6] [2][8] [5][7]


Example Cont’d
• Merge

[2, 3, 4, 5, 6, 7, 8, 10 ]

[3, 4, 6, 10] [2, 5, 7, 8]

[4, 10] [3, 6] [2, 8] [5, 7]

[4] [10] [3][6] [2][8] [5][7]
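A minimal Python sketch of merge sort (illustrative names, not the slide's original code):

def merge_sort(A):
    if len(A) <= 1:
        return A                                  # a list of 0 or 1 elements is already sorted
    mid = len(A) // 2
    left, right = merge_sort(A[:mid]), merge_sort(A[mid:])
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):       # merge the two sorted halves
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    return merged + left[i:] + right[j:]          # append whatever remains

print(merge_sort([10, 4, 6, 3, 8, 2, 5, 7]))      # [2, 3, 4, 5, 6, 7, 8, 10]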


Analysis of Merge Sort
• Time Complexity:
• Best Case: O(n log n), when the array is already sorted or nearly
sorted.
• Average Case: O(n log n), when the array is randomly ordered.
• Worst Case: O(n log n), when the array is sorted in reverse order.
• Auxiliary Space: O(n), additional space is required for the temporary array
used during merging.
Analysis of Merge Sort
• Θ(n lg n) grows more slowly than Θ(n²)
• Therefore, merge sort asymptotically beats insertion sort in the worst
case.

• In practice, merge sort beats insertion sort once n is sufficiently large (roughly n ≥ 30)


Quick Sort
{88, 52, 14, 31, 25, 98, 30, 62, 23, 79}

Divide and Conquer


Quick Sort
Partition the set into two using a
randomly chosen pivot

{88, 52, 14, 31, 25, 98, 30, 62, 23, 79}

{14, 30, 31, 25, 23}  ≤  52  ≤  {88, 98, 62, 79}
Quick Sort
{14, 30, 31, 25, 23}  ≤  52  ≤  {88, 98, 62, 79}

Sort the first half.              Sort the second half.

14, 23, 25, 30, 31                62, 79, 88, 98
Quick Sort
14, 23, 25, 30, 31      52      62, 79, 88, 98

Glue the pieces together.

14, 23, 25, 30, 31, 52, 62, 79, 88, 98
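A minimal Python sketch of quick sort with a randomly chosen pivot (illustrative names, not the slide's original code):

import random

def quick_sort(A):
    if len(A) <= 1:
        return A
    pivot = random.choice(A)                      # randomly chosen pivot
    less = [x for x in A if x < pivot]
    equal = [x for x in A if x == pivot]
    greater = [x for x in A if x > pivot]
    return quick_sort(less) + equal + quick_sort(greater)   # glue the pieces together

print(quick_sort([88, 52, 14, 31, 25, 98, 30, 62, 23, 79]))
# [14, 23, 25, 30, 31, 52, 62, 79, 88, 98]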
DIVIDE AND CONQUER : QUICK SORT
Analysis of Quick Sort
Best Case
• The pivot splits the array into two roughly equal halves: T(n) = 2T(n/2) + cn = O(n log n)
Worst Case
• The pivot is always the smallest or largest element, leaving one subproblem of size n-1

• T(n) = T(n-1) + cn
• Unrolling k times: T(n) = T(n-k) + c(n + (n-1) + … + (n-k+1))
• Stop when n - k = 1, i.e. k = n - 1:
  T(n) = T(1) + c(n + (n-1) + … + 2) = T(1) + c(n(n+1)/2 - 1)
• Hence T(n) = O(n²)
