Chapter Two - Divide & Conquer
Introduction
Divide and conquer is a fundamental algorithm design paradigm that
breaks a problem into smaller subproblems, solves them recursively,
and combines their solutions. The master theorem provides a way
to analyze the time complexity of divide-and-conquer algorithms.
Divide and Conquer Steps:
•Divide: Break the problem into smaller subproblems of the same
type.
•Conquer: Solve each subproblem recursively. If the subproblems
are small enough, solve them directly.
•Combine: Merge the solutions of the subproblems to form the final
solution.
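As a concrete illustration of these three steps, here is a minimal Python sketch of the divide-and-conquer pattern, using summation of a list as a deliberately simple placeholder problem (the helper name dc_sum and the choice of problem are illustrative assumptions, not taken from the original slides):

# A minimal divide-and-conquer template, illustrated on list summation.
def dc_sum(arr, low, high):
    if low == high:                     # Conquer directly: one element
        return arr[low]
    mid = (low + high) // 2             # Divide: split the range in half
    left = dc_sum(arr, low, mid)        # Conquer: solve the left half recursively
    right = dc_sum(arr, mid + 1, high)  # Conquer: solve the right half recursively
    return left + right                 # Combine: merge the two partial results

# Example: dc_sum([38, 27, 43, 3, 9, 82, 10], 0, 6) returns 212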
Divide and Conquer Example on Sorting
Divide-and-conquer approach to sort the list (38, 27, 43, 3, 9, 82, 10) in increasing order. Upper half: splitting into sublists; middle: a one-element list is trivially sorted; lower half: composing sorted sublists.
• Specific computer algorithms based on the Divide & Conquer approach include:
• Maximum and Minimum Problem
• Binary Search
• Sorting (merge sort and quick sort)
Master Method for Analysis
•The master method applies to recurrences of the form
• T(n) = a T(n/b) + f(n),
•where a ≥ 1, b > 1, and f is asymptotically positive.
•T(n): Time complexity of the problem.
•a: Number of subproblems.
•b: Factor by which the problem size is divided.
•f(n): Cost of dividing the problem and combining the results.
It helps you determine the time complexity of divide-and-conquer
algorithms quickly without requiring the substitution or recursion tree
method.
Cases of Master Theorem:
Let p = log_b(a). Compare f(n) with n^p.
Case 1: If f(n) ∈ O(n^(p − ε)) for some ε > 0,
then T(n) ∈ Θ(n^p).
•Non-recursive work grows slower than n^p.
•The recursive work dominates.
Case 2: If f(n) ∈ Θ(n^p log^k n) for some k ≥ 0,
then T(n) ∈ Θ(n^p log^(k+1) n).
•Non-recursive and recursive work grow at the same rate.
•Both contribute equally.
Case 3: If f(n) ∈ Ω(n^(p + ε)) for some ε > 0, and
the regularity condition a·f(n/b) ≤ c·f(n) holds for some constant c < 1,
then T(n) ∈ Θ(f(n)).
•Non-recursive work grows faster than n^p.
•The non-recursive work dominates.
Steps to Apply Master Theorem:
1. Identify a, b, and f(n) from the recurrence.
2. Compute p = log_b(a).
3. Compare f(n) with n^p to determine which case applies.
4. Write the time complexity based on the applicable case.
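To make these steps concrete, the following Python sketch classifies a recurrence of the form T(n) = a·T(n/b) + Θ(n^d), the common special case where the non-recursive cost is a simple polynomial; the function name master_case and the restriction to polynomial f(n) are assumptions made for illustration only.

import math

# Classify T(n) = a*T(n/b) + Theta(n^d) using the master theorem.
def master_case(a, b, d):
    p = math.log(a, b)                       # Step 2: p = log_b(a)
    if d < p:                                # Step 3: compare n^d with n^p
        return f"Case 1: T(n) = Theta(n^{p:g})"
    elif d == p:
        return f"Case 2: T(n) = Theta(n^{d:g} * log n)"
    else:
        return f"Case 3: T(n) = Theta(n^{d:g})"

# Examples:
print(master_case(2, 2, 1))   # merge sort: Case 2, Theta(n log n)
print(master_case(1, 2, 0))   # binary search: Case 2, Theta(log n)
print(master_case(4, 2, 1))   # Case 1, Theta(n^2)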
Example: Merge Sort
T(n) = 2T(n/2) + O(n)
a = 2: two recursive calls.
b = 2: each subproblem is half the size of the original.
f(n) = O(n): the merging step.
• Step 1: Compute p = log_b(a) = log_2(2) = 1.
• Step 2: Compare f(n) = O(n) with n^p = n^1 = n. Here f(n) = Θ(n^p), which means f(n) matches n^p.
• Step 3: Identify the case. From the Master Theorem, if f(n) = Θ(n^p), then T(n) = Θ(n^p log n).
• Thus, merge sort falls under Case 2 of the Master Theorem, and T(n) = Θ(n log n).
Intuition: at each level of the recursion tree the total merging work is proportional to n, and the number of levels is about log n (because we divide the array in half at each step); combining these, the overall work is O(n log n).
Example 2: Binary Search
Maximum and Minimum Problem
• The minimum and maximum problem involves finding the smallest and largest elements in an array. A divide-and-conquer approach can solve this problem using fewer comparisons than the naive approach, even though both are O(n) overall.
• Let’s understand how to solve this problem step by step and analyze its time complexity using a recurrence relation.
Divide-and-Conquer Algorithm for Min and Max
•Divide:
•Split the array into two halves.
•Conquer:
•Recursively find the minimum and maximum in each
half.
•Combine:
•Compare the two minima and two maxima from the
subproblems to determine the overall minimum and
maximum.
# Pseudocode
function find_min_max(arr, low, high):
    if low == high:                    # Base case: only one element
        return (arr[low], arr[low])    # Both min and max are the same
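The pseudocode above shows only the base case; a complete, runnable Python sketch of the same idea might look like the following (it adds a two-element base case so that a pair costs exactly one comparison, which matches the analysis that follows; everything beyond the base case is an assumption, since the rest of the original pseudocode is not shown):

# Divide-and-conquer min and max, returning (minimum, maximum).
def find_min_max(arr, low, high):
    if low == high:                        # One element: it is both min and max
        return (arr[low], arr[low])
    if high == low + 1:                    # Two elements: one comparison decides both
        if arr[low] < arr[high]:
            return (arr[low], arr[high])
        return (arr[high], arr[low])
    mid = (low + high) // 2                              # Divide
    min1, max1 = find_min_max(arr, low, mid)             # Conquer left half
    min2, max2 = find_min_max(arr, mid + 1, high)        # Conquer right half
    return (min(min1, min2), max(max1, max2))            # Combine: two comparisons

# Example: find_min_max([38, 27, 43, 3, 9, 82, 10], 0, 6) returns (3, 82)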
Cont’d…
• Analysis:
• Method 1: if we apply the general (naive) approach to an array of size n, the number of comparisons required is 2n − 2.
• Method 2: in the divide-and-conquer approach, we divide the problem into subproblems and find the max and min of each group; the max of each group is then compared only with the max of the other group, and the min with the min.
Cont’d…
• Let n = the number of items in the array.
• Let T(n) = time required to apply the algorithm on an array of size n. Here we divide the problem into two halves of size n/2, so each half costs T(n/2).
• The added constant accounts for comparing the minimum with the minimum and the maximum with the maximum, as in the example above.
• T(n) = 2 T(n/2) + 2   → Eq (i)   (Max - Min Problem)
• T(2) = 1, the time required to compare two elements/items. (Time is measured in units of the number of comparisons.)
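Solving this recurrence (a sketch, assuming n is a power of 2) shows why the divide-and-conquer method uses fewer comparisons than the naive 2n − 2:

T(n) = 2T(n/2) + 2
     = 4T(n/4) + 4 + 2
     = 8T(n/8) + 8 + 4 + 2
     ...
     = 2^k T(n/2^k) + (2 + 4 + ... + 2^k)

Stopping when n/2^k = 2 (i.e., 2^k = n/2), and using T(2) = 1:

T(n) = (n/2)·1 + (n − 2) = 3n/2 − 2

So the divide-and-conquer method needs 3n/2 − 2 comparisons, versus 2n − 2 for the naive method.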
Binary Search Using Divide and Conquer
• 1. In the Binary Search technique, we search for an element in a sorted array by repeatedly dividing the search interval in half.
• 2. Firstly, we take the whole array as the interval.
• 3. If the search key (the item to be searched) is less than the item in the middle of the interval, we discard the second half of the list and recursively repeat the process on the first half by calculating the new middle and last elements.
• 4. If the search key (the item to be searched) is greater than the item in the middle of the interval, we discard the first half of the list and work recursively on the second half by calculating the new beginning and middle elements.
• 5. Repeat until the value is found or the interval is empty.
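A minimal recursive Python sketch of this procedure (the function name binary_search and the convention of returning -1 when the key is absent are illustrative assumptions):

# Recursive binary search on a sorted array; returns the index of key or -1.
def binary_search(arr, beg, end, key):
    if beg > end:                       # Interval is empty: key not present
        return -1
    mid = (beg + end) // 2              # Middle of the current interval
    if arr[mid] == key:                 # Found at the middle
        return mid
    elif key < arr[mid]:                # Search the first half
        return binary_search(arr, beg, mid - 1, key)
    else:                               # Search the second half
        return binary_search(arr, mid + 1, end, key)

# Example: binary_search([3, 9, 10, 27, 38, 43, 82], 0, 6, 27) returns 3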
Example: Binary Search
Example: Binary Search cont’d…
• Best Case Complexity - In Binary Search, the best case occurs when the element to be searched is found in the first comparison, i.e., when the first middle element itself is the element to be searched. The best-case time complexity of Binary Search is O(1).
• Average Case Complexity - The average-case time complexity of Binary Search is O(log n).
• Worst Case Complexity - In Binary Search, the worst case occurs when we have to keep reducing the search space until it has only one element. The worst-case time complexity of Binary Search is O(log n).
Analysis:
• Input: an array A of size n, already sorted in ascending or descending order.
• Output: the analysis of searching for an item in the sorted array of size n.
• Logic: let T(n) = number of comparisons needed to search for an item among n elements in a sorted array.
• Set BEG = 1 and END = n.
• Find mid = (BEG + END) / 2 (integer division).
• Compare the search item with the mid item.
Time Complexity
Cont’d…
• To calculate the time complexity of binary search, we have to add:
• Divide part: O(1)
• Conquer part: solving a subproblem of half the size, costing T(n/2)
• Combine part: O(1)
• T(n) = O(1) + T(n/2) + O(1)
• T(n) = T(n/2) + c
• Here a = 1, b = 2, f(n) = O(1), and n^(log_b a) = n^(log_2 1) = n^0 = 1.
• f(n) = Θ(1) matches n^0, so according to the Master Theorem (Case 2) the time complexity is Θ(n^0 · log n) = O(log n).
Merge Sort
• Merge sort is a popular choice for sorting large datasets because it is
relatively efficient and easy to implement.
• It is often used in conjunction with other algorithms, such as
quicksort, to improve the overall performance of a sorting routine.
• Think of it as a recursive algorithm that continuously splits the array in half until it cannot be divided further.
• If the array has multiple elements, split the array into halves and
recursively invoke the merge sort on each of the halves.
• Finally, when both halves are sorted, the merge operation is applied.
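A compact Python sketch of merge sort along the lines described above (a minimal illustration, not the exact code from the original slides):

# Merge sort: split, sort each half recursively, then merge the sorted halves.
def merge_sort(arr):
    if len(arr) <= 1:                      # A list of 0 or 1 elements is already sorted
        return arr
    mid = len(arr) // 2                    # Divide
    left = merge_sort(arr[:mid])           # Conquer left half
    right = merge_sort(arr[mid:])          # Conquer right half
    return merge(left, right)              # Combine

def merge(left, right):
    result, i, j = [], 0, 0
    while i < len(left) and j < len(right):   # Repeatedly take the smaller front element
        if left[i] <= right[j]:
            result.append(left[i])
            i += 1
        else:
            result.append(right[j])
            j += 1
    return result + left[i:] + right[j:]      # Append whatever remains

# Example: merge_sort([38, 27, 43, 3, 9, 82, 10]) returns [3, 9, 10, 27, 38, 43, 82]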
Cont’d…
• To know the functioning of merge sort lets consider an array arr[] =
{38, 27, 43, 3, 9, 82, 10}
• Rules
• At first, check if the left index of the array is less than the right index; if yes, then calculate its midpoint.
Cont’d…
• Now, as we already know, merge sort first divides the whole array recursively into halves until single-element (atomic) values are reached.
• Here, we see that an array of 7 items is divided into two arrays of size 4 and 3 respectively.
Cont’d…
• Now, again check whether the left index is less than the right index for both arrays; if so, calculate the midpoints of both arrays.
Cont’d…
• Now, further divide these two arrays into halves, until the atomic units of the array are reached and further division is not possible.
Cont’d…
• After dividing the array into its smallest units, start merging the elements again, based on comparisons between elements.
• Firstly, compare the elements of each pair of lists and then combine them into another list in sorted order.
Cont’d…
• After the final merging, the sorted list is: 3, 9, 10, 27, 38, 43, 82.
Time Complexity
• Recurrence relation:
T(n) = 2T(n/2) + O(n)
• Variables:
a=2
b=2
f(n) = O(n)
• Comparison:
n^(log_b a) = n^(log_2 2) = n^1, and f(n) = O(n)
• Here we see that the cost of f(n) and of the subproblems grow at the same rate, so this is Case 2:
T(n) = O(n log n)
QuickSort
• QuickSort is a sorting algorithm based on the Divide and
Conquer algorithm that picks an element as a pivot and
partitions the given array around the picked pivot by placing
the pivot in its correct position in the sorted array.
• The key process in quickSort is the partition() procedure.
• The target of partition() is to place the pivot (any element can be chosen to be the pivot) at its correct position in the sorted array, and to put all smaller elements to the left of the pivot and all greater elements to the right of it.
Choice of Pivot:
• There are many different choices for picking pivots.
• Always pick the first element as a pivot.
• Always pick the last element as a pivot (implemented below)
• Pick a random element as a pivot.
• Pick the middle element as the pivot.
Pseudo Code for Quick Sort:
/* low --> Starting index, high --> Ending index */
quickSort(arr[], low, high) {
    if (low < high) {
        pi = partition(arr, low, high);
        quickSort(arr, low, pi - 1);   // Before pi
        quickSort(arr, pi + 1, high);  // After pi
    }
}
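The pseudocode above relies on partition(); below is a minimal Python sketch of one common scheme (the Lomuto partition, which always uses the last element as the pivot — an assumption matching the pivot choice mentioned earlier, not necessarily the exact scheme used in the original slides):

# Lomuto partition: place the last element (pivot) in its final position and
# return that position; smaller elements end up on its left, greater on its right.
def partition(arr, low, high):
    pivot = arr[high]
    i = low - 1                          # Boundary of the "smaller than pivot" region
    for j in range(low, high):
        if arr[j] < pivot:               # Grow the smaller-than-pivot region
            i += 1
            arr[i], arr[j] = arr[j], arr[i]
    arr[i + 1], arr[high] = arr[high], arr[i + 1]   # Put the pivot in its correct place
    return i + 1

def quick_sort(arr, low, high):
    if low < high:
        pi = partition(arr, low, high)
        quick_sort(arr, low, pi - 1)     # Sort the left part
        quick_sort(arr, pi + 1, high)    # Sort the right part

# Example: data = [6, 3, 7, 2, 4, 5]; quick_sort(data, 0, 5) leaves data as [2, 3, 4, 5, 6, 7]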
Quick Sort
Cont’d…
• Consider an example given below, wherein
• P is the pivot element.
• L is the left pointer.
• R is the right pointer.
• The elements are 6, 3, 7, 2, 4, 5.
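The original figures for this walkthrough are not reproduced here; the following is a sketch of how one common pointer-based (Hoare-style) partition could proceed on this data, assuming the first element 6 is the pivot, L scans right for an element greater than the pivot, and R scans left for an element smaller than the pivot:

• Start: [6, 3, 7, 2, 4, 5], P = 6, L just after the pivot, R at the last element.
• L stops at 7 (> 6), R stops at 5 (< 6); L is still left of R, so swap them: [6, 3, 5, 2, 4, 7].
• L moves right past 5, 2, 4 and stops at 7; R stops at 4; the pointers have crossed, so swap the pivot with the element at R: [4, 3, 5, 2, 6, 7].
• The pivot 6 is now in its fixed position, with the smaller elements {4, 3, 5, 2} on its left and the greater element {7} on its right.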
Cont’d…
• Now,
• The pivot is in its fixed position.
• All the left elements are less than the pivot.
• The right elements are greater than the pivot.
• Now, divide the array into two subarrays: the left part and the right part.
• Take the left partition and apply quick sort to it.
Cont’d…
• Now,
• The pivot is in its fixed position.
• All the left elements are less than the pivot and sorted.
• The right elements are greater than the pivot and are in sorted order.
• The final sorted list, obtained by combining the two subarrays, is 2, 3, 4, 5, 6, 7.