
lect2_Foundations

The document outlines the design and analysis of algorithms, focusing on the divide-and-conquer approach, particularly in sorting algorithms like merge sort. It explains the steps involved in merge sort, including dividing the problem, solving subproblems, and combining solutions, along with its time complexity of Θ(n log n). The document also compares merge sort with insertion sort, highlighting its efficiency for larger inputs.


Design and Analysis of Algorithms
Lecture 2
Dr. Metwally Rashad
2022
Table of Contents
1. Foundations
- The Role of Algorithms in Computing
- Design and Analysis algorithms
- Growth of Functions
2. Sorting
- Heapsort
- Quicksort
- Sorting in Linear Time
3. Graph Algorithms
- Elementary Graph Algorithms
- Minimum Spanning Trees
4. Advanced Design and Analysis Techniques
- Dynamic Programming
- Greedy Algorithms
5. Selected Topics
- Linear Programming
- String Matching
Ch1: Foundations
Designing Divide-and-Conquer Algorithms

 The Divide-and-Conquer Approach
- We’ll use divide and conquer to design a sorting algorithm.
- The divide-and-conquer paradigm involves three steps at each level of the recursion:
  o Divide the problem into a number of subproblems that are smaller instances of the same problem.
  o Conquer the subproblems by solving them recursively. If the subproblem sizes are small enough, however, just solve the subproblems in a straightforward manner.
  o Combine the solutions to the subproblems into the solution for the original problem.
Merge Sort Algorithm
 The merge sort algorithm follows the divide-and-conquer paradigm
  o Divide: Divide the 𝑛-element sequence to be sorted into two subsequences of 𝑛/2 elements each
  o Conquer: Sort the two subsequences recursively using merge sort
  o Combine: Merge the two sorted subsequences to produce the sorted answer
 In the worst, average, and best cases, the time complexity of merge sort is Θ(𝑛 log 𝑛)
[Figure: a big problem is split into subproblems; their sub-solutions are combined into the final solution]
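The three steps above can be sketched in Python. This is a minimal illustrative implementation, not the lecture's exact pseudocode; the function names are mine:

```python
# Minimal top-down merge sort, mirroring the Divide / Conquer / Combine steps.
def merge_sort(a):
    if len(a) <= 1:                 # small enough: already sorted
        return a
    mid = len(a) // 2               # Divide: split into two halves
    left = merge_sort(a[:mid])      # Conquer: sort each half recursively
    right = merge_sort(a[mid:])
    return merge(left, right)       # Combine: merge the sorted halves

def merge(left, right):
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:     # always take the smaller front element
            out.append(left[i]); i += 1
        else:
            out.append(right[j]); j += 1
    out.extend(left[i:])            # at most one of these is non-empty
    out.extend(right[j:])
    return out

print(merge_sort([8, 2, 9, 4, 5, 3, 1, 6]))   # [1, 2, 3, 4, 5, 6, 8, 9]
```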
Example of Merge Sort Algorithm
Example 1
  Divide:     8 2 9 4 5 3 1 6
  Divide:     8 2 9 4 | 5 3 1 6
  Divide:     8 2 | 9 4 | 5 3 | 1 6
  1 element:  8 | 2 | 9 | 4 | 5 | 3 | 1 | 6
  Sort:       2 8 | 4 9 | 3 5 | 1 6
  Merge:      2 4 8 9 | 1 3 5 6
  Merge:      1 2 3 4 5 6 8 9
Design Merge Sort Algorithm

Note that, for a subarray 𝐴[𝑝 … 𝑟] split at index 𝑞:
  𝑛1 = 𝑞 − 𝑝 + 1 is the length of 𝐴[𝑝 … 𝑞]
  𝑛2 = 𝑟 − 𝑞 is the length of 𝐴[𝑞 + 1 … 𝑟]
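The lengths 𝑛1 and 𝑛2 appear directly in a Python rendering of the CLRS-style MERGE procedure. The version below uses 0-based indices and infinity sentinels, so the copying loop never has to check whether a half is exhausted:

```python
import math

# Merges the sorted subarrays A[p..q] and A[q+1..r] in place,
# CLRS-style, using infinity sentinels (0-based indices here).
def merge(A, p, q, r):
    n1 = q - p + 1                   # length of A[p..q]
    n2 = r - q                       # length of A[q+1..r]
    L = A[p:q + 1] + [math.inf]      # left half plus sentinel
    R = A[q + 1:r + 1] + [math.inf]  # right half plus sentinel
    i = j = 0
    for k in range(p, r + 1):        # refill A[p..r], smaller head first
        if L[i] <= R[j]:
            A[k] = L[i]; i += 1
        else:
            A[k] = R[j]; j += 1

A = [2, 4, 8, 9, 1, 3, 5, 6]
merge(A, 0, 3, 7)
print(A)                             # [1, 2, 3, 4, 5, 6, 8, 9]
```

Each of the 𝑛 = 𝑛1 + 𝑛2 loop iterations does constant work, which is where the Θ(𝑛) cost of merging comes from.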
Merge Sort Example
Example 2
[Figures on the original slides trace a second merge sort example step by step]
Analyzing Divide-and-Conquer Algorithms
 When an algorithm contains a recursive call to itself, we can often
describe its running time by a recurrence equation or recurrence,
which describes the overall running time on a problem of size 𝑛 in terms
of the running time on smaller inputs.
 Let 𝑻(𝒏) be the running time on a problem of size 𝑛.
 If the problem size is small enough, say 𝒏 ≤ 𝒄 for some constant 𝒄, the
straightforward solution takes constant time, which we write as Θ(𝟏)
 Suppose that our division of the problem yields 𝒂 subproblems, each of
which is 𝟏/𝒃 the size of the original.
Analyzing Divide-and-Conquer Algorithms (cont.)
 It takes time 𝑇(𝑛/𝑏) to solve one subproblem of size 𝑛/𝑏, and so it takes
time 𝑎𝑇(𝑛/𝑏) to solve 𝑎 of them.
 If we take 𝑫(𝒏) time to divide the problem into subproblems and 𝑪(𝒏)
time to combine the solutions to the subproblems into the solution to the
original problem, we get the recurrence

    𝑇(𝑛) = Θ(1)                     if 𝑛 ≤ 𝑐
    𝑇(𝑛) = 𝑎𝑇(𝑛/𝑏) + 𝐷(𝑛) + 𝐶(𝑛)    otherwise
Analysis of Merge Sort Algorithm
 Each divide step then yields two subsequences of size exactly 𝑛/2.
 Divide: The divide step just computes the middle of the subarray, which takes
constant time. Thus, 𝑫(𝒏) = Θ(𝟏).
 Conquer: We recursively solve two subproblems, each of size 𝑛/2, which
contributes 2𝑇(𝑛/2) to the running time.
 Combine: We have already noted that the MERGE procedure on an 𝑛-element
subarray takes time Θ(𝒏), and so 𝑪(𝒏) = Θ(𝒏).
Recurrence for Merge Sort Algorithm
 With 𝐷(𝑛) = Θ(1) and 𝐶(𝑛) = Θ(𝑛), the running time of merge sort satisfies

    𝑇(𝑛) = Θ(1)              if 𝑛 = 1
    𝑇(𝑛) = 2𝑇(𝑛/2) + Θ(𝑛)    if 𝑛 > 1

 We shall usually omit stating the base case when 𝑇(𝑛) = Θ(1) for sufficiently small
𝑛, but only when it has no effect on the asymptotic solution to the recurrence.
 In the next lecture, we can use the master method to show that 𝑻(𝒏) is 𝚯(𝒏 𝐥𝐠 𝒏).
 For large enough inputs, merge sort, with its Θ(𝑛 lg 𝑛) running time, outperforms
insertion sort, whose running time is Θ(𝑛²) in the worst case.
 Let us rewrite the recurrence with a constant 𝑐, the time required to solve
problems of size 1:

    𝑇(𝑛) = 𝑐              if 𝑛 = 1
    𝑇(𝑛) = 2𝑇(𝑛/2) + 𝑐𝑛   if 𝑛 > 1
Recursion Tree
o Construct a recursion tree for the recurrence 𝑻(𝒏) = 𝟐𝑻(𝒏/𝟐) + 𝒄𝒏, where 𝑐 > 0
is a constant.
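The recursion tree has lg 𝑛 + 1 levels, each costing 𝑐𝑛, for a total of 𝑐𝑛(lg 𝑛 + 1). One quick numerical check of that prediction, unfolding the recurrence directly (a sketch assuming 𝑐 = 1 and 𝑛 a power of two):

```python
import math

# Unfold T(n) = 2*T(n/2) + c*n with T(1) = c, and compare with the
# closed form c*n*(lg n + 1) that the recursion tree predicts.
def T(n, c=1):
    if n == 1:
        return c
    return 2 * T(n // 2, c) + c * n

for n in [1, 2, 8, 64, 1024]:
    assert T(n) == n * (math.log2(n) + 1)
print("recursion tree prediction matches")
```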
Conclusions
 Θ(𝑛 lg 𝑛) grows more slowly than Θ(𝑛²).
 Therefore, merge sort is better than insertion sort in the worst case.
 In practice, merge sort beats insertion sort for 𝑛 > 30 or so.
 Go test it out for yourself!
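One way to test it is to time both algorithms side by side. The sketch below uses illustrative implementations of both sorts; the exact crossover point varies with hardware and language, so the "𝑛 > 30 or so" threshold is only a rule of thumb:

```python
import random
import timeit

# Rough timing experiment comparing insertion sort and merge sort
# on random input of increasing size.
def insertion_sort(a):
    for i in range(1, len(a)):
        key, j = a[i], i - 1
        while j >= 0 and a[j] > key:   # shift larger elements right
            a[j + 1] = a[j]
            j -= 1
        a[j + 1] = key
    return a

def merge_sort(a):
    if len(a) <= 1:
        return a
    mid = len(a) // 2
    left, right = merge_sort(a[:mid]), merge_sort(a[mid:])
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            out.append(left[i]); i += 1
        else:
            out.append(right[j]); j += 1
    return out + left[i:] + right[j:]

for n in (10, 100, 1000):
    data = [random.random() for _ in range(n)]
    t_ins = timeit.timeit(lambda: insertion_sort(data[:]), number=50)
    t_mrg = timeit.timeit(lambda: merge_sort(data[:]), number=50)
    print(f"n={n:5d}  insertion={t_ins:.4f}s  merge={t_mrg:.4f}s")
```

For small 𝑛 insertion sort tends to win on constant factors; as 𝑛 grows, the Θ(𝑛²) term dominates and merge sort pulls ahead.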

