Lec 5 Dec, 7 Dec

The document discusses different types of analysis for algorithms including worst-case, average-case, and best-case. It then analyzes the running time of insertion sort, selection sort, bubble sort, and merge sort. Merge sort uses a divide and conquer approach by recursively dividing the problem into subproblems, solving those subproblems, and then combining the solutions. The running time of merge sort is analyzed using recursion trees and shown to be Θ(n log n).

Uploaded by Awais Manzoor

Design and Analysis of Algorithm

Kinds of Analysis
• Worst-case: T(n) = maximum time of algorithm
on any input of size n.
• Average-case: T(n) = expected time of
algorithm over all inputs of size n.
– Requires assumption of statistical distribution of
inputs.
• Best-case: T(n) = minimum time of algorithm on
any input of size n.
– Problematic because a generally slow algorithm may
work fast on some inputs.
Running Time
Let's analyze our algorithm once more.
INSERTION-SORT (A) cost times
1 for j ← 2 to length[A] c1 n
2 do key ← A[ j] c2 n-1
3 i←j–1 c3 n-1
4 while i > 0 and A[i] > key c4 ∑j=2..ntj
5 do A[i+1] ← A[i] c5 ∑j=2..n (tj-1)
6 i←i–1 c6 ∑j=2..n(tj-1)
7 A[i+1] ← key c7 n-1
Best Case
INSERTION-SORT (A) cost times
1 for j ← 2 to length[A] c1 n
2 do key ← A[ j] c2 n-1
3 i←j–1 c3 n-1
4 while i > 0 and A[i] > key c4 ∑j=2..ntj
5 do A[i+1] ← A[i] c5 ∑j=2..n (tj-1)
6 i←i–1 c6 ∑j=2..n(tj-1)
7 A[i+1] ← key c7 n-1
When the array is already sorted, each tj = 1, so
T(n) = (c1+c2+c3+c4+c7)n – (c2+c3+c4+c7) = Θ(n)
Worst Case
INSERTION-SORT (A) cost times
1 for j ← 2 to length[A] c1 n
2 do key ← A[ j] c2 n-1
3 i←j–1 c3 n-1
4 while i > 0 and A[i] > key c4 ∑j=2..ntj
5 do A[i+1] ← A[i] c5 ∑j=2..n (tj-1)
6 i←i–1 c6 ∑j=2..n(tj-1)
7 A[i+1] ← key c7 n-1
When the array is reverse sorted, each tj = j, so
T(n) = (c4+c5+c6)n²/2 + (c1+c2+c3+c4/2–c5/2–c6/2+c7)n – (c2+c3+c4+c7) = Θ(n²)
Average Case
INSERTION-SORT (A) cost times
1 for j ← 2 to length[A] c1 n
2 do key ← A[ j] c2 n-1
3 i←j–1 c3 n-1
4 while i > 0 and A[i] > key c4 ∑j=2..ntj
5 do A[i+1] ← A[i] c5 ∑j=2..n (tj-1)
6 i←i–1 c6 ∑j=2..n(tj-1)
7 A[i+1] = key c7 n-1
On average, each new key is out of order with about half of the sorted prefix, so tj ≈ j/2 and
T(n) = (c4+c5+c6)n²/4 + Θ(n) = Θ(n²)
Halving tj halves the quadratic constant, but the running time is still quadratic.
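The three cases above can be checked concretely. A minimal Python sketch of insertion sort (the function name and the shift counter are illustrative, not from the slides) that counts how often the while-loop body (lines 5–6) runs:

```python
def insertion_sort(a):
    """Sort a in place; return how many times the while-loop body ran."""
    shifts = 0
    for j in range(1, len(a)):        # pseudocode's j = 2 .. n (0-based here)
        key = a[j]
        i = j - 1
        while i >= 0 and a[i] > key:  # the t_j tests that succeed
            a[i + 1] = a[i]
            i -= 1
            shifts += 1
        a[i + 1] = key
    return shifts

n = 8
print(insertion_sort(list(range(n))))          # best case (sorted): 0
print(insertion_sort(list(range(n, 0, -1))))   # worst case (reversed): n(n-1)/2 = 28
```

A sorted input does no shifting (the Θ(n) best case), while a reversed input shifts n(n−1)/2 times (the Θ(n²) worst case), matching the formulas above.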
Machine Independent Analysis
As the input size becomes large, the constants ci
matter far less than the exponents and log factors.
These constants are also machine-dependent, which
makes exact machine-independent analysis impossible.
– Ignore the constants.
– Examine the growth of T(n) as n → ∞.
– This is asymptotic analysis.
Order of Growth
• The rate of growth is of primary interest,
so we consider only the leading term (e.g. n²)
and ignore all constants.
• Thus, the worst-case running time of
Insertion Sort is Θ(n²): quadratic time.
• We will define this more precisely later.
Searching algorithm
Linear-search(A, n, key)           cost   times
1. For i ← 1 to n                  c1     n + 1
2.   if A[i] == key                c2     n
3.     Return i                    c3     at most 1
4. Return Nil                      c4     at most 1
In the worst case (key not present):
T(n) = c1(n + 1) + c2·n + c4
T(n) = Θ(n).
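The pseudocode above can be sketched in Python (a hypothetical `linear_search`, not the slides' exact procedure):

```python
def linear_search(a, key):
    """Scan a left to right; return the index of key, or None. Θ(n) worst case."""
    for i, x in enumerate(a):
        if x == key:
            return i          # found: stop early
    return None               # key not present: scanned all n elements

print(linear_search([4, 2, 9, 7], 9))   # 2
print(linear_search([4, 2, 9, 7], 5))   # None
```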
Can we improve it??
If the elements are sorted, we can improve the search time by
applying binary search (divide and conquer).
Binary-search(A, v, first, last)
1. if (first > last)
2.   Return Nil
3. mid ← (first + last) / 2
4. if (A[mid] == v)
5.   Return mid
6. else if (A[mid] > v)
7.   Return Binary-search(A, v, first, mid − 1)
8. else
9.   Return Binary-search(A, v, mid + 1, last)
Complexity of binary search: T(n) = Θ(lg n) in the worst case.
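The recursive pseudocode can be sketched in Python (a hypothetical `binary_search`; default arguments stand in for the initial first/last bounds):

```python
def binary_search(a, v, first=0, last=None):
    """Search the sorted list a for v; return its index or None. Θ(lg n) worst case."""
    if last is None:
        last = len(a) - 1
    if first > last:                      # empty range: not found
        return None
    mid = (first + last) // 2
    if a[mid] == v:
        return mid
    elif a[mid] > v:                      # v can only be in the left half
        return binary_search(a, v, first, mid - 1)
    else:                                 # v can only be in the right half
        return binary_search(a, v, mid + 1, last)

print(binary_search([1, 3, 5, 7, 9, 11], 7))   # 3
print(binary_search([1, 3, 5, 7, 9, 11], 4))   # None
```

Each call halves the range, so at most ⌊lg n⌋ + 1 calls are made.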
Selection Sort
Selection-Sort(A, n)              Cost
1. For i ← 1 to n − 1             C1
2.   For j ← i + 1 to n           C2
3.     if (A[i] > A[j])           C3
4.       Swap A[i] ↔ A[j]         C4

T(n) = Θ(n²) in the best, average, and worst cases.
In the worst case, all the statements execute.
In the best case, c4 never executes because the condition in
statement 3 never becomes true.
This means swapping is avoided, but the number of
comparisons stays the same.
Selection Sort
Selection-Sort(A, n)
1. For i ← 1 to n − 1
2.   min ← i
3.   For j ← i + 1 to n
4.     if (A[j] < A[min])
5.       min ← j
6.   temp ← A[i]
7.   A[i] ← A[min]
8.   A[min] ← temp
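The min-tracking version above does at most one swap per outer iteration. A minimal Python sketch (names are illustrative):

```python
def selection_sort(a):
    """In-place selection sort: Θ(n²) comparisons in every case."""
    n = len(a)
    for i in range(n - 1):
        m = i                       # index of the minimum of a[i:]
        for j in range(i + 1, n):
            if a[j] < a[m]:
                m = j
        a[i], a[m] = a[m], a[i]     # one swap per outer iteration (lines 6-8)
    return a

print(selection_sort([5, 2, 9, 1]))   # [1, 2, 5, 9]
```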
Bubble sort
Bubble-sort(A)                     Cost
flag ← 0                           C1
for i ← 1 to n − 1                 C2
  if (flag == 0)                   C3
    flag ← 1                       C4
    for j ← 1 to n − i             C5
      if (A[j] > A[j+1])           C6
        swap A[j] with A[j+1]      C7
        flag ← 0                   C8
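The flag logic above can be sketched in Python. This version uses an equivalent `swapped` boolean per pass instead of the slides' flag variable, and exits early when a full pass makes no swap:

```python
def bubble_sort(a):
    """Bubble sort with early exit: a pass with no swaps means a is sorted."""
    n = len(a)
    for i in range(n - 1):
        swapped = False
        for j in range(n - 1 - i):         # the largest element bubbles to the end
            if a[j] > a[j + 1]:
                a[j], a[j + 1] = a[j + 1], a[j]
                swapped = True
        if not swapped:                    # no swap this pass: best case Θ(n)
            break
    return a

print(bubble_sort([4, 1, 3, 2]))   # [1, 2, 3, 4]
```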
Design Approach: Divide and Conquer
• Divide the problem into a number of
subproblems.
• Conquer the subproblems recursively.
• Combine the subproblem solutions into the
solution for the original problem.

• Recursion: when an algorithm calls itself.


Merge Sort
• Divide: Divide an n-element array into two
subsequences of n/2 elements each.
• Conquer: Sort the two subsequences recursively
with merge sort.
• Combine: Merge the two sorted arrays to
produce the sorted sequence.

• Special Case: If the sequence has only one


element the recursion “bottoms out” as the
sequence is sorted by definition.
Merge Sort

MERGE-SORT (A[1 . . n])
1. If n = 1, return A.
2. L ← A[ 1 . . ⌊n/2⌋ ]
3. R ← A[ ⌊n/2⌋+1 . . n ]
4. L ← MERGE-SORT(L)
5. R ← MERGE-SORT(R)
6. Return MERGE(L, R)
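The divide/conquer/combine steps above can be sketched in Python (a hypothetical `merge_sort`/`merge` pair, not the slides' exact procedures):

```python
def merge(left, right):
    """Combine: merge two sorted lists in Θ(n) time."""
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            out.append(left[i]); i += 1
        else:
            out.append(right[j]); j += 1
    out.extend(left[i:])     # one of these remainders is empty
    out.extend(right[j:])
    return out

def merge_sort(a):
    """Divide into halves, conquer recursively, combine with merge."""
    if len(a) <= 1:
        return a             # base case: recursion bottoms out
    mid = len(a) // 2
    return merge(merge_sort(a[:mid]), merge_sort(a[mid:]))

print(merge_sort([20, 13, 7, 2, 12, 11, 9, 1]))   # [1, 2, 7, 9, 11, 12, 13, 20]
```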
Merge Sort
Merging two sorted arrays

[Figure: step-by-step merge of the sorted arrays (2, 7, 13, 20) and
(1, 9, 11, 12). At each step the smallest remaining element of each
array is compared and the smaller one is appended to the output,
producing 1, 2, 7, 9, 11, 12, 13, 20.]

Time = Θ(n) to merge a total
of n elements (linear time).
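The merging process shown above can be traced with a comparison counter to confirm the linear-time claim (the `merge_count` helper is illustrative):

```python
def merge_count(left, right):
    """Merge two sorted lists; also count comparisons (at most n - 1 for n elements)."""
    out, i, j, comparisons = [], 0, 0, 0
    while i < len(left) and j < len(right):
        comparisons += 1
        if left[i] <= right[j]:
            out.append(left[i]); i += 1
        else:
            out.append(right[j]); j += 1
    out += left[i:] + right[j:]    # flush the non-empty remainder without comparing
    return out, comparisons

merged, c = merge_count([2, 7, 13, 20], [1, 9, 11, 12])
print(merged, c)   # [1, 2, 7, 9, 11, 12, 13, 20] 6
```

Merging these n = 8 elements takes only 6 comparisons, confirming the Θ(n) bound.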
Analyzing merge sort

T(n)        MERGE-SORT (A[1 . . n])
Θ(1)        1. If n = 1, done.
2T(n/2)     2. Recursively sort A[ 1 . . ⌊n/2⌋ ]
               and A[ ⌊n/2⌋+1 . . n ].
Θ(n)        3. "Merge" the 2 sorted lists.
Sloppiness: Should be T( ⌈n/2⌉ ) + T( ⌊n/2⌋ ),
but it turns out not to matter asymptotically.
Recurrence for merge sort
T(n) = Θ(1)              if n = 1;
       2T(n/2) + Θ(n)    if n > 1.
• We shall usually omit stating the base
case when T(n) = Θ(1) for sufficiently
small n, but only when it has no effect on
the asymptotic solution to the recurrence.
Recursion tree
Solve T(n) = 2T(n/2) + cn, where c > 0 is constant.

[Figure: the recurrence expanded level by level into a tree.]

                    cn                       → cn
              cn/2       cn/2                → cn
h = lg n   cn/4   cn/4   cn/4   cn/4         → cn
           …
           Θ(1)  …  #leaves = n              → Θ(n)

Each of the lg n levels sums to cn, and the n leaves contribute Θ(n).
Total = Θ(n lg n)
Conclusions

• Θ(n lg n) grows more slowly than Θ(n²).
• Therefore, merge sort asymptotically
beats insertion sort in the worst case.
