Design and Analysis of Algorithms (CS3052)

This document discusses algorithms and their analysis. It introduces asymptotic notations like Big-O, Omega, and Theta that are used to analyze the time complexity of algorithms as the input size increases. Examples are given of different algorithms and their time complexities, including linear search which has best, average, and worst case complexities of Ω(1), Θ(n), and O(n) respectively.

Design and Analysis of

Algorithms (CS3052)
Prof. G. G. Shingan
Computer Science and Engineering Department
RIT, Rajaramnagar
Lecture No. 2: General Overview of Algorithms, Searching and Sorting Techniques
CHAPTER No. 1
Outline
• Asymptotic Notations
▫ Best Case
▫ Average Case
▫ Worst Case
Learning Outcomes
• Analyze running time complexity of algorithms
• Compare running time complexity of two different
algorithms
Design and Analysis of Algorithm
• A PROBLEM can be solved by many candidate algorithms A1, A2, A3, A4, A5, ..., An.
• Design selects one of these algorithms, which is then implemented as a Program.
• The resulting program is analyzed for:
 1. Space Complexity
 2. Time Complexity

Analysis of Algorithms
• An algorithm is a finite set of precise instructions for
performing a computation or for solving a problem.
• What is the goal of analysis of algorithms?
• To compare algorithms mainly in terms of running time but
also in terms of other factors (e.g., memory requirements,
programmer's effort etc.)
• What do we mean by running time analysis?
• Determine how running time increases as the size of the
problem increases.

How do we compare algorithms?


• We need to define a number of objective measures.

(1) Compare execution times?


Not good: times are specific to a particular computer!

(2) Count the number of statements executed?


Not good: the number of statements varies with the style of the
individual programmer.

Ideal Solution

• Express running time as a function of the input size n


(i.e., f(n)).
• Compare different functions corresponding to running
times.
• Such an analysis is independent of machine time,
programming style, etc.

Input Size
• Input size (number of elements in the input)
▫ size of an array
▫ degree of a polynomial
▫ # of elements in a matrix
▫ # of bits in the binary representation of the input
▫ # of vertices and edges in a graph
Types of Analysis
• Worst case
▫ Provides an upper bound on running time
▫ An absolute guarantee that the algorithm would not run longer, no
matter what the inputs are
• Best case
▫ Provides a lower bound on running time
▫ Input is the one for which the algorithm runs the fastest
• Average case
▫ Provides a prediction about the running time
▫ Assumes that the input is random

Lower Bound ≤ Running Time ≤ Upper Bound



Asymptotic Analysis
• Asymptotic notation is a language that lets us analyze an
algorithm's running time by characterizing its behavior as the
input size increases.
• To compare two algorithms with running times f(n) and
g(n), we need a rough measure that characterizes how
fast each function grows.
• Hint: use rate of growth
• Compare functions in the limit, that is, asymptotically!
(i.e., for large values of n)

Asymptotic Notation

• O-notation: asymptotic “less than” (worst case)

▫ f(n) = O(g(n)) implies f(n) “≤” g(n)

• Ω-notation: asymptotic “greater than” (best case)

▫ f(n) = Ω(g(n)) implies f(n) “≥” g(n)

• Θ-notation: asymptotic “equality” (tight bound)

▫ f(n) = Θ(g(n)) implies f(n) “=” g(n)


Asymptotic notations (O-notation)

f(n) = n² and g(n) = 2ⁿ

n   f(n)   g(n)
1     1      2
2     4      4
3     9      8
4    16     16
5    25     32
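The table above can be reproduced with a short sketch (my own illustration, not part of the slides); it shows the crossover at n = 4, beyond which g(n) = 2ⁿ dominates f(n) = n² — the sense in which n² = O(2ⁿ).

```python
# Illustrative sketch: tabulate f(n) = n^2 against g(n) = 2^n to see
# where the exponential overtakes the polynomial.
def f(n):
    return n ** 2

def g(n):
    return 2 ** n

for n in range(1, 9):
    relation = "<=" if f(n) <= g(n) else "> "
    print(f"n={n}: f(n)={f(n):3d} {relation} g(n)={g(n):3d}")
```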
Big-O Visualization
O(g(n)) is the set of functions with smaller or same order of growth as g(n).
Examples
▫ F(n) = 3n+2, g(n) = n; c = 4 and n0 = 2

▫ F(n) = n²+n, g(n) = n³
Examples
▫ F(n) = 3n+2, g(n) = n, F(n) = O(g(n)) with c = 4 and n0 = 2

n   c   F(n)   c·g(n)   F(n) ≤ c·g(n)?   Remark
0   1    2       0      No
1   1    5       1      No
2   1    8       2      No               we need to change the value of c
2   4    8       8      Yes              for n ≥ 2 and c = 4, c·g(n) is an upper bound on F(n)
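As a sanity check of the witness constants above, a small helper (my own sketch, not from the slides) can verify f(n) ≤ c·g(n) over a finite range of n:

```python
# Minimal check of a Big-O witness: with c = 4 and n0 = 2,
# f(n) = 3n + 2 satisfies f(n) <= c * g(n) for every n >= n0
# (verified here over a finite range, not proven in general).
def is_big_o_witness(f, g, c, n0, limit=1000):
    """Return True if f(n) <= c * g(n) for all n in [n0, limit]."""
    return all(f(n) <= c * g(n) for n in range(n0, limit + 1))

f = lambda n: 3 * n + 2
g = lambda n: n

print(is_big_o_witness(f, g, c=4, n0=2))   # True: the bound holds from n0 on
print(is_big_o_witness(f, g, c=1, n0=1))   # False: c = 1 is too small
```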
Asymptotic notations (Ω-notation)

Ω(g(n)) is the set of functions with larger or same order of growth as g(n).

Examples
▫ F(n) = 3n+2, g(n) = n; c = 1 and n0 = 1

▫ F(n) = n³+4n², g(n) = n²

Asymptotic notations (Θ-notation)

Θ(g(n)) is the set of functions with the same order of growth as g(n).

Examples
▫ F(n) = 3n+2, g(n) = n; c1 = 1, c2 = 4 and n0 = 2
 (3n+2 ≤ 4n requires n ≥ 2)

▫ F(n) = n²+5n+7, g(n) = n²
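The two-sided Θ bound can be checked the same way (again a sketch of my own over a finite range): c1·g(n) ≤ f(n) ≤ c2·g(n) with c1 = 1, c2 = 4, and n0 = 2.

```python
# Check a Theta witness: f(n) = 3n + 2 is sandwiched between
# c1 * n and c2 * n for all n >= n0 (verified over a finite range).
def is_theta_witness(f, g, c1, c2, n0, limit=1000):
    return all(c1 * g(n) <= f(n) <= c2 * g(n) for n in range(n0, limit + 1))

f = lambda n: 3 * n + 2
g = lambda n: n

print(is_theta_witness(f, g, c1=1, c2=4, n0=2))  # True
print(is_theta_witness(f, g, c1=1, c2=4, n0=1))  # False: f(1) = 5 > 4 * 1
```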

Common orders of magnitude


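The growth-rate table for this slide did not survive extraction. As an illustrative sketch (the function list and sample sizes are my own choices), the common orders of magnitude can be tabulated:

```python
import math

# Tabulate common growth rates at a few input sizes.
# Columns are the function values at n = 4, 16, and 64.
functions = [
    ("log n",   lambda n: math.log2(n)),
    ("n",       lambda n: float(n)),
    ("n log n", lambda n: n * math.log2(n)),
    ("n^2",     lambda n: float(n ** 2)),
    ("n^3",     lambda n: float(n ** 3)),
    ("2^n",     lambda n: float(2 ** n)),
]

for name, fn in functions:
    row = "  ".join(f"{fn(n):>22.0f}" for n in (4, 16, 64))
    print(f"{name:>8}: {row}")
```

The gap between the last two rows and the rest is the practical point of the slide: polynomial algorithms stay usable as n grows, while 2ⁿ quickly does not.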

Examples

▫ F(n) = 2n², g(n) = n³

▫ F(n) = n², g(n) = n²

▫ F(n) = 1000n²+1000n, g(n) = n²

▫ F(n) = n, g(n) = n²

Examples

▫ 2n² = O(n³): 2n² ≤ cn³ ⟹ 2 ≤ cn ⟹ c = 1 and n0 = 2

▫ n² = O(n²): n² ≤ cn² ⟹ c ≥ 1 ⟹ c = 1 and n0 = 1

▫ 1000n²+1000n = O(n²):
 1000n²+1000n ≤ 1000n² + n² = 1001n² for n ≥ 1000 ⟹ c = 1001 and n0 = 1000

▫ n = O(n²): n ≤ cn² ⟹ cn ≥ 1 ⟹ c = 1 and n0 = 1
Example: Linear Search
• n = 10: 5 7 1 2 4 10 20 11 28 3

• Searching for element ‘x’

x                   Comparisons   Complexity      Case
5 (first element)   1             Ω(1)            Best case
3 (last element)    10            O(n)            Worst case
4                   n/2           Θ(n/2) = Θ(n)   Average case
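A minimal implementation of linear search with a comparison counter (a sketch of my own; the slides give only the table) reproduces the counts above:

```python
# Linear search that also counts element comparisons, so the best,
# worst, and average cases from the table can be observed directly.
def linear_search(arr, x):
    """Return (index, comparisons); index is -1 if x is absent."""
    comparisons = 0
    for i, value in enumerate(arr):
        comparisons += 1
        if value == x:
            return i, comparisons
    return -1, comparisons

data = [5, 7, 1, 2, 4, 10, 20, 11, 28, 3]
print(linear_search(data, 5))   # (0, 1): first element, best case
print(linear_search(data, 3))   # (9, 10): last element, worst case
```

Searching for an absent element also takes n comparisons, which is why the worst case is O(n) regardless of whether the element is present.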
