
Lecture 03

Asymptotic Notations

Ms. Madiha Rehman


[email protected]

Institute of Computer Science


Khwaja Fareed UEIT
Agenda
• Introduction
• Algorithm Analysis
• A Priori Analysis
• A Posteriori Analysis
• Execution Time Cases
• Asymptotic Notations
• Big Oh
• Big Omega
• Theta
Introduction
• An important question is: How efficient is an
algorithm or piece of code?
• Efficiency covers lots of resources, including:
• CPU (time) usage
• memory usage
• disk usage
• network usage
Performance Vs. Complexity
• Be careful to differentiate between:
  1. Performance: how much time/memory/disk/... is actually used when a program is run. This depends on the machine, compiler, etc., as well as the code.
  2. Complexity: how the resource requirements of a program or algorithm scale, i.e., what happens as the size of the problem being solved gets larger.
• Complexity affects performance, but not the other way around.
Introduction (Contd…)
• The time required by a method is proportional to the
number of "basic operations" that it performs.
• Here are some examples of basic operations:
• one arithmetic operation (e.g., +, *).
• one assignment
• one test (e.g., x == 0)
• one read
• one write (of a primitive type)
Algorithm Analysis
• Analysis is the process of comparing two algorithms w.r.t. time complexity and space complexity.
• Efficiency of an algorithm can be analyzed at two different stages:
  • Before implementation: A Priori Analysis
  • After implementation: A Posteriori Analysis
A Priori Analysis
• Algorithm analysis deals with the execution or running time of the various operations involved.
• In a priori analysis, the running time of an operation is defined as the number of computer instructions executed.
• This is a theoretical analysis of an algorithm.
• It is independent of any particular hardware.
• Efficiency of an algorithm is measured by assuming that all other factors, for example processor speed, are constant and have no effect on the implementation.
• It gives an approximate value.
• Asymptotic notation is used to represent a priori analysis.
A Posteriori Analysis
• This is an empirical analysis of an algorithm.
• The selected algorithm is implemented in a programming language.
• It is then executed on a target computer machine.
• In this analysis, actual statistics, like running time and space required, are collected.
• It is dependent on the particular hardware.
• It gives an exact value.
Execution Time Cases
• An algorithm may not have the same performance
for different types of inputs.
• With the increase in the input size, the performance will
change.
• Three cases are usually used to compare the execution time of various data structures in a relative manner:
• Worst Case
• Average Case
• Best Case
Analysis of Time Complexity
• Running time depends on many factors, but we compute it based on input size.
• Thus we need to calculate the rate of increase in time w.r.t. the input size.

• Assumption:
  • All arithmetic & logical operations take 1 unit of time
  • All return statements also take 1 unit of time
Analysis of Time Complexity
• Example of Constant Time

int sumOfNum (a, b)          Cost
{
    int c = a + b;           1 + 1 = 2
    return c;                1
}

Total Cost = 1 + 1 + 1 = 3

It means a constant unit of time.


Analysis of Time Complexity
• Example of Linear Time

int sumOfArray (a[ ], n)              Cost                    Total Executions
{
    int sum = 0;                      1                       1
    for (int i = 0; i < n; i++)       1 (for i = 0)           1
                                      2 (for i < n & i++)     n + 1
    {
        sum = a[ i ] + sum;           2                       n
    }
    return sum;                       1                       1
}

Total Cost = 1 + 1 + 2(n+1) + 2(n) + 1 = 4n + 5 = f(n)

This is called Linear Time.


Asymptotic Analysis
• The study of change in performance of the algorithm
with the change in the order of the input size is defined
as asymptotic analysis.
• It is used to mathematically calculate the running time of
any operation inside an algorithm.
• Asymptotic analysis is input bound
• Other than the "input" all other factors are considered
constant.
• Using asymptotic analysis, we can conclude the best-case, average-case, and worst-case scenarios of an algorithm.
Asymptotic Notations
• The commonly used asymptotic notations for calculating the running time complexity of an algorithm are:
  • Big Oh Notation (Ο): Worst Case
  • Big Omega Notation (Ω): Best Case
  • Theta Notation (θ): Average Case

Big Oh Notation (Ο)
• It is the formal way to express the upper bound of an algorithm's running time.
• It tells how the time it takes to run a function grows as the size of the input grows.
• It measures the worst case of time complexity.
• It is the most widely used notation for analyzing an algorithm.

Big Oh Notation (Ο)

O(g(n)) = { f(n): there exist positive constants c and n0 such that 0 ≤ f(n) ≤ c·g(n) for all n ≥ n0 }

Example: f(n) = 4n + 5 (from the linear-time example)
g(n) = n, so f(n) = O(n)
f(n) ≤ c·g(n), with c·g(n) = 9n
n0 = 1, so f(n) ≤ 9n for all n ≥ n0

Example: f(n) = 5n² + 2n + 1
g(n) = n², so f(n) = O(n²)
f(n) ≤ c·g(n), with c·g(n) = 8n²
n0 = 1, so f(n) ≤ 8n² for all n ≥ n0
Omega Notation (Ω-notation)
• Omega notation represents the lower bound of the running time of an algorithm.
• It provides the best-case complexity of an algorithm.
Omega Notation

Ω(g(n)) = { f(n): there exist positive constants c and n0 such that 0 ≤ c·g(n) ≤ f(n) for all n ≥ n0 }

Example: f(n) = 5n² + 2n + 1
g(n) = n², so f(n) = Ω(n²)
f(n) ≥ c·g(n), with c·g(n) = 5n²
n0 = 1, so f(n) ≥ 5n² for all n ≥ n0
Theta Notation (θ-notation)
• Theta notation encloses the function from above and below.
• It represents both the upper and the lower bound of the running time of an algorithm.
• It is used for analyzing the average-case complexity of an algorithm.
Theta Notation

θ(g(n)) = { f(n): there exist positive constants c1, c2 and n0 such that 0 ≤ c1·g(n) ≤ f(n) ≤ c2·g(n) for all n ≥ n0 }
