ECE 250 Algorithms and Data Structures
Douglas Wilhelm Harder, M.Math. LEL
Department of Electrical and Computer Engineering
University of Waterloo
Waterloo, Ontario, Canada
ece.uwaterloo.ca
dwharder@gmail.com
© 2006-2013 by Douglas Wilhelm Harder. Some rights reserved.
Asymptotic Analysis
2
Asymptotic Analysis
Outline
In this topic, we will look at:
– Justification for analysis
– Quadratic and polynomial growth
– Counting machine instructions
– Landau symbols
– Big-Θ as an equivalence relation
– Little-o as a weak ordering
2.3
3
Asymptotic Analysis
Background
Suppose we have two algorithms; how can we tell which is better?
We could implement both algorithms and run them both
– Expensive and error-prone
Preferably, we should analyze them mathematically
– Algorithm analysis
2.3
4
Asymptotic Analysis
Asymptotic Analysis
In general, we will always analyze algorithms with respect to one or
more variables
We will begin with one variable:
– The number of items n currently stored in an array or other data
structure
– The number of items expected to be stored in an array or other data
structure
– The dimensions of an n × n matrix
Examples with multiple variables:
– Dealing with n objects stored in m memory locations
– Multiplying a k × m and an m × n matrix
– Dealing with sparse matrices of size n × n with m non-zero entries
2.3.1
5
Asymptotic Analysis
Maximum Value
For example, finding the largest object in an array of n
random integers takes approximately n operations
// Return the largest of array[0], ..., array[n - 1]; assumes n >= 1
int find_max( int *array, int n ) {
    int max = array[0];

    for ( int i = 1; i < n; ++i ) {
        if ( array[i] > max ) {
            max = array[i];
        }
    }

    return max;
}
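As a usage sketch (not from the slides; the array contents are arbitrary and assume the function above is in scope):

#include <iostream>

int main() {
    int data[6] = { 3, 9, 2, 7, 4, 8 };

    // One pass over six entries: five comparisons
    std::cout << find_max( data, 6 ) << std::endl;  // prints 9

    return 0;
}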
2.3.1
6
Asymptotic Analysis
Maximum Value
One comment:
– In this class, we will look at both simple C++ arrays and the standard
template library (STL) structures
– Instead of using the built-in array, we could use the STL vector class
– The vector class is closer to the C#/Java array
2.3.1
7
Asymptotic Analysis
Maximum Value
#include <cstddef>
#include <vector>

// Return the largest entry of the vector; throws on an empty vector
int find_max( std::vector<int> const &array ) {
    if ( array.empty() ) {
        throw underflow();  // course-defined exception class
    }

    int max = array[0];

    for ( std::size_t i = 1; i < array.size(); ++i ) {
        if ( array[i] > max ) {
            max = array[i];
        }
    }

    return max;
}
2.3.1
8
Asymptotic Analysis
Linear and binary search
There are other algorithms which are significantly faster as the
problem size increases
This plot shows the maximum
and average number of
comparisons needed to find an entry
in a sorted array of size n [plot omitted]:
– Linear search
– Binary search
2.3.2
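To make the gap concrete, here is a minimal sketch (not from the slides) that counts comparisons for both searches on a sorted array; counting one comparison per probe is a simplification:

#include <cstddef>
#include <iostream>
#include <vector>

// Count probes made by a linear search for 'key' (worst case: n)
std::size_t linear_search_count( std::vector<int> const &a, int key ) {
    std::size_t count = 0;

    for ( std::size_t i = 0; i < a.size(); ++i ) {
        ++count;
        if ( a[i] == key ) break;
    }

    return count;
}

// Count probes made by a binary search (worst case: about lg(n))
std::size_t binary_search_count( std::vector<int> const &a, int key ) {
    std::size_t count = 0;
    std::size_t lo = 0, hi = a.size();

    while ( lo < hi ) {
        std::size_t mid = lo + (hi - lo)/2;
        ++count;
        if ( a[mid] == key ) break;
        if ( a[mid] < key ) lo = mid + 1; else hi = mid;
    }

    return count;
}

int main() {
    std::vector<int> a;
    for ( int i = 0; i < 1000000; ++i ) a.push_back( i );

    // Searching for the last entry: roughly n versus lg(n) probes
    std::cout << linear_search_count( a, 999999 ) << std::endl;  // 1000000
    std::cout << binary_search_count( a, 999999 ) << std::endl;  // about 20

    return 0;
}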
9
Asymptotic Analysis
Asymptotic Analysis
Given an algorithm:
– We need to be able to describe these values mathematically
– We need a systematic means of using the description of the algorithm
together with the properties of an associated data structure
– We need to do this in a machine-independent way
For this, we need Landau symbols and the associated asymptotic
analysis
2.3.3
10
Asymptotic Analysis
Quadratic Growth
Consider the two functions
f(n) = n^2 and g(n) = n^2 − 3n + 2
Around n = 0, they look very different
2.3.3
11
Asymptotic Analysis
Quadratic Growth
Yet on the range 0 ≤ n ≤ 1000, they are (relatively) indistinguishable:
2.3.3
12
Asymptotic Analysis
Quadratic Growth
The absolute difference is large, for example,
f(1000) = 1 000 000
g(1000) = 997 002
but the relative difference is very small

(f(1000) − g(1000))/f(1000) = 0.002998 ≈ 0.3 %

and this difference goes to zero as n → ∞
2.3.3
13
Asymptotic Analysis
Polynomial Growth
To demonstrate with another example,
f(n) = n^6 and g(n) = n^6 − 23n^5 + 193n^4 − 729n^3 + 1206n^2 − 648n
Around n = 0, they are very different
2.3.3
14
Asymptotic Analysis
Polynomial Growth
Still, around n = 1000, the relative difference is less than 3%
2.3.3
15
Asymptotic Analysis
Polynomial Growth
The justification for both pairs of polynomials being similar is that, in
both cases, the two functions have the same leading term:
n^2 in the first case, n^6 in the second
Suppose, however, that the coefficients of the leading terms were
different
– In this case, both functions would still exhibit the same rate of growth;
however, one would always be proportionally larger
2.3.3
16
Asymptotic Analysis
Examples
We will now look at two examples:
– A comparison of selection sort and bubble sort
– A comparison of insertion sort and quicksort
2.3.4
17
Asymptotic Analysis
Counting Instructions
Suppose we had two algorithms which sorted a list of size n and the
number of machine instructions executed is given by
b_worst(n) = 4.7n^2 − 0.5n + 5 Bubble sort (worst case)
b_best(n) = 3.8n^2 + 0.5n + 5 Bubble sort (best case)
s(n) = 4n^2 + 14n + 12 Selection sort
The smaller the value, the fewer instructions are run
– For n ≤ 21, b_worst(n) < s(n)
– For n ≥ 22, b_worst(n) > s(n)
2.3.4.1
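A quick sketch (not part of the slides) to locate that crossover numerically from the two instruction-count models:

#include <iostream>

int main() {
    // Instruction-count models from the slide
    auto b_worst = []( double n ){ return 4.7*n*n - 0.5*n + 5; };
    auto s       = []( double n ){ return 4.0*n*n + 14.0*n + 12; };

    // Scan for the first n at which bubble sort's worst case exceeds selection sort
    for ( int n = 1; n <= 30; ++n ) {
        if ( b_worst( n ) > s( n ) ) {
            std::cout << "crossover at n = " << n << std::endl;  // prints 22
            break;
        }
    }

    return 0;
}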
18
Asymptotic Analysis
Counting Instructions
For small values of n, the algorithm described by s(n) requires more
instructions than even the worst case of bubble sort
2.3.4.1
19
Asymptotic Analysis
Counting Instructions
Near n = 1000, b_worst(n) ≈ 1.175 s(n) and b_best(n) ≈ 0.95 s(n)
2.3.4.1
20
Asymptotic Analysis
Counting Instructions
Is this a serious difference between these two algorithms?
Because we can count the number of instructions, we can also
estimate how much time is required to run one of these algorithms
on a computer
2.3.4.1
21
Asymptotic Analysis
Counting Instructions
Suppose we have a 1 GHz computer
– The time (in seconds) required to sort a list of up to n = 10 000 objects
is under half a second
2.3.4.1
22
Asymptotic Analysis
Counting Instructions
To sort a list with one million elements, it will take about 1 h
– Bubble sort could, under some conditions, be 200 s faster
2.3.4.1
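As a sanity check (assuming the machine executes 10^9 instructions per second, which is a simplification):

s(10 000) = 4(10 000)^2 + 14(10 000) + 12 ≈ 4.0 × 10^8 instructions ≈ 0.4 s
s(1 000 000) ≈ 4.0 × 10^12 instructions ≈ 4000 s ≈ 1.1 h
b_best(1 000 000) ≈ 3.8 × 10^12 instructions ≈ 3800 s, about 200 s less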
23
Asymptotic Analysis
Counting Instructions
How about running selection sort on a faster computer?
– For large values of n, selection sort on a faster computer will always be
faster than bubble sort
2.3.4.1
24
Asymptotic Analysis
Counting Instructions
Justification?
– If f(n) = a_k n^k + ··· and g(n) = b_k n^k + ···,
for large enough n, it will always be true that
f(n) < M g(n)
where we choose
M = a_k/b_k + 1
In this case, we only need a computer which is M times faster (or
slower)
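The reasoning, sketched (this step is implicit on the slide): the lower-order terms vanish in the ratio, so

lim_{n→∞} f(n)/g(n) = a_k/b_k < a_k/b_k + 1 = M

and therefore f(n) < M g(n) for all sufficiently large n.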
Question:
– Is a linear search comparable to a binary search?
– Can we just run a linear search on a slower computer?
2.3.4.1
25
Asymptotic Analysis
Counting Instructions
As another example:
– Compare the number of instructions required for insertion sort and for
quicksort
– Both functions are concave up, although one more so than the other
2.3.4.2
26
Asymptotic Analysis
Counting Instructions
Insertion sort, however, is growing at a rate of n^2 while quicksort
grows at a rate of n lg(n)
– Nevertheless, the graphic suggests it is more useful to use insertion
sort when sorting small lists—quicksort has a large overhead
2.3.4.2
27
Asymptotic Analysis
Counting Instructions
If the size of the list is larger (greater than about 20), the additional
overhead of quicksort quickly becomes insignificant
– The quicksort algorithm becomes significantly more efficient
– Question: can we just buy a faster computer?
2.3.4.2
28
Asymptotic Analysis
Weak ordering
Consider the following definitions:
– We will consider two functions to be equivalent, f ~ g, if
lim_{n→∞} f(n)/g(n) = c where 0 < c < ∞
– We will state that f < g if
lim_{n→∞} f(n)/g(n) = 0
For functions we are interested in, these define a weak ordering
2.3.5
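For example (a worked instance, not from the slides): 4.7n^2 ~ 4n^2, since the ratio tends to 4.7/4 = 1.175, while n ln(n) < n^2 in this ordering, since lim_{n→∞} n ln(n)/n^2 = lim_{n→∞} ln(n)/n = 0.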
29
Asymptotic Analysis
Weak ordering
Let f(n) and g(n) describe the run-times of two algorithms
– If f(n) ~ g(n), then it is always possible to improve the performance of
one function over the other by purchasing a faster computer
– If f(n) < g(n), then you can never purchase a computer fast enough so
that the second function always runs in less time than the first
Note that for small values of n, it may be reasonable to use an
algorithm that is asymptotically more expensive, but we will consider
these on a one-by-one basis
2.3.5
31
Asymptotic Analysis
Landau Symbols
Recall Landau symbols from 1st year:
A function f(n) = O(g(n)) if there exist N and c > 0 such that
f(n) < c g(n)
whenever n > N
– The function f(n) has a rate of growth no greater than that of g(n)
2.3.5
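For instance (a worked example, not on the slide): 3n + 8 = O(n), since choosing c = 4 and N = 8 gives 3n + 8 < 4n whenever n > 8.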
32
Asymptotic Analysis
Landau Symbols
Before we begin, however, we will make some assumptions:
– Our functions will describe the time or memory required to solve a
problem of size n
– Consequently, we are restricting ourselves to certain functions:
• They are defined for n ≥ 0
• They are strictly positive for all n
– In fact, f(n) > c for some value c > 0
– That is, any problem requires at least one instruction and byte
• They are increasing (monotonic increasing)
2.3.5
33
Asymptotic Analysis
Landau Symbols
Another Landau symbol is Θ
A function f(n) = Θ(g(n)) if there exist positive N, c_1, and c_2 such that
c_1 g(n) < f(n) < c_2 g(n)
whenever n > N
– The function f(n) has a rate of growth equal to that of g(n)
2.3.5
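For instance (again, a worked example, not on the slide): n^2 < n^2 + 3n < 2n^2 whenever n > 3, so taking N = 3, c_1 = 1, and c_2 = 2 shows that n^2 + 3n = Θ(n^2).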
34
Asymptotic Analysis
Landau Symbols
These definitions are often unnecessarily tedious
Note, however, that if f(n) and g(n) are polynomials of the same
degree with positive leading coefficients:

lim_{n→∞} f(n)/g(n) = c where 0 < c < ∞
2.3.5
35
Asymptotic Analysis
Landau Symbols
Suppose that f(n) and g(n) satisfy

lim_{n→∞} f(n)/g(n) = c

From the definition, this means that given any ε with c > ε > 0, there
exists an N > 0 such that

| f(n)/g(n) − c | < ε

whenever n > N
That is,

c − ε < f(n)/g(n) < c + ε

and therefore

(c − ε) g(n) < f(n) < (c + ε) g(n)
2.3.5
36
Asymptotic Analysis
Landau Symbols
However, the statement

(c − ε) g(n) < f(n) < (c + ε) g(n)

says that f(n) = Θ(g(n))
Note that this only goes one way:
If lim_{n→∞} f(n)/g(n) = c where 0 < c < ∞, it follows that f(n) = Θ(g(n))
2.3.5
37
Asymptotic Analysis
Landau Symbols
We have a similar definition for O:
If lim_{n→∞} f(n)/g(n) = c where 0 ≤ c < ∞, it follows that f(n) = O(g(n))
There are other possibilities we would like to describe:
If lim_{n→∞} f(n)/g(n) = 0, we will say f(n) = o(g(n))
– The function f(n) has a rate of growth less than that of g(n)
We would also like to describe the opposite cases:
– The function f(n) has a rate of growth greater than that of g(n)
– The function f(n) has a rate of growth greater than or equal to that of
g(n)
2.3.6
38
Asymptotic Analysis
Landau Symbols
We will at times use five possible descriptions:

f(n) = Θ(g(n)) if 0 < lim_{n→∞} f(n)/g(n) < ∞
f(n) = o(g(n)) if lim_{n→∞} f(n)/g(n) = 0
f(n) = O(g(n)) if 0 ≤ lim_{n→∞} f(n)/g(n) < ∞
f(n) = ω(g(n)) if lim_{n→∞} f(n)/g(n) = ∞
f(n) = Ω(g(n)) if 0 < lim_{n→∞} f(n)/g(n) ≤ ∞
2.3.7
39
Asymptotic Analysis
Landau Symbols
For the functions we are interested in, it can be said that
f(n) = O(g(n)) is equivalent to f(n) = Θ(g(n)) or f(n) = o(g(n))
and
f(n) = Ω(g(n)) is equivalent to f(n) = Θ(g(n)) or f(n) = ω(g(n))
2.3.7
40
Asymptotic Analysis
Landau Symbols
Graphically, we can summarize these descriptions as follows: [figure omitted]
2.3.7
41
Asymptotic Analysis
Landau Symbols
Some other observations we can make are:
f(n) = Θ(g(n)) ⇔ g(n) = Θ(f(n))
f(n) = O(g(n)) ⇔ g(n) = Ω(f(n))
f(n) = o(g(n)) ⇔ g(n) = ω(f(n))
2.3.8
42
Asymptotic Analysis
Big-Θ as an Equivalence Relation
If we look at the first relationship, we notice that
f(n) = Θ(g(n)) seems to describe an equivalence relation:
1. f(n) = Θ(g(n)) if and only if g(n) = Θ(f(n))
2. f(n) = Θ(f(n))
3. If f(n) = Θ(g(n)) and g(n) = Θ(h(n)), it follows that f(n) = Θ(h(n))
Consequently, we can group all functions into equivalence classes,
where all functions within one class are big-Θ of each other
2.3.8
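A sketch of why transitivity holds (using the limit characterization, which is sufficient for the functions we consider): if f(n)/g(n) → c_1 and g(n)/h(n) → c_2 with 0 < c_1, c_2 < ∞, then

f(n)/h(n) = (f(n)/g(n)) · (g(n)/h(n)) → c_1 c_2

and 0 < c_1 c_2 < ∞, so f(n) = Θ(h(n)).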
43
Asymptotic Analysis
Big-Θ as an Equivalence Relation
For example, all of
n^2
100000 n^2 − 4n + 19
n^2 + 1000000
323 n^2 − 4n ln(n) + 43n + 10
42n^2 + 32
n^2 + 61n ln^2(n) + 7n + 14 ln^3(n) + ln(n)
are big-Θ of each other
E.g., 42n^2 + 32 = Θ( 323 n^2 − 4n ln(n) + 43n + 10 )
2.3.8
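To verify that last claim (a worked limit, not on the slide), divide numerator and denominator by n^2:

lim_{n→∞} (42n^2 + 32)/(323n^2 − 4n ln(n) + 43n + 10) = 42/323

a finite, positive constant, so the two functions are indeed Θ of each other.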
44
Asymptotic Analysis
Big-Θ as an Equivalence Relation
Recall that with the equivalence class of all 19-year-olds, we only
had to pick one such student?
Similarly, we will select just one element to represent the entire class
of these functions: n^2
– We could choose any function, but this is the simplest
2.3.8
45
Asymptotic Analysis
Big-Θ as an Equivalence Relation
The most common classes are given names:
Θ(1)        constant
Θ(ln(n))    logarithmic
Θ(n)        linear
Θ(n ln(n))  “n log n”
Θ(n^2)      quadratic
Θ(n^3)      cubic
2^n, e^n, 4^n, ...  exponential
2.3.8
46
Asymptotic Analysis
Logarithms and Exponentials
Recall that all logarithms are scalar multiples of each other; by the
change-of-base identity, log_b(n) = ln(n)/ln(b)
– Therefore log_b(n) = Θ(ln(n)) for any base b > 1
Alternatively, there is no single equivalence class for exponential
functions:
– If 1 < a < b,

lim_{n→∞} a^n/b^n = lim_{n→∞} (a/b)^n = 0

– Therefore a^n = o(b^n)
However, we will see that it is almost universally undesirable to have
an exponentially growing function!
2.3.8
47
Asymptotic Analysis
Logarithms and Exponentials
Plotting 2^n, e^n, and 4^n on the range [1, 10] already shows how
differently these functions grow [plot omitted]
Note:
2^10 = 1024
e^10 ≈ 22 026
4^10 = 1 048 576
2.3.8
48
Asymptotic Analysis
Little-o as a Weak Ordering
We can show that, for example,
ln(n) = o(n^p)
for any p > 0
Proof: Using l’Hôpital’s rule, we have

lim_{n→∞} ln(n)/n^p = lim_{n→∞} (1/n)/(p n^(p−1)) = lim_{n→∞} 1/(p n^p) = (1/p) lim_{n→∞} n^(−p) = 0

Conversely, 1 = o(ln( n ))
2.3.9
49
Asymptotic Analysis
Little-o as a Weak Ordering
Other observations:
– If p and q are real positive numbers where p < q, it follows that
n^p = o(n^q)
– For example, matrix-matrix multiplication is Θ(n^3), but a refined
algorithm (Strassen’s) runs in Θ(n^lg(7)) time, where lg(7) ≈ 2.81
– Also, n^p = o(ln(n) n^p), but ln(n) n^p = o(n^q)
• n^p has a slower rate of growth than ln(n) n^p, but
• ln(n) n^p has a slower rate of growth than n^q for p < q
2.3.9
50
Asymptotic Analysis
Little-o as a Weak Ordering
If we restrict ourselves to functions f(n) which are Θ(n^p) or
Θ(ln(n) n^p), we note:
– It is never true that f(n) = o(f(n))
– If f(n) ≠ Θ(g(n)), it follows that either
f(n) = o(g(n)) or g(n) = o(f(n))
– If f(n) = o(g(n)) and g(n) = o(h(n)), it follows that f(n) = o(h(n))
This defines a weak ordering!
2.3.9
51
Asymptotic Analysis
Little-o as a Weak Ordering
Graphically, we can show this relationship by marking these
functions against the real line [figure omitted]
2.3.9
52
Asymptotic Analysis
Algorithm Analysis
We will use Landau symbols to describe the complexity of
algorithms
– E.g., adding a list of n doubles will be said to be a Θ(n) algorithm
An algorithm is said to have polynomial time complexity if its run-
time may be described by O(n^d) for some fixed d ≥ 0
– We will consider such algorithms to be efficient
Problems that have no known polynomial-time algorithms are said to
be intractable
– Traveling salesman problem: find the shortest path that visits n cities
– Best known run time: Θ(n^2 2^n)
2.3.10
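As a minimal sketch of the Θ(n) example just mentioned (the function name is ours, not from the slides):

#include <cstddef>
#include <vector>

// Sum n doubles: the loop body runs exactly n times, so this is a Θ(n) algorithm
double sum( std::vector<double> const &values ) {
    double total = 0.0;

    for ( std::size_t i = 0; i < values.size(); ++i ) {
        total += values[i];
    }

    return total;
}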
53
Asymptotic Analysis
Algorithm Analysis
In general, you don’t want to implement exponential-time or
exponential-memory algorithms
– Warning: don’t call a quadratic curve “exponential”, either...please
2.3.10
54
Asymptotic Analysis
Summary
In this class, we have:
– Reviewed Landau symbols, introducing some new ones: o, O, Θ, Ω, ω
– Discussed how to use these
– Looked at the equivalence relations
55
Asymptotic Analysis
These slides are provided for the ECE 250 Algorithms and Data Structures course. The
material in them reflects Douglas W. Harder’s best judgment in light of the information available to
him at the time of preparation. Any reliance on these course slides by any party for any other
purpose is the responsibility of such parties. Douglas W. Harder accepts no responsibility for
damages, if any, suffered by any party as a result of decisions made or actions based on these
course slides for any purpose other than that for which they were intended.