Lecture 2
Asymptotic Analysis
T1(n) = an + b            dominant term: an        (linear)
T2(n) = a log n + b       dominant term: a log n   (logarithmic)
T3(n) = an^2 + bn + c     dominant term: an^2      (quadratic)
T4(n) = a^n, a > 1        dominant term: a^n       (exponential)

Scaling the input size by a factor k:
T2(kn) = a log(kn) = T2(n) + a log k
T3(kn) = a(kn)^2 = k^2 T3(n)
T4(kn) = a^(kn) = (a^n)^k
Growth Rate
[Figure: "Growth Rate of Different Functions" — function value vs. data size for lg n, n lg n, n^2, n^3, and 2^n]
Growth Rates
n       lg n   n lg n    n^2        n^3           2^n
0       -      -         0          0             1
1       0      0         1          1             2
2       1      2         4          8             4
4       2      8         16         64            16
8       3      24        64         512           256
16      4      64        256        4096          65536
32      5      160       1024       32768         4.29E+09
64      6      384       4096       262144        1.84E+19
128     7      896       16384      2097152       3.4E+38
256     8      2048      65536      16777216      1.16E+77
512     9      4608      262144     1.34E+08      1.3E+154
1024    10     10240     1048576    1.07E+09      -
2048    11     22528     4194304    8.59E+09      -
Constant Factors
The growth rate is not affected by constant factors or lower-order terms.
Examples:
- 10^2 n + 10^5 is a linear function
- 10^5 n^2 + 10^8 n is a quadratic function
Order Notation
There may be a situation where f(n) <= g(n) for all n >= n0.
[Figure: g(n) lies above f(n) for all n >= n0]
g(n) is an asymptotic upper bound on f(n): f(n) = O(g(n)) iff there exist two positive constants c and n0 such that f(n) <= cg(n) for all n >= n0.
Order Notation
Asymptotic Lower Bound: f(n) = Ω(g(n)) iff there exist positive constants c and n0 such that f(n) >= cg(n) for all n >= n0.
[Figure: f(n) lies above cg(n) for all n >= n0]
Order Notation
Asymptotically Tight Bound: f(n) = Θ(g(n)) iff there exist positive constants c1, c2, and n0 such that c1 g(n) <= f(n) <= c2 g(n) for all n >= n0.
[Figure: f(n) sandwiched between c1 g(n) and c2 g(n) for n >= n0]
This means that the best and worst cases require the same amount of time to within a constant factor.
Big-Oh Notation
Given functions f(n) and g(n), we say that f(n) is O(g(n)) if there are positive constants c and n0 such that
f(n) <= cg(n) for n >= n0
Example: 2n + 10 is O(n)
  2n + 10 <= cn
  (c - 2) n >= 10
  n >= 10/(c - 2)
  pick c = 3 and n0 = 10
[Figure: log-log plot of n, 2n + 10, and 3n]
Big-Oh Example
Example: the function n^2 is not O(n)
  n^2 <= cn would require n <= c, which cannot hold for all n >= n0 since c is a constant.
[Figure: n^2 eventually exceeds any line cn]
3n^3 + 20n^2 + 5
3n^3 + 20n^2 + 5 is O(n^3): need c > 0 and n0 >= 1 such that 3n^3 + 20n^2 + 5 <= cn^3 for n >= n0; this is true for c = 4 and n0 = 21.
Relatives of Big-Oh
big-Omega (Ω), big-Theta (Θ), little-oh (o), little-omega (ω)
Big-Oh Rules
If f(n) is a polynomial of degree d, then f(n) is O(n^d), i.e.,
1. Drop lower-order terms
2. Drop constant factors
Use the smallest possible class of functions: say "2n is O(n)" instead of "2n is O(n^2)".
Use the simplest expression of the class: say "3n + 5 is O(n)" instead of "3n + 5 is O(3n)".
Problem                                    Input size measure         Basic operation
Searching for a key in a list of n items   number of items (n)        key comparison
Matrix multiplication                      matrix dimensions          floating point multiplication
Compute a^n                                n                          floating point multiplication
Graph problem                              #vertices and/or edges     visiting a vertex or traversing an edge
Examples (1)
Sum = 0;
for (i = 1; i <= n; i++)
    for (j = 1; j <= n; j++)
        Sum++;
The innermost statement executes n * n times, so the running time is O(n^2).
Examples (2)
Sum = 0;
for (i = 1; i <= n; i++)
    for (j = 1; j <= i; j++)
        Sum++;
The innermost statement executes 1 + 2 + ... + n = n(n+1)/2 times, so the running time is still O(n^2).
Examples (3)
Non-recursive: matrix multiplication, selection sort, insertion sort
Recursive: factorial, binary search
Binary Search
Algorithm: check the middle element, then search the lower or upper half.
T(n) = T(n/2) + c
where c is some constant, the cost of checking the middle.

Expanding the recurrence step by step:
(1) T(n) = T(n/2) + c    = T(n/2^1) + 1c
(2) T(n) = T(n/4) + 2c   = T(n/2^2) + 2c
(3) T(n) = T(n/8) + 3c   = T(n/2^3) + 3c
(4) T(n) = T(n/16) + 4c  = T(n/2^4) + 4c

After k expansions, T(n) = T(n/2^k) + kc. The recursion stops when n/2^k = 1, i.e., k = lg n, so T(n) = T(1) + c lg n = O(lg n).