Asymptotic Analysis-1/KICSIT/Mr. Fahim Khan/1
Asymptotic Notation
The standard asymptotic notations commonly used in the analysis of algorithms are O (Big-Oh), Ω (Big-Omega), and θ (Theta).
Sometimes the additional notations o (little-oh) and ω (little-omega) are also used to describe the growth rates of algorithms.
The behavior of f(n) and g(n) is portrayed in the diagram. It follows that for n < n₀, f(n) may lie above or below g(n), but for all n ≥ n₀, f(n) falls consistently below g(n).
Since the worst-case running time of an algorithm is the maximum running time for any input, it follows that g(n) provides an upper bound on the worst-case running time.
The notation O(g(n)) does not imply that g(n) is the worst running time; it simply means that the worst running time never exceeds the upper limit determined by g(n).
Example(1): We show that 3n² + 10n = O(n²). For n ≥ 10 we have 10n ≤ n·n, so
3n² + 10n ≤ 3n² + n² (replacing 10 by n, valid for n ≥ 10)
= 4n² (simplifying)
Thus 3n² + 10n ≤ 4n² for n ≥ 10; the basic relation holds with c = 4 and n₀ = 10.
The choice of constant c is not unique. However, for each different c there is a corresponding value of n₀ which satisfies the basic relation. This behavior is illustrated by the next example.
O-Notation
Basic Method
Example(2): In the preceding example it was shown that 3n² + 10n ≤ c·n² for c = 4, n₀ = 10. We now show that the relation holds true for a different value of c and a corresponding n₀. For n ≥ 1 we have 10n ≤ 10n², so
3n² + 10n ≤ 3n² + 10n² = 13n² for n ≥ 1 (adding 10n² to the right-hand side)
Thus the relation also holds with c = 13 and n₀ = 1.
Figure: growth of the functions c·g(n) = 13n² and c·g(n) = 4n² versus the function f(n) = 3n² + 10n. The bound 13n² holds from n₀ = 1, while the bound 4n² holds from n₀ = 10.
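The two witness pairs (c, n₀) above can be checked numerically; a minimal sketch (the sample range 1–1000 is arbitrary and illustrates, rather than proves, the bounds):

```python
# Check the Big-Oh witnesses for f(n) = 3n^2 + 10n against g(n) = n^2.
def f(n):
    return 3 * n**2 + 10 * n

def holds(c, n0, upto=1000):
    """Verify f(n) <= c * n^2 for all n0 <= n <= upto."""
    return all(f(n) <= c * n**2 for n in range(n0, upto + 1))

print(holds(4, 10))       # c = 4,  n0 = 10
print(holds(13, 1))       # c = 13, n0 = 1
print(f(9) <= 4 * 9**2)   # c = 4 does not yet hold at n = 9
```

The last line shows why a larger c needs a smaller n₀, and vice versa.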
In general, if for a set of functions f₁(n), f₂(n), …, fₖ(n)
f₁(n) ≤ c₁·g(n) for n ≥ n₁
f₂(n) ≤ c₂·g(n) for n ≥ n₂
…
fₖ(n) ≤ cₖ·g(n) for n ≥ nₖ
where c₁, c₂, …, cₖ are constants and n₁, n₂, …, nₖ are positive integers, then the functions f₁(n), f₂(n), …, fₖ(n) are said to belong to the class O(g(n)). In set notation, the relation is denoted by
O(g(n)) = { f₁(n), f₂(n), f₃(n), …, fₖ(n) }
Alternatively, using set-builder notation,
O(g(n)) = { f(n): there exist a positive constant c and a positive integer n₀ such that f(n) ≤ c·g(n) for all n ≥ n₀ }
Ω-Notation
Definition
If f(n) is the running time of an algorithm, and g(n) is some standard growth function such that for some positive constant c and positive integer n₀,
f(n) ≥ c·g(n) for all n ≥ n₀,
then f(n) = Ω(g(n)), and g(n) is called an asymptotic lower bound for f(n).
The behavior of f(n) and g(n) is portrayed in the graph. It follows that for n < n₀, f(n) may lie above or below g(n), but for all n ≥ n₀, f(n) falls consistently above g(n). It also implies that g(n) grows slower than f(n).
Since the best-case running time of an algorithm is the minimum running time for any input, it follows that g(n) provides a lower bound on the best-case running time.
As before, the notation Ω(g(n)) does not imply that g(n) is the best running time; it simply means that the best running time is never lower than the limit determined by g(n).
Example(1): We show that n² - 10n = Ω(n²).
For n ≥ 11, n - 10 ≥ n/20 (since n - 10 - n/20 = 19n/20 - 10 ≥ 0 when n ≥ 200/19 ≈ 10.6).
Multiplying both sides by n:
n² - 10n ≥ n²/20 for n ≥ 11
Therefore, by definition,
n² - 10n = Ω(n²), with c = 1/20 and n₀ = 11.
Figure: growth of the function c·g(n) = n²/20 versus the function f(n) = n² - 10n (asymptotic lower bound).
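The lower-bound witness can be checked numerically in the same style; a sketch over an arbitrary sample range:

```python
# Check the Omega witness for f(n) = n^2 - 10n against g(n) = n^2.
def f(n):
    return n**2 - 10 * n

# f(n) >= (1/20) * n^2 should hold for every n >= 11.
print(all(f(n) >= n**2 / 20 for n in range(11, 1001)))
# At n = 10, f(10) = 0 while n^2/20 = 5, so the bound first holds at n = 11.
print(f(10) >= 10**2 / 20)
```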
Ω-Notation
Basic Method
Example(2): Next we show that 3n² - 25n = Ω(n²).
For n ≥ 9, n - 25/3 ≥ 3n/50 (since n - 25/3 - 3n/50 = 47n/50 - 25/3 ≥ 0 when n ≥ 1250/141 ≈ 8.9).
Multiplying both sides by 3n (to obtain the desired function on the left-hand side):
3n² - 25n ≥ 9n²/50 for n ≥ 9
Therefore, by definition,
3n² - 25n = Ω(n²), with c = 9/50 and n₀ = 9.
Figure: growth of the function c·g(n) = 9n²/50 versus the function f(n) = 3n² - 25n (asymptotic lower bound, n₀ = 9).
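The same kind of numeric spot-check applies here (a sketch; the sampled range is arbitrary):

```python
# Check the Omega witness for f(n) = 3n^2 - 25n against g(n) = n^2.
def f(n):
    return 3 * n**2 - 25 * n

# f(n) >= (9/50) * n^2 should hold for every n >= 9.
print(all(f(n) >= 9 * n**2 / 50 for n in range(9, 1001)))
```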
Similarly, if for a set of functions f₁(n), f₂(n), …, fₖ(n)
f₁(n) ≥ c₁·g(n) for n ≥ n₁
f₂(n) ≥ c₂·g(n) for n ≥ n₂
…
fₖ(n) ≥ cₖ·g(n) for n ≥ nₖ
where c₁, c₂, …, cₖ are constants and n₁, n₂, …, nₖ are positive integers, then the functions f₁(n), f₂(n), …, fₖ(n) are said to belong to the class Ω(g(n)). In set notation, the relation is denoted by
Ω(g(n)) = { f₁(n), f₂(n), f₃(n), …, fₖ(n) }
Alternatively, using set-builder notation,
Ω(g(n)) = { f(n): there exist a positive constant c and a positive integer n₀ such that f(n) ≥ c·g(n) for all n ≥ n₀ }
θ-Notation
Definition
If f(n) is the running time of an algorithm, and g(n) is some standard growth function such that for some positive constants c₁, c₂ and positive integer n₀,
c₁·g(n) ≤ f(n) ≤ c₂·g(n) for all n ≥ n₀,
then f(n) = θ(g(n)).
The behavior of f(n) and g(n) is portrayed in the graph. It follows that for n < n₀, f(n) may lie above or below g(n), but for all n ≥ n₀, f(n) falls consistently between c₁·g(n) and c₂·g(n). It also implies that g(n) grows as fast as f(n). The function g(n) is said to be an asymptotic tight bound for f(n).
In set-builder notation,
θ(g(n)) = { f(n): there exist positive constants c₁, c₂ and a positive integer n₀ such that c₁·g(n) ≤ f(n) ≤ c₂·g(n) for all n ≥ n₀ }
Example: We show that 5n² - 19n = θ(n²).
Upper bound: 5n² - 19n ≤ 5n² for all n ≥ 1, so c₂ = 5.
Lower bound: for n ≥ 5, 5n - 19 ≥ 25n/38 (since 5n - 19 - 25n/38 = 165n/38 - 19 ≥ 0 when n ≥ 722/165 ≈ 4.4). Multiplying both sides by n:
5n² - 19n ≥ 25n²/38 for n ≥ 5
Hence 25n²/38 ≤ 5n² - 19n ≤ 5n² for n ≥ 5, and 5n² - 19n = θ(n²) with c₁ = 25/38, c₂ = 5, n₀ = 5.
Figure: growth of the functions c₂·g(n) = 5n² (asymptotic upper bound) and c₁·g(n) = 25n²/38 (asymptotic lower bound) versus the function f(n) = 5n² - 19n.
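The two-sided bound can be spot-checked numerically; a sketch over an arbitrary sample range:

```python
# Check the Theta witnesses for f(n) = 5n^2 - 19n against g(n) = n^2.
def f(n):
    return 5 * n**2 - 19 * n

c1, c2, n0 = 25 / 38, 5, 5
print(all(c1 * n**2 <= f(n) <= c2 * n**2 for n in range(n0, 1001)))
# At n = 4, f(4) = 4 but c1 * 16 ≈ 10.5, so the lower bound needs n0 = 5.
print(c1 * 4**2 <= f(4))
```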
Asymptotic Notation
Constant Running Time
If the running time T(n) = c is a constant, i.e., independent of the input size, then by convention the asymptotic behavior is denoted by
O(c) = O(1), θ(c) = θ(1), Ω(c) = Ω(1)
The convention implies that the running time of an algorithm which does not depend on the size of the input can be expressed in any of the above ways.
The above results imply that in asymptotic notation the multiplicative constants in an expression for the running time can be dropped.
For example, with g(n) = n², the functions below fall into the following classes:
O(n²) = { √n, n+5, lg n+4n, n^1.5+n, √n+5n², n²+5n, lg n+4n², n^1.5+3n² }
Ω(n²) = { √n+5n², n²+5n, lg n+4n², n^1.5+3n², 5n²+n³, n³+n²+n, lg n+4n⁴, n·lg n+3n⁴ }
θ(n²) = O(n²) ∩ Ω(n²) = { √n+5n², n²+5n, lg n+4n², n^1.5+3n² }
Proof (for the case of two functions): By definition,
f₁(n) ≤ c₁·g₁(n) for n ≥ n₁
f₂(n) ≤ c₂·g₂(n) for n ≥ n₂
Let n₀ = max(n₁, n₂) and c₃ = max(c₁, c₂). Then
f₁(n) ≤ c₃·g₁(n) for n ≥ n₀
f₂(n) ≤ c₃·g₂(n) for n ≥ n₀
Adding the two inequalities,
f₁(n) + f₂(n) ≤ c₃·g₁(n) + c₃·g₂(n) for n ≥ n₀
Let h(n) = max(g₁(n), g₂(n)). Then
f₁(n) + f₂(n) ≤ 2c₃·h(n) = c·h(n) for n ≥ n₀, where c = 2c₃
Therefore,
f₁(n) + f₂(n) = O(max(g₁(n), g₂(n)))
The theorem also applies to the θ and Ω notations.
Asymptotic Analysis-1/IIU 2008/Dr.A.Sattar/27
Asymptotic Notations
General Theorem
Theorem: If f₁(n) = O(g₁(n)), f₂(n) = O(g₂(n)), f₃(n) = O(g₃(n)), …, fₖ(n) = O(gₖ(n)), then
f₁(n) + f₂(n) + f₃(n) + … + fₖ(n) = O(max(g₁(n), g₂(n), g₃(n), …, gₖ(n)))
where max means the fastest growing function. The theorem can be proved by using the basic definition of Big-Oh.
It follows from the theorem that in an expression consisting of a sum of several functions, the comparatively slower growing functions can be discarded in favor of the fastest growing one to obtain the Big-Oh notation for the whole expression. This is also true for the θ and Ω notations.
Thus,
f(n) = n + √n + n^1.5 + lg n + n·lg n + n²
= O(max(n, √n, n^1.5, lg n, n·lg n, n²))
= O(n²)
since the function n² grows faster than all other functions in the expression.
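The discard rule can be illustrated numerically; a sketch, where c = 6 is one convenient witness (each of the six terms is at most n² for n ≥ 2), not the smallest possible constant:

```python
import math

# f(n) = n + sqrt(n) + n^1.5 + lg n + n*lg n + n^2
def f(n):
    return (n + math.sqrt(n) + n**1.5 + math.log2(n)
            + n * math.log2(n) + n**2)

# Each term is <= n^2 for n >= 2, so f(n) <= 6 * n^2 there.
print(all(f(n) <= 6 * n**2 for n in range(2, 1001)))
# The ratio f(n)/n^2 tends to 1, showing that n^2 dominates the sum.
print(f(10**6) / (10**6)**2)
```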