
Lecture 1 Asymptotic Notations

This document discusses time complexity analysis of algorithms using asymptotic notations like Big-O, Omega, and Theta. It provides examples to illustrate the definitions and comparisons between different asymptotic functions like n, n^2, n^3, sqrt(n), log n, n log n, etc. It arranges these functions in ascending order based on their growth rates and proves certain relationships between them like n = o(n^2), log n = o(n), n = o(n log n), etc. Finally, it provides assignments to further practice asymptotic analysis using limits and definitions of the notations.


MCA 202: Discrete Structures

Instructor
Neelima Gupta
[email protected]
Table Of Contents

Growth Functions
Introduction
• For each problem to be solved, we may have multiple algorithms as solutions; however, the best must be chosen.
• Comparison of the algorithms depends mainly on:
– Time Complexity: the time the algorithm takes to run.
– Memory Space: the memory the algorithm requires.



• Let us consider two algorithms for the same problem that run in f1(n) and f2(n) time respectively, 'n' being the input size (memory words required for the inputs).
• Consider the graph of f1(n) and f2(n). For smaller values (n < n0) we don't care; however, for larger values of n (n ≥ n0) we observe that f1(n) ≤ f2(n), i.e., f1(n) grows more slowly than f2(n). Hence, we should prefer f1(n).

[Figure: curves of f1(n) and f2(n); beyond n0, f1(n) stays below f2(n)]
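To make the picture concrete, here is a minimal Python sketch (the running times f1(n) = 100n and f2(n) = n^2 are illustrative assumptions, not from the slides) that finds the crossover point n0:

# Two hypothetical running-time functions: f1 is linear with a large
# constant, f2 is quadratic. Since f2(n)/f1(n) is increasing, the first
# n with f1(n) <= f2(n) is the crossover point n0.
f1 = lambda n: 100 * n
f2 = lambda n: n * n

n0 = next(n for n in range(1, 10**6) if f1(n) <= f2(n))
print(n0)  # 100: for all n >= 100 the linear algorithm is preferable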



Asymptotic Notations
Big O Notation
• In general, a function
– f(n) is O(g(n)) if there exist positive constants c and n0 such that f(n) ≤ c·g(n) for all n ≥ n0.
• Formally
– O(g(n)) = { f(n) : ∃ positive constants c and n0 such that f(n) ≤ c·g(n) ∀ n ≥ n0 }
• Intuitively, it means f(n) grows no faster than g(n).
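The definition can be sanity-checked numerically. Below is a minimal Python sketch (a finite test over a range of n, so it can refute a bad witness pair (c, n0) but never prove the bound for all n; the helper name and the demo functions are illustrative assumptions, not from the slides):

def holds_big_o(f, g, c, n0, n_max=10**5):
    # Check f(n) <= c * g(n) for every integer n0 <= n <= n_max.
    return all(f(n) <= c * g(n) for n in range(n0, n_max + 1))

# Demo: f(n) = 5n + 3 is O(n), witnessed by c = 6 and n0 = 3,
# since 5n + 3 <= 6n whenever n >= 3.
print(holds_big_o(lambda n: 5 * n + 3, lambda n: n, c=6, n0=3))  # True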



Examples:
Q: f(n) = n^2, g(n) = n^2 − n. Is f(n) = O(g(n))?

Sol 1 (by hit and trial):

Let c = 1.
Claim: n^2 ≤ 1·(n^2 − n)
Proof:
To show: n^2 ≤ n(n − 1),
i.e. n ≤ n − 1, which is not true for any value of n.

Let c = 2.
Claim: n^2 ≤ 2(n^2 − n)
Proof:
To show: n^2 ≤ 2n^2 − 2n,
i.e. 2n ≤ n^2, which is true for n ≥ 2.
Hence, f(n) = O(g(n)) (with c = 2, n0 = 2).



Sol 2:
Let c = 2. Then
2·g(n) = 2(n^2 − n) = n^2 + (n^2 − 2n)
≥ n^2, for n^2 − 2n ≥ 0, i.e. n ≥ 2
= f(n)
Hence, f(n) = O(g(n)).

• If the powers and coefficients of the leading terms of f(n) and g(n) are the same, then for simplicity 'c' can be taken as:
c = (number of negative terms) + 1



Q: f(n) = n^2, g(n) = n^2 − n. Is g(n) = O(f(n))?

Sol: g(n) = n^2 − n
≤ n^2 for all n ≥ 1
= f(n)
i.e. g(n) ≤ f(n) for all n ≥ 1.

Hence, g(n) = O(f(n)).



Q: f(n) = n^3, g(n) = n^3 − n^2 − n. Is f(n) = O(g(n))?

Sol: Let c = 3 (two negative terms, so c = 2 + 1 = 3).

3·g(n) = 3(n^3 − n^2 − n)
= n^3 + (n^3 − 3n^2) + (n^3 − 3n)
≥ n^3, for (n^3 − 3n^2) ≥ 0 and (n^3 − 3n) ≥ 0,
i.e. for n ≥ 3 and n ≥ √3
= f(n), for n ≥ n0, where n0 = max{3, √3} = 3.
Hence, f(n) = O(g(n)).
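As a quick numeric sanity check of the witness c = 3, n0 = 3 (a finite test, not a proof):

# Verify n^3 <= 3 * (n^3 - n^2 - n) for a range of n >= 3.
print(all(n**3 <= 3 * (n**3 - n**2 - n) for n in range(3, 10**5)))  # True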



Q: f(n) = n^3, g(n) = n^3 − n^2 − n. Is g(n) = O(f(n))?

Sol: g(n) = n^3 − n^2 − n
≤ n^3 for all n ≥ 1
= f(n)
i.e. g(n) ≤ f(n) for all n ≥ 1.

Hence, g(n) = O(f(n)).



Omega Notation
• In general, a function
– f(n) is Ω(g(n)) if ∃ positive constants c and n0 such that 0 ≤ c·g(n) ≤ f(n) ∀ n ≥ n0.
• Intuitively, it means f(n) grows at least as fast as g(n).

[Figure: f(n) lies above c·g(n) for all n ≥ n0]



Theta Notation
• A function f(n) is Θ(g(n)) if ∃ positive constants c1, c2 and n0 such that
c1·g(n) ≤ f(n) ≤ c2·g(n) ∀ n ≥ n0.
If the lower bound holds for n ≥ n1 and the upper bound for n ≥ n2, take n0 = max{n1, n2}.

[Figure: f(n) sandwiched between c1·g(n) and c2·g(n); the bounds take effect at n1 and n2]



f(n)    g(n)                              Relation          c
n^3     n^3 + n^2 − 10n + 5               f(n) = O(g(n))    c ≥ 1
n^3     n^3 − 2n^2 + 100n                 f(n) = O(g(n))    c > 1
n^5     n^5 + 4n^4 − 3n^3 + 2n^2 − n + 1  f(n) = Ω(g(n))    c < 1
n^5     n^5 − 4n^4 + 3n^3 − 2n^2 + n − 1  f(n) = Ω(g(n))    c ≤ 1
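A hedged spot-check of the third row (the onset n ≥ 4 is an observation from this finite test, not from the slides; integer arithmetic avoids float overflow):

# Row 3: n^5 = Omega(g(n)) with c = 1/2, i.e. g(n) <= 2 * n^5.
g = lambda n: n**5 + 4*n**4 - 3*n**3 + 2*n**2 - n + 1
print(all(g(n) <= 2 * n**5 for n in range(4, 10**4)))  # True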



Assignment 0: Relations Between Θ, Ω, O
For any two functions f(n) and g(n),

f(n) = Θ(g(n)) iff

f(n) = O(g(n)) and f(n) = Ω(g(n)).


Assignment No 1
• Self study
a^0 + a^1 + … + a^n = (a^(n+1) − 1)/(a − 1) for all a ≠ 1
– What is the sum for a = 2/3 as n → ∞? Is it O(1)? Is it big or small?
– For a = 2, is the sum O(2^n)? Is it big or small?

Q1: Show that a polynomial of degree k (with positive leading coefficient) is Θ(n^k).
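A small numeric illustration of the self-study items (a sketch; the helper function is mine, not from the slides):

# Partial sums of a^0 + a^1 + ... + a^n for the two cases above.
def geometric_sum(a, n):
    return sum(a**i for i in range(n + 1))

# a = 2/3: the partial sums approach 1 / (1 - 2/3) = 3, so the sum is O(1).
print([round(geometric_sum(2/3, n), 4) for n in (5, 10, 50)])  # -> 2.7366, 2.9653, 3.0

# a = 2: the sum equals 2^(n+1) - 1 < 2 * 2^n, so it is O(2^n).
print(geometric_sum(2, 10), 2**11 - 1)  # 2047 2047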


Other Asymptotic Notations
• A function f(n) is o(g(n)) if for every positive constant c, there exists a constant n0 > 0 such that f(n) < c·g(n) ∀ n ≥ n0.
• A function f(n) is ω(g(n)) if for every positive constant c, there exists a constant n0 > 0 such that c·g(n) < f(n) ∀ n ≥ n0.
• Intuitively,
– o() is like <    – O() is like ≤
– ω() is like >    – Ω() is like ≥
– Θ() is like =
Arrange some functions
• f(n) = O(g(n)) ⇒ f(n) = o(g(n))?
• Is the converse true?
• Let us arrange the following functions in ascending order of growth rate (assume log n = o(n) is known); a numeric sketch follows this list:
– n, n^2, n^3, sqrt(n), n^epsilon, log n, log^2 n, n log n, n/log n, 2^n, 3^n
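A rough numeric sketch of the ordering (evaluation at a single point, which suggests but does not prove it; epsilon = 0.5 is an illustrative choice, and the logs of the functions are compared so that 2^n and 3^n stay within floating-point range):

import math

# Sorting by log f(n) is the same as sorting by f(n).
n = 2.0**20
log_n = math.log(n)
logf = {
    "log n": math.log(log_n),
    "log^2 n": 2 * math.log(log_n),
    "sqrt(n) (= n^eps, eps=0.5)": 0.5 * log_n,
    "n/log n": log_n - math.log(log_n),
    "n": log_n,
    "n log n": log_n + math.log(log_n),
    "n^2": 2 * log_n,
    "n^3": 3 * log_n,
    "2^n": n * math.log(2),
    "3^n": n * math.log(3),
}
for name, v in sorted(logf.items(), key=lambda kv: kv[1]):
    print(f"{name:28s}{v:14.2f}")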
Relation between n and n^2

Intuitively, n appears to be smaller than n^2.

Let's prove it now.
To prove: for any constant c > 0,
n < c·n^2
i.e. 1 < c·n
i.e. n > 1/c.
Hence, n < c·n^2 for n > 1/c,
i.e. n = o(n^2).

We can also write this as n < n^2.
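The same fact via the limit criterion of Assignment 3 below (a sketch assuming sympy is available; any CAS or a manual limit works just as well):

import sympy as sp

n = sp.symbols('n', positive=True)
# lim n / n^2 = 0 as n -> oo, hence n = o(n^2).
print(sp.limit(n / n**2, n, sp.oo))  # 0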


Relation between n^2 and n^3
Intuitively, n^2 appears to be smaller than n^3.
Let's prove it now.
To prove: for any constant c > 0,
n^2 < c·n^3
i.e. 1 < c·n
i.e. n > 1/c.
Hence, n^2 < c·n^3 for n > 1/c,
i.e. n^2 = o(n^3).

We can also write this as n^2 < n^3.


Relation between n and n^(1/2)
Since n = o(n^2), we have:
for every constant c > 0, there exists n_c such that
n < c·n^2 for all n ≥ n_c.

Substituting n^(1/2) for n: n^(1/2) < c·n for all n^(1/2) ≥ n_c,

i.e. n^(1/2) < c·n for all n ≥ n_c^2.

Thus,
n^(1/2) = o(n),
and we can also write this as n^(1/2) < n. Combining with the previous results:
n^(1/2) < n < n^2 < n^3
Relation between n and log n

• For the time being we assume the result
log n = o(n), i.e. log n < n;
we will prove it later.
Relation between n^(1/2) and log n
Assume log n = o(n).

Let c > 0 be any constant.

For c/2 > 0 there exists m > 0 such that
log n < (c/2)·n for n > m.

Changing variables from n to n^(1/2) we get

log(n^(1/2)) < (c/2)·n^(1/2) for n^(1/2) > m,
i.e. (1/2)·log n < (c/2)·n^(1/2) for n > m^2.
Contd.
Let m^2 = k. Then
log n < c·n^(1/2) for n > k.
Since c > 0 was chosen arbitrarily,
log n = o(n^(1/2)),
or log n < n^(1/2).
Combining the results we get
log n < n^(1/2) < n < n^2 < n^3
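A quick numeric look at the ratio log n / n^(1/2), which shrinks toward 0, consistent with log n = o(n^(1/2)) (illustrative values only):

import math

for n in (10**2, 10**4, 10**6, 10**8):
    print(n, math.log(n) / math.sqrt(n))
# ratios: ~0.46, ~0.09, ~0.014, ~0.0018 -> tending to 0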
Relation between n^2 and n log n
• Since log n = o(n),
for c > 0, ∃ n0 > 0 such that ∀ n ≥ n0 we have
log n < c·n.
Multiplying both sides by n we get
n log n < c·n^2 ∀ n ≥ n0
⇒ n log n = o(n^2)
⇒ n log n < n^2
Relation between n and n log n
Solution: let c > 0 be any constant. Then
n < c·n·log n
⟺ 1 < c·log n
⟺ log n > 1/c
⟺ n > e^(1/c) (taking log to be the natural log),
i.e. n < c·n·log n for n > e^(1/c).
Since c was chosen arbitrarily,
∴ n = o(n log n), or n < n log n.

Combining the results we get

log n < n^(1/2) < n < n log n < n^2 < n^3
Relation between n and n/log n
• We know that n = o(n log n):
for c > 0, ∃ n0 > 0 such that ∀ n ≥ n0 we have
n < c·n·log n.
Dividing both sides by log n we get
n/log n < c·n ∀ n ≥ n0
⇒ n/log n = o(n)
i.e. n/log n < n
Assignment No 2
• Show that log^M n = o(n^epsilon) for all constants M > 0 and epsilon > 0. Assume that log n = o(n). Also prove the following corollary: log n = o(n/log n).
• Show that n^epsilon = o(n/log n) for every 0 < epsilon < 1.
Hence we have,

log n < n^(1/2) < n/log n < n < n log n < n^2 < n^3
Assignment No 3
• Show that
– lim_{n→∞} f(n)/g(n) = 0 ⇒ f(n) = o(g(n)).
– lim_{n→∞} f(n)/g(n) = c, where c is a positive constant, ⇒ f(n) = Θ(g(n)).
• Show that log n = o(n).
• Show that n^k = o(2^n) for every positive constant k.
• Show by the definition of 'small o' that a^n = o(b^n) whenever a < b, where a and b are positive constants.
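These limit tests are easy to experiment with; a minimal sketch, assuming sympy is available (the particular functions chosen are illustrative):

import sympy as sp

n = sp.symbols('n', positive=True)
print(sp.limit(sp.log(n) / n, n, sp.oo))           # 0  => log n = o(n)
print(sp.limit(n**3 / 2**n, n, sp.oo))             # 0  => n^k = o(2^n), here k = 3
print(sp.limit((3*n**2 + 5*n) / n**2, n, sp.oo))   # 3  => 3n^2 + 5n = Theta(n^2)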

Hence we have,

log n < n^(1/2) < n/log n < n < n log n < n^2 < n^3 < 2^n < 3^n
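For the last links in the chain, 2^n = o(3^n) because the ratio (2/3)^n tends to 0 (cf. a = 2/3 in Assignment 1); a one-line check, again assuming sympy:

import sympy as sp

n = sp.symbols('n', positive=True)
print(sp.limit(sp.Rational(2, 3)**n, n, sp.oo))  # 0, hence 2^n = o(3^n)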
