Handout 1 Complexity Part 1

The document provides an overview of algorithm complexity analysis, focusing on estimating program run times, comparing algorithm efficiency, and understanding time complexity. It discusses methods for calculating time complexity, including the use of computational complexity and asymptotic analysis, while introducing concepts such as Big-O notation. Additionally, it highlights the importance of ignoring constants and lower-order terms when approximating time complexity for large input sizes.

Uploaded by Jasia Nisar

Complexity Analysis

REVIEW
Text is mainly from:
• Chapter 2 of Mark Allen Weiss's book
• Chapter 2 of Adam Drozdek's book

Page:1
Algorithm Complexity
Design and Analysis of Algorithms, Chapter 2.1

Purpose of Algorithm Analysis

• To estimate how long a program will run.
• To estimate the largest input that can reasonably be given to the program.
• To compare the efficiency of different algorithms.
• To help focus on the parts of code that are executed the largest number of times.
• To choose an algorithm for an application.

Cost of an Algorithm
• How do we compute time complexity?
  – We measure the cost of an algorithm in terms of time and space.
  – Time is usually considered more important than space.
  – For now we will concentrate only on time.
How to compute time complexity?
• Idea 1: Code the two algorithms and note the time they take.

• Issues: The machine must be the same and the language must be the same,
  and we would need to write the programs before we know the time complexity.
  – Not a reasonable approach.

• Conclusion: The comparison should be independent of hardware, compiler,
  and operating system.

• To evaluate an algorithm's efficiency, real-time units such as
  microseconds and nanoseconds should not be used.

How to compute time complexity?
• Idea 2: Compute the time complexity without programming (executing) the
  algorithm.

• How?
• We need logical units that express the relationship between the size n of
  the input and the time t required to process the data.

• To compare the efficiency of algorithms, we use a measure of the degree of
  difficulty of an algorithm called computational complexity.

Complexity function T(n)
• Estimate the performance of an algorithm through
  – the number of operations required to process an input of a certain size

• This requires a function expressing the relation between n and t, called
  the time complexity function T(n).
  – To calculate T(n) we need to compute the total number of program steps …
    • a step can be an executable statement or a meaningful program
      segment (e.g. a comparison or assignment statement)
The Execution Time of Algorithms

• Each operation in an algorithm (or a program) has a cost.
  – Each operation takes a certain amount of time.
• To keep things simple, let us assume each operation (comparison,
  assignment, +, *, /, etc.) takes 1 logical unit of time.
• count = count + 1;
  – takes a certain amount of time, but that amount is constant

A sequence of operations:
count = count + 1;    1 unit of time
sum = sum + count;    1 unit of time
➔ Total Cost = 2

Analysis of Loop
Algo                  Cost (number of times it is executed)

i = 0;                1
sum = 0;              1
while (i < N) {       N+1
    sum++;            N
    i++;              N
}

T(N) = 1 + 1 + (N+1) + N + N
T(N) = 3N + 3

The condition always executes one more time than the loop itself.
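This count can be checked mechanically. The following C++ sketch (not from the handout; `count_ops` is a hypothetical helper) instruments the loop above, charging one logical time unit per assignment, comparison, and increment:

```cpp
#include <cassert>

// Charge 1 logical unit for each assignment, comparison, and increment,
// exactly as in the cost table above.
long long count_ops(long long N) {
    long long ops = 0;
    long long i = 0;   ops++;     // i = 0;   costs 1
    long long sum = 0; ops++;     // sum = 0; costs 1
    while (ops++, i < N) {        // i < N    tested N+1 times
        sum++; ops++;             // sum++;   runs N times
        i++;   ops++;             // i++;     runs N times
    }
    return ops;                   // total: 3N + 3
}
```

For N = 10 this returns 33 = 3·10 + 3; the final, failed test of the condition accounts for the "+1" in N+1.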
Analysis of Loop
• 2 important things to note:
  – What is the effect of the constant in T(N) = 3N + 3?
  – How does T(N) vary for different N (input)?
• Check for N = 10, 20, 30, 40, 50, 100, 150, 200, 5000, 10000, 15000, 20000

N        3N+3     3N+300
10       33       330
20       63       360
30       93       390
40       123      420
50       153      450
100      303      600
150      453      750
200      603      900
5000     15003    15300
10000    30003    30300
15000    45003    45300
20000    60003    60300

[Chart: T(N) = 3N + 3 plotted against N, a straight line]
NESTED Loop
Example: Nested Loop
                         Cost   Times
i = 1;                   1      1
sum = 0;                 1      1
while (i <= n) {         1      n+1
    j = 1;               1      n
    while (j <= n) {     1      n*(n+1)
        sum += i;        1      n*n
        j++;             1      n*n
    }
    i++;                 1      n
}

T(N) = 1 + 1 + (n+1) + n + n*(n+1) + n*n + n*n + n = 3n² + 4n + 3
➔ The time required for this algorithm is proportional to n²
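Summing the "Times" column term by term gives 3n² + 4n + 3: the rows n+1, j = 1, n·(n+1), and i++ each contribute an n term, for 4n in total. An instrumented C++ sketch (not from the handout; `nested_ops` is a hypothetical helper) confirms this:

```cpp
#include <cassert>

// Instrumented version of the nested loop: 1 logical unit per
// assignment, comparison, and increment.
long long nested_ops(long long n) {
    long long ops = 0;
    long long i = 1;   ops++;         // 1
    long long sum = 0; ops++;         // 1
    while (ops++, i <= n) {           // n+1
        long long j = 1; ops++;       // n
        while (ops++, j <= n) {       // n*(n+1)
            sum += i; ops++;          // n*n
            j++;      ops++;          // n*n
        }
        i++; ops++;                   // n
    }
    return ops;                       // 3n^2 + 4n + 3
}
```

Either way, the n² terms dominate, so the running time is proportional to n².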

Partial SUM
                                              Cost
int sum( int n )                              0
{
    int partialSum;                           0

/* 1 */   partialSum = 0;                     1
/* 2 */   for( int i = 1; i <= n; ++i )       i = 1 costs 1, i <= n costs N+1, ++i costs N
/* 3 */       partialSum += i * i * i;        4 ops (1 assignment, 2 multiplications,
                                              1 addition), so 4N in total for line 3
/* 4 */   return partialSum;                  1
}

T(N) = 6N + 4
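A runnable version of this routine (a sketch; the long long type and the helper name `sum_cubes` are mine, to avoid overflow for larger n):

```cpp
#include <cassert>

// The partial-sum routine from the slide: returns 1^3 + 2^3 + ... + n^3.
// Its step count is T(N) = 6N + 4 as derived above.
long long sum_cubes(int n) {
    long long partialSum = 0;                 // 1 unit
    for (int i = 1; i <= n; ++i)              // 1 + (N+1) + N units
        partialSum += (long long)i * i * i;   // 4 units per iteration
    return partialSum;                        // 1 unit
}
```

A quick sanity check on the result: the sum of the first n cubes always equals (n(n+1)/2)².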

• If we had to perform all this work every time we needed to analyze a
  program, the task would quickly become infeasible.
Ignore Constants and Low-order Terms
• If T(n) = 7n + 100
• What is T(n) for different values of n?

n        T(n)       Comment
1        107        Contributing factor is 100
5        135        Contributing factors are 7n and 100
10       170        Contributing factors are 7n and 100
100      800        Contribution of 100 is small
1000     7100       Contributing factor is 7n
10000    70100      Contributing factor is 7n
10⁶      7000100    What is the contributing factor?

When approximating T(n) we can IGNORE the 100 term for very large values of n
and say that T(n) can be approximated by 7n.
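The table's trend can be stated numerically. This small sketch (not from the handout; the helper name is mine) returns the fraction of T(n) = 7n + 100 contributed by the constant term:

```cpp
#include <cassert>

// Fraction of T(n) = 7n + 100 that comes from the constant 100.
double constant_share(long long n) {
    return 100.0 / (7.0 * n + 100.0);
}
```

At n = 1 the constant contributes over 93% of T(n); at n = 10⁶ it contributes under 0.002%, which is why it can be ignored for large n.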
Example 2
• T(n) = n² + 100n + log₁₀n + 1000

n      T(n)             n²                  100n             log₁₀n         1000
       Val      %       Val       %         Val     %        Val   %        Val    %
1      1101             1         0.1%      100     9.1%     0     0.0%     1000   90.8%
10     2101             100       4.76%     1000    47.6%    1     0.05%    1000   47.6%
100    21002            10000     47.6%     10000   47.6%    2     0.01%    1000   4.76%
10⁵    10,010,001,005   10¹⁰      99.9%     10⁷     0.099%   5     0.0%     1000   0.0%

When approximating T(n) we can IGNORE the last 3 terms and say that T(n) can be
approximated by n².
Problems with T(n)
• T(n) is difficult to calculate.
• T(n) is also not very meaningful, as the step size is not exactly defined.
• T(n) is usually very complicated, so we need an approximation of T(n)
  … something close to T(n).
• This measure of efficiency, or approximation of T(n), is called
  ASYMPTOTIC COMPLEXITY or ASYMPTOTIC ALGORITHM ANALYSIS.

Asymptotic complexity studies the efficiency of an algorithm as the input
size becomes large.
Big-Oh or Big-O
f(n) is O(g(n)) if there exist positive numbers c and n₀ such that
f(n) <= c·g(n) for all n >= n₀.

g(n) is called an upper bound on f(n), OR
f(n) grows at most as fast as g(n).

Example:
T(n) = n² + 3n + 4
n² + 3n + 4 <= 2n² for all n >= 10
What is f(n) and what is g(n)?
What are c and n₀?

So we can say that T(n) is O(n²), OR
T(n) is in the order of n², OR
T(n) is bounded above by a positive real multiple of n².
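The claimed inequality can be spot-checked by brute force over a finite range. This C++ sketch (a check over a range, not a proof; `bound_holds` is a hypothetical helper) tests a candidate pair (c, n₀):

```cpp
#include <cassert>

// Check f(n) = n^2 + 3n + 4 <= c * n^2 for every n in [n0, n_max].
bool bound_holds(long long c, long long n0, long long n_max) {
    for (long long n = n0; n <= n_max; ++n)
        if (n * n + 3 * n + 4 > c * n * n)
            return false;
    return true;
}
```

With c = 2 the inequality holds for every n >= 4 but fails at n = 3 (22 > 18), so the pair (c, n₀) = (2, 10) used above is one valid choice among many.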

How to choose c and n₀
Example:
T(n) = n² + 3n + 4
n² + 3n + 4 <= c·n² for all n > n₀ ?

What are c and n₀?
Put n₀ = 1, 2, 3, … into this inequality to get a value of c for that n₀.

One inequality with two unknowns: infinitely many different pairs of
constants c and n₀ can be determined for the same function g(n) = n².
More Examples
• Show that 3n + 3 is O(n).
  – Find c, n₀ such that 3n + 3 <= c·n for all n >= n₀.
  • 3 + 3/n <= c …
  • n₀ = 1 gives c >= 6
  • n₀ = 2 gives c >= 4.5, and so on

• Show that 2n + 2 is O(n).
  – n₀ = 1, c >= 4
  – c = 3, n₀ = 2
How to choose c and n₀
• The point is that f grows no faster than g.
• g, once multiplied by the constant c, is greater than or equal to f for
  all sufficiently large n.
ALGORITHM ANALYSIS OF
DIFFERENT CODES

Nested For Loops
                              Rough estimate
for (i = 0; i < N; i++)       N+1
    arr[i][i] = 0;            N
sum = 0;                      1
for (i = 0; i < N; i++)       N+1
    for (j = 0; j < N; j++)   N(N+1)
        sum += arr[i][j];     N²
------------
O(N) + O(N²) = O(N²)
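A runnable rendering of this code (a sketch; the array contents and the helper name `body_runs` are mine) that counts how many times each loop body executes:

```cpp
#include <cassert>
#include <vector>

// Count loop-body executions: the first loop's body runs N times,
// the nested loops' body runs N*N times.
long long body_runs(int N) {
    std::vector<std::vector<int>> arr(N, std::vector<int>(N, 1));
    long long ops = 0;
    for (int i = 0; i < N; i++) {      // N iterations
        arr[i][i] = 0;
        ops++;
    }
    long long sum = 0;
    for (int i = 0; i < N; i++)        // N iterations
        for (int j = 0; j < N; j++) {  // N iterations each -> N*N total
            sum += arr[i][j];
            ops++;
        }
    return ops;                        // N + N*N
}
```

For N = 10 the total is 10 + 100 = 110 body executions: the N² term from the nested loops dominates.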
Example 1 (arithmetic series)

                                                                      Cost
i = 0;                                                                1
while (i < n) {                                                       n+1
    sum = a[0];                                                       n
    j = 1;                                                            n
    while (j <= i) {                                                  ??
        sum += a[j];                                                  ??
        j++;                                                          ??
    }
    cout << "sum of subarray 0 to " << i << " is " << sum << endl;    n
    i++;                                                              n
}

The inner loop body runs i times for each value of i:

i      j             # of times the j loop body runs
0      -             0
1      1             1
2      1, 2          2
3      1, 2, 3       3
…
n-1    1, 2, …, n-1  n-1

Number of times sum += a[j] runs (cost of that step is 1):
Σ_{i=1}^{n-1} Σ_{j=1}^{i} 1 = Σ_{i=1}^{n-1} i = n(n-1)/2 = O(n²)

This is an arithmetic series. Summing up everything we get
O(n²) + O(n) = O(n²)
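The triangular pattern in the table can be confirmed directly. This sketch (the helper name `inner_runs` is mine) counts executions of the innermost statement:

```cpp
#include <cassert>

// The innermost statement runs i times for each outer value i = 0..n-1,
// so the total is 0 + 1 + ... + (n-1) = n(n-1)/2.
long long inner_runs(int n) {
    long long runs = 0;
    for (int i = 0; i < n; ++i)
        for (int j = 1; j <= i; ++j)
            ++runs;
    return runs;   // n(n-1)/2, hence O(n^2)
}
```

For n = 5 this gives 0 + 1 + 2 + 3 + 4 = 10 = 5·4/2, matching the arithmetic-series formula.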
