Lecture 3.1

The document discusses algorithm efficiency and complexity analysis, emphasizing the importance of measuring time and memory resources consumed by algorithms. It introduces concepts such as asymptotic complexity, Big O notation, and various orders of growth, illustrating how to analyze running time through examples. Additionally, it highlights the significance of algorithm efficiency over hardware performance in computational tasks.


Program Efficiency

&
Complexity Analysis

Mr. Shoaib Khan


Algorithm Review
• An algorithm is a definite procedure for solving a problem in a finite number of steps.

• An algorithm is a well-defined computational procedure that takes some value(s) as input and produces some value(s) as output.

• An algorithm is a finite sequence of computational statements that transform the input into the output.

• An algorithm is:
– a finite sequence of instructions,
– each instruction having a clear meaning,
– each instruction requiring a finite amount of effort,
– each instruction requiring finite time to complete.
Good Algorithms?
• Run in less time
• Consume less memory

But computation time (time complexity) is usually the more important of the two.
Measuring Efficiency
• The efficiency of an algorithm is a measure of the
amount of resources consumed in solving a
problem of size n.
• The resource we are most interested in is time
• We can use the same techniques to analyze the
consumption of other resources, such as memory
space.

• How do we measure the efficiency of an algorithm?

Ways of measuring efficiency:
• Run the program and see how long it takes
• Run the program and see how much memory it uses

• Lots of variables to control:
– What is the input data?
– What is the hardware platform?
– What is the programming language/compiler?

Just because one program is faster than another right now, does that mean it will always be faster?
Want to achieve platform-independence

• Use an abstract machine that uses steps of time and units of memory, instead of seconds or bytes:
– each elementary operation takes 1 step,
– each elementary data item occupies 1 unit of memory.
Running Time of an Algorithm
• Running time is measured in terms of the number of steps/primitive operations performed.

• Time generally grows with the size of the input, so the running time of an algorithm is usually measured as a function of input size.

• This measure is independent of the machine and OS.

Simple Example (1)
// Input: int A[N], array of N integers
// Output: Sum of all numbers in array A
int Sum(int A[], int N)
{
    int s = 0;
    for (int i = 0; i < N; i++)
        s = s + A[i];
    return s;
}
How should we analyse this?
Simple Example (2)
// Input: int A[N], array of N integers
// Output: Sum of all numbers in array A
int Sum(int A[], int N) {
    int s = 0;                    // op 1
    for (int i = 0; i < N; i++)   // ops 2, 3, 4
        s = s + A[i];             // ops 5, 6, 7
    return s;                     // op 8
}

Ops 1, 2, 8 execute once; ops 3, 4, 5, 6, 7 execute once per iteration of the for loop, N iterations.
Total: 5N + 3
The complexity function of the algorithm is: f(N) = 5N + 3
Simple Example (3): Growth of 5N+3
Estimated running time for different values of N:

N = 10        =>  53 steps
N = 100       =>  503 steps
N = 1,000     =>  5,003 steps
N = 1,000,000 =>  5,000,003 steps

As N grows, the number of steps grows in linear proportion to N for this function Sum.
What Dominates in the Previous Example?
What about the +3 and the 5 in 5N+3?
– As N gets large, the +3 becomes insignificant.
– The 5 is inaccurate anyway, since different operations require varying amounts of time, so the constant has no real significance.

What is fundamental is that the time is linear in N.

Asymptotic Complexity: as N gets large, concentrate on the highest-order term:
– Drop lower-order terms such as +3.
– Drop the constant coefficient of the highest-order term, i.e. of N.
Asymptotic Complexity
• The 5N+3 time bound is said to "grow asymptotically" like N.
• This gives us an approximation of the complexity of the algorithm.
• It ignores lots of (machine-dependent) details and concentrates on the bigger picture.
Big Oh Notation [1]
If f(N) and g(N) are two complexity functions, we say

f(N) = O(g(N))

(read "f(N) is order g(N)", or "f(N) is big-O of g(N)")

if there are constants c and N0 such that f(N) ≤ c * g(N) for all N > N0, i.e. for all sufficiently large N.


Example (1)
• Consider f(n) = 2n^2 + 3 and g(n) = n^2.
Is f(n) = O(g(n))? i.e., is 2n^2 + 3 = O(n^2)?
Proof: we need constants c and N0 with 2n^2 + 3 ≤ c * n^2 for all n > N0.
– Does N0 = 1 and c = 1 work? No: at n = 2, 11 > 4.
– Does N0 = 1 and c = 2 work? No: 2n^2 + 3 > 2n^2 for every n.
– Does N0 = 1 and c = 3 work? Yes: for every n ≥ 2, 2n^2 + 3 ≤ 3n^2.
• If the bound holds for one pair of N0 and c, then there exists an infinite set of such pairs.
Example (2): Comparing Functions
• Which function is better: 10N^2 or N^3?

[Chart: 10N^2 vs N^3 for N = 1..15; the two curves cross at N = 10, beyond which N^3 is larger.]
Comparing Functions
• As inputs get larger, any algorithm of a
smaller order will be more efficient than an
algorithm of a larger order
[Chart: time (steps) vs input size; 3N = O(N) and 0.05N^2 = O(N^2) cross at N = 60, beyond which the O(N) algorithm is faster.]
100N^2 vs 5N^3: which one is better?

[Chart: 100N^2 and 5N^3 for N = 1..34; the curves cross at N = 20, beyond which 100N^2 is smaller.]
Common Orders of Growth
Let N be the input size, and b and k be constants.
In order of increasing complexity:

O(k) = O(1)             Constant time
O(log_b N) = O(log N)   Logarithmic time
O(N)                    Linear time
O(N log N)
O(N^2)                  Quadratic time
O(N^3)                  Cubic time
...
O(k^N)                  Exponential time
Size does matter[1]
What happens if we double the input size N?

N     log2 N   5N      N log2 N   N^2      2^N
8     3        40      24         64       256
16    4        80      64         256      65,536
32    5        160     160        1,024    ~10^9
64    6        320     384        4,096    ~10^19
128   7        640     896        16,384   ~10^38
256   8        1,280   2,048      65,536   ~10^77
Size does matter[2]
• Suppose a program has run time O(n!) and the run time for n = 10 is 1 second.

For n = 12, the run time is 2 minutes.
For n = 14, the run time is 6 hours.
For n = 16, the run time is 2 months.
For n = 18, the run time is 50 years.
For n = 20, the run time is 200 centuries.
Standard Analysis Techniques
• Constant time statements
• Analyzing Loops
• Analyzing Nested Loops
• Analyzing Sequence of Statements
• Analyzing Conditional Statements
Constant time statements
• Simplest case: O(1) time statements

• Assignment statements of simple data types:
    int x = y;
• Arithmetic operations:
    x = 5 * y + 4 - z;
• Array referencing:
    x = A[j];
• Array element assignment:
    A[j] = 5;
• Most conditional tests:
    if (x < 12) ...
Analyzing Loops[1]
• Any loop has two parts:
– How many iterations are performed?
– How many steps per iteration?
int sum = 0, j;
for (j = 0; j < N; j++)
    sum = sum + j;
– Loop executes N times (j = 0 .. N-1)
– A constant number of steps, 4 = O(1), per iteration
• Total time is N * O(1) = O(N * 1) = O(N)
Analyzing Loops[2]
• What about this for loop?
int sum = 0, j;
for (j = 0; j < 100; j++)
    sum = sum + j;
• Loop executes 100 times
• A constant number of steps, 4 = O(1), per iteration
• Total time is 100 * O(1) = O(100 * 1) = O(100) = O(1)
Analyzing Nested Loops[1]
• Treat just like a single loop and evaluate
each level of nesting as needed:
int j, k;
for (j = 0; j < N; j++)
    for (k = N; k > 0; k--)
        sum += k + j;
• Start with the outer loop:
– How many iterations? N
– How much time per iteration? Need to evaluate the inner loop
• Inner loop uses O(N) time
• Total time is N * O(N) = O(N*N) = O(N^2)
Analyzing Sequence of Statements
• For a sequence of statements, compute their
complexity functions individually and add
them up

for (j = 0; j < N; j++)          |
    for (k = 0; k < j; k++)      |  O(N^2)
        sum = sum + j*k;         |
for (l = 0; l < N; l++)          |  O(N)
    sum = sum - l;               |
cout << "Sum=" << sum;           |  O(1)

Total cost is O(N^2) + O(N) + O(1) = O(N^2)  (SUM RULE)
Analyzing Conditional Statements
What about conditional statements such as

if (condition)
statement1;
else
statement2;
where statement1 runs in O(N) time and statement2 runs in O(N^2) time?

We use "worst case" complexity: among all inputs of size N, what is the maximum running time?

The analysis for the example above is O(N^2).
Fast machine Vs Fast Algorithm

Suppose you get a computer that is 10 times faster, so it can do in 10^3 seconds a job for which the older machine took 10^4 seconds.

Compare the performance of algorithms with time complexities T(n) of n, n^2, and 2^n for different problem sizes on both machines.

Question: is it worth buying a 10 times faster machine?
