
Advanced Algorithms

Time Complexity
Dr. Rubi Quiñones
Computer Science Department
Southern Illinois University Edwardsville
Course Timeline
ALGORITHMIC ANALYSIS
• Introduction to algorithms
• Median finding and order statistics
• Time complexity

GREEDY STRATEGIES
• Activity selection problems
• Water connection problem, Egyptian fraction
• Huffman (de)coding
• Shelves, mice, and policemen problem

DIVIDE AND CONQUER
• Max subarray sum and nearest neighbor
• Newton’s and bisection algorithm
• Matrix multiplication, skyline, and Hanoi

DYNAMIC PROGRAMMING
• Fibonacci and path counting
• Coin row and collecting problem
• Matrix chain multiplication and longest common subsequence
• Knapsack and optimal binary trees
• Floyd-Warshall algorithm and A*

ADVANCED CONCEPTS
• Algorithm intractability
• Randomized algorithms

2
Time Complexity

• Running Time
• Asymptotic Notation
• Big oh
• Big omega
• Big theta
• Performance classification
• Complexity classes
• Standard analysis techniques
• Constant time statements
• Analyzing loops
• Analyzing nested loops
• Analyzing sequence of statements
• Analyzing conditional statements
• Iteration method

3
Review

• Algorithm: a finite set of statements that guarantees an optimal solution in a finite interval of time

• But what makes an algorithm good?

4
Review

• Algorithm: a finite set of statements that guarantees an optimal solution in a finite interval of time

• But what makes an algorithm good?


• Runs in less time
• Consumes less memory
• Uses fewer computational resources (time complexity)

5
Factors

• Hardware
• Operating system
• Compiler
• Size of input
• Nature of input
• Algorithm used

6
Running Time of an Algorithm

• Depends on
• Input size
• Nature of input
• Generally, time grows with size of input, so running time of an algorithm is usually measured as a function
of input size

7
Running Time of an Algorithm

• Depends on
• Input size
• Nature of input
• Generally, time grows with size of input, so running time of an algorithm is usually measured as a function
of input size
• Running time is measured in terms of number of steps/primitive operations performed

8
Finding Running Time of an Algorithm

• Running time is measured by number of steps/primitive operations performed


• Steps are elementary operations such as +, *, <, =, A[i], etc.
• We will measure the number of steps taken in terms of the size of the input

9
Example 1

How many steps does this algorithm take in relation to the input size? I.e., what is the complexity function?
Remember to take each operation into consideration.

10
Example 1
How many steps does this algorithm take?

[Annotated code figure: operations 1, 2, and 8 execute once; operations 3 through 7 execute once per iteration of the for loop, i.e. N times.]

Total: 5N + 3

The complexity function of the algorithm is f(N) = 5N + 3

11
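The slide's code figure isn't reproduced here, so the following is only a minimal C sketch consistent with the 5N+3 count; the function name, parameters, and body are assumptions:

int sum(int A[], int N) {
    int s = 0;                      /* executed once */
    for (int i = 0; i < N; i++) {   /* i = 0 executed once; i < N and i++ once per iteration */
        s = s + A[i];               /* A[i], +, and = : three steps per iteration */
    }
    return s;                       /* executed once */
}
/* 3 steps run once, 5 steps run N times: f(N) = 5N + 3 */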


Example 1
Let’s estimate the running time for different N values:

N=10 => 53 steps
N=100 => 503 steps
N=1,000 => 5,003 steps
N=1,000,000 => 5,000,003 steps

As N grows, the number of steps grows in linear proportion to N for this function “Sum”.

f(N) = 5N + 3

12
Example 1
What happens to the +3 and 5 in f(N)=5N+3 as N gets larger?

13
Example 1
What happens to the +3 and 5 in f(N)=5N+3 as N gets larger?
The +3 becomes insignificant.
The 5 becomes inaccurate, since different operations require varying amounts of time.

What is fundamental to know is that the time is linear in N

Asymptotic Complexity: As N gets large, concentrate on the highest order term:


drop lower order terms such as +3
Drop the constant coefficient of the highest order term

Example: 7N-3 would be N


Example: 8n^2 log n + 5n^2 + n would be ______?

14
Example 1
What happens to the +3 and 5 in f(N)=5N+3 as N gets larger?
The +3 becomes insignificant.
The 5 becomes inaccurate, since different operations require varying amounts of time.

What is fundamental to know is that the time is linear in N

Asymptotic Complexity: As N gets large, concentrate on the highest order term:


drop lower order terms such as +3
Drop the constant coefficient of the highest order term

Example: 7n-3 would be N


Example: 8n^2 log n + 5n^2 + n would be n^2 log n

15
Asymptotic Notation

Big Oh Notation: Upper bound


Omega Notation: Lower bound
Theta Notation: Tighter bound

16
Big Oh Notation
If f(N) and g(N) are two complexity functions, we say
f(N) = O(g(N))
“f(N) is order g(N)” or “f(N) is big-O of g(N)”

This holds if there are constants c and N0 such that
f(N) <= c * g(N) for all N > N0,
i.e., for all sufficiently large N.

The function c*g(N) always dominates f(N) to the right of N0.
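As a quick check (not from the slide): for f(N) = 5N+3 and g(N) = N, choosing c = 6 and N0 = 3 gives 5N+3 <= 6N for all N > 3, so 5N+3 = O(N).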

17
Example 2
Which function is better?

19
Example 2
Which function is better?

As inputs get larger, any algorithm of a smaller order will be more efficient than an algorithm of a larger order.

20
Example 3

Think about these two function comparisons and the value where they’re equal.

Which function would be better for which inputs?

21
Big Omega Notation
If we wanted to say “running time is at least…” we use omega

Big Omega notation, Ω, is used to express the lower bounds on a function

If f(n) and g(n) are two complexity functions then we say:

f(n) is Ω(g(n)) if there exist positive constants c and n0 such that f(n) >= c*g(n) >= 0 for all n >= n0

In this instance, c*g(n) is dominated by f(n) to the right of n0
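As a quick check (not from the slide): 5n+3 >= 5n >= 0 for all n >= 1, so with c = 5 and n0 = 1 we have 5n+3 = Ω(n).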

22
Big Theta Notation
If we wish to express tight bounds, we use the theta notation, 𝜃

f(n) = 𝜃(g(n)) means that f(n) = O(g(n)) and f(n) = Ω(g(n))

Takes into consideration both big Oh and big Omega, giving us a tight bound on the complexity
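Continuing the earlier check (not from the slide): 5n+3 is both O(n) and Ω(n), so 5n+3 = 𝜃(n).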

23
What does it all mean?
If f(n)= 𝜃(g(n)) we say that f(n) and g(n) grow at the same rate, asymptotically

If f(n) = O(g(n)) and f(n) != Ω(g(n)), then we say that f(n) is asymptotically slower growing than g(n)

If f(n) = Ω(g(n)) and f(n) != O(g(n)), then we say that f(n) is asymptotically faster growing than g(n)
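For example (not from the slide): n = O(n^2) but n != Ω(n^2), so n is asymptotically slower growing than n^2.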

24
Which notation to use?
What is the common notation computer scientists use when designing algorithms?

25
Which notation to use?
What is the common notation computer scientists use when designing algorithms?

As computer scientists, we generally like to express our algorithms as big O since we would like to know the
upper bounds of our algorithms

Why?

26
Which notation to use?
What is the common notation computer scientists use when designing algorithms?

As computer scientists, we generally like to express our algorithms as big O since we would like to know the
upper bounds of our algorithms

Why?

If we know the worst case then we can aim to improve it and/or avoid it

27
Performance Classification

28
Complexity Classes

29
Size Does Matter
What happens if we double the input size N?

This is especially important in high-performance computing, parallel computing, and supercomputing
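A quick worked illustration (not on the slide): if T(n) = c*n, then T(2n) = 2*T(n); if T(n) = c*n^2, then T(2n) = 4*T(n); and if T(n) = c*2^n, then T(2n) = c*(2^n)^2, so doubling the input roughly squares the running time.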

30
Size Does Matter
Suppose a program has run time O(n) and the run time for n=1 is 10 seconds.

For n=12, the run time is 2 minutes (in minutes)

For n=200, the run time is ? (in minutes)

For n=1,000, the run time is ? (in hours)

For n=50,000, the run time is ? (in hours)

For n=1,000,000, the run time is ? (in days)

31
Size Does Matter
Suppose a program has run time O(n) and the run time for n=1 is 10 seconds.

For n=12, the run time is 2 minutes

For n=200, the run time is 33.3 minutes

For n=1,000, the run time is 2.77 hours

For n=50,000, the run time is 138.8 hours

For n=1,000,000, the run time is 115.74 days

32
Time Complexity

• Running Time
• Asymptotic Notation
• Big oh
• Big omega
• Big theta
• Performance classification
• Complexity classes
• Standard analysis techniques
• Constant time statements
• Analyzing loops
• Analyzing nested loops
• Analyzing sequence of statements
• Analyzing conditional statements
• Iteration method

33
Analyzing Loops - Linear Loops

Efficiency is proportional to the number of iterations.


Efficiency time function is:
f(n) = 1 + (n-1) + c*(n-1) + (n-1)
     = (c+2)*(n-1) + 1
     = (c+2)*n - (c+2) + 1
Asymptotically, efficiency is: O(n)
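The loop being counted isn't shown here; a minimal C sketch matching the formula above, with an assumed body that costs c steps per pass, could be:

int linear(int n) {
    int total = 0;                  /* not counted in the slide's formula */
    for (int i = 1; i < n; i++) {   /* init: 1 step; test: n-1 steps; increment: n-1 steps */
        total += i;                 /* assumed body costing c steps, executed n-1 times */
    }
    return total;                   /* counted steps: 1 + (n-1) + c*(n-1) + (n-1) = O(n) */
}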

34
Analyzing Nested Loops
Treat just like a single loop and evaluate each level of nesting as needed:

int j, k;
for (j = 0; j < N; j++)
    for (k = N; 0 < k; k--)
        sum += k + j;

Start with outer loop:


how many iterations? N
how much time per iteration? Need to evaluate the inner loop

35
Analyzing Nested Loops
Treat just like a single loop and evaluate each level of nesting as needed:

int j, k;
for (j = 0; j < N; j++)
    for (k = N; 0 < k; k--)
        sum += k + j;

Start with outer loop:


how many iterations? N
how much time per iteration? Need to evaluate the inner loop

Inner loop uses O(N) time


Total time is N * O(N) = O(N*N) = O(N^2)

36
Analyzing Nested Loops
What if the number of iterations of one loop depends on the counter of the other?

int j, k;
for (j = 0; j < N; j++)
    for (k = 0; k < j; k++)
        sum += k + j;

Analyze inner and outer loop together!

Number of iterations of the inner loop is:


0+1+2+…+(N-1) = O(N^2)

How did we get this answer?

37
Analyzing Nested Loops
What if the number of iterations of one loop depends on the counter of the other?

int j, k;
for (j = 0; j < N; j++)
    for (k = 0; k < j; k++)
        sum += k + j;

Analyze inner and outer loop together!

Number of iterations of the inner loop is:
0+1+2+…+(N-1) = O(N^2)

How did we get this answer? When doing Big O analysis, we sometimes have to compute a series like this, i.e. the sum of the first n numbers. What is the complexity of this?

Gauss figured out that the sum of the first n numbers is always:

1 + 2 + … + n = n*(n+1)/2 = (n^2 + n)/2 = O(n^2)

38
Sequence of Statements

For a sequence of statements, compute their complexity functions individually and add them up

Total cost is O(n^2) + O(n) + O(1) = O(n^2)
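The slide's code isn't shown; a hedged C sketch of such a sequence (all names illustrative) might be:

int sequence(int n) {
    int count = 0;
    for (int i = 0; i < n; i++)     /* nested loop: O(n^2) */
        for (int j = 0; j < n; j++)
            count++;
    for (int i = 0; i < n; i++)     /* single loop: O(n) */
        count++;
    count++;                        /* constant-time statement: O(1) */
    return count;                   /* total: O(n^2) + O(n) + O(1) = O(n^2) */
}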

39
Conditional Statements

What about conditional statements, such as an if/else?

Let's say statement1 runs in O(n) time and statement2 runs in O(n^2) time

We use “worst case” complexity; among all inputs of size n, what is the maximum running time?
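A hedged C sketch (statement1 and statement2 are stand-ins, since the slide's code isn't shown):

int conditional(int n, int flag) {
    int count = 0;
    if (flag) {
        for (int i = 0; i < n; i++)          /* statement1: O(n) */
            count++;
    } else {
        for (int i = 0; i < n; i++)          /* statement2: O(n^2) */
            for (int j = 0; j < n; j++)
                count++;
    }
    return count;   /* worst case over all inputs of size n: O(n^2) */
}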

40
Time Complexity

• Running Time
• Asymptotic Notation
• Big oh
• Big omega
• Big theta
• Performance classification
• Complexity classes
• Standard analysis techniques
• Constant time statements
• Analyzing loops
• Analyzing nested loops
• Analyzing sequence of statements
• Analyzing conditional statements
• Iteration method

41
Deriving a Recurrence Equation
So far, all the algorithms that we have been analyzing have been non-recursive

Example: recursive power method

If N=0, then the running time T(N) is 2

However, if N>=1, then the running time T(N) is the cost of the steps taken in the current call plus the time required to compute power(x, N-1), i.e., T(N) = 2 + T(N-1) for N>=1
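A minimal C sketch of such a recursive power method (the slide's exact code isn't shown, so treating each call as two units of work is an assumption):

double power(double x, int n) {
    if (n == 0)
        return 1.0;                 /* base case: constant cost, T(0) = 2 */
    return x * power(x, n - 1);     /* constant work plus a recursive call: T(N) = 2 + T(N-1) */
}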

How do we solve this? One way is to use the iteration method.


42
Iteration Method
This is sometimes known as "back substituting"

Involves expanding the recurrence in order to see a pattern

Solving the formula from the previous example using the iteration method:

Solution: Expand and apply the recurrence to itself:

Let T(0) = 2
T(N) = 2 + T(N-1)
     = 2 + 2 + T(N-2)
     = 2 + 2 + 2 + T(N-3)
     = 2 + 2 + 2 + … + 2 + T(0)     (N twos in total)
     = 2N + 2, remembering that T(0) = 2

So T(N) = 2N+2, which is O(N), for the last example

43
Summary
Algorithms can be classified according to their complexity (O notation), which is only relevant for large input sizes.

"Measurements" are machine dependent.

Analysis can be worst-case, average-case, or best-case.

44
Time Complexity

• Running Time
• Asymptotic Notation
• Big oh
• Big omega
• Big theta
• Performance classification
• Complexity classes
• Standard analysis techniques
• Constant time statements
• Analyzing loops
• Analyzing nested loops
• Analyzing sequence of statements
• Analyzing conditional statements
• Iteration method

45
Course Timeline
ALGORITHMIC ANALYSIS
• Introduction to algorithms
• Median finding and order statistics
• Time complexity

GREEDY STRATEGIES
• Activity selection problems
• Water connection problem, Egyptian fraction
• Huffman (de)coding
• Shelves, mice, and policemen problem

DIVIDE AND CONQUER
• Max subarray sum and nearest neighbor
• Newton’s and bisection algorithm
• Matrix multiplication, skyline, and Hanoi

DYNAMIC PROGRAMMING
• Fibonacci and path counting
• Coin row and collecting problem
• Matrix chain multiplication and longest common subsequence
• Knapsack and optimal binary trees
• Floyd-Warshall algorithm and A*

ADVANCED CONCEPTS
• Algorithm intractability
• Randomized algorithms

46
