Lec-03-Complexity of Algorithm

This document discusses the complexity of algorithms and why it is important to analyze it. It begins by defining an algorithm as a well-defined computational procedure that takes inputs and produces outputs. It then explains that while computers may be fast, their time and memory are finite resources. The document goes on to define algorithmic complexity as how fast or slow an algorithm performs based on the size of the input. It notes that asymptotic analysis allows comparison of algorithms independently of machine or implementation. Examples are given comparing linear and exponential time complexities to show how faster-growing functions quickly dominate for large inputs. Finally, common asymptotic notations like Big-O, Big-Omega, and Big-Theta are introduced for describing an algorithm's worst-case running time.


Data Structure and Algorithms

Complexity of Algorithms

1
Algorithm
An Algorithm is any well-defined computational procedure that takes some value, or set of values, as input and produces some value, or set of values, as output.

An Algorithm is a well-defined list of steps for solving a particular problem.

2
NOTION OF ALGORITHM
[Figure: a problem is solved by an algorithm; a "computer" executing the algorithm transforms the input into the output]

Algorithmic solution

3
Suppose computers were infinitely fast and computer memory were free.

Is there any reason to study the complexity of algorithms?

4
If computers were infinitely fast, any correct method for solving a problem would do.

You would still want to demonstrate that your solution method terminates, and does so with the correct answer.

You would probably want your implementation to be within the bounds of good software engineering practice.
5
In reality
Computers may be fast, but they are not infinitely fast, and memory may be cheap, but it is not free.

Computing time is therefore a bounded resource, and so is the space in memory.

6
Complexity
 Algorithmic complexity is concerned with how fast or slow a particular algorithm performs.
 We define complexity as a numerical function T(n): running time versus the input size n.
 Such an analysis is independent of machine type, programming style, etc.

7
Complexity

In general, we are not so much interested in the time and space complexity for small inputs.

For example, while the difference in time complexity between linear and binary search is meaningless for a sequence with n = 10, it is gigantic for n = 2^30.
8
Complexity
For example, let us assume two algorithms A and B that solve the same class of problems P.

The time complexity of A is 5,000n; the one for B is ⌈1.1^n⌉ for an input with n elements.

For n = 10, A requires 50,000 steps, but B only 3, so B seems to be superior to A.

For n = 1,000, however, A requires 5,000,000 steps, while B requires 2.5 × 10^41 steps.
9
Complexity
Comparison: time complexity of algorithms A and B.

Input Size n   Algorithm A (5,000n)   Algorithm B (⌈1.1^n⌉)
10             50,000                 3
100            500,000                13,781
1,000          5,000,000              2.5 × 10^41
1,000,000      5 × 10^9               4.8 × 10^41392
10
Complexity
This means that algorithm B cannot be used for large inputs, while algorithm A is still feasible.

So what is important is the growth of the complexity functions.

The growth of time and space complexity with increasing input size n is a suitable measure for the comparison of algorithms.
11
Growth Function
The order of growth (rate of growth) of the running time of an algorithm gives a simple characterization of the algorithm's efficiency and allows us to compare the relative performance of alternative algorithms.

12
Asymptotic Efficiency of Algorithms
When the input size is large enough, only the rate of growth (order of growth) of the running time is relevant.

Usually, an algorithm that is asymptotically more efficient will be the best choice for all but very small inputs.

13
Asymptotic Notation
Asymptotic notations are convenient for describing the worst-case running-time function T(n), which is defined only on integer input sizes.

Let n be a non-negative integer representing the size of the input to an algorithm.

14
Asymptotic Notation
Let f(n) and g(n) be two positive functions,
representing the number of basic
calculations (operations, instructions) that
an algorithm takes (or the number of
memory words an algorithm needs).

15
Is input size everything that matters?

int find_a(char *str)
{
    int i;
    for (i = 0; str[i]; i++)
    {
        if (str[i] == 'a')
            return i;
    }
    return -1;
}
Time complexity:

Consider two inputs: "alibi" and "never"

16
Time for an algorithm to run: T(n)
A function of the input. However, we will attempt to characterize this by the size of the input. We will try to estimate the WORST CASE, and sometimes the BEST CASE, and very rarely the AVERAGE CASE.

 Worst Case: the maximum run time
 Best Case: the minimum run time
 Average Case: the average run time

We can measure these three using different functions.
17
Asymptotic Notation
O – Big O
Ω – Big Omega
Θ – Big Theta

18
Types of Analysis
Worst-case: (usually done)
upper bound on running time
maximum running time of algorithm on any input of
size n
Average-case: (sometimes done)
we take all possible inputs and calculate the computing time for each
sum all the calculated values and divide the sum by the total number of inputs
we must know (or predict) the distribution of cases
Best-case: (bogus)
lower bound on running time of an algorithm
minimum running time of algorithm on any input of
size n
O-Notation
For a given function g(n),
O(g(n)) = {f(n) : there exist positive constants c and n0 such that 0 ≤ f(n) ≤ c·g(n) for all n ≥ n0}

Intuitively: the set of all functions whose rate of growth is the same as or lower than that of g(n).

20
Ω-Notation
For a given function g(n),
Ω(g(n)) = {f(n) : there exist positive constants c and n0 such that 0 ≤ c·g(n) ≤ f(n) for all n ≥ n0}

Intuitively: the set of all functions whose rate of growth is the same as or higher than that of g(n).
21
Relations Between O, Ω

22
Relations Between Θ, O, Ω
For any two functions f(n) and g(n),
we have f(n) = Θ(g(n)) if and only if f(n) = O(g(n)) and f(n) = Ω(g(n)).

That is,
Θ(g(n)) = O(g(n)) ∩ Ω(g(n))

23
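As a worked example of this relation (an illustration not on the slides): for f(n) = 3n^2 + 2n,

```latex
3n^2 \;\le\; 3n^2 + 2n \;\le\; 5n^2 \qquad \text{for all } n \ge 1,
```

so the constants c1 = 3, c2 = 5, n0 = 1 witness both f(n) = Ω(n^2) and f(n) = O(n^2), and therefore f(n) = Θ(n^2).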
Asymptotic Upper Bound
Example:
Show that f(x) = x^2 + 2x + 1 is O(x^2).

For x > 1 we have:
x^2 + 2x + 1 ≤ x^2 + 2x^2 + x^2
x^2 + 2x + 1 ≤ 4x^2

Therefore, for C = 4 and k = 1:
f(x) ≤ C·x^2 whenever x > k.

Hence f(x) is O(x^2).
Asymptotic Upper Bound
A constant-time function is in the order of 1, or O(1).
The growth rate of an O(1) function is constant, that is, it is not dependent on problem size.
A linear function is in the order of n, or O(n):
its growth rate is roughly proportional to the growth rate of n.
A quadratic function is in the order of n^2, or O(n^2):
its growth rate is roughly proportional to the growth rate of n^2.
In general, any O(n^2) function is faster-growing than any O(n) function.
For large n, an O(n^2) algorithm runs a lot slower than an O(n) algorithm.
Asymptotic Upper Bound
Consider the example of buying elephants and goldfish:
Cost = cost_of_elephants + cost_of_goldfish
Cost ≈ cost_of_elephants (approximation)
Easier way: discard the low-order terms, as well as the constants, in a function, since they are relatively insignificant for large n:
6n + 4 ~ n
n^4 + 100n^2 + 10n + 50 ~ n^4
i.e., we say that n^4 + 100n^2 + 10n + 50 and n^4 have the
same rate of growth
The Growth of Functions
Popular functions g(n) are
n log n, 1, 2^n, n^2, n!, n, n^3, log n.
Listed from slowest to fastest growth:
• 1
• log n
• n
• n log n
• n^2
• n^3
• 2^n
• n!
27
Growth of Functions
Comparing Growth Rates

[Figure: T(n) versus problem size for log2 n, n log2 n, n^2, and 2^n]
29
Some Examples
Determine the time complexity for the following
algorithm.
 count = 0;
for(i=0; i<10000; i++)
count++;
Some Examples
Determine the time complexity for the following
algorithm. O(1)
 count = 0;
for(i=0; i<10000; i++)
count++;
Some Examples
Determine the time complexity for the following
algorithm.
 count = 0;
for(i=0; i<N; i++)
count++;
Some Examples
Determine the time complexity for the following
algorithm. O(n)
 count = 0;
for(i=0; i<N; i++)
count++;
Some Examples
Determine the time complexity for the following
algorithm.
 sum = 0;
for(i=0; i<N; i++)
for(j=0; j<N; j++)
sum += arr[i][j];
Some Examples
Determine the time complexity for the following
algorithm. O(n^2)
 sum = 0;
for(i=0; i<N; i++)
for(j=0; j<N; j++)
sum += arr[i][j];
Some Examples
Determine the time complexity for the following
algorithm.
 count = 0;
for(i=1; i<N; i=i*2)
count++;
Some Examples
Determine the time complexity for the following
algorithm. O(log n)
 count = 0;
for(i=1; i<N; i=i*2)
count++;
Thank You

38
