L5 - Time & Space Complexity-1.1
[Figure: number of operations vs. number of elements, comparing O(n log n) and O(n) growth]
What is Time Complexity?
You can get the time complexity by “counting” the number of operations performed by your code.
This time complexity is expressed as a function of the input size n using Big-O notation: n indicates the input size, while O describes the worst-case growth rate of the running time.
We use the Big-O notation to classify algorithms based on their running time or
space (memory used) as the input grows.
Let's start with a simple example. Suppose you are given an array A and an integer x, and you have to find whether x exists in A.
A simple solution to this problem is to traverse the whole array and check whether any element is equal to x.
bool contains(int A[], int n, int x)
{
    for (int i = 0; i < n; i++)
    {
        if (A[i] == x)
            return true;   // found x, stop early
    }
    return false;          // checked every element, x is not present
}
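A quick usage sketch (the main function and the sample values below are illustrative, not part of the original notes; it assumes the contains function above is in the same file):

#include <iostream>
using namespace std;

int main()
{
    int A[] = {4, 8, 15, 16, 23, 42};
    int n = sizeof(A) / sizeof(A[0]);    // number of elements in A
    cout << contains(A, n, 15) << endl;  // prints 1: 15 is in A
    cout << contains(A, n, 7) << endl;   // prints 0: 7 is not in A
    return 0;
}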
Consider the search above.
Let each operation take 'c' time. The number of operations executed actually depends on where 'x' appears in the array.
Worst case: the if condition runs n times, when x is at the last position or is not present at all, so the total time is c·n, which is O(n).
Best case: the if condition runs only once, when x is found at the first position, so the total time is c, which is O(1).
The auxiliary space requirement is constant: apart from the input array itself, no additional space is used, so the space complexity is O(1).
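To make the space point concrete, here is a small sketch (the copyArray function and this example are illustrative assumptions, not from the original notes): the search above uses O(1) auxiliary space, while a function that builds a copy of its input uses O(n) extra space.

#include <vector>
using namespace std;

// O(n) auxiliary space: the vector 'copy' grows with the input size n
vector<int> copyArray(const int A[], int n)
{
    vector<int> copy;
    for (int i = 0; i < n; i++)
        copy.push_back(A[i]);   // one extra stored element per input element
    return copy;
}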
No matter how big the constant is and how slowly the linear term grows, linear will at some point surpass constant.
For example, to paint a fence that is w meters wide and h meters high you need O(wh) work; if you needed p layers of paint, then you could say that the time is O(whp).
Big O, Big Theta, and Big Omega
Big O, big theta, and big omega are all used to describe runtimes.
O (big O): Big O describes an upper bound on the time. An algorithm that prints all the values in an array could be described as O(N), but it could also be described as O(N²), O(N³), or O(2^N) (or many other big O times).
The algorithm is at least as fast as each of these; therefore they are upper bounds on the runtime. This is similar to a less-than-or-equal-to relationship.
Big omega (Ω): Omega is the equivalent concept but for the lower bound. Printing the values in an array is Ω(N) as well as Ω(log N) and Ω(1). After all, you know that it won't be faster than those runtimes.
Big theta (Θ): Theta means both O and Omega. That is, an algorithm is Θ(N) if it is both O(N) and Ω(N).
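As a small sketch of these bounds (the printAll function below is an illustrative example, not from the original notes), a loop that touches every element is O(N), Ω(N), and therefore Θ(N):

#include <iostream>
using namespace std;

// Visits every element exactly once: the runtime is O(N) (upper bound),
// Omega(N) (lower bound), and therefore Theta(N) (tight bound).
void printAll(const int A[], int n)
{
    for (int i = 0; i < n; i++)
        cout << A[i] << " ";   // exactly one print per element, N operations in total
    cout << endl;
}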
Note that the runtime is expressed as a function of the length of the input, not as the actual execution time on the machine on which the algorithm runs.
As an analogy, suppose you need to send a file to a friend, either by transferring it electronically or by flying it over on an airplane.
Electronic transfer: O(s), where s is the size of the file. This means that the time to transfer the file increases linearly with the size of the file. (Yes, this is a bit of a simplification, but that's okay for these purposes.)
Airplane transfer: O(1) with respect to the size of the file. As the size of the file increases, it won't take any longer to get the file to your friend. The time is constant.
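A minimal sketch of the crossover (the cost functions and the constant 600 below are arbitrary illustrative assumptions, not from the original notes): however large the airplane's constant cost is, the linear electronic-transfer cost eventually exceeds it as s grows.

// Hypothetical cost models, in arbitrary time units
long electronicTransfer(long s)     { return s; }    // O(s): cost grows with file size s
long airplaneTransfer(long /*s*/)   { return 600; }  // O(1): fixed cost regardless of size

// For s > 600 the linear cost surpasses the constant one,
// e.g. electronicTransfer(1000) = 1000 > airplaneTransfer(1000) = 600.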
Big O Cheatsheet
Big O Notation | Name        | Examples
O(1)           | Constant    | Odd or even number
O(log n)       | Logarithmic | Finding element on sorted array with binary search
O(n)           | Linear      | Find max element in unsorted array
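As a sketch of the O(log n) row (this binarySearch function is an illustrative example, not from the original notes), each step halves the portion of the array still under consideration:

// O(log n): the search range [lo, hi] halves on every iteration
int binarySearch(const int A[], int n, int x)
{
    int lo = 0, hi = n - 1;
    while (lo <= hi)
    {
        int mid = lo + (hi - lo) / 2;
        if (A[mid] == x)
            return mid;        // found x at index mid
        else if (A[mid] < x)
            lo = mid + 1;      // x can only be in the right half
        else
            hi = mid - 1;      // x can only be in the left half
    }
    return -1;                 // x is not in the (sorted) array A
}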
int fun(int n)
{
    // O(sqrt(n)): the loop body executes about sqrt(n) times
    for (int i = 1; i <= sqrt(n); i++)
    {
        cout << "SIX PHRASE";
    }
    return 0;
}
int fun(int n)
{
    // O(sqrt(n)): i*i <= n holds only while i <= sqrt(n),
    // the same bound as above but without calling sqrt()
    for (int i = 1; i * i <= n; i++)
    {
        cout << "SIX PHRASE";
    }
    return 0;
}
int fun(int n)
{
    // O(log n): n is halved on every iteration,
    // so the loop runs about log2(n) times
    while (n > 1)
    {
        n = n / 2;
    }
    return 0;
}
int fun(int n)
{
    // O(n): one recursive call per decrement until n reaches 1
    if (n > 1)
        return fun(n - 1);
    return 0;
}
int fun(int n)
{
    // O(n * sqrt(n)): the outer loop runs n times and the inner
    // loop runs about sqrt(n) times for each outer iteration
    for (int i = 1; i < n; i++)
    {
        for (int j = 1; j < sqrt(n); j++)
        {
            cout << "SIX PHRASE";
        }
    }
    return 0;
}