
Algorithm Analysis and Design: Introduction To Algorithms

The document introduces algorithms and their properties, including that an algorithm must have finite steps and be precise. It discusses analyzing time and space complexity using asymptotic notations like Big O for upper bounds, Big Omega for lower bounds, and Big Theta for tight bounds. Big O represents the worst case, Big Omega the best case, and Big Theta the average case. Examples are given to illustrate analyzing linear search time complexity using these notations.

Uploaded by

Islam Saleem

ALGORITHM ANALYSIS AND DESIGN

INTRODUCTION TO ALGORITHMS

Algorithm & its properties, Time & Space complexity, Asymptotic Notations.
Algorithm & its properties
 An algorithm is a sequence of computational steps that transforms the input into the output.
 An algorithm is a step-by-step procedure to solve a problem. It takes a set of values as input and produces some value as output. The efficiency of an algorithm is measured by its space and time complexity.
Algorithm & its properties
 Properties of an algorithm
 Algorithms must contain a finite number of steps.
 Algorithms must be precise and unambiguous.
Algorithm Design and Analysis
 The efficiency or running time of an algorithm is stated as a function
relating the input length to the number of steps (time complexity) or
storage locations (space complexity).
 Algorithm analysis provides theoretical estimates for the resources
needed by any algorithm which solves a given computational
problem.
Time & Space complexity
 Space Complexity
Space complexity is the space (memory) needed for an algorithm to solve the problem. An efficient algorithm takes as little space as possible.
 Time Complexity
Time complexity is the time required for an algorithm to complete its process. It allows comparing algorithms to check which one is the more efficient.
Asymptotic Notations
 When it comes to analyzing the complexity of any algorithm in terms of time and space, we can never provide an exact number for the time and space required by the algorithm; instead, we express it using some standard notations, also known as Asymptotic Notations.
Example
 Let us take an example: if some algorithm has a time complexity of T(n) = n² + 3n + 4, which is a quadratic expression, then for large values of n the 3n + 4 part becomes insignificant compared to the n² part. For n = 1000, n² will be 1000000 while 3n + 4 will be 3004.
 When we compare the execution times of two algorithms, the constant coefficients of the higher-order terms are also neglected. An algorithm that takes a time of 200n² will be faster than some other algorithm that takes n³ time, for any value of n larger than 200. Since we're only interested in the asymptotic behavior of the growth of the function, the constant factor can be ignored too.
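The figures above can be checked with a short script (a sketch; T is just the example polynomial from this slide):

```python
# T(n) = n^2 + 3n + 4, the example time complexity from the slide.
def T(n):
    return n**2 + 3*n + 4

# For growing n, the n^2 term dominates and 3n + 4 becomes insignificant.
for n in [10, 100, 1000]:
    quadratic = n**2
    rest = 3*n + 4
    print(n, quadratic, rest)   # at n = 1000: 1000000 vs 3004
```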
What is Asymptotic Behaviour
 The word Asymptotic means approaching a value or curve arbitrarily closely (i.e., as some sort of limit is taken).
Example
 Let's take an example to understand this:
 If we have two algorithms with the following expressions representing the time required by them for execution, then:
 Expression 1: 20n² + 3n - 4
 Expression 2: n³ + 100n - 2
 Now, as per asymptotic notations, we should just worry about how the function will grow as the value of n (the input) grows, and that will depend entirely on n² for Expression 1 and on n³ for Expression 2.
 Hence, we can clearly say that the algorithm whose running time is represented by Expression 2 will grow faster than the other one, simply by looking at the highest power of n and ignoring the constant coefficients (20 in 20n²) and the insignificant parts of the expressions (3n - 4 and 100n - 2).
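The comparison above can be sketched numerically (the crossover value of n observed here is illustrative, not stated in the slide):

```python
# Expression 1: 20n^2 + 3n - 4    Expression 2: n^3 + 100n - 2
def expr1(n):
    return 20*n**2 + 3*n - 4

def expr2(n):
    return n**3 + 100*n - 2

# For small n, Expression 1 can be larger, but past a crossover point
# the n^3 term makes Expression 2 grow faster for every larger n.
for n in [1, 10, 100, 1000]:
    print(n, expr1(n), expr2(n), expr2(n) > expr1(n))
```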
Asymptotic Notations
 Asymptotic notation is used to express the complexity of an algorithm in terms of time and space. It is normally written using the following notations:
 Big O notation Example: O(n²)
 Big Omega notation Example: Ω(n)
 Big Theta notation Example: Θ(n log n)
BIG O: Upper Bound
 This notation is known as the upper bound of the algorithm, or the worst case of an algorithm.
 It tells us that a certain function will never exceed a specified time for any value of the input n.
 The question is why we need this representation when we already have the big-Θ notation, which represents the tight bound on the running time of an algorithm.
Example
 Let's take a small example to understand this.
 Consider the Linear Search algorithm, in which we traverse an array's elements one by one to search for a given number.
 In the worst case, starting from the front of the array, we find the element we are searching for at the very end, which leads to a time complexity of n, where n represents the total number of elements.
 But it can happen that the element we are searching for is the first element of the array, in which case the time complexity will be 1.
 Saying that the big-Θ or tight-bound time complexity for Linear Search is Θ(n) means that the time required will always be related to n, as this is the right way to represent the average time complexity. But when we use the big-O notation, we say that the time complexity is O(n), which means that the time complexity will never exceed n, defining the upper bound: it can be less than or equal to n, which is the correct representation.
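The linear search described above can be sketched as follows (the array contents are illustrative):

```python
# Linear search: traverse the array one element at a time.
def linear_search(arr, target):
    for i, value in enumerate(arr):
        if value == target:
            return i      # found after i + 1 comparisons
    return -1             # not found after n comparisons

data = [7, 3, 9, 1, 5]
# Best case, Omega(1): the target is the first element.
print(linear_search(data, 7))    # 0
# Worst case, O(n): the target is the last element (or absent).
print(linear_search(data, 5))    # 4
```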
Lower Bounds: Omega
 Big Omega notation is used to define the lower bound of any algorithm, or we can say the best case of any algorithm.
 It indicates the minimum time required by an algorithm for all input values, and therefore the best case of the algorithm.
 In simple words, when we represent the time complexity of an algorithm in the form of big-Ω, we mean that the algorithm will take at least this much time to complete its execution. It can definitely take more time than this, too.
Tight Bounds: Theta
 When we say tight bounds, we mean that the time complexity represented by the Big-Θ notation is like the average value or range within which the actual execution time of the algorithm will lie.
Example
 If for some algorithm the time complexity is represented by the expression 3n² + 5n, and we use the Big-Θ notation to represent this, then the time complexity would be Θ(n²), ignoring the constant coefficient and removing the insignificant part, which is 5n.
 Here, in the example above, a complexity of Θ(n²) means that the running time for any input n will remain between k1·n² and k2·n², where k1 and k2 are two constants, thereby tightly binding the expression representing the growth of the algorithm.
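The tight bound above can be checked numerically; the particular constants k1 = 3, k2 = 4 and the threshold n ≥ 5 are choices made here for illustration, not values given in the slide:

```python
# f(n) = 3n^2 + 5n, the example expression with complexity Theta(n^2).
def f(n):
    return 3*n**2 + 5*n

# For k1 = 3 and k2 = 4, k1*n^2 <= f(n) <= k2*n^2 holds once n >= 5,
# since 3n^2 + 5n <= 4n^2 is equivalent to 5n <= n^2, i.e. n >= 5.
k1, k2 = 3, 4
for n in range(5, 1000):
    assert k1 * n**2 <= f(n) <= k2 * n**2
print("bounds hold for 5 <= n < 1000")
```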