Unit 1 DAA
UNIT 1 Introduction
An algorithm is a finite sequence of unambiguous instructions for solving a
problem. Algorithms help to automate processes and make them more reliable,
faster, and more effective.
Properties of an Algorithm:
It should terminate after a finite time.
It should produce at least one output.
It should take zero or more inputs.
It should be deterministic, i.e. it gives the same output for the same
input.
Every step in the algorithm must be effective, i.e. every step should do
some work.
Types of Algorithms:
Brute Force Algorithm
Recursive Algorithm
Backtracking Algorithm
Searching Algorithm
Divide and Conquer Algorithm
Advantages of Algorithms:
It is easy to understand.
An algorithm is a step-wise representation of a solution to a given
problem.
In an Algorithm the problem is broken down into smaller pieces or
steps hence, it is easier for the programmer to convert it into an actual
program.
Disadvantages of Algorithms:
Writing an algorithm for a large or complex problem is time-consuming.
Analyzing an Algorithm
For an algorithm, the most important quality is efficiency. In fact, there are two
kinds of algorithm efficiency: time efficiency and space efficiency.
Coding an Algorithm
Standard tricks like computing a loop's invariant (an expression that does not
change its value) outside the loop can speed up a program by a constant factor,
whereas a better algorithm can improve the running time by orders of
magnitude.
Complexity:
The performance of a computer program is the amount of memory and time
needed to run the program.
Time efficiency, also called time complexity, indicates how quickly an algorithm
runs.
Space efficiency, also called space complexity, refers to the amount of memory an
algorithm requires. It has three components:
Instruction space
Data space
Environment stack space
The total space is the sum of a fixed part and a variable part:
S = S(fix) + S(variable)
Example 01:
Algorithm Sum(A, n)
    s = 0; --------------------------------- 1
    for (i = 0; i < n; i++) ---------------- n+1
        s = s + A[i]; ---------------------- n
    return s; ------------------------------ 1
                                     --------
f(n) = 2n + 3
Time Complexity: O(n)
Prof. Arati Badiger Page 13
A ----- n
n ----- 1
s ----- 1
i ----- 1
--------------------------
S(n) = n + 3
Space Complexity: O(n)
Example 02:
for (i = 0; i < n; i++) ----------------- n+1
    for (j = 0; j < n; j++) ------------- n*(n+1)
        C[i,j] = A[i,j] + B[i,j]; ------- n*n
                                  --------
f(n) = 2n² + 2n + 1
Time Complexity: O(n²)
A, B, and C need n² units each, and i, j, n need 1 unit each, so
S(n) = 3n² + 3
Space Complexity: O(n²)
Analysis Framework
1. Measuring an input size
2. Units for measuring runtime
3. Orders of Growth
4. Worst case, Best case and Average case
5. Time Complexity
6. Space Complexity
For example, it will be the size of the list for problems of sorting, searching, finding
the list's smallest element, and most other problems dealing with lists.
There are situations, of course, where the choice of a parameter indicating an input
size does matter.
One possible approach for measuring running time is to use a standard unit of
time, such as a second or millisecond. There are obvious drawbacks to such an
approach: dependence on the speed of a particular computer, on the quality of
the program implementing the algorithm, and on the compiler used.
Another possible approach is to count the number of times each of the
algorithm's operations is executed. This approach is both excessively difficult and
usually unnecessary; it is enough to count how many times the algorithm's basic
(most time-consuming) operation is executed.
Time Complexity:
The time complexity of an algorithm quantifies the amount of time taken by an
algorithm to run as a function of the length of the input. Note that the time to run
is a function of the length of the input and not the actual execution time of the
machine on which the algorithm is running on.
Definition
A valid algorithm takes a finite amount of time for execution. The time
required by the algorithm to solve a given problem is called the time complexity
of the algorithm. Time complexity is a very useful measure in algorithm analysis.
Space complexity
We often speak of extra memory needed, not counting the memory needed to
store the input itself. Again, we use natural (but fixed-length) units to measure
this.
We can use bytes, but it's easier to use, say, the number of integers used, the
number of fixed-sized structures, etc.
In the end, the function we come up with will be independent of the actual
number of bytes needed to represent the unit.
Space complexity is sometimes ignored because the space used is minimal and/or
obvious; however, sometimes it becomes as important an issue as time complexity.
Consider the example of Linear Search where we search for an item in an array. If the
item is in the array, we return the corresponding index, otherwise, we return -1. The
code for linear search is given below.
int search(int a[], int n, int item) {
    int i;
    for (i = 0; i < n; i++) {
        if (a[i] == item) {
            return i;    /* return the index of the item */
        }
    }
    return -1;
}
Variable a is an array, n is the size of the array and item is the item we are looking for
in the array. When the item we are looking for is in the very first position of the array,
it will return the index immediately.
The for loop runs only once, so the complexity in this case will be O(1). This is
called the best case.
Consider another example of insertion sort. Insertion sort sorts the items in the input
array in an ascending (or descending) order. It maintains the sorted and un-sorted
parts in an array. It takes the items from the un-sorted part and inserts into the sorted
part in its appropriate position. The figure below shows one snapshot of the insertion
operation.
In the figure, items [1, 4, 7, 11, 53] are already sorted and now we want to place 33 in
its appropriate place.
The item to be inserted is compared with the items from right to left, one by one,
until we find an item that is smaller than the item we are trying to insert.
We compare 33 with 53 since 53 is bigger we move one position to the left and
compare 33 with 11. Since 11 is smaller than 33, we place 33 just after 11 and move
53 one step to the right.
Here we did 2 comparisons. If the item were 55 instead of 33, we would have
performed only one comparison.
That means, if the array is already sorted then only one comparison is necessary to
place each item to its appropriate place and one scan of the array would sort it. The
code for insertion operation is given below.
void sort(int a[], int n) {
    int i, j, key;
    for (i = 1; i < n; i++) {
        key = a[i];
        j = i - 1;
        while (j >= 0 && a[j] > key) {
            a[j + 1] = a[j];
            j = j - 1;
        }
        a[j + 1] = key;
    }
}
When the items are already sorted, the while loop test fails immediately for each
item. There are n items in total, so the running time is O(n).
So the best case running time of insertion sort is O(n).
The best case gives us a lower bound on the running time for any input.
If the best case of the algorithm is O(n) then we know that for any input the program
needs at least O(n) time to run. In reality, we rarely need the best case for our
algorithm. We never design an algorithm based on the best case scenario.
In real life, most of the time we do the worst case analysis of an algorithm. Worst
case running time is the longest running time for any input of size n.
In the linear search, the worst case happens when the item we are searching is in the
last position of the array or the item is not in the array.
In both the cases, we need to go through all n items in the array. The worst case
runtime is, therefore, O(n). Worst case performance is more important than the best
case performance in case of linear search because of the following reasons.
1. The item we are searching for is rarely in the first position. If the array has
1000 items and we search for a random one of them, there is only a 0.1
percent chance (probability 0.001) that the item will be in the first position.
2. Most of the time the item is not in the array (or database in general), in
which case the search takes the worst case running time.
Similarly, in insertion sort, the worst case occurs when the items are reverse
sorted. The number of comparisons in the worst case is on the order of n²/2 and
hence the running time is O(n²).
In the average case of insertion sort, when we insert a new item into its
appropriate position, we compare the new item with half of the sorted items on
average. The number of comparisons is then on the order of n²/4, which is still
O(n²), the same order as the worst-case running time.
It is usually harder to analyze the average behavior of an algorithm than to analyze its
behavior in the worst case.
This is because it may not be apparent what constitutes an “average” input for a
particular problem.
Order of growth:
It is described by the highest degree term of the formula for running time. (Drop
lower-order terms. Ignore the constant coefficient in the leading term.)
Example: We found that for insertion sort the worst-case running time has the
form an² + bn + c. For large n, this grows like n², although it does not equal n².
We say that the running time is Θ(n²) to capture the notion that the order of
growth is n².
We usually consider one algorithm to be more efficient than another if its worst-
case running time has a smaller order of growth.
------------------------------------END--------------------------------------------------------