
Data Structures

By:
Dr. Sushil Kumar,
Assistant Professor
NIT Warangal
Data Structures
 Asymptotic Notations: Big-oh, Big-omega, Theta
 Abstract Data Types (ADTs), Implementation and Applications of Stacks,
   Operations and Applications of Queues, Array Implementation of Circular
   Queues, Implementation of Stacks using Queues, Implementation of Queues
   using Stacks, Linked Lists, Search and Update Operations on Varieties of
   Linked Lists, Linked List Implementation of Stacks and Queues
 Introduction to Trees, Implementation of Trees, Binary Trees, Tree
   Traversals, Binary Search Trees (BSTs), B-trees
 Hashing: Implementation of Dictionaries, Hash Functions, Collisions in
   Hashing, Separate Chaining, Open Addressing, Analysis of Search Operations
Data Structures

 Sorting Algorithms: Stability and In-Place Properties, Insertion Sort,
   Merge Sort, Quick Sort, Heap Sort, Lower Bound for Comparison-Based
   Sorting Algorithms, Linear Sorting Algorithms: Counting Sort, Radix Sort,
   Bucket Sort
 Graph Algorithms: Graphs and their Representations, Graph Traversal
   Techniques: Breadth First Search (BFS) and Depth First Search (DFS),
   Applications of BFS and DFS, Prim's and Kruskal's Algorithms for MST,
   Connected Components, Dijkstra's Algorithm for Single-Source Shortest
   Paths, Warshall's Algorithm for finding the Transitive Closure of a
   Graph, Floyd's Algorithm for the All-Pairs Shortest Paths Problem
References
 Thomas H. Cormen, Charles E. Leiserson, Ronald L. Rivest and Clifford
   Stein, Introduction to Algorithms, Second Edition, PHI, 2009.
 Mark Allen Weiss, Data Structures and Algorithm Analysis in C++, Third
   Edition, Pearson Education, 2006.
 Ellis Horowitz, Sartaj Sahni and Sanguthevar Rajasekaran, Fundamentals of
   Computer Algorithms, Second Edition, Universities Press, 2011.
 Michael T. Goodrich and Roberto Tamassia, Algorithm Design: Foundations,
   Analysis and Internet Examples, Second Edition, Wiley-India, 2006.
Types of Data Structures

Abstract Data Types (ADTs)

Algorithms
Asymptotic Notations
Types of Data Structures
 A data structure is a particular way of storing and organizing data in a
   computer so that it can be used efficiently.
 Eg: arrays, files, linked lists, stacks, queues, trees, graphs, and so on.
Types of Data Structures
 Depending on the organization of the elements, data structures are
   classified into two types:
1. Linear data structures: Elements are accessed in a sequential order,
   but it is not compulsory to store all elements sequentially.
   Eg: Linked Lists, Stacks and Queues.
Types of Data Structures
2. Non-linear data structures: Elements of this data structure are
   stored/accessed in a non-linear order.
   Eg: Trees and graphs.
Algorithms

[Diagram: Specification of Input → Output]

 There are an infinite number of input instances satisfying the
   specifications.
 Example: 1, 20, 908, 1000, 10000, ...
Algorithms

[Diagram: Specification of Input → Algorithm → Output]

 An algorithm describes the actions to be taken on the input instances.
 There are infinitely many correct algorithms for the same algorithmic
   problem.
Algorithms
 Informally, an algorithm is a description, in plain language, of the
   steps used to solve a problem.
 There may be many solutions for a single problem.
 If we have many algorithms for the same problem, we should decide which
   one is efficient according to some rules.
Algorithms Analysis
 An algorithm can be measured according to:
   Time
   Memory
 To design an efficient algorithm we use some standard terminology.
Algorithms Analysis
Running Time Analysis: determines how the processing time increases as the
size of the problem (input size) increases.
Objective measures to compare algorithms:
 Execution time
 Number of statements executed
 Complexity expressed as a function of the input size
Rate of Growth
 The rate at which the running time increases as a function of the input
size (n) is called the rate of growth.
Types of Analysis
To analyze an algorithm, identify the inputs for which
 the algorithm takes the least time  →  Best case
 the algorithm takes the longest time  →  Worst case
Average case:
Run the algorithm many times, using different inputs, compute the total
running time, and divide by the number of trials.
Lower Bound <= Average Time <= Upper Bound
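A minimal C++ sketch of this averaging procedure, assuming a placeholder
algorithm under test (a simple linear scan, not taken from the slides) and
random inputs:

#include <chrono>
#include <cstdlib>
#include <iostream>
#include <vector>

// Placeholder "algorithm under test": a linear scan (illustrative only).
long long scan(const std::vector<int>& a, int x) {
    for (std::size_t i = 0; i < a.size(); i++)
        if (a[i] == x) return (long long)i;
    return -1;
}

int main() {
    const int n = 100000, trials = 100;
    std::vector<int> a(n);
    for (int& v : a) v = std::rand();

    long long sink = 0;  // consume results so the calls are not optimized away
    auto start = std::chrono::steady_clock::now();
    for (int t = 0; t < trials; t++)
        sink += scan(a, std::rand());   // a different input each trial
    auto stop = std::chrono::steady_clock::now();

    std::chrono::duration<double> total = stop - start;
    // Average-case estimate: total running time / number of trials.
    std::cout << "average seconds: " << total.count() / trials
              << " (sink " << sink << ")\n";
}

Dividing the total elapsed time by the number of trials gives the
average-case estimate described above.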
Asymptotic Notation
 The expressions for the best, average and worst cases of f(n) are
represented in the form of a simpler function g(n).
 Big O Notation:
Asymptotic Notation
 If the input size is n and the time taken by the algorithm is f(n), then
for Big O the graph of c·g(n) lies above f(n).
f(n) ≤ c·g(n)
for all n ≥ n0,
where c > 0 and n0 ≥ 1.
If these conditions are satisfied, then we say that
f(n) = O(g(n)).
Example
f(n) = 3n + 2
g(n) = n
Show that f(n) = O(g(n)).

 We should calculate the two constants c and n0.

If f(n) = O(g(n)), then f(n) ≤ c·g(n) must hold for some c > 0 and n0 ≥ 1:
3n + 2 ≤ cn
If c = 4:
3n + 2 ≤ 4n
⟹ n ≥ 2
So, with c = 4 and n ≥ 2,
3n + 2 is O(g(n)) = O(n).
Note: g(n) = n² or n³ would also satisfy the inequality, because Big O is
an upper bound; but the bound should be the tightest one, and the tightest
bound here is g(n) = n.
Omega Notation (Ω)
[Graph: for Omega, the curve c·g(n) lies below f(n).]
f(n) ≥ c·g(n)
for some c > 0 and n0 ≥ 1, for all n ≥ n0.
Example:
f(n) = 3n + 2
g(n) = n
If f(n) ≥ c·g(n):
3n + 2 ≥ cn
With c = 1 and n0 = 1 it is true.
So, 3n + 2 is Ω(g(n)) = Ω(n).
Theta Notation (𝜽)
 A function g(n) that serves as an upper bound as well as a lower bound
for f(n).

c1·g(n) ≤ f(n) ≤ c2·g(n)
f(n) = Θ(g(n))
for c1, c2 > 0 and n0 ≥ 1, for all n ≥ n0.
Example:
f(n) = 3n + 2
g(n) = n
Upper bound: f(n) ≤ c·g(n): 3n + 2 ≤ 4n, where c = 4; valid for all n ≥ 2.
Lower bound: f(n) ≥ c·g(n): 3n + 2 ≥ n, where c = 1; valid for all n ≥ 1.
So for the upper bound c2 = 4 and for the lower bound c1 = 1, and
3n + 2 = Θ(n).
Notations

Big O  →  Worst Case
Omega (Ω)  →  Best Case
Theta (𝜽)  →  Average Case
(Strictly speaking, each notation is a bound that can be applied to any
case; the mapping above is the common informal usage.)
Asymptotic Notations
 We are usually interested in finding the worst case, not the best case.
 Example: 9, 4, 7, 1, 3, 7, 8, 10, 31, 12, 98, 100
   Search for a given element in the above series using linear search.
 Searching for 9 (the first element) is the best case.
 Searching for 100 (the last element) is the worst case.
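A small C++ sketch of linear search on the slide's array, illustrating the
two cases (the function name is illustrative):

#include <iostream>

// Linear search: returns the index of x in a[0..n-1], or -1 if absent.
int linearSearch(const int a[], int n, int x) {
    for (int i = 0; i < n; i++)
        if (a[i] == x)
            return i;       // found after i + 1 comparisons
    return -1;
}

int main() {
    int a[] = {9, 4, 7, 1, 3, 7, 8, 10, 31, 12, 98, 100};
    int n = sizeof(a) / sizeof(a[0]);
    std::cout << linearSearch(a, n, 9) << "\n";    // best case: 1 comparison
    std::cout << linearSearch(a, n, 100) << "\n";  // worst case: n comparisons
}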
Complexity Calculation

 f(n) is not the exact running time but an approximation of the running
time of the algorithm.
Complexity Calculation
There are two types of algorithms:
 Iterative
 Recursive
Complexity Calculation
Iterative:
fun()
{
    for (i = 1 to n)
        max(a, b, c);
    ...
}
Complexity Calculation
Recursive:
fun(n)
{
    if (...)
        fun(n/2);
}
Complexity Calculation
 Any iterative algorithm can be converted into a recursive one, and any
recursive algorithm can be converted into an iterative one.
 Analysis: for a recursive algorithm A(n) that calls A(n/2) inside, we
work with a recurrence relating f(n) and f(n/2).
 For an iterative algorithm, we calculate the complexity on the basis of
the number of iterations.
Complexity Calculation
A()
{
    int i;
    for (i = 1 to n)
        cout << "NITW";
}

So, the loop runs n times and prints NITW n times. The complexity of the
algorithm is O(n).
Complexity Calculation
A()
{
    int i, j;
    for (i = 1 to n)
        for (j = 1 to n)
            cout << "NITW";
}

For each value of i, the inner loop prints NITW n times:
After i = 1, NITW has been printed n times
After i = 2, NITW has been printed 2n times
After i = 3, NITW has been printed 3n times
...
After i = n, NITW has been printed n*n times.
So, the complexity is O(n²).
Complexity Calculation
A()
{
    i = 1; S = 1;
    while (S <= n)
    {
        i++;
        S = S + i;
        cout << "NITW";
    }
}

i = 1  2  3  4   5   6  ...  k
S = 1  3  6  10  15  21 ...  n

For some value k, S reaches n; S is the sum of the first i natural numbers.
So the loop stops when
k(k+1)/2 = (k² + k)/2 > n, which gives k = O(√n).
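A quick C++ sketch (not from the slides) that runs the same loop, counts
its iterations, and compares the count with √(2n):

#include <cmath>
#include <iostream>

int main() {
    for (int n = 100; n <= 1000000; n *= 10) {
        int i = 1, S = 1, count = 0;
        while (S <= n) {   // the same loop as on the slide
            i++;
            S = S + i;
            count++;
        }
        // count grows like sqrt(2n), i.e. O(sqrt(n))
        std::cout << n << ": " << count << " iterations, sqrt(2n) = "
                  << std::sqrt(2.0 * n) << "\n";
    }
}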
Complexity Calculation
A()
{
    int i;
    for (i = 1; i*i <= n; i++)
        cout << "NITW";
}

The condition i² ≤ n is the same as i ≤ √n, so the loop executes √n times.
The complexity is O(√n).
Complexity Calculation
A()
{
    int i, j, k, n;
    for (i = 1; i <= n; i++)
    {
        for (j = 1; j <= i; j++)
        {
            for (k = 1; k <= 100; k++)
                cout << "NITW";
        }
    }
}
Complexity Calculation
For i = 1: j runs 1 time,  k runs 100 times
For i = 2: j runs 2 times, k runs 2*100 times
For i = 3: j runs 3 times, k runs 3*100 times
For i = 4: j runs 4 times, k runs 4*100 times
For i = 5: j runs 5 times, k runs 5*100 times
...
For i = n: j runs n times, k runs n*100 times

In total, k executes: 100 + 2*100 + 3*100 + 4*100 + ... + n*100
= 100(1 + 2 + 3 + ... + n)
= 100 * n(n+1)/2
So the complexity is O(n²).
Complexity Calculation
A()
{
    int i, j, k, n;
    for (i = 1; i <= n; i++)
    {
        for (j = 1; j <= i*i; j++)
        {
            for (k = 1; k <= n/2; k++)
            {
                cout << "NITW";
            }
        }
    }
}
Complexity Calculation
For i = 1: j runs 1 time,   k runs (n/2)*1 times
For i = 2: j runs 4 times,  k runs (n/2)*4 times
For i = 3: j runs 9 times,  k runs (n/2)*9 times
For i = 4: j runs 16 times, k runs (n/2)*16 times
For i = 5: j runs 25 times, k runs (n/2)*25 times
...
For i = n: j runs n² times, k runs (n/2)*n² times

In total it executes: (n/2)(1 + 4 + 9 + ... + n²)
= (n/2) * n(n+1)(2n+1)/6
So the complexity is O(n⁴).
Complexity Calculation
A()
{
    for (i = 1; i < n; i = i*2)
        cout << "NITW";
}

i takes the values 1, 2, 4, ..., n
= 2⁰, 2¹, 2², ..., 2ᵏ
The loop stops when 2ᵏ = n:
2ᵏ = n
k = log₂ n
Complexity = O(log₂ n)
Complexity Calculation
A()
{
    int i, j, k;
    for (i = n/2; i <= n; i++)
        for (j = 1; j <= n/2; j++)
            for (k = 1; k <= n; k = k*2)
                cout << "NITW";
}

The first loop takes   →  n/2
The second loop takes  →  n/2
The third loop takes   →  log₂ n
Total time  →  (n/2)*(n/2)*log₂ n
= O(n² log₂ n)
Complexity Calculation
A()
{
    int i, j, k;
    for (i = n/2; i <= n; i++)
        for (j = 1; j <= n; j = 2*j)
            for (k = 1; k <= n; k = 2*k)
                cout << "NITW";
}

Total = (n/2) * log₂ n * log₂ n
= (n/2)(log₂ n)²
= O(n (log₂ n)²)
Complexity for Recursive
A(n)
{
    if (condition)
        return (A(n/2) + A(n/2));
}

 If the complexity of A(n) is T(n), then for the above segment:
T(n) = c + 2T(n/2)
where c is a constant (for the addition).
Back Substitution
A(n)
{
    if (n > 1)
        return (A(n-1));
}
Some constant time for if (n > 1), so the total time is:
T(n) = 1 + T(n-1)      (1)
T(n-1) = 1 + T(n-2)    (2)
T(n-2) = 1 + T(n-3)    (3)
Put (2) in (1):
T(n) = 1 + 1 + T(n-2)
T(n) = 1 + 1 + 1 + T(n-3)
...
T(n) = k + T(n-k)
Complexity for Recursive
When does it stop? The argument of T(n-k) keeps decreasing, because of the
condition (n > 1), until the base case n = 1 is reached, where T(1) = 1.
So set n - k = 1, i.e. k = n - 1:
T(n) = (n-1) + T(n-(n-1))
T(n) = n - 1 + T(1)
T(n) = n
So, it is O(n).
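A runnable C++ version of this recursion, as a sketch (the slide's A(n) is
pseudocode):

// Each call does O(1) work and makes one call on n-1,
// so T(n) = 1 + T(n-1), i.e. A runs in O(n) time.
void A(int n) {
    if (n > 1)
        A(n - 1);
}

int main() {
    A(10);   // 10 nested calls: A(10), A(9), ..., A(1)
    return 0;
}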
Complexity for Recursive
T(n) = n + T(n-1),  n > 1
     = 1,           n = 1
Solution:
T(n-1) = (n-1) + T(n-2)
T(n-2) = (n-2) + T(n-3)
T(n) = n + (n-1) + T(n-2)
     = n + (n-1) + (n-2) + T(n-3)
     = n + (n-1) + (n-2) + ... + (n-k) + T(n-(k+1))
Stopping criterion (base condition): n = 1, where T(1) = 1.
So, n - (k+1) = 1, which gives k = n - 2:
T(n) = n + (n-1) + (n-2) + ... + 2 + T(1)
     = n + (n-1) + (n-2) + ... + 2 + 1
     = n(n+1)/2 = O(n²)
Recursion Tree Method
T(n) = 2T(n/2) + c,  n > 1
     = c,            n = 1

[Recursion tree: the root does c work and splits into two subproblems of
size n/2; each level doubles the number of nodes until T(1) = T(n/n) is
reached.]

 When the tree reaches T(1), it stops.
 Total work done at the first level: c
 At the second level: 2c
 At the third level: 4c
 ... down to the last level, where T(1) = T(n/n).
Complexity for Recursive
 Total work done = c + 2c + 4c + ... down to the leaves
= c(1 + 2 + 4 + ... + 2ᵏ), where 2ᵏ = n
= c · (2^(k+1) − 1)/(2 − 1)
= c(2^(k+1) − 1)
= c(2n − 1)
= O(n)
Complexity for Recursive
T(n) = 2T(n/2) + n,  n > 1
     = 1,            n = 1
Complexity for Recursive
[Recursion tree:
                   n
             /           \
         (n/2)           (n/2)
         /    \          /    \
    T(n/4)  T(n/4)  T(n/4)  T(n/4)
Each level does n total work.]
Complexity for Recursive
 At the bottom we have n leaves, and the total work done at each level
is n.
 The subproblem sizes are n/2⁰, n/2¹, n/2², ..., n/2ᵏ = 1, so
n = 2ᵏ and k = log₂ n.
 With (log n + 1) levels doing n work each, the total work is
n(log n + 1)
= O(n log n)
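A minimal C++ sketch (an assumption, not from the slides) of a function
whose step count follows T(n) = 2T(n/2) + n, so the count can be checked
against n log n:

#include <iostream>

// Counts the "unit work" steps of a function that does n work locally
// and recurses on two halves: T(n) = 2T(n/2) + n.
long long work(int n) {
    if (n <= 1) return 1;        // base case: T(1) = 1
    long long steps = n;         // n units of work at this level
    steps += work(n / 2);        // left half:  T(n/2)
    steps += work(n / 2);        // right half: T(n/2)
    return steps;
}

int main() {
    for (int n = 1; n <= 1024; n *= 2)
        std::cout << n << ": " << work(n) << "\n";  // grows like n log n
}

For powers of two the count is exactly n·log₂n + n, matching the
O(n log n) bound derived above.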
Complexity for Recursive
// Sum returns the sum 1 + 2 + ... + n, where n >= 1.
// Each call does O(1) work: T(n) = T(n-1) + O(1), so Sum runs in O(n) time.
func Sum(n int) int64 {
	if n == 1 {
		return 1
	}
	return int64(n) + Sum(n-1)
}
Complexity for Recursive
// Find returns the smallest index i at which x <= a[i], or len(a) if
// there is no such index. The slice a must be sorted. Each call halves
// the search range: T(n) = T(n/2) + O(1) = O(log n).
func Find(a []int, x int) int {
	switch len(a) {
	case 0:
		return 0
	case 1:
		if x <= a[0] {
			return 0
		}
		return 1
	}
	mid := 1 + (len(a)-1)/2
	if x <= a[mid-1] {
		return Find(a[:mid], x)
	}
	return mid + Find(a[mid:], x)
}
Abstract Data Type
 An ADT is a mathematical model for data types where a data type is
defined by its behavior (semantics) from the point of view of a user of
the data, specifically in terms of possible values, possible operations
on data of this type, and the behavior of these operations.
 An abstract data structure is just some arrangement of data that we
have built into an orderly arrangement.
Abstract Data Type
 The reason we use abstract structures is that they use memory
efficiently, based on the design of the data stored in them.
 With very large amounts of data, or very frequently changing data, the
data structure can make a huge difference in the efficiency (run time) of
your computer program.
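As an illustration, here is a minimal stack ADT sketch in C++ (an
assumption, not from the slides): the user of the type sees only the
operations; the array underneath is an implementation detail.

#include <iostream>

// A stack ADT: users see only the operations (push, pop, isEmpty);
// the fixed-size array underneath is a hidden implementation detail.
class Stack {
    int data[100];   // hidden representation
    int top = 0;
public:
    bool isEmpty() const { return top == 0; }
    bool push(int x) {
        if (top == 100) return false;   // stack is full
        data[top++] = x;
        return true;
    }
    bool pop(int &x) {
        if (isEmpty()) return false;    // stack is empty
        x = data[--top];
        return true;
    }
};

int main() {
    Stack s;
    s.push(1);
    s.push(2);
    int x;
    while (s.pop(x))
        std::cout << x << "\n";   // prints 2, then 1 (LIFO order)
}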
Some types of abstract data
structures
Static and dynamic data structure
Arrays
Two-dimensional arrays
Stack
Queue
Linked list
Tree
Binary tree
Collections
Lists
Dictionaries
Sets
Tuple
Abstract Data Type
Comparison of different data
structures

Data Structure    Strengths                            Weaknesses
arrays            Sorting and searching; iterating     Inserting and deleting elements,
                  through the collection               especially at the beginning or
                                                       the end of the array
linked list       Easy to create and use               Direct indexing / direct access;
                                                       searching and sorting
stack and queue   Designed for LIFO / FIFO             Direct access; searching and
                                                       sorting
binary tree       Speed of insertion and deletion;     Some overhead
                  speed of access; maintaining
                  sorted order
Comparison of static vs dynamic
data structures

Static data structure                            Dynamic data structure
Inefficient, as memory is allocated that may     Efficient, as the amount of memory varies
not be needed.                                   as needed.
Fast access to each element of data, as the      Slower access to each element, as the
memory location is fixed when the program        memory location is allocated at run-time.
is written.
Memory addresses allocated will be contiguous,   Memory addresses allocated may be
so quicker to access.                            fragmented, so slower to access.
Structures are a fixed size, making them more    Structures vary in size, so there needs to
predictable to work with. For example, they      be a mechanism for knowing the size of the
can contain a header.                            current structure.
The relationship between different elements      The relationship between different elements
of data does not change.                         of data will change as the program is run.
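A small C++ sketch (illustrative, not from the slides) contrasting the
two: a static array whose size is fixed when the program is written versus
a block allocated at run-time:

#include <cstdlib>
#include <iostream>

int main() {
    // Static: size fixed when the program is written; contiguous memory.
    int fixed[100];
    fixed[0] = 1;

    // Dynamic: size chosen at run-time; the program must free it later.
    int n = 0;
    std::cin >> n;
    if (n <= 0) return 1;
    int *grow = (int *) std::malloc(n * sizeof(int));
    if (grow != NULL) {
        grow[0] = 1;
        std::cout << fixed[0] + grow[0] << "\n";   // prints 2
        std::free(grow);
    }
    return 0;
}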
malloc and calloc
The argument of malloc() is an unsigned integer, or an expression which
evaluates to an unsigned integer.
Since the function malloc() returns a void pointer, it has to be cast to
the type of data being dealt with.
The prototype of the function is given below:
void* malloc(size_t size);
malloc and calloc
#include <stdlib.h>
int *ptri;
ptri = (int*) malloc(n * sizeof(int));

 The function malloc allocates n*sizeof(int) bytes of memory and returns
a pointer, which is stored in ptri.
malloc and calloc
struct Student
{
    char Name[30];
    int grade;
};

If it is required to dynamically create n such structures, the memory
allocation for the n structures may be done as follows:

struct Student *Pst;
Pst = (struct Student*) malloc(n * sizeof(struct Student));
calloc()
Allocate and zero-initialize an array.
Allocates a block of memory for an array of num elements, each of them
size bytes long, and initializes all its bits to zero.

void* calloc(size_t num, size_t size);

Parameters
 num  - number of elements to allocate.
 size - size of each element.
size_t is an unsigned integral type.

Return Value
 On success, a pointer to the memory block allocated by the function.