
Searching & Sorting Algorithms
Dr. Akash Kumar
Gautam Buddha University, Greater Noida
Objective
This lab aims to implement various searching and sorting algorithms
and analyze their time complexity.
• Linear Search
• Binary Search
• Bubble Sort
• Insertion Sort
• Selection Sort
• Quick Sort
• Merge Sort
Searching

• Searching is a method for finding an element in a data structure such as an array, linked list, tree, or
graph.

• It decides whether a search key is present in the data or not.

• It is the algorithmic process of finding a particular item in a collection of items.


Searching

• Sequential Search
• Binary Search
Sequential Search Example

• Sequential search is also called linear search.


• Sequential search starts at the beginning of the list and checks every element of the list.
• It compares the search key with each element in turn until a match is found or the end of the list is reached.
Sequential/Linear Search Algorithm
int linearSearch(int arr[], int n, int target)
{
    for (int i = 0; i < n; i++)
    {
        if (arr[i] == target) return i;   // target found at index i
    }
    return -1;                            // target not present in the array
}
Time complexity

• Best Case:- Ω(1)

• Average and Worst Case:- O(n)


Binary search
• The array must be sorted.

• Compare the target element with the middle item of the array.

• Middle index = (low + high) / 2

• If a match occurs, the index of that item is returned.

• If the middle element > target element, search the left half of the array;
otherwise, search the right half.

• This process continues on the remaining sub-array until the target is found or the sub-array is empty.

Binary Search Example
Binary Search Algorithm
int binarySearch(int arr[], int low, int high, int target)
{
    while (low <= high)
    {
        int mid = (low + high) / 2;
        if (target == arr[mid]) { return mid; }   // target found: return its index
        if (target < arr[mid])
            high = mid - 1;                       // continue in the left half of the array
        else
            low = mid + 1;                        // continue in the right half of the array
    }
    return -1;                                    // target not found: return -1
}
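A minimal driver sketch for the two search functions above (the sample values are illustrative; the array is kept sorted so that binary search is valid):

#include <stdio.h>

int main(void)
{
    int arr[] = {3, 9, 10, 27, 38, 43, 82};                 // already sorted, so binary search may be used
    int n = sizeof(arr) / sizeof(arr[0]);

    printf("linear search for 27: index %d\n", linearSearch(arr, n, 27));
    printf("binary search for 27: index %d\n", binarySearch(arr, 0, n - 1, 27));
    printf("binary search for 50: index %d\n", binarySearch(arr, 0, n - 1, 50));   // not present: prints -1
    return 0;
}

Both searches for 27 print index 3; searching for 50 prints -1 because the value is absent.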
Time complexity

• Best Case:- Ω(1)

• Average and Worst Case:- O(log n)


Linear Vs Binary Search

Linear Search                                      Binary Search

Input data need not be sorted.                     Input data must be in sorted order.

Also called sequential search.                     Also called half-interval search.

Time complexity: O(n).                             Time complexity: O(log n).

Works on multidimensional arrays as well.          Typically applied to a single-dimensional array.

Less complex to implement.                         More complex to implement.


Important Terms in Sorting
Algorithm
 Increasing Order:- Each successive element is greater than the previous one. For example, 10, 25, 35, 45, 60
 Decreasing Order:- Each successive element is less than the previous one. For example, 60, 45, 35, 25, 10
 Non-decreasing Order:- Each successive element is greater than or equal to the previous one. For example, 10, 25, 35, 35, 45, 60
 Non-increasing Order:- Each successive element is less than or equal to the previous one. For example, 60, 45, 35, 35, 25, 10
(A small check function for these orders follows.)
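A minimal sketch of how these definitions translate into code; the function name isNonDecreasing is illustrative and not part of the original slides:

int isNonDecreasing(int arr[], int n)
{
    for (int i = 1; i < n; i++)
    {
        if (arr[i] < arr[i - 1]) return 0;   // order violated: this element is smaller than its predecessor
    }
    return 1;                                // every successive element is >= the previous one
}

Changing the test to arr[i] <= arr[i - 1] would instead check for strictly increasing order, since equal neighbours are then no longer allowed.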
Sorting
● Sorting refers to arranging data in a particular format.
● Sorting algorithm specifies the way to arrange data in a particular
order.
● The choice of the algorithm depends on the criteria of the task.
Analysis Criteria for Sorting
Algorithm
 Time Complexity
 Space Complexity (In-place and Not-In-place sorting)
 Comparison sorting
 Stable and Unstable sorting
 Internal and External sorting
 Recursive and Non-Recursive sorting
 Adaptive and Non-Adaptive sorting
Analysis Criteria for Sorting Algorithm
 Time Complexity:- The best-case and worst-case time complexities of these sorting algorithms range from O(n) to O(n^2).
 Space Complexity
 In-place sorting:- No extra space is required beyond temporary storage for a few data
elements. For example:- Bubble, Insertion, Selection, Quick sort
 Not-in-place sorting:- Requires extra space proportional to the input for temporary storage of
data elements. For example:- Merge sort
 Comparison sorting:- Sorts by comparing elements with each other. Example:- Bubble, Insertion, Selection, Quick sort
 Stable sorting:- After sorting, the relative order of elements with equal values is
preserved.
 Unstable sorting:- After sorting, the relative order of elements with equal values may
change.
Stable Sorting
Input:  25 10 35a 45 60 35b   ->   Output: 10 25 35a 35b 45 60   (the two equal keys keep their original relative order)

Unstable Sorting
Input:  25 10 35a 45 60 35b   ->   Output: 10 25 35b 35a 45 60   (the two equal keys may swap their relative order)

(A second example on the slides uses the list 2 1 1 4 5 3 in the same way. A small code sketch illustrating stability follows.)
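A minimal sketch of how stability can be observed in code, assuming a record type that carries a sort key plus its original position (the struct and function names are illustrative): a stable sort keeps records with equal keys in increasing order of pos.

#include <stdio.h>

struct Record { int key; int pos; };                 // pos = original position, used only to observe stability

/* Insertion sort on the key field: stable, because records with equal keys are never moved past each other. */
void insertionSortRecords(struct Record a[], int n)
{
    for (int i = 1; i < n; i++)
    {
        struct Record r = a[i];
        int j = i - 1;
        while (j >= 0 && a[j].key > r.key)           // strict >: an equal key stops the shifting
        {
            a[j + 1] = a[j];
            j--;
        }
        a[j + 1] = r;
    }
}

int main(void)
{
    struct Record a[] = {{25,0},{10,1},{35,2},{45,3},{60,4},{35,5}};   // the first example list above
    insertionSortRecords(a, 6);
    for (int i = 0; i < 6; i++) printf("%d(pos %d) ", a[i].key, a[i].pos);
    printf("\n");                                    // the 35 from pos 2 prints before the 35 from pos 5
    return 0;
}

Running the same records through the selection sort shown later can move the first 35 past the second one during a swap, which is why the deck lists selection sort as unstable.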
Analysis Criteria for Sorting
Algorithm
 Internal sorting:- All the data fits in main memory while it is being sorted.
 External sorting:- The data does not fit in main memory, so it is sorted using external storage.

 Recursive sorting:- Uses recursion, e.g. Merge Sort and Quick Sort.


 Adaptive sorting:- Takes advantage of elements that are already sorted in the input
list.

 Non-adaptive sorting:- Does not take already-sorted elements into account; every element is
processed regardless of the input order.
A Few Important Sorting Algorithms

● Bubble Sort
● Selection Sort
● Insertion Sort
● Merge Sort
● Quick Sort
Bubble Sort

● It is a simple, in-place comparison-based sorting algorithm.

● Each pair of adjacent elements is compared and swapped if they are in the
wrong order.
● The algorithm repeats this process until the list is sorted.
Bubble Sort Example
Bubble Sort Algorithm (Code)

void bubbleSort(int arr[], int n)
{
    int i, j, value;
    for (j = 0; j < n - 1; j++)                 // pass j leaves the largest remaining element at the end
    {
        for (i = 0; i < n - 1 - j; i++)         // traverse the unsorted part of the array
        {
            if (arr[i] > arr[i + 1])
            {
                value = arr[i + 1];             // swap the adjacent pair that is out of order
                arr[i + 1] = arr[i];
                arr[i] = value;
            }
        }
    }
}
Time complexity

● Best Case:- Ω(n) for an already sorted array (with the early-exit check sketched below)


● Average and Worst Case:- O(n^2)
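The Ω(n) best case relies on stopping as soon as a pass performs no swap; a minimal sketch of that variant (the swapped flag is the only addition to the function above, and the name bubbleSortAdaptive is illustrative):

void bubbleSortAdaptive(int arr[], int n)
{
    int i, j, value, swapped;
    for (j = 0; j < n - 1; j++)                 // pass
    {
        swapped = 0;                            // no swap has happened in this pass yet
        for (i = 0; i < n - 1 - j; i++)         // traverse the unsorted part
        {
            if (arr[i] > arr[i + 1])
            {
                value = arr[i + 1];             // swap operation
                arr[i + 1] = arr[i];
                arr[i] = value;
                swapped = 1;
            }
        }
        if (!swapped) break;                    // already sorted: stop after a single O(n) pass
    }
}

On an already sorted input the first pass makes n-1 comparisons and no swaps, so the loop exits immediately; this is also where the "can be used to detect if a list is already sorted" advantage on the next slide comes from.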
Advantages of Bubble Sort

● Simple to implement and understand


● Stable
● Suitable for small datasets
● Can be used to detect if a list is already sorted
Disadvantages of Bubble Sort

● Poor performance: Bubble sort has a worst-case time complexity of O(n^2),


making it inefficient for large datasets. It requires multiple passes through the
array and can be slow for large lists.
● Not suitable for most real-world scenarios
Selection Sort
● It’s also an in-place comparison-based sorting algorithm.
● It works by repeatedly selecting the minimum element from the unsorted
portion of the array and placing it at the end of the sorted portion
● It repeats this process until the list is sorted.
Selection Sort (Basic Steps)
● Take the first element of the unsorted part as the initial minimum (min).
● Traverse the rest of the unsorted part to find the true minimum.
● Whenever a smaller element is found, make it the new minimum; when the traversal ends,
swap the minimum with the first unsorted element.
● Repeat the process on the remaining unsorted part of the array.
Selection Sort Example
7 4 5 9 8 2 1   Unsorted   (min = 1, swap with 7)
1 4 5 9 8 2 7              (min = 2, swap with 4)
1 2 5 9 8 4 7              (min = 4, swap with 5)
1 2 4 9 8 5 7              (min = 5, swap with 9)
1 2 4 5 8 9 7              (min = 7, swap with 8)
1 2 4 5 7 9 8              (min = 8, swap with 9)
1 2 4 5 7 8 9   Sorted
Selection Sort Algorithm (Function)

void selectionSort(int arr[], int n)            // function definition
{
    int i, j, value;
    for (i = 0; i < n; i++)
    {
        int min = i;                            // select the first element of the unsorted part
        for (j = i + 1; j < n; j++)
        {
            if (arr[min] > arr[j]) { min = j; } // find the index of the minimum element
        }
        value = arr[i];                         // swap the minimum into position i
        arr[i] = arr[min];
        arr[min] = value;
    }
}
Time complexity

● Best, Average and Worst Case:- O(n^2)


Advantages of Selection Sort

● Simple to implement and understand


● Suitable for small datasets.
● In-place sorting
Disadvantages of Selection Sort

● Poor performance.
● Not suitable for most real-world scenarios.
● Unstable
Insertion Sort

● This is also an in-place comparison-based sorting algorithm like bubble and


selection sort.
● The lower part of the array is maintained in sorted order.
● Each element to be inserted into this sorted sub-list is moved to its
appropriate position.
● Repeat this process until the array is sorted.
Insertion Sort Example

1st Pass: 12 11 13 5 6  ->  11 12 13 5 6   (after 1st pass)

2nd Pass: 11 12 13 5 6  ->  11 12 13 5 6   (after 2nd pass: 13 is already in place)

3rd Pass: 11 12 13 5 6  ->  11 12 5 13 6  ->  11 5 12 13 6  ->  5 11 12 13 6   (after 3rd pass)

4th Pass: 5 11 12 13 6  ->  5 11 12 6 13  ->  5 11 6 12 13  ->  5 6 11 12 13   (after 4th pass)


Insertion Sort Algorithm (Function)

void insertionSort(int arr[], int n)            // function definition
{
    int i, j, key, value;
    for (i = 1; i < n; i++)                     // one pass per element to insert
    {
        key = arr[i];
        for (j = i - 1; j >= 0; j--)            // traverse the sorted left part of the array
        {
            if (arr[j] > key)
            {
                value = arr[j];                 // shift the larger element one position right
                arr[j] = arr[j + 1];
                arr[j + 1] = value;
            }
            else break;                         // key is in place: stop early (this makes the sort adaptive)
        }
    }
}
Time complexity

● Best Case:- Ω(n) for an already sorted array


● Average and Worst Case:- O(n^2)
Advantages of Insertion Sort
● Simple to implement and understand.
● Stable: Preserve the relative order of elements with equal keys.
● Suitable for small datasets.
● In-place sorting algorithm
● Adaptive: It can detect when the list is already sorted and stop early, making
it more efficient in these cases.
Disadvantages of Insertion Sort
● Poor performance: Average and worst-case time complexity of O(n^2), making
it inefficient for large datasets. It requires many element shifts, which makes it
slow for large arrays.

● Not suitable for most real-world scenarios: Due to its poor performance,
insertion sort is not suitable for use in most real-world scenarios where larger
datasets are involved. It is primarily used for educational purposes and small
datasets.
Merge Sort

● It is a divide-and-conquer algorithm for sorting an array or list of items


● First, it recursively divides the array into two halves until each piece can no longer be divided.
● Second, it merges the smaller lists back together in sorted order.
● Not-in-place:- requires O(n) extra space.
● Merge Sort is a recursive algorithm.
Merge Sort Example
(Figure: the divide phase repeatedly splits the array into halves; the conquer phase merges the sorted halves back together.)
Merge Sort Algorithm

void mergeSort(int arr[], int low, int high)
{
    if (low < high)
    {
        int mid = (low + high) / 2;
        mergeSort(arr, low, mid);               // sort the left half
        mergeSort(arr, mid + 1, high);          // sort the right half
        merge(arr, low, mid, high);             // merge the two sorted halves
    }
}
Merge Function
void merge(int arr[], int low, int mid, int high)
{
    int i = low;                                 // index into the left half  arr[low..mid]
    int j = mid + 1;                             // index into the right half arr[mid+1..high]
    int k = low;                                 // index into the temporary array
    int arrB[high + 1];                          // temporary array (uses a C variable-length array)

    while (i <= mid && j <= high)                // pick the smaller front element of the two halves
    {
        if (arr[i] < arr[j]) { arrB[k] = arr[i]; i++; k++; }
        else                 { arrB[k] = arr[j]; j++; k++; }
    }
    while (i <= mid)  { arrB[k] = arr[i]; i++; k++; }   // copy any leftovers of the left half
    while (j <= high) { arrB[k] = arr[j]; j++; k++; }   // copy any leftovers of the right half
    for (i = low; i <= high; i++) { arr[i] = arrB[i]; } // copy the merged result back

}   // Example from the slide: merging 3 27 38 43 with 9 10 82 gives 3 9 10 27 38 43 82
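A minimal driver sketch (assuming merge is defined or prototyped before mergeSort; the sample values are illustrative and reuse the numbers from the merge example above):

#include <stdio.h>

int main(void)
{
    int arr[] = {38, 27, 43, 3, 9, 82, 10};
    int n = sizeof(arr) / sizeof(arr[0]);

    mergeSort(arr, 0, n - 1);                            // sort the whole array: indices 0 .. n-1

    for (int i = 0; i < n; i++) printf("%d ", arr[i]);   // prints 3 9 10 27 38 43 82
    printf("\n");
    return 0;
}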
Time complexity
● Best, Average and Worst Case:- O(n log n)

T(n) = 2T(n/2) + n            ... (1)
T(n/2) = 2T(n/2^2) + n/2      ... (2)
Put the value of T(n/2) from (2) into (1):
T(n) = 2(2T(n/2^2) + n/2) + n
     = 2^2 T(n/2^2) + n + n
     = 2^2 T(n/2^2) + 2n
     = 2^i T(n/2^i) + i*n
Put n/2^i = 1, i.e. i = log n:
T(n) = 2^(log n) T(1) + n log n
     = n*T(1) + n log n
     = O(n log n)
Merge sort Advantages

● It is a stable sort.
● Suitable for sorting large data sets.
● The divide-and-conquer strategy allows the two halves to be sorted in parallel,
which can be beneficial when the algorithm needs to run efficiently on
multiple cores.
Merge Sort Disadvantages

● Not-In-Place
● It may not be efficient for small arrays, as the overhead of the divide-and-
conquer strategy can outweigh the benefits for small data sets.
● It's not always the best choice when working with real-time systems where
memory and time constraints are strict.
Quick Sort
● Quick sort uses divide and conquer to gain the same advantages as the merge
sort, while not using additional storage.
● It selects a "pivot" element from the array and partitions the other elements into
two parts:
1. elements less than the pivot on the left side
2. elements greater than the pivot on the right side.
● The pivot element is then in its correct position in the sorted array.
● The partitioning step is then repeated recursively on the two sub-arrays until
the entire array is sorted.
Quick Sort Example
Quick Sort Algorithm

void quickSort(int arr[], int low, int high)
{
    if (low < high)
    {
        int p = partition(arr, low, high);      // p is the partitioning index; arr[p] is now in its final place
        quickSort(arr, low, p - 1);             // sort the part before p
        quickSort(arr, p + 1, high);            // sort the part after p
    }
}
Partition Function
int partition(int a[], int low, int high)
{
    int i = low, j = high, value;
    int pivot = a[low];                                   // take the first element as the pivot

    while (i < j)
    {
        while (i <= high && a[i] <= pivot) { i++; }       // move right until an element greater than the pivot is found
        while (a[j] > pivot) { j--; }                     // move left until an element <= the pivot is found
        if (i < j)
        {
            value = a[i];                                 // swap the larger element from the left side
            a[i] = a[j];                                  // with the smaller element from the right side
            a[j] = value;
        }
    }
    value = a[low];                                       // place the pivot at its correct position
    a[low] = a[j];
    a[j] = value;
    return j;                                             // return the pivot position to quickSort
}
Time complexity

● Best/Average Case: if the pivot splits the array into roughly equal halves (for example, by
picking the middle element as the pivot, as sketched below), the time complexity is θ(n log n).

● Worst Case: on an already sorted list where the first or last element is always picked as
the pivot, the time complexity is O(n^2).
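A minimal sketch of one way to pick the middle element as the pivot while reusing the partition function above: swap it to the front first, then partition as usual (quickSortMid is an illustrative name, not part of the original slides):

void quickSortMid(int arr[], int low, int high)
{
    if (low < high)
    {
        int mid = (low + high) / 2;             // choose the middle element as the pivot
        int value = arr[low];                   // move it to the front so partition() can use it
        arr[low] = arr[mid];
        arr[mid] = value;

        int p = partition(arr, low, high);
        quickSortMid(arr, low, p - 1);
        quickSortMid(arr, p + 1, high);
    }
}

On an already sorted input the middle element splits the range roughly in half at every level, which avoids the O(n^2) behaviour described in the worst case below.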
Best and Average Case:- Time complexity
T(n) = 2T(n/2) + n            ... (1)
T(n/2) = 2T(n/2^2) + n/2      ... (2)
Put the value of T(n/2) from (2) into (1):
T(n) = 2(2T(n/2^2) + n/2) + n
     = 2^2 T(n/2^2) + n + n
     = 2^2 T(n/2^2) + 2n
     = 2^i T(n/2^i) + i*n
Put n/2^i = 1, i.e. i = log n:
T(n) = 2^(log n) T(1) + n log n
     = n*T(1) + n log n
     = O(n log n)
Worst Case:- Time complexity
Example: the already sorted list 10 20 30 40 50 with the first element as the pivot. Each partition call places only the pivot and leaves all remaining elements on one side; the recurrence is worked out below.
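A sketch of the worst-case recurrence under that assumption (each call does O(n) partitioning work but recurses on only n-1 elements), written in the same style as the earlier derivation:

T(n) = T(n-1) + n
     = T(n-2) + (n-1) + n
     = T(n-3) + (n-2) + (n-1) + n
     = ...
     = T(1) + 2 + 3 + ... + n
     = O(n^2)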
Advantages of Quick Sort

● It is considered one of the fastest sorting algorithms, with an average time


complexity of O(n log n).
● It does not require additional memory for sorting, as it sorts in place.
Disadvantages of Quick Sort

● It is not a stable sorting algorithm.


● It is not adaptive.
Comparison based on time complexity, memory, stability and adaptivity

Algorithm        Time Complexity   In-place   Stable   Adaptive
Bubble sort      O(n^2)            Yes        Yes      Yes
Selection sort   O(n^2)            Yes        No       No
Insertion sort   O(n^2)            Yes        Yes      Yes
Quick sort       O(n log n)        Yes        No       No
Merge sort       O(n log n)        No         Yes      No


Time Complexity comparison

Algorithm        Best            Average         Worst
Bubble sort      Ω(n)            θ(n^2)          O(n^2)
Selection sort   Ω(n^2)          θ(n^2)          O(n^2)
Insertion sort   Ω(n)            θ(n^2)          O(n^2)
Merge sort       Ω(n log n)      θ(n log n)      O(n log n)
Quick sort       Ω(n log n)      θ(n log n)      O(n^2)
