
Advanced Sorting

▪ Shell Sort
▪ Quick Sort
▪ Merge Sort

Shell Sort
Shell sort is an extended version of insertion sort.
In insertion sort, an element can be moved ahead by only one position at a time.
Moving an element to a far-away position therefore requires many movements, which increases the algorithm's execution time. Shell sort overcomes this drawback of insertion sort.
The algorithm first sorts elements that are far away from each other, then progressively reduces the gap between them. This gap is called the interval.
To understand how the shell sort algorithm works, let's take an unsorted array; shell sort is easiest to understand via an example.
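Before working through the example, it helps to see how the intervals are generated. The sketch below (Python, with a hypothetical helper name `gap_sequence`) computes the sequence n/2, n/4, ..., 1 used throughout this section:

```python
def gap_sequence(n):
    """Return the shell sort intervals n//2, n//4, ..., 1 (hypothetical helper)."""
    gaps = []
    gap = n // 2
    while gap >= 1:
        gaps.append(gap)
        gap //= 2
    return gaps

print(gap_sequence(8))  # [4, 2, 1]
```

For an array of size 8, the passes therefore use the intervals 4, 2, and 1.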

Shell Sort
Example

How does the algorithm work?

Step 1: Find the interval (gap) of the array using n/2, where n is the size of the array.
In the first pass, n is equal to 8 (the size of the array), so the elements lie at an interval of 4 (n/2 = 4). Elements at this interval are compared and swapped if they are not in order.
Here, in the first pass, the element at position 0 is compared with the element at position 4.
Step 2: If the element at position 0 is greater, it is swapped with the element at position 4; otherwise, it stays where it is. This process continues for the remaining elements.
Step 3: Halve the interval (gap = gap/2) and repeat until the gap reaches 1.

At the interval of 4, the sub-lists are {33, 12}, {31, 17}, {40, 25}, {8, 42}.

Now we compare the values in every sub-list and, where required, swap them in the original array. After comparing and swapping, the updated array looks as follows:


In the second pass, elements lie at an interval of 2 (gap/2 = 2), where gap = 4.
• Now we take the interval of 2 to sort the rest of the array. With an interval of 2, two sub-lists are generated: {12, 25, 33, 40} and {17, 8, 31, 42}.

In the above example:
In the third pass, elements lie at an interval of 1 (gap/2 = 1), where gap = 2. Finally, we use the interval of 1 to sort the rest of the array; in this step, shell sort performs an ordinary insertion sort on the array elements.
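The passes above can be sketched in Python. This is a minimal shell sort, assuming the simple n/2, n/4, ..., 1 gap sequence described earlier:

```python
def shell_sort(arr):
    """Sort arr in place using the n/2, n/4, ..., 1 gap sequence."""
    n = len(arr)
    gap = n // 2
    while gap >= 1:
        # Gapped insertion sort: elements `gap` positions apart form sub-lists,
        # and each sub-list is kept in sorted order.
        for i in range(gap, n):
            temp = arr[i]
            j = i
            while j >= gap and arr[j - gap] > temp:
                arr[j] = arr[j - gap]
                j -= gap
            arr[j] = temp
        gap //= 2  # reduce the interval until it reaches 1
    return arr

print(shell_sort([33, 31, 40, 8, 12, 17, 25, 42]))  # [8, 12, 17, 25, 31, 33, 40, 42]
```

With gap = 4, this performs exactly the pairwise comparisons {33, 12}, {31, 17}, {40, 25}, {8, 42} shown in the example.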

Exercise
23 29 15 19 31 7 9 5 2

Time complexity of the shell sort algorithm

➢ Best Case Complexity - Occurs when no sorting is required, i.e., the array is already sorted. The best-case time complexity of shell sort is O(n*log n).
➢ Average Case Complexity - Occurs when the array elements are in jumbled order, neither properly ascending nor properly descending. The average-case time complexity of shell sort is O(n*log n).
➢ Worst Case Complexity - Occurs when the array elements must be sorted in reverse order; that is, you have to sort the elements in ascending order, but they are given in descending order. The worst-case time complexity of shell sort is O(n²).
Quick Sort
This algorithm follows the divide and conquer approach.
Divide and conquer is a technique of breaking a problem down into subproblems, solving the subproblems, and combining the results to solve the original problem.
Divide: First, pick a pivot element. Then partition (rearrange) the array into two sub-arrays such that each element in the left sub-array is less than or equal to the pivot and each element in the right sub-array is greater than the pivot.

Conquer: Recursively sort the two sub-arrays with quicksort.

The left and right sub-arrays are then partitioned using the same approach. This continues until only a single element remains in each sub-array.
Example: let pivot = 10 for the array 10 15 1 2 9 16 11.
The quicksort algorithm

Example 1: In the given array, we take the leftmost element as the pivot. So, in this case, a[left] = 24, a[right] = 27, and a[pivot] = 24.

Algorithm for partition:
i. while (left < right):
   1. if a[left] <= pivot, advance left (left++); else stop
   2. if a[right] > pivot, retreat right (right--); else stop
   3. swap(a[left], a[right])
ii. when (left >= right):
   swap(pivot, a[right])
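The partition scheme above can be sketched in Python. This is a minimal version assuming the leftmost element is the pivot, as in Example 1; the function names are illustrative:

```python
def partition(a, lo, hi):
    """Partition a[lo..hi] around the pivot a[lo]; return the pivot's final index."""
    pivot = a[lo]
    left, right = lo + 1, hi
    while True:
        while left <= right and a[left] <= pivot:   # step 1: advance left
            left += 1
        while left <= right and a[right] > pivot:   # step 2: retreat right
            right -= 1
        if left > right:                            # pointers crossed: scanning done
            break
        a[left], a[right] = a[right], a[left]       # step 3: swap the out-of-place pair
    a[lo], a[right] = a[right], a[lo]               # place the pivot between the two regions
    return right

def quick_sort(a, lo=0, hi=None):
    if hi is None:
        hi = len(a) - 1
    if lo < hi:
        p = partition(a, lo, hi)
        quick_sort(a, lo, p - 1)   # conquer: sort the left sub-array
        quick_sort(a, p + 1, hi)   # conquer: sort the right sub-array
    return a

print(quick_sort([24, 9, 29, 14, 19, 27]))  # [9, 14, 19, 24, 27, 29]
```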

Example 2: 7 6 10 5 9 2 1 15
Example: pivot selected from the middle
Recursion Function

➢ Recursion is a technique in which a function calls itself repeatedly until a given condition is satisfied.
➢ When a recursive function is called, it executes a set of instructions and then calls itself to execute the same set of instructions with a smaller input. This process continues until a base case is reached: a condition that stops the recursion and returns a value.
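A minimal example of such a recursive function, computing a factorial:

```python
def factorial(n):
    # Base case: the condition that stops the recursion and returns a value.
    if n <= 1:
        return 1
    # Recursive case: the function calls itself with a smaller input.
    return n * factorial(n - 1)

print(factorial(5))  # 120
```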
In the above example:
To begin with, the middle element 6 is chosen as the bound (pivot), and the array is partitioned into two subarrays.
Each of these subarrays is then partitioned using the bounds 2 and 10, respectively.
In the third phase, two of the four subarrays have only one element, so they are not partitioned further.
The other two subarrays are partitioned once more, using the bounds 3 and 7.
Finally, we are left only with subarrays consisting of a single element. At this point, the subarrays and bounds are recombined successively, resulting in the sorted array.

Time complexity of the quicksort algorithm
➢ Best Case Complexity - In quicksort, the best case occurs when the pivot element is the middle element or near the middle element. The best-case time complexity of quicksort is O(n*log n).
➢ Average Case Complexity - Occurs when the array elements are in jumbled order, neither properly ascending nor properly descending. The average-case time complexity of quicksort is O(n*log n).
➢ Worst Case Complexity - In quicksort, the worst case occurs when the pivot element is always the greatest or smallest element. For example, if the pivot is always the last element of the array, the worst case occurs when the given array is already sorted in ascending or descending order. The worst-case time complexity of quicksort is O(n²).

Merge Sort
Merge sort is a sorting technique that follows the divide and conquer approach.
Merge sort also works by successively partitioning the array into two subarrays, but it guarantees that the subarrays are of approximately equal size, by calculating middle = (first + last)/2.
In merge sort, the array is partitioned without regard to the values of its elements: it is simply divided down the middle into two halves.
Each of these halves is recursively sorted using the same algorithm.
After the two subarrays have been sorted, they are merged back together again.
The recursion continues until the subarrays consist of a single element, in which case they are already sorted.
Example

Pseudocode for merge sort
Algorithm
➢ In the following algorithm, arr is the given array, beg is the index of the first element, and end is the index of the last element of the array.

Example array: 15 5 24 8 1 3 16 20

MERGE_SORT(arr, beg, end)
  if beg < end
    set mid = (beg + end)/2
    MERGE_SORT(arr, beg, mid)
    MERGE_SORT(arr, mid + 1, end)
    MERGE(arr, beg, mid, end)
  end of if
END MERGE_SORT
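The pseudocode above can be turned into runnable Python. This is a minimal sketch, with MERGE implemented via two temporary sub-lists:

```python
def merge(arr, beg, mid, end):
    """Merge the sorted runs arr[beg..mid] and arr[mid+1..end] back into arr."""
    left = arr[beg:mid + 1]
    right = arr[mid + 1:end + 1]
    i = j = 0
    k = beg
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:          # <= keeps the merge stable
            arr[k] = left[i]; i += 1
        else:
            arr[k] = right[j]; j += 1
        k += 1
    while i < len(left):                 # copy any leftover elements
        arr[k] = left[i]; i += 1; k += 1
    while j < len(right):
        arr[k] = right[j]; j += 1; k += 1

def merge_sort(arr, beg=0, end=None):
    if end is None:
        end = len(arr) - 1
    if beg < end:
        mid = (beg + end) // 2
        merge_sort(arr, beg, mid)        # sort the left half
        merge_sort(arr, mid + 1, end)    # sort the right half
        merge(arr, beg, mid, end)        # merge the two sorted halves
    return arr

print(merge_sort([15, 5, 24, 8, 1, 3, 16, 20]))  # [1, 3, 5, 8, 15, 16, 20, 24]
```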

Time complexity of the merge sort algorithm

➢ Best Case Complexity: Merge sort has a best-case time complexity of O(n*log n), for an already sorted array.
➢ Average Case Complexity: The average-case time complexity of merge sort is O(n*log n), which occurs when the elements are jumbled, i.e., neither in ascending nor in descending order.
➢ Worst Case Complexity: The worst-case time complexity is also O(n*log n), which occurs when we sort an array given in descending order into ascending order.

Summary
▪ Shell Sort: This is a variation of insertion sort that works by sorting elements that are far apart from each other first and then progressively reducing the gap between them. It is often used when the input data is nearly sorted.

▪ Merge Sort: This is a comparison-based, divide-and-conquer sorting algorithm that works by recursively splitting the array into two halves, sorting each half, and then merging the two sorted halves back together.

▪ Quick Sort: This is a divide-and-conquer algorithm that works by selecting a "pivot" element from the array and partitioning the other elements into two sub-arrays, according to whether they are less than or greater than the pivot. The sub-arrays are then sorted recursively.

Quick Sort vs Merge Sort
1. Partition of elements in the array: In merge sort, the array is always split into two halves (n/2 each), whereas in quicksort the array can be split in any ratio; there is no requirement that the parts be equal.
2. Worst case complexity: The worst-case complexity of quicksort is O(n²), as many comparisons are needed in the worst case, whereas in merge sort the worst case and average case have the same complexity, O(n log n).
3. Usage with datasets: Merge sort works well on data sets of any size, large or small, whereas quicksort does not handle very large datasets as well.
4. Additional storage space requirement: Merge sort is not in place, because it requires additional memory to store the auxiliary arrays, whereas quicksort is in place, requiring no additional storage beyond the recursion stack.
5. Efficiency: Merge sort is more efficient and works faster than quicksort for larger arrays or datasets, whereas quicksort is more efficient and works faster for smaller arrays or datasets.
Quick Sort vs Merge Sort
6. Sorting method: Quicksort is an internal sorting method, where the data is sorted in main memory, whereas merge sort is an external sorting method, used when the data to be sorted cannot be accommodated in memory and auxiliary storage is needed for sorting.
7. Stability: Merge sort is stable: two elements with equal value appear in the sorted output in the same order as in the unsorted input. Quicksort is unstable in this scenario, although it can be made stable with some changes to the code.
8. Preferred for: Quicksort is preferred for arrays, whereas merge sort is preferred for linked lists.
9. Locality of reference: Quicksort exhibits good cache locality, which makes it faster than merge sort in many cases (such as in a virtual memory environment).

End!