The document presents the selection sort and insertion sort algorithms. It demonstrates how selection sort works by repeatedly finding the smallest element in the unsorted portion of the array and swapping it into the sorted portion. It also shows how insertion sort inserts one element at a time into the sorted portion by shifting larger elements to make room. Both algorithms view the array as having a sorted portion that grows gradually as elements are added from the unsorted portion.
Topics covered: Sorting Order and Stability in Sorting; Concept of Internal and External Sorting; Bubble Sort; Insertion Sort; Selection Sort; Quick Sort; Merge Sort; Radix Sort; Shell Sort; External Sorting; Time Complexity Analysis of Sorting Algorithms.
The document discusses three quadratic sorting algorithms: selection sort, insertion sort, and bubble sort. It provides pseudocode for selection sort and insertion sort, and describes their operation through examples. Both selection sort and insertion sort have a worst-case and average-case runtime of O(n^2) where n is the number of elements to sort.
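To make the insertion-sort half of that description concrete, here is a minimal C++ sketch (the function and variable names are my own choice, not taken from the document's pseudocode):

```cpp
#include <vector>

// Insertion sort: grow a sorted prefix one element at a time,
// shifting larger elements right to make room for the next value.
void insertion_sort(std::vector<int>& a) {
    for (std::size_t i = 1; i < a.size(); ++i) {
        int key = a[i];          // next element to insert
        std::size_t j = i;
        while (j > 0 && a[j - 1] > key) {
            a[j] = a[j - 1];     // shift a larger element one slot right
            --j;
        }
        a[j] = key;              // drop key into its correct slot
    }
}
```

The O(n^2) worst case quoted above corresponds to reverse-sorted input, where every element is shifted past the entire sorted prefix.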
The document describes several sorting algorithms:
1) Bubble sort, selection sort, insertion sort, and merge sort are presented through examples of sorting arrays.
2) Quicksort and heapsort are also explained, with quicksort using a pivot element and heapsort building a max-heap structure.
3) For each algorithm, the key steps and operations are outlined, such as comparing and swapping elements in bubble and selection sort, and partitioning in quicksort.
This document is about searching and sorting techniques. It is a PDF covering one searching technique in data structures, binary search, and one sorting technique, quick sort.
Counting sort is an algorithm that sorts elements by counting the number of occurrences of each unique element in an array. It works by:
1) Creating a count array to store the count of each unique object in the input array.
2) Modifying the count array to store cumulative counts.
3) Creating an output array by using the modified count array to output elements in sorted order.
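The three steps above can be sketched in C++ as follows (a sketch assuming non-negative integer keys; all names are illustrative):

```cpp
#include <vector>
#include <algorithm>

// Counting sort for non-negative integers.
std::vector<int> counting_sort(const std::vector<int>& a) {
    if (a.empty()) return {};
    int max_val = *std::max_element(a.begin(), a.end());
    // 1) Count occurrences of each value.
    std::vector<int> count(max_val + 1, 0);
    for (int x : a) ++count[x];
    // 2) Convert counts to cumulative counts: count[v] is now the
    //    number of elements <= v, i.e. one past v's last output slot.
    for (int v = 1; v <= max_val; ++v) count[v] += count[v - 1];
    // 3) Place elements into the output array, scanning the input
    //    backwards so equal keys keep their relative order (stable).
    std::vector<int> out(a.size());
    for (auto it = a.rbegin(); it != a.rend(); ++it)
        out[--count[*it]] = *it;
    return out;
}
```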
This document discusses a minor project on sorting techniques using functions in C++. It is authored by a group of 4 students and their project guide. The document introduces different sorting algorithms like bubble sort, insertion sort, and selection sort. It provides examples to explain how each algorithm works step-by-step to sort an array of numbers in ascending order. C and C++ are computer programming languages commonly used for software development, and sorting is an important technique to arrange data in a desired order.
The document discusses quicksort and merge sort algorithms. It provides pseudocode for quicksort, explaining how it works by picking a pivot element and partitioning the array around that element. Quicksort has average time complexity of O(n log n) but worst case of O(n^2). Merge sort is also explained, with pseudocode showing how it recursively splits the array in half and then merges the sorted halves. Merge sort runs in O(n log n) time in all cases.
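The pick-a-pivot-and-partition idea can be sketched in C++ as follows (using the common Lomuto scheme with the last element as pivot, which may differ from the document's own pseudocode):

```cpp
#include <vector>
#include <utility>

// Lomuto partition: pivot = last element; returns the pivot's final index.
int partition(std::vector<int>& a, int lo, int hi) {
    int pivot = a[hi];
    int i = lo;                       // boundary of the "< pivot" region
    for (int j = lo; j < hi; ++j)
        if (a[j] < pivot) std::swap(a[i++], a[j]);
    std::swap(a[i], a[hi]);           // move pivot between the two regions
    return i;
}

void quicksort(std::vector<int>& a, int lo, int hi) {
    if (lo >= hi) return;             // 0 or 1 elements: already sorted
    int p = partition(a, lo, hi);
    quicksort(a, lo, p - 1);          // sort elements below the pivot
    quicksort(a, p + 1, hi);          // sort elements above the pivot
}
```

The O(n^2) worst case arises when the pivot is repeatedly the smallest or largest element, e.g. this pivot choice applied to an already-sorted array.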
This document discusses hash tables and how they work. Hash tables store records with keys in an array. To insert a record, its key is hashed to a location in the array. If that location is occupied, the next available empty location is used instead. Searching for a record's key involves hashing the key and checking locations until the key is found or an empty spot is reached. When deleting a record, its location must be marked as deleted rather than left empty to avoid interfering with searches.
The document discusses different algorithms for searching through a list of records to find a record with a particular key:
1) Serial search simply iterates through each record sequentially until the target key is found, with average case time complexity of O(n).
2) Binary search can be used if the records are sorted, performing a divide and conquer search with average and worst case time complexity of O(log n).
3) Hash tables map keys to array indices via a hash function, allowing direct access to records in O(1) time on average by resolving collisions through open addressing. This provides the most efficient search algorithm discussed.
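A minimal C++ sketch of the divide and conquer search from point 2 (returning an index, with -1 meaning "not found"; the interface is my own choice):

```cpp
#include <vector>

// Binary search over a sorted vector; O(log n) because each probe
// halves the remaining search range.
int binary_search_idx(const std::vector<int>& a, int key) {
    int lo = 0, hi = (int)a.size() - 1;
    while (lo <= hi) {
        int mid = lo + (hi - lo) / 2;   // avoids overflow of lo + hi
        if (a[mid] == key) return mid;
        if (a[mid] < key) lo = mid + 1; // key can only be in the upper half
        else hi = mid - 1;              // key can only be in the lower half
    }
    return -1;                          // key is absent
}
```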
The document discusses different search algorithms for efficiently finding records in a list given a key, including serial search, binary search, and hash tables. Serial search has O(n) worst-case time complexity, while binary search of a sorted list has O(log n) worst-case time complexity. Hash tables can provide constant time O(1) search by mapping keys to array indices via a hash function, but collisions require probing to find empty slots.
The document discusses different search algorithms for efficiently finding a record with a particular key in a list of records. It describes serial search, which has O(n) worst-case and average-case time complexity, and binary search, which has O(log n) worst-case and average-case time complexity for sorted lists. The document then introduces hash tables as a way to search in O(1) time by using a hash function to map keys to array indices, though collisions require searching further in the array.
Quick sort is an internal sorting algorithm based on the divide and conquer strategy. In this:
The array of elements is divided into parts repeatedly until it cannot be divided further.
It is also known as "partition exchange sort".
It uses a key element (pivot) for partitioning the elements.
The left partition contains all elements smaller than the pivot, and the right partition contains all elements greater than it.
Merge sort is also based on the divide and conquer strategy. In this:
The elements are split into two sub-arrays (n/2) again and again until only one element is left.
Merge sort uses additional storage for sorting: an auxiliary array.
Merge sort uses three arrays: two store each half, and the third stores the final sorted list produced by merging the other two; each half is sorted recursively.
At last, all the sub-arrays are merged back into a single array of n elements.
Quick Sort vs Merge Sort
Partition of elements in the array: In merge sort, the array is always split into two halves (n/2), whereas in quick sort the array can be split in any ratio; there is no requirement to divide it into equal parts.
Worst case complexity: The worst case complexity of quick sort is O(n^2), since many comparisons are needed in the worst case, whereas merge sort has the same O(n log n) complexity in both the worst and average cases.
Usage with datasets: Merge sort works well on data sets of any size, large or small, whereas quick sort does not work as well on very large datasets.
Additional storage space requirement: Merge sort is not in place because it requires additional memory for the auxiliary arrays, whereas quick sort is in place and requires no significant additional storage.
Efficiency: Merge sort is more efficient and works faster than quick sort for larger arrays or datasets, whereas quick sort is more efficient and works faster for smaller ones.
Sorting method: Quick sort is an internal sorting method, where the data is sorted in main memory, whereas merge sort is also an external sorting method, used when the data to be sorted cannot fit in memory and auxiliary storage is needed.
Stability: Merge sort is stable: two elements with equal value appear in the sorted output in the same order as in the unsorted input. Quick sort is unstable in this scenario, though it can be made stable with some changes to the code.
Preferred for: Quick sort is preferred for arrays, whereas merge sort is preferred for linked lists.
Locality of reference: Quicksort exhibits good cache locality, which makes it faster than merge sort in many cases (for example, in a virtual memory environment).
Quicksort has average time complexity of O(n log n), but a worst case of O(n^2). It has O(log n) space complexity for the recursion stack. It works by picking a pivot element, partitioning the array into smaller sub-arrays based on element values relative to the pivot, and recursively sorting those sub-arrays.
Mergesort and Quicksort are two efficient sorting algorithms that run in O(n log n) time. Mergesort uses a divide-and-conquer approach, recursively splitting the array in half until single elements remain, then merging the sorted halves back together. Quicksort chooses a pivot element and partitions the array into elements less than or greater than the pivot, then recursively sorts the subarrays. The document provides pseudocode for both algorithms and analyzes their time complexities.
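The split-then-merge structure can be sketched in C++ as follows (a sketch; the buffer-based merge is one common way to realize the "merging the sorted halves" step, and all names are my own):

```cpp
#include <vector>

// Merge two already-sorted halves a[lo..mid] and a[mid+1..hi].
void merge(std::vector<int>& a, int lo, int mid, int hi) {
    std::vector<int> buf;
    buf.reserve(hi - lo + 1);
    int i = lo, j = mid + 1;
    while (i <= mid && j <= hi)
        buf.push_back(a[i] <= a[j] ? a[i++] : a[j++]);  // <= keeps it stable
    while (i <= mid) buf.push_back(a[i++]);  // drain leftover left half
    while (j <= hi)  buf.push_back(a[j++]);  // drain leftover right half
    for (int k = 0; k < (int)buf.size(); ++k) a[lo + k] = buf[k];
}

void merge_sort(std::vector<int>& a, int lo, int hi) {
    if (lo >= hi) return;            // single element: already sorted
    int mid = lo + (hi - lo) / 2;
    merge_sort(a, lo, mid);          // recursively sort left half
    merge_sort(a, mid + 1, hi);      // recursively sort right half
    merge(a, lo, mid, hi);           // merge the sorted halves
}
```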
The quicksort algorithm works by recursively sorting arrays of data. It first selects a pivot element and partitions the array around the pivot so that all elements less than the pivot come before it and all elements greater than the pivot come after it. It then recursively sorts the sub-arrays to the left and right of the pivot until the entire array is sorted.
CS-102 Data Structures HashFunction CS102.pdf
Hashing is a technique for implementing dictionaries that provides constant time per operation on average. It works by using a hash function to map keys to positions in a hash table. Ideally, an element with key k would be stored at position h(k). However, collisions can occur if multiple keys map to the same position. When a collision occurs, the element is stored in the next available empty position. Searching for an element involves computing its hash value to locate its position, and searching linearly if a collision is encountered. Deletion requires marking deleted positions as deleted, rather than truly empty, to avoid interfering with searches.
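The probe-and-mark behaviour described here can be sketched as a toy fixed-size linear-probing table (int keys only, no resizing, so it assumes the table never fills with live keys; all names are illustrative):

```cpp
#include <vector>

// Toy linear-probing hash table for non-negative int keys.
// EMPTY stops a search; DELETED does not, so a removal cannot
// break the probe chain of keys stored past the removed slot.
class ProbingTable {
    static constexpr int EMPTY = -1, DELETED = -2;
    std::vector<int> slots;
public:
    explicit ProbingTable(int capacity) : slots(capacity, EMPTY) {}
    int hash(int key) const { return key % (int)slots.size(); }

    void insert(int key) {
        int i = hash(key);
        while (slots[i] >= 0)                 // occupied: probe next slot
            i = (i + 1) % (int)slots.size();
        slots[i] = key;                       // reuse EMPTY or DELETED slot
    }
    bool contains(int key) const {
        int i = hash(key);
        while (slots[i] != EMPTY) {           // DELETED keeps the chain alive
            if (slots[i] == key) return true;
            i = (i + 1) % (int)slots.size();
        }
        return false;                         // hit a truly empty slot
    }
    void erase(int key) {
        int i = hash(key);
        while (slots[i] != EMPTY) {
            if (slots[i] == key) { slots[i] = DELETED; return; }
            i = (i + 1) % (int)slots.size();
        }
    }
};
```

With a table of size 7, keys 10 and 17 both hash to slot 3, so 17 probes to slot 4; deleting 10 leaves a DELETED marker at slot 3, and a later search for 17 still walks past it correctly.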
2. Introduction
• Common problem: sort a list of values, starting from lowest to highest.
  – List of exam scores
  – Words of a dictionary in alphabetical order
  – Students' names listed alphabetically
  – Student records sorted by ID#
• Generally, we are given a list of records that have keys. These keys are used to define an ordering of the items in the list.
3. Quadratic Sorting Algorithms
• We are given n records to sort.
• There are a number of simple sorting algorithms whose worst and average case performance is quadratic, O(n²):
  – Selection sort
  – Insertion sort
  – Bubble sort
4. Sorting an Array of Integers
• Example: we
are given an
array of six
integers that
we want to
sort from
smallest to
largest
0
10
20
30
40
50
60
70
[1] [2] [3] [4] [5] [6]
[0] [1] [2] [3] [4] [5]
6. 0
10
20
30
40
50
60
70
[1] [2] [3] [4] [5] [6]
The Selection Sort Algorithm
• Swap the
smallest
entry with
the first
entry.
0
10
20
30
40
50
60
70
[1] [2] [3] [4] [5] [6]
[0] [1] [2] [3] [4] [5]
7. 0
10
20
30
40
50
60
70
[1] [2] [3] [4] [5] [6]
The Selection Sort Algorithm
• Swap the
smallest
entry with
the first
entry.
0
10
20
30
40
50
60
70
[1] [2] [3] [4] [5] [6]
[0] [1] [2] [3] [4] [5]
8. 0
10
20
30
40
50
60
70
[1] [2] [3] [4] [5] [6]
The Selection Sort Algorithm
• Part of the
array is now
sorted.
0
10
20
30
40
50
60
70
[1] [2] [3] [4] [5] [6]
Sorted side Unsorted side
[0] [1] [2] [3] [4] [5]
9. The Selection Sort Algorithm
• Find the smallest element in the unsorted side.
[bar chart: sorted side and unsorted side, smallest unsorted entry highlighted]
10. The Selection Sort Algorithm
• Swap with the front of the unsorted side.
[bar chart: sorted side and unsorted side]
11. The Selection Sort Algorithm
• We have increased the size of the sorted side by one element.
[bar chart: sorted side and unsorted side]
12. The Selection Sort Algorithm
• The process continues...
[bar chart: smallest entry from the unsorted side highlighted]
13. The Selection Sort Algorithm
• The process continues...
[bar chart: sorted side and unsorted side]
14. The Selection Sort Algorithm
• The process continues...
[bar chart: the sorted side is bigger]
15. The Selection Sort Algorithm
• The process keeps adding one more number to the sorted side.
• The sorted side has the smallest numbers, arranged from small to large.
[bar chart: sorted side and unsorted side]
16. The Selection Sort Algorithm
• We can stop when the unsorted side has just one number, since that number must be the largest number.
[bar chart: sorted side and unsorted side]
17. The Selection Sort Algorithm
• The array is now sorted.
• We repeatedly selected the smallest element, and moved this element to the front of the unsorted side.
[bar chart: the fully sorted array]
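The walk-through above can be sketched in code. This is a minimal Python version (the slides give no code; the example values are assumed for illustration, consistent with the speaker notes: 45 at [0], 8 at [4], 15 at [5]):

```python
def selection_sort(a):
    """Sort list a in place, smallest to largest."""
    n = len(a)
    for i in range(n - 1):            # a[:i] is the sorted side
        smallest = i
        for j in range(i + 1, n):     # scan the unsorted side
            if a[j] < a[smallest]:
                smallest = j
        # swap the smallest unsorted entry to the front of the unsorted side
        a[i], a[smallest] = a[smallest], a[i]
    return a

print(selection_sort([45, 20, 50, 30, 8, 15]))  # [8, 15, 20, 30, 45, 50]
```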
18. The Insertion Sort Algorithm
• The Insertion Sort algorithm also views the array as having a sorted side and an unsorted side.
[bar chart of the array]
19. The Insertion Sort Algorithm
• The sorted side starts with just the first element, which is not necessarily the smallest element.
[bar chart: sorted side and unsorted side]
20. The Insertion Sort Algorithm
• The sorted side grows by taking the front element from the unsorted side...
[bar chart: sorted side and unsorted side]
21. The Insertion Sort Algorithm
• ...and inserting it in the place that keeps the sorted side arranged from small to large.
[bar chart: sorted side and unsorted side]
23. The Insertion Sort Algorithm
• Sometimes we are lucky and the newly inserted item doesn't need to move at all.
[bar chart: sorted side and unsorted side]
24. The Insertion Sort Algorithm
• Sometimes we are lucky twice in a row.
[bar chart: sorted side and unsorted side]
25. How to Insert One Element
• Copy the new element to a separate location.
[bar chart: sorted side and unsorted side, with the copied element set aside]
26. How to Insert One Element
• Shift elements in the sorted side, creating an open space for the new element.
[bar chart: one element shifted rightward]
27. How to Insert One Element
• Shift elements in the sorted side, creating an open space for the new element.
[bar chart: another element shifted rightward]
30. How to Insert One Element
• ...until you reach the location for the new element.
[bar chart: the open space is at the correct location]
31. How to Insert One Element
• Copy the new element back into the array, at the correct location.
[bar chart: sorted side and unsorted side]
32. How to Insert One Element
• The last element must also be inserted. Start by copying it...
[bar chart: sorted side and unsorted side]
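The insert-one-element steps above (copy the new element aside, shift larger entries rightward, copy it back) can be sketched as a minimal Python insertion sort; the code is an illustration, not taken from the slides:

```python
def insertion_sort(a):
    """Sort list a in place, smallest to largest."""
    for i in range(1, len(a)):        # a[:i] is the sorted side
        entry = a[i]                  # copy the new element to a separate location
        j = i
        # shift larger sorted elements rightward, opening a space
        while j > 0 and a[j - 1] > entry:
            a[j] = a[j - 1]
            j -= 1
        a[j] = entry                  # copy the new element back at the correct spot
    return a

print(insertion_sort([45, 20, 50, 30, 8, 15]))  # [8, 15, 20, 30, 45, 50]
```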
34. The Bubble Sort Algorithm
• The Bubble Sort algorithm looks at pairs of entries in the array, and swaps their order if needed.
[bar chart of the array]
35. The Bubble Sort Algorithm
• The Bubble Sort algorithm looks at pairs of entries in the array, and swaps their order if needed.
[bar chart: comparing a pair] Swap?
36. [bar chart] Yes!
37. [bar chart: the next pair] Swap?
38. [bar chart] No.
39. [bar chart: the next pair] Swap?
40. [bar chart] No.
41. [bar chart: the next pair] Swap?
42. [bar chart] Yes!
43. [bar chart: the next pair] Swap?
44. [bar chart] Yes!
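The pairwise compare-and-swap passes shown above can be sketched as a minimal Python bubble sort (the early-exit flag is a common refinement, not something shown in the slides):

```python
def bubble_sort(a):
    """Sort list a in place by repeatedly swapping out-of-order adjacent pairs."""
    n = len(a)
    for k in range(n - 1):
        swapped = False
        for i in range(n - 1 - k):        # compare each adjacent pair
            if a[i] > a[i + 1]:           # Swap? Yes!
                a[i], a[i + 1] = a[i + 1], a[i]
                swapped = True
        if not swapped:                   # a pass with no swaps means we're done
            break
    return a

print(bubble_sort([45, 20, 50, 30, 8, 15]))  # [8, 15, 20, 30, 45, 50]
```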
#5: The picture shows a graphical representation of an array which we will sort so that the smallest element ends up at the front, and the other elements increase to the largest at the end. The bar graph indicates the values which are in the array before sorting--for example the first element of the array contains the integer 45.
#6: The first sorting algorithm that we'll examine is called Selectionsort. It begins by going through the entire array and finding the smallest element. In this example, the smallest element is the number 8 at location [4] of the array.
#7: Once we have found the smallest element, that element is swapped with the first element of the array...
#8: ...like this.
The smallest element is now at the front of the array, and we have taken one small step toward producing a sorted array.
#9: At this point, we can view the array as being split into two sides: To the left of the dotted line is the "sorted side", and to the right of the dotted line is the "unsorted side". Our goal is to push the dotted line forward, increasing the number of elements in the sorted side, until the entire array is sorted.
#10: Each step of the Selectionsort works by finding the smallest element in the unsorted side. At this point, we would find the number 15 at location [5] in the unsorted side.
#11: This small element is swapped with the number at the front of the unsorted side, as shown here...
#12: ...and the effect is to increase the size of the sorted side by one element.
As you can see, the sorted side always contains the smallest numbers, and those numbers are sorted from small to large. The unsorted side contains the rest of the numbers, and those numbers are in no particular order.
#13: Again, we find the smallest entry in the unsorted side...
#14: ...and swap this element with the front of the unsorted side.
#15: The sorted side now contains the three smallest elements of the array.
#16: Here is the array after increasing the sorted side to four elements.
#17: And now the sorted side has five elements.
In fact, once the unsorted side is down to a single element, the sort is completed. At this point the 5 smallest elements are in the sorted side, and so the one largest element is left in the unsorted side.
We are done...
#18: ...The array is sorted.
The basic algorithm is easy to state and also easy to program.
#19: Now we'll look at another sorting method called Insertionsort. The end result will be the same: The array will be sorted from smallest to largest. But the sorting method is different.
However, there are some common features. As with the Selectionsort, the Insertionsort algorithm also views the array as having a sorted side and an unsorted side, ...
#20: ...like this.
However, in the Selectionsort, the sorted side always contained the smallest elements of the array. In the Insertionsort, the sorted side will be sorted from small to large, but the elements in the sorted side will not necessarily be the smallest entries of the array.
Because the sorted side does not need to have the smallest entries, we can start by placing one element in the sorted side--we don't need to worry about sorting just one element. But we do need to worry about how to increase the number of elements that are in the sorted side.
#21: The basic approach is to take the front element from the unsorted side...
#22: ...and insert this element at the correct spot of the sorted side.
In this example, the front element of the unsorted side is 20. So the 20 must be inserted before the number 45 which is already in the sorted side.
#23: After the insertion, the sorted side contains two elements. These two elements are in order from small to large, although they are not the smallest elements in the array.
#24: Sometimes we are lucky and the newly inserted element is already in the right spot. This happens if the new element is larger than anything that's already in the array.
#26: The actual insertion process requires a bit of work that is shown here. The first step of the insertion is to make a copy of the new element. Usually this copy is stored in a local variable. It just sits off to the side, ready for us to use whenever we need it.
#27: After we have safely made a copy of the new element, we start shifting elements from the end of the sorted side. These elements are shifted rightward, to create an "empty spot" for our new element to be placed.
In this example we take the last element of the sorted side and shift it rightward one spot...
#28: ...like this.
Is this the correct spot for the new element? No, because the new element is smaller than the next element in the sorted section. So we continue shifting elements rightward...
#29: This is still not the correct spot for our new element, so we shift again...
#31: Finally, this is the correct location for the new element. In general there are two situations that indicate the "correct location" has been found:
1. We reach the front of the array (as happened here), or
2. We reach an element that is less than or equal to the new element.
#32: Once the correct spot is found, we copy the new element back into the array. The number of elements in the sorted side has increased by one.
#33: The last element of the array also needs to be inserted. Start by copying it to a safe location.