
Sorting Algorithms

1. Shell Sort
Definition: An in-place comparison-based sorting algorithm that improves insertion sort by
comparing and swapping elements at gaps.
Time Complexity:

o Best: O(n log n)

o Average: O(n^(3/2))

o Worst: O(n^2)

2. Quick Sort
Definition: A divide-and-conquer algorithm that selects a pivot, partitions the array, and
recursively sorts subarrays.
Time Complexity:

o Best: O(n log n)

o Average: O(n log n)

o Worst: O(n^2)

3. Merge Sort
Definition: A divide-and-conquer algorithm that splits the array, sorts the subarrays, and
merges them back in order.
Time Complexity:

o Best: O(n log n)

o Average: O(n log n)

o Worst: O(n log n)

4. Heap Sort
Definition: A comparison-based algorithm that builds a max heap and repeatedly extracts
the largest element.
Time Complexity:

o Best: O(n log n)

o Average: O(n log n)

o Worst: O(n log n)

5. Sorting in Linear Time (e.g., Counting Sort, Radix Sort, Bucket Sort)
Definition: Non-comparison-based algorithms that sort based on counting occurrences or
placing elements in buckets.
Time Complexity:

o Best, Average, Worst: O(n)

Advanced Data Structures


6. Red-Black Trees
Definition: A self-balancing binary search tree where each node has a color (red or black) to
maintain balance.
Time Complexity:

o All operations: O(log n)

7. B-Trees
Definition: A self-balancing search tree optimized for systems that read and write large
blocks of data.
Time Complexity:

o All operations: O(log n)

8. Binomial Heaps
Definition: A collection of binomial trees used for efficient priority queue operations.
Time Complexity:

o Insertion: O(log n)

o Deletion: O(log n)

o Find Minimum: O(log n)

9. Fibonacci Heaps
Definition: A heap data structure that supports operations like decrease-key efficiently.
Time Complexity:

o Insert: O(1)

o Extract Min: O(log n)

o Decrease Key: O(1) (amortized)

10. Tries
Definition: A tree-like data structure for storing strings, where keys are represented by paths
from the root.
Time Complexity:

o Search, Insert: O(m) (where m is the length of the string)

11. Skip Lists


Definition: A layered linked list allowing for faster search, insertion, and deletion.
Time Complexity:

o All operations: O(log n)

Divide and Conquer

12. Matrix Multiplication (Strassen's Algorithm)


Definition: Divides matrices into smaller submatrices for multiplication to reduce time
complexity.
Time Complexity: O(n^2.81)
13. Convex Hull
Definition: Finds the smallest convex polygon enclosing all given points (e.g., Graham’s Scan,
Jarvis March).
Time Complexity: O(n log n)

14. Binary Search


Definition: Searches a sorted array by repeatedly dividing the search interval in half.
Time Complexity:

o Best: O(1)

o Average/Worst: O(log n)
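The halving loop can be sketched in Python as follows (a minimal version, assuming a sorted list):

```python
def binary_search(arr, target):
    """Return the index of target in sorted arr, or -1 if absent."""
    lo, hi = 0, len(arr) - 1
    while lo <= hi:
        mid = (lo + hi) // 2        # halve the search interval
        if arr[mid] == target:
            return mid
        elif arr[mid] < target:
            lo = mid + 1            # target lies in the right half
        else:
            hi = mid - 1            # target lies in the left half
    return -1
```

Each iteration discards half of the remaining interval, giving the O(log n) worst case.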

Greedy Algorithms

15. Knapsack Problem (Fractional)


Definition: Maximizes profit by choosing items with the highest value-to-weight ratio.
Time Complexity: O(n log n)

16. Prim's Algorithm


Definition: Constructs a minimum spanning tree by adding the smallest edge connecting the
tree to new vertices.
Time Complexity: O(E log V)

17. Kruskal's Algorithm


Definition: Constructs a minimum spanning tree by sorting edges and using a union-find
structure.
Time Complexity: O(E log E)

18. Dijkstra's Algorithm


Definition: Finds the shortest path from a source to all vertices in a weighted graph.
Time Complexity: O(V^2) or O(E + V log V) (using a min-heap)

19. Bellman-Ford Algorithm


Definition: Finds the shortest path from a source to all vertices, allowing negative edge
weights.
Time Complexity: O(VE)

Dynamic Programming

20. Knapsack Problem (0/1)


Definition: Maximizes profit by deciding whether to include each item.
Time Complexity: O(nW) (where W is the capacity)

21. Warshall’s Algorithm


Definition: Computes transitive closure of a graph.
Time Complexity: O(V^3)
22. Floyd’s Algorithm
Definition: Finds the shortest paths between all pairs of vertices in a graph.
Time Complexity: O(V^3)

23. Resource Allocation Problem


Definition: Allocates resources to maximize efficiency or profit.
Time Complexity: Problem-dependent.

Backtracking

24. N-Queen Problem


Definition: Places n queens on an n×n chessboard so no two queens attack each other.
Time Complexity: O(n!)

25. Graph Coloring


Definition: Assigns colors to vertices of a graph such that no two adjacent vertices have the
same color.
Time Complexity: Problem-dependent.

26. Hamiltonian Cycle


Definition: Determines if a graph contains a cycle that visits each vertex exactly once.
Time Complexity: O(n!)

27. Sum of Subsets


Definition: Finds subsets of a set that sum to a given value.
Time Complexity: O(2^n)
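A minimal backtracking sketch in Python: at each index, either include the element or skip it, pruning branches whose running sum already overshoots (assumes non-negative numbers):

```python
def subset_sums(nums, target):
    """Collect all subsets of nums that sum to target (backtracking)."""
    results = []

    def backtrack(i, chosen, remaining):
        if remaining == 0:
            results.append(list(chosen))
            return
        if i == len(nums) or remaining < 0:
            return                      # dead end: prune this branch
        chosen.append(nums[i])          # include nums[i]
        backtrack(i + 1, chosen, remaining - nums[i])
        chosen.pop()                    # exclude nums[i]
        backtrack(i + 1, chosen, remaining)

    backtrack(0, [], target)
    return results
```

The include/exclude choice at each of the n elements is what produces the O(2^n) bound.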

Selected Topics

28. Fast Fourier Transform (FFT)


Definition: Efficiently computes the Discrete Fourier Transform (DFT) of a sequence.
Time Complexity: O(n log n)

29. String Matching (e.g., KMP, Rabin-Karp)


Definition: Finds occurrences of a pattern in a text.
Time Complexity:

o KMP: O(n + m)

o Rabin-Karp: O(n + m) (average)

30. Theory of NP-Completeness


Definition: Studies problems that are hard to solve but easy to verify.
Time Complexity: Problem-dependent.

31. Approximation Algorithms


Definition: Provides near-optimal solutions to optimization problems in polynomial time.
Time Complexity: O(n^k), problem-dependent.
32. Randomized Algorithms
Definition: Uses random numbers to influence decisions and solve problems faster.
Time Complexity: Problem-dependent.

In More Detail


Sorting Algorithms

1. Shell Sort

o Definition: A generalized version of insertion sort that first sorts elements far apart
and reduces the gap between them over iterations.

o Approach: Uses a gap sequence to divide the array and repeatedly compares and
sorts elements at these intervals.

o Time Complexity:

▪ Best: O(n log n) (depends on gap sequence)

▪ Average: O(n^(3/2))

▪ Worst: O(n^2)
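The gapped-insertion idea can be sketched in Python; the halving sequence n/2, n/4, ..., 1 used here is one common gap choice, not the only one:

```python
def shell_sort(arr):
    """In-place shell sort using the gap sequence n//2, n//4, ..., 1."""
    n = len(arr)
    gap = n // 2
    while gap > 0:
        # Gapped insertion sort: elements gap apart form sorted subsequences.
        for i in range(gap, n):
            current = arr[i]
            j = i
            while j >= gap and arr[j - gap] > current:
                arr[j] = arr[j - gap]   # shift larger element right by one gap
                j -= gap
            arr[j] = current
        gap //= 2                       # shrink the gap each pass
    return arr
```

When the gap reaches 1 the pass is plain insertion sort, but by then the array is nearly ordered.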

2. Quick Sort

o Definition: A divide-and-conquer algorithm that partitions the array into two subarrays based on a pivot element.

o Approach: Recursively sort the two partitions such that elements in the left are
smaller than the pivot and the right are larger.

o Time Complexity:

▪ Best: O(n log n)

▪ Average: O(n log n)

▪ Worst: O(n^2) (occurs if the pivot is poorly chosen, e.g., the smallest or largest element)
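A short Python sketch of the pivot-and-partition step, written functionally for clarity (in-place schemes such as Lomuto or Hoare partitioning are the usual textbook variants):

```python
def quick_sort(arr):
    """Quick sort: pick a pivot, partition around it, recurse on both sides."""
    if len(arr) <= 1:
        return arr                      # base case: already sorted
    pivot = arr[len(arr) // 2]          # middle element as pivot
    left = [x for x in arr if x < pivot]
    equal = [x for x in arr if x == pivot]
    right = [x for x in arr if x > pivot]
    return quick_sort(left) + equal + quick_sort(right)
```

Choosing the middle element (or a random one) avoids the degenerate O(n^2) behavior on already-sorted input that a fixed first-element pivot would hit.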

3. Merge Sort

o Definition: A divide-and-conquer algorithm that splits the array into halves, recursively sorts them, and merges them back.

o Approach: Focuses on merging two sorted subarrays into one sorted array.

o Time Complexity:

▪ Best: O(n log n)

▪ Average: O(n log n)

▪ Worst: O(n log n)
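The split-then-merge structure can be sketched in Python; the merge loop is the heart of the algorithm:

```python
def merge_sort(arr):
    """Merge sort: split in half, sort each half, merge the sorted halves."""
    if len(arr) <= 1:
        return arr
    mid = len(arr) // 2
    left = merge_sort(arr[:mid])
    right = merge_sort(arr[mid:])
    # Merge: repeatedly take the smaller front element of the two halves.
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])             # append whichever half has leftovers
    merged.extend(right[j:])
    return merged
```

Every level of recursion does O(n) merging work across log n levels, which is why best, average, and worst cases all match.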

4. Heap Sort

o Definition: Uses a binary heap to repeatedly extract the maximum or minimum element and place it in sorted order.

o Approach: Build a max heap, then remove the largest element, reheapify, and
repeat.

o Time Complexity:

▪ Best: O(n log n)

▪ Average: O(n log n)

▪ Worst: O(n log n)
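The build-heap-then-extract approach, sketched in Python with an explicit sift-down helper:

```python
def heap_sort(arr):
    """In-place heap sort: build a max heap, then extract the max repeatedly."""
    n = len(arr)

    def sift_down(root, end):
        # Push arr[root] down until the max-heap property holds in arr[:end].
        while 2 * root + 1 < end:
            child = 2 * root + 1
            if child + 1 < end and arr[child + 1] > arr[child]:
                child += 1              # pick the larger of the two children
            if arr[root] >= arr[child]:
                return
            arr[root], arr[child] = arr[child], arr[root]
            root = child

    for i in range(n // 2 - 1, -1, -1):  # build the max heap bottom-up
        sift_down(i, n)
    for end in range(n - 1, 0, -1):      # swap the max to the back, reheapify
        arr[0], arr[end] = arr[end], arr[0]
        sift_down(0, end)
    return arr
```

The bottom-up build is O(n); each of the n extractions costs O(log n), giving O(n log n) in every case.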

5. Sorting in Linear Time

o Definition: Sorting algorithms that operate in O(n) time by avoiding comparisons, e.g., counting sort, radix sort, bucket sort.

o Approach: Leverages auxiliary data structures to sort integers or distribute elements into "buckets."

o Time Complexity:

▪ Best, Average, Worst: O(n) (for counting sort and radix sort)
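Counting sort is the simplest of these to sketch in Python; it assumes non-negative integer keys and runs in O(n + k), where k is the maximum key:

```python
def counting_sort(arr):
    """Counting sort for non-negative integers: O(n + k), k = max value."""
    if not arr:
        return []
    counts = [0] * (max(arr) + 1)
    for x in arr:                       # tally occurrences of each value
        counts[x] += 1
    result = []
    for value, count in enumerate(counts):
        result.extend([value] * count)  # emit each value count times
    return result
```

No element is ever compared with another; the key itself is used as an array index, which is how the comparison-sort lower bound of O(n log n) is sidestepped.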

Advanced Data Structures

6. Red-Black Trees

o Definition: A self-balancing binary search tree that ensures the tree height is
logarithmic.

o Approach: Maintains balance by coloring nodes red or black and rebalancing after
insertions and deletions.

o Time Complexity:

▪ All operations (search, insert, delete): O(log n)

7. B-Trees

o Definition: A generalization of binary search trees designed for systems that handle
large amounts of data and disk storage.

o Approach: Balances multi-level nodes with multiple keys.

o Time Complexity:

▪ All operations: O(log n)

8. Binomial Heaps
o Definition: A collection of binomial trees supporting mergeable heap operations.

o Approach: Based on binomial coefficients, efficiently supports union and other priority queue operations.

o Time Complexity:

▪ Insert: O(log n)

▪ Extract Min: O(log n)

9. Fibonacci Heaps

o Definition: A heap data structure that supports faster amortized operations compared to binary or binomial heaps.

o Approach: Uses a lazy deletion approach, decreasing the cost of certain operations
like decrease-key.

o Time Complexity:

▪ Insert: O(1) (amortized)

▪ Extract Min: O(log n)

▪ Decrease Key: O(1) (amortized)

10. Tries

o Definition: A tree structure used for fast string operations like prefix matching.

o Approach: Nodes store parts of strings; paths represent the strings.

o Time Complexity:

▪ Insert/Search: O(m) (where m is the string length)
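A minimal trie sketch in Python, where each node maps a character to a child and a flag marks where a stored string ends:

```python
class Trie:
    """Trie storing strings as root-to-node paths; O(m) insert and search."""

    def __init__(self):
        self.children = {}              # char -> child Trie node
        self.is_end = False             # True if a stored string ends here

    def insert(self, word):
        node = self
        for ch in word:                 # walk/create the path for each char
            node = node.children.setdefault(ch, Trie())
        node.is_end = True

    def search(self, word):
        node = self
        for ch in word:
            if ch not in node.children:
                return False            # path breaks: word is not stored
            node = node.children[ch]
        return node.is_end              # must end exactly at a marked node
```

Both operations touch one node per character of the key, independent of how many strings the trie holds.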

11. Skip Lists

o Definition: A linked list augmented with layers to speed up search operations.

o Approach: Layers of "shortcuts" reduce the number of steps needed to find elements.

o Time Complexity:

▪ All operations (search, insert, delete): O(log n)

Divide and Conquer Algorithms

12. Matrix Multiplication (Strassen’s Algorithm)

o Definition: Reduces the number of multiplications required in standard matrix multiplication.

o Approach: Divides matrices into smaller submatrices and recursively computes their
products.
o Time Complexity: O(n^2.81)

13. Convex Hull

o Definition: Finds the smallest convex polygon enclosing all given points.

o Approach: Algorithms like Graham's Scan or Jarvis March iteratively add points to
the convex boundary.

o Time Complexity: O(n log n)

Greedy Algorithms

14. Knapsack Problem (Fractional)

o Definition: Maximizes profit by picking fractional parts of items to fit the knapsack.

o Approach: Sort items by value-to-weight ratio and pick greedily.

o Time Complexity: O(n log n)
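The greedy choice can be sketched in Python; items are assumed to be (value, weight) pairs with positive weights:

```python
def fractional_knapsack(items, capacity):
    """items: list of (value, weight) pairs. Greedy by value/weight ratio."""
    items = sorted(items, key=lambda vw: vw[0] / vw[1], reverse=True)
    total = 0.0
    for value, weight in items:
        if capacity <= 0:
            break
        take = min(weight, capacity)    # take the whole item, or a fraction
        total += value * take / weight
        capacity -= take
    return total
```

The O(n log n) cost is entirely in the sort; the single greedy pass is linear, and at most one item is taken fractionally.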

15. Prim’s Algorithm

o Definition: Finds the minimum spanning tree by growing the tree one vertex at a
time.

o Approach: Starts from a random node and selects the smallest edge connecting the
tree to new vertices.

o Time Complexity: O(E log V)

16. Kruskal’s Algorithm

o Definition: Constructs a minimum spanning tree by adding the smallest edges while
avoiding cycles.

o Approach: Sorts edges and uses union-find for cycle detection.

o Time Complexity: O(E log E)
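A compact Python sketch: sort the edges, then use union-find (here with path compression) to reject any edge that would close a cycle; vertices are assumed to be numbered 0..n-1:

```python
def kruskal(n, edges):
    """Total MST weight. edges: (weight, u, v) tuples over vertices 0..n-1."""
    parent = list(range(n))

    def find(x):
        # Union-find root lookup with path compression.
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    total = 0
    for w, u, v in sorted(edges):       # consider edges by increasing weight
        ru, rv = find(u), find(v)
        if ru != rv:                    # adding this edge creates no cycle
            parent[ru] = rv             # merge the two components
            total += w
    return total
```

Sorting dominates at O(E log E); each union-find operation is nearly constant time.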

17. Dijkstra’s Algorithm

o Definition: Finds the shortest path from a source to all vertices in a weighted graph.

o Approach: Greedily selects the next closest vertex.

o Time Complexity: O(V^2) or O(E + V log V) (with min-heap)
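The min-heap variant can be sketched in Python; the graph is assumed to be an adjacency dict mapping each vertex to a list of (neighbor, weight) pairs with non-negative weights:

```python
import heapq

def dijkstra(graph, source):
    """graph: {u: [(v, weight), ...]}. Returns dict of shortest distances."""
    dist = {source: 0}
    heap = [(0, source)]                # min-heap of (distance, vertex)
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue                    # stale entry: u was already improved
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd            # found a shorter route to v
                heapq.heappush(heap, (nd, v))
    return dist
```

Rather than a decrease-key operation, this common variant pushes duplicates and skips stale heap entries on pop.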

18. Bellman-Ford Algorithm

o Definition: Finds shortest paths from a source, allowing negative weights.

o Time Complexity: O(VE)

Dynamic Programming
19. Knapsack Problem (0/1)

o Definition: Maximizes profit without fractional items, using dynamic programming to decide inclusion/exclusion.

o Time Complexity: O(nW)
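The inclusion/exclusion table can be sketched in Python with a one-dimensional DP array over capacities:

```python
def knapsack_01(values, weights, capacity):
    """0/1 knapsack DP; dp[w] = best value achievable with weight limit w."""
    dp = [0] * (capacity + 1)
    for value, weight in zip(values, weights):
        # Iterate capacities downward so each item is used at most once.
        for w in range(capacity, weight - 1, -1):
            dp[w] = max(dp[w],                  # exclude this item
                        dp[w - weight] + value)  # or include it
    return dp[capacity]
```

The two nested loops, over n items and W capacities, are where the O(nW) pseudo-polynomial bound comes from.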

20. Warshall’s Algorithm

o Definition: Computes transitive closure of a graph.

o Time Complexity: O(V^3)

21. Floyd’s Algorithm

o Definition: Finds shortest paths between all pairs of vertices in a graph.

o Time Complexity: O(V^3)
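The triple loop over intermediate vertices can be sketched in Python; the input is assumed to be an adjacency matrix with float('inf') for missing edges:

```python
def floyd_warshall(dist):
    """All-pairs shortest paths. dist: n x n matrix of edge weights,
    with float('inf') where no edge exists and 0 on the diagonal."""
    n = len(dist)
    d = [row[:] for row in dist]        # copy so the input is untouched
    for k in range(n):                  # allow vertex k as an intermediate
        for i in range(n):
            for j in range(n):
                if d[i][k] + d[k][j] < d[i][j]:
                    d[i][j] = d[i][k] + d[k][j]
    return d
```

After round k, d[i][j] is the shortest i-to-j path using only intermediates from {0, ..., k}, which is the invariant behind the O(V^3) triple loop.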

Backtracking and Selected Topics

22. N-Queen Problem

o Definition: Places queens such that no two attack each other.

o Time Complexity: O(n!)
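A backtracking sketch in Python: place one queen per row, tracking occupied columns and diagonals in sets so the safety check is O(1):

```python
def n_queens(n):
    """Count placements of n queens on an n x n board with no attacks."""
    count = 0
    cols, diag1, diag2 = set(), set(), set()

    def place(row):
        nonlocal count
        if row == n:
            count += 1                  # every row holds a safe queen
            return
        for col in range(n):
            # A square is safe if its column and both diagonals are free.
            if col in cols or row - col in diag1 or row + col in diag2:
                continue
            cols.add(col); diag1.add(row - col); diag2.add(row + col)
            place(row + 1)
            cols.remove(col); diag1.remove(row - col); diag2.remove(row + col)

    place(0)
    return count
```

The undo step after each recursive call is what makes this backtracking rather than brute-force enumeration of all n^n boards.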

23. Graph Coloring

o Definition: Assigns colors to vertices of a graph so adjacent vertices have different colors.

o Time Complexity: O(m^n) (backtracking with m colors)

24. Hamiltonian Cycle

o Definition: Checks if a cycle exists visiting each vertex once.

o Time Complexity: O(n!)

25. Fast Fourier Transform (FFT)

o Definition: Efficiently computes the Discrete Fourier Transform.

o Time Complexity: O(n log n)

26. String Matching

o KMP Algorithm: Matches strings using a prefix function. O(n + m)

o Rabin-Karp Algorithm: Uses hashing. O(n + m) average, O(nm) worst.
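KMP's prefix function can be sketched in Python; the same failure table drives both the preprocessing and the scan:

```python
def kmp_search(text, pattern):
    """Return start indices of all occurrences of pattern in text, O(n + m)."""
    if not pattern:
        return []
    # Prefix function: fail[i] = length of the longest proper prefix of
    # pattern[:i+1] that is also a suffix of it.
    fail = [0] * len(pattern)
    k = 0
    for i in range(1, len(pattern)):
        while k and pattern[i] != pattern[k]:
            k = fail[k - 1]             # fall back to a shorter border
        if pattern[i] == pattern[k]:
            k += 1
        fail[i] = k
    # Scan the text, reusing fail so matched characters are never re-read.
    matches, k = [], 0
    for i, ch in enumerate(text):
        while k and ch != pattern[k]:
            k = fail[k - 1]
        if ch == pattern[k]:
            k += 1
        if k == len(pattern):
            matches.append(i - k + 1)   # full match ending at position i
            k = fail[k - 1]             # keep going for overlapping matches
    return matches
```

Because the text index never moves backward, the scan is O(n) after the O(m) table build, giving the O(n + m) total.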

27. Approximation Algorithms

o Definition: Provides near-optimal solutions to NP-complete problems in polynomial time.
28. Randomized Algorithms

o Definition: Uses randomness to achieve faster average performance.
