Sorting preliminary
Insertion Sort
Insertion sort is a simple sorting algorithm that works by iteratively inserting
each element of an unsorted list into its correct position in a sorted portion of
the list. It is like sorting playing cards in your hands. You split the cards into
two groups: the sorted cards and the unsorted cards. Then, you pick a card
from the unsorted group and put it in the right place in the sorted group.
● We start with the second element of the array, as the first element is assumed to be sorted.
● Compare the second element with the first element; if the second element is smaller, swap them.
● Move to the third element, compare it with the first two elements, and insert it at its correct position.
● Repeat this process for every remaining element until the entire array is sorted; a short code sketch follows below.
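As a concrete reference, the following short Python sketch implements the steps above (the function name insertion_sort and the sample list are illustrative, not part of the original notes):

def insertion_sort(arr):
    # Start from the second element; the first element is treated as already sorted.
    for i in range(1, len(arr)):
        key = arr[i]          # the "card" picked from the unsorted group
        j = i - 1
        # Shift larger elements of the sorted portion one slot to the right.
        while j >= 0 and arr[j] > key:
            arr[j + 1] = arr[j]
            j -= 1
        # Drop the picked element into its correct position.
        arr[j + 1] = key
    return arr

print(insertion_sort([12, 11, 13, 5, 6]))   # [5, 6, 11, 12, 13]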
Complexity Analysis of Insertion Sort
Time Complexity
● Best case: O(n), if the list is already sorted, where n is the number of elements in the list.
● Average case: O(n²), if the list is randomly ordered.
● Worst case: O(n²), if the list is in reverse order.
Space Complexity
● O(1) auxiliary space; insertion sort works in place.
Advantages
● Simple to implement, stable, and efficient for small or nearly sorted lists.
Disadvantages
● The O(n²) running time makes it inefficient for large lists; in practice it mainly appears as a subroutine of hybrid algorithms built on Quick Sort and Merge Sort. When the subarray size becomes small, these hybrids switch to insertion sort.
Shell Sort
Shell sort is mainly a variation of insertion sort. In insertion sort, we move elements only one position ahead; when an element has to be moved far ahead, many movements are involved. The idea of Shell sort is to allow the exchange of far-apart items. In Shell sort, we make the array h-sorted for a large value of h and keep reducing h until it becomes 1; an array is h-sorted when every sublist of elements h positions apart is sorted.
Algorithm:
Step 1 − Start
Step 2 − Initialize the gap size, say h.
Step 3 − Divide the list into sub-lists whose elements are h positions apart.
Step 4 − Sort these sub-lists using insertion sort.
Step 5 − Reduce the gap h and repeat from Step 3 until the list is sorted (the final pass uses h = 1).
Step 6 − Print the sorted list.
Step 7 − Stop.
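As an illustration of these steps, here is a minimal Python sketch of Shell sort, assuming the common gap sequence that starts at n // 2 and halves on every pass (the function name shell_sort is illustrative):

def shell_sort(arr):
    n = len(arr)
    gap = n // 2                      # Step 2: initial gap size h
    while gap > 0:
        # Steps 3-4: gapped insertion sort over elements that are gap apart.
        for i in range(gap, n):
            key = arr[i]
            j = i
            while j >= gap and arr[j - gap] > key:
                arr[j] = arr[j - gap]
                j -= gap
            arr[j] = key
        gap //= 2                     # Step 5: reduce the gap and repeat
    return arr

print(shell_sort([12, 34, 54, 2, 3]))   # [2, 3, 12, 34, 54]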
Heap Sort
Heap sort is a comparison-based sorting technique based on the Binary Heap data structure. It can be seen as an optimization over selection sort: we first find the max (or min) element and swap it with the last (or first) one, then repeat the same process for the remaining elements. In heap sort, we use a binary heap so that we can find and move the max element in O(log n) instead of O(n), and hence achieve the O(n log n) time complexity.
Heap Sort Algorithm
First convert the array into a max heap using heapify; note that this happens in place, as the array elements are simply rearranged to satisfy the heap property. Then, one by one, delete the root node of the max heap, replace it with the last node, and heapify again. Repeat this process while the size of the heap is greater than 1.
● Rearrange the array elements so that they form a max heap.
● Repeat the following steps until the heap contains only one element:
   ○ Swap the root element of the heap (the largest element in the current heap) with the last element of the heap.
   ○ Remove the last element from the heap (it is now at its correct position in the array).
   ○ Heapify the remaining elements of the heap.
We keep repeating these steps until there is only one element left in the heap, at which point the whole array is sorted.
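A minimal Python sketch of this procedure is given below: it builds a max heap in place, then repeatedly swaps the root with the last element of the heap and heapifies the reduced heap (the names heapify and heap_sort are illustrative):

def heapify(arr, n, i):
    # Sift the element at index i down so that the subtree rooted at i
    # satisfies the max-heap property within arr[0..n-1].
    largest = i
    left, right = 2 * i + 1, 2 * i + 2
    if left < n and arr[left] > arr[largest]:
        largest = left
    if right < n and arr[right] > arr[largest]:
        largest = right
    if largest != i:
        arr[i], arr[largest] = arr[largest], arr[i]
        heapify(arr, n, largest)

def heap_sort(arr):
    n = len(arr)
    # Rearrange the array into a max heap.
    for i in range(n // 2 - 1, -1, -1):
        heapify(arr, n, i)
    # Repeatedly move the current maximum to the end and shrink the heap.
    for end in range(n - 1, 0, -1):
        arr[0], arr[end] = arr[end], arr[0]
        heapify(arr, end, 0)
    return arr

print(heap_sort([9, 4, 3, 8, 10, 2, 5]))   # [2, 3, 4, 5, 8, 9, 10]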
Complexity Analysis of Heap Sort
Time Complexity: O(n log n)
Auxiliary Space: O(log n), due to the recursive call stack. However, auxiliary
space can be O(1) for iterative implementation.
Important points about Heap Sort
● Heap sort is an in-place algorithm.
● Its typical implementation is not stable, but it can be made stable; elements with equal keys may not keep their original relative order.
● Its time complexity is O(n log n) in all cases, which makes it efficient for sorting large datasets. The log n factor comes from the height of the binary heap, and it holds even in the worst case.
● In practice, heap sort is usually slower than quick sort and merge sort even if the time complexity is O(n log n) for all of them, mainly because of its poorer locality of reference.
Merge Sort
Merge sort is a popular sorting algorithm known for its efficiency and stability. It follows the divide-and-conquer approach: the input array is divided into two halves, the two halves are sorted recursively, and they are finally merged back together to obtain the sorted array.
1. Divide: Divide the list or array recursively into two halves until it can
no longer be divided.
2. Conquer: Each subarray is sorted individually using the merge sort
algorithm.
3. Merge: The sorted subarrays are merged back together in sorted
order. The process continues until all elements from both subarrays have been merged.
Let’s sort the array or list [38, 27, 43, 10] using Merge Sort
Divide:
● [38, 27, 43, 10] is divided into [38, 27] and [43, 10].
Conquer:
● [38, 27] is sorted to [27, 38] and [43, 10] is sorted to [10, 43].
Merge:
● Merge [27, 38] and [10, 43] to get the final sorted list [10, 27, 38, 43].
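The same divide/conquer/merge steps can be expressed as a short Python sketch (the function name merge_sort is illustrative); running it on the example above reproduces the result:

def merge_sort(arr):
    # Base case: a list of zero or one elements is already sorted.
    if len(arr) <= 1:
        return arr
    # Divide: split the list into two halves.
    mid = len(arr) // 2
    left = merge_sort(arr[:mid])    # Conquer: sort each half recursively.
    right = merge_sort(arr[mid:])
    # Merge: combine the two sorted halves in sorted order.
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged

print(merge_sort([38, 27, 43, 10]))   # [10, 27, 38, 43]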
● Time Complexity: O(n log n) in all cases.
   ○ Best case: O(n log n), when the array is already sorted or nearly sorted.
   ○ Average case: O(n log n), when the array is randomly ordered.
   ○ Worst case: O(n log n), when the array is sorted in reverse order.
● Auxiliary Space: O(n), additional space is required for the temporary array used during merging.
Advantages
● Stability: Merge Sort is a stable sorting algorithm, so elements with equal keys keep their relative order.
● Guaranteed worst-case performance: its worst case is O(n log n), so it performs well even on large datasets.
● Simplicity: the divide-and-conquer structure makes the implementation straightforward.
Disadvantages
● It is not in place: the temporary arrays need O(n) extra memory, which can be a problem when memory usage is a concern.
● Merge Sort is slower than QuickSort in general, as QuickSort is more cache friendly because it works in place.
Quick Sort
QuickSort is a sorting algorithm based on Divide and Conquer: it picks an element as a pivot and partitions the given array around the picked pivot. It works on the principle of divide and conquer, breaking the problem down into smaller sub-problems.
1. Choose a Pivot: Select an element from the array as the pivot. The
choice of pivot can vary (e.g., first element, last element, random
element, or median).
2. Partition the Array: Rearrange the array around the pivot. After
partitioning, all elements smaller than the pivot will be on its left,
and all elements greater than the pivot will be on its right. The pivot
is then in its correct position, and we obtain the index of the pivot.
3. Recursively Call: Recursively apply the same process to the two sub-arrays to the left and right of the pivot.
Choice of Pivot
There are several common ways to pick the pivot:
● Always pick the first (or last) element as the pivot. The problem with this approach is that it degenerates to the worst case when the array is already sorted.
● Pick a random element as the pivot. This is a preferred approach because it does not have a pattern for which the worst case happens.
● Pick the median element as the pivot. This is an ideal approach in terms of time complexity, as we can find the median in linear time and the partition function will always divide the input array into two halves. In practice, however, it is slower on average because median finding has high constants.
Partition Algorithm
1. Naive Partition: Here we create a copy of the array. We first put all elements smaller than the pivot and then all the greater ones into the temporary array, and finally copy the temporary array back to the original array. This requires O(n) extra space.
2. Lomuto Partition: A simple scheme in which we keep track of the index separating smaller elements from the rest and keep swapping; it is widely used because of its simplicity (this is the scheme used in the sketch after this list).
3. Hoare's Partition: This is the fastest of the three. Here we traverse the array from both sides and keep swapping greater elements on the left with smaller elements on the right until the two scans cross.
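The sketch below shows QuickSort with the Lomuto partition scheme described above, using the last element as the pivot (the names lomuto_partition and quick_sort are illustrative):

def lomuto_partition(arr, low, high):
    # Lomuto scheme: take the last element as the pivot and keep the index
    # of the boundary between the smaller-than-pivot and greater regions.
    pivot = arr[high]
    i = low - 1
    for j in range(low, high):
        if arr[j] < pivot:
            i += 1
            arr[i], arr[j] = arr[j], arr[i]
    # Place the pivot at its final position and return that index.
    arr[i + 1], arr[high] = arr[high], arr[i + 1]
    return i + 1

def quick_sort(arr, low=0, high=None):
    if high is None:
        high = len(arr) - 1
    if low < high:
        p = lomuto_partition(arr, low, high)   # pivot is now fixed at index p
        quick_sort(arr, low, p - 1)            # recurse on the left part
        quick_sort(arr, p + 1, high)           # recurse on the right part
    return arr

print(quick_sort([10, 80, 30, 90, 40, 50, 70]))   # [10, 30, 40, 50, 70, 80, 90]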
Illustration of QuickSort Algorithm
In the previous step, we looked at how the partitioning process rearranges
the array based on the chosen pivot. Next, we apply the same method
recursively to the smaller sub-arrays on the left and right of the pivot. Each
time, we select new pivots and partition the arrays again. This process
continues until only one element is left, which is always sorted. Once every element has reached its correct position, the entire array is sorted.
Quick Sort is a crucial algorithm in the industry, but there are other sorting algorithms that may be more suitable in specific scenarios.
Time Complexity:
● Best Case: Ω(n log n), occurs when the pivot element divides the array into two roughly equal halves.
● Average Case: θ(n log n), on average the pivot divides the array into two parts, though not necessarily equal ones.
● Worst Case: O(n²), occurs when the smallest or largest element is always chosen as the pivot, for example on an already sorted array with a first- or last-element pivot.
Advantages
● It is a divide-and-conquer algorithm, which makes it easier to solve problems.
● It is efficient on large datasets and has low overhead, as it only requires a small amount of extra memory to function.
● It is among the fastest general-purpose sorting algorithms in practice when stability is not required.
● It is tail recursive, and hence all the tail call optimization can be done.
Disadvantages
● Its worst case is O(n²), which occurs when the pivot is chosen poorly.
● It is not a stable sort, meaning that if two elements have the same key, their relative order will not be preserved in the sorted output, because elements are swapped according to their relation to the pivot (without considering their original positions).