Binary, Merge, Quick
2. Stability: Merge sort is a stable sorting algorithm, meaning it maintains the relative
order of equal elements.
Demerits:
The algorithm requires a linear amount of extra storage.
10. What is Quick Sort?
Quick Sort is a sorting algorithm based on the Divide and Conquer algorithm that
picks an element as a pivot and partitions the given array around the picked pivot by
placing the pivot in its correct position in the sorted array.
11. Give the principle of Quick Sort?
The principle of the Quicksort algorithm is given below:
Select any element as the pivot.
Split the array into 3 parts according to the following rules:
o First part: all elements in this part should be less than the pivot element.
o Second part: the single element, i.e. the pivot element itself.
o Third part: all elements in this part should be greater than or equal to the pivot element.
Then, apply the same algorithm to the first and the third part (recursively).
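The binary search driver program below calls a binarySearch function whose implementation is not reproduced in these notes. A minimal iterative sketch, assuming the same binarySearch(arr, low, high, x) signature used by the driver, could look like this:

#include <stdio.h>

/* Iterative binary search (illustrative sketch): returns the index of x
   in the sorted range arr[low..high], or -1 if x is not present. */
int binarySearch(int arr[], int low, int high, int x)
{
    while (low <= high)
    {
        int mid = low + (high - low) / 2; /* middle of the current search space */
        if (arr[mid] == x)
            return mid;       /* key found at mid */
        else if (arr[mid] < x)
            low = mid + 1;    /* search the right half */
        else
            high = mid - 1;   /* search the left half */
    }
    return -1;                /* key not present */
}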
// Driver code
int main(void)
{
    int arr[] = { 2, 3, 4, 10, 40 };
    int n = sizeof(arr) / sizeof(arr[0]);
    int x = 10;
    int result = binarySearch(arr, 0, n - 1, x);
    (result == -1)
        ? printf("Element is not present in array")
        : printf("Element is present at index %d", result);
    return 0;
}
Output
Element is present at index 3
Time Complexity: O(log N)
For the recursive approach, create a recursive function and compare the mid of the search space with the key. Based on the result, either return the index where the key is found, or call the recursive function for the next search space.
Implementation of Recursive Binary Search Algorithm:
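The recursive function itself is not reproduced in these notes. A minimal recursive sketch, assuming the same binarySearch(arr, low, high, x) signature used by the driver below, could be:

#include <stdio.h>

// Recursive binary search (illustrative sketch): returns the index of x
// in the sorted range arr[low..high], or -1 if x is not present.
int binarySearch(int arr[], int low, int high, int x)
{
    if (low > high)
        return -1;                      // empty search space: key not present
    int mid = low + (high - low) / 2;   // middle of the current search space
    if (arr[mid] == x)
        return mid;                     // key found at mid
    if (arr[mid] > x)
        return binarySearch(arr, low, mid - 1, x);   // search the left half
    return binarySearch(arr, mid + 1, high, x);      // search the right half
}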
// Driver code
int main()
{
    int arr[] = { 2, 3, 4, 10, 40 };
    int n = sizeof(arr) / sizeof(arr[0]);
    int x = 10;
    int result = binarySearch(arr, 0, n - 1, x);
    (result == -1)
        ? printf("Element is not present in array")
        : printf("Element is present at index %d", result);
    return 0;
}
Output
Element is present at index 3
Complexity Analysis of Binary Search Algorithm:
Time Complexity:
o Best Case: O(1)
o Average Case: O(log N)
o Worst Case: O(log N)
Auxiliary Space: O(1). If the recursive call stack is considered, then the auxiliary space is O(log N).
Applications of Binary Search Algorithm:
Binary search can be used as a building block for more complex algorithms used
in machine learning, such as algorithms for training neural networks or finding
the optimal hyperparameters for a model.
It can be used for searching in computer graphics such as algorithms for ray
tracing or texture mapping.
It can be used for searching a database.
Advantages of Binary Search:
Binary search is faster than linear search, especially for large arrays.
More efficient than other searching algorithms with a similar time complexity,
such as interpolation search or exponential search.
Binary search is well-suited for searching large datasets that are stored in
external memory, such as on a hard drive or in the cloud.
Disadvantages of Binary Search:
The array should be sorted.
Binary search requires that the data structure being searched be stored in
contiguous memory locations.
Binary search requires that the elements of the array be comparable, meaning that they must be able to be ordered.
2. Explain the Analysis of Merge Sort?
Let T(n) be the total time taken by the Merge Sort algorithm on n elements.
Sorting the two halves takes at most 2T(n/2) time, and merging the two sorted halves takes at most n - 1 comparisons. We ignore the '-1' because the last element still takes some time to be copied into the merged list.
So T(n) = 2T(n/2) + n ... equation 1
Note: Stopping condition T(1) = 0, because at the end only 1 element is left, which only needs to be copied, and no comparison is required.
Expanding equation 1 repeatedly gives T(n) = 2^i T(n/2^i) + i*n. The expansion stops when n/2^i = 1, i.e. when n = 2^i. Taking logarithms:
log n = i log 2
i = log2 n
Substituting i = log2 n and T(1) = 0 into the expansion gives
T(n) = n*log2 n, i.e. T(n) = O(n*log n)
Best Case Complexity: The merge sort algorithm has a best-case time complexity of O(n*log n), even for an already sorted array.
Average Case Complexity: The average-case time complexity of merge sort is O(n*log n), which occurs when the elements are in jumbled order, i.e., neither ascending nor descending.
Worst Case Complexity: The worst-case time complexity is also O(n*log n), which occurs when we sort an array that is in descending order into ascending order.
Merge sort is the sorting technique that follows the divide and conquer approach. This
article will be very helpful and interesting to students as they might face merge sort as a
question in their examinations. In coding or technical interviews for software engineers,
sorting algorithms are widely asked. So, it is important to discuss the topic.
Merge sort is similar to the quick sort algorithm in that it uses the divide and conquer approach to sort the elements. It is one of the most popular and efficient sorting algorithms. It divides the given list into two equal halves, calls itself for the two halves, and then merges the two sorted halves. We have to define the merge() function to perform the merging.
The sub-lists are divided again and again into halves until each list can no longer be divided. Then we combine the pairs of one-element lists into two-element lists, sorting them in the process. The sorted two-element lists are merged into four-element lists, and so on, until we get the fully sorted list.
Algorithm
In the following algorithm, arr is the given array, beg is the index of the first element, and end is the index of the last element of the array.
The important part of the merge sort is the MERGE function. This function performs the
merging of two sorted sub-arrays that are A[beg…mid] and A[mid+1…end], to build one
sorted array A[beg…end]. So, the inputs of the MERGE function are A[], beg,
mid, and end.
To understand the working of the merge sort algorithm, let's take an unsorted array, say a[] = {12, 31, 25, 8, 32, 17, 40, 42}. It will be easier to understand merge sort via an example.
According to the merge sort, first divide the given array into two equal halves. Merge sort
keeps dividing the list into equal parts until it cannot be further divided.
As there are eight elements in the given array, it is divided into two sub-arrays of size 4.
Now, again divide these two arrays into halves. As they are of size 4, divide them into new arrays of size 2.
Now, divide these arrays again to get atomic values that cannot be divided any further.
In combining, first compare the elements of each pair of arrays and then combine them into another array in sorted order.
So, first compare 12 and 31; both are already in sorted positions. Then compare 25 and 8, and in the list of two values, put 8 first followed by 25. Then compare 32 and 17, sort them, and put 17 first followed by 32. After that, compare 40 and 42, and place them sequentially.
In the next iteration of combining, compare the arrays with two data values and merge them into arrays of four values, each in sorted order.
Now, there is a final merging of the arrays. After the final merge of the above arrays, the array looks like:
8 12 17 25 31 32 40 42
Now, the array is completely sorted.
Now, let's see the time complexity of merge sort in best case, average case, and in worst case.
We will also see the space complexity of the merge sort.
1. Time Complexity
o Best Case Complexity - It occurs when there is no sorting required, i.e. the array is
already sorted. The best-case time complexity of merge sort is O(n*logn).
o Average Case Complexity - It occurs when the array elements are in jumbled order
that is not properly ascending and not properly descending. The average case time
complexity of merge sort is O(n*logn).
o Worst Case Complexity - It occurs when the array elements are required to be sorted
in reverse order. That means suppose you have to sort the array elements in ascending
order, but its elements are in descending order. The worst-case time complexity of
merge sort is O(n*logn).
2. Space Complexity
o The space complexity of merge sort is O(n). This is because merge sort requires an extra array of the same size to hold the elements while merging the sub-arrays.
o Stable: YES
Implementation of merge sort
Now, let's see the programs of merge sort in different programming languages.
#include <stdio.h>

/* Function to merge the subarrays of a[] */
void merge(int a[], int beg, int mid, int end)
{
    int i, j, k;
    int n1 = mid - beg + 1;
    int n2 = end - mid;

    int LeftArray[n1], RightArray[n2]; // temporary arrays

    /* copy data to temp arrays */
    for (int i = 0; i < n1; i++)
        LeftArray[i] = a[beg + i];
    for (int j = 0; j < n2; j++)
        RightArray[j] = a[mid + 1 + j];

    i = 0;   /* initial index of first sub-array */
    j = 0;   /* initial index of second sub-array */
    k = beg; /* initial index of merged sub-array */

    while (i < n1 && j < n2)
    {
        if (LeftArray[i] <= RightArray[j])
        {
            a[k] = LeftArray[i];
            i++;
        }
        else
        {
            a[k] = RightArray[j];
            j++;
        }
        k++;
    }
    while (i < n1)
    {
        a[k] = LeftArray[i];
        i++;
        k++;
    }

    while (j < n2)
    {
        a[k] = RightArray[j];
        j++;
        k++;
    }
}

void mergeSort(int a[], int beg, int end)
{
    if (beg < end)
    {
        int mid = (beg + end) / 2;
        mergeSort(a, beg, mid);
        mergeSort(a, mid + 1, end);
        merge(a, beg, mid, end);
    }
}

/* Function to print the array */
void printArray(int a[], int n)
{
    int i;
    for (i = 0; i < n; i++)
        printf("%d ", a[i]);
    printf("\n");
}

int main()
{
    int a[] = { 12, 31, 25, 8, 32, 17, 40, 42 };
    int n = sizeof(a) / sizeof(a[0]);
    printf("Before sorting array elements are - \n");
    printArray(a, n);
    mergeSort(a, 0, n - 1);
    printf("After sorting array elements are - \n");
    printArray(a, n);
    return 0;
}
Output:
Before sorting array elements are -
12 31 25 8 32 17 40 42
After sorting array elements are -
8 12 17 25 31 32 40 42
Merge Sort
Merge sort is yet another sorting algorithm that falls under the category of the Divide and Conquer technique. It is one of the best sorting techniques built around a recursive algorithm.
In this technique, we segment a problem into two halves and solve them individually. After
finding the solution of each half, we merge them back to represent the solution of the main
problem.
Suppose we have an array A, and our main concern is to sort the subsection that starts at index p and ends at index r, represented by A[p..r].
Divide
If q is taken to be the central point somewhere in between p and r, then we split the subarray A[p..r] into two subarrays A[p..q] and A[q+1..r].
Conquer
After splitting the array into two halves, the next step is to conquer. In this step, we individually sort both of the subarrays A[p..q] and A[q+1..r]. If we have not yet reached the base case, we again follow the same procedure, i.e., we further segment these subarrays and sort them separately.
Combine
Once the conquer step reaches the base case, we obtain the sorted subarrays A[p..q] and A[q+1..r], which we then merge back to form a single sorted array A[p..r].
The MergeSort function keeps on splitting an array into two halves until a condition is met
where we try to perform MergeSort on a subarray of size 1, i.e., p == r.
And then, it combines the individually sorted subarrays into larger arrays until the whole
array is merged.
ALGORITHM MERGE-SORT (A, p, r)
1. If p < r
2.   Then q ← (p + r)/2
3.   MERGE-SORT (A, p, q)
4.   MERGE-SORT (A, q+1, r)
5.   MERGE (A, p, q, r)
The merge sort algorithm recursively divides the array into halves until the base condition is met, where we are left with only 1 element in the array. Then, the merge function picks up the sorted sub-arrays and merges them back to sort the entire array.
Any recursive algorithm depends on a base case and on its ability to merge back the results derived from the base cases. Merge sort is no different; here, the merge step simply carries more of the work.
For any given problem, the merge step is the part that combines two individually sorted lists (arrays) to build one large sorted list (array).
The merge sort algorithm maintains three pointers, i.e., one for each of the two sub-arrays and one to track the current index of the final sorted array.
Consider the following example of an unsorted array, which we are going to sort with the
help of the Merge Sort algorithm.
A= (36,25,40,2,7,80,15)
Step 1: The merge sort algorithm iteratively divides the array into equal halves until we reach an atomic value. If there is an odd number of elements in an array, then one of the halves will have one more element than the other.
Step 2: After dividing the array into two subarrays, we notice that the order of elements within each half is the same as in the original array. We then further divide these two arrays into halves.
Step 3: Again, we divide these arrays until we reach atomic values, i.e., values that cannot be divided any further.
Step 4: Next, we merge them back in the same way as they were broken down.
Step 5: For each pair of lists, we first compare the elements and then combine them to form a new sorted list.
Step 6: In the next iteration, we compare the lists of two data values and merge them back into lists of four data values, all placed in sorted order.
Hence the array is sorted.
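As a quick check, this example can be run through the C program given earlier. A minimal driver reusing the mergeSort() and printArray() functions defined above (written here only for illustration) would be:

// Illustrative driver reusing mergeSort() and printArray() from the
// C program above, applied to the example array A.
int main()
{
    int a[] = { 36, 25, 40, 2, 7, 80, 15 };
    int n = sizeof(a) / sizeof(a[0]);
    mergeSort(a, 0, n - 1);   // sort the whole array
    printArray(a, n);         // prints: 2 7 15 25 36 40 80
    return 0;
}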
Quick sort
Divide: Rearrange the elements and split the array into two sub-arrays with an element in between, such that each element in the left sub-array is less than or equal to the pivot element and each element in the right sub-array is greater than the pivot element.
Conquer: Recursively sort the two sub-arrays.
Algorithm:
Partition Algorithm:
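The algorithm listings themselves are not reproduced in these notes. As an illustration only, here is a minimal C sketch of the scheme used in the trace below: the first element is taken as the pivot, and the scan alternates between the right and the left ends, swapping the pivot toward its final position. The names partition and quickSort are chosen here for illustration.

// Illustrative sketch (not from the original notes): quicksort with the
// first element as the pivot, scanning alternately from the right and
// from the left, as in the worked example that follows.
int partition(int a[], int beg, int end)
{
    int loc = beg;                      // current index of the pivot element
    while (beg < end)
    {
        // scan from the right toward the pivot for an element smaller than it
        while (loc < end && a[end] >= a[loc])
            end--;
        if (loc != end)
        {
            int t = a[loc]; a[loc] = a[end]; a[end] = t;   // pivot moves to 'end'
            loc = end;
        }
        // scan from the left toward the pivot for an element greater than it
        while (beg < loc && a[beg] <= a[loc])
            beg++;
        if (beg != loc)
        {
            int t = a[loc]; a[loc] = a[beg]; a[beg] = t;   // pivot moves to 'beg'
            loc = beg;
        }
    }
    return loc;                         // final position of the pivot
}

void quickSort(int a[], int beg, int end)
{
    if (beg < end)
    {
        int loc = partition(a, beg, end);   // place the pivot at index loc
        quickSort(a, beg, loc - 1);         // sort the left sublist
        quickSort(a, loc + 1, end);         // sort the right sublist
    }
}

Calling partition on the array below with 44 as the pivot reproduces the sequence of swaps shown in the trace.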
44 33 11 55 77 90 40 60 99 22 88
Let 44 be the pivot element, and start scanning from right to left.
Compare 44 with the right-side elements, and if a right-side element is smaller than 44, swap it with 44. As 22 is smaller than 44, swap them.
22 33 11 55 77 90 40 60 99 44 88
Now compare 44 with the left-side elements; if a left-side element is greater than 44, swap them. As 55 is greater than 44, swap them.
22 33 11 44 77 90 40 60 99 55 88
Repeat steps 1 and 2 recursively until we get two lists: one to the left of the pivot element 44 and one to its right.
22 33 11 40 77 90 44 60 99 55 88
22 33 11 40 44 90 77 60 99 55 88
Now, the elements to the left of 44 are smaller than 44 and the elements to its right are greater than 44.
These sublists are then sorted by the same process as above.
Merging Sublists:
Sorted list: 11 22 33 40 44 55 60 77 88 90 99
Worst Case Analysis: This is the case when the items are already in sorted order and we try to sort them again. It takes a lot of time and space.
Equation:
T(n) = T(1) + T(n-1) + n
Here n stands for the number of comparisons required to place the pivot in its exact position: if we compare the first element (the pivot) with each of the other elements, an array of n elements needs n-1 comparisons, which is approximated as n in the equation above.
T(1) = 0, because at the end only one element is left and no comparison is required.
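Expanding this recurrence step by step (the intermediate equations are not written out in these notes) shows why the worst case is quadratic:

T(n) = T(n-1) + n            (using T(1) = 0)
     = T(n-2) + (n-1) + n
     = ...
     = 2 + 3 + ... + n
     = n(n+1)/2 - 1
     = O(n^2)

Hence the worst-case time complexity of Quick Sort is O(n^2).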