The document provides an overview of the time complexities for three basic sorting algorithms: Bubble Sort, Selection Sort, and Insertion Sort. All three algorithms have a worst-case time complexity of O(n^2), while Bubble Sort and Insertion Sort can achieve O(n) in the best case when the array is already sorted. The document concludes that for large datasets, more efficient algorithms like Merge Sort or Quick Sort are recommended.
Time Complexity of Bubble Sort, Selection Sort, and Insertion Sort
An Overview of Sorting Algorithms

Introduction
• Sorting algorithms are fundamental in computer science.
• We'll explore the time complexities of three basic sorting algorithms:
  1. Bubble Sort
  2. Selection Sort
  3. Insertion Sort

Bubble Sort
• Bubble Sort compares adjacent elements and swaps them if they are in the wrong order.
• Best Case: O(n) - when the array is already sorted
• Average Case: O(n^2)
• Worst Case: O(n^2)

1. Best Case – O(n)
This is the scenario where the input data is already in the best possible state for the algorithm. For Bubble Sort, the best case is when the array is already sorted, so it only needs one pass to confirm that no swaps are needed.
Why O(n)? Only one pass through the array is needed, with no swaps, so the running time is linear.
2. Average Case – O(n²)
This reflects the typical behavior of the algorithm for a random input. It assumes the elements are in a random order, and the algorithm needs to do a mix of comparisons and swaps.
Why O(n²)? On average, the algorithm ends up doing about half of the possible comparisons and swaps. Still quadratic.
3. Worst Case – O(n²)
This is the scenario where the input is in the least favorable condition. For Bubble Sort, the worst case is when the array is in reverse order (completely unsorted).
Why O(n²)? It needs to make the maximum number of comparisons and swaps to fully sort the array.

Example for Bubble Sort (n = 5)

Case           Array Example      Time Complexity
Best Case      [1, 2, 3, 4, 5]    O(n)
Average Case   [3, 5, 1, 4, 2]    O(n²)
Worst Case     [5, 4, 3, 2, 1]    O(n²)
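To make the three cases concrete, here is a minimal Bubble Sort sketch in Python (the function name `bubble_sort` and the early-exit flag are illustrative additions, not taken from the slides). The `swapped` flag is what produces the O(n) best case: on an already sorted array no swaps happen in the first pass, so the function stops after a single linear scan.

```python
def bubble_sort(arr):
    """Sort arr in place; best case O(n), average/worst case O(n^2)."""
    n = len(arr)
    for i in range(n - 1):
        swapped = False
        # After pass i, the last i elements are already in their final positions.
        for j in range(n - 1 - i):
            if arr[j] > arr[j + 1]:
                arr[j], arr[j + 1] = arr[j + 1], arr[j]  # swap the out-of-order pair
                swapped = True
        if not swapped:
            # A full pass with no swaps means the array is sorted: the O(n) best case.
            break
    return arr

print(bubble_sort([5, 4, 3, 2, 1]))  # worst-case input -> [1, 2, 3, 4, 5]
```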
Selection Sort
• Selection Sort selects the smallest element from the unsorted portion and moves it to the sorted portion.
• Best Case: O(n^2)
• Average Case: O(n^2)
• Worst Case: O(n^2)

Best Case (Sorted Array):
Even when the array is already sorted, Selection Sort still scans the entire unsorted portion on every pass to find the minimum. Total: n(n - 1)/2 comparisons → O(n²)
Average Case (Random Order):
The number of comparisons does not depend on the input order: every pass scans the whole remaining unsorted portion. Total comparisons ≈ n²/2 → O(n²), with at most one swap per pass.
Worst Case (Reversed Array):
The comparison count is the same as in the other cases, since each pass still scans the unsorted portion to find the minimum. Total ≈ n(n - 1)/2 → O(n²)
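A minimal Selection Sort sketch in Python (the name `selection_sort` and the sample input are illustrative). The nested scan runs regardless of how the input is ordered, which is why all three cases are O(n²), and there is at most one swap per outer pass.

```python
def selection_sort(arr):
    """Sort arr in place; O(n^2) comparisons in the best, average, and worst case."""
    n = len(arr)
    for i in range(n - 1):
        # Find the index of the smallest element in the unsorted portion arr[i:].
        min_idx = i
        for j in range(i + 1, n):
            if arr[j] < arr[min_idx]:
                min_idx = j
        # At most one swap per pass moves that element onto the end of the sorted portion.
        if min_idx != i:
            arr[i], arr[min_idx] = arr[min_idx], arr[i]
    return arr

print(selection_sort([3, 5, 1, 4, 2]))  # -> [1, 2, 3, 4, 5]
```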
Insertion Sort
• Insertion Sort builds the sorted array one item at a time.
• Best Case: O(n) - when the array is already sorted
• Average Case: O(n^2)
• Worst Case: O(n^2)

Best Case: O(n)
💡 Happens when the array is already sorted. Example: [1, 2, 3, 4, 5]
In each iteration, every element is already in the correct place. Only one comparison per element, no shifting. Total operations: n → O(n)
Average Case: O(n²)
💡 Happens when the elements are randomly arranged. Example: [4, 1, 3, 5, 2]
Sometimes shifting is needed, sometimes not. On average, elements are moved halfway through the sorted portion. Total operations: ~n²/4 → O(n²)
Worst Case: O(n²)
💡 Happens when the array is in reverse order. Example: [5, 4, 3, 2, 1]
Every new element is smaller than all of the previous ones, so it has to be compared and moved past every one of them. Maximum comparisons and shifts. Total operations: ~n²/2 → O(n²)

O(n^2)
The time required grows quadratically with the size of the input.

Swaps vs Comparisons
• Bubble Sort: Repeated swapping.
• Selection Sort: Swaps only once per iteration but still makes many comparisons.
• Insertion Sort: Shifts elements instead of swapping, which can be more efficient.

Conclusion
• All three sorting algorithms have O(n^2) worst-case time complexity.
• Insertion Sort performs better for nearly sorted arrays.
• For large datasets, more efficient algorithms like Merge Sort or Quick Sort are preferred.
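To wrap up the Insertion Sort discussion above, here is a minimal sketch in Python (the function name `insertion_sort` and the test inputs are illustrative). It shifts elements rather than swapping them, and on a sorted or nearly sorted input the inner while loop barely runs, which is where the O(n) best case and the good behavior on nearly sorted arrays noted in the conclusion come from.

```python
def insertion_sort(arr):
    """Sort arr in place; best case O(n), average/worst case O(n^2)."""
    for i in range(1, len(arr)):
        key = arr[i]
        j = i - 1
        # Shift larger elements of the sorted portion one slot to the right.
        while j >= 0 and arr[j] > key:
            arr[j + 1] = arr[j]
            j -= 1
        arr[j + 1] = key  # insert the current element into its correct position
    return arr

print(insertion_sort([5, 4, 3, 2, 1]))  # worst case: ~n^2/2 shifts
print(insertion_sort([1, 2, 3, 5, 4]))  # nearly sorted: only one shift needed
```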