Name
Cms: 1318-2023
Program: CIS
Course: DSA
Assignment 1
Q: 1 Show the complexity of all cases (worst case, average case, best case) of
the algorithms given below.
o Bubble Sort:
Worst-case: O(n^2)
Average-case: O(n^2)
Best-case: O(n)
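A minimal Python sketch (an illustration, not required assignment code) showing where these bounds come from: the early-exit flag is what gives bubble sort its O(n) best case on already-sorted input.

```python
def bubble_sort(a):
    """Sort a list in place; O(n^2) worst/average case, O(n) best case."""
    n = len(a)
    for i in range(n - 1):
        swapped = False
        for j in range(n - 1 - i):          # up to n-1-i comparisons per pass
            if a[j] > a[j + 1]:
                a[j], a[j + 1] = a[j + 1], a[j]
                swapped = True
        if not swapped:
            break  # no swaps: input already sorted, so only one O(n) pass ran
    return a
```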
o Selection Sort:
Worst-case: O(n^2)
Average-case: O(n^2)
Best-case: O(n^2)
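A sketch of why selection sort is O(n^2) even in the best case: the inner scan for the minimum always runs to the end of the unsorted part, regardless of the input's order.

```python
def selection_sort(a):
    """Sort a list in place; O(n^2) in every case."""
    n = len(a)
    for i in range(n - 1):
        m = i
        for j in range(i + 1, n):   # full scan runs even on sorted input
            if a[j] < a[m]:
                m = j
        a[i], a[m] = a[m], a[i]     # place the minimum of the unsorted part
    return a
```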
o Insertion Sort:
Worst-case: O(n^2)
Average-case: O(n^2)
Best-case: O(n)
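A sketch of insertion sort: on sorted input the shifting loop never executes, so each element costs one comparison and the whole run is O(n); on reversed input every element shifts all the way back, giving O(n^2).

```python
def insertion_sort(a):
    """Sort a list in place; O(n^2) worst/average case, O(n) best case."""
    for i in range(1, len(a)):
        key = a[i]
        j = i - 1
        while j >= 0 and a[j] > key:  # skipped entirely on sorted input
            a[j + 1] = a[j]           # shift larger elements right
            j -= 1
        a[j + 1] = key
    return a
```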
o Merge Sort:
Worst-case: O(n log n)
Average-case: O(n log n)
Best-case: O(n log n)
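A sketch of merge sort: the split always halves the input (log n levels) and each level does an O(n) merge, so every case is O(n log n).

```python
def merge_sort(a):
    """Return a sorted copy; O(n log n) in every case."""
    if len(a) <= 1:
        return a
    mid = len(a) // 2
    left = merge_sort(a[:mid])    # recursion depth is log n
    right = merge_sort(a[mid:])
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):  # O(n) merge per level
        if left[i] <= right[j]:
            merged.append(left[i]); i += 1
        else:
            merged.append(right[j]); j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged
```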
o Linear Search:
Worst-case: O(n)
Average-case: O(n)
Best-case: O(1)
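A sketch of linear search: the best case (O(1)) is the target at index 0; the worst case (O(n)) is the target at the end or absent.

```python
def linear_search(a, target):
    """Return the index of target in a, or -1 if absent."""
    for i, x in enumerate(a):
        if x == target:
            return i   # best case: found immediately, O(1)
    return -1          # worst case: scanned all n elements, O(n)
```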
o Binary Search:
Worst-case: O(log n)
Average-case: O(log n)
Best-case: O(1)
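A sketch of binary search (the input must be sorted): each comparison halves the remaining range, so at most log n steps are needed; finding the target at the first midpoint is the O(1) best case.

```python
def binary_search(a, target):
    """Return the index of target in sorted list a, or -1 if absent."""
    lo, hi = 0, len(a) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if a[mid] == target:
            return mid       # best case: hit on the first midpoint, O(1)
        elif a[mid] < target:
            lo = mid + 1     # discard the lower half
        else:
            hi = mid - 1     # discard the upper half
    return -1
```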
o Arrays:
Arrays themselves don't have a specific time complexity as they are a data structure, but
accessing an element by index in an array is O(1) in the best, average, and worst cases.
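A quick illustration: indexing computes the element's location directly from the index, so the cost does not depend on the array's length.

```python
a = [10, 20, 30, 40, 50]

# Index access computes the element's position arithmetically,
# so it costs the same regardless of the list's length: O(1).
first = a[0]
last = a[4]
print(first, last)
```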
o Prime Number Algorithm:
Prime number algorithms vary greatly depending on the implementation. A naive
approach that tests every candidate against every smaller number is O(n^2);
trial division that only checks divisors up to √n costs O(√n) per candidate;
and the Sieve of Eratosthenes finds all primes up to n in O(n log log n).
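A sketch of the Sieve of Eratosthenes, the O(n log log n) case mentioned above: each prime p crosses out its multiples starting from p², and numbers left unmarked are prime.

```python
def sieve(n):
    """Return all primes up to n using the Sieve of Eratosthenes."""
    if n < 2:
        return []
    is_prime = [True] * (n + 1)
    is_prime[0] = is_prime[1] = False
    for p in range(2, int(n ** 0.5) + 1):
        if is_prime[p]:
            # Multiples below p*p were already marked by smaller primes.
            for multiple in range(p * p, n + 1, p):
                is_prime[multiple] = False
    return [i for i, prime in enumerate(is_prime) if prime]
```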
Q.2 Explain asymptotic analysis and why we use asymptotic notations in the study
of algorithms. Briefly describe the commonly used asymptotic notations.
Asymptotic Analysis and Notations:
Asymptotic analysis is a method used to describe the behavior of functions as their input size
approaches infinity. It helps in understanding how the performance of algorithms scales with
increasing input size. Asymptotic analysis focuses on the dominant part of the function as the
input size becomes very large, ignoring lower-order terms and constant factors.
We use asymptotic notations in the study of algorithms because they provide a concise and
abstract way to express the performance characteristics of algorithms without getting bogged
down in the details of specific implementations or hardware configurations. They allow us to
compare the efficiency of algorithms independently of machine-specific constants and lower-
order terms, which can vary from one system to another.
The commonly used asymptotic notations are:
1. Big O notation (O): This notation gives an upper bound on an algorithm's
growth rate and is most often used to state the worst case. It describes the
maximum amount of time an algorithm can take as a function of the input size.
For example, if an algorithm has a time complexity of O(n^2), the time it takes
grows no faster than a quadratic function of the input size.
2. Big Omega notation (Ω): This notation gives a lower bound on an algorithm's
growth rate and is often used to state the best case. It describes the minimum
amount of time an algorithm must take as a function of the input size. For
example, if an algorithm has a time complexity of Ω(n), the time it takes is at
least linear in the input size.
3. Big Theta notation (Θ): This notation represents both the upper and lower bounds of an
algorithm's time complexity, providing a tight bound on its performance. It describes the
exact growth rate of the algorithm's running time as the input size increases. For
example, if an algorithm has a time complexity of Θ(n), it means the time taken by the
algorithm will grow linearly with the input size, neither faster nor slower.
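The three notations can also be stated formally (these are the standard textbook definitions):

```latex
f(n) = O(g(n)) \iff \exists\, c > 0,\ n_0 > 0 \ :\ 0 \le f(n) \le c\, g(n) \quad \forall\, n \ge n_0
```

```latex
f(n) = \Omega(g(n)) \iff \exists\, c > 0,\ n_0 > 0 \ :\ 0 \le c\, g(n) \le f(n) \quad \forall\, n \ge n_0
```

```latex
f(n) = \Theta(g(n)) \iff f(n) = O(g(n)) \ \text{and}\ f(n) = \Omega(g(n))
```

In words: Θ(g(n)) holds exactly when g(n) is both an upper and a lower bound on f(n), up to constant factors.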