Asymptotic Notations and Complexity Analysis

This document discusses asymptotic notations and the complexity analysis of algorithms. It defines key terminology such as algorithms, time and space complexity measures, and the asymptotic notations Big-O, Big-Omega, and Big-Theta, which describe how an algorithm's running time scales with input size. Examples are given of analyzing loops and deriving common time complexities such as O(1), O(log n), O(n), and O(n^2). Review questions are provided at the end to check understanding of algorithm analysis concepts.


Data Structures

Lecture: Asymptotic Notations & Complexity Analysis

By
Arvind Kumar
Asst. Professor,
Lovely Professional University, Punjab
Contents
• Basic Terminology
• Complexity of Algorithm
• Asymptotic Notations
• Review Questions
Basic Terminology
• Algorithm: a finite, step-by-step list of well-defined instructions for solving a particular problem.

• Complexity of an Algorithm: a function that gives the running time and/or space requirement in terms of the input size.

• Time and Space are the two major measures of the efficiency of an algorithm.
Algorithm
Specification of Input (e.g., any sequence of natural numbers) → Algorithm as a function of the Input → Specification of Output (e.g., the same numbers in sorted order)
Characteristics of Good Algorithm
• Efficient in terms of:
  • Running time
  • Space used

• Efficiency as a function of input size:
  • Size of the input
  • Number of data elements
Time-Space Tradeoff
• By increasing the amount of space for storing the
data, one may be able to reduce the time needed for
processing the data, or vice versa.
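For example, caching previously computed results trades extra memory for speed. The sketch below is a minimal, illustrative C example (not from the original slides): a table of already-computed Fibonacci numbers turns an exponential-time recursion into a linear-time one at the cost of O(n) extra space.

    #include <stdio.h>

    #define MAX_N 90                      /* values up to fib(90) fit in unsigned long long */

    unsigned long long memo[MAX_N + 1];   /* extra space: O(n) */

    /* Fibonacci with memoization: O(n) time, O(n) extra space.
       Without the table, the same recursion takes exponential time. */
    unsigned long long fib(int n)
    {
        if (n <= 1)
            return n;
        if (memo[n] != 0)                 /* already computed: reuse it */
            return memo[n];
        memo[n] = fib(n - 1) + fib(n - 2);
        return memo[n];
    }

    int main(void)
    {
        printf("fib(50) = %llu\n", fib(50));
        return 0;
    }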
Complexity of Algorithm
• Time and Space used by an algorithm M are the two main measures of its efficiency.

• Time is measured by counting the number of key operations.

• Space is measured by counting the maximum amount of memory needed by the algorithm.

• The Complexity of an Algorithm is a function f(n) which gives the running time and/or space requirement of algorithm M in terms of the size n of the input data.

• Worst Case: the maximum value of f(n) over all possible inputs of size n.

• Average Case: the expected or average value of f(n).

• Best Case: the minimum possible value of f(n).


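As a concrete illustration of these three cases (not from the original slides), consider searching an unsorted array for a key by scanning it from left to right:

    /* Linear search: returns the index of key in a[0..n-1], or -1 if absent.
       Best case:    key is at a[0]         -> 1 comparison,  O(1)
       Worst case:   key is last or absent  -> n comparisons, O(n)
       Average case: about n/2 comparisons  -> still O(n)                */
    int linear_search(const int a[], int n, int key)
    {
        for (int i = 0; i < n; i++)
            if (a[i] == key)
                return i;
        return -1;
    }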
Analysis of Insertion Sort Algorithm
                                       cost   times
for j ← 2 to n do                      c1     n
    key ← A[j]                         c2     n-1
    i ← j-1                            c3     n-1
    while i > 0 and A[i] > key         c4     Σ (j=2 to n) tj
        do A[i+1] ← A[i]               c5     Σ (j=2 to n) (tj - 1)
           i--                         c6     Σ (j=2 to n) (tj - 1)
    A[i+1] ← key                       c7     n-1

Here tj is the number of times the while-loop test is executed for that value of j.

Total Time = n(c1 + c2 + c3 + c7) + (c4 + c5 + c6) Σ (j=2 to n) tj
             - (c5 + c6)(n - 1) - (c2 + c3 + c7)
Analysis of Insertion Sort
Total Time = n(c1 + c2 + c3 + c7) + (c4 + c5 + c6) Σ (j=2 to n) tj
             - (c5 + c6)(n - 1) - (c2 + c3 + c7)

• Best Case: the elements are already sorted, so tj = 1 for every j and
  the running time f(n) is a linear function of n, i.e. O(n).

• Worst Case: the elements are sorted in reverse order, so tj = j and
  the running time f(n) is a quadratic function of n, i.e. O(n^2).

• Average Case: tj ≈ j/2 on average, and the running time is still a
  quadratic function of n, i.e. O(n^2).
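For reference, a direct C translation of the pseudocode above (an illustrative sketch; the array is indexed from 0 to n-1 rather than 1 to n):

    /* Insertion sort on a[0..n-1], following the pseudocode above. */
    void insertion_sort(int a[], int n)
    {
        for (int j = 1; j < n; j++) {       /* for j <- 2 to n          */
            int key = a[j];                 /* key <- A[j]              */
            int i = j - 1;                  /* i <- j-1                 */
            while (i >= 0 && a[i] > key) {  /* while i>0 and A[i] > key */
                a[i + 1] = a[i];            /* shift the element right  */
                i--;
            }
            a[i + 1] = key;                 /* A[i+1] <- key            */
        }
    }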
 
Rate of Growth
• The rate of growth of some standard functions g(n), from slowest to fastest, is:

  log2 n < n < n log2 n < n^2 < n^3 < 2^n
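To get a feel for these rates, take n = 1,024 (an illustrative calculation, not from the original slides): log2 n = 10, n = 1,024, n log2 n = 10,240, n^2 ≈ 10^6, n^3 ≈ 10^9, while 2^n ≈ 10^308, far beyond anything a computer could ever execute step by step.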


Asymptotic Notations
• Goal: to simplify the analysis of running time.

• Useful for identifying how the running time of an algorithm increases with the size of the input, in the limit.

• An asymptote is a line that approaches a curve but never touches it.
Special Classes of Algorithms
• Logarithmic: O(log n)
• Linear: O(n)
• Quadratic: O(n^2)
• Polynomial: O(n^k), k >= 1
• Exponential: O(a^n), a > 1
Big-Oh (O) Notation
• Asymptotic upper bound.

• f(n) = O(g(n)) if there exist positive constants c and n0 such that
  f(n) <= c g(n) for all n >= n0.

• f(n) and g(n) are functions over non-negative integers.

• Used for worst-case analysis.
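For instance (an illustrative example, not from the original slides): f(n) = 3n + 2 is O(n), because 3n + 2 <= 4n for all n >= 2; take c = 4 and n0 = 2.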
Big-Oh (O) Notation
• Simple Rule: drop lower-order terms and constant factors.

Example:
• 50n log n is O(n log n)
• 8n^2 log n + 5n^2 + n is O(n^2 log n)
Big-Omega (Ω) Notation
• Asymptotic lower bound.

• f(n) = Ω(g(n)) if there exist positive constants c and n0 such that
  c g(n) <= f(n) for all n >= n0.

• Used to describe best-case running time.
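Continuing the illustrative example: f(n) = 3n + 2 is Ω(n), because 3n + 2 >= 3n for all n >= 1; take c = 3 and n0 = 1.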
Big-Theta (Θ) Notation
• Asymptotic tight bound.

• f(n) = Θ(g(n)) if there exist positive constants c1, c2 and n0 such that
  c1 g(n) <= f(n) <= c2 g(n) for all n >= n0.

• f(n) = Θ(g(n)) iff f(n) = O(g(n)) and f(n) = Ω(g(n)).
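Combining the two illustrative bounds above: 3n + 2 = Θ(n), since 3n <= 3n + 2 <= 4n for all n >= 2; take c1 = 3, c2 = 4 and n0 = 2.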
Little-Oh (o) Notation
• Non-tight (strict) analogue of Big-Oh.

• f(n) = o(g(n)) if for every positive constant c there exists an n0 such that
  f(n) < c g(n) for all n >= n0.

• Used for comparisons of running times.
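For instance (illustrative): 5n + 3 = o(n^2), since for every c > 0 the inequality 5n + 3 < c n^2 holds once n is large enough; but 5n + 3 is not o(n), because no n0 works when c <= 5.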


Analysis of Algorithms
• // Here c is a constant
  for (int i = 1; i <= c; i++)
  {
      // some O(1) expressions
  }

• O(1): the time complexity of a function (or set of statements) is O(1) if it contains no loop, no recursion, and no call to a non-constant-time function. A loop that runs a constant number of times, as above, is also O(1).
Analysis of Algorithms
• for (int i = 1; i <= n; i += c)
  { // some O(1) expressions }

• for (int i = n; i > 0; i -= c)
  { // some O(1) expressions }

• O(n): the time complexity of a loop is O(n) if the loop variable is incremented/decremented by a constant amount, since the body then executes about n/c times.
Analysis of Algorithms
• for (int i = 1; i <= n; i += c)
  {
      for (int j = 1; j <= n; j += c)
      { // some O(1) expressions }
  }

• for (int i = n; i > 0; i -= c)
  {
      for (int j = i + 1; j <= n; j += c)
      { // some O(1) expressions }
  }

• O(n^2): the time complexity of nested loops equals the number of times the innermost statement is executed; both loops above execute it O(n^2) times.
Analysis of Algorithms
• for (int i = 1; i <= n; i *= c)
  { // some O(1) expressions }

• for (int i = n; i > 0; i /= c)
  { // some O(1) expressions }

• O(log n): the time complexity of a loop is O(log n) if the loop variable is multiplied/divided by a constant factor, since the variable reaches (or falls below) the bound after about log_c(n) iterations.
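A classic algorithm with this behaviour is binary search on a sorted array, which halves the remaining search range on every iteration. The sketch below is illustrative and not part of the original slides:

    /* Binary search on a sorted array a[0..n-1].
       The range [lo, hi] is halved each iteration, so the loop runs
       O(log n) times. Returns the index of key, or -1 if absent. */
    int binary_search(const int a[], int n, int key)
    {
        int lo = 0, hi = n - 1;
        while (lo <= hi) {
            int mid = lo + (hi - lo) / 2;   /* midpoint, written to avoid overflow */
            if (a[mid] == key)
                return mid;
            else if (a[mid] < key)
                lo = mid + 1;               /* discard the left half  */
            else
                hi = mid - 1;               /* discard the right half */
        }
        return -1;
    }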
Analysis of Algorithms
• for (int i = 2; i <= n; i = pow(i, c))
  { // some O(1) expressions }

• // Here fun is sqrt or cuberoot or any other constant root
  for (int i = n; i > 1; i = fun(i))
  { // some O(1) expressions }

• O(log log n): the time complexity of a loop is O(log log n) if the loop variable is increased/decreased exponentially by a constant factor. In the first loop, after k iterations i = 2^(c^k), which exceeds n after about log_c(log2 n) iterations.
Analysis of Algorithms
• for (int i = 2; i * i <= n; i++)
  { // some O(1) expressions }

• O(√n): the loop runs while i * i <= n, i.e. for about √n iterations.
Review Questions
• When is an algorithm said to be better than another?

• Can an algorithm have different running times on different machines?

• How does an algorithm's running time depend on the machine on which it is executed?
Review Questions
Find out the complexity:

function ()
{
    if (condition)
    {
        for (i = 0; i < n; i++)
        {
            // simple statements
        }
    }
    else
    {
        for (j = 1; j < n; j++)
            for (k = n; k > 0; k--)
            {
                // simple statement
            }
    }
}