
Asymptotic Analysis

Asymptotic analysis of an algorithm refers to defining the mathematical
foundation/framing of its run-time performance. Using asymptotic analysis,
we can determine the best-case, average-case, and worst-case running time
of an algorithm.

Asymptotic analysis is input-bound, i.e., if there is no input to the algorithm,
it is assumed to run in constant time. All factors other than the input are
treated as constant.

Asymptotic analysis refers to computing the running time of an operation in
mathematical units of computation. For example, the running time of one
operation is computed as f(n) = n and that of another operation as g(n) = n².
This means the running time of the first operation will increase linearly with
the increase in n, and the running time of the second operation will increase
quadratically as n increases. However, the running times of both operations
will be nearly the same if n is significantly small.
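
As a rough illustration of this point (an added sketch, not part of the original notes; the specific functions f(n) = n and g(n) = n² are the ones assumed in the paragraph above), the short Python program below tabulates both cost functions and shows that they are close for small n but diverge quickly as n grows.

# Illustrative sketch: compare a linear cost f(n) = n with a quadratic cost g(n) = n**2.
# The functions are assumptions chosen only to mirror the discussion above.

def f(n):          # linear "running time" in abstract units
    return n

def g(n):          # quadratic "running time" in abstract units
    return n ** 2

print(f"{'n':>8} {'f(n)=n':>12} {'g(n)=n^2':>12}")
for n in (1, 2, 4, 16, 256, 1024):
    print(f"{n:>8} {f(n):>12} {g(n):>12}")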

Usually, the time required by an algorithm falls under three types −

 • Best Case − Minimum time required for program execution.
 • Average Case − Average time required for program execution.
 • Worst Case − Maximum time required for program execution.
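
The three cases above can be seen concretely in linear search (an added example, not part of the original notes; the function name and sample data are assumptions for illustration): the best case occurs when the key sits at the first position, the worst case when it is at the last position or absent, and the average case when it lies somewhere in the middle.

# Illustrative sketch: best, average and worst case of linear search.

def linear_search(items, key):
    """Return the index of key in items, or -1 if it is not present."""
    for i, value in enumerate(items):
        if value == key:
            return i          # found: the loop stops early
    return -1                 # not found: every element was examined

data = [7, 3, 9, 1, 5]
linear_search(data, 7)    # best case: key at index 0, one comparison
linear_search(data, 9)    # average-like case: key near the middle
linear_search(data, 42)   # worst case: key absent, n comparisons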

Asymptotic Notations
Execution time of an algorithm depends on the instruction set, processor
speed, disk I/O speed, etc. Hence, we estimate the efficiency of an algorithm
asymptotically.

The time function of an algorithm is represented by T(n), where n is the
input size.

Different asymptotic notations are used to represent the complexity of an
algorithm. The following asymptotic notations are used to express the
running-time complexity of an algorithm.

 • O − Big Oh Notation
 • Ω − Big Omega Notation
 • θ − Big Theta Notation
 • o − Little Oh Notation
 • ω − Little Omega Notation

Big Oh, O: Asymptotic Upper Bound
The notation O(n) is the formal way to express the upper bound of an
algorithm's running time. It is the most commonly used notation. It measures
the worst-case time complexity, i.e., the longest amount of time an algorithm
can possibly take to complete.

A function f(n) can be represented as the order of g(n), that is O(g(n)), if there
exist a positive integer n0 and a positive constant c such that

f(n) ⩽ c.g(n) for all n > n0

Hence, function g(n) is an asymptotic upper bound for function f(n): beyond n0,
f(n) never exceeds c.g(n).

Example

Let us consider a given function, f(n) = 4.n³ + 10.n² + 5.n + 1.

Considering g(n) = n³,

f(n) ⩽ 5.g(n) for all values of n ⩾ 11.

Hence, the complexity of f(n) can be represented as O(g(n)), i.e. O(n³).
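
The inequality above can be checked numerically. The Python sketch below (an added illustration, not part of the original notes; f and g are exactly the functions of the example) reports the smallest n from which f(n) ⩽ 5.g(n) holds.

# Check f(n) <= 5*g(n) for f(n) = 4n^3 + 10n^2 + 5n + 1 and g(n) = n^3.

def f(n):
    return 4 * n**3 + 10 * n**2 + 5 * n + 1

def g(n):
    return n**3

# Find the smallest n0 such that f(n) <= 5*g(n); since n^3 dominates the
# lower-order terms, the inequality then holds for every larger n as well.
n0 = next(n for n in range(1, 1000) if f(n) <= 5 * g(n))
print("f(n) <= 5*g(n) from n =", n0)   # prints 11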

Big Omega, Ω: Asymptotic Lower Bound


The notation Ω(n) is the formal way to express the lower bound of an
algorithm's running time. It measures the best-case time complexity, i.e., the
minimum amount of time an algorithm can possibly take to complete.

We say that f(n) = Ω(g(n)) when there exists a positive constant c such that
f(n) ⩾ c.g(n) for all sufficiently large values of n. Here n is a positive integer.
It means function g is a lower bound for function f; after a certain value of n,
f will never go below c.g(n).

Example

Let us consider a given function, f(n) = 4.n³ + 10.n² + 5.n + 1.

Considering g(n) = n³, f(n) ⩾ 4.g(n) for all values of n > 0.

Hence, the complexity of f(n) can be represented as Ω(g(n)), i.e. Ω(n³).
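
Likewise, the lower-bound inequality can be verified numerically; the short sketch below (an added illustration, reusing the example's f and g) confirms f(n) ⩾ 4.g(n) over a range of positive n.

# Check f(n) >= 4*g(n) for f(n) = 4n^3 + 10n^2 + 5n + 1 and g(n) = n^3.

def f(n):
    return 4 * n**3 + 10 * n**2 + 5 * n + 1

def g(n):
    return n**3

assert all(f(n) >= 4 * g(n) for n in range(1, 10_000))
print("f(n) >= 4*g(n) holds for n = 1 .. 9999")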

Theta, θ: Asymptotic Tight Bound
The notation θ(n) is the formal way to express both the lower bound and the
upper bound of an algorithm's running time. Some may confuse theta notation
with average-case time complexity; while big theta notation can often describe
the average case reasonably well, it is a distinct concept, and other notations
can be used for the average case as well.

We say that f(n) = θ(g(n)) when there exist positive constants c1 and c2 such
that c1.g(n) ⩽ f(n) ⩽ c2.g(n) for all sufficiently large values of n. Here n is
a positive integer.

This means function g is a tight bound for function f.

Example

Let us consider a given function, f(n) = 4.n³ + 10.n² + 5.n + 1.

Considering g(n) = n³, 4.g(n) ⩽ f(n) ⩽ 5.g(n) for all sufficiently large values
of n (for example, n ⩾ 11).

Hence, the complexity of f(n) can be represented as θ(g(n)), i.e. θ(n³).
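
A quick way to see the tight bound (an added observation, not in the original notes) is to look at the ratio f(n)/g(n): for large n it stays between the two constants c1 = 4 and c2 = 5 from the example and approaches 4 as n grows.

# The ratio f(n)/g(n) settles between c1 = 4 and c2 = 5 for large n,
# which is the intuition behind f(n) = theta(n^3).

def f(n):
    return 4 * n**3 + 10 * n**2 + 5 * n + 1

def g(n):
    return n**3

for n in (10, 100, 1_000, 10_000):
    print(n, f(n) / g(n))   # 5.051, 4.1005..., 4.0100..., 4.0010...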

Common Asymptotic Notations
Following is a list of some common asymptotic notations −

constant − O(1)

logarithmic − O(log n)

linear − O(n)

n log n − O(n log n)

quadratic − O(n²)

cubic − O(n³)

polynomial − n^O(1)

exponential − 2^O(n)
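
The difference between the growth rates listed above can be made tangible with a small table. The Python sketch below is an added illustration: the input sizes are arbitrary, and 2^n stands in as a representative exponential cost (capped so the output stays readable).

import math

# Approximate step counts for the common complexity classes listed above.

classes = [
    ("O(1)",       lambda n: 1),
    ("O(log n)",   lambda n: math.log2(n)),
    ("O(n)",       lambda n: n),
    ("O(n log n)", lambda n: n * math.log2(n)),
    ("O(n^2)",     lambda n: n**2),
    ("O(n^3)",     lambda n: n**3),
    ("O(2^n)",     lambda n: 2**n if n <= 64 else float("inf")),
]

for n in (8, 64, 1024):
    print(f"n = {n}")
    for name, cost in classes:
        print(f"  {name:<12} ~ {cost(n):,.0f}")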

Apriori and Aposteriori Analysis


Apriori analysis means the analysis is performed prior to running the algorithm
on a specific system. In this stage, the time function is defined using a
theoretical model. Hence, we determine the time and space complexity of an
algorithm just by looking at the algorithm, rather than by running it on a
particular system with a specific memory, processor, and compiler.

Aposteriori analysis of an algorithm means we analyse the algorithm only after
running it on a system. The result directly depends on the system and changes
from system to system.

In industry, we cannot rely on aposteriori analysis, as software is generally
written for anonymous users who run it on systems different from those
available to the developers.

This is why, in apriori analysis, we use asymptotic notations to determine
time and space complexity: the actual values change from computer to computer,
but asymptotically they remain the same.
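
As an added, hedged illustration of this contrast (not part of the original notes; the routine count_pairs and the input sizes are assumptions chosen for the example), the sketch below performs a tiny aposteriori measurement using Python's time.perf_counter. The absolute timings will differ from machine to machine, which is exactly why apriori, asymptotic reasoning is preferred, yet the growth trend (roughly 4x per doubling of n for a quadratic routine) stays the same.

import time

# A deliberately quadratic routine: counts pairs (i, j) with i < j.
def count_pairs(n):
    count = 0
    for i in range(n):
        for j in range(i + 1, n):
            count += 1
    return count

for n in (500, 1000, 2000):
    start = time.perf_counter()
    count_pairs(n)
    elapsed = time.perf_counter() - start
    # Absolute timings vary by machine (aposteriori); the ~4x growth per
    # doubling of n reflects the apriori estimate of O(n^2).
    print(f"n = {n:>5}: {elapsed:.4f} s")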
