Analysis of Algorithms: Asymptotic Notation

This document discusses asymptotic notation and complexity analysis of algorithms. It introduces Big-O, Big-Theta, and Big-Omega notation to describe the asymptotic behavior and growth rates of functions. Common examples are provided to illustrate how these notations can be used to analyze the worst-case, best-case, and average-case time complexity of algorithms like insertion sort. The relationships between the different notations are also explained.


Analysis of Algorithms

Asymptotic Notation
Instructor: S. N. Tazi
Assistant Professor, Dept. of CSE
GEC Ajmer
satya.tazi@ecajmer.ac.in
Asymptotic Complexity
• Running time of an algorithm as a function of
input size n for large n.
• Expressed using only the highest-order term
in the expression for the exact running time.
– Instead of the exact running time, say Θ(n²).
• Describes behavior of function in the limit.
• Written using Asymptotic Notation.
Asymptotic Notation
• Θ, O, Ω, o, ω
• Defined for functions over the natural numbers.
– Ex: f(n) = Θ(n²).
– Describes how f(n) grows in comparison to n2.
• Each defines a set of functions; in practice, used to compare
the growth of two functions.
• The notations describe different rate-of-growth
relations between the defining function and the
defined set of functions.
Θ-notation
For function g(n), we define Θ(g(n)),
big-Theta of n, as the set:
Θ(g(n)) = {f(n) :
∃ positive constants c1, c2, and n0,
such that ∀ n ≥ n0,
we have 0 ≤ c1·g(n) ≤ f(n) ≤ c2·g(n)
}
Intuitively: Set of all functions that
have the same rate of growth as g(n).

g(n) is an asymptotically tight bound for f(n).
Θ-notation
For function g(n), we define Θ(g(n)),
big-Theta of n, as the set:
Θ(g(n)) = {f(n) :
∃ positive constants c1, c2, and n0,
such that ∀ n ≥ n0,
we have 0 ≤ c1·g(n) ≤ f(n) ≤ c2·g(n)
}
Technically, f(n) ∈ Θ(g(n)).
Older usage: f(n) = Θ(g(n)).
I'll accept either.

f(n) and g(n) are nonnegative for large n.
Example
Θ(g(n)) = {f(n) : ∃ positive constants c1, c2, and n0,
such that ∀ n ≥ n0, 0 ≤ c1·g(n) ≤ f(n) ≤ c2·g(n)}

• 10n² − 3n = Θ(n²)
• What constants for n0, c1, and c2 will work?
(One choice is worked out below.)
• Make c1 a little smaller than the leading
coefficient, and c2 a little bigger.
• To compare orders of growth, look at the
leading term.
• Exercise: Prove that n²/2 − 3n = Θ(n²).
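One possible choice (a sketch; many other constants also work): take c1 = 9, c2 = 10, n0 = 3.
For all n ≥ 3, 10n² − 3n ≥ 9n² (because n² ≥ 3n once n ≥ 3) and 10n² − 3n ≤ 10n² (because 3n ≥ 0),
so 0 ≤ c1·n² ≤ 10n² − 3n ≤ c2·n², as the definition requires.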
Example
Θ(g(n)) = {f(n) : ∃ positive constants c1, c2, and n0,
such that ∀ n ≥ n0, 0 ≤ c1·g(n) ≤ f(n) ≤ c2·g(n)}

• Is 3n³ ∈ Θ(n⁴)?
• How about 2^(2n) ∈ Θ(2ⁿ)?
O-notation
For function g(n), we define O(g(n)),
big-O of n, as the set:
O(g(n)) = {f(n) :
∃ positive constants c and n0,
such that ∀ n ≥ n0,
we have 0 ≤ f(n) ≤ c·g(n) }
Intuitively: Set of all functions
whose rate of growth is the same as
or lower than that of g(n).
g(n) is an asymptotic upper bound for f(n).
f(n) = (g(n))  f(n) = O(g(n)).
(g(n))  O(g(n)).
Examples
O(g(n)) = {f(n) : ∃ positive constants c and n0,
such that ∀ n ≥ n0, we have 0 ≤ f(n) ≤ c·g(n) }

• Any linear function an + b is in O(n²). How?


• Show that 3n³ = O(n⁴) for appropriate c and n0 (one choice is sketched below).
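A sketch of possible constants (illustrative; other choices work too):
• For an + b with a > 0: once n is large enough that an + b ≥ 0, we have an + b ≤ a·n² + |b|·n² = (a + |b|)·n² for all n ≥ 1, so c = a + |b| works.
• For 3n³ = O(n⁴): 3n³ ≤ n⁴ whenever n ≥ 3, so c = 1 and n0 = 3 work (so does c = 3 with n0 = 1).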
Ω-notation
For function g(n), we define Ω(g(n)),
big-Omega of n, as the set:
Ω(g(n)) = {f(n) :
∃ positive constants c and n0,
such that ∀ n ≥ n0,
we have 0 ≤ c·g(n) ≤ f(n)}
Intuitively: Set of all functions
whose rate of growth is the same
as or higher than that of g(n).
g(n) is an asymptotic lower bound for f(n).
f(n) = (g(n))  f(n) = (g(n)).
(g(n))  (g(n)).
Example
Ω(g(n)) = {f(n) : ∃ positive constants c and n0, such
that ∀ n ≥ n0, we have 0 ≤ c·g(n) ≤ f(n)}

• n = Ω(lg n). Choose c and n0 (a sketch follows).


Relations Between Θ, O, Ω
Theorem: For any two functions g(n) and f(n),
f(n) = Θ(g(n)) iff
f(n) = O(g(n)) and f(n) = Ω(g(n)).

• I.e., Θ(g(n)) = O(g(n)) ∩ Ω(g(n))

• In practice, asymptotically tight bounds are
obtained from asymptotic upper and lower
bounds.
Running Times
• "Running time is O(f(n))" ⇒ worst case is O(f(n))
• An O(f(n)) bound on the worst-case running time ⇒
an O(f(n)) bound on the running time of every input.
• A Θ(f(n)) bound on the worst-case running time does not imply
a Θ(f(n)) bound on the running time of every input.
• "Running time is Ω(f(n))" ⇒ best case is Ω(f(n))
• Can still say "Worst-case running time is Ω(f(n))"
– Means the worst-case running time is given by some
unspecified function g(n) ∈ Ω(f(n)).
Example
• Insertion sort takes Θ(n²) in the worst case, so
sorting (as a problem) is O(n²). Why?

• Any sort algorithm must look at each item, so
sorting is Ω(n).

• In fact, using (e.g.) merge sort, sorting is Θ(n lg n)
in the worst case.
– Later, we will prove that no comparison sort can
do better in the worst case.
Insertion Sort Algorithm (cont.)
INSERTION-SORT(A)
1. for j = 2 to length[A]
2.     do key ← A[j]
3.     // insert A[j] into the sorted sequence A[1..j-1]
4.     i ← j - 1
5.     while i > 0 and A[i] > key
6.         do A[i+1] ← A[i]   // move A[i] one position right
7.            i ← i - 1
8.     A[i+1] ← key
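A runnable Python sketch of the same procedure (0-indexed, so the pseudocode's j = 2..length[A] becomes j = 1..len(A)-1; the function name is illustrative):

def insertion_sort(A):
    for j in range(1, len(A)):
        key = A[j]                 # insert A[j] into the sorted prefix A[0..j-1]
        i = j - 1
        while i >= 0 and A[i] > key:
            A[i + 1] = A[i]        # move A[i] one position right
            i -= 1
        A[i + 1] = key
    return A

print(insertion_sort([5, 2, 4, 6, 1, 3]))   # [1, 2, 3, 4, 5, 6]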

Correctness of Insertion Sort Algorithm

• Loop invariant
– At the start of each iteration of the for loop, the subarray
A[1..j-1] contains the elements originally in A[1..j-1], but in sorted order.
• Proof:
– Initialization: j = 2, so A[1..j-1] = A[1..1] = A[1], which is trivially sorted.
– Maintenance: each iteration inserts A[j] into its correct position within A[1..j-1], preserving the invariant.
– Termination: the loop ends with j = n+1 (n = length[A]), so A[1..j-1] = A[1..n] is in sorted order.

Analysis of Insertion Sort
INSERTION-SORT(A)                                      cost   times
1. for j = 2 to length[A]                               c1     n
2.     do key ← A[j]                                    c2     n-1
3.     // insert A[j] into sorted sequence A[1..j-1]    0      n-1
4.     i ← j - 1                                        c4     n-1
5.     while i > 0 and A[i] > key                       c5     Σ_{j=2}^{n} t_j
6.         do A[i+1] ← A[i]  // move A[i] right         c6     Σ_{j=2}^{n} (t_j - 1)
7.            i ← i - 1                                 c7     Σ_{j=2}^{n} (t_j - 1)
8.     A[i+1] ← key                                     c8     n-1
(t_j is the number of times the while-loop test in line 5 is executed for that value of j)
The total time cost T(n) = the sum of cost × times over all lines
= c1·n + c2·(n-1) + c4·(n-1) + c5·Σ_{j=2}^{n} t_j + c6·Σ_{j=2}^{n} (t_j - 1) + c7·Σ_{j=2}^{n} (t_j - 1) + c8·(n-1)

Analysis of Insertion Sort (cont.)
• Best case cost: already ordered numbers
– t_j = 1, and lines 6 and 7 are executed 0 times
– T(n) = c1·n + c2·(n-1) + c4·(n-1) + c5·(n-1) + c8·(n-1)
= (c1 + c2 + c4 + c5 + c8)·n − (c2 + c4 + c5 + c8) = c·n + c′
• Worst case cost: reverse ordered numbers
– t_j = j,
– so Σ_{j=2}^{n} t_j = Σ_{j=2}^{n} j = n(n+1)/2 − 1, and Σ_{j=2}^{n} (t_j − 1) = Σ_{j=2}^{n} (j − 1) = n(n−1)/2, and
– T(n) = c1·n + c2·(n-1) + c4·(n-1) + c5·(n(n+1)/2 − 1) + c6·(n(n−1)/2) + c7·(n(n−1)/2) + c8·(n-1)
= ((c5 + c6 + c7)/2)·n² + (c1 + c2 + c4 + c5/2 − c6/2 − c7/2 + c8)·n − (c2 + c4 + c5 + c8) = a·n² + b·n + c
• Average case cost: random numbers
– On average, t_j ≈ j/2, so T(n) is still on the order of n², the same as the worst case (see the counting sketch below).
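A small Python sketch (illustrative only) that counts how many times the line-5 test runs, making the best case (t_j = 1) and worst case (t_j = j) visible:

def count_line5_tests(A):
    tests = 0
    for j in range(1, len(A)):
        key = A[j]
        i = j - 1
        while True:
            tests += 1                    # one execution of the line-5 test
            if not (i >= 0 and A[i] > key):
                break
            A[i + 1] = A[i]
            i -= 1
        A[i + 1] = key
    return tests

n = 8                                     # illustrative input size
print(count_line5_tests(list(range(1, n + 1))))     # sorted input: n - 1 = 7 tests
print(count_line5_tests(list(range(n, 0, -1))))     # reversed input: n(n+1)/2 - 1 = 35 tests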

Asymptotic Notation in Equations
• Can use asymptotic notation in equations to
replace expressions containing lower-order terms.
• For example,
4n³ + 3n² + 2n + 1 = 4n³ + 3n² + Θ(n)
= 4n³ + Θ(n²) = Θ(n³). How to interpret?
• In equations, Θ(f(n)) always stands for an
anonymous function g(n) ∈ Θ(f(n))
– In the example above, Θ(n²) stands for
3n² + 2n + 1.
o-notation
For a given function g(n), the set little-o:
o(g(n)) = {f(n) : ∀ c > 0, ∃ n0 > 0 such that
∀ n ≥ n0, we have 0 ≤ f(n) < c·g(n)}.
f(n) becomes insignificant relative to g(n) as n
approaches infinity:
lim_{n→∞} [f(n) / g(n)] = 0

g(n) is an upper bound for f(n) that is not
asymptotically tight.
Observe the difference in this definition from previous
ones. Why?
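For example (a sketch): 2n = o(n²), since for any c > 0 we have 2n < c·n² once n > 2/c;
but 2n² is not in o(n²), since with c = 1 the inequality 2n² < n² never holds.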
ω-notation
For a given function g(n), the set little-omega:

ω(g(n)) = {f(n) : ∀ c > 0, ∃ n0 > 0 such that
∀ n ≥ n0, we have 0 ≤ c·g(n) < f(n)}.
f(n) becomes arbitrarily large relative to g(n) as n
approaches infinity:
lim_{n→∞} [f(n) / g(n)] = ∞.

g(n) is a lower bound for f(n) that is not
asymptotically tight.
Comparison of Functions
fg  ab

f (n) = O(g(n))  a  b
f (n) = (g(n))  a  b
f (n) = (g(n))  a = b
f (n) = o(g(n))  a < b
f (n) = w (g(n))  a > b
Limits
• lim_{n→∞} [f(n) / g(n)] = 0        ⇒ f(n) ∈ o(g(n))
• lim_{n→∞} [f(n) / g(n)] < ∞        ⇒ f(n) ∈ O(g(n))
• 0 < lim_{n→∞} [f(n) / g(n)] < ∞    ⇒ f(n) ∈ Θ(g(n))
• 0 < lim_{n→∞} [f(n) / g(n)]        ⇒ f(n) ∈ Ω(g(n))
• lim_{n→∞} [f(n) / g(n)] = ∞        ⇒ f(n) ∈ ω(g(n))
• lim_{n→∞} [f(n) / g(n)] undefined  ⇒ can't say
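As a quick sanity check, such limits can be evaluated with sympy (assuming it is installed); the example functions are illustrative:

from sympy import symbols, limit, log, oo

n = symbols('n', positive=True)

print(limit(3*n**3 / n**4, n, oo))            # 0   -> 3n^3 is in o(n^4)
print(limit((10*n**2 - 3*n) / n**2, n, oo))   # 10  -> finite and nonzero: Theta(n^2)
print(limit(n / log(n, 2), n, oo))            # oo  -> n is in omega(lg n)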
Properties
• Symmetry
f(n) = Θ(g(n)) iff g(n) = Θ(f(n))

• Complementarity (transpose symmetry)
f(n) = O(g(n)) iff g(n) = Ω(f(n))
f(n) = o(g(n)) iff g(n) = ω(f(n))