
Running time and asymptotic notation

Hoa T. Vu (San Diego State University, hvu2@sdsu.edu)

1 Introduction
Algorithms are the backbone of computer science. The study of algorithms predates computers (e.g., Euclid's algorithm). An algorithm is a set of instructions to solve a particular problem. Analysis of algorithms consists of the following:

• Mathematically proving that the described algorithm is correct.

• Analyzing its efficiency. In this course, we will focus on the running time. However, there are other
measures of efficiency such as memory use, communication (in distributed algorithms), etc.

Algorithms are to computer science what physics is to engineering. Some concrete applications: efficiently
sorting a large database, scheduling tasks, finding the shortest path between two locations on a map, etc.

2 Running time and asymptotic notation


The running time is measured in terms of the number of basic machine instructions, such as memory accesses
and pairwise arithmetic operations. For example, consider the following line of code:

A[i] = A[i] + 1.

This involves memory-access operations (accessing A[i]) and a pairwise arithmetic operation (the addition).
For the moment, let us adopt the following view: each line of the pseudo-code consists of a constant number
of basic instructions. We want to count how many times these lines are executed in total.
Let us look at the following example that sorts an array A[1 . . . n] of integers in increasing order. The
algorithm is called selection sort. The idea is simple: scan through A[1 . . . n] to find the smallest element and
put it in A[1], then scan through A[2 . . . n] to find the smallest of the remaining elements and put it in A[2],
and so on.
Algorithm 1: Selection sort, input: A[1 . . . n]
1 for j = 1, 2, 3, . . . , n − 1 do
2     min = j
3     for k = j + 1, . . . , n do
4         if A[min] > A[k] then
5             min = k
6     Swap A[j] and A[min]

• Lines 1, 2, and 6 are executed n − 1 times.


• Lines 3, 4, and 5 are each executed at most n − 1 times in the first outer iteration, n − 2 times in the second, . . . , and 1 time in the last.

Recall that 1 + 2 + . . . + n = n(n + 1)/2. The total running time is therefore at most

    T(n) = Σ_{j=1}^{n−1} 3(1 + (n − j)) = 3(n − 1) + 3(n − 1)n/2,

where, in each outer iteration j, the term 3 · 1 accounts for lines 1, 2, and 6 and the term 3(n − j) accounts for lines 3, 4, and 5.
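To make the pseudo-code concrete, here is a minimal Python sketch of Algorithm 1; the translation to 0-indexed arrays (A[0 . . . n − 1], as is standard in Python) is the only liberty taken.

    def selection_sort(A):
        """Sort the list A in increasing order, in place (Algorithm 1, 0-indexed)."""
        n = len(A)
        for j in range(n - 1):                   # line 1
            min_idx = j                          # line 2
            for k in range(j + 1, n):            # line 3
                if A[min_idx] > A[k]:            # line 4
                    min_idx = k                  # line 5
            A[j], A[min_idx] = A[min_idx], A[j]  # line 6 (swap)

    A = [5, 2, 4, 1, 3]
    selection_sort(A)
    print(A)  # [1, 2, 3, 4, 5]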

What is the running time of the following dummy algorithm?


Algorithm 2: A dummy algorithm
1 for j = 1, 2, 3, . . . , n do
2     x = 0
3     for k = 1, . . . , 2j do
4         y = 1
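One way to sanity-check your answer is to count line executions empirically. Here is a small Python sketch (the counting scheme is an illustration, not part of the algorithm):

    def dummy_count(n):
        """Count how many times lines 2 and 4 of Algorithm 2 execute."""
        count = 0
        for j in range(1, n + 1):          # line 1
            count += 1                     # line 2: once per outer iteration
            for k in range(1, 2 * j + 1):  # line 3
                count += 1                 # line 4: 2j times per outer iteration
        return count

    for n in [10, 100, 1000]:
        print(n, dummy_count(n) / n**2)  # the ratio tends to 1, suggesting Θ(n^2)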

3 Asymptotic notation
We often want to measure the growth of the running time as n increases. For example, when T(n) =
n^2/2 + n/2 − 1, the term n^2 dictates the growth in terms of n. We often use asymptotic notation to denote
the running time to make our life easier. We use the notation f(n) = O(g(n)) to say that the growth of f(n)
is no more than the growth of g(n).

Definition 1. We say f(n) = O(g(n)) if there exist constants c and n_0 such that f(n) ≤ c · g(n) for all
n ≥ n_0. An equivalent definition is that

    lim_{n→∞} f(n)/g(n) = d < ∞

(i.e., d is some constant). If f(n) = O(g(n)), then g(n) = Ω(f(n)), or equivalently lim_{n→∞} g(n)/f(n) > 0.

Let us consider some examples. I claim that n^2/2 + n/2 − 1 = O(n^2). One way to show this is as
follows. For n ≥ 1,

    n^2/2 + n/2 − 1 ≤ n^2 + n^2 = 2n^2.

So in this case c = 2 and n_0 = 1. Another way to show this is to use limits:

    lim_{n→∞} (n^2/2 + n/2 − 1)/n^2 = lim_{n→∞} (1/2 + 1/(2n) − 1/n^2) = 1/2 < ∞.
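A quick numeric look at this limit in Python (an illustration, not a proof):

    def f(n):
        return n**2 / 2 + n / 2 - 1

    for n in [10, 100, 1000, 10000]:
        print(n, f(n) / n**2)  # 0.54, 0.5049, ..., approaching d = 1/2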

So we say that the running time of selection sort is O(n^2). Let's look at another example: n + 10 ln n =
O(n). We can plot n and 10 ln n to see that n > 10 ln n for n ≥ 40.

Thus, for n ≥ 40, we have

    n + 10 ln n ≤ n + n = 2n.

So in this case c = 2 and n_0 = 40. We can also use the limit approach, but first recall L'Hôpital's rule:
if lim f′(n)/g′(n) exists, then lim f(n)/g(n) = lim f′(n)/g′(n). We have

    lim_{n→∞} (n + 10 ln n)/n = lim_{n→∞} (1 + 10/n)/1 = 1 < ∞.
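Instead of plotting, we can also scan for the crossover point in Python (a quick check; the choice n_0 = 40 above is simply a clean value past the true crossover):

    import math

    # find the first n >= 2 with n > 10 ln n; the inequality keeps holding afterward
    # because n - 10 ln n is increasing for n > 10
    for n in range(2, 100):
        if n > 10 * math.log(n):
            print("crossover at n =", n)  # prints 36; n_0 = 40 is a clean safe choice
            break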
Exercise: Show that log_a n = O(log_b n) for constants a and b. Hint: use the change-of-base formula for
logarithms.
Exercise: Show that 2√n + n^{1/3} log_2 n = O(√n).
We use the Θ notation to denote “similar growth rate”.

Definition 2. We say f(n) = Θ(g(n)) if f(n) = O(g(n)) and g(n) = O(f(n)). An equivalent definition is
that

    lim_{n→∞} f(n)/g(n) = d

for some constant 0 < d < ∞.

Let us consider some examples.

• I claim that 2n^100 + n^99 = Θ(n^100). To see this,

    lim_{n→∞} (2n^100 + n^99)/n^100 = 2,

and 0 < 2 < ∞. (A numeric check of this ratio appears after this list.)

• Exercise: Show that n^1.99 = O(n^2) but n^1.99 ≠ Θ(n^2).

• Exercise: Show that if n is a power of 2, then 2 + 4 + 8 + . . . + n = Θ(n).
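As referenced above, we can check the first claim numerically; Python's Fraction keeps the arithmetic exact even though n^100 is far beyond what floating point can track precisely:

    from fractions import Fraction

    for n in [10, 100, 1000]:
        r = Fraction(2 * n**100 + n**99, n**100)  # equals 2 + 1/n exactly
        print(n, float(r))  # 2.1, 2.01, 2.001 -> approaching d = 2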

Finally, o and ω are used to denote a "strictly slower growth rate" and a "strictly faster growth rate"
respectively.

Definition 3. We say

• f(n) = o(g(n)) if

    lim_{n→∞} f(n)/g(n) = 0.

• f(n) = ω(g(n)) if

    lim_{n→∞} f(n)/g(n) = ∞.

Note that f(n) = o(g(n)) is the same as g(n) = ω(f(n)).

Let’s work on some examples.

• I claim that log_2 n = o(n^0.1). To see this, using L'Hôpital's rule,

    lim_{n→∞} (log_2 n)/n^0.1 = lim_{n→∞} (ln n)/(ln 2 · n^0.1) = (1/ln 2) lim_{n→∞} (1/n)/(0.1 n^{−0.9}) = (1/ln 2) lim_{n→∞} 10/n^0.1 = 0.

• I claim that 3^n = ω(2^n). To see this,

    lim_{n→∞} 3^n/2^n = lim_{n→∞} (3/2)^n = ∞.
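A numeric illustration of both claims (the sample points are arbitrary; note how slowly (log_2 n)/n^0.1 decays, which is why we push n very far out):

    import math

    # (log_2 n)/n^0.1 tends to 0, but only for astronomically large n
    for k in [2, 6, 10, 20, 50]:
        n = 10**k
        print(n, math.log2(n) / n**0.1)

    # (3/2)^n blows up quickly
    for n in [10, 100, 500]:
        print(n, 1.5**n)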

To recap:

• f(n) = O(g(n)) means that f(n)'s growth is at most g(n)'s growth.

• f(n) = Ω(g(n)) means that f(n)'s growth is at least g(n)'s growth.

• f(n) = Θ(g(n)) means that f(n)'s growth is the same as g(n)'s growth.

• f(n) = o(g(n)) means that f(n)'s growth is strictly slower than g(n)'s growth.

• f(n) = ω(g(n)) means that f(n)'s growth is strictly faster than g(n)'s growth.

Useful facts. Let us go over some useful facts. Some proofs are omitted and left as exercises.

• A polynomial has the same growth rate as its most significant term. E.g., 3n^3 − n^2 + 100n = Θ(n^3).

• A polynomial always grows faster than a poly-logarithmic function. E.g.,

    n = ω(log^999 n),
    n^0.1 = ω(log^99 n),
    n log_2 n = o(n^1.01), . . .

• An exponential function always grows faster than a polynomial function (as long as the base is larger
than 1). E.g.,

    n^99 = o(2^n),
    n^10 = o(1.012^n), . . .

• For any constant c, 1^c + 2^c + . . . + n^c = Θ(n^{c+1}). E.g., 1^2 + 2^2 + . . . + n^2 = Θ(n^3). Let's try to
prove this. First, see that

    1^c + 2^c + . . . + n^c ≤ n^c + n^c + . . . + n^c (n times) = n^{c+1} =⇒ 1^c + 2^c + . . . + n^c = O(n^{c+1}).

It remains to show that 1^c + . . . + n^c = Ω(n^{c+1}). Here, we use a method called approximation by
integrals: picture n unit-width boxes of heights 1^c, 2^c, . . . , n^c drawn over the curve x^c. The area under
the boxes is given by 1^c + 2^c + . . . + n^c, and this area is larger than the area under the curve x^c from
0 to n. So

    1^c + 2^c + . . . + n^c ≥ ∫_0^n x^c dx = n^{c+1}/(c + 1) =⇒ 1^c + 2^c + . . . + n^c = Ω(n^{c+1}), since c is a constant.

(A numeric check of both bounds appears after this list.)

• For any constant c, we have c = O(1).

• If 0 < a < b, then a^n = o(b^n). E.g., 2^n = o(3^n).

• A common pitfall. We have to be careful when dealing with sums where the number of terms is not
fixed. For example, the following "proof" is wrong:

    1 + 2 + 3 + . . . + n = O(1) + O(1) + . . . + O(1) = O(n).

We showed that 1 + 2 + . . . + n = n(n + 1)/2 = Θ(n^2), which grows faster than O(n). What exactly
went wrong here? For any constant i, it is true that i = O(1); however, in the sum Σ_{i=1}^{n} i, we can see
that i is not a constant. It ranges over a set of values from 1 to n, which depends on n.
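As referenced above, here is a small Python check of the two bounds on 1^c + 2^c + . . . + n^c, for c = 2 (a sanity check of the Θ(n^{c+1}) claim, not a proof):

    c = 2
    for n in [10, 100, 1000]:
        s = sum(i**c for i in range(1, n + 1))
        # lower bound from the integral, upper bound from n copies of n^c
        print(n, n**(c + 1) / (c + 1) <= s <= n**(c + 1))  # True for each n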
