Partial Solutions To Exercises 2.2
1. a. Since C_worst(n) = n, C_worst(n) ∈ Θ(n).
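For illustration, a sketch of sequential search with a comparison counter added (the counter is not part of the algorithm itself) shows that an unsuccessful search makes exactly n key comparisons:

```python
def sequential_search(a, key):
    """Standard sequential search; the comparison counter is for illustration only."""
    comparisons = 0
    for i, item in enumerate(a):
        comparisons += 1
        if item == key:            # the basic operation: comparison with the key
            return i, comparisons
    return -1, comparisons         # unsuccessful search: all n elements compared

# Worst case: the key is not among the n = 10 elements.
print(sequential_search(list(range(10)), -1))   # (-1, 10): C_worst(10) = 10
```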
4. a. The order of growth and the related notations O, Ω, and Θ deal with
the asymptotic behavior of functions as n goes to infinity. Therefore no
specific values of functions within a finite range of n’s values, suggestive
as they might be, can establish their orders of growth with mathematical
certainty.
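For instance (an illustrative check, not part of the exercise), 100n dominates n² on the entire finite range 1 ≤ n ≤ 100, yet n² has the higher order of growth:

```python
# 100n >= n^2 everywhere on the finite range 1 <= n <= 100 ...
print(all(100 * n >= n**2 for n in range(1, 101)))   # True
# ... and yet n^2 has the higher order of growth: the inequality fails for larger n.
print(100 * 1000 >= 1000**2)                          # False
```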
b. lim_{n→∞} (log₂ n)/n = lim_{n→∞} (log₂ n)′/(n)′ = lim_{n→∞} ((1/n) log₂ e)/1 = log₂ e · lim_{n→∞} 1/n = 0.

lim_{n→∞} n/(n log₂ n) = lim_{n→∞} 1/log₂ n = 0.

lim_{n→∞} (n log₂ n)/n² = lim_{n→∞} (log₂ n)/n = (see the first limit of this exercise) = 0.

lim_{n→∞} n²/n³ = lim_{n→∞} 1/n = 0.

lim_{n→∞} n³/2ⁿ = lim_{n→∞} (n³)′/(2ⁿ)′ = lim_{n→∞} 3n²/(2ⁿ ln 2) = (3/ln 2) lim_{n→∞} n²/2ⁿ
= (3/ln 2) lim_{n→∞} (n²)′/(2ⁿ)′ = (3/ln 2) lim_{n→∞} 2n/(2ⁿ ln 2) = (6/ln² 2) lim_{n→∞} n/2ⁿ
= (6/ln² 2) lim_{n→∞} (n)′/(2ⁿ)′ = (6/ln² 2) lim_{n→∞} 1/(2ⁿ ln 2) = (6/ln³ 2) lim_{n→∞} 1/2ⁿ = 0.

lim_{n→∞} 2ⁿ/n! = 0 (see Example 3 in the section).
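These limits can be corroborated numerically (an illustrative sketch; the sample values of n are arbitrary, and each sequence of ratios decreases toward 0):

```python
import math

ratios = [
    ("log2 n / n",     lambda n: math.log2(n) / n),
    ("n / (n log2 n)", lambda n: n / (n * math.log2(n))),
    ("n log2 n / n^2", lambda n: n * math.log2(n) / n**2),
    ("n^2 / n^3",      lambda n: n**2 / n**3),
    ("n^3 / 2^n",      lambda n: n**3 / 2**n),
    ("2^n / n!",       lambda n: 2**n / math.factorial(n)),
]
for name, r in ratios:
    # each row shrinks toward 0 as n grows from 10 to 1000
    print(f"{name:15s}", [f"{r(n):.2e}" for n in (10, 100, 1000)])
```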
5. (n − 2)! ∈ Θ((n − 2)!), 5 lg(n + 100)¹⁰ = 50 lg(n + 100) ∈ Θ(log n), 2²ⁿ = (2²)ⁿ ∈ Θ(4ⁿ), 0.001n⁴ + 3n³ + 1 ∈ Θ(n⁴), ln² n ∈ Θ(log² n), ∛n ∈ Θ(n^(1/3)), 3ⁿ ∈ Θ(3ⁿ). The list of these functions ordered in increasing order of growth looks as follows:

5 lg(n + 100)¹⁰, ln² n, ∛n, 0.001n⁴ + 3n³ + 1, 3ⁿ, 2²ⁿ, (n − 2)!
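A log-scale spot check at a single large value of n agrees with this ordering (a rough illustration only, given the caveat of Exercise 4a; n = 10⁴⁰ and the use of lgamma for the factorial are choices made here just to keep the numbers representable):

```python
import math

# Natural logarithms of the seven functions, so that a very large n can be used.
log_funcs = [
    ("5 lg(n+100)^10",      lambda n: math.log(50 * math.log2(n + 100))),
    ("ln^2 n",              lambda n: math.log(math.log(n) ** 2)),
    ("n^(1/3)",             lambda n: math.log(n) / 3),
    ("0.001n^4 + 3n^3 + 1", lambda n: math.log(0.001 * n**4 + 3 * n**3 + 1)),
    ("3^n",                 lambda n: n * math.log(3)),
    ("2^(2n)",              lambda n: 2 * n * math.log(2)),
    ("(n-2)!",              lambda n: math.lgamma(n - 1)),  # ln((n-2)!) = lgamma(n-1)
]

n = 1.0e40  # large enough for every consecutive comparison below to hold
for (name1, f1), (name2, f2) in zip(log_funcs, log_funcs[1:]):
    print(f"{name1}  <  {name2}:  {f1(n) < f2(n)}")   # True for each pair
```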
b. lim_{n→∞} a₁ⁿ/a₂ⁿ = lim_{n→∞} (a₁/a₂)ⁿ =
   0 if a₁ < a₂ ⇔ a₁ⁿ ∈ o(a₂ⁿ),
   1 if a₁ = a₂ ⇔ a₁ⁿ ∈ Θ(a₂ⁿ),
   ∞ if a₁ > a₂ ⇔ a₂ⁿ ∈ o(a₁ⁿ).
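Numerically, the three cases look as follows (the sample bases 2 and 3 are arbitrary illustrative choices):

```python
# (a1/a2)^n tends to 0, stays at 1, or tends to infinity in the three cases.
for a1, a2 in [(2, 3), (3, 3), (3, 2)]:
    print(f"a1={a1}, a2={a2}:", [(a1 / a2) ** n for n in (10, 50, 100)])
```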
7. a. ... the order of growth of g(n) is larger than or equal to the order of growth of t(n). The formal proof is immediate, too:

t(n) ≤ cg(n) for all n ≥ n₀ (where c > 0)

implies

(1/c) t(n) ≤ g(n) for all n ≥ n₀,

i.e., g(n) ∈ Ω(t(n)) with the Ω constant equal to 1/c.

b. Let f(n) ∈ Θ(αg(n)); we'll show that f(n) ∈ Θ(g(n)). Indeed,

f(n) ≤ cαg(n) for all n ≥ n₀ (where c > 0)

can be rewritten as

f(n) ≤ c₁g(n) for all n ≥ n₀ (where c₁ = cα > 0),

i.e., f(n) ∈ Θ(g(n)).

Let now f(n) ∈ Θ(g(n)); we'll show that f(n) ∈ Θ(αg(n)) for α > 0. Indeed, if f(n) ∈ Θ(g(n)),

f(n) ≤ cg(n) for all n ≥ n₀ (where c > 0),

and therefore

f(n) ≤ (c/α) αg(n) = c₁αg(n) for all n ≥ n₀ (where c₁ = c/α > 0),

i.e., f(n) ∈ Θ(αg(n)).
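A small numeric sanity check of the constant rescaling in part (b); the particular f, g, α, and c below are arbitrary illustrative choices:

```python
# If f(n) <= c*(alpha*g(n)) for n >= n0, the same bound reads f(n) <= c1*g(n)
# with c1 = c*alpha (and symmetrically with c1 = c/alpha in the other direction).
f = lambda n: 3 * n * n + 5     # an f(n) in Theta(n^2)
g = lambda n: n * n             # g(n) = n^2
alpha, c, n0 = 4.0, 1.0, 3      # f(n) <= c * alpha * g(n) for n >= n0
c1 = c * alpha
print(all(f(n) <= c * alpha * g(n) and f(n) <= c1 * g(n) for n in range(n0, 1000)))
```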
13
8. a. We need to prove that if t₁(n) ∈ Ω(g₁(n)) and t₂(n) ∈ Ω(g₂(n)), then
t₁(n) + t₂(n) ∈ Ω(max{g₁(n), g₂(n)}).

Proof Since t₁(n) ∈ Ω(g₁(n)), there exist some positive constant c₁ and some nonnegative integer n₁ such that

t₁(n) ≥ c₁g₁(n) for all n ≥ n₁.

Since t₂(n) ∈ Ω(g₂(n)), there exist some positive constant c₂ and some nonnegative integer n₂ such that

t₂(n) ≥ c₂g₂(n) for all n ≥ n₂.

Let c = min{c₁, c₂}. Adding the two inequalities yields, for all n ≥ max{n₁, n₂},

t₁(n) + t₂(n) ≥ c₁g₁(n) + c₂g₂(n) ≥ c[g₁(n) + g₂(n)] ≥ c max{g₁(n), g₂(n)}.

Hence t₁(n) + t₂(n) ∈ Ω(max{g₁(n), g₂(n)}), with the constants c and n₀ required by the Ω definition being min{c₁, c₂} and max{n₁, n₂}, respectively.
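A numerical spot check of the inequality chain with arbitrarily chosen functions and constants:

```python
import math

# t1(n) = 2n^2 + 3     >= 1 * n^2           for n >= 1  (c1 = 1,   n1 = 1)
# t2(n) = n*log2(n) + n >= 0.5 * n*log2(n)  for n >= 2  (c2 = 0.5, n2 = 2)
t1, g1, c1, n1 = (lambda n: 2 * n * n + 3), (lambda n: n * n), 1.0, 1
t2, g2, c2, n2 = (lambda n: n * math.log2(n) + n), (lambda n: n * math.log2(n)), 0.5, 2
c, n0 = min(c1, c2), max(n1, n2)   # the constants claimed in the proof
print(all(t1(n) + t2(n) >= c * max(g1(n), g2(n)) for n in range(n0, 10_000)))  # True
```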
b. The proof follows immediately from the theorem proved in the text
(the O part), the assertion proved in part (a) of this exercise (the Ω part),
and the definition of Θ (see Exercise 7c).
9. a. Since the running time of the sorting part of the algorithm will still dominate the running time of the second part, it is the former that will determine the time efficiency of the entire algorithm. Formally, it follows from the equality
Θ(n log n) + O(n) = Θ(n log n),
whose validity is easy to prove in the same manner as that of the section’s
theorem.
b. Since the second part of the algorithm will use no extra space, the
space efficiency class will be determined by that of the first (sorting) part.
Therefore, it will be in Θ(n).
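A sketch of the two-part algorithm under discussion, assuming it is the presorting-based check for element uniqueness described in this section (Python's built-in sort stands in here for the Θ(n log n) sorting step):

```python
def all_distinct(a):
    """Return True if all elements of a are distinct, using presorting."""
    b = sorted(a)                     # part 1: sorting the array
    for i in range(len(b) - 1):       # part 2: one scan of adjacent pairs, O(n)
        if b[i] == b[i + 1]:
            return False
    return True

print(all_distinct([7, 2, 9, 4]))     # True
print(all_distinct([7, 2, 9, 2, 4]))  # False
```

Note that sorted() returns a new list of size n, in line with the Θ(n) extra space of part (b).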
10. The key idea here is to walk intermittently right and left, going each time exponentially farther from the initial position. A simple implementation of this idea is to do the following until the door is reached: for i = 0, 1, ..., make 2ⁱ steps to the right, return to the initial position, make 2ⁱ steps to the left, and return to the initial position again.
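A minimal simulation of this walk, assuming unit steps on a line with the walker starting at position 0 and the door at a nonzero integer position (the step counting is only for illustration):

```python
def steps_to_find_door(door):
    """Simulate the doubling walk: for i = 0, 1, ..., go 2^i steps right,
    return to the start, go 2^i steps left, return to the start again.
    Return the total number of unit steps taken when the door is reached."""
    assert door != 0, "a door at the starting position needs no walking"
    steps, i = 0, 0
    while True:
        reach = 2 ** i
        if 0 < door <= reach:        # the door lies within this sweep to the right
            return steps + door
        steps += 2 * reach           # walk right and come back
        if -reach <= door < 0:       # the door lies within this sweep to the left
            return steps + (-door)
        steps += 2 * reach           # walk left and come back
        i += 1

print(steps_to_find_door(5))   # door 5 steps to the right: 33 steps in total
print(steps_to_find_door(-3))  # door 3 steps to the left: 23 steps in total
```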