2 - Master's Algorithm
Abstract
Master's Theorem is a quick and widely used method to find an algorithm's time
complexity from its recurrence relation. The theorem can be applied to
decreasing as well as dividing functions, each of which we'll look into
in detail ahead.
Scope
This article starts with the need for the Master's Theorem and its definition.
Master's Theorem can be applied to 'dividing' and 'decreasing'
recurrence relations, both of which are discussed in detail.
Further, it covers the important cases that make up the computation
involved in the Master's Theorem.
To understand any logical theorem, we need to understand how its
result is reached. Hence, the idea behind the Master's Theorem for
dividing functions is discussed in detail, and each term in the
recurrence form and each comparison is explained.
Master's Theorem is made easy for the reader by explaining the
proof and solving Master's Theorem examples for both dividing
and decreasing functions.
Every theorem or method comes with underlying limitations,
which are important to understand so that its pitfalls can be
recognised and avoided.
Introduction
The most widely used way to analyse or compare algorithms
is to compute their time complexities. As an algorithm gets more complex,
calculating its time complexity function also becomes more complicated.
The recurrence relations of recursive algorithms fall into two types:
1. Dividing functions
2. Decreasing functions
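To make the two types concrete, here is a minimal Python sketch (the examples are mine, not from the article): factorial shrinks its input by a constant per call, while binary search shrinks it by a constant factor per call.

```python
# Decreasing function: the input shrinks by a constant per call.
# Recurrence: T(n) = T(n-1) + theta(1)
def factorial(n):
    if n <= 1:
        return 1
    return n * factorial(n - 1)  # one subproblem of size n - 1

# Dividing function: the input shrinks by a constant factor per call.
# Recurrence: T(n) = T(n/2) + theta(1)
def binary_search(arr, target, lo=0, hi=None):
    if hi is None:
        hi = len(arr)
    if lo >= hi:
        return -1  # not found
    mid = (lo + hi) // 2
    if arr[mid] == target:
        return mid
    if arr[mid] < target:
        return binary_search(arr, target, mid + 1, hi)  # right half, size n/2
    return binary_search(arr, target, lo, mid)          # left half, size n/2

print(factorial(5))                       # 120
print(binary_search([1, 3, 5, 7, 9], 7))  # 3
```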
Master's Theorem is the most useful and easy method to compute the
time complexity function of recurrence relations. Master's Theorem for
dividing functions can only be applied to recurrence relations of the
form: T(n) = aT(n/b) + f(n), where f(n) = θ(n^{k} log^{p} n)
For example:
T(n) = 2T(n/2) + n^{2}
T(n) = T(n/4) + n log n
where,
1. a >= 1
2. b > 1
3. k >= 0
4. p is any real number
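As a concrete instance of this form (my example, not the article's), merge sort makes a = 2 recursive calls on halves (b = 2) and does θ(n) merge work, so f(n) = θ(n^{1} log^{0} n), i.e., k = 1 and p = 0; with the cases listed next, this yields T(n) = θ(n log n). A minimal sketch:

```python
# Merge sort: recurrence T(n) = 2T(n/2) + theta(n),
# i.e., a = 2, b = 2, f(n) = theta(n^1 log^0 n), so k = 1 and p = 0.
def merge_sort(arr):
    if len(arr) <= 1:              # base case: T(1) = theta(1)
        return arr
    mid = len(arr) // 2
    left = merge_sort(arr[:mid])   # first of a = 2 subproblems of size n/b
    right = merge_sort(arr[mid:])  # second subproblem
    merged = []                    # merge step: the f(n) = theta(n) work
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged

print(merge_sort([5, 2, 9, 1, 7]))  # [1, 2, 5, 7, 9]
```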
Comparing log_b a against k gives three cases:
1. If log_b a > k, then T(n) = θ(n^{log_b a})
2. If log_b a = k, then:
o a) If p > -1, then T(n) = θ(n^{k} log^{(p+1)} n)
o b) If p = -1, then T(n) = θ(n^{k} loglog n)
o c) If p < -1, then T(n) = θ(n^{k})
3. If log_b a < k, then:
o a) If p >= 0, then T(n) = θ(n^{k} log^{p} n)
o b) If p < 0, then T(n) = θ(n^{k})
(In case 2, log_b a = k, so θ(n^{log_b a}) and θ(n^{k}) denote the same class.)
Hence, using the values of a, b, k and p, we can easily find the time
complexity function of any dividing recurrence relation of the above-
stated form. We'll solve some examples in the upcoming section.
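Since the case analysis above is mechanical, it is easy to express in code. The following Python sketch (the function name and string output are my own choices, not from the article) returns the θ-class for a dividing recurrence given a, b, k, and p, and checks the two examples from earlier:

```python
import math

def master_theorem_dividing(a, b, k, p):
    """Theta-class of T(n) = aT(n/b) + theta(n^k log^p n), as a string.

    Assumes a >= 1, b > 1, k >= 0. Comparing math.log(a, b) with k in
    floating point is a simplification; an exact version would compare
    a against b**k with integer or rational arithmetic.
    """
    log_ba = math.log(a, b)
    if log_ba > k:                           # Case 1: recursion dominates
        return f"theta(n^{log_ba:g})"
    if log_ba == k:                          # Case 2: both parts contribute
        if p > -1:
            return f"theta(n^{k:g} log^{p + 1:g} n)"
        if p == -1:
            return f"theta(n^{k:g} loglog n)"
        return f"theta(n^{k:g})"             # p < -1
    if p >= 0:                               # Case 3: f(n) dominates
        return f"theta(n^{k:g} log^{p:g} n)"
    return f"theta(n^{k:g})"                 # p < 0

# The two examples from above:
print(master_theorem_dividing(2, 2, 2, 0))  # T(n) = 2T(n/2) + n^2    -> theta(n^2 log^0 n), i.e. theta(n^2)
print(master_theorem_dividing(1, 4, 1, 1))  # T(n) = T(n/4) + n log n -> theta(n^1 log^1 n), i.e. theta(n log n)
```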
Master's Theorem for decreasing functions can only be applied to
recurrence relations of the form: T(n) = aT(n-b) + f(n), where f(n) = O(n^{k})
Where:
n = input size (or the size of the problem)
a = count of subproblems in the decreasing recursive function
n-b = size of each subproblem (assuming each subproblem is of the same size)
a > 0, b > 0
k >= 0
The time complexity then depends on the value of a:
1. If a < 1, then T(n) = O(n^{k})
2. If a = 1, then T(n) = O(n^{k+1})
3. If a > 1, then T(n) = O(n^{k} a^{n/b})
For example, T(n) = T(n-1) + 1 (a = 1, b = 1, k = 0) is a decreasing
recurrence relation, and case 2 gives T(n) = O(n).
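Under the assumption that the decreasing-function cases are as stated above, a matching Python sketch (again with my own naming) is:

```python
def master_theorem_decreasing(a, b, k):
    """O-class of T(n) = aT(n-b) + O(n^k), as a string.

    Assumes a > 0, b > 0, k >= 0, and the three cases stated above.
    """
    if a < 1:                 # work shrinks geometrically: f(n) dominates
        return f"O(n^{k:g})"
    if a == 1:                # about n/b levels, each doing O(n^k) work
        return f"O(n^{k + 1:g})"
    return f"O(n^{k:g} * {a:g}^(n/{b:g}))"  # a > 1: branching grows exponentially

print(master_theorem_decreasing(1, 1, 0))  # T(n) = T(n-1) + 1  -> O(n^1) = O(n)
print(master_theorem_decreasing(2, 1, 0))  # T(n) = 2T(n-1) + 1 -> O(n^0 * 2^(n/1)) = O(2^n)
```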
To see the idea behind Master's Theorem for dividing functions, consider
T(n) = aT(n/b) + f(n) and, for the moment, ignore the non-recursive part
f(n). The recursive part alone gives:
=> T(n)' = aT(n/b)
T(n/b) can again be divided into subproblems of size n/b^{2} each.
Hence,
=> T(n)' = aT(n/b) = a^{2}T(n/b^{2})
Similarly,
=> T(n)' = a^{3}T(n/b^{3}), and so on.
Generalising, after i levels of recursion:
=> T(n)' = a^{i}T(n/b^{i})
The recursion stops when the subproblem size reaches 1, i.e., when
n/b^{i} = 1. That means n = b^{i}, and hence i = log_b n. Substituting:
=> T(n)' = a^{log_b n}T(b^{i}/b^{i})
=> T(n)' = a^{log_b n}T(1)
Hence, T(n)' = θ(a^{log_b n}) = θ(n^{log_b a}), using the identity
a^{log_b n} = n^{log_b a}. This n^{log_b a} term is exactly what the
Master's Theorem compares against f(n) in its cases.
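As a quick sanity check of this derivation (my own experiment, not part of the article), one can expand T(n)' = aT(n/b) numerically with T(1) = 1 and compare the result against n^{log_b a}:

```python
import math

def t_prime(n, a, b):
    """Expand T(n)' = a * T(n/b) with T(1) = 1, for n a power of b."""
    if n <= 1:
        return 1
    return a * t_prime(n // b, a, b)

# For a = 4, b = 2, the derivation predicts T(n)' = n^(log_2 4) = n^2.
for i in range(1, 6):
    n = 2 ** i
    print(n, t_prime(n, 4, 2), round(n ** math.log(4, 2)))  # last two columns match
```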