
Master's Algorithm

Abstract
Master's Theorem is a quick method for finding an algorithm's time
complexity from its recurrence relation. The theorem can be applied to
decreasing as well as dividing functions, each of which we'll look into
in detail ahead.
Scope
 This article starts with the need for Master's Theorem and its
definition.
 Master's Theorem can be applied to 'dividing' and 'decreasing'
recurrence relations, both of which are discussed in detail.
 Further, it covers the cases that govern the computation
involved in Master's Theorem.
 To understand any theorem, we need to understand how its
result is reached. Hence, the idea behind Master's Theorem for
dividing functions is discussed in detail: each term in the
recurrence form and each comparison are explained.
 Master's Theorem is made easy for the reader by explaining the
proof and solving examples for both dividing and decreasing
functions.
 Every theorem or method comes with underlying limitations,
which are important to understand so that they can be avoided.

Introduction
The most widely used way to analyse or compare algorithms is to compute
their time complexities. As an algorithm gets more complex, calculating
its time complexity function also gets harder.

Recursive functions call themselves in their body, and calculating their
time complexity by other commonly used methods can get complicated.
Master's method is the most useful and easy way to compute the time
complexity function of recurrence relations.
We can apply Master's Theorem for:

1. Dividing functions
2. Decreasing Functions

Master's Method for Dividing Functions


Highlights:

 Used to directly calculate the time complexity function of 'dividing' recurrence relations of the form:
o T(n) = aT(n/b) + f(n), where f(n) = θ(n^k log^p n)
 Compare log_b a and k to decide the final time complexity function.

Master's Theorem is the most useful and easy method to compute the
time complexity function of recurrence relations. Master's Theorem for
dividing functions can only be applied to recurrence relations of the
form T(n) = aT(n/b) + f(n), where f(n) = θ(n^k log^p n),
for example:

 T(n) = 2T(n/2) + n^2
 T(n) = T(n/4) + n log n

where,

n = input size (or the size of the problem)
a = count of sub-problems in the dividing recursive function
n/b = size of each sub-problem (assuming each sub-problem has the same
size)

The constants a, b, and k satisfy the following conditions:

1. a >= 1
2. b > 1
3. k >= 0

T(n) = aT(n/b) + θ(n^k log^p n)


Master's Theorem states that:

 Case 1) If log_b a > k, then:
o T(n) = θ(n^{log_b a})
 Case 2) If log_b a = k, then:
o a) If p > -1, then T(n) = θ(n^k log^{p+1} n)
o b) If p = -1, then T(n) = θ(n^k loglog n)
o c) If p < -1, then T(n) = θ(n^k)
 Case 3) If log_b a < k, then:
o a) If p >= 0, then T(n) = θ(n^k log^p n)
o b) If p < 0, then T(n) = θ(n^k)

The same can also be written as:

 Case 1- If a > b^k, then:
o T(n) = θ(n^{log_b a})
 Case 2- If a = b^k, then:
o a) If p > -1, then T(n) = θ(n^{log_b a} log^{p+1} n)
o b) If p = -1, then T(n) = θ(n^{log_b a} loglog n)
o c) If p < -1, then T(n) = θ(n^{log_b a})
 Case 3- If a < b^k, then:
o a) If p >= 0, then T(n) = θ(n^k log^p n)
o b) If p < 0, then T(n) = θ(n^k)

Hence, using the values of a, b, k and p, we can easily find the time
complexity function of any dividing recurrence relation of the above-
stated form. We'll solve some examples in the upcoming section.
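The case dispatch above can be captured in a short helper. The following is an illustrative sketch of our own (the function name and the returned strings are not a standard library API):

```python
import math

def master_dividing(a, b, k, p):
    """Theta class of T(n) = a*T(n/b) + Theta(n^k * log^p n), as a string."""
    assert a >= 1 and b > 1 and k >= 0, "Master's Theorem preconditions"
    log_b_a = math.log(a, b)
    if math.isclose(log_b_a, k):          # Case 2: log_b(a) = k
        if p > -1:
            return f"Theta(n^{k:g} log^{p + 1:g} n)"
        if p == -1:
            return f"Theta(n^{k:g} loglog n)"
        return f"Theta(n^{k:g})"          # p < -1
    if log_b_a > k:                       # Case 1: n^(log_b a) dominates
        return f"Theta(n^{log_b_a:g})"
    if p > 0:                             # Case 3: f(n) dominates
        return f"Theta(n^{k:g} log^{p:g} n)"
    return f"Theta(n^{k:g})"              # p <= 0, since log^0 n = 1
```

For example, T(n) = 2T(n/2) + n (merge sort: a=2, b=2, k=1, p=0) falls in Case 2 and gives Theta(n log n), while T(n) = 2T(n/2) + n^2 falls in Case 3 and gives Theta(n^2).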

Master's Theorem for Decreasing Functions


Highlights:

 Used to directly calculate the time complexity function of 'decreasing' recurrence relations of the form:
o T(n) = aT(n-b) + f(n), where f(n) = θ(n^k)
 The value of 'a' decides the time complexity function for the 'decreasing' recurrence relation.

For decreasing functions of the form T(n) = aT(n-b) + f(n),
where f(n) = θ(n^k),
for example:
for example:

 T(n) = T(n-2) + 1
 T(n) = 2T(n-1) + n^2

Where:
n = input size (or the size of the problem)
a = count of sub-problems in the decreasing recursive function
n-b = size of each sub-problem (assuming each sub-problem has the same
size)

Here a, b, and k are constants that satisfy the following conditions:

 a>0, b>0
 k>=0

Case 1) If a < 1, then T(n) = θ(n^k)

Case 2) If a = 1, then T(n) = θ(n^{k+1})

Case 3) If a > 1, then T(n) = θ(a^{n/b} * f(n))

Hence, given a decreasing function of the above-mentioned form, we
can easily compare it to that form to get the values of a, b, and k.
With those, we can find the time complexity function by looking at the
three possible cases mentioned above. We'll solve some examples in
the upcoming section to understand better.
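The decreasing cases can be sketched the same way (again an illustrative helper of our own, not a library function):

```python
def master_decreasing(a, b, k):
    """Theta class of T(n) = a*T(n-b) + Theta(n^k), as a string."""
    assert a > 0 and b > 0 and k >= 0, "Master's Theorem preconditions"
    if a < 1:                # Case 1: the work shrinks at every level
        return f"Theta(n^{k:g})"
    if a == 1:               # Case 2: about n/b levels of Theta(n^k) work
        return f"Theta(n^{k + 1:g})"
    # Case 3: the sub-problem count grows like a^(n/b), i.e. exponentially
    return f"Theta(n^{k:g} * {a:g}^(n/{b:g}))"
```

For example, T(n) = T(n-2) + 1 (a=1, b=2, k=0) gives Theta(n), and T(n) = 2T(n-1) + n^2 (a=2, b=1, k=2) gives Theta(n^2 * 2^n).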

Idea Behind Master's Theorem for Dividing Functions


Highlights:
 Compute n^{log_b a} and compare it with f(n) to decide the final
time complexity of the recurrence relation T(n)

The idea behind Master's Theorem is based on computing n^{log_b a}
and comparing it with f(n). The time complexity function comes out to
be whichever of the two dominates.

For example:

 In Case 1, n^{log_b a} dominates f(n), so the time complexity
function T(n) comes out to be θ(n^{log_b a}).
 In Case 2, when log_b a = k, neither term dominates the other, and
the result picks up one extra logarithmic factor, e.g.
θ(n^{log_b a} log n) when p = 0.
 In Case 3, n^{log_b a} is dominated by f(n). Hence, the f(n) term
takes more time, and the time complexity function, in this case, is
θ(f(n)), i.e., θ(n^k log^p n).

Now, Why do We Compare n^{log_b a} with f(n) and not Some Other
Computation?

Given T(n) = aT(n/b) + f(n), we need to calculate the time complexity
of T(n). If we calculate the time taken by each of the two terms
individually and compare which one dominates, we can easily define the
time complexity function of T(n) to be equal to the one that dominates.
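As a quick sanity check of this dominance argument, we can evaluate a concrete recurrence directly and compare it with the class the theorem predicts. Here we use T(n) = 2T(n/2) + n with T(1) = 1 (our own example); since log_2 2 = 1 = k, the theorem predicts θ(n log n):

```python
import math

# Evaluate T(n) = 2*T(n/2) + n directly by recursion, with T(1) = 1.
def T(n):
    if n <= 1:
        return 1
    return 2 * T(n // 2) + n

# For n = 2^m, unrolling gives T(n) = n*log2(n) + n exactly,
# so the ratio T(n) / (n * log2(n)) approaches 1 as n grows.
for m in (8, 12, 16):
    n = 2 ** m
    print(n, round(T(n) / (n * math.log2(n)), 4))
    # ratios: 1.125, 1.0833, 1.0625 -> approaching 1
```

The measured values track n log2 n up to a vanishing lower-order term, exactly as the theorem predicts for Case 2.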

To achieve this, let's first understand the first term, i.e., aT(n/b),
and then calculate the time taken by it.

Understanding the term aT(n/b):
aT(n/b) means that, for our problem of size n, we divide the whole
problem into 'a' sub-problems of size 'n/b' each. Suppose T(n)' is the
time taken by the first term.

Hence,
=> T(n)' = aT(n/b)
T(n/b) can again be divided into sub-problems of size n/b^2 each.

Hence,
=> T(n)' = aT(n/b) = a^2 T(n/b^2)
Similarly,
=> T(n)' = a^3 T(n/b^3), and so on, up to
=> T(n)' = a^i T(n/b^i)

Let's assume n = b^m (when we divide a problem in half each time, we
can say that n = 2^m; here we divide the problem by b each time,
hence n = b^m).
=> n = b^m
=> log_b n = m

At the ith level, the size of the sub-problem reduces to 1, i.e., at
the ith level,
=> n/b^i = 1
=> n = b^i
=> log_b n = i = m
Therefore, i = m at the last level, where the size of the sub-problem
reduces to 1.

Using this deduction, we can rewrite T(n)' as:

=> T(n)' = a^i T(n/b^i)
=> T(n)' = a^{log_b n} T(b^i/b^i)
=> T(n)' = a^{log_b n} T(1)
Hence, T(n)' = θ(a^{log_b n}) = θ(n^{log_b a})

T(n)' was assumed to be the time complexity of the first term. Hence,
we have shown that the first term takes θ(n^{log_b a}) time. That is
why we compare n^{log_b a} (i.e., the first term) with f(n) (i.e., the
second term): whichever dominates decides the final time complexity
function for the recurrence relation.
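The last step relies on the identity a^{log_b n} = n^{log_b a}: taking log_b of both sides, each equals (log_b a)(log_b n). A quick numeric spot-check, with sample values chosen arbitrarily:

```python
import math

# Spot-check the identity a^(log_b n) = n^(log_b a) on a few sample values.
for a, b, n in [(2, 2, 1024), (3, 2, 4096), (7, 3, 729)]:
    lhs = a ** math.log(n, b)   # a^(log_b n)
    rhs = n ** math.log(a, b)   # n^(log_b a)
    assert math.isclose(lhs, rhs), (a, b, n)
print("identity holds on all samples")
```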

Proof of Master's Algorithm

T(n) = aT(n/b) + O(n^d)

This form of recurrence relation can be viewed as a tree, where every
level divides each problem into 'a' sub-problems of size n/b.
