In this article, we will take a look at the famous Master’s Theorem. It is very useful in the design and analysis of algorithms that follow the divide and conquer technique. We will cover the theorem, how it works, and some examples related to it.
What is Master’s Theorem Used For?
Master’s Method provides solutions in asymptotic terms (time complexity) for recurrence relations. In simpler terms, it is an efficient and fast way to obtain a tight bound on the running time without having to expand the relation. However, it applies only to recurrence relations that arise from the divide and conquer technique. A recurrence relation is basically an equation where the next term of a function depends on its previous terms. We will look at it with some examples.
The theorem is applicable to relations of the form:
T(n) = a T(n/b) + f(n)
where a >= 1 and b > 1.
Let us look at each term here:
n -> the size of the problem.
a -> the number of subproblems in the recursive relation.
b -> the factor by which the size of each subproblem is reduced in each call.
n/b -> the size of each subproblem (all subproblems are assumed to have the same size).
f(n) -> Θ(n^k log^p n), where k >= 0 and p is any real number; it is the cost of dividing the problem and merging the subproblem solutions.
T(n) -> the total time taken to evaluate the recurrence.
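Before looking at the cases, it can help to see what such a recurrence actually computes. Below is a minimal sketch (the helper `T` and its base cost T(1) = 1 are our own illustrative assumptions, not part of the theorem) that evaluates T(n) = a T(n/b) + f(n) numerically for any choice of a, b, and f:

```python
def T(n, a, b, f):
    """Numerically evaluate the recurrence T(n) = a*T(n/b) + f(n)."""
    if n <= 1:
        return 1  # assumed constant base-case cost T(1) = 1
    # a recursive subproblems, each of size n/b, plus f(n) divide/merge work
    return a * T(n // b, a, b, f) + f(n)

# Example: a = 2 subproblems of half size (b = 2), with f(n) = n merge cost.
print(T(8, 2, 2, lambda n: n))  # T(8) = 2*T(4) + 8 = 32
```

Master’s Theorem lets us read off the asymptotic growth of this quantity directly from a, b, and f(n), without doing any such expansion by hand.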
Master’s Theorem Cases
Now, Master’s Method determines the asymptotic tight bound (Θ, or Theta) on these recurrences by considering three cases:
If a > b^k, then T(n) = Θ(n^(log_b a)). [Note: log_b a = log a / log b.]
Let us understand this case with an example:
Suppose we are given a recurrence relation, T(n) = 16 T(n/4) + n.
For this relation, a = 16, b = 4, f(n) = Θ(n^k log^p n) = n, where k = 1 and p = 0.
As we can see, a = 16 and b^k = 4^1 = 4, so a > b^k, which falls under Case 1. So the solution of this recurrence relation is T(n) = Θ(n^(log_b a)) = Θ(n^(log_4 16)).
Now log_4 16 = log 16 / log 4 = log 4^2 / log 4 = 2 log 4 / log 4 = 2.
So T(n) = Θ(n^2) is the tight bound running time for this relation.
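We can sanity-check this bound numerically. Evaluating the recurrence directly (with an assumed base cost T(1) = 1), the ratio T(n)/n^2 should settle near a constant if the Θ(n^2) bound is right:

```python
def T(n):
    # T(n) = 16*T(n/4) + n, with an assumed base cost T(1) = 1.
    if n <= 1:
        return 1
    return 16 * T(n // 4) + n

# If T(n) = Θ(n^2), the ratio T(n)/n^2 approaches a constant (here 4/3).
for n in (4**4, 4**6, 4**8):
    print(n, T(n) / n**2)
```

The printed ratios converge toward 4/3, which is exactly what a tight Θ(n^2) bound predicts.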
If a = b^k, then there are again three possibilities to determine T(n):
1. If p > -1
In this case, T(n) = Θ(n^(log_b a) log^(p+1) n).
Let us also look at this case with a recurrence relation:
T(n) = 2 T(n/2) + n.
For this relation, a = 2, b = 2, f(n) = Θ(n^k log^p n) = n, where k = 1 and p = 0.
Here, b^k = 2, so a = b^k. This falls under Case 2, and since p > -1, the solution for this relation will be:
T(n) = Θ(n^(log_b a) log^(p+1) n) = Θ(n^(log_2 2) log^(0+1) n) = Θ(n log n), which is the tight bound for this relation.
Note: The above equation is the recurrence relation of the Merge Sort algorithm, which has a time complexity of Θ(n log n) in all cases. So with Master’s Theorem we can easily determine the running time of such divide and conquer algorithms.
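For concreteness, here is a standard Merge Sort in Python whose structure matches the recurrence above: two recursive calls (a = 2) on halves of the input (b = 2), plus a linear-time merge (f(n) = Θ(n)):

```python
def merge_sort(arr):
    # Base case: a list of size <= 1 is already sorted.
    if len(arr) <= 1:
        return arr
    # a = 2 recursive calls, each on a subproblem of size n/b with b = 2.
    mid = len(arr) // 2
    left = merge_sort(arr[:mid])
    right = merge_sort(arr[mid:])
    # Merging the two sorted halves is the f(n) = Θ(n) work per call.
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    return merged + left[i:] + right[j:]

print(merge_sort([5, 3, 1, 4, 2]))  # [1, 2, 3, 4, 5]
```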
2. If p = -1
For this case, T(n) = Θ(n^(log_b a) log log n).
Let us evaluate this case with an example too.
Consider the following recurrence relation:
T(n) = 2 T(n/2) + n/log n.
In this relation, a = 2, b = 2, f(n) = Θ(n^k log^p n) = n/log n, where k = 1 and p = -1.
The value of b^k = 2, so a = b^k. This falls under Case 2, and since p = -1, the solution is:
T(n) = Θ(n^(log_2 2) log log n) = Θ(n log log n), which is the tight bound for this recurrence.
Note: In this sub-case, the difference between n^(log_b a) and f(n) is not polynomial, so the standard form of the theorem does not apply; this sub-case comes from the extended version of Master’s Theorem.
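To see where the log log factor comes from, we can unroll this recurrence numerically (taking log base 2 for concreteness, and an assumed base cost T(1) = 1). Each of the log2 n levels contributes n divided by the log of the remaining problem size, and those contributions sum to a harmonic series, which grows like the log of the number of levels:

```python
import math

def T(n):
    # T(n) = 2*T(n/2) + n/log2(n), with an assumed base cost T(1) = 1.
    if n <= 1:
        return 1
    return 2 * T(n // 2) + n / math.log2(n)

# Unrolling for n = 2^m gives T(n) = n * (1 + H_m), where
# H_m = 1/1 + 1/2 + ... + 1/m is the m-th harmonic number.
# H_m grows like ln m = Θ(log log n), matching the Θ(n log log n) bound.
m = 16
n = 2**m
harmonic = sum(1 / j for j in range(1, m + 1))
print(T(n) / n, 1 + harmonic)  # the two values agree
```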
3. If p < -1
In this case, T(n) = Θ(n^(log_b a)).
Consider this recurrence: T(n) = 8 T(n/2) + n^3 / log^2 n.
Here a = 8, b = 2, f(n) = Θ(n^k log^p n) = n^3 log^(-2) n, where k = 3 and p = -2.
So, b^k = 2^3 = 8 and a = b^k, which falls under Case 2.
Therefore, T(n) = Θ(n^(log_2 8)) [log_2 8 = log 8 / log 2 = 3 log 2 / log 2 = 3].
So, T(n) = Θ(n^3).
This type of condition (p < -1) is not generally encountered.
For the last case, if a < b^k, the asymptotic bound again splits into two possibilities:
1. If p >= 0
When a < b^k and p is greater than or equal to 0, the solution is T(n) = Θ(n^k log^p n).
Let us consider an example to get a clear idea:
T(n) = 2 T(n/4) + n^0.62.
So for this recurrence relation, a = 2, b = 4, f(n) = Θ(n^k log^p n) = n^0.62, where k = 0.62 and p = 0.
Hence, b^k = 4^0.62 ≈ 2.36 and a < b^k. This makes the recurrence fall under Case 3, with p >= 0.
Thus, T(n) = Θ(n^k log^p n) = Θ(n^0.62 log^0 n) = Θ(n^0.62), which is the tight bound for this type of recurrence.
2. If p < 0
The solution is given by T(n) = Θ(n^k).
Consider this relation: T(n) = 2 T(n/4) + n^0.51 / log n. Let’s outline the solution.
Here, a = 2, b = 4, f(n) = Θ(n^k log^p n) = n^0.51 log^(-1) n, where k = 0.51 and p = -1. The condition a < b^k is satisfied, since b^k = 4^0.51 ≈ 2.03, so this falls under Case 3 with p = -1 < 0.
So the solution is T(n) = Θ(n^0.51).
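All of the cases above can be collected into one small helper that, given a, b, k, and p, reports the Θ-bound. The function name and its output format are our own illustrative choices; the case logic follows the extended Master’s Theorem exactly as described in this article:

```python
import math

def master_theorem(a, b, k, p):
    """Θ-bound for T(n) = a*T(n/b) + Θ(n^k * log^p n)."""
    if a < 1 or b <= 1:
        raise ValueError("Master's Theorem requires a >= 1 and b > 1")
    if a > b**k:                      # Case 1: recursion dominates
        return f"Θ(n^{math.log(a, b):g})"
    if math.isclose(a, b**k):         # Case 2: balanced
        if p > -1:
            return f"Θ(n^{k:g} log^{p + 1:g} n)"
        if p == -1:
            return f"Θ(n^{k:g} log log n)"
        return f"Θ(n^{k:g})"          # p < -1
    if p > 0:                         # Case 3: f(n) dominates
        return f"Θ(n^{k:g} log^{p:g} n)"
    return f"Θ(n^{k:g})"              # p <= 0

# The examples worked through above:
print(master_theorem(16, 4, 1, 0))    # Θ(n^2)
print(master_theorem(2, 2, 1, 0))     # Θ(n^1 log^1 n), i.e. Θ(n log n)
print(master_theorem(2, 2, 1, -1))    # Θ(n^1 log log n)
print(master_theorem(8, 2, 3, -2))    # Θ(n^3)
print(master_theorem(2, 4, 0.62, 0))  # Θ(n^0.62)
print(master_theorem(2, 4, 0.51, -1)) # Θ(n^0.51)
```

Reproducing every example in a few lines like this is a good way to check your hand-worked answers while practicing.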
So we have discussed all the cases of the Master Method for divide and conquer recurrences. Now let us take a quick look at the limitations of the Master Method before ending this article.
Limitations of Master’s Theorem
- Master’s Theorem cannot be used if f(n), the combination/merging cost, is not positive. For example, T(n) = 16 T(n/2) − n cannot be evaluated using this method because f(n) = −n.
- The value of a should be constant and at least 1. a denotes the number of subproblems, which must be fixed and cannot be a function of n. E.g. T(n) = 2^n T(n/2) + n cannot be solved using the Master Method, because a = 2^n is not constant.
- There should be at least one subproblem. E.g. T(n) = 0.5 T(n/2) + 2n cannot be evaluated because a = 0.5 < 1.
- The value of b must be constant and greater than 1. Otherwise, the subproblems would not shrink at each step, and we would never reach the base case for the recursion to end.
So that’s it for the article. You can practice the examples discussed above, and try some more examples for a better understanding. Let us know your doubts or any suggestions in the comment section below.
The post Master’s Theorem Explained with Examples appeared first on The Crazy Programmer.