Code360 powered by Coding Ninjas X Naukri.com
Table of contents
1. Introduction
2. What is the Master Theorem (Analysis of Algorithms)?
3. Examples of Master Theorem
3.1. Example 1: Binary Search
3.2. Example 2: Merge Sort
4. Master Theorem for Dividing Functions
4.1. Example
5. Master Theorem for Decreasing Functions
5.1. Example
6. Master Theorem Limitations
7. Some important points to keep in mind regarding the Master Theorem
8. Frequently Asked Questions
8.1. What is master theorem formula?
8.2. What is master's theorem in DAA?
8.3. What are the 3 cases of the master theorem?
8.4. What is an example of the master's theorem?
9. Conclusion
Last Updated: Jun 14, 2024
Difficulty: Medium

Master Theorem (With Examples)

Author: Nikunj Goel

Introduction

The Master Theorem is a fundamental tool in computer science for analysing the time complexity of recursive algorithms, particularly those that divide a problem into smaller subproblems of a similar nature. By providing a straightforward way to solve recurrence relations, it helps determine the asymptotic behaviour of divide-and-conquer algorithms. This theorem is crucial for understanding the efficiency and scalability of algorithms, offering insight into their performance on large inputs.


Finding the time complexity of an algorithm is a significant part of writing good code, but algorithms defined by recurrence relations are tricky to analyse. For divide-and-conquer algorithms of this kind, the Master Theorem (analysis of algorithms) lets us read off the time complexity directly.

In this article, we will discuss what the master theorem is. We will also see how it is used with recurrence relations to find the time complexity. 

What is the Master Theorem (Analysis of Algorithms)?

The Master Theorem gives us conditions under which the solution to a recurrence relation can be expressed in a closed asymptotic form, along with a formula for computing that form.

It is used to analyse the time complexity of divide-and-conquer algorithms: it provides a formula that connects an algorithm's recurrence relation to its overall time complexity. The Master Theorem is especially helpful for recursive algorithms in which a problem is broken into smaller subproblems of roughly equal size.

The Master Theorem applies to recurrence relations of the following common form:

T(n) = a.T(n/b) + f(n), a ≥ 1 and b > 1


where n = size of the problem,

a = number of sub-problems,

b = factor by which the problem size shrinks (each sub-problem has size n/b),

f(n) = work done outside the recursion, such as dividing into sub-problems and combining their solutions.

f(n) is of the form f(n) = Ө(n^k.log^p n).

So, on substituting f(n),

T(n) = a.T(n/b) + Ө(n^k.log^p n), k ≥ 0 and p ∈ R

 

For example, for binary search, the run-time recurrence is T(n) = T(n/2) + O(1). The general form gives rise to different conditions depending on the values of the parameters, and the time complexity is calculated based on those conditions.

Let us see what those conditions are and how to calculate the time complexity.


Examples of Master Theorem

Example 1: Binary Search 
Binary search is a classic algorithm that repeatedly halves the search space to locate a target element in a sorted array.

Recurrence relation:

T(n) = T(n/2) + O(1)

Here a = 1, b = 2, and the work done outside the recursive call is O(1).
This falls under the second case of the master theorem, so the time complexity is Θ(log n). We will learn about these cases in detail below.
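The recurrence can be seen directly in code. Here is a minimal recursive binary search sketch (illustrative, not code from the article): each call does O(1) comparison work and then recurses on one half of the array.

```python
def binary_search(arr, target, lo=0, hi=None):
    """Recursive binary search on a sorted list.

    Each call does O(1) work and recurses on half the range,
    so T(n) = T(n/2) + O(1), which is Θ(log n).
    Returns the index of target, or -1 if absent."""
    if hi is None:
        hi = len(arr) - 1
    if lo > hi:
        return -1
    mid = (lo + hi) // 2
    if arr[mid] == target:
        return mid
    if arr[mid] < target:
        return binary_search(arr, target, mid + 1, hi)
    return binary_search(arr, target, lo, mid - 1)
```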

Example 2: Merge Sort

Merge Sort is a sorting algorithm that splits an array in half, sorts each half separately, and then merges the two sorted halves back together.

Recurrence relation:

T(n) = 2T(n/2) + O(n)

Here a = 2, b = 2, and the merge step takes O(n) time. This falls under case 2 of the master theorem, so the time complexity is T(n) = Θ(n log n).
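The recurrence maps directly onto the code structure: two recursive calls on halves (a = 2, b = 2) plus a linear-time merge (f(n) = O(n)). A minimal top-down sketch (illustrative only):

```python
def merge_sort(arr):
    """Sort a list: split in half, sort each half recursively,
    then merge in O(n), giving T(n) = 2T(n/2) + O(n) = Θ(n log n)."""
    if len(arr) <= 1:
        return arr
    mid = len(arr) // 2
    left, right = merge_sort(arr[:mid]), merge_sort(arr[mid:])
    # Merge the two sorted halves in linear time.
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged
```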

Master Theorem for Dividing Functions

We can use the master theorem formula directly when the recurrence divides the problem size by a constant factor (each subproblem has size n/b). The time complexity of the function is given by T(n).

T(n) = a.T(n/b) + f(n)

 

  • ‘n’ is the size of the input.
     
  • ‘a’ is the number of subproblems (a ≥ 1).
     
  • ‘b’ is the factor by which the problem size is divided; each subproblem has size n/b (b > 1).
     
  • f(n) is the work done outside the recursive call. For dividing functions, f(n) = Ө(n^k.log^p n), with k ≥ 0.
     

The master theorem formula for a dividing function depends on the values of a, b, k and p; compare log_b a with k.

Case 1: log_b a > k, then T(n) = θ (n^(log_b a)).

Case 2: log_b a = k, then the value of T(n) depends upon the value of ‘p’.

  • if p > -1, then T(n) = θ (n^k.log^(p+1) n).
  • if p = -1, then T(n) = θ (n^k.log log n).
  • if p < -1, then T(n) = θ (n^k).
     

Case 3: log_b a < k, then the value of T(n) depends upon the value of ‘p’.  

  • if p ≥ 0, then T(n) = θ (n^k.log^p n).
  • if p < 0, then T(n) = θ (n^k).
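These cases can be encoded mechanically. The helper below is a sketch of my own (the function name and its string output are not from the article); it classifies a recurrence T(n) = a.T(n/b) + Ө(n^k.log^p n) and returns the resulting complexity:

```python
import math

def master_dividing(a, b, k, p):
    """Classify T(n) = a*T(n/b) + Θ(n^k * log^p n) using the
    dividing-function master theorem; return Θ(...) as a string."""
    c = math.log(a, b)  # log_b a
    if c > k:                       # Case 1
        return f"Θ(n^{c:g})"
    if abs(c - k) < 1e-12:          # Case 2: log_b a == k
        if p > -1:
            return f"Θ(n^{k} * log^{p + 1} n)"
        if p == -1:
            return f"Θ(n^{k} * log log n)"
        return f"Θ(n^{k})"
    if p >= 0:                      # Case 3: log_b a < k
        return f"Θ(n^{k} * log^{p} n)"
    return f"Θ(n^{k})"
```

For instance, master_dividing(1, 2, 0, 0) gives Θ(n^0 * log^1 n) = Θ(log n) for binary search, and master_dividing(2, 2, 1, 0) gives Θ(n^1 * log^1 n) = Θ(n log n) for merge sort.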

Example

Now that we know how to use the master theorem for dividing functions, let us see some examples of using it. 

Binary Search: The time complexity of binary search (using recursion (T(n) = T(n/2) + O(1))) is given by the calculation done below.

T(n) = 1.T(n/2) + O(1)
∴ a = 1, b = 2 and Ө(n^k.log^p n) = O(1)
O(1) = Ө(n^0.log^0 n)
∴ k = 0 and p = 0
log_b a = log_2 1 = 0 
∴ log_b a = k
p = 0 > -1


As the value of p is greater than -1, by case 2,
Time complexity = Ө(n^k.log^(p+1) n)
= Ө(n^0.log^(0+1) n)
= Ө(1.log^1 n)
= Ө(log n)

Merge sort: The time complexity of the algorithm for merge sort using recursion (T(n) = 2T(n/2) + O(n)) is also given by the calculation done below.

T(n) = 2.T(n/2) + O(n)
∴ a = 2, b = 2 and Ө(n^k.log^p n) = O(n)
O(n) = Ө(n^1.log^0 n)
∴ k = 1 and p = 0
log_b a = log_2 2 = 1 
∴ log_b a = k
p = 0 > -1


As the value of p is greater than -1, by case 2,
Time complexity = Ө(n^k.log^(p+1) n)
= Ө(n^1.log^(0+1) n)
= Ө(n.log^1 n)
= Ө(n.log n)
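Both derivations can be sanity-checked by unrolling the recurrences numerically and dividing by the predicted growth rate; the ratios should level off to a constant as n grows. A rough sketch of my own:

```python
import math
from functools import lru_cache

@lru_cache(maxsize=None)
def t_binary(n):
    """Unroll T(n) = T(n/2) + 1; predicted Θ(log n)."""
    return 1 if n <= 1 else t_binary(n // 2) + 1

@lru_cache(maxsize=None)
def t_merge(n):
    """Unroll T(n) = 2T(n/2) + n; predicted Θ(n log n)."""
    return 1 if n <= 1 else 2 * t_merge(n // 2) + n

# Ratios against the predicted bounds stay roughly constant:
for n in (2**10, 2**16, 2**20):
    print(n, t_binary(n) / math.log2(n), t_merge(n) / (n * math.log2(n)))
```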

Master Theorem for Decreasing Functions

We can also use the master theorem formula directly when the recurrence decreases the problem size by a constant amount (each subproblem has size n - b). The time complexity of the function is given by T(n).

T(n) = a.T(n - b) + f(n)
  • ‘n’ is the size of the input.
     
  • ‘a’ is the number of subproblems (a > 0).
     
  • ‘n - b’ is the size of each subproblem (b > 0).
     
  • f(n) is the work done outside the recursive call. For decreasing functions, f(n) = θ (n^k), with k ≥ 0.
     

Master theorem formula for a decreasing function depends on the value of a and f(n).

Case 1: a < 1, then T(n) = θ (n^k).

Case 2: a = 1, then T(n) = θ (n^(k+1)). 

Case 3: a > 1, then T(n) = θ (a^(n/b) * f(n)).
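These decreasing-function cases are simple enough to encode directly. The helper below is an illustrative sketch of my own (the name and output format are not from the article), classifying T(n) = a.T(n - b) + θ(n^k):

```python
def master_decreasing(a, b, k):
    """Classify T(n) = a*T(n - b) + Θ(n^k) using the
    decreasing-function master theorem; return Θ(...) as a string."""
    if a < 1:          # Case 1: work shrinks geometrically
        return f"Θ(n^{k})"
    if a == 1:         # Case 2: one extra factor of n
        return f"Θ(n^{k + 1})"
    # Case 3: a > 1, exponential number of subproblems
    return f"Θ({a}^(n/{b}) * n^{k})"
```

For example, master_decreasing(2, 2, 0) returns Θ(2^(n/2) * n^0) = Θ(2^(n/2)), and master_decreasing(1, 1, 1) returns Θ(n^2).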

Example

Now that we know how to use the master theorem for decreasing functions, let us see some examples of using it. 

The Time Complexity of a function given by the recurrence relation T(n)= 2 * T(n-2) + 1.

T(n)= 2 * T(n-2) + 1
∴ a=2, b=2, k=0 and f(n)=1


As the value of a is greater than 1, by case 3,

Time Complexity = θ (a^(n/b) * f(n))

= θ (2^(n/2) * 1)

= θ (2^(n/2))

The Time Complexity of a function given by the recurrence relation T(n)= ⅓ * T(n−1) + n.

T(n)= ⅓ * T(n−1) + n.
 ∴ a=1/3, b=1, k=1 and f(n)=n


As the value of a is less than 1, by case 1,

Time Complexity = θ (nk).

= θ (n1).

= θ (n)
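The exponential bound in the first example can be checked numerically by unrolling T(n) = 2T(n-2) + 1 and dividing by 2^(n/2) (a rough sketch of my own):

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def t_dec(n):
    """Unroll T(n) = 2T(n - 2) + 1; predicted Θ(2^(n/2))."""
    return 1 if n <= 1 else 2 * t_dec(n - 2) + 1

for n in (10, 20, 30):
    print(n, t_dec(n) / 2 ** (n / 2))  # ratio approaches 2 for even n
```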

Master Theorem Limitations

We have already seen various examples of analysing time complexity using the master theorem, but can we use it for every recurrence of the given form? No.

Here are the conditions under which we cannot use the master theorem:

  • If T(n) is not monotone, for example T(n) = sin n. The master theorem requires T(n) to be a monotonically increasing function. 
     
  • If f(n) is not a polynomial. For example, if T(n) = 2T(n/2) + 2^n, then f(n) = 2^n is exponential, so we cannot use the master theorem here.
     
  • If a is not a constant. For example, if T(n) = 2^n.T(n/3) + n.log n, then a = 2^n is not a constant, so T(n) cannot be analysed using the master theorem.


Some important points to keep in mind regarding the Master Theorem

When working with the Master Theorem to solve recurrence relations, keep these key points in mind:

  1. Form of Recurrence: Ensure the recurrence fits the standard form T(n) = a.T(n/b) + f(n), where a ≥ 1, b > 1, and f(n) is an asymptotically positive function.
  2. Case Identification: Identify which of the three cases of the Master Theorem applies by comparing f(n) with n^(log_b a).
  3. Case 1: If f(n) = O(n^(log_b a − ϵ)) for some ϵ > 0, then T(n) = Θ(n^(log_b a)).
  4. Case 2: If f(n) = Θ(n^(log_b a)), then T(n) = Θ(n^(log_b a).log n).
  5. Case 3: If f(n) = Ω(n^(log_b a + ϵ)) for some ϵ > 0, and if a.f(n/b) ≤ c.f(n) for some constant c < 1 and sufficiently large n, then T(n) = Θ(f(n)).
  6. Regularity Condition: For Case 3, ensure the additional regularity condition a.f(n/b) ≤ c.f(n) holds; it is needed to prove the upper bound.
  7. Exceptions: Recognize that the Master Theorem does not apply to all recurrence relations, especially if the recurrence does not match the standard form or if f(n) does not meet any of the case conditions.
  8. Supplementary Methods: Be prepared to use other methods, such as the substitution method or the recursion-tree method, for recurrences that fall outside the scope of the Master Theorem.

Frequently Asked Questions

What is master theorem formula?

The Master Theorem formula is T(n) = a.T(n/b) + f(n), where a ≥ 1 and b > 1 are constants and f(n) is the work done outside the recursive call.

What is master's theorem in DAA?

Master’s theorem in DAA is a technique that gives us the formula to directly find the time complexity of an algorithm containing recurrence relations.

What are the 3 cases of the master theorem?

The three cases of the master theorem depend upon the function f(n), the work done outside the recursive call. Case 1 applies when the work done in the recursive subproblems dominates the time complexity. Case 2 applies when the work done in the subproblems and f(n) are of the same order. Case 3 applies when f(n) dominates the time complexity.

What is an example of the master's theorem?

We can consider a recursive function with two subproblems, both half the size of the original, and no extra work done outside the recursive call: T(n) = 2T(n/2) + O(1). Since log_2 2 = 1 > k = 0, the time complexity of the function by the master theorem is Ө(n).

Conclusion

In this article, we learnt about the master theorem (analysis of algorithms) and how we use it to directly find the time complexity of an algorithm containing recurrence relations. 

We also discussed a shortcut method that helps find the answer in an instant, and practised a few problems. Apart from this, there are other concepts that you can learn from the free video lectures on Coding Ninjas.


You can also try DSA problems available on Coding Ninjas Studio. The questions there are frequently asked in interview rounds and will help you master efficient coding techniques. They also come with a bonus of interview experiences of scholars in big product-based companies. 

Happy learning!
