## Examples of Master Theorem

### Example 1: Binary Search

Binary search is a classic algorithm that repeatedly halves the search space to locate a target element in a sorted array.

Recurrence relation:

**T(n) = T(n/2) + O(1)**

Here a = 1, b = 2, and the work done outside the recursive call is O(1).

Applying the second case of the master theorem gives a time complexity of Θ(log n). We will cover these cases in detail below.
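The recurrence above comes from an implementation like the following (a minimal recursive sketch in Python):

```python
def binary_search(arr, target, lo=0, hi=None):
    """Return the index of target in sorted arr, or -1 if absent.
    One recursive call on half the range plus O(1) work per call,
    so T(n) = T(n/2) + O(1) = Θ(log n)."""
    if hi is None:
        hi = len(arr) - 1
    if lo > hi:                      # empty range: target not present
        return -1
    mid = (lo + hi) // 2             # O(1) work before the recursive call
    if arr[mid] == target:
        return mid
    if arr[mid] < target:
        return binary_search(arr, target, mid + 1, hi)   # search right half
    return binary_search(arr, target, lo, mid - 1)       # search left half
```

For example, `binary_search([1, 3, 5, 7, 9], 7)` returns 3.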

### Example 2: Merge Sort

Merge Sort is a sorting algorithm that splits an array in half, sorts each half recursively, and then merges the two sorted halves back together.

Recurrence relation:

**T(n) = 2T(n/2) + O(n)**

Here a = 2, b = 2, and the merge step takes O(n) time. Applying case 2 of the master theorem gives T(n) = Θ(n log n).
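The recurrence corresponds to an implementation like this minimal Python sketch; the two recursive calls give a = 2, and the merge loop is the O(n) work outside them:

```python
def merge_sort(arr):
    """Sort a list via T(n) = 2T(n/2) + O(n) = Θ(n log n)."""
    if len(arr) <= 1:                 # base case: already sorted
        return arr
    mid = len(arr) // 2
    left = merge_sort(arr[:mid])      # first recursive call (size n/2)
    right = merge_sort(arr[mid:])     # second recursive call (size n/2)
    # Merge step: O(n) work outside the recursive calls.
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged
```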

## Master Theorem for Dividing Functions

We can use the master theorem formula directly when the recurrence relation divides the problem size by a constant factor. The time complexity of the function is given by T(n).

`T(n) = a.T(n/b) + f(n)`

**‘n’** is the input size.

**‘a’** is the number of subproblems (a ≥ 1).

**‘n/b’** is the size of each subproblem (b > 1).

**f(n)** is the work done outside the recursive call. For dividing functions, **f(n) = Θ(n^k · log^p n)** with k ≥ 0.

The master theorem formula for a dividing function depends on the values of a, b, and f(n).

**Case 1:** If log_b(a) > k, then T(n) = Θ(n^(log_b a)).

**Case 2:** If log_b(a) = k, then T(n) depends on the value of **‘p’**:

- if p > -1, then T(n) = Θ(n^k · log^(p+1) n)
- if p = -1, then T(n) = Θ(n^k · log log n)
- if p < -1, then T(n) = Θ(n^k)

**Case 3:** If log_b(a) < k, then T(n) depends on the value of **‘p’**:

- if p ≥ 0, then T(n) = Θ(n^k · log^p n)
- if p < 0, then T(n) = Θ(n^k)
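The three cases above can be wrapped in a small helper. This is an illustrative sketch (the function name `solve_dividing` and the string output format are our own, not part of the theorem):

```python
import math

def solve_dividing(a, b, k, p):
    """Master theorem for T(n) = a*T(n/b) + Θ(n^k * log^p n).
    Returns the Θ-bound as a readable string."""
    log_ba = math.log(a, b)
    if math.isclose(log_ba, k):                  # Case 2: log_b(a) = k
        if p > -1:
            return f"Θ(n^{k:g} * log^{p + 1:g} n)"
        if p == -1:
            return f"Θ(n^{k:g} * log log n)"
        return f"Θ(n^{k:g})"
    if log_ba > k:                               # Case 1: log_b(a) > k
        return f"Θ(n^{log_ba:g})"
    if p >= 0:                                   # Case 3: log_b(a) < k
        return f"Θ(n^{k:g} * log^{p:g} n)"
    return f"Θ(n^{k:g})"
```

For binary search, `solve_dividing(1, 2, 0, 0)` gives "Θ(n^0 * log^1 n)", i.e. Θ(log n); for merge sort, `solve_dividing(2, 2, 1, 0)` gives "Θ(n^1 * log^1 n)", i.e. Θ(n log n).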

### Example

Now that we know how to use the master theorem for dividing functions, let us work through some examples.

**Binary Search:** The time complexity of recursive binary search **(T(n) = T(n/2) + O(1))** is derived below.

```
T(n) = 1·T(n/2) + O(1)
∴ a = 1, b = 2 and Θ(n^k · log^p n) = O(1)
O(1) = Θ(n^0 · log^0 n)
∴ k = 0 and p = 0
log_b(a) = log_2(1) = 0
∴ log_b(a) = k
p = 0 > -1
```

Since p = 0 > -1, case 2 applies:

Time complexity = Θ(n^k · log^(p+1) n)

= Θ(n^0 · log^(0+1) n)

= Θ(1 · log^1 n)

= Θ(log n)
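We can cross-check this bound by unrolling the recurrence exactly for powers of two (a sketch assuming T(1) = 1):

```python
def steps(n):
    """Exact value of T(n) = T(n/2) + 1 with T(1) = 1, for n a power of 2."""
    if n == 1:
        return 1
    return steps(n // 2) + 1   # one unit of work, then recurse on half

# Closed form: steps(n) = log2(n) + 1, confirming Θ(log n).
```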

**Merge sort:** The time complexity of recursive merge sort **(T(n) = 2T(n/2) + O(n))** is derived below.

```
T(n) = 2·T(n/2) + O(n)
∴ a = 2, b = 2 and Θ(n^k · log^p n) = O(n)
O(n) = Θ(n^1 · log^0 n)
∴ k = 1 and p = 0
log_b(a) = log_2(2) = 1
∴ log_b(a) = k
p = 0 > -1
```

Since p = 0 > -1, case 2 applies:

Time complexity = Θ(n^k · log^(p+1) n)

= Θ(n^1 · log^(0+1) n)

= Θ(n · log^1 n)

= Θ(n log n)
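As with binary search, the bound can be cross-checked by evaluating the recurrence exactly for powers of two (a sketch assuming T(1) = 1):

```python
def T(n):
    """Exact value of T(n) = 2T(n/2) + n with T(1) = 1, for n a power of 2."""
    if n == 1:
        return 1
    return 2 * T(n // 2) + n   # two half-size subproblems plus n merge work

# Closed form: T(n) = n*log2(n) + n, confirming Θ(n log n).
```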

## Master Theorem for Decreasing Functions

We can also use the master theorem formula directly when the recurrence relation decreases the problem size by a constant amount. The time complexity of the function is given by T(n).

`T(n) = a.T(n - b) + f(n)`

**‘n’** is the input size.

**‘a’** is the number of subproblems (a > 0).

**‘n - b’** is the size of each subproblem (b > 0).

**f(n)** is the work done outside the recursive call. For decreasing functions, **f(n) = Θ(n^k)** with k ≥ 0.

The master theorem formula for a decreasing function depends on the values of a and f(n).

**Case 1:** If a < 1, then T(n) = Θ(n^k).

**Case 2:** If a = 1, then T(n) = Θ(n^(k+1)).

**Case 3:** If a > 1, then T(n) = Θ(a^(n/b) · f(n)).
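As with dividing functions, these cases can be wrapped in a small helper (an illustrative sketch; the name `solve_decreasing` and the output format are our own):

```python
def solve_decreasing(a, b, k):
    """Master theorem for T(n) = a*T(n - b) + Θ(n^k).
    Returns the Θ-bound as a readable string."""
    if a < 1:      # Case 1: the recursion contributes geometrically less work
        return f"Θ(n^{k})"
    if a == 1:     # Case 2: about n/b levels, each doing Θ(n^k) work
        return f"Θ(n^{k + 1})"
    # Case 3: the number of recursive calls grows like a^(n/b)
    return f"Θ({a}^(n/{b}) * n^{k})"
```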

### Example

Now that we know how to use the master theorem for decreasing functions, let us work through some examples.

Find the time complexity of a function given by the recurrence relation **T(n) = 2T(n - 2) + 1**.

```
T(n) = 2·T(n - 2) + 1
∴ a = 2, b = 2, k = 0 and f(n) = 1
```

Since a = 2 > 1, case 3 applies:

Time Complexity = Θ(a^(n/b) · f(n))

= Θ(2^(n/2) · 1)

= Θ(2^(n/2))
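We can sanity-check this exponential bound by counting the calls a direct recursive evaluation of T(n) = 2T(n - 2) + 1 would make (an illustrative sketch):

```python
def call_count(n):
    """Number of calls made when evaluating T(n) = 2T(n-2) + 1 directly."""
    if n <= 1:
        return 1                        # base case: a single call
    return 1 + 2 * call_count(n - 2)    # this call plus two recursive calls
```

The count satisfies call_count(n) = 2^(n/2 + 1) - 1 for even n, so it roughly doubles every time n grows by 2, matching Θ(2^(n/2)).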

Find the time complexity of a function given by the recurrence relation **T(n) = (1/3)T(n - 1) + n**.

```
T(n) = (1/3)·T(n - 1) + n
∴ a = 1/3, b = 1, k = 1 and f(n) = n
```

Since a = 1/3 < 1, case 1 applies:

Time Complexity = Θ(n^k)

= Θ(n^1)

= Θ(n)

## Master Theorem Limitations

We have seen several examples of analysing time complexity with the master theorem, but can we use it for every recurrence of this form? No.

Here are the conditions under which the master theorem cannot be used:

- If T(n) is not monotone, e.g. T(n) = sin n; the master theorem requires T(n) to be a monotonically increasing function.

- If f(n) is not a polynomial. For example, in T(n) = 2T(n/2) + 2^n, f(n) = 2^n is exponential, so we cannot use the master theorem here.

- If a is not a constant. For example, in T(n) = 2^n · T(n/3) + n log n, a = 2^n is not a constant, so T(n) cannot be analysed using the master theorem.

Some important points to keep in mind when working with the Master Theorem to solve recurrence relations:

- **Form of Recurrence**: Ensure the recurrence fits the standard form T(n) = aT(n/b) + f(n), where a ≥ 1, b > 1, and f(n) is an asymptotically positive function.
- **Case Identification**: Identify which of the three cases applies by comparing f(n) with n^(log_b a).
- **Case 1**: If f(n) = O(n^(log_b a − ϵ)) for some ϵ > 0, then T(n) = Θ(n^(log_b a)).
- **Case 2**: If f(n) = Θ(n^(log_b a)), then T(n) = Θ(n^(log_b a) · log n).
- **Case 3**: If f(n) = Ω(n^(log_b a + ϵ)) for some ϵ > 0, and a·f(n/b) ≤ c·f(n) for some constant c < 1 and all sufficiently large n, then T(n) = Θ(f(n)).
- **Regularity Condition**: For Case 3, verify that the additional condition a·f(n/b) ≤ c·f(n) holds; it is needed to prove the upper bound.
- **Exceptions**: The Master Theorem does not apply to every recurrence, in particular when the recurrence does not match the standard form or f(n) satisfies none of the case conditions.
- **Supplementary Methods**: Be prepared to use other methods, such as the substitution method or the recursion-tree method, for recurrences that fall outside the Master Theorem's scope.

## Frequently Asked Questions

**What is master theorem formula?**

The master theorem formula is T(n) = aT(n/b) + f(n), where a ≥ 1 and b > 1 are constants and f(n) is the work done outside the recursive call.

**What is master's theorem in DAA?**

Master’s theorem in DAA (Design and Analysis of Algorithms) is a technique that gives us a formula to directly find the time complexity of an algorithm defined by a recurrence relation.

**What are the 3 cases of the master theorem?**

The three cases of the master theorem depend on f(n), the work done outside the recursive call. Case 1 applies when the work done in the recursive subproblems dominates the time complexity. Case 2 applies when the work in the subproblems and f(n) are of the same order. Case 3 applies when f(n) dominates the time complexity.

**What is an example of the master's theorem?**

Consider a recursive function with two subproblems of half the size each and no extra work outside the recursive call, i.e. T(n) = 2T(n/2) + O(1). Here log_2(2) = 1 > k = 0, so by case 1 the time complexity is Θ(n).

## Conclusion

In this article, we learnt about the master theorem and how to use it to directly find the time complexity of an algorithm defined by a recurrence relation.

We also discussed a shortcut method that helps find the answer in an instant, and practised a few problems. Apart from this, there are other concepts that you can learn from the free video lectures on __Coding Ninjas__.


You can also try the DSA problems available on __Coding Ninjas Studio__. The questions there are frequently asked in interview rounds and will help you master efficient coding techniques. They also come with a bonus of interview experiences from candidates at big product-based companies.

Happy learning!