Code360 powered by Coding Ninjas X Naukri.com
Table of contents
1. Introduction
2. What is time complexity?
3. What is space complexity?
4. Let us see some time complexity expressions and their meaning
5. Master Theorem
6. FAQs
7. Key Takeaways
Last Updated: Mar 27, 2024

Interesting Questions on Recurrences (Substitution, Iteration, Recursion Tree, Master Theorem)

Author: Het Fadia

Introduction

In this article, we solve some interesting questions using the master theorem and the substitution method. We will also analyze the running time of some of the most famous recursive and iterative algorithms using these techniques.

To brush up on your knowledge of the basics of complexity analysis, you can read the article Time and Space Complexity on Coding Ninjas Studio.

What is time complexity?

The amount of time an algorithm takes to run, expressed as a function of the input size, is known as its time complexity.


What is space complexity?

The total memory an algorithm uses, expressed as a function of the input size, is known as its space complexity.

Let us see some time complexity expressions and their meaning

Time complexity: Meaning

O(1): Constant time
O(log n): Logarithmic time
O(n): Linear time
O(n^2): Quadratic time
O(n^4 + n^2): Polynomial time (the dominant term, n^4, governs the growth)
O(2^n): Exponential time
O(n!): Factorial time

Master Theorem

The master theorem helps us quickly find the time complexity of a recurrence relation.

Given a recurrence T(N) = a x T(N/b) + f(N), where N is the size of the input, a >= 1, b > 1, and f(N) is a polynomial function, the master theorem gives the time complexity directly by comparing f(N) with N^(log_b a):

1. If N^(log_b a) grows faster than f(N), then T(N) = O(N^(log_b a)).
2. If N^(log_b a) and f(N) grow at the same rate, then T(N) = O(N^(log_b a) x log N).
3. If f(N) grows faster than N^(log_b a) (and f(N) satisfies the regularity condition), then T(N) = O(f(N)).

Figure 1: Master theorem

Now that we have seen the basics, let us solve some interesting problems:

1. Given that T(1) = 1, what is the time complexity of the following recurrence relation:

T(N) = 2 x T(N/2) + c where c is a constant

Solution:

Using the master theorem:

Comparing the equation with T(N) = a x T(N/b) + f(N), we get a = 2, b = 2 and f(N) = c, a constant.

Now N^(log_b a) = N^(log_2 2) = N^1 = N, which grows faster than f(N).

Since N^(log_b a) grows faster than f(N), the first case of the master theorem applies: T(N) = O(N^(log_b a)) = O(N).

Thus the time complexity is O(N).
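To make this concrete, here is a small divide-and-conquer function (our own illustrative sketch, not from the article) whose running time follows T(N) = 2 x T(N/2) + c: it recurses on the two halves of a list and combines the results in constant time.

```python
def find_max(arr, lo=0, hi=None):
    """Return the maximum of arr[lo:hi] by divide and conquer.
    The recurrence is T(N) = 2*T(N/2) + c, so the running time is O(N)."""
    if hi is None:
        hi = len(arr)
    if hi - lo == 1:                     # base case: T(1) = O(1)
        return arr[lo]
    mid = (lo + hi) // 2
    left = find_max(arr, lo, mid)        # T(N/2)
    right = find_max(arr, mid, hi)       # T(N/2)
    return max(left, right)              # combine step: constant work c

print(find_max([3, 7, 1, 9, 4]))  # 9
```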

2. Given that T(1) = 1, what is the time complexity of the following recurrence relation:

T(N) = 4 x T(N/2) + N^2

Solution:

Using the master theorem:

Comparing the equation with T(N) = a x T(N/b) + f(N), we get a = 4, b = 2 and f(N) = N^2.

Now N^(log_b a) = N^(log_2 4) = N^2 = f(N), so both grow at the same rate.

The second case of the master theorem applies: T(N) = O(f(N) x log N) = O(N^2 x log N).

Thus the time complexity is O(N^2 log N).
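As an illustration (a hypothetical toy function of our own, not from the article), code matching this recurrence would do N^2 units of non-recursive work and then make four recursive calls on half the input:

```python
def work(n):
    """Toy function with recurrence T(N) = 4*T(N/2) + N^2.
    Returns the total number of work units performed."""
    if n <= 1:
        return 1                  # base case: T(1) = 1
    units = n * n                 # f(N) = N^2 units of non-recursive work
    for _ in range(4):            # a = 4 recursive calls ...
        units += work(n // 2)     # ... each on input of size N/b = N/2
    return units

print(work(4))  # 4*4 + 4 * (2*2 + 4*1) = 48
```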

3. What is the time complexity of the following recurrence relation?

T(N) = 8 x T(N/2) + O(N^5)

Solution:

Using the master theorem:

Comparing the equation with T(N) = a x T(N/b) + f(N), we get a = 8, b = 2 and f(N) = N^5.

Now N^(log_b a) = N^(log_2 8) = N^3, and f(N) = N^5 grows faster than N^3.

Since f(N) grows faster than N^(log_b a), the third case of the master theorem applies (f(N) also satisfies the regularity condition: 8 x (N/2)^5 = N^5/4 <= c x N^5 for some c < 1).

We get T(N) = O(f(N)) = O(N^5).

Thus the time complexity is O(N^5).

4. The recurrence of the merge sort is shown below. What is the time complexity of the merge sort?

T(N) = 2T(N/2) + N, T(1) = 1.

Solution:

Using the master theorem:

Comparing the equation with T(N) = a x T(N/b) + f(N), we get a = 2, b = 2 and f(N) = N.

Now N^(log_b a) = N^(log_2 2) = N = f(N), so both grow at the same rate.

The second case of the master theorem applies: T(N) = O(f(N) x log N) = O(N log N).

Thus the time complexity of merge sort is O(N log N).
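For reference, a straightforward merge sort follows this recurrence (our own sketch): two recursive calls on the halves plus an O(N) merge.

```python
def merge_sort(a):
    """Sort a list: T(N) = 2*T(N/2) + O(N), i.e. O(N log N)."""
    if len(a) <= 1:
        return a                       # base case
    mid = len(a) // 2
    left = merge_sort(a[:mid])         # T(N/2)
    right = merge_sort(a[mid:])        # T(N/2)
    merged = []                        # merge step: O(N)
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])
    merged.extend(right[j:])
    return merged

print(merge_sort([5, 2, 9, 1, 5]))  # [1, 2, 5, 5, 9]
```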

5. The recurrence of the binary_search is shown below. What is the time complexity of the binary search algorithm?

T(N) = T(N/2) + constant, T(1) = 1.

Solution:

Using the master theorem:

Comparing the equation with T(N) = a x T(N/b) + f(N), we get a = 1, b = 2 and f(N) = constant.

Now N^(log_b a) = N^(log_2 1) = N^0 = 1, a constant, so N^(log_b a) and f(N) grow at the same rate.

The second case of the master theorem applies: T(N) = O(f(N) x log N) = O(log N).

Thus the time complexity of binary search is O(log N).
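A standard iterative binary search shows where this recurrence comes from (our own sketch): each comparison halves the search range, which is exactly T(N) = T(N/2) + constant.

```python
def binary_search(arr, target):
    """Return the index of target in the sorted list arr, or -1.
    Each iteration halves the range: T(N) = T(N/2) + O(1) = O(log N)."""
    lo, hi = 0, len(arr) - 1
    while lo <= hi:
        mid = (lo + hi) // 2           # constant work per step
        if arr[mid] == target:
            return mid
        elif arr[mid] < target:
            lo = mid + 1               # discard the left half
        else:
            hi = mid - 1               # discard the right half
    return -1

print(binary_search([1, 3, 5, 7, 9], 7))  # 3
```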

6. The recurrence of quick sort in the worst case is shown below. What is the time complexity of quick sort in the worst case?

T(N) = T(N-1) + N, T(1) = 1

Solution:

The master theorem is not applicable here, as T(N-1) cannot be written in the form a x T(N/b).

Instead, we solve the recurrence using the substitution method:

T(N) = T(N-1) + N

T(N-1) = T(N-2) + (N-1)

T(N-2) = T(N-3) + (N-2)

...

T(2) = T(1) + 2

Adding all of these equations and cancelling the common terms on both sides, we get

T(N) = T(1) + 2 + 3 + ... + N

We know that the sum of the series 1 + 2 + 3 + ... + N = N x (N+1)/2, so 2 + 3 + ... + N = N x (N+1)/2 - 1.

Thus T(N) = 1 + N x (N+1)/2 - 1 = N x (N+1)/2 = O(N^2).
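A quick sort that always picks the last element as pivot makes this worst case easy to see (our own sketch): on an already sorted list, the partition is maximally unbalanced, so each level removes only the pivot and does O(N) partitioning work, giving T(N) = T(N-1) + N.

```python
def quick_sort(a):
    """Quick sort with the last element as pivot.
    On an already sorted input, `larger` is always empty, so the
    recursion follows T(N) = T(N-1) + N = O(N^2)."""
    if len(a) <= 1:
        return a
    pivot = a[-1]
    smaller = [x for x in a[:-1] if x <= pivot]   # partition: O(N) work
    larger = [x for x in a[:-1] if x > pivot]
    return quick_sort(smaller) + [pivot] + quick_sort(larger)

print(quick_sort([3, 1, 2]))  # [1, 2, 3]
```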

7. What is the time complexity of the below code?

# Given N>0
def recursion(N):
    if N==0:
        return 0
    return N+recursion(N-1)

Ans: O(N)

Substitution method:

T(N) = T(N-1) + 1

T(N-1) = T(N-2) + 1

T(N-2) = T(N-3) + 1

...

T(2) = T(1) + 1

Adding all of these equations and cancelling the common terms on both sides:

T(N) = T(1) + (N - 1)

Since T(1) = 1, T(N) = N = O(N).

Here the recursion runs N times, because the base case is N = 0: the function keeps calling itself until the value of N reaches 0.

Thus N calls result in O(N) time complexity.

Must Read Recursion in Data Structure

8. What is the time complexity of the below code?

def fib(n):
    if n == 0 or n == 1:
        return 1
    return 2 * fib(n - 1)

Ans:

Here we note that while calculating the fib(N), fib(N-1) is not being calculated twice. It is calculated once and then multiplied by 2.

Thus we can represent the above recurrence as T(N) = T(N-1)+O(1).

Here the recursion runs N times, as the base cases are N = 0 and N = 1: the function keeps calling itself until N reaches 0 or 1.

Fig: Recursion in the above function

Since T(1)=1, this question has the exact same recurrence as the above question.

Thus T(N) = O(N).

9. What is the time complexity of the below code?

def fib(n):
    if n == 0 or n == 1:
        return 1
    return fib(n - 1) + fib(n-1)

Ans: O(2^N)

The code is similar to the previous code. Also, it gives the same output as the previous code. But the time complexity is not as good as the previous code.

This is because we are calculating the fib(n-1) twice. 

Figure: The Tree formed for calculating the fib(N)

As we can see, fib(3) is calculated twice, and fib(2) is calculated 4 times.

Here we get the recurrence

T(N) = 2 x T(N-1)

Thus T(N) = 2 x T(N-1) = 4 x T(N-2) = ... = 2^(N-1) x T(N-(N-1)) = 2^(N-1) x T(1).

Since T(1) = O(1), we get T(N) = 2^(N-1).

Thus the time complexity is O(2^N).
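We can check the exponential growth by counting calls (an instrumented sketch of our own): the number of calls C(N) satisfies C(N) = 1 + 2 x C(N-1) with C(0) = C(1) = 1, which solves to C(N) = 2^N - 1 for N >= 1.

```python
calls = 0

def fib(n):
    """Same doubling recursion as above, instrumented to count calls."""
    global calls
    calls += 1
    if n == 0 or n == 1:
        return 1
    return fib(n - 1) + fib(n - 1)

fib(10)
print(calls)  # 2^10 - 1 = 1023
```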

10. What is the time complexity of the below code?

def fib(n):
    # dp[i] will hold the ith Fibonacci number (0-indexed), so the
    # function returns the list of the first n Fibonacci numbers
    dp = [-1 for i in range(n)]
    for i in range(n):
        if i <= 1:
            dp[i] = i              # base cases: fib(0) = 0, fib(1) = 1
        else:
            dp[i] = dp[i-1] + dp[i-2]
    return dp

Ans:

The time complexity here is O(N).

In the ith iteration, the ith Fibonacci number is calculated, and each iteration takes O(1) time: the ith Fibonacci number depends only on the (i-1)th and (i-2)th Fibonacci numbers, which have already been computed, so it can be obtained with a single addition.
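The same O(N) bound can also be reached top-down with memoization (a sketch using functools.lru_cache; unlike the code above, this variant returns the nth Fibonacci number rather than the whole list):

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def fib_memo(n):
    """Top-down Fibonacci with caching: each value of n is computed
    once, so there are only O(N) distinct calls in total."""
    if n <= 1:
        return n                  # base cases: fib(0) = 0, fib(1) = 1
    return fib_memo(n - 1) + fib_memo(n - 2)

print(fib_memo(10))  # 55
```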

Also check out - Rod Cutting Problem and  Euclid GCD Algorithm

FAQs

  1. What is the master theorem?
    The master theorem helps us quickly find the time complexity for recurrence relation.
    Given a recurrence T(N) = a x T(N/b) + f(N) where N is the size of the input and f(N) is a polynomial function, we can find the time complexity using the master theorem quickly. 
     
  2. When is the master theorem applicable?
    The master theorem is applicable on the recurrence T(N) = a x T(N/b) + f(N). Here f(N) must be a polynomial function for the master theorem to be applicable.
     
  3. Can we find out the time complexity in multiple ways?
Yes, we can. Some recurrences can be solved using both the master theorem and the substitution method, and the different methods will give the same correct result.
     
  4. What is the need to optimize time and space?
Optimization of time is essential in writing efficient programs. A merge sort on an ordinary computer can beat a bubble sort on a supercomputer. Similarly, using too much memory can cause runtime errors such as exceeding memory limits.
    Thus the efficiency of programs plays a vital role. A good program is both time and space-efficient.
     
  5. What are the ways to optimize time and space?
    We can optimize time and space complexity by using better algorithms and proper data structures. Removing unnecessary arrays can save space. Similarly, data structures like HashMaps can be beneficial in searching an element.

Key Takeaways

In this article, we solved some interesting problems using the master theorem and the substitution method, and found the time complexity of some famous recursive and iterative algorithms.

You can copy the programs and run them on your local machine for a better understanding. You can also edit them and check the output.

Read More - Time Complexity of Sorting Algorithms

Learn more: fibonacci series in c

Check out Coding Ninjas Studio to learn more about competitive programming, DSA, attempt mock tests, interview preparation, and much more.

Happy Learning!!!
