In this article, we solve some interesting questions on time complexity using the master theorem and the substitution method, including the recurrences behind some well-known recursive and iterative algorithms.

To brush up on the basics of complexity, you can read the article Time and Space Complexity on Coding Ninjas Studio.

What is time complexity?

The amount of time an algorithm takes to run, expressed as a function of the input size, is known as time complexity.


What is space complexity?

The total memory an algorithm uses, expressed as a function of the input size, is known as space complexity.

Let us see some common time complexity expressions and their meanings:

O(1): Constant time
O(log n): Logarithmic time
O(n): Linear time
O(n^2): Quadratic time
O(n^4 + n^2) = O(n^4): Polynomial time (lower-order terms are dropped)
O(2^n): Exponential time
O(n!): Factorial time

Master Theorem

The master theorem helps us quickly find the time complexity of a recurrence relation.

Given a recurrence T(N) = a · T(N/b) + f(N), where N is the size of the input, a ≥ 1, b > 1, and f(N) is a polynomial function, the master theorem lets us read off the time complexity by comparing f(N) with N^(log_b a).
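For the common case where f(N) = N^k, the three cases of the master theorem can be sketched as a small helper. The function name and interface below are our own illustration, not a standard API:

```python
import math

def master_theorem(a, b, k):
    """Classify T(N) = a*T(N/b) + N^k using the master theorem.

    Assumes a >= 1, b > 1, and f(N) = N^k (a polynomial).
    Returns the asymptotic bound as a string.
    """
    c = math.log(a, b)  # critical exponent log_b(a)
    if k < c:
        return f"O(N^{c:g})"        # recursion dominates
    if k == c:
        return f"O(N^{k:g} log N)"  # both sides balanced
    return f"O(N^{k:g})"            # f(N) dominates

# Merge sort: T(N) = 2T(N/2) + N  ->  O(N log N)
print(master_theorem(2, 2, 1))
# T(N) = 8T(N/2) + N  ->  O(N^3)
print(master_theorem(8, 2, 1))
```

Note that O(N^0 log N) is just O(log N), so a recurrence like binary search's T(N) = T(N/2) + 1 falls under the balanced case.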

def fib(n):
    if n == 0 or n == 1:
        return 1
    return 2 * fib(n - 1)

Ans:

Here, note that while calculating fib(N), fib(N-1) is not computed twice; it is computed once and the result is multiplied by 2.

Thus we can represent the above recurrence as T(N) = T(N-1)+O(1).

Here the recursion runs N times: each call reduces N by one until it reaches the base case of N = 0 or N = 1.

Fig: Recursion in the above function

Since T(1) = O(1), this recurrence solves the same way as in the previous question.

Thus T(N) = O(N).
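To sanity-check the O(N) bound, we can count the recursive calls; the counter argument below is our own instrumentation added to the function above:

```python
def fib(n, counter):
    """Same recurrence as above, instrumented with a call counter."""
    counter[0] += 1
    if n == 0 or n == 1:
        return 1
    return 2 * fib(n - 1, counter)

for n in (5, 10, 20):
    counter = [0]
    fib(n, counter)
    print(n, counter[0])  # exactly n calls for n >= 1: linear growth
```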

9. What is the time complexity of the below code?

def fib(n):
    if n == 0 or n == 1:
        return 1
    return fib(n - 1) + fib(n - 1)

Ans: O(2^{N})

The code is similar to the previous one and produces the same output, but its time complexity is much worse.

This is because we are calculating the fib(n-1) twice.

Figure: The Tree formed for calculating the fib(N)

As we can see that fib(3) is calculated twice, and fib(2) is calculated 4 times.

Here we get the recurrence

T(N) = 2 · T(N-1)

Expanding the recurrence:

T(N) = 2 · T(N-1) = 4 · T(N-2) = ... = 2^(N-1) · T(N-(N-1)) = 2^(N-1) · T(1)

Since T(1) = O(1), we get T(N) = O(2^(N-1)).

Thus the time complexity is O(2^N).
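The exponential bound can likewise be confirmed by counting calls; again, the counter is our own instrumentation:

```python
def fib(n, counter):
    """Doubly recursive version, instrumented with a call counter."""
    counter[0] += 1
    if n == 0 or n == 1:
        return 1
    return fib(n - 1, counter) + fib(n - 1, counter)

for n in (3, 5, 10):
    counter = [0]
    fib(n, counter)
    print(n, counter[0])  # 2^n - 1 calls: exponential growth
```

The counts follow C(N) = 1 + 2 · C(N-1) with C(1) = 1, which solves to 2^N - 1.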

10. What is the time complexity of the below code?

def fib(n):
    dp = [-1 for i in range(n)]
    for i in range(n):
        if i <= 1:
            dp[i] = i
        else:
            dp[i] = dp[i - 1] + dp[i - 2]
    return dp

Ans:

The time complexity here is O(N).

In the i-th iteration, the i-th Fibonacci number is calculated, and each iteration takes O(1) time: the i-th Fibonacci number depends only on the (i-1)-th and (i-2)-th numbers, which have already been computed, so it can be obtained with a single addition.
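An equivalent top-down way to get the same O(N) bound is memoization. The sketch below uses Python's functools.lru_cache and is our own illustration, not code from the article; it uses the convention fib(0) = 0, fib(1) = 1, matching the dp version above:

```python
from functools import lru_cache

@lru_cache(maxsize=None)
def fib(n):
    """Memoized Fibonacci: each value of n is computed at most once,
    so the total work is O(n) despite the doubly recursive form."""
    if n <= 1:
        return n
    return fib(n - 1) + fib(n - 2)

print(fib(10))  # 55
```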

What is the master theorem? The master theorem helps us quickly find the time complexity of a recurrence relation of the form T(N) = a · T(N/b) + f(N), where N is the size of the input and f(N) is a polynomial function.

When is the master theorem applicable? The master theorem applies to recurrences of the form T(N) = a · T(N/b) + f(N), where a ≥ 1, b > 1, and f(N) is a polynomial function.

Can we find the time complexity in multiple ways? Yes. Some recurrences can be solved both with the master theorem and with the substitution method, and both approaches give the same result.

What is the need to optimize time and space? Optimizing time is essential for writing efficient programs: a merge sort on an ordinary computer can beat a bubble sort on a supercomputer. Similarly, using too much memory can cause runtime (out-of-memory) errors. A good program is both time- and space-efficient.

What are the ways to optimize time and space? We can optimize time and space complexity by choosing better algorithms and proper data structures. Removing unnecessary arrays saves space, and hash-based structures like HashMaps or sets make searching for an element much faster.
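As a rough illustration of the data-structure point above, a membership test scans a Python list in O(n) time, while a set (hash-based) answers it in O(1) on average. The timing numbers below are our own illustration and will vary by machine:

```python
import timeit

n = 100_000
data_list = list(range(n))
data_set = set(data_list)  # hash-based: O(1) average lookup

# Membership test: O(n) scan in a list vs O(1) average in a set
t_list = timeit.timeit(lambda: n - 1 in data_list, number=100)
t_set = timeit.timeit(lambda: n - 1 in data_set, number=100)
print(f"list: {t_list:.4f}s, set: {t_set:.6f}s")
```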

Key Takeaways

We solved some interesting problems using the master theorem and the substitution method, and found the time complexity of some well-known recursive and iterative algorithms.

You can also copy the programs and run them on your local machine for a better understanding, and edit them at your convenience to check the output.