This article aims to familiarise you with the concept of time and space complexity. After reading this article, you will better understand time and space complexity along with various ways to optimise them to obtain a more efficient solution.

To brush up on the basics of time and space complexity, you can read the article Time and Space Complexity on Coding Ninjas Studio.

Q1. What is the time complexity of the following code?

def fun(n,m):
    for i in range(n):
        print(i)
    for i in range(m):
        print(i)

Ans: Time complexity : O(n+m)

Here the first loop runs for n times and the second for loop iterates for m times. Thus time complexity is O(n+m).
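As a quick check, we can instrument the function with a counter in place of the print calls (linear_count and its return value are additions for illustration):

```python
def linear_count(n, m):
    # same loops as above, with a counter in place of the print calls
    counter = 0
    for i in range(n):
        counter += 1
    for i in range(m):
        counter += 1
    return counter

print(linear_count(10, 5))    # 15
print(linear_count(100, 50))  # 150
```

The count grows linearly in n + m, as expected.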

Q2. What is the time complexity of the following code?

import random
def fun(N):
counter=0
for i in range(N):
counter+=random.randint(1,100)
print(counter)

Ans: Time complexity: O(N)

Here the loop is running for N times.

The random.randint(a,b) function takes O(1) time. Thus the time complexity is O(N).

Q3. What is the time and space complexity of the following code?

def fun(N,M):
    arr=[]
    counter=0
    for i in range(N):
        arr.append(i)
    for i in range(M):
        counter+=1
    print(counter)

Ans: Time complexity: O(N+M), Space complexity: O(N).

Here the first loop runs for N times. So the array “arr” will have a size of O(N).

Also, the second loop runs for M iterations.

Thus time complexity is O(N) + O(M) = O(N+M).

Since we make an array of size O(N), the space complexity is O(N).

Q4. What is the time complexity of the following code?

def function(N,M):
    counter=0
    for i in range(N):
        for j in range(M):
            counter+=1
    print(counter)

Ans: Time complexity: O(N x M).

Here the outer loop runs N times, and in each of its iterations the inner loop runs M times. Hence the total number of operations is N x M, i.e. the time complexity is O(N x M).
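The same counting trick confirms the product (nested_count is a counting variant of the code above, added for illustration):

```python
def nested_count(N, M):
    # counts how many times the inner loop body runs
    counter = 0
    for i in range(N):
        for j in range(M):
            counter += 1
    return counter

print(nested_count(10, 20))  # 200 = 10 x 20
```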

Q5. What is the time complexity of the following code?

def fun(n):
    for i in range(n):
        print(pow(i,n))

Ans: Time complexity: O(n x log(n))

Here pow(i,n) takes O(log(n)) time, since exponentiation by an integer power can be done with binary exponentiation.

And the for loop runs n times.

Thus the time complexity is O(n x log(n)).
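To see where the log factor comes from, here is a sketch of binary exponentiation with a multiplication counter. fast_pow is an illustrative implementation, not necessarily what Python's built-in pow does internally:

```python
def fast_pow(base, exp):
    # Binary exponentiation: one squaring per bit of exp, plus one extra
    # multiply per set bit, so O(log exp) multiplications in total.
    mults = 0
    result = 1
    while exp > 0:
        if exp & 1:
            result *= base
            mults += 1
        base *= base
        mults += 1
        exp >>= 1
    return result, mults

val, mults = fast_pow(3, 1000)
print(val == 3 ** 1000)  # True
print(mults)             # about 2 x log2(1000), i.e. around 20
```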

Q6. What is the time complexity of the following code?

def f(n):
    if n == 0 or n == 1:
        return 1
    return f(n - 1) + f(n - 2)

Ans: Time complexity: O(2^{n})

Figure 1: working of the Fibonacci function

Figure 2: Recursion tree for calculating the 5th Fibonacci number

This is because f(i) will call f(i-1) and f(i-2), but the result of f(i) is not stored. So when we later calculate f(i+1), it calls f(i) again, and the same values end up being recalculated many times.

Here, the recurrence relation is: T(n) = T(n-1) + T(n-2) <= 2 x T(n-1)

Thus T(n) <= 2 x T(n-1) <= 4 x T(n-2) <= … <= 2^{n-1} x T(n-(n-1))

Since T(1) is constant, T(n) has an upper bound of O(2^{n}), i.e. exponential time complexity.

Similarly we can find the lower bound by using T(n)= T(n-1) + T(n-2) >= 2 x T(n-2)

T(n) >= 2 x T(n-2) >= 4 x T(n-4) >= … >= 2^{(n-1)/2} x T(1).

Thus the lower bound of T(n) is also exponential.

Thus the time complexity of fib(n) is exponentially increasing i.e. T(n) = O(2^{n}).
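We can observe the exponential blow-up directly by counting calls. f_counted is a copy of f above with a call counter added for illustration:

```python
calls = 0

def f_counted(n):
    # identical to f above, but counts how many times it is invoked
    global calls
    calls += 1
    if n == 0 or n == 1:
        return 1
    return f_counted(n - 1) + f_counted(n - 2)

f_counted(20)
print(calls)  # well over 20,000 calls just for n = 20
```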

Q7. What are the time complexity and space complexity of the following code?

def fun(n,m):
    arr=[[0]*m for i in range(n)]
    for i in range(n):
        for j in range(m):
            k=1
            while k<n*m:
                k*=2

Ans:

Time complexity : O(n*m* log (n*m))

Space complexity: O(n*m)

As we are making an array of size O(n*m) and no other extra space, the space complexity is O(n*m).

Here the first loop runs for n times.

The inner loop runs for m times.

In the innermost while loop, k doubles on every iteration, so it exceeds n*m after about log_{2}(n*m) iterations.

Thus time complexity is O(n*m* log (n*m)).
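A counting variant of the code above (grid_log_count is an illustrative name) matches the n*m x log2(n*m) estimate exactly when n*m is a power of two:

```python
import math

def grid_log_count(n, m):
    # same loops as above, but counting the innermost doublings
    counter = 0
    for i in range(n):
        for j in range(m):
            k = 1
            while k < n * m:
                k *= 2
                counter += 1
    return counter

n, m = 8, 8
print(grid_log_count(n, m))                 # 384
print(n * m * math.ceil(math.log2(n * m)))  # 384
```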

Q8. What are the time complexity and space complexity of the following code?

# given N>0
def recursion(N):
    if N==0:
        return
    print(N)
    recursion(N-1)

Ans: Time complexity: O(N), Space complexity: O(N)

Figure 3: Shows working of recursion function

Here the recursion runs N times, as the base case is N=0: the function keeps calling itself with N-1 until N reaches 0.

Thus N calls give O(N) time complexity, and the call stack grows to depth N, giving O(N) space complexity.

Alternative method: Here we can see that the print statement gets executed for N times. This indicates that the recursion is called for N times thus time complexity is O(N) and space complexity is O(N).

Q9. What is the time complexity of the following code?

def fun(N):
    counter=1
    for i in range(1,N+1):
        for j in range(1,i+1):
            counter+=1
    print(counter)

Ans: O(N^{2})

Here i runs from 1 to N.

So in the ith iteration, the inner loop will run from 1 to i.

So our time complexity will be 1 + 2 + 3 + 4 + … + N = N x (N+1)/2

We can thus write N x (N+1) /2 as O(N^{2}).

Alternative method: Here we can print the counter for different N and easily see that it grows as N x (N+1)/2 (the printed value is N x (N+1)/2 + 1 here, since counter starts at 1).
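A sketch of this alternative method, with the counter started at 0 so it equals the pure sum:

```python
def triangular_count(N):
    counter = 0  # start at 0 so the result is exactly 1 + 2 + ... + N
    for i in range(1, N + 1):
        for j in range(1, i + 1):
            counter += 1
    return counter

for N in (10, 100, 1000):
    print(N, triangular_count(N), N * (N + 1) // 2)
# 10 55 55
# 100 5050 5050
# 1000 500500 500500
```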

Q10. What is the time complexity of the following code?

def fun(N):
    for i in range(1,N+1):
        j=N
        while j>0:
            j//=2

Ans: O( N x log (N))

Here, the outer loop runs from 1 to N

In the inner while loop, the j gets divided by 2 every time.

So the inner loop will run for log2(j) = log2(N) times.

As the outer loop runs N times, our total time complexity will be N x log(N).

Alternative method:

We can add counter+=1 in the while loop to know the time complexity in our code.

def fun(N):
    counter=0
    for i in range(1,N+1):
        j=N
        while j>0:
            j//=2
            counter+=1
    print(counter)

fun(1000) # prints 10000

As we can see, for N = 1000 the counter value comes out to be 10000 = 1000 x 10, where 10 = ceil(log2(1000)).

We can also try it out for different N and it will come out to be N x log(N).

Q11. What is the time complexity for the following program?

def fun(N):
    ans=0
    for i in range(1,N+1):
        for j in range(0,N,i):
            ans+=1
    print(ans)

Answer: O( N x log (N))

Here the outer loop runs from 1 to N

Here j is incremented by i each time, so in the ith iteration the inner loop runs about N/i times.

Thus the total time taken by the program will be N/1 + N/2 + N/3 + N/4 + N/5 + … N/N.

We can take the N out as a common multiple from the above sequence.

Thus the sequence will become N x (1/1 + 1/2 + 1/3 + 1/4 + … + 1/N).

Now we can represent the sequence in the form of summation as:

T(N) = N x ∑_{i=1}^{N} 1/i

We can represent the summation approximately as integration. The lower limit will be 1 and the upper limit will be N

= N ∫_{1}^{N} (1/x) dx

The integration of 1/x is log x. So substituting ∫_{1}^{N} (1/x) dx with [log_{e}x]_{1}^{N} we get

= N x [log_{e}x]_{1}^{N}

Now, substituting the lower and upper limit we get the following:

= N [log N - log 1]

= N x log N (Reason: Since log 1 is 0)

Thus the time complexity is N x (1/1 + 1/2 + 1/3 + … + 1/N) ≈ N x log(N), i.e. O(N x log(N)).
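We can confirm the harmonic-sum estimate numerically with a counting copy of the code (harmonic_count is an illustrative name):

```python
import math

def harmonic_count(N):
    # same loops as above, returning the count instead of printing it
    ans = 0
    for i in range(1, N + 1):
        for j in range(0, N, i):
            ans += 1
    return ans

N = 10000
# The ratio stays bounded by a small constant near 1,
# confirming the N x log(N) growth.
print(harmonic_count(N) / (N * math.log(N)))
```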

Q12. What is the time complexity for the following program?

def bisect_left(arr,item):
    # arr is a sorted array
    low=0
    high=len(arr)
    while low<high:
        mid=low+(high-low)//2
        if arr[mid]>=item:
            high=mid
        else:
            low=mid+1
    return low

Ans:

Time complexity : O(log(length of array))

Let N be the length of the array.

Here we are using binary search on the sorted array “arr” to find the element item. In each step, half of the remaining range is eliminated, so the running time can be written as

T(N) = T(N/2) + 1.

Now T(N/2) = T(N/4) + 1 substituting it in the above equation we get T(N) = T(N/4) + 2.

Thus T(N) = T(N/2^{k}) + k.

Let us find the smallest integer k such that 2^{k} >= N; thus k >= log2(N), and since k is an integer, k = ceil(log2(N)). The search therefore terminates after about log2(N) halvings, so the time complexity is O(log(N)).
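We can count the halving steps directly (bisect_left_counted is an instrumented copy of the code above, added for illustration):

```python
def bisect_left_counted(arr, item):
    # binary search that also returns the number of loop iterations
    low, high = 0, len(arr)
    steps = 0
    while low < high:
        steps += 1
        mid = low + (high - low) // 2
        if arr[mid] >= item:
            high = mid
        else:
            low = mid + 1
    return low, steps

arr = list(range(1024))            # N = 1024, log2(N) = 10
pos, steps = bisect_left_counted(arr, 700)
print(pos, steps)                  # 700 10
```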

Q13. What is the time complexity for the following program?

void fun(int n)
{
    int k = 0;
    for (int i = n; i > 0; i = i / 2)
    {
        for (int j = 0; j < i; ++j)
        {
            ++k;
        }
    }
    cout << k << endl;
}

Ans: O(n)

Here the values of i will be n, n/2, n/4, n/8, and so on, until i becomes 0: after each iteration of the outer loop, i is divided by 2.

Thus the inner loop runs for a total of n + n/2 + n/4 + n/8 + … + 1 iterations.

From the above sequence, we can take n out as a common factor, giving n x (1 + 1/2 + 1/4 + 1/8 + 1/16 + …).

Now 1 + 1/2 + 1/4 + 1/8 + … forms a geometric progression with first term a = 1 and common ratio r = 1/2.

Thus 1 + 1/2 + 1/4 + 1/8 + … is upper bounded by a/(1-r) = 1/(1-1/2) = 2.

So, we get the upper bound of the sequence as 2 x n.

Thus the time complexity will be O(n).
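A Python translation of the C++ code above, returning k instead of printing it (geometric_count is an illustrative name), shows k staying below 2n:

```python
def geometric_count(n):
    # Python version of the C++ fun above, returning k
    k = 0
    i = n
    while i > 0:
        for j in range(i):
            k += 1
        i //= 2
    return k

for n in (100, 1000, 10000):
    print(n, geometric_count(n), 2 * n)  # k always stays below 2 * n
```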

Q14. What is the time complexity for the following program?

def calc(arr,N):
    j=0
    for i in range(N):
        while j<N and arr[i]<arr[j]:
            j+=1

Answer: O(N).

Here j is only ever incremented and never reset across iterations of the outer loop, so the while-loop body executes at most N times in total. Counting one failed condition check per outer iteration as well, the while condition is evaluated at most about 2 x N times.

Thus the time complexity is O(N).
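We can verify the 2 x N bound by counting every evaluation of the while condition (calc_counted is an instrumented copy, added for illustration):

```python
def calc_counted(arr, N):
    # counts every evaluation of the while-loop condition
    j = 0
    checks = 0
    for i in range(N):
        while True:
            checks += 1
            if not (j < N and arr[i] < arr[j]):
                break
            j += 1
    return checks

arr = list(range(1000, 0, -1))      # strictly decreasing: inner loop does the most work
print(calc_counted(arr, len(arr)))  # 1999 checks, i.e. under 2 * N
```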

Q15. What are the time and space complexity for the following program?

maxa=50
dp=[-1 for i in range(maxa+1)]

def fib(n):
    if dp[n]!=-1:
        return dp[n]
    if n<=1:
        return n
    dp[n]=fib(n-1)+fib(n-2)
    return dp[n]

print(fib(maxa))

Ans:

Time complexity: O(n) and space complexity: O(n)

Here the dp array is used for memoization: it stores fib(n) once it has been calculated. So while computing fib(n), dp[n-1] and dp[n-2] are already available, and each fib(i) takes O(1) work beyond a single recursive call, i.e. T(i) = O(1) + T(i-1). Thus the time complexity is O(n).

We make an array of size n, and the maximum depth of the recursion is also n, so the space complexity is O(n).
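For comparison, a bottom-up iterative version (an additional optimisation sketch, not part of the code above) computes the same values in O(n) time with only O(1) extra space:

```python
def fib_iterative(n):
    # Bottom-up Fibonacci: O(n) time, O(1) extra space,
    # keeping only the last two values instead of a dp array.
    if n <= 1:
        return n
    prev, curr = 0, 1
    for _ in range(2, n + 1):
        prev, curr = curr, prev + curr
    return curr

print(fib_iterative(50))  # 12586269025
```

This avoids both the dp array and the recursion stack, at the cost of losing the cached intermediate values.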

Frequently Asked Questions

1. Is time complexity always greater than or equal to space complexity?

Yes, because if we allocate space of size x, then filling or even touching that space already takes O(x) time. Together with the time spent in other loops and conditions, the time complexity is therefore always greater than or equal to the space complexity.

2. How to find out time complexity in an alternative easy way?

An easy way to find the time complexity is to add counter+=1 in the innermost loop. Using the final counter values for different inputs, we can find a relation between the input size and the count and thus estimate the approximate time complexity.

3. How to measure the space taken by the array?

In Python, we can get the space taken by an array using the sys.getsizeof() function. For example, sys.getsizeof(arr) gives the memory in bytes used by the list object arr itself (not including the objects its elements refer to).
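A quick sketch of measuring list memory (exact byte counts vary by Python version and platform, so none are asserted here):

```python
import sys

arr = [0] * 1000
print(sys.getsizeof(arr))     # bytes for the list header plus its pointer array
print(sys.getsizeof(arr[0]))  # each element is a separate int object
```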

4. What is the need to optimise time and space?

Optimisation of time is essential in writing efficient programs. A merge sort on an ordinary computer can beat a bubble sort on a supercomputer. Similarly, using too much memory can cause memory-limit or runtime errors. Thus the efficiency of programs plays a vital role. A good program is both time- and space-efficient.

5. What are the ways to optimise time and space?

We can optimise time and space complexity by using better algorithms and proper data structures. Removing unnecessary arrays can save space. Similarly, data structures like HashMaps can be beneficial in searching an element.

Conclusion

Here, we learned about the time and space complexity of various programs. You can copy the programs and run them on your local machine for a better understanding, and edit them at your convenience to check the output.