This article aims to familiarise you with the time and space complexity for loops.

To write an efficient algorithm, we must understand its time and space complexity. By optimising the time complexity, the code becomes efficient and runs faster.

To brush up on your knowledge of the basic complexity, you can read the article Time and Space complexity on Coding Ninjas Studio.

What is time complexity?

The amount of time an algorithm takes to run as a function of the input size is known as time complexity.

What is space complexity?

The total space an algorithm uses as a function of the input size is known as space complexity.

Let us see some common time complexity expressions and their meanings:

O(1): Constant time complexity
O(log n): Logarithmic time complexity
O(n): Linear time complexity
O(n^2): Quadratic time complexity
O(n^4 + n^2): Polynomial time complexity
O(2^n): Exponential time complexity
O(n!): Factorial time complexity

Let us see some examples below

1) O(1) or constant time complexity

A program is said to have constant time complexity if its running time is independent of the input size. Such a program contains no recursion, no loops whose length depends on the input, and no calls to non-constant-time functions. A loop that runs a constant number of times also has constant time complexity.

Similarly, if the program has a constant space allocation, it is said to have constant space complexity.

Below are some examples of constant time complexity and constant space complexity:

a = 5
b = 6
print(a + b)

constant = 5
for i in range(constant):
    print(i)

2) Logarithmic time complexity

A program is said to have O(log(n)) time complexity if it contains loops that iterate a constant multiple of log(n) times and/or recursive functions that call themselves a constant multiple of log(n) times.

def bisect_left(arr, item):
    # arr is a sorted array
    low = 0
    high = len(arr)
    while low < high:
        mid = low + (high - low) // 2
        if arr[mid] >= item:
            high = mid
        else:
            low = mid + 1
    return low

The binary search algorithm as seen above has logarithmic time complexity.
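Python's standard library ships the same routine as bisect.bisect_left, which we can use to sanity-check the implementation above:

```python
import bisect

arr = [1, 3, 5, 7, 9]
# index of the first element >= the searched item
print(bisect.bisect_left(arr, 5))  # 2
print(bisect.bisect_left(arr, 6))  # 3
```

Each call inspects only O(log n) elements, which is what makes binary search fast on large sorted arrays.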

i = 1
c = 5  # c is a constant
n = 1000
while i < n:
    i *= c

The above code also has logarithmic time complexity. After successive iterations, the value of i is 1, c, c^2, c^3, ..., and the loop stops at the first power c^k with c^k >= n. Since c^(k-1) < n, we get k <= log_c(n) + 1, so the loop runs O(log n) times.

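We can verify this count directly by instrumenting the loop (a small sketch of our own; the iterations counter is not part of the original code):

```python
import math

c = 5       # the constant multiplier
n = 1000
i = 1
iterations = 0
while i < n:
    i *= c
    iterations += 1

print(iterations)                 # 5: i goes 1 -> 5 -> 25 -> 125 -> 625 -> 3125
print(math.ceil(math.log(n, c)))  # log_5(1000) rounded up is also 5
```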

3) O(n) time complexity

A program is said to have O(n) time complexity if it contains loops that iterate a constant multiple of n times and/or recursive functions that call themselves a constant multiple of n times. The program may have both loops and recursive functions.

n = 100
for i in range(2 * n):
    print(i)

Here the loop runs 2 x n times. Thus the time complexity = O(n).

n = 100
def fib(n):
    if n == 1:
        return
    fib(n - 1)
fib(n)

Here the recursive function calls itself n times, as the diagram below shows: fib(100) calls fib(99), which calls fib(98), and so on. fib(1) doesn't call any function, so the recursion stops there.

Thus time complexity = O(N).

Figure showing recursion tree of fib(n).

O(n) space complexity

A program is said to have O(n) space complexity if it allocates an array whose size is a constant multiple of n and/or has a recursion tree whose depth is a constant multiple of n.

n=2000
l=[0]*(n//2)

Here we are making an array of size n/2. Since n/2 = ½ x n = constant x n, we can represent the size of the array as a constant multiple of n. Thus space complexity is O(N).

n = 100
def fib(n):
    if n == 1:
        return
    fib(n - 1)
fib(n)

The above recursion reaches a depth of n, and each level occupies a stack frame. Thus the space complexity is O(n).

4) Polynomial-time complexity

A program is said to have polynomial time complexity if its time complexity can be written in the form O(n^c) for some constant c.

All programs with linear time complexity (O(n)), quadratic time complexity (O(n^2)), or cubic time complexity (O(n^3)) have polynomial time complexity.

Examples of algorithms with polynomial time complexity include bubble sort and insertion sort.
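For instance, a minimal bubble sort (a sketch, not a library routine) has two nested loops that each run at most n times, giving O(n^2):

```python
def bubble_sort(arr):
    # repeatedly swap adjacent out-of-order pairs;
    # the outer and inner loops each run O(n) times -> O(n^2) overall
    n = len(arr)
    for i in range(n):
        for j in range(n - i - 1):
            if arr[j] > arr[j + 1]:
                arr[j], arr[j + 1] = arr[j + 1], arr[j]
    return arr

print(bubble_sort([5, 2, 4, 1, 3]))  # [1, 2, 3, 4, 5]
```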

5) Exponential time complexity

A program is said to have exponential time complexity if it has a time complexity of the form O(c^{N}) where c is constant.

The following program has exponential time complexity:

def fib(n):
    if n <= 1:
        return n
    return fib(n - 1) + fib(n - 2)

The figure below shows the recursion tree for calculating fib(5).
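To see the blow-up concretely, we can count the recursive calls (an instrumentation sketch of our own; fib_counted and its counter argument are not part of the original code):

```python
def fib_counted(n, counter):
    # same recursion as fib(n), but counts every call made
    counter[0] += 1
    if n <= 1:
        return n
    return fib_counted(n - 1, counter) + fib_counted(n - 2, counter)

for n in (10, 15, 20):
    counter = [0]
    fib_counted(n, counter)
    print(n, counter[0])  # 10 -> 177 calls, 15 -> 1973, 20 -> 21891
```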

6) Factorial time complexity

A program is said to have factorial time complexity if it has a time complexity of the form O(n!).

from itertools import permutations

n = 5
arr = list(range(n))
for i in permutations(arr):
    print(i)

The above program has factorial time complexity: it prints all the permutations of the array arr. Since an array of n elements has n! permutations, the loop runs n! times, and printing each permutation costs O(n). So the time complexity = O(n! x n).

Is time complexity always greater than or equal to space complexity? Yes: allocating x units of space itself takes O(x) time, and loops and other statements only add to the running time, so the time complexity is always at least the space complexity.

Which algorithm is better, one with time complexity O(2^n) or one with time complexity O(n!)? O(n!) grows much faster than O(2^n), as a graph of the two functions shows. Thus an algorithm with O(2^n) time complexity is much better than one with factorial time complexity.
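Plugging in a few values (a quick sketch using math.factorial from the standard library) makes the gap obvious:

```python
import math

for n in (5, 10, 15):
    # exponential growth vs factorial growth
    print(n, 2 ** n, math.factorial(n))
# at n = 15: 2^15 = 32768, while 15! = 1307674368000
```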

Which algorithm is better, one with time complexity O(n) or one with time complexity O(log(n)^3)? O(n) grows much faster than O(log(n)^3). At n = one million, the first has the value one million, whereas the latter has a value of roughly 20^3 = 8000. Thus an algorithm with O(log(n)^3) time complexity is much better than one with O(n).
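We can check those numbers directly (a small sketch using the standard library):

```python
import math

n = 10 ** 6
# linear vs cubed-logarithm growth at n = one million
print(n)                  # 1000000
print(math.log2(n) ** 3)  # about 7918, close to 20^3 = 8000
```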

What is the need to optimise time and space? Optimising time is essential for writing efficient programs: merge sort on an ordinary computer can beat bubble sort on a supercomputer. Similarly, using too much memory can cause out-of-memory errors at runtime. Thus the efficiency of a program plays a vital role; a good program is both time- and space-efficient.

What are the ways to optimise time and space? We can optimise time and space complexity by using better algorithms and appropriate data structures. Removing unnecessary arrays saves space, and data structures like hash maps make searching for an element much faster.
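For example (a sketch of our own), testing membership in a list scans up to n elements, while a dict (hash map) lookup takes O(1) time on average:

```python
n = 100000
items = list(range(n))
lookup = {x: True for x in items}  # build the hash map once: O(n) time and space

# membership test: O(n) scan on the list vs O(1) average on the dict
print(99999 in items)   # True, but walks through the whole list
print(99999 in lookup)  # True, found with a single hash computation
```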

Key Takeaways

We covered the basics of time and space complexity, why they matter when designing an algorithm, and saw programs with a range of different time complexities.

You can also continue reading the next article: Link.