Table of contents
1. Introduction
2. What is time complexity?
3. What is space complexity?
4. Let us see some time complexity expressions and their meaning
5. Comparison of time complexities from best to worst
6. FAQs
7. Key Takeaways
Last Updated: Mar 27, 2024

Time & Space Complexity for loops

Author: Het Fadia

Introduction

This article aims to familiarise you with the time and space complexity for loops.

To write an efficient algorithm, we must understand its time and space complexity. By optimising the time complexity, we make the code run faster and scale better with larger inputs.

To brush up on the basics of complexity analysis, you can read the article Time and Space Complexity on Coding Ninjas Studio.

What is time complexity?

Time complexity describes how the running time of an algorithm grows as a function of the input size.
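
For a quick illustration of this idea (an addition, not from the original article), the sketch below times a simple linear loop for two input sizes; roughly doubling n roughly doubles the running time, which is what O(n) growth predicts. The helper name linear_work exists only for this example.

import time

def linear_work(n):
    # A loop that does a constant amount of work n times: O(n) time.
    total = 0
    for i in range(n):
        total += i
    return total

for n in (1_000_000, 2_000_000):
    start = time.perf_counter()
    linear_work(n)
    elapsed = time.perf_counter() - start
    print(f"n={n}: {elapsed:.4f} s")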

Read more about Data Structures and Algorithms and the Rabin Karp Algorithm.

What is space complexity?

Space complexity describes how much total memory an algorithm uses as a function of the input size.
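
As a quick added illustration, the memory taken by a Python list grows roughly linearly with the number of elements it stores. sys.getsizeof reports only the size of the list object itself (not of the elements it references), but that is enough to show the trend:

import sys

for n in (1_000, 10_000, 100_000):
    arr = [0] * n
    print(n, sys.getsizeof(arr), "bytes")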

Let us see some time complexity expressions and their meaning

Time complexity | Meaning
O(1) | Constant time
O(log n) | Logarithmic time
O(n) | Linear time
O(n^2) | Quadratic time
O(n^4 + n^2) | Polynomial time
O(2^n) | Exponential time
O(n!) | Factorial time

Let us see some examples below

1) O(1) or constant time complexity

A program is said to have constant time complexity if its running time does not depend on the input size. Such a program contains no recursion, no loops whose length depends on the input, and no calls to non-constant functions. A loop that runs a fixed, constant number of times also counts as constant time.

Similarly, if the program has a constant space allocation, it is said to have constant space complexity.

Below are some examples of constant time complexity and constant space complexity:

# O(1) time and O(1) space: a fixed number of operations, no input-dependent loops.
a = 5
b = 6
print(a + b)

# A loop that runs a fixed, constant number of times is still O(1) time.
constant = 5
for i in range(constant):
    print(i)

 

2) Logarithmic time complexity

A program is said to have O(log(n)) time complexity if it contains loops that iterate a constant multiple of log(n) times and/or recursive functions that call themselves a constant multiple of log(n) times.

def bisect_left(arr, item):
    # arr is a sorted array; returns the first index at which item can be
    # inserted while keeping arr sorted.
    low = 0
    high = len(arr)
    while low < high:
        mid = low + (high - low) // 2  # the search range halves every iteration
        if arr[mid] >= item:
            high = mid
        else:
            low = mid + 1
    return low

 

The binary search algorithm shown above has logarithmic time complexity because the search range is halved on every iteration.

i = 1
c = 5      # c is a constant
n = 1000
while i < n:
    i *= c  # i grows by a factor of c every iteration

 

The above code also has logarithmic time complexity. After successive iterations, the value of i is 1, c, c^2, c^3, ..., c^k. Finally, i becomes c^(k+1) >= n and the loop breaks. Since c^k < n, we have k <= log_c(n), so the loop runs O(log n) times.

Figure showing logarithmic time complexity.
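
To sanity-check this, here is a small added sketch (not from the original article) that counts the iterations of the loop above and compares the count with the ceiling of log base c of n:

import math

i = 1
c = 5
n = 1000
iterations = 0
while i < n:
    i *= c
    iterations += 1

print(iterations)                 # 5 iterations of the loop body
print(math.ceil(math.log(n, c)))  # ceil(log_c(n)) = 5, matching the count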

3) O(n) time complexity

A program is said to have O(n) time complexity if it contains loops that iterate a constant multiple of n times and/or recursive functions that call themselves a constant multiple of n times. The program may have both loops and recursive functions.

n = 100
# The loop body runs 2 * n times, a constant multiple of n.
for i in range(2 * n):
    print(i)

 

Here the loop runs 2 x n times. Since constant factors are dropped in Big O notation, the time complexity is O(n).

n = 100

def fib(n):
    # fib(n) calls fib(n - 1), which calls fib(n - 2), and so on down to fib(1).
    if n == 1:
        return
    fib(n - 1)

fib(n)

Here the recursive function calls itself about n times, as the diagram below shows: fib(100) calls fib(99), which calls fib(98), and so on. fib(1) makes no further call, so the recursion stops there.

Thus time complexity = O(N).

Figure showing recursion tree of fib(n).

O(n) space complexity

A program is said to have O(n) space complexity if it allocates an array whose size is a constant multiple of n and/or has a recursion tree whose depth is a constant multiple of n.

n = 2000
# Allocates a list of n // 2 elements, a constant multiple of n.
l = [0] * (n // 2)

 

Here we create an array of size n/2. Since n/2 = (1/2) x n, the size of the array is a constant multiple of n, so the space complexity is O(n).

n = 100

def fib(n):
    # Each nested call adds one frame to the call stack, so the recursion depth is n.
    if n == 1:
        return
    fib(n - 1)

fib(n)

 

The above function has a recursion depth of n, and each level of recursion occupies one stack frame. Thus the space complexity is O(n).

4) Polynomial-time complexity

A program is said to have polynomial time complexity if its time complexity can be expressed as O(n^c) for some constant c.

All programs with linear time complexity (O(n)), quadratic time complexity (O(n^2)), or cubic time complexity (O(n^3)) are said to have polynomial time complexity.

Bubble sort and insertion sort are examples of algorithms with polynomial time complexity; a sketch of bubble sort is shown below.
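
The original article gives no code sample here, so the following is a minimal sketch of bubble sort: its two nested loops perform on the order of n x n comparisons, giving O(n^2), i.e. polynomial, time complexity.

n = 100
arr = list(range(n, 0, -1))  # a reversed list as sample input

# Bubble sort: repeatedly swap adjacent out-of-order elements.
for i in range(len(arr)):
    for j in range(len(arr) - 1 - i):
        if arr[j] > arr[j + 1]:
            arr[j], arr[j + 1] = arr[j + 1], arr[j]

print(arr[:5])  # the array is now sorted: [1, 2, 3, 4, 5]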

5) Exponential time complexity

A program is said to have exponential time complexity if its time complexity is of the form O(c^n), where c is a constant.

The following program has exponential time complexity:

def fib(n):
    # Each call spawns two further calls, so the number of calls grows roughly as 2^n.
    if n <= 1:
        return n
    return fib(n - 1) + fib(n - 2)

The figure below shows the recursion tree for calculating fib(5).


Figure showing exponential time complexity.

 

6) Factorial time complexity

A program is said to have factorial time complexity if it has a time complexity of the form O(n!).

from itertools import permutations

n = 5
arr = list(range(n))

# There are n! permutations of arr, so this loop runs n! times.
for p in permutations(arr):
    pass

 

The above program has factorial time complexity. It iterates over all permutations of the array arr. Since an array of n elements has n! permutations, the loop runs n! times, and producing each permutation (a tuple of n elements) costs O(n). So the total time complexity is O(n! x n).

 

Also see Morris Traversal for Inorder.

Also see Longest Common Substring.

Comparison of time complexities from best to worst

O(1) > O(log(n)) > O(n) > O(n x log(n)) > O(n^2) > O(n^3) > O(2^n) > O(n!) > O(n^n)

Here ">" means "better than": the complexity on the left grows more slowly than the one on its right.

Figure showing the comparison between different time complexities.

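To get a feel for these growth rates, here is a small added sketch (not from the original article) that prints the value of each function for n = 10; the results appear in exactly the order listed above:

import math

n = 10
print("log n   :", math.log2(n))
print("n       :", n)
print("n log n :", n * math.log2(n))
print("n^2     :", n ** 2)
print("n^3     :", n ** 3)
print("2^n     :", 2 ** n)
print("n!      :", math.factorial(n))
print("n^n     :", n ** n)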

Must read: What are Loops in Java.

FAQs

  1. Is time complexity always greater than or equal to space complexity?
    Yes. If an algorithm allocates x units of space, it needs at least O(x) time just to touch that space, so the time complexity is always at least as large as the space complexity.
     
  2. Which algorithm is better, one having time complexity O(2^n) or one having time complexity O(n!)?
    O(n!) grows much faster than O(2^n), as the graph above also shows. Thus an algorithm with O(2^n) time complexity is much better than one with factorial time complexity.
     
  3. Which algorithm is better, one having time complexity O(n) or one having time complexity O(log(n)^3)?
    O(n) grows much faster than O(log(n)^3). At n = one million, the first has value one million, whereas the latter has value about 20^3 = 8000 (since log2 of one million is roughly 20). Thus an algorithm with O(log(n)^3) time complexity is much better than one with O(n) time complexity.
     
  4. What is the need to optimise time and space?
    Optimisation of time is essential for writing efficient programs. A merge sort on an ordinary computer can beat bubble sort on a supercomputer. Similarly, using too much memory can exceed the available limits and cause runtime errors.
    Thus the efficiency of programs plays a vital role. A good program is both time and space-efficient.
     
  5. What are the ways to optimise time and space?
    We can optimise time and space complexity by using better algorithms and appropriate data structures. Removing unnecessary arrays saves space, and hash-based structures such as HashMaps make searching for an element much faster, as the sketch below illustrates.
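
As an added illustration (not part of the original FAQ), the sketch below compares a linear O(n) membership test on a Python list with the average O(1) hash-based lookup of a set:

import time

n = 1_000_000
data_list = list(range(n))
data_set = set(data_list)
target = n - 1  # worst case for the linear search

start = time.perf_counter()
_ = target in data_list  # O(n): scans the list element by element
print("list lookup:", time.perf_counter() - start)

start = time.perf_counter()
_ = target in data_set   # average O(1): hash lookup
print("set lookup :", time.perf_counter() - start)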

Key Takeaways

In this article, we covered the basics of time and space complexity, why they matter when designing an algorithm, and looked at example programs with different time complexities.


Read more about Characteristics of OOPS here.

Check out Coding Ninjas Studio to learn more about competitive programming, DSA, attempt mock tests, interview preparation and much more.

Happy Learning!!!
