Imagine you searched for something on Google and it took more than an hour to display the results. Would you ever use Google Search again? Probably not. This is why time complexity and space complexity are essential factors in determining whether or not an algorithm is good.

Google's search algorithm is just one example. Algorithms turn up all over the place, pretty much anywhere a computer is involved.

Therefore, companies hire skilled programmers who can write efficient code, because understanding the performance of an algorithm can help a business scale beyond its wildest expectations.

This article will discuss the primary definitions of time complexity and space complexity, followed by various examples that demonstrate how to evaluate them.

Code 1 prints Hello! once and stops the execution.

In contrast, code 2 prints Hello! 5 times, or more generally N times, where N is the number entered by the user, and then stops the execution.
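The two snippets are not reproduced in the text above, so here is a minimal sketch of what they might look like, folded into one hypothetical helper (the printHello function and its return value are our own illustration):

```cpp
#include <iostream>

// Prints "Hello!" n times and returns how many times it printed.
// Called with 1 it behaves like Code 1; called with a user-entered N
// it behaves like Code 2.
int printHello(int n) {
    for (int i = 0; i < n; i++) {
        std::cout << "Hello!\n";   // this statement runs n times
    }
    return n;
}
```

Calling printHello(1) does a constant amount of work, O(1); printHello(N) does work proportional to N, O(N).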

When you run them with a C++ compiler, you may see a message such as "The code was executed in 0.2 seconds." Many of us think this measured duration is the time complexity, but that is not the case.

Instead, we can infer from the above example that the time required to complete the program is proportional to the total number of times its statements get executed.
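One way to see this counting idea is to instrument a loop with an operation counter; the countOps helper below is our own illustration, not part of the original example:

```cpp
// Runs a simple summation loop and tallies how many basic
// statements execute along the way.
long long countOps(int n) {
    long long ops = 0;
    int sum = 0;
    ops++;                       // counts the initialisation of sum
    for (int i = 0; i < n; i++) {
        sum += i;
        ops++;                   // one addition per loop iteration
    }
    return ops;                  // n + 1: grows linearly with n
}
```

Doubling n roughly doubles the count, which is exactly the growth that time complexity captures.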

More precisely, time complexity deals with the amount of time required by the algorithm or program to execute to completion.

You may be wondering whether we use precise clock readings to express time complexity. We do not; instead, time complexity is expressed using asymptotic notation, which describes how the running time grows as the input size grows.

If you are unfamiliar with asymptotic notation, do read about it before continuing.

How to Find Time Complexity

The steps involved in finding the time complexity of an algorithm are:

Find the number of statements with constant time complexity (O(1)).

Find the number of statements with higher orders of complexity like O(N), O(N^{2}), O(log N), etc.

Express the total time complexity as the sum of these terms.

Drop the non-dominant terms: anything expressed as O(N^{2} + N) can be written as O(N^{2}), keeping only the highest-order term.

Ignore constant factors, if any; they do not affect the overall time complexity.

For example, anything you might write as O(3N) is simply O(N).
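The steps above can be sketched by literally counting operations for a snippet with a nested loop followed by a single loop; the helper name is hypothetical:

```cpp
// Counts basic operations for a doubly nested loop (N * N work)
// followed by a single loop (N more work).  The exact total is
// N^2 + N, which we report as O(N^2) after dropping the
// non-dominant N term.
long long operationCount(int n) {
    long long ops = 0;
    for (int i = 0; i < n; i++)
        for (int j = 0; j < n; j++)
            ops++;               // runs N * N times
    for (int i = 0; i < n; i++)
        ops++;                   // runs N more times
    return ops;                  // n*n + n
}
```

For n = 10 the count is 110; the 100 from the nested loop dominates, which is why the lower-order term is dropped.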

Now that we know how to find an algorithm's time complexity, let us look at a few examples to understand the evaluation better.

Time Complexity Calculation Examples

C++

#include <iostream>
using namespace std;

int main()
{
    int a, b, c;
    cout << "Enter two numbers: ";
    cin >> a >> b;
    c = a + b;
    cout << a << " + " << b << " = " << c;
    return 0;
}


The code above contains only initialisation, input and output statements, an arithmetic operation, and a return statement. All of these have a time complexity of O(1).

So the time complexity of the entire code is O(1).

C++

#include <iostream>
using namespace std;

int main()
{
    int N, sum = 0;
    cout << "Enter the length of the array: ";
    cin >> N;
    int arr[N];
    for (int i = 0; i < N; i++) // time complexity O(N)
    {
        cout << "Enter the element in position " << i << ": ";
        cin >> arr[i];
        sum += arr[i];
    }
    cout << "The sum of the elements is " << sum;
    return 0;
}


Enter the length of the array: 5
Enter the element in position 0: 1
Enter the element in position 1: 2
Enter the element in position 2: 3
Enter the element in position 3: 4
Enter the element in position 4: 5
The sum of the elements is 15

In the code above, there is a for loop in addition to initialisation, input and output statements, an arithmetic operation and a return statement.

Apart from the loop, the rest of the lines have a time complexity O(1).

The for loop will run N times, so its time complexity will be O(N).

Thus, the entire code will have a time complexity of O(N + 9), where 9 counts the constant-time statements; the constant is neglected for large values of N, so the time complexity is O(N), also known as linear time.

C++

#include <iostream>
using namespace std;

int main()
{
    int N, sum = 0;
    cout << "Enter the dimensions of the square matrix: ";
    cin >> N;
    int arr[N][N];
    for (int i = 0; i < N; i++) // outer loop
    {
        for (int j = 0; j < N; j++) // inner loop
        {
            cout << "Enter the element in position " << i << "," << j << ": ";
            cin >> arr[i][j];
            sum += arr[i][j];
        }
    }
    cout << "The sum of the elements is " << sum;
    return 0;
}


Enter the dimensions of the square matrix: 2
Enter the element in position 0,0: 1
Enter the element in position 0,1: 2
Enter the element in position 1,0: 3
Enter the element in position 1,1: 4
The sum of the elements is 10

In the above two examples, we already discussed the time complexity of for loop and other statements.

But here, the loops are nested, which implies that the inner loop will run N times for one iteration of the outer loop.

Therefore it will run N × N times, so its time complexity will be O(N^{2}). Thus, the entire code will have a time complexity of O(N^{2} + 9), where 9 is the sum of the O(1) complexities of the remaining statements.

As we already know, for large values of n, the constant is usually neglected, and hence the time complexity will be O(N^{2}).
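The snippet analysed next is not shown in the text, so here is a minimal reconstruction of a loop that repeatedly halves a number (the halveDown helper and its step counter are our own additions):

```cpp
#include <iostream>

// Prints N, N/2, N/4, ... down to 1 and returns how many values
// were printed.  The loop body runs about floor(log2 N) + 1 times.
int halveDown(int N) {
    int steps = 0;
    while (N >= 1) {
        std::cout << N << " ";
        N = N / 2;        // the value halves on every iteration
        steps++;
    }
    return steps;
}
```

For example, halveDown(8) prints 8 4 2 1 and halveDown(10) prints 10 5 2 1, matching the sequences discussed below.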

Next, consider a program with a while loop that repeatedly halves a number, in addition to initialisation, input and output statements, an arithmetic operation and a return statement.

Let us examine mathematically how this while loop works here.

Suppose N = 8; then we get the series 8 4 2 1.

Likewise, if we try a different input such as 10, we get 10 5 2 1.

We can see that in each iteration, the number is reduced to half its previous value until the loop condition is violated.

Such algorithms of breaking a number or set of numbers into halves fall under logarithmic complexity O(log N).

If you are still unsure, we recommend trying larger inputs and checking that the number of iterations grows like log(N).

Also, you can go through the Binary Search algorithm, which takes O(log N) time, to understand logarithmic complexity in depth. The list does not end here; we have log-linear time O(N log N), cubic time complexity O(N^{3}), exponential time complexity O(2^{N}), factorial time complexity O(N!) and many more.
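As a concrete O(log N) example, a standard iterative binary search can be sketched as follows (this is a generic textbook version, not code from the article):

```cpp
#include <vector>

// Returns the index of target in the sorted vector, or -1 if absent.
// Each iteration halves the remaining search range, so at most about
// log2(N) iterations run: O(log N) time, O(1) extra space.
int binarySearch(const std::vector<int>& a, int target) {
    int lo = 0, hi = (int)a.size() - 1;
    while (lo <= hi) {
        int mid = lo + (hi - lo) / 2;   // midpoint without overflow
        if (a[mid] == target) return mid;
        if (a[mid] < target) lo = mid + 1;
        else hi = mid - 1;
    }
    return -1;
}
```

Note that the input must already be sorted; the halving of the search range is what produces the logarithmic behaviour.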

What is Space Complexity?

Space complexity deals with the amount of memory required in a system for a particular algorithm. Now, before we dive deeper into the concept, we must know what auxiliary space is.

Auxiliary space is the extra space used by an algorithm, which on adding with the space taken by the input values, gives us space complexity.

Space complexity = Auxiliary Space + Space taken by input values

Note: Space complexity isn't simply the direct sum of the auxiliary space and the space taken by the input values; it is the maximum possible value of that sum for a particular algorithm.

To understand space complexity better, let us consider the simple example of printing the Fibonacci series with and without recursion.

1. Fibonacci Number Without Recursion

C++

#include <iostream>
using namespace std;

int main()
{
    int a = 0, b = 1, c, i, num;
    cout << "Enter the number of elements: ";
    cin >> num;
    cout << a << " " << b << " ";
    for (i = 2; i < num; i++)
    {
        c = a + b;
        cout << c << " ";
        a = b;
        b = c;
    }
    return 0;
}


2. Fibonacci Number With Recursion

#include <iostream>
using namespace std;

int Fibonacci(int n)
{
    if ((n == 1) || (n == 0))
        return n;
    else
        return Fibonacci(n - 1) + Fibonacci(n - 2);
}

int main()
{
    int num, i = 0;
    cout << "Enter the number of terms of series: ";
    cin >> num;
    while (i < num)
    {
        cout << " " << Fibonacci(i);
        i++;
    }
    return 0;
}

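The two versions above differ in memory use: the iterative one keeps only a few variables (O(1) auxiliary space), while the recursive one keeps one stack frame per active call. A sketch that tracks recursion depth (the depth parameters are our own instrumentation, not part of the original code):

```cpp
#include <algorithm>

// Recursive Fibonacci instrumented with the current call depth.
// maxDepth ends up roughly n, reflecting the O(n) stack space the
// recursive version needs on top of its exponential running time.
int fib(int n, int depth, int& maxDepth) {
    maxDepth = std::max(maxDepth, depth);
    if (n == 0 || n == 1) return n;
    return fib(n - 1, depth + 1, maxDepth) + fib(n - 2, depth + 1, maxDepth);
}
```

For n = 10 the deepest call chain is ten frames, so the recursive version's auxiliary space grows linearly with n.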

The following steps describe how to find the space complexity of an algorithm:

Find the number of variables of different data types initialised in the algorithm.

Find the memory required by each data type (this can be done using the sizeof() operator).

Multiply the memory required by each data type by the number of variables of that type present; this gives the total memory used by each data type.

Add the memory required by each data type.

Ignore the constants, if any, in the sum (as explained for time complexity) to obtain the space complexity in Big O notation.
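The steps above can be sketched with the sizeof operator; the helper below is illustrative, and the actual sizes vary by compiler and platform:

```cpp
#include <cstddef>

// Steps 1-4: multiply the count of variables of each type by that
// type's size (from sizeof) and sum across types.  When the total is
// independent of any input size, step 5 reports O(1) space.
std::size_t variableBytes(int intCount, int doubleCount) {
    return intCount * sizeof(int) + doubleCount * sizeof(double);
}
```

For example, variableBytes(3, 0) gives 12 on a typical compiler where sizeof(int) is 4 bytes.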

Now that we know how to find the space complexity, let us see a few examples to understand it better.

Space Complexity Calculation Examples

To understand how to calculate the space complexity of an algorithm, let us consider a few examples.

C++

#include <iostream>
using namespace std;

int main()
{
    int a, b, c;
    cout << "Enter the value of a and b ";
    cin >> a >> b;
    c = a + b;
    cout << "The sum is " << c;
    return 0;
}


In the code shown above, three integer-type variables are used. Now, the size of an integer-type variable is usually 2 or 4 bytes, depending on the compiler.

If we assume it to be 4 bytes in our case, the total memory required for the code is 4 x 3 = 12 bytes, which is a constant. So the space complexity is O(1).

C++

C++

#include <iostream>
using namespace std;

int main()
{
    int n, sum = 0;
    cout << "Enter the number of elements ";
    cin >> n;
    int arr[n];
    for (int i = 0; i < n; i++)
    {
        cout << "Enter the element in position " << i << " ";
        cin >> arr[i];
        sum = sum + arr[i];
    }
    cout << sum;
    return 0;
}

Here, in addition to a few integer variables, an array of n integers is allocated, so the memory required grows linearly with the input size and the space complexity is O(N).


A valid algorithm executes in a finite period of time. The time required by the algorithm to solve a particular problem is referred to as the algorithm's time complexity. In algorithm analysis, time complexity is a very valuable statistic. It is the amount of time required to complete an algorithm. To calculate the time complexity, we must take into account the cost of each fundamental instruction as well as the number of times the instruction is executed.

The amount of memory space that an algorithm or problem requires during execution is referred to as its space complexity. Space complexity counts not only the space needed by the variables in the algorithm but also the space taken by the input values. Constant space complexity can also help execution speed, since the algorithm does not need to allocate or deallocate memory dynamically.

Importance of Time and Space Complexity in Data Structures

Developers of real-world programs are constrained by the physical memory of the platforms they plan to run on. This is where space complexity comes into play, as we never want to run a function or process that consumes more space than the system has available at any one time. On the other hand, we don't want our operations to take excessive time so that they bog down and slow down our apps.

In cases of higher memory requirements, this will cause memory allocation to get slower, affecting the time complexity. Also, in the case of high volume input data, the algorithm tends to take more time to read, increasing the time complexity as well.

Can time and space complexity be different?

Yes, time and space complexity usually differ. For example, linear search has a worst-case time complexity of O(N), whereas its space complexity remains O(1). The relationship also varies with the input and its size.
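A sketch of the linear search mentioned above, assuming a plain front-to-back scan:

```cpp
#include <vector>

// Scans the vector front to back: in the worst case every element is
// inspected, giving O(N) time, while only the loop index is stored
// beyond the input itself: O(1) auxiliary space.
int linearSearch(const std::vector<int>& a, int target) {
    for (int i = 0; i < (int)a.size(); i++) {
        if (a[i] == target) return i;
    }
    return -1;
}
```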

What is more important, time complexity and space complexity?

Both are equally important: time and space should each be optimised as much as possible, but the balance varies with the problem. Some problems must be solved as fast as possible no matter how much space is required, and vice versa.

Can space complexity be larger than time complexity?

Yes, this is definitely possible. In the case of an algorithm that has to write a large amount of output while storing variables and data structures, the space complexity will be higher than the time complexity.

Conclusion

Here, we talked about how time complexity deals with the amount of time required to execute a program, while space complexity deals with the memory requirements. We learned to compute both Time Complexity and Space Complexity and also how important it is to keep time and space complexity in mind to write good code.

As the saying goes, practice makes perfect. So, don't forget to practice the wide variety of DSA problems frequently asked in interview rounds, readily available on Code360. This will help you master efficient coding techniques, with a bonus of interview experiences from scholars at big product-based companies.
