Table of contents

1. Introduction
2. What is Time Complexity?
3. What is Space Complexity?
4. Why Are Time and Space Complexities Important?
5. Types Of Time Complexity
   5.1. Best Time Complexity
   5.2. Average Time Complexity
   5.3. Worst Time Complexity
6. Sorting Algorithms Time and Space Complexity Cheat Sheet
7. Selection Sort
   7.1. Time Complexity
   7.2. Space Complexity
8. Insertion Sort
   8.1. Time Complexity
   8.2. Space Complexity
9. Bubble Sort
   9.1. Time Complexity
   9.2. Space Complexity
10. Merge Sort
   10.1. Time Complexity
   10.2. Space Complexity
11. Quick Sort
   11.1. Time Complexity
   11.2. Space Complexity
12. Heap Sort
   12.1. Time Complexity
   12.2. Space Complexity
13. Counting Sort
   13.1. Time Complexity
   13.2. Space Complexity
14. Radix Sort
   14.1. Time Complexity
   14.2. Space Complexity
15. Bucket Sort
   15.1. Time Complexity
   15.2. Space Complexity
16. Count Sort
   16.1. Time Complexity
   16.2. Space Complexity
17. Shell Sort
   17.1. Time Complexity
   17.2. Space Complexity
18. Tim Sort
   18.1. Time Complexity
   18.2. Space Complexity
19. Tree Sort
   19.1. Time Complexity
   19.2. Space Complexity
20. Cube Sort
   20.1. Time Complexity
   20.2. Space Complexity
21. Frequently Asked Questions
   21.1. What is the time complexity of sorting algorithms?
   21.2. What is the fastest time complexity for sorting?
   21.3. Which sorting algorithm has log n complexity?
   21.4. Which sorting algorithm has a space complexity of O(1)?
   21.5. Which sorting algorithm has best time complexity?
   21.6. What is the quick sort time complexity?
22. Conclusion
Last Updated: Jan 24, 2025

Space and Time Complexity of Sorting Algorithms

Author: Abhishek Ranjan
Introduction

Sorting algorithms are among the most important algorithms in computer science, and a good grasp of them is necessary for solving complex problems. Knowing the time and space complexity of different sorting algorithms helps you decide which one is best suited to a given problem.

In this article, we will discuss the space and time complexity of sorting algorithms: Selection Sort, Insertion Sort, Bubble Sort, Merge Sort, Quick Sort, Heap Sort, Counting Sort, Radix Sort, and Bucket Sort, as well as Count Sort, Shell Sort, Tim Sort, Tree Sort, and Cube Sort.

What is Time Complexity?

The amount of time an algorithm takes to run as a function of the input size is known as its time complexity. In computer science, a problem can usually be solved by several different algorithms, some very fast and some very slow. To solve a problem efficiently, we must identify the fastest algorithm, and to do that we must understand time complexity.

What is Space Complexity?

Space complexity is a measure of the memory a program consumes while operating on an input of a given size. Strictly speaking, it is the sum of the auxiliary memory used and the memory occupied by the input itself. However, this definition isn't commonly used when comparing algorithms; otherwise, the space complexity of bubble sort and merge sort would both be O(N). That is why it's clearer to say that an algorithm like bubble sort requires O(1) extra memory while merge sort requires O(N) extra memory when discussing algorithms' complexity analysis.

Why Are Time and Space Complexities Important?

Time and space complexities are important because they tell us how fast a program runs and how much memory it uses. 

Think of it like this: when you're cooking a meal, you want to know how long it will take (time complexity) and how much counter space and how many dishes you'll need (space complexity).

Time complexity gives us an idea of how the time to run a program increases as the amount of data it deals with grows. For example, a program might take 5 minutes to sort 100 names, but 50 minutes to sort 10,000 names. Knowing this helps programmers choose the fastest algorithms for their needs.

Space complexity is about how much memory a program needs to work. If a program needs a lot of memory, it might not run well on a phone or a small computer. 

Both are important because they help us make programs that run efficiently, without wasting time or memory. This is crucial for creating good software, especially for apps and systems that handle lots of data or need to work very fast.

Types Of Time Complexity

Best Time Complexity

This refers to the minimum amount of time that an algorithm requires to execute for a given input size under ideal conditions. It represents the fastest execution time achievable by the algorithm. The best-case time complexity is often denoted by the notation O(f(n)), where f(n) is a function representing the minimum number of operations required as a function of the input size n.

Average Time Complexity

Average time complexity refers to the average amount of time an algorithm takes to execute across all possible inputs of a given size. It is an estimation of the expected running time of the algorithm when considering various input distributions. The average-case time complexity is denoted by O(f(n)), where f(n) represents the average number of operations required for input size n.

Worst Time Complexity

This represents the maximum amount of time an algorithm takes to execute for a given input size, considering the most unfavorable scenario. It provides an upper bound on the running time of the algorithm and is denoted by O(f(n)), where f(n) is a function representing the maximum number of operations required as a function of the input size n. The worst-case time complexity is crucial for assessing the algorithm's performance in the most demanding situations.

Sorting Algorithms Time and Space Complexity Cheat Sheet

| Algorithm | Best Case | Average Case | Worst Case | Space Complexity (Worst Case) |
| --- | --- | --- | --- | --- |
| Selection Sort | O(N^2) | O(N^2) | O(N^2) | O(1) |
| Insertion Sort | O(N) | O(N^2) | O(N^2) | O(1) |
| Bubble Sort | O(N) | O(N^2) | O(N^2) | O(1) |
| Merge Sort | O(N log N) | O(N log N) | O(N log N) | O(N) |
| Quick Sort | O(N log N) | O(N log N) | O(N^2) | O(N) |
| Heap Sort | O(N log N) | O(N log N) | O(N log N) | O(1) |
| Counting Sort | O(N + K) | O(N + K) | O(N + K) | O(K) |
| Radix Sort | O(N * K) | O(N * K) | O(N * K) | O(N + K) |
| Bucket Sort | O(N + K) | O(N + K) | O(N^2) | O(N) |
| Count Sort | O(N + K) | O(N + K) | O(N + K) | O(N + K) |
| Shell Sort | O(N log N) | O(N (log N)^2) | O(N (log N)^2) | O(1) |
| Tim Sort | O(N) | O(N log N) | O(N log N) | O(N) |
| Tree Sort | O(N log N) | O(N log N) | O(N^2) | O(N) |
| Cube Sort | O(N) | O(N log N) | O(N log N) | O(N) |

Selection Sort

Selection Sort is one of the brute force approaches to sort an array. It divides the array into two subarrays: sorted and unsorted. Initially, the sorted subarray is empty, and the original array is the unsorted subarray. Then, it repeatedly finds the least element in the unsorted subarray and puts it at the end of the sorted subarray. 
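
A minimal Python sketch of the idea (illustrative, not from the original article):

```python
def selection_sort(arr):
    n = len(arr)
    for i in range(n - 1):
        # Find the index of the smallest element in the unsorted part arr[i:].
        min_idx = i
        for j in range(i + 1, n):
            if arr[j] < arr[min_idx]:
                min_idx = j
        # Swap it to the end of the sorted part.
        arr[i], arr[min_idx] = arr[min_idx], arr[i]
    return arr

print(selection_sort([29, 10, 14, 37, 13]))  # [10, 13, 14, 29, 37]
```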

Time Complexity 

The algorithm iterates over the unsorted subarray (N - 1) times, and after every iteration the size of the unsorted subarray shrinks by one. Thus the total number of comparisons is (N - 1) + (N - 2) + (N - 3) + ... + 1 = N * (N - 1) / 2. So, the time complexity is quadratic.

It performs the same number of comparisons for any given array of size N, so the worst, best, and average cases are equal.

Worst Case = Best Case = Average Case = O(N^2)

Space Complexity

The algorithm doesn't use any extra space, so the space complexity is constant.

Insertion Sort

Similar to selection sort, Insertion sort also divides the array into two parts: sorted and unsorted. But in insertion sort, the sorted subarray initially contains the first element. Then the algorithm repeatedly picks the elements from the unsorted subarray and puts them at the correct position in the sorted subarray. 
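
An illustrative Python sketch:

```python
def insertion_sort(arr):
    for i in range(1, len(arr)):
        key = arr[i]          # next element from the unsorted part
        j = i - 1
        # Shift larger elements of the sorted part one slot to the right.
        while j >= 0 and arr[j] > key:
            arr[j + 1] = arr[j]
            j -= 1
        arr[j + 1] = key      # insert at the correct position
    return arr

print(insertion_sort([5, 2, 4, 6, 1, 3]))  # [1, 2, 3, 4, 5, 6]
```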

Time Complexity

Best Case 

If the array is already sorted, then the algorithm picks the first element from the  unsorted subarray and puts it at the end of the sorted subarray. So the time complexity for the sorted array is O(n).

Worst Case

For an array sorted in reverse order, the algorithm picks the first element from the unsorted subarray and places it at the beginning of the sorted subarray, shifting every sorted element. Thus the total number of comparisons is N * (N - 1) / 2, so the worst-case time complexity is O(N^2).

Average Case

The average case time complexity of insertion sort is also O(N^2).

Space Complexity

The algorithm doesn't use any extra space other than the original array, so the space complexity is O(1).

Bubble Sort

Bubble sort repeatedly steps through the array, compares each pair of adjacent elements, and swaps them if they are in the wrong order. After each pass, the largest remaining element "bubbles up" to its final position at the end of the array.
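
A minimal sketch in Python, including the early-exit optimization discussed below:

```python
def bubble_sort(arr):
    n = len(arr)
    for i in range(n - 1):
        swapped = False
        # Each pass bubbles the largest remaining element to the end.
        for j in range(n - 1 - i):
            if arr[j] > arr[j + 1]:
                arr[j], arr[j + 1] = arr[j + 1], arr[j]
                swapped = True
        if not swapped:   # no swaps means already sorted: best case is linear
            break
    return arr

print(bubble_sort([6, 2, 5, 1]))  # [1, 2, 5, 6]
```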

Time complexity

Best Case

If the array is already sorted, then in the first pass, we do not perform any swap. Then, we know that no more swaps are required. So we can stop sorting. Thus the best case time complexity is linear.

Worst Case

If the array is sorted in reverse order, then in the first pass we perform (N - 1) swaps, (N - 2) in the second, (N - 3) in the third, and so on. Thus the total number of swaps is N * (N - 1) / 2. So, the overall time complexity is O(N^2).

Average Case

The average number of swaps for any random array is (N^2)/4, so the average case time complexity is O(N^2).

Space Complexity

The algorithm doesn't use any extra space other than the original array, so the space complexity is O(1).

Merge Sort

Merge sort is one of the most efficient sorting techniques. It follows the divide and conquer strategy: it first divides the original array into N subarrays of size one each and then repeatedly merges pairs of subarrays in sorted order.
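
A minimal recursive sketch in Python (illustrative; it returns a new list rather than sorting in place):

```python
def merge_sort(arr):
    if len(arr) <= 1:
        return arr
    mid = len(arr) // 2
    left = merge_sort(arr[:mid])
    right = merge_sort(arr[mid:])
    # Merge the two sorted halves into a single sorted list.
    merged = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])    # append whatever remains
    merged.extend(right[j:])
    return merged

print(merge_sort([38, 27, 43, 3, 9, 82, 10]))  # [3, 9, 10, 27, 38, 43, 82]
```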

Time complexity

Merge sort performs the same number of operations regardless of the input array. It divides the array recursively into two subarrays of equal size, which creates O(log N) levels of recursion, and merging the subarrays at each level takes linear time. So, the overall complexity is O(N log N).

Best case = worst case = average case = O(N log N)

Space complexity

An extra array of size N is needed to store the merged array. Thus the space complexity is O(N).

Quick Sort

Similar to merge sort, quick sort also uses the divide and conquer strategy. It selects an element as a pivot and partitions the array around it, moving all the elements less than the pivot to its left and all the elements greater than it to the right, then recursively sorts the two subarrays.
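
An illustrative Python sketch using the Lomuto partition scheme with the last element as the pivot (one of several common pivot choices):

```python
def quick_sort(arr, lo=0, hi=None):
    if hi is None:
        hi = len(arr) - 1
    if lo < hi:
        p = partition(arr, lo, hi)
        quick_sort(arr, lo, p - 1)   # sort elements left of the pivot
        quick_sort(arr, p + 1, hi)   # sort elements right of the pivot
    return arr

def partition(arr, lo, hi):
    pivot = arr[hi]                  # last element as pivot (Lomuto scheme)
    i = lo - 1
    for j in range(lo, hi):
        if arr[j] < pivot:
            i += 1
            arr[i], arr[j] = arr[j], arr[i]
    arr[i + 1], arr[hi] = arr[hi], arr[i + 1]
    return i + 1                     # final position of the pivot

print(quick_sort([10, 7, 8, 9, 1, 5]))  # [1, 5, 7, 8, 9, 10]
```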

Time Complexity

Time complexity of quick sort depends upon the pivot element.

Worst Case

When the array is already sorted and we select the leftmost element as the pivot, the algorithm recursively creates N subarrays of size N, N - 1, N - 2, ..., 1. As the array is sorted, each subarray requires linear time for partitioning, resulting in quadratic time complexity.

Best Case

If we pick the median element as a pivot every time, then the algorithm creates logN subarrays (similar to merge sort). So for such a case, the time complexity is O(NlogN), which is the best case time complexity.

Average Case

There are many ways to avoid the worst case of quicksort, like choosing the element from the middle of the array as pivot, randomly generating a pivot for each subarray, selecting the median of the first, last, and middle element as pivot, etc. By using these methods, we can ensure equal partitioning, on average. Thus, quick sort's average case time complexity is O(NlogN).

Space Complexity

Unlike merge sort, quicksort doesn't require any extra space to store arrays, but additional memory is required for the stack frames of its recursive calls. So, the space complexity again depends on the pivot.

Worst Case

If we select the largest or smallest element as a pivot, there are in total N recursive calls, so the size of the recursion tree will be N. Thus, the space complexity for such a case is O(N) which is the worst case.

Best Case

If we manage to partition the array into equal halves each time, then the height of the recursion tree is log N. So, in this case, the space complexity is O(log N).

Heap Sort

In the heap sort, we convert the given array into min-heap or max-heap. Then we repeatedly fetch the largest or smallest element from the heap and place them accordingly.
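
As a compact illustration (using Python's heapq module rather than an in-place sift-down), heapify builds the heap in O(N) and each pop costs O(log N):

```python
import heapq

def heap_sort(arr):
    # Note: this consumes the input list while popping.
    heapq.heapify(arr)               # build a min-heap in O(N)
    return [heapq.heappop(arr) for _ in range(len(arr))]  # N pops, O(log N) each

print(heap_sort([12, 3, 9, 1]))  # [1, 3, 9, 12]
```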

Time Complexity

Fetching the smallest or largest element from the heap takes O(log N) time, and we have to fetch N elements. So the overall time complexity of heap sort is O(N log N).

The order of time taken by heap sort is always the same irrespective of the input array.

Best case = Worst case = Average case = O(N log N) 

Space complexity

Heap sort doesn't require extra space, as it converts the given array into a heap in place. Therefore, its space complexity is O(1).

Counting Sort

Counting sort stores the number of occurrences of each unique element of the given array in an auxiliary array of size K, where K equals the range of elements in the input.

K = (max element - minimum element + 1)

Then, this auxiliary array is used to sort the given array.
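
A minimal Python sketch of the idea (illustrative; it handles any integer range by offsetting with the minimum):

```python
def counting_sort(arr):
    if not arr:
        return arr
    lo, hi = min(arr), max(arr)
    k = hi - lo + 1                      # K = max element - min element + 1
    counts = [0] * k
    for x in arr:                        # O(N): count occurrences
        counts[x - lo] += 1
    out = []
    for offset, c in enumerate(counts):  # O(K): rebuild in sorted order
        out.extend([offset + lo] * c)
    return out

print(counting_sort([4, 2, 2, 8, 3, 3, 1]))  # [1, 2, 2, 3, 3, 4, 8]
```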

Time Complexity

The time complexity to iterate over each element in the given input is O(N). Then the algorithm iterates through the auxiliary array that contributes O(K) time complexity. So, the overall time complexity is O(N + K).

Best case = worst case = average case = O(N + K)

The time complexity of counting sort is linear, but it becomes impractical when the range of the input elements is large.

Space complexity

The algorithm requires an auxiliary array of size K, so the space complexity is O(K).

Radix Sort

Radix sort overcomes the problem of counting sort. It sorts the array digit by digit. It starts with the least significant digit and moves to the next significant digit, and sorts based on that digit. In the case of a tie, it uses the sorted order of the previous digit. It uses the counting sort for a particular digit.
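
A simplified LSD radix sort sketch for non-negative integers; the per-digit bucket gather below is a stable stand-in for the counting sort pass described above:

```python
def radix_sort(arr, base=10):
    if not arr:
        return arr
    place = 1
    while place <= max(arr):
        # Distribute elements into one bucket per digit value (stable).
        buckets = [[] for _ in range(base)]
        for x in arr:
            buckets[(x // place) % base].append(x)
        # Gather buckets in order, preserving the previous digit's ordering.
        arr = [x for bucket in buckets for x in bucket]
        place *= base
    return arr

print(radix_sort([170, 45, 75, 90, 802, 24, 2, 66]))
# [2, 24, 45, 66, 75, 90, 170, 802]
```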

Time Complexity

Let K be the number of digits in the maximum element of the array and b be the base of array elements.

The counting sort for a particular digit takes O(N + b) time, and the algorithm runs the counting sort for all K digits. Therefore, the total time complexity is O(K * (N + b)).

best case = worst case = average case = O(K * (N + b))

Space Complexity

The extra space used by radix sort comes from the counting sort pass it runs for each digit, which needs O(N + b) space, where b is the base of the array elements.

Bucket Sort

In bucket sort, we partition the array into several groups called buckets by suitably mapping each element to a bucket. Each bucket is then sorted using any efficient sorting algorithm, and the contents of the buckets are gathered in order to form the sorted array. The mapping function is determined by the characteristics of the input.
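
An illustrative sketch, assuming the input values are uniformly distributed floats in [0, 1) and using Python's built-in sort for each bucket for brevity:

```python
def bucket_sort(arr, num_buckets=10):
    # Map each value in [0, 1) to a bucket; the mapping function would
    # change for other input distributions.
    buckets = [[] for _ in range(num_buckets)]
    for x in arr:
        buckets[min(int(x * num_buckets), num_buckets - 1)].append(x)
    out = []
    for bucket in buckets:
        out.extend(sorted(bucket))   # sort each bucket individually
    return out

print(bucket_sort([0.42, 0.32, 0.33, 0.52, 0.37, 0.47, 0.51]))
# [0.32, 0.33, 0.37, 0.42, 0.47, 0.51, 0.52]
```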

Time Complexity

Worst Case

When all elements in an array are the same, the algorithm puts them all in the same bucket. The overall time complexity will become quadratic if we apply a quadratic time complexity algorithm to sort that bucket, such as insertion sort, selection sort, etc.

Average Case and Best Case

Bucket sort runs in linear time when the elements are spread uniformly across the buckets (e.g., when the array is a random permutation of 1 to N), as long as the sum of the squares of the bucket sizes is linear in the total number of elements.

Space Complexity

If K is the number of buckets required, then O(K) extra space is needed to store K empty buckets, and then we map each element to a bucket that requires O(N) extra space. So the overall space complexity is O(N + K).

Count Sort

Count sort is another name for the counting sort described above: it works by counting the occurrences of each unique element and arranging them in sorted order. It is efficient for sorting integers within a specific range.

Time Complexity

Count Sort has a time complexity of O(n+k), where n is the number of elements in the input array and k is the range of the input.

Space Complexity

Count Sort has a space complexity of O(n+k) due to the additional space required for counting the occurrences of elements and storing the sorted output.

Shell Sort

Shell Sort is an in-place comparison-based sorting algorithm that performs insertion sort on elements a fixed gap apart and progressively shrinks the gap. On average, it performs better than plain insertion sort.
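
An illustrative sketch using the simple gap-halving sequence (other gap sequences change the complexity):

```python
def shell_sort(arr):
    n = len(arr)
    gap = n // 2                     # simple halving gap sequence
    while gap > 0:
        # Gapped insertion sort: each gap-separated subsequence gets sorted.
        for i in range(gap, n):
            key = arr[i]
            j = i
            while j >= gap and arr[j - gap] > key:
                arr[j] = arr[j - gap]
                j -= gap
            arr[j] = key
        gap //= 2
    return arr

print(shell_sort([23, 12, 1, 8, 34, 54, 2, 3]))  # [1, 2, 3, 8, 12, 23, 34, 54]
```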

Time Complexity

Shell Sort's time complexity depends on the gap sequence used: it is about O(n log n) in the best case and up to O(n (log n)^2) with good gap sequences (or O(n^2) with simple ones) in the worst case.

Space Complexity

Shell Sort has a space complexity of O(1) as it operates in-place, requiring only a constant amount of extra space.

Tim Sort

Tim Sort is a hybrid sorting algorithm derived from merge sort and insertion sort. It partitions the array into blocks, sorts them individually using insertion sort, and then merges them together.
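
Python's own sorted() and list.sort() are based on Timsort, so in practice you usually rely on the built-in rather than hand-writing it:

```python
data = [5, 2, 9, 1, 5, 6]
print(sorted(data))        # [1, 2, 5, 5, 6, 9] -- Timsort under the hood
```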

Time Complexity

Tim Sort has a time complexity of O(n log n) in the average and worst cases due to its merge sort-based strategy, and O(n) in the best case when the input is already largely sorted.

Space Complexity

Tim Sort has a space complexity of O(n) because it requires additional space for merging the sorted blocks.

Tree Sort

Tree Sort is a sorting algorithm that builds a binary search tree from the elements to be sorted and then performs an in-order traversal to retrieve the elements in sorted order.
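
An illustrative sketch using a plain (unbalanced) binary search tree:

```python
class _Node:
    def __init__(self, value):
        self.value = value
        self.left = None
        self.right = None

def _insert(node, value):
    if node is None:
        return _Node(value)
    if value < node.value:
        node.left = _insert(node.left, value)
    else:
        node.right = _insert(node.right, value)   # duplicates go right
    return node

def _inorder(node, out):
    if node is not None:
        _inorder(node.left, out)
        out.append(node.value)       # in-order visit yields sorted order
        _inorder(node.right, out)

def tree_sort(values):
    root = None
    for v in values:                 # build the BST, O(log n) per insert on average
        root = _insert(root, v)
    out = []
    _inorder(root, out)
    return out

print(tree_sort([7, 3, 9, 1, 5]))  # [1, 3, 5, 7, 9]
```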

Time Complexity

Tree Sort has an average time complexity of O(n log n) and a worst-case time complexity of O(n^2) if the tree is unbalanced.

Space Complexity

Tree Sort has a space complexity of O(n) to store the elements in the binary search tree.

Cube Sort

Cube Sort is a variation of merge sort that partitions the array into sub-cubes and sorts them independently, followed by merging the sub-cubes to obtain the sorted array.

Time Complexity

Cube Sort has a time complexity of O(nlogn) on average and in the worst-case scenario due to its merge sort-like strategy.

Space Complexity

Cube Sort has a space complexity of O(n) as it requires additional space for merging the sub-cubes.

Frequently Asked Questions

What is the time complexity of sorting algorithms?

The time complexity of a sorting algorithm measures how long the algorithm takes to sort a list as a function of the input size; it describes how the running time grows as the amount of data grows.

What is the fastest time complexity for sorting?

For comparison-based sorting, the fastest achievable time complexity is O(n log n), achieved by algorithms like Merge Sort, Heap Sort, and Tim Sort. Non-comparison sorts such as counting sort can reach linear time, O(n + k), when the range of input values is limited.

Which sorting algorithm has log n complexity?

Most efficient sorting algorithms have a log n factor in their complexity: Merge Sort and Heap Sort run in O(n log n) in all cases, Quick Sort runs in O(n log n) on average, and Shell Sort can reach O(n (log n)^2) with suitable gap sequences.

Which sorting algorithm has a space complexity of O(1)?

Selection Sort, Insertion Sort, Bubble Sort, Heap Sort, and Shell Sort all have a space complexity of O(1), as they operate in place and require only a constant amount of extra memory.

Which sorting algorithm has best time complexity?

No comparison-based sorting algorithm can beat O(N log N). Merge Sort and Heap Sort guarantee O(N log N) in the best, average, and worst cases, while Quick Sort achieves O(N log N) in the best and average cases but degrades to O(N^2) in the worst case.

What is the quick sort time complexity?

The best and average time complexity of quick sort is O(N log N). Its worst-case time complexity is O(N^2), which occurs when the pivot choices are consistently bad, for example always picking the smallest or largest element of an already sorted array.

Conclusion

In this article, we discussed the space and time complexity of sorting algorithms. Having a good grasp of sorting algorithms is very important to crack coding interviews. You can also check out our other blogs on sorting algorithms to get a better hold on this topic.

To learn more about such data structures and algorithms, Coding Ninjas Studio is your one-stop destination. The platform will help you improve your coding techniques and give you an overview of interview experiences at various product-based companies.

Happy learning!
