Introduction
A sorting algorithm is a set of instructions that rearranges a given set of elements according to a comparison operator, which defines the relative order of any two elements.
Time complexity is the time an algorithm takes to run as a function of the length of the input, and space complexity is the extra space an algorithm requires as a function of the input size. Time and space complexity are two crucial aspects of any algorithm, as they determine its efficiency.
Further in this article, we will discuss the time and space complexity of different sorting algorithms having O(n log n) complexity, along with examples of every algorithm's best case, worst case, and average case.
We will see how some algorithms perform better than others in particular instances and vice versa.
Note: In this article, 'n' is used as the size of the input array wherever applicable.
Quick Sort
Quicksort is a divide and conquer algorithm. We pick a pivot element and partition the array around it, i.e., we place the pivot element at its correct position in the sorted array, with the smaller elements on its left side and the greater elements on its right side. There are different methods to determine the pivot element, and choosing it should preferably take constant time. The Quick Sort algorithm can be summarised as follows:
1. Determine the pivot element.
2. Partition the given array based on the pivot element.
3. Repeat steps 1 and 2 for the subarrays on the left and right sides of the pivot element.
We can maintain a pointer to the boundary of the smaller elements. While traversing the array, whenever we find an element smaller than the pivot, we swap it with the element at the pointer position and advance the pointer by one. So the partition operation takes linear time. Every time we place a pivot element in its correct position, we divide the array into two parts. If we consider that in each function call the array is divided into two equal parts, then the size of the function call stack will be O(log n), and each function call takes O(n) time (due to the partition operation). So the overall time complexity of the algorithm is O(n log n). The recurrence relation of this algorithm is: T(n) = 2T(n/2) + 𝛳(n).
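The partition scheme described above can be sketched in Python. This is a minimal sketch, taking the first element as the pivot and swapping smaller elements past a moving boundary:

```python
def partition(arr, low, high):
    # Use the first element as the pivot.
    pivot = arr[low]
    i = low  # boundary of the "smaller than pivot" region
    for j in range(low + 1, high + 1):
        if arr[j] < pivot:
            i += 1
            arr[i], arr[j] = arr[j], arr[i]
    # Place the pivot at its correct sorted position.
    arr[low], arr[i] = arr[i], arr[low]
    return i

def quick_sort(arr, low=0, high=None):
    if high is None:
        high = len(arr) - 1
    if low < high:
        p = partition(arr, low, high)
        quick_sort(arr, low, p - 1)   # left subarray
        quick_sort(arr, p + 1, high)  # right subarray

data = [4, 2, 1, 3, 5, 6]
quick_sort(data)
print(data)  # [1, 2, 3, 4, 5, 6]
```

Each `partition` call does one linear scan, matching the O(n) cost per level discussed above.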
We will now discuss the cases where quicksort performs the best and the worst, as well as its average performance. We will take the pivot element as the first element of the array.
Best case
Quicksort performs best when the pivot element is the median element, i.e., it divides the array into two equal parts. In this, the height of the function call stack is log(n), and each function call takes O(n) time. So, the overall time complexity will be O(n log n). We need O(log n) space for the function call stack. Recurrence relation for the best case can be written as: T(n) = 2T(n/2) + 𝛳(n)
Example: 4, 2, 1, 3, 5, 6
Pivot element: 4
Left subarray: 2, 1, 3
Right subarray: 5, 6
We can see that the sizes of the left and right subarrays differ only by one.
Worst Case
Quicksort performs worst when the pivot element divides the array into two parts where one part has size (n-1) and the other part has size zero. This leads to a function call stack of size O(n), and each function call takes O(n) time. So the time complexity will be O(n^2), and we need O(n) space to store the function call stack. The recurrence relation for the worst case can be written as: T(n) = T(n-1) + 𝛳(n)
Example: 1, 2, 3, 4, 5, 6
Pivot element: 1
Left subarray: NULL
Right Subarray: 2, 3, 4, 5, 6
We can see that the left subarray has size zero, and the right subarray has size (n-1).
Average case
On a random array, quick sort works in O(n log n) time and O(log n) space on an average.
We consider the average case where (n/10) of the elements are placed in one group and (9n/10) of the elements are placed in the other group.
The recurrence relation of this case can be written as:
T(n) = T(n/10) + T(9n/10) + 𝛳(n)
Merge Sort
Merge sort is a divide and conquer algorithm like quicksort. It divides the array into two equal-sized subarrays and calls itself on both. After this, the two sorted subarrays are merged into one array. The Merge Sort algorithm can be summarised as follows:
1. Find the middle element to divide the array into two halves.
2. Call merge sort for the left half.
3. Call merge sort for the right half.
4. Merge the left and right halves.
The left and right halves will already be sorted before step 4, so we can merge them in linear time by maintaining two pointers into the two subarrays and comparing their values. The size of the function call stack of merge sort is O(log n) in every case, as the array is always divided into two equal-sized subarrays, and the merge operation takes O(n) time. So the overall time complexity is O(n log n), and we need O(n) extra space for the merge operation. We can write its recurrence relation as
T(n) = 2T(n/2) + 𝛳(n)
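The four steps above can be sketched in Python. This is a minimal sketch that returns a new sorted list, using the two-pointer merge described in the text:

```python
def merge_sort(arr):
    if len(arr) <= 1:
        return arr
    mid = len(arr) // 2              # step 1: find the middle
    left = merge_sort(arr[:mid])     # step 2: sort the left half
    right = merge_sort(arr[mid:])    # step 3: sort the right half
    # Step 4: merge the two sorted halves with two pointers.
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    merged.extend(left[i:])   # append whatever remains
    merged.extend(right[j:])
    return merged

print(merge_sort([38, 27, 43, 3, 9, 82, 10]))
```

The `merged` list is the O(n) auxiliary space mentioned above.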
Heap Sort
Heap Sort uses the concept of Binary Heap for sorting a given array. A Binary Heap is a complete binary tree that can be represented as an array. Heap Sort consists of the following steps:
1. Build a max heap from the given array.
2. Swap the root of the tree with the last element and reduce the size of the tree by one.
3. Heapify the root of the tree.
4. Repeat steps 2 and 3 while the size of the tree is greater than one.
We know that building a max heap takes O(n) time, and the heapify function takes O(log n) time. We call the heapify function every time we remove the largest element from the top, i.e., n times, so the overall time complexity is O(n log n). Heap sort needs no extra space, so the space complexity is O(1).
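The heap sort procedure above can be sketched in Python, sorting in place; here `heapify` sifts the element at index `i` down until the subtree rooted there is a max heap:

```python
def heapify(arr, n, i):
    # Sift arr[i] down so the subtree rooted at i is a max heap
    # within the first n elements.
    largest = i
    left, right = 2 * i + 1, 2 * i + 2
    if left < n and arr[left] > arr[largest]:
        largest = left
    if right < n and arr[right] > arr[largest]:
        largest = right
    if largest != i:
        arr[i], arr[largest] = arr[largest], arr[i]
        heapify(arr, n, largest)

def heap_sort(arr):
    n = len(arr)
    # Step 1: build a max heap from the bottom up -- O(n) overall.
    for i in range(n // 2 - 1, -1, -1):
        heapify(arr, n, i)
    # Steps 2-4: move the max to the end, shrink the heap, re-heapify.
    for end in range(n - 1, 0, -1):
        arr[0], arr[end] = arr[end], arr[0]
        heapify(arr, end, 0)

data = [12, 11, 13, 5, 6, 7]
heap_sort(data)
print(data)  # [5, 6, 7, 11, 12, 13]
```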
Radix Sort
In Radix Sort, the array is sorted digit by digit, from the least significant digit to the most significant digit. The sorting algorithm used at each digit place must be stable so that elements with equal digits keep the same relative order as in the previous pass. We can use counting sort to sort the individual digit places, as it is stable and the digits have a small range of 0-9, so each pass takes O(n) time.
Overall, radix sort takes O(nk) time in all cases because we perform one pass over the array for each of the k digit places. To sort each digit place, we use counting sort, which creates an auxiliary count array of size 10 (one slot per digit) and an output array of size n, so the space complexity is O(n).
Note: k is the maximum number of digits among the elements of the array.
For example [ 1, 2, 33, 199 ]
Here k = 3 because the maximum number of digits is three, in the element '199'.
Let’s observe the working of Radix Sort:
Given unsorted array: 20, 25, 90, 2, 102, 900
We can write this array as: 020, 025, 090, 002, 102, 900
Pass 1 (units digit): 020, 090, 900, 002, 102, 025 → O(n) time
Pass 2 (tens digit): 900, 002, 102, 020, 025, 090 → O(n) time
Pass 3 (hundreds digit): 002, 020, 025, 090, 102, 900 → O(n) time
We performed an O(n) pass three times, where 3 is the value of k here. So the overall time complexity is O(nk).
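The procedure above can be sketched in Python, with a stable counting sort applied once per digit place:

```python
def counting_sort_by_digit(arr, exp):
    # Stable counting sort of arr by the digit at place value exp.
    n = len(arr)
    output = [0] * n
    count = [0] * 10  # one slot per digit 0-9
    for x in arr:
        count[(x // exp) % 10] += 1
    for d in range(1, 10):
        count[d] += count[d - 1]  # prefix sums give end positions
    for x in reversed(arr):       # reverse pass keeps the sort stable
        d = (x // exp) % 10
        count[d] -= 1
        output[count[d]] = x
    return output

def radix_sort(arr):
    if not arr:
        return arr
    exp = 1
    while max(arr) // exp > 0:  # one pass per digit place: k passes
        arr = counting_sort_by_digit(arr, exp)
        exp *= 10
    return arr

print(radix_sort([20, 25, 90, 2, 102, 900]))  # [2, 20, 25, 90, 102, 900]
```

Running this on the example array reproduces the three passes shown above.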
Bucket Sort
In the Bucket Sort algorithm, the elements are divided into n buckets covering subsequent ranges, and then the individual buckets are sorted. After each bucket is sorted, they are merged. In the ideal case, each bucket holds a single element, so it is already sorted. If a bucket contains more than one element, we can either recursively call the same bucketSort function on it or use any other sorting algorithm. The time complexity to sort the separate buckets is O(n) and to merge the buckets is O(k), where k is the number of buckets, so the overall time complexity is O(n+k); when the number of buckets equals n, it becomes O(n). The space complexity of this algorithm is O(n+k) for storing the elements in the buckets. Now, let's see the cases where this algorithm performs the best and the worst.
Best Case
The best case occurs when the given elements are distributed uniformly in the buckets, and the elements in each bucket are already sorted. Considering that it takes linear time to determine that the array is already sorted, we get a time complexity of O(n+k) where O(k) is the complexity of sorting the elements in a bucket and O(n) is the complexity of making and merging the buckets.
Example: arr[ ]=[ 45, 46, 47, 48, 49 ]
We have 5 buckets B1, B2, B3, B4, B5
B1: 45
B2: 46
B3: 47
B4: 48
B5: 49
Here the elements are distributed uniformly over all the buckets.
Worst Case
When the elements are in a close range, they are likely to accumulate in the same bucket. In this case, the time complexity depends on the sorting algorithm used for the buckets; if insertion sort is used, the worst case can take O(n^2) time.
Example: arr[ ]=[ 45, 46, 47, 48, 49 ]
We have five buckets B1, B2, B3, B4, B5
B1: NULL
B2: NULL
B3: 45, 46, 47, 48, 49
B4: NULL
B5: NULL
Here all the elements lie in the same bucket.
Average Case
When the elements are distributed randomly over a range, even if they are not uniformly distributed, it takes linear time on average.
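The bucketing scheme described above can be sketched in Python. This is a minimal sketch for numeric inputs, assuming the bucket count defaults to n and using Python's built-in sort inside each bucket:

```python
def bucket_sort(arr, num_buckets=None):
    if not arr:
        return arr
    n = len(arr)
    k = num_buckets or n            # n buckets by default, as in the analysis
    lo, hi = min(arr), max(arr)
    width = (hi - lo) / k or 1      # range covered by each bucket
    buckets = [[] for _ in range(k)]
    for x in arr:
        # Map each element to its bucket; clamp the maximum into the last one.
        idx = min(int((x - lo) / width), k - 1)
        buckets[idx].append(x)
    result = []
    for b in buckets:
        result.extend(sorted(b))    # sort each bucket individually
    return result

print(bucket_sort([45, 46, 47, 48, 49]))  # [45, 46, 47, 48, 49]
```

With uniformly distributed input, each bucket stays small, which is what gives the O(n+k) behaviour discussed above.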
Summarising the time complexities of every algorithm:
| Algorithm | Best Case | Average Case | Worst Case |
|---|---|---|---|
| Quick Sort | O(n log n) | O(n log n) | O(n^2) |
| Bubble Sort | O(n) | O(n^2) | O(n^2) |
| Merge Sort | O(n log n) | O(n log n) | O(n log n) |
| Insertion Sort | O(n) | O(n^2) | O(n^2) |
| Selection Sort | O(n^2) | O(n^2) | O(n^2) |
| Heap Sort | O(n log n) | O(n log n) | O(n log n) |
| Radix Sort | O(nk) | O(nk) | O(nk) |
| Bucket Sort | O(n+k) | O(n+k) | O(n^2) |
FAQs

Which sorting algorithm is used in linked lists?
Merge sort is used to sort linked lists.

Which sorting algorithm is used when data is uniformly distributed over a range?
Bucket Sort as it divides the range into different buckets.

What is the time complexity of the heapify function?
O(log n)

What is the time complexity to build a heap?
O(n). It is often mistaken to be O(n log n).

Which algorithm is used in the sort() function in C++ STL?
IntroSort, which is a hybrid algorithm of quick sort, heap sort, and insertion sort, is used in the sort function.