Quick Sort is a widely used sorting algorithm in computer science. It's an efficient, comparison-based sorting algorithm that follows a divide-and-conquer approach to sort a list or an array of elements.
We humans love to simplify our work by dividing it into smaller parts. Keeping this in mind, people have developed computer algorithms that follow the divide-and-conquer principle. One of these algorithms is the quicksort algorithm, used to sort the elements of an array.
Now you may be wondering: if you already know some other sorting technique, what is the use of knowing another? Different algorithms have different time and space complexities, so knowing quicksort is as vital as learning any other sorting algorithm. If you don’t know quicksort already, hang in there, because in this article we will learn all about it together.
What is Quick Sort?
QuickSort is a sorting method that works by dividing a list into smaller groups, sorting those groups, and then putting them back together in order. It's like organizing a messy room by making smaller piles of stuff, arranging each pile, and then stacking them neatly.
QuickSort is like sorting a deck of cards. Imagine we have a mixed-up deck, and we want to arrange the cards in order. We pick one card as a "pivot" (like a reference card) and put all smaller cards on one side. Put larger cards on the other side. Now, we need to do the same for the two smaller groups of cards. Keep doing this until all groups have just one card. We have sorted the deck! It's fast because we're only comparing and moving cards a few times, not checking each card against all the others like some other methods. That's QuickSort—sorting by dividing and conquering!
Working of QuickSort
As we already know, quicksort is used to sort the elements of an array in ascending or descending order, but then how does it implement the divide and conquer principle?
In quicksort, one element is chosen as the pivot. Comparing the remaining elements with it divides the array into two partitions: the left side of the pivot holds elements smaller than it, while the right side holds elements larger than it.
Quicksort is then applied to both partitions, and the process continues until the whole array is sorted.
Now that we know how quicksort works, you are probably thinking, how do we choose the pivot?
There are four ways to choose the pivot:
Consider the first element as the pivot.
Consider the last element as the pivot.
Consider any random element as the pivot.
Use the median-of-3 method (all four choices are sketched in code right after this list).
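To make these four options concrete, here is a minimal Python sketch of each choice. The helper names (first_pivot, last_pivot, random_pivot, median_of_three_pivot) are purely illustrative and not part of any library; each one simply returns the index of the chosen pivot for the sub-array arr[lo..hi].

import random

def first_pivot(arr, lo, hi):
    return lo                       # first element as the pivot

def last_pivot(arr, lo, hi):
    return hi                       # last element as the pivot

def random_pivot(arr, lo, hi):
    return random.randint(lo, hi)   # any random element as the pivot

def median_of_three_pivot(arr, lo, hi):
    mid = (lo + hi) // 2
    # index of the median value among the first, middle and last elements
    return sorted([(arr[lo], lo), (arr[mid], mid), (arr[hi], hi)])[1][1]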
Let us now consider a quicksort example and understand the four different methods.
First Element as the Pivot
After choosing the first element as the pivot, our next step is to swap the pivot with the last element and set up two loop counters, i and j, as shown below. We mentioned that i and j are loop counters, but counters for which loops?
In quicksort, we will keep on incrementing i till we find an element larger than the pivot, and we will keep on decrementing j till we find an element smaller than the pivot.
When we find two such elements, we will swap them.
To understand this, let us consider a quicksort illustration.
Here, the element at position i is not greater than the pivot, while the element at position j is already less than the pivot, so we keep increasing i until our condition is met. Once the element at i is greater than the pivot, our next step is to swap the elements at positions i and j.
We keep incrementing i and decrementing j in this way, swapping whenever such a pair is found, until i crosses j.
Now that i has crossed j, we swap the element at position i with the pivot.
Here, we can see that the elements to the left of the pivot are less than it, while the elements to the right of the pivot are greater than it. So we have got two partitions and the sorted position of the pivot (8).
Now, we will again apply quicksort to the two partitions. But the right partition has only one element, so we apply quicksort to the left partition only.
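Before moving on, here is a minimal Python sketch of the pass we just walked through: the chosen pivot is swapped to the last position, then i and j scan towards each other. The function name partition_around and the exact index handling are illustrative assumptions, one of several equivalent ways to code this step.

def partition_around(arr, lo, hi, p):
    # Move the chosen pivot (index p) to the last position, then scan.
    arr[p], arr[hi] = arr[hi], arr[p]
    pivot = arr[hi]
    i, j = lo, hi - 1
    while True:
        while i <= j and arr[i] <= pivot:   # increment i until an element larger than the pivot
            i += 1
        while i <= j and arr[j] >= pivot:   # decrement j until an element smaller than the pivot
            j -= 1
        if i >= j:                          # the counters have crossed
            break
        arr[i], arr[j] = arr[j], arr[i]     # swap the out-of-place pair
    arr[i], arr[hi] = arr[hi], arr[i]       # swap the pivot into its sorted position
    return i                                # sorted position of the pivot

For example, partition_around([8, 1, 6, 9, 6, 3, 5, 2, 7, 0], 0, 9, 0) places the pivot 8 at index 8, with every smaller element to its left.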
Last Element as the Pivot
In the left partition, we can see that 0 is the smallest element, so it is already in its sorted position. Therefore, considering 0 as the pivot would be a waste of time, so let’s use the last element as the pivot here. We will also initialise the counters i and j. We already know how the counters are used in quicksort, but let’s see it once more for the sake of our understanding.
Here, again two partitions are formed where the elements to the left of the pivot are smaller than it, while the elements to the right are larger.
The elements in the left partition are already sorted, so we can ignore it here; a computer program would, however, still run quicksort on it.
Random Element as the Pivot
To sort the right partition we got, let us use the third method of considering the pivot.
Instead of any random element, let us choose the element at the middle index as the pivot.
Here, the middle element is 3. So, we will consider that to be the pivot.
Now that we have determined the pivot, we will continue the process as we have before.
Thus, the array is now like this.
The last three elements left can quickly be sorted by choosing the pivot following any of the methods mentioned above to give the sorted array as follows:
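Reusing the hypothetical partition_around helper sketched earlier, choosing the element at the middle index is just a matter of passing a different pivot index; the array below is illustrative, not the exact partition from the walkthrough above.

arr = [9, 6, 3, 5, 7]            # some unsorted sub-array
lo, hi = 0, len(arr) - 1
mid = (lo + hi) // 2             # middle index as the pivot (value 3 here)
pos = partition_around(arr, lo, hi, mid)
# arr is now partitioned around 3, which sits at its sorted position pos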
The Median of 3 Method
There is one more way to choose the pivot element: the median-of-3 method.
Previously, we had seen that we might encounter a situation where the element we’re choosing as a pivot is already in its sorted position. So, considering that element as the pivot would unnecessarily increase the complexity of the algorithm.
To avoid this, the median-of-3 method is used to choose the pivot.
Let us consider the same array as an example to understand how this method works.
Here, the first element is 8, the last element is 0, and the element at the median index ((0 + 9) / 2 ≈ 4) is 6.
In the median-of-3 rule, we first sort these three elements in ascending order and place them in the first, median and last positions, respectively, as follows:
As shown below, we will swap the element in the median position, i.e., 6, with the element in the second to the last position. The element in the second to the last position will now be our pivot.
Our next step is to assign the counters. i and j will be assigned a little differently from how we were doing it before, but why?
The reason is that we already sorted the first, median (now the pivot) and last elements, so the first and last elements are already on the correct sides of the pivot.
Therefore i and j will be assigned as follows:
Therefore, the array is now as follows:
A new pivot is obtained for the left and the right partitions, and the process of quicksort continues to sort the entire array in ascending order.
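The median-of-3 preparation and the modified counters can be sketched as follows. This is a minimal illustration in Python (it assumes the sub-array has at least three elements, with smaller sub-arrays sorted directly), and the function name is an assumption, not a standard routine.

def median_of_three_partition(arr, lo, hi):
    mid = (lo + hi) // 2
    # Sort the first, median and last elements in ascending order.
    if arr[mid] < arr[lo]:
        arr[lo], arr[mid] = arr[mid], arr[lo]
    if arr[hi] < arr[lo]:
        arr[lo], arr[hi] = arr[hi], arr[lo]
    if arr[hi] < arr[mid]:
        arr[mid], arr[hi] = arr[hi], arr[mid]
    # Swap the median into the second-to-last position and use it as the pivot.
    arr[mid], arr[hi - 1] = arr[hi - 1], arr[mid]
    pivot = arr[hi - 1]
    # The first and last elements are already on the correct sides,
    # so the counters skip them: i starts one past lo, j one before the pivot.
    i, j = lo + 1, hi - 2
    while True:
        while arr[i] < pivot:
            i += 1
        while arr[j] > pivot:
            j -= 1
        if i >= j:
            break
        arr[i], arr[j] = arr[j], arr[i]
        i += 1
        j -= 1
    arr[i], arr[hi - 1] = arr[hi - 1], arr[i]   # pivot into its sorted position
    return i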
Algorithm of QuickSort
Now that we know how quicksort works, we can easily figure out its algorithm.
Step 1: Choose the pivot following any of the four methods (the last element is commonly used as the pivot).
Step 2: Assign the counter i to the leftmost element’s index and the counter j to the rightmost element’s index, excluding the pivot.
Step 3: If i is less than j, proceed.
Step 4: If the element in position i is less than the pivot, increment i.
Step 5: If the element in position j is greater than the pivot, decrement j.
Step 6: If the element at position i is greater than the pivot and the element at position j is less than the pivot, swap the elements at i and j.
Step 7: When i becomes greater than j (the counters cross), swap the element at position i with the pivot.
Step 8: Continue the process for the left and the right partitions formed.
With this, we have completed the partition.
Now, for the quicksort part (a combined sketch of both routines follows these steps):
Step 1: Declare a function with three parameters, an array (say arr) and two integer type variables (say i and j).
Step 2: If i < j, partition the array between indices i and j.
Step 3: Recursively call the function with three parameters, an array (arr) and two integers (i and (partition position - 1)).
Step 4: Recursively call the function with three parameters, an array (arr) and two integers ((partition position + 1), j).
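Putting the two routines together, here is a minimal Python sketch that follows the steps above with the last element as the pivot. The names and index handling are one reasonable reading of the steps; the full C++, C, Java and Python listings in the next section use a slightly different but equivalent partition scheme.

def partition(arr, lo, hi):
    pivot = arr[hi]                          # Step 1: last element as the pivot
    i, j = lo, hi - 1                        # Step 2: counters at both ends, excluding the pivot
    while i <= j:                            # Step 3
        while i <= j and arr[i] <= pivot:    # Step 4: advance i past elements not larger than the pivot
            i += 1
        while i <= j and arr[j] >= pivot:    # Step 5: retreat j past elements not smaller than the pivot
            j -= 1
        if i < j:                            # Step 6: an out-of-place pair, swap it
            arr[i], arr[j] = arr[j], arr[i]
    arr[i], arr[hi] = arr[hi], arr[i]        # Step 7: counters crossed, pivot into its place
    return i

def quicksort(arr, lo, hi):
    if lo < hi:                              # Step 2 of the recursive part
        p = partition(arr, lo, hi)
        quicksort(arr, lo, p - 1)            # Step 3: left partition
        quicksort(arr, p + 1, hi)            # Step 4: right partition

arr = [8, 1, 6, 9, 6, 3, 5, 2, 7, 0]
quicksort(arr, 0, len(arr) - 1)
print(arr)                                   # [0, 1, 2, 3, 5, 6, 6, 7, 8, 9]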
Knowing the algorithm, we can easily use the syntax of different programming languages to write the code for quicksort in C, quicksort in Java or quicksort in Python.
Code Implementation of QuickSort
The complete code for quicksort, taking the last element as the pivot, is given below in C++, C, Java and Python.
Implementation in C++
#include <iostream>
using namespace std;

// Partition x[i..j]; the pivot starts at the last position and is moved
// until every smaller element is on its left and every larger one on its right.
int partition(int x[], int i, int j)
{
    int left = i, right = j - 1, pivot = j, complete = 0, t;
    while (!complete)
    {
        // Move left forward past elements not larger than the pivot.
        while (x[left] <= x[pivot] && pivot != left)
            left++;
        if (pivot != left)
        {
            t = x[pivot]; x[pivot] = x[left]; x[left] = t;
            pivot = left;
        }
        else
            complete = 1;
        if (!complete)
        {
            // Move right backward past elements not smaller than the pivot.
            while (x[right] >= x[pivot] && pivot != right)
                right--;
            if (pivot != right)
            {
                t = x[pivot]; x[pivot] = x[right]; x[right] = t;
                pivot = right;
            }
            else
                complete = 1;
        }
    }
    return pivot;   // final (sorted) position of the pivot
}

void quicksort(int x[], int i, int j)
{
    int pivot;
    if (i < j)
    {
        pivot = partition(x, i, j);
        quicksort(x, i, pivot - 1);     // sort the left partition
        quicksort(x, pivot + 1, j);     // sort the right partition
    }
}

int main()
{
    int arr[] = {8, 1, 6, 9, 6, 3, 5, 2, 7, 0};
    int len = sizeof(arr) / sizeof(arr[0]);
    quicksort(arr, 0, len - 1);
    for (int i = 0; i < len; i++)
        cout << " " << arr[i];
    return 0;
}
Output
0 1 2 3 5 6 6 7 8 9
Implementation in C
#include <stdio.h>
// Partition x[i..j]; the pivot starts at the last position and is moved
// until every smaller element is on its left and every larger one on its right.
int partition(int x[], int i, int j)
{
    int left = i, right = j - 1, pivot = j, complete = 0, t;
    while (!complete)
    {
        while (x[left] <= x[pivot] && pivot != left)
            left++;
        if (pivot != left)
        {
            t = x[pivot]; x[pivot] = x[left]; x[left] = t;
            pivot = left;
        }
        else
            complete = 1;
        if (!complete)
        {
            while (x[right] >= x[pivot] && pivot != right)
                right--;
            if (pivot != right)
            {
                t = x[pivot]; x[pivot] = x[right]; x[right] = t;
                pivot = right;
            }
            else
                complete = 1;
        }
    }
    return pivot;
}

void quicksort(int x[], int i, int j)
{
    int pivot;
    if (i < j)
    {
        pivot = partition(x, i, j);
        quicksort(x, i, pivot - 1);
        quicksort(x, pivot + 1, j);
    }
}

int main()
{
    int arr[] = {12, 1, 16, 9, 61, 3, 81, 21, 17, 0};
    int len = sizeof(arr) / sizeof(arr[0]);
    quicksort(arr, 0, len - 1);
    for (int i = 0; i < len; i++)
        printf("%d ", arr[i]);
    return 0;
}
Output
0 1 3 9 12 16 17 21 61 81
Implementation in Java
public class QuickSort {
    // Partition x[i..j]; the pivot starts at the last position and is moved
    // until every smaller element is on its left and every larger one on its right.
    public static int partition(int[] x, int i, int j) {
        int left = i, right = j - 1, pivot = j, complete = 0, t;
        while (complete != 1) {
            while (x[left] <= x[pivot] && pivot != left)
                left++;
            if (pivot != left) {
                t = x[pivot]; x[pivot] = x[left]; x[left] = t;
                pivot = left;
            } else
                complete = 1;
            if (complete != 1) {
                while (x[right] >= x[pivot] && pivot != right)
                    right--;
                if (pivot != right) {
                    t = x[pivot]; x[pivot] = x[right]; x[right] = t;
                    pivot = right;
                } else
                    complete = 1;
            }
        }
        return pivot; // final (sorted) position of the pivot
    }

    public static void quicksort(int[] x, int i, int j) {
        int pivot;
        if (i < j) {
            pivot = partition(x, i, j);
            quicksort(x, i, pivot - 1);   // sort the left partition
            quicksort(x, pivot + 1, j);   // sort the right partition
        }
    }

    public static void main(String[] args) {
        int[] arr = {31, 11, 61, 52, 65, 16, 5, 21, 7, 10};
        int len = arr.length;
        quicksort(arr, 0, len - 1);
        for (int i = 0; i < len; i++)
            System.out.print(arr[i] + " ");
    }
}
Output
5 7 10 11 16 21 31 52 61 65
Implementation in Python
# Partition x[i..j]; the pivot starts at the last position and is moved
# until every smaller element is on its left and every larger one on its right.
def partition(x, i, j):
    left, right, pivot, complete = i, j - 1, j, 0
    while complete != 1:
        while x[left] <= x[pivot] and pivot != left:
            left += 1
        if pivot != left:
            x[pivot], x[left] = x[left], x[pivot]
            pivot = left
        else:
            complete = 1
        if complete != 1:
            while x[right] >= x[pivot] and pivot != right:
                right -= 1
            if pivot != right:
                x[pivot], x[right] = x[right], x[pivot]
                pivot = right
            else:
                complete = 1
    return pivot


def quicksort(x, i, j):
    if i < j:
        pivot = partition(x, i, j)
        quicksort(x, i, pivot - 1)   # sort the left partition
        quicksort(x, pivot + 1, j)   # sort the right partition


arr = [81, 3, 62, 71, 6, 33, 42, 27, 71, 0]
quicksort(arr, 0, len(arr) - 1)
for num in arr:
    print(num, end=" ")
Output
0 3 6 27 33 42 62 71 71 81
QuickSort Time Complexity Analysis
Now that we know about quicksort, our next job would be to analyse the time complexity.
To do that, let us again consider an array.
Best Case Complexity
Let us consider that each time, the partitioning happens exactly in the middle. We can visualise the recursion as a tree in which every level together processes the whole array.
Here, the work required to partition the sub-arrays at each level sums up to n.
To calculate the number of levels, we have to determine the number of times we divide the size of the initial array by two until we get 1.
Therefore,
n / 2^k = 1
=> n = 2^k
=> log₂(n) = log₂(2^k)
=> k = log₂(n)
Hence, with n work per level and log₂(n) levels, the total time complexity is O(n log n).
In practice, the partitioning is rarely exactly down the middle; this perfectly balanced split is the ideal case, which is why it gives the best-case complexity.
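Equivalently, the best case can be read off the standard divide-and-conquer recurrence, stated here for reference (c is the constant cost per element of one partitioning pass, and n is assumed to be a power of two):

T(n) = 2 T(n/2) + c n, with T(1) = c
=> T(n) = c n log₂(n) + c n = O(n log n)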
Average Case Complexity
For average-case complexity, let us consider the partitioning to be in a 3:1 ratio.
The total number of iterations for partitioning in each level will again be n.
For the number of levels, we have to determine how many times we can divide the size of the initial array by four (following the smaller, quarter-sized part) until we get 1.
Therefore,
n / 4^k = 1
=> n = 4^k
=> log₄(n) = log₄(4^k)
=> k = log₄(n)
Hence the average-case time complexity is again O(n log n). (Strictly, the deepest recursion path follows the larger three-quarter part, giving about log(n) / log(4/3) levels rather than log₄(n); the two differ only by a constant factor, so the bound is unchanged.)
Worst Case Complexity
In the worst-case scenario, the pivot always ends up at one end of its partition, for example when the array is already sorted and the last (or first) element is chosen as the pivot. The partitioning then degenerates into a single chain, with one sub-array of size n - 1, then n - 2, and so on.
Therefore, for large values of n, the worst-case time complexity is O(n²).
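For completeness, the arithmetic behind that bound: when the pivot always lands at one end, the sub-array shrinks by only one element per level, so the partitioning work adds up as

n + (n - 1) + (n - 2) + ... + 1 = n(n + 1) / 2 = O(n²)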
QuickSort vs Merge Sort
Like quicksort, merge sort is another divide-and-conquer algorithm that follows a similar method to sort arrays.
Then why do we need two different algorithms?
To understand this, let us compare and contrast the two algorithms.
Property | QuickSort | Merge Sort
Partition of elements | Partitions can be made in any ratio. | The array is always split into two halves.
Worst-case complexity | O(n²) | O(n log n)
Suitability | Works well on small arrays. | Works well on any array, irrespective of its size.
Speed | Comparatively faster for small data sets. | Consistent speed for all sizes of data sets.
Storage requirements | Less (sorts in place, apart from the recursion stack). | More (needs an auxiliary array for merging).
Efficiency | Can become inefficient for large data sets in the worst case. | Comparatively more efficient for large data sets.
Sorting method | Internal | External (also suited to data that does not fit in memory)
Stability | Not stable | Stable
Preferred use | Arrays | Linked lists
From this comparison, we can conclude that merge sort is often more helpful than quicksort for large data sets or when stability is required, while quicksort remains a strong choice for in-memory arrays.
Advantages of QuickSort
There are several advantages of the QuickSort algorithm:
It is generally faster than many other sorting algorithms, especially for large datasets. It has an average and best-case time complexity of O(n log n), making it efficient for most scenarios
It is an in-place sorting algorithm, which means it doesn't require additional memory for sorting. It sorts the elements within the original array, making it memory-efficient
It exhibits good performance on average, especially when the pivot selection and partitioning methods are optimized
It's one of the most widely used sorting algorithms and is often the default sorting algorithm in programming libraries
Disadvantages of QuickSort
Along with the advantages, the QuickSort algorithm has some disadvantages as well:
In the worst-case scenario, it can have a time complexity of O(n^2). This occurs when the pivot selection consistently leads to unbalanced partitions
It is not a stable sorting algorithm. It may change the relative order of equal elements in the sorted array
The choice of the pivot element significantly affects its performance; poor pivot selection can lead to inefficient sorting (a simple randomised-pivot mitigation is sketched after this list)
It is less suitable for sorting linked lists compared to arrays because it relies on random access to elements
It uses a recursive approach, which may lead to stack overflow errors for very large datasets
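A common way to soften the pivot-related weaknesses above is to pick the pivot at random. The sketch below reuses the hypothetical partition_around helper from earlier in the article and is only illustrative, not a production implementation.

import random

def randomized_quicksort(arr, lo, hi):
    if lo < hi:
        p = random.randint(lo, hi)              # random pivot index
        pos = partition_around(arr, lo, hi, p)  # partition around that element
        randomized_quicksort(arr, lo, pos - 1)
        randomized_quicksort(arr, pos + 1, hi)

With a random pivot, the chance of repeatedly hitting the unbalanced splits that cause O(n²) behaviour becomes vanishingly small, although it is still not impossible.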
In quicksort, we first select a pivot, with which comparisons are made to make two partitions. The exact process is then repeated for both the partitions until the array is sorted.
Frequently Asked Questions
What is meant by quicksort?
Quicksort is a divide and conquer algorithm used to sort the elements in an array.
Why is quicksort called Quick?
Quicksort is called quick because, in practice, it usually sorts faster than other comparison-based algorithms of the same O(n log n) average complexity, thanks to its in-place, cache-friendly partitioning.
Why is quicksort the best sorting algorithm?
QuickSort is favored for its efficiency, in-place sorting, and good average-case performance with O(n log n) time complexity. Its adaptability via pivot selection strategies helps mitigate worst-case scenarios. However, choosing the best sorting algorithm depends on specific requirements, and other algorithms may be more suitable in some cases.
Conclusion
In this article, we learned about quicksort, how it works and how to implement it, but this isn’t the end.
Keeping the theoretical knowledge at our fingertips helps us get about half the work done. To gain complete understanding, practice is a must. A variety of coding questions from interviews are available here.
Apart from this, you can also check Coding Ninjas Studio, where you’ll find thousands of other coding questions to practice and interview experiences of scholars in renowned product-based companies.
Happy learning!