Table of contents
1. Introduction
2. What is Sorting?
3. Why are sorting algorithms important?
4. Characteristics of Sorting Algorithms
5. Classification of sorting algorithms
5.1. Bubble Sort
5.1.1. Bubble Sort Complexity
5.2. Selection Sort
5.2.1. Steps of Selection Sort
5.2.2. Selection Sort Complexity
5.3. Insertion Sort
5.3.1. Steps of Insertion Sort
5.3.2. Complexity of Insertion Sort
5.3.3. Importance of Insertion Sort
5.4. Quick Sort
5.4.1. Steps of Quick Sort
5.4.2. Importance of Quick Sort
5.4.3. Complexity of Quick Sort
5.5. Merge Sort
5.5.1. Steps of Merge Sort
5.5.2. Importance of Merge Sort
5.5.3. Complexity of Merge Sort
5.6. Heap Sort
5.6.1. Steps of Heap Sort
5.6.2. Complexity of Heap Sort
6. Stability of Sorting Algorithm
7. Frequently Asked Questions
7.1. What is a sorting algorithm and its types?
7.2. What are the four basic sorting algorithms?
7.3. Which is the fastest sorting algorithm?
7.4. What is the main purpose of sorting?
7.5. What is the stability of sorting algorithms?
7.6. What is the best case in sorting?
8. Conclusion

Sorting Algorithms


Introduction

A sorting algorithm arranges elements in a specific order, typically ascending or descending, to enhance data searching, retrieval, and processing efficiency. Different sorting techniques, such as Bubble Sort, Quick Sort, and Merge Sort, are optimized based on time complexity, stability, and memory usage.

6 Sorting Algorithms, Features and Functions

In this article, we will discuss different sorting techniques with examples to understand their applications.

What is Sorting?

Sorting is the process of organizing data in a defined sequence, improving efficiency in searching, analysis, and presentation. It plays a crucial role in computer science, enabling faster data processing in various applications. Sorting algorithms work by comparing and rearranging elements until the desired order is achieved, ensuring optimal performance for different data structures and use cases.

Why are sorting algorithms important?

Sorting algorithms are important because we rely on ordering in everyday life, for example:

  • The way we arrange books on a bookshelf
     
  • The way our clothes are organized
     
  • The way dishes in the kitchen are arranged in some order
     
  • The way rows are arranged for morning assembly at school

 

Each of these can follow a different ordering criterion:

  • Arranging books chronologically
     
  • Arranging clothes from newest to oldest
     
  • Arranging students for morning assembly in ascending order of height

Characteristics of Sorting Algorithms

A few important characteristics of sorting algorithms are:

1. Time Complexity: Time complexity refers to the amount of time an algorithm takes to complete the sorting process relative to the size of the input. It is usually expressed using Big O notation. Different sorting algorithms have different time complexities, such as O(n^2) for Bubble Sort and O(n log n) for Quick Sort and Merge Sort. The time complexity can vary based on the best-case, average-case, and worst-case scenarios, depending on the initial arrangement of the elements being sorted.

2. Space Complexity: Space complexity refers to the amount of additional memory space required by an algorithm during the sorting process. Some sorting algorithms, like Bubble Sort and Selection Sort, operate directly on the input array and require only a constant amount of extra space, typically O(1). Other algorithms, such as Merge Sort, require additional space proportional to the size of the input, usually O(n), to store temporary subarrays during the merging process.

3. Stability: Stability is a characteristic that determines whether a sorting algorithm preserves the relative order of equal elements after sorting. In a stable sorting algorithm, if two elements have the same value, their relative order in the sorted output remains the same as in the original input. Bubble Sort, Insertion Sort, and Merge Sort are examples of stable sorting algorithms. On the other hand, algorithms like Quick Sort and Heap Sort are not inherently stable but can be modified to achieve stability.

4. In-Place vs. Out-of-Place: In-place sorting algorithms perform the sorting operation directly on the input array without requiring additional memory space. They modify the original array to achieve the sorted order. Examples of in-place sorting algorithms include Bubble Sort, Selection Sort, and Quick Sort. Out-of-place sorting algorithms, such as Merge Sort, require additional memory to store temporary subarrays during the sorting process. They create new arrays to hold the sorted elements, leaving the original array unmodified.

5. Comparison-Based vs. Non-Comparison-Based: Comparison-based sorting algorithms rely on comparing elements to determine their relative order. They use comparisons like "less than," "greater than," or "equal to" to make decisions about element placement. Examples of comparison-based sorting algorithms include Bubble Sort, Quick Sort, and Merge Sort. Non-comparison-based sorting algorithms, such as Counting Sort and Radix Sort, do not rely on element comparisons. Instead, they utilize the inherent properties or distribution of the elements to determine the sorted order.

6. Adaptivity: Adaptivity refers to the ability of a sorting algorithm to perform efficiently when the input is already partially sorted or has specific patterns. Adaptive sorting algorithms can take advantage of existing order in the input to reduce the number of comparisons or swaps required. For example, Insertion Sort performs well on nearly sorted arrays, as it requires fewer element shifts in such cases. Adaptive sorting algorithms can provide better performance in scenarios where the input data has some inherent structure or order.

Classification of sorting algorithms

Sorting algorithms can be classified in different ways. The major algorithms covered in this article are Bubble Sort, Selection Sort, Insertion Sort, Quick Sort, Merge Sort, and Heap Sort.

Let’s explain each of them with the help of examples.

Bubble Sort

In bubble sort, adjacent elements are compared and swapped whenever they are in the wrong order, and passes over the list are repeated until the correct order is achieved. The disadvantage of bubble sort is that it is quite slow.

For Example: Consider an unordered list [4, 6, 2, 1].

Pass 1

● 4 < 6 : no change is needed [4, 6, 2, 1]

● 6 > 2 : swap the elements [4, 2, 6, 1]

● Now 6 > 1 : swap the elements [4, 2, 1, 6]

Pass 2

● 4 > 2 : swap the elements [2, 4, 1, 6]

● 4 > 1 : swap the elements [2, 1, 4, 6]

● 4 < 6 : no change is needed [2, 1, 4, 6]

Pass 3

● 2 > 1 : swap the elements [1, 2, 4, 6]

● 2 < 4 : no change is needed [1, 2, 4, 6]

● 4 < 6 : no change is needed [1, 2, 4, 6]

Pass 4

● 1 < 2 : no change is needed [1, 2, 4, 6]

● 2 < 4 : no change is needed [1, 2, 4, 6]

● 4 < 6 : no change is needed [1, 2, 4, 6]
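Based on the passes above, here is a minimal Python sketch of bubble sort; the function name and the early-exit flag are illustrative choices, not part of the original walkthrough:

```python
def bubble_sort(arr):
    """Sort arr in place by repeatedly swapping adjacent out-of-order pairs."""
    n = len(arr)
    for i in range(n - 1):              # each pass bubbles the largest remaining element to the end
        swapped = False
        for j in range(n - 1 - i):
            if arr[j] > arr[j + 1]:
                arr[j], arr[j + 1] = arr[j + 1], arr[j]
                swapped = True
        if not swapped:                 # no swaps means the list is already sorted
            break
    return arr

print(bubble_sort([4, 6, 2, 1]))        # [1, 2, 4, 6]
```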

Bubble Sort Complexity

In the best case (the list is already sorted and an early-exit check is used), bubble sort runs in O(n); in the average and worst cases it takes O(n^2) comparisons. Its space complexity is O(1).

Bubble sort is quite slow in practice, so it is not useful for large data sets.

Selection Sort

Selection sort repeatedly finds the minimum element from an unsorted array and puts it at the beginning of the array. It is an in-place comparison-based sorting algorithm.

Conceptually, the array is divided into two parts in selection sort:

  • The unsorted portion
  • The sorted portion

Initially, the sorted portion is empty and the unsorted portion contains all the elements. In each pass, the smallest element of the unsorted portion is selected and placed at its leftmost position, where it finally becomes part of the sorted portion.

Steps of Selection Sort:

  • Iterate over the complete array to find the smallest element in the unsorted portion
     
  • When we reach the end, we know the minimum element of the unsorted portion
     
  • Swap that minimum element with the leftmost element of the unsorted portion
     
  • That leftmost element now becomes part of the sorted portion and is excluded from the unsorted portion in the next iteration
     
  • Repeat these steps until all the elements are sorted
     

The following image and the short code sketch after it show selection sort more clearly:

Steps of Selection Sort
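Below is a short Python sketch of these steps; the function name and the sample input are illustrative assumptions:

```python
def selection_sort(arr):
    """Sort arr in place by repeatedly selecting the minimum of the unsorted portion."""
    n = len(arr)
    for i in range(n - 1):
        min_idx = i
        for j in range(i + 1, n):        # scan the unsorted portion for the smallest element
            if arr[j] < arr[min_idx]:
                min_idx = j
        arr[i], arr[min_idx] = arr[min_idx], arr[i]   # place it at the end of the sorted portion
    return arr

print(selection_sort([64, 25, 12, 22, 11]))           # [11, 12, 22, 25, 64]
```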

Selection Sort Complexity

Best case, average case, and worst case are all O(n^2), because the unsorted portion is scanned fully in every pass. Space complexity is O(1).

Insertion Sort

Insertion sort is simple and easy to implement, but its performance is not outstanding. It builds the sorted list one item at a time: in each iteration, an item is taken from the array and inserted at its correct position by comparing it with its neighbors in the sorted portion. The process is repeated until there are no more unsorted items in the list.


Steps of Insertion Sort:

  • Leave the first element as it is, and move to the next element in the array.
     
  • Compare it with the elements in the sorted sub-list to its left.
     
  • Shift all the elements in the sorted sub-list that are greater than the element to be inserted one position to the right.
     
  • Insert the element at that position.
     
  • Repeat these steps until the list is sorted.

 

For Example:

Consider a list of items as 7, 4, 5, 2

Step 1: There is no element to the left of 7, so leave it as it is.

Step 2: 4 < 7, so 4 is inserted before 7. The new list is 4, 7, 5, 2.

Step 3: 5 < 7 but 5 > 4, so 5 is inserted between 4 and 7. The new list is 4, 5, 7, 2.

Step 4: 2 is smaller than 7, 5, and 4, so it is shifted past all of them to the front. The new list is 2, 4, 5, 7.
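A minimal Python sketch of insertion sort using the same list, 7, 4, 5, 2; the function name is an illustrative choice:

```python
def insertion_sort(arr):
    """Sort arr in place by inserting each element into the sorted prefix to its left."""
    for i in range(1, len(arr)):
        key = arr[i]
        j = i - 1
        while j >= 0 and arr[j] > key:   # shift larger elements one position to the right
            arr[j + 1] = arr[j]
            j -= 1
        arr[j + 1] = key                 # insert the element at its correct position
    return arr

print(insertion_sort([7, 4, 5, 2]))      # [2, 4, 5, 7]
```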

Complexity of Insertion Sort

Best case: O(n) when the list is already sorted; average case: O(n^2); worst case: O(n^2). Space complexity: O(1).

Importance of Insertion Sort:

  • It is efficient for small data sets but inefficient for large data sets.
     
  • It has low space complexity; it requires only O(1) additional memory.
     
  • It has an overall time complexity of O(n^2).

Quick Sort

Quick sort is a commonly used sorting algorithm that follows the divide-and-conquer approach:

  1. Conceptually, the elements are split into two groups:
    a) The first group holds the elements that are smaller than the pivot element.

    b) The second group holds the elements that are larger than the pivot element.
     
  2. The array is partitioned around the pivot, and elements are swapped in place.

 

Steps of Quick Sort:

  • Choose one element as the “pivot”; all comparisons are made against it.
     
  • Take two pointers; start one pointer from the left and the other from the right.
     
  • While the element at the left pointer is smaller than the pivot, move the left pointer one position to the right.
     
  • While the element at the right pointer is larger than the pivot, move the right pointer one position to the left.
     
  • If the left pointer is still to the left of the right pointer, swap the values at these two positions.
     
  • Then move the left pointer one position to the right and the right pointer one position to the left.
     
  • Repeat these steps until the two pointers meet or cross, then apply the same procedure recursively to each partition; a short code sketch follows this list.
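The sketch below follows the two-groups description from the previous section (elements smaller than the pivot and elements larger than it) rather than the in-place two-pointer partition, trading extra memory for brevity; the choice of the first element as the pivot and the sample input are assumptions made for illustration:

```python
def quick_sort(arr):
    """Recursive quicksort that partitions into two extra lists around a pivot."""
    if len(arr) <= 1:                                 # zero or one element is already sorted
        return arr
    pivot = arr[0]                                    # pivot choice is an illustrative assumption
    smaller = [x for x in arr[1:] if x <= pivot]      # elements no larger than the pivot
    larger = [x for x in arr[1:] if x > pivot]        # elements larger than the pivot
    return quick_sort(smaller) + [pivot] + quick_sort(larger)

print(quick_sort([10, 80, 30, 90, 40, 50, 70]))       # [10, 30, 40, 50, 70, 80, 90]
```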

 

Importance of Quick Sort:

  • It is very useful for sorting arrays.
     
  • Quick Sort is not a stable sorting algorithm.
     
  • Its average-case time complexity is O(n log n).
     
  • It is an in-place sorting algorithm.

Complexity of Quick Sort

Best case: O(n log n); average case: O(n log n); worst case: O(n^2). Space complexity: O(log n) for the recursion stack.

Merge Sort

Merge sort follows the divide-and-conquer methodology. It recursively breaks the array into smaller sub-arrays until each sub-array is trivially sorted (a single element), and then merges the sorted sub-arrays back together to form the sorted array.

Steps of Merge Sort:

  • If the array contains only one element, return it, since it is already sorted.
     
  • Otherwise, divide the array into two halves, and keep dividing until the halves cannot be divided further.
     
  • Merge the smaller sorted lists into new lists in sorted order, as sketched below.
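A compact Python sketch of these steps; the sample input is illustrative, and using <= in the merge loop is what keeps the sort stable:

```python
def merge_sort(arr):
    """Recursively split the list into halves, then merge the sorted halves."""
    if len(arr) <= 1:                                 # a single element is already sorted
        return arr
    mid = len(arr) // 2
    left = merge_sort(arr[:mid])
    right = merge_sort(arr[mid:])
    merged, i, j = [], 0, 0
    while i < len(left) and j < len(right):           # merge the two sorted halves
        if left[i] <= right[j]:                       # <= keeps equal elements in their original order
            merged.append(left[i])
            i += 1
        else:
            merged.append(right[j])
            j += 1
    return merged + left[i:] + right[j:]              # append whatever remains

print(merge_sort([38, 27, 43, 3, 9, 82, 10]))         # [3, 9, 10, 27, 38, 43, 82]
```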

Importance of Merge Sort:

  • It is well suited to sorting linked lists.
     
  • It is a stable sorting algorithm.
     
  • It has an overall time complexity of O(n log n).
     
  • It has a space complexity of O(n).

Complexity of Merge Sort

Best case, average case, and worst case are all O(n log n). Space complexity: O(n).

Heap Sort

Heap sort is a comparison-based sorting algorithm, often described as an improved version of selection sort. The list is divided into a sorted and an unsorted portion, and the unsorted portion is repeatedly shrunk as its largest element is moved into the sorted portion. Heap sort first builds a max heap; then the first (maximum) element is swapped with the last element of the unsorted portion.

Steps of Heap Sort:

  • Call heapify(), which builds a max heap from the list in O(n) time.
     
  • Swap the first element (the maximum) with the last element of the heap.
     
  • Call shiftDown() to move the new first element down to its appropriate position, reducing the heap size by one.
     
  • Repeat these steps until the list is sorted; a code sketch follows this list.
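A Python sketch of these steps with a hand-written max heap; sift_down plays the role of the shiftDown() mentioned above, and the first loop in heap_sort is the heapify step:

```python
def sift_down(arr, start, end):
    """Restore the max-heap property for the subtree rooted at start (heap occupies arr[:end])."""
    root = start
    while 2 * root + 1 < end:
        child = 2 * root + 1
        if child + 1 < end and arr[child] < arr[child + 1]:
            child += 1                                # pick the larger of the two children
        if arr[root] < arr[child]:
            arr[root], arr[child] = arr[child], arr[root]
            root = child
        else:
            break

def heap_sort(arr):
    n = len(arr)
    for start in range(n // 2 - 1, -1, -1):           # build a max heap (the heapify step)
        sift_down(arr, start, n)
    for end in range(n - 1, 0, -1):
        arr[0], arr[end] = arr[end], arr[0]           # move the current maximum to the end
        sift_down(arr, 0, end)                        # sift the new root down into the smaller heap
    return arr

print(heap_sort([12, 11, 13, 5, 6, 7]))               # [5, 6, 7, 11, 12, 13]
```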

Complexity of Heap Sort

Best case, average case, and worst case are all O(n log n). Space complexity: O(1).

Stability of Sorting Algorithm

A stable sorting algorithm preserves the relative order of elements with equal keys. This means that if two elements have the same value, their order remains unchanged after sorting. Stability is crucial in applications where secondary sorting criteria matter.

For example, consider sorting a list of students by marks while maintaining their original order by name if marks are the same. A stable sort ensures that students with the same marks appear in the same order as in the original list.
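A small Python illustration of the students-by-marks example; the names and marks are made up, and Python's built-in sorted() happens to be a stable sort:

```python
# Sort (name, marks) pairs by marks only; students with equal marks
# keep their original relative order because sorted() is stable.
students = [("Asha", 90), ("Bilal", 85), ("Chitra", 90), ("Dev", 85)]
by_marks = sorted(students, key=lambda s: s[1], reverse=True)
print(by_marks)   # [('Asha', 90), ('Chitra', 90), ('Bilal', 85), ('Dev', 85)]
```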

Examples:

  • Stable Sorting Algorithms: Merge Sort, Bubble Sort, Insertion Sort
     
  • Unstable Sorting Algorithms: Quick Sort, Heap Sort, Selection Sort
     

Stability is important in scenarios like database sorting, lexicographic ordering, and multi-level sorting, where maintaining initial ordering is required for accurate results.

Frequently Asked Questions

What is a sorting algorithm and its types?

A sorting algorithm arranges elements in a specific order. Common types include comparison-based algorithms (Quick Sort, Merge Sort) and non-comparison-based algorithms (Radix Sort, Counting Sort).

What are the four basic sorting algorithms?

The four basic sorting algorithms are Bubble Sort, Selection Sort, Insertion Sort, and Merge Sort, each differing in time complexity and efficiency.

Which is the fastest sorting algorithm?

Quick Sort is generally the fastest in most cases due to its O(n log n) average-case complexity, but Radix Sort can be faster for large numerical data.

What is the main purpose of sorting?

Sorting is the process of arranging data into meaningful order so that you can analyze it more effectively.

What is the stability of sorting algorithms?

A stable sorting algorithm maintains the relative order of the items with equal sort keys.

What is the best case in sorting?

The best case of a sorting algorithm is the input on which it performs the minimum number of steps for n elements, for example, an already sorted list for insertion sort.

Conclusion

In this article, we have learned about sorting algorithms and their characteristics. We explored six common sorting algorithms: Bubble Sort, Selection Sort, Insertion Sort, Quick Sort, Merge Sort, and Heap Sort. We discussed their time and space complexities, stability, in-place vs. out-of-place nature, comparison-based vs. non-comparison-based approaches, and adaptivity. These characteristics help in selecting the most suitable sorting algorithm for a specific scenario and achieving better performance, which saves computing resources.
