Last Updated: Mar 27, 2024

# Sorting by combining Insertion Sort and Merge Sort algorithms

## Introduction

In this blog, we will discuss both insertion sort and merge sort, and then see how we can combine the two into a single sorting algorithm with a better trade-off between time and space complexity.

## Insertion Sort

Insertion Sort is a sorting algorithm in which the array is split into two parts: a sorted part and an unsorted part. We repeatedly pick an element from the unsorted part and place it at its correct position in the sorted part.


### Algorithm

1. We will iterate over the array from index 1 to N-1.
2. At each step, we will compare the current element with its predecessor.
3. If the current element is smaller than its predecessor, we will keep comparing it with the elements before it.
4. We will shift the greater elements one position to the right to make space for the current element, and then insert it there.
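The steps above can be sketched in Python as follows (the function name `insertion_sort` is ours, not from the original blog):

```python
def insertion_sort(arr):
    """Sort arr in place using insertion sort and return it."""
    for i in range(1, len(arr)):      # unsorted part starts at index i
        key = arr[i]                  # element to place into the sorted part
        j = i - 1
        # Shift elements greater than key one position to the right
        while j >= 0 and arr[j] > key:
            arr[j + 1] = arr[j]
            j -= 1
        arr[j + 1] = key              # insert key at its correct position
    return arr
```

Note that when the array is already sorted, the inner `while` loop never runs, which is why the best case is linear.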

### Algorithmic Complexities

Time Complexity: O(N*N) in the worst case
Insertion sort takes the least time when the array is already sorted, and it takes the maximum time when the array is in reverse order.

Space Complexity: O(1)

1. If the size of the array is small, insertion sort works fast.
2. Insertion sort takes O(N) time when the array is already sorted.
3. No extra space is required, so insertion sort runs in place.

## Merge Sort

Merge Sort is a sorting algorithm based on the divide-and-conquer approach. It breaks the array into two halves of roughly equal size, sorts each half recursively, and then merges the two sorted halves. The merge function assumes that both halves are already sorted.


### Pseudocode for MergeSort

```
mergeSort(arr[], l, r)
    if r > l
        // Find the middle point to divide the array into two halves
        m = l + (r - l) / 2
        // Call mergeSort for the first half
        mergeSort(arr, l, m)
        // Call mergeSort for the second half
        mergeSort(arr, m + 1, r)
        // Merge the two halves sorted in the above steps
        merge(arr, l, m, r)
```
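The pseudocode above can be turned into a runnable sketch, for example in Python (the helper name `merge` follows the pseudocode; the list-slicing details are our own):

```python
def merge(arr, l, m, r):
    """Merge the already-sorted halves arr[l..m] and arr[m+1..r] in place."""
    left, right = arr[l:m + 1], arr[m + 1:r + 1]
    i = j = 0
    k = l
    # Repeatedly take the smaller front element of the two halves
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            arr[k] = left[i]; i += 1
        else:
            arr[k] = right[j]; j += 1
        k += 1
    # Copy whatever remains of either half (one of these is empty)
    arr[k:r + 1] = left[i:] + right[j:]

def merge_sort(arr, l, r):
    """Sort arr[l..r] in place, mirroring the pseudocode above."""
    if r > l:
        m = l + (r - l) // 2
        merge_sort(arr, l, m)
        merge_sort(arr, m + 1, r)
        merge(arr, l, m, r)
```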

### Algorithmic Complexities

Time Complexity: O(N*log(N))
The time complexity of merge sort for worst case, best case, and the average case is O(N*log(N)).

Space Complexity: O(N)

1. Dividing the main problem into sub-problems costs only constant time per split.
2. Its time complexity is better than that of many other sorting algorithms.

## Combined Algorithm

We can combine the advantages of both sorting algorithms into a single algorithm: sort small blocks of size K with insertion sort, then merge the blocks. The worst-case time complexity of the combined algorithm is

O(N*K + N*log(N/K)).

Now let’s see how we got the time complexity of this combined algorithm.

1. Division

First, we will divide the N elements into (N/K) groups of size K.

2. Sorting

We will perform an insertion sort on each subarray of size K.
The cost of insertion sort is O(K) per block in the best case and O(K*K) in the worst case. Since there are (N/K) blocks, the best-case total is (N/K) * K = O(N), and the worst-case total is (N/K) * (K*K) = O(N*K).

3. Merging

After applying insertion sort on the (N/K) subarrays of size K, we will now merge these groups using the merge sort technique.
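Putting the three steps together, here is one possible Python sketch (the names `combined_sort` and `sorted_merge` are ours; it insertion-sorts each block of size K, then merges adjacent blocks bottom-up, doubling the block width on each pass):

```python
def sorted_merge(a, b):
    """Merge two sorted lists into one sorted list."""
    out, i, j = [], 0, 0
    while i < len(a) and j < len(b):
        if a[i] <= b[j]:
            out.append(a[i]); i += 1
        else:
            out.append(b[j]); j += 1
    return out + a[i:] + b[j:]

def combined_sort(arr, k):
    """Sort arr by insertion-sorting blocks of size k, then merging them."""
    n = len(arr)
    # Steps 1 and 2: insertion-sort each block of size k
    for start in range(0, n, k):
        end = min(start + k, n)
        for i in range(start + 1, end):
            key = arr[i]
            j = i - 1
            while j >= start and arr[j] > key:
                arr[j + 1] = arr[j]
                j -= 1
            arr[j + 1] = key
    # Step 3: merge adjacent sorted blocks, doubling the width each pass
    width = k
    while width < n:
        for l in range(0, n, 2 * width):
            m = min(l + width, n)
            r = min(l + 2 * width, n)
            arr[l:r] = sorted_merge(arr[l:m], arr[m:r])
        width *= 2
    return arr
```

Each merging pass costs O(N), and there are log(N/K) passes, which matches the analysis below.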

### Algorithmic Complexities

Time Complexity:

Suppose merging the (N/K) sorted blocks takes i levels of merging:
(2^i) * K = N  =>  2^i = N/K  =>  i = log2(N/K)
Each level costs O(N), so the time required to merge is O(N*log(N/K)).

Therefore, the total time complexity of the combined algorithm is:
Best Case: O(N + N*log(N/K))
Worst Case: O(N*K + N*log(N/K))

Space Complexity: O(N)

If K = 1, the above algorithm reduces to merge sort, which has the better time complexity.
If K = N, the above algorithm reduces to insertion sort, which has the better space complexity.


## FAQs

1. What is the space complexity of an algorithm?
Space complexity is the extra space used by the algorithm to solve a particular problem.

2. Name any five sorting algorithms.
Five sorting algorithms are merge sort, bubble sort, insertion sort, quick sort, and radix sort.

3. Which sorting algorithm has the most optimized time complexity?
Among comparison-based sorting algorithms, merge sort guarantees O(N*log(N)) in the best, average, and worst cases. Quick sort also takes O(N*log(N)) on average, but its worst-case time complexity is O(N^2).

## Key takeaways

This article discussed sorting by combining the Insertion Sort and Merge Sort algorithms. We covered insertion sort and merge sort, their individual time and space complexities, and the time and space complexity of the combined sorting algorithm.

Until then, Keep Learning and Keep Coding.
