Why Is Analysis of Algorithms Important?
The analysis of algorithms is essential for evaluating the efficiency of code in terms of time and space usage. It helps developers understand how algorithms scale with larger inputs, optimize performance, and select the best approach for different problems. By analyzing algorithms, developers can also avoid performance bottlenecks and ensure resource-efficient applications.
Types of Algorithm Analysis
Algorithm analysis typically includes worst-case, best-case, and average-case analysis. The worst-case analysis estimates the maximum time an algorithm will take, ensuring it performs reliably under all conditions. Best-case focuses on the minimal time required, while average-case provides a balanced estimate, useful for most practical scenarios. Together, these analyses offer a comprehensive view of algorithm efficiency.
There are three types of analysis of algorithms:
- Worst Case
- Average Case
- Best Case
1. Worst Case
This is the most commonly used analysis. Here we calculate an upper bound on the running time: the maximum number of operations the algorithm can perform over all inputs of size n.
2. Average Case
This is used moderately. It is useful but difficult to apply in general, since it requires knowing (or assuming) the distribution of inputs. Here we consider all possible inputs, sum the computing time over them, and divide by the number of inputs. The average helps us predict the algorithm's typical behaviour over the data set.
3. Best Case
This is rarely used. Here we compute a lower bound: the minimum number of operations that can be executed for an input of size n. This tells us the best possible behaviour of the algorithm.
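As an illustration (this sketch is not part of the original quiz), the three cases can be seen concretely in linear search by counting element comparisons:

```cpp
#include <cassert>
#include <vector>

// Linear search that also reports how many comparisons it made.
// Best case: target at index 0 (1 comparison). Worst case: target
// absent or at the last index (n comparisons). Average case: about
// n/2 comparisons when the target is equally likely at any position.
int linearSearch(const std::vector<int>& a, int target, int& comparisons) {
    comparisons = 0;
    for (int i = 0; i < (int)a.size(); i++) {
        comparisons++;
        if (a[i] == target)
            return i;
    }
    return -1;
}
```

Searching for the first element exercises the best case; searching for a missing element exercises the worst case.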
Question 1
What are the recurrence relation and the worst-case time complexity of Quick Sort?
- T(n) = T(n/2) + O(n), Time complexity = O(n^2)
- T(n) = T(n-1) + O(n), Time complexity = O(n^2)
- T(n) = 2*T(n/2) + O(n), Time complexity = O(n log(n))
- T(n) = T(n/3) + T(2n/3) + O(n), Time complexity = O(n)
Answer: B
The worst case of quicksort occurs when the chosen pivot is the smallest or largest element of the array. The partition then reduces the problem size by only 1, and the recursive call handles an array of length (n-1).
Thus, we get the recurrence relation: T(n) = T(n-1) + O(n).
Solving this recurrence gives time complexity = O(n^2).
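A quick way to see the O(n^2) behaviour empirically is to count comparisons when a quicksort runs on an already sorted array. This sketch (an assumption, not the article's code) uses the Lomuto partition scheme with the last element as pivot:

```cpp
#include <cassert>
#include <algorithm>
#include <vector>

// Quicksort with Lomuto partition (last element as pivot), counting
// comparisons. On a sorted array every partition leaves n-1 elements
// on one side, so comparisons total (n-1) + (n-2) + ... + 1
// = n*(n-1)/2 = O(n^2).
long long quickSortCount(std::vector<int>& a, int lo, int hi) {
    if (lo >= hi) return 0;
    long long cmp = 0;
    int pivot = a[hi], i = lo;
    for (int j = lo; j < hi; j++) {
        cmp++;
        if (a[j] < pivot) std::swap(a[i++], a[j]);
    }
    std::swap(a[i], a[hi]);
    return cmp + quickSortCount(a, lo, i - 1) + quickSortCount(a, i + 1, hi);
}
```

For a sorted array of 10 elements this performs 10*9/2 = 45 comparisons, matching the quadratic bound.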
Question 2
Which of the following statements is false about comparison-based sorting algorithms?
- The minimum time complexity for comparison-based sorting algorithms is O(n*log(n)).
- We can use the position as a criterion while comparing the two elements to make them stable.
- Counting sort is not a comparison-based sorting algorithm.
- Heapsort is not a comparison-based sorting algorithm.
Answer: D
In heapsort, the heapify step repeatedly compares an element with its left and right children, so heapsort is a comparison-based sorting algorithm.
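By contrast, counting sort never compares two array elements with each other, which is why the O(n*log(n)) lower bound does not apply to it. A minimal sketch (assuming non-negative values bounded by maxVal):

```cpp
#include <cassert>
#include <vector>

// Counting sort: tally occurrences by value, then emit values in
// order. No element is ever compared against another element, so
// the comparison-sort lower bound does not constrain it.
std::vector<int> countingSort(const std::vector<int>& a, int maxVal) {
    std::vector<int> count(maxVal + 1, 0), out;
    for (int x : a) count[x]++;           // tally each value
    for (int v = 0; v <= maxVal; v++)     // emit values in order
        for (int k = 0; k < count[v]; k++)
            out.push_back(v);
    return out;
}
```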
Question 3
What is the time complexity of the following function?
int fun(int n) {
    int cnt = 0;
    for (int i = n; i > 0; i /= 2)
        for (int j = 0; j < i; j++)
            cnt += 1;
    return cnt;
}
- O(n*log(n))
- O(n^2)
- O(n)
- O(n*log(n)*log(n))
Answer: C
For each iteration of the first loop, the second loop is executed ‘i’ times.
Thus the total number of times the second loop is executed will be n + n/2 + n/4 + n/8 … 1. The time complexity of the function = O(n + n/2 + n/4 + … + 1) = O(n).
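We can also check the bound empirically: the geometric series n + n/2 + n/4 + ... + 1 always stays below 2n. This is just the question's function, runnable as-is:

```cpp
#include <cassert>

// The inner loop runs i times while i halves on each outer
// iteration, so cnt = n + n/2 + n/4 + ... + 1 < 2n, i.e. O(n).
int fun(int n) {
    int cnt = 0;
    for (int i = n; i > 0; i /= 2)
        for (int j = 0; j < i; j++)
            cnt += 1;
    return cnt;
}
```

For n = 1024 the count is exactly 1024 + 512 + ... + 1 = 2047, just under 2n.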
Question 4
Which of the following is the correct recurrence relation for finding the time complexity of the Tower of Hanoi problem with n discs?
- T(n) = 2*T(n-2) + 2
- T(n) = 2*T(n-1) + n
- T(n) = 2*T(n/2) + 1
- T(n) = 2*T(n-1) + 1
Answer: D
In the Tower of Hanoi, we perform the following operations:
- We first move top (n-1) discs from A to B by calling function(n-1, A, C, B)
- We then move the nth disc from A to C in O(1)
- Then we again move the (n-1) discs from B to C by calling function(n-1, B, A, C)
Thus, the recurrence relation is T(n) = T(n-1) + O(1) + T(n-1) = 2*T(n-1) + 1.
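The three steps above translate directly into a move counter (a sketch, counting moves rather than printing them), and solving T(n) = 2*T(n-1) + 1 with T(0) = 0 gives 2^n - 1 moves:

```cpp
#include <cassert>

// Tower of Hanoi move count: move n-1 discs aside, move the largest
// disc (1 move), then move the n-1 discs back on top. This mirrors
// the recurrence T(n) = 2*T(n-1) + 1, whose solution is 2^n - 1.
long long hanoiMoves(int n) {
    if (n == 0) return 0;
    return hanoiMoves(n - 1) + 1 + hanoiMoves(n - 1);
}
```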
Question 5
Which of the following functions is not O(n^2)?
- (15^10)*n + 12099
- n^1.98
- n^3/sqrt(n)
- (2^20)*n
Answer: C
n^3/sqrt(n) = n^3/n^(0.5) = n^(3-0.5) = n^2.5, which grows faster than n^2, so it is not O(n^2).
Question 6
What is the best-case time complexity of the bubble sort algorithm?
- N^2
- N*log(N)
- N
- N*log(N)^2
Answer: C
In bubble sort, we keep a boolean flag that records whether any swap happened during the inner loop. If the array is already sorted, no swap is performed, so the function returns after a single pass of the inner loop, giving O(N) time.
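The early-exit flag can be sketched as follows (an illustration; the function also returns the number of passes so the best case is visible):

```cpp
#include <cassert>
#include <utility>
#include <vector>

// Bubble sort with an early-exit flag, returning the number of
// passes made. On an already sorted array no swap occurs in the
// first pass, so the sort stops after that single pass: O(n).
int bubbleSortPasses(std::vector<int>& a) {
    int n = (int)a.size(), passes = 0;
    for (int i = 0; i < n - 1; i++) {
        bool swapped = false;
        passes++;
        for (int j = 0; j < n - 1 - i; j++) {
            if (a[j] > a[j + 1]) {
                std::swap(a[j], a[j + 1]);
                swapped = true;
            }
        }
        if (!swapped) break;  // no swap: array is sorted, stop early
    }
    return passes;
}
```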
Question 7
Find the time complexity of the below code:
void fun(int n, int arr[]) {
    int i = 0;
    int j = 0;
    for (; i < n; ++i) {
        while (j < n && arr[i] < arr[j])
            j++;
    }
}
- N
- N^2
- N*log(N)
- N*log(N)^2
Answer: A
The variables i and j will have a value of at most n after the function is completed. In one operation, either the value of i increases or the value of j increases. Thus the total number of operations required will be 2*n = O(n).
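The same function, instrumented to count loop steps (a sketch of the amortized argument), shows the total stays within 2n even though the loops are nested, because j never resets:

```cpp
#include <cassert>
#include <vector>

// j only ever moves forward, so across ALL iterations of the outer
// loop the inner while advances j at most n times in total. Counting
// both i-steps and j-steps gives at most 2n operations: O(n).
int funOps(int n, const std::vector<int>& arr) {
    int i = 0, j = 0, ops = 0;
    for (; i < n; ++i) {
        ops++;  // one step of the outer loop
        while (j < n && arr[i] < arr[j]) {
            j++;
            ops++;  // one step of the inner loop
        }
    }
    return ops;
}
```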
Question 8
Which of the following algorithms have the least worst-case time complexity?
- Bubble sort
- Merge sort
- Quicksort
- Selection sort
Answer: B
The worst-case time complexity of the merge sort algorithm is O(N*log(N)), while for other algorithms, it is O(N^2).
Question 9
Given a sorted array of N integers, let T(N) be the time taken by the most efficient algorithm to check if there exist any two elements with a sum less than or equal to 1000. Which of the following is true?
- T(N) = O(1)
- N < T(N) < N*log(N)
- N*log(N) < T(N) < NCR(N, 2)
- T(N) = NCR(N, 2)
Answer: A
Since the array is sorted, the sum of the first two elements will be the minimum possible sum. If this sum is <= 1000, then there exist two elements; otherwise, not.
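The O(1) check reduces to a single comparison on the two smallest elements (a minimal sketch of the argument above):

```cpp
#include <cassert>
#include <vector>

// In a sorted array, a[0] + a[1] is the smallest possible pair sum,
// so one comparison answers the question in O(1) time.
bool hasPairSumAtMost1000(const std::vector<int>& a) {
    return a.size() >= 2 && a[0] + a[1] <= 1000;
}
```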
Question 10
Find the time complexities of the following two functions:
int fun1(int n) {
    if (n <= 1)
        return n;
    return 2 * fun1(n - 1);
}

int fun2(int n) {
    if (n <= 1)
        return n;
    return fun2(n - 1) + fun2(n - 1);
}
- O(2^n), O(2^n)
- O(n), O(2^n)
- O(2^n), O(n)
- O(n), O(n)
Answer: B
The recurrence relation for the first function is:
T(n) = T(n-1) + O(1) which means that T(n) = O(n)
The recurrence relation for the second function is:
T(n) = 2*T(n-1) + O(1) which means that T(n) = O(2^n)
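Instrumented versions of the two functions (a sketch using global call counters) make the difference concrete: fun1 makes one recursive call per level, while fun2 doubles the number of calls at each level:

```cpp
#include <cassert>

// Count how many times each function is invoked. fun1 makes a single
// recursive call per level, so calls grow linearly in n; fun2 makes
// two, so calls grow as 2^n - 1.
long long calls1 = 0, calls2 = 0;

int fun1(int n) {
    calls1++;
    if (n <= 1) return n;
    return 2 * fun1(n - 1);
}

int fun2(int n) {
    calls2++;
    if (n <= 1) return n;
    return fun2(n - 1) + fun2(n - 1);
}
```

Calling fun1(10) produces 10 invocations, while fun2(10) produces 2^10 - 1 = 1023.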
Frequently Asked Questions
How do you analyze an algorithm?
Analyzing an algorithm involves examining its time and space complexity to understand how efficiently it performs as input size grows. This helps in optimizing performance.
What are the 3 algorithm analysis techniques?
The three main techniques are worst-case, best-case, and average-case analysis, which evaluate an algorithm’s efficiency under different input scenarios to determine optimal performance.
What are the two types of analysis of algorithms?
The two main types are asymptotic analysis, which focuses on growth rate, and empirical analysis, which involves testing actual runtime on sample inputs.
Is analysis of algorithms difficult?
Algorithm analysis can be challenging due to complex mathematical concepts and varying performance factors, but it becomes manageable with practice and foundational understanding.
Conclusion
In this article, we discussed a variety of multiple-choice questions on the analysis of algorithms and their time complexity. We hope this blog has helped you enhance your knowledge of the analysis of algorithms; if you would like to learn more, check out our articles on code360. Do upvote our blog to help other ninjas grow. Happy Coding!