Most Asked Sorting Algorithms for Coding Interviews: A Comprehensive Guide
Sorting is a foundational concept in computer science, and proficiency with sorting algorithms is a must for coding interviews. Whether you’re preparing for FAANG, startups, or any technical role, interviewers frequently test your understanding of sorting—not just to see if you can implement them, but to gauge your grasp of time/space complexity, trade-offs, and problem-solving intuition.
Sorting algorithms are used to arrange data in a specific order (e.g., ascending, descending) and form the backbone of many applications: from database indexing and search operations to data analysis and machine learning. In interviews, you might be asked to implement a sorting algorithm from scratch, optimize an existing one, or choose the best algorithm for a given scenario (e.g., “Sort a million integers” or “Sort a linked list”).
This blog breaks down the most commonly asked sorting algorithms in coding interviews. We’ll cover how each works, their time/space complexity, pros/cons, code examples, and real-world use cases. By the end, you’ll be equipped to tackle sorting questions with confidence.
Table of Contents#
- Bubble Sort
- Selection Sort
- Insertion Sort
- Merge Sort
- Quick Sort
- Heap Sort
- Counting Sort
- Radix Sort
- Comparison of Sorting Algorithms
- Interview Tips & Practice Problems
- Conclusion
- References
1. Bubble Sort#
Overview#
Bubble Sort is one of the simplest sorting algorithms, but it’s rarely used in practice due to inefficiency. However, it’s often taught to introduce core sorting concepts like comparison and swapping.
How It Works#
Bubble Sort repeatedly steps through the list, compares adjacent elements, and swaps them if they are in the wrong order. This process “bubbles” the largest unsorted element to its correct position at the end of the list with each pass.
Example: Sort [64, 34, 25, 12, 22, 11, 90]
- Pass 1: Compare 64 & 34 → swap → [34, 64, 25, 12, 22, 11, 90]; compare 64 & 25 → swap; ... → 90 bubbles to the end.
- Pass 2: The largest remaining unsorted element (64) bubbles to the second-to-last position.
- Continue until no swaps are needed (array is sorted).
Optimization: Add a flag to check if any swaps occurred in a pass. If not, the array is sorted, and we can exit early.
Complexity#
- Time:
- Best: O(n) (already sorted, with optimization).
- Average/Worst: O(n²) (worst case: reverse-sorted input).
- Space: O(1) (in-place sorting).
Pros & Cons#
- Pros: Simple to understand and implement; in-place; no extra space.
- Cons: Extremely inefficient for large datasets (O(n²) time); poor performance compared to other algorithms.
Code Example (Python)#
def bubble_sort(arr):
n = len(arr)
for i in range(n):
swapped = False
# Last i elements are already in place
for j in range(0, n-i-1):
if arr[j] > arr[j+1]:
arr[j], arr[j+1] = arr[j+1], arr[j] # Swap
swapped = True
if not swapped: # No swaps = sorted
break
return arr
# Example usage
arr = [64, 34, 25, 12, 22, 11, 90]
print(bubble_sort(arr)) # Output: [11, 12, 22, 25, 34, 64, 90]
2. Selection Sort#
Overview#
Selection Sort is another simple in-place sorting algorithm. It works by selecting the smallest (or largest) element from the unsorted portion and swapping it with the first unsorted element, gradually building the sorted array.
How It Works#
- Divide the array into sorted (left) and unsorted (right) portions.
- Iterate through the unsorted portion to find the minimum element.
- Swap the minimum element with the first element of the unsorted portion (expanding the sorted portion by 1).
- Repeat until the entire array is sorted.
Example: Sort [64, 34, 25, 12, 22, 11, 90]
- Pass 1: Min is 11 → swap with 64 → [11, 34, 25, 12, 22, 64, 90].
- Pass 2: Min in the unsorted portion [34, 25, 12, 22, 64, 90] is 12 → swap with 34 → [11, 12, 25, 34, 22, 64, 90].
- Continue until sorted.
Complexity#
- Time: O(n²) (best/average/worst) – always scans the entire unsorted portion.
- Space: O(1) (in-place).
Pros & Cons#
- Pros: Simple; in-place; fewer swaps than Bubble Sort (O(n) swaps vs. O(n²) for Bubble).
- Cons: Still O(n²) time; not stable (may change order of equal elements); inefficient for large datasets.
Code Example (Python)#
def selection_sort(arr):
n = len(arr)
for i in range(n):
# Find min element in unsorted array
min_idx = i
for j in range(i+1, n):
if arr[j] < arr[min_idx]:
min_idx = j
# Swap min element with first unsorted element
arr[i], arr[min_idx] = arr[min_idx], arr[i]
return arr
# Example usage
arr = [64, 34, 25, 12, 22, 11, 90]
print(selection_sort(arr)) # Output: [11, 12, 22, 25, 34, 64, 90]
3. Insertion Sort#
Overview#
Insertion Sort builds the sorted array one element at a time by inserting each element into its correct position in the sorted portion of the array. It’s similar to how you sort a hand of cards.
How It Works#
- Start with the first element as the sorted portion.
- For each subsequent element, compare it with the sorted portion and shift larger elements to the right to make space.
- Insert the element into its correct position.
Example: Sort [64, 34, 25, 12, 22, 11, 90]
- Pass 1: Insert 34 into [64] → [34, 64, 25, 12, 22, 11, 90].
- Pass 2: Insert 25 into [34, 64] → [25, 34, 64, 12, 22, 11, 90].
- Continue until all elements are inserted.
Complexity#
- Time:
- Best: O(n) (already sorted – no shifts needed).
- Average/Worst: O(n²) (worst case: reverse-sorted input).
- Space: O(1) (in-place).
Pros & Cons#
- Pros: Efficient for small or nearly sorted datasets (O(n) best case); stable; in-place.
- Cons: O(n²) time for large/unsorted data; not suitable for large arrays.
Code Example (Python)#
def insertion_sort(arr):
for i in range(1, len(arr)):
key = arr[i] # Element to insert
j = i - 1 # Last index of sorted portion
# Shift elements of sorted portion > key to the right
while j >= 0 and key < arr[j]:
arr[j + 1] = arr[j]
j -= 1
arr[j + 1] = key # Insert key in correct position
return arr
# Example usage
arr = [64, 34, 25, 12, 22, 11, 90]
print(insertion_sort(arr)) # Output: [11, 12, 22, 25, 34, 64, 90]
4. Merge Sort#
Overview#
Merge Sort is a divide-and-conquer algorithm. It splits the array into two halves, recursively sorts each half, and then merges the sorted halves into a single sorted array.
How It Works#
- Divide: Split the array into two halves until each subarray has 1 element (trivially sorted).
- Conquer: Recursively sort the left and right halves.
- Merge: Combine the sorted halves into a single sorted array using a helper function.
Example: Sort [64, 34, 25, 12, 22, 11, 90]
- Split into [64, 34, 25] and [12, 22, 11, 90], then keep splitting until each piece is a single element.
- Merge [34] & [25] → [25, 34]; merge with [64] → [25, 34, 64].
- Merge the right half → [11, 12, 22, 90].
- Merge [25, 34, 64] & [11, 12, 22, 90] → final sorted array [11, 12, 22, 25, 34, 64, 90].
Complexity#
- Time: O(n log n) (best/average/worst) – the array is halved across O(log n) levels, and each level does O(n) work to merge.
- Space: O(n) (requires extra space to store merged subarrays).
Pros & Cons#
- Pros: Stable (preserves order of equal elements); efficient for large datasets (O(n log n)); works well for linked lists, since merging can be done by relinking nodes instead of copying into an auxiliary array (see the sketch below).
- Cons: Extra O(n) space; not in-place.
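Because sorting a linked list with Merge Sort is a common interview follow-up (see LeetCode #148 in the practice problems below), here is a minimal sketch. It assumes a simple ListNode class, which is not defined in the original post; the Code Example that follows is the standard array version.
class ListNode:
    def __init__(self, val=0, next=None):
        self.val = val
        self.next = next

def merge_sort_list(head):
    if head is None or head.next is None:
        return head
    # Split the list into two halves with slow/fast pointers
    slow, fast = head, head.next
    while fast and fast.next:
        slow = slow.next
        fast = fast.next.next
    mid = slow.next
    slow.next = None
    left = merge_sort_list(head)
    right = merge_sort_list(mid)
    # Merge by relinking nodes: no auxiliary array needed
    dummy = tail = ListNode()
    while left and right:
        if left.val <= right.val:
            tail.next, left = left, left.next
        else:
            tail.next, right = right, right.next
        tail = tail.next
    tail.next = left or right
    return dummy.next

# Example usage: 4 -> 2 -> 1 -> 3 becomes 1 -> 2 -> 3 -> 4
head = ListNode(4, ListNode(2, ListNode(1, ListNode(3))))
sorted_head = merge_sort_list(head)
print(sorted_head.val, sorted_head.next.val) # Output: 1 2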
Code Example (Python)#
def merge_sort(arr):
if len(arr) > 1:
mid = len(arr) // 2 # Midpoint
left = arr[:mid] # Left half
right = arr[mid:] # Right half
merge_sort(left) # Sort left half
merge_sort(right) # Sort right half
# Merge sorted halves
i = j = k = 0
while i < len(left) and j < len(right):
if left[i] < right[j]:
arr[k] = left[i]
i += 1
else:
arr[k] = right[j]
j += 1
k += 1
# Copy remaining elements (if any)
while i < len(left):
arr[k] = left[i]
i += 1
k += 1
while j < len(right):
arr[k] = right[j]
j += 1
k += 1
return arr
# Example usage
arr = [64, 34, 25, 12, 22, 11, 90]
print(merge_sort(arr)) # Output: [11, 12, 22, 25, 34, 64, 90]
5. Quick Sort#
Overview#
Quick Sort is a divide-and-conquer algorithm that uses a pivot element to partition the array into two subarrays: elements less than the pivot and elements greater than the pivot. It then recursively sorts the subarrays.
How It Works#
- Choose a pivot: Select an element (e.g., last element, random element, or median).
- Partition: Rearrange the array so elements < pivot are on the left, elements > pivot are on the right.
- Recurse: Sort the left and right subarrays (excluding the pivot, which is now in its correct position).
Pivot Selection: To avoid worst-case O(n²) time (e.g., sorted array with last element as pivot), use strategies like:
- Random pivot selection.
- Median-of-three (first, middle, last elements); see the sketch after this list.
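The Code Example below keeps the simple last-element pivot. As a rough illustration of median-of-three, a helper like the following could be called just before partitioning; the function name is my own and not part of the original code.
def median_of_three_pivot(arr, low, high):
    # Sort arr[low], arr[mid], arr[high] among themselves so that
    # arr[mid] ends up holding their median, then move it to arr[high],
    # the slot the existing partition uses as the pivot.
    mid = (low + high) // 2
    if arr[low] > arr[mid]:
        arr[low], arr[mid] = arr[mid], arr[low]
    if arr[low] > arr[high]:
        arr[low], arr[high] = arr[high], arr[low]
    if arr[mid] > arr[high]:
        arr[mid], arr[high] = arr[high], arr[mid]
    arr[mid], arr[high] = arr[high], arr[mid]

# Usage: call median_of_three_pivot(arr, low, high) right before partition(arr, low, high)
# so the median of the three candidates, rather than the raw last element, becomes the pivot.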
Example: Sort [64, 34, 25, 12, 22, 11, 90] with pivot = 90 (last element).
- Partition: Elements < 90 → [64, 34, 25, 12, 22, 11]; pivot 90 stays at the end (already its correct position).
- Recurse on [64, 34, 25, 12, 22, 11] with pivot 11 → partition into [] and [64, 34, 25, 12, 22].
- Continue until sorted.
Complexity#
- Time:
- Average: O(n log n) (balanced partitions).
- Worst: O(n²) (unbalanced partitions, e.g., sorted array with bad pivot).
- Space:
- Average: O(log n) (recursion stack for balanced partitions).
- Worst: O(n) (recursion stack for unbalanced partitions).
Pros & Cons#
- Pros: In-place (O(log n) stack space); average O(n log n) time; efficient for large datasets.
- Cons: Not stable; worst-case O(n²) time (avoidable with good pivot selection); poor for linked lists.
Code Example (Python)#
def quick_sort(arr, low=0, high=None):
if high is None:
high = len(arr) - 1
def partition(arr, low, high):
pivot = arr[high] # Choose last element as pivot
i = low - 1 # Index of smaller element
for j in range(low, high):
if arr[j] <= pivot:
i += 1
arr[i], arr[j] = arr[j], arr[i] # Swap
arr[i + 1], arr[high] = arr[high], arr[i + 1] # Place pivot in correct position
return i + 1 # Pivot index
if low < high:
pi = partition(arr, low, high) # Partition index
quick_sort(arr, low, pi - 1) # Sort left subarray
quick_sort(arr, pi + 1, high) # Sort right subarray
return arr
# Example usage
arr = [64, 34, 25, 12, 22, 11, 90]
print(quick_sort(arr)) # Output: [11, 12, 22, 25, 34, 64, 90]
6. Heap Sort#
Overview#
Heap Sort uses a binary heap (a complete binary tree where, in a max-heap, every parent is greater than or equal to its children) to sort elements. It repeatedly extracts the maximum element from the heap and places it at the end of the array, building the sorted result from back to front.
How It Works#
- Build a max-heap: Convert the array into a max-heap (root is the largest element).
- Extract max elements: Swap the root (max element) with the last element of the heap, reduce the heap size by 1, and heapify the root.
- Repeat: Continue until the heap is empty; the array is now sorted.
Heapify: The process of converting a subtree into a heap (ensures parent nodes are larger than children).
Complexity#
- Time: O(n log n) (best/average/worst) – building the heap takes O(n), extracting elements takes O(n log n).
- Space: O(1) (in-place, if heapify is done iteratively).
Pros & Cons#
- Pros: In-place; O(n log n) time; efficient for large datasets.
- Cons: Not stable; slower than Quick Sort in practice (due to cache inefficiency); complex to implement.
Code Example (Python)#
def heap_sort(arr):
n = len(arr)
# Build max-heap
for i in range(n // 2 - 1, -1, -1):
heapify(arr, n, i)
# Extract elements from heap one by one
for i in range(n-1, 0, -1):
arr[i], arr[0] = arr[0], arr[i] # Swap root (max) with last element
heapify(arr, i, 0) # Heapify reduced heap
return arr
def heapify(arr, n, i):
largest = i # Initialize largest as root
left = 2 * i + 1
right = 2 * i + 2
# If left child is larger than root
if left < n and arr[left] > arr[largest]:
largest = left
# If right child is larger than largest so far
if right < n and arr[right] > arr[largest]:
largest = right
# If largest is not root
if largest != i:
arr[i], arr[largest] = arr[largest], arr[i] # Swap
heapify(arr, n, largest) # Recursively heapify the affected sub-tree
# Example usage
arr = [64, 34, 25, 12, 22, 11, 90]
print(heap_sort(arr)) # Output: [11, 12, 22, 25, 34, 64, 90]
7. Counting Sort (Non-Comparison-Based)#
Overview#
Counting Sort is a non-comparison-based algorithm designed for integers with a small range (e.g., 0 to k). It counts the occurrences of each integer and uses these counts to determine their positions in the sorted array.
How It Works#
- Count occurrences: Create a count array where count[i] is the number of times i appears in the input.
- Compute cumulative counts: Modify the count array to store cumulative counts (number of elements ≤ i).
- Place elements: Iterate over the input array in reverse, placing each element in its correct position using the cumulative count array.
Example: Sort [4, 2, 2, 8, 3, 3, 1] (range 1-8).
- Count array: [0, 1, 2, 2, 1, 0, 0, 0, 1] (indices 0-8).
- Cumulative counts: [0, 1, 3, 5, 6, 6, 6, 6, 7].
- Place elements in reverse: 1 → index 0, 3 → index 4, etc.
Complexity#
- Time: O(n + k), where n = number of elements, k = range of integers.
- Space: O(n + k) (for count array and output array).
Pros & Cons#
- Pros: Linear time (faster than O(n log n) for small k); stable.
- Cons: Only works for integers; inefficient if k is large (e.g., k = 10⁶ for n = 10³).
Code Example (Python)#
def counting_sort(arr):
if not arr:
return arr
max_val = max(arr)
min_val = min(arr)
range_val = max_val - min_val + 1
# Initialize count array and output array
count = [0] * range_val
output = [0] * len(arr)
# Store count of each element
for num in arr:
count[num - min_val] += 1
# Compute cumulative count (prefix sum)
for i in range(1, len(count)):
count[i] += count[i-1]
# Place elements in output array (reverse to maintain stability)
for num in reversed(arr):
output[count[num - min_val] - 1] = num
count[num - min_val] -= 1
# Copy output back to original array
for i in range(len(arr)):
arr[i] = output[i]
return arr
# Example usage
arr = [4, 2, 2, 8, 3, 3, 1]
print(counting_sort(arr)) # Output: [1, 2, 2, 3, 3, 4, 8]
8. Radix Sort (Non-Comparison-Based)#
Overview#
Radix Sort sorts integers by processing their digits from least significant to most significant (LSD) or vice versa (MSD). It uses Counting Sort as a subroutine to sort digits.
How It Works#
- Find the maximum element: Determine the number of digits (d) in the largest element.
- Sort by each digit: For each digit position (from 0 to d-1), use Counting Sort to sort the array based on that digit.
Example: Sort [170, 45, 75, 90, 802, 24, 2, 66] (LSD).
- Digit 0 (units place): Sort by units → [170, 90, 802, 2, 24, 45, 75, 66].
- Digit 1 (tens place): Sort by tens → [802, 2, 24, 45, 66, 170, 75, 90].
- Digit 2 (hundreds place): Sort by hundreds → [2, 24, 45, 66, 75, 90, 170, 802].
Complexity#
- Time: O(d*(n + k)), where d = number of digits, k = range of digits (0-9 for decimal).
- Space: O(n + k).
Pros & Cons#
- Pros: Linear time for fixed d (e.g., 32-bit integers); stable.
- Cons: Only works for integers (or fixed-length keys such as equal-length strings); requires Counting Sort (or another stable sort) as a subroutine.
Code Example (Python)#
def radix_sort(arr):
if not arr:
return arr
max_val = max(arr)
digit = 1 # Start with least significant digit
while max_val // digit > 0:
counting_sort_by_digit(arr, digit)
digit *= 10
return arr
def counting_sort_by_digit(arr, digit):
n = len(arr)
output = [0] * n
count = [0] * 10 # Digits 0-9
# Store count of digits
for num in arr:
index = (num // digit) % 10
count[index] += 1
# Compute cumulative count
for i in range(1, 10):
count[i] += count[i-1]
# Place elements in output array (reverse for stability)
for num in reversed(arr):
index = (num // digit) % 10
output[count[index] - 1] = num
count[index] -= 1
# Copy output to original array
for i in range(n):
arr[i] = output[i]
# Example usage
arr = [170, 45, 75, 90, 802, 24, 2, 66]
print(radix_sort(arr)) # Output: [2, 24, 45, 66, 75, 90, 170, 802]
9. Comparison of Sorting Algorithms#
| Algorithm | Best Time | Avg Time | Worst Time | Space | Stable? | In-Place? | Use Case |
|---|---|---|---|---|---|---|---|
| Bubble Sort | O(n) | O(n²) | O(n²) | O(1) | Yes | Yes | Small/nearly sorted datasets |
| Selection Sort | O(n²) | O(n²) | O(n²) | O(1) | No | Yes | Small datasets, minimal swaps needed |
| Insertion Sort | O(n) | O(n²) | O(n²) | O(1) | Yes | Yes | Small/nearly sorted datasets, linked lists |
| Merge Sort | O(n log n) | O(n log n) | O(n log n) | O(n) | Yes | No | Large datasets, linked lists |
| Quick Sort | O(n log n) | O(n log n) | O(n²) | O(log n) | No | Yes | Large datasets, average-case performance |
| Heap Sort | O(n log n) | O(n log n) | O(n log n) | O(1) | No | Yes | Large datasets, in-place requirement |
| Counting Sort | O(n + k) | O(n + k) | O(n + k) | O(n + k) | Yes | No | Small integer ranges (e.g., 0-100) |
| Radix Sort | O(d(n+k)) | O(d(n+k)) | O(d(n+k)) | O(n + k) | Yes | No | Fixed-digit integers/strings (e.g., phone numbers) |
10. Interview Tips & Practice Problems#
Key Interview Takeaways#
- Understand trade-offs: Interviewers often ask, “Why use Quick Sort over Merge Sort?” (Answer: Quick Sort is in-place with better cache performance; Merge Sort is stable and guarantees O(n log n) in the worst case).
- Stability matters: If preserving the order of equal elements is required (e.g., sorting objects by key), choose stable algorithms (Merge Sort, Insertion Sort, Counting Sort); see the example after this list.
- In-place vs. extra space: For memory-constrained systems, use in-place algorithms (Quick Sort, Heap Sort, Insertion Sort).
- Pivot selection: For Quick Sort, always mention random/median pivot to avoid worst-case O(n²) time.
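To make the stability point above concrete, here is a small added example (not from the original post); it relies only on Python’s built-in sorted, which is guaranteed to be stable.
# Records sorted by score; ties keep their original relative order
records = [("alice", 90), ("bob", 85), ("carol", 90), ("dave", 85)]
by_score = sorted(records, key=lambda r: r[1])
print(by_score)
# [('bob', 85), ('dave', 85), ('alice', 90), ('carol', 90)]
# bob stays before dave and alice before carol because sorted() is stable;
# an unstable sort (e.g., Selection Sort or Heap Sort) could reorder the ties.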
Practice Problems#
- Implement Merge Sort for a linked list (LeetCode #148).
- Sort an array of 0s, 1s, and 2s (Dutch National Flag Problem, LeetCode #75).
- Find the k-th largest element using Quick Select (LeetCode #215); see the sketch after this list.
- Sort a nearly sorted (k-sorted) array.
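For the Quick Select problem above, here is a minimal sketch reusing the same Lomuto-style partition as the Quick Sort section, with a random pivot to avoid the O(n²) worst case on sorted input. The kth_largest name and the example values are mine, not from the original post; on average it runs in O(n) time, versus O(n log n) for fully sorting the array.
import random

def kth_largest(arr, k):
    # The k-th largest element sits at index len(arr) - k in sorted order
    target = len(arr) - k
    low, high = 0, len(arr) - 1
    while True:
        # Random pivot, moved to the end so the Lomuto partition can use arr[high]
        pivot_idx = random.randint(low, high)
        arr[pivot_idx], arr[high] = arr[high], arr[pivot_idx]
        pivot = arr[high]
        i = low - 1
        for j in range(low, high):
            if arr[j] <= pivot:
                i += 1
                arr[i], arr[j] = arr[j], arr[i]
        arr[i + 1], arr[high] = arr[high], arr[i + 1]
        pi = i + 1
        if pi == target:
            return arr[pi]
        elif pi < target:
            low = pi + 1   # Target lies in the right part
        else:
            high = pi - 1  # Target lies in the left part

# Example usage
print(kth_largest([3, 2, 1, 5, 6, 4], 2)) # Output: 5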
11. Conclusion#
Sorting algorithms are the building blocks of efficient problem-solving in coding interviews. By mastering the algorithms covered here—from simple O(n²) ones like Bubble Sort to advanced O(n log n) algorithms like Quick Sort and non-comparison-based methods like Radix Sort—you’ll be well-prepared to tackle any sorting-related question.
Remember, the goal isn’t just to memorize code but to understand why each algorithm works, its trade-offs, and when to apply it. Practice implementing these algorithms by hand and analyzing their performance—this will help you ace your next technical interview!
12. References#
- Cormen, T. H., Leiserson, C. E., Rivest, R. L., & Stein, C. (2009). Introduction to Algorithms (3rd ed.). MIT Press.
- GeeksforGeeks. “Sorting Algorithms.” https://www.geeksforgeeks.org/sorting-algorithms/
- LeetCode. “Sorting Tag.” https://leetcode.com/tag/sort/
- Wikipedia. “Sorting Algorithm.” https://en.wikipedia.org/wiki/Sorting_algorithm