Module 7.1

Master Sorting Algorithms

Learn fundamental and advanced sorting techniques in Java. From simple comparison-based sorts to efficient linear-time algorithms, understand time complexity, space efficiency, and when to use each approach for optimal performance.

60 min read
Intermediate
12+ code examples
What You'll Learn
  • Comparison-based O(n log n) sorts
  • Simple O(n²) sorting techniques
  • Non-comparison linear-time sorts
  • Time complexity analysis and tradeoffs
  • In-place vs stable sorting properties
Contents
01

Overview & Complexity Comparison

Sorting is one of the most fundamental algorithms in computer science. Different sorting algorithms have different time complexities, space requirements, and use cases. Understanding these tradeoffs helps you choose the right algorithm for your problem.

Complete Sorting Algorithm Reference
| Algorithm      | Best Case  | Average Case | Worst Case | Space    | Stable | In-Place |
| Bubble Sort    | O(n)       | O(n²)        | O(n²)      | O(1)     | Yes    | Yes      |
| Selection Sort | O(n²)      | O(n²)        | O(n²)      | O(1)     | No     | Yes      |
| Insertion Sort | O(n)       | O(n²)        | O(n²)      | O(1)     | Yes    | Yes      |
| Merge Sort     | O(n log n) | O(n log n)   | O(n log n) | O(n)     | Yes    | No       |
| Quick Sort     | O(n log n) | O(n log n)   | O(n²)      | O(log n) | No     | Yes      |
| Heap Sort      | O(n log n) | O(n log n)   | O(n log n) | O(1)     | No     | Yes      |
| Counting Sort  | O(n+k)     | O(n+k)       | O(n+k)     | O(k)     | Yes    | No       |
| Radix Sort     | O(d·n)     | O(d·n)       | O(d·n)     | O(n+k)   | Yes    | No       |
Legend: k = range of values, d = number of digits, n = array size, log n = base 2 logarithm

Time Complexity Lower Bound

For comparison-based sorting algorithms, the theoretical lower bound is Ω(n log n). This means no comparison-based algorithm can sort faster than Θ(n log n) in the general case. Non-comparison sorts like Counting Sort and Radix Sort sidestep this bound by exploiting properties of the input data.
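The Ω(n log n) bound follows from a decision-tree argument, sketched here:

```latex
% Any comparison sort corresponds to a binary decision tree whose
% leaves are the n! possible input orderings. A binary tree of
% height h has at most 2^h leaves, so:
2^h \ge n! \quad\Longrightarrow\quad h \ge \log_2(n!) = \Theta(n \log n)
% where the last step uses Stirling's approximation:
% \log_2(n!) = n \log_2 n - n \log_2 e + O(\log n)
```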

Stability

A sorting algorithm is stable if it preserves the relative order of equal elements. This matters when sorting objects with multiple fields.
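Stability is easy to observe in Java, where `Arrays.sort` on object arrays is documented to be a stable merge sort (TimSort). A minimal sketch — the `Student` record and its fields are made up for illustration (records require Java 16+):

```java
import java.util.Arrays;
import java.util.Comparator;

public class StabilityDemo {
    // Hypothetical record type for illustration.
    record Student(String id, int age) {}

    // Sort by age only; a stable sort keeps equal-age students
    // in their original relative order.
    static Student[] sortByAge(Student[] students) {
        Student[] copy = students.clone();
        // Arrays.sort on objects is guaranteed stable by the Java docs.
        Arrays.sort(copy, Comparator.comparingInt(Student::age));
        return copy;
    }

    public static void main(String[] args) {
        Student[] input = {
            new Student("S1", 21), new Student("S2", 19), new Student("S3", 21)
        };
        for (Student s : sortByAge(input)) {
            System.out.println(s.id() + " age " + s.age());
        }
        // S1 and S3 share age 21 — S1 still appears before S3.
    }
}
```

Note that `Arrays.sort` on *primitive* arrays uses a dual-pivot quicksort instead, where stability is irrelevant because equal primitives are indistinguishable.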

In-Place

An in-place algorithm sorts using O(1) extra space (excluding input array). This is crucial for memory-constrained environments.

Example: Different Algorithm Performance
public class SortingComparison {
  public static void main(String[] args) {
    int[] data = new int[1000000];
    // Rough, order-of-magnitude figures for 1 million elements:
    
    // Bubble Sort: O(n²) ≈ 10¹² operations — minutes of work
    // Merge Sort:  O(n log n) ≈ 2×10⁷ operations — hundredths of a second
    // Difference: roughly 50,000x fewer operations!
    
    // The gap widens rapidly with array size:
    // for 10M elements it is roughly 400,000x
  }
}
Practice Questions

Question: You need to sort an array of 50,000 student records by age. The records also have student ID, and records with the same age must preserve their original order. Which algorithm is most suitable and why?


Solution: Merge Sort is ideal because:

  • Guarantees O(n log n) for large dataset (50,000 items)
  • Stable sorting preserves original order for equal ages
  • Consistent performance regardless of input order
  • Extra O(n) space is acceptable for this scenario

Question: You have 1,000,000 integers ranging from 0-100. Compare time complexity of Quick Sort vs Counting Sort for this input.


Solution:

  • Quick Sort: O(n log n) average = O(1,000,000 × 20) = 20 million operations
  • Counting Sort: O(n+k) = O(1,000,000 + 100) = ~1 million operations
  • Winner: Counting Sort is 20x faster for this specific case!
  • Key insight: Non-comparison sorts excel when input range is small

Question: You're sorting employee records by salary. Two employees have the same salary but different hire dates. The algorithm must preserve hire date order. Which algorithms can you use?


Solution: You must use stable sorting algorithms:

  • Merge Sort: ✓ Stable - Perfect choice
  • Counting Sort: ✓ Stable - If salary range is small
  • Insertion Sort: ✓ Stable - For small datasets
  • Quick Sort: ✗ Unstable - Cannot guarantee order
  • Heap Sort: ✗ Unstable - Will break tie ordering

Question: You're building a database index for 10 million integer IDs with limited RAM (100MB). Which sort would you choose and why?


Solution: This is a trade-off scenario:

  • Best choice: Quick Sort (despite its O(n²) worst case, which a random pivot makes vanishingly unlikely) because:
  • Uses only an O(log n) recursion stack — minimal extra space
  • 10M × 4 bytes = 40MB for the integers; a ~24-frame recursion stack is negligible, so ≈40MB total
  • Merge Sort rejected: needs a 40MB auxiliary array ≈ 80MB total, leaving almost no headroom in the 100MB budget
  • Key lesson: In-place algorithms matter in space-constrained environments
02

Simple Comparison Sorts

These classic sorting algorithms are easy to understand and implement, making them ideal for learning fundamental sorting concepts. Although they have quadratic time complexity, they demonstrate important principles used in more advanced algorithms.

Bubble Sort

Bubble Sort repeatedly steps through the array, compares adjacent elements, and swaps them if they're in the wrong order. The largest unsorted element "bubbles" to its correct position in each pass.

Bubble Sort Implementation
void bubbleSort(int[] arr) {
  int n = arr.length;
  for (int i = 0; i < n - 1; i++) {
    boolean swapped = false;
    for (int j = 0; j < n - 1 - i; j++) {
      if (arr[j] > arr[j + 1]) {
        // Swap
        int temp = arr[j];
        arr[j] = arr[j + 1];
        arr[j + 1] = temp;
        swapped = true;
      }
    }
    if (!swapped) break; // Optimization: already sorted
  }
  // Time: O(n²), Space: O(1), Stable: Yes
}
Practice Questions

Question: Trace bubble sort on array [5, 2, 8, 1]. Show each pass.


Solution:

  • Initial: [5, 2, 8, 1]
  • Pass 1: [2, 5, 1, 8] - 8 is in place
  • Pass 2: [2, 1, 5, 8] - 5 is in place
  • Pass 3: [1, 2, 5, 8] - sorted!

Question: Trace selection sort on [4, 2, 7, 1, 3]. Show each iteration.


Solution:

  • Initial: [4, 2, 7, 1, 3]
  • i=0: Find min (1), swap: [1, 2, 7, 4, 3]
  • i=1: Find min (2), no swap: [1, 2, 7, 4, 3]
  • i=2: Find min (3), swap: [1, 2, 3, 4, 7]
  • i=3: Find min (4), no swap: [1, 2, 3, 4, 7]

Question: How can you optimize Bubble Sort to detect already-sorted arrays?


Solution: Use a flag to track if any swaps occur:

  • If a complete pass happens with no swaps, array is sorted
  • Set `swapped = false` at pass start
  • Set `swapped = true` only when swaps occur
  • If `swapped` is still false after a pass, break early
  • Benefit: Reduces best-case from O(n²) to O(n) for sorted arrays

Question: Compare Bubble, Selection, and Insertion sorts on: (1) tiny array [10 elements], (2) nearly-sorted [1000 elements], (3) reverse-sorted [1000 elements].


Solution Summary:

  • Tiny array: All perform similarly (~100 ops), but Insertion Sort has lowest constant factor
  • Nearly-sorted: Insertion Sort wins (O(n) best-case), Selection Sort worst (always O(n²))
  • Reverse-sorted: all degrade to O(n²); Insertion Sort performs the maximum number of shifts per element
  • General rule: among the simple sorts, prefer Insertion Sort unless the input characteristics favor another
Insertion Sort

Insertion Sort builds the sorted array one item at a time. It's like sorting playing cards in your hand - you take one card and insert it into the correct position among already-sorted cards.

Insertion Sort Implementation
void insertionSort(int[] arr) {
  int n = arr.length;
  for (int i = 1; i < n; i++) {
    int key = arr[i];
    int j = i - 1;
    
    // Shift elements greater than key one position right
    while (j >= 0 && arr[j] > key) {
      arr[j + 1] = arr[j];
      j--;
    }
    // Insert key at correct position
    arr[j + 1] = key;
  }
  // Time: O(n²), Space: O(1), Stable: Yes
}
When to Use Simple Sorts: Bubble Sort, Selection Sort, and Insertion Sort are practical for small arrays (< 50 elements) or nearly-sorted data. Insertion Sort is especially efficient for small or sorted arrays due to its O(n) best-case complexity.

Question: How can you optimize Insertion Sort for nearly-sorted data?


Solution: Insertion Sort already has O(n) best-case complexity for nearly-sorted data!

  • Each element requires minimal shifting when data is nearly sorted
  • The while loop executes very few iterations per element
  • For sorted arrays, Insertion Sort beats O(n log n) algorithms with much lower overhead
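This adaptive behavior can be measured directly by counting shifts. A minimal sketch — the `AdaptiveDemo` class and `shifts` helper are made up for illustration:

```java
public class AdaptiveDemo {
    // Run insertion sort on a copy of the array's contents and
    // return the number of element shifts performed.
    static long shifts(int[] a) {
        long count = 0;
        for (int i = 1; i < a.length; i++) {
            int key = a[i], j = i - 1;
            while (j >= 0 && a[j] > key) {
                a[j + 1] = a[j];   // one shift
                j--;
                count++;
            }
            a[j + 1] = key;
        }
        return count;
    }

    public static void main(String[] args) {
        int n = 1000;
        int[] nearlySorted = new int[n], reversed = new int[n];
        for (int i = 0; i < n; i++) { nearlySorted[i] = i; reversed[i] = n - i; }
        // Introduce a single out-of-order pair.
        int t = nearlySorted[0]; nearlySorted[0] = nearlySorted[1]; nearlySorted[1] = t;

        System.out.println("nearly sorted: " + shifts(nearlySorted) + " shifts"); // 1
        System.out.println("reversed:      " + shifts(reversed) + " shifts");     // 499500 = n(n-1)/2
    }
}
```

The shift count equals the number of inversions, which is exactly the adaptive measure described above.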

Question: Trace Insertion Sort on [5, 3, 8, 4]. Show array after each insertion.


Solution:

  • Initial: [5, 3, 8, 4]
  • i=1 (insert 3): Compare 3<5, shift 5 → [3, 5, 8, 4]
  • i=2 (insert 8): 8>5, no shift → [3, 5, 8, 4]
  • i=3 (insert 4): 4<8, shift 8; 4<5, shift 5 → [3, 4, 5, 8]

Question: Can binary search optimize Insertion Sort? If so, how much?


Solution: Binary search can find position in O(log n) but doesn't help overall:

  • Comparisons: Reduced from O(n²) to O(n log n) ✓
  • Shifts: Still O(n²) in worst case - shifting takes more time than comparisons!
  • Result: Time complexity remains O(n²) due to shift operations
  • Lesson: Not all optimizations matter - shifts dominate cost
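The point about shifts dominating can be seen in a binary insertion sort sketch (class and method names are illustrative): binary search finds the insertion point in O(log n) comparisons, but the shift is still O(n) per insertion in the worst case:

```java
import java.util.Arrays;

public class BinaryInsertionSort {
    static void sort(int[] a) {
        for (int i = 1; i < a.length; i++) {
            int key = a[i];
            // Binary search for the insertion position in a[0..i).
            int lo = 0, hi = i;
            while (lo < hi) {
                int mid = (lo + hi) >>> 1;
                if (a[mid] <= key) lo = mid + 1;  // <= keeps equal keys stable
                else hi = mid;
            }
            // The shift is still linear — this dominates the running time.
            System.arraycopy(a, lo, a, lo + 1, i - lo);
            a[lo] = key;
        }
    }

    public static void main(String[] args) {
        int[] data = {5, 3, 8, 4, 1};
        sort(data);
        System.out.println(Arrays.toString(data)); // [1, 3, 4, 5, 8]
    }
}
```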

Question: Why is Insertion Sort considered "adaptive" and when does this matter in practice?


Solution: Insertion Sort is adaptive because its performance depends on input disorder:

  • Adaptive measure: Number of inversions (pairs out of order)
  • Best case (O(n)): 0 inversions (already sorted)
  • Worst case (O(n²)): n(n-1)/2 inversions (reverse sorted)
  • Real-world use: Python's Timsort combines Insertion Sort + Merge Sort exactly for this reason
  • When it matters: Online sorting, semi-sorted streams, nearly-sorted databases
03

Merge Sort

Merge Sort is a divide-and-conquer algorithm that guarantees O(n log n) time complexity in all cases. It divides the array into smaller subarrays, sorts them recursively, and merges them back together maintaining order.

Divide and Conquer Strategy

Merge Sort uses the divide-and-conquer paradigm: divide the problem into smaller subproblems, solve them independently, and combine their solutions. This approach guarantees O(n log n) even in the worst case.

Merge Sort Implementation
void mergeSort(int[] arr, int left, int right) {
  if (left < right) {
    int mid = left + (right - left) / 2;
    mergeSort(arr, left, mid);        // Sort left half
    mergeSort(arr, mid + 1, right);   // Sort right half
    merge(arr, left, mid, right);     // Merge sorted halves
  }
}

void merge(int[] arr, int left, int mid, int right) {
  int leftSize = mid - left + 1;
  int rightSize = right - mid;
  
  int[] leftArr = new int[leftSize];
  int[] rightArr = new int[rightSize];
  
  // Copy data to temporary arrays
  for (int i = 0; i < leftSize; i++)
    leftArr[i] = arr[left + i];
  for (int j = 0; j < rightSize; j++)
    rightArr[j] = arr[mid + 1 + j];
  
  // Merge back with sorted order
  int i = 0, j = 0, k = left;
  while (i < leftSize && j < rightSize) {
    if (leftArr[i] <= rightArr[j]) {
      arr[k++] = leftArr[i++];
    } else {
      arr[k++] = rightArr[j++];
    }
  }
  
  // Copy remaining elements
  while (i < leftSize) arr[k++] = leftArr[i++];
  while (j < rightSize) arr[k++] = rightArr[j++];
}
Visual: Merge Sort Process on [38, 27, 43, 3]
DIVIDE Phase
[38 27 43 3] → [38 27] [43 3] → [38] [27] [43] [3] — split until single elements
MERGE Pairs
[38] + [27] → [27 38]    [43] + [3] → [3 43] — compare and merge
FINAL Merge
[27 38] + [3 43] → [3 27 38 43] — sorted array!
Practice Questions

Question: Why does Merge Sort have O(n log n) complexity in all cases?


Solution: Due to its recursive structure:

  • Depth: The recursion tree has height log n (dividing array in half each time)
  • Work per level: At each level, we merge O(n) elements total
  • Total: log n levels × O(n) work per level = O(n log n)
  • Key: This holds regardless of input order - no best/worst case distinction!

Question: Merge Sort uses O(n) extra space. Is there a way to merge in-place?

Show Solution

Solution: In-place merging is theoretically possible but impractical:

  • Standard in-place merge requires O(n log n) comparisons and O(n²) moves
  • Defeats the purpose - you lose the speed advantage of Merge Sort
  • Practical conclusion: Use O(n) space - it's worth it for guaranteed O(n log n) time

Question: Trace the merge step for [2, 5] and [1, 6]. Show how elements are placed in output.


Solution:

  • Left: [2, 5], Right: [1, 6]
  • Compare 2 vs 1: 1<2, output [1], advance right
  • Compare 2 vs 6: 2<6, output [1,2], advance left
  • Compare 5 vs 6: 5<6, output [1,2,5], advance left
  • Left empty, copy right: [1,2,5,6]
  • Comparisons needed: 3 — the maximum for merging lists of sizes 2 and 2 (m + n − 1 = 3)

Question: Use Master Theorem to prove Merge Sort's T(n) = 2T(n/2) + O(n) = O(n log n).


Solution: Apply Master Theorem: T(n) = aT(n/b) + f(n)

  • a=2, b=2, f(n)=O(n)
  • Compare n^log_b(a) vs f(n): n^log₂(2) = n^1 vs O(n)
  • Equal case (Case 2): n^1 = O(n)
  • Result: T(n) = O(n^log₂(2) × log n) = O(n log n) ✓
  • Key insight: Equal work per recursion level × log n levels
04

Quick Sort

Quick Sort is another divide-and-conquer algorithm that uses a pivot element to partition the array. It has O(n log n) average-case complexity and is widely used in practice due to excellent cache locality and in-place sorting capability.

Partitioning Strategy

Quick Sort selects a pivot element and rearranges the array so that all elements smaller than the pivot are on the left, and all larger elements are on the right. This places the pivot in its final sorted position.

Quick Sort Implementation (Lomuto Partition)
void quickSort(int[] arr, int low, int high) {
  if (low < high) {
    int pi = partition(arr, low, high);
    quickSort(arr, low, pi - 1);   // Sort left of pivot
    quickSort(arr, pi + 1, high);  // Sort right of pivot
  }
}

int partition(int[] arr, int low, int high) {
  int pivot = arr[high];
  int i = low - 1;
  
  for (int j = low; j < high; j++) {
    if (arr[j] < pivot) {
      i++;
      // Swap arr[i] and arr[j]
      int temp = arr[i];
      arr[i] = arr[j];
      arr[j] = temp;
    }
  }
  
  // Place pivot in correct position
  int temp = arr[i + 1];
  arr[i + 1] = arr[high];
  arr[high] = temp;
  
  return i + 1;
}
Visual: Quick Sort Partitioning on [3, 7, 2, 8, 1, 9]
CHOOSE Pivot
[3 7 2 8 1 | 9] — rightmost element chosen as pivot: 9
PARTITION
Every element is less than 9, so nothing moves: [3 7 2 8 1] 9 [] — the pivot lands in its final sorted position (index 5)
RECURSE
Recursively sort the left part [3 7 2 8 1] → [1 2 3 7 8]; final: [1 2 3 7 8 9]. An extreme pivot like this yields the worst-case split — one motivation for randomized pivot selection.
Randomized Quick Sort: To avoid worst-case O(n²) with poor pivot selection, choose a random pivot instead of always using the last element. This makes worst-case behavior extremely unlikely in practice.
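A minimal sketch of the randomized variant, reusing the Lomuto partition shown above (the class name is illustrative):

```java
import java.util.Arrays;
import java.util.concurrent.ThreadLocalRandom;

public class RandomizedQuickSort {
    static void quickSort(int[] arr, int low, int high) {
        if (low < high) {
            // Swap a randomly chosen element into the pivot slot,
            // then partition exactly as before.
            int r = ThreadLocalRandom.current().nextInt(low, high + 1);
            int t = arr[r]; arr[r] = arr[high]; arr[high] = t;

            int pi = partition(arr, low, high);
            quickSort(arr, low, pi - 1);
            quickSort(arr, pi + 1, high);
        }
    }

    // Lomuto partition, identical to the version above.
    static int partition(int[] arr, int low, int high) {
        int pivot = arr[high];
        int i = low - 1;
        for (int j = low; j < high; j++) {
            if (arr[j] < pivot) {
                i++;
                int t = arr[i]; arr[i] = arr[j]; arr[j] = t;
            }
        }
        int t = arr[i + 1]; arr[i + 1] = arr[high]; arr[high] = t;
        return i + 1;
    }

    public static void main(String[] args) {
        // Sorted input: the worst case for a fixed rightmost pivot,
        // but harmless once the pivot is randomized.
        int[] data = {1, 2, 3, 4, 5, 6, 7, 8};
        quickSort(data, 0, data.length - 1);
        System.out.println(Arrays.toString(data)); // [1, 2, 3, 4, 5, 6, 7, 8]
    }
}
```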
Practice Questions

Question: Trace partition on [5, 2, 8, 1, 9] with pivot 9. Show partition state after each comparison.


Solution (Lomuto partition, pivot = 9):

  • Initial: [5, 2, 8, 1, |9], i=-1
  • Compare 5 < 9: i=0, swap arr[0] with itself → [5, 2, 8, 1, 9]
  • Compare 2 < 9: i=1, swap arr[1] with itself → unchanged
  • Compare 8 < 9: i=2, swap arr[2] with itself → unchanged
  • Compare 1 < 9: i=3, swap arr[3] with itself → unchanged
  • Place pivot: swap arr[4] with itself → [5, 2, 8, 1, 9], pivot at index 4
  • Every element is smaller than the pivot, so each "swap" is a self-swap — exactly the degenerate split behind Quick Sort's O(n²) worst case

Question: What input causes Quick Sort to degrade to O(n²)? How can you prevent it?

Show Solution

Solution:

  • Worst case: already sorted array with rightmost pivot — each partition splits off only one element
  • Result: n + (n-1) + (n-2)... = O(n²) operations
  • Prevention: Use randomized pivot selection or median-of-three to improve average case
  • Trade-off: Extra pivot selection time is worth O(n log n) guarantee in practice

Question: Compare 3 pivot strategies: first element, last element, median-of-three. What are trade-offs?

Show Solution

Solution:

  • First/Last element: O(1) time, but vulnerable to sorted/reverse-sorted inputs
  • Median-of-three: Compare arr[0], arr[n/2], arr[n-1], choose middle value
  • Finds better pivots, avoids worst-case on sorted arrays
  • Cost: 2 extra comparisons (O(1) → still O(1) overhead)
  • Random pivot: best theoretical guarantee, O(log n) expected recursion depth
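The median-of-three heuristic mentioned above can be sketched as follows (class and method names are illustrative):

```java
public class MedianOfThree {
    // Return the index of the median of arr[lo], arr[mid], arr[hi] —
    // a common pivot-selection heuristic for Quick Sort.
    static int medianIndex(int[] arr, int lo, int mid, int hi) {
        int a = arr[lo], b = arr[mid], c = arr[hi];
        if ((a <= b && b <= c) || (c <= b && b <= a)) return mid;
        if ((b <= a && a <= c) || (c <= a && a <= b)) return lo;
        return hi;
    }

    public static void main(String[] args) {
        // On sorted input the median of first/middle/last is the true
        // median, giving a balanced split instead of the worst case.
        int[] sorted = {1, 2, 3, 4, 5};
        System.out.println(medianIndex(sorted, 0, 2, 4)); // 2
    }
}
```

A Quick Sort would swap the element at the returned index into the pivot slot before partitioning, at the cost of two extra comparisons per call.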

Question: Prove that random pivot selection gives O(n log n) expected time using recurrence analysis.


Solution: With a random pivot the split is balanced in expectation, and the simplified recurrence captures the behavior:

  • Recurrence: E[T(n)] = O(n) + E[T(n/2)] + E[T(n/2)]
  • Simplified: E[T(n)] = O(n) + 2E[T(n/2)]
  • Expand: O(n) + 2(O(n/2) + 2E[T(n/4)]) = O(n) + O(n) + 4E[T(n/4)]
  • Pattern: log n levels of O(n) work each
  • Result: E[T(n)] = O(n log n) ✓
05

Heap Sort & Non-Comparison Sorts

Heap Sort guarantees O(n log n) time with O(1) extra space. Non-comparison sorts like Counting Sort and Radix Sort can achieve linear time by exploiting properties of the input data, making them invaluable for specific scenarios.

Heap Sort

Heap Sort uses a max-heap data structure to achieve O(n log n) guaranteed performance with O(1) space. It builds a heap, then repeatedly extracts the maximum element and places it at the end of the sorted array.

Heap Sort Implementation
void heapSort(int[] arr) {
  int n = arr.length;
  
  // Build max heap
  for (int i = n / 2 - 1; i >= 0; i--) {
    heapify(arr, n, i);
  }
  
  // Extract elements from heap one by one
  for (int i = n - 1; i > 0; i--) {
    // Swap root (max) to end
    int temp = arr[0];
    arr[0] = arr[i];
    arr[i] = temp;
    
    // Heapify reduced heap
    heapify(arr, i, 0);
  }
}

void heapify(int[] arr, int n, int i) {
  int largest = i;
  int left = 2 * i + 1;
  int right = 2 * i + 2;
  
  if (left < n && arr[left] > arr[largest])
    largest = left;
  if (right < n && arr[right] > arr[largest])
    largest = right;
  
  if (largest != i) {
    // Swap and continue heapifying
    int temp = arr[i];
    arr[i] = arr[largest];
    arr[largest] = temp;
    heapify(arr, n, largest);
  }
}
Visual: Heap Sort Process on [3, 7, 2, 8, 1]
BUILD Max-Heap
[3, 7, 2, 8, 1] → [8, 7, 2, 3, 1] — heapify bottom-up until every parent ≥ its children
EXTRACT Max
Swap the root with the last heap element, shrink the heap by one, and re-heapify; repeat. Extraction order: 8 → 7 → 3 → 2 → 1
RESULT
[1, 2, 3, 7, 8] — Time: O(n log n), Space: O(1), in-place sorting!

Question: Why might Heap Sort be preferred over Merge Sort despite both being O(n log n)?


Solution: Heap Sort advantages:

  • Space: O(1) vs Merge Sort's O(n)
  • In-place: modifies the input array without any extra allocation
  • Memory: avoiding a large auxiliary array matters for huge datasets and constrained environments
  • Downside: the scattered heap access pattern has poor cache locality, so a well-tuned Quick Sort is often faster in practice

Question: Build a max-heap from [3, 2, 1, 7, 8]. Show array representation after heapify.


Solution:

  • Initial array: [3, 2, 1, 7, 8]
  • Build the heap bottom-up, starting from index n/2 − 1 = 1
  • At index 1 (value 2): children 7, 8 → swap with 8 → [3, 8, 1, 7, 2]
  • At index 0 (value 3): children 8, 1 → swap with 8 → [8, 3, 1, 7, 2]; keep sifting 3 down: children 7, 2 → swap with 7 → [8, 7, 1, 3, 2]
  • Verify max-heap property: 8>7, 8>1, 7>3, 7>2 ✓

Question: Starting from max-heap [8, 7, 3, 2, 1], trace extract-max twice. Show heaps after each extraction.


Solution:

  • Initial: [8, 7, 3, 2, 1] — extract 8 by swapping it with the last element
  • Move 1 to root: [1, 7, 3, 2], sift down 1
  • 1<7, swap: [7, 1, 3, 2]; continue sifting
  • 1<2, swap: [7, 2, 3, 1] (heap after first extraction)
  • Extract 7: move last element 1 to root → [1, 2, 3], sift down
  • 1<3, swap: [3, 2, 1] (heap after second extraction)

Question: Prove that Heap Sort's time complexity is O(n log n) by analyzing build-heap and extract phases.


Solution: Two phases:

  • Build-heap: O(n) time (not O(n log n)!), uses bottom-up heapify
  • Each element sifts down max (height - level) positions
  • Total: n/2×1 + n/4×2 + n/8×3... = O(n)
  • Extract-max loop: n extractions × O(log n) sift = O(n log n)
  • Total: O(n) + O(n log n) = O(n log n)
Counting Sort (Non-Comparison)

Counting Sort achieves O(n) time by counting occurrences of each value and reconstructing the sorted array. It works for integers in a known range [0, k].

Counting Sort Implementation
void countingSort(int[] arr, int maxVal) {
  int[] count = new int[maxVal + 1];
  
  // Count occurrences
  for (int num : arr) {
    count[num]++;
  }
  
  // Reconstruct sorted array
  int index = 0;
  for (int i = 0; i <= maxVal; i++) {
    while (count[i] > 0) {
      arr[index++] = i;
      count[i]--;
    }
  }
  // Time: O(n + k), Space: O(k), Stable: Yes
}
Visual: Counting Sort on [3, 1, 3, 2, 1]
COUNT Values
Input: [3, 1, 3, 2, 1] → count[0]=0, count[1]=2, count[2]=1, count[3]=2
CUMULATIVE Sums
Running totals: 0, 2, 3, 5 → value 1 fills output indices 0-1, value 2 fills index 2, value 3 fills indices 3-4
RECONSTRUCT
Output: [1, 1, 2, 3, 3] — Time: O(n+k), where k is the value range; for small k this beats O(n log n)
Radix Sort (Non-Comparison)

Radix Sort extends Counting Sort by sorting digit-by-digit. It processes each digit position from least significant to most significant, using a stable sort (like Counting Sort) for each digit pass.

Radix Sort Implementation
void radixSort(int[] arr) {
  int maxNum = getMax(arr);
  
  // Process each digit from ones to most significant
  for (int exp = 1; maxNum / exp > 0; exp *= 10) {
    countingSortByDigit(arr, exp);
  }
}

void countingSortByDigit(int[] arr, int exp) {
  int n = arr.length;
  int[] output = new int[n];
  int[] count = new int[10];
  
  // Count occurrences of each digit
  for (int i = 0; i < n; i++) {
    count[(arr[i] / exp) % 10]++;
  }
  
  // Build cumulative count
  for (int i = 1; i < 10; i++) {
    count[i] += count[i - 1];
  }
  
  // Place elements in output
  for (int i = n - 1; i >= 0; i--) {
    output[count[(arr[i] / exp) % 10] - 1] = arr[i];
    count[(arr[i] / exp) % 10]--;
  }
  
  // Copy to original array
  for (int i = 0; i < n; i++) {
    arr[i] = output[i];
  }
}
Visual: Radix Sort on [170, 45, 75, 90, 2, 802, 24, 2, 66]
SORT by Ones
Stable sort on the ones digit: [170, 90, 2, 802, 2, 24, 45, 75, 66] (ones digits: 0, 0, 2, 2, 2, 4, 5, 5, 6)
SORT by Tens
Stable sort on the tens digit preserves the ones-digit order: [2, 802, 2, 24, 45, 66, 170, 75, 90] (tens digits: 0, 0, 0, 2, 4, 6, 7, 7, 9)
SORT by Hundreds
Stable sort on the hundreds digit: [2, 2, 24, 45, 66, 75, 90, 170, 802] — sorted! Time: O(d·n), d = number of digits, Space: O(n+k), stability maintained throughout
When to Use Non-Comparison Sorts:
  • Counting Sort: Integer range [0, k] is small relative to n. Perfect for sorting grades (0-100) or ages.
  • Radix Sort: Very large integers or when you want O(d·n) where d is small (e.g., sorting 32-bit integers).
  • Trade-off: Extra space required, but guaranteed linear time for suitable inputs.
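As a concrete instance of the grade-sorting case, here is Counting Sort specialized to scores in [0, 100] (class and method names are made up for illustration):

```java
import java.util.Arrays;

public class GradeSort {
    // Counting sort specialized for exam scores in the range [0, 100]:
    // k = 101 is tiny relative to n, so this runs in O(n) time.
    static int[] sortScores(int[] scores) {
        int[] count = new int[101];
        for (int s : scores) count[s]++;   // tally each score

        int[] out = new int[scores.length];
        int idx = 0;
        for (int v = 0; v <= 100; v++)     // emit each value count[v] times
            while (count[v]-- > 0) out[idx++] = v;
        return out;
    }

    public static void main(String[] args) {
        int[] scores = {88, 100, 55, 88, 67, 0};
        System.out.println(Arrays.toString(sortScores(scores)));
        // [0, 55, 67, 88, 88, 100]
    }
}
```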
Practice Questions

Question: Trace Radix Sort on [12, 34, 21]. Show state after each digit sort.


Solution:

  • Initial: [12, 34, 21]
  • After ones digit: [21, 12, 34] (sorted by last digit: 1,2,4)
  • After tens digit: [12, 21, 34] (sorted by tens digit: 1, 2, 3)

Question: You have 100,000 3-digit numbers. Which sort and why?

Show Solution

Solution: Radix Sort is optimal:

  • 3-digit numbers: d = 3 (constant)
  • Time: O(3·n) = O(n) - linear!
  • vs Quick Sort: O(n log n) ≈ 17n operations (log₂ 100,000 ≈ 17)
  • Radix Sort does about 3n digit-passes of work — roughly 5-6x fewer operations for this input

Key Takeaways

Time Complexity Matters

O(n²) algorithms become impractical for large datasets; O(n log n) sorts scale far better

Stability is Important

Stable sorts preserve relative order of equal elements, critical when sorting complex objects

Space-Time Tradeoff

Some O(n log n) sorts use O(n) extra space, while others like Quick Sort work in-place

Divide and Conquer Pattern

Merge Sort and Quick Sort use recursion to solve sorting more efficiently than naive approaches

Non-Comparison Sorting

Counting Sort and Radix Sort sidestep the Ω(n log n) comparison lower bound by exploiting data properties

Choose Based on Context

Small data (Insertion), unsorted (Quick), guaranteed worst-case (Merge), or non-comparable (Counting)

Knowledge Check

Test your understanding of sorting algorithms:

Question 1 of 6

What is the time complexity of Merge Sort in the worst case?

Question 2 of 6

Which sorting algorithm uses a pivot to partition the array?

Question 3 of 6

Which algorithm has a guaranteed O(n log n) worst-case complexity?

Question 4 of 6

What does it mean for a sorting algorithm to be stable?

Question 5 of 6

Which sorting algorithm can achieve O(n) time complexity?

Question 6 of 6

When would you choose Insertion Sort over Quick Sort?
