
Counting Sort

Counting
Counts occurrences of each distinct element and reconstructs the sorted array.
Out-of-Place
Requires auxiliary memory proportional to the input size and value range (O(n + k)).
Stable
Preserves the relative order of elements with equal values.

Counting Sort is an efficient, non-comparison-based sorting algorithm that works best when the range of input values (k) is not significantly greater than the number of elements (n). Unlike comparison sorts like Merge Sort or Quick Sort, it does not compare elements against each other. Instead, it counts the frequency of each distinct element and calculates their starting positions in the output array using prefix sums. This makes it a stable sort with time complexity linear in the input size plus the range.


Did you know?

  • Counting Sort and its application to Radix Sort were invented by Harold H. Seward in 1954.
  • It breaks the O(n log n) lower bound of comparison-based sorting because it never actually compares two elements to see which is larger.
  • It is often used as a high-speed subroutine within Radix Sort to handle specific digits of larger numbers.
  • The standard version requires integer keys, as array indices cannot be decimals.

How it Works

  • Find the Range: Traverse the array to find the maximum value (k) to determine the size of the count array.

  • Initialize Count Array: Create an auxiliary array cntArr of size k + 1 initialized to zeros.

  • Frequency Count: Traverse the input array. For each element, increment the corresponding index in cntArr.

  • Prefix Sums: Modify cntArr by adding the value of the previous index to the current index. This step determines the position of elements in the sorted output.

  • Build Output: Traverse the input array backwards (to maintain stability). For each element, look up its position in cntArr, place it in the output array, and decrement the count.
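As a concrete trace of the frequency and prefix-sum steps above (using a hypothetical input [4, 2, 2, 8, 3, 3, 1], not an example from this page):

```javascript
// Trace of the counting phases for input [4, 2, 2, 8, 3, 3, 1] (max = 8).
const arr = [4, 2, 2, 8, 3, 3, 1];
const cntArr = new Array(9).fill(0);

// Frequency count: cntArr[v] = number of occurrences of v
for (const v of arr) cntArr[v]++;
console.log(cntArr); // [0, 1, 2, 2, 1, 0, 0, 0, 1]

// Prefix sums: cntArr[v] = number of elements <= v, i.e. one past
// the last slot that value v will occupy in the sorted output
for (let i = 1; i < cntArr.length; i++) cntArr[i] += cntArr[i - 1];
console.log(cntArr); // [0, 1, 3, 5, 6, 6, 6, 6, 7]
```

After the prefix-sum pass, cntArr[2] = 3 means the last occurrence of 2 belongs at index 2 of the output, which is exactly how the backwards build step places elements.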

Complexity Analysis

Best Case
O(n + k)
Average
O(n + k)
Worst Case
O(n + k)
Space
O(n + k)

Advantages

  • Linear Time Complexity: It runs in O(n + k) time, making it faster than any comparison-based sort (O(n log n)) when k is O(n).
  • Stable: Preserves the relative order of elements with equal keys, which is essential when used as a subroutine in Radix Sort.
  • Simple Operations: Relies on basic arithmetic and array indexing rather than complex recursion or partitioning.

Disadvantages

  • Restricted Input: It is an integer sorting algorithm and does not work directly on floating-point numbers or strings without mapping.
  • Space Complexity: It is not an in-place algorithm (O(n + k) space). If the range of values (k) is very large (e.g., sorting a few numbers ranging from 1 to 10^9), memory usage becomes prohibitive.
  • Inefficient for Large Ranges: If the range k is significantly larger than n, the algorithm becomes slower than general comparison sorts.
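The integer-key restriction on negative values has a common workaround: shift every key by the minimum so indices stay non-negative. A minimal sketch (the function name and structure are illustrative, not part of the implementation below):

```javascript
// Sketch: counting sort extended to negative integers by offsetting
// every key by the minimum value, so array indices stay non-negative.
function countSortWithNegatives(arr) {
    if (arr.length === 0) return [];

    const minVal = Math.min(...arr);
    const maxVal = Math.max(...arr);
    const range = maxVal - minVal + 1;

    // Count frequencies of shifted keys
    const cntArr = new Array(range).fill(0);
    for (const v of arr) cntArr[v - minVal]++;

    // Prefix sums give each element's end position
    for (let i = 1; i < range; i++) cntArr[i] += cntArr[i - 1];

    // Backwards pass preserves stability
    const ans = new Array(arr.length);
    for (let i = arr.length - 1; i >= 0; i--) {
        const v = arr[i];
        ans[--cntArr[v - minVal]] = v;
    }
    return ans;
}
```

The space cost becomes O(max − min + 1), so the large-range caveat above still applies.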

Implementation

JavaScript counting-sort.js
function countSort(arr) {
    // Assumes non-negative integer keys (values are used as array indices)
    if (arr.length === 0) return [];

    const n = arr.length;
    // 1. Find the maximum value
    const maxVal = Math.max(...arr);

    // 2. Initialize count array
    const cntArr = new Array(maxVal + 1).fill(0);

    // 3. Count frequencies
    for (let v of arr) {
        cntArr[v]++;
    }

    // 4. Compute prefix sums
    for (let i = 1; i <= maxVal; i++) {
        cntArr[i] += cntArr[i - 1];
    }

    // 5. Build output array (backwards for stability)
    const ans = new Array(n);
    for (let i = n - 1; i >= 0; i--) {
        const v = arr[i];
        ans[cntArr[v] - 1] = v;
        cntArr[v]--;
    }

    return ans;
}
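A quick sanity check of the listing above (the function definition is repeated so the snippet runs on its own):

```javascript
// countSort as defined in the listing above.
function countSort(arr) {
    if (arr.length === 0) return [];
    const n = arr.length;
    const maxVal = Math.max(...arr);
    const cntArr = new Array(maxVal + 1).fill(0);
    for (const v of arr) cntArr[v]++;
    for (let i = 1; i <= maxVal; i++) cntArr[i] += cntArr[i - 1];
    const ans = new Array(n);
    for (let i = n - 1; i >= 0; i--) {
        const v = arr[i];
        ans[cntArr[v] - 1] = v;
        cntArr[v]--;
    }
    return ans;
}

// Duplicates come out grouped; edge cases pass through cleanly.
console.log(countSort([4, 2, 2, 8, 3, 3, 1])); // [1, 2, 2, 3, 3, 4, 8]
console.log(countSort([]));                    // []
console.log(countSort([5]));                   // [5]
```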