# Linear Sorts

## Counting sort

Algorithm (intuitive).

```
for i = 0 to k-1 do
    C[i] = 0
for j = 1 to n do
    C[A[j]] = C[A[j]] + 1
t = 0
for i = 0 to k-1 do
    for j = 1 to C[i] do
        t = t + 1
        B[t] = i
```
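The intuitive variant can be sketched in Python as follows (the function name is mine; it sorts bare keys and discards any satellite data):

```python
def counting_sort_simple(A, k):
    """Sort a list of integers in range [0, k) by tallying occurrences."""
    C = [0] * k
    for x in A:                # count occurrences of each key
        C[x] += 1
    B = []
    for i in range(k):         # emit each key C[i] times, in increasing order
        B.extend([i] * C[i])
    return B
```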

Algorithm (stable).

```
for i = 0 to k-1 do
    C[i] = 0
for j = 1 to n do
    C[A[j]] = C[A[j]] + 1
for i = 1 to k-1 do
    C[i] = C[i-1] + C[i]
for j = n downto 1 do
    B[C[A[j]]] = A[j]
    C[A[j]] = C[A[j]] - 1
```

Complexity. Both variants run in $\Theta \left(n+k\right)$ time: $\Theta(k)$ to initialize and scan the counts, $\Theta(n)$ to tally and place the elements.
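The stable variant can be sketched in Python; here a `key` parameter (my addition) extracts the integer key, so records with equal keys keep their relative order:

```python
def counting_sort_stable(A, k, key=lambda x: x):
    """Stable counting sort of A, where key(x) is an integer in [0, k)."""
    C = [0] * k
    for x in A:                    # tally each key
        C[key(x)] += 1
    for i in range(1, k):          # prefix sums: C[i] = # of elements with key <= i
        C[i] += C[i - 1]
    B = [None] * len(A)
    for x in reversed(A):          # right-to-left scan preserves stability
        C[key(x)] -= 1
        B[C[key(x)]] = x
    return B
```

Scanning the input right to left while decrementing the counters is what makes the sort stable: the last occurrence of a key lands in the highest slot reserved for that key.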

## Radix sort

Algorithm.

```
for i = 1 to d do
    stable-sort A on the ith digit from the right
```

Complexity. Since each pass is a counting sort, the time is $\Theta \left(d\left(n+r\right)\right)$, where $d$ is the number of digits and $r$ is the radix. Writing $s$ for the maximum key value, $d = \lceil \log_r s \rceil$, so this is $\Theta \left(\log_r s\left(n+r\right)\right)$.
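A self-contained Python sketch of LSD radix sort on non-negative integers (the function name and default radix are mine); each pass is a stable counting sort on one base-$r$ digit:

```python
def radix_sort(A, r=10):
    """Sort non-negative integers with LSD radix sort, radix r."""
    if not A:
        return A
    power = 1
    while power <= max(A):             # one pass per base-r digit
        # stable counting sort on the digit (x // power) % r
        C = [0] * r
        for x in A:
            C[(x // power) % r] += 1
        for i in range(1, r):          # prefix sums
            C[i] += C[i - 1]
        B = [0] * len(A)
        for x in reversed(A):          # right-to-left scan keeps the pass stable
            d = (x // power) % r
            C[d] -= 1
            B[C[d]] = x
        A = B
        power *= r
    return A
```

Stability of each pass is essential: after sorting on digit $i$, ties are still ordered by the lower $i-1$ digits, which is exactly the invariant the outer loop relies on.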

Proposition. The optimal radix is $r \approx \frac{n}{\ln n}$.

Proof. Use differentiation to optimize:

• $\frac{d}{dr}\left(\log_{r}s\left(n+r\right)\right)=0$
• $r\approx \frac{n}{\ln r}$
• $r\approx \frac{n}{\ln n}$

This produces the optimal running time: $\Theta \left(\frac{n\log s}{\log n}\right)$.

## Bucket sort

Algorithm.

```
for i = 0 to n-1 do
    B[i] = empty list
for i = 1 to n do
    put A[i] into B[floor(A[i]*n)]
for i = 0 to n-1 do
    quadratic-sort(B[i])
concat-buckets(B)
```

This may be modified for ranges $\left[0,r\right)$ by using $B\left[\left\lfloor \frac{A\left[i\right]\cdot n}{r}\right\rfloor\right]$. We may use differently sized buckets for non-uniform distributions.

Complexity. This takes $\Theta \left(n\right)$ on average: for uniformly distributed keys each bucket holds $O(1)$ elements in expectation, so the quadratic per-bucket sorts contribute only $\Theta(n)$ total.
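A Python sketch of the algorithm for keys in $[0, 1)$ (the function name is mine); insertion sort plays the role of the quadratic per-bucket sort:

```python
def bucket_sort(A):
    """Sort floats in [0, 1): distribute into n buckets, insertion-sort each."""
    n = len(A)
    if n == 0:
        return A
    B = [[] for _ in range(n)]
    for x in A:
        B[int(x * n)].append(x)          # bucket index = floor(x * n)
    out = []
    for bucket in B:
        # insertion sort: quadratic, but each bucket is O(1) size on average
        for i in range(1, len(bucket)):
            v, j = bucket[i], i - 1
            while j >= 0 and bucket[j] > v:
                bucket[j + 1] = bucket[j]
                j -= 1
            bucket[j + 1] = v
        out.extend(bucket)
    return out
```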