# Linear Sorts

## Counting sort

**Algorithm (intuitive).**

    for i = 0 to k-1 do C[i] = 0
    for j = 1 to n do C[A[j]] = C[A[j]] + 1
    t = 0
    for i = 0 to k-1 do
        for j = 1 to C[i] do
            t = t + 1
            B[t] = i
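The counting pass above can be sketched in Python as follows. This is a minimal, 0-indexed adaptation of the pseudocode; the function name `counting_sort_simple` and the explicit range parameter `k` are illustrative choices, not from the notes.

```python
def counting_sort_simple(A, k):
    """Sort a list of integers in range [0, k) by counting occurrences.

    Non-stable variant: count each key, then write each value i
    out C[i] times in increasing order of i.
    """
    C = [0] * k
    for x in A:          # tally occurrences of each key
        C[x] += 1
    B = []
    for i in range(k):   # emit key i exactly C[i] times
        B.extend([i] * C[i])
    return B
```

Because the output is rebuilt from the counts alone, this variant cannot carry satellite data along with the keys, which is why the stable variant below is the one used inside radix sort.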

**Algorithm (stable).**

    for i = 0 to k-1 do C[i] = 0
    for j = 1 to n do C[A[j]] = C[A[j]] + 1
    for i = 1 to k-1 do C[i] = C[i-1] + C[i]
    for j = n downto 1 do
        B[C[A[j]]] = A[j]
        C[A[j]] = C[A[j]] - 1

**Complexity.** Both run in $\Theta(n+k)$ time, where $k$ is the size of the key range.
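A minimal Python sketch of the stable variant, adapted to 0-indexed arrays (so the count is decremented before the element is placed, rather than after as in the 1-indexed pseudocode). The `key` parameter is an added convenience for sorting records by an integer key; it is not part of the pseudocode.

```python
def counting_sort_stable(A, k, key=lambda x: x):
    """Stable counting sort of A by integer keys in range [0, k)."""
    C = [0] * k
    for x in A:                 # tally occurrences of each key
        C[key(x)] += 1
    for i in range(1, k):       # prefix sums: C[i] = # of keys <= i
        C[i] += C[i - 1]
    B = [None] * len(A)
    for x in reversed(A):       # right-to-left scan preserves stability
        C[key(x)] -= 1
        B[C[key(x)]] = x
    return B
```

Stability matters here: two records with equal keys keep their relative order, which is exactly the property radix sort relies on.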

## Radix sort

**Algorithm.**

    for i = 1 to d do
        stable-sort A on the ith digit from the right

**Complexity.** With counting sort as the stable sort, the time is $\Theta\left(d(n+r)\right)$, or equivalently $\Theta\left((n+r)\log_r s\right)$, where $r$ is the radix, $s$ is the maximum key value, and $d = \lceil \log_r s \rceil$ is the number of digits.
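The loop above can be sketched in Python, with each pass an inlined stable counting sort on one base-$r$ digit. The function name `radix_sort` and the default radix `r=10` are illustrative assumptions.

```python
def radix_sort(A, d, r=10):
    """LSD radix sort of non-negative ints with at most d base-r digits.

    Runs d stable counting-sort passes, least significant digit
    first, for Theta(d(n+r)) total time.
    """
    for p in range(d):
        div = r ** p                      # isolates the pth digit
        C = [0] * r
        for x in A:                       # count digit occurrences
            C[(x // div) % r] += 1
        for i in range(1, r):             # prefix sums
            C[i] += C[i - 1]
        B = [0] * len(A)
        for x in reversed(A):             # stable placement
            digit = (x // div) % r
            C[digit] -= 1
            B[C[digit]] = x
        A = B
    return A
```

Stability of each pass is what makes this correct: after pass $i$, the array is sorted on the low-order $i$ digits, and ties on the current digit preserve that earlier ordering.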

**Proposition.** The optimal radix is $r \approx \frac{n}{\ln n}$.

**Proof.** Treat the running time as a function of $r$ and differentiate:

- $\frac{d}{dr}\left[(n+r)\log_r s\right] = 0$
- $r \ln r = n + r$
- $r \approx \frac{n}{\ln n}$

This produces the optimal running time: $\Theta\left(\frac{n \log s}{\log n}\right)$. □

## Bucket sort

**Algorithm.**

    for i = 0 to n-1 do B[i] = empty list
    for i = 1 to n do append A[i] to B[floor(A[i] * n)]
    for i = 0 to n-1 do quadratic-sort(B[i])
    concat-buckets(B)

This assumes keys uniform in $[0,1)$; it may be modified for ranges $[0,r)$ by using $B[\lfloor \frac{A[i]\cdot n}{r}\rfloor]$. We may use differently sized buckets for non-uniform distributions.
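A minimal Python sketch of the steps above, assuming keys uniform in $[0,1)$ and using insertion sort as the in-bucket quadratic sort (the pseudocode's `quadratic-sort`):

```python
import math

def bucket_sort(A):
    """Sort floats in [0, 1): scatter into n buckets, insertion-sort
    each bucket, then concatenate the buckets in order."""
    n = len(A)
    B = [[] for _ in range(n)]            # n empty buckets
    for x in A:
        B[math.floor(x * n)].append(x)    # bucket index in [0, n)
    out = []
    for bucket in B:
        # insertion sort within the bucket (quadratic, but buckets
        # are expected to be O(1) in size for uniform input)
        for j in range(1, len(bucket)):
            key, i = bucket[j], j - 1
            while i >= 0 and bucket[i] > key:
                bucket[i + 1] = bucket[i]
                i -= 1
            bucket[i + 1] = key
        out.extend(bucket)
    return out
```

Any quadratic sort works in the inner step; insertion sort is the usual choice because it is fast on the tiny, nearly sorted buckets the uniform distribution produces.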

**Complexity.** This takes $\Theta(n)$ time on average: for uniformly distributed keys, each bucket holds $O(1)$ elements in expectation, so the quadratic in-bucket sorts cost $O(n)$ in total.