Linear Sorts

Counting sort

Algorithm (intuitive).

for i = 0 to k-1 do C[i] = 0
for j = 1 to n do C[A[j]] = C[A[j]]+1
t = 0
for i = 0 to k-1 do
	for j = 1 to C[i] do
		t = t + 1
		B[t] = i
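A direct sketch of the intuitive version in Python (assuming keys are integers in [0, k)):

```python
def counting_sort(A, k):
    """Sort a list of integers in range [0, k) by tallying occurrences."""
    C = [0] * k
    for x in A:            # count how many times each key appears
        C[x] += 1
    B = []
    for i in range(k):     # emit each key C[i] times, in increasing order
        B.extend([i] * C[i])
    return B
```

Note that this version discards the original elements and rebuilds them from their keys, so it cannot carry satellite data; that is what the stable version below fixes.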

Algorithm (stable).

for i = 0 to k-1 do C[i] = 0
for j = 1 to n do C[A[j]] = C[A[j]]+1
for i = 1 to k-1 do C[i] = C[i-1]+C[i]
for j = n downto 1 do
	B[C[A[j]]] = A[j]
	C[A[j]] = C[A[j]]-1

Complexity. Both run in Θ(n+k) time, where k is the number of distinct key values.
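The stable version in Python, generalized to sort arbitrary records by an integer key in [0, k) (the `key` parameter is an addition for illustration):

```python
def counting_sort_stable(A, key, k):
    """Stable counting sort: items with equal keys keep their input order."""
    C = [0] * k
    for x in A:                    # count occurrences of each key
        C[key(x)] += 1
    for i in range(1, k):          # prefix sums: C[i] = # of items with key <= i
        C[i] += C[i - 1]
    B = [None] * len(A)
    for x in reversed(A):          # scan right-to-left so equal keys stay in order
        C[key(x)] -= 1
        B[C[key(x)]] = x
    return B
```

The right-to-left scan is what makes the sort stable: the last item with a given key is placed at the highest free slot for that key, and earlier equal-key items fill the slots below it.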

Radix sort

Algorithm.

for i = 1 to d do
	stable-sort A on the ith digit from the right

Complexity. With counting sort as the stable sort, each pass costs Θ(n+r) for radix r, so the total is Θ(d(n+r)) = Θ((n+r)·log_r s), where s bounds the key values and d = ⌈log_r s⌉.
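A Python sketch of LSD radix sort, using the stable counting-sort pass from above on each digit (assuming non-negative integer keys with at most d base-r digits):

```python
def radix_sort(A, d, r=10):
    """Sort non-negative integers with at most d base-r digits,
    least significant digit first."""
    for p in range(d):                      # p-th digit from the right
        digit = lambda x, p=p: (x // r**p) % r
        C = [0] * r
        for x in A:                         # stable counting-sort pass on this digit
            C[digit(x)] += 1
        for i in range(1, r):
            C[i] += C[i - 1]
        B = [0] * len(A)
        for x in reversed(A):
            C[digit(x)] -= 1
            B[C[digit(x)]] = x
        A = B
    return A
```

Correctness relies on each pass being stable: after pass p, the array is sorted on the low p+1 digits, because ties on the current digit preserve the order established by the earlier passes.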

Proposition. The optimal radix is r = Θ(n/ln n).

Proof. Use differentiation to optimize T(r) = (n+r)·ln s/ln r:

T'(r) = (ln s)·(ln r − (n+r)/r)/(ln r)² = 0  ⟺  r·ln r = n + r,

whose solution is r = Θ(n/ln n). Substituting back produces the optimal running time Θ(n·log s/log n).
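As a numerical sanity check of the proposition (the values of n and s below are arbitrary illustrative choices), the cost model T(r) = (n+r)·ln s/ln r is cheapest near r = n/ln n, beating both a tiny radix (too many passes) and a huge one (each pass too expensive):

```python
import math

def cost(n, s, r):
    """Model of radix-sort cost: (n + r) work per counting-sort pass,
    log_r(s) passes for keys bounded by s."""
    return (n + r) * math.log(s) / math.log(r)

n, s = 10**6, 10**18                  # illustrative sizes, not from the notes
r_opt = round(n / math.log(n))        # the r = Theta(n / ln n) choice from the proof
```
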

Bucket sort

Algorithm.

for i = 0 to n-1 do B[i] = empty list
for i = 1 to n do put A[i] into B[floor(A[i]*n)]
for i = 0 to n-1 do quadratic-sort(B[i])
concat-buckets(B)

This may be modified for ranges [0, r) by using B[floor(A[i]·n/r)]. For non-uniform distributions we may use differently sized buckets.

Complexity. This takes Θ(n) in expectation for uniformly distributed input: each bucket has O(1) expected size, so the expected cost of quadratic-sorting each bucket is O(1).
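A Python sketch of the above (assuming the inputs are reals in [0, 1); Python's built-in sort stands in for the per-bucket quadratic sort):

```python
import math

def bucket_sort(A):
    """Sort reals in [0, 1): distribute into n buckets, sort each, concatenate."""
    n = len(A)
    B = [[] for _ in range(n)]            # one (initially empty) bucket per element
    for x in A:
        B[math.floor(x * n)].append(x)    # bucket i holds values in [i/n, (i+1)/n)
    out = []
    for bucket in B:
        bucket.sort()                     # any quadratic sort is fine: buckets are
        out.extend(bucket)                # O(1) size in expectation
    return out
```
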