This 60,000-foot view of some sorting methods was previously

posted as a response to a query. I have been asked to do the same for

the Shell and Heap sorts. From just the rudimentary descriptions of

all the sorts below (except the QuickSort and HeapSort) you can

probably design implementations of these algorithms.

I feel sure there are better descriptions in books and net tutorials

but here is the way I visualize them:

(1) Bubble--Scan the list repeatedly, swapping adjacent items that

are out of sequence, until a full pass makes no swaps. This gives the

appearance of bubbles rising to the surface.
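Here is one way the bubble scan above might look in Python (a sketch of

my own; the function name is mine):

```python
def bubble_sort(a):
    """Repeatedly scan the list, swapping adjacent out-of-order
    items, until a full pass makes no swaps."""
    swapped = True
    while swapped:
        swapped = False
        for i in range(len(a) - 1):
            if a[i] > a[i + 1]:                 # out of sequence?
                a[i], a[i + 1] = a[i + 1], a[i]  # swap the pair
                swapped = True
    return a
```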

(2) Selection--Search through the list to find the lowest (or highest)

and move it to the top. Reduce the list size by one and find the next

lowest, etc.
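A Python sketch of that idea (again my own naming, not from any

particular source):

```python
def selection_sort(a):
    """Find the lowest remaining item and move it to the front,
    then repeat on the rest of the list."""
    for top in range(len(a) - 1):
        low = top
        for i in range(top + 1, len(a)):
            if a[i] < a[low]:   # found a new lowest
                low = i
        a[top], a[low] = a[low], a[top]  # move it to the top
    return a
```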

(3) Insertion--like picking up playing cards one at a time and placing

them in your hand in the order of a bridge hand.
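The card-picking analogy translates fairly directly; here is a sketch

in Python:

```python
def insertion_sort(a):
    """Take up each item ("card") in turn and slide it left into
    its proper place among the already-sorted items ("your hand")."""
    for i in range(1, len(a)):
        card = a[i]
        j = i - 1
        while j >= 0 and a[j] > card:  # shift larger cards right
            a[j + 1] = a[j]
            j -= 1
        a[j + 1] = card                # drop the card into its slot
    return a
```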

(4) QuickSort--I'm sure there are examples of this on the net which

explain it better. This sort, credited to C.A.R. Hoare (1962), is a

"divide and conquer" technique: a pivot value is chosen and the

remaining values are partitioned to either side of it, smaller values

on one side and larger on the other. The procedure then calls itself

recursively on each side, the segments growing smaller and smaller.
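A simple (not in-place) Python sketch of the partition-and-recurse

idea, using a random pivot as described above:

```python
import random

def quicksort(a):
    """Pick a random pivot, split the values to either side of it,
    and recurse on the smaller segments."""
    if len(a) <= 1:
        return a                      # a segment of 0 or 1 is sorted
    pivot = random.choice(a)
    less    = [x for x in a if x < pivot]
    equal   = [x for x in a if x == pivot]
    greater = [x for x in a if x > pivot]
    return quicksort(less) + equal + quicksort(greater)
```

Real implementations partition in place rather than building new

lists, but the recursive structure is the same.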

(5) Shell--(Donald L. Shell, 1959) This is a declining-interval sort.

Shell's first proposal used 8-4-2-1 as the comparison intervals. The

whole array is first scanned, comparing items 8 places apart and

exchanging them if out of order. This is repeated until no item is

moved. The interval is then reduced to 4, then 2, then 1. An

interval of 1 between compared items is exactly like a bubble sort,

but by then many fewer items need to be moved.

More modern implementations recognize that many items need to be

moved farther than 8 places in the first pass. The first interval is

conventionally half the array length, then a quarter, an eighth, and

so on, down to a final interval of 1. Less well known modifications

of this sort make only one pass per interval, getting every item at

that interval into its proper place. No implementation is recursive.

The latter modification competes with the QuickSort in speed.
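The "modern" variant above--halving intervals, one insertion-style

pass per interval--might be sketched in Python like this (my own

rendering, not Shell's original):

```python
def shell_sort(a):
    """Declining-interval sort: insertion-sort items 'gap' places
    apart, halving the gap until it reaches 1."""
    gap = len(a) // 2          # first interval: halfway down the array
    while gap > 0:
        # one insertion pass puts every item at this interval in place
        for i in range(gap, len(a)):
            item = a[i]
            j = i
            while j >= gap and a[j - gap] > item:
                a[j] = a[j - gap]   # shift larger items up by 'gap'
                j -= gap
            a[j] = item
        gap //= 2              # reduce to 1/4, 1/8, ... finally 1
    return a
```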

(6) HeapSort (TreeSort3)--The novel, clever concept in this approach

is realizing the geometry of a tree with a linear array. This is

achieved in the following manner:

                A[1]

               /    \

           A[2]      A[3]

          /    \    /    \

       A[4]  A[5] A[6]  A[7]   ... and so on

The parent of A[2] and A[3] is A[1]. A[2] has the children "nodes"

A[4] and A[5]. Every node is ordered against its parent by value,

i.e. A[2] < A[1], A[3] < A[1], A[4] < A[2] ... (each child no larger

than its parent) would be one way of ordering. Note that if the

parent is A[i], the children are A[2*i] and A[2*i + 1].

This works like a selection sort, but the remaining items need not

all be searched each time; instead, just traverse the heap (tree),

taking the branch determined by the value being placed. The routine

that performs this is frequently called a sift.

After the heap is formed, it is rearranged with the smallest (or

largest) on top at the root, A[1], the next smallest at A[2] and so

on. In the end, A[1]..A[7] are in sorted order.

This is a very rapid sort. At 300 MHz, 17,000 random integers

(0..9999) are sorted before your finger gets off the <Enter> key.
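A Python sketch of the whole HeapSort, for the curious. Note one

difference from the text: Python lists are 0-based, so here the

children of node i sit at 2*i + 1 and 2*i + 2 rather than at 2*i and

2*i + 1. The names sift and heap_sort are my own.

```python
def sift(a, root, end):
    """Push a[root] down the tree (swapping with its larger child)
    until no child within a[..end] is larger than its parent."""
    while True:
        child = 2 * root + 1            # left child, 0-based indexing
        if child > end:
            return                       # root is a leaf
        if child + 1 <= end and a[child] < a[child + 1]:
            child += 1                   # take the larger child's branch
        if a[root] >= a[child]:
            return                       # heap order already holds
        a[root], a[child] = a[child], a[root]
        root = child

def heap_sort(a):
    n = len(a)
    # Form the heap: sift every non-leaf parent, bottom up.
    for i in range(n // 2 - 1, -1, -1):
        sift(a, i, n - 1)
    # Rearrange: swap the largest (the root) to the end, shrink the
    # heap by one, and sift the new root back into place.
    for end in range(n - 1, 0, -1):
        a[0], a[end] = a[end], a[0]
        sift(a, 0, end - 1)
    return a
```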