The objective of a sorting algorithm is to rearrange the records
so that their keys are ordered according to some well-defined ordering
rule.

Problem: Given an array A[1..n] of n real numbers.

Objective: Sort the elements of A in ascending order of their values.

If the file to be sorted fits into memory, or equivalently if it fits into an array, then the sorting method is called internal. In this case, any record can be accessed directly.

If the file does not fit into memory, the sorting method is called external:

- External sorts handle files stored on tape or disk.
- An external sort algorithm must access records sequentially, or at least one block at a time.

Sorting methods can also be classified by the amount of extra memory they use:

- Algorithms that sort in place and use no extra memory except perhaps a small stack or table.
- Algorithms that use a linked-list representation and so use N extra words of memory for list pointers.
- Algorithms that need enough extra memory to hold another copy of the array to be sorted.

A sorting algorithm is called stable if it preserves the relative order of equal keys in the file. Most of the simple algorithms are stable, but most of the well-known sophisticated algorithms are not.
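To illustrate stability, the sketch below (plain Python with made-up record data) sorts key–value records by key; Python's built-in `sorted` is guaranteed stable, so records with equal keys keep their original relative order:

```python
# Records: (key, label). Python's sorted() is a stable sort.
records = [(2, "a"), (1, "b"), (2, "c"), (1, "d")]

# Sort by key only; "b" stays before "d", and "a" stays before "c".
stable = sorted(records, key=lambda r: r[0])
print(stable)  # [(1, 'b'), (1, 'd'), (2, 'a'), (2, 'c')]
```

An unstable sort would be free to emit `(2, 'c')` before `(2, 'a')`, since it compares only the keys.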

There are two classes of sorting algorithms: O(*n*^{2}) algorithms and O(*n* log *n*) algorithms. The O(*n*^{2}) class includes bubble sort, insertion sort, selection sort and shell sort. The O(*n* log *n*) class includes heap sort, merge sort and quick sort.

**O(*n*^{2}) Sorting Algorithms**
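As a concrete O(*n*^{2}) example, here is a minimal insertion sort sketch (the function name and driver are illustrative, not from the original text):

```python
def insertion_sort(a):
    """Sort list a in place in ascending order; O(n^2) comparisons worst case."""
    for i in range(1, len(a)):
        key = a[i]
        j = i - 1
        # Shift larger elements one slot right to make room for key.
        while j >= 0 and a[j] > key:
            a[j + 1] = a[j]
            j -= 1
        a[j + 1] = key
    return a

print(insertion_sort([5, 2, 4, 6, 1, 3]))  # [1, 2, 3, 4, 5, 6]
```

On an already-sorted input the inner while loop never runs, so insertion sort degrades gracefully to O(*n*) in the best case.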

**O(*n* log *n*) Sorting Algorithms**
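As a concrete O(*n* log *n*) example, here is a minimal merge sort sketch (again illustrative; it returns a new list rather than sorting in place):

```python
def merge_sort(a):
    """Return a new ascending-sorted copy of a; O(n log n) comparisons."""
    if len(a) <= 1:
        return a[:]
    mid = len(a) // 2
    left, right = merge_sort(a[:mid]), merge_sort(a[mid:])
    # Merge the two sorted halves into one sorted list.
    out, i, j = [], 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            out.append(left[i])
            i += 1
        else:
            out.append(right[j])
            j += 1
    out.extend(left[i:])
    out.extend(right[j:])
    return out

print(merge_sort([5, 2, 4, 6, 1, 3]))  # [1, 2, 3, 4, 5, 6]
```

Using `<=` in the merge step makes this sketch stable: when two elements compare equal, the one from the left half (which came first in the input) is taken first.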

Now we show that any comparison-based sorting algorithm has
an Ω(*n* log *n*) worst-case lower bound on its running time. If
comparison is the basic operation in sorting, then this is the best we
can do. Note that in a comparison sort, we use only comparisons between
elements to gain information about an input sequence <*a*_{1},
*a*_{2}, . . . , *a*_{n}>. That is, given two elements
*a*_{i} and *a*_{j}, we perform one of the tests *a*_{i} < *a*_{j},
*a*_{i} ≤ *a*_{j}, *a*_{i} = *a*_{j}, *a*_{i} ≥ *a*_{j}, or
*a*_{i} > *a*_{j} to determine their relative order.

Assuming all of the input elements are distinct (this is not a
restriction, since we are deriving a lower bound), comparisons of the
form *a*_{i} = *a*_{j} are useless, so we may assume that every
comparison has the form *a*_{i} ≤ *a*_{j}.

Each time a sorting algorithm compares two elements *a*_{i} and
*a*_{j}, there are only two possible outcomes: *a*_{i} ≤ *a*_{j} or
*a*_{i} > *a*_{j}. The execution of the algorithm can therefore be
modeled as a binary decision tree, in which each internal node is a
comparison and each leaf is a permutation of the input.

As an example, consider the decision tree for insertion sort operating
on three given elements *a*_{1}, *a*_{2} and *a*_{3}. There are
3! = 6 possible permutations of the three input elements, so the
decision tree must have at least 6 leaves.
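The leaf count for three elements can be checked directly with a quick Python check (not part of the original text):

```python
from itertools import permutations

# A decision tree that sorts three distinct elements must distinguish
# all 3! = 6 possible input orderings, so it needs at least 6 leaves.
leaves = list(permutations([1, 2, 3]))
print(len(leaves))  # 6
```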

In general, there are *n*! possible permutations of the *n*
input elements, so the decision tree must have at least *n*! leaves.

The length of the longest path from the root to any of its leaves represents the worst-case number of comparisons the sorting algorithm performs. Consequently, the worst-case number of comparisons corresponds to the height of the decision tree. A lower bound on the height of the tree is therefore a lower bound on the running time of any comparison sort algorithm.

**Theorem** *The running time of any comparison-based
algorithm for sorting an n-element sequence is Ω(n lg n) in the worst
case.*

Examples of comparison-based algorithms (in CLR) are insertion sort, selection sort, merge sort, quicksort, heapsort, and treesort.

**Proof** Consider a decision tree of height *h* that sorts *n*
elements. Each of the *n*! permutations of the input must appear as a
leaf, and a binary tree of height *h* has at most 2^{h} leaves.
Therefore

*n*! ≤ 2^{h}

Taking logarithms on both sides,

*h* ≥ lg(*n*!)

Since the lg function is monotonically increasing, from
Stirling's approximation we have

*n*! > (*n*/*e*)^{n}, where *e* = 2.71828 . . .

Therefore

*h* ≥ lg(*n*!) > lg((*n*/*e*)^{n}) = *n* lg(*n*/*e*) = *n* lg *n* − *n* lg *e*

which is Ω(*n* lg *n*). ∎
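The chain of inequalities above can be checked numerically (an illustrative Python check, not part of the proof):

```python
import math

# For several n, verify lg(n!) > n * lg(n/e), the key step in the
# Omega(n lg n) lower bound on decision-tree height h >= lg(n!).
for n in [8, 64, 1024]:
    lg_fact = math.log2(math.factorial(n))      # lg(n!)
    stirling_lb = n * math.log2(n / math.e)     # n * lg(n/e)
    assert lg_fact > stirling_lb
    print(n, round(lg_fact, 1), round(stirling_lb, 1))
```

The gap between the two quantities is roughly lg √(2π*n*), which is negligible next to *n* lg *n*; the bound is asymptotically tight, as merge sort and heap sort achieve O(*n* lg *n*) comparisons.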