
Sunday, August 28, 2011

Time Complexity (Big O) of Sorting Algorithms

Sorting algorithms used in computer science are often classified by:

Computational complexity (worst, average and best behavior) of element comparisons in terms of the size of the list (n). For typical sorting algorithms, good behavior is O(n log n) and bad behavior is O(n^2). Ideal behavior for a sort is O(n), but this is not possible in the average case. Comparison-based sorting algorithms, which evaluate the elements of the list via an abstract key comparison operation, need at least Ω(n log n) comparisons for most inputs (see the comparison-counting sketch after this list).
Computational complexity of swaps (for "in place" algorithms).

Memory usage (and use of other computer resources). In particular, some sorting algorithms are "in place". This means that they need only O(1) or O(log n) memory beyond the items being sorted, and they don't need to create auxiliary locations for data to be temporarily stored, as in other sorting algorithms.

Recursion. Some algorithms are either recursive or non-recursive, while others may be both (e.g., merge sort).

Stability: stable sorting algorithms maintain the relative order of records with equal keys (i.e., values); a short example appears after this list.

Whether or not they are a comparison sort. A comparison sort examines the data only by comparing two elements with a comparison operator.

General method: insertion, exchange, selection, merging, etc. Exchange sorts include bubble sort and quicksort. Selection sorts include shaker sort and heapsort.

Adaptability: whether or not the presortedness of the input affects the running time. Algorithms that take this into account are known as adaptive (the insertion-sort sketch below illustrates this).
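
As a rough illustration of comparison counts and adaptivity, here is a minimal Python sketch (not part of the original article): a plain insertion sort that counts key comparisons. On an already-sorted list of 10 items it makes 9 comparisons, while on a reversed list it makes 45, approaching the O(n^2) worst case; the function name and sample inputs are assumptions chosen only for illustration.

def insertion_sort(items):
    # Sort the list in place and return the number of key comparisons made.
    comparisons = 0
    for i in range(1, len(items)):
        key = items[i]
        j = i - 1
        # Shift larger elements one slot to the right until key fits.
        while j >= 0:
            comparisons += 1
            if items[j] > key:
                items[j + 1] = items[j]
                j -= 1
            else:
                break
        items[j + 1] = key
    return comparisons

print(insertion_sort(list(range(10))))         # already sorted: 9 comparisons
print(insertion_sort(list(range(10, 0, -1))))  # reversed: 45 comparisons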
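
To make the stability point concrete, here is a short sketch using Python's built-in sorted(), which is a stable sort: records that compare equal on the sort key keep their original relative order. The sample data are invented for illustration.

records = [("apple", 3), ("pear", 1), ("plum", 3), ("fig", 1)]

# sorted() is stable, so among equal keys the original order is preserved:
# "pear" stays before "fig", and "apple" stays before "plum".
by_key = sorted(records, key=lambda r: r[1])
print(by_key)  # [('pear', 1), ('fig', 1), ('apple', 3), ('plum', 3)]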

Courtesy: Wikipedia
