1.
A new hybrid of Distributive Partitioning Sorting is described and tested against Quicksort on uniformly distributed items. Pointer-sort versions of both algorithms are also tested.
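The abstract gives no pseudocode, but the distributive-partitioning idea it builds on can be sketched: spread the n keys into n buckets by linear interpolation between the minimum and maximum key, then finish each small bucket with a simple sort. This is a minimal illustrative sketch; the bucket count and the insertion-sort fallback are assumptions, not details from the paper.

```python
def distributive_partition_sort(keys):
    """One-level distributive partitioning sketch: spread keys into
    len(keys) buckets by linear interpolation between min and max,
    then finish each (expected small) bucket with insertion sort.
    Bucket count and fallback sort are illustrative choices."""
    n = len(keys)
    if n <= 1:
        return list(keys)
    lo, hi = min(keys), max(keys)
    if lo == hi:                       # all keys equal, nothing to do
        return list(keys)
    buckets = [[] for _ in range(n)]
    for k in keys:
        # map k linearly onto bucket indices 0 .. n-1
        idx = min(n - 1, int((k - lo) * n / (hi - lo)))
        buckets[idx].append(k)
    out = []
    for b in buckets:                  # bucket ranges are disjoint and ordered
        for k in b:
            i = len(out)
            out.append(k)
            while i > 0 and out[i - 1] > k:   # insertion sort within bucket
                out[i] = out[i - 1]
                i -= 1
            out[i] = k
    return out
```

On near-uniform inputs each bucket holds O(1) keys on average, which is what makes the distribution step attractive against Quicksort's O(n log n) comparison count.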
2.
Ralph Neininger 《Random Structures and Algorithms》2015,46(2):346-361
The complexity of the Quicksort algorithm is usually measured by the number of key comparisons used during its execution. When operating on a list of n data items, permuted uniformly at random, the appropriately normalized complexity Yn is known to converge almost surely to a non-degenerate random limit Y. This assumes a natural embedding of all Yn on one probability space, e.g., via random binary search trees. In this note a central limit theorem for the error term in the latter almost sure convergence is shown: √(n / (2 ln n)) (Yn − Y) → 𝒩 in distribution, where 𝒩 denotes a standard normal random variable. © 2013 Wiley Periodicals, Inc. Random Struct. Alg., 46, 346–361, 2015
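The normalized complexity in this setting is typically Yn = (Cn − E[Cn]) / n, where Cn is the comparison count and E[Cn] = 2(n+1)Hn − 4n for Quicksort on a uniform random permutation (Hn the n-th harmonic number). A small sketch to compute both quantities; the random-pivot variant and the use of exact rationals are illustrative choices, not taken from the paper.

```python
from fractions import Fraction
from random import randrange

def quicksort_comparisons(a):
    """Key comparisons made by Quicksort with a uniformly random pivot
    (assumes distinct keys)."""
    if len(a) <= 1:
        return 0
    p = a[randrange(len(a))]
    left = [x for x in a if x < p]
    right = [x for x in a if x > p]
    return (len(a) - 1) + quicksort_comparisons(left) + quicksort_comparisons(right)

def expected_comparisons(n):
    """E[C_n] = 2(n+1)H_n - 4n, computed exactly with rationals."""
    H = sum(Fraction(1, k) for k in range(1, n + 1))
    return 2 * (n + 1) * H - 4 * n

def normalized_cost(a):
    """Y_n = (C_n - E[C_n]) / n, the quantity whose a.s. limit Y is studied."""
    n = len(a)
    return (quicksort_comparisons(a) - expected_comparisons(n)) / n
```

Simulating `normalized_cost` over many random permutations of increasing n gives an empirical view of the convergence Yn → Y that the note's central limit theorem refines.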
3.
Peter M. G. Apers 《BIT Numerical Mathematics》1978,18(2):125-132
A sorting algorithm called Recursive Samplesort is described and analyzed. An extensive average-case analysis demonstrates that Recursive Samplesort is faster than both Samplesort and Quicksort with respect to certain linear combinations of the number of comparisons and move instructions needed.
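The core samplesort idea is to choose the pivot as the median of a small random sample rather than as a single element, then partition and recurse. A minimal sketch of that idea; the sample size, recursion cutoff, and insertion-sort base case are assumptions for illustration, not parameters from the paper.

```python
import random

def recursive_samplesort(a, sample_size=5, cutoff=16):
    """Samplesort sketch: pivot = median of a small random sample,
    then three-way partition and recurse; short inputs fall back to
    insertion sort. sample_size and cutoff are illustrative choices."""
    a = list(a)
    if len(a) <= cutoff:
        for i in range(1, len(a)):          # insertion sort base case
            k, j = a[i], i
            while j > 0 and a[j - 1] > k:
                a[j] = a[j - 1]
                j -= 1
            a[j] = k
        return a
    sample = random.sample(a, min(sample_size, len(a)))
    pivot = sorted(sample)[len(sample) // 2]
    left = [x for x in a if x < pivot]
    mid = [x for x in a if x == pivot]
    right = [x for x in a if x > pivot]
    return (recursive_samplesort(left, sample_size, cutoff)
            + mid
            + recursive_samplesort(right, sample_size, cutoff))
```

Sampling makes badly unbalanced partitions much rarer than with a single-element pivot, which is the source of the comparison savings the analysis quantifies.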
4.
A new sorting algorithm, Double Distributive Partitioning, is introduced and compared against Sedgewick's quicksort. It is shown that the Double Distributive Partitioning algorithm runs, for all practical purposes, in O(n) time for many distributions of keys. Furthermore, the combined number of comparisons, additions, and assignments required to sort by the new method on these distributions is always less than that of quicksort.
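The abstract does not specify what the "double" step is; one plausible reading is that the interpolation-based distribution is applied twice, once over the whole file and once within each bucket, before a final cleanup pass. The sketch below follows that guess and should be read as an assumption about the structure, not the paper's algorithm.

```python
def double_distributive_sort(keys):
    """Two-level distributive partitioning sketch: distribute keys by
    linear interpolation, redistribute each bucket once more, then
    insertion-sort the tiny second-level buckets. The two-pass
    structure is a guess at the 'double' step, not from the paper."""
    def distribute(ks):
        n = len(ks)
        lo, hi = min(ks), max(ks)
        if lo == hi:
            return [ks]                      # all keys equal
        buckets = [[] for _ in range(n)]
        for k in ks:
            buckets[min(n - 1, int((k - lo) * n / (hi - lo)))].append(k)
        return [b for b in buckets if b]

    def insertion(ks):
        out = []
        for k in ks:
            i = len(out)
            out.append(k)
            while i > 0 and out[i - 1] > k:
                out[i] = out[i - 1]
                i -= 1
            out[i] = k
        return out

    if len(keys) <= 1:
        return list(keys)
    result = []
    for b1 in distribute(list(keys)):
        for b2 in distribute(b1):
            result.extend(insertion(b2))
    return result
```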
5.
Jing-Chao Chen 《Journal of Algorithms in Cognition, Informatics and Logic》1998,28(2):258-271
This paper presents an efficient and practical sorting algorithm, called Quadripartite Sort. It lies between MergeSort and QuickSort. This algorithm sorts n elements using bounded workspace and n log n + 1.75n comparisons in the worst case. By empirical testing, we conjecture that this algorithm needs approximately n log n − n comparisons on average. When using an m-way merging strategy, where m is a larger constant, this algorithm becomes an in-place sorting algorithm whose comparison plus exchange total is the absolute minimum among known constant-workspace algorithms. For example, using a 256-way merge, the comparison plus exchange total required is approximately 1.2495 n log n + O(n) in the worst case.
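The m-way merging strategy the abstract mentions can be illustrated with a simple recursive merge sort that splits into m runs (m = 4 echoing "Quadripartite") and merges them with a heap. Unlike the paper's algorithm this sketch is not in-place and does not achieve the bounded-workspace guarantees; it only shows the merging strategy.

```python
import heapq

def m_way_mergesort(a, m=4):
    """m-way merge sort sketch (m=4 echoes 'Quadripartite').
    Splits the input into at most m parts, sorts them recursively,
    and merges the sorted runs with heapq.merge. Illustrative only:
    not in-place, unlike the algorithm described in the paper."""
    if len(a) <= 1:
        return list(a)
    step = (len(a) + m - 1) // m          # ceil(len(a) / m) per run
    runs = [m_way_mergesort(a[i:i + step], m)
            for i in range(0, len(a), step)]
    return list(heapq.merge(*runs))
```

Raising m deepens each merge step but flattens the recursion tree, which is how a large constant such as m = 256 drives the leading comparison-plus-exchange coefficient down toward the 1.2495 n log n figure quoted above.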