For merge sort the worst case is O(n*log(n)); for quicksort it is O(n^2). In the other cases (average, best) both are O(n*log(n)). However, quicksort sorts in place with essentially constant extra space, whereas merge sort's space cost depends on the structure you're sorting: arrays need an O(n) auxiliary buffer, while linked lists can be merged in place.
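To make the space point concrete, here is a minimal merge sort sketch for an int[] (the class and method names are just illustrative); note the n-sized temp buffer that an in-place quicksort does not need:

```java
import java.util.Arrays;

public class MergeSortSketch {
    // Top-level call: allocates the O(n) auxiliary buffer once.
    static void mergeSort(int[] a) {
        int[] tmp = new int[a.length];          // the extra O(n) space
        sort(a, tmp, 0, a.length - 1);
    }

    private static void sort(int[] a, int[] tmp, int lo, int hi) {
        if (lo >= hi) return;
        int mid = (lo + hi) >>> 1;
        sort(a, tmp, lo, mid);                  // sort left half
        sort(a, tmp, mid + 1, hi);              // sort right half
        merge(a, tmp, lo, mid, hi);             // O(n) merge => O(n log n) overall
    }

    private static void merge(int[] a, int[] tmp, int lo, int mid, int hi) {
        System.arraycopy(a, lo, tmp, lo, hi - lo + 1);
        int i = lo, j = mid + 1;
        for (int k = lo; k <= hi; k++) {
            if (i > mid)              a[k] = tmp[j++];
            else if (j > hi)          a[k] = tmp[i++];
            else if (tmp[j] < tmp[i]) a[k] = tmp[j++];
            else                      a[k] = tmp[i++];
        }
    }

    public static void main(String[] args) {
        int[] data = {5, 1, 4, 2, 3};
        mergeSort(data);
        System.out.println(Arrays.toString(data)); // [1, 2, 3, 4, 5]
    }
}
```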
Although quicksort is usually the better choice in practice, merge sort is definitely the better choice in theory. The most obvious case is when it matters that your algorithm runs faster than O(n^2). Quicksort is usually much faster than that, but given the theoretically worst possible input it can run in O(n^2), which is worse than the worst case of merge sort.
Quicksort is also more complicated than mergesort, especially if you want to write a really solid implementation, and so if you're aiming for simplicity and maintainability, merge sort becomes a promising alternative with very little performance loss.
But merge sort is always O(n*log(n)), whereas quicksort with bad partitions, i.e. skewed partitions such as a 1-element vs. 10-element split (which can happen with an already sorted or reverse-sorted list), can degrade to O(n^2).
..so we have randomized quicksort: we pick the pivot at random, which avoids such skewed partitions and effectively rules out the whole O(n^2) scenario.
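A minimal sketch of that idea (class and method names are just for illustration), using a random pivot with a Lomuto partition:

```java
import java.util.Arrays;
import java.util.Random;

public class RandomizedQuickSort {
    private static final Random RNG = new Random();

    static void sort(int[] a, int lo, int hi) {
        if (lo >= hi) return;
        // Pick a random pivot so a sorted or reverse-sorted input
        // no longer produces the 1 vs. (n-1) worst-case split.
        int pivotIndex = lo + RNG.nextInt(hi - lo + 1);
        swap(a, pivotIndex, hi);          // move the pivot to the end
        int p = partition(a, lo, hi);     // partition around a[hi]
        sort(a, lo, p - 1);
        sort(a, p + 1, hi);
    }

    private static int partition(int[] a, int lo, int hi) {
        int pivot = a[hi], i = lo;
        for (int j = lo; j < hi; j++) {
            if (a[j] < pivot) swap(a, i++, j);
        }
        swap(a, i, hi);
        return i;
    }

    private static void swap(int[] a, int i, int j) {
        int t = a[i]; a[i] = a[j]; a[j] = t;
    }

    public static void main(String[] args) {
        int[] data = {9, 8, 7, 6, 5, 4, 3, 2, 1}; // reverse-sorted input
        sort(data, 0, data.length - 1);
        System.out.println(Arrays.toString(data));
    }
}
```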
In any case, even for moderately skewed partitions such as a 3-4 split, we get n*log_{7/4}(n); ideally we want a 1-1 split, which is where the base 2 in O(n*log_2(n)) comes from.
So quicksort is O(n*log(n)) almost always and, unlike with merge sort, the constants hidden under the big-O notation are better for quicksort than for merge sort. It also doesn't use extra space the way merge sort does.
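As a quick sanity check of the 3-4 claim above, here is a sketch of the argument, assuming every partition is at worst a 3:4 split:

```latex
\begin{aligned}
&\text{If every partition is at worst a 3:4 split, the larger side has at most } \tfrac{4}{7}n \text{ elements:} \\
&\qquad T(n) \;\le\; T\!\left(\tfrac{3n}{7}\right) + T\!\left(\tfrac{4n}{7}\right) + cn. \\
&\text{The recursion depth } d \text{ is bounded by the larger branch: }
  \left(\tfrac{4}{7}\right)^{d} n \le 1 \;\iff\; d \ge \log_{7/4} n. \\
&\text{Each level does } O(n) \text{ work, so } T(n) = O\!\left(n \log_{7/4} n\right) = O(n \log n), \\
&\text{and the ideal 1:1 split gives depth } \log_2 n, \text{ i.e. } O(n \log_2 n).
\end{aligned}
```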
The answer would tilt slightly towards quicksort with respect to the changes brought by DualPivotQuickSort for primitive values. It is used in Java 7 to sort in java.util.Arrays:
It is proved that for the Dual-Pivot Quicksort the average number of comparisons is 2*n*ln(n) and the average number of swaps is 0.8*n*ln(n), whereas the classical Quicksort algorithm has 2*n*ln(n) and 1*n*ln(n) respectively. For the full mathematical proof see the attached proof.txt and proof_add.txt files. The theoretical results are also confirmed by experimental counting of the operations.
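For context, a tiny usage sketch (the class name is just for illustration): since Java 7, java.util.Arrays.sort uses dual-pivot quicksort for primitive arrays, while the Object[] overload uses a merge-sort-based algorithm (TimSort), so both strategies are one call away:

```java
import java.util.Arrays;

public class ArraysSortDemo {
    public static void main(String[] args) {
        // Primitive array: sorted with dual-pivot quicksort (stability not needed).
        int[] primitives = {5, 3, 9, 1, 4};
        Arrays.sort(primitives);
        System.out.println(Arrays.toString(primitives)); // [1, 3, 4, 5, 9]

        // Object array: sorted with a merge-sort-based algorithm (TimSort),
        // because sorting objects must be stable.
        Integer[] boxed = {5, 3, 9, 1, 4};
        Arrays.sort(boxed);
        System.out.println(Arrays.toString(boxed));      // [1, 3, 4, 5, 9]
    }
}
```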