The average complexity of deterministic and randomized parallel comparison sorting algorithms

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

In practice, the average time of (deterministic or randomized) sorting algorithms seems to be more relevant than the worst-case time of deterministic algorithms. Still, the many known complexity bounds for parallel comparison sorting include no nontrivial lower bounds for the average time required to sort n elements by comparisons with p processors (via deterministic or randomized algorithms). We show that for p ≥ n this time is Θ(log n / log(1 + p/n)); it is easy to show that for p ≤ n the time is Θ(n log n / p) = Θ(log n / (p/n)). Therefore even the average-case behaviour of randomized algorithms is not more efficient than the worst-case behaviour of deterministic ones.
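The two regimes quoted in the abstract can be combined into a single case formula. The display below is an illustrative restatement added for readability (not part of the original record); the sample values p = n and p = n² are direct consequences of the stated bound.

```latex
% Illustrative restatement of the bounds quoted in the abstract.
% T(n, p) denotes the average parallel comparison-sorting time with p processors.
\[
  T(n, p) =
  \begin{cases}
    \Theta\!\left(\dfrac{n \log n}{p}\right)            & \text{if } p \le n, \\[1.5ex]
    \Theta\!\left(\dfrac{\log n}{\log(1 + p/n)}\right)  & \text{if } p \ge n.
  \end{cases}
\]
% Sample consequences of the stated bound: p = n gives \Theta(\log n) in either
% regime, while p = n^2 gives \Theta(\log n / \log(1 + n)) = \Theta(1) rounds.
```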
Original language: American English
Title of host publication: 28th Annual Symposium on Foundations of Computer Science (sfcs 1987)
Publisher: IEEE
Pages: 489-498
Number of pages: 10
ISBN (Print): 0-8186-0807-2
DOIs
State: Published - 14 Oct 1987
Event: 28th Annual Symposium on Foundations of Computer Science (sfcs 1987) - Los Angeles, CA, USA
Duration: 12 Oct 1987 - 14 Oct 1987

Conference

Conference: 28th Annual Symposium on Foundations of Computer Science (sfcs 1987)
Period: 12/10/87 - 14/10/87

Keywords

  • Sorting
  • Parallel random access machines (PRAM)
  • Computer science
  • Parallel algorithms
  • Read-write memory
  • Decision trees
