Converting high probability into nearly-constant time - with applications to parallel hashing

Yossi Matias, Uzi Vishkin

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

We present a new paradigm for efficient randomized parallel algorithms that need Ō(log* n) time, where Ō(x) means 'O(x) expected'. It leads to: (1) constructing a perfect hash function for n elements in Ō(log* n log(log* n)) time and Ō(n) operations; (2) an algorithm for generating a random permutation in O(log* n) time using n processors, or in Ō(log* n log(log* n)) time and Ō(n) operations; and (3) an efficient optimizer: consider a parallel algorithm that runs in t time using p processors; since at each time unit some of the processors may be idle, we let z, the total number of actual operations, be the sum over all time units of the number of non-idle processors; assuming the algorithm belongs to a certain kind, it can be adapted to run in Ō(t + log* n log(log* n)) time (additive overhead!) using z/(t + log* n log(log* n)) processors. We also get an optimal integer sorting algorithm: given n integers from the domain [1..n], it runs in Ō(log n / log log n) time.
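To illustrate the kind of object the paper constructs, here is a minimal sequential sketch of a classical two-level (FKS-style) static perfect hash table. This is only an assumption-laden illustration of the hash-function structure: the paper's actual contribution is a *parallel* construction achieving Ō(log* n log(log* n)) time and Ō(n) operations, which this sequential code does not attempt to reproduce. All function names and the choice of prime are illustrative.

```python
import random

PRIME = 2_000_003  # any prime larger than the key universe works here

def build_perfect_hash(keys):
    """Two-level static perfect hashing (FKS-style), sequential sketch.

    Level 1 scatters the n keys into n buckets; level 2 gives each
    bucket of size s a private table of size s*s, where a random hash
    is collision-free with probability > 1/2 (so expected O(1) retries).
    """
    n = len(keys)
    # Level 1: retry until the sum of squared bucket sizes is O(n),
    # which a random multiplier achieves in expected O(1) attempts.
    while True:
        a = random.randrange(1, PRIME)
        buckets = [[] for _ in range(n)]
        for k in keys:
            buckets[(a * k) % PRIME % n].append(k)
        if sum(len(b) ** 2 for b in buckets) <= 4 * n:
            break
    # Level 2: per-bucket collision-free tables of quadratic size.
    tables = []
    for b in buckets:
        m = max(len(b) ** 2, 1)
        while True:
            a2 = random.randrange(1, PRIME)
            slots = [None] * m
            ok = True
            for k in b:
                i = (a2 * k) % PRIME % m
                if slots[i] is not None:
                    ok = False  # collision: retry with a new multiplier
                    break
                slots[i] = k
            if ok:
                tables.append((a2, slots))
                break
    return a, tables

def lookup(key, a, tables):
    """Membership test in worst-case O(1) time."""
    a2, slots = tables[(a * key) % PRIME % len(tables)]
    return slots[(a2 * key) % PRIME % len(slots)] == key
```

Lookups touch exactly two table cells, which is the point of the structure: all the randomness is spent at construction time, and queries are deterministic worst-case O(1).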

Original language: English
Title of host publication: Proceedings of the 23rd Annual ACM Symposium on Theory of Computing, STOC 1991
Publisher: Association for Computing Machinery
Pages: 307-316
Number of pages: 10
ISBN (Electronic): 0897913973
State: Published - 3 Jan 1991
Event: 23rd Annual ACM Symposium on Theory of Computing, STOC 1991 - New Orleans, United States
Duration: 5 May 1991 - 8 May 1991

Publication series

Name: Proceedings of the Annual ACM Symposium on Theory of Computing
Volume: Part F130073
ISSN (Print): 0737-8017

Conference

Conference: 23rd Annual ACM Symposium on Theory of Computing, STOC 1991
Country/Territory: United States
City: New Orleans
Period: 5/05/91 - 8/05/91

Funding

Funders: National Science Foundation
