The everlasting database: Statistical validity at a fair price

Blake Woodworth, Vitaly Feldman, Saharon Rosset, Nathan Srebro

Research output: Contribution to journal › Conference article › peer-review

2 Scopus citations

Abstract

The problem of handling adaptivity in data analysis, intentional or not, permeates a variety of fields, including test-set overfitting in ML challenges and the accumulation of invalid scientific discoveries. We propose a mechanism for answering an arbitrarily long sequence of potentially adaptive statistical queries, by charging a price for each query and using the proceeds to collect additional samples. Crucially, we guarantee statistical validity without any assumptions on how the queries are generated. We also ensure with high probability that the cost for M non-adaptive queries is O(log M), while the cost to a potentially adaptive user who makes M queries that do not depend on any others is O(√M).
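The sketch below is only an illustration of the high-level idea stated in the abstract (charge per query, spend the accumulated revenue on fresh samples); it is not the paper's actual mechanism, pricing scheme, or validation rule, and the data source `sample_batch`, the flat price, and the refresh condition are all hypothetical choices made for the example.

```python
import numpy as np


class PayPerQueryDatabase:
    """Illustrative pay-per-query statistical database.

    NOT the paper's mechanism: this sketch only shows the abstract's
    high-level idea of charging a price for each statistical query and
    using accumulated revenue to purchase additional samples. It does
    not implement the paper's validation scheme or its O(log M) /
    O(sqrt(M)) cost guarantees.
    """

    def __init__(self, sample_batch, batch_size=1000,
                 price_per_query=1.0, cost_per_sample=0.01):
        self.sample_batch = sample_batch      # hypothetical callable: n -> array of n fresh samples
        self.batch_size = batch_size
        self.price_per_query = price_per_query
        self.cost_per_sample = cost_per_sample
        self.revenue = 0.0
        self.data = sample_batch(batch_size)  # initial sample buffer

    def answer(self, query):
        """Answer a statistical query q: sample -> [0, 1] by its empirical mean.

        Charges a flat (illustrative) price; once revenue covers a new
        batch, the buffer is replaced with fresh samples so later
        queries are answered on new data.
        """
        self.revenue += self.price_per_query
        estimate = float(np.mean([query(x) for x in self.data]))

        # Simplistic refresh rule for illustration only; the paper chooses
        # prices and refresh times to obtain its validity and cost bounds.
        batch_cost = self.batch_size * self.cost_per_sample
        if self.revenue >= batch_cost:
            self.revenue -= batch_cost
            self.data = self.sample_batch(self.batch_size)

        return estimate


if __name__ == "__main__":
    rng = np.random.default_rng(1)
    draw = lambda n: rng.normal(loc=0.3, scale=1.0, size=n)  # hypothetical data source
    db = PayPerQueryDatabase(draw, batch_size=500,
                             price_per_query=1.0, cost_per_sample=0.5)
    est = db.answer(lambda x: float(x > 0.0))                # estimate P(X > 0)
    print(f"estimated P(X > 0) ~ {est:.3f}")
```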

Original language: English
Pages (from-to): 6531-6540
Number of pages: 10
Journal: Advances in Neural Information Processing Systems
Volume: 2018-December
State: Published - 2018
Event: 32nd Conference on Neural Information Processing Systems, NeurIPS 2018 - Montreal, Canada
Duration: 2 Dec 2018 - 8 Dec 2018

Funding

Funder: National Science Foundation
Funder number: 1754881
