Making Progress Based on False Discoveries

Roi Livni*

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

We consider Stochastic Convex Optimization as a case study for Adaptive Data Analysis. A basic question is how many samples are needed in order to compute ε-accurate estimates of the O(1/ε²) gradients queried by gradient descent. We provide two intermediate answers to this question. First, we show that for a general analyst (not necessarily gradient descent) Ω(1/ε³) samples are required, which is more than the number of samples needed to simply optimize the population loss. Our construction builds upon a new lower bound (which may be of independent interest) for an analyst that may ask several non-adaptive questions in a batch, over a fixed and known number T of rounds of adaptivity, and requires a fraction of true discoveries. We show that for such an analyst Ω(√T/ε²) samples are necessary. Second, we show that, under certain assumptions on the oracle, Ω(1/ε^2.5) samples are necessary in an interaction with gradient descent, which is again suboptimal in terms of optimization. Our assumptions are that the oracle has only first-order access and is post-hoc generalizing. First-order access means that it can only compute the gradients of the sampled functions at points queried by the algorithm. Our assumption of post-hoc generalization follows from existing lower bounds for statistical queries. More generally, we provide a generic reduction from the standard setting of statistical queries to the problem of estimating gradients queried by gradient descent. Overall, these results stand in contrast with classical bounds showing that with O(1/ε²) samples one can optimize the population risk to accuracy O(ε), but, as it turns out, with spurious gradients.
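
To make the interaction concrete, below is a minimal illustrative sketch (in Python) of the setting described above, under toy assumptions: the per-sample loss, dimensions, and step size are hypothetical and are not the construction used in the paper. Gradient descent issues on the order of 1/ε² adaptive gradient queries, and a sample-based first-order oracle answers each query with the empirical mean of the per-sample gradients at the queried point (matching the "first order access" restriction: it only evaluates gradients at points queried by the algorithm).

```python
import numpy as np

rng = np.random.default_rng(0)
d, n = 10, 1000                       # dimension, number of samples (toy values)
samples = rng.normal(size=(n, d))     # z_i drawn i.i.d. from a toy data distribution

def sample_gradient(x, z):
    # Gradient of a hypothetical per-sample convex loss f(x; z) = 0.5 * ||x - z||^2.
    return x - z

def first_order_oracle(x):
    # Empirical gradient estimate: average of per-sample gradients at the queried point x.
    return np.mean([sample_gradient(x, z) for z in samples], axis=0)

def gradient_descent(eps, x0=None):
    # Gradient descent issues O(1/eps^2) adaptive gradient queries;
    # each query point depends on the oracle's previous answers.
    T = int(1.0 / eps**2)
    eta = eps                         # step size on the same order as eps (illustrative choice)
    x = np.zeros(d) if x0 is None else x0
    for _ in range(T):
        g = first_order_oracle(x)
        x = x - eta * g
    return x

x_hat = gradient_descent(eps=0.1)
```

The paper's question is how large n must be for such empirically estimated gradients to remain accurate across all the adaptively chosen query points, not merely for the final iterate to have small population risk.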

Original language: English
Title of host publication: 15th Innovations in Theoretical Computer Science Conference, ITCS 2024
Editors: Venkatesan Guruswami
Publisher: Schloss Dagstuhl - Leibniz-Zentrum für Informatik GmbH, Dagstuhl Publishing
ISBN (Electronic): 9783959773096
DOIs
State: Published - Jan 2024
Event: 15th Innovations in Theoretical Computer Science Conference, ITCS 2024 - Berkeley, United States
Duration: 30 Jan 2024 - 2 Feb 2024

Publication series

Name: Leibniz International Proceedings in Informatics, LIPIcs
Volume: 287
ISSN (Print): 1868-8969

Conference

Conference: 15th Innovations in Theoretical Computer Science Conference, ITCS 2024
Country/Territory: United States
City: Berkeley
Period: 30/01/24 - 2/02/24

Funding

Funders and funder numbers:
European Commission: FOG-101116258 FoG
Israel Science Foundation: 2188/20

Keywords

• Adaptive Data Analysis
• Learning Theory
• Stochastic Convex Optimization
