Acceleration with a ball optimization oracle

Yair Carmon*, Arun Jambulapati*, Qijia Jiang*, Yujia Jin*, Yin Tat Lee, Aaron Sidford*, Kevin Tian*

*Corresponding author for this work

Research output: Contribution to journal › Conference article › Peer-reviewed


Abstract

Consider an oracle which takes a point x and returns the minimizer of a convex function f in an ℓ2 ball of radius r around x. It is straightforward to show that roughly r^(-1) log(1/ε) calls to the oracle suffice to find an ε-approximate minimizer of f in an ℓ2 unit ball. Perhaps surprisingly, this is not optimal: we design an accelerated algorithm which attains an ε-approximate minimizer with roughly r^(-2/3) log(1/ε) oracle queries, and give a matching lower bound. Further, we implement ball optimization oracles for functions with locally stable Hessians using a variant of Newton's method and, in certain cases, stochastic first-order methods. The resulting algorithm applies to a number of problems of practical and theoretical import, improving upon previous results for logistic and ℓ∞ regression and achieving guarantees comparable to the state-of-the-art for ℓp regression.
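To illustrate the "straightforward" baseline the abstract refers to (not the paper's accelerated method), the following sketch iterates a ball optimization oracle directly. The oracle here is a hypothetical instantiation for the toy objective f(x) = ||x − x*||², whose minimizer over the radius-r ball around x is simply the projection of x* onto that ball; the function and variable names are illustrative, not from the paper.

```python
import numpy as np

def ball_oracle(x, x_star, r):
    """Argmin of f(y) = ||y - x_star||^2 over the l2 ball of radius r
    centered at x: project x_star onto that ball."""
    d = x_star - x
    dist = np.linalg.norm(d)
    if dist <= r:
        return x_star.copy()
    return x + (r / dist) * d  # project onto the ball boundary

def naive_ball_descent(x0, x_star, r, num_steps):
    """Each oracle call moves at most distance r toward the minimizer,
    so roughly (initial distance)/r calls suffice -- the r^(-1) rate
    the paper improves to r^(-2/3) via acceleration."""
    x = x0
    for _ in range(num_steps):
        x = ball_oracle(x, x_star, r)
    return x

x_star = np.array([1.0, 0.0])  # minimizer inside the unit ball
x0 = np.zeros(2)
r = 0.1
x = naive_ball_descent(x0, x_star, r, num_steps=12)
print(np.linalg.norm(x - x_star))  # reaches x_star after ceil(1/r) = 10 steps
```

Each step travels exactly r along the segment toward x*, so the iterate count scales as 1/r for this toy problem, matching the baseline rate stated in the abstract.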

Original language: English
Journal: Advances in Neural Information Processing Systems
Volume: 2020-December
State: Published - 2020
Event: 34th Conference on Neural Information Processing Systems, NeurIPS 2020 - Virtual, Online
Duration: 6 Dec 2020 - 12 Dec 2020

Funding

Funders (funder numbers):
PayPal and Microsoft
National Science Foundation (CCF-1955039, DMS-1839116, CCF-1740551, DMS-2023166, CCF-1844855, CCF-1749609)
Microsoft Research
