RECAPP: Crafting a More Efficient Catalyst for Convex Optimization

Yair Carmon, Arun Jambulapati, Yujia Jin*, Aaron Sidford

*Corresponding author for this work

Research output: Contribution to journal › Conference article › Peer-reviewed


Abstract

The accelerated proximal point algorithm (APPA), also known as “Catalyst”, is a well-established reduction from convex optimization to approximate proximal point computation (i.e., regularized minimization). This reduction is conceptually elegant and yields strong convergence rate guarantees. However, these rates feature an extraneous logarithmic term arising from the need to compute each proximal point to high accuracy. In this work, we propose a novel Relaxed Error Criterion for Accelerated Proximal Point (RECAPP) that eliminates the need for high accuracy subproblem solutions. We apply RECAPP to two canonical problems: finite-sum and max-structured minimization. For finite-sum problems, we match the best known complexity, previously obtained by carefully-designed problem-specific algorithms. For minimizing max_y f(x, y) where f is convex in x and strongly-concave in y, we improve on the best known (Catalyst-based) bound by a logarithmic factor.
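To make the reduction the abstract describes concrete, here is a minimal sketch of the classical proximal point outer loop that APPA/Catalyst builds on: each outer step approximately minimizes the regularized subproblem f(x) + (λ/2)‖x − x_k‖². This is an illustrative stand-in, not the paper's RECAPP method; the regularization weight `lam`, the fixed-iteration gradient-descent inner solver, and all step sizes are assumptions chosen for the example.

```python
import numpy as np

def prox_point(grad_f, x0, lam=1.0, outer_iters=50, inner_iters=20, inner_lr=0.01):
    """Plain (non-accelerated) proximal point method: repeatedly take an
    approximate step x_{k+1} ~= argmin_x f(x) + (lam/2) * ||x - x_k||^2,
    here using a fixed number of gradient steps as the inner solver."""
    x = np.asarray(x0, dtype=float)
    for _ in range(outer_iters):
        center = x.copy()  # proximal center for this subproblem
        y = x.copy()
        for _ in range(inner_iters):
            # gradient of the regularized subproblem f(y) + (lam/2)||y - center||^2
            y -= inner_lr * (grad_f(y) + lam * (y - center))
        x = y  # accept the approximate proximal point
    return x

# Example: f(x) = 0.5 * x^T A x with an ill-conditioned diagonal A;
# the unique minimizer is the origin.
A = np.diag([100.0, 1.0])
sol = prox_point(lambda x: A @ x, x0=np.array([1.0, 1.0]))
```

Catalyst-style schemes wrap an accelerated outer sequence around these subproblems; the logarithmic overhead the paper removes comes from how accurately each inner solve must be carried out.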

Original language: English
Pages (from-to): 2658-2685
Number of pages: 28
Journal: Proceedings of Machine Learning Research
Volume: 162
State: Published - 2022
Event: 39th International Conference on Machine Learning, ICML 2022 - Baltimore, United States
Duration: 17 Jul 2022 - 23 Jul 2022

Funding

Funders (with funder number where given):
- Blavatnik Family Foundation
- Israel Science Foundation — 2486/21
- Microsoft Research
- National Science Foundation (NSF) — CCF-1955039, CCF-1844855
- Stanford Graduate Fellowship
- Stanford University
