An elementary approach to tight worst case complexity analysis of gradient based methods

Marc Teboulle*, Yakov Vaisbourd

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review


Abstract

This work presents a novel analysis that yields tight complexity bounds for gradient-based methods in convex optimization. We start by identifying some of the pitfalls rooted in the classical complexity analysis of the gradient descent method, and show how they can be remedied. Our methodology hinges on elementary and direct arguments in the spirit of the classical analysis. It allows us to establish some new (and reproduce known) tight complexity results for several fundamental algorithms, including the gradient descent, proximal point, and proximal gradient methods, which previously could be proven only through computer-assisted convergence proofs.
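For context, the sketch below is a minimal numerical illustration of the kind of bound at stake; it is not drawn from the paper. It runs gradient descent with the standard step size 1/L on a randomly generated convex quadratic (an assumed test problem; all data are arbitrary choices) and compares the measured gap f(x_k) - f(x*) against the classical descent-lemma bound L‖x_0 - x*‖²/(2k) and the tight worst-case bound L‖x_0 - x*‖²/(4k + 2) known from the performance-estimation literature.

```python
import numpy as np

# Minimal sketch (not from the paper): gradient descent with step size 1/L
# on an arbitrary convex quadratic, checking the observed optimality gap
# against two worst-case bounds valid for all L-smooth convex functions.
rng = np.random.default_rng(0)
n = 20
M = rng.standard_normal((50, n))
A = M.T @ M                        # positive definite Hessian (almost surely)
b = rng.standard_normal(n)

L = np.linalg.eigvalsh(A)[-1]      # gradient Lipschitz constant = lambda_max(A)
x_star = np.linalg.solve(A, b)     # unique minimizer of f

def f(x):
    """f(x) = 0.5 x^T A x - b^T x, an L-smooth convex objective."""
    return 0.5 * x @ A @ x - b @ x

f_star = f(x_star)
x = np.zeros(n)                               # x_0
R2 = float((x - x_star) @ (x - x_star))       # ||x_0 - x*||^2

for k in range(1, 101):
    x = x - (A @ x - b) / L                   # gradient step with h = 1/L
    gap = f(x) - f_star
    classical = L * R2 / (2 * k)              # classical descent-lemma bound
    tight = L * R2 / (4 * k + 2)              # tight bound from the PEP literature
    assert gap <= tight + 1e-9                # the tight bound must dominate the gap

print(f"k=100: gap={gap:.3e}  tight={tight:.3e}  classical={classical:.3e}")
```

Both inequalities hold on any L-smooth convex instance; the 1/(4k + 2) constant is the one that cannot be improved in the worst case, which is what distinguishes a tight analysis from the classical one.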

Original language: English
Pages (from-to): 63-96
Number of pages: 34
Journal: Mathematical Programming
Volume: 201
Issue number: 1-2
DOIs:
State: Published - Sep 2023

Funding

Funder: Israel Science Foundation
Funder number: 2619-20, 1844-16

Keywords

• Composite minimization
• Convex minimization
• Global rate of convergence
• Gradient descent
• Performance estimation problem
• Proximal schemes
• Worst-case complexity analysis
