First-order methods for nonconvex quadratic minimization

Yair Carmon, John C. Duchi

Research output: Contribution to journal › Article › peer-review

22 Scopus citations

Abstract

We consider minimization of indefinite quadratics with either trust-region (norm) constraints or cubic regularization. Despite the nonconvexity of these problems, we prove that, under mild assumptions, gradient descent converges to their global solutions, and we give a nonasymptotic rate of convergence for the cubic variant. We also consider Krylov subspace solutions and establish sharp convergence guarantees to the solutions of both trust-region and cubic-regularized problems. Our rates mirror the behavior of these methods on convex quadratics and eigenvector problems, highlighting their scalability. When we use Krylov subspace solutions to approximate the cubic-regularized Newton step, our results recover the strongest known convergence guarantees to approximate second-order stationary points of general smooth nonconvex functions.
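To make the setting concrete, the following is a minimal illustrative sketch (not the paper's code) of plain gradient descent on a cubic-regularized quadratic f(x) = ½xᵀAx + bᵀx + (ρ/3)‖x‖³ with an indefinite A, which makes f nonconvex; the step size, iteration count, and problem data below are illustrative assumptions.

```python
import numpy as np

def cubic_reg_gd(A, b, rho, eta=0.01, iters=5000):
    """Gradient descent on f(x) = 0.5 x^T A x + b^T x + (rho/3) ||x||^3.

    The gradient is A x + b + rho ||x|| x; eta and iters are
    illustrative choices, not tuned constants from the paper.
    """
    x = np.zeros(len(b))
    for _ in range(iters):
        grad = A @ x + b + rho * np.linalg.norm(x) * x
        x = x - eta * grad
    return x

# An indefinite A (one negative eigenvalue), so the quadratic part is nonconvex.
A = np.diag([2.0, -1.0])
b = np.array([1.0, 0.5])
rho = 1.0

x_star = cubic_reg_gd(A, b, rho)
# A global minimizer x satisfies (A + rho ||x|| I) x = -b with
# rho ||x|| >= -lambda_min(A); here b has a nonzero component along the
# negative eigenvector, so this "easy case" instance has a unique
# stationary point and gradient descent converges to it.
grad_norm = np.linalg.norm(A @ x_star + b + rho * np.linalg.norm(x_star) * x_star)
```

In this toy instance, ρ‖x⋆‖ exceeds −λ_min(A) = 1 at the solution, matching the global-optimality condition for cubic-regularized quadratics.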

Original language: English
Pages (from-to): 395-436
Number of pages: 42
Journal: SIAM Review
Volume: 62
Issue number: 2
DOIs
State: Published - 2020
Externally published: Yes

Funding

Funders (funder number):

• Stanford University
• Stanford Artificial Intelligence Lab-Toyota Center For AI Research
• Fundação para a Ciência e a Tecnologia (Incentivo/FIS/LA0010/2013)
• National Science Foundation (NSF-CAREER-1553086)
• Office of Naval Research (N00014-19-2288)

Keywords

• Cubic regularization
• Global optimization
• Gradient descent
• Krylov subspace methods
• Newton's method
• Nonasymptotic convergence
• Nonconvex quadratics
• Trust-region methods
