Gradient descent finds the cubic-regularized nonconvex Newton step

Yair Carmon, John Duchi

Research output: Contribution to journal › Article › peer-review



We consider the minimization of a nonconvex quadratic form regularized by a cubic term, which may exhibit saddle points and a suboptimal local minimum. Nonetheless, we prove that, under mild assumptions, gradient descent approximates the global minimum to within ε accuracy in O(ε⁻¹ log(1/ε)) steps for large ε and O(log(1/ε)) steps for small ε (compared to a condition number we define), with at most logarithmic dependence on the problem dimension. When we use gradient descent to approximate the cubic-regularized Newton step, our result implies a rate of convergence to second-order stationary points of general smooth nonconvex functions.
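As a concrete illustration of the problem class the abstract describes (not the authors' code), the sketch below runs plain gradient descent on f(x) = ½ xᵀAx + bᵀx + (ρ/3)‖x‖³ with an indefinite A. The matrix, vector, regularization weight ρ, step size, and stopping tolerance are all illustrative placeholders; the paper's initialization and step-size analysis are more careful.

```python
import numpy as np

def cubic_reg_objective(x, A, b, rho):
    """f(x) = 1/2 x^T A x + b^T x + (rho/3) ||x||^3."""
    return 0.5 * x @ A @ x + b @ x + (rho / 3) * np.linalg.norm(x) ** 3

def cubic_reg_gradient(x, A, b, rho):
    """grad f(x) = A x + b + rho ||x|| x."""
    return A @ x + b + rho * np.linalg.norm(x) * x

def gradient_descent(A, b, rho, eta=0.01, max_steps=20000, tol=1e-8):
    """Fixed-step gradient descent from the origin (illustrative only)."""
    x = np.zeros_like(b)
    for _ in range(max_steps):
        g = cubic_reg_gradient(x, A, b, rho)
        if np.linalg.norm(g) < tol:
            break
        x = x - eta * g
    return x

# An indefinite A makes the quadratic part nonconvex; the cubic term
# keeps the objective bounded below, so a global minimum exists.
A = np.diag([1.0, -1.0])
b = np.array([1.0, 1.0])
x_star = gradient_descent(A, b, rho=1.0)
```

For reference, the global minimizer satisfies (A + ρ‖x‖I)x = −b with A + ρ‖x‖I positive semidefinite, the classical optimality condition from cubic-regularization and trust-region theory.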

Original language: English
Pages (from-to): 2146-2178
Number of pages: 33
Journal: SIAM Journal on Optimization
Issue number: 3
State: Published - 2019
Externally published: Yes


Funders and funder numbers:
    • SAIL-Toyota Center for AI Research
    • Stanford Graduate Fellowship
    • National Science Foundation: NSF-CAREER-1553086
    • Stanford University
    • Stanford Artificial Intelligence Lab-Toyota Center For AI Research


Keywords:
    • Cubic regularization
    • Global optimization
    • Gradient descent
    • Newton's method
    • Nonasymptotic rate of convergence
    • Nonconvex quadratics
    • Power method
    • Trust region methods


