Abstract
We consider minimization of indefinite quadratics with either trust-region (norm) constraints or cubic regularization. Despite the nonconvexity of these problems, we prove that, under mild assumptions, gradient descent converges to their global solutions and give a nonasymptotic rate of convergence for the cubic variant. We also consider Krylov subspace solutions and establish sharp convergence guarantees to the solutions of both trust-region and cubic-regularized problems. Our rates mirror the behavior of these methods on convex quadratics and eigenvector problems, highlighting their scalability. When we use Krylov subspace solutions to approximate the cubic-regularized Newton step, our results recover the strongest known convergence guarantees to approximate second-order stationary points of general smooth nonconvex functions.
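To make the cubic-regularized problem concrete, the following is a minimal sketch (not taken from the paper) of plain gradient descent on f(x) = ½ xᵀAx + bᵀx + (ρ/3)‖x‖³ for a randomly generated indefinite A. The matrix, vector, regularization weight, step size, and iteration count are all illustrative assumptions, and the sketch omits details such as perturbing the starting point in degenerate ("hard case") instances.

```python
# Hypothetical illustration: gradient descent on a cubic-regularized
# indefinite quadratic  f(x) = 1/2 x^T A x + b^T x + (rho/3) ||x||^3.
# All problem data and parameters below are illustrative assumptions.
import numpy as np

def cubic_reg_gradient_descent(A, b, rho, step=1e-2, iters=10000):
    """Plain gradient descent on the cubic-regularized quadratic."""
    x = np.zeros_like(b)  # start at the origin (hard-case perturbation omitted)
    for _ in range(iters):
        grad = A @ x + b + rho * np.linalg.norm(x) * x  # gradient of f at x
        x = x - step * grad
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n = 50
    M = rng.standard_normal((n, n))
    A = (M + M.T) / 2          # symmetric and generically indefinite
    b = rng.standard_normal(n)
    rho = 1.0
    x = cubic_reg_gradient_descent(A, b, rho)
    # At a global minimizer x*, the stationarity condition
    # (A + rho * ||x*|| * I) x* = -b holds, so the gradient norm should be small.
    residual = A @ x + b + rho * np.linalg.norm(x) * x
    print("gradient norm at final iterate:", np.linalg.norm(residual))
```

The trust-region variant replaces the cubic penalty with a norm constraint ‖x‖ ≤ R; the Krylov subspace solutions discussed in the abstract instead minimize the same objectives over subspaces spanned by b, Ab, A²b, and so on.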
Original language | English |
---|---|
Pages (from-to) | 395-436 |
Number of pages | 42 |
Journal | SIAM Review |
Volume | 62 |
Issue number | 2 |
DOIs | |
State | Published - 2020 |
Externally published | Yes |
Keywords
- Cubic regularization
- Global optimization
- Gradient descent
- Krylov subspace methods
- Newton's method
- Nonasymptotic convergence
- Nonconvex quadratics
- Trust-region methods