Abstract
We consider the problem of minimizing the sum of a smooth nonconvex function and a nonsmooth convex function over a compact embedded submanifold. We describe an algorithm, which we refer to as "dynamic smoothing gradient descent on manifolds" (DSGM), that applies Riemannian gradient steps to a sequence of smooth approximations of the objective function determined by a diminishing sequence of smoothing parameters. The DSGM algorithm is simple and can be easily employed for a broad class of problems without any complex adjustments. We show that all accumulation points of the sequence generated by the method are stationary. We establish a convergence rate of O(1/k^{1/3}) in terms of an easily computable optimality measure. Numerical experiments illustrate the potential of the DSGM method.
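To make the scheme concrete, here is a minimal sketch of a dynamic-smoothing Riemannian gradient loop of the kind the abstract describes. It assumes the unit sphere as the compact embedded submanifold, the nonsmooth convex part g(x) = ||x||_1 smoothed via its Moreau envelope, a smoothing schedule mu_k ~ k^{-1/3}, and retraction by renormalization; the paper's exact update rule, step sizes, and schedule may differ.

```python
import numpy as np

def huber_grad(x, mu):
    # Gradient of the Moreau envelope of the l1-norm: componentwise clip(x/mu, -1, 1).
    return np.clip(x / mu, -1.0, 1.0)

def dsgm_sphere(grad_f, x0, iters=1000, mu0=1.0, step=0.1):
    # grad_f: Euclidean gradient of the smooth nonconvex part f.
    x = x0 / np.linalg.norm(x0)              # start on the unit sphere
    for k in range(1, iters + 1):
        mu = mu0 / k ** (1.0 / 3.0)          # diminishing smoothing parameter (assumed schedule)
        g = grad_f(x) + huber_grad(x, mu)    # Euclidean gradient of the smoothed objective
        rg = g - (x @ g) * x                 # project onto the tangent space at x
        x = x - step * mu * rg               # gradient step (mu-scaled step, an assumed choice)
        x = x / np.linalg.norm(x)            # retraction: renormalize back to the sphere
    return x

# Example: f(x) = -x' A x (a smooth nonconvex term), so f + ||.||_1 is sparse-PCA-like.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 20)); A = (A + A.T) / 2
x = dsgm_sphere(lambda x: -2 * A @ x, rng.standard_normal(20))
print(np.linalg.norm(x))  # 1.0 up to rounding: iterates stay on the manifold
```

The key structural feature, per the abstract, is that each iteration takes a plain Riemannian gradient step on the current smoothed surrogate, so no proximal or subgradient machinery on the manifold is needed.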
| Original language | English |
|---|---|
| Pages (from-to) | 1473-1493 |
| Number of pages | 21 |
| Journal | SIAM Journal on Optimization |
| Volume | 33 |
| Issue number | 3 |
| DOIs | |
| State | Published - 2023 |
Keywords
- dynamic smoothing
- first-order methods
- manifold optimization
- rate of convergence