A dynamic smoothing technique for a class of nonsmooth optimization problems on manifolds

Amir Beck, Israel Rosset

Research output: Contribution to journal › Article › peer-review

Abstract

We consider the problem of minimizing the sum of a smooth nonconvex function and a nonsmooth convex function over a compact embedded submanifold. We describe an algorithm, which we refer to as "dynamic smoothing gradient descent on manifolds" (DSGM), that is based on applying Riemannian gradient steps on a series of smooth approximations of the objective function that are determined by a diminishing sequence of smoothing parameters. The DSGM algorithm is simple and can be easily employed for a broad class of problems without any complex adjustments. We show that all accumulation points of the sequence generated by the method are stationary. We devise a convergence rate of O(1/k^{1/3}) in terms of an optimality measure that can be easily computed. Numerical experiments illustrate the potential of the DSGM method.
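As an illustration of the kind of scheme the abstract describes, the following is a minimal numerical sketch, not the paper's exact DSGM specification: it minimizes a smooth nonconvex quadratic plus the l1 norm over the unit sphere, replaces the l1 term by a Huber-type smooth approximation with a diminishing smoothing parameter mu_k ~ k^(-1/3), and takes a Riemannian gradient step (tangent-space projection followed by retraction) at each iteration. The stepsize rule, the Huber smoothing, and the example objective are all assumptions made for this sketch.

```python
import numpy as np

def huber_grad(x, mu):
    # Gradient of the Huber smoothing of |.|, applied componentwise;
    # a standard smooth approximation of the l1 norm with parameter mu.
    return np.clip(x / mu, -1.0, 1.0)

def dsgm_sphere_sketch(grad_f, x0, iters=1000, step=0.1):
    # Illustrative dynamic-smoothing gradient scheme on the unit sphere.
    x = x0 / np.linalg.norm(x0)
    for k in range(1, iters + 1):
        mu = k ** (-1.0 / 3.0)           # diminishing smoothing parameter
        g = grad_f(x) + huber_grad(x, mu)  # gradient of the smoothed objective
        rg = g - (g @ x) * x             # project onto the tangent space at x
        x = x - (step * mu) * rg         # gradient step (stepsize tied to mu)
        x = x / np.linalg.norm(x)        # retraction back onto the sphere
    return x

# Example: f(x) = 0.5 x^T A x with A symmetric indefinite,
# so f is nonconvex when restricted to the sphere.
rng = np.random.default_rng(0)
A = rng.standard_normal((5, 5))
A = (A + A.T) / 2
x = dsgm_sphere_sketch(lambda v: A @ v, rng.standard_normal(5))
print(np.linalg.norm(x))  # the iterate remains on the unit sphere
```

The tangent-space projection followed by renormalization is the usual Riemannian gradient step on the sphere; the coupling of the stepsize to mu_k mimics the role a diminishing smoothing sequence plays in the convergence analysis.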

Original language: English
Pages (from-to): 1473-1493
Number of pages: 21
Journal: SIAM Journal on Optimization
Volume: 33
Issue number: 3
DOIs
State: Published - 2023

Funding

Funder: Israel Science Foundation, grant number 926/21

Keywords

• dynamic smoothing
• first-order methods
• manifold optimization
• rate of convergence

