TY - JOUR
T1 - Nonconvex Lagrangian-based optimization
T2 - Monitoring schemes and global convergence
AU - Bolte, Jérôme
AU - Sabach, Shoham
AU - Teboulle, Marc
N1 - Publisher Copyright:
© 2018 INFORMS.
PY - 2018
Y1 - 2018
N2 - We introduce a novel approach to the global analysis of a difficult class of nonconvex-nonsmooth optimization problems within the important framework of Lagrangian-based methods. This genuinely nonlinear class captures many problems in modern, disparate fields of application. It features complex geometries, and qualification conditions and other regularity properties may fail to hold everywhere. To address these issues, we work along several research lines to develop an original general Lagrangian methodology that can deal, all at once, with the above obstacles. A first innovative feature of our approach is to introduce the concept of Lagrangian sequences for a broad class of algorithms. Central to this methodology is the idea of turning an arbitrary descent method into a multiplier method. Second, we equip these methods with a transitional regime that allows us to identify, in finitely many steps, a zone where the step-sizes of the algorithm can be tuned for the final converging regime. Then, despite the min-max nature of Lagrangian methods, using an original Lyapunov method we prove that each bounded sequence generated by the resulting monitoring schemes is globally convergent to a critical point for some fundamental Lagrangian-based methods in the broad semialgebraic setting; to the best of our knowledge, these are the first results of this kind.
AB - We introduce a novel approach to the global analysis of a difficult class of nonconvex-nonsmooth optimization problems within the important framework of Lagrangian-based methods. This genuinely nonlinear class captures many problems in modern, disparate fields of application. It features complex geometries, and qualification conditions and other regularity properties may fail to hold everywhere. To address these issues, we work along several research lines to develop an original general Lagrangian methodology that can deal, all at once, with the above obstacles. A first innovative feature of our approach is to introduce the concept of Lagrangian sequences for a broad class of algorithms. Central to this methodology is the idea of turning an arbitrary descent method into a multiplier method. Second, we equip these methods with a transitional regime that allows us to identify, in finitely many steps, a zone where the step-sizes of the algorithm can be tuned for the final converging regime. Then, despite the min-max nature of Lagrangian methods, using an original Lyapunov method we prove that each bounded sequence generated by the resulting monitoring schemes is globally convergent to a critical point for some fundamental Lagrangian-based methods in the broad semialgebraic setting; to the best of our knowledge, these are the first results of this kind.
KW - Global convergence
KW - Lagrangian-based methods
KW - Nonconvex and nonsmooth minimization
KW - Nonlinear composite minimization
KW - Nonsmooth Kurdyka-Łojasiewicz property
KW - Proximal method of multipliers
KW - Semialgebraic optimization
UR - http://www.scopus.com/inward/record.url?scp=85048741163&partnerID=8YFLogxK
U2 - 10.1287/moor.2017.0900
DO - 10.1287/moor.2017.0900
M3 - Article
AN - SCOPUS:85048741163
SN - 0364-765X
VL - 43
SP - 1210
EP - 1232
JO - Mathematics of Operations Research
JF - Mathematics of Operations Research
IS - 4
ER -