TY - JOUR
T1 - A Dynamic Alternating Direction of Multipliers for Nonconvex Minimization with Nonlinear Functional Equality Constraints
AU - Cohen, Eyal
AU - Hallak, Nadav
AU - Teboulle, Marc
N1 - Publisher Copyright:
© 2021, The Author(s), under exclusive licence to Springer Science+Business Media, LLC, part of Springer Nature.
PY - 2022/6
Y1 - 2022/6
N2 - This paper studies the minimization of a broad class of nonsmooth nonconvex objective functions subject to nonlinear functional equality constraints, where the gradients of the differentiable parts in the objective and the constraints are only locally Lipschitz continuous. We propose a specific proximal linearized alternating direction method of multipliers in which the proximal parameter is generated dynamically, and we design an explicit and tractable backtracking procedure to generate it. We prove subsequent convergence of the method to a critical point of the problem, and global convergence when the problem’s data are semialgebraic. These results are obtained with no dependency on the explicit manner in which the proximal parameter is generated. As a byproduct of our analysis, we also obtain global convergence guarantees for the proximal gradient method with a dynamic proximal parameter under local Lipschitz continuity of the gradient of the smooth part of the nonlinear sum composite minimization model.
KW - Augmented Lagrangian-based methods
KW - Global convergence
KW - Kurdyka-Łojasiewicz property
KW - Nonconvex and nonsmooth minimization
KW - Proximal gradient method
UR - http://www.scopus.com/inward/record.url?scp=85114874423&partnerID=8YFLogxK
DO - 10.1007/s10957-021-01929-5
M3 - Article
AN - SCOPUS:85114874423
SN - 0022-3239
VL - 193
SP - 324
EP - 353
JO - Journal of Optimization Theory and Applications
JF - Journal of Optimization Theory and Applications
IS - 1-3
ER -