TY - JOUR
T1 - On Linear Convergence of Non-Euclidean Gradient Methods without Strong Convexity and Lipschitz Gradient Continuity
AU - Bauschke, Heinz H.
AU - Bolte, Jérôme
AU - Chen, Jiawei
AU - Teboulle, Marc
AU - Wang, Xianfu
N1 - Publisher Copyright:
© 2019, Springer Science+Business Media, LLC, part of Springer Nature.
PY - 2019/9/15
Y1 - 2019/9/15
AB - The gradient method is well known to globally converge linearly when the objective function is strongly convex and admits a Lipschitz continuous gradient. In many applications, both assumptions are often too stringent, precluding the use of gradient methods. In the early 1960s, after the amazing breakthrough of Łojasiewicz on gradient inequalities, it was observed that uniform convexity assumptions could be relaxed and replaced by these inequalities. On the other hand, very recently, it has been shown that the Lipschitz gradient continuity can be lifted and replaced by a class of functions satisfying a non-Euclidean descent property expressed in terms of a Bregman distance. In this note, we combine these two ideas to introduce a class of non-Euclidean gradient-like inequalities, allowing us to prove linear convergence of a Bregman gradient method for nonconvex minimization, even when neither strong convexity nor Lipschitz gradient continuity holds.
KW - Bregman distance
KW - Descent lemma without Lipschitz gradient
KW - Gradient dominated inequality
KW - Linear rate of convergence
KW - Lipschitz-like convexity condition
KW - Non-Euclidean gradient methods
KW - Nonconvex minimization
KW - Łojasiewicz gradient inequality
UR - http://www.scopus.com/inward/record.url?scp=85064279341&partnerID=8YFLogxK
DO - 10.1007/s10957-019-01516-9
M3 - Article
AN - SCOPUS:85064279341
SN - 0022-3239
VL - 182
SP - 1068
EP - 1087
JO - Journal of Optimization Theory and Applications
JF - Journal of Optimization Theory and Applications
IS - 3
ER -
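
The abstract above refers to a Bregman gradient method: gradient descent with its Euclidean proximity term replaced by a Bregman distance D_h(u, x) = h(u) - h(x) - <grad h(x), u - x>. Below is a minimal Python sketch of one such step under an assumed Boltzmann-Shannon entropy kernel h(x) = sum_i x_i log x_i, for which the update has a closed multiplicative form; the kernel, the objective, and every name in the code are illustrative assumptions rather than the paper's construction.

import numpy as np

def bregman_gradient_step(grad, x, lam):
    # One Bregman (non-Euclidean) gradient step: the minimizer of
    #   <grad, u - x> + (1/lam) * D_h(u, x)  over u > 0,
    # with kernel h(x) = sum(x * log x), is this multiplicative update,
    # since grad h(x) = log x + 1 gives log x_new = log x - lam * grad.
    return x * np.exp(-lam * grad)

# Illustrative smooth objective on the positive orthant (an assumed
# example, not taken from the paper): f(x) = 0.5 * ||A x - b||^2.
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
b = A @ np.abs(rng.standard_normal(5))  # minimum attained at a positive point
grad_f = lambda x: A.T @ (A @ x - b)

x = np.ones(5)  # start in the interior of dom h
for _ in range(500):
    x = bregman_gradient_step(grad_f(x), x, lam=0.01)

print(0.5 * np.sum((A @ x - b) ** 2))  # residual after the multiplicative iterations

With an entropy kernel the iterates stay positive automatically, which illustrates the kind of geometry adaptation that the non-Euclidean descent property mentioned in the abstract is designed to exploit.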