TY - JOUR
T1 - An Accelerated Coordinate Gradient Descent Algorithm for Non-separable Composite Optimization
AU - Aberdam, Aviad
AU - Beck, Amir
N1 - Publisher Copyright:
© 2021, The Author(s), under exclusive licence to Springer Science+Business Media, LLC, part of Springer Nature.
PY - 2022/6
Y1 - 2022/6
N2 - Coordinate descent algorithms are popular in machine learning and large-scale data analysis problems due to their low-cost iterative schemes and strong performance. In this work, we define a monotone accelerated coordinate gradient descent-type method for problems consisting of minimizing f + g, where f is quadratic and g is nonsmooth, non-separable, and has a low-complexity proximal mapping. The algorithm is enabled by employing the forward–backward envelope, a composite envelope that possesses an exact smooth reformulation of f + g. We prove that the algorithm achieves a convergence rate of O(1/k^{1.5}) in terms of the original objective function, improving on current coordinate descent-type algorithms. In addition, we describe an adaptive variant of the algorithm that backtracks the spectral information and the coordinate Lipschitz constants of the problem. We numerically examine our algorithms in various settings, including two-dimensional total-variation-based image inpainting problems, showing a clear performance advantage over current coordinate descent-type methods.
AB - Coordinate descent algorithms are popular in machine learning and large-scale data analysis problems due to their low-cost iterative schemes and strong performance. In this work, we define a monotone accelerated coordinate gradient descent-type method for problems consisting of minimizing f + g, where f is quadratic and g is nonsmooth, non-separable, and has a low-complexity proximal mapping. The algorithm is enabled by employing the forward–backward envelope, a composite envelope that possesses an exact smooth reformulation of f + g. We prove that the algorithm achieves a convergence rate of O(1/k^{1.5}) in terms of the original objective function, improving on current coordinate descent-type algorithms. In addition, we describe an adaptive variant of the algorithm that backtracks the spectral information and the coordinate Lipschitz constants of the problem. We numerically examine our algorithms in various settings, including two-dimensional total-variation-based image inpainting problems, showing a clear performance advantage over current coordinate descent-type methods.
KW - Composite functions
KW - Convex optimization
KW - Coordinate gradient descent
KW - Forward–backward envelope
UR - http://www.scopus.com/inward/record.url?scp=85119823632&partnerID=8YFLogxK
U2 - 10.1007/s10957-021-01957-1
DO - 10.1007/s10957-021-01957-1
M3 - Article
AN - SCOPUS:85119823632
SN - 0022-3239
VL - 193
SP - 219
EP - 246
JO - Journal of Optimization Theory and Applications
JF - Journal of Optimization Theory and Applications
IS - 1-3
ER -