On the convergence of alternating minimization for convex programming with applications to iteratively reweighted least squares and decomposition schemes

Amir Beck*

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

160 Scopus citations

Abstract

This paper is concerned with the alternating minimization (AM) method for solving convex minimization problems in which the vector of decision variables is split into two blocks. The objective function is the sum of a differentiable convex function and a separable, possibly nonsmooth, extended real-valued convex function; as a consequence, constraints can be incorporated through the nonsmooth term. We analyze the convergence rate of the method and establish a nonasymptotic sublinear rate of convergence in which the multiplicative constant depends on the minimal block Lipschitz constant. We then analyze the iteratively reweighted least squares (IRLS) method for solving convex problems involving sums of norms. Based on the results derived for the AM method, we establish a nonasymptotic sublinear rate of convergence for the IRLS method. In addition, we show an asymptotic rate of convergence whose efficiency estimate does not depend on the data of the problem. Finally, we study the convergence properties of a decomposition-based approach designed to solve a composite convex model.
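
As a concrete illustration of the IRLS scheme mentioned in the abstract, the following Python sketch addresses the sum-of-norms model min_x sum_i ||A_i x - b_i||_2. It is a minimal sketch, not the paper's exact formulation: the function name, the smoothed weighting rule with parameter eta, and the fixed iteration count are illustrative assumptions. The paper itself analyzes IRLS by relating it to the AM method; the sketch only shows the resulting iteration.

```python
import numpy as np

def irls_sum_of_norms(A_blocks, b_blocks, eta=1e-6, iters=100):
    """Hypothetical IRLS sketch for min_x sum_i ||A_i x - b_i||_2.

    Each iteration replaces the nonsmooth sum of norms by a weighted
    sum of squares, which reduces to a single linear system. The
    smoothing parameter eta > 0 keeps the weights bounded when a
    block residual approaches zero.
    """
    n = A_blocks[0].shape[1]
    x = np.zeros(n)
    for _ in range(iters):
        # Smoothed weight for block i: 1 / sqrt(||A_i x - b_i||^2 + eta^2).
        w = [1.0 / np.sqrt(np.linalg.norm(A @ x - b) ** 2 + eta ** 2)
             for A, b in zip(A_blocks, b_blocks)]
        # Normal equations of the weighted least squares subproblem:
        #   (sum_i w_i A_i^T A_i) x = sum_i w_i A_i^T b_i.
        H = sum(wi * (A.T @ A) for wi, A in zip(w, A_blocks))
        g = sum(wi * (A.T @ b) for wi, A, b in zip(w, A_blocks, b_blocks))
        # Assumes the stacked matrix [A_1; ...; A_m] has full column
        # rank, so that H is positive definite.
        x = np.linalg.solve(H, g)
    return x

# Example usage on random data (illustrative only):
rng = np.random.default_rng(0)
A_blocks = [rng.standard_normal((5, 3)) for _ in range(4)]
b_blocks = [rng.standard_normal(5) for _ in range(4)]
x_hat = irls_sum_of_norms(A_blocks, b_blocks)
```

The smoothing parameter eta plays a role analogous to the smoothing used in the paper's analysis: it keeps each weight finite when a block residual vanishes, at the cost of solving a slightly perturbed problem.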

Original language: English
Pages (from-to): 185-209
Number of pages: 25
Journal: SIAM Journal on Optimization
Volume: 25
Issue number: 1
DOIs:
State: Published - 2015
Externally published: Yes

Funding

Funders: Israel Science Foundation
Funder number: 253/12

Keywords

• Alternating minimization
• Convex optimization
• Iteratively reweighted least squares
• Rate of convergence