On the convergence of block coordinate descent type methods

Amir Beck, Luba Tetruashvili

Research output: Contribution to journal › Article › peer-review

369 Scopus citations

Abstract

In this paper we study smooth convex programming problems in which the vector of decision variables is split into several blocks. We analyze the block coordinate gradient projection method, in which each iteration performs a gradient projection step with respect to a single block, the blocks being taken in cyclic order. A global sublinear rate of convergence is established for this method, and it is shown that the method can be accelerated when the problem is unconstrained. In the unconstrained setting we also prove a sublinear rate of convergence for the so-called alternating minimization method when the number of blocks is two. When the objective function is in addition strongly convex, a linear rate of convergence is established.
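As a rough illustration of the method the abstract describes, the sketch below runs the cyclic block coordinate gradient method on an unconstrained, strongly convex quadratic with two blocks, using a step size of 1/L_i per block, where L_i bounds the block Lipschitz constant of the gradient. The function name, the step-size computation, and the example data are assumptions made for illustration; this is not the authors' code or their exact setting.

```python
import numpy as np

def cyclic_block_gradient_descent(A, b, block_slices, num_cycles=500):
    """Minimize f(x) = 0.5 * x^T A x - b^T x by cyclically taking a
    gradient step with respect to one block of variables at a time.

    Illustrative sketch only (unconstrained case, so the projection
    step of the paper's method is the identity).
    """
    x = np.zeros(len(b))
    # Per-block step sizes 1 / L_i: here L_i is taken as the largest
    # eigenvalue of the corresponding diagonal block of A, which bounds
    # the block Lipschitz constant of the gradient for this quadratic.
    steps = [1.0 / np.linalg.eigvalsh(A[s, s])[-1] for s in block_slices]
    for _ in range(num_cycles):
        for s, t in zip(block_slices, steps):
            grad_block = A[s, :] @ x - b[s]  # partial gradient w.r.t. block
            x[s] -= t * grad_block           # gradient step on this block
    return x

# A small strongly convex example with four variables in two blocks
# (symmetric, irreducibly diagonally dominant, hence positive definite).
A = np.array([[4.0, 1.0, 0.5, 0.0],
              [1.0, 3.0, 0.0, 0.5],
              [0.5, 0.0, 2.0, 0.5],
              [0.0, 0.5, 0.5, 1.0]])
b = np.array([1.0, 2.0, 3.0, 4.0])
blocks = [slice(0, 2), slice(2, 4)]
x = cyclic_block_gradient_descent(A, b, blocks)
```

Since the example is strongly convex, the iterates approach the unique minimizer, the solution of Ax = b, consistent with the linear-rate result stated in the abstract.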

Original language: English
Pages (from-to): 2037-2060
Number of pages: 24
Journal: SIAM Journal on Optimization
Volume: 23
Issue number: 4
State: Published - 2013
Externally published: Yes

Keywords

  • Alternating minimization
  • Block descent methods
  • Convex optimization
  • Rate of convergence
