The cyclic block conditional gradient method for convex optimization problems

Amir Beck, Edouard Pauwels, Shoham Sabach

Research output: Contribution to journal › Article › peer-review

18 Scopus citations

Abstract

In this paper we study the convex problem of optimizing the sum of a smooth function and a compactly supported nonsmooth term with a specific separable form. We analyze the block version of the generalized conditional gradient method when the blocks are chosen in a cyclic order. A global sublinear rate of convergence is established for two different stepsize strategies commonly used in this class of methods. Numerical comparisons of the proposed method to both the classical conditional gradient algorithm and its random block version demonstrate the effectiveness of the cyclic block update rule.
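The abstract describes a block version of the conditional gradient (Frank-Wolfe) method in which blocks are updated in cyclic order, each via a per-block linear oracle and a stepsize rule. The following is an illustrative sketch only, not the paper's implementation: it applies a cyclic block conditional gradient loop with the common predefined stepsize 2/(k+2) to a toy quadratic over a product of unit simplices. All names (`cyclic_block_cg`, `simplex_lmo`) are invented for this example.

```python
import numpy as np

def cyclic_block_cg(grad, lmo_blocks, x0, n_iters=200):
    """Hedged sketch of a cyclic block conditional gradient method.

    grad       : function returning a list of per-block gradients at x
    lmo_blocks : per-block linear minimization oracles; lmo_blocks[i] takes
                 the gradient on block i and returns a minimizer of the
                 linear form over that block's feasible set
    x0         : list of per-block starting points (numpy arrays)
    """
    x = [b.copy() for b in x0]
    for k in range(n_iters):
        for i in range(len(x)):            # cyclic sweep over the blocks
            g = grad(x)
            p = lmo_blocks[i](g[i])        # linear oracle on block i only
            step = 2.0 / (k + 2.0)         # predefined stepsize strategy
            x[i] = x[i] + step * (p - x[i])
    return x

def simplex_lmo(g):
    # Linear minimization over the unit simplex: pick the best vertex.
    p = np.zeros_like(g)
    p[np.argmin(g)] = 1.0
    return p

# Toy problem: minimize sum_i ||x_i - c_i||^2 with each block in the simplex.
c = [np.array([0.8, 0.1, 0.1]), np.array([0.2, 0.5, 0.3])]
grad = lambda x: [2.0 * (x[i] - c[i]) for i in range(len(x))]
x0 = [np.ones(3) / 3.0, np.ones(3) / 3.0]
sol = cyclic_block_cg(grad, [simplex_lmo, simplex_lmo], x0)
```

Since each target `c[i]` lies inside its simplex, the iterates approach `c` at the method's sublinear rate while every update stays feasible (each `x[i]` remains a convex combination of simplex vertices).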

Original language: English
Pages (from-to): 2024-2049
Number of pages: 26
Journal: SIAM Journal on Optimization
Volume: 25
Issue number: 4
DOIs
State: Published - 2015
Externally published: Yes

Funding

Funder: Israel Science Foundation
Funder number: 253/12

Keywords

• Conditional gradient
• Cyclic block decomposition
• Iteration complexity
• Linear oracle
• Nonsmooth convex minimization
• Support vector machine
