Sparsity constrained nonlinear optimization: Optimality conditions and algorithms

Amir Beck, Yonina C. Eldar

Research output: Contribution to journal › Article › peer-review

211 Scopus citations

Abstract

This paper treats the problem of minimizing a general continuously differentiable function subject to sparsity constraints. We present and analyze several different optimality criteria which are based on the notions of stationarity and coordinatewise optimality. These conditions are then used to derive three numerical algorithms aimed at finding points satisfying the resulting optimality criteria: the iterative hard thresholding method and the greedy and partial sparse-simplex methods. The first algorithm is essentially a gradient projection method, while the remaining two algorithms are of a coordinate descent type. The theoretical convergence of these techniques and their relations to the derived optimality conditions are studied. The algorithms and results are illustrated by several numerical examples.
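The first algorithm mentioned in the abstract, iterative hard thresholding (IHT), can be illustrated with a short sketch. This is not the authors' implementation; it is a minimal illustration of the general scheme x ← H_s(x − t∇f(x)), where H_s keeps the s largest-magnitude entries, applied here to an assumed least-squares objective f(x) = ½‖Ax − b‖².

```python
import numpy as np

def hard_threshold(x, s):
    """Project onto s-sparse vectors: keep the s largest-magnitude entries."""
    out = np.zeros_like(x)
    idx = np.argsort(np.abs(x))[-s:]
    out[idx] = x[idx]
    return out

def iht(grad_f, x0, s, step, iters=300):
    """Iterative hard thresholding sketch: x <- H_s(x - step * grad_f(x))."""
    x = hard_threshold(np.asarray(x0, dtype=float), s)
    for _ in range(iters):
        x = hard_threshold(x - step * grad_f(x), s)
    return x

# Illustrative problem (assumed, not from the paper):
# minimize 0.5 * ||A x - b||^2  subject to  ||x||_0 <= s
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 10))
x_true = np.zeros(10)
x_true[[2, 7]] = [1.5, -2.0]          # 2-sparse ground truth
b = A @ x_true
grad = lambda x: A.T @ (A @ x - b)    # gradient of the least-squares objective
# Step size 1/L, with L = ||A||_2^2 the gradient's Lipschitz constant
x_hat = iht(grad, np.zeros(10), s=2, step=1.0 / np.linalg.norm(A, 2) ** 2)
```

With the step size chosen as the reciprocal of the gradient's Lipschitz constant, the iterates converge to the sparse minimizer in this well-conditioned example.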

Original language: English
Pages (from-to): 1480-1509
Number of pages: 30
Journal: SIAM Journal on Optimization
Volume: 23
Issue number: 3
DOIs
State: Published - 2013
Externally published: Yes

Keywords

  • Compressed sensing
  • Numerical methods
  • Optimality conditions
  • Sparsity constrained problems
  • Stationarity

