Interior gradient and epsilon-subgradient descent methods for constrained convex minimization

A. Auslender*, M. Teboulle

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review


We extend epsilon-subgradient descent methods for unconstrained nonsmooth convex minimization to constrained problems over polyhedral sets, in particular over the nonnegative orthant ℝ^p_+. This is done by replacing the usual quadratic regularization term used in subgradient schemes with the logarithmic-quadratic distance-like function recently introduced by the authors. We then obtain interior ε-subgradient descent methods, which allow us to provide a natural extension of bundle methods and Polyak's subgradient projection methods for nonsmooth convex minimization. Furthermore, similar extensions are considered for smooth constrained minimization to produce interior gradient descent methods. Global convergence as well as an improved global efficiency estimate are established for these methods within a unifying principle and under minimal assumptions on the problem's data.
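The idea in the abstract can be sketched numerically. Using the authors' logarithmic-quadratic kernel φ(t) = (ν/2)(t−1)² + μ(t − log t − 1) with ν > μ > 0, the per-coordinate interior gradient step minimizes g_i·x + (1/λ)·y_i²·φ(x/y_i), whose stationarity condition reduces to a quadratic with a unique positive root, so iterates stay in the open orthant without any projection. The function names, the parameter values (ν = 2, μ = 1, λ = 0.2), and the test objective below are illustrative assumptions, not taken from the paper; this is a minimal sketch of the mechanism, not the authors' full method.

```python
import math

def lq_interior_step(y, g, lam, nu=2.0, mu=1.0):
    """One interior gradient step from a strictly positive point y.

    Per coordinate, solves  min_{x>0}  g_i*x + (1/lam) * y_i^2 * phi(x/y_i)
    with phi(t) = (nu/2)(t-1)^2 + mu*(t - log t - 1).  Setting the derivative
    to zero and multiplying by x gives  a*x^2 + b*x - c = 0  with the
    coefficients below; the positive root keeps the iterate interior.
    """
    x_new = []
    for yi, gi in zip(y, g):
        a = nu / lam
        b = gi + (mu - nu) * yi / lam
        c = (mu / lam) * yi * yi
        x_new.append((-b + math.sqrt(b * b + 4.0 * a * c)) / (2.0 * a))
    return x_new

def grad(x):
    # gradient of the illustrative objective f(x) = (x1 - 1)^2 + (x2 + 0.5)^2,
    # whose minimizer over x >= 0 is (1, 0) -- active constraint on x2
    return [2.0 * (x[0] - 1.0), 2.0 * (x[1] + 0.5)]

x = [0.5, 0.5]  # strictly positive starting point
for _ in range(200):
    x = lq_interior_step(x, grad(x), lam=0.2)

print(x)  # approaches the constrained minimizer (1, 0) from the interior
```

In exact arithmetic every iterate is strictly positive; in floating point a coordinate converging to an active bound (here x2 → 0) may eventually round to zero. Note the contrast with projected gradient: no projection onto ℝ^p_+ is ever performed, the regularization term itself acts as the barrier.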

Original language: English
Pages (from-to): 1-26
Number of pages: 26
Journal: Mathematics of Operations Research
Issue number: 1
State: Published - Feb 2004


Keywords:
  • Bundle methods
  • Convex programming
  • Logarithmic-quadratic proximal methods
  • Nondifferentiable convex optimization
  • Projected gradient methods
  • Subgradient algorithms
