We extend ε-subgradient descent methods for unconstrained nonsmooth convex minimization to constrained problems over polyhedral sets, in particular over the nonnegative orthant ℝ^p_+. This is done by replacing the usual squared Euclidean regularization term used in subgradient schemes with the logarithmic-quadratic distance-like function recently introduced by the authors. We then obtain interior ε-subgradient descent methods, which allow us to provide a natural extension of bundle methods and Polyak's subgradient projection methods for nonsmooth convex minimization. Furthermore, similar extensions are considered for smooth constrained minimization to produce interior gradient descent methods. Global convergence, as well as an improved global efficiency estimate, is established for these methods within a unifying principle and under minimal assumptions on the problem's data.
- Bundle methods
- Convex programming
- Logarithmic-quadratic proximal methods
- Nondifferentiable convex optimization
- Projected gradient methods
- Subgradient algorithms
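To illustrate the idea behind the abstract, the following is a minimal sketch of an interior gradient step over ℝ^p_+ in which the squared Euclidean regularizer is replaced by a logarithmic-quadratic distance-like function of the form d(x, y) = Σ_j y_j² φ(x_j/y_j) with φ(t) = (ν/2)(t−1)² + μ(t − log t − 1), ν > μ > 0. This is an illustrative reconstruction, not the paper's algorithm: the parameter names (`nu`, `mu`, `step`) and the toy objective are assumptions chosen for the example.

```python
import math

def lq_interior_step(x, grad, step, nu=1.0, mu=0.5):
    """One interior step: solve min_u <grad, u> + (1/step) * d(u, x).

    The problem is separable; setting the per-coordinate derivative to zero
    gives the quadratic
        nu*u^2 + (step*g + (mu - nu)*y)*u - mu*y^2 = 0,
    whose unique positive root keeps the iterate in the interior of R^p_+
    (illustrative derivation, assuming the log-quadratic form above).
    """
    new_x = []
    for y, g in zip(x, grad):
        b = step * g + (mu - nu) * y
        disc = b * b + 4.0 * nu * mu * y * y  # > b^2, so the root is positive
        new_x.append((-b + math.sqrt(disc)) / (2.0 * nu))
    return new_x

# Toy usage: minimize f(x) = 0.5*||x - a||^2 over x >= 0 with a = (1, -1);
# the constrained minimizer is (1, 0), and the iterates approach the
# boundary coordinate from the interior.
a = [1.0, -1.0]
x = [0.5, 0.5]
for _ in range(100):
    grad = [xi - ai for xi, ai in zip(x, a)]
    x = lq_interior_step(x, grad, step=0.5)
print(x)  # approximately [1.0, 0.0]
```

No projection onto ℝ^p_+ is ever needed: the positive root of the coordinatewise quadratic plays the role that the projection plays in Polyak-type schemes, which is the structural point the abstract makes.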