Abstract
We analyze proximal methods based on entropy-like distances for the minimization of convex functions subject to nonnegativity constraints. We prove global convergence results for these methods when the minimization steps are performed only approximately, and an ergodic convergence result for the problem of finding a zero of a maximal monotone operator. We also consider linearly constrained convex problems and establish a quadratic convergence rate result for linear programs. Our analysis allows us to simplify and extend the available convergence results for these methods.
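As an illustrative sketch (not drawn from the paper itself), a prototypical entropy-like proximal iteration for minimizing a convex function $f$ over the nonnegative orthant replaces the usual squared Euclidean regularizer with a Kullback–Leibler-type distance:

```latex
x^{k+1} \in \operatorname*{argmin}_{x \ge 0}
  \left\{ f(x) + \lambda_k \, d(x, x^k) \right\},
\qquad
d(x, y) = \sum_{i=1}^{n} \left( x_i \log \frac{x_i}{y_i} - x_i + y_i \right),
```

where $\lambda_k > 0$ are regularization parameters. The "approximate minimization steps" mentioned in the abstract correspond to solving this subproblem inexactly at each iteration; the entropy-like distance $d$ keeps the iterates strictly positive, so the nonnegativity constraint is handled implicitly.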
| Original language | English |
|---|---|
| Pages (from-to) | 1069-1083 |
| Number of pages | 15 |
| Journal | SIAM Journal on Optimization |
| Volume | 7 |
| Issue number | 4 |
| State | Published - Nov 1997 |
Keywords
- Convex optimization
- Maximal monotone operator
- Proximal methods