Abstract
This paper presents two classes of decomposition algorithms based on the proximal method of multipliers (PMM) introduced in the mid-1970s by Rockafellar for convex minimization. We first show that the PMM framework is at the root of many past and recent decomposition schemes suggested in the literature, allowing for an elementary analysis of these methods through a unified scheme. We then prove various sublinear global convergence rate results for the two classes of PMM-based decomposition algorithms, in terms of both function values and constraint violation. Furthermore, under a mild assumption on the problem's data, we derive rate-of-convergence results in terms of the original primal function values for both classes. As a by-product of our analysis, we also obtain convergence of the sequences produced by the two algorithm classes to optimal primal-dual solutions.
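For orientation, the classical PMM of Rockafellar can be sketched on the illustrative model problem of minimizing a convex function f over a linear equality constraint; the problem form, the parameter c_k, and the notation below are illustrative assumptions and not the paper's own (more general) formulation.

```latex
% Proximal method of multipliers (PMM), sketched for the illustrative problem
%   minimize f(x)  subject to  Ax = b,   f convex,
% with a penalty/prox parameter c_k > 0 (an assumed setting for exposition,
% not the separable model analyzed in the paper):
x^{k+1} \in \arg\min_{x} \Big\{ f(x) + \langle y^{k}, Ax - b \rangle
          + \tfrac{c_k}{2}\,\|Ax - b\|^{2}
          + \tfrac{1}{2 c_k}\,\|x - x^{k}\|^{2} \Big\},
\qquad
y^{k+1} = y^{k} + c_k\,\big(A x^{k+1} - b\big).
```

The extra primal proximal term \(\tfrac{1}{2c_k}\|x - x^{k}\|^{2}\) is what distinguishes PMM from the classical augmented Lagrangian method (method of multipliers), whose iteration consists of the same two steps without that term.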
| Original language | English |
| --- | --- |
| Pages (from-to) | 269-297 |
| Number of pages | 29 |
| Journal | SIAM Journal on Optimization |
| Volume | 24 |
| Issue number | 1 |
| DOIs | |
| State | Published - 2014 |
Keywords
- Alternating direction method of multipliers
- Augmented Lagrangians
- Efficiency estimates
- Nonasymptotic rate of convergence
- Nonsmooth convex minimization
- Primal-dual methods
- Proximal method of multipliers