TY - JOUR
T1 - Performance of first-order methods for smooth convex minimization
T2 - A novel approach
AU - Drori, Yoel
AU - Teboulle, Marc
PY - 2014/6
Y1 - 2014/6
N2 - We introduce a novel approach for analyzing the worst-case performance of first-order black-box optimization methods. We focus on smooth unconstrained convex minimization over the Euclidean space. Our approach relies on the observation that, by definition, the worst-case behavior of a black-box optimization method is by itself an optimization problem, which we call the performance estimation problem (PEP). We formulate and analyze the PEP for two classes of first-order algorithms. We first apply this approach to the classical gradient method and derive a new and tight analytical bound on its performance. We then consider a broader class of first-order black-box methods, which includes, among others, the so-called heavy-ball method and the fast gradient schemes. We show that for this broader class, it is possible to derive new bounds on the performance of these methods by solving an adequately relaxed convex semidefinite PEP. Finally, we present an efficient procedure for finding optimal step sizes, which results in a first-order black-box method that achieves the best worst-case performance.
KW - Complexity
KW - Duality
KW - Fast gradient schemes
KW - Heavy Ball method
KW - Performance of first-order algorithms
KW - Rate of convergence
KW - Semidefinite relaxations
KW - Smooth convex minimization
UR - http://www.scopus.com/inward/record.url?scp=84901840826&partnerID=8YFLogxK
U2 - 10.1007/s10107-013-0653-0
DO - 10.1007/s10107-013-0653-0
M3 - Article
AN - SCOPUS:84901840826
SN - 0025-5610
VL - 145
SP - 451
EP - 482
JO - Mathematical Programming
JF - Mathematical Programming
IS - 1-2
ER -