## Abstract

We characterize the complexity of minimizing max_{i∈[N]} f_i(x) for convex, Lipschitz functions f_1, …, f_N. For non-smooth functions, existing methods require O(Nε^{-2}) queries to a first-order oracle to compute an ε-suboptimal point and Õ(Nε^{-1}) queries if the f_i are O(1/ε)-smooth. We develop methods with improved complexity bounds of Õ(Nε^{-2/3} + ε^{-8/3}) in the non-smooth case and Õ(Nε^{-2/3} + √N ε^{-1}) in the O(1/ε)-smooth case. Our methods consist of a recently proposed ball optimization oracle acceleration algorithm (which we refine) and a careful implementation of said oracle for the softmax function. We also prove an oracle complexity lower bound scaling as Ω(Nε^{-2/3}), showing that our dependence on N is optimal up to polylogarithmic factors.
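The softmax (log-sum-exp) surrogate referenced in the abstract replaces the non-smooth max with a smooth function that over-estimates it by at most λ log N. A minimal sketch of this standard smoothing, with illustrative names (not the paper's implementation):

```python
import math

def softmax_surrogate(values, lam):
    """Log-sum-exp smoothing of the max: lam * log(sum_i exp(v_i / lam)).

    Upper-bounds max(values), exceeding it by at most lam * log(N),
    so lam = eps / log(N) yields an eps-accurate O(1/lam)-smooth surrogate.
    """
    m = max(values)  # subtract the max before exponentiating for stability
    return m + lam * math.log(sum(math.exp((v - m) / lam) for v in values))

vals = [0.3, -1.2, 0.9, 0.5]
lam = 0.01
approx = softmax_surrogate(vals, lam)
# max(vals) <= approx <= max(vals) + lam * log(len(vals))
```

Minimizing such a surrogate to accuracy ε/2 with λ = ε/(2 log N) recovers an ε-suboptimal point for the original min-max problem, which is why oracle implementations for the softmax function suffice.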

| Original language | English |
|---|---|
| Pages (from-to) | 866-882 |
| Number of pages | 17 |
| Journal | Proceedings of Machine Learning Research |
| Volume | 134 |
| State | Published - 2021 |
| Event | 34th Conference on Learning Theory, COLT 2021 - Boulder, United States. Duration: 15 Aug 2021 → 19 Aug 2021 |

### Funding

| Funders | Funder number |
|---|---|
| Yandex Initiative for Machine Learning | |
| National Science Foundation | CCF-1955039, CCF-1844855 |
| Stanford University | |
| Microsoft Research | |
| Blavatnik Family Foundation | |

## Keywords

- Ball optimization oracle
- Convex optimization
- Min-max problems
- Monteiro-Svaiter acceleration
- Stochastic first-order methods