A Rank-1 Sketch for Matrix Multiplicative Weights

Yair Carmon, John C. Duchi, Aaron Sidford, Kevin Tian

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

We show that a simple randomized sketch of the matrix multiplicative weight (MMW) update enjoys (in expectation) the same regret bounds as MMW, up to a small constant factor. Unlike MMW, where every step requires full matrix exponentiation, our steps require only a single product of the form $e^A b$, which the Lanczos method approximates efficiently. Our key technique is to view the sketch as a randomized mirror projection, and perform mirror descent analysis on the expected projection. Our sketch solves the online eigenvector problem, improving the best known complexity bounds by $\Omega(\log n)$. We also apply this sketch to semidefinite programming in saddle-point form, yielding a simple primal-dual scheme with guarantees matching the best in the literature.
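As a rough illustration of the $e^A b$ primitive the abstract highlights, here is a minimal NumPy/SciPy sketch (not the authors' code) of a Lanczos approximation to $e^A b$ for symmetric $A$, together with the kind of rank-1 iterate such a product enables. The step size `eta`, the `loss_sum` input, and the helper names are illustrative assumptions, not details from the paper.

```python
import numpy as np
from scipy.linalg import expm

def lanczos_expm_multiply(matvec, b, k=30):
    """Approximate exp(A) @ b for symmetric A given only matrix-vector
    products, via a k-step Lanczos iteration (illustrative sketch)."""
    n = b.shape[0]
    k = min(k, n)
    Q = np.zeros((n, k))            # orthonormal Krylov basis
    alpha = np.zeros(k)             # diagonal of T = Q^T A Q
    beta = np.zeros(max(k - 1, 0))  # off-diagonal of T
    Q[:, 0] = b / np.linalg.norm(b)
    for j in range(k):
        w = matvec(Q[:, j])
        alpha[j] = Q[:, j] @ w
        # full reorthogonalization against the basis built so far
        w -= Q[:, : j + 1] @ (Q[:, : j + 1].T @ w)
        if j == k - 1:
            break
        nrm = np.linalg.norm(w)
        if nrm < 1e-12:             # Krylov subspace is invariant; stop early
            k = j + 1
            break
        beta[j] = nrm
        Q[:, j + 1] = w / nrm
    T = (np.diag(alpha[:k])
         + np.diag(beta[: k - 1], 1)
         + np.diag(beta[: k - 1], -1))
    # exp(A) b  ~=  ||b|| * Q exp(T) e_1, with exp(T) on a small k x k matrix
    return np.linalg.norm(b) * (Q[:, :k] @ expm(T)[:, 0])

def rank1_mmw_iterate(loss_sum, eta, rng):
    """One rank-1 iterate in the spirit of the abstract (an assumption-laden
    sketch): draw a random direction u, form v = exp(-(eta/2) * loss_sum) u
    with a single exp-times-vector product, and play v v^T / ||v||^2 instead
    of the fully exponentiated matrix."""
    n = loss_sum.shape[0]
    u = rng.standard_normal(n)
    u /= np.linalg.norm(u)
    v = lanczos_expm_multiply(lambda x: -(eta / 2) * (loss_sum @ x), u)
    v /= np.linalg.norm(v)
    return np.outer(v, v)           # unit-trace, rank-1 surrogate iterate
```

For example, with `rng = np.random.default_rng(0)` and a symmetric `loss_sum`, `rank1_mmw_iterate(loss_sum, 0.1, rng)` returns a unit-trace rank-1 matrix, and each call costs one approximate exp-times-vector product rather than a full matrix exponential.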
Original language: English
Title of host publication: Proceedings of the Thirty-Second Conference on Learning Theory (COLT)
Editors: Alina Beygelzimer, Daniel Hsu
Place of Publication: Phoenix, USA
Publisher: PMLR
Pages: 589-623
Number of pages: 35
Volume: 99
State: Published - 1 Jan 2019
Externally published: Yes
Event: 32nd Annual Conference on Learning Theory, COLT 2019 - Phoenix, United States
Duration: 25 Jun 2019 - 28 Jun 2019
Conference number: 32

Publication series

Name: Proceedings of Machine Learning Research
Publisher: PMLR
ISSN (Electronic): 2640-3498

Conference

Conference: 32nd Annual Conference on Learning Theory, COLT 2019
Abbreviated title: COLT 2019
Country/Territory: United States
City: Phoenix
Period: 25/06/19 - 28/06/19
