A Rank-1 Sketch for Matrix Multiplicative Weights

Yair Carmon, John C. Duchi, Aaron Sidford, Kevin Tian

Research output: Contribution to journal › Conference article › peer-review

13 Scopus citations

Abstract

We show that a simple randomized sketch of the matrix multiplicative weight (MMW) update enjoys (in expectation) the same regret bounds as MMW, up to a small constant factor. Unlike MMW, where every step requires full matrix exponentiation, our steps require only a single product of the form e^A b, which the Lanczos method approximates efficiently. Our key technique is to view the sketch as a randomized mirror projection, and perform mirror descent analysis on the expected projection. Our sketch solves the online eigenvector problem, improving the best known complexity bounds by Ω(log^5 n). We also apply this sketch to semidefinite programming in saddle-point form, yielding a simple primal-dual scheme with guarantees matching the best in the literature.
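The abstract's core idea can be illustrated in a few lines: instead of forming the full MMW iterate exp(-η S)/tr(exp(-η S)), draw a random vector and compute a single matrix-exponential-vector product. The toy loop below is a minimal sketch of that idea, not the authors' implementation; the loss matrices, dimensions, and step size are assumptions for illustration, and SciPy's `expm_multiply` stands in for the Lanczos approximation the paper uses.

```python
import numpy as np
from scipy.linalg import expm                  # dense reference: forms exp(A) explicitly
from scipy.sparse.linalg import expm_multiply  # computes exp(A) @ b without forming exp(A)

# Toy parameters (assumptions, not from the paper): dimension, rounds, step size.
rng = np.random.default_rng(0)
n, T, eta = 50, 30, 0.1

S = np.zeros((n, n))  # running sum of symmetric loss matrices
for t in range(T):
    # A hypothetical bounded symmetric loss matrix for this round.
    G = rng.standard_normal((n, n))
    G = (G + G.T) / 2
    G /= np.linalg.norm(G, 2)

    # Full MMW would form X_t = exp(-eta * S) / tr(exp(-eta * S)): a full matrix
    # exponentiation per step. The rank-1 sketch instead draws a random vector u
    # and needs only one product of the form e^A b (here A = -eta/2 * S, b = u),
    # which the Lanczos method approximates; expm_multiply plays that role here.
    u = rng.standard_normal(n)
    v = expm_multiply(-0.5 * eta * S, u)
    x = v / np.linalg.norm(v)  # play the rank-1 iterate x x^T instead of X_t

    # <G_t, x x^T>; in expectation this tracks MMW's loss up to a small constant.
    loss = x @ G @ x
    S += G

# Sanity reference: expm_multiply(A, u) agrees with forming exp(A) and multiplying.
ref = expm(-0.5 * eta * S) @ u
```

Because each step costs one exponential-vector product rather than a full eigendecomposition, the per-iteration cost drops from roughly O(n^3) to the cost of a few sparse matrix-vector products.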

Original language: English
Pages (from-to): 589-623
Number of pages: 35
Journal: Proceedings of Machine Learning Research
Volume: 99
State: Published - 2019
Externally published: Yes
Event: 32nd Conference on Learning Theory, COLT 2019, Phoenix, United States
Duration: 25 Jun 2019 - 28 Jun 2019

Funding

National Science Foundation: 1553086
Alfred P. Sloan Foundation: DGE-1656518, ONR-YIP N00014-19-1-2288, CCF-1844855

Keywords

• Lanczos method
• Online learning
• matrix exponential
• mirror descent
• spectrahedron
