Generalization Bounds for Data-Driven Numerical Linear Algebra

Peter Bartlett, Piotr Indyk, Tal Wagner

Research output: Contribution to journal › Conference article › peer-review

5 Scopus citations

Abstract

Data-driven algorithms can adapt their internal structure or parameters to inputs from unknown application-specific distributions, by learning from a training sample of inputs. Several recent works have applied this approach to problems in numerical linear algebra, obtaining significant empirical gains in performance. However, no theoretical explanation for their success was known. In this work we prove generalization bounds for those algorithms, within the PAC-learning framework for data-driven algorithm selection proposed by Gupta and Roughgarden (SICOMP 2017). Our main results are closely matching upper and lower bounds on the fat shattering dimension of the learning-based low rank approximation algorithm of Indyk et al. (NeurIPS 2019). Our techniques are general, and provide generalization bounds for many other recently proposed data-driven algorithms in numerical linear algebra, covering both sketching-based and multigrid-based methods. This considerably broadens the class of data-driven algorithms for which a PAC-learning analysis is available.
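To make the object of the analysis concrete: the learning-based low rank approximation algorithm of Indyk et al. (NeurIPS 2019) learns the entries of a sketch matrix from training data, and then runs a standard sketch-and-solve pipeline. The following is a minimal sketch of that pipeline, assuming NumPy; it uses a random Gaussian sketch in place of a learned one, and the names `sketch_low_rank`, `S`, and `k` are illustrative, not from the paper.

```python
import numpy as np

def sketch_low_rank(A, S, k):
    """Rank-k approximation of A via a sketch matrix S:
    project A onto the row space of S @ A, then truncate to rank k.
    (In the learned variant, the entries of S are trained on sample inputs.)"""
    SA = S @ A                                  # (m, d) sketch of A
    _, _, Vt = np.linalg.svd(SA, full_matrices=False)
    AV = A @ Vt.T                               # project A onto rowspan(S @ A)
    U, s, Wt = np.linalg.svd(AV, full_matrices=False)
    AVk = (U[:, :k] * s[:k]) @ Wt[:k]           # best rank-k inside the subspace
    return AVk @ Vt                             # lift back to original coordinates

# Toy usage: an exactly rank-5 matrix is recovered exactly by a random sketch.
rng = np.random.default_rng(0)
A = rng.standard_normal((100, 5)) @ rng.standard_normal((5, 40))
S = rng.standard_normal((10, 100)) / np.sqrt(10)
Ak = sketch_low_rank(A, S, k=5)
```

The generalization question studied in the paper is how many training matrices are needed so that a sketch S learned on the sample performs well on fresh inputs from the same distribution, which the authors answer via bounds on the fat shattering dimension of this algorithm class.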

Original language: English
Pages (from-to): 2013-2024
Number of pages: 12
Journal: Proceedings of Machine Learning Research
Volume: 178
State: Published - 2022
Externally published: Yes
Event: 35th Conference on Learning Theory, COLT 2022 - London, United Kingdom
Duration: 2 Jul 2022 - 5 Jul 2022

Funding

Funders and funder numbers:

• GIST-MIT
• National Science Foundation (DMS-2022448)

Keywords

• PAC-learning
• data-driven algorithms
• fat shattering dimension
• low rank approximation
• multigrid
• numerical linear algebra
• pseudo-dimension
• sketching
