Faster Kernel Matrix Algebra via Density Estimation

Arturs Backurs*, Piotr Indyk*, Cameron Musco*, Tal Wagner*

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

5 Scopus citations

Abstract

We study fast algorithms for computing fundamental properties of a positive semidefinite kernel matrix K ∈ ℝ^{n×n} corresponding to n points x_1, ..., x_n ∈ ℝ^d. In particular, we consider estimating the sum of kernel matrix entries, along with its top eigenvalue and eigenvector. We show that the sum of matrix entries can be estimated to (1 + ε) relative error in time sublinear in n and linear in d for many popular kernels, including the Gaussian, exponential, and rational quadratic. For these kernels, we also show that the top eigenvalue (and an approximate eigenvector) can be approximated to (1 + ε) relative error in time subquadratic in n and linear in d. Our results represent significant advances in the best known runtimes for these problems. They leverage the positive definiteness of the kernel matrix, along with a recent line of work on efficient kernel density estimation.
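The link between the kernel matrix sum and kernel density estimation rests on the identity Σ_{i,j} k(x_i, x_j) = Σ_i KDE(x_i), where KDE(q) = Σ_j k(q, x_j). The sketch below is a minimal illustration of this identity, not the paper's algorithm: it uses a brute-force Gaussian density oracle and plain uniform row sampling, whereas the paper attains sublinear time and (1 + ε) guarantees with fast KDE data structures and a more careful sampling scheme. The function names and the `bandwidth` parameter are illustrative assumptions.

```python
import numpy as np

def gaussian_kernel_density(query, points, bandwidth=1.0):
    """Brute-force Gaussian density: sum_j exp(-||query - x_j||^2 / bandwidth)."""
    sq_dists = np.sum((points - query) ** 2, axis=1)
    return np.sum(np.exp(-sq_dists / bandwidth))

def estimate_kernel_sum(points, num_samples=100, bandwidth=1.0, seed=None):
    """Unbiased estimate of sum_{i,j} k(x_i, x_j).

    Uses the identity: kernel matrix sum = sum_i KDE(x_i).
    Averages KDE values at uniformly sampled rows and rescales by n.
    """
    rng = np.random.default_rng(seed)
    n = points.shape[0]
    idx = rng.integers(0, n, size=num_samples)
    densities = [gaussian_kernel_density(points[i], points, bandwidth) for i in idx]
    return n * np.mean(densities)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(2000, 5))
    est = estimate_kernel_sum(X, num_samples=200, bandwidth=2.0, seed=1)
    exact = sum(gaussian_kernel_density(x, X, bandwidth=2.0) for x in X)
    print(f"estimate: {est:.1f}, exact: {exact:.1f}")
```

Note that uniform row sampling can have high variance when the kernel sum is dominated by a few rows; controlling this variance is precisely where the paper's density-estimation machinery comes in.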

Original language: English
Title of host publication: Proceedings of the 38th International Conference on Machine Learning, ICML 2021
Publisher: ML Research Press
Pages: 500-510
Number of pages: 11
ISBN (Electronic): 9781713845065
State: Published - 2021
Externally published: Yes
Event: 38th International Conference on Machine Learning, ICML 2021 - Virtual, Online
Duration: 18 Jul 2021 - 24 Jul 2021

Publication series

Name: Proceedings of Machine Learning Research
Volume: 139
ISSN (Electronic): 2640-3498

Conference

Conference: 38th International Conference on Machine Learning, ICML 2021
City: Virtual, Online
Period: 18/07/21 - 24/07/21
