A Perturbation-Based Kernel Approximation Framework

Roy Mitz, Yoel Shkolnisky

Research output: Contribution to journal › Article › peer-review

Abstract

Kernel methods are powerful tools in various data analysis tasks. Yet, in many cases, their time and space complexity render them impractical for large datasets. Various kernel approximation methods have been proposed to overcome this issue, with the most prominent being the Nyström method. In this paper, we derive a perturbation-based kernel approximation framework building upon results from classical perturbation theory. We provide an error analysis for this framework, and prove that it in fact generalizes the Nyström method and several of its variants. Furthermore, we show that our framework gives rise to new kernel approximation schemes that can be tuned to take advantage of the structure of the approximated kernel matrix. We support our theoretical results numerically and demonstrate the advantages of our approximation framework on both synthetic and real-world data.

Original language: English
Journal: Journal of Machine Learning Research
Volume: 23
State: Published - 1 Apr 2022

Keywords

  • kernel approximation
  • kernel-based non-linear dimensionality reduction
  • Nyström method
  • perturbation theory

