Abstract
Nonlinear dimensionality reduction methods often include the construction of kernels for embedding the high-dimensional data points. Standard methods for extending the embedding coordinates (such as the Nyström method) also rely on spectral decomposition of kernels. It is desirable that these kernels capture most of the data set's information using only a few leading modes of the spectrum. In this work we propose multi-scale kernels, constructed as combinations of Gaussian kernels, to be used in kernel-based extension schemes. We review the kernels' spectral properties and show that their first few modes capture more information than those of the standard Gaussian kernel. Their application is demonstrated on a synthetic data set and on a real-life example that models daily electricity profiles and predicts the average day-ahead behavior.
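The sketch below illustrates the general idea described in the abstract: build a kernel by combining Gaussian kernels at several bandwidths, take its leading eigenmodes as embedding coordinates, and extend those coordinates to new points in a Nyström-like fashion. The abstract does not specify the exact construction, so the sum over dyadically decreasing scales, the function names, and all parameters here are illustrative assumptions, not the paper's method.

```python
# Minimal sketch of a multi-scale kernel embedding with Nystrom-style extension.
# The specific combination of scales (eps0, eps0/2, eps0/4, ...) is an assumption;
# the paper's actual construction may differ.
import numpy as np

def gaussian_kernel(X, Y, eps):
    """Pairwise Gaussian affinities exp(-||x - y||^2 / eps)."""
    d2 = (np.sum(X**2, axis=1)[:, None]
          + np.sum(Y**2, axis=1)[None, :]
          - 2.0 * X @ Y.T)
    return np.exp(-np.maximum(d2, 0.0) / eps)

def multiscale_kernel(X, Y, eps0=1.0, n_scales=4):
    """Average of Gaussian kernels over several bandwidths (assumed form)."""
    K = np.zeros((X.shape[0], Y.shape[0]))
    for s in range(n_scales):
        K += gaussian_kernel(X, Y, eps0 / 2**s)
    return K / n_scales

rng = np.random.default_rng(0)
X = rng.standard_normal((200, 3))        # reference (training) points
K = multiscale_kernel(X, X)

# Keep only a few leading spectral modes as embedding coordinates.
vals, vecs = np.linalg.eigh(K)
idx = np.argsort(vals)[::-1][:5]
lam, phi = vals[idx], vecs[:, idx]

# Nystrom-style extension of the embedding coordinates to new points.
X_new = rng.standard_normal((50, 3))
K_new = multiscale_kernel(X_new, X)
phi_new = K_new @ phi / lam              # extended coordinates, shape (50, 5)
```

The per-scale averaging keeps the combined kernel on the same numerical scale as a single Gaussian kernel; how the scales are weighted is one of the design choices the paper itself addresses.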
Original language | English |
---|---|
Pages (from-to) | 165-177 |
Number of pages | 13 |
Journal | Applied Mathematics and Computation |
Volume | 319 |
DOIs | |
State | Published - 15 Feb 2018 |
Externally published | Yes |
Keywords
- Dimensionality reduction
- Function extension
- Kernel methods
- Manifold learning