Abstract
A distance matrix A ∈ ℝ^{n×m} represents all pairwise distances, A_{ij} = d(x_i, y_j), between two point sets x_1, ..., x_n and y_1, ..., y_m in an arbitrary metric space (Z, d). Such matrices arise in various computational contexts such as learning image manifolds, handwriting recognition, and multi-dimensional unfolding. In this work we study algorithms for low-rank approximation of distance matrices. Recent work by Bakshi and Woodruff (NeurIPS 2018) showed it is possible to compute a rank-k approximation of a distance matrix in time O((n + m)^{1+γ}) · poly(k, 1/ε), where ε > 0 is an error parameter and γ > 0 is an arbitrarily small constant. Notably, their bound is sublinear in the matrix size, which is unachievable for general matrices. We present an algorithm that is both simpler and more efficient. It reads only O((n + m)k/ε) entries of the input matrix, and runs in time O(n + m) · poly(k, 1/ε). We complement the sample complexity of our algorithm with a matching lower bound on the number of entries that must be read by any algorithm. We provide experimental results to validate the approximation quality and running time of our algorithm.
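To make the sublinear access pattern concrete, the Python sketch below shows what "reading roughly (n + m) · k/ε entries and returning rank-k factors" can look like. It is not the paper's algorithm: it is a generic uniform column-and-row sampling heuristic, and the function name `sampled_rank_k`, the entry oracle `dist`, and the sample size `s` (on the order of k/ε) are all illustrative assumptions.

```python
import numpy as np

def sampled_rank_k(dist, n, m, k, s, seed=0):
    """Illustrative sketch (not the paper's algorithm): build rank-k factors
    U (n x k) and V (k x m) of a distance matrix A, where A[i, j] = dist(i, j),
    while querying only about (n + m) * s entries.

    dist : callable (i, j) -> A[i, j], an entry oracle for the distance matrix
    s    : number of sampled columns/rows, e.g. on the order of k / epsilon
    """
    rng = np.random.default_rng(seed)

    # Read s full columns of A, chosen uniformly at random: n * s entry queries.
    col_idx = rng.choice(m, size=min(s, m), replace=False)
    C = np.array([[dist(i, j) for j in col_idx] for i in range(n)])

    # Orthonormal basis U for an approximate top-k column space of A.
    U, _, _ = np.linalg.svd(C, full_matrices=False)
    U = U[:, :k]

    # Read s full rows of A, chosen uniformly at random: s * m entry queries.
    row_idx = rng.choice(n, size=min(s, n), replace=False)
    R = np.array([[dist(i, j) for j in range(m)] for i in row_idx])

    # Fit the right factor V by least squares on the sampled rows,
    # so that A ≈ U @ V with U[row_idx, :] @ V ≈ R.
    V, *_ = np.linalg.lstsq(U[row_idx, :], R, rcond=None)
    return U, V  # rank-k factors built from roughly (n + m) * s entries
```

For example, with 300 and 200 random points in the plane and Euclidean distance, `sampled_rank_k(lambda i, j: np.linalg.norm(x[i] - y[j]), 300, 200, k=4, s=32)` returns factors U, V whose product U @ V approximates the full 300 × 200 distance matrix, even though only a small fraction of its entries were queried.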
Original language | English
---|---
Pages (from-to) | 1723-1751
Number of pages | 29
Journal | Proceedings of Machine Learning Research
Volume | 99
State | Published - 2019
Externally published | Yes
Event | 32nd Conference on Learning Theory, COLT 2019, Phoenix, United States, 25 Jun 2019 → 28 Jun 2019
Funding
Funders | Funder number
---|---
Ainesh Bakshi |
National Science Foundation | CCF-1815840
Simons Foundation |
Simons Institute for the Theory of Computing, University of California Berkeley |
Keywords
- Distance Matrix
- Low-rank Approximation