TY - JOUR
T1 - Manifold Learning with Arbitrary Norms
AU - Kileel, Joe
AU - Moscovich, Amit
AU - Zelesko, Nathan
AU - Singer, Amit
N1 - Publisher Copyright:
© 2021, The Author(s), under exclusive licence to Springer Science+Business Media, LLC, part of Springer Nature.
PY - 2021/10
Y1 - 2021/10
AB - Manifold learning methods play a prominent role in nonlinear dimensionality reduction and other tasks involving high-dimensional data sets with low intrinsic dimensionality. Many of these methods are graph-based: they associate a vertex with each data point and a weighted edge with each pair. Existing theory shows that the Laplacian matrix of the graph converges to the Laplace–Beltrami operator of the data manifold, under the assumption that the pairwise affinities are based on the Euclidean norm. In this paper, we determine the limiting differential operator for graph Laplacians constructed using any norm. Our proof involves an interplay between the second fundamental form of the manifold and the convex geometry of the given norm’s unit ball. To demonstrate the potential benefits of non-Euclidean norms in manifold learning, we consider the task of mapping the motion of large molecules with continuous variability. In a numerical simulation we show that a modified Laplacian eigenmaps algorithm, based on the Earthmover’s distance, outperforms the classic Euclidean Laplacian eigenmaps, both in terms of computational cost and the sample size needed to recover the intrinsic geometry.
KW - Convex body
KW - Diffusion maps
KW - Dimensionality reduction
KW - Laplacian eigenmaps
KW - Riemannian geometry
KW - Second-order differential operator
UR - http://www.scopus.com/inward/record.url?scp=85114668758&partnerID=8YFLogxK
DO - 10.1007/s00041-021-09879-2
M3 - Article
AN - SCOPUS:85114668758
SN - 1069-5869
VL - 27
JO - Journal of Fourier Analysis and Applications
JF - Journal of Fourier Analysis and Applications
IS - 5
M1 - 82
ER -