TY - GEN
T1 - Polynomial tensor sketch for element-wise function of low-rank matrix
AU - Han, Insu
AU - Avron, Haim
AU - Shin, Jinwoo
N1 - Publisher Copyright:
© International Conference on Machine Learning, ICML 2020. All rights reserved.
PY - 2020
Y1 - 2020
AB - This paper studies how to sketch element-wise functions of low-rank matrices. Formally, given a low-rank matrix A = [Aij] and a scalar non-linear function f, we aim to find an approximate low-rank representation of the (possibly high-rank) matrix [f(Aij)]. To this end, we propose an efficient sketching-based algorithm whose complexity is significantly lower than the number of entries of A, i.e., it runs without accessing all entries of [f(Aij)] explicitly. The main idea underlying our method is to combine a polynomial approximation of f with the existing tensor sketch scheme for approximating monomials of entries of A. To balance the errors of the two approximation components in an optimal manner, we propose a novel regression formula to find polynomial coefficients given A and f. In particular, we utilize a coreset-based regression with a rigorous approximation guarantee. Finally, we demonstrate the applicability and superiority of the proposed scheme on various machine learning tasks.
UR - http://www.scopus.com/inward/record.url?scp=85105257607&partnerID=8YFLogxK
M3 - Conference contribution
AN - SCOPUS:85105257607
T3 - 37th International Conference on Machine Learning, ICML 2020
SP - 3942
EP - 3951
BT - 37th International Conference on Machine Learning, ICML 2020
A2 - Daumé III, Hal
A2 - Singh, Aarti
PB - International Machine Learning Society (IMLS)
T2 - 37th International Conference on Machine Learning, ICML 2020
Y2 - 13 July 2020 through 18 July 2020
ER -