Low-rank updates of matrix square roots

Shany Shmueli, Petros Drineas, Haim Avron*

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

Abstract

Models in which the covariance matrix has the structure of a sparse matrix plus a low-rank perturbation are ubiquitous in data science applications. It is often desirable for algorithms to take advantage of such structures, avoiding costly matrix computations that typically require cubic time and quadratic storage. This is often accomplished by performing operations that maintain such structures, for example, matrix inversion via the Sherman–Morrison–Woodbury formula. In this article, we consider the matrix square root and inverse square root operations. Given a low-rank perturbation to a matrix, we argue that a low-rank approximate correction to the (inverse) square root exists. We do so by establishing a geometric decay bound on the true correction's eigenvalues. We then proceed to frame the correction as the solution of an algebraic Riccati equation, and discuss how a low-rank solution to that equation can be computed. We analyze the approximation error incurred when approximately solving the algebraic Riccati equation, providing spectral and Frobenius norm forward and backward error bounds. Finally, we describe several applications of our algorithms, and demonstrate their utility in numerical experiments.
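
As a rough illustration of the claim about the correction's eigenvalues, the following is a minimal NumPy sketch, not the paper's algorithm: it forms the exact correction X = (A + UCU^T)^{1/2} - A^{1/2} via dense eigendecompositions and inspects the decay of X's eigenvalues. The matrix sizes, the random test matrices, and the helper spd_sqrt are illustrative assumptions, not taken from the article.

import numpy as np

def spd_sqrt(S):
    # Principal square root of a symmetric positive definite matrix,
    # computed from its eigendecomposition.
    w, V = np.linalg.eigh(S)
    return (V * np.sqrt(w)) @ V.T

rng = np.random.default_rng(0)
n, k = 200, 3

# Symmetric positive definite base matrix A.
M = rng.standard_normal((n, n))
A = M @ M.T + n * np.eye(n)

# Rank-k symmetric positive semidefinite perturbation U C U^T.
U = rng.standard_normal((n, k))
C = np.diag(rng.uniform(0.5, 2.0, size=k))

# Exact correction to the square root, computed densely for reference.
X = spd_sqrt(A + U @ C @ U.T) - spd_sqrt(A)

# The eigenvalue magnitudes of X drop off sharply after the first few,
# so X is well approximated by a low-rank matrix.
eigvals = np.sort(np.abs(np.linalg.eigvalsh(X)))[::-1]
print(eigvals[:8])

In this toy run the eigenvalues of X fall off rapidly beyond the first k, which is the kind of behavior the article's geometric decay bound quantifies; the article's algorithms aim to obtain such a low-rank correction without the cubic-cost dense square roots used here for reference.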

Original language: English
Article number: e2528
Journal: Numerical Linear Algebra with Applications
Volume: 31
Issue number: 1
DOIs
State: Published - Jan 2024

Funding

Funder (funder number):
National Science Foundation (10001390, 10001415)
United States-Israel Binational Science Foundation (2017698)
Israel Science Foundation (1272/17)

Keywords

• low rank perturbations
• low rank updates
• matrix functions
• matrix square root