## Abstract

We study robust subspace estimation in the streaming and distributed settings. Given a set of n data points {a_{i}}_{i=1}^{n} in R^{d} and an integer k, we wish to find a linear subspace S of dimension k for which ∑_{i} M(dist(S, a_{i})) is minimized, where dist(S, x) := min_{y∈S} ∥x − y∥_{2}, and M(·) is some loss function. When M is the identity function, S gives a subspace that is more robust to outliers than that provided by the truncated SVD. Though the problem is NP-hard, it is approximable within a (1 + ϵ) factor in polynomial time when k and ϵ are constant. We give the first sublinear approximation algorithm for this problem in the turnstile streaming and arbitrary partition distributed models, achieving the same time guarantees as in the offline case. Our algorithm is the first based entirely on oblivious dimensionality reduction, and significantly simplifies prior methods for this problem, which held in neither the streaming nor distributed models.
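To make the objective concrete, here is a minimal illustrative sketch (not the paper's algorithm) of evaluating ∑_{i} M(dist(S, a_{i})) for a subspace S given by an orthonormal basis; the function names `subspace_distance` and `robust_loss` are our own, and M defaults to the identity as in the robust variant discussed above.

```python
import numpy as np

def subspace_distance(points, basis):
    """Euclidean distance from each row of `points` (shape (n, d)) to the
    subspace spanned by the orthonormal columns of `basis` (shape (d, k))."""
    proj = points @ basis @ basis.T          # orthogonal projection onto span(basis)
    return np.linalg.norm(points - proj, axis=1)

def robust_loss(points, basis, M=lambda r: r):
    """Evaluate sum_i M(dist(S, a_i)); M = identity gives the outlier-robust
    objective, while M(r) = r**2 recovers the classical PCA/SVD objective."""
    return M(subspace_distance(points, basis)).sum()
```

With M(r) = r² the minimizer is the truncated SVD; the identity loss down-weights far-away outliers, which is why the resulting subspace is more robust.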

| Original language | English |
|---|---|
| Pages (from-to) | 10683-10693 |
| Number of pages | 11 |
| Journal | Advances in Neural Information Processing Systems |
| Volume | 2018-December |
| State | Published - 2018 |
| Externally published | Yes |
| Event | 32nd Conference on Neural Information Processing Systems, NeurIPS 2018 - Montreal, Canada. Duration: 2 Dec 2018 → 8 Dec 2018 |