Principal component regression (PCR) is a useful method for regularizing least squares approximations. Although conceptually simple, straightforward implementations of PCR have high computational costs and so are inappropriate for large-scale problems. In this paper, we propose efficient algorithms for computing approximate PCR solutions that, on one hand, are high-quality approximations to the true PCR solutions (when viewed as minimizers of a constrained optimization problem) and, on the other hand, admit rigorous risk bounds (when viewed as statistical estimators). In particular, we propose an input-sparsity-time algorithm for approximate PCR. We also consider computing approximate PCR in the streaming model, as well as approximate kernel PCR. Empirical results demonstrate the excellent performance of our proposed methods.
- Compressed least squares
- Least squares
- Linear regression
- Principal component regression
- Randomized numerical linear algebra
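To make the baseline concrete, here is a minimal sketch of the straightforward PCR implementation the abstract refers to: project the design matrix onto its top-k principal components and solve least squares in that subspace. The function name `pcr_fit` and the toy data are illustrative assumptions, not part of the paper; the full SVD it relies on is exactly the step whose cost motivates faster approximate algorithms.

```python
import numpy as np

def pcr_fit(X, y, k):
    """Naive PCR: regress y on the top-k principal components of X.

    The full SVD below costs roughly O(n * d^2) for an n x d matrix,
    which is the expensive step that approximate PCR methods avoid.
    """
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    Vk = Vt[:k].T                       # top-k right singular vectors (d x k)
    Z = X @ Vk                          # component scores (n x k)
    gamma, *_ = np.linalg.lstsq(Z, y, rcond=None)
    return Vk @ gamma                   # coefficients mapped back to R^d

# Toy usage on synthetic data (illustrative only).
rng = np.random.default_rng(0)
X = rng.standard_normal((100, 10))
beta = np.zeros(10)
beta[:3] = 1.0
y = X @ beta + 0.01 * rng.standard_normal(100)
w = pcr_fit(X, y, k=3)
```

Note that with k equal to the full rank, PCR reduces to ordinary least squares; smaller k trades bias for variance, which is the regularization effect the abstract mentions.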