Motivated by the desire to extend fast randomized techniques to nonlinear ℓp regression, we consider a class of structured regression problems. These problems involve Vandermonde matrices, which arise naturally in various statistical modeling settings, including classical polynomial fitting problems, additive models, and approximations to recently developed randomized techniques for scalable kernel methods. We show that this structure can be exploited to further accelerate the solution of the regression problem, achieving running times faster than "input sparsity" time. We present empirical results confirming both the practical value of our modeling framework and the speedup benefits of randomized regression.
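To make the connection between polynomial fitting and Vandermonde-structured regression concrete, here is a minimal sketch (not the paper's sketching algorithm) showing that fitting a degree-d polynomial is a linear regression whose design matrix is a Vandermonde matrix:

```python
import numpy as np

# Polynomial fitting as Vandermonde-structured linear regression:
# the design matrix has entries V[i, j] = x_i**j, so fitting a
# degree-d polynomial reduces to solving min_w ||V w - y||_2.
rng = np.random.default_rng(0)
x = np.linspace(-1.0, 1.0, 50)
# Noisy samples of the polynomial 2 - 3x + 0.5x^2 (illustrative data).
y = 2.0 - 3.0 * x + 0.5 * x**2 + 0.01 * rng.standard_normal(50)

d = 2  # polynomial degree
V = np.vander(x, N=d + 1, increasing=True)  # columns: 1, x, x^2
w, *_ = np.linalg.lstsq(V, y, rcond=None)   # least-squares fit
print(w)  # recovered coefficients, close to [2, -3, 0.5]
```

The paper's contribution is that this Vandermonde structure lets randomized sketching solve such regressions faster than generic input-sparsity-time methods; the dense `lstsq` call above is only the unaccelerated baseline.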
| Journal | Advances in Neural Information Processing Systems |
| State | Published - 2013 |
| Event | 27th Annual Conference on Neural Information Processing Systems, NIPS 2013 - Lake Tahoe, NV, United States |
| Duration | 5 Dec 2013 → 10 Dec 2013 |