Bi-level path following for cross validated solution of kernel quantile regression

Saharon Rosset*

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review



We show how to follow the path of cross-validated solutions to families of regularized optimization problems, defined by a combination of a parameterized loss function and a regularization term. A primary example is kernel quantile regression, where the parameter of the loss function is the quantile being estimated. Even though the bi-level optimization problem we encounter for every quantile is non-convex, the manner in which the optimal cross-validated solution evolves with the parameter of the loss function allows tracking of this solution. We prove this property, construct the resulting algorithm, and demonstrate it on real and artificial data. This algorithm allows us to efficiently solve the whole family of bi-level problems. We show how it can be extended to cover other modeling problems, such as support vector regression, and alternative in-sample model selection approaches.
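To make the setting concrete, the sketch below sets up the inner problem of the abstract's primary example: kernel quantile regression fit by minimizing the pinball (quantile) loss plus a quadratic RKHS penalty, with the regularization parameter chosen by cross-validation for each quantile. This is only an illustrative stand-in: it uses plain subgradient descent and a grid search over the regularization parameter, not the paper's path-following algorithm, and all function names, the RBF kernel choice, and the step-size schedule are assumptions made here for the demonstration.

```python
import numpy as np


def rbf_kernel(a, b, gamma=1.0):
    # Gaussian (RBF) kernel matrix between 1-D sample vectors a and b.
    return np.exp(-gamma * (a[:, None] - b[None, :]) ** 2)


def pinball_loss(residuals, tau):
    # Quantile ("pinball") loss, averaged over the residuals.
    return float(np.mean(np.maximum(tau * residuals, (tau - 1.0) * residuals)))


def fit_kqr(K, y, tau, lam, n_iter=2000, step=0.05):
    # Inner problem: minimize  mean pinball(y - K a) + (lam/2) a' K a
    # by subgradient descent (an illustrative solver, not the paper's method).
    n = len(y)
    a = np.zeros(n)
    for t in range(1, n_iter + 1):
        r = y - K @ a
        g = np.where(r > 0, tau, tau - 1.0)    # subgradient of pinball wrt r
        grad = -(K @ g) / n + lam * (K @ a)    # dr/da = -K; K is symmetric
        a -= (step / np.sqrt(t)) * grad        # diminishing step size
    return a


def cv_select_lambda(x, y, tau, lam_grid, n_folds=4, gamma=1.0, seed=0):
    # Outer problem for one quantile tau: pick lambda by K-fold grid search
    # (a simple stand-in for following the bi-level solution path).
    rng = np.random.default_rng(seed)
    folds = np.array_split(rng.permutation(len(y)), n_folds)
    best_lam, best_loss = None, np.inf
    for lam in lam_grid:
        losses = []
        for k in range(n_folds):
            val = folds[k]
            tr = np.concatenate([folds[j] for j in range(n_folds) if j != k])
            a = fit_kqr(rbf_kernel(x[tr], x[tr], gamma), y[tr], tau, lam)
            preds = rbf_kernel(x[val], x[tr], gamma) @ a
            losses.append(pinball_loss(y[val] - preds, tau))
        mean_loss = float(np.mean(losses))
        if mean_loss < best_loss:
            best_lam, best_loss = lam, mean_loss
    return best_lam, best_loss


# Demo: solve the cross-validated problem independently for several quantiles.
x = np.linspace(0.0, 3.0, 36)
rng = np.random.default_rng(1)
y = np.sin(x) + 0.2 * rng.standard_normal(x.size)
lam_grid = (0.01, 0.1, 1.0)
results = {tau: cv_select_lambda(x, y, tau, lam_grid) for tau in (0.1, 0.5, 0.9)}
for tau, (lam, loss) in results.items():
    print(f"tau={tau:.1f}: best lambda={lam}, CV pinball loss={loss:.4f}")
```

Note that this brute-force version re-solves the cross-validation problem from scratch for every quantile; the point of the paper's bi-level path following is to exploit how the optimal cross-validated solution evolves with the quantile, so the whole family is obtained far more cheaply.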

Original language: English
Pages (from-to): 2473-2505
Number of pages: 33
Journal: Journal of Machine Learning Research
State: Published - 2009


