Isotonic Modeling with Non-Differentiable Loss Functions with Application to Lasso Regularization

Research output: Contribution to journal › Article › peer-review

6 Scopus citations


In this paper we present an algorithmic approach for fitting isotonic models under convex, yet non-differentiable, loss functions. It generalizes the greedy non-regret approach proposed by Luss and Rosset (2014) for differentiable loss functions, taking into account the required subgradient extensions. We prove that our suggested algorithm solves the isotonic modeling problem while maintaining favorable computational and statistical properties. As our suggested algorithm may be used with any non-differentiable loss function, we focus our interest on isotonic modeling for either regression or two-class classification with the appropriate log-likelihood loss and a lasso penalty on the fitted values. This combination allows us to maintain the non-parametric nature of isotonic modeling while controlling model complexity through regularization. We demonstrate the efficiency and usefulness of this approach on both synthetic and real-world data. An implementation of our suggested solution is publicly available from the first author's website.
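For context, the unregularized, squared-loss version of the isotonic modeling problem described above is classically solved by the Pool Adjacent Violators Algorithm (PAVA). The sketch below is a minimal pure-Python PAVA for a non-decreasing fit under squared loss; it is illustrative background only, not the paper's generalized algorithm for non-differentiable losses or its lasso-penalized variant.

```python
def pava(y, w=None):
    """Pool Adjacent Violators: non-decreasing fit minimizing
    sum_i w_i * (y_i - f_i)^2. Illustrative sketch, not the
    paper's algorithm for non-differentiable losses."""
    if w is None:
        w = [1.0] * len(y)
    # Each block holds [weighted mean, total weight, count].
    blocks = []
    for yi, wi in zip(y, w):
        blocks.append([float(yi), float(wi), 1])
        # Merge adjacent blocks while monotonicity is violated.
        while len(blocks) > 1 and blocks[-2][0] > blocks[-1][0]:
            m2, w2, c2 = blocks.pop()
            m1, w1, c1 = blocks.pop()
            wt = w1 + w2
            blocks.append([(m1 * w1 + m2 * w2) / wt, wt, c1 + c2])
    # Expand the pooled blocks back to a fitted value per point.
    fit = []
    for m, _, c in blocks:
        fit.extend([m] * c)
    return fit
```

For example, `pava([1, 3, 2, 4])` pools the violating pair (3, 2) into their mean, 2.5, yielding the non-decreasing sequence `[1.0, 2.5, 2.5, 4.0]`.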

Original language: English
Article number: 7117430
Pages (from-to): 308-321
Number of pages: 14
Journal: IEEE Transactions on Pattern Analysis and Machine Intelligence
Issue number: 2
State: Published - 1 Feb 2016


Keywords:
  • GIRP
  • convex optimization
  • isotonic regression
  • nonparametric regression
  • regularization path


