Margin maximizing loss functions

Saharon Rosset, Ji Zhu, Trevor Hastie

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review


Margin maximizing properties play an important role in the analysis of classification models, such as boosting and support vector machines. Margin maximization is theoretically interesting because it facilitates generalization error analysis, and practically interesting because it provides a clear geometric interpretation of the models being built. We formulate and prove a sufficient condition for the solutions of regularized loss functions to converge to margin maximizing separators as the regularization vanishes. This condition covers the hinge loss of SVM, the exponential loss of AdaBoost, and the logistic regression loss. We also generalize it to multi-class classification problems, and present margin maximizing multi-class versions of logistic regression and support vector machines.
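The three loss functions named in the abstract can all be written as functions of the margin m = y·f(x). A minimal sketch (illustrative only; the paper analyzes the regularized solution paths of these losses, not these formulas per se):

```python
import math

def hinge_loss(m):
    """SVM hinge loss: max(0, 1 - m)."""
    return max(0.0, 1.0 - m)

def exponential_loss(m):
    """AdaBoost exponential loss: exp(-m)."""
    return math.exp(-m)

def logistic_loss(m):
    """Logistic regression loss: log(1 + exp(-m))."""
    return math.log(1.0 + math.exp(-m))

# All three penalize negative margins (misclassified points) and
# decay toward zero for large positive margins -- the shared shape
# behind their margin-maximizing behavior as regularization vanishes.
for m in (-1.0, 0.0, 1.0, 3.0):
    print(f"m={m:+.1f}  hinge={hinge_loss(m):.4f}  "
          f"exp={exponential_loss(m):.4f}  log={logistic_loss(m):.4f}")
```

Note that the hinge loss is exactly zero for m ≥ 1, while the exponential and logistic losses are strictly positive everywhere; the paper's sufficient condition covers both behaviors.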

Original language: English
Title of host publication: Advances in Neural Information Processing Systems 16 - Proceedings of the 2003 Conference, NIPS 2003
Publisher: Neural Information Processing Systems Foundation
ISBN (Print): 0262201526, 9780262201520
State: Published - 2004
Externally published: Yes
Event: 17th Annual Conference on Neural Information Processing Systems, NIPS 2003 - Vancouver, BC, Canada
Duration: 8 Dec 2003 - 13 Dec 2003

Publication series

Name: Advances in Neural Information Processing Systems
ISSN (Print): 1049-5258


Conference: 17th Annual Conference on Neural Information Processing Systems, NIPS 2003
City: Vancouver, BC


