Boosting and support vector machines as optimal separators

Saharon Rosset*, Ji Zhu, Trevor Hastie

*Corresponding author for this work

Research output: Contribution to journal > Conference article > peer-review

Abstract

In this paper we study boosting methods from a new perspective. We build on recent work by Efron et al. to show that boosting approximately (and in some cases exactly) minimizes its loss criterion with an L1 constraint. For the two most commonly used loss criteria (exponential and logistic log-likelihood), we further show that as the constraint diminishes, or equivalently as the boosting iterations proceed, the solution converges, in the separable case, to an "L1-optimal" separating hyper-plane. This "L1-optimal" separating hyper-plane has the property of maximizing the minimal margin of the training data, as defined in the boosting literature. We illustrate through examples the regularized and asymptotic behavior of the solutions to the classification problem with both loss criteria.
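The abstract's central claim can be written out compactly. The following is a minimal sketch with notation assumed here rather than taken from the abstract: weak learners h_j, coefficient vector beta, combined classifier f(x) = sum_j beta_j h_j(x), training pairs (x_i, y_i) with y_i in {-1, +1}, and L the exponential or logistic log-likelihood loss. The L1-constrained problem that boosting approximately traces is

\[
% L1-constrained loss minimization, constraint parameter c:
\hat{\beta}(c) \;=\; \arg\min_{\|\beta\|_1 \le c} \; \sum_{i=1}^{n} L\Big(y_i, \; \sum_j \beta_j h_j(x_i)\Big),
\]

and, in the separable case, the normalized solution converges as the constraint is relaxed:

\[
% Limit: the "L1-optimal" separating hyper-plane, which maximizes the
% minimal margin y_i f(x_i) over the training data (boosting's margin):
\lim_{c \to \infty} \; \frac{\hat{\beta}(c)}{\|\hat{\beta}(c)\|_1}
\;=\; \arg\max_{\|\beta\|_1 = 1} \; \min_i \; y_i \sum_j \beta_j h_j(x_i).
\]

The quantity being maximized in the limit is the minimal L1-margin of the training data; this parallels the support vector machine, whose optimal separating hyper-plane maximizes the minimal margin under an L2 normalization instead.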

Original language: English
Pages (from-to): 1-7
Number of pages: 7
Journal: Proceedings of SPIE - The International Society for Optical Engineering
Volume: 5010
State: Published - 2003
Externally published: Yes
Event: Document Recognition and Retrieval X - Santa Clara, CA, United States
Duration: 22 Jan 2003 - 24 Jan 2003

