Boosting with multi-way branching in decision trees

Yishay Mansour, David McAllester

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

It is known that decision tree learning can be viewed as a form of boosting. However, existing boosting theorems for decision tree learning allow only binary-branching trees and the generalization to multi-branching trees is not immediate. Practical decision tree algorithms, such as CART and C4.5, implement a trade-off between the number of branches and the improvement in tree quality as measured by an index function. Here we give a boosting justification for a particular quantitative trade-off curve. Our main theorem states, in essence, that if we require an improvement proportional to the log of the number of branches then top-down greedy construction of decision trees remains an effective boosting algorithm.
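
The precise index functions and constants appear in the full paper; purely as an illustration of the trade-off the abstract describes, the Python sketch below scores candidate multi-way splits by their impurity reduction measured against the log of the number of branches. The Gini index, the `score_split` helper, and the toy data are assumptions made for this sketch, not the authors' construction.

```python
import math
from collections import Counter

def gini(labels):
    """Gini impurity, a standard index function in CART-style tree growing."""
    n = len(labels)
    if n == 0:
        return 0.0
    return 1.0 - sum((count / n) ** 2 for count in Counter(labels).values())

def weighted_index(partition):
    """Index of a candidate split: size-weighted impurity of its branches."""
    total = sum(len(branch) for branch in partition)
    return sum(len(branch) / total * gini(branch) for branch in partition)

def score_split(labels, partition):
    """Illustrative score: impurity reduction per unit of log(#branches).

    A b-way split must "pay" for its extra branches, so its gain is
    normalized by log(b); this is a simplified reading of the trade-off
    described in the abstract, not the paper's exact criterion.
    """
    b = len(partition)
    gain = gini(labels) - weighted_index(partition)
    return gain / math.log(b)

# Toy comparison: a clean 3-way split beats a mediocre binary split
# even after accounting for the log(#branches) requirement.
labels = [0, 0, 0, 1, 1, 1, 2, 2, 2]
binary_split = [[0, 0, 0, 1], [1, 1, 2, 2, 2]]
three_way_split = [[0, 0, 0], [1, 1, 1], [2, 2, 2]]
print(score_split(labels, binary_split))     # ~0.34
print(score_split(labels, three_way_split))  # ~0.61
```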

Original language: English
Title of host publication: Advances in Neural Information Processing Systems 12 - Proceedings of the 1999 Conference, NIPS 1999
Publisher: Neural Information Processing Systems Foundation
Pages: 300-306
Number of pages: 7
ISBN (Print): 0262194503, 9780262194501
State: Published - 2000
Externally published: Yes
Event: 13th Annual Neural Information Processing Systems Conference, NIPS 1999 - Denver, CO, United States
Duration: 29 Nov 1999 - 4 Dec 1999

Publication series

Name: Advances in Neural Information Processing Systems
ISSN (Print): 1049-5258

Conference

Conference: 13th Annual Neural Information Processing Systems Conference, NIPS 1999
Country/Territory: United States
City: Denver, CO
Period: 29/11/99 - 4/12/99
