Boosted mixture of experts: An ensemble learning scheme

Ran Avnimelech*, Nathan Intrator

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

Abstract

We present a new supervised learning procedure for ensemble machines, in which the outputs of predictors trained on different distributions are combined by a dynamic classifier combination model. This procedure may be viewed either as a version of the mixture-of-experts model (Jacobs, Jordan, Nowlan, & Hinton, 1991) applied to classification, or as a variant of the boosting algorithm (Schapire, 1990). As a variant of the mixture of experts, it can be made appropriate for general classification and regression problems by initializing the partition of the data set across the different experts in a boost-like manner. If viewed as a variant of the boosting algorithm, its main gain is the use of a dynamic combination model for the outputs of the networks. Results are demonstrated on a synthetic example and on a digit recognition task from the NIST database, and are compared with classical ensemble approaches.
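
The two ingredients named in the abstract can be sketched in a few lines of code. The Python example below is a minimal illustration of the general idea, not the authors' exact algorithm: two scikit-learn MLP experts, a boost-like filtered training set for the second expert, and a logistic-regression gate as the dynamic combination model. The synthetic data, the network sizes, and the choice of gate target (whether expert 1 answers correctly on a given input) are all assumptions made for this sketch.

import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

rng = np.random.RandomState(0)

# Synthetic two-class task with some label noise, so expert 1 makes errors
# that expert 2 can specialize on. (Illustrative data, not the paper's.)
X, y = make_classification(n_samples=2000, n_features=10, n_informative=5,
                           flip_y=0.05, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

# Expert 1: trained on the original distribution.
expert1 = MLPClassifier(hidden_layer_sizes=(16,), max_iter=500, random_state=0)
expert1.fit(X_tr, y_tr)

# Boost-like partition (in the spirit of Schapire, 1990): expert 2 is trained
# on a filtered distribution in which half the examples are ones that
# expert 1 misclassifies, so it concentrates where expert 1 is weak.
wrong = expert1.predict(X_tr) != y_tr
idx_wrong = np.flatnonzero(wrong)
idx_right = rng.choice(np.flatnonzero(~wrong), size=len(idx_wrong), replace=False)
idx2 = np.concatenate([idx_wrong, idx_right])
expert2 = MLPClassifier(hidden_layer_sizes=(16,), max_iter=500, random_state=1)
expert2.fit(X_tr[idx2], y_tr[idx2])

# Dynamic combination model (the mixture-of-experts gate): rather than a
# fixed vote, a gating classifier learns, per input, how much to trust each
# expert. Here its target is simply whether expert 1 is correct.
gate = LogisticRegression()
gate.fit(X_tr, (expert1.predict(X_tr) == y_tr).astype(int))

# Combine the experts' class posteriors with input-dependent weights.
w1 = gate.predict_proba(X_te)[:, 1][:, None]  # per-example weight on expert 1
proba = w1 * expert1.predict_proba(X_te) + (1.0 - w1) * expert2.predict_proba(X_te)
pred = proba.argmax(axis=1)
print("gated ensemble test accuracy:", (pred == y_te).mean())

Unlike a fixed majority vote or a uniform average, the gate's weight w1 varies with the input, which is the "dynamic" combination the abstract emphasizes over classical ensemble schemes.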

Original language: English
Pages (from-to): 483-497
Number of pages: 15
Journal: Neural Computation
Volume: 11
Issue number: 2
DOIs
State: Published - 15 Feb 1999
