Bayes and Tukey meet at the center point

Ran Gilad-Bachrach*, Amir Navot, Naftali Tishby

*Corresponding author for this work

Research output: Contribution to journal › Conference article › peer-review

Abstract

The Bayes classifier achieves the minimal error rate by constructing a weighted majority over all concepts in the concept class. The Bayes Point [1] instead uses the single concept in the class that has the minimal error; in this way, the Bayes Point avoids some of the deficiencies of the Bayes classifier. We prove a bound on the generalization error of Bayes Point Machines when learning linear classifiers, and show that it is at most ∼ 1.71 times the generalization error of the Bayes classifier, independent of the input dimension and the length of training. We show that, when learning linear classifiers, the Bayes Point is almost identical to the Tukey Median [2] and the Center Point [3]. We extend these definitions beyond linear classifiers and define the Bayes Depth of a classifier. We prove a generalization bound in terms of this new definition. Finally, we provide a new concentration-of-measure inequality relating multivariate random variables to the Tukey Median.
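
For context, the central objects named in the abstract admit compact standard definitions. The following is a minimal LaTeX sketch using conventional notation (assuming amsmath/amssymb); the posterior ν over the concept class, the input x, and the dimension d are symbols introduced here for illustration and are not taken from this page:

% Bayes classifier: the weighted majority vote over the concept class,
% weighted by the posterior \nu (standard definition):
\[
  h_{\mathrm{Bayes}}(x) \;=\; \operatorname{sign}\bigl(\mathbb{E}_{c \sim \nu}[\,c(x)\,]\bigr)
\]

% Tukey depth of a point w in \mathbb{R}^d with respect to \nu: the
% smallest \nu-mass of any closed halfspace containing w:
\[
  \mathrm{depth}(w; \nu) \;=\; \inf_{u \neq 0}\; \nu\bigl(\{\,v : \langle u, v \rangle \ge \langle u, w \rangle\,\}\bigr)
\]

% The Tukey Median is a point of maximal depth; a Center Point is any
% point of depth at least 1/(d+1), which always exists by the
% centerpoint theorem.

In the linear case a classifier is a weight vector w, so the Tukey depth of w under the posterior measures the worst-case posterior mass of classifiers on one side of a hyperplane through w; this is the geometric picture behind the connection the abstract states.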

Original language: English
Pages (from-to): 549-563
Number of pages: 15
Journal: Lecture Notes in Computer Science
Volume: 3120
State: Published - 2004
Externally published: Yes
Event: 17th Annual Conference on Learning Theory, COLT 2004, Banff, Canada
Duration: 1 Jul 2004 - 4 Jul 2004
