Abstract
A fundamental question in learning theory is the quantification of the basic tradeoff between the complexity of a model and its predictive accuracy. One valid way of quantifying this tradeoff, known as the "Information Bottleneck", is to measure both the complexity of the model and its prediction accuracy by using Shannon's mutual information. In this paper we show that the Information Bottleneck framework answers a well-defined and known coding problem and, at the same time, provides a general relationship between complexity and prediction accuracy, measured by mutual information. We study the nature of this complexity-accuracy tradeoff and discuss some of its theoretical properties. Furthermore, we present relations to classical information theoretic problems, such as rate-distortion theory, the cost-capacity tradeoff, and source coding with side information.
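For context, a minimal sketch of the Information Bottleneck objective referred to in the abstract, in its usual formulation; the compressed representation T and the tradeoff parameter β are not spelled out in this abstract and are used here as standard notation, not as a reproduction of the paper's own statement:

```latex
% Standard Information Bottleneck objective (usual formulation; the
% symbols T and \beta are assumed notation, not taken from this abstract).
% Given a joint distribution p(x, y), choose a stochastic encoder p(t|x)
% that minimizes model complexity I(X;T) while retaining prediction
% accuracy I(T;Y):
\begin{equation*}
  \min_{p(t \mid x)} \; I(X;T) \;-\; \beta \, I(T;Y), \qquad \beta \ge 0 .
\end{equation*}
% Larger \beta favors accuracy over compression; the curve of optimal
% (I(X;T), I(T;Y)) pairs traced out as \beta varies is the
% complexity-accuracy tradeoff discussed in the abstract.
```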
| Original language | English |
| --- | --- |
| Pages (from-to) | 595-609 |
| Number of pages | 15 |
| Journal | Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics) |
| Volume | 2777 |
| DOIs | |
| State | Published - 2003 |
| Externally published | Yes |
| Event | 16th Annual Conference on Learning Theory and 7th Kernel Workshop, COLT/Kernel 2003, Washington, DC, United States. Duration: 24 Aug 2003 → 27 Aug 2003 |