An information theoretic tradeoff between complexity and accuracy

Ran Gilad-Bachrach, Amir Navot, Naftali Tishby

Research output: Contribution to journal › Conference article › peer-review


A fundamental question in learning theory is the quantification of the basic tradeoff between the complexity of a model and its predictive accuracy. One valid way of quantifying this tradeoff, known as the "Information Bottleneck", is to measure both the complexity of the model and its prediction accuracy using Shannon's mutual information. In this paper we show that the Information Bottleneck framework answers a well-defined and known coding problem and, at the same time, provides a general relationship between complexity and prediction accuracy, measured by mutual information. We study the nature of this complexity-accuracy tradeoff and discuss some of its theoretical properties. Furthermore, we present relations to classical information theoretic problems, such as rate-distortion theory, the cost-capacity tradeoff, and source coding with side information.
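The abstract measures both complexity and accuracy with Shannon's mutual information: a stochastic encoder p(t|x) compresses X into T, and the Information Bottleneck objective trades off the complexity term I(X;T) against the accuracy term I(T;Y). The following is a minimal illustrative sketch (not code from the paper; all function names and the dictionary-based distribution representation are our own choices) of computing these two quantities and the IB Lagrangian L = I(X;T) - β·I(T;Y) for discrete distributions:

```python
import math

def mutual_information(p_joint):
    """I(A;B) in bits, from a joint distribution given as a dict {(a, b): p}."""
    p_a, p_b = {}, {}
    for (a, b), p in p_joint.items():
        p_a[a] = p_a.get(a, 0.0) + p
        p_b[b] = p_b.get(b, 0.0) + p
    return sum(p * math.log2(p / (p_a[a] * p_b[b]))
               for (a, b), p in p_joint.items() if p > 0)

def ib_objective(p_x, p_t_given_x, p_y_given_x, beta):
    """IB Lagrangian L = I(X;T) - beta * I(T;Y) for an encoder p(t|x).

    Uses the Markov chain T - X - Y: T depends on Y only through X.
    """
    # Joint p(x, t) from the marginal p(x) and the encoder p(t|x).
    p_xt = {(x, t): p_x[x] * pt
            for x, enc in p_t_given_x.items() for t, pt in enc.items()}
    # Joint p(t, y), marginalizing over x.
    p_ty = {}
    for x, enc in p_t_given_x.items():
        for t, pt in enc.items():
            for y, py in p_y_given_x[x].items():
                p_ty[(t, y)] = p_ty.get((t, y), 0.0) + p_x[x] * pt * py
    return mutual_information(p_xt) - beta * mutual_information(p_ty)
```

For example, a constant encoder that maps every x to the same t achieves I(X;T) = 0 but also I(T;Y) = 0, while the identity encoder preserves all of I(X;Y) at the cost of maximal complexity; the parameter β selects a point between these extremes on the complexity-accuracy curve.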

Original language: English
Pages (from-to): 595-609
Number of pages: 15
Journal: Lecture Notes in Computer Science
State: Published - 2003
Externally published: Yes
Event: 16th Annual Conference on Learning Theory and 7th Kernel Workshop, COLT/Kernel 2003 - Washington, DC, United States
Duration: 24 Aug 2003 - 27 Aug 2003

