TY - GEN

T1 - Information bottleneck for Gaussian variables

AU - Chechik, Gal

AU - Globerson, Amir

AU - Tishby, Naftali

AU - Weiss, Yair

PY - 2004

Y1 - 2004

N2 - The problem of extracting the relevant aspects of data was addressed through the information bottleneck (IB) method, by (soft) clustering one variable while preserving information about another (relevance) variable. An interesting question addressed in the current work is the extension of these ideas to obtain continuous representations that preserve relevant information, rather than discrete clusters. We give a formal definition of the general continuous IB problem and obtain an analytic solution for the optimal representation for the important case of multivariate Gaussian variables. The obtained optimal representation is a noisy linear projection to eigenvectors of the normalized correlation matrix Σx|y Σx⁻¹, which is also the basis obtained in Canonical Correlation Analysis. However, in Gaussian IB, the compression tradeoff parameter uniquely determines the dimension, as well as the scale of each eigenvector. This introduces a novel interpretation where solutions of different ranks lie on a continuum parametrized by the compression level. Our analysis also provides an analytic expression for the optimal tradeoff (the information curve) in terms of the eigenvalue spectrum.

UR - http://www.scopus.com/inward/record.url?scp=4444252081&partnerID=8YFLogxK

M3 - Conference contribution

AN - SCOPUS:4444252081

SN - 0262201526

SN - 9780262201520

T3 - Advances in Neural Information Processing Systems

BT - Advances in Neural Information Processing Systems 16 - Proceedings of the 2003 Conference, NIPS 2003

PB - Neural Information Processing Systems Foundation

Y2 - 8 December 2003 through 13 December 2003

ER -