A Generalization of the Entropy Power Inequality with Applications

Research output: Contribution to journal › Article › peer-review

Abstract

We prove the following generalization of the entropy power inequality: h(Ax) ≥ h(Ax̃), where h(·) denotes (joint) differential entropy, x = (x1, …, xn) is a random vector with independent components, x̃ = (x̃1, …, x̃n) is a Gaussian vector with independent components such that h(x̃i) = h(xi), i = 1, …, n, and A is any matrix. This generalization of the entropy power inequality is applied to show that a non-Gaussian vector with independent components becomes “closer” to Gaussianity after a linear transformation, where the distance to Gaussianity is measured by the information divergence. Another application is a lower bound, greater than zero, on the mutual information between nonoverlapping spectral components of a non-Gaussian white process. Finally, we describe a dual generalization of the Fisher information inequality.
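
For illustration (a standard consequence, not spelled out in the abstract itself): taking n = 2 and A = [1 1] reduces the stated inequality to the classical scalar entropy power inequality. A minimal LaTeX sketch of this reduction, assuming Ni = e^{2h(xi)}/(2πe) denotes the entropy power of xi:

    % Special case A = [1 1]: the generalized inequality h(Ax) >= h(Ax~)
    % becomes h(x_1 + x_2) >= h(x~_1 + x~_2). Since x~_1, x~_2 are independent
    % Gaussians with h(x~_i) = h(x_i), their sum is Gaussian with variance
    % N_1 + N_2, where N_i = e^{2 h(x_i)} / (2 \pi e) is the entropy power of x_i.
    \begin{align*}
      h(x_1 + x_2) &\ge h(\tilde{x}_1 + \tilde{x}_2)
                    = \tfrac{1}{2}\log\bigl(2\pi e\,(N_1 + N_2)\bigr), \\
      \text{hence}\quad
      e^{2 h(x_1 + x_2)} &\ge e^{2 h(x_1)} + e^{2 h(x_2)},
    \end{align*}
    % which is the classical (scalar) entropy power inequality.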

Original language: English
Pages (from-to): 1723-1728
Number of pages: 6
Journal: IEEE Transactions on Information Theory
Volume: 39
Issue number: 5
State: Published - Sep 1993

Keywords

  • Entropy power inequality
  • Fisher information inequality
  • divergence
  • non-Gaussianity
