TY - JOUR
T1 - A Generalization of the Entropy Power Inequality with Applications
AU - Zamir, Ram
AU - Feder, Meir
N1 - Funding Information:
Manuscript received July 29, 1992; revised February 10, 1993. This work was supported in part by the Wolfson Research Awards administered by the Israel Academy of Sciences and Humanities. This paper was presented in part at the International Symposium on Information Theory, San Antonio, TX, January 1993. The authors are with the Department of Electrical Engineering-Systems, Tel-Aviv University, Tel-Aviv 69978, Israel. IEEE Log Number 9211273.
PY - 1993/9
Y1 - 1993/9
N2 - We prove the following generalization of the Entropy Power Inequality: h(Ax) ≥ h(Ax̃), where h(·) denotes (joint) differential entropy, x = x_1, …, x_n is a random vector with independent components, x̃ = x̃_1, …, x̃_n is a Gaussian vector with independent components such that h(x̃_i) = h(x_i), i = 1, …, n, and A is any matrix. This generalization of the entropy power inequality is applied to show that a non-Gaussian vector with independent components becomes “closer” to Gaussianity after a linear transformation, where the distance to Gaussianity is measured by the information divergence. Another application is a lower bound, greater than zero, on the mutual information between nonoverlapping spectral components of a non-Gaussian white process. Finally, we describe a dual generalization of the Fisher Information Inequality.
AB - We prove the following generalization of the Entropy Power Inequality: h(Ax) ≥ h(Ax̃), where h(·) denotes (joint) differential entropy, x = x_1, …, x_n is a random vector with independent components, x̃ = x̃_1, …, x̃_n is a Gaussian vector with independent components such that h(x̃_i) = h(x_i), i = 1, …, n, and A is any matrix. This generalization of the entropy power inequality is applied to show that a non-Gaussian vector with independent components becomes “closer” to Gaussianity after a linear transformation, where the distance to Gaussianity is measured by the information divergence. Another application is a lower bound, greater than zero, on the mutual information between nonoverlapping spectral components of a non-Gaussian white process. Finally, we describe a dual generalization of the Fisher Information Inequality.
KW - Entropy power inequality
KW - Fisher information inequality
KW - divergence
KW - non-Gaussianity
UR - http://www.scopus.com/inward/record.url?scp=0027659563&partnerID=8YFLogxK
U2 - 10.1109/18.259666
DO - 10.1109/18.259666
M3 - Article
AN - SCOPUS:0027659563
SN - 0018-9448
VL - 39
SP - 1723
EP - 1728
JO - IEEE Transactions on Information Theory
JF - IEEE Transactions on Information Theory
IS - 5
ER -