A proof of the Fisher Information Inequality via a data processing argument

Ram Zamir*

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

121 Scopus citations

Abstract

The Fisher information J(X) of a random variable X under a translation parameter appears in information theory in the classical proof of the Entropy-Power Inequality (EPI). It enters the proof of the EPI via the de Bruijn identity, where it measures the variation of the differential entropy under a Gaussian perturbation, and via the convolution inequality J(X + Y)^{-1} ≥ J(X)^{-1} + J(Y)^{-1} (for independent X and Y), known as the Fisher Information Inequality (FII). The FII is proved in the literature directly, in a rather involved way. We give an alternative derivation of the FII as a simple consequence of a "data-processing inequality" for the Cramér-Rao lower bound on parameter estimation.
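For orientation, the statements the abstract refers to can be written out explicitly. The following is a sketch of the standard scalar forms (under the usual smoothness and regularity assumptions on the densities), together with the shape of the data-processing step; it is a reconstruction for the reader's convenience, not text reproduced from the paper:

    J(X) \;=\; \mathbb{E}\!\left[\left(\frac{\partial}{\partial x}\,\ln f_X(X)\right)^{2}\right]
    \qquad \text{(Fisher information under translation)}

    \frac{1}{J(X+Y)} \;\ge\; \frac{1}{J(X)} + \frac{1}{J(Y)}
    \qquad \text{(FII, } X, Y \text{ independent)}

    \frac{d}{dt}\, h\!\left(X+\sqrt{t}\,Z\right) \;=\; \frac{1}{2}\, J\!\left(X+\sqrt{t}\,Z\right)
    \qquad \text{(de Bruijn identity, } Z\sim\mathcal{N}(0,1) \text{ independent of } X\text{)}

The data-processing route runs as follows: for a translation parameter θ and any 0 ≤ λ ≤ 1, summing the two components of (X + λθ, Y + (1-λ)θ) into X + Y + θ is a processing of the observations and can only lose Fisher information about θ, which gives

    J(X+Y) \;\le\; \lambda^{2} J(X) + (1-\lambda)^{2} J(Y).

Substituting the minimizing choice λ = J(Y)/(J(X)+J(Y)) turns the right-hand side into (J(X)^{-1} + J(Y)^{-1})^{-1}, which is exactly the FII.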

Original language: English
Pages (from-to): 1246-1250
Number of pages: 5
Journal: IEEE Transactions on Information Theory
Volume: 44
Issue number: 3
State: Published - 1998

Funding

Funders: Israel Academy of Sciences and Humanities

Keywords

• Cramér-Rao bound
• Data processing inequality
• Entropy-power inequality
• Fisher information
• Linear modeling
• Non-Gaussian noise
• Prefiltering
