## Abstract

The Fisher information J(X) of a random variable X under a translation parameter appears in information theory in the classical proof of the Entropy-Power Inequality (EPI). It enters the proof of the EPI via the de Bruijn identity, where it measures the variation of the differential entropy under a Gaussian perturbation, and via the convolution inequality J(X + Y)^{-1} ≥ J(X)^{-1} + J(Y)^{-1} (for independent X and Y), known as the Fisher Information Inequality (FII). In the literature the FII is proved directly, in a rather involved way. We give an alternative derivation of the FII, as a simple consequence of a "data-processing inequality" for the Cramér-Rao lower bound on parameter estimation.
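The FII stated above can be checked numerically. The sketch below (not part of the paper; the choice of a Gaussian X, a Laplace Y, and the grid parameters are illustrative assumptions) estimates J = ∫ f′(x)²/f(x) dx for each density on a grid and verifies that J(X + Y)^{-1} ≥ J(X)^{-1} + J(Y)^{-1}, with the density of X + Y obtained by discrete convolution:

```python
import numpy as np

# Illustrative numerical check of the Fisher Information Inequality (FII):
#   J(X + Y)^{-1} >= J(X)^{-1} + J(Y)^{-1}  for independent X, Y.
# Assumptions (ours, not the paper's): X ~ N(0, sigma^2), Y ~ Laplace(0, b),
# densities sampled on a fine symmetric grid.

x = np.linspace(-30.0, 30.0, 20001)   # grid covering both supports well
dx = x[1] - x[0]

sigma, b = 1.5, 1.0
f_X = np.exp(-x**2 / (2 * sigma**2)) / (sigma * np.sqrt(2 * np.pi))
f_Y = np.exp(-np.abs(x) / b) / (2 * b)

# Density of the sum X + Y: convolution of the two densities on the grid.
f_Z = np.convolve(f_X, f_Y, mode="same") * dx

def fisher_information(f, dx):
    """Estimate J = integral of f'(x)^2 / f(x) dx by finite differences."""
    fp = np.gradient(f, dx)
    integrand = np.where(f > 1e-300, fp**2 / np.maximum(f, 1e-300), 0.0)
    return np.sum(integrand) * dx

J_X = fisher_information(f_X, dx)   # exact value: 1 / sigma^2
J_Y = fisher_information(f_Y, dx)   # exact value: 1 / b^2
J_Z = fisher_information(f_Z, dx)

print(f"1/J(X) + 1/J(Y) = {1 / J_X + 1 / J_Y:.4f}")
print(f"1/J(X+Y)        = {1 / J_Z:.4f}")   # FII: at least the line above
```

Since Y is non-Gaussian, the inequality is strict here; equality would hold if both X and Y were Gaussian, in which case J = 1/Var for each variable.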

| Original language | English |
|---|---|
| Pages (from-to) | 1246-1250 |
| Number of pages | 5 |
| Journal | IEEE Transactions on Information Theory |
| Volume | 44 |
| Issue number | 3 |
| State | Published - 1998 |

## Keywords

- Cramér-Rao bound
- Data processing inequality
- Entropy-power inequality
- Fisher information
- Linear modeling
- Non-Gaussian noise
- Prefiltering