The Strong Data Processing Inequality Under the Heat Flow

Bo'az Klartag*, Or Ordentlich

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

Abstract

Let ν and μ be probability distributions on ℝⁿ, and let ν_s, μ_s be their evolutions under the heat flow, that is, the probability distributions obtained by convolving their densities with the density of an isotropic Gaussian random vector with variance s in each entry. This paper studies the rate of decay of s ↦ D(ν_s‖μ_s) for various divergences, including the χ² and Kullback-Leibler (KL) divergences. We prove upper and lower bounds on the strong data-processing inequality (SDPI) coefficients corresponding to the source μ and the Gaussian channel. We also prove generalizations of de Bruijn's identity, and of Costa's result on the concavity in s of the differential entropy of ν_s. As a byproduct of our analysis, we obtain new lower bounds on the mutual information between X and Y = X + √s Z, where Z is a standard Gaussian vector in ℝⁿ, independent of X, and on the minimum mean-square error (MMSE) in estimating X from Y, in terms of the Poincaré constant of X.
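As a rough numerical illustration of the smoothing described above (a minimal sketch, not taken from the paper: the grid, the example distributions ν and μ, and the values of s are illustrative assumptions), the following Python snippet convolves two one-dimensional densities with a Gaussian of variance s and prints D(ν_s‖μ_s), which decreases in s as the data-processing inequality predicts.

import numpy as np

def gaussian_density(x, mean=0.0, var=1.0):
    # Density of N(mean, var) evaluated on the grid x.
    return np.exp(-(x - mean) ** 2 / (2 * var)) / np.sqrt(2 * np.pi * var)

def heat_flow(density, x, s):
    # Heat flow at time s: convolve the density with N(0, s) on the grid,
    # then renormalize to correct for discretization error.
    dx = x[1] - x[0]
    kernel = gaussian_density(x - x.mean(), var=s)
    smoothed = np.convolve(density, kernel, mode="same") * dx
    return smoothed / (smoothed.sum() * dx)

def kl_divergence(p, q, dx):
    # Riemann-sum approximation of D(p || q) in nats.
    mask = p > 0
    return np.sum(p[mask] * np.log(p[mask] / q[mask])) * dx

x = np.linspace(-12, 12, 4001)
dx = x[1] - x[0]

# Illustrative choices: nu is a bimodal Gaussian mixture, mu a standard Gaussian.
nu = 0.5 * gaussian_density(x, -2.0, 0.5) + 0.5 * gaussian_density(x, 2.0, 0.5)
mu = gaussian_density(x, 0.0, 1.0)

for s in [0.0, 0.5, 1.0, 2.0, 4.0]:
    nu_s = heat_flow(nu, x, s) if s > 0 else nu
    mu_s = heat_flow(mu, x, s) if s > 0 else mu
    print(f"s = {s:4.1f}   D(nu_s || mu_s) = {kl_divergence(nu_s, mu_s, dx):.4f}")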

Original language: English
Pages (from-to): 3317-3333
Number of pages: 17
Journal: IEEE Transactions on Information Theory
Volume: 71
Issue number: 5
State: Published - 2025
Externally published: Yes

Funding

Funders: Israel Science Foundation
Funder numbers: 1641/21, 674/24

Keywords

• additive white Gaussian noise channel
• de Bruijn's identity
• maximal correlation
• strong data processing inequality (SDPI)
