Gaussian codes and Shannon bounds for multiple descriptions

Ram Zamir*

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

68 Scopus citations

Abstract

A pair of well-known inequalities due to Shannon upper- and lower-bound the rate-distortion function of a real source by the rate-distortion functions of Gaussian sources with the same variance and the same entropy, respectively. We extend these bounds to multiple descriptions, a problem for which a general "single-letter" solution is not known. We show that the set DX(R1, R2) of achievable marginal (d1, d2) and central (d0) mean-squared errors in decoding X from two descriptions at rates R1 and R2 satisfies D*(σx², R1, R2) ⊆ DX(R1, R2) ⊆ D*(Px, R1, R2), where σx² and Px are the variance and the entropy power of X, respectively, and D*(σ², R1, R2) is the multiple-description distortion region for a Gaussian source with variance σ² found by Ozarow. We further show that, as in the single-description case, a Gaussian random code achieves the outer bound in the limit as d1, d2 → 0; thus the outer bound is asymptotically tight under high-resolution conditions.
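The inclusions above can be turned into a concrete sanity check: testing a distortion triple (d0, d1, d2) against the Gaussian region evaluated at the variance gives a sufficient condition for achievability, and testing it against the region evaluated at the entropy power gives a necessary one. The sketch below assumes the standard parameterization of Ozarow's Gaussian region; the function name ozarow_region_contains and the Laplacian example (entropy power (e/π)·σ², a standard closed form) are illustrative choices, not taken from the paper.

```python
import math

def ozarow_region_contains(var, R1, R2, d0, d1, d2):
    """Check whether (d0, d1, d2) lies in D*(var, R1, R2), the Gaussian
    multiple-description distortion region (standard Ozarow parameterization
    assumed).  Rates are in bits per sample; distortions are mean-squared
    errors, assumed to satisfy 0 < d0 <= d1, d2 <= var."""
    # The region scales linearly with the source variance, so normalize to var = 1.
    d0, d1, d2 = d0 / var, d1 / var, d2 / var
    if d1 < 2.0 ** (-2.0 * R1) or d2 < 2.0 ** (-2.0 * R2):
        return False                      # a side description violates its own R(D) bound
    joint = 2.0 ** (-2.0 * (R1 + R2))     # distortion of a single decoder at rate R1 + R2
    if d1 + d2 >= 1.0 + joint:
        # Side distortions are loose enough for the central decoder
        # to operate at the joint rate-distortion limit.
        return d0 >= joint
    gap = math.sqrt((1.0 - d1) * (1.0 - d2)) - math.sqrt(d1 * d2 - joint)
    return d0 >= joint / (1.0 - gap ** 2)


# Example: a Laplacian source with variance sigma2 has entropy power (e/pi) * sigma2.
sigma2 = 1.0
Px = (math.e / math.pi) * sigma2
R1 = R2 = 1.0
d0, d1, d2 = 0.10, 0.30, 0.30

inner = ozarow_region_contains(sigma2, R1, R2, d0, d1, d2)  # membership => achievable for X
outer = ozarow_region_contains(Px, R1, R2, d0, d1, d2)      # non-membership => not achievable
print("inner (achievability) bound satisfied:", inner)
print("outer (converse) bound satisfied:", outer)
```

Membership in D*(σx², R1, R2) certifies that the triple is achievable for X, while failure of membership in D*(Px, R1, R2) certifies that it is not; by the paper's high-resolution result, the gap between the two tests closes as d1, d2 → 0.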

Original language: English
Pages (from-to): 2629-2636
Number of pages: 8
Journal: IEEE Transactions on Information Theory
Volume: 45
Issue number: 7
DOIs
State: Published - 1999

Keywords

  • Gaussian codes
  • High resolution
  • Multiple descriptions
  • Shannon lower bound
