The rate loss in the Wyner-Ziv problem

Ram Zamir*

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

Abstract

The rate-distortion function for source coding with side information at the decoder (the "Wyner-Ziv problem") is given in terms of an auxiliary random variable, which forms a Markov chain with the source and the side information. This Markov chain structure, typical of the solution of multiterminal source coding problems, corresponds to a loss in coding rate with respect to the conditional rate-distortion function, i.e., with respect to the case where the encoder is fully informed. We show that for difference (or balanced) distortion measures, this loss is bounded by a universal constant, which is the minimax capacity of a suitable additive-noise channel. Furthermore, in the worst case, this loss is equal to the maximin redundancy over the rate-distortion function of the additive-noise "test" channel. For example, the loss in the Wyner-Ziv problem is less than 0.5 bit/sample in the squared-error distortion case, and it is less than 0.22 bit for a binary source with Hamming distance. These results also have implications for universal quantization with side information and for more general multiterminal source coding problems.
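For concreteness, a minimal sketch of the quantities the abstract compares, in standard Wyner-Ziv notation; the symbols $R^{WZ}_{X|Y}$, $R_{X|Y}$, and $L(D)$ are assumed notation for this sketch, not quoted from the paper:

```latex
% Hedged sketch: standard definitions of the rate loss discussed in the abstract.
% U is the auxiliary random variable, with U -- X -- Y forming a Markov chain,
% and the decoder reconstructs via \hat{X} = f(U, Y).
\begin{align*}
  % Wyner-Ziv rate-distortion function (side information at the decoder only):
  R^{WZ}_{X|Y}(D) &= \min_{\substack{p(u \mid x),\; f(u,y):\; U - X - Y,\\
                       \mathbb{E}\, d\bigl(X, f(U,Y)\bigr) \le D}}
                     \bigl[\, I(X;U) - I(Y;U) \,\bigr], \\[4pt]
  % Conditional rate-distortion function (encoder also sees Y):
  R_{X|Y}(D) &= \min_{\substack{p(\hat{x} \mid x, y):\;
                       \mathbb{E}\, d(X, \hat{X}) \le D}} I(X; \hat{X} \mid Y), \\[4pt]
  % The rate loss bounded in the paper; for squared-error distortion it is
  % at most 1/2 bit per sample, and at most 0.22 bit for binary/Hamming.
  L(D) &= R^{WZ}_{X|Y}(D) - R_{X|Y}(D) \;\ge\; 0 .
\end{align*}
```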

Original language: English
Pages (from-to): 2073-2084
Number of pages: 12
Journal: IEEE Transactions on Information Theory
Volume: 42
Issue number: 6, Part 2
DOIs
State: Published - 1996

Keywords

  • Additive-noise test channel
  • Conditional rate distortion
  • Markov chain
  • Minimax capacity
  • Side information

