TY - JOUR

T1 - The rate loss in the Wyner-Ziv problem

AU - Zamir, Ram

N1 - Funding Information:
Manuscript received July 15, 1995; revised May 26, 1996. This work was performed while the author was at Cornell University, Ithaca, NY, and supported in part by the Wolfson Research Awards administered by the Israel Academy of Sciences and Humanities. The material in this paper was presented in part at the Information Theory Workshops in Rydzyna, Poland, June 1995, and Haifa, Israel, June 1996.

PY - 1996

Y1 - 1996

N2 - The rate-distortion function for source coding with side information at the decoder (the "Wyner-Ziv problem") is given in terms of an auxiliary random variable, which forms a Markov chain with the source and the side information. This Markov chain structure, typical to the solution of multiterminal source coding problems, corresponds to a loss in coding rate with respect to the conditional rate-distortion function, i.e., to the case where the encoder is fully informed. We show that for difference (or balanced) distortion measures, this loss is bounded by a universal constant, which is the minimax capacity of a suitable additive-noise channel. Furthermore, in the worst case, this loss is equal to the maximin redundancy over the rate-distortion function of the additive noise "test" channel. For example, the loss in the Wyner-Ziv problem is less than 0.5 bit/sample in the squared-error distortion case, and it is less than 0.22 bit for a binary source with Hamming distance. These results have implications also in universal quantization with side information, and in more general multiterminal source coding problems.

KW - Additive-noise test channel

KW - Conditional rate distortion

KW - Markov chain

KW - Minimax capacity

KW - Side information

UR - http://www.scopus.com/inward/record.url?scp=0000863792&partnerID=8YFLogxK

U2 - 10.1109/18.556597

DO - 10.1109/18.556597

M3 - Article

AN - SCOPUS:0000863792

SN - 0018-9448

VL - 42

SP - 2073

EP - 2084

JO - IEEE Transactions on Information Theory

JF - IEEE Transactions on Information Theory

IS - 6 PART 2

ER -