The Ziv-Zakai-Rényi Bound for Joint Source-Channel Coding

Sergey Tridenski, Ram Zamir, Amir Ingber

Research output: Contribution to journal › Article › peer-review

10 Scopus citations

Abstract

Shannon's capacity and rate-distortion function, combined with the separation principle, provide tight bounds for the minimum possible distortion in joint source-channel coding. These bounds, however, are usually achievable only in the limit of a large block length. In their 1973 paper, Ziv and Zakai introduced a family of alternative capacity and rate-distortion functions, based on functionals satisfying the data-processing inequality, which potentially give tighter bounds for systems with a small block length. There is considerable freedom in how to choose those functionals, and no way of finding the functionals that yield the best bounds for a given source-channel combination is specified. We examine recently conjectured high-SNR asymptotic expressions for the Ziv-Zakai bounds based on the Rényi-divergence functional. We derive nonasymptotic bounds on the Ziv-Zakai-Rényi rate-distortion function and capacity for a broad class of sources and additive-noise channels, which hold for arbitrary SNR and prove the conjectured asymptotic expressions in the limit of small distortion/high SNR. The results lead to new bounds on the best achievable distortion in finite-dimensional joint source-channel coding. Examples are presented where the new bounds yield a significant improvement over Shannon's original bounds.
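
For context, a hedged sketch of the quantities the abstract refers to, stated under standard definitions rather than the paper's exact conditions: the Rényi divergence of order $\alpha$ between distributions with densities $p$ and $q$ is

\[
D_{\alpha}(P \,\|\, Q) \;=\; \frac{1}{\alpha - 1}\,\log \int p(x)^{\alpha}\, q(x)^{1-\alpha}\, dx,
\qquad \alpha > 0,\ \alpha \neq 1,
\]

and, for a generalized mutual-information functional $\tilde{I}$ satisfying the data-processing inequality, the Ziv-Zakai argument compares a generalized rate-distortion function with a generalized capacity,

\[
\tilde{R}(D) \;\triangleq\; \min_{p(\hat{x}\mid x):\; \mathbb{E}[d(X,\hat{X})] \le D} \tilde{I}(X;\hat{X})
\;\le\;
\max_{p(x)} \tilde{I}(X;Y) \;\triangleq\; \tilde{C},
\]

so that inverting $\tilde{R}(D) \le \tilde{C}$ yields a lower bound on the distortion achievable by any finite-dimensional joint source-channel code. The paper's specific Rényi-based functionals and regularity conditions may differ from this generic form.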

Original language: English
Article number: 7124492
Pages (from-to): 4293-4315
Number of pages: 23
Journal: IEEE Transactions on Information Theory
Volume: 61
Issue number: 8
DOIs
State: Published - Aug 2015

Keywords

  • Joint source-channel coding
  • Rényi divergence
  • Ziv-Zakai
  • finite blocklength

