TY - JOUR
T1 - The Ziv-Zakai-Rényi Bound for Joint Source-Channel Coding
AU - Tridenski, Sergey
AU - Zamir, Ram
AU - Ingber, Amir
N1 - Publisher Copyright:
© 1963-2012 IEEE.
PY - 2015/8
Y1 - 2015/8
N2 - Shannon's capacity and rate-distortion function, combined with the separation principle, provide tight bounds for the minimum possible distortion in joint source-channel coding. These bounds, however, are usually achievable only in the limit of a large block length. In their 1973 paper, Ziv and Zakai introduced a family of alternative capacity and rate-distortion functions, based on functionals that satisfy the data-processing inequality, which can potentially give tighter bounds for systems with a small block length. There is considerable freedom in how to choose these functionals, and no general method is specified for finding the functional that yields the best bound for a given source-channel combination. We examine recently conjectured high-SNR asymptotic expressions for the Ziv-Zakai bounds based on the Rényi-divergence functional. We derive nonasymptotic bounds on the Ziv-Zakai-Rényi rate-distortion function and capacity for a broad class of sources and additive-noise channels; these bounds hold for arbitrary SNR and prove the conjectured asymptotic expressions in the limit of small distortion / high SNR. The results lead to new bounds on the best achievable distortion in finite-dimensional joint source-channel coding. Examples are presented in which the new bounds significantly improve upon Shannon's original bounds.
AB - Shannon's capacity and rate-distortion function, combined with the separation principle, provide tight bounds for the minimum possible distortion in joint source-channel coding. These bounds, however, are usually achievable only in the limit of a large block length. In their 1973 paper, Ziv and Zakai introduced a family of alternative capacity and rate-distortion functions, based on functionals that satisfy the data-processing inequality, which can potentially give tighter bounds for systems with a small block length. There is considerable freedom in how to choose these functionals, and no general method is specified for finding the functional that yields the best bound for a given source-channel combination. We examine recently conjectured high-SNR asymptotic expressions for the Ziv-Zakai bounds based on the Rényi-divergence functional. We derive nonasymptotic bounds on the Ziv-Zakai-Rényi rate-distortion function and capacity for a broad class of sources and additive-noise channels; these bounds hold for arbitrary SNR and prove the conjectured asymptotic expressions in the limit of small distortion / high SNR. The results lead to new bounds on the best achievable distortion in finite-dimensional joint source-channel coding. Examples are presented in which the new bounds significantly improve upon Shannon's original bounds.
KW - Joint source-channel coding
KW - Rényi divergence
KW - Ziv-Zakai
KW - finite blocklength
UR - http://www.scopus.com/inward/record.url?scp=84959194176&partnerID=8YFLogxK
U2 - 10.1109/TIT.2015.2445874
DO - 10.1109/TIT.2015.2445874
M3 - Article
AN - SCOPUS:84959194176
SN - 0018-9448
VL - 61
SP - 4293
EP - 4315
JO - IEEE Transactions on Information Theory
JF - IEEE Transactions on Information Theory
IS - 8
M1 - 7124492
ER -