TY - JOUR
T1 - Systematic lossy source/channel coding
AU - Shamai, Shlomo
AU - Verdú, Sergio
AU - Zamir, Ram
N1 - Funding Information:
Manuscript received August 1, 1996; revised June 15, 1997. This work was supported by the U.S.–Israel Binational Science Foundation under Grant 92–00202, the NSF under Grant NCR95-23805, and the UC Micro program. S. Shamai is with the Electrical Engineering Department, Technion–Israel Institute of Technology, Haifa, 32000, Israel (e-mail: [email protected]. ac.il). S. Verdú is with the Electrical Engineering Department, Princeton University, Princeton, NJ 08544 USA (e-mail: [email protected]). R. Zamir is with the Department of Electrical Engineering–Systems, Tel Aviv University, Tel Aviv 69978, Israel (e-mail: [email protected]). Publisher Item Identifier S 0018-9448(98)00838-4.
PY - 1998
Y1 - 1998
N2 - The fundamental limits of "systematic" communication are analyzed. In systematic transmission, the decoder has access to a noisy version of the uncoded raw data (analog or digital). The coded version of the data is used to reduce the average reproduction distortion D below that provided by the uncoded systematic link and/or to increase the rate of information transmission. Unlike the case of arbitrarily reliable error correction (D → 0) for symmetric sources/channels, where systematic codes are known to do as well as nonsystematic codes, we demonstrate that the systematic structure may degrade performance for nonvanishing D. We characterize the achievable average distortion and we find necessary and sufficient conditions under which systematic communication does not incur loss of optimality. The Wyner-Ziv rate distortion theorem plays a fundamental role in our setting. The general result is applied to several scenarios. For a Gaussian bandlimited source and a Gaussian channel, the invariance of the bandwidth-signal-to-noise ratio (SNR, in decibels) product is established, and the optimality of systematic transmission is demonstrated. Bernoulli sources transmitted over binary-symmetric channels and over certain Gaussian channels are also analyzed. It is shown that if a nonnegligible bit-error rate is tolerated, systematic encoding is strictly suboptimal.
AB - The fundamental limits of "systematic" communication are analyzed. In systematic transmission, the decoder has access to a noisy version of the uncoded raw data (analog or digital). The coded version of the data is used to reduce the average reproduction distortion D below that provided by the uncoded systematic link and/or to increase the rate of information transmission. Unlike the case of arbitrarily reliable error correction (D → 0) for symmetric sources/channels, where systematic codes are known to do as well as nonsystematic codes, we demonstrate that the systematic structure may degrade performance for nonvanishing D. We characterize the achievable average distortion and we find necessary and sufficient conditions under which systematic communication does not incur loss of optimality. The Wyner-Ziv rate distortion theorem plays a fundamental role in our setting. The general result is applied to several scenarios. For a Gaussian bandlimited source and a Gaussian channel, the invariance of the bandwidth-signal-to-noise ratio (SNR, in decibels) product is established, and the optimality of systematic transmission is demonstrated. Bernoulli sources transmitted over binary-symmetric channels and over certain Gaussian channels are also analyzed. It is shown that if a nonnegligible bit-error rate is tolerated, systematic encoding is strictly suboptimal.
KW - Gaussian channels and sources
KW - Rate-distortion theory
KW - Source/channel coding
KW - Systematic transmission
KW - Uncoded side information
KW - Wyner-Ziv rate distortion
UR - http://www.scopus.com/inward/record.url?scp=0032025574&partnerID=8YFLogxK
U2 - 10.1109/18.661505
DO - 10.1109/18.661505
M3 - Article
AN - SCOPUS:0032025574
SN - 0018-9448
VL - 44
SP - 564
EP - 579
JO - IEEE Transactions on Information Theory
JF - IEEE Transactions on Information Theory
IS - 2
ER -