TY - JOUR
T1 - Error exponents of modulo-additive noise channels with side information at the transmitter
AU - Erez, Uri
AU - Zamir, Ram
N1 - Funding Information:
Manuscript received November 18, 1999; revised August 3, 2000. This work was supported in part by the Tel-Aviv University research fund. The material in this paper was presented in part at the Information Theory Workshop, Metsovo, Greece, June 1999. The authors are with the Department of Electrical Engineering–Systems, Tel-Aviv University, Ramat-Aviv 69928, Israel (e-mail: [email protected]; [email protected]). Communicated by S. Shamai, Associate Editor for Shannon Theory. Publisher Item Identifier S 0018-9448(01)00589-2.
PY - 2001/1
Y1 - 2001/1
N2 - Consider the optimum strategy for using channel state ("side") information in transmission over a modulo-additive noise channel, with state-dependent noise, where the receiver does not have access to the side information (SI). Recent work showed that, capacity-wise, the optimum transmitter shifts each code letter by a "prediction" of the noise sample based on the SI. We show that this structure also achieves the random-coding error exponent and is therefore optimum over some range of rates below capacity. Specifically, the optimum transmitter predictor minimizes the Rényi entropy of the prediction error; the Rényi order depends on the rate and goes to one (corresponding to Shannon entropy) as the rate approaches capacity. In contrast, it is shown that this "prediction strategy" may not be optimal at low transmission rates.
KW - Error exponent
KW - Prediction
KW - Rényi entropy
KW - Side information (SI)
KW - Time-varying channels
UR - http://www.scopus.com/inward/record.url?scp=0035091411&partnerID=8YFLogxK
U2 - 10.1109/18.904523
DO - 10.1109/18.904523
M3 - Article
AN - SCOPUS:0035091411
SN - 0018-9448
VL - 47
SP - 210
EP - 218
JO - IEEE Transactions on Information Theory
JF - IEEE Transactions on Information Theory
IS - 1
ER -