TY - JOUR
T1 - Lower Bounds and Approximations for the Information Rate of the ISI Channel
AU - Carmon, Yair
AU - Shamai, Shlomo
N1 - Publisher Copyright:
© 2015 IEEE.
PY - 2015/10
Y1 - 2015/10
N2 - We consider the discrete-time intersymbol interference (ISI) channel model, with additive Gaussian noise and fixed independent and identically distributed inputs. In this setting, we investigate the expression put forth by Shamai and Laroia as a conjectured lower bound for the input-output mutual information after application of a minimum mean-square error decision-feedback equalizer receiver. A low signal-to-noise ratio (SNR) expansion is used to prove that the conjectured bound does not hold under general conditions, and to characterize inputs for which it is particularly ill-suited. One such input is used to construct a counterexample, indicating that the Shamai-Laroia expression does not always bound even the achievable rate of the channel, thus excluding a natural relaxation of the original conjectured bound. However, this relaxed bound is then shown to hold for any finite entropy input and ISI channel, when the SNR is sufficiently high. We derive two conditions under which the relaxed bound holds, involving compound channel capacity and quasiconvexity arguments. Finally, new simple bounds for the achievable rate are proven, and compared with other known bounds. Information-estimation relations and estimation-theoretic bounds play a key role in establishing our results.
KW - Intersymbol interference
KW - MMSE
KW - Shamai-Laroia approximation
KW - decision-feedback equalization
KW - mutual information
UR - http://www.scopus.com/inward/record.url?scp=84959449857&partnerID=8YFLogxK
U2 - 10.1109/TIT.2015.2460252
DO - 10.1109/TIT.2015.2460252
M3 - Article
AN - SCOPUS:84959449857
SN - 0018-9448
VL - 61
SP - 5417
EP - 5431
JO - IEEE Transactions on Information Theory
JF - IEEE Transactions on Information Theory
IS - 10
M1 - 7165647
ER -