TY - JOUR
T1 - On the loss of single-letter characterization
T2 - The dirty multiple access channel
AU - Philosof, Tal
AU - Zamir, Ram
N1 - Funding Information:
Manuscript received March 14, 2008; revised February 23, 2009. Current version published May 20, 2009. The work of R. Zamir was supported in part by BSF under Grant 2004398. The material in this paper was presented in part at the Information Theory Workshop, Porto, Portugal, May 2008. The authors are with the Department of Electrical Engineering–Systems, Tel-Aviv University, Ramat-Aviv 69978, Tel-Aviv, Israel. Communicated by H. Yamamoto, Associate Editor for Shannon Theory. Digital Object Identifier 10.1109/TIT.2009.2018174
PY - 2009
Y1 - 2009
AB - For general memoryless systems, the existing information-theoretic solutions have a "single-letter" form. This reflects the fact that optimum performance can be approached by a random code (or a random binning scheme), generated using independent and identically distributed copies of some scalar distribution. Is that the form of the solution of any (information-theoretic) problem? In fact, some counterexamples are known. The most famous one is the "two help one" problem: Körner and Marton showed that if we want to decode the modulo-two sum of two correlated binary sources from their independent encodings, then linear coding is better than random coding. In this paper we provide another counterexample, the "doubly-dirty" multiple-access channel (MAC). Like the Körner-Marton problem, this is a multiterminal scenario where side information is distributed among several terminals; each transmitter knows part of the channel interference while the receiver only observes the channel output. We give an explicit solution for the capacity region of the binary doubly-dirty MAC, demonstrate how this region can be approached using a linear coding scheme, and prove that the "best known single-letter region" is strictly contained in it. We also state a conjecture regarding the capacity loss of single-letter characterization in the Gaussian case.
KW - Dirty paper coding
KW - Körner-Marton problem
KW - Lattice strategies
KW - Linear/lattice binning
KW - Multiuser information theory
KW - Random binning
UR - http://www.scopus.com/inward/record.url?scp=66949135426&partnerID=8YFLogxK
U2 - 10.1109/TIT.2009.2018174
DO - 10.1109/TIT.2009.2018174
M3 - Article
AN - SCOPUS:66949135426
SN - 0018-9448
VL - 55
SP - 2442
EP - 2454
JO - IEEE Transactions on Information Theory
JF - IEEE Transactions on Information Theory
IS - 6
ER -