TY - JOUR
T1 - Mismatched codebooks and the role of entropy-coding in lossy data compression
AU - Kontoyiannis, Ioannis
AU - Zamir, Ram
PY - 2003
Y1 - 2003
N2 - We develop a random coding model for universal quantization. The universal quantizer consists of a (typically) mismatched random codebook followed by optimal entropy-coding. We precisely characterize the rate gain due to entropy-coding and show that it may be arbitrarily large. In the special case of entropy-coded i.i.d. Gaussian codebooks with large variance, we draw a novel connection with the compression performance of entropy-coded dithered lattice quantization. Our main tools are large deviations techniques that allow us to prove an almost sure version of the conditional limit theorem.
UR - http://www.scopus.com/inward/record.url?scp=0141973696&partnerID=8YFLogxK
U2 - 10.1109/isit.2003.1228181
DO - 10.1109/isit.2003.1228181
M3 - Conference article
AN - SCOPUS:0141973696
SN - 2157-8096
SP - 167
JO - IEEE International Symposium on Information Theory - Proceedings
JF - IEEE International Symposium on Information Theory - Proceedings
T2 - Proceedings 2003 IEEE International Symposium on Information Theory (ISIT)
Y2 - 29 June 2003 through 4 July 2003
ER -