Mismatched codebooks and the role of entropy-coding in lossy data compression

Ioannis Kontoyiannis*, Ram Zamir

*Corresponding author for this work

Research output: Contribution to journal › Conference article › peer-review

Abstract

We develop a random coding model for universal quantization. The universal quantizer consists of a (typically) mismatched random codebook followed by optimal entropy-coding. We precisely characterize the rate gain due to entropy-coding and show that it may be arbitrarily large. In the special case of entropy-coded i.i.d. Gaussian codebooks with large variance, we draw a novel connection with the compression performance of entropy-coded dithered lattice quantization. Our main tools are large deviations techniques that allow us to prove an almost sure version of the conditional limit theorem.
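The entropy-coded dithered lattice quantizer mentioned in the abstract can be illustrated in its simplest (scalar-lattice) form: the encoder quantizes the source plus a shared dither, the decoder subtracts the dither, and the quantization indices are then entropy-coded. This is a minimal sketch, not the paper's construction; the step size `delta` and the Gaussian source are illustrative assumptions, and the ideal entropy-coder rate is approximated by the empirical entropy of the index stream.

```python
import math
import random
from collections import Counter

random.seed(0)
delta = 0.5          # lattice step size (assumed, for illustration)
n = 100_000
xs = [random.gauss(0.0, 1.0) for _ in range(n)]            # i.i.d. Gaussian source
us = [random.uniform(-delta / 2, delta / 2) for _ in range(n)]  # shared dither

# Encoder: quantize x + u to the nearest lattice point, send the index k.
indices = [round((x + u) / delta) for x, u in zip(xs, us)]
# Decoder: reconstruct as k*delta - u (dither is known to both sides).
recons = [k * delta - u for k, u in zip(indices, us)]
errors = [r - x for r, x in zip(recons, xs)]

# With an ideal entropy coder, the rate approaches the entropy of the indices.
counts = Counter(indices)
rate = -sum((c / n) * math.log2(c / n) for c in counts.values())
# Dithering makes the error exactly uniform on (-delta/2, delta/2),
# independent of the source, so MSE ≈ delta^2 / 12.
mse = sum(e * e for e in errors) / n
print(f"rate ≈ {rate:.2f} bits/sample, MSE ≈ {mse:.4f} "
      f"(uniform-error variance = {delta**2 / 12:.4f})")
```

The rate gain from entropy-coding shows up here directly: the raw index alphabet would need far more bits per sample than the empirical entropy of the (highly non-uniform) index distribution.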

Original language: English
Pages (from-to): 167
Number of pages: 1
Journal: IEEE International Symposium on Information Theory - Proceedings
State: Published - 2003
Event: Proceedings 2003 IEEE International Symposium on Information Theory (ISIT), Yokohama, Japan
Duration: 29 Jun 2003 - 4 Jul 2003
