Abstract
We develop a random coding model for universal quantization. The universal quantizer consists of a (typically) mismatched random codebook followed by optimal entropy coding. We precisely characterize the rate gain due to entropy coding and show that it may be arbitrarily large. In the special case of entropy-coded i.i.d. Gaussian codebooks with large variance, we draw a novel connection with the compression performance of entropy-coded dithered lattice quantization. Our main tools are large deviations techniques that allow us to prove an almost sure version of the conditional limit theorem.
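As a minimal illustration of the dithered lattice quantization mentioned above (a sketch for intuition, not taken from the paper): in the scalar case, subtractive dithering makes the quantization error uniform and independent of the source, which is what enables the clean entropy-coding analysis. The step size `delta` and the Gaussian source below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
delta = 0.5  # quantizer step size (illustrative choice)

def dithered_quantize(x, dither, delta):
    """Subtractive dithered uniform scalar quantization:
    the encoder quantizes x + dither to the nearest lattice point
    (the indices would be entropy-coded in practice), and the
    decoder subtracts the shared dither."""
    q = delta * np.round((x + dither) / delta)
    return q - dither

x = rng.normal(size=100_000)                                 # Gaussian source (assumption)
u = rng.uniform(-delta / 2, delta / 2, size=x.shape)          # shared dither
x_hat = dithered_quantize(x, u, delta)
err = x_hat - x

# With subtractive dither the error is uniform on (-delta/2, delta/2),
# so it is bounded by delta/2 and has variance delta**2 / 12.
print(np.max(np.abs(err)) <= delta / 2)
print(abs(np.var(err) - delta**2 / 12) < 1e-3)
```

The key property exercised here is that the reconstruction error distribution does not depend on the input statistics, which is what allows entropy-coded dithered quantizers to serve as universal benchmarks.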
| Original language | English |
| ---|--- |
| Pages (from-to) | 167 |
| Number of pages | 1 |
| Journal | IEEE International Symposium on Information Theory - Proceedings |
| DOIs | |
| State | Published - 2003 |
| Event | 2003 IEEE International Symposium on Information Theory (ISIT), Yokohama, Japan, 29 Jun 2003 – 4 Jul 2003 |
Funding
| Funders | Funder number |
| ---|--- |
| National Science Foundation | 0073378 |