The index entropy of a mismatched codebook

Ram Zamir*

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

Abstract

Entropy coding is a well-known technique to reduce the rate of a quantizer. It plays a particularly important role in universal quantization, where the quantizer codebook is not matched to the source statistics. We investigate the gain due to entropy coding by considering the entropy of the index of the first codeword, in a mismatched random codebook, that D-matches the source word. We show that the index entropy is strictly lower than the "uncoded" rate of the code, provided that the entropy is conditioned on the codebook. The number of bits saved by conditional entropy coding is equal to the divergence between the "favorite type" (the limiting empirical distribution of the first D-matching codeword) and the codebook-generating distribution. Specific examples are provided.
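To make the setting concrete, the following is a minimal Monte Carlo sketch (not from the paper; the block length, distortion level, and Bernoulli parameters are illustrative assumptions). It draws a source word, scans an i.i.d. random codebook generated from a mismatched distribution for the first codeword within Hamming distortion D, and records that codeword's empirical distribution — an estimate of the "favorite type" — along with the divergence between that type and the codebook-generating distribution, the quantity the paper identifies as the per-symbol saving of conditional entropy coding.

```python
import numpy as np

rng = np.random.default_rng(0)

n = 12        # block length (illustrative choice, not from the paper)
D = 0.2       # allowed per-symbol Hamming distortion
p_src = 0.5   # source symbols are Bernoulli(p_src)
q_cb = 0.3    # codewords are drawn i.i.d. Bernoulli(q_cb) -- mismatched to the source

trials = 2000
first_index = []   # index of the first D-matching codeword in each trial
match_ones = []    # fraction of ones in that codeword (favorite-type estimate)

for _ in range(trials):
    x = rng.random(n) < p_src                # source word
    idx = 0
    while True:
        c = rng.random(n) < q_cb             # next codeword of the random codebook
        if np.mean(x != c) <= D:             # c D-matches x under Hamming distortion
            first_index.append(idx)
            match_ones.append(c.mean())
            break
        idx += 1

fav = np.mean(match_ones)                    # empirical "favorite type" (fraction of ones)
# Binary divergence D(fav || q_cb): the estimated bits/symbol saved by
# entropy coding the index conditioned on the codebook.
kl = fav * np.log2(fav / q_cb) + (1 - fav) * np.log2((1 - fav) / (1 - q_cb))

print("mean index of first D-match:", np.mean(first_index))
print("favorite type (fraction of ones):", fav)
print("estimated saving D(fav || q_cb) in bits/symbol:", kl)
```

Because the codebook distribution Bernoulli(0.3) is mismatched to the Bernoulli(0.5) source, the favorite type lands strictly between the two, and the divergence term is strictly positive — consistent with the abstract's claim that the conditional index entropy is strictly below the uncoded rate.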

Original language: English
Pages (from-to): 523-528
Number of pages: 6
Journal: IEEE Transactions on Information Theory
Volume: 48
Issue number: 2
State: Published - Feb 2002

Keywords

  • Entropy-coded quantization
  • Favorite type
  • Mismatched source coding
  • Universal quantization
