Causal coding of individual sequences and the Lempel-Ziv differential entropy

Tamás Linder*, Ram Zamir

*Corresponding author for this work

Research output: Contribution to journal › Conference article › peer-review

Abstract

In causal source coding, the reconstruction is restricted to be a function of the present and past source samples, while the variable-length code stream may be non-causal. Neuhoff and Gilbert showed that for memoryless sources, optimum performance among all causal lossy source codes is achieved by time-sharing at most two memoryless codes (scalar quantizers) followed by entropy coding. We extend this result to causal coding of individual sequences in the limit of small distortion. The optimum performance of finite-memory variable-rate causal codes in this setting is characterized by a deterministic analogue of differential entropy, which we call the "Lempel-Ziv differential entropy." As a by-product, we also provide an individual-sequence version of the Shannon lower bound to the rate-distortion function.
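The quantize-then-entropy-code structure described in the abstract can be illustrated numerically. The sketch below is not the paper's construction: it is a hedged toy estimate of a "deterministic differential entropy" for an individual sequence, obtained by uniform scalar quantization with step `delta` followed by a Lempel-Ziv-style lossless coder (here Python's zlib, an LZ77 variant, standing in for the Lempel-Ziv code), adding back `log2(delta)` per sample. The function name `lz_diff_entropy_estimate` and all parameter choices are illustrative assumptions, not the paper's definitions.

```python
import math
import random
import zlib


def lz_diff_entropy_estimate(x, delta):
    """Toy estimate of a deterministic analogue of differential entropy.

    Steps (a sketch, not the paper's exact construction):
      1. Quantize the sequence with a uniform scalar quantizer of step `delta`.
      2. Losslessly compress the quantized symbols with zlib (LZ77-based),
         using its codelength as a stand-in for the Lempel-Ziv codelength.
      3. Add log2(delta) per sample, mirroring how differential entropy
         relates to the entropy of a finely quantized source.
    """
    symbols = [round(v / delta) for v in x]
    # Serialize the quantized symbols; the compressed length in bits
    # proxies the per-sample rate of the entropy coder.
    data = ",".join(str(s) for s in symbols).encode()
    bits = 8 * len(zlib.compress(data, 9))
    return bits / len(x) + math.log2(delta)


random.seed(0)
# For an i.i.d. N(0,1) sequence the (probabilistic) differential entropy
# is 0.5 * log2(2*pi*e) ~ 2.05 bits; zlib's overhead makes the toy
# estimate land above that, but in the same range.
x = [random.gauss(0.0, 1.0) for _ in range(20000)]
print(round(lz_diff_entropy_estimate(x, 0.05), 2))
```

For a Gaussian sequence the estimate exceeds the 2.05-bit differential entropy because zlib is far from an optimal entropy coder on short text-serialized alphabets; the point of the sketch is only the structure (scalar quantizer, then LZ-style lossless code, then the `log2(delta)` correction), not the achievable rate.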

Original language: English
Pages (from-to): 558
Number of pages: 1
Journal: IEEE International Symposium on Information Theory - Proceedings
State: Published - 2004
Event: 2004 IEEE International Symposium on Information Theory - Chicago, IL, United States
Duration: 27 Jun 2004 - 2 Jul 2004
