Abstract
In causal source coding, the reconstruction is restricted to be a function of the present and past source samples, while the variable-length code stream may be non-causal. Neuhoff and Gilbert showed that for memoryless sources, optimum performance among all causal lossy source codes is achieved by time-sharing at most two memoryless codes (scalar quantizers) followed by entropy coding. We extend this result to causal coding of individual sequences in the limit of small distortion. The optimum performance of finite-memory variable-rate causal codes in this setting is characterized by a deterministic analogue of differential entropy, which we call "Lempel-Ziv differential entropy." As a by-product, we also provide an individual-sequence version of the Shannon lower bound to the rate-distortion function.
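The structure referred to above (a memoryless scalar quantizer producing reconstructions causally, followed by entropy coding of the index stream) can be illustrated with a minimal sketch. This is not the paper's construction; it assumes a uniform quantizer with step `step`, and it approximates the entropy coder's rate by the first-order empirical entropy of the quantizer indices. The function names `causal_scalar_quantize` and `empirical_entropy_rate` are illustrative only.

```python
import numpy as np
from collections import Counter

def causal_scalar_quantize(x, step):
    """Quantize each sample causally: the reconstruction of x[t] depends
    only on x[t] itself (a memoryless scalar quantizer), never on future samples."""
    indices = np.round(np.asarray(x) / step).astype(int)  # quantizer cell index per sample
    reconstruction = indices * step                        # cell-center reconstruction
    return indices, reconstruction

def empirical_entropy_rate(indices):
    """First-order empirical entropy (bits/sample) of the quantizer indices,
    a proxy for the rate of an ideal entropy coder applied to the index stream."""
    counts = Counter(indices.tolist())
    n = len(indices)
    probs = np.array([c / n for c in counts.values()])
    return float(-(probs * np.log2(probs)).sum())

# Example: a memoryless Gaussian source with a fine quantization step,
# the regime in which the small-distortion characterization applies.
rng = np.random.default_rng(0)
x = rng.normal(size=10_000)
step = 0.1
idx, x_hat = causal_scalar_quantize(x, step)
distortion = float(np.mean((x - x_hat) ** 2))  # mean squared error
rate = empirical_entropy_rate(idx)
print(f"distortion ~ {distortion:.4f}, rate ~ {rate:.2f} bits/sample")
```

In the individual-sequence setting of the paper, the probabilistic entropy above is replaced by a Lempel-Ziv-based quantity; the sketch only conveys the causal quantize-then-entropy-code architecture.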
Original language | English |
---|---|
Pages (from-to) | 558 |
Number of pages | 1 |
Journal | IEEE International Symposium on Information Theory - Proceedings |
State | Published - 2004 |
Event | 2004 IEEE International Symposium on Information Theory - Chicago, IL, United States, 27 Jun 2004 → 2 Jul 2004 |