It is well known that independent (separate) encoding of K correlated sources may incur a rate loss compared to joint encoding, even when decoding is done jointly. This loss is particularly evident in the multiple descriptions problem, where the same source is encoded in each description. We observe that, under mild conditions on the source and distortion measure, the sum-rate of K separately encoded, individually good descriptions tends to the rate-distortion function of the joint decoder in the limit of vanishingly small coding rates per description. We then propose to successively encode the source into K independent descriptions in each round, so as to achieve a final distortion D after M rounds. We provide two examples, a Gaussian source under mean-squared error and an exponential source under a one-sided error measure, for which the excess rate vanishes in the limit as the number of rounds M goes to infinity, for any fixed D and K. This result has an interesting interpretation for a multi-round variant of the multiple descriptions problem, in which after each round the encoder receives (block) feedback indicating which of the descriptions arrived: as the number of rounds M goes to infinity, the total rate of the received descriptions approaches the rate-distortion function. We provide theoretical and experimental evidence suggesting that this phenomenon is in fact more general than the two examples above.
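As a minimal numerical sketch of the multi-round idea for the Gaussian example: a Gaussian source under mean-squared error is successively refinable, so splitting the distortion path from the source variance down to D into M rounds of equal distortion ratio incurs no rate loss across rounds, and the per-round rates sum exactly to the rate-distortion function R(D) = (1/2) log2(sigma^2/D). The function names below are illustrative, not from the paper; the paper's additional claim is that the per-round multiple-description loss also vanishes as the per-round rates shrink.

```python
import math

def gaussian_rd(var, dist):
    # Rate-distortion function of a Gaussian source under MSE:
    # R(D) = 0.5 * log2(var / D) bits per sample, for 0 < D <= var.
    return 0.5 * math.log2(var / dist)

def round_rates(var, dist, num_rounds):
    # Split the path from variance `var` down to distortion `dist`
    # into rounds with equal per-round distortion ratio; return the
    # rate spent in each round (successive-refinement sketch).
    ratio = (dist / var) ** (1.0 / num_rounds)
    rates, d_prev = [], var
    for _ in range(num_rounds):
        d_next = d_prev * ratio
        rates.append(gaussian_rd(d_prev, d_next))  # rate of this round
        d_prev = d_next
    return rates

var, D, M = 1.0, 0.01, 8
rates = round_rates(var, D, M)
# The rounds' rates sum (up to float error) to R(D): zero excess rate.
print(sum(rates), gaussian_rd(var, D))
```

Note that as M grows, each round's rate shrinks like R(D)/M, which is exactly the small-rate regime in which the sum-rate of separately encoded descriptions approaches the joint rate-distortion function.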