TY - JOUR
T1 - Multiple-description coding by dithered delta-sigma quantization
AU - Østergaard, Jan
AU - Zamir, Ram
N1 - Funding Information:
Dr. Østergaard has received a Young Researcher's Award from the Danish Independent Research Council and a fellowship from the Danish Research Council for Technology and Production Sciences.
Funding Information:
Manuscript received August 07, 2007; revised December 15, 2008. Current version published September 23, 2009. The work of J. Østergaard is supported by the Danish Research Council for Technology and Production Sciences under Grant 274-07-0383. The material in this paper was presented in part at the IEEE Data Compression Conference, Snowbird, UT, March 2007. J. Østergaard is with the Department of Electronic Systems, Aalborg University, Aalborg, Denmark (e-mail: janoe@ieee.org). R. Zamir is with the Department of Electrical Engineering–Systems, Tel-Aviv University, Tel-Aviv, Ramat-Aviv 69978, Israel (e-mail: zamir@eng.tau.ac.il). Communicated by M. Effros, Associate Editor for Source Coding. Digital Object Identifier 10.1109/TIT.2009.2027528
PY - 2009
Y1 - 2009
AB - We address the connection between the multiple-description (MD) problem and Delta-Sigma quantization. The inherent redundancy due to oversampling in Delta-Sigma quantization, and the simple linear-additive noise model resulting from dithered lattice quantization, allow us to construct a symmetric and time-invariant MD coding scheme. We show that the use of a noise-shaping filter makes it possible to trade off central distortion for side distortion. Asymptotically, as the dimension of the lattice vector quantizer and order of the noise-shaping filter approach infinity, the entropy rate of the dithered Delta-Sigma quantization scheme approaches the symmetric two-channel MD rate-distortion function for a memoryless Gaussian source and mean square error (MSE) fidelity criterion, at any side-to-central distortion ratio and any resolution. In the optimal scheme, the infinite-order noise-shaping filter must be minimum phase and have a piecewise flat power spectrum with a single jump discontinuity. An important advantage of the proposed design is that it is symmetric in rate and distortion by construction, so the coding rates of the descriptions are identical and there is therefore no need for source splitting.
KW - Delta-Sigma modulation
KW - Dithered lattice quantization
KW - Entropy coding
KW - Joint source-channel coding
KW - Multiple-description (MD) coding
KW - Vector quantization
UR - http://www.scopus.com/inward/record.url?scp=70349623914&partnerID=8YFLogxK
U2 - 10.1109/TIT.2009.2027528
DO - 10.1109/TIT.2009.2027528
M3 - Article
AN - SCOPUS:70349623914
SN - 0018-9448
VL - 55
SP - 4661
EP - 4675
JO - IEEE Transactions on Information Theory
JF - IEEE Transactions on Information Theory
IS - 10
ER -