TY - GEN
T1 - On-The-Fly Stochastic Codebook Re-generation for Sources with Memory
AU - Elshafiy, Ahmed
AU - Namazi, Mahmoud
AU - Zamir, Ram
AU - Rose, Kenneth
N1 - Publisher Copyright:
© 2021 IEEE
PY - 2021/4/11
Y1 - 2021/4/11
N2 - This paper proposes a generalized stochastic mechanism for codebook generation in lossy coding settings for sources with memory. Earlier work has shown that the rate-distortion bound can be asymptotically achieved for discrete memoryless sources by a “natural type selection” (NTS) algorithm. In iteration n, the distribution that is most likely to produce the types of a sequence of K codewords of finite length ℓ that “d-match” a respective sequence of K source words of length ℓ (i.e., that satisfy the distortion constraint) is used to regenerate the codebook for iteration n + 1. The resulting sequence of codebook-generating distributions converges to the optimal distribution Q∗ that achieves the rate-distortion bound for the memoryless source, asymptotically in ℓ, K, and n. This work generalizes the NTS algorithm to account for sources with memory. The algorithm encodes mℓ-length source words consisting of ℓ vectors (or super-symbols) of length m. We show that, for finite m and ℓ, the sequence of codebook reproduction distributions Q_{0,m,ℓ}, Q_{1,m,ℓ}, … (each computed after observing a sequence of K d-match events) converges to the optimal achievable distribution Q∗_{m,ℓ} (within a set of achievable distributions determined by m and ℓ), asymptotically in K and n. It is further shown that Q∗_{m,ℓ} converges to the optimal reproduction distribution Q∗ that achieves the rate-distortion bound for sources with memory, asymptotically in m and ℓ.
AB - This paper proposes a generalized stochastic mechanism for codebook generation in lossy coding settings for sources with memory. Earlier work has shown that the rate-distortion bound can be asymptotically achieved for discrete memoryless sources by a “natural type selection” (NTS) algorithm. In iteration n, the distribution that is most likely to produce the types of a sequence of K codewords of finite length ℓ that “d-match” a respective sequence of K source words of length ℓ (i.e., that satisfy the distortion constraint) is used to regenerate the codebook for iteration n + 1. The resulting sequence of codebook-generating distributions converges to the optimal distribution Q∗ that achieves the rate-distortion bound for the memoryless source, asymptotically in ℓ, K, and n. This work generalizes the NTS algorithm to account for sources with memory. The algorithm encodes mℓ-length source words consisting of ℓ vectors (or super-symbols) of length m. We show that, for finite m and ℓ, the sequence of codebook reproduction distributions Q_{0,m,ℓ}, Q_{1,m,ℓ}, … (each computed after observing a sequence of K d-match events) converges to the optimal achievable distribution Q∗_{m,ℓ} (within a set of achievable distributions determined by m and ℓ), asymptotically in K and n. It is further shown that Q∗_{m,ℓ} converges to the optimal reproduction distribution Q∗ that achieves the rate-distortion bound for sources with memory, asymptotically in m and ℓ.
KW - Natural Type Selection
KW - Random Codebook
KW - Rate-Distortion Function
KW - String Matching
UR - http://www.scopus.com/inward/record.url?scp=85113323018&partnerID=8YFLogxK
U2 - 10.1109/ITW46852.2021.9457666
DO - 10.1109/ITW46852.2021.9457666
M3 - Conference contribution
AN - SCOPUS:85113323018
T3 - 2020 IEEE Information Theory Workshop, ITW 2020
BT - 2020 IEEE Information Theory Workshop, ITW 2020
PB - Institute of Electrical and Electronics Engineers Inc.
T2 - 2020 IEEE Information Theory Workshop, ITW 2020
Y2 - 11 April 2021 through 15 April 2021
ER -
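
Note (not part of the record above): to make the iteration described in the abstract concrete, here is a minimal Python sketch, not the authors' implementation. It assumes a binary alphabet, per-letter Hamming distortion, and hypothetical names and parameters (nts_iteration, max_draws, m, l, K, d) chosen only for illustration. One call performs a single NTS step: codewords are drawn i.i.d. over super-symbols from the current reproduction distribution Q_n, a d-matching codeword is found for each of K source words, and the empirical super-symbol types of those matches are averaged to form Q_{n+1}.

    # Illustrative sketch of one Natural Type Selection (NTS) iteration for
    # super-symbols of length m; a simplification, not the paper's algorithm.
    import itertools
    import random
    from collections import Counter

    def nts_iteration(source_words, q, m, l, d, max_draws=100000):
        """Return an estimate of Q_{n+1} from K d-match events.

        source_words : list of K source words, each a tuple of m*l symbols in {0, 1}
        q            : dict mapping each m-length super-symbol (tuple) to its
                       probability under the current distribution Q_n
        d            : per-letter Hamming distortion threshold
        """
        super_symbols = list(q.keys())
        weights = [q[s] for s in super_symbols]
        type_sum = Counter()
        for x in source_words:
            for _ in range(max_draws):
                # Draw a codeword of l super-symbols i.i.d. from Q_n and flatten it.
                y = tuple(itertools.chain.from_iterable(
                    random.choices(super_symbols, weights=weights, k=l)))
                # d-match test: average Hamming distortion within threshold d.
                if sum(a != b for a, b in zip(x, y)) / (m * l) <= d:
                    # Accumulate the super-symbol type of the matching codeword.
                    for i in range(0, m * l, m):
                        type_sum[y[i:i + m]] += 1
                    break
        total = sum(type_sum.values())
        return {s: type_sum[s] / total for s in super_symbols} if total else dict(q)

    # Hypothetical usage: binary source, m = 2, l = 4, K = 50 source words.
    m, l, K, d = 2, 4, 50, 0.25
    q0 = {s: 0.25 for s in itertools.product((0, 1), repeat=m)}   # uniform Q_0
    src = [tuple(random.choice((0, 1)) for _ in range(m * l)) for _ in range(K)]
    q1 = nts_iteration(src, q0, m, l, d)   # one step of the sequence Q_0, Q_1, ...

Averaging the types of the d-matching codewords is the maximum-likelihood estimate of a distribution that is i.i.d. over super-symbols, which mirrors the abstract's "distribution most likely to produce the types" in this simplified setting; the paper's asymptotic guarantees (in K, n, m, and ℓ) are of course not captured by a finite sketch like this.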