TY - JOUR
T1 - Finite-memory universal prediction of individual sequences
AU - Meron, Eado
AU - Feder, Meir
N1 - Funding Information:
Manuscript received March 31, 2003; revised March 31, 2004. The work of E. Meron is supported in part by Intel Israel, and by the “Yitzhak and Chaya Weinstein Institute for Research in Signal Processing.” The material in this paper was presented in part at the IEEE International Symposium on Information Theory, Yokohama, Japan, June/July 2003 and at the Data Compression Conference, Snowbird, UT, March 2004.
PY - 2004/7
Y1 - 2004/7
N2 - The problem of predicting the next outcome of an individual binary sequence under the constraint that the universal predictor has a finite memory is explored. In this analysis, the finite-memory universal predictors are either deterministic or random time-invariant finite-state (FS) machines with K states (K-state machines). The paper provides bounds on the asymptotic achievable regret of these constrained universal predictors as a function of K, the number of their states, for long enough sequences. The specific results are as follows. When the universal predictors are deterministic machines, the comparison class consists of constant predictors, and prediction is with respect to the 0-1 loss function (Hamming distance), we get tight bounds indicating that the optimal asymptotic regret is 1/(2K). In the case of K-state deterministic universal predictors and the same comparison class of constant predictors, but with prediction with respect to the self-information (code length) and the square-error loss functions, we show an upper bound on the regret (coding redundancy) of O(K^{-2/3}) and a lower bound of Θ(K^{-4/5}). For these loss functions, if the predictor is allowed to be a random K-state machine, i.e., a machine with random state transitions, we get a lower bound of Θ(1/K) on the regret, with a matching upper bound of O(1/K) for the square-error loss, and an upper bound of O((log K)/K) for the self-information loss. In addition, we provide results for all these loss functions in the case where the comparison class consists of all predictors that are order-L Markov machines.
AB - The problem of predicting the next outcome of an individual binary sequence under the constraint that the universal predictor has a finite memory is explored. In this analysis, the finite-memory universal predictors are either deterministic or random time-invariant finite-state (FS) machines with K states (K-state machines). The paper provides bounds on the asymptotic achievable regret of these constrained universal predictors as a function of K, the number of their states, for long enough sequences. The specific results are as follows. When the universal predictors are deterministic machines, the comparison class consists of constant predictors, and prediction is with respect to the 0-1 loss function (Hamming distance), we get tight bounds indicating that the optimal asymptotic regret is 1/(2K). In the case of K-state deterministic universal predictors and the same comparison class of constant predictors, but with prediction with respect to the self-information (code length) and the square-error loss functions, we show an upper bound on the regret (coding redundancy) of O(K^{-2/3}) and a lower bound of Θ(K^{-4/5}). For these loss functions, if the predictor is allowed to be a random K-state machine, i.e., a machine with random state transitions, we get a lower bound of Θ(1/K) on the regret, with a matching upper bound of O(1/K) for the square-error loss, and an upper bound of O((log K)/K) for the self-information loss. In addition, we provide results for all these loss functions in the case where the comparison class consists of all predictors that are order-L Markov machines.
KW - Exponentially decaying memory
KW - FS prediction
KW - Finite-state (FS) machines
KW - Imaginary sliding window
KW - Saturated counter (SC)
KW - Universal coding
KW - Universal prediction
UR - http://www.scopus.com/inward/record.url?scp=3042515352&partnerID=8YFLogxK
U2 - 10.1109/TIT.2004.830749
DO - 10.1109/TIT.2004.830749
M3 - Article
AN - SCOPUS:3042515352
SN - 0018-9448
VL - 50
SP - 1506
EP - 1523
JO - IEEE Transactions on Information Theory
JF - IEEE Transactions on Information Theory
IS - 7
ER -