Abstract
The problem of assigning a probability to the next outcome of an individual binary sequence, under the constraint that the universal predictor has a finite number of states, is explored. The two main loss functions considered are the square error loss and the self-information loss. Universal prediction with respect to the self-information loss can be combined with arithmetic coding (see [9]) to construct a universal encoder, so we essentially explore the universal coding problem. We analyze the performance of randomized, time-invariant, K-state universal predictors and provide performance bounds in terms of the number of states K for sufficiently long sequences. In the case where the comparison class consists of the constant predictors, we provide, for the square error loss, tight bounds indicating that the optimal asymptotic expected redundancy is O(1/K). For the self-information loss we show an upper bound of O(log K/K) on the coding redundancy and a lower bound of O(1/K).
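For reference, the two loss functions and the notion of redundancy against the constant comparison class can be stated as follows. This is a hedged sketch using the standard definitions for binary prediction; the paper's exact normalization and notation may differ.

For a predicted probability $p_t \in [0,1]$ of the event $x_t = 1$ and an outcome $x_t \in \{0,1\}$,

$$
\ell_{\mathrm{sq}}(p_t, x_t) = (x_t - p_t)^2, \qquad
\ell_{\mathrm{log}}(p_t, x_t) = -x_t \log p_t - (1 - x_t)\log(1 - p_t),
$$

and the per-symbol redundancy of a predictor on a sequence $x_1, \ldots, x_n$ is its excess average loss over the best constant predictor:

$$
R_n = \frac{1}{n}\sum_{t=1}^{n} \ell(p_t, x_t) \;-\; \min_{q \in [0,1]} \frac{1}{n}\sum_{t=1}^{n} \ell(q, x_t).
$$

Under the self-information loss, the best constant predictor attains the empirical binary entropy of the sequence, which is why this redundancy corresponds (via arithmetic coding) to coding redundancy.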
| Original language | English |
| --- | --- |
| Pages (from-to) | 332-341 |
| Number of pages | 10 |
| Journal | Proceedings of the Data Compression Conference |
| State | Published - 2004 |
| Event | DCC 2004 Data Compression Conference, Snowbird, UT, United States. Duration: 23 Mar 2004 → 25 Mar 2004 |