TY - GEN
T1 - On Multiple and Hierarchical Universality
AU - Fogel, Yaniv
AU - Feder, Meir
N1 - Publisher Copyright:
© 2022 IEEE.
PY - 2022
Y1 - 2022
N2 - Universal coding, prediction and learning usually consider the case where the data generating mechanism is unknown or non-existent, and the goal of the universal scheme is to compete with the best hypothesis from a given hypothesis class, either on average or in a worst-case scenario. Multiple universality considers the case where the hypothesis class is also unknown: there are several hypothesis classes with possibly different complexities. In hierarchical universality, the simpler classes are nested within more complex classes. The main challenge is to correctly define the universality problem. We propose several possible definitions and derive their min-max optimal solutions. Interestingly, the proposed solutions can be used to obtain Elias codes for universal representation of the integers. We also utilize this approach for variable-memory Markov models, presenting a new interpretation of the known bound on the regret of the celebrated context-tree weighting algorithm and proposing a 3-part code that (slightly) outperforms it.
AB - Universal coding, prediction and learning usually consider the case where the data generating mechanism is unknown or non-existent, and the goal of the universal scheme is to compete with the best hypothesis from a given hypothesis class, either on average or in a worst-case scenario. Multiple universality considers the case where the hypothesis class is also unknown: there are several hypothesis classes with possibly different complexities. In hierarchical universality, the simpler classes are nested within more complex classes. The main challenge is to correctly define the universality problem. We propose several possible definitions and derive their min-max optimal solutions. Interestingly, the proposed solutions can be used to obtain Elias codes for universal representation of the integers. We also utilize this approach for variable-memory Markov models, presenting a new interpretation of the known bound on the regret of the celebrated context-tree weighting algorithm and proposing a 3-part code that (slightly) outperforms it.
UR - http://www.scopus.com/inward/record.url?scp=85136246629&partnerID=8YFLogxK
U2 - 10.1109/ISIT50566.2022.9834738
DO - 10.1109/ISIT50566.2022.9834738
M3 - Conference contribution
AN - SCOPUS:85136246629
T3 - IEEE International Symposium on Information Theory - Proceedings
SP - 3013
EP - 3018
BT - 2022 IEEE International Symposium on Information Theory, ISIT 2022
PB - Institute of Electrical and Electronics Engineers Inc.
Y2 - 26 June 2022 through 1 July 2022
ER -