Redundancy capacity theorem for on-line learning under a certain form of hypotheses class

Shachar Shayovitz, Meir Feder

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

In this paper we consider the problem of online learning in the stochastic setting under a certain form of hypotheses class. We prove an equivalence between the minimax redundancy and capacity of the channel between the class parameters and the labels conditioned on the data features (side information). Our proof extends Gallager's Redundancy Capacity theorem for universal prediction to on-line learning with the considered form of hypotheses class. Moreover, this result confirms the optimality of previous ad-hoc universal learners, or universal predictors with side information, but more importantly, extends these previous results to more general hypotheses classes.
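The equivalence described above can be sketched in standard universal-prediction notation; the symbols below are illustrative assumptions based on the abstract, not necessarily the paper's own notation. Here $x^n$ are the data features (side information), $y^n$ the labels, $\theta$ the class parameter, and $w(\theta)$ a prior on the class:

```latex
% Minimax redundancy of on-line learning with side information:
% worst-case extra log-loss of the best universal predictor q
% over the hypotheses class {p_theta}.
R_n^{*} \;=\; \min_{q(\cdot \mid x^n)} \; \max_{\theta \in \Theta}
  \, D\!\left( p_\theta(y^n \mid x^n) \,\middle\|\, q(y^n \mid x^n) \right)

% Redundancy-capacity equivalence: R_n^* equals the capacity of the
% channel from the parameter Theta to the labels Y^n, conditioned on
% the features X^n, maximized over priors w on Theta.
R_n^{*} \;=\; \max_{w(\theta)} \; I\!\left( \Theta ; Y^n \mid X^n \right)
  \;=\; C_n
```

The maximizing prior $w^*(\theta)$ plays the same role as the capacity-achieving input distribution in Gallager's original redundancy-capacity theorem; the conditioning on $X^n$ is what extends that result to the learning setting with features.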

Original language: English
Title of host publication: 2018 IEEE Information Theory Workshop, ITW 2018
Publisher: Institute of Electrical and Electronics Engineers Inc.
ISBN (Electronic): 9781538635995
State: Published - 15 Jan 2019
Event: 2018 IEEE Information Theory Workshop, ITW 2018 - Guangzhou, China
Duration: 25 Nov 2018 - 29 Nov 2018

Publication series

Name: 2018 IEEE Information Theory Workshop, ITW 2018

Conference

Conference: 2018 IEEE Information Theory Workshop, ITW 2018
Country/Territory: China
City: Guangzhou
Period: 25/11/18 - 29/11/18
