TY - GEN

T1 - Error Exponent in Agnostic PAC Learning

AU - Hendel, Adi

AU - Feder, Meir

N1 - Publisher Copyright:
© 2024 IEEE.

PY - 2024

Y1 - 2024

N2 - Statistical learning theory and the Probably Approximately Correct (PAC) criterion are the common approach to mathematical learning theory. PAC is widely used to analyze learning problems and algorithms, and has been studied thoroughly. Uniform worst-case bounds on the convergence rate have been well established using, e.g., VC theory or Rademacher complexity. However, in a typical scenario the performance could be much better. In this paper, we consider PAC learning using a somewhat different tradeoff, the error exponent, a well-established analysis method in Information Theory, which describes the exponential behavior of the probability that the risk will exceed a certain threshold as a function of the sample size. We focus on binary classification and find, under some stability assumptions, an improved distribution-dependent error exponent for a wide range of problems, establishing the exponential behavior of the PAC error probability in agnostic learning. Interestingly, under these assumptions, agnostic learning may have the same error exponent as realizable learning. The error exponent criterion can be applied to analyze knowledge distillation, a problem that so far lacks a theoretical analysis.

AB - Statistical learning theory and the Probably Approximately Correct (PAC) criterion are the common approach to mathematical learning theory. PAC is widely used to analyze learning problems and algorithms, and has been studied thoroughly. Uniform worst-case bounds on the convergence rate have been well established using, e.g., VC theory or Rademacher complexity. However, in a typical scenario the performance could be much better. In this paper, we consider PAC learning using a somewhat different tradeoff, the error exponent, a well-established analysis method in Information Theory, which describes the exponential behavior of the probability that the risk will exceed a certain threshold as a function of the sample size. We focus on binary classification and find, under some stability assumptions, an improved distribution-dependent error exponent for a wide range of problems, establishing the exponential behavior of the PAC error probability in agnostic learning. Interestingly, under these assumptions, agnostic learning may have the same error exponent as realizable learning. The error exponent criterion can be applied to analyze knowledge distillation, a problem that so far lacks a theoretical analysis.

UR - http://www.scopus.com/inward/record.url?scp=85202795789&partnerID=8YFLogxK

U2 - 10.1109/ISIT57864.2024.10619319

DO - 10.1109/ISIT57864.2024.10619319

M3 - Conference contribution

AN - SCOPUS:85202795789

T3 - IEEE International Symposium on Information Theory - Proceedings

SP - 765

EP - 770

BT - 2024 IEEE International Symposium on Information Theory, ISIT 2024 - Proceedings

PB - Institute of Electrical and Electronics Engineers Inc.

T2 - 2024 IEEE International Symposium on Information Theory, ISIT 2024

Y2 - 7 July 2024 through 12 July 2024

ER -