TY - JOUR

T1 - Submultiplicative Glivenko-Cantelli and uniform convergence of revenues

AU - Alon, Noga

AU - Babaioff, Moshe

AU - Gonczarowski, Yannai A.

AU - Mansour, Yishay

AU - Moran, Shay

AU - Yehudayoff, Amir

N1 - Publisher Copyright:
© 2017 Neural information processing systems foundation. All rights reserved.

PY - 2017

Y1 - 2017

N2 - In this work we derive a variant of the classic Glivenko-Cantelli Theorem, which asserts uniform convergence of the empirical Cumulative Distribution Function (CDF) to the CDF of the underlying distribution. Our variant allows for tighter convergence bounds for extreme values of the CDF. We apply our bound in the context of revenue learning, which is a well-studied problem in economics and algorithmic game theory. We derive sample-complexity bounds on the uniform convergence rate of the empirical revenues to the true revenues, assuming a bound on the kth moment of the valuations, for any (possibly fractional) k > 1. For uniform convergence in the limit, we give a complete characterization and a zero-one law: if the first moment of the valuations is finite, then uniform convergence almost surely occurs; conversely, if the first moment is infinite, then uniform convergence almost never occurs.

AB - In this work we derive a variant of the classic Glivenko-Cantelli Theorem, which asserts uniform convergence of the empirical Cumulative Distribution Function (CDF) to the CDF of the underlying distribution. Our variant allows for tighter convergence bounds for extreme values of the CDF. We apply our bound in the context of revenue learning, which is a well-studied problem in economics and algorithmic game theory. We derive sample-complexity bounds on the uniform convergence rate of the empirical revenues to the true revenues, assuming a bound on the kth moment of the valuations, for any (possibly fractional) k > 1. For uniform convergence in the limit, we give a complete characterization and a zero-one law: if the first moment of the valuations is finite, then uniform convergence almost surely occurs; conversely, if the first moment is infinite, then uniform convergence almost never occurs.

UR - http://www.scopus.com/inward/record.url?scp=85046999360&partnerID=8YFLogxK

M3 - Conference article

AN - SCOPUS:85046999360

VL - 2017-December

SP - 1657

EP - 1666

JO - Advances in Neural Information Processing Systems

JF - Advances in Neural Information Processing Systems

SN - 1049-5258

Y2 - 4 December 2017 through 9 December 2017

ER -