Bregman divergence bounds and universality properties of the logarithmic loss

Amichai Painsky, Gregory W. Wornell

Research output: Contribution to journal › Article › peer-review



A loss function measures the discrepancy between the true values and their estimated fits, for a given instance of data. In classification problems, a loss function is said to be proper if a minimizer of the expected loss is the true underlying probability. We show that for binary classification, the divergence associated with smooth, proper, and convex loss functions is upper bounded by the Kullback-Leibler (KL) divergence, to within a normalization constant. This implies that by minimizing the logarithmic loss associated with the KL divergence, we minimize an upper bound on any choice of loss from this set. As such, the logarithmic loss is universal in the sense of providing performance guarantees with respect to a broad class of accuracy measures. Importantly, this notion of universality is not problem-specific, enabling its use in diverse applications, including predictive modeling, data clustering, and sample complexity analysis. Generalizations to arbitrary finite alphabets are also developed. The derived inequalities extend several well-known f-divergence results.
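The simplest instance of the kind of bound described above can be checked numerically. For binary outcomes, the Bregman divergence associated with the squared (Brier) loss reduces to (p − q)², and Pinsker's inequality (KL ≥ 2·TV², with KL in nats) gives (p − q)² ≤ KL(p‖q)/2. The sketch below verifies this on a grid; the constant 1/2 comes from Pinsker's inequality for this particular loss and need not match the paper's normalization constant for other losses.

```python
import math

def kl_binary(p, q):
    """KL divergence (in nats) between Bernoulli(p) and Bernoulli(q)."""
    return p * math.log(p / q) + (1 - p) * math.log((1 - p) / (1 - q))

def brier_divergence(p, q):
    """Bregman divergence of the squared (Brier) loss for binary outcomes:
    expected Brier loss of predicting q when the true probability is p,
    minus its minimum over q. Algebraically this equals (p - q)**2."""
    return p * (1 - q) ** 2 + (1 - p) * q ** 2 - p * (1 - p)

# Pinsker-type bound: d_Brier(p, q) <= KL(p||q) / 2 on an interior grid.
grid = [i / 100 for i in range(1, 100)]
assert all(brier_divergence(p, q) <= kl_binary(p, q) / 2 + 1e-12
           for p in grid for q in grid)
```

Minimizing the logarithmic loss (whose associated divergence is KL) therefore also drives this squared-loss divergence to zero, which is the universality property in miniature.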

Original language: English
Article number: 8930624
Pages (from-to): 1658-1673
Number of pages: 16
Journal: IEEE Transactions on Information Theory
Issue number: 3
State: Published - Mar 2020


Funders:
    • National Science Foundation (CCF-1717610)
    • Directorate for Computer and Information Science and Engineering (1717610)


    • Bregman divergences
    • Kullback-Leibler (KL) divergence
    • Pinsker inequality
    • logarithmic loss


