A Comparative Analysis of Discrete Entropy Estimators for Large-Alphabet Problems

Assaf Pinchas*, Irad Ben-Gal, Amichai Painsky

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review


This paper presents a comparative study of entropy estimation in the large-alphabet regime. A variety of entropy estimators have been proposed over the years, each designed for a different setup and carrying its own strengths and caveats. As a consequence, no single estimator is known to be universally better than the others. This work addresses this gap by comparing twenty-one entropy estimators in the studied regime, from the simplest plug-in estimator to the most recent neural-network-based and polynomial approximation estimators. Our findings show that the estimators’ performance depends strongly on the underlying distribution. Specifically, we distinguish between three classes of distributions, ranging from uniform to degenerate, and recommend the most suitable estimator for each class. Further, we propose a sample-dependent approach, which again considers three classes of distributions, and report the top-performing estimators in each class. This approach provides a data-dependent framework for choosing the desired estimator in practical setups.
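The plug-in estimator mentioned in the abstract is the baseline that the compared methods improve upon: it simply substitutes the empirical distribution into the entropy formula. A minimal sketch (this is the textbook maximum-likelihood estimator, not any of the paper's more advanced estimators; the function name is illustrative):

```python
import math
from collections import Counter

def plugin_entropy(samples):
    """Plug-in (maximum-likelihood) entropy estimate in nats.

    Computes H(p_hat) = -sum_i p_hat_i * log(p_hat_i), where p_hat is
    the empirical distribution of the observed samples. Symbols never
    observed contribute nothing, which is why this estimator is known
    to be negatively biased when the alphabet is large relative to the
    sample size -- the regime the paper studies.
    """
    n = len(samples)
    counts = Counter(samples)
    return -sum((c / n) * math.log(c / n) for c in counts.values())

# Example: two equally frequent symbols give log(2) nats.
print(plugin_entropy(["a", "b", "a", "b"]))  # → 0.6931471805599453
```

With many samples from a uniform distribution over k symbols, the estimate approaches log(k); with few samples over a large alphabet it systematically undershoots, which motivates the corrected and approximation-based estimators the paper compares.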

Original language: English
Article number: 369
Issue number: 5
State: Published - May 2024


Funders:
    • Koret Foundation
    • Israel Science Foundation (963/21)


Keywords:
    • deterministic
    • discrete
    • empirical distribution
    • entropy estimation
    • high dimensions
    • uniform


