Generalized Good-Turing Improves Missing Mass Estimation

Amichai Painsky*

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review



Consider a finite sample from an unknown distribution over a countable alphabet. The missing mass is the probability of the symbols that do not appear in the sample. Estimating the missing mass is a basic problem in statistics and related fields, dating back to the early work of Laplace and the more recent seminal contribution of Good and Turing. In this article, we introduce a generalized Good-Turing (GT) framework for missing mass estimation. We derive an upper bound on the risk (in terms of mean squared error) and minimize it over the parameters of our framework. Our analysis distinguishes between two setups, depending on the (unknown) alphabet size. When the alphabet size is bounded from above, our risk bound demonstrates a significant improvement over currently known results (which are typically oblivious to the alphabet size). Based on this bound, we introduce a numerically obtained estimator that improves upon GT. When the alphabet size is unrestricted, we apply our suggested risk bound and introduce a closed-form estimator that again improves on the GT performance guarantees. Our suggested framework is easy to apply and requires no additional modeling assumptions, making it a favorable choice for practical applications. Supplementary materials for this article are available online.
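As background to the framework the abstract describes, the classic Good-Turing missing-mass estimator (which this article generalizes) estimates the total probability of unseen symbols by the fraction of the sample made up of singletons, i.e., symbols observed exactly once. A minimal Python sketch of that classic estimator follows; the function name and example data are illustrative, not taken from the article.

```python
from collections import Counter

def good_turing_missing_mass(sample):
    """Classic Good-Turing estimate of the missing mass.

    The estimate is N1 / n, where N1 is the number of symbols that
    appear exactly once in the sample and n is the sample size.
    """
    counts = Counter(sample)
    n1 = sum(1 for c in counts.values() if c == 1)  # number of singletons
    return n1 / len(sample)

# Example: 'c' and 'd' are singletons in a sample of 6 draws,
# so the estimated missing mass is 2/6.
print(good_turing_missing_mass(['a', 'a', 'b', 'b', 'c', 'd']))
```

The generalized framework of the article replaces this fixed rule with a parameterized family of estimators and chooses the parameters to minimize an upper bound on the mean squared error.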

Original language: English
Pages (from-to): 1890-1899
Number of pages: 10
Journal: Journal of the American Statistical Association
Issue number: 543
State: Published - 2023


Funders: Israel Science Foundation, grant 963/21


    • Categorical data analysis
    • Frequency of frequencies
    • Minimax estimation
    • Rule of succession


