Sampling, metric entropy, and dimensionality reduction

Dmitry Batenkov, Omer Friedland, Yosef Yomdin

Research output: Contribution to journal › Article › peer-review

2 Scopus citations

Abstract

Let Q be a relatively compact subset of a Hilbert space V. For a given ε > 0, let N(ε, Q) be the minimal number of linear measurements sufficient to reconstruct any x ∈ Q with accuracy ε. We call N(ε, Q) the sampling ε-entropy of Q. Using dimensionality reduction, as provided by the Johnson-Lindenstrauss lemma, we show that, in an appropriate probabilistic setting, N(ε, Q) is bounded from above by Kolmogorov's ε-entropy H(ε, Q), defined as H(ε, Q) = log M(ε, Q), where M(ε, Q) is the minimal number of ε-balls covering Q. As the main application, we show that piecewise smooth (piecewise analytic) functions in one and several variables can be sampled with essentially the same accuracy rate as their regular counterparts. For univariate piecewise C^k-smooth functions this result, which settles the so-called Eckhoff conjecture, was recently established in D. Batenkov, Complete Algebraic Reconstruction of Piecewise-smooth Functions from Fourier Data, arXiv:1211.0680, 2012, via a deterministic "algebraic reconstruction" algorithm.
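The Johnson-Lindenstrauss dimensionality reduction invoked in the abstract can be illustrated with a short NumPy sketch. The sizes, seed, and Gaussian projection matrix below are illustrative choices, not taken from the paper; the point is only that random linear measurements nearly preserve all pairwise distances when the target dimension k is large relative to log n:

```python
import numpy as np

# Illustrative sizes (not from the paper): 50 points in R^1000 projected to R^300.
n, d, k = 50, 1000, 300
rng = np.random.default_rng(0)

X = rng.standard_normal((n, d))               # point cloud in the high-dimensional space
A = rng.standard_normal((k, d)) / np.sqrt(k)  # Gaussian JL projection: E||Ax||^2 = ||x||^2
Y = X @ A.T                                   # projected points in R^k

def pairwise_dists(Z):
    """All pairwise Euclidean distances via broadcasting."""
    diff = Z[:, None, :] - Z[None, :, :]
    return np.sqrt((diff ** 2).sum(axis=-1))

mask = ~np.eye(n, dtype=bool)                 # exclude zero self-distances
ratios = pairwise_dists(Y)[mask] / pairwise_dists(X)[mask]

# Per the JL lemma, these ratios concentrate near 1 once k >> log n.
print(f"distortion range: [{ratios.min():.3f}, {ratios.max():.3f}]")
```

In the paper's setting, the k rows of A play the role of the linear measurements, and the JL guarantee is what lets the sampling ε-entropy N(ε, Q) be controlled by the covering number M(ε, Q).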

Original language: English
Pages (from-to): 786-796
Number of pages: 11
Journal: SIAM Journal on Mathematical Analysis
Volume: 47
Issue number: 1
DOIs
State: Published - 2015
Externally published: Yes

Funding

Funder: Israel Science Foundation, grant number 639/09

Keywords

• Dimensionality reduction
• Johnson-Lindenstrauss lemma
• Metric entropy
• Sampling
