Private learning of halfspaces: Simplifying the construction & reducing the sample complexity

Research output: Contribution to journal › Conference article › peer-review

9 Scopus citations


We present a differentially private learner for halfspaces over a finite grid G in R^d with sample complexity ≈ d^{2.5} · 2^{log*|G|}, which improves the state-of-the-art result of [Beimel et al., COLT 2019] by a d^2 factor. The building block for our learner is a new differentially private algorithm for approximately solving the linear feasibility problem: Given a feasible collection of m linear constraints of the form Ax ≥ b, the task is to privately identify a solution x that satisfies most of the constraints. Our algorithm is iterative, where each iteration determines the next coordinate of the constructed solution x.
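The paper's iterative, coordinate-by-coordinate construction is involved, but the underlying task — privately finding a point that satisfies most of a feasible set of linear constraints over a finite grid — can be illustrated with a much simpler baseline: the classic exponential mechanism, scoring each grid point by how many constraints it satisfies. This is a toy sketch, not the paper's algorithm, and the names (`private_feasibility`, `satisfied_count`) are illustrative:

```python
import itertools
import math
import random

def satisfied_count(A, b, x):
    """Count how many linear constraints <a_i, x> >= b_i the point x satisfies."""
    return sum(1 for a_i, b_i in zip(A, b)
               if sum(a * v for a, v in zip(a_i, x)) >= b_i)

def private_feasibility(A, b, grid, eps, rng):
    """Toy private feasibility solver over the finite grid G^d via the
    exponential mechanism: sample a candidate point with probability
    proportional to exp(eps * score / 2).  The score (number of satisfied
    constraints) changes by at most 1 when one constraint is added or
    removed, so this sampling is eps-differentially private."""
    d = len(A[0])
    candidates = list(itertools.product(grid, repeat=d))
    scores = [satisfied_count(A, b, x) for x in candidates]
    top = max(scores)  # subtract the max before exponentiating, for stability
    weights = [math.exp(eps * (s - top) / 2.0) for s in scores]
    total = sum(weights)
    probs = [w / total for w in weights]
    x = rng.choices(candidates, weights=probs, k=1)[0]
    return x, candidates, probs

# Three copies of the constraint x1 + x2 >= 1.5; on the grid {-1, 0, 1}^2
# only the point (1, 1) satisfies them, so with a generous eps the mechanism
# concentrates almost all probability mass there.
A = [[1.0, 1.0], [1.0, 1.0], [1.0, 1.0]]
b = [1.5, 1.5, 1.5]
x, candidates, probs = private_feasibility(A, b, [-1.0, 0.0, 1.0],
                                           eps=20.0, rng=random.Random(0))
```

Note the cost of this baseline: it enumerates all |G|^d grid points, which is exponential in the dimension — exactly the kind of blow-up the paper's iterative, one-coordinate-at-a-time construction avoids.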

Original language: English
Journal: Advances in Neural Information Processing Systems
State: Published - 2020
Event: 34th Conference on Neural Information Processing Systems, NeurIPS 2020 - Virtual, Online
Duration: 6 Dec 2020 – 12 Dec 2020


Funders and funder numbers:
Horizon 2020 Framework Programme: 882396, 1871/19, 993/17
Blavatnik Family Foundation
European Research Council
German-Israeli Foundation for Scientific Research and Development: 1367/2017
Israel Science Foundation: 1595/19

