Distributed learning, communication complexity and privacy

Maria Florina Balcan, Avrim Blum, Shai Fine, Yishay Mansour

Research output: Contribution to journal · Conference article · peer-review

Abstract

We consider the problem of PAC-learning from distributed data and analyze fundamental communication complexity questions involved. We provide general upper and lower bounds on the amount of communication needed to learn well, showing that in addition to VC-dimension and covering number, quantities such as the teaching-dimension and mistake-bound of a class play an important role. We also present tight results for a number of common concept classes including conjunctions, parity functions, and decision lists. For linear separators, we show that for non-concentrated distributions, we can use a version of the Perceptron algorithm to learn with much less communication than the number of updates given by the usual margin bound. We also show how boosting can be performed in a generic manner in the distributed setting to achieve communication with only logarithmic dependence on 1/ε for any concept class, and demonstrate how recent work on agnostic learning from class-conditional queries can be used to achieve low communication in agnostic settings as well. We additionally present an analysis of privacy, considering both differential privacy and a notion of distributional privacy that is especially appealing in this context.
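To illustrate the communication pattern the abstract alludes to, here is a minimal sketch (not the paper's algorithm, and with hypothetical names) of a distributed Perceptron: each party holds its own labeled data, and only mistake-driven weight updates are broadcast, so the total communication is bounded by the number of Perceptron updates rather than the dataset size.

```python
import numpy as np

def distributed_perceptron(parties, dim, max_rounds=100):
    """Sketch of a distributed Perceptron.

    parties: list of (X, y) pairs, one per party, with labels in {+1, -1}.
    Only mistake-driven updates to the shared weight vector w are
    'broadcast', so communication is proportional to the mistake bound.
    """
    w = np.zeros(dim)
    for _ in range(max_rounds):
        updated = False
        for X, y in parties:
            for x_i, y_i in zip(X, y):
                if y_i * np.dot(w, x_i) <= 0:  # mistake on x_i
                    w = w + y_i * x_i          # this update is broadcast
                    updated = True
        if not updated:  # no party made a mistake: all data consistent
            return w
    return w
```

On linearly separable data the number of broadcast updates is bounded by the classical (R/γ)² margin bound; the paper's contribution, per the abstract, is showing that under non-concentrated distributions far less communication suffices.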

Original language: English
Pages (from-to): 26.1-26.22
Journal: Journal of Machine Learning Research
Volume: 23
State: Published - 2012
Event: 25th Annual Conference on Learning Theory, COLT 2012 - Edinburgh, United Kingdom
Duration: 25 Jun 2012 - 27 Jun 2012

Keywords

  • Communication complexity
  • Distributed learning
  • Privacy
