On Communication Complexity of Classification Problems

Daniel Kane, Roi Livni, Shay Moran, Amir Yehudayoff

Research output: Contribution to journal › Conference article › Peer-review


Abstract

This work studies distributed learning in the spirit of Yao’s model of communication complexity: consider a two-party setting, where each of the players gets a list of labelled examples and they communicate in order to jointly perform some learning task. To fit naturally into the framework of learning theory, the players can send each other examples (as well as bits), where each example/bit costs one unit of communication. This enables a uniform treatment of infinite classes such as half-spaces in R^d, which are ubiquitous in machine learning. We study several fundamental questions in this model. For example, we provide combinatorial characterizations of the classes that can be learned with efficient communication in the proper case as well as in the improper case. These findings imply unconditional separations in this context between various learning tasks, e.g., realizable versus agnostic learning and proper versus improper learning. The derivation of these results hinges on a type of decision problem we term “realizability problems,” where the goal is to decide whether a distributed input sample is consistent with a hypothesis from a pre-specified class. From a technical perspective, the protocols we devise (i.e., the upper bounds) are based on ideas from machine learning, and the impossibility results (i.e., the lower bounds) are based on ideas from communication complexity.
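To make the notion of a realizability problem concrete, here is a minimal sketch, not taken from the paper, for the simplest infinite class: thresholds on the real line, where h_t(x) = 1 iff x >= t. The function names and the specific protocol below are illustrative assumptions; the point is that each party can summarize its share with at most two examples, so a constant number of examples of communication suffice for this class.

```python
# Toy two-party realizability check for threshold classifiers on the line.
# A joint labelled sample is consistent with some threshold h_t iff every
# negatively labelled point lies strictly below every positively labelled one.
# This is an illustrative sketch, not the paper's protocol.

from typing import List, Tuple

Sample = List[Tuple[float, int]]  # (point, label) with label in {0, 1}

def local_summary(sample: Sample) -> Tuple[float, float]:
    """Return (largest negative point, smallest positive point) in this
    party's share; +/- infinity stand in when a label is absent."""
    max_neg = max((x for x, y in sample if y == 0), default=float("-inf"))
    min_pos = min((x for x, y in sample if y == 1), default=float("inf"))
    return max_neg, min_pos

def realizable_by_threshold(alice: Sample, bob: Sample) -> bool:
    """Alice sends her two summary examples (cost: two examples); Bob
    combines them with his own summary and decides realizability."""
    a_neg, a_pos = local_summary(alice)
    b_neg, b_pos = local_summary(bob)
    # Some threshold t separates the joint sample iff all negatives
    # precede all positives, i.e. max_neg < t <= min_pos is satisfiable.
    return max(a_neg, b_neg) < min(a_pos, b_pos)

if __name__ == "__main__":
    alice = [(0.1, 0), (0.9, 1)]
    bob = [(0.4, 0), (0.7, 1)]
    print(realizable_by_threshold(alice, bob))  # True: any t in (0.4, 0.7]
```

For richer classes such as half-spaces in R^d, deciding distributed realizability is more involved, and the paper's upper and lower bounds quantify exactly how much communication such problems require.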

Original language: English
Pages (from-to): 1903-1943
Number of pages: 41
Journal: Proceedings of Machine Learning Research
Volume: 99
State: Published - 2019
Event: 32nd Conference on Learning Theory, COLT 2019 - Phoenix, United States
Duration: 25 Jun 2019 - 28 Jun 2019
