TY - JOUR
T1 - On Communication Complexity of Classification Problems
AU - Kane, Daniel
AU - Livni, Roi
AU - Moran, Shay
AU - Yehudayoff, Amir
N1 - Publisher Copyright:
© 2019 D. Kane, R. Livni, S. Moran & A. Yehudayoff.
PY - 2019
Y1 - 2019
N2 - This work studies distributed learning in the spirit of Yao’s model of communication complexity: consider a two-party setting, where each of the players gets a list of labelled examples and they communicate in order to jointly perform some learning task. To fit naturally into the framework of learning theory, the players can send each other examples (as well as bits), where each example/bit costs one unit of communication. This enables a uniform treatment of infinite classes such as half-spaces in R^d, which are ubiquitous in machine learning. We study several fundamental questions in this model. For example, we provide combinatorial characterizations of the classes that can be learned with efficient communication in the proper case as well as in the improper case. These findings imply unconditional separations in this context between various learning tasks, e.g. realizable versus agnostic learning, proper versus improper learning, etc. The derivation of these results hinges on a type of decision problems we term “realizability problems”, where the goal is to decide whether a distributed input sample is consistent with a hypothesis from a pre-specified class. From a technical perspective, the protocols we devise (i.e. the upper bounds) are based on ideas from machine learning, and the impossibility results (i.e. the lower bounds) are based on ideas from communication complexity.
UR - http://www.scopus.com/inward/record.url?scp=85160813646&partnerID=8YFLogxK
M3 - Conference article
AN - SCOPUS:85160813646
SN - 2640-3498
VL - 99
SP - 1903
EP - 1943
JO - Proceedings of Machine Learning Research
JF - Proceedings of Machine Learning Research
T2 - 32nd Conference on Learning Theory, COLT 2019
Y2 - 25 June 2019 through 28 June 2019
ER -