Carnegie Mellon University

Distributed Learning, Communication Complexity and Privacy

Journal contribution, posted on 2012-06-01, authored by Maria-Florina Balcan, Avrim Blum, Shai Fine, Yishay Mansour

We consider the problem of PAC-learning from distributed data and analyze fundamental communication complexity questions involved. We provide general upper and lower bounds on the amount of communication needed to learn well, showing that in addition to VC-dimension and covering number, quantities such as the teaching dimension and mistake bound of a class play an important role. We also present tight results for a number of common concept classes, including conjunctions, parity functions, and decision lists. For linear separators, we show that for non-concentrated distributions, we can use a version of the Perceptron algorithm to learn with much less communication than the number of updates given by the usual margin bound. We also show how boosting can be performed in a generic manner in the distributed setting to achieve communication with only logarithmic dependence on 1/ε for any concept class, and demonstrate how recent work on agnostic learning from class-conditional queries can be used to achieve low communication in agnostic settings as well. We additionally present an analysis of privacy, considering both differential privacy and a notion of distributional privacy that is especially appealing in this context.
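As a hedged illustration of the distributed setting these results concern, the sketch below shows one generic way a coordinator and several parties might run a Perceptron while counting communication in examples exchanged rather than in raw data shipped. The coordination scheme and all names here (party_data, max_rounds, the one-counterexample-per-round rule) are illustrative assumptions, not the specific protocol or analysis from the paper.

```python
# Minimal sketch (assumed, not the paper's protocol): a coordinator holds the
# global hypothesis w; each party checks w against its own local data and, if
# it finds a mistake, sends back a single misclassified example. Communication
# is counted in examples sent, which equals the number of Perceptron updates.
import numpy as np

def distributed_perceptron(party_data, max_rounds=1000):
    """party_data: list of (X_i, y_i) pairs, one per party; labels y in {-1, +1}."""
    d = party_data[0][0].shape[1]
    w = np.zeros(d)                  # global hypothesis held by the coordinator
    messages = 0                     # communication: examples sent to the coordinator

    for _ in range(max_rounds):
        counterexample = None
        for X, y in party_data:      # coordinator broadcasts w; each party checks locally
            mistakes = np.nonzero(y * (X @ w) <= 0)[0]
            if mistakes.size > 0:    # party reports one misclassified example
                i = mistakes[0]
                counterexample = (X[i], y[i])
                messages += 1
                break
        if counterexample is None:   # w is consistent with every party's data
            break
        x, label = counterexample
        w = w + label * x            # standard Perceptron update at the coordinator
    return w, messages
```

In this toy accounting, the communication cost is the number of counterexamples exchanged, which is bounded by the Perceptron mistake bound; the paper's contribution is showing that for non-concentrated distributions one can do substantially better than this naive bound.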

Publisher Statement

© 2012 M.-F. Balcan, A. Blum, S. Fine & Y. Mansour

Date

2012-06-01
