Publication
JMLR

Sample complexity bounds on differentially private learning via communication complexity

Abstract

In this work we analyze the sample complexity of classification by differentially private algorithms. Differential privacy is a strong and well-studied notion of privacy introduced by Dwork et al. (2006) that ensures that the output of an algorithm leaks little information about the data point provided by any of the participating individuals. The sample complexity of private PAC and agnostic learning was studied in a number of prior works starting with Kasiviswanathan et al. (2011), but several basic questions remain open (Beimel et al., 2010; Chaudhuri and Hsu, 2011; Beimel et al., 2013a,b). Our main contribution is an equivalence between the sample complexity of differentially private learning of a concept class C (denoted SCDP(C)) and the randomized one-way communication complexity of the evaluation problem for concepts from C. Using this equivalence we prove the following bounds:

• SCDP(C) = Ω(LDim(C)), where LDim(C) is Littlestone's dimension, which characterizes the number of mistakes in the online mistake-bound learning model (Littlestone, 1987). This result implies that SCDP(C) differs from the VC dimension of C, resolving one of the main open questions from prior work.

• For any t, there exists a class C such that LDim(C) = 2 but SCDP(C) ≥ t.

• For any t, there exists a class C such that the sample complexity of (pure) α-differentially private PAC learning is Ω(t/α) but the sample complexity of the relaxed (α, β)-differentially private PAC learning is O(log(1/β)/α). This resolves an open problem from Beimel et al. (2013b).

We also obtain simpler proofs for a number of known related results. Our equivalence builds on a characterization of sample complexity by Beimel et al. (2013a), and our bounds rely on a number of known results from communication complexity.
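For reference, here is a minimal sketch of the privacy definitions the abstract relies on, in the standard formulation of Dwork et al. (2006); the notation below is the generic one and is not taken from this paper. A randomized algorithm M is α-differentially private (the "pure" notion) if for every pair of datasets S and S' differing in the record of a single individual, and every set E of possible outputs,

    % pure alpha-differential privacy
    \Pr[M(S) \in E] \;\le\; e^{\alpha} \cdot \Pr[M(S') \in E].

The relaxed (α, β)-differential privacy allows an additive failure probability β:

    % approximate (alpha, beta)-differential privacy
    \Pr[M(S) \in E] \;\le\; e^{\alpha} \cdot \Pr[M(S') \in E] + \beta.

With this terminology, SCDP(C) is the smallest number of labeled examples with which some pure differentially private algorithm can PAC learn the concept class C.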
