Publication
SIAM Journal on Computing
Paper

Sample complexity bounds on differentially private learning via communication complexity

Abstract

In this work we analyze the sample complexity of classification by differentially private algorithms. Differential privacy is a strong and well-studied notion of privacy, introduced by Dwork et al. [Lecture Notes in Comput. Sci. 3876, Springer, New York, 2006, pp. 265-284], that ensures that the output of an algorithm leaks little information about the data point provided by any of the participating individuals. The sample complexity of private probably approximately correct (PAC) and agnostic learning was studied in a number of prior works starting with Kasiviswanathan et al. [SIAM J. Comput., 40 (2011), pp. 793-826]. However, a number of basic questions remain open [A. Beimel, S. P. Kasiviswanathan, and K. Nissim, Lecture Notes in Comput. Sci. 5978, Springer, New York, 2010, pp. 437-454; K. Chaudhuri and D. Hsu, Proceedings of the Conference on Learning Theory, 2011, pp. 155-186; A. Beimel, K. Nissim, and U. Stemmer, Proceedings of the 4th Conference on Innovations in Theoretical Computer Science, 2013, pp. 97-110; A. Beimel, K. Nissim, and U. Stemmer, Proceedings of APPROX+RANDOM, 2013, pp. 363-378], most notably whether learning with privacy requires more samples than learning without privacy.

We show that the sample complexity of learning with (pure) differential privacy can be arbitrarily higher than the sample complexity of learning without the privacy constraint or the sample complexity of learning with approximate differential privacy. Our second contribution, and our main tool, is an equivalence between the sample complexity of (pure) differentially private learning of a concept class C, denoted SCDP(C), and the randomized one-way communication complexity of the evaluation problem for concepts from C. Using this equivalence we prove the following bounds:

(a) SCDP(C) = Ω(LDim(C)), where LDim(C) is Littlestone's dimension, which characterizes the number of mistakes in the online mistake-bound learning model [N. Littlestone, Machine Learning, 2 (1988), pp. 285-318]. Known bounds on LDim(C) then imply that SCDP(C) can be much higher than the Vapnik-Chervonenkis dimension of C.

(b) For any t, there exists a class C such that LDim(C) = 2 but SCDP(C) ≥ t.

(c) For any t, there exists a class C such that the sample complexity of (pure) α-differentially private PAC learning is Ω(t/α) but the sample complexity of approximate (α,β)-differentially private PAC learning is O(log(1/β)/α). This resolves an open problem from Beimel, Nissim, and Stemmer [Proceedings of APPROX+RANDOM, 2013, pp. 363-378].

© 2015 Society for Industrial and Applied Mathematics.
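For readability, bounds (a)-(c) can be restated compactly in LaTeX display form. The symbols C, t, α, β, SCDP(C), and LDim(C) are used exactly as in the abstract; the shorthands SC_{α-DP}(C) and SC_{(α,β)-DP}(C) are introduced here only for this display and denote the sample complexity of pure α-differentially private and approximate (α,β)-differentially private PAC learning of C, respectively.

% Compact restatement of bounds (a)-(c); notation as in the abstract,
% with SC_{α-DP} and SC_{(α,β)-DP} as ad hoc shorthands for the two sample complexities.
\begin{align*}
\text{(a)}\quad & \mathrm{SCDP}(C) = \Omega(\mathrm{LDim}(C)), \\
\text{(b)}\quad & \forall t\ \exists C:\ \mathrm{LDim}(C) = 2 \ \text{and}\ \mathrm{SCDP}(C) \ge t, \\
\text{(c)}\quad & \forall t\ \exists C:\ \mathrm{SC}_{\alpha\text{-DP}}(C) = \Omega(t/\alpha)
  \ \text{while}\ \mathrm{SC}_{(\alpha,\beta)\text{-DP}}(C) = O\!\left(\log(1/\beta)/\alpha\right).
\end{align*}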
