Publication
ITA 2014
Conference paper

Optimal bounds on approximation of submodular and XOS functions by juntas


Abstract

We investigate the approximability of several classes of real-valued functions by functions of a small number of variables (juntas). Our main results are tight bounds on the number of variables required to approximate a function f : {0, 1}^n → [0, 1] within ℓ2-error ε over the uniform distribution:

• If f is submodular, then it is ε-close to a function of O((1/ε²) log(1/ε)) variables. This is an exponential improvement over previously known results [1]. We note that Ω(1/ε²) variables are necessary even for linear functions.

• If f is fractionally subadditive (XOS), it is ε-close to a function of 2^{O(1/ε²)} variables. This result holds for all functions with low total ℓ1-influence and is a real-valued analogue of Friedgut's theorem for Boolean functions. We show that 2^{Ω(1/ε)} variables are necessary even for XOS functions.

As applications of these results, we provide learning algorithms over the uniform distribution. For XOS functions, we give a PAC learning algorithm that runs in time 2^{1/poly(ε)} · poly(n). For submodular functions, we give an algorithm in the more demanding PMAC learning model [2], which requires a multiplicative (1 + γ)-factor approximation with probability at least 1 − ε over the target distribution. Our uniform-distribution algorithm runs in time 2^{1/poly(γε)} · poly(n). This is the first algorithm in the PMAC model that can achieve a constant approximation factor arbitrarily close to 1 for all submodular functions (even over the uniform distribution). It relies crucially on our approximation-by-junta result. As follows from the lower bounds in [1], both of these algorithms are close to optimal. We also give applications to proper learning, testing, and agnostic learning with value queries of these classes. © 2014 IEEE.
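To make the approximation notion concrete, the following minimal Python sketch (illustrative only, not from the paper) estimates the ℓ2-error of approximating f by a junta on a coordinate set J. It uses the standard fact that the ℓ2-optimal approximation of f by a function of the coordinates in J is the conditional expectation E[f | x_J]. The example function f, the weights, and the helper names junta_value and l2_error are all assumptions chosen for illustration.

```python
import random

random.seed(0)  # reproducible Monte Carlo estimates

# Hypothetical example: a budget-additive function f : {0,1}^n -> [0,1].
# min(1, linear) is submodular, and the rapidly decaying weights make f
# well approximated by a junta on the first few coordinates.
def f(x):
    weights = [0.5 ** (i + 1) for i in range(len(x))]
    return min(1.0, sum(w for w, b in zip(weights, x) if b))

def junta_value(x, J, n, samples=200):
    """Estimate the L2-optimal junta on J at the point x, i.e. the
    conditional expectation E[f | x_J], by re-randomizing the
    coordinates outside J uniformly."""
    total = 0.0
    for _ in range(samples):
        y = [x[i] if i in J else random.randint(0, 1) for i in range(n)]
        total += f(y)
    return total / samples

def l2_error(J, n, samples=300):
    """Monte Carlo estimate of ||f - E[f | x_J]||_2 under the uniform
    distribution on {0,1}^n."""
    sq = 0.0
    for _ in range(samples):
        x = [random.randint(0, 1) for _ in range(n)]
        sq += (f(x) - junta_value(x, J, n)) ** 2
    return (sq / samples) ** 0.5

n = 10
for k in (1, 2, 4, 6):
    # Keep the k most influential coordinates; the error shrinks as k grows.
    print(k, round(l2_error(set(range(k)), n), 3))
```

This sketch only illustrates the ℓ2 junta-approximation measure that the paper's bounds are stated in; the paper's structural results and its PAC/PMAC learning algorithms are substantially more involved than this brute-force estimate.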

Date

09 Feb 2014

