Publication
FOCS 2013
Conference paper

Optimal bounds on approximation of submodular and XOS functions by juntas

Abstract

We investigate the approximability of several classes of real-valued functions by functions of a small number of variables (juntas). Our main results are tight bounds on the number of variables required to approximate a function f : {0,1}^n → [0,1] within ℓ₂-error ε over the uniform distribution:

• If f is submodular, then it is ε-close to a function of O((1/ε²) log(1/ε)) variables. This is an exponential improvement over previously known results [1]. We note that Ω(1/ε²) variables are necessary even for linear functions.

• If f is fractionally subadditive (XOS), it is ε-close to a function of 2^O(1/ε²) variables. This result holds for all functions with low total ℓ₁-influence and is a real-valued analogue of Friedgut's theorem for Boolean functions. We show that 2^Ω(1/ε²) variables are necessary even for XOS functions.

As applications of these results, we provide learning algorithms over the uniform distribution. For XOS functions, we give a PAC learning algorithm that runs in time 2^(1/poly(γε)) · poly(n). For submodular functions, we give an algorithm in the more demanding PMAC learning model [2], which requires a multiplicative (1 + γ)-factor approximation with probability at least 1 − ε over the target distribution. Our uniform-distribution algorithm runs in time 2^(1/poly(γε)) · poly(n). This is the first algorithm in the PMAC model that achieves a constant approximation factor arbitrarily close to 1 for all submodular functions (even over the uniform distribution); it relies crucially on our approximation-by-junta result. As the lower bounds in [1] show, both of these algorithms are close to optimal. We also give applications to proper learning, testing, and agnostic learning with value queries for these classes.

Copyright © 2013 by The Institute of Electrical and Electronics Engineers, Inc.
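
For reference, the notions the abstract relies on (these definitions are standard in the literature; the notation below is ours, not quoted from the paper):

% Submodularity (diminishing returns), identifying x ∈ {0,1}^n with the
% set S ⊆ [n] it indicates; for all S ⊆ T ⊆ [n] and i ∉ T:
\[
f(S \cup \{i\}) - f(S) \;\ge\; f(T \cup \{i\}) - f(T)
\]
% XOS (fractionally subadditive): a pointwise maximum of nonnegative
% linear functions:
\[
f(x) = \max_{j \in [m]} \sum_{i=1}^{n} w_{j,i}\, x_i, \qquad w_{j,i} \ge 0
\]
% ℓ2-error of an approximation g, over the uniform distribution U on {0,1}^n:
\[
\|f - g\|_2 = \Bigl(\mathbb{E}_{x \sim U}\bigl[(f(x) - g(x))^2\bigr]\Bigr)^{1/2}
\]
% A k-junta is a function that depends on at most k of the n coordinates.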

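The claim that Ω(1/ε²) variables are necessary even for linear functions can be checked by a short standard calculation (our illustration, not the paper's proof):

% Take k = 1/ε² and the linear function f(x) = (1/k) Σ_{i≤k} x_i ∈ [0,1].
% The best ℓ2-approximation depending only on the coordinates in a set J
% is the conditional expectation E[f | x_J]; each dropped independent bit
% contributes (1/k)^2 · Var(x_i) = 1/(4k^2) to the squared error, so for
% |J| ≤ k/2:
\[
\min_{g \text{ depending only on } x_J} \|f - g\|_2^2
  \;=\; \sum_{i \le k,\ i \notin J} \frac{\mathrm{Var}(x_i)}{k^2}
  \;\ge\; \frac{k/2}{4k^2} \;=\; \frac{1}{8k} \;=\; \frac{\varepsilon^2}{8}
\]
% Hence any junta achieving ℓ2-error below ε/√8 must retain more than
% k/2 = Ω(1/ε²) of the k relevant variables.
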
Date

01 Dec 2013
