# Approximate resilience, monotonicity, and the complexity of agnostic learning

## Abstract

A function f is d-resilient if all its Fourier coefficients of degree at most d are zero, i.e., f is uncorrelated with all low-degree parities. We study the notion of approximate resilience of Boolean functions, where we say that f is α-approximately d-resilient if f is α-close to a [-1, 1]-valued d-resilient function in ℓ1 distance. We show that approximate resilience essentially characterizes the complexity of agnostic learning of a concept class C over the uniform distribution. Roughly speaking, if all functions in a class C are far from being d-resilient, then C can be learned agnostically in time n^{O(d)}; conversely, if C contains a function close to being d-resilient, then agnostic learning of C in the statistical query (SQ) framework of Kearns has complexity at least n^{Ω(d)}. Focusing on monotone Boolean functions, we exhibit the existence of near-optimal α-approximately Ω̃(α√n)-resilient monotone functions for all α > 0. Prior to our work, it was conceivable even that every monotone function is Ω(1)-far from any 1-resilient function. Furthermore, we construct simple, explicit monotone functions based on Tribes and CycleRun that are close to highly resilient functions. Our constructions are based on general resilience analysis and amplification techniques we introduce. These structural results, together with the characterization, imply nearly optimal lower bounds for agnostic learning of monotone juntas, a natural variant of the well-studied junta learning problem. In particular, we show that no SQ algorithm can efficiently agnostically learn monotone κ-juntas for any κ = ω(1) and any constant error less than 1/2.
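To make the central definition concrete, here is a minimal brute-force sketch (not from the paper) that computes the Fourier coefficients f̂(S) = E_x[f(x)·χ_S(x)] of a Boolean function over {-1, 1}^n and checks d-resilience, i.e., that every coefficient of degree at most d vanishes. The `parity` example is illustrative: a parity on k variables has its only nonzero coefficient at degree k, so it is (k-1)-resilient but not k-resilient.

```python
from itertools import product, combinations

def fourier_coefficient(f, n, S):
    """Compute f_hat(S) = E_x[f(x) * chi_S(x)] exactly by
    enumerating all points of {-1, 1}^n (feasible for small n)."""
    total = 0.0
    for x in product([-1, 1], repeat=n):
        chi = 1
        for i in S:
            chi *= x[i]  # chi_S(x) = product of coordinates in S
        total += f(x) * chi
    return total / 2 ** n

def is_d_resilient(f, n, d):
    """f is d-resilient iff all Fourier coefficients of
    degree at most d (including the empty set, i.e., the mean) are zero."""
    for k in range(d + 1):
        for S in combinations(range(n), k):
            if abs(fourier_coefficient(f, n, S)) > 1e-9:
                return False
    return True

# Parity on 3 bits: its single nonzero coefficient sits at degree 3,
# so it is 2-resilient but not 3-resilient.
parity = lambda x: x[0] * x[1] * x[2]
print(is_d_resilient(parity, 3, 2))  # True
print(is_d_resilient(parity, 3, 3))  # False
```

This exponential-time enumeration is only meant to pin down the definition; the paper's results concern the structure of such coefficients, not their computation.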