Publication
SODA 2014
Conference paper

Fast algorithms for maximizing submodular functions

Abstract

There has been much progress recently on improved approximations for problems involving submodular objective functions, and many interesting techniques have been developed. However, the resulting algorithms are often slow and impractical. In this paper we develop algorithms that match the best known approximation guarantees, but with significantly improved running times, for maximizing a monotone submodular function f : 2^[n] → R+ subject to various constraints. As in previous work, we measure the number of oracle calls to the objective function, which is the dominating term in the running time.

Our first result is a simple algorithm that gives a (1 - 1/e - ε)-approximation for a cardinality constraint using O(n/ε log(n/ε)) queries, and a 1/(p + 2ℓ + 1 + ε)-approximation for the intersection of a p-system and ℓ knapsack (linear) constraints using O(n/ε² log²(n/ε)) queries. This is the first approximation for a p-system combined with linear constraints. (We also show that the factor of p cannot be improved for maximizing over a p-system.) The main idea behind these algorithms serves as a building block in our more sophisticated algorithms.

Our main result is a new variant of the continuous greedy algorithm, which interpolates between the classical greedy algorithm and a truly continuous algorithm. We show how this algorithm can be implemented for matroid and knapsack constraints using Õ(n²) oracle calls to the objective function. (Previous variants and alternative techniques were known to use at least Õ(n⁴) oracle calls.) This leads to an O(n²/ε⁴ log²(n/ε))-time (1 - 1/e - ε)-approximation for a matroid constraint. For a knapsack constraint, we develop a more involved (1 - 1/e - ε)-approximation algorithm that runs in time O(n² (1/ε log n)^poly(1/ε)).

Copyright © 2014 by the Society for Industrial and Applied Mathematics.
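To make the first result concrete, below is a minimal sketch of a decreasing-threshold greedy for a cardinality constraint in the spirit of the abstract's first algorithm. It is not the paper's verbatim pseudocode: the set-valued oracle interface f, the stopping threshold (ε/n)·d, and the decay factor (1 - ε) are assumptions following the standard presentation of threshold-based greedy methods.

```python
def threshold_greedy(f, ground_set, k, eps):
    """Sketch: maximize a monotone submodular f under |S| <= k.

    f is assumed to be a value oracle mapping a Python set to a float.
    Thresholds start at the best singleton value d and decay by (1 - eps)
    until they fall below (eps / n) * d.
    """
    n = len(ground_set)
    d = max(f({a}) for a in ground_set)  # largest single-element value

    S = set()
    w = d
    # Each round scans the ground set once; the threshold decreases
    # geometrically, giving roughly (1/eps) * log(n/eps) rounds.
    while w >= (eps / n) * d and len(S) < k:
        f_S = f(S)  # cache the current value for this round
        for a in ground_set:
            if a in S or len(S) >= k:
                continue
            if f(S | {a}) - f_S >= w:  # marginal gain clears the threshold
                S.add(a)
                f_S = f(S)
        w *= (1 - eps)
    return S
```

Each threshold round makes O(n) oracle calls, and the geometric decay of the threshold bounds the number of rounds by O((1/ε) log(n/ε)), which is where the O(n/ε log(n/ε)) query bound in the abstract comes from.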
