Publication
Discrete and Computational Geometry
Paper

Random Sampling with Removal

Abstract

We study randomized algorithms for constrained optimization, in abstract frameworks that include, in strictly increasing generality: convex programming; LP-type problems; violator spaces; and a setting we introduce, consistent spaces. Such algorithms typically involve a step of finding the optimal solution for a random sample of the constraints. They exploit the condition that, in finite dimension δ, this sample optimum violates a provably small expected fraction of the non-sampled constraints, with the fraction decreasing in the sample size r. We extend such algorithms by considering the technique of removal, where a fixed number k of constraints are removed from the sample according to a fixed rule, with the goal of improving the solution quality. This may have the effect of increasing the number of violated non-sampled constraints. We study this increase, and bound it in a variety of general settings. This work is motivated by, and extends, results on removal as proposed for chance-constrained optimization. For many relevant values of r, δ, and k, we prove matching upper and lower bounds for the expected number of constraints violated by a random sample, after the removal of k constraints. For a large range of values of k, the new upper bounds improve the previously best bounds for LP-type problems, which moreover had only been known in special cases, and not in the generality we consider. Moreover, we show that our results extend from finite to infinite spaces, for chance-constrained optimization.
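To make the sampling-with-removal step concrete, the following is a minimal toy sketch, not the paper's general framework: a one-dimensional problem "minimize x subject to x >= a_i for all i", whose combinatorial dimension is 1, with the removal rule "drop the k largest sampled constraints". The function name expected_violations, the instance, and this particular removal rule are illustrative assumptions; the sketch only estimates, by simulation, how many non-sampled constraints the post-removal sample optimum violates.

```python
import random

def expected_violations(a, r, k, trials=2000, seed=0):
    """Toy simulation of random sampling with removal (illustrative setup).

    Problem: minimize x subject to x >= a_i for every constraint a_i,
    so the optimum of any subset is its maximum and the combinatorial
    dimension is 1.  Each trial draws a random sample of r constraints,
    removes the k largest sampled values (a fixed rule that always
    improves the objective), and counts how many non-sampled constraints
    the resulting sample optimum violates.  Returns the average count.
    """
    rng = random.Random(seed)
    n = len(a)
    total = 0
    for _ in range(trials):
        idx = set(rng.sample(range(n), r))        # random sample of r constraint indices
        sampled = sorted(a[i] for i in idx)       # sampled constraint values, ascending
        opt = sampled[r - k - 1]                  # sample optimum after removing the k largest
        total += sum(1 for i in range(n) if i not in idx and a[i] > opt)
    return total / trials

if __name__ == "__main__":
    gen = random.Random(1)
    constraints = [gen.random() for _ in range(1000)]
    for k in (0, 1, 5):
        print(f"k={k}: avg violated = {expected_violations(constraints, r=50, k=k):.2f}")
```

With k = 0 and combinatorial dimension 1, the simulated count should track the classical expectation of roughly (n - r)/(r + 1) violated constraints; increasing k improves the sample objective while raising the violation count, which is the trade-off the paper quantifies in much greater generality.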
