NeurIPS 2020
Conference paper

Finding second-order stationary points efficiently in smooth nonconvex linearly constrained optimization problems



This paper proposes two efficient algorithms for computing approximate second-order stationary points (SOSPs) of problems with generic smooth non-convex objective functions and generic linear constraints. While finding (approximate) SOSPs for the class of smooth non-convex linearly constrained problems is computationally intractable in general, we show that generic problem instances in this class can be solved efficiently. Specifically, for a generic problem instance, we show that a certain strict complementarity (SC) condition holds for all Karush-Kuhn-Tucker (KKT) solutions. Based on this condition, we design an algorithm named Successive Negative-curvature grAdient Projection (SNAP), which performs either conventional gradient projection steps or negative-curvature-based projection steps to find SOSPs. SNAP is a second-order algorithm that requires Õ(max{1/ϵ_G^2, 1/ϵ_H^3}) iterations to compute an (ϵ_G, ϵ_H)-SOSP, where Õ hides the iteration complexity of the eigenvalue decomposition. Building on SNAP, we propose a first-order algorithm, named SNAP+, that requires O(1/ϵ^2.5) iterations to compute an (ϵ, √ϵ)-SOSP. The per-iteration computational complexities of our algorithms are polynomial in the number of constraints and the problem dimension. To the best of our knowledge, this is the first time that first-order algorithms with polynomial per-iteration complexity and a global sublinear rate have been designed to find SOSPs of this important class of non-convex problems with linear constraints (almost surely).
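The abstract describes SNAP as alternating between projected gradient steps and negative-curvature projection steps. The sketch below illustrates that general idea on a toy equality-constrained problem; it is a simplified illustration under assumed step-size and tolerance choices, not the paper's actual algorithm (the function `snap_sketch` and all parameters are hypothetical). When the projected gradient is small, it checks the Hessian restricted to the constraint's null space and, if negative curvature is found, steps along that direction to escape the saddle.

```python
import numpy as np

def project_affine(x, A, b):
    """Euclidean projection onto {x : A x = b} (A assumed full row rank)."""
    return x - A.T @ np.linalg.solve(A @ A.T, A @ x - b)

def snap_sketch(f, grad, hess, x0, A, b, eps_g=1e-6, eps_h=1e-4,
                step=0.1, max_iter=2000):
    """Hypothetical sketch of a SNAP-style loop (not the paper's algorithm):
    projected gradient steps, plus a negative-curvature step in the
    null space of A once the projected gradient is small."""
    x = project_affine(x0, A, b)
    # Orthonormal basis Z for the null space of A (feasible directions).
    _, _, Vt = np.linalg.svd(A)
    Z = Vt[A.shape[0]:].T
    for _ in range(max_iter):
        x_new = project_affine(x - step * grad(x), A, b)
        if np.linalg.norm(x_new - x) / step > eps_g:
            x = x_new            # ordinary projected gradient step
            continue
        # Approximately first-order stationary: check reduced curvature.
        lam, V = np.linalg.eigh(Z.T @ hess(x) @ Z)
        if lam[0] >= -eps_h:
            return x             # approximate SOSP found
        d = Z @ V[:, 0]          # negative-curvature feasible direction
        if grad(x) @ d > 0:      # choose the sign that does not ascend
            d = -d
        x = project_affine(x + step * d, A, b)
    return x

# Toy instance: f has a saddle at the origin within the feasible set {x3 = 0}.
A = np.array([[0.0, 0.0, 1.0]])
b = np.array([0.0])
f = lambda x: (x[0]**2 - 1)**2 + x[1]**2 + x[2]**2
grad = lambda x: np.array([4*x[0]*(x[0]**2 - 1), 2*x[1], 2*x[2]])
hess = lambda x: np.diag([12*x[0]**2 - 4, 2.0, 2.0])
x = snap_sketch(f, grad, hess, np.array([0.0, 0.5, 0.3]), A, b)
# x escapes the saddle at the origin toward a minimizer with x[0] = ±1
```

Starting from x[0] = 0, pure gradient projection would stall at the saddle (the first coordinate of the gradient vanishes there); the negative-curvature step is what pushes the iterate toward a second-order stationary point.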


06 Dec 2020

