Publication
INFORMS 2021
Talk
Graph Topology Invariant Gradient and Sampling Complexity for Decentralized and Stochastic Optimization
Abstract
One fundamental problem in constrained decentralized multi-agent optimization is the trade-off between gradient/sampling complexity and communication complexity. In this paper, we propose new algorithms whose gradient and sampling complexities are graph topology invariant, while their communication complexities remain optimal. These gradient and sampling complexities match the lower complexity bounds for centralized convex smooth optimization and are independent of the network structure. To the best of our knowledge, such gradient and sampling complexities have not previously been obtained in the literature on decentralized optimization over a constrained feasible set.