Graph topology invariant gradient and sampling complexity for decentralized and stochastic optimization

Abstract

One fundamental problem in constrained decentralized multiagent optimization is the trade-off between gradient/sampling complexity and communication complexity. In this paper, we propose new algorithms whose gradient and sampling complexities are graph topology invariant, while their communication complexities remain optimal. Specifically, for convex smooth deterministic problems, we propose a primal-dual sliding (PDS) algorithm that computes an $\varepsilon$-solution with $\mathcal{O}((\tilde{L}/\varepsilon)^{1/2})$ gradient complexity and $\mathcal{O}((\tilde{L}/\varepsilon)^{1/2} + \|\mathcal{A}\|/\varepsilon)$ communication complexity, where $\tilde{L}$ is the smoothness parameter of the objective function and $\mathcal{A}$ is related to either the graph Laplacian or the transpose of the oriented incidence matrix of the communication network. Under the additional assumption of strong convexity with modulus $\mu$, these complexities improve to $\mathcal{O}((\tilde{L}/\mu)^{1/2}\log(1/\varepsilon))$ and $\mathcal{O}((\tilde{L}/\mu)^{1/2}\log(1/\varepsilon) + \|\mathcal{A}\|/\varepsilon^{1/2})$, respectively. We also propose a stochastic variant, the stochastic primal-dual sliding (SPDS) algorithm, for convex smooth problems with stochastic gradients. SPDS uses the minibatch technique and enables the agents to perform sampling and communication simultaneously. It computes a stochastic $\varepsilon$-solution with $\mathcal{O}((\tilde{L}/\varepsilon)^{1/2} + (\sigma/\varepsilon)^2)$ sampling complexity, which improves to $\mathcal{O}((\tilde{L}/\mu)^{1/2}\log(1/\varepsilon) + \sigma^2/\varepsilon)$ in the strongly convex case, where $\sigma^2$ is the variance of the stochastic gradient. The communication complexities of SPDS remain the same as those of the deterministic case.
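The abstract characterizes $\mathcal{A}$ only as being related to the graph Laplacian or the transpose of the oriented incidence matrix of the communication network. The sketch below is a minimal illustration of those two objects and of the standard identity $L = BB^{\top}$ linking them; the 4-agent ring network and all variable names are hypothetical, and the final lines show how a factor like $\|\mathcal{A}\|$ could be evaluated under the assumption $\mathcal{A} = B^{\top}$, which is an illustration rather than the paper's exact construction.

```python
import numpy as np

# Hypothetical 4-agent ring network, written as oriented edges (tail, head).
edges = [(0, 1), (1, 2), (2, 3), (3, 0)]
n = 4

# Oriented incidence matrix B (n x |E|): column e carries +1 at the tail
# of edge e and -1 at its head.
B = np.zeros((n, len(edges)))
for e, (i, j) in enumerate(edges):
    B[i, e] = 1.0
    B[j, e] = -1.0

# Graph Laplacian built directly as L = D - W from the degree and
# adjacency matrices of the same network.
W = np.zeros((n, n))
for i, j in edges:
    W[i, j] = W[j, i] = 1.0
L = np.diag(W.sum(axis=1)) - W

# Standard identity tying together the two objects named in the abstract.
assert np.allclose(L, B @ B.T)

# Under the illustrative assumption A = B^T, the ||A|| factor in the
# communication bounds is the largest singular value of B, i.e. the
# square root of the largest Laplacian eigenvalue.
print(np.linalg.norm(B.T, 2))                # 2.0 for the 4-cycle
print(np.sqrt(np.linalg.eigvalsh(L).max()))  # also 2.0
```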

Date

28 Jul 2023

Publication

SIOPT
