Publication
INFORMS 2021
Talk

Asynchronous decentralized accelerated stochastic gradient descent

Abstract

In this talk, we introduce an asynchronous decentralized accelerated stochastic gradient descent algorithm for decentralized stochastic optimization. Since communication and synchronization costs are the major bottlenecks, we attempt to reduce these costs via randomization techniques. Our major contribution is to develop a class of accelerated randomized decentralized algorithms for solving general convex composite problems. We establish O(1/ε) (resp., O(1/√ε)) communication complexity and O(1/ε²) (resp., O(1/ε)) sampling complexity for solving general convex (resp., strongly convex) problems. It is worth mentioning that our proposed algorithm depends only sublinearly on the Lipschitz constant when a smooth component is present in the objective function.
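
The abstract does not include pseudocode, so the following is only a minimal illustrative sketch of the general idea it describes: each agent takes local stochastic gradient steps while communication happens through randomized, pairwise gossip rather than a full synchronous consensus round. This is plain decentralized SGD on made-up quadratic objectives, not the accelerated algorithm from the talk; all names, the ring topology, and the step size are assumptions.

```python
# Illustrative sketch (not the authors' algorithm): decentralized SGD over a
# ring of agents, where each iteration a single randomly chosen edge performs
# a gossip average, standing in for randomized, low-cost communication.
import numpy as np

rng = np.random.default_rng(0)
n_agents, dim, n_iters, lr = 8, 5, 2000, 0.05

# Hypothetical local objectives: f_i(x) = 0.5 * ||x - b_i||^2, so the
# network-wide minimizer is the average of the b_i.
b = rng.normal(size=(n_agents, dim))
x = np.zeros((n_agents, dim))          # each row is one agent's iterate

def local_stoch_grad(i, xi):
    """Noisy gradient of agent i's quadratic objective."""
    return (xi - b[i]) + 0.1 * rng.normal(size=dim)

for t in range(n_iters):
    # Local stochastic gradient step at every agent.
    for i in range(n_agents):
        x[i] -= lr * local_stoch_grad(i, x[i])

    # Randomized gossip: one ring edge averages its two endpoints,
    # instead of a full synchronous consensus over all agents.
    i = rng.integers(n_agents)
    j = (i + 1) % n_agents
    avg = 0.5 * (x[i] + x[j])
    x[i], x[j] = avg.copy(), avg.copy()

print("consensus error:", np.linalg.norm(x - x.mean(axis=0)))
print("distance to optimum:", np.linalg.norm(x.mean(axis=0) - b.mean(axis=0)))
```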

Date

24 Oct 2021
