Laplacian Matrix Sampling for Communication-Efficient Decentralized Learning

Abstract

We consider the problem of training a given machine learning model by decentralized parallel stochastic gradient descent over training data distributed across multiple nodes, which arises in many application scenarios. Although extensive studies have been conducted on improving communication efficiency by optimizing what to communicate between nodes (e.g., model compression) and how often to communicate, recent studies have shown that it is also important to customize the communication pattern between each pair of nodes, which is the focus of this work. To this end, we propose a framework and efficient algorithms to design the communication patterns through Laplacian matrix sampling (LMS), which governs not only which nodes communicate with each other but also what weights the communicated parameters carry during parameter aggregation. Our framework is designed to minimize the total cost incurred until convergence under any given cost model that is additive over iterations, with a focus on minimizing the communication cost. Besides achieving a theoretically guaranteed performance in the special case of additive homogeneous communication costs, our solution also achieves superior performance under a variety of network settings and cost models in experiments based on real datasets and topologies, saving 24-50% of the cost compared to the state-of-the-art design without compromising the quality of the trained model.
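
To make the idea concrete, below is a minimal sketch (not the paper's implementation) of decentralized parallel SGD in which the per-iteration mixing matrix W = I - L is derived from a sampled graph Laplacian L, so that the sampled Laplacian determines both which nodes exchange parameters and the aggregation weights. The helpers sample_laplacian and local_gradient, the uniform random topology, the 0.1 edge weights, and the learning rate are all illustrative assumptions; in the paper's framework the sampling distribution itself is optimized to minimize the total (communication) cost until convergence.

```python
# Minimal sketch of decentralized parallel SGD with a Laplacian-based mixing
# matrix sampled each iteration. Placeholder helpers, not the authors' LMS design.
import numpy as np

rng = np.random.default_rng(0)
n_nodes, dim = 8, 5

def sample_laplacian(n, edge_prob=0.5):
    """Sample a weighted graph Laplacian L = D - A from a random topology.

    Assumption: uniform random sampling stands in for the optimized
    sampling distribution used by the LMS framework."""
    A = rng.random((n, n)) < edge_prob           # random adjacency pattern
    A = np.triu(A, 1)
    A = (A + A.T).astype(float) * 0.1            # symmetric, small edge weights
    return np.diag(A.sum(axis=1)) - A            # Laplacian: degree matrix minus adjacency

def local_gradient(x, node):
    """Stand-in for a stochastic gradient computed on node-local data."""
    target = np.full(dim, float(node))           # toy per-node quadratic objective
    return x - target + 0.01 * rng.standard_normal(dim)

X = rng.standard_normal((n_nodes, dim))          # one model copy per node (rows)
lr = 0.1

for it in range(200):
    L = sample_laplacian(n_nodes)                # sampled Laplacian for this iteration
    W = np.eye(n_nodes) - L                      # mixing matrix; rows of L sum to 0, so rows of W sum to 1
    G = np.stack([local_gradient(X[i], i) for i in range(n_nodes)])
    X = W @ (X - lr * G)                         # local step, then weighted aggregation over sampled edges

print("node models (rows) after training:\n", X.round(2))
```

Because each row of L sums to zero and the edge weights are small, W is nonnegative with rows summing to one, so the aggregation step is a standard gossip-style averaging; the nodes' models drift toward consensus on the average of the local objectives while only the sampled edges incur communication in a given iteration.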

Date

01 Apr 2023

Publication

IEEE J-SAC
