Stephanie Houde, Vignesh Radhakrishna, et al.
NeurIPS 2022
Federated learning (FL) faces challenges from intermittent client availability and limited computation and communication resources. As a result, only a small subset of clients can participate in FL at any given time. It is important to understand how partial client participation affects convergence, but most existing works have either considered idealized participation patterns or obtained results with a non-zero optimality error for generic patterns. In this paper, we provide a unified convergence analysis for FL with arbitrary client participation. We first introduce a generalized version of federated averaging (FedAvg) that amplifies parameter updates at an interval of multiple FL rounds. Then, we present a novel analysis that captures the effect of client participation in a single term. By analyzing this term, we obtain convergence upper bounds for a wide range of participation patterns, including both non-stochastic and stochastic cases, which match either the lower bound of stochastic gradient descent (SGD) or the state-of-the-art results in specific settings. We also discuss various insights, recommendations, and experimental results.
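The generalized FedAvg described in the abstract, which applies the aggregated client updates with amplification only at an interval of multiple rounds, can be illustrated with a minimal sketch. This is not the paper's exact algorithm or notation: the toy quadratic client objectives and the names `interval`, `amplification`, and `local_sgd` are assumptions made purely for illustration.

```python
# Minimal sketch of a generalized FedAvg loop with amplified updates applied
# every few rounds, under partial client participation. Toy quadratic client
# objectives are assumed so the example is self-contained and runnable.
import numpy as np

rng = np.random.default_rng(0)

num_clients, dim = 20, 5
# Each client i holds a quadratic loss f_i(x) = 0.5 * ||x - c_i||^2.
centers = rng.normal(size=(num_clients, dim))

def local_sgd(x, center, steps=5, lr=0.1):
    """Run a few local gradient steps and return the model change (delta)."""
    x_local = x.copy()
    for _ in range(steps):
        grad = x_local - center          # gradient of 0.5 * ||x - c||^2
        x_local -= lr * grad
    return x_local - x

x = np.zeros(dim)                        # global model
interval = 4                             # apply updates every `interval` rounds
amplification = 2.0                      # server-side amplification factor (illustrative)
accumulated = np.zeros(dim)

for rnd in range(1, 101):
    # Partial participation: only a few (here, randomly chosen) clients per round.
    participants = rng.choice(num_clients, size=3, replace=False)
    deltas = [local_sgd(x, centers[i]) for i in participants]
    accumulated += np.mean(deltas, axis=0)

    # Every `interval` rounds, apply the amplified average of accumulated updates.
    if rnd % interval == 0:
        x += amplification * accumulated / interval
        accumulated[:] = 0.0

# Optimum of the sum of client objectives is the mean of the centers.
print("distance to optimum:", np.linalg.norm(x - centers.mean(axis=0)))
```

With `interval = 1` and `amplification = 1.0`, the loop reduces to plain FedAvg with partial participation; amplifying the accumulated update over a multi-round interval is the generalization the abstract refers to.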
Jiaqi Han, Wenbing Huang, et al.
NeurIPS 2022
Stefano Braghin, Liubov Nedoshivina
EuroSys 2025
Bo Zhao, Nima Dehmamy, et al.
NeurIPS 2022