Publication
NeurIPS 2022
Workshop
Conditional Moment Alignment for Improved Generalization in Federated Learning
Abstract
In this work, we study model-heterogeneous Federated Learning (FL) for classification, where different clients have different model architectures. Unlike existing works on model heterogeneity, we neither require access to a public dataset nor impose constraints on clients' model architectures, and we keep the clients' models and data private. We prove a generalization result that provides fundamental insight into the role of representations in FL, and we propose a theoretically grounded algorithm, Federated Conditional Moment Alignment (FedCMA), that aligns the class-conditional distributions of each client in the feature space.
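The abstract does not spell out the alignment mechanism, but the core idea of conditional moment alignment can be illustrated with a minimal sketch: each client computes per-class moments (e.g., mean and variance) of its features and is penalized for deviating from shared reference moments. The function names and the specific choice of first and second moments here are illustrative assumptions, not the paper's actual method.

```python
import numpy as np

def class_conditional_moments(features, labels, num_classes):
    """Compute per-class (mean, variance) of feature vectors.

    Illustrative sketch: first and second moments stand in for the
    class-conditional distribution of features.
    """
    moments = {}
    for c in range(num_classes):
        fc = features[labels == c]
        if len(fc) == 0:
            continue  # class absent on this client
        moments[c] = (fc.mean(axis=0), fc.var(axis=0))
    return moments

def moment_alignment_penalty(local_moments, reference_moments):
    """Squared distance between a client's class-conditional moments
    and shared reference moments; a hypothetical alignment loss term."""
    penalty = 0.0
    for c, (mu_ref, var_ref) in reference_moments.items():
        if c not in local_moments:
            continue
        mu_loc, var_loc = local_moments[c]
        penalty += np.sum((mu_loc - mu_ref) ** 2)
        penalty += np.sum((var_loc - var_ref) ** 2)
    return penalty
```

In an FL round, each client would add such a penalty to its local classification loss, pulling per-class feature distributions toward a common reference even when clients use different model architectures, since only the feature-space moments need to be comparable.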