Publication
ICASSP 2023
Conference paper
Scalable and Secure Federated XGBoost
Abstract
Federated learning (FL) is a distributed machine learning framework that enables collaborative training across multiple parties while ensuring data privacy. Practical adaptation of XGBoost, the state-of-the-art tree boosting framework, to FL remains nascent due to the high cost incurred by conventional privacy-preserving methods. This limitation can be attributed to the lack of a formal analytical model that would enable new privacy methods customized to federated XGBoost. To this end, we propose a novel formulation, termed the splitting matrix, that mathematically characterizes the role of the passive party (PP) in federated XGBoost, a role that has been neglected in the literature. This formulation facilitates our novel adoption of a secure matrix multiplication protocol into federated XGBoost, yielding FedXGBoost, a framework for secure XGBoost in the federated setting with lossless accuracy and negligible overhead. Extensive experiments confirm our method's favorable performance.
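To make the splitting-matrix idea concrete, the following is a minimal sketch (not the paper's exact construction or notation): we assume the splitting matrix is a binary indicator matrix whose rows mark which samples fall into the left child under each candidate split proposed by the passive party. A single matrix-vector product then aggregates the gradient and hessian statistics needed for XGBoost's split gain, which is exactly the kind of product a secure matrix multiplication protocol could compute without revealing either party's raw data. All variable names here (`M`, `g`, `h`, `lam`) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n_samples, n_splits = 8, 3

# Hypothetical splitting matrix M: M[k, i] = 1 if sample i goes to the
# left child under the passive party's candidate split k.
M = rng.integers(0, 2, size=(n_splits, n_samples)).astype(float)

# Per-sample gradients g and hessians h, held by the active party
# (hessians kept positive, as for common convex losses).
g = rng.normal(size=n_samples)
h = rng.random(n_samples) + 0.1

# One matrix multiplication aggregates left-child statistics for every
# candidate split at once; in the federated setting a secure matrix
# multiplication protocol would compute these products without either
# party revealing M, g, or h in the clear.
G_left = M @ g
H_left = M @ h

# Standard XGBoost split gain (up to constants), with L2 regularizer lam.
G_total, H_total, lam = g.sum(), h.sum(), 1.0
gain = (G_left**2 / (H_left + lam)
        + (G_total - G_left)**2 / (H_total - H_left + lam)
        - G_total**2 / (H_total + lam))

best = int(np.argmax(gain))
print("best candidate split:", best)
```

The point of the formulation is visible in the shapes: reducing split-statistic aggregation to a single matrix product is what lets an off-the-shelf secure matrix multiplication protocol replace the heavier generic privacy machinery, with no change to the resulting gains and hence lossless accuracy.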