Publication
CLOUD 2022
Conference paper

DeTrust-FL: Privacy-Preserving Federated Learning in Decentralized Trust Setting


Abstract

Federated learning has emerged as a privacy-preserving machine learning approach in which multiple parties can train a single model without sharing their raw training data. Often, federated learning requires multi-party computation techniques to provide strong privacy guarantees by ensuring that an untrusted or curious aggregator cannot obtain isolated replies from parties involved in the training process, thereby preventing potential inference attacks. Until recently, it was thought that some of these secure aggregation techniques were enough to fully protect against inference attacks coming from a curious aggregator. However, recent research has demonstrated that a curious aggregator can successfully launch a disaggregation attack, obtaining the individual information coming from a target party. This paper presents DeTrust-FL, an efficient privacy-preserving federated learning framework for resolving the conflict between secure aggregation and centralized trust in the crypto-dealer, as well as challenges caused by disaggregation and stealthy target attacks. DeTrust-FL proposes a decentralized trust consensus mechanism and incorporates a recently proposed decentralized functional encryption scheme in which all parties agree on the participation matrix before collaboratively generating decryption key fragments, thereby gaining control and trust over the secure aggregation process in a decentralized setting. Our experimental evaluation demonstrates that DeTrust-FL outperforms state-of-the-art FE-based secure multi-party aggregation solutions in terms of training time and reduces the volume of data transferred. In contrast to existing approaches, this is achieved without creating any trust dependency on external trusted entities.
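To illustrate the secure-aggregation goal the abstract describes (the aggregator learns only the sum of the parties' updates, never an isolated reply), here is a minimal sketch using generic pairwise additive masking. This is not the paper's decentralized functional encryption scheme; it is a simplified stand-in, and all names (`pairwise_masks`, `masked_update`) and the single-scalar updates are illustrative assumptions.

```python
import random

def pairwise_masks(n_parties, modulus, seed=0):
    """Each ordered pair (i, j) with i < j shares a random mask.

    Party i adds the mask, party j subtracts it, so every mask
    cancels when the aggregator sums all masked updates.
    """
    rng = random.Random(seed)
    return {(i, j): rng.randrange(modulus)
            for i in range(n_parties) for j in range(i + 1, n_parties)}

def masked_update(party, value, masks, n_parties, modulus):
    """Return the party's update blinded with its shared pairwise masks."""
    m = value
    for j in range(n_parties):
        if party < j:
            m = (m + masks[(party, j)]) % modulus
        elif j < party:
            m = (m - masks[(j, party)]) % modulus
    return m

MOD = 2 ** 32
updates = [7, 11, 5]          # toy model updates, one scalar per party
masks = pairwise_masks(len(updates), MOD)
ciphertexts = [masked_update(i, v, masks, len(updates), MOD)
               for i, v in enumerate(updates)]

# Masks cancel pairwise, so the aggregator recovers only the sum.
aggregate = sum(ciphertexts) % MOD
print(aggregate)  # 23, i.e. 7 + 11 + 5
```

In schemes of this family, a disaggregation attack works by manipulating which parties participate in each round so that individual contributions can be solved for across rounds; DeTrust-FL's agreed-upon participation matrix is aimed precisely at closing that avenue.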

Date

10 Jul 2022