Publication
CVPRW 2019
Conference paper

Towards deep neural network training on encrypted data

Abstract

While deep learning is a valuable tool for solving many tough problems in computer vision, the success of deep learning models is typically determined by: (i) availability of sufficient training data, (ii) access to extensive computational resources, and (iii) expertise in selecting the right model and hyperparameters for the selected task. Often, data availability is the hardest part due to compliance, legal, and privacy constraints. Cryptographic techniques such as fully homomorphic encryption (FHE) offer a potential solution by enabling processing on encrypted data. While prior work has applied FHE to inference, training a deep neural network in the encrypted domain is an extremely challenging task due to the computational complexity of the operations involved. In this paper, we evaluate the feasibility of training neural networks on encrypted data in a completely non-interactive way. Our proposed system uses the open-source FHE toolkit HElib to implement Stochastic Gradient Descent (SGD)-based training of a neural network. We show that encrypted training can be made more computationally efficient by (i) simplifying the network with minimal degradation of accuracy, (ii) choosing an appropriate data representation and resolution, and (iii) packing the data elements within the ciphertext in a way that minimizes the number of operations and facilitates parallelization of FHE computations. With these optimizations, we demonstrate a speedup of more than 50x when training a fully-connected neural network on the MNIST dataset, while achieving reasonable accuracy (96%). Although the cost of training a complex deep learning model from scratch on encrypted data remains very high, this work establishes a solid baseline and paves the way for relatively simpler tasks, such as fine-tuning deep learning models on encrypted data, to be implemented in the near future.
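To make the training procedure described above concrete, the following is a minimal plaintext sketch of SGD-based training of a simplified fully-connected network, the kind of computation the paper maps onto HElib ciphertexts. The layer sizes (784-32-10), learning rate, sigmoid activation, and random stand-in data are illustrative assumptions and are not taken from the paper; in the encrypted domain, the activation would typically be replaced by a low-degree polynomial and each matrix product realized through packed homomorphic operations.

# Plaintext sketch of the SGD training loop that an encrypted-domain
# implementation would mirror. Layer sizes, learning rate, and the sigmoid
# activation are illustrative assumptions, not taken from the paper.
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def one_hot(labels, num_classes=10):
    out = np.zeros((labels.size, num_classes))
    out[np.arange(labels.size), labels] = 1.0
    return out

# Toy stand-in for MNIST: 784-dimensional inputs, 10 classes.
X = rng.random((256, 784))
y = rng.integers(0, 10, size=256)
Y = one_hot(y)

# Simplified fully-connected network: 784 -> 32 -> 10 (sizes assumed).
W1 = rng.normal(0, 0.1, (784, 32))
W2 = rng.normal(0, 0.1, (32, 10))
lr, batch = 0.1, 32

for epoch in range(5):
    for i in range(0, len(X), batch):
        xb, yb = X[i:i + batch], Y[i:i + batch]
        # Forward pass; under FHE the sigmoid would be approximated by a
        # low-degree polynomial.
        h = sigmoid(xb @ W1)
        out = sigmoid(h @ W2)
        # Backward pass (squared-error loss for simplicity).
        d_out = (out - yb) * out * (1 - out)
        d_h = (d_out @ W2.T) * h * (1 - h)
        # SGD updates; these batched matrix products are the operations
        # that ciphertext packing aims to fold into few homomorphic
        # multiplications and rotations.
        W2 -= lr * h.T @ d_out / batch
        W1 -= lr * xb.T @ d_h / batch

The matrix products in this loop are the dominant cost in the encrypted setting, which is why the packing strategy for data elements within each ciphertext has such a large effect on overall training time.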
