Poster
Fast and Memory Efficient Differentially Private-SGD via JL Projections
Zhiqi Bu · Sivakanth Gopi · Janardhan Kulkarni · Yin Tat Lee · Judy Hanwen Shen · Uthaipon Tantipongpipat

Thu Dec 09 08:30 AM -- 10:00 AM (PST)

Differentially Private-SGD (DP-SGD) of Abadi et al. and its variations are the only known algorithms for private training of large-scale neural networks. This algorithm requires computing per-sample gradient norms, which is extremely slow and memory-intensive in practice. In this paper, we present a new framework for designing differentially private optimizers, which we use to build DP-SGD-JL and DP-Adam-JL. Our approach uses Johnson–Lindenstrauss (JL) projections to quickly approximate the per-sample gradient norms without computing them exactly, making the training time and memory requirements of our optimizers much closer to those of their non-DP counterparts. Unlike previous attempts to speed up DP-SGD, which work only for specific network architectures or rely on compiler techniques, we propose an algorithmic solution that works for any network in a black-box manner; this is the main contribution of the paper. To illustrate this, we train a Recurrent Neural Network (RNN) on the IMDb dataset and achieve a good privacy-vs-accuracy tradeoff while being significantly faster than DP-SGD, with a memory footprint similar to that of non-private SGD.
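The core trick can be sketched in a few lines. The following is a minimal illustration (not the authors' implementation) using PyTorch's torch.func forward-mode autodiff: for a random Gaussian direction v, E[⟨g_i, v⟩²] = ‖g_i‖², and a single jvp pass yields ⟨∇L_i, v⟩ for every sample i at once, with no per-sample backward passes. The function name jl_grad_norm_estimates and the default of k = 8 probes are illustrative choices, not part of the paper.

```python
import torch
from torch.func import functional_call, jvp


def jl_grad_norm_estimates(model, per_sample_loss, x, y, k=8):
    """Estimate per-sample gradient norms with k random JL probes.

    For a Gaussian direction v, E[<g_i, v>^2] = ||g_i||^2, and one
    forward-mode (jvp) pass gives <grad L_i, v> for all samples i
    simultaneously, avoiding exact per-sample gradient computation.
    """
    params = dict(model.named_parameters())

    def losses(p):
        # Vector of unreduced per-sample losses, shape (batch,)
        return per_sample_loss(functional_call(model, p, (x,)), y)

    sq = torch.zeros(x.shape[0])
    for _ in range(k):
        # Random Gaussian direction in parameter space
        v = {n: torch.randn_like(t) for n, t in params.items()}
        _, dots = jvp(losses, (params,), (v,))  # dots[i] = <grad L_i, v>
        sq += dots.pow(2)
    return (sq / k).sqrt()  # JL estimate of ||grad L_i||


# Example: the estimated norms can stand in for exact per-sample norms
# in the DP-SGD clipping step.
model = torch.nn.Linear(20, 2)
x, y = torch.randn(32, 20), torch.randint(0, 2, (32,))
ce = lambda out, tgt: torch.nn.functional.cross_entropy(out, tgt, reduction="none")
norms = jl_grad_norm_estimates(model, ce, x, y, k=8)
```

The estimator is unbiased for the squared norm, and its variance shrinks as the number of probes k grows, which is the standard JL tradeoff between accuracy and extra passes.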

Author Information

Zhiqi Bu (University of Pennsylvania)
Sivakanth Gopi (Microsoft Research)

Sivakanth Gopi is a senior researcher in the Algorithms group at Microsoft Research Redmond. He is interested in Coding Theory and Differential Privacy.

Janardhan Kulkarni (Microsoft Research)
Yin Tat Lee (UW)
Judy Hanwen Shen (Stanford)
Uthaipon Tantipongpipat (Georgia Tech)

Graduating PhD student in machine learning theory and optimization. Strong background in mathematics and the algorithmic foundations of data science, with hands-on implementations on real-world datasets. Strive for impact and efficiency while staying attentive to detail. Enjoy public speaking and experienced in leading research projects. Published many theoretical results in academic conferences and developed several optimized algorithms for public use. My research includes:

• Approximation algorithms for optimal design in statistics, also known as design of experiments (DoE), using combinatorial optimization; diverse and representative sampling.
• Differential privacy: the theory of privacy in growing databases; its deployment in deep learning models such as RNNs, LSTMs, autoencoders, and GANs; and its application to private synthetic data generation.
• Fairness in machine learning: fair principal component analysis (fair PCA) using convex optimization and randomized rounding to obtain low-rank solutions to semidefinite programs.

Other interests: model compression; privacy and security in machine learning; fair and explainable/interpretable machine learning.
