
Simple Baselines Are Strong Performers for Differentially Private Natural Language Processing
Xuechen (Chen) Li · Florian Tramer · Percy Liang · Tatsunori Hashimoto

Tue Dec 14 12:45 PM -- 01:00 PM (PST)
Event URL: https://openreview.net/forum?id=oOiSJEr2-Tt

Differentially private learning has seen limited success for deep learning models of text, resulting in a perception that differential privacy may be incompatible with the language model fine-tuning paradigm. We demonstrate that this perception is inaccurate and that, with the right setup, high-performing private models can be learned on moderately-sized corpora by directly fine-tuning with differentially private optimization. Our work highlights the important role of hyperparameters, task formulations, and pretrained models. Our analyses also show that the low performance of naive differentially private baselines in prior work is attributable to suboptimal choices in these factors. Empirical results reveal that differentially private optimization does not suffer from dimension-dependent performance degradation with pretrained models and achieves performance on par with state-of-the-art private training procedures and strong non-private baselines.
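The "differentially private optimization" referenced in the abstract typically means DP-SGD: clip each example's gradient to a fixed norm, average, and add Gaussian noise calibrated to that norm. The sketch below is a minimal, illustrative numpy version of one update step; the function name, toy model, and hyperparameter values are our own assumptions, not the paper's setup.

```python
import numpy as np

def dp_sgd_step(params, per_example_grads, clip_norm=1.0,
                noise_multiplier=1.0, lr=0.1, rng=None):
    """One illustrative DP-SGD step (not the paper's exact procedure).

    1. Clip each per-example gradient to L2 norm <= clip_norm.
    2. Average the clipped gradients.
    3. Add Gaussian noise with std noise_multiplier * clip_norm / batch_size.
    """
    rng = rng if rng is not None else np.random.default_rng(0)
    clipped = []
    for g in per_example_grads:
        norm = np.linalg.norm(g)
        # Scale down (never up) so every example's influence is bounded.
        clipped.append(g * min(1.0, clip_norm / max(norm, 1e-12)))
    avg = np.mean(clipped, axis=0)
    noise = rng.normal(0.0, noise_multiplier * clip_norm / len(clipped),
                       size=avg.shape)
    return params - lr * (avg + noise)
```

Because each example's contribution is bounded by `clip_norm`, the added noise masks any single example's influence, which is what yields the differential privacy guarantee (the privacy budget itself is tracked separately by an accountant, omitted here).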

Author Information

Xuechen (Chen) Li (Stanford University)
Florian Tramer (Google)
Percy Liang (Stanford University)

Percy Liang is an Assistant Professor of Computer Science at Stanford University (B.S. from MIT, 2004; Ph.D. from UC Berkeley, 2011). His research spans machine learning and natural language processing, with the goal of developing trustworthy agents that can communicate effectively with people and improve over time through interaction. Specific topics include question answering, dialogue, program induction, interactive learning, and reliable machine learning. His awards include the IJCAI Computers and Thought Award (2016), an NSF CAREER Award (2016), a Sloan Research Fellowship (2015), and a Microsoft Research Faculty Fellowship (2014).

Tatsunori Hashimoto (Stanford)
