Extremal Mechanisms for Local Differential Privacy
Peter Kairouz · Sewoong Oh · Pramod Viswanath

Tue Dec 09 04:00 PM -- 08:59 PM (PST) @ Level 2, room 210D

Local differential privacy has recently surfaced as a strong measure of privacy in contexts where personal information remains private even from data analysts. Working in a setting where the data providers and data analysts want to maximize the utility of statistical inferences performed on the released data, we study the fundamental tradeoff between local differential privacy and information theoretic utility functions. We introduce a family of extremal privatization mechanisms, which we call staircase mechanisms, and prove that it contains the optimal privatization mechanism that maximizes utility. We further show that for all information theoretic utility functions studied in this paper, maximizing utility is equivalent to solving a linear program, the outcome of which is the optimal staircase mechanism. However, solving this linear program can be computationally expensive since it has a number of variables that is exponential in the data size. To account for this, we show that two simple staircase mechanisms, the binary and randomized response mechanisms, are universally optimal in the high and low privacy regimes, respectively, and well approximate the intermediate regime.
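The randomized response mechanism mentioned in the abstract is a classical privatization scheme for categorical data. A minimal sketch of its k-ary form follows; the function name and parameters are illustrative, not taken from the paper: the true value is reported with probability e^ε / (e^ε + k − 1), and any other value with probability 1 / (e^ε + k − 1), which satisfies ε-local differential privacy.

```python
import math
import random

def randomized_response(x, k, epsilon):
    """k-ary randomized response (illustrative sketch).

    Reports the true value x (an integer in 0..k-1) with probability
    e^eps / (e^eps + k - 1); otherwise reports one of the other k - 1
    values uniformly at random. The ratio of output probabilities for
    any two inputs is at most e^eps, so this satisfies epsilon-LDP.
    """
    p_true = math.exp(epsilon) / (math.exp(epsilon) + k - 1)
    if random.random() < p_true:
        return x
    # Choose uniformly among the k - 1 values other than x.
    y = random.randrange(k - 1)
    return y if y < x else y + 1
```

As the abstract notes, this mechanism is a member of the staircase family and is optimal in the low privacy (large ε) regime, where p_true approaches 1 and the report is almost always truthful.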

Author Information

Peter Kairouz (Google)

Peter Kairouz is a Google Research Scientist working on decentralized, privacy-preserving, and robust machine learning algorithms. Prior to Google, his research largely focused on building decentralized technologies for anonymous broadcasting over complex networks, understanding the fundamental trade-off between differential privacy and utility of learning algorithms, and leveraging state-of-the-art deep generative models for data-driven privacy and fairness.

Sewoong Oh (University of Washington)
Pramod Viswanath (UIUC)
