Poster

Better Private Linear Regression Through Better Private Feature Selection

Travis Dick · Jennifer Gillenwater · Matthew Joseph

Great Hall & Hall B1+B2 (level 1) #1608
Tue 12 Dec 8:45 a.m. PST — 10:45 a.m. PST

Abstract:

Existing work on differentially private linear regression typically assumes that end users can precisely set data bounds or algorithmic hyperparameters. End users often struggle to meet these requirements without directly examining the data (and violating privacy). Recent work has attempted to shift these burdens from users to algorithms, but the resulting methods struggle to provide utility as the feature dimension grows. This work extends these algorithms to higher-dimensional problems by introducing a differentially private feature selection method based on Kendall rank correlation. We prove a utility guarantee for the setting where features are normally distributed and conduct experiments across 25 datasets. We find that adding this private feature selection step before regression significantly broadens the applicability of "plug-and-play" private linear regression algorithms at little additional cost to privacy, computation, or decision-making by the end user.
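The pipeline the abstract describes is simple enough to sketch: score each feature by its Kendall rank correlation with the label, privatize the scores, and keep the top k columns before running a plug-and-play private regressor. The sketch below is an illustration under assumptions, not the paper's mechanism: the function name `private_kendall_feature_selection`, the even split of the privacy budget across features via basic composition, and the use of Laplace noise on absolute tau scores are all choices made here. The stated sensitivity follows because replacing one of n records changes at most n-1 of the n(n-1)/2 ranked pairs, moving the tau statistic by at most 2/n.

```python
import numpy as np
from scipy.stats import kendalltau

def private_kendall_feature_selection(X, y, k, epsilon, seed=None):
    """Pick k of d features by noisy |Kendall tau| with the label (sketch).

    Illustration only: the total budget epsilon is split evenly across the
    d features (basic composition), and Laplace noise calibrated to the
    tau statistic's replacement sensitivity of 2/n is added to each score.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    sensitivity = 2.0 / n                # worst-case change in tau per record
    scale = sensitivity * d / epsilon    # Laplace scale under an even epsilon/d split
    scores = np.empty(d)
    for j in range(d):
        tau, _ = kendalltau(X[:, j], y)
        tau = 0.0 if np.isnan(tau) else tau  # constant feature leaves tau undefined
        scores[j] = abs(tau) + rng.laplace(scale=scale)
    return np.argsort(scores)[-k:]       # indices of the k largest noisy scores
```

The selected indices would then be handed to the downstream private regression algorithm, which only ever sees the k retained columns. Tighter selection rules (for instance, running report-noisy-max k times rather than splitting the budget evenly across all d features) could be substituted without changing the overall structure.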
