

Poster

New Lower Bounds for Private Estimation and a Generalized Fingerprinting Lemma

Gautam Kamath · Argyris Mouzakis · Vikrant Singhal

Hall J (level 1) #817

Keywords: [ data privacy ] [ Gaussians ] [ differential privacy ] [ statistics ] [ covariance estimation ] [ machine learning ] [ mean estimation ] [ lower bounds ] [ learning ]


Abstract: We prove new lower bounds for statistical estimation tasks under the constraint of $(\varepsilon,\delta)$-differential privacy. First, we provide tight lower bounds for private covariance estimation of Gaussian distributions. We show that estimating the covariance matrix in Frobenius norm requires $\Omega(d^2)$ samples, and in spectral norm requires $\Omega(d^{3/2})$ samples, both matching upper bounds up to logarithmic factors. We prove these bounds via our main technical contribution, a broad generalization of the fingerprinting method to exponential families. Additionally, using the private Assouad method of Acharya, Sun, and Zhang, we show a tight $\Omega(d/(\alpha^2 \varepsilon))$ lower bound for estimating the mean of a distribution with bounded covariance to $\alpha$-error in $\ell_2$-distance. Previously known lower bounds for all these problems were either polynomially weaker or held under the stricter condition of $(\varepsilon,0)$-differential privacy.
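For reference, the sample-complexity lower bounds stated in the abstract can be collected in one display. Here $n$ denotes the sample size (a symbol introduced only for this summary), and, as in the abstract, the dependence of the two covariance bounds on the accuracy and privacy parameters is suppressed:

$$
\begin{aligned}
n &= \Omega\bigl(d^{2}\bigr) && \text{(Gaussian covariance estimation in Frobenius norm)} \\
n &= \Omega\bigl(d^{3/2}\bigr) && \text{(Gaussian covariance estimation in spectral norm)} \\
n &= \Omega\!\left(\frac{d}{\alpha^{2}\varepsilon}\right) && \text{(mean estimation with bounded covariance, $\alpha$-error in $\ell_2$)}
\end{aligned}
$$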
