Collaborative Learning of Discrete Distributions under Heterogeneity and Communication Constraints

Xinmeng Huang · Donghwan Lee · Edgar Dobriban · Hamed Hassani

Hall J (level 1) #307

Keywords: [ Communication Constraint ] [ Collaborative Estimation ] [ Sparse Heterogeneity ] [ Discrete Distributions ]

Abstract: In modern machine learning, users often have to collaborate to learn distributions that generate the data. Communication can be a significant bottleneck. Prior work has studied homogeneous users---i.e., users whose data follow the same discrete distribution---and has provided optimal communication-efficient methods. However, these methods rely heavily on homogeneity, and are less applicable in the common case when users' discrete distributions are heterogeneous. Here we consider a natural and tractable model of heterogeneity, where users' discrete distributions vary only sparsely, on a small number of entries. We propose a novel two-stage method named SHIFT: first, the users collaborate by communicating with the server to learn a central distribution, relying on methods from robust statistics. Then, the learned central distribution is fine-tuned to estimate the individual distributions of users. We show that our method is minimax optimal in our model of heterogeneity and under communication constraints. Further, we provide experimental results using both synthetic data and $n$-gram frequency estimation in the text domain, which corroborate its efficiency.
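The two-stage structure described in the abstract can be illustrated with a minimal sketch: a robust central estimate (here, a coordinate-wise median across users' empirical distributions) followed by a per-user sparse correction. This is an assumption-laden toy version, not the paper's exact SHIFT procedure; the function names, the median aggregator, and the top-s deviation heuristic are all illustrative choices.

```python
import numpy as np

def central_estimate(user_counts):
    """Stage 1 (sketch): robust central distribution via the
    coordinate-wise median of users' empirical distributions.
    The median is one simple robust aggregator; the paper's
    method may differ."""
    empirical = user_counts / user_counts.sum(axis=1, keepdims=True)
    med = np.median(empirical, axis=0)
    return med / med.sum()  # renormalize to a valid distribution

def fine_tune(central, counts, s):
    """Stage 2 (sketch): correct the s entries where this user's
    empirical distribution deviates most from the central one,
    reflecting the sparse-heterogeneity model."""
    empirical = counts / counts.sum()
    deviation = np.abs(empirical - central)
    idx = np.argsort(deviation)[-s:]   # entries with largest deviation
    est = central.copy()
    est[idx] = empirical[idx]          # trust local data on the sparse support
    return est / est.sum()

# Toy usage: 5 users, alphabet of size 10, sparsity level 2.
rng = np.random.default_rng(0)
k, n_users, s = 10, 5, 2
base = np.ones(k) / k
user_counts = rng.multinomial(1000, base, size=n_users).astype(float)
central = central_estimate(user_counts)
est = fine_tune(central, user_counts[0], s)
```

Both returned vectors are valid probability distributions over the k-entry alphabet; the per-user estimate differs from the central one on at most s entries before renormalization.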