

Poster

Estimation of Entropy in Constant Space with Improved Sample Complexity

Maryam Aliakbarpour · Andrew McGregor · Jelani Nelson · Erik Waingarten

Hall J (level 1) #819

Keywords: [ Data Streams ] [ Shannon Entropy ] [ Sample Complexity ]


Abstract: Recent work of Acharya et al. (NeurIPS 2019) showed how to estimate the entropy of a distribution $\mathcal D$ over an alphabet of size $k$ up to $\pm\epsilon$ additive error by streaming over $(k/\epsilon^3) \cdot \text{polylog}(1/\epsilon)$ i.i.d. samples and using only $O(1)$ words of memory. In this work, we give a new constant-memory scheme that reduces the sample complexity to $(k/\epsilon^2)\cdot \text{polylog}(1/\epsilon)$. We conjecture that this is optimal up to $\text{polylog}(1/\epsilon)$ factors.
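For contrast with the constant-memory regime studied here, the sketch below shows the naive "plug-in" baseline (not the paper's algorithm): it computes the empirical entropy of the samples, which requires storing a count for every symbol seen, i.e. $\Theta(k)$ words of memory rather than $O(1)$. The alphabet size, distribution, and sample budget in the example are illustrative assumptions.

```python
import math
import random
from collections import Counter

def plugin_entropy(samples):
    """Empirical (plug-in) Shannon entropy, in nats, of a sample sequence."""
    counts = Counter(samples)  # Theta(k) memory: one counter per distinct symbol
    n = len(samples)
    return -sum((c / n) * math.log(c / n) for c in counts.values())

if __name__ == "__main__":
    k = 1000                                    # assumed alphabet size
    weights = [random.random() for _ in range(k)]
    total = sum(weights)
    probs = [w / total for w in weights]        # assumed distribution D
    true_entropy = -sum(p * math.log(p) for p in probs)

    n = 200_000                                 # assumed i.i.d. sample budget
    samples = random.choices(range(k), weights=probs, k=n)
    print(f"true entropy    : {true_entropy:.4f} nats")
    print(f"plug-in estimate: {plugin_entropy(samples):.4f} nats")
```

The point of the constant-memory schemes discussed above is to avoid exactly this $\Theta(k)$ table of counts while still achieving $\pm\epsilon$ additive error from a comparable number of streamed samples.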
