Poster
Estimation of Entropy in Constant Space with Improved Sample Complexity
Maryam Aliakbarpour · Andrew McGregor · Jelani Nelson · Erik Waingarten

Wed Nov 30 02:00 PM -- 04:00 PM (PST) @ Hall J #819
Recent work of Acharya et al. (NeurIPS 2019) showed how to estimate the entropy of a distribution $\mathcal D$ over an alphabet of size $k$ up to $\pm\epsilon$ additive error by streaming over $(k/\epsilon^3) \cdot \text{polylog}(1/\epsilon)$ i.i.d. samples and using only $O(1)$ words of memory. In this work, we give a new constant-memory scheme that reduces the sample complexity to $(k/\epsilon^2) \cdot \text{polylog}(1/\epsilon)$. We conjecture that this is optimal up to $\text{polylog}(1/\epsilon)$ factors.
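The constant-memory template behind this line of work is simple to illustrate. The Python sketch below (all names such as naive_constant_space_entropy, num_rounds, and window are illustrative, not from the paper) shows the generic constant-space approach of estimating $H(\mathcal D) = \mathbb{E}[\log(1/p(X))]$: draw a reference sample, count its recurrences in a fresh window of the stream, and plug in the empirical frequency. This naive plug-in estimator is biased and sample-hungry; the paper's contribution is a scheme achieving the improved $(k/\epsilon^2) \cdot \text{polylog}(1/\epsilon)$ bound, which this sketch does not implement.

import math
import random

def naive_constant_space_entropy(sample_stream, num_rounds, window):
    # Hedged sketch, not the paper's scheme: estimate H(D) = E[ln(1/p(X))]
    # using O(1) words of memory. Each round draws one reference sample x,
    # counts how often x reappears among the next `window` fresh samples,
    # and plugs the empirical frequency in for p(x). The plug-in is biased
    # for rare symbols, so this only illustrates the template.
    total = 0.0
    for _ in range(num_rounds):
        x = next(sample_stream)                       # one word: the reference sample
        hits = sum(next(sample_stream) == x for _ in range(window))
        p_hat = max(hits, 1) / window                 # crude guard against log(0)
        total += math.log(1.0 / p_hat)
    return total / num_rounds

# Usage on a small synthetic distribution over an alphabet of size k = 4.
def iid_stream(weights):
    symbols = list(range(len(weights)))
    while True:
        yield random.choices(symbols, weights=weights)[0]

weights = [0.5, 0.25, 0.125, 0.125]
true_entropy = -sum(w * math.log(w) for w in weights)
estimate = naive_constant_space_entropy(iid_stream(weights), num_rounds=2000, window=500)
print(f"true H = {true_entropy:.3f} nats, estimate = {estimate:.3f} nats")

The only state held across the stream is the current reference symbol and two counters, so the memory footprint stays at a constant number of words regardless of the alphabet size $k$.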

Author Information

Maryam Aliakbarpour (University of Massachusetts Amherst)
Andrew McGregor (University of Massachusetts Amherst)
Jelani Nelson (UC Berkeley)

Jelani Nelson is a Professor in the Department of Electrical Engineering and Computer Sciences at UC Berkeley. His research interests include sketching and streaming algorithms, dimensionality reduction, compressed sensing, and randomized linear algebra. He is a past recipient of the PECASE award, a Sloan Research Fellowship, and an NSF CAREER award. He is also the Founder and President of a 501(c)(3) nonprofit, "AddisCoder Inc.", which organizes annual summer camps that have provided algorithms training to over 500 high school students in Ethiopia (see addiscoder.com).

Erik Waingarten (Stanford University)
