Poster
Estimation of Entropy in Constant Space with Improved Sample Complexity
Maryam Aliakbarpour · Andrew McGregor · Jelani Nelson · Erik Waingarten

Wed Nov 30 02:00 PM -- 04:00 PM (PST) @ Hall J #819
Recent work of Acharya et al. (NeurIPS 2019) showed how to estimate the entropy of a distribution $\mathcal D$ over an alphabet of size $k$ up to $\pm\epsilon$ additive error by streaming over $(k/\epsilon^3) \cdot \text{polylog}(1/\epsilon)$ i.i.d. samples and using only $O(1)$ words of memory. In this work, we give a new constant-memory scheme that reduces the sample complexity to $(k/\epsilon^2)\cdot \text{polylog}(1/\epsilon)$. We conjecture that this is optimal up to $\text{polylog}(1/\epsilon)$ factors.
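
For intuition only, here is a minimal Python sketch of the general flavor of constant-memory entropy estimation from i.i.d. samples: repeatedly draw one fresh sample $x$, estimate $p(x)$ by its empirical frequency among a window of subsequent samples, and average the resulting $-\log \hat p$ values, since $H(\mathcal D) = -\sum_i p_i \log p_i = \mathbb{E}_{x \sim \mathcal D}[-\log p(x)]$. This is not the algorithm of the paper (which achieves the stated $(k/\epsilon^2)\cdot \text{polylog}(1/\epsilon)$ sample complexity with a more careful estimator); the names streaming_entropy_estimate, num_rounds, and window_size are illustrative assumptions.

    import math
    import random

    def streaming_entropy_estimate(sample_stream, num_rounds, window_size):
        # Toy constant-memory estimator: each round stores only the
        # current sample x, one counter, and a running sum -- O(1) words.
        total = 0.0
        for _ in range(num_rounds):
            x = next(sample_stream)  # one fresh i.i.d. sample
            # Empirical frequency of x among the next window_size samples.
            hits = sum(next(sample_stream) == x for _ in range(window_size))
            p_hat = max(hits, 1) / window_size  # clamp so the log is defined
            total += -math.log(p_hat)
        return total / num_rounds

    # Usage: the uniform distribution over k symbols has entropy ln(k).
    k = 16

    def uniform_stream():
        while True:
            yield random.randrange(k)

    est = streaming_entropy_estimate(uniform_stream(),
                                     num_rounds=2000, window_size=200)
    print(f"estimate: {est:.3f}  true: {math.log(k):.3f}")

Note that this naive plug-in estimate is biased: since $-\log$ is convex, Jensen's inequality gives $\mathbb{E}[-\log \hat p] \geq -\log \mathbb{E}[\hat p]$, so the sketch tends to overestimate the entropy. Controlling such bias as a function of the sample budget is where the actual algorithms do their work.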

Author Information

Maryam Aliakbarpour (University of Massachusetts Amherst)
Andrew McGregor (University of Massachusetts Amherst)
Jelani Nelson (UC Berkeley)

Jelani Nelson is a Professor of Electrical Engineering and Computer Sciences at UC Berkeley, and also a Research Scientist at Google (part-time). He is interested in randomized algorithms, sketching and streaming algorithms, dimensionality reduction, and differential privacy. He is a recipient of the ACM Eugene L. Lawler Award for Humanitarian Contributions within Computer Science, a Presidential Early Career Award for Scientists and Engineers (PECASE), and a Sloan Research Fellowship. He is also Founder and President of AddisCoder, Inc., a nonprofit that provides algorithms training to high school students in Ethiopia and Jamaica.

Erik Waingarten (Stanford University)
