Poster

Learning from the Wisdom of Crowds by Minimax Entropy

Denny Zhou · John C Platt · Sumit Basu · Yi Mao

Harrah’s Special Events Center 2nd Floor

Abstract:

An important way to make large training sets is to gather noisy labels from crowds of nonexperts. We propose a minimax entropy principle to improve the quality of these labels. Our method assumes that labels are generated by a probability distribution over workers, items, and labels. By maximizing the entropy of this distribution, the method naturally infers item confusability and worker expertise. We infer the ground truth by minimizing the entropy of this distribution, which we show minimizes the Kullback-Leibler (KL) divergence between the probability distribution and the unknown truth. We show that a simple coordinate descent scheme can optimize minimax entropy. Empirically, our results are substantially better than previously published methods for the same problem.
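
The two entropy operations in the abstract compose into a saddle-point problem. The display below is a sketch in my own notation (the paper's exact constraint sets are not reproduced here): let $\pi_{ij}$ be the model's label distribution for worker $i$ on item $j$, and let $\mathcal{C}(y)$ collect the distributions whose per-item and per-worker label statistics match the observed labels under a candidate ground truth $y$.

```latex
% Minimax entropy as a saddle point (notation mine, constraints schematic):
% the inner max infers the most noncommittal model consistent with the data,
% the outer min picks the truth under which that model is least uncertain.
\hat{y} = \arg\min_{y}\, \max_{\pi \in \mathcal{C}(y)} \sum_{i,j} H(\pi_{ij}),
\qquad
H(\pi_{ij}) = -\sum_{k} \pi_{ij}(k)\,\log \pi_{ij}(k)
```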
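To make the coordinate descent concrete, here is a minimal sketch in its spirit, not the paper's exact algorithm: I assume a simplified exponential-family parameterization pi(k | worker i, item j, true class c) = softmax_k(sigma[j, k] + tau[i, c, k]) and plain averaged gradient steps in place of the paper's dual solver; the names `aggregate`, `sigma`, `tau`, and `q` are mine.

```python
# Hedged sketch: minimax-entropy-style label aggregation by coordinate descent.
# Assumptions (mine, not from the paper): simplified parameterization
# pi(k | worker i, item j, true class c) = softmax_k(sigma[j,k] + tau[i,c,k]),
# gradient steps instead of the exact dual solver.
import numpy as np


def log_softmax(x):
    x = x - x.max(axis=-1, keepdims=True)
    return x - np.log(np.exp(x).sum(axis=-1, keepdims=True))


def aggregate(labels, n_workers, n_items, n_classes, outer=20, inner=30, lr=0.5):
    """labels: iterable of (worker, item, observed_label) triples."""
    W, J, K = (np.array(col) for col in zip(*labels))
    onehot = np.eye(n_classes)[K]                      # (n_obs, K)
    sigma = np.zeros((n_items, n_classes))             # per-item parameters
    tau = np.zeros((n_workers, n_classes, n_classes))  # per-worker confusions
    cnt_j = np.maximum(np.bincount(J, minlength=n_items), 1)[:, None]
    cnt_w = np.maximum(np.bincount(W, minlength=n_workers), 1)[:, None, None]

    # Initialize the ground-truth belief q at the empirical vote distribution;
    # a uniform start is a symmetric fixed point the updates never leave.
    q = np.zeros((n_items, n_classes))
    np.add.at(q, J, onehot)
    q /= np.maximum(q.sum(axis=1, keepdims=True), 1e-12)

    for _ in range(outer):
        # Step 1: given q, fit (sigma, tau) by gradient ascent on the expected
        # log-likelihood  sum_n sum_c q[J_n, c] * log pi(K_n | W_n, J_n, c).
        for _ in range(inner):
            pi = np.exp(log_softmax(sigma[J][:, None, :] + tau[W]))  # (n_obs, C, K)
            g = q[J][:, :, None] * (onehot[:, None, :] - pi)
            g_sigma = np.zeros_like(sigma)
            np.add.at(g_sigma, J, g.sum(axis=1))
            g_tau = np.zeros_like(tau)
            np.add.at(g_tau, W, g)
            sigma += lr * g_sigma / cnt_j
            tau += lr * g_tau / cnt_w

        # Step 2: given the fitted model, rescore each candidate truth c for
        # item j by the total log-probability of the workers' observed labels.
        logpi = log_softmax(sigma[J][:, None, :] + tau[W])  # (n_obs, C, K)
        logq = np.zeros((n_items, n_classes))
        np.add.at(logq, J, logpi[np.arange(len(K)), :, K])  # (n_obs, C) scores
        q = np.exp(log_softmax(logq))
    return q


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n_workers, n_items, n_classes = 10, 80, 3
    truth = rng.integers(0, n_classes, n_items)
    accuracy = rng.uniform(0.35, 0.9, n_workers)  # mixed-quality crowd
    labels = [
        (i, j, int(truth[j]) if rng.random() < accuracy[i]
         else int(rng.integers(0, n_classes)))
        for i in range(n_workers) for j in range(n_items)
    ]
    q = aggregate(labels, n_workers, n_items, n_classes)
    print("agreement with truth:", (q.argmax(axis=1) == truth).mean())
```

Starting q at the vote distribution also doubles as a sanity baseline: the first pass of Step 1 fits worker confusions to majority vote, after which Step 2 reweights each vote by the worker's fitted reliability.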
