Poster
Positive-Unlabeled Learning with Non-Negative Risk Estimator
Ryuichi Kiryo · Gang Niu · Marthinus C du Plessis · Masashi Sugiyama

Tue Dec 05 06:30 PM -- 10:30 PM (PST) @ Pacific Ballroom #15

From only positive (P) and unlabeled (U) data, a binary classifier can be trained with PU learning, in which the state of the art is unbiased PU learning. However, if the model is very flexible, the empirical risk on training data will go negative, and we will suffer from serious overfitting. In this paper, we propose a non-negative risk estimator for PU learning: when it is minimized, it is more robust against overfitting, and thus we are able to use very flexible models (such as deep neural networks) given limited P data. Moreover, we analyze the bias, consistency, and mean-squared-error reduction of the proposed risk estimator, and bound the estimation error of the resulting empirical risk minimizer. Experiments demonstrate that our risk estimator fixes the overfitting problem of its unbiased counterparts.
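The key idea in the abstract can be sketched numerically. In unbiased PU learning, the risk of the negative class is estimated as R_u^- - pi_p * R_p^-, a difference of two empirical averages that can go negative with flexible models; the non-negative estimator clips this term at zero. Below is a minimal NumPy sketch of that estimator's form, assuming a bounded sigmoid loss and raw classifier scores; the function and variable names are illustrative, not from the paper's code.

```python
import numpy as np

def sigmoid_loss(z, y):
    # Sigmoid loss l(z, y) = 1 / (1 + exp(y * z)), bounded in (0, 1).
    return 1.0 / (1.0 + np.exp(y * z))

def nn_pu_risk(scores_p, scores_u, pi_p, loss=sigmoid_loss):
    """Non-negative PU risk estimate (illustrative sketch).

    scores_p, scores_u : classifier outputs g(x) on P and U samples
    pi_p               : class prior Pr(y = +1), assumed known
    """
    risk_p_pos = loss(scores_p, +1).mean()   # R_p^+ : P data labeled +1
    risk_p_neg = loss(scores_p, -1).mean()   # R_p^- : P data labeled -1
    risk_u_neg = loss(scores_u, -1).mean()   # R_u^- : U data labeled -1
    # Unbiased PU uses (risk_u_neg - pi_p * risk_p_neg) directly, which
    # can be negative; the non-negative estimator clips it at zero.
    neg_part = max(0.0, risk_u_neg - pi_p * risk_p_neg)
    return pi_p * risk_p_pos + neg_part
```

With a bounded loss the clipped estimate is always non-negative, whereas the unbiased counterpart it wraps can dip below zero when the model overfits the P data.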

Author Information

Ryuichi Kiryo (UTokyo/RIKEN)
Gang Niu (RIKEN)
Gang Niu is currently an indefinite-term senior research scientist at RIKEN Center for Advanced Intelligence Project.

Marthinus C du Plessis (The University of Tokyo)
Masashi Sugiyama (RIKEN / University of Tokyo)
