Unsupervised Risk Estimation Using Only Conditional Independence Structure

Jacob Steinhardt · Percy Liang

Area 5+6+7+8 #6

Keywords: [ Learning Theory ]


We show how to estimate a model’s test error from unlabeled data, on distributions very different from the training distribution, while assuming only that certain conditional independencies are preserved between train and test. We do not need to assume that the optimal predictor is the same between train and test, or that the true distribution lies in any parametric family. We can also efficiently compute gradients of the estimated error and hence perform unsupervised discriminative learning. Our technical tool is the method of moments, which allows us to exploit conditional independencies in the absence of a fully-specified model. Our framework encompasses a large family of losses including the log and exponential loss, and extends to structured output settings such as conditional random fields.
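As a hedged illustration of the method-of-moments idea (a minimal sketch, not the paper's full framework): with three predictors whose errors are conditionally independent given a balanced binary label, pairwise agreement statistics alone identify each predictor's 0-1 error, with no labeled test data. The accuracies and simulation below are assumptions chosen for the demo.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000
y = rng.choice([-1, 1], size=n)  # hidden true labels, used only to simulate data

# Three conditionally independent "views": each agrees with y with its own
# (assumed, illustrative) accuracy a_i, independently given y.
true_acc = [0.9, 0.8, 0.7]
views = [np.where(rng.random(n) < a, y, -y) for a in true_acc]

# Method of moments: conditional independence given y implies
#   E[v_i * v_j] = (2a_i - 1)(2a_j - 1)  for i != j,
# so the per-view correlations c_i = 2a_i - 1 are identified from
# the three pairwise products -- computed on unlabeled data only.
c12 = np.mean(views[0] * views[1])
c13 = np.mean(views[0] * views[2])
c23 = np.mean(views[1] * views[2])

c1 = np.sqrt(c12 * c13 / c23)
c2 = np.sqrt(c12 * c23 / c13)
c3 = np.sqrt(c13 * c23 / c12)

# Estimated 0-1 errors, recovered without ever looking at y.
est_err = [(1 - c) / 2 for c in (c1, c2, c3)]
print(est_err)  # close to the true errors [0.1, 0.2, 0.3]
```

Note that only the moment equations use the conditional independence structure; nothing about the label distribution at test time needs to match training, which is the regime the paper targets.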
