

Poster in Workshop: Causal Machine Learning for Real-World Impact

Counterfactual Decision Support Under Treatment-Conditional Outcome Measurement Error

Luke Guerdan · Amanda Coston · Kenneth Holstein · Steven Wu


Abstract:

Growing work in algorithmic decision support proposes methods for combining predictive models with human judgment to improve decision quality. A challenge that arises in this setting is predicting the risk of a decision-relevant target outcome under multiple candidate actions. While counterfactual prediction techniques have been developed for these tasks, current approaches do not account for measurement error in observed labels. This is a key limitation because in many domains, observed labels (e.g., medical diagnoses, defendant re-arrest) serve as a proxy for the target outcome of interest (e.g., biological medical outcomes, recidivism). We develop a method for counterfactual prediction of target outcomes observed under treatment-conditional outcome measurement error (TC-OME). Our method minimizes risk with respect to target potential outcomes given access to observational data and estimates of measurement error parameters. We also develop a method for estimating error parameters in cases where these are unknown in advance. Through a synthetic evaluation, we show that our approach achieves performance parity with an oracle model when measurement error parameters are known and retains performance given moderate bias in error parameter estimates.
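To make the correction idea concrete, the sketch below shows one standard way to recover target-outcome risk from a model trained on noisy proxy labels under class-conditional (here, treatment-conditional) error rates. This is an illustrative example only, not the paper's estimator: the function name corrected_risk and the error-rate arguments alpha_a and beta_a are assumed for illustration, and the identity used is the usual linear relation between proxy risk and target risk under class-conditional label noise.

import numpy as np

def corrected_risk(proxy_risk, alpha_a, beta_a):
    # proxy_risk: estimated P(Y_tilde = 1 | X = x, A = a) from a model fit on proxy labels
    # alpha_a: assumed false-positive rate P(Y_tilde = 1 | Y(a) = 0, A = a)
    # beta_a:  assumed false-negative rate P(Y_tilde = 0 | Y(a) = 1, A = a)
    # Under treatment-conditional, class-conditional noise, proxy risk and target risk
    # are linearly related, so the target risk can be recovered by inverting that relation:
    #   P(Y_tilde = 1 | x, a) = alpha_a * (1 - P(Y(a) = 1 | x)) + (1 - beta_a) * P(Y(a) = 1 | x)
    corrected = (proxy_risk - alpha_a) / (1.0 - alpha_a - beta_a)
    return np.clip(corrected, 0.0, 1.0)

# Example: a proxy model estimates 0.40 risk under action a; assumed error rates 0.10 and 0.20
print(corrected_risk(np.array([0.40]), alpha_a=0.10, beta_a=0.20))  # ~0.429

When the error parameters are only estimated rather than known, the same inversion applies with the plug-in estimates, which is where the abstract's robustness claim about moderate bias in the error parameters becomes relevant.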
