Adaptive Data Debiasing through Bounded Exploration
Yifan Yang · Yang Liu · Parinaz Naghizadeh

Wed Nov 30 09:00 AM -- 11:00 AM (PST) @ Hall J #324

Biases in existing datasets used to train algorithmic decision rules can raise ethical and economic concerns due to the resulting disparate treatment of different groups. We propose an algorithm for sequentially debiasing such datasets through adaptive and bounded exploration in a classification problem with costly and censored feedback. Exploration in this context means that at times, and to a judiciously chosen extent, the decision maker deviates from its (current) loss-minimizing rule and instead accepts some individuals who would otherwise be rejected, so as to reduce statistical data biases. Our proposed algorithm includes parameters that can be used to balance the ultimate goal of removing data biases (which will in turn lead to more accurate and fair decisions) against the exploration risks incurred to achieve this goal. We analytically show that such exploration can help debias data for certain distributions. We further investigate how fairness criteria can work in conjunction with our data debiasing algorithm. We illustrate the performance of our algorithm using experiments on synthetic and real-world datasets.
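The core mechanism described above can be sketched in a few lines. This is a minimal illustrative simulation, not the authors' algorithm: the Gaussian score model, the fixed threshold, and the band width are assumptions chosen only to show why censored feedback biases estimates and how a bounded acceptance band below the threshold reduces that bias.

```python
import random
import statistics

def empirical_mean(explore_band, threshold=0.5, true_mean=0.0,
                   n=50_000, seed=0):
    """Estimate a group's mean score under censored feedback.

    Illustrative sketch (hypothetical setup, not the paper's method):
    outcomes are observed only for accepted individuals, so estimating
    from accepted samples alone truncates the distribution at the
    threshold. Bounded exploration additionally accepts individuals
    in a band of width `explore_band` just below the threshold,
    widening the observed region and shrinking the bias.
    """
    rng = random.Random(seed)
    observed = []
    for _ in range(n):
        score = rng.gauss(true_mean, 1.0)
        accepted = score >= threshold                        # loss-minimizing rule
        explored = threshold - explore_band <= score < threshold  # bounded deviation
        if accepted or explored:   # label is observed only in these cases
            observed.append(score)
    return statistics.fmean(observed)

# Without exploration, only scores above the threshold are ever observed,
# so the empirical mean sits well above the true mean of 0.0.
biased = empirical_mean(explore_band=0.0)
# With a bounded exploration band, much more of the distribution is
# observed and the estimate moves close to the true mean.
less_biased = empirical_mean(explore_band=2.0)
```

The band width plays the role of the tunable parameter in the abstract: a wider band debiases faster but accepts more individuals the current rule would reject, i.e., incurs more exploration risk.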

Author Information

Yifan Yang (Ohio State University)
Yang Liu (UC Santa Cruz)
Parinaz Naghizadeh (Ohio State University)