Conformalized Fairness via Quantile Regression
Meichen Liu · Lei Ding · Dengdeng Yu · Wulong Liu · Linglong Kong · Bei Jiang

Wed Nov 30 02:00 PM -- 04:00 PM (PST) @ Hall J #931

Algorithmic fairness has received increased attention in socially sensitive domains. While a rich literature on mean fairness has been established, research on quantile fairness remains sparse but vital. To address this need and underscore the significance of quantile fairness, we propose a novel framework to learn a real-valued quantile function under the fairness requirement of Demographic Parity with respect to sensitive attributes, such as race or gender, and thereby derive a reliable fair prediction interval. Using optimal transport and functional synchronization techniques, we establish theoretical guarantees of distribution-free coverage and exact fairness for the induced prediction interval constructed from fair quantiles. A hands-on pipeline is provided to combine flexible quantile regression models with an efficient fairness-adjustment post-processing algorithm. We demonstrate the superior empirical performance of this approach on several benchmark datasets. Our results show the model's ability to uncover the mechanism underlying the fairness-accuracy trade-off in a wide range of societal and medical applications.
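The abstract describes a two-stage pipeline: conformalize a pair of estimated quantiles to get distribution-free coverage, then synchronize the group-conditional quantile functions (via optimal transport) so the resulting intervals satisfy Demographic Parity. A minimal sketch of both ideas follows; it is not the authors' exact algorithm, and all function names are illustrative. The conformal step uses the standard CQR-style margin, and the fairness step uses the simplest 1-D Wasserstein barycenter (averaging the group-wise empirical quantile functions at a common rank):

```python
import math

def conformal_margin(y_cal, lo_cal, hi_cal, alpha=0.1):
    """Split-conformal margin for a pair of quantile predictions (CQR-style).
    Widening [lo, hi] by this margin gives distribution-free coverage >= 1 - alpha
    on exchangeable data. Inputs are calibration labels and predicted quantiles."""
    scores = [max(l - y, y - h) for y, l, h in zip(y_cal, lo_cal, hi_cal)]
    scores.sort()
    n = len(scores)
    k = min(math.ceil((1 - alpha) * (n + 1)), n)  # empirical quantile index
    return scores[k - 1]

def barycenter_synchronize(preds, groups):
    """Toy fairness post-processing: map each prediction to its within-group rank,
    then evaluate every group's empirical quantile function at that rank and
    average (the 1-D Wasserstein barycenter). Adjusted predictions then share one
    distribution across groups -- a sketch of Demographic Parity, not the paper's
    algorithm. Assumes distinct values within each group for simplicity."""
    by_g = {}
    for p, g in zip(preds, groups):
        by_g.setdefault(g, []).append(p)
    sorted_g = {g: sorted(v) for g, v in by_g.items()}
    out = []
    for p, g in zip(preds, groups):
        vals = sorted_g[g]
        rank = vals.index(p) / max(len(vals) - 1, 1)  # within-group rank in [0, 1]
        # average all group quantile functions at this common rank
        q = [vg[round(rank * (len(vg) - 1))] for vg in sorted_g.values()]
        out.append(sum(q) / len(q))
    return out
```

In this sketch the conformal margin is computed once on a held-out calibration split and added to every interval, while the barycenter step operates purely on the predicted quantiles, which is why it can be run as a post-processing pass after any quantile regression model.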

Author Information

Meichen Liu
Lei Ding (University of Alberta)
Dengdeng Yu (University of Texas at Arlington)
Wulong Liu (Huawei Noah's Ark Lab)
Linglong Kong (University of Alberta)
Bei Jiang (University of Alberta)