
Fairness Transferability Subject to Bounded Distribution Shift
Reilly Raab · Yatong Chen · Yang Liu

We study the \emph{transferability of fair predictors} (i.e., classifiers or regressors) under domain adaptation. Given a predictor that is “fair” on some \emph{source} distribution (of features and labels), is it still fair on a \emph{realized} distribution that differs? We first generalize common notions of static, statistical group-level fairness to a family of premetric functions that measure “induced disparity.” We quantify domain adaptation by bounding group-specific statistical divergences between the source and realized distributions. Next, we explore simplifying assumptions under which bounds on domain adaptation imply bounds on changes to induced disparity. We provide worked examples for two commonly used fairness definitions (i.e., demographic parity and equalized odds) and models of domain adaptation (i.e., covariate shift and label shift) that prove to be special cases of our general method. Finally, we validate our theoretical results with synthetic data.
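The phenomenon the abstract describes can be illustrated concretely: a predictor tuned to exhibit a small demographic-parity disparity on a source distribution can exhibit a different disparity once the feature distribution shifts. Below is a minimal synthetic sketch (not the paper's method): a fixed threshold predictor, Gaussian features whose means differ by group, and a covariate shift applied to the features. All distribution parameters here are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Synthetic source distribution: binary group, one feature per individual.
# The two groups have different feature means (an assumed setup, for illustration).
group = rng.integers(0, 2, size=n)
x_source = rng.normal(loc=0.5 * group, scale=1.0, size=n)

# A fixed threshold predictor, held constant across distributions.
predict = lambda x: (x > 0.0).astype(int)

def demographic_parity_gap(pred, group):
    """Absolute difference in positive-prediction rates between the two groups."""
    return abs(pred[group == 0].mean() - pred[group == 1].mean())

gap_source = demographic_parity_gap(predict(x_source), group)

# Covariate shift: the marginal feature distribution changes within each group,
# while the predictor stays the same (shift magnitudes are arbitrary choices).
x_shifted = rng.normal(loc=0.5 * group + 0.3, scale=1.2, size=n)
gap_shifted = demographic_parity_gap(predict(x_shifted), group)

print(f"DP gap on source distribution:  {gap_source:.3f}")
print(f"DP gap on shifted distribution: {gap_shifted:.3f}")
```

With these parameters the gap changes between the two distributions, which is the transferability question the paper formalizes: bounding how far the induced disparity can move when the divergence between source and realized distributions is itself bounded.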

Author Information

Reilly Raab (UC Santa Cruz)

My current research involves the dynamics of multiagent systems and the alignment of local incentives with global objectives. My background is in physics, with experience in scientific computing, signal processing, and electronics. I spent a few years between undergrad and grad school backpacking abroad, remotely developing software related to automated circuit design.

Yatong Chen (UC Santa Cruz)
Yang Liu (UC Santa Cruz)
