
Practical Approaches for Fair Learning with Multitype and Multivariate Sensitive Attributes
Tennison Liu · Alex Chan · Boris van Breugel · Mihaela van der Schaar

Sat Dec 03 09:15 AM -- 09:25 AM (PST)
Event URL: https://drive.google.com/file/d/1nRciwqP_yWM0MYpyLJwHR5vwxzNGGu5D/view?usp=sharing

It is important to guarantee that machine learning algorithms deployed in the real world do not result in unfairness or unintended social consequences. Fair ML has largely focused on the protection of a single attribute in the simpler setting where both attributes and target outcomes are binary. However, many real-world problems entail the simultaneous protection of multiple sensitive attributes, which are often not simply binary but continuous or categorical. To address this more challenging task, we introduce FairCOCCO, a fairness measure built on cross-covariance operators on reproducing kernel Hilbert spaces. This leads to two practical tools: first, the FairCOCCO Score, a normalised metric that can quantify fairness in settings with single or multiple sensitive attributes of arbitrary type; and second, a subsequent regularisation term that can be incorporated into arbitrary learning objectives to obtain fair predictors. These contributions address crucial gaps in the algorithmic fairness literature, and we empirically demonstrate consistent improvements against state-of-the-art techniques in balancing predictive power and fairness on both synthetic and real-world datasets.
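The abstract describes a kernel-based dependence measure between model outputs and sensitive attributes. The exact FairCOCCO formulation is in the paper; as an illustration only, the following is a minimal HSIC-style sketch of a normalised kernel dependence score between predictions and (possibly multivariate, mixed-type) sensitive attributes. The function names, the RBF kernel choice, and the bandwidth parameter are all assumptions for illustration, not the authors' implementation.

```python
import numpy as np

def rbf_kernel(x, sigma=1.0):
    # Gram matrix of the Gaussian (RBF) kernel; x has shape (n, d).
    sq = np.sum(x ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * x @ x.T
    return np.exp(-d2 / (2.0 * sigma ** 2))

def centred(K):
    # Double-centre the Gram matrix: HKH with H = I - (1/n) 11^T.
    n = K.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n
    return H @ K @ H

def kernel_dependence_score(preds, sensitive, sigma=1.0):
    """Normalised HSIC-style dependence between predictions and sensitive
    attributes. Near 0 when they are independent; 1 for identical inputs.
    Both arguments can be continuous, binary, or one-hot-encoded
    categorical arrays of shape (n,) or (n, d)."""
    n = len(preds)
    Y = np.asarray(preds, dtype=float).reshape(n, -1)
    S = np.asarray(sensitive, dtype=float).reshape(n, -1)
    Ky = centred(rbf_kernel(Y, sigma))
    Ks = centred(rbf_kernel(S, sigma))
    hsic = np.trace(Ky @ Ks) / n ** 2
    norm = np.sqrt(np.trace(Ky @ Ky) * np.trace(Ks @ Ks)) / n ** 2
    return hsic / (norm + 1e-12)
```

A regulariser in the spirit of the paper's second contribution would add such a score, weighted by a trade-off coefficient, to the training loss, penalising statistical dependence between predictions and sensitive attributes.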

Author Information

Tennison Liu (University of Cambridge)
Alex Chan (University of Cambridge)
Boris van Breugel (University of Cambridge)
Mihaela van der Schaar (University of Cambridge)
