

Poster in Workshop: Algorithmic Fairness through the Lens of Causality and Privacy

Fairness of Interaction in Ranking under Exposure, Selection, and Trust Bias

Zohreh Ovaisi


Abstract: Ranking algorithms in online platforms serve not only users on the demand side, but also items on the supply side. Traditionally, ranking presents items in an order that maximizes their utility to users, by sorting them according to inferred relevance. The fairness of the interaction that different items receive as a result of such a ranking can have ethical, legal, and promotional implications. Interaction, however, is affected by various forms of bias, two of which have received considerable attention: exposure bias and selection bias. Exposure bias, also known as position or presentation bias, occurs because items at lower-ranked positions are less likely to be observed. Selection bias in the data occurs because interaction is not possible with items below an arbitrary cutoff position chosen by the front-end application at deployment time (i.e., showing only the top $k$ items). A less studied third form of bias, trust bias, is equally important, as it makes interaction depend on rank even after observation, by influencing the perceived relevance of items. This paper introduces a flexible fairness metric that captures interaction disparity in the presence of all three biases, and proposes a post-processing framework for trading off fairness and utility. The framework improves fairness toward items while maintaining user utility, and outperforms state-of-the-art fair ranking algorithms.
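
As a rough illustration of how the three biases compound, the sketch below computes the expected interaction each ranked item receives under a simple position-based model: an item can only be interacted with if it appears above the selection cutoff, is observed (exposure bias), and is then clicked with a rank-dependent probability (trust bias). The functional forms (logarithmically decaying exposure, linearly decaying trust) and all parameter values are hypothetical assumptions chosen for illustration, not the metric or framework proposed in the paper.

import numpy as np

def expected_interaction(relevance, k, exposure, trust):
    """Expected interaction per item under a simple position-based model.

    relevance : true relevance of each item, ordered by rank (index 0 = top).
    k         : selection cutoff; items ranked at or below position k receive
                no interaction (selection bias).
    exposure  : probability of observing each rank (exposure/position bias).
    trust     : rank-dependent click probability given observation and
                relevance (trust bias: perceived relevance depends on rank).
    """
    n = len(relevance)
    interaction = np.zeros(n)
    for rank in range(min(k, n)):  # only the top-k items can be interacted with
        interaction[rank] = exposure[rank] * trust[rank] * relevance[rank]
    return interaction

# Illustrative parameters (hypothetical, not taken from the paper)
relevance = np.array([0.9, 0.8, 0.7, 0.6, 0.5])
exposure = 1.0 / np.log2(np.arange(2, 7))   # observation probability decays with rank
trust = np.linspace(1.0, 0.6, 5)            # perceived relevance decays with rank
print(expected_interaction(relevance, k=3, exposure=exposure, trust=trust))

Comparing the resulting interaction vector against a merit-based target (e.g., interaction proportional to relevance) gives one simple way to see the disparity that such a fairness metric would need to capture.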
