Poster
Estimation Bias in Multi-Armed Bandit Algorithms for Search Advertising
Min Xu · Tao Qin · Tie-Yan Liu

Sun Dec 08 02:00 PM -- 06:00 PM (PST) @ Harrah's Special Events Center, 2nd Floor

In search advertising, the search engine needs to select the most profitable advertisements to display, which can be formulated as an instance of online learning with partial feedback, also known as the stochastic multi-armed bandit (MAB) problem. In this paper, we show that the naive application of MAB algorithms to advertisement selection in search advertising produces two kinds of bias: sample selection bias, which harms the search engine by decreasing expected revenue, and “estimation of the largest mean” (ELM) bias, which harms the advertisers by increasing game-theoretic player-regret. We then propose simple bias-correction methods with benefits to both the search engine and the advertisers.
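
As a minimal illustration of the ELM bias mentioned above, the sketch below (not taken from the paper; the click-through rates, pull counts, and use of NumPy are illustrative assumptions) simulates several arms with identical true means and shows that reporting the largest empirical mean systematically overestimates the best arm's true value.

```python
import numpy as np

# Illustrative sketch only, not the paper's algorithm or data: selecting the
# arm with the highest empirical mean ("estimation of the largest mean")
# yields an upward-biased estimate even when all arms are identical.
rng = np.random.default_rng(0)
true_means = np.array([0.05, 0.05, 0.05])   # hypothetical CTRs, all equal
n_pulls, n_trials = 100, 20000

naive_estimates = []
for _ in range(n_trials):
    # Pull every arm n_pulls times and keep the largest empirical mean,
    # as a naive ad-selection rule would.
    emp_means = rng.binomial(n_pulls, true_means) / n_pulls
    naive_estimates.append(emp_means.max())

print("true mean of best arm :", true_means.max())          # 0.05
print("mean of max estimate  :", np.mean(naive_estimates))  # noticeably > 0.05
```

Here every arm has the same true mean of 0.05, yet the expected value of the maximum empirical mean is noticeably larger; correcting this kind of estimation bias is what the proposed methods target.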

Author Information

Min Xu (CMU)
Tao Qin (Microsoft Research)
Tie-Yan Liu (Microsoft Research)