

Poster

PSL: Rethinking and Improving Softmax Loss from Pairwise Perspective for Recommendation

Weiqin Yang · Jiawei Chen · Xin Xin · Sheng Zhou · Binbin Hu · Yan Feng · Chun Chen · Can Wang

East Exhibit Hall A-C #1003
Fri 13 Dec 4:30 p.m. PST — 7:30 p.m. PST

Abstract:

Softmax Loss (SL) is widely applied in Recommender Systems (RS) and has demonstrated strong effectiveness. This work analyzes SL from a pairwise perspective, revealing two significant limitations: 1) the relationship between SL and conventional ranking metrics such as DCG is not sufficiently tight; 2) SL is highly sensitive to false negative instances. Our analysis indicates that these limitations are primarily due to the use of the exponential function. To address these issues, this work extends SL to a new family of loss functions, termed Pairwise Softmax Loss (PSL), which replaces the exponential function in SL with other appropriate activation functions. While the revision is minor, we highlight three merits of PSL: 1) it serves as a tighter surrogate for DCG with suitable activations; 2) it better balances data contributions; and 3) it acts as a specific BPR loss enhanced by Distributionally Robust Optimization (DRO). We further validate the effectiveness and robustness of PSL through empirical experiments.
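As a rough illustration of the idea described in the abstract, the sketch below implements a sampled softmax-style recommendation loss in which the exponential applied to pairwise score gaps can be swapped for another activation. The function name, tensor shapes, and the softplus example are illustrative assumptions based only on the abstract's description; the paper's exact PSL formulation and its recommended activations may differ.

```python
import torch
import torch.nn.functional as F


def pairwise_style_softmax_loss(pos_scores, neg_scores, activation=torch.exp):
    """Sketch of a softmax-style loss viewed pairwise (assumed, not the paper's exact form).

    Args:
        pos_scores: (batch,) scores of each user's positive item.
        neg_scores: (batch, num_neg) scores of sampled negative items.
        activation: element-wise function applied to pairwise score gaps;
                    torch.exp recovers the standard sampled Softmax Loss.
    """
    # Pairwise gaps d = s(u, j_neg) - s(u, i_pos) for every sampled negative.
    gaps = neg_scores - pos_scores.unsqueeze(1)
    # Standard SL can be written as log(1 + sum_j exp(gap_j)); here the
    # exponential is replaced by a pluggable activation, mirroring the
    # abstract's description of PSL as an activation-swapped SL.
    return torch.log1p(activation(gaps).sum(dim=1)).mean()


# Usage sketch with random scores; softplus is a hypothetical non-negative
# alternative activation, not necessarily one the paper proposes.
pos = torch.randn(32)
neg = torch.randn(32, 100)
sl_loss = pairwise_style_softmax_loss(pos, neg)                      # plain SL
alt_loss = pairwise_style_softmax_loss(pos, neg, activation=F.softplus)
```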
