Authors:
Andre Sacilotti, Rodrigo Souza and Marcelo G. Manzato
Affiliation:
Mathematics and Computer Science Institute, University of São Paulo, Av. Trab. Sancarlense 400, São Carlos-SP, Brazil
Keyword(s):
Recommender System, Popularity Bias, Fairness, Calibration.
Abstract:
Calibration is one approach to dealing with unfairness and popularity bias in recommender systems. While popularity bias can shift users towards consuming more mainstream items, unfairness can harm certain users by not recommending items according to their preferences. However, most state-of-the-art works on calibration focus only on providing fairer recommendations to users, without considering popularity bias, which can amplify the long-tail effect. To fill this research gap, we propose a calibration approach that aims to meet users' interests across different levels of item popularity. In addition, the system seeks to reduce popularity bias and increase the diversity of recommended items. The proposed method works as a post-processing step and was evaluated in an offline experiment on two different datasets, using metrics that analyze aspects of fairness, popularity, and accuracy. Its effectiveness was validated with three different recommendation algorithms, verifying which one behaves best and comparing performance with four other state-of-the-art calibration approaches. As a result, the proposed technique reduced popularity bias and increased diversity and fairness on both datasets considered.
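The abstract describes a post-processing calibration step that aligns recommendations with a user's interests across item-popularity levels. As a rough illustration only (the paper's exact formulation, weighting, and metrics are not given here), such calibration is commonly implemented as a greedy re-ranking that trades off relevance against the divergence between the popularity-level distribution of the user's profile and that of the recommended list. Every name, the head/mid/tail bin split, and the trade-off weight `lam` below are assumptions for this sketch, not the authors' method.

```python
# Minimal sketch of popularity-calibrated greedy re-ranking (post-processing).
# Bin thresholds, function names, and the trade-off weight are illustrative
# assumptions; the paper's actual formulation may differ.
from collections import Counter
import math

def popularity_bin(item_pop, thresholds=(0.2, 0.8)):
    """Map an item's popularity quantile to a tail/mid/head bin (assumed split)."""
    if item_pop >= thresholds[1]:
        return "head"
    if item_pop >= thresholds[0]:
        return "mid"
    return "tail"

def bin_distribution(items, pop):
    """Normalized distribution of popularity bins over a list of item ids."""
    counts = Counter(popularity_bin(pop[i]) for i in items)
    total = sum(counts.values())
    return {b: counts[b] / total for b in ("head", "mid", "tail")}

def kl_divergence(p, q, eps=1e-9):
    """KL(p || q) with smoothing to avoid zero probabilities."""
    return sum(p.get(b, 0.0) * math.log((p.get(b, 0.0) + eps) / (q.get(b, 0.0) + eps))
               for b in ("head", "mid", "tail"))

def calibrated_rerank(candidates, scores, profile_items, pop, k=10, lam=0.5):
    """Greedily build a top-k list that balances predicted relevance against
    divergence from the user's popularity-bin distribution."""
    target = bin_distribution(profile_items, pop)
    selected, remaining = [], list(candidates)
    while len(selected) < k and remaining:
        best, best_val = None, -float("inf")
        for item in remaining:
            trial = selected + [item]
            val = (1 - lam) * sum(scores[i] for i in trial) \
                  - lam * kl_divergence(target, bin_distribution(trial, pop))
            if val > best_val:
                best, best_val = item, val
        selected.append(best)
        remaining.remove(best)
    return selected
```

Increasing `lam` pushes the recommended list's popularity mix closer to the user's historical mix (which tends to raise tail-item exposure and diversity), while decreasing it favors raw relevance; this is the accuracy/calibration trade-off the abstract's evaluation metrics would probe.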