Proceedings:
Vol. 14 (2020): Fourteenth International AAAI Conference on Web and Social Media
Issue:
Vol. 14 (2020): Fourteenth International AAAI Conference on Web and Social Media
Track:
Full Papers
Abstract:
Evaluating (and mitigating) the potential negative effects of algorithms has become a central issue in computer science. While research on algorithmic bias in ranking systems has dealt with the disparate exposure of products or individuals, less attention has been devoted to the analysis of the disparate exposure of subgroups of online users. In this paper, we investigate the visibility of minorities in people recommender systems in social networks. Specifically, we consider a bi-populated social network, i.e., a graph whose nodes belong to two different groups (majority and minority), and, by applying state-of-the-art people recommenders, we analyze how disparate visibility can be amplified or mitigated by different levels of homophily within each subgroup. We start our analysis on real-world social graphs, where the two subgroups are defined by sensitive demographic attributes such as gender or age. Our findings suggest that the way and the extent to which people recommenders produce disparate visibility for the two subgroups might depend in large part on the level of homophily within the subgroups. To verify these findings, we move our analysis to synthetic datasets, where we can control characteristics of the input social graph, such as the size of the minority and the level of homophily. Our results show that homophily plays a key role in promoting or reducing visibility for different subgroups under various combinations of dataset characteristics and recommendation algorithms.
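The sketch below illustrates the kind of experiment the abstract describes: generate a synthetic bi-populated graph with a tunable homophily level, run a simple people recommender, and measure how visible the minority group is in the recommendation slots. The stochastic-block-model generator, the common-neighbours recommender, and the visibility metric are illustrative assumptions for this sketch, not the paper's exact generators, algorithms, or metric definitions.

```python
import random
from collections import Counter

import networkx as nx


def make_bipopulated_graph(n=1000, minority_frac=0.2, homophily=0.8,
                           avg_deg=10, seed=0):
    """Synthetic bi-populated graph: two groups, homophily knob in [0, 1]."""
    n_min = int(n * minority_frac)
    sizes = [n - n_min, n_min]                 # [majority, minority]
    p = avg_deg / n                            # overall edge density target
    p_in, p_out = 2 * p * homophily, 2 * p * (1 - homophily)
    G = nx.stochastic_block_model(sizes, [[p_in, p_out], [p_out, p_in]],
                                  seed=seed)
    group = {v: ("majority" if v < sizes[0] else "minority") for v in G}
    return G, group


def recommend(G, u, k=5):
    """Common-neighbours recommender: a simple stand-in for the
    state-of-the-art people recommenders evaluated in the paper."""
    scores = Counter()
    for w in G.neighbors(u):
        for v in G.neighbors(w):
            if v != u and not G.has_edge(u, v):
                scores[v] += 1
    return [v for v, _ in scores.most_common(k)]


def minority_visibility(G, group, k=5):
    """Share of minority nodes among all recommended slots; comparing this
    with the minority's population share gives one notion of disparate
    visibility (an assumed metric for illustration)."""
    recs = [v for u in G for v in recommend(G, u, k)]
    return sum(group[v] == "minority" for v in recs) / len(recs) if recs else 0.0


if __name__ == "__main__":
    for h in (0.2, 0.5, 0.8):
        G, group = make_bipopulated_graph(homophily=h)
        vis = minority_visibility(G, group)
        print(f"homophily={h:.1f}  minority share in recommendations={vis:.3f}")
```

Sweeping the homophily knob (and the minority size) in this way mirrors the synthetic-data analysis described in the abstract, showing how the minority's share of recommendation slots can fall above or below its population share depending on how homophilic each subgroup is.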
DOI:
10.1609/icwsm.v14i1.7288