Social networks and news outlets entrust content curation to specialised algorithms from the broad family of recommender systems. Companies attempt to increase engagement by connecting users with ideas they are likely to agree with. Eli Pariser, who coined the term filter bubble, suggested that this may come at the price of narrowing users' outlook. Although empirical studies on algorithmic recommendation have shown no reduction in diversity, these algorithms remain a source of concern amid increasing societal polarisation of opinions. Diversity has been widely discussed in the literature, but little attention has been paid to the dynamics of user opinions under the influence of algorithmic curation and social network interaction.
This paper describes our empirical research using an agent-based modelling (ABM) approach to simulate users' emergent behaviour and track their opinions as they receive news from news outlets and social networks. We investigate under which circumstances algorithmic filtering and social network dynamics shift users away from their innate opinions, and which interventions can mitigate this effect.
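As a rough illustration of the kind of agent the simulation tracks (not the paper's exact model), the sketch below shows an agent holding an innate and a current opinion; reading an item within its latitude of acceptance nudges the current opinion while remaining anchored to the innate one. All parameter names and values here are illustrative assumptions.

```python
from dataclasses import dataclass
import random

@dataclass
class Agent:
    innate: float                 # opinion the user starts with, in [-1, 1]
    current: float                # opinion after exposure to items
    latitude: float = 0.3         # latitude of acceptance around the current opinion
    susceptibility: float = 0.1   # how strongly an accepted item shifts the opinion

    def read(self, item_opinion: float) -> None:
        """Shift the current opinion only if the item falls within the latitude of acceptance."""
        if abs(item_opinion - self.current) <= self.latitude:
            shifted = self.current + self.susceptibility * (item_opinion - self.current)
            # anchor to the innate opinion so the starting view is never fully abandoned
            self.current = 0.9 * shifted + 0.1 * self.innate

def step(agents: list[Agent], items: list[float]) -> None:
    """One simulation step: every agent reads a randomly selected item."""
    for agent in agents:
        agent.read(random.choice(items))
```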
The simulation confirmed that an environment curated by a recommender system did not reduce diversity. The same outcome was observed in a simple social network where items are shared among users. However, opinions were less susceptible to change: the difference between users' current and innate opinions was smaller than in an environment in which users selected items at random. Finally, we propose a modification to the collaborative filtering algorithm that selects items at the boundary of users' latitude of acceptance, increasing the chances of challenging their opinions.
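A minimal sketch of the proposed selection rule, under the assumption that the collaborative filter supplies candidate items with a predicted position on a single opinion axis: instead of recommending the item closest to the user's current opinion, the rule prefers the accepted item nearest the edge of the latitude of acceptance. The function name and the scalar opinion representation are illustrative, not the paper's exact implementation.

```python
def select_challenging_item(current_opinion: float, latitude: float,
                            candidates: list[float]) -> float | None:
    """Pick the candidate item closest to the boundary of the latitude of acceptance."""
    # keep only items the user would still accept
    accepted = [c for c in candidates if abs(c - current_opinion) <= latitude]
    if not accepted:
        return None
    # among accepted items, choose the one farthest from the current opinion,
    # i.e. the one most likely to challenge it while remaining acceptable
    return max(accepted, key=lambda c: abs(c - current_opinion))
```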