Recommenders play a significant role in our daily lives, making decisions for users on a regular basis. Their widespread adoption necessitates a thorough examination of how users interact with recommenders and the algorithms that drive them. An important form of interaction in these systems is algorithmic affordances: means that provide users with perceptible control over the algorithm by, for instance, providing context (‘find a movie for this profile’), weighing criteria (‘most important is the main actor’), or evaluating results (‘loved this movie’). The assumption is that these algorithmic affordances affect interaction qualities such as transparency, trust, autonomy, and serendipity, and, as a result, the user experience. Currently, the precise nature of the relation between algorithmic affordances, their specific implementations in the interface, interaction qualities, and user experience remains unclear. Subjects discussed during the workshop therefore include, but are not limited to: the impact of algorithmic affordances and their implementations on interaction qualities; the balance between cognitive overload and transparency in recommender interfaces containing algorithmic affordances; and reasons why research into these types of interfaces sometimes fails to cross the research-practice gap and does not land in design practice. As a potential solution, the workshop committee proposes a library of example algorithmic affordance design patterns and their implementations in recommender interfaces, enriched with academic research concerning their impact. The final part of the workshop will be dedicated to formulating guiding principles for such a library.
Algorithmic affordances are defined as user interaction mechanisms that give users tangible control over AI algorithms, such as recommender systems. Designing such algorithmic affordances, including assessing their impact, is not straightforward, and practitioners state that they lack resources to design adequately for the interfaces of AI systems. This could be remedied by creating a comprehensive pattern library of algorithmic affordances. This library should provide easy access to patterns, supported by live examples and research on their experiential impact and limitations of use. The Algorithmic Affordances in Recommender Interfaces workshop aimed to address key challenges related to building such a pattern library, including pattern identification features, a framework for systematic impact evaluation, and understanding the interaction between algorithmic affordances and their context of use, especially in education or with users with low algorithmic literacy. Preliminary solutions were proposed for these challenges.
In flexible education, recommender systems that support course selection are considered a viable means of helping students make informed course choices, especially where curricula offer greater flexibility. However, these recommender systems present both potential benefits and looming risks, such as overdependence on technology, biased recommendations, and privacy issues. User control mechanisms in recommender interfaces (or algorithmic affordances) might offer options to address those risks, but they have not yet been systematically studied. This paper presents the outcomes of a design session conducted during the INTERACT23 workshop on Algorithmic Affordances in Recommender Interfaces. This design session yielded insights into how the design of an interface, and specifically the algorithmic affordances in that interface, may address the ethical risks and dilemmas of using a recommender in such an impactful context with potentially vulnerable users. Through design and reflection, we discovered a host of design ideas for the interface of a flexible education recommender that can serve as conversation starters for practitioners implementing flexible education. More research is needed to explore these design directions and to gain insight into how they can help approximate more ethically operating recommender systems.