© 2025 SURF
This extended abstract introduces the work of the Netherlands AI Media and Democracy lab, focusing in particular on the research performed from an AI/computer science perspective at CWI, the Netherlands National Research Center for Mathematics and Computer Science in Amsterdam. We first provide an overview of the general aims and set-up of the lab, and then focus on the three research groups at CWI, outlining their areas of research and their expected contributions at the intersection of AI and media & democracy.
This paper discusses the positioning of higher education in the information or so-called network society. As part of a broader PhD research project into media literacy and students’ success in higher education, this theoretical disquisition proposes links between information problem-solving skills (IPS skills), students’ success, social media, and the position of student assignments and higher education in society. First, I explain, using pedagogical theories, that when researching students’ success in higher education in contemporary society, it is important to know why and how students use social media. Secondly, the necessity of IPS skills is discussed, along with their challenges and difficulties. Not only the skills of searching for reliable and useful information are addressed, but also the construction of the Internet and the way parts of it work, in particular the filter bubble. Thirdly, network theory is used to analyse the role of social media (in the present case, Facebook) in higher education. Ultimately, this paper complements the pedagogical theory on students’ success in contemporary society. Furthermore, distinguishing education as a distinctive field within the network society will sharpen network theory.
Presented at the opening of the Digi Conference of the Bundeszentrale für politische Bildung (Federal Agency for Civic Education) at the Mousonturm in Frankfurt am Main, 5 March 2020.
Stricter environmental policies, increased energy prices and depletion of resources are forcing industries to look for bio-based, low-carbon-footprint products. For industry, flax is an interesting resource since it is light, strong, environmentally friendly and renewable. Turning flax plants into fibre products involves biochemical and mechanical processes, and production and processing costs have to compete with those of other products, such as petroleum-based materials. This research focuses on sustainable process improvement from flax plant to fibre production. Flax retting is a biological process in which mainly pectin is removed. Without retting, the desired fibre remains attached to the wooden core of the flax stem; as a result, the flax fibres cannot be recovered, or are of low quality. After retting, the fibres are released from the wooden core. Machines have been introduced in the flax production process, but the best-quality fibres are still produced manually. Due to the high labour intensity the process is too expensive and needs to be economically optimised. Since the retting process determines all downstream processes, retting is the first step to focus on. Lab-scale experiments were performed to investigate the retting process, examining low-cost processing conditions such as temperature, pH, dew retting and water retting. The retting rate was low: around three weeks for complete retting. The best retting conditions were at 20 °C with water and without any addition of chemicals. The process could be shortened to two weeks by recycling the water phase. In a scale-up experiment, a rotating drum was used at the optimal conditions from the lab experiments (20 °C and water). At first the flax did not mix with the water in the rotating drum; it was too rigid and did not tumble. Therefore, bundles of flax plants were used.
The inner core of the bundle seemed to be protected, and its retting rate was lower than that of the flax on the surface of the bundle. This implies that mechanical impact increased retting in the rotating drum; however, heterogeneous retting should be avoided. To overcome this problem, retting was performed in a water column in which mixing was accomplished by bubbling air. As a result of the mixing, the flax bundle was retted homogeneously, and after drying it was possible to separate the fibres from the wooden flax core. A bubble column therefore seems to be a usable retting process step. Water samples from the lab-scale experiments, the rotating drum and the bubble column showed a chemical oxygen demand (COD) content of up to 4 g/L; overall, 1 kg of flax resulted in 40 g COD. This indicates the possibility of producing biogas that can be used for generating heat and electricity, making the process more sustainable. Around 50% of the flax weight consists of wooden shives. The shives can be used for pyrolysis, which yielded around 30% coal and 20% oil. These compounds can be used as building blocks, but also to generate heat and electricity for flax processing. The shives were only dried for one day at 105 °C and slow pyrolysis was used, which indicates that a higher yield can be expected with fast pyrolysis. Overall, the reported results indicate that quality fibre production from flax can be a feasible, sustainable and renewable production process.
Feasibility of the process can be obtained by (1) retting at low-cost process conditions of 20 °C, using water without any addition of chemicals, (2) increasing the flax retting rate by recycling water, (3) increasing the flax retting rate by introducing mixing forces, which also lowers the energy consumption of the overall process, (4) producing biogas from the COD with anaerobic digestion, and (5) producing pyrolysis oil and pyrolysis coal.
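As a back-of-the-envelope illustration of the biogas potential implied by the reported COD figure: assuming the standard theoretical methane yield of about 0.35 L CH4 per g COD and a methane lower heating value of roughly 35.8 MJ/Nm³ (both textbook values, not from this work, and an upper bound on what real anaerobic digestion recovers), the reported 40 g COD per kg flax corresponds to:

```python
# Back-of-the-envelope biogas potential from the reported COD yield.
# Assumed constants (not from the abstract): theoretical methane yield of
# 0.35 L CH4 per g COD and a methane lower heating value of ~0.0358 MJ/L.

COD_PER_KG_FLAX_G = 40.0   # g COD per kg flax (reported value)
CH4_PER_G_COD_L = 0.35     # L CH4 per g COD, theoretical maximum (assumed)
CH4_LHV_MJ_PER_L = 0.0358  # lower heating value of methane (assumed)

def methane_potential(flax_kg: float) -> tuple[float, float]:
    """Return (litres CH4, MJ) theoretically obtainable from `flax_kg` of flax."""
    cod_g = flax_kg * COD_PER_KG_FLAX_G
    ch4_l = cod_g * CH4_PER_G_COD_L
    return ch4_l, ch4_l * CH4_LHV_MJ_PER_L

litres, energy = methane_potential(1000.0)  # one tonne of processed flax
print(f"{litres:.0f} L CH4, ~{energy:.0f} MJ")
```

So each kilogram of flax yields on the order of 14 L of methane at the theoretical maximum, a rough ceiling on the heat and electricity recoverable for the process.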
Light scattering is a fundamental property that can be exploited to create essential devices such as particle analysers. The most common particle size analyser relies on measuring the angle-dependent diffracted light from a sample illuminated by a laser beam. Compared to other non-light-based counterparts, such a laser diffraction scheme offers precision, but it does so at the expense of size, complexity and cost. In this paper, we introduce the concept of a new particle size analyser in a collimated beam configuration using a consumer electronic camera and machine learning. The key novelty is a small form factor angular spatial filter that allows for the collection of light scattered by the particles up to predefined discrete angles. The filter is combined with a light-emitting diode and a complementary metal-oxide-semiconductor image sensor array to acquire angularly resolved scattering images. From these images, a machine learning model predicts the volume median diameter of the particles. To validate the proposed device, glass beads with diameters ranging from 13 to 125 µm were measured in suspension at several concentrations. We were able to correct for multiple scattering effects and predict the particle size with mean absolute percentage errors of 5.09% and 2.5% for the cases without and with concentration as an input parameter, respectively. When only spherical particles were analysed, the former error was significantly reduced (0.72%). Given that it is compact (on the order of ten cm) and built with low-cost consumer electronics, the newly designed particle size analyser has significant potential for use outside a standard laboratory, for example, in online and in-line industrial process monitoring.
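The abstract does not disclose the machine learning model used; purely as an illustration of the general idea, the sketch below regresses a particle diameter from a vector of per-angle scattered intensities. The synthetic forward-scatter profile, the angular zones and the least-squares model are all assumptions, not the authors' method.

```python
import numpy as np

# Illustrative sketch (not the paper's model): map per-angle scattered
# intensities to a volume median diameter by ordinary least squares.
# Each feature stands in for the mean intensity behind one discrete
# angular zone of the spatial filter; real data would come from the
# CMOS images described in the abstract.

def synthetic_pattern(diameter_um: float, n_angles: int = 8) -> np.ndarray:
    """Toy angular profile: larger particles scatter more into forward angles."""
    angles = np.linspace(0.5, 8.0, n_angles)       # degrees, assumed zones
    return np.exp(-angles * diameter_um / 100.0)   # toy forward-scatter decay

diameters = np.linspace(13, 125, 50)               # µm, range from the abstract
X = np.stack([synthetic_pattern(d) for d in diameters])
X = np.hstack([X, np.ones((len(X), 1))])           # bias column
coef, *_ = np.linalg.lstsq(X, diameters, rcond=None)

pred = X @ coef
mape = np.mean(np.abs(pred - diameters) / diameters) * 100
print(f"MAPE on synthetic data: {mape:.2f}%")
```

In practice a nonlinear model trained on measured scattering images would replace the linear fit, but the input/output structure stays the same: angularly resolved intensities in, a median diameter out.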
Introducing a hyperbolic vortex into a showerhead makes it possible to achieve higher spray velocities for a given discharge without reducing the nozzle diameter. Because the vortex introduces air bubbles into the water, the spray is pushed from a transition (dripping-faucet) regime into a jetting regime, which results in higher droplet and jet velocities for the same nozzle diameter and throughput, while the droplet and jet diameters remain the same as for a showerhead without a vortex. Assuming that the satisfaction of a shower experience depends largely on droplet size and velocity, implementing a vortex in the showerhead could provide the same shower experience with 14% less water consumption than a normal showerhead. A full optical and physical analysis is presented, and the important chemical parameters were investigated.
Social networks and news outlets use recommender systems to distribute information and suggest news to their users. These algorithms are an attractive solution for dealing with the massive amount of content on the web [6]. However, some organisations prioritise retention and maximisation of the number of accesses, which can be incompatible with values like diversity of content and transparency. In recent years, critics have warned of the dangers of algorithmic curation. The term filter bubble, coined by the internet activist Eli Pariser [1], describes the outcome of pre-selected personalisation, in which users are trapped in a bubble of similar content. Pariser warns that it is not the user but the algorithm that curates and selects interesting topics to watch or read. Still, there is disagreement about the consequences for individuals and society, and research on the existence of filter bubbles is inconclusive. Fletcher [5] claims that the term filter bubble is an oversimplification of a much more complex system involving cognitive processes and social and technological interactions, and most empirical studies indicate that algorithmic recommendations have not locked large segments of the audience into bubbles [3][6]. We built an agent-based simulation tool to study the dynamic and complex interplay between individual choices and social and technological interaction. The model includes different recommendation algorithms and a range of cognitive filters that can simulate different social network dynamics; the cognitive filters are based on the triple-filter bubble model [2]. The tool can be used to understand under which circumstances algorithmic filtering and social network dynamics affect users' innate opinions, and which interventions on recommender systems can mitigate adverse side effects such as the presence of filter bubbles.
The resulting tool is an open-source interactive web interface that allows simulation with different parameters, such as users' characteristics, social networks and recommender system settings (see Fig. 1). The ABM, implemented in Python with Mesa [4], allows users to visualise, compare and analyse the consequences of combining various factors. Experimental results are similar to those published in the triple-filter bubble paper [2]. The novelty is the option to use a real collaborative-filtering recommender system, and a new metric to measure the distance between users' innate and final opinions. We observed that slight modifications to the recommender system, exposing items within the boundaries of users' latitude of acceptance, could increase content diversity.

References
1. Pariser, E.: The Filter Bubble: What the Internet Is Hiding from You. Penguin, New York, NY (2011)
2. Geschke, D., Lorenz, J., Holtz, P.: The triple-filter bubble: Using agent-based modelling to test a meta-theoretical framework for the emergence of filter bubbles and echo chambers. British Journal of Social Psychology 58, 129–149 (2019)
3. Möller, J., Trilling, D., Helberger, N., van Es, B.: Do not blame it on the algorithm: An empirical assessment of multiple recommender systems and their impact on content diversity. Information, Communication and Society 21(7), 959–977 (2018)
4. Mesa: Agent-based modeling in Python, https://mesa.readthedocs.io/. Last accessed 2 Sep 2022
5. Fletcher, R.: The truth behind filter bubbles: Bursting some myths. Digital News Report, Reuters Institute (2020). https://reutersinstitute.politics.ox.ac.uk/news/truth-behind-filter-bubbles-bursting-some-myths. Last accessed 2 Sep 2022
6. Haim, M., Graefe, A., Brosius, H.: Burst of the filter bubble? Effects of personalization on the diversity of Google News. Digital Journalism 6(3), 330–343 (2018)
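The Mesa-based simulator itself is not reproduced here; the framework-free Python sketch below only illustrates the kind of bounded-confidence dynamics such cognitive filters implement, with agents assimilating recommended items only when they fall inside their latitude of acceptance. All parameter values, names and the update rule are illustrative assumptions, not taken from the tool.

```python
import random

# Toy sketch of bounded-confidence opinion dynamics (not the lab's Mesa
# implementation): agents hold an opinion in [-1, 1] and only move towards
# recommended items that fall within their latitude of acceptance.
# LATITUDE and STEP are assumed values for illustration only.

LATITUDE = 0.3  # half-width of the latitude of acceptance (assumed)
STEP = 0.1      # how far an accepted item shifts the opinion (assumed)

def step(opinions: list[float], items: list[float]) -> list[float]:
    """One round: each agent sees one random item and shifts if acceptable."""
    updated = []
    for o in opinions:
        item = random.choice(items)
        if abs(item - o) <= LATITUDE:  # inside the latitude of acceptance
            o += STEP * (item - o)     # partial assimilation towards the item
        updated.append(max(-1.0, min(1.0, o)))
    return updated

random.seed(42)
opinions = [random.uniform(-1, 1) for _ in range(100)]
items = [random.uniform(-1, 1) for _ in range(20)]
for _ in range(200):
    opinions = step(opinions, items)
```

Swapping the random item choice for a recommender's ranked output is the point where algorithmic curation enters this loop, which is what the tool lets users experiment with.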
Social networks and news outlets entrust content curation to specialised algorithms from the broad family of recommender systems. Companies attempt to increase engagement by connecting users with ideas they are more likely to agree with. Eli Pariser, who coined the term filter bubble, suggested that this might come at the price of narrowing users' outlook. Although empirical studies on algorithmic recommendation have shown no reduction in diversity, these algorithms remain a source of concern given the increased societal polarisation of opinions. Diversity has been widely discussed in the literature, but little attention has been paid to the dynamics of user opinions when influenced by algorithmic curation and social network interaction. This paper describes our empirical research using an agent-based modelling (ABM) approach to simulate users' emergent behaviour and track their opinions when they get news from news outlets and social networks. We address under which circumstances algorithmic filtering and social network dynamics affect users' innate opinions, and which interventions can mitigate the effect. The simulation confirmed that an environment curated by a recommender system did not reduce diversity, and the same outcome was observed in a simple social network with items shared among users. However, opinions were less susceptible to change: the difference between users' current and innate opinions was lower than in an environment where users select items at random. Finally, we propose a modification to the collaborative filtering algorithm that selects items at the boundary of users' latitude of acceptance, increasing the chances of challenging users' opinions.
Algorithmic curation is a helpful solution for the massive amount of content on the web. The term describes how platforms automate the recommendation of content to users; news outlets, social networks and search engines widely use recommender systems. Such automation has led to worries about selective exposure and its side effects. Echo chambers occur when we are over-exposed to news we like or agree with, distorting our perception of reality [1]. Filter bubbles arise when the information we dislike or disagree with is automatically filtered out, narrowing what we know [2]. While the idea of filter bubbles makes logical sense, the magnitude of the "filter bubble effect" in reducing diversity has been questioned [3]. Most empirical studies indicate that algorithmic recommendations have not locked large audience segments into bubbles [4]. However, little attention has been paid to the interplay between technological, social and cognitive filters. We propose an agent-based model to simulate users' emergent behaviour and track their opinions when they get news from news outlets and social networks. The model aims to understand under which circumstances algorithmic filtering and social network dynamics affect users' innate opinions, and which interventions can mitigate the effect. Agent-based models simulate the behaviour of multiple individual agents forming a larger society; the behaviour of the individual agents can be elementary, yet the population's behaviour can be much more than the sum of its parts. We have designed different scenarios to analyse the factors contributing to the emergence of filter bubbles, including different recommendation algorithms and social network dynamics. The cognitive filters are based on the triple-filter bubble model [5].

References
1. Fletcher, R. (2020)
2. Pariser, E. (2012)
3. Chitra & Musco (2020)
4. Möller et al. (2018)
5. Geschke, D. et al. (2018)