The paper discusses the results of a case study on players' facial responses while playing a casual game. To do so, it measures these responses by recording facial EMG and by analyzing players' facial expressions using the Facial Action Coding System (FACS), and it investigates which of the two measurements is more effective for capturing emotional responses in casual games. The results show that playing a casual game elicits both visible facial expressions and facial EMG activity, and that both measurements are needed to gain a good understanding of players' emotional responses to casual game events.
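As a rough illustration of how facial EMG activity around game events could be quantified, the sketch below (in Python, with hypothetical channel data, sampling rate, and event times; not the authors' actual pipeline) band-pass filters the raw signal, rectifies it, extracts an amplitude envelope, and averages that envelope in a short window following each game event.

```python
# Minimal sketch (not the authors' pipeline) of quantifying facial EMG responses
# to game events. Sampling rate, channel, and event times are hypothetical.
import numpy as np
from scipy.signal import butter, filtfilt

FS = 1000  # assumed sampling rate in Hz

def emg_envelope(raw: np.ndarray, fs: int = FS) -> np.ndarray:
    """Band-pass (20-450 Hz), full-wave rectify, then low-pass to an amplitude envelope."""
    b, a = butter(4, [20 / (fs / 2), 450 / (fs / 2)], btype="band")
    filtered = filtfilt(b, a, raw)
    rectified = np.abs(filtered)
    b_lp, a_lp = butter(4, 5 / (fs / 2), btype="low")  # 5 Hz envelope smoothing
    return filtfilt(b_lp, a_lp, rectified)

def mean_response(envelope: np.ndarray, event_samples: list[int], window_s: float = 1.0) -> float:
    """Average envelope amplitude in a fixed window after each game event."""
    win = int(window_s * FS)
    epochs = [envelope[s:s + win] for s in event_samples if s + win <= len(envelope)]
    return float(np.mean([e.mean() for e in epochs]))

# Example with synthetic data:
signal = np.random.randn(60 * FS)   # one minute of fake corrugator EMG
events = [5000, 20000, 40000]       # hypothetical game-event onsets (in samples)
print(mean_response(emg_envelope(signal), events))
```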
Digital surveillance technologies using artificial intelligence (AI) tools such as computer vision and facial recognition are becoming cheaper and easier to integrate into governance practices worldwide. Morocco serves as an example of how such technologies are becoming key tools of governance in authoritarian contexts. Based on qualitative fieldwork including semi-structured interviews, observation, and extensive desk reviews, this chapter focuses on the role played by AI-enhanced technology in urban surveillance and the control of migration at the Moroccan–Spanish borders. Two cross-cutting issues emerge: first, while international donors provide funding for urban and border surveillance projects, their role in enforcing transparency mechanisms in their implementation remains limited; second, Morocco’s existing legal framework hinders any kind of public oversight. Video surveillance is treated as the sole prerogative of the security apparatus, and so far public actors have avoided engaging directly with the topic. The lack of institutional oversight and public debate on the matter raises serious concerns about the extent to which the deployment of such technologies affects citizens’ rights. AI-enhanced surveillance is thus an intrinsically transnational challenge in which private interests of economic gain and public interests of national security collide with citizens’ human rights across the Global North/Global South divide.
In daily interaction with horses, humans primarily rely on facial expression as a non-verbal equine cue for emotional information. Difficulties in correctly recognizing these signals might arise due to the species-specificity of facial cues, possibly leading to diminished equine welfare and health. This study aimed to explore human visual search patterns when assessing equine facial expressions indicative of various pain levels, utilizing eye-tracking technology. One hundred and eight individuals (N = 108), classified into three groups (affinity with horses (N = 60), pet owners with no affinity with horses (N = 32), and individuals with no affinity with animals (N = 16)), participated in the study; with their eye movements recorded using eye-tracking glasses, they evaluated four photos of horses with different levels of pain. Error scores, calculated by comparing participant scores to Gold Standard Visual Analogue Score levels, and fixation metrics (number of fixations and duration of fixations) were analysed across the four photos, participant groups, and Areas of Interest (AOIs): eyes, ears, nostrils, and mouth. Statistical analysis utilized linear mixed models. Findings highlighted the critical role of the eyes as key indicators of pain: all groups fixated on them longer and more frequently than on other facial features when assessing equine emotional states. Participants also showed a consistent pattern in how they looked at a horse's face, first focusing on the eyes, then the ears, and finally the nose/mouth region, indicating a horse-specific scan pattern. Moderate pain was assessed with similar accuracy across all groups, indicating that these signals are broadly recognizable. Nevertheless, non-equestrians had difficulty recognizing the absence of pain, possibly highlighting the role of experience in interpreting subtle equine expressions. The study's limitations, such as variability in assessment conditions, may have affected the findings. Future work could further investigate why humans follow this visual search pattern and whether they recognize the significance of a horse's ears. Additionally, emphasis should be placed on developing targeted training interventions to improve equine pain recognition, potentially benefiting equine welfare and health.
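A minimal sketch of the kind of linear mixed model analysis described above, assuming a long-format table of fixation metrics with fixed effects for group, photo, and AOI and a random intercept per participant; the column names and data below are hypothetical, not the study's dataset.

```python
# Sketch of a linear mixed model on fixation durations; synthetic data, hypothetical columns.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
participants = [f"p{i:02d}" for i in range(12)]
groups = ["equestrian"] * 4 + ["pet_owner"] * 4 + ["no_affinity"] * 4

rows = []
for p, g in zip(participants, groups):
    for photo in ["photo1", "photo2", "photo3", "photo4"]:
        for aoi in ["eyes", "ears", "nostrils", "mouth"]:
            base = 0.4 if aoi == "eyes" else 0.2   # eyes assumed to attract longer fixations
            rows.append({"participant": p, "group": g, "photo": photo, "aoi": aoi,
                         "fixation_duration": base + rng.normal(0, 0.05)})
data = pd.DataFrame(rows)

# Random intercept per participant; fixed effects for group, photo, and AOI
model = smf.mixedlm("fixation_duration ~ group + photo + aoi", data, groups=data["participant"])
print(model.fit().summary())
```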
We examined the neural correlates of facial attractiveness by presenting pictures of male or female faces (neutral expression) with low/intermediate/high attractiveness to 48 male or female participants while recording their electroencephalogram (EEG). Subjective attractiveness ratings were used to determine the 10% highest, 10% middlemost, and 10% lowest rated faces for each individual participant to allow for high-contrast comparisons. These were then split into preferred and dispreferred gender categories. ERP components P1, N1, P2, N2, early posterior negativity (EPN), P300 and late positive potential (LPP) (up until 3000 ms post-stimulus), and the face-specific N170 were analysed. A salience effect (attractive/unattractive > intermediate) in an early LPP interval (450–850 ms) and a long-lasting valence-related effect (attractive > unattractive) in a late LPP interval (1000–3000 ms) were elicited by the preferred gender faces but not by the dispreferred gender faces. Multivariate pattern analysis (MVPA) classifications on whole-brain single-trial EEG patterns further confirmed these salience and valence effects. It is concluded that facial attractiveness elicits neural responses that are indicative of valenced experiences, but only if these faces are considered relevant. These experiences take time to develop and last well beyond the interval that is commonly explored.
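The sketch below illustrates, on synthetic data, what a time-resolved MVPA classification of single-trial EEG patterns could look like: at each time point a cross-validated linear classifier separates attractive from unattractive trials based on the pattern of channel amplitudes. This is an assumption-laden illustration, not the authors' analysis code.

```python
# Sketch of time-resolved MVPA decoding on synthetic single-trial EEG data.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n_trials, n_channels, n_times = 200, 64, 50   # hypothetical dimensions
X = rng.normal(size=(n_trials, n_channels, n_times))
y = rng.integers(0, 2, size=n_trials)          # 0 = unattractive, 1 = attractive
X[y == 1, :, 30:] += 0.3                       # inject a late "valence" effect for illustration

clf = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
# At each time point, cross-validate the classifier on the channel pattern
scores = [cross_val_score(clf, X[:, :, t], y, cv=5).mean() for t in range(n_times)]
print("Peak decoding accuracy:", max(scores))
```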
Psychophysiological measurements have so far been used to express player experience quantitatively in game genres such as shooter games and racing games. However, these methods have not yet been applied to casual video games. From a development point of view, games produced in the casual sector of the games industry are characterized by very short production cycles, which makes them ill-suited for complex and lengthy psychophysiological testing regimes. This paper discusses some methodological innovations that led to the application of psychophysiological measurements to enhance the design of a commercially released casual game for the Apple iPad, called 'Gua-Le-Ni; or, The Horrendous Parade'. The game was tested at different stages of its development to dry-run a cycle of design improvements derived from psychophysiological data. The tests examined the correlation of stress levels and facial muscle contraction with in-game performance in order to establish whether 'Gua-Le-Ni' offered the cognitive challenge, the learning curve, and the enjoyment the designers had in mind for this product. In this paper, we discuss the changes that were made to the game and the data analysis that led to these changes.
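To illustrate the kind of correlation analysis mentioned above, the sketch below relates a per-session psychophysiological measure (e.g., mean facial EMG amplitude) to in-game performance using a Pearson correlation; the variable names and data are hypothetical and not taken from 'Gua-Le-Ni'.

```python
# Sketch of correlating a psychophysiological measure with in-game performance; fake data.
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(42)
n_sessions = 30
emg_amplitude = rng.normal(1.0, 0.2, n_sessions)                         # e.g. mean corrugator activity
game_score = 500 - 150 * emg_amplitude + rng.normal(0, 30, n_sessions)   # fabricated inverse relation

r, p = pearsonr(emg_amplitude, game_score)
print(f"r = {r:.2f}, p = {p:.3f}")  # a negative r would suggest higher tension, lower score
```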
There has been significant progress in the graphical realism of digital humans in recent years. This work investigates the realistic portrayal of emotions beyond facial expressions by analysing how skin colour changes when different emotional states are expressed. The study combines existing knowledge from old painters, photogrammetry data, thermal imaging, and skin colouration maps to create an artistic guideline for portraying emotions realistically, resulting in the proposal of a set of colour maps representing the six basic emotions. By using skin colour changes to represent emotional states, the proposed colour maps offer an alternative workflow for portraying emotions. In the experiment conducted for this research, four of the proposed colour maps, representing neutrality, anger, disgust, and happiness, were preferred over traditional alternatives in terms of perceived realism and likeability. The findings have implications for the development of digital human technology, particularly in the creation of more realistic and expressive digital characters.
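As an illustration of how such an emotion colour map might be applied in practice, the sketch below linearly blends a neutral skin albedo towards an emotion-specific map, weighted by a per-pixel mask and a global intensity; the blend model and values are illustrative assumptions, not the authors' workflow.

```python
# Sketch of blending a neutral skin albedo towards an emotion colour map; values are illustrative.
import numpy as np

def apply_emotion_map(albedo: np.ndarray, emotion_map: np.ndarray,
                      mask: np.ndarray, intensity: float) -> np.ndarray:
    """Blend albedo (H, W, 3) towards emotion_map, modulated by a per-pixel mask (H, W)."""
    weight = np.clip(intensity, 0.0, 1.0) * mask[..., None]
    return (1.0 - weight) * albedo + weight * emotion_map

# Synthetic example: a flat skin tone shifted towards a redder "anger" map on the cheeks
albedo = np.full((256, 256, 3), [0.80, 0.62, 0.55])
anger_map = np.full((256, 256, 3), [0.88, 0.45, 0.42])
cheek_mask = np.zeros((256, 256))
cheek_mask[120:180, 40:100] = 1.0
blended = apply_emotion_map(albedo, anger_map, cheek_mask, intensity=0.6)
```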