There remains some debate about whether beta power effects observed during sentence comprehension reflect ongoing syntactic unification operations (beta-syntax hypothesis), or instead reflect maintenance or updating of the sentence-level representation (beta-maintenance hypothesis). In this study, we used magnetoencephalography to investigate beta power neural dynamics while participants read relative clause sentences that were initially ambiguous between a subject- or an object-relative reading. An additional condition included a grammatical violation at the disambiguation point in the relative clause sentences. The beta-maintenance hypothesis predicts a decrease in beta power at the disambiguation point for unexpected (and less preferred) object-relative clause sentences and grammatical violations, as both signal a need to update the sentence-level representation. While the beta-syntax hypothesis also predicts a beta power decrease for grammatical violations due to a disruption of syntactic unification operations, it instead predicts an increase in beta power for the object-relative clause condition because syntactic unification at the point of disambiguation becomes more demanding. We observed decreased beta power for both the agreement violation and object-relative clause conditions in typical left hemisphere language regions, which provides compelling support for the beta-maintenance hypothesis. Mid-frontal theta power effects were also present for grammatical violations and object-relative clause sentences, suggesting that violations and unexpected sentence interpretations are registered as conflicts by the brain's domain-general error detection system.
There is a growing literature investigating the relationship between oscillatory neural dynamics measured using electroencephalography (EEG) and/or magnetoencephalography (MEG), and sentence-level language comprehension. Recent proposals have suggested a strong link between predictive coding accounts of the hierarchical flow of information in the brain, and oscillatory neural dynamics in the beta and gamma frequency ranges. We propose that findings relating beta and gamma oscillations to sentence-level language comprehension might be unified under such a predictive coding account. Our suggestion is that oscillatory activity in the beta frequency range may reflect both the active maintenance of the current network configuration responsible for representing the sentence-level meaning under construction, and the top-down propagation of predictions to hierarchically lower processing levels based on that representation. In addition, we suggest that oscillatory activity in the low and middle gamma ranges reflects the matching of top-down predictions with bottom-up linguistic input, while evoked high gamma might reflect the propagation of bottom-up prediction errors to higher levels of the processing hierarchy. We also discuss some of the implications of this predictive coding framework, and we outline ideas for how these might be tested experimentally.
"Speak the Future" presents a novel test case at the intersection of scientific innovation and public engagement. Leveraging the power of real-time AI image generation, the project empowers festival participants to verbally describe their visions for a sustainable and regenerative future. These descriptions are instantly transformed into captivating imagery using SDXL Turbo, fostering collective engagement and tangible visualisation of abstract sustainability concepts. This unique interplay of speech recognition, AI, and projection technology breaks new ground in public engagement methods. The project offers valuable insights into public perceptions and aspirations for sustainability, as well as understanding the effectiveness of AI-powered visualisation and regenerative applications of AI. Ultimately, this will serve as a springboard for PhD research that will aim to understand How AI can serve as a vehicle for crafting regenerative futures? By employing real-time AI image generation, the project directly tests its effectiveness in fostering public engagement with sustainable futures. Analysing participant interaction and feedback sheds light on how AI-powered visualisation tools can enhance comprehension and engagement. Furthermore, the project fosters public understanding and appreciation of research. The interactive and accessible nature of "Speak the Future" demystifies the research process, showcasing its relevance and impact on everyday life. Moreover, by directly involving the public in co-creating visual representations of their aspirations, the project builds an emotional connection and sense of ownership, potentially leading to continued engagement and action beyond the festival setting. "Speak the Future" promises to be a groundbreaking initiative, bridging the gap between scientific innovation and public engagement in sustainability discourse. 
By harnessing the power of AI for collective visualisation, the project not only gathers valuable data for researchers but also empowers the public to envision and work towards a brighter, more sustainable future.