Background: Children with difficulties in listening to and understanding speech despite normal peripheral hearing can be diagnosed with Auditory Processing Disorder (APD). However, there are doubts about the validity of this diagnosis. The aim of this study was to examine the relation between the listening difficulties of children between 8 and 12 years with suspected APD and these children's attention, working memory, nonverbal intelligence, and communication abilities.
Material and methods: In this case-control study we examined 10 children who reported listening difficulties in spite of normal peripheral hearing (three referred by a speech-language pathologist in the Northern Netherlands, six by an audiological center in the Southern Netherlands, and one by parental concern) and 21 typically developing children (recruited through word of mouth and via the website Taalexpert.nl), ages 8;0 to 12;0 years. The parents of all children completed three questionnaires covering history, behavioral symptoms of ADHD, and communication skills (Children’s Communication Checklist). The children’s teachers completed the Children’s Auditory Processing Performance Scale (CHAPPS). Children were assessed for auditory processing abilities (speech-in-noise, filtered speech, binaural fusion, dichotic listening), nonverbal intelligence (Raven’s Coloured Progressive Matrices), and working memory (Clinical Evaluation of Language Fundamentals). Auditory and visual attention was studied with four behavioral tests of the WAFF battery of the Vienna Test System (Schuhfried).
Results: Preliminary analysis shows no differences between the groups on the auditory processing tests or nonverbal intelligence quotient.
Children in the experimental group showed poorer communication performance (parent report), poorer listening skills (teacher report), and poorer working memory and attention skills (behavioral tests).
Conclusions: The results of this study show that children with listening complaints differ from typically developing children, but that their problems are not specific to the auditory modality. There appears to be no evidence for the validity of an underlying auditory deficit.
In an event-related potential (ERP) experiment using written language materials only, we investigated a potential modulation of the N400 by the modality switch effect. The modality switch effect occurs when a first sentence, describing a fact grounded in one modality, is followed by a second sentence describing a fact grounded in a different modality. For example, "A cellar is dark" (visual) was preceded by either another visual property, "Ham is pink," or by a tactile property, "A mitten is soft." We also investigated whether the modality switch effect occurs for false sentences ("A cellar is light"). We found that, for true sentences, the critical word "dark" elicited a significantly greater frontal, early N400-like effect (270-370 ms) when there was a modality mismatch than when there was a modality match. This pattern was not found for the critical word "light" in false sentences. Results similar to the frontal negativity were obtained in a late time window (500-700 ms). The obtained ERP effect is similar to one previously obtained for pictures. We conclude that in this paradigm we obtained fast access to conceptual properties for modality-matched pairs, which leads to embodiment effects similar to those previously obtained with pictorial stimuli.
When an adult claims he cannot sleep without his teddy bear, people tend to be surprised. Language interpretation is thus influenced by social context, such as who the speaker is. The present study reveals inter-individual differences in brain reactivity to social aspects of language. Whereas women showed brain reactivity when stereotype-based inferences about a speaker conflicted with the content of the message, men did not. This sex difference in social information processing can be explained by a specific cognitive trait, one's ability to empathize. Individuals who empathize to a greater degree revealed larger N400 effects (as well as a larger increase in γ-band power) to socially relevant information. These results indicate that individuals with high empathizing skills are able to rapidly integrate information about the speaker with the content of the message, as they make use of voice-based inferences about the speaker to process language in a top-down manner. By contrast, individuals with lower empathizing skills did not use information about social stereotypes in implicit sentence comprehension, but rather took a more bottom-up approach to the processing of these social pragmatic sentences.