3 results for auditory N1 event-related potential (ERP)
in the Research Open Access Repository of the University of East London.
Abstract:
The effects of spatial attention and part-whole configuration on recognition of repeated objects were investigated with behavioral and event-related potential (ERP) measures. Short-term repetition effects were measured for probe objects as a function of whether a preceding prime object was shown as an intact image or coarsely scrambled (split into two halves), and whether or not it had been attended during the prime display. In line with previous behavioral experiments, priming effects were observed from both intact and split primes for attended objects, but only from intact (repeated same-view) primes when they were unattended. These behavioral results were reflected in the ERP waveforms at occipito-temporal locations as more negative-going deflections for repeated items in the time window between 220 and 300 ms after probe onset (N250r). Attended intact images showed generally stronger repetition effects than split ones. Unattended images showed repetition effects only when presented in an intact configuration, and this finding was limited to right-hemisphere electrodes. Repetition effects in earlier time windows (before 200 ms) were limited to attended conditions at occipito-temporal sites (N1), a component linked to the encoding of object structure, whereas repetition effects at central locations in the same time window (P150) were found only for objects repeated in the same intact configuration, whether the probe objects had previously been attended or unattended. The data indicate that view-generalization is mediated by a combination of analytic (part-based) representations and automatic view-dependent representations.
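The component effects reported here (N250r, N1, P150) are quantified as mean-amplitude differences between conditions within fixed post-stimulus time windows. A minimal sketch of that window-averaging step in Python with NumPy, where the sampling rate, epoch boundaries, trial counts, and all voltage values are placeholders rather than the study's actual parameters:

```python
import numpy as np

# Hypothetical single-trial ERP data: (n_trials, n_samples), baseline-corrected
# microvolts, sampled at 500 Hz, epochs running from -100 ms to +600 ms.
SFREQ = 500
EPOCH_START_S = -0.1

def mean_amplitude(epochs, t_start_s, t_end_s):
    """Mean voltage across a post-stimulus window, averaged over trials."""
    i0 = int((t_start_s - EPOCH_START_S) * SFREQ)
    i1 = int((t_end_s - EPOCH_START_S) * SFREQ)
    return epochs[:, i0:i1].mean()

rng = np.random.default_rng(0)
repeated = rng.normal(-0.5, 1.0, size=(40, 350))   # placeholder data
unrepeated = rng.normal(0.0, 1.0, size=(40, 350))  # placeholder data

# N250r-style repetition effect: repeated items more negative-going
# in the 220-300 ms window after probe onset.
effect = mean_amplitude(repeated, 0.220, 0.300) - mean_amplitude(unrepeated, 0.220, 0.300)
print(f"Repetition effect, 220-300 ms: {effect:.2f} µV")
```

The same averaging applied to windows before 200 ms would yield N1- or P150-range measures; the published analysis may of course differ in channel selection and statistics.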
Abstract:
The use of visual cues during the processing of audiovisual (AV) speech is known to be less efficient in children and adults with language difficulties, and such difficulties are known to be more prevalent in children from low-income populations. In the present study, we followed an economically diverse group of thirty-seven infants longitudinally from 6–9 months to 14–16 months of age. We used eye-tracking to examine whether individual differences in visual attention during AV speech processing in 6- to 9-month-old infants, particularly when processing congruent and incongruent auditory and visual speech cues, might be indicative of their later language development. Twenty-two of these 6- to 9-month-old infants also participated in an event-related potential (ERP) AV task within the same experimental session. Language development was then followed up at the age of 14–16 months using two measures: the Preschool Language Scale and the Oxford Communicative Development Inventory. The results show that infants who were less efficient in auditory speech processing at 6–9 months had lower receptive language scores at 14–16 months. A correlational analysis revealed that the pattern of face scanning and the ERP responses to audiovisually incongruent stimuli at 6–9 months were both significantly associated with language development at 14–16 months. These findings add to the understanding of individual differences in neural signatures of AV processing and the associated looking behavior in infants.
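The longitudinal analysis described here is correlational: per-infant measures at 6–9 months are related to language scores at 14–16 months. A minimal sketch with SciPy, using invented values purely for illustration; the variable names, distributions, and sample size are assumptions, not the study's data:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
n_infants = 22  # size of the ERP subsample in the abstract

# Hypothetical per-infant measures at 6-9 months (placeholder values).
erp_incongruent_6_9m = rng.normal(-2.0, 1.0, n_infants)    # µV, response to incongruent AV speech
mouth_looking_6_9m = rng.uniform(0.2, 0.8, n_infants)      # proportion of face-looking time on mouth

# Hypothetical receptive language scores at 14-16 months.
receptive_language_14_16m = rng.normal(100, 15, n_infants)

# Does either 6-9 month measure relate to later language?
for name, x in [("ERP incongruent response", erp_incongruent_6_9m),
                ("mouth-looking proportion", mouth_looking_6_9m)]:
    r, p = stats.pearsonr(x, receptive_language_14_16m)
    print(f"{name} vs receptive language: r = {r:.2f}, p = {p:.3f}")
```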
Abstract:
Research on audiovisual speech integration has reported high levels of individual variability, especially among young infants. In the present study we tested the hypothesis that this variability results from individual differences in the maturation of audiovisual speech processing during infancy. A developmental shift in selective attention to audiovisual speech has been demonstrated between 6 and 9 months, with an increase in the time spent looking at articulating mouths as compared to eyes (Lewkowicz & Hansen-Tift (2012) Proc. Natl Acad. Sci. USA, 109, 1431–1436; Tomalski et al. (2012) Eur. J. Dev. Psychol., 1–14). In the present study we tested whether these maturational changes in behaviour are associated with differences in brain responses to audiovisual speech across this age range. We measured high-density event-related potentials (ERPs) in response to videos of audiovisually matched and mismatched syllables /ba/ and /ga/, and subsequently examined visual scanning of the same stimuli with eye-tracking. There were no clear age-specific changes in ERPs, but the amplitude of the audiovisual mismatch response (AVMMR) to the combination of visual /ba/ and auditory /ga/ was strongly negatively associated with looking time to the mouth in the same condition. These results have significant implications for our understanding of individual differences in neural signatures of audiovisual speech processing in infants, suggesting that these signatures are not strictly related to chronological age but are instead associated with the maturation of looking behaviour, developing at individual rates in the second half of the first year of life.
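The AVMMR is, in essence, a difference measure: the response to mismatched audiovisual syllables minus the response to matched ones. A minimal sketch of extracting a per-infant difference-wave amplitude, assuming hypothetical condition averages and an illustrative analysis window and channel; the abstract specifies neither, so every numeric choice below is a placeholder:

```python
import numpy as np

# Hypothetical per-infant condition averages at one channel:
# (n_infants, n_samples), baseline-corrected µV, 500 Hz, -100 to +800 ms.
SFREQ = 500
EPOCH_START_S = -0.1

rng = np.random.default_rng(2)
matched = rng.normal(0.0, 1.0, size=(20, 450))     # e.g. visual /ba/ + auditory /ba/
mismatched = rng.normal(-0.5, 1.0, size=(20, 450)) # e.g. visual /ba/ + auditory /ga/

# AVMMR as a difference wave: mismatched minus matched AV syllables.
avmmr = mismatched - matched

# Per-infant mean amplitude in an illustrative 290-390 ms window; these
# per-infant values would then feed a correlation with mouth-looking time.
i0 = int((0.290 - EPOCH_START_S) * SFREQ)
i1 = int((0.390 - EPOCH_START_S) * SFREQ)
avmmr_amplitude = avmmr[:, i0:i1].mean(axis=1)
print(avmmr_amplitude.round(2))
```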