4 results for Infant’s development
at Duke University
Abstract:
Infants' speech perception abilities change through the first year of life, from broad sensitivity to a wide range of speech contrasts to becoming more finely attuned to their native language. What remains unclear, however, is how this perceptual change relates to brain responses to native-language contrasts in terms of the functional specialization of the left and right hemispheres. Here, to elucidate the developmental changes in functional lateralization accompanying this perceptual change, we conducted two experiments on Japanese infants using Japanese lexical pitch-accent, in which the pitch pattern within a word distinguishes its meaning. In the first experiment, a behavioral study using visual habituation, we confirmed that infants at both 4 and 10 months are sensitive to lexical pitch-accent pattern changes embedded in disyllabic words. In the second experiment, near-infrared spectroscopy was used to measure cortical hemodynamic responses in the left and right hemispheres to the same lexical pitch-accent pattern changes and their pure-tone counterparts. We found that brain responses to the pitch change within words differed between 4- and 10-month-old infants in terms of functional lateralization: left-hemisphere dominance for the perception of pitch changes embedded in words was seen only in the 10-month-olds. These results suggest that the perceptual change in Japanese lexical pitch-accent may be related to a shift in functional lateralization from bilateral to left-hemisphere dominance.
Abstract:
Early detection of developmental disorders is key to improving child outcomes, allowing interventions that promote development and improve prognosis to begin sooner. Research on autism spectrum disorder (ASD) suggests that behavioral signs can be observed late in the first year of life. Many of these studies involve extensive frame-by-frame video observation and analysis of a child's natural behavior. Although nonintrusive, these methods are extremely time-intensive and require a high level of observer training; thus, they are burdensome for clinical use and large-population research. This work is a first milestone in a long-term project on noninvasive early observation of children to aid in risk detection and research on neurodevelopmental disorders. We focus on providing low-cost computer vision tools to measure and identify ASD behavioral signs based on components of the Autism Observation Scale for Infants (AOSI). In particular, we develop algorithms that measure responses to general ASD risk-assessment tasks and activities outlined by the AOSI, which assess visual attention, by tracking facial features. We present results, including comparisons with expert and nonexpert clinicians, demonstrating that the proposed computer vision tools can capture critical behavioral observations and potentially augment clinicians' observations from real in-clinic assessments.
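The abstract describes the facial-feature tracking only at a high level. As a purely illustrative sketch, and not the paper's actual AOSI algorithms, the snippet below shows one way such a measurement could look: it tracks the detected face's horizontal position frame by frame with OpenCV's stock Haar-cascade detector and reports the first frame where the head appears to turn, a crude proxy for latency in a response-to-name task. The function name, the displacement threshold, and the bounding-box approach are all assumptions made for illustration.

# Illustrative sketch only -- not the algorithm from the paper above.
# Assumes OpenCV; uses face-center displacement as a crude head-turn proxy.
import cv2

def head_turn_frame(video_path, shift_frac=0.15):
    """First frame index where the face center shifts horizontally by
    more than shift_frac of the frame width, or None if it never does."""
    detector = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    cap = cv2.VideoCapture(video_path)
    baseline, frame_idx = None, 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
        if len(faces) > 0:
            x, y, w, h = max(faces, key=lambda f: f[2] * f[3])  # largest face
            center_x = x + w / 2
            if baseline is None:
                baseline = center_x          # position before the name call
            elif abs(center_x - baseline) > shift_frac * frame.shape[1]:
                cap.release()
                return frame_idx
        frame_idx += 1
    cap.release()
    return None

A deployed system would presumably rely on facial-landmark or gaze-estimation models rather than a bounding-box proxy; the paper's comparison against expert and nonexpert clinician coding is the actual validation step.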
Abstract:
It is widely accepted that infants begin learning their native language not by learning words, but by discovering features of the speech signal: consonants, vowels, and combinations of these sounds. Learning to understand words, as opposed to just perceiving their sounds, is said to come later, between 9 and 15 mo of age, when infants develop a capacity for interpreting others' goals and intentions. Here, we demonstrate that this consensus about the developmental sequence of human language learning is flawed: in fact, infants already know the meanings of several common words from the age of 6 mo onward. We presented 6- to 9-mo-old infants with sets of pictures to view while their parent named a picture in each set. Over this entire age range, infants directed their gaze to the named pictures, indicating their understanding of spoken words. Because the words were not trained in the laboratory, the results show that even young infants learn ordinary words through daily experience with language. This surprising accomplishment indicates that, contrary to prevailing beliefs, either infants can already grasp the referential intentions of adults at 6 mo or infants can learn words before this ability emerges. The precocious discovery of word meanings suggests a perspective in which learning vocabulary and learning the sound structure of spoken language go hand in hand as language acquisition begins.
Abstract:
Young infants' learning of words for abstract concepts like 'all gone' and 'eat,' in contrast to their learning of more concrete words like 'apple' and 'shoe,' may follow a relatively protracted developmental course. We examined whether infants know such abstract words. Parents named one of two events shown in side-by-side videos while their 6-16-month-old infants (n=98) watched. On average, infants successfully looked at the named video by 10 months, but not earlier, and infants' looking at the named referent increased robustly at around 14 months. Six-month-olds already understand concrete words in this task (Bergelson & Swingley, 2012). A video-corpus analysis of unscripted mother-infant interaction showed that mothers used the tested abstract words less often in the presence of their referent events than they used concrete words in the presence of their referent objects. We suggest that referential uncertainty in abstract words' teaching conditions may explain the later acquisition of abstract than concrete words, and we discuss the possible role of changes in social-cognitive abilities over the 6-14 month period.