48 results for Nonverbal Decoding


Relevance:

10.00%

Publisher:

Abstract:

Background: The few studies that have evaluated syntax in autism spectrum disorder (ASD) have yielded conflicting findings: some suggest that, once matched on mental age, ASD and typically developing controls do not differ in grammar, while others report that morphosyntactic deficits in ASD are independent of cognitive skills. There is a need for a better understanding of syntax in ASD and its relation to, or dissociation from, nonverbal abilities.

Aims: Syntax in ASD was assessed by evaluating subject and object relative clause comprehension in adolescents and adults diagnosed with ASD, with a performance IQ within the normal range, and with or without a history of language delay.

Methods & Procedures: Twenty-eight participants with ASD (mean age 21.8) and 28 age-matched controls (mean age 22.07) were asked to point to a character designated by relative clauses that varied in syntactic complexity.

Outcomes & Results: Scores indicate that participants with ASD, regardless of language development history, perform significantly worse than age-matched controls on object relative clauses. In addition, participants with ASD with a history of language delay (diagnosed with high-functioning autism in the DSM-IV-TR) perform worse on subject relatives than ASD participants without language delay (diagnosed with Asperger syndrome in the DSM-IV-TR), suggesting that these two groups do not have equivalent linguistic abilities. Performance IQ has a positive impact on task success in the ASD group.

Conclusions & Implications: This study reveals subtle grammatical difficulties that persist in adults with ASD within the normal IQ range as compared with age-matched peers. Even in the absence of a history of language delay in childhood, the results suggest that a slight deficit may nevertheless be present and go undetected by standardized language assessments. The groups with and without language delay show similar global performance on relative clause comprehension; however, the study also indicates that participants with reported language delay have more difficulty with subject relatives than participants without language delay, suggesting differences in linguistic abilities between these ASD subgroups.

Relevance:

10.00%

Publisher:

Abstract:

[New research results on nonverbal attunement between therapists and patients, and the practice of attending to the nonverbal channel]

Relevance:

10.00%

Publisher:

Abstract:

Background: Co-speech gestures are part of nonverbal communication during conversations. They either support the verbal message or provide the interlocutor with additional information, and, as nonverbal cues, they prompt the cooperative process of turn-taking. In the present study, we investigated the influence of co-speech gestures on the perception of dyadic dialogue in aphasic patients. In particular, we analysed the impact of co-speech gestures on gaze direction (towards speaker or listener) and fixation of body parts. We hypothesized that aphasic patients, whose verbal comprehension is restricted, adapt their visual exploration strategies.

Methods: Sixteen aphasic patients and 23 healthy control subjects participated in the study. Visual exploration behaviour was measured by means of a contact-free infrared eye-tracker while subjects watched videos depicting spontaneous dialogues between two individuals. Cumulative fixation duration and mean fixation duration were calculated for the factors co-speech gesture (present or absent), gaze direction (towards the speaker or the listener), and region of interest (ROI), including hands, face, and body.

Results: Both aphasic patients and healthy controls mainly fixated the speaker's face. We found a significant co-speech gesture × ROI interaction, indicating that the presence of a co-speech gesture encouraged subjects to look at the speaker. Further, there was a significant gaze direction × ROI × group interaction, revealing that aphasic patients showed reduced cumulative fixation duration on the speaker's face compared to healthy controls.

Conclusion: Co-speech gestures guide the observer's attention towards the speaker, the source of semantic input. We discuss whether an underlying semantic processing deficit or a deficit in integrating audio-visual information may cause aphasic patients to explore the speaker's face less.

Keywords: Gestures, visual exploration, dialogue, aphasia, apraxia, eye movements