3 results for Syntactic And Semantic Comprehension Tasks
in Bucknell University Digital Commons - Pennsylvania - USA
Abstract:
Evidence suggests that the social cognition deficits prevalent in autism spectrum disorders (ASDs) are widely distributed in first-degree and extended relatives. This "broader autism phenotype" (BAP) can be extended into non-clinical populations, which show wide distributions of social behaviors such as empathy and social responsiveness, with ASDs exhibiting these behaviors at the lower ends of the distributions. Little evidence has previously shown relationships between self-report measures of social cognition and more objective tasks such as face perception in functional magnetic resonance imaging (fMRI) and event-related potentials (ERPs). In this study, three specific hypotheses were addressed: a) increased social ability, as measured by an increased Empathy Quotient (EQ), decreased Social Responsiveness Scale (SRS-A) score, and increased Social Attribution Task score, will predict increased activation of the fusiform gyrus in response to faces as compared to houses; b) these same measures will predict N170 amplitude and latency, with decreased latency and increased amplitude for faces as compared to houses with increased social ability; c) increased amygdala volume will predict increased fusiform gyrus activation when viewing faces as compared to houses. Findings supported all of the hypotheses. Empathy scores significantly predicted both right FFG activation [F(1,20) = 4.811, p = .041, β = .450, R² = 0.20] and left FFG activation [F(1,20) = 7.70, p = .012, β = .537, R² = 0.29]. Based on ERP results, increased right-lateralized face-related N170 was significantly predicted by the EQ [F(1,54) = 6.94, p = .011, β = .338, R² = 0.11]. Finally, total amygdala volume significantly predicted right [F(1,20) = 7.217, p = .014, β = .515, R² = 0.27] and left [F(1,20) = 36.77, p < .001, β = .805, R² = 0.65] FFG activation. Consistent with the a priori hypotheses, traits attributed to the BAP can significantly predict neural responses to faces in a non-clinical population, which is consistent with the face processing deficits seen in ASDs. The findings presented here contribute to the extension of the BAP from unaffected relatives of individuals with ASDs to the general population. They also provide continued evidence for a continuous distribution of the traits found in psychiatric illnesses, in place of a traditional, dichotomous "all-or-nothing" diagnostic framework for neurodevelopmental and neuropsychiatric disorders.
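The regressions reported above are simple one-predictor models (e.g., EQ predicting FFG activation), summarized by an F statistic, a standardized β, and R². The following is a minimal sketch of how such a fit could be computed; it uses statsmodels, and the sample size, variable names, and data are synthetic placeholders, not the study's data.

```python
# Sketch of a one-predictor regression of the kind reported in the abstract.
# All numbers below are synthetic placeholders, not study data.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 22                                                # hypothetical sample size (df = 1, 20)
eq_scores = rng.normal(40, 10, size=n)                # hypothetical Empathy Quotient scores
ffg_activation = 0.02 * eq_scores + rng.normal(0, 0.5, size=n)  # hypothetical FFG betas

X = sm.add_constant(eq_scores)                        # intercept + single predictor
model = sm.OLS(ffg_activation, X).fit()

print(f"F(1,{int(model.df_resid)}) = {model.fvalue:.3f}, "
      f"p = {model.f_pvalue:.3f}, R^2 = {model.rsquared:.2f}")

# Standardized beta: the slope rescaled by the ratio of sample standard deviations.
beta_std = model.params[1] * eq_scores.std(ddof=1) / ffg_activation.std(ddof=1)
print(f"standardized beta = {beta_std:.3f}")
```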
Abstract:
Composers commonly use major or minor scales to create different moods in music. Nonmusicians show poor discrimination and classification of this musical dimension; however, they can perform these tasks if the decision is phrased as happy vs. sad. We created pairs of melodies identical except for mode; the first major or minor third or sixth was the critical note that distinguished major from minor mode. Musicians and nonmusicians judged each melody as major vs. minor or happy vs. sad. We collected ERP waveforms, triggered to the onset of the critical note. Musicians showed a late positive component (P3) to the critical note only for the minor melodies, and in both tasks. Nonmusicians could adequately classify the melodies as happy or sad but showed little evidence of processing the critical information. Major appears to be the default mode in music, and musicians and nonmusicians apparently process mode differently.
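Time-locking ERP epochs to the onset of the critical note is the methodological core of this design. Below is a minimal sketch of that step, assuming MNE-Python; the file name, channel setup, event codes, and epoch window are illustrative assumptions, not the study's actual parameters.

```python
# Sketch of epoching EEG data time-locked to the critical-note onset.
# File name, event codes, and epoch window are hypothetical.
import mne

raw = mne.io.read_raw_fif("melodies_raw.fif", preload=True)  # hypothetical recording
events = mne.find_events(raw)                                # triggers at the critical note
event_id = {"major_critical": 1, "minor_critical": 2}        # hypothetical condition codes

epochs = mne.Epochs(raw, events, event_id=event_id,
                    tmin=-0.2, tmax=0.8,                     # 200 ms baseline, 800 ms post-onset
                    baseline=(None, 0), preload=True)

# Average within condition; a P3 to the critical note would appear as a
# late positive deflection (~300-600 ms) in the minor-mode average.
evoked_minor = epochs["minor_critical"].average()
evoked_major = epochs["major_critical"].average()
evoked_minor.plot_joint()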
Abstract:
Auditory imagery is more than just mental “replaying” of tunes in one’s head. I will review several studies that capture characteristics of complex and active imagery tasks, using both behavioral and neuroscience approaches. I use behavioral methods to capture people’s ability to make emotion judgments about both heard and imagined music in real time. My neuroimaging studies examine the neural correlates of encoding an imagined melody, anticipating an upcoming tune, and imagining tunes backwards. Several studies show voxel-by-voxel correlates of neural activity with self-reported imagery vividness. These studies speak to the ways in which musical imagery allows us not just to remember music, but also to use those memories to judge temporally changing aspects of the musical experience.
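The voxel-by-voxel analysis mentioned above amounts to correlating, across participants, each voxel's activation with a self-reported vividness score. A minimal sketch of that computation follows; the array shapes, subject count, and data are hypothetical placeholders, not taken from these studies.

```python
# Sketch of a voxel-by-voxel correlation between activation and self-reported
# imagery vividness across subjects. All data below are hypothetical placeholders.
import numpy as np

rng = np.random.default_rng(0)
n_subjects, n_voxels = 20, 5000
activation = rng.normal(size=(n_subjects, n_voxels))   # one activation value per voxel per subject
vividness = rng.normal(size=n_subjects)                 # one vividness rating per subject

# Pearson r at every voxel: z-score both variables, then take the normalized dot product.
act_z = (activation - activation.mean(0)) / activation.std(0, ddof=1)
viv_z = (vividness - vividness.mean()) / vividness.std(ddof=1)
r_map = act_z.T @ viv_z / (n_subjects - 1)              # shape: (n_voxels,)

print("strongest positive correlate: voxel", r_map.argmax(), "r =", r_map.max())
```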