4 results for Facial perception
in Aston University Research Archive
Abstract:
Very little is known about the neural structures involved in the perception of realistic dynamic facial expressions. In the present study, a unique set of naturalistic dynamic facial emotional expressions was created. Through fMRI and connectivity analysis, a dynamic face perception network was identified that extends Haxby et al.'s [Haxby, J. V., Hoffman, E. A., & Gobbini, M. I. The distributed human neural system for face perception. Trends in Cognitive Sciences, 4, 223–233, 2000] distributed neural system for face perception. This network includes early visual regions, such as the inferior occipital gyrus, which is sensitive to the visual stimulus but insensitive to motion or affect; the STS, which is specifically sensitive to motion; and the amygdala, which is recruited to process affect. Measures of effective connectivity between these regions revealed that dynamic facial stimuli were associated with specific increases in connectivity between early visual regions, such as the inferior occipital gyrus, and the STS, along with coupling between the STS and the amygdala, as well as the inferior frontal gyrus. These findings support the presence of a distributed network of cortical regions that mediate the perception of different dynamic facial expressions.
Abstract:
Patients with depersonalization disorder have shown attenuated responses to unpleasant emotional stimuli, supporting the view that depersonalization is characterised by a selective inhibition of the processing of unpleasant emotions. The purpose of this study was to establish whether autonomic responses to facial emotional expressions show the same blunting effect. The skin conductance responses (SCRs) of 16 patients with chronic DSM-IV depersonalization disorder, 15 normal controls, and 15 clinical controls with DSM-IV anxiety disorders were recorded in response to facial expressions of happiness and disgust. Patients with anxiety disorders were found to have greater autonomic responses than patients with depersonalization, despite the fact that both groups had similarly high levels of subjective anxiety as measured by anxiety scales. SCRs to happy faces did not vary across groups. The findings of this study provide further support for the idea that patients with depersonalization have a selective impairment in the processing of threatening or unpleasant emotional stimuli.
Abstract:
Motion is an important aspect of face perception that has been largely neglected to date. Many of the established findings are based on studies that use static facial images, which do not reflect the unique temporal dynamics available from seeing a moving face. In the present thesis, a set of naturalistic dynamic facial emotional expressions was purposely created and used to investigate the neural structures involved in the perception of dynamic facial expressions of emotion, with both functional Magnetic Resonance Imaging (fMRI) and Magnetoencephalography (MEG). Through fMRI and connectivity analysis, a dynamic face perception network was identified that extends the distributed neural system for face perception (Haxby et al., 2000). Measures of effective connectivity between these regions revealed that dynamic facial stimuli were associated with specific increases in connectivity between early visual regions, such as the inferior occipital gyri, and the superior temporal sulci, along with coupling between the superior temporal sulci and the amygdalae, as well as with the inferior frontal gyri. MEG and Synthetic Aperture Magnetometry (SAM) were used to examine the spatiotemporal profile of neurophysiological activity within this dynamic face perception network. SAM analysis revealed a number of regions in the distributed face network showing differential activation to dynamic versus static faces, characterised by decreases in cortical oscillatory power in the beta band, which were spatially coincident with the regions previously identified with fMRI. These findings support the presence of a distributed network of cortical regions that mediate the perception of dynamic facial expressions, with the fMRI data providing information on the spatial co-ordinates, paralleled by the MEG data, which indicate the temporal dynamics within this network.
This integrated multimodal approach offers both excellent spatial and temporal resolution, thereby providing an opportunity to explore dynamic brain activity and connectivity during face processing.
Abstract:
Holistic face perception, i.e. the mandatory integration of featural information across the face, has been considered to play a key role when recognizing emotional face expressions (e.g., Tanaka et al., 2002). However, despite their early onset, holistic processing skills continue to improve throughout adolescence (e.g., Schwarzer et al., 2010) and therefore might modulate the evaluation of facial expressions. We tested this hypothesis using an attentional blink (AB) paradigm to compare the impact of happy, fearful and neutral faces in adolescents (10–13 years) and adults on subsequently presented neutral target stimuli (animals, plants and objects) in a rapid serial visual presentation stream. Adolescents and adults were found to be equally reliable when reporting the emotional expression of the face stimuli. However, the detection of emotional but not neutral faces imposed a significantly stronger AB effect on the detection of the neutral targets in adults compared to adolescents. In a control experiment we confirmed that adolescents rated emotional faces lower in terms of valence and arousal than adults. The results suggest a protracted development of the ability to evaluate facial expressions that might be attributed to the late maturation of holistic processing skills.