30 results for facial expressions
Abstract:
Very little is known about the neural structures involved in the perception of realistic dynamic facial expressions. In the present study, a unique set of naturalistic dynamic facial emotional expressions was created. Through fMRI and connectivity analysis, a dynamic face perception network was identified, which is demonstrated to extend Haxby et al.'s [Haxby, J. V., Hoffman, E. A., & Gobbini, M. I. The distributed human neural system for face perception. Trends in Cognitive Sciences, 4, 223–233, 2000] distributed neural system for face perception. This network includes early visual regions, such as the inferior occipital gyrus, which is identified as insensitive to motion or affect but sensitive to the visual stimulus; the STS, identified as specifically sensitive to motion; and the amygdala, recruited to process affect. Measures of effective connectivity between these regions revealed that dynamic facial stimuli were associated with specific increases in connectivity between early visual regions, such as the inferior occipital gyrus and the STS, along with coupling between the STS and the amygdala, as well as the inferior frontal gyrus. These findings support the presence of a distributed network of cortical regions that mediate the perception of different dynamic facial expressions.
Abstract:
Background - Difficulties in emotion processing and poor social function are common to bipolar disorder (BD) and major depressive disorder (MDD) depression, resulting in many BD depressed individuals being misdiagnosed with MDD. The amygdala is a key region implicated in processing emotionally salient stimuli, including emotional facial expressions. It is unclear, however, whether abnormal amygdala activity during positive and negative emotion processing represents a persistent marker of BD regardless of illness phase or a state marker of depression common or specific to BD and MDD depression. Methods - Sixty adults were recruited: 15 depressed with BD type 1 (BDd), 15 depressed with recurrent MDD, 15 with BD in remission (BDr), diagnosed with DSM-IV and Structured Clinical Interview for DSM-IV Research Version criteria; and 15 healthy control subjects (HC). Groups were age- and gender ratio-matched; patient groups were matched for age of illness onset and illness duration; depressed groups were matched for depression severity. The BDd were taking more psychotropic medication than other patient groups. All individuals participated in three separate 3T neuroimaging event-related experiments, where they viewed mild and intense emotional and neutral faces of fear, happiness, or sadness from a standardized series. Results - The BDd—relative to HC, BDr, and MDD—showed elevated left amygdala activity to mild and neutral facial expressions in the sad (p < .009) but not other emotion experiments that was not associated with medication. There were no other significant between-group differences in amygdala activity. Conclusions - Abnormally elevated left amygdala activity to mild sad and neutral faces might be a depression-specific marker in BD but not MDD, suggesting different pathophysiologic processes for BD versus MDD depression.
Abstract:
Sixteen clinically depressed patients and sixteen healthy controls were presented with a set of emotional facial expressions and were asked to identify the emotion portrayed by each face. They were subsequently given a recognition memory test for these faces. There was no difference between the groups in terms of their ability to identify emotion from faces. All participants identified emotional expressions more accurately than neutral expressions, with happy expressions being identified most accurately. During the recognition memory phase the depressed patients demonstrated superior memory for sad expressions, and inferior memory for happy expressions, relative to neutral expressions. Conversely, the controls demonstrated superior memory for happy expressions, and inferior memory for sad expressions, relative to neutral expressions. These results are discussed in terms of the cognitive model of depression proposed by Williams, Watts, MacLeod, and Mathews (1997).
Abstract:
Impaired facial expression recognition has been associated with features of major depression, which could underlie some of the difficulties in social interactions in these patients. Patients with major depressive disorder and age- and gender-matched healthy volunteers judged the emotion of 100 facial stimuli displaying different intensities of sadness and happiness and neutral expressions presented for short (100 ms) and long (2,000 ms) durations. Compared with healthy volunteers, depressed patients demonstrated subtle impairments in discrimination accuracy and a predominant bias away from the identification as happy of mildly happy expressions. The authors suggest that, in depressed patients, the inability to accurately identify subtle changes in facial expression displayed by others in social situations may underlie the impaired interpersonal functioning.
Abstract:
Holistic face perception, i.e. the mandatory integration of featural information across the face, has been considered to play a key role when recognizing emotional face expressions (e.g., Tanaka et al., 2002). However, despite their early onset, holistic processing skills continue to improve throughout adolescence (e.g., Schwarzer et al., 2010) and therefore might modulate the evaluation of facial expressions. We tested this hypothesis using an attentional blink (AB) paradigm to compare the impact of happy, fearful and neutral faces in adolescents (10–13 years) and adults on subsequently presented neutral target stimuli (animals, plants and objects) in a rapid serial visual presentation stream. Adolescents and adults were found to be equally reliable when reporting the emotional expression of the face stimuli. However, the detection of emotional but not neutral faces imposed a significantly stronger AB effect on the detection of the neutral targets in adults compared to adolescents. In a control experiment we confirmed that adolescents rated emotional faces lower in terms of valence and arousal than adults. The results suggest a protracted development of the ability to evaluate facial expressions that might be attributed to the late maturation of holistic processing skills.
Abstract:
Anxiety and fear are often confounded in discussions of human emotions. However, studies of rodent defensive reactions under naturalistic conditions suggest anxiety is functionally distinct from fear. Unambiguous threats, such as predators, elicit flight from rodents (if an escape-route is available), whereas ambiguous threats (e.g., the odor of a predator) elicit risk assessment behavior, which is associated with anxiety as it is preferentially modulated by anti-anxiety drugs. However, without human evidence, it would be premature to assume that rodent-based psychological models are valid for humans. We tested the human validity of the risk assessment explanation for anxiety by presenting 8 volunteers with emotive scenarios and asking them to pose facial expressions. Photographs and videos of these expressions were shown to 40 participants who matched them to the scenarios and labeled each expression. Scenarios describing ambiguous threats were preferentially matched to the facial expression posed in response to the same scenario type. This expression consisted of two plausible environmental-scanning behaviors (eye darts and head swivels) and was labeled as anxiety, not fear. The facial expression elicited by unambiguous threat scenarios was labeled as fear. The emotion labels generated were then presented to another 18 participants who matched them back to photographs of the facial expressions. This back-matching of labels to faces also linked anxiety to the environmental-scanning face rather than fear face. Results therefore suggest that anxiety produces a distinct facial expression and that it has adaptive value in situations that are ambiguously threatening, supporting a functional, risk-assessing explanation for human anxiety.
Abstract:
The recognition of faces and of facial expressions is an important evolutionary skill, and an integral part of social communication. It has been argued that the processing of faces is distinct from the processing of non-face stimuli, and functional neuroimaging investigations have even found evidence of a distinction between the perception of faces and of emotional expressions. Structural and temporal correlates of face perception and facial affect have only been separately identified. Investigating the neural dynamics of face perception per se as well as facial affect would allow the mapping of these processes in space-, time- and frequency-specific domains. Participants were asked to perform face categorisation and emotional discrimination tasks, and Magnetoencephalography (MEG) was used to measure the neurophysiology of face and facial emotion processing. SAM analysis techniques enable the investigation of spectral changes within specific time-windows and frequency bands, thus allowing the identification of stimulus-specific regions of cortical power changes. Furthermore, MEG's excellent temporal resolution allows for the detection of subtle changes associated with the processing of face and non-face stimuli and different emotional expressions. The data presented reveal that face perception is associated with spectral power changes within a distributed cortical network comprising occipito-temporal as well as parietal and frontal areas. For the perception of facial affect, spectral power changes were also observed within frontal and limbic areas including the parahippocampal gyrus and the amygdala. Analyses of temporal correlates also reveal a distinction between the processing of faces and facial affect. Face perception per se occurred at earlier latencies whereas the discrimination of facial expression occurred within a longer time-window.
In addition, the processing of faces and facial affect was differentially associated with changes in cortical oscillatory power for alpha, beta and gamma frequencies. The perception of faces and facial affect is associated with distinct changes in cortical oscillatory activity that can be mapped to specific neural structures, specific time-windows and latencies as well as specific frequency bands. Therefore, the work presented in this thesis provides further insight into the sequential processing of faces and facial affect.
Abstract:
Background: Identifying biological markers to aid diagnosis of bipolar disorder (BD) is critically important. To be considered a possible biological marker, neural patterns in BD should be discriminant from those in healthy individuals (HI). We examined patterns of neuromagnetic responses revealed by magnetoencephalography (MEG) during implicit emotion-processing using emotional (happy, fearful, sad) and neutral facial expressions, in sixteen BD and sixteen age- and gender-matched healthy individuals. Methods: Neuromagnetic data were recorded using a 306-channel whole-head MEG ELEKTA Neuromag System, and preprocessed using Signal Space Separation as implemented in MaxFilter (ELEKTA). Custom Matlab programs removed EOG and ECG signals from filtered MEG data, and computed means of epoched data (0-250ms, 250-500ms, 500-750ms). A generalized linear model with three factors (individual, emotion intensity and time) compared BD and HI. A principal component analysis of normalized mean channel data in selected brain regions identified principal components that explained 95% of data variation. These components were used in a quadratic support vector machine (SVM) pattern classifier. SVM classifier performance was assessed using the leave-one-out approach. Results: BD and HI showed significantly different patterns of activation for 0-250ms within both left occipital and temporal regions, specifically for neutral facial expressions. PCA analysis revealed significant differences between BD and HI for mild fearful, happy, and sad facial expressions within 250-500ms. SVM quadratic classifier showed greatest accuracy (84%) and sensitivity (92%) for neutral faces, in left occipital regions within 500-750ms. Conclusions: MEG responses may be used in the search for disease specific neural markers.
Abstract:
Various neuroimaging investigations have revealed that perception of emotional pictures is associated with greater visual cortex activity than their neutral counterparts. It has further been proposed that threat-related information is rapidly processed, suggesting that the modulation of visual cortex activity should occur at an early stage. Additional studies have demonstrated that oscillatory activity in the gamma band range (40-100 Hz) is associated with threat processing. Magnetoencephalography (MEG) was used to investigate such activity during perception of task-irrelevant, threat-related versus neutral facial expressions. Our results demonstrated a bilateral reduction in gamma band activity for expressions of threat, specifically anger, compared with neutral faces in extrastriate visual cortex (BA 18) within 50-250 ms of stimulus onset. These results suggest that gamma activity in visual cortex may play a role in affective modulation of visual processing, in particular with the perception of threat cues.
Abstract:
According to the sociometer hypothesis, individuals with low self-esteem experience increased negative affect in response to negative social stimuli, even when these stimuli are not perceived consciously. Using an affective priming paradigm, the present study examined whether trait self-esteem would moderate mood following briefly presented facial expressions. Results from 43 undergraduates revealed that, after controlling for baseline mood, anxiety and depression, the degree of negative affect experienced by the participants following exposure to expressions of anger and disgust varied as a function of their self-esteem. Implications for individuals with low self-esteem and our understanding of the link between self-esteem and negative affect are discussed.
Abstract:
Patients with depersonalization disorder have shown attenuated responses to unpleasant emotional stimuli, supporting the view that depersonalization is characterised by a selective inhibition of the processing of unpleasant emotions. The purpose of this study was to establish whether autonomic responses to facial emotional expressions show the same blunting effect. The skin conductance responses (SCRs) of 16 patients with chronic DSM-IV depersonalization disorder, 15 normal controls and 15 clinical controls with DSM-IV anxiety disorders were recorded in response to facial expressions of happiness and disgust. Patients with anxiety disorders were found to have greater autonomic responses than patients with depersonalization, in spite of the fact that both groups had similarly high levels of subjective anxiety as measured by anxiety scales. SCR to happy faces did not vary across groups. The findings of this study provide further support for the idea that patients with depersonalization have a selective impairment in the processing of threatening or unpleasant emotional stimuli.
Abstract:
Motion is an important aspect of face perception that has been largely neglected to date. Many of the established findings are based on studies that use static facial images, which do not reflect the unique temporal dynamics available from seeing a moving face. In the present thesis a set of naturalistic dynamic facial emotional expressions was purposely created and used to investigate the neural structures involved in the perception of dynamic facial expressions of emotion, with both functional Magnetic Resonance Imaging (fMRI) and Magnetoencephalography (MEG). Through fMRI and connectivity analysis, a dynamic face perception network was identified, which is demonstrated to extend the distributed neural system for face perception (Haxby et al., 2000). Measures of effective connectivity between these regions revealed that dynamic facial stimuli were associated with specific increases in connectivity between early visual regions, such as inferior occipital gyri and superior temporal sulci, along with coupling between superior temporal sulci and amygdalae, as well as with inferior frontal gyri. MEG and Synthetic Aperture Magnetometry (SAM) were used to examine the spatiotemporal profile of neurophysiological activity within this dynamic face perception network. SAM analysis revealed a number of regions showing differential activation to dynamic versus static faces in the distributed face network, characterised by decreases in cortical oscillatory power in the beta band, which were spatially coincident with those regions that were previously identified with fMRI. These findings support the presence of a distributed network of cortical regions that mediate the perception of dynamic facial expressions, with the fMRI data providing information on the spatial co-ordinates paralleled by the MEG data, which indicate the temporal dynamics within this network.
This integrated multimodal approach offers both excellent spatial and temporal resolution, thereby providing an opportunity to explore dynamic brain activity and connectivity during face processing.
Abstract:
Background: The spectrum approach was used to examine contributions of comorbid symptom dimensions of substance abuse and eating disorder to abnormal prefrontal-cortical and subcortical-striatal activity to happy and fear faces previously demonstrated in bipolar disorder (BD). Method: Fourteen remitted BD-type I and sixteen healthy individuals viewed neutral, mild and intense happy and fear faces in two event-related fMRI experiments. All individuals completed Substance-Use and Eating-Disorder Spectrum measures. Region-of-Interest analyses for bilateral prefrontal and subcortical-striatal regions were performed. Results: BD individuals scored significantly higher on these spectrum measures than healthy individuals (p < 0.05), and were distinguished by activity in prefrontal and subcortical-striatal regions. BD relative to healthy individuals showed reduced dorsal prefrontal-cortical activity to all faces. Only BD individuals showed greater subcortical-striatal activity to happy and neutral faces. In BD individuals, negative correlations were shown between substance use severity and right PFC activity to intense happy faces (p < 0.04), and between substance use severity and right caudate nucleus activity to neutral faces (p < 0.03). Positive correlations were shown between eating disorder and right ventral putamen activity to intense happy (p < 0.02) and neutral faces (p < 0.03). Exploratory analyses revealed few significant relationships between illness variables and medication upon neural activity in BD individuals. Limitations: Small sample size of predominantly medicated BD individuals. Conclusion: This study is the first to report relationships between comorbid symptom dimensions of substance abuse and eating disorder and prefrontal-cortical and subcortical-striatal activity to facial expressions in BD. 
Our findings suggest that these comorbid features may contribute to observed patterns of functional abnormalities in neural systems underlying mood regulation in BD.
Abstract:
The primary aim of this study was to investigate facial emotion recognition (FER) in patients with somatoform disorders (SFD). Also of interest was the extent to which concurrent alexithymia contributed to any changes in emotion recognition accuracy. Twenty patients with SFD and 20 healthy, age, sex and education matched, controls were assessed with the Facially Expressed Emotion Labelling Test of FER and the 26-item Toronto Alexithymia Scale. Patients with SFD exhibited elevated alexithymia symptoms relative to healthy controls. Patients with SFD also recognized significantly fewer emotional expressions than did the healthy controls. However, the group difference in emotion recognition accuracy became nonsignificant once the influence of alexithymia was controlled for statistically. This suggests that the deficit in FER observed in the patients with SFD was most likely a consequence of concurrent alexithymia. It should be noted that neither depression nor anxiety was significantly related to emotion recognition accuracy, suggesting that these variables did not contribute to the emotion recognition deficit. Impaired FER observed in the patients with SFD could plausibly have a negative influence on these individuals' social functioning.
Abstract:
Significant facial emotion recognition (FER) deficits have been observed in participants exhibiting high levels of eating psychopathology. The current study aimed to determine if the pattern of FER deficits is influenced by intensity of facial emotion and to establish if eating psychopathology is associated with a specific pattern of emotion recognition errors that is independent of other psychopathological or personality factors. Eighty females, 40 high and 40 low scorers on the Eating Disorders Inventory (EDI), were presented with a series of faces, each featuring one of five emotional expressions at one of four intensities, and were asked to identify the emotion portrayed. Results revealed that, in comparison to low EDI scorers, high scorers correctly recognised significantly fewer expressions, particularly of fear and anger. There was also a trend for this deficit to be more evident for subtle displays of emotion (50% intensity). Deficits in anger recognition were related specifically to scores on the body dissatisfaction subscale of the EDI. Error analyses revealed that, in comparison to low EDI scorers, high scorers made significantly more fear-as-anger errors. Also, a tendency to label anger expressions as sadness was related to body dissatisfaction. Current findings confirm FER deficits in subclinical eating psychopathology and extend these findings to subtle expressions of emotion. Furthermore, this is the first study to establish that these deficits are related to a specific pattern of recognition errors. Impaired FER could disrupt normal social functioning and might represent a risk factor for the development of more severe psychopathology.