42 results for facial expressions
Abstract:
The aim of this study was to empirically evaluate an embodied conversational agent called GRETA in an effort to answer two main questions: (1) What are the benefits (and costs) of presenting information via an animated agent with certain characteristics in a 'persuasion' task, compared to other forms of display? (2) How important is it that the emotional expressions added to animated agents are consistent with the content of the message? To address these questions, a positively framed healthy-eating message was created and variously presented via GRETA, a matched human actor, GRETA's voice only (no face), or as text only. Furthermore, versions of GRETA were created that displayed additional emotional facial expressions either consistently or inconsistently with the content of the message. Overall, it was found that although GRETA received significantly higher ratings for helpfulness and likability, presenting the message via GRETA led to the poorest memory performance among users. Importantly, however, when GRETA's additional emotional expressions were consistent with the content of the verbal message, the negative effect on memory performance disappeared. Overall, the findings point to the importance of achieving consistency in animated agents. (c) 2005 Elsevier Ltd. All rights reserved.
Abstract:
BACKGROUND: Humans from an early age look longer at preferred stimuli, and also typically look longer at facial expressions of emotion, particularly happy faces. Atypical gaze patterns towards social stimuli are common in Autism Spectrum Conditions (ASC). However, it is unknown whether gaze fixation patterns have any genetic basis. In this study, we tested whether variations in the cannabinoid receptor 1 (CNR1) gene are associated with gaze duration towards happy faces. This gene was selected because CNR1 is a key component of the endocannabinoid system, which is involved in processing reward, and in our previous fMRI study we found that variations in CNR1 modulate the striatal response to happy (but not disgust) faces. The striatum is involved in guiding gaze to rewarding aspects of a visual scene. We aimed to validate and extend this result in another sample using a different technique (gaze tracking). METHODS: 30 volunteers (13 males, 17 females) from the general population observed dynamic emotion expressions on a screen while their eye movements were recorded. They were genotyped for the same four SNPs in the CNR1 gene tested in our earlier fMRI study. RESULTS: Two SNPs (rs806377 and rs806380) were associated with differential gaze duration for happy (but not disgust) faces. Importantly, the allelic groups associated with greater striatal response to happy faces in the fMRI study were associated with longer gaze duration for happy faces. CONCLUSIONS: These results suggest that CNR1 variations modulate the striatal function that underlies the perception of signals of social reward, such as happy faces. This suggests that CNR1 is a key element in the molecular architecture of the perception of certain basic emotions. This may have implications for understanding neurodevelopmental conditions marked by atypical eye contact and facial emotion processing, such as ASC.
Abstract:
To investigate the mechanisms involved in automatic processing of facial expressions, we used the QUEST procedure to measure the display durations needed to make a gender decision on emotional faces portraying fearful, happy, or neutral facial expressions. In line with predictions of appraisal theories of emotion, our results showed greater processing priority of emotional stimuli regardless of their valence. Whereas all experimental conditions led to an averaged threshold of about 50 ms, fearful and happy facial expressions led to significantly less variability in the responses than neutral faces. Results suggest that attention may have been automatically drawn by the emotion portrayed by face targets, yielding more informative perceptions and less variable responses. The temporal resolution of the perceptual system (expressed by the thresholds) and the processing priority of the stimuli (expressed by the variability in the responses) may influence subjective and objective measures of awareness, respectively.
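The QUEST procedure named above is a Bayesian adaptive method for placing each trial at the observer's current best threshold estimate. A minimal sketch of that adaptive logic is given below; the Weibull psychometric function, its parameter values, and the simulated observer are illustrative assumptions, not the study's actual settings.

```python
import math
import random

def weibull_p_correct(duration_ms, threshold_ms, slope=3.5, guess=0.5, lapse=0.01):
    """Hypothetical psychometric function: probability of a correct 2AFC
    gender decision at a given display duration, for a candidate threshold."""
    t = (duration_ms / threshold_ms) ** slope
    return guess + (1 - guess - lapse) * (1 - math.exp(-t))

def quest_estimate(observer_threshold_ms=50.0, n_trials=80, seed=1):
    """Bayesian staircase in the spirit of QUEST: keep a posterior over
    candidate thresholds, test at its mode, update after each response."""
    random.seed(seed)
    # Discrete prior over candidate thresholds, log-spaced from 10 to 200 ms.
    candidates = [10 * (200 / 10) ** (i / 99) for i in range(100)]
    log_post = [0.0] * 100
    for _ in range(n_trials):
        # Place the next trial at the current posterior mode.
        mode = max(range(100), key=lambda i: log_post[i])
        duration = candidates[mode]
        # Simulate the observer's response at that duration.
        correct = random.random() < weibull_p_correct(duration, observer_threshold_ms)
        # Bayesian update: add the log-likelihood of the response.
        for i, th in enumerate(candidates):
            p = weibull_p_correct(duration, th)
            log_post[i] += math.log(p if correct else 1 - p)
    mode = max(range(100), key=lambda i: log_post[i])
    return candidates[mode]
```

In a real experiment the simulated response would be replaced by the participant's keypress, and the final posterior mode (or mean) would be reported as the display-duration threshold.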
Abstract:
The neuropeptide substance P and its receptor NK1 have been implicated in emotion, anxiety and stress in preclinical studies. However, the role of NK1 receptors in human brain function is less clear and there have been inconsistent reports of the value of NK1 receptor antagonists in the treatment of clinical depression. The present study therefore aimed to investigate effects of NK1 antagonism on the neural processing of emotional information in healthy volunteers. Twenty-four participants were randomized to receive a single dose of aprepitant (125 mg) or placebo. Approximately 4 h later, neural responses during facial expression processing and an emotional counting Stroop word task were assessed using fMRI. Mood and subjective experience were also measured using self-report scales. As expected a single dose of aprepitant did not affect mood and subjective state in the healthy volunteers. However, NK1 antagonism increased responses specifically during the presentation of happy facial expressions in both the rostral anterior cingulate and the right amygdala. In the emotional counting Stroop task the aprepitant group had increased activation in both the medial orbitofrontal cortex and the precuneus cortex to positive vs. neutral words. These results suggest consistent effects of NK1 antagonism on neural responses to positive affective information in two different paradigms. Such findings confirm animal studies which support a role for NK1 receptors in emotion. Such an approach may be useful in understanding the effects of novel drug treatments prior to full-scale clinical trials.
Abstract:
A wealth of literature suggests that emotional faces are given special status as visual objects: Cognitive models suggest that emotional stimuli, particularly threat-relevant facial expressions such as fear and anger, are prioritized in visual processing and may be identified by a subcortical “quick and dirty” pathway in the absence of awareness (Tamietto & de Gelder, 2010). Both neuroimaging studies (Williams, Morris, McGlone, Abbott, & Mattingley, 2004) and backward masking studies (Whalen, Rauch, Etcoff, McInerney, & Lee, 1998) have supported the notion of emotion processing without awareness. Recently, our own group (Adams, Gray, Garner, & Graf, 2010) showed adaptation to emotional faces that were rendered invisible using a variant of binocular rivalry: continuous flash suppression (CFS; Tsuchiya & Koch, 2005). Here we (i) respond to Yang, Hong, and Blake's (2010) criticisms of our adaptation paper and (ii) provide a unified account of adaptation to facial expression, identity, and gender under conditions of unawareness.
Abstract:
Background: Anhedonia, the loss of pleasure in usually enjoyable activities, is a central feature of major depressive disorder (MDD). The aim of the present study was to examine whether young people at familial risk of depression display signs of anticipatory, motivational or consummatory anhedonia, which would indicate that these deficits may be trait markers for MDD. Methods: The study was completed by 22 participants with a family history of depression (FH+) and 21 controls (HC). Anticipatory anhedonia was assessed by asking participants to rate their anticipated liking of pleasant and unpleasant foods which they imagined tasting when cued with images of the foods. Motivational anhedonia was measured by requiring participants to perform key presses to obtain pleasant chocolate taste rewards or to avoid unpleasant apple tastes. Additionally, physical consummatory anhedonia was examined by instructing participants to rate the pleasantness of the acquired tastes. Moreover, social consummatory anhedonia was investigated by asking participants to make preference-based choices between neutral facial expressions, genuine smiles, and polite smiles. Results: The FH+ group’s anticipated liking of unpleasant foods was significantly lower than that of the control group. By contrast, no group differences were observed in the pleasantness ratings of the actually experienced tastes or in the number of key presses performed. However, controls preferred genuine smiles over neutral expressions more often than they preferred polite smiles over neutral expressions, while this pattern was not seen in the FH+ group. Conclusion: These findings suggest that FH+ individuals demonstrate an altered anticipatory response to negative stimuli and show signs of social consummatory anhedonia, which may be trait markers for depression.
Abstract:
Periocular recognition has recently become an active topic in biometrics. Typically it uses 2D image data of the periocular region. This paper is the first description of combining 3D shape structure with 2D texture. A simple and effective technique using iterative closest point (ICP) was applied for 3D periocular region matching. It proved its strength for relatively unconstrained eye region capture, and does not require any training. Local binary patterns (LBP) were applied for 2D image based periocular matching. The two modalities were combined at the score-level. This approach was evaluated using the Bosphorus 3D face database, which contains large variations in facial expressions, head poses and occlusions. The rank-1 accuracy achieved from the 3D data (80%) was better than that for 2D (58%), and the best accuracy (83%) was achieved by fusing the two types of data. This suggests that significant improvements to periocular recognition systems could be achieved using the 3D structure information that is now available from small and inexpensive sensors.
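The score-level fusion described above can be sketched as a weighted sum over normalized matcher outputs. The min-max normalization, the 0.6 weight on the 3D scores, and all function names below are illustrative assumptions, not the paper's reported configuration.

```python
def min_max_normalize(scores):
    """Map raw matcher scores to [0, 1] so the two modalities are comparable."""
    lo, hi = min(scores), max(scores)
    if hi == lo:
        return [0.0 for _ in scores]
    return [(s - lo) / (hi - lo) for s in scores]

def fuse_scores(scores_3d, scores_2d, w_3d=0.6):
    """Weighted-sum score-level fusion of 3D (e.g., ICP) and 2D (e.g., LBP)
    match scores against a gallery.

    Assumes higher = better match; distance-style scores (ICP residuals,
    LBP histogram distances) should be negated before fusion.
    """
    n3 = min_max_normalize(scores_3d)
    n2 = min_max_normalize(scores_2d)
    return [w_3d * a + (1 - w_3d) * b for a, b in zip(n3, n2)]

def rank1_identity(fused_scores, gallery_ids):
    """Rank-1 decision: return the identity with the highest fused score."""
    best = max(range(len(fused_scores)), key=lambda i: fused_scores[i])
    return gallery_ids[best]
```

For example, a probe whose 3D match is strongest for gallery subject A but whose 2D match favors subject B can still resolve to A once the 3D modality is given the larger weight, which mirrors how fusion can recover from the weaker 2D texture cue.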
Abstract:
Theory of mind ability has been associated with performance in interpersonal interactions and has been found to influence aspects such as emotion recognition, social competence, and social anxiety. Being able to attribute mental states to others requires attention to subtle communication cues such as facial emotional expressions. Decoding and interpreting emotions expressed by the face, especially those with negative valence, are essential skills for successful social interaction. The current study explored the association between theory of mind skills and attentional bias to facial emotional expressions. Consistent with the study hypothesis, individuals with poor theory of mind skills showed preferential attention to negative faces over both non-negative faces and neutral objects. Tentative explanations for the findings are offered, emphasizing the potential adaptive role of vigilance for threat as a way of allocating a limited capacity to interpret others’ mental states so as to obtain as much information as possible about potential danger in the social environment.
Abstract:
Spontaneous mimicry is a marker of empathy. Conditions characterized by reduced spontaneous mimicry (e.g., autism) also display deficits in sensitivity to social rewards. We tested if spontaneous mimicry of socially rewarding stimuli (happy faces) depends on the reward value of stimuli in 32 typical participants. An evaluative conditioning paradigm was used to associate different reward values with neutral target faces. Subsequently, electromyographic activity over the Zygomaticus Major was measured whilst participants watched video clips of the faces making happy expressions. Higher Zygomaticus Major activity was found in response to happy faces conditioned with high reward versus low reward. Moreover, autistic traits in the general population modulated the extent of spontaneous mimicry of happy faces. This suggests a link between reward and spontaneous mimicry and provides a possible underlying mechanism for the reduced response to social rewards seen in autism.
Abstract:
Joint attention (JA) and spontaneous facial mimicry (SFM) are fundamental processes in social interactions, and they are closely related to empathic abilities. When tested independently, both of these processes have usually been observed to be atypical in individuals with autism spectrum conditions (ASC). However, it is not known how these processes interact with each other in relation to autistic traits. This study addresses this question by testing the impact of JA on SFM of happy faces using a truly interactive paradigm. Sixty-two neurotypical participants engaged in gaze-based social interaction with an anthropomorphic, gaze-contingent virtual agent. The agent either established JA by initiating eye contact or looked away, before looking at an object and expressing happiness or disgust. Eye tracking was used to make the agent's gaze behavior and facial actions contingent on the participants' gaze. SFM of happy expressions was measured by electromyography (EMG) recording over the Zygomaticus Major muscle. Results showed that JA augments SFM in individuals with low compared with high autistic traits. These findings are in line with reports of a reduced impact of JA on action imitation in individuals with ASC. Moreover, they suggest that investigating atypical interactions between empathic processes, instead of testing these processes individually, might be crucial to understanding the nature of social deficits in autism.
Abstract:
A multiple-factor parametrization is described to permit the efficient calculation of the collision efficiency (E) between electrically charged aerosol particles and neutral cloud droplets in numerical models of cloud and climate. The four-parameter representation summarizes the results obtained from a detailed microphysical model of E, which accounts for the different forces acting on the aerosol in the path of falling cloud droplets. The parametrization's range of validity is for aerosol particle radii of 0.4 to 10 mu m, aerosol particle densities of 1.0 to 2.0 g cm(-3), aerosol particle charges from neutral to 100 elementary charges, and drop radii from 18.55 to 142 mu m. The parametrization yields values of E well within an order of magnitude of the detailed model's values, from a dataset of 3978 E values. Of these values, 95% have modelled-to-parametrized ratios between 0.5 and 1.5 for aerosol particle sizes between 0.4 and 2.0 mu m, and about 96% in the second size range. This parametrization speeds up the calculation of E by a factor of ~10^3 compared with the original microphysical model, permitting the inclusion of electric charge effects in numerical cloud and climate models.