147 results for CARDIO-FACIAL SYNDROME
Abstract:
Objectives: The co-occurrence of anger in young people with Asperger's syndrome (AS) has received little attention, despite aggression, agitation, and tantrums frequently being identified as issues of concern in this population. The present study investigated the occurrence of anger in young people with AS and explored its relationship with anxiety and depression. Method: Sixty-two young people (12-23 years old) diagnosed with AS were assessed using the Beck Anger Inventory for Youth, the Spence Children's Anxiety Scale, and the Reynolds Adolescent Depression Scale. Results: Overall, 41% of participants reported clinically significant levels of anger (17%), anxiety (25.8%), and/or depression (11.5%). Anger, anxiety, and depression were positively correlated with one another; depression, however, was the only significant predictor of anger. Conclusion: Anger is commonly experienced by young people with AS and is correlated with anxiety and depression. These findings suggest that the emotional and behavioral presentation of anger could serve as a cue for further assessment and facilitate earlier identification of, and intervention for, anger as well as other mental health problems.
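The abstract's correlational findings rest on Pearson's product-moment correlation. A minimal, dependency-free sketch of that statistic, applied to invented scale scores (the study's actual data are not reproduced here), might look like:

```python
from math import sqrt

def pearson_r(x, y):
    """Pearson product-moment correlation between two equal-length lists."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Invented anger and depression scale scores for five participants
anger      = [10, 14, 9, 20, 16]
depression = [8, 12, 7, 18, 15]
r = pearson_r(anger, depression)  # strongly positive, close to 1
```

A positive r of this kind is what "positively correlated with each other" summarizes; the study's predictor analysis would additionally require a multiple-regression step, which is not shown.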
Abstract:
We employed a novel cueing paradigm to assess whether dynamically versus statically presented facial expressions differentially engage predictive visual mechanisms. Participants were presented with a cueing stimulus that was either a static depiction of a low-intensity expressed emotion or a dynamic sequence evolving from a neutral expression to the low-intensity expressed emotion. Following this cue and a backward mask, participants were presented with a probe face that displayed either the same emotion (congruent) or a different emotion (incongruent) as the cue, although expressed at high intensity. The probe face had either the same identity as, or a different identity from, the cued face. The participants' task was to indicate whether or not the probe face showed the same emotion as the cue. Dynamic cues and same-identity cues both led to a greater tendency towards congruent responding, although these factors did not interact. Facial motion also led to faster responding when the probe face was emotionally congruent with the cue. We interpret these results as indicating that dynamic facial displays preferentially invoke predictive visual mechanisms, and suggest that motoric simulation may provide an important basis for the generation of predictions in the visual system.
Abstract:
Emotionally arousing events can distort our sense of time. We used a mixed block/event-related fMRI design to establish the neural basis of this effect. Nineteen participants judged whether angry, happy, and neutral facial expressions that varied in duration (from 400 to 1,600 ms) were closer in duration to a short or a long duration they had learnt previously. Time was overestimated for both angry and happy expressions compared to neutral expressions. For faces presented for 700 ms, facial emotion modulated activity in regions of the timing network (Wiener et al., NeuroImage 49(2):1728–1740, 2010), namely the right supplementary motor area (SMA) and the junction of the right inferior frontal gyrus and anterior insula (IFG/AI). Reaction times were slowest when faces were displayed for 700 ms, indicating increased decision-making difficulty. Taken together with existing electrophysiological evidence (Ng et al., Neuroscience, doi: 10.3389/fnint.2011.00077, 2011), the effects are consistent with the idea that facial emotion moderates temporal decision making and that the right SMA and right IFG/AI are key neural structures responsible for this effect.
Abstract:
Because moving depictions of facial emotion have greater ecological validity than their static counterparts, it has been suggested that still photographs may not engage the ‘authentic’ mechanisms used to recognize facial expressions in everyday life. To date, however, no neuroimaging study has adequately addressed the question of whether the processing of static and dynamic expressions relies upon different brain substrates. To address this, we performed a functional magnetic resonance imaging (fMRI) experiment in which participants made emotional-expression discrimination and sex discrimination judgements on static and moving face images. Compared to sex discrimination, emotion discrimination was associated with widespread increased activation in regions of occipito-temporal, parietal, and frontal cortex. These regions were activated both by moving and by static emotional stimuli, indicating a general role in the interpretation of emotion. However, portions of the inferior frontal gyri and supplementary/pre-supplementary motor area showed a task-by-motion interaction: these regions were most active during emotion judgements on static faces. Our results demonstrate a common neural substrate for recognizing static and moving facial expressions, but suggest a role for the inferior frontal gyrus in supporting simulation processes that are invoked more strongly to disambiguate static emotional cues.
Abstract:
Representing facial expressions using continuous dimensions has been shown to be inherently more expressive and psychologically meaningful than using categorized emotions, and has thus gained increasing attention in recent years. Many sub-problems have arisen in this new field that remain only partially understood. Two of these are the comparison of the regression performance of different texture and geometric features, and the investigation of the correlations between the continuous dimensional axes and the basic categorized emotions. This paper presents empirical studies addressing these problems and reports results from an evaluation of different methods for detecting spontaneous facial expressions within the arousal-valence (AV) dimensional space. The evaluation compares the performance of texture features (SIFT, Gabor, LBP) against geometric features (FAP-based distances), and the fusion of the two. It also compares the predictions of arousal and valence, obtained using the best fusion method, to the corresponding ground truths. Spatial distribution, shift, similarity, and correlation are considered for the six basic categorized emotions (i.e., anger, disgust, fear, happiness, sadness, surprise). Using the NVIE database, results show that the fusion of LBP and FAP features performs best. The results from the NVIE and FEEDTUM databases reveal novel findings about the correlations of the arousal and valence dimensions to each of the six basic emotion categories.
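The abstract does not specify the regression model used, but the feature-level fusion it describes is, at its simplest, descriptor concatenation. The sketch below illustrates that idea with a hypothetical 1-nearest-neighbour lookup in AV space; the "LBP" and "FAP" vectors and all numeric values are invented, not taken from the paper:

```python
from math import dist  # Euclidean distance between points (Python 3.8+)

def fuse(lbp, fap):
    """Feature-level fusion: concatenate texture and geometric descriptors."""
    return lbp + fap

def predict_av(query, train):
    """1-nearest-neighbour regression: return the (arousal, valence) label
    of the training sample whose fused descriptor is closest to `query`."""
    return min(train, key=lambda sample: dist(query, sample[0]))[1]

# Invented descriptors: 2-D "LBP" vectors and 1-D "FAP" distances,
# each paired with an (arousal, valence) ground-truth label
train = [
    (fuse([0.20, 0.10], [0.50]), (0.8, 0.6)),    # happiness-like sample
    (fuse([0.90, 0.70], [0.10]), (-0.3, -0.5)),  # sadness-like sample
]
query = fuse([0.25, 0.12], [0.48])
av = predict_av(query, train)
```

The design point is that fusion happens before prediction, so any regressor (the paper evaluates several) can consume the concatenated vector unchanged.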
Abstract:
Introduction: In this study, we report on initial efforts to discover putative biomarkers for the differential diagnosis of a systemic inflammatory response syndrome (SIRS) versus sepsis, and of the different stages of sepsis. In addition, we investigated whether there are proteins that can discriminate between patients who survived sepsis and those who did not. Materials and Methods: Our study group consisted of 16 patients, of whom 6 died and 10 survived. We measured 28 plasma proteins daily for each patient's entire ICU stay. Results: We observed that metalloproteinases and sE-selectin play a role in the distinction between SIRS and sepsis, and that IL-1, IP-10, sTNF-R2, and sFas appear to be indicative of the progression from sepsis to septic shock. A combined measurement of MMP-3, -10, IL-1, IP-10, sIL-2R, sFas, sTNF-R1, sRAGE, GM-CSF, IL-1, and Eotaxin allows for a good separation of patients who survived from those who died (mortality prediction with a sensitivity of 79% and a specificity of 86%). Correlation analysis suggests a novel interaction between IL-1a and IP-10. Conclusion: The marker panel is ready to be verified in a validation study with or without therapeutic intervention.
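The reported sensitivity and specificity are standard confusion-matrix quantities. A minimal sketch of how they are computed, using an invented toy cohort rather than the study's actual 16 patients, might look like:

```python
def sensitivity_specificity(y_true, y_pred):
    """Sensitivity (true-positive rate) and specificity (true-negative rate).
    Here 1 = non-survivor (the 'positive' class) and 0 = survivor."""
    tp = sum(t == 1 and p == 1 for t, p in zip(y_true, y_pred))
    fn = sum(t == 1 and p == 0 for t, p in zip(y_true, y_pred))
    tn = sum(t == 0 and p == 0 for t, p in zip(y_true, y_pred))
    fp = sum(t == 0 and p == 1 for t, p in zip(y_true, y_pred))
    return tp / (tp + fn), tn / (tn + fp)

# Invented outcomes and hypothetical panel predictions for seven patients
outcomes    = [1, 1, 1, 0, 0, 0, 0]  # 1 = died, 0 = survived
predictions = [1, 1, 0, 0, 0, 0, 1]
sens, spec = sensitivity_specificity(outcomes, predictions)
```

Sensitivity here is the fraction of deaths the panel flags, and specificity the fraction of survivors it correctly clears; the paper's 79%/86% figures would come from the same arithmetic applied to its 16-patient cohort.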