6 results for facial recognition
in BORIS: Bern Open Repository and Information System - Bern - Switzerland
Abstract:
One of the most consistent findings in the neuroscience of autism is hypoactivation of the fusiform gyrus (FG) during face processing. In this study the authors examined whether successful facial affect recognition training is associated with an increased activation of the FG in autism. The effect of a computer-based program to teach facial affect identification was examined in 10 individuals with high-functioning autism. Blood oxygenation level-dependent (BOLD) functional magnetic resonance imaging (fMRI) changes in the FG and other regions of interest, as well as behavioral facial affect recognition measures, were assessed pre- and posttraining. No significant activation changes in the FG were observed. Trained participants showed behavioral improvements, which were accompanied by higher BOLD fMRI signals in the superior parietal lobule and maintained activation in the right medial occipital gyrus.
Abstract:
Autism is a chronic pervasive neurodevelopmental disorder characterized by the early onset of social and communicative impairments as well as restricted, ritualized, stereotypic behavior. The endophenotype of autism includes neuropsychological deficits, for instance a lack of "Theory of Mind" and problems recognizing facial affect. In this study, we report the development and evaluation of a computer-based program to teach and test the ability to identify basic facially expressed emotions. Ten adolescent or adult subjects with high-functioning autism or Asperger syndrome were included in the investigation. A priori, the facial affect recognition test had shown good psychometric properties in a normative sample (internal consistency: rtt = .91-.95; retest reliability: rtt = .89-.92). In a pre-post design, one half of the sample was randomly assigned to receive the computer treatment while the other half served as a control group. The training was conducted over five weeks, with two hours of training per week. The trained individuals improved significantly on the affect recognition task, but not on any other measure. Results support the usefulness of the program for teaching the detection of facial affect. However, the improvement found is limited to a circumscribed area of social-communicative function, and generalization is not ensured.
Abstract:
OBJECTIVE: To review the epidemiology of facial fractures in children and to analyze whether it has changed over time. STUDY DESIGN: Retrospective review of the records of children aged 15 years or younger diagnosed with facial fractures during two 10-year periods. RESULTS: A total of 378 children were diagnosed with fractures, 187 in 1980-1989 and 191 in 1993-2002. The proportion of children with mandibular fractures decreased by 13.6 percentage points from the first period to the second, whereas the proportion of patients with midfacial fractures increased by 18.7 percentage points. Assault as a causative factor increased by 5.5 percentage points, almost exclusively among children aged 13-15 years, in whom it accounted for a high share of fractures (23.5%). CONCLUSIONS: The recognized change in fracture patterns over time is probably due to the increased use of computed tomographic scanning.
Abstract:
Several studies have investigated the role of featural and configural information in the processing of facial identity. Much less is known about their contribution to emotion recognition. In this study, we addressed this issue by inducing either a featural or a configural processing strategy (Experiment 1) and by investigating the attentional strategies in response to emotional expressions (Experiment 2). In Experiment 1, participants identified emotional expressions in faces that were presented in three different versions (intact, blurred, and scrambled) and in two orientations (upright and inverted). Blurred faces contain mainly configural information, and scrambled faces contain mainly featural information. Inversion is known to selectively hinder configural processing. Analyses of the discriminability measure (A′) and response times (RTs) revealed that configural processing plays a more prominent role in expression recognition than featural processing, but their relative contribution varies depending on the emotion. In Experiment 2, we qualified these differences between emotions by investigating the relative importance of specific features by means of eye movements. Participants had to match intact expressions with the emotional cues that preceded the stimulus. The analysis of eye movements confirmed that the recognition of different emotions relies on different types of information. While the mouth is important for the detection of happiness and fear, the eyes are more relevant for anger, fear, and sadness.
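The discriminability measure A′ used in the abstract above is a standard nonparametric signal-detection index computed from hit and false-alarm rates (Pollack & Norman's formula). A minimal sketch of that standard computation; the function name and structure are illustrative, not taken from the study:

```python
def a_prime(hit_rate: float, fa_rate: float) -> float:
    """Nonparametric discriminability A' (Pollack & Norman).

    Returns 0.5 for chance-level performance and approaches 1.0
    for perfect discrimination when hits exceed false alarms.
    """
    h, f = hit_rate, fa_rate
    if h == f:
        return 0.5  # no discrimination between signal and noise
    if h > f:
        return 0.5 + ((h - f) * (1 + h - f)) / (4 * h * (1 - f))
    # Below-chance performance: symmetric form with the roles swapped
    return 0.5 - ((f - h) * (1 + f - h)) / (4 * f * (1 - h))
```

Unlike d′, A′ requires no normality assumption about the underlying signal and noise distributions, which is why it is common in expression-recognition tasks with few trials per condition.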
Abstract:
Background: Emotional processing in essential hypertension has hardly been investigated beyond self-report questionnaires. The aim of this study was to examine associations between hypertension status and recognition of facial affect. Methods: 25 healthy, non-smoking, medication-free men, including 13 hypertensive subjects, aged between 20 and 65 years completed a computer-based task examining sensitivity of facial affect recognition. Neutral faces gradually changed to a specific emotion in a pseudo-continuous manner. Slides of the six basic emotions (fear, sadness, disgust, happiness, anger, surprise) were chosen from the "NimStim Set". Pictures of three female and three male faces were electronically morphed in 1% steps of intensity from 0% to 100% (36 sets of faces with 100 pictures each). Each picture of a set was presented for one second, ranging from 0% to 100% intensity. Participants were instructed to press a stop button as soon as they recognized the expression of the face. After stopping, a forced choice between the six basic emotions was required. As dependent variables, we recorded the emotion intensity at which the presentation was stopped and the number of errors (error rate). Recognition sensitivity was calculated as the emotion intensity of correctly identified emotions. Results: Mean arterial pressure was associated with significantly increased recognition sensitivity of facial affect for the emotion anger (β = -.43, p = .03, ΔR² = .110). There was no association for the emotions fear, sadness, disgust, happiness, and surprise (ps > .41). Mean arterial pressure was not related to the mean number of errors for any of the facial emotions. Conclusions: Our findings suggest that increased blood pressure is associated with increased recognition sensitivity for facial anger: hypertensives perceive angry facial expressions faster than normotensives.
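The scoring described in the Methods above (sensitivity = morph intensity at which an emotion was correctly identified, scored over correct trials only, plus a per-emotion error rate) can be sketched as follows. The trial records and function names here are hypothetical illustrations, not the study's actual data or code:

```python
# Hypothetical trial records: (stop intensity in %, chosen emotion, true emotion)
trials = [
    (42, "anger", "anger"),
    (55, "fear", "fear"),
    (60, "sadness", "fear"),  # incorrect choice: counts toward error rate only
    (38, "anger", "anger"),
]

def recognition_sensitivity(trials, emotion):
    """Mean morph intensity (%) at which a given emotion was correctly identified.

    Lower values indicate faster, more sensitive recognition; only correct
    trials enter the mean, mirroring the scoring described in the abstract.
    """
    correct = [stop for stop, chosen, true in trials
               if true == emotion and chosen == emotion]
    return sum(correct) / len(correct) if correct else None

def error_rate(trials, emotion):
    """Proportion of trials for a given target emotion answered incorrectly."""
    relevant = [(chosen, true) for _, chosen, true in trials if true == emotion]
    if not relevant:
        return None
    return sum(chosen != true for chosen, true in relevant) / len(relevant)
```

Under this scoring a negative regression coefficient (as for anger, β = -.43) means higher blood pressure predicts recognition at lower morph intensities, i.e. faster recognition.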