966 results for Facial expression recognition


Relevância:

100.00%

Publicador:

Resumo:

Facial expression recognition was investigated in 20 males with high-functioning autism (HFA) or Asperger syndrome (AS), compared to typically developing individuals matched for chronological age (TD CA group) and for verbal and non-verbal ability (TD V/NV group). This was the first study to employ a visual search, "face in the crowd" paradigm with an HFA/AS group, exploring responses to numerous facial expressions using real-face stimuli. Results showed slower response times for processing fear, anger and sad expressions in the HFA/AS group relative to the TD CA group, but not the TD V/NV group. Responses to happy, disgust and surprise expressions showed no group differences. Results are discussed with reference to the amygdala theory of autism.

Relevância:

100.00%

Publicador:

Resumo:

The thesis addresses computer-vision topics related to their integration into a Web platform. The text explains solutions for including an emotion-recognition software library in a web application, together with technologies for recording video by capturing images from a webcam.

Relevância:

100.00%

Publicador:

Resumo:

For more than 20 years, many research groups have worked on techniques for automatic facial expression recognition. In recent years, methodological advances have made it possible to rapidly detect the faces present in an image and have provided expression-classification algorithms. This project surveys the state of the art in automatic emotion recognition, covering the methods that exist for facial analysis and emotion recognition. To allow these and future methods to be compared, a modular, extensible, user-friendly tool is implemented. It integrates a feature-extraction method that obtains 19 facial feature points, and two expression classifiers: one based on comparing displacements of the facial points, and one based on detecting specific movements called Action Units. The Cohn-Kanade+ and JAFFE databases, both freely available to the scientific community, are used for system training and subsequent evaluation. The methods are then evaluated with different parameters and databases and with varying numbers of emotions. Finally, conclusions are drawn from the work and its evaluation, and necessary improvements and future research are proposed.
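The point-displacement classifier described above can be illustrated with a minimal sketch. Everything here is hypothetical: the landmark layout, the template values, and the nearest-template rule are invented for illustration, not taken from the project.

```python
import numpy as np

# Hypothetical emotion templates: mean landmark displacement vectors
# (expressive minus neutral), one per emotion. A real system would learn
# these from training data (e.g. Cohn-Kanade+/JAFFE); values are toy.
TEMPLATES = {
    "happiness": np.array([[0.0, -1.0], [0.0, -1.0], [1.0, 0.5], [-1.0, 0.5]]),
    "sadness":   np.array([[0.0,  1.0], [0.0,  1.0], [-0.5, 1.0], [0.5, 1.0]]),
}

def classify_expression(neutral, expressive):
    """Classify an expression by comparing landmark displacements
    (expressive - neutral) against per-emotion template displacements."""
    disp = np.asarray(expressive, float) - np.asarray(neutral, float)
    # Pick the emotion whose template is nearest under Euclidean distance.
    return min(TEMPLATES, key=lambda e: np.linalg.norm(disp - TEMPLATES[e]))
```

In practice the templates would be averaged over many training subjects and the distance could be weighted per landmark; this sketch only shows the comparison step.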

Relevância:

100.00%

Publicador:

Resumo:

Emotion-based analysis has attracted considerable interest, particularly in areas such as forensics, medicine, music, psychology, and human-machine interfaces. Following this trend, facial analysis (either automatic or human-based) is the most commonly investigated approach, since this type of data can easily be collected and is well accepted in the literature as a metric for inferring emotional states. Despite this popularity, several constraints found in real-world scenarios (e.g. lighting, complex backgrounds, facial hair and so on) make it very challenging to automatically obtain accurate affective information from faces. This work presents a framework that aims to analyse emotional experiences through naturally generated facial expressions. Our main contribution is a new 4-dimensional model that describes emotional experiences in terms of appraisal, facial expressions, mood, and subjective experiences. In addition, we present an experiment using a new protocol proposed to obtain spontaneous emotional reactions. The results suggest that the initial emotional state described by the participants differed from that described after exposure to the eliciting stimulus, showing that the stimuli used were capable of inducing the expected emotional states in most individuals. Moreover, our results indicate that spontaneous facial reactions to emotions differ markedly from prototypic expressions, largely because the spontaneous reactions are far less expressive.

Relevância:

100.00%

Publicador:

Resumo:

A large variety of social signals, such as facial expression and body language, are conveyed in everyday interactions and an accurate perception and interpretation of these social cues is necessary in order for reciprocal social interactions to take place successfully and efficiently. The present study was conducted to determine whether impairments in social functioning that are commonly observed following a closed head injury, could at least be partially attributable to disruption in the ability to appreciate social cues. More specifically, an attempt was made to determine whether face processing deficits following a closed head injury (CHI) coincide with changes in electrophysiological responsivity to the presentation of facial stimuli. A number of event-related potentials (ERPs) that have been linked specifically to various aspects of visual processing were examined. These included the N170, an index of structural encoding ability, the N400, an index of the ability to detect differences in serially presented stimuli, and the Late Positivity (LP), an index of the sensitivity to affective content in visually-presented stimuli. Electrophysiological responses were recorded while participants with and without a closed head injury were presented with pairs of faces delivered in a rapid sequence and asked to compare them on the basis of whether they matched with respect to identity or emotion. Other behavioural measures of identity and emotion recognition were also employed, along with a small battery of standard neuropsychological tests used to determine general levels of cognitive impairment. Participants in the CHI group were impaired in a number of cognitive domains that are commonly affected following a brain injury. 
These impairments included reduced efficiency in various aspects of encoding verbal information into memory, a generally slower rate of information processing, decreased sensitivity to smell, and greater difficulty in the regulation of emotion together with limited awareness of this impairment. Impairments in face and emotion processing were clearly evident in the CHI group. However, despite these impairments in face processing, there were no significant differences between groups in the electrophysiological components examined. The only exception was a trend indicating delayed N170 peak latencies in the CHI group (p = .09), which may reflect inefficient structural encoding processes. In addition, group differences were noted in the region of the N100, thought to reflect very early selective attention. It is possible, then, that facial expression and identity processing deficits following CHI are secondary to (or exacerbated by) an underlying disruption of very early attentional processes. Alternatively, the difficulty may arise in the later cognitive stages involved in the interpretation of the relevant visual information. However, the present data do not allow these alternatives to be distinguished. Nonetheless, it was clearly evident that individuals with CHI are more likely than controls to make face processing errors, particularly for the more difficult-to-discriminate negative emotions. Those working with individuals who have sustained a head injury should be alerted to this potential source of social monitoring difficulties, which is often observed as part of the sequelae following a CHI.
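Peak latencies such as the delayed N170 reported above are typically extracted by locating the extreme voltage inside a fixed post-stimulus window of the averaged ERP. A minimal single-channel sketch, assuming a generic 130–200 ms search window (the study's actual analysis window and pipeline are not given in the abstract):

```python
import numpy as np

def peak_latency_ms(epoch, times_ms, window=(130, 200), negative=True):
    """Latency (ms) of an N170-like peak: the extreme voltage within a
    post-stimulus search window of a single-channel ERP epoch."""
    epoch = np.asarray(epoch, float)
    times_ms = np.asarray(times_ms, float)
    mask = (times_ms >= window[0]) & (times_ms <= window[1])
    seg = epoch[mask]
    # N170 is a negative deflection, so take the minimum by default.
    idx = np.argmin(seg) if negative else np.argmax(seg)
    return float(times_ms[mask][idx])
```

Group comparisons (e.g. the p = .09 latency trend) would then be run on these per-subject latencies.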

Relevância:

100.00%

Publicador:

Resumo:

A right-handed man developed a sudden, transient amnestic syndrome associated with bilateral hemorrhage of the hippocampi, probably due to Urbach-Wiethe disease. In the 3rd month, despite significant hippocampal structural damage on imaging, only a milder degree of retrograde and anterograde amnesia persisted on detailed neuropsychological examination. On systematic testing of the recognition of facial and vocal expressions of emotion, we found an impairment of the vocal perception of fear, but not of other emotions such as joy, sadness and anger. This selective impairment of fear perception was not present in the recognition of facial expressions of emotion. Thus emotional perception varies according to the different aspects of emotions and the modality of presentation (faces versus voices). This is consistent with the idea that there may be multiple emotion systems. The study of emotional perception in this unique case of bilateral hippocampal involvement suggests that this structure may play a critical role in the recognition of fear in vocal expression, possibly dissociated from that of other emotions and from that of fear in facial expression. In view of recent data suggesting that the amygdala plays a role in the recognition of fear in the auditory as well as the visual modality, this could suggest that the hippocampus is part of the auditory pathway for fear recognition.

Relevância:

100.00%

Publicador:

Resumo:

The use of new technologies to step up the interaction between humans and machines is clear proof that faces are important in videos. We therefore present a novel Face Video Database for the development, testing and verification of algorithms related to face-based applications and to facial recognition applications. In addition to facial expression videos, the database includes body videos. The videos are taken by three different cameras, working in real time, without varying illumination conditions.

Relevância:

100.00%

Publicador:

Resumo:

Autism is a chronic pervasive neurodevelopmental disorder characterized by the early onset of social and communicative impairments as well as restricted, ritualized, stereotypic behavior. The endophenotype of autism includes neuropsychological deficits, for instance a lack of "Theory of Mind" and problems recognizing facial affect. In this study, we report the development and evaluation of a computer-based program to teach and test the ability to identify basic facially expressed emotions. Ten adolescent or adult subjects with high-functioning autism or Asperger syndrome were included in the investigation. A priori, the facial affect recognition test had shown good psychometric properties in a normative sample (internal consistency: rtt = .91-.95; retest reliability: rtt = .89-.92). In a pre-post design, one half of the sample was randomly assigned to receive the computer treatment while the other half served as a control group. The training was conducted for five weeks, with two hours of training per week. The trained individuals improved significantly on the affect recognition task, but not on any other measure. The results support the usefulness of the program for teaching the detection of facial affect. However, the improvement found is limited to a circumscribed area of social-communicative function, and generalization is not ensured.
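The internal-consistency figures reported above are often estimated with a statistic such as Cronbach's alpha (an assumption here; the abstract's rtt notation does not name the estimator used). A minimal sketch of that computation:

```python
import numpy as np

def cronbach_alpha(item_scores):
    """Cronbach's alpha for an (n_subjects, n_items) score matrix --
    a common estimator of a test's internal consistency."""
    X = np.asarray(item_scores, float)
    k = X.shape[1]
    item_var = X.var(axis=0, ddof=1).sum()   # sum of per-item variances
    total_var = X.sum(axis=1).var(ddof=1)    # variance of total scores
    return k / (k - 1) * (1 - item_var / total_var)
```

When all items vary in lockstep across subjects, alpha reaches its maximum of 1.0; uncorrelated items drive it toward 0.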

Relevância:

100.00%

Publicador:

Resumo:

Several studies have investigated the role of featural and configural information in processing facial identity. Much less is known about their contribution to emotion recognition. In this study, we addressed this issue by inducing either a featural or a configural processing strategy (Experiment 1) and by investigating the attentional strategies in response to emotional expressions (Experiment 2). In Experiment 1, participants identified emotional expressions in faces that were presented in three different versions (intact, blurred, and scrambled) and in two orientations (upright and inverted). Blurred faces contain mainly configural information, and scrambled faces contain mainly featural information. Inversion is known to selectively hinder configural processing. Analyses of the discriminability measure (A′) and response times (RTs) revealed that configural processing plays a more prominent role in expression recognition than featural processing, but their relative contribution varies depending on the emotion. In Experiment 2, we qualified these differences between emotions by investigating the relative importance of specific features by means of eye movements. Participants had to match intact expressions with the emotional cues that preceded the stimulus. The analysis of eye movements confirmed that the recognition of different emotions relies on different types of information. While the mouth is important for the detection of happiness and fear, the eyes are more relevant for anger, fear, and sadness.
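The discriminability measure A′ used in Experiment 1 is conventionally computed from hit and false-alarm rates with the Pollack and Norman (1964) formula; whether this study applied any correction for extreme rates is not stated, so this is the plain textbook form:

```python
def a_prime(hit_rate, fa_rate):
    """Nonparametric discriminability A' (Pollack & Norman, 1964):
    0.5 = chance performance, 1.0 = perfect discrimination.
    Assumes hit_rate and fa_rate are strictly between 0 and 1."""
    h, f = hit_rate, fa_rate
    if h >= f:
        return 0.5 + ((h - f) * (1 + h - f)) / (4 * h * (1 - f))
    # Below-chance case: mirror the formula around 0.5.
    return 0.5 - ((f - h) * (1 + f - h)) / (4 * f * (1 - h))
```

For example, a hit rate of .9 with a false-alarm rate of .1 gives A′ ≈ .94, well above the chance value of .5.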

Relevância:

100.00%

Publicador:

Resumo:

Because faces and bodies share some abstract perceptual features, we hypothesised that similar recognition processes might be used for both. We investigated whether caricature effects similar to those found in facial identity and expression recognition could be found in the recognition of individual bodies and socially meaningful body positions. Participants were trained to name four body positions (anger, fear, disgust, sadness) and four individuals (in a neutral position). We then tested their recognition of extremely caricatured, moderately caricatured, anticaricatured, and undistorted images of each stimulus. Consistent with caricature effects found in face recognition, moderately caricatured representations of individuals' bodies were recognised more accurately than undistorted and extremely caricatured representations. No significant difference was found between participants' recognition of extremely caricatured, moderately caricatured, or undistorted body position line-drawings. All anti-caricatured representations were named significantly less accurately than the veridical stimuli. Similar mental representations may be used for both bodies and faces.
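Caricaturing of this kind is usually implemented by scaling a stimulus's deviation from an average (norm) shape; a minimal sketch with invented coordinates (the study's actual stimulus-generation procedure is not described in the abstract):

```python
import numpy as np

def caricature(points, norm, k):
    """Caricature a shape by scaling its deviation from a norm (average)
    shape: k > 1 exaggerates, 0 < k < 1 anti-caricatures, k = 1 is
    the veridical (undistorted) stimulus."""
    points = np.asarray(points, float)
    norm = np.asarray(norm, float)
    return norm + k * (points - norm)
```

With k = 2 every landmark moves twice as far from the norm as in the original; with k = 0.5 the shape is pulled halfway back toward the average, which is the anti-caricature condition above.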

Relevância:

100.00%

Publicador:

Resumo:

Verbal working memory and emotional self-regulation are impaired in Bipolar Disorder (BD). Our aim was to investigate the effect of lamotrigine (LTG), which is effective in the clinical management of BD, on the neural circuits subserving working memory and emotional processing. Functional Magnetic Resonance Imaging data from 12 stable BD patients were used to detect LTG-induced changes as the differences in brain activity between drug-free and post-LTG monotherapy conditions during a verbal working memory task (N-back sequential letter task) and an angry facial affect recognition task. For both tasks, LTG monotherapy compared to baseline was associated with increased activation mostly within the prefrontal cortex and cingulate gyrus, in regions normally engaged in verbal working memory and emotional processing. Therefore, LTG monotherapy in BD patients may enhance cortical function within neural circuits involved in memory and emotional self-regulation. © 2007 Elsevier B.V. and ECNP.
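The N-back sequential letter task scores each stimulus by whether it matches the letter presented n steps earlier; a minimal sketch of that rule (the letter stream and n = 2 are illustrative, not the study's parameters):

```python
def nback_targets(letters, n=2):
    """Mark each position in a letter stream as a target if it matches
    the letter presented n steps earlier (the N-back rule).
    Positions before the first n letters can never be targets."""
    return [i >= n and letters[i] == letters[i - n]
            for i in range(len(letters))]
```

A participant's accuracy is then the proportion of positions where their "target/non-target" response agrees with this list.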

Relevância:

90.00%

Publicador:

Resumo:

The amygdala participates in the detection and control of affective states, and has been proposed to be a site of dysfunction in affective disorders. To assess amygdala processing in individuals with unipolar depression, we applied a functional MRI (fMRI) paradigm previously shown to be sensitive to amygdala function. Fourteen individuals with untreated DSM-IV major depression and 15 healthy subjects were studied using fMRI with a standardized emotion face recognition task. Voxel-level data sets were subjected to a multiple-regression analysis, and functionally defined regions of interest (ROI), including bilateral amygdala, were analyzed with MANOVA. Pearson correlation coefficients between amygdala activation and HAM-D score also were performed. While both depressed and healthy groups showed increased amygdala activity when viewing emotive faces compared to geometric shapes, patients with unipolar depression showed relatively more activity than healthy subjects, particularly on the left. Positive Pearson correlations between amygdala activation and HAM-D score were found for both left and right ROIs in the patient group. This study provides in vivo imaging evidence to support the hypothesis of abnormal amygdala functioning in depressed individuals. (C) 2009 Elsevier Ireland Ltd. All rights reserved.
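The reported brain-symptom relationship rests on Pearson correlations between per-subject ROI activation and HAM-D scores; a self-contained sketch of the coefficient (the values used in the test are toy data, not the study's):

```python
import numpy as np

def pearson_r(x, y):
    """Pearson correlation coefficient between two score vectors,
    e.g. per-subject amygdala activation vs. HAM-D severity."""
    x = np.asarray(x, float)
    y = np.asarray(y, float)
    xm, ym = x - x.mean(), y - y.mean()
    return float((xm * ym).sum() / np.sqrt((xm ** 2).sum() * (ym ** 2).sum()))
```

A positive r, as found for both left and right amygdala ROIs above, means subjects with higher depression severity also showed stronger amygdala responses.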

Relevância:

90.00%

Publicador:

Resumo:

Background: It has been suggested that individuals with social anxiety disorder (SAD) are exaggeratedly concerned about approval and disapproval by others. Therefore, we assessed the recognition of facial expressions by individuals with SAD, in an attempt to overcome the limitations of previous studies. Methods: The sample comprised 231 individuals (78 SAD patients and 153 healthy controls). All individuals were treatment naive, aged 18-30 years and of similar socioeconomic level. Participants judged which emotion (happiness, sadness, disgust, anger, fear, and surprise) was presented in the facial expression of stimuli displayed on a computer screen. The stimuli were manipulated to depict different emotional intensities: the initial image was a neutral face (0%) and, as the participant moved across the images, the expressions increased in emotional intensity until reaching the full emotion (100%). The time, accuracy, and intensity necessary to perform the judgments were evaluated. Results: The groups did not show statistically significant differences in the number of correct judgments or in the time necessary to respond. However, women with SAD required less emotional intensity to recognize faces displaying fear (p = 0.002), sadness (p = 0.033) and happiness (p = 0.002), with no significant differences for the other emotions or for men with SAD. Conclusions: The findings suggest that women with SAD are hypersensitive to threat-related and approval-related social cues. Future studies investigating the neural basis of the impaired processing of facial emotion in SAD using functional neuroimaging would be desirable and opportune. (C) 2009 Elsevier Ltd. All rights reserved.
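The increasing-intensity paradigm above can be sketched as a linear morph between a neutral and a full-expression image, plus a read-out of the first intensity at which the emotion is correctly judged. This is an assumption about the implementation; only the 0%-to-100% paradigm itself comes from the abstract.

```python
import numpy as np

def morph(neutral, full, intensity):
    """Linearly blend a neutral image toward the full expression:
    intensity 0.0 -> neutral face, 1.0 -> 100% emotional expression."""
    neutral = np.asarray(neutral, float)
    full = np.asarray(full, float)
    return (1.0 - intensity) * neutral + intensity * full

def recognition_threshold(intensities, correct):
    """First morph intensity at which the judgment was correct, or
    None if the emotion was never recognized in the sequence."""
    for i, ok in zip(intensities, correct):
        if ok:
            return i
    return None
```

The group comparison in the study then amounts to comparing these per-trial thresholds between SAD patients and controls for each emotion.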

Relevância:

90.00%

Publicador:

Resumo:

BACKGROUND: Very preterm (VP) infants are at greater risk for cognitive difficulties that may persist through school age, adolescence and adulthood. Behavioral assessments report impairments in either effortful control (part of executive functions) or emotional reactivity/regulation. AIMS: The aim of this study is to examine whether emotional recognition, reactivity, and regulation, as well as effortful control abilities, are impaired in very preterm children at 42 months of age compared with their full-term peers, and to what extent emotional and effortful control difficulties are linked. STUDY DESIGN: Children born very preterm (VP; < 29 weeks gestational age, n = 41) and full-term (FT) age-matched children (n = 47) participated in a series of specific neuropsychological tests assessing their level of emotional understanding, reactivity and regulation, as well as their attentional and effortful control abilities. RESULTS: VP children exhibited higher scores for frustration and fear, and were less accurate in naming facial expressions of emotions than their age-matched peers. However, VP and FT children performed equally well when asked to choose an emotional facial expression in a social context, and when we assessed their selective attention skills. VP children performed significantly lower than full-term children on two tasks of inhibition when correcting for verbal skills. Moreover, significant correlations between cognitive capacities (effortful control) and emotional abilities were evidenced. CONCLUSIONS: Compared to their FT peers, 42-month-olds who were born very preterm are at higher risk of exhibiting specific emotional and effortful control difficulties. The results suggest that these difficulties are linked. Ongoing behavioral and emotional impairments starting at an early age in preterm children highlight the need for early interventions based on a better understanding of the relationship between emotional and cognitive difficulties.