939 results for speech emotion recognition


Relevance: 100.00%

Abstract:

Accurate and fast decoding of speech imagery from electroencephalographic (EEG) data could serve as the basis for a new generation of brain-computer interfaces (BCIs) that are more portable and easier to use. However, decoding speech imagery from EEG is a hard problem due to many factors. In this paper we focus on the analysis of the classification step of speech imagery decoding for a three-class vowel speech imagery recognition problem. We empirically show that different classification subtasks may require different classifiers for accurate decoding, and we obtain a classification accuracy that improves on the best results previously published. We further investigate the relationship between the classifiers and different sets of features selected by the common spatial patterns method. Our results indicate that further improvement in BCIs based on speech imagery could be achieved by carefully selecting an appropriate combination of classifiers for the subtasks involved.
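
A minimal sketch of the classifier-per-subtask idea described above, assuming preprocessed EEG epochs are available as a NumPy array; the one-vs-one decomposition, the candidate classifier list, and the CSP settings are illustrative assumptions, not the authors' exact pipeline.

```python
# Sketch: pick the best classifier for each pairwise subtask of a
# three-class vowel speech-imagery problem, using CSP features.
import numpy as np
from itertools import combinations
from mne.decoding import CSP                      # common spatial patterns features
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import cross_val_score
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.standard_normal((90, 32, 256))            # epochs x channels x samples (synthetic)
y = np.repeat([0, 1, 2], 30)                      # three vowel classes

candidates = {"LDA": LinearDiscriminantAnalysis(),
              "SVM": SVC(kernel="rbf", C=1.0)}

for a, b in combinations(np.unique(y), 2):        # one binary subtask per class pair
    mask = np.isin(y, [a, b])
    scores = {}
    for name, clf in candidates.items():
        pipe = make_pipeline(CSP(n_components=4, log=True), clf)
        scores[name] = cross_val_score(pipe, X[mask], y[mask], cv=5).mean()
    best = max(scores, key=scores.get)
    print(f"subtask {a} vs {b}: best classifier = {best} ({scores[best]:.2f})")
```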

Relevance: 100.00%

Abstract:

The work presented here is part of a larger study to identify novel technologies and biomarkers for early Alzheimer's disease (AD) detection, and it focuses on evaluating the suitability of a new, non-invasive approach to early AD diagnosis. The purpose is to examine, in a pilot study, the potential of applying intelligent algorithms to speech features obtained from suspected patients in order to contribute to improving the diagnosis of AD and the assessment of its degree of severity. In this sense, Artificial Neural Networks (ANN) have been used for the automatic classification of the two classes (AD and control subjects). Two aspects of human behavior have been analyzed for feature selection: Spontaneous Speech and Emotional Response. Not only linear features but also non-linear ones, such as Fractal Dimension, have been explored. The approach is non-invasive, low-cost, and free of side effects. The experimental results were very satisfactory and promising for the early diagnosis and classification of AD patients.
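
A minimal sketch of the kind of feature pipeline described above, pairing a non-linear feature (Higuchi fractal dimension) with simple linear ones and a small ANN; the feature set, network size, and synthetic data are assumptions for illustration, not the study's protocol.

```python
# Sketch: combine a non-linear speech feature (Higuchi fractal dimension)
# with linear ones and classify AD vs. control using a small ANN.
import numpy as np
from sklearn.neural_network import MLPClassifier

def higuchi_fd(x, k_max=10):
    """Approximate Higuchi fractal dimension of a 1-D signal."""
    n = len(x)
    lk = []
    for k in range(1, k_max + 1):
        lengths = []
        for m in range(k):
            idx = np.arange(m, n, k)
            lengths.append(np.sum(np.abs(np.diff(x[idx]))) * (n - 1) / (len(idx) * k))
        lk.append(np.mean(lengths))
    # slope of log L(k) vs. log(1/k) estimates the fractal dimension
    coeffs = np.polyfit(np.log(1.0 / np.arange(1, k_max + 1)), np.log(lk), 1)
    return coeffs[0]

def features(signal):
    # two simple linear features plus one non-linear feature
    return [np.mean(signal), np.std(signal), higuchi_fd(signal)]

rng = np.random.default_rng(1)
signals = rng.standard_normal((40, 16000))        # synthetic "speech" segments
labels = rng.integers(0, 2, 40)                   # 0 = control, 1 = AD (synthetic)
X = np.array([features(s) for s in signals])

clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0).fit(X, labels)
print(clf.score(X, labels))
```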

Relevance: 100.00%

Abstract:

Psychopathy is associated with well-known characteristics such as a lack of empathy and impulsive behaviour, but it has also been associated with impaired recognition of emotional facial expressions. The use of event-related potentials (ERPs) to examine this phenomenon could shed light on the specific time course and neural activation associated with emotion recognition processes as they relate to psychopathic traits. In the current study we examined the P1, N170, and vertex positive potential (VPP) ERP components and behavioural performance with respect to scores on the Self-Report Psychopathy (SRP-III) questionnaire. Thirty undergraduates completed two tasks, the first of which required the recognition and categorization of affective face stimuli under varying presentation conditions. Happy, angry, or fearful faces were presented with attention directed to the mouth, nose, or eye region and with varied stimulus exposure durations (30, 75, or 150 ms). We found behavioural performance to be unrelated to psychopathic personality traits in all conditions, but there was a trend for the N170 to peak later in response to fearful and happy facial expressions in individuals high in psychopathic traits. The amplitude of the VPP, however, was significantly negatively associated with psychopathic traits, but only in response to stimuli presented under nose-level fixation. Finally, psychopathic traits were found to be associated with longer N170 latencies in response to stimuli presented at the 30 ms exposure duration.

In the second task, participants were required to inhibit processing of irrelevant affective and scrambled face distractors while categorizing unrelated word stimuli as living or nonliving. Psychopathic traits were hypothesized to be positively associated with behavioural performance, as it was proposed that individuals high in psychopathic traits would be less likely to automatically attend to task-irrelevant affective distractors, facilitating word categorization. Decreased interference would thus be reflected in smaller N170 components, indicating less neural activity associated with processing of the distractor faces. We found that overall performance decreased in the presence of angry and fearful distractor faces as psychopathic traits increased. In addition, the amplitude of the N170 decreased and its latency increased in response to affective distractor faces in individuals with higher levels of psychopathic traits. Although we failed to find the predicted behavioural deficit in emotion recognition in Task 1 and the predicted facilitation effect in Task 2, the findings of increased N170 and VPP latencies in response to emotional faces are consistent with the proposition that abnormal emotion recognition processes may in fact be inherent to psychopathy as a continuous personality trait.
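
As an illustration of the ERP measures involved, the sketch below extracts an N170-like peak latency and amplitude from an averaged waveform with MNE-Python; the channel name, time window, and synthetic waveform are assumptions, not the study's recording setup.

```python
# Sketch: extract N170 peak latency and amplitude from an averaged ERP
# at a posterior-temporal channel, as one might relate them to SRP-III scores.
import numpy as np
import mne

info = mne.create_info(["P8"], sfreq=500.0, ch_types="eeg")
times = np.arange(-0.1, 0.4, 1.0 / 500.0)
# synthetic waveform with a negative deflection near 170 ms
data = -5e-6 * np.exp(-((times - 0.17) ** 2) / (2 * 0.02 ** 2))[np.newaxis, :]
evoked = mne.EvokedArray(data, info, tmin=-0.1)

# search for the most negative point in a typical N170 window
ch, latency, amplitude = evoked.get_peak(tmin=0.13, tmax=0.2, mode="neg",
                                         return_amplitude=True)
print(f"N170 at {latency * 1e3:.0f} ms, {amplitude * 1e6:.1f} uV on {ch}")
```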

Relevance: 100.00%

Abstract:

The effect of multiple sclerosis (MS) on the ability to identify emotional expressions in faces was investigated, and possible associations with patients' characteristics were explored. Fifty-six non-demented MS patients and 56 healthy controls (HC) with similar demographic characteristics performed an emotion recognition task (ERT) and the Benton Facial Recognition Test (BFRT), and completed the Hospital Anxiety and Depression Scale (HADS). Additionally, MS patients underwent a neurological examination and a comprehensive neuropsychological evaluation. The ERT consisted of 42 pictures of faces (depicting anger, disgust, fear, happiness, sadness, surprise, and neutral expressions) from the NimStim set. An iViewX high-speed eye tracker was used to record eye movements during the ERT. Fixation times were calculated for two regions of interest (the eyes and the rest of the face). No significant differences were found between MS patients and HC on the ERT's behavioral and oculomotor measures. Bivariate and multiple regression analyses revealed significant associations between ERT behavioral performance and demographic, clinical, psychopathological, and cognitive measures.
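
A minimal sketch of the fixation-time computation described for the two regions of interest; the ROI coordinates and fixation records are hypothetical.

```python
# Sketch: total fixation time inside two regions of interest
# (eyes vs. rest of the face) from a list of detected fixations.
from dataclasses import dataclass

@dataclass
class Fixation:
    x: float            # gaze position in screen pixels
    y: float
    duration_ms: float

# hypothetical rectangles: (x_min, y_min, x_max, y_max) in screen pixels
EYES_ROI = (300, 200, 700, 320)
FACE_ROI = (250, 120, 750, 700)

def in_roi(f, roi):
    x0, y0, x1, y1 = roi
    return x0 <= f.x <= x1 and y0 <= f.y <= y1

def fixation_times(fixations):
    eyes = sum(f.duration_ms for f in fixations if in_roi(f, EYES_ROI))
    rest = sum(f.duration_ms for f in fixations
               if in_roi(f, FACE_ROI) and not in_roi(f, EYES_ROI))
    return {"eyes": eyes, "rest_of_face": rest}

print(fixation_times([Fixation(450, 250, 180), Fixation(500, 500, 220)]))
```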

Relevance: 100.00%

Abstract:

Empathy is the lens through which we view and respond to others' emotion expressions. In this study, empathy and facial emotion recognition were investigated in adults with autism spectrum conditions (ASC; N=314), parents of a child with ASC (N=297), and IQ-matched controls (N=184). Participants completed a self-report measure of empathy (the Empathy Quotient [EQ]) and a modified version of the Karolinska Directed Emotional Faces Task (KDEF) using an online test interface. Results showed that mean scores on the EQ were significantly lower in fathers (p<0.05) but not mothers (p>0.05) of children with ASC compared to controls, whilst both males and females with ASC obtained significantly lower EQ scores (p<0.001) than controls. On the KDEF, statistical analyses revealed poorer overall performance by adults with ASC (p<0.001) compared to the control group. When the six distinct basic emotions were analysed separately, the ASC group showed impaired performance on five of the six expressions (happy, sad, angry, afraid, and disgusted). Parents of a child with ASC were not significantly worse than controls at recognising any of the basic emotions, after controlling for age and non-verbal IQ (all p>0.05). Finally, results indicated significant differences between males and females with ASC in emotion recognition performance (p<0.05) but not in self-reported empathy (p>0.05). These findings suggest that self-reported empathy deficits in fathers of autistic probands are part of the 'broader autism phenotype'. This study also reports new findings of sex differences amongst people with ASC in emotion recognition, as well as replicating previous work demonstrating empathy difficulties in adults with ASC. The use of empathy measures as quantitative endophenotypes for ASC is discussed.
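
A minimal sketch of a group comparison that controls for age and non-verbal IQ, in the spirit of the analysis described above; the column names and synthetic data are assumptions, not the study's dataset.

```python
# Sketch: compare emotion-recognition scores between groups while
# controlling for age and non-verbal IQ via an OLS/ANCOVA-style model.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 120
df = pd.DataFrame({
    "group": rng.choice(["ASC", "parent", "control"], size=n),
    "age": rng.normal(40, 10, n),
    "nonverbal_iq": rng.normal(110, 12, n),
    "kdef_score": rng.normal(75, 8, n),
})

# group as a categorical predictor, covariates entered alongside it
model = smf.ols("kdef_score ~ C(group) + age + nonverbal_iq", data=df).fit()
print(model.summary().tables[1])
```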

Relevance: 100.00%

Abstract:

In this research, we propose a facial expression recognition system with a layered encoding cascade optimization model. Since generating an effective facial representation is a vital step for successful facial emotion recognition, a modified Local Gabor Binary Pattern operator is first employed to derive a refined initial face representation. We then propose two evolutionary algorithms for feature optimization under the layered cascade model: (i) direct similarity and (ii) Pareto-based feature selection. The direct similarity feature selection considers characteristics within the same emotion category that give the minimum within-class variation, while the Pareto-based feature optimization focuses on features that best represent each expression category and at the same time provide the greatest distinction from other expressions. Both a neural network and an ensemble classifier with a weighted majority vote are implemented for the recognition of seven expressions based on the selected optimized features. The ensemble model also automatically updates itself with the most recent concepts in the data. Evaluated on the Cohn-Kanade database, our system achieves its best accuracies when the ensemble classifier is applied and outperforms other research reported in the literature, with 96.8% for direct similarity-based optimization and 97.4% for Pareto-based feature selection. Cross-database evaluation with frontal images from the MMI database has also been conducted to further prove system efficiency; it achieves 97.5% for the Pareto-based approach and 90.7% for direct similarity-based feature selection, outperforming related research on MMI. When evaluated with 90° side-view images extracted from the videos of the MMI database, the system achieves superior performance, with >80% accuracy for both optimization algorithms. Experiments with other weighting and meta-learning combination methods for constructing ensembles are also explored, with our proposed ensemble showing great adaptivity to new test data streams in cross-database evaluation. In future work, we aim to incorporate other filtering techniques and evolutionary algorithms into the optimization models to further enhance recognition performance.
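
A minimal sketch of the weighted-majority-vote ensemble stage described above; the base learners, vote weights, and synthetic features stand in for the authors' LGBP-based pipeline and are assumptions, not their implementation.

```python
# Sketch: an ensemble with a weighted majority vote over three base
# classifiers for a seven-class expression problem.
from sklearn.datasets import make_classification
from sklearn.ensemble import VotingClassifier, RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=700, n_features=60, n_informative=20,
                           n_classes=7, random_state=0)   # 7 expression classes
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

ensemble = VotingClassifier(
    estimators=[("mlp", MLPClassifier(max_iter=1000, random_state=0)),
                ("rf", RandomForestClassifier(random_state=0)),
                ("lr", LogisticRegression(max_iter=1000))],
    voting="hard",
    weights=[2, 2, 1],                 # illustrative vote weights
)
ensemble.fit(X_tr, y_tr)
print(f"ensemble accuracy: {ensemble.score(X_te, y_te):.3f}")
```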

Relevance: 100.00%

Abstract:

Background: One of the many cognitive deficits reported in bipolar disorder (BD) patients is impaired facial emotion recognition (FER), which has recently been associated with dopaminergic catabolism. Catechol-O-methyltransferase (COMT) is one of the main enzymes involved in the metabolic degradation of dopamine (DA) in the prefrontal cortex (PFC). The Met allele of the COMT gene polymorphism rs4680 (Val158Met) is associated with decreased activity of this enzyme in healthy controls. The objective of this study was to evaluate the influence of Val158Met on FER during manic and depressive episodes in BD patients and in healthy controls. Materials and methods: 64 BD type I patients (39 in manic and 25 in depressive episodes) and 75 healthy controls were genotyped for COMT rs4680 and assessed for FER using the Ekman 60 Faces (EK60) and Emotion Hexagon (Hx) tests. Results: Bipolar manic patients carrying the Met allele recognized fewer "surprised" faces, while depressed patients with the Met allele recognized fewer "angry" and "happy" faces. Healthy subjects homozygous for the Met allele had higher FER scores on the Hx total score, as well as on "disgust" and "angry" faces, than other genotypes. Conclusion: This is the first study suggesting that COMT rs4680 modulates FER differently during BD episodes and in healthy controls. This provides evidence that PFC DA is part of the neurobiological mechanisms of social cognition. Further studies on other COMT polymorphisms that include euthymic BD patients are warranted. ClinicalTrials.gov Identifier: NCT00969. (C) 2011 Elsevier B.V. All rights reserved.
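
A minimal sketch of testing whether FER scores differ by COMT rs4680 genotype, in the spirit of the analysis above; the data layout and synthetic values are assumptions, not the study's clinical data.

```python
# Sketch: one-way ANOVA of Ekman-60 recognition scores across COMT
# rs4680 genotypes within a single group.
import numpy as np
import pandas as pd
from scipy.stats import f_oneway

rng = np.random.default_rng(3)
df = pd.DataFrame({
    "genotype": rng.choice(["Val/Val", "Val/Met", "Met/Met"], size=75),
    "ek60_total": rng.normal(48, 6, 75),
})

groups = [g["ek60_total"].to_numpy() for _, g in df.groupby("genotype")]
stat, p = f_oneway(*groups)             # compare score means across genotypes
print(f"F = {stat:.2f}, p = {p:.3f}")
```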