Abstract:
Facial expressions are the most expressive way humans display emotions. Many algorithms have been proposed that employ a particular set of people (usually a single database) to both train and test their models. This paper focuses on the challenging task of database-independent emotion recognition, a generalized case of subject-independent emotion recognition. The emotion recognition system employed in this work is a Meta-Cognitive Neuro-Fuzzy Inference System (McFIS). McFIS has two components: a neuro-fuzzy inference system, which is the cognitive component, and a self-regulatory learning mechanism, which is the meta-cognitive component. The meta-cognitive component monitors the knowledge in the neuro-fuzzy inference system and efficiently decides what-to-learn, when-to-learn, and how-to-learn from the training samples. For each sample, McFIS decides whether to delete the sample without learning it, use it to add or prune rules or update the network parameters, or reserve it for future use. This helps the network avoid over-training and, as a result, improves its generalization performance on untrained databases. In this study, we extract pixel-based emotion features from the well-known Japanese Female Facial Expression (JAFFE) and Taiwanese Facial Expression Image (TFEID) databases. Two sets of experiments are conducted. First, we study the individual performance of McFIS on each database in a 5-fold cross-validation study. Next, in order to study generalization performance, McFIS trained on the JAFFE database is tested on TFEID, and vice versa. The performance comparison in both experiments against an SVM classifier gives promising results.
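A minimal, hypothetical sketch of the per-sample decision loop described above (delete, learn now, or reserve), assuming simple error thresholds; the `FuzzyNetwork` placeholder and the threshold values are illustrative and do not reproduce the authors' McFIS implementation:

```python
# Hypothetical sketch of meta-cognitive sample selection in the spirit of McFIS.
# Thresholds, the error measure, and the FuzzyNetwork class are assumptions.
from collections import deque

class FuzzyNetwork:
    """Placeholder cognitive component (neuro-fuzzy inference system)."""
    def predict_error(self, x, y):
        # A real system would return the prediction error / novelty of (x, y);
        # here we return a dummy value so the sketch stays self-contained.
        return 0.5

    def add_or_update_rule(self, x, y):
        pass  # grow/prune fuzzy rules or tune parameters for this sample

def metacognitive_training(samples, delete_thr=0.1, learn_thr=0.5):
    net = FuzzyNetwork()
    reserve = deque()                 # samples deferred for future epochs
    for x, y in samples:
        err = net.predict_error(x, y)
        if err < delete_thr:
            continue                  # what-to-learn: nothing new, delete sample
        elif err > learn_thr:
            net.add_or_update_rule(x, y)   # when/how-to-learn: learn it now
        else:
            reserve.append((x, y))         # reserve the sample for a later pass
    return net, list(reserve)
```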
Abstract:
The study of emotions in human-computer interaction is a growing research area. This paper presents an attempt to select the most significant features for emotion recognition in spoken Basque and Spanish using different feature selection methods. The RekEmozio database was used as the experimental data set. Several machine learning paradigms were used for the emotion classification task. Experiments were executed in three phases, using different sets of features as classification variables in each phase. Moreover, feature subset selection was applied at each phase in order to search for the most relevant feature subset. The three-phase approach was selected to check the validity of the proposed approach. The results show that an instance-based learning algorithm using feature subset selection techniques based on evolutionary algorithms is the best machine learning paradigm for automatic emotion recognition across all feature sets, obtaining a mean emotion recognition rate of 80.05% in Basque and 74.82% in Spanish. To check the soundness of the proposed process, a greedy search approach (FSS-Forward) was also applied and a comparison between the two is provided. Based on the results, a set of the most relevant speaker-independent features is proposed for both languages and new perspectives are suggested.
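For illustration, a minimal sketch of greedy forward feature subset selection (in the spirit of FSS-Forward) wrapped around an instance-based k-NN classifier; the synthetic data, class count, and parameter choices are assumptions, and the evolutionary search used in the paper is not reproduced:

```python
# Hedged sketch: greedy forward feature selection with a k-NN wrapper.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score

def forward_feature_selection(X, y, max_features=10, cv=5):
    selected, remaining = [], list(range(X.shape[1]))
    best_score = 0.0
    while remaining and len(selected) < max_features:
        scores = []
        for f in remaining:
            cols = selected + [f]
            acc = cross_val_score(KNeighborsClassifier(n_neighbors=5),
                                  X[:, cols], y, cv=cv).mean()
            scores.append((acc, f))
        acc, f = max(scores)          # best single feature to add this round
        if acc <= best_score:
            break                     # stop when no remaining feature helps
        best_score = acc
        selected.append(f)
        remaining.remove(f)
    return selected, best_score

# Usage with synthetic stand-ins for prosodic features (not RekEmozio data)
rng = np.random.default_rng(0)
X = rng.normal(size=(120, 20))
y = rng.integers(0, 4, size=120)      # four hypothetical emotion classes
feats, score = forward_feature_selection(X, y, max_features=5)
```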
Abstract:
In this paper, a novel approach for Mandarin speech emotion recognition based on high-dimensional geometry theory is proposed. Human emotions are classified into six archetypal classes: fear, anger, happiness, sadness, surprise, and disgust. According to the characteristics of these emotional speech signals, the amplitude, pitch frequency, and formants are used as the feature parameters for speech emotion recognition. The new method, based on high-dimensional geometry theory, is applied for recognition. Compared with the traditional GSVM model, the new method has several advantages. This method has significant value for future research and applications.
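As a rough illustration of the feature types named above (amplitude, pitch frequency, formants), a hedged sketch using librosa; the file name, frequency range, and LPC order are assumptions, and the high-dimensional-geometry classifier itself is not shown:

```python
# Hedged sketch of extracting amplitude, pitch, and formant-style features.
import numpy as np
import librosa

y, sr = librosa.load("utterance.wav", sr=16000)   # hypothetical input file

# Amplitude: frame-wise RMS energy
rms = librosa.feature.rms(y=y)[0]

# Pitch frequency: fundamental frequency via the pYIN tracker
f0, voiced, _ = librosa.pyin(y, fmin=60, fmax=400, sr=sr)

# Formants: rough estimate from the roots of an LPC polynomial
a = librosa.lpc(y, order=12)
roots = [r for r in np.roots(a) if np.imag(r) > 0]
formants = sorted(np.angle(roots) * sr / (2 * np.pi))[:3]  # approx. F1-F3 in Hz

features = np.hstack([rms.mean(), np.nanmean(f0), formants])
```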
Abstract:
Both animals and mobile robots, or animats, need adaptive control systems to guide their movements through a novel environment. Such control systems need reactive mechanisms for exploration, and learned plans to efficiently reach goal objects once the environment is familiar. How reactive and planned behaviors interact in real time, and are released at the appropriate times, during autonomous navigation remains a major unsolved problem. This work presents an end-to-end model to address this problem, named SOVEREIGN: a Self-Organizing, Vision, Expectation, Recognition, Emotion, Intelligent, Goal-oriented Navigation system. The model comprises several interacting subsystems, governed by systems of nonlinear differential equations. As the animat explores the environment, a vision module processes visual inputs using networks that are sensitive to visual form and motion. Targets processed within the visual form system are categorized by real-time incremental learning. Simultaneously, visual target position is computed with respect to the animat's body. Estimates of target position activate a motor system to initiate approach movements toward the target. Motion cues from animat locomotion can elicit orienting head or camera movements to bring a new target into view. Approach and orienting movements are alternately performed during animat navigation. Cumulative estimates of each movement, based on both visual and proprioceptive cues, are stored within a motor working memory. Sensory cues are stored in a parallel sensory working memory. These working memories trigger learning of sensory and motor sequence chunks, which together control planned movements. Effective chunk combinations are selectively enhanced via reinforcement learning when the animat is rewarded. The planning chunks effect a gradual transition from reactive to planned behavior. The model can read out different motor sequences under different motivational states and learns more efficient paths to rewarded goals as exploration proceeds. Several volitional signals automatically gate the interactions between model subsystems at appropriate times. A 3-D visual simulation environment reproduces the animat's sensory experiences as it moves through a simplified spatial environment. The SOVEREIGN model exhibits robust goal-oriented learning of sequential motor behaviors. Its biomimetic structure explicates a number of brain processes that are involved in spatial navigation.
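A highly simplified, hypothetical sketch of the reactive-to-planned transition described above: reactive exploration caches rewarded movement sequences as chunks, and a volitional gate switches to planned read-out once a chunk's reinforced value is high enough. All names, thresholds, and the reward stand-in are assumptions; the model's nonlinear differential equations are not reproduced:

```python
# Toy, hypothetical illustration only -- not the SOVEREIGN model itself.
import random

def explore_step():
    """Reactive exploration: pick an orienting or approach movement at random."""
    return random.choice(["approach", "orient_left", "orient_right"])

def run_trials(n_trials=50, gate=0.6, lr=0.2):
    chunk_value = {}                 # reinforced value of each movement sequence
    for _ in range(n_trials):
        sequence, rewarded = [], False
        for _ in range(5):           # one short navigation episode
            best = max(chunk_value, key=chunk_value.get, default=None)
            if best is not None and chunk_value[best] > gate:
                sequence = list(best)          # planned: read out a learned chunk
                rewarded = True                # assume the plan reaches the goal
                break
            move = explore_step()              # reactive behavior
            sequence.append(move)
            rewarded = random.random() < 0.2   # stand-in for reaching a reward
            if rewarded:
                break
        if rewarded:                           # reinforcement learning of chunks
            key = tuple(sequence)
            old = chunk_value.get(key, 0.0)
            chunk_value[key] = old + lr * (1.0 - old)
    return chunk_value

values = run_trials()
```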
Abstract:
Lewis proposes a "reconceptualization" (p. 1) of how to link the psychology and neurobiology of emotion and cognitive-emotional interactions. His main proposed themes have actually been actively and quantitatively developed in the neural modeling literature for over thirty years. This commentary summarizes some of these themes and points to areas of particularly active current research.
Abstract:
BACKGROUND: Coronary artery bypass grafting (CABG) is often used to treat patients with significant coronary heart disease (CHD). To date, multiple longitudinal and cross-sectional studies have examined the association between depression and CABG outcomes. Although this relationship is well established, the mechanism underlying it remains unclear. The purpose of this study was twofold. First, we compared three markers of autonomic nervous system (ANS) function in four groups of patients: 1) patients with coronary heart disease and depression (CHD/Dep), 2) patients without CHD but with depression (NonCHD/Dep), 3) patients with CHD but without depression (CHD/NonDep), and 4) patients without CHD or depression (NonCHD/NonDep). Second, we investigated the impact of depression and ANS activity on CABG outcomes. METHODS: Patients were screened to determine whether they met the study's inclusion and exclusion criteria. Indicators of ANS function (i.e., heart rate, heart rate variability, and plasma norepinephrine levels) were measured. Chi-square tests and one-way analysis of variance were performed to evaluate group differences across demographic variables, medical variables, and indicators of ANS function. Logistic regression and multiple regression analyses were used to assess the impact of depression and ANS activity on CABG outcomes. RESULTS: The results provide some support for the suggestion that depressed patients with CHD have greater ANS dysregulation than those with only CHD or only depression. Furthermore, independent predictors of in-hospital length of stay and non-routine discharge included having a diagnosis of depression and CHD, elevated heart rate, and low heart rate variability. CONCLUSIONS: The current study presents evidence to support the hypothesis that ANS dysregulation might be one of the underlying mechanisms linking depression to CABG surgery outcomes. Thus, future studies should focus on developing and testing interventions that target ANS dysregulation, which may lead to improved patient outcomes.
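A hedged sketch of the analysis steps described above (chi-square and one-way ANOVA for group differences, logistic and multiple regression for CABG outcomes), written against a hypothetical data file with assumed column names; no study data are reproduced:

```python
# Hypothetical analysis pipeline; file name and column names are assumptions.
import pandas as pd
from scipy import stats
import statsmodels.formula.api as smf

df = pd.read_csv("cabg_cohort.csv")   # hypothetical cohort file

# Group differences: chi-square for a categorical variable,
# one-way ANOVA for an ANS indicator (here: heart rate variability)
chi2, p_sex, dof, _ = stats.chi2_contingency(pd.crosstab(df["group"], df["sex"]))
groups = [g["hrv"].dropna() for _, g in df.groupby("group")]
f_stat, p_hrv = stats.f_oneway(*groups)

# Logistic regression: non-routine discharge ~ depression, CHD, HR, HRV
logit = smf.logit("nonroutine_dc ~ depressed + chd + hr + hrv", data=df).fit()

# Multiple regression: in-hospital length of stay on the same predictors
ols = smf.ols("los_days ~ depressed + chd + hr + hrv", data=df).fit()
print(logit.summary(), ols.summary(), sep="\n")
```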
Abstract:
Emotional and attentional functions are known to be distributed along ventral and dorsal networks in the brain, respectively. However, the interactions between these systems remain to be specified. The present study used event-related functional magnetic resonance imaging (fMRI) to investigate how attentional focus can modulate the neural activity elicited by scenes that vary in emotional content. In a visual oddball task, aversive and neutral scenes were presented intermittently among circles and squares. The squares were frequent standard events, whereas the other novel stimulus categories occurred rarely. One experimental group [N=10] was instructed to count the circles, whereas another group [N=12] counted the emotional scenes. A main effect of emotion was found in the amygdala (AMG) and ventral frontotemporal cortices. In these regions, activation was significantly greater for emotional than neutral stimuli but was invariant to attentional focus. A main effect of attentional focus was found in dorsal frontoparietal cortices, whose activity signaled task-relevant target events irrespective of emotional content. The only brain region that was sensitive to both emotion and attentional focus was the anterior cingulate gyrus (ACG). When circles were task-relevant, the ACG responded equally to circle targets and distracting emotional scenes. The ACG response to emotional scenes increased when they were task-relevant, and the response to circles concomitantly decreased. These findings support and extend prominent network theories of emotion-attention interactions that highlight the integrative role played by the anterior cingulate.
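As a small illustration of the oddball design described above, a hypothetical generator for a trial sequence of frequent square standards with rare circles, aversive scenes, and neutral scenes; the proportions are assumptions, not the study's actual design parameters:

```python
# Hypothetical trial-sequence generator for a visual oddball paradigm.
import random

def oddball_sequence(n_trials=200, rare_p=0.05, seed=0):
    rng = random.Random(seed)
    rare = ["circle", "aversive_scene", "neutral_scene"]
    trials = []
    for _ in range(n_trials):
        if rng.random() < 3 * rare_p:
            trials.append(rng.choice(rare))    # rare novel event
        else:
            trials.append("square")            # frequent standard
    return trials

sequence = oddball_sequence()
```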