825 results for EMOTION PERCEPTION
Abstract:
Previously, studies investigating emotional face perception - regardless of whether they involved adults or children - presented participants with static photos of faces in isolation. In the natural world, faces are rarely encountered in isolation. In the few studies that have presented faces in context, the perception of emotional facial expressions is altered when paired with an incongruent context. For both adults and 8-year-old children, reaction times increase and accuracy decreases when facial expressions are presented in an incongruent context depicting a similar emotion (e.g., sad face on a fear body) compared to when presented in a congruent context (e.g., sad face on a sad body; Meeren, van Heijnsbergen, & de Gelder, 2005; Mondloch, 2012). This effect is called a congruency effect and does not exist for dissimilar emotions (e.g., happy and sad; Mondloch, 2012). Two models characterize similarity between emotional expressions differently; the emotional seed model bases similarity on physical features, whereas the dimensional model bases similarity on the underlying dimensions of valence and arousal. Study 1 investigated the emergence of an adult-like pattern of congruency effects in preschool-aged children. Using a child-friendly sorting task, we identified the youngest age at which children could accurately sort isolated facial expressions and body postures, and then measured whether an incongruent context disrupted the perception of emotional facial expressions. Six-year-old children showed congruency effects for sad/fear, but 4-year-old children did not for sad/happy. This pattern of congruency effects is consistent with both models and indicates that an adult-like pattern exists at the youngest age at which children can reliably sort emotional expressions in isolation. In Study 2, we compared the two models to determine their predictive abilities. The two models make different predictions about the size of congruency effects for three emotions: sad, anger, and fear. The emotional seed model predicts larger congruency effects when sad is paired with either anger or fear compared to when anger and fear are paired with each other. The dimensional model predicts larger congruency effects when anger and fear are paired together compared to when either is paired with sad. In both a speeded and an unspeeded task, the results failed to support either model, but the pattern of results indicated that fearful bodies have a special effect. Fearful bodies reduced accuracy, increased reaction times more than any other posture, and shifted the pattern of errors. To determine whether the results were specific to bodies, we ran the reverse task to determine whether faces could disrupt the perception of body postures. This experiment did not produce congruency effects, meaning faces do not influence the perception of body postures. In the final experiment, participants performed a flanker task to determine whether the effect of fearful bodies was specific to faces or whether fearful bodies would also produce a larger effect in an unrelated task in which faces were absent. Reaction times did not differ across trials, meaning the large effect of fearful bodies is specific to situations involving faces. Collectively, these studies provide novel insights, both developmentally and theoretically, into how emotional faces are perceived in context.
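The congruency effect described above is, at bottom, a difference score: responses to a facial expression are slower and less accurate when the body context depicts a conflicting, similar emotion than when it matches. A minimal sketch of how such an effect could be computed from trial-level data is given below; the trial values, condition labels, and function names are hypothetical and not taken from the studies cited.

```python
# Hedged sketch: computing a face-body congruency effect from
# hypothetical trial-level data (condition, reaction time in ms, correct?).
import statistics

trials = [
    ("congruent", 642, True), ("congruent", 670, True),
    ("incongruent", 731, True), ("incongruent", 760, False),
]

def mean_rt(condition):
    # Mean reaction time over correct trials only (a common convention).
    return statistics.mean(rt for cond, rt, ok in trials if cond == condition and ok)

def accuracy(condition):
    outcomes = [ok for cond, _, ok in trials if cond == condition]
    return sum(outcomes) / len(outcomes)

# Congruency effect: slower and less accurate responding when the body
# posture conflicts with the facial expression.
rt_effect = mean_rt("incongruent") - mean_rt("congruent")
acc_effect = accuracy("congruent") - accuracy("incongruent")
print(f"RT congruency effect: {rt_effect:.0f} ms")
print(f"Accuracy congruency effect: {acc_effect:.2f}")
```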
Abstract:
This paper studies the ability of pre-kindergarten students with normal hearing and those with impaired hearing to identify emotions in speech through audition only. In addition, the study assesses whether a listener's familiarity with a speaker's voice has an effect on his/her ability to identify the emotion of the speaker.
Abstract:
The primary goal of this project is to study the ability of adult cochlear implant users to perceive emotion through speech alone. A secondary goal of this project is to study the development of emotion perception in normal hearing children to serve as a baseline for comparing emotion perception abilities in similarly-aged children with impaired hearing.
Abstract:
The primary goal of this study is to examine the ability of pediatric hearing-aid listeners with mild to moderately severe hearing loss to perceive emotion and to discriminate talkers. These listeners’ performance is compared to that of similarly-aged listeners with normal hearing and to that of similarly-aged listeners who use cochlear implants.
Abstract:
Most previous neurophysiological studies evoked emotions by presenting visual stimuli. Models of the emotion circuits in the brain have for the most part ignored emotions arising from musical stimuli. To our knowledge, this is the first emotion brain study to examine the influence of visual and musical stimuli on brain processing. Highly arousing pictures from the International Affective Picture System and classical musical excerpts were chosen to evoke the three basic emotions of happiness, sadness and fear. The emotional stimulus modalities were presented for 70 s either alone or combined (congruent) in a counterbalanced and random order. Electroencephalogram (EEG) alpha power density, which is inversely related to neural electrical activity, was recorded from 30 scalp electrodes in 24 right-handed healthy female subjects. In addition, heart rate (HR), skin conductance responses (SCR), respiration, temperature and psychometrical ratings were collected. Results showed that the experienced quality of the presented emotions was most accurate in the combined conditions, intermediate in the picture conditions and lowest in the sound conditions. Furthermore, both the psychometrical ratings and the physiological involvement measurements (SCR, HR, respiration) were significantly increased in the combined and sound conditions compared to the picture conditions. Finally, repeated measures ANOVA revealed the largest alpha power density for the sound conditions, intermediate for the picture conditions, and lowest for the combined conditions, indicating the strongest activation in the combined conditions in a distributed emotion and arousal network comprising frontal, temporal, parietal and occipital neural structures. Summing up, these findings demonstrate that music can markedly enhance the emotional experience evoked by affective pictures.
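The central dependent measure here, alpha power density, is a spectral quantity: power in roughly the 8-13 Hz band, which falls as cortical activation rises. A minimal sketch of how such a value could be estimated for one channel and one 70-s epoch is shown below; the sampling rate, synthetic signal, and band limits are illustrative assumptions, not the authors' actual recording or analysis pipeline.

```python
# Hedged sketch: alpha-band (8-13 Hz) power density for one EEG channel
# via Welch's method, using a synthetic signal.
import numpy as np
from scipy.signal import welch

fs = 250.0                     # assumed sampling rate (Hz)
t = np.arange(0, 70, 1 / fs)   # one 70-s stimulus epoch
# Synthetic channel: a 10 Hz alpha rhythm plus broadband noise.
eeg = 20e-6 * np.sin(2 * np.pi * 10 * t) + 5e-6 * np.random.randn(t.size)

# Power spectral density via Welch's method (2-s segments).
freqs, psd = welch(eeg, fs=fs, nperseg=int(2 * fs))

# Mean PSD within the alpha band; lower values are read as stronger
# cortical activation, since alpha power is inversely related to it.
alpha_band = (freqs >= 8) & (freqs <= 13)
alpha_power = psd[alpha_band].mean()
print(f"Alpha power density: {alpha_power:.3e} V^2/Hz")
```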
Abstract:
We examined whether it is possible to identify the emotional content of behaviour from point-light displays where pairs of actors are engaged in interpersonal communication. These actors displayed a series of emotions, which included sadness, anger, joy, disgust, fear, and romantic love. In experiment 1, subjects viewed brief clips of these point-light displays presented the right way up and upside down. In experiment 2, the importance of the interaction between the two figures in the recognition of emotion was examined. Subjects were shown upright versions of (i) the original pairs (dyads), (ii) a single actor (monad), and (iii) a dyad comprising a single actor and his/her mirror image (reflected dyad). In each experiment, the subjects rated the emotional content of the displays by moving a slider along a horizontal scale. All of the emotions received a rating for every clip. In experiment 1, when the displays were upright, the correct emotions were identified in each case except disgust; but when the displays were inverted, performance was significantly diminished for some emotions. In experiment 2, the recognition of love and joy was impaired by the absence of the acting partner, and the recognition of sadness, joy, and fear was impaired in the non-veridical (mirror image) displays. These findings both support and extend previous research by showing that biological motion is sufficient for the perception of emotion, although inversion affects performance. Moreover, emotion perception from biological motion can be affected by the veridical or non-veridical social context within the displays.
Abstract:
Emotion regulation is crucial for successfully engaging in social interactions. Yet, little is known about the neural mechanisms controlling behavioral responses to emotional expressions perceived in the face of other people, which constitute a key element of interpersonal communication. Here, we investigated brain systems involved in social emotion perception and regulation, using functional magnetic resonance imaging (fMRI) in 20 healthy participants. The latter saw dynamic facial expressions of either happiness or sadness, and were asked to either imitate the expression or to suppress any expression on their own face (in addition to a gender judgment control task). fMRI results revealed higher activity in regions associated with emotion (e.g., the insula), motor function (e.g., motor cortex), and theory of mind (e.g., [pre]cuneus) during imitation. Activity in dorsal cingulate cortex was also increased during imitation, possibly reflecting greater action monitoring or conflict with own feeling states. In addition, premotor regions were more strongly activated during both imitation and suppression, suggesting a recruitment of motor control for both the production and inhibition of emotion expressions. Expressive suppression (eSUP) produced increases in dorsolateral and lateral prefrontal cortex typically related to cognitive control. These results suggest that voluntary imitation and eSUP modulate brain responses to emotional signals perceived from faces, by up- and down-regulating activity in distributed subcortical and cortical networks that are particularly involved in emotion, action monitoring, and cognitive control.
Abstract:
Emotion is generally argued to be an influence on the behavior of life systems, largely concerning flexibility and adaptivity. The way in which life systems act in response to particular situations in the environment has revealed the decisive and crucial importance of this feature in the success of behaviors, and this source of inspiration has influenced the way we think about artificial systems. During the last decades, artificial systems have undergone such an evolution that each day more of them are integrated into our daily life. They have grown in complexity, and the consequent effect is an increased demand for systems that ensure resilience, robustness, availability, security or safety, among others. All of these are questions that raise fundamental challenges in control design. This thesis has been developed within the framework of the Autonomous System project, a.k.a. the ASys-Project. Short-term objectives of immediate application focus on designing improved systems and bringing intelligence into control strategies. Beyond this, the long-term objectives underlying the ASys-Project concentrate on higher-order capabilities such as cognition, awareness and autonomy. This thesis is placed within the general fields of engineering and emotion science, and provides a theoretical foundation for engineering and designing computational emotion for artificial systems. The starting question that grounds this thesis addresses the problem of emotion-based autonomy, and how to feed systems back with valuable meaning forms the general objective. Both the starting question and the general objective have underpinned the study of emotion, its influence on system behavior, the key foundations that justify this feature in life systems, how emotion is integrated within normal operation, and how this entire problem of emotion can be explained in artificial systems. By assuming essential differences concerning structure, purpose and operation between life and artificial systems, the essential motivation has been to explore what emotion solves in nature and then to analyze analogies for man-made systems. This work provides a reference model in which a collection of entities, relationships, models, functions and informational artifacts interact to provide the system with non-explicit knowledge in the form of emotion-like relevances. This solution aims to provide a reference model under which to design solutions for emotional operation, tied to the real needs of artificial systems. The proposal consists of a multi-purpose architecture that implements two broad modules in order to address: (a) the range of processes related to environmental affectation, and (b) the range of processes related to emotion-like perception and the higher levels of reasoning. This has required an intense and critical analysis, beyond the state of the art, of the most relevant theories of emotion and technical systems, in order to obtain the support required for the foundations that sustain each model. The problem has been interpreted and is described on the basis of AGSys, an agent assumed to have the minimum rationality needed to perform emotional assessment. AGSys is a conceptualization of a Model-based Cognitive agent that embodies an inner agent, ESys, which is responsible for performing the emotional operation inside AGSys.
The solution consists of multiple computational modules working in federation, aimed at forming a mutual feedback loop between AGSys and ESys. Within this solution, the environment and the effects that might influence the system are described as different problems. While AGSys operates as a common system within the external environment, ESys is designed to operate within a conceptualized inner environment. This inner environment is built on the basis of the relevances that might arise inside AGSys in its interaction with the external environment. This allows for high-quality, separate reasoning concerning the mission goals defined in AGSys and the emotional goals defined in ESys. In this way, a possible path is provided for high-level reasoning under the influence of goal congruence. The high-level reasoning model uses knowledge about the stability of the emotional goals, opening new directions in which mission goals might be assessed under the situational state of this stability. This high-level reasoning is grounded in the work of MEP, a model of emotion perception conceived as an analogy of a well-known theory in emotion science. The operation of this model is described in terms of a recursive process labeled the R-Loop, together with a system of emotional goals that are treated as individual agents. In this way, AGSys integrates knowledge concerning the relation between a perceived object and the effect that this perception induces on the situational state of the emotional goals. This knowledge enables a higher-order system of information that provides the support for high-level reasoning. The extent to which this reasoning might be taken further is only delineated here and left as future work. This thesis has drawn on a broad range of fields of knowledge, which can be structured around two main objectives: (a) the fields of psychology, cognitive science, neurology and the biological sciences, in order to obtain an understanding of the problem of emotional phenomena, and (b) a large number of computer science branches, such as Autonomic Computing (AC), self-adaptive software, Self-X systems, Model Integrated Computing (MIC) or the models@runtime paradigm, among others, in order to obtain knowledge about tools for designing each part of the solution. The final approach has been built mainly on the basis of this acquired knowledge and is described within the fields of Artificial Intelligence and Model-Based Systems (MBS), with additional mathematical formalizations to provide precise understanding where required. This approach describes a reference model for feeding systems back with valuable meaning, allowing for reasoning with regard to (a) the relationship between the environment and the relevance of its effects on the system, and (b) dynamic evaluations concerning the inner situational state of the system as a result of those effects. This reasoning provides a framework of distinguishable states of AGSys, derived from its own circumstances, that can be regarded as artificial emotion.
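To make the AGSys/ESys coupling more tangible, the sketch below mocks up the mutual feedback loop in a few lines of Python: the outer agent maps observations onto inner relevances, the inner agent appraises them against its emotional goals, and the resulting emotion-like stability value conditions the next mission-level decision. The class names echo the abstract, but every rule in the code is a hypothetical illustration rather than the thesis's actual reference model.

```python
# Hedged sketch of the AGSys/ESys feedback loop; all appraisal and
# decision rules are toy assumptions for illustration only.
from dataclasses import dataclass, field

@dataclass
class ESys:
    """Inner agent: appraises relevances against emotional goals."""
    emotional_goals: dict = field(default_factory=lambda: {"integrity": 1.0})

    def appraise(self, relevances: dict) -> float:
        # Emotion-like signal: how stable the emotional goals remain
        # given the perceived relevances (toy rule).
        threat = relevances.get("damage", 0.0)
        return max(0.0, self.emotional_goals["integrity"] - threat)

@dataclass
class AGSys:
    """Outer model-based agent pursuing mission goals."""
    esys: ESys = field(default_factory=ESys)

    def perceive(self, observation: dict) -> dict:
        # Map external observations onto inner relevances (toy mapping).
        return {"damage": observation.get("obstacle_impact", 0.0)}

    def act(self, observation: dict) -> str:
        # Mission goals are reassessed under the emotional-goal stability.
        stability = self.esys.appraise(self.perceive(observation))
        return "continue mission" if stability > 0.5 else "switch to safe mode"

agent = AGSys()
print(agent.act({"obstacle_impact": 0.1}))  # -> continue mission
print(agent.act({"obstacle_impact": 0.8}))  # -> switch to safe mode
```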
Abstract:
The mechanisms underlying the effects of antidepressant treatment in patients with Parkinson's disease (PD) are unclear. Neural changes after successful therapy, investigated by neuroimaging methods, can give insights into the mechanisms of action related to a specific treatment choice. To study the mechanisms of neural modulation of repetitive transcranial magnetic stimulation (rTMS) and fluoxetine, 21 depressed PD patients were randomized into one of only two active treatment groups for 4 wk: active rTMS over the left dorsolateral prefrontal cortex (DLPFC) (5 Hz rTMS; 120% motor threshold) with placebo pill, or sham rTMS with fluoxetine 20 mg/d. Event-related functional magnetic resonance imaging (fMRI) with emotional stimuli was performed before and after treatment, in two sessions (test and re-test) at each time-point. The two treatment groups showed a significant, similar mood improvement. After rTMS treatment, there were brain activity decreases in the left fusiform gyrus, cerebellum and right DLPFC and brain activity increases in the left DLPFC and anterior cingulate gyrus compared to baseline. In contrast, after fluoxetine treatment, there were brain activity increases in the right premotor and right medial prefrontal cortex. There was a significant group-by-time interaction effect in the left medial prefrontal cortex, suggesting that the activity in this area changed differently in the two treatment groups. Our findings show that the antidepressant effects of rTMS and fluoxetine in PD are associated with changes in different areas of the depression-related neural network.
Abstract:
Previous magnetic resonance imaging (MRI) studies described consistent age-related gray matter (GM) reductions in the fronto-parietal neocortex, insula and cerebellum in elderly subjects, but not as frequently in limbic/paralimbic structures. However, it is unclear whether such features are already present during earlier stages of adulthood, and whether age-related GM changes follow non-linear patterns in this age range. This voxel-based morphometry study investigated the relationship between GM volumes and age specifically during non-elderly life (18-50 years) in 89 healthy individuals (48 males and 41 females). Voxelwise analyses showed significant (p < 0.05, corrected) negative correlations in the right prefrontal cortex and left cerebellum, and positive correlations (indicating lack of GM loss) in the medial temporal region, cingulate gyrus, insula and temporal neocortex. Analyses using ROI masks showed that age-related dorsolateral prefrontal volume decrements followed non-linear patterns and were less prominent in females than in males in this age range. These findings further support the notion of a heterogeneous and asynchronous pattern of age-related brain morphometric changes, with region-specific non-linear features.
Abstract:
Objectives: Many morphometric magnetic resonance imaging (MRI) studies that have investigated the presence of gray matter (GM) volume abnormalities associated with the diagnosis of bipolar disorder (BD) have reported conflicting findings. None of these studies has compared patients with recent-onset psychotic BD with asymptomatic controls selected from exactly the same environment using epidemiological methods, or has directly contrasted BD patients against subjects with first-onset psychotic major depressive disorder (MDD). We examined structural brain differences between (i) BD (type I) subjects and MDD subjects with psychotic features in their first contact with the healthcare system in Brazil, and (ii) these two mood disorder groups relative to a sample of geographically matched asymptomatic controls. Methods: A total of 26 BD subjects, 20 subjects with MDD, and 94 healthy controls were examined using either of two identical MRI scanners and acquisition protocols. Diagnoses were based on DSM-IV criteria and confirmed one year after brain scanning. Image processing was conducted using voxel-based morphometry. Results: The BD group showed increased volume of the right dorsal anterior cingulate cortex relative to controls, while the MDD subjects exhibited bilateral foci of GM deficits in the dorsolateral prefrontal cortex (p < 0.05, corrected for multiple comparisons). Direct comparison between BD and MDD patients showed a focus of GM reduction in the right-sided dorsolateral prefrontal cortex (p < 0.05, corrected for multiple comparisons) and a trend (p < 0.10, corrected) toward left-sided GM deficits in the dorsolateral prefrontal cortex of MDD patients. When analyses were repeated with scanner site as a confounding covariate, the finding of increased right anterior cingulate volumes in BD patients relative to controls remained statistically significant (p = 0.01, corrected for multiple comparisons). Conclusions: These findings reinforce the view that there are important pathophysiological distinctions between BD and MDD, and indicate that subtle dorsal anterior cingulate abnormalities may be relevant to the pathophysiology of BD.
Abstract:
Neuroimaging studies in bipolar disorder report gray matter volume (GMV) abnormalities in neural regions implicated in emotion regulation. This includes a reduction in ventral/orbital medial prefrontal cortex (OMPFC) GMV and, inconsistently, increases in amygdala GMV. We aimed to examine OMPFC and amygdala GMV in bipolar disorder type 1 patients (BPI) versus healthy control participants (HC), and the potential confounding effects of gender, clinical and illness history variables, and psychotropic medication upon any group differences demonstrated in OMPFC and amygdala GMV. Images were acquired from 27 BPI (17 euthymic, 10 depressed) and 28 age- and gender-matched HC in a 3T Siemens scanner. Data were analyzed with SPM5 using voxel-based morphometry (VBM) to assess main effects of diagnostic group and gender upon whole-brain (WB) GMV. Post-hoc analyses were subsequently performed using SPSS to examine the extent to which clinical and illness history variables and psychotropic medication contributed to GMV abnormalities in BPI in a priori and non-a priori regions, as demonstrated by the above VBM analyses. BPI showed reduced GMV in the bilateral posteromedial rectal gyrus (PMRG), but no abnormalities in amygdala GMV. BPI also showed reduced GMV in two non-a priori regions: left parahippocampal gyrus and left putamen. For left PMRG GMV, there was a significant group by gender by trait anxiety interaction. GMV was significantly reduced in male low-trait-anxiety BPI versus male low-trait-anxiety HC, and in high- versus low-trait-anxiety male BPI. Our results show that in BPI there were significant effects of gender and trait anxiety, with male BPI and those high in trait anxiety showing reduced left PMRG GMV. The PMRG is part of the medial prefrontal network implicated in visceromotor and emotion regulation.
Abstract:
We set out to study situations of interpersonal and socio-moral conflict and to compare the resolution strategies chosen by adolescents from different cultural contexts. The study involved 89 adolescents, 44 Portuguese and 45 Angolan, aged between 14 and 16 years, distributed equally across the two genders. Four dilemmas were constructed depicting interpersonal and moral conflict situations common among adolescents (humiliation, envy, rivalry between groups, and romantic betrayal), and four measures were used: attribution of behaviour and of emotion, perception of gains, and anticipation of costs. The results revealed that the choice of interpersonal conflict resolution strategies (physical aggression, verbal aggression, indifference, negotiation, and others) is insensitive to gender, but varies across measures in the envy and betrayal stories, varies with the socio-moral content of the situations, especially between the romantic betrayal story and the other stories, and also varies across cultures for the betrayal story, where statistical differences were found in the three measures. The results also show that adolescents have difficulty anticipating the costs associated with interpersonal conflict resolution strategies, since most of them chose not to answer this question.
Abstract:
This commentary raises general questions about the parsimony and generalizability of the SIMS model, before interrogating the specific roles that the amygdala and eye contact play in it. Additionally, it situates the SIMS model alongside another model of facial expression processing, with a view to incorporating individual differences in emotion perception.
Abstract:
The male bias in autism spectrum conditions (ASC) has led to females with ASC being under-researched. This lack of attention to females could hide variability due to sex that may explain some of the heterogeneity within ASC. In this study we investigate four key cognitive domains (mentalizing and emotion perception, executive function, perceptual attention to detail, and motor function) in ASC, to test for similarities and differences between males and females with and without ASC (n = 128 adults; n = 32 per group). In the mentalizing and facial emotion perception domain, males and females with ASC showed similar deficits compared to neurotypical controls. However, in attention to detail and dexterity involving executive function, although males with ASC showed poorer performance relative to neurotypical males, females with ASC performed comparably to neurotypical females. We conclude that performance in the social-cognitive domain is equally impaired in male and female adults with ASC. However, in specific non-social cognitive domains, performance within ASC depends on sex. This suggests that in specific domains, cognitive profiles in ASC are modulated by sex.