968 results for Facial expression


Relevance:

100.00%

Abstract:

The present topical review deals with the motor control of facial expressions in humans. Facial expressions are a central part of human communication. Emotional facial expressions have a crucial role in human non-verbal behavior, allowing a rapid transfer of information between individuals. Facial expressions can be controlled both voluntarily and emotionally. Recent studies in non-human primates and humans revealed that the motor control of facial expressions has a distributed neural representation. At least five cortical regions on the medial and lateral aspects of each hemisphere are involved: the primary motor cortex, the ventral lateral premotor cortex, the supplementary motor area on the medial wall, and the rostral and caudal cingulate cortex. The results of studies in humans and non-human primates suggest that the innervation of the face is bilaterally controlled for the upper part and mainly contralaterally controlled for the lower part. Furthermore, the primary motor cortex, the ventral lateral premotor cortex, and the supplementary motor area are essential for the voluntary control of facial expressions. In contrast, the cingulate cortical areas are important for emotional expression, since they receive input from different structures of the limbic system.

Relevance:

100.00%

Abstract:

Anxiety and fear are often confounded in discussions of human emotions. However, studies of rodent defensive reactions under naturalistic conditions suggest anxiety is functionally distinct from fear. Unambiguous threats, such as predators, elicit flight from rodents (if an escape route is available), whereas ambiguous threats (e.g., the odor of a predator) elicit risk assessment behavior, which is associated with anxiety as it is preferentially modulated by anti-anxiety drugs. However, without human evidence, it would be premature to assume that rodent-based psychological models are valid for humans. We tested the human validity of the risk assessment explanation for anxiety by presenting 8 volunteers with emotive scenarios and asking them to pose facial expressions. Photographs and videos of these expressions were shown to 40 participants who matched them to the scenarios and labeled each expression. Scenarios describing ambiguous threats were preferentially matched to the facial expression posed in response to the same scenario type. This expression consisted of two plausible environmental-scanning behaviors (eye darts and head swivels) and was labeled as anxiety, not fear. The facial expression elicited by unambiguous threat scenarios was labeled as fear. The emotion labels generated were then presented to another 18 participants who matched them back to photographs of the facial expressions. This back-matching of labels to faces also linked anxiety to the environmental-scanning face rather than the fear face. Results therefore suggest that anxiety produces a distinct facial expression and that it has adaptive value in ambiguously threatening situations, supporting a functional, risk-assessing explanation for human anxiety.

Relevance:

80.00%

Abstract:

Age-related changes in the facial expression of pain during the first 18 months of life have important implications for our understanding of pain and pain assessment. We examined facial reactions video recorded during routine immunization injections in 75 infants stratified into 2-, 4-, 6-, 12-, and 18-month age groups. Two facial coding systems differing in the amount of detail extracted were applied to the records. In addition, parents completed a brief questionnaire that assessed child temperament and provided background information. Parents' efforts to soothe the children also were described. While there were consistencies in facial displays over the age groups, there also were differences on both measures of facial activity, indicating systematic variation in the nature and severity of distress. The least pain was expressed by the 4-month age group. Temperament was not related to the degree of pain expressed. Systematic variations in parental soothing behaviour indicated accommodation to the age of the child. Reasons for the differing patterns of facial activity are examined, with attention paid to the development of inhibitory mechanisms and the role of negative emotions such as anger and anxiety.

Relevance:

80.00%

Abstract:

Pain expression in neonates instigated by heel-lance for blood sampling purposes was systematically described using measures of facial expression and cry and compared across sleep/waking states and sex. From gate-control theory it was hypothesized that pain behavior would vary with the ongoing functional state of the infant, rather than solely reflecting tissue insult. Awake-alert but inactive infants responded with the most facial activity, consistent with current views that infants in this state are most receptive to environmental stimulation. Infants in quiet sleep showed the least facial reaction and the longest latency to cry. Fundamental frequency of cry was not related to sleep/waking state. This suggested that findings from the cry literature on qualities of pain cry as a reflection of nervous system 'stress', in unwell newborns, do not generalize directly to healthy infants as a function of state. Sex differences were apparent in speed of response, with boys showing shorter time to cry and to display facial action following heel-lance. The findings of facial action variation across sleep/waking state were interpreted as indicating that the biological and behavioral context of pain events affects behavioral expression, even at the earliest time developmentally, before the opportunity for learned response patterns occurs. Issues raised by the study include the importance of using measurement techniques which are independent of preconceived categories of affective response.

Relevance:

80.00%

Abstract:

A large variety of social signals, such as facial expression and body language, are conveyed in everyday interactions, and an accurate perception and interpretation of these social cues is necessary for reciprocal social interactions to take place successfully and efficiently. The present study was conducted to determine whether impairments in social functioning that are commonly observed following a closed head injury (CHI) could be at least partially attributable to disruption in the ability to appreciate social cues. More specifically, an attempt was made to determine whether face processing deficits following a CHI coincide with changes in electrophysiological responsivity to the presentation of facial stimuli. A number of event-related potentials (ERPs) that have been linked specifically to various aspects of visual processing were examined. These included the N170, an index of structural encoding ability; the N400, an index of the ability to detect differences in serially presented stimuli; and the Late Positivity (LP), an index of sensitivity to affective content in visually presented stimuli. Electrophysiological responses were recorded while participants with and without a closed head injury were presented with pairs of faces delivered in a rapid sequence and asked to compare them on the basis of whether they matched with respect to identity or emotion. Other behavioural measures of identity and emotion recognition were also employed, along with a small battery of standard neuropsychological tests used to determine general levels of cognitive impairment. Participants in the CHI group were impaired in a number of cognitive domains that are commonly affected following a brain injury. These impairments included reduced efficiency in various aspects of encoding verbal information into memory, a generally slower rate of information processing, decreased sensitivity to smell, and greater difficulty in the regulation of emotion together with limited awareness of this impairment. Impairments in face and emotion processing were clearly evident in the CHI group. However, despite these impairments in face processing, there were no significant differences between groups in the electrophysiological components examined. The only exception was a trend indicating delayed N170 peak latencies in the CHI group (p = .09), which may reflect inefficient structural encoding processes. In addition, group differences were noted in the region of the N100, thought to reflect very early selective attention. It is possible, then, that facial expression and identity processing deficits following CHI are secondary to (or exacerbated by) an underlying disruption of very early attentional processes. Alternatively, the difficulty may arise in the later cognitive stages involved in the interpretation of the relevant visual information. However, the present data do not allow these alternatives to be distinguished. Nonetheless, it was clearly evident that individuals with CHI are more likely than controls to make face processing errors, particularly for the more difficult-to-discriminate negative emotions. Those working with individuals who have sustained a head injury should be alerted to this potential source of social monitoring difficulties, which is often observed as part of the sequelae of a CHI.

Relevance:

70.00%

Abstract:

Facial expression is one of the main issues in face recognition in uncontrolled environments. In this paper, we apply the probabilistic linear discriminant analysis (PLDA) method to recognize faces across expressions. Several PLDA approaches are tested and cross-evaluated on the Cohn-Kanade and JAFFE databases. With fewer samples per gallery subject, high recognition rates comparable to previous work have been achieved, indicating the robustness of the approaches. Among the approaches, the mixture of PLDAs demonstrated better performance. The experimental results also indicate that the facial regions around the cheeks, eyes, and eyebrows are more discriminative than the regions around the mouth, jaw, chin, and nose.
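For readers unfamiliar with PLDA, the sketch below shows the standard two-covariance formulation used for verification-style scoring: between-class and within-class covariances are estimated from labeled training features, and a pair of vectors is scored by a log-likelihood ratio of "same latent identity" versus "different identities". This is a minimal illustration, not the paper's specific PLDA variants or its mixture model, and the feature extraction step is assumed to have happened elsewhere.

```python
import numpy as np
from scipy.stats import multivariate_normal

def fit_two_cov_plda(X, labels):
    """Estimate global mean, between-class (B) and within-class (W) covariances
    from labeled training features (one row per sample, >= 2 samples per class)."""
    mu = X.mean(axis=0)
    classes = np.unique(labels)
    means = np.array([X[labels == c].mean(axis=0) for c in classes])
    B = np.cov(means.T, bias=True)  # scatter of class means (identity variability)
    W = np.mean([np.cov(X[labels == c].T, bias=True) for c in classes], axis=0)
    return mu, B, W

def plda_score(x1, x2, mu, B, W):
    """Log-likelihood ratio: 'same latent identity' vs. 'different identities'."""
    d = len(mu)
    x = np.concatenate([x1 - mu, x2 - mu])
    T = B + W
    cov_same = np.block([[T, B], [B, T]])  # shared identity couples the pair
    cov_diff = np.block([[T, np.zeros((d, d))], [np.zeros((d, d)), T]])
    zero = np.zeros(2 * d)
    return (multivariate_normal.logpdf(x, zero, cov_same, allow_singular=True)
            - multivariate_normal.logpdf(x, zero, cov_diff, allow_singular=True))
```

A positive score favors the same-identity hypothesis; in the cross-expression setting described above, x1 would be a gallery feature and x2 a probe feature extracted from an image with a different expression.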

Relevance:

70.00%

Abstract:

Techniques to improve the automated analysis of natural and spontaneous facial expressions have been developed. The outcome of the research has applications in several fields, including national security (e.g., expression-invariant face recognition), education (e.g., affect-aware interfaces), and mental and physical health (e.g., depression and pain recognition).

Relevance:

70.00%

Abstract:

The proliferation of news reports published on news websites and the sharing of news among social media users necessitate effective techniques for analysing the image, text, and video data related to news topics. This paper presents the first study to classify affective facial images on emerging news topics. The proposed system dynamically monitors and selects the current hot (i.e., of great interest) news topics with strong affective interestingness, using textual keywords in news articles and social media discussions. Images from the selected hot topics are extracted and classified into three emotion categories, positive, neutral, and negative, based on the facial expressions of subjects in the images. Performance evaluations on two facial image datasets collected from real-world sources demonstrate the applicability and effectiveness of the proposed system in the affective classification of facial images in news reports. Facial expression shows high consistency with the affective textual content of news reports for positive emotion, whereas only low correlation has been observed for neutral and negative emotions. The system can be used directly in applications such as assisting editors in choosing photos with appropriate affective semantics for a given topic during news report preparation.
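As a rough sketch of the pipeline's shape (hot-topic selection from textual keywords, then a face-based affect tally per topic), here is a minimal illustration. The article data layout, the keyword scoring, and the `emotion_clf` classifier are invented stand-ins rather than the paper's actual methods; only OpenCV's Haar-cascade face detector is a real off-the-shelf component.

```python
from collections import Counter
import cv2

def hot_topics(articles, top_k=5):
    """Rank topics by keyword frequency across news articles and social posts.
    Each article is assumed to be a dict with a 'keywords' list (hypothetical)."""
    counts = Counter(kw for a in articles for kw in a["keywords"])
    return [kw for kw, _ in counts.most_common(top_k)]

def classify_topic_images(image_paths, emotion_clf):
    """Tally positive/neutral/negative facial expressions for one topic.
    `emotion_clf` is an assumed pretrained function: face crop -> emotion label."""
    detector = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
    tally = Counter()
    for path in image_paths:
        gray = cv2.cvtColor(cv2.imread(path), cv2.COLOR_BGR2GRAY)
        for (x, y, w, h) in detector.detectMultiScale(gray, 1.3, 5):
            tally[emotion_clf(gray[y:y + h, x:x + w])] += 1
    return tally  # e.g., Counter({'positive': 12, 'neutral': 5, 'negative': 2})
```

The returned tally is the kind of summary an editor-assist application would consult when choosing a photo whose affect matches a topic.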

Relevance:

70.00%

Abstract:

Most developmental studies of emotional face processing to date have focused on infants and very young children. Additionally, studies that examine emotional face processing in older children do not distinguish development in emotion and identity face processing from more generic age-related cognitive improvement. In this study, we developed a paradigm that measures processing of facial expression in comparison to facial identity and complex visual stimuli. Three matching tasks (facial emotion matching, facial identity matching, and butterfly wing matching) were developed to include stimuli of a similar level of discriminability and were equated for task difficulty in earlier samples of young adults. Ninety-two children aged 5–15 years and a new group of 24 young adults completed the three matching tasks. Young children were highly adept at the butterfly wing task relative to their performance on both face-related tasks. More importantly, in older children, the development of facial emotion discrimination ability lagged behind that of facial identity discrimination.

Relevance:

70.00%

Abstract:

The characterisation of facial expression through landmark-based analysis methods such as FACEM (Pilowsky & Katsikitis, 1994) has a variety of uses in psychiatric and psychological research. In these systems, important structural relationships are extracted from images of facial expressions by the analysis of a pre-defined set of feature points. These relationship measures may then be used, for instance, to assess the degree of variability and similarity between different facial expressions of emotion. FaceXpress is a multimedia software suite that provides a generalised workbench for landmark-based facial emotion analysis and stimulus manipulation. It is a flexible tool that is designed to be specialised at runtime by the user. While FaceXpress has been used to implement the FACEM process, it can also be configured to support any other similar, arbitrary system for quantifying human facial emotion. FaceXpress also implements an integrated set of image processing tools and specialised tools for facial expression stimulus production including facial morphing routines and the generation of expression-representative line drawings from photographs.
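To make the landmark-based idea concrete, below is a short sketch of the kind of structural measure such systems compute: scale-normalized distances between pre-defined feature points. The landmark names and the two example measures are illustrative assumptions, not FACEM's actual defined measure set.

```python
import math

def dist(p, q):
    """Euclidean distance between two (x, y) points."""
    return math.hypot(p[0] - q[0], p[1] - q[1])

def landmark_measures(lm):
    """lm: dict mapping landmark name -> (x, y) pixel coordinates.
    Distances are divided by a reference length so measures are scale-invariant."""
    scale = dist(lm["left_eye_outer"], lm["right_eye_outer"])
    return {
        "mouth_width":  dist(lm["mouth_left"], lm["mouth_right"]) / scale,
        "brow_eye_gap": dist(lm["left_brow_mid"], lm["left_eye_outer"]) / scale,
    }
```

Measures like these can then be compared across expressions to assess the variability and similarity the abstract describes.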

Relevance:

70.00%

Abstract:

Affect is an important feature of multimedia content and conveys valuable information for multimedia indexing and retrieval. Most existing studies of affective content analysis are limited to low-level features or mid-level representations and are generally criticized for their incapacity to bridge the gap between low-level features and high-level human affective perception. The facial expressions of subjects in images carry important semantic information that can substantially influence human affective perception, but they have seldom been investigated for the affective classification of facial images in practical applications. This paper presents an automatic image emotion detector (IED) for the affective classification of practical (non-laboratory) data using facial expressions, where many "real-world" challenges are present, including pose, illumination, and size variations. The proposed method is novel, with its framework designed specifically to overcome these challenges using multi-view versions of face and fiducial point detectors and a combination of point-based texture and geometry features. Performance comparisons across several key parameters of the relevant algorithms are conducted to find the optimum parameters for high accuracy and fast computation. A comprehensive set of experiments with existing and new datasets shows that the method is effective despite pose variations, fast, appropriate for large-scale data, and as accurate as the state-of-the-art method on laboratory-based data. The proposed method was also applied to the affective classification of images from the British Broadcasting Corporation (BBC) in a task typical of a practical application, providing some valuable insights.
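A minimal sketch of the "point-based texture and geometry" feature idea follows: uniform-LBP histograms from patches around detected fiducial points, concatenated with scale-normalized pairwise landmark distances. The patch size, LBP parameters, and feature layout are assumptions, not the paper's tuned configuration.

```python
import numpy as np
from itertools import combinations
from skimage.feature import local_binary_pattern

def geometry_features(points):
    """Pairwise distances between fiducial points, normalized by the largest."""
    d = np.array([np.linalg.norm(np.subtract(p, q))
                  for p, q in combinations(points, 2)])
    return d / d.max()

def texture_features(gray, points, half=16):
    """Uniform-LBP histogram of a square patch around each fiducial point.
    Points are assumed to be integer (x, y) coordinates well inside the image."""
    feats = []
    for (x, y) in points:
        patch = gray[y - half:y + half, x - half:x + half]
        lbp = local_binary_pattern(patch, P=8, R=1, method="uniform")
        hist, _ = np.histogram(lbp, bins=10, range=(0, 10), density=True)
        feats.append(hist)
    return np.concatenate(feats)

def ied_features(gray, points):
    """Combined geometry + texture descriptor for one face."""
    return np.concatenate([geometry_features(points),
                           texture_features(gray, points)])
```

The concatenated vector would feed a conventional classifier; the multi-view face and fiducial point detection that produces `points` is outside this sketch.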

Relevance:

70.00%

Abstract:

Humans are a social species with the internal capability to process social information from other humans. To understand others' behavior and to react accordingly, it is necessary to infer their internal states, emotions, and aims, which are conveyed by subtle nonverbal bodily cues such as postures, gestures, and facial expressions. This thesis investigates the brain functions underlying the processing of such social information. Studies I and II of this thesis explore the neural basis of perceiving pain from another person's facial expressions by means of functional magnetic resonance imaging (fMRI) and magnetoencephalography (MEG). In Study I, observing another's facial expression of pain activated the affective pain system (previously associated with self-experienced pain) in accordance with the intensity of the observed expression. The strength of the response in the anterior insula was also linked to the observer's empathic abilities. The cortical processing of facial pain expressions advanced from visual to temporal-lobe areas at latencies (around 300–500 ms) similar to those previously shown for emotional expressions such as fear or disgust. Study III shows that perceiving a yawning face is associated with middle and posterior STS activity, and that the contagiousness of a yawn correlates negatively with amygdalar activity. Study IV explored the brain correlates of interpreting social interaction between two members of the same species, in this case human or canine. Observing interaction engaged brain activity in a very similar manner for both species. Moreover, the body- and object-sensitive brain areas of dog experts differentiated interaction from non-interaction in both humans and dogs, whereas in the control subjects similar differentiation occurred only for humans. Finally, Study V shows the engagement of the brain area associated with biological motion when subjects were exposed to the sounds produced by a single human being walking. The more complex pattern of activation elicited by the walking sounds of several persons suggests that as the social situation becomes more complex, so does the brain response. Taken together, these studies demonstrate the roles of distinct cortical and subcortical brain regions in the perception and sharing of others' internal states via facial and bodily gestures, and the connection of brain responses to behavioral attributes.

Relevance:

70.00%

Abstract:

Evaluation of pain in neonates is difficult due to their limited means of communication. The aim was to determine whether behavioural reactions of cry and facial activity provoked by an invasive procedure could be discriminated from responses to non-invasive tactile events. Thirty-six healthy full-term infants (mean age 2.2 h) received 3 procedures in counterbalanced order: intramuscular injection, application of triple dye to the umbilical stub, and rubbing thigh with alcohol. Significant effects of procedure were found for total face activity and latency to face movement. A cluster of facial actions comprised of brow bulging, eyes squeezed shut, deepening of the naso-labial furrow and open mouth was associated most frequently with the invasive procedure. Comparisons between the 2 non-invasive procedures showed more facial activity to thigh swabbing and least to application of triple dye to the umbilical cord. Acoustic analysis of cry showed statistically significant differences across procedures only for latency to cry and cry duration for the group as a whole. However, babies who cried to two procedures showed higher pitch and greater intensity to the injection. There were no significant differences in melody, dysphonation, or jitter. Methodological difficulties for investigators in this area were examined, including criteria for the selection of cries for analysis, and the logical and statistical challenges of contrasting cries induced by different conditions when some babies do not always cry. It was concluded that facial expression, in combination with short latency to onset of cry and long duration of first cry cycle typifies reaction to acute invasive procedures.

Relevance:

70.00%

Abstract:

Background This study aims to examine the relationship between how individuals with intellectual disabilities report their own levels of anger and those individuals' ability to recognize emotions. It was hypothesized that increased expression of anger would be linked to a lower ability to recognize facial emotional expressions and an increased tendency to interpret facial expressions in a hostile or negative manner. It was also hypothesized that increased levels of anger may lead to the altered perception of particular emotions.

Method A cross-sectional survey design was used. Thirty participants completed a test of facial emotion recognition (FER), and a self-report anger inventory (Benson & Ivins 1992) as part of a structured interview.

Results Individuals with higher self-reported anger did not show significantly reduced performance in FER, nor did they interpret facial expressions in a more hostile manner, compared with individuals with less self-reported anger. However, they were less accurate in recognizing neutral facial expressions.

Conclusions It is tentatively suggested that individuals with high levels of anger may be likely to perceive emotional content in a neutral facial expression because of their high levels of emotional arousal.