126 results for Affective Computing


Relevance:

60.00%

Publisher:

Abstract:

This output is an invited and refereed chapter in the second of the two book-length outputs resulting from the EU HUMAINE grant and follow-on grants. The book, in the OUP Affective Science Series, is intended to provide a theoretically oriented, state-of-the-art model for those working in the area of affective computing. Each chapter provides a synthesis of a specific area and presents new data, findings, and approaches developed by the author(s) which take the area further. This chapter is in the section on ‘Approaches to developing expression corpora and databases.’ The chapter provides a critical synthesis of the issues involved in databases for affective computing and introduces the SEMAINE SAL Database, developed as an integral part of the interdisciplinary EU SEMAINE Project (the Sensitive Agent Project, 2008-2011). The project aimed to develop a computer interface that would allow a human to interact with an artificial agent in an emotional manner.

Relevance:

60.00%

Publisher:

Abstract:

This paper describes a substantial effort to build a real-time interactive multimodal dialogue system with a focus on emotional and non-verbal interaction capabilities. The work is motivated by the aim to provide technology with competences in perceiving and producing the emotional and non-verbal behaviours required to sustain a conversational dialogue. We present the Sensitive Artificial Listener (SAL) scenario as a setting which seems particularly suited for the study of emotional and non-verbal behaviour, since it requires only very limited verbal understanding on the part of the machine. This scenario allows us to concentrate on non-verbal capabilities without having to address at the same time the challenges of spoken language understanding, task modeling, etc. We first report on three prototype versions of the SAL scenario, in which the behaviour of the Sensitive Artificial Listener characters was determined by a human operator. These prototypes served the purpose of verifying the effectiveness of the SAL scenario and allowed us to collect data required for building system components for analysing and synthesising the respective behaviours. We then describe the fully autonomous integrated real-time system we created, which combines incremental analysis of user behaviour, dialogue management, and synthesis of speaker and listener behaviour of a SAL character displayed as a virtual agent. We discuss principles that should underlie the evaluation of SAL-type systems. Since the system is designed for modularity and reuse, and since it is publicly available, the SAL system has potential as a joint research tool in the affective computing research community.
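The abstract describes an autonomous loop combining incremental analysis of user behaviour, dialogue management, and synthesis of a SAL character's speaker and listener behaviour. Purely as an illustration, the sketch below shows one plausible way such a modular perceive-decide-act pipeline could be wired together; the component names and interfaces are hypothetical and are not taken from the SEMAINE codebase.

```python
# Hypothetical sketch of a SAL-style processing loop: incremental analysis of
# user behaviour feeds a dialogue manager, which drives behaviour synthesis.
# Component names and interfaces are illustrative, not the SEMAINE API.
from dataclasses import dataclass


@dataclass
class UserState:
    """Coarse estimate of the user's current affective state."""
    valence: float   # negative .. positive, in [-1, 1]
    arousal: float   # calm .. excited, in [-1, 1]
    is_speaking: bool


class FeatureAnalyser:
    """Stands in for incremental audio-visual analysis of the user."""
    def update(self, audio_frame, video_frame) -> UserState:
        # A real analyser would run voice-activity detection, prosody and
        # facial-expression models here; this sketch returns a placeholder.
        return UserState(valence=0.2, arousal=0.1, is_speaking=True)


class DialogueManager:
    """Chooses whether the agent should listen, back-channel, or speak."""
    def decide(self, state: UserState) -> str:
        if state.is_speaking:
            # Produce listener behaviour while the user holds the turn.
            return "backchannel" if state.arousal > 0.0 else "listen"
        return "speak"


class BehaviourSynthesiser:
    """Stands in for speech and virtual-agent animation output."""
    def render(self, action: str, state: UserState) -> None:
        print(f"agent action: {action} (user valence={state.valence:+.1f})")


def run_turn(analyser, manager, synthesiser, audio_frame, video_frame):
    """One pass of the perceive -> decide -> act loop."""
    state = analyser.update(audio_frame, video_frame)
    action = manager.decide(state)
    synthesiser.render(action, state)


if __name__ == "__main__":
    run_turn(FeatureAnalyser(), DialogueManager(), BehaviourSynthesiser(),
             audio_frame=None, video_frame=None)
```

In a running system the loop would be driven by the audio/video frame rate rather than called once, but the modular split mirrors the analysis, dialogue management, and synthesis stages named in the abstract.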

Relevance:

60.00%

Publisher:

Abstract:

SEMAINE has created a large audiovisual database as part of an iterative approach to building Sensitive Artificial Listener (SAL) agents that can engage a person in a sustained, emotionally colored conversation. Data used to build the agents came from interactions between users and an operator simulating a SAL agent, in different configurations: Solid SAL (designed so that operators displayed appropriate nonverbal behavior) and Semi-automatic SAL (designed so that users' experience approximated interacting with a machine). We then recorded user interactions with the developed system, Automatic SAL, comparing the most communicatively competent version to versions with reduced nonverbal skills. High-quality recording was provided by five high-resolution, high-framerate cameras and four microphones, recorded synchronously. The recordings cover 150 participants, for a total of 959 conversations with individual SAL characters, each lasting approximately 5 minutes. Solid SAL recordings are transcribed and extensively annotated: 6-8 raters per clip traced five affective dimensions and 27 associated categories. Other scenarios are labeled on the same pattern, but less fully. Additional information includes FACS annotation on selected extracts, identification of laughs, nods, and shakes, and measures of user engagement with the automatic system. The material is available through a web-accessible database. © 2010-2012 IEEE.
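The Solid SAL annotations described above are continuous trace-style ratings from 6-8 raters per clip on five affective dimensions. A common first step with such traces is to resample each rater's trace onto a shared time base and average across raters; the sketch below illustrates that step only. The input format (per-rater arrays of time/value pairs) is an assumption for illustration, not the actual SEMAINE distribution format.

```python
# Illustrative aggregation of continuous trace-style ratings from several
# raters into a single mean trace on a common time base. The input format
# (per-rater (times, values) arrays) is assumed for this sketch.
import numpy as np


def mean_trace(rater_traces, step=0.1):
    """Resample each rater's (times, values) trace and average across raters.

    rater_traces: list of (times, values) pairs, one per rater, with times
    in seconds and values the traced dimension (e.g. valence).
    step: resampling interval in seconds.
    """
    end = min(times[-1] for times, _ in rater_traces)  # common duration
    grid = np.arange(0.0, end, step)
    resampled = [np.interp(grid, times, values) for times, values in rater_traces]
    return grid, np.mean(resampled, axis=0)


if __name__ == "__main__":
    # Two toy raters tracing valence over roughly five seconds.
    rater_a = (np.array([0.0, 2.5, 5.0]), np.array([0.0, 0.6, 0.3]))
    rater_b = (np.array([0.0, 2.0, 5.0]), np.array([0.1, 0.5, 0.2]))
    grid, mean = mean_trace([rater_a, rater_b])
    print(mean[:5])  # first few samples of the averaged valence trace
```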

Relevance:

60.00%

Publisher:

Abstract:

This paper tries to achieve a balanced view of the ethical issues raised by emotion-oriented technology as it is, rather than as it might be imagined. A high proportion of applications seem ethically neutral. Uses in entertainment and allied areas do no great harm or good. Empowering professions may do either, but regulatory systems already exist. Ethically positive aspirations involve mitigating problems that already exist by supporting humans in emotion-related judgments, by replacing technology that treats people in dehumanized and/or demeaning ways, and by improving access for groups who struggle with existing interfaces. Emotion-oriented computing may also contribute to revaluing human faculties other than pure intellect. Many potential negatives apply to technology as a whole. Concerns specifically related to emotion involve creating a lie, by simulating emotions that the systems do not have, or promoting mechanistic conceptions of emotion. Intermediate issues arise where more general problems could be exacerbated, such as helping systems to sway human choices or encouraging humans to choose virtual worlds rather than reality. "SIIF" systems (semi-intelligent information filters) are particularly problematic: they use simplified rules to make judgments about people that are complex and have potentially serious consequences. The picture is one of balances to recognize and negotiate, not uniform good or evil. © 2010-2012 IEEE.

Relevance:

60.00%

Publisher:

Abstract:

For many years psychological research on facial expression of emotion has relied heavily on a recognition paradigm based on posed static photographs. There is growing evidence that there may be fundamental differences between the expressions depicted in such stimuli and the emotional expressions present in everyday life. Affective computing, with its pragmatic emphasis on realism, needs examples of natural emotion. This paper describes a unique database containing recordings of mild to moderate emotionally coloured responses to a series of laboratory-based emotion induction tasks. The recordings are accompanied by information on self-report of emotion and intensity, continuous trace-style ratings of valence and intensity, the sex of the participant, the sex of the experimenter, and the active or passive nature of the induction task. The database also gives researchers the opportunity to compare expressions from people from more than one culture.
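The abstract lists the metadata accompanying each recording (self-reported emotion and intensity, continuous ratings, sex of participant and experimenter, active versus passive induction task). Purely as an illustration, the sketch below shows one way a researcher might represent and filter such records; the field names and the `Recording` type are hypothetical, not the database's actual schema.

```python
# Hypothetical per-recording metadata record for an emotion-induction corpus,
# using the fields named in the abstract. Not the database's actual schema.
from dataclasses import dataclass
from typing import List


@dataclass
class Recording:
    participant_sex: str        # "female" / "male"
    experimenter_sex: str
    induction_task: str         # e.g. "frustration_game", "sad_film"
    task_is_active: bool        # active vs. passive induction
    self_report_emotion: str    # participant's own label
    self_report_intensity: int  # e.g. 1 (mild) .. 5 (strong)


def active_tasks(records: List[Recording]) -> List[Recording]:
    """Select recordings from active (participant-driven) induction tasks."""
    return [r for r in records if r.task_is_active]


if __name__ == "__main__":
    sample = [
        Recording("female", "male", "frustration_game", True, "frustration", 3),
        Recording("male", "female", "sad_film", False, "sadness", 2),
    ]
    print(len(active_tasks(sample)))  # -> 1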

Relevance:

60.00%

Publisher:

Abstract:

Convincing conversational agents require a coherent set of behavioral responses that can be interpreted by a human observer as indicative of a personality. This paper discusses the continued development and subsequent evaluation of virtual agents based on sound psychological principles. We use Eysenck's theoretical basis to explain aspects of the characterization of our agents, and we describe an architecture where personality affects the agents' global behavior quality as well as their back-channel productions. Drawing on psychological research, we evaluate perception of our agents' personalities and credibility by human viewers (N = 187). Our results suggest that we succeeded in validating theoretically grounded indicators of personality in our virtual agents, and that it is feasible to place our characters on Eysenck's scales. A key finding is that the presence of behavioral characteristics reinforces the prescribed personality profiles that already emerge from the still images. Our long-term goal is to enhance agents' ability to sustain realistic interaction with human users, and we discuss how this preliminary work may be further developed to include more systematic variation along Eysenck's personality scales. © 2012 IEEE.
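The abstract describes an architecture in which an agent's position on Eysenck's personality dimensions shapes both its overall behaviour quality and its back-channel productions. The sketch below illustrates, in the simplest possible form, how extraversion and neuroticism scores might be mapped to behaviour parameters such as back-channel rate and gesture expressiveness; the mapping and parameter names are assumptions for illustration, not the paper's actual model.

```python
# Illustrative mapping from Eysenck-style personality scores to low-level
# behaviour parameters for a virtual agent. The specific formulas and
# parameter names are assumptions, not the paper's parameterisation.
from dataclasses import dataclass


@dataclass
class Personality:
    extraversion: float  # 0.0 (introvert) .. 1.0 (extravert)
    neuroticism: float   # 0.0 (stable)    .. 1.0 (neurotic)


@dataclass
class BehaviourParams:
    backchannel_rate: float   # back-channels per minute of user speech
    gesture_amplitude: float  # 0.0 (restrained) .. 1.0 (expansive)
    speech_rate: float        # relative to a neutral baseline of 1.0


def behaviour_from_personality(p: Personality) -> BehaviourParams:
    """Derive behaviour parameters from personality scores (toy mapping)."""
    return BehaviourParams(
        backchannel_rate=4.0 + 8.0 * p.extraversion,  # extraverts react more often
        gesture_amplitude=0.3 + 0.6 * p.extraversion,
        speech_rate=1.0 + 0.15 * p.neuroticism,       # tension slightly speeds speech
    )


if __name__ == "__main__":
    cheerful_extravert = Personality(extraversion=0.9, neuroticism=0.2)
    print(behaviour_from_personality(cheerful_extravert))
```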

