Changing musical emotion: a computational rule system for modifying score and performance


Author(s): Livingstone, Steven R.; Muhlberger, Ralf; Brown, Andrew R.; Thompson, William F.
Date(s)

2010

Abstract

When communicating emotion in music, composers and performers encode their expressive intentions through the control of basic musical features such as pitch, loudness, timbre, mode, and articulation. The extent to which emotion can be controlled through the systematic manipulation of these features has not been fully examined. In this paper we present CMERS, a Computational Music Emotion Rule System for the control of perceived musical emotion, which modifies features at the levels of score and performance in real time. CMERS's performance was evaluated in two rounds of perceptual testing. In Experiment I, 20 participants continuously rated the perceived emotion of 15 music samples generated by CMERS. Three musical works were used, each with five emotional variations (normal, happy, sad, angry, and tender). The emotion intended by CMERS was correctly identified 78% of the time, with significant shifts in valence and arousal also recorded, regardless of each work's original emotion.
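The abstract describes a rule system that maps a target emotion to modifications of basic musical features at the score level (e.g., mode, pitch) and performance level (e.g., tempo, loudness, articulation). A minimal sketch of this idea is shown below; the feature set and all rule values are illustrative assumptions, not the published CMERS rules.

```python
# Hypothetical sketch of an emotion rule system in the spirit of CMERS:
# each target emotion maps to adjustments of basic musical features.
# All rule values below are illustrative, NOT the published CMERS rules.

from dataclasses import dataclass, replace

@dataclass(frozen=True)
class Features:
    mode: str           # "major" or "minor" (score level)
    pitch_shift: int    # semitone offset (score level)
    tempo_scale: float  # tempo multiplier (performance level)
    loudness_db: float  # loudness offset in dB (performance level)
    articulation: float # 0.0 = staccato ... 1.0 = legato

# Illustrative rules: emotion -> feature modifications
RULES = {
    "happy":  dict(mode="major", pitch_shift=+2, tempo_scale=1.15,
                   loudness_db=+3.0, articulation=0.4),
    "sad":    dict(mode="minor", pitch_shift=-2, tempo_scale=0.80,
                   loudness_db=-4.0, articulation=0.9),
    "angry":  dict(mode="minor", pitch_shift=0,  tempo_scale=1.25,
                   loudness_db=+6.0, articulation=0.2),
    "tender": dict(mode="major", pitch_shift=0,  tempo_scale=0.90,
                   loudness_db=-2.0, articulation=0.8),
}

def apply_emotion(base: Features, emotion: str) -> Features:
    """Return a copy of the base features with the emotion's rules applied."""
    return replace(base, **RULES[emotion])

base = Features(mode="major", pitch_shift=0, tempo_scale=1.0,
                loudness_db=0.0, articulation=0.5)
print(apply_emotion(base, "sad"))
```

In such a design, score-level changes (mode, pitch) would alter the notated material, while performance-level changes (tempo, loudness, articulation) would shape its rendering in real time, mirroring the two levels the paper distinguishes.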

Identifier

http://eprints.qut.edu.au/31295/

Publisher

The MIT Press

Relation

DOI:10.1162/comj.2010.34.1.41

Livingstone, Steven R., Muhlberger, Ralf, Brown, Andrew R., & Thompson, William F. (2010) Changing musical emotion: a computational rule system for modifying score and performance. Computer Music Journal, 34(1), pp. 41-64.

Source

Australasian CRC for Interaction Design (ACID); Creative Industries Faculty; Music & Sound

Keywords #190400 PERFORMING ARTS AND CREATIVE WRITING #music perception #emotion #generative music
Type

Journal Article