1000 results for Musical interfaces
Abstract:
Amphibian is a 10’00’’ musical work which explores new musical interfaces and approaches to hybridising performance practices from the popular music, electronic dance music and computer music traditions. The work is designed to be presented in a range of contexts associated with the electro-acoustic, popular and classical music traditions. The work is for two performers using two synchronised laptops, an electric guitar and a custom-designed gestural interface for vocal performers, the e-Mic (Extended Mic-stand Interface Controller). This interface was developed by one of the co-authors, Donna Hewitt. The e-Mic allows a vocal performer to manipulate the voice in real time through the capture of physical gestures via an array of sensors (pressure, distance, tilt) along with ribbon controllers and an X-Y joystick microphone mount. Performance data are then sent to a computer running audio-processing software, which is used to transform the audio signal from the microphone. In this work, data are also exchanged between performers via a local wireless network, allowing the performers to work with shared data streams. The duo employs the gestural conventions of guitarist and singer (i.e. 'a band' in a popular music context), but transforms these sounds and gestures into new digital music. The gestural language of popular music is deliberately subverted and taken into a new context. The piece thus explores the nexus between the sonic and performative practices of electro-acoustic music and intelligent electronic dance music ('IDM'). This work was situated in the research fields of new musical interfacing, interaction design, experimental music composition and performance. The contexts in which the research was conducted were live musical performance and studio music production. The work investigated new methods for musical interfacing, performance data mapping, and hybrid performance and compositional practices in electronic music. The research methodology was practice-led. New insights were gained from the iterative experimental workshopping of gestural inputs, musical data mapping, inter-performer data exchange, software patch design, and data and audio processing chains. With respect to interfacing, there were innovations in the design and implementation of a novel sensor-based gestural interface for singers, the e-Mic, one of the few existing gestural controllers for singers. The work explored the compositional potential of sharing real-time performance data between performers and deployed novel methods for inter-performer data exchange and mapping. As regards stylistic and performance innovation, the work explored and demonstrated an approach to hybridising the gestural and sonic language of popular music with recent 'post-digital' approaches to laptop-based experimental music. The development of the work was supported by an Australia Council grant. Research findings have been disseminated via a range of international conference publications, recordings, radio interviews (ABC Classic FM), broadcasts, and performances at international events and festivals. The work was curated into the major Australian international festival Liquid Architecture, and was selected by an international music jury (through blind peer review) for presentation at the International Computer Music Conference in Belfast, Northern Ireland.
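To make the data flow described above concrete, the sketch below shows one way a sensor-to-parameter mapping and inter-performer data exchange over a local network could be organised. It is an illustrative Python sketch only, not the software used in Amphibian: the sensor names, parameter ranges, peer address and the UDP/JSON transport are all assumptions.

```python
# Hypothetical sketch of sensor mapping and inter-performer data exchange.
# The actual e-Mic software and message format are not documented here; the
# sensor names, parameter ranges, and UDP/JSON transport are assumptions.
import json
import socket

# Placeholder standing in for the other performer's laptop on the local wireless network.
PEER_ADDR = ("127.0.0.1", 9000)

def map_sensors_to_params(sensors):
    """Scale raw sensor readings (assumed 0.0-1.0) to audio-processing parameters."""
    return {
        "filter_cutoff_hz": 200 + sensors["pressure"] * 8000,  # pressure opens a low-pass filter
        "delay_feedback": min(sensors["distance"], 0.95),      # distance drives delay feedback
        "pan": sensors["joystick_x"] * 2 - 1,                  # X-Y mount mapped to stereo pan
        "grain_density": 1 + sensors["tilt"] * 49,             # tilt scales granular density
    }

def share_with_peer(sock, sensors):
    """Send the raw sensor frame to the other performer as a shared data stream."""
    sock.sendto(json.dumps(sensors).encode("utf-8"), PEER_ADDR)

if __name__ == "__main__":
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    frame = {"pressure": 0.4, "distance": 0.7, "tilt": 0.2, "joystick_x": 0.9}
    print(map_sensors_to_params(frame))  # local mapping for this performer's processing chain
    share_with_peer(sock, frame)         # shared stream available to the other performer's mapping
```

In practice such mappings would be refined iteratively in rehearsal, which is consistent with the workshopping methodology described in the abstract.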
Abstract:
Real-time adaptive music is now well established as a popular medium, largely through its use in video game soundtracks. Commercial packages such as fmod make the underlying technical methods freely available for educational use, putting adaptive music technologies within reach of students. Writing adaptive music, however, presents a significant learning challenge, not least because it requires a different mode of thought, and tutor and learner may have few mutual points of connection in discovering and understanding the musical drivers, relationships and structures in these works. This article discusses the creation of ‘BitBox!’, a gestural music interface designed to deconstruct and explain the component elements of adaptive composition through interactive play. The interface was displayed at the Dare Protoplay games exposition in Dundee in August 2014. The initial proof-of-concept study proved successful, suggesting possible refinements in design and a broader range of applications.
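For readers unfamiliar with how an adaptive score responds to play, the sketch below illustrates one common technique, vertical layering, in which a single intensity parameter crossfades instrument stems. It is not the BitBox! implementation and does not call the FMOD API; the stem names and thresholds are assumptions chosen for illustration.

```python
# Illustrative sketch of vertical layering, a common adaptive-music technique:
# a single game-driven parameter fades instrument stems in and out.
# Stem names and thresholds are assumed for the example.

def layer_gains(intensity):
    """Map a 0.0-1.0 intensity parameter to per-stem gains."""
    def ramp(start, end):
        # Linear fade-in of a stem between two intensity thresholds.
        if intensity <= start:
            return 0.0
        if intensity >= end:
            return 1.0
        return (intensity - start) / (end - start)

    return {
        "pads": 1.0,                   # always present
        "percussion": ramp(0.2, 0.5),  # enters as intensity rises
        "bass": ramp(0.4, 0.7),
        "lead": ramp(0.7, 1.0),        # only at high intensity
    }

if __name__ == "__main__":
    for intensity in (0.0, 0.3, 0.6, 0.9):
        print(intensity, layer_gains(intensity))
```

An interface of the kind described could expose such a parameter directly to the player, making the relationship between input and musical structure audible through play.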
Abstract:
In this paper, we propose a theoretical framework for the design of tangible interfaces for musical expression. The main insight behind the proposed approach is the importance and utility of familiar sensorimotor experiences for the creation of engaging and playable new musical instruments. In particular, we suggest exploiting the commonalities between different natural interactions by varying the auditory response or tactile details of the instrument within certain limits. Using this principle, devices for classes of sounds such as coarse-grain collision interactions or friction interactions can be designed. The designs we propose retain the familiar tactile aspect of the interaction, so that the performer can take advantage of tacit knowledge gained through experience with such phenomena in the real world.
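As a minimal sketch of this principle, the example below keeps one familiar physical interaction fixed (a discrete collision impact) and varies only the auditory response within limits via a 'material' setting. The sensor input, material presets, and parameter names are illustrative assumptions, not a design from the paper.

```python
# Sketch of the design principle: the tactile interaction (striking) stays the
# same, while the auditory response varies within limits via a material preset.
# Presets and parameter names are assumed for illustration.

MATERIALS = {
    "wood": {"decay_s": 0.08, "brightness": 0.3},
    "glass": {"decay_s": 0.30, "brightness": 0.9},
    "metal": {"decay_s": 0.60, "brightness": 0.7},
}

def collision_event(impact_strength, material="wood"):
    """Turn one sensed impact (assumed 0.0-1.0) into synthesis parameters."""
    preset = MATERIALS[material]
    return {
        "amplitude": impact_strength,
        "decay_s": preset["decay_s"] * (0.5 + impact_strength),  # harder hits ring longer
        "brightness": preset["brightness"],
    }

if __name__ == "__main__":
    # The same gesture yields different sounds; the familiar tactile aspect is unchanged.
    for material in MATERIALS:
        print(material, collision_event(0.8, material))
```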
Abstract:
At the outset of a discussion of evaluating digital musical instruments, that is to say instruments whose sound generators are digital and separable though not necessarily separate from their control interfaces (Malloch, 2006), it is reasonable to ask what the term evaluation in this context really means. After all, there may be many perspectives from which to view the effectiveness or otherwise of the instruments we build. For most performers, performance on an instrument becomes a means of evaluating how well it functions in the context of live music making, and their measure of success is the response of the audience to their performance. Audiences evaluate performances on the basis of how engaged they feel they have been by what they have seen and heard. When questioned, they are likely to describe good performances as “exciting,” “skillful,” “musical.” Bad performances are “boring,” and those which are marred by technical malfunction are often dismissed out of hand. If performance is considered to be a valid means of evaluating a musical instrument, then it follows that, for the field of DMI design, a much broader definition of the term “evaluation” than that typically used in human-computer interaction (HCI) is required to reflect the fact that there are a number of stakeholders involved in the design and evaluation of DMIs. In addition to players and audiences, there are also composers, instrument builders, component manufacturers, and perhaps even customers, each of whom will have a different concept of what is meant by “evaluation.”
Abstract:
As NIME's focus has expanded beyond the design reports that were pervasive in its early days to include studies and experiments involving music control devices, we report on a particular area of activity that has been overlooked: the design of music devices for experimental contexts. We demonstrate that this is distinct from designing for artistic performance and presents a unique set of novel challenges. A survey of methodological approaches to experiments in NIME reveals a tendency to rely on existing instruments or on evaluations of new devices designed for broader creative application. We present two examples from our own studies that reveal the merits of designing purpose-built devices for experimental contexts.
Abstract:
Psychotherapy literature provides a theoretical understanding of parent-infant attachment. This article will reflect upon the specific need to give thoughtful consideration to those infants admitted to the acute-care setting, such as neonatal and paediatric intensive care units, and the potential for this environment to affect infant development and the parent-infant relationship. Infant-directed singing, as described in this article, is an improvised form of vocal interaction that is specifically informed by an understanding of the musical parameters of pitch, rhythm, phrasing, timbre, register, dynamic, tempo and silence. This article will detail a theoretical understanding of using infant-directed singing to foster parent-infant interaction within the acute care environment. In particular, the potentially sensitive, reciprocal and engaging nature of infant-directed singing, coupled with its ability to promote and support maternal demonstrations of empathy, will be discussed with a view to the psychological and physical development of the hospitalised infant.
Abstract:
Recent developments in interactive technologies have seen major changes in the manner in which artists, performers, and creative individuals interact with digital music technology, owing to the increasing variety of interactive technologies that are readily available today. Digital Musical Instruments (DMIs) present musicians with performance challenges that are unique to this form of computer music. One of the most significant deviations from conventional acoustic musical instruments is the level of physical feedback conveyed by the instrument to the user. Currently, new interfaces for musical expression are not designed to be as physically communicative as acoustic instruments. Specifically, DMIs are often devoid of haptic feedback and therefore lack the ability to impart important performance information to the user. Moreover, there is currently no standardised way to measure the effect of this lack of physical feedback. Best practice would suggest that there should be a set of methods to effectively, repeatedly, and quantifiably evaluate the functionality, usability, and user experience of DMIs. Earlier theoretical and technological applications of haptics have tried to address device performance issues associated with the lack of feedback in DMI designs, and it has been argued that the level of haptic feedback presented to a user can significantly affect the user’s overall emotive feeling towards a musical device. The outcomes of the investigations contained within this thesis are intended to inform new haptic interface design.