956 results for Interactive Performance
Abstract:
What happens when the traditional framing mechanisms of our performance environments are removed and we, as directors, are forced to work with actors in digital environments that capture performance in 360 degrees? As directors contend with the challenges of interactive performance, the emergence of the online audience and the powerful influence of the games industry, how can we approach the challenge of directing work that is performance-captured and presented in real time using motion capture and associated 3D imaging software? The 360-degree, real-time capture of performance, while allowing virtually unlimited framing potential, demands a uniquely and uncompromisingly disciplined style of direction and performance that has thus far remained unstudied and unquantified. Through close analysis of the groundbreaking work of artists such as Robert Zemeckis and the Weta Digital studio it is possible to begin to quantify the technical requirements and challenges of 360-degree direction, but little has been discovered about the challenge of communicating this unlimited potential of framing and focus to the actors who work with these directors within these systems. It cannot be denied that the potential of theatrical space has evolved beyond the physical and moved into a more accessible virtual and digitised form; how, then, can we direct for this unlimited potential, and where do we place the focus of our directed (and captured) performance?
Abstract:
This paper questions how the performance place was transformed into a performance space. This major change in distinction holds ongoing significance for the development of the craft of actors, scenographers, animators, writers and film directors within current digitally mediated and interactive performance environments. As part of this discussion the paper traces the crucial seed of the revolution that transformed modern scenographic practice from the romantic realism of the Victorian stage to the open potential of today's performance environment. This is achieved through close readings of the practical work of Edward Gordon Craig and Adolphe Appia, as well as the scenographic discussions of Chris Baugh.
Abstract:
This research explores music in space, as experienced through performing and music-making with interactive systems. It explores how musical parameters may be presented spatially and displayed visually with a view to their exploration by a musician during performance. Spatial arrangements of musical components, especially pitches and harmonies, have been widely studied in the literature, but the current capabilities of interactive systems allow the improvisational exploration of these musical spaces as part of a performance practice. This research focuses on the quantised spatial organisation of musical parameters in what can be categorised as grid music systems (GMSs), and on interactive music systems based on them. The research explores and surveys existing and historical uses of GMSs, and develops and demonstrates a novel grid music system designed for whole-body interaction.

Grid music systems allow spatialised input to be plotted on a two-dimensional grid layout to construct patterned music. GMSs are navigated to construct a sequence of parametric steps, for example a series of pitches, rhythmic values, a chord sequence, or terraced dynamic steps. While they are conceptually simple when controlling only one musical dimension, grid systems may be layered to enable complex and satisfying musical results. These systems have proved a viable, effective, accessible and engaging means of music-making for the general user as well as the musician. GMSs have been widely used in electronic and digital music technologies, where they have generally been applied to small portable devices and software systems such as step sequencers and drum machines.

This research shows that by scaling up a grid music system, music-making and musical improvisation are enhanced, gaining several advantages: (1) full-body location becomes the spatial input to the grid, and the system becomes partially immersive in four related ways: spatially, graphically, sonically and musically; (2) detection of body location by tracking enables hands-free operation, allowing the playing of a musical instrument in addition to "playing" the grid system; (3) visual information regarding musical parameters may be enhanced so that the performer may fully engage with existing spatial knowledge of musical materials. The result is that existing spatial knowledge is overlaid on, and combined with, music-space.

Music-space is a new concept produced by the research. It is similar to notions of other musical spaces including soundscape, acoustic space, Smalley's "circumspace" and "immersive space" (2007, 48-52), and Lotis's "ambiophony" (2003), but is rather more textural and "alive", and therefore very conducive to interaction. Music-space is that space occupied by music, set within normal space, which may be perceived by a person located within, or moving around in, that space. Music-space has a perceivable "texture" made of tensions and relaxations, and contains spatial patterns of these formed by musical elements such as notes, harmonies and sounds, changing over time. The music may be performed by live musicians, created electronically, or prerecorded. Large-scale GMSs have the capability not only to interactively display musical information as music-representative space, but to allow music-space to co-exist with it. Moving around the grid, the performer may interact in real time with musical materials in music-space as they form over squares or move in paths.

Additionally, the performer may sense the textural matrix of the music-space while immersed in surround sound covering the grid. The HarmonyGrid is a new computer-based interactive performance system, developed during this research, that provides a generative music-making system intended to accompany, or play along with, an improvising musician. This large-scale GMS employs full-body motion tracking over a projected grid. Playing with the system creates an enhanced performance employing live interactive music along with graphical and spatial activity. Although one other experimental system provides certain aspects of immersive music-making, currently only the HarmonyGrid provides an environment in which to explore and experience music-space in a GMS.
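To make the grid mapping described above concrete, the following Python sketch quantises a tracked floor position to a cell of an 8 x 8 grid and reads a toy harmony layer from it. The grid size, stage dimensions and chord mapping are illustrative assumptions only, not the HarmonyGrid's actual implementation.

# Hypothetical sketch: mapping a tracked body position to a cell of a
# 2D grid music system (GMS) and reading a musical parameter from it.
# Grid dimensions, stage size and the chord layer are assumptions made
# for illustration, not the actual HarmonyGrid implementation.

GRID_COLS, GRID_ROWS = 8, 8          # an 8 x 8 projected grid
STAGE_W, STAGE_H = 4.0, 4.0          # tracked floor area in metres

# One "layer" of the grid: each column selects a scale degree (MIDI notes).
C_MAJOR = [60, 62, 64, 65, 67, 69, 71, 72]

def cell_for_position(x, y):
    """Quantise a tracked (x, y) floor position to a grid cell."""
    col = min(int(x / STAGE_W * GRID_COLS), GRID_COLS - 1)
    row = min(int(y / STAGE_H * GRID_ROWS), GRID_ROWS - 1)
    return col, row

def chord_for_cell(col, row):
    """Read the harmony layer: column selects the root, row shifts the octave."""
    root = C_MAJOR[col]
    chord = [root, root + 4, root + 7]          # simple major triad
    return [note + 12 * (row // 3) for note in chord]

# Example: a performer tracked at (1.3 m, 2.2 m) on the stage.
col, row = cell_for_position(1.3, 2.2)
print("cell:", (col, row), "chord:", chord_for_cell(col, row))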
Abstract:
We describe a technique for interactive rendering of diffraction effects produced by biological nanostructures such as snake skin surface gratings. Our approach uses imagery from atomic force microscopy that accurately captures the nanostructures responsible for structural coloration, that is, coloration due to wave interference, in a variety of animals. We develop a rendering technique that constructs bidirectional reflection distribution functions (BRDFs) directly from the measured data and leverages precomputation to achieve interactive performance. We demonstrate results of our approach using various shapes of the surface grating nanostructures. Finally, we evaluate the accuracy of our precomputation-based technique and compare to a reference BRDF construction technique.
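As a rough illustration of the precomputation strategy mentioned above, the sketch below bakes the angular colour response of an idealised one-dimensional grating into a lookup table, so that shading at interactive rates reduces to a table fetch. The grating-equation model, the 500 nm period and the crude wavelength-to-RGB ramp are assumptions made for the example; the paper instead constructs its BRDFs from measured atomic force microscopy data.

import numpy as np

# Hypothetical sketch of the precomputation idea: bake the angular colour
# response of a simple 1D diffraction grating into a table so that shading
# at render time is a single lookup.  Period, orders and the colour ramp
# are illustrative assumptions only.

PERIOD_NM = 500.0                              # grating period (assumed)
ORDERS = (1, 2, 3)                             # diffraction orders considered
                                               # (negative orders folded in by symmetry)

def wavelength_to_rgb(lam):
    """Very rough wavelength-to-RGB ramp, sufficient for the sketch."""
    r = np.clip((lam - 500.0) / 200.0, 0.0, 1.0)
    g = np.clip(1.0 - abs(lam - 540.0) / 140.0, 0.0, 1.0)
    b = np.clip((620.0 - lam) / 240.0, 0.0, 1.0)
    return np.array([r, g, b])

def precompute_table(n=128):
    """Table of RGB response indexed by (incident angle, outgoing angle)."""
    thetas = np.linspace(-np.pi / 2, np.pi / 2, n)
    table = np.zeros((n, n, 3))
    for i, ti in enumerate(thetas):
        for j, to in enumerate(thetas):
            for m in ORDERS:
                # Grating equation: d * (sin(out) - sin(in)) = m * lambda.
                lam = abs(np.sin(to) - np.sin(ti)) * PERIOD_NM / m
                if 380.0 <= lam <= 700.0:
                    table[i, j] += wavelength_to_rgb(lam) / m
    return thetas, table

thetas, table = precompute_table()

def shade(theta_in, theta_out):
    """Interactive path: shading reduces to a table lookup."""
    i = min(np.searchsorted(thetas, theta_in), len(thetas) - 1)
    j = min(np.searchsorted(thetas, theta_out), len(thetas) - 1)
    return table[i, j]

print(shade(np.radians(-45.0), np.radians(20.0)))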
Abstract:
Grid music systems offer discrete geometric methods for simplified music-making, using spatialised input to construct patterned music on a 2D matrix layout. While they are conceptually simple, grid systems may be layered to enable complex and satisfying musical results. Grid music systems have been applied across a range of platforms, from small portable devices up to larger systems. In this paper we discuss the use of grid music systems in general and present an overview of the HarmonyGrid system we have developed as a new interactive performance system. We discuss a range of issues related to the design and use of larger-scale grid-based interactive performance systems such as the HarmonyGrid.
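The layering mentioned above can be illustrated with a minimal sketch: two independent grid layers, one holding pitches and one holding rhythmic values, are read at the same column index to yield note events. The layer contents below are toy data, not taken from the HarmonyGrid.

# Hypothetical sketch of "layering" two simple grid layers: stepping the
# same column index through a pitch layer and a rhythm layer at once.
# The 8-step layers below are toy data, not the HarmonyGrid's contents.

PITCH_LAYER  = [60, 62, 64, 67, 69, 67, 64, 62]            # MIDI note per column
RHYTHM_LAYER = [1.0, 0.5, 0.5, 1.0, 0.25, 0.25, 0.5, 1.0]   # beats per column

def layered_sequence(steps):
    """Combine layers column by column into (note, duration) events."""
    events = []
    for step in range(steps):
        col = step % len(PITCH_LAYER)
        events.append((PITCH_LAYER[col], RHYTHM_LAYER[col]))
    return events

print(layered_sequence(8))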
Abstract:
"What was silent will speak, what is closed will open and will take on a voice" (Paul Virilio). The fundamental problem in dealing with the digital is that we are forced to contend with a fundamental deconstruction of form: a deconstruction that renders our content and practice into a single state that can be openly and easily manipulated, reimagined and mashed together in rapid time to create completely unique artefacts and potentially unwranglable jumbles of data. Once our work is essentially broken down into this series of number sequences (or bytes) – our sound, images, movies and documents, our memory files – we are left with nothing but choice... and this is the key concern. This absence of form transforms our work into new collections and poses unique challenges for the artist seeking opportunities to exploit the potential of digital deconstruction. It is through this struggle with the absent form that we are able to thoroughly explore the latent potential of content, exploit modern abstractions of time and devise approaches within our practice that actively deal with the digital as an essential matter of course.
Abstract:
The Silk Road Project was a practice-based research project investigating the potential of motion capture technology to inform perceptions of embodiment in dance performance. The project created a multi-disciplinary collaborative performance event using dance performance and real-time motion capture at Deakin University's Deakin Motion Lab. Several new technological advances in producing real-time motion capture performance were made, along with a performance event that examined the aesthetic interplay between a dancer's movement and the precise mappings of its trajectories created by motion capture and real-time motion graphic visualisations.
Abstract:
The Pedestrian Interaction Patch Project (PIPP) seeks to exert influence over and encourage abnormal pedestrian behaviour. By placing an unadvertised (and non-recording) interactive video manipulation system and projection source in a high-traffic public area, the PIPP allows pedestrians to privately (and publicly) re-engage with a previously inactive physical environment, such as a commonly used walkway or corridor. This system, the results of which are projected in real time onto the architectural surface, confronts pedestrians with questions about preconceived notions of self and space. In attempting to re-activate our relationship with the physical surrounds we occupy each day, the PIPP creates a new set of memories to be recalled as we re-enter known environments once the PIPP has moved on, and so re-enlivens our relationship with the everyday architecture we stroll past every day. The PIPP environment is controlled using the software program Isadora, devised by Mark Coniglio at Troika Ranch, and contains a series of video manipulation patches designed not only to grab pedestrians' attention but also to encourage a sense of play and interaction between the architecture, the digital environment, the initially unsuspecting participant(s) and the pedestrian audience. The PIPP was included as part of the planned walking tour for the "Playing in Urban Spaces" seminar day, and ran as an installation for the length of the symposium in a reclaimed pedestrian space encountered by both the participants and the general public during the course of the day-long event. Ideally, once discovered, the PIPP encouraged pedestrians to return during the seminar day to see whether the environmental patches had changed, and to alter their standard route to include the PIPP installation or to avoid it; either way, it encouraged an active response to the pathways normally travelled, or newly discovered, each day.
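The abstract does not document the individual Isadora patches, but the interaction loop it describes, a live camera feed manipulated and projected back onto the architectural surface in real time with nothing recorded, can be sketched briefly. The OpenCV fragment below is a hypothetical stand-in for one such patch, using simple frame differencing to echo a passer-by's movement; it is not the actual PIPP patch set.

import cv2

# Hypothetical stand-in for a single PIPP-style video manipulation patch:
# grab live camera frames, keep only the motion (frame differencing), and
# send the result to a full-screen window driving the projector.  Nothing
# is written to disk, matching the "non-recording" constraint.

cap = cv2.VideoCapture(0)                 # camera watching the walkway
cv2.namedWindow("projection", cv2.WINDOW_NORMAL)
cv2.setWindowProperty("projection", cv2.WND_PROP_FULLSCREEN,
                      cv2.WINDOW_FULLSCREEN)

ok, previous = cap.read()
while ok:
    ok, frame = cap.read()
    if not ok:
        break
    # Movement shows up as the difference between successive frames.
    motion = cv2.absdiff(frame, previous)
    motion = cv2.GaussianBlur(motion, (9, 9), 0)
    # Project an inverted, amplified trace of the movement.
    output = 255 - cv2.convertScaleAbs(motion, alpha=3.0)
    cv2.imshow("projection", output)
    previous = frame
    if cv2.waitKey(1) & 0xFF == 27:       # Esc stops the installation
        break

cap.release()
cv2.destroyAllWindows()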
Abstract:
A series of short performance experiments demonstrating the creative potential of motion capture technology as a tool within performance for exploring audience behaviour and interaction. The examples highlight possibilities for future work and give basic demonstrations of what is technically possible for the stage. Please note that people attending this performance may be videoed and motion-captured for research and/or publication purposes. This work has been funded by the EPSRC C4DM Platform Grant.
Abstract:
How does a digitally mediated environment work towards the ongoing support of the Hip Hop landscape present in Jonzi D Productions' UK National Tour of "Markus the Sadist"?
Abstract:
This research explores a new approach to Beckett's Not I, a work in which the spectator is asked to focus primarily on a human mouth suspended in space eight feet off the ground. This digital reconstitution of the x, y and z axes of Beckett's work has been retitled "?ot I". In "?ot I", the prescribed x, y and z axes of the original have been re-spatialised environmentally, physically and aurally to create an invigorated version of the text. In this work it is primarily the reconstitution of spatial dynamics and time that is explored. An adaptation of, and series of responses to, Samuel Beckett's "Not I", first performed at Melbourne University and the Deakin Motion.Lab, Deakin University, in 2007.
Abstract:
Presented as part of the Sampled Festival at Sadler's Wells, UK, in January 2009, PIPP #2 continues the exploration of the first installation (PIPP #1, Leeds) and asks audiences to connect with an interactive work presented in the foyer of a major dance festival. Literally choreographing their own dances on the walls of the venue, pedestrians re-connect with the architectural surrounds, generating unique memories of self.
Abstract:
The Silk Road Project was a practice-based research project investigating the potential of motion capture technology to inform perceptions of embodiment in dance performance. The project created a multi-disciplinary collaborative performance event using dance performance and real-time motion capture at Deakin University’s Deakin Motion Lab. Performances at Deakin University, December 2007.
Abstract:
High-quality motion blur is an increasingly important effect in interactive rendering. With the constant increase in asset quality and scene fidelity comes a similar desire for more detailed and realistic lens effects. However, even in the context of offline rendering, motion blur is often approximated with a post-process. Post-process motion blur algorithms have made great strides in visual quality, generating plausible results while retaining an interactive level of performance. Nevertheless, artefacts persist in the presence of, for example, overlapping motions, motion patterns at very large or very fine scales, and combined linear and rotational motion. Moreover, motions of large amplitude tend to cause obvious artefacts at object or image borders. This thesis presents a technique that resolves these artefacts with more robust sampling and a filtering scheme that samples along two directions, dynamically and automatically selected to produce the most accurate image possible. These modifications incur an overall minor performance cost compared with existing implementations: we can generate plausible, temporally coherent motion blur for several complex animation sequences in under 2 ms at a resolution of 1280 x 720. In addition, our filter is designed to integrate easily with post-process antialiasing filters.
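The two-direction sampling idea can be sketched, very roughly, as a per-pixel gather: average colour samples taken along the pixel's own velocity and along the dominant velocity of its neighbourhood. The NumPy fragment below is a simplified, hypothetical illustration of that structure only; the technique's actual sample weighting, tile-based dominant-velocity search and depth-aware blending are not reproduced here.

import numpy as np

# Simplified, hypothetical sketch of a two-direction motion-blur gather:
# each output pixel averages colour samples taken along (a) its own
# velocity vector and (b) the dominant velocity of its neighbourhood.
# Real implementations run on the GPU with depth/velocity-aware weights.

def dominant_velocity(velocity, y, x, radius=8):
    """Pick the longest velocity in a small neighbourhood (per-tile max)."""
    h, w, _ = velocity.shape
    y0, y1 = max(0, y - radius), min(h, y + radius + 1)
    x0, x1 = max(0, x - radius), min(w, x + radius + 1)
    tile = velocity[y0:y1, x0:x1].reshape(-1, 2)
    return tile[np.argmax(np.linalg.norm(tile, axis=1))]

def motion_blur(colour, velocity, samples=8):
    """colour: (H, W, 3) floats; velocity: (H, W, 2) pixel offsets."""
    h, w, _ = colour.shape
    out = np.zeros_like(colour)
    for y in range(h):
        for x in range(w):
            directions = (velocity[y, x], dominant_velocity(velocity, y, x))
            accum, count = np.zeros(3), 0
            for d in directions:
                for t in np.linspace(-0.5, 0.5, samples):
                    sy = int(round(y + t * d[1]))
                    sx = int(round(x + t * d[0]))
                    if 0 <= sy < h and 0 <= sx < w:
                        accum += colour[sy, sx]
                        count += 1
            out[y, x] = accum / max(count, 1)
    return out

# Tiny example: a bright dot moving horizontally by 6 pixels per frame.
colour = np.zeros((16, 16, 3)); colour[8, 8] = 1.0
velocity = np.zeros((16, 16, 2)); velocity[..., 0] = 6.0
blurred = motion_blur(colour, velocity)
print(blurred[8, 5:12, 0].round(3))   # the dot is smeared along x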