985 results for Interactive Techniques
Abstract:
There are many interactive media systems, including computer games and media art works, in which it is desirable for music to vary in response to changes in the environment. In this paper we outline a range of algorithmic techniques that enable music to adapt to such changes, taking into account the need for the music to vary in its expressiveness or mood while remaining coherent and recognisable. We discuss the approaches at which we have arrived through experience with a range of adaptive music systems in recent years, and draw upon these experiences to inform discussion of relevant considerations and to illustrate the techniques and their effects.
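As a concrete illustration of this family of techniques (a hedged sketch only, not the authors' system): one common approach keeps a fixed motif for recognisability while mapping a continuous environmental "mood" parameter onto expressive dimensions such as tempo and mode. All names and constants below are illustrative assumptions.

```python
# Minimal sketch of one adaptive-music technique: preserve a recognisable
# motif while an environment-driven "mood" parameter varies tempo and mode.
# Hypothetical example; not code from the paper.

MOTIF = [0, 2, 4, 7, 4, 2]       # scale degrees of the recognisable motif
MAJOR = [0, 2, 4, 5, 7, 9, 11]   # semitone offsets from the tonic
MINOR = [0, 2, 3, 5, 7, 8, 10]

def render_motif(mood: float, tonic: int = 60) -> dict:
    """Map mood in [0, 1] (0 = dark/calm, 1 = bright/energetic) to music."""
    scale = MAJOR if mood >= 0.5 else MINOR
    tempo = 60 + 80 * mood       # 60-140 BPM, faster when more energetic
    pitches = [tonic + 12 * (d // len(scale)) + scale[d % len(scale)]
               for d in MOTIF]   # pitch contour is preserved, so the motif stays recognisable
    return {"tempo_bpm": round(tempo), "midi_pitches": pitches}

for mood in (0.1, 0.5, 0.9):
    print(mood, render_motif(mood))
```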
Abstract:
The first use of computing technologies and the development of land use models to support decision-making processes in urban planning date back to the mid-20th century. The main thrust of computing applications in urban planning is their contribution to sound decision-making and planning practices. During the last couple of decades, many new computing tools and technologies, including geospatial technologies, have been designed to enhance planners' capability in dealing with complex urban environments and planning for prosperous and healthy communities. This chapter, therefore, examines the role of information technologies, particularly internet-based geographic information systems, as decision support systems to aid public participatory planning. The chapter discusses challenges and opportunities for the use of internet-based mapping applications and tools in collaborative decision-making, and introduces a prototype internet-based geographic information system developed to integrate public-oriented interactive decision mechanisms into urban planning practice. This system, referred to as the 'Community-based Internet GIS' model, incorporates advanced information technologies, distance learning, sustainable urban development principles and community involvement techniques into decision-making processes, and was piloted in Shibuya, Tokyo, Japan.
Abstract:
Providing a positive user experience (UX) has become the key differentiator for products competing in mature markets. To ensure that a product will support enjoyable experiences for its users, assessment of UX should be conducted early in the design and development process. However, most UX frameworks and evaluation techniques focus on understanding and assessing a user's experience with functional prototypes or existing products. This situation delays UX assessment until the late phases of product development, which may result in costly design modifications and less desirable products. A qualitative study was conducted to investigate anticipated user experience (AUX) to address this issue. Twenty pairs of participants were asked to imagine an interactive product, draw their product concept, and anticipate their interactions and experiences with it. The data were analyzed to identify general characteristics of AUX. We found that while positive AUX was mostly related to an imagined/desired product, negative AUX was mainly associated with existing products. It was evident that the pragmatic quality of a product was fundamental and significantly influenced users' anticipated experiences. Furthermore, the hedonic quality of a product received more focus in positive than in negative AUX. The results also showed that context, user profile, experiential knowledge, and anticipated emotion could be reflected in AUX. The understanding of AUX will help product designers better foresee users' underlying needs and focus on the most important aspects of their positive experiences, which in turn helps designers ensure pleasurable UX from the start of the design process.
Abstract:
Concrete is commonly used as a primary construction material for tall buildings. Load-bearing components such as columns and walls in concrete buildings are subjected to instantaneous and long-term axial shortening caused by the time-dependent effects of shrinkage, creep and elastic deformation. Reinforcing steel content, variable concrete modulus, volume-to-surface-area ratio of the elements and environmental conditions govern axial shortening. The impact of differential axial shortening among columns and core shear walls escalates with increasing building height. Differential axial shortening of gravity-loaded elements in geometrically complex and irregular buildings results in permanent distortion and deflection of the structural frame, which has a significant impact on building envelopes, building services, secondary systems and the lifetime serviceability and performance of a building. Existing numerical methods commonly used in design to quantify axial shortening are mainly based on elastic analytical techniques and are therefore unable to capture the complexity of non-linear time-dependent effects. Ambient measurements of axial shortening using vibrating-wire, external mechanical and electronic strain gauges are available methods for verifying values pre-estimated at the design stage. Installing these gauges permanently, embedded in or on the surface of concrete components, for continuous measurement during and after construction with adequate protection is uneconomical, inconvenient and unreliable; such methods are therefore rarely, if ever, used in building construction practice. This research project has developed a rigorous numerical procedure that encompasses linear and non-linear time-dependent phenomena for prediction of axial shortening of reinforced concrete structural components at the design stage. This procedure takes into consideration (i) construction sequence, (ii) time-varying values of Young's modulus of reinforced concrete and (iii) creep and shrinkage models that account for variability resulting from environmental effects. The capabilities of the procedure are illustrated through examples. In order to update previous predictions of axial shortening during the construction and service stages of the building, this research has also developed a vibration-based procedure using ambient measurements. This procedure takes into consideration the changes in the vibration characteristics of the structure during and after construction. The application of this procedure is illustrated through numerical examples, which also highlight its features. The vibration-based procedure can also be used as a tool to assess the structural health and performance of key structural components in the building during construction and service life.
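The flavour of such a time-dependent prediction can be sketched as below: total strain is the elastic strain amplified by a creep coefficient, plus shrinkage, with the concrete modulus growing with age. The growth laws and constants are simplified placeholder forms in the spirit of standard creep and shrinkage models, not the thesis's procedure, which additionally handles construction sequence and reinforcement.

```python
import math

# Hedged sketch of axial shortening of a concrete column over time, combining
# elastic, creep and shrinkage strains. All functional forms and constants
# below are illustrative assumptions.

def elastic_modulus(t_days: float, E28: float = 30e9) -> float:
    # Concrete stiffens with age; simple asymptotic hardening (assumed form).
    return E28 * math.sqrt(t_days / (4.0 + 0.85 * t_days))

def creep_coefficient(t: float, t0: float, phi_inf: float = 2.0) -> float:
    # Creep grows from 0 toward phi_inf after loading at age t0 (assumed form).
    dt = max(t - t0, 0.0)
    return phi_inf * dt**0.6 / (10.0 + dt**0.6)

def shrinkage_strain(t: float, eps_inf: float = 600e-6) -> float:
    # Shrinkage approaches an ultimate strain eps_inf (assumed form).
    return eps_inf * t / (35.0 + t)

def axial_shortening(stress: float, height: float, t: float, t0: float = 28.0) -> float:
    """Shortening (m) of a column of given height (m) under sustained stress (Pa)."""
    eps_elastic = stress / elastic_modulus(t0)
    eps_total = eps_elastic * (1.0 + creep_coefficient(t, t0)) + shrinkage_strain(t)
    return eps_total * height

# Example: 3 m storey-height column under 10 MPa sustained stress, loaded at 28 days.
for t in (28, 100, 1000, 10000):
    print(f"t = {t:>5} days: shortening = {axial_shortening(10e6, 3.0, t) * 1000:.2f} mm")
```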
Abstract:
In the university education arena, it is becoming apparent that traditional methods of conducting classes are not the most effective ways to achieve desired learning outcomes. The traditional method involves the instructor verbalizing information for passive, note-taking students who are assumed to be empty receptacles waiting to be filled with knowledge. This method is limited in its effectiveness, as the flow of information is usually only in one direction. Furthermore, “it has been demonstrated that students in many cases can recite and apply formulas in numerical problems, but the actual meaning and understanding of the concept behind the formula is not acquired” (Crouch & Mazur). It is apparent that memorization is the main technique present in this approach. A more effective method of teaching involves increasing students' level of activity during, and hence their involvement in, the learning process. This technique stimulates self-learning and assists in keeping students' levels of concentration more uniform. In this work, I am therefore interested in studying the influence of a particular teaching and learning activity (TLA) on students' learning outcomes. I want to foster high-level understanding and critical thinking skills using active learning techniques (Silberman, 1996). The TLA in question aims to promote self-study by students and to expose them to a situation where their learning outcomes can be tested. The motivation behind this activity is based on studies which suggest that some sensory modalities are more effective than others. Using various instruments for data collection, and by means of a thorough analysis, I present evidence of the effectiveness of this action research project, which aims to improve my own teaching practices with the ultimate goal of enhancing students' learning.
Abstract:
This project investigates machine listening and improvisation in interactive music systems with the goal of improvising musically appropriate accompaniment to an audio stream in real time. The input audio may be from a live musical ensemble, or playback of a recording for use by a DJ. I present a collection of robust techniques for machine listening in the context of Western popular dance music genres, and strategies of improvisation to allow for intuitive and musically salient interaction in live performance. The findings are embodied in a computational agent – the Jambot – capable of real-time musical improvisation in an ensemble setting. Conceptually the agent's functionality is split into three domains: reception, analysis and generation. The project has resulted in novel techniques for addressing a range of issues in each of these domains. In the reception domain I present a novel suite of onset detection algorithms for real-time detection and classification of percussive onsets. This suite achieves reasonable discrimination between the kick, snare and hi-hat attacks of a standard drum-kit, with sufficiently low latency to allow perceptually simultaneous triggering of accompaniment notes. The onset detection algorithms are designed to operate in the context of complex polyphonic audio. In the analysis domain I present novel beat-tracking and metre-induction algorithms that operate in real time and are responsive to change in a live setting. I also present a novel analytic model of rhythm, based on musically salient features. This model informs the generation process, affording intuitive parametric control and allowing for the creation of a broad range of interesting rhythms. In the generation domain I present a novel improvisatory architecture drawing on theories of music perception, which provides a mechanism for the real-time generation of complementary accompaniment in an ensemble setting. All of these innovations have been combined into a computational agent – the Jambot – which is capable of producing improvised percussive musical accompaniment to an audio stream in real time. I situate the architectural philosophy of the Jambot within contemporary debate regarding the nature of cognition and artificial intelligence, and argue for an approach to algorithmic improvisation that privileges the minimisation of cognitive dissonance in human-computer interaction. This thesis contains extensive written discussion of the Jambot and its component algorithms, along with comparative analyses of aspects of its operation and aesthetic evaluations of its output. The accompanying CD contains the Jambot software, along with video documentation of experiments and performances conducted during the project.
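To illustrate the general shape of the reception domain (a textbook-style sketch, not the Jambot's actual suite), the following computes band-wise spectral flux and picks peaks, giving crude kick/snare/hi-hat discrimination by frequency band; the bands and threshold are assumptions.

```python
import numpy as np

# Sketch of a standard spectral-flux onset detector with crude band-wise
# classification (low band ~ kick, mid ~ snare, high ~ hi-hat). Illustrative
# of the general approach only.

def band_onsets(x: np.ndarray, sr: int, frame: int = 1024, hop: int = 256):
    bands = {"kick": (20, 120), "snare": (120, 600), "hihat": (5000, 10000)}
    window = np.hanning(frame)
    n_frames = 1 + (len(x) - frame) // hop
    spec = np.empty((n_frames, frame // 2 + 1))
    for i in range(n_frames):                       # short-time magnitude spectra
        spec[i] = np.abs(np.fft.rfft(window * x[i * hop: i * hop + frame]))
    freqs = np.fft.rfftfreq(frame, 1.0 / sr)
    onsets = {}
    for name, (lo, hi) in bands.items():
        sub = spec[:, (freqs >= lo) & (freqs < hi)]
        flux = np.maximum(np.diff(sub, axis=0), 0.0).sum(axis=1)  # half-wave rectified flux
        thresh = flux.mean() + 2.0 * flux.std()                   # simple fixed threshold
        hits = np.where((flux[1:-1] > thresh) &                   # local peaks above it
                        (flux[1:-1] > flux[:-2]) & (flux[1:-1] > flux[2:]))[0] + 1
        onsets[name] = hits * hop / sr                            # onset times in seconds
    return onsets

# Demo on a synthetic signal: four smoothed clicks per second.
sr = 44100
click = np.zeros(sr)
click[::sr // 4] = 1.0
print(band_onsets(np.convolve(click, np.hanning(64), "same"), sr))
```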
Abstract:
We advocate for the use of predictive techniques in interactive computer music systems. We suggest that the inclusion of prediction can assist in the design of proactive rather than reactive computational performance partners. We summarize the significant role prediction plays in human musical decisions, and the comparatively modest use of prediction in interactive music systems to date. After describing how we are working toward employing predictive processes in our own metacreation software, we reflect on future extensions to these approaches.
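A minimal illustration of the idea (ours, not a description of the authors' metacreation software): even a first-order Markov model over recent events lets a system act on a prediction of the human's next note, proactive rather than reactive.

```python
import random
from collections import Counter, defaultdict
from typing import Optional

# Minimal sketch of a predictive performance partner: a first-order Markov
# model over observed events supports scheduling a response *before* the
# human's next note arrives. Purely illustrative.

class NextEventPredictor:
    def __init__(self) -> None:
        self.transitions = defaultdict(Counter)   # event -> Counter of successors
        self.prev: Optional[str] = None

    def observe(self, event: str) -> None:
        if self.prev is not None:
            self.transitions[self.prev][event] += 1
        self.prev = event

    def predict(self) -> Optional[str]:
        counts = self.transitions.get(self.prev)
        if not counts:
            return None
        # Sample proportionally to observed transition counts.
        return random.choices(list(counts), weights=list(counts.values()))[0]

model = NextEventPredictor()
for ev in ["C", "E", "G", "C", "E", "G", "C"]:
    model.observe(ev)
print("predicted next event:", model.predict())   # likely "E"
```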
Abstract:
Custom designed for display on the Cube Installation situated in the new Science and Engineering Centre (SEC) at QUT, the ECOS project is a playful interface that uses real-time weather data to simulate how a five-star energy building operates in climates all over the world. In collaboration with the SEC building managers, the ECOS Project incorporates energy consumption and generation data of the building into an interactive simulation, which is both engaging to users and highly informative, and which invites play and reflection on the roles of green buildings. ECOS focuses on the principle that humans can have both a positive and negative impact on ecosystems, with both local and global consequences. The ECOS project draws on the practice of Eco-Visualisation, a term used to encapsulate the important merging of environmental data visualisation with the philosophy of sustainability. Holmes (2007) uses the term Eco-Visualisation (EV) to refer to data visualisations that ‘display the real time consumption statistics of key environmental resources for the goal of promoting ecological literacy’. EVs are commonly artifacts of interaction design, information design, interface design and industrial design, but are informed by various intellectual disciplines that share an interest in sustainability. As a result of surveying a number of projects, Pierce, Odom and Blevis (2008) outline strategies for designing and evaluating effective EVs, including ‘connecting behavior to material impacts of consumption, encouraging playful engagement and exploration with energy, raising public awareness and facilitating discussion, and stimulating critical reflection.’ Froehlich (2010) and his colleagues use the term ‘Eco-feedback technology’ to describe the same field. ‘Green IT’ is another variation, which Tomlinson (2010) describes as a ‘field at the juncture of two trends… the growing concern over environmental issues’ and ‘the use of digital tools and techniques for manipulating information.’ The ECOS Project team is guided by these principles but, more importantly, proposes an example of how these principles may be achieved. The ECOS Project presents a simplified interface to the very complex domain of thermodynamic and climate modelling. From a mathematical perspective, the simulation can be divided into two models, which interact and compete for balance – the comfort of ECOS’ virtual denizens and the ecological and environmental health of the virtual world. The comfort model is based on the study of psychrometrics, specifically as it relates to human comfort. This provides baseline micro-climatic values for what constitutes a comfortable working environment within the QUT SEC buildings. The difference between the ambient outside temperature (as determined by polling the Google Weather API for live weather data) and the internal thermostat of the building (as set by the user) allows us to estimate the energy required to either heat or cool the building. Once the energy requirements are ascertained, they are balanced against the ability of the building to produce enough power from green energy sources (solar, wind and gas) to cover its energy requirements. The relative amount of energy produced by wind and solar can be calculated by considering, in the case of solar for example, the size of the panel and the amount of solar radiation it receives at any given time, which in turn can be estimated from the temperature and conditions returned by the live weather API.
Some of these variables can be altered by the user, allowing them to attempt to optimise the health of the building. The variables that can be changed are the budgets allocated to green energy sources such as the solar panels and wind generator, and the air conditioning setting that controls the internal building temperature. These variables influence the energy input and output variables, which are modelled on real energy usage statistics drawn from the SEC data provided by the building managers.
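The energy balance described above can be sketched as follows. This is a hedged reconstruction with assumed coefficients and a stubbed weather feed, not the ECOS project's actual code or API usage.

```python
# Hedged sketch of the ECOS-style energy balance: demand driven by the gap
# between outside temperature and the user-set thermostat, supply from green
# sources scaled by current conditions. All coefficients and the weather stub
# are illustrative assumptions.

def fetch_weather():
    # Stand-in for polling a live weather API (ECOS polled Google's weather data).
    return {"temp_c": 31.0, "solar_irradiance_w_m2": 650.0, "wind_speed_m_s": 5.0}

def building_energy_balance(thermostat_c, panel_area_m2, turbine_rating_kw, weather):
    # Demand: proportional to |outside - setpoint|, with an assumed heat-transfer factor.
    kw_per_degree = 4.0
    demand_kw = kw_per_degree * abs(weather["temp_c"] - thermostat_c)

    # Solar supply: panel area x irradiance x assumed 18% conversion efficiency.
    solar_kw = panel_area_m2 * weather["solar_irradiance_w_m2"] * 0.18 / 1000.0

    # Wind supply: rated power scaled by the cube of wind speed, capped at rating.
    wind_kw = min(turbine_rating_kw,
                  turbine_rating_kw * (weather["wind_speed_m_s"] / 12.0) ** 3)

    return {"demand_kw": demand_kw,
            "supply_kw": solar_kw + wind_kw,
            "balance_kw": solar_kw + wind_kw - demand_kw}

print(building_energy_balance(thermostat_c=23.0, panel_area_m2=200.0,
                              turbine_rating_kw=10.0, weather=fetch_weather()))
```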
Abstract:
In this paper, an interactive planning and scheduling framework is proposed for optimising operations from pits to crushers in the ore mining industry. A series of theoretical and practical operations research techniques is investigated to improve the overall efficiency of mining systems, since mining managers need to tackle optimisation problems within different horizons and with different levels of detail. Under this framework, mine design planning, mine production sequencing and mine transportation scheduling models are integrated and interact within a whole optimisation system. The proposed integrated framework could be used by the mining industry to reduce equipment costs, improve production efficiency and maximise net present value.
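To make the scheduling layer concrete, here is a deliberately tiny pit-to-crusher transportation problem solved as a linear program with scipy; all data are hypothetical, and the formulation is far simpler than the integrated models the paper proposes.

```python
import numpy as np
from scipy.optimize import linprog

# Toy pit-to-crusher transportation problem: minimise haulage cost subject to
# pit capacities and crusher demands. Illustrative data only.

cost = np.array([[4.0, 6.0],     # $/tonne from pit 0 to crushers 0, 1
                 [5.0, 3.0]])    # $/tonne from pit 1
pit_capacity = [300.0, 400.0]    # tonnes available per shift
crusher_demand = [250.0, 350.0]  # tonnes required per shift

n_pits, n_crushers = cost.shape
c = cost.ravel()                 # decision variables: tonnage x[p, k], flattened

A_ub, b_ub = [], []              # pit capacity: sum_k x[p, k] <= capacity[p]
for p in range(n_pits):
    row = np.zeros(n_pits * n_crushers)
    row[p * n_crushers:(p + 1) * n_crushers] = 1.0
    A_ub.append(row)
    b_ub.append(pit_capacity[p])

A_eq, b_eq = [], []              # crusher demand: sum_p x[p, k] == demand[k]
for k in range(n_crushers):
    row = np.zeros(n_pits * n_crushers)
    row[k::n_crushers] = 1.0
    A_eq.append(row)
    b_eq.append(crusher_demand[k])

res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq, bounds=(0, None))
print("tonnage plan:\n", res.x.reshape(n_pits, n_crushers), "\ncost:", res.fun)
```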
Abstract:
This project constructed virtual plant leaf surfaces from digitised data sets for use in droplet spray models. Digitisation techniques for obtaining data sets for cotton, Chenopodium and wheat leaves are discussed, and novel algorithms for the reconstruction of the leaves of these three plant species are developed. The reconstructed leaf surfaces are incorporated into agricultural droplet spray models to investigate the effect of the nozzle and spray formulation combination on the proportion of spray retained by the plant. A numerical study of the post-impaction motion of large droplets that have formed on the leaf surface is also considered.
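As a generic stand-in for the reconstruction step (not the thesis's species-specific algorithms), the sketch below fits a smooth surface z = f(x, y) through digitised points with radial basis functions and queries height and slope where simulated droplets land; the synthetic data are assumptions.

```python
import numpy as np
from scipy.interpolate import RBFInterpolator

# Generic leaf-surface reconstruction sketch: fit a smooth surface through
# scattered digitised points, then evaluate it at droplet impact sites.
# Synthetic data; not the thesis's reconstruction algorithms.

rng = np.random.default_rng(0)
xy = rng.uniform(-1.0, 1.0, size=(200, 2))             # digitised planform points
z = 0.3 * xy[:, 0] ** 2 - 0.2 * xy[:, 1] ** 2          # synthetic leaf curvature

surface = RBFInterpolator(xy, z, smoothing=1e-6, kernel="thin_plate_spline")

# Droplet impact sites: surface height plus finite-difference slopes, the kind
# of local geometry a retention/run-off model would consume.
impacts = np.array([[0.1, 0.2], [-0.4, 0.5]])
heights = surface(impacts)
eps = 1e-4
dzdx = (surface(impacts + [eps, 0.0]) - heights) / eps
dzdy = (surface(impacts + [0.0, eps]) - heights) / eps
print(heights, dzdx, dzdy)
```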
Abstract:
As an animator and practice-based researcher with a background in games development, I am interested in technological change in the video game medium, with a focus on the tools and technologies that drive game character animation and interactive story. In particular, I am concerned with the issue of ‘user agency’, or the ability of the end user to affect story development—a key quality of the gaming experience and essential to the aesthetics of gaming, which is defined in large measure by its interactive elements. In this paper I consider the unique qualities of the video game as an artistic medium and the impact that these qualities have on the production of animated virtual character performances. I discuss the somewhat oppositional nature of animated character performances found in games from recent years, which range from inactive to active—in other words, low to high agency. Where procedural techniques (based on coded rules of movement) are used to model dynamic character performances, the user has the ability to interactively affect characters in real time within the larger sphere of the game. This game play creates a high degree of user agency. However, it lacks the aesthetic nuances of the more crafted sections of games: the short cut-scenes, or narrative interludes, where entire acted performances are mapped onto game characters (often via performance capture) and constructed into relatively cinematic representations. While visually spectacular, cut-scenes involve minimal interactivity, so user agency is low. Contemporary games typically float between these two distinct methods of animation, from a focus on user agency and dynamically responsive animation to a focus on animated character performance in sections where the user is a passive participant. We tend to think of the majority of action in games as taking place via playable figures: an avatar or central character that represents the player. However, there is another realm of characters that also partake in actions ranging from significant to incidental: non-playable characters, or NPCs, which populate action sequences where game play takes place as well as cut-scenes that unfold without much or any interaction on the part of the player. NPCs are the equivalent of supporting roles, bit characters, or extras in the world of cinema. Minor NPCs may simply be background characters or enemies to defeat, but many NPCs are crucial to the overall game story. It is my argument that, thus far, no game has successfully utilized the full potential of these characters to contribute toward the development of interactive, high-performance action. In particular, a type of NPC that I have identified as ‘pivotal’—those constituting the supporting cast of a video game—are essential to the telling of a game story, particularly in genres that focus on story and characters: adventure games, action games, and role-playing games. A game story can be defined as the entirety of the narrative, told through non-interactive cut-scenes as well as interactive sections of play, and the development of more complex stories in games clearly impacts the animation of NPCs. I argue that NPCs in games must be capable of acting with emotion throughout a game—in the cut-scenes, which are tightly controlled, but also in sections of game play, where player agency can potentially alter the story in real time. When the animated performance of NPCs and user agency are not continuous throughout the game, the implication is that game stories may be primarily told through short movies within games, making it more difficult to define video game animation as a distinct artistic medium.
Abstract:
Constraint programming has emerged as a successful paradigm for modelling combinatorial problems arising from practical situations. In many of those situations, we are not provided with an immutable set of constraints. Instead, a user will modify his requirements, in an interactive fashion, until he is satisfied with a solution. Examples of such applications include, amongst others, model-based diagnosis, expert systems, and product configurators. The system the user interacts with must be able to assist by showing the consequences of the requirements. Explanations are the ideal tool for providing this assistance. However, existing notions of explanations fail to provide sufficient information. We define new forms of explanations that aim to be more informative. Even though explanation generation is a very hard task, in the applications we consider we must provide a satisfactory level of interactivity and therefore cannot afford long computation times. We introduce the concept of representative sets of relaxations, a compact set of relaxations that shows the user at least one way to satisfy each of his requirements and at least one way to relax them, and present an algorithm that efficiently computes such sets. We introduce the concept of most soluble relaxations, which maximise the number of products they allow. We present algorithms to compute such relaxations in times compatible with interactivity, achieving this by making use of different types of compiled representations interchangeably. We propose to generalise the concept of prime implicates to constraint problems through the concept of domain consequences, and suggest generating them as a compilation strategy. This sets out a new approach to compilation and allows explanation-related queries to be addressed efficiently. We define ordered automata to compactly represent large sets of domain consequences, in a way orthogonal to existing compilation techniques that represent large sets of solutions.
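To fix intuition about relaxations (a brute-force sketch only; the dissertation's algorithms and compiled representations are far more efficient), the following enumerates the maximal satisfiable subsets of a toy set of requirements, each such subset being one way to keep some requirements while relaxing the conflicting remainder.

```python
from itertools import combinations

# Brute-force sketch of relaxations of an over-constrained problem: maximal
# subsets of the user's requirements that still admit a solution. Toy domain
# and constraints are illustrative assumptions.

DOMAIN = range(10)
constraints = {                 # user requirements over a single variable x
    "even": lambda x: x % 2 == 0,
    "gt5":  lambda x: x > 5,
    "lt4":  lambda x: x < 4,    # conflicts with gt5
    "div3": lambda x: x % 3 == 0,
}

def satisfiable(names):
    return any(all(constraints[n](x) for n in names) for x in DOMAIN)

def maximal_relaxations():
    sols = []
    # Enumerate subsets from largest to smallest, keeping only those that are
    # satisfiable and not contained in an already-found relaxation.
    for size in range(len(constraints), 0, -1):
        for subset in combinations(constraints, size):
            if satisfiable(subset) and not any(set(subset) < set(s) for s in sols):
                sols.append(subset)
    return sols

for r in maximal_relaxations():
    print("keep:", r, " relax:", set(constraints) - set(r))
```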
Abstract:
Fado was listed as UNESCO Intangible Cultural Heritage in 2011. This dissertation describes a theoretical model, as well as an automatic system, able to generate instrumental music based on the musics and vocal sounds typically associated with the practice of fado. A description of the phenomenon of fado, its musics and vocal sounds, is presented, based on ethnographic and historical sources and empirical data. The data include a digital corpus of musical transcriptions identified as fado, and statistical analyses via music information retrieval techniques. The second part consists of the formulation of a theory and the coding of a symbolic model, as a proof of concept, for the automatic generation of instrumental music based on that in the corpus.
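A toy sketch of that two-part pipeline (corpus statistics feeding a generative model) is given below; the corpus, the first-order pitch model and all names are illustrative assumptions, far simpler than the dissertation's symbolic model.

```python
import random
from collections import Counter, defaultdict

# Illustrative corpus-statistics-to-generation pipeline: count pitch bigrams
# in a (pretend) corpus of transcriptions, then sample new melodies from the
# resulting transition model.

corpus = [                          # stand-in transcriptions (MIDI pitches)
    [57, 60, 62, 64, 62, 60, 57],
    [57, 60, 64, 65, 64, 62, 60],
]

transitions = defaultdict(Counter)
for melody in corpus:
    for a, b in zip(melody, melody[1:]):
        transitions[a][b] += 1      # corpus statistic: pitch bigram counts

def generate(start=57, length=8):
    melody, pitch = [start], start
    for _ in range(length - 1):
        nxt = transitions.get(pitch)
        if not nxt:                 # dead end: restart from the starting pitch
            pitch = start
            continue
        pitch = random.choices(list(nxt), weights=list(nxt.values()))[0]
        melody.append(pitch)
    return melody

print(generate())
```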
Abstract:
Completed under a cotutelle (joint supervision) agreement with the Université de Grenoble.
Abstract:
Quantum information theory has developed at a breakneck pace over the last twenty years, with analogues and extensions of the source coding and noisy channel coding theorems for one-way communication. For interactive communication, a quantum analogue of communication complexity has been developed, in which quantum protocols can perform exponentially better than the best classical protocols for certain classical tasks. However, quantum information is much more sensitive to noise than classical information, so it is imperative to use quantum resources to their full potential. In this thesis, we study interactive quantum protocols from the perspective of information theory and investigate the analogues of source coding and noisy channel coding. The setting considered is that of communication complexity: Alice and Bob want to perform a bipartite quantum computation while minimising the amount of communication exchanged, without regard to the cost of local computation. Our results are separated into three distinct chapters, organised so that each can be read independently. Given the central role it plays in the context of interactive compression, one chapter is dedicated to the study of the task of quantum state redistribution. We prove lower bounds on the communication costs required in an interactive setting. We also prove bounds achievable with a single message, in a one-shot setting. In a subsequent chapter, we define a new notion of quantum information complexity. It characterises the amount of information, rather than communication, that Alice and Bob must exchange to compute a bipartite task. We prove many structural properties of this quantity, and give it an operational interpretation as the amortised quantum communication complexity. In the special case of classical inputs, we give another characterisation that quantifies the cost incurred by a quantum protocol that forgets classical information. Two applications are presented: the first general direct-sum result for quantum communication complexity beyond a single round, as well as a bound, optimal up to a polylogarithmic term, for the bounded-round quantum communication complexity of the disjointness function. In a final chapter, we initiate the study of the interactive quantum capacity of noisy channels. Since techniques for distributing entanglement are well studied, we focus on a model with perfect preshared entanglement and noisy classical communication. We show that in the harder setting of adversarial errors, we can tolerate a maximal error rate of one half minus epsilon, for arbitrarily small epsilon greater than zero, while maintaining a positive communication rate. It follows that random-noise channels with positive capacity for one-way transmission also have positive capacity for interactive quantum communication. We conclude with a discussion of our results and of future directions for this research programme on an interactive quantum information theory.
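Stated symbolically (a hedged restatement of the claim as given in the abstract; the notation R(ε) is ours, not the thesis's):

```latex
% Adversarial-noise result as described in the abstract: with perfect
% preshared entanglement and noisy classical communication, for any
% arbitrarily small epsilon > 0 there is a positive-rate scheme tolerating
% an adversarial error fraction up to one half minus epsilon.
\[
  \forall\, \varepsilon > 0 \;\; \exists\, R(\varepsilon) > 0 :
  \quad \text{error fraction} \le \tfrac{1}{2} - \varepsilon
  \;\Longrightarrow\; \text{reliable interactive simulation at rate } R(\varepsilon).
\]
```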