4 results for nonprofessional audience
in Repositório Digital da UNIVERSIDADE DA MADEIRA - Portugal
Abstract:
This thesis argues for the possibility of supporting deictic gestures through handheld multi-touch devices in remote presentation scenarios. In [1], Clark distinguishes the indicative techniques of placing-for and directing-to, where placing-for refers to placing a referent into the addressee’s attention, and directing-to refers to directing the addressee’s attention towards a referent. Keynote, PowerPoint, FuzeMeeting and others support placing-for efficiently with slide transitions and animations, but offer little to no support for directing-to. The traditional “pointing feature” present in some presentation tools comes as a virtual laser pointer or mouse cursor. Studies [12, 13] have shown that the mouse cursor and laser pointer offer very little informational expressiveness and do not do justice to human communicative gestures. In this project, a prototype application was implemented for the iPad in order to explore, develop, and test the concept of pointing in remote presentations. The prototype supports visualizing and navigating the slides as well as “pointing” and zooming. To further investigate the problem and possible solutions, a theoretical framework was designed representing the relationships between the presenter’s intention and gesture and the resulting visual effect (cursor), which enables the audience members to interpret the meaning of the effect and the presenter’s intention. Two studies were performed to investigate people’s appreciation of different ways of presenting remotely: an initial qualitative study performed at The Hague, followed by an online quantitative user experiment. The results indicate that subjects found pointing helpful for understanding and concentration, while the detached video feed of the presenter was considered distracting. The positive qualities of having the video feed were the emotion and social presence that it adds to the presentations. For a number of subjects, pointing displayed some of the same social and personal qualities [2] that video affords, although less intensely. The combination of pointing and video proved to be the most successful, with 10 out of 19 subjects scoring it the highest, while pointing alone came a close second with 8 out of 19. Video was the least preferred, with only one subject favouring it. We suggest that the research performed here could provide a basis for future work and possibly be applied in a variety of distributed collaborative settings.
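Purely as an illustration of the kind of pointing mechanism this abstract describes (the thesis prototype itself was an iPad application whose code is not reproduced here), the following minimal sketch shows how a presenter's touch might be turned into a resolution-independent cursor event for remote viewers; the class, field names and JSON payload are hypothetical assumptions made for the example.

# Illustrative sketch only: maps a presenter's touch point to a normalized
# slide coordinate and packages it as a cursor event for remote viewers.
# All names here are hypothetical, not the thesis prototype's API.
from dataclasses import dataclass, asdict
import json


@dataclass
class CursorEvent:
    slide_index: int   # which slide the presenter is pointing at
    x: float           # horizontal position, normalized to 0..1
    y: float           # vertical position, normalized to 0..1


def touch_to_cursor(slide_index, touch_x, touch_y, view_width, view_height):
    """Convert raw touch coordinates into a resolution-independent cursor event."""
    return CursorEvent(
        slide_index=slide_index,
        x=max(0.0, min(1.0, touch_x / view_width)),
        y=max(0.0, min(1.0, touch_y / view_height)),
    )


# Example: a touch at (512, 384) on a 1024x768 presenter view maps to the
# centre of the slide; the JSON payload could be broadcast to each audience
# client, which would draw the cursor overlay at (x * slide_width, y * slide_height).
event = touch_to_cursor(slide_index=3, touch_x=512, touch_y=384,
                        view_width=1024, view_height=768)
print(json.dumps(asdict(event)))  # {"slide_index": 3, "x": 0.5, "y": 0.5}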
Abstract:
This thesis describes the whole process of developing music visualizations, from implementation through realization to evaluation. The main goal is to understand how the audience’s experience of a live performance can be enhanced through music visualization. Music visualization can give a better understanding of the feelings in the music, building an intense atmosphere during the live music performance and enhancing the connection between the live music and the audience through visuals. These visuals have to be related to the live music; furthermore, they have to respond quickly to changes in the music and introduce novelty into the visuals. The mapping between music and visuals is the focus of this project, with the aim of improving the relationship between the live performance and the spectators. The implementation of music visualization is based on the translation of music into graphic visualizations, so at the beginning the project built on existing work. Later on, it was decided to introduce new ways of conveying music through visuals. Several attempts were made to discover the most effective mapping between music and visualization, so that people can fully connect with the performance. Throughout this project, those attempts resulted in several music visualizations created for four live music performances; afterwards, an online survey was produced to evaluate those live performances with music visualization. In the end, conclusions are presented based on the results of the online survey, explaining which music elements should be depicted in the visuals and how those visuals should respond to the selected music elements.
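To make the idea of a music-to-visuals mapping concrete, here is a minimal, purely illustrative sketch (not the mapping developed in the thesis): for each short frame of audio, loudness drives the size of a shape and a rough measure of spectral brightness drives its colour, so the visuals respond quickly to changes in the music. The frame length, scaling constants and returned parameter names are all assumptions made for the example.

# Illustrative sketch only: one possible music-to-visuals mapping of the kind
# discussed above (not the mapping used in the thesis). Loudness of each audio
# frame drives the size of a shape; spectral brightness drives its hue.
import numpy as np


def frame_to_visual(frame: np.ndarray, sample_rate: int = 44100) -> dict:
    """Map one audio frame (mono samples in -1..1) to simple visual parameters."""
    rms = float(np.sqrt(np.mean(frame ** 2)))              # loudness of the frame
    spectrum = np.abs(np.fft.rfft(frame))
    freqs = np.fft.rfftfreq(len(frame), d=1.0 / sample_rate)
    centroid = float(np.sum(freqs * spectrum) / (np.sum(spectrum) + 1e-12))
    return {
        "radius": 50 + 400 * rms,                          # louder -> bigger shape
        "hue": min(centroid / 5000.0, 1.0),                # brighter timbre -> warmer hue
    }


# Example: a quiet low-frequency frame versus a loud higher-frequency frame.
t = np.linspace(0, 1024 / 44100, 1024, endpoint=False)
quiet_low = 0.1 * np.sin(2 * np.pi * 110 * t)
loud_high = 0.8 * np.sin(2 * np.pi * 1760 * t)
print(frame_to_visual(quiet_low))   # small shape, cool hue
print(frame_to_visual(loud_high))   # large shape, warmer hue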
Abstract:
This report tells a story that started as an idea that came to us to fight the feelings commonly known as stress and anxiety. Before turning the idea into a solution, we first needed to understand the underlying feelings and their effects on our well-being. Throughout the course of our lives, we experience states of weakness and fear. These feelings can arise, for instance, while we are in an emergency room. Needless to say, the effects are even greater on children, who are unfamiliar with such environments. We ran through a series of scenarios to find the most suitable solution; among them, Dr. Baldwin’s study of interaction with positive expressions proved to be a valuable resource. It was shortened because of its length and to make it suitable for our audience. A game was then created in order to reduce or even eliminate the stress and anxiety of children. Since the game was initially released, some modifications have been made, but the original idea - interaction with positive expressions - remained. When the time came, we asked children to play one of the two versions of the game while waiting in the emergency room. This created not only a diversion for them but also a learning experience, as the game displayed some hospital equipment. The difference between the two versions is that one provides expressions, while the other does not. After all our hard work, we felt rewarded because the project proved its worth, and we could see that in the expressions on the children’s faces while they played. Most importantly, their measured anxiety levels were significantly reduced during that short period of time.
Abstract:
Location-aware, content-based experiences have a substantial tradition in HCI: several projects over the last two decades have explored the association of digital media with specific locations or objects. However, a large portion of the literature pays little attention to the creative side of designing the experience and to the iterative process of user evaluation. In this thesis we present two iterations in the design and evaluation of a location-based story delivery system (LBSDS), inspired by local folklore and oral storytelling in Madeira. We started by testing an already existing location-based story platform, PlaceWear, with short multimedia clips that recounted local traditions and folktales; we called this experience iLand. An initial evaluation of iLand was conducted: we shadowed users during the experience and they then responded to a questionnaire. By analyzing the evaluation results we uncovered several issues that informed the redesign of the system itself as well as part of the story content. The outcome of this redesign was the 7Stories experience. In the new experience we integrated visual markers into the interface and framed the fragmented story content through the literary technique of the narrator. This was done with the aim of improving the connection of the audience to the physical context where the experience is delivered. The 7Stories experience was evaluated following a methodology similar to the iLand evaluation, but the user experience turned out to be considerably different; because the setting was the same in both versions and most of the content remained constant across them, we were able to assess the specific effect of the new design and discuss its strengths and shortcomings. Although we did not run a formal and strict comparative test between the two evaluations, it is evident from the collected data how the specific design changes to our LBSDS influenced the user experience.