4 results for Chu gokugo Bunpo.

in Repositório Digital da UNIVERSIDADE DA MADEIRA - Portugal


Relevance:

10.00%

Publisher:

Abstract:

Image stitching is the process of joining several images to obtain a wider view of a scene. It is used, for example, in tourism to give the viewer the sensation of being in another place. I present an inexpensive solution for automatic real-time video and image stitching using two web cameras as the video/image sources. The proposed solution relies on several markers placed in the scene as reference points for the stitching algorithm. The implemented algorithm is divided into four main steps: marker detection, camera pose determination (relative to the markers), video/image scaling and 3D transformation, and image translation. Wii Remote controllers support several steps of the process: their built-in IR cameras provide clean marker detection, which simplifies camera pose determination. The only restriction of the algorithm is that the markers have to be in the field of view when capturing the scene. Several tests were made to evaluate the final algorithm. The algorithm is able to perform video stitching at a frame rate between 8 and 13 fps. The joining of the two videos/images is good, with minor misalignments for objects at the same depth as the markers; misalignments in the background and foreground are larger. The capture process is simple enough that anyone can perform a stitching after a very short explanation. Although real-time video stitching can be achieved with this affordable approach, there are a few shortcomings in the current version. For example, contrast inconsistency along the stitching line could be reduced by applying a color correction algorithm to each source video. In addition, the misalignments in stitched images due to camera lens distortion could be eased by an optical correction algorithm. The work was developed in Apple's Quartz Composer, a visual programming environment; a library of extended functions was developed using Apple's Xcode tools.
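The pipeline described in this abstract (detect markers, recover the camera pose from them, transform, then translate and compose) corresponds to standard marker-based stitching. The sketch below is a minimal Python/OpenCV illustration of that idea, not the thesis's Quartz Composer implementation; the marker pixel coordinates are assumed to be already available (in the thesis they come from the Wii Remote's IR camera), and the function name is hypothetical.

```python
import cv2
import numpy as np

def stitch_with_markers(img_left, img_right, markers_left, markers_right):
    """Join two frames using shared scene markers as reference points.

    markers_left / markers_right: Nx2 pixel coordinates (N >= 4) of the
    same physical markers as seen by each camera.
    """
    src = np.asarray(markers_right, dtype=np.float32)
    dst = np.asarray(markers_left, dtype=np.float32)

    # Estimate the projective transform mapping the right camera's view of
    # the markers onto the left camera's view; this stands in for the
    # camera-pose and 3D-transformation steps of the pipeline.
    H, _ = cv2.findHomography(src, dst)

    # Warp the right frame into the left frame's coordinate system on a
    # canvas wide enough to hold both views.
    h, w = img_left.shape[:2]
    canvas = cv2.warpPerspective(img_right, H, (w * 2, h))

    # Image translation step: paste the left frame onto the canvas.
    canvas[:h, :w] = img_left
    return canvas
```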

Relevance:

10.00%

Publisher:

Abstract:

This thesis describes the whole process of developing music visualization, from implementation through realization to evaluation. The main goal is to understand how the audience's experience of a live performance can be enhanced through music visualization. Music visualization makes it possible to convey the feelings in the music and to build an intense atmosphere in the live performance, strengthening the connection between the live music and the audience through visuals. These visuals have to be related to the live music; furthermore, they have to respond quickly to changes in the music and introduce novelty. The mapping between music and visuals is the focus of this project, with the aim of improving the relationship between the live performance and the spectators. The implementation of music visualization is based on translating music into graphic visualizations, so the project initially built on existing work; later, new ways of conveying music through visuals were introduced. Several attempts were made to discover the most effective mapping between music and visualization so that people can fully connect with the performance. Throughout this project, those attempts resulted in several music visualizations created for four live music performances, which were afterwards evaluated through an online survey. In the end, conclusions are presented based on the survey results, explaining which music elements should be depicted in the visuals and how those visuals should respond to the selected music elements.
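The abstract does not fix a particular mapping between music and visuals, so the sketch below is only an assumed illustration of what such a mapping can look like in code: it derives two visual parameters (shape size from loudness, hue from spectral brightness) from a single mono audio frame. The feature choices and the function name are hypothetical, not taken from the thesis.

```python
import numpy as np

def audio_to_visual_params(frame, sample_rate=44100):
    """Map one mono audio frame (float samples in [-1, 1]) to visual parameters."""
    # Loudness: RMS amplitude of the frame drives the size of a shape.
    rms = float(np.sqrt(np.mean(frame ** 2)))
    size = 10 + 200 * rms

    # Brightness of the sound: the spectral centroid drives the colour hue.
    spectrum = np.abs(np.fft.rfft(frame))
    freqs = np.fft.rfftfreq(len(frame), d=1.0 / sample_rate)
    centroid = float(np.sum(freqs * spectrum) / (np.sum(spectrum) + 1e-9))
    hue = min(centroid / 5000.0, 1.0)  # normalise roughly to [0, 1]

    return {"size": size, "hue": hue}
```

One way to introduce the novelty mentioned in the abstract would be to vary, over the course of the performance, which visual properties these features drive.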

Relevance:

10.00%

Publisher:

Abstract:

This report tells a story that started as an idea to fight the feelings commonly known as stress and anxiety. Before turning the idea into a solution, we first needed to understand these feelings and their effects on our well-being. Throughout the course of our lives, we experience states of weakness and fear. These feelings can arise, for instance, while we are in an emergency room; needless to say, the effect is even greater on children, who are unfamiliar with such environments. We ran through a series of scenarios to find the most suitable solution; among them, Dr. Baldwin's study of interaction with positive expressions proved to be a valuable resource. It was shortened because of its length and adapted to suit our audience. A game was then created in order to reduce or even eliminate children's stress and anxiety. Since the game was initially released, some modifications have been made, but the original idea, interaction with positive expressions, remained. When the time came, we asked children to play one of the two versions of the game while waiting in the emergency room. This not only created a diversion for them but also a learning experience, as it displayed some hospital equipment. The difference between the two versions is that one provides expressions while the other does not. After all our hard work, we felt rewarded because the project proved its worth, and we could see that in the expressions on the children's faces while they played. Most importantly, their measured anxiety levels were significantly reduced during that short period of time.

Relevance:

10.00%

Publisher:

Abstract:

Computer vision is a field that uses techniques to acquire, process, analyze and understand images from the real world in order to produce numeric or symbolic information in the form of decisions [1]. This project uses computer vision to build an app that analyzes a Madeira wine and characterizes it (identifies its variety) by its color: dry or sweet, young or old wines each have a specific color. The app compares color histograms of images taken of a test sample placed inside a special container designed for this purpose. Color analysis of a wine sample from an image captured by a smartphone can be difficult: many factors affect the captured image, such as lighting conditions and the background behind the sample container, which changes with the position from which the photo is taken (capturing against a white wall differs from capturing facing the floor, for example). Using new technologies such as 3D printing, it was possible to create a prototype that controls the effect of those external factors on the captured image. The results of this experiment are good indicators for future work. Although more tests are necessary, the first tests had a success rate of 80% to 90% of correct results. This report documents the development of this project and all the techniques and steps required to execute the tests.
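As a rough illustration of the histogram comparison mentioned above, the sketch below matches a sample photo against a set of reference photos by hue/saturation histogram correlation using OpenCV. The specific color space, bin counts, comparison metric, reference set, and function name are assumptions for the sketch, not details taken from the report.

```python
import cv2

def classify_wine_by_color(sample_img, reference_imgs):
    """Return the variety label whose reference image best matches the sample.

    reference_imgs: dict mapping a variety label to a reference photo taken
    in the same container under comparable conditions (hypothetical set).
    """
    def hue_sat_hist(img):
        # A hue/saturation histogram is less sensitive to brightness changes
        # than raw RGB values.
        hsv = cv2.cvtColor(img, cv2.COLOR_BGR2HSV)
        hist = cv2.calcHist([hsv], [0, 1], None, [50, 60], [0, 180, 0, 256])
        return cv2.normalize(hist, hist).flatten()

    sample_hist = hue_sat_hist(sample_img)
    scores = {
        label: cv2.compareHist(sample_hist, hue_sat_hist(ref), cv2.HISTCMP_CORREL)
        for label, ref in reference_imgs.items()
    }
    # Highest correlation wins.
    return max(scores, key=scores.get)
```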