3 results for Music and movement

at Universidad Politécnica de Madrid


Relevance:

90.00%

Abstract:

Multi-camera 3D tracking systems with overlapping cameras are a powerful means for scene analysis, as they potentially offer greater robustness than monocular systems and provide useful 3D information about object location and movement. However, their performance relies on accurately calibrated camera networks, which is not a realistic assumption in real surveillance environments. Here, we introduce a multi-camera system that tracks the 3D position of a varying number of objects while simultaneously refining the calibration of the network of overlapping cameras. To this end, we propose a Bayesian framework that combines Particle Filtering for tracking with recursive Bayesian estimation methods by means of adapted transdimensional MCMC sampling. Additionally, the system has been designed to work on simple motion detection masks, making it suitable for camera networks with low transmission capabilities. Tests show that our approach performs successfully even when starting from clearly inaccurate camera calibrations, which would ruin conventional approaches.
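As an illustration of the tracking component only, the following is a minimal plain-Java sketch of a particle filter estimating a single 2D position from noisy observations. The particle count, motion model, and noise levels are illustrative assumptions; the paper's joint calibration refinement and transdimensional MCMC sampling are not reproduced here.

import java.util.Random;

// Minimal particle-filter sketch: one object, 2D position, Gaussian
// process and measurement noise. All parameters are illustrative.
public class ParticleFilter2D {
    static final int N = 500;              // number of particles (assumed)
    static final double MOTION_STD = 0.5;  // process noise std (assumed)
    static final double OBS_STD = 1.0;     // measurement noise std (assumed)
    static final Random rng = new Random(42);

    public static void main(String[] args) {
        double[][] particles = new double[N][2];
        double[] weights = new double[N];
        double[] truth = {0.0, 0.0};

        for (int t = 0; t < 20; t++) {
            // Simulated object motion and a noisy observation of it.
            truth[0] += 1.0; truth[1] += 0.5;
            double[] z = {truth[0] + OBS_STD * rng.nextGaussian(),
                          truth[1] + OBS_STD * rng.nextGaussian()};

            // Predict: propagate each particle through the motion model.
            for (double[] p : particles) {
                p[0] += 1.0 + MOTION_STD * rng.nextGaussian();
                p[1] += 0.5 + MOTION_STD * rng.nextGaussian();
            }
            // Update: weight each particle by its observation likelihood.
            double sum = 0.0;
            for (int i = 0; i < N; i++) {
                double dx = particles[i][0] - z[0], dy = particles[i][1] - z[1];
                weights[i] = Math.exp(-(dx * dx + dy * dy) / (2 * OBS_STD * OBS_STD));
                sum += weights[i];
            }
            // Report the weighted-mean estimate, then resample.
            double ex = 0, ey = 0;
            for (int i = 0; i < N; i++) {
                ex += weights[i] / sum * particles[i][0];
                ey += weights[i] / sum * particles[i][1];
            }
            particles = resample(particles, weights, sum);
            System.out.printf("t=%d estimate=(%.2f, %.2f) truth=(%.2f, %.2f)%n",
                              t, ex, ey, truth[0], truth[1]);
        }
    }

    // Systematic resampling: draw N particles proportionally to weight.
    static double[][] resample(double[][] p, double[] w, double sum) {
        double[][] out = new double[N][2];
        double step = sum / N, u = rng.nextDouble() * step, c = w[0];
        int i = 0;
        for (int j = 0; j < N; j++) {
            while (u > c && i < N - 1) { i++; c += w[i]; }
            out[j] = new double[]{p[i][0], p[i][1]};
            u += step;
        }
        return out;
    }
}

In the paper's setting, the observation model would be driven by motion detection masks from several cameras, and the state space would additionally carry the calibration parameters being refined.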

Relevance:

90.00%

Abstract:

This project develops a MIDI interface, based on digital image processing techniques, capable of controlling several parameters of an audio application through gestural information: the movement of the hands. The image is captured by a commercial Kinect camera and the data it produces are processed in real time. The goal is to convert the position of several control points on the body into MIDI musical control information. The interface has been developed in the Processing programming language and environment, which is Java-based, freely distributed, and easy to use. The audio software selected is Ableton Live, version 8.2.2, chosen because it is useful both for music composition and for live performance, the latter being the interface's main intended use. The project is divided into two main blocks: first, the graphic design of the controller, and second, the management of the musical information. The first block justifies the design of the controller, which consists of virtual buttons, and explains how it works and, briefly, the function of each button; the latter topic is covered in depth in Annex II: User Manual. The second block explains the path the MIDI information takes from the gesture processor to the music synthesizer. This path begins in Processing, which sends the messages that are later interpreted by the selected sequencer, Ableton Live. After the detailed account of the project's development, the author's conclusions are presented, including the pros and cons to take into account in order to get the most out of the controller, together with possible future lines of work. A budget is also provided, broken down into material and personnel costs.
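To make the control path concrete, here is a minimal plain-Java sketch of the core mapping the project describes: a tracked hand coordinate converted into a MIDI Control Change message. The actual interface runs in Processing with Kinect input; the standard javax.sound.midi API, the placeholder hand position, and the choice of controller 1 on channel 0 are assumptions for illustration.

import javax.sound.midi.MidiSystem;
import javax.sound.midi.Receiver;
import javax.sound.midi.ShortMessage;

// Sketch: map a normalized hand height to a MIDI Control Change value.
public class HandToMidi {
    public static void main(String[] args) throws Exception {
        Receiver out = MidiSystem.getReceiver(); // default system MIDI output

        double handY = 0.75; // normalized hand height in [0, 1] (placeholder)
        int ccValue = (int) Math.round(handY * 127); // MIDI CC range is 0-127

        ShortMessage msg = new ShortMessage();
        // Control Change on channel 0, controller 1 (illustrative choices).
        msg.setMessage(ShortMessage.CONTROL_CHANGE, 0, 1, ccValue);
        out.send(msg, -1); // -1: deliver immediately, no timestamp

        out.close();
    }
}

In the described setup, Ableton Live would receive such messages through a MIDI port and map the controller number to a parameter of the audio software.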

Relevance:

90.00%

Abstract:

This article presents the first musculoskeletal model and simulation of upper brachial plexus injury. From this model it is possible to analyse forces and movement ranges in order to develop a robotic exoskeleton that improves rehabilitation. The software currently available for musculoskeletal modeling is varied, and most packages have advanced features for the proper analysis and study of motion simulations. While the more powerful computer packages are usually expensive, there are free and open-source alternatives that offer different tools to perform animations and simulations and to obtain forces and moments of inertia. Among them, Musculoskeletal Modeling Software was selected to construct a model of the upper limb with 7 degrees of freedom and 10 muscles. These muscles are important for two of the movements simulated in this article, which are part of the post-surgery rehabilitation protocol. The movement animations were produced using an inertial measurement unit to capture real data from movements performed by a human subject. We also simulated the forces produced in elbow flexion-extension and arm abduction-adduction for a healthy subject and for a subject with an upper brachial plexus injury in a postoperative state, in order to compare the force each is capable of producing.
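As a rough illustration of the kind of quantity such simulations produce, the sketch below computes the static muscle force needed to hold the forearm at a given flexion angle. All anthropometric values are assumed placeholders, not data from the article, and the muscle moment arm is simplified to a constant.

// Back-of-the-envelope static model of elbow flexion: the muscle force
// required to hold the forearm against gravity at a given angle.
// Mass, center-of-mass distance, and moment arm are assumed values.
public class ElbowStaticForce {
    public static void main(String[] args) {
        final double g = 9.81;       // gravitational acceleration, m/s^2
        final double mass = 1.5;     // forearm + hand mass, kg (assumed)
        final double dCom = 0.15;    // elbow-to-center-of-mass distance, m (assumed)
        final double rMuscle = 0.04; // biceps moment arm about the elbow, m (assumed)

        for (int deg = 0; deg <= 90; deg += 15) {
            double theta = Math.toRadians(deg);
            // Gravitational torque about the elbow shrinks as the forearm rises.
            double torque = mass * g * dCom * Math.cos(theta);
            // Static equilibrium: muscle torque must balance gravity.
            double muscleForce = torque / rMuscle;
            System.out.printf("forearm at %2d deg: muscle force ~ %.1f N%n",
                              deg, muscleForce);
        }
    }
}

A full musculoskeletal simulation of the kind described in the article adds dynamics, multiple muscles, and degree-of-freedom coupling on top of this basic equilibrium relation.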