966 results for Haptic system
Abstract:
Instability in conventional haptic rendering destroys the perception of rigid objects in virtual environments. Inherent limitations in the conventional haptic loop restrict the maximum stiffness that can be rendered. In this paper we present a method to render virtual walls that are much stiffer than those achieved by conventional techniques. By removing the conventional digital haptic loop and replacing it with a part-continuous, part-discrete-time hybrid haptic loop, we were able to render stiffer walls. The control loop is implemented as a combinational logic circuit on a field-programmable gate array (FPGA). We compared the performance of the conventional haptic loop and our hybrid haptic loop on the same haptic device, and present a mathematical analysis showing the stability limit of our device. Our hybrid method removes the computation-intensive haptic loop from the CPU; this frees a significant amount of resources that can be used for other purposes such as graphical rendering and physics modeling. It is our hope that, in the future, similar designs will lead to a haptics processing unit (HPU).
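As background for the stiffness limit discussed above, the following is a minimal sketch of a conventional sampled virtual wall together with the classic sampled-data passivity check (b >= kT/2). The stiffness, damping and rate values are illustrative assumptions, not the paper's FPGA-based hybrid loop.

    # Minimal sketch of a conventional discrete-time virtual wall (illustrative
    # only; not the paper's FPGA hardware). Parameter values are assumptions.

    def wall_force(x, k):
        """Penalty force for a wall at x = 0: push back only while penetrating."""
        return -k * x if x < 0.0 else 0.0

    def is_passive(k, b, T):
        """Classic sampled virtual-wall condition b >= k*T/2 (Colgate-style bound)."""
        return b >= k * T / 2.0

    if __name__ == "__main__":
        T = 0.001                      # 1 kHz conventional servo period (s)
        b = 0.5                        # assumed intrinsic device damping (N*s/m)
        for k in (500.0, 2000.0):      # candidate wall stiffnesses (N/m)
            print(f"k = {k:6.0f} N/m  passive: {is_passive(k, b, T)}")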
Abstract:
When teaching students with visual impairments, educators generally rely on tactile tools to depict visual mathematical topics. Tactile media, such as embossed paper and simple manipulable materials, are typically used to convey graphical information. Although these tools are easy to use and relatively inexpensive, they are solely tactile and are not modifiable. Dynamic and interactive technologies such as pin matrices and haptic pens are also commercially available, but tend to be more expensive and less intuitive. This study aims to bridge the gap between easy-to-use tactile tools and dynamic, interactive technologies in order to facilitate the haptic learning of mathematical concepts. We developed a haptic assistive device using a Tanvas electrostatic touchscreen that provides the user with multimodal (haptic, auditory, and visual) output. Three methodological steps comprise this research: 1) a systematic literature review of the state of the art in the design and testing of tactile and haptic assistive devices, 2) a user-centered system design, and 3) testing of the system’s effectiveness via a usability study. The electrostatic touchscreen shows promise as an assistive device for displaying visual mathematical elements via the haptic modality.
Abstract:
This document is a summary of the Bachelor thesis titled “VHDL-Based System Design of a Cognitive Sensorimotor Loop (CSL) for Haptic Human-Machine Interaction (HMI)”, written by Pablo de Miguel Morales, an Electronics Engineering student at the Universidad Politécnica de Madrid (UPM, Madrid, Spain), during an Erasmus+ exchange programme at the Beuth Hochschule für Technik (BHT, Berlin, Germany). The tutor of this project is Dr. Prof. Hild. The project was developed in the Neurorobotics Research Laboratory (NRL) in close collaboration with Benjamin Panreck, a member of the NRL, and with another exchange student from the UPM, Pablo Gabriel Lezcano. For a full understanding of the work, a closer reading of the thesis is needed, along with viewing of the videos and the VHDL design. In the growing field of automation, a large amount of effort is devoted to improving, adapting and designing motor controllers for a wide variety of applications. In the specific field of robotics and other machinery designed to interact with humans or their environment, new needs and technological solutions keep emerging because the scenario is still relatively new and unexplored. The project consisted of three main parts: two VHDL-based systems and one short experiment on haptic perception. Both VHDL systems are based on the Cognitive Sensorimotor Loop (CSL), a control loop designed by the NRL and mainly developed by Dr. Prof. Hild. The main characteristic of the CSL is that it does not use any external sensor to measure the speed or position of the motor, but uses the motor itself as the sensor: the motor always generates a voltage proportional to its angular speed, so no calibration is needed. This method is energy efficient and simplifies control loops in complex systems. The first system, named CSL Stay In Touch (SIT), consists of a single DC motor controlled by an FPGA board (Zynq ZYBO 7000), whose aim is to keep contact, in both directions, with any external object that touches its Sensing Platform. Beyond this main behavior, three features (Search Mode, Inertia Mode and Return Mode) were designed to enhance the haptic interaction experience. Additionally, a VGA screen is driven by the FPGA board to monitor the whole system. This system has been fully developed, tested and improved, and its timing and power-consumption properties have been analyzed. The second system, named CSL Fingerlike Mechanism (FM), consists of a finger-like mechanism driven by two DC motors, each controlling one segment of the finger. Its behavior is similar to that of the first system but within a more complex structure. This system was optional, not part of the original objectives of the thesis, and could not be properly finished and tested due to lack of time. The haptic perception experiment was conducted to gain insight into the complexity of human haptic perception, with the aim of applying this knowledge to technological applications. The experiment tested the ability of subjects to recognize different objects and shapes while blindfolded and with their ears covered. Two groups were formed: one had full haptic perception, while the other had to explore the environment with a plastic piece attached to the finger, creating a haptic handicap.
The conclusion of the thesis was that a haptic system based only on a CSL is not enough to retrieve valuable information from the environment and that other sensors (temperature, pressure, etc.) are needed, but that a CSL-based system is very useful for controlling the force applied when interacting with haptically sensitive surfaces such as skin or touch screens.
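To make the CSL idea summarized above more concrete, here is a minimal sketch in which the DC motor doubles as its own speed sensor via back-EMF, so no external encoder is needed. All names, gains and constants are illustrative assumptions, not the thesis's VHDL design.

    # Minimal sketch of the CSL principle: estimate motor speed from back-EMF
    # and react against externally imposed motion ("Stay In Touch" behavior).
    # Constants are illustrative assumptions, not taken from the thesis.

    def estimate_speed(v_terminal, i_motor, resistance=2.0, k_e=0.01):
        """Back-EMF speed estimate: omega = (v_terminal - R*i) / k_e."""
        return (v_terminal - resistance * i_motor) / k_e

    def stay_in_touch_command(omega_est, gain=0.5):
        """Drive against any externally imposed motion so the sensing platform
        keeps contact with whatever is pushing it."""
        return -gain * omega_est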
Abstract:
In everyday situations, we frequently manipulate objects without looking at them. To perform movements toward a precise target with an object in hand, it is necessary to perceive the spatial properties of the object. Several studies have shown that subjects can discriminate between different object lengths without visual information and can adapt their movements to the new inertial characteristics produced when manipulating an object. In this study, we conducted two experiments to evaluate the ability of subjects to adapt their reaching movements to the perceived length and shape of hand-held objects on the sole basis of non-visual (haptic) sensations. In Experiment 1, ten subjects performed reaching movements toward 4 three-dimensional (3D) targets with an object in hand. Three objects of different lengths were used (pointers: 12.5, 17.5, 22.5 cm). No knowledge of the position of the hand or the object relative to the target was available during or after the movements toward the 3D targets. Thus, when compared with the spatial errors made during manual reaches without a pointer, the spatial error of each movement with a pointer reflects the accuracy of the estimate of pointer length. Our results indicate that subjects increased their spatial errors during reaching movements with an object compared with the no-pointer condition. Interestingly, however, they maintained the same level of accuracy across the three conditions with objects of different lengths, despite a difference of 10 cm between the shortest and longest objects. In Experiment 2, nine different subjects performed reaching movements toward the same targets, this time using two L-shaped objects (object no. 1: length 17.5 cm with a 12.5 cm rightward deviation; object no. 2: length 17.5 cm with a 17.5 cm rightward deviation). As in Experiment 1, subjects increased their spatial errors during reaching movements with the objects, and this increase was similar between the two L-shaped-object conditions. A striking observation of Experiment 2 is that direction errors did not increase significantly between the L-shaped-object conditions and the no-object control condition. This shows that participants accurately perceived the lateral deviation of the objects without ever having visual knowledge of the objects' configuration. The results suggest that the adaptation of reaching movements to object length and shape is mainly based on the integration of haptic sensations. To our knowledge, this study is the first to provide quantitative data on the accuracy with which the haptic system allows the perception of the length and shape of a hand-held object in order to perform a precise movement toward a target.
Abstract:
The goal of this research is to develop the prototype of a tactile sensing platform for anthropomorphic manipulation research. We investigate this problem through the fabrication and simple control of a planar 2-DOF robotic finger inspired by anatomic consistency, self-containment, and adaptability. The robot is equipped with a tactile sensor array based on optical transducer technology, whereby localized changes in light intensity within an illuminated foam substrate correspond to the distribution and magnitude of forces applied to the sensor surface plane. The integration of tactile perception is a key component in realizing robotic systems that organically interact with the world. Such natural behavior is characterized by compliant performance that can initiate internal force application, and respond to external force application, in a dynamic environment. However, most current manipulators that support some form of haptic feedback either derive only proprioceptive sensation or limit tactile sensors to the mechanical fingertips. These constraints are due to the technological challenges involved in high-resolution, multi-point tactile perception. In this work, however, we take the opposite approach, emphasizing the role of full-finger tactile feedback in the refinement of manual capabilities. To this end, we propose and implement a control framework for sensorimotor coordination analogous to infant-level grasping and fixturing reflexes. This thesis details the mechanisms used to achieve these sensory, actuation, and control objectives, along with the design philosophies and biological influences behind them. The results of behavioral experiments with a simple tactilely modulated control scheme are also described. The hope is to integrate the modular finger into an engineered analog of the human hand with a complete haptic system.
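As a rough illustration of the optical transduction principle described above (not the thesis's actual sensor pipeline), the sketch below maps per-cell light-intensity changes in the foam substrate to a force map using an assumed linear calibration gain, and locates the contact centroid on the sensor plane.

    import numpy as np

    # Illustrative sketch: darkening of each cell relative to an unloaded baseline
    # is converted to a force estimate with a hypothetical linear calibration.

    CAL_GAIN = 0.02  # N per intensity count, assumed calibration constant

    def force_map(baseline, frame, gain=CAL_GAIN):
        """Estimate a per-cell normal-force map from intensity change vs. baseline."""
        delta = baseline.astype(float) - frame.astype(float)   # darkening under load
        return np.clip(delta, 0.0, None) * gain

    def contact_centroid(forces):
        """Centroid of the applied-force distribution on the sensor surface plane."""
        total = forces.sum()
        if total == 0:
            return None
        rows, cols = np.indices(forces.shape)
        return (float((rows * forces).sum() / total),
                float((cols * forces).sum() / total))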
Abstract:
The maintenance of a given body orientation is achieved through the complex relation between sensory information and muscle activity. Therefore, the purpose of this study was to review the role of visual, somatosensory, vestibular and auditory information in the maintenance and control of posture. Method: a search for papers from the last 24 years was conducted in the PubMed and CAPES databases, using the following keywords: postural control, sensory information, vestibular system, visual system, somatosensory system, auditory system and haptic system. Results: the influence of each sensory system, and their integration, on the maintenance and control of posture was analyzed. Conclusion: the literature showed that there is redundancy in the information provided by the sensory channels; thus, the central nervous system selects the main source for postural control.
Abstract:
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)
Abstract:
Pós-graduação em Ciências da Motricidade - IBRC
Abstract:
In the U.K., dental students are required to train and practice on real human tissues at a very early stage of their courses. Currently, human tissues such as decayed teeth are mounted in a physical model resembling a human head. The problems with these models in teaching are: (1) every student operates on teeth that are always unique; (2) the process cannot be recorded for examination purposes; and (3) the same training is not repeatable. The aim of the PHATOM Project is to develop a dental training system using haptic technology. This paper documents the project background, specification, and the research and development of the first prototype system. It also discusses the research on visual display, haptic devices and haptic rendering, including stereo vision, motion parallax, volumetric modelling and surface remapping algorithms, as well as the analysis and design of the system. A new volumetric-to-surface model transformation algorithm is also introduced. The paper concludes with future work on system development and research.
Abstract:
This paper presents a novel design of a virtual dental training system (hapTEL) using haptic technology. The system allows dental students to learn and practice procedures such as dental drilling, caries removal and cavity preparation for tooth restoration. This paper focuses on the hardware design, development and evaluation aspects in relation to the dental training and educational requirements. Detailed discussions of how the system offers dental students a natural operating position are documented. An innovative design for measuring and connecting the dental tools to the haptic device is also shown. An evaluation of the impact on teaching and learning is also discussed.
Abstract:
We present a novel, simple and effective approach for tele-operation of aerial robotic vehicles with haptic feedback. Such feedback provides the remote pilot with an intuitive feel of the robot’s state and its perceived local environment, ensuring simple and safe operation in the cluttered 3D environments common in inspection and surveillance tasks. Our approach is based on energetic considerations and uses the concepts of network theory and port-Hamiltonian systems. We provide a general framework for addressing problems such as mapping the limited stroke of a ‘master’ joystick to the infinite stroke of a ‘slave’ vehicle, while preserving passivity of the closed-loop system in the face of potential time delays in the communication links and limited sensor data.
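A minimal sketch of the stroke-mapping idea mentioned above: the bounded joystick deflection is interpreted as a velocity command for the vehicle (so a finite master stroke covers an unbounded slave workspace), and obstacle proximity is rendered as a force on the stick. The gains and distances are illustrative assumptions; this is not the paper's port-Hamiltonian formulation.

    # Illustrative master/slave stroke mapping with haptic feedback.
    # Constants are assumptions, not values from the paper.

    V_MAX = 2.0       # max commanded vehicle speed (m/s), assumed
    K_FEEDBACK = 5.0  # feedback force gain, assumed

    def velocity_command(stick_deflection):
        """Map normalized stick deflection in [-1, 1] to a vehicle velocity setpoint."""
        d = max(-1.0, min(1.0, stick_deflection))
        return V_MAX * d

    def feedback_force(obstacle_distance, d_safe=1.5):
        """Push back on the stick as the vehicle gets closer than d_safe (m) to an obstacle."""
        if obstacle_distance >= d_safe:
            return 0.0
        return K_FEEDBACK * (1.0 / obstacle_distance - 1.0 / d_safe)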
Abstract:
This study examined the perceptual attunement of relatively skilled individuals to the physical properties of striking implements in the sport of cricket. We also sought to assess whether using bats with different physical properties influenced performance of a specific striking action: the front foot straight drive. Eleven skilled male cricketers (mean age = 16.6 ± 0.3 years) from an elite school cricket development programme consented to participate in the study. Whilst blindfolded, participants wielded six bats exhibiting different mass and moment of inertia (MOI) characteristics and were asked to identify their three most preferred bats for hitting a ball to a maximum distance by performing a front foot straight drive (a common shot in cricket). Next, participants attempted to hit balls projected from a ball machine using each of the six bat configurations, to enable kinematic analysis of front foot straight drive performance with each implement. Results revealed that, on first choice, the two bats with the smallest mass and MOI values (bats 1 and 2) were most preferred, chosen by almost two-thirds (63.7%) of the participants. Kinematic analysis of movement patterns revealed that bat velocity, step length and bat-ball contact position measures differed significantly between bats. The data revealed how skilled youth cricketers were attuned to the different bat characteristics and harnessed movement system degeneracy to perform this complex interceptive action.
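As a rough illustration of the inertial quantities the participants were judging (the values below are invented, not the study's actual bats), the point-mass sketch that follows computes a bat's mass, first moment and moment of inertia about the grip axis using I = sum(m_i * r_i**2).

    # Illustrative point-mass approximation of a bat's inertial properties about
    # an axis through the grip. Segment masses and distances are assumed values.

    segments = [            # (segment mass in kg, distance of segment centre from grip in m)
        (0.25, 0.10),       # handle
        (0.45, 0.45),       # upper blade
        (0.50, 0.70),       # lower blade / sweet-spot region
    ]

    mass = sum(m for m, _ in segments)
    first_moment = sum(m * r for m, r in segments)   # kg*m, contributes to "swing weight" feel
    moi = sum(m * r**2 for m, r in segments)         # kg*m^2, moment of inertia about the grip

    print(f"mass = {mass:.2f} kg, first moment = {first_moment:.3f} kg*m, MOI = {moi:.3f} kg*m^2")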
Abstract:
A simulation-based training system for surgical wound debridement was developed; it comprises a multimedia introduction, a surgical simulator (tutorial component), and an assessment component. The simulator includes two PCs, a haptic device, and a mirrored display. Debridement is performed on a virtual leg model with a shallow laceration wound superimposed. Trainees are instructed to remove debris with forceps, scrub with a brush, and rinse with saline solution to maintain sterility. Research and development issues currently under investigation include tissue deformation models using mass-spring systems and finite element methods; tissue cutting using a high-resolution volumetric mesh and dynamic topology; and accurate collision detection, cutting, and soft-body haptic rendering for two devices within the same haptic space.
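A minimal sketch of the mass-spring deformation approach mentioned above: point masses connected by damped springs, advanced with explicit Euler steps. The stiffness, damping and time-step constants are illustrative assumptions, not the simulator's actual tissue model.

    import numpy as np

    # Illustrative mass-spring tissue model: damped Hookean springs between point
    # masses, integrated with explicit Euler. Constants are assumed values.

    def spring_forces(pos, springs, vel=None, k=200.0, c=0.5):
        """Accumulate damped-spring forces for springs = [(i, j, rest_length), ...]."""
        f = np.zeros_like(pos)
        for i, j, rest in springs:
            d = pos[j] - pos[i]
            length = np.linalg.norm(d) + 1e-9
            direction = d / length
            fs = k * (length - rest) * direction                              # Hooke's law
            if vel is not None:
                fs += c * np.dot(vel[j] - vel[i], direction) * direction      # damping
            f[i] += fs
            f[j] -= fs
        return f

    def step(pos, vel, springs, masses, dt=1e-3, external=None):
        """One explicit Euler step of the mass-spring mesh (external = tool contact forces)."""
        f = spring_forces(pos, springs, vel=vel)
        if external is not None:
            f += external
        acc = f / masses[:, None]
        return pos + dt * vel, vel + dt * acc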
Abstract:
We present a real-time haptics-aided injection technique for biological cells using miniature compliant mechanisms. Our system consists of a haptic robot operated by a human hand, an XYZ stage for micro-positioning, a camera for image capture, and a polydimethylsiloxane (PDMS) miniature compliant device that serves the dual purpose of injecting tool and force sensor. In contrast to existing haptics-based micromanipulation techniques, where an external force sensor is used, we use visually captured displacements of the compliant mechanism to compute the applied and reaction forces. The human hand feels the magnified manipulation force through the haptic device in real time, while the motion of the human hand is replicated on the mechanism side. Images are captured with a camera at 30 frames per second to extract the displacement data, which is used to compute the forces at 30 Hz. The force computed in this manner is sent at 1000 Hz to ensure stable haptic interaction. The haptic cell-manipulation system was tested by injecting into a zebrafish egg cell after validating the technique at a scale larger than that of the cell.
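A minimal sketch of the vision-based force sensing described above: the observed deflection of the compliant mechanism is converted to force through an assumed stiffness, and the 30 Hz force estimates are interpolated up toward the roughly 1 kHz rate needed for stable haptic rendering. The stiffness, magnification factor and sample counts are illustrative assumptions, not the paper's calibration.

    # Illustrative vision-based force estimation and rate bridging.
    # Constants are assumed values, not the paper's calibration.

    STIFFNESS = 0.8      # N per mm of mechanism deflection, assumed
    FORCE_SCALE = 50.0   # magnification factor felt at the haptic handle, assumed

    def force_from_deflection(deflection_mm):
        """Convert the visually measured deflection (mm) to an estimated tip force (N)."""
        return STIFFNESS * deflection_mm

    def haptic_samples(f_prev, f_new, n=33):
        """Linearly interpolate between successive 30 Hz force estimates so the
        ~1 kHz haptic loop always receives a fresh, smoothly varying sample."""
        return [FORCE_SCALE * (f_prev + (f_new - f_prev) * k / n) for k in range(1, n + 1)]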