201 results for Haptic
Abstract:
Master's thesis digitized by the Direction des bibliothèques de l'Université de Montréal.
Abstract:
When teaching students with visual impairments, educators generally rely on tactile tools to depict visual mathematical topics. Tactile media, such as embossed paper and simple manipulable materials, are typically used to convey graphical information. Although these tools are easy to use and relatively inexpensive, they are solely tactile and cannot be modified. Dynamic and interactive technologies such as pin matrices and haptic pens are also commercially available, but tend to be more expensive and less intuitive. This study aims to bridge the gap between easy-to-use tactile tools and dynamic, interactive technologies in order to facilitate the haptic learning of mathematical concepts. We developed a haptic assistive device using a Tanvas electrostatic touchscreen that provides the user with multimodal (haptic, auditory, and visual) output. This research comprises three methodological steps: 1) a systematic literature review of the state of the art in the design and testing of tactile and haptic assistive devices, 2) a user-centered system design, and 3) testing of the system's effectiveness via a usability study. The electrostatic touchscreen shows promise as an assistive device for displaying visual mathematical elements via the haptic modality.
Abstract:
Many individuals who have had a stroke have motor impairments, such as timing deficits, that hinder their ability to complete daily activities like getting dressed. Robotic rehabilitation is an increasingly popular therapeutic avenue for improving motor recovery in this population. Yet most studies have focused on improving the spatial aspect of movement (e.g. reaching), not the temporal one (e.g. timing). Hence, the main aim of this study was to compare two types of robotic rehabilitation on the immediate improvement of timing accuracy: haptic guidance (HG), which consists of guiding the person to make the correct movement and thus decreasing his or her movement errors, and error amplification (EA), which consists of increasing the person's movement errors. The secondary objective was to explore whether the side of the stroke lesion had an effect on timing accuracy following HG and EA training. Thirty-four persons who had had a stroke (average age 67 ± 7 years) participated in a single training session of a timing-based task (a simulated pinball-like task), in which they had to activate a robot at the correct moment to successfully hit targets presented at random on a computer screen. Participants were randomly divided into two groups, receiving either HG or EA. During the same session, a baseline phase and a retention phase were administered before and after the training, and these phases were compared in order to evaluate the immediate impact of HG and EA on movement timing accuracy. The results showed that HG improved immediate timing accuracy (p = 0.03), whereas EA did not (p = 0.45); when the two types of training were compared directly, HG proved superior to EA at improving timing (p = 0.04). Furthermore, a significant correlation was found between the side of the stroke lesion and the change in timing accuracy following EA (r_pb = 0.7, p = 0.001), but not HG (r_pb = 0.18, p = 0.24). In other words, a deterioration in timing accuracy was found for participants with a left-hemisphere lesion who trained with EA, whereas an improvement in timing accuracy was noted following EA for participants with a right-sided lesion. In sum, HG appears to improve immediate timing accuracy for individuals who have had a stroke, but the side of the stroke lesion seems to play a part in participants' response to training. This remains to be explored further, along with the impact of providing more training sessions in order to assess any long-term benefits of HG or EA.
Abstract:
Recent developments in interactive technologies have brought major changes in the manner in which artists, performers, and creative individuals interact with digital music technology, owing to the increasing variety of interactive technologies readily available today. Digital Musical Instruments (DMIs) present musicians with performance challenges that are unique to this form of computer music. One of the most significant deviations from conventional acoustic musical instruments is the level of physical feedback conveyed by the instrument to the user. Currently, new interfaces for musical expression are not designed to be as physically communicative as acoustic instruments. Specifically, DMIs are often devoid of haptic feedback and therefore lack the ability to impart important performance information to the user. Moreover, there is currently no standardised way to measure the effect of this lack of physical feedback. Best practice would call for a set of methods to effectively, repeatedly, and quantifiably evaluate the functionality, usability, and user experience of DMIs. Earlier theoretical and technological applications of haptics have tried to address device performance issues associated with the lack of feedback in DMI designs, and it has been argued that the level of haptic feedback presented to a user can significantly affect the user's overall emotive feeling towards a musical device. The outcomes of the investigations contained within this thesis are intended to inform the design of new haptic interfaces.
Abstract:
This paper presents an automated system for the 3D assembly of tissue engineering (TE) scaffolds made from biocompatible microscopic building blocks with relatively large fabrication error. It focuses on the pin-into-hole force control developed for this demanding microassembly task. A beam-like gripper with integrated force sensing, offering 3 mN resolution over a 500 mN measuring range, is designed and used to implement an admittance force-controlled insertion on commercial precision stages. Vision-based alignment followed by insertion is complemented by a haptic exploration strategy using force and position information. The system demonstrates fully automated construction of TE scaffolds from 50 microparts whose dimensional error is larger than 5%.
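The admittance force-controlled insertion described above can be sketched as a simple loop in which the feed along the insertion axis is modulated by the difference between a target contact force and the force measured by the gripper. The sketch below is illustrative only; the function names, gains, and limits are assumptions, not the paper's implementation.

```python
# Minimal sketch of an admittance-controlled pin-into-hole insertion loop.
# read_force_mN and move_stage_um are hypothetical wrappers for the force
# sensor and precision stage; all gains and limits are illustrative.
import time

def admittance_insertion(read_force_mN, move_stage_um,
                         target_force_mN=50.0, admittance_um_per_mN=0.2,
                         max_step_um=1.0, depth_um=200.0, dt=0.01):
    """Advance along the insertion axis while limiting contact force."""
    inserted = 0.0
    while inserted < depth_um:
        force = read_force_mN()
        # Admittance law: the commanded motion is proportional to the
        # force error, so excessive contact force slows or reverses the feed.
        correction = admittance_um_per_mN * (target_force_mN - force)
        step = max(-max_step_um, min(max_step_um, correction))
        move_stage_um(step)
        inserted += max(step, 0.0)
        time.sleep(dt)
```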
Abstract:
This paper proposes the use of optical flow from a moving robot to provide force feedback to an operator's joystick and thereby facilitate collision-free teleoperation. Optical flow is measured by wide-angle cameras on board the vehicle and used to generate a virtual environmental force that is reflected to the user through the joystick, as well as fed back into the control of the vehicle. The coupling between optical flow (velocity) and force is modelled as an impedance, in this case an optical impedance. We show that the proposed control is dissipative and prevents the vehicle from colliding with the environment, while providing the operator with a natural feel for the remote environment. The paper focuses on applications to aerial robotic vehicles; however, the ideas apply directly to other force-actuated vehicles such as submersibles or space vehicles, and the authors believe the approach has potential for the control of terrestrial vehicles and even the teleoperation of manipulators. Experimental results are provided for a simulated aerial robot in a virtual environment controlled by a haptic joystick.
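As a rough illustration of the optical-impedance coupling described above, the sketch below maps an averaged optic-flow velocity to a repulsive joystick force through a constant impedance gain and blends that force back into the vehicle's velocity command. The names and gain values are assumptions for illustration, not the authors' controller.

```python
# Illustrative mapping from optic flow to a reflected joystick force.
import numpy as np

def reflected_force(flow_vectors, impedance_gain=0.8):
    """Map per-feature optic-flow vectors (pixels/s) to a 2-D force.

    Larger flow implies faster apparent motion of the environment, so the
    reflected force opposes the direction of the average flow."""
    mean_flow = np.mean(np.asarray(flow_vectors, dtype=float), axis=0)
    return -impedance_gain * mean_flow

def velocity_command(operator_cmd, flow_vectors, feedback_gain=0.05):
    """Blend the operator's command with the environment force so the
    vehicle is slowed when the apparent flow (and thus proximity) grows."""
    force = reflected_force(flow_vectors)
    return np.asarray(operator_cmd, dtype=float) + feedback_gain * force
```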
Abstract:
Autonomous development of sensorimotor coordination enables a robot to adapt and change its action choices as it interacts with the world throughout its lifetime. The Experience Network is a structure that rapidly learns coordination between visual and haptic inputs and motor action. This paper presents methods that handle the high dimensionality of the network state space arising from the simultaneous detection of multiple sensory features. The methods introduce no significant increase in the complexity of the underlying representations and also allow emergent, task-specific, semantic information to inform action selection. Experimental results show rapid learning in a real robot, starting with no sensorimotor mappings and progressing to a mobile robot capable of wall avoidance and target acquisition.
Abstract:
With the release of the Nintendo Wii in 2006, haptic force gestures became a very popular form of input for interactive entertainment. However, the gesture recognition techniques currently used in Nintendo Wii games suffer from a lack of control when it comes to recognising simple gestures. This paper presents a simple gesture recognition technique called Peak Testing, which gives greater control over gesture interaction. The technique locates force peaks in continuous force data (provided by a gesture device such as the Wiimote) and then cancels any peaks that are not intended as input. Peak Testing is therefore able to identify movements in any direction. The paper applies the technique to the control of virtual instruments and investigates how users respond to this interaction. The technique is then explored as the basis for a robust way to navigate menus with a simple flick of the wrist. We propose that this flick-based interaction could be a very intuitive alternative to the pointer techniques currently used to navigate Nintendo Wii menus.
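A minimal sketch of a Peak Testing-style recogniser is given below, under assumed thresholds (the paper's actual parameters are not given): peaks are located in a one-dimensional stream of force samples, and peaks that are too weak or that follow too closely after an accepted peak are cancelled as unintended input.

```python
# Illustrative peak detection over a stream of force/acceleration samples.
def detect_peaks(samples, magnitude_threshold=1.5, refractory=10):
    """Return indices of accepted force peaks in a 1-D sample sequence."""
    accepted = []
    last_accepted = -refractory
    for i in range(1, len(samples) - 1):
        # A local maximum in the force signal.
        if not (samples[i - 1] < samples[i] >= samples[i + 1]):
            continue
        # Cancel peaks that are too weak or too close to the previous
        # accepted peak; these are treated as unintended input.
        if samples[i] >= magnitude_threshold and i - last_accepted >= refractory:
            accepted.append(i)
            last_accepted = i
    return accepted
```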
Abstract:
The silence of objects phenomenologically explores the experience and memory of trauma through object-based artwork. It springs from a desire to map difficult psychological terrain and does so by tracking the process of a coming into 'expression' to communicate notions of loss, detachment and powerlessness. It maps a journey from silence to a forming 'voice' that gives shape to the unsayable. This practice-led research is multifaceted. Whilst the creative element uses transformed objects as material metaphors to tap into the sensory and affective operations of art, the written component blends reflection with theory and is informed by art theorists Jill Bennett and Mignon Nixon. By establishing a dialogue between theoretical constructs and creative works, I consider how giving form to deep consciousness can counter the effects of trauma manifest as silence and invisibility.
Abstract:
A simulation-based training system for surgical wound debridement was developed; it comprises a multimedia introduction, a surgical simulator (the tutorial component), and an assessment component. The simulator includes two PCs, a haptic device, and a mirrored display. Debridement is performed on a virtual leg model with a shallow laceration wound superimposed. Trainees are instructed to remove debris with forceps, scrub with a brush, and rinse with saline solution to maintain sterility. Research and development issues currently under investigation include tissue deformation models using mass-spring systems and finite element methods; tissue cutting using a high-resolution volumetric mesh and dynamic topology; and accurate collision detection, cutting, and soft-body haptic rendering for two devices within the same haptic space.
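The mass-spring tissue deformation mentioned above can be sketched as follows; the node layout, stiffness, damping, and explicit Euler step are illustrative assumptions rather than the simulator's actual values.

```python
# Minimal mass-spring deformation step: point masses connected by damped
# linear springs, advanced with explicit Euler integration.
import numpy as np

def mass_spring_step(positions, velocities, springs, rest_lengths,
                     k=200.0, damping=0.5, mass=0.01, dt=1e-3):
    """positions, velocities: (N, 3) arrays; springs: (i, j) index pairs."""
    forces = np.zeros_like(positions)
    for (i, j), rest in zip(springs, rest_lengths):
        d = positions[j] - positions[i]
        length = np.linalg.norm(d)
        if length < 1e-9:
            continue
        # Hooke's law along the spring direction.
        f = k * (length - rest) * (d / length)
        forces[i] += f
        forces[j] -= f
    forces -= damping * velocities          # simple viscous damping
    velocities = velocities + dt * forces / mass
    positions = positions + dt * velocities
    return positions, velocities
```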
Abstract:
To feel another person's pulse is an intimate and physical interaction. In these prototypes we use near-field communication to extend the tangible reach of the heartbeat, so that another person can feel our heartbeat at a distance. The work is an initial experiment in near-field haptic interaction and is used to explore the quality of interactions that result from feeling another person's pulse. The work takes the form of two feathered white gauntlets, to be worn on the forearm. Each gauntlet contains a pulse sensor, a radio transmitter, and a vibrator. The pulse of the wearer is transmitted to the other feathered gauntlet and transformed into haptic feedback. When there are two wearers, their heartbeats are exchanged, to be felt by each other without physical contact.
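The pulse-to-vibration loop described above could look roughly like the sketch below, with hypothetical helper names standing in for the gauntlet's pulse sensor, radio link, and vibration motor: each locally detected heartbeat is sent to the paired gauntlet, and each received beat is rendered as a short vibration.

```python
# Illustrative gauntlet loop: share local heartbeats, render remote ones.
import time

def run_gauntlet(beat_detected, radio_send, radio_receive, vibrate_ms):
    """beat_detected(): True when the pulse sensor sees a new beat.
    radio_send(msg) / radio_receive(): exchange messages with the paired
    gauntlet (radio_receive returns None when nothing has arrived).
    vibrate_ms(duration): drive the vibration motor for `duration` ms."""
    while True:
        if beat_detected():
            radio_send("beat")          # share our heartbeat
        if radio_receive() == "beat":
            vibrate_ms(80)              # feel the partner's heartbeat
        time.sleep(0.01)
```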
Abstract:
The present study investigated whether memory for a room-sized spatial layout learned through auditory localization of sounds exhibits orientation dependence similar to that observed for spatial memory acquired from stationary viewing of the environment. Participants learned spatial layouts by viewing objects or localizing sounds and then performed judgments of relative direction among remembered locations. The results showed that direction judgments following auditory learning were performed most accurately at a particular orientation in the same way as were those following visual learning, indicating that auditorily encoded spatial memory is orientation dependent. In combination with previous findings that spatial memories derived from haptic and proprioceptive experiences are also orientation dependent, the present finding suggests that orientation dependence is a general functional property of human spatial memory independent of learning modality.
Abstract:
A telepresence-based interactive installation allowing people at three sites (The National Art Museum of China, Beijing; The Imperial City Art Museum, Beijing; CalPoly University, California, USA) to interact simultaneously using only their bodies. Each participant used a physical interface called a 'Bodyshelf' and wore a sound-vibration transmission device called a 'haptic pendant' around the neck. By gently moving their bodies and engaging through this 'smart furniture', they instigated 'intimate transactions' that influenced an evolving, computationally generated 'world' created from digital imagery, multichannel sound and tactile feedback. Intimate Transactions (Version 4) was the culmination of a long-term interdisciplinary research project developed in four distinct stages. It was launched in 2008 and subsequently acquired, on the invitation of Professor Peter Weibel, for the ZKM Media Art History Museum Karlsruhe in 2012.
Abstract:
Introduction: Different types of hallucinations are symptomatic of different conditions. Schizotypal hallucinations are unique in that they follow existing delusional narrative patterns: they are often bizarre, they are generally multimodal, and they are particularly vivid (the experience of a newsreader abusing you personally over the TV is both visual and aural; patients who feel and hear silicone chips under their skin suffer from haptic hallucinations as well as aural ones, etc.). Although there are a number of hypotheses for hallucinations, few cogently grapple with the sheer bizarreness of those experienced in schizotypal psychosis. Methods: A review-based hypothesis, traversing theory from the molecular level to phenomenological expression as a distinct and recognizable symptomatology. Conclusion: Hallucinations appear to be caused by a two-fold dysfunction in the mesofrontal dopamine pathway, which is considered here to mediate attention of different types: in the anterior medial frontal lobe, the receptors (largely D1 type) mediate declarative awareness, whereas the receptors in the striatum (largely D2 type) mediate latent awareness of known schemata. In healthy perception, most of the perceptual load is carried by the latter, the top-down predictive and mimetic engine, with the bottom-up mechanism used as a secondary tool to bring conscious deliberation to stimuli that fail to match expectations. In schizophrenia, the predictive mode is over-stimulated, while the bottom-up feedback mechanism atrophies. This dysfunctional distribution pattern effectively confines dopamine activity to the striatum, thereby stimulating the structural components of thought and behaviour: well-learned routines, narrative structures, lexica, grammar, schemata, archetypes, and other procedural resources. Meanwhile, the loss of activity in the frontal complex reduces the capacity for declarative awareness and for processing anything that fails to meet expectations.
Abstract:
Haptices and haptemes: a case study of the developmental process in the touch-based communication of people with acquired deafblindness. This research is the first systematic, longitudinal description of the process and development of communication using touch and the body with a person with acquired deafblindness. The research consists of observational and analysed written and video materials, drawn mainly from two informants' experiences over a period of 14 years. It describes the adaptation of Social-Haptic methods between a couple, together with other informants' experiences collated from biographies and from national and international courses. When hearing and sight deteriorate due to an acquired deafblind condition, communication comes to rely on multi-systematic and adaptive methods. A person's expressive language, spoken or Sign Language, usually remains unchanged, but the methods of receiving information may change many times during a person's lifetime. Haptices are made up of haptemes, which determine the rules by which they are analysed. In defining haptemes, the definition, classification and varied meanings of touch were identified. Haptices involve sharing a personal body space, the meaning of touch contact, context, and the use of different communication channels. Communication distances are classified as exact distance, estimated distance and touch distance. Physical distance can be termed very long, long, medium or very close. Social body space includes the body areas involved in sending and receiving haptices and applying different types of contact. One or two hands can produce messages using different hand shapes and orientations. This research classifies how the body can be divided into different areas, such as body orientation, varied body postures, body position levels, social actions and which side of the body is used. Spatial body space includes environmental and situational elements. Haptemes of movement are recognised as the direction of movements, change of direction on the body, directions between people, pressure, speed, frequency, size, length, duration, pause, change of rhythm, shape, and macro and micro movements. Haptices convey multidimensional meanings and emotions. The research describes haptices in different situations, both enhancing sensory information and functioning as an independent language. Haptices include a social-haptic confirmation system, social quick messages, body drawing, contact with people and the environment, guiding, and sharing art experiences through movement. Five stages of emotional differentiation were identified: very light, light, medium, heavy and very heavy touch. Haptices make it possible to share different art, hobby and game experiences. The development of the new communication system, based on the analysis of the research data, is classified into different phases: experimental initiation, social deconstruction, developing the description of Social-Haptic communication, generalisation of the theory, and finding and conceptualising the haptices and haptemes. The use and description of haptices is a social innovation that illustrates the adaptive function of the body and the perceptual senses and can be taught to a third party. Keywords: deafblindness, hapteme, haptic, haptices, movement, social-haptic communication, social-haptic confirmation system, tactile, touch