856 results for EYE-MOVEMENTS
Abstract:
This study investigates the human response to impulse perturbations at the midpoint of a haptically guided straight-line point-to-point movement. Such a perturbation response may be used as an assessment tool during robot-mediated neuro-rehabilitation therapy. Subjects' perturbation responses vary considerably. Movements with a lower perturbation displacement exhibit high-frequency oscillations, indicative of increased joint stiffness. Conversely, movements with a high perturbation displacement exhibit lower-frequency oscillations with higher amplitude and a longer settling time. Some subjects show unexpected transients during the perturbation impulse, which may be caused by complex joint interactions in the hand and arm.
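The link drawn above between oscillation frequency and joint stiffness follows from treating the limb as a second-order mass-spring-damper system, in which stiffness scales with the square of the natural frequency. A minimal sketch of that relationship (the model, masses and frequencies below are illustrative assumptions, not values from the study):

```python
import math

def stiffness_from_oscillation(f_hz, mass_kg, damping_ratio=0.0):
    """Estimate effective stiffness k from an observed oscillation frequency,
    assuming a second-order mass-spring-damper model of the limb."""
    omega_d = 2 * math.pi * f_hz                         # damped angular frequency
    omega_n = omega_d / math.sqrt(1 - damping_ratio**2)  # undamped natural frequency
    return mass_kg * omega_n**2                          # k = m * omega_n^2

# A higher observed oscillation frequency at equal mass implies higher stiffness:
k_low = stiffness_from_oscillation(2.0, 2.0)   # 2 Hz oscillation
k_high = stiffness_from_oscillation(5.0, 2.0)  # 5 Hz oscillation
```

This is why the high-frequency oscillations seen at low perturbation displacement are read as a sign of increased joint stiffness.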
Abstract:
Participants' eye gaze is generally not captured or represented in immersive collaborative virtual environment (ICVE) systems. We present EyeCVE, which uses mobile eye-trackers to drive the gaze of each participant's virtual avatar, thus supporting remote mutual eye contact and awareness of others' gaze in a perceptually unfragmented shared virtual workspace. We detail trials in which participants took part in three-way conferences between remote CAVE (TM) systems linked via EyeCVE. Eye-tracking data were recorded and used to evaluate interaction, confirming the system's support for the use of gaze as a communicational and management resource in multiparty conversational scenarios. We point toward subsequent investigation of eye-tracking in ICVEs for enhanced remote social interaction and analysis.
Abstract:
Visually impaired people have a very different view of the world, such that seemingly simple environments as viewed by a ‘normally’ sighted person can be difficult for people with visual impairments to access and move around. This problem can be hard to fully comprehend for people with ‘normal vision’, even when guidelines for inclusive design are available. This paper investigates ways in which image processing techniques can be used to simulate the characteristics of a number of common visual impairments in order to provide planners, designers and architects with a visual representation of how people with visual impairments view their environment, thereby promoting greater understanding of the issues, the creation of more accessible buildings and public spaces, and increased accessibility for visually impaired people in everyday situations.
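One common building block for such impairment simulations is a low-pass blur that removes the high-frequency detail lost with reduced visual acuity. A minimal, pure-Python sketch (the kernel and the grayscale-grid image format are illustrative assumptions; a real tool would use filters calibrated to each impairment):

```python
def blur_1d(row, kernel):
    """Convolve one row of pixel values with a normalized kernel (edge-clamped)."""
    n, half = len(row), len(kernel) // 2
    out = []
    for i in range(n):
        acc = 0.0
        for j, w in enumerate(kernel):
            idx = min(max(i + j - half, 0), n - 1)  # clamp at image borders
            acc += w * row[idx]
        out.append(acc)
    return out

def simulate_low_acuity(image, kernel=(0.25, 0.5, 0.25)):
    """Blur rows then columns (separable filter) -- a crude stand-in for the
    loss of fine detail experienced with reduced visual acuity."""
    rows = [blur_1d(r, kernel) for r in image]
    cols = [blur_1d(c, kernel) for c in zip(*rows)]
    return [list(r) for r in zip(*cols)]
```

Applied to a sharp luminance edge, the filter spreads the transition over neighbouring pixels, which is the kind of degradation the paper proposes presenting to designers.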
Marker placement to describe the wrist movements during activities of daily living in cyclical tasks
Abstract:
Objective. To describe the wrist kinematics during movement through free range of motion and activities of daily living using a cyclical task. Design. The wrist angles were initially calculated in a calibration trial and then in two selected activities of daily living (jar opening and carton pouring). Background. Existing studies which describe the wrist movement do not address the specific application of daily activities. Moreover, the data presented from subject to subject may differ simply because of the non-cyclical nature of upper limb movements. Methods. The coordinates of external markers attached to bone references on the forearm and dorsal side of the hand were obtained using an optical motion capture system. The wrist angles were derived from free motion trials and successively calculated in four healthy subjects for two specific cyclical daily activities (opening a jar and pouring from a carton). Results. The free motion trial highlighted the interaction between the wrist angles. Both the jar-opening and the carton-pouring activity showed a repetitive pattern for the three angles within the cycle length. In the jar-opening task, the standard deviation for the whole population was 10.8° for flexion-extension, 5.3° for radial-ulnar deviation and 10.4° for pronation-supination. In the carton-pouring task, the standard deviation for the whole population was 16.0° for flexion-extension, 3.4° for radial-ulnar deviation and 10.7° for pronation-supination. Conclusion. Wrist kinematics in healthy subjects can be successfully described by the rotations about the axes of marker-defined coordinate systems during free range of motion and daily activities using cyclical tasks.
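The angle computation underlying such marker-based kinematics can be illustrated with the standard dot-product formula between two segment vectors, each defined by a pair of markers (the marker placements and coordinates below are hypothetical, not the study's actual marker set):

```python
import math

def angle_between(u, v):
    """Angle in degrees between two 3-D vectors, via the dot-product formula."""
    dot = sum(a * b for a, b in zip(u, v))
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return math.degrees(math.acos(dot / (nu * nv)))

def wrist_flexion(forearm_prox, forearm_dist, hand_prox, hand_dist):
    """Angle between the forearm and hand segment vectors, each segment
    defined by a proximal and a distal surface marker (x, y, z)."""
    forearm = [d - p for p, d in zip(forearm_prox, forearm_dist)]
    hand = [d - p for p, d in zip(hand_prox, hand_dist)]
    return angle_between(forearm, hand)
```

With collinear segments (neutral wrist) the angle is 0°; bending the hand segment perpendicular to the forearm yields 90°. A full protocol would decompose this into flexion-extension, radial-ulnar deviation and pronation-supination about anatomically defined axes.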
Abstract:
In collaborative situations, eye gaze is a critical element of behavior which supports and fulfills many activities and roles. In current computer-supported collaboration systems, eye gaze is poorly supported. Even in a state-of-the-art video conferencing system such as the access grid, although one can see the face of the user, much of the communicative power of eye gaze is lost. This article gives an overview of some preliminary work that looks towards integrating eye gaze into an immersive collaborative virtual environment and assessing the impact that this would have on interaction between the users of such a system. Three experiments were conducted to assess the efficacy of eye gaze within immersive virtual environments. In each experiment, subjects observed on a large screen the eye-gaze behavior of an avatar. The eye-gaze behavior of that avatar had previously been recorded from a user with the use of a head-mounted eye tracker. The first experiment was conducted to assess the difference between users' abilities to judge what objects an avatar is looking at with only head gaze being viewed and also with eye- and head-gaze data being displayed. The results from the experiment show that eye gaze is of vital importance to subjects correctly identifying what a person is looking at in an immersive virtual environment. The second experiment examined whether a monocular or binocular eye-tracker would be required. This was examined by testing subjects' ability to identify where an avatar was looking from their eye direction alone, or by eye direction combined with convergence. This experiment showed that convergence had a significant impact on the subjects' ability to identify where the avatar was looking. The final experiment looked at the effects of stereo and mono viewing of the scene, with the subjects being asked to identify where the avatar was looking.
This experiment showed that there was no difference in the subjects' ability to detect where the avatar was gazing. This is followed by a description of how the eye-tracking system has been integrated into an immersive collaborative virtual environment and some preliminary results from the use of such a system.
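The second experiment's finding, that convergence helps observers locate the gaze point, reflects the geometry of binocular vergence: the two gaze rays intersect at the fixation point, so the vergence angle encodes depth. A minimal sketch for the symmetric case (the interpupillary distance and angles below are illustrative assumptions, not trial data):

```python
import math

def fixation_depth(ipd_m, vergence_deg):
    """Depth of the fixation point when both eyes converge symmetrically.
    ipd_m: interpupillary distance in metres.
    vergence_deg: total angle between the left and right gaze rays.
    Each eye rotates inward by half the vergence angle, so the fixation
    point lies (ipd/2) / tan(vergence/2) ahead of the eye baseline."""
    half = math.radians(vergence_deg) / 2.0
    return (ipd_m / 2.0) / math.tan(half)

# A narrower vergence angle places the fixation point farther away:
near = fixation_depth(0.06, 6.0)  # strongly converged eyes
far = fixation_depth(0.06, 1.0)   # nearly parallel gaze
```

Eye direction alone gives only a ray; adding convergence pins down a point along it, which is consistent with the binocular tracker outperforming the monocular case.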
Abstract:
The main objective is to generate kinematic models for head and neck movements. The motivation comes from our study of individuals with quadriplegia and the need to design rehabilitation aiding devices such as robots and teletheses that can be controlled by head-neck movements. It is then necessary to develop mathematical models for head and neck movements. Two identification methods have been applied to study the kinematics of head-neck movements of able-bodied as well as neck-injured subjects. In particular, sagittal plane movements are well modeled by a planar two-revolute-joint linkage. In fact, the motion in joint space seems to indicate that sagittal plane movements may be classified as a single-DOF motion. Finally, a spatial three-revolute-joint system has been employed to model 3D head-neck movements.
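A planar two-revolute-joint linkage of the kind used for the sagittal-plane model can be summarized by its forward kinematics (the link lengths and joint angles below are illustrative, not parameters fitted in the study):

```python
import math

def fk_2r(theta1, theta2, l1, l2):
    """Forward kinematics of a planar two-revolute-joint linkage.
    Returns the (x, y) position of the distal end point (e.g. a head
    landmark) given joint angles in radians and link lengths."""
    x = l1 * math.cos(theta1) + l2 * math.cos(theta1 + theta2)
    y = l1 * math.sin(theta1) + l2 * math.sin(theta1 + theta2)
    return x, y
```

Identifying the model then amounts to finding link lengths and joint-angle trajectories such that this map reproduces the measured head positions; a coupled (effectively single-DOF) motion shows up as theta2 varying as a fixed function of theta1.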
Abstract:
In this paper we have explored areas of application for health care manipulators and possible user groups. We have shown the steps in the design approach to the conceptual mechanism for the AAS. Future work will include measurement of muscle properties with the elbow-parameterization test-bed, to build a database for designing part of the control system of the AAS. More work on the mechanical design is required before a functional prototype can be built.
Abstract:
This paper describes the design, implementation and testing of a high speed controlled stereo “head/eye” platform which facilitates the rapid redirection of gaze in response to visual input. It details the mechanical device, which is based around geared DC motors, and describes hardware aspects of the controller and vision system, which are implemented on a reconfigurable network of general purpose parallel processors. The servo-controller is described in detail and higher level gaze and vision constructs outlined. The paper gives performance figures gained both from mechanical tests on the platform alone, and from closed loop tests on the entire system using visual feedback from a feature detector.
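The servo-controller is described above only at the hardware level; the classic structure for such a joint position loop is a discrete PID controller. A minimal sketch (the gains, time step and first-order axis model are illustrative assumptions, not the controller actually reported in the paper):

```python
def make_pid(kp, ki, kd, dt):
    """Create a discrete PID position controller as a closure.
    Returns a step function mapping (setpoint, measured) -> command."""
    state = {"integral": 0.0, "prev_err": None}
    def step(setpoint, measured):
        err = setpoint - measured
        state["integral"] += err * dt                    # accumulate error
        deriv = 0.0 if state["prev_err"] is None else (err - state["prev_err"]) / dt
        state["prev_err"] = err
        return kp * err + ki * state["integral"] + kd * deriv
    return step

# Drive a simple velocity-commanded axis toward a new gaze target:
pid = make_pid(4.0, 0.5, 0.1, 0.01)
pos = 0.0
for _ in range(500):                 # 5 s of simulated control at 100 Hz
    pos += pid(1.0, pos) * 0.01      # axis integrates the commanded velocity
```

In the real system this loop would close around encoder feedback on the geared DC motors, with the higher-level gaze constructs supplying the setpoints from visual input.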