816 results for INDEX FINGER
Abstract:
Objective: To define and evaluate a computer-vision (CV) method for scoring Paced Finger-Tapping (PFT) in Parkinson's disease (PD) using quantitative motion analysis of the index fingers, and to compare the obtained scores with the UPDRS (Unified Parkinson's Disease Rating Scale) finger-taps (FT) item. Background: Naked-eye evaluation of PFT in clinical practice offers only coarse resolution for determining PD status. Moreover, sensor-based mechanisms for PFT evaluation may cause patients discomfort. To avoid the cost and effort of applying wearable sensors, a CV system for non-invasive PFT evaluation is introduced. Methods: A database of 221 PFT videos from 6 PD patients was processed. The subjects were instructed to position their hands above their shoulders, beside the face, and to tap the index finger against the thumb consistently and with speed. They faced a pivoted camera during recording. The videos were rated by two clinicians on symptom levels 0 to 3 using UPDRS-FT. The CV method incorporates a motion analyzer and a face detector. The method detects the subject's face in each video frame, and the frame is split into two images at the center of the face rectangle. Two regions of interest, one in each image, are located to detect the index-finger motion of the left and right hands, respectively. Tracking the opening and closing phases of the dominant-hand index finger produces a tapping time series, which is normalized by the face height. This normalization calibrates the amplitude of the tapping signal, which is affected by the varying distance between camera and subject (the farther the camera, the smaller the amplitude). A total of 15 features were classified using a K-nearest-neighbor (KNN) classifier to characterize the symptom levels of UPDRS-FT. The target ratings provided by the raters were averaged. Results: A 10-fold cross-validation with KNN classified the 221 videos into 3 symptom levels with 75% accuracy. An area under the receiver operating characteristic curve of 82.6% supports the feasibility of the obtained features for replicating clinical assessments. Conclusions: The system is able to track index-finger motion to estimate tapping symptoms in PD. It has certain advantages over other technologies (e.g. magnetic sensors, accelerometers) for PFT evaluation, helping to improve and automate the ratings.
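A minimal sketch of the classification step described above, assuming a per-frame finger-gap amplitude series and the detected face height: the amplitude is normalized by face height, a few generic features are extracted (the paper's 15 features are not specified here), and a KNN classifier is scored with 10-fold cross-validation. Function names, feature choices and parameters are illustrative assumptions, not the authors' implementation.

# Illustrative sketch only: normalize the tapping signal by face height,
# extract generic features, and evaluate a KNN classifier with 10-fold CV.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import cross_val_score

def normalize_by_face_height(tap_amplitude, face_height_px):
    # Scale per-frame finger-gap amplitudes by the face height in the same
    # frame, compensating for varying camera-to-subject distance.
    return np.asarray(tap_amplitude, dtype=float) / np.asarray(face_height_px, dtype=float)

def tapping_features(signal, fps):
    # A few generic tapping features (stand-ins for the paper's 15 features).
    signal = np.asarray(signal, dtype=float)
    peaks = np.flatnonzero((signal[1:-1] > signal[:-2]) & (signal[1:-1] > signal[2:])) + 1
    intervals = np.diff(peaks) / fps
    return [
        signal.mean(), signal.std(),                      # amplitude statistics
        len(peaks) / (len(signal) / fps),                 # tapping rate (Hz)
        intervals.std() if intervals.size > 1 else 0.0,   # rhythm variability
    ]

# X: one feature vector per video; y: averaged clinician rating (hypothetical data).
# knn = KNeighborsClassifier(n_neighbors=5)
# print(cross_val_score(knn, X, y, cv=10).mean())   # 10-fold cross-validation accuracy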
Abstract:
In our interaction with the surrounding environment during daily life (using a toothbrush, opening doors, using a mobile phone, etc.) and in professional situations (medical interventions, production processes, etc.), we typically perform advanced manipulations that involve the fingers of both hands. The development of multi-finger haptic interaction methods therefore leads to more natural and realistic human-machine interfaces. However, most haptic interfaces available on the market are based on interaction through a single contact point; this may be sufficient for exploring or palpating an environment, but it does not allow more advanced tasks such as grasping. This thesis investigates the mechanical design, control and applications of modular haptic devices capable of reflecting forces to the index, middle and thumb fingers of the user. The mechanical design of the interface has been optimized with multi-objective functions to achieve low inertia, a large workspace, high manipulability and force reflection of more than 3 N across the workspace. The bandwidth and stiffness of the device have been evaluated through simulation and real experimentation. One of the most important areas in the design of these devices is the end-effector, since it is the part in contact with the user. In this work, a lightweight thimble adaptable to different users was designed; by incorporating contact sensors, it allows normal and tangential forces to be estimated during interaction with real and virtual environments. For the design of the control architecture, the main requirements for these devices were studied. These include the acquisition, processing and exchange over the Internet of numerous control and instrumentation signals, and the computation of mathematical equations including the direct and inverse kinematics, the Jacobian, grasp-detection algorithms, etc. All these components must be computed in real time, guaranteeing a minimum rate of 1 kHz. In addition, systems for virtual and remote precision manipulation are described, as well as the design of a method called "iterative kinematic decoupling" for computing the inverse kinematics of robots and its comparison with other current methods. To understand the importance of multimodal interaction, a study was carried out to determine which sensory stimuli correlate with faster and more accurate responses. These experiments were developed in collaboration with neuroscientists from the Technion Israel Institute of Technology. By comparing response times for unimodal interaction (auditory, visual and haptic) with bimodal and trimodal combinations of these stimuli, it is shown that the synchronized motion of the fingers used to generate grasping responses relies mainly on haptic perception. The processing-time advantage of haptic stimuli suggests that virtual environments that include this sensory component generate better motor contingencies and improve the plausibility of events. It is concluded that systems including haptic perception give users more time at the cognitive stages to fill in information creatively and form a richer experience.
An interesting application of haptic devices is the design of new simulators for training manual skills in the medical sector. In collaboration with physiotherapists from Griffith University in Australia, a simulator was developed that allows hand-rehabilitation exercises to be performed. The non-linear stiffness properties of the metacarpophalangeal joint of the index finger were estimated using the designed end-effector. These parameters were implemented in a scenario that simulates the behaviour of the human hand and allows haptic interaction through this interface. The potential applications of this simulator relate to the training and education of physiotherapy students. In this thesis, new methods were also developed that allow the simultaneous control of robots and robotic hands when interacting with real environments. The workspace reachable by the haptic device is extended by automatically switching between position and rate control modes. In addition, these methods allow the user's gesture to be recognized during the early stages of the approach to the object to be grasped. Experiments on advanced manipulation of objects with a manipulator and different robotic hands show that the time needed to complete a task is reduced and that the system allows the task to be performed accurately. This work is the result of a collaboration with researchers from the Harvard BioRobotics Laboratory.
ABSTRACT When we interact with the environment in our daily life (using a toothbrush, opening doors, using cell phones, etc.), or in professional situations (medical interventions, manufacturing processes, etc.), we typically perform dexterous manipulations that involve multiple fingers and the palm of both hands. Therefore, multi-finger haptic methods can provide a realistic and natural human-machine interface to enhance immersion when interacting with simulated or remote environments. Most commercial devices allow haptic interaction with only one contact point, which may be sufficient for some exploration or palpation tasks but is not enough to perform advanced object manipulations such as grasping. In this thesis, I investigate the mechanical design, control and applications of a modular haptic device that can provide force feedback to the index, thumb and middle fingers of the user. The designed mechanical device is optimized with a multi-objective design function to achieve low inertia, a large workspace, manipulability, and force feedback of up to 3 N within the workspace; the bandwidth and rigidity of the device are assessed through simulation and real experimentation. One of the most important areas when designing haptic devices is the end-effector, since it is in contact with the user. In this thesis the design and evaluation of a thimble-like, lightweight, user-adaptable and cost-effective end-effector that incorporates four contact force sensors is described. This design allows estimation of the forces applied by a user during manipulation of virtual and real objects. The design of a real-time, modular control architecture for multi-finger haptic interaction is also described. Requirements for the control of multi-finger haptic devices are explored: a large number of signals have to be acquired, processed and sent over the network, and mathematical computations such as the device's direct and inverse kinematics, the Jacobian, grasp-detection algorithms, etc., have to be calculated in real time to ensure the required fidelity of the haptic interaction. The hardware control architecture is modular and consists of an FPGA for the low-level controller and a real-time controller for the more complex calculations (Jacobian, kinematics, etc.); this provides a compact and scalable solution with the computation capability needed to sustain a 1 kHz control loop. A set-up for dexterous virtual and real manipulation is described. Moreover, a new algorithm named the iterative kinematic decoupling method was implemented to solve the inverse kinematics of a robotic manipulator. In order to understand the importance of multi-modal interaction including haptics, a subject study was carried out to look for sensory stimuli that correlate with fast response times and enhanced accuracy. This experiment was carried out in collaboration with neuroscientists from the Technion Israel Institute of Technology. By comparing the grasping response times in unimodal (auditory, visual and haptic) events with the response times in events with bimodal and trimodal combinations, it is concluded that in grasping tasks the synchronized motion of the fingers to generate the grasping response relies on haptic cues. This processing-speed advantage of haptic cues suggests that multimodal-haptic virtual environments are superior in generating motor contingencies, enhancing the plausibility of events. Applications that include haptics provide users with more time at the cognitive stages to fill in missing information creatively and form a richer experience. A major application of haptic devices is the design of new simulators to train manual skills for the medical sector. In collaboration with physical therapists from Griffith University in Australia, we developed a simulator that allows hand-rehabilitation manipulations. First, the non-linear stiffness properties of the metacarpophalangeal joint of the index finger were estimated by using the designed end-effector; these parameters are implemented in a scenario that simulates the behavior of the human hand and allows haptic interaction through the designed haptic device. The potential application of this work is related to educational and medical training purposes. In this thesis, new methods to simultaneously control the position and orientation of a robotic manipulator and the grasp of a robotic hand when interacting with large real environments are studied. The reachable workspace is extended by automatically switching between rate and position control modes. Moreover, the human hand gesture is recognized by reading the relative movements of the index, thumb and middle fingers of the user during the early stages of the approach to the object and is then mapped to the robotic hand actuators. These methods are validated in dexterous manipulation of objects with a robotic manipulator and different robotic hands. This work is the result of a research collaboration with researchers from the Harvard BioRobotics Laboratory. The experiments show that the overall task time is reduced and that the developed methods allow for full dexterity and correct completion of dexterous manipulations.
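As a rough illustration of the 1 kHz control loop the abstract describes, the sketch below runs a fixed-period loop for a toy two-link planar finger: forward kinematics, the Jacobian, a virtual-wall contact force, and a Jacobian-transpose mapping to joint torques. The link lengths, wall stiffness and loop structure are assumptions for illustration only; the actual controller in the thesis runs on an FPGA plus a real-time controller, not in Python.

# Self-contained sketch of a fixed-rate (1 kHz) haptic rendering loop for a
# toy two-link planar finger; all parameters are illustrative.
import time
import numpy as np

L1, L2 = 0.04, 0.03          # link lengths (m), arbitrary
K_WALL = 500.0               # virtual-wall stiffness (N/m), arbitrary
WALL_X = 0.05                # wall position along x (m)
PERIOD = 0.001               # 1 kHz control period

def forward_kinematics(q):
    # Fingertip position of a two-link planar linkage.
    x = L1 * np.cos(q[0]) + L2 * np.cos(q[0] + q[1])
    y = L1 * np.sin(q[0]) + L2 * np.sin(q[0] + q[1])
    return np.array([x, y])

def jacobian(q):
    # 2x2 Jacobian mapping joint velocities to fingertip velocities.
    s1, c1 = np.sin(q[0]), np.cos(q[0])
    s12, c12 = np.sin(q[0] + q[1]), np.cos(q[0] + q[1])
    return np.array([[-L1 * s1 - L2 * s12, -L2 * s12],
                     [ L1 * c1 + L2 * c12,  L2 * c12]])

def wall_force(tip):
    # Spring force pushing the fingertip back out of a virtual wall at x = WALL_X.
    depth = tip[0] - WALL_X
    return np.array([-K_WALL * depth, 0.0]) if depth > 0 else np.zeros(2)

q = np.array([0.3, 0.5])     # stand-in for encoder readings
next_tick = time.perf_counter()
for _ in range(1000):        # one second of the loop at 1 kHz
    tau = jacobian(q).T @ wall_force(forward_kinematics(q))  # torques rendering the force
    # device.write_torques(tau) would go here on real hardware
    next_tick += PERIOD
    time.sleep(max(0.0, next_tick - time.perf_counter()))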
Abstract:
This study investigated the Kinaesthetic Fusion Effect (KFE) first described by Craske and Kenny in 1981. The current study did not replicate these findings. Participants did not perceive any reduction in the sagittal separation of a button pressed by the index finger of one arm and a probe touching the other, following repeated exposure to the tactile stimuli present on both unseen arms. This study's failure to replicate the widely cited KFE as described by Craske et al. (1984) suggests that it may be contingent on several aspects of visual information, especially the availability of a specific visual reference, the role of instructions regarding gaze direction, and the potential use of a line-of-sight strategy when referring felt positions to an interposed surface. In addition, a foreshortening effect was found; this may result from a line-of-sight judgment and represent a feature of the reporting method used. The transformed line-of-sight data were regressed against the participant-reported values, yielding slopes of 1.14 (right arm) and 1.11 (left arm), with r > 0.997 for each. The study also provides additional evidence that misperception of the mediolateral positions of the limbs, specifically their separation, consistent with notions of Gestalt grouping, is somewhat labile and can be influenced by active motions in which one limb touches the other. Finally, this research will benefit future studies that require participants to report the perceived locations of the unseen limbs.
Abstract:
This study investigated the Kinaesthetic Fusion Effect (KFE) first described by Craske and Kenny in 1981. The current study did not replicate these findings following a change in the reporting method used by participants. Participants did not perceive any reduction in the sagittal separation of a button pressed by the index finger of one arm and a probe touching the other, following repeated exposure to the tactile stimuli present on both unseen arms. This study’s failure to replicate the widely-cited KFE as described by Craske et al. (1984) suggests that it may be contingent on several aspects of visual information, especially the availability of a specific visual reference, the role of instructions regarding gaze direction, and the potential use of a line of sight strategy when referring felt positions to an interposed surface. In addition, a foreshortening effect was found; this may result from a line-of-sight judgment and represent a feature of the reporting method used. Finally, this research will benefit future studies that require participants to report the perceived locations of the unseen limbs.
Abstract:
This study investigated the Kinaesthetic Fusion Effect (KFE) first described by Craske and Kenny in 1981. In Experiment 1 the study did not replicate these findings following a change in the reporting method used by participants. Participants did not perceive any reduction in the sagittal separation of a button pressed by the index finger of one arm and a probe touching the other, following repeated exposure to the tactile stimuli present on both unseen arms. This failure to replicate the widely cited KFE as described by Craske et al. (1984) suggests that it may be contingent on several aspects of visual information, especially the availability of a specific visual reference, the role of instructions regarding gaze direction, and the potential use of a line-of-sight strategy when referring felt positions to an interposed surface. In addition, a foreshortening effect was found; this may result from a line-of-sight judgment and represent a feature of the reporting method used. Finally, this research will benefit future studies that require participants to report the perceived locations of the unseen limbs. Experiment 2 investigated the KFE when the visual reference was removed and participants, blindfolded, reported the touched positions. A number of interesting outcomes arose from this change and may help to clarify the phenomenon.
Abstract:
This research is concerned with the development of tactual displays to supplement the information available through lipreading. Because voicing carries a high informational load in speech and is not well transmitted through lipreading, the efforts are focused on providing tactual displays of voicing to supplement the information available on the lips of the talker. This research includes exploration of 1) signal-processing schemes to extract information about voicing from the acoustic speech signal, 2) methods of displaying this information through a multi-finger tactual display, and 3) perceptual evaluations of voicing reception through the tactual display alone (T), lipreading alone (L), and the combined condition (L+T). Signal processing for the extraction of voicing information used amplitude-envelope signals derived from filtered bands of speech (i.e., envelopes derived from a lowpass-filtered band at 350 Hz and from a highpass-filtered band at 3000 Hz). Acoustic measurements made on the envelope signals of a set of 16 initial consonants, represented through multiple tokens of C1VC2 syllables, indicate that the onset-timing difference between the low- and high-frequency envelopes (EOA: envelope-onset asynchrony) provides a reliable and robust cue for distinguishing voiced from voiceless consonants. This acoustic cue was presented through a two-finger tactual display such that the envelope of the high-frequency band was used to modulate a 250-Hz carrier signal delivered to the index finger (250-I) and the envelope of the low-frequency band was used to modulate a 50-Hz carrier delivered to the thumb (50-T). The temporal-onset order threshold for these two signals, measured with roving signal amplitude and duration, averaged 34 msec, sufficiently small for use of the EOA cue. Perceptual evaluations of the tactual display of EOA with speech signals indicated: 1) that the cue was highly effective for discrimination of pairs of voicing contrasts; 2) that the identification of 16 consonants was improved by roughly 15 percentage points by the addition of the tactual cue over L alone; and 3) that no improvements in L+T over L were observed for reception of words in sentences, indicating the need for further training on this task.
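A hedged sketch of how the envelope-onset asynchrony (EOA) described above could be computed from the two bands named in the abstract (lowpass at 350 Hz, highpass at 3000 Hz). The Hilbert-envelope computation and the fraction-of-peak onset criterion are illustrative assumptions, not necessarily the authors' exact processing chain.

# Illustrative EOA estimate from low- and high-frequency band envelopes.
import numpy as np
from scipy.signal import butter, sosfiltfilt, hilbert

def band_envelope(x, fs, cutoff, btype):
    # Filter the speech signal into one band and take its amplitude envelope.
    sos = butter(4, cutoff, btype=btype, fs=fs, output='sos')
    return np.abs(hilbert(sosfiltfilt(sos, x)))

def onset_time(env, fs, frac=0.1):
    # First time the envelope exceeds a fraction of its peak (assumed criterion).
    idx = np.flatnonzero(env >= frac * env.max())
    return idx[0] / fs if idx.size else np.nan

def envelope_onset_asynchrony(x, fs):
    low_env = band_envelope(x, fs, 350.0, 'lowpass')     # low band: voicing energy
    high_env = band_envelope(x, fs, 3000.0, 'highpass')  # high band: burst/frication energy
    # Positive EOA: the high-frequency band starts before the low-frequency band,
    # as expected for voiceless consonants.
    return onset_time(low_env, fs) - onset_time(high_env, fs)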
Abstract:
Although it has long been supposed that resistance training causes adaptive changes in the CNS, the sites and nature of these adaptations have not previously been identified. In order to determine whether the neural adaptations to resistance training occur to a greater extent at cortical or subcortical sites in the CNS, we compared the effects of resistance training on the electromyographic (EMG) responses to transcranial magnetic (TMS) and electrical (TES) stimulation. Motor evoked potentials (MEPs) were recorded from the first dorsal interosseous muscle of 16 individuals before and after 4 weeks of resistance training for the index finger abductors (n=8), or training involving finger abduction-adduction without external resistance (n=8). TMS was delivered at rest at intensities from 5% below the passive threshold to the maximal output of the stimulator. TMS and TES were also delivered at the active threshold intensity while the participants exerted torques ranging from 5 to 60% of their maximum voluntary contraction (MVC) torque. The average latency of MEPs elicited by TES was significantly shorter than that of TMS MEPs (TES latency=21.5+/-1.4 ms; TMS latency=23.4+/-1.4 ms; P
Abstract:
The control of movement is predicated upon a system of constraints of musculoskeletal and neural origin. The focus of the present study was the manner in which such constraints are adapted or superseded during the acquisition of motor skill. Individuals participated in five experimental sessions, in which they attempted to produce abduction-adduction movements of the index finger in time with an auditory metronome. During each trial, the metronome frequency was increased in eight steps from an individually determined base frequency. Electromyographic (EMG) activity was recorded from the first dorsal interosseous (FDI), first volar interosseous (FVI), flexor digitorum superficialis (FDS), and extensor digitorum communis (EDC) muscles. The movements produced on the final day of acquisition more accurately matched the required profile, and exhibited greater spatial and temporal stability, than those generated during initial performance. In the early stages of skill acquisition, an alternating pattern of activation in FDI and FVI was maintained, even at the highest frequencies. In contrast, as the frequency of movement was increased, activity in FDS and EDC was either tonic or intermittent. As learning proceeded, alterations in recruitment patterns were expressed primarily in the extrinsic muscles (EDC and FDS). These changes took the form of increases in the postural role of these muscles, shifts to phasic patterns of activation, or selective disengagement of these muscles. These findings suggest that there is considerable flexibility in the composition of muscle synergies, which is exploited by individuals during the acquisition of coordination.
Abstract:
Strategies for the control of human movement are constrained by the neuroanatomical characteristics of the motor system. In particular, there is evidence that the capacity of muscles for producing force has a strong influence on the stability of coordination in certain movement tasks. In the present experiment, our aim was to determine whether physiological adaptations that cause relatively long-lasting changes in the ability of muscles to produce force can influence the stability of coordination in a systematic manner. We assessed the effects of resistance training on the performance of a difficult coordination task that required participants to synchronize or syncopate movements of their index finger with an auditory metronome. Our results revealed that training that increased isometric finger strength also enhanced the stability of movement coordination. These changes were accompanied by alterations in muscle recruitment patterns. In particular, the trained muscles were recruited in a more consistent fashion following the programme of resistance training. These results indicate that resistance training produces functional adaptations of the neuroanatomical constraints that underlie the control of voluntary movement.
Abstract:
The ability to synchronise actions with environmental events is a fundamental skill supporting a variety of group activities. In such situations, multiple sensory cues are usually available for synchronisation, yet previous studies have suggested that auditory cues dominate those from other modalities. We examine the control of rhythmic action on the basis of auditory and haptic cues and show that performance is sensitive to both sources of information for synchronisation. Participants were required to tap the dominant hand index finger in synchrony with a metronome defined by periodic auditory tones, imposed movements of the non-dominant index finger, or both cues together. Synchronisation was least variable with the bimodal metronome as predicted by a maximum likelihood estimation (MLE) model. However, increases in timing variability of the auditory cue resulted in some departures from the MLE model. Our findings indicate the need for further investigation of the MLE account of the integration of multisensory signals in the temporal control of action.
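For reference, the standard maximum likelihood estimation (MLE) prediction for combining two independent timing cues, which the abstract invokes but does not spell out, weights each cue by its reliability and predicts a bimodal variance no larger than that of the better unimodal cue:

\[
\hat{t}_{AH} = w_A \hat{t}_A + w_H \hat{t}_H,
\qquad
w_A = \frac{\sigma_H^{2}}{\sigma_A^{2} + \sigma_H^{2}},
\quad
w_H = \frac{\sigma_A^{2}}{\sigma_A^{2} + \sigma_H^{2}},
\]
\[
\sigma_{AH}^{2} = \frac{\sigma_A^{2}\,\sigma_H^{2}}{\sigma_A^{2} + \sigma_H^{2}}
\;\le\; \min\left(\sigma_A^{2},\, \sigma_H^{2}\right),
\]

where the subscripts A and H denote the auditory and haptic cues, respectively.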
Abstract:
The Wing-Kristofferson (WK) model of movement timing emphasises the separation of central timer and motor processes. Several studies of repetitive timing have shown that the increase in variability at longer intervals is attributable to timer processes; however, relatively little is known about the way motor aspects of timing are affected by task movement constraints. In the present study, we examined timing variability in finger tapping with differences in interval, to assess central timer effects, and with differences in movement amplitude, to assess motor implementation effects. We then investigated whether effects of motor timing observed at the point of response (flexion offset/tap) are also evident in extension, which would suggest that both phases are subject to timing control. Eleven participants performed bimanual simultaneous tapping at two target intervals (400, 600 ms), with the index finger of each hand performing movements of equal (3 or 6 cm) or unequal amplitude (left hand 3 cm, right hand 6 cm, and vice versa). As expected, timer variability increased with the mean interval but showed only small, non-systematic effects with changes in movement amplitude. Motor implementation variability was greater in the unequal-amplitude conditions. The same pattern of motor variability was observed in both the flexion and extension phases of movement. These results suggest that intervals are generated by a central timer triggering a series of events at the motor output level, including flexion and the following extension, which are explicitly represented in the timing system.
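For reference, the standard Wing-Kristofferson decomposition underlying the timer/motor distinction described above (the abstract does not reproduce the equations) models each inter-response interval as a timer interval plus the difference of successive motor delays:

\[
I_n = C_n + M_{n+1} - M_n,
\qquad
\operatorname{Var}(I_n) = \sigma_C^{2} + 2\sigma_M^{2},
\qquad
\operatorname{Cov}(I_n, I_{n+1}) = -\sigma_M^{2},
\]

with covariances at lags of two or more equal to zero, so that the motor and timer variances can be estimated from a tapping series as \(\hat{\sigma}_M^{2} = -\widehat{\operatorname{Cov}}(I_n, I_{n+1})\) and \(\hat{\sigma}_C^{2} = \widehat{\operatorname{Var}}(I_n) - 2\hat{\sigma}_M^{2}\).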
Abstract:
'Temporally urgent' reactions are extremely rapid, spatially precise movements that are evoked following discrete stimuli. The involvement of primary motor cortex (M1), and its relationship to stimulus intensity, in such reactions is not well understood. Continuous theta burst stimulation (cTBS) suppresses focal regions of the cortex and can be used to assess the involvement of motor cortex in speed of processing. The primary objective of this study was to explore the involvement of M1 in speed of processing with respect to stimulus intensity. Thirteen healthy young adults participated in this experiment. Behavioral testing consisted of a simple button press using the index finger following median nerve stimulation of the opposite limb, at either high or low stimulus intensity. Reaction time was measured from the onset of electromyographic activity in the first dorsal interosseous (FDI) muscle of each limb. Participants completed a 30-min bout of behavioral testing prior to, and 15 min following, the delivery of cTBS to the motor cortical representation of the right FDI. The effect of cTBS on motor cortex was measured by recording the average of 30 motor evoked potentials (MEPs) just prior to, and 5 min following, cTBS. Paired t-tests revealed that, of the thirteen participants, five demonstrated a significant attenuation, three a significant facilitation, and five no significant change in MEP amplitude following cTBS. In the group that demonstrated attenuated MEPs, there was a biologically significant interaction between stimulus intensity and the effect of cTBS on reaction time and amplitude of muscle activation. This study demonstrates the variability of potential outcomes associated with the use of cTBS, and further study of the mechanisms that underlie the methodology is required. Importantly, changes in motor cortical excitability may be an important determinant of speed of processing following high-intensity stimulation.
Abstract:
The process of learning to play a musical instrument necessarily alters the functional organisation of the cortical motor areas that are involved in generating the required movements. In the case of the harp, the demands placed on the motor system are quite specific: during performance, all digits with the sole exception of the little finger are used to pluck the strings. To elucidate the impact of having acquired this highly specialized musical skill on the characteristics of corticospinal projections to the intrinsic hand muscles, focal transcranial magnetic stimulation (TMS) was used to elicit motor evoked potentials (MEPs) in three muscles of the left hand: abductor pollicis brevis (APB), first dorsal interosseous (FDI), and abductor digiti minimi (ADM), in seven harpists. Seven non-musicians served as controls. In the FDI muscle, which moves the index finger, the harpists exhibited reliably larger MEP amplitudes than those in the control group. In contrast, MEPs evoked in the ADM muscle, which activates the little finger, were smaller in the harpists than in the non-musicians. The locations on the scalp over which magnetic stimulation elicited discriminable responses in ADM also differed between the harpists and the non-musicians. This specific pattern of variation in the excitability of corticospinal projections to the intrinsic hand muscles exhibited by harpists is in accordance with the idiosyncratic functional demands imposed by playing this instrument.
Abstract:
Background: Appropriate sensorimotor correlations can result in the illusion of ownership of exogenous body parts. Nevertheless, whether and how the illusion of owning a new body part affects human perception, and in particular pain detection, is still poorly investigated. Recent findings have shown that seeing one's own body is analgesic, but it is not known whether this effect is transferable to newly embodied, but exogenous, body parts. In recent years, results from our laboratory have demonstrated that a virtual body can be felt as one's own, provided realistic multisensory correlations. Methods: The current work aimed at investigating the impact of virtual body ownership on the pain threshold. An immersive virtual environment provided a first-person perspective of a virtual body that replaced the participant's own. Passive movement of the index finger, congruent with the movement of the virtual index finger, was used in the "synchronous" condition to induce ownership of the virtual arm. The pain threshold was tested by thermal stimulation under four conditions: 1) synchronous movements of the real and virtual fingers, 2) asynchronous movements, 3) seeing a virtual object instead of an arm, and 4) not seeing any limb in the real world. Results: Our results show that, independently of attentional and stimulus-adaptation processes, the ownership of a virtual arm per se can significantly increase the thermal pain threshold. Conclusions: This finding may be relevant for the development and improvement of digital solutions for rehabilitation and pain treatment.