3 results for posture-movement problem
in CentAUR: Central Archive at the University of Reading - UK
Abstract:
The coding of body part location may depend upon both visual and proprioceptive information, and allows targets to be localized with respect to the body. The present study investigates the interaction between visual and proprioceptive localization systems under conditions of multisensory conflict induced by optokinetic stimulation (OKS). Healthy subjects were asked to estimate the apparent motion speed of a visual target (LED) that could be located either in the extrapersonal space (visual encoding only, V), or at the same distance but attached to the subject's right index fingertip (visual and proprioceptive encoding, V-P). Additionally, the multisensory condition was performed with the index finger kept in position both passively (V-P passive) and actively (V-P active). Results showed that the visual stimulus was always perceived to move, irrespective of its off- or on-the-body location. Moreover, this apparent motion speed varied consistently with the speed of the moving OKS background in all conditions. Surprisingly, no differences were found between the V-P active and V-P passive conditions in the speed of apparent motion. The persistence of the visual illusion during active posture maintenance reveals a novel condition in which vision totally dominates over proprioceptive information, suggesting that the hand-held visual stimulus was perceived as a purely visual, external object despite its contact with the hand.
Abstract:
Recent behavioural and neuroimaging studies have found that observation of human movement, but not of robotic movement, gives rise to visuomotor priming. This implies that the 'mirror neuron' or 'action observation–execution matching' system in the premotor and parietal cortices is entirely unresponsive to robotic movement. The present study investigated this hypothesis using an 'automatic imitation' stimulus–response compatibility procedure. Participants were required to perform a prespecified movement (e.g. opening their hand) on presentation of a human or robotic hand in the terminal posture of a compatible movement (opened) or an incompatible movement (closed). Both the human and the robotic stimuli elicited automatic imitation; the prespecified action was initiated faster when it was cued by the compatible movement stimulus than when it was cued by the incompatible movement stimulus. However, even when the human and robotic stimuli were of comparable size, colour and brightness, the human hand had a stronger effect on performance. These results suggest that effector shape is sufficient to allow the action observation–execution matching system to distinguish human from robotic movement. They also indicate, as one would expect if this system develops through learning, that both human and robotic action can, to varying degrees, be 'simulated' by the premotor and parietal cortices.