947 results for Audio-visual Integration


Abstract:

Neuroimaging of the self has focused on high-level mechanisms such as language, memory, or imagery of the self and has implicated widely distributed brain networks. Yet recent evidence suggests that low-level mechanisms such as multisensory and sensorimotor integration may play a fundamental role in self-related processing. In the present study we used visuotactile multisensory conflict, robotics, virtual reality, and fMRI to study such low-level mechanisms by experimentally inducing changes in self-location. Participants saw a video of a person's back (body) or an empty room (no-body) being stroked while an MR-compatible robotic device stroked their back. The latter tactile input was synchronous or asynchronous with respect to the seen stroking. Self-location was estimated behaviorally, confirming previous data that self-location differed only between the two body conditions. fMRI results showed bilateral activation of the temporo-parietal cortex, with a significantly higher BOLD signal increase in the synchronous/body condition than in the other conditions. The sensorimotor cortex and the extrastriate body area were also activated. We argue that temporo-parietal activity reflects the experience of the conscious 'I' as embodied and localized within bodily space, compatible with clinical data in neurological patients with out-of-body experiences.
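The design above is a 2x2 factorial (body/no-body x synchronous/asynchronous stroking), with the key result being a larger temporo-parietal BOLD response in the synchronous/body cell than in the other three. A minimal sketch of that contrast on entirely synthetic per-participant estimates (all values, sample size, and effect sizes here are invented for illustration, not the authors' data or analysis pipeline):

```python
import numpy as np

# Synthetic per-participant BOLD estimates (arbitrary units) for the
# 2x2 design: body/no-body x synchronous/asynchronous stroking.
rng = np.random.default_rng(0)
n = 12  # hypothetical number of participants

conditions = {
    "body_sync": rng.normal(1.0, 0.3, n),     # assumed elevated response
    "body_async": rng.normal(0.5, 0.3, n),
    "nobody_sync": rng.normal(0.5, 0.3, n),
    "nobody_async": rng.normal(0.5, 0.3, n),
}

# Contrast mirroring the reported effect: synchronous/body versus the
# mean of the other three conditions, computed per participant.
others = np.mean([conditions["body_async"],
                  conditions["nobody_sync"],
                  conditions["nobody_async"]], axis=0)
contrast = conditions["body_sync"] - others
print(f"mean contrast = {contrast.mean():.3f}")
```

In a real analysis this contrast would be computed on first-level GLM beta estimates and tested at the group level; the sketch only shows the shape of the comparison.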

Abstract:

Current models of brain organization include multisensory interactions at early processing stages and within low-level, including primary, cortices. Embracing this model with regard to auditory-visual (AV) interactions in humans remains problematic. Controversy surrounds the application of an additive model to the analysis of event-related potentials (ERPs), and conventional ERP analysis methods have yielded discordant latencies of effects and permitted limited neurophysiologic interpretability. While hemodynamic imaging and transcranial magnetic stimulation studies provide general support for the above model, the precise timing, superadditive/subadditive directionality, topographic stability, and sources remain unresolved. We recorded ERPs in humans to attended but task-irrelevant stimuli that did not require an overt motor response, thereby circumventing paradigmatic caveats. We applied novel ERP signal analysis methods to provide details concerning the likely bases of AV interactions. First, nonlinear interactions occur at 60-95 ms after stimulus and are the consequence of topographic, rather than pure strength, modulations in the ERP. AV stimuli engage distinct configurations of intracranial generators, rather than simply modulating the amplitude of unisensory responses. Second, source estimations (and statistical analyses thereof) identified primary visual, primary auditory, and posterior superior temporal regions as mediating these effects. Finally, scalar values of current densities in all of these regions exhibited functionally coupled, subadditive nonlinear effects, a pattern consistent with mounting evidence in nonhuman primates. In these ways, we demonstrate how neurophysiologic bases of multisensory interactions can be noninvasively identified in humans, allowing for a synthesis across imaging methods on the one hand and species on the other.
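The additive model referenced above tests for nonlinear multisensory interaction by comparing the response to the AV pair against the sum of the unisensory responses: any nonzero difference ERP(AV) - [ERP(A) + ERP(V)] implies an interaction, with a negative difference being subadditive. A toy sketch on synthetic waveforms (the Gaussian ERP shapes, amplitudes, latencies, and subadditivity factor are all invented; this is not the authors' analysis):

```python
import numpy as np

t = np.linspace(0, 200, 201)  # ms post-stimulus

def erp(amplitude, latency, width=20.0):
    """Toy unisensory ERP: a single Gaussian deflection (hypothetical)."""
    return amplitude * np.exp(-((t - latency) ** 2) / (2 * width ** 2))

erp_a = erp(2.0, 90)             # auditory-alone response
erp_v = erp(1.5, 110)            # visual-alone response
erp_av = 0.8 * (erp_a + erp_v)   # assumed subadditive AV response

# Additive-model test: nonzero residual implies nonlinear interaction.
interaction = erp_av - (erp_a + erp_v)

# Inspect the 60-95 ms window reported in the abstract.
window = (t >= 60) & (t <= 95)
mean_int = interaction[window].mean()
print(f"mean interaction, 60-95 ms: {mean_int:.3f} (subadditive if < 0)")
```

Note that the abstract's central methodological point is that strength-based tests like this are insufficient on their own: the observed effects were topographic (different generator configurations), which requires map-based analyses rather than amplitude comparison alone.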