998 results for Cross-modal
Abstract:
Approaching or looming sounds (L-sounds) have been shown to selectively increase visual cortex excitability [Romei, V., Murray, M. M., Cappe, C., & Thut, G. Preperceptual and stimulus-selective enhancement of low-level human visual cortex excitability by sounds. Current Biology, 19, 1799-1805, 2009]. These cross-modal effects start at an early, preperceptual stage of sound processing and persist with increasing sound duration. Here, we identified individual factors contributing to cross-modal effects on visual cortex excitability and studied the persistence of effects after sound offset. To this end, we probed the impact of different L-sound velocities on phosphene perception post-sound as a function of individual auditory versus visual preference/dominance, using single-pulse TMS over the occipital pole. We found that the boosting of phosphene perception by L-sounds continued for several tens of milliseconds after the end of the L-sound and was temporally sensitive to different L-sound profiles (velocities). In addition, we found that this depended on an individual's preferred sensory modality (auditory vs. visual) as determined through a divided attention task (attentional preference), but not on their simple threshold detection level per sensory modality. Whereas individuals with "visual preference" showed enhanced phosphene perception irrespective of L-sound velocity, those with "auditory preference" showed differential peaks in phosphene perception whose delays after sound offset followed the different L-sound velocity profiles. These novel findings suggest that looming signals modulate visual cortex excitability beyond sound duration, possibly to support prompt identification of, and reaction to, potentially dangerous approaching objects. The observed interindividual differences favor the idea that, unlike early effects, this late L-sound impact on visual cortex excitability is influenced by cross-modal attentional mechanisms rather than low-level sensory processes.
Abstract:
A variety of studies have demonstrated enhanced blood oxygenation level dependent (BOLD) responses to auditory and tactile stimuli within occipital cortex as a result of early blindness. However, little is known about the organizational principles that drive this cross-modal plasticity. We compared BOLD responses to a wide variety of auditory and tactile tasks (vs. rest) in early-blind and sighted subjects. As expected, cross-modal responses were larger in blind than in sighted subjects in occipital cortex for all tasks (cross-modal plasticity). Within both blind and sighted subject groups, we found patterns of cross-modal activity that were remarkably similar across tasks: a large proportion of cross-modal responses within occipital cortex are neither task- nor stimulus-specific. We next examined the mechanisms underlying enhanced BOLD responses within early-blind subjects. We found that the enhancement of cross-modal responses due to early blindness was best described as an additive shift, suggesting that cross-modal plasticity within blind subjects does not originate from either a scaling or unmasking of cross-modal responsivities found in sighted subjects.
Abstract:
In this research, a cross-modal paradigm was chosen to test the hypothesis that affective olfactory and auditory cues paired with neutral visual stimuli bearing no resemblance or logical connection to the affective cues can evoke preference shifts in those stimuli. Neutral visual stimuli of abstract paintings were presented simultaneously with liked and disliked odours and sounds, with neutral-neutral pairings serving as controls. The results confirm previous findings that the affective evaluation of previously neutral visual stimuli shifts in the direction of contingently presented affective auditory stimuli. In addition, this research shows the presence of conditioning with affective odours having no logical connection with the pictures.
Abstract:
Synesthesia entails a special kind of sensory perception, where stimulation in one sensory modality leads to an internally generated perceptual experience of another, non-stimulated sensory modality. This phenomenon can be viewed as an abnormal multisensory integration process, as here the synesthetic percept is aberrantly fused with the stimulated modality. Indeed, recent synesthesia research has focused on multimodal processing even outside of the specific synesthesia-inducing context and has revealed altered multimodal integration, thus suggesting perceptual alterations at a global level. Here, we focused on audio-visual processing in synesthesia using a semantic classification task in combination with visually or audio-visually presented animate and inanimate objects, shown in an audio-visually congruent or incongruent manner. Fourteen subjects with auditory-visual and/or grapheme-color synesthesia and 14 control subjects participated in the experiment. During presentation of the stimuli, event-related potentials were recorded from 32 electrodes. The analysis of reaction times and error rates revealed no group differences, with best performance for audio-visually congruent stimulation, indicating the well-known multimodal facilitation effect. We found enhanced amplitude of the N1 component over occipital electrode sites for synesthetes compared to controls. The differences occurred irrespective of the experimental condition and therefore suggest a global influence on early sensory processing in synesthetes.
Abstract:
During stereotactic functional neurosurgery, the stimulation procedure used to confirm correct target localization provides a unique opportunity to investigate pathophysiological phenomena that cannot be addressed in experimental setups. Here we report on the distribution of response modalities to 487 intraoperative thalamic stimulations performed in 24 neurogenic pain (NP), 17 parkinsonian (PD) and 10 neuropsychiatric (Npsy) patients. Threshold responses were subdivided into somatosensory, motor and affective, and compared between medial (central lateral nucleus) and lateral (ventral anterior, ventral lateral and ventral medial) thalamic nuclei and between patient groups. Major findings were as follows: in the medial thalamus, evoked responses were predominantly somatosensory (95%) in NP patients, motor (47%) in PD patients, and affective (54%) in Npsy patients. In the lateral thalamus, a much higher proportion of somatosensory (83%) than motor responses (5%) was evoked in NP patients, while the proportion was reversed in PD patients (69% motor vs. 21% somatosensory). These results provide the first evidence for functional cross-modal changes in lateral and medial thalamic nuclei in response to intraoperative stimulations in different functional disorders. This extensive functional reorganization sheds new light on wide-range plasticity in the adult human thalamocortical system.
Abstract:
Background: There is evidence showing that men and women differ with regard to the processing of emotional information. However, the mechanisms behind these differences are not fully understood. Method: The sample comprised 275 (167 female) right-handed, healthy participants, recruited from the community. We employed a customized affective priming task, which consisted of three subtests differing in the modality of the prime (face, written word, and sound). The targets were always written words of either positive or negative valence. The priming effect was measured as reaction time facilitation in conditions where both prime and target were emotional (of the same positive or negative valence) compared with conditions where the emotional targets were preceded by neutral primes. Results: The priming effect was observed across all three modalities, with an interaction of gender by valence: the priming effect in the emotionally negative condition was stronger in male participants than in females. This was accounted for by the differential priming effect within the female group, where priming was significantly smaller in the emotionally negative conditions compared with the positive conditions. The male participants revealed a comparable priming effect across both the emotionally negative and positive conditions. Conclusion: Reduced priming in negative conditions in women may reflect interference processes due to greater sensitivity to the negative valence of stimuli. This in turn could underlie the gender-related differences in susceptibility to emotional disorders.
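The priming-effect measure described above (reaction time facilitation relative to a neutral-prime baseline) can be sketched as follows. All condition names and millisecond values below are hypothetical illustrations, not data from the study.

```python
# Hypothetical mean reaction times (ms) per prime/target condition.
# The numbers are invented for illustration only.
mean_rt = {
    ("neutral_prime", "negative_target"): 640.0,
    ("negative_prime", "negative_target"): 598.0,
    ("neutral_prime", "positive_target"): 630.0,
    ("positive_prime", "positive_target"): 601.0,
}

def priming_effect(valence: str) -> float:
    """Facilitation for targets of the given valence: RT after a
    neutral prime minus RT after an emotionally congruent prime.
    Larger values mean stronger priming."""
    neutral = mean_rt[("neutral_prime", f"{valence}_target")]
    congruent = mean_rt[(f"{valence}_prime", f"{valence}_target")]
    return neutral - congruent

negative_priming = priming_effect("negative")  # 640 - 598 = 42.0 ms
positive_priming = priming_effect("positive")  # 630 - 601 = 29.0 ms
```

With the pattern the abstract reports for female participants, the negative-condition facilitation would be smaller than the positive one; the invented numbers above instead show the male pattern of comparable (here, somewhat larger negative) facilitation, purely to demonstrate the computation.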
Abstract:
Resolution of multisensory deficits has been observed in teenagers with Autism Spectrum Disorders (ASD) for complex, social speech stimuli; this resolution extends to more basic multisensory processing involving low-level stimuli. In particular, a delayed transition of multisensory integration (MSI) from a default state of competition to one of facilitation has been observed in ASD children. In other words, the complete maturation of MSI is achieved later in ASD. In the present study a neuro-computational model is used to reproduce some patterns of behavior observed experimentally, modeling a bisensory reaction time task in which auditory and visual stimuli are presented in random sequence, alone (A or V) or together (AV). The model explains how the default competitive state can be implemented via mutual inhibition between primary sensory areas, and how the shift toward the classical multisensory facilitation observed in adults results from inhibitory cross-modal connections becoming excitatory during development. Model results are consistent with stronger cross-modal inhibition in ASD children compared with neurotypical (NT) ones, suggesting that the transition toward a cooperative interaction between sensory modalities takes longer to occur. Interestingly, the model also predicts the difference between unisensory switch trials (in which the sensory modality switches) and unisensory repeat trials (in which the sensory modality repeats). This is due to an inhibitory mechanism with slow dynamics, driven by the preceding stimulus, that inhibits the processing of the incoming one when it is of the opposite sensory modality. These findings link the cognitive framework delineated by the empirical results to a plausible neural implementation.
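The core mechanism described above (mutual coupling between two primary sensory areas whose sign determines competition versus facilitation) can be illustrated with a toy pair of leaky integrator units. This is a minimal sketch under invented parameters, not the authors' actual model: the coupling weight, time constant, and threshold are all illustrative.

```python
def simulated_rt(input_a: float, input_v: float, w_cross: float,
                 threshold: float = 0.7, tau: float = 0.05,
                 dt: float = 0.001, t_max: float = 1.0) -> float:
    """Two leaky integrator units (auditory, visual) coupled by a
    cross-modal weight w_cross (negative = mutual inhibition, as
    hypothesized for children; positive = mutual excitation, as in
    adults). Returns the time at which either unit first crosses
    threshold -- a proxy for reaction time -- or t_max if neither does.
    All parameter values are illustrative."""
    a = v = 0.0
    t = 0.0
    while t < t_max:
        da = (-a + input_a + w_cross * v) / tau * dt
        dv = (-v + input_v + w_cross * a) / tau * dt
        a = max(0.0, a + da)  # firing rates cannot go negative
        v = max(0.0, v + dv)
        if a >= threshold or v >= threshold:
            return t
        t += dt
    return t_max

rt_uni = simulated_rt(1.0, 0.0, w_cross=0.3)          # auditory alone
rt_av_adult = simulated_rt(1.0, 1.0, w_cross=0.3)     # excitatory coupling
rt_av_child = simulated_rt(1.0, 1.0, w_cross=-0.3)    # inhibitory coupling
```

With excitatory coupling, the bisensory (AV) input reaches threshold sooner than the unisensory one (multisensory facilitation); with the sign flipped to inhibition, the same AV input is slower than the unisensory case (competition), mirroring the developmental shift the model describes.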
Abstract:
Diminished balance ability poses a serious health risk due to the increased likelihood of falling, and impaired postural stability is significantly associated with blindness and poor vision. Noise stimulation (by improving the detection of sub-threshold somatosensory information) and tactile supplementation (i.e. additional haptic information provided by an external contact surface) have been shown to improve the performance of the postural control system. Moreover, vibratory noise added to the source of tactile supplementation (e.g. applied to a surface that the fingertip touches) has been shown to enhance balance stability more effectively than tactile supplementation alone. In view of the above findings, in addition to the well-established consensus that blind subjects show superior abilities in the use of tactile information, we hypothesized that blind subjects may derive extra benefit from the vibratory noise added to the tactile supplementation and hence show greater improvements in postural stability than those observed for sighted subjects. If confirmed, this hypothesis may lay the foundation for the development of noise-based assistive devices (e.g. canes, walking sticks) for improving somatosensation and hence preventing falls in blind individuals.
Abstract:
The coordination of movement is governed by a coalition of constraints. The expression of these constraints ranges from the concrete (the restricted range of motion offered by the mechanical configuration of our muscles and joints) to the abstract (the difficulty that we experience in combining simple movements into complex rhythms). We seek to illustrate that the various constraints on coordination are complementary and inclusive, and that their expression and interaction are mediated systematically by the integrative action of the central nervous system (CNS). Beyond identifying the general principles at the behavioural level that govern the mutual interplay of constraints, we attempt to demonstrate that these principles have as their foundation specific functional properties of the cortical motor systems. We propose that regions of the brain upstream of the motor cortex may play a significant role in mediating interactions between the functional representations of muscles engaged in sensorimotor coordination tasks. We also argue that activity in these "supramotor" regions may mediate the stabilising role of augmented sensory feedback.
Abstract:
The processing of biological motion is a critical, everyday task performed with remarkable efficiency by human sensory systems. Interest in this ability has focused to a large extent on biological motion processing in the visual modality (see, for example, Cutting, J. E., Moore, C., & Morrison, R. (1988). Masking the motions of human gait. Perception and Psychophysics, 44(4), 339-347). In naturalistic settings, however, it is often the case that biological motion is defined by input to more than one sensory modality. For this reason, here in a series of experiments we investigate behavioural correlates of multisensory, in particular audiovisual, integration in the processing of biological motion cues. More specifically, using a new psychophysical paradigm we investigate the effect of suprathreshold auditory motion on perceptions of visually defined biological motion. Unlike data from previous studies investigating audiovisual integration in linear motion processing [Meyer, G. F. & Wuerger, S. M. (2001). Cross-modal integration of auditory and visual motion signals. Neuroreport, 12(11), 2557-2560; Wuerger, S. M., Hofbauer, M., & Meyer, G. F. (2003). The integration of auditory and motion signals at threshold. Perception and Psychophysics, 65(8), 1188-1196; Alais, D. & Burr, D. (2004). No direction-specific bimodal facilitation for audiovisual motion detection. Cognitive Brain Research, 19, 185-194], we report the existence of direction-selective effects: relative to control (stationary) auditory conditions, auditory motion in the same direction as the visually defined biological motion target increased its detectability, whereas auditory motion in the opposite direction had the inverse effect. 
Our data suggest these effects do not arise through general shifts in visuo-spatial attention, but instead are a consequence of motion-sensitive, direction-tuned integration mechanisms that are, if not unique to biological visual motion, at least not common to all types of visual motion. Based on these data and evidence from neurophysiological and neuroimaging studies we discuss the neural mechanisms likely to underlie this effect.
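The direction-selective effect reported above (congruent auditory motion raising, and opposite motion lowering, the detectability of the biological motion target) can be illustrated with a standard signal-detection sensitivity index. The hit and false-alarm rates below are invented to reproduce the qualitative pattern, not values from the study.

```python
from statistics import NormalDist

def d_prime(hit_rate: float, fa_rate: float) -> float:
    """Signal-detection sensitivity: z(hit rate) - z(false-alarm rate)."""
    z = NormalDist().inv_cdf
    return z(hit_rate) - z(fa_rate)

# Hypothetical rates illustrating the reported ordering of conditions.
d_congruent = d_prime(0.82, 0.10)  # auditory motion in the target's direction
d_static = d_prime(0.75, 0.10)     # stationary auditory control
d_opposite = d_prime(0.66, 0.10)   # auditory motion in the opposite direction
# d_congruent > d_static > d_opposite: congruent sound raises detectability,
# opposite-direction sound lowers it relative to the stationary baseline.
```

`statistics.NormalDist` (Python 3.8+) supplies the inverse normal CDF, so no third-party dependencies are needed.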
Abstract:
Modern cochlear implantation technologies allow deaf patients to understand auditory speech; however, the implants deliver only a coarse auditory input, and patients must use long-term adaptive processes to achieve coherent percepts. In adults with post-lingual deafness, the greatest progress in speech recovery is observed during the first year after cochlear implantation, but there is a large range of variability in the level of cochlear implant outcomes and the temporal evolution of recovery. It has been proposed that when profoundly deaf subjects receive a cochlear implant, the visual cross-modal reorganization of the brain is deleterious for auditory speech recovery. We tested this hypothesis in post-lingually deaf adults by analysing whether brain activity shortly after implantation correlated with the level of auditory recovery 6 months later. Based on brain activity induced by a speech-processing task, we found strong positive correlations in areas outside the auditory cortex. The highest positive correlations were found in the occipital cortex involved in visual processing, as well as in the posterior-temporal cortex known for audio-visual integration. A further area that correlated positively with auditory speech recovery was localized in the left inferior frontal area known for speech processing. Our results demonstrate that the functional level of the visual modality is related to the proficiency level of auditory recovery. Based on the positive correlation of visual activity with auditory speech recovery, we suggest that the visual modality may facilitate the perception of the word's auditory counterpart in communicative situations. The link demonstrated between visual activity and auditory speech perception indicates that visuoauditory synergy is crucial for cross-modal plasticity and for fostering speech-comprehension recovery in adult cochlear-implanted deaf patients.
Abstract:
Early blindness results in occipital cortex neurons responding to a wide range of auditory and tactile stimuli. These changes in tuning properties are accompanied by an extensive reorganization of the occipital cortex that includes alterations in anatomical structure and in neurochemical and metabolic pathways. Although it has been established in animal models that neurochemical pathways are heavily affected by early visual deprivation, the effects of blindness on these pathways in humans are still not well characterized. Here, using ¹H magnetic resonance spectroscopy in nine early blind and normally sighted subjects, we find that early blindness is associated with higher levels of creatine, choline and myo-inositol, and indications of lower levels of GABA, within the occipital cortex. These results suggest that the cross-modal responses associated with early blindness may, at least in part, be driven by changes within occipital biochemical pathways.
Abstract:
Vision provides a primary sensory input for food perception. It raises expectations about taste and nutritional value and drives acceptance or rejection. So far, the impact of visual food cues varying in energy content on subsequent taste integration has remained unexplored. Using electrical neuroimaging, we assessed whether high- and low-calorie food cues differentially influence the brain processing and perception of a subsequent neutral electric taste. When viewing high-calorie food images, participants reported the subsequent taste to be more pleasant than when low-calorie food images preceded the identical taste. Moreover, the taste-evoked neural activity was stronger in the bilateral insula and the adjacent frontal operculum (FOP) within 100 ms after taste onset when preceded by high- versus low-calorie cues. A similar pattern evolved in the anterior cingulate cortex (ACC) and medial orbitofrontal cortex (OFC) around 180 ms, as well as in the right insula around 360 ms. The activation differences in the OFC correlated positively with changes in taste pleasantness, a finding that is in accord with the role of the OFC in the hedonic evaluation of taste. Later activation differences in the right insula likely indicate a revaluation of interoceptive taste awareness. Our findings reveal previously unknown mechanisms of cross-modal, visual-gustatory sensory interactions underlying food evaluation.