37 results for multisensory


Relevance: 10.00%

Abstract:

The processing of biological motion is a critical, everyday task performed with remarkable efficiency by human sensory systems. Interest in this ability has focused to a large extent on biological motion processing in the visual modality (see, for example, Cutting, J. E., Moore, C., & Morrison, R. (1988). Masking the motions of human gait. Perception and Psychophysics, 44(4), 339-347). In naturalistic settings, however, biological motion is often defined by input to more than one sensory modality. For this reason, in a series of experiments we investigate behavioural correlates of multisensory, in particular audiovisual, integration in the processing of biological motion cues. More specifically, using a new psychophysical paradigm we investigate the effect of suprathreshold auditory motion on the perception of visually defined biological motion. Unlike data from previous studies investigating audiovisual integration in linear motion processing [Meyer, G. F. & Wuerger, S. M. (2001). Cross-modal integration of auditory and visual motion signals. Neuroreport, 12(11), 2557-2560; Wuerger, S. M., Hofbauer, M., & Meyer, G. F. (2003). The integration of auditory and motion signals at threshold. Perception and Psychophysics, 65(8), 1188-1196; Alais, D. & Burr, D. (2004). No direction-specific bimodal facilitation for audiovisual motion detection. Cognitive Brain Research, 19, 185-194], we report the existence of direction-selective effects: relative to control (stationary) auditory conditions, auditory motion in the same direction as the visually defined biological motion target increased its detectability, whereas auditory motion in the opposite direction had the inverse effect. Our data suggest these effects do not arise through general shifts in visuo-spatial attention, but are instead a consequence of motion-sensitive, direction-tuned integration mechanisms that are, if not unique to biological visual motion, at least not common to all types of visual motion. Based on these data and on evidence from neurophysiological and neuroimaging studies, we discuss the neural mechanisms likely to underlie this effect.

Relevance: 10.00%

Abstract:

Evidence of multisensory interactions within low-level cortices and at early post-stimulus latencies has prompted a paradigm shift in conceptualizations of sensory organization. However, the mechanisms of these interactions and their link to behavior remain largely unknown. One behaviorally salient stimulus is a rapidly approaching (looming) object, which can indicate a potential threat. Based on findings from humans and nonhuman primates suggesting selective multisensory (auditory-visual) integration of looming signals, we tested whether looming sounds would selectively modulate the excitability of visual cortex. We combined transcranial magnetic stimulation (TMS) over the occipital pole with psychophysics for "neurometric" and psychometric assays of changes in low-level visual cortex excitability (i.e., phosphene induction) and perception, respectively. Across three experiments we show that structured looming sounds considerably enhance visual cortex excitability relative to other sound categories and white-noise controls. The time course of this effect showed that the modulation of visual cortex excitability by looming versus stationary sounds already differed for sound portions of very short duration (80 ms), which is significantly shorter (by 35 ms) than the perceptual discrimination threshold. Visual perception is thus rapidly and efficiently boosted by sounds through early, preperceptual and stimulus-selective modulation of neuronal excitability within low-level visual cortex.
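
As a generic illustration of the kind of "neurometric" assay mentioned above, the following Python sketch fits a cumulative-Gaussian psychometric function to hypothetical phosphene-report rates as a function of TMS intensity in order to estimate a phosphene threshold. This is not the study's procedure; all names and numbers are invented for illustration.

    # Hypothetical example: fit a cumulative-Gaussian psychometric function to
    # phosphene-report rates vs. TMS intensity and read off a threshold estimate.
    # All values are placeholders, not data from the study.
    import numpy as np
    from scipy.optimize import curve_fit
    from scipy.stats import norm

    intensity = np.array([50, 55, 60, 65, 70, 75])              # % of maximum stimulator output
    p_report = np.array([0.05, 0.15, 0.40, 0.70, 0.90, 0.97])   # proportion of trials with a phosphene

    def psychometric(x, mu, sigma):
        # cumulative Gaussian: mu = 50% threshold, sigma = slope parameter
        return norm.cdf(x, loc=mu, scale=sigma)

    (mu, sigma), _ = curve_fit(psychometric, intensity, p_report, p0=[60.0, 5.0])
    print(f"estimated phosphene threshold: {mu:.1f}% of maximum stimulator output")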

Relevance: 10.00%

Abstract:

BACKGROUND: Living in a multisensory world entails the continuous sensory processing of environmental information in order to enact appropriate motor routines. The interaction between our body and our brain is the crucial factor in achieving such sensorimotor integration. Several clinical conditions dramatically disrupt this constant body-brain exchange, but the latest developments in biomedical engineering provide promising solutions for overcoming the communication breakdown. NEW METHOD: Recent technological developments have succeeded in transforming neuronal electrical activity into computational input for robotic devices, giving birth to the era of so-called brain-machine interfaces. Combining rehabilitation robotics and experimental neuroscience, the introduction of brain-machine interfaces into clinical protocols has provided a technological solution for bypassing the neural disconnection and restoring sensorimotor function. RESULTS: Based on these advances, the recovery of sensorimotor functionality is progressively becoming a concrete reality. However, despite the success of several recent techniques, some open issues still need to be addressed. COMPARISON WITH EXISTING METHOD(S): Typical interventions for sensorimotor deficits include pharmaceutical treatments and manual/robotic assistance of passive movements. These procedures achieve symptom relief, but their applicability to more severe disconnection pathologies (e.g. spinal cord injury or amputation) is limited. CONCLUSIONS: Here we review how state-of-the-art solutions in biomedical engineering are continuously raising expectations in sensorimotor rehabilitation, as well as the current challenges, especially with regard to translating brain-machine interface signals into sensory feedback and incorporating brain-machine interfaces into daily activities.

Relevance: 10.00%

Abstract:

In the field of perception, learning is constrained by a functional architecture of distributed, highly specialized cortical areas. In the domain of visual deficits of cerebral origin, the learning of a patient with hemianopia or visual agnosia is limited by his or her residual perceptual abilities, but a visual recognition deficit of apparently perceptual nature may also be associated with an alteration of representations in long-term memory. Distinct neural networks for sound recognition (temporal cortex) and sound localisation (parietal cortex) have been described in humans: activation studies in normal subjects and psychophysical investigations in patients with focal hemispheric lesions have shown that auditory information relevant to sound recognition and information relevant to sound localisation are processed in parallel, anatomically distinct cortical networks, often referred to as the "What" and "Where" processing streams. The study of brain-damaged patients confirms the role of spatial cues in explicit auditory processing of the "where" and in implicit discrimination of the "what". Parallel processing may appear counterintuitive from the point of view of a unified perception of the auditory world, but it offers advantages such as rapidity of processing within a single stream, adaptability in perceptual learning, and ease of multisensory interactions; this organization, similar to that described in the visual modality, is thought to facilitate perceptual learning. More generally, implicit learning mechanisms account for the non-conscious acquisition of a large part of our knowledge about the world, by making us sensitive, without our awareness, to the rules and regularities structuring our environment. Implicit learning is involved in cognitive development, in the formation of emotional reactions and in the young child's acquisition of its native language. The non-conscious nature of this learning is confirmed by serial reaction time and artificial grammar learning paradigms in amnesic patients, showing that implicit learning mechanisms are not sustained by the cognitive processes and brain structures that are damaged in amnesia. From a clinical perspective, the assessment of implicit learning abilities in amnesic patients could therefore be critical for building adapted neuropsychological rehabilitation programs.

Relevance: 10.00%

Abstract:

Multisensory interactions have been documented within low-level, even primary, cortices and at early post-stimulus latencies. These effects are in turn linked to behavioral and perceptual modulations. In humans, visual cortex excitability, as measured by transcranial magnetic stimulation (TMS) induced phosphenes, can be reliably enhanced by the co-presentation of sounds. This enhancement occurs at pre-perceptual stages and is selective for different types of complex sounds. However, the source(s) of auditory inputs effectuating these excitability changes in primary visual cortex remain disputed. The present study sought to determine if direct connections between low-level auditory cortices and primary visual cortex are mediating these kinds of effects by varying the pitch and bandwidth of the sounds co-presented with single-pulse TMS over the occipital pole. Our results from 10 healthy young adults indicate that both the central frequency and bandwidth of a sound independently affect the excitability of visual cortex during processing stages as early as 30 msec post-sound onset. Such findings are consistent with direct connections mediating early-latency, low-level multisensory interactions within visual cortices.
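
The stimulus manipulation described above (sounds varying independently in central frequency and bandwidth) can be sketched in code; the following Python fragment is only a hypothetical illustration of how such band-limited noise bursts might be generated, and none of the parameter values are taken from the study.

    # Hypothetical example: band-pass-filtered white noise whose centre frequency
    # ("pitch") and bandwidth can be varied independently. All values are placeholders.
    import numpy as np
    from scipy.signal import butter, filtfilt

    def band_noise(center_hz, bandwidth_hz, dur_s=0.5, fs=44100):
        """White noise band-pass filtered around center_hz with the given bandwidth."""
        n = int(dur_s * fs)
        noise = np.random.randn(n)
        low = max(center_hz - bandwidth_hz / 2.0, 1.0)
        high = min(center_hz + bandwidth_hz / 2.0, fs / 2.0 - 1.0)
        b, a = butter(4, [low / (fs / 2.0), high / (fs / 2.0)], btype="band")
        filtered = filtfilt(b, a, noise)
        return filtered / np.max(np.abs(filtered))  # peak-normalised waveform

    # Same bandwidth at two centre frequencies, plus a broader-band variant
    low_pitch = band_noise(center_hz=500, bandwidth_hz=200)
    high_pitch = band_noise(center_hz=2000, bandwidth_hz=200)
    broad_band = band_noise(center_hz=1000, bandwidth_hz=800)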

Relevance: 10.00%

Abstract:

Summary: Four distinct olfactory subsystems compose the mouse olfactory system: the main olfactory epithelium (MOE), the septal organ of Masera (SO), the vomeronasal organ (VNO) and the Grueneberg ganglion (GG). They contribute to the sensory modalities of the animal and evolved to analyse and discriminate molecules carrying chemical messages, such as odorants and pheromones. In this thesis, the VNO, principally implicated in pheromonal communication, as well as the GG, to which no function had been attributed before this work, were investigated from their morphology to their physiological functions, using an array of biochemical and physiological methods. First, the roles of a particular protein, the CNGA4 ion channel, were investigated in the VNO. In the MOE, CNGA4 is expressed as a modulatory channel subunit implicated in odour discrimination and adaptation. Interestingly, this calcium channel is the only member of the cyclic nucleotide-gated (CNG) family to be expressed in the VNO, and until this work its functions there remained unknown. Using a combination of transgenic and knockout mice, as well as histological and physiological approaches, we characterized CNGA4 expression in the VNO. Strong expression was found in immature neurons as well as in the microvilli of mature neurons (the putative site of chemodetection). Consistent with this dual localisation, genetic invalidation of the CNGA4 channel results in a strong impairment of vomeronasal neuron maturation as well as a deficit in pheromone sensing. The CNGA4 channel thus appears to be a multifunctional protein playing essential roles in the mouse VNO. In the second part of the work, the morphology of the most recently described olfactory subsystem, the Grueneberg ganglion, was investigated in detail. We found that this olfactory ganglion is composed of glial cells and ciliated neurons, a morphology resembling that of the olfactory AWC neurons of C. elegans, which were therefore used for further comparisons. As with AWC neurons, we found that GG neurons are sensitive to temperature changes and are able to detect highly volatile molecules. Indeed, alarm pheromones (APs) secreted by stressed mice elicit strong cellular responses as well as GG-dependent behavioural changes. Investigation of the signalling elements present in GG neurons revealed that, as in AWC neurons or pGC-D-expressing neurons of the MOE, proteins participating in a cGMP pathway, such as pGC-G and the CNGA3 channel, are expressed in GG neurons. These two proteins might be implicated in both chemosensing and thermosensing, the two apparent properties of this organ. In this thesis, the multisensory modalities of two mouse olfactory subsystems were thus described, reflecting the high degree of complexity required for the animal to sense its environment.

Relevance: 10.00%

Abstract:

Single-trial encounters with multisensory stimuli affect both memory performance and early-latency brain responses to visual stimuli. Whether and how auditory cortices support memory processes based on single-trial multisensory learning is unknown and may differ qualitatively and quantitatively from comparable processes within visual cortices due to purported differences in memory capacities across the senses. We recorded event-related potentials (ERPs) as healthy adults (n = 18) performed a continuous recognition task in the auditory modality, discriminating initial (new) from repeated (old) sounds of environmental objects. Initial presentations were either unisensory or multisensory; the latter entailed synchronous presentation of a semantically congruent or a meaningless image. Repeated presentations were exclusively auditory, thus differing only according to the context in which the sound was initially encountered. Discrimination abilities (indexed by d') were increased for repeated sounds that were initially encountered with a semantically congruent image versus sounds initially encountered with either a meaningless or no image. Analyses of ERPs within an electrical neuroimaging framework revealed that early stages of auditory processing of repeated sounds were affected by prior single-trial multisensory contexts. These effects followed from significantly reduced activity within a distributed network, including the right superior temporal cortex, suggesting an inverse relationship between brain activity and behavioural outcome on this task. The present findings demonstrate how auditory cortices contribute to long-term effects of multisensory experiences on auditory object discrimination. We propose a new framework for the efficacy of multisensory processes to impact both current multisensory stimulus processing and unisensory discrimination abilities later in time.
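
The discrimination measure reported above (d') is the standard signal-detection sensitivity index. As a generic illustration rather than the study's analysis, the following Python sketch computes d' for an old/new recognition task from hit and false-alarm counts; all counts are made up.

    # Hypothetical example: d' = z(hit rate) - z(false-alarm rate), with a
    # log-linear correction so that rates of exactly 0 or 1 do not produce
    # infinite z-scores. The counts below are invented.
    from scipy.stats import norm

    def d_prime(hits, misses, false_alarms, correct_rejections):
        hit_rate = (hits + 0.5) / (hits + misses + 1.0)
        fa_rate = (false_alarms + 0.5) / (false_alarms + correct_rejections + 1.0)
        return norm.ppf(hit_rate) - norm.ppf(fa_rate)

    # e.g. repeated sounds initially paired with a congruent image vs. presented alone
    print(d_prime(hits=42, misses=8, false_alarms=10, correct_rejections=40))
    print(d_prime(hits=36, misses=14, false_alarms=10, correct_rejections=40))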