988 results for Auditory temporal processing
Abstract:
The role of GABA in the central processing of complex auditory signals is not fully understood. We have studied the involvement of GABA(A)-mediated inhibition in the processing of birdsong, a learned vocal communication signal requiring intact hearing for its development and maintenance. We focused on caudomedial nidopallium (NCM), an area analogous to parts of the mammalian auditory cortex with selective responses to birdsong. We present evidence that GABA(A)-mediated inhibition plays a pronounced role in NCM's auditory processing of birdsong. Using immunocytochemistry, we show that approximately half of NCM's neurons are GABAergic. Whole cell patch-clamp recordings in a slice preparation demonstrate that, at rest, spontaneously active GABAergic synapses inhibit excitatory inputs onto NCM neurons via GABA(A) receptors. Multi-electrode electrophysiological recordings in awake birds show that local blockade of GABA(A)-mediated inhibition in NCM markedly affects the temporal pattern of song-evoked responses in NCM without modifications in frequency tuning. Surprisingly, this blockade increases the phasic and largely suppresses the tonic response component, reflecting dynamic relationships of inhibitory networks that could include disinhibition. Thus processing of learned natural communication sounds in songbirds, and possibly other vocal learners, may depend on complex interactions of inhibitory networks.
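The distinction between phasic and tonic response components referred to above can be made concrete with a small analysis sketch: spike times are split into an onset window and a sustained window and converted to firing rates. The 50 ms onset window, the stimulus timing, and the simulated spike train below are illustrative assumptions, not parameters from the study.

```python
import numpy as np

def phasic_tonic_split(spike_times_s, stim_onset_s, stim_dur_s,
                       onset_win_s=0.050):
    """Split stimulus-evoked firing into phasic (onset) and tonic (sustained) rates.

    spike_times_s : 1-D array of spike times in seconds
    stim_onset_s  : stimulus onset time in seconds
    stim_dur_s    : stimulus duration in seconds
    onset_win_s   : width of the phasic window after onset (illustrative value)
    """
    t = np.asarray(spike_times_s) - stim_onset_s
    # Phasic component: spikes in the brief window right after stimulus onset.
    phasic_count = np.sum((t >= 0) & (t < onset_win_s))
    # Tonic component: spikes during the remainder of the stimulus.
    tonic_count = np.sum((t >= onset_win_s) & (t < stim_dur_s))
    phasic_rate = phasic_count / onset_win_s
    tonic_rate = tonic_count / (stim_dur_s - onset_win_s)
    return phasic_rate, tonic_rate

# Example: simulated spike train with a strong onset burst and weaker sustained firing.
rng = np.random.default_rng(0)
spikes = np.concatenate([rng.uniform(1.00, 1.05, 20),   # onset burst
                         rng.uniform(1.05, 3.00, 15)])  # sustained firing
print(phasic_tonic_split(spikes, stim_onset_s=1.0, stim_dur_s=2.0))
```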
Abstract:
Although the effects of cannabis on perception are well documented, little is known about their neural basis or how these may contribute to the formation of psychotic symptoms. We used functional magnetic resonance imaging (fMRI) to assess the effects of Delta-9-tetrahydrocannabinol (THC) and cannabidiol (CBD) during visual and auditory processing in healthy volunteers. In total, 14 healthy volunteers were scanned on three occasions. Identical 10 mg THC, 600 mg CBD, and placebo capsules were allocated in a balanced double-blinded pseudo-randomized crossover design. Plasma levels of each substance, physiological parameters, and measures of psychopathology were taken at baseline and at regular intervals following ingestion of substances. Volunteers listened passively to words read and viewed a radial visual checkerboard in alternating blocks during fMRI scanning. Administration of THC was associated with increases in anxiety, intoxication, and positive psychotic symptoms, whereas CBD had no significant symptomatic effects. THC decreased activation relative to placebo in bilateral temporal cortices during auditory processing, and increased and decreased activation in different visual areas during visual processing. CBD was associated with activation in right temporal cortex during auditory processing, and when contrasted, THC and CBD had opposite effects in the right posterior superior temporal gyrus, the right-sided homolog to Wernicke's area. Moreover, the attenuation of activation in this area (maximum 61, -15, -2) by THC during auditory processing was correlated with its acute effect on psychotic symptoms. Single doses of THC and CBD differently modulate brain function in areas that process auditory and visual stimuli and relate to induced psychotic symptoms. Neuropsychopharmacology (2011) 36, 1340-1348; doi:10.1038/npp.2011.17; published online 16 March 2011
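The balanced, pseudo-randomized crossover design mentioned above can be illustrated with a small counterbalancing sketch. The Latin-square ordering and the way volunteers are cycled through it below are assumptions made for illustration, not the study's actual allocation procedure.

```python
import random

# Three treatments given on three separate scanning occasions.
# A 3x3 Latin square: every treatment appears once per session position,
# which balances order effects across the sample (illustrative scheme).
latin_square = [
    ["THC 10 mg", "CBD 600 mg", "placebo"],
    ["CBD 600 mg", "placebo", "THC 10 mg"],
    ["placebo", "THC 10 mg", "CBD 600 mg"],
]

def allocate(n_volunteers, seed=42):
    """Assign each volunteer one row of the Latin square, shuffling row order per block."""
    rng = random.Random(seed)
    rows = []
    while len(rows) < n_volunteers:
        block = latin_square[:]
        rng.shuffle(block)          # pseudo-randomize order within each block of 3
        rows.extend(block)
    return rows[:n_volunteers]

for i, order in enumerate(allocate(14), start=1):
    print(f"volunteer {i:2d}: {order}")
```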
Abstract:
Benign focal epilepsy in childhood with centro-temporal spikes (BECTS) is one of the most common forms of idiopathic epilepsy, with onset from age 3 to 14 years. Although the prognosis for children with BECTS is excellent, some studies have revealed neuropsychological deficits in many domains, including language. Auditory event-related potentials (AERPs) reflect activation of different neuronal populations and are suggested to contribute to the evaluation of auditory discrimination (N1), attention allocation and phonological categorization (N2), and echoic memory (mismatch negativity, MMN). The scarcity of existing literature on this theme motivated the present study, which aims to investigate and document AERP changes in a group of children with BECTS. AERPs were recorded during the day, in response to pure and vocal tones in a conventional auditory oddball paradigm, in five children with BECTS (aged 8-12; mean = 10 years; all male) and in six gender- and age-matched controls. Results revealed high AERP amplitudes in the group of children with BECTS, with a slight latency delay that was more pronounced at fronto-central electrodes. Children with BECTS may therefore have abnormal central auditory processing, reflected in electrophysiological measures such as AERPs. AERPs also appear to be a useful tool for detecting and reliably revealing cortical excitability in children with typical BECTS.
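As a rough illustration of the kind of stimulus sequence used in a conventional auditory oddball paradigm, the sketch below generates a pseudo-random train of frequent standard tones and rare deviant tones. The 85/15 split, the tone frequencies, and the constraint against consecutive deviants are illustrative assumptions, not the study's parameters.

```python
import random

def oddball_sequence(n_trials=400, p_deviant=0.15, seed=1):
    """Generate a standard/deviant trial sequence for an auditory oddball paradigm.

    Deviants are kept rare and never presented back-to-back, so that each deviant
    is preceded by at least one standard (a common, here illustrative, constraint).
    """
    rng = random.Random(seed)
    seq = []
    for _ in range(n_trials):
        is_deviant = rng.random() < p_deviant and (not seq or seq[-1] == "standard")
        seq.append("deviant" if is_deviant else "standard")
    return seq

# Map trial types to example tone frequencies (assumed values).
tone_hz = {"standard": 1000, "deviant": 1200}
seq = oddball_sequence()
print(f"{seq.count('deviant')} deviants out of {len(seq)} trials")
print([tone_hz[s] for s in seq[:20]])
```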
Abstract:
Interaural intensity and time differences (IID and ITD) are two binaural auditory cues for localizing sounds in space. This study investigated the spatio-temporal brain mechanisms for processing and integrating IID and ITD cues in humans. Auditory-evoked potentials were recorded while subjects passively listened to noise bursts lateralized with IID, ITD, or both cues simultaneously, as well as a more frequent centrally presented noise. In a separate psychophysical experiment, subjects actively discriminated lateralized from centrally presented stimuli. IID and ITD cues elicited different electric field topographies starting at approximately 75 ms post-stimulus onset, indicative of the engagement of distinct cortical networks. By contrast, no performance differences were observed between IID and ITD cues during the psychophysical experiment. Subjects did, however, respond significantly faster and more accurately when both cues were presented simultaneously. This performance facilitation exceeded predictions from probability summation, suggestive of interactions in the neural processing of IID and ITD cues. Supra-additive neural response interactions as well as topographic modulations were indeed observed approximately 200 ms post-stimulus for the comparison of responses to the simultaneous presentation of both cues with the mean of those to separate IID and ITD cues. Source estimations revealed differential processing of IID and ITD cues initially within superior temporal cortices and also at later stages within temporo-parietal and inferior frontal cortices. Differences were principally in terms of hemispheric lateralization. The collective psychophysical and electrophysiological results support the hypothesis that IID and ITD cues are processed by distinct, but interacting, cortical networks that can in turn facilitate auditory localization.
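The claim that facilitation exceeded predictions from probability summation can be made concrete with the standard independence bound: if each cue alone supports detection with some probability, independent combination predicts 1 - (1 - p_IID)(1 - p_ITD). The sketch below computes that bound for assumed accuracies; the numbers are illustrative, not the study's data.

```python
def probability_summation(p_iid, p_itd):
    """Predicted accuracy if IID and ITD cues are detected independently."""
    return 1.0 - (1.0 - p_iid) * (1.0 - p_itd)

# Illustrative single-cue accuracies (not the study's data).
p_iid, p_itd = 0.80, 0.82
predicted = probability_summation(p_iid, p_itd)
observed = 0.97  # hypothetical combined-cue accuracy

print(f"probability-summation prediction: {predicted:.3f}")
if observed > predicted:
    # Accuracy above this bound is what motivates inferring neural interaction of the cues.
    print("observed combined-cue accuracy exceeds the independence prediction")
```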
Abstract:
Abstract based on that of the publication. Abstract in Spanish.
Abstract:
It has previously been demonstrated that masking a speech target with a speech masker is associated with extensive activation in the dorsolateral temporal lobes, consistent with the hypothesis that competition for central auditory processes is an important factor in informational masking. Here, masking from speech and two additional maskers derived from the original speech were investigated. One of these is spectrally rotated speech, which is unintelligible and has a similar (inverted) spectrotemporal profile to speech. The authors also controlled for the possibility of “glimpsing” of the target signal during modulated masking sounds by using speech-modulated noise as a masker in a baseline condition. Functional imaging results reveal that masking speech with speech leads to bilateral superior temporal gyrus (STG) activation relative to a speech-in-noise baseline, whereas masking speech with spectrally rotated speech leads solely to right STG activation relative to the baseline. This result is discussed in terms of hemispheric asymmetries for speech perception, and interpreted as showing that masking effects can arise through two parallel neural systems, in the left and right temporal lobes. This has implications for the competition for resources caused by speech and rotated speech maskers, and may illuminate some of the mechanisms involved in informational masking.
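Spectral rotation, the manipulation mentioned above that renders speech unintelligible while preserving an inverted spectrotemporal profile, can be sketched as mirroring the spectrum of a band-limited signal. The FFT-based mirroring and the 4 kHz band limit below are illustrative assumptions, not the authors' exact procedure.

```python
import numpy as np

def spectrally_rotate(signal, fs, band_hz=4000.0):
    """Mirror the spectrum of a band-limited signal so low and high frequencies swap.

    The signal is assumed to contain no energy above band_hz; the rotation is
    implemented by reversing the FFT bins within 0..band_hz (illustrative method).
    """
    n = len(signal)
    spec = np.fft.rfft(signal)
    freqs = np.fft.rfftfreq(n, d=1.0 / fs)
    band = freqs <= band_hz
    # Reverse the in-band bins: energy at frequency f moves to roughly band_hz - f.
    spec[band] = spec[band][::-1]
    return np.fft.irfft(spec, n=n)

# Example: a 500 Hz tone sampled at 16 kHz ends up near 3500 Hz after rotation.
fs = 16000
t = np.arange(0, 0.5, 1.0 / fs)
tone = np.sin(2 * np.pi * 500 * t)
rotated = spectrally_rotate(tone, fs)
peak_hz = np.fft.rfftfreq(len(rotated), 1.0 / fs)[np.argmax(np.abs(np.fft.rfft(rotated)))]
print(f"dominant frequency after rotation: {peak_hz:.0f} Hz")
```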
Abstract:
The amygdala has been studied extensively for its critical role in associative fear conditioning in animals and humans. Noxious stimuli, such as those used for fear conditioning, are most effective in eliciting behavioral responses and amygdala activation when experienced in an unpredictable manner. Here, we show, using a translational approach in mice and humans, that unpredictability per se without interaction with motivational information is sufficient to induce sustained neural activity in the amygdala and to elicit anxiety-like behavior. Exposing mice to mere temporal unpredictability within a time series of neutral sound pulses in an otherwise neutral sensory environment increased expression of the immediate-early gene c-fos and prevented rapid habituation of single neuron activity in the basolateral amygdala. At the behavioral level, unpredictable, but not predictable, auditory stimulation induced avoidance and anxiety-like behavior. In humans, functional magnetic resonance imaging revealed that temporal unpredictability causes sustained neural activity in the amygdala and anxiety-like behavior as quantified by enhanced attention toward emotional faces. Our findings show that unpredictability per se is an important feature of the sensory environment influencing habituation of neuronal activity in the amygdala and emotional behavior, and indicate that regulation of amygdala habituation represents an evolutionarily conserved mechanism for adapting behavior in anticipation of temporally unpredictable events.
Abstract:
Edges are crucial for the formation of coherent objects from sequential sensory inputs within a single modality. Moreover, temporally coincident boundaries of perceptual objects across different sensory modalities facilitate crossmodal integration. Here, we used functional magnetic resonance imaging in order to examine the neural basis of temporal edge detection across modalities. Onsets of sensory inputs are not only related to the detection of an edge but also to the processing of novel sensory inputs. Thus, we used transitions from input to rest (offsets) as convenient stimuli for studying the neural underpinnings of visual and acoustic edge detection per se. We found, besides modality-specific patterns, shared visual and auditory offset-related activity in the superior temporal sulcus and insula of the right hemisphere. Our data suggest that right hemispheric regions known to be involved in multisensory processing are crucial for detection of edges in the temporal domain across both visual and auditory modalities. This operation is likely to facilitate cross-modal object feature binding based on temporal coincidence. Hum Brain Mapp, 2008.