906 results for auditory masking


Relevance: 20.00%

Abstract:

Auditory hallucinations (AH) occur in various neurological and psychiatric disorders. In psychosis, increased neuronal activity in the primary auditory cortex (PAC) contributes to AH. We investigated the functional neuroanatomy of epileptic hallucinations by measuring cerebral perfusion in three patients with AH during simple partial status epilepticus. Hyperperfusion in the temporal lobe, covering the PAC, occurred in all patients. Our perfusion data support the hypothesis that the PAC is a constitutive element in the genesis of AH, independent of their aetiology.

Relevance: 20.00%

Abstract:

Transcranial magnetic stimulation (TMS) is a novel therapeutic approach used in patients with pharmacoresistant auditory verbal hallucinations (AVH). To investigate the neurobiological effects of TMS on AVH, we measured cerebral blood flow with pseudo-continuous magnetic resonance arterial spin labeling 20 ± 6 hours before and after TMS treatment.

Relevance: 20.00%

Abstract:

The aim of this functional magnetic resonance imaging (fMRI) study was to identify human brain areas that are sensitive to the direction of auditory motion. Such directional sensitivity was assessed in a hypothesis-free manner by analyzing fMRI response patterns across the entire brain volume using a spherical-searchlight approach. In addition, we assessed directional sensitivity in three predefined brain areas that have been associated with auditory motion perception in previous neuroimaging studies: the primary auditory cortex, the planum temporale, and the visual motion complex (hMT/V5+). Our whole-brain analysis revealed that the direction of sound-source movement could be decoded from fMRI response patterns in the right auditory cortex and in a high-level visual area located in the right lateral occipital cortex. Our region-of-interest analysis showed that decoding of the direction of auditory motion was most reliable from activation patterns of the left and right planum temporale. Auditory motion direction could not be decoded from activation patterns in hMT/V5+. These findings provide further evidence that the planum temporale plays a central role in supporting auditory motion perception. In addition, our findings suggest a cross-modal transfer of directional information to high-level visual cortex in healthy humans.
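To make the searchlight idea concrete, here is a minimal sketch of cross-validated spherical-searchlight decoding on simulated trial-wise activation patterns; the grid size, searchlight radius, classifier, and motion-direction labels are illustrative assumptions, not details taken from the study.

```python
# Minimal spherical-searchlight decoding sketch on simulated trial-wise patterns.
# Grid size, radius, classifier and labels are illustrative assumptions only.
import numpy as np
from sklearn.svm import LinearSVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)

n_trials, grid = 80, (10, 10, 10)                      # hypothetical 10x10x10 voxel volume
patterns = rng.normal(size=(n_trials, *grid))          # trial-wise activation patterns
flat = patterns.reshape(n_trials, -1)                  # trials x voxels
labels = np.repeat([0, 1], n_trials // 2)              # 0 = leftward, 1 = rightward motion

coords = np.argwhere(np.ones(grid, dtype=bool))        # all voxel coordinates (C order)
radius = 2.0                                           # searchlight radius in voxels

accuracy_map = np.zeros(grid)
for center in coords:
    # Voxels within the sphere centred on the current voxel form one searchlight.
    in_sphere = np.linalg.norm(coords - center, axis=1) <= radius
    # Cross-validated classification accuracy, assigned back to the centre voxel.
    acc = cross_val_score(LinearSVC(dual=False), flat[:, in_sphere], labels, cv=5)
    accuracy_map[tuple(center)] = acc.mean()

print("peak searchlight accuracy:", accuracy_map.max())
```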

Relevance: 20.00%

Abstract:

In this article, we will link neuroimaging, data analysis, and intervention methods in an important psychiatric condition: auditory verbal hallucinations (AVH). The clinical and phenomenological background as well as neurophysiological findings will be covered and discussed with respect to noninvasive brain stimulation. Additionally, methods of noninvasive brain stimulation will be presented as ways to intervene with AVH. Finally, preliminary conclusions and possible future perspectives will be proposed.

Relevance: 20.00%

Abstract:

Auditory verbal hallucinations (AVH) in schizophrenia patients presumably result from state-inadequate activation of the primary auditory system. We tested brain responsiveness to auditory stimulation in healthy controls (n=26) and in schizophrenia patients who frequently (n=18) or never (n=11) experienced AVH. Responsiveness was assessed by driving the EEG with click-tones at 20, 30, and 40 Hz. We compared stimulus-induced EEG changes between groups using spectral amplitude maps and a global measure of phase-locking (GFS). As expected, the 40 Hz stimulation elicited the strongest changes. However, while controls and non-hallucinators increased 40 Hz EEG activity during stimulation, a left-lateralized decrease was observed in the hallucinators. These differences were significant (p=.02). As expected, GFS increased during stimulation in controls (p=.08) and non-hallucinating patients (p=.06), an increase that was significant when the two groups were combined (p=.01). In contrast, GFS decreased with stimulation in hallucinating patients (p=.13), resulting in a significantly different GFS response when comparing subjects with and without AVH (p<.01). Our data suggest that, normally, 40 Hz stimulation leads to the activation of a synchronized network representing the sensory input, whereas in hallucinating patients the same stimulation partly disrupts ongoing activity in this network.
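Assuming that GFS here denotes global field synchronization as commonly used in EEG research, the following sketch illustrates one way such an eigenvalue-based phase-locking measure can be computed at the 40 Hz stimulation frequency; the simulated epoch, sampling rate, and channel count are placeholders rather than the study's recording parameters.

```python
# Sketch of a global field synchronization (GFS)-style phase-locking measure at the
# 40 Hz stimulation frequency (simulated data, illustrative parameters).
import numpy as np

rng = np.random.default_rng(1)
fs, n_channels, n_samples = 250, 64, 500              # one 2 s EEG epoch at 250 Hz
epoch = rng.normal(size=(n_channels, n_samples))      # channels x time

# Complex Fourier coefficient of every channel at the target frequency.
freqs = np.fft.rfftfreq(n_samples, d=1 / fs)
target = np.argmin(np.abs(freqs - 40.0))
coeffs = np.fft.rfft(epoch, axis=1)[:, target]

# GFS: how concentrated the sine/cosine (real/imaginary) cloud is across channels.
# Values near 1 indicate a common phase across channels, near 0 no preferred phase.
cloud = np.stack([coeffs.real, coeffs.imag])          # 2 x n_channels
eigvals = np.linalg.eigvalsh(np.cov(cloud))           # ascending eigenvalues
gfs_40hz = np.abs(eigvals[1] - eigvals[0]) / eigvals.sum()
print("GFS at 40 Hz:", gfs_40hz)
```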

Relevance: 20.00%

Abstract:

Avoidance of excessively deep sedation levels is problematic in intensive care patients. Electrophysiologic monitoring may offer an approach to solving this problem. Since electroencephalogram (EEG) responses to different sedation regimens vary, we assessed electrophysiologic responses to two sedative drug regimens in 10 healthy volunteers. Dexmedetomidine/remifentanil (dex/remi group) and midazolam/remifentanil (mida/remi group) were infused 7 days apart. Each combination of medications was increased stepwise to reach Ramsay scores (RS) 2, 3, and 4. Resting EEG, bispectral index (BIS), and the N100 amplitudes of long-latency auditory evoked potentials (ERP) were recorded at each level of sedation. During dex/remi, the resting EEG was characterized by a recurrent high-power, low-frequency pattern that became more pronounced at deeper levels of sedation. BIS decreased uniformly only in the dex/remi group (from 94 ± 3 at baseline to 58 ± 14 at RS 4) compared to the mida/remi group (from 94 ± 2 to 76 ± 10; P = 0.029 between groups). The ERP amplitudes decreased from 5.3 ± 1.3 at baseline to 0.4 ± 1.1 at RS 4 (P = 0.003) only in the mida/remi group. We conclude that ERPs in volunteers sedated with dex/remi, in contrast to mida/remi, indicate a cortical response to acoustic stimuli even when sedation reaches deeper levels. Consequently, ERPs can monitor sedation with midazolam but not with dexmedetomidine; the reverse is true for BIS.
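As an illustration of the ERP readout described above, the sketch below extracts an N100-like amplitude from simulated, baseline-corrected auditory epochs; the channel, search window, and embedded deflection are assumptions for demonstration, not the recording setup used in the study.

```python
# Sketch: N100 amplitude from averaged long-latency auditory evoked potentials.
# Single simulated channel; the 80-150 ms search window is an illustrative assumption.
import numpy as np

rng = np.random.default_rng(2)
fs = 500                                              # sampling rate in Hz
times = np.arange(-0.1, 0.5, 1 / fs)                  # epoch from -100 to +500 ms
n_trials = 200

# Simulated epochs (trials x samples) with an embedded N100-like negative deflection.
epochs = rng.normal(scale=5.0, size=(n_trials, times.size))
epochs += -4.0 * np.exp(-((times - 0.1) ** 2) / (2 * 0.015 ** 2))

# Baseline-correct on the pre-stimulus interval, then average across trials.
baseline = epochs[:, times < 0].mean(axis=1, keepdims=True)
erp = (epochs - baseline).mean(axis=0)

# N100 amplitude: most negative value in the post-stimulus search window.
window = (times >= 0.08) & (times <= 0.15)
n100_amplitude = erp[window].min()
print(f"N100 amplitude: {n100_amplitude:.2f} (arbitrary units)")
```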

Relevance: 20.00%

Abstract:

Among other auditory operations, the analysis of differences in the sound level received at the two ears is fundamental for localizing a sound source. In animals, these so-called interaural level differences are coded by excitatory-inhibitory neurons, yielding asymmetric hemispheric activity patterns for acoustic stimuli with maximal interaural level differences. In human auditory cortex, the temporal blood oxygen level-dependent (BOLD) response to auditory inputs, as measured by functional magnetic resonance imaging (fMRI), consists of at least two independent components: an initial transient and a subsequent sustained signal, which, on a different time scale, are consistent with electrophysiological response patterns in humans and animals. However, their specific functional roles remain unclear. Animal studies suggest that these temporal components are based on different neural networks and have specific roles in representing the external acoustic environment. Here we hypothesized that the transient and sustained response constituents are differentially involved in coding interaural level differences and therefore play different roles in spatial information processing. Healthy subjects underwent monaural and binaural acoustic stimulation, and BOLD responses were measured using high signal-to-noise-ratio fMRI. In the anatomically segmented Heschl's gyrus, the transient response was bilaterally balanced, independent of the side of stimulation, whereas the sustained response was contralateralized. This dissociation suggests a differential role for the two independent temporal response components, with an initial bilateral transient signal subserving rapid sound detection and a subsequent lateralized sustained signal subserving detailed sound characterization.
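One common way to separate transient and sustained constituents is to model them with two regressors in a general linear model: a "stick" regressor at stimulus onsets and a boxcar spanning the stimulation block, each convolved with a canonical haemodynamic response function. The sketch below shows this on simulated data; the HRF shape, timing, and amplitudes are generic assumptions rather than the study's design.

```python
# Sketch: separating transient (onset) and sustained (block) BOLD components in a
# simple general linear model. HRF shape, timing and data are generic assumptions.
import numpy as np
from scipy.stats import gamma

tr, n_scans = 2.0, 200                                # repetition time (s), number of volumes

# Canonical double-gamma HRF sampled at the TR (SPM-like shape, illustrative).
t_hrf = np.arange(0, 32, tr)
hrf = gamma.pdf(t_hrf, 6) - gamma.pdf(t_hrf, 16) / 6.0

# Hypothetical design: 20 s stimulation blocks starting every 60 s.
block = np.zeros(n_scans)                             # sustained input (boxcar)
onsets = np.zeros(n_scans)                            # transient input ("stick" at block onset)
for start in range(0, n_scans, 30):                   # 30 scans = 60 s
    block[start:start + 10] = 1.0                     # 10 scans = 20 s of sound
    onsets[start] = 1.0

X = np.column_stack([
    np.convolve(onsets, hrf)[:n_scans],               # transient regressor
    np.convolve(block, hrf)[:n_scans],                # sustained regressor
    np.ones(n_scans),                                 # constant term
])

# Simulated voxel time series containing both components, then a least-squares fit.
rng = np.random.default_rng(3)
y = 2.0 * X[:, 0] + 1.0 * X[:, 1] + rng.normal(scale=0.5, size=n_scans)
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
print("transient beta:", round(beta[0], 2), "sustained beta:", round(beta[1], 2))
```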

Relevance: 20.00%

Abstract:

The auditory cortex is anatomically segregated into a central core and a peripheral belt region, which exhibit differences in preference for bandpassed noise and in temporal patterns of response to acoustic stimuli. While it has been shown that visual stimuli can modify response magnitude in auditory cortex, little is known about differential patterns of multisensory interactions in core and belt. Here, we used functional magnetic resonance imaging to examine the influence of a short visual stimulus presented prior to acoustic stimulation on the spatial pattern of the blood oxygen level-dependent signal response in auditory cortex. Consistent with crossmodal inhibition, the light produced a suppression of the signal response in a cortical region corresponding to the core. In the surrounding areas corresponding to the belt regions, however, we found an inverse modulation, with the signal increasing in the centrifugal direction. Our data suggest that crossmodal effects are differentially modulated according to the hierarchical core-belt organization of auditory cortex.

Relevance: 20.00%

Abstract:

Auditory neuroscience has not tapped fMRI's full potential because of acoustic scanner noise emitted by the gradient switches of conventional echoplanar fMRI sequences. The scanner noise is pulsed, and auditory cortex is particularly sensitive to pulsed sounds. Current fMRI approaches to avoid stimulus-noise interactions are temporally inefficient. Since the sustained BOLD response to pulsed sounds decreases with repetition rate and becomes minimal with unpulsed sounds, we developed an fMRI sequence emitting continuous rather than pulsed gradient sound by implementing a novel quasi-continuous gradient switch pattern. Compared to conventional fMRI, continuous-sound fMRI reduced auditory cortex BOLD baseline and increased BOLD amplitude with graded sound stimuli, short sound events, and sounds as complex as orchestra music with preserved temporal resolution. Response in subcortical auditory nuclei was enhanced, but not the response to light in visual cortex. Finally, tonotopic mapping using continuous-sound fMRI demonstrates that enhanced functional signal-to-noise in BOLD response translates into improved spatial separability of specific sound representations.

Relevance: 20.00%

Abstract:

Edges are important cues defining coherent auditory objects. As a model of auditory edges, sound onset and offset are particularly suitable for studying their neural underpinnings because they contrast a specific physical input against no physical input. The change from silence to sound, that is, onset, has been studied extensively and elicits transient neural responses bilaterally in auditory cortex. However, neural activity associated with sound onset is related not only to edge detection but also to novel afferent input. Edges at the change from sound to silence, that is, offset, are not confounded by novel physical input and thus allow us to examine neural activity associated with sound edges per se. In the first experiment, we used silent-acquisition functional magnetic resonance imaging and found that the offset of pulsed sound activates the planum temporale, superior temporal sulcus, and planum polare of the right hemisphere. In the planum temporale and the superior temporal sulcus, offset response amplitudes were related to the pulse repetition rate of the preceding stimulation. In the second experiment, we found that these offset-responsive regions were also activated by single sound pulses, by the onset of sound pulse sequences, and by single sound pulse omissions within sound pulse sequences. However, they were not active during sustained sound presentation. Thus, our data show that circumscribed areas in right temporal cortex are specifically involved in identifying auditory edges. This operation is crucial for translating acoustic signal time series into coherent auditory objects.

Relevance: 20.00%

Abstract:

BACKGROUND: It is well known that specific peripheral activation patterns are associated with the emotional valence of sounds. However, it is unclear how these effects adapt over time and which personality traits influence these processes. Anxiety disorders influence the autonomic activation related to emotional processing, but personality anxiety traits have never been studied in the context of affective auditory stimuli. METHODS: Heart rate, skin conductance, zygomatic muscle activity, and subjective ratings of emotional valence and arousal were recorded in healthy subjects during the presentation of pleasant, unpleasant, and neutral sounds. Recordings were repeated 1 week later to examine possible time-dependent changes related to habituation and sensitization. RESULTS AND CONCLUSION: There was no generalized habituation or sensitization process related to the repeated presentation of affective sounds; rather, each physiological measure showed specific adaptation processes. These observations are consistent with previous studies using affective pictures and simple tones. Skin conductance showed the strongest changes over time, including habituation during the first presentation session and sensitization at the end of the second presentation session, whereas facial electromyographic activity habituated only for the neutral stimuli and heart rate did not habituate at all. Finally, we showed that trait anxiety influenced the orienting reaction to affective sounds, but not the adaptation processes related to the repeated presentation of these sounds.
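Habituation and sensitization in such recordings are often quantified as the trend of trial-by-trial response amplitudes across repeated presentations. The sketch below illustrates this for simulated skin conductance response amplitudes; the values and trial count are hypothetical.

```python
# Sketch: quantifying habituation as the trend of skin conductance response (SCR)
# amplitudes across repeated presentations of one sound category (simulated values).
import numpy as np

rng = np.random.default_rng(5)
n_presentations = 24
trial_index = np.arange(n_presentations)

# Hypothetical SCR amplitudes that decline over repetitions (habituation) plus noise.
scr_amplitude = 0.8 * np.exp(-trial_index / 10) + rng.normal(scale=0.05, size=n_presentations)

# Linear trend over trials: a negative slope indicates habituation,
# a positive slope would indicate sensitization.
slope, intercept = np.polyfit(trial_index, scr_amplitude, deg=1)
print(f"habituation slope: {slope:.4f} per presentation")
```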

Relevance: 20.00%

Abstract:

Edges are crucial for the formation of coherent objects from sequential sensory inputs within a single modality. Moreover, temporally coincident boundaries of perceptual objects across different sensory modalities facilitate crossmodal integration. Here, we used functional magnetic resonance imaging to examine the neural basis of temporal edge detection across modalities. Onsets of sensory inputs are related not only to the detection of an edge but also to the processing of novel sensory inputs. Thus, we used transitions from input to rest (offsets) as convenient stimuli for studying the neural underpinnings of visual and acoustic edge detection per se. We found, besides modality-specific patterns, shared visual and auditory offset-related activity in the superior temporal sulcus and insula of the right hemisphere. Our data suggest that right-hemispheric regions known to be involved in multisensory processing are crucial for the detection of edges in the temporal domain across both the visual and auditory modalities. This operation is likely to facilitate crossmodal binding of object features based on temporal coincidence.

Relevance: 20.00%

Abstract:

Functional magnetic resonance imaging (fMRI) studies can provide insight into the neural correlates of hallucinations. Commonly, such studies require self-reports about the timing of the hallucination events. While many studies have found activity in higher-order sensory cortical areas, only a few have demonstrated activity of the primary auditory cortex during auditory verbal hallucinations. In this case, using self-reports as a model of brain activity may not be sensitive enough to capture all neurophysiological signals related to hallucinations. We used spatial independent component analysis (sICA) to extract the activity patterns associated with auditory verbal hallucinations in six schizophrenia patients. sICA decomposes the functional data set into a set of spatial maps without the use of any input function. The resulting activity patterns from auditory and sensorimotor components were further analyzed in a single-subject fashion using a visualization tool that allows easy inspection of the variability of regional brain responses. We found bilateral auditory cortex activity, including Heschl's gyrus, during the hallucinations of one patient, and unilateral auditory cortex activity in two more patients. The associated time courses showed large variability in shape, amplitude, and time of onset relative to the self-reports. However, the average of the time courses during hallucinations showed a clear association with this clinical phenomenon. We suggest that detection of this activity may be facilitated by examining hallucination epochs of sufficient length, in combination with a data-driven approach.
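As a rough illustration of the data-driven approach, the sketch below runs spatial ICA on a simulated time-by-voxel matrix with scikit-learn's FastICA, treating voxels as observations so that the estimated sources are spatial maps and the mixing matrix holds their time courses; the matrix dimensions and component count are placeholders, not the patients' data.

```python
# Sketch: spatial ICA of an fMRI run. Voxels are treated as observations, so the
# estimated sources are spatial maps and the mixing matrix gives their time courses.
# Data, dimensions and component count are illustrative placeholders.
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(4)
n_timepoints, n_voxels, n_components = 240, 5000, 20

data = rng.normal(size=(n_timepoints, n_voxels))      # preprocessed BOLD time series

ica = FastICA(n_components=n_components, random_state=0, max_iter=1000)
spatial_maps = ica.fit_transform(data.T)              # shape: (n_voxels, n_components)
time_courses = ica.mixing_                            # shape: (n_timepoints, n_components)

# A component's time course can then be compared with hallucination self-reports,
# e.g. by correlating it with an (assumed) event regressor for the reported epochs.
print(spatial_maps.shape, time_courses.shape)
```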