71 results for Visual and auditory processing
Abstract:
BACKGROUND: Reports on the effects of focal hemispheric damage on sleep EEG are rare and contradictory. PATIENTS AND METHODS: Twenty patients (mean age +/- SD 53 +/- 14 years) with a first acute hemispheric stroke and no sleep apnea were studied. Stroke severity [National Institute of Health Stroke Scale (NIHSS)], volume (diffusion-weighted brain MRI), and short-term outcome (Rankin score) were assessed. Within the first 8 days after stroke onset, 1-3 sleep EEG recordings per patient were performed. Sleep scoring and spectral analysis were based on the central derivation of the healthy hemisphere. Data were compared with those of 10 age-matched and gender-matched hospitalized controls with no brain damage and no sleep apnea. RESULTS: Stroke patients had higher amounts of wakefulness after sleep onset (112 +/- 53 min vs. 60 +/- 38 min, p < 0.05) and a lower sleep efficiency (76 +/- 10% vs. 86 +/- 8%, p < 0.05) than controls. Time spent in slow-wave sleep (SWS) and rapid eye movement (REM) sleep and total sleep time were lower in stroke patients, but differences were not significant. A positive correlation was found between the amount of SWS and stroke volume (r = 0.79). The slow-wave activity (SWA) ratio NREM sleep/wakefulness was lower in patients than in controls (p < 0.05), and correlated with NIHSS (r = -0.47). CONCLUSION: Acute hemispheric stroke is accompanied by alterations of sleep EEG over the healthy hemisphere that correlate with stroke volume and outcome. The increased SWA during wakefulness and SWS over the healthy hemisphere contralaterally to large strokes may reflect neuronal hypometabolism induced transhemispherically (diaschisis).
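The spectral analysis described in the abstract above rests on a standard computation: integrating the EEG power spectral density over the delta band (roughly 0.5–4 Hz) to obtain slow-wave activity (SWA), then forming the NREM/wake SWA ratio. The sketch below illustrates that computation on synthetic signals; it is not the authors' pipeline, and the sampling rate, epoch length, and signal parameters are illustrative assumptions.

```python
# Illustrative sketch (not the study's code): slow-wave activity (SWA)
# from one EEG derivation via Welch's method, and the NREM/wake SWA ratio.
import numpy as np
from scipy.signal import welch

fs = 256  # sampling rate in Hz (assumed)

def swa(eeg, fs, band=(0.5, 4.0)):
    """Integrate the Welch power spectral density over the delta band."""
    freqs, psd = welch(eeg, fs=fs, nperseg=4 * fs)
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return psd[mask].sum() * (freqs[1] - freqs[0])  # rectangle-rule integral

rng = np.random.default_rng(0)
t = np.arange(0, 30, 1 / fs)  # one synthetic 30-s epoch
# NREM-like epoch: strong 1-Hz slow oscillation plus broadband noise
nrem = 40 * np.sin(2 * np.pi * 1.0 * t) + rng.normal(0, 10, t.size)
# wake-like epoch: broadband noise only
wake = rng.normal(0, 10, t.size)

ratio = swa(nrem, fs) / swa(wake, fs)
print(f"NREM/wake SWA ratio: {ratio:.1f}")
```

With real recordings, the ratio would be computed per patient from scored NREM epochs and wakefulness epochs on the healthy hemisphere's central derivation, then correlated with the NIHSS score.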
Abstract:
Strategies of cognitive control are helpful in reducing anxiety experienced during anticipation of unpleasant or potentially unpleasant events. We investigated the associated cerebral information processing underlying the use of a specific cognitive control strategy during the anticipation of affect-laden events. Using functional magnetic resonance imaging, we examined differential brain activity during anticipation of events of unknown and negative emotional valence in a group of eighteen healthy subjects who used a cognitive control strategy, similar to the "reality checking" used in psychotherapy, compared with a group of sixteen subjects who did not exert cognitive control. While expecting unpleasant stimuli, the "cognitive control" group showed higher activity in left medial and dorsolateral prefrontal cortex areas but reduced activity in the left extended amygdala, pulvinar/lateral geniculate nucleus, and fusiform gyrus. Cognitive control during the "unknown" expectation was likewise associated with reduced amygdalar activity, and further with reduced insular and thalamic activity. The amygdala activations associated with cognitive control correlated negatively with the reappraisal scores of an emotion regulation questionnaire. The results indicate that cognitive control of particularly unpleasant emotions is associated with elevated prefrontal cortex activity, which may serve to attenuate emotion processing in, for instance, the amygdala and, notably, in perception-related brain areas.
Abstract:
Recent functional magnetic resonance imaging (fMRI) studies consistently revealed contributions of fronto-parietal and related networks to the execution of a visuospatial judgment task, the so-called "Clock Task". However, due to the low temporal resolution of fMRI, the exact cortical dynamics and timing of processing during task performance could not be resolved until now. In order to clarify the detailed cortical activity and temporal dynamics, 14 healthy subjects performed an established version of the "Clock Task", which comprises a visuospatial task (angle discrimination) and a control task (color discrimination) with the same stimulus material, in an electroencephalography (EEG) experiment. Based on the time-resolved analysis of network activations (microstate analysis), differences in timing between the angle and the color discrimination task were found after sensory processing, in a time window starting around 200ms. Significant differences between the two tasks were observed in an analysis window from 192ms to 776ms. We divided this window into two parts: an early phase, from 192ms to ∼440ms, and a late phase, from ∼440ms to 776ms. For both tasks, the order of network activations and the types of networks were the same, but, in each phase, activations for the two conditions were dominated by differing network states with divergent temporal dynamics. Our results provide an important basis for the assessment of deviations in processing dynamics during visuospatial tasks in clinical populations.
Abstract:
The momentary, global functional state of the brain is reflected by its electric field configuration. Cluster analytical approaches consistently extracted four head-surface brain electric field configurations that optimally explain the variance of their changes across time in spontaneous EEG recordings. These four configurations are referred to as EEG microstate classes A, B, C, and D and have been associated with verbal/phonological, visual, attention reorientation, and subjective interoceptive-autonomic processing, respectively. The present study tested these associations via an intra-individual and inter-individual analysis approach. The intra-individual approach tested the effect of task-induced increased modality-specific processing on EEG microstate parameters. The inter-individual approach tested the effect of personal modality-specific parameters on EEG microstate parameters. We obtained multichannel EEG from 61 healthy, right-handed, male students during four eyes-closed conditions: object-visualization, spatial-visualization, verbalization (6 runs each), and resting (7 runs). After each run, we assessed participants' degrees of object-visual, spatial-visual, and verbal thinking using subjective reports. Before and after the recording, we assessed modality-specific cognitive abilities and styles using nine cognitive tests and two questionnaires. The EEG of all participants, conditions, and runs was clustered into four classes of EEG microstates (A, B, C, and D). RMANOVAs, ANOVAs and post-hoc paired t-tests compared microstate parameters between conditions. TANOVAs compared microstate class topographies between conditions. Differences were localized using eLORETA. Pearson correlations assessed interrelationships between personal modality-specific parameters and EEG microstate parameters during no-task resting. As hypothesized, verbal as opposed to visual conditions consistently affected the duration, occurrence, and coverage of microstate classes A and B. 
Contrary to associations suggested by previous reports, parameters were increased for class A during visualization, and for class B during verbalization. In line with previous reports, microstate D parameters were increased during no-task resting compared to the three internal, goal-directed tasks. Topographic differences between conditions concerned particular sub-regions of components of the metabolic default mode network. Modality-specific personal parameters did not consistently correlate with microstate parameters, except for verbal cognitive style, which correlated negatively with microstate class A duration and positively with class C occurrence. This is the first study that aimed to induce EEG microstate class parameter changes based on their hypothesized functional significance. Beyond the associations of microstate classes A and B with visual and verbal processing, respectively, and of microstate class D with interoceptive-autonomic processing, our results suggest that a finely tuned interplay between all four EEG microstate classes is necessary for the continuous formation of visual and verbal thoughts, as well as for interoceptive-autonomic processing. Our results point to the possibility that the EEG microstate classes may represent the head-surface-measured activity of intra-cortical sources primarily exhibiting inhibitory functions. However, additional studies are needed to verify and elaborate on this hypothesis.
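The clustering step underlying the two microstate abstracts above is conventionally a polarity-invariant "modified k-means": momentary EEG topographies are assigned to the template map with the highest squared spatial correlation, and each template is updated as the dominant spatial pattern of its assigned maps. The sketch below runs that procedure on synthetic random topographies; channel count, map count, and seed are arbitrary, and real analyses add smoothing, peak-GFP selection, and back-fitting that are omitted here.

```python
# Illustrative sketch (not the authors' code): polarity-invariant
# modified k-means clustering of EEG topographies into 4 microstate classes.
import numpy as np

rng = np.random.default_rng(42)
n_channels, n_maps, n_classes = 30, 500, 4

# Synthetic average-referenced, unit-norm topographies (one column per sample)
X = rng.normal(size=(n_channels, n_maps))
X -= X.mean(axis=0)                      # average reference
X /= np.linalg.norm(X, axis=0)           # unit-norm maps

# Initialize templates from randomly chosen maps
T = X[:, rng.choice(n_maps, n_classes, replace=False)].copy()

for _ in range(50):
    # Assign each map to the template with the highest SQUARED correlation
    # (squaring ignores polarity, as is standard in microstate analysis)
    C = T.T @ X                          # (classes, maps) spatial correlations
    labels = np.argmax(C ** 2, axis=0)
    for k in range(n_classes):
        members = X[:, labels == k]
        if members.size == 0:
            continue
        # Leading eigenvector of the member covariance = polarity-free mean map
        w, v = np.linalg.eigh(members @ members.T)
        T[:, k] = v[:, -1]

# Global explained variance (GEV): signal fraction captured by the 4 classes
C = T.T @ X
gev = np.mean(np.max(C ** 2, axis=0))
print(f"GEV of the 4-class model: {gev:.2f}")
```

From `labels`, the per-class duration, occurrence, and coverage parameters compared across conditions in the study can be derived by run-length counting along the time axis.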
Abstract:
Perceptual accuracy is known to be influenced by stimulus location within the visual field. In particular, it seems to be enhanced in the lower visual hemifield (VH) for motion and space processing, and in the upper VH for object and face processing. The origins of such asymmetries are attributed to attentional biases across the visual field and to the functional organization of the visual system. In this article, we tested content-dependent perceptual asymmetries in different regions of the visual field. Twenty-five healthy volunteers participated in this study. They performed three visual tests involving perception of shapes, orientation, and motion, in the four quadrants of the visual field. The results of the visual tests showed that perceptual accuracy was better in the lower than in the upper visual field for motion perception, and better in the upper than in the lower visual field for shape perception. Orientation perception did not show any vertical bias. No difference was found when comparing right and left VHs. The functional organization of the visual system seems to indicate that the dorsal and the ventral visual streams, responsible for motion and shape perception, respectively, show a bias for the lower and upper VHs, respectively. Such a bias depends on the content of the visual information.
Abstract:
Patients with schizophrenia are impaired in many aspects of auditory processing, but indirect evidence suggests that intensity perception is intact. However, because the extraction of meaning from dynamic intensity relies on structures that appear to be altered in schizophrenia, we hypothesized that the perception of auditory looming is impaired as well. Twenty inpatients with schizophrenia and 20 control participants, matched for age, gender, and education, gave intensity ratings of rising (looming) and falling intensity sounds with different mean intensities. Intensity change was overestimated in looming as compared with receding sounds in both groups. However, healthy individuals showed a stronger effect at higher mean intensity, in keeping with previous findings, while patients with schizophrenia lacked this modulation. We discuss how this might support the notion of a more general deficit in extracting emotional meaning from different sensory cues, including intensity and pitch.
Abstract:
Auditory neuroscience has not tapped fMRI's full potential because of acoustic scanner noise emitted by the gradient switches of conventional echoplanar fMRI sequences. The scanner noise is pulsed, and auditory cortex is particularly sensitive to pulsed sounds. Current fMRI approaches to avoid stimulus-noise interactions are temporally inefficient. Since the sustained BOLD response to pulsed sounds decreases with repetition rate and becomes minimal with unpulsed sounds, we developed an fMRI sequence emitting continuous rather than pulsed gradient sound by implementing a novel quasi-continuous gradient switch pattern. Compared to conventional fMRI, continuous-sound fMRI reduced auditory cortex BOLD baseline and increased BOLD amplitude with graded sound stimuli, short sound events, and sounds as complex as orchestra music with preserved temporal resolution. Response in subcortical auditory nuclei was enhanced, but not the response to light in visual cortex. Finally, tonotopic mapping using continuous-sound fMRI demonstrates that enhanced functional signal-to-noise in BOLD response translates into improved spatial separability of specific sound representations.
Abstract:
OBJECT: The localization of any given target in the brain has become a challenging issue because of the increased use of deep brain stimulation to treat Parkinson disease, dystonia, and nonmotor diseases (for example, Tourette syndrome, obsessive compulsive disorders, and depression). The aim of this study was to develop an automated method of adapting an atlas of the human basal ganglia to the brains of individual patients. METHODS: Magnetic resonance images of the brain specimen were obtained before extraction from the skull and histological processing. Adaptation of the atlas to individual patient anatomy was performed by reshaping the atlas MR images to the images obtained in the individual patient using a hierarchical registration applied to a region of interest centered on the basal ganglia, and then applying the reshaping matrix to the atlas surfaces. RESULTS: Results were evaluated by direct visual inspection of the structures visible on MR images and atlas anatomy, by comparison with electrophysiological intraoperative data, and with previous atlas studies in patients with Parkinson disease. The method was both robust and accurate, never failing to provide an anatomically reliable atlas to patient registration. The registration obtained did not exceed a 1-mm mismatch with the electrophysiological signatures in the region of the subthalamic nucleus. CONCLUSIONS: This registration method applied to the basal ganglia atlas forms a powerful and reliable method for determining deep brain stimulation targets within the basal ganglia of individual patients.
Abstract:
Effective visual exploration is required for many activities of daily living, and instruments to assess visual exploration are important for the evaluation of the visual and the oculomotor system. In this article, the development of a new instrument to measure central and peripheral target recognition is described. The measurement setup consists of a hemispherical projection which allows presenting images over a large area of ±90° horizontal and vertical angle. In a feasibility study with 14 younger (21–49 years) and 12 older (50–78 years) test persons, 132 targets and 24 distractors were presented within naturalistic color photographs of everyday scenes at 10°, 30°, and 50° eccentricity. After the experiment, both younger and older participants reported in a questionnaire that the task was easy to understand and fun, and that it measured a competence relevant to activities of daily living. A main result of the pilot study was that younger participants recognized more targets, with shorter reaction times, than older participants. The group differences were most pronounced for peripheral target detection. This test is feasible and appropriate for assessing the functional field of view in younger and older adults.
Abstract:
The vestibular system contributes to the control of posture and eye movements and is also involved in various cognitive functions including spatial navigation and memory. These functions are subtended by projections to a vestibular cortex, whose exact location in the human brain is still a matter of debate (Lopez and Blanke, 2011). The vestibular cortex can be defined as the network of all cortical areas receiving inputs from the vestibular system, including areas where vestibular signals influence the processing of other sensory (e.g. somatosensory and visual) and motor signals. Previous neuroimaging studies used caloric vestibular stimulation (CVS), galvanic vestibular stimulation (GVS), and auditory stimulation (clicks and short-tone bursts) to activate the vestibular receptors and localize the vestibular cortex. However, these three methods differ regarding the receptors stimulated (otoliths, semicircular canals) and the concurrent activation of the tactile, thermal, nociceptive and auditory systems. To evaluate the convergence between these methods and provide a statistical analysis of the localization of the human vestibular cortex, we performed an activation likelihood estimation (ALE) meta-analysis of neuroimaging studies using CVS, GVS, and auditory stimuli. We analyzed a total of 352 activation foci reported in 16 studies carried out in a total of 192 healthy participants. The results reveal that the main regions activated by CVS, GVS, or auditory stimuli were located in the Sylvian fissure, insula, retroinsular cortex, fronto-parietal operculum, superior temporal gyrus, and cingulate cortex. Conjunction analysis indicated that regions showing convergence between two stimulation methods were located in the median (short gyrus III) and posterior (long gyrus IV) insula, parietal operculum and retroinsular cortex (Ri). The only area of convergence between all three methods of stimulation was located in Ri. 
The data indicate that Ri, parietal operculum and posterior insula are vestibular regions where afferents converge from otoliths and semicircular canals, and may thus be involved in the processing of signals informing about body rotations, translations and tilts. Results from the meta-analysis are in agreement with electrophysiological recordings in monkeys showing main vestibular projections in the transitional zone between Ri, the insular granular field (Ig), and SII.
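The core of the activation likelihood estimation (ALE) approach used in the meta-analysis above is simple to state: each reported focus is modeled as a 3-D Gaussian probability blob, per-study modeled-activation maps are formed, and the maps are combined voxel-wise as the probability that at least one study activates the voxel. The sketch below shows this idea only; the foci, grid, peak probability, and FWHM are made up, and real ALE additionally uses sample-size-dependent kernels and permutation-based thresholding.

```python
# Hedged sketch of the ALE idea (not the GingerALE implementation).
import numpy as np

grid = np.stack(np.meshgrid(*[np.arange(20)] * 3, indexing="ij"), axis=-1)

def modeled_activation(foci, fwhm=3.0, peak_p=0.9):
    """Per-voxel modeled activation for one study: max over Gaussian blobs.

    peak_p keeps probabilities below 1, as real ALE kernels do; taking the
    max across foci is a simplification of the within-study combination.
    """
    sigma = fwhm / 2.355  # FWHM -> standard deviation
    p = np.zeros(grid.shape[:3])
    for f in foci:
        d2 = ((grid - np.asarray(f)) ** 2).sum(axis=-1)
        p = np.maximum(p, peak_p * np.exp(-d2 / (2 * sigma ** 2)))
    return p

# Two hypothetical "studies" reporting nearby foci; one stray focus at (3,3,3)
study_maps = [modeled_activation([(10, 10, 10)]),
              modeled_activation([(11, 10, 10), (3, 3, 3)])]

# ALE score: probability that at least one study activates the voxel
ale = 1.0 - np.prod([1.0 - m for m in study_maps], axis=0)
peak = np.unravel_index(np.argmax(ale), ale.shape)
print(peak)  # strongest convergence where the two studies' foci overlap
```

The conjunction analyses reported in the abstract correspond to intersecting thresholded ALE maps computed separately for the CVS, GVS, and auditory-stimulation experiments.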
Abstract:
Music is an intriguing stimulus widely used in movies to increase the emotional experience. However, no brain imaging study has to date examined this enhancement effect using emotional pictures (the modality mostly used in emotion research) and musical excerpts. Therefore, we designed this functional magnetic resonance imaging study to explore how musical stimuli enhance the feeling of affective pictures. In a classical block design carefully controlling for habituation and order effects, we presented fearful and sad pictures (mostly taken from the IAPS) either alone or combined with congruent emotional musical excerpts (classical pieces). Subjective ratings clearly indicated that the emotional experience was markedly increased in the combined relative to the picture condition. Furthermore, using a second-level analysis and regions of interest approach, we observed a clear functional and structural dissociation between the combined and the picture condition. Besides increased activation in brain areas known to be involved in auditory as well as in neutral and emotional visual-auditory integration processes, the combined condition showed increased activation in many structures known to be involved in emotion processing (including for example amygdala, hippocampus, parahippocampus, insula, striatum, medial ventral frontal cortex, cerebellum, fusiform gyrus). In contrast, the picture condition only showed an activation increase in the cognitive part of the prefrontal cortex, mainly in the right dorsolateral prefrontal cortex. Based on these findings, we suggest that emotional pictures evoke a more cognitive mode of emotion perception, whereas congruent presentations of emotional visual and musical stimuli rather automatically evoke strong emotional feelings and experiences.
Abstract:
Most previous neurophysiological studies evoked emotions by presenting visual stimuli. Models of the emotion circuits in the brain have for the most part ignored emotions arising from musical stimuli. To our knowledge, this is the first emotion brain study to examine the influence of visual and musical stimuli on brain processing. Highly arousing pictures from the International Affective Picture System and classical musical excerpts were chosen to evoke the three basic emotions of happiness, sadness, and fear. The emotional stimulus modalities were presented for 70 s either alone or combined (congruent) in a counterbalanced and random order. Electroencephalogram (EEG) alpha power density, which is inversely related to neural electrical activity, was recorded from 30 scalp electrodes in 24 right-handed healthy female subjects. In addition, heart rate (HR), skin conductance responses (SCR), respiration, temperature, and psychometric ratings were collected. Results showed that the experienced quality of the presented emotions was most accurate in the combined conditions, intermediate in the picture conditions, and lowest in the sound conditions. Furthermore, both the psychometric ratings and the physiological measures of involvement (SCR, HR, respiration) were significantly increased in the combined and sound conditions compared to the picture conditions. Finally, repeated measures ANOVA revealed the largest alpha power density for the sound conditions, intermediate for the picture conditions, and lowest for the combined conditions, indicating the strongest activation in the combined conditions in a distributed emotion and arousal network comprising frontal, temporal, parietal, and occipital neural structures. Summing up, these findings demonstrate that music can markedly enhance the emotional experience evoked by affective pictures.