981 results for Auditory cortex


Relevance: 30.00%

Abstract:

Action selection and organization are complex processes that require exploiting contextual information, retrieving previously memorized information, and integrating these different types of data. On the basis of its anatomical connections with premotor and parietal areas involved in action-goal coding, and of data from the literature, the prefrontal cortex appears to be one of the best candidates for selecting the neuronal pools underlying the selection and organization of intentional actions. We recorded the activity of single ventrolateral prefrontal (VLPF) neurons while monkeys performed simple and complex manipulative actions aimed at distinct final goals, employing a modified and more strictly controlled version of the grasp-to-eat (a food pellet) / grasp-to-place (an object) paradigm used in previous studies on parietal (Fogassi et al., 2005) and premotor neurons (Bonini et al., 2010). With this task we were able both to evaluate the processing and integration of distinct (visual and auditory) sequentially presented contextual information used to select the forthcoming action, and to examine the possible presence of goal-related activity in this portion of cortex. Moreover, we performed an observation task to clarify the possible contribution of VLPF neurons to the understanding of others' goal-directed actions. Simple Visuo-Motor Task (sVMT). We found four main types of neurons: unimodal sensory-driven, motor-related, unimodal sensory-and-motor, and multisensory neurons. A substantial number of VLPF neurons showed both a motor-related discharge and a response to visual presentation (sensory-and-motor neurons), with remarkable visuo-motor congruence for the preferred target.
Interestingly, the discharge of multisensory neurons reflected a behavioural decision independently of the sensory modality of the stimulus that allowed the monkey to make it: some encoded a decision to act or to refrain from acting (the majority), while others specified one among the four behavioural alternatives. Complex Visuo-Motor Task (cVMT). The cVMT was similar to the sVMT but included a further grasping motor act (grasping a lid in order to remove it, before grasping the target) and was run in two modalities: randomized and in blocks. Motor-related and sensory-and-motor neurons tested in the randomized cVMT were activated already during the first grasping motor act, but selectivity for one of the two graspable targets emerged only during execution of the second grasping. In contrast, when the cVMT was run in blocks, almost all of these neurons not only discharged during the first grasping motor act but also displayed the same target selectivity shown at hand contact with the target. Observation Task (OT). Most of the neurons active during the OT showed a firing-rate modulation in correspondence with the action performed by the experimenter. Among them, we found neurons significantly activated during observation of the experimenter's action (action observation-related neurons) and neurons responding not only to action observation but also to the presented cue stimuli (sensory-and-action observation-related neurons). Among the first set, almost half displayed target selectivity, though without a clear difference between the two presented targets. Concerning the second set, sensory-and-action observation-related neurons showed low target selectivity and no strict congruence between the selectivity exhibited in the visual response and in action observation.

Relevance: 30.00%

Abstract:

Children with autistic spectrum disorder (ASD) may have poor audio-visual integration, possibly reflecting dysfunctional 'mirror neuron' systems, which have been hypothesised to be at the core of the condition. In the present study, a computer program utilizing speech-synthesizer software and a 'virtual' head (Baldi) delivered speech stimuli for identification in auditory, visual or bimodal conditions. Children with ASD were poorer than controls at recognizing stimuli in the unimodal conditions, but once performance on this measure was controlled for, no group difference was found in the bimodal condition. A group of participants with ASD were also trained to develop their speech-reading ability. Training improved visual accuracy, and this in turn improved the children's ability to utilize visual information in their processing of speech. Overall results were compared to predictions from mathematical models based on integration and non-integration, and were most consistent with the integration model. We conclude that, whilst they are less accurate in recognizing stimuli in the unimodal conditions, children with ASD show normal integration of visual and auditory speech stimuli. Given that training in recognition of visual speech was effective, children with ASD may benefit from multi-modal approaches in imitative therapy and language training.
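The integration and non-integration models compared above are not spelled out in the abstract. A common formalization for bimodal speech identification, associated with the research lineage behind the Baldi virtual head, is a fuzzy-logical-model-style multiplicative rule; the sketch below assumes that specific form rather than taking it from the paper, and the function names are illustrative.

```python
def flmp_bimodal(a: float, v: float) -> float:
    """Multiplicative (integration) combination of auditory support `a`
    and visual support `v` for one speech alternative, each in [0, 1].
    Returns the predicted probability of choosing that alternative."""
    num = a * v
    denom = a * v + (1.0 - a) * (1.0 - v)
    return num / denom

def non_integration(a: float, v: float) -> float:
    """A simple non-integration alternative: respond on the basis of
    whichever single modality provides the stronger support."""
    return max(a, v)

# Under integration, weak auditory evidence combined with strong visual
# evidence yields bimodal support exceeding either unimodal value.
p = flmp_bimodal(0.6, 0.9)
```

The key behavioural signature distinguishing the two models is exactly this super-additivity of bimodal accuracy relative to the better unimodal channel.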

Relevance: 30.00%

Abstract:

Auditory sensory gating (ASG) is the ability of individuals to suppress incoming irrelevant sensory input, indexed by the evoked response to paired auditory stimuli. ASG is impaired in psychopathologies such as schizophrenia, in which it has been proposed as a putative endophenotype. This study aims to characterise the electrophysiological properties of the phenomenon using MEG in the time and frequency domains, and to localise the putative networks involved in the process at both sensor and source level. We also investigated the relationship between ASG measures and personality profiles in healthy participants, in the light of its candidate endophenotype role in psychiatric disorders. Auditory evoked magnetic fields were recorded in twenty-seven healthy participants using the P50 'paired-click' paradigm: clicks presented in pairs (conditioning stimulus S1, testing stimulus S2) at 80 dB, separated by 250 ms, with an inter-trial interval of 7-10 seconds. The gating ratio in healthy adults ranged from 0.5 to 0.8, suggesting a dimensional nature of P50 ASG. The brain regions active during this process were bilateral superior temporal gyrus (STG) and bilateral inferior frontal gyrus (IFG); activation was significantly stronger in IFG during S2 as compared to S1 (at p<0.05). Measures of effective connectivity between these regions using dynamic causal modelling (DCM) revealed the role of the frontal cortex in modulating ASG, as suggested by intracranial studies, indicating a major role of inhibitory interneuron connections. Findings from this study identified a unique event-related oscillatory pattern for P50 ASG, with alpha (STG)-beta (IFG) desynchronization and an increase in cortical oscillatory gamma power (IFG) during the S2 condition as compared to S1. These findings show that the main generator of the P50 response is within the temporal lobe, and that inhibitory interneurons and gamma oscillations in the frontal cortex contribute substantially to sensory gating.
Our findings also show that ASG is a predictor of personality profile (the introversion vs. extraversion dimension).
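The gating ratio quoted above is conventionally the ratio of the evoked-response amplitude to the test click (S2) over the conditioning click (S1). A simplified sketch of that computation follows; the peak-picking window and baseline correction are assumptions, not details from the abstract.

```python
import numpy as np

def p50_gating_ratio(s1_epoch: np.ndarray, s2_epoch: np.ndarray) -> float:
    """Suppression index for the paired-click paradigm: peak evoked
    amplitude to S2 divided by peak evoked amplitude to S1.
    Lower values indicate stronger sensory gating."""
    s1_peak = np.max(np.abs(s1_epoch))
    s2_peak = np.max(np.abs(s2_epoch))
    return float(s2_peak / s1_peak)

# Toy evoked responses: the S2 response is attenuated to 60% of S1,
# giving a ratio within the 0.5-0.8 range reported for healthy adults.
t = np.linspace(0.0, 0.1, 200)
s1 = np.sin(2 * np.pi * 10 * t)
s2 = 0.6 * s1
ratio = p50_gating_ratio(s1, s2)
```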

Relevance: 30.00%

Abstract:

Once thought to be predominantly the domain of cortex, multisensory integration has now been found at numerous sub-cortical locations in the auditory pathway. Prominent ascending and descending connections within the pathway suggest that the system may utilize non-auditory activity to help filter incoming sounds as they first enter the ear. Active mechanisms in the periphery, particularly the outer hair cells (OHCs) of the cochlea and the middle ear muscles (MEMs), are capable of modulating the sensitivity of other peripheral mechanisms involved in the transduction of sound into the system. Through indirect mechanical coupling of the OHCs and MEMs to the eardrum, the motion of these mechanisms can be recorded as acoustic signals in the ear canal. Here, we utilize this recording technique to describe three experiments that demonstrate novel multisensory interactions occurring at the level of the eardrum. 1) In the first experiment, measurements in humans and monkeys performing a saccadic eye movement task to visual targets indicate that the eardrum oscillates in conjunction with eye movements. The amplitude and phase of the eardrum movement, which we dub the Oscillatory Saccadic Eardrum Associated Response (OSEAR), depended on the direction and horizontal amplitude of the saccade and occurred in the absence of any externally delivered sounds. 2) For the second experiment, we use an audiovisual cueing task to demonstrate a dynamic change in pressure levels in the ear when a sound is expected versus when one is not. Specifically, we observe a drop in frequency power and variability from 0.1 to 4 kHz around the time when the sound is expected to occur, in contrast to a slight increase in power at both lower and higher frequencies.
3) For the third experiment, we show that seeing a speaker say a syllable that is incongruent with the accompanying audio can alter the response patterns of the auditory periphery, particularly during the most relevant moments in the speech stream. These visually influenced changes may contribute to the altered percept of the speech sound. Collectively, we presume that these findings represent the combined effect of OHCs and MEMs acting in tandem in response to various non-auditory signals in order to manipulate the receptive properties of the auditory system. These influences may have a profound, and previously unrecognized, impact on how the auditory system processes sounds from initial sensory transduction all the way to perception and behavior. Moreover, we demonstrate that the entire auditory system is, fundamentally, a multisensory system.

Relevance: 30.00%

Abstract:

It has been recently shown that local field potentials (LFPs) from the auditory and visual cortices carry information about sensory stimuli, but whether this is a universal property of sensory cortices remains to be determined. Moreover, little is known about the temporal dynamics of sensory information contained in LFPs following stimulus onset. Here we investigated the time course of the amount of stimulus information in LFPs and spikes from the gustatory cortex of awake rats subjected to tastants and water delivery on the tongue. We found that the phase and amplitude of multiple LFP frequencies carry information about stimuli, which have specific time courses after stimulus delivery. The information carried by LFP phase and amplitude was independent within frequency bands, since the joint information exhibited neither synergy nor redundancy. Tastant information in LFPs was also independent and had a different time course from the information carried by spikes. These findings support the hypothesis that the brain uses different frequency channels to dynamically code for multiple features of a stimulus.
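As a rough illustration of the quantity being estimated above, the stimulus information carried by a single-trial LFP feature (such as band-limited phase or amplitude) can be computed as Shannon mutual information between stimulus identity and the binned feature. The histogram estimator below is a deliberate simplification; analyses of this kind typically add bias correction, which is omitted here.

```python
import numpy as np

def mutual_information(stimuli: np.ndarray, feature: np.ndarray,
                       n_bins: int = 8) -> float:
    """Shannon mutual information (bits) between discrete stimulus
    labels and a continuous per-trial LFP feature, estimated by
    equal-width histogram binning of the feature."""
    edges = np.linspace(feature.min(), feature.max() + 1e-12, n_bins + 1)
    binned = np.digitize(feature, edges[1:-1])   # bin index 0..n_bins-1
    mi = 0.0
    for s in np.unique(stimuli):
        for b in range(n_bins):
            p_sb = np.mean((stimuli == s) & (binned == b))  # joint prob.
            if p_sb > 0:
                p_s = np.mean(stimuli == s)                 # marginals
                p_b = np.mean(binned == b)
                mi += p_sb * np.log2(p_sb / (p_s * p_b))
    return mi
```

Running this in a sliding window after stimulus delivery, separately for phase and amplitude in each band, yields the kind of information time courses the abstract describes.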

Relevance: 20.00%

Abstract:

Gabor representations have been widely used in facial analysis (face recognition, face detection and facial expression detection) due to their biological relevance and computational properties. Two popular Gabor representations used in the literature are: 1) Log-Gabor and 2) Gabor energy filters. Even though these representations are somewhat similar, they also have distinct differences: the Log-Gabor filters mimic the simple cells in the visual cortex, while the Gabor energy filters emulate the complex cells, which causes subtle differences in the responses. In this paper, we analyze the difference between these two Gabor representations and quantify these differences on the task of facial action unit (AU) detection. In our experiments conducted on the Cohn-Kanade dataset, we report an average area under the ROC curve (A′) of 92.60% across 17 AUs for the Gabor energy filters, while the Log-Gabor representation achieved an average A′ of 96.11%. This result suggests that the small spatial differences that the Log-Gabor filters pick up on are more useful for AU detection than the differences in contours and edges that the Gabor energy filters extract.
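The Log-Gabor filter mentioned above differs from a linear-frequency Gabor in having a Gaussian profile on a logarithmic frequency axis, which gives it exactly zero DC response and a longer high-frequency tail. A sketch of its radial frequency response follows; the 0.55 bandwidth ratio is a common convention, not a value taken from the paper.

```python
import numpy as np

def log_gabor_radial(freqs: np.ndarray, f0: float,
                     sigma_ratio: float = 0.55) -> np.ndarray:
    """Radial frequency response of a Log-Gabor filter: a Gaussian on
    the log-frequency axis centred at f0. Response at DC (f = 0) is
    defined as zero, unlike a linear-frequency Gabor."""
    out = np.zeros_like(freqs, dtype=float)
    nz = freqs > 0
    out[nz] = np.exp(-(np.log(freqs[nz] / f0) ** 2)
                     / (2.0 * np.log(sigma_ratio) ** 2))
    return out

f = np.linspace(0.0, 0.5, 512)      # normalized spatial frequency
g = log_gabor_radial(f, f0=0.1)     # zero at DC, unit peak near f0
```

A 2D filter bank for face analysis is then built by multiplying this radial profile with an angular Gaussian at several orientations and scales.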

Relevance: 20.00%

Abstract:

The relationship between neuronal acuity and behavioral performance was assessed in the barn owl (Tyto alba), a nocturnal raptor renowned for its ability to localize sounds and for the topographic representation of auditory space found in the midbrain. We measured discrimination of sound-source separation using a newly developed procedure involving the habituation and recovery of the pupillary dilation response (PDR). The smallest discriminable change of source location was found to be about two times finer in azimuth than in elevation. Recordings from neurons in its midbrain space map revealed that their spatial tuning, like the spatial discrimination behavior, was also better in azimuth than in elevation by a factor of about two. Because the PDR behavioral assay is mediated by the same circuitry whether discrimination is assessed in azimuth or in elevation, this difference in vertical and horizontal acuity is likely to reflect a true difference in sensory resolution, without additional confounding effects of differences in motor performance in the two dimensions. Our results, therefore, are consistent with the hypothesis that the acuity of the midbrain space map determines auditory spatial discrimination.

Relevance: 20.00%

Abstract:

The primary objective of the experiments reported here was to demonstrate the effects of opening up the design envelope for auditory alarms on the ability of people to learn the meanings of a set of alarms. Two sets of alarms were tested, one already extant and one newly-designed set for the same set of functions, designed according to a rationale set out by the authors aimed at increasing the heterogeneity of the alarm set and incorporating some well-established principles of alarm design. For both sets of alarms, a similarity-rating experiment was followed by a learning experiment. The results showed that the newly-designed set was judged to be more internally dissimilar, and easier to learn, than the extant set. The design rationale outlined in the paper is useful for design purposes in a variety of practical domains and shows how alarm designers, even at a relatively late stage in the design process, can improve the efficacy of an alarm set.

Relevance: 20.00%

Abstract:

This study aimed to examine the effects on driving, usability and subjective workload of performing music selection tasks using a touch screen interface, and to explore whether the provision of visual and/or auditory feedback offers any performance and usability benefits. Thirty participants performed music selection tasks with a touch screen interface while driving. The interface provided four forms of feedback: none, auditory, visual, and a combination of auditory and visual. Performing the music selection tasks significantly increased subjective workload and degraded performance on a range of driving measures, including lane-keeping variation and number of lane excursions. The provision of any form of feedback on the touch screen interface did not significantly affect driving performance, usability or subjective workload, but was preferred by users over no feedback. Overall, the results suggest that touch screens may not be a suitable input device for navigating scrollable lists.

Relevance: 20.00%

Abstract:

Time-varying bispectra, computed using a classical sliding-window short-time Fourier approach, are analyzed for scalp EEG potentials evoked by an auditory stimulus, and new observations are presented. A single, short-duration tone is presented from the left or the right, the direction unknown to the test subject. The subject responds by moving the eyes in the direction of the sound. EEG epochs sampled at 200 Hz over repeated trials are processed between -70 ms and +1200 ms with reference to the stimulus. It is observed that for an ensemble of correctly recognized cases, the best-matching time-varying bispectra at (8 Hz, 8 Hz) are for the PZ-FZ channels; this is also largely the case for grand averages, but not for power spectra at 8 Hz. Out of 11 subjects, the only exception to the time-varying bispectral match was a subject with a family history of Alzheimer's disease, and the difference was in bicoherence, not biphase.
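The bispectra above are computed in sliding short-time windows; the sketch below shows only the underlying trial-averaged estimator for a single window, using illustrative frequency-bin indices (the toy signal's bins 5, 8 and 13 are assumptions, not the study's 8 Hz components).

```python
import numpy as np

def bispectrum(trials: np.ndarray, f1: int, f2: int) -> complex:
    """Direct FFT-based bispectrum estimate at bin pair (f1, f2),
    averaged over trials (rows): B(f1,f2) = E[X(f1) X(f2) conj(X(f1+f2))].
    Its magnitude is large when the components at f1, f2 and f1+f2
    are quadratically phase-coupled."""
    X = np.fft.fft(trials, axis=1)                 # one FFT per trial
    return complex(np.mean(X[:, f1] * X[:, f2] * np.conj(X[:, f1 + f2])))

# Toy demonstration: 200 trials, 128 samples each, with components at
# bins 5, 8 and 13; in the "coupled" condition the phase at bin 13 is
# locked to the sum of the phases at bins 5 and 8.
rng = np.random.default_rng(0)
n_trials, N = 200, 128
n = np.arange(N)

def make_trials(coupled: bool) -> np.ndarray:
    out = np.empty((n_trials, N))
    for i in range(n_trials):
        p1, p2, p3 = rng.uniform(0, 2 * np.pi, 3)
        if coupled:
            p3 = p1 + p2          # quadratic phase coupling
        out[i] = (np.cos(2 * np.pi * 5 * n / N + p1)
                  + np.cos(2 * np.pi * 8 * n / N + p2)
                  + np.cos(2 * np.pi * 13 * n / N + p3))
    return out

b_coupled = abs(bispectrum(make_trials(True), 5, 8))
b_random = abs(bispectrum(make_trials(False), 5, 8))   # averages toward 0
```

Bicoherence, the quantity distinguishing the exceptional subject, is this magnitude normalized by the power at the three bins; biphase is the argument of the same complex average.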

Relevance: 20.00%

Abstract:

Objective: We performed spike-triggered functional MRI (fMRI) in a 12-year-old girl with Benign Epilepsy with Centro-temporal Spikes (BECTS) and left-sided spikes. Our aim was to demonstrate the cerebral origin of her interictal spikes. Methods: EEG was recorded within the 3 Tesla MRI. Whole-brain fMRI images were acquired, beginning 2-3 seconds after spikes. Baseline fMRI images were acquired when there were no spikes for 20 seconds. Image sets were compared with Student's t-test. Results: Ten spike and 20 baseline brain volumes were analysed. Focal activation was seen in the inferior left sensorimotor cortex near the face area. The anterior cingulate was more active during baseline than during spikes. Conclusions: Left-sided epileptiform activity in this patient with BECTS is associated with fMRI activation in the left face region of the somatosensory cortex, which would be consistent with the facial sensorimotor involvement in BECTS seizures. The presence of BOLD signal change in other regions raises the possibility that the scalp-recorded field of this patient with BECTS may reflect electrical change in more than one brain region.
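The voxel-wise comparison of spike and baseline image sets with Student's t-test can be sketched as below; this assumes an equal-variance two-sample t per voxel, and the study's exact preprocessing and significance thresholding are not specified here.

```python
import numpy as np

def spike_triggered_tmap(spike_vols: np.ndarray,
                         base_vols: np.ndarray) -> np.ndarray:
    """Voxel-wise two-sample (pooled-variance) Student's t comparing
    fMRI volumes acquired after interictal spikes with spike-free
    baseline volumes. Inputs: (n_volumes, n_voxels). Returns one
    t-value per voxel; large positive t marks spike-related activation."""
    n1, n2 = len(spike_vols), len(base_vols)
    m1, m2 = spike_vols.mean(axis=0), base_vols.mean(axis=0)
    v1 = spike_vols.var(axis=0, ddof=1)
    v2 = base_vols.var(axis=0, ddof=1)
    pooled = ((n1 - 1) * v1 + (n2 - 1) * v2) / (n1 + n2 - 2)
    return (m1 - m2) / np.sqrt(pooled * (1.0 / n1 + 1.0 / n2))

# Toy data mirroring the study's volume counts (10 spike, 20 baseline);
# voxel 0 carries a strong simulated BOLD increase after spikes.
rng = np.random.default_rng(0)
spike = rng.normal(size=(10, 100))
base = rng.normal(size=(20, 100))
spike[:, 0] += 5.0
tmap = spike_triggered_tmap(spike, base)
```

The resulting t-map is then thresholded and overlaid on anatomy to localize the activation (here, the inferior left sensorimotor cortex).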