963 results for auditory cortex
Abstract:
Recent progress in neuroscience has revealed diverse regions of the CNS that modulate autonomic and affective responses. The ventromedial prefrontal cortex (vmPFC) plays a key role in these regulations. There is evidence that vmPFC activity is associated with cardiovascular changes during a motor task that are mediated by parasympathetic activity. Moreover, vmPFC activity makes important contributions to the regulation of affective and stressful situations. This review selectively summarizes literature in which vmPFC activation was studied in healthy subjects as well as in patients with affective disorders. The reviewed literature suggests that vmPFC activity plays a pivotal role in biopsychosocial processes of disease. Activity in the vmPFC might link affective disorders, stressful environmental conditions, and immune function.
Abstract:
OBJECTIVES: With more children receiving cochlear implants during infancy, there is a need for validated assessments of pre-verbal and early verbal auditory skills. The LittlEARS Auditory Questionnaire is presented here as the first module of the LittlEARS test battery. The LittlEARS Auditory Questionnaire was developed and piloted to assess the auditory behaviour of normal hearing children and of hearing impaired children who receive a cochlear implant or hearing aid prior to 24 months of age. This paper presents results from two studies: one validating the LittlEARS Auditory Questionnaire on German-speaking children with normal hearing, and a second validating the norm curves found after adaptation and administration of the questionnaire to children with normal hearing in 15 other languages. METHODS: Scores from a group of 218 German and Austrian children with normal hearing, aged between 5 days and 24 months, were used to create a norm curve. The questionnaire has been adapted from the German original into English and, to date, 15 other languages. Regression curves were computed from parental responses for 3309 normal hearing infants and toddlers, and the curve for each language was compared with the original German validation curve. RESULTS: The first study yielded a norm curve reflecting the age-dependence of auditory behaviour, established the questionnaire's reliability and homogeneity as a measure of auditory behaviour, and provided expected and critical values as a function of age. The second study showed that the regression curves for all the adapted languages are essentially equal to the German norm curve, as no statistically significant differences were found. CONCLUSIONS: The LittlEARS Auditory Questionnaire is a valid, language-independent tool for assessing the early auditory behaviour of infants and toddlers with normal hearing.
The results of this study suggest that the LittlEARS Auditory Questionnaire could also be very useful for documenting children's progress with their current amplification, providing evidence of the need for implantation, or highlighting the need for follow-up in other developmental areas.
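The norm-curve approach described in this abstract can be sketched numerically. The following is a minimal illustration on synthetic data, not the published LittlEARS methodology: the saturating growth shape, the noise level, the quadratic regression, and the two-SD "critical value" criterion are all assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def assumed_development(age_months):
    # Assumed saturating growth of early auditory skills (illustrative only).
    return 35.0 * (1.0 - np.exp(-age_months / 8.0))

# Synthetic parental-questionnaire data: age in months, score out of 35 items.
age = rng.uniform(0.0, 24.0, 200)
score = np.clip(assumed_development(age) + rng.normal(0.0, 2.0, age.size), 0.0, 35.0)

# Fit a quadratic regression as a simple stand-in for the published norm curve.
norm_curve = np.poly1d(np.polyfit(age, score, deg=2))

# Expected value at a given age, and a "critical value" two residual SDs below
# it, flagging children whose scores fall well below the norm for their age.
residual_sd = float(np.std(score - norm_curve(age)))
expected_12mo = float(norm_curve(12.0))
critical_12mo = expected_12mo - 2.0 * residual_sd
```

Comparing adapted-language versions would then amount to fitting one such curve per language sample and testing whether the curves differ significantly, which is the logic the second study reports.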
Abstract:
BACKGROUND: Sedation protocols, including the use of sedation scales and regular sedation stops, help to reduce the length of mechanical ventilation and intensive care unit stay. Because clinical assessment of the depth of sedation is labor-intensive, performed only intermittently, and interferes with sedation and sleep, processed electrophysiological signals from the brain have gained interest as surrogates. We hypothesized that auditory event-related potentials (ERPs), Bispectral Index (BIS), and Entropy can discriminate among clinically relevant sedation levels. METHODS: We studied 10 patients after elective thoracic or abdominal surgery with general anesthesia. Electroencephalogram, BIS, state entropy (SE), response entropy (RE), and ERPs were recorded immediately after surgery in the intensive care unit at Richmond Agitation-Sedation Scale (RASS) scores of -5 (very deep sedation), -4 (deep sedation), -3 to -1 (moderate sedation), and 0 (awake) during decreasing target-controlled sedation with propofol and remifentanil. Reference measurements for baseline levels were performed before or several days after the operation. RESULTS: At baseline, RASS -5, RASS -4, RASS -3 to -1, and RASS 0, BIS was 94 [4] (median [IQR]), 47 [15], 68 [9], 75 [10], and 88 [6]; SE was 87 [3], 46 [10], 60 [22], 74 [21], and 87 [5]; and RE was 97 [4], 48 [9], 71 [25], 81 [18], and 96 [3], respectively (all P < 0.05, Friedman test). Both BIS and Entropy showed high variability. When ERP N100 amplitudes were considered alone, ERPs did not differ significantly among sedation levels. Nevertheless, discriminant ERP analysis including two parameters from a principal component analysis yielded a prediction probability (PK) of 0.89 for differentiating deep sedation, moderate sedation, and the awake state. The corresponding PK for RE, SE, and BIS was 0.88, 0.89, and 0.85, respectively. CONCLUSIONS: Neither ERPs nor BIS or Entropy can replace clinical sedation assessment with standard scoring systems.
Discrimination among very deep, deep to moderate, and no sedation after general anesthesia can be provided by ERPs and processed electroencephalograms, with similar PK values. The high inter- and intraindividual variability of Entropy and BIS precludes defining a target range of values to predict the sedation level in critically ill patients using these parameters. The variability of ERPs is unknown.
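The prediction probability PK reported above is a concordance measure: the probability that the indicator correctly orders a randomly chosen pair of observations with different clinical sedation levels, counting indicator ties as half. A minimal sketch with hypothetical RASS and indicator values follows; this is an illustration of the PK definition, not the authors' discriminant-analysis pipeline.

```python
import itertools

def prediction_probability(levels, values):
    """PK: chance that the indicator correctly orders a randomly chosen pair
    of observations with different clinical levels; indicator ties count half."""
    concordant = discordant = tied = 0
    for (l1, v1), (l2, v2) in itertools.combinations(zip(levels, values), 2):
        if l1 == l2:
            continue  # same clinical level: the pair is uninformative
        if v1 == v2:
            tied += 1
        elif (l1 < l2) == (v1 < v2):
            concordant += 1
        else:
            discordant += 1
    return (concordant + 0.5 * tied) / (concordant + discordant + tied)

# Hypothetical RASS levels (-5 = very deep, 0 = awake) and indicator values
# (BIS-like, higher when more awake); one mis-ordered pair out of 12.
pk = prediction_probability([-5, -5, -3, -3, 0, 0], [45, 50, 70, 48, 90, 85])
```

A perfectly monotone indicator gives PK = 1.0 and a chance-level indicator gives 0.5, which is why the reported values of roughly 0.85 to 0.89 indicate useful but imperfect discrimination.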
Abstract:
OBJECTIVE: To study the neurocognitive profile and its relationship to prefrontal dysfunction in non-demented Parkinson's disease (PD) with deficient haptic perception. METHODS: Twelve right-handed patients with PD and 12 healthy control subjects underwent thorough neuropsychological testing, including the Rey complex figure, Rey auditory verbal and figural learning tests, figural and verbal fluency, and the Stroop test. Test scores showing significant differences between patients and healthy subjects were correlated with the individual expression coefficients of one principal component, obtained in a principal component analysis of an oxygen-15-labeled water PET study of somatosensory discrimination that differentiated between the two groups and involved prefrontal cortices. RESULTS: We found significantly decreased total scores for the verbal learning trials and verbal delayed free recall in PD patients compared with normal volunteers. Further analysis of these parameters using Spearman's rank correlation showed a significant negative correlation between deficient verbal recall and the expression coefficients of the principal component whose image showed a subcortical-cortical network, including the right dorsolateral prefrontal cortex, in PD patients. CONCLUSION: PD patients with disrupted right dorsolateral prefrontal cortex function and associated diminished somatosensory discrimination are also impaired in verbal memory functions. A negative correlation between delayed verbal free recall and PET activation in a network including the prefrontal cortices suggests that verbal cues, and accordingly declarative memory processes, may be operative in PD during activities that demand sustained attention, such as somatosensory discrimination. Verbal cues may be compensatory in nature and help to non-specifically enhance focused attention in the presence of a functionally disrupted prefrontal cortex.
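The Spearman correlation used in this study is simply the Pearson correlation computed on ranks. A minimal sketch follows (no tie handling); the recall scores and principal-component expression coefficients below are hypothetical stand-ins for the study's data, chosen to show the negative relationship the abstract reports.

```python
import numpy as np

def spearman_rho(x, y):
    # Spearman's rank correlation: Pearson correlation of the ranks.
    # This sketch assumes no tied values; ties would need averaged ranks.
    def ranks(v):
        return np.argsort(np.argsort(v))
    return float(np.corrcoef(ranks(x), ranks(y))[0, 1])

# Hypothetical data: lower delayed-recall scores paired with higher expression
# coefficients of the prefrontal principal component (a negative correlation).
recall = [12, 9, 11, 7, 5, 8]
expression = [0.1, 0.5, 0.3, 0.6, 0.9, 0.4]
rho = spearman_rho(recall, expression)  # strongly negative for this data
```

Because only ranks enter the computation, the measure is robust to the skewed score distributions typical of small neuropsychological samples, which is presumably why the authors chose it over Pearson's r.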
Abstract:
In this article, it is shown that IWD incorporates topological perceptual characteristics of both spoken and written language, and it is argued that these characteristics should not be ignored or given up when synchronous textual CMC is technologically developed and upgraded.
Abstract:
The integration of the auditory modality in virtual reality environments is known to promote the sensations of immersion and presence. However, it is also known from psychophysics studies that auditory-visual interactions obey complex rules and that multisensory conflicts may disrupt the participant's engagement with the presented virtual scene. It is thus important to measure the accuracy of the auditory spatial cues reproduced by the auditory display and their consistency with the visual spatial cues. This study evaluates auditory localization performance under various unimodal and bimodal auditory-visual conditions in a virtual reality (VR) setup using a stereoscopic display and binaural reproduction over headphones, under static conditions. The auditory localization performance observed in the present study is in line with that reported in real conditions, suggesting that VR gives rise to consistent auditory and visual spatial cues. These results validate the use of VR for future psychophysics experiments with auditory and visual stimuli. They also emphasize the importance of a spatially accurate auditory and visual rendering for VR setups.
Abstract:
This article describes a series of experiments carried out to measure the sense of presence in auditory virtual environments. Within the study, self-created signals are compared with signals created by the surrounding environment. Furthermore, it is investigated whether the room characteristics of the simulated environment affect the perception of presence during vocalization or when listening to speech. Finally, the experiments provide information about the influence of background signals on the sense of presence. In the experiments, subjects rated the degree of perceived presence in an auditory virtual environment on a perceptual scale. The article identifies which parameters have the most influence on the perception of presence and which are of minor influence. The results show that, on the one hand, an external speaker has more influence on the sense of presence than an adequate presentation of one’s own voice; on the other hand, both room reflections and adequately presented background signals significantly increase the perceived presence in the virtual environment.
Abstract:
The characteristics of moving sound sources have strong implications for the listener's distance perception and estimation of velocity. Modifications of typical sound emissions, as currently occurring with the shift towards electromobility, have an impact on pedestrians' safety in road traffic. Thus, investigations of the relevant cues for the velocity and distance perception of moving sound sources are of interest not only to the psychoacoustic community, but also for several applications, such as virtual reality, noise pollution, and the safety aspects of road traffic. This article describes a series of psychoacoustic experiments in this field. Dichotic and diotic stimuli from a set of real-life recordings of a passing passenger car and a motorcycle were presented to test subjects, who were asked to determine the velocity of the object and its minimal distance from the listener. The results of these psychoacoustic experiments show that the estimated velocity is strongly linked to the object's distance. Furthermore, it could be shown that binaural cues contribute significantly to the perception of velocity. In a further experiment, it was shown that, independently of the type of vehicle, the main parameter for distance determination is the maximum sound pressure level at the listener's position. The article suggests a system architecture for the adequate consideration of moving sound sources in virtual auditory environments. Virtual environments can thus be used to investigate the influence of new vehicle powertrain concepts, and the related sound emissions, on pedestrians' ability to estimate the distance and velocity of moving objects.