914 results for Level of processing
Abstract:
Purpose We hypothesized that reduced arousability (Richmond Agitation Sedation Scale, RASS, scores −2 to −3) for any reason during delirium assessment increases the apparent prevalence of delirium in intensive care patients. To test this hypothesis, we assessed delirium using the Confusion Assessment Method for the Intensive Care Unit (CAM-ICU) and Intensive Care Delirium Screening Checklist (ICDSC) in intensive care patients during sedation stops, and related the findings to the level of sedation, as assessed with RASS score. Methods We assessed delirium in 80 patients with ICU stay longer than 48 h using CAM-ICU and ICDSC during daily sedation stops. Sedation was assessed using RASS. The effect of including patients with a RASS of −2 and −3 during sedation stop (“light to moderate sedation”, eye contact less than 10 s or not at all, respectively) on prevalence of delirium was analyzed. Results A total of 467 patient days were assessed. The proportion of CAM-ICU-positive evaluations decreased from 53 to 31 % (p < 0.001) if assessments from patients at RASS −2/−3 (22 % of all assessments) were excluded. Similarly, the number of positive ICDSC results decreased from 51 to 29 % (p < 0.001). Conclusions Sedation per se can result in positive items of both CAM-ICU and ICDSC, and therefore in a diagnosis of delirium. Consequently, apparent prevalence of delirium is dependent on how a depressed level of consciousness after sedation stop is interpreted (delirium vs persisting sedation). We suggest that any reports on delirium using these assessment tools should be stratified for a sedation score during the assessment.
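The shift in apparent prevalence reported above is simple arithmetic over assessment counts. A minimal sketch, with counts chosen only to mimic the magnitudes in the abstract (not the study's data or analysis code):

```python
# Illustrative only: hypothetical counts chosen to mirror the magnitudes in the
# abstract (467 assessments, ~22% at RASS -2/-3, CAM-ICU positivity 53% vs 31%).

total = 467            # assessed patient days
sedated = 103          # assessments made at RASS -2/-3 during sedation stop (~22%)
positive_all = 247     # CAM-ICU positive among all assessments (~53%)
positive_awake = 113   # CAM-ICU positive among assessments at RASS >= -1 (~31%)

prevalence_all = positive_all / total
prevalence_awake_only = positive_awake / (total - sedated)

print(f"Apparent prevalence, all assessments:      {prevalence_all:.0%}")
print(f"Apparent prevalence, RASS -2/-3 excluded:  {prevalence_awake_only:.0%}")
```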
Abstract:
Previous syntheses on the effects of environmental conditions on the outcome of plant-plant interactions summarize results from pairwise studies. However, the upscaling of such studies to the community level is problematic because of the existence of multiple species assemblages and species-specific responses to both the environmental conditions and the presence of neighbors. We conducted the first global synthesis of community-level studies from harsh environments, which included data from 71 alpine and 137 dryland communities, to: (i) test how important facilitative interactions are as a driver of community structure, (ii) evaluate whether we can predict the frequency of positive plant-plant interactions across differing environmental conditions and habitats, and (iii) assess whether thresholds in the response of plant-plant interactions to environmental gradients exist between "moderate" and "extreme" environments. We also used those community-level studies performed across gradients of at least three points to evaluate how the average environmental conditions, the length of the gradient studied, and the number of points sampled across the gradient affect the form and strength of the facilitation-environmental conditions relationship. Over 25% of the species present were more spatially associated with nurse plants than expected by chance in both alpine and dryland areas, illustrating the high importance of positive plant-plant interactions for the maintenance of plant diversity in these environments. Facilitative interactions were more frequent, and more related to environmental conditions, in alpine than in dryland areas, perhaps because drylands are generally characterized by a larger variety of environmental stress factors and plant functional traits. The frequency of facilitative interactions in alpine communities peaked at 1000 mm of annual rainfall and globally decreased with elevation. The frequency of positive interactions in dryland communities decreased globally with water scarcity or temperature annual range. Positive facilitation-drought stress relationships are more likely in shorter regional gradients, but these relationships are obscured in regions with a greater species turnover or with complex environmental gradients. By showing the different climatic drivers and behaviors of plant-plant interactions in dryland and alpine areas, our results will improve predictions regarding the effect of facilitation on the assembly of plant communities and their response to changes in environmental conditions.
Abstract:
PRINCIPLES Patients with carotid artery stenosis (CAS) are at risk of ipsilateral stroke and chronic compromise of cerebral blood flow. It is under debate whether hypoperfusion or embolism in CAS is directly related to cognitive impairment. Alternatively, CAS may be a marker for underlying risk factors, which themselves influence cognition. We aimed to determine the cognitive performance level and the emotional state of patients with CAS. We hypothesised that patients with high-grade stenosis, bilateral stenosis, symptomatic patients and/or those with relevant risk factors would suffer impairment of their cognitive performance and emotional state. METHODS A total of 68 patients with CAS of ≥70% were included in a prospective exploratory study design. All patients underwent structured assessment of executive functions, language, verbal and visual memory, motor speed, anxiety and depression. RESULTS Significantly more patients with CAS showed cognitive impairments (executive functions, word production, verbal and visual memory, motor speed) and anxiety than expected in a normative sample. Bilateral and symptomatic stenosis was associated with slower processing speed. Cognitive performance and anxiety level were not influenced by the side and the degree of stenosis or the presence of collaterals. Factors associated with less cognitive impairment included higher education level, female gender, ambidexterity and treated hypercholesterolemia. CONCLUSIONS Cognitive impairment and an increased level of anxiety are frequent in patients with carotid stenosis. The lack of a correlation between cognitive functioning and the degree of stenosis or the presence of collaterals challenges the view that CAS per se leads to cognitive impairment.
Abstract:
In both personal and societal contexts, people often evaluate the risk of environmental and technological hazards. Previous neuroscience research on risk evaluation mostly assessed the direct personal risk of presented stimuli, which may have included aspects of fear, for instance. Furthermore, risk evaluation was primarily compared with tasks from other cognitive domains serving as control conditions, revealing brain activity generally related to risk, but not activity specifically associated with estimating a higher level of risk. Here we investigated the neural basis on which lay persons individually evaluated the risk that different potential hazards pose to society. Twenty healthy subjects underwent functional magnetic resonance imaging while evaluating the risk of fifty more or less risky conditions presented as written terms. Brain activations during individual estimations of 'high' versus 'low' risk, and of negative versus neutral and positive emotional valence, were analyzed. Estimating hazards to be of high risk was associated with activation in the medial thalamus, anterior insula, caudate nucleus, cingulate cortex, and further prefrontal and temporo-occipital areas. These areas were not involved according to an analysis of the emotion ratings. In conclusion, we emphasize a contribution of these brain areas to signaling high risk that was not primarily associated with the emotional valence of the risk items. These areas have previously been reported to be associated with viscerosensitive and implicit processing, besides emotional processing. This suggests an intuitive contribution, or 'gut feeling', not necessarily dependent on subjective emotional valence, when a high risk of environmental hazards is estimated.
Abstract:
The present article analyzed how need for cognition (NFC) influences the formation of performance expectancies. When processing information, individuals lower in NFC often rely on salient information and shortcuts compared to individuals higher in NFC. We assume that these processing preferences also make individuals low in NFC more responsive to salient achievement-related cues, because the processing of salient cues is cognitively less demanding than the processing of non-salient cues. Therefore, individuals lower in NFC should tend to draw wider-ranging inferences from salient achievement-related information. In a sample of N = 197 secondary school students, achievement-related feedback (grade on an English examination) affected changes in expectancies in non-corresponding academic subjects (e.g., expectation of final grade in mathematics or history) when NFC was lower, whereas for students with higher NFC, changes in expectancies in non-corresponding academic subjects were not affected.
Abstract:
OBJECTIVES In Europe and elsewhere, health inequalities among HIV-positive individuals are of concern. We investigated late HIV diagnosis and late initiation of combination antiretroviral therapy (cART) by educational level, a proxy of socioeconomic position. DESIGN AND METHODS We used data from nine HIV cohorts within COHERE in Austria, France, Greece, Italy, Spain and Switzerland, collecting data on level of education in categories of the UNESCO International Standard Classification of Education: uncompleted basic, basic, secondary and tertiary education. We included individuals diagnosed with HIV between 1996 and 2011, aged at least 16 years, with known educational level and at least one CD4 cell count within 6 months of HIV diagnosis. We examined trends by education level in presentation with advanced HIV disease (AHD) (CD4 <200 cells/μl or AIDS within 6 months) using logistic regression, and the distribution of CD4 cell count at cART initiation, overall and among presenters without AHD, using median regression. RESULTS Among 15 414 individuals, 52, 45, 37, and 31% with uncompleted basic, basic, secondary and tertiary education, respectively, presented with AHD (P trend <0.001). Compared with patients with tertiary education, adjusted odds ratios of AHD were 1.72 (95% confidence interval 1.48-2.00) for uncompleted basic, 1.39 (1.24-1.56) for basic and 1.20 (1.08-1.34) for secondary education (P < 0.001). In unadjusted and adjusted analyses, median CD4 cell count at cART initiation was lower with poorer educational level. CONCLUSIONS Socioeconomic inequalities in delayed HIV diagnosis and initiation of cART are present in European countries with universal healthcare systems, and individuals with lower educational levels do not benefit equally from timely cART initiation.
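A minimal sketch of the kind of model named in the methods (logistic regression of AHD on educational level, with tertiary education as the reference category); the synthetic data frame, extra covariate and column names are illustrative assumptions, not the COHERE analysis:

```python
# Sketch: adjusted odds ratios of presenting with advanced HIV disease (AHD)
# by education level, with tertiary education as reference. Synthetic data only.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
df = pd.DataFrame({
    "ahd": rng.binomial(1, 0.4, 500),   # 1 = CD4 <200 cells/ul or AIDS within 6 months
    "education": rng.choice(
        ["uncompleted_basic", "basic", "secondary", "tertiary"], 500),
    "age": rng.uniform(16, 70, 500),    # hypothetical adjustment covariate
})

model = smf.logit(
    "ahd ~ C(education, Treatment(reference='tertiary')) + age", data=df
).fit(disp=False)

# Exponentiate coefficients to get odds ratios with 95% confidence intervals
summary = np.exp(pd.concat([model.params, model.conf_int()], axis=1))
summary.columns = ["OR", "2.5%", "97.5%"]
print(summary)
```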
Abstract:
BACKGROUND A cost-effective strategy to increase the density of available markers within a population is to sequence a small proportion of the population and impute whole-genome sequence data for the remaining population. Increased densities of typed markers are advantageous for genome-wide association studies (GWAS) and genomic predictions. METHODS We obtained genotypes for 54 602 SNPs (single nucleotide polymorphisms) in 1077 Franches-Montagnes (FM) horses and Illumina paired-end whole-genome sequencing data for 30 FM horses and 14 Warmblood horses. After variant calling, the sequence-derived SNP genotypes (~13 million SNPs) were used for genotype imputation with the software programs Beagle, Impute2 and FImpute. RESULTS The mean imputation accuracy of FM horses using Impute2 was 92.0%. Imputation accuracy using Beagle and FImpute was 74.3% and 77.2%, respectively. In addition, for Impute2 we determined the imputation accuracy of all individual horses in the validation population, which ranged from 85.7% to 99.8%. The subsequent inclusion of Warmblood sequence data further increased the correlation between true and imputed genotypes for most horses, especially for horses with a high level of admixture. The final imputation accuracy of the horses ranged from 91.2% to 99.5%. CONCLUSIONS Using Impute2, the imputation accuracy was higher than 91% for all horses in the validation population, which indicates that direct imputation of 50k SNP-chip data to sequence level genotypes is feasible in the FM population. The individual imputation accuracy depended mainly on the applied software and the level of admixture.
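Per-individual imputation accuracy of the kind reported here is typically summarized as the correlation (or concordance) between true and imputed genotypes; a minimal sketch on synthetic genotype matrices, not the FM horse data or the imputation programs named above:

```python
# Sketch: per-individual imputation accuracy as the correlation between true and
# imputed genotype dosages (0/1/2). Arrays are synthetic stand-ins.
import numpy as np

rng = np.random.default_rng(0)
n_horses, n_snps = 10, 1_000
true_genotypes = rng.integers(0, 3, size=(n_horses, n_snps)).astype(float)
# Imputed dosages: true values plus noise, clipped to the valid dosage range
imputed = np.clip(true_genotypes + rng.normal(0, 0.3, size=(n_horses, n_snps)), 0, 2)

for horse in range(n_horses):
    r = np.corrcoef(true_genotypes[horse], imputed[horse])[0, 1]
    concordance = np.mean(np.round(imputed[horse]) == true_genotypes[horse])
    print(f"horse {horse}: r = {r:.3f}, concordance = {concordance:.1%}")
```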
Abstract:
We regularize compact and non-compact Abelian Chern–Simons–Maxwell theories on a spatial lattice using the Hamiltonian formulation. We consider a doubled theory with gauge fields living on a lattice and its dual lattice. The Hilbert space of the theory is a product of local Hilbert spaces, each associated with a link and the corresponding dual link. The two electric field operators associated with the link-pair do not commute. In the non-compact case with gauge group R, each local Hilbert space is analogous to that of a charged “particle” moving in the link-pair group space R² in a constant “magnetic” background field. In the compact case, the link-pair group space is a torus U(1)² threaded by k units of quantized “magnetic” flux, with k being the level of the Chern–Simons theory. The holonomies of the torus U(1)² give rise to two self-adjoint extension parameters, which form two non-dynamical background lattice gauge fields that explicitly break the manifest gauge symmetry from U(1) to Z(k). The local Hilbert space of a link-pair then decomposes into representations of a magnetic translation group. In the pure Chern–Simons limit of a large “photon” mass, this results in a Z(k)-symmetric variant of Kitaev’s toric code, self-adjointly extended by the two non-dynamical background lattice gauge fields. Electric charges on the original lattice and on the dual lattice obey mutually anyonic statistics, with a statistics angle set by the level k. Non-Abelian U(k) Berry gauge fields that arise from the self-adjoint extension parameters may be interesting in the context of quantum information processing.
Abstract:
Histone pre-mRNA 3' processing is controlled by a hairpin element preceding the processing site that interacts with a hairpin-binding protein (HBP) and a downstream spacer element that serves as anchoring site for the U7 snRNP. In addition, the nucleotides following the hairpin and surrounding the processing site (ACCCA'CA) are conserved among vertebrate histone genes. Single to triple nucleotide mutations of this sequence were tested for their ability to be processed in nuclear extract from animal cells. Changing the first four nucleotides had no qualitative and little if any quantitative effects on histone RNA 3' processing in mouse K21 cell extract, where processing of this gene is virtually independent of the HBP. A gel mobility shift assay revealing HBP interactions and a processing assay in HeLa cell extract (where the contribution of HBP to efficient processing is more important) showed that only one of these mutations, predicted to extend the hairpin by one base pair, affected the interaction with HBP. Mutations in the next three nucleotides affected both the cleavage efficiency and the choice of processing sites. Analysis of these novel sites indicated a preference for the nucleotide 5' of the cleavage site in the order A > C > U > G. Moreover, a guanosine in the 3' position inhibited cleavage. The preference for an A is shared with the cleavage/polyadenylation reaction, but the preference order for the other nucleotides is different [Chen F, MacDonald CC, Wilusz J, 1995, Nucleic Acids Res 23:2614-2620].
Abstract:
Recent functional magnetic resonance imaging (fMRI) studies consistently revealed contributions of fronto-parietal and related networks to the execution of a visuospatial judgment task, the so-called "Clock Task". However, due to the low temporal resolution of fMRI, the exact cortical dynamics and timing of processing during task performance could not be resolved until now. In order to clarify the detailed cortical activity and temporal dynamics, 14 healthy subjects performed an established version of the "Clock Task", which comprises a visuospatial task (angle discrimination) and a control task (color discrimination) with the same stimulus material, in an electroencephalography (EEG) experiment. Based on the time-resolved analysis of network activations (microstate analysis), differences in timing between the angle and the color discrimination task were found after sensory processing, in a time window starting around 200 ms. Significant differences between the two tasks were observed in an analysis window from 192 ms to 776 ms. We divided this window into two parts: an early phase, from 192 ms to ∼440 ms, and a late phase, from ∼440 ms to 776 ms. For both tasks, the order of network activations and the types of networks were the same, but, in each phase, activations for the two conditions were dominated by differing network states with divergent temporal dynamics. Our results provide an important basis for the assessment of deviations in processing dynamics during visuospatial tasks in clinical populations.
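A greatly simplified sketch of a microstate-style segmentation (cluster scalp topographies at global field power peaks, then back-fit each time point by spatial correlation); standard microstate analysis as used in the study is polarity-invariant and uses a modified k-means, which this toy example on synthetic data omits:

```python
# Simplified microstate-style segmentation on synthetic EEG data.
import numpy as np
from scipy.signal import find_peaks
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)
n_channels, n_samples = 32, 2000
eeg = rng.normal(size=(n_channels, n_samples))     # synthetic stand-in for ERP data

gfp = eeg.std(axis=0)                              # global field power per time point
peaks, _ = find_peaks(gfp)                         # moments of highest signal-to-noise

# Cluster the topographies at GFP peaks into 4 template maps
kmeans = KMeans(n_clusters=4, n_init=10, random_state=0).fit(eeg[:, peaks].T)
maps = kmeans.cluster_centers_

def spatial_corr(a, b):
    """Spatial correlation between two topographies."""
    a, b = a - a.mean(), b - b.mean()
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

# Back-fit: label each time point by the best-matching template map
labels = np.array([np.argmax([spatial_corr(eeg[:, t], m) for m in maps])
                   for t in range(n_samples)])
print(labels[:20])
```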
Abstract:
OBJECTIVES
To test the applicability, accuracy, precision, and reproducibility of various 3D superimposition techniques for radiographic data, transformed to triangulated surface data.
METHODS
Five superimposition techniques (3P: three-point registration; AC: anterior cranial base; AC + F: anterior cranial base + foramen magnum; BZ: both zygomatic arches; 1Z: one zygomatic arch) were tested using eight pairs of pre-existing CT data (pre- and post-treatment). These were obtained from non-growing orthodontic patients treated with rapid maxillary expansion. All datasets were superimposed by three operators independently, who repeated the whole procedure one month later. Accuracy was assessed by the distance (D) between superimposed datasets on three form-stable anatomical areas, located on the anterior cranial base and the foramen magnum. Precision and reproducibility were assessed using the distances between models at four specific landmarks. Non-parametric multivariate models and Bland-Altman difference plots were used for analyses.
RESULTS
There was no difference among operators or between time points in the accuracy of each superimposition technique (p>0.05). The AC + F technique was the most accurate (D<0.17 mm), as expected, followed by the AC and BZ superimpositions, which presented a similar level of accuracy (D<0.5 mm). 3P and 1Z were the least accurate superimpositions (0.79
Abstract:
The majority of first-episode psychoses are preceded by a prodromal phase that lasts several years on average, frequently leads to some decline in psychosocial functioning, and offers the opportunity for early detection within the framework of an indicated prevention. To this end, two main approaches are currently followed. The ultra-high-risk (UHR) criteria were explicitly developed to predict first-episode psychosis within 12 months, and indeed the majority of conversions in clinical UHR samples seem to occur within the first 12 months after initial assessment. Their main criterion, the attenuated psychotic symptoms criterion, captures symptoms that resemble positive symptoms of psychosis (i.e. delusions, hallucinations and formal thought disorders), with the exception that some level of insight is still maintained; these symptoms frequently already compromise functioning. In contrast, the basic symptom criteria try to catch patients at increased risk of psychosis at the earliest possible time, i.e. ideally when only the first subtle disturbances in information processing have developed that are experienced with full insight and do not yet overload the person's coping abilities, and thus have not yet resulted in any functional decline. First results from prospective studies not only support this view, but indicate that the combination of both approaches might be a more favorable way to increase sensitivity and detect risk earlier, as well as to establish a change-sensitive risk stratification approach.
Abstract:
We have analysed the extent of base-pairing interactions between spacer sequences of histone pre-mRNA and U7 snRNA present in the trans-acting U7 snRNP and their importance for histone RNA 3' end processing in vitro. For the efficiently processed mouse H4-12 gene, a computer analysis revealed that additional base pairs could be formed with U7 RNA outside of the previously recognised spacer element (stem II). One complementarity (stem III) is located more 3' and involves nucleotides from the very 5' end of U7 RNA. The other, more 5' located complementarity (stem I) involves nucleotides of the Sm binding site of U7 RNA, a part known to interact with snRNP structural proteins. These potential stem structures are separated from each other by short internal loops of unpaired nucleotides. Mutational analyses of the pre-mRNA indicate that stems II and III are equally important for interaction with the U7 snRNP and for processing, whereas mutations in stem I have moderate effects on processing efficiency, but do not impair complex formation with the U7 snRNP. Thus nucleotides near the processing site may be important for processing, but do not contribute to the assembly of an active complex by forming a stem I structure. The importance of stem III was confirmed by the ability of a complementary mutation in U7 RNA to suppress a stem III mutation in a complementation assay using Xenopus laevis oocytes. The main role of the factor(s) binding to the upstream hairpin loop is to stabilise the U7-pre-mRNA complex. This was shown by either stabilising (by mutation) or destabilising (by increased temperature) the U7-pre-mRNA base-pairing under conditions where hairpin factor binding was either allowed or prevented (by mutation or competition). The hairpin dependence of processing was found to be inversely related to the strength of the U7-pre-mRNA interaction.
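A toy sketch of the kind of complementarity scan underlying the stem I–III analysis: count which positions of a pre-mRNA spacer fragment can pair with U7 snRNA, allowing Watson–Crick and G·U wobble pairs. The sequences below are placeholders, not the H4-12 spacer or the mouse U7 sequence:

```python
# Toy base-pairing scan between a pre-mRNA spacer fragment and a U7 snRNA fragment.
# Allows Watson-Crick (A-U, G-C) and G-U wobble pairs.

PAIRS = {("A", "U"), ("U", "A"), ("G", "C"), ("C", "G"), ("G", "U"), ("U", "G")}

def paired_positions(spacer, u7_reversed):
    """Pair the spacer (5'->3') against U7 read 3'->5'; return True/False per position."""
    return [(a, b) in PAIRS for a, b in zip(spacer, u7_reversed)]

spacer = "CAAGAAAGA"   # placeholder pre-mRNA spacer fragment (5'->3')
u7 = "UCUUUCUUG"       # placeholder U7 snRNA fragment (5'->3')

matches = paired_positions(spacer, u7[::-1])
print("".join("|" if m else " " for m in matches))
print(f"{sum(matches)}/{len(matches)} positions can pair")
```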
Abstract:
The use of intensity-modulated radiotherapy (IMRT) treatments necessitates a significant amount of patient-specific quality assurance (QA). This research investigated the precision and accuracy of Kodak EDR2 film measurements for IMRT verifications, the use of comparisons between 2D dose calculations and measurements to improve treatment plan beam models, and the dosimetric impact of delivery errors. New measurement techniques and software were developed and used clinically at M. D. Anderson Cancer Center. The software implemented two new dose comparison parameters, the 2D normalized agreement test (NAT) and the scalar NAT index. A single-film calibration technique using multileaf collimator (MLC) delivery was developed. EDR2 film's optical density response was found to be sensitive to several factors: radiation time, length of time between exposure and processing, and phantom material. Precision of EDR2 film measurements was found to be better than 1%. For IMRT verification, EDR2 film measurements agreed with ion chamber results to 2%/2 mm accuracy for single-beam fluence map verifications and to 5%/2 mm for transverse plane measurements of complete plan dose distributions. The same system was used to quantitatively optimize the radiation field offset and MLC transmission beam modeling parameters for Varian MLCs. While scalar dose comparison metrics can work well for optimization purposes, the influence of external parameters on the dose discrepancies must be minimized. The ability of 2D verifications to detect delivery errors was tested with simulated data. The dosimetric characteristics of delivery errors were compared to patient-specific clinical IMRT verifications. For the clinical verifications, the NAT index and the percentage of pixels failing the gamma index were exponentially distributed and depended on the measurement phantom but not the treatment site. Delivery errors affecting all beams in the treatment plan were flagged by the NAT index, although delivery errors impacting only one beam could not be differentiated from routine clinical verification discrepancies. Clinical use of this system will flag outliers, allow physicists to examine their causes, and perhaps improve the level of agreement between radiation dose distribution measurements and calculations. The principles used to design and evaluate this system are extensible to future multidimensional dose measurements and comparisons.
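The NAT index is specific to this work, but the gamma index it is evaluated against is a standard composite of dose difference and distance-to-agreement. A minimal brute-force 2D sketch on synthetic dose grids, not the author's clinical software:

```python
# Minimal brute-force 2D gamma-index sketch (dose difference / distance-to-agreement).
# Illustrates the standard metric referred to in the abstract, not the NAT index.
import numpy as np

def gamma_map(measured, calculated, spacing_mm, dose_tol=0.03, dta_mm=3.0):
    ny, nx = measured.shape
    yy, xx = np.meshgrid(np.arange(ny), np.arange(nx), indexing="ij")
    norm_dose = dose_tol * calculated.max()           # global dose normalization
    gamma = np.full_like(measured, np.inf, dtype=float)
    for i in range(ny):
        for j in range(nx):
            dist2 = ((yy - i) ** 2 + (xx - j) ** 2) * spacing_mm ** 2
            dose2 = (calculated - measured[i, j]) ** 2
            gamma[i, j] = np.sqrt(np.min(dist2 / dta_mm ** 2 + dose2 / norm_dose ** 2))
    return gamma

rng = np.random.default_rng(2)
calc = rng.uniform(0.5, 2.0, size=(40, 40))           # synthetic calculated dose (Gy)
meas = calc + rng.normal(0, 0.02, size=calc.shape)    # synthetic film measurement
g = gamma_map(meas, calc, spacing_mm=1.0)
print(f"gamma pass rate (gamma <= 1): {np.mean(g <= 1):.1%}")
```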
Abstract:
A study of the association of Herpes simplex virus 1 and 2 exposure with early atherosclerosis, using a high C-reactive protein level as a marker, was carried out in US-born, non-pregnant, 20-49 year olds participating in a national survey between 1999 and 2004. Participants were required to have valid results for Herpes simplex virus 1 and 2 and C-reactive protein for inclusion. Cases were those found to have a high C-reactive protein level of 0.3-1 mg/dL, while controls had low to normal values (0.01-0.29 mg/dL). Overall, there were 1211 cases and 2870 controls. Mexican American and non-Hispanic black women were much more likely to fall into the high cardiac risk group than the other sex-race groups, with proportions of 44% and 39%, respectively. Herpesvirus exposure was categorized such that Herpes simplex virus 1 and 2 exposure could be studied simultaneously within the same individuals and models. The HSV 1+, HSV 2- category included the highest percentage (45.63%) of participants, followed by HSV 1-, HSV 2- (30.16%); HSV 1+, HSV 2+ (15.09%); and HSV 1-, HSV 2+ (9.12%). The proportion of participants in the HSV 1+, HSV 2- category was substantially higher in Mexican Americans (63%-66%). Further, the proportion in the HSV 1+, HSV 2+ category was notably higher in the non-Hispanic black participants (23%-44%). Non-Hispanic black women also had the highest percentage of HSV 1-, HSV 2+ exposure of all the sex-race groups, at 17%. Overall, with HSV 1-, HSV 2- as the referent group, the unadjusted odds ratios for atherosclerotic disease defined by C-reactive protein were 1.62 (95% CI 1.23-2.14) for HSV 1+, HSV 2+; 1.3 (95% CI 1.10-1.69) for HSV 1+, HSV 2-; and 1.52 (95% CI 1.14-2.01) for HSV 1-, HSV 2+. When the study was stratified into sex-race groups, only HSV 1+, HSV 2- in non-Hispanic white men remained significant (OR=1.6; 95% CI 1.06-2.43). Adjustment for selected covariates was made in the multivariate model for both the overall and sex-race stratified analyses. High C-reactive protein values were not associated with any of the Herpesvirus exposure levels in either the overall or stratified analyses.
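An unadjusted odds ratio and Wald 95% confidence interval of the kind quoted above follow directly from a 2×2 table; a minimal sketch with hypothetical counts, not the survey data:

```python
# Unadjusted odds ratio and Wald 95% CI from a hypothetical 2x2 table
# (exposure = an HSV serostatus category, outcome = high C-reactive protein).
import math

# rows: exposed / referent; columns: cases (high CRP) / controls (low-normal CRP)
a, b = 120, 300   # exposed:  cases, controls
c, d = 200, 810   # referent: cases, controls

odds_ratio = (a * d) / (b * c)
se_log_or = math.sqrt(1/a + 1/b + 1/c + 1/d)
lower = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
upper = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)
print(f"OR = {odds_ratio:.2f} (95% CI {lower:.2f}-{upper:.2f})")
```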