Abstract:
Deficiencies of erythropoietin (EPO) and iron as causes of anemia in patients with limited renal function or end-stage renal disease are well addressed. The concomitant impairment of red blood cell (RBC) survival has been largely neglected. Properties of the uremic environment, such as inflammation, increased oxidative stress and uremic toxins, seem to be responsible for premature changes in the RBC membrane and cytoskeleton. The exposure of antigenic sites and breakdown of phosphatidylserine asymmetry promote RBC phagocytosis. Although the individual response to treatment with erythropoiesis-stimulating agents (ESAs) depends on both RBC lifespan and RBC production rate, uniform dosing algorithms do not meet that demand. The clinical use of mathematical models predicting ESA-induced changes in hematocrit might be greatly improved once independent estimates of RBC production rate and/or lifespan become available, thus making the concomitant estimation of both parameters unnecessary. Since heme breakdown by the heme oxygenase pathway results in carbon monoxide (CO), which is exhaled, a simple CO breath test has been used to calculate hemoglobin turnover and therefore RBC survival and lifespan. Future research is needed to validate and implement this method in patients with kidney failure, which will yield new insights into RBC kinetics in renal patients. Eventually, these findings are expected to improve our understanding of hemoglobin variability in response to ESAs.
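The CO breath test rests on a simple mass balance: roughly one molecule of CO is released per heme catabolized, so endogenous CO production tracks hemoglobin turnover, and dividing the circulating heme pool by that turnover yields a mean RBC lifespan. The Python sketch below illustrates this arithmetic with example values and deliberate simplifications (steady state, all endogenous CO attributed to heme breakdown, no ambient-CO or pulmonary corrections); it is not a validated clinical formula.

```python
# Illustrative back-of-the-envelope RBC lifespan estimate from a CO breath test.
# Example values and a simplified mass balance only: steady state assumed, all
# endogenous CO attributed to heme catabolism, ~1 mol CO released per mol heme.

HEME_PER_G_HB_MOL = 4 / 64_500            # mol heme per g hemoglobin (Hb ~64.5 kDa, 4 hemes)
MOLAR_VOLUME_L = 22.4                     # litres of gas per mol at standard conditions

# Example inputs (hypothetical patient)
hb_conc_g_per_l = 110                     # blood hemoglobin concentration (g/L)
blood_volume_l = 5.0                      # estimated total blood volume (L)
exhaled_co_ppm = 2.5                      # end-tidal CO above ambient (ppm)
alveolar_ventilation_l_per_day = 5.0 * 60 * 24   # ~5 L/min alveolar ventilation

# Endogenous CO production (mol/day) inferred from the breath measurement
v_co_mol_per_day = exhaled_co_ppm * 1e-6 * alveolar_ventilation_l_per_day / MOLAR_VOLUME_L

# Circulating heme pool (mol) divided by its turnover gives a mean RBC lifespan
total_heme_mol = hb_conc_g_per_l * blood_volume_l * HEME_PER_G_HB_MOL
rbc_lifespan_days = total_heme_mol / v_co_mol_per_day

print(f"Endogenous CO production: {v_co_mol_per_day * 1e6:.0f} umol/day")
print(f"Estimated mean RBC lifespan: {rbc_lifespan_days:.0f} days")
```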
Abstract:
Bovine spongiform encephalopathy (BSE) rapid tests and routine BSE-testing laboratories are subject to strict approval regulations. Due to the lack of BSE-positive control samples, however, full assay validation at the level of individual test runs and continuous on-site monitoring of test performance are difficult. Most rapid tests use synthetic prion protein peptides as controls, but it is not known to what extent these reflect assay performance on field samples, or whether they are sufficient to indicate on-site assay quality problems. To address this question, we compared the test scores of the provided kit peptide controls to those of standardized weak BSE-positive tissue samples in individual test runs, as well as continuously over time using quality control charts, in two widely used BSE rapid tests. Our results reveal only a weak correlation between the weak positive tissue control and the peptide control scores. We identified kit-lot-related shifts in assay performance that were not reflected by the peptide control scores. Conversely, not all shifts indicated by the peptide control scores reflected an actual shift in assay performance. In conclusion, these data highlight that using the kit peptide controls for continuous quality control may result in unjustified rejection or acceptance of test runs. In contrast, standardized weak positive tissue controls in combination with Shewhart-CUSUM control charts appear to be reliable for continuously monitoring assay performance on-site and identifying undesired deviations.
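As an illustration of the charting approach, the following Python sketch combines a Shewhart 3-sigma rule with a tabular CUSUM applied to weak-positive tissue control scores. The scores, target mean, sigma, and the k/h design parameters are made up for the example; in practice they would be derived from a validated baseline period.

```python
# Sketch of Shewhart + tabular CUSUM monitoring of the weak-positive tissue
# control, using made-up relative test scores.
import numpy as np

scores = np.array([1.02, 0.98, 1.05, 0.97, 1.01, 0.99, 1.04,   # baseline-like runs
                   0.92, 0.90, 0.88, 0.91, 0.87, 0.89])        # drift after a kit-lot change
mu0, sigma = 1.00, 0.05          # target level and spread from the baseline period
k, h = 0.5 * sigma, 4.0 * sigma  # common CUSUM design choices (reference value, decision limit)

c_plus = c_minus = 0.0
for run, x in enumerate(scores, start=1):
    shewhart_alarm = abs(x - mu0) > 3 * sigma       # classic 3-sigma rule
    c_plus = max(0.0, c_plus + (x - mu0) - k)       # accumulates upward drift
    c_minus = max(0.0, c_minus + (mu0 - x) - k)     # accumulates downward drift
    cusum_alarm = c_plus > h or c_minus > h
    print(f"run {run:2d}: score={x:.2f}  C+={c_plus:.3f}  C-={c_minus:.3f}"
          f"{'  <-- Shewhart alarm' if shewhart_alarm else ''}"
          f"{'  <-- CUSUM alarm' if cusum_alarm else ''}")
```

With these example numbers the kit-lot drift never crosses the 3-sigma Shewhart limit but is flagged by the CUSUM after a few runs, which is the behaviour that motivates combining both rules.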
Abstract:
HIV-1 sequence diversity is affected by selection pressures arising from host genomic factors. Using paired human and viral data from 1071 individuals, we ran >3000 genome-wide scans, testing for associations between host DNA polymorphisms, HIV-1 sequence variation and plasma viral load (VL), while considering human and viral population structure. We observed significant human SNP associations with a total of 48 HIV-1 amino acid variants (p < 2.4 × 10⁻¹²). All associated SNPs mapped to the HLA class I region. Clinical relevance of host and pathogen variation was assessed using VL results. We identified two critical advantages to the use of viral variation for identifying host factors: (1) association signals are much stronger for HIV-1 sequence variants than for VL, reflecting the ‘intermediate phenotype’ nature of viral variation; (2) association testing can be run without any clinical data. The proposed genome-to-genome approach highlights sites of genomic conflict and is a strategy generally applicable to studies of host–pathogen interaction.
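To make the scan concrete, here is a minimal Python sketch of a single host-SNP-to-viral-variant test, using simulated data in place of the real genotypes: a logistic regression of a binary HIV-1 amino acid variant on SNP dosage, adjusting for host ancestry principal components (the actual study additionally accounted for viral population structure). The multiple-testing threshold p < 2.4 × 10⁻¹² is the one quoted in the abstract.

```python
# Minimal sketch of one host-SNP-to-viral-variant test in a genome-to-genome scan,
# with simulated data standing in for the real genotypes.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(42)
n = 1071                                    # individuals with paired host/virus data
snp_dosage = rng.binomial(2, 0.3, n)        # 0/1/2 copies of the minor allele
ancestry_pcs = rng.standard_normal((n, 2))  # first two host ancestry PCs (simulated)
logit = -1.0 + 1.2 * snp_dosage + 0.2 * ancestry_pcs[:, 0]
aa_variant = rng.binomial(1, 1 / (1 + np.exp(-logit)))   # presence of the viral amino acid

X = sm.add_constant(np.column_stack([snp_dosage, ancestry_pcs]))
fit = sm.Logit(aa_variant, X).fit(disp=0)
p_snp = fit.pvalues[1]                      # p-value for the SNP term

# Genome-wide threshold from the abstract (Bonferroni-style correction across all tests)
print(f"SNP p-value: {p_snp:.2e} -> {'significant' if p_snp < 2.4e-12 else 'not significant'}")
```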
Abstract:
Introduction Prospective memory (PM), the ability to remember to perform intended activities in the future (Kliegel & Jäger, 2007), is crucial for success in everyday life. PM seems to improve gradually over the childhood years (Zimmermann & Meier, 2006), yet little is known about PM competences in young school children in general, and even less is known about the factors influencing its development. A number of recent studies suggest that executive functions (EF) are potentially influencing processes (Ford, Driscoll, Shum & Macaulay, 2012; Mahy & Moses, 2011). Additionally, metacognitive processes (MC: monitoring and control) are assumed to be involved in optimizing one’s performance (Krebs & Roebers, 2010; 2012; Roebers, Schmid, & Roderer, 2009). Yet the relations between PM, EF and MC remain relatively unspecified. We intend to empirically examine the structural relations between these constructs. Method A cross-sectional study including 119 2nd graders (M_age = 95.03 months, SD_age = 4.82) will be presented. Participants (68 girls) completed three EF tasks (Stroop, updating, shifting), a computerised event-based PM task and an MC spelling task. The latent variables PM, EF and MC, each represented by manifest variables derived from the respective tasks, were interrelated by structural equation modelling. Results Analyses revealed clear associations between the three cognitive constructs PM, EF and MC (r(PM-EF) = .45, r(PM-MC) = .23, r(EF-MC) = .20). A three-factor model, as opposed to one- or two-factor models, fit the data excellently (χ²(17, N = 119) = 18.86, p = .34, RMSEA = .030, CFI = .990, TLI = .978). Discussion The results indicate that, already in young elementary school children, PM, EF and MC are empirically well distinguishable but nevertheless substantially interrelated. PM and EF seem to share a substantial amount of variance, while for MC more unique processes may be assumed.
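For illustration, the following Python sketch fits a three-factor model of this kind with the semopy package (which uses lavaan-style model syntax). The indicator names and the simulated data are placeholders standing in for the study's task scores, and the explicitly listed latent covariances simply expose the estimated factor correlations.

```python
# Sketch of a three-factor SEM/CFA with semopy; simulated placeholder data only.
import numpy as np
import pandas as pd
import semopy

rng = np.random.default_rng(0)
n = 119
# Simulate three correlated latent factors (PM, EF, MC) and unit-loading indicators
latent_cov = np.array([[1.00, 0.45, 0.23],
                       [0.45, 1.00, 0.20],
                       [0.23, 0.20, 1.00]])
latent = rng.multivariate_normal(np.zeros(3), latent_cov, size=n)

def indicator(factor, loading=0.7):
    """Indicator = loading * factor + residual noise (unit total variance)."""
    return loading * factor + np.sqrt(1 - loading**2) * rng.standard_normal(n)

data = pd.DataFrame({
    "pm_hits": indicator(latent[:, 0]), "pm_cost": indicator(latent[:, 0]),
    "stroop": indicator(latent[:, 1]), "updating": indicator(latent[:, 1]),
    "shifting": indicator(latent[:, 1]),
    "mc_monitor": indicator(latent[:, 2]), "mc_control": indicator(latent[:, 2]),
})

model_desc = """
PM =~ pm_hits + pm_cost
EF =~ stroop + updating + shifting
MC =~ mc_monitor + mc_control
PM ~~ EF
PM ~~ MC
EF ~~ MC
"""
model = semopy.Model(model_desc)
model.fit(data)
print(model.inspect())               # loadings and latent covariances
print(semopy.calc_stats(model).T)    # chi2, RMSEA, CFI, TLI, among others
```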
Abstract:
This study, which aimed at understanding the dynamics of forced livestock movements and pastoral livelihood and development options, was conducted in the Lindi and Ruvuma regions using both formal and informal approaches. Data were collected from 60 randomly selected agro-pastoralists/pastoralists and native farmers using a structured questionnaire. Four villages were involved: two in Lindi region (Matandu and Mkwajuni) and two in Ruvuma region (Gumbiro and Muhuwesi). Data were analyzed using descriptive statistics in SPSS to generate means and frequencies. The results indicate that a large number of animals moved into the study area following the government's eviction order from the Ihefu wetlands in 2006/2007. Lindi region was earmarked by the government to receive all the evicted pastoralists; however, by 2008 only 30% of the cattle expected to move into the region had been received. Deaths of many animals in transit, the sale of animals to pay for transportation and other costs en route, and the settlement of many pastoralists in the Coast and Ruvuma regions before reaching their destinations were reported as the reasons for the observed discrepancy. To mitigate anticipated conflicts between farmers and pastoralists, Participatory Land Use Management (PLUM) plans were developed in all the study villages in order to demarcate village land into different uses, including grazing, cropping, settlement and forest. Land units for grazing were supposed to be provided with all the necessary livestock infrastructure (dips, charco dams, livestock markets and stock routes). However, the land use plans were not able to prevent the anticipated conflicts because most of the livestock infrastructure was lacking, land use boundaries were not clearly demarcated, and enforcement of village by-laws was limited, since most had not been enacted by the respective district councils. Similarly, the areas allocated for grazing were inadequate for the number of livestock present, so the carrying capacity was exceeded. Thus, land resource-based conflicts between farmers and pastoralists were emerging in the study areas because most of the important components of the PLUM plans were not in place. Nevertheless, the arrival of pastoralists in the study areas had positive effects on food security and on the growth of social interactions between pastoralists and farmers, including intermarriage. Environmental degradation due to the arrival of livestock was also not evident. Thus, there is a need for the government to deliberately set aside sufficient grazing land, with all the necessary infrastructure in place, for the agro-pastoral/pastoral communities in the country.
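To make the carrying-capacity point concrete, here is a small illustrative calculation in Python. All figures (area, stocking rate, herd sizes) are hypothetical and not drawn from the study, and the Tropical Livestock Unit conversion factors are the commonly used approximations.

```python
# Illustrative carrying-capacity check (all figures hypothetical, not from the study):
# compares the livestock a demarcated grazing area can support with the livestock
# actually present, expressed in Tropical Livestock Units (TLU, 1 TLU ~ 250 kg).

grazing_area_ha = 5_000                         # hypothetical area set aside for grazing (ha)
carrying_capacity_ha_per_tlu = 4                # hypothetical stocking rate for the rangeland
cattle, goats, sheep = 12_000, 3_000, 2_000     # hypothetical herd sizes
tlu_factors = {"cattle": 0.7, "goats": 0.1, "sheep": 0.1}   # common conversion factors

present_tlu = (cattle * tlu_factors["cattle"]
               + goats * tlu_factors["goats"]
               + sheep * tlu_factors["sheep"])
supportable_tlu = grazing_area_ha / carrying_capacity_ha_per_tlu

print(f"TLU present: {present_tlu:.0f}, TLU supportable: {supportable_tlu:.0f}")
if present_tlu > supportable_tlu:
    print(f"Carrying capacity exceeded by a factor of {present_tlu / supportable_tlu:.1f}")
```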
Abstract:
Patients with first-episode psychosis (FEP) often show dysfunctional coping patterns, low self-efficacy, and external control beliefs, which are considered risk factors for the development of psychosis. Therefore, these factors should already be present in patients at risk for psychosis (AR). We compared the frequencies of deficits in coping strategies (Stress-Coping-Questionnaires, SVF-120/SVF-KJ), self-efficacy, and control beliefs (Competence and Control Beliefs Questionnaire, FKK) between AR (n=21) and FEP (n=22) patients using a cross-sectional design. Correlations among coping, self-efficacy, and control beliefs were assessed in both groups. The majority of AR and FEP patients demonstrated deficits in coping skills, self-efficacy, and control beliefs. However, AR patients more frequently reported a lack of positive coping strategies, low self-efficacy, and a fatalistic externalizing bias. In contrast, FEP patients were characterized by being overly self-confident. These findings suggest that dysfunctional coping, self-efficacy, and control beliefs are already evident in AR patients, though they differ from those in FEP patients. The pattern of deficits in AR patients closely resembles that of depressive patients, which may reflect high levels of depressiveness in AR patients. Apart from being worthwhile treatment targets, these coping and belief patterns are promising candidates for predicting outcome in AR patients, including conversion to psychosis.
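As an illustration of this kind of group comparison, the Python sketch below tests whether the frequency of one coping deficit differs between the AR and FEP groups using Fisher's exact test (suited to samples of n = 21 and n = 22) and computes a within-group rank correlation. All counts and scores are invented for the example, not taken from the study.

```python
# Minimal sketch of the group comparison, with made-up counts and scores.
from scipy import stats

# Hypothetical 2x2 table: rows = AR / FEP, columns = deficit present / absent
table = [[15, 6],    # AR patients with / without a lack of positive coping strategies
         [8, 14]]    # FEP patients with / without
odds_ratio, p_value = stats.fisher_exact(table)   # exact test suits these small groups
print(f"Fisher's exact test: OR = {odds_ratio:.2f}, p = {p_value:.3f}")

# Within-group association between coping and self-efficacy (hypothetical scores)
coping = [42, 55, 38, 61, 47, 50, 58, 44, 39, 53]
self_efficacy = [30, 41, 28, 45, 33, 36, 44, 31, 27, 40]
rho, p_rho = stats.spearmanr(coping, self_efficacy)
print(f"Spearman rho = {rho:.2f}, p = {p_rho:.3f}")
```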
Abstract:
During the development of the somatic genome from the Paramecium germline genome, the bulk of the copies of ∼45 000 unique internal eliminated sequences (IESs) are deleted. IES targeting is facilitated by two small RNA (sRNA) classes: scnRNAs, which relay epigenetic information from the parental nucleus to the developing nucleus, and iesRNAs, which are produced and used in the developing nucleus. Why only certain IESs require sRNAs for their removal has been enigmatic. By analyzing the effects of silencing three genes, PGM (responsible for DNA excision), DCL2/3 (scnRNA production) and DCL5 (iesRNA production), we identify key properties required for IES elimination. Based on these results, we propose that, depending on the exact combination of their lengths and end bases, some IESs are less efficiently recognized or excised and have a greater requirement for targeting by scnRNAs and iesRNAs. We suggest that the variation in IES retention following silencing of DCL2/3 is not primarily due to scnRNA density, which is comparatively uniform relative to IES retention, but rather to the genetic properties of IESs. Taken together, our analyses demonstrate that in Paramecium the underlying genetic properties of developmentally deleted DNA sequences are essential in determining the sensitivity of these sequences to epigenetic control.
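The kind of grouping this implies can be sketched in Python as follows, assuming a hypothetical table with one row per IES; the column names and values are placeholders rather than the published data format.

```python
# Sketch of grouping IES retention by length class and end base, using a small
# hypothetical DataFrame in place of the study's genome-wide measurements.
import pandas as pd

ies = pd.DataFrame({
    "length_bp":       [28, 29, 33, 45, 62, 78, 120, 240],
    "end_base":        ["A", "T", "A", "G", "A", "T", "C", "A"],   # base adjacent to the TA boundary
    "retention_dcl23": [0.40, 0.35, 0.10, 0.05, 0.03, 0.08, 0.02, 0.01],  # retention after DCL2/3 silencing
    "scnRNA_density":  [1.1, 0.9, 1.0, 1.2, 1.0, 0.8, 1.1, 0.9],
})

# Bin IESs by length and summarise retention and scnRNA coverage per bin
ies["length_class"] = pd.cut(ies["length_bp"], bins=[0, 35, 60, 100, 10_000],
                             labels=["short", "medium", "long", "very long"])
summary = ies.groupby(["length_class", "end_base"], observed=True)[
    ["retention_dcl23", "scnRNA_density"]].mean()
print(summary)
```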
Abstract:
Modeling future water systems at the regional scale is a difficult task due to the temporal and spatial complexity of current structures (multiple competing water uses, multiple actors, formal and informal rules). Representing this complexity in the modeling process is a challenge that can be addressed by an interdisciplinary and holistic approach. The assessment of the water system of the Crans-Montana-Sierre area (Switzerland) and its evolution until 2050 was tackled by combining glaciological, hydrogeological, and hydrological measurements and modeling with the evaluation of water use through documentary, statistical and interview-based analyses. Four visions of future regional development were co-produced with a group of stakeholders and were then used as a basis for estimating future water demand. Comparing the available water resources with water demand at a monthly time scale allowed us to conclude that, for all four scenarios, socioeconomic factors will affect future water systems more than climatic factors. An analysis of the sustainability of the current and future water systems based on the four visions of regional development allowed us to identify the scenarios that will be more sustainable and that should be adopted by decision-makers. The results were then presented to the stakeholders through five key messages. The challenges of communicating results to stakeholders in this way are discussed at the end of the article.
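A minimal Python sketch of the monthly resource-demand comparison is given below; the monthly values and the two scenario names are invented for illustration and do not reproduce the Crans-Montana-Sierre results.

```python
# Illustrative monthly water balance per development scenario (made-up numbers):
# compare monthly available water with projected demand and flag deficit months.
MONTHS = ["Jan", "Feb", "Mar", "Apr", "May", "Jun",
          "Jul", "Aug", "Sep", "Oct", "Nov", "Dec"]

# Hypothetical available resource in 2050 (million m3 per month)
available = [4.0, 3.8, 4.5, 6.0, 8.5, 9.0, 7.0, 5.5, 4.8, 4.2, 4.0, 3.9]

# Hypothetical demand under two of the co-produced visions (million m3 per month)
scenarios = {
    "growth-oriented":      [3.5, 3.4, 3.6, 4.0, 5.0, 6.5, 7.5, 6.8, 4.5, 3.8, 3.6, 3.5],
    "moderate-development": [2.8, 2.7, 2.9, 3.2, 4.0, 5.0, 5.8, 5.2, 3.6, 3.0, 2.9, 2.8],
}

for name, demand in scenarios.items():
    deficits = [(m, d - a) for m, a, d in zip(MONTHS, available, demand) if d > a]
    if deficits:
        worst = max(deficits, key=lambda md: md[1])
        print(f"{name}: deficit in {len(deficits)} month(s), "
              f"worst in {worst[0]} ({worst[1]:.1f} million m3)")
    else:
        print(f"{name}: demand covered in every month")
```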
Abstract:
Disruption of function of left, but not right, lateral prefrontal cortex (LPFC) with low-frequency repetitive transcranial magnetic stimulation (rTMS) increased choices of immediate rewards over larger delayed rewards. rTMS did not change choices involving only delayed rewards or valuation judgments of immediate and delayed rewards, providing causal evidence for a neural lateral-prefrontal cortex-based self-control mechanism in intertemporal choice.
Abstract:
Imagine you are overweight and you spot your favorite pastry in the storefront of a bakery. How do you manage to resist this temptation? Or to give other examples, how do you manage to restrain yourself from overspending or succumbing to sexual temptations? The present article summarizes two recent studies stressing the fundamental importance of inhibition in the process of decision making. Based on the results of these studies, we dare to claim that the capacity to resist temptation depends on the activity level of the right prefrontal cortex (PFC).