953 results for Probability of detection
Abstract:
Captan and folpet are fungicides widely used in agriculture. They have similar chemical structures, except that folpet has an aromatic ring whereas captan does not. Their half-lives in blood are very short, given that they are readily broken down to tetrahydrophthalimide (THPI) and phthalimide (PI), respectively. Few authors have measured these biomarkers in plasma or urine, and analysis was conducted either by gas chromatography coupled to mass spectrometry or by liquid chromatography with UV detection. The objective of this study was thus to develop simple, sensitive and specific liquid chromatography-atmospheric pressure chemical ionization-tandem mass spectrometry (LC/APCI-MS/MS) methods to quantify both THPI and PI in human plasma and urine. Briefly, deuterated THPI was added as an internal standard and purification was performed by solid-phase extraction followed by LC/APCI-MS/MS analysis in negative ion mode for both compounds. Validation of the methods was conducted using spiked blank plasma and urine samples at concentrations ranging from 1 to 250 μg/L and 1 to 50 μg/L, respectively, along with samples from volunteers and workers exposed to captan or folpet. The methods showed good linearity (R² > 0.99), recovery (on average, 90% for THPI and 75% for PI), intra- and inter-day precision (RSD < 15%) and accuracy (< 20%), and stability. The limit of detection was 0.58 μg/L in urine and 1.47 μg/L in plasma for THPI, and 1.14 and 2.17 μg/L, respectively, for PI. The described methods proved to be accurate and suitable for determining the toxicokinetics of both metabolites in human plasma and urine.
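The abstract does not state which limit-of-detection formula was used, but a common convention for chromatographic methods is LOD = 3.3·σ/S, with σ the residual standard deviation of the calibration curve and S its slope. A minimal sketch of that convention, with made-up calibration data (the paper's raw responses are not reproduced here):

```python
# Illustrative LOD/LOQ estimate from a linear calibration, using the
# common LOD = 3.3 * sigma / slope rule. Concentrations and responses
# below are invented for the sketch, not the paper's data.
import numpy as np

conc = np.array([1.0, 5.0, 10.0, 50.0, 100.0, 250.0])   # ug/L
resp = np.array([2.1, 10.4, 20.8, 103.0, 208.5, 519.0])  # instrument response

slope, intercept = np.polyfit(conc, resp, 1)
residuals = resp - (slope * conc + intercept)
sd_resid = residuals.std(ddof=2)       # residual SD (2 fitted parameters)

lod = 3.3 * sd_resid / slope           # limit of detection, ug/L
loq = 10.0 * sd_resid / slope          # limit of quantification, ug/L

r_squared = np.corrcoef(conc, resp)[0, 1] ** 2
print(f"R^2 = {r_squared:.4f}, LOD = {lod:.2f} ug/L, LOQ = {loq:.2f} ug/L")
```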
Abstract:
In order to upgrade the reliability of xenodiagnosis, attention has been directed towards the population dynamics of the parasite, with particular interest in the following factors: 1. Parasite density, which is not a research objective by itself but, by giving an accurate portrayal of parasite development and multiplication, has been incorporated in the screening of bugs for xenodiagnosis. 2. On the assumption that food availability might increase parasite density, bugs from xenodiagnosis have been refed at biweekly intervals on chicken blood. 3. Infectivity rates and positives harbouring large parasite yields were based on gut infections, in which the parasite population, comprising all developmental forms, was more abundant and easier to detect than in fecal infections, thus minimizing the probability of recording false negatives. 4. Since parasite density, low in the first 15 days of infection, increases rapidly in the following 30 days, an interval of 45 days has been adopted for the routine examination of bugs from xenodiagnosis. By following the enumerated measures, all aiming to reduce false negative cases, we are getting closer to a reliable xenodiagnostic procedure. Upgrading the efficacy of xenodiagnosis also depends on the xenodiagnostic agent. Of the 9 investigated vector species, Panstrongylus megistus deserves top priority as a xenodiagnostic agent. Its extraordinary capability to support the fast development and vigorous multiplication of the few parasites ingested from a host with chronic Chagas' disease is revealed by the strikingly close infectivity rates of 91.2% vs. 96.4% among bugs engorged on the same host in the chronic and acute phases of the disease, respectively (Table V), the latter involving an estimated 12.3 × 10³ parasites in the circulation at the time of xenodiagnosis, as reported previously by the authors (1982).
Abstract:
In traditional criminal investigation, uncertainties are often dealt with using a combination of common sense, practical considerations and experience, but rarely with tailored statistical models. For example, in some countries, in order to search for a given profile in the national DNA database, it must have allelic information for six or more of the ten SGM Plus loci for a simple trace; a profile with less information cannot be searched in the national DNA database (NDNAD). This requirement (of a result at six or more loci) is not based on a statistical approach, but rather on the feeling that six or more loci would be sufficient. A statistical approach, however, could be more rigorous and objective, taking into consideration factors such as the probability of adventitious matches relative to the actual database size and/or the investigator's requirements. This research was therefore undertaken to establish scientific foundations for the use of partial SGM Plus profiles (or similar) in investigation.
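The dependence on database size that the abstract alludes to can be made concrete with a simple binomial argument: if a partial profile has random-match probability p, the chance of at least one adventitious hit when searching a database of N profiles is 1 - (1 - p)^N. A small sketch with invented, order-of-magnitude match probabilities (not actual SGM Plus figures):

```python
# Back-of-envelope illustration: the chance of at least one adventitious
# (coincidental) match grows with the number of profiles searched. The
# match probabilities per locus count are round invented numbers.
def p_adventitious(match_prob: float, db_size: int) -> float:
    """P(at least one coincidental match) = 1 - (1 - p)^N."""
    return 1.0 - (1.0 - match_prob) ** db_size

for n_loci, p in [(10, 1e-13), (6, 1e-8), (4, 1e-5)]:
    for db_size in (1_000_000, 5_000_000):
        print(f"{n_loci} loci, N={db_size:>9,}: "
              f"P(adventitious match) = {p_adventitious(p, db_size):.3g}")
```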
Abstract:
BACKGROUND: The synthesis of published research in systematic reviews is essential when providing evidence to inform clinical and health policy decision-making. However, the validity of systematic reviews is threatened if journal publications represent a biased selection of all studies that have been conducted (dissemination bias). To investigate the extent of dissemination bias we conducted a systematic review that determined the proportion of studies published as peer-reviewed journal articles and investigated factors associated with full publication in cohorts of studies (i) approved by research ethics committees (RECs) or (ii) included in trial registries. METHODS AND FINDINGS: Four bibliographic databases were searched for methodological research projects (MRPs) without limitations for publication year, language or study location. The searches were supplemented by handsearching the references of included MRPs. We estimated the proportion of studies published using prediction intervals (PI) and a random effects meta-analysis. Pooled odds ratios (OR) were used to express associations between study characteristics and journal publication. Seventeen MRPs (23 publications) evaluated cohorts of studies approved by RECs; the proportion of published studies had a PI between 22% and 72%, and the weighted pooled proportion when combining estimates would be 46.2% (95% CI 40.2%-52.4%, I² = 94.4%). Twenty-two MRPs (22 publications) evaluated cohorts of studies included in trial registries; the PI of the proportion published ranged from 13% to 90%, and the weighted pooled proportion would be 54.2% (95% CI 42.0%-65.9%, I² = 98.9%). REC-approved studies with statistically significant results (compared with those without statistically significant results) were more likely to be published (pooled OR 2.8; 95% CI 2.2-3.5). Phase III trials were also more likely to be published than phase II trials (pooled OR 2.0; 95% CI 1.6-2.5). The probability of publication within two years after study completion ranged from 7% to 30%. CONCLUSIONS: A substantial part of the studies approved by RECs or included in trial registries remains unpublished. Due to the large heterogeneity, a prediction of the publication probability for a future study is very uncertain. Non-publication of research is not a random process; e.g., it is associated with the direction of study findings. Our findings suggest that the dissemination of research findings is biased.
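As a rough illustration of the pooling described above, here is a minimal DerSimonian-Laird random-effects computation on logit-transformed proportions, with an approximate 95% prediction interval; the five study counts are fabricated and stand in for the MRP data, which are not reproduced here:

```python
# Sketch: random-effects pooling of proportions (logit scale) with a
# DerSimonian-Laird tau^2 and an approximate 95% prediction interval.
import numpy as np
from scipy import stats

published = np.array([30, 55, 120, 40, 80])    # invented counts
total     = np.array([70, 90, 300, 60, 200])

p = published / total
y = np.log(p / (1 - p))                        # logit proportions
v = 1 / published + 1 / (total - published)    # approximate variances

w = 1 / v                                      # fixed-effect weights
y_fixed = np.sum(w * y) / np.sum(w)
Q = np.sum(w * (y - y_fixed) ** 2)
c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
tau2 = max(0.0, (Q - (len(y) - 1)) / c)        # DerSimonian-Laird tau^2

w_star = 1 / (v + tau2)                        # random-effects weights
y_re = np.sum(w_star * y) / np.sum(w_star)
se_re = np.sqrt(1 / np.sum(w_star))

k = len(y)                                     # 95% PI, t with k-2 df
t = stats.t.ppf(0.975, k - 2)
half = t * np.sqrt(tau2 + se_re ** 2)

expit = lambda x: 1 / (1 + np.exp(-x))
print(f"pooled proportion = {expit(y_re):.1%}")
print(f"95% prediction interval = {expit(y_re - half):.1%} "
      f"to {expit(y_re + half):.1%}")
```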
Abstract:
The high density of slope failures in western Norway is due to the steep relief and to the concentration of various structures inherited from protracted ductile and brittle tectonics. Of the 72 investigated rock slope instabilities, 13 developed in soft, weathered mafic and phyllitic allochthons; there, the intrinsic weakness of the rocks alone increases the susceptibility to gravitational deformation. In contrast, the gravitational structures in the hard gneisses reactivate prominent ductile and/or brittle fabrics. At 30 rockslides along cataclinal slopes, weak mafic layers in the foliation are reactivated as basal planes. Slope-parallel steep foliation forms the back-cracks of unstable columns. Folds are particularly common in the Storfjord area, together with a clustering of potential slope failures. Folding increases the probability of having favourably orientated planes with respect to the gravitational forces and the slope. High water pressure is believed to build up seasonally along the shallow-dipping Caledonian detachments and may contribute to destabilization of the rock slope above. Regional cataclastic faults localized the gravitational structures at 45 sites. The volume of the slope instabilities tends to increase with the number of reactivated prominent structures, and the spacing of the latter controls the size of the instabilities.
Abstract:
Many churches are concerned about older and dwindling congregations. We develop a theoretical framework to explain not only the downward trend in church attendance, but also the increase in the proportion of older people in the congregations. Religiosity depends positively on the expected social and spiritual benefits attached to religious adherence, as well as the probability of entering heaven in the afterlife. While otherworldly compensation in terms of salvation and spiritual benefits motivates religiosity, the costs of formal religion in terms of time allocated to communal activities and foregone income work in the opposite direction. We show that higher life expectancy discounts expected benefits in the afterlife and is hence likely to lead to postponement of religiosity. For this reason, religious organizations should be prepared to attract older members to their congregations, while emphasizing contemporaneous religious benefits to increase overall church attendance.
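The postponement mechanism can be written as a one-line discounting identity. The notation below is ours, not the paper's, and is only a sketch of the argument:

```latex
% Afterlife benefits B are received at death, i.e., after remaining
% lifetime T, so their present value at discount rate rho is
\[
  V(T) \;=\; q \, B \, e^{-\rho T},
  \qquad
  \frac{\partial V}{\partial T} \;=\; -\rho\, q\, B\, e^{-\rho T} \;<\; 0,
\]
% where q is the perceived probability of entering heaven. A rise in
% life expectancy T lowers V(T), weakening the incentive to incur the
% costs of religiosity now and favouring its postponement to later life.
```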
Abstract:
We propose an elementary theory of wars fought by fully rational contenders. Two parties play a Markov game that combines stages of bargaining with stages where one side has the ability to impose surrender on the other. Under uncertainty and incomplete information, in the unique equilibrium of the game, long confrontations occur: war arises when reality disappoints initial (rational) optimism, and it persists longer when both agents are optimists but reality proves both wrong. Bargaining proposals that are rejected initially might eventually be accepted after several periods of confrontation. We provide an explicit computation of the equilibrium, evaluating the probability of war and its expected losses as a function of (i) the costs of confrontation, (ii) the asymmetry of the split imposed under surrender, and (iii) the strengths of the contenders in attack and defense. Changes in these parameters display non-monotonic effects.
Abstract:
Untreated wastewater discharged directly into rivers is a very harmful environmental hazard that needs to be tackled urgently in many countries. In order to safeguard the river ecosystem and reduce water pollution, it is important to have an effluent charge policy that promotes investment in wastewater treatment technology by domestic firms. This paper considers the strategic interaction between the government and the domestic firms regarding investment in wastewater treatment technology and the design of the optimal effluent charge policy that should be implemented. In this model, the higher the proportion of non-investing firms, the higher the probability of having to incur an effluent charge and the higher that charge. On the one hand, the government needs to impose a sufficiently strict policy to ensure that firms have a strong incentive to invest. On the other hand, the policy cannot be so strict that it drives out firms that cannot afford to invest in such expensive technology. The paper analyses the factors that affect the probability of investment in this technology. It also explains the difficulty of imposing a strict environmental policy in countries that have too many small firms that cannot afford to invest unless subsidised.
Abstract:
This paper compares the forecasting performance of different models which have been proposed for forecasting in the presence of structural breaks. These models differ in their treatment of the break process, the parameters defining the model which applies in each regime, and the out-of-sample probability of a break occurring. In an extensive empirical evaluation involving many important macroeconomic time series, we demonstrate the presence of structural breaks and their importance for forecasting in the vast majority of cases. However, we find that no single forecasting model consistently works best in the presence of structural breaks. In many cases, the formal modeling of the break process is important in achieving good forecast performance. However, there are also many cases where simple rolling OLS forecasts perform well.
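For concreteness, the rolling-OLS benchmark mentioned above can be sketched as an AR(1) refit on a fixed-length window, forecasting one step ahead; the simulated series below (with a single break in its mean) merely stands in for the paper's macroeconomic data:

```python
# Sketch of a rolling-OLS one-step-ahead forecast: an AR(1) is refit on
# each fixed-length window of a series containing a structural break.
import numpy as np

rng = np.random.default_rng(0)
n, window = 300, 60
y = np.empty(n)
y[0] = 0.0
for t in range(1, n):
    mu = 0.5 if t < 150 else 2.0           # break in the intercept at t=150
    y[t] = mu + 0.6 * y[t - 1] + rng.normal()

forecasts, actuals = [], []
for t in range(window, n - 1):
    yw = y[t - window:t + 1]               # current window (window+1 points)
    X = np.column_stack([np.ones(window), yw[:-1]])   # [const, lag]
    beta, *_ = np.linalg.lstsq(X, yw[1:], rcond=None)
    forecasts.append(beta[0] + beta[1] * y[t])        # one step ahead
    actuals.append(y[t + 1])

rmse = np.sqrt(np.mean((np.array(actuals) - np.array(forecasts)) ** 2))
print(f"rolling-OLS one-step RMSE: {rmse:.3f}")
```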
Abstract:
The aim of the project is to develop a theoretical framework where homelessness arises due to various economic and social factors that vary over time. The ultimate goal is (i) to understand whether homelessness spells, entries and exits can be predicted and, if so, what information is necessary; and (ii) to design and evaluate a homelessness prevention programme in a changing and uncertain environment. Examples of the questions we want to answer are: Should it be made easier for people to borrow money so that they can get out of homelessness, or will such borrowing allow people to over-consume today and so fall into homelessness tomorrow? Should precautionary savings be encouraged so that people have cushions to withstand future shocks, or will savings just delay entry into homelessness? What interventions will affect the probability of becoming homeless, and how will they affect behaviour? How will interventions affect incentives to save and to consume before homelessness prevention programmes kick in?
Abstract:
BACKGROUND: Recommended oral voriconazole (VRC) doses are lower than intravenous doses. Because plasma concentrations impact efficacy and safety of therapy, optimizing individual drug exposure may improve these outcomes. METHODS: A population pharmacokinetic analysis (NONMEM) was performed on 505 plasma concentration measurements involving 55 patients with invasive mycoses who received recommended VRC doses. RESULTS: A 1-compartment model with first-order absorption and elimination best fitted the data. VRC clearance was 5.2 L/h, the volume of distribution was 92 L, the absorption rate constant was 1.1 h⁻¹, and oral bioavailability was 0.63. Severe cholestasis decreased VRC elimination by 52%. A large interpatient variability was observed on clearance (coefficient of variation [CV], 40%) and bioavailability (CV, 84%), and an interoccasion variability was observed on bioavailability (CV, 93%). Lack of response to therapy occurred in 12 of 55 patients (22%), and grade 3 neurotoxicity occurred in 5 of 55 patients (9%). A logistic multivariate regression analysis revealed an independent association between VRC trough concentrations and probability of response or neurotoxicity by identifying a therapeutic range of 1.5 mg/L (>85% probability of response) to 4.5 mg/L (<15% probability of neurotoxicity). Population-based simulations with the recommended 200 mg oral or 300 mg intravenous twice-daily regimens predicted probabilities of 49% and 87%, respectively, for achievement of 1.5 mg/L, and of 8% and 37%, respectively, for achievement of 4.5 mg/L. With 300-400 mg twice-daily oral doses and 200-300 mg twice-daily intravenous doses, the predicted probabilities of achieving the lower target concentration were 68%-78% for the oral regimen and 70%-87% for the intravenous regimen, and the predicted probabilities of achieving the upper target concentration were 19%-29% for the oral regimen and 18%-37% for the intravenous regimen. CONCLUSIONS: Higher oral than intravenous VRC doses, followed by individualized adjustments based on measured plasma concentrations, improve achievement of the therapeutic target that maximizes the probability of therapeutic response and minimizes the probability of neurotoxicity. These findings challenge dose recommendations for VRC.
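The structural model reported above is enough to reproduce the flavour of these simulations. The sketch below uses the abstract's typical values (CL = 5.2 L/h, V = 92 L, ka = 1.1 h⁻¹, F = 0.63) in the standard steady-state equation for a 1-compartment model with first-order absorption; it omits the population variability and covariates, and the "intravenous" line is only a crude analogue obtained by setting F = 1:

```python
# Sketch: steady-state concentration for a 1-compartment model with
# first-order absorption and elimination, at typical parameter values.
import numpy as np

CL, V, ka, F = 5.2, 92.0, 1.1, 0.63    # typical values from the abstract
ke = CL / V                             # elimination rate constant, /h

def c_ss(t, dose_mg, tau=12.0, f=F):
    """Steady-state concentration (mg/L) t hours after a dose, tau-hourly dosing."""
    a = f * dose_mg * ka / (V * (ka - ke))
    return a * (np.exp(-ke * t) / (1 - np.exp(-ke * tau))
                - np.exp(-ka * t) / (1 - np.exp(-ka * tau)))

trough_oral = c_ss(12.0, 200.0)          # 200 mg orally twice daily
trough_iv   = c_ss(12.0, 300.0, f=1.0)   # crude i.v. analogue (F = 1)
print(f"predicted oral trough: {trough_oral:.2f} mg/L (target >= 1.5)")
print(f"predicted 'iv' trough: {trough_iv:.2f} mg/L")
```

At these typical values the predicted 200 mg oral trough sits right around the 1.5 mg/L lower target, consistent with the roughly even (49%) achievement probability the abstract reports once between-patient variability is layered on.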
Abstract:
The Conservative Party emerged from the 2010 United Kingdom General Election as the largest single party, but their support was not geographically uniform. In this paper, we estimate a hierarchical Bayesian spatial probit model that tests for the presence of regional voting effects. This model allows for the estimation of individual region-specific effects on the probability of Conservative Party success, incorporating information on the spatial relationships between the regions of the mainland United Kingdom. After controlling for a range of important covariates, we find that these spatial relationships are significant and that our individual region-specific effect estimates provide additional evidence of North-South variations in Conservative Party support.
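One way to write down the kind of model the abstract describes is a probit with region effects under a conditional autoregressive (CAR) prior; the notation is ours and the paper's exact prior structure may differ:

```latex
% Observation i in region r(i), with covariates x_i:
\[
  \Pr(y_i = 1 \mid x_i, \theta) \;=\; \Phi\!\left( x_i'\beta + \theta_{r(i)} \right),
\]
% with the region effects given a spatial (CAR-type) prior that shrinks
% each effect towards those of its neighbours:
\[
  \theta_r \mid \theta_{-r} \;\sim\;
  \mathcal{N}\!\left( \frac{\rho}{n_r} \sum_{s \sim r} \theta_s,\; \frac{\sigma^2}{n_r} \right),
\]
% where s ~ r indexes regions adjacent to r and n_r is their number.
```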
Abstract:
The standard approach to the economics of climate change, which has its best-known implementation in Nordhaus's DICE and RICE models (well described in Nordhaus's 2008 book, A Question of Balance), is not well equipped to deal with the possibility of catastrophe, since we are unable to evaluate a risk-averse representative agent's expected utility when there is any significant probability of zero consumption. Whilst other authors attempt to develop new tools with which to address these problems, the simple solution proposed in this paper is to ask a question that the currently available tools of climate change economics are capable of answering. Rather than having agents optimally choosing a path (that differs from the recommendations of climate scientists) within models which cannot capture the essential features of the problem, I argue that economic models should be used to determine the savings and investment paths which implement climate targets that have been suggested in the physical science literature.
Abstract:
This paper presents an axiomatic characterization of difference-form group contests, that is, contests fought among groups where each group's probability of victory depends on the difference of their effective efforts. This axiomatization rests on the property of Equalizing Consistency, stating that the difference between winning probabilities in the grand contest and in the smaller contest should be identical across all participants in the smaller contest. This property overcomes some of the drawbacks of the widely used ratio-form contest success functions. Our characterization shows that the criticisms commonly held against difference-form contest success functions, such as lack of scale invariance and zero elasticity of augmentation, are unfounded. By clarifying the properties of this family of contest success functions, this axiomatization can help researchers find the functional form best suited to their application of interest.
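An illustrative member of this family (our notation; the paper's axiomatization is more general) makes the difference-form structure and the scale-invariance discussion concrete:

```latex
% With n groups and effective efforts X_1, ..., X_n, a simple
% difference-form contest success function takes
\[
  p_g \;=\; \frac{1}{n} \;+\; \alpha \left( X_g - \frac{1}{n} \sum_{h=1}^{n} X_h \right),
  \qquad \alpha > 0,
\]
% so each group's winning probability moves with the gap between its
% effective effort and the average, and the probabilities sum to one.
% Note p_g is unchanged if a constant is added to every X_h, but not if
% all X_h are rescaled -- the lack of scale invariance the abstract
% discusses.
```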