31 results for "sensitivity analyses"

in BORIS: Bern Open Repository and Information System - Bern, Switzerland


Relevance: 60.00%

Abstract:

Combined modality treatment (CMT), consisting of chemotherapy followed by localized radiotherapy, is the standard treatment for patients with early stage Hodgkin's lymphoma. However, the role of radiotherapy has been questioned recently, and some clinical study groups advocate chemotherapy only for this indication. We thus performed a systematic review with meta-analysis of randomized controlled trials comparing chemotherapy alone with CMT in patients with early stage Hodgkin's lymphoma with respect to response rate, tumor control and overall survival (OS). We searched Medline, EMBASE and the Cochrane Library as well as conference proceedings from January 1980 to February 2009 for randomized controlled trials comparing chemotherapy alone versus the same chemotherapy regimen plus radiotherapy. Progression-free survival and similar outcomes were analyzed together as tumor control. Effect measures used were hazard ratios for OS and tumor control as well as relative risks for complete response (CR). Meta-analyses were performed using RevMan 5. Five randomized controlled trials involving 1,245 patients were included. The hazard ratio (HR) was 0.41 (95% confidence interval (CI) 0.25 to 0.66) for tumor control and 0.40 (95% CI 0.27 to 0.59) for OS for patients receiving CMT compared to chemotherapy alone. CR rates were similar between treatment groups. In sensitivity analyses, another six trials were included that did not fulfill the inclusion criteria of our protocol but were considered relevant to the topic. These trials underlined the results of the main analysis. In conclusion, adding radiotherapy to chemotherapy improves tumor control and OS in patients with early stage Hodgkin's lymphoma.
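The pooled hazard ratios reported above come from an inverse-variance meta-analysis. Below is a minimal fixed-effect sketch, not the RevMan implementation (which also supports random-effects models); the standard error of each log HR is recovered from its reported 95% CI, and the trial figures in the usage test are invented.

```python
import math

def pooled_hazard_ratio(trials):
    """Inverse-variance (fixed-effect) pooling of log hazard ratios.

    Each trial is (hr, ci_low, ci_high); the standard error of log(HR)
    is recovered from the 95% CI width: SE = (ln(hi) - ln(lo)) / (2 * 1.96).
    Returns the pooled HR and its 95% CI.
    """
    num = den = 0.0
    for hr, lo, hi in trials:
        se = (math.log(hi) - math.log(lo)) / (2 * 1.96)
        w = 1.0 / se ** 2                      # inverse-variance weight
        num += w * math.log(hr)
        den += w
    log_hr = num / den
    se_pooled = math.sqrt(1.0 / den)
    ci = (math.exp(log_hr - 1.96 * se_pooled),
          math.exp(log_hr + 1.96 * se_pooled))
    return math.exp(log_hr), ci
```

Pooling two identical hypothetical trials returns the same point estimate with a narrower interval, which is the expected behaviour of inverse-variance weighting.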

Relevance: 60.00%

Abstract:

Context Long-term antiretroviral therapy (ART) use in resource-limited countries leads to increasing numbers of patients with HIV taking second-line therapy. Limited access to further therapeutic options makes essential the evaluation of second-line regimen efficacy in these settings. Objectives To investigate failure rates in patients receiving second-line therapy and factors associated with failure and death. Design, Setting, and Participants Multicohort study of 632 patients >14 years old receiving second-line therapy for more than 6 months in 27 ART programs in Africa and Asia between January 2001 and October 2008. Main Outcome Measures Clinical, immunological, virological, and immunovirological failure (first diagnosed episode of immunological or virological failure) rates, and mortality after 6 months of second-line therapy use. Sensitivity analyses were performed using alternative CD4 cell count thresholds for immunological and immunovirological definitions of failure and for cohort attrition instead of death. Results The 632 patients provided 740.7 person-years of follow-up; 119 (18.8%) met World Health Organization failure criteria after a median 11.9 months following the start of second-line therapy (interquartile range [IQR], 8.7-17.0 months), and 34 (5.4%) died after a median 15.1 months (IQR, 11.9-25.7 months). Failure rates were lower in those who changed 2 nucleoside reverse transcriptase inhibitors (NRTIs) instead of 1 (179.2 vs 251.6 per 1000 person-years; incidence rate ratio [IRR], 0.64; 95% confidence interval [CI], 0.42-0.96), and higher in those with lowest adherence index (383.5 vs 176.0 per 1000 person-years; IRR, 3.14; 95% CI, 1.67-5.90 for <80% vs ≥95% [percentage adherent, as represented by percentage of appointments attended with no delay]). 
Failure rates increased with lower CD4 cell counts when second-line therapy was started, from 156.3 vs 96.2 per 1000 person-years; IRR, 1.59 (95% CI, 0.78-3.25) for 100 to 199/μL to 336.8 per 1000 person-years; IRR, 3.32 (95% CI, 1.81-6.08) for less than 50/μL vs 200/μL or higher; and decreased with time using second-line therapy, from 250.0 vs 123.2 per 1000 person-years; IRR, 1.90 (95% CI, 1.19-3.02) for 6 to 11 months to 212.0 per 1000 person-years; 1.71 (95% CI, 1.01-2.88) for 12 to 17 months vs 18 or more months. Mortality for those taking second-line therapy was lower in women (32.4 vs 68.3 per 1000 person-years; hazard ratio [HR], 0.45; 95% CI, 0.23-0.91); and higher in patients with treatment failure of any type (91.9 vs 28.1 per 1000 person-years; HR, 2.83; 95% CI, 1.38-5.80). Sensitivity analyses showed similar results. Conclusions Among patients in Africa and Asia receiving second-line therapy for HIV, treatment failure was associated with low CD4 cell counts at second-line therapy start, use of suboptimal second-line regimens, and poor adherence. Mortality was associated with diagnosed treatment failure.
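Rate comparisons of the kind reported above (events per 1000 person-years, with a Wald CI on the log scale) reduce to a few lines; a sketch with illustrative counts, not the study's raw data:

```python
import math

def incidence_rate_ratio(events_a, py_a, events_b, py_b):
    """Incidence rate ratio of group A vs group B with a 95% Wald CI.

    For two Poisson counts, SE of log(IRR) is sqrt(1/a + 1/b).
    """
    irr = (events_a / py_a) / (events_b / py_b)
    se = math.sqrt(1.0 / events_a + 1.0 / events_b)
    lo = math.exp(math.log(irr) - 1.96 * se)
    hi = math.exp(math.log(irr) + 1.96 * se)
    return irr, (lo, hi)
```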

Relevance: 60.00%

Abstract:

Excess adiposity is associated with increased risks of developing adult malignancies. To inform public health policy and guide further research, the incident cancer burden attributable to excess body mass index (BMI ≥ 25 kg/m²) across 30 European countries was estimated. Population attributable risks (PARs) were calculated using European- and gender-specific risk estimates from a published meta-analysis and gender-specific mean BMI estimates from the World Health Organization Global InfoBase. Country-specific numbers of new cancers were derived from Globocan 2002. A ten-year lag period between risk exposure and cancer incidence was assumed and 95% confidence intervals (CI) were estimated in Monte Carlo simulations. In 2002, there were 2,171,351 new all-cancer diagnoses in the 30 countries of Europe. Estimated PARs were 2.5% (95% CI 1.5-3.6%) in men and 4.1% (2.3-5.9%) in women. These collectively corresponded to 70,288 (95% CI 40,069-100,668) new cases. Sensitivity analyses revealed that estimates were most influenced by the assumed shape of the BMI distribution in the population and by cancer-specific risk estimates. In a scenario analysis of a plausible contemporary (2008) population, the estimated PARs increased to 3.2% (2.1-4.3%) and 8.6% (5.6-11.5%) in men and women, respectively. Endometrial, post-menopausal breast and colorectal cancers accounted for 65% of these cancers. This analysis quantifies the burden of incident cancers attributable to excess BMI in Europe. The estimates reported here provide a baseline for future modelling, and underline the need for research into interventions to control weight in the context of endometrial, breast and colorectal cancer.
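The PAR calculation rests on Levin's formula; a minimal sketch, with illustrative prevalence and relative-risk inputs rather than the European estimates used in the paper:

```python
def population_attributable_risk(p_exposed, rr):
    """Levin's population attributable risk fraction:
    PAR = p(RR - 1) / (1 + p(RR - 1)),
    where p is the exposure prevalence and RR the relative risk."""
    x = p_exposed * (rr - 1.0)
    return x / (1.0 + x)

def attributable_cases(p_exposed, rr, total_cases):
    """Number of incident cases attributable to the exposure."""
    return population_attributable_risk(p_exposed, rr) * total_cases
```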

Relevance: 60.00%

Abstract:

Background Combined modality treatment (CMT) consisting of chemotherapy followed by localised radiotherapy is standard treatment for patients with early stage Hodgkin lymphoma (HL). However, due to long term adverse effects such as secondary malignancies, the role of radiotherapy has been questioned recently and some clinical study groups advocate chemotherapy only for this indication. Objectives We performed a systematic review with meta-analysis of randomised controlled trials (RCTs) comparing chemotherapy alone with CMT in patients with early stage Hodgkin lymphoma with respect to response rate, progression-free survival (alternatively tumour control) and overall survival (OS). Search methods We searched MEDLINE, EMBASE and CENTRAL as well as conference proceedings from January 1980 to November 2010 for randomised controlled trials comparing chemotherapy alone to the same chemotherapy regimen plus radiotherapy. Selection criteria Randomised controlled trials comparing chemotherapy alone with CMT in patients with early stage HL. Trials in which the chemotherapy differed between treatment arms were excluded. Trials with more than 20% of patients in advanced stage were also excluded. Data collection and analysis Effect measures used were hazard ratios (HR) for tumour control and OS as well as relative risks for response rates. Two review authors independently extracted data and assessed quality of trials. We contacted study authors to obtain missing information. Since none of the trials reported progression-free survival according to our definitions, all similar outcomes were evaluated as tumour control. Main results Five RCTs involving 1245 patients were included. The HR was 0.41 (95% confidence interval (CI) 0.25 to 0.66) for tumour control and 0.40 (95% CI 0.27 to 0.61) for OS for patients receiving CMT compared to chemotherapy alone. Complete response rates were similar between treatment groups. 
In sensitivity analyses another six trials were included that did not fulfil the inclusion criteria of our protocol but were considered relevant to the topic. These trials underlined the results of the main analysis. Authors' conclusions Adding radiotherapy to chemotherapy improves tumour control and overall survival in patients with early stage Hodgkin lymphoma.

Relevance: 60.00%

Abstract:

Background Previous studies on childhood cancer and nuclear power plants (NPPs) produced conflicting results. We used a cohort approach to examine whether residence near NPPs was associated with leukaemia or any childhood cancer in Switzerland. Methods We computed person-years at risk for children aged 0–15 years born in Switzerland from 1985 to 2009, based on the Swiss censuses of 1990 and 2000, and identified cancer cases from the Swiss Childhood Cancer Registry. We geo-coded place of residence at birth and calculated incidence rate ratios (IRRs) with 95% confidence intervals (CIs) comparing the risk of cancer in children born <5 km, 5–10 km and 10–15 km from the nearest NPP with that in children born >15 km away, using Poisson regression models. Results We included 2925 children diagnosed with cancer during 21 117 524 person-years of follow-up; 953 (32.6%) had leukaemia. Eight and 12 children diagnosed with leukaemia at ages 0–4 and 0–15 years, and 18 and 31 children diagnosed with any cancer, were born <5 km from an NPP. Compared with children born >15 km away, the IRRs (95% CI) for leukaemia in 0–4 and 0–15 year olds were 1.20 (0.60–2.41) and 1.05 (0.60–1.86), respectively. For any cancer, the corresponding IRRs were 0.97 (0.61–1.54) and 0.89 (0.63–1.27). There was no evidence of a dose–response relationship with distance (P > 0.30). Results were similar for residence at diagnosis and at birth, and when adjusted for potential confounders. Results from sensitivity analyses were consistent with the main results. Conclusions This nationwide cohort study found little evidence of an association between residence near NPPs and the risk of leukaemia or any childhood cancer.

Relevance: 60.00%

Abstract:

Background Increased mortality among men on antiretroviral therapy (ART) has been documented but remains poorly understood. We examined the magnitude of and risk factors for gender differences in mortality on ART. Methods and Findings Analyses included 46,201 ART-naïve adults starting ART between January 2002 and December 2009 in eight ART programmes across South Africa (SA). Patients were followed from initiation of ART to outcome or analysis closure. The primary outcome was mortality; secondary outcomes were loss to follow-up (LTF), virologic suppression, and CD4+ cell count responses. Survival analyses were used to examine the hazard of death on ART by gender. Sensitivity analyses were limited to patients who were virologically suppressed and patients whose CD4+ cell count reached >200 cells/µl. We compared gender differences in mortality among HIV+ patients on ART with mortality in an age-standardised HIV-negative population. Among 46,201 adults (65% female, median age 35 years), during 77,578 person-years of follow-up, men had lower median CD4+ cell counts than women (85 versus 110 cells/µl, p<0.001), were more likely to be classified WHO stage III/IV (86 versus 77%, p<0.001), and had higher mortality in crude (8.5 versus 5.7 deaths/100 person-years, p<0.001) and adjusted analyses (adjusted hazard ratio [AHR] 1.31, 95% CI 1.22–1.41). After 36 months on ART, men were more likely than women to be truly LTF (AHR 1.20, 95% CI 1.12–1.28) but not to die after LTF (AHR 1.04, 95% CI 0.86–1.25). Findings were consistent across all eight programmes. Virologic suppression was similar by gender; women had slightly better immunologic responses than men. Notably, the observed gender differences in mortality on ART were smaller than gender differences in age-standardised death rates in the HIV-negative South African population. Over time, non-HIV mortality appeared to account for an increasing proportion of observed mortality. 
The analysis was limited by missing data on baseline HIV disease characteristics, and we did not directly observe mortality in the HIV-negative populations where the participating cohorts were located. Conclusions HIV-infected men have higher mortality on ART than women in South African programmes, but these differences are only partly explained by more advanced HIV disease at the time of ART initiation, differential LTF and subsequent mortality, and differences in responses to treatment. The observed differences in mortality on ART may be best explained by background differences in mortality between men and women in the South African population, unrelated to the HIV/AIDS epidemic.
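The comparison with an age-standardised HIV-negative population relies on direct standardisation: stratum-specific rates weighted by a standard population's age distribution. A sketch with hypothetical strata (the rates and weights are invented):

```python
def age_standardised_rate(stratum_rates, standard_weights):
    """Directly standardised rate: age-stratum-specific rates weighted by
    a standard population's age distribution (weights must sum to 1).
    Rates can be in any consistent unit, e.g. deaths per 100 person-years."""
    assert abs(sum(standard_weights) - 1.0) < 1e-9
    return sum(r * w for r, w in zip(stratum_rates, standard_weights))
```

Applying the same standard weights to men's and women's stratum-specific rates makes the two standardised rates comparable despite different age structures.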

Relevance: 60.00%

Abstract:

A dynamic deterministic simulation model was developed to assess the impact of different putative control strategies on the seroprevalence of Neospora caninum in female Swiss dairy cattle. The model structure comprised compartments of "susceptible" and "infected" animals (SI-model) and the cattle population was divided into 12 age classes. A reference model (Model 1) was developed to simulate the current (status quo) situation (present seroprevalence in Switzerland 12%), taking into account available demographic and seroprevalence data of Switzerland. Model 1 was modified to represent four putative control strategies: testing and culling of seropositive animals (Model 2), discontinued breeding with offspring from seropositive cows (Model 3), chemotherapeutic treatment of calves from seropositive cows (Model 4), and vaccination of susceptible and infected animals (Model 5). Models 2-4 considered different sub-scenarios with regard to the frequency of diagnostic testing. Multivariable Monte Carlo sensitivity analysis was used to assess the impact of uncertainty in input parameters. A policy of annual testing and culling of all seropositive cattle in the population reduced the seroprevalence effectively and rapidly from 12% to <1% in the first year of simulation. The control strategies with discontinued breeding with offspring from all seropositive cows, chemotherapy of calves and vaccination of all cattle reduced the prevalence more slowly than culling but were still very effective (reduction of prevalence below 2% within 11, 23 and 3 years of simulation, respectively). However, sensitivity analyses revealed that the effectiveness of these strategies depended strongly on the quality of the input parameters used, such as the horizontal and vertical transmission factors, the sensitivity of the diagnostic test and the efficacy of medication and vaccination. Finally, all models confirmed that it was not possible to completely eradicate N. 
caninum as long as the horizontal transmission process was not interrupted.
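The SI compartment structure described above can be sketched as a discrete-time prevalence recursion with horizontal transmission, vertical transmission via replacement heifers, and optional test-and-cull; all parameter values below are illustrative placeholders, not the published Swiss estimates, and the single-prevalence recursion ignores the model's 12 age classes.

```python
def simulate_si(prev0=0.12, years=30, beta_h=0.02, p_vert=0.95,
                cull_frac=0.0, repl_rate=0.25):
    """Minimal discrete-time SI sketch of N. caninum herd prevalence.

    Each year: susceptibles are infected horizontally at rate beta_h;
    a fraction cull_frac of infected animals is culled and replaced by
    susceptibles (test-and-cull); a fraction repl_rate of the herd is
    replaced, with replacements born to infected dams infected with
    probability p_vert (vertical transmission).
    Returns the yearly prevalence trajectory.
    """
    i = prev0
    history = [i]
    for _ in range(years):
        i = i + beta_h * (1.0 - i)                    # horizontal transmission
        i = i - cull_frac * i                         # test-and-cull removals
        i = (1.0 - repl_rate) * i + repl_rate * i * p_vert  # herd replacement
        i = min(max(i, 0.0), 1.0)
        history.append(i)
    return history
```

Even this toy recursion shows the abstract's qualitative point: with no culling, prevalence creeps upward via the horizontal term, while annual test-and-cull of all seropositives collapses it quickly.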

Relevance: 60.00%

Abstract:

While the benefits of intensified insulin treatment in insulin-dependent (Type 1) diabetes mellitus (IDDM) are well recognized, the risks have not been comprehensively characterized. We examined the risk of severe hypoglycaemia, ketoacidosis, and death in a meta-analysis of randomized controlled trials. The MEDLINE database, reference lists, and specialist journals were searched electronically or by hand to identify relevant studies with at least 6 months of follow-up and the monitoring of glycaemia by glycosylated haemoglobin measurements. Logistic regression was used for calculation of combined odds ratios and 95% confidence intervals (95% CI). The influence of covariates was examined by including covariate-by-treatment interaction terms. Methodological study quality was assessed and sensitivity analyses were performed. Fourteen trials were identified. These contributed 16 comparisons with 1028 patients allocated to intensified and 1039 allocated to conventional treatment. A total of 846 patients suffered at least one episode of severe hypoglycaemia, 175 patients experienced ketoacidosis and 26 patients died. The combined odds ratio (95% CI) for hypoglycaemia was 2.99 (2.45-3.64), for ketoacidosis 1.74 (1.27-2.38) and for death from all causes 1.40 (0.65-3.01). The risk of severe hypoglycaemia was determined by the degree of normalization of glycaemia achieved (p=0.005 for interaction term), with the results from the Diabetes Control and Complications Trial (DCCT) in line with the other trials. Ketoacidosis risk depended on the type of intensified treatment used. Odds ratios (95% CI) were 7.20 (2.95-17.58) for exclusive use of pumps, 1.13 (0.15-8.35) for multiple daily injections and 1.28 (0.90-1.83) for trials offering a choice between the two (p = 0.004 for interaction). 
Mortality was significantly (p = 0.007) increased for causes potentially associated with acute complications (7 vs 0 deaths, 5 deaths attributed to ketoacidosis, and 2 sudden deaths), and non-significantly (p = 0.16) decreased for macrovascular causes (3 vs 8 deaths). We conclude that there is a substantial risk of severe adverse effects associated with intensified insulin treatment. Mortality from acute metabolic causes is increased; however, this is largely counterbalanced by a reduction in cardiovascular mortality. The excess of severe hypoglycaemia in the DCCT is not exceptional. Multiple daily injection schemes may be safer than treatment with insulin pumps.
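The trial-level building block behind these combined odds ratios is the 2×2-table odds ratio with Woolf's confidence interval (the pooled estimates above additionally used logistic regression); a sketch with invented counts:

```python
import math

def odds_ratio(a, b, c, d):
    """Odds ratio for a 2x2 table with Woolf's 95% CI.

    a/b = events/non-events in the intensified arm,
    c/d = events/non-events in the conventional arm;
    SE of log(OR) = sqrt(1/a + 1/b + 1/c + 1/d).
    """
    or_ = (a * d) / (b * c)
    se = math.sqrt(1.0 / a + 1.0 / b + 1.0 / c + 1.0 / d)
    lo = math.exp(math.log(or_) - 1.96 * se)
    hi = math.exp(math.log(or_) + 1.96 * se)
    return or_, (lo, hi)
```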

Relevance: 60.00%

Abstract:

BACKGROUND: A growing number of case reports have described tenofovir (TDF)-related proximal renal tubulopathy and impaired calculated glomerular filtration rates (cGFR). We assessed TDF-associated changes in cGFR in a large observational HIV cohort. METHODS: We compared treatment-naive patients or patients with treatment interruptions ≥12 months starting either a TDF-based combination antiretroviral therapy (cART) (n = 363) or a TDF-sparing regimen (n = 715). The predefined primary endpoint was the time to a 10 ml/min reduction in cGFR, based on the Cockcroft-Gault equation, confirmed by a follow-up measurement at least 1 month later. In sensitivity analyses, secondary endpoints including calculations based on the Modification of Diet in Renal Disease (MDRD) formula were considered. Endpoints were modelled using pre-specified covariates in a multiple Cox proportional hazards model. RESULTS: Two-year event-free probabilities were 0.65 (95% confidence interval [CI] 0.58-0.72) and 0.80 (95% CI 0.76-0.83) for patients starting TDF-containing or TDF-sparing cART, respectively. In the multiple Cox model, diabetes mellitus (hazard ratio [HR] = 2.34 [95% CI 1.24-4.42]), higher baseline cGFR (HR = 1.03 [95% CI 1.02-1.04] per 10 ml/min), TDF use (HR = 1.84 [95% CI 1.35-2.51]) and boosted protease inhibitor use (HR = 1.71 [95% CI 1.30-2.24]) significantly increased the risk of reaching the primary endpoint. Sensitivity analyses showed high consistency. CONCLUSION: There is consistent evidence for a significant reduction in cGFR associated with TDF use in HIV-infected patients. Our findings call for strict monitoring of renal function in long-term TDF users with tests that distinguish between glomerular dysfunction and proximal renal tubulopathy, a known adverse effect of TDF.
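The cGFR endpoint is based on the Cockcroft-Gault equation, which is straightforward to compute; note that the units matter (age in years, weight in kg, serum creatinine in mg/dL):

```python
def cockcroft_gault(age, weight_kg, serum_creatinine_mg_dl, female=False):
    """Creatinine clearance (ml/min) by the Cockcroft-Gault equation:
    ((140 - age) * weight) / (72 * Scr), multiplied by 0.85 for women."""
    cgfr = (140 - age) * weight_kg / (72.0 * serum_creatinine_mg_dl)
    return cgfr * 0.85 if female else cgfr
```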

Relevance: 60.00%

Abstract:

BACKGROUND The Fractional Flow Reserve Versus Angiography for Multivessel Evaluation (FAME) 2 trial demonstrated a significant reduction in subsequent coronary revascularization among patients with stable angina and at least 1 coronary lesion with a fractional flow reserve ≤0.80 who were randomized to percutaneous coronary intervention (PCI) compared with best medical therapy. The economic and quality-of-life implications of PCI in the setting of an abnormal fractional flow reserve are unknown. METHODS AND RESULTS We calculated the cost of the index hospitalization based on initial resource use and follow-up costs based on Medicare reimbursements. We assessed patient utility using the EQ-5D health survey with US weights at baseline and 1 month and projected quality-adjusted life-years assuming a linear decline over 3 years in the 1-month utility improvements. We calculated the incremental cost-effectiveness ratio based on cumulative costs over 12 months. Initial costs were significantly higher for PCI in the setting of an abnormal fractional flow reserve than with medical therapy ($9927 versus $3900, P<0.001), but the $6027 difference narrowed over 1-year follow-up to $2883 (P<0.001), mostly because of the cost of subsequent revascularization procedures. Patient utility was improved more at 1 month with PCI than with medical therapy (0.054 versus 0.001 units, P<0.001). The incremental cost-effectiveness ratio of PCI was $36 000 per quality-adjusted life-year, which was robust in bootstrap replications and in sensitivity analyses. CONCLUSIONS PCI of coronary lesions with reduced fractional flow reserve improves outcomes and appears economically attractive compared with best medical therapy among patients with stable angina.
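The QALY projection (a 1-month utility gain declining linearly to zero over 3 years) and the ICER are simple arithmetic; a sketch making the triangle-area assumption explicit, with illustrative numbers in the test rather than the trial's figures:

```python
def projected_qaly_gain(delta_utility_1mo, horizon_years=3.0):
    """QALY gain when the 1-month utility improvement declines linearly
    to zero over the horizon: the area of a triangle, 0.5 * du * T."""
    return 0.5 * delta_utility_1mo * horizon_years

def icer(cost_a, cost_b, qaly_a, qaly_b):
    """Incremental cost-effectiveness ratio: extra cost per QALY gained
    for strategy A over strategy B."""
    return (cost_a - cost_b) / (qaly_a - qaly_b)
```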

Relevance: 60.00%

Abstract:

OBJECTIVES Non-steroidal anti-inflammatory drugs (NSAIDs) may cause kidney damage. This study assessed the impact of prolonged NSAID exposure on renal function in a large rheumatoid arthritis (RA) patient cohort. METHODS Renal function was prospectively followed between 1996 and 2007 in 4101 RA patients with multilevel mixed models for longitudinal data over a mean period of 3.2 years. Among the 2739 'NSAID users' were 1290 patients treated with cyclooxygenase type 2 selective NSAIDs, while 1362 subjects were 'NSAID naive'. The primary outcome was the estimated glomerular filtration rate according to the Cockcroft-Gault formula (eGFRCG); secondary outcomes were the Modification of Diet in Renal Disease and Chronic Kidney Disease Epidemiology Collaboration equations and serum creatinine concentrations. In sensitivity analyses, NSAID dosing effects were compared for patients with NSAID registration in ≤/>50%, ≤/>80% or ≤/>90% of assessments. FINDINGS In patients with baseline eGFRCG >30 mL/min, eGFRCG evolved without significant differences over time between 'NSAID users' (mean change in eGFRCG -0.87 mL/min/year, 95% CI -1.15 to -0.59) and 'NSAID naive' (-0.67 mL/min/year, 95% CI -1.26 to -0.09; p=0.63). In a multivariate Cox regression analysis adjusted for the significant confounders age, sex, body mass index, arterial hypertension and heart disease, as well as for other non-significant factors, NSAIDs were an independent predictor of accelerated renal function decline only in patients with advanced baseline renal impairment (eGFRCG <30 mL/min). Analyses with secondary outcomes and sensitivity analyses confirmed these results. CONCLUSIONS NSAIDs had no negative impact on renal function estimates except in patients with advanced renal impairment.

Relevance: 60.00%

Abstract:

QUESTION UNDER STUDY The aim of this study was to evaluate the cost-effectiveness of ticagrelor and generic clopidogrel as add-on therapy to acetylsalicylic acid (ASA) in patients with acute coronary syndrome (ACS), from a Swiss perspective. METHODS Based on the PLATelet inhibition and patient Outcomes (PLATO) trial, one-year mean healthcare costs per patient treated with ticagrelor or generic clopidogrel were analysed from a payer perspective in 2011. A two-part decision-analytic model estimated treatment costs, quality-adjusted life years (QALYs), life years and the cost-effectiveness of ticagrelor and generic clopidogrel in patients with ACS up to a lifetime horizon, at a discount rate of 2.5% per annum. Sensitivity analyses were performed. RESULTS Over a patient's lifetime, treatment with ticagrelor generates an additional 0.1694 QALYs and 0.1999 life years at an additional cost of CHF 260 compared with generic clopidogrel. This results in an incremental cost-effectiveness ratio (ICER) of CHF 1,536 per QALY and CHF 1,301 per life year gained. Ticagrelor dominated generic clopidogrel over the five-year and one-year periods, with treatment generating cost savings of CHF 224 and CHF 372 while gaining 0.0461 and 0.0051 QALYs and 0.0517 and 0.0062 life years, respectively. Univariate sensitivity analyses confirmed the dominant position of ticagrelor in the first five years, and probabilistic sensitivity analyses showed a high probability of cost-effectiveness over a lifetime. CONCLUSION During the first five years after ACS, treatment with ticagrelor dominates generic clopidogrel in Switzerland. Over a patient's lifetime, ticagrelor is highly cost-effective compared with generic clopidogrel, with ICERs well below commonly accepted willingness-to-pay thresholds.
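The 2.5% per-annum discounting used in such lifetime models is a standard present-value sum; a minimal sketch (year 0 undiscounted), with an invented cost stream in the test:

```python
def discounted_total(values_per_year, annual_rate=0.025):
    """Present value of a yearly stream (e.g. costs or QALYs) at a
    constant annual discount rate; the first year is not discounted."""
    return sum(v / (1.0 + annual_rate) ** t
               for t, v in enumerate(values_per_year))
```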

Relevance: 60.00%

Abstract:

Tropical wetlands are estimated to represent about 50% of the natural wetland methane (CH4) emissions and explain a large fraction of the observed CH4 variability on timescales ranging from glacial–interglacial cycles to the currently observed year-to-year variability. Despite their importance, tropical wetlands are poorly represented in global models aiming to predict global CH4 emissions. This publication documents a first step in the development of a process-based model of CH4 emissions from tropical floodplains for global applications. For this purpose, the LPX-Bern Dynamic Global Vegetation Model (LPX hereafter) was slightly modified to represent floodplain hydrology, vegetation and associated CH4 emissions. The extent of tropical floodplains was prescribed using output from the spatially explicit hydrology model PCR-GLOBWB. We introduced new plant functional types (PFTs) that explicitly represent floodplain vegetation. The PFT parameterizations were evaluated against available remote-sensing data sets (GLC2000 land cover and MODIS Net Primary Productivity). Simulated CH4 flux densities were evaluated against field observations and regional flux inventories. Simulated CH4 emissions at Amazon Basin scale were compared to model simulations performed in the WETCHIMP intercomparison project. We found that LPX reproduces the average magnitude of observed net CH4 flux densities for the Amazon Basin. However, the model does not reproduce the variability between sites or between years within a site. Unfortunately, site information is too limited to confirm or refute some model features. At the Amazon Basin scale, our results underline the large uncertainty in the magnitude of wetland CH4 emissions. Sensitivity analyses gave insights into the main drivers of floodplain CH4 emission and their associated uncertainties.
In particular, uncertainties in floodplain extent (i.e., difference between GLC2000 and PCR-GLOBWB output) modulate the simulated emissions by a factor of about 2. Our best estimates, using PCR-GLOBWB in combination with GLC2000, lead to simulated Amazon-integrated emissions of 44.4 ± 4.8 Tg yr−1. Additionally, the LPX emissions are highly sensitive to vegetation distribution. Two simulations with the same mean PFT cover, but different spatial distributions of grasslands within the basin, modulated emissions by about 20%. Correcting the LPX-simulated NPP using MODIS reduces the Amazon emissions by 11.3%. Finally, due to an intrinsic limitation of LPX to account for seasonality in floodplain extent, the model failed to reproduce the full dynamics in CH4 emissions but we proposed solutions to this issue. The interannual variability (IAV) of the emissions increases by 90% if the IAV in floodplain extent is accounted for, but still remains lower than in most of the WETCHIMP models. While our model includes more mechanisms specific to tropical floodplains, we were unable to reduce the uncertainty in the magnitude of wetland CH4 emissions of the Amazon Basin. Our results helped identify and prioritize directions towards more accurate estimates of tropical CH4 emissions, and they stress the need for more research to constrain floodplain CH4 emissions and their temporal variability, even before including other fundamental mechanisms such as floating macrophytes or lateral water fluxes.

Relevance: 60.00%

Abstract:

Recent evidence suggests that transition risks from initial clinical high risk (CHR) status to psychosis are decreasing. The role played by remission in this context is mostly unknown. The present study addresses this issue by means of a meta-analysis including eight relevant studies published up to January 2012 that reported remission rates from an initial CHR status. The primary effect size measure was the longitudinal proportion of remissions compared to non-remissions in subjects with a baseline CHR state. Random effect models were employed to address the high heterogeneity across the studies included. To assess the robustness of the results, we performed sensitivity analyses by sequentially removing each study and rerunning the analysis. Of 773 subjects who met initial CHR criteria, 73% did not convert to psychosis over a 2-year follow-up. Of these, about 46% fully remitted from the baseline attenuated psychotic symptoms, as evaluated on the psychometric measures usually employed by prodromal services. The corresponding clinical remission was estimated to be as high as 35% of the baseline CHR sample. The CHR state is associated with a significant proportion of remitting subjects, which can be accounted for by the effective treatments received, a lead-time bias, a dilution effect, or a comorbid effect of other psychiatric diagnoses.
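Random-effects pooling of proportions is typically done on the logit scale with a DerSimonian-Laird between-study variance; the sketch below is a generic textbook version, not necessarily the software the authors used, and the study counts in the test are invented.

```python
import math

def pooled_proportion(counts):
    """DerSimonian-Laird random-effects pooling of proportions on the
    logit scale. counts is a list of >= 2 (events, total) pairs with
    0 < events < total. Returns the pooled proportion."""
    ys, vs = [], []
    for k, n in counts:
        p = k / n
        ys.append(math.log(p / (1.0 - p)))       # logit transform
        vs.append(1.0 / k + 1.0 / (n - k))       # approximate variance
    w = [1.0 / v for v in vs]
    sw = sum(w)
    y_fixed = sum(wi * yi for wi, yi in zip(w, ys)) / sw
    q = sum(wi * (yi - y_fixed) ** 2 for wi, yi in zip(w, ys))
    c = sw - sum(wi ** 2 for wi in w) / sw
    tau2 = max(0.0, (q - (len(ys) - 1)) / c)     # between-study variance
    w_re = [1.0 / (v + tau2) for v in vs]
    y_re = sum(wi * yi for wi, yi in zip(w_re, ys)) / sum(w_re)
    return 1.0 / (1.0 + math.exp(-y_re))         # back to a proportion
```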

Relevance: 60.00%

Abstract:

BACKGROUND The cost-effectiveness of routine viral load (VL) monitoring of HIV-infected patients on antiretroviral therapy (ART) depends on various factors that differ between settings and across time. Low-cost point-of-care (POC) tests for VL are in development and may make routine VL monitoring affordable in resource-limited settings. We developed a software tool to study the cost-effectiveness of switching to second-line ART with different monitoring strategies, and focused on POC-VL monitoring. METHODS We used a mathematical model to simulate cohorts of patients from start of ART until death. We modeled 13 strategies (no 2nd-line, clinical, CD4 (with or without targeted VL), POC-VL, and laboratory-based VL monitoring, with different frequencies). We included a scenario with identical failure rates across strategies, and one in which routine VL monitoring reduces the risk of failure. We compared lifetime costs and averted disability-adjusted life-years (DALYs). We calculated incremental cost-effectiveness ratios (ICER). We developed an Excel tool to update the results of the model for varying unit costs and cohort characteristics, and conducted several sensitivity analyses varying the input costs. RESULTS Introducing 2nd-line ART had an ICER of US$1651-1766/DALY averted. Compared with clinical monitoring, the ICER of CD4 monitoring was US$1896-US$5488/DALY averted and VL monitoring US$951-US$5813/DALY averted. We found no difference between POC- and laboratory-based VL monitoring, except for the highest measurement frequency (every 6 months), where laboratory-based testing was more effective. Targeted VL monitoring was on the cost-effectiveness frontier only if the difference between 1st- and 2nd-line costs remained large, and if we assumed that routine VL monitoring does not prevent failure. CONCLUSION Compared with the less expensive strategies, the cost-effectiveness of routine VL monitoring essentially depends on the cost of 2nd-line ART. 
Our Excel tool is useful for determining optimal monitoring strategies for specific settings, with specific sex- and age-distributions and unit costs.
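The cost-effectiveness frontier mentioned above is obtained by discarding dominated strategies; a sketch of strong dominance with invented (cost, DALYs-averted) pairs:

```python
def undominated(strategies):
    """Filter out strongly dominated strategies.

    strategies is a list of (cost, dalys_averted) pairs; a strategy is
    dominated when some alternative costs no more, averts at least as
    many DALYs, and is strictly better on at least one of the two axes.
    Surviving strategies form the cost-effectiveness frontier candidates.
    """
    out = []
    for i, (ci, ei) in enumerate(strategies):
        dominated = any(cj <= ci and ej >= ei and (cj < ci or ej > ei)
                        for j, (cj, ej) in enumerate(strategies) if j != i)
        if not dominated:
            out.append((ci, ei))
    return out
```

ICERs are then computed only between adjacent surviving strategies, which is why an option like targeted VL monitoring can drop off the frontier when input costs change.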