44 results for cost-benefit analyses


Relevance: 30.00%

Abstract:

Swiss health care is relatively costly. To better understand the drivers of spending, this study analyses geographic variation in per capita consultation costs for ambulatory care.


Background: Meadows are regularly mown to provide fodder or litter for livestock and to prevent vegetation succession. However, the time of year at which meadows should first be mown to maximize biological diversity remains controversial and may vary with context and focal taxa. We carried out a systematic review and meta-analysis of the effects of delaying the first mowing date on plants and invertebrates in European meadowlands.

Methods: Following a CEE protocol, ISI Web of Science, Science Direct, JSTOR, Google and Google Scholar were searched. We recorded all studies that compared the species richness of plants, or the species richness or abundance of invertebrates, between grassland plots mown at a postponed date (treatment) vs. plots mown earlier (control). To be included in the meta-analysis, compared plots had to be similar in all management respects except the date of the first cut, which was (mostly experimentally) manipulated, and had to be located in the same meadow type. Meta-analyses applying the Hedges' d statistic were performed.

Results: Plant species richness responded differently depending on the date to which mowing was postponed. Delaying mowing from spring to summer had a positive effect, while delaying either from spring to fall, or from early summer to later in the season, had a negative effect. Invertebrates were expected to show a strong response to delayed mowing because of their dependence on sward structure, but only species richness showed a clearly significant positive response; invertebrate abundance was positively influenced in only a few studies.

Conclusions: The present meta-analysis shows that, in general, delaying the first mowing date in European meadowlands has either positive or neutral effects on plant and invertebrate biodiversity (except for plant species richness when delaying from spring to fall or from early summer onwards). Overall, there was also strong between-study heterogeneity, pointing to other major confounding factors, the elucidation of which requires further field experiments with both larger sample sizes and a distinction between taxon-specific and meadow-type-specific responses.
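The pooled effect sizes above rest on the Hedges' d statistic, a standardized mean difference with a small-sample bias correction. A minimal sketch of the calculation (often written as Hedges' g); all plot data below are hypothetical, not taken from the review:

```python
import math

def hedges_g(mean_t, mean_c, sd_t, sd_c, n_t, n_c):
    """Bias-corrected standardized mean difference between a treatment
    (e.g. late-mown) and a control (early-mown) group."""
    # Pooled standard deviation of the two groups
    sp = math.sqrt(((n_t - 1) * sd_t**2 + (n_c - 1) * sd_c**2) / (n_t + n_c - 2))
    d = (mean_t - mean_c) / sp            # Cohen's d
    j = 1 - 3 / (4 * (n_t + n_c) - 9)     # small-sample correction factor
    return d * j

# Hypothetical species-richness data for one pair of plots (not from the review)
g = hedges_g(mean_t=18.2, mean_c=15.1, sd_t=4.0, sd_c=3.8, n_t=12, n_c=12)  # ≈ 0.77
```

A positive g here means richer late-mown plots; per-study values like this are then combined in a random-effects model to obtain the pooled estimates and the between-study heterogeneity the abstract reports.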


OBJECTIVE: This study aimed to assess the potential cost-effectiveness of testing patients with nephropathies for the I/D polymorphism before starting angiotensin-converting enzyme (ACE) inhibitor therapy, using a 3-year time horizon and a healthcare perspective. METHODS: We used a combination of decision analysis and Markov modeling to evaluate the potential economic value of this pharmacogenetic test in preventing unfavorable treatment in patients with nephropathies. The estimate of the predictive value of the I/D polymorphism is based on a systematic review showing that DD carriers tend to respond well to ACE inhibitors, while II carriers seem not to benefit adequately from this treatment. Data on ACE inhibitor effectiveness in nephropathy were derived from the REIN (Ramipril Efficacy in Nephropathy) trial. We calculated the number of patients prevented from developing end-stage renal disease (ESRD) and the differences in incremental costs and incremental effect, expressed as life-years free of ESRD. A probabilistic sensitivity analysis was conducted to determine the robustness of the results. RESULTS: Compared with unselective treatment, testing patients for their ACE genotype could save 12 patients per 1000 from developing ESRD during the 3 years covered by the model. As the mean net cost saving was €356,000 per 1000 patient-years, and 9 life-years free of ESRD were gained, selective treatment appears to be dominant. CONCLUSION: The study suggests that genetic testing for the I/D polymorphism in patients with nephropathy before initiating ACE inhibitor therapy will most likely be cost-effective, even if the risk for II carriers of developing ESRD when treated with ACE inhibitors is only 1.4% higher than for DD carriers. Further studies, however, are required to corroborate the difference in treatment response between ACE genotypes before genetic testing can be justified in clinical practice.
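The Markov-style cohort arithmetic behind a figure such as "12 patients per 1000 saved from ESRD" can be sketched as follows. This is a deliberately simplified two-state model with invented annual transition probabilities, not the authors' model:

```python
def esrd_cases(p_esrd_per_year, cohort=1000.0, years=3):
    """Expected ESRD cases in a closed cohort, applying a constant annual
    transition probability from 'nephropathy' to 'ESRD' each yearly cycle."""
    at_risk = cohort
    cases = 0.0
    for _ in range(years):
        new_cases = at_risk * p_esrd_per_year
        cases += new_cases
        at_risk -= new_cases   # absorbed into the ESRD state
    return cases

# Invented annual ESRD probabilities for unselective vs genotype-guided treatment
prevented = esrd_cases(0.030) - esrd_cases(0.026)   # ≈ 11.3 cases per 1000 over 3 years
```

The published model additionally attaches costs and life-years free of ESRD to each state and runs probabilistic sensitivity analyses over the uncertain inputs.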


To compare the efficacy of chemoendocrine treatment with that of endocrine treatment (ET) alone for postmenopausal women with highly endocrine-responsive breast cancer. In the International Breast Cancer Study Group (IBCSG) Trials VII and 12-93, postmenopausal women with node-positive, estrogen receptor (ER)-positive or ER-negative, operable breast cancer were randomized to receive either chemotherapy, endocrine therapy, or combined chemoendocrine treatment. Results were analyzed overall in the cohort of 893 patients with endocrine-responsive disease, and according to prospectively defined categories of ER, age, and nodal status. STEPP (subpopulation treatment effect pattern plot) analyses assessed the chemotherapy effect. The median follow-up was 13 years. Adding chemotherapy reduced the relative risk of a disease-free survival event by 19% (P = 0.02) compared with ET alone. STEPP analyses showed little effect of chemotherapy for tumors with high levels of ER expression (P = 0.07), or for the cohort with one positive node (P = 0.03). Chemotherapy significantly improves disease-free survival for postmenopausal women with endocrine-responsive breast cancer, but the magnitude of the effect is substantially attenuated if ER levels are high.


This study explores the effects of three different 2-dose varicella zoster virus (VZV) vaccination strategies in Switzerland. The EVITA model was used to assess the clinical benefits and costs of three strategies: (1) vaccination of 11- to 15-year-old adolescents with a negative or uncertain history of chickenpox; (2) universal vaccination of toddlers at age 1 to 2 years; and (3) strategy 2 plus catch-up vaccination of 11- to 15-year-old susceptible adolescents. The cost-effectiveness analysis compares strategies 2 and 3 versus strategy 1 (the current vaccination policy in Switzerland). Probabilities for clinical outcomes and medical resource utilization were derived from a real-world survey among Swiss pediatricians and general practitioners including 236 individuals with VZV infection, published information on varicella complications, and expert opinion. Costs of medical resource utilization represent official Swiss medical tariffs. The model predicts both universal childhood vaccination strategies to be more effective in reducing varicella disease burden than strategy 1. Economically, both universal childhood vaccination strategies, with or without catch-up, result in net savings from the societal perspective, reflected by a benefit-cost ratio (BCR) of 1.22 or 1.29, respectively. In contrast, the model predicts net costs from the payer perspective (BCR of 0.27 and 0.30, respectively). These economic findings are comparable to those reported from other similar evaluations. However, owing to the recent recommendation of a 2-dose varicella vaccination schedule, our economic results for Switzerland are somewhat less favorable than those for other country analyses in which a less expensive 1-dose vaccination regimen for toddlers was studied.
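The benefit-cost ratios quoted above are simply monetized benefits divided by costs, so a BCR above 1 indicates net savings and a BCR below 1 net costs. A toy illustration with made-up figures chosen only to produce ratios of similar size to those reported:

```python
def benefit_cost_ratio(benefits, costs):
    """BCR > 1 means monetized benefits exceed costs, i.e. net savings."""
    return benefits / costs

# Invented yearly figures (millions of CHF); not the study's inputs
societal = benefit_cost_ratio(benefits=36.6, costs=30.0)  # 1.22: net savings
payer = benefit_cost_ratio(benefits=8.1, costs=30.0)      # 0.27: net costs
```

The perspective matters because the societal calculation counts averted productivity losses (e.g. parents' work absences) among the benefits, while the payer calculation counts only averted medical costs.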


Even though complete resection is regarded as the only curative treatment for non-small cell lung cancer (NSCLC), >50% of resected patients die from a recurrence or a second primary tumour of the lung within 5 yrs. It remains unclear whether follow-up in these patients is cost-effective and whether it can improve the outcome through early detection of recurrent tumour. The benefit of regular follow-up was analysed in a consecutive series of 563 patients who had undergone potentially curative resection for NSCLC at the University Hospital. The follow-up consisted of clinical visits and chest radiography according to a standard protocol for up to 10 yrs. Survival rates were estimated using the Kaplan-Meier method and the cost-effectiveness of the follow-up programme was assessed. A total of 23 patients (6.4% of the group with lobectomy) underwent further operation with curative intent for a second pulmonary malignancy. The regular follow-up over a 10-yr period provided the chance of a second curative treatment to 3.8% of all patients. The calculated cost per life-yr gained was 90,000 Swiss francs, far above that of comparable large-scale surveillance programmes. Based on these data, the intensity and duration of the follow-up were reduced.


Organic management is one of the most popular strategies to reduce the negative environmental impacts of intensive agriculture. However, little is known about its benefits for biodiversity and potential yield penalties under organic grassland management across different grassland types, i.e. meadow, pasture and mown pasture. We therefore studied the diversity of vascular plants and foliage-living arthropods (Coleoptera, Araneae, Heteroptera, Auchenorrhyncha), yield, fodder quality, soil phosphorus concentrations and land-use intensity of organic and conventional grasslands across three study regions in Germany. Furthermore, all variables were related to the time since conversion to organic management in order to assess temporal developments reaching up to 18 years. Arthropod diversity was significantly higher under organic than conventional management, although this was not the case for Araneae, Heteroptera and Auchenorrhyncha when analyzed separately. By contrast, arthropod abundance, vascular plant diversity, yield and fodder quality did not differ considerably between organic and conventional grasslands. Analyses did not reveal differences in the effect of organic management among grassland types. None of the recorded abiotic and biotic parameters showed a significant trend with time since transition to organic management, except soil organic phosphorus concentration, which decreased with time. This implies that permanent grasslands respond more slowly, and probably more weakly, to organic management than crop fields do. However, as land-use intensity and inorganic soil phosphorus concentrations were significantly lower in organic grasslands, overcoming seed and dispersal limitation by re-introducing plant species might be needed to exploit the full ecological potential of organic grassland management.
We conclude that although organic management did not automatically increase the diversity of all studied taxa, it is a reasonable and useful way to support agro-biodiversity.


BACKGROUND The Fractional Flow Reserve Versus Angiography for Multivessel Evaluation (FAME) 2 trial demonstrated a significant reduction in subsequent coronary revascularization among patients with stable angina and at least 1 coronary lesion with a fractional flow reserve ≤0.80 who were randomized to percutaneous coronary intervention (PCI) compared with best medical therapy. The economic and quality-of-life implications of PCI in the setting of an abnormal fractional flow reserve are unknown. METHODS AND RESULTS We calculated the cost of the index hospitalization based on initial resource use and follow-up costs based on Medicare reimbursements. We assessed patient utility using the EQ-5D health survey with US weights at baseline and 1 month and projected quality-adjusted life-years assuming a linear decline over 3 years in the 1-month utility improvements. We calculated the incremental cost-effectiveness ratio based on cumulative costs over 12 months. Initial costs were significantly higher for PCI in the setting of an abnormal fractional flow reserve than with medical therapy ($9927 versus $3900, P<0.001), but the $6027 difference narrowed over 1-year follow-up to $2883 (P<0.001), mostly because of the cost of subsequent revascularization procedures. Patient utility was improved more at 1 month with PCI than with medical therapy (0.054 versus 0.001 units, P<0.001). The incremental cost-effectiveness ratio of PCI was $36,000 per quality-adjusted life-year, which was robust in bootstrap replications and in sensitivity analyses. CONCLUSIONS PCI of coronary lesions with reduced fractional flow reserve improves outcomes and appears economically attractive compared with best medical therapy among patients with stable angina.
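The headline figure of roughly $36,000 per QALY is an incremental cost-effectiveness ratio: the extra cost of PCI divided by the extra QALYs it generates. A minimal sketch; the $2883 one-year cost difference is taken from the abstract, but the QALY totals are hypothetical round numbers chosen for illustration only:

```python
def icer(cost_new, cost_old, effect_new, effect_old):
    """Incremental cost-effectiveness ratio: extra cost per extra unit of
    effect (here, dollars per quality-adjusted life-year gained)."""
    return (cost_new - cost_old) / (effect_new - effect_old)

# 6783 - 3900 = 2883, matching the reported one-year cost difference;
# the 0.88 and 0.80 QALY totals are invented for illustration.
ratio = icer(cost_new=6783, cost_old=3900, effect_new=0.88, effect_old=0.80)  # ≈ 36000
```

The bootstrap replications mentioned in the abstract repeat this calculation over resampled patient-level costs and utilities to check that the ratio stays below common willingness-to-pay thresholds.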


Background: WHO's 2013 revisions to its Consolidated Guidelines on antiretroviral drugs recommend routine viral load monitoring, rather than clinical or immunological monitoring, as the preferred monitoring approach on the basis of clinical evidence. However, HIV programmes in resource-limited settings require guidance on the most cost-effective use of resources in view of other competing priorities such as expansion of antiretroviral therapy coverage. We assessed the cost-effectiveness of alternative patient monitoring strategies. Methods: We evaluated a range of monitoring strategies, including clinical, CD4 cell count, and viral load monitoring, alone and together, at different frequencies and with different criteria for switching to second-line therapies. We used three independently constructed and validated models simultaneously. We estimated costs on the basis of resource use projected in the models and associated unit costs; we quantified impact as disability-adjusted life years (DALYs) averted. We compared alternatives using incremental cost-effectiveness analysis. Findings: All models show that clinical monitoring delivers significant benefit compared with a hypothetical baseline scenario with no monitoring or switching. Regular CD4 cell count monitoring confers a benefit over clinical monitoring alone, at an incremental cost that makes it affordable in more settings than viral load monitoring, which is currently more expensive. Viral load monitoring without CD4 cell count every 6-12 months provides the greatest reductions in morbidity and mortality, but incurs a high cost per DALY averted, resulting in lost opportunities to generate health gains if implemented instead of increasing antiretroviral therapy coverage or expanding antiretroviral therapy eligibility.
Interpretation: The priority for HIV programmes should be to expand antiretroviral therapy coverage, first at CD4 cell counts below 350 cells per μL, and then at CD4 cell counts below 500 cells per μL, using lower-cost clinical or CD4 monitoring. At current costs, viral load monitoring should be considered only after high antiretroviral therapy coverage has been achieved. Point-of-care technologies and other factors reducing costs might make viral load monitoring more affordable in the future. Funding: Bill & Melinda Gates Foundation, WHO.


QUESTION UNDER STUDY The aim of this study was to evaluate the cost-effectiveness of ticagrelor and generic clopidogrel as add-on therapy to acetylsalicylic acid (ASA) in patients with acute coronary syndrome (ACS), from a Swiss perspective. METHODS Based on the PLATelet inhibition and patient Outcomes (PLATO) trial, one-year mean healthcare costs per patient treated with ticagrelor or generic clopidogrel were analysed from a payer perspective in 2011. A two-part decision-analytic model estimated treatment costs, quality-adjusted life years (QALYs), life-years and the cost-effectiveness of ticagrelor and generic clopidogrel in patients with ACS up to a lifetime horizon, at a discount rate of 2.5% per annum. Sensitivity analyses were performed. RESULTS Over a patient's lifetime, treatment with ticagrelor generates an additional 0.1694 QALYs and 0.1999 life-years at a cost of CHF 260 compared with generic clopidogrel. This results in an incremental cost-effectiveness ratio (ICER) of CHF 1,536 per QALY and CHF 1,301 per life-year gained. Ticagrelor dominated generic clopidogrel over the five-year and one-year periods, with treatment generating cost savings of CHF 224 and CHF 372 while gaining 0.0461 and 0.0051 QALYs and 0.0517 and 0.0062 life-years, respectively. Univariate sensitivity analyses confirmed the dominant position of ticagrelor in the first five years, and probabilistic sensitivity analyses showed a high probability of cost-effectiveness over a lifetime. CONCLUSION During the first five years after ACS, treatment with ticagrelor dominates generic clopidogrel in Switzerland. Over a patient's lifetime, ticagrelor is highly cost-effective compared with generic clopidogrel, with ICERs well below commonly accepted willingness-to-pay thresholds.
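Lifetime results like these are discounted at 2.5% per annum: an outcome occurring t years in the future is divided by (1 + 0.025)^t before being summed. A minimal sketch of that discounting step (the five yearly QALY gains below are hypothetical, not the model's outputs):

```python
def discounted_total(yearly_values, rate=0.025):
    """Present value of a stream of yearly outcomes (costs or QALYs),
    discounted at `rate` per annum; year 0 is undiscounted."""
    return sum(v / (1 + rate) ** t for t, v in enumerate(yearly_values))

# Hypothetical QALY gains of 0.01 per year over five years
pv = discounted_total([0.01] * 5)   # ≈ 0.0476, versus 0.05 undiscounted
```

Discounting both the cost and QALY streams this way is what allows a single lifetime ICER (incremental cost divided by incremental discounted QALYs) to be quoted.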


Introduction. In this era of high-tech medicine, it is becoming increasingly important to assess patient satisfaction. There are several methods to do so, but these differ greatly in terms of cost, time, labour, and external validity. The aim of this study is to describe and compare the structure and implementation of different methods to assess the satisfaction of patients in an emergency department. Methods. The structure and implementation of the different methods to assess patient satisfaction were evaluated on the basis of a 90-minute standardised interview. Results. We identified a total of six different methods in six different hospitals. The average number of patients assessed was 5012, with a range from 230 (M5) to 20 000 patients (M2). In four methods (M1, M3, M5, and M6), the questionnaire was composed by a specialised external institute; in two methods, it was created by the hospital itself (M2, M4). The median response rate was 58.4% (range 9-97.8%). With a reminder, the response rate increased by 60% (M3). Conclusion. The ideal method to assess patient satisfaction in the emergency department setting is a patient-based assessment conducted within the emergency department, planned and guided by expert personnel.


OBJECTIVE: The presence of minority nonnucleoside reverse transcriptase inhibitor (NNRTI)-resistant HIV-1 variants prior to antiretroviral therapy (ART) has been linked to virologic failure in treatment-naive patients. DESIGN: We performed a large retrospective study to determine the number of treatment failures that could have been prevented by implementing minority drug-resistant HIV-1 variant analyses in ART-naive patients in whom no NNRTI resistance mutations were detected by routine resistance testing. METHODS: Of 1608 patients in the Swiss HIV Cohort Study who had initiated first-line ART with two nucleoside reverse transcriptase inhibitors (NRTIs) and one NNRTI before July 2008, 519 were eligible on the basis of HIV-1 subtype, viral load and sample availability. The key NNRTI drug resistance mutations K103N and Y181C were measured by allele-specific PCR in 208 of the 519 randomly chosen patients. RESULTS: Minority K103N and Y181C drug resistance mutations were detected in 5 of 190 (2.6%) and 10 of 201 (5%) patients, respectively. Focusing on the 183 patients for whom virologic success or failure could be examined, virologic failure occurred in 7 of 183 (3.8%) patients; minority K103N and/or Y181C variants were present prior to ART initiation in only two of those patients. The NNRTI-containing first-line ART was effective in 10 patients with a preexisting minority NNRTI-resistant HIV-1 variant. CONCLUSION: As case-control studies have shown, minority NNRTI-resistant HIV-1 variants can have an impact on ART. However, the sole implementation of minority NNRTI-resistant HIV-1 variant analysis in addition to genotypic resistance testing (GRT) cannot be recommended in routine clinical settings. Additional associated risk factors need to be discovered.


BACKGROUND The cost-effectiveness of routine viral load (VL) monitoring of HIV-infected patients on antiretroviral therapy (ART) depends on various factors that differ between settings and across time. Low-cost point-of-care (POC) tests for VL are in development and may make routine VL monitoring affordable in resource-limited settings. We developed a software tool to study the cost-effectiveness of switching to second-line ART with different monitoring strategies, and focused on POC-VL monitoring. METHODS We used a mathematical model to simulate cohorts of patients from start of ART until death. We modeled 13 strategies (no 2nd-line, clinical, CD4 (with or without targeted VL), POC-VL, and laboratory-based VL monitoring, with different frequencies). We included a scenario with identical failure rates across strategies, and one in which routine VL monitoring reduces the risk of failure. We compared lifetime costs and averted disability-adjusted life-years (DALYs). We calculated incremental cost-effectiveness ratios (ICER). We developed an Excel tool to update the results of the model for varying unit costs and cohort characteristics, and conducted several sensitivity analyses varying the input costs. RESULTS Introducing 2nd-line ART had an ICER of US$1651-1766/DALY averted. Compared with clinical monitoring, the ICER of CD4 monitoring was US$1896-US$5488/DALY averted and VL monitoring US$951-US$5813/DALY averted. We found no difference between POC- and laboratory-based VL monitoring, except for the highest measurement frequency (every 6 months), where laboratory-based testing was more effective. Targeted VL monitoring was on the cost-effectiveness frontier only if the difference between 1st- and 2nd-line costs remained large, and if we assumed that routine VL monitoring does not prevent failure. CONCLUSION Compared with the less expensive strategies, the cost-effectiveness of routine VL monitoring essentially depends on the cost of 2nd-line ART. 
Our Excel tool is useful for determining optimal monitoring strategies for specific settings, with specific sex and age distributions and unit costs.
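"On the cost-effectiveness frontier" refers to the set of strategies that are not dominated: ordered by cost, each frontier strategy must avert more DALYs than every cheaper one, and its ICER is computed against the previous frontier point. A minimal sketch that ignores extended dominance, with invented per-patient numbers (not the model's results):

```python
def frontier(strategies):
    """Keep only non-dominated strategies. Input: (name, cost, DALYs-averted)
    tuples. Returns (name, cost, effect, icer) tuples, where the ICER is
    taken against the previous frontier point (or doing nothing for the
    first strategy). Extended dominance is ignored in this sketch."""
    result = []
    for name, cost, effect in sorted(strategies, key=lambda s: s[1]):
        if result and effect <= result[-1][2]:
            continue  # strongly dominated: costlier but not more effective
        prev_cost, prev_effect = (result[-1][1], result[-1][2]) if result else (0.0, 0.0)
        result.append((name, cost, effect, (cost - prev_cost) / (effect - prev_effect)))
    return result

# Invented per-patient lifetime costs (US$) and DALYs averted
strategies = [
    ("clinical", 4000, 2.5),
    ("cd4", 5500, 2.8),
    ("lab_vl", 9000, 3.4),
    ("poc_vl", 9500, 3.3),  # dominated by lab_vl in this sketch
]
fr = frontier(strategies)  # poc_vl is dropped; the rest carry increasing ICERs
```

This is why the abstract's conclusion hinges on second-line ART prices: lowering the cost of a strategy shifts its point left and can move it onto, or push competitors off, the frontier.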


OBJECTIVES To investigate the frequency of interim analyses, stopping rules, and data safety and monitoring boards (DSMBs) in protocols of randomized controlled trials (RCTs); to examine these features across different reasons for trial discontinuation; and to identify discrepancies in reporting between protocols and publications. STUDY DESIGN AND SETTING We used data from a cohort of RCT protocols approved between 2000 and 2003 by six research ethics committees in Switzerland, Germany, and Canada. RESULTS Of 894 RCT protocols, 289 (32.3%) prespecified interim analyses, 153 (17.1%) stopping rules, and 257 (28.7%) DSMBs. Overall, 249 of 894 RCTs (27.9%) were prematurely discontinued, mostly for reasons such as poor recruitment, administrative reasons, or unexpected harm. Forty-six of 249 RCTs (18.4%) were discontinued for early benefit or futility; of those, 37 (80.4%) were stopped outside a formal interim analysis or stopping rule. Of 515 published RCTs, there were discrepancies between protocols and publications for interim analyses (21.1%), stopping rules (14.4%), and DSMBs (19.6%). CONCLUSION Two-thirds of RCT protocols did not consider interim analyses, stopping rules, or DSMBs. Most RCTs discontinued for early benefit or futility were stopped without a prespecified mechanism. When assessing trial manuscripts, journals should require access to the protocol.