822 results for Life Years


Relevance: 60.00%

Abstract:

OBJECTIVE: This study aimed to assess the potential cost-effectiveness of testing patients with nephropathies for the I/D polymorphism before starting angiotensin-converting enzyme (ACE) inhibitor therapy, using a 3-year time horizon and a healthcare perspective. METHODS: We used a combination of decision analysis and Markov modeling to evaluate the potential economic value of this pharmacogenetic test in preventing unfavorable treatment of patients with nephropathies. The estimate of the predictive value of the I/D polymorphism is based on a systematic review showing that DD carriers tend to respond well to ACE inhibitors, while II carriers seem not to benefit adequately from this treatment. Data on ACE inhibitor effectiveness in nephropathy were derived from the REIN (Ramipril Efficacy in Nephropathy) trial. We calculated the number of patients prevented from developing end-stage renal disease (ESRD) and the differences in incremental costs and incremental effects, expressed as life-years free of ESRD. A probabilistic sensitivity analysis was conducted to determine the robustness of the results. RESULTS: Compared with unselective treatment, testing patients for their ACE genotype could save 12 patients per 1000 from developing ESRD during the 3 years covered by the model. As the mean net cost savings were €356,000 per 1000 patient-years and 9 life-years free of ESRD were gained, selective treatment appears to be dominant. CONCLUSION: The study suggests that genetic testing for the I/D polymorphism in patients with nephropathy before initiating ACE inhibitor therapy will most likely be cost-effective, even if the risk for II carriers of developing ESRD when treated with ACE inhibitors is only 1.4% higher than for DD carriers. Further studies, however, are required to corroborate the difference in treatment response between ACE genotypes before genetic testing can be justified in clinical practice.

Relevance: 60.00%

Abstract:

BACKGROUND: There is little evidence on differences across health care systems in the choice and outcome of treatment for chronic low back pain (CLBP), for which spinal surgery and conservative treatment are the main options. At least six randomised controlled trials comparing these two options have been performed; they show conflicting results without clear-cut evidence for superior effectiveness of either intervention, and could not address whether treatment effect varied across patient subgroups. Cost-utility analyses comparing surgical and conservative treatment of CLBP also display inconsistent results. Due to its higher feasibility, we chose to conduct a prospective observational cohort study. METHODS: This study aims to examine whether: 1. differences across health care systems result in different outcomes of surgical and conservative treatment of CLBP; 2. patient characteristics (work-related and psychological factors, etc.) and co-interventions (physiotherapy, cognitive behavioural therapy, return-to-work programs, etc.) modify the outcome of treatment for CLBP; and 3. cost-utility in terms of quality-adjusted life years differs between surgical and conservative treatment of CLBP. The study will recruit 1000 patients from orthopaedic spine units, rehabilitation centres, and pain clinics in Switzerland and New Zealand. Effectiveness will be measured by the Oswestry Disability Index (ODI) at baseline and after six months; the change in ODI will be the primary endpoint of this study. Multiple linear regression models will be used, with the change in ODI from baseline to six months as the dependent variable and the type of health care system, type of treatment, patient characteristics, and co-interventions as independent variables. Interactions between type of treatment and the different co-interventions and patient characteristics will be incorporated. Cost-utility will be measured with an index based on EQ-5D in combination with cost data.
CONCLUSION: This study will provide evidence on whether differences across health care systems affect the outcome of treatment of CLBP. It will classify patients with CLBP into clinical subgroups and help to identify target groups who might benefit from specific surgical or conservative interventions. Furthermore, cost-utility differences will be identified for different groups of patients with CLBP. The main results of this study should be replicated in future studies on CLBP.

Relevance: 60.00%

Abstract:

BACKGROUND The Fractional Flow Reserve Versus Angiography for Multivessel Evaluation (FAME) 2 trial demonstrated a significant reduction in subsequent coronary revascularization among patients with stable angina and at least 1 coronary lesion with a fractional flow reserve ≤0.80 who were randomized to percutaneous coronary intervention (PCI) compared with best medical therapy. The economic and quality-of-life implications of PCI in the setting of an abnormal fractional flow reserve are unknown. METHODS AND RESULTS We calculated the cost of the index hospitalization based on initial resource use and follow-up costs based on Medicare reimbursements. We assessed patient utility using the EQ-5D health survey with US weights at baseline and 1 month and projected quality-adjusted life-years assuming a linear decline over 3 years in the 1-month utility improvements. We calculated the incremental cost-effectiveness ratio based on cumulative costs over 12 months. Initial costs were significantly higher for PCI in the setting of an abnormal fractional flow reserve than with medical therapy ($9927 versus $3900, P<0.001), but the $6027 difference narrowed over 1-year follow-up to $2883 (P<0.001), mostly because of the cost of subsequent revascularization procedures. Patient utility was improved more at 1 month with PCI than with medical therapy (0.054 versus 0.001 units, P<0.001). The incremental cost-effectiveness ratio of PCI was $36,000 per quality-adjusted life-year, which was robust in bootstrap replications and in sensitivity analyses. CONCLUSIONS PCI of coronary lesions with reduced fractional flow reserve improves outcomes and appears economically attractive compared with best medical therapy among patients with stable angina.
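The headline figure above can be reproduced from the numbers the abstract reports. Assuming the difference in 1-month utility improvement (0.054 − 0.001) declines linearly to zero over 3 years, the projected QALY gain is the area of that triangle, and dividing the 12-month incremental cost by it recovers roughly the reported $36,000 per QALY. A minimal sketch of that arithmetic (variable names are mine, not the study's):

```python
# Reproduce the FAME 2 cost-effectiveness arithmetic from the abstract's figures.
# Assumption: the 1-month utility improvement declines linearly to zero over
# 3 years, so the QALY gain is the area of a triangle: 0.5 * height * base.

utility_gain_pci = 0.054   # 1-month utility improvement, PCI arm
utility_gain_med = 0.001   # 1-month utility improvement, medical-therapy arm
incremental_cost = 2883    # 12-month incremental cost of PCI, USD

delta_utility = utility_gain_pci - utility_gain_med  # 0.053
qaly_gain = 0.5 * delta_utility * 3                  # triangle over 3 years, ~0.08 QALYs

icer = incremental_cost / qaly_gain
print(round(icer, -3))  # rounds to the reported ~$36,000 per QALY
```

The triangle assumption is the simplest reading of "a linear decline over 3 years in the 1-month utility improvements"; the published analysis may have handled discounting and censoring differently.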

Relevance: 60.00%

Abstract:

OBJECTIVES Economic evaluations of interventions to prevent and control sexually transmitted infections such as Chlamydia trachomatis are increasingly required to present their outcomes in terms of quality-adjusted life-years using preference-based measurements of relevant health states. The objectives of this study were to critically evaluate how published cost-effectiveness studies have conceptualized and valued health states associated with chlamydia and to examine the primary evidence available to inform health state utility values (HSUVs). METHODS A systematic review was conducted, with searches of six electronic databases up to December 2012. Data on study characteristics, methods, and main results were extracted by using a standard template. RESULTS Nineteen economic evaluations of relevant interventions were included. Individual studies considered different health states and assigned different values and durations. Eleven studies cited the same source for HSUVs. Only five primary studies valued relevant health states. The methods and viewpoints adopted varied, and different values for health states were generated. CONCLUSIONS Limitations in the information available about HSUVs associated with chlamydia and its complications have implications for the robustness of economic evaluations in this area. None of the primary studies could be used without reservation to inform cost-effectiveness analyses in the United Kingdom. Future debate should consider appropriate methods for valuing health states for infectious diseases, because recommended approaches may not be suitable. Unless we adequately tackle the challenges associated with measuring and valuing health-related quality of life for patients with chlamydia and other infectious diseases, evaluating the cost-effectiveness of interventions in this area will remain problematic.

Relevance: 60.00%

Abstract:

Background: WHO's 2013 revisions to its Consolidated Guidelines on antiretroviral drugs recommend routine viral load monitoring, rather than clinical or immunological monitoring, as the preferred monitoring approach on the basis of clinical evidence. However, HIV programmes in resource-limited settings require guidance on the most cost-effective use of resources in view of other competing priorities, such as expansion of antiretroviral therapy coverage. We assessed the cost-effectiveness of alternative patient monitoring strategies. Methods: We evaluated a range of monitoring strategies, including clinical, CD4 cell count, and viral load monitoring, alone and together, at different frequencies and with different criteria for switching to second-line therapies. We used three independently constructed and validated models simultaneously. We estimated costs on the basis of resource use projected in the models and associated unit costs; we quantified impact as disability-adjusted life-years (DALYs) averted. We compared alternatives using incremental cost-effectiveness analysis. Findings: All models show that clinical monitoring delivers significant benefit compared with a hypothetical baseline scenario with no monitoring or switching. Regular CD4 cell count monitoring confers a benefit over clinical monitoring alone, at an incremental cost that makes it affordable in more settings than viral load monitoring, which is currently more expensive. Viral load monitoring without CD4 cell count every 6-12 months provides the greatest reductions in morbidity and mortality, but incurs a high cost per DALY averted, resulting in lost opportunities to generate health gains if implemented instead of increasing antiretroviral therapy coverage or expanding antiretroviral therapy eligibility.
Interpretation: The priority for HIV programmes should be to expand antiretroviral therapy coverage, first at CD4 cell counts lower than 350 cells per μL and then at counts lower than 500 cells per μL, using lower-cost clinical or CD4 monitoring. At current costs, viral load monitoring should be considered only after high antiretroviral therapy coverage has been achieved. Point-of-care technologies and other factors reducing costs might make viral load monitoring more affordable in the future. Funding: Bill & Melinda Gates Foundation, WHO.

Relevance: 60.00%

Abstract:

Objectives: To update the 2006 systematic review of the comparative benefits and harms of erythropoiesis-stimulating agent (ESA) strategies and non-ESA strategies to manage anemia in patients undergoing chemotherapy and/or radiation for malignancy (excluding myelodysplastic syndrome and acute leukemia), including the impact of alternative thresholds for initiating treatment and the optimal duration of therapy. Data sources: Literature searches were updated in electronic databases (n=3), conference proceedings (n=3), and Food and Drug Administration transcripts. Multiple sources (n=13) were searched for potential gray literature. A primary source for current survival evidence was a recently published individual patient data meta-analysis, in which patient data were obtained from investigators for studies enrolling more than 50 patients per arm. Because those data constitute the most current data available for this update, as well as the source for on-study (active treatment) mortality data, we limited inclusion in the current report to studies enrolling more than 50 patients per arm, to avoid potential differential endpoint ascertainment in smaller studies. Review methods: Title and abstract screening was performed by one reviewer, with a second to resolve uncertainty; potentially included publications were reviewed in full text. Trial quality was assessed by two reviewers, with a third to resolve disagreements. Results were independently verified and pooled for outcomes of interest. The balance of benefits and harms was examined in a decision model. Results: We evaluated evidence from 5 trials directly comparing darbepoetin with epoetin, 41 trials comparing epoetin with control, and 8 trials comparing darbepoetin with control; 5 trials evaluated early versus late (delay until Hb ≤9 to 11 g/dL) treatment.
Trials varied according to duration, tumor types, cancer therapy, trial quality, iron supplementation, baseline hemoglobin, ESA dosing frequency (and therefore amount per dose), and dose escalation. ESAs decreased the risk of transfusion (pooled relative risk [RR], 0.58; 95% confidence interval [CI], 0.53 to 0.64; I2 = 51%; 38 trials) without evidence of a meaningful difference between epoetin and darbepoetin. Thromboembolic event rates were higher in ESA-treated patients (pooled RR, 1.51; 95% CI, 1.30 to 1.74; I2 = 0%; 37 trials), without difference between epoetin and darbepoetin. In 14 trials reporting the Functional Assessment of Cancer Therapy (FACT)-Fatigue subscale, the most common patient-reported outcome, scores changed by −0.6 points in control arms (95% CI, −6.4 to 5.2; I2 = 0%) and by +2.1 points in ESA arms (95% CI, −3.9 to 8.1; I2 = 0%). There were fewer thromboembolic and on-study mortality adverse events when ESA treatment was delayed until baseline Hb was less than 10 g/dL, in keeping with current treatment practice, but the difference in effect from early treatment was not significant, and the evidence was limited and insufficient for conclusions. No evidence informed the optimal duration of therapy. Mortality was increased during the on-study period (pooled hazard ratio [HR], 1.17; 95% CI, 1.04 to 1.31; I2 = 0%; 37 trials). There was one additional death for every 59 treated patients when the control-arm on-study mortality was 10 percent, and one additional death for every 588 treated patients when the control-arm on-study mortality was 1 percent. A cohort decision model yielded a consistent result: greater loss of life-years when control-arm on-study mortality was higher. There was no discernible increase in mortality with ESA use over the longest available follow-up (pooled HR, 1.04; 95% CI, 0.99 to 1.10; I2 = 38%; 44 trials), but many trials did not include an overall survival endpoint, and potential time-dependent confounding was not considered.
Conclusions: Results of this update were consistent with the 2006 review. ESAs reduced the need for transfusions and increased the risk of thromboembolism. FACT-Fatigue scores were better with ESA use but the magnitude was less than the minimal clinically important difference. An increase in mortality accompanied the use of ESAs. An important unanswered question is whether dosing practices and overall ESA exposure might influence harms.
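The per-patient harm figures quoted above (one extra death per 59 or per 588 treated patients) follow from the pooled hazard ratio under a rare-event approximation, in which the absolute risk increase is roughly the baseline risk times (HR − 1) and the number needed to harm is its reciprocal. A sketch under that assumption (not necessarily the authors' exact method):

```python
# Number needed to harm (NNH) from a hazard ratio, using the rare-event
# approximation: absolute risk increase ≈ baseline risk * (HR - 1).

def nnh(baseline_risk: float, hazard_ratio: float) -> float:
    risk_increase = baseline_risk * (hazard_ratio - 1)
    return 1 / risk_increase

hr = 1.17  # pooled on-study mortality hazard ratio for ESAs
print(round(nnh(0.10, hr)))  # 59: one extra death per 59 treated at 10% control mortality
print(round(nnh(0.01, hr)))  # 588: one extra death per 588 treated at 1% control mortality
```

The approximation treats the hazard ratio as a risk ratio, which is reasonable at the low on-study mortality rates considered here.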

Relevance: 60.00%

Abstract:

QUESTION UNDER STUDY The aim of this study was to evaluate the cost-effectiveness of ticagrelor and generic clopidogrel as add-on therapy to acetylsalicylic acid (ASA) in patients with acute coronary syndrome (ACS), from a Swiss perspective. METHODS Based on the PLATelet inhibition and patient Outcomes (PLATO) trial, one-year mean healthcare costs per patient treated with ticagrelor or generic clopidogrel were analysed from a payer perspective in 2011. A two-part decision-analytic model estimated treatment costs, quality-adjusted life years (QALYs), life years, and the cost-effectiveness of ticagrelor and generic clopidogrel in patients with ACS over up to a lifetime horizon, at a discount rate of 2.5% per annum. Sensitivity analyses were performed. RESULTS Over a patient's lifetime, treatment with ticagrelor generates an additional 0.1694 QALYs and 0.1999 life years at an additional cost of CHF 260 compared with generic clopidogrel. This results in an incremental cost-effectiveness ratio (ICER) of CHF 1,536 per QALY and CHF 1,301 per life year gained. Ticagrelor dominated generic clopidogrel over the five-year and one-year periods, with treatment generating cost savings of CHF 224 and CHF 372 while gaining 0.0461 and 0.0051 QALYs and 0.0517 and 0.0062 life years, respectively. Univariate sensitivity analyses confirmed the dominant position of ticagrelor in the first five years, and probabilistic sensitivity analyses showed a high probability of cost-effectiveness over a lifetime. CONCLUSION During the first five years after ACS, treatment with ticagrelor dominates generic clopidogrel in Switzerland. Over a patient's lifetime, ticagrelor is highly cost-effective compared with generic clopidogrel, with ICERs well below commonly accepted willingness-to-pay thresholds.
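The reported ICERs are simply the incremental cost divided by the incremental effect; the abstract's lifetime figures reproduce them up to rounding of the underlying estimates. A minimal check:

```python
# Lifetime incremental results for ticagrelor vs generic clopidogrel,
# taken directly from the abstract.
extra_cost_chf = 260
extra_qalys = 0.1694
extra_life_years = 0.1999

icer_per_qaly = extra_cost_chf / extra_qalys            # ~1535, reported as CHF 1,536
icer_per_life_year = extra_cost_chf / extra_life_years  # ~1301, reported as CHF 1,301
print(round(icer_per_qaly), round(icer_per_life_year))  # 1535 1301
```

The one-unit gap on the QALY figure reflects rounding of the incremental cost and QALYs before they were reported, not an error in the formula.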

Relevance: 60.00%

Abstract:

BACKGROUND The cost-effectiveness of routine viral load (VL) monitoring of HIV-infected patients on antiretroviral therapy (ART) depends on various factors that differ between settings and across time. Low-cost point-of-care (POC) tests for VL are in development and may make routine VL monitoring affordable in resource-limited settings. We developed a software tool to study the cost-effectiveness of switching to second-line ART with different monitoring strategies, and focused on POC-VL monitoring. METHODS We used a mathematical model to simulate cohorts of patients from start of ART until death. We modeled 13 strategies (no 2nd-line, clinical, CD4 (with or without targeted VL), POC-VL, and laboratory-based VL monitoring, with different frequencies). We included a scenario with identical failure rates across strategies, and one in which routine VL monitoring reduces the risk of failure. We compared lifetime costs and averted disability-adjusted life-years (DALYs). We calculated incremental cost-effectiveness ratios (ICER). We developed an Excel tool to update the results of the model for varying unit costs and cohort characteristics, and conducted several sensitivity analyses varying the input costs. RESULTS Introducing 2nd-line ART had an ICER of US$1651-1766/DALY averted. Compared with clinical monitoring, the ICER of CD4 monitoring was US$1896-US$5488/DALY averted and VL monitoring US$951-US$5813/DALY averted. We found no difference between POC- and laboratory-based VL monitoring, except for the highest measurement frequency (every 6 months), where laboratory-based testing was more effective. Targeted VL monitoring was on the cost-effectiveness frontier only if the difference between 1st- and 2nd-line costs remained large, and if we assumed that routine VL monitoring does not prevent failure. CONCLUSION Compared with the less expensive strategies, the cost-effectiveness of routine VL monitoring essentially depends on the cost of 2nd-line ART. 
Our Excel tool is useful for determining optimal monitoring strategies for specific settings, with specific sex and age distributions and unit costs.
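The cost-effectiveness frontier mentioned above is computed by sorting strategies by cost, discarding dominated ones (strictly dominated: more costly and no more effective; extendedly dominated: a middle strategy whose ICER exceeds that of the next more effective one), and reporting ICERs between adjacent survivors. A generic sketch with made-up (cost, DALYs-averted) numbers, not the study's data:

```python
# Build a cost-effectiveness frontier: drop strongly and extendedly dominated
# strategies, then report ICERs between adjacent frontier strategies.
# The (cost, effect) values below are illustrative only.

def frontier(strategies):
    """strategies: list of (name, cost, effect). Returns (frontier, icers)."""
    pts = sorted(strategies, key=lambda s: (s[1], -s[2]))  # by cost; ties: best effect first
    kept = []
    for s in pts:
        if kept and s[2] <= kept[-1][2]:
            continue  # strongly dominated: costs more, adds no effect
        kept.append(s)
        while len(kept) >= 3:  # extended-dominance check on the middle point
            a, b, c = kept[-3], kept[-2], kept[-1]
            icer_ab = (b[1] - a[1]) / (b[2] - a[2])
            icer_bc = (c[1] - b[1]) / (c[2] - b[2])
            if icer_ab > icer_bc:
                kept.pop(-2)  # b is extendedly dominated by mixing a and c
            else:
                break
    icers = [(cur[0], (cur[1] - prev[1]) / (cur[2] - prev[2]))
             for prev, cur in zip(kept, kept[1:])]
    return kept, icers

strategies = [
    ("no monitoring", 0, 0.0),
    ("clinical", 1000, 2.0),
    ("CD4", 2500, 2.5),
    ("viral load", 4000, 3.2),
]
kept, icers = frontier(strategies)
print(icers)  # "CD4" is extendedly dominated and drops off the frontier
```

With these illustrative numbers, "CD4" falls off the frontier because moving straight from clinical to viral load monitoring buys effect more cheaply per unit, which mirrors how targeted VL monitoring could leave the study's frontier under some cost assumptions.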

Relevance: 60.00%

Abstract:

OBJECTIVE To estimate the cost-effectiveness of prevention of mother-to-child transmission (MTCT) of HIV with lifelong antiretroviral therapy (ART) for pregnant and breastfeeding women ('Option B+') compared with ART during pregnancy or breastfeeding only, unless clinically indicated ('Option B'). DESIGN Mathematical modelling study of first and second pregnancies, informed by data from the Malawi Option B+ programme. METHODS Individual-based simulation model. We simulated cohorts of 10 000 women and their infants during two subsequent pregnancies, including the breastfeeding period, with either Option B+ or Option B. We parameterized the model with data from the literature and by analysing programmatic data. We compared total costs of antenatal and postnatal care, and the lifetime costs and disability-adjusted life-years of the infected infants, between Option B+ and Option B. RESULTS During the first pregnancy, 15% of the infants born to HIV-infected mothers acquired the infection. With Option B+, 39% of the women were on ART at the beginning of the second pregnancy, compared with 18% with Option B. For second pregnancies, the MTCT rates were 11.3% with Option B+ and 12.3% with Option B. The incremental cost-effectiveness ratio comparing the two options ranged between about US$ 500 and US$ 1300 per DALY averted. CONCLUSION Option B+ prevents more vertical transmissions of HIV than Option B, mainly because more women are already on ART at the beginning of the next pregnancy. Option B+ is a cost-effective strategy for PMTCT if the total future costs and lost life-years of the infected infants are taken into account.

Relevance: 60.00%

Abstract:

It is the aim of this paper to examine iron supplementation programs that receive funding from the United States Agency for International Development (USAID) but approach combating iron deficiency anemia in two vastly different ways. A brief literature review and background information on iron deficiencies and on the differences between supplementation programs and micronutrient fortification are presented. Two non-governmental organizations (NGOs) were examined for this paper: the Food and Nutrition Technical Assistance II (FANTA) project and the MicroNutrient Initiative. The FANTA program included an educational component in its supplementation program, while the MicroNutrient Initiative relied solely on supplementation of micronutrients. Cost-benefit and cost-effectiveness analyses were used to determine the overall effectiveness of each program in reducing iron deficiency anemia in its population, whether the added costs of the incentives in the FANTA program changed its cost-effectiveness compared with the MicroNutrient Initiative program, and which program imparted the greatest benefit to its population by reducing the disease burden in disability-adjusted life years (DALYs). Results showed that the unit cost per person of the FANTA program was higher than that of the MicroNutrient Initiative program, due to the educational component. The FANTA program reduced iron deficiency anemia less overall but cost less for each percentage point of anemia averted. The MicroNutrient Initiative program had a better benefit-cost ratio for the populations it served, and its large scale imparted advantages by reducing unit cost per person while decreasing iron deficiency anemia.
The FANTA program was more effective at decreasing iron deficiency anemia per dollar spent: $5,660 per 1% decrease in iron deficiency anemia versus $18,450 per 1% decrease for the MicroNutrient Initiative program. In conclusion, economic analysis cannot measure all of the benefits associated with programs that contain an educational component or with large-scale supplementation. More information needs to be gathered by NGOs and reported to USAID, such as detailed prevalence rates of iron deficiency anemia among the populations served. Further research is needed to determine the effects an educational supplementation program has on participants' compliance rates and on motivation to participate in supplementation programs whose aim is to decrease iron deficiency anemia in a targeted population.

Relevance: 60.00%

Abstract:

Background: Mitral regurgitation (MR) is a valvular disease requiring intervention in the most severe cases. Percutaneous mitral valve repair with the MitraClip device is a safe and effective treatment for patients at high surgical risk. We sought to evaluate the clinical outcomes and economic impact of this therapy compared with medical management in heart failure patients with symptomatic mitral regurgitation. Methods: The study comprised two phases: an observational study of patients with heart failure and mitral regurgitation treated with medical therapy or the MitraClip, and an economic model. The results of the observational study were used to estimate the parameters of the decision model, which estimated the costs and benefits for a hypothetical cohort of patients with heart failure and severe mitral regurgitation treated with either standard medical therapy or the MitraClip. Results: The cohort of patients treated with the MitraClip system was propensity-score matched to a population of heart failure patients, and their outcomes were compared. With a mean follow-up of 22 months, mortality was 21% in the MitraClip cohort and 42% in the medical-management cohort (p = 0.007). The decision model showed that the MitraClip increased life expectancy from 1.87 to 3.60 years and quality-adjusted life years (QALYs) from 1.13 to 2.76. The incremental cost was CAD $52,500, corresponding to an incremental cost-effectiveness ratio (ICER) of $32,300 per QALY gained. The results were sensitive to the survival benefit. Conclusion: In this cohort of patients with symptomatic heart failure and significant mitral regurgitation, MitraClip therapy is associated with superior survival and is cost-effective compared with medical treatment.

Relevance: 60.00%

Abstract:

Thesis (Master's)--University of Washington, 2016-06

Relevance: 60.00%

Abstract:

This paper introduces the rank-dependent quality-adjusted life-years (QALY) model, a new method to aggregate QALYs in economic evaluations of health care. The rank-dependent QALY model permits the formalization of influential concepts of equity in the allocation of health care, such as the fair innings approach, and it includes as special cases many of the social welfare functions that have been proposed in the literature. An important advantage of the rank-dependent QALY model is that it offers a straightforward procedure to estimate equity weights for QALYs. We characterize the rank-dependent QALY model and argue that its central condition has normative appeal.
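The rank-dependent aggregation described above weights each individual's QALYs by their rank in the health distribution, so that worse-off individuals can receive larger equity weights. A minimal illustration (the concrete weights are my own example; the paper characterizes the general family of such rules):

```python
# Rank-dependent QALY aggregation: order individual QALYs from worst-off to
# best-off and apply rank-based equity weights. The specific weights below
# are illustrative, not taken from the paper.

def rank_dependent_swf(qalys, weights):
    """Social value = sum of rank weights times ordered QALYs (ascending)."""
    assert len(qalys) == len(weights)
    return sum(w * q for w, q in zip(weights, sorted(qalys)))

qalys = [2.0, 10.0, 6.0]
equal_weights = [1.0, 1.0, 1.0]   # reduces to the unweighted QALY sum
equity_weights = [1.5, 1.0, 0.5]  # worst-off ranks weighted more heavily

print(rank_dependent_swf(qalys, equal_weights))   # 18.0, plain QALY total
print(rank_dependent_swf(qalys, equity_weights))  # 14.0, equity-weighted total
```

With decreasing rank weights, transferring a QALY from the best-off to the worst-off person raises the social value, which is how fair-innings-style equity concerns enter the aggregation.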

Relevance: 60.00%

Abstract:

Background: Major depression is the largest single cause of nonfatal disease burden in Australia. Effective drug and psychological treatments exist, yet are underused. Objective: To quantify the burden of disease currently averted in people seeking care for major depression and the amount of disease burden that could be averted in these people under optimal episodic and maintenance treatment strategies. Design: Modeling the impact of current and optimal treatment strategies based on secondary analysis of mental health survey data, studies of the natural history of major depression, and meta-analyses of effectiveness data, with Monte Carlo simulation of uncertainty in the model. Setting: The cohort of Australian adults experiencing an episode of major depression in 2000 is modeled through "what if" scenarios of no treatment, current treatment, and optimal treatment strategies with cognitive behavioral therapy or antidepressant drug treatment. Main Outcome Measure: Disability-adjusted life years. Results: Current episodic treatment averts 9% (95% uncertainty interval, 6%-12%) of the disease burden of major depression in Australian adults. Optimal episodic treatment with cognitive behavioral therapy could avert 28% (95% uncertainty interval, 19%-39%) of this disease burden, and optimal episodic drug treatment could avert 24% (95% uncertainty interval, 19%-30%). During the 5 years after an episode of major depression, current episodic treatment patterns would avert 13% (95% uncertainty interval, 10%-17%) of disability-adjusted life years, whereas maintenance drug treatment could avert 50% (95% uncertainty interval, 40%-60%) and maintenance cognitive behavioral therapy could avert 52% (95% uncertainty interval, 42%-64%), even if adherence of around 60% is taken into account. Conclusions: Longer-term maintenance drug or psychological treatment strategies are required to make significant inroads into the large disease burden associated with major depression in the Australian population.

Relevance: 60.00%

Abstract:

Objective: To quantify the burden of disease and injury for the Aboriginal and non-Aboriginal populations in the Northern Territory. Design and setting: Analysis of Northern Territory data for 1 January 1994 to 30 December 1998 from multiple sources. Main outcome measures: Disability-adjusted life-years (DALYs), by age, sex, cause and Aboriginality. Results: Cardiovascular disease was the leading contributor (14.9%) to the total burden of disease and injury in the NT, followed by mental disorders (14.5%) and malignant neoplasms (11.2%). There was also a substantial contribution from unintentional injury (10.4%) and intentional injury (4.9%). Overall, the NT Aboriginal population had a rate of burden of disease 2.5 times higher than the non-Aboriginal population; in the 35-54-year age group their DALY rate was 4.1 times higher. The leading causes of disease burden were cardiovascular disease for both Aboriginal men (19.1%) and women (15.7%) and mental disorders for both non-Aboriginal men (16.7%) and women (22.3%). Conclusions: A comprehensive assessment of fatal and non-fatal conditions is important in describing differentials in health status of the NT population. Our study provides comparative data to identify health priorities and facilitate a more equitable distribution of health funding.