96 results for Life Years

at BORIS: Bern Open Repository and Information System - Bern - Switzerland


Relevance: 70.00%

Abstract:

BACKGROUND/AIMS: Alveolar echinococcosis (AE) is a serious liver disease. The aim of this study was to explore the long-term prognosis of AE patients, the burden of this disease in Switzerland and the cost-effectiveness of treatment. METHODS: Relative survival analysis was undertaken using a national database with 329 patient records. 155 representative cases had sufficient details regarding treatment costs and patient outcome to estimate the financial implications and treatment costs of AE. RESULTS: For an average 54-year-old patient diagnosed with AE in 1970 the life expectancy was estimated to be reduced by 18.2 and 21.3 years for men and women, respectively. By 2005 this was reduced to approximately 3.5 and 2.6 years, respectively. Patients undergoing radical surgery had a better outcome, whereas the older patients had a poorer prognosis than the younger patients. Costs amount to approximately €108,762 per patient. Assuming the improved life expectancy of AE patients is due to modern treatment, the cost per disability-adjusted life year (DALY) saved is approximately €6,032. CONCLUSIONS: Current treatments have substantially improved the prognosis of AE patients compared to the 1970s. The cost per DALY saved is low compared to the average national annual income. Hence, AE treatment is highly cost-effective in Switzerland.
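
A rough plausibility check of the headline figure (not the authors' model): dividing the per-patient treatment cost by the life-years regained gives a number close to the quoted cost per DALY saved. The 18-year gain used below is an assumption, since the exact figure used in the paper is not stated in the abstract.

```python
# Back-of-envelope check of the cost per DALY figure quoted above.
# Both inputs are rounded values; the published analysis is more detailed
# (sex-specific life expectancy, exact per-patient costs, discounting).

treatment_cost_eur = 108_762     # approximate cost per AE patient
life_years_gained = 18.0         # assumed gain vs. the 1970 prognosis

cost_per_daly = treatment_cost_eur / life_years_gained
print(f"Cost per DALY saved: ~EUR {cost_per_daly:,.0f}")  # ~EUR 6,042
```

This lands near the €6,032 reported above; the published analysis presumably uses more precise inputs.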

Relevance: 60.00%

Abstract:

Cardiovascular disease (CVD) due to atherosclerosis of the arterial vessel wall and to thrombosis is the foremost cause of premature mortality and of disability-adjusted life years (DALYs) in Europe, and is also increasingly common in developing countries [1]. In the European Union, the economic cost of CVD represents annually €192 billion [1] in direct and indirect healthcare costs. The main clinical entities are coronary artery disease (CAD), ischaemic stroke, and peripheral arterial disease (PAD). The causes of these CVDs are multifactorial. Some of these factors relate to lifestyles, such as tobacco smoking, lack of physical activity, and dietary habits, and are thus modifiable. Other risk factors are also modifiable, such as elevated blood pressure, type 2 diabetes, and dyslipidaemias, or non-modifiable, such as age and male gender. These guidelines deal with the management of dyslipidaemias as an essential and integral part of CVD prevention. Prevention and treatment of dyslipidaemias should always be considered within the broader framework of CVD prevention, which is addressed in guidelines of the Joint European Societies’ Task forces on CVD prevention in clinical practice [2-5]. The latest version of these guidelines was published in 2007 [5]; an update will become available in 2012. These Joint ESC/European Atherosclerosis Society (EAS) guidelines on the management of dyslipidaemias are complementary to the guidelines on CVD prevention in clinical practice and address not only physicians [e.g. general practitioners (GPs) and cardiologists] interested in CVD prevention, but also specialists from lipid clinics or metabolic units who are dealing with dyslipidaemias that are more difficult to classify and treat.

Relevance: 60.00%

Abstract:

Aneurysmal subarachnoid haemorrhage (aSAH) is a haemorrhagic form of stroke and occurs in a younger population compared with ischaemic stroke or intracerebral haemorrhage. It accounts for a large proportion of productive life-years lost to stroke. Its surgical and medical treatment represents a multidisciplinary effort. Due to the complexity of the disease, the management remains difficult to standardise and quality of care is accordingly difficult to assess.

Relevance: 60.00%

Abstract:

OBJECTIVE: This study aimed to assess the potential cost-effectiveness of testing patients with nephropathies for the I/D polymorphism before starting angiotensin-converting enzyme (ACE) inhibitor therapy, using a 3-year time horizon and a healthcare perspective. METHODS: We used a combination of a decision analysis and Markov modeling technique to evaluate the potential economic value of this pharmacogenetic test by preventing unfavorable treatment in patients with nephropathies. The estimation of the predictive value of the I/D polymorphism is based on a systematic review showing that DD carriers tend to respond well to ACE inhibitors, while II carriers seem not to benefit adequately from this treatment. Data on the ACE inhibitor effectiveness in nephropathy were derived from the REIN (Ramipril Efficacy in Nephropathy) trial. We calculated the number of patients with end-stage renal disease (ESRD) prevented and the differences in the incremental costs and incremental effect expressed as life-years free of ESRD. A probabilistic sensitivity analysis was conducted to determine the robustness of the results. RESULTS: Compared with unselective treatment, testing patients for their ACE genotype could save 12 patients per 1000 from developing ESRD during the 3 years covered by the model. As the mean net cost savings was €356,000 per 1000 patient-years, and 9 life-years free of ESRD were gained, selective treatment seems to be dominant. CONCLUSION: The study suggests that genetic testing of the I/D polymorphism in patients with nephropathy before initiating ACE inhibitor therapy will most likely be cost-effective, even if the risk for II carriers to develop ESRD when treated with ACE inhibitors is only 1.4% higher than for DD carriers. Further studies, however, are required to corroborate the difference in treatment response between ACE genotypes, before genetic testing can be justified in clinical practice.
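
A minimal sketch of the kind of two-state Markov cohort model described here (no ESRD vs. ESRD, 3-year horizon, 1000 patients). The transition probabilities and costs are illustrative placeholders, not the parameters of the published analysis.

```python
# Two-state Markov cohort sketch ("no ESRD" -> "ESRD" at a fixed annual
# probability). All parameter values are illustrative placeholders, and the
# cost of genotype testing itself is omitted for brevity.

def run_cohort(p_esrd_per_year, cohort=1000, years=3,
               annual_cost_no_esrd=2_000.0, annual_cost_esrd=60_000.0):
    """Return (ESRD cases, total cost, life-years free of ESRD) over the horizon."""
    no_esrd, esrd = float(cohort), 0.0
    total_cost = life_years_free = 0.0
    for _ in range(years):
        new_cases = no_esrd * p_esrd_per_year
        no_esrd, esrd = no_esrd - new_cases, esrd + new_cases
        life_years_free += no_esrd
        total_cost += no_esrd * annual_cost_no_esrd + esrd * annual_cost_esrd
    return esrd, total_cost, life_years_free

# Unselective ACE inhibitor treatment vs. genotype-guided treatment
# (assumed lower ESRD risk when II carriers receive an alternative drug).
cases_all, cost_all, ly_all = run_cohort(p_esrd_per_year=0.020)
cases_sel, cost_sel, ly_sel = run_cohort(p_esrd_per_year=0.016)

print(f"ESRD cases prevented per 1000 patients: {cases_all - cases_sel:.1f}")
print(f"Incremental cost (negative = savings): EUR {cost_sel - cost_all:,.0f}")
print(f"Incremental ESRD-free life-years: {ly_sel - ly_all:.1f}")
```

With these placeholder inputs the guided strategy both saves money and gains ESRD-free life-years, which is the "dominant" pattern the abstract describes.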

Relevance: 60.00%

Abstract:

BACKGROUND: There is little evidence on differences across health care systems in choice and outcome of the treatment of chronic low back pain (CLBP) with spinal surgery and conservative treatment as the main options. At least six randomised controlled trials comparing these two options have been performed; they show conflicting results without clear-cut evidence for superior effectiveness of any of the evaluated interventions and could not address whether treatment effect varied across patient subgroups. Cost-utility analyses display inconsistent results when comparing surgical and conservative treatment of CLBP. Due to its higher feasibility, we chose to conduct a prospective observational cohort study. METHODS: This study aims to examine whether (1) differences across health care systems result in different treatment outcomes of surgical and conservative treatment of CLBP; (2) patient characteristics (work-related, psychological factors, etc.) and co-interventions (physiotherapy, cognitive behavioural therapy, return-to-work programs, etc.) modify the outcome of treatment for CLBP; and (3) cost-utility in terms of quality-adjusted life years differs between surgical and conservative treatment of CLBP. This study will recruit 1000 patients from orthopaedic spine units, rehabilitation centres, and pain clinics in Switzerland and New Zealand. Effectiveness will be measured by the Oswestry Disability Index (ODI) at baseline and after six months. The change in ODI will be the primary endpoint of this study. Multiple linear regression models will be used, with the change in ODI from baseline to six months as the dependent variable and the type of health care system, type of treatment, patient characteristics, and co-interventions as independent variables. Interactions will be incorporated between type of treatment and different co-interventions and patient characteristics. Cost-utility will be measured with an index based on EQ-5D in combination with cost data. CONCLUSION: This study will provide evidence on whether differences across health care systems in the outcome of treatment of CLBP exist. It will classify patients with CLBP into different clinical subgroups and help to identify specific target groups who might benefit from specific surgical or conservative interventions. Furthermore, cost-utility differences will be identified for different groups of patients with CLBP. Main results of this study should be replicated in future studies on CLBP.
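
As an illustration of the planned analysis (not the study's actual code or dataset), a regression of the six-month change in ODI on system, treatment, patient characteristics, and treatment-by-co-intervention interactions might look as follows; the column names and input file are hypothetical.

```python
# Sketch of the planned analysis: change in ODI regressed on health care
# system, treatment, patient characteristics, and co-interventions, with
# treatment interactions. Column names and the input file are hypothetical.

import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("clbp_cohort.csv")  # hypothetical analysis dataset

model = smf.ols(
    "odi_change ~ system + treatment"
    " + treatment:physiotherapy + treatment:cbt + treatment:work_status"
    " + age + baseline_odi + depression_score",
    data=df,
).fit()
print(model.summary())
```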

Relevance: 60.00%

Abstract:

BACKGROUND The Fractional Flow Reserve Versus Angiography for Multivessel Evaluation (FAME) 2 trial demonstrated a significant reduction in subsequent coronary revascularization among patients with stable angina and at least 1 coronary lesion with a fractional flow reserve ≤0.80 who were randomized to percutaneous coronary intervention (PCI) compared with best medical therapy. The economic and quality-of-life implications of PCI in the setting of an abnormal fractional flow reserve are unknown. METHODS AND RESULTS We calculated the cost of the index hospitalization based on initial resource use and follow-up costs based on Medicare reimbursements. We assessed patient utility using the EQ-5D health survey with US weights at baseline and 1 month and projected quality-adjusted life-years assuming a linear decline over 3 years in the 1-month utility improvements. We calculated the incremental cost-effectiveness ratio based on cumulative costs over 12 months. Initial costs were significantly higher for PCI in the setting of an abnormal fractional flow reserve than with medical therapy ($9927 versus $3900, P<0.001), but the $6027 difference narrowed over 1-year follow-up to $2883 (P<0.001), mostly because of the cost of subsequent revascularization procedures. Patient utility was improved more at 1 month with PCI than with medical therapy (0.054 versus 0.001 units, P<0.001). The incremental cost-effectiveness ratio of PCI was $36 000 per quality-adjusted life-year, which was robust in bootstrap replications and in sensitivity analyses. CONCLUSIONS PCI of coronary lesions with reduced fractional flow reserve improves outcomes and appears economically attractive compared with best medical therapy among patients with stable angina.
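
A back-of-envelope reconstruction of the quoted ICER from the figures in this abstract (not the trial's own economic analysis): the incremental 1-month utility gain is assumed to decline linearly to zero over 3 years, so the incremental QALYs equal the area of that triangle.

```python
# Reconstructing the ~$36,000/QALY figure from the numbers quoted above.
# This only checks the arithmetic; the published analysis is more detailed.

delta_cost_12m = 2883.0            # cost difference at 1 year (USD)
delta_utility_1m = 0.054 - 0.001   # extra utility improvement with PCI at 1 month
horizon_years = 3.0                # gain assumed to decline linearly to zero

delta_qalys = 0.5 * delta_utility_1m * horizon_years  # area of the triangle
icer = delta_cost_12m / delta_qalys
print(f"Incremental QALYs: {delta_qalys:.3f}  ICER: ~${icer:,.0f}/QALY")
# -> roughly 0.08 QALYs and an ICER in the mid-$30,000s per QALY,
#    consistent with the value reported in the abstract.
```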

Relevance: 60.00%

Abstract:

OBJECTIVES Economic evaluations of interventions to prevent and control sexually transmitted infections such as Chlamydia trachomatis are increasingly required to present their outcomes in terms of quality-adjusted life-years using preference-based measurements of relevant health states. The objectives of this study were to critically evaluate how published cost-effectiveness studies have conceptualized and valued health states associated with chlamydia and to examine the primary evidence available to inform health state utility values (HSUVs). METHODS A systematic review was conducted, with searches of six electronic databases up to December 2012. Data on study characteristics, methods, and main results were extracted by using a standard template. RESULTS Nineteen economic evaluations of relevant interventions were included. Individual studies considered different health states and assigned different values and durations. Eleven studies cited the same source for HSUVs. Only five primary studies valued relevant health states. The methods and viewpoints adopted varied, and different values for health states were generated. CONCLUSIONS Limitations in the information available about HSUVs associated with chlamydia and its complications have implications for the robustness of economic evaluations in this area. None of the primary studies could be used without reservation to inform cost-effectiveness analyses in the United Kingdom. Future debate should consider appropriate methods for valuing health states for infectious diseases, because recommended approaches may not be suitable. Unless we adequately tackle the challenges associated with measuring and valuing health-related quality of life for patients with chlamydia and other infectious diseases, evaluating the cost-effectiveness of interventions in this area will remain problematic.
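
For readers unfamiliar with how health state utility values feed into QALYs in such evaluations, a minimal illustration follows. The states, utilities, and durations are invented placeholders, not values from the review or from any primary study it cites.

```python
# QALY loss attributable to a sequence of health states:
# loss = sum over states of (1 - utility) * duration in years.
# All values below are illustrative placeholders.

health_states = [
    # (state, utility on a 0-1 scale, duration in years)
    ("symptomatic chlamydia infection", 0.90, 2 / 52),
    ("pelvic inflammatory disease",     0.75, 6 / 52),
    ("chronic pelvic pain",             0.80, 1.0),
]

qaly_loss = sum((1.0 - utility) * duration for _, utility, duration in health_states)
print(f"QALY loss for this illustrative pathway: {qaly_loss:.3f}")
```

Small changes in these utilities or durations shift the QALY denominator of a cost-effectiveness ratio, which is why the limited HSUV evidence discussed above matters.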

Relevance: 60.00%

Abstract:

Background: WHO's 2013 revisions to its Consolidated Guidelines on antiretroviral drugs recommend routine viral load monitoring, rather than clinical or immunological monitoring, as the preferred monitoring approach on the basis of clinical evidence. However, HIV programmes in resource-limited settings require guidance on the most cost-effective use of resources in view of other competing priorities such as expansion of antiretroviral therapy coverage. We assessed the cost-effectiveness of alternative patient monitoring strategies. Methods: We evaluated a range of monitoring strategies, including clinical, CD4 cell count, and viral load monitoring, alone and together, at different frequencies and with different criteria for switching to second-line therapies. We used three independently constructed and validated models simultaneously. We estimated costs on the basis of resource use projected in the models and associated unit costs; we quantified impact as disability-adjusted life years (DALYs) averted. We compared alternatives using incremental cost-effectiveness analysis. Findings: All models show that clinical monitoring delivers significant benefit compared with a hypothetical baseline scenario with no monitoring or switching. Regular CD4 cell count monitoring confers a benefit over clinical monitoring alone, at an incremental cost that makes it affordable in more settings than viral load monitoring, which is currently more expensive. Viral load monitoring without CD4 cell count every 6-12 months provides the greatest reductions in morbidity and mortality, but incurs a high cost per DALY averted, resulting in lost opportunities to generate health gains if implemented instead of increasing antiretroviral therapy coverage or expanding antiretroviral therapy eligibility. Interpretation: The priority for HIV programmes should be to expand antiretroviral therapy coverage, firstly at a CD4 cell count lower than 350 cells per μL, and then at a CD4 cell count lower than 500 cells per μL, using lower-cost clinical or CD4 monitoring. At current costs, viral load monitoring should be considered only after high antiretroviral therapy coverage has been achieved. Point-of-care technologies and other factors reducing costs might make viral load monitoring more affordable in future. Funding: Bill & Melinda Gates Foundation, WHO.
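
The incremental comparison described here can be illustrated with a simplified frontier calculation: order the strategies by cost, skip any that are no more effective than a cheaper option, and compute the ICER of each remaining strategy against the previous one. The costs and DALYs averted below are made-up placeholders, not outputs of the three models.

```python
# Simplified incremental cost-effectiveness comparison of monitoring strategies.
# Numbers are illustrative placeholders; extended dominance is ignored.

strategies = [
    # (name, cost per patient in US$, DALYs averted per patient)
    ("no monitoring or switching", 0.0,   0.00),
    ("clinical monitoring",        60.0,  0.60),
    ("CD4 monitoring",             150.0, 0.72),
    ("viral load monitoring",      450.0, 0.80),
]

strategies.sort(key=lambda s: s[1])               # order by cost
ref_name, ref_cost, ref_dalys = strategies[0]
for name, cost, dalys in strategies[1:]:
    if dalys <= ref_dalys:                        # simply dominated
        print(f"{name}: dominated by {ref_name}")
        continue
    icer = (cost - ref_cost) / (dalys - ref_dalys)
    print(f"{name} vs {ref_name}: ICER = US${icer:,.0f} per DALY averted")
    ref_name, ref_cost, ref_dalys = name, cost, dalys
```

With these placeholders the ICER rises steeply from clinical to CD4 to viral load monitoring, mirroring the qualitative pattern reported above.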

Relevance: 60.00%

Abstract:

Objectives: To update the 2006 systematic review of the comparative benefits and harms of erythropoiesis-stimulating agent (ESA) strategies and non-ESA strategies to manage anemia in patients undergoing chemotherapy and/or radiation for malignancy (excluding myelodysplastic syndrome and acute leukemia), including the impact of alternative thresholds for initiating treatment and optimal duration of therapy. Data sources: Literature searches were updated in electronic databases (n=3), conference proceedings (n=3), and Food and Drug Administration transcripts. Multiple sources (n=13) were searched for potential gray literature. A primary source for current survival evidence was a recently published individual patient data meta-analysis. In that meta-analysis, patient data were obtained from investigators for studies enrolling more than 50 patients per arm. Because those data constitute the most currently available data for this update, as well as the source for on-study (active treatment) mortality data, we limited inclusion in the current report to studies enrolling more than 50 patients per arm to avoid potential differential endpoint ascertainment in smaller studies. Review methods: Title and abstract screening was performed by one or two (to resolve uncertainty) reviewers; potentially included publications were reviewed in full text. Two or three (to resolve disagreements) reviewers assessed trial quality. Results were independently verified and pooled for outcomes of interest. The balance of benefits and harms was examined in a decision model. Results: We evaluated evidence from 5 trials directly comparing darbepoetin with epoetin, 41 trials comparing epoetin with control, and 8 trials comparing darbepoetin with control; 5 trials evaluated early versus late (delay until Hb ≤9 to 11 g/dL) treatment. Trials varied according to duration, tumor types, cancer therapy, trial quality, iron supplementation, baseline hemoglobin, ESA dosing frequency (and therefore amount per dose), and dose escalation. ESAs decreased the risk of transfusion (pooled relative risk [RR], 0.58; 95% confidence interval [CI], 0.53 to 0.64; I2 = 51%; 38 trials) without evidence of meaningful difference between epoetin and darbepoetin. Thromboembolic event rates were higher in ESA-treated patients (pooled RR, 1.51; 95% CI, 1.30 to 1.74; I2 = 0%; 37 trials) without difference between epoetin and darbepoetin. In 14 trials reporting the Functional Assessment of Cancer Therapy (FACT)-Fatigue subscale, the most common patient-reported outcome, scores changed by −0.6 in control arms (95% CI, −6.4 to 5.2; I2 = 0%) and by 2.1 in ESA arms (95% CI, −3.9 to 8.1; I2 = 0%). There were fewer thromboembolic and on-study mortality adverse events when ESA treatment was delayed until baseline Hb was less than 10 g/dL, in keeping with current treatment practice, but the difference in effect from early treatment was not significant, and the evidence was limited and insufficient for conclusions. No evidence informed optimal duration of therapy. Mortality was increased during the on-study period (pooled hazard ratio [HR], 1.17; 95% CI, 1.04 to 1.31; I2 = 0%; 37 trials). There was one additional death for every 59 treated patients when the control-arm on-study mortality was 10 percent and one additional death for every 588 treated patients when the control-arm on-study mortality was 1 percent. A cohort decision model yielded a consistent result: greater loss of life-years when control-arm on-study mortality was higher.
There was no discernible increase in mortality with ESA use over the longest available follow-up (pooled HR, 1.04; 95% CI, 0.99 to 1.10; I2 = 38%; 44 trials), but many trials did not include an overall survival endpoint and potential time-dependent confounding was not considered. Conclusions: Results of this update were consistent with the 2006 review. ESAs reduced the need for transfusions and increased the risk of thromboembolism. FACT-Fatigue scores were better with ESA use, but the magnitude of the difference was less than the minimal clinically important difference. An increase in mortality accompanied the use of ESAs. An important unanswered question is whether dosing practices and overall ESA exposure might influence harms.
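
The "one additional death per N treated patients" figures above can be reproduced from the pooled on-study hazard ratio with a simple small-risk approximation (excess risk ≈ baseline risk × (HR − 1)); this is a reader's check, not the report's own method.

```python
# Reproducing the number-needed-to-harm figures from the pooled on-study HR,
# using the small-risk approximation: excess risk ~= baseline_risk * (HR - 1).

hr_on_study = 1.17

for baseline_mortality in (0.10, 0.01):
    excess_risk = baseline_mortality * (hr_on_study - 1)
    nnh = 1 / excess_risk
    print(f"Control-arm on-study mortality {baseline_mortality:.0%}: "
          f"one extra death per ~{nnh:.0f} treated patients")
# -> ~59 at 10% and ~588 at 1%, matching the figures quoted above.
```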

Relevance: 60.00%

Abstract:

QUESTION UNDER STUDY The aim of this study was to evaluate the cost-effectiveness of ticagrelor and generic clopidogrel as add-on therapy to acetylsalicylic acid (ASA) in patients with acute coronary syndrome (ACS), from a Swiss perspective. METHODS Based on the PLATelet inhibition and patient Outcomes (PLATO) trial, one-year mean healthcare costs per patient treated with ticagrelor or generic clopidogrel were analysed from a payer perspective in 2011. A two-part decision-analytic model estimated treatment costs, quality-adjusted life years (QALYs), life years and the cost-effectiveness of ticagrelor and generic clopidogrel in patients with ACS for time horizons up to a lifetime, at a discount rate of 2.5% per annum. Sensitivity analyses were performed. RESULTS Over a patient's lifetime, treatment with ticagrelor generates an additional 0.1694 QALYs and 0.1999 life years at a cost of CHF 260 compared with generic clopidogrel. This results in an incremental cost-effectiveness ratio (ICER) of CHF 1,536 per QALY and CHF 1,301 per life year gained. Ticagrelor dominated generic clopidogrel over the five-year and one-year periods, with treatment generating cost savings of CHF 224 and CHF 372 while gaining 0.0461 and 0.0051 QALYs and 0.0517 and 0.0062 life years, respectively. Univariate sensitivity analyses confirmed the dominant position of ticagrelor in the first five years and probabilistic sensitivity analyses showed a high probability of cost-effectiveness over a lifetime. CONCLUSION During the first five years after ACS, treatment with ticagrelor dominates generic clopidogrel in Switzerland. Over a patient's lifetime, ticagrelor is highly cost-effective compared with generic clopidogrel, with ICERs well below commonly accepted willingness-to-pay thresholds.
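
The lifetime ICERs quoted here follow directly from dividing the incremental cost by the incremental effect (discounting is already reflected in the inputs); the check below reproduces them up to rounding.

```python
# Arithmetic behind the lifetime ICERs quoted above (incremental cost divided
# by incremental effect); small rounding differences vs. the quoted values.

delta_cost_chf = 260.0
delta_qalys = 0.1694
delta_life_years = 0.1999

print(f"ICER: CHF {delta_cost_chf / delta_qalys:,.0f} per QALY gained")            # ~1,535
print(f"ICER: CHF {delta_cost_chf / delta_life_years:,.0f} per life year gained")  # ~1,301
```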

Relevance: 60.00%

Abstract:

BACKGROUND The cost-effectiveness of routine viral load (VL) monitoring of HIV-infected patients on antiretroviral therapy (ART) depends on various factors that differ between settings and across time. Low-cost point-of-care (POC) tests for VL are in development and may make routine VL monitoring affordable in resource-limited settings. We developed a software tool to study the cost-effectiveness of switching to second-line ART with different monitoring strategies, and focused on POC-VL monitoring. METHODS We used a mathematical model to simulate cohorts of patients from start of ART until death. We modeled 13 strategies (no 2nd-line, clinical, CD4 (with or without targeted VL), POC-VL, and laboratory-based VL monitoring, with different frequencies). We included a scenario with identical failure rates across strategies, and one in which routine VL monitoring reduces the risk of failure. We compared lifetime costs and averted disability-adjusted life-years (DALYs). We calculated incremental cost-effectiveness ratios (ICERs). We developed an Excel tool to update the results of the model for varying unit costs and cohort characteristics, and conducted several sensitivity analyses varying the input costs. RESULTS Introducing 2nd-line ART had an ICER of US$1651-1766/DALY averted. Compared with clinical monitoring, the ICER of CD4 monitoring was US$1896-5488/DALY averted and that of VL monitoring US$951-5813/DALY averted. We found no difference between POC- and laboratory-based VL monitoring, except for the highest measurement frequency (every 6 months), where laboratory-based testing was more effective. Targeted VL monitoring was on the cost-effectiveness frontier only if the difference between 1st- and 2nd-line costs remained large, and if we assumed that routine VL monitoring does not prevent failure. CONCLUSION Compared with the less expensive strategies, the cost-effectiveness of routine VL monitoring essentially depends on the cost of 2nd-line ART. Our Excel tool is useful for determining optimal monitoring strategies for specific settings, with specific sex- and age-distributions and unit costs.
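
A sketch of the main sensitivity highlighted in the conclusion: the ICER of routine VL monitoring versus a cheaper strategy rises with the cost of 2nd-line ART, because earlier failure detection keeps patients on 2nd-line drugs for longer. All numbers are placeholders, not outputs of the published model or of the Excel tool.

```python
# Illustrative sensitivity of the VL-monitoring ICER to the cost of 2nd-line ART.
# Parameter values are placeholders, not those of the published model.

def icer_vl_vs_clinical(second_line_annual_cost,
                        extra_monitoring_cost=200.0,    # lifetime VL testing cost
                        extra_years_on_second_line=2.0,
                        dalys_averted=0.15):
    incremental_cost = (extra_monitoring_cost
                        + extra_years_on_second_line * second_line_annual_cost)
    return incremental_cost / dalys_averted

for annual_cost in (100, 300, 600):                     # US$ per patient-year
    print(f"2nd-line ART at US${annual_cost}/year: "
          f"ICER ~US${icer_vl_vs_clinical(annual_cost):,.0f} per DALY averted")
```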

Relevance: 60.00%

Abstract:

OBJECTIVE To estimate the cost-effectiveness of prevention of mother-to-child transmission (MTCT) of HIV with lifelong antiretroviral therapy (ART) for pregnant and breastfeeding women ('Option B+') compared with ART during pregnancy or breastfeeding only unless clinically indicated ('Option B'). DESIGN Mathematical modelling study of first and second pregnancy, informed by data from the Malawi Option B+ programme. METHODS Individual-based simulation model. We simulated cohorts of 10 000 women and their infants during two subsequent pregnancies, including the breastfeeding period, with either Option B+ or B. We parameterized the model with data from the literature and by analysing programmatic data. We compared total costs of antenatal and postnatal care, and lifetime costs and disability-adjusted life-years of the infected infants between Option B+ and Option B. RESULTS During the first pregnancy, 15% of the infants born to HIV-infected mothers acquired the infection. With Option B+, 39% of the women were on ART at the beginning of the second pregnancy, compared with 18% with Option B. For second pregnancies, the rates of MTCT were 11.3% with Option B+ and 12.3% with Option B. The incremental cost-effectiveness ratio comparing the two options ranged between about US$ 500 and US$ 1300 per DALY averted. CONCLUSION Option B+ prevents more vertical transmissions of HIV than Option B, mainly because more women are already on ART at the beginning of the next pregnancy. Option B+ is a cost-effective strategy for PMTCT if the total future costs and lost lifetime of the infected infants are taken into account.
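
A rough sketch of how the second-pregnancy comparison translates into an ICER: infant infections averted follow from the quoted MTCT rates, while the incremental programme cost and the DALYs per infant infection below are assumed placeholders, not the study's parameters.

```python
# Infections averted in the second pregnancy, from the quoted MTCT rates,
# combined with assumed placeholder costs and DALYs per infant infection.

cohort_size = 10_000
mtct_option_b      = 0.123
mtct_option_b_plus = 0.113

infections_averted = cohort_size * (mtct_option_b - mtct_option_b_plus)  # ~100

incremental_cost_usd = 2_500_000.0     # assumed extra ART cost of Option B+
dalys_per_infant_infection = 25.0      # assumed lifetime DALYs per infection

icer = incremental_cost_usd / (infections_averted * dalys_per_infant_infection)
print(f"Infant infections averted: {infections_averted:.0f}")
print(f"ICER: ~US${icer:,.0f} per DALY averted")
```

With these placeholders the result falls inside the US$ 500-1300 per DALY range quoted above; the published figure of course depends on the study's own cost and DALY inputs.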

Relevance: 40.00%

Abstract:

The purpose of the study was to assess long-term mortality after an intensive care unit (ICU) stay and to test the hypotheses that (1) quality of life improves over time and (2) predictions of outcome made by caregivers during an ICU stay are reliable.

Relevance: 40.00%

Abstract:

Background: After oral tumor resection, structural and functional rehabilitation by means of dental prostheses is complex, and a positive treatment outcome is not always predictable. Purpose: The objective of the study was to report on oral rehabilitation and quality of life 2-5 years after resection of malignant oral tumors. Materials and Methods: Data of 46 patients (57 ± 7 years) who underwent oral tumor surgery were available. More than 50% of tumors were classified T3 or T4. Open oro-nasal defects resulted in 12 patients and full mandibular block resections in 23 patients. Comprehensive planning, implant placement, and prosthetic rehabilitation followed an interdisciplinary protocol. Analysis comprised tumor location, type of prostheses, implant survival, and quality of life. Results: Because of advanced tumor status, resections resulted in marked alteration of the oral anatomy requiring complex treatment procedures. Prosthetic rehabilitation comprised fixed and removable prostheses, with 104 implants placed in 28 patients (60%). Early implant loss was high (13%) and the cumulative survival rate of loaded implants was <90% after 5 years. Prosthetic plans had to be modified because of side effects of tumor therapy, complications with implants, and tumor recurrence. The majority of patients rated quality of life favorably, but some experienced impaired swallowing, dry mouth, limited mouth opening, problems with appearance, and soreness. Conclusions: Some local effects of tumor therapy could not be significantly improved by prosthetic rehabilitation, leading to functional and emotional disability. Many patients had passed away or felt too ill to complete the questionnaires. This case series confirms the complex anatomic alterations after tumor resection and the need for individual treatment approaches, especially regarding prosthesis design. In spite of disease-related local and general restrictions, most patients gave a positive assessment of quality of life.

Relevance: 40.00%

Abstract:

BACKGROUND: Children in emergencies need peripheral intravenous (IV) access in order to receive drugs or fluids. The success of IV access is associated with the age of patients and fails in up to 50% of children younger than 6 years. In such situations, it is essential that physicians and paramedics have a tool and easily learnable skills with a high chance of success. According to international guidelines, intraosseous (IO) access would be the next step after failed IV access. Our hypothesis was that the success rate in IO puncturing could be improved by standardizing the training, so we developed an IO workshop. METHODS: Twenty-eight hospitals and ambulance services participated in an evaluation process over 3 years. IO workshops and the distribution of standardized IO sets were coordinated by the study group of the University Hospital of Berne. Any attempted or successful IO punctures were evaluated with a standardized interview. RESULTS: We investigated 35 applications in 30 patients (a total of 49 punctures) between November 2001 and December 2004. IO puncture was not successful in 5 patients. The success rate depended neither on the occupation nor on the experience of the users. Attendance at a standardized IO workshop increased the overall success rate from 77% to 100%, although this difference was not statistically significant (P = 0.074). CONCLUSIONS: Standardized training in IO puncturing seems to improve success more than previous experience and occupation of providers. However, we could not show a significant increase in success rate after this training. Larger supranational studies are needed to show a significant impact of teaching on rarely used emergency skills.