18 results for Support unit costs
in BORIS: Bern Open Repository and Information System - Bern - Switzerland
Abstract:
BACKGROUND Monitoring of HIV viral load in patients on combination antiretroviral therapy (ART) is not generally available in resource-limited settings. We examined the cost-effectiveness of qualitative point-of-care viral load tests (POC-VL) in sub-Saharan Africa. DESIGN Mathematical model based on longitudinal data from the Gugulethu and Khayelitsha township ART programmes in Cape Town, South Africa. METHODS Cohorts of patients on ART monitored by POC-VL, CD4 cell count or clinically were simulated. Scenario A considered the more accurate detection of treatment failure with POC-VL only, and scenario B also considered the effect on HIV transmission. Scenario C further assumed that the risk of virologic failure is halved with POC-VL due to improved adherence. We estimated the change in costs per quality-adjusted life-year gained (incremental cost-effectiveness ratios, ICERs) of POC-VL compared with CD4 and clinical monitoring. RESULTS POC-VL tests with detection limits below 1000 copies/ml increased costs due to unnecessary switches to second-line ART, without improving survival. Assuming POC-VL unit costs between US$5 and US$20 and detection limits between 1000 and 10,000 copies/ml, the ICER of POC-VL was US$4010-US$9230 compared with clinical monitoring and US$5960-US$25,540 compared with CD4 cell count monitoring. In scenario B, the corresponding ICERs were US$2450-US$5830 and US$2230-US$10,380. In scenario C, the ICER ranged between US$960 and US$2500 compared with clinical monitoring, and between cost-saving and US$2460 compared with CD4 monitoring. CONCLUSION The cost-effectiveness of POC-VL for monitoring ART is improved by a higher detection limit, by taking the reduction in new HIV infections into account, and by assuming that failure of first-line ART is reduced through targeted adherence counselling.
Abstract:
Background: WHO's 2013 revisions to its Consolidated Guidelines on antiretroviral drugs recommend routine viral load monitoring, rather than clinical or immunological monitoring, as the preferred monitoring approach on the basis of clinical evidence. However, HIV programmes in resource-limited settings require guidance on the most cost-effective use of resources in view of other competing priorities such as expansion of antiretroviral therapy coverage. We assessed the cost-effectiveness of alternative patient monitoring strategies. Methods: We evaluated a range of monitoring strategies, including clinical, CD4 cell count, and viral load monitoring, alone and together, at different frequencies and with different criteria for switching to second-line therapies. We used three independently constructed and validated models simultaneously. We estimated costs on the basis of resource use projected in the models and associated unit costs; we quantified impact as disability-adjusted life years (DALYs) averted. We compared alternatives using incremental cost-effectiveness analysis. Findings: All models show that clinical monitoring delivers significant benefit compared with a hypothetical baseline scenario with no monitoring or switching. Regular CD4 cell count monitoring confers a benefit over clinical monitoring alone, at an incremental cost that makes it affordable in more settings than viral load monitoring, which is currently more expensive. Viral load monitoring without CD4 cell count every 6-12 months provides the greatest reductions in morbidity and mortality, but incurs a high cost per DALY averted, resulting in lost opportunities to generate health gains if implemented instead of increasing antiretroviral therapy coverage or expanding antiretroviral therapy eligibility.
Interpretation: The priority for HIV programmes should be to expand antiretroviral therapy coverage, first at a CD4 cell count lower than 350 cells per μL, and then at a CD4 cell count lower than 500 cells per μL, using lower-cost clinical or CD4 monitoring. At current costs, viral load monitoring should be considered only after high antiretroviral therapy coverage has been achieved. Point-of-care technologies and other factors reducing costs might make viral load monitoring more affordable in future. Funding: Bill & Melinda Gates Foundation, WHO.
Abstract:
BACKGROUND The cost-effectiveness of routine viral load (VL) monitoring of HIV-infected patients on antiretroviral therapy (ART) depends on various factors that differ between settings and across time. Low-cost point-of-care (POC) tests for VL are in development and may make routine VL monitoring affordable in resource-limited settings. We developed a software tool to study the cost-effectiveness of switching to second-line ART with different monitoring strategies, and focused on POC-VL monitoring. METHODS We used a mathematical model to simulate cohorts of patients from start of ART until death. We modeled 13 strategies (no 2nd-line, clinical, CD4 (with or without targeted VL), POC-VL, and laboratory-based VL monitoring, with different frequencies). We included a scenario with identical failure rates across strategies, and one in which routine VL monitoring reduces the risk of failure. We compared lifetime costs and averted disability-adjusted life-years (DALYs). We calculated incremental cost-effectiveness ratios (ICER). We developed an Excel tool to update the results of the model for varying unit costs and cohort characteristics, and conducted several sensitivity analyses varying the input costs. RESULTS Introducing 2nd-line ART had an ICER of US$1651-1766/DALY averted. Compared with clinical monitoring, the ICER of CD4 monitoring was US$1896-US$5488/DALY averted and VL monitoring US$951-US$5813/DALY averted. We found no difference between POC- and laboratory-based VL monitoring, except for the highest measurement frequency (every 6 months), where laboratory-based testing was more effective. Targeted VL monitoring was on the cost-effectiveness frontier only if the difference between 1st- and 2nd-line costs remained large, and if we assumed that routine VL monitoring does not prevent failure. CONCLUSION Compared with the less expensive strategies, the cost-effectiveness of routine VL monitoring essentially depends on the cost of 2nd-line ART. 
Our Excel tool is useful for determining optimal monitoring strategies for specific settings, with specific sex- and age-distributions and unit costs.
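The incremental cost-effectiveness logic used throughout these analyses can be sketched in a few lines: strategies are ordered by cost, dominated and extendedly dominated options are dropped, and each remaining strategy's ICER is its extra cost per extra DALY averted relative to the next cheaper strategy on the frontier. The strategy names and figures below are illustrative placeholders, not outputs of the model described above.

```python
def frontier_icers(strategies):
    """Compute ICERs along the cost-effectiveness frontier.

    strategies: list of (name, lifetime_cost, dalys_averted) tuples.
    Returns (name, icer) pairs for each non-baseline strategy left on the
    frontier, where icer = extra cost / extra DALYs averted versus the
    next cheaper frontier strategy.
    """
    # Sort by cost; drop strictly dominated strategies (costlier yet no
    # more effective than a cheaper alternative).
    pts = sorted(strategies, key=lambda s: s[1])
    frontier = []
    best_effect = float("-inf")
    for s in pts:
        if s[2] > best_effect:
            frontier.append(s)
            best_effect = s[2]
    # Drop extendedly dominated strategies: ICERs must increase with cost.
    changed = True
    while changed:
        changed = False
        icers = [
            (frontier[i][1] - frontier[i - 1][1])
            / (frontier[i][2] - frontier[i - 1][2])
            for i in range(1, len(frontier))
        ]
        for i in range(1, len(icers)):
            if icers[i] < icers[i - 1]:
                del frontier[i]  # the middle strategy of the reversal
                changed = True
                break
    return [
        (frontier[i][0],
         (frontier[i][1] - frontier[i - 1][1])
         / (frontier[i][2] - frontier[i - 1][2]))
        for i in range(1, len(frontier))
    ]
```

With made-up inputs such as a no-monitoring baseline at (0, 0 DALYs), clinical at (1000, 1.0), CD4 at (3000, 1.5), and routine VL at (9000, 2.0), the sketch returns ICERs of 1000, 4000 and 12000 per DALY averted, and a strategy like targeted VL at (8000, 1.6) is removed by extended dominance.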
Abstract:
INTRODUCTION Dexmedetomidine was shown in two European randomized double-blind double-dummy trials (PRODEX and MIDEX) to be non-inferior to propofol and midazolam in maintaining target sedation levels in mechanically ventilated intensive care unit (ICU) patients. Additionally, dexmedetomidine shortened the time to extubation versus both standard sedatives, suggesting that it may reduce ICU resource needs and thus lower ICU costs. Considering resource utilization data from these two trials, we performed a secondary, cost-minimization analysis assessing the economics of dexmedetomidine versus standard care sedation. METHODS The total ICU costs associated with each study sedative were calculated on the basis of total study sedative consumption and the number of days patients remained intubated, required non-invasive ventilation, or required ICU care without mechanical ventilation. The daily unit costs for these three consecutive ICU periods were set to decline toward discharge, reflecting the observed reduction in mean daily Therapeutic Intervention Scoring System (TISS) points between the periods. A number of additional sensitivity analyses were performed, including one in which the total ICU costs were based on the cumulative sum of daily TISS points over the ICU period, and two further scenarios with declining direct variable daily costs only. RESULTS Based on pooled data from both trials, sedation with dexmedetomidine resulted in lower total ICU costs than the standard sedatives, with a difference of €2,656 in the median (interquartile range) total ICU costs (€11,864 [€7,070 to €23,457] versus €14,520 [€7,871 to €26,254]) and €1,649 in the mean total ICU costs. The median (mean) total ICU costs with dexmedetomidine compared with those of propofol or midazolam were €1,292 (€747) and €3,573 (€2,536) lower, respectively.
The result was robust, indicating lower costs with dexmedetomidine in all sensitivity analyses, including those in which only direct variable ICU costs were considered. The likelihood of dexmedetomidine resulting in lower total ICU costs compared with pooled standard care was 91.0% (72.4% versus propofol and 98.0% versus midazolam). CONCLUSIONS From an economic point of view, dexmedetomidine appears to be a preferable option compared with standard sedatives for providing light to moderate ICU sedation exceeding 24 hours. The savings potential results primarily from shorter time to extubation. TRIAL REGISTRATION ClinicalTrials.gov NCT00479661 (PRODEX), NCT00481312 (MIDEX).
Abstract:
Intermittent and continuous renal replacement therapies (RRTs) are available for the treatment of acute renal failure (ARF) in the intensive care unit (ICU). Although at present there are no adequately powered survival studies, available data suggest that both methods are equal with respect to patient outcome. Therefore, cost comparison between techniques is important for selecting the modality. Expenditures were prospectively assessed as a secondary end point during a controlled, randomized trial comparing intermittent hemodialysis (IHD) with continuous venovenous hemodiafiltration (CVVHDF). The outcome of the primary end points of this trial, that is, ICU and in-hospital mortality, has been previously published. One hundred twenty-five patients from a Swiss university hospital ICU were randomized either to CVVHDF or IHD. Of these, 42 (CVVHDF) and 34 (IHD) were available for cost analysis. Patients' characteristics, delivered dialysis dose, duration of stay in the ICU or hospital, mortality rates, and recovery of renal function were not different between the two groups. Detailed 24-h time and material consumption protocols were available for 369 (CVVHDF) and 195 (IHD) treatment days. The mean daily duration of CVVHDF was 19.5 ± 3.2 h/day, resulting in total expenditures of €436 ± 21 (21% for human resources and 79% for technical devices). For IHD (mean 3.0 ± 0.4 h/treatment), the costs were lower (€268 ± 26), with a larger proportion for human resources (45%). Nursing time spent for CVVHDF was 113 ± 50 min, versus 198 ± 63 min per IHD treatment. Total costs for RRT in ICU patients with ARF were lower when treated with IHD than with CVVHDF, and should be taken into account when selecting the method of RRT for ARF in the ICU.
Abstract:
OBJECTIVE: Nursing in 'live islands' and routine high-dose intravenous immunoglobulins after allogeneic hematopoietic stem cell transplantation were abandoned by many teams in view of limited evidence and high costs. METHODS: This retrospective single-center study examines the impact of the change from nursing in 'live islands' to care in single rooms (SR), and from high-dose to targeted intravenous immunoglobulins (IVIG), on the mortality and infection rate of adult patients receiving an allogeneic stem cell or bone marrow transplantation, in two steps and three time cohorts (1993-1997, 1997-2000, 2000-2003). RESULTS: Two hundred forty-eight allogeneic hematopoietic stem cell transplantations were performed in 227 patients. Patient characteristics were comparable in the three cohorts for gender, median age, underlying disease and disease stage, prophylaxis for graft-versus-host disease (GvHD), and cytomegalovirus constellation. The incidence of infections (78.4%) and infection rates remained stable (rates per 1000 days of neutropenia: 17.61 for sepsis, 6.76 for pneumonia). Cumulative incidence of GvHD and transplant-related mortality did not change over time. CONCLUSIONS: The change from nursing in 'live islands' to SR and the reduction from high-dose to targeted IVIG did not result in increased infection rates or mortality despite an increase in patient age. These results support the current practice.
Abstract:
Writing unit tests for legacy systems is a key maintenance task. When writing tests for object-oriented programs, objects need to be set up and the expected effects of executing the unit under test need to be verified. If developers lack internal knowledge of a system, the task of writing tests is non-trivial. To address this problem, we propose an approach that exposes side effects detected in example runs of the system and uses these side effects to guide the developer when writing tests. We introduce a visualization called Test Blueprint, through which we identify what the required fixture is and what assertions are needed to verify the correct behavior of a unit under test. The dynamic analysis technique that underlies our approach is based on both tracing method executions and on tracking the flow of objects at runtime. To demonstrate the usefulness of our approach we present results from two case studies.
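The approach can be pictured with a small, hypothetical example in Python rather than the authors' tooling: tracing an example run reveals which object state the unit reads (the fixture) and which side effects it produces (the assertions). The `BankAccount` class below is an invented stand-in for a legacy unit under test, not code from the paper.

```python
import unittest

class BankAccount:
    """A toy legacy unit under test."""
    def __init__(self, balance=0):
        self.balance = balance
        self.history = []

    def deposit(self, amount):
        # Side effects a trace would expose: the balance grows and a
        # record is appended to the history.
        self.balance += amount
        self.history.append(("deposit", amount))

class BankAccountTest(unittest.TestCase):
    def test_deposit(self):
        # Fixture: recreate the object state observed before the traced call.
        account = BankAccount(balance=100)
        # Execute the unit under test.
        account.deposit(50)
        # Assertions: verify exactly the side effects the trace exposed.
        self.assertEqual(account.balance, 150)
        self.assertEqual(account.history, [("deposit", 50)])
```

In the paper's terms, the Test Blueprint visualization tells the developer which objects to construct for the fixture and which effects to assert, without requiring internal knowledge of the system.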
Abstract:
The aim of the study was to examine the economic performance as well as perceived social and environmental impacts of organic cotton in Southern Kyrgyzstan on the basis of a comparative field study (44 certified organic farmers and 33 conventional farmers) carried out in 2009. It also investigated farmers' motivation for and assessment of conversion to organic farming. Cotton yields on organic farms were found to be 10% lower, while input costs per unit were 42% lower, which resulted in organic farmers having a 20% higher revenue from cotton. Due to lower input costs and organic and fair trade price premiums, the average gross margin from organic cotton was 27% higher. In addition to direct economic benefits, organic farmers enjoy a number of additional benefits such as easy access to credits on favourable terms, provision with uncontaminated cottonseed cooking oil and seed cake as animal feed, and marketing support as well as extension and training services provided by the newly established organic service provider. A large majority of organic farmers perceive an improvement of soil quality and improved health conditions, and positively assess their previous decision to convert to organic farming. The major disadvantage of organic farming is the high manual labour input required. In the study area, where manual farm work is mainly women's work and male labour migration widespread, women are most affected by this negative aspect of organic farming. Altogether, the results suggest that despite the inconvenience of a higher workload the advantages of organic farming outweigh the disadvantages and that conversion to organic farming can improve the livelihoods of small-scale farmers.
Abstract:
Cotton is a leading agricultural non-food commodity associated with soil degradation, water pollution and pesticide poisoning due to high levels of agrochemical inputs. Organic farming is often promoted as a means of addressing the economic, environmental and health risks of conventional cotton production, and it is slowly gaining ground in the global cotton market. Organic and fair trade cotton are widely seen as opportunities for smallholder farmers to improve their livelihoods thanks to higher returns, lower input costs and fewer risks. Despite an increasing number of studies comparing the profitability of organic and non-organic farming systems in developing and industrialized countries, little has been published on organic farming in Central Asia. The aim of this article is to describe the economic performance and perceived social and environmental impacts of organic cotton in southern Kyrgyzstan, drawing on a comparative field study conducted by the author in 2009. In addition to economic and environmental aspects, the study investigated farmers’ motivations toward and assessment of conversion to organic farming. Cotton yields on organic farms were found to be 10% lower, while input costs per unit were 42% lower; as a result, organic farmers’ cotton revenues were 20% higher. Due to lower input costs as well as organic and fair trade price premiums, the average gross margin from organic cotton was 27% higher. In addition to direct economic benefits, organic farmers enjoy other benefits, such as easy access to credit on favorable terms, provision of uncontaminated cottonseed cooking oil and cottonseed cake as animal feed, and marketing support as well as extension and training services provided by newly established organic service providers. The majority of organic farmers perceive improved soil quality, improved health conditions, and positively assess their initial decision to convert to organic farming. 
The major disadvantage of organic farming is the high manual labor input required. In the study area, where manual farm work is mainly women's work and male labor migration is widespread, women are most affected by this negative aspect of organic farming. Altogether, the results suggest that, despite the inconvenience of a higher workload, the advantages of organic farming outweigh its disadvantages and that conversion to organic farming improves the livelihoods of small-scale farmers.
Abstract:
Soil erosion models and soil erosion risk maps are often used as indicators to assess potential soil erosion in order to assist policy decisions. This paper shows the scientific basis of the soil erosion risk map of Switzerland and its application in policy and practice. Linking a USLE/RUSLE-based model approach (AVErosion), founded on multiple flow algorithms and the unit contributing area concept, with an extremely precise, high-resolution digital terrain model (2 m × 2 m grid) in a GIS allows for a realistic assessment of the potential soil erosion risk at the level of single plots, i.e. uniformly and comprehensively across the agricultural area of Switzerland (862,579 ha in the valley area and the lower mountain regions). National, small-scale soil erosion prognosis has thus reached a level heretofore possible only in smaller catchment areas or on single plots. Validation was carried out using soil loss data from field mappings of soil erosion damage collected through long-term monitoring in different test areas. Of the evaluated agricultural area of Switzerland, 45% was classified as low potential erosion risk, 12% as moderate, and 43% as high potential erosion risk. However, many of the areas classified as high potential erosion risk are located at the transition from the valley to the mountain zone, where much land is used as permanent grassland, which drastically lowers its current erosion risk. The present soil erosion risk map serves on the one hand to identify and prioritise high-erosion-risk areas, and on the other hand to promote awareness amongst farmers and authorities. It was published on the internet and will be made available to the authorities in digital form. It is intended as a tool for simplifying and standardising enforcement of the legal framework for soil erosion prevention in Switzerland. The work therefore provides a successful example of cooperation between science, policy and practice.
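At its core, a USLE/RUSLE-type estimate multiplies empirical factors per grid cell; the sketch below shows only that structure. The factor values and class thresholds are invented for illustration, and the actual AVErosion approach additionally distributes slope length via multiple-flow algorithms on the 2 m grid.

```python
def usle_soil_loss(r, k, ls, c, p):
    """Universal Soil Loss Equation: A = R * K * LS * C * P.

    r  - rainfall-runoff erosivity factor
    k  - soil erodibility factor
    ls - combined slope length and steepness factor
    c  - cover-management factor
    p  - support practice factor
    Returns the estimated average annual soil loss A (t/ha/yr).
    """
    return r * k * ls * c * p

def risk_class(a, low=2.0, high=6.0):
    """Bucket a cell's potential soil loss into three map classes.
    The thresholds here are placeholders, not the published legend values."""
    if a < low:
        return "low"
    if a < high:
        return "moderate"
    return "high"
```

Applied cell by cell over the digital terrain model, such a classification yields the low/moderate/high shares reported for the map.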
Abstract:
BACKGROUND: Empirical antibiotic therapy is based on patients' characteristics and antimicrobial susceptibility data. Hospital-wide cumulative antibiograms may not sufficiently support informed decision-making for optimal treatment of hospitalized patients. METHODS: We studied different approaches to analysing antimicrobial susceptibility rates (SRs) of all diagnostic bacterial isolates collected from patients hospitalized between July 2005 and June 2007 at the University Hospital in Zurich, Switzerland. We compared stratification for unit-specific, specimen type-specific (blood, urinary, respiratory versus all specimens) and isolate sequence-specific (first, follow-up versus all isolates) data with hospital-wide cumulative antibiograms, and studied changes of mean SR during the course of hospitalization. RESULTS: A total of 16 281 isolates (7965 first, 1201 follow-up and 7115 repeat isolates) were tested. We found relevant differences in SRs across different hospital departments. Mean SRs of Escherichia coli to ciprofloxacin ranged between 64.5% and 95.1% in various departments, and mean SRs of Pseudomonas aeruginosa to imipenem and meropenem ranged from 54.2% to 100% and 80.4% to 100%, respectively. Compared with hospital cumulative antibiograms, lower SRs were observed in intensive care unit specimens, follow-up isolates and isolates causing nosocomial infections (except for Staphylococcus aureus). Decreasing SRs were observed in first isolates of coagulase-negative staphylococci with increasing interval between hospital admission and specimen collection. Isolates from different anatomical sites showed variations in SRs. CONCLUSIONS: We recommend the reporting of unit-specific rather than hospital-wide cumulative antibiograms. Decreasing antimicrobial susceptibility during hospitalization and variations in SRs in isolates from different anatomical sites should be taken into account when selecting empirical antibiotic treatment.
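Producing unit-specific rather than hospital-wide susceptibility rates is essentially a group-by over the isolate records; a minimal sketch follows. The dict keys are invented for illustration, and restricting to first isolates mirrors the common convention for cumulative antibiograms.

```python
from collections import defaultdict

def susceptibility_rates(isolates, by=("unit",), first_only=True):
    """Percentage of susceptible isolates per stratum.

    isolates: iterable of dicts with keys 'unit', 'sequence'
              ('first', 'follow-up', 'repeat') and 'susceptible' (bool).
    by: keys to stratify on (e.g. unit-specific rather than hospital-wide).
    first_only: restrict to first isolates per patient.
    """
    counts = defaultdict(lambda: [0, 0])  # stratum -> [susceptible, total]
    for iso in isolates:
        if first_only and iso["sequence"] != "first":
            continue
        key = tuple(iso[k] for k in by)
        counts[key][1] += 1
        if iso["susceptible"]:
            counts[key][0] += 1
    return {k: 100.0 * s / n for k, (s, n) in counts.items()}
```

Calling the function with `by=()` reproduces a hospital-wide cumulative antibiogram, which is exactly the aggregation the authors argue can mask unit-level differences such as the ciprofloxacin range they report.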
Abstract:
PURPOSE: To assess family satisfaction in the ICU and to identify parameters for improvement. METHODS: Multicenter study in Swiss ICUs. Families were given a questionnaire covering overall satisfaction, satisfaction with care and satisfaction with information/decision-making. Demographic, medical and institutional data were gathered from patients, visitors and ICUs. RESULTS: A total of 996 questionnaires from family members were analyzed. Individual questions were assessed, and summary measures (range 0-100) were calculated, with higher scores indicating greater satisfaction. The summary score was 78 ± 14 (mean ± SD) for overall satisfaction, 79 ± 14 for care and 77 ± 15 for information/decision-making. In multivariable multilevel linear regression analyses, higher severity of illness was associated with higher satisfaction, while a higher patient:nurse ratio and written admission/discharge criteria were associated with lower overall satisfaction. Using performance-importance plots, items with high impact on overall satisfaction but low satisfaction were identified. They included: emotional support; providing understandable, complete, consistent information; and coordination of care. CONCLUSIONS: Overall, proxies were satisfied with care and with information/decision-making. Still, several factors, such as emotional support, coordination of care and communication, are associated with poor satisfaction, suggesting the need for improvement. ELECTRONIC SUPPLEMENTARY MATERIAL: The online version of this article (doi:10.1007/s00134-009-1611-4) contains supplementary material, which is available to authorized users.
Abstract:
OBJECTIVES: Respiratory syncytial virus (RSV) infections are a leading cause of hospital admissions in small children. A substantial proportion of these patients require medical and nursing care that can only be provided in intermediate care (IMC) or intensive care units (ICU). This article reports on all children aged < 3 years who required admission to an IMC and/or ICU between October 1, 2001 and September 30, 2005 in Switzerland. PATIENTS AND METHODS: We prospectively collected data on all children aged < 3 years who were admitted to an IMC or ICU for an RSV-related illness. Using a detailed questionnaire, we collected information on risk factors, therapy requirements, length of stay in the IMC/ICU and hospital, and outcome. RESULTS: Of the 577 cases reported during the study period, 90 were excluded because the patients did not fulfill the inclusion criteria; data were incomplete in another 25 cases (5%). Therefore, a total of 462 verified cases were eligible for analysis. At the time of hospital admission, only 31 patients (11%) were older than 12 months. Since RSV infection was not the main reason for IMC/ICU admission in 52% of these patients, we chose to exclude this subgroup from further analyses. Among the 431 infants aged < 12 months, the majority (77%) were former near-term or full-term (NT/FT) infants with a gestational age ≥ 35 weeks without additional risk factors who were hospitalized at a median age of 1.5 months. Gestational age (GA) < 32 weeks, moderate to severe bronchopulmonary dysplasia (BPD), and congenital heart disease (CHD) were all associated with a significantly increased risk of IMC/ICU admission (relative risk 14, 56, and 10 for GA ≤ 32 weeks, BPD, and CHD, respectively). Compared with NT/FT infants, high-risk infants were hospitalized at an older age (except for infants with CHD), required more invasive and longer respiratory support, and had longer stays in the IMC/ICU and hospital.
CONCLUSIONS: In Switzerland, RSV infections lead to the IMC/ICU admission of approximately 1%-2% of each annual birth cohort. Although prematurity, BPD, and CHD are significant risk factors, non-pharmacological preventive strategies should not be restricted to these high-risk patients but also target young NT/FT infants since they constitute 77% of infants requiring IMC/ICU admission.
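A relative risk of the kind quoted above is simply the ratio of admission incidence between an exposed and an unexposed group; the sketch below shows the arithmetic with made-up counts, not the study's data.

```python
def relative_risk(exposed_events, exposed_total,
                  unexposed_events, unexposed_total):
    """Relative risk = incidence among exposed / incidence among unexposed."""
    return ((exposed_events / exposed_total)
            / (unexposed_events / unexposed_total))
```

For instance, if 14 of 200 exposed infants and 7 of 1400 unexposed infants were admitted, the relative risk would be 14, the magnitude reported above for GA ≤ 32 weeks.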
Abstract:
The WOCAT network has collected, documented, and assessed more than 350 case studies on promising and good practices of sustainable land management (SLM). Information on on- and off-site benefits of different SLM types, as well as on investment and maintenance costs, is available, sometimes in quantitative and often in qualitative form. The objective of the present paper is to analyse what kind of economic benefits accrue to local stakeholders, and to better understand how these benefits compare to investment and maintenance costs. The large majority of the technologies contained in the database are perceived by land users as having positive benefits that outweigh costs in the long term. About three quarters of them also have positive or at least neutral benefits in the short term. The analysis shows that many SLM measures exist which can generate important benefits to land users, but also to other stakeholders. However, methodological issues need to be tackled and further quantitative and qualitative data are needed to better understand and support the adoption of SLM measures.
Keywords: Sustainable Land Management, Costs, Benefits, Technologies
Abstract:
Background The usefulness and modalities of cardiovascular screening in young athletes remain controversial, particularly concerning the role of the 12-lead ECG. One reason is the presumed rate of false-positive ECGs requiring additional examinations and incurring higher costs. Our study aimed to assess the total costs and yield of a preparticipation cardiovascular examination with ECG in young athletes in Switzerland. Methods Athletes aged 14–35 years were examined according to the 2005 European Society of Cardiology (ESC) protocol. ECGs were interpreted based on the 2010 ESC-adapted recommendations. The costs of the overall screening programme until diagnosis were calculated according to Swiss medical rates. Results A total of 1070 athletes were examined (75% men, 19.7±6.3 years) over a 15-month period. Among them, 67 (6.3%) required further examinations: 14 (1.3%) due to medical history, 15 (1.4%) due to physical examination and 42 (3.9%) because of abnormal ECG findings. A previously unknown cardiac abnormality was established in 11 athletes (1.0%). In four athletes (0.4%), the abnormality could potentially lead to sudden cardiac death, and all four were identified by ECG alone. The overall programme cost 157 464 Swiss francs (CHF), corresponding to CHF 147 per athlete and CHF 14 315 per finding. Conclusions Cardiovascular preparticipation examination in young athletes using modern, athlete-specific criteria for interpreting the ECG is feasible in Switzerland at reasonable cost. The ECG alone detected all potentially lethal cardiac diseases. The results of our study support the inclusion of ECG in routine preparticipation screening.
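The per-athlete and per-finding figures follow directly from the reported totals, as a quick check shows:

```python
total_cost_chf = 157_464  # overall screening programme until diagnosis
n_athletes = 1_070        # athletes examined
n_findings = 11           # previously unknown cardiac abnormalities

cost_per_athlete = total_cost_chf / n_athletes  # about CHF 147
cost_per_finding = total_cost_chf / n_findings  # about CHF 14 315
```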