7 results for Cost Estimating Practice
in BORIS: Bern Open Repository and Information System - Bern - Switzerland
Abstract:
Introduction: The Health Technology Assessment report on the effectiveness, cost-effectiveness and appropriateness of homeopathy was compiled on behalf of the Swiss Federal Office for Public Health (BAG) within the framework of the 'Program of Evaluation of Complementary Medicine (PEK)'. Materials and Methods: Databases accessible via the Internet were systematically searched, complemented by manual searches and contacts with experts, and evaluated according to internal and external validity criteria. Results: Many high-quality investigations in pre-clinical basic research showed that homeopathic high potencies induce regulative and specific changes in cells or living organisms. 20 of 22 systematic reviews detected at least a trend in favor of homeopathy. In our estimation, 5 studies yielded results indicating clear evidence for homeopathic therapy. The evaluation of 29 studies in the domain 'Upper Respiratory Tract Infections/Allergic Reactions' showed a positive overall result in favor of homeopathy. 6 out of 7 controlled studies were at least equivalent to conventional medical interventions, and 8 out of 16 placebo-controlled studies were significant in favor of homeopathy. Swiss regulations grant a high degree of safety due to product and training requirements for homeopathic physicians. Applied properly, classical homeopathy has few side effects, and the use of high potencies is free of toxic effects. A general health-economic statement about homeopathy cannot be made from the available data. Conclusion: Taking internal and external validity criteria into account, the effectiveness of homeopathy can be supported by clinical evidence, and its professional and adequate application can be regarded as safe. Reliable statements on cost-effectiveness are not available at the moment. External and model validity will have to be taken more strongly into consideration in future studies.
Abstract:
OBJECTIVE: This study aimed to assess the potential cost-effectiveness of testing patients with nephropathies for the I/D polymorphism before starting angiotensin-converting enzyme (ACE) inhibitor therapy, using a 3-year time horizon and a healthcare perspective. METHODS: We used a combination of decision analysis and Markov modeling to evaluate the potential economic value of this pharmacogenetic test in preventing unfavorable treatment in patients with nephropathies. The estimate of the predictive value of the I/D polymorphism is based on a systematic review showing that DD carriers tend to respond well to ACE inhibitors, while II carriers seem not to benefit adequately from this treatment. Data on ACE inhibitor effectiveness in nephropathy were derived from the REIN (Ramipril Efficacy in Nephropathy) trial. We calculated the number of patients with end-stage renal disease (ESRD) prevented and the differences in incremental costs and incremental effects, expressed as life-years free of ESRD. A probabilistic sensitivity analysis was conducted to determine the robustness of the results. RESULTS: Compared with unselective treatment, testing patients for their ACE genotype could save 12 patients per 1000 from developing ESRD during the 3 years covered by the model. As the mean net cost savings were EUR 356,000 per 1000 patient-years and 9 life-years free of ESRD were gained, selective treatment appears to be dominant. CONCLUSION: The study suggests that genetic testing for the I/D polymorphism in patients with nephropathy before initiating ACE inhibitor therapy will most likely be cost-effective, even if the risk for II carriers of developing ESRD when treated with ACE inhibitors is only 1.4% higher than for DD carriers. Further studies, however, are required to corroborate the difference in treatment response between ACE genotypes before genetic testing can be justified in clinical practice.
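To make the modeling approach concrete, the following minimal Python sketch marches cohorts through a simple two-state Markov model (nephropathy vs. ESRD) and compares a treat-all strategy against genotype-guided treatment. All transition probabilities, costs, and the II-carrier frequency are illustrative placeholders, not the values used in the study.

```python
# Minimal Markov cohort sketch: treat-all vs. genotype-guided ACE inhibitor
# therapy. All probabilities, costs, and the II frequency are hypothetical.

P_ESRD = {
    "ace_dd": 0.02,  # annual ESRD risk, DD carriers on ACE inhibitors
    "ace_ii": 0.06,  # II carriers respond poorly to ACE inhibitors
    "alt_ii": 0.03,  # hypothetical alternative therapy for II carriers
}
COST = {"nephropathy": 3_000, "esrd": 50_000, "test": 100}  # EUR, hypothetical
P_II = 0.25  # hypothetical II-carrier frequency

def run_cohort(p_esrd, n, years=3, test=False):
    """March a cohort through a two-state (nephropathy -> ESRD) model.
    Returns total cost and ESRD-free life-years."""
    healthy = float(n)
    cost = COST["test"] * n if test else 0.0
    free_years = 0.0
    for _ in range(years):
        healthy -= healthy * p_esrd  # incident ESRD cases this year
        free_years += healthy
        cost += healthy * COST["nephropathy"] + (n - healthy) * COST["esrd"]
    return cost, free_years

N = 1000
# Treat-all: II carriers receive ACE inhibitors despite poor response.
c1, f1 = run_cohort(P_ESRD["ace_dd"], N * (1 - P_II))
c2, f2 = run_cohort(P_ESRD["ace_ii"], N * P_II)
cost_all, fy_all = c1 + c2, f1 + f2

# Genotype-guided: test everyone, switch II carriers to the alternative.
c1, f1 = run_cohort(P_ESRD["ace_dd"], N * (1 - P_II), test=True)
c2, f2 = run_cohort(P_ESRD["alt_ii"], N * P_II, test=True)
cost_sel, fy_sel = c1 + c2, f1 + f2

print(f"incremental cost of testing: {cost_sel - cost_all:+,.0f} EUR")
print(f"incremental ESRD-free life-years: {fy_sel - fy_all:+.1f}")
```

A probabilistic sensitivity analysis, as in the study, would rerun this comparison many times with the inputs drawn from distributions rather than fixed values.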
Abstract:
Ecology and conservation require reliable data on the occurrence of animals and plants. A major source of bias is imperfect detection, which, however, can be corrected for by estimating detectability. In traditional occupancy models, this requires repeat or multi-observer surveys. Recently, time-to-detection models have been developed as a cost-effective alternative: they require no repeat surveys, so survey costs could be halved. We compared the efficiency and reliability of time-to-detection and traditional occupancy models under varying survey effort. Two observers independently searched for 17 plant species in 44 Swiss grassland quadrats of 100 m² and recorded the time-to-detection for each species, enabling detectability to be estimated with both time-to-detection and traditional occupancy models. In addition, we gauged the relative influence on detectability of species, observer, plant height and two measures of abundance (cover and frequency). Estimates of detectability and occupancy under both models were very similar. Rare species were more likely to be overlooked; detectability was strongly affected by abundance. As a measure of abundance, frequency outperformed cover in its predictive power. The two observers differed significantly in their detection ability. Time-to-detection models were as accurate as traditional occupancy models, but their data were easier to obtain; thus they provide a cost-effective alternative to traditional occupancy models for detection-corrected estimation of occurrence.
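A minimal sketch of the core idea behind a time-to-detection occupancy model, assuming exponentially distributed detection times: the likelihood combines detections observed at time t with sites where nothing was found within the maximum search time T, which are either occupied-but-missed or truly unoccupied. The data are simulated; the study's actual model was richer (species, observer, and abundance effects).

```python
# Time-to-detection occupancy sketch with an exponential detection-time
# distribution; data are simulated, not the study's.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(1)
T = 10.0                       # maximum search time per quadrat (assumed units)
true_psi, true_lam = 0.6, 0.3  # assumed occupancy and detection rate

occupied = rng.random(200) < true_psi
ttd = np.where(occupied, rng.exponential(1 / true_lam, 200), np.inf)
detected = ttd <= T            # undetected sites are right-censored at T

def nll(params):
    psi = 1 / (1 + np.exp(-params[0]))  # occupancy, logit scale
    lam = np.exp(params[1])             # detection rate, log scale
    # Detected at time t: psi * lam * exp(-lam * t)
    ll_det = np.log(psi) + np.log(lam) - lam * ttd[detected]
    # Nothing found by T: occupied but missed, or truly absent
    ll_nodet = np.log(psi * np.exp(-lam * T) + (1 - psi))
    return -(ll_det.sum() + (~detected).sum() * ll_nodet)

fit = minimize(nll, x0=[0.0, 0.0], method="Nelder-Mead")
psi_hat = 1 / (1 + np.exp(-fit.x[0]))
lam_hat = np.exp(fit.x[1])
print(f"psi = {psi_hat:.2f}, P(detect within T) = {1 - np.exp(-lam_hat * T):.2f}")
```

A single timed search thus yields a detection-corrected occupancy estimate, which is why no repeat visit is needed.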
Abstract:
Stepwise uncertainty reduction (SUR) strategies aim at constructing a sequence of points for evaluating a function f in such a way that the residual uncertainty about a quantity of interest progressively decreases to zero. Using such strategies in the framework of Gaussian process modeling has been shown to be efficient for estimating the volume of excursion of f above a fixed threshold. However, SUR strategies remain cumbersome to use in practice because of their high computational complexity and the fact that they deliver a single point at each iteration. In this article we introduce several multipoint sampling criteria, allowing the selection of batches of points at which f can be evaluated in parallel. Such criteria are of particular interest when f is costly to evaluate and several CPUs are simultaneously available. We also drastically reduce the computational cost of these strategies through the use of closed-form formulas. We illustrate their performance in various numerical experiments, including a nuclear safety test case. Basic notions about kriging, auxiliary problems, complexity calculations, R code, and data are available online as supplementary materials.
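The following rough sketch illustrates the flavor of batch selection for excursion-set estimation with a Gaussian process: it greedily picks a batch of points whose excursion status is most uncertain, with an ad-hoc spacing penalty to spread the batch. This mimics the spirit of multipoint sampling but is not the closed-form SUR criteria derived in the article; the test function, threshold, and tuning constants are arbitrary.

```python
# Greedy batch selection for excursion-set estimation with a GP surrogate.
# NOT the article's SUR criteria; a simplified, illustrative heuristic.
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def f(x):  # stand-in for the costly simulator
    return np.sin(3 * x[:, 0]) + 0.5 * x[:, 0]

threshold, batch_size = 0.5, 4
X = np.random.default_rng(0).uniform(0, 3, (8, 1))  # initial design
gp = GaussianProcessRegressor(kernel=RBF(0.5), alpha=1e-6).fit(X, f(X))

grid = np.linspace(0, 3, 200).reshape(-1, 1)
mean, sd = gp.predict(grid, return_std=True)
p_exc = norm.cdf((mean - threshold) / np.maximum(sd, 1e-9))
score = p_exc * (1 - p_exc)  # high where excursion status is uncertain

batch = []
for _ in range(batch_size):  # greedy picks with a spacing penalty
    i = int(np.argmax(score))
    batch.append(grid[i])
    score *= 1 - np.exp(-((grid - grid[i]) ** 2).sum(1) / 0.05)

batch = np.array(batch)  # f would now be evaluated at these points in parallel
print("plug-in volume estimate:", p_exc.mean() * 3.0)
print("next batch:", batch.ravel())
```

After evaluating f on the batch, the GP is refit and the selection step repeats, which is the stepwise loop the abstract describes.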
Abstract:
For swine dysentery, which is caused by Brachyspira hyodysenteriae infection and is an economically important disease in intensive pig production systems worldwide, a perfect or error-free diagnostic test ("gold standard") is not available. In the absence of a gold standard, Bayesian latent class modelling is a well-established methodology for robust diagnostic test evaluation. In contrast to risk factor studies in food animals, where adjustment for within-group correlations is both usual and required for good statistical practice, diagnostic test evaluation studies rarely take such clustering into account, which can lead to misleading results. The aim of the present study was to estimate the test accuracies of a PCR originally designed for use as a confirmatory test, displaying a high diagnostic specificity, and of cultural examination for B. hyodysenteriae. The estimation was based on results of 239 samples from 103 herds originating from routine diagnostic sampling. Using Bayesian latent class modelling comprising a hierarchical beta-binomial approach (which allowed prevalence to vary across individual herds as a herd-level random effect), robust estimates for the sensitivities of PCR and culture, as well as for the specificity of PCR, were obtained. The estimated diagnostic sensitivities (95% CI) of PCR and culture were 73.2% (62.3; 82.9) and 88.6% (74.9; 99.3), respectively. The estimated specificity of the PCR was 96.2% (90.9; 99.8). For test evaluation studies, a Bayesian latent class approach is well suited for addressing the considerable complexities of population structure in food animals.
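Below is a hedged sketch of a two-test latent class (Hui-Walter style) model with herd-level prevalences drawn from a Beta distribution, written with PyMC. The counts are fabricated, culture specificity is fixed at 1 as an assumption of this sketch (the abstract reports only the PCR specificity), and the study's exact hierarchical beta-binomial specification may differ.

```python
# Two-test Bayesian latent class sketch with herd-level random prevalences.
# Fabricated data; illustrative of the model family, not the study's model.
import numpy as np
import arviz as az
import pymc as pm

H = 10                    # number of herds (the study had 103)
n = np.full(H, 12)        # samples per herd (fabricated)
rng = np.random.default_rng(0)
# y[h, k]: counts of the 4 joint PCR/culture patterns per herd,
# k = 0: +/+, 1: +/-, 2: -/+, 3: -/- (fabricated counts)
y = rng.multinomial(12, [0.3, 0.1, 0.15, 0.45], size=H)

with pm.Model():
    se_pcr = pm.Uniform("se_pcr", 0.5, 1.0)  # PCR sensitivity
    sp_pcr = pm.Uniform("sp_pcr", 0.8, 1.0)  # PCR specificity
    se_cul = pm.Uniform("se_cul", 0.5, 1.0)  # culture sensitivity
    sp_cul = 1.0  # culture specificity fixed at 1 (assumption of this sketch)

    # Hierarchical beta piece: herd prevalences as a random effect
    a = pm.Gamma("a", 2.0, 1.0)
    b = pm.Gamma("b", 2.0, 1.0)
    prev = pm.Beta("prev", a, b, shape=H)

    # Cell probabilities under conditional independence of the two tests
    p = [prev * se_pcr * se_cul + (1 - prev) * (1 - sp_pcr) * (1 - sp_cul),
         prev * se_pcr * (1 - se_cul) + (1 - prev) * (1 - sp_pcr) * sp_cul,
         prev * (1 - se_pcr) * se_cul + (1 - prev) * sp_pcr * (1 - sp_cul),
         prev * (1 - se_pcr) * (1 - se_cul) + (1 - prev) * sp_pcr * sp_cul]
    pm.Multinomial("y", n=n, p=pm.math.stack(p, axis=1), observed=y)

    trace = pm.sample(1000, tune=1000, chains=2)

print(az.summary(trace, var_names=["se_pcr", "sp_pcr", "se_cul"]))
```

Letting prev vary by herd is the clustering adjustment the abstract argues for; collapsing all herds into one pooled prevalence would ignore the population structure.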
Abstract:
BACKGROUND Estimating the prevalence of comorbidities and their associated costs in patients with diabetes is fundamental to optimizing health care management. This study assesses the prevalence and health care costs of comorbid conditions among patients with diabetes compared with patients without diabetes. Distinguishing potentially diabetes-related and nondiabetes-related comorbidities in patients with diabetes, we also determined the most frequent chronic conditions and estimated their effect on costs across different health care settings in Switzerland. METHODS Using health care claims data from 2011, we calculated the prevalence and average health care costs of comorbidities among patients with and without diabetes in inpatient and outpatient settings. Patients with diabetes and comorbid conditions were identified using pharmacy-based cost groups. Generalized linear models with a negative binomial distribution were used to analyze the effect of comorbidities on health care costs. RESULTS A total of 932,612 persons, including 50,751 patients with diabetes, were enrolled. The most frequent potentially diabetes-related and nondiabetes-related comorbidities in patients older than 64 years were cardiovascular diseases (91%), rheumatologic conditions (55%), and hyperlipidemia (53%). The mean total health care costs for diabetes patients varied substantially by comorbidity status (US$3,203-$14,223). Patients with diabetes and more than two comorbidities incurred US$10,584 higher total costs than patients without comorbidities. Costs were significantly higher in patients with diabetes and comorbid cardiovascular disease (US$4,788), hyperlipidemia (US$2,163), hyperacidity disorders (US$8,753), and pain (US$8,324) compared with those without the given disease. CONCLUSION Comorbidities in patients with diabetes are highly prevalent and have substantial consequences for medical expenditures. Interestingly, hyperacidity disorders and pain were the most costly conditions. Our findings highlight the importance of developing strategies that meet the needs of patients with diabetes and comorbidities. Integrated diabetes care, such as that used in the Chronic Care Model, may represent a useful strategy.
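As an illustration of the regression step, the sketch below fits a negative binomial GLM of annual costs on diabetes status and comorbidity count using statsmodels. The variable names and simulated data are placeholders, not the study's claims data or pharmacy-based cost groups.

```python
# Negative binomial GLM of health care costs on comorbidity burden.
# Simulated placeholder data; illustrates the model class named in the abstract.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 5_000
df = pd.DataFrame({
    "diabetes": rng.integers(0, 2, n),
    "n_comorbidities": rng.poisson(1.5, n),
    "age": rng.integers(40, 90, n),
})
# Simulated annual costs (USD), increasing with comorbidity burden
mu = np.exp(7.0 + 0.4 * df["diabetes"] + 0.3 * df["n_comorbidities"]
            + 0.01 * df["age"])
df["cost"] = rng.negative_binomial(2, 2 / (2 + mu))

X = sm.add_constant(df[["diabetes", "n_comorbidities", "age"]])
result = sm.GLM(df["cost"], X,
                family=sm.families.NegativeBinomial(alpha=0.5)).fit()
print(result.summary())
# With a log link, exponentiated coefficients read as multiplicative
# effects on expected cost
print(np.exp(result.params))
```

The negative binomial family suits cost data like these because costs are non-negative, right-skewed, and overdispersed relative to a Poisson model.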
Abstract:
Currently several thousand objects are being tracked in the MEO and GEO regions through optical means. The problem faced in this framework is that of Multiple Target Tracking (MTT), in which both the correct associations among the observations and the orbits of the objects have to be determined. The complexity of the MTT problem is defined by its dimension S, which corresponds to the number of fences involved in the problem. Each fence consists of a set of observations where each observation belongs to a different object. The MTT problem with S ≥ 3 is an NP-hard combinatorial optimization problem. There are two general ways to solve it. One way is to seek the optimal solution, which can be achieved by applying a branch-and-bound algorithm; when using such algorithms the problem has to be greatly simplified to keep the computational cost at a reasonable level. Another option is to approximate the solution by using meta-heuristic methods, which aim to explore the different possible combinations efficiently so that a reasonable result can be obtained with a reasonable computational effort. To this end, several population-based meta-heuristic methods are implemented and tested on simulated optical measurements. With the advent of improved sensors and a heightened interest in the problem of space debris, it is expected that the number of tracked objects will grow by an order of magnitude in the near future. This research aims to provide a method that can treat the correlation and orbit determination problems simultaneously and is able to efficiently process large data sets with minimal manual intervention.
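To illustrate what a population-based meta-heuristic for the association problem can look like, the toy Python sketch below runs a simple genetic algorithm over per-fence assignment permutations. A real tracker would score candidate associations by orbit-determination residuals; here a random cost tensor stands in for that scoring, so only the search mechanics are shown.

```python
# Toy genetic algorithm for the S-fence observation association problem.
# A random cost tensor replaces orbit-determination residuals.
import numpy as np

rng = np.random.default_rng(0)
S, N = 3, 8  # fences and observations per fence
# cost[s][i, j]: cost of linking track i (fence 1) to observation j in fence s+2
cost = rng.random((S - 1, N, N))

def fitness(ind):
    """ind[s] is a permutation assigning fence-(s+2) observations to tracks."""
    return sum(cost[s][np.arange(N), ind[s]].sum() for s in range(S - 1))

def mutate(ind):
    child = ind.copy()
    s = rng.integers(S - 1)                 # pick one fence...
    i, j = rng.choice(N, 2, replace=False)  # ...and swap two assignments
    child[s][[i, j]] = child[s][[j, i]]
    return child

# Population of candidate association solutions, evolved by elitism + mutation
pop = [np.array([rng.permutation(N) for _ in range(S - 1)]) for _ in range(40)]
for _ in range(300):
    pop.sort(key=fitness)
    pop = pop[:20] + [mutate(p) for p in pop[:20]]

best = min(pop, key=fitness)
print("best association cost:", fitness(best))
```

Because each candidate is a complete set of associations, the same loop can in principle re-score candidates with full orbit fits, which is how the correlation and orbit determination problems can be treated simultaneously.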