767 results for cost utility analysis
Abstract:
Introduction Asthma is now one of the most common long-term conditions in the UK. It is therefore important to develop a comprehensive appreciation of the healthcare and societal costs in order to inform decisions on care provision and planning. We plan to build on our earlier estimates of the national prevalence and costs of asthma by filling the data gaps previously identified in relation to healthcare and by broadening the field of enquiry to include societal costs. This work will provide the first UK-wide estimates of the costs of asthma. In the context of asthma for the UK and its member countries (ie, England, Northern Ireland, Scotland and Wales), we seek to: (1) produce a detailed overview of estimates of incidence, prevalence and healthcare utilisation; (2) estimate health and societal costs; (3) identify any remaining information gaps and explore the feasibility of filling these; and (4) provide insights into future research that has the potential to inform changes in policy leading to the provision of more cost-effective care.
Methods and analysis Secondary analyses of data from national health surveys, primary care, prescribing, emergency care, hospital, mortality and administrative data sources will be undertaken to estimate prevalence, healthcare utilisation and outcomes from asthma. Data linkages and economic modelling will be undertaken in an attempt to populate data gaps and estimate costs. Separate prevalence and cost estimates will be calculated for each of the UK-member countries and these will then be aggregated to generate UK-wide estimates.
Ethics and dissemination Approvals have been obtained from the NHS Scotland Information Services Division's Privacy Advisory Committee, the Secure Anonymised Information Linkage Collaboration Review System, the NHS South-East Scotland Research Ethics Service and The University of Edinburgh's Centre for Population Health Sciences Research Ethics Committee. We will produce a report for Asthma-UK, submit papers to peer-reviewed journals and construct an interactive map.
Abstract:
BACKGROUND: Despite vaccines and improved medical intensive care, clinicians must remain vigilant for possible Meningococcal Disease in children. The objective was to establish whether the procalcitonin test is a cost-effective adjunct for detecting prodromal Meningococcal Disease in children presenting at the emergency department with fever without source.
METHODS AND FINDINGS: Data to evaluate the procalcitonin, C-reactive protein and white cell count tests as indicators of Meningococcal Disease were collected from six independent studies identified through a systematic literature search applying PRISMA guidelines. The data included 881 children with fever without source in developed countries. The optimal cut-off value for each of the procalcitonin, C-reactive protein and white cell count tests, as an indicator of Meningococcal Disease, was determined. Summary receiver operating characteristic (SROC) curve analysis determined the overall diagnostic performance of each test with 95% confidence intervals. A decision analytic model was designed to reflect realistic clinical pathways for a child presenting with fever without source by comparing two diagnostic strategies: standard testing using combined C-reactive protein and white cell count tests compared to standard testing plus the procalcitonin test. The costs of each of the four diagnosis groups (true positive, false negative, true negative and false positive) were assessed from a National Health Service payer perspective. The procalcitonin test was more accurate (sensitivity=0.89, 95% CI=0.76-0.96; specificity=0.74, 95% CI=0.4-0.92) for early Meningococcal Disease compared to standard testing alone (sensitivity=0.47, 95% CI=0.32-0.62; specificity=0.8, 95% CI=0.64-0.9). Decision analytic model outcomes indicated that the incremental cost-effectiveness ratio for the base case was -£8,137.25 (US$ -13,371.94) per correctly treated patient.
CONCLUSIONS: Procalcitonin plus the standard recommended tests improved the discriminatory ability for fatal Meningococcal Disease and was more cost-effective; it was also a superior biomarker in infants. Further research is recommended for point-of-care procalcitonin testing and Markov modelling to incorporate cost per QALY with a life-time model.
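The negative base-case ICER reported above means the procalcitonin strategy was dominant: both cheaper and more effective. A minimal sketch of the ICER calculation itself, using invented costs and "correctly treated" proportions rather than the study's figures:

```python
def icer(cost_new, cost_std, effect_new, effect_std):
    """Incremental cost-effectiveness ratio: extra cost per extra unit of effect."""
    return (cost_new - cost_std) / (effect_new - effect_std)

# Hypothetical figures: the added test costs more up front but correctly
# treats a larger share of patients, so the denominator is positive.
ratio = icer(cost_new=520.0, cost_std=480.0,
             effect_new=0.89, effect_std=0.47)
print(round(ratio, 2))  # 40 / 0.42 ≈ 95.24 per additional correctly treated patient
```

When the new strategy is cheaper as well as more effective, the numerator turns negative and the ICER comes out below zero, as in the study's base case.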
Abstract:
Severe refractory asthma poses a substantial burden in terms of healthcare costs, but relatively little is known about the factors which drive these costs. This study uses data from the British Thoracic Society Difficult Asthma Registry (n=596) to estimate direct healthcare treatment costs from a National Health Service perspective and examines factors that explain variations in costs. Annual mean treatment costs among severe refractory asthma patients were £2912 (SD £2212) to £4217 (SD £2449). Significant predictors of costs were FEV1% predicted, location of care, maintenance oral corticosteroid treatment and body mass index. Treating individuals with severe refractory asthma presents a substantial cost to the health service.
Abstract:
Multiple Table Lookup architectures in Software Defined Networking (SDN) open the door for exciting new network applications. The development of the OpenFlow protocol supported the SDN paradigm. However, the first version of the OpenFlow protocol specified a single table lookup model, with the associated constraints on flow entry numbers and search capabilities. With the introduction of multiple table lookup in OpenFlow v1.1, flexible and efficient search to support SDN application innovation became possible. However, implementing multiple table lookup in hardware to meet high performance requirements is non-trivial. One possible approach involves the use of multi-dimensional lookup algorithms. A high lookup performance can be achieved by using embedded memory for flow entry storage. A detailed study of OpenFlow flow filters for multi-dimensional lookup is presented in this paper. Based on a proposed multiple table lookup architecture, the memory consumption and update performance using parallel single field searches are evaluated. The results demonstrate an efficient multi-table lookup implementation with minimum memory usage.
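The multi-table pipeline introduced in OpenFlow v1.1 can be sketched in software: each table holds prioritised flow entries, and a matching entry may direct the packet to a later table. The data structures and field names below are simplified illustrations, not the actual protocol encoding or the paper's hardware architecture:

```python
# Minimal sketch of an OpenFlow-style multiple table lookup pipeline.

class FlowEntry:
    def __init__(self, match, priority, goto_table=None, actions=()):
        self.match = match            # dict of field -> required value
        self.priority = priority      # higher wins within a table
        self.goto_table = goto_table  # next table id, or None to stop
        self.actions = actions

def lookup(tables, packet):
    """Walk tables starting at table 0, accumulating actions along the way."""
    actions, table_id = [], 0
    while table_id is not None:
        best = None
        for entry in tables[table_id]:
            if all(packet.get(f) == v for f, v in entry.match.items()):
                if best is None or entry.priority > best.priority:
                    best = entry
        if best is None:
            return actions  # table miss: stop the pipeline
        actions.extend(best.actions)
        table_id = best.goto_table
    return actions

tables = {
    0: [FlowEntry({"eth_type": 0x0800}, priority=10, goto_table=1)],
    1: [FlowEntry({"ipv4_dst": "10.0.0.1"}, priority=5, actions=["output:1"])],
}
print(lookup(tables, {"eth_type": 0x0800, "ipv4_dst": "10.0.0.1"}))  # ['output:1']
```

Splitting the classification across tables is what lets hardware run the single-field searches in parallel and keeps per-table entry counts, and hence memory, small.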
Abstract:
Small bowel cancer accounts for only 0.5% of cancer cases in the US, but incidence rates have been rising at 2.4% per year over the past decade. One-third of these are adenocarcinomas, but little is known about their molecular pathology and no molecular markers are available for clinical use. Using a retrospective 28-patient matched normal-tumor cohort, next-generation sequencing, gene expression arrays and CpG methylation arrays were used for molecular profiling. Next-generation sequencing identified novel mutations in IDH1, CDH1, KIT, FGFR2, FLT3, NPM1, PTEN, MET, AKT1, RET, NOTCH1 and ERBB4. Array data revealed 17% of CpGs and 5% of RNA transcripts assayed to be differentially methylated and expressed respectively (p < 0.01). Merging gene expression and DNA methylation data revealed CHN2 as consistently hypermethylated and downregulated in this disease (Spearman -0.71, p < 0.001). Mutations in TP53, which were found in more than half of the cohort (15/28), and Kazald1 hypomethylation were both indicative of poor survival (p = 0.03, HR = 3.2 and p = 0.01, HR = 4.9 respectively). By integrating high-throughput mutational, gene expression and DNA methylation data, this study reveals for the first time the distinct molecular profile of small bowel adenocarcinoma and highlights potential clinically exploitable markers.
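The CHN2 finding above rests on correlating per-sample methylation with expression using Spearman's rank correlation, which tests for a monotone (here, inverse) relationship. A self-contained sketch with invented toy data, not the study's measurements:

```python
def rank(values):
    """Average ranks (1-based), with ties sharing the mean rank."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman(x, y):
    """Spearman's rho: Pearson correlation computed on the ranks."""
    rx, ry = rank(x), rank(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

methylation = [0.9, 0.8, 0.7, 0.6, 0.4, 0.3]  # hypermethylated samples...
expression  = [1.2, 1.5, 2.0, 2.4, 3.1, 3.5]  # ...show lower transcript levels
print(round(spearman(methylation, expression), 2))  # -1.0 for this monotone toy data
```

A rho near -0.71, as reported for CHN2, indicates a strong but imperfect inverse monotone association across the cohort.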
Abstract:
The present study is intended to provide a new scientific approach to the cost engineering problems encountered in the nation's chemical industries. The problem addressed is the cost estimation of equipment, especially pressure vessels, when setting up chemical plants. The present study attempts to develop a model for such cost estimation, which it is hoped would go a long way towards solving this and related problems in forecasting the cost of setting up chemical plants.
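A standard starting point for equipment cost models of this kind, though not necessarily the form of the model developed in the study, is power-law capacity scaling (the "six-tenths rule"). A minimal sketch with hypothetical figures:

```python
def scaled_cost(base_cost, base_capacity, new_capacity, exponent=0.6):
    """Power-law ('six-tenths rule') capacity scaling, commonly used for
    preliminary equipment cost estimates; the exponent varies by equipment type."""
    return base_cost * (new_capacity / base_capacity) ** exponent

# Hypothetical: a 10 m^3 pressure vessel cost $50,000; estimate a 20 m^3 unit.
print(round(scaled_cost(50_000, 10, 20)))  # doubling capacity costs ~1.52x, not 2x
```

The sub-linear exponent captures the economy of scale: larger vessels cost more in absolute terms but less per unit of capacity.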
Abstract:
Compositional data naturally arises from the scientific analysis of the chemical composition of archaeological material such as ceramic and glass artefacts. Data of this type can be explored using a variety of techniques, from standard multivariate methods such as principal components analysis and cluster analysis, to methods based upon the use of log-ratios. The general aim is to identify groups of chemically similar artefacts that could potentially be used to answer questions of provenance. This paper will demonstrate work in progress on the development of a documented library of methods, implemented using the statistical package R, for the analysis of compositional data. R is an open source package that makes available very powerful statistical facilities at no cost. We aim to show how, with the aid of statistical software such as R, traditional exploratory multivariate analysis can easily be used alongside, or in combination with, specialist techniques of compositional data analysis. The library has been developed from a core of basic R functionality, together with purpose-written routines arising from our own research (for example that reported at CoDaWork'03). In addition, we have included other appropriate publicly available techniques and libraries that have been implemented in R by other authors. Available functions range from standard multivariate techniques through to various approaches to log-ratio analysis and zero replacement. We also discuss and demonstrate a small selection of relatively new techniques that have hitherto been little-used in archaeometric applications involving compositional data. The application of the library to the analysis of data arising in archaeometry will be demonstrated; results from different analyses will be compared; and the utility of the various methods discussed.
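The log-ratio methods mentioned above centre on transforms such as the centred log-ratio (clr), which maps a composition out of the constrained simplex so that standard multivariate methods apply. Although the library described is implemented in R, the idea can be sketched in Python; the oxide percentages below are invented:

```python
import math

def clr(composition):
    """Centred log-ratio transform: log of each part over the geometric mean.
    Parts must be strictly positive (zero replacement is applied beforehand)."""
    g = math.exp(sum(math.log(x) for x in composition) / len(composition))
    return [math.log(x / g) for x in composition]

# Hypothetical oxide percentages for one ceramic sherd.
sample = [55.0, 20.0, 15.0, 10.0]
transformed = clr(sample)
print(abs(sum(transformed)) < 1e-9)  # True: clr coordinates sum to zero
```

Because clr coordinates sum to zero, distances between transformed samples are invariant to the closure constraint, which is what makes PCA or cluster analysis on them meaningful.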
Abstract:
Most building services products are installed while a building is constructed, but they are not operated until the building is commissioned. The warranty of the products may cover the time starting from their installation to the end of the warranty period. Prior to the commissioning of the building, the products are in a dormant mode (i.e., not operated) but protected by the warranty. For such products, both the usage intensity and the failure patterns differ from those of products in continuous use. This paper develops warranty cost models for repairable products with a dormant mode from both the manufacturer's and the buyer's perspectives. Relationships between the failure patterns in the dormant mode and in the operational mode are also discussed. Numerical examples and sensitivity analysis are used to demonstrate the applicability of the methodology derived in the paper.
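A minimal sketch of the kind of expected-cost calculation such models involve, assuming minimal repair under a Weibull-type failure intensity with a reduced dormant-mode rate. This is an illustrative formulation with invented parameters, not the paper's exact model:

```python
def expected_warranty_cost(repair_cost, warranty, dormant_time,
                           eta, beta, dormant_factor):
    """Expected manufacturer warranty cost under minimal repair (NHPP),
    with Weibull-type cumulative intensity L(t) = (t / eta) ** beta.
    Failures in the dormant mode accrue at a reduced rate (dormant_factor < 1)."""
    dormant_failures = dormant_factor * (dormant_time / eta) ** beta
    operational_failures = ((warranty - dormant_time) / eta) ** beta
    return repair_cost * (dormant_failures + operational_failures)

# Hypothetical: 2-year warranty, product dormant for the first 0.5 years.
cost = expected_warranty_cost(repair_cost=200.0, warranty=2.0, dormant_time=0.5,
                              eta=5.0, beta=1.5, dormant_factor=0.2)
print(round(cost, 2))  # ≈ 34.13
```

Varying `dormant_time` and `dormant_factor` in such a model is the natural sensitivity analysis: a longer dormant period shifts warranty exposure from the (costlier) operational mode to the dormant mode.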