31 results for Cost estimate accuracy

from QUB Research Portal - Research Directory and Institutional Repository for Queen's University Belfast



Relevance: 80.00%

Abstract:

Background: Varicella infection during pregnancy poses a serious risk to both foetus and mother. It has been suggested that antenatal screening with post-partum vaccination, as practised in the US, would be more cost-effective than the current policy of checking immune status after varicella exposure, with VZIG administration where necessary. It is also doubtful whether the current policy provides the best patient care when a vaccine is available.

Objectives: The study aims to retrospectively compare the cost of the current policy with a cost estimate for antenatal screening with post-partum vaccination in NI.

Study design: A cost estimate of antenatal screening of primigravidas, with post-partum vaccination, was calculated for two models: (1) verbal screening, with serological testing of those with no history of varicella infection, and (2) serological screening of all primigravidas.

Results: The cost of VZIG issued to pregnant women in 2006 was £100,800; 43% of births were to primigravidas, so the estimated cost of VZIG issued to multigravidas was £58,100. The cost of verbal screening with post-partum vaccination is estimated at £23,750 p.a., saving £34,350 over current policy. The estimated cost of screening all primigravidas with post-partum vaccination is £43,000, saving £15,100.

Conclusions: This retrospective study suggests that in NI either of the proposed antenatal screening strategies would be less costly than current practice. This finding supports the suggestion that varicella immunity testing should be included in the Antenatal Infectious Diseases Screening Programme, either as part of the universal vaccination programme or solely as an antenatal programme.
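The savings reported in the Results can be checked with simple arithmetic; a minimal sketch using only figures quoted in the abstract (the £58,100 comparison baseline is the one implied by the reported savings):

```python
# Annual costs in GBP, as quoted in the abstract.
baseline = 58_100   # comparison baseline implied by the reported savings
model_1 = 23_750    # verbal screening + post-partum vaccination
model_2 = 43_000    # serological screening of all primigravidas

saving_1 = baseline - model_1
saving_2 = baseline - model_2
print(saving_1)  # 34350, the reported saving for model 1
print(saving_2)  # 15100, the reported saving for model 2
```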

Relevance: 80.00%

Abstract:

Recent improvements in the speed, cost and accuracy of next-generation sequencing are revolutionizing the discovery of single nucleotide polymorphisms (SNPs). SNPs are increasingly being used as an addition to the molecular ecology toolkit in nonmodel organisms, but their efficient use remains challenging. Here, we discuss common issues when employing SNP markers, including the high numbers of markers typically employed, the effects of ascertainment bias and the inclusion of nonneutral loci in a marker panel. We provide a critique of the considerations specifically associated with the application and population genetic analysis of SNPs in nonmodel taxa, focusing on some of the most commonly applied methods.

Relevance: 40.00%

Abstract:

Objectives: To assess whether open angle glaucoma (OAG) screening meets the UK National Screening Committee criteria, to compare screening strategies with case finding, to estimate test parameters, to model estimates of cost and cost-effectiveness, and to identify areas for future research.

Data sources: Major electronic databases were searched up to December 2005.

Review methods: Screening strategies were developed by wide consultation. Markov submodels were developed to represent screening strategies. Parameter estimates were determined by systematic reviews of epidemiology, economic evaluations of screening, and effectiveness (test accuracy, screening and treatment). Tailored, highly sensitive electronic searches were undertaken.

Results: Most potential screening tests reviewed had an estimated specificity of 85% or higher. No test was clearly the most accurate, with only a few, heterogeneous studies for each test. No randomised controlled trials (RCTs) of screening were identified. Based on two treatment RCTs, early treatment reduces the risk of progression. Extrapolating from this, and assuming accelerated progression with advancing disease severity, the mean time to blindness in at least one eye without treatment was approximately 23 years, compared with 35 years with treatment. Prevalence would have to be about 3-4% in 40-year-olds, with a screening interval of 10 years, to approach cost-effectiveness. It is predicted that screening might be cost-effective in a 50-year-old cohort at a prevalence of 4% with a 10-year screening interval. General population screening at any age therefore appears not to be cost-effective. Selective screening of groups with higher prevalence (family history, black ethnicity) might be worthwhile, although this would cover only 6% of the population. Extending this to other at-risk cohorts (e.g. myopia and diabetes) would include 37% of the general population, but the prevalence is then too low for screening to be considered cost-effective. Screening using a test with initial automated classification, followed by assessment of test positives by a specialised optometrist, was more cost-effective than initial specialised optometric assessment. The cost-effectiveness of the screening programme was highly sensitive to the perspective on costs (NHS or societal). In the base-case model, the NHS costs of visual impairment were estimated as £669. If annual societal costs were £8800, then screening might be considered cost-effective for a 40-year-old cohort with 1% OAG prevalence, assuming a willingness to pay of £30,000 per quality-adjusted life-year. Of lesser importance were changes to estimates of attendance for sight tests, incidence of OAG, rate of progression and utility values for each stage of OAG severity. Cost-effectiveness was not particularly sensitive to the accuracy of screening tests within the ranges observed; however, a highly specific test is required to reduce the large number of false-positive referrals. The finding that population screening is unlikely to be cost-effective is based on an economic model whose parameter estimates carry considerable uncertainty; in particular, if the rate of progression and/or the costs of visual impairment are higher than estimated, then screening could be cost-effective.

Conclusions: While population screening is not cost-effective, targeted screening of high-risk groups may be. Procedures for identifying those at risk and for quality-assuring the programme, as well as adequate service provision for those screened positive, would all be needed. Glaucoma detection can be improved by increasing attendance for eye examination and by improving the performance of current testing, either by refining practice or by adding a technology-based first assessment, the latter being the more cost-effective option. This has implications for any future organisational changes in community eye-care services. Further research should aim to develop and provide quality data to populate the economic model, by conducting a feasibility study of interventions to improve detection, by obtaining further data on the costs of blindness, risk of progression and health outcomes, and by conducting an RCT of interventions to improve the uptake of glaucoma testing. © Queen's Printer and Controller of HMSO 2007. All rights reserved.
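The cost-effectiveness judgements above all reduce to comparing incremental cost against willingness-to-pay multiplied by QALY gain. A minimal sketch of that decision rule (the numeric inputs in the usage line are illustrative, not taken from the report):

```python
def cost_effective(inc_cost, inc_qalys, wtp=30_000.0):
    """Cost-effective if the incremental cost per QALY gained is at or
    below the willingness-to-pay threshold (GBP per QALY)."""
    return inc_cost <= wtp * inc_qalys

# Illustrative only: a strategy costing GBP 9,000 more per person while
# gaining 0.4 QALYs clears a GBP 30,000/QALY threshold (9,000 <= 12,000).
print(cost_effective(9_000, 0.4))  # True
```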

Relevance: 30.00%

Abstract:

A generic, hierarchical, and multifidelity methodology for estimating the unit cost of acquisition of outside-production machined parts is presented. The originality of the work lies in the method's inherent capability to generate multilevel and multifidelity cost relations for large volumes of parts, utilizing process data, supply chain costing data, and varying degrees of part design definition information. Estimates can be generated throughout the life cycle of a part using different grades of the combined information available. Considering design development for a given part, additional design definition may be used within the developed method as it becomes available, to improve the quality of the resulting estimate. Via a process of analogous classification, parts are classified into groups of increasing similarity using design-based descriptors. A parametric estimating method is then applied to each subgroup of the machined-part commodity, from which a relationship linking design variables to manufacturing cycle time may be generated. A rate cost reflective of the supply chain is then applied to the cycle time estimate for a given part to arrive at an estimate of make cost, which is then totalled with the material and treatments cost components to give an overall estimate of unit acquisition cost. Both the rate charge applied and the treatments cost calculated for a given procured part are derived via ratio analysis.
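The final costing step described above can be sketched directly; a minimal illustration, assuming the treatments ratio is applied to make cost (the abstract does not state the exact ratio basis, and all numeric inputs below are invented):

```python
def unit_acquisition_cost(cycle_time_h, rate_per_h, material_cost, treatments_ratio):
    """Unit acquisition cost = make cost + material cost + treatments cost.

    make cost  : supply-chain rate applied to the estimated cycle time
    treatments : derived by ratio analysis (assumed here to be a ratio
                 of make cost)
    """
    make_cost = cycle_time_h * rate_per_h
    treatments_cost = make_cost * treatments_ratio
    return make_cost + material_cost + treatments_cost

# 2 h at GBP 60/h, GBP 35 of material, 10% treatments ratio
print(unit_acquisition_cost(2.0, 60.0, 35.0, 0.1))  # 167.0
```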

Relevance: 30.00%

Abstract:

The work presented is concerned with the estimation of manufacturing cost at the concept design stage, when little technical information is readily available. The work focuses on the nose cowl sections of a wide range of engine nacelles built at Bombardier Aerospace Shorts of Belfast. A core methodology is presented that: defines the manufacturing cost elements that are prominent; utilises technical parameters that are highly influential in generating those costs; establishes the linkage between the two; and builds the associated cost estimating relations into models. The methodology is readily adapted to deal with both the early and the more mature conceptual design phases, which highlights the generic, flexible and fundamental nature of the method. The early concept cost model simplifies cost as a cumulative element that can be estimated using higher-level complexity ratings, while the mature concept cost model breaks manufacturing cost down into a number of constituents that are each driven by their own specific drivers. Both methodologies have an average error of less than ten percent when correlated with actual findings, thus achieving an acceptable level of accuracy. By way of validity and application, the research is firmly based on industrial case studies and practice, and addresses the integration of design and manufacture through cost. The main contribution of the paper is the cost modelling methodology. The elemental modelling of the cost breakdown structure through materials, part fabrication, assembly and their associated drivers is relevant to the analytical design procedure, as it utilises design definition and complexity that is understood by engineers.
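The two model forms described above can be sketched as follows; a minimal illustration with hypothetical cost elements, drivers and rates (none of the values come from the paper):

```python
def early_concept_cost(reference_cost, complexity_rating):
    """Early concept: scale a reference part's known cost by a
    high-level complexity rating (1.0 = as complex as the reference)."""
    return reference_cost * complexity_rating

def mature_concept_cost(elements):
    """Mature concept: sum driver-based estimates per cost constituent.
    elements: {name: (driver_value, cost_per_driver_unit)}"""
    return sum(driver * rate for driver, rate in elements.values())

print(early_concept_cost(10_000.0, 1.2))  # 12000.0
print(mature_concept_cost({
    "materials": (150.0, 20.0),    # kg x GBP/kg
    "fabrication": (40.0, 55.0),   # hours x GBP/hour
    "assembly": (12.0, 50.0),      # hours x GBP/hour
}))  # 5800.0
```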

Relevance: 30.00%

Abstract:

Indoor wireless network based client localisation requires the use of a radio map to relate received signal strength to specific locations. However, signal strength measurements are time consuming, expensive and usually require unrestricted access to all parts of the building concerned. An obvious option for circumventing this difficulty is to estimate the radio map using a propagation model. This paper compares the effect of measured and simulated radio maps on the accuracy of two different methods of wireless network based localisation. The results presented indicate that, although the propagation model used underestimated the signal strength by up to 15 dB at certain locations, there was not a significant reduction in localisation performance. In general, the difference in performance between the simulated and measured radio maps was around a 30% increase in RMS error.
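Radio-map localisation of the kind compared here is commonly done by fingerprint matching; a minimal nearest-neighbour sketch (the locations and signal strengths below are invented for illustration, and the paper's two localisation methods may differ):

```python
import math

# Hypothetical radio map: location (x, y) in metres -> received signal
# strength vector in dBm, one entry per access point.
RADIO_MAP = {
    (0.0, 0.0): [-40.0, -70.0],
    (5.0, 0.0): [-70.0, -40.0],
}

def locate(radio_map, observed_rss):
    """Return the map location whose stored RSS vector is closest
    (Euclidean distance) to the observed vector."""
    return min(radio_map, key=lambda loc: math.dist(radio_map[loc], observed_rss))

def rms_error(errors_m):
    """Root-mean-square of per-fix positioning errors (metres)."""
    return math.sqrt(sum(e * e for e in errors_m) / len(errors_m))

print(locate(RADIO_MAP, [-42.0, -68.0]))  # (0.0, 0.0)
```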

Relevance: 30.00%

Abstract:

A methodology to estimate the cost implications of design decisions by integrating cost as a design parameter at an early design stage is presented. The model is developed on a hierarchical basis, the manufacturing cost of aircraft fuselage panels being analysed in this paper. The manufacturing cost modelling is original and relies on a genetic-causal method where the drivers of each element of cost are identified relative to the process capability. The cost model is then extended to life cycle costing by computing the Direct Operating Cost as a function of acquisition cost and fuel burn, and coupled with a semi-empirical numerical analysis using Engineering Sciences Data Unit reference data to model the structural integrity of the fuselage shell with regard to material failure and various modes of buckling. The main finding of the paper is that the traditional minimum weight condition is a dated and sub-optimal approach to airframe structural design.
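The life-cycle extension described above computes Direct Operating Cost from acquisition cost and fuel burn; a minimal, hedged sketch of that decomposition (illustrative figures only, omitting the structural-integrity coupling the paper performs):

```python
def direct_operating_cost(acquisition_cost, service_years, flights_per_year,
                          fuel_burn_kg, fuel_price_per_kg):
    """Per-flight DOC as amortised acquisition cost plus fuel cost.
    A lighter structure cuts fuel burn but may raise acquisition cost,
    which is why minimum weight and minimum DOC generally differ."""
    amortised = acquisition_cost / (service_years * flights_per_year)
    return amortised + fuel_burn_kg * fuel_price_per_kg

# Illustrative: GBP 1M of panel acquisition cost over 20 years x 500
# flights/year, 1,000 kg of attributable fuel burn at GBP 0.50/kg.
print(direct_operating_cost(1_000_000.0, 20, 500, 1_000.0, 0.5))  # 600.0
```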

Relevance: 30.00%

Abstract:

Query processing over the Internet involving autonomous data sources is a major task in data integration. It requires the estimated costs of possible queries in order to select the best one that has the minimum cost. In this context, the cost of a query is affected by three factors: network congestion, server contention state, and complexity of the query. In this paper, we study the effects of both the network congestion and server contention state on the cost of a query. We refer to these two factors together as system contention states. We present a new approach to determining the system contention states by clustering the costs of a sample query. For each system contention state, we construct two cost formulas for unary and join queries respectively using the multiple regression process. When a new query is submitted, its system contention state is estimated first using either the time slides method or the statistical method. The cost of the query is then calculated using the corresponding cost formulas. The estimated cost of the query is further adjusted to improve its accuracy. Our experiments show that our methods can produce quite accurate cost estimates of the submitted queries to remote data sources over the Internet.
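The per-state cost formulas described above are regressions fitted to sampled query costs; a minimal sketch with a single predictor (the paper uses multiple regression over several query variables, and the sample values here are invented):

```python
def fit_cost_formula(samples):
    """Least-squares fit of cost = a + b * size for one system
    contention state. samples: (query_size, observed_cost) pairs."""
    n = len(samples)
    sx = sum(s for s, _ in samples)
    sy = sum(c for _, c in samples)
    sxx = sum(s * s for s, _ in samples)
    sxy = sum(s * c for s, c in samples)
    b = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    a = (sy - b * sx) / n
    return a, b

def estimate_cost(formulas, state, query_size):
    """Estimate a new query's cost from the formula fitted for the
    estimated system contention state."""
    a, b = formulas[state]
    return a + b * query_size

# Samples lying on cost = 1 + 2 * size, labelled with one contention state
formulas = {"busy": fit_cost_formula([(1, 3.0), (2, 5.0), (3, 7.0)])}
print(estimate_cost(formulas, "busy", 10))  # 21.0
```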

Relevance: 30.00%

Abstract:

A techno-economic model of an autonomous wave-powered desalination plant is developed and indicates that fresh water can be produced for as little as £0.45/m³. The advantages of an autonomous wave-powered desalination plant are also discussed, indicating that the real value of the system is enhanced by its flexibility for deployment and reduced environmental impact. The modelled plant consists of the Oyster wave energy converter, conventional reverse osmosis membranes and a pressure exchanger–intensifier for energy recovery. A time-domain model of the plant is produced using wave-tank experimentation to calibrate the model of Oyster, manufacturer's data for the model of the reverse osmosis membranes and a hydraulic model of the pressure exchanger–intensifier. The economic model of the plant uses best-estimate cost data, which are reduced to annualised costs to facilitate the calculation of the cost of water. Finally, the barriers to the deployment of this technology are discussed, but they are not considered insurmountable.
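The economic step described above, reducing capital costs to annualised costs and dividing by output, can be sketched with a standard capital recovery factor; a minimal illustration (the discount rate, lifetime and cost figures below are invented, not the paper's best-estimate data):

```python
def annualised_capital(capex, discount_rate, lifetime_years):
    """Spread a capital cost into equal annual payments using the
    capital recovery factor."""
    r, n = discount_rate, lifetime_years
    crf = r * (1 + r) ** n / ((1 + r) ** n - 1)
    return capex * crf

def cost_of_water(capex, opex_per_year, discount_rate, lifetime_years,
                  output_m3_per_year):
    """(annualised capital + annual O&M) / annual fresh-water output."""
    annual = annualised_capital(capex, discount_rate, lifetime_years) + opex_per_year
    return annual / output_m3_per_year

# Invented example: GBP 5M capex, GBP 200k/yr O&M, 8% over 20 years,
# 1,000,000 m3 of fresh water per year.
print(round(cost_of_water(5_000_000.0, 200_000.0, 0.08, 20, 1_000_000.0), 2))
```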

Relevance: 30.00%

Abstract:

Objectives: The Secondary Prevention of Heart disEase in geneRal practicE (SPHERE) trial has recently reported. This study examines the cost-effectiveness of the SPHERE intervention in both healthcare systems on the island of Ireland.

Methods: Incremental cost-effectiveness analysis. A probabilistic model was developed to combine within-trial and beyond-trial impacts of treatment to estimate the lifetime costs and benefits of two secondary prevention strategies: Intervention - tailored practice and patient care plans; and Control - standardized usual care.

Results: The intervention strategy resulted in mean cost savings per patient of €512.77 (95 percent confidence interval [CI], €91.98 to €1086.46) and an increase in mean quality-adjusted life-years (QALYs) per patient of 0.0051 (95 percent CI, -0.0101 to 0.0200), when compared with the control strategy. The probability of the intervention being cost-effective was 94 percent if decision makers are willing to pay €45,000 per additional QALY.

Conclusions: Decision makers in both settings must determine whether the level of evidence presented is sufficient to justify the adoption of the SPHERE intervention in clinical practice. Copyright © Cambridge University Press 2010.
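The 94 percent figure comes from the probabilistic analysis: across simulated parameter draws, count the share in which the intervention has positive net monetary benefit at the stated willingness-to-pay. A minimal sketch (the three simulation draws are invented; only the €45,000 threshold and the trial point estimates come from the abstract):

```python
def net_monetary_benefit(delta_cost, delta_qalys, wtp):
    """NMB = wtp * QALY gain - incremental cost; positive means
    cost-effective at that threshold. A cost saving is a negative
    incremental cost."""
    return wtp * delta_qalys - delta_cost

def prob_cost_effective(draws, wtp):
    """Share of probabilistic draws with positive NMB (one point on a
    cost-effectiveness acceptability curve)."""
    wins = sum(1 for dc, dq in draws if net_monetary_benefit(dc, dq, wtp) > 0)
    return wins / len(draws)

# Trial point estimate: EUR 512.77 saved, 0.0051 QALYs gained per patient
print(net_monetary_benefit(-512.77, 0.0051, 45_000) > 0)  # True
```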

Relevance: 30.00%

Abstract:

Background: There is growing interest in the potential utility of molecular diagnostics in improving the detection of life-threatening infection (sepsis). LightCycler® SeptiFast is a multipathogen, probe-based real-time PCR system targeting DNA sequences of bacteria and fungi present in blood samples, with results available within a few hours. We report here the protocol of the first systematic review of published clinical diagnostic accuracy studies of this technology when compared with blood culture in the setting of suspected sepsis.

Methods/design: Data sources: the Cochrane Database of Systematic Reviews, the Database of Abstracts of Reviews of Effects (DARE), the Health Technology Assessment Database (HTA), the NHS Economic Evaluation Database (NHSEED), The Cochrane Library, MEDLINE, EMBASE, ISI Web of Science, BIOSIS Previews, MEDION and the Aggressive Research Intelligence Facility Database (ARIF). Study selection: diagnostic accuracy studies that compare the real-time PCR technology with standard culture results performed on a patient's blood sample during the management of sepsis. Data extraction: three reviewers, working independently, will determine the level of evidence, methodological quality and a standard data set relating to demographics and diagnostic accuracy metrics for each study. Statistical analysis/data synthesis: heterogeneity of studies will be investigated using a coupled forest plot of sensitivity and specificity and a scatter plot in Receiver Operating Characteristic (ROC) space. The bivariate model method will be used to estimate summary sensitivity and specificity. The authors will investigate reporting biases using funnel plots based on effective sample size and regression tests of asymmetry. Subgroup analyses are planned for adults, children and infection setting (hospital vs community) if sufficient data are uncovered.

Dissemination: Recommendations will be made to the Department of Health (as part of an open-access HTA report) as to whether the real-time PCR technology has sufficient clinical diagnostic accuracy potential to move forward to efficacy testing during the provision of routine clinical care.
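Each included study contributes a 2x2 table of PCR results against blood culture, from which the sensitivity and specificity being pooled are computed; a minimal sketch with invented counts:

```python
def sensitivity_specificity(tp, fp, fn, tn):
    """Diagnostic accuracy of an index test (here, real-time PCR)
    against a reference standard (here, blood culture).

    tp/fn: culture-positive samples the test did / did not detect
    tn/fp: culture-negative samples the test did / did not clear
    """
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    return sensitivity, specificity

# Invented 2x2 counts for illustration
print(sensitivity_specificity(tp=80, fp=10, fn=20, tn=90))  # (0.8, 0.9)
```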

Relevance: 30.00%

Abstract:

Diagnostic accuracy and management recommendations of real-time teledermatology consultations using low-cost telemedicine equipment were evaluated. Patients were seen by a dermatologist over a video-link, and a diagnosis and treatment plan were recorded. This was followed by a face-to-face consultation on the same day to confirm the earlier diagnosis and management plan. A total of 351 patients with 427 diagnoses participated. Sixty-seven per cent of the diagnoses made over the video-link agreed with the face-to-face diagnosis. Clinical management plans were recorded for 214 patients with 252 diagnoses. For this cohort, 44% of the patients were seen by the same dermatologist at both consultations, while 56% were seen by a different dermatologist. In 64% of cases the same management plan was recommended at both consultations; a sub-optimum treatment plan was recommended in 8% of cases; and in 9% of cases the video-link management plans were judged to be inappropriate. In 20% of cases the dermatologist was unable to recommend a suitable management plan by video-link. There were significant differences in the ability to recommend an optimum management plan by video-link when a different dermatologist made the reference management plan. The results indicate that a high proportion of dermatological conditions can be successfully managed by real-time teledermatology.
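The headline agreement figures are simple concordance rates between the video-link and face-to-face assessments; a minimal sketch (the diagnosis labels are invented):

```python
def percent_agreement(video_link_dx, face_to_face_dx):
    """Share of diagnoses (%) where the video-link consultation matched
    the same-day face-to-face reference diagnosis."""
    pairs = list(zip(video_link_dx, face_to_face_dx))
    matches = sum(1 for v, f in pairs if v == f)
    return 100.0 * matches / len(pairs)

# Invented example: 2 of 4 video-link diagnoses match face-to-face
print(percent_agreement(["eczema", "acne", "psoriasis", "urticaria"],
                        ["eczema", "rosacea", "psoriasis", "tinea"]))  # 50.0
```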