84 results for Value analysis (Cost control)
Abstract:
Background: Recently, both the UK and US governments have advocated the use of financial incentives to encourage healthier lifestyle choices, but evidence for the cost-effectiveness of such interventions is lacking. Our aim was to perform a cost-effectiveness analysis (CEA) of a quasi-experimental trial exploring the use of financial incentives to increase employee physical activity levels, from healthcare and employer perspectives.
Methods: Employees used a ‘loyalty card’ to objectively monitor their physical activity at work over 12 weeks. The Incentive Group (n=199) collected points and received rewards for minutes of physical activity completed. The No Incentive Group (n=207) self-monitored their physical activity only. Quality of life (QOL) and absenteeism were assessed at baseline and at 6 months follow-up. QOL scores were also converted into productivity estimates using a validated algorithm. The additional costs of the Incentive Group were divided by the additional quality-adjusted life years (QALYs) or productivity gained to calculate incremental cost-effectiveness ratios (ICERs). Cost-effectiveness acceptability curves (CEACs) and the population expected value of perfect information (EVPI) were used to characterize and value the uncertainty in our estimates.
Results: The Incentive Group performed more physical activity over 12 weeks and by 6 months had achieved greater gains in QOL and productivity, although these mean differences were not statistically significant. The ICERs were £2,900/QALY and £2,700 per percentage increase in overall employee productivity. Whilst the confidence intervals surrounding these ICERs were wide, CEACs showed a high chance of the intervention being cost-effective at low willingness-to-pay (WTP) thresholds.
Conclusions: The Physical Activity Loyalty card (PAL) scheme is potentially cost-effective from both a healthcare and employer’s perspective but further research is warranted to reduce uncertainty in our results. It is based on a sustainable “business model” which should become more cost-effective as it is delivered to more participants and can be adapted to suit other health behaviors and settings. This comes at a time when both UK and US governments are encouraging business involvement in tackling public health challenges.
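The ICER arithmetic described in the abstract (incremental cost divided by incremental effect) can be sketched in a few lines. The figures below are invented purely to illustrate the calculation; they are not the trial's data:

```python
def icer(cost_new, cost_old, effect_new, effect_old):
    """Incremental cost-effectiveness ratio: extra cost per extra
    unit of effect (e.g. pounds per QALY gained)."""
    return (cost_new - cost_old) / (effect_new - effect_old)

# Hypothetical numbers: the intervention costs £290 more per employee
# and yields 0.1 additional QALYs over the comparator.
ratio = icer(1290.0, 1000.0, 0.6, 0.5)  # roughly £2,900 per QALY
```

A decision is then made by comparing the ICER to a willingness-to-pay threshold, which is exactly what the CEACs in the abstract characterize under uncertainty.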
Abstract:
Reducing wafer metrology continues to be a major target in semiconductor manufacturing efficiency initiatives, as it is a high-cost, non-value-added operation that impacts cycle time and throughput. However, metrology cannot be eliminated completely given the important role it plays in process monitoring and advanced process control. To achieve the required manufacturing precision, measurements are typically taken at multiple sites across a wafer. The selection of these sites is usually based on a priori knowledge of wafer failure patterns and spatial variability, with additional sites added over time in response to process issues. As a result, significant redundancy often exists in the wafer measurement plans of mature processes. This paper proposes a novel methodology based on Forward Selection Component Analysis (FSCA) for analyzing historical metrology data in order to determine the minimum set of wafer sites needed for process monitoring. The paper also introduces a virtual metrology (VM) based approach for reconstructing the complete wafer profile from the optimal sites identified by FSCA. The proposed methodology is tested and validated on a wafer manufacturing metrology dataset. © 2012 IEEE.
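The idea behind the two steps above (greedily choosing the measurement sites that best explain the full metrology matrix, then regressing the remaining sites onto them for the VM-style reconstruction) can be illustrated with a minimal sketch. The function names and the plain least-squares formulation are my own simplification, not the paper's FSCA algorithm:

```python
import numpy as np

def forward_select_sites(X, k):
    """Greedy forward selection: pick k columns (wafer sites) of
    X (wafers x sites) whose least-squares reconstruction of the
    full matrix has the smallest error."""
    n, p = X.shape
    selected = []
    for _ in range(k):
        best_j, best_err = None, np.inf
        for j in range(p):
            if j in selected:
                continue
            S = X[:, selected + [j]]
            # reconstruct every site from the candidate subset
            coef, *_ = np.linalg.lstsq(S, X, rcond=None)
            err = np.linalg.norm(X - S @ coef)
            if err < best_err:
                best_j, best_err = j, err
        selected.append(best_j)
    return selected

def reconstruct(X_train, selected, x_measured):
    """'Virtual metrology' step: predict the full wafer profile
    from measurements at the selected sites only."""
    S = X_train[:, selected]
    coef, *_ = np.linalg.lstsq(S, X_train, rcond=None)
    return x_measured @ coef
```

With highly redundant data (e.g. a low-rank site matrix), a handful of selected sites suffices to reconstruct the whole profile, which is the redundancy the paper exploits.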
Abstract:
BACKGROUND: Despite vaccines and improved medical intensive care, clinicians must remain vigilant for possible Meningococcal Disease in children. The objective was to establish whether the procalcitonin test is a cost-effective adjunct for detecting prodromal Meningococcal Disease in children presenting at the emergency department with fever without source.
METHODS AND FINDINGS: Data to evaluate procalcitonin, C-reactive protein and white cell count tests as indicators of Meningococcal Disease were collected from six independent studies identified through a systematic literature search applying PRISMA guidelines. The data included 881 children with fever without source in developed countries. The optimal cut-off value for each of the procalcitonin, C-reactive protein and white cell count tests, as an indicator of Meningococcal Disease, was determined. Summary receiver operating characteristic (SROC) curve analysis determined the overall diagnostic performance of each test with 95% confidence intervals. A decision analytic model was designed to reflect realistic clinical pathways for a child presenting with fever without source by comparing two diagnostic strategies: standard testing using combined C-reactive protein and white cell count tests, versus standard testing plus the procalcitonin test. The costs of each of the four diagnosis groups (true positive, false negative, true negative and false positive) were assessed from a National Health Service payer perspective. The procalcitonin test was more accurate (sensitivity = 0.89, 95% CI 0.76–0.96; specificity = 0.74, 95% CI 0.4–0.92) for early Meningococcal Disease than standard testing alone (sensitivity = 0.47, 95% CI 0.32–0.62; specificity = 0.8, 95% CI 0.64–0.9). Decision analytic model outcomes indicated that the incremental cost-effectiveness ratio for the base case was −£8,137.25 (−US$13,371.94) per correctly treated patient.
CONCLUSIONS: Procalcitonin plus the standard recommended tests improved the discriminatory ability for fatal Meningococcal Disease and was more cost-effective; it was also a superior biomarker in infants. Further research is recommended into point-of-care procalcitonin testing and Markov modelling to incorporate cost per QALY with a lifetime model.
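The decision analytic model's four-outcome structure (true/false positive, true/false negative, each with its own cost) reduces to a simple expected-value calculation per strategy. The sketch below uses placeholder costs and prevalence, not the study's NHS figures:

```python
def expected_cost(sens, spec, prevalence, c_tp, c_fn, c_tn, c_fp):
    """Expected cost per child for a diagnostic strategy, weighting
    the cost of each of the four outcome groups by its probability."""
    p_tp = prevalence * sens              # sick, correctly detected
    p_fn = prevalence * (1 - sens)        # sick, missed
    p_tn = (1 - prevalence) * spec        # well, correctly ruled out
    p_fp = (1 - prevalence) * (1 - spec)  # well, falsely flagged
    return p_tp * c_tp + p_fn * c_fn + p_tn * c_tn + p_fp * c_fp

# Illustrative comparison with invented costs: the more sensitive
# strategy avoids expensive false negatives, so it can be cheaper
# overall even if the test itself costs more.
pct = expected_cost(0.89, 0.74, 0.05, 500, 20000, 10, 800)
std = expected_cost(0.47, 0.80, 0.05, 450, 20000, 5, 750)
```

A negative ICER of the kind reported above arises when one strategy is both cheaper in expectation and treats more patients correctly, i.e. it dominates.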
Abstract:
Teachers frequently struggle to cope with conduct problems in the classroom. The aim of this study was to assess the effectiveness of the Incredible Years Teacher Classroom Management Training Programme for improving teacher competencies and child adjustment. The study involved a group randomised controlled trial which included 22 teachers and 217 children (102 boys and 115 girls). The average age of children included in the study was 5.3 years (standard deviation = 0.89). Teachers were randomly allocated to an intervention group (n = 11 teachers; 110 children) or a waiting-list control group (n = 11 teachers; 107 children). The sample also included 63 ‘high-risk’ children (33 intervention; 30 control), who scored above the cut-off (>12) on the Strengths and Difficulties Questionnaire for abnormal socioemotional and behavioural difficulties. Teacher and child behaviours were assessed at baseline and 6 months later using psychometric and observational measures. Programme delivery costs were also analysed. Results showed favourable changes in teachers’ self-reported use of both positive classroom management strategies (effect size = 0.56) and negative classroom management strategies (effect size = −0.43). Teacher reports also highlighted improvements in the classroom behaviour of the high-risk group of children, and the estimated cost of delivering the programme was modest. However, analyses of teacher and child observations were largely non-significant, indicating a need for further research into the effectiveness and cost-effectiveness of the Incredible Years Teacher Classroom Management Training Programme.
Abstract:
The primary intention of this paper is to review the current state of the art in engineering cost modelling as applied to aerospace. This is a topic of current interest, and in addressing the literature the presented work also sets out some of the recognised definitions of cost that relate to the engineering domain. The paper does not attempt to address the higher-level financial sector but rather focuses on the costing issues directly relevant to the engineering process, primarily those of design and manufacture. This is of contemporary interest as there is now a shift towards analysing the influence of cost, defined in more engineering-related terms, in an attempt to link into integrated product and process development (IPPD) within a concurrent engineering environment. Consequently, the cost definitions are reviewed in the context of the nature of cost as applicable to the engineering process stages: from bidding through design, manufacture and procurement to, ultimately, operation. The linkage and integration of design and manufacture is addressed in some detail. This leads naturally to the concept of engineers influencing and controlling cost within their own domain rather than entrusting this to financiers who have little control over the causes of cost. In terms of influence, the engineer creates the potential for cost, and in a concurrent environment this requires models that integrate cost into the decision-making process.
Abstract:
Damping torque analysis is a well-developed technique for understanding and studying power system oscillations. This paper presents two example applications of damping torque analysis to damping control implemented at a DC bus in power transmission networks. The first example investigates the damping effect of shunt VSC (Voltage Source Converter) based FACTS voltage control, i.e., STATCOM (Static Synchronous Compensator) voltage control. It is shown that STATCOM voltage control mainly contributes synchronizing torque and hence has little effect on the damping of power system oscillations. The second example is damping control implemented by a Battery Energy Storage System (BESS) installed in a power system. Damping torque analysis reveals that BESS damping control exhibits different properties depending on whether it is realized by regulating the exchange of active power or of reactive power between the BESS and the power system. It is concluded from damping torque analysis that BESS damping control implemented by regulating active power is better, with less interaction with BESS voltage control and more robustness to variations in power system operating conditions. All analytical conclusions obtained are demonstrated by simulation results for example power systems.
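For readers unfamiliar with the technique, damping torque analysis rests on the standard small-signal decomposition of a generator's electrical torque deviation into two components (this is textbook notation, not an equation from the paper):

```latex
\Delta T_e = K_S \,\Delta\delta + K_D \,\Delta\omega
```

where $\Delta\delta$ is the rotor angle deviation and $\Delta\omega$ the rotor speed deviation. The synchronizing torque coefficient $K_S$ governs the oscillation frequency and steady-state stability, while the damping torque coefficient $K_D$ governs how quickly oscillations decay. A controller that adds mainly to $K_S$, as the paper finds for STATCOM voltage control, shifts the oscillation frequency but contributes little damping.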
Abstract:
Query processing over the Internet involving autonomous data sources is a major task in data integration. It requires estimating the costs of possible queries in order to select the one with the minimum cost. In this context, the cost of a query is affected by three factors: network congestion, server contention state, and the complexity of the query. In this paper, we study the effects of network congestion and server contention state on the cost of a query; we refer to these two factors together as system contention states. We present a new approach to determining the system contention states by clustering the costs of a sample query. For each system contention state, we construct two cost formulas, for unary and join queries respectively, using multiple regression. When a new query is submitted, its system contention state is estimated first, using either the time slides method or the statistical method. The cost of the query is then calculated using the corresponding cost formulas, and the estimate is further adjusted to improve its accuracy. Our experiments show that our methods produce quite accurate cost estimates for queries submitted to remote data sources over the Internet.
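The per-state regression idea above can be sketched compactly: once sample-query costs have been clustered into contention states, a separate cost formula is fitted per state and used at query time. The single-feature linear fit below is my own simplification of the paper's multiple-regression formulas (which treat unary and join queries separately):

```python
import numpy as np

def fit_state_models(costs, complexity, state_labels):
    """Fit one linear cost formula per system contention state.
    costs: observed elapsed times of sample queries
    complexity: one complexity measure per query (e.g. result size)
    state_labels: contention-state id per sample (e.g. from clustering)
    Returns {state: (intercept, slope)}."""
    costs = np.asarray(costs, dtype=float)
    complexity = np.asarray(complexity, dtype=float)
    state_labels = np.asarray(state_labels)
    models = {}
    for s in np.unique(state_labels):
        mask = state_labels == s
        slope, intercept = np.polyfit(complexity[mask], costs[mask], 1)
        models[int(s)] = (intercept, slope)
    return models

def estimate_cost(models, state, complexity):
    """Estimated query cost under the given contention state."""
    intercept, slope = models[state]
    return intercept + slope * complexity
```

At runtime the current contention state would first be identified (the step the paper handles with the time slides or statistical method) before the matching formula is applied.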