895 results for cost model


Relevance: 30.00%

Abstract:

Background: The Lescol Intervention Prevention Study (LIPS) was a multinational randomized controlled trial that showed a 47% reduction in the relative risk of cardiac death and a 22% reduction in major adverse cardiac events (MACEs) from the routine use of fluvastatin, compared with controls, in patients undergoing percutaneous coronary intervention (PCI, defined as angioplasty with or without stents). In this study, MACEs included cardiac death, nonfatal myocardial infarction, and subsequent PCI and coronary artery bypass graft. Diabetes was the greatest risk factor for MACEs. Objective: This study estimated the cost-effectiveness of fluvastatin when used for secondary prevention of MACEs after PCI in people with diabetes. Methods: A post hoc subgroup analysis of patients with diabetes from the LIPS was used to estimate the effectiveness of fluvastatin in reducing myocardial infarction, revascularization, and cardiac death. A probabilistic Markov model was developed using United Kingdom resource and cost data to estimate the additional costs and quality-adjusted life-years (QALYs) gained over 10 years from the perspective of the British National Health Service. The model contained 6 health states, and the transition probabilities were derived from the LIPS data. Crossover from fluvastatin to other lipid-lowering drugs, withdrawal from fluvastatin, and the use of lipid-lowering drugs in the control group were included. Results: In the subgroup of 202 patients with diabetes in the LIPS trial, 18 (15.0%) of 120 fluvastatin patients and 21 (25.6%) of 82 control participants were insulin dependent (P = NS). Compared with the control group, patients treated with fluvastatin can expect to gain an additional mean (SD) of 0.196 (0.139) QALY per patient over 10 years (P < 0.001) and will cost the health service an additional mean (SD) of £10 (£448) (P = NS) (mean [SD] US $16 [$689]). The additional cost per QALY gained was £51 (US $78). The key determinants of cost-effectiveness included the probabilities of repeat interventions, cardiac death, the cost of fluvastatin, and the time horizon used for the evaluation. Conclusion: Fluvastatin was an economically efficient treatment to prevent MACEs in these patients with diabetes undergoing PCI.
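
As an illustration of the modelling approach described above, the following Python sketch implements a minimal probabilistic Markov cohort model with a probabilistic sensitivity analysis. It uses a reduced 3-state structure with hypothetical utilities, costs, drug cost and beta-distribution parameters; none of these are the LIPS inputs, which the abstract does not report.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical 3-state cohort model (the LIPS model had 6 states; its
# transition probabilities, utilities and costs are not in the abstract).
utilities = np.array([0.80, 0.65, 0.0])        # QALY weight per state-year
annual_cost = np.array([600.0, 1800.0, 0.0])   # state cost, GBP

def simulate(p_mace, p_death, drug_cost=0.0, years=10):
    """Return (QALYs, cost) per patient over the time horizon."""
    P = np.array([
        [1 - p_mace - p_death, p_mace, p_death],   # event-free
        [0.0, 1 - 2 * p_death, 2 * p_death],       # post-MACE: higher mortality
        [0.0, 0.0, 1.0],                           # dead (absorbing)
    ])
    occupancy = np.array([1.0, 0.0, 0.0])          # all start event-free
    qalys = cost = 0.0
    for _ in range(years):
        occupancy = occupancy @ P
        qalys += occupancy @ utilities
        cost += occupancy @ annual_cost + occupancy[:2].sum() * drug_cost
    return qalys, cost

# Probabilistic sensitivity analysis: sample transition probabilities from
# hypothetical beta distributions for the treated and control cohorts.
results = []
for _ in range(1000):
    p_mace_ctrl = rng.beta(20, 180)                # ~10%/yr MACE, control
    p_mace_tx = p_mace_ctrl * rng.beta(78, 22)     # ~22% relative risk cut
    p_death = rng.beta(5, 495)
    q_tx, c_tx = simulate(p_mace_tx, p_death, drug_cost=60.0)
    q_ctrl, c_ctrl = simulate(p_mace_ctrl, p_death)
    results.append((q_tx - q_ctrl, c_tx - c_ctrl))

dq, dc = np.mean(results, axis=0)
print(f"incremental QALYs {dq:.3f}, incremental cost £{dc:.0f} per patient")
print(f"cost per QALY gained: £{dc / dq:.0f}")
```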

Relevance: 30.00%

Abstract:

Objective: Existing evidence suggests that vocational rehabilitation services, in particular individual placement and support (IPS), are effective in assisting people with schizophrenia and related conditions to gain open employment. Despite this, such services are not available to all unemployed people with schizophrenia who wish to work. The evidence also suggests that while IPS confers no clinical advantages over routine care, it does improve the proportion of people returning to employment. The objective of the current study is to investigate the net benefit of introducing IPS services into current mental health services in Australia. Method: The net benefit of IPS is assessed from a health sector perspective using cost-benefit analysis. A two-stage approach is taken to the assessment of benefit. The first stage involves a quantitative analysis of the net benefit, defined as the benefits of IPS (comprising transfer payments averted, income tax accrued and individual income earned) minus the costs. The second stage involves application of 'second-filter' criteria (including equity, strength of evidence, feasibility and acceptability to stakeholders) to the results. The robustness of the results is tested using multivariate probabilistic sensitivity analysis. Results: The costs of IPS are $A10.3M (95% uncertainty interval $A7.4M-$A13.6M) and the benefits are $A4.7M ($A3.1M-$A6.5M), resulting in a negative net benefit of $A5.6M ($A3.4M-$A8.4M). Conclusions: The current analysis suggests that IPS costs are greater than the monetary benefits. However, the evidence base of the current analysis is weak. Structural conditions surrounding welfare payments in Australia create disincentives to full-time employment for people with disabilities.
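
A minimal sketch of the stage-one calculation, assuming hypothetical lognormal uncertainty distributions tuned only to the point estimates quoted above (the paper's actual distributions are not given here):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000  # Monte Carlo draws

# Hypothetical lognormal distributions, roughly matching the reported
# point estimates: costs ~A$10.3M, benefits ~A$4.7M.
costs = rng.lognormal(mean=np.log(10.3), sigma=0.15, size=n)     # A$M
benefits = rng.lognormal(mean=np.log(4.7), sigma=0.18, size=n)   # A$M

# Benefits comprise transfer payments averted, income tax accrued and
# individual income earned; net benefit is benefits minus costs.
net_benefit = benefits - costs

lo, hi = np.percentile(net_benefit, [2.5, 97.5])
print(f"median net benefit A${np.median(net_benefit):.1f}M "
      f"(95% uncertainty interval A${lo:.1f}M to A${hi:.1f}M)")
print(f"P(net benefit > 0) = {(net_benefit > 0).mean():.3f}")
```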

Relevance: 30.00%

Abstract:

Objective: Antidepressant drugs and cognitive-behavioural therapy (CBT) are effective treatment options for depression and are recommended by clinical practice guidelines. As part of the Assessing Cost-effectiveness - Mental Health project, we evaluate the available evidence on the costs and benefits of CBT and drugs in the episodic and maintenance treatment of major depression. Method: Cost-effectiveness is modelled from a health-care perspective as the cost per disability-adjusted life year (DALY). Interventions are targeted at people with major depression who currently seek care but receive non-evidence-based treatment. Uncertainty in model inputs is tested using Monte Carlo simulation methods. Results: All interventions for major depression examined have a favourable incremental cost-effectiveness ratio under Australian health service conditions. Bibliotherapy, group CBT, individual CBT by a psychologist on a public salary, and tricyclic antidepressants (TCAs) are very cost-effective treatment options, falling below $A10 000 per DALY even when taking the upper limit of the uncertainty interval into account. Maintenance treatment with selective serotonin re-uptake inhibitors (SSRIs) is the most expensive option (ranging from $A17 000 to $A20 000 per DALY) but is still well below $A50 000, the threshold generally considered affordable. Conclusions: A range of cost-effective interventions for episodes of major depression exists and is currently underutilized. Maintenance treatment strategies are required to significantly reduce the burden of depression, but the cost of long-term drug treatment for the large number of depressed people is high if SSRIs are the drug of choice. Key policy issues with regard to expanded provision of CBT concern the availability of suitably trained providers and the funding mechanisms for therapy in primary care.
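
The cost-per-DALY comparison with Monte Carlo uncertainty can be sketched as below; the gamma distributions and their parameters are hypothetical stand-ins, not the model's actual inputs:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 10_000

# Hypothetical inputs for one intervention vs. current care
# (the paper's actual cost and DALY distributions are not given here).
extra_cost = rng.gamma(shape=16, scale=50_000, size=n)   # A$, incremental cost
dalys_averted = rng.gamma(shape=25, scale=4, size=n)     # DALYs averted

icer = extra_cost / dalys_averted                        # A$ per DALY averted

lo, hi = np.percentile(icer, [2.5, 97.5])
print(f"median ICER A${np.median(icer):,.0f}/DALY (95% UI A${lo:,.0f}-A${hi:,.0f})")
print(f"P(ICER < A$50,000/DALY) = {(icer < 50_000).mean():.3f}")
```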

Relevance: 30.00%

Abstract:

The present paper articulates a model in which ingroup and outgroup norms inform 'rational' decision-making (cost-benefit analysis) for conflict behaviors. Norms influence perceptions of the consequences of the behavior, and individuals may thus strategically conform to or violate norms in order to acquire benefits and avoid costs. Two studies demonstrate these processes in the context of conflict in Quebec. In the first study, Anglophones' perceptions of Francophone and Anglophone norms for pro-English behaviors predicted evaluations of the benefits and costs of the behaviors, and these cost-benefit evaluations in turn mediated the norm-intention links for both group norms. In the second study, a manipulated focus on supportive versus hostile ingroup and outgroup norms also predicted cost-benefit evaluations, which mediated the norm-intention relationships. The studies support a model of strategic conflict choices in which group norms inform, rather than suppress, rational expectancy value processes. Implications for theories of decision-making and normative influence are discussed.
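
The mediation logic (group norm to cost-benefit evaluation to intention) can be illustrated with a toy simulation and ordinary least squares in the Baron-and-Kenny style; the data and effect sizes below are simulated, not the studies' results:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
n = 300

# Simulated data: perceived group norm support for the behavior drives
# cost-benefit evaluations, which in turn drive behavioral intentions.
norm = rng.normal(size=n)                          # perceived group norm
cost_benefit = 0.6 * norm + rng.normal(scale=0.8, size=n)
intention = 0.5 * cost_benefit + 0.1 * norm + rng.normal(scale=0.8, size=n)

# Path a: norm -> cost-benefit evaluation
a = sm.OLS(cost_benefit, sm.add_constant(norm)).fit().params[1]
# Paths b and c': intention regressed on cost-benefit and norm jointly
X = sm.add_constant(np.column_stack([cost_benefit, norm]))
fit = sm.OLS(intention, X).fit()
b, c_prime = fit.params[1], fit.params[2]
# Total effect c: intention on norm alone
c = sm.OLS(intention, sm.add_constant(norm)).fit().params[1]

print(f"indirect (mediated) effect a*b = {a * b:.3f}")
print(f"direct effect c' = {c_prime:.3f}, total effect c = {c:.3f}")
```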

Relevance: 30.00%

Abstract:

Despite the insight gained from 2-D particle models, and given that the dynamics of crustal faults occur in 3-D space, the question remains: how do 3-D fault gouge dynamics differ from those in 2-D? Traditionally, 2-D modeling has been preferred over 3-D simulations because of the computational cost of solving 3-D problems. However, modern high performance computing architectures, combined with a parallel implementation of the Lattice Solid Model (LSM), provide the opportunity to explore 3-D fault micro-mechanics and to advance understanding of effective constitutive relations of fault gouge layers. In this paper, macroscopic friction values from 2-D and 3-D LSM simulations, performed on an SGI Altix 3700 super-cluster, are compared. Two rectangular elastic blocks of bonded particles, with a rough fault plane and separated by a region of randomly sized non-bonded gouge particles, are sheared in opposite directions by normally-loaded driving plates. The results demonstrate that the gouge particles in the 3-D models undergo significant out-of-plane motion during shear. The 3-D models also exhibit a higher mean macroscopic friction than the 2-D models for varying values of interparticle friction. 2-D LSM gouge models have previously been shown to exhibit accelerating energy release in simulated earthquake cycles, supporting the Critical Point hypothesis. The 3-D models are shown to also display accelerating energy release, and good fits of power law time-to-failure functions to the cumulative energy release are obtained.
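
A common form for the accelerating-energy-release fit is a power-law time-to-failure function, Omega(t) = A + B(t_f - t)^m with B < 0 and 0 < m < 1. The sketch below fits this form to synthetic data with scipy; the parameter values are hypothetical:

```python
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(3)

# Power-law time-to-failure function for cumulative energy release:
#   Omega(t) = A + B * (t_f - t)**m, with B < 0 and 0 < m < 1,
# so release accelerates as t approaches the failure time t_f.
def omega(t, A, B, tf, m):
    return A + B * (tf - t) ** m

# Synthetic "cumulative energy release" observed up to t = 95, failure at 100.
t = np.linspace(0.0, 95.0, 200)
clean = omega(t, A=100.0, B=-8.0, tf=100.0, m=0.4)
observed = clean + rng.normal(scale=0.5, size=t.size)

# Fit, constraining t_f to lie beyond the observed window and 0 < m < 1.
popt, _ = curve_fit(
    omega, t, observed,
    p0=[90.0, -5.0, 105.0, 0.5],
    bounds=([0.0, -np.inf, 95.1, 0.01], [np.inf, 0.0, 200.0, 1.0]),
)
A_hat, B_hat, tf_hat, m_hat = popt
print(f"fitted failure time t_f = {tf_hat:.1f}, exponent m = {m_hat:.2f}")
```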

Relevance: 30.00%

Abstract:

A new methodology is proposed for the analysis of generation capacity investment in a deregulated market environment. The methodology performs the investment appraisal within a probabilistic framework. The probabilistic production costing (PPC) algorithm is used to compute the expected energy generated, taking into account system load variations and plant forced outage rates, while the Monte Carlo approach is applied to model the electricity price variability seen in a realistic network. The model is able to capture the price, and hence profitability, uncertainties facing generator companies. Seasonal variations in electricity prices and system demand are modeled independently. The method is validated on the IEEE RTS system, augmented with realistic market and plant data, by using it to compare the financial viability of several generator investments applying either conventional or directly connected generator (powerformer) technologies. The significance of the results is assessed using several financial risk measures.
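
The profitability-under-uncertainty idea can be sketched as follows. Expected energy is reduced here to a capacity factor and a forced outage rate (a stand-in for a full PPC calculation), seasonal prices are sampled Monte Carlo style, and risk measures are read off the simulated profit distribution; all plant and market figures are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(4)
n = 5_000   # Monte Carlo samples of one operating year

# Hypothetical 500 MW plant
capacity_mw = 500.0
forced_outage_rate = 0.08       # plant unavailable 8% of the time
capacity_factor = 0.75          # dispatch level given system load variations
fixed_cost = 40e6               # $/yr (capital recovery + fixed O&M)
marginal_cost = 25.0            # $/MWh

hours = 8760
energy_mwh = capacity_mw * hours * capacity_factor * (1 - forced_outage_rate)

# Seasonal price model: four seasonal mean prices with lognormal variability.
seasonal_means = np.array([45.0, 38.0, 55.0, 42.0])   # $/MWh, hypothetical
profits = np.empty(n)
for i in range(n):
    prices = seasonal_means * rng.lognormal(mean=0.0, sigma=0.25, size=4)
    revenue = (prices.mean() - marginal_cost) * energy_mwh
    profits[i] = revenue - fixed_cost

# Financial risk measures on the simulated profit distribution.
var_95 = np.percentile(profits, 5)          # 5th percentile: value-at-risk
print(f"expected profit ${profits.mean()/1e6:.1f}M/yr")
print(f"95% VaR (5th percentile) ${var_95/1e6:.1f}M/yr")
print(f"P(loss) = {(profits < 0).mean():.3f}")
```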

Relevance: 30.00%

Abstract:

Offshore oil and gas pipelines pose an environmental hazard: any leak or burst causes an oil/gas spill, with severe negative impacts on marine life. Breakdown maintenance of these pipelines is also cost-intensive and time-consuming, resulting in large tangible and intangible losses to pipeline operators. Pipeline health monitoring and integrity analysis have been researched extensively in support of successful pipeline operations, and risk-based maintenance models are one outcome of that research. This study develops a risk-based maintenance model, using a combined multiple-criteria decision-making and weighting method, for offshore oil and gas pipelines in Thailand, with the active participation of experienced executives. The model's effectiveness has been demonstrated through real-life application to oil and gas pipelines in the Gulf of Thailand. Practical implications: a risk-based inspection and maintenance methodology is particularly important for oil pipeline systems, as any failure not only affects productivity negatively but also has a tremendous negative environmental impact. The proposed model helps pipeline operators analyze the health of pipelines dynamically and select a specific inspection and maintenance method for each section in line with its probability and severity of failure.
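
The paper's criteria and expert weights are not reproduced in the abstract; the sketch below shows the general shape of such a weighted multi-criteria risk ranking, with hypothetical criteria, weights, scores and segment names:

```python
# Minimal sketch of a weighted multi-criteria risk ranking for pipeline
# segments (criteria, weights and scores below are hypothetical).
criteria_weights = {          # expert-derived weights, summing to 1
    "corrosion": 0.35,
    "third_party_damage": 0.25,
    "construction_defect": 0.20,
    "operational_error": 0.20,
}

# Failure-likelihood scores (0-1) per risk factor for each segment,
# and the estimated consequence (severity) of a failure there.
segments = {
    "KP 0-10":  {"scores": {"corrosion": 0.7, "third_party_damage": 0.2,
                            "construction_defect": 0.3, "operational_error": 0.1},
                 "severity": 0.9},   # near-shore: high environmental impact
    "KP 10-25": {"scores": {"corrosion": 0.4, "third_party_damage": 0.5,
                            "construction_defect": 0.2, "operational_error": 0.2},
                 "severity": 0.5},
}

def risk(segment):
    """Weighted failure likelihood times consequence severity."""
    p = sum(criteria_weights[c] * s for c, s in segment["scores"].items())
    return p * segment["severity"]

# Rank segments so inspection effort goes to the riskiest sections first.
for name, seg in sorted(segments.items(), key=lambda kv: -risk(kv[1])):
    print(f"{name}: risk score {risk(seg):.3f}")
```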

Relevance: 30.00%

Abstract:

The existing method of pipeline health monitoring, which requires an entire pipeline to be inspected periodically, is both time-consuming and expensive. A risk-based model that reduces the amount of time spent on inspection is presented. This model not only reduces the cost of maintaining petroleum pipelines, but also suggests an efficient design and operation philosophy, construction methodology and logical insurance plans. The risk-based model uses the Analytic Hierarchy Process (AHP), a multiple-attribute decision-making technique, to identify the factors that influence failure on specific segments and analyzes their effects by determining the probabilities of the risk factors. The severity of failure is determined through consequence analysis. From this, the effect of a failure caused by each risk factor can be established in terms of cost, and the cumulative effect of failure is determined through probability analysis. The technique does not totally eliminate subjectivity, but it is an improvement over the existing inspection method.
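
AHP derives criterion weights from a pairwise comparison matrix via its principal eigenvector, together with Saaty's consistency check. A minimal sketch, with a hypothetical comparison matrix for three failure factors:

```python
import numpy as np

# Hypothetical pairwise comparison matrix (Saaty 1-9 scale) for three
# risk factors: corrosion, external impact, construction defects.
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

# Priority weights = principal eigenvector of A, normalised to sum to 1.
eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)
weights = eigvecs[:, k].real
weights /= weights.sum()

# Saaty's consistency check: CI = (lambda_max - n) / (n - 1), CR = CI / RI.
n = A.shape[0]
lambda_max = eigvals.real[k]
ci = (lambda_max - n) / (n - 1)
ri = 0.58            # Saaty's random index for n = 3
print("weights:", np.round(weights, 3))
print(f"consistency ratio CR = {ci / ri:.3f} (should be < 0.10)")
```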

Relevance: 30.00%

Abstract:

In the UK, low vision rehabilitation is delivered by a wide variety of providers, with different strategies being used to integrate services from health, social care and the voluntary sector. In order to capture the current diversity of service provision, the Low Vision Service Model Evaluation (LOVSME) project aimed to profile selected low vision services using published standards for service delivery as a guide. Seven geographically and organizationally varied low-vision services across England were chosen for their diversity, and all agreed to participate. A series of questionnaires and follow-up visits were undertaken to obtain a comprehensive description of each service, including staff workloads and the cost of providing the service. In this paper the strengths of each model of delivery are discussed, and examples of good practice identified. As a result of the project, an Assessment Framework tool has been developed that aims to help other service providers evaluate different aspects of their own service to identify any gaps in existing provision, and will act as a benchmark for future service development.

Relevance: 30.00%

Abstract:

Based on recent advances in autonomic computing, we propose a methodology for the cost-effective development of self-managing systems starting from a model of the resources to be managed and using a general-purpose autonomic architecture.
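
The canonical general-purpose autonomic architecture is the MAPE-K loop (monitor, analyse, plan, execute over shared knowledge). A minimal sketch of one pass around such a loop, with a hypothetical resource metric and scaling policy (not the paper's methodology):

```python
# Minimal sketch of an autonomic (MAPE-K style) manager derived from a
# resource model: monitor -> analyse -> plan -> execute over shared knowledge.
# The resource, metric and policy below are hypothetical.

knowledge = {"target_util": 0.7, "replicas": 2, "history": []}

def monitor():
    """Read a sensor on the managed resource (stubbed with a fixed reading)."""
    return {"cpu_util": 0.93}

def analyse(reading):
    knowledge["history"].append(reading)
    return reading["cpu_util"] > knowledge["target_util"]   # symptom detected?

def plan(symptom):
    if symptom:
        return {"action": "scale_out", "replicas": knowledge["replicas"] + 1}
    return None

def execute(change):
    if change:
        knowledge["replicas"] = change["replicas"]
        print(f"executed {change['action']}: replicas -> {knowledge['replicas']}")

# One pass around the control loop.
execute(plan(analyse(monitor())))
```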

Relevance: 30.00%

Abstract:

In construction projects, the aim of project control is to ensure projects finish on time, within budget, and achieve other project objectives. During the last few decades, numerous project control methods have been developed and adopted by project managers in practice. However, many existing methods focus on describing what the processes and tasks of project control are, not on how these tasks should be conducted. There is also a potential gap between the principles that underlie these methods and project control practice. As a result, time and cost overruns are still common in construction projects, partly attributable to deficiencies of existing project control methods and difficulties in implementing them. This paper describes a new project cost and time control model, the project control and inhibiting factors management (PCIM) model, developed through a study involving extensive interaction with construction practitioners in the UK, which better reflects the real needs of project managers. A good-practice checklist is also developed to facilitate implementation of the model.

Relevance: 30.00%

Abstract:

Most parametric software cost estimation models used today evolved in the late 1970s and early 1980s. At that time, the dominant software development techniques in use were the early 'structured methods'. Since then, several new systems development paradigms and methods have emerged, one being Jackson System Development (JSD). As current cost estimating methods do not take account of these developments, their lack of universality means they cannot provide adequate estimates of effort and hence cost. In order to address these shortcomings, two new estimation methods have been developed for JSD projects. One of these methods, JSD-FPA, is a top-down estimating method based on the existing MkII function point method. The other method, JSD-COCOMO, is a sizing technique which sizes a project, in terms of lines of code, from the process structure diagrams and thus provides an input to the traditional COCOMO method. The JSD-FPA method allows JSD projects in both the real-time and scientific application areas to be costed, as well as the commercial information systems applications to which FPA is usually applied. The method is based upon a three-dimensional view of a system specification, as opposed to the largely data-oriented view traditionally used by FPA. The method uses counts of various attributes of a JSD specification to develop a metric which provides an indication of the size of the system to be developed. This size metric is then transformed into an estimate of effort by calculating past project productivity and utilising this figure to predict the effort, and hence cost, of a future project. The effort estimates produced were validated by comparing them against the effort figures for six actual projects. The JSD-COCOMO method uses counts of the levels in a process structure chart as the input to an empirically derived model which transforms them into an estimate of delivered source code instructions.
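
The two estimating styles that JSD-FPA and JSD-COCOMO build on can be sketched as follows. The size metric, productivity figure and KDSI count are hypothetical; the COCOMO coefficients are the published basic-model organic-mode values:

```python
# Two estimating styles underlying JSD-FPA and JSD-COCOMO (illustrative only).

def effort_from_size_metric(size_units, past_productivity):
    """Function-point style: effort = size / productivity.

    past_productivity is size units delivered per person-month,
    calibrated from completed projects.
    """
    return size_units / past_productivity

def basic_cocomo_organic(kloc):
    """Basic COCOMO, organic mode: effort = 2.4 * KLOC^1.05 person-months."""
    effort_pm = 2.4 * kloc ** 1.05
    duration_months = 2.5 * effort_pm ** 0.38
    return effort_pm, duration_months

# Hypothetical JSD project: a size metric of 420 units, a calibrated
# productivity of 12 units/person-month, and an estimated 32 KDSI.
print(f"FPA-style estimate: {effort_from_size_metric(420, 12):.1f} person-months")
pm, months = basic_cocomo_organic(32)
print(f"COCOMO estimate: {pm:.1f} person-months over {months:.1f} months")
```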

Relevance: 30.00%

Abstract:

This thesis describes the procedure and results from four years of research undertaken through the IHD (Interdisciplinary Higher Degrees) Scheme at Aston University in Birmingham, sponsored by the SERC (Science and Engineering Research Council) and Monk Dunstone Associates, Chartered Quantity Surveyors. A stochastic networking technique, VERT (Venture Evaluation and Review Technique), was used to model the pre-tender costs of public health, heating, ventilating, air-conditioning, fire protection, lift and electrical installations within office developments. The model enabled the quantity surveyor to analyse, manipulate and explore complex scenarios which had previously defied ready mathematical analysis. The process involved the examination of historical material costs, labour factors and design performance data. Components and installation types were defined and formatted. Data were updated and adjusted using mechanical and electrical pre-tender cost indices and factors for location, selection of contractor, contract sum, height and site condition. Ranges of cost, time and performance data were represented by probability density functions and defined by constant, uniform, normal and beta distributions. These variables, and a network of the interrelationships between services components, provided the framework for analysis. The VERT program, in this particular study, relied upon Monte Carlo simulation to model the uncertainties associated with pre-tender estimates of all possible installations. The computer generated output in the form of relative and cumulative frequency distributions of current element and total services costs, critical path analyses and details of statistical parameters. From these data, alternative design solutions were compared, the degree of risk associated with estimates was determined, heuristics were tested and redeveloped, and cost-significant items were isolated for closer examination. The resultant models successfully combined cost, time and performance factors and provided the quantity surveyor with an appreciation of the cost ranges associated with the various engineering services design options.
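
The Monte Carlo core of this kind of model can be sketched as below: element costs drawn from constant, uniform, normal and beta distributions and summed to a total services cost, from which cost ranges are read. The elements, figures and the PERT-style beta parameterisation are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(5)
n = 20_000   # Monte Carlo iterations

def beta_pert(low, mode, high, size):
    """Beta distribution rescaled to [low, high] with the given mode."""
    alpha = 1 + 4 * (mode - low) / (high - low)
    beta = 1 + 4 * (high - mode) / (high - low)
    return low + (high - low) * rng.beta(alpha, beta, size)

# Hypothetical services elements with different cost distributions (£k).
hvac = beta_pert(300, 380, 520, n)                  # beta-distributed
electrical = rng.normal(loc=250, scale=25, size=n)  # normal
lifts = rng.uniform(90, 130, size=n)                # uniform
fire = np.full(n, 45.0)                             # constant

total = hvac + electrical + lifts + fire

p10, p50, p90 = np.percentile(total, [10, 50, 90])
print(f"total services cost: median £{p50:.0f}k, "
      f"10-90% range £{p10:.0f}k-£{p90:.0f}k")
```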

Relevance: 30.00%

Abstract:

A number of papers and reports covering the techno-economic analysis of bio-oil production have been published. These have had different scopes, used different feedstocks and reflected national cost structures. This paper reviews and compares their cost estimates and the experimental results that underpin them. A comprehensive cost and performance model was produced, based on consensus data from the previous studies (or on stated scenarios where data were not available) and reflecting UK costs. The model takes into account sales of the bio-char that is a co-product of pyrolysis, and the electricity consumption of the pyrolysis plant and biomass pre-processing plants. It was concluded that it should be possible to produce bio-oil in the UK from energy crops at a cost similar to that of distillate fuel oil. It was also found that there was little difference in the processing cost for woodchips and baled miscanthus.
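
The cost structure described above can be sketched as a simple per-year model: feedstock, capital, operating and electricity costs, less bio-char co-product revenue, per tonne of bio-oil. All figures below are hypothetical placeholders, not the paper's consensus data:

```python
# Minimal per-year cost model for a pyrolysis plant (all figures hypothetical).
feed_t = 50_000            # t/yr biomass feed
oil_yield = 0.60           # t bio-oil per t feed
char_yield = 0.15          # t bio-char per t feed

feed_cost = 55.0           # £/t delivered feedstock (energy crop)
capital_charge = 1.2e6     # £/yr annualised capital cost
fixed_opex = 0.6e6         # £/yr labour and maintenance
elec_use_kwh_per_t = 75.0  # kWh per t feed (pyrolysis + pre-processing)
elec_price = 0.10          # £/kWh
char_price = 120.0         # £/t bio-char sold as co-product

oil_t = feed_t * oil_yield
costs = (feed_t * feed_cost + capital_charge + fixed_opex
         + feed_t * elec_use_kwh_per_t * elec_price)
char_revenue = feed_t * char_yield * char_price

# Co-product revenue is credited against total cost before dividing
# by the bio-oil output.
cost_per_t_oil = (costs - char_revenue) / oil_t
print(f"net production cost: £{cost_per_t_oil:.0f} per tonne of bio-oil")
```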

Relevance: 30.00%

Abstract:

Component-based development (CBD) has become an important emerging topic in the software engineering field. It promises long-sought-after benefits such as increased software reuse, reduced development time to market and, hence, reduced software production cost. Despite this huge potential, the lack of reasoning support and development environments for component modeling and verification may hinder its adoption. Methods and tools that can support component model analysis are therefore highly valued by industry. Such tool support should be fully automated as well as efficient. At the same time, the reasoning tool should scale well, as it may need to handle the hundreds or even thousands of components that a modern software system may contain. Furthermore, a distributed environment that can effectively manage and compose components is also desirable. In this paper, we present an approach to the modeling and verification of a newly proposed component model using Semantic Web languages and their reasoning tools. We use the Web Ontology Language and the Semantic Web Rule Language to precisely capture the inter-relationships and constraints among the entities in a component model. Semantic Web reasoning tools are deployed to perform automated analysis of the component models. Moreover, we also propose a service-oriented architecture (SOA)-based Semantic Web environment for CBD. The adoption of Semantic Web services and SOA makes our component environment more reusable, scalable, dynamic and adaptive.
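
A minimal sketch of the general technique (not the paper's actual component model) using the owlready2 Python library: a component model captured in OWL, with one constraint, checked by an off-the-shelf reasoner. Class and property names are hypothetical; running the reasoner requires Java, since owlready2 bundles HermiT:

```python
from owlready2 import *

onto = get_ontology("http://example.org/component-model.owl")

with onto:
    class Component(Thing): pass
    class Interface(Thing): pass
    class provides(Component >> Interface): pass   # object properties
    class requires(Component >> Interface): pass

    # A constraint from the component model: a composable component
    # must provide at least one interface.
    class ComposableComponent(Component):
        equivalent_to = [Component & provides.some(Interface)]

    # A small instance of the model.
    logger = Component("Logger")
    ilog = Interface("ILog")
    logger.provides = [ilog]

# Run a Semantic Web reasoner (HermiT, bundled with owlready2) to classify
# individuals against the model's constraints; an inconsistent model
# would raise an error here.
sync_reasoner(infer_property_values=True)
print(ComposableComponent.instances())   # Logger is inferred to be composable
```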