890 results for "project cost engineering"


Relevance: 30.00%

Abstract:

Mode of access: Internet.

Relevance: 30.00%

Abstract:

Mode of access: Internet.

Relevance: 30.00%

Abstract:

"January 1990."

Relevance: 30.00%

Abstract:

Thesis (Master's)--University of Washington, 2016-06

Relevance: 30.00%

Abstract:

Objective: To evaluate the cost of atrial fibrillation (AF) to health and social services in the UK in 1995 and, based on epidemiological trends, to project this estimate to 2000. Design, setting, and main outcome measures: Contemporary estimates of health care activity related to AF were applied to the whole population of the UK on an age and sex specific basis for the year 1995. The activities considered (and costs calculated) were hospital admissions, outpatient consultations, general practice consultations, and drug treatment (including the cost of monitoring anticoagulant treatment). By adjusting for the progressive aging of the British population and related increases in hospital admissions, the cost of AF was also projected to the year 2000. Results: There were 534 000 people with AF in the UK during 1995. The "direct" cost of health care for these patients was £244 million (≈€350 million), or 0.62% of total National Health Service (NHS) expenditure. Hospitalisations and drug prescriptions accounted for 50% and 20% of this expenditure, respectively. Long term nursing home care after hospital admission cost an additional £46.4 million (≈€66 million). The direct cost of AF rose to £459 million (≈€655 million) in 2000, equivalent to 0.97% of total NHS expenditure based on 1995 figures. Nursing home costs rose to £111 million (≈€160 million). Conclusions: AF is an extremely costly public health problem.
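The projection method described above (age- and sex-specific activity rates applied to the whole population, then adjusted for demographic ageing) can be sketched roughly as follows; the population, prevalence, and unit-cost figures are hypothetical placeholders, not the study's data.

```python
# Hypothetical sketch of an age-specific cost projection.
# All rates and costs below are illustrative placeholders, not the study's figures.

def direct_cost(population_by_age, prevalence_by_age, cost_per_patient):
    """Total direct cost: sum over age bands of population * prevalence * unit cost."""
    return sum(population_by_age[band] * prevalence_by_age[band] * cost_per_patient
               for band in population_by_age)

# Illustrative population (thousands) and AF prevalence by age band.
pop_1995 = {"45-64": 13000, "65-74": 5000, "75+": 4000}
prevalence = {"45-64": 0.01, "65-74": 0.05, "75+": 0.10}
cost_per_patient = 450.0  # hypothetical annual cost per AF patient

cost_1995 = direct_cost(pop_1995, prevalence, cost_per_patient)

# Projection: shift the population toward the older bands (demographic ageing)
# while holding age-specific prevalence and unit costs fixed.
pop_2000 = {"45-64": 13200, "65-74": 5400, "75+": 4600}
cost_2000 = direct_cost(pop_2000, prevalence, cost_per_patient)

print(cost_1995, cost_2000)
```

Because prevalence rises steeply with age, even a modest shift of the population into the older bands raises the projected total, which is the mechanism behind the cost growth reported above.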

Relevance: 30.00%

Abstract:

Objective: Antidepressant drugs and cognitive-behavioural therapy (CBT) are effective treatment options for depression and are recommended by clinical practice guidelines. As part of the Assessing Cost-effectiveness - Mental Health project we evaluate the available evidence on costs and benefits of CBT and drugs in the episodic and maintenance treatment of major depression. Method: The cost-effectiveness is modelled from a health-care perspective as the cost per disability-adjusted life year. Interventions are targeted at people with major depression who currently seek care but receive non-evidence based treatment. Uncertainty in model inputs is tested using Monte Carlo simulation methods. Results: All interventions for major depression examined have a favourable incremental cost-effectiveness ratio under Australian health service conditions. Bibliotherapy, group CBT, individual CBT by a psychologist on a public salary and tricyclic antidepressants (TCAs) are very cost-effective treatment options falling below $A10 000 per disability-adjusted life year (DALY) even when taking the upper limit of the uncertainty interval into account. Maintenance treatment with selective serotonin re-uptake inhibitors (SSRIs) is the most expensive option (ranging from $A17 000 to $A20 000 per DALY) but still well below $A50 000, which is considered the affordable threshold. Conclusions: A range of cost-effective interventions for episodes of major depression exists and is currently underutilized. Maintenance treatment strategies are required to significantly reduce the burden of depression, but the cost of long-term drug treatment for the large number of depressed people is high if SSRIs are the drug of choice. Key policy issues with regard to expanded provision of CBT concern the availability of suitably trained providers and the funding mechanisms for therapy in primary care.
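The Monte Carlo handling of input uncertainty described above can be sketched as follows; the cost and DALY distributions are invented for illustration, and the real ACE-Mental Health model is far more detailed.

```python
import random

def icer_samples(n, seed=42):
    """Sample cost per DALY averted: draw uncertain cost and DALY inputs,
    form the ratio, and collect the resulting distribution."""
    rng = random.Random(seed)
    samples = []
    for _ in range(n):
        cost = rng.gauss(6_000_000, 1_000_000)   # hypothetical total cost (AUD)
        dalys = rng.gauss(900, 150)              # hypothetical DALYs averted
        if dalys > 0:
            samples.append(cost / dalys)
    return samples

samples = sorted(icer_samples(10_000))
median = samples[len(samples) // 2]
lo, hi = samples[int(0.025 * len(samples))], samples[int(0.975 * len(samples))]
print(f"median ~ {median:.0f}, 95% UI ~ ({lo:.0f}, {hi:.0f}) AUD per DALY")
```

Reporting the upper limit of such an uncertainty interval, rather than only the point estimate, is what lets the abstract claim an option stays below $A10 000 per DALY "even when taking the upper limit of the uncertainty interval into account".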

Relevance: 30.00%

Abstract:

The paper presents a spreadsheet-based multiple account framework for cost-benefit analysis which incorporates all the usual concerns of cost-benefit analysts such as shadow-pricing to account for market failure, distribution of net benefits, sensitivity and risk analysis, cost of public funds, and environmental effects. The approach is generalizable to a wide range of projects and situations and offers a number of advantages to both analysts and decision-makers, including transparency, a check on internal consistency, and a detailed summary of project net benefits disaggregated by stakeholder group. Of particular importance is the ease with which this framework allows for a project to be evaluated from alternative decision-making perspectives and under alternative policy scenarios where the trade-offs among the project's stakeholders can readily be identified and quantified. (C) 2004 Elsevier Ltd. All rights reserved.
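A minimal sketch of such a multiple account layout is given below; the accounts, stakeholders, shadow-price multiplier, and figures are invented for illustration, not taken from the paper.

```python
# Minimal multiple-account cost-benefit sketch: net benefits disaggregated
# by stakeholder, with a shadow-price adjustment on public funds.
# All figures are illustrative present values.

accounts = {
    "government":  {"benefits": 120.0, "costs": 80.0},
    "consumers":   {"benefits": 60.0,  "costs": 10.0},
    "environment": {"benefits": 0.0,   "costs": 25.0},
}

SHADOW_PRICE_PUBLIC_FUNDS = 1.2  # assumed marginal cost of public funds

def net_benefit(name, acc):
    """Net benefit for one account, shadow-pricing government expenditure."""
    costs = acc["costs"]
    if name == "government":
        costs *= SHADOW_PRICE_PUBLIC_FUNDS
    return acc["benefits"] - costs

by_stakeholder = {name: net_benefit(name, acc) for name, acc in accounts.items()}
total = sum(by_stakeholder.values())
print(by_stakeholder, total)
```

Keeping each stakeholder's account separate, rather than reporting only the aggregate, is what makes the trade-offs among stakeholders visible when the decision-making perspective or policy scenario changes.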

Relevance: 30.00%

Abstract:

In 1999, the Department of Health in Western Australia began a telehealth project, which finished in 2004. The 75 videoconferencing sites funded by the project were part of a total state-wide videoconference network of 104 sites. During the period from January 2002 to December 2003, a total of 3266 consultations, case reviews and patient education sessions took place. Clinical use grew to 30% of all telehealth activity. Educational use was approximately 40% (1416 sessions) and management use was about 30% (1031 sessions). The average overhead cost per telehealth session across all regions and usage types was $A192. Meaningful comparisons of the results of the present study with other public health providers were difficult, because many of the available Websites on telehealth were out of date. Despite the successful use of telehealth to deliver clinical services in Western Australia, sustaining the effort in the post-project phase will present significant challenges.

Relevance: 30.00%

Abstract:

Process optimisation and optimal control of batch and continuous drum granulation processes are studied in this paper. The main focus of the current research has been: (i) construction of optimisation and control relevant, population balance models through the incorporation of moisture content, drum rotation rate and bed depth into the coalescence kernels; (ii) investigation of optimal operational conditions using constrained optimisation techniques; (iii) development of optimal control algorithms based on discretized population balance equations; and (iv) comprehensive simulation studies on optimal control of both batch and continuous granulation processes. The objective of steady state optimisation is to minimise the recycle rate with minimum cost for continuous processes. It has been identified that the drum rotation-rate, bed depth (material charge), and moisture content of solids are practical decision (design) parameters for system optimisation. The objective for the optimal control of batch granulation processes is to maximize the mass of product-sized particles with minimum time and binder consumption. The objective for the optimal control of the continuous process is to drive the process from one steady state to another in a minimum time with minimum binder consumption, which is also known as the state-driving problem. It has been known for some time that the binder spray-rate is the most effective control (manipulative) variable. Although other possible manipulative variables, such as feed flow-rate and additional powder flow-rate have been investigated in the complete research project, only the single input problem with the binder spray rate as the manipulative variable is addressed in the paper to demonstrate the methodology. 
It can be shown from simulation results that the proposed models are suitable for control and optimisation studies, and the optimisation algorithms connected with either steady state or dynamic models are successful for the determination of optimal operational conditions and dynamic trajectories with good convergence properties. (c) 2005 Elsevier Ltd. All rights reserved.
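The steady-state optimisation step (choosing rotation rate, bed depth, and moisture content to minimise the recycle rate) can be caricatured as a constrained search; the quadratic response surface below is a made-up stand-in for the population balance model, and the operating ranges are hypothetical.

```python
# Toy constrained search for operating conditions minimising recycle rate.
# The "recycle_rate" response is a hypothetical stand-in for the
# population-balance model described in the abstract.

def recycle_rate(rotation_rpm, moisture_frac):
    # Made-up response surface: a bowl with its minimum near
    # 25 rpm and 0.12 moisture fraction.
    return 0.10 + 0.0004 * (rotation_rpm - 25) ** 2 + 40 * (moisture_frac - 0.12) ** 2

best = None
for rpm in range(10, 41):                         # feasible rotation rates (rpm)
    for m in [i / 1000 for i in range(80, 161)]:  # moisture fraction 0.080..0.160
        r = recycle_rate(rpm, m)
        if best is None or r < best[0]:
            best = (r, rpm, m)

r, rpm, m = best
print(f"min recycle ~ {r:.3f} at {rpm} rpm, moisture {m:.3f}")
```

In practice the optimisation would be run through the discretized population balance equations with a proper constrained solver rather than a grid, but the structure (decision variables, bounds, objective) is the same.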

Relevance: 30.00%

Abstract:

Calibration of a groundwater model requires that hydraulic properties be estimated throughout a model domain. This generally constitutes an underdetermined inverse problem, for which a solution can only be found when some kind of regularization device is included in the inversion process. Inclusion of regularization in the calibration process can be implicit, for example through the use of zones of constant parameter value, or explicit, for example through solution of a constrained minimization problem in which parameters are made to respect preferred values, or preferred relationships, to the degree necessary for a unique solution to be obtained. The cost of uniqueness is this: no matter which regularization methodology is employed, the inevitable consequence of its use is a loss of detail in the calibrated field. This, in turn, can lead to erroneous predictions made by a model that is ostensibly well calibrated. Information made available as a by-product of the regularized inversion process allows the reasons for this loss of detail to be better understood. In particular, it is easily demonstrated that the estimated value for a hydraulic property at any point within a model domain is, in fact, a weighted average of the true hydraulic property over a much larger area. This averaging process causes loss of resolution in the estimated field. Where hydraulic conductivity is the hydraulic property being estimated, high averaging weights exist in areas that are strategically disposed with respect to measurement wells, while other areas may contribute very little to the estimated hydraulic conductivity at any point within the model domain, possibly making the detection of hydraulic conductivity anomalies in these latter areas almost impossible. A study of the post-calibration parameter field covariance matrix allows further insights into the loss of system detail incurred through the calibration process to be gained. 
A comparison of pre- and post-calibration parameter covariance matrices shows that the latter often possess a much smaller spectral bandwidth than the former. It is also demonstrated that, as an inevitable consequence of the fact that a calibrated model cannot replicate every detail of the true system, model-to-measurement residuals can show a high degree of spatial correlation, a fact which must be taken into account when assessing these residuals either qualitatively, or quantitatively in the exploration of model predictive uncertainty. These principles are demonstrated using a synthetic case in which spatial parameter definition is based on pilot points, and calibration is implemented using both zones of piecewise constancy and constrained minimization regularization. (C) 2005 Elsevier Ltd. All rights reserved.
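The claim that each estimated parameter is a weighted average of the true field can be made concrete with a tiny Tikhonov-regularised inversion; the matrix, true field, and regularisation weight below are invented for illustration, and this generic sketch is not the pilot-point scheme of the paper.

```python
# Tiny Tikhonov-regularised inversion: 2 measurements of 3 unknown parameters.
# The regularised estimate smooths the true field, illustrating the loss of
# detail described above. All numbers are illustrative.

lam = 0.1                        # regularisation weight (assumed)
J = [[1.0, 1.0, 0.0],            # sensitivity of each measurement to each parameter
     [0.0, 1.0, 1.0]]
m_true = [1.0, 0.0, 2.0]         # "true" field containing sharp detail

# Synthetic data: d = J @ m_true
d = [sum(J[i][k] * m_true[k] for k in range(3)) for i in range(2)]

# A = J J^T + lam*I  (2x2), inverted in closed form.
A = [[sum(J[i][k] * J[j][k] for k in range(3)) + (lam if i == j else 0.0)
      for j in range(2)] for i in range(2)]
det = A[0][0] * A[1][1] - A[0][1] * A[1][0]
Ainv = [[ A[1][1] / det, -A[0][1] / det],
        [-A[1][0] / det,  A[0][0] / det]]

# Minimum-norm regularised estimate: m_hat = J^T (J J^T + lam*I)^(-1) d,
# so every entry of m_hat is a weighted combination of the data (and hence
# of the true parameters over a wider area).
w = [sum(Ainv[i][j] * d[j] for j in range(2)) for i in range(2)]
m_hat = [sum(J[i][k] * w[i] for i in range(2)) for k in range(3)]

print(m_hat)  # smoothed: the sharp contrasts in m_true are averaged away
```

Comparing `m_hat` with `m_true` shows the effect directly: the estimate never reaches the extremes of the true field, because each estimated value averages the truth over the footprint of the measurements that inform it.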

Relevance: 30.00%

Abstract:

This economic evaluation was part of the Australian National Evaluation of Pharmacotherapies for Opioid Dependence (NEPOD) project. Data from four trials of heroin detoxification methods, involving 365 participants, were pooled to enable a comprehensive comparison of the cost-effectiveness of five inpatient and outpatient detoxification methods. This study took the perspective of the treatment provider in assessing resource use and costs. Two short-term outcome measures were used: achievement of an initial 7-day period of abstinence, and entry into ongoing post-detoxification treatment. The mean costs per episode of the various detoxification methods ranged widely: AUD $491 for buprenorphine-based outpatient; AUD $605 for conventional outpatient; AUD $1404 for conventional inpatient; AUD $1990 for rapid detoxification under sedation; and AUD $2689 for anaesthesia-based detoxification. An incremental cost-effectiveness analysis was carried out using conventional outpatient detoxification as the base comparator. The buprenorphine-based outpatient detoxification method was found to be the most cost-effective method overall, and rapid opioid detoxification under sedation was the most cost-effective inpatient method.
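Incremental cost-effectiveness against a base comparator, as described above, reduces to a simple ratio of cost and outcome differences. In the sketch below, only the per-episode costs come from the abstract; the success rates are hypothetical placeholders, since the abstract does not report them.

```python
# Incremental cost-effectiveness vs. a base comparator (conventional outpatient).
# Costs (AUD per episode) are from the abstract; the "success" probabilities
# are hypothetical placeholders for illustration only.

base = {"name": "conventional outpatient", "cost": 605.0, "success": 0.20}

methods = [
    {"name": "buprenorphine outpatient", "cost": 491.0,  "success": 0.25},
    {"name": "conventional inpatient",   "cost": 1404.0, "success": 0.30},
    {"name": "rapid under sedation",     "cost": 1990.0, "success": 0.40},
    {"name": "anaesthesia",              "cost": 2689.0, "success": 0.45},
]

def icer(method, base):
    """Incremental cost per extra successful detoxification vs. the base."""
    d_cost = method["cost"] - base["cost"]
    d_effect = method["success"] - base["success"]
    if d_cost <= 0 and d_effect >= 0:
        return "dominant"  # cheaper and at least as effective as the base
    return d_cost / d_effect

for m in methods:
    print(m["name"], icer(m, base))
```

With these placeholder success rates, the cheaper-and-more-effective option is "dominant" over the base comparator, mirroring how an option like buprenorphine-based outpatient detoxification can come out best overall without being the most effective.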

Relevance: 30.00%

Abstract:

A modern mineral processing plant represents a substantial investment. During the design process, there is often a period when costs (or revenues) must be compensated for by cuts in capital expenditure. In many cases, sampling and measurement equipment provides a soft target for such 'savings'. This process is almost analogous to reducing the capital investment in a corner store by not including a cash register. The consequences will be quite similar: a serious lack of sound performance data and plenty of opportunities for theft, deliberate or inadvertent. This paper makes the case that investment in sampling and measurement equipment is most cost-effective during the design phase. Further, a strong measurement culture will have many benefits, including the ability to take advantage of small gains. In almost any business, there are many more opportunities to make small gains than to make large, step changes. In short, if a project cannot justify the cost of accurate and reliable measurement of its performance, it probably should not be a project at all.