979 results for cost estimating tools
Abstract:
The objective of this thesis is to investigate, through an empirical study, the different functions of highways maintenance departments and to suggest methods by which road maintenance work could be carried out more efficiently, using resources of labour, materials and plant to the best advantage. This is particularly important under the present circumstances of national financial difficulty, which have resulted in continuous cuts in public expenditure. In order to achieve this objective, the researcher carried out a survey among several Highways Authorities by means of questionnaire and interview. The information so collected was analysed in order to understand the actual, practical situation within highways maintenance departments, highlight any existing problems, and try to answer the question of how they could become more efficient. From the results obtained by the questionnaire and the interviews, and the analysis of these results, the researcher concludes that it is in the management system that least has been done, and where problems exist and are most complex. The management of highways maintenance departments argue that the reasons for their problems include both financial and organisational difficulties, apart from the political aspect and nature of the activities undertaken. The researcher believes that this necessitates improving management's analytical tools and techniques in order to achieve the most effective way of performing each activity. To this end the researcher recommends several related procedures to be adopted by the management of highways maintenance departments. These recommendations, arising from the study, involve technical, practical and human aspects. These are essential factors of which management should be aware - and certainly should not neglect - in order to achieve its objective of improved productivity in the highways maintenance departments.
Abstract:
This paper explores the potential for cost savings in the general Practice units of a Primary Care Trust (PCT) in the UK. We have used Data Envelopment Analysis (DEA) to identify benchmark Practices, which offer the lowest aggregate referral and drug costs controlling for the number, age, gender, and deprivation level of the patients registered with each Practice. For the remaining, non-benchmark Practices, estimates of the potential savings on referral and drug costs were obtained. Such savings could be delivered through a combination of the following actions: (i) reducing the levels of referrals and prescriptions without affecting their mix (£15.74m of savings were identified, representing 6.4% of total expenditure); (ii) switching between inpatient and outpatient referrals and/or drug treatment to exploit differences in their unit costs (£10.61m of savings were identified, representing 4.3% of total expenditure); (iii) seeking a different profile of referral and drug unit costs (£11.81m of savings were identified, representing 4.8% of total expenditure).
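As a rough illustration of the benchmarking step, the sketch below runs an input-oriented (CCR) DEA model once per Practice: referral and drug costs are the inputs to be contracted, registered patients the output. All figures are invented, and a plain patient count stands in for the paper's age/gender/deprivation-adjusted outputs.

```python
# Input-oriented DEA (CCR) sketch: contract each Practice's costs as far
# as a convex combination of peer Practices allows. Illustrative data only.
import numpy as np
from scipy.optimize import linprog

X = np.array([[2.1, 1.4, 3.0, 2.6],      # input 1: referral costs (GBP m)
              [1.0, 0.8, 1.9, 1.1]])     # input 2: drug costs (GBP m)
Y = np.array([[12.0, 9.5, 14.0, 13.0]])  # output: registered patients (000s)
n = X.shape[1]

for o in range(n):
    # variables: [theta, lambda_1, ..., lambda_n]; minimise theta
    c = np.r_[1.0, np.zeros(n)]
    # inputs:  sum_j lambda_j x_ij <= theta * x_io
    A_in = np.hstack([-X[:, [o]], X])
    # outputs: sum_j lambda_j y_rj >= y_ro  (flipped into <= form)
    A_out = np.hstack([np.zeros((Y.shape[0], 1)), -Y])
    res = linprog(c,
                  A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.r_[np.zeros(X.shape[0]), -Y[:, o]],
                  bounds=[(None, None)] + [(0.0, None)] * n)
    theta = res.x[0]  # theta = 1 marks a benchmark Practice
    print(f"Practice {o}: efficiency {theta:.2f}, "
          f"potential saving GBP {(1 - theta) * X[:, o].sum():.2f}m")
```

The efficiency score corresponds to action (i) above: a proportional reduction of referral and prescription levels without changing their mix.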
Abstract:
Successful commercialization of a technology such as Fiber Bragg Gratings requires the ability to manufacture devices repeatably, quickly and at low cost. Although the first report of photorefractive gratings was in 1978, it was not until 1993, when phase mask fabrication was demonstrated, that this became feasible. More recently, draw tower fabrication on a production level and grating writing through the polymer jacket have been realized; both are important developments since they preserve the intrinsic strength of the fiber. Potentially the most significant recent development has been femtosecond laser inscription of gratings. Although not yet a commercial technology, it provides the means of writing multiple gratings in the optical core, providing directional sensing capability in a single fiber. Femtosecond processing can also be used to machine the fiber to produce micron-scale slots and holes, enhancing the interaction between the light in the core and the surrounding medium.
Abstract:
High precision manufacturers continuously seek out disruptive technologies to improve the quality, cost, and delivery of their products. With the advancement of machine tool and measurement technology, many companies are ready to capitalise on the opportunity of on-machine measurement (OMM). Coupled with the business case, manufacturing engineers are now questioning whether OMM can soon eliminate the need for post-process inspection systems. Metrologists will, however, argue that the machining environment is too hostile and that there are numerous process variables which need consideration before traceable measurement on the machine can be achieved. In this paper we test the measurement capability of five new multi-axis machine tools enabled as OMM systems via on-machine probing. All systems are tested under various operating conditions in order to better understand the effects of potentially significant variables. This investigation has found that key process variables such as machine tool warm-up and tool-change cycles can have an effect on machine tool measurement repeatability. The new data presented here are important to the many manufacturers who are considering utilising their high precision multi-axis machine tools for both the creation and verification of their products.
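A minimal sketch of the kind of repeatability comparison such tests involve, with invented probe readings for a cold-start versus a warmed-up machine (the paper's actual test conditions and data are not reproduced here):

```python
# Repeatability of repeated on-machine probe readings of one feature,
# before and after machine warm-up. All readings are invented.
import statistics

cold = [10.0021, 10.0035, 10.0012, 10.0040, 10.0018]  # mm, cold start
warm = [10.0008, 10.0010, 10.0007, 10.0011, 10.0009]  # mm, warmed up

for label, xs in (("cold start", cold), ("warmed up", warm)):
    print(f"{label}: mean {statistics.mean(xs):.4f} mm, "
          f"repeatability (1 sigma) {statistics.stdev(xs) * 1000:.1f} um")
```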
Abstract:
Environmentally conscious construction has received a significant amount of research attention over recent decades. Even though the construction literature is rich in studies that emphasize the importance of environmental impact during the construction phase, most previous studies have failed to combine environmental analysis with other project performance criteria in construction. This is mainly because most studies have overlooked the multi-objective nature of construction projects. In order to achieve environmentally conscious construction, multiple objectives and their relationships need to be successfully analyzed in the complex construction environment. The complex construction system is composed of changing project conditions that have an impact on the relationship between the time, cost and environmental impact (TCEI) of construction operations. Yet this impact is still unknown to construction professionals. Studying this impact is vital to fulfilling multiple project objectives and achieving environmentally conscious construction. This research proposes an analytical framework to analyze the impact of changing project conditions on the relationship of TCEI. This study includes greenhouse gas (GHG) emissions as an environmental impact category. The methodology utilizes multi-agent systems, multi-objective optimization, the analytical network process, and system dynamics tools to study the relationships of TCEI and support decision-making under the influence of project conditions. Life cycle assessment (LCA) is applied to the evaluation of environmental impact in terms of GHG. The mixed-method approach allowed for the collection and analysis of qualitative and quantitative data. Structured interviews of professionals in the highway construction field were conducted to gain their perspectives on decision-making under the influence of certain project conditions, while the quantitative data were collected from the Florida Department of Transportation (FDOT) for highway resurfacing projects. The data collected were used to test the framework. The framework yielded statistically significant results in simulating project conditions and optimizing TCEI. The results showed that the change in project conditions had a significant impact on the TCEI optimal solutions. The correlation between TCEI suggested that they affected each other positively, but with different strengths. The findings of the study will help contractors visualize the impact of their decisions on the relationship of TCEI.
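At the core of any TCEI trade-off is the identification of non-dominated time/cost/GHG alternatives. The sketch below filters a Pareto front over a few hypothetical resurfacing alternatives; it illustrates the underlying multi-objective idea only, not the dissertation's multi-agent optimization framework.

```python
# Pareto filter over hypothetical time/cost/GHG alternatives for one
# resurfacing activity; none of these figures come from the study.
alternatives = [
    # (label, duration in days, cost in k$, GHG in tCO2e)
    ("crew A, night work",   12, 480, 95),
    ("crew B, day work",     15, 430, 90),
    ("crew B, recycled mix", 16, 440, 70),
    ("crew A, standard mix", 14, 510, 96),  # dominated by night work
]

def dominates(a, b):
    """True if a is no worse than b on every objective and better on one."""
    return all(x <= y for x, y in zip(a[1:], b[1:])) and a[1:] != b[1:]

pareto = [a for a in alternatives
          if not any(dominates(b, a) for b in alternatives)]
for label, t, cost, ghg in pareto:
    print(f"{label}: {t} days, ${cost}k, {ghg} tCO2e")
```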
Abstract:
People go through their lives making all kinds of decisions, and some of these decisions affect their demand for transportation: for example, their choices of where to live and where to work, how and when to travel, and which route to take. Transport-related choices are typically time dependent and characterized by a large number of alternatives that can be spatially correlated. This thesis deals with models that can be used to analyze and predict discrete choices in large-scale networks. The proposed models and methods are highly relevant for, but not limited to, transport applications. We model decisions as sequences of choices within the dynamic discrete choice framework, also known as parametric Markov decision processes. Such models are known to be difficult to estimate and to apply to make predictions, because dynamic programming problems need to be solved in order to compute choice probabilities. In this thesis we show that it is possible to exploit the network structure and the flexibility of dynamic programming so that the dynamic discrete choice modeling approach is not only useful for modeling time-dependent choices, but also makes it easier to model large-scale static choices. The thesis consists of seven articles containing a number of models and methods for estimating, applying and testing large-scale discrete choice models. In the following we group the contributions under three themes: route choice modeling, large-scale multivariate extreme value (MEV) model estimation, and nonlinear optimization algorithms. Five articles are related to route choice modeling. We propose different dynamic discrete choice models that allow paths to be correlated, based on the MEV and mixed logit models. The resulting route choice models become expensive to estimate, and we deal with this challenge by proposing innovative methods that reduce the estimation cost. For example, we propose a decomposition method that not only opens up the possibility of mixing, but also speeds up the estimation of simple logit models, which also has implications for traffic simulation. Moreover, we compare the utility maximization and regret minimization decision rules, and we propose a misspecification test for logit-based route choice models. The second theme is related to the estimation of static discrete choice models with large choice sets. We establish that a class of MEV models can be reformulated as dynamic discrete choice models on the networks of correlation structures. These dynamic models can then be estimated quickly using dynamic programming techniques and an efficient nonlinear optimization algorithm. Finally, the third theme focuses on structured quasi-Newton techniques for estimating discrete choice models by maximum likelihood. We examine and adapt switching methods that can be easily integrated into usual optimization algorithms (line search and trust region) to accelerate the estimation process. The proposed dynamic discrete choice models and estimation methods can be used in various discrete choice applications. In the area of big data analytics, models that can deal with large choice sets and sequential choices are important. Our research can therefore be of interest in various demand analysis applications (predictive analytics) or can be integrated with optimization models (prescriptive analytics). Furthermore, our studies indicate the potential of dynamic programming techniques in this context, even for static models, which opens up a variety of future research directions.
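To make the dynamic programming step concrete: in a recursive logit route choice model, the expected downstream utility V(k) at node k satisfies V(k) = log Σ_a exp(v(k,a) + V(succ(k,a))), with V = 0 at the destination, and arc choice probabilities follow from differences in V. The sketch below solves this fixed point by value iteration on a toy network; the network and utilities are hypothetical, not from the thesis.

```python
# Value iteration for the logsum recursion behind a recursive logit
# route choice model. Toy acyclic network; utilities are invented.
import math

# arcs: node -> list of (instantaneous utility, successor node)
arcs = {
    "A": [(-1.0, "B"), (-2.0, "C")],
    "B": [(-1.5, "D")],
    "C": [(-0.5, "D")],
    "D": [],  # destination: no continuation utility
}

V = {k: 0.0 for k in arcs}
for _ in range(100):  # iterate to the fixed point
    V_new = {}
    for k, out in arcs.items():
        if not out:
            V_new[k] = 0.0
        else:
            V_new[k] = math.log(sum(math.exp(u + V[s]) for u, s in out))
    if max(abs(V_new[k] - V[k]) for k in arcs) < 1e-10:
        V = V_new
        break
    V = V_new

# arc choice probabilities at node A: P(a|k) = exp(v + V(succ) - V(k))
probs = [math.exp(u + V[s] - V["A"]) for u, s in arcs["A"]]
print(V, probs)
```

On cyclic networks the same recursion is typically solved as a linear system after an exponential change of variables, which is what makes estimation on large-scale networks tractable.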
Abstract:
This thesis presents details of the design and development of novel tools and instruments for scanning tunneling microscopy (STM), and may be considered as a repository for several years' worth of development work. The author presents design goals and implementations for two microscopes. First, a novel Pan-type STM was built that could be operated in an ambient environment as a liquid-phase STM. Unique features of this microscope include a unibody frame, for increased microscope rigidity, a novel slider component with large Z-range, a unique wiring scheme and damping mechanism, and a removable liquid cell. The microscope exhibits a high level of mechanical isolation at the tunnel junction, and operates excellently as an ambient tool. Experiments in liquid are on-going. Simultaneously, the author worked on designs for a novel low temperature, ultra-high vacuum (LT-UHV) instrument, and these are presented as well. A novel stick-slip vertical coarse approach motor was designed and built. To gauge the performance of the motor, an in situ motion sensing apparatus was implemented, which could measure the step size of the motor to high precision. A new driving circuit for stick-slip inertial motors is also presented, that offers improved performance over our previous driving circuit, at a fraction of the cost. The circuit was shown to increase step size performance by 25%. Finally, a horizontal sample stage was implemented in this microscope. The build of this UHV instrument is currently being finalized. In conjunction with the above design projects, the author was involved in a collaborative project characterizing N-heterocyclic carbene (NHC) self-assembled monolayers (SAMs) on Au(111) films. STM was used to characterize Au substrate quality, for both commercial substrates and those manufactured via a unique atomic layer deposition (ALD) process by collaborators. Ambient and UHV STM was then also used to characterize the NHC/Au(111) films themselves, and several key properties of these films are discussed. During this study, the author discovered an unexpected surface contaminant, and details of this are also presented. Finally, two models are presented for the nature of the NHC-Au(111) surface interaction based on the observed film properties, and some preliminary theoretical work by collaborators is presented.
Abstract:
Genetic mutations can cause a wide range of diseases, e.g. cancer. Gene therapy has the potential to alleviate or even cure these diseases. One of the many gene therapy agents developed so far is the RNA-cleaving deoxyribozyme, a short DNA oligonucleotide that specifically binds to and cleaves RNA. Since the development of these synthetic catalytic oligonucleotides, the main way of determining their cleavage kinetics has been through the use of a laborious and error-prone gel assay to quantify substrate and product at different time points. We have developed two new methods for this purpose. The first uses a fluorescent intercalating dye, PicoGreen, which shows increased fluorescence upon binding double-stranded oligonucleotides; during the course of the reaction the fluorescence intensity decreases as the RNA is cleaved and dissociates from the deoxyribozyme. A second method was developed based on the common denominator of all nucleases: each cleavage event exposes a single phosphate of the oligonucleotide phosphate backbone. The exposed phosphate can simultaneously be released by a phosphatase and directly quantified by a fluorescent phosphate sensor. This method allows for multiple-turnover kinetics of diverse types of nucleases, including deoxyribozymes and protein nucleases. The main challenge of gene therapy is often delivery into the cell. To bypass cellular defenses, researchers have used a vast number of methods; one of these is cell-penetrating peptides, which can be either covalently coupled to or non-covalently complexed with a cargo to deliver it into a cell. To further evolve cell-penetrating peptides and understand how they work, we developed an assay to quickly screen different conditions in a high-throughput manner. A luciferase up- and down-regulation experiment was used; the experimental time was reduced by one day, the format was scaled up from 24- to 96-well plates, and the cost was reduced by 95% compared with commercially available assays. In the last paper we evaluated whether cell-penetrating peptides could be used to improve the uptake of an LNA oligonucleotide mimic of GRN163L, a telomerase-inhibiting oligonucleotide. The combination of cell-penetrating peptides and our mimic oligonucleotide led to an IC50 more than 20 times lower than that of GRN163L.
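Once cleavage can be read out fluorescently, multiple-turnover data are typically fitted to Michaelis-Menten kinetics. A minimal sketch of such a fit, with invented initial-rate data and an assumed 0.01 µM deoxyribozyme concentration (neither is from the thesis):

```python
# Michaelis-Menten fit of multiple-turnover cleavage rates of the kind a
# phosphate-sensor assay yields. All data points are invented.
import numpy as np
from scipy.optimize import curve_fit

S = np.array([0.1, 0.25, 0.5, 1.0, 2.0, 5.0])          # substrate (uM)
v = np.array([0.018, 0.04, 0.065, 0.09, 0.11, 0.125])  # initial rate (uM/min)

def michaelis_menten(S, vmax, km):
    return vmax * S / (km + S)

(vmax, km), _ = curve_fit(michaelis_menten, S, v, p0=(0.15, 0.5))
kcat = vmax / 0.01  # assumed 0.01 uM deoxyribozyme in the reaction
print(f"vmax = {vmax:.3f} uM/min, Km = {km:.2f} uM, kcat = {kcat:.1f} /min")
```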
Abstract:
Thesis (Ph.D.)--University of Washington, 2016-08
Abstract:
Shearing is a fast and inexpensive method for cutting sheet metal that has been used since the beginning of industrialism. Consequently, published experimental studies of shearing can be found from over a century back in time. Recent studies, however, thanks to the availability of low-cost digital computation power, are mostly based on finite element simulations, which guarantee quick results. Still, for the validation of models and simulations, accurate experimental data are a prerequisite. When applicable, 2D models are in general preferable to 3D models because of advantages such as low computation time and easy model formulation. Shearing of sheet metal with parallel tools is successfully modelled in 2D with a plane strain approximation, but with angled tools the approximation is less obvious. Therefore, plane strain approximations for shearing with angled tools were evaluated through shear experiments of high accuracy. Tool angle, tool clearance, and clamping of the sheet were varied in the experiments. The results showed that the measured forces in shearing with angled tools can be approximately calculated using force measurements from shearing with parallel tools. Shearing energy was introduced as a quantifiable measure of the suitable tool clearance range. The effects of the shearing parameters on forces were in agreement with previous studies. Based on the agreement between calculations and experiments, analysis based on a plane strain assumption is considered applicable for angled tools with a small (up to 2 degrees) rake angle.
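One way to read the central result: under the plane strain assumption, the per-unit-width force-penetration curve from a parallel-tool test can be integrated along a raked edge, where each position sees a different local penetration. The sketch below illustrates this superposition with an invented force-penetration curve; it is not the paper's measured data.

```python
# Superposition sketch: total angled-tool force as the integral of the
# parallel-tool force per unit width over the local penetration along
# the edge. f_parallel is an invented curve, not measured data.
import numpy as np

def f_parallel(u):
    """Force per unit width (kN/mm) vs penetration u (mm), parallel tools."""
    return np.where(u < 1.0, 0.3 * u, np.maximum(0.3 - 0.1 * (u - 1.0), 0.0))

def angled_tool_force(stroke, rake_deg, width=100.0, n=2000):
    x = np.linspace(0.0, width, n)  # position along the cutting edge (mm)
    u = np.clip(stroke - x * np.tan(np.radians(rake_deg)), 0.0, None)
    return float(np.sum(f_parallel(u)) * width / n)  # total force (kN)

for rake in (0.5, 1.0, 2.0):
    print(f"rake {rake} deg: force at 2 mm stroke "
          f"{angled_tool_force(2.0, rake):.1f} kN")
```

Increasing the rake angle spreads the penetration along the edge, which lowers the peak force while the shearing energy stays comparable.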
Abstract:
Nervous system disorders are associated with cognitive and motor deficits and are responsible for the highest disability rates and global burden of disease. Their recovery paths are vulnerable and depend on the effective combination of plastic brain tissue properties with complex, lengthy and expensive neurorehabilitation programs. This work explores two lines of research, envisioning sustainable solutions to improve the treatment of cognitive and motor deficits. Both projects were developed in parallel and shared a new, sensible approach in which low-cost technologies were integrated with common clinical operative procedures. The aim was to achieve more intensive treatments under specialized monitoring, improve clinical decision-making and increase access to healthcare. The first project (articles I-III) concerned the development and evaluation of a web-based cognitive training platform (COGWEB), suitable for intensive use, either at home or at institutions, and across a wide spectrum of ages and diseases that impair cognitive functioning. It was tested for usability in a memory clinic setting and implemented in a collaborative network comprising 41 centers and 60 professionals. An adherence and intensity study revealed a compliance of 82.8% at six months and an average of six hours/week of continued online cognitive training activities. The second project (articles IV-VI) was designed to create and validate an intelligent rehabilitation device to administer proprioceptive stimuli on the hemiparetic side of stroke patients while performing ambulatory movement characterization (SWORD). Targeted vibratory stimulation was found to be well tolerated, and an automatic motor characterization system retrieved results comparable to the first items of the Wolf Motor Function Test. The global system was tested in a randomized placebo-controlled trial to assess its impact on a common motor rehabilitation task in a relevant clinical environment (early post-stroke). The number of correct movements on a hand-to-mouth task was increased by an average of 7.2/minute, while the probability of performing an error decreased from 1:3 to 1:9. Neurorehabilitation and neuroplasticity are shifting to more neuroscience-driven approaches. At the same time, their final utility for patients and society is largely dependent on the development of more effective technologies that facilitate the dissemination of the knowledge produced during the process. The results attained through this work represent a step forward in that direction. Their impact on the quality of rehabilitation services and public health is discussed from clinical, technological and organizational perspectives. This process of thinking and oriented speculation has led to the formulation of subsequent hypotheses, already being explored in novel research paths.
Abstract:
In Queensland, the subtropical strawberry (Fragaria ×ananassa) breeding program aims to combine traits into new genotypes that increase production efficiency. The contribution of individual plant traits to cost and income under subtropical Queensland conditions has been investigated. The study applied knowledge of traits and of the production and marketing system to assess the economic impact (gross margin) of new cultivars on the system, with the overall goal of improving the profitability of the industry through the release of new strawberry cultivars. Genotypes varied widely in their effect on gross margin, from 48% above to 10% below the base value. The advantage of a new genotype was also affected by the proportion of the total area allocated to it. The largest difference in gross margin between that at the optimum allocation (8% increase in gross margin) and that at an all-of-industry allocation (20% decrease in gross margin) of area to a genotype was 28%. In other cases, the all-of-industry allocation was also the optimum allocation, with one genotype giving a 48% benefit in gross margin.
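As a numerical illustration of the allocation effect described above (not the study's model), the toy below reproduces the reported pattern: a genotype worth +8% in gross margin at its optimum allocation can turn into a 20% loss when planted industry-wide, a 28% swing. The response shape, baseline gross margin and optimum allocation are all invented.

```python
# Toy allocation-response curve for a new genotype's effect on industry
# gross margin. Baseline, optimum allocation and shape are hypothetical.
base_gm = 100_000.0  # baseline gross margin ($/season), invented

def industry_gm(alloc, gain_at_opt=0.08, opt_alloc=0.35):
    """Benefit rises to a peak at the optimum allocation, then declines,
    e.g. as concentrated supply depresses prices in the peak window."""
    if alloc <= opt_alloc:
        rel = gain_at_opt * alloc / opt_alloc
    else:
        rel = gain_at_opt - (alloc - opt_alloc) / (1 - opt_alloc) * (gain_at_opt + 0.20)
    return base_gm * (1 + rel)

for alloc in (0.10, 0.35, 1.00):
    print(f"allocation {alloc:.0%}: gross margin ${industry_gm(alloc):,.0f}")
```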
Abstract:
Plantings of mixed native species (termed 'environmental plantings') are increasingly being established for carbon sequestration whilst providing additional environmental benefits such as biodiversity and water quality. In Australia, they are currently one of the most common forms of reforestation. Investment in establishing and maintaining such plantings relies on having a cost-effective modelling approach that provides unbiased estimates of biomass production and carbon sequestration rates. In Australia, the Full Carbon Accounting Model (FullCAM) is used for both national greenhouse gas accounting and project-scale sequestration activities. Prior to the work presented here, the FullCAM tree growth curve was not calibrated specifically for environmental plantings and generally under-estimated their biomass. Here we collected and analysed above-ground biomass data from 605 mixed-species environmental plantings and tested the effects of several planting characteristics on growth rates. Plantings were then categorised based on significant differences in growth rates. Growth of plantings differed between temperate and tropical regions. Tropical plantings were relatively uniform in terms of planting methods, and their growth was largely related to stand age, consistent with the un-calibrated growth curve. However, in temperate regions, where plantings were more variable, the key factors influencing growth were planting width, stand density and species mix (the proportion of individuals that were trees). These categories provided the basis for FullCAM calibration. Although the overall model efficiency was only 39-46%, there was nonetheless no significant bias when the model was applied to the various planting categories. Thus, modelled estimates of biomass accumulation will be reliable on average, but estimates at any particular location will be uncertain, with either under- or over-prediction possible. When compared with the un-calibrated yield curves, predictions using the new calibrations show that early growth is likely to be more rapid and total above-ground biomass may be higher for many plantings at maturity. This study has considerably improved understanding of the patterns of growth in different types of environmental plantings, and of the modelling of biomass accumulation in young (<25 years old) plantings. However, significant challenges remain in understanding longer-term stand dynamics, particularly temporal changes in stand density and species composition.
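The calibration described amounts to fitting a biomass-age yield curve for each planting category. A minimal sketch using an illustrative Chapman-Richards form and invented data; FullCAM's actual tree yield formula and the study's calibration data differ.

```python
# Fit an illustrative growth curve of above-ground biomass (AGB) vs stand
# age for one planting category. Curve form and data points are invented.
import numpy as np
from scipy.optimize import curve_fit

age = np.array([2, 4, 6, 8, 12, 16, 20, 25])     # stand age (years)
agb = np.array([4, 14, 26, 38, 58, 72, 80, 86])  # AGB (t/ha)

def chapman_richards(t, M, k, p):
    """M: asymptotic biomass, k: rate, p: shape parameter."""
    return M * (1 - np.exp(-k * t)) ** p

(M, k, p), _ = curve_fit(chapman_richards, age, agb, p0=(100, 0.1, 1.5))
print(f"asymptote {M:.0f} t/ha, rate {k:.3f}/yr, shape {p:.2f}")
```

Separate fits per category (planting width, stand density, species mix) are what remove the systematic bias while leaving site-level scatter, consistent with the reported model efficiency of 39-46%.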
Abstract:
When the ambient air quality standards established in EU Directive 2008/50/EC are exceeded, Member States are obliged to develop and implement Air Quality Plans (AQP) to improve air quality and health. Notwithstanding the achievements in emission reductions and air quality improvement, additional efforts need to be undertaken to improve air quality in a sustainable way, i.e. through a cost-efficiency approach. This work was developed within the scope of the recently concluded MAPLIA project "Moving from Air Pollution to Local Integrated Assessment", and focuses on the definition and assessment of emission abatement measures and their associated costs, air quality and health impacts and benefits, by means of air quality modelling tools, health impact functions and cost-efficiency analysis. The MAPLIA system was applied to the Grande Porto urban area (Portugal), addressing PM10 and NOx as the most important pollutants in the region. Four different measures to reduce PM10 and NOx emissions were defined and characterized in terms of emissions and implementation costs, and combined into 15 emission scenarios, simulated by the TAPM air quality modelling tool. Air pollutant concentration fields were then used to estimate health benefits in terms of avoided costs (external costs), using dose-response health impact functions. Results revealed that, among the 15 scenarios analysed, the scenario including all 4 measures leads to a total net benefit of 0.3 M€ per year. The largest net benefit is obtained for the scenario considering the conversion of 50% of open fireplaces into heat-recovery wood stoves. Although the implementation costs of this measure are high, the benefits outweigh the costs. The research outcomes confirm that the MAPLIA system is useful for policy decision support on air quality improvement strategies, and could be applied to other urban areas where AQPs need to be implemented and monitored.
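The cost-efficiency comparison reduces to simple bookkeeping per scenario: net benefit = avoided external (health) costs minus implementation costs. A toy sketch with invented scenario figures; the paper's 15 scenarios and their costs are not reproduced here.

```python
# Net-benefit bookkeeping of the cost-efficiency kind described above.
# All scenario names and figures are invented for illustration.
scenarios = {
    # name: (implementation cost, avoided external cost), both M EUR/yr
    "wood stove conversion (50%)": (1.8, 2.5),
    "traffic fleet renewal":       (0.9, 0.7),
    "all four measures":           (3.1, 3.4),
}

for name, (cost, benefit) in scenarios.items():
    print(f"{name}: net benefit {benefit - cost:+.1f} M EUR/yr")
```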