919 results for Adjustment cost models
Abstract:
An economic analysis has been performed to establish when it is advantageous to use structured packing in air separation plants. A model of a low pressure cycle was developed to calculate the power saved when packing is used, and cost models were developed for the columns and cold box. The rate of return was calculated on the extra investment required for a packed plant, based on the annual power saving. Structured packing was found to be economic only in larger plants, where economies of scale mean that the increased capital cost becomes less significant compared with the power saved. It was also found that different-sized plants favour different packings. The analysis identified that the packing variable with the biggest impact on the economic balance was the efficiency, and that increasing the efficiency of current packings could enhance their economic balance in air distillation. A new packing was therefore developed to have a higher efficiency than conventional ones. The vapour phase resistance was targeted for reduction, since most packing models predict this to be dominant. The final shape was chosen as the easiest and most economic to make. It has converging and diverging channels, and was manufactured in two specific areas and with two block heights by Tianjin University Packing Factory. A 0.3 m diameter distillation column test rig was designed, built and commissioned with the standard Sulzer Mellapak 500YW. It was then used to test the new packing alongside some standard ones. Because the packings had different specific areas, correlations of published results were developed to allow a true comparison to be made. The test results show that, unexpectedly, the packings with 0.1 m high blocks have an efficiency about 8% greater than the standard 0.2 m blocks. The new shape as implemented in the 350Y packing shows an additional 7% greater efficiency, so it is 15% better than a standard packing with the same area.
It has a better efficiency than the Mellapak 500YW and the higher capacity associated with its lower area. The new 500Y did not show a significant advantage.
Abstract:
In this paper we propose a range of dynamic data envelopment analysis (DEA) models which allow information on costs of adjustment to be incorporated into the DEA framework. We first specify a basic dynamic DEA model predicated on a number of simplifying assumptions. We then outline a number of extensions to this model to accommodate asymmetric adjustment costs; non-static output quantities, input prices and costs of adjustment; technological change; quasi-fixed inputs; and investment budget constraints. The new dynamic DEA models provide valuable extra information relative to the standard static DEA models: they identify an optimal path of adjustment for the input quantities, and provide a measure of the potential cost savings that result from recognising the costs of adjusting input quantities towards the optimal point. The new models are illustrated using data relating to a chain of 35 retail department stores in Chile. The empirical results illustrate the wealth of information that can be derived from these models, and clearly show that static models overstate potential cost savings when adjustment costs are non-zero.
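The paper's dynamic DEA models are linear programs, but the core intuition, that a static model overstates cost savings once adjustment costs are recognised, can be sketched in a stylized one-input example. The sketch below is an illustration with made-up numbers and a quadratic adjustment cost, not the paper's DEA formulation:

```python
# Stylized one-input illustration (not the paper's DEA formulation) of why
# static models overstate cost savings when adjustment costs are non-zero.
# Assumptions: input price w, quadratic adjustment cost (c/2)*(x - x0)**2,
# and a frontier-minimum input level x_min identified by a static DEA model.

def dynamic_optimum(x0, x_min, w, c):
    """Input level minimizing w*x + (c/2)*(x - x0)**2 subject to x >= x_min."""
    x_star = x0 - w / c          # unconstrained first-order condition
    return max(x_star, x_min)    # cannot move below the frontier

def savings(x0, x_min, w, c):
    """Static vs dynamic (adjustment-cost-aware) cost savings."""
    static = w * (x0 - x_min)    # static DEA: jump straight to the frontier
    x = dynamic_optimum(x0, x_min, w, c)
    dynamic = w * (x0 - x) - (c / 2) * (x - x0) ** 2
    return static, dynamic

static, dynamic = savings(x0=10.0, x_min=6.0, w=2.0, c=1.0)
print(static, dynamic)  # static savings exceed the achievable dynamic savings
```

With these numbers the static model promises a saving of 8.0, but once the cost of adjusting the input is paid, only 2.0 is achievable, mirroring the paper's empirical finding.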
Abstract:
Investors value the special attributes of monetary assets (e.g., exchangeability, liquidity, and safety) and pay a premium for holding them in the form of a lower rate of return. The user cost of holding monetary assets can be measured approximately by the difference between the returns on illiquid risky assets and those on safer liquid assets. A more appropriate measure should adjust this difference for the differential risk of the assets in question. We investigate the impact that time non-separable preferences have on the estimation of the risk-adjusted user cost of money. Using U.K. data from 1965Q1 to 2011Q1, we estimate a habit-based asset pricing model with money in the utility function and find that the risk adjustment for risky monetary assets is negligible. Thus, researchers can dispense with risk-adjusting the user cost of money when constructing monetary aggregate indexes.
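The unadjusted user cost described above (the return spread between an illiquid benchmark asset and a liquid monetary asset) is usually computed with Barnett's discounted-spread formula; the sketch below applies it with invented rates for illustration:

```python
# A minimal sketch of the (unadjusted) user cost of a monetary asset in the
# spirit of Barnett's formula: the discounted spread between the benchmark
# rate R on an illiquid asset and the own rate r paid on the monetary asset.
# The rates below are made up for illustration only.

def user_cost(R, r):
    """Nominal user cost per unit of the monetary asset: (R - r) / (1 + R)."""
    return (R - r) / (1.0 + R)

# A safer, more liquid asset (lower own rate) carries a higher user cost:
print(user_cost(R=0.06, r=0.01))  # cash-like asset
print(user_cost(R=0.06, r=0.05))  # near-benchmark deposit
```

The paper's finding is that adding a risk adjustment to this spread changes it only negligibly, so the simple formula suffices when building monetary aggregate indexes.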
Abstract:
Quality-oriented management systems and methods have become the dominant business and governance paradigm. From this perspective, satisfying customers’ expectations by supplying reliable, good quality products and services is the key factor for an organization and even a government. During recent decades, Statistical Quality Control (SQC) methods have been developed as the technical core of quality management and the continuous improvement philosophy, and are now being applied widely to improve the quality of products and services in the industrial and business sectors. Recently, SQC tools, in particular quality control charts, have been used in healthcare surveillance. In some cases, these tools have been modified and developed to better suit the characteristics and needs of the health sector. It seems that some of the work in the healthcare area has evolved independently of the development of industrial statistical process control methods. Therefore, analysing and comparing the paradigms and characteristics of quality control charts and techniques across the different sectors presents opportunities for transferring knowledge and future development in each sector. Meanwhile, the capabilities of the Bayesian approach, particularly Bayesian hierarchical models and computational techniques in which all uncertainty is expressed in terms of probability, facilitate decision making and cost-effectiveness analyses. Therefore, this research investigates the use of the quality improvement cycle in a health setting using clinical data from a hospital. The need for clinical data for monitoring purposes is investigated in two respects. A framework and appropriate tools from the industrial context are proposed and applied to evaluate and improve data quality in available datasets and data flow; a data capturing algorithm using Bayesian decision making methods is then developed to determine an economical sample size for statistical analyses within the quality improvement cycle.
Having ensured clinical data quality, some characteristics of control charts in the health context, including the necessity of monitoring attribute data and correlated quality characteristics, are considered. To this end, multivariate control charts from an industrial context are adapted to monitor radiation delivered to patients undergoing diagnostic coronary angiograms, and various risk-adjusted control charts are constructed and investigated for monitoring binary outcomes of clinical interventions as well as post-intervention survival time. Meanwhile, adoption of a Bayesian approach is proposed as a new framework for estimating the change point following a control chart signal. This estimate aims to facilitate root-cause analysis efforts in the quality improvement cycle, since it narrows the search for the potential causes of detected changes to a tighter time-frame prior to the signal. This approach enables us to obtain highly informative estimates for change point parameters, since probability-distribution-based results are obtained. Using Bayesian hierarchical models and Markov chain Monte Carlo computational methods, Bayesian estimators of the time and magnitude of various change scenarios, including step changes, linear trends and multiple changes in a Poisson process, are developed and investigated. The benefits of change point investigation are revisited and promoted in monitoring hospital outcomes, where the developed Bayesian estimator reports the true time of the shifts, compared with a priori known causes, detected by control charts in monitoring the rate of excess usage of blood products and major adverse events during and after cardiac surgery in a local hospital. The development of the Bayesian change point estimators is then extended to healthcare surveillance for processes in which pre-intervention characteristics of patients affect the outcomes.
In this setting, the Bayesian estimator is first extended to capture the patient mix (covariates) through the risk models underlying risk-adjusted control charts. Variations of the estimator are developed to estimate the true time of step changes and linear trends in the odds ratio of intensive care unit outcomes in a local hospital. Secondly, the Bayesian estimator is extended to identify the time of a shift in mean survival time after a clinical intervention that is being monitored by risk-adjusted survival time control charts. In this context, the survival time after a clinical intervention is also affected by the patient mix, and the survival function is constructed using a survival prediction model. The simulation studies undertaken in each research component, and the results obtained, strongly recommend the developed Bayesian estimators as an alternative for change point estimation within the quality improvement cycle in healthcare surveillance as well as in industrial and business contexts. The superiority of the proposed Bayesian framework and estimators is enhanced when the probability quantification, flexibility and generalizability of the developed models are also considered. The advantages of the Bayesian approach seen in the general context of quality control may also extend to the industrial and business domains where quality monitoring was initially developed.
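As a toy illustration of the change point machinery described above (not the thesis's hierarchical MCMC models), the following computes an exact discrete posterior over the time of a step change in a Poisson process, using conjugate Gamma priors on the pre- and post-change rates; the counts and prior parameters are invented:

```python
# A minimal, hand-rolled sketch of Bayesian step-change-point estimation in a
# Poisson process. A discrete posterior over the change time tau is computed
# exactly on a grid, with conjugate Gamma(a, b) priors on the two rates.
import math

def log_marginal(counts, a=1.0, b=0.1):
    """Log marginal likelihood of Poisson counts under a Gamma(a, b) rate prior."""
    k, s = len(counts), sum(counts)
    return (math.lgamma(a + s) - math.lgamma(a)
            + a * math.log(b) - (a + s) * math.log(b + k)
            - sum(math.lgamma(y + 1) for y in counts))

def change_point_posterior(counts):
    """Posterior over tau, the first index governed by the post-change rate."""
    taus = range(1, len(counts))
    logs = [log_marginal(counts[:t]) + log_marginal(counts[t:]) for t in taus]
    m = max(logs)                              # stabilise before exponentiating
    w = [math.exp(l - m) for l in logs]
    z = sum(w)
    return {t: wi / z for t, wi in zip(taus, w)}

counts = [2, 3, 2, 3, 2, 8, 9, 7, 8, 9]       # invented shift after index 4
post = change_point_posterior(counts)
tau_hat = max(post, key=post.get)
print(tau_hat)                                 # posterior mode of the change time
```

Because the result is a full probability distribution over tau, not a single point, it delivers exactly the "probability-distribution-based" estimates the thesis argues for.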
Abstract:
Developers and policy makers are consistently at odds in the debate over whether impact fees increase house prices. This debate continues despite the extensive body of theoretical and empirical international literature that discusses the passing on of impact fees to home buyers, and the corresponding increase in housing prices. In attempting to quantify this impact, over a dozen empirical studies have been carried out in the US and Canada since the 1980s. However, the methodologies used vary greatly, as do the results. Despite similar infrastructure funding policies in numerous developed countries, no such empirical works exist outside of the US and Canada. The purpose of this research is to analyse the existing econometric models in order to identify, compare and contrast the theoretical bases, methodologies, key assumptions and findings of each. This research will assist in identifying whether further model development is required and/or whether any of these models have external validity and are readily transferable outside of the US. The findings conclude that there is very little explicit rationale behind the various model selections and that significant model deficiencies appear still to exist.
Abstract:
During the early design stages of construction projects, accurate and timely cost feedback is critical to design decision making. This is particularly challenging for cost estimators, as they must quickly and accurately estimate the cost of the building when the design is still incomplete and evolving. State-of-the-art software tools typically use a rule-based approach to generate detailed quantities from the design details present in a building model and relate them to the cost items in a cost estimating database. In this paper, we propose a generic approach for creating and maintaining a cost estimate using flexible mappings between a building model and a cost estimate. The approach uses queries on the building design to populate views, and each view is then associated with one or more cost items. The benefit of this approach is that the flexibility of modern query languages allows the estimator to encode a broad variety of relationships between the design and the estimate. It also avoids the need for a common standard to which both designers and estimators must conform, giving the estimator added flexibility and functionality in their work.
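A minimal sketch of the query-to-view-to-cost-item mapping idea described above, with an invented building model, query predicates and unit rates (the paper itself targets richer query languages over full building models):

```python
# Toy sketch of the flexible-mapping idea: queries over a building model
# populate named views, and each view is linked to one or more cost items.
# The model, predicates, and unit rates are invented for illustration.

building_model = [
    {"type": "wall",   "area_m2": 40.0, "fire_rated": True},
    {"type": "wall",   "area_m2": 25.0, "fire_rated": False},
    {"type": "window", "area_m2": 6.0},
]

# Views are defined by arbitrary predicates rather than a fixed schema, so the
# estimator can encode any design-to-estimate relationship they need.
views = {
    "fire_rated_walls": lambda e: e["type"] == "wall" and e.get("fire_rated"),
    "glazing":          lambda e: e["type"] == "window",
}

# Each view maps to one or more cost items: (description, unit rate per m2).
cost_items = {
    "fire_rated_walls": [("FR plasterboard system", 95.0)],
    "glazing":          [("Double glazing", 420.0), ("Window flashing", 35.0)],
}

def estimate(model):
    """Re-evaluate every view against the current model and price its items."""
    total = 0.0
    for name, predicate in views.items():
        qty = sum(e["area_m2"] for e in model if predicate(e))
        for desc, rate in cost_items[name]:
            total += qty * rate
    return total

print(estimate(building_model))  # re-runs cleanly as the design evolves
```

Because the estimate is regenerated by re-running the queries, it stays consistent with the evolving design without forcing designers and estimators onto a shared standard.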
Abstract:
This study analyses and compares the cost efficiency of Japanese steam power generation companies using the fixed and random Bayesian frontier models. We show that it is essential to account for heterogeneity in modelling the performance of energy companies. Results from the model estimation also indicate that restricting CO2 emissions can lead to a decrease in total cost. The study finally discusses the efficiency variations between the energy companies under analysis, and elaborates on the managerial and policy implications of the results.
Abstract:
This digital poster (which was on display at "The Cube", Queensland University of Technology) demonstrates how specification parameters can be extracted from a product library repository for use in augmenting the information contents of the objects in a local BIM tool (Revit in this instance).
Abstract:
To identify current ED models of care and their impact on care quality, care effectiveness, and cost. A systematic search of key health databases (Medline, CINAHL, Cochrane, EMbase) was conducted to identify literature on ED models of care. Additionally, a focused review of the contents of 11 international and national emergency medicine, nursing and health economic journals (published between 2010 and 2013) was undertaken, with snowball identification of references of the most recent and relevant papers. Articles published between 1998 and 2013 in the English language were included for initial review by three of the authors. Studies conducted in underdeveloped countries, or not addressing the objectives of the present study, were excluded. Relevant details were extracted from the retrieved literature and analysed for relevance and impact. The literature was synthesised around the study's main themes. Models described within the literature mainly focused on addressing issues at the input, throughput or output stages of ED care delivery. Models often varied to account for site-specific characteristics (e.g. onsite inpatient units) or to suit staffing profiles (e.g. extended scope physiotherapist), ED geographical location (e.g. metropolitan or rural site), and patient demographic profile (e.g. paediatrics, older persons, ethnicity). Only a few studies conducted cost-effectiveness analysis of service models. Although various models of delivering emergency healthcare exist, further research is required in order to make accurate and reliable assessments of their safety, clinical effectiveness and cost-effectiveness.
Abstract:
To facilitate marketing and export, the Australian macadamia industry requires accurate crop forecasts. Each year, two levels of crop predictions are produced for this industry. The first is an overall longer-term forecast based on tree census data of growers in the Australian Macadamia Society (AMS). This data set currently accounts for around 70% of total production, and is supplemented by our best estimates of non-AMS orchards. Given these total tree numbers, average yields per tree are needed to complete the long-term forecasts. Yields from regional variety trials were initially used, but were found to be consistently higher than the average yields that growers were obtaining. Hence, a statistical model was developed using growers' historical yields, also taken from the AMS database. This model accounted for the effects of tree age, variety, year, region and tree spacing, and explained 65% of the total variation in the yield per tree data. The second level of crop prediction is an annual climate adjustment of these overall long-term estimates, taking into account the expected effects on production of the previous year's climate. This adjustment is based on relative historical yields, measured as the percentage deviance between expected and actual production. The dominant climatic variables are observed temperature, evaporation, solar radiation and modelled water stress. Initially, a number of alternate statistical models showed good agreement within the historical data, with jack-knife cross-validation R2 values of 96% or better. However, forecasts varied quite widely between these alternate models. Exploratory multivariate analyses and nearest-neighbour methods were used to investigate these differences. For 2001-2003, the overall forecasts were in the right direction (when compared with the long-term expected values), but were over-estimates. 
In 2004 the forecast was well under the observed production, and in 2005 the revised models produced a forecast within 5.1% of the actual production. Over the first five years of forecasting, the absolute deviance for the climate-adjustment models averaged 10.1%, just outside the targeted objective of 10%.
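The percentage-deviance bookkeeping used above to adjust and judge the forecasts can be sketched directly; all figures below are invented for illustration:

```python
# Minimal sketch of the climate-adjustment bookkeeping described above:
# relative historical yield measured as percentage deviance between expected
# and actual production, and the mean absolute deviance used to judge the
# forecasts. All figures are invented.

def pct_deviance(expected, actual):
    """Percentage deviance of actual production from the long-term expectation."""
    return 100.0 * (actual - expected) / expected

def mean_abs_deviance(pairs):
    """Average absolute percentage deviance over a run of forecast seasons."""
    return sum(abs(pct_deviance(e, a)) for e, a in pairs) / len(pairs)

# (expected kt, actual kt) for five hypothetical seasons
history = [(30.0, 27.0), (32.0, 35.2), (31.0, 28.5), (33.0, 36.0), (34.0, 33.0)]
print(mean_abs_deviance(history))  # compare against a 10% target, as above
```

A season's climate adjustment then scales the long-term forecast by the deviance predicted from the climate variables, and the mean absolute deviance over several seasons is what is compared against the 10% target.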
Abstract:
This paper studies the impact of "liberalizing" the cost-sharing of links on some basic models of network formation. This is done in a setting where both doubly supported and singly supported links are possible, and which includes the two seminal models of network formation by Jackson and Wolinsky and by Bala and Goyal as extreme cases. In this setting, the notion of pairwise stability is extended, and it is proved that liberalizing cost-sharing for doubly supported links widens the range of parameter values where the efficient networks formed by such links are pairwise stable, while the range of parameter values where the efficient networks formed by singly supported links are pairwise stable shrinks, but the region where the latter are efficient and pairwise stable remains the same.
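As a baseline for the setting described, a pairwise stability check in the symmetric connections model of Jackson and Wolinsky (one of the extreme cases the paper extends with cost-sharing) can be sketched as follows; the delta and c values are illustrative:

```python
# Pairwise stability checker for the symmetric connections model of Jackson
# and Wolinsky: each agent earns delta**d(i, j) from every reachable agent j
# and pays cost c per maintained (doubly supported) link.
from itertools import combinations

def distances(n, links):
    """All-pairs shortest paths (Floyd-Warshall) on an undirected graph."""
    INF = float("inf")
    d = [[0 if i == j else INF for j in range(n)] for i in range(n)]
    for i, j in links:
        d[i][j] = d[j][i] = 1
    for k in range(n):
        for i in range(n):
            for j in range(n):
                if d[i][k] + d[k][j] < d[i][j]:
                    d[i][j] = d[i][k] + d[k][j]
    return d

def utility(n, links, delta, c):
    d = distances(n, links)
    return [sum(delta ** d[i][j] for j in range(n)
                if j != i and d[i][j] < float("inf"))
            - c * sum(1 for l in links if i in l) for i in range(n)]

def pairwise_stable(n, links, delta, c):
    links = {tuple(sorted(l)) for l in links}
    u = utility(n, links, delta, c)
    for ij in links:                       # no one gains by deleting a link
        v = utility(n, links - {ij}, delta, c)
        if v[ij[0]] > u[ij[0]] or v[ij[1]] > u[ij[1]]:
            return False
    for ij in combinations(range(n), 2):   # no mutually beneficial missing link
        if ij in links:
            continue
        v = utility(n, links | {ij}, delta, c)
        if v[ij[0]] > u[ij[0]] and v[ij[1]] >= u[ij[1]]:
            return False
    return True

complete = [(0, 1), (0, 2), (1, 2)]
print(pairwise_stable(3, complete, delta=0.5, c=0.1))  # low cost: stable
print(pairwise_stable(3, [], delta=0.5, c=0.6))        # high cost: empty stable
```

The paper's extension changes how the per-link cost c is split between the two endpoints, which shifts the parameter regions in which such networks pass this stability test.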
Abstract:
Contains abstract