26 results for log-based cost analysis
in Aston University Research Archive
Abstract:
Are the perceptions of professional economists on transaction costs consistent with make-or-buy decisions made within firms? The answer may have important implications for transaction cost research. Data on firms' outsourcing during the new product development process are taken from a large-scale survey of UK, German and Irish manufacturing plants, and we test the consistency of these outsourcing decisions with the predictions derived from the transaction cost perceptions of a panel of economists. Little consistency is evident between actual outsourcing patterns and the predictions of the (Williamsonian) transaction cost model derived from the panel of economists. There is, however, evidence of a systematic pattern to the differences, suggesting that a competence or resource-based approach may be relevant to understanding firm outsourcing, and that firms are adopting a strategic approach to managing their external relationships. © Cambridge Political Economy Society 2005; all rights reserved.
Abstract:
Energy-related costs account for more than half of the total life cycle cost of asphalt pavements. Furthermore, fluctuations in the price of energy have been much greater than general inflation and interest rates. This makes energy price inflation an important variable that should be addressed when performing life cycle cost (LCC) studies regarding asphalt pavements. The present value of future costs is highly sensitive to the selected discount rate. Therefore, the choice of the discount rate is the most critical element in LCC analysis over the lifetime of a project. The objective of the paper is to present a discount rate for asphalt pavement projects as a function of interest rate, general inflation and energy price inflation. The discount rate is defined based on the portion of energy-related costs over the lifetime of the pavement. Consequently, it can reflect the financial risks related to the energy price in asphalt pavement projects. It is suggested that a discount rate sensitivity analysis for asphalt pavements in Sweden should range between -20% and 30%.
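The abstract does not give the combined discount rate in closed form, so the following is only a minimal sketch of the idea, assuming a Fisher-style real rate for each inflation stream and a weighting by the energy cost share; all parameter values and function names are illustrative assumptions, not the paper's formulation.

```python
# Sketch (not the paper's exact formula): a combined real discount rate that
# weights general and energy price inflation by the energy-related cost share.
# All numeric inputs below are invented for illustration.

def real_discount_rate(interest: float, inflation: float) -> float:
    """Fisher-style real discount rate for a single inflation stream."""
    return (1 + interest) / (1 + inflation) - 1

def combined_discount_rate(interest: float, general_inflation: float,
                           energy_inflation: float, energy_share: float) -> float:
    """Weight the two real discount rates by the energy-related cost share (assumption)."""
    r_general = real_discount_rate(interest, general_inflation)
    r_energy = real_discount_rate(interest, energy_inflation)
    return (1 - energy_share) * r_general + energy_share * r_energy

def present_value(future_cost: float, rate: float, year: int) -> float:
    """Discount a single future cost back to the present."""
    return future_cost / (1 + rate) ** year

if __name__ == "__main__":
    r = combined_discount_rate(interest=0.04, general_inflation=0.02,
                               energy_inflation=0.06, energy_share=0.5)
    print(f"combined real discount rate: {r:.3%}")
    print(f"PV of a 1.0 MSEK overlay in year 20: {present_value(1.0, r, 20):.3f} MSEK")
```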
Abstract:
Transaction cost theory is one of the most widely used theories in marketing, management, and economics. The focus of the theory is on explaining how firms organize transactions. The rules by which transactions are organized are called governance. A wide variety of strategic decisions of firms, such as outsourcing, the mode of organizing exports, the use of crowdsourcing, or partner selection efforts, can be analyzed and understood using transaction cost theory. The basic argument of transaction cost theory is that firms economize on costs by choosing a form of governance that minimizes the sum of production and transaction costs. We discuss the origins and uses of the theory, critical variables, assumptions, and limitations.
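As a minimal, purely illustrative sketch of the cost-economizing argument (not taken from the source), the comparison below picks the governance form with the lowest sum of production and transaction costs; the options and figures are hypothetical.

```python
# Illustrative only: choose the governance form minimizing production + transaction cost.
# Governance options and cost figures are hypothetical.

GOVERNANCE_OPTIONS = {
    # option: (production_cost, transaction_cost)
    "make (vertical integration)": (120.0, 10.0),
    "buy (market contract)":       (100.0, 35.0),
    "hybrid (alliance)":           (105.0, 20.0),
}

def choose_governance(options: dict[str, tuple[float, float]]) -> str:
    """Return the governance form with the lowest total cost."""
    return min(options, key=lambda name: sum(options[name]))

if __name__ == "__main__":
    for name, (prod, trans) in GOVERNANCE_OPTIONS.items():
        print(f"{name:30s} total = {prod + trans:6.1f}")
    print("chosen:", choose_governance(GOVERNANCE_OPTIONS))
```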
Abstract:
The aim of this research was to improve the quantitative support to project planning and control, principally through the use of more accurate forecasting, for which new techniques were developed. This study arose from the observation that in most cases construction project forecasts were based on a methodology (c.1980) which relied on the DHSS cumulative cubic cost model and network-based risk analysis (PERT). The former of these, in particular, imposes severe limitations which this study overcomes. Three areas of study were identified, namely growth curve forecasting, risk analysis and the interface of these quantitative techniques with project management. These fields have been used as a basis for the research programme. In order to give a sound basis for the research, industrial support was sought. This resulted in both the acquisition of cost profiles for a large number of projects and the opportunity to validate practical implementation. The outcome of this research project was deemed successful both in theory and in practice. The new forecasting theory was shown to give major reductions in projection errors. The integration of the new predictive and risk analysis technologies with management principles allowed the development of a viable software management aid which fills an acknowledged gap in current technology.
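For orientation, growth-curve forecasting of this kind fits an S-shaped cumulative cost profile against project time. The sketch below fits a generic cumulative cubic by least squares; it is not the exact DHSS parameterisation nor the thesis's new techniques, and the monthly cost fractions are invented.

```python
# Minimal sketch: fit a cumulative cubic cost (S-curve) model to a project cost
# profile. Generic cubic through the origin; data invented for illustration.

import numpy as np

# normalised time (0..1) and cumulative cost as a fraction of the final value
t = np.array([0.1, 0.2, 0.3, 0.4, 0.5, 0.6, 0.7, 0.8, 0.9, 1.0])
c = np.array([0.02, 0.07, 0.16, 0.28, 0.43, 0.58, 0.73, 0.85, 0.94, 1.00])

# least-squares fit of c(t) = a*t + b*t**2 + d*t**3
X = np.column_stack([t, t**2, t**3])
coeffs, *_ = np.linalg.lstsq(X, c, rcond=None)

def forecast(time_fraction: float) -> float:
    """Predicted cumulative cost fraction at a given fraction of project duration."""
    return float(np.polyval(np.append(coeffs[::-1], 0.0), time_fraction))

print("fitted coefficients (t, t^2, t^3):", np.round(coeffs, 3))
print("forecast cumulative fraction at mid-project:", round(forecast(0.5), 3))
```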
Abstract:
Nearest feature line-based subspace analysis is proposed for the first time in this paper. Compared with conventional methods, the newly proposed one brings better generalization performance and supports incremental analysis. The projection point and feature line distance are expressed as a function of a subspace, which is obtained by minimizing the mean square feature line distance. Moreover, by adopting a stochastic approximation rule to minimize the objective function in a gradient manner, the new method can be performed in an incremental mode, which makes it work well on future data. Experimental results on the FERET face database and the UCI satellite image database demonstrate its effectiveness.
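For readers unfamiliar with the feature-line construction, the sketch below computes the projection point and feature line distance for a query vector (the distance from a query to the line through two prototype feature points). The incremental subspace learning itself is not reproduced, and the vectors are invented.

```python
# Sketch of the feature line distance: project a query point onto the line
# through two prototype feature points and measure the residual.

import numpy as np

def feature_line_distance(y: np.ndarray, x1: np.ndarray, x2: np.ndarray) -> float:
    """Distance from query y to the feature line passing through x1 and x2."""
    d = x2 - x1
    mu = np.dot(y - x1, d) / np.dot(d, d)   # position parameter of the projection point
    projection = x1 + mu * d
    return float(np.linalg.norm(y - projection))

# toy example with made-up 3-D feature vectors
x1 = np.array([1.0, 0.0, 0.0])
x2 = np.array([0.0, 1.0, 0.0])
y = np.array([0.6, 0.6, 0.5])
print("feature line distance:", round(feature_line_distance(y, x1, x2), 4))
```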
Abstract:
The authors propose a new approach to discourse analysis which is based on metadata from the social networking behavior of learners who are immersed in a socially constructivist e-learning environment. It is shown that traditional data modeling techniques can be combined with social network analysis, an approach that promises to yield new insights into the largely uncharted domain of network-based discourse analysis. The chapter is intended as a non-technical introduction and is illustrated with real examples, visual representations, and empirical findings. Within the setting of a constructivist statistics course, the chapter provides an illustration of what network-based discourse analysis is about (mainly from a methodological point of view), how it is implemented in practice, and why it is relevant for researchers and educators.
Abstract:
BACKGROUND: Heavy menstrual bleeding (HMB) is a common problem, yet evidence to inform decisions about initial medical treatment is limited. OBJECTIVES: To assess the clinical effectiveness and cost-effectiveness of the levonorgestrel-releasing intrauterine system (LNG-IUS) (Mirena(®), Bayer) compared with usual medical treatment, with exploration of women's perspectives on treatment. DESIGN: A pragmatic, multicentre randomised trial with an economic evaluation and a longitudinal qualitative study. SETTING: Women who presented in primary care. PARTICIPANTS: A total of 571 women with HMB. A purposeful sample of 27 women who were randomised or ineligible owing to treatment preference participated in semistructured face-to-face interviews around 2 and 12 months after commencing treatment. INTERVENTIONS: LNG-IUS or usual medical treatment (tranexamic acid, mefenamic acid, combined oestrogen-progestogen or progesterone alone). Women could subsequently swap or cease their allocated treatment. OUTCOME MEASURES: The primary outcome was the patient-reported score on the Menorrhagia Multi-Attribute Scale (MMAS) assessed over a 2-year period and then again at 5 years. Secondary outcomes included general quality of life (QoL), sexual activity, surgical intervention and safety. Data were analysed using iterative constant comparison. A state transition model-based cost-utility analysis was undertaken alongside the randomised trial. Quality-adjusted life-years (QALYs) were derived from the European Quality of Life-5 Dimensions (EQ-5D) and the Short Form questionnaire-6 Dimensions (SF-6D). The intention-to-treat analyses were reported as cost per QALY gained. Uncertainty was explored by conducting both deterministic and probabilistic sensitivity analyses. RESULTS: The MMAS total scores improved significantly in both groups at all time points, but were significantly greater for the LNG-IUS than for usual treatment [mean difference over 2 years was 13.4 points, 95% confidence interval (CI) 9.9 to 16.9 points; p < 0.001]. However, this difference between groups was reduced and no longer significant by 5 years (mean difference in scores 3.9 points, 95% CI -0.6 to 8.3 points; p = 0.09). By 5 years, only 47% of women had a LNG-IUS in place and 15% were still taking usual medical treatment. Five-year surgery rates were low, at 20%, and were similar, irrespective of initial treatments. There were no significant differences in serious adverse events between groups. Using the EQ-5D, at 2 years, the relative cost-effectiveness of the LNG-IUS compared with usual medical treatment was £1600 per QALY, which by 5 years was reduced to £114 per QALY. Using the SF-6D, usual medical treatment dominates the LNG-IUS. The qualitative findings show that women's experiences and expectations of medical treatments for HMB vary considerably and change over time. Women had high expectations of a prompt effect from medical treatments. CONCLUSIONS: The LNG-IUS, compared with usual medical therapies, resulted in greater improvement over 2 years in women's assessments of the effect of HMB on their daily routine, including work, social and family life, and psychological and physical well-being. At 5 years, the differences were no longer significant. A similar low proportion of women required surgical intervention in both groups. The LNG-IUS is cost-effective in both the short and medium term, using the method generally recommended by the National Institute for Health and Care Excellence. 
Using the alternative measures to value QoL will have a considerable impact on cost-effectiveness decisions. It will be important to explore the clinical and health-care trajectories of the ECLIPSE (clinical effectiveness and cost-effectiveness of levonorgestrel-releasing intrauterine system in primary care against standard treatment for menorrhagia) trial participants to 10 years, by which time half of the cohort will have reached menopause. TRIAL REGISTRATION: Current Controlled Trials ISRCTN86566246. FUNDING: This project was funded by the NIHR Health Technology Assessment programme and will be published in full in Health Technology Assessment; Vol. 19, No. 88. See the NIHR Journals Library website for further project information.
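As background to the cost-per-QALY figures reported above, the quantity is an incremental cost-effectiveness ratio (ICER): the difference in mean costs divided by the difference in mean QALYs. The sketch below shows the arithmetic only, with hypothetical inputs rather than the ECLIPSE trial estimates.

```python
# Illustrative only: how a cost-per-QALY figure is computed as an incremental
# cost-effectiveness ratio (ICER). Costs and QALYs below are hypothetical.

def icer(cost_new: float, cost_usual: float,
         qaly_new: float, qaly_usual: float) -> float:
    """Incremental cost per QALY gained for the new treatment vs usual care."""
    return (cost_new - cost_usual) / (qaly_new - qaly_usual)

if __name__ == "__main__":
    value = icer(cost_new=1100.0, cost_usual=900.0,
                 qaly_new=1.62, qaly_usual=1.50)
    print(f"ICER: £{value:,.0f} per QALY gained")
```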
Abstract:
Presents a simulation study of the costing of police custody operations at a UK police force. The custody operation incorporates the arrest, booking-in, interview, detention and court appearance activities. The Activity Based Costing (ABC) approach is used as a framework to show how costs are generated by three “drivers”: cost, activity and resource. These relate, respectively, to the design efficiency of the process, the timing and mix of demand on the process, and the cost of resources used to undertake the process. The use of discrete-event simulation allows the incorporation of dynamic (time-dependent) and stochastic (variability) elements in the cost analysis. This enables both the amount and timing of the use of capacity and the generation of cost to be established. The concept of committed and flexible resources directs management decisions to the redeployment of unused capacity or, alternatively, the identification of additional capacity requirements.
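A static sketch of the ABC calculation may help: cost accumulates per activity as demand volume times activity time times resource cost rate. The activity names, times, rates and volumes below are hypothetical, and the dynamic and stochastic layer that the paper adds via discrete-event simulation is not reproduced.

```python
# Static Activity Based Costing sketch: cost = volume x activity time x resource rate.
# All activity durations, rates and volumes are illustrative assumptions.

ACTIVITIES = {
    # activity: (hours per case, resource cost rate per hour, cases per month)
    "booking-in":       (0.5, 30.0, 400),
    "interview":        (1.5, 35.0, 400),
    "detention":        (8.0, 20.0, 400),
    "court appearance": (2.0, 40.0, 150),
}

def monthly_activity_costs(activities: dict) -> dict[str, float]:
    """Monthly cost generated by each activity."""
    return {name: hours * rate * volume
            for name, (hours, rate, volume) in activities.items()}

if __name__ == "__main__":
    costs = monthly_activity_costs(ACTIVITIES)
    for name, cost in costs.items():
        print(f"{name:18s} £{cost:10,.0f}")
    print(f"{'total':18s} £{sum(costs.values()):10,.0f}")
```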
Abstract:
Predicting future need for water resources has traditionally been, at best, a crude mixture of art and science. This has prevented the evaluation of water need from being carried out in either a consistent or comprehensive manner. This inconsistent and somewhat arbitrary approach to water resources planning led to well-publicised premature developments in the 1970s and 1980s, but privatisation of the Water Industry, including creation of the Office of Water Services and the National Rivers Authority in 1989, turned the tide of resource planning to the point where funding of schemes and their justification by the Regulators could no longer be assumed. Furthermore, considerable areas of uncertainty were beginning to enter the debate and complicate the assessment. It was also no longer appropriate to consider that contingencies would continue to lie solely on the demand side of the equation. An inability to calculate the balance between supply and demand may mean an inability to meet standards of service or, arguably worse, an excessive provision of water resources and excessive costs to customers. The United Kingdom Water Industry Research Limited (UKWIR) Headroom project in 1998 provided a simple methodology for the calculation of planning margins. This methodology, although well received, was not, however, accepted by the Regulators as a tool sufficient to promote resource development. This thesis begins by considering the history of water resource planning in the UK, moving on to discuss events following privatisation of the water industry post-1989. The middle section of the research forms the bulk of original work and provides a scoping exercise which reveals a catalogue of uncertainties prevalent within the supply-demand balance. Each of these uncertainties is considered in terms of materiality, scope, and whether it can be quantified within a risk analysis package. Many of the areas of uncertainty identified would merit further research. A workable, yet robust, methodology for evaluating the balance between water resources and water demands by using a spreadsheet-based risk analysis package is presented. The technique involves statistical sampling and simulation such that samples are taken from input distributions on both the supply and demand side of the equation and the imbalance between supply and demand is calculated in the form of an output distribution. The percentiles of the output distribution represent different standards of service to the customer. The model allows dependencies between distributions to be considered, improved estimates of uncertainties to be assessed, and the impact of uncertain solutions to any imbalance to be calculated directly. The method is considered a significant leap forward in the field of water resource planning.
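The core of the sampling-and-simulation methodology can be sketched as a simple Monte Carlo exercise: draw from supply-side and demand-side input distributions, form the output distribution of the imbalance, and read off percentiles as levels of service. The distributions, parameters and units below are illustrative assumptions, not the thesis's calibrated inputs.

```python
# Monte Carlo sketch of a supply-demand balance risk analysis.
# Distributions and parameters are invented for illustration (units: Ml/d).

import numpy as np

rng = np.random.default_rng(0)
N = 100_000  # simulation trials

# supply side: deployable output with yield uncertainty, minus outage
supply = rng.normal(loc=500.0, scale=25.0, size=N) - rng.gamma(2.0, 5.0, size=N)

# demand side: forecast demand plus forecasting error
demand = rng.normal(loc=455.0, scale=20.0, size=N) + rng.normal(0.0, 10.0, size=N)

balance = supply - demand  # output distribution of the imbalance
for pct in (5, 25, 50):
    print(f"{pct:2d}th percentile balance: {np.percentile(balance, pct):7.1f} Ml/d")
print(f"probability of deficit: {np.mean(balance < 0):.1%}")
```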
Abstract:
Design verification in the digital domain, using model-based principles, is a key research objective to address the industrial requirement for reduced physical testing and prototyping. For complex assemblies, the verification of design and the associated production methods is currently fragmented, prolonged and sub-optimal, as it uses digital and physical verification stages that are deployed in a sequential manner using multiple systems. This paper describes a novel, hybrid design verification methodology that integrates model-based variability analysis with measurement data of assemblies, in order to reduce simulation uncertainty and allow early design verification from the perspective of satisfying key assembly criteria.
Abstract:
The focus of this study is on governance decisions in a concurrent channels context under uncertainty. The study examines how a firm chooses to deploy its sales force in times of uncertainty, and the subsequent performance outcomes of those deployment choices. The theoretical framework is based on multiple theories of governance, including transaction cost analysis (TCA), agency theory, and institutional economics. Three uncertainty variables are investigated in this study. The first two are demand and competitive uncertainty, which are considered to be industry-level forms of market uncertainty. The third, political uncertainty, is chosen as it is an important dimension of institutional environments, capturing non-economic circumstances such as regulations and political systemic issues. The study employs longitudinal secondary data from a Thai hotel chain, comprising monthly observations from January 2007 to December 2012. This hotel chain operates in four countries, Thailand, the Philippines, the United Arab Emirates (Dubai), and Egypt, all of which experienced substantial demand, competitive, and political uncertainty during the study period. This makes them ideal contexts for this study. Two econometric models, both deploying Newey-West estimations, are employed to test 13 hypotheses. The first model considers the relationship between uncertainty and governance. The second model is a Newey-West variant using an Instrumental Variables (IV) estimator and a Two-Stage Least Squares (2SLS) model, to test the direct effect of uncertainty on performance and the moderating effect of governance on the relationship between uncertainty and performance. The observed relationship between uncertainty and governance follows a core prediction of TCA: that vertical integration is the preferred choice of governance when uncertainty rises. As for the subsequent performance outcomes, the results corroborate that uncertainty has a negative effect on performance. Importantly, the findings show that becoming more vertically integrated cannot help moderate the effect of demand and competitive uncertainty, but can significantly moderate the effect of political uncertainty. These findings have significant theoretical and practical implications, extend our knowledge of the impact of uncertainty, and bring an institutional perspective to TCA. Further, they offer managers novel insight into the nature of different types of uncertainty, their impact on performance, and how channel decisions can mitigate these impacts.
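A minimal sketch of the first model's estimation approach is shown below, using simulated monthly data and Newey-West (HAC) standard errors via statsmodels; the variable constructions are hypothetical stand-ins for the study's measures, and the IV/2SLS second model is not reproduced.

```python
# Sketch: regression of a governance measure on uncertainty variables with
# Newey-West (HAC) standard errors. Data are simulated, not the hotel chain's.

import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(42)
T = 72  # monthly observations, Jan 2007 to Dec 2012

demand_unc = rng.normal(size=T)
competitive_unc = rng.normal(size=T)
political_unc = rng.normal(size=T)
governance = 0.4 * demand_unc + 0.3 * political_unc + rng.normal(scale=0.5, size=T)

X = sm.add_constant(np.column_stack([demand_unc, competitive_unc, political_unc]))
model = sm.OLS(governance, X).fit(cov_type="HAC", cov_kwds={"maxlags": 12})
print(model.summary(xname=["const", "demand_unc", "competitive_unc", "political_unc"]))
```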
Abstract:
Tensor analysis plays an important role in modern image and vision computing problems. Most existing tensor analysis approaches are based on the Frobenius norm, which makes them sensitive to outliers. In this paper, we propose L1-norm-based tensor analysis (TPCA-L1), which is robust to outliers. Experimental results on face and other datasets demonstrate the advantages of the proposed approach. © 2006 IEEE.
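As a simplified, vector-case sketch of the robust L1-norm criterion (the paper itself operates on tensors), the code below uses the well-known fixed-point update for L1-norm PCA, w <- normalize(sum_i sign(w.x_i) x_i); the data are invented and include a gross outlier.

```python
# Vector-case sketch of L1-norm PCA (greedy fixed-point scheme), as a simplified
# stand-in for the tensor formulation described above. Data are invented.

import numpy as np

def l1_pca_first_component(X: np.ndarray, n_iter: int = 100) -> np.ndarray:
    """First L1-norm principal direction of row-wise data X (n_samples x n_features)."""
    rng = np.random.default_rng(0)
    w = rng.normal(size=X.shape[1])
    w /= np.linalg.norm(w)
    for _ in range(n_iter):
        signs = np.sign(X @ w)
        signs[signs == 0] = 1.0  # avoid zero polarity
        w_new = X.T @ signs
        w_new /= np.linalg.norm(w_new)
        if np.allclose(w_new, w):
            break
        w = w_new
    return w

# toy data: correlated 2-D samples plus one gross outlier
X = np.vstack([np.random.default_rng(1).normal(size=(50, 2)) @ np.array([[3, 1], [1, 1]]),
               [[100.0, -100.0]]])
X = X - X.mean(axis=0)
print("L1 principal direction:", np.round(l1_pca_first_component(X), 3))
```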
Abstract:
Quantitative analysis of solid-state processes from isothermal microcalorimetric data is straightforward if data for the total process have been recorded and problematic (in the more likely case) when they have not. Data are usually plotted as a function of fraction reacted (α); for calorimetric data, this requires knowledge of the total heat change (Q) upon completion of the process. Determination of Q is difficult in cases where the process is fast (initial data missing) or slow (final data missing). Here we introduce several mathematical methods that allow the direct calculation of Q by selection of data points when only partial data are present, based on analysis with the Pérez-Maqueda model. All methods in addition allow direct determination of the reaction mechanism descriptors m and n and from this the rate constant, k. The validity of the methods is tested with the use of simulated calorimetric data, and we introduce a graphical method for generating solid-state power-time data. The methods are then applied to the crystallization of indomethacin from a glass. All methods correctly recovered the total reaction enthalpy (16.6 J) and suggested that the crystallization followed an Avrami model. The rate constants for crystallization were determined to be 3.98 × 10⁻⁶, 4.13 × 10⁻⁶, and 3.98 × 10⁻⁶ s⁻¹ with methods 1, 2, and 3, respectively. © 2010 American Chemical Society.
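For orientation, the kinetic relationship underlying the example is the Avrami model, for which the calorimetric power is P(t) = Q dα/dt with α(t) = 1 - exp(-(kt)^n). The sketch below recovers Q, k and n from simulated partial power-time data by generic nonlinear least squares; it is not the paper's direct-calculation methods, and the data are simulated.

```python
# Sketch: fit Avrami power-time data, P(t) = Q * n * k * (k*t)**(n-1) * exp(-(k*t)**n),
# to simulated partial data. Generic least-squares fit, not the paper's methods.

import numpy as np
from scipy.optimize import curve_fit

def avrami_power(t, Q, k, n):
    """Heat flow (W) for an Avrami process with total heat Q (J)."""
    return Q * n * k * (k * t) ** (n - 1) * np.exp(-((k * t) ** n))

# simulate a crystallisation with Q = 16.6 J, k = 4e-6 s^-1, n = 3,
# keeping only a partial window of the data (initial and final points missing)
t_full = np.linspace(1e4, 6e5, 400)
p_true = avrami_power(t_full, 16.6, 4e-6, 3.0)
noise = 1 + 0.01 * np.random.default_rng(0).normal(size=250)
t_obs, p_obs = t_full[50:300], p_true[50:300] * noise

popt, _ = curve_fit(avrami_power, t_obs, p_obs, p0=[10.0, 1e-6, 2.0],
                    bounds=([0.1, 1e-8, 0.5], [100.0, 1e-3, 6.0]))
print("recovered Q (J), k (s^-1), n:", round(popt[0], 2), popt[1], round(popt[2], 2))
```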
Abstract:
The research investigates the past, present and potential future role of Information Specialists (ISps) in process-oriented companies. It tests the proposition that ISps in companies that have undertaken formal process reengineering exercises are likely to become more proactive and more business-oriented (as opposed to technically oriented) than they had previously been when their organisations were organised along traditional, functional lines. A review of existing literature in the area of Business Process Reengineering and Information Management reveals a lack of consensus amongst researchers concerning the appropriate role for ISps during and after BPR. Opinion is divided as to whether IS professionals should reactively support BPR or whether IT/IS developments should be driving these initiatives. A questionnaire-based ‘Descriptive Survey’ with 60 respondents is used as a first stage of primary data gathering. This is followed by interviews with 20 of the participating organisations to gather further information on their experiences. The final stage of data collection consists of further in-depth interviews with four case study companies to provide an even richer picture of their experiences. The results of the questionnaire are analysed and displayed in the form of simple means, frequencies and bar graphs. The ‘NU-DIST’ computer-based discourse analysis package was tried in relation to summarising the interview findings, but this proved cumbersome and a visual collation method was preferred. Overall, the researcher contends that the proposition outlined above is proven, and she concludes the research by suggesting the implications of these findings. In particular she offers a ‘Framework for Understanding and Action’ which is deemed to be relevant to both practitioners and future researchers.
Abstract:
The research compares the usefulness of four remote sensing information sources, these being LANDSAT photographic prints, LANDSAT computer compatible tapes, Metric Camera and SIR-A photographic prints. These sources provide evaluations of the catchment characteristics of the Belize and Sibun river basins in Central America. Map evaluations at 1:250,000 scale are compared with results from the remotely sensed information sources at the same scale. The values of catchment characteristics for both maps and LANDSAT prints are used in multiple regression analysis, providing flood flow formulae, after investigations are made to derive a suitable dependent-variable discharge series from short-term records. The use of all remotely sensed information sources in providing evaluations of catchment characteristics is discussed. LANDSAT prints and computer compatible tapes of a post-flood scene are used to estimate flood distributions and volumes. These are compared with values obtained from unit hydrograph analysis, using the dependent discharge series, to evaluate the probable losses from the Belize river to the floodplain, thereby assessing the accuracy of the LANDSAT estimates. Information relating to flood behaviour is discussed in terms of basic image presentation as well as image processing. A cost analysis of the purchase and use of all materials is provided. Conclusions of the research indicate that LANDSAT print material may provide information suitable for regression analysis at levels of accuracy as great as those of topographic maps, that the differing information sources are uniquely applicable, and that accurate estimates of flood volumes may be determined even from post-flood imagery.