903 results for cost-informed process improvement
Abstract:
There is an increasing need for a model for the process-based performance measurement of multispecialty tertiary care hospitals for quality improvement. The analytic hierarchy process (AHP) is utilized in this study to evolve such a model. Each step in the model was derived through group discussions and brainstorming sessions among experienced clinicians and managers. The tool was applied to two tertiary care teaching hospitals, one in Barbados and one in India. The model enabled identification of specific areas where neither hospital performed very well, and helped in formulating recommendations to improve those areas. AHP is recommended as a valuable tool for measuring the process-based performance of multispecialty tertiary care hospitals. © Emerald Group Publishing Limited.
Abstract:
Purpose: To develop a model for the global performance measurement of intensive care units (ICUs) and to apply that model to compare the services for quality improvement. Materials and Methods: The analytic hierarchy process, a multiple-attribute decision-making technique, is used in this study to evolve such a model. The steps consisted of identifying the critical success factors for the best performance of an ICU, identifying the subfactors that influence the critical factors, comparing them pairwise, deriving their relative importance and ratings, and calculating the cumulative performance according to the attributes of a given ICU. Every step in the model was derived by group discussions, brainstorming, and consensus among intensivists. Results: The model was applied to 3 ICUs, 1 each in Barbados, Trinidad, and India, in tertiary care teaching hospitals of similar setting. The cumulative performance rating of the Barbados ICU was 1.17, compared with 0.82 and 0.75 for the Trinidad and Indian ICUs, respectively, showing that the Trinidad and Indian ICUs performed at 70% and 64% of the Barbados ICU's level. The model also enabled identification of specific areas where the ICUs did not perform well, which helped in improving those areas. Conclusions: The analytic hierarchy process is a very useful model for measuring the global performance of an ICU. © 2005 Elsevier Inc. All rights reserved.
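A minimal sketch of the pairwise-comparison step described in this abstract, using the common geometric-mean approximation to derive AHP priority weights from a reciprocal judgement matrix; the factor names, judgement values and ratings are hypothetical, not taken from the study:

```python
import numpy as np

# Hypothetical pairwise judgement matrix for three ICU success factors
# on Saaty's 1-9 scale; entry [i][j] states how much more important
# factor i is than factor j, and the matrix is reciprocal.
factors = ["clinical outcomes", "staffing", "equipment"]
A = np.array([
    [1.0, 3.0, 5.0],
    [1/3, 1.0, 2.0],
    [1/5, 1/2, 1.0],
])

# Geometric-mean approximation to the principal eigenvector,
# normalised so the priority weights sum to 1.
gm = A.prod(axis=1) ** (1.0 / A.shape[0])
weights = gm / gm.sum()
for f, w in zip(factors, weights):
    print(f"{f}: {w:.3f}")

# A unit's cumulative performance is the weighted sum of its ratings
# on each factor (ratings here are hypothetical 0-1 scores).
ratings = np.array([0.9, 0.7, 0.8])
print("cumulative performance:", round(float(weights @ ratings), 3))
```

Comparing the cumulative performance figures of several units, as done for the three ICUs above, then reduces to comparing these weighted sums.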
Abstract:
The existing method of pipeline health monitoring, which requires an entire pipeline to be inspected periodically, is both time-consuming and expensive. A risk-based model that reduces the amount of time spent on inspection is presented. This model not only reduces the cost of maintaining petroleum pipelines, but also suggests an efficient design and operation philosophy, construction methodology and logical insurance plans. The risk-based model uses the Analytic Hierarchy Process (AHP), a multiple-attribute decision-making technique, to identify the factors that influence failure in specific segments and analyzes their effects by determining the probabilities of the risk factors. The severity of failure is determined through consequence analysis. From this, the effect of a failure caused by each risk factor can be established in terms of cost, and the cumulative effect of failure is determined through probability analysis. The technique does not totally eliminate subjectivity, but it is an improvement over the existing inspection method.
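As a rough illustration of how probability and consequence analysis combine into a cost-based risk measure (the segment data below are hypothetical, not drawn from the paper):

```python
# Hypothetical risk factors for one pipeline segment: an AHP-derived
# failure probability and a consequence cost (US$) for each factor.
risk_factors = {
    "external corrosion":  (0.04, 2_500_000),
    "third-party damage":  (0.02, 4_000_000),
    "construction defect": (0.01, 1_500_000),
}

# Expected failure cost = sum over factors of probability x consequence.
expected_cost = sum(p * c for p, c in risk_factors.values())
print(f"expected failure cost: ${expected_cost:,.0f}")

# Segments whose expected cost exceeds a threshold are prioritised for
# inspection; the remainder can be inspected less frequently, which is
# where the model saves inspection time and money.
THRESHOLD = 150_000
print("inspect now" if expected_cost > THRESHOLD else "defer inspection")
```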
Abstract:
In order to survive in the increasingly customer-oriented marketplace, continuous quality improvement marks the success of the fastest-growing quality organizations. In recent years, attention has been focused on intelligent systems, which have shown great promise in supporting quality control. However, only a small number of the currently used systems are reported to be operating effectively, because they are designed to maintain a quality level within the specified process rather than to focus on cooperation within the production workflow. This paper proposes an intelligent system with a newly designed algorithm and a universal process data exchange standard to overcome the challenges of demanding customers who seek high-quality and low-cost products. The intelligent quality management system is equipped with a "distributed process mining" feature to provide all levels of employees with the ability to understand the relationships between processes, especially when any aspect of a process is about to degrade or fail. An example of generalized fuzzy association rules is applied in the manufacturing sector to demonstrate how the proposed iterative process mining algorithm finds the relationships between distributed process parameters and the presence of quality problems.
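A minimal sketch of the rule-mining idea: plain support/confidence counting over discretised process records (the fuzzification used in generalized fuzzy association rules is omitted, and the parameter states and records are hypothetical):

```python
# Hypothetical process records: each is the set of discretised parameter
# states plus the quality outcome observed for one production run.
records = [
    {"temp=high", "pressure=low", "defect"},
    {"temp=high", "pressure=low", "defect"},
    {"temp=high", "pressure=ok", "ok"},
    {"temp=low", "pressure=low", "ok"},
    {"temp=high", "pressure=low", "defect"},
]

def support(itemset):
    """Fraction of records containing every item in the itemset."""
    return sum(itemset <= r for r in records) / len(records)

# Rule: {temp=high, pressure=low} -> defect
antecedent = {"temp=high", "pressure=low"}
rule_support = support(antecedent | {"defect"})
confidence = rule_support / support(antecedent)
print(f"support={rule_support:.2f}, confidence={confidence:.2f}")
```

A high-confidence rule of this form is exactly the kind of cross-process relationship the proposed system surfaces before a process degrades or fails.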
Abstract:
Are the perceptions of professional economists on transaction costs consistent with make-or-buy decisions made within firms? The answer may have important implications for transaction cost research. Data on firms' outsourcing during the new product development process are taken from a large-scale survey of UK, German and Irish manufacturing plants, and we test the consistency of these outsourcing decisions with the predictions derived from the transaction cost perceptions of a panel of economists. Little consistency is evident between actual outsourcing patterns and the predictions of the (Williamsonian) transaction cost model derived from the panel of economists. There is, however, evidence of a systematic pattern to the differences, suggesting that a competence or resource-based approach may be relevant to understanding firm outsourcing, and that firms are adopting a strategic approach to managing their external relationships. © Cambridge Political Economy Society 2005; all rights reserved.
Abstract:
The topic of bioenergy, biofuels and bioproducts remains at the top of the current political and research agenda. Identification of the optimum processing routes for biomass, in terms of efficiency, cost, environment and socio-economics, is vital as concern grows over the remaining fossil fuel resources, climate change and energy security. It is known that the only renewable way of producing conventional hydrocarbon fuels and organic chemicals is from biomass, but the problem remains of identifying the best product mix and the most efficient way of processing biomass to products. The aim is to move Europe towards a biobased economy, and it is widely accepted that biorefineries are key to this development. A methodology was required for the generation and evaluation of biorefinery process chains for converting biomass into one or more valuable products, one that properly considers performance, cost, environment, socio-economics and other factors that influence the commercial viability of a process. In this thesis a methodology to achieve this objective is described. The completed methodology includes process chain generation, process modelling and subsequent analysis and comparison of results in order to evaluate alternative process routes. A modular structure was chosen to allow greater flexibility, allowing the user to generate a large number of different biorefinery configurations. The significance of the approach is that the methodology is defined and is thus rigorous and consistent, and may be readily re-examined if circumstances change. There was a requirement for consistency in structure and use, particularly for multiple analyses. It was important that analyses could be quickly and easily carried out to consider, for example, different scales, configurations and product portfolios, and so that previous outcomes could be readily reconsidered. The result of the completed methodology is the identification of the most promising biorefinery chains from those considered as part of the European Biosynergy Project.
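One way to picture the modular chain generation and evaluation (module names, yields and costs below are invented for illustration, not results from the Biosynergy work):

```python
from dataclasses import dataclass

@dataclass
class Module:
    """One processing step: a mass yield (out/in) and a cost per tonne in."""
    name: str
    mass_yield: float
    cost_per_t_in: float

def evaluate_chain(chain, feed_tonnes):
    """Propagate mass through the chain and accumulate processing cost."""
    mass, cost = feed_tonnes, 0.0
    for m in chain:
        cost += mass * m.cost_per_t_in
        mass *= m.mass_yield
    return mass, cost

# Two hypothetical biorefinery configurations for the same 1000 t feed.
pretreat  = Module("pretreatment", 0.95, 12.0)
pyrolysis = Module("fast pyrolysis", 0.65, 40.0)
upgrade   = Module("bio-oil upgrading", 0.55, 90.0)
gasify    = Module("gasification + synthesis", 0.20, 130.0)

for chain in ([pretreat, pyrolysis, upgrade], [pretreat, gasify]):
    product, cost = evaluate_chain(chain, feed_tonnes=1000)
    print([m.name for m in chain], f"-> {product:.0f} t product, ${cost:,.0f}")
```

Because each step is a self-contained module, alternative configurations, scales and product portfolios can be swapped in and re-evaluated quickly, which is the consistency and re-examination requirement described above.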
Abstract:
Benchmarking techniques have evolved over the years since Xerox's pioneering visits to Japan in the late 1970s. The focus of benchmarking has also shifted during this period. Tracing in detail the evolution of benchmarking in one specific area of business activity, supply and distribution management, as seen by the participants in that evolution, creates a picture of a movement from single-function, cost-focused, competitive benchmarking, through cross-functional, cross-sectoral, value-oriented benchmarking, to process benchmarking. As process efficiency and effectiveness become the primary foci of benchmarking activities, the measurement parameters used to benchmark performance converge with the factors used in business process modelling. The possibility is therefore emerging of modelling business processes and then feeding the models with actual data from benchmarking exercises. This would overcome the most common criticism of benchmarking, namely that it intrinsically lacks the ability to move beyond current best practice. In fact, the combined power of modelling and benchmarking may prove to be the basic building block of informed business process re-engineering.
Abstract:
This paper presents a new method for the optimisation of the mirror element spacing arrangement and operating temperature of linear Fresnel reflectors (LFR). The specific objective is to maximise available power output (i.e. exergy) and operational hours whilst minimising cost. The method is described in detail and compared to an existing design method prominent in the literature. Results are given in terms of the exergy per total mirror area (W/m²) and the cost per exergy (US$/W). The new method is applied principally to the optimisation of an LFR in Gujarat, India, for which cost data have been gathered. It is recommended to use a spacing arrangement such that the onset of shadowing among mirror elements occurs at a transversal angle of 45°. This results in a cost per exergy of 2.3 $/W. Compared to the existing design approach, the exergy averaged over the year is increased by 9% to 50 W/m², and an additional 122 h of operation per year are predicted. The ideal operating temperature at the surface of the absorber tubes is found to be 300 °C. It is concluded that the new method is an improvement over existing techniques and a significant tool for any future design work on LFR systems.
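The two figures of merit reported above are related as follows; the mirror area and installed cost used here are hypothetical placeholders chosen only to reproduce the reported 2.3 $/W, since the gathered Gujarat cost data are not given in the abstract:

```python
# Year-averaged exergy per total mirror area for the new method (from
# the abstract) and hypothetical plant parameters.
exergy_per_area = 50.0     # W/m^2, year-averaged
mirror_area = 400.0        # m^2, hypothetical collector field
capital_cost = 46_000.0    # US$, hypothetical installed cost

available_power = exergy_per_area * mirror_area   # W of exergy
cost_per_exergy = capital_cost / available_power  # $/W
print(f"available exergy: {available_power / 1e3:.1f} kW")
print(f"cost per exergy: {cost_per_exergy:.2f} $/W")
```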
Abstract:
Purpose - The purpose of this study is to develop a performance measurement model for service operations using the analytic hierarchy process approach. Design/methodology/approach - The study reviews current relevant literature on performance measurement and develops a model for performance measurement. The model is then applied to the intensive care units (ICUs) of three different hospitals in developing nations. Six focus group discussions were undertaken, involving experts from the specific area under investigation, in order to develop an understandable performance measurement model that was both quantitative and hierarchical. Findings - A combination of outcome-, structure- and process-based factors was used as a foundation for the model. Analyses of the links between them were used to reveal the relative importance of each factor and its associated subfactors. The model was considered an effective quantitative tool by the stakeholders. Research limitations/implications - This research only applies the model to ICUs in healthcare services. Practical implications - Performance measurement is an important area within the operations management field. Although numerous models are routinely deployed in both practice and research, there is always room for improvement. The present study proposes a hierarchical quantitative approach which considers both subjective and objective performance criteria. Originality/value - This paper develops a hierarchical quantitative model for service performance measurement. It considers success factors with respect to outcomes, structure and processes, with the involvement of the concerned stakeholders, based upon the analytic hierarchy process approach. The model is applied to the ICUs of hospitals in order to demonstrate its effectiveness, providing a comparative international study of service performance measurement in the ICUs of hospitals in three different countries. © Emerald Group Publishing Limited.
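Because the model rests on pairwise judgements elicited in focus groups, AHP practice is to check each judgement matrix for consistency before using its weights; a sketch with a hypothetical matrix:

```python
import numpy as np

# Hypothetical 3x3 reciprocal judgement matrix from one focus group.
A = np.array([
    [1.0,  2.0, 4.0],
    [0.5,  1.0, 2.0],
    [0.25, 0.5, 1.0],
])

# Saaty's consistency index CI = (lambda_max - n) / (n - 1), compared
# against the random index RI for the same matrix size.
n = A.shape[0]
lambda_max = max(np.linalg.eigvals(A).real)
CI = (lambda_max - n) / (n - 1)
RI = {3: 0.58, 4: 0.90, 5: 1.12}[n]   # Saaty's published random indices
CR = CI / RI
print(f"lambda_max = {lambda_max:.3f}, CR = {CR:.3f}")
print("judgements acceptable" if CR < 0.10 else "revisit the judgements")
```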
Abstract:
The objective of this work was to design, construct, test and operate a novel circulating fluid bed (CFB) fast pyrolysis reactor system for the production of liquids from biomass. The novelty lies in incorporating an integral char combustor to provide autothermal operation. A reactor design methodology was devised which correlated input parameters to process variables, namely temperature, heat transfer and gas/vapour residence time, for both the char combustor and the biomass pyrolyser. From this methodology a CFB reactor with integral char combustion was designed for a 10 kg/h biomass throughput. A full-scale cold model of the CFB unit was constructed and tested to derive suitable hydrodynamic relationships and performance constraints. Early difficulties encountered with poor solids circulation and inefficient product recovery were overcome by a series of modifications. A total of 11 runs in pyrolysis mode were carried out, with a maximum total liquids yield of 61.50% wt on a moisture-and-ash-free (maf) biomass basis, obtained at 500°C with a 0.46 s gas/vapour residence time. This could be raised to an anticipated 75% wt (maf basis) through improved vapour recovery by direct quenching. The reactor provides a very high specific throughput of 1.12-1.48 kg/h·m² and the lowest gas-to-feed ratio, 1.3-1.9 kg gas/kg feed, compared with other fast pyrolysis processes based on pneumatic reactors, and has good scale-up potential. These features should provide a significant capital cost reduction. Results to date suggest that the process is limited by the extent of char combustion. Future work will address resizing of the char combustor to increase overall system capacity, improvement in solids separation and substantially better liquid recovery. Extended testing will provide a better evaluation of steady-state operation and provide data for process simulation and reactor modelling.
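The moisture-and-ash-free (maf) basis used for the yields above normalises the liquid product to the reactive fraction of the feed; a small worked example with hypothetical feed properties:

```python
# Hypothetical feed: 10 kg/h of wood with 8% moisture and 1% ash.
feed_rate = 10.0                              # kg/h, as received
moisture, ash = 0.08, 0.01
maf_feed = feed_rate * (1 - moisture - ash)   # kg/h on a maf basis

liquids = 5.6                                 # kg/h bio-oil, hypothetical
print(f"liquid yield: {100 * liquids / maf_feed:.1f} % wt (maf basis)")

# Gas-to-feed ratio, the figure quoted for pneumatic reactors above.
fluidising_gas = 14.0                         # kg/h, hypothetical
print(f"gas-to-feed ratio: {fluidising_gas / feed_rate:.1f} kg gas/kg feed")
```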
Abstract:
The objective of the thesis was to analyse several process configurations for the production of electricity from biomass. Process simulation models using AspenPlus, aimed at calculating the industrial performance of power plant concepts, were built, tested, and used for analysis. The criteria used in the analysis were performance and cost. All of the advanced systems appear to have higher efficiencies than the commercial reference, the Rankine cycle. However, advanced systems typically have a higher cost of electricity (COE) than the Rankine power plant: high efficiencies do not reduce fuel costs enough to compensate for the high capital costs of the advanced concepts. The successful reduction of capital costs would appear to be the key to the introduction of the new systems, since capital costs account for a considerable, often dominant, part of the cost of electricity in these concepts. All of the systems have higher specific investment costs than the conventional industrial alternative, i.e. the Rankine power plant. Combined heat and power production (CHP) is currently the only industrial area of application in which bio-power costs can be reduced enough to make them competitive. Based on the results of this work, AspenPlus is an appropriate simulation platform. However, the usefulness of the models could be improved if a number of unit operations were modelled in greater detail: the dryer, gasifier, fast pyrolysis, gas engine and gas turbine models could all be improved.
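The efficiency-versus-capital trade-off described above can be made concrete with a simple levelised cost-of-electricity estimate; every input below is a hypothetical illustration, not a result from the thesis:

```python
def cost_of_electricity(capital, crf, fuel_price, efficiency,
                        om_frac, full_load_hours, capacity_mw):
    """Levelised COE in $/MWh: (capital charge + O&M + fuel) / MWh sold."""
    mwh = capacity_mw * full_load_hours            # electricity per year
    fuel_mwh = mwh / efficiency                    # fuel energy in
    annual_cost = capital * (crf + om_frac) + fuel_mwh * fuel_price
    return annual_cost / mwh

# Hypothetical 20 MW biomass plants: Rankine vs. an advanced concept.
rankine  = cost_of_electricity(30e6, 0.10, 18.0, 0.30, 0.03, 7000, 20)
advanced = cost_of_electricity(55e6, 0.10, 18.0, 0.42, 0.03, 7000, 20)
print(f"Rankine COE:  {rankine:.1f} $/MWh")
print(f"Advanced COE: {advanced:.1f} $/MWh")
# The higher efficiency shrinks the fuel term, but the larger capital
# charge still leaves the advanced concept with the higher COE.
```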
Abstract:
Most parametric software cost estimation models used today evolved in the late 1970s and early 1980s. At that time, the dominant software development techniques in use were the early 'structured methods'. Since then, several new systems development paradigms and methods have emerged, one being Jackson System Development (JSD). As current cost estimating methods do not take account of these developments, their lack of universality means they cannot provide adequate estimates of effort and hence cost. In order to address these shortcomings, two new estimation methods have been developed for JSD projects. One of these methods, JSD-FPA, is a top-down estimating method based on the existing MkII function point method. The other method, JSD-COCOMO, is a sizing technique which sizes a project, in terms of lines of code, from the process structure diagrams and thus provides an input to the traditional COCOMO method. The JSD-FPA method allows JSD projects in both the real-time and scientific application areas to be costed, as well as the commercial information systems applications to which FPA is usually applied. The method is based upon a three-dimensional view of a system specification, as opposed to the largely data-oriented view traditionally used by FPA. The method uses counts of various attributes of a JSD specification to develop a metric which provides an indication of the size of the system to be developed. This size metric is then transformed into an estimate of effort by calculating past project productivity and utilising this figure to predict the effort, and hence cost, of a future project. The effort estimates produced were validated by comparing them against the effort figures for six actual projects. The JSD-COCOMO method uses counts of the levels in a process structure chart as the input to an empirically derived model which transforms them into an estimate of delivered source code instructions.
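The final transformation in each method can be sketched as follows: JSD-COCOMO feeds an estimated size in delivered source instructions into the standard basic COCOMO effort equation, while JSD-FPA converts its size metric into effort through historical productivity. The COCOMO coefficients are the published basic-model 'organic mode' values; all project figures are hypothetical:

```python
# Basic COCOMO, organic mode: effort (person-months) = 2.4 * KDSI^1.05,
# where KDSI is thousands of delivered source instructions.
def cocomo_effort(kdsi, a=2.4, b=1.05):
    return a * kdsi ** b

# JSD-COCOMO estimates KDSI from process-structure-diagram counts;
# assume a hypothetical 32 KDSI here.
print(f"JSD-COCOMO effort: {cocomo_effort(32):.0f} person-months")

# JSD-FPA: size metric -> effort via past project productivity.
past_size, past_effort = 410.0, 50.0     # hypothetical historical project
productivity = past_size / past_effort   # size units per person-month
new_size = 520.0                         # metric counted from the JSD spec
print(f"JSD-FPA effort: {new_size / productivity:.0f} person-months")
```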
Abstract:
This research examines the role of the information management process within a process-oriented enterprise, Xerox Ltd. The research approach is based on a post-positivist paradigm and has resulted in thirty-five idiographic statements. The three major outcomes are: 1. The process-oriented holistic enterprise is an organisation that requires a long-term management commitment to its development. It depends on the careful management of people, tasks, information and technology. A complex integration of business processes is required, and this can be managed through the use of consistent documentation techniques and clarity in the definition of process responsibilities; management attention to the global metrics and centralisation of the management of the process model are critical to its success. 2. The role of the information management process within the context of a process-oriented enterprise is to provide flexible and cost-effective application, technological, and process support to the business. This is best achieved through centralisation of the management of information management and of the process model. A business-led approach, combined with the consolidation of the applications, information, process, and data architectures, is central to providing effective business- and process-focused support. 3. In a process-oriented holistic enterprise, process and information management are inextricably linked. The model of process management depends heavily on information management, whilst the model of information management is totally focused around supporting and creating the process model. The two models are mutually creating: one cannot exist without the other. There is a duality concept of process and information management.
Abstract:
Xerox Customer Engagement activity is informed by the "Go To Market" strategy and the "Intelligent Coverage" sales philosophy. The realisation of this philosophy necessitates a sophisticated level of Market Understanding, and the effective integration of the direct channels of Customer Engagement. Sophisticated Market Understanding requires the mapping and coding of the entire UK market at the DMU (Decision Making Unit) level, which in turn enables the creation of tailored coverage prescriptions. Effective Channel Integration is made possible by the organisation of Customer Engagement work according to a single, process-defined structure: the Selling Process. Organising by process facilitates the discipline of Task Substitution, which leads logically to the creation of Hybrid Selling models. Productive Customer Engagement requires Selling Process specialisation by industry sector, customer segment and product group. The research shows that Xerox's Market Database (MDB) plays a central role in delivering the Go To Market strategic aims. It is a tool for knowledge-based selling, enables productive SFA (Sales Force Automation) and, in sum, is critical to the efficient and effective deployment of Customer Engagement resources. Intelligent Coverage is not possible without the MDB. Analysis of the case evidence has resulted in the definition of 60 idiographic statements. These statements are about how Xerox organises and manages three direct channels of Customer Engagement: Face to Face, Telebusiness and Ebusiness. Xerox is shown to employ a process-oriented, IT-enabled, holistic approach to Customer Engagement productivity. The significance of the research is that it represents a detailed (perhaps unequalled) level of rich description of the interplay between IT and a holistic, process-oriented management philosophy.
Abstract:
Since 1988, quasi-markets have been introduced into many areas of social policy in the UK; the NHS internal market is one example. Markets operate by price signals. The NHS internal market, if it is to operate efficiently, requires purchasers and providers to respond to price signals. The research hypothesis is that cost accounting methods can be developed to enable healthcare contracts to be priced on a cost basis in a manner which will facilitate the achievement of economic efficiency in the NHS internal market. Surveys of hospitals in 1991 and 1994 established the cost methods adopted in deriving the prices for healthcare contracts in the first year of the market and three years on. An in-depth view of the costing-for-pricing process was gained through case studies. Hospitals had inadequate cost information on which to price healthcare contracts at the inception of the internal market: prices did not reflect the relative performance of healthcare providers sufficiently closely to enable the market's espoused efficiency aims to be achieved. Price variations were often due to differing costing approaches rather than efficiency. Furthermore, price comparisons were often meaningless because of inadequate definition of the services (products). In April 1993, the NHS Executive issued guidance on costing for contracting to all NHS providers in an attempt to improve the validity of price comparisons between alternative providers. The case studies and the 1994 survey show that although price comparison has improved, considerable problems remain. Consistency is not assured, and the problem of adequate product definition is still to be solved. Moreover, the case studies clearly highlight the mismatch of rigid, full-cost pricing rules with both the financial management considerations at local level and the emerging internal market(s). Incentives exist to cost-shift, and healthcare prices can easily be manipulated. In the search for a new health policy paradigm to replace traditional bureaucratic provision, cost-based pricing cannot be used to ensure a more efficient allocation of healthcare resources.