52 results for process quality indicator
in Aston University Research Archive
Abstract:
Integrated supplier selection and order allocation is an important decision in both designing and operating supply chains, and one that is influenced by stakeholders in different tiers: suppliers, plant operators and customers. As firms continue to seek competitive advantage through supply chain design and operations, they aim to create optimized supply chains, which calls for the consideration of multiple conflicting criteria on the one hand and of demand and supply uncertainties on the other. Although there are studies on supplier selection that use advanced mathematical models to cover a stochastic approach, multiple criteria decision-making techniques or multiple stakeholder requirements separately, to the authors' knowledge no work integrates these three aspects in a common framework. This paper proposes an integrated method for dealing with such problems, combining an Analytic Hierarchy Process-Quality Function Deployment (AHP-QFD) approach with a chance-constrained optimization algorithm to select appropriate suppliers and allocate orders optimally among them. The effectiveness of the proposed decision support system has been demonstrated through application and validation in the bioenergy industry.
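The chance-constrained allocation step can be made concrete with a small sketch. The code below is an illustration under stated assumptions, not the paper's model: the supplier scores stand in for the AHP-QFD output, demand is assumed normally distributed, and the service-level chance constraint is reduced to its deterministic equivalent before solving a linear program with SciPy.

```python
# A hedged sketch, not the paper's model. Assumptions: hypothetical
# AHP-QFD supplier scores, demand ~ N(mu, sigma^2), and the chance
# constraint P(total supply >= demand) >= alpha rewritten as the
# deterministic equivalent total >= mu + sigma * z_alpha.
import numpy as np
from scipy.optimize import linprog
from scipy.stats import norm

scores = np.array([0.45, 0.35, 0.20])        # hypothetical AHP-QFD supplier scores
capacity = np.array([600.0, 500.0, 400.0])   # hypothetical per-supplier capacities
mu, sigma, alpha = 1000.0, 80.0, 0.95        # demand parameters and service level

min_total = mu + sigma * norm.ppf(alpha)     # deterministic equivalent of the chance constraint

# Penalize allocation to low-score suppliers: cost_i = 1 - score_i.
res = linprog(
    c=1.0 - scores,
    A_ub=[-np.ones(3)],                      # -sum(x) <= -min_total, i.e. sum(x) >= min_total
    b_ub=[-min_total],
    bounds=list(zip(np.zeros(3), capacity)),
)
print(dict(zip(["S1", "S2", "S3"], res.x.round(1))))
```

With these figures the two highest-scoring suppliers are filled to capacity and the residual requirement goes to the third, which is the qualitative behaviour one expects from score-weighted allocation under a service-level floor.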
Abstract:
In this work, we introduce the periodic nonlinear Fourier transform (PNFT) method as an alternative and efficacious tool for the compensation of nonlinear transmission effects in optical fiber links. In Part I, we introduce the algorithmic platform of the technique, describing in detail the direct and inverse PNFT operations, also known as the inverse scattering transform for the nonlinear Schrödinger equation (NLSE) that is periodic in the time variable. We pay special attention to explaining the potential advantages of PNFT-based processing over the previously studied nonlinear Fourier transform (NFT) based methods. We then turn to the numerical computation of the PNFT: we compare the performance of four known numerical methods applicable to the calculation of the nonlinear spectral data (the direct PNFT), taking as the quality indicator for each method the main spectrum (utilized in Part II for modulation and transmission) associated with some simple example waveforms. We show that the Ablowitz-Ladik discretization approach to the direct PNFT provides the best performance in terms of accuracy and computation time.
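To make the direct-PNFT computation concrete, here is a minimal numerical sketch under stated assumptions, not the paper's implementation: it builds the monodromy matrix of the Zakharov-Shabat problem with an Ablowitz-Ladik-type transfer step (one common normalization; conventions differ between papers) and evaluates the Floquet discriminant Δ(λ) = trace(M)/2, whose ±1 level sets locate the main spectrum. A constant plane-wave potential is used because its discriminant is known in closed form, which gives a built-in accuracy check.

```python
# A hedged sketch of the direct periodic problem. Assumptions: the
# Ablowitz-Ladik step below uses one common normalization, and the
# main spectrum is located where the Floquet discriminant hits +/-1.
import numpy as np

def floquet_discriminant(q, T, lam):
    """Half-trace of the monodromy matrix for samples q over one period T."""
    n = len(q)
    h = T / n
    z = np.exp(-1j * lam * h)
    M = np.eye(2, dtype=complex)
    for qk in q:                              # time-ordered product of AL steps
        step = np.array([[z, h * qk],
                         [-h * np.conj(qk), 1.0 / z]], dtype=complex)
        step /= np.sqrt(1.0 + np.abs(h * qk) ** 2)   # AL unimodular normalization
        M = step @ M
    return 0.5 * np.trace(M)

# Plane wave q(t) = A, for which Delta(lam) = cos(T * sqrt(lam^2 + A^2)).
T, A = 2 * np.pi, 0.5
q = A * np.ones(256)
for lam in (0.0, 0.5, 1.0):
    num = floquet_discriminant(q, T, lam).real
    exact = np.cos(T * np.sqrt(lam ** 2 + A ** 2))
    print(f"lam={lam:.1f}  Delta_AL={num:+.4f}  Delta_exact={exact:+.4f}")
```

Roots of Δ(λ) ∓ 1 on a grid of λ values would then be refined (e.g. by bisection) to obtain the main spectrum itself.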
Abstract:
Improving healthcare quality is a growing need in any society. Although healthcare professionals routinely deploy various quality improvement projects, these projects are characterised by a fragmented approach, i.e. they are not linked with the strategic intent of the organisation. This study introduces a framework which integrates all quality improvement projects with the strategic intent of the organisation. It first derives the strengths, weaknesses, opportunities and threats (SWOT) matrix of the system with the involvement of the concerned stakeholders (clinical professionals), which helps identify a few projects whose implementation ensures the achievement of the desired quality. The projects are then prioritised using the analytic hierarchy process, again with the involvement of the clinical professionals, and implemented in order to improve system performance. The effectiveness of the method has been demonstrated using a case study in the intensive care unit of Queen Elizabeth Hospital in Bridgetown, Barbados.
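The prioritisation step can be sketched briefly. The judgments below are invented, not the study's data: priority weights for three candidate projects are derived from the principal eigenvector of a pairwise comparison matrix, and Saaty's consistency ratio is checked to confirm the judgments are usable.

```python
# A sketch with invented judgments, not the study's data: AHP priority
# weights from the principal eigenvector, plus Saaty's consistency ratio.
import numpy as np

# Hypothetical 1-9 scale pairwise judgments for three improvement projects.
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
k = np.argmax(eigvals.real)              # index of the principal eigenvalue
w = np.abs(eigvecs[:, k].real)
w /= w.sum()                             # normalized priority weights

n = A.shape[0]
ci = (eigvals[k].real - n) / (n - 1)     # consistency index
ri = {3: 0.58, 4: 0.90, 5: 1.12}[n]      # Saaty's random index for n = 3..5
print("weights:", w.round(3), "CR:", round(ci / ri, 3))  # CR < 0.1 is acceptable
```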
Abstract:
In this paper, a co-operative distributed process mining system (CDPMS) is developed to streamline the workflow along the supply chain in order to offer shorter delivery times, greater flexibility and higher customer satisfaction, together with the ability to learn. The proposed system is equipped with a 'distributed process mining' feature, which is used to discover the hidden relationships among working decisions in a distributed manner. The method incorporates the concepts of data mining and knowledge refinement into the decision-making process to ensure 'doing the right things' within the workflow. An example implementation is given, based on the case of a slider manufacturer.
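The discovery step at the core of any process mining system can be sketched generically; the toy example below is not the CDPMS itself. It extracts a directly-follows graph, with frequencies, from an event log, which is the usual starting point for uncovering hidden relationships among workflow decisions.

```python
# A generic process-discovery sketch with a made-up event log, not the
# CDPMS: count every "activity a is directly followed by activity b".
from collections import Counter, defaultdict

# Hypothetical event log: (case id, activity), already ordered by timestamp.
log = [(1, "order"), (1, "machine"), (1, "inspect"), (1, "ship"),
       (2, "order"), (2, "machine"), (2, "rework"), (2, "inspect"), (2, "ship")]

traces = defaultdict(list)
for case, activity in log:
    traces[case].append(activity)       # group events into per-case traces

dfg = Counter()
for seq in traces.values():
    for a, b in zip(seq, seq[1:]):
        dfg[(a, b)] += 1                # directly-follows relation

for (a, b), n in sorted(dfg.items()):
    print(f"{a} -> {b}: {n}")
```

In a distributed setting, each site could mine such a graph from its local log and the partial models would then be merged, which is one plausible reading of the 'distributed' aspect described in the abstract.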
Abstract:
Due to copyright restrictions, this item is only available for consultation at Aston University Library and Information Services, with prior arrangement.
Abstract:
Although the importance of evaluating and intercomparing the fitness-for-use of datasets is widely recognised within the GIS community, no practical tools have yet been developed to support such interrogation. GeoViQua aims to develop a GEO label which will visually summarise, and allow interrogation of, the key informational aspects of geospatial datasets on which users rely when selecting datasets for use. The proposed GEO label will be integrated in the Global Earth Observation System of Systems (GEOSS) and will be used as a value and trust indicator for datasets accessible through the GEO Portal. As envisioned, the GEO label will act as a decision support mechanism for dataset selection and thereby, it is hoped, improve user recognition of dataset quality.

To date we have conducted three user studies to (1) identify the informational aspects of geospatial datasets on which users rely when assessing dataset quality and trustworthiness, (2) elicit initial user views on a GEO label and its potential role, and (3) evaluate prototype label visualisations. Our first study revealed that, when evaluating the quality of data, users consider eight facets: dataset producer information; producer comments on dataset quality; dataset compliance with international standards; community advice; dataset ratings; links to dataset citations; expert value judgements; and quantitative quality information. Our second study confirmed the relevance of these facets in terms of the community-perceived function that a GEO label should fulfil: users and producers of geospatial data supported the concept of a GEO label that provides a drill-down interrogation facility covering all eight informational aspects. Consequently, we developed three prototype label visualisations and evaluated their comparative effectiveness and user preference via a third user study to arrive at a final graphical GEO label representation.

When integrated in the GEOSS, an individual GEO label will be provided for each dataset in the GEOSS clearinghouse (or other data portals and clearinghouses) based on its available quality information. Producer and feedback metadata documents are used to dynamically assess information availability and generate the GEO labels. The producer metadata document, which can be either a standard ISO-compliant metadata record supplied with the dataset or an extended GeoViQua-derived metadata record, is used to assess the availability of a producer profile, producer comments, compliance with standards, citations and quantitative quality information. GeoViQua is also currently developing a feedback server to collect and encode (as metadata records) user and producer feedback on datasets; these records will be used to assess the availability of user comments, ratings, expert reviews and user-supplied citations for a dataset. The GEO label will provide drill-down functionality allowing a user to navigate to a GEO label page offering detailed quality information for the associated dataset. At this stage, we are developing the GEO label service that will provide GEO labels on demand based on supplied metadata records. In this presentation, we will provide a comprehensive overview of the GEO label development process, with specific emphasis on the GEO label implementation and integration into the GEOSS.
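A hedged sketch of the availability-assessment logic such a label service might apply follows; the facet keys and metadata fields are hypothetical stand-ins, not the GeoViQua schema. Each of the eight facets is marked available or unavailable from the merged producer and feedback metadata, and that result would drive the rendering of the label.

```python
# A hypothetical sketch, not the GeoViQua schema: mark each of the
# eight GEO label facets as available or not from merged metadata.
FACETS = [
    "producer_profile", "producer_comments", "standards_compliance",
    "community_advice", "ratings", "citations",
    "expert_reviews", "quantitative_quality",
]

def assess_label(producer_meta: dict, feedback_meta: dict) -> dict:
    """Return facet -> availability for one dataset's GEO label."""
    merged = {**producer_meta, **feedback_meta}
    return {facet: bool(merged.get(facet)) for facet in FACETS}

producer = {"producer_profile": "ExampleAgency", "standards_compliance": "ISO 19115"}
feedback = {"ratings": [4, 5], "expert_reviews": []}   # empty list counts as unavailable
print(assess_label(producer, feedback))
```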
Abstract:
The overall objective of this work was to compare the effects of pre-treatment and of catalysts on the quality of the liquid products from the fast pyrolysis of biomass. The study investigated the upgrading of bio-oil in terms of its quality as a bio-fuel and/or a source of chemicals. Bio-oil used directly as a biofuel for heat or power needs to be improved, particularly in terms of temperature sensitivity, oxygen content, chemical instability, solids content and heating value. Chemicals produced from bio-oil need to meet product specifications for market acceptability. There were two main objectives in this research. The first was to examine the influence of pre-treatment of the biomass on the fast pyrolysis process and on liquid quality; the relationship between the pre-treatment method applied to the biomass feedstock and the quality of the fast pyrolysis oil was studied. The thermal decomposition behaviour of untreated and pre-treated feedstocks was examined using thermogravimetric analysis (TGA) and pyroprobe-gas chromatography/mass spectrometry (Py-GC/MS), and laboratory-scale reactors (100 g/h, 300 g/h and 1 kg/h) were used to process untreated and pre-treated feedstocks by fast pyrolysis. The second objective was to study the influence of a range of catalysts on the fast pyrolysis liquids from wheat straw. The first step applied analytical pyrolysis (Py-GC/MS) to determine which catalysts had an effect on the fast pyrolysis liquid, in order to select catalysts for further laboratory fast pyrolysis; the effects of activation, temperature and biomass pre-treatment on the catalysts were also investigated. Laboratory experiments were then conducted using the existing 300 g/h fluidised bed reactor system with a secondary catalytic fixed bed reactor. The catalyst screening showed that CoMo was highly active and, in particular, reduced the higher molecular weight products of fast pyrolysis; on this basis, CoMo was selected for the larger-scale laboratory experiments. As regards the pre-treatment work, pre-treatment had a significant effect on the thermal decomposition of the biomass, on the composition of the pyrolysis products and on the proportions of key components in the bio-oil. Torrefaction proved to have a mild influence on the pyrolysis products when compared with aquathermolysis and steam pre-treatment.
Abstract:
Purpose: The purpose of this paper is to describe how the application of systems thinking to designing, managing and improving business processes has resulted in a new and unique holonic-based process modeling methodology known as process orientated holonic modeling. Design/methodology/approach: The paper describes the key systems thinking axioms that the methodology builds upon and gives an overview of the methodology; the techniques are described using an example taken from a large organization designing and manufacturing capital goods equipment operating within a complex and dynamic environment. The models were produced in an 18-month project, using an action research approach, to improve quality and process efficiency. Findings: The findings of this research show that the new methodology can support process depiction and improvement in industrial sectors which are characterized by environments of high variety and low volume (e.g. projects such as the design and manufacture of a radar system or a hybrid production process) and which do not provide repetitive learning opportunities. In such circumstances, the methodology has been able not only to deliver holonic-based process diagrams but also to transfer strategic vision from top management to middle and operational levels without being reductionistic. Originality/value: This paper will be of interest to organizational analysts looking at large complex projects who require a methodology that does not confine them to thinking reductionistically in "task-breakdown" based approaches. The novel ideas in this paper have great impact on the way analysts should perceive organizational processes. Future research is applying the methodology in similar environments in other industries. © Emerald Group Publishing Limited.
Abstract:
The application of systems thinking to designing, managing, and improving business processes has produced a new "holonic-based" process modeling methodology. The theoretical background and the methodology are described using examples taken from a large organization that designs and manufactures capital goods equipment and operates within a complex and dynamic environment. A key point of differentiation of this methodology is that it allows a set of models to be produced without taking a task-breakdown approach; instead it uses systems thinking and a construct known as the "holon" to build process descriptions as a system of systems (i.e., a holarchy). The process-oriented holonic modeling methodology has been used for total quality management and business process engineering exercises in different industrial sectors and builds models that connect a company's strategic vision to its operational processes. Exercises have been conducted in response to environmental pressures to align operations with strategic thinking and to become increasingly agile and efficient. This unique methodology is best applied in environments of high complexity, low volume, and high variety, where repeated learning opportunities are few and far between (e.g., large development projects). © 2007 IEEE.
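As an illustration of the holon construct (one interpretation of the idea, not the methodology's own notation), the sketch below treats each holon as simultaneously a whole, with a purpose of its own, and a part of a larger holon, so that a process description composes into a holarchy rather than a task breakdown.

```python
# An interpretive sketch of a holarchy, not the methodology's notation:
# every holon is both a whole (it has its own transformation) and a
# part (it sits inside a larger holon).
from dataclasses import dataclass, field

@dataclass
class Holon:
    name: str
    purpose: str                          # the transformation this whole performs
    parts: list["Holon"] = field(default_factory=list)

    def depict(self, depth: int = 0) -> None:
        print("  " * depth + f"{self.name}: {self.purpose}")
        for part in self.parts:           # each part is itself a whole
            part.depict(depth + 1)

enterprise = Holon("Deliver system", "turn a contract into a fielded system", [
    Holon("Design", "turn requirements into a proven design"),
    Holon("Manufacture", "turn a design into hardware", [
        Holon("Fabricate", "turn materials into components"),
    ]),
])
enterprise.depict()
```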
Abstract:
Time, cost and quality achievements on large-scale construction projects are uncertain because of technological constraints, the involvement of many stakeholders, long durations, large capital requirements and improper scope definitions. Projects exposed to such an uncertain environment can be managed effectively by applying risk management throughout the project life cycle. Risk is by nature subjective; however, managing risk subjectively poses the danger of not achieving project goals, and analysing risk only at the level of the overall project poses the danger of developing inappropriate responses. This article demonstrates a quantitative approach to construction risk management through the analytic hierarchy process (AHP) and decision tree analysis (DTA). The entire project is divided into a few work packages and, with the involvement of project stakeholders, the risky work packages are identified. Once the risk factors are identified, their effects are quantified by determining probability (using AHP) and severity (guesstimate). Alternative responses are then generated, listing the cost implications of mitigating the quantified risks. The expected monetary value is derived for each alternative in a decision tree framework, and subsequent probability analysis helps make the right decision in managing risks. The methodology is explained using a case application of a cross-country petroleum pipeline project in India, which demonstrates the project management effectiveness of AHP and DTA.
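The decision-tree comparison can be illustrated with made-up figures, not the pipeline project's data: each response alternative's expected monetary value (EMV) is its mitigation cost plus the probability-weighted residual losses, and the alternative with the lowest EMV is preferred.

```python
# A sketch with invented figures, not the case study's data: compare
# risk responses for one work package by expected monetary value.
alternatives = {
    # response: (mitigation cost, [(probability, residual loss), ...])
    "accept":   (0.0,   [(0.30, 500.0), (0.70, 0.0)]),
    "mitigate": (60.0,  [(0.10, 500.0), (0.90, 0.0)]),
    "transfer": (120.0, [(0.02, 500.0), (0.98, 0.0)]),
}

def emv(cost, outcomes):
    """Mitigation cost plus probability-weighted residual losses."""
    return cost + sum(p * loss for p, loss in outcomes)

for name, (cost, outcomes) in alternatives.items():
    print(f"{name:9s} EMV = {emv(cost, outcomes):6.1f}")
# Here "mitigate" has the lowest EMV (110 vs 150 and 130) and would be chosen.
```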
Abstract:
There is an increasing need for a model for the process-based performance measurement of multispecialty tertiary care hospitals for quality improvement. The analytic hierarchy process (AHP) is utilized in this study to evolve such a model. Each step in the model was derived through group discussions and brainstorming sessions among experienced clinicians and managers. The tool was applied to two tertiary care teaching hospitals, in Barbados and India. The model enabled the identification of specific areas where neither hospital performed very well and helped to formulate recommendations for improving those areas. AHP is recommended as a valuable tool for measuring the process-based performance of multispecialty tertiary care hospitals. © Emerald Group Publishing Limited.
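One step worth making concrete is how the group judgments are pooled. The sketch below uses invented judgments, not the study's data: individual pairwise comparison matrices are aggregated element-wise by geometric mean, the standard way of combining AHP group judgments, and priorities are then derived by the row geometric mean.

```python
# A sketch with invented judgments, not the study's data: pool two
# clinicians' pairwise comparisons by element-wise geometric mean,
# then derive priorities by the row geometric mean.
import numpy as np

clinician_1 = np.array([[1.0, 2.0], [1/2, 1.0]])
clinician_2 = np.array([[1.0, 4.0], [1/4, 1.0]])

group = np.sqrt(clinician_1 * clinician_2)            # element-wise geometric mean
w = np.prod(group, axis=1) ** (1 / group.shape[0])    # row geometric mean
w /= w.sum()                                          # normalized priorities
print("pooled matrix:\n", group.round(3), "\npriorities:", w.round(3))
```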
Abstract:
Purpose: To develop a model for the global performance measurement of intensive care units (ICUs) and to apply that model to compare services for quality improvement. Materials and Methods: The analytic hierarchy process, a multiple-attribute decision-making technique, is used in this study to evolve such a model. The steps consisted of identifying the critical success factors for the best performance of an ICU, identifying the subfactors that influence the critical factors, comparing them pairwise, deriving their relative importance and ratings, and calculating the cumulative performance according to the attributes of a given ICU. Every step in the model was derived through group discussions, brainstorming, and consensus among intensivists. Results: The model was applied to 3 ICUs, 1 each in Barbados, Trinidad, and India, all in tertiary care teaching hospitals of similar setting. The cumulative performance rating of the Barbados ICU was 1.17, compared with 0.82 and 0.75 for the Trinidad and Indian ICUs, respectively, showing that the Trinidad and Indian ICUs performed at 70% and 64% of the level of the Barbados ICU. The model also enabled the identification of specific areas where the ICUs did not perform well, which helped to improve those areas. Conclusions: The analytic hierarchy process is a very useful model for measuring the global performance of an ICU. © 2005 Elsevier Inc. All rights reserved.
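The final aggregation can be sketched with made-up weights and ratings rather than the study's data: a unit's cumulative performance is the sum over attributes of its AHP-derived weight times the unit's rating on that attribute, and units are then compared as ratios of these scores, which is how figures such as "70% with respect to the best unit" arise.

```python
# A sketch with invented numbers, not the study's data: cumulative
# performance as weights . ratings, compared as a ratio to the best unit.
import numpy as np

weights = np.array([0.5, 0.3, 0.2])           # hypothetical AHP attribute weights
ratings = {                                   # hypothetical per-ICU attribute ratings
    "ICU_A": np.array([1.3, 1.1, 0.9]),
    "ICU_B": np.array([0.9, 0.8, 0.7]),
}

scores = {name: float(weights @ r) for name, r in ratings.items()}
best = max(scores.values())
for name, s in scores.items():
    print(f"{name}: score={s:.2f}  relative={s / best:.0%}")
```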