10 results for Phasor measurement units
in Aston University Research Archive
Abstract:
Background/aims Macular pigment is thought to protect the macula against exposure to light and oxidative stress, both of which may play a role in the development of age-related macular degeneration. The aim was to clinically evaluate a novel cathode-ray-tube-based method for measurement of macular pigment optical density (MPOD) known as apparent motion photometry (AMP). Methods The authors took repeat readings of MPOD centrally (0°) and at 3° eccentricity for 76 healthy subjects (mean age (±SD) 26.5±13.2 years, range 18–74 years). Results The overall mean MPOD for the cohort was 0.50±0.24 at 0° and 0.28±0.20 at 3° eccentricity; these values were significantly different (t=-8.905, p<0.001). The coefficients of repeatability were 0.60 and 0.48 for the 0° and 3° measurements respectively. Conclusions The data suggest that when the same operator takes repeated 0° AMP MPOD readings over time, only changes of more than 0.60 units can be classed as clinically significant. In other words, AMP is not suitable for monitoring changes in MPOD over time, as increases of this magnitude would not be expected even in response to dietary modification or nutritional supplementation.
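A coefficient of repeatability like the 0.60 quoted above can be reproduced from paired repeat readings. A minimal sketch, assuming the Bland–Altman formulation (1.96 × the standard deviation of the within-subject differences; the abstract does not state which formulation the authors used), with invented readings rather than the study data:

```python
import math

def coefficient_of_repeatability(first, second):
    """Bland-Altman repeatability coefficient: 1.96 x the standard
    deviation of the differences between paired repeat readings."""
    diffs = [a - b for a, b in zip(first, second)]
    n = len(diffs)
    mean_d = sum(diffs) / n
    sd = math.sqrt(sum((d - mean_d) ** 2 for d in diffs) / (n - 1))
    return 1.96 * sd

# Hypothetical repeat MPOD readings for five subjects (not study data)
run1 = [0.45, 0.62, 0.30, 0.55, 0.48]
run2 = [0.50, 0.40, 0.35, 0.60, 0.52]
cr = coefficient_of_repeatability(run1, run2)
```

Only differences between two readings larger than `cr` would then be treated as real change rather than measurement noise, which is the logic behind the paper's conclusion.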
Abstract:
Purpose: To develop a model for the global performance measurement of intensive care units (ICUs) and to apply that model to compare the services for quality improvement. Materials and Methods: The analytic hierarchy process, a multiple-attribute decision-making technique, is used in this study to evolve such a model. The steps consisted of identifying the critical success factors for the best performance of an ICU, identifying the subfactors that influence the critical factors, comparing them pairwise, deriving their relative importance and ratings, and calculating the cumulative performance according to the attributes of a given ICU. Every step in the model was derived by group discussions, brainstorming, and consensus among intensivists. Results: The model was applied to 3 ICUs, 1 each in Barbados, Trinidad, and India, in tertiary care teaching hospitals of similar settings. The cumulative performance rating of the Barbados ICU was 1.17, compared with 0.82 and 0.75 for the Trinidad and Indian ICUs respectively, showing that the Trinidad and Indian ICUs performed at 70% and 64% of the level of the Barbados ICU. The model also enabled the identification of specific areas where the ICUs did not perform well, which helped to improve those areas. Conclusions: The analytic hierarchy process is a very useful model for measuring the global performance of an ICU. © 2005 Elsevier Inc. All rights reserved.
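The pairwise-comparison and weight-derivation steps described above are standard analytic hierarchy process machinery. A minimal sketch using the row geometric-mean approximation to the priority vector; the matrix values are illustrative, not the intensivists' actual judgements:

```python
import math

def ahp_weights(M):
    """Approximate AHP priority vector for a reciprocal pairwise-
    comparison matrix, using the row geometric-mean method."""
    n = len(M)
    gm = [math.prod(row) ** (1.0 / n) for row in M]
    total = sum(gm)
    return [g / total for g in gm]

# Hypothetical 3x3 comparison matrix for three ICU success factors:
# entry M[i][j] says how much more important factor i is than factor j
M = [[1,     3,   5],
     [1 / 3, 1,   2],
     [1 / 5, 0.5, 1]]
w = ahp_weights(M)  # weights sum to 1; the first factor dominates
```

The cumulative performance rating of an ICU is then the weighted sum of its scores on each factor, which is how ratings such as 1.17 versus 0.82 arise.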
Abstract:
This paper explores the use of the optimisation procedures in SAS/OR software with application to the measurement of efficiency and productivity of decision-making units (DMUs) using data envelopment analysis (DEA) techniques. DEA, originally introduced by Charnes et al. [J. Oper. Res. 2 (1978) 429], is a linear programming method for assessing the efficiency and productivity of DMUs. Over the last two decades, DEA has gained considerable attention as a managerial tool for measuring the performance of organisations, and it has been widely used for assessing the efficiency of public and private sectors such as banks, airlines, hospitals, universities and manufacturers. As a result, new applications with more variables and more complicated models are being introduced. Following the successive development of DEA, a non-parametric productivity measure, the Malmquist index, was introduced by Fare et al. [J. Prod. Anal. 3 (1992) 85]. Employing the Malmquist index, productivity growth can be decomposed into technical change and efficiency change. SAS, for its part, is powerful software capable of solving various optimisation problems, such as linear programming with all types of constraints. To facilitate the use of DEA and the Malmquist index by SAS users, a SAS/MALM code was implemented in the SAS programming language. The SAS macro developed in this paper selects the chosen variables from a SAS data file and constructs sets of linear-programming models based on the selected DEA model. An example is given to illustrate how one could use the code to measure the efficiency and productivity of organisations.
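The linear programmes such a macro constructs follow the standard CCR multiplier form: maximise the weighted outputs of the unit under evaluation, with its weighted inputs normalised to one and no unit allowed an efficiency above one. A minimal sketch in Python (not SAS) with toy data; the figures are illustrative only:

```python
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y, o):
    """Input-oriented CCR efficiency of DMU o (multiplier form).
    X: (n, m) input matrix, Y: (n, s) output matrix, one row per DMU."""
    n, m = X.shape
    s = Y.shape[1]
    # decision variables: output weights u (s of them), input weights v (m)
    c = np.concatenate([-Y[o], np.zeros(m)])       # maximise u . y_o
    A_ub = np.hstack([Y, -X])                      # u . y_j - v . x_j <= 0
    b_ub = np.zeros(n)
    A_eq = np.concatenate([np.zeros(s), X[o]]).reshape(1, -1)  # v . x_o = 1
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=[1.0],
                  bounds=[(0, None)] * (s + m))
    return -res.fun

# Toy data: 3 DMUs, 1 input, 1 output (illustrative only)
X = np.array([[2.0], [4.0], [3.0]])
Y = np.array([[2.0], [2.0], [3.0]])
scores = [ccr_efficiency(X, Y, o) for o in range(3)]
```

With a single input and output the CCR score reduces to each unit's output/input ratio relative to the best ratio, so the first and third units score 1 and the second 0.5.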
Abstract:
This paper discusses the use of comparative performance measurement by means of Data Envelopment Analysis in the context of the regulation of English and Welsh water companies. Specifically, the use of Data Envelopment Analysis to estimate potential cost savings in sewerage is discussed as it fed into the price review of water companies carried out by the regulator in 1994. The application is used as a vehicle for highlighting generic issues in assessing the impact of factors on the ranking of units on performance, the insights gained from using alternative methods to assess comparative performance, and the problem of assessing comparative performance when the entities involved are few in number but highly complex. The paper should prove of interest to those concerned with regulation and, more generally, with the use of methods of comparative performance measurement.
Abstract:
Purpose - The purpose of this study is to develop a performance measurement model for service operations using the analytic hierarchy process approach. Design/methodology/approach - The study reviews current relevant literature on performance measurement and develops a model for performance measurement. The model is then applied to the intensive care units (ICUs) of three different hospitals in developing nations. Six focus group discussions were undertaken, involving experts from the specific area under investigation, in order to develop an understandable performance measurement model that was both quantitative and hierarchical. Findings - A combination of outcome, structure and process-based factors was used as a foundation for the model. Analyses of the links between them were used to reveal the relative importance of each factor and its associated subfactors. The stakeholders considered the model an effective quantitative tool. Research limitations/implications - This research only applies the model to ICUs in healthcare services. Practical implications - Performance measurement is an important area within the operations management field. Although numerous models are routinely deployed both in practice and in research, there is always room for improvement. The present study proposes a hierarchical quantitative approach, which considers both subjective and objective performance criteria. Originality/value - This paper develops a hierarchical quantitative model for service performance measurement. It considers success factors with respect to outcomes, structure and processes, with the involvement of the concerned stakeholders, based upon the analytic hierarchy process approach. The model is applied to the ICUs of hospitals in order to demonstrate its effectiveness, providing a comparative international study of service performance measurement in ICUs of hospitals in three different countries. © Emerald Group Publishing Limited.
Abstract:
The state of the art in productivity measurement and analysis shows a gap between simple methods having little relevance in practice and sophisticated mathematical theory which is unwieldy for strategic and tactical planning purposes, particularly at company level. An extension is made in this thesis to the method of productivity measurement and analysis based on the concept of added value, appropriate to those companies in which the materials, bought-in parts and services change substantially and a number of plants and inter-related units are involved in providing components for final assembly. Reviews and comparisons of productivity measurement dealing with alternative indices and their problems have been made, and appropriate solutions put forward for productivity analysis in general and the added value method in particular. Based on this concept and method, three kinds of computerised models have been developed to cope with the planning of productivity and productivity growth with reference to the changes in their component variables, ranging from a single value to a class interval of values of a productivity distribution: two of them deterministic, called sensitivity analysis and deterministic appraisal, and the third stochastic, called risk simulation. The models are designed to be flexible and can be adjusted according to the available computer capacity, expected accuracy and presentation of the output. The stochastic model is based on the assumption of statistical independence between individual variables and the existence of normality in their probability distributions. The component variables have been forecast using polynomials of degree four. This model is tested by comparisons of its behaviour with that of a mathematical model using real historical data from British Leyland, and the results were satisfactory within acceptable levels of accuracy. Modifications to the model and its statistical treatment have been made as required.
The results of applying these measurements and planning models to the British motor vehicle manufacturing companies are presented and discussed.
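The risk-simulation model described above rests on independent, normally distributed component variables. A minimal Monte Carlo sketch of an added-value productivity ratio under those assumptions; all figures are invented for illustration, not British Leyland data:

```python
import random

random.seed(42)  # reproducible illustration

def simulate_productivity(n_trials=10_000):
    """Monte Carlo sketch of an added-value productivity index:
    productivity = added value / employment costs, with both
    components drawn as independent normals (illustrative figures)."""
    samples = []
    for _ in range(n_trials):
        added_value = random.gauss(120.0, 15.0)  # e.g. GBP million
        labour_cost = random.gauss(80.0, 8.0)
        samples.append(added_value / labour_cost)
    samples.sort()
    mean = sum(samples) / n_trials
    p05 = samples[int(0.05 * n_trials)]
    p95 = samples[int(0.95 * n_trials)]
    return mean, (p05, p95)

mean, (lo, hi) = simulate_productivity()
```

Rather than a single point forecast, the planner obtains a distribution of the productivity index, from which class intervals like the (p05, p95) band can be read off.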
Abstract:
Off-highway motive plant equipment is costly in capital outlay and maintenance. To reduce these overheads and increase site safety and work rate, a technique for assessing and limiting the velocity of such equipment is required. Owing to the extreme environmental conditions met on such sites, conventional velocity measurement techniques are inappropriate. Ogden Electronics Limited were formed specifically to manufacture a motive plant safety system incorporating a speed sensor and sanction unit; to date, the only such commercial unit available. However, problems plague the reliability, accuracy and mass production of this unit. This project assesses the company's existing product and, in conjunction with an appreciation of the company's history and structure, concludes that this unit is unsuited to its intended application. Means of improving the measurement accuracy and longevity of this unit, commensurate with the company's limited resources and experience, are proposed, both for immediate retrofit and for longer-term use. This information is presented in the form of a number of internal reports for the company. The off-highway environment is examined; and, in conjunction with an evaluation of means of obtaining a returned signal, comparisons of processing techniques, and on-site gathering of previously unavailable data, preliminary designs for an alternative product are drafted. Theoretical aspects are covered by a literature review of ground-pointing radar, vehicular radar, and velocity measuring systems. This review establishes and collates the body of knowledge in areas previously considered unrelated. Based upon this work, a new design is proposed which is suitable for incorporation into the existing company product range. Following production engineering of the design, five units were constructed, tested and evaluated on-site.
After extended field trials, this design has shown itself to possess greater accuracy, reliability and versatility than the existing sensor, at a lower unit cost.
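Ground-pointing radar, one of the techniques reviewed above, recovers vehicle speed from the Doppler shift of the signal returned by the ground. A minimal sketch of that relationship; the 24 GHz carrier and 45° beam depression angle are assumed values for illustration, not the company's design parameters:

```python
import math

C = 3.0e8  # speed of light, m/s

def ground_speed(doppler_shift_hz, carrier_hz, beam_angle_deg):
    """Vehicle speed from the Doppler shift of a ground-pointing radar:
    f_d = 2 * v * f0 * cos(theta) / c, solved for v."""
    theta = math.radians(beam_angle_deg)
    return doppler_shift_hz * C / (2.0 * carrier_hz * math.cos(theta))

# Illustrative figures: 24 GHz sensor, beam depressed 45 degrees
v = ground_speed(doppler_shift_hz=1131.4,
                 carrier_hz=24.0e9,
                 beam_angle_deg=45.0)
```

Because the shift scales with cos(theta), beam-angle variation on rough terrain translates directly into speed error, which is one reason conventional sensors struggle off-highway.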
Abstract:
Measurement of glycated haemoglobin A1c (HbA1c) provides an indication of longer-term glycaemic control. Standardisation of this test between laboratories is difficult to achieve, and most assays are currently calibrated to the values used in the Diabetes Control and Complications Trial (DCCT-aligned). With the availability of more specific reference standards it is now proposed that HbA1c is expressed as mmol HbA1c per mol of non-glycated haemoglobin. An HbA1c of 7% is approximately equal to 53 mmol/mol.
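The 7% ≈ 53 mmol/mol equivalence follows from the published NGSP/IFCC master equation relating DCCT-aligned percentages to the newer SI-style units. A minimal sketch:

```python
def dcct_to_ifcc(percent):
    """Convert a DCCT-aligned HbA1c (%) to IFCC units (mmol/mol)
    via the NGSP/IFCC master equation: IFCC = 10.929 * (NGSP - 2.15)."""
    return 10.929 * (percent - 2.15)

def ifcc_to_dcct(mmol_per_mol):
    """Inverse conversion: IFCC (mmol/mol) back to DCCT-aligned (%)."""
    return mmol_per_mol / 10.929 + 2.15

ifcc = dcct_to_ifcc(7.0)  # ~53 mmol/mol, matching the abstract
```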
Abstract:
Performance evaluation in conventional data envelopment analysis (DEA) requires crisp numerical values. However, the observed values of the input and output data in real-world problems are often imprecise or vague. These imprecise and vague data can be represented by linguistic terms characterised by fuzzy numbers in DEA to reflect the decision-makers' intuition and subjective judgements. This paper extends the conventional DEA models to a fuzzy framework by proposing a new fuzzy additive DEA model for evaluating the efficiency of a set of decision-making units (DMUs) with fuzzy inputs and outputs. The contribution of this paper is threefold: (1) we consider ambiguous, uncertain and imprecise input and output data in DEA, (2) we propose a new fuzzy additive DEA model derived from the α-level approach and (3) we demonstrate the practical aspects of our model with two numerical examples and show its comparability with five different fuzzy DEA methods in the literature. Copyright © 2011 Inderscience Enterprises Ltd.
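The α-level approach reduces each fuzzy number to a family of crisp intervals, one per membership level α, and solves the DEA models over those intervals. A minimal sketch of that interval arithmetic for triangular fuzzy numbers, with a single-input, single-output efficiency ratio for illustration; this shows the α-cut logic only, not the paper's additive model:

```python
def alpha_cut(tfn, alpha):
    """Alpha-cut of a triangular fuzzy number (l, m, u): the crisp
    interval of values whose membership is at least alpha."""
    l, m, u = tfn
    return (l + alpha * (m - l), u - alpha * (u - m))

# Illustrative fuzzy output and input for one DMU; with a single input
# and output, the efficiency bounds follow by interval arithmetic
fuzzy_output = (8.0, 10.0, 12.0)
fuzzy_input = (4.0, 5.0, 6.0)

alpha = 0.5
y_lo, y_hi = alpha_cut(fuzzy_output, alpha)
x_lo, x_hi = alpha_cut(fuzzy_input, alpha)
ratio_bounds = (y_lo / x_hi, y_hi / x_lo)  # pessimistic / optimistic
```

Sweeping alpha from 0 to 1 shrinks the intervals toward the modal values, so the efficiency of each DMU is itself reported as a fuzzy quantity rather than a single crisp score.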
Abstract:
Guest editorial Ali Emrouznejad is a Senior Lecturer at the Aston Business School in Birmingham, UK. His areas of research interest include performance measurement and management, efficiency and productivity analysis, and data mining. He has published widely in various international journals. He is an Associate Editor of the IMA Journal of Management Mathematics and has been Guest Editor of several special issues of journals including the Journal of the Operational Research Society, Annals of Operations Research, Journal of Medical Systems, and International Journal of Energy Management Sector. He is on the editorial board of several international journals and is a co-founder of Performance Improvement Management Software. William Ho is a Senior Lecturer at the Aston University Business School. Before joining Aston in 2005, he worked as a Research Associate in the Department of Industrial and Systems Engineering at the Hong Kong Polytechnic University. His research interests include supply chain management, production and operations management, and operations research. He has published extensively in various international journals, including Computers & Operations Research, Engineering Applications of Artificial Intelligence, European Journal of Operational Research, Expert Systems with Applications, International Journal of Production Economics, International Journal of Production Research, and Supply Chain Management: An International Journal. His first authored book was published in 2006. He is an Editorial Board member of the International Journal of Advanced Manufacturing Technology and an Associate Editor of the OR Insight Journal. Currently, he is a Scholar of the Advanced Institute of Management Research.
Uses of frontier efficiency methodologies and multi-criteria decision making for performance measurement in the energy sector This special issue focuses on holistic, applied research on performance measurement in energy sector management, publishing relevant applied research that bridges the gap between industry and academia. After a rigorous refereeing process, seven papers were included in this special issue. The volume opens with five data envelopment analysis (DEA)-based papers. Wu et al. apply the DEA-based Malmquist index to evaluate the changes in relative efficiency and the total factor productivity of coal-fired electricity generation in 30 Chinese administrative regions from 1999 to 2007. Factors considered in the model include fuel consumption, labor, capital, sulphur dioxide emissions, and electricity generated. The authors reveal that the east provinces were relatively and technically more efficient, whereas the west provinces had the highest growth rate in the period studied. Ioannis E. Tsolas applies the DEA approach to assess the performance of Greek fossil fuel-fired power stations, taking undesirable outputs such as carbon dioxide and sulphur dioxide emissions into consideration. In addition, a bootstrapping approach is deployed to address the uncertainty surrounding DEA point estimates and to provide bias-corrected estimates and confidence intervals. The author finds that, in the sample, the non-lignite-fired stations are on average more efficient than the lignite-fired stations. Maethee Mekaroonreung and Andrew L. Johnson compare three DEA-based measures, which estimate production frontiers and evaluate the relative efficiency of 113 US petroleum refineries while considering undesirable outputs.
Three inputs (capital, energy consumption, and crude oil consumption), two desirable outputs (gasoline and distillate generation), and an undesirable output (toxic release) are considered in the DEA models. The authors discover that refineries in the Rocky Mountain region performed the best, and that about 60 percent of the oil refineries in the sample could improve their efficiency further. H. Omrani, A. Azadeh, S. F. Ghaderi, and S. Abdollahzadeh present an integrated approach, combining DEA, corrected ordinary least squares (COLS), and principal component analysis (PCA), to calculate the relative efficiency scores of 26 Iranian electricity distribution units from 2003 to 2006. Specifically, both DEA and COLS are used to check three internal consistency conditions, whereas PCA is used to verify and validate the final ranking results of either DEA (consistency) or DEA-COLS (non-consistency). Three inputs (network length, transformer capacity, and number of employees) and two outputs (number of customers and total electricity sales) are considered in the model. Virendra Ajodhia applies three DEA-based models to evaluate the relative performance of 20 electricity distribution firms from the UK and the Netherlands. The first model is a traditional DEA model for analyzing cost-only efficiency. The second model includes (inverse) quality, modelling total customer minutes lost as an input. The third model is based on the idea of using total social costs, including the firm's private costs and the interruption costs incurred by consumers, as an input. Both energy delivered and number of consumers are treated as outputs in the models. After the five DEA papers, Stelios Grafakos, Alexandros Flamos, Vlasis Oikonomou, and D. Zevgolis present a multiple criteria analysis weighting approach to evaluate energy and climate policy.
The proposed approach is akin to the analytic hierarchy process, consisting of pairwise comparisons, consistency verification, and criteria prioritization. In the approach, stakeholders and experts in the energy policy field are incorporated into the evaluation process through an interactive means of expressing verbal, numerical, and visual representations of their preferences. A total of 14 evaluation criteria were considered and classified under four objectives: climate change mitigation, energy effectiveness, socioeconomic concerns, and competitiveness and technology. Finally, Borge Hess applies the stochastic frontier analysis approach to analyze the impact of various business strategies, including acquisition, holding structures, and joint ventures, on a firm's efficiency within a sample of 47 natural gas transmission pipelines in the USA from 1996 to 2005. The author finds no significant change in a firm's efficiency following an acquisition, and only weak evidence for efficiency improvements caused by the new shareholder. Moreover, parent companies appear not to influence a subsidiary's efficiency positively, and the analysis shows a negative impact of a joint venture on the technical efficiency of the pipeline company. To conclude, we are grateful to all the authors for their contributions, and to all the reviewers for their constructive comments, which made this special issue possible. We hope that this issue will contribute significantly to performance improvement in the energy sector.