33 results for quality measurement


Relevance: 30.00%

Publisher:

Abstract:

The need to improve the management of language-learning organizations, in the light of the trend toward mass higher education and the use of English as a world language, was the starting point of this thesis. The thesis aims to assess the relevance, adequacy and relative success of Total Quality Management (TQM) as a management philosophy. Taking as its empirical evidence a TQM-oriented management project in a Turkish higher-education context, the thesis observes the consequences of a change of organizational culture, with specific reference to teachers' attitudes towards management. Both qualitative and quantitative devices are employed to plot change, and the value of these devices for identifying such change is considered. The main focus of the thesis is the soft S's of an organization (Shared Values, Style, Staff and Skills) rather than the hard S's (System, Structure, Strategy). The thesis is not concerned with teaching and learning processes, though the PDCA cycle (the Action Research Cycle) did play a part in the project for both the teachers and the researcher involved in this study of organizational development. Both before the management project was launched and at the end of the research period, external measurement devices (Harrison's Culture Specification Device and Hofstede's VSM) were used to describe the culture of the Centre. During the management project, internal measurement devices were used to record change, including change in middle-management style (that of the researcher, in this case). The period chosen for this study was September 1991 to June 1994, during which each device was administered twice, at intervals ranging from a year to 32 months.

Relevance: 30.00%

Publisher:

Abstract:

The research is concerned with the measurement of residents' evaluations of the environmental quality of residential areas, and reflects the increased attention being given to residents' values in planning decisions affecting the residential environment. The work was undertaken in co-operation with a local authority that was revising its housing strategy, and in particular its priorities for improvement action. The study critically examines the existing evidence on environmental values and their relationship to the environment, and points to a number of methodological and conceptual deficiencies. The research strategy developed on the basis of this review was constrained by the need to keep any survey methods simple, so that they could easily be repeated, when necessary, by the sponsoring authority. A basic perception model was assumed, and a social survey carried out to measure residents' responses to different environmental conditions. The data were assumed to have only ordinal properties, necessitating the extensive use of non-parametric statistics. Residents' expressions of satisfaction with the component elements of the environment (ranging from convenience to upkeep and privacy) were successfully related to 'objective' measures of the environment. However, the survey evidence did not justify the use of the 'objective' variables as environmental standards. A method of using the social survey data directly as an aid to decision-making is discussed. Alternative models of the derivation of overall satisfaction with the environment are tested, and the values implied by the additive model are compared with residents' preferences as measured directly in the survey. Residents' overall satisfaction with the residential environment was most closely related to their satisfaction with the "Appearance" and the "Reputation" of their areas; by contrast, the most important directly measured preference was "Friendliness of area". The differences point to the need to define the concepts used in social research clearly, in operational terms, and to take care in the use of values 'measured' by different methods.
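The additive model mentioned above can be sketched numerically. The following is a minimal illustration, using invented ordinal ratings (not the survey's data), of relating component satisfactions to overall satisfaction with non-parametric (Spearman rank) statistics, as the abstract's ordinal-data assumption requires:

```python
import numpy as np
from scipy import stats

# Hypothetical ordinal satisfaction ratings (1-5) for three environment
# components and an overall rating, one row per respondent.
rng = np.random.default_rng(0)
appearance = rng.integers(1, 6, 200)
upkeep = rng.integers(1, 6, 200)
privacy = rng.integers(1, 6, 200)
# Overall satisfaction loosely driven by appearance, plus noise.
overall = np.clip(appearance + rng.integers(-1, 2, 200), 1, 5)

# Non-parametric association: Spearman rank correlation of each
# component with overall satisfaction.
for name, comp in [("appearance", appearance), ("upkeep", upkeep),
                   ("privacy", privacy)]:
    rho, p = stats.spearmanr(comp, overall)
    print(f"{name}: rho={rho:.2f} (p={p:.3f})")

# A simple additive model: predict overall satisfaction as the mean of
# the component ratings, then check its rank agreement with overall.
additive = (appearance + upkeep + privacy) / 3
rho_add, _ = stats.spearmanr(additive, overall)
print(f"additive model: rho={rho_add:.2f}")
```

Comparing the additive model's implied weights against directly stated preferences, as the study does, is then a matter of inspecting the per-component correlations.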

Relevance: 30.00%

Publisher:

Abstract:

Coke oven liquor is a toxic wastewater produced in large quantities by the iron and steel and coking industries, and gives rise to major effluent-treatment problems in those industries. Conscious of the potentially serious environmental impact of the discharge of such wastes, pollution-control agencies in many countries have set progressively more stringent quality requirements for the discharge of the treated waste. The most common means of treating the waste is the activated sludge process, but achieving consistently satisfactory treatment by this process has proved difficult in the past. The need to improve the quality of the treated discharge prompted earlier attempts by Tomlins to model the process using adenosine triphosphate (ATP) as a measure of biomass, but these were unsuccessful. This thesis describes work carried out to determine the significance of ATP in the activated sludge treatment of the waste. The use of ATP measurements in wastewater treatment was reviewed. Investigations were conducted into the ATP behaviour of the batch activated sludge treatment of two major components of the waste, phenol and thiocyanate, and of the continuous activated sludge treatment of the liquor itself, using laboratory-scale apparatus. On the basis of these results, equations were formulated to describe the significance of ATP as a measure of activity and biomass in the treatment system. These formed the basis of proposals to use ATP as a control parameter in the activated sludge treatment of coke oven liquor, and of wastewaters in general, with relevance both to the treatment of the waste in the reactor and to the settlement of the sludge produced in the secondary settlement stage of the treatment process.

Relevance: 30.00%

Publisher:

Abstract:

Background: Heterochromatic flicker photometry (HFP) is a psychophysical technique used to measure macular pigment optical density (MPOD). We used the MPS 9000 (MPS) HFP device. Our aim was to determine whether the repeatability of the MPS could be improved, to make it more suitable for monitoring MPOD over time. Methods: Intra-session repeatability was assessed in 25 participants (aged 20-50 years). The resulting data were explored in detail, e.g. by examining the effect of removing or adjusting data with less-than-optimal quality parameters. A protocol was developed for improved overall reliability, which was then tested in terms of inter-session repeatability in a separate group of 27 participants (aged 19-52 years). Results: Removal and adjustment of data reduced the intra-session coefficient of repeatability (CR) by 0.04, on average, and the mean individual standard deviation by 0.004. Inspection of the raw data offered further insight into ways of improving repeatability. The proposed protocol resulted in an inter-session CR of 0.08. Conclusions: Removal and adjustment of less-than-optimal data improved repeatability and is therefore recommended. To improve repeatability further, we propose, in brief, that each patient perform each part of the test twice, and a third time where necessary (described in detail in the protocol). Doing so will make the MPS more useful in research and clinical settings. © 2012 Springer-Verlag.
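For readers unfamiliar with the coefficient of repeatability, a common Bland-Altman-style definition (1.96 times the standard deviation of test-retest differences) can be computed as sketched below. The MPOD values here are invented for illustration and are not the study's data:

```python
import numpy as np

def coefficient_of_repeatability(first, second):
    """Bland-Altman coefficient of repeatability for paired
    test-retest measurements: 1.96 * SD of the differences.
    95% of repeat differences are expected to fall within +/- CR."""
    diffs = np.asarray(first) - np.asarray(second)
    return 1.96 * np.std(diffs, ddof=1)

# Hypothetical MPOD readings from two sessions (illustrative values).
session1 = [0.41, 0.35, 0.52, 0.48, 0.30, 0.44]
session2 = [0.39, 0.38, 0.49, 0.50, 0.33, 0.41]
print(f"CR = {coefficient_of_repeatability(session1, session2):.3f}")
```

A smaller CR means repeat measurements agree more closely, which is why the protocol changes above are judged by the reduction in CR.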

Relevance: 30.00%

Publisher:

Abstract:

Guest editorial. Ali Emrouznejad is a Senior Lecturer at Aston Business School in Birmingham, UK. His research interests include performance measurement and management, efficiency and productivity analysis, and data mining. He has published widely in various international journals. He is an Associate Editor of the IMA Journal of Management Mathematics and has been Guest Editor of several special issues of journals including the Journal of the Operational Research Society, Annals of Operations Research, Journal of Medical Systems, and International Journal of Energy Sector Management. He is on the editorial board of several international journals and a co-founder of Performance Improvement Management Software. William Ho is a Senior Lecturer at Aston Business School. Before joining Aston in 2005, he worked as a Research Associate in the Department of Industrial and Systems Engineering at the Hong Kong Polytechnic University. His research interests include supply chain management, production and operations management, and operations research. He has published extensively in international journals such as Computers & Operations Research, Engineering Applications of Artificial Intelligence, European Journal of Operational Research, Expert Systems with Applications, International Journal of Production Economics, International Journal of Production Research, and Supply Chain Management: An International Journal. His first authored book was published in 2006. He is an Editorial Board member of the International Journal of Advanced Manufacturing Technology and an Associate Editor of the OR Insight Journal. He is currently a Scholar of the Advanced Institute of Management Research.
Uses of frontier efficiency methodologies and multi-criteria decision making for performance measurement in the energy sector. This special issue focuses on holistic, applied research on performance measurement in energy-sector management, with the aim of bridging the gap between industry and academia. After a rigorous refereeing process, seven papers were included in this special issue. The volume opens with five data envelopment analysis (DEA)-based papers. Wu et al. apply the DEA-based Malmquist index to evaluate changes in the relative efficiency and total factor productivity of coal-fired electricity generation in 30 Chinese administrative regions from 1999 to 2007. Factors considered in the model include fuel consumption, labor, capital, sulphur dioxide emissions, and electricity generated. The authors find that the eastern provinces were relatively and technically more efficient, whereas the western provinces had the highest growth rate in the period studied. Ioannis E. Tsolas applies the DEA approach to assess the performance of Greek fossil-fuel-fired power stations, taking undesirable outputs such as carbon dioxide and sulphur dioxide emissions into consideration. In addition, the bootstrapping approach is deployed to address the uncertainty surrounding DEA point estimates and to provide bias-corrected estimates and confidence intervals. The author finds that, in the sample, the non-lignite-fired stations are on average more efficient than the lignite-fired stations. Maethee Mekaroonreung and Andrew L. Johnson compare the relative performance of three DEA-based measures, which estimate production frontiers and evaluate the relative efficiency of 113 US petroleum refineries while considering undesirable outputs.
Three inputs (capital, energy consumption, and crude oil consumption), two desirable outputs (gasoline and distillate generation), and an undesirable output (toxic release) are considered in the DEA models. The authors find that refineries in the Rocky Mountain region performed best, and that about 60 percent of the oil refineries in the sample could improve their efficiency further. H. Omrani, A. Azadeh, S. F. Ghaderi, and S. Abdollahzadeh present an integrated approach, combining DEA, corrected ordinary least squares (COLS), and principal component analysis (PCA), to calculate the relative efficiency scores of 26 Iranian electricity distribution units from 2003 to 2006. Specifically, both DEA and COLS are used to check three internal consistency conditions, whereas PCA is used to verify and validate the final ranking results of either DEA (consistency) or DEA-COLS (non-consistency). Three inputs (network length, transformer capacity, and number of employees) and two outputs (number of customers and total electricity sales) are considered in the model. Virendra Ajodhia applies three DEA-based models to evaluate the relative performance of 20 electricity distribution firms from the UK and the Netherlands. The first is a traditional DEA model for analyzing cost-only efficiency. The second includes (inverse) quality by modelling total customer minutes lost as an input. The third is based on the idea of using total social costs, including the firm's private costs and the interruption costs incurred by consumers, as an input. Both energy delivered and number of consumers are treated as outputs in the models. After the five DEA papers, Stelios Grafakos, Alexandros Flamos, Vlasis Oikonomou, and D. Zevgolis present a multiple-criteria weighting approach to the evaluation of energy and climate policy.
The proposed approach is akin to the analytic hierarchy process, consisting of pairwise comparisons, consistency verification, and criteria prioritization. Stakeholders and experts in the energy policy field are incorporated in the evaluation process through an interactive means of expressing their preferences with verbal, numerical, and visual representations. A total of 14 evaluation criteria were considered and classified under four objectives: climate change mitigation, energy effectiveness, socioeconomics, and competitiveness and technology. Finally, Borge Hess applies the stochastic frontier analysis approach to analyze the impact of various business strategies, including acquisitions, holding structures, and joint ventures, on a firm's efficiency, within a sample of 47 natural gas transmission pipelines in the USA from 1996 to 2005. The author finds no significant change in a firm's efficiency following an acquisition, and only weak evidence of efficiency improvements caused by the new shareholder. The author also finds that parent companies appear not to influence a subsidiary's efficiency positively, and the analysis shows a negative impact of joint ventures on the technical efficiency of the pipeline company. To conclude, we are grateful to all the authors for their contributions, and to all the reviewers for their constructive comments, which made this special issue possible. We hope that this issue will contribute significantly to performance improvement in the energy sector.
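As an illustration of the DEA machinery used throughout the papers above, the following sketch solves the standard input-oriented CCR envelopment problem as a linear program with `scipy.optimize.linprog`. The four-unit dataset is invented and is not drawn from any of the studies:

```python
import numpy as np
from scipy.optimize import linprog

def ccr_efficiency(X, Y, o):
    """Input-oriented CCR DEA efficiency of DMU `o`.
    X: (m, n) inputs, Y: (s, n) outputs; columns are decision-making units.
    Solves: min theta  s.t.  X@lam <= theta*X[:,o],  Y@lam >= Y[:,o],  lam >= 0."""
    m, n = X.shape
    s = Y.shape[0]
    c = np.r_[1.0, np.zeros(n)]                  # minimise theta
    A_in = np.hstack([-X[:, [o]], X])            # X@lam - theta*x_o <= 0
    A_out = np.hstack([np.zeros((s, 1)), -Y])    # -Y@lam <= -y_o
    A_ub = np.vstack([A_in, A_out])
    b_ub = np.r_[np.zeros(m), -Y[:, o]]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub,
                  bounds=[(0, None)] * (n + 1), method="highs")
    return res.x[0]

# Toy data: 4 units, 2 inputs (e.g. capital, energy), 1 identical output.
X = np.array([[2.0, 4.0, 3.0, 5.0],
              [3.0, 2.0, 4.0, 6.0]])
Y = np.array([[1.0, 1.0, 1.0, 1.0]])
scores = [ccr_efficiency(X, Y, o) for o in range(4)]
print([f"{t:.2f}" for t in scores])
```

Units on the frontier score 1.0; the rest receive the radial input contraction needed to reach it, which is the "relative efficiency" reported in the papers.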

Relevance: 30.00%

Publisher:

Abstract:

BACKGROUND: The use of quality-of-life (QoL) instruments in menorrhagia research is increasing, but there is concern that not enough emphasis is placed on patient focus in these measurements, i.e. on issues which are of importance to patients and reflect their experiences and concerns (clinical face validity). The objective was to assess the quality of QoL instruments in studies of menorrhagia. STUDY DESIGN: A systematic review of published research. Papers were identified through MEDLINE (1966-April 2000), EMBASE (1980-April 2000), Science Citation Index (1981-April 2000), Social Science Citation Index (1981-April 2000), CINAHL (1982-1999) and PsychLIT (1966-1999), and by manual searching of the bibliographies of known primary and review articles. Studies were selected if they assessed women with menorrhagia for quality of life, either developing QoL instruments or applying them as an outcome measure. Selected studies were assessed for the quality of their QoL instruments using a 17-item checklist: 10 items for clinical face validity (issues of relevance to patients' expectations and concerns) and seven for measurement properties (such as reliability and responsiveness). RESULTS: A total of 19 articles, eight on instrument development and 11 on application, were included in the review. The generic Short Form 36 Health Survey Questionnaire (SF36) was used in 12/19 (63%) studies. Only two studies developed new menorrhagia-specific QoL instruments, and these complied with 7/17 (41%) and 10/17 (59%) of the quality criteria. Quality assessment showed that only 7/19 (37%) studies complied with more than half of the criteria for face validity, whereas 17/19 (90%) complied with more than half of the criteria for measurement properties (P = 0.0001). CONCLUSION: Among existing QoL instruments there is good compliance with the quality criteria for measurement properties, but not with those for clinical face validity. There is a need to develop methodologically sound, disease-specific QoL instruments for menorrhagia, focusing on both face validity and measurement properties.
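The compliance comparison reported above (7/19 vs 17/19) can be illustrated with a simple test of proportions. The abstract does not state which test produced P = 0.0001, so the Fisher's exact test below is purely illustrative, treating the two criteria sets as independent groups:

```python
from scipy import stats

# Contingency table from the reported counts: rows are criteria sets,
# columns are (complied with >half of criteria, did not).
table = [[7, 12],    # clinical face validity
         [17, 2]]    # measurement properties
odds_ratio, p = stats.fisher_exact(table)
print(f"Fisher's exact p = {p:.4f}")
```

Whatever the exact test used, the disparity between 37% and 90% compliance is large enough to be statistically significant at conventional levels.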

Relevance: 30.00%

Publisher:

Abstract:

Background/Aims: To develop and assess the psychometric validity of a Chinese-language vision-related quality-of-life (VRQoL) measurement instrument for the Chinese visually impaired. Methods: The Low Vision Quality of Life Questionnaire (LVQOL) was translated and adapted into the Chinese-version Low Vision Quality of Life Questionnaire (CLVQOL). The CLVQOL was completed by 100 randomly selected people with low vision (primary group) and 100 people with normal vision (control group). Ninety-four participants from the primary group completed the CLVQOL a second time two weeks later (test-retest group). The internal-consistency reliability, test-retest reliability, item-internal consistency, item-discrimination validity, construct validity and discriminatory power of the CLVQOL were calculated. Results: The review committee agreed that the CLVQOL replicated the meaning of the LVQOL and was sensitive to cultural differences. Cronbach's α and the split-half coefficient for the four subscales and the total CLVQOL were 0.75-0.97, and the test-retest reliability, as estimated by the intraclass correlation coefficient, was 0.69-0.95. Item-internal consistency was >0.40, and item-discrimination validity was generally <0.40. Varimax-rotated factor analysis of the CLVQOL identified four principal factors. The quality-of-life ratings on the four subscales and the total CLVQOL score of the primary group were lower than those of the control group, in both hospital-based and community-based subjects. Conclusion: The CLVQOL is a culturally specific, vision-related quality-of-life instrument. It satisfies conventional psychometric criteria, discriminates visually healthy populations from low-vision patients, and may be valuable for screening in the local community as well as for use in clinical practice or research. © Springer 2005.
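Cronbach's α, the internal-consistency statistic reported above, can be computed from an item-score matrix with the standard formula. The ratings below are invented for illustration and are not the CLVQOL data:

```python
import numpy as np

def cronbach_alpha(items):
    """Cronbach's alpha for an (n_respondents, n_items) score matrix:
    alpha = k/(k-1) * (1 - sum of item variances / variance of total score)."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars / total_var)

# Hypothetical 5-point ratings from five respondents on four items.
scores = np.array([[4, 5, 4, 4],
                   [2, 2, 3, 2],
                   [5, 4, 5, 5],
                   [3, 3, 2, 3],
                   [1, 2, 1, 2]])
print(f"alpha = {cronbach_alpha(scores):.2f}")
```

Values in the 0.75-0.97 range, as reported for the CLVQOL subscales, indicate that items within a subscale move together and measure a common construct.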

Relevance: 30.00%

Publisher:

Abstract:

A modal interferometer for fiber curvature measurement, based on a multimode-singlemode-multimode fiber structure built with a biconical taper, is proposed and experimentally demonstrated. Because the tapered singlemode fiber acts as a highly efficient mode-power converter that enhances mode coupling, a curvature sensor with improved sensitivity is achieved by monitoring the defined fringe visibility of the interference spectrum. The measuring range can be tuned by changing the waist diameter of the fiber taper. The sensor also shows an intrinsic ability to overcome the influence of temperature cross-sensitivity and power fluctuations of the light source. The advantages of easy fabrication, a high-quality spectrum with improved sensitivity, and small hysteresis give the sensor great potential for practical applications. © 2013 Springer-Verlag Berlin Heidelberg.
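The fringe visibility monitored by such a sensor is conventionally defined as V = (Imax − Imin)/(Imax + Imin). The sketch below computes it for a synthetic two-mode interference spectrum; all parameter values (wavelength span, path difference, modulation depth) are invented for illustration:

```python
import numpy as np

# Synthetic two-mode interference spectrum: intensity modulated by the
# phase difference between the coupled modes. The modulation depth stands
# in for the curvature-dependent mode coupling strength.
wavelength = np.linspace(1530e-9, 1570e-9, 2000)   # scan range, metres
opd = 60e-6                                        # optical path difference
depth = 0.8                                        # coupling-dependent depth
spectrum = 1 + depth * np.cos(2 * np.pi * opd / wavelength)

# Fringe visibility: V = (Imax - Imin) / (Imax + Imin).
visibility = (spectrum.max() - spectrum.min()) / (spectrum.max() + spectrum.min())
print(f"visibility = {visibility:.2f}")
```

Because visibility is a ratio of intensities, it is insensitive to a uniform scaling of the source power, which is the intuition behind the sensor's immunity to source-power fluctuations.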

Relevance: 30.00%

Publisher:

Abstract:

METPEX is a three-year FP7 project which aims to develop a pan-European tool to measure the quality of the passenger's experience of multimodal transport. Initial work has led to the development of a comprehensive set of variables relating to different passenger groups, forms of transport and journey stages. This paper addresses the main challenges in transforming those variables into usable, accessible computer-based tools allowing the real-time collection of information across multiple journey stages in different EU countries. Non-computer-based measurement instruments will be used to gather information from those who may not have, or be familiar with, mobile technology. Smartphone-based measurement instruments, hosted in two applications, will also be used. The mobile applications need to be easy to use, configurable and adaptable according to the context of use. They should also be inherently interesting and rewarding for the participant, whilst allowing the collection of high-quality, valid and reliable data from all journey types and stages, from planning through entry into and egress from different transport modes, travel in public and personal vehicles, and support for active forms of transport (e.g. cycling and walking). During all phases of data collection and processing, the privacy of the participant is safeguarded. © 2014 Springer International Publishing.

Relevance: 30.00%

Publisher:

Abstract:

This paper presents for the first time the concept of measurement-assisted assembly (MAA) and outlines the research priorities for realising this concept in industry. MAA denotes a paradigm shift in the assembly of high-value and complex products, and encompasses the development and use of novel metrology processes for the holistic integration and capability enhancement of key assembly and ancillary processes. A complete framework for MAA is detailed, showing how it can facilitate a step change in assembly process capability and efficiency for large and complex products, such as airframes, where traditional assembly processes require rectification and rework, use inflexible tooling and are largely manual, resulting in cost and cycle-time pressures. The concept of MAA encompasses a range of innovative measurement-assisted processes which enable rapid part-to-part assembly, increased use of flexible automation, traceable quality assurance and control, reduced structure weight and improved levels of precision across the dimensional scales. A full-scale industrial trial of MAA technologies has been carried out on an experimental aircraft wing, demonstrating the viability of the approach, while studies within 140 smaller companies have highlighted the need for better adoption of existing process capability and quality control standards. The identified research priorities for MAA include the development of both frameless and tooling-embedded automated metrology networks. Other priorities relate to the development of integrated dimensional variation management, thermal compensation algorithms, and measurement planning and inspection algorithms linking design to measurement and process planning. © Springer-Verlag London 2013.

Relevance: 30.00%

Publisher:

Abstract:

High-precision manufacturers continuously seek out disruptive technologies to improve the quality, cost, and delivery of their products. With the advancement of machine tool and measurement technology, many companies are ready to capitalise on the opportunity of on-machine measurement (OMM). Coupled with the business case, manufacturing engineers are now questioning whether OMM can soon eliminate the need for post-process inspection systems. Metrologists will, however, argue that the machining environment is too hostile and that there are numerous process variables which need consideration before traceable measurement on the machine can be achieved. In this paper we test the measurement capability of five new multi-axis machine tools enabled as OMM systems via on-machine probing. All systems are tested under various operating conditions in order to better understand the effects of potentially significant variables. This investigation found that key process variables such as machine tool warm-up and tool-change cycles can affect machine tool measurement repeatability. The new data presented here are important to the many manufacturers who are considering using their high-precision multi-axis machine tools for both the creation and the verification of their products.

Relevance: 30.00%

Publisher:

Abstract:

Measurement-assisted assembly (MAA) has the potential to facilitate a step change in assembly efficiency for large structures such as airframes, through the reduction of rework, manually intensive processes and expensive monolithic assembly tooling. It is shown how MAA can enable rapid part-to-part assembly, increased use of flexible automation, traceable quality assurance and control, reduced structure weight and improved aerodynamic tolerances. These advances will require the development of automated networks of measurement instruments, model-based thermal compensation, the automatic integration of 'live' measurement data into variation simulation, and algorithms to generate cutting paths for predictive shimming and drilling processes. This paper sets out an architecture for the digital systems which will enable this integrated approach to variation management. © 2013 The Authors.

Relevance: 30.00%

Publisher:

Abstract:

In this paper a new approach to resource allocation and scheduling that reflects the user's Quality of Experience (QoE) is presented. The proposed scheduling algorithm is examined in the context of the 3GPP Long Term Evolution (LTE) system. Pause intensity (PI), an objective, no-reference quality-assessment metric, is employed to represent user satisfaction in the eNodeB scheduler; PI is, in effect, a measurement of discontinuity in the service. The performance of the proposed scheduling method is compared with two extreme cases, the maxCI and Round Robin scheduling schemes, which correspond to efficiency-oriented and fairness-oriented mechanisms, respectively. Our work reveals that the proposed method is able to operate between the fairness and efficiency requirements, in favor of raising user satisfaction to the desired level. © VDE VERLAG GMBH.

Relevance: 30.00%

Publisher:

Abstract:

In this paper a full analytic model for pause intensity (PI), a no-reference metric for video quality assessment, is presented. The model is built upon the behavior of the video playout buffer at the client side and also encompasses the characteristics of a TCP network. Video streaming via TCP produces impairments in play continuity which are not typically reflected in current objective metrics such as PSNR and SSIM. Recently the buffer underrun frequency/probability has been used to characterize buffer behavior and as a measurement for performance optimization, but we show, using subjective testing, that underrun frequency alone cannot reflect viewers' quality of experience for TCP-based streaming. We also demonstrate that PI is a comprehensive metric made up of a combination of phenomena observed in the playout buffer. The analytical model in this work is verified with simulations carried out on ns-2, showing that the two sets of results are closely matched. The effectiveness of the PI metric has also been proved by subjective testing on a range of video clips, where PI values exhibit a good correlation with viewers' opinion scores. © 2012 IEEE.

Relevance: 30.00%

Publisher:

Abstract:

In this work we investigate a new objective measurement for assessing video playback quality for services delivered over networks that use TCP as the transport-layer protocol. We define the new metric, pause intensity, to characterize the quality of playback in terms of its continuity, since in the case of TCP data packets are protected from losses but not from delays. Using packet traces generated from real TCP connections in a lossy environment, we simulate the playback of a video and monitor buffer behavior in order to calculate pause intensity values. We also run subjective tests to verify the effectiveness of the metric and show that pause intensity values and the subjective scores for the same real video clips are closely correlated.
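The buffer-driven view of pause intensity described in these abstracts can be illustrated with a toy playout simulation. The exact PI formula is defined in the papers themselves; the combination of pause duration and pause frequency used below is only an illustrative proxy, and the trace, rates and prebuffer size are all invented:

```python
# Minimal playout-buffer sketch (not the authors' exact formulation):
# packets arrive at a variable rate, playback drains at a fixed rate, and
# playback pauses whenever the buffer cannot supply a full second of data.
# "Pause intensity" is taken here as an illustrative combination of the
# pause-duration ratio and the pause frequency.

def pause_metrics(arrivals_per_s, playback_rate=10, prebuffer=20):
    """arrivals_per_s: packets arriving in each 1-second slot."""
    buffer_level = prebuffer
    paused_time = 0
    pauses = 0
    in_pause = False
    for arrived in arrivals_per_s:
        buffer_level += arrived
        if buffer_level >= playback_rate:   # enough data: play for 1 s
            buffer_level -= playback_rate
            in_pause = False
        else:                               # starved: pause for 1 s
            paused_time += 1
            if not in_pause:
                pauses += 1
                in_pause = True
    t = len(arrivals_per_s)
    return paused_time / t, pauses, (paused_time / t) * (pauses / t)

# 30-second trace: a mid-session throughput drop starves the buffer.
trace = [12] * 10 + [2] * 10 + [12] * 10
ratio, count, pi = pause_metrics(trace)
print(f"pause ratio={ratio:.2f}, pauses={count}, PI={pi:.3f}")
```

The example shows why a pure underrun count is uninformative: the trace produces a single pause event, but the pause lasts several seconds, and a duration-aware metric captures that.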