52 results for process quality indicator
Abstract:
Purpose - The purpose of the paper is to develop an integrated quality management model, which identifies problems, suggests solutions, develops a framework for implementation and helps evaluate the performance of health care services dynamically. Design/methodology/approach - This paper uses logical framework analysis (LFA), a matrix approach to project planning for managing quality. This has been applied to three acute healthcare services (operating room utilization, accident and emergency, and intensive care) in order to demonstrate its effectiveness. Findings - The paper finds that LFA is an effective method of quality management for hospital-based healthcare services. Research limitations/implications - This paper shows the application of LFA to three service processes in one hospital; ideally it should also be tested in several hospitals and in other services. Practical implications - The proposed model can be applied in hospital-based healthcare services to improve performance. Originality/value - The paper shows that quality improvement in healthcare services is a complex and multi-dimensional task. Although various quality management tools are routinely deployed for identifying quality issues in health care delivery and corrective measures are taken for superior performance, there is no integrated approach that can identify and analyze issues, provide solutions to resolve them, and develop a project management framework (planning, monitoring, and evaluating) to implement those solutions and thereby improve process performance. This study introduces an integrated and uniform quality management tool. It integrates operations with organizational strategies. © Emerald Group Publishing Limited.
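The abstract does not reproduce the LFA matrix itself. As a rough, hypothetical sketch of what a logframe looks like, the Python snippet below encodes the standard four-level hierarchy and four-column structure of LFA; the example narrative entries for an operating-room utilization project are invented for illustration.

# Standard logframe structure: four hierarchy levels crossed with four columns.
# The narrative entries below are hypothetical, for illustration only.
LOGFRAME_COLUMNS = ["Narrative summary", "Objectively verifiable indicators",
                    "Means of verification", "Assumptions"]

logframe = {
    "Goal":       "Improve the quality of acute healthcare services",
    "Purpose":    "Increase operating room utilization",
    "Outputs":    "Revised theatre scheduling procedure and monitoring reports",
    "Activities": "Map the current process, pilot the new schedule, train staff",
}

for level, narrative in logframe.items():
    print(f"{level:>10} | {narrative}")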
Abstract:
Due to its wide applicability and ease of use, the analytic hierarchy process (AHP) has been studied extensively for the last 20 years. Recently, the focus has shifted to applications of integrated AHPs rather than the stand-alone AHP. The five tools most commonly combined with the AHP are mathematical programming, quality function deployment (QFD), meta-heuristics, SWOT analysis, and data envelopment analysis (DEA). This paper reviews the literature on applications of the integrated AHPs. Related articles appearing in international journals from 1997 to 2006 are gathered and analyzed to answer three questions: (i) which type of integrated AHP has received the most attention? (ii) in which areas have the integrated AHPs been most frequently applied? (iii) are there any inadequacies in these approaches? Where inadequacies are found, improvements and possible future work are recommended. This research not only provides evidence that the integrated AHPs are better than the stand-alone AHP, but also aids researchers and decision makers in applying the integrated AHPs effectively.
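For readers unfamiliar with the stand-alone AHP that these integrated methods extend, the sketch below shows its core calculation under standard assumptions: priority weights are taken from the principal eigenvector of a pairwise comparison matrix, and a consistency ratio is checked. The 3x3 comparison values are invented for illustration.

import numpy as np

# Hypothetical pairwise comparison matrix for three criteria (Saaty 1-9 scale).
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

# Priority weights: principal eigenvector of A, normalised to sum to 1.
eigvals, eigvecs = np.linalg.eig(A)
k = int(np.argmax(eigvals.real))
weights = np.abs(eigvecs[:, k].real)
weights /= weights.sum()

# Consistency check: CI = (lambda_max - n) / (n - 1); RI = 0.58 for n = 3.
n = A.shape[0]
ci = (eigvals[k].real - n) / (n - 1)
cr = ci / 0.58
print("weights:", np.round(weights, 3), " consistency ratio:", round(cr, 3))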
Abstract:
In order to survive in an increasingly customer-oriented marketplace, continuous quality improvement is a hallmark of the fastest growing quality organizations. In recent years, attention has focused on intelligent systems, which have shown great promise in supporting quality control. However, only a small number of the systems currently in use are reported to operate effectively, because they are designed to maintain a quality level within a specified process rather than to support cooperation across the production workflow. This paper proposes an intelligent system with a newly designed algorithm and a universal process data exchange standard to overcome the challenges posed by demanding customers who seek high-quality, low-cost products. The intelligent quality management system is equipped with a "distributed process mining" feature that gives employees at all levels the ability to understand the relationships between processes, especially when any aspect of a process is about to degrade or fail. An example using generalized fuzzy association rules is applied in the manufacturing sector to demonstrate how the proposed iterative process mining algorithm finds relationships between distributed process parameters and the presence of quality problems.
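The paper's own algorithm is not reproduced in the abstract; as a minimal sketch of the underlying idea of fuzzy association rules, the snippet below computes the fuzzy support and confidence of one hypothetical rule ("temperature is high" implies "defect rate is high") from membership degrees using the minimum t-norm. The process records and membership functions are invented.

# Hypothetical process records: (temperature in deg C, observed defect rate).
records = [(182, 0.9), (150, 0.2), (175, 0.7), (190, 1.0), (140, 0.1)]

def mu_high_temp(t):      # toy membership function for "temperature is high"
    return min(max((t - 150) / 40.0, 0.0), 1.0)

def mu_high_defect(d):    # toy membership function for "defect rate is high"
    return min(max(d, 0.0), 1.0)

# Fuzzy support of A -> B: average of min(mu_A, mu_B) over all records.
support_ab = sum(min(mu_high_temp(t), mu_high_defect(d)) for t, d in records) / len(records)
support_a = sum(mu_high_temp(t) for t, _ in records) / len(records)
confidence = support_ab / support_a if support_a else 0.0

print(f"fuzzy support = {support_ab:.2f}, fuzzy confidence = {confidence:.2f}")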
Abstract:
Building on social exchange theory and qualitative inquiry, managerial responsiveness, caring, and aggressiveness were uncovered as three key social exchange dimensions used by sales managers when dealing with problem situations in the salesforce. We used Australian data to develop measures of these three constructs. Results of the development process indicate that the measures show good validity. We also examine the relationship of the three exchange dimensions with key organizational outcomes. Overall, the findings suggest that the three constructs are important in sales manager problem resolution exchanges, and that they may ultimately influence the success of sales organizations.
Abstract:
The aim of this investigation was to study the chemical reactions occurring during the batchwise production of a butylated melamine-formaldehyde resin, in order to optimise the efficiency and economics of the batch processes. The batch process models are largely empirical in nature as the reaction mechanism is unknown. The process chemistry and the commercial manufacturing method are described. A small-scale system was established in glass and the ability to produce laboratory resins with the required quality was demonstrated, simulating the full-scale plant. During further experiments the chemical reactions of methylolation, condensation and butylation were studied. The important process stages were identified and studied separately. The effects of variation of certain process parameters on the chemical reactions were also studied. A published model of methylolation was modified and used to simulate the methylolation stage. A major result of this project was the development of an indirect method for studying the condensation and butylation reactions occurring during the dehydration and acid reaction stages, as direct quantitative methods were not available. A mass balance method was devised for this purpose and used to collect experimental data. The reaction scheme was verified using this data. The reaction stages were simulated using an empirical model. This has revealed new information regarding the mechanism and kinetics of the reactions. Laboratory results were shown to be comparable with plant-scale results. This work has improved the understanding of the batch process, which can be used to improve product consistency. Future work has been identified and recommended to produce an optimum process and plant design to reduce the batch time.
Abstract:
A two-tier study is presented in this thesis. The first involves the commissioning of an extant but, at the time, unproven bubbling fluidised bed fast pyrolysis unit. The unit was designed for an intended nominal throughput of 300 g/h of biomass. The unit came complete with solids separation, pyrolysis vapour quenching and oil collection systems. Modifications were carried out on various sections of the system, including the reactor heating, quenching and liquid collection systems. The modifications allowed fast pyrolysis experiments to be carried out at the appropriate temperatures. Bio-oil was generated using conventional biomass feedstocks including willow, beechwood, pine and Miscanthus. Results from this phase of the research showed, however, that although the rig was capable of processing biomass to bio-oil, it was characterised by low mass balance closures and recurrent operational problems. The problems included blockages, poor reactor hydrodynamics and reduced organic liquid yields. The less than optimal performance of individual sections, particularly the feed and reactor systems, culminated in a poor overall performance of the system. The second phase of this research involved the redesign of two key components of the unit. An alternative feeding system was commissioned for the unit, including an off-the-shelf gravimetric system for accurate metering and efficient delivery of biomass. Similarly, a new bubbling fluidised bed reactor with an intended nominal throughput of 500 g/h of biomass was designed and constructed. The design leveraged experience from the initial commissioning phase together with proven kinetic and hydrodynamic studies. These units were commissioned as part of the optimisation phase of the study. Also as part of this study, two varieties each of two previously unreported feedstocks, namely Jatropha curcas and Moringa oleifera oil seed press cakes, were characterised to determine their suitability as feedstocks for liquid fuel production via fast pyrolysis. Subsequently, the feedstocks were used for the production of pyrolysis liquids. The quality of the pyrolysis liquids from these feedstocks was then investigated via a number of analytical techniques. The oils from the press cakes showed high levels of stability and reduced pH values. The improvements to the design of the fast pyrolysis unit led to higher mass balance closures and increased organic liquid yields. The maximum liquid yield obtained from the press cakes was from African Jatropha press cake, at 66 wt% on a dry basis.
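"Mass balance closure" is not defined in the abstract; the conventional definition in fast pyrolysis work, assumed here, is the fraction of the dry feed recovered as measured products:

\text{mass balance closure (\%)} = \frac{m_{\text{liquid}} + m_{\text{char}} + m_{\text{gas}}}{m_{\text{dry feed}}} \times 100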
Abstract:
Pyrolysis is one of several thermochemical technologies that convert solid biomass into more useful and valuable bio-fuels. Pyrolysis is thermal degradation in the complete or partial absence of oxygen. Under carefully controlled conditions, solid biomass can be converted to a liquid known as bio-oil in 75% yield on dry feed. Bio-oil can be used as a fuel but has the drawback of a high oxygen content due to the presence of a complex mixture of molecular fragments of cellulose, hemicellulose and lignin polymers. Bio-oil also has a number of problems in use, including high initial viscosity, instability resulting in increased viscosity or phase separation, and high solids content. Much effort has been spent on upgrading bio-oil into a more usable liquid fuel, either by modifying the liquid or by major chemical and catalytic conversion to hydrocarbons. The overall primary objective was to improve oil stability, explored in three ways. The first was to determine the effect of feed moisture content on bio-oil stability. The second was to try to improve bio-oil stability by partially oxygenated pyrolysis. The third was to improve stability by co-pyrolysis with methanol. The project was carried out on an existing laboratory pyrolysis reactor system, which suited the project well without requiring major redesign or modification. During the final stages of this project, it was found that the temperature of the condenser in the product collection system had a marked impact on pyrolysis liquid stability. This is discussed in this work and further recommendations are given. The quantity of water coming from the feedstock and the pyrolysis reaction is important to liquid stability. In the present work the feedstock moisture content was varied and pyrolysis experiments were carried out over a range of temperatures. The quality of the bio-oil produced was measured as water content, initial viscosity and stability. The results showed that moderate feedstock moisture (7.3-12.8% moisture) led to more stable bio-oil. One of the drawbacks of bio-oil is its instability due to the unstable oxygenated chemicals it contains. Catalytic hydrotreatment of the oil and zeolite cracking of pyrolysis vapour have been discussed by many researchers; these processes are intended to eliminate oxygen from the bio-oil. In this work an alternative approach, partially oxygenated pyrolysis, was introduced to reduce oil instability by oxidising unstable oxygenated chemicals in the bio-oil. The results showed that liquid stability was improved by oxygen addition during the pyrolysis of beech wood at an optimum air factor of about 0.09-0.15. Methanol as a post-production additive to bio-oil has been studied by many researchers, and the most effective result came from adding methanol to the oil just after production. Co-pyrolysis of spruce wood with methanol was undertaken in the present work, and it was found that methanol improved liquid stability as a co-pyrolysis solvent but was no more effective than when used as a post-production additive.
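The "air factor" reported above is not defined in the abstract; it is conventionally the ratio of the air supplied to the air required for complete (stoichiometric) combustion, so a value of 0.09-0.15 means that roughly 9-15% of stoichiometric air was admitted during pyrolysis:

\lambda = \frac{m_{\text{air, supplied}}}{m_{\text{air, stoichiometric}}}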
Abstract:
Recent advances in technology have produced a significant increase in the availability of free sensor data over the Internet. With affordable weather monitoring stations now available to individual meteorology enthusiasts, a reservoir of real-time data such as temperature, rainfall and wind speed can now be obtained for most of the United States and Europe. Despite the abundance of available data, obtaining usable information about the weather in your local neighbourhood requires complex processing that poses several challenges. This paper discusses a collection of technologies and applications that harvest, refine and process this data, culminating in information that has been tailored toward the user. In this case we are particularly interested in allowing a user to make direct queries about the weather at any location, even when this is not directly instrumented, using interpolation methods. We also consider how the uncertainty that the interpolation introduces can then be communicated to the user of the system, using UncertML, a developing standard for uncertainty representation.
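The paper does not commit to a single interpolation scheme in this abstract; as a minimal sketch of the kind of point query described, the snippet below estimates temperature at an arbitrary location from nearby stations using inverse distance weighting, and reports a crude uncertainty measure (the weighted standard deviation of the contributing observations). Station coordinates and readings are invented.

import math

# Hypothetical station observations: (longitude, latitude, temperature in deg C).
stations = [(-1.90, 52.48, 11.2), (-1.75, 52.41, 10.6), (-2.05, 52.55, 11.9)]

def idw_estimate(lon, lat, obs, power=2.0):
    """Inverse-distance-weighted estimate plus a weighted standard deviation."""
    weights, values = [], []
    for s_lon, s_lat, value in obs:
        d = math.hypot(lon - s_lon, lat - s_lat)
        if d == 0.0:
            return value, 0.0            # query point coincides with a station
        weights.append(1.0 / d ** power)
        values.append(value)
    total = sum(weights)
    mean = sum(w * v for w, v in zip(weights, values)) / total
    var = sum(w * (v - mean) ** 2 for w, v in zip(weights, values)) / total
    return mean, math.sqrt(var)

temperature, sigma = idw_estimate(-1.95, 52.50, stations)
print(f"estimated temperature: {temperature:.1f} +/- {sigma:.1f} deg C")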
Abstract:
On the basis of a review of the substantive quality and service marketing literature, current knowledge regarding service quality expectations was found to be either absent or deficient. The phenomenon is of increasing importance to both marketing researchers and management and was therefore judged worthy of scholarly consideration. Because the service quality literature was insufficiently rich when embarking on the thesis, three basic research issues were considered, namely the nature, determinants, and dynamics of service quality expectations. These issues were first conceptually and then qualitatively explored. This process generated research hypotheses, mainly relating to a model, which were subsequently tested through a series of empirical investigations using questionnaire data from field studies in a single context. The results were internally consistent and strongly supported the main research hypotheses. It was found that service quality expectations can be meaningfully described in terms of generic/service-specific, intangible/tangible, and process/outcome categories. Service-specific quality expectations were also shown to be determined by generic service quality expectations, demographic variables, personal values, psychological needs, general service sophistication, service-specific sophistication, purchase motives, and service-specific information when treating service class involvement as an exogenous variable. Subjects who had not previously directly experienced a particular service were additionally found to revise their expectations of quality when exposed to the service, with change being driven by a sub-set of the identified determinants.
Abstract:
Exporting is one of the main ways in which organizations internationalize. With a more turbulent, heterogeneous, sophisticated and less familiar export environment, the organizational learning ability of the exporting organization may become its only source of sustainable competitive advantage. However, achieving a competitive level of learning is not easy. Companies must be able to find ways to improve their learning capability by enhancing the different aspects of the learning process. One of these is export memory. Building from an export information processing framework, this research focuses in particular on the quality of export memory, its determinants, its subsequent use in decision-making, and its ultimate relationship with export performance. Within export memory use, four dimensions have been discovered: instrumental, conceptual, legitimizing and manipulating. Results of the study, based on data from a mail survey with 354 responses, reveal that the development of export memory quality is positively related to the quality of export information acquisition, the quality of export information interpretation, export coordination, and integration of the information into the organizational system. Several company and environmental factors have also been examined in terms of their relationship with export memory use. The two factors found to be significantly related to the extent of export memory use are the quality of export information acquisition and export memory quality. The results reveal that export memory quality is positively related to the extent of export memory use, which in turn was found to be positively related to export performance. Furthermore, the results show that only one aspect of export memory use significantly affects export performance - the extent of export memory use. This finding could mean that no particular type of export memory use is favoured, since the choice of the type of use is situation specific. Additional results reveal that environmental turbulence and export memory overload have moderating effects on the relationship between export memory use and export performance.
Abstract:
The research is concerned with the measurement of residents' evaluations of the environmental quality of residential areas. The research reflects the increased attention being given to residents' values in planning decisions affecting the residential environment. The work was undertaken in co-operation with a local authority which was in the process of revising its housing strategy, and in particular its priorities for improvement action. The study critically examines the existing evidence on environmental values and their relationship to the environment and points to a number of methodological and conceptual deficiencies. The research strategy developed on the basis of the research review was constrained by the need to keep any survey methods simple, so that they could easily be repeated, when necessary, by the sponsoring authority. A basic perception model was assumed, and a social survey carried out to measure residents' responses to different environmental conditions. The data were only assumed to have ordinal properties, necessitating the extensive use of non-parametric statistics. Residents' expressions of satisfaction with the component elements of the environment (ranging from convenience to upkeep and privacy) were successfully related to 'objective' measures of the environment. However, the survey evidence did not justify the use of the 'objective' variables as environmental standards. A method of using the social survey data directly as an aid to decision-making is discussed. Alternative models of the derivation of overall satisfaction with the environment are tested, and the values implied by the additive model are compared with residents' preferences as measured directly in the survey. Residents' overall satisfaction with the residential environment was most closely related to their satisfaction with the "Appearance" and the "Reputation" of their areas. By contrast, the most important directly measured preference was "Friendliness of area". The differences point to the need to define concepts used in social research clearly in operational terms, and to take care in the use of values 'measured' by different methods.
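As a small illustration of the kind of non-parametric analysis the ordinal assumption calls for, the snippet below relates hypothetical ordinal satisfaction scores to an 'objective' upkeep measure using Spearman's rank correlation, which uses only the rank order of the data; the values are invented.

from scipy.stats import spearmanr

# Hypothetical ordinal satisfaction ratings (1 = very dissatisfied ... 5 = very satisfied)
# paired with an 'objective' upkeep score for the same residential areas.
satisfaction = [2, 3, 3, 4, 5, 1, 4, 5, 2, 3]
upkeep_score = [40, 55, 50, 70, 85, 30, 65, 90, 45, 60]

rho, p_value = spearmanr(satisfaction, upkeep_score)
print(f"Spearman rho = {rho:.2f}, p = {p_value:.3f}")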
Abstract:
BACKGROUND: Gilles de la Tourette syndrome (GTS) is a chronic childhood-onset neuropsychiatric disorder with a significant impact on patients' health-related quality of life (HR-QOL). Cavanna et al. (Neurology 2008; 71: 1410-1416) developed and validated the first disease-specific HR-QOL assessment tool for adults with GTS (Gilles de la Tourette Syndrome-Quality of Life Scale, GTS-QOL). This paper presents the translation, adaptation and validation of the GTS-QOL for young Italian patients with GTS. METHODS: A three-stage process involving 75 patients with GTS recruited through three Departments of Child and Adolescent Neuropsychiatry in Italy led to the development of a 27-item instrument (Gilles de la Tourette Syndrome-Quality of Life Scale in children and adolescents, C&A-GTS-QOL) for the assessment of HR-QOL through a clinician-rated interview for 6-12 year-olds and a self-report questionnaire for 13-18 year-olds. RESULTS: The C&A-GTS-QOL demonstrated satisfactory scaling assumptions and acceptability. Internal consistency reliability was high (Cronbach's alpha > 0.7) and validity was supported by interscale correlations (range 0.4-0.7), principal-component factor analysis and correlations with other rating scales and clinical variables. CONCLUSIONS: The present version of the C&A-GTS-QOL is the first disease-specific HR-QOL tool for Italian young patients with GTS, satisfying criteria for acceptability, reliability and validity. © 2013 - IOS Press and the authors. All rights reserved.
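For reference, the internal consistency statistic reported above, Cronbach's alpha, is defined for a scale of k items as

\alpha = \frac{k}{k-1}\left(1 - \frac{\sum_{i=1}^{k}\sigma^{2}_{Y_i}}{\sigma^{2}_{X}}\right)

where \sigma^{2}_{Y_i} is the variance of item i and \sigma^{2}_{X} is the variance of the total score; values above 0.7 are conventionally taken as acceptable.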
Abstract:
This thesis examines the transition of employees into entrepreneurship, with particular emphasis on the role of workplace characteristics in influencing this movement. The first main chapter examines whether the determinants of becoming an intrapreneur differ from those that support transitions into independent entrepreneurship. The results show that intrapreneurs resemble employees rather than entrepreneurs, contrary to what entrepreneurship theory would suggest. Yet it shows that those intrapreneurs who expect to acquire an ownership stake in the business, unlike the rest, possess traditional entrepreneurial traits. Chapter 3 investigates how workers' degree of specialisation determines their decision to found a firm. It shows that entrepreneurs emerging from small firms, i.e. generalists, transfer knowledge from more diverse aspects of the business and create firms more closely related to the main activity of their last employer. Workers in large firms, however, benefit from higher returns to human capital that increase their opportunity costs of switching to entrepreneurship. Since becoming an entrepreneur would leave part of their specialised skills unutilised, the minimum quality of the idea at which they would be willing to leave will be higher and, therefore, entrepreneurs emerging from large firms will be of the highest quality. Chapter 4 analyses whether the reason for terminating an employment contract is associated with the fact that the majority of entrepreneurs appear to set up their business after having worked for a small firm. Moreover, it studies how this pattern varies as labour market conditions worsen. Layoffs turn out to be a key driver of entry into entrepreneurship, and their effect is found to be greater the smaller the firm from which workers are dismissed. This is reflected in an overall larger flow of employees from small firms into entrepreneurship over the recession.
Abstract:
In an Arab oil-producing country in the Middle East such as Kuwait, the oil industry is considered the country's main and most important industry. Its importance stems from the significant role it plays in both the national economy and the global economy. Moreover, the oil industry's criticality comes from its interconnection with national security and power in the Middle East region. Hence, conducting this research in this crucial industry adds value to companies in the industry, as it thoroughly investigated the main components of the TQM implementation process and identified which components significantly affect TQM implementation and the business results gained from it. In addition, the oil sector is a large sector known for the diversity of national cultures and backgrounds among its employees; this culturally heterogeneous industry therefore seems the most appropriate environment in which to address a need in the literature to investigate the effects of national culture values on the TQM implementation process. Furthermore, this research has developed a new conceptual model of the TQM implementation process in the Kuwaiti oil industry that applies in general to operations and production organizations in the Kuwaiti business environment and specifically to organizations in the oil industry; it also serves as a useful theoretical model for improving operations and production in the oil industry of other developing and developed countries. These findings narrow a gap in the literature, namely the limited amount of empirical research on TQM implementation in well-developed industries in Arab developing countries, and specifically in Kuwait, where there was no coherent national model for universal TQM implementation in the Kuwaiti oil industry in particular or the Kuwaiti business environment in general. Finally, the research framework, which emerged from the literature search, was validated by rigorous quantitative analysis tools including SPSS and structural equation modeling. The quantitative findings from the questionnaires collected were supported by the qualitative findings from the interviews conducted.