18 results for bread-making quality

in Aston University Research Archive


Relevance: 90.00%

Abstract:

Purpose: The aim of this article is to detail the correlation between quality management, specifically its tools and critical success factors, and performance in terms of primary operational and secondary organisational performances. Design/methodology/approach: Survey data from the UK and Turkey were analysed using exploratory factor analysis, structural equation modelling and regression analysis. Findings: The results show that quality management has a significant and positive impact on both primary and secondary performances; that Turkish and UK attitudes to quality management are similar; and that quality management is widely practised in both manufacturing and service industries, although its statistical effects are stronger in the manufacturing sector. The main challenge in making quality management practice more effective lies in an appropriately balanced use of the different tools and critical success factors. Originality/value: This study takes a novel approach by: (i) exploring the relationship between primary operational and secondary organisational performances, (ii) using service and manufacturing data, and (iii) making a cross-country comparison between the UK (a developed economy) and Turkey (a developing economy). Limitations: A detailed contrast is provided between only two countries. © 2013 Taylor and Francis Group, LLC.
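
A minimal sketch of the kind of survey-analysis pipeline described above, assuming Likert-scale items: exploratory factor analysis to extract latent quality-management factors, followed by a regression of an operational-performance measure on the factor scores. The file name, column names and number of factors are hypothetical, and the structural equation modelling step is not shown.

```python
# Illustrative sketch, not the authors' code: EFA on survey items, then a
# regression of performance on the extracted factor scores.
import pandas as pd
import statsmodels.api as sm
from sklearn.decomposition import FactorAnalysis

survey = pd.read_csv("qm_survey.csv")      # hypothetical survey export
items = survey.filter(like="qm_item_")     # quality-management tool/CSF items

# Extract two latent factors (e.g. "tools" and "critical success factors").
fa = FactorAnalysis(n_components=2, random_state=0)
scores = fa.fit_transform(items)
survey["f_tools"], survey["f_csf"] = scores[:, 0], scores[:, 1]

# Regress primary (operational) performance on the factor scores.
X = sm.add_constant(survey[["f_tools", "f_csf"]])
print(sm.OLS(survey["operational_performance"], X).fit().summary())
```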

Relevance: 80.00%

Abstract:

Plantain (Banana-Musa AAB) is a widely grown but commercially underexploited tropical fruit. This study demonstrates the processing of plantain into flour and extends its use and convenience as a constituent of bread, cake and biscuits. Plantain was peeled, dried and milled to produce flour. Proximate analysis was carried out on the flour to determine its food composition. Drying at temperatures below 70°C produced light-coloured plantain flour. Experiments were carried out to determine the mechanism of drying, the heat and mass transfer coefficients, and the effect of air velocity, temperature and cube size on the drying rate of plantain cubes. The drying was diffusion controlled. Pilot-scale drying of plantain cubes in a cabinet dryer showed no significant increase in drying rate above 70°C. In the temperature range found most suitable for plantain drying (i.e. 60 to 70°C), the total drying time was adequately predicted using a modified equation based on Fick's Law, provided the cube temperature was taken to be about 5°C below the actual drying air temperature. Studies of the baking properties of plantain flour revealed that it can be substituted for strong wheat flour at up to 15% for bread making and up to 50% for madeira cake. A shortcake biscuit was produced using 100% plantain flour and test-marketed. Detailed economic studies showed that the production of plantain fruit and its processing into flour would be economically viable in Nigeria when the flour is sold at a wholesale price of N0.65 per kilogram, provided there is a minimum sale of 25% of plantain suckers. A government subsidy would be needed if plantain flour is to compete with imported wheat flour. The broader economic benefits accruing from the processing of plantain fruit into flour and its use in bakery products include employment opportunities, savings in foreign exchange and a stimulus to home agriculture.
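
A minimal sketch of the type of prediction described, assuming the first-term Fickian solution for a cube (the product of three slab solutions) with an Arrhenius-type effective diffusivity; the diffusivity constants are invented for illustration, and the cube temperature is taken about 5°C below the drying-air temperature, as noted above.

```python
# Illustrative sketch, not the thesis' fitted model: estimate total drying time
# for a plantain cube from the first-term solution of Fick's second law.
import math

def drying_time_hours(air_temp_c, half_side_m, target_moisture_ratio):
    """Time for the moisture ratio of a cube to fall to the target value."""
    cube_temp_k = (air_temp_c - 5.0) + 273.15        # cube runs ~5 °C cooler than the air
    # Hypothetical Arrhenius form for the effective diffusivity D (m^2/s).
    d_eff = 1.0e-4 * math.exp(-35_000 / (8.314 * cube_temp_k))
    geom = (8 / math.pi**2) ** 3                     # cube = product of three slab solutions
    # Invert MR = geom * exp(-3 * pi^2 * D * t / (4 * L^2)) for t.
    t_sec = -4 * half_side_m**2 / (3 * math.pi**2 * d_eff) \
            * math.log(target_moisture_ratio / geom)
    return t_sec / 3600.0

# Example: 1 cm cubes (half-side 5 mm) dried at 70 °C to 5% of the initial free moisture.
print(drying_time_hours(air_temp_c=70, half_side_m=0.005, target_moisture_ratio=0.05))
```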

Relevance: 40.00%

Abstract:

Decision-making in product quality is an indispensable stage in product development, undertaken in order to reduce product development risk. Based on the identification of the deficiencies of quality function deployment (QFD) and failure modes and effects analysis (FMEA), a novel decision-making method is presented that draws upon a knowledge network of failure scenarios. An ontological expression of failure scenarios is presented together with a framework for a failure knowledge network (FKN). According to the roles of quality characteristics (QCs) in failure processing, QCs are divided into three categories, namely perceptible QCs, restrictive QCs and controllable QCs, which represent the monitoring, control and improvement targets for quality management, respectively. A mathematical model and algorithms based on the analytic network process (ANP) are introduced for calculating the priority of QCs with respect to different development scenarios. A case study is provided following the proposed FKN-based decision-making procedure. The methodology is applied in the propeller design process to solve the problem of prioritising QCs. This paper provides a practical approach for decision-making in product quality. Copyright © 2011 Inderscience Enterprises Ltd.
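
A minimal sketch of the ANP prioritisation step, a simplification rather than the paper's actual model: the priorities of the quality characteristics are read from the limit of a column-stochastic supermatrix; the influence values below are invented.

```python
# Illustrative sketch: QC priorities as the limit of a column-stochastic supermatrix.
import numpy as np

# Entry [i, j] is the (invented) relative influence of QC i with respect to QC j.
supermatrix = np.array([
    [0.2, 0.5, 0.3],   # perceptible QCs
    [0.5, 0.1, 0.4],   # restrictive QCs
    [0.3, 0.4, 0.3],   # controllable QCs
])

# Raising the matrix to a high power converges to the limit supermatrix,
# whose (identical) columns give the global priorities of the QCs.
limit = np.linalg.matrix_power(supermatrix, 100)
priorities = limit[:, 0]
print(dict(zip(["perceptible", "restrictive", "controllable"], priorities.round(3))))
```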

Relevance: 30.00%

Abstract:

In this paper, a co-operative distributed process mining system (CDPMS) is developed to streamline the workflow along the supply chain in order to offer shorter delivery times, greater flexibility, higher customer satisfaction and learning ability. The proposed system is equipped with a 'distributed process mining' feature, which is used to discover the hidden relationships among working decisions in a distributed manner. This method incorporates the concepts of data mining and knowledge refinement into the decision-making process to ensure 'doing the right things' within the workflow. An example implementation is given, based on the case of a slider manufacturer.
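
A minimal sketch of a basic process-mining step, assumed here for illustration rather than taken from the CDPMS itself: counting the directly-follows relations between workflow activities in an event log; the traces below are invented.

```python
# Illustrative sketch: mine direct-succession relations from an event log.
from collections import Counter

# Each trace is the ordered list of activities observed for one order.
event_log = [
    ["receive_order", "check_stock", "machine_slider", "inspect", "ship"],
    ["receive_order", "check_stock", "reorder_material", "machine_slider", "inspect", "ship"],
    ["receive_order", "check_stock", "machine_slider", "rework", "inspect", "ship"],
]

# Count how often activity a is directly followed by activity b.
directly_follows = Counter()
for trace in event_log:
    for a, b in zip(trace, trace[1:]):
        directly_follows[(a, b)] += 1

# The resulting relation is the starting point for reconstructing the workflow model.
for (a, b), n in sorted(directly_follows.items()):
    print(f"{a} -> {b}: {n}")
```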

Relevance: 30.00%

Abstract:

Problem-structuring group workshops can be used in organizations as a consulting tool and as a research tool. One example of the latter is using a problem-structuring method (PSM) to help a group tackle an organizational issue; meanwhile, researchers collect the participants' initial views, discussion of divergent views, the negotiated agreement, and the reasoning for outcomes emerging. Technology can help by supporting participants in freely sharing their opinions and by logging data for post-workshop analyses. For example, computers let participants share views anonymously and without being influenced by others (as well as logging those views), and video-cameras can record discussions and intra-group dynamics. This paper evaluates whether technology-supported Journey Making workshops can be effective research tools that can capture quality research data when compared against theoretical performance benchmarks and other qualitative research tools. © 2006 Operational Research Society Ltd. All rights reserved.

Relevance: 30.00%

Abstract:

This study integrates research on minority dissent and individual creativity, as well as team diversity and the quality of group decision making, with research on team participation in decision making. From these lines of research, it was proposed that minority dissent would predict innovation in teams but only when teams have high levels of participation in decision making. This hypothesis was tested in 2 studies, 1 involving a homogeneous sample of self-managed teams and 1 involving a heterogeneous sample of cross-functional teams. Study 1 suggested that a newly developed scale to measure minority dissent has discriminant validity. Both Study 1 and Study 2 showed more innovations under high rather than low levels of minority dissent but only when there was a high degree of participation in team decision making. It is concluded that minority dissent stimulates creativity and divergent thought, which, through participation, manifest as innovation.
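
A minimal sketch of how such a moderation hypothesis is commonly tested, with assumed variable names and data file rather than the authors' own analysis: a regression of team innovation on minority dissent, participation and their interaction.

```python
# Illustrative sketch: test the dissent x participation interaction on innovation.
import pandas as pd
import statsmodels.formula.api as smf

teams = pd.read_csv("teams.csv")   # hypothetical file: one row per team
model = smf.ols("innovation ~ minority_dissent * participation", data=teams).fit()
print(model.summary())             # a positive, significant interaction term supports the hypothesis
```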

Relevance: 30.00%

Abstract:

This sustained longitudinal study, carried out in a single local authority, investigates the implementation of a Total Quality Management (TQM) philosophy in professional local government services. At the start of this research, a large majority of what had been written about TQM was polemical and based on limited empirical evidence. This thesis seeks to provide a significant and important piece of work, making a considerable contribution to the current state of knowledge in this area. Teams from four professional services within a single local authority participated in this research, providing the main evidence on how the quality management agenda in a local authority can be successfully implemented. To supplement this rich source of data, various other sources and methods of data collection were used: 1) interviews were carried out with senior managers from within the authority; 2) customer focus groups and questionnaires were used; 3) interviews were carried out with other organisations, all of which were proponents of a TQM philosophy. A number of tools were developed to assist in gathering data: 1) a critical success factors (CSFs) benchmarking tool; 2) a Five Stages of Quality Improvement Model. A Best Practice Quality Improvement Model, arising from an analysis of the literature and the researcher's own experience, is proposed and tested. From the results a number of significant conclusions have been drawn relating to: 1) triggers for change; 2) resistance of local government professionals to change; 3) critical success factors and barriers to quality improvement in professional local government services; 4) the problems associated with participant observation and other methodological issues.

Relevance: 30.00%

Abstract:

Exporting is one of the main ways in which organizations internationalize. As the export environment becomes more turbulent, heterogeneous, sophisticated and less familiar, the organizational learning ability of the exporting organization may become its only source of sustainable competitive advantage. However, achieving a competitive level of learning is not easy. Companies must be able to find ways to improve their learning capability by enhancing the different aspects of the learning process. One of these is export memory. Building on an export information processing framework, this research focuses in particular on the quality of export memory, its determinants, its subsequent use in decision-making, and its ultimate relationship with export performance. Within export memory use, four dimensions have been discovered: instrumental, conceptual, legitimizing and manipulating. Results from the study, based on data from a mail survey with 354 responses, reveal that the development of export memory quality is positively related to the quality of export information acquisition, the quality of export information interpretation, export coordination, and integration of the information into the organizational system. Several company and environmental factors have also been examined in terms of their relationship with export memory use. The two factors found to be significantly related to the extent of export memory use are the quality of export information acquisition and export memory quality. The results reveal that export memory quality is positively related to the extent of export memory use, which in turn is positively related to export performance. Furthermore, the results show that only one aspect of export memory use significantly affects export performance: the extent of export memory use. This finding could mean that no particular type of export memory use is favored, since the choice of the type of use is situation specific. Additional results reveal that environmental turbulence and export memory overload have moderating effects on the relationship between export memory use and export performance.

Relevance: 30.00%

Abstract:

This research was conducted at the Space Research and Technology Centre of the European Space Agency at Noordwijk in the Netherlands. ESA is an international organisation that brings together a range of scientists, engineers and managers from 14 European member states. The motivation for the work was to enable decision-makers, in a culturally and technologically diverse organisation, to share information for the purpose of making decisions that are well informed about the risk-related aspects of the situations they seek to address. The research examined the use of decision support system (DSS) technology to facilitate decision-making of this type. This involved identifying the technology available and its application to risk management. Decision-making is a complex activity that does not lend itself to exact measurement or precise understanding at a detailed level. In view of this, a prototype DSS was developed through which to understand the practical issues to be accommodated and to evaluate alternative approaches to supporting decision-making of this type. The problem of measuring the effect upon the quality of decisions has been approached through expert evaluation of the software developed. The practical orientation of this work was informed by a review of the relevant literature in decision-making, risk management, decision support and information technology. Communication and information technology unite the major themes of this work. This allows the interests of the research to be correlated with European public policy. The principles of communication were also considered in the topic of information visualisation: this emerging technology exploits flexible modes of human-computer interaction (HCI) to improve the cognition of complex data. Risk management is itself an area characterised by complexity, and risk visualisation is advocated for application in this field of endeavour. The thesis provides recommendations for future work in the fields of decision-making, DSS technology and risk management.

Relevance: 30.00%

Abstract:

The research is concerned with the measurement of residents' evaluations of the environmental quality of residential areas. It reflects the increased attention being given to residents' values in planning decisions affecting the residential environment. The work was undertaken in co-operation with a local authority which was in the process of revising its housing strategy, and in particular its priorities for improvement action. The study critically examines the existing evidence on environmental values and their relationship to the environment, and points to a number of methodological and conceptual deficiencies. The research strategy developed on the basis of this review was constrained by the need to keep any survey methods simple, so that they could easily be repeated, when necessary, by the sponsoring authority. A basic perception model was assumed, and a social survey carried out to measure residents' responses to different environmental conditions. The data were only assumed to have ordinal properties, necessitating the extensive use of non-parametric statistics. Residents' expressions of satisfaction with the component elements of the environment (ranging from convenience to upkeep and privacy) were successfully related to 'objective' measures of the environment. However, the survey evidence did not justify the use of the 'objective' variables as environmental standards. A method of using the social survey data directly as an aid to decision-making is discussed. Alternative models of the derivation of overall satisfaction with the environment are tested, and the values implied by the additive model are compared with residents' preferences as measured directly in the survey. Residents' overall satisfaction with the residential environment was most closely related to their satisfaction with the "Appearance" and the "Reputation" of their areas. By contrast, the most important directly measured preference was "Friendliness of area". The differences point to the need to define concepts used in social research clearly in operational terms, and to take care in the use of values 'measured' by different methods.
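
A minimal sketch of the kind of ordinal, non-parametric analysis described, with hypothetical file and column names: rank correlations between component satisfactions and overall satisfaction, plus a simple unweighted additive score for comparison.

```python
# Illustrative sketch: relate component satisfactions to overall satisfaction
# using rank (ordinal) statistics.
import pandas as pd
from scipy.stats import spearmanr

survey = pd.read_csv("residents_survey.csv")   # hypothetical survey export
components = ["appearance", "reputation", "upkeep", "privacy", "convenience"]

# Rank correlation of each component satisfaction with overall satisfaction.
for col in components:
    rho, p = spearmanr(survey[col], survey["overall_satisfaction"])
    print(f"{col}: rho={rho:.2f}, p={p:.3f}")

# Simple additive model: unweighted sum of the component scores.
survey["additive_score"] = survey[components].sum(axis=1)
print(spearmanr(survey["additive_score"], survey["overall_satisfaction"]))
```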

Relevance: 30.00%

Abstract:

Three novel solar thermal collector concepts derived from the Linear Fresnel Reflector (LFR) are developed and evaluated through a multi-criteria decision-making methodology, comprising the following techniques: Quality Function Deployment (QFD), the Analytical Hierarchy Process (AHP) and the Pugh selection matrix. Criteria are specified by technical and customer requirements gathered from Gujarat, India. The concepts are compared to a standard LFR for reference, and as a result, a novel 'Elevation Linear Fresnel Reflector' (ELFR) concept using elevating mirrors is selected. A detailed version of this concept is proposed and compared against two standard LFR configurations, one using constant and the other using variable horizontal mirror spacing. Annual performance is analysed for a typical meteorological year. Financial assessment is made through the construction of a prototype. The novel LFR has an annual optical efficiency of 49% and increases exergy by 13-23%. Operational hours above a target temperature of 300 °C are increased by 9-24%. A 17% reduction in land usage is also achievable. However, the ELFR suffers from additional complexity and a 16-28% increase in capital cost. It is concluded that this novel design is particularly promising for industrial applications and locations with restricted land availability or high land costs. The decision analysis methodology adopted is considered to have a wider potential for applications in the fields of renewable energy and sustainable design. © 2013 Elsevier Ltd. All rights reserved.
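
A minimal sketch of one step of the multi-criteria methodology above, the AHP weighting of selection criteria: the weight vector is the normalised principal eigenvector of a reciprocal pairwise comparison matrix, with a consistency check. The criteria and judgements below are invented, not those elicited in Gujarat.

```python
# Illustrative sketch: AHP criteria weights from a pairwise comparison matrix.
import numpy as np

criteria = ["cost", "optical_efficiency", "land_use", "operating_temperature"]
# Reciprocal comparison matrix on Saaty's 1-9 scale (invented judgements).
A = np.array([
    [1,   3,   5,   3],
    [1/3, 1,   3,   1],
    [1/5, 1/3, 1,   1/3],
    [1/3, 1,   3,   1],
])

eigvals, eigvecs = np.linalg.eig(A)
principal = np.abs(eigvecs[:, np.argmax(eigvals.real)].real)
weights = principal / principal.sum()
print(dict(zip(criteria, weights.round(3))))

# Consistency ratio (CR < 0.1 is conventionally acceptable); 0.90 is Saaty's
# random index for a 4x4 matrix.
n = len(criteria)
ci = (eigvals.real.max() - n) / (n - 1)
print("CR =", round(ci / 0.90, 3))
```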

Relevance: 30.00%

Abstract:

Indicators which summarise the characteristics of spatiotemporal data coverages significantly simplify quality evaluation, decision-making and justification processes by providing a number of quality cues that are easy to manage, and by avoiding information overflow. Criteria which are commonly prioritised in evaluating spatial data quality and assessing a dataset’s fitness for use include lineage, completeness, logical consistency, positional accuracy, and temporal and attribute accuracy. However, user requirements may go far beyond these broadly accepted spatial quality metrics, to incorporate specific and complex factors which are less easily measured. This paper discusses the results of a study of high-level user requirements in geospatial data selection and data quality evaluation. It reports on the geospatial data quality indicators which were identified as user priorities, and which can potentially be standardised to enable intercomparison of datasets against user requirements. We briefly describe the implications for tools and standards to support the communication and intercomparison of data quality, and the ways in which these can contribute to the generation of a GEO label.

Relevance: 30.00%

Abstract:

Extending the growing interest in affect in work groups, we propose that groups with distributed information make higher quality decisions when they are in a negative rather than a positive mood, but that these effects are moderated by group members' trait negative affect. In support of this hypothesis, an experiment (N = 175 groups) showed that positive mood led to lower quality decisions than did negative or neutral moods when group members were low in trait negative affect, whereas such mood effects were not observed in groups higher in trait negative affect. Mediational analysis based on behavioral observations of group process confirmed that group information elaboration mediated this effect. These results provide an important caveat on the benefits of positive moods in work groups, and suggest that the study of trait × state affect interactions is an important avenue for future research.
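
A minimal sketch of a regression-based mediation check of the kind reported, with assumed variable names and data rather than the authors' materials: testing whether group information elaboration carries the effect of mood on decision quality.

```python
# Illustrative sketch: simple regression-based mediation check.
import pandas as pd
import statsmodels.formula.api as smf

groups = pd.read_csv("groups.csv")   # hypothetical file: one row per group

total  = smf.ols("decision_quality ~ negative_mood", data=groups).fit()
a_path = smf.ols("information_elaboration ~ negative_mood", data=groups).fit()
direct = smf.ols("decision_quality ~ negative_mood + information_elaboration",
                 data=groups).fit()

# Mediation is suggested when the indirect effect (a * b) is substantial and the
# direct effect of mood shrinks once elaboration is included.
indirect = a_path.params["negative_mood"] * direct.params["information_elaboration"]
print("total effect:", total.params["negative_mood"])
print("direct effect:", direct.params["negative_mood"])
print("indirect effect:", indirect)
```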

Relevance: 30.00%

Abstract:

Guest editorial

Ali Emrouznejad is a Senior Lecturer at Aston Business School in Birmingham, UK. His areas of research interest include performance measurement and management, efficiency and productivity analysis, and data mining. He has published widely in various international journals. He is an Associate Editor of the IMA Journal of Management Mathematics and Guest Editor of several special issues of journals including the Journal of the Operational Research Society, Annals of Operations Research, Journal of Medical Systems, and the International Journal of Energy Sector Management. He is on the editorial board of several international journals and a co-founder of Performance Improvement Management Software.

William Ho is a Senior Lecturer at Aston Business School, Aston University. Before joining Aston in 2005, he worked as a Research Associate in the Department of Industrial and Systems Engineering at the Hong Kong Polytechnic University. His research interests include supply chain management, production and operations management, and operations research. He has published extensively in international journals such as Computers & Operations Research, Engineering Applications of Artificial Intelligence, European Journal of Operational Research, Expert Systems with Applications, International Journal of Production Economics, International Journal of Production Research, and Supply Chain Management: An International Journal. His first authored book was published in 2006. He is an Editorial Board member of the International Journal of Advanced Manufacturing Technology and an Associate Editor of the OR Insight Journal. Currently, he is a Scholar of the Advanced Institute of Management Research.

Uses of frontier efficiency methodologies and multi-criteria decision making for performance measurement in the energy sector

This special issue focuses on holistic, applied research on performance measurement in energy sector management, with the aim of bridging the gap between industry and academia. After a rigorous refereeing process, seven papers were included. The volume opens with five data envelopment analysis (DEA)-based papers.

Wu et al. apply the DEA-based Malmquist index to evaluate changes in the relative efficiency and total factor productivity of coal-fired electricity generation in 30 Chinese administrative regions from 1999 to 2007. Factors considered in the model include fuel consumption, labor, capital, sulphur dioxide emissions, and electricity generated. The authors find that the eastern provinces were relatively and technically more efficient, whereas the western provinces had the highest growth rates over the period studied.

Ioannis E. Tsolas applies the DEA approach to assess the performance of Greek fossil-fuel-fired power stations, taking undesirable outputs such as carbon dioxide and sulphur dioxide emissions into consideration. In addition, a bootstrapping approach is deployed to address the uncertainty surrounding the DEA point estimates, providing bias-corrected estimates and confidence intervals. The author finds that, within the sample, the non-lignite-fired stations are on average more efficient than the lignite-fired stations.

Maethee Mekaroonreung and Andrew L. Johnson compare the relative performance of three DEA-based measures, which estimate production frontiers and evaluate the relative efficiency of 113 US petroleum refineries while considering undesirable outputs.
Three inputs (capital, energy consumption, and crude oil consumption), two desirable outputs (gasoline and distillate generation), and an undesirable output (toxic release) are considered in the DEA models. The authors find that refineries in the Rocky Mountain region performed best, and that about 60 per cent of the oil refineries in the sample could improve their efficiency further.

H. Omrani, A. Azadeh, S. F. Ghaderi, and S. Abdollahzadeh present an integrated approach combining DEA, corrected ordinary least squares (COLS), and principal component analysis (PCA) to calculate the relative efficiency scores of 26 Iranian electricity distribution units from 2003 to 2006. Specifically, both DEA and COLS are used to check three internal consistency conditions, whereas PCA is used to verify and validate the final ranking results of either DEA (consistency) or DEA-COLS (non-consistency). Three inputs (network length, transformer capacity, and number of employees) and two outputs (number of customers and total electricity sales) are considered in the model.

Virendra Ajodhia applies three DEA-based models to evaluate the relative performance of 20 electricity distribution firms from the UK and the Netherlands. The first is a traditional DEA model for analysing cost-only efficiency. The second includes (inverse) quality by modelling total customer minutes lost as an input. The third is based on the idea of using total social costs, comprising the firm's private costs and the interruption costs incurred by consumers, as an input. Both energy delivered and the number of consumers are treated as outputs in the models.

After the five DEA papers, Stelios Grafakos, Alexandros Flamos, Vlasis Oikonomou, and D. Zevgolis present a multiple-criteria weighting approach to evaluate energy and climate policy. The proposed approach is akin to the analytic hierarchy process and consists of pairwise comparisons, consistency verification, and criteria prioritisation. Stakeholders and experts in the energy policy field are incorporated into the evaluation process through an interactive means of expressing their preferences verbally, numerically, and visually. A total of 14 evaluation criteria were considered and classified under four objectives: climate change mitigation, energy effectiveness, socioeconomic, and competitiveness and technology.

Finally, Borge Hess applies the stochastic frontier analysis approach to analyse the impact of various business strategies, including acquisitions, holding structures, and joint ventures, on firm efficiency within a sample of 47 natural gas transmission pipelines in the USA from 1996 to 2005. The author finds no significant change in firm efficiency following an acquisition, and only weak evidence of efficiency improvements attributable to the new shareholder. The author also finds that parent companies do not appear to influence a subsidiary's efficiency positively. In addition, the analysis shows a negative impact of joint ventures on the technical efficiency of the pipeline companies.

To conclude, we are grateful to all the authors for their contributions, and to all the reviewers for their constructive comments, which made this special issue possible. We hope that this issue will contribute significantly to performance improvement in the energy sector.
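
A minimal sketch of the basic building block used throughout the DEA papers above, an input-oriented CCR model solved as a linear programme per decision-making unit (DMU); the data are an invented toy example, not any of the special-issue datasets.

```python
# Illustrative sketch: input-oriented CCR DEA efficiency scores via linear programming.
import numpy as np
from scipy.optimize import linprog

# Rows = DMUs; columns = inputs / outputs (values are invented).
X = np.array([[20.0, 300], [30.0, 200], [40.0, 100], [20.0, 200]])   # inputs
Y = np.array([[100.0], [80.0], [60.0], [90.0]])                      # outputs

n_dmu, n_in = X.shape
n_out = Y.shape[1]

def ccr_efficiency(o):
    """Efficiency of DMU o: min theta s.t. X'lam <= theta*x_o, Y'lam >= y_o, lam >= 0."""
    c = np.r_[1.0, np.zeros(n_dmu)]                  # minimise theta
    # Input rows:  sum_j lam_j * x_ij - theta * x_io <= 0
    A_in = np.hstack([-X[o].reshape(-1, 1), X.T])
    # Output rows: -sum_j lam_j * y_rj <= -y_ro
    A_out = np.hstack([np.zeros((n_out, 1)), -Y.T])
    A_ub = np.vstack([A_in, A_out])
    b_ub = np.r_[np.zeros(n_in), -Y[o]]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * (1 + n_dmu))
    return res.fun

for o in range(n_dmu):
    print(f"DMU {o}: efficiency = {ccr_efficiency(o):.3f}")
```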

Relevance: 30.00%

Abstract:

Purpose – The purpose of this paper is to investigate the effectiveness of quality management training by reviewing commonly used critical success factors and tools, rather than the overall methodological approach. Design/methodology/approach – The methodology used a web-based questionnaire consisting of 238 questions covering 77 tools and 30 critical success factors selected from leading academic and practitioner sources. The survey received 79 usable responses and the data were analysed using relevant statistical quality management tools. The results were validated in a series of structured workshops with quality management experts. Findings – The findings show that most of the critical success factor statements for quality management are generally agreed with, although not all are implemented well. They also show that many quality tools are not well known or understood, and that training has an important role in raising awareness of them and making sure they are used correctly. Research limitations/implications – Generalisations are limited by the UK-centric nature of the sample. Practical implications – The practical implications are discussed for organisations implementing quality management initiatives, training organisations revising their quality management syllabi, and academic institutions teaching quality management. Originality/value – Most recent surveys have been aimed at the methodology level (i.e. “lean”, “Six Sigma”, “total quality management” etc.); this research proposes that such an approach has limited value, as many of the tools and critical success factors are common to most of the methodologies. Uniquely, therefore, this research focuses on the tools and critical success factors themselves. Additionally, other recent comparable surveys have been less comprehensive and have not focused on training issues.