83 results for 091006 Manufacturing Processes and Technologies (excl. Textiles)
in Aston University Research Archive
Abstract:
Purpose: This paper aims to explore practices and technologies successfully servitised manufacturers employ in the delivery of advanced services. Design/methodology/approach: A case study methodology is applied across four manufacturing organisations successful in servitization. Through interviews with personnel across host manufacturers, their partners, and key customers, extensive data are collected about service delivery systems. Analyses identify convergence in their practices and technologies. Findings: Six distinct technologies and practices are revealed: facilities and their location, micro-vertical integration and supplier relationships, information and communication technologies (ICTs), performance measurement and value demonstration, people deployment and their skills, and business processes and customer relationships. These are then combined in an integrative framework that illustrates how operations are configured to successfully deliver advanced services. Research limitations/implications: The analyses are reductive and rationalising. Future studies could identify other technologies and practices. Case study as a method is inherently limited in the extent to which findings can be generalised. Practical implications: Awareness and interest in servitization is growing, yet adoption of a servitization strategy requires particular organisational capabilities on the part of the manufacturer. This study identifies technologies and practices that underpin these capabilities. Originality/value: This paper contributes to the understanding of the servitization process and, in particular, the implications to broader operations of the firm. © Emerald Group Publishing Limited.
Abstract:
This article categorises manufacturing strategy design processes and presents the characteristics of the resulting strategies, thereby helping practitioners to appreciate the implications of their planning activities. The article presents a framework for classifying manufacturing strategy processes and the resulting strategies, and each process and its respective strategy is then considered in detail. Within this consideration, the preferred approach for formulating a world-class manufacturing strategy is presented. Finally, conclusions and recommendations for further work are given.
Abstract:
Reliability modelling and verification is indispensable in modern manufacturing, especially for product development risk reduction. Based on the discussion of the deficiencies of traditional reliability modelling methods for process reliability, a novel modelling method is presented herein that draws upon a knowledge network of process scenarios based on the analytic network process (ANP). An integration framework of manufacturing process reliability and product quality is presented together with a product development and reliability verification process. According to the roles of key characteristics (KCs) in manufacturing processes, KCs are organised into four clusters, that is, product KCs, material KCs, operation KCs and equipment KCs, which represent the process knowledge network of manufacturing processes. A mathematical model and algorithm is developed for calculating the reliability requirements of KCs with respect to different manufacturing process scenarios. A case study on valve-sleeve component manufacturing is provided as an application example of the new reliability modelling and verification procedure. This methodology is applied in the valve-sleeve component manufacturing processes to manage and deploy production resources.
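The ANP calculation at the heart of this approach can be illustrated with a small sketch: a column-stochastic supermatrix of influence weights among the four KC clusters is raised to successive powers until it converges, and the limiting priorities are then used to apportion an overall process reliability target. The weights below are hypothetical, and the allocation rule is illustrative rather than the paper's own model.

```python
import numpy as np

# Hypothetical influence weights among the four KC clusters
# (product, material, operation, equipment); each column sums to 1,
# i.e. a column-stochastic ANP supermatrix.
W = np.array([
    [0.00, 0.40, 0.30, 0.30],
    [0.40, 0.00, 0.30, 0.20],
    [0.35, 0.30, 0.00, 0.50],
    [0.25, 0.30, 0.40, 0.00],
])

def limit_priorities(W, tol=1e-10, max_iter=1000):
    """Raise the supermatrix to successive powers until its
    columns converge to the limiting priority vector."""
    M = W.copy()
    for _ in range(max_iter):
        M_next = M @ W
        if np.max(np.abs(M_next - M)) < tol:
            break
        M = M_next
    return M[:, 0]  # every column of the limit matrix is identical

w = limit_priorities(W)

# Illustrative allocation: split an overall reliability target across
# clusters in proportion to their ANP priorities, so the per-cluster
# reliabilities multiply back to the overall target.
R_target = 0.99
R_alloc = R_target ** w
```

Because the priorities sum to one, the product of the allocated cluster reliabilities recovers the overall target, which is the sanity check one would apply before deploying such an allocation.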
Abstract:
The rapid spread of the Internet and increasing service demands are driving radical changes in the structure and management of the underlying telecommunications systems. Active networks (ANs) offer the ability to program the network on a per-router, per-user, or even per-packet basis, and thus promise greater flexibility than current networks. For this new network paradigm to be widely accepted, many issues need to be solved, and management of the active network is one of the key challenges. This thesis investigates an adaptive management solution based on the genetic algorithm (GA). The solution uses a distributed, bacterium-inspired GA running on the active nodes within an active network to provide adaptive management for the network, particularly for the service-provision problems associated with future networks. The thesis also reviews the concepts, theories and technologies associated with the management solution. By exploring the implementation of these active nodes in hardware, the thesis demonstrates the feasibility of implementing GA-based adaptive management in the real networks in use today. The concurrent programming language Handel-C is used to describe the design, and a reconfigurable computing platform based on an FPGA processing element is used for the hardware implementation. The experimental results demonstrate both the viability of the hardware implementation and the efficiency of the proposed management solution.
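The thesis's distributed, bacterium-inspired GA is not reproduced here, but the basic loop it relies on — selection, crossover and mutation over candidate configurations — can be sketched on a toy service-placement problem. The demand figures, replica cost and fitness function below are all invented for illustration.

```python
import random

random.seed(42)

# Toy stand-in for a service-provision problem: choose which of the
# N active nodes should host a service replica so that covered demand
# is maximised while the number of replicas is penalised.
DEMAND = [5, 1, 8, 2, 9, 3, 7, 4]   # hypothetical per-node demand
COST_PER_REPLICA = 4

def fitness(genome):
    covered = sum(d for d, g in zip(DEMAND, genome) if g)
    return covered - COST_PER_REPLICA * sum(genome)

def evolve(pop_size=30, generations=60, p_mut=0.05):
    n = len(DEMAND)
    pop = [[random.randint(0, 1) for _ in range(n)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]          # truncation selection (elitist)
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, n)        # one-point crossover
            child = a[:cut] + b[cut:]
            child = [1 - g if random.random() < p_mut else g
                     for g in child]            # bit-flip mutation
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

best = evolve()
```

In a distributed variant, each active node would hold part of the population and exchange migrants with its neighbours; the loop above is the single-population core of that idea.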
Abstract:
In this study some common types of rolling-bearing vibration are analysed in depth, both theoretically and experimentally. The study is restricted to vibrations in the radial direction of bearings carrying a pure radial load and having a positive radial clearance. The general vibrational behaviour of such bearings has been investigated with respect to the effects of varying compliance, manufacturing tolerances and the interaction between the bearing and the machine structure into which it is fitted. The equations of motion for a rotor supported by a bearing whose stiffness varies with cage position have been set up, and examples of solutions obtained by digital simulation are given. A method has been developed to calculate the amplitudes and frequencies of vibration components due to out-of-roundness of the inner ring and varying roller diameters. The results of these investigations have been combined with a theory for bearing/machine-frame interaction using mechanical impedance techniques, thereby facilitating prediction of the vibrational behaviour of the whole setup. Finally, the effects of bearing fatigue and wear have been studied, with particular emphasis on the use of vibration analysis for condition-monitoring purposes. A number of monitoring methods have been tried and their effectiveness discussed. The experimental investigation was carried out using two purpose-built rigs. For the analysis of the experimental measurements, a digital minicomputer was adapted for signal processing and a suite of programs was written. The program package performs several of the commonly used signal-analysis processes and includes all the necessary input and output functions.
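The vibration components discussed above occur at kinematic frequencies that follow directly from the bearing geometry. The sketch below computes the standard defect frequencies; for a stationary outer ring the varying-compliance frequency coincides with the outer-race ball-pass frequency. The bearing parameters are hypothetical, not taken from the thesis's test rigs.

```python
import math

def bearing_frequencies(fr, n, d, D, phi_deg=0.0):
    """Classical kinematic defect frequencies for a rolling bearing
    with a stationary outer ring.
    fr: shaft speed (Hz), n: number of rolling elements,
    d: roller diameter, D: pitch diameter, phi_deg: contact angle."""
    r = (d / D) * math.cos(math.radians(phi_deg))
    ftf = fr / 2 * (1 - r)                 # cage (fundamental train) frequency
    bpfo = n * fr / 2 * (1 - r)            # ball-pass frequency, outer race
    bpfi = n * fr / 2 * (1 + r)            # ball-pass frequency, inner race
    bsf = D / (2 * d) * fr * (1 - r ** 2)  # roller spin frequency
    return {"FTF": ftf, "BPFO": bpfo, "BPFI": bpfi, "BSF": bsf,
            "varying_compliance": bpfo}    # VC frequency = n * FTF = BPFO

freqs = bearing_frequencies(fr=25.0, n=9, d=7.9, D=38.5)
```

A useful consistency check is that BPFO and BPFI always sum to n times the shaft speed, whatever the geometry.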
Abstract:
This paper is based on a major research project run by a team from the Innovation, Design and Operations Management Research Unit at the Aston Business School under SERC funding. International Computers Limited (ICL), the UK's largest indigenous manufacturer of mainframe computer products, was the main industrial collaborator in the research. During the period 1985-89 an integrated production system termed the "Modular Assembly Cascade" was introduced at the Company's mainframe assembly plant at Ashton-under-Lyne, near Manchester. Using a methodology based primarily on participative observation, the researchers developed a model, called "DRAMA", for analysing this manufacturing system design. Following a critique of the existing literature on manufacturing strategy, this paper describes the basic DRAMA model and its development from an industry-specific design methodology into DRAMA II, a generic model for studying organizational decision processes in the design and implementation of production systems. From this, the potential contribution of the DRAMA model to the existing knowledge on the process of manufacturing system design will be apparent.
Abstract:
In this chapter, we first elaborate on the well-known relationship between Gaussian processes (GP) and Support Vector Machines (SVM). We then present approximate solutions to two computational problems arising in GP and SVM. The first is the calculation of the posterior mean for GP classifiers using a 'naive' mean-field approach. The second is a leave-one-out estimator of the generalization error of SVMs, based on a linear-response method. Simulation results on a benchmark dataset show similar performance for the GP mean-field algorithm and the SVM algorithm. The approximate leave-one-out estimator is found to be in very good agreement with the exact leave-one-out error.
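The linear-response leave-one-out estimator itself is not reproduced here, but the underlying idea — obtaining leave-one-out errors without retraining — can be illustrated in the simpler GP regression setting, where an exact closed form exists: y_i - mu_{-i}(x_i) = (K^-1 y)_i / (K^-1)_ii. The toy data below are a hypothetical stand-in for the benchmark dataset.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy 1-D regression data.
X = rng.uniform(-3, 3, size=(25, 1))
y = np.sin(X[:, 0]) + 0.1 * rng.standard_normal(25)

def kernel(A, B, ell=1.0):
    """Squared-exponential covariance."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / ell ** 2)

noise = 0.1 ** 2
K = kernel(X, X) + noise * np.eye(len(X))
Kinv = np.linalg.inv(K)

# Closed-form leave-one-out residuals: one matrix inverse, no retraining.
loo_residuals = (Kinv @ y) / np.diag(Kinv)

# Brute-force check: actually retrain with each point held out.
brute = []
for i in range(len(X)):
    keep = np.arange(len(X)) != i
    Ki = kernel(X[keep], X[keep]) + noise * np.eye(int(keep.sum()))
    ks = kernel(X[~keep], X[keep])
    mu = ks @ np.linalg.solve(Ki, y[keep])
    brute.append(y[i] - mu[0])

max_gap = float(np.max(np.abs(loo_residuals - np.array(brute))))
```

The closed form and the brute-force retraining agree to numerical precision, which is the exact analogue of the agreement the chapter reports for its approximate SVM estimator.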
Abstract:
There is controversy over whether integrated manufacturing (IM), comprising advanced manufacturing technology (AMT), just-in-time inventory control and total quality management, empowers or deskills shop floor work. Moreover, both IM and empowerment are promoted on the assumption that they enhance competitiveness. We examine these issues in a study of 80 manufacturing companies. The extent of use of IM was positively associated with empowerment (i.e., job enrichment and employee skill enhancement), but, with the minor exception of AMT, bore little relationship with subsequent company performance. In contrast, the extent of empowerment within companies predicted the subsequent level of company performance controlling for prior performance, with the effect on productivity mediating that on profit. Copyright © 2004 John Wiley & Sons, Ltd.
Abstract:
The present global economic crisis creates doubts about the good use of accumulated experience and knowledge in managing risk in financial services. Typically, risk management practice does not use knowledge management (KM) to improve and to develop new answers to the threats. A key reason is that it is not clear how to break down the “organizational silos” view of risk management (RM) that is commonly taken. As a result, there has been relatively little work on finding the relationships between RM and KM. We have been doing research for the last couple of years on the identification of relationships between these two disciplines. At ECKM 2007 we presented a general review of the literature(s) and some hypotheses for starting research on KM and its relationship to the perceived value of enterprise risk management. This article presents findings based on our preliminary analyses, concentrating on those factors affecting the perceived quality of risk knowledge sharing. These come from a questionnaire survey of RM employees in organisations in the financial services sector, which yielded 121 responses. We have included five explanatory variables for the perceived quality of risk knowledge sharing. These comprised two variables relating to people (organizational capacity for work coordination and perceived quality of communication among groups), one relating to process (perceived quality of risk control) and two related to technology (web channel functionality and RM information system functionality). Our findings so far are that four of these five variables have a significant positive association with the perceived quality of risk knowledge sharing: contrary to expectations, web channel functionality did not have a significant association. Indeed, in some of our exploratory regression studies its coefficient (although not significant) was negative. 
In stepwise regression, the variable organizational capacity for work coordination accounted for by far the largest part of the variation in the dependent variable perceived quality of risk knowledge sharing. The “people” variables thus appear to have the greatest influence on the perceived quality of risk knowledge sharing, even in a sector that relies heavily on technology and on quantitative approaches to decision making. We have also found similar results with the dependent variable perceived value of Enterprise Risk Management (ERM) implementation.
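The stepwise procedure described above can be sketched as a forward-selection loop: at each step, add the explanatory variable that most increases R². The synthetic data below merely mimic the survey's structure (n = 121, five predictors); the coefficients are invented, chosen so that work coordination dominates, as in the reported findings.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 121  # same sample size as the questionnaire survey

# Synthetic stand-ins for the five explanatory variables.
names = ["work_coordination", "communication_quality",
         "risk_control", "web_channel", "rm_system"]
X = rng.standard_normal((n, 5))
y = (1.5 * X[:, 0] + 0.6 * X[:, 1] + 0.5 * X[:, 2]
     + 0.4 * X[:, 4] + rng.standard_normal(n))

def r_squared(Xs, y):
    """R^2 of an OLS fit with intercept."""
    Xd = np.column_stack([np.ones(len(y)), Xs])
    beta, *_ = np.linalg.lstsq(Xd, y, rcond=None)
    resid = y - Xd @ beta
    return 1 - resid @ resid / ((y - y.mean()) @ (y - y.mean()))

selected, order = [], []
for _ in range(5):
    best = max((j for j in range(5) if j not in selected),
               key=lambda j: r_squared(X[:, selected + [j]], y))
    selected.append(best)
    order.append((names[best], round(float(r_squared(X[:, selected], y)), 3)))
```

The `order` list shows how much each added variable lifts R²; the dominant first entry mirrors the paper's finding that work coordination explains by far the largest share of variation.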
Abstract:
Knowledge has been a subject of interest and inquiry for thousands of years, since at least the time of the ancient Greeks, and no doubt even before that. "What is knowledge" continues to be an important topic of discussion in philosophy. More recently, interest in managing knowledge has grown in step with the perception that increasingly we live in a knowledge-based economy. Drucker (1969) is usually credited as being the first to popularize the knowledge-based economy concept, by linking the importance of knowledge with rapid technological change. Karl Wiig coined the term knowledge management (hereafter KM) for a NATO seminar in 1986, and its popularity took off following the publication of Nonaka and Takeuchi's book "The Knowledge Creating Company" (Nonaka & Takeuchi, 1995). Knowledge creation is in fact just one of many activities involved in KM. Others include sharing, retaining, refining, and using knowledge. There are many such lists of activities (Holsapple & Joshi, 2000; Probst, Raub, & Romhardt, 1999; Skyrme, 1999; Wiig, De Hoog, & Van der Spek, 1997). Both academic and practical interest in KM has continued to increase throughout the last decade. In this article, first the different types of knowledge are outlined, then comes a discussion of various routes by which knowledge management can be implemented, advocating a process-based route. An explanation follows of how people, processes, and technology need to fit together for effective KM, and some examples of this route in use are given. Finally, there is a look towards the future.
Abstract:
The assessment of the reliability of systems which learn from data is a key issue to investigate thoroughly before the actual application of information processing techniques to real-world problems. Over recent years Gaussian processes and Bayesian neural networks have come to the fore, and in this thesis their generalisation capabilities are analysed from theoretical and empirical perspectives. Upper and lower bounds on the learning curve of Gaussian processes are investigated in order to estimate the amount of data required to guarantee a certain level of generalisation performance. In this thesis we analyse the effects on the bounds and the learning curve induced by the smoothness of stochastic processes described by four different covariance functions. We also explain the early, linearly-decreasing behaviour of the curves and we investigate the asymptotic behaviour of the upper bounds. The effects of the noise and of the characteristic lengthscale of the stochastic process on the tightness of the bounds are also discussed. The analysis is supported by several numerical simulations. The generalisation error of a Gaussian process is affected by the dimension of the input vector and may be decreased by input-variable reduction techniques. In conventional approaches to Gaussian process regression, the positive definite matrix estimating the distance between input points is often taken to be diagonal. In this thesis we show that a general distance matrix is able to estimate the effective dimensionality of the regression problem as well as to discover the linear transformation from the manifest variables to the hidden-feature space, with a significant reduction of the input dimension.
Numerical simulations confirm the significant superiority of the general distance matrix with respect to the diagonal one. In the thesis we also present an empirical investigation of the generalisation errors of neural networks trained by two Bayesian algorithms, the Markov Chain Monte Carlo method and the evidence framework; the neural networks have been trained on the task of labelling segmented outdoor images.
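The general distance matrix discussed above can be sketched as follows: with kernel k(x, x') = exp(-(x - x')^T M (x - x')) and M = L^T L positive semi-definite, the eigenstructure of M exposes both the effective dimensionality and the linear transformation to the hidden-feature space. The values of L below are hypothetical stand-ins for parameters that would in practice be learned from data.

```python
import numpy as np

# A general (non-diagonal) distance matrix for the kernel
# k(x, x') = exp(-(x - x')^T M (x - x')). The factor L is a
# hypothetical stand-in for learned parameters.
L = np.array([[2.0, 2.0, 0.0],
              [0.1, -0.1, 0.0]])
M = L.T @ L   # positive semi-definite by construction

def kernel(A, B, M):
    """Evaluate the kernel between all pairs of rows of A and B."""
    diff = A[:, None, :] - B[None, :, :]
    return np.exp(-np.einsum("ijk,kl,ijl->ij", diff, M, diff))

# The eigenstructure of M reveals the effective dimensionality:
# large eigenvalues mark directions in which the function varies,
# and the corresponding eigenvectors give the linear transformation
# to the hidden-feature space.
eigvals, eigvecs = np.linalg.eigh(M)
effective_dim = int((eigvals > 1e-2 * eigvals.max()).sum())
```

Here the 3-D input space collapses to one dominant direction (along (1, 1, 0)/sqrt(2)), so a model using M effectively works in a 1-D hidden-feature space, which is the kind of dimensionality reduction the thesis reports.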
Abstract:
In response to increasing international competitiveness, many manufacturing businesses are rethinking their management strategies and philosophies towards achieving a computer integrated environment. The explosive growth in Advanced Manufacturing Technology (AMT) has resulted in the formation of functional "islands of automation" such as Computer Aided Design (CAD), Computer Aided Manufacturing (CAM), Computer Aided Process Planning (CAPP) and Manufacturing Resources Planning (MRPII). This has resulted in an environment with localised areas of excellence but poor overall efficiency, co-ordination and control. The main role of Computer Integrated Manufacturing (CIM) is to integrate these islands of automation and develop a totally integrated and controlled environment. However, the various perceptions of CIM, although developing, remain focussed on a very narrow integration scope and have consequently resulted in merely linked islands of automation, with little improvement in overall co-ordination and control. The research described in this thesis develops and examines a more holistic view of CIM, based on the integration of various business elements. One particular business element, namely control, has been shown to have a multi-faceted and underpinning relationship with the CIM philosophy. This relationship impacts various CIM system design aspects, including the CIM business analysis and modelling technique, the specification of systems integration requirements, the CIM system architectural form and the degree of business redesign. The research findings show that fundamental changes to CIM system design are required; these are incorporated in a generic CIM design methodology. The effect and influence of this holistic view of CIM on a manufacturing business have been evaluated through various industrial case study applications.
Based on the evidence obtained, it has been concluded that this holistic, control based approach to CIM can provide a greatly improved means of achieving a totally integrated and controlled business environment. This generic CIM methodology will therefore make a significant contribution to the planning, modelling, design and development of future CIM systems.
Abstract:
The aim of this study was to comparatively investigate the impact of the visual-verbal relationships that exist in expository texts on the reading process and comprehension of readers from different language backgrounds: native speakers of English (L1) and speakers of English as a foreign language (EFL). The study focussed, in this respect, on the visual elements (VEs), mainly graphs and tables, that accompanied the selected texts. Two major experiments were undertaken. The first addressed the reading process, using the post-reading questionnaire technique. Participants were 163 adult readers representing three groups: 77 (L1), 56 (EFL postgraduates) and 30 (EFL undergraduates). The second experiment addressed reading comprehension, using the cloze procedure. Participants were 123, representing the same three groups: 50, 33 and 40 respectively. It was hypothesised that the L1 readers would make use of VEs in the reading process in ways different from both EFL groups, and that this use would enhance each group's comprehension in different respects and to different degrees. In the analysis of the data from both experiments, two statistical measures were used: the chi-square test to measure the differences between frequencies, and the t-test to measure the differences between means. The results indicated a significant relationship between readers' language background and the impact of visual-verbal relationships on their reading processes and comprehension of this type of text. The results also revealed considerable similarities between the two EFL groups in the reading process of texts accompanied by VEs. In reading comprehension, however, the EFL undergraduates seemed to benefit from the visual-verbal relationships more than the postgraduates, suggesting a weaker impact for older EFL readers.
Furthermore, the results showed considerable similarities between the reading process of texts accompanied by VEs and of whole prose texts. Finally an evaluation of this study was undertaken as well as practical implications for EFL readers and suggestions for future research.