22 results for number and operation
in Aston University Research Archive
Abstract:
Computerised production control developments have concentrated on Manufacturing Resources Planning (MRP II) systems. The literature suggests, however, that despite the massive investment in hardware, software and management education, successful implementation of such systems in manufacturing industries has proved difficult. This thesis reviews the development of production planning and control systems and, in particular, investigates the causes of failures in implementing MRP/MRP II systems in industrial environments, arguing that the centralised and top-down planning structure, as well as the routine operational methodology of such systems, is inherently prone to failure. The thesis reviews the control benefits of cellular manufacturing systems but concludes that in more dynamic manufacturing environments, techniques such as Kanban are inappropriate. The basic shortcomings of MRP II systems are highlighted and a new enhanced operational methodology based on distributed planning and control principles is introduced. Distributed Manufacturing Resources Planning (DMRP) was developed as a capacity-sensitive production planning and control solution for cellular manufacturing environments. The system utilises cell-based, independently operated MRP II systems, integrated into a plant-wide control system through a Local Area Network. The potential benefits of adopting the system in industrial environments are discussed and the results of computer simulation experiments comparing the performance of the DMRP system against conventional MRP II systems are presented. DMRP methodology is shown to offer significant potential advantages, which include ease of implementation, cost effectiveness, capacity sensitivity, shorter manufacturing lead times, lower work-in-progress levels and improved customer service.
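The cell-level MRP logic referred to above can be illustrated with a minimal single-item gross-to-net calculation. This is a toy sketch with invented demand figures, not the thesis's DMRP implementation:

```python
# Minimal single-item MRP gross-to-net calculation over weekly periods.
# Illustrative toy with invented figures; not the thesis's DMRP system.

def mrp_net_requirements(gross, on_hand, scheduled_receipts, lot_size=1):
    """Return planned order receipts per period (lot-for-lot or fixed lots)."""
    available = on_hand
    planned = []
    for demand, receipt in zip(gross, scheduled_receipts):
        available += receipt
        net = demand - available
        if net > 0:
            # round the order up to a whole number of lots
            order = -(-net // lot_size) * lot_size
        else:
            order = 0
        planned.append(order)
        available += order - demand
    return planned

gross = [20, 0, 45, 10]            # gross requirements per week
receipts = [0, 30, 0, 0]           # scheduled receipts already on order
orders = mrp_net_requirements(gross, on_hand=25,
                              scheduled_receipts=receipts, lot_size=25)
```

A DMRP cell would run this netting logic locally and exchange the resulting planned orders with other cells over the network, rather than receiving a centrally computed schedule.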
Abstract:
This study concerns the application of a model of effective interpersonal relationships to problems arising from staff assessment at I.C.I. Ltd. Corporate Laboratory between 1972 and 1974. In collaboration with academic and industrial supervision, the study commenced with a survey of management and supervisor opinions about the effectiveness of current staff (work) relationships, with particular reference to the problem of recognising and developing creative potential. This survey emphasised a need to improve the relationships between staff in the staff assessment context. A survey of research into creativity emphasised the importance of the interpersonal environment for obtaining creative behaviour in an organisational context. A further survey of theories of how interpersonal behaviour relates to personal creativity (therapeutic psychology) provided a model of effective interpersonal behaviour (Carkhuff, 1969) that could be applied to the organisational context of staff assessment. The objective of the project was redefined as a need to improve the conditions of interpersonal behaviour in relation to certain (career development) problems arising from staff assessment practices. In order to demonstrate the application of the model of effective interpersonal behaviour, the research student recorded interviews between himself and members of staff designed to develop and operate the dimensions of the model. Different samples of staff were used to develop the 'facilitative' and the 'action oriented' dimensions of behaviour, and then for the operation of a helping programme (based on vocational guidance tests). These interactions have been analysed according to the scales of measurement in the model and the results are presented in case study form in this thesis. At each stage of the project, results and conclusions were presented to the sponsoring organisation (e.g. the industrial supervisor) in order to assess their (subjective) opinion of relevance to the organisation.
Finally, recommendations on further actions towards general improvement of the work relationships in the laboratory were presented in a brief report to the sponsor.
Abstract:
Refractive index and structural characteristics of optical polymers are strongly influenced by the thermal history of the material. Polymer optical fibres (POF) are drawn under tension, resulting in axial orientation of the polymer molecular chains due to their susceptibility to align in the fibre direction. This change in orientation from the drawing process results in residual strain in the fibre and also affects the transparency and birefringence of the material (1-3). PMMA POF can exhibit a failure strain in excess of 100%, but the fibre has to be drawn under low tension to achieve this value. The drawing tension affects the magnitude of molecular alignment along the fibre axis, thus affecting the failure strain: the higher the tension, the lower the failure strain will be. However, the properties of fibre drawn under high tension can approach those of fibre drawn under low tension by means of an annealing process. Annealing the fibre can generally optimise the performance of POF while keeping most advantages intact. Annealing procedures can reduce index difference throughout the bulk and also reduce residual stress that may cause fracture or distortion. POF can be annealed at temperatures approaching the glass transition temperature (Tg) of the polymer to produce FBGs with a permanent blue Bragg wavelength shift at room temperature. At this elevated temperature, segmental motion in the structure results in a lower viscosity; the material softens and the molecular chains relax from the axial orientation, causing shrinking of the fibre. The large attenuation of typically 1dB/cm in the 1550nm spectral region of PMMA POF has limited FBG lengths to less than 10cm. The more expensive fluorinated polymers with lower absorption have had no success as FBG waveguides. Bragg gratings have been inscribed onto various POF in the 800nm spectral region using a 30mW continuous wave 325nm helium cadmium laser, with a much reduced attenuation coefficient of 10dB/m (5).
Fabricating multiplexed FBGs in the 800nm spectral region in TOPAS and PMMA POF has consistently led to the fabrication of multiplexed FBGs in the 700nm spectral region by a method of prolonged annealing. The Bragg wavelength shift of gratings fabricated in PMMA fibre at 833nm and 867nm was monitored whilst the POF was thermally annealed at 80°C. Permanent shifts exceeding 80nm into the 700nm spectral region were attained by both gratings on the fibre. The large permanent shift creates the possibility of multiplexed Bragg sensors operating over a broad range.
References:
1. Pellerin C, Prud'homme RE, Pézolet M. Effect of thermal history on the molecular orientation in polystyrene/poly(vinyl methyl ether) blends. Polymer. 2003;44(11):3291-7.
2. Dvoránek L, Machová L, Šorm M, Pelzbauer Z, Švantner J, Kubánek V. Effects of drawing conditions on the properties of optical fibers made from polystyrene and poly(methyl methacrylate). Die Angewandte Makromolekulare Chemie. 1990;174(1):25-39.
3. Dugas J, Pierrejean I, Farenc J, Peichot JP. Birefringence and internal stress in polystyrene optical fibers. Applied Optics. 1994;33(16):3545-8.
4. Jiang C, Kuzyk MG, Ding JL, Johns WE, Welker DJ. Fabrication and mechanical behavior of dye-doped polymer optical fiber. Journal of Applied Physics. 2002;92(1):4-12.
5. Johnson IP, Webb DJ, Kalli K, Yuan W, Stefani A, Nielsen K, et al., editors. Polymer PCF Bragg grating sensors based on poly(methyl methacrylate) and TOPAS cyclic olefin copolymer. 2011: SPIE.
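The blue shift described in this abstract follows from the Bragg condition, lambda_B = 2 * n_eff * period: axial shrinkage during annealing reduces the grating period and hence the reflected wavelength. A minimal sketch, where both the PMMA effective index and the shrinkage fraction are assumed illustrative values (the shrinkage is chosen to reproduce a shift of roughly 80nm):

```python
# Bragg condition: lambda_B = 2 * n_eff * period.
# Sketch of how axial shrinkage during annealing blue-shifts a grating;
# the effective index and the shrinkage fraction are assumed values.

def bragg_wavelength(n_eff, period_nm):
    return 2.0 * n_eff * period_nm

n_eff = 1.49                            # typical PMMA effective index (assumed)
period = 833.0 / (2.0 * n_eff)          # period of an 833 nm grating
shrunk_period = period * (1.0 - 0.096)  # ~9.6% axial shrinkage (assumed)
shifted = bragg_wavelength(n_eff, shrunk_period)
blue_shift = 833.0 - shifted            # permanent blue shift in nm
```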
Abstract:
Modelling architectural information is particularly important because of the acknowledged crucial role of software architecture in raising the level of abstraction during development. In the MDE area, the level of abstraction of models has frequently been related to low-level design concepts. However, model-driven techniques can be further exploited to model software artefacts that take into account the architecture of the system and its changes according to variations of the environment. In this paper, we propose model-driven techniques and dynamic variability as concepts useful for modelling the dynamic fluctuation of the environment and its impact on the architecture. Using the mappings from the models to implementation, generative techniques allow the (semi-)automatic generation of artefacts, making the process more efficient and promoting software reuse. The automatic generation of configurations and reconfigurations from models provides the basis for safer execution. The architectural perspective offered by the models shifts focus away from implementation details to the whole view of the system and its runtime change, promoting high-level analysis. © 2009 Springer Berlin Heidelberg.
Abstract:
Concept evaluation in the early phase of product development plays a crucial role in new product development, as it determines the direction of the subsequent design activities. However, the evaluation information at this stage mainly comes from experts' judgments, which are subjective and imprecise. Managing this subjectivity to reduce evaluation bias is a major challenge in design concept evaluation. This paper proposes a comprehensive evaluation method which combines information entropy theory and rough numbers. Rough numbers are first presented to aggregate individual judgments and priorities and to manipulate vagueness in a group decision-making environment. A rough-number-based information entropy method is then proposed to determine the relative weights of the evaluation criteria. Composite performance values based on rough numbers are then calculated to rank the candidate design concepts. The results from a practical case study on the concept evaluation of an industrial robot design show that the integrated evaluation model can effectively strengthen objectivity throughout the decision-making process.
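The entropy-weighting step can be illustrated with a plain Shannon-entropy sketch over a hypothetical decision matrix; the paper's rough-number extension is not reproduced here:

```python
import math

# Entropy weighting of criteria from a decision matrix (alternatives x
# criteria). Plain Shannon-entropy sketch with invented concept ratings;
# the paper's rough-number formulation is not reproduced.

def entropy_weights(matrix):
    m = len(matrix)                       # number of alternatives
    n = len(matrix[0])                    # number of criteria
    raw = []
    for j in range(n):
        col = [row[j] for row in matrix]
        total = sum(col)
        p = [v / total for v in col]
        # Shannon entropy of the column, normalised to [0, 1]
        e = -sum(q * math.log(q) for q in p if q > 0) / math.log(m)
        raw.append(1.0 - e)               # higher divergence -> larger weight
    s = sum(raw)
    return [r / s for r in raw]

ratings = [[7, 9, 4],                     # three concepts scored on
           [8, 6, 5],                     # three criteria (hypothetical)
           [6, 8, 9]]
weights = entropy_weights(ratings)
```

A criterion on which all concepts score identically carries no discriminating information and receives zero weight, which is the intuition behind entropy weighting.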
Abstract:
Using methods of statistical physics, we study the average number and kernel size of general sparse random matrices over GF(q), with a given connectivity profile, in the thermodynamical limit of large matrices. We introduce a mapping of GF(q) matrices onto spin systems using the representation of the cyclic group of order q as the q-th complex roots of unity. This representation facilitates the derivation of the average kernel size of random matrices using the replica approach, under the replica symmetric ansatz, resulting in saddle point equations for general connectivity distributions. Numerical solutions are then obtained for particular cases by population dynamics. Similar techniques also allow us to obtain an expression for the exact and average number of random matrices for any general connectivity profile. We present numerical results for particular distributions.
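For small matrices the kernel size can be checked directly by Gaussian elimination rather than by the replica method; a sketch for the simplest case q = 2, where the kernel size is 2**(n - rank):

```python
import random

# Kernel size of a sparse random matrix over GF(2): |ker| = 2**(n - rank).
# A direct Gaussian-elimination check for small matrices; not the replica
# calculation described in the abstract.

def gf2_rank(rows, n):
    """Rank over GF(2); each row is an int bitmask of n columns."""
    rank = 0
    for col in range(n):
        pivot = next((i for i in range(rank, len(rows))
                      if rows[i] >> col & 1), None)
        if pivot is None:
            continue
        rows[rank], rows[pivot] = rows[pivot], rows[rank]
        for i in range(len(rows)):
            if i != rank and rows[i] >> col & 1:
                rows[i] ^= rows[rank]
        rank += 1
    return rank

def kernel_size(rows, n):
    return 2 ** (n - gf2_rank(list(rows), n))

# sparse random 6x8 matrix over GF(2) with exactly 3 ones per row
random.seed(0)
rows = [sum(1 << c for c in random.sample(range(8), 3)) for _ in range(6)]
size = kernel_size(rows, 8)
```

Averaging `kernel_size` over many such draws gives a numerical estimate to compare against the replica-symmetric prediction for a fixed connectivity profile.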
Abstract:
The recent explosive growth in advanced manufacturing technology (AMT) and continued development of sophisticated information technologies (IT) is expected to have a profound effect on the way we design and operate manufacturing businesses. Furthermore, the escalating capital requirements associated with these developments have significantly increased the level of risk associated with initial design, ongoing development and operation. This dissertation has examined the integration of two key sub-elements of the Computer Integrated Manufacturing (CIM) system, namely the manufacturing facility and the production control system. This research has concentrated on the interactions between production control (MRP) and an AMT based production facility. The disappointing performance of such systems has been discussed in the context of a number of potential technological and performance incompatibilities between these two elements. It was argued that the design and selection of operating policies for both is the key to successful integration. Furthermore, policy decisions are shown to play an important role in matching the performance of the total system to the demands of the marketplace. It is demonstrated that a holistic approach to policy design must be adopted if successful integration is to be achieved. It is shown that the complexity of the issues resulting from such an approach required the formulation of a structured design methodology. Such a methodology was subsequently developed and discussed. This combined a first principles approach to the behaviour of system elements with the specification of a detailed holistic model for use in the policy design environment. The methodology aimed to make full use of the `low inertia' characteristics of AMT, whilst adopting a JIT configuration of MRP and re-coupling the total system to the market demands. 
This dissertation discussed the application of the methodology to an industrial case study and the subsequent design of operational policies. Consequently, a novel approach to production control resulted, a central feature of which was a move toward reduced manual intervention in the MRP processing and scheduling logic, with increased human involvement and motivation in the management of work-flow on the shopfloor. Experimental results indicated that significant performance advantages would result from the adoption of the recommended policy set.
Abstract:
Quality, production and technological innovation management rank among the most important matters of concern to modern manufacturing organisations. They can provide companies with the decisive means of gaining a competitive advantage, especially within industries where there is an increasing similarity in product design and manufacturing processes. The papers in this special issue of International Journal of Technology Management have all been selected as examples of how aspects of quality, production and technological innovation can help to improve competitive performance. Most are based on presentations made at the UK Operations Management Association's Sixth International Conference held at Aston University at which the theme was 'Getting Ahead Through Technology and People'. At the conference itself over 80 papers were presented by authors from 15 countries around the world. Among the many topics addressed within the conference theme, technological innovation, quality and production management emerged as attracting the greatest concern and interest of delegates, particularly those from industry. For any new initiative to be implemented successfully, it should be led from the top of the organization. Achieving the desired level of commitment from top management can, however, be a difficulty. In the first paper of this issue, Mackness investigates this question by explaining how systems thinking can help. In the systems approach, properties such as 'emergence', 'hierarchy', 'communication' and 'control' are used to assist top managers in preparing for change. Mackness's paper is then complemented by Iijima and Hasegawa's contribution in which they investigate the development of Quality Information Management (QIM) in Japan. They present the idea of a Design Review and demonstrate how it can be used to trace and reduce quality-related losses. The next paper on the subject of quality is by Whittle and colleagues.
It relates to total quality and the process of culture change within organisations. Using the findings of investigations carried out in a number of case study companies, they describe four generic models which have been identified as characterising methods of implementing total quality within existing organisation cultures. Boaden and Dale's paper also relates to the management of quality, but looks specifically at the construction industry where it has been found there is still some confusion over the role of Quality Assurance (QA) and Total Quality Management (TQM). They describe the results of a questionnaire survey of forty companies in the industry and compare them to similar work carried out in other industries. Szakonyi's contribution then completes this group of papers which all relate specifically to the question of quality. His concern is with the two ways in which R&D or engineering managers can work on improving quality. The first is by improving it in the laboratory, while the second is by working with other functions to improve quality in the company. The next group of papers in this issue all address aspects of production management. Umeda's paper proposes a new manufacturing-oriented simulation package for production management which provides important information for both design and operation of manufacturing systems. A simulation for production strategy in a Computer Integrated Manufacturing (CIM) environment is also discussed. This paper is then followed by a contribution by Tanaka and colleagues in which they consider loading schedules for manufacturing orders in a Material Requirements Planning (MRP) environment. They compare mathematical programming with a knowledge-based approach, and comment on their relative effectiveness for different practical situations. 
Engstrom and Medbo's paper then looks at a particular aspect of production system design, namely the question of devising group working arrangements for assembly with new product structures. Using the case of a Swedish vehicle assembly plant where long cycle assembly work has been adopted, they advocate the use of a generally applicable product structure which can be adapted to suit individual local conditions. In the last paper of this particular group, Tay considers how automation has affected the production efficiency in Singapore. Using data from ten major industries he identifies several factors which are positively correlated with efficiency, with capital intensity being of greatest interest to policy makers. The two following papers examine the case of electronic data interchange (EDI) as a means of improving the efficiency and quality of trading relationships. Banerjee and Banerjee consider a particular approach to material provisioning for production systems using orderless inventory replenishment. Using the example of a single supplier and multiple buyers they develop an analytical model which is applicable for the exchange of information between trading partners using EDI. They conclude that EDI-based inventory control can be attractive from economic as well as other standpoints and that the approach is consistent with and can be instrumental in moving towards just-in-time (JIT) inventory management. Slacker's complementary viewpoint on EDI is from the perspective of the quality relation-ship between the customer and supplier. Based on the experience of Lucas, a supplier within the automotive industry, he concludes that both banks and trading companies must take responsibility for the development of payment mechanisms which satisfy the requirements of quality trading. The three final papers of this issue relate to technological innovation and are all country based. 
Berman and Khalil report on a survey of US technological effectiveness in the global economy. The importance of education is supported in their conclusions, although it remains unclear to what extent the US government can play a wider role in promoting technological innovation and new industries. The role of technology in national development is taken up by Martinsons and Valdemars who examine the case of the former Soviet Union. The failure to successfully infuse technology into Soviet enterprises is seen as a factor in that country's demise, and it is anticipated that the newly liberalised economies will be able to encourage greater technological creativity. This point is then taken up in Perminov's concluding paper which looks in detail at Russia. Here a similar analysis is made of the Soviet Union's technological decline, but a development strategy is also presented within the context of the change from a centralised to a free market economy. The papers included in this special issue of the International Journal of Technology Management each represent a unique and particular contribution to their own specific area of concern. Together, however, they also argue or demonstrate the general improvements in competitive performance that can be achieved through the application of modern principles and practice to the management of quality, production and technological innovation.
Abstract:
The performances of five different ESI sources coupled to a polystyrene-divinylbenzene monolithic column were compared in a series of LC-ESI-MS/MS analyses of Escherichia coli outer membrane proteins. The sources selected for comparison included two different modifications of the standard electrospray source, a commercial low-flow sprayer, a stainless steel nanospray needle and a coated glass Picotip. Respective performances were judged on sensitivity and the number and reproducibility of significant protein identifications obtained through the analysis of multiple identical samples. Data quality varied between that of a ground silica capillary, with 160 total protein identifications, the lowest number of high quality peptide hits obtained (3012), and generally peaks of lower intensity; and a stainless steel nanospray needle, which resulted in increased precursor ion abundance, the highest-quality peptide fragmentation spectra (5414) and greatest number of total protein identifications (259) exhibiting the highest MASCOT scores (average increase in score of 27.5% per identified protein). The data presented show that, despite increased variability in comparative ion intensity, the stainless steel nanospray needle provides the highest overall sensitivity. However, the resulting data were less reproducible in terms of proteins identified in complex mixtures -- arguably due to an increased number of high intensity precursor ion candidates.
Abstract:
For forty years linguists have talked about idiolect and the uniqueness of individual utterances. This article explores how far these two concepts can be used to answer certain questions about the authorship of written documents—for instance how similar can two student essays be before one begins to suspect plagiarism? The article examines two ways of measuring similarity: the proportion of shared vocabulary and the number and length of shared phrases, and illustrates with examples drawn from both actual criminal court cases and incidents of student plagiarism. The article ends by engaging with Solan and Tiersma's contribution to this volume and considering whether such forensic linguistic evidence would be acceptable in American courts as well as how it might successfully be presented to a lay audience.
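The two similarity measures mentioned above, the proportion of shared vocabulary and shared phrases (word n-grams), can be sketched as follows, using toy sentences rather than real case material:

```python
# Two similarity measures discussed above: proportion of shared vocabulary
# (Jaccard over word types) and shared phrases (word n-grams). Toy
# sentences, illustrative only.

def shared_vocabulary(text_a, text_b):
    vocab_a = set(text_a.lower().split())
    vocab_b = set(text_b.lower().split())
    return len(vocab_a & vocab_b) / len(vocab_a | vocab_b)

def shared_phrases(text_a, text_b, n=3):
    def ngrams(text):
        words = text.lower().split()
        return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}
    return ngrams(text_a) & ngrams(text_b)

essay_a = "the cat sat on the mat and purred"
essay_b = "the cat sat on the rug and slept"
overlap = shared_vocabulary(essay_a, essay_b)   # 5 shared of 9 word types
phrases = shared_phrases(essay_a, essay_b)      # shared three-word sequences
```

Long shared phrases are the stronger signal in practice: two writers plausibly share common words, but extended identical word sequences are far less likely to arise independently.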
Abstract:
The existing method of pipeline health monitoring, which requires an entire pipeline to be inspected periodically, is both time-consuming and expensive. A risk-based model that reduces the amount of time spent on inspection has been presented. This model not only reduces the cost of maintaining petroleum pipelines, but also suggests efficient design and operation philosophy, construction methodology and logical insurance plans. The risk-based model uses the Analytic Hierarchy Process (AHP), a multiple attribute decision-making technique, to identify the factors that influence failure on specific segments and analyzes their effects by determining the probability of risk factors. The severity of failure is determined through consequence analysis. From this, the effect of a failure caused by each risk factor can be established in terms of cost, and the cumulative effect of failure is determined through probability analysis. The technique does not totally eliminate subjectivity, but it is an improvement over the existing inspection method.
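The AHP weighting step can be illustrated with the common geometric-mean approximation to the principal eigenvector of a pairwise comparison matrix; the comparison values for three hypothetical pipeline risk factors are invented for the sketch:

```python
# AHP priority weights from a pairwise comparison matrix, via the common
# geometric-mean approximation to the principal eigenvector. The comparison
# values for three hypothetical pipeline risk factors are invented.

def ahp_weights(pairwise):
    geo_means = []
    for row in pairwise:
        product = 1.0
        for value in row:
            product *= value
        geo_means.append(product ** (1.0 / len(row)))
    total = sum(geo_means)
    return [g / total for g in geo_means]

# corrosion vs third-party damage vs construction defects (hypothetical)
comparisons = [[1.0, 3.0, 5.0],
               [1 / 3, 1.0, 2.0],
               [1 / 5, 1 / 2, 1.0]]
weights = ahp_weights(comparisons)   # sums to 1; corrosion weighted highest
```

Each weight can then multiply the segment's consequence cost to rank pipeline segments for inspection priority, which is the spirit of the risk-based model described above.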