29 results for production techniques
Abstract:
The present study describes a pragmatic approach to the implementation of production planning and scheduling techniques in foundries of all types and looks at the use of 'state-of-the-art' management control and information systems. Following a review of systems for the classification of manufacturing companies, a definitive statement is made which highlights the important differences between foundries (i.e. 'component makers') and other manufacturing companies (i.e. 'component buyers'). An investigation of the manual procedures used to plan and control the manufacture of components reveals the inherent problems facing foundry production management staff and suggests that many manufacturing techniques applied in general engineering companies are unsuitable for foundries. The literature review found that computer-assisted systems are required which are primarily 'information-based' rather than 'decision-based', whilst the availability of low-cost computers and packaged software has enabled foundries to 'get their feet wet' without the financial penalties which characterized many of the early (i.e. pre-1980) attempts at computer assistance. Moreover, no single methodology for foundry scheduling emerged from the review. A philosophy for the development of a CAPM system is presented, which details the essential information requirements and puts forward proposals for the subsequent interactions between types of information and the sub-systems of CAPM which they support. The work was oriented specifically at the functions of production planning and scheduling and introduces the concept of 'manual interaction' for effective scheduling. The techniques developed were designed to use the information readily available in foundries and proved practically successful following their implementation in a wide variety of foundries. The limitations of the techniques are then discussed within the wider issues which form a CAPM system, prior to a presentation of the conclusions which can be drawn from the study.
Abstract:
Computerised production control developments have concentrated on Manufacturing Resources Planning (MRP II) systems. The literature suggests, however, that despite the massive investment in hardware, software and management education, successful implementation of such systems in manufacturing industries has proved difficult. This thesis reviews the development of production planning and control systems and, in particular, investigates the causes of failures in implementing MRP/MRP II systems in industrial environments, arguing that the centralised and top-down planning structure, as well as the routine operational methodology of such systems, is inherently prone to failure. The thesis reviews the control benefits of cellular manufacturing systems but concludes that in more dynamic manufacturing environments, techniques such as Kanban are inappropriate. The basic shortcomings of MRP II systems are highlighted and a new, enhanced operational methodology based on distributed planning and control principles is introduced. Distributed Manufacturing Resources Planning (DMRP) was developed as a capacity-sensitive production planning and control solution for cellular manufacturing environments. The system utilises cell-based, independently operated MRP II systems, integrated into a plant-wide control system through a Local Area Network. The potential benefits of adopting the system in industrial environments are discussed, and the results of computer simulation experiments comparing the performance of the DMRP system against conventional MRP II systems are presented. The DMRP methodology is shown to offer significant potential advantages, including ease of implementation, cost effectiveness, capacity sensitivity, shorter manufacturing lead times, lower work-in-progress levels and improved customer service.
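To make the cell-level planning concrete: each DMRP cell runs MRP-style netting over its own local demand. A minimal sketch of such netting in Python (the lot-for-lot logic and all figures are illustrative assumptions, not the thesis's implementation):

```python
# Illustrative cell-level MRP netting (lot-for-lot). In a DMRP layout each
# cell would run planning like this independently and exchange due dates
# with other cells over the plant network.

def net_requirements(gross, on_hand, receipts, lead_time):
    """Return planned order releases per period for one item in one cell."""
    releases = [0] * len(gross)
    stock = on_hand
    for t, demand in enumerate(gross):
        stock += receipts[t]                    # scheduled receipts arrive
        shortfall = demand - stock
        if shortfall > 0:
            # release an order lead_time periods earlier to cover the gap
            releases[max(0, t - lead_time)] += shortfall
            stock = 0
        else:
            stock -= demand
    return releases

# Example: weekly demand for one component, 40 units on hand, 2-week lead time.
print(net_requirements(gross=[30, 20, 50, 10], on_hand=40,
                       receipts=[0, 0, 0, 0], lead_time=2))  # -> [60, 10, 0, 0]
```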
Abstract:
Many planning and control tools, especially network analysis, have been developed in the last four decades. The majority of them were created in military organizations to solve the problem of planning and controlling research and development projects. The original version of the network model (i.e. CPM/PERT) was transplanted to the construction industry without consideration of the special nature and environment of construction projects. It suited the purpose of setting up targets and defining objectives, but it failed to satisfy the requirements of detailed planning and control at the site level. Several analytical and heuristic rule-based methods were designed and combined with the structure of CPM to eliminate its deficiencies, but none of them provides a complete solution to the problem of resource, time and cost control. VERT was designed to deal with new ventures and is suitable for project evaluation at the development stage. CYCLONE, on the other hand, is concerned with the design and micro-analysis of the production process. This work presents an extensive critical review of the available planning techniques and addresses the problem of planning for site operation and control. Based on an outline of the nature of site control, this research developed a simulation-based network model which combines parts of the logic of both VERT and CYCLONE. Several new nodes were designed to model the availability and flow of resources and the overhead and operating costs, along with special nodes for evaluating time and cost. A large software package was written to handle the input, the simulation process and the output of the model; it is designed to run on any microcomputer using the MS-DOS operating system. Data from real-life projects were used to demonstrate the capability of the technique. Finally, conclusions are drawn regarding the features and limitations of the proposed model, and recommendations for future work are outlined at the end of this thesis.
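To illustrate the simulation-based approach in miniature (not the thesis's actual node set or package), here is a toy Monte Carlo run over a three-activity network with a shared crew pool and triangular activity durations:

```python
import random

# Toy Monte Carlo simulation of a small activity network with one shared
# resource pool. Activity names, durations and crew counts are invented
# for illustration only.

ACTIVITIES = {  # name: (predecessors, (min, mode, max) duration in days, crews needed)
    "excavate": ((),            (2, 3, 5), 1),
    "formwork": (("excavate",), (3, 4, 6), 2),
    "pour":     (("formwork",), (1, 2, 3), 2),
}
TOTAL_CREWS = 2

def simulate():
    finish = {}
    crew_free_at = [0.0] * TOTAL_CREWS
    for name, (preds, (lo, mode, hi), crews) in ACTIVITIES.items():
        ready = max((finish[p] for p in preds), default=0.0)
        crew_free_at.sort()                       # earliest-available crews first
        start = max(ready, crew_free_at[crews - 1])
        duration = random.triangular(lo, hi, mode)
        for i in range(crews):                    # claim the crews for this job
            crew_free_at[i] = start + duration
        finish[name] = start + duration
    return max(finish.values())

runs = [simulate() for _ in range(10_000)]
print(f"mean simulated project duration: {sum(runs) / len(runs):.2f} days")
```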
Abstract:
The production of composite particles using dry powder coating is a one-step, environmentally friendly process for the fabrication of particles with targeted properties and favourable functionalities. Diverse functionalities, such as flowability enhancement, content uniformity and dissolution, can be developed through dry particle coating. In this review, we discuss the particle functionalities that can be tailored and the selection of characterisation techniques relevant to understanding their molecular basis. We address key features of the powder blend sampling process and explore the relevant characterisation techniques, focussing on the functionality delivered by dry coating and on surface profiling that explores the dynamics and surface characteristics of the composite blends. Dry particle coating is a solvent- and heat-free process that can be used to develop functionalised particles. However, assessment of the resultant functionality requires careful selection of sensitive analytical techniques that can distinguish particle surface changes within the nano- and/or micrometre range.
Abstract:
This review covers the production and utilisation of liquids from the thermal processing of biomass and related materials to substitute for synthetic phenol and formaldehyde in phenol-formaldehyde resins. These resins are primarily employed in the manufacture of wood panels such as plywood, MDF, particleboard and OSB. The most important thermal conversion methods for this purpose are fast pyrolysis and vacuum pyrolysis, pressure liquefaction, and phenolysis. Many feedstocks have been tested for their suitability as sources of phenolics, including hardwoods and softwoods, bark and residual lignins. Resins have been prepared utilising either the whole liquid product or a phenolics-enriched fraction obtained after fractional condensation or further processing, such as solvent extraction. None of the phenolics production and fractionation techniques covered in this review is believed to allow substitution of 100% of the phenol content of the resin without impacting its effectiveness compared to commercial formulations based on petroleum-derived phenol. This survey shows that considerable progress has been made towards the goal of a price-competitive renewable resin, but that further research is required to meet the twin challenges of low renewable resin cost and satisfactory quality. Particular areas of concern are wood panel press times, variability of renewable resin properties, odour, lack of reactive sites compared to phenol, and the potential for increased emissions of volatile organic compounds.
Abstract:
Practising engineers frequently seek to understand what the effects of various manufacturing strategies will be on the performance of their production facilities. In this situation a computer model can help to provide insight and form predictions about future manufacturing system performance. Various types of modelling method exist, and each provides models with distinct characteristics. This paper presents a review of popular modelling techniques and, based on the results of a structured experimental study, summarises their capabilities to support the evaluation of manufacturing strategies.
Abstract:
Drying is a major and challenging step in the pre-treatment of biomass for the production of second-generation synfuels for transport. The biomass feedstocks are mostly wet and need to be dried from 30-60 wt% moisture content to about 10-15 wt%. The present survey aims to define and evaluate a few of the most promising optimised concepts for the biomass pre-treatment scheme in the production of second-generation synfuels for transport. The most promising commercially available drying processes are reviewed, focusing on the applications, operational factors and emissions of dryers. The dryers most commonly applied to biomass in bio-energy plants at present are direct rotary dryers, but the use of steam drying techniques is increasing. Steam drying systems enable the integration of the dryer with existing energy sources. In addition to integration, emissions and fire or explosion risks have to be considered when selecting a dryer for the plant. In steam drying there are no gaseous emissions, but the aqueous effluents often need treatment. Concepts for biomass pre-treatment were defined for two different cases: a large-scale wood-based gasification synfuel production and a small-scale pyrolysis process based on wood chips and miscanthus bundles. For the first case a pneumatic conveying steam dryer was suggested; in the second case the flue gas is used as the drying medium in a direct or indirect rotary dryer.
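As a worked illustration of the drying duty these moisture figures imply (the 1000 kg basis is an assumption for the example), with wet-basis moisture reduced from x_i to x_f:

```latex
% Water evaporated per mass m of wet biomass (wet-basis moisture x_i -> x_f):
\[
  m_{\text{evap}} = m \,\frac{x_i - x_f}{1 - x_f}
\]
% e.g. 1000 kg of wood chips dried from 50 wt% to 12 wt% moisture:
\[
  m_{\text{evap}} = 1000 \times \frac{0.50 - 0.12}{1 - 0.12} \approx 432~\text{kg of water}
\]
```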
Abstract:
With the growth of the multinational corporation (MNC) has come the need to understand how parent companies transfer knowledge to, and manage the operations of, their subsidiaries. This is of particular interest to manufacturing companies transferring their operations overseas. Japanese companies in particular have been pioneering in the development of techniques such as Kaizen, and elements of the Toyota Production System (TPS) such as Kanban, which can be useful tools for transferring the ethos of Japanese manufacturing and maintaining quality and control in overseas subsidiaries. Much has been written about the process of transferring Japanese manufacturing techniques, but much less is understood about how the subsidiaries themselves, which are required to make use of such techniques, actually acquire and incorporate them into their operations. This research therefore takes the perspective of the subsidiary in examining how knowledge of manufacturing techniques is transferred from the parent company to the subsidiary. A particularly relevant theme is how subsidiaries both replicate and adapt knowledge from parents, and the circumstances in which replication or adaptation occurs. However, it is shown that there is a lack of research which takes an in-depth look at these processes from the perspective of the participants themselves. This is particularly important, as much of the knowledge literature argues that knowledge is best viewed as enacted and learned in practice, and therefore transferred in person, rather than by the transfer of abstract and de-contextualised information. What is needed, therefore, is further research which makes an in-depth examination of what happens at the subsidiary level for this transfer process to occur, taking a practice-based view of how local managers and operatives incorporate knowledge about manufacturing techniques into their working practices. In-depth qualitative research was, therefore, conducted in the subsidiary of a Japanese multinational, Gambatte Corporation, involving three main manufacturing initiatives (or philosophies), namely 'TPS', 'TPM' and 'TS'. The case data were derived from 52 in-depth interviews with project members, moderate participant observations and documentation, and are presented and analysed in episode format. This study contributes to our understanding of knowledge transfer in relation to the approaches and circumstances of adaptation and replication of knowledge within the subsidiary, how the whole process develops, and how 'innovation' takes place. The study further shows that the process of knowledge transfer can be explained as a process of Reciprocal Provider-Learner Exchange that can be linked to Experiential Learning Theory.
Abstract:
This thesis describes the production of advanced materials comprising a wide array of polymer-based building blocks. These materials include bio-hybrid polymer-peptide conjugates, based on phenylalanine and poly(ethylene oxide), and polymers with intrinsic microporosity (PIMs). Polymer-peptide conjugates were previously synthesised using click chemistry. Owing to the inherent disadvantages of the reported synthesis, a new, simpler, inexpensive protocol was sought. Three synthetic methods based on amidation chemistry were investigated for both oligopeptide and polymer-peptide coupling. The resulting conjugates were then assessed by various analytical techniques, and the new synthesis was compared with the established protocol. An investigation was also carried out focussing on polymer-peptide coupling via ester chemistry, involving deprotection of the carboxyl terminus of the peptide. Polymer-peptide conjugates were also assessed for their propensity to self-assemble into thixotropic gels in an array of solvent mixtures. Determination of the rules governing this particular self-assembly (gelation) was required: initial work suggested that at least four phenylalanine peptide units are necessary for self-assembly, owing to favourable hydrogen-bond interactions. Quantitative analysis was carried out using three analytical techniques (namely rheology, FTIR and confocal microscopy) to probe the microstructure of the material and provide further information on the conditions for self-assembly. Several polymers were electrospun in order to produce nanofibres, including novel materials such as PIMs and the aforementioned bio-hybrid conjugates. An investigation of the parameters governing successful fibre production was carried out for PIMs, polymer-peptide conjugates, and nanoparticle cages coupled to a polymer scaffold. SEM analysis was carried out on all material produced during these electrospinning experiments.
Abstract:
With the growth of the multinational corporation (MNC) has come the need to understand how parent companies transfer knowledge to, and manage the operations of, their subsidiaries. This is of particular interest to manufacturing companies transferring their operations overseas. Japanese companies in particular have been pioneering in this regard, with techniques such as the Toyota Production System (TPS) for transferring the ethos of Japanese manufacturing and maintaining quality and control in overseas subsidiaries. A great deal has been written about the process of transferring Japanese manufacturing techniques, but much less is understood about how the subsidiaries themselves, which are required to make use of such techniques, actually acquire and incorporate them into their operations. The research on which this paper is based therefore examines how, from the perspective of the subsidiary, knowledge of manufacturing techniques is transferred from the parent company. There is clearly a need to take a practice-based view to understand how local managers and operatives incorporate knowledge about manufacturing techniques into their working practices. In-depth qualitative research was, therefore, conducted in the subsidiary of a Japanese multinational, Denso Corporation, involving three main manufacturing initiatives (or philosophies), namely 'TPS', 'TPM' and 'TS'. The case data were derived from 52 in-depth interviews with project members, moderate participant observations and documentation. The aim of this paper is to present the preliminary findings from the case analyses. The research contributes to our understanding of knowledge transfer in relation to the circumstances of the selection between adaptation and replication of knowledge in the subsidiary from its parent. In particular, this understanding relates to transfer across different flows and levels in the organisational hierarchy, how the whole process is managed, and how modification takes place.
Abstract:
Saturation mutagenesis is a powerful tool in modern protein engineering, allowing key residues within a protein to be targeted and randomised and thereby enabling the analysis of potential new properties. However, the creation of large libraries using conventional saturation mutagenesis with degenerate codons (NNN or NNK) has inherent redundancy and disparities in residue representation. In this work we describe the combination of ProxiMAX randomisation and CIS display for generating novel peptides. Unlike other methods, ProxiMAX randomisation does not require any intricate chemistry but simply utilises synthetic DNA and molecular biology techniques. Designed 'MAX' oligonucleotides were ligated, amplified and digested in an iterative cycle. Results show that randomised 'MAX' codons can be added sequentially to the base sequence, creating a series of randomised non-degenerate codons that can subsequently be inserted into a gene. CIS display (Isogenica, UK) is an in vitro DNA-based screening method that creates a genotype-to-phenotype link between a peptide and the nucleic acid that encodes it. The use of straightforward in vitro transcription/translation and other molecular biology techniques permits ease of use along with flexibility, making it a potent screening technique. Using ProxiMAX randomisation in combination with CIS display, the aim is to produce randomised peptides against nerve growth factor (NGF) and calcitonin gene-related peptide (CGRP) to demonstrate the high-throughput nature of this combination.
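A schematic sketch of why non-degenerate 'MAX'-style codons avoid the redundancy of NNK libraries (the codon-per-residue table is an illustrative assumption, and the snippet models only the outcome of the iterative ligation cycle, not the chemistry itself):

```python
import itertools
import random

# NNK randomisation: N = A/C/G/T, K = G/T gives 32 codons for 20 amino
# acids, so some residues are over-represented and the TAG stop slips in.
NNK = ["".join(c) for c in itertools.product("ACGT", "ACGT", "GT")]
print(len(NNK))  # 32 -> built-in redundancy

# A non-degenerate set: exactly one codon per amino acid (this particular
# choice of codons is an assumption made for illustration).
MAX_CODONS = {
    "A": "GCT", "C": "TGC", "D": "GAT", "E": "GAA", "F": "TTC",
    "G": "GGT", "H": "CAC", "I": "ATC", "K": "AAA", "L": "CTG",
    "M": "ATG", "N": "AAC", "P": "CCG", "Q": "CAG", "R": "CGT",
    "S": "TCT", "T": "ACC", "V": "GTT", "W": "TGG", "Y": "TAC",
}

def randomised_cassette(n_positions):
    """Append one non-degenerate randomised codon per position, mimicking
    the sequential addition of 'MAX' codons to a base sequence."""
    return "".join(random.choice(list(MAX_CODONS.values()))
                   for _ in range(n_positions))

print(randomised_cassette(6))  # an 18-nt cassette: 20 options/position, no stops
```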
Abstract:
Full text: The idea of producing proteins from recombinant DNA hatched almost half a century ago. In his PhD thesis, Peter Lobban foresaw the prospect of inserting foreign DNA (from any source, including mammalian cells) into the genome of a λ phage in order to detect and recover protein products from Escherichia coli [1,2]. Only a few years later, in 1977, Herbert Boyer and his colleagues succeeded in the first ever expression of a peptide-coding gene in E. coli: they produced recombinant somatostatin [3], followed shortly after by human insulin. The field has advanced enormously since those early days, and today recombinant proteins have become indispensable in advancing research and development in all fields of the life sciences. Structural biology, in particular, has benefitted tremendously from recombinant protein biotechnology, and an overwhelming proportion of the entries in the Protein Data Bank (PDB) are based on heterologously expressed proteins. Nonetheless, synthesizing, purifying and stabilizing recombinant proteins can still be thoroughly challenging. For example, the soluble proteome is organized to a large part into multicomponent complexes (in humans often comprising ten or more subunits), posing critical challenges for recombinant production. A third of all proteins in cells are located in the membrane and pose special challenges that require a more bespoke approach. Recent advances may now mean that even these most recalcitrant of proteins could become tenable structural biology targets on a more routine basis. In this special issue, we examine progress in key areas that suggests this is indeed the case. Our first contribution examines the importance of understanding quality control in the host cell during recombinant protein production, and pays particular attention to the synthesis of recombinant membrane proteins. A major challenge faced by any host cell factory is the balance it must strike between its own requirements for growth and the fact that its cellular machinery has essentially been hijacked by an expression construct. In this context, Bill and von der Haar examine emerging insights into the role of the dependent pathways of translation and protein folding in defining high-yielding recombinant membrane protein production experiments for the common prokaryotic and eukaryotic expression hosts. Rather than acting as isolated entities, many membrane proteins form complexes to carry out their functions. To understand their biological mechanisms, it is essential to study the molecular structure of the intact membrane protein assemblies. Recombinant production of membrane protein complexes is still a formidable, at times insurmountable, challenge. In these cases, extraction from natural sources is the only option to prepare samples for structural and functional studies. Zorman and co-workers, in our second contribution, provide an overview of recent advances in the production of multi-subunit membrane protein complexes and highlight recent achievements in membrane protein structural research brought about by state-of-the-art near-atomic resolution cryo-electron microscopy techniques. E. coli has been the dominant host cell for recombinant protein production. Nonetheless, eukaryotic expression systems, including yeasts, insect cells and mammalian cells, are increasingly gaining prominence in the field.
The yeast species Pichia pastoris is a well-established recombinant expression system for a number of applications, including the production of a range of different membrane proteins. Byrne reviews high-resolution structures that have been determined using this methylotroph as an expression host. Although it is not yet clear why P. pastoris is suited to producing such a wide range of membrane proteins, its ease of use and the availability of diverse tools that can be readily implemented in standard bioscience laboratories mean that it is likely to become an increasingly popular option in structural biology pipelines. The contribution by Columbus concludes the membrane protein section of this volume. In her overview of post-expression strategies, Columbus surveys the four most common biochemical approaches for the structural investigation of membrane proteins. Limited proteolysis has successfully aided structure determination of membrane proteins in many cases. Deglycosylation of membrane proteins following production and purification has also facilitated membrane protein structure analysis. Moreover, chemical modifications, such as lysine methylation and cysteine alkylation, have proven their worth in facilitating crystallization of membrane proteins, as well as NMR investigations of membrane protein conformational sampling. Together these approaches have greatly facilitated the structure determination of more than 40 membrane proteins to date. It may be an advantage to produce a target protein in mammalian cells, especially if authentic post-translational modifications such as glycosylation are required for proper activity. Chinese Hamster Ovary (CHO) and Human Embryonic Kidney (HEK) 293 cell lines have emerged as excellent hosts for heterologous production. The generation of stable cell lines is often an aspiration when synthesizing proteins in mammalian cells, in particular if high volumetric yields are to be achieved. In his report, Buessow surveys recent structures of proteins produced using stable mammalian cell lines and summarizes both well-established and novel approaches to facilitate stable cell-line generation for structural biology applications. The ambition of many biologists is to observe a protein's structure in the native environment of the cell itself. Until recently, this seemed more a dream than a reality. Advances in nuclear magnetic resonance (NMR) spectroscopy, however, have now made possible the observation of mechanistic events at the molecular level of protein structure. Smith and colleagues, in an exciting contribution, review emerging 'in-cell NMR' techniques that demonstrate the potential to monitor biological activities by NMR in real time in native physiological environments. A current drawback of NMR as a structure determination tool derives from the size limitations of the molecule under investigation, and the structures of large proteins and their complexes are therefore typically intractable by NMR. A solution to this challenge is the use of selective isotope labeling of the target protein, which results in a marked reduction in the complexity of NMR spectra and allows dynamic processes to be investigated even in very large proteins, and even in ribosomes. Kerfah and co-workers introduce methyl-specific isotopic labeling as a molecular toolbox and review its applications to the solution NMR analysis of large proteins.
Tyagi and Lemke next examine single-molecule FRET and crosslinking following the co-translational incorporation of non-canonical amino acids (ncAAs); the goal here is to move beyond static snapshots of proteins and their complexes and to observe them as dynamic entities. The encoding of ncAAs through codon-suppression technology allows biomolecules to be investigated with diverse structural biology methods. In their article, Tyagi and Lemke discuss these approaches and speculate on the design of improved host organisms for 'integrative structural biology research'. Our volume concludes with two contributions that resolve particular bottlenecks in the protein structure determination pipeline. The contribution by Crepin and co-workers introduces the concept of polyproteins in contemporary structural biology. Polyproteins are widespread in nature. They represent long polypeptide chains in which individual smaller proteins with different biological functions are covalently linked together; highly specific proteases then tailor the polyprotein into its constituent proteins. Many viruses use polyproteins as a means of organizing their proteome. The concept of polyproteins has now been exploited successfully to produce hitherto inaccessible recombinant protein complexes. For instance, by means of a self-processing synthetic polyprotein, the influenza polymerase, a high-value drug target that had remained elusive for decades, has been produced and its high-resolution structure determined. In the contribution by Desmyter and co-workers, a further, often imposing, bottleneck in high-resolution protein structure determination is addressed: the requirement to form stable three-dimensional crystal lattices that diffract incident X-ray radiation to high resolution. Nanobodies have proven to be uniquely useful as crystallization chaperones, coaxing challenging targets into suitable crystal lattices. Desmyter and co-workers review the generation of nanobodies by immunization and highlight the application of this powerful technology to the crystallography of important protein specimens, including G protein-coupled receptors (GPCRs). Recombinant protein production has come a long way since Peter Lobban's hypothesis in the late 1960s, with recombinant proteins now a dominant force in structural biology. The contributions in this volume showcase an impressive array of inventive approaches that are being developed and implemented, ever increasing the scope of recombinant technology to facilitate the determination of elusive protein structures. Powerful new methods from synthetic biology are further accelerating progress. Structure determination is now reaching into the living cell, with the ultimate goal of observing functional molecular architectures in action in their native physiological environment. We anticipate that even the most challenging protein assemblies will be tackled by recombinant technology in the near future.
Abstract:
Technology changes rapidly over the years, continually providing more computing options and making life easier for economic and other transactions. However, the introduction of new technology 'pushes' old Information and Communication Technology (ICT) products out of use. E-waste is defined as the quantity of ICT products no longer in use, and is a bivariate function of the quantities sold and the probability that a given quantity of computers will be regarded as obsolete. In this paper, an e-waste generation model is presented and applied to the following regions: Western and Eastern Europe, Asia/Pacific, Japan/Australia/New Zealand, and North and South America. Furthermore, cumulative computer sales were retrieved for selected countries of these regions in order to compute obsolete computer quantities. To provide robust forecasts, a selection of forecasting models, namely (i) Bass, (ii) Gompertz, (iii) Logistic, (iv) Trend, (v) Level, (vi) AutoRegressive Moving Average (ARMA) and (vii) Exponential Smoothing, was applied, selecting for each country the model giving the best results in terms of minimum in-sample error indices (Mean Absolute Error and Mean Squared Error). Because new technology does not diffuse in all regions of the world at the same speed, owing to different socio-economic factors, the lifespan distribution, which gives the probability that a certain quantity of computers will be considered obsolete, is not adequately modelled in the literature. The forecast horizon is 2014-2030, and the results show a very sharp increase in the USA and the United Kingdom, due to decreasing computer lifespans and increasing sales.
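The accounting behind such a model can be written as a convolution of past sales with a lifespan distribution: obsolete units in year t equal the sum over sale years τ of sales(τ) times P(lifespan = t − τ). A minimal sketch (the sales figures and the Weibull lifespan parameters are illustrative assumptions, not the paper's fitted values):

```python
import math

# E-waste in year t = sum over sale years tau of sales(tau) * P(lifespan = t - tau).
sales = {2014: 10.0, 2015: 11.0, 2016: 12.5, 2017: 13.0}  # million units sold/year

def lifespan_pmf(age, shape=2.0, scale=5.0):
    """P(a computer becomes obsolete at integer age), from a discretised
    Weibull distribution; shape and scale here are assumed, not fitted."""
    cdf = lambda a: 1.0 - math.exp(-((a / scale) ** shape)) if a > 0 else 0.0
    return cdf(age) - cdf(age - 1)

def e_waste(year):
    return sum(q * lifespan_pmf(year - sold) for sold, q in sales.items())

for year in range(2018, 2031):
    print(year, f"{e_waste(year):.2f} million obsolete units")
```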
Abstract:
Atomisation of an aqueous solution for tablet film coating is a complex process, with multiple factors determining droplet formation and properties. The importance of droplet size for an efficient process and a high-quality final product has been noted in the literature, with smaller droplets reported to produce smoother, more homogeneous coatings whilst simultaneously avoiding the risk of damage through over-wetting of the tablet core. In this work the effect of droplet size on tablet film coat characteristics was investigated using X-ray micro-computed tomography (XμCT) and confocal laser scanning microscopy (CLSM). A quality-by-design approach utilising design of experiments (DOE) was used to optimise the conditions necessary for production of droplets at a small (20 μm) and a large (70 μm) droplet size. Droplet size distribution was measured using real-time laser diffraction, and the volume median diameter was taken as the response. DOE yielded information on the relationship that three critical process parameters (pump rate, atomisation pressure and coating-polymer concentration) had with droplet size. The model generated was robust, scoring highly for model fit (R2 = 0.977), predictability (Q2 = 0.837), validity and reproducibility. Modelling confirmed that all parameters had either a linear or a quadratic effect on droplet size and revealed an interaction between pump rate and atomisation pressure. Fluidised bed coating of tablet cores was performed with either small or large droplets, followed by CLSM and XμCT imaging. Addition of commonly used contrast materials to the coating solution improved visualisation of the coating by XμCT, showing the coat as a discrete section of the overall tablet. Imaging provided qualitative and quantitative evidence that smaller droplets formed thinner, more uniform and less porous film coats.
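A minimal sketch of the kind of quadratic response-surface model such a DOE points to, relating droplet size to pump rate and atomisation pressure with their interaction and a squared term (the model form is a plausible reading of the reported effects; all data values below are invented for illustration):

```python
import numpy as np

# Fit droplet D50 = b0 + b1*pump + b2*press + b3*conc + b4*pump*press + b5*pump^2
# by ordinary least squares. The design points and responses are invented.

# columns: pump rate, atomisation pressure, polymer concentration
X_raw = np.array([[10, 1.0,  5], [10, 2.0,  5], [30, 1.0,  5], [30, 2.0,  5],
                  [20, 1.5, 10], [10, 1.5, 10], [30, 1.5, 10], [20, 1.0, 15],
                  [20, 2.0, 15]], dtype=float)
y = np.array([55, 32, 70, 41, 48, 44, 62, 58, 36], dtype=float)  # D50 in um

pump, press, conc = X_raw.T
X = np.column_stack([np.ones_like(pump), pump, press, conc,
                     pump * press, pump ** 2])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)

pred = X @ coef
r2 = 1 - np.sum((y - pred) ** 2) / np.sum((y - y.mean()) ** 2)
print("coefficients:", np.round(coef, 2))
print("R^2 =", round(float(r2), 3))
```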