91 results for Manufacturing processes parameters
Abstract:
This paper is based on a major research project run by a team from the Innovation, Design and Operations Management Research Unit at the Aston Business School under SERC funding. International Computers Limited (ICL), the UK's largest indigenous manufacturer of mainframe computer products, was the main industrial collaborator in the research. During the period 1985-89 an integrated production system termed the "Modular Assembly Cascade" was introduced to the Company's mainframe assembly plant at Ashton-under-Lyne near Manchester. Using a methodology primarily based upon 'participative observation', the researchers developed a model for analysing this manufacturing system design called "DRAMA". Following a critique of the existing literature on Manufacturing Strategy, this paper describes the basic DRAMA model and its development from an industry-specific design methodology to DRAMA II, a generic model for studying organizational decision processes in the design and implementation of production systems. From this, the potential contribution of the DRAMA model to the existing knowledge on the process of manufacturing system design will be apparent.
Abstract:
Gelatin is a principal excipient used as a binder in the formulation of lyophilized orally disintegrating tablets. The current study focuses on exploiting the physicochemical properties of gelatin by varying formulation parameters to determine their influence on orally disintegrating tablet (ODT) characteristics. Process parameters, namely the pH and ionic strength of the formulations, and ball milling were investigated to observe their effects on excipient characteristics and tablet formation. The properties and characteristics of the formulations and tablets which were investigated included: glass transition temperature, wettability, porosity, mechanical properties, disintegration time, morphology of the internal structure of the freeze-dried tablets, and drug dissolution. The results from the pH study revealed that adjusting the pH of the formulation away from the isoelectric point of gelatin resulted in an improvement in tablet disintegration time, possibly due to an increase in gelatin swelling resulting in greater tablet porosity. The results from the ionic strength study revealed that the inclusion of sodium chloride influenced tablet porosity, tablet morphology and the glass transition temperature of the formulations. Data from the milling study showed that milling the excipients influenced formulation characteristics, namely wettability and powder porosity. The study concludes that alterations of simple parameters such as pH and salt concentration have a significant influence on the formulation of ODTs. © 2011 by the authors; licensee MDPI, Basel, Switzerland.
Abstract:
Reliability modelling and verification are indispensable in modern manufacturing, especially for product development risk reduction. Following a discussion of the deficiencies of traditional reliability modelling methods for process reliability, a novel modelling method is presented herein that draws upon a knowledge network of process scenarios based on the analytic network process (ANP). An integration framework of manufacturing process reliability and product quality is presented together with a product development and reliability verification process. According to the roles of key characteristics (KCs) in manufacturing processes, KCs are organised into four clusters, that is, product KCs, material KCs, operation KCs and equipment KCs, which represent the process knowledge network of manufacturing processes. A mathematical model and algorithm are developed for calculating the reliability requirements of KCs with respect to different manufacturing process scenarios. A case study on valve-sleeve component manufacturing is provided as an application example of the new reliability modelling and verification procedure. The methodology is applied to the valve-sleeve component manufacturing processes to manage and deploy production resources.
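The abstract does not reproduce the ANP formulation itself. As a hedged illustration of the kind of calculation an ANP-based model involves, the sketch below derives priority weights for a small cluster of key characteristics from a pairwise comparison matrix using the principal-eigenvector method; the comparison values and KC labels are hypothetical, not taken from the paper.

```python
# Hedged sketch: principal-eigenvector priority weights for one ANP cluster.
# The pairwise comparison values and KC labels are illustrative only.
import numpy as np

kcs = ["product KC", "material KC", "operation KC", "equipment KC"]

# Saaty-style pairwise comparison matrix: A[i, j] is the judged importance
# of kcs[i] relative to kcs[j]; reciprocal by construction.
A = np.array([
    [1.0, 3.0, 5.0, 2.0],
    [1/3, 1.0, 2.0, 1/2],
    [1/5, 1/2, 1.0, 1/3],
    [1/2, 2.0, 3.0, 1.0],
])

# The principal eigenvector gives the priority (weight) vector.
eigvals, eigvecs = np.linalg.eig(A)
principal = np.argmax(eigvals.real)
weights = np.abs(eigvecs[:, principal].real)
weights /= weights.sum()

# Consistency check (Saaty random index for n = 4 is 0.90).
lambda_max = eigvals.real[principal]
ci = (lambda_max - len(kcs)) / (len(kcs) - 1)
cr = ci / 0.90

for name, w in zip(kcs, weights):
    print(f"{name}: {w:.3f}")
print(f"consistency ratio: {cr:.3f}")
```

In a full ANP these cluster-level priorities would feed a supermatrix linking the four KC clusters; the sketch only shows the weighting step.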
Abstract:
In today’s modern manufacturing industry there is an increasing need to improve internal processes to meet diverse client needs. Process re-engineering is an important activity that is well understood by industry, but its rate of application within small to medium-sized enterprises (SMEs) is less developed. Business pressures shift the focus of SMEs toward winning new projects and contracts rather than developing long-term, sustainable manufacturing processes. Variations in manufacturing processes are inevitable, but the amount of non-conformity often exceeds acceptable levels. This paper is focused on the re-engineering of the manufacturing and verification procedure for discrete parts production with the aim of enhancing process control and product verification. The ideologies of the 'Push' and 'Pull' approaches to manufacturing are useful in the context of process re-engineering for data improvement. Currently, information is pulled from the market and prominent customers, and manufacturing companies always try to make the right product by following customer procedures that attempt to verify against specifications. This approach can result in significant quality control challenges. The aim of this paper is to highlight the importance of process re-engineering in product verification in SMEs. Leadership, culture, ownership and process management are among the main attributes required for the successful deployment of process re-engineering. This paper presents the findings from a case study showcasing the application of a modified re-engineering method for the manufacturing and verification process. The findings from the case study indicate that there are several advantages to implementing the re-engineering method outlined in this paper.
Abstract:
A wide range of studies have shown that liposomes can act as suitable adjuvants for a range of vaccine antigens. Properties such as their amphiphilic character and biphasic nature allow them to incorporate antigens within the lipid bilayer, on the surface, or encapsulated within the inner core. However, appropriate methods for the manufacture of liposomes are limited, and this has resulted in issues with cost, supply, and wider-scale application of these systems. Within this chapter we explore manufacturing processes that can be used for the production of liposomal adjuvants, and we outline new manufacturing methods that can offer fast, scalable, and cost-effective production of liposomal adjuvants.
Abstract:
This PhD thesis belongs to three main knowledge domains: operations management, environmental management, and decision making. With the automotive industry as the key sector, the investigation was undertaken with the aim of deepening the understanding of environmental decision-making processes in the operations function. The central research question for this thesis is: why and how do manufacturing companies take environmental decisions? This PhD research project used a case study research strategy supplemented by secondary data analysis and the testing and evaluation of a proposed systems thinking model for environmental decision making. Interviews and focus groups were the main methods of data collection. The findings of the thesis show that companies that want to achieve environmental leadership will need to take environmental decisions beyond manufacturing processes. Because the benefits (including financial gain) of non-manufacturing activities are not yet clear, the decisions related to product design, supply chain and facilities are deeply embedded with complexity, subjectivity, and intrinsic risk. Nevertheless, this is the challenge environmental leaders will face: they may enter a paradoxical state in which, although the risk of going greener is high, the risk of not doing so is even higher.
Abstract:
This thesis addresses the creation of fibre Bragg grating based sensors and the fabrication systems which are used to manufacture them. The information is presented primarily as experimental evidence, supported by current theoretical concepts. The issues involved in fabricating high quality fibre Bragg gratings are systematically investigated. Sources of error in the manufacturing processes are detected, analysed and reduced to allow higher quality gratings to be fabricated. The use of chirped Moiré gratings as distributed sensors is explored; the spatial resolution is increased beyond that of any previous work, and the use of the gratings as distributed load sensors is also presented. Chirped fibre Bragg gratings are shown to be capable of operating as in-situ wear sensors, capable of accurately measuring the wear or erosion of the surface of a material. Two methods of measuring the wear are compared, giving a comparison between an expensive high-resolution method and a cheap lower-resolution method. The wear sensor is also shown to be capable of measuring the physical size and location of damage induced on the surface of a material. An array method is demonstrated that provides high survivability, such that the array may be damaged yet continue to operate with minimal degradation in performance.
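As background (standard relations, not stated in the abstract itself), the sensing principle behind these gratings is the Bragg condition, with strain and temperature read out from the shift in the reflected wavelength:

```latex
% Standard fibre Bragg grating relations, given as background only.
% Bragg (reflected) wavelength for effective index n_eff and grating period Lambda:
\lambda_B = 2\, n_{\mathrm{eff}}\, \Lambda
% Fractional wavelength shift under axial strain \epsilon and temperature change \Delta T:
\frac{\Delta\lambda_B}{\lambda_B} = (1 - p_e)\,\epsilon + (\alpha + \xi)\,\Delta T
```

Here p_e is the effective photo-elastic coefficient, alpha the thermal-expansion coefficient and xi the thermo-optic coefficient of the fibre; a chirped grating makes the period vary along its length, which is what enables the distributed and wear measurements described above.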
Abstract:
Two key issues defined the focus of this research in manufacturing plasmid DNA for use in human gene therapy. First, the processing of E. coli bacterial cells to effect the separation of therapeutic plasmid DNA from cellular debris and adventitious material. Second, the affinity purification of the plasmid DNA in a simple one-stage process. The need arises when considering the concerns recently voiced by the FDA regarding the scalability and reproducibility of current manufacturing processes in meeting the quality criteria of purity, potency, efficacy, and safety for a recombinant drug substance for use in humans. To develop a preliminary purification procedure, an EFD cross-flow micro-filtration module was assessed for its ability to effect the 20-fold concentration, 6-fold diafiltration, and final clarification of the plasmid DNA from the cell lysate derived from a 1 liter E. coli bacterial cell culture. Historically, cross-flow filtration modules employed within procedures for harvesting cells from bacterial cultures have failed to reach the required standards dictated by existing continuous centrifuge technologies, frequently resulting in the rapid blinding of the membrane with bacterial cells, which substantially reduces the permeate flux. The EFD module, containing six helically wound tubular membranes that promote centrifugal instabilities known as Dean vortices, was challenged with distilled water between Dean numbers of 187 and 818 and transmembrane pressures (TMP) of 0 to 5 psi. The data demonstrated that the fluid dynamics significantly influenced the permeation rate, displaying a maximum at a Dean number of 227 (312 lmh) and a minimum at 818 (130 lmh) for a transmembrane pressure of 1 psi. Numerical studies indicated that the initial increase and subsequent decrease resulted from a competition between the centrifugal and viscous forces that create the Dean vortices. At Dean numbers between 187 and 227, the forces combine constructively to increase the apparent strength and influence of the Dean vortices. However, as the Dean number increases above 227, the centrifugal force dominates the viscous forces, compressing the Dean vortices into the membrane walls and reducing their influence on the radial transmembrane pressure, i.e. the permeate flux is reduced. When investigating the action of the Dean vortices in controlling the fouling rate of E. coli bacterial cells, it was demonstrated that the optimum cross-flow rate at which to effect the concentration of a bacterial cell culture was at a Dean number of 579 and 3 psi TMP, processing in excess of 400 lmh for 20 minutes (i.e., concentrating a 1 L culture to 50 ml in 10 minutes at an average of 450 lmh). The data demonstrated a conflict between the Dean number at which the shear rate could control the cell fouling and the Dean number at which the optimum flux enhancement was found; hence, the internal geometry of the EFD module was shown to be sub-optimal for this application. At a Dean number of 579 and 3 psi TMP, the 6-fold diafiltration was shown to occupy 3.6 minutes of process time, processing at an average flux of 400 lmh. Again, at a Dean number of 579 and 3 psi TMP, the clarification of the plasmid from the resulting freeze-thaw cell lysate was achieved at 120 lmh, passing 83% (2.5 mg) of the plasmid DNA (6.3 ng μl-1), 10.8 mg of genomic DNA (~23,000 bp, 36 ng μl-1), and 7.2 mg of cellular proteins (5-100 kDa, 21.4 ng μl-1) into the post-EFD process stream.
Hence the EFD module was shown to be effective, achieving the desired objectives in approximately 25 minutes. On the basis of its ability to intercalate into low molecular weight dsDNA present in dilute cell lysates, and to be electrophoresed through agarose, the fluorophore PicoGreen was selected for the development of a suitable dsDNA assay. It was assessed for its accuracy and reliability in determining the concentration and identity of DNA present in samples that were electrophoresed through agarose gels. The signal emitted by intercalated PicoGreen was shown to be constant and linear, and the mobility of the PicoGreen-DNA complex was not affected by the intercalation. Concerning the secondary purification procedure, various anion-exchange membranes were assessed for their ability to capture plasmid DNA from the post-EFD process stream. For a commercially available Sartorius Sartobind Q15 membrane, the reduction in the equilibrium binding capacity for ctDNA in buffers of increasing ionic strength demonstrated that DNA was being adsorbed by electrostatic interactions only. However, the problems associated with fluid distribution across the membrane demonstrated that the membrane housing was the predominant cause of the erratic breakthrough curves. Consequently, this would need to be rectified before such a membrane could be integrated into the current system, or indeed be scaled beyond laboratory scale. However, when challenged with the process material, the data showed that considerable quantities of protein (1150 μg) were adsorbed preferentially to the plasmid DNA (44 μg). This was also shown for Pall Gelman UltraBind US450 membranes that had been functionalised with poly-L-lysine and polyethyleneimine ligands of varying molecular weight. Hence the anion-exchange membranes were shown to be ineffective in capturing plasmid DNA from the process stream. Finally, work was performed to integrate a sequence-specific DNA-binding protein into a single-stage DNA chromatography step, isolating plasmid DNA from E. coli cells whilst minimising the contamination from genomic DNA and cellular protein. Preliminary work demonstrated that the fusion protein was capable of isolating pUC19 DNA into which the recognition sequence for the fusion protein had been inserted (pTS DNA) when in the presence of the conditioned process material. Although the pTS recognition sequence differs from native pUC19 sequences by only 2 bp, the fusion protein was shown to act as a highly selective affinity ligand for pTS DNA alone. Subsequently, the process was scaled up 25-fold and positioned directly following the EFD system. In conclusion, the integration of the EFD micro-filtration system and the zinc-finger affinity purification technique resulted in approximately 1 mg of plasmid DNA being purified from 1 L of E. coli culture in a simple two-stage process, with complete removal of genomic DNA and removal of 96.7% of cellular protein in less than 1 hour of process time.
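The abstract reports operating points as Dean numbers. As a hedged illustration of how such a value relates to the flow conditions in a coiled tubular membrane, the sketch below evaluates the standard definition De = Re * sqrt(d / (2 Rc)); the tube diameter, coil radius and flow rates used are placeholders, not the EFD module's actual geometry.

```python
# Hedged sketch: Dean number for flow in a helically wound tubular membrane.
# Geometry and flow values are illustrative placeholders only.
import math

def dean_number(flow_rate_m3s: float, tube_diameter_m: float,
                coil_radius_m: float, density: float = 998.0,
                viscosity: float = 1.0e-3) -> float:
    """De = Re * sqrt(d / (2 * Rc)), with Re based on the mean tube velocity."""
    area = math.pi * (tube_diameter_m / 2) ** 2
    velocity = flow_rate_m3s / area
    reynolds = density * velocity * tube_diameter_m / viscosity
    return reynolds * math.sqrt(tube_diameter_m / (2 * coil_radius_m))

# Example: water at ~20 C through a 3 mm bore tube wound on a 15 mm radius coil.
for flow_lpm in (0.5, 1.0, 2.0):
    de = dean_number(flow_lpm / 1000 / 60, 3e-3, 15e-3)
    print(f"{flow_lpm:.1f} L/min -> Dean number ~ {de:.0f}")
```

Raising the cross-flow rate raises the Dean number, which is why the abstract's fouling control and flux enhancement optima are expressed on that single dimensionless scale.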
Abstract:
The work described in this thesis is directed towards the reduction of noise levels in the Hoover Turbopower upright vacuum cleaner. The experimental work embodies a study of such factors as the application of noise source identification techniques, investigation of the noise-generating principles for each major source, and evaluation of noise-reducing treatments. It was found that the design of the vacuum cleaner had not been optimised from the standpoint of noise emission. Important factors such as noise 'windows', isolation of vibration at the source, panel rattle, resonances and critical speeds had not been considered. Therefore, a number of experimentally validated treatments are proposed. Their noise reduction benefit, together with material and tooling costs, is presented. The solutions to the noise problems were evaluated on a standard Turbopower, and the sound power level of the cleaner was reduced from 87.5 dB(A) to 80.4 dB(A) at a cost of 93.6 pence per cleaner. The designers' lack of experience in noise reduction was identified as one of the factors behind the low priority given to noise during design of the cleaner. Consequently, the fundamentals of acoustics, the principles of noise prediction and absorption, and guidelines for good acoustical design were collated into a Handbook and circulated at Hoover plc. Mechanical variations during production of the motor and the cleaner were found to be important; these caused a wide spread in the noise levels of the cleaners. Subsequently, the manufacturing processes were briefly studied to identify their source, and recommendations for improvement are made. The noise of a product is quality related, and a high level of noise is considered to be a bad feature. This project suggested that the noise level be used constructively, both as a test on the production line to identify cleaners above a certain noise level and also to promote the product by 'designing' the characteristics of the sound so that the appliance is pleasant to the user. This project showed that good noise control principles should be implemented early in the design stage. As yet there are no mandatory noise limits or noise-labelling requirements for household appliances. However, the literature suggests that noise-labelling is likely in the near future and that the requirement will be to display the A-weighted sound power level. The 'noys' scale of perceived noisiness was nevertheless found more appropriate for rating appliance noise: it is linear, so a sound that seems twice as loud has twice the value in noys, and it also takes into consideration the presence of pure tones, which can lead to annoyance even in the absence of a high overall noise level.
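For reference (simple arithmetic from the standard decibel relation, not taken from the thesis), the reported drop from 87.5 dB(A) to 80.4 dB(A) corresponds to roughly a five-fold reduction in A-weighted sound power:

```latex
% Sound power level definition and the power ratio implied by the quoted figures.
L_W = 10 \log_{10}\!\left(\frac{W}{W_0}\right)\ \mathrm{dB}, \qquad W_0 = 10^{-12}\ \mathrm{W}
\frac{W_{\mathrm{before}}}{W_{\mathrm{after}}} = 10^{(87.5 - 80.4)/10} = 10^{0.71} \approx 5.1
```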
Abstract:
Traditional machinery for manufacturing processes is characterised by actuators powered and co-ordinated by mechanical linkages driven from a central drive. Increasingly, these linkages are replaced by independent electrical drives, each performing a different task and following a different motion profile, co-ordinated by computers. A design methodology for the servo control of high-speed multi-axis machinery is proposed, based on the concept of a highly adaptable generic machine model. In addition to the dynamics of the drives and the loads, the model includes the inherent interactions between the motion axes and thus provides a Multi-Input Multi-Output (MIMO) description. In general, inherent interactions such as structural couplings between groups of motion axes are undesirable and need to be compensated. On the other hand, imposed interactions such as the synchronisation of different groups of axes are often required. It is recognised that a suitable MIMO controller can simultaneously achieve these objectives and reconcile their potential conflicts. Both analytical and numerical methods for the design of MIMO controllers are investigated. At present, it is not possible to implement high-order MIMO controllers for practical reasons. Based on simulations of the generic machine model under full MIMO control, however, it is possible to determine a suitable topology for a blockwise decentralised control scheme. The Block Relative Gain array (BRG) is used to compare the relative strength of closed-loop interactions between sub-systems. A number of approaches to the design of the smaller decentralised MIMO controllers for these sub-systems have been investigated. For the purpose of illustration, a benchmark problem based on a three-axis test rig has been carried through the design cycle to demonstrate the working of the design methodology.
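The Block Relative Gain mentioned in the abstract generalises the scalar Relative Gain Array. As a hedged illustration of the underlying idea, the sketch below computes the ordinary RGA, Lambda = G (elementwise) * transpose(inv(G)), for a small steady-state gain matrix; the gain values are hypothetical and only meant to show how the array flags interaction between axes.

```python
# Hedged sketch: Relative Gain Array for a square steady-state gain matrix.
# The BRG in the abstract is the blockwise generalisation of this scalar measure;
# the gain values here are hypothetical.
import numpy as np

def relative_gain_array(G: np.ndarray) -> np.ndarray:
    """RGA = G elementwise-multiplied by transpose(inverse(G))."""
    return G * np.linalg.inv(G).T

# Illustrative 3x3 steady-state gain matrix for three coupled motion axes.
G = np.array([
    [1.0, 0.3, 0.1],
    [0.2, 1.2, 0.4],
    [0.1, 0.3, 0.9],
])

rga = relative_gain_array(G)
print(np.round(rga, 3))
# Diagonal entries close to 1 suggest weak cross-axis interaction, so decentralised
# axis-by-axis control is reasonable; large off-diagonal entries would argue for
# grouping those axes under one MIMO sub-controller.
```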
Abstract:
A view has emerged within manufacturing and service organizations that the operations management function can hold the key to achieving competitive edge. This has recently been emphasized by the demands for greater variety and higher quality, which must be set against a background of increasing cost of resources. As nations' trade barriers are progressively lowered and removed, so producers of goods and service products are becoming more exposed to competition that may come from virtually anywhere around the world. To simply survive in this climate many organizations have found it necessary to improve their manufacturing or service delivery systems. To become real "winners" some have adopted a strategic approach to operations and completely reviewed and restructured their approach to production system design and operations planning and control. The articles in this issue of the International Journal of Operations & Production Management have been selected to illustrate current thinking and practice in relation to this situation. They are all based on papers presented to the Sixth International Conference of the Operations Management Association-UK which was held at Aston University in June 1991. The theme of the conference was "Achieving Competitive Edge" and authors from 15 countries around the world contributed to more than 80 presented papers. Within this special issue five topic areas are addressed, with two articles relating to each. The topics are: strategic management of operations; managing change; production system design; production control; and service operations. Under strategic management of operations, De Toni, Filippini and Forza propose a conceptual model which considers the performance of an operating system as a source of competitive advantage through the "operation value chain" of design, purchasing, production and distribution. Their model is set within the context of the tendency towards globalization. New's article is somewhat in contrast to the more fashionable literature on operations strategy. It challenges the validity of the current idea of "world-class manufacturing" and, instead, urges a reconsideration of the view that strategic "trade-offs" are necessary to achieve a competitive edge. The importance of managing change has for some time been recognized within the field of organization studies, but its relevance in operations management is now being realized. Berger considers the use of "organization design", "sociotechnical systems" and change strategies and contrasts these with the more recent idea of the "dialogue perspective". A tentative model is suggested to improve the analysis of different strategies in a situation-specific context. Neely and Wilson look at an essential prerequisite if change is to be effected in an efficient way, namely product goal congruence. Using a case study as its basis, their article suggests a method of measuring goal congruence as a means of identifying the extent to which key performance criteria relating to quality, time, cost and flexibility are understood within an organization. The two articles on production systems design represent important contributions to the debate on flexible production organization and autonomous group working. Rosander uses the results from cases to test the applicability of "flow groups" as the optimal way of organizing batch production. Schuring also examines cases to determine the reasons behind the adoption of "autonomous work groups" in The Netherlands and Sweden.
Both these contributions help to provide a greater understanding of the production philosophies which have emerged as alternatives to more conventional systems for intermittent and continuous production. The production control articles are both concerned with the concepts of "push" and "pull", which are the two broad approaches to material planning and control. Hirakawa, Hoshino and Katayama have developed a hybrid model, suitable for multistage manufacturing processes, which combines the benefits of both systems. They discuss the theoretical arguments in support of the system and illustrate its performance with numerical studies. Slack and Correa's concern is with the flexibility characteristics of push and pull material planning and control systems. They use the case of two plants using the different systems to compare their performance within a number of predefined flexibility types. The two final contributions on service operations are complementary. The article by Voss really relates to manufacturing but examines the application of service industry concepts within the UK manufacturing sector. His studies in a number of companies support the idea of the "service factory" and offer a new perspective for manufacturing. Harvey's contribution, by contrast, is concerned with the application of operations management principles in the delivery of professional services. Using the case of social-service provision in Canada, it demonstrates how concepts such as "just-in-time" can be used to improve service performance. The ten articles in this special issue of the journal address a wide range of issues and situations. Their common aspect is that, together, they demonstrate the extent to which competitiveness can be improved via the application of operations management concepts and techniques.
Abstract:
Quality, production and technological innovation management rank among the most important matters of concern to modern manufacturing organisations. They can provide companies with the decisive means of gaining a competitive advantage, especially within industries where there is an increasing similarity in product design and manufacturing processes. The papers in this special issue of International Journal of Technology Management have all been selected as examples of how aspects of quality, production and technological innovation can help to improve competitive performance. Most are based on presentations made at the UK Operations Management Association's Sixth International Conference held at Aston University, at which the theme was 'Getting Ahead Through Technology and People'. At the conference itself over 80 papers were presented by authors from 15 countries around the world. Among the many topics addressed within the conference theme, technological innovation, quality and production management emerged as attracting the greatest concern and interest of delegates, particularly those from industry. For any new initiative to be implemented successfully, it should be led from the top of the organization. Achieving the desired level of commitment from top management can, however, be difficult. In the first paper of this issue, Mackness investigates this question by explaining how systems thinking can help. In the systems approach, properties such as 'emergence', 'hierarchy', 'communication' and 'control' are used to assist top managers in preparing for change. Mackness's paper is then complemented by Iijima and Hasegawa's contribution in which they investigate the development of Quality Information Management (QIM) in Japan. They present the idea of a Design Review and demonstrate how it can be used to trace and reduce quality-related losses. The next paper on the subject of quality is by Whittle and colleagues. It relates to total quality and the process of culture change within organisations. Using the findings of investigations carried out in a number of case study companies, they describe four generic models which have been identified as characterising methods of implementing total quality within existing organisation cultures. Boaden and Dale's paper also relates to the management of quality, but looks specifically at the construction industry where it has been found there is still some confusion over the role of Quality Assurance (QA) and Total Quality Management (TQM). They describe the results of a questionnaire survey of forty companies in the industry and compare them to similar work carried out in other industries. Szakonyi's contribution then completes this group of papers which all relate specifically to the question of quality. His concern is with the two ways in which R&D or engineering managers can work on improving quality. The first is by improving it in the laboratory, while the second is by working with other functions to improve quality in the company. The next group of papers in this issue all address aspects of production management. Umeda's paper proposes a new manufacturing-oriented simulation package for production management which provides important information for both design and operation of manufacturing systems. A simulation for production strategy in a Computer Integrated Manufacturing (CIM) environment is also discussed.
This paper is then followed by a contribution by Tanaka and colleagues in which they consider loading schedules for manufacturing orders in a Material Requirements Planning (MRP) environment. They compare mathematical programming with a knowledge-based approach, and comment on their relative effectiveness for different practical situations. Engstrom and Medbo's paper then looks at a particular aspect of production system design, namely the question of devising group working arrangements for assembly with new product structures. Using the case of a Swedish vehicle assembly plant where long cycle assembly work has been adopted, they advocate the use of a generally applicable product structure which can be adapted to suit individual local conditions. In the last paper of this particular group, Tay considers how automation has affected production efficiency in Singapore. Using data from ten major industries he identifies several factors which are positively correlated with efficiency, with capital intensity being of greatest interest to policy makers. The two following papers examine the case of electronic data interchange (EDI) as a means of improving the efficiency and quality of trading relationships. Banerjee and Banerjee consider a particular approach to material provisioning for production systems using orderless inventory replenishment. Using the example of a single supplier and multiple buyers they develop an analytical model which is applicable for the exchange of information between trading partners using EDI. They conclude that EDI-based inventory control can be attractive from economic as well as other standpoints and that the approach is consistent with and can be instrumental in moving towards just-in-time (JIT) inventory management. Slacker's complementary viewpoint on EDI is from the perspective of the quality relationship between the customer and supplier. Based on the experience of Lucas, a supplier within the automotive industry, he concludes that both banks and trading companies must take responsibility for the development of payment mechanisms which satisfy the requirements of quality trading. The three final papers of this issue relate to technological innovation and are all country based. Berman and Khalil report on a survey of US technological effectiveness in the global economy. The importance of education is supported in their conclusions, although it remains unclear to what extent the US government can play a wider role in promoting technological innovation and new industries. The role of technology in national development is taken up by Martinsons and Valdemars who examine the case of the former Soviet Union. The failure to successfully infuse technology into Soviet enterprises is seen as a factor in that country's demise, and it is anticipated that the newly liberalised economies will be able to encourage greater technological creativity. This point is then taken up in Perminov's concluding paper, which looks in detail at Russia. Here a similar analysis is made of the Soviet Union's technological decline, but a development strategy is also presented within the context of the change from a centralised to a free-market economy. The papers included in this special issue of the International Journal of Technology Management each represent a unique and particular contribution to their own specific area of concern.
Together, however, they also argue for, and demonstrate, the general improvements in competitive performance that can be achieved through the application of modern principles and practice to the management of quality, production and technological innovation.
Abstract:
Purpose - This paper aims to provide empirical results which suggest that there is a need for more widespread adoption of supply chain management among Irish firms. Design/methodology/approach - The Republic of Ireland is a small, open, trade-dependent economy and is one of the fastest growing economies in the developed world. However, due to rising costs, there is an increasing trend in Ireland to outsource lower-function manufacturing processes to lower-cost locations but to retain high-skill functions (such as R&D). This trend, together with other factors such as its peripheral location, suggests that supply chain management is critical from an Irish perspective. In order to gain unique insights into current levels of awareness/adoption of SCM and the potential impact SCM could have on competitiveness, a survey was conducted among 776 Irish firms. Findings - Overall, the findings suggest that many firms in Ireland pay lip-service to the importance of SCM elements and objectives but the majority of firms, about two thirds, have only a passing understanding of what constitutes SCM. Only 25 per cent adopt SCM programmes and only 9 per cent of Irish companies have a specialised SCM or logistics manager. The gaps in their understanding of SCM are matched by the gaps in their awareness of key costs (e.g. 59 per cent of companies do not know their total supply chain costs). While there are supply chain management adopters in Ireland that are already well up the s-curve of innovation transfer, it is the larger group of less aware companies that must become better at how they manage their supply chains. Originality/value - The paper offers a useful insight into supply chain management and its role in Irish industry. © Emerald Group Publishing Limited.
Abstract:
Dwindling oil reserves and growing concerns over carbon dioxide emissions and associated climate change are driving the utilisation of renewable feedstocks as alternative, sustainable fuel sources. Catalysis has a rich history of facilitating energy efficient, selective molecular transformations, and contributes to 90% of current chemical manufacturing processes. In a post-petroleum era, catalysis will be pivotal in overcoming the scientific and engineering barriers to economically feasible bio-fuels. This perspective highlights some recent developments in heterogeneous catalysts for the synthesis of biodiesel from renewable resources, derived from plant and aquatic oil sources. Particular attention will be paid to the importance of catalyst pore architecture, surface polarity and acid and base properties, in meeting the challenge of transforming highly polar and viscous bio-based reactants. © 2012 The Royal Society of Chemistry.
Abstract:
The combination of dwindling oil reserves and growing concerns over carbon dioxide emissions and associated climate change is driving the urgent development of routes to utilise renewable feedstocks as sustainable sources of fuel and chemicals. Catalysis has a rich history of facilitating energy-efficient selective molecular transformations and contributes to 90% of chemical manufacturing processes and to more than 20% of all industrial products. In a post-petroleum era, catalysis will be central to overcoming the engineering and scientific barriers to economically feasible routes to biofuels and chemicals. This chapter will highlight some of the recent developments in heterogeneous catalytic technology for the synthesis of fuels and chemicals from renewable resources, derived from plant and aquatic oil sources as well as lignocellulosic feedstocks. Particular attention will be paid to the challenges faced when developing new catalysts, the importance of considering the design of pore architectures, and the effect of tuning surface polarity to improve catalyst compatibility with highly polar bio-based substrates.