36 results for architectural project process
Abstract:
The aim of this investigation was to study the chemical reactions occurring during the batchwise production of a butylated melamine-formaldehyde resin, in order to optimise the efficiency and economics of the batch process. The batch process models are largely empirical in nature, as the reaction mechanism is unknown. The process chemistry and the commercial manufacturing method are described. A small scale system, simulating the full scale plant, was established in glass, and the ability to produce laboratory resins of the required quality was demonstrated. During further experiments the chemical reactions of methylolation, condensation and butylation were studied. The important process stages were identified and studied separately. The effects of varying certain process parameters on the chemical reactions were also studied. A published model of methylolation was modified and used to simulate the methylolation stage. A major result of this project was the development of an indirect method for studying the condensation and butylation reactions occurring during the dehydration and acid reaction stages, as direct quantitative methods were not available. A mass balance method was devised for this purpose and used to collect experimental data. The reaction scheme was verified using these data. The reaction stages were simulated using an empirical model. This has revealed new information regarding the mechanism and kinetics of the reactions. Laboratory results were shown to be comparable with plant scale results. This work has improved understanding of the batch process, which can be used to improve product consistency. Future work has been identified and recommended to produce an optimum process and plant design to reduce the batch time.
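The mass balance idea can be illustrated with a minimal sketch. Assuming, purely for illustration (the thesis's actual balance equations are not reproduced here), that each butylation step consumes one mole of butanol and that every condensation or butylation step releases one mole of water, two measured totals suffice to infer both reaction extents:

```python
# Illustrative mass-balance sketch, not the thesis's actual equations:
# butanol is assumed to be consumed only by butylation, and water is
# assumed to be released once per condensation or butylation step.

def reaction_extents(water_evolved_mol, butanol_consumed_mol):
    """Infer extents of butylation and condensation from measured totals."""
    n_butylation = butanol_consumed_mol                 # all BuOH loss -> butylation
    n_condensation = water_evolved_mol - n_butylation   # remaining water -> condensation
    if n_condensation < 0:
        raise ValueError("measured totals inconsistent with assumed scheme")
    return n_butylation, n_condensation

print(reaction_extents(5.0, 2.0))  # -> (2.0, 3.0)
```

The point of the sketch is only that indirect measurements (water evolved, butanol consumed) can stand in for direct quantification of reactions that cannot be observed directly.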
Abstract:
The topic of bioenergy, biofuels and bioproducts remains at the top of the current political and research agenda. Identification of the optimum processing routes for biomass, in terms of efficiency, cost, environment and socio-economics, is vital as concern grows over the remaining fossil fuel resources, climate change and energy security. It is known that the only renewable way of producing conventional hydrocarbon fuels and organic chemicals is from biomass, but the problem remains of identifying the best product mix and the most efficient way of processing biomass to products. The aim is to move Europe towards a biobased economy, and it is widely accepted that biorefineries are key to this development. A methodology was required for the generation and evaluation of biorefinery process chains for converting biomass into one or more valuable products, one that properly considers performance, cost, environment, socio-economics and other factors that influence the commercial viability of a process. In this thesis a methodology to achieve this objective is described. The completed methodology includes process chain generation, process modelling and subsequent analysis and comparison of results in order to evaluate alternative process routes. A modular structure was chosen to allow greater flexibility and to allow the user to generate a large number of different biorefinery configurations. The significance of the approach is that the methodology is defined, and is thus rigorous and consistent, and may be readily re-examined if circumstances change. There was a requirement for consistency in structure and use, particularly for multiple analyses. It was important that analyses could be carried out quickly and easily to consider, for example, different scales, configurations and product portfolios, and so that previous outcomes could be readily reconsidered.
The result of the completed methodology is the identification of the most promising biorefinery chains from those considered as part of the European Biosynergy Project.
Abstract:
As levels of investment in advanced manufacturing systems increase, effective project management becomes ever more critical. This paper demonstrates how the model proposed by Mintzberg, Raisinghani and Theoret in 1976, which structures complicated strategic decision processes, can be applied to the design of new production systems for both descriptive and analytical research purposes. This paper situates a detailed case study concerning the design and development of an advanced manufacturing system within the Mintzberg decision model, and so breaks down the decision sequence into constituent parts. It thus shows how a structured model can provide a framework for the researcher who wishes to study decision episodes in the design of manufacturing facilities in greater depth.
Abstract:
Pyrolysis is one of several thermochemical technologies that convert solid biomass into more useful and valuable bio-fuels. Pyrolysis is thermal degradation in the complete or partial absence of oxygen. Under carefully controlled conditions, solid biomass can be converted to a liquid known as bio-oil in 75% yield on dry feed. Bio-oil can be used as a fuel but has the drawback of a high oxygen content, due to the presence of a complex mixture of molecular fragments of cellulose, hemicellulose and lignin polymers. Bio-oil also has a number of problems in use, including high initial viscosity, instability resulting in increased viscosity or phase separation, and high solids content. Much effort has been spent on upgrading bio-oil into a more usable liquid fuel, either by modifying the liquid or by major chemical and catalytic conversion to hydrocarbons. The primary objective was to improve oil stability by exploring three approaches. The first was to determine the effect of feed moisture content on bio-oil stability. The second was to try to improve bio-oil stability by partially oxygenated pyrolysis. The third was to improve stability by co-pyrolysis with methanol. The project was carried out on an existing laboratory pyrolysis reactor system, which suited this project well without requiring major redesign or modification. During the finishing stages of this project, it was found that the temperature of the condenser in the product collection system had a marked impact on pyrolysis liquid stability. This is discussed in this work and further recommendations are given. The quantity of water coming from the feedstock and the pyrolysis reaction is important to liquid stability. In the present work the feedstock moisture content was varied and pyrolysis experiments were carried out over a range of temperatures. The quality of the bio-oil produced was measured as water content, initial viscosity and stability.
The results showed that moderate feedstock moisture (7.3-12.8%) led to more stable bio-oil. One of the drawbacks of bio-oil is its instability, due to the unstable oxygenated chemicals it contains. Catalytic hydrotreatment of the oil and zeolite cracking of pyrolysis vapour have been discussed by many researchers; these processes are intended to eliminate oxygen from the bio-oil. In this work an alternative approach, partially oxygenated pyrolysis, was introduced in order to reduce oil instability by oxidising unstable oxygenated chemicals in the bio-oil. The results showed that liquid stability was improved by oxygen addition during the pyrolysis of beech wood at an optimum air factor of about 0.09-0.15. Methanol as a post-production additive to bio-oil has been studied by many researchers, and the most effective result came from adding methanol to the oil just after production. Co-pyrolysis of spruce wood with methanol was undertaken in the present work, and it was found that methanol improved liquid stability as a co-pyrolysis solvent but was no more effective than when used as a post-production additive.
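The air factor quoted above is the ratio of air actually supplied to the air required for complete (stoichiometric) combustion of the feed. A minimal sketch, using an assumed stoichiometric requirement of 5.8 kg air per kg dry wood (a typical literature figure, not a value from this work):

```python
def air_factor(air_supplied_kg, fuel_kg, stoich_air_per_kg_fuel=5.8):
    """Ratio of air supplied to the stoichiometric air requirement.

    The default 5.8 kg air / kg dry wood is an assumed, typical value,
    not one taken from the work described above.
    """
    return air_supplied_kg / (fuel_kg * stoich_air_per_kg_fuel)

# At the reported optimum of ~0.09-0.15, only a small fraction of the
# full combustion air is admitted to the reactor:
print(round(air_factor(0.87, 1.0), 2))  # -> 0.15
```

An air factor of 1.0 would correspond to complete combustion; values of 0.09-0.15 mean the process remains overwhelmingly pyrolytic, with only mild partial oxidation.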
Abstract:
This research was undertaken to develop a process for the direct solvent extraction of castor oil seeds. A literature survey confirmed the desirability of establishing such a process, with emphasis on the decortication, size reduction, detoxification-deallergenization, and solvent extraction operations. A novel process was developed for the dehulling of castor seeds, which consists of pressurizing the beans and then suddenly releasing the pressure to vacuum. The degree of dehulling varied according to the pressure applied and the size of the beans. Some of the batches were difficult to hull, and this phenomenon was investigated using the scanning electron microscope and by thickness and compressive strength measurements. The other variables studied to lesser degrees included residence time, moisture content, and temperature. The method was successfully extended to cocoa beans, and (with modifications) to peanuts. The possibility of continuous operation was examined, and a mechanism was suggested to explain how the method works. The work on toxins and allergens included an extensive literature survey on the properties of these substances and the methods developed for their deactivation. Part of the work involved setting up an assay method for measuring their concentration in the beans and cake, but technical difficulties prevented the completion of this aspect of the project. An appraisal of the existing deactivation methods was made in the course of searching for new ones. A new method of reducing the size of oilseeds was introduced in this research: it involved freezing the beans in cardice and milling them in a coffee grinder, and was found to be quick, efficient, and reliable. An application of the freezing technique was successful in dehulling soybeans and de-skinning peanut kernels. The literature on the solvent extraction of oilseeds, especially castor, was reviewed; the survey covered processes, equipment, solvents, and the mechanism of leaching.
Three solvents were experimentally investigated: cyclohexane, ethanol, and acetone. Extraction with liquid ammonia and liquid butane was not effective under the conditions studied. Based on the results of the research, a process has been suggested for the direct solvent extraction of castor seeds, the various sections of the process have been analysed, and the factors affecting the economics of the process are discussed.
Abstract:
Despite the voluminous studies written about organisational innovation over the last 30-40 years, our understanding of this phenomenon continues to be inconsistent and inconclusive (Wolfe, 1994). An assessment of the theoretical and methodological issues influencing the explanatory utility of many studies has led scholars (e.g. Slappendel, 1996) to re-evaluate the assumptions used to ground studies. Building on these criticisms, the current study contributes to the development of an interactive perspective of organisational innovation. This work contributes empirically and theoretically to an improved understanding of the innovation process and the interaction between the realm of action and the mediating effects of pre-existing contingencies, i.e. social control, economic exchange and the communicability of knowledge (Scarbrough, 1996). Building on recent advances in institutional theory (see Barley, 1986; 1990; Barley and Tolbert, 1997) and critical theory (Morrow, 1994; Sayer, 1992), the study aims to demonstrate, via longitudinal intensive research, the process through which ideas are translated into reality. This is significant because, despite a growing recognition of the implicit link between the strategic conduct of actors and the institutional realm in organisational analysis, there are few examples that theorise and empirically test these connections. By assessing an under-researched example of technology transfer, the government's Teaching Company Scheme (TCS), this project provides a critique of the innovation process that contributes to theory and to our appreciation of change in the UK government's premier technology transfer scheme (QR, 1996). Critical moments during the translation of ideas illustrate how elements linked to social control, economic exchange and communicability mediate the innovation process. Using analytical categories, i.e. contradiction, slippage and dysfunctionality, these are assessed in relation to the actions (coping strategies) of programme members over a two-year period. Drawing on Giddens' (1995) notion of the duality of structure, this study explores the nature of the relationship between the task environment and the institutional environment, demonstrating how and why knowledge is both an enabler of and a barrier to organisational innovation.
Abstract:
In recent years there has been a great effort to combine the technologies and techniques of GIS and process models. This project examines the issues of linking a standard current generation 2½D GIS with several existing model codes. The focus for the project has been the Shropshire Groundwater Scheme, which is being developed to augment flow in the River Severn during drought periods by pumping water from the Shropshire Aquifer. Previous authors have demonstrated that under certain circumstances pumping could reduce the soil moisture available for crops. This project follows earlier work at Aston in which the effects of drawdown were delineated and quantified through the development of a software package implementing a technique that brought together the significant spatially varying parameters. This technique is repeated here, but using a standard GIS called GRASS. The GIS proved adequate for the task, and the added functionality provided by the general purpose GIS - the data capture, manipulation and visualisation facilities - was of great benefit. The bulk of the project is concerned with examining the issues of linking GIS and environmental process models. To this end a groundwater model (Modflow) and a soil moisture model (SWMS2D) were linked to the GIS, and a crop model was implemented within the GIS. A loose-linked approach was adopted, and secondary and surrogate data were used wherever possible.
The implications of this work relate to: justification of a loose-linked versus a closely integrated approach; how, technically, to achieve the linkage; how to reconcile the different data models used by the GIS and the process models; control of the movement of data between models of environmental subsystems, in order to model the total system; the advantages and disadvantages of using a current generation GIS as a medium for linking environmental process models; generation of input data, including the use of geostatistics, stochastic simulation, remote sensing, regression equations and mapped data; issues of accuracy, uncertainty and simply providing adequate data for the complex models; and how such a modelling system fits into an organisational framework.
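The loose-linked approach can be sketched as a thin conversion layer through which the GIS and a process model exchange plain-text array files, each program otherwise running unmodified. The file format and names below are invented for illustration and are not the actual formats of GRASS, Modflow or SWMS2D:

```python
import os
import tempfile

def grid_to_model_input(grid, path):
    """Write a raster (list of rows of numbers) as a whitespace-delimited
    text array, the kind of input many process model codes accept."""
    with open(path, "w") as f:
        for row in grid:
            f.write(" ".join(str(v) for v in row) + "\n")

def model_output_to_grid(path):
    """Read a model's text array output back into a raster for the GIS."""
    with open(path) as f:
        return [[float(v) for v in line.split()] for line in f]

# Round-trip check standing in for the "GIS -> model -> GIS" data movement:
raster = [[1.0, 2.5], [0.0, 4.25]]
path = os.path.join(tempfile.mkdtemp(), "layer.asc")
grid_to_model_input(raster, path)
assert model_output_to_grid(path) == raster
```

The design trade-off is the one discussed above: file exchange keeps each code independent and replaceable, at the cost of explicit conversion steps between the differing data models.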
Abstract:
After a brief review of the various forms of thermal spraying equipment and processes, descriptions of the basic principles involved and the general functions for which thermally sprayed coatings are used are given. The background of the collaborating company, Metallisation, is described and their position in the overall market discussed, providing a backdrop against which the appropriateness of various project options might be judged. Current arc-spraying equipment is then examined, firstly in terms of the workings of their constituent parts and subsequently by examining the effects of changes in design and in operating parameters both upon equipment operation and the coatings produced. Published literature relating to these matters is reviewed. Literature relating to the production, comminution and propulsion of the particles which form the spray is discussed as are the mechanisms involved at impact with the substrate. Literature on the use of rockets for thermal spraying and induction heating as a process for feedstock melting are also reviewed. Three distinct options for further study are derived and preliminary tests and costings made to allow one option alone, the use of rocket acceleration, to go forward to the experimental phase. A suitable rocket burner was developed, tested and incorporated into an arc-spray system so that the sprayability of the whole could be assessed. Coatings were made using various parameters and these are compared with coatings produced by a standard system. Coatings were examined for macro and micro hardness, cohesive strength, porosity and by microstructural examination. The results indicate a high degree of similarity between the coatings produced by the standard system and the high velocity system. This was surprising in view of the very different atomising media and velocities. 
Possible causes for this similarity and the general behaviour of the new system and the standard system are discussed before the study concludes that the hypothesis that an increase in particle velocity would improve the mechanical properties of arc-sprayed steel coatings was not proven. KEY WORDS: Sprayed metal coatings, Electric arc spraying, High velocity flame spraying, Sprayed coating properties
Abstract:
A survey of the existing state-of-the-art of turbine blade manufacture highlights two operations that have not been automated, namely the loading of a turbine blade into an encapsulation die, and the removal of a machined blade from the encapsulation block. The automation of blade decapsulation has not been pursued. In order to develop a system to automate the loading of an encapsulation die, a prototype mechanical handling robot has been designed together with a computer controlled encapsulation die. The robot has been designed as a mechanical handling robot of cylindrical geometry, suitable for use in a circular work cell. It is the prototype for a production model to be called 'The Cybermate'. The prototype robot is mechanically complete, but due to unforeseen circumstances the robot control system is not available (the development of the control system did not form a part of this project); hence it has not been possible to fully test and assess the robot mechanical design. Robot loading of the encapsulation die has thus been simulated. The research work with regard to the encapsulation die has focused on the development of computer controlled, hydraulically actuated location pins. Such pins compensate for the inherent positional inaccuracy of the loading robot and reproduce the dexterity of the human operator. Each pin comprises a miniature hydraulic cylinder, controlled by a standard bidirectional flow control valve. Precise positional control is obtained through pulsing of the valves under software control, with positional feedback from an 8-bit transducer. A test-rig comprising one hydraulic location pin together with an opposing spring loaded pin has demonstrated that such a pin arrangement can be controlled with a repeatability of +/- 0.00045".
In addition, this test-rig has demonstrated that such a pin arrangement can be used to gauge and compensate for the dimensional error of the component held between the pins, by offsetting the pin datum positions to allow for the component error. A gauging repeatability of +/- 0.00015" was demonstrated. This work has led to the design and manufacture of an encapsulation die comprising ten such pins and the associated computer software. All aspects of the control software except blade gauging and positional data storage have been demonstrated. Work is now required to achieve the accuracy of control demonstrated by the single pin test-rig with each of the ten pins in the encapsulation die. This would allow trials of the complete loading cycle to take place.
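The pulsed-valve positioning described above can be illustrated with a toy closed-loop sketch: the valve is opened in short pulses and pulsing stops once the 8-bit transducer reading falls within a deadband of the target. The pin model, counts-per-pulse gain and deadband are invented for illustration, not taken from the test-rig:

```python
def position_pin(target_counts, read_position, pulse_valve, deadband=1, max_pulses=500):
    """Pulse the valve until the transducer reading is within the deadband."""
    for _ in range(max_pulses):
        error = target_counts - read_position()
        if abs(error) <= deadband:
            return True                          # settled within tolerance
        pulse_valve(direction=1 if error > 0 else -1)
    return False                                 # failed to settle

class FakePin:
    """Toy stand-in for the hydraulic cylinder and 8-bit transducer."""
    def __init__(self):
        self.pos = 0.0

    def read(self):
        return round(self.pos)                   # 8-bit counts (0-255)

    def pulse(self, direction):
        self.pos += 0.8 * direction              # assumed counts per pulse

pin = FakePin()
assert position_pin(200, pin.read, pin.pulse)
print(pin.read())  # settles within 1 count of the 200-count target
```

The same structure also supports the gauging idea: offsetting `target_counts` by a measured component error shifts the datum without changing the control loop.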
Abstract:
The process framework comprises three phases: scope the supply chain/network; identify the options for supply system architecture; and select the supply system architecture. It facilitates a structured approach that analyses the supply chain/network contextual characteristics, in order to ensure alignment with the appropriate supply system architecture. The process framework was derived from a comprehensive literature review and archival case study analysis. The review led to the classification of supply system architectures according to their orientation: integrated, partially integrated, co-ordinated or independent. The classification was combined with the characteristics that influence the selection of supply system architecture to encapsulate the conceptual framework. It builds upon existing frameworks and methodologies by focusing on structured procedure, supporting project management, facilitating participation and clarifying the point of entry. The process framework was initially tested in three case study applications from the food, automobile and hand tool industries. A variety of industrial settings was chosen to illustrate transferability. The case study applications indicate that the process framework is a valid approach to the problem; however, further testing is required. In particular, the use of group support system technologies to support the process, and the steps involving the participation of software vendors, need further testing. However, the process framework can be followed easily due to the clarity of its presentation. It considers the issue of timing by including alternative decision-making techniques, dependent on the constraints. It is useful for ensuring a sound business case is developed, with supporting documentation and analysis that identifies the strategic and functional requirements of the supply system architecture.
Abstract:
Protein crystallization has gained a new strategic and commercial relevance in the postgenomic era due to its pivotal role in structural genomics. Producing high quality crystals has always been a bottleneck to efficient structure determination, and this problem is becoming increasingly acute. This is especially true for challenging, therapeutically important proteins that typically do not form suitable crystals. The OptiCryst consortium has focused on relieving this bottleneck by making a concerted effort to improve the crystallization techniques usually employed, designing new crystallization tools, and applying such developments to the optimization of target protein crystals. In particular, the focus has been on the novel application of dual polarization interferometry (DPI) to detect suitable nucleation; the application of in situ dynamic light scattering (DLS) to monitor and analyze the process of crystallization; the use of UV-fluorescence to differentiate protein crystals from salt; the design of novel nucleants and seeding technologies; and the development of kits for capillary counterdiffusion and crystal growth in gels. The consortium collectively handled 60 new target proteins that had not been crystallized previously. From these, we generated 39 crystals with improved diffraction properties. Fourteen of these 39 were only obtainable using OptiCryst methods. For the remaining 25, OptiCryst methods were used in combination with standard crystallization techniques. Eighteen structures have already been solved (30% success rate), with several more in the pipeline.
Abstract:
The Library of Birmingham (LoB) is a £193 million project designed to provide a new space for lifelong learning and knowledge growth, a physical and virtual portal for Birmingham's citizens to the wider world. In cooperation with a range of private, public, and third-sector bodies, as well as individual citizens, the library, due to open in June 2013, will articulate a continuing process of organic growth and emergence. Key delivery themes focus on: arts and creativity, citizenship and community, enterprise and innovation, learning and skills, and the new media ecology. A landmark design in the heart of the cultural district of the city, the LoB aims to stimulate sustainable economic growth, urban regeneration and social inclusion by offering a wide range of new digital learning services, real and virtual community spaces, and new opportunities for interpreting and exploiting internationally significant collections of documentary archives, photography, moving image, and rare printed books. Additionally, the LoB will offer physical space for creative, cultural, enterprise, and knowledge development. This paper outlines the cultural and educational thinking that informs the project and the challenges experienced in developing innovative service redesign.
Abstract:
Methodologies for understanding business processes and their information systems (IS) are often criticized, either for being too imprecise and philosophical (a criticism often levied at softer methodologies) or too hierarchical and mechanistic (levied at harder methodologies). The process-oriented holonic modelling methodology combines aspects of softer and harder approaches to aid modellers in designing business processes and associated IS. The methodology uses holistic thinking and a construct known as the holon to build process descriptions into a set of models known as a holarchy. This paper describes the methodology through an action research case study based in a large design and manufacturing organization. The scientific contribution is a methodology for analysing business processes in environments that are characterized by high complexity, low volume and high variety where there are minimal repeated learning opportunities, such as large IS development projects. The practical deliverables from the project gave IS and business process improvements for the case study company.
Abstract:
This study proposes an integrated analytical framework for effective management of project risks using a combined multiple criteria decision-making technique and decision tree analysis. First, a conceptual risk management model was developed through a thorough literature review. The model was then applied through action research on a petroleum oil refinery construction project in central India in order to demonstrate its effectiveness. Oil refinery construction projects are risky because of technical complexity, resource unavailability, involvement of many stakeholders and strict environmental requirements. Although project risk management has been researched extensively, a practical and easily adoptable framework is missing. In the proposed framework, risks are identified using a cause and effect diagram, analysed using the analytic hierarchy process, and responses are developed using a risk map. Additionally, decision tree analysis allows modelling of various options for risk response development and optimises the selection of a risk mitigating strategy. The proposed risk management framework could be easily adopted and applied in any project and integrated with other project management knowledge areas.
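The analytic hierarchy process step mentioned above can be sketched as deriving priority weights from a pairwise comparison matrix. The row geometric mean used here is a standard approximation to the principal eigenvector; the example risks and judgements are hypothetical, not taken from the refinery case:

```python
# AHP sketch: priority weights from a pairwise comparison matrix via the
# row geometric mean, normalised to sum to 1 (requires Python 3.8+ for
# math.prod). Example judgements are on Saaty's 1-9 scale and are invented.
import math

def ahp_weights(matrix):
    """Priority weights via the normalised row geometric mean."""
    n = len(matrix)
    gmeans = [math.prod(row) ** (1.0 / n) for row in matrix]
    total = sum(gmeans)
    return [g / total for g in gmeans]

# Hypothetical risks: technical complexity vs resource unavailability vs
# environmental compliance (row i vs column j: how much more important i is).
pairwise = [
    [1,     3,   5],
    [1 / 3, 1,   2],
    [1 / 5, 1 / 2, 1],
]
weights = ahp_weights(pairwise)
print([round(w, 3) for w in weights])  # highest weight first
```

The resulting weights rank the risks for response planning; the risk map and decision tree stages described above would then consume these priorities.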
Abstract:
This project is focused on exchanging knowledge between ABS, UKBI and managers of business incubators in the UK. The project relates to the exploitation of the extant knowledge-base on assessing and improving business incubation management practice and performance, and builds on two earlier studies. It addresses a pressing need for assessing and benchmarking business incubation input, process and outcome performance, and for highlighting best practice. The overarching aim of this project was to obtain proof-of-concept for a business incubation performance assessment and benchmarking online tool, fine-tune it, and put it into use by nurturing a community of business incubation management practice aligned by the resultant tool. The purpose was to offer an appropriate set of measures, in areas identified as critical by relevant research on business incubation performance management and impact, against which: 1. the input and process performance of business incubation management practice can be assessed and benchmarked within the auspices of a community of incubator managers concerned with best practice; 2. the outcome performance and impact of business incubators can be assessed longitudinally. As such, the developed online assessment framework is geared towards the needs of researchers, policy makers and practitioners concerned with business incubation performance, added value and impact.