909 results for "process concentrated work"


Relevance: 30.00%

Abstract:

The object of this work was to develop further the idea introduced by Muaddi et al. (1981), which enables some of the disadvantages of earlier destructive adhesion test methods to be overcome. The test is non-destructive in nature, but it does need to be calibrated against a destructive method. Adhesion is determined by measuring the effect of plating on internal friction. This is achieved by determining the damping of vibrations of a resonating specimen before and after plating; the level of adhesion was considered by the above authors to influence the degree of damping. In the major portion of the research work the electrodeposited metal was Watts nickel, which is ductile in nature and is therefore suitable for peel adhesion testing. The base metals chosen were the aluminium alloys S1C and HE9, as it is relatively easy to produce varying levels of adhesion between these substrates and an electrodeposited coating by choosing the appropriate process sequence. S1C is commercially pure aluminium and was used to produce good adhesion; HE9 is a more difficult-to-plate alloy and was chosen to produce poorer adhesion. The "Modal Testing" method used for studying vibrations was investigated as a possible means of evaluating adhesion, but was not successful, and so research was concentrated on the "Q" meter. The method based on the "Q" meter involves exciting vibrations in a sample, interrupting the driving signal, and counting the number of oscillations of the freely decaying vibrations between two known, preselected amplitudes of oscillation. It was not possible to reconstruct a working instrument from Muaddi's thesis (1982), as it contained either a serious error or incomplete information. Hence a modified "Q" meter had to be designed and constructed, but it was then difficult to resonate non-magnetic materials such as aluminium, so a comparison before and after plating could not be made.
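The counting principle described above maps directly onto the standard light-damping relation: the free-decay envelope falls as A_n = A_0·exp(-πn/Q), so counting n oscillations between two preset amplitudes A_start and A_stop gives Q = πn / ln(A_start/A_stop). A minimal sketch of this calculation (function and variable names are illustrative, not from the thesis):

```python
import math

def q_from_decay_count(n_cycles: int, a_start: float, a_stop: float) -> float:
    """Estimate the quality factor Q of a freely decaying resonator.

    For light damping the envelope decays as A_n = A_0 * exp(-pi * n / Q),
    so counting n_cycles oscillations between two preset amplitudes
    a_start > a_stop gives Q = pi * n / ln(a_start / a_stop).
    """
    if a_start <= a_stop:
        raise ValueError("a_start must exceed a_stop")
    return math.pi * n_cycles / math.log(a_start / a_stop)

# Example: 220 oscillations counted while the amplitude halves
print(round(q_from_decay_count(220, 1.0, 0.5), 1))  # → 997.1
```

A higher Q (more cycles counted between the same two thresholds) corresponds to lower internal friction, which is why plating-induced damping changes show up as a drop in the counted value.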
A new "Q" meter was then developed based on an impulse technique. A regulated miniature hammer was used to excite the test piece at the fundamental mode instead of an electronic hammer, and test pieces were supported at two predetermined nodal points on nylon threads. The instrument developed was not very successful at detecting changes due to good and poor pretreatments given before plating; however, it was more sensitive to changes at the surface, such as room-temperature oxidation. Statistical analysis of test results from untreated aluminium alloys showed that the instrument is not always consistent; the variation was even larger when readings were taken on different days. Although aluminium is said to form protective oxides at room temperature, there was evidence that the aluminium surface changes continuously due to film formation, growth and breakdown. Nickel-plated and zinc alloy immersion-coated samples also showed variation in Q with time. In order to prove that the variations in Q were mainly due to surface oxidation, aluminium samples were lacquered and anodised. Such treatments enveloped the active surfaces reacting with the environment, and the Q variation with time was almost eliminated, especially after hard anodising. The instrument detected major differences between different untreated aluminium substrates, and Q values decreased progressively as coating thickness was increased. It was also able to detect changes in Q due to heat treatment of the aluminium alloys.

Relevance: 30.00%

Abstract:

Symbiotic design methods aim to take technical, social and organizational criteria into account simultaneously. Over the years, many symbiotic methods have been developed and applied in various countries. Nevertheless, the diagnosis that only technical criteria receive attention in the design of production systems is still made repeatedly. Examples of symbiotic approaches are presented at three different levels: technical systems, organizations, and the process. From these, discussion points are generated concerning the character of the approaches, the importance of economic motives, the impact of national environments, the necessity of a guided design process, the use of symbiotic methods, and the roles of participants in the design process.

Relevance: 30.00%

Abstract:

If product cycle time reduction is the mission, and the multifunctional team is the means of achieving the mission, what then is the modus operandi by which this means is to accomplish its mission? This paper asserts that a preferred modus operandi for the multifunctional team is to adopt a process-oriented view of the manufacturing enterprise, and for this it needs the medium of a process map [16]. The substance of this paper is a methodology which enables the creation of such maps. Specific examples of process models drawn from the product development life cycle are presented and described in order to support the methodology's integrity and value. The specific deliverables we have so far obtained are: a methodology for process capture and analysis; a collection of process models spanning the product development cycle; and an engineering handbook which hosts these models and presents a computer-based means of navigating through these processes, in order to allow users a better understanding of the nature of the business, their role in it, and why the job that they do benefits the work of the company. We assert that this kind of thinking is the essence of concurrent engineering implementation, and further that the systemigram process models uniquely stimulate and organise such thinking.

Relevance: 30.00%

Abstract:

The two areas of theory upon which this research was based were 'strategy development process' (SDP) and 'complex adaptive systems' (CAS), the latter as part of complexity theory focused on human social organisations. The literature reviewed showed that there is a paucity of empirical work and theory in the overlap of the two areas, providing an opportunity for contributions to knowledge in each area of theory, and for practitioners. An inductive approach was adopted for this research, in an effort to discover new insights in the focus area of study. It was undertaken from within an interpretivist paradigm, and based on a novel conceptual framework. The organisationally intimate nature of the research topic and the researcher's circumstances required a research design that was both in-depth and long term. The result was a single, exploratory case study, which included use of data from 44 in-depth, semi-structured interviews with 36 people, involving all the top management team members and significant other staff members; observations, rumour and grapevine (ORG) data; and archive data, over a 5½-year period (2005-2010). Findings confirm the validity of the conceptual framework, and that complex adaptive systems theory has the potential to extend strategy development process theory. The research has shown how and why the strategy process developed in the case study organisation by providing deeper insights into the behaviour of the people, their backgrounds, and their interactions. Broad predictions of the 'latent strategy development' process and some elements of the strategy content are also possible. Based on this research, it is possible to extend the utility of the SDP model by including people's behavioural characteristics within the organisation, via complex adaptive systems theory. Further research is recommended to test the limits of the application of the conceptual framework and to improve its efficacy with more organisations across a variety of sectors.

Relevance: 30.00%

Abstract:

The literature on the potential use of liquid ammonia as a solvent for the extraction of aromatic hydrocarbons from mixtures with paraffins, and the application of reflux, has been reviewed. Reference is made to extractors suited to this application. A pilot-scale extraction plant was designed comprising a 5 cm diameter by 125 cm high, 50-stage Rotating Disc Contactor with two external settlers. Provision was made for operation with, or without, reflux at a pressure of 10 bar and ambient temperature. The solvent recovery unit consisted of an evaporator, compressor and condenser in a refrigeration cycle. Two systems were selected for study: Cumene-n-Heptane-Ammonia and Toluene-Methylcyclohexane-Ammonia. Equilibrium data for the first system were determined experimentally in a specially designed equilibrium bomb. A technique was developed to withdraw samples under pressure for analysis by chromatography and titration. The extraction plant was commissioned with a kerosine-water system; detailed operating procedures were developed based on a Hazard and Operability Study. Experimental runs were carried out with both ternary ammonia systems. With the Toluene-Methylcyclohexane-Ammonia system, the extraction plant and the solvent recovery facility operated satisfactorily, and safely, in accordance with the operating procedures. Experimental data gave reasonable agreement with theory. Recommendations are made for further work with the plant.

Relevance: 30.00%

Abstract:

Pyrolysis is one of several thermochemical technologies that convert solid biomass into more useful and valuable bio-fuels. Pyrolysis is thermal degradation in the complete or partial absence of oxygen. Under carefully controlled conditions, solid biomass can be converted to a liquid known as bio-oil in 75% yield on dry feed. Bio-oil can be used as a fuel but has the drawback of a high oxygen level, due to the presence of a complex mixture of molecular fragments of cellulose, hemicellulose and lignin polymers. Bio-oil also presents a number of problems in use, including high initial viscosity, instability resulting in increased viscosity or phase separation, and high solids content. Much effort has been spent on upgrading bio-oil into a more usable liquid fuel, either by modifying the liquid or by major chemical and catalytic conversion to hydrocarbons. The primary objective of this work was to improve oil stability by exploring three approaches. The first was to determine the effect of feed moisture content on bio-oil stability. The second was to try to improve bio-oil stability by partially oxygenated pyrolysis. The third was to improve stability by co-pyrolysis with methanol. The project was carried out on an existing laboratory pyrolysis reactor system, which suited this project without major redesign or modification. During the finishing stages of the project, it was found that the temperature of the condenser in the product collection system had a marked impact on pyrolysis liquid stability; this is discussed in this work and further recommendations are given. The quantity of water coming from the feedstock and the pyrolysis reaction is important to liquid stability. In the present work the feedstock moisture content was varied and pyrolysis experiments were carried out over a range of temperatures. The quality of the bio-oil produced was measured as water content, initial viscosity and stability.
The results showed that moderate feedstock moisture (7.3-12.8%) led to more stable bio-oil. One of the drawbacks of bio-oil is its instability, due to the unstable oxygenated chemicals it contains. Catalytic hydrotreatment of the oil and zeolite cracking of pyrolysis vapour have been discussed by many researchers; both processes are intended to eliminate oxygen from the bio-oil. In this work an alternative approach, partially oxygenated pyrolysis, was introduced in order to reduce oil instability by oxidising unstable oxygenated chemicals in the bio-oil. The results showed that liquid stability was improved by oxygen addition during the pyrolysis of beech wood at an optimum air factor of about 0.09-0.15. Methanol as a post-production additive to bio-oil has been studied by many researchers, and the most effective result came from adding methanol to the oil just after production. Co-pyrolysis of spruce wood with methanol was undertaken in the present work, and it was found that methanol improved liquid stability as a co-pyrolysis solvent but was no more effective than when used as a post-production additive.

Relevance: 30.00%

Abstract:

Operators can become confused while diagnosing faults in a process plant while it is in operation. This may prevent remedial actions being taken before hazardous consequences occur. The work in this thesis proposes a method to aid plant operators in systematically finding the causes of any fault in the process plant. A computer-aided fault diagnosis package has been developed for use on the widely available IBM PC compatible microcomputer. The program displays a coloured diagram of a fault tree on the VDU of the microcomputer, so that the operator can see the link between the fault and its causes. The consequences of the fault and the causes of the fault are also shown, to provide a warning of what may happen if the fault is not remedied. The cause-and-effect data needed by the package are obtained from a hazard and operability (HAZOP) study on the process plant. The result of the HAZOP study is recorded as cause and symptom equations, which are translated into a data structure and stored in the computer as a file for the package to access. Probability values are assigned to the events that constitute the basic causes of any deviation. From these probability values, the a priori probabilities of occurrence of the other events are evaluated. A top-down recursive algorithm, called TDRA, for evaluating the probability of every event in a fault tree has been developed. From the a priori probabilities, the conditional probabilities of the causes of the fault are then evaluated using Bayes' conditional probability theorem. The a posteriori probability values can then be used by the operators to check the causes of the fault in an orderly manner. The package has been tested using the results of a HAZOP study on a pilot distillation plant. The results from the test show how easy it is to trace the chain of events that leads to the primary cause of a fault. The method could be applied in a real process environment.
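The probability machinery described above can be illustrated with a toy OR-gate fault tree. For independent basic causes with priors p_i, the a priori probability of the top event is 1 − Π(1 − p_i); and since under an OR gate any single cause guarantees the fault, Bayes' theorem reduces the posterior to p_i / P(fault). The event names and numbers below are invented for illustration; TDRA itself recurses over full trees, which this sketch does not attempt:

```python
def or_gate(priors):
    """A priori probability of an OR-gate event from independent basic causes."""
    p_none = 1.0
    for p in priors:
        p_none *= (1.0 - p)
    return 1.0 - p_none

def posteriors(priors):
    """Bayes: P(cause_i | fault) = P(fault | cause_i) * P(cause_i) / P(fault).

    For an OR gate P(fault | cause_i) = 1, so the posterior is p_i / P(fault).
    Posteriors need not sum to 1: the causes are not mutually exclusive.
    """
    p_fault = or_gate(priors)
    return [p / p_fault for p in priors]

# Hypothetical basic causes of a deviation: pump trip, valve left shut
priors = [0.02, 0.005]
print(round(or_gate(priors), 4))                    # → 0.0249
print([round(p, 3) for p in posteriors(priors)])    # → [0.803, 0.201]
```

Ranking causes by posterior gives the orderly checking sequence the abstract describes: here the operator would check the pump trip first.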

Relevance: 30.00%

Abstract:

Ion exchange resins are used for many purposes in various areas of science and commerce. One example is the use of cation exchange resins in the nuclear industry for the clean-up of radioactively contaminated water (for example, the removal of 137Cs). However, during removal of radionuclides the resin itself becomes radioactively contaminated, and must be treated as Intermediate Level Waste; this radioactive contamination of the resin creates a disposal problem. Conventionally, there are two main avenues of disposal for industrial wastes: landfill burial or incineration. However, these are regarded as inappropriate for the disposal of the cation exchange resin involved in this project. Thus, a method involving the use of Fenton's Reagent (hydrogen peroxide with a soluble iron catalyst) to destroy the resin by wet oxidation has been developed. This process converts 95% of the solid resin to gaseous CO2, greatly reducing the volume of radioactive waste that has to be disposed of. However, hydrogen peroxide is an expensive reagent, and is a major component of the cost of any potential plant for the destruction of ion exchange resin. The aim of my project has been to discover a way of improving the efficiency of the destruction of the resin, thus reducing the cost involved in the use of hydrogen peroxide. The work on this problem has been concentrated in two main areas: 1) use of analytical techniques such as NMR and IR to follow the process of the hydrogen peroxide destruction of both resin beads and model systems such as water-soluble calixarenes; and 2) use of various physical and chemical techniques in an attempt to improve the overall efficiency of hydrogen peroxide utilization, including UV irradiation (both with and without a photocatalyst), oxygen-carrying molecules and various stirring regimes.

Relevance: 30.00%

Abstract:

There is increasing research interest in the nature of the competences required to secure a graduate job. This paper examines the role of the undergraduate work placement in developing such employment competences. In order to do this we draw upon a framework of generic competences developed in a previous project by one of the authors, together with data on how these competences are valued by graduates and employers. We also draw upon a survey of employers and students who have participated in an Aston Business School work placement. The work placement year is an integral feature of Aston's undergraduate business programme and gives up to 600 students a year the experience of working with well-known companies. For the past five years we have conducted a survey of these companies to assess their experience of employing our undergraduates on work placements and to examine the skills and competences developed by students in the learning process. In this paper we compare data from both pieces of research to examine how competences developed during the undergraduate work placement contribute to the enhancement of graduate employment.

Relevance: 30.00%

Abstract:

This research was undertaken to develop a process for the direct solvent extraction of castor oil seeds. A literature survey confirmed the desirability of establishing such a process, with emphasis on the decortication, size reduction, detoxification-deallergenization, and solvent extraction operations. A novel process was developed for the dehulling of castor seeds, which consists of pressurizing the beans and then suddenly releasing the pressure to vacuum. The degree of dehulling varied according to the pressure applied and the size of the beans. Some of the batches were difficult to hull, and this phenomenon was investigated using the scanning electron microscope and by thickness and compressive strength measurements. The other variables studied, to lesser degrees, included residence time, moisture content, and temperature. The method was successfully extended to cocoa beans and (with modifications) to peanuts. The possibility of continuous operation was examined, and a mechanism was suggested to explain how the method works. The work on toxins and allergens included an extensive literature survey on the properties of these substances and the methods developed for their deactivation. Part of the work involved setting up an assay method for measuring their concentration in the beans and cake, but technical difficulties prevented the completion of this aspect of the project. An appraisal of the existing deactivation methods was made in the course of searching for new ones. A new method of reducing the size of oilseeds was introduced in this research: it involved freezing the beans in cardice and milling them in a coffee grinder, and was found to be quick, efficient, and reliable. An application of the freezing technique was successful in dehulling soybeans and de-skinning peanut kernels. The literature on the solvent extraction of oilseeds, especially castor, was reviewed; the survey covered processes, equipment, solvents, and the mechanism of leaching. Three solvents were experimentally investigated: cyclohexane, ethanol, and acetone. Extraction with liquid ammonia and liquid butane was not effective under the conditions studied. Based on the results of the research, a process has been suggested for the direct solvent extraction of castor seeds; the various sections of the process have been analysed, and the factors affecting the economics of the process are discussed.

Relevance: 30.00%

Abstract:

The main objective of this work was to examine the various stages of the production of industrial laminates based on phenol-formaldehyde resins, with a view to suggesting ways of improving the process economics and/or the physical properties of the final product. Aspects of impregnation, drying, and lamination were investigated. The resins used in all experiments were ammonia-catalysed. Work was concentrated on the lamination stage, since this is a labour-intensive activity. Paper-phenolic lay-ups were characterised in terms of the temperatures experienced during cure, and a shorter cure cycle is proposed, utilising the exothermic heat produced during pressing of 25.5 mm thick lay-ups. Significant savings in production costs and improvements in some of the physical properties have been achieved; in particular, water absorption has been reduced by 43-61%. Work on the drying stage has shown that rapid heating of the wet impregnated substrate results in resin solids losses. Drying at lower temperatures, by reducing the driving force, leads to more resin (up to 6.5%) being retained by the prepregs and therefore more effective use of an expensive raw material. The impregnation work has indicated that residence times above 6 seconds in the varnish bath enhance the insulation resistance of the final product, possibly due to improved resin distribution and reduction in water absorption. In addition, a novel process which involves production of laminates by in situ polymerisation of the phenolic resin on the substrate has been examined. Such a process would eliminate the solvent recovery plant, a necessary stage in current industrial processes. In situ polymerisation has been shown to be chemically feasible.

Relevance: 30.00%

Abstract:

This research was concerned with identifying factors which may influence human reliability within chemical process plants; these factors are referred to as Performance Shaping Factors (PSFs). Following a period of familiarization within the industry, a number of case studies were undertaken covering a range of basic influencing factors. Plant records and site `lost time incident reports' were also used as supporting evidence for identifying and classifying PSFs. In parallel with the investigative research, the available literature on human reliability assessment and PSFs was considered in relation to the chemical process plant environment. As a direct result of this work, a PSF classification structure has been produced with an accompanying detailed listing. Phase two of the research considered the identification of important individual PSFs for specific situations. Based on the experience and data gained during phase one, it emerged that certain generic features of a task influenced PSF relevance. This led to the establishment of a finite set of generic task groups and response types. Similarly, certain PSFs influence some human errors more than others. The result was a set of error-type keywords, plus the identification and classification of error causes with their underlying error mechanisms. By linking all these aspects together, a comprehensive methodology has been put forward as the basis of a computerized aid for system designers. To recapitulate, the major results of this research have been: one, the development of a comprehensive PSF listing specifically for the chemical process industries, with a classification structure that facilitates future updates; and two, a model for identifying relevant PSFs and their order of priority. Future requirements are the evaluation of the PSF listing and of the identification method. The latter must be considered both in terms of usability and in terms of its success as a design enhancer, measured as an observable reduction in important human errors.

Relevance: 30.00%

Abstract:

With the advent of distributed computer systems with a largely transparent user interface, new questions have arisen regarding the management of such an environment by an operating system. One fertile area of research is that of load balancing, which attempts to improve system performance by redistributing the workload submitted to the system by the users. Early work in this field concentrated on static placement of computational objects to improve performance, given prior knowledge of process behaviour. More recently this has evolved into studying dynamic load balancing with process migration, thus allowing the system to adapt to varying loads. In this thesis, we describe a simulated system which facilitates experimentation with various load balancing algorithms. The system runs under UNIX and provides functions for user processes to communicate through software ports; processes reside on simulated homogeneous processors, connected by a user-specified topology, and a mechanism is included to allow migration of a process from one processor to another. We present the results of a study of adaptive load balancing algorithms, conducted using the aforementioned simulated system, under varying conditions; these results show the relative merits of different approaches to the load balancing problem, and we analyse the trade-offs between them. Following from this study, we present further novel modifications to suggested algorithms, and show their effects on system performance.
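The thesis's own algorithms are not detailed in this abstract, but the kind of adaptive policy such simulations explore can be sketched as a simple threshold rule: migrate a process only when the imbalance between the most and least loaded processors is large enough to outweigh migration cost. Everything below (names, the threshold value, the policy itself) is an illustrative assumption, not the work's actual algorithm:

```python
def select_migration(loads, threshold=2):
    """Pick a (source, destination) processor pair for process migration.

    loads[i] is the number of runnable processes on simulated processor i.
    Returns (src, dst) when the heaviest and lightest processors differ by
    more than `threshold`, else None (imbalance too small to justify the
    cost of migrating a process).
    """
    src = max(range(len(loads)), key=lambda i: loads[i])
    dst = min(range(len(loads)), key=lambda i: loads[i])
    if loads[src] - loads[dst] > threshold:
        return src, dst
    return None

loads = [7, 2, 4, 3]             # per-processor run-queue lengths
print(select_migration(loads))   # → (0, 1)
```

The threshold is the key tuning knob in policies of this shape: too low and the system thrashes with constant migrations, too high and load imbalances persist, which is the trade-off the simulation study analyses.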

Relevance: 30.00%

Abstract:

Despite the voluminous studies written about organisational innovation over the last 30-40 years, our understanding of this phenomenon continues to be inconsistent and inconclusive (Wolfe, 1994). An assessment of the theoretical and methodological issues influencing the explanatory utility of many studies has led scholars (e.g. Slappendel, 1996) to re-evaluate the assumptions used to ground studies. Building on these criticisms, the current study contributes to the development of an interactive perspective of organisational innovation. This work contributes empirically and theoretically to an improved understanding of the innovation process and the interaction between the realm of action and the mediating effects of pre-existing contingencies, i.e. social control, economic exchange and the communicability of knowledge (Scarbrough, 1996). Building on recent advances in institutional theory (see Barley, 1986; 1990; Barley and Tolbert, 1997) and critical theory (Morrow, 1994; Sayer, 1992), the study aims to demonstrate, via longitudinal intensive research, the process through which ideas are translated into reality. This is significant because, despite a growing recognition of the implicit link between the strategic conduct of actors and the institutional realm in organisational analysis, there are few examples that theorise and empirically test these connections. By assessing an under-researched example of technology transfer, the government's Teaching Company Scheme (TCS), this project provides a critique of the innovation process that contributes to theory and to our appreciation of change in the UK government's premier technology transfer scheme (QR, 1996). Critical moments during the translation of ideas illustrate how elements that are linked to social control, economic exchange and communicability mediate the innovation process. Using analytical categories, i.e. contradiction, slippage and dysfunctionality, these are assessed in relation to the actions (coping strategies) of programme members over a two-year period. Drawing on Giddens' (1995) notion of the duality of structure, this study explores the nature of the relationship between the task environment and the institutional environment, demonstrating how and why knowledge is both an enabler of and a barrier to organisational innovation.

Relevance: 30.00%

Abstract:

Xerox's Customer Engagement activity is informed by the "Go To Market" strategy and the "Intelligent Coverage" sales philosophy. The realisation of this philosophy necessitates a sophisticated level of Market Understanding, and the effective integration of the direct channels of Customer Engagement. Sophisticated Market Understanding requires the mapping and coding of the entire UK market at the DMU (Decision Making Unit) level, which in turn enables the creation of tailored coverage prescriptions. Effective Channel Integration is made possible by the organisation of Customer Engagement work according to a single, process-defined structure: the Selling Process. Organising by process facilitates the discipline of Task Substitution, which leads logically to the creation of Hybrid Selling models. Productive Customer Engagement requires Selling Process specialisation by industry sector, customer segment and product group. The research shows that Xerox's Market Database (MDB) plays a central role in delivering the Go To Market strategic aims. It is a tool for knowledge-based selling, enables productive SFA (Sales Force Automation) and, in sum, is critical to the efficient and effective deployment of Customer Engagement resources; Intelligent Coverage is not possible without the MDB. Analysis of the case evidence has resulted in the definition of 60 idiographic statements about how Xerox organises and manages three direct channels of Customer Engagement: Face to Face, Telebusiness and Ebusiness. Xerox is shown to employ a process-oriented, IT-enabled, holistic approach to Customer Engagement productivity. The significance of the research is that it represents a detailed (perhaps unequalled) level of rich description of the interplay between IT and a holistic, process-oriented management philosophy.