Abstract:
The fluid–particle interaction and the impact of different heat transfer conditions on the pyrolysis of biomass inside a 150 g/h fluidised bed reactor are modelled. Biomass particles of two different sizes (350 µm and 550 µm in diameter) are injected into the fluidised bed, and the two sizes experience different heat transfer conditions: the 350 µm particle is smaller than the sand particles of the reactor (440 µm), while the 550 µm particle is larger. The bed-to-particle heat transfer for both cases is calculated according to the literature. Conductive heat transfer is assumed for the larger (550 µm) biomass particle inside the bed, while biomass–sand contacts for the smaller (350 µm) particle are considered negligible. The Eulerian approach is used to model the bubbling behaviour of the sand, which is treated as a continuum. Biomass reaction kinetics is modelled according to the literature using a two-stage, semi-global model that takes secondary reactions into account. The particle motion inside the reactor is computed using drag laws that depend on the local volume fraction of each phase. FLUENT 6.2 has been used as the modelling framework for the simulations, with the whole pyrolysis model incorporated as a User Defined Function (UDF).
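As an illustration of the bed-to-particle heat transfer step described above, the following minimal sketch integrates a lumped-capacitance heat-up of an injected particle (valid for small Biot numbers). It is not the thesis's UDF; the heat transfer coefficient h and all other parameter values are illustrative assumptions.

import math

def particle_heatup(d_p, rho_p, c_p, h, T_bed, T0, dt=1e-4, t_end=2.0):
    """Integrate m*c_p*dT/dt = h*A*(T_bed - T) with explicit Euler."""
    A = math.pi * d_p**2              # particle surface area, m^2
    m = rho_p * math.pi * d_p**3 / 6  # particle mass, kg
    T, t = T0, 0.0
    while t < t_end:
        T += dt * h * A * (T_bed - T) / (m * c_p)
        t += dt
    return T

# Illustrative values: a 350 um particle injected into a 773 K bed.
print(particle_heatup(d_p=350e-6, rho_p=700.0, c_p=1500.0,
                      h=500.0, T_bed=773.0, T0=300.0))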
Abstract:
A typical liquid-state NMR spectrum is composed of a number of discrete absorptions which can be readily interpreted to yield detailed information about the chemical environment of the nuclei within the sample. The same cannot be said of the spectra of solid samples, whose absorptions are typically broad, featureless, and directly yield little information. This situation may be further exacerbated by the characteristically long T1 values of nuclei bound within a solid lattice, which require long inter-sequence delays and hence lengthy experiments. This work attempts to address both of these inherent problems. Classically, the resolution of the broad-line spectra of solids into discrete resonances has been achieved by imparting to the sample coherent rotation about specific axes in relation to the polarising magnetic field, as implemented in the magic-angle spinning (MAS) [1], dynamic-angle spinning (DAS) [2] and double rotation (DOR) [3] NMR experiments. Recently, an alternative method, sonically induced narrowing of the NMR spectra of solids (SINNMR) [4], has been reported which yields the same well-resolved solid-state spectra as the classic solid-state NMR experiments, but which achieves the resolution of the broad-line spectra through the promotion of incoherent motion in a suspension of solid particles. The first part of this work examines SINNMR and, in particular, concentrates on ultrasonically induced levitation, a phenomenon thought to be essential to the incoherent averaging mechanism. The second part extends the principle of incoherent motion, implicit in SINNMR, to a new genre of particulate systems, air-fluidised beds, and examines the feasibility of such systems for providing well-resolved solid-state NMR spectra. Samples of trisodium phosphate dodecahydrate and of aluminium granules are examined using the new method, with partially resolved spectra being reported in the case of the latter.
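For reference, the coherent-rotation experiments cited above (MAS, DAS, DOR) exploit the vanishing of the dipolar broadening factor 3cos²θ − 1; the short calculation below, included purely as an aside, recovers the magic angle at which this factor is zero.

import math

# The magic angle satisfies 3*cos(theta)**2 - 1 = 0, i.e. cos(theta) = 1/sqrt(3).
theta = math.degrees(math.acos(1.0 / math.sqrt(3.0)))
print(f"magic angle = {theta:.4f} degrees")  # ~54.7356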
Abstract:
An analytical first-order calculation of the impact of Gaussian white noise on a novel single Mach–Zehnder interferometer demodulation scheme for DQPSK reveals a constant Q-factor ratio relative to the conventional scheme.
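As a hedged illustration of the figure of merit involved (not a reproduction of the paper's first-order DQPSK analysis), the sketch below computes the standard Q factor, Q = (μ1 − μ0)/(σ1 + σ0), for a binary decision variable corrupted by Gaussian white noise.

import random
import statistics

def q_factor(ones, zeros):
    """Standard Q factor from the means and standard deviations of the two levels."""
    mu1, mu0 = statistics.mean(ones), statistics.mean(zeros)
    s1, s0 = statistics.stdev(ones), statistics.stdev(zeros)
    return (mu1 - mu0) / (s1 + s0)

# Illustrative samples: two decision levels with additive Gaussian noise.
random.seed(0)
ones  = [1.0 + random.gauss(0, 0.1) for _ in range(1000)]
zeros = [0.0 + random.gauss(0, 0.1) for _ in range(1000)]
print(q_factor(ones, zeros))  # ~5 for this noise level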
Abstract:
The research investigates the processes of adoption and implementation, by organisations, of computer-aided production management (CAPM) systems. It is organised around two different theoretical perspectives. The first part is informed by the Rogers model of the diffusion, adoption and implementation of innovations, and the second part by a social constructionist approach to technology. Rogers' work is critically evaluated, and a model of adoption and implementation is distilled from it and applied to a set of empirical case studies. In the light of the case-study data, strengths and weaknesses of the model are identified. It is argued that the model is too rational and linear to provide an adequate explanation of adoption processes. It is useful for understanding processes of implementation but requires further development, and it cannot adequately encompass complex computer-based technologies. However, the idea of 'reinvention' is identified as Rogers' key concept, though it needs to be conceptually extended. Both Rogers' model and the definitions of CAPM found in the production engineering literature tend to treat CAPM in objectivist terms. The problems with this view are addressed through a review of the literature on the sociology of technology, and it is argued that a social constructionist approach offers a more useful framework for understanding CAPM: its nature, adoption, implementation, and use. CAPM, it is argued, must be understood in terms of the ways in which it is constituted in discourse, as part of a 'struggle for meaning' on the part of academics, professional engineers, suppliers, and users.
Abstract:
Computerised production control developments have concentrated on Manufacturing Resources Planning (MRP II) systems. The literature suggests, however, that despite the massive investment in hardware, software and management education, successful implementation of such systems in manufacturing industries has proved difficult. This thesis reviews the development of production planning and control systems and, in particular, investigates the causes of failures in implementing MRP/MRP II systems in industrial environments, arguing that the centralised and top-down planning structure, as well as the routine operational methodology of such systems, is inherently prone to failure. The thesis reviews the control benefits of cellular manufacturing systems but concludes that in more dynamic manufacturing environments techniques such as Kanban are inappropriate. The basic shortcomings of MRP II systems are highlighted and a new enhanced operational methodology based on distributed planning and control principles is introduced. Distributed Manufacturing Resources Planning (DMRP) was developed as a capacity-sensitive production planning and control solution for cellular manufacturing environments. The system utilises cell-based, independently operated MRP II systems, integrated into a plant-wide control system through a Local Area Network. The potential benefits of adopting the system in industrial environments are discussed, and the results of computer simulation experiments comparing the performance of the DMRP system against conventional MRP II systems are presented. The DMRP methodology is shown to offer significant potential advantages, including ease of implementation, cost effectiveness, capacity sensitivity, shorter manufacturing lead times, lower work-in-progress levels and improved customer service.
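For illustration, the sketch below implements the textbook MRP netting and lead-time offsetting that each cell-based MRP II system would perform; the DMRP coordination across the Local Area Network is not reproduced, and all data values are hypothetical.

def mrp_plan(gross, on_hand, lead_time, lot_size=1):
    """Return planned order releases per period from gross requirements."""
    releases = [0] * len(gross)
    stock = on_hand
    for t, g in enumerate(gross):
        net = g - stock                              # net requirement this period
        if net > 0:
            order = -(-net // lot_size) * lot_size   # round up to the lot size
            release_t = max(0, t - lead_time)        # offset release by lead time
            releases[release_t] += order
            stock += order
        stock -= g
    return releases

# Hypothetical five-period demand for one cell-level part:
print(mrp_plan(gross=[0, 40, 0, 70, 30], on_hand=50, lead_time=1, lot_size=25))
# -> [0, 0, 75, 25, 0]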
Abstract:
The use of immunological adjuvants has been established since 1924, and since then many candidates have been extensively researched in vaccine development. The controlled release of vaccines is another area of biotechnology research which is advancing rapidly, with great potential and success. Encapsulation of peptide and protein drugs within biodegradable microspheres has been amongst the most successful of approaches within the past decade. The present studies have focused on combining the advantages of microsphere delivery systems composed of biodegradable polylactide (PLLA) and polylactide-co-glycolide (PLGA) polymers with those of safe and effective adjuvants. The research efforts were directed at the development of single-dose delivery vehicles which can be manufactured easily, safely, and under mild conditions favourable to the encapsulated antigens. In pursuing this objective, non-ionic block copolymers (NIBCs) (Pluronics® L101 and L121) were incorporated within poly-DL-lactide (PDLA) microspheres prepared with the emulsification–diffusion method. L101 and L121 served both as adjuvants and as stabilising agents within these vaccine delivery vehicles. These formulations, encapsulating the model antigens lysozyme, ovalbumin (OVA) and diphtheria toxoid (DT), resulted in high entrapment efficiency (99%) and yield (96.7%), and elicited a high and sustained immune response (IgG titres up to 9427) over nine months after a single administration. The structural integrity of the antigens was preserved within these formulations. In evaluating new approaches for the use of well-established adjuvants such as alum, alum particles were incorporated within PLLA and PLGA microspheres in much smaller quantities (5-10 times lower) than those contained within conventional alum-adsorbed vaccines. These studies focused on the incorporation of the clinically relevant tetanus toxoid (TT) antigen within biodegradable microspheres. The encapsulation of both alum particles and TT antigen within these microspheres resulted in preparations with high encapsulation efficiency (95%) and yield (91.2%). The immune response to these particles was also investigated, evaluating the secretion of serum IgG, IgG1, IgG2a and IgG2b after a single administration of these vaccines. Splenic cell proliferation was also investigated as an indicator of the induction of cell-mediated immunity. These particles produced a high and sustained immune response over a period of 14 months. The stability of TT within the particles under dry storage was also investigated over a period of several months. NIBC microspheres were also investigated as potential DNA vaccine delivery systems using a hepatitis B plasmid. These preparations resulted in microspheres of 3-5 μm diameter and were shown to preserve the integrity of the encapsulated hepatitis B plasmid (27.7% entrapment efficiency).
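The entrapment-efficiency and yield figures quoted above are simple ratios; the sketch below shows the conventional definitions, with hypothetical masses chosen only to reproduce the reported percentages.

# Illustrative arithmetic only; all masses below are hypothetical.

def entrapment_efficiency(antigen_recovered_mg, antigen_loaded_mg):
    """Percentage of the loaded antigen actually entrapped in the microspheres."""
    return 100.0 * antigen_recovered_mg / antigen_loaded_mg

def yield_pct(microspheres_recovered_mg, total_solids_mg):
    """Percentage of the starting solids recovered as microspheres."""
    return 100.0 * microspheres_recovered_mg / total_solids_mg

print(entrapment_efficiency(9.9, 10.0))  # 99.0, cf. the 99% reported
print(yield_pct(967.0, 1000.0))          # 96.7, cf. the 96.7% reported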
Abstract:
The compaction behaviour of powders with soft and hard components is of particular interest to the paint processing industry. Unfortunately, very little is currently known about the internal mechanisms within such systems, and suitable tests are therefore required to help in the interpretative process. The TRUBAL Distinct Element Method (DEM) program was the method of investigation used in this study. Steel (hard) and rubber (soft) particles were used in the randomly generated binary assemblies because they provide a sharp contrast in physical properties. For reasons of simplicity, isotropic compression of two-dimensional assemblies was considered first. The assemblies were initially subjected to quasi-static compaction in order to define their behaviour under equilibrium conditions. The stress-strain behaviour of the assemblies under such conditions was found to be adequately described by a second-order polynomial expansion. The structural evolution of the simulation assemblies was also similar to that observed for real powder systems. Further simulation tests were carried out to investigate the effects of particle size on the compaction behaviour of the two-dimensional binary assemblies. Later work focused on the quasi-static compaction behaviour of three-dimensional assemblies, because they represent more realistic particle systems. The compaction behaviour of the assemblies during the simulation experiments was considered in terms of percolation theory concepts as well as the more familiar macroscopic and microstructural parameters. Percolation theory, which is based on ideas from statistical physics, has been found useful in the interpretation of the mechanical behaviour of simple elastic lattices; from the evidence of this study, however, it is also able to offer a useful insight into the compaction behaviour of more realistic particle assemblies.
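As a minimal illustration of the quasi-static analysis described above, the sketch below fits a second-order polynomial to stress-strain data; the data here are synthetic, whereas in the study itself the points would come from the TRUBAL simulations.

import numpy as np

# Synthetic "measurements" standing in for simulated quasi-static compaction data.
strain = np.linspace(0.0, 0.10, 11)
stress = 50.0 * strain + 2000.0 * strain**2
stress += np.random.default_rng(0).normal(0.0, 0.1, strain.size)  # small noise

coeffs = np.polyfit(strain, stress, deg=2)  # [a2, a1, a0]
print("sigma(eps) ~ %.1f*eps^2 + %.1f*eps + %.2f" % tuple(coeffs))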
Abstract:
Surface deposition of dense aerosol particles is of major concern in the nuclear industry for safety assessment. This study presents theoretical investigations and computer simulations of single gas-borne U3O8 particles impacting with in-reactor surfaces, and of the fragmentation of small agglomerates. A theoretical model for elasto-plastic spheres has been developed and used to analyse the force-displacement and force-time relationships. Impulse equations, based on Newton's second law, are applied to govern the tangential bouncing behaviour. The theoretical model is then incorporated into the Distinct Element Method code TRUBAL in order to perform computer-simulated tests of particle collisions. A comparison of simulated results with both theoretical predictions and experimental measurements is provided. For oblique impacts, results are presented in terms of the force-displacement relationship, coefficients of restitution, the trajectory of the impacting particle, and the distribution of kinetic energy and work done during the impact. The effects of Poisson's ratio, friction, plastic deformation and initial particle rotation on the bouncing behaviour are also discussed. In the presence of adhesion, an elasto-plastic collision model, an extension of the JKR theory, is developed. Based on an energy balance equation, the critical sticking velocity is obtained. For oblique collisions, computer-simulated results are used to establish a set of criteria determining whether or not the particle bounces off the target plate. For impact velocities above the critical sticking value, computer-simulated results for the coefficients of restitution and rebound angles of the particle are presented. Computer simulations of fracture/fragmentation resulting from agglomerate-wall impact have also been performed, using two randomly generated agglomerates (one monodisperse, the other polydisperse), each consisting of 50 primary particles. The effects of impact angle, local structural arrangements close to the impact point, and plastic deformation at the contacts on agglomerate damage are examined. The simulated results show a significant difference in agglomerate strength between the two assemblies, and also show that the damage resulting from an oblique impact is determined by the normal velocity component rather than the impact speed.
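As a hedged sketch of the energy-balance idea behind the critical sticking velocity, the code below uses one published approximation for adhesive elastic impacts (Thornton & Ning, 1998), in which the normal coefficient of restitution is e = sqrt(1 − (v_s/v_i)²) above the sticking velocity v_s; the thesis's elasto-plastic extension of JKR theory is not reproduced, and the numerical values are illustrative.

import math

def restitution_with_adhesion(v_i, v_stick, e_elastic=1.0):
    """Normal coefficient of restitution; returns 0 when the particle sticks."""
    if v_i <= v_stick:
        return 0.0  # incident kinetic energy cannot overcome the adhesive work
    return e_elastic * math.sqrt(1.0 - (v_stick / v_i) ** 2)

for v in (0.05, 0.1, 0.5, 2.0):  # illustrative impact speeds, m/s
    print(v, restitution_with_adhesion(v, v_stick=0.1))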
Abstract:
The research described here concerns the development of metrics and models to support the development of hybrid (conventional/knowledge-based) integrated systems. The thesis argues that, although it is well known that estimating the cost, duration and quality of information systems is a difficult task, it is far from clear what sorts of tools and techniques would adequately support a project manager in estimating these properties. A literature review shows that metrics (measurements) and estimating tools have been developed for conventional systems since the 1960s, while there has been very little research on metrics for knowledge-based systems (KBSs). Furthermore, although there are a number of theoretical problems with many of the 'classic' metrics developed for conventional systems, it also appears that the tools which such metrics can be used to develop are not widely used by project managers. A survey of large UK companies confirmed this continuing state of affairs. Before any useful tools could be developed, therefore, it was important to find out why project managers were not already using such tools. By characterising companies that use software cost estimating (SCE) tools against those which could but do not, it was possible to recognise the involvement of the client/customer in the process of estimation. Pursuing this point, a model of the early estimating and planning stages (the EEPS model) was developed to test exactly where estimating takes place. The EEPS model suggests that estimating could take place either before a fully developed plan has been produced, or while that plan is being produced. If it were the former, SCE tools would be particularly useful, since there is very little other data from which to produce an estimate. A second survey, however, indicated that project managers see estimating as essentially the latter, at which point project management tools are available to support the process. It would seem, therefore, that SCE tools are not being used because project management tools are being used instead. The issue here is not the method of developing an estimating model or tool, but the way in which 'an estimate' is intimately tied to an understanding of what tasks are being planned; current SCE tools are perceived by project managers as targeting the wrong point of estimation. A model (called TABATHA) is then presented which describes how an estimating tool based on an analysis of tasks would fit into the planning stage. The issue of whether metrics can be usefully developed for hybrid systems (which also contain KBS components) is tested by extending a number of 'classic' program size and structure metrics to a KBS language, Prolog. Measurements of lines of code, Halstead's operators/operands, McCabe's cyclomatic complexity, Henry & Kafura's data-flow fan-in/out and post-release reported errors were taken for a set of 80 commercially developed LPA Prolog programs. By redefining the metric counts for Prolog, it was found that estimates of program size and error-proneness comparable to the best conventional studies are possible. This suggests that metrics can be usefully applied to KBS languages such as Prolog, and thus that the development of metrics and models to support the development of hybrid information systems is both feasible and useful.
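As an illustrative analogue of the metric redefinition described above (shown here for Python source rather than Prolog, since the thesis's Prolog counting rules are not reproduced), the sketch below counts decision points to produce McCabe's cyclomatic complexity in its common form V(G) = decisions + 1; which AST nodes count as decisions is itself a design choice.

import ast

# Node types treated as decision points in this sketch.
DECISIONS = (ast.If, ast.For, ast.While, ast.BoolOp, ast.ExceptHandler)

def cyclomatic_complexity(source: str) -> int:
    """Return 1 + number of decision points found in the parsed source."""
    tree = ast.parse(source)
    return 1 + sum(isinstance(node, DECISIONS) for node in ast.walk(tree))

sample = """
def classify(x):
    if x < 0:
        return "neg"
    for _ in range(3):
        if x == 0 or x == 1:
            return "small"
    return "big"
"""
print(cyclomatic_complexity(sample))  # 1 + if + for + if + BoolOp = 5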