922 results for large-eddy simulation


Relevance:

30.00%

Abstract:

Open-loop operation of the stepping motor exploits the inherent advantages of the machine. For near-optimum operation in this mode, however, an accurate system model is required to facilitate controller design. Such a model must be comprehensive and take account of the non-linearities inherent in the system. The result is a complex formulation which can be made manageable with a computational aid. A digital simulation of a hybrid-type stepping motor and its associated drive circuit is proposed. The simulation is based upon a block diagram model which includes reasonable approximations to the major non-linearities, and is shown to yield accurate performance predictions. The determination of the transfer functions is based upon consideration of the physical processes involved rather than upon direct input-output measurements. The effects of eddy currents, saturation, hysteresis, drive circuit characteristics and non-linear torque-displacement characteristics are considered, and methods of determining transfer functions which take account of these effects are offered. The static torque-displacement characteristic is considered in detail and a model is proposed which predicts static torque for any combination of phase currents and shaft position. Methods of predicting the characteristic directly from machine geometry are investigated. Drive circuit design for high-efficiency operation is considered and a model of a bipolar, bilevel circuit is proposed. The transfers between stator voltage and stator current, and between stator current and air-gap flux, are complicated by the effects of eddy currents, saturation and hysteresis. Frequency response methods, combined with average inductance measurements, are shown to yield reasonable transfer functions. The modelling procedure and subsequent digital simulation are concluded to be a powerful method of non-linear analysis.
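
A minimal open-loop sketch of such a simulation is given below in Python. It integrates the rotor equation of motion under an idealised sinusoidal torque-displacement characteristic; all parameter values are hypothetical, and the eddy-current, saturation and hysteresis effects the thesis models are deliberately omitted.

```python
import numpy as np

# Open-loop hybrid stepper sketch (illustrative values, not the thesis model).
# Torque uses the idealised sinusoidal torque-displacement characteristic;
# eddy currents, saturation and hysteresis are deliberately omitted.

NR = 50            # rotor teeth (hypothetical)
TPK = 0.5          # peak torque per unit phase current, N*m/A
J = 1e-5           # rotor plus load inertia, kg*m^2
B = 1e-4           # viscous friction, N*m*s/rad
I_PH = 1.0         # phase current magnitude, A
STEP_RATE = 200.0  # full steps per second

def phase_currents(t):
    """Two-phase full-step switching sequence driven open loop."""
    step = int(t * STEP_RATE)
    ia = I_PH * (1.0 if step % 4 in (0, 1) else -1.0)
    ib = I_PH * (1.0 if step % 4 in (1, 2) else -1.0)
    return ia, ib

def torque(theta, ia, ib):
    """Idealised static torque for given phase currents and shaft position."""
    return -TPK * (ia * np.sin(NR * theta) + ib * np.sin(NR * theta - np.pi / 2))

# Semi-implicit Euler integration of J*theta'' + B*theta' = T(theta, i).
dt, t_end = 1e-5, 0.1
theta, omega = 0.0, 0.0
for n in range(int(t_end / dt)):
    ia, ib = phase_currents(n * dt)
    omega += dt * (torque(theta, ia, ib) - B * omega) / J
    theta += dt * omega

print(f"final shaft angle: {np.degrees(theta):.1f} deg "
      f"(ideal: {int(t_end * STEP_RATE) * 360 / (4 * NR):.1f} deg)")
```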

Relevance:

30.00%

Abstract:

This thesis describes work carried out to improve the fundamental modelling of liquid flows on distillation trays. A mathematical model is presented based on the principles of computational fluid dynamics. It models the liquid flow in the horizontal directions, allowing for the effects of the vapour through an increased liquid turbulence, modelled by an eddy viscosity, and through a resistance to liquid flow caused by the vapour being accelerated horizontally by the liquid. The resultant equations are similar to the Navier-Stokes equations with the addition of a resistance term. A mass-transfer model is used to calculate liquid concentration profiles and tray efficiencies. A heat- and mass-transfer analogy is used to compare theoretical concentration profiles to experimental water-cooling data obtained from a 2.44 metre diameter air-water distillation simulation rig. The ratios of air to water flow rates are varied in order to simulate three pressures: vacuum, atmospheric pressure and moderate pressure. For simulated atmospheric and moderate-pressure distillation, the fluid mechanical model consistently over-predicts tray efficiencies, with errors of between +1.7% and +11.3%. This compares with -1.8% to -10.9% for the stagnant regions model (Porter et al., 1972) and +12.8% to +34.7% for the plug flow plus back-mixing model (Gerster et al., 1958). The model fails to predict the flow patterns and tray efficiencies for vacuum simulation because the mechanism of liquid transport changes from a liquid continuous layer to a spray as the liquid flow rate is reduced; this spray is not accounted for in the development of the fluid mechanical model. A sensitivity analysis has shown that the fluid mechanical model is relatively insensitive to the prediction of the average height of clear liquid, and that a reduction in the resistance term results in a slight loss of tray efficiency, but neither effect is great. The model is, however, quite sensitive to the prediction of the eddy viscosity term: variations can produce up to a 15% decrease in tray efficiency. The fluid mechanical model has been incorporated into a column model so that statistical optimisation techniques can be employed to fit a theoretical column concentration profile to experimental data. Through this work, mass-transfer data can be obtained.
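
The "Navier-Stokes plus resistance" form described above can be written out explicitly. A plausible depth-averaged statement, with notation assumed rather than taken from the thesis, is:

```latex
% Horizontal momentum balance for the liquid on the tray (illustrative form):
% u  = depth-averaged liquid velocity,  nu_e = eddy viscosity representing
% vapour-induced turbulence,  R u = vapour-induced resistance term.
\[
\frac{\partial \mathbf{u}}{\partial t}
+ (\mathbf{u}\cdot\nabla)\mathbf{u}
= -\frac{1}{\rho}\nabla p
+ \nu_e \nabla^{2}\mathbf{u}
- R\,\mathbf{u}
\]
```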

Relevance:

30.00%

Abstract:

There is considerable concern over the increased effect of fossil fuel usage on the environment, and this concern has resulted in an effort to find alternative, environmentally friendly energy sources. Biomass is an available alternative resource which may be converted by flash pyrolysis to produce a crude liquid product that can be used directly as a substitute for conventional fossil fuels or upgraded to a higher-quality fuel. Both the crude and upgraded products may be utilised for power generation. A computer program, BLUNT, has been developed to model the flash pyrolysis of biomass with subsequent upgrading, refining or power production. The program assesses and compares the economic and technical opportunities for biomass thermochemical conversion on the same basis. BLUNT works by building up a selected processing route from a number of process steps through which the material passes sequentially. Each process step has a step model that calculates the mass and energy balances, the utilities usage and the capital cost for that step of the process. The results of the step models are combined to determine the performance of the whole conversion route. Sample results from the modelling are presented in this thesis. Owing to the large number of possible combinations of feeds, conversion processes, products and sensitivity analyses, a complete set of results is impractical to present in a single publication. Variations in the production costs of the available products are illustrated based on the cost of a wood feedstock. The effect of selected macroeconomic factors on the production costs of bio-diesel and gasoline is also given.
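
The sequential step-chaining structure described above lends itself to a short sketch. In the Python fragment below the step names, yields, energy uses and capital costs are hypothetical stand-ins, not BLUNT's actual step models.

```python
from dataclasses import dataclass

# Sketch of BLUNT-style sequential process-step costing. The structure is
# inferred from the abstract; all numbers and step models are hypothetical.

@dataclass
class StepResult:
    mass_out: float      # t/h of material passed to the next step
    energy_kwh: float    # utilities used by this step
    capital_cost: float  # installed capital for this step

def pyrolysis(mass_in: float) -> StepResult:
    # Hypothetical: 75% liquid yield, fixed specific energy and capital.
    return StepResult(mass_in * 0.75, mass_in * 50.0, 2.0e6)

def upgrading(mass_in: float) -> StepResult:
    # Hypothetical upgrading step.
    return StepResult(mass_in * 0.85, mass_in * 120.0, 3.5e6)

def run_route(feed_rate: float, steps) -> dict:
    """Pass material through each step in turn and accumulate totals."""
    mass, energy, capital = feed_rate, 0.0, 0.0
    for step in steps:
        r = step(mass)
        mass = r.mass_out
        energy += r.energy_kwh
        capital += r.capital_cost
    return {"product_rate": mass, "energy_kwh": energy, "capital": capital}

print(run_route(10.0, [pyrolysis, upgrading]))
```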

Relevance:

30.00%

Abstract:

Changes in modern structural design have created a demand for products which are light but possess high strength. The objective is a reduction in fuel consumption and in the weight of materials, to satisfy both economic and environmental criteria. Cold roll forming has the potential to fulfil this requirement. The bending process is controlled by the shape of the profile machined on the periphery of the rolls. A CNC lathe can machine complicated profiles to a high standard of precision, but the expertise of a numerical control programmer is required. A computer program was developed during this project, using the expert system concept, to calculate tool paths and consequently to expedite the procurement of the machine control tapes while removing the need for a skilled programmer. Codifying human expertise and encapsulating that knowledge within a computer memory removes the dependency on highly trained people, whose services can be costly, inconsistent and unreliable. A successful cold roll forming operation, where the product is geometrically correct and free from visual defects, is not easy to attain. The geometry of the sheet after travelling through the rolling mill depends on the residual strains generated by the elastic-plastic deformation. Accurate evaluation of the residual strains can provide the basis for predicting the geometry of the section. A study of geometric and material non-linearity, yield criteria, material hardening and stress-strain relationships was undertaken in this research project. The finite element method was chosen to provide a mathematical model of the bending process and, to ensure an efficient manipulation of the large stiffness matrices, the frontal solution was applied. A series of experimental investigations provided data to compare with corresponding values obtained from the theoretical modelling. A computer simulation capable of predicting that a design will be satisfactory prior to the manufacture of the rolls would allow effort to be concentrated on devising an optimum design in which costs are minimised.
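
The abstract does not state which yield criterion the thesis adopts; a standard choice in elastic-plastic finite element work of this kind is the von Mises condition with an isotropic hardening rule:

```latex
% Von Mises yield condition with isotropic hardening (a standard choice;
% the thesis's exact criterion is not stated in the abstract).
% s = deviatoric stress, sigma_y = flow stress as a function of the
% accumulated plastic strain.
\[
f(\boldsymbol{\sigma}) =
\sqrt{\tfrac{3}{2}\,\mathbf{s}:\mathbf{s}} - \sigma_y(\bar{\varepsilon}^{\,p}) = 0,
\qquad
\mathbf{s} = \boldsymbol{\sigma} - \tfrac{1}{3}\operatorname{tr}(\boldsymbol{\sigma})\,\mathbf{I}
\]
```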

Relevance:

30.00%

Abstract:

This thesis introduces and develops a novel real-time predictive maintenance system to estimate machine system parameters using the motion current signature. Recently, motion current signature analysis has been addressed as an alternative to the use of sensors for monitoring internal faults of a motor. A maintenance system based upon the analysis of the motion current signature avoids the need for the implementation and maintenance of expensive motion sensing technology. By developing nonlinear dynamical analysis for the motion current signature, the research described in this thesis implements a novel real-time predictive maintenance system for current and future manufacturing machine systems. A crucial concept underpinning this project is that the motion current signature contains information relating to the machine system parameters, and that this information can be extracted using nonlinear mapping techniques such as neural networks. Towards this end, a proof-of-concept procedure is performed which substantiates this concept. A simulation model, TuneLearn, is developed to generate the large amount of training data required by the neural network approach. Statistical validation and verification of the model are performed to ascertain confidence in the simulated motion current signature. The validation experiment concludes that, although the simulation model generates a good macro-dynamical mapping of the motion current signature, it fails to accurately map the micro-dynamical structure, owing to the lack of knowledge regarding the performance of higher-order and nonlinear factors such as backlash and compliance. This failure suggests the presence of nonlinearity in the motion current signature, which motivated surrogate data testing for nonlinearity. The results confirm the presence of nonlinearity in the motion current signature, thereby motivating the use of nonlinear techniques for further analysis. Finally, a linear reverse algorithm, BJEST, is developed and applied to the motion current signature to estimate the machine system parameters. Outcomes of the experiments show that nonlinear noise reduction combined with this linear reverse algorithm offers precise machine system parameter estimation from the motion current signature, enabling the implementation of the real-time predictive maintenance system.
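
The surrogate data test mentioned above can be sketched briefly. The Python fragment below applies the standard phase-randomisation construction with an illustrative third-order statistic and a toy signal; the thesis's actual discriminating statistic and data are not given in the abstract.

```python
import numpy as np

# Surrogate-data test for nonlinearity: compare a statistic on the measured
# signal against phase-randomised surrogates that share its power spectrum.
# The signal and the statistic here are illustrative only.

rng = np.random.default_rng(0)

def phase_randomised_surrogate(x: np.ndarray) -> np.ndarray:
    """Fourier-transform x, randomise phases, invert: same spectrum, linearised."""
    spec = np.fft.rfft(x)
    phases = rng.uniform(0, 2 * np.pi, spec.size)
    phases[0] = 0.0  # keep the mean real
    return np.fft.irfft(np.abs(spec) * np.exp(1j * phases), n=x.size)

def statistic(x: np.ndarray) -> float:
    """Simple third-order statistic, sensitive to nonlinearity."""
    x = x - x.mean()
    return float(np.mean(x[:-1] ** 2 * x[1:]))

# Toy 'motion current signature' with a quadratic nonlinearity.
t = np.linspace(0, 10, 4096)
signal = np.sin(2 * np.pi * 3 * t) + 0.4 * np.sin(2 * np.pi * 3 * t) ** 2

s0 = statistic(signal)
surrogates = [statistic(phase_randomised_surrogate(signal)) for _ in range(99)]
rank = sum(s0 > s for s in surrogates)
print(f"statistic rank among surrogates: {rank}/99 (extreme rank => nonlinear)")
```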

Relevance:

30.00%

Abstract:

Particulate solids are complex redundant systems which consist of discrete particles. The interactions between the particles are complex and have been the subject of many theoretical and experimental investigations. Investigations of particulate materials have been restricted by the lack of quantitative information on the mechanisms occurring within an assembly. Laboratory experimentation is limited, as information on the internal behaviour can only be inferred from measurements on the assembly boundary or through the use of intrusive measuring devices. In addition, comparisons between test data are uncertain owing to the difficulty of reproducing exact replicas of physical systems. Nevertheless, theoretical and technological advances require more detailed material information. Numerical simulation, by contrast, affords access to information on every particle, and hence to the micro-mechanical behaviour within an assembly, and can replicate desired systems. For a computer program to simulate material behaviour accurately, it is necessary to incorporate realistic interaction laws. This research programme used the finite difference simulation program 'BALL', developed by Cundall (1971), which employed linear spring force-displacement laws; it was thus necessary to incorporate more realistic interaction laws. The programme was therefore primarily concerned with the implementation of the normal force-displacement law of Hertz (1882) and the tangential force-displacement laws of Mindlin and Deresiewicz (1953). Within this thesis, the contact mechanics theories employed in the program are developed and the adaptations necessary to incorporate these laws are detailed. Verification of the new contact force-displacement laws was achieved by simulating a quasi-static oblique contact and a single-particle oblique impact. Applications of the program to the simulation of large assemblies of particles are given, and the problems in undertaking quasi-static shear tests, along with the results from two successful shear tests, are described.
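
The interaction laws named above have compact closed forms. Below is a minimal Python sketch of the Hertz normal law and the no-slip Mindlin tangential stiffness; the material constants are illustrative, and the full load-history dependence of the Mindlin-Deresiewicz laws implemented in the thesis is not reproduced.

```python
import numpy as np

# Hertz normal contact with a simplified Mindlin-style tangential stiffness
# for two identical elastic spheres. The thesis's Mindlin-Deresiewicz
# implementation tracks loading history; this shows only the basic relations.

E, NU = 70e9, 0.3                    # illustrative Young's modulus (Pa), Poisson ratio
G = E / (2 * (1 + NU))               # shear modulus
R1 = R2 = 1e-3                       # particle radii, m

E_star = 1.0 / (2 * (1 - NU**2) / E) # effective modulus, identical spheres
R_star = 1.0 / (1 / R1 + 1 / R2)     # effective radius
G_star = 1.0 / (2 * (2 - NU) / G)    # effective shear modulus

def hertz_normal_force(delta: float) -> float:
    """Nonlinear normal law F = (4/3) E* sqrt(R*) delta^(3/2)."""
    return (4.0 / 3.0) * E_star * np.sqrt(R_star) * delta**1.5

def tangential_stiffness(delta: float) -> float:
    """No-slip Mindlin stiffness k_t = 8 G* a, contact radius a = sqrt(R* delta)."""
    return 8.0 * G_star * np.sqrt(R_star * delta)

delta = 1e-6  # overlap, m
print(f"Fn = {hertz_normal_force(delta):.3f} N, "
      f"kt = {tangential_stiffness(delta):.3e} N/m")
```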

Relevance:

30.00%

Abstract:

Epitopes mediated by T cells lie at the heart of the adaptive immune response and form the essential nucleus of anti-tumour peptide or epitope-based vaccines. Antigenic T cell epitopes are mediated by major histocompatibility complex (MHC) molecules, which present them to T cell receptors. Calculating the affinity between a given MHC molecule and an antigenic peptide using experimental approaches is both difficult and time consuming; various computational methods have therefore been developed for this purpose. A server has been developed to allow a structural approach to the problem by generating specific MHC:peptide complex structures and providing the configuration files needed to run molecular modelling simulations upon them. The system allows the automated construction of MHC:peptide structure files and of the corresponding configuration files required to execute a molecular dynamics simulation using NAMD, and has been made available through a web-based front end and stand-alone scripts. Previous attempts at structural prediction of MHC:peptide affinity have been limited by the paucity of structures and the computational expense of running large-scale molecular dynamics simulations. The MHCsim server (http://igrid-ext.cryst.bbk.ac.uk/MHCsim) allows the user to rapidly generate any desired MHC:peptide complex and will facilitate molecular modelling simulation of MHC complexes on an unprecedented scale.
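
As a rough illustration of the automated configuration generation described, the Python sketch below writes a minimal NAMD input file. The template keys are standard NAMD options, but the file names, parameter values and the complex name are hypothetical and do not reflect MHCsim's actual template.

```python
# Sketch of automated NAMD configuration generation in the spirit of MHCsim.
# All file names, values and the example complex name are illustrative.

NAMD_TEMPLATE = """\
structure          {name}.psf
coordinates        {name}.pdb
paraTypeCharmm     on
parameters         par_all27_prot_na.prm
temperature        310
timestep           2.0
cutoff             12.0
outputName         {name}_out
run                {steps}
"""

def write_namd_config(name: str, steps: int = 500000) -> str:
    """Render the template for one MHC:peptide complex and write it to disk."""
    path = f"{name}.conf"
    with open(path, "w") as fh:
        fh.write(NAMD_TEMPLATE.format(name=name, steps=steps))
    return path

# Hypothetical MHC:peptide complex identifier.
print(write_namd_config("HLA-A0201_LLFGYPVYV"))
```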

Relevance:

30.00%

Abstract:

Computer-based discrete event simulation (DES) is one of the most commonly used aids for the design of automotive manufacturing systems. However, DES tools represent machines in extensive detail while representing workers only as simple resources. This presents a problem when modelling systems with a highly manual work content, such as an assembly line. This paper describes research at Cranfield University, in collaboration with the Ford Motor Company, founded on the assumption that human variation causes a large percentage of the disparity between simulation predictions and real-world performance. The research aims to improve the accuracy and reliability of simulation predictions by including models of human factors.
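
A minimal sketch of the underlying idea, using the SimPy library, is shown below: two manual stations in series with a one-part buffer, each with a lognormally distributed cycle time standing in for human variation. The times, distribution and line structure are illustrative, not Ford data or the authors' model.

```python
import math
import random
import simpy

# Once workers vary (lognormal cycle times), a two-station line with a small
# buffer loses throughput relative to the deterministic model.

MEAN_CYCLE = 60.0  # nominal manual cycle time at each station, s (illustrative)
CV = 0.5           # coefficient of variation attributed to human variation
SIGMA = math.sqrt(math.log(1 + CV**2))
MU = math.log(MEAN_CYCLE) - SIGMA**2 / 2

def cycle():
    return random.lognormvariate(MU, SIGMA)

def station1(env, buffer):
    while True:
        yield env.timeout(cycle())   # work on a part
        yield buffer.put(1)          # blocks when the buffer is full

def station2(env, buffer, counts):
    while True:
        yield buffer.get()           # starves when the buffer is empty
        yield env.timeout(cycle())
        counts["done"] += 1

random.seed(1)
env = simpy.Environment()
buffer = simpy.Store(env, capacity=1)
counts = {"done": 0}
env.process(station1(env, buffer))
env.process(station2(env, buffer, counts))
env.run(until=8 * 3600)  # one 8-hour shift

print(f"units/shift: {counts['done']} vs deterministic {int(8 * 3600 / MEAN_CYCLE)}")
```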

Relevance:

30.00%

Abstract:

The heightened threat of terrorism has caused governments worldwide to plan for responding to large-scale catastrophic incidents. In England the New Dimension Programme supplies equipment, procedures and training to the Fire and Rescue Service to ensure the country's preparedness to respond to a range of major critical incidents. The Fire and Rescue Service is involved partly by virtue of being able to very quickly mobilize a large skilled workforce and specialist equipment. This paper discusses the use of discrete event simulation modeling to understand how a fire and rescue service might position its resources before an incident takes place, to best respond to a combination of different incidents at different locations if they happen. Two models are built for this purpose. The first model deals with mass decontamination of a population following a release of a hazardous substance—aiming to study resource requirements (vehicles, equipment and manpower) necessary to meet performance targets. The second model deals with the allocation of resources across regions—aiming to study cover level and response times, analyzing different allocations of resources, both centralized and decentralized. Contributions to theory and practice in other contexts (e.g. the aftermath of natural disasters such as earthquakes) are outlined.

Relevance:

30.00%

Abstract:

In this paper, we study the localization problem in large-scale Underwater Wireless Sensor Networks (UWSNs). Unlike in terrestrial positioning, the global positioning system (GPS) cannot work effectively underwater. The limited bandwidth, the severely impaired channel and the cost of underwater equipment all make the localization problem very challenging, and most current localization schemes are not well suited to deep underwater environments. We propose a hierarchical localization scheme to address these challenges. The scheme consists of four types of nodes: surface buoys, Detachable Elevator Transceivers (DETs), anchor nodes and ordinary nodes. Surface buoys are assumed to be equipped with GPS on the water surface. A DET is attached to a surface buoy and can rise and descend to broadcast its position at different depths. The anchor nodes compute their positions from the position information broadcast by the DETs and from measurements of their distances to the DETs. The hierarchical localization scheme is scalable and can be used to trade off cost against localization accuracy. Initial simulation results show the advantages of the proposed scheme. © 2009 IEEE.
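
As an illustration of the anchor-node computation described, the sketch below performs a standard linearised least-squares trilateration from DET positions and noisy range measurements. The paper's actual estimator is not specified in the abstract, and all coordinates here are invented.

```python
import numpy as np

# Linearised least-squares trilateration: estimate a position p from
# |p - det_i| = r_i by subtracting the first equation from the others.

def trilaterate(det_pos: np.ndarray, ranges: np.ndarray) -> np.ndarray:
    p0, r0 = det_pos[0], ranges[0]
    A = 2.0 * (det_pos[1:] - p0)
    b = (r0**2 - ranges[1:] ** 2
         + np.sum(det_pos[1:] ** 2, axis=1) - np.sum(p0**2))
    return np.linalg.lstsq(A, b, rcond=None)[0]

# Four DET broadcast positions (m); not coplanar, so 3D is solvable.
dets = np.array([[0.0, 0.0, -100.0], [500.0, 0.0, -100.0],
                 [0.0, 500.0, -100.0], [500.0, 500.0, -150.0]])
true_pos = np.array([220.0, 310.0, -400.0])

# Ranges with 2 m standard-deviation measurement noise.
ranges = (np.linalg.norm(dets - true_pos, axis=1)
          + np.random.default_rng(0).normal(0, 2.0, 4))
print("estimated anchor position:", trilaterate(dets, ranges))
```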

Relevance:

30.00%

Abstract:

Underwater sensor networks (UWSNs) have recently attracted considerable research interest. Medium access control (MAC) is one of the major challenges faced by UWSNs, owing to the long propagation delay and narrow bandwidth of the acoustic channels they use. The widely used slotted Aloha (S-Aloha) protocol suffers a large performance loss in UWSNs, where it can only achieve performance close to that of pure Aloha (P-Aloha). In this paper we theoretically model the performance of the S-Aloha and P-Aloha protocols and analyze the adverse impact of propagation delay. Based on these observations of S-Aloha performance, we propose two enhanced S-Aloha protocols that minimize the adverse impact of propagation delay. The first enhancement is a synchronized-arrival S-Aloha (SA-Aloha) protocol, in which frames are transmitted at carefully calculated times so that frame arrivals align with the start of time slots; propagation delay is taken into account in the calculation of the transmit time. As estimation errors on the propagation delay may exist and can affect network performance, an improved SA-Aloha (denoted ISA-Aloha) is also proposed, which adjusts the slot size according to the range of delay estimation errors. Simulation results show that both SA-Aloha and ISA-Aloha perform remarkably better than S-Aloha and P-Aloha for UWSNs, and that ISA-Aloha is more robust even when the propagation delay estimation error is large. © 2011 IEEE.
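
The classical throughput expressions behind this comparison, together with the SA-Aloha timing rule described above, can be sketched as follows; the link geometry and sound speed are illustrative.

```python
import math

# Classical Aloha throughput curves plus the SA-Aloha timing rule:
# transmit early by the propagation delay so the frame *arrives* at a
# slot boundary.

def s_aloha_throughput(G: float) -> float:
    """Slotted Aloha: S = G * exp(-G) for offered load G."""
    return G * math.exp(-G)

def p_aloha_throughput(G: float) -> float:
    """Pure Aloha: S = G * exp(-2G)."""
    return G * math.exp(-2 * G)

SOUND_SPEED = 1500.0  # m/s, nominal underwater acoustic speed

def sa_aloha_transmit_time(slot_start: float, distance_m: float) -> float:
    """Send at slot_start minus the propagation delay to the receiver."""
    return slot_start - distance_m / SOUND_SPEED

for G in (0.5, 1.0):
    print(f"G={G}: slotted {s_aloha_throughput(G):.3f}, "
          f"pure {p_aloha_throughput(G):.3f}")
print("transmit at t =", sa_aloha_transmit_time(10.0, 3000.0), "s for a 3 km link")
```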

Relevance:

30.00%

Abstract:

In this paper, we study an area localization problem in large-scale Underwater Wireless Sensor Networks (UWSNs). The limited bandwidth, the severely impaired channel and the cost of underwater equipment all make the underwater localization problem very challenging, and exact localization is very difficult for UWSNs in deep underwater environments. We propose a mobile-DET-based efficient 3D multi-power Area Localization Scheme (3D-MALS) to address this challenging problem. The proposed scheme combines the ideas of the 2D multi-power Area Localization Scheme (2D-ALS) [6] and of the Detachable Elevator Transceiver (DET) to achieve simplicity, location accuracy, scalability and low cost. The DET can rise and descend to broadcast its position, and all underwater nodes are assumed to carry pressure sensors and thus to know their z coordinates. The simulation results show that the proposed scheme is very efficient. © 2009 IEEE.
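
The depth-assisted reduction implied by the pressure-sensor assumption can be sketched as follows: pressure fixes a node's z coordinate, and a measured slant range to a DET then collapses to a horizontal range, reducing the problem to 2D. The hydrostatic constants and values below are illustrative.

```python
import math

# Depth from a pressure sensor, then projection of a 3D slant range onto
# the horizontal plane. Constants and values are illustrative.

RHO, G0, P_ATM = 1025.0, 9.81, 101325.0  # seawater density, gravity, surface pressure

def depth_from_pressure(p_pa: float) -> float:
    """Hydrostatic depth estimate z (positive down) from absolute pressure."""
    return (p_pa - P_ATM) / (RHO * G0)

def horizontal_range(r_3d: float, z_node: float, z_det: float) -> float:
    """Project the measured slant range onto the horizontal plane."""
    dz = z_node - z_det
    return math.sqrt(max(r_3d**2 - dz**2, 0.0))

z = depth_from_pressure(2.0e6)  # ~189 m for 2 MPa absolute pressure
print(f"node depth {z:.1f} m, horizontal range "
      f"{horizontal_range(250.0, z, 50.0):.1f} m from a DET at 50 m depth")
```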

Relevance:

30.00%

Abstract:

Calibration of stochastic traffic microsimulation models is a challenging task. This paper proposes a fast iterative probabilistic precalibration framework and demonstrates how it can be successfully applied to a real-world traffic simulation model of a section of the M40 motorway and its surrounding area in the U.K. The efficiency of the method stems from the use of emulators of the stochastic microsimulator, which provides fast surrogates of the traffic model. The use of emulators minimizes the number of microsimulator runs required, and the emulators' probabilistic construction allows for the consideration of the extra uncertainty introduced by the approximation. It is shown that automatic precalibration of this real-world microsimulator, using turn-count observational data, is possible, considering all parameters at once, and that this precalibrated microsimulator improves on the fit to observations compared with the traditional expertly tuned microsimulation. © 2000-2011 IEEE.
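
A toy illustration of the emulator idea, using a Gaussian-process surrogate from scikit-learn, is sketched below. The one-parameter "microsimulator" is a stand-in; the paper's emulator construction and the M40 model are far richer.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

# Fit a GP surrogate to a handful of expensive stochastic simulator runs,
# then search the cheap surrogate for the best-fitting parameter.

rng = np.random.default_rng(0)

def microsimulator(theta: float) -> float:
    """Stand-in for one expensive stochastic simulation run (misfit score)."""
    return (theta - 2.0) ** 2 + rng.normal(0, 0.1)

X = rng.uniform(0, 5, size=(15, 1))                 # 15 pilot runs
y = np.array([microsimulator(t) for t in X.ravel()])

# WhiteKernel lets the GP absorb the microsimulator's stochastic noise,
# so the probabilistic construction accounts for that extra uncertainty.
gp = GaussianProcessRegressor(RBF() + WhiteKernel(), normalize_y=True).fit(X, y)

grid = np.linspace(0, 5, 501).reshape(-1, 1)
mean, std = gp.predict(grid, return_std=True)
best = int(np.argmin(mean))
print(f"precalibrated parameter ~ {grid[best, 0]:.2f} "
      f"(emulator predictive sd there: {std[best]:.3f}; true optimum 2.0)")
```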

Relevance:

30.00%

Abstract:

Large-scale evacuations are a recurring theme on news channels, whether in response to major natural or man-made disasters. The role of warning dissemination is a key part of the success of such large-scale evacuations, and its inadequacy in certain cases has been a 'primary contribution to deaths and injuries' (Hayden et al., 2007). Along with technology-driven 'official' warning channels (e.g. sirens, mass media), unofficial channels (e.g. neighbours, personal contacts, volunteer wardens) have proven significant in warning the public of the need to evacuate. Although post-evacuation studies identify evacuees as disseminators of the warning message, there has not been a detailed study that quantifies the effects of such behaviour on warning message dissemination. This paper develops an Agent-Based Simulation (ABS) model of multiple agents (evacuee households) in a hypothetical community to investigate the impact of this behaviour as an unofficial channel on the overall warning dissemination. Parameters studied include the percentage of people who warn their neighbours, the efficiency of different official warning channels, and the delay time before warning neighbours. Even with a low proportion of people willing to warn their neighbours, the results showed a considerable impact on the overall warning dissemination. © 2012 Elsevier B.V. All rights reserved.
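
A toy version of such an ABS model is sketched below; the community size, parameter values and random-mixing contact pattern are illustrative and are not the paper's calibrated model.

```python
import random

# Warning dissemination sketch: an official channel reaches some households
# directly; a fraction of warned households pass the message to neighbours
# after a delay. All parameter values are illustrative.

N = 1000                 # households in the hypothetical community
OFFICIAL_REACH = 0.4     # probability the official channel warns a household
P_WARN_NEIGHBOUR = 0.2   # fraction willing to warn their neighbours
NEIGHBOUR_DELAY = 10     # minutes before a warner informs neighbours
NEIGHBOURS = 4           # contacts per warning household

random.seed(1)
# Official channel acts at t = 0.
warned_at = {h: 0 for h in range(N) if random.random() < OFFICIAL_REACH}
# Queue of (time, household) events for households that will warn others.
queue = [(t + NEIGHBOUR_DELAY, h) for h, t in warned_at.items()
         if random.random() < P_WARN_NEIGHBOUR]

while queue:
    queue.sort()
    t, h = queue.pop(0)
    for _ in range(NEIGHBOURS):
        target = random.randrange(N)
        if target not in warned_at:
            warned_at[target] = t
            if random.random() < P_WARN_NEIGHBOUR:
                queue.append((t + NEIGHBOUR_DELAY, target))

print(f"warned: {len(warned_at)}/{N}, "
      f"last warning at t={max(warned_at.values())} min")
```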

Relevance:

30.00%

Abstract:

Timely warning of the public during large-scale emergencies is essential to ensure safety and save lives. This ongoing study proposes an agent-based simulation model to simulate warning message dissemination among the public, considering both official and unofficial channels. The proposed model was developed in NetLogo software for a hypothetical area and requires input parameters such as the effectiveness of each official source (%), the estimated time to begin informing others, the estimated time to inform others, and the estimated percentage of people who do not relay the message. This paper demonstrates a means of factoring the behaviour of the public as informants into estimating the effectiveness of warning dissemination during large-scale emergencies. The model provides a tool for practitioners to test the potential impact of informal channels on the overall warning time and the sensitivity of the modelling parameters. The tool could help practitioners encourage evacuees to disseminate the warning message by informing others, similar to the 'Run to thy neighbour' campaign conducted by the Red Cross.