935 results for Simulation tool
Abstract:
Computational Fluid Dynamics (CFD) has found great acceptance in the engineering community as a tool for the research and design of processes that are difficult or expensive to study experimentally. One such process is biomass gasification in a Circulating Fluidized Bed (CFB). Biomass gasification is the thermo-chemical conversion of biomass at high temperature and with a controlled oxygen amount into fuel gas, also sometimes referred to as syngas. A circulating fluidized bed is a type of reactor in which it is possible to maintain a stable and continuous circulation of solids in a gas-solid system. The main objectives of this thesis are four-fold: (i) develop a three-dimensional predictive model of biomass gasification in a CFB riser using advanced Computational Fluid Dynamics (CFD); (ii) experimentally validate the developed hydrodynamic model using conventional and advanced measuring techniques; (iii) study the complex hydrodynamics, heat transfer and reaction kinetics through modelling and simulation; and (iv) study the CFB gasifier performance through parametric analysis and identify the optimum operating conditions to maximize the product gas quality. Two different and complementary experimental techniques were used to validate the hydrodynamic model, namely pressure measurement and particle tracking. Pressure measurement is a very common and widely used technique in fluidized bed studies, while particle tracking using PEPT, originally developed for medical imaging, is a relatively new technique in the engineering field; it is relatively expensive and only available at a few research centres around the world. This study started with a simple poly-dispersed single solid phase and then moved to binary solid phases. The single solid phase was used for primary validations and for eliminating unnecessary options and steps in building the hydrodynamic model. The outcomes of the primary validations were then applied to the secondary validations of the binary mixture to avoid time-consuming computations. Studies on binary solid mixture hydrodynamics are rarely reported in the literature. In this study the binary solid mixture was modelled and validated using experimental data from both techniques mentioned above, and good agreement was achieved with both.
Following the general gasification steps, the developed model has been separated into three main gasification stages: drying; devolatilization and tar cracking; and partial combustion and gasification. Drying was modelled as a mass transfer from the solid phase to the gas phase. The devolatilization and tar cracking model consists of two steps: devolatilization of the biomass, modelled as a single reaction generating the biomass gases from the volatile materials, and tar cracking, also modelled as one reaction generating gases with fixed mass fractions. The first reaction was classified as a heterogeneous reaction, while the second was classified as a homogeneous reaction. The partial combustion and gasification model consisted of carbon combustion reactions and carbon and gas phase reactions. The partial combustion considered was for C, CO, H2 and CH4. The carbon gasification reactions used in this study are the Boudouard reaction with CO2, the reaction with H2O, and the methanation reaction to form methane. The other gas phase reactions considered in this study are the water gas shift reaction, which is modelled as a reversible reaction, and the methane steam reforming reaction.
The developed gasification model was validated using experimental data from the literature covering a wide range of operating conditions. Good agreement was observed, confirming the capability of the model to predict biomass gasification in a CFB with good accuracy. The developed model has been successfully used to carry out sensitivity and parametric analyses. The sensitivity analysis included the effect of including various combustion reactions and the effect of radiation on the gasification reactions. The model was also used to carry out a parametric analysis by varying the following gasifier operating conditions: fuel/air ratio; biomass flow rate; sand (heat carrier) temperature; sand flow rate; sand and biomass particle sizes; gasifying agent (pure air or pure steam); pyrolysis model used; and steam/biomass ratio. Finally, based on these parametric and sensitivity analyses, a final model was recommended for the simulation of biomass gasification in a CFB riser.
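As an illustration of how one of these reactions enters a CFD species-transport model, the sketch below assembles a reversible volumetric source term for the water-gas shift reaction, CO + H2O <-> CO2 + H2. The Arrhenius pre-exponential factor, activation energy and equilibrium-constant fit are illustrative placeholder values in the style of published gasification kinetics, not the parameters used in this thesis.

```python
import numpy as np

R = 8.314  # universal gas constant, J/(mol*K)

def arrhenius(A, Ea, T):
    """Generic Arrhenius rate constant k = A * exp(-Ea / (R*T))."""
    return A * np.exp(-Ea / (R * T))

def water_gas_shift_rate(C_CO, C_H2O, C_CO2, C_H2, T,
                         A_f=2.78e3, Ea_f=12.6e3):
    """Net volumetric rate of CO + H2O <-> CO2 + H2 in mol/(m^3*s).

    A_f, Ea_f (J/mol) and the equilibrium-constant fit below are
    illustrative placeholders, not the thesis's fitted kinetics.
    """
    k_f = arrhenius(A_f, Ea_f, T)
    K_eq = 0.0265 * np.exp(3968.0 / T)  # illustrative equilibrium-constant fit
    return k_f * (C_CO * C_H2O - C_CO2 * C_H2 / K_eq)

# Example: net rate at 1100 K for assumed molar concentrations in mol/m^3
print(water_gas_shift_rate(C_CO=2.0, C_H2O=3.0, C_CO2=1.0, C_H2=1.5, T=1100.0))
```

In a full CFD model this net rate would be multiplied by the species molecular weights and added as source/sink terms in the corresponding transport equations.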
Abstract:
Timely warning of the public during large-scale emergencies is essential to ensure safety and save lives. This ongoing study proposes an agent-based simulation model to simulate warning message dissemination among the public, considering both official and unofficial channels. The proposed model was developed in NetLogo software for a hypothetical area and requires input parameters such as the effectiveness of each official source (%), the estimated time to begin informing others, the estimated time to inform others, and the estimated percentage of people who do not relay the message. This paper demonstrates a means of factoring the behaviour of the public as informants into estimating the effectiveness of warning dissemination during large-scale emergencies. The model provides a tool for practitioners to test the potential impact of informal channels on the overall warning time and the sensitivity of the modelling parameters. The tool would help practitioners persuade evacuees to disseminate the warning message by informing others, similar to the 'Run to thy neighbour' campaign conducted by the Red Cross.
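The dissemination logic can be sketched in a few lines of agent-based code. The snippet below is a hypothetical, simplified Python analogue of the NetLogo model described above; the parameter names mirror the listed inputs, but all values, the random-mixing contact structure and the one-minute time step are assumptions made only for illustration.

```python
import random

# Hypothetical placeholder values; the actual model is parameterized by the user.
OFFICIAL_EFFECTIVENESS = 0.40   # share of the public reached directly by official channels
P_DO_NOT_RELAY = 0.30           # share of informed people who never pass the warning on
TIME_TO_BEGIN_INFORMING = 5     # minutes before an informed person starts informing others
TIME_TO_INFORM = 2              # minutes needed to inform one other person
N_AGENTS = 1000

def simulate(total_minutes=120):
    informed_at = {}   # agent id -> minute the agent received the warning
    relays = {}        # agent id -> whether the agent passes the warning on
    for a in range(N_AGENTS):
        if random.random() < OFFICIAL_EFFECTIVENESS:
            informed_at[a] = 0
            relays[a] = random.random() >= P_DO_NOT_RELAY
    for t in range(1, total_minutes):
        for a, t0 in list(informed_at.items()):
            if not relays[a] or t < t0 + TIME_TO_BEGIN_INFORMING:
                continue
            if (t - t0 - TIME_TO_BEGIN_INFORMING) % TIME_TO_INFORM == 0:
                target = random.randrange(N_AGENTS)          # random contact
                if target not in informed_at:
                    informed_at[target] = t
                    relays[target] = random.random() >= P_DO_NOT_RELAY
    return len(informed_at) / N_AGENTS

print(f"fraction warned after 2 hours: {simulate():.2f}")
```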
Abstract:
Commercial process simulators are attracting increasing interest in chemical engineering education. In this paper, the use of commercial dynamic simulation software, D-SPICE® and K-Spice®, in three different chemical engineering courses is described and discussed. The courses cover the following topics: basic chemical engineering, operability and safety analysis, and process control. User experiences from both teachers and students are presented. The benefits of dynamic simulation as an additional teaching tool are discussed and summarized. The experiences confirm that commercial dynamic simulators provide realistic training and can be successfully integrated into undergraduate and graduate teaching, laboratory courses and research. © 2012 The Institution of Chemical Engineers.
Abstract:
Certain theoretical and methodological problems of designing real-time dynamical expert systems, which belong to the class of the most complex integrated expert systems, are discussed. Primary attention is given to the problems of designing subsystems for modeling the external environment in the case where the environment is represented by complex engineering systems. A specific approach to designing simulation models for complex engineering systems is proposed and examples of the application of this approach based on the G2 (Gensym Corp.) tool system are described.
Abstract:
2010 Mathematics Subject Classification: 65D18.
Abstract:
Ecological models have often been used to answer questions at the forefront of recent research, such as the possible effects of climate change. The methodology of tactical models is a very useful tool compared with complex models that require a relatively large set of input parameters. In this study, a theoretical strategic model (TEGM) was adapted to field data on the basis of a 24-year monitoring database of phytoplankton in the Danube River at the station of Göd, Hungary (at river kilometer 1669, hereafter referred to as "rkm"). The Danubian Phytoplankton Growth Model (DPGM) is able to describe the seasonal dynamics of phytoplankton biomass (mg L−1) based on daily temperature, while also taking the availability of light into consideration. In order to improve fitting, the 24-year database was split into two parts according to environmental conditions: the period 1979–1990 has a higher level of nutrient excess than 1991–2002. The authors assume that phytoplankton responded to temperature in two different ways in these two periods, and thus two submodels were developed, DPGM-sA and DPGM-sB. Observed and simulated data correlated quite well. The findings suggest that a linear temperature rise brings drastic change to phytoplankton only in the case of high nutrient load, and that this change is mostly realized through an increase in yearly total biomass.
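For readers unfamiliar with this class of tactical model, the sketch below illustrates the general idea of driving daily phytoplankton growth with temperature and seasonal light availability. The functional forms, parameter values and logistic limitation are hypothetical placeholders and do not reproduce the published DPGM equations.

```python
import math

def temperature_response(T, T_opt=20.0, sigma=6.0):
    """Gaussian-type response of growth to daily water temperature (degC) -- illustrative."""
    return math.exp(-((T - T_opt) / sigma) ** 2)

def light_availability(day_of_year):
    """Crude seasonal proxy for light, peaking near midsummer (day 172) -- illustrative."""
    return 0.5 * (1.0 + math.cos(2.0 * math.pi * (day_of_year - 172) / 365.0))

def step_biomass(B, T, day, r_max=0.4, K=30.0, loss=0.05):
    """One daily update of phytoplankton biomass B (mg/L) with logistic limitation."""
    growth = r_max * temperature_response(T) * light_availability(day) * B * (1.0 - B / K)
    return max(B + growth - loss * B, 0.01)

# Example: run one year with a sinusoidal temperature series.
B = 0.5
for day in range(365):
    T = 12.0 + 10.0 * math.sin(2.0 * math.pi * (day - 105) / 365.0)
    B = step_biomass(B, T, day)
print(f"simulated biomass at year end: {B:.2f} mg/L")
```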
Abstract:
Developing analytical models that can accurately describe the behavior of Internet-scale networks is difficult. This is due, in part, to the heterogeneous structure, immense size and rapidly changing properties of today's networks. The lack of analytical models makes large-scale network simulation an indispensable tool for studying immense networks. However, large-scale network simulation has not been commonly used to study networks of Internet scale. This can be attributed to three factors: 1) current large-scale network simulators are geared towards simulation research and not network research, 2) the memory required to execute an Internet-scale model is exorbitant, and 3) large-scale network models are difficult to validate. This dissertation tackles each of these problems.

First, this work presents a method for automatically enabling real-time interaction, monitoring and control of large-scale network models. Network researchers need tools that allow them to focus on creating realistic models and conducting experiments, but this should not increase the complexity of developing a large-scale network simulator. This work presents a systematic approach to separating the concerns of running large-scale network models on parallel computers from the user-facing concerns of configuring and interacting with those models.

Second, this work deals with reducing the memory consumption of network models. As network models become larger, so does the amount of memory needed to simulate them. This work presents a comprehensive approach to exploiting structural duplications in network models to dramatically reduce the memory required to execute large-scale network experiments.

Lastly, this work addresses the issue of validating large-scale simulations by integrating real protocols and applications into the simulation. With an emulation extension, a network simulator operating in real time can run together with real-world distributed applications and services. As such, real-time network simulation not only alleviates the burden of developing separate models for applications in simulation, but, because real systems are included in the network model, it also increases the confidence level of the simulation. This work presents a scalable and flexible framework to integrate real-world applications with real-time simulation.
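The structural-duplication idea in the second contribution can be illustrated with a flyweight-style sketch: topologically identical sub-networks share one immutable template while each instance stores only its own mutable state. The class names and fields below are hypothetical and are not the dissertation's actual data structures.

```python
from dataclasses import dataclass, field

@dataclass(frozen=True)
class SubnetTemplate:
    """Shared, read-only description of a duplicated subnet topology."""
    name: str
    links: tuple   # e.g. ((0, 1), (1, 2), ...) as immutable host-index pairs

@dataclass
class SubnetInstance:
    """Lightweight per-instance record: a shared template plus only mutable state."""
    template: SubnetTemplate
    base_address: int
    queue_lengths: dict = field(default_factory=dict)   # per-link dynamic state only

# One template reused by many instances: the topology is stored exactly once.
campus = SubnetTemplate("campus-lan", links=((0, 1), (1, 2), (2, 3)))
instances = [SubnetInstance(campus, base_address=i * 256) for i in range(10_000)]
print(len({id(inst.template) for inst in instances}))   # -> 1
```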
Abstract:
Network simulation is an indispensable tool for studying Internet-scale networks due to their heterogeneous structure, immense size and changing properties. It is crucial for network simulators to generate representative traffic, which is necessary for effectively evaluating next-generation network protocols and applications. With network simulation, we can make a distinction between foreground traffic, which is generated by the target applications the researchers intend to study and therefore must be simulated with high fidelity, and background traffic, which represents the network traffic generated by other applications and does not require significant accuracy. The background traffic has a significant impact on the foreground traffic, since it competes with the foreground traffic for network resources and can therefore drastically affect the behavior of the applications that produce the foreground traffic. This dissertation aims to provide a solution for meaningfully generating background traffic in three aspects. First is realism. Realistic traffic characterization plays an important role in determining the correct outcome of simulation studies. This work starts by enhancing an existing fluid background traffic model by removing its two unrealistic assumptions. The improved model can correctly reflect the network conditions in the reverse direction of the data traffic and can reproduce the traffic burstiness observed in measurements. Second is scalability. The trade-off between accuracy and scalability is a constant theme in background traffic modeling. This work presents a fast rate-based TCP (RTCP) traffic model, which uses analytical models to represent TCP congestion control behavior. This model outperforms other existing traffic models in that it can correctly capture the overall TCP behavior and achieve a speedup of more than two orders of magnitude over the corresponding packet-oriented simulation. Third is network-wide traffic generation. Regardless of how detailed or scalable the models are, they mainly focus on how to generate traffic on one single link, which cannot be extended easily to studies of more complicated network scenarios. This work presents a cluster-based spatio-temporal background traffic generation model that considers spatial and temporal traffic characteristics as well as their correlations. The resulting model can be used effectively for evaluation work in network studies.
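To make the rate-based idea concrete, the snippet below evaluates a well-known closed-form TCP throughput approximation (the Mathis et al. square-root formula). It is shown only as an example of the kind of analytical model such a rate-based approach builds on, not as the dissertation's actual RTCP formulation.

```python
import math

def tcp_throughput_bps(mss_bytes, rtt_s, loss_rate):
    """Approximate steady-state TCP throughput (bit/s): MSS / (RTT * sqrt(2p/3))."""
    return (mss_bytes * 8) / (rtt_s * math.sqrt(2.0 * loss_rate / 3.0))

# Example: 1460-byte MSS, 50 ms RTT, 1% loss -> roughly 2.9 Mbit/s.
print(f"{tcp_throughput_bps(1460, 0.05, 0.01) / 1e6:.2f} Mbit/s")
```

Replacing per-packet window dynamics with such closed-form rates is what makes speedups of orders of magnitude over packet-oriented simulation possible.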
Abstract:
The optimization of the timing parameters of traffic signals provides for efficient operation of traffic along a signalized transportation system. Optimization tools with macroscopic simulation models have been used to determine optimal timing plans. These plans have, in some cases, been evaluated and fine-tuned using microscopic simulation tools. A number of studies show inconsistencies between optimization tool results based on macroscopic simulation and the results obtained from microscopic simulation, and no attempts have been made to determine the reason behind these inconsistencies. This research investigates whether adjusting the parameters of macroscopic simulation models to correspond to the calibrated microscopic simulation model parameters can reduce these inconsistencies. The adjusted parameters include platoon dispersion model parameters, saturation flow rates, and cruise speeds. The results from this work show that adjusting cruise speeds and saturation flow rates can have significant impacts on improving the optimization/macroscopic simulation results as assessed by microscopic simulation models.
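One of the adjusted parameter groups, platoon dispersion, typically refers to Robertson-style recursions used in TRANSYT-type macroscopic tools. The sketch below shows that recursion with commonly cited default values (alpha = 0.35, beta = 0.8); it illustrates the mechanism rather than the calibration procedure used in this research.

```python
def disperse_platoon(upstream_flow, travel_time_steps, alpha=0.35, beta=0.8):
    """Predict a downstream flow profile (veh/step) from an upstream profile.

    Robertson-type recursion: q_d[i + t] = F * q_u[i] + (1 - F) * q_d[i + t - 1],
    with t the lagged travel time and F a smoothing factor.
    """
    t = max(1, round(beta * travel_time_steps))   # lagged travel time in steps
    F = 1.0 / (1.0 + alpha * t)                   # smoothing (dispersion) factor
    downstream = [0.0] * (len(upstream_flow) + t)
    for i, q_up in enumerate(upstream_flow):
        downstream[i + t] = F * q_up + (1.0 - F) * downstream[i + t - 1]
    return downstream

# Example: a 4-step platoon released at the upstream stop line, 5-step travel time.
print([round(q, 2) for q in disperse_platoon([10, 10, 10, 10, 0, 0, 0, 0], travel_time_steps=5)])
```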
Abstract:
With the growing commercial importance of the Internet and the development of new real-time, connection-oriented services like IP telephony and electronic commerce, resilience is becoming a key issue in the design of IP-based networks. Two emerging technologies that can accomplish the task of efficient information transfer are Multiprotocol Label Switching (MPLS) and Differentiated Services. A main benefit of MPLS is the ability to introduce traffic-engineering concepts due to its connection-oriented characteristic; with MPLS it is possible to assign different paths to packets through the network. Differentiated Services divides traffic into different classes and treats them differently, especially when there is a shortage of network resources. In this thesis, a framework was proposed to integrate the above two technologies, and its performance in providing load balancing and improving QoS was evaluated. Simulation and analysis of this framework demonstrated that the combination of MPLS and Differentiated Services is a powerful tool for QoS provisioning in IP networks.
Abstract:
Different types of serious games have been used to elucidate computer science topics, including computer games, mobile games, Lego-based games, virtual worlds and web-based games. Different evaluation techniques have been employed, such as questionnaires, interviews, discussions and tests. Simulation has been widely used in computer science as a motivational and interactive learning tool. This paper aims to evaluate the possibility of successfully implementing simulation in computer programming modules. A framework is proposed to measure the impact of serious games on enhancing students' understanding of key computer science concepts. Experiments will be held with students of the EEECS at Queen's University Belfast to test the framework and obtain results.
Abstract:
The creation of Causal Loop Diagrams (CLDs) is a major phase in the System Dynamics (SD) life-cycle, since the created CLDs express dependencies and feedback in the system under study, as well as guide modellers in building meaningful simulation models. The creation of CLDs is still subject to the modeller's domain expertise (mental model) and her ability to abstract the system, because of the strong dependency on semantic knowledge. Since the beginning of SD, available system data sources (written and numerical models) have been sparse, very limited and imperfect, and thus of little benefit to the whole modelling process. However, in recent years we have seen an explosion in generated data, especially in all business-related domains that are analysed via Business Dynamics (BD). In this paper, we introduce a systematic, tool-supported CLD creation approach, which analyses and utilises available disparate data sources within the business domain. We demonstrate the application of our methodology on a given business use case and evaluate the resulting CLD. Finally, we propose directions for future research to further push the automation of CLD creation and increase confidence in the generated CLDs.
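One plausible data-driven step in such a tool-supported approach is sketched below: a CLD is treated as a signed directed graph, and candidate link polarities are proposed from pairwise correlations in available numerical data. The variables, threshold and correlation heuristic are illustrative assumptions, not the paper's actual pipeline, and link direction would still rely on the modeller's domain knowledge.

```python
import itertools
import statistics

def pearson(xs, ys):
    """Plain Pearson correlation coefficient of two equal-length series."""
    mx, my = statistics.mean(xs), statistics.mean(ys)
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = (sum((x - mx) ** 2 for x in xs) * sum((y - my) ** 2 for y in ys)) ** 0.5
    return num / den if den else 0.0

def propose_cld_links(series, threshold=0.6):
    """Suggest candidate CLD links (var_a, var_b, '+'/'-') from strong correlations.

    Correlation is symmetric, so the causal direction of each suggested link is
    deliberately left to the modeller.
    """
    links = []
    for a, b in itertools.combinations(series, 2):
        r = pearson(series[a], series[b])
        if abs(r) >= threshold:
            links.append((a, b, "+" if r > 0 else "-"))
    return links

data = {  # toy business time series (hypothetical)
    "marketing_spend": [1, 2, 3, 4, 5],
    "new_customers":   [2, 4, 5, 8, 9],
    "churn":           [9, 7, 6, 4, 3],
}
for link in propose_cld_links(data):
    print(link)
```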
Abstract:
Steady-state computational fluid dynamics (CFD) simulations are an essential tool in the design process of centrifugal compressors. Whilst global parameters, such as pressure ratio and efficiency, can be predicted with reasonable accuracy, the accurate prediction of detailed compressor flow fields is a much more significant challenge. Much of the inaccuracy is associated with the incorrect selection of turbulence model. The need for a quick turnaround in simulations during the design optimisation process also demands that the turbulence model selected be robust and numerically stable, with short simulation times.
In order to assess the accuracy of a number of turbulence model predictions, the current study used an exemplar open CFD test case, the centrifugal compressor ‘Radiver’, to compare the results of three eddy viscosity models and two Reynolds stress type models. The turbulence models investigated in this study were (i) Spalart-Allmaras (SA) model, (ii) the Shear Stress Transport (SST) model, (iii) a modification to the SST model denoted the SST-curvature correction (SST-CC), (iv) Reynolds stress model of Speziale, Sarkar and Gatski (RSM-SSG), and (v) the turbulence frequency formulated Reynolds stress model (RSM-ω). Each was found to be in good agreement with the experiments (below 2% discrepancy), with respect to total-to-total parameters at three different operating conditions. However, for the off-design conditions, local flow field differences were observed between the models, with the SA model showing particularly poor prediction of local flow structures. The SST-CC showed better prediction of curved rotating flows in the impeller. The RSM-ω was better for the wake and separated flow in the diffuser. The SST model showed reasonably stable, robust and time efficient capability to predict global and local flow features.
Abstract:
The early bird and night owl restaurant tool found in the accompanying Excel file provides an estimate of the effects of offering off-peak special menu prices. Unlike the classic back-of-envelope calculation, the tool includes the effect of anticipated cannibalization of full-price covers and seeks to optimize table use. The tool also considers the revenue from new customers attracted by the early bird or night owl promotions, as well as the level of increased business needed to achieve the net monetary value target for the promotion.
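The kind of calculation the tool automates can be sketched as a simple margin balance: value gained from newly attracted off-peak covers minus the margin given up when full-price customers switch to the promotion. The formula and figures below are illustrative assumptions and do not reproduce the spreadsheet's actual logic (which also optimizes table use).

```python
def promotion_net_value(new_covers, cannibalized_covers, full_price,
                        promo_price, variable_cost_per_cover):
    """Hypothetical net nightly value of an early-bird or night-owl promotion."""
    promo_margin = promo_price - variable_cost_per_cover
    full_margin = full_price - variable_cost_per_cover
    gained = new_covers * promo_margin                          # extra covers won by the promo
    lost = cannibalized_covers * (full_margin - promo_margin)   # margin given up by switchers
    return gained - lost

# Example: 30 new covers, 20 switchers, $28 full price, $19 promo price, $9 variable cost.
print(f"net value per night: ${promotion_net_value(30, 20, 28, 19, 9):.0f}")
```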