951 results for Simulation Modeling
Abstract:
One-dimensional magnetic photonic crystals (1D-MPCs) are promising structures for integrated optical isolator applications. Rare-earth-substituted garnet thin films with adequate Faraday rotation are required to fabricate planar 1D-MPCs. In this thesis, a flat-top-response 1D-MPC was proposed, and its spectral response and Faraday rotation were modeled. Bismuth-substituted iron garnet films were fabricated by RF magnetron sputtering, and their structure, composition, birefringence, and magnetooptical properties were studied. Double-layer structures for single-mode propagation were also fabricated by sputtering for the first time. Multilayer stacks with multiple defects (phase shifts), composed of Ce-YIG and GGG quarter-wave layers, were simulated by the transfer matrix method, and their transmission and Faraday rotation characteristics were studied theoretically. It is found that a flat-top response with 100% transmission and near-45° rotation is achievable by adjusting the inter-defect spacing, for film structures as thin as 30 to 35 μm. This is better than a 3-fold reduction in length compared with the best Ce-YIG films for comparable rotations, and thus allows a considerable reduction in the size of manufactured optical isolators. Transmission bands as wide as 7 nm were predicted, a considerable improvement over two-defect structures. The effects of repetition number and ratio factor on the transmission and Faraday rotation ripple factors for three- and four-defect structures are discussed. Diffraction across the structure corresponds to a longer optical path length, so guided optics are required to minimize insertion losses in integrated devices. This part is discussed in chapter 2 of this thesis. Bismuth-substituted iron garnet thin films were prepared by RF magnetron sputtering, and deposition parameter optimization, crystallinity, surface morphology, composition, and magnetic and magnetooptical properties were investigated.
A garnet film of very high crystalline quality with a smooth surface was grown heteroepitaxially on (111) GGG substrates for film thicknesses below 1 μm. Dual-layer structures with two distinct XRD peaks (within a single sputtered film) start to develop when films exceed this thickness. The development of the dual-layer structure is explained by a compositional gradient across the film thickness, rather than by the strain gradient proposed by other authors. A lower DC self-bias or a higher substrate temperature is found to delay the appearance of the second layer. The deposited films show in-plane magnetization, which is advantageous for waveguide device applications. Propagation losses of fabricated waveguides can be decreased from 25 dB/cm to 10 dB/cm by annealing in an oxygen atmosphere. The Faraday rotation at λ = 1.55 μm was also measured for the waveguides; it is small (10° for a 3 mm long waveguide) due to the presence of linear birefringence. This part is covered in chapter 4. We also investigated the elimination of linear birefringence in our sputtered films by the thickness-tuning method. We examined compressively and tensilely strained films and analyzed the photoelastic response of the sputter-deposited garnet films. It was found that the net birefringence can be eliminated under planar compressive strain conditions by sputtering. A bilayer of GGG on garnet thin film yields reduced birefringence. Temperature control during sputter deposition of the GGG cover layer is critical and strongly influences the magnetization and birefringence level in the waveguide: high-temperature deposition lowers the magnetization and increases the linear birefringence in the garnet films. Double-layer single-mode structures fabricated by sputtering were also studied. The double layer, which shows in-plane magnetization, has an increased RMS roughness after upper-layer deposition. The single-mode characteristic was confirmed by prism-coupler measurements. This part is discussed in chapter 5.
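The transfer matrix calculation used for such stack simulations can be sketched in a few lines. The sketch below computes normal-incidence transmittance for a quarter-wave mirror with a single half-wave defect layer; the refractive indices, layer counts, and design wavelength are illustrative placeholders, not the thesis's actual Ce-YIG/GGG design, and the magneto-optical (Faraday) part is omitted:

```python
import numpy as np

def layer_matrix(n, d, lam):
    """Characteristic matrix of one dielectric layer
    (index n, thickness d) at wavelength lam, normal incidence."""
    delta = 2 * np.pi * n * d / lam   # phase thickness
    return np.array([[np.cos(delta), 1j * np.sin(delta) / n],
                     [1j * n * np.sin(delta), np.cos(delta)]], dtype=complex)

def transmittance(layers, lam, n_in=1.0, n_out=1.0):
    """Transmittance of a multilayer stack; `layers` is a list of
    (refractive index, thickness) pairs, first layer first."""
    M = np.eye(2, dtype=complex)
    for n, d in layers:
        M = M @ layer_matrix(n, d, lam)
    t = 2 * n_in / (n_in * M[0, 0] + n_in * n_out * M[0, 1]
                    + M[1, 0] + n_out * M[1, 1])
    return (n_out / n_in) * abs(t) ** 2

# Quarter-wave Bragg mirror with one half-wave "defect" at 1.55 um;
# nH and nL are invented stand-ins, not measured garnet indices.
lam0 = 1.55
nH, nL = 2.2, 1.95
qw = lambda n: lam0 / (4 * n)            # quarter-wave thickness
stack = [(nH, qw(nH)), (nL, qw(nL))] * 4
stack += [(nH, 2 * qw(nH))]              # half-wave phase-shift layer
stack += [(nL, qw(nL)), (nH, qw(nH))] * 4
print(transmittance(stack, lam0))        # ~1.0 at the defect resonance
```

At the design wavelength, a symmetric stack with a half-wave defect transmits fully; it is this resonance that lets the magneto-optical versions of such stacks accumulate large Faraday rotation in a short structure.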
Abstract:
Certain theoretical and methodological problems of designing real-time dynamical expert systems, which belong to the class of the most complex integrated expert systems, are discussed. Primary attention is given to the problems of designing subsystems for modeling the external environment in the case where the environment is represented by complex engineering systems. A specific approach to designing simulation models for complex engineering systems is proposed and examples of the application of this approach based on the G2 (Gensym Corp.) tool system are described.
Abstract:
Ecological models have often been used to answer questions at the forefront of recent research, such as the possible effects of climate change. The methodology of tactical models is a very useful tool compared with complex models that require a relatively large set of input parameters. In this study, a theoretical strategic model (TEGM) was adapted to field data on the basis of a 24-year monitoring database of phytoplankton in the Danube River at the Göd station, Hungary (at river kilometer 1669, hereafter "rkm"). The Danubian Phytoplankton Growth Model (DPGM) is able to describe the seasonal dynamics of phytoplankton biomass (mg L−1) based on daily temperature, but also takes the availability of light into consideration. To improve the fit, the 24-year database was split into two parts according to environmental conditions: the period 1979-1990 has a higher level of nutrient excess than 1991-2002. The authors assume that in these two periods phytoplankton responded to temperature in two different ways; thus two submodels were developed, DPGM-sA and DPGM-sB. Observed and simulated data correlated quite well. The findings suggest that a linear temperature rise brings drastic change to phytoplankton only in the case of high nutrient load, and that it is mostly realized through an increase in yearly total biomass.
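A minimal sketch of a temperature-driven biomass model in the spirit described above — this is not the DPGM itself; the Gaussian temperature response, the logistic growth form, and every parameter value below are invented for illustration:

```python
import math

def temp_growth_rate(T, r_max=0.6, T_opt=20.0, sigma=8.0):
    """Hypothetical Gaussian temperature response peaking at T_opt (degrees C)."""
    return r_max * math.exp(-((T - T_opt) / sigma) ** 2)

def simulate_biomass(temps, b0=0.01, K=30.0, loss=0.1):
    """Daily logistic biomass dynamics (mg/L) driven by a daily mean
    temperature series; `loss` lumps grazing, washout, and decay."""
    b, series = b0, []
    for T in temps:
        b += temp_growth_rate(T) * b * (1 - b / K) - loss * b
        b = max(b, 0.01)          # keep a small seed population over winter
        series.append(b)
    return series

# Idealized annual cycle of daily mean water temperature (degrees C):
temps = [12 + 10 * math.sin(2 * math.pi * (d - 90) / 365) for d in range(365)]
biomass = simulate_biomass(temps)
print(round(max(biomass), 1))     # summer biomass peak (mg/L)
```

Even this toy version reproduces the qualitative point of such models: biomass collapses to the seed level in the cold season and peaks shortly after the temperature optimum is reached in summer.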
Abstract:
When simulation modeling is used for performance-improvement studies of complex systems such as transport terminals, domain-specific conceptual modeling constructs can be used by modelers to create structured models. A two-stage procedure, comprising identification of the problem characteristics/cluster ('knowledge acquisition') and identification of standard models for the problem cluster ('model abstraction'), was found to be effective in creating structured models when applied to certain logistic terminal systems. In this paper we discuss methods and examples related to the knowledge acquisition and model abstraction stages for the development of three different categories of terminal-system models.
Abstract:
The use of immobilized cells in tower-type bioreactors is one of the main alternatives being studied to improve industrial bioprocesses. Other alternatives for the production of beta-lactam antibiotics, such as a cephalosporin C fed-batch process in an aerated stirred-tank bioreactor with free cells of Cephalosporium acremonium, or a tower-type bioreactor with immobilized cells of this fungus, have proven to be more efficient than the batch process. In the fed-batch process, it is possible to minimize the catabolite repression exerted by the rapid utilization of carbon sources (such as glucose) during antibiotic synthesis by using a suitable flow rate of supplementary medium. In this study, several runs for cephalosporin C production, each lasting 200 h, were conducted in a fed-batch tower-type bioreactor using different hydrolyzed sucrose concentrations. For this study's model, modifications were introduced to take into account the influence of the supplementary medium flow rate. The balance equations considered the effect of oxygen limitation inside the bioparticles. In the Monod-type rate equations, cell concentration, substrate concentration, and dissolved oxygen were included as reactants affecting the bioreaction rate. The set of differential equations was solved numerically, and the parameter values were estimated by classic nonlinear regression following Marquardt's procedure with a 95% confidence interval. The simulation results showed that the proposed model fit the experimental data well, and based on the experimental data and the mathematical model, an optimal mass flow rate to maximize bioprocess productivity could be proposed.
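A hedged sketch of the kind of balance equations described — double-Monod kinetics in biomass, substrate, and dissolved oxygen with a constant-volume dilution approximation. All parameter values and the simplified oxygen-demand term below are invented, and the intraparticle diffusion limitation of the actual model is omitted:

```python
# Illustrative double-Monod fed-batch sketch (all parameters invented):
mu_max, Ks, Ko = 0.08, 2.0, 0.003     # 1/h, g/L, g/L
Yxs, kla, O_sat = 0.4, 50.0, 0.0075   # biomass yield, 1/h, g/L
D, Sf = 0.01, 100.0                   # dilution rate 1/h, feed conc. g/L

def step(X, S, O, dt):
    """One explicit-Euler step of the biomass/substrate/oxygen balances."""
    mu = mu_max * S / (Ks + S) * O / (Ko + O)      # double-Monod rate
    dX = mu * X - D * X                            # growth minus washout
    dS = -(mu / Yxs) * X + D * (Sf - S)            # uptake plus feed
    dO = kla * (O_sat - O) - 0.5 * (mu / Yxs) * X  # aeration minus demand
    return X + dt * dX, max(S + dt * dS, 0.0), max(O + dt * dO, 0.0)

X, S, O = 0.1, 10.0, O_sat
dt = 0.01                        # h; small enough for the fast oxygen term
for _ in range(int(200 / dt)):   # simulate one 200 h run
    X, S, O = step(X, S, O, dt)
print(round(X, 2), round(S, 2), round(O, 5))
```

Sweeping the dilution rate D in such a sketch is the sort of exercise that, with a calibrated model, yields the optimal supplementary-medium flow rate mentioned in the abstract.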
Abstract:
In this thesis, different approaches for the modeling and simulation of the blood protein fibrinogen are presented. The approaches are meant to systematically connect the multiple time and length scales involved in the dynamics of fibrinogen in solution and at inorganic surfaces. The first part of the thesis covers simulations of fibrinogen at the all-atom level. Simulations of the fibrinogen protomer and dimer are performed in explicit solvent to characterize the dynamics of fibrinogen in solution. These simulations reveal an unexpectedly large and fast bending motion that is facilitated by molecular hinges located in the coiled-coil region of fibrinogen. This behavior is characterized by a bending and a dihedral angle, and the distribution of these angles is measured. As a consequence of the atomistic detail of the simulations, it is possible to illuminate small-scale behavior in the binding pockets of fibrinogen that hints at a previously unknown allosteric effect. In a second step, atomistic simulations of the fibrinogen protomer are performed at graphite and mica surfaces to investigate initial adsorption stages. These simulations highlight the different adsorption mechanisms at the hydrophobic graphite surface and the charged, hydrophilic mica surface. It is found that initial adsorption happens in a preferred orientation on mica. Many effects of practical interest involve aggregates of many fibrinogen molecules. To investigate such systems, time and length scales must be simulated that are not attainable in atomistic simulations; it is therefore necessary to develop lower-resolution models of fibrinogen. This is done in the second part of the thesis. First, a systematically coarse-grained model is derived and parametrized based on the atomistic simulations of the first part. In this model, the fibrinogen molecule is represented by 45 beads instead of nearly 31,000 atoms.
The intra-molecular interactions of the beads are modeled as a heterogeneous elastic network, while inter-molecular interactions are assumed to be a combination of electrostatic and van der Waals interactions. A method is presented that determines the charges assigned to the beads by matching the electrostatic potential of the atomistic simulations. Lastly, a phenomenological model is developed that represents fibrinogen by five beads connected by rigid rods with two hinges. This model captures only the large-scale dynamics of the atomistic simulations, but it can shed light on experimental observations of fibrinogen conformations at inorganic surfaces.
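A heterogeneous elastic network of the kind described can be sketched directly: bead pairs joined by harmonic springs, each with its own stiffness, anchored at reference distances. The bead coordinates, pair list, and stiffness values below are toy numbers, not the thesis's 45-bead fibrinogen parametrization:

```python
import numpy as np

def elastic_network_energy(coords, ref_coords, pairs, k):
    """Potential energy of a heterogeneous elastic network:
    harmonic springs between bead pairs, each pair with its own
    stiffness, anchored to reference (equilibrium) distances."""
    E = 0.0
    for (i, j), kij in zip(pairs, k):
        r = np.linalg.norm(coords[i] - coords[j])
        r0 = np.linalg.norm(ref_coords[i] - ref_coords[j])
        E += 0.5 * kij * (r - r0) ** 2
    return E

# Toy 3-bead chain (a real model would use many more beads):
ref = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0], [2.0, 0.0, 0.0]])
pairs = [(0, 1), (1, 2), (0, 2)]
k = [10.0, 10.0, 2.0]          # heterogeneous spring stiffnesses
stretched = ref * 1.1          # uniform 10% stretch of the chain
print(elastic_network_energy(stretched, ref, pairs, k))
```

The "heterogeneous" part is simply that each spring constant is fit separately (e.g. to atomistic fluctuations) rather than set to a single uniform value as in a classic elastic network model.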
Abstract:
The Center for Transportation Research and Education (CTRE) used the traffic simulation model CORSIM to assess proposed capacity and safety improvement strategies for the U.S. 61 corridor through Burlington, Iowa. The comparison between the base and alternative models allows for evaluation of traffic flow performance under existing conditions as well as under other design scenarios. The models also provide visualization of performance for interpretation by technical staff, public policy makers, and the public. The objectives of this project are to evaluate the use of traffic simulation models for future use by the Iowa Department of Transportation (DOT) and to develop procedures for employing simulation modeling in the analysis of alternative designs. This report presents both the findings of the U.S. 61 evaluation and an overview of model development procedures. The first part of the report covers the simulation model development procedures; the simulation analysis is illustrated through the Burlington U.S. 61 corridor case study. Part I is not intended to be a user manual but simply introductory guidelines for traffic simulation modeling. Part II of the report evaluates the proposed improvement concepts in a side-by-side comparison of the base and alternative models.
Abstract:
In this Master's thesis, agent-based modeling has been used to analyze phenomena related to maintenance strategy. The main research question answered was: what does the agent-based model built for this study tell us about how different maintenance strategy decisions affect the profitability of equipment owners and maintenance service providers? The main outcome of this study is thus an analysis of how profitability can be increased in an industrial maintenance context. To answer that question, a literature review of maintenance strategy, agent-based modeling, and maintenance modeling and optimization was first conducted. This review provided the basis for building the agent-based model, which followed a standard simulation modeling procedure. With the simulation results from the agent-based model, the research question was answered. Specifically, the results of the modeling and this study are: (1) optimizing the point at which a machine is maintained increases profitability for the owner of the machine, and under certain conditions also for the maintainer; (2) time-based pricing of maintenance services leads to a zero-sum game between the parties; (3) value-based pricing of maintenance services leads to a win-win game between the parties, if the owners of the machines share a substantial amount of their value with the maintainers; and (4) error in machine condition measurement is a critical parameter in optimizing maintenance strategy, and there is real systemic value in more accurate machine condition measurement systems.
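Finding (4) can be illustrated with a toy single-machine simulation rather than the thesis's full multi-agent model: a machine degrades stochastically, and the owner maintains it when its *measured* condition crosses a threshold. All costs, wear rates, and the noise model below are invented:

```python
import random

def simulate(threshold, meas_error, days=10_000, seed=1):
    """Toy condition-based maintenance: maintain whenever the noisy
    condition measurement falls below `threshold`. Returns average
    daily profit for the machine owner."""
    rng = random.Random(seed)
    condition, profit = 1.0, 0.0
    for _ in range(days):
        condition -= rng.uniform(0.005, 0.02)        # stochastic wear
        measured = condition + rng.gauss(0, meas_error)
        if condition <= 0:                           # unplanned breakdown
            profit -= 50.0                           # expensive failure
            condition = 1.0
        elif measured < threshold:                   # planned maintenance
            profit -= 10.0                           # much cheaper
            condition = 1.0
        else:
            profit += 1.0                            # productive day
    return profit / days

# Noisier condition measurement erodes the strategy's value:
print(simulate(0.1, 0.0), simulate(0.1, 0.3))
```

With accurate measurement the owner maintains just before failure; with noisy measurement the machine is either maintained far too early or allowed to break down, which is the systemic value of better condition monitoring that the thesis points to.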
Abstract:
The heightened threat of terrorism has caused governments worldwide to plan for responding to large-scale catastrophic incidents. In England the New Dimension Programme supplies equipment, procedures and training to the Fire and Rescue Service to ensure the country's preparedness to respond to a range of major critical incidents. The Fire and Rescue Service is involved partly by virtue of being able to very quickly mobilize a large skilled workforce and specialist equipment. This paper discusses the use of discrete event simulation modeling to understand how a fire and rescue service might position its resources before an incident takes place, to best respond to a combination of different incidents at different locations if they happen. Two models are built for this purpose. The first model deals with mass decontamination of a population following a release of a hazardous substance—aiming to study resource requirements (vehicles, equipment and manpower) necessary to meet performance targets. The second model deals with the allocation of resources across regions—aiming to study cover level and response times, analyzing different allocations of resources, both centralized and decentralized. Contributions to theory and practice in other contexts (e.g. the aftermath of natural disasters such as earthquakes) are outlined.
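The resource-requirement question behind the first model can be illustrated with a minimal discrete-event sketch. The deterministic 3-minute service time, the lane counts, and the 2-hour target below are invented placeholders, not the paper's calibrated values:

```python
import heapq

def decontamination_time(population, lanes, service_min=3.0):
    """Discrete-event sketch: `population` people queue for `lanes`
    parallel decontamination lanes, each taking `service_min` minutes
    per person. Returns the time the last person is cleared."""
    free_at = [0.0] * lanes        # each lane becomes free at time 0
    heapq.heapify(free_at)
    finish = 0.0
    for _ in range(population):
        t = heapq.heappop(free_at)     # earliest-available lane
        finish = t + service_min
        heapq.heappush(free_at, finish)
    return finish

# How many lanes are needed to clear 1,000 people within a
# hypothetical 120-minute performance target?
for lanes in (20, 25, 30):
    print(lanes, decontamination_time(1000, lanes))
```

A full model would add stochastic arrivals, setup times for vehicles and manpower, and multiple incident sites, but the event-queue skeleton is the same.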
Abstract:
Thesis (Master's)--University of Washington, 2016-08
Abstract:
Tropical ecosystems play a large and complex role in the global carbon cycle. Clearing of natural ecosystems for agriculture leads to large pulses of CO2 to the atmosphere from terrestrial biomass. Concurrently, the remaining intact ecosystems, especially tropical forests, may be sequestering a large amount of carbon from the atmosphere in response to global environmental changes, including climate change and the increase in atmospheric CO2. Here we use an approach that integrates census-based historical land use reconstructions, remote-sensing-based contemporary land use change analyses, and simulation modeling of terrestrial biogeochemistry to estimate the net carbon balance over the period 1901-2006 for the state of Mato Grosso, Brazil, one of the most rapidly changing agricultural frontiers in the world. By the end of this period, we estimate that of the state's 925,225 km², 221,092 km² had been converted to pastures and 89,533 km² to croplands, with forest-to-pasture conversion the dominant land use trajectory but with transitions to croplands increasing rapidly in the last decade. These conversions led to a cumulative release of 4.8 Pg C to the atmosphere, with ~80% from forest clearing and 20% from the clearing of cerrado. Over the same period, we estimate that the residual undisturbed ecosystems accumulated 0.3 Pg C in response to CO2 fertilization. Therefore, the net emissions of carbon from Mato Grosso over this period were 4.5 Pg C. Net carbon emissions from Mato Grosso since 2000 averaged 146 Tg C/yr, on the order of Brazil's fossil fuel emissions during this period. These emissions were associated with the expansion of croplands to grow soybeans. While alternative management regimes in croplands, including tillage, fertilization, and cropping patterns, promote carbon storage in ecosystems, they remain a small portion of the net carbon balance for the region.
This detailed accounting of a region's carbon balance is the type of foundation analysis needed by the new United Nations Collaborative Programme on Reducing Emissions from Deforestation and Forest Degradation (REDD).
Abstract:
The objective of this work is the development of a dynamic radio-resource simulation tool for the LTE downlink, built on the OMNeT++ framework. The tool supports base station planning, simulation, and analysis of results. The main aspects of the radio access technology are described, namely the network architecture, coding, the definition of radio resources, the transmission rates supported at channel level, and the admission control mechanism. A radio-resource usage scenario was defined, including traffic models and both packet- and circuit-oriented services. A reference scenario was also considered for verification and validation of the simulation model. The simulation operates at system level, supported by a dynamic, stochastic, discrete-event model, so as to capture the different mechanisms characteristic of OFDMA technology. The results allow performance analysis of services, base stations, and the system in terms of mean network throughput, mean throughput per eNodeB, and mean throughput per mobile, and also make it possible to analyze the contribution of other parameters, namely bandwidth, coverage radius, service profile, and modulation scheme, among others. From the results it was possible to verify that, in a scenario with base stations with a coverage radius of 100 m, the end-user throughput was 4.69494 Mbps, i.e., 7 times higher than with base stations with a 200 m coverage radius.
Abstract:
The aim of this work was the optimization of an industrial HVAC system consisting of four adiabatic air-handling units with limitations in cooling capacity, control, and efficiency. Initially, a literature survey and the collection of information on the textile industry and the evaporative cooling process were required. In a later phase, the various data essential to understanding the building/HVAC-system pairing were collected and analyzed to identify possible optimization options. The information- and data-gathering phase also included indoor air quality (IAQ) analyses. The optimizations selected as feasible were studied and analyzed with the dynamic energy simulation software DesignBuilder, and the results were processed and adjusted so that their advantages and disadvantages could be easily interpreted; they were also the subject of an economic feasibility study. The proposed optimization yields a substantial improvement of indoor temperature and relative humidity conditions while still reducing energy consumption by about 23% (490,337 kWh), i.e., annual savings of €42,169 in operating costs, with a payback period of 1 year and 11 months.
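The quoted economics can be checked with simple payback arithmetic. The capital cost is not stated in the abstract, so the figure derived below is only the estimate implied by the stated annual saving and payback period:

```python
def simple_payback_years(capital_cost, annual_saving):
    """Simple (undiscounted) payback period in years."""
    return capital_cost / annual_saving

# Stated figures: 42,169 EUR saved per year, payback of 1 year 11 months.
payback_years = 1 + 11 / 12
annual_saving = 42_169
# Implied capital cost of the optimization (derived, not stated):
capital = annual_saving * payback_years
print(round(capital))   # roughly 80,800 EUR

# Sanity check: that capital cost reproduces the stated payback.
assert abs(simple_payback_years(capital, annual_saving) - payback_years) < 1e-9
```

A fuller feasibility study would discount future savings, but for a payback under two years the simple and discounted figures differ little.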
Abstract:
Intensification of agricultural production without sound management and regulation can lead to severe environmental problems, as in western Santa Catarina State, Brazil, where intensive swine production has caused large accumulations of manure and, consequently, water pollution. Natural resource scientists are asked by decision-makers for advice on management and regulatory decisions. Distributed environmental models are useful tools, since they can be used to explore the consequences of various management practices. However, in many areas of the world, quantitative data for model calibration and validation are lacking. The data-intensive distributed environmental model AgNPS was applied in a data-poor environment, the upper catchment (2,520 ha) of the Ariranhazinho River, near the city of Seara, in Santa Catarina State. Steps included data preparation, cell size selection, sensitivity analysis, model calibration, and application to different management scenarios. The model was calibrated based on a best guess for model parameters and on a pragmatic sensitivity analysis. The parameters were adjusted to match model outputs (runoff volume, peak runoff rate, and sediment concentration) closely to the sparse observed data. A modeling grid cell resolution of 150 m gave appropriate results at acceptable computational cost. The rainfall-runoff response of the AgNPS model was calibrated using three separate rainfall ranges (< 25, 25-60, > 60 mm). Predicted sediment concentrations were consistently six to ten times higher than observed, probably due to sediment trapping along vegetated channel banks. Predicted N and P concentrations in stream water ranged from just below to well above regulatory norms. Expert knowledge of the area, in addition to experience reported in the literature, was able to compensate in part for the limited calibration data.
Several scenarios (actual, recommended, and excessive manure applications, and point-source pollution from swine operations) could be compared with the model, using relative rankings rather than quantitative predictions.
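Calibrating a rainfall-runoff response separately for storm-size classes, as described above, can be illustrated with a toy sketch. This is not AgNPS; the observed events and the simple per-range runoff-coefficient fit below are invented:

```python
import statistics

def calibrate_runoff_coefficients(events):
    """Fit one runoff coefficient (runoff / rainfall) per rainfall
    range, mirroring calibration over separate storm-size classes."""
    ranges = {"small (<25 mm)": lambda p: p < 25,
              "medium (25-60 mm)": lambda p: 25 <= p <= 60,
              "large (>60 mm)": lambda p: p > 60}
    coeffs = {}
    for name, in_range in ranges.items():
        ratios = [q / p for p, q in events if in_range(p)]
        coeffs[name] = statistics.mean(ratios) if ratios else None
    return coeffs

# Hypothetical (rainfall mm, runoff mm) storm observations:
events = [(10, 1), (20, 3), (30, 8), (50, 18), (70, 35), (90, 50)]
print(calibrate_runoff_coefficients(events))
```

Splitting by rainfall range captures the usual nonlinearity of catchment response — larger storms convert a larger fraction of rainfall to runoff — without needing a single functional form across all storm sizes.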
Abstract:
It has been convincingly argued that computer simulation modeling differs from traditional science. If we understand simulation modeling as a new way of doing science, the manner in which scientists learn about the world through models must also be considered differently. This article examines how researchers learn about environmental processes through computer simulation modeling. Proposing a conceptual framework anchored in a performative philosophical approach, we examine two modeling projects undertaken by research teams in England, both aiming to inform flood risk management. One of the modeling teams operated in the research wing of a consultancy firm; the other was a group of university scientists taking part in an interdisciplinary project experimenting with public engagement. We found that in the first context the use of standardized software was critical to the process of improvisation: the obstacles that emerged concerned data, and they were resolved by exploiting affordances for generating, organizing, and combining scientific information in new ways. In the second context, an environmental competency group, the obstacles related to the computer program, and affordances emerged from combining experience-based knowledge with the scientists' skill, enabling a reconfiguration of the mathematical structure of the model and allowing the group to learn about local flooding.