858 results for Multi-agent simulation
Abstract:
This thesis assesses whether accounting for non-tradable goods sectors in a calibrated Auerbach-Kotlikoff multi-regional overlapping-generations model significantly affects the model's results when simulating the economic impact of demographic change. Non-tradable goods constitute a major share of up to 80 percent of GDP in modern economies. At the same time, the multi-regional overlapping-generations models presented in the literature on demographic change have so far ignored their existence and counterfactually assumed perfect tradability between model regions. Moreover, this thesis introduces the assumption of an increasing preference share for non-tradable goods among old generations. This fact-based assumption is also absent from the models in the relevant literature.

These obvious simplifications of common models vis-à-vis reality notwithstanding, this thesis concludes that the differences in results between a model featuring non-tradable goods and a common model with perfect tradability are very small. In other words, the common simplification of ignoring non-tradable goods is unlikely to lead to significant distortions in model results.

To ensure that differences in results between the 'new' model, featuring both non-tradable and tradable goods, and the common model solely reflect the more realistic structure of the 'new' model, both models are calibrated to match exactly the same benchmark data and thus show no deviations in their respective baseline steady states.

A variation analysis performed in this thesis suggests that differences between the common model and a model with non-tradable goods can in theory be large, but only if the benchmark tradable goods sector is assumed to be unrealistically small.

Finally, this thesis analyzes potential real exchange rate effects of demographic change, which could occur due to regional price differences of non-tradable goods.
However, the results show that shifts in the real exchange rate based on these price differences are negligible.
Abstract:
The main objective of this work is first to present the fundamental concepts underlying the agent paradigm. Once introduced, these concepts are placed in a concrete programming environment through a specific platform called Jason. As will become clear from reading this treatment, an agent system consists of the agents themselves and of the environment in which they are situated. The environment is therefore another fundamental building block, and a new paradigm for programming environments, called Agent & Artifact, has been introduced for this purpose. In particular, the reference framework of this paradigm, CArtAgO, is described at length. After illustrating the concepts and tools for conveniently programming and designing agent systems, an example application of this technology is finally presented through a case study. The system design in question concerns a real industrial case and integrates RFID technology with agent technology to provide a solution to a problem known as periodic inventory control.
Abstract:
Apart from one article published by Rabl and Sigrist in 1992 (Rechtsmedizin 2:156-158), there are no further reports on secondary skull fractures in shots from captive bolt guns. Up to now, the pertinent literature has placed particular emphasis on the absence of indirect lesions away from the impact point when dealing with the wounding capacity of slaughterers' guns. The recent observation of two suicidal head injuries accompanied by skull fractures far away from the bolt's path prompted experimental studies using simulants (glycerin soap, gelatin balls) and skull-brain models. As far as ballistic soap was concerned, the dimensions of the bolt's channel were assessed by multi-slice computed tomography before the blocks were cut open. The test shots into gelatin balls and skull-brain models were documented by means of a high-speed motion camera. As expected, the typical temporary cavity effect of bullets fired from conventional guns could not be observed when captive bolt stunners were discharged. Nevertheless, the visualized transfer of kinetic energy justifies the assumption that the secondary fractures seen in thin parts of the skull were caused by a hydraulic burst effect.
Abstract:
Image-based modeling of tumor growth combines methods from cancer simulation and medical imaging. In this context, we present a novel approach to adapting a healthy brain atlas to MR images of tumor patients. To establish correspondence between a healthy atlas and a pathologic patient image, tumor growth modeling is employed in combination with registration algorithms. In the first step, the tumor is grown in the atlas based on a new multi-scale, multi-physics model that spans growth simulation from the cellular level up to the biomechanical level, accounting for cell proliferation and tissue deformations. Large-scale deformations are handled with an Eulerian approach for finite element computations, which can operate directly on the image voxel mesh. Subsequently, dense correspondence between the modified atlas and the patient image is established using nonrigid registration. The method offers opportunities for atlas-based segmentation of tumor-bearing brain images as well as for improved patient-specific simulation and prognosis of tumor progression.
Abstract:
BACKGROUND: After bovine spongiform encephalopathy (BSE) emerged in European cattle livestock in 1986, a fundamental question was whether the agent had also established itself in the small ruminant population. In Switzerland, transmissible spongiform encephalopathies (TSEs) in small ruminants have been monitored since 1990. While a BSE infection could be excluded in the most recent TSE cases, techniques to discriminate scrapie from BSE were not available at the time of diagnosis of the historical cases, so their status remained unclear. We herein applied state-of-the-art techniques to retrospectively classify these animals and to re-analyze the affected flocks for secondary cases. These results were the basis for models simulating the course of TSEs over a period of 70 years. The aim was to arrive at a statistically based overall assessment of the TSE situation in the domestic small ruminant population in Switzerland. RESULTS: In total, 16 TSE cases have been identified in small ruminants in Switzerland since 1981, of which eight were atypical scrapie and six were classical scrapie. In two animals, retrospective analysis did not allow any further classification due to the lack of appropriate tissue samples. We found no evidence of an infection with the BSE agent in the cases under investigation. In none of the affected flocks were secondary cases identified. A Bayesian prevalence calculation resulted in most likely estimates of one case of BSE, five cases of classical scrapie, and 21 cases of atypical scrapie per 100,000 small ruminants. According to our models, none of the TSEs is expected to cause a broader epidemic in Switzerland. In a closed population, they are rather expected to fade out over the next decades or, in the case of a sporadic origin, may remain at a very low level. CONCLUSIONS: In summary, these data indicate that despite a significant epidemic of BSE in cattle, there is no evidence that BSE established itself in the small ruminant population in Switzerland. Classical and atypical scrapie both occur at a very low level and are not expected to escalate into an epidemic. In this situation, the extent of TSE surveillance in small ruminants requires reevaluation based on cost-benefit analysis.
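A Bayesian prevalence calculation of the kind the abstract mentions can be sketched with a simple grid-based posterior under a uniform prior; this is a generic illustration, not the study's actual model or priors.

```python
import math

def prevalence_posterior(cases, n_tested, grid=20000):
    """Grid-based posterior over prevalence p with a uniform prior.

    A minimal sketch of a Bayesian prevalence calculation; the study's
    actual model is not reproduced here. Returns the posterior mode and
    an equal-tailed 95% credible interval.
    """
    ps = [(i + 0.5) / grid for i in range(grid)]
    # binomial log-likelihood, constant terms dropped
    ll = [cases * math.log(p) + (n_tested - cases) * math.log(1.0 - p) for p in ps]
    m = max(ll)
    w = [math.exp(x - m) for x in ll]
    total = sum(w)
    w = [x / total for x in w]
    mode = ps[max(range(grid), key=lambda i: w[i])]
    # equal-tailed 95% interval from the cumulative posterior weights
    cum, lo, hi = 0.0, None, None
    for p, wi in zip(ps, w):
        cum += wi
        if lo is None and cum >= 0.025:
            lo = p
        if hi is None and cum >= 0.975:
            hi = p
    return mode, lo, hi
```

For example, observing 16 cases in 400 tested animals yields a posterior mode near 0.04 with an interval that quantifies the surveillance uncertainty.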
Abstract:
Multi-site time series studies of air pollution and mortality and morbidity have figured prominently in the literature as comprehensive approaches for estimating the acute effects of air pollution on health. Hierarchical models are generally used to combine site-specific information and estimate pooled air pollution effects, taking into account both within-site statistical uncertainty and across-site heterogeneity. Within a site, characteristics of time series data on air pollution and health (small pollution effects, missing data, highly correlated predictors, nonlinear confounding, etc.) make modelling all sources of uncertainty challenging. One potential consequence is underestimation of the statistical variance of the site-specific effects to be combined. In this paper we investigate the impact of variance underestimation on the pooled relative rate estimate. We focus on two-stage normal-normal hierarchical models and on underestimation of the statistical variance at the first stage. Through mathematical considerations and simulation studies, we found that variance underestimation does not affect the pooled estimate substantially. However, some sensitivity of the pooled estimate to variance underestimation is observed when the number of sites is small and the underestimation is severe. These simulation results apply to any two-stage normal-normal hierarchical model for combining site-specific results, and they can easily be extended to more general hierarchical formulations. We also examined the impact of variance underestimation on the national average relative rate estimate from the National Morbidity, Mortality, and Air Pollution Study and found that variance underestimation as large as 40% has little effect on the national average.
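Two-stage normal-normal pooling of site-specific estimates can be sketched with the standard DerSimonian-Laird moment estimator; this is the generic estimator, not necessarily the paper's exact implementation.

```python
def pool_random_effects(betas, variances):
    """Two-stage normal-normal pooling (DerSimonian-Laird moment estimator).

    betas: site-specific effect estimates; variances: their first-stage
    statistical variances. Returns (pooled_estimate, pooled_variance, tau2),
    where tau2 is the across-site heterogeneity variance.
    """
    k = len(betas)
    w = [1.0 / v for v in variances]
    sw = sum(w)
    fixed = sum(wi * b for wi, b in zip(w, betas)) / sw
    # Cochran's Q statistic and the moment estimator of tau^2
    q = sum(wi * (b - fixed) ** 2 for wi, b in zip(w, betas))
    c = sw - sum(wi ** 2 for wi in w) / sw
    tau2 = max(0.0, (q - (k - 1)) / c)
    # random-effects weights fold heterogeneity into each site's variance
    w_re = [1.0 / (v + tau2) for v in variances]
    pooled = sum(wi * b for wi, b in zip(w_re, betas)) / sum(w_re)
    return pooled, 1.0 / sum(w_re), tau2
```

Scaling all first-stage variances down by the same factor (mimicking uniform underestimation) leaves the pooled point estimate unchanged in this symmetric setting, which gives some intuition for the paper's robustness finding.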
Abstract:
PURPOSE: To compare objective fellow and expert efficiency indices for an interventional radiology renal artery stenosis skill set with the use of a high-fidelity simulator. MATERIALS AND METHODS: The Mentice VIST simulator was used for three different renal artery stenosis simulations of varying difficulty, which were used to grade performance. Fellows' indices at three intervals throughout 1 year were compared to expert baseline performance. Seventy-four simulated procedures were performed, 63 of which were captured as audiovisual recordings. Three levels of fellow experience were analyzed: 1, 6, and 12 months of dedicated interventional radiology fellowship. The recordings were compiled on a computer workstation and analyzed. Distinct measurable events in the procedures were identified with task analysis, and data regarding efficiency were extracted. Total scores were calculated as the product of procedure time, fluoroscopy time, tools, and contrast agent volume. The lowest scores, which reflected efficient use of tools, radiation, and time, were considered to indicate proficiency. Subjective analysis of participants' procedural errors was not included in this analysis. RESULTS: Fellows' mean scores diminished from 1 month to 12 months (42,960 at 1 month, 18,726 at 6 months, and 9,636 at 12 months). The experts' mean score was 4,660. In addition, the range of variance in score diminished with increasing experience (from a range of 5,940-120,156 at 1 month to 2,436-85,272 at 6 months and 2,160-32,400 at 12 months). Expert scores ranged from 1,450 to 10,800. CONCLUSIONS: Objective efficiency indices for simulated procedures can demonstrate scores directly comparable to the level of clinical experience.
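The scoring rule described above, a total score computed as the product of procedure time, fluoroscopy time, tools used, and contrast agent volume, with lower meaning more efficient, is simple enough to state directly; the units below (minutes, count, mL) are assumptions for illustration.

```python
def efficiency_score(procedure_min, fluoro_min, tools_used, contrast_ml):
    """Composite efficiency index from the abstract's scoring rule:
    the product of the four measured quantities; lower is better.
    Units (minutes, count, mL) are illustrative assumptions."""
    return procedure_min * fluoro_min * tools_used * contrast_ml
```

Under this rule a trainee who halves procedure and fluoroscopy time while using fewer tools and less contrast sees a multiplicative, not additive, drop in score, which matches the wide score ranges reported.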
Abstract:
To mitigate greenhouse gas (GHG) emissions and reduce U.S. dependence on imported oil, the United States is pursuing several options to create biofuels from renewable woody biomass (hereafter referred to as “biomass”). Because of the distributed nature of biomass feedstock, the cost and complexity of biomass recovery operations pose significant challenges that hinder increased biomass utilization for energy production. To facilitate the exploration of a wide variety of conditions that promise profitable biomass utilization and tap unused forest residues, this work proposes to develop biofuel supply chain models based on optimization and simulation approaches. The biofuel supply chain is structured around four components: biofuel facility locations and sizes, biomass harvesting/forwarding, transportation, and storage. A Geographic Information System (GIS) based approach is proposed as a first step for selecting potential facility locations for biofuel production from forest biomass, based on a set of evaluation criteria such as accessibility to biomass, the railway/road transportation network, water bodies, and the workforce. The development of optimization and simulation models is also proposed. The results of the models will be used to determine (1) the number, location, and size of the biofuel facilities, and (2) the amounts of biomass to be transported between the harvesting areas and the biofuel facilities over a 20-year timeframe. The multi-criteria objective is to simultaneously minimize the weighted sum of the delivered feedstock cost, energy consumption, and GHG emissions. Finally, a series of sensitivity analyses will be conducted to identify the sensitivity of the decisions, such as the optimal site selected for the biofuel facility, to changes in influential parameters, such as biomass availability and transportation fuel price.
Intellectual Merit: The proposed research will facilitate the exploration of a wide variety of conditions that promise profitable biomass utilization in the renewable biofuel industry. The GIS-based facility location analysis considers a series of factors that have not been considered simultaneously in previous research, and location analysis is critical to the financial success of producing biofuel. Modeling woody biomass supply chains using both optimization and simulation, combined with the GIS-based approach as a precursor, has not been done to date. The optimization and simulation models can help to ensure the economic and environmental viability and sustainability of the entire biofuel supply chain at both the strategic design level and the operational planning level. Broader Impacts: The proposed models for biorefineries can be applied to other types of manufacturing or processing operations using biomass, because the biomass feedstock supply chain is similar, if not the same, for biorefineries, biomass-fired or co-fired power plants, and torrefaction/pelletization operations. Additionally, the results of this research will continue to be disseminated internationally through publications in journals such as Biomass and Bioenergy and Renewable Energy, and through presentations at conferences such as the 2011 Industrial Engineering Research Conference. For example, part of the research work related to biofuel facility identification has been published: Zhang, Johnson and Sutherland [2011] (see Appendix A). There will also be opportunities for the Michigan Tech campus community to learn about the research through the Sustainable Future Institute.
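The multi-criteria objective, minimizing a weighted sum of delivered feedstock cost, energy consumption, and GHG emissions, can be sketched as a scalarization over candidate facility sites; the weights and site data below are invented for illustration and are not values from the proposal.

```python
def scalarize(cost, energy, ghg, w_cost=0.5, w_energy=0.25, w_ghg=0.25):
    """Weighted-sum objective: minimize w_cost*cost + w_energy*energy + w_ghg*ghg.

    The weights are illustrative assumptions; in practice the objectives
    would first be normalized to comparable units.
    """
    return w_cost * cost + w_energy * energy + w_ghg * ghg

# hypothetical candidate sites: (delivered cost, energy use, GHG emissions)
sites = {
    "site_A": (42.0, 18.0, 9.5),
    "site_B": (38.5, 22.0, 11.0),
    "site_C": (45.0, 15.0, 8.0),
}
# pick the candidate facility site with the lowest scalarized objective
best = min(sites, key=lambda s: scalarize(*sites[s]))
```

Sensitivity analysis of the kind proposed then amounts to re-running this selection while perturbing the inputs (e.g., transportation fuel price) and observing whether the chosen site changes.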
Abstract:
Embedded siloxane polymer waveguides have shown promising results for use in optical backplanes. They exhibit high temperature stability, low optical absorption, and require common processing techniques. A challenging aspect of this technology is out-of-plane coupling of the waveguides. A multi-software approach to modeling an optical vertical interconnect (via) is proposed. This approach utilizes the beam propagation method to generate varied modal field distribution structures which are then propagated through a via model using the angular spectrum propagation technique. Simulation results show average losses between 2.5 and 4.5 dB for different initial input conditions. Certain configurations show losses of less than 3 dB and it is shown that in an input/output pair of vias, average losses per via may be lower than the targeted 3 dB.
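For reference on the loss figures quoted above, the dB values follow the usual optical power-ratio convention, under which the 3 dB target corresponds to losing about half the power.

```python
import math

def coupling_loss_db(p_in, p_out):
    """Optical insertion loss in dB from input and output power:
    loss = -10 * log10(P_out / P_in). A positive value means power lost."""
    return -10.0 * math.log10(p_out / p_in)
```

So a via that transmits 50% of the incident power sits almost exactly at the 3 dB target, and the reported 2.5 to 4.5 dB range corresponds to transmitting roughly 36% to 56% of the power.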
Abstract:
With the development of genotyping and next-generation sequencing technologies, multi-marker testing in genome-wide association studies and rare-variant association studies has become an active research area in statistical genetics. This dissertation contains three methodologies for association studies that exploit different features of genetic data and demonstrates how to use these methods to test genetic association hypotheses. The methods can be categorized into three scenarios: 1) multi-marker testing for regions of strong linkage disequilibrium, 2) multi-marker testing for family-based association studies, and 3) multi-marker testing for rare-variant association studies. I also discuss the advantages of using these methods and demonstrate their power through simulation studies and applications to real genetic data.
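One of the simplest multi-marker statistics for rare-variant testing (the third scenario) is a burden score that collapses rare variants into a single count per individual; this generic collapsing scheme is shown for illustration and is not necessarily the dissertation's own method.

```python
def burden_score(genotypes, mafs, maf_threshold=0.01):
    """Collapse an individual's rare-variant genotypes (0/1/2 minor-allele
    counts) into one burden count, keeping only variants whose minor
    allele frequency (MAF) falls below the threshold."""
    return sum(g for g, maf in zip(genotypes, mafs) if maf < maf_threshold)
```

The resulting per-individual scores can then be tested against a phenotype with a single-degree-of-freedom regression, which is what gives burden tests their power when rare variant effects point in the same direction.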
Abstract:
This technical report discusses the application of the Lattice Boltzmann Method (LBM) to fluid flow simulation through the porous filter wall of disordered media. The diesel particulate filter (DPF) is an example of such a disordered medium. The DPF was developed as a cutting-edge technology to reduce harmful particulate matter in engine exhaust: the porous filter wall of the DPF traps soot particles during after-treatment of the exhaust gas. To examine the phenomena inside the DPF, researchers are turning to the Lattice Boltzmann Method as a promising alternative simulation tool. The lattice Boltzmann method is a comparatively new numerical scheme that can simulate single-component single-phase and single-component multi-phase fluid flow, and it is also an excellent method for modelling flow through disordered media. The current work focuses on single-phase fluid flow simulation inside the porous micro-structure using LBM. First, the theory behind the development of LBM is discussed. The evolution of LBM is usually related to Lattice Gas Cellular Automata (LGCA), but it is also shown that the method is a special discretized form of the continuous Boltzmann equation. Since all simulations are conducted in two dimensions, the equations are developed with reference to the D2Q9 (two-dimensional, 9-velocity) model. An artificially created porous micro-structure is used in this study, and the flow simulations are conducted with air and CO2 as the fluids. The numerical model is explained with a flowchart and the coding steps, and the numerical code is written in MATLAB. Different types of boundary conditions and their importance are discussed separately, and the equations specific to each boundary condition are derived. The pressure and velocity contours over the porous domain are studied and recorded, and the results are compared with published work.
The permeability values obtained in this study can be fitted to the relation proposed by Nabovati [8], and the results are in excellent agreement within the porosity range of 0.4 to 0.8.
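The D2Q9 model the report refers to has a standard velocity set, weight set, and discrete equilibrium distribution, which can be written compactly (a sketch in Python rather than the report's MATLAB code):

```python
# D2Q9 lattice: 9 discrete velocities and their standard weights
C = [(0, 0), (1, 0), (0, 1), (-1, 0), (0, -1),
     (1, 1), (-1, 1), (-1, -1), (1, -1)]
W = [4 / 9] + [1 / 9] * 4 + [1 / 36] * 4

def equilibrium(rho, ux, uy):
    """Discrete equilibrium distribution f_i^eq for the D2Q9 model:

        f_i^eq = w_i * rho * (1 + 3 c_i.u + 4.5 (c_i.u)^2 - 1.5 u.u)

    with lattice speed of sound c_s^2 = 1/3 (lattice units)."""
    usq = ux * ux + uy * uy
    feq = []
    for (cx, cy), w in zip(C, W):
        cu = cx * ux + cy * uy
        feq.append(w * rho * (1.0 + 3.0 * cu + 4.5 * cu * cu - 1.5 * usq))
    return feq
```

By construction the equilibrium conserves mass and momentum: the nine populations sum to the density, and their first velocity moment recovers the momentum, which is what makes the collision step of an LBM code consistent.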
Abstract:
The physics of the operation of single-electron tunneling devices (SEDs) and single-electron tunneling transistors (SETs), especially those with multiple nanometer-sized islands, has remained poorly understood despite intensive experimental and theoretical research. This computational study examines the current-voltage (IV) characteristics of multi-island single-electron devices using a newly developed multi-island transport simulator (MITS) based on semi-classical tunneling theory and kinetic Monte Carlo simulation. The dependence of device characteristics on physical device parameters is explored, and the physical mechanisms that lead to the Coulomb blockade (CB) and Coulomb staircase (CS) characteristics are proposed. Simulations using MITS demonstrate that the overall IV characteristics of a device with a random distribution of islands result from a complex interplay between the factors that affect the tunneling rates and are fixed a priori (e.g., island sizes, island separations, temperature, gate bias) and the evolving charge state of the system, which changes as the source-drain bias (VSD) is changed. With increasing VSD, a multi-island device has to overcome multiple discrete energy barriers (up-steps) before it reaches the threshold voltage (Vth). Beyond Vth, current flow is rate-limited by slow junctions, which leads to the CS structures in the IV characteristic. Each step in the CS is characterized by a unique distribution of island charges with an associated distribution of tunneling probabilities. MITS simulation studies of one-dimensional (1D) disordered chains show that longer chains are better suited for switching applications, as Vth increases with increasing chain length. Longer chains also retain CS structures at higher temperatures better than shorter chains.
In sufficiently disordered 2D systems, we demonstrate that there may exist a dominant conducting path (DCP) for conduction, which makes the 2D device behave as a quasi-1D device. The existence of a DCP is sensitive to the device structure, but is robust with respect to changes in temperature, gate bias, and VSD. A side gate in 1D and 2D systems can effectively control Vth. We argue that devices with smaller island sizes and narrower junctions may be better suited for practical applications, especially at room temperature.
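The kinetic Monte Carlo core that the abstract names can be sketched with the generic Gillespie-style selection step: choose a tunneling event with probability proportional to its rate and advance time by an exponentially distributed waiting time. This illustrates the method class, not the internals of MITS itself.

```python
import math
import random

def kmc_step(rates, rng=random.random):
    """One kinetic Monte Carlo step over a list of tunneling-event rates.

    Picks event i with probability rates[i] / sum(rates), then draws the
    waiting time dt from an exponential distribution with the total rate.
    Returns (event_index, dt)."""
    total = sum(rates)
    r = rng() * total
    acc = 0.0
    for event, rate in enumerate(rates):
        acc += rate
        if r < acc:
            break
    # 1 - rng() lies in (0, 1], so the log is always defined
    dt = -math.log(1.0 - rng()) / total
    return event, dt
```

In a device simulation, each accepted event updates the island charge state, which in turn changes the rate list for the next step; that feedback is what produces the interplay between fixed parameters and the evolving charge state described above.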
Abstract:
Description of simulation and training games as tools for awareness raising and capacity development in multi-stakeholder processes.
Abstract:
This contribution presents a novel conveying principle for the cushioned pick-up and transport of parcels arriving in bulk. The principle is based on a planar carrying medium in the form of a changeable, elastic composite of small-scale conveyor modules. The proposed transport principle, with its peristaltic properties, is intended to quickly dissolve emerging parcel jams and to allow dedicated control of sub-quantities in order to achieve the required throughput within a material flow system. This solution enables a sensible combination of the operating principles of bulk and unit-load conveying for picking up and moving parcels as bulk material. The basic functionality of the conveying concept is verified by numerical simulation based on the Discrete Element Method and on multi-body simulation.
Abstract:
In recent years, the ability to respond to real-time changes in operations and the reconfigurability of equipment, along with the level of automation, cost effectiveness, and maximum throughput, have become essential characteristics of next-generation intralogistics systems. In order to cope with turbulence and increasingly dynamic conditions, future intralogistics systems must feature short reaction times, high process flexibility, and the ability to adapt to frequent changes. The increasing autonomy and complexity of processes in today's intralogistics systems require new and innovative management approaches that allow a fast response to (un)anticipated events and adaptation to a changing environment in order to reduce the negative consequences of these events. The ability of a system to respond effectively to a disruption depends more on the decisions taken before the event than on those taken during or after it. In this context, anticipatory change planning can be a useful approach for managers to make contingency plans for intralogistics systems in a rapidly changing marketplace. This paper proposes a simulation-based decision-making framework for the anticipatory change planning of intralogistics systems. The approach includes quantitative assessments based on the simulation of defined scenarios as well as an analysis of performance availability that combines the flexibility corridors of different performance dimensions. The implementation of the approach is illustrated on a new intralogistics technology called the Cellular Transport System.
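One simple reading of the performance availability measure, the share of simulated observations in which a performance dimension stays inside its flexibility corridor, can be sketched as follows; this is an interpretation for illustration, not the paper's formal definition.

```python
def performance_availability(samples, lower, upper):
    """Fraction of simulated performance observations (e.g., throughput per
    interval) that fall inside the flexibility corridor [lower, upper]."""
    inside = sum(1 for s in samples if lower <= s <= upper)
    return inside / len(samples)
```

Comparing this fraction across scenarios then gives a quantitative basis for ranking anticipatory change plans before any disruption occurs.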