987 results for Simulation Theory


Relevance: 30.00%

Abstract:

Abelian and non-Abelian gauge theories are of central importance in many areas of physics. In condensed matter physics, Abelian U(1) lattice gauge theories arise in the description of certain quantum spin liquids. In quantum information theory, Kitaev's toric code is a Z(2) lattice gauge theory. In particle physics, Quantum Chromodynamics (QCD), the non-Abelian SU(3) gauge theory of the strong interactions between quarks and gluons, is nonperturbatively regularized on a lattice. Quantum link models extend the concept of lattice gauge theories beyond the Wilson formulation, and are well suited for both digital and analog quantum simulation using ultracold atomic gases in optical lattices. Since quantum simulators do not suffer from the notorious sign problem, they open the door to studies of the real-time evolution of strongly coupled quantum systems, which are impossible with classical simulation methods. A plethora of interesting lattice gauge theories suggests itself for quantum simulation, which should allow us to address very challenging problems, ranging from confinement and deconfinement, or chiral symmetry breaking and its restoration at finite baryon density, to color superconductivity and the real-time evolution of heavy-ion collisions, first in simpler model gauge theories and ultimately in QCD.
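The defining algebraic property of the toric code mentioned above can be checked numerically: a plaquette operator (a product of Pauli Z's) and a star operator (a product of Pauli X's) always overlap on an even number of edges, so they commute. The four-qubit configuration below is a minimal illustrative sketch, not taken from the abstract:

```python
import numpy as np
from functools import reduce

I = np.eye(2)
X = np.array([[0, 1], [1, 0]])
Z = np.array([[1, 0], [0, -1]])

def kron_all(ops):
    """Tensor product of a list of single-qubit operators."""
    return reduce(np.kron, ops)

# Plaquette operator: Z on all four edges of one plaquette.
B_p = kron_all([Z, Z, Z, Z])
# Star operator of an adjacent vertex: X on the two shared edges (0 and 1).
A_v = kron_all([X, X, I, I])

# X and Z anticommute on each qubit; the operators share two edges,
# so the two sign flips cancel and the operators commute.
commutator = B_p @ A_v - A_v @ B_p
print(np.allclose(commutator, 0))          # True
print(np.allclose(B_p @ B_p, np.eye(16)))  # True: eigenvalues are +/-1
```

Because all stabilizers commute and square to the identity, they can be measured simultaneously, which is what makes the toric code an exactly solvable Z(2) gauge theory.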

Relevance: 30.00%

Abstract:

(1) A mathematical theory for computing the probabilities of various nucleotide configurations is developed, and the probability of obtaining the correct phylogenetic tree (model tree) from sequence data is evaluated for six phylogenetic tree-making methods (UPGMA, distance Wagner method, transformed distance method, Fitch-Margoliash's method, maximum parsimony method, and compatibility method). The number of nucleotides (m*) necessary to obtain the correct tree with a probability of 95% is estimated with special reference to the human, chimpanzee, and gorilla divergence. m* is at least 4,200, but the availability of outgroup species greatly reduces m* for all methods except UPGMA. m* increases if transitions occur more frequently than transversions, as in the case of mitochondrial DNA. (2) A new tree-making method called the neighbor-joining method is proposed. This method is applicable to either distance data or character-state data. Computer simulation has shown that the neighbor-joining method is generally better than UPGMA, Farris' method, Li's method, and the modified Farris method at recovering the true topology when distance data are used. A related method, the simultaneous partitioning method, is also discussed. (3) The maximum likelihood (ML) method for phylogeny reconstruction under the assumption of both constant and varying evolutionary rates is studied, and a new algorithm for obtaining the ML tree is presented. This method gives a tree similar to that obtained by UPGMA when a constant evolutionary rate is assumed, whereas it gives a tree similar to those obtained by the maximum parsimony method and the neighbor-joining method when a varying evolutionary rate is assumed.
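The neighbor-joining method proposed above can be sketched compactly. The implementation below is a simplified illustration (topology only, branch lengths omitted), not the thesis' own code, and the four-taxon distance matrix is a hypothetical additive example:

```python
import numpy as np

def neighbor_joining(labels, D):
    """Neighbor-joining on a distance matrix.
    labels: list of taxon names; D: symmetric matrix of distances.
    Returns the unrooted topology as nested tuples."""
    labels = list(labels)
    D = np.array(D, dtype=float)
    while len(labels) > 2:
        n = len(labels)
        r = D.sum(axis=1)
        # Q-criterion: Q[i,j] = (n - 2) d(i,j) - r_i - r_j
        Q = (n - 2) * D - r[:, None] - r[None, :]
        np.fill_diagonal(Q, np.inf)
        i, j = divmod(int(Q.argmin()), n)
        # Distances from the new internal node to the remaining taxa.
        du = 0.5 * (D[i] + D[j] - D[i, j])
        new_label = (labels[i], labels[j])
        keep = [k for k in range(n) if k not in (i, j)]
        D2 = np.empty((len(keep) + 1, len(keep) + 1))
        D2[:-1, :-1] = D[np.ix_(keep, keep)]
        D2[-1, :-1] = D2[:-1, -1] = du[keep]
        D2[-1, -1] = 0.0
        labels = [labels[k] for k in keep] + [new_label]
        D = D2
    return (labels[0], labels[1])

# Hypothetical additive distances for the tree ((A,B),(C,D)).
D = [[0, 3, 5, 6],
     [3, 0, 6, 7],
     [5, 6, 0, 7],
     [6, 7, 7, 0]]
tree = neighbor_joining(["A", "B", "C", "D"], D)
print(tree)  # (('A', 'B'), ('C', 'D'))
```

On additive distances like these, neighbor-joining is guaranteed to recover the true topology, which is why the simulations above compare methods on noisy (finite-sequence) data instead.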

Relevance: 30.00%

Abstract:

Rapidly evolving technical developments and working-time constraints call for changes in trainee education. In practice, trainees spend fewer hours in the hospital and face more difficulties in acquiring the qualifications required to work independently as specialists. Simulation-based training is a potential solution. It offers the possibility to learn basic technical skills, repeatedly perform key steps of procedures and rehearse challenging scenarios in team training. Patients are not put at risk and learning curves can be shortened. Advanced learners can practice managing rare complications. The presence of a senior faculty member is key to assessing and debriefing effective simulation training. In the field of vascular access surgery, simulation models are available for open as well as endovascular procedures. In this narrative review, we describe the theory of simulation, present simulation models in vascular (access) surgery, and discuss the possible benefits for patient safety and the difficulties of implementing simulation in training.

Relevance: 30.00%

Abstract:

Workshop Overview: The use of special effects (moulage) is a way to augment the authenticity of a scenario in simulation. This workshop will introduce different techniques of moulage (oil-based cream colors, watercolors, transfer tattoos and 3D prosthetics). The participants will have the opportunity to explore these techniques by applying various moulages. They will compare the techniques and discuss their advantages and disadvantages. Moreover, strategies for standardization and quality assurance will be discussed.

Workshop Rationale: Moulage supports the sensory perception in a scenario (1). It can provide evaluation clues (2) and help learners (and SPs) to engage in the simulation. However, it is of crucial importance that the simulated physical pathologies are represented accurately and reliably. Accuracy is achieved by using the appropriate technique, which requires knowledge and practice. With information about different moulage techniques, we hope to increase knowledge of moulage during the workshop, and by applying moulages in various techniques we will practice together. As standardization is critical for simulation scenarios in assessment (3, 4), strategies for standardization of moulage will be introduced and discussed.

Workshop Objectives: During the workshop participants will:
- gain knowledge about different techniques of moulage
- practice moulages in various techniques
- discuss the advantages and disadvantages of moulage techniques
- describe strategies for standardization and quality assurance of moulage

Planned Format:
- 5 min: Introduction
- 15 min: Overview – Background & Theory (presentation)
- 15 min: Application of moulage for ankle sprain in 4 different techniques (oil-based cream color, watercolor, temporary tattoo, 3D prosthetic) in small groups
- 5 min: Comparing the results by interactive viewing of prepared moulages
- 15 min: Application of moulages for burn in different techniques in small groups
- 5 min: Comparing the results by interactive viewing of prepared moulages
- 5 min: Sharing experiences with different techniques in small groups
- 20 min: Discussion of the techniques including standardization and quality assurance strategies (plenary discussion)
- 5 min: Summary / Take-home points

Relevance: 30.00%

Abstract:

We experimentally and numerically investigated the generation of plumes from a local heat source (LHS) and studied the interaction of these plumes with cellular convective motion (CCM) in a rectangular cavity filled with silicone oil at a Prandtl number (Pr) of approximately two thousand. The LHS is generated using a 0.2-W green laser beam. A roll-type CCM is generated by vertically heating one side of the cavity. The CCM may lead to the formation of an unusual spiral convective plume that resembles a vertical Archimedes spiral. A similar plume is obtained in a direct numerical simulation. We discuss the physical mechanism for the formation of a spiral plume and the application of the results to mantle convection problems. We also estimate the Reynolds (Re) and Rayleigh (Ra) numbers and apply self-similarity theory to convection in the Earth's mantle. Spiral plumes can be used to interpret mantle tomography results obtained over the last decade.
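The nondimensional numbers mentioned above follow directly from the fluid properties and geometry. The sketch below uses illustrative silicone-oil values, not the parameters of the experiment described in the abstract:

```python
# Illustrative estimate of Prandtl and Rayleigh numbers for a laterally
# heated cavity; every property value here is an assumption, not data
# from the experiment above.
g = 9.81        # gravity, m/s^2
beta = 9.0e-4   # thermal expansion coefficient, 1/K
nu = 1.0e-4     # kinematic viscosity, m^2/s (a viscous silicone oil)
kappa = 5.0e-8  # thermal diffusivity, m^2/s
dT = 10.0       # temperature difference across the cavity, K
L = 0.05        # cavity size, m

Pr = nu / kappa                        # ratio of momentum to heat diffusion
Ra = g * beta * dT * L**3 / (nu * kappa)  # buoyancy vs. diffusion
print(f"Pr = {Pr:.0f}, Ra = {Ra:.2e}")    # Pr = 2000, Ra = 2.21e+06
```

At Pr of order a thousand, inertia is negligible and the flow regime is controlled almost entirely by Ra, which is what makes such oils convenient laboratory analogues for mantle convection.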

Relevance: 30.00%

Abstract:

Poland and Hungary have recently maintained steady economic growth rates. The money supply is growing rather rapidly in both economies and, by and large, their exchange rates show a depreciating trend, while exports and prices grow steadily. Per capita GDP is at a similar level and the development stages of the two countries are comparable. It is assumed that the two economies share the same export market, in which their export goods compete. If one country adopts an expansionary monetary policy, its prices increase and its interest rate decreases; the exchange rate then depreciates, and exports and GDP increase as a result. At the same time, this monetary expansion affects the other country through trade. This mutual relationship between the two countries can be expressed as a Nash equilibrium in game theory. In this paper, macro-econometric models of the Polish and Hungarian economies are built and the Nash equilibrium is introduced into them.
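The Nash-equilibrium framing can be illustrated with a pure-strategy best-response check on a 2x2 policy game. The payoff numbers below are entirely hypothetical (growth-like values), not output of the econometric models described above:

```python
# Pure-strategy Nash equilibria of a 2x2 game by best-response check.
# Rows: Poland's policy, columns: Hungary's policy; payoffs are
# hypothetical illustrative values.
actions = ["hold", "expand"]
payoff_P = [[2.0, 0.5],   # Poland's payoffs
            [3.0, 1.0]]
payoff_H = [[2.0, 3.0],   # Hungary's payoffs
            [0.5, 1.0]]

def nash_equilibria(pay_a, pay_b):
    """Return all cells where each player best-responds to the other."""
    eq = []
    for i in range(2):
        for j in range(2):
            best_a = all(pay_a[i][j] >= pay_a[k][j] for k in range(2))
            best_b = all(pay_b[i][j] >= pay_b[i][k] for k in range(2))
            if best_a and best_b:
                eq.append((actions[i], actions[j]))
    return eq

print(nash_equilibria(payoff_P, payoff_H))  # [('expand', 'expand')]
```

With these payoffs, expanding is a dominant strategy for both countries, so the unique equilibrium is mutual expansion even though mutual restraint would pay both more; this is the kind of strategic interaction the paper embeds in its macro-econometric models.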

Relevance: 30.00%

Abstract:

Research on multinational firms' activity has been conducted widely since the late 1980s. The literature falls into three types: horizontal FDI, vertical FDI, and three-country FDI, represented by export-platform FDI. The literature can also be differentiated by approach, for example, the pure theory approach represented by Krugman and Melitz and the numerical simulation approach represented by Markusen. This paper surveys the Markusen-type literature by firm type. Little of this literature focuses on intermediate goods trade, although intermediate goods trade is considered to be strongly related to the production patterns of MNEs. In this paper, we introduce a model that explicitly treats intermediate goods trade and present a simulation analysis for empirical estimation.

Relevance: 30.00%

Abstract:

Monte Carlo simulations have been carried out to study the effect of temperature on the growth kinetics of a circular grain. This work demonstrates the importance of roughening fluctuations on the growth dynamics. As predicted by theories of domain kinetics, the circular domain shrinks linearly with time as A(t) = A(0) - αt, where A(0) and A(t) are the initial and instantaneous areas, respectively. However, because the effect of thermal fluctuations is stronger in d = 2 than in d = 3, and in contrast to d = 3, the slope α is strongly temperature dependent for T ≥ 0.6 Tc. An analytical theory which takes the thermal fluctuations into account agrees with the T dependence of the Monte Carlo data in this regime, and this model shows that these fluctuations are responsible for the strong temperature dependence of the growth rate for d = 2. Our results are particularly relevant to the problem of domain growth in surface science.
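The linear area law A(t) = A(0) - αt is typically extracted from simulation data by a least-squares fit of area against time. The sketch below generates areas from the law itself (with an assumed α, not a Monte Carlo measurement) and recovers the slope:

```python
import numpy as np

A0, alpha = 400.0, 0.8      # assumed initial area and shrink rate
t = np.arange(0, 200, 5.0)
A = A0 - alpha * t          # linear area law A(t) = A(0) - alpha * t

# A first-degree least-squares fit recovers slope and intercept exactly
# here; with real Monte Carlo data the fit averages over roughening noise.
slope, intercept = np.polyfit(t, A, 1)
print(round(-slope, 6), round(intercept, 6))  # 0.8 400.0
```

In an actual simulation α would then be measured at several temperatures to expose the strong T dependence reported above.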

Relevance: 30.00%

Abstract:

This paper is concerned with the study of non-Markovian queuing systems in container terminals. The methodology presented has been applied to analyze ship traffic in the port of Valencia, located in the Western Mediterranean. Two container terminals have been studied: the public container terminal of NOATUM and the dedicated container terminal of MSC. This paper contains the results of a simulation model based on queuing theory. The methodology is found to be effective in replicating realistic ship traffic operations in port as well as in conducting capacity evaluations. Thus, it can be used for capacity planning (long term), tactical planning (medium term) and even for container terminal design (port enlargement purposes).
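A minimal discrete-event sketch of a queuing simulation of berth waiting times, using an M/M/1 queue via the Lindley recurrence for concreteness (the terminals studied in the paper use non-Markovian distributions, and the arrival and service rates here are hypothetical):

```python
import random

def mm1_mean_wait(lam, mu, n_customers, seed=1):
    """Mean waiting time in queue for M/M/1 via the Lindley recurrence:
    W[k+1] = max(0, W[k] + S[k] - T[k+1]), S service, T interarrival."""
    rng = random.Random(seed)
    w, total = 0.0, 0.0
    for _ in range(n_customers):
        total += w
        s = rng.expovariate(mu)   # service (berth occupation) time
        t = rng.expovariate(lam)  # time until the next ship arrives
        w = max(0.0, w + s - t)
    return total / n_customers

lam, mu = 0.5, 1.0                # hypothetical arrival / service rates
sim = mm1_mean_wait(lam, mu, 100_000)
theory = lam / (mu * (mu - lam))  # analytic M/M/1: Wq = rho / (mu - lam)
print(sim, theory)                # simulated value close to 1.0
```

Swapping the exponential draws for empirical or general distributions turns this into the non-Markovian setting of the paper, where analytic formulas are unavailable and simulation becomes the capacity-evaluation tool.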

Relevance: 30.00%

Abstract:

E-learning systems output a huge quantity of data on a learning process. However, it takes a lot of specialist human resources to process these data manually and generate an assessment report. Additionally, for formative assessment, the report should state the attainment level of the learning goals defined by the instructor. This paper describes the use of the granular linguistic model of a phenomenon (GLMP) to model the assessment of the learning process and implement the automated generation of an assessment report. GLMP is based on fuzzy logic and the computational theory of perceptions. This technique is useful for implementing complex assessment criteria using inference systems based on linguistic rules. Apart from the grade, the model also generates a detailed natural language progress report on the achieved proficiency level, based exclusively on the objective data gathered from correct and incorrect responses. This is illustrated by applying the model to the assessment of Dijkstra's algorithm learning using a visual simulation-based graph algorithm learning environment called GRAPHs.
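For reference, the algorithm whose learning is assessed in that environment can be sketched as follows; this is a standard textbook implementation, not the GRAPHs code, and the example graph is hypothetical:

```python
import heapq

def dijkstra(graph, source):
    """Shortest-path distances from source in a weighted digraph
    with non-negative edge weights. graph: {node: [(neighbor, weight)]}."""
    dist = {source: 0}
    heap = [(0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue                     # stale heap entry, skip
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(heap, (nd, v))
    return dist

g = {"a": [("b", 2), ("c", 5)], "b": [("c", 1), ("d", 4)], "c": [("d", 1)]}
print(dijkstra(g, "a"))  # {'a': 0, 'b': 2, 'c': 3, 'd': 4}
```

Each node-settling and edge-relaxation step of this loop yields an observable student response, which is exactly the kind of correct/incorrect event stream the GLMP-based report is built from.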

Relevance: 30.00%

Abstract:

This work proposes a methodology for the simulation of offshore wind conditions using CFD. The main objective is the development of a numerical model for the characterization of atmospheric boundary layers of different stability levels, the most important issue in offshore wind resource assessment. Based on Monin-Obukhov theory, the steady standard k-ε turbulence model is modified to take thermal stratification in the surface layer into account. The validity of Monin-Obukhov theory in offshore conditions is discussed with an analysis of a three-day episode at the FINO-1 platform.
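The Monin-Obukhov correction to the logarithmic wind profile can be sketched as follows. The Businger-Dyer stability functions used here are standard, but the roughness length, friction velocity, and Obukhov lengths are assumed values, not FINO-1 data:

```python
import math

KAPPA = 0.4  # von Karman constant

def psi_m(zeta):
    """Businger-Dyer stability correction for momentum, zeta = z / L."""
    if zeta >= 0:                    # stable (and neutral: psi_m(0) = 0)
        return -5.0 * zeta
    x = (1.0 - 16.0 * zeta) ** 0.25  # unstable branch
    return (2.0 * math.log((1 + x) / 2) + math.log((1 + x * x) / 2)
            - 2.0 * math.atan(x) + math.pi / 2)

def wind_speed(z, u_star=0.3, z0=2e-4, L=float("inf")):
    """Monin-Obukhov profile: u(z) = (u*/kappa) [ln(z/z0) - psi_m(z/L)]."""
    zeta = 0.0 if math.isinf(L) else z / L
    return u_star / KAPPA * (math.log(z / z0) - psi_m(zeta))

z = 80.0  # hub height, m (illustrative)
u_neutral  = wind_speed(z)           # L -> infinity
u_stable   = wind_speed(z, L=200.0)  # positive Obukhov length
u_unstable = wind_speed(z, L=-200.0)
print(u_stable > u_neutral > u_unstable)  # True: stable layers shear more
```

This monotone ordering of the profiles with stability is the surface-layer behavior the modified k-ε model has to reproduce.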

Relevance: 30.00%

Abstract:

Ripple-based controls can strongly reduce the required output capacitance in PowerSoC converters thanks to a very fast dynamic response. Unfortunately, these controls are prone to sub-harmonic oscillations, and several parameters affect the stability of these systems. This paper derives and validates a simulation-based modeling and stability analysis of a closed-loop V²Ic control applied to a 5 MHz buck converter, using discrete modeling and Floquet theory to predict stability. This allows the derivation of sensitivity analyses to design robust systems. The work is extended to different V² architectures using the same methodology.
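For a sampled-data converter model, the Floquet criterion reduces to checking that all eigenvalues of the one-switching-period state-transition (monodromy) matrix lie inside the unit circle. A generic numerical sketch follows; the 2x2 matrix is illustrative, not the actual discrete model of the V²Ic system:

```python
import numpy as np

# Hypothetical one-period state-transition (monodromy) matrix of a
# discretized converter model; not derived from the paper's converter.
Phi = np.array([[0.9, 0.2],
                [-0.1, 0.8]])

# Floquet multipliers = eigenvalues of Phi. The periodic orbit is
# stable iff every multiplier has modulus < 1; a real multiplier
# crossing -1 signals the onset of sub-harmonic oscillation.
multipliers = np.linalg.eigvals(Phi)
rho = np.abs(multipliers).max()  # spectral radius
print(rho < 1.0)                 # True for this example
```

A sensitivity analysis such as the one in the paper then tracks how rho moves toward the unit circle as circuit parameters vary.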

Relevance: 30.00%

Abstract:

The primary hypothesis stated by this paper is that the use of social choice theory in Ambient Intelligence systems can significantly improve user satisfaction when accessing shared resources. A research methodology based on agent-based social simulations is employed to support this hypothesis and to evaluate these benefits. The result is a six-fold contribution, summarized as follows. Firstly, several considerable differences between this application case and the most prominent social choice application, political elections, have been found and described. Secondly, given these differences, a number of metrics to evaluate different voting systems in this scope have been proposed and formalized. Thirdly, given the presented application and the metrics proposed, the performance of a number of well-known electoral systems is compared. Fourthly, as a result of the performance study, a novel voting algorithm capable of obtaining the best balance between the metrics reviewed is introduced. Fifthly, to improve the social welfare in the experiments, the voting methods are combined with cluster analysis techniques. Finally, the article is complemented by a free and open-source tool, VoteSim, which not only ensures the reproducibility of the experimental results presented, but also allows the interested reader to adapt the case study to different environments.
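As a concrete instance of a well-known electoral system of the kind compared above, a Borda count over ranked preferences can be sketched as follows (the ballots are hypothetical, and whether Borda is among the systems the paper evaluates is not stated in the abstract):

```python
from collections import defaultdict

def borda(ballots):
    """Borda count: the candidate ranked r-th (0-based) on a ballot of
    m candidates receives m - 1 - r points; the highest total wins."""
    scores = defaultdict(int)
    for ranking in ballots:
        m = len(ranking)
        for r, candidate in enumerate(ranking):
            scores[candidate] += m - 1 - r
    return max(scores, key=scores.get), dict(scores)

# Hypothetical agent preferences over shared resources A, B, C.
ballots = ([["A", "B", "C"]] * 3
           + [["B", "C", "A"]] * 2
           + [["C", "B", "A"]] * 2)

winner, scores = borda(ballots)
print(winner, scores)  # B {'A': 6, 'B': 9, 'C': 6}
```

Note that B wins despite A having the most first-place votes: rank-aggregation methods reward broad acceptability, a property directly relevant to satisfaction with shared-resource allocation.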

Relevance: 30.00%

Abstract:

With the growing body of research on traumatic brain injury and spinal cord injury, computational neuroscience has recently focused its modeling efforts on neuronal functional deficits following mechanical loading. However, in most of these efforts, cell damage is generally characterized only by purely mechanistic criteria, functions of quantities such as stress, strain or their corresponding rates. The modeling of functional deficits in neurites as a consequence of macroscopic mechanical insults has rarely been explored. In particular, a quantitative mechanically based model of electrophysiological impairment in neuronal cells has only very recently been proposed (Jerusalem et al., 2013). In this paper, we present the implementation details of Neurite: the finite difference parallel program used in this reference. Following the application of a macroscopic strain at a given strain rate produced by a mechanical insult, Neurite is able to simulate the resulting neuronal electrical signal propagation, and thus the corresponding functional deficits. The simulation of the coupled mechanical and electrophysiological behaviors requires computationally expensive calculations that increase in complexity as the network of simulated cells grows. The solvers implemented in Neurite, explicit and implicit, were therefore parallelized using graphics processing units in order to reduce the simulation costs of large-scale scenarios. Cable theory and Hodgkin-Huxley models were implemented to account for the electrophysiological passive and active regions of a neurite, respectively, whereas a coupled mechanical model accounting for the neurite's mechanical behavior within its surrounding medium was adopted as a link between electrophysiology and mechanics (Jerusalem et al., 2013). This paper provides the details of the parallel implementation of Neurite, along with three different application examples: a long myelinated axon, a segmented dendritic tree, and a damaged axon. The capabilities of the program to deal with large-scale scenarios, segmented neuronal structures, and functional deficits under mechanical loading are specifically highlighted.
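The passive cable equation that such programs discretize can be sketched with a simple explicit finite-difference scheme; the geometry, membrane parameters, and time step below are assumed for illustration and are not Neurite's actual values or implementation:

```python
import numpy as np

# Explicit finite-difference step for a passive cable:
#   dV/dt = D d2V/dx2 - g_L (V - E_L)
# where D lumps the axial conductance terms. All values are illustrative.
n, dx, dt = 100, 1.0, 0.01   # grid points, spacing, time step
D = 1.0                       # effective axial diffusion coefficient
g_L, E_L = 0.1, -70.0         # leak conductance and resting potential

V = np.full(n, E_L)
V[45:55] = -20.0              # depolarized patch in the middle

# dt <= dx**2 / (2 D) keeps this explicit scheme stable.
for _ in range(5000):
    lap = np.zeros(n)
    lap[1:-1] = (V[:-2] - 2 * V[1:-1] + V[2:]) / dx**2
    V = V + dt * (D * lap - g_L * (V - E_L))

# Without active (Hodgkin-Huxley) currents, the depolarization only
# spreads and decays back toward the resting potential.
print(abs(V.max() - E_L) < 1.0)  # True after sufficient time
```

Adding Hodgkin-Huxley ionic currents to the right-hand side turns this passive decay into regenerative spike propagation, and coupling the parameters to a mechanical strain field is what lets a tool like Neurite express mechanically induced functional deficits.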