14 results for Large-Eddy Simulation

in BORIS: Bern Open Repository and Information System - Bern - Switzerland


Relevance:

40.00%

Publisher:

Abstract:

We consider a large quantum system with spins 1/2 whose dynamics is driven entirely by measurements of the total spin of spin pairs. This gives rise to a dissipative coupling to the environment. When one averages over the measurement results, the corresponding real-time path integral does not suffer from a sign problem. Using an efficient cluster algorithm, we study the real-time evolution from an initial antiferromagnetic state of the two-dimensional Heisenberg model, which is driven to a disordered phase, not by a Hamiltonian, but by sporadic measurements or by continuous Lindblad evolution.

Relevance:

40.00%

Publisher:

Abstract:

Using quantum Monte Carlo, we study the nonequilibrium transport of magnetization in large open strongly correlated quantum spin-1/2 systems driven by purely dissipative processes that conserve the uniform or staggered magnetization, disregarding unitary Hamiltonian dynamics. We prepare both a low-temperature Heisenberg ferromagnet and an antiferromagnet in two parts of the system that are initially isolated from each other. We then bring the two subsystems in contact and study their real-time dissipative dynamics for different geometries. The flow of the uniform or staggered magnetization from one part of the system to the other is described by a diffusion equation that can be derived analytically.
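
The diffusion equation mentioned in the abstract can be illustrated with a minimal sketch (this is not the paper's quantum Monte Carlo method): an explicit finite-difference solution of dm/dt = D d²m/dx² for two initially isolated halves with opposite magnetization that are brought into contact. Grid size, time step and D are arbitrary illustrative values.

```python
# Minimal 1D diffusion sketch: two subsystems with opposite initial
# magnetization brought into contact; explicit finite differences with
# zero-flux (Neumann) boundaries, so total magnetization is conserved.
# D, grid and time step are illustrative, not taken from the paper.

def diffuse(m, D=0.1, dx=1.0, dt=1.0, steps=500):
    """Evolve dm/dt = D * d2m/dx2; stability requires D*dt/dx^2 <= 0.5."""
    m = list(m)
    r = D * dt / dx ** 2
    for _ in range(steps):
        new = m[:]
        for i in range(len(m)):
            left = m[i - 1] if i > 0 else m[i]     # reflective boundary
            right = m[i + 1] if i < len(m) - 1 else m[i]
            new[i] = m[i] + r * (left - 2 * m[i] + right)
        m = new
    return m

# Left half magnetized up, right half down, then brought into contact.
profile = diffuse([1.0] * 20 + [-1.0] * 20)
```

The conserved total magnetization and the smoothing of the initial step mirror, in the crudest possible way, the diffusive transport the paper derives analytically.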

Relevance:

30.00%

Publisher:

Abstract:

The reconstruction of past flash floods in ungauged basins leads to a high level of uncertainty, which increases if other processes are involved such as the transport of large wood material. An important flash flood occurred in 1997 in Venero Claro (Central Spain), causing significant economic losses. The wood material clogged bridge sections, raising the water level upstream. The aim of this study was to reconstruct this event, analysing the influence of woody debris transport on the flood hazard pattern. Because the reach in question was affected by backwater effects due to bridge clogging, using only high water mark or palaeostage indicators may overestimate discharges, and so other methods are required to estimate peak flows. Therefore, the peak discharge was estimated (123 ± 18 m³ s⁻¹) using indirect methods, but one-dimensional hydraulic simulation was also used to validate these indirect estimates through an iterative process (127 ± 33 m³ s⁻¹) and reconstruct the bridge obstruction to obtain the blockage ratio during the 1997 event (~48%) and the bridge clogging curves. Rainfall–runoff modelling with stochastic simulation of different rainfall field configurations also helped to confirm that a peak discharge greater than 150 m³ s⁻¹ is very unlikely to occur and that the estimated discharge range is consistent with the estimated rainfall amount (233 ± 27 mm). It was observed that the backwater effect due to the obstruction (water level ~7 m) made the 1997 flood (~35-year return period) equivalent to the 50-year flood. This allowed the equivalent return period to be defined as the recurrence interval of an event of specified magnitude, which, where large woody debris is present, is equivalent in water depth and extent of flooded area to a more extreme event of greater magnitude. These results highlight the need to include obstruction phenomena in flood hazard analysis.
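
The "equivalent return period" idea rests on mapping a discharge to its recurrence interval through a flood-frequency distribution. A toy calculation, assuming (purely for illustration; these are not the study's fitted values) that annual peak discharges follow a Gumbel distribution:

```python
import math

# Return period T = 1 / (1 - F(q)) for a Gumbel-distributed annual peak.
# The location (mu) and scale (beta) parameters below are invented for
# illustration; they are NOT fitted to the Venero Claro data.

def gumbel_return_period(q, mu, beta):
    F = math.exp(-math.exp(-(q - mu) / beta))   # Gumbel CDF
    return 1.0 / (1.0 - F)

mu, beta = 40.0, 25.0
T_obs = gumbel_return_period(123.0, mu, beta)   # estimated 1997 peak
T_150 = gumbel_return_period(150.0, mu, beta)   # a more extreme event
```

With an obstruction raising water levels, a flood of return period T_obs can produce depths and flooded extents characteristic of the rarer T_150 event, which is the essence of the equivalent return period concept.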

Relevance:

30.00%

Publisher:

Abstract:

Cloud Computing is an enabler for delivering large-scale, distributed enterprise applications with strict performance requirements. Such applications often have complex scaling and Service Level Agreement (SLA) management requirements. In this paper we present a simulation approach for validating and comparing SLA-aware scaling policies in the CloudSim simulator, using data from an actual Distributed Enterprise Information System (dEIS). We extend CloudSim with concurrent and multi-tenant task simulation capabilities. We then show how different scaling policies can be used for simulating multiple dEIS applications. We present multiple experiments depicting the impact of VM scaling on both datacenter energy consumption and dEIS performance indicators.
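
A hedged sketch of what an SLA-aware scaling policy of the kind compared in such simulations might look like. Thresholds, parameter names and the function itself are invented for illustration; they are not taken from the paper or from CloudSim's API:

```python
# Illustrative reactive scaling policy: scale out when the observed
# response time breaches the SLA target, scale in when utilisation is
# low. All thresholds are hypothetical placeholders.

def scaling_decision(response_ms, cpu_util, sla_ms=200.0,
                     low_util=0.3, vm_count=2, min_vms=1, max_vms=10):
    """Return the VM count for the next monitoring interval."""
    if response_ms > sla_ms and vm_count < max_vms:
        return vm_count + 1          # SLA breach: scale out
    if cpu_util < low_util and vm_count > min_vms:
        return vm_count - 1          # over-provisioned: scale in
    return vm_count                  # keep current allocation
```

Comparing several such policies against recorded dEIS workload traces is precisely the kind of experiment a simulator like CloudSim makes cheap.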

Relevance:

30.00%

Publisher:

Abstract:

In modern medico-legal literature, only a small number of publications deal with fatal injuries from black powder guns. Most of them focus on morphological features such as intense soot soiling, blast tattooing and burn effects in close-range shots, or describe the wound ballistics of spherical lead bullets. Another kind of "unusual" and potentially lethal weapon is the handgun designed to fire only blank cartridges, such as starter and alarm pistols. The dangerousness of these guns is restricted to very close and contact range shots and results from the gas jet produced by the deflagration of the propellant. The present paper reports on a suicide committed with a muzzle-loading percussion pistol cal. 45. An unusually large stellate entrance wound was located in the precordial region, accompanied by an imprint mark from the ramrod and a faint greenish discoloration (apparently due to the formation of sulfhemoglobin). Autopsy revealed an oversized powder cavity, multiple fractures of the anterior thoracic wall as well as ruptures of the heart, the aorta, the left hepatic lobe and the diaphragm. In total, the zone of mechanical destruction had a diameter of approx. 15 cm. As there was no exit wound and no bullet lodged in the body, the injury was caused exclusively by the inrushing combustion gases of the propellant (black powder), comparable to the gas jet of a blank cartridge gun. In contact shots to ballistic gelatine using the suicide's pistol loaded with black powder but no projectile, the formation of a nearly spherical cavity could be demonstrated by means of a high-speed camera. The extent of the temporary cavity after firing with 5 g of black powder roughly corresponded to the zone of destruction found in the suicide's body.

Relevance:

30.00%

Publisher:

Abstract:

Cloud Computing has evolved to become an enabler for delivering access to large scale distributed applications running on managed network-connected computing systems. This makes possible hosting Distributed Enterprise Information Systems (dEISs) in cloud environments, while enforcing strict performance and quality of service requirements, defined using Service Level Agreements (SLAs). SLAs define the performance boundaries of distributed applications, and are enforced by a cloud management system (CMS) dynamically allocating the available computing resources to the cloud services. We present two novel VM-scaling algorithms focused on dEIS systems, which detect the most appropriate scaling conditions using performance models of distributed applications derived from constant-workload benchmarks, together with SLA-specified performance constraints. We simulate the VM-scaling algorithms in a cloud simulator and compare against trace-based performance models of dEISs. We compare a total of three SLA-based VM-scaling algorithms (one using prediction mechanisms) based on a real-world application scenario involving a large variable number of users. Our results show that it is beneficial to use autoregressive predictive SLA-driven scaling algorithms in cloud management systems for guaranteeing performance invariants of distributed cloud applications, as opposed to using only reactive SLA-based VM-scaling algorithms.
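
The contrast with purely reactive scaling can be sketched with a toy autoregressive predictor that scales *before* the forecast load breaches the capacity of the current VMs. This is an illustrative AR(1) forecast, not the paper's algorithm; the window, the capacity per VM and the AR(1) form are assumptions:

```python
import math

# Toy predictive scaling: fit x[t+1] ~ a * x[t] by least squares over a
# sliding window of workload samples, then provision enough VMs for the
# forecast. Capacity per VM (100 req/s) is a made-up placeholder.

def ar1_predict(history):
    """One-step-ahead AR(1) forecast from consecutive sample pairs."""
    pairs = list(zip(history[:-1], history[1:]))
    num = sum(x * y for x, y in pairs)
    den = sum(x * x for x, _ in pairs) or 1.0
    a = num / den
    return a * history[-1]

def proactive_vm_count(history, capacity_per_vm=100.0):
    predicted = ar1_predict(history)
    return max(1, math.ceil(predicted / capacity_per_vm))

# A steadily rising workload: the predictor anticipates crossing 300 req/s
# and provisions a fourth VM before the breach occurs.
vms = proactive_vm_count([150, 180, 216, 259])
```

A reactive policy would only add the fourth VM after response times had already degraded, which is the performance-invariant argument the abstract makes.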

Relevance:

30.00%

Publisher:

Abstract:

Gaussian random field (GRF) conditional simulation is a key ingredient in many spatial statistics problems for computing Monte Carlo estimators and quantifying uncertainties on non-linear functionals of GRFs conditional on data. Conditional simulations are known to often be computationally intensive, especially when appealing to matrix decomposition approaches with a large number of simulation points. This work studies settings where conditioning observations are assimilated batch sequentially, with one point or a batch of points at each stage. Assuming that conditional simulations have been performed at a previous stage, the goal is to take advantage of already available sample paths and by-products to produce updated conditional simulations at minimal cost. Explicit formulae are provided, which allow updating an ensemble of sample paths conditioned on n ≥ 0 observations to an ensemble conditioned on n + q observations, for arbitrary q ≥ 1. Compared to direct approaches, the proposed formulae prove to substantially reduce computational complexity. Moreover, these formulae explicitly exhibit how the q new observations update the old sample paths. Detailed complexity calculations highlighting the benefits of this approach with respect to state-of-the-art algorithms are provided and are complemented by numerical experiments.
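
The flavour of such update formulae can be conveyed in the simplest case (n = 0 prior observations, q = 1 new one): the updated sample path is the old path plus kriging weights applied to the residual at the new observation point, so no re-simulation from scratch is needed. The squared-exponential kernel, the grid and the observed value below are illustrative choices, not the paper's setting:

```python
import math, random

def kernel(x, y, ell=0.1):
    """Squared-exponential covariance (illustrative choice)."""
    return math.exp(-((x - y) ** 2) / (2.0 * ell ** 2))

def sample_path(xs, rng):
    """Crude unconditional GP draw via a dense Cholesky factorization."""
    n = len(xs)
    K = [[kernel(xs[i], xs[j]) + (1e-8 if i == j else 0.0)
          for j in range(n)] for i in range(n)]
    L = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i + 1):
            s = K[i][j] - sum(L[i][k] * L[j][k] for k in range(j))
            L[i][j] = math.sqrt(s) if i == j else s / L[j][j]
    z = [rng.gauss(0.0, 1.0) for _ in range(n)]
    return [sum(L[i][k] * z[k] for k in range(i + 1)) for i in range(n)]

def update_path(xs, path, x1, y1):
    """Residual-kriging update: correct the old path by kriging weights
    applied to the residual (y1 - path(x1)) at the new point."""
    i1 = xs.index(x1)
    weights = [kernel(x, x1) / kernel(x1, x1) for x in xs]
    residual = y1 - path[i1]
    return [p + w * residual for p, w in zip(path, weights)]

xs = [i / 10.0 for i in range(11)]
path = sample_path(xs, random.Random(0))
updated = update_path(xs, path, x1=0.5, y1=2.0)   # now interpolates y1
```

The updated path honours the new observation exactly at x1 while points far from x1 (relative to the correlation length) are left essentially unchanged, which makes explicit how new observations act on old sample paths.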

Relevance:

30.00%

Publisher:

Abstract:

Three extended families live around a lake. One family are rice farmers, the second are vegetable farmers, and the third are livestock herders. All of them depend on large quantities of lake water for their production and to secure their livelihoods. In the game, the families are represented by their councils of elders. Each council has to find ways to increase production in order to keep up with the growth of its family and its demands. This puts more and more pressure on the water resources, increasing the risk of overuse. Conflicts over water are about to emerge between the families. Each council of elders must try to pursue its family's interests, while at the same time preventing excessive pressure on the water resources. Once a council of elders is no longer able to meet the needs of its family, it is excluded from the game. Will the parties cooperate or compete? To face the challenge of balancing economic well-being, sustainable resource management, and individual and collective interests, the three parties have a set of options for action at hand. These include power play to safeguard their own interests, communication and cooperation to negotiate with neighbours, and searching for alternatives to reduce pressure on existing water resources. During the game the players can experience how tensions may arise, increase and finally escalate. They realise what impact power play has, how alliances form, and the importance of trust-building measures, consensus and cooperation. From the insights gained, important conflict prevention and mitigation measures are derived in a debriefing session. The game is facilitated by a moderator and lasts for 3-4 hours.
Aim of the game: Each family pursues the objective of serving its own interests and securing its position through appropriate strategies and skilful negotiation, while at the same time optimising use of the water resources in a way that prevents their degradation. The end of the game is open. While the game may end with one or two families dropping out because they can no longer secure their subsistence, it is also possible that the three families succeed in creating a situation that allows them to meet their own needs as well as the requirements for sustainable water use in the long term. Learning objectives: The game demonstrates how tension builds up, increases, and finally escalates; it shows how power positions work and alliances are formed; and it enables the players to experience the great significance of mutual agreement and cooperation. During the game, and particularly during the debriefing and evaluation session, it is important to link experiences made during the game to the players' real-life experiences, and to discuss these links in the group. The resulting insights provide a basis for deducing important conflict prevention and transformation measures.

Relevance:

30.00%

Publisher:

Abstract:

Today, there is little knowledge of the attitude state of decommissioned intact objects in Earth orbit. Observational means have advanced in the past years, but are still limited with respect to accurate estimates of motion vector orientations and magnitudes. Such knowledge is needed especially for the preparation of Active Debris Removal (ADR) missions as planned by ESA's Clean Space initiative, and for contingency scenarios for ESA spacecraft like ENVISAT. ESA's "Debris Attitude Motion Measurements and Modelling" project (ESA Contract No. 40000112447), led by the Astronomical Institute of the University of Bern (AIUB), addresses this problem. The goal of the project is to achieve a good understanding of the attitude evolution and of the relevant internal and external effects. To characterize the attitude state of selected targets in LEO and GTO, multiple observation methods are combined. Optical observations are carried out by AIUB, Satellite Laser Ranging (SLR) is performed by the Space Research Institute of the Austrian Academy of Sciences (IWF), and radar measurements and signal level determination are provided by the Fraunhofer Institute for High Frequency Physics and Radar Techniques (FHR). The In-Orbit Tumbling Analysis tool (ιOTA) is a prototype software currently in development by Hyperschall Technologie Göttingen GmbH (HTG) within the framework of the project. ιOTA will be a highly modular software tool to perform short- (days), medium- (months) and long-term (years) propagation of the orbit and attitude motion (six degrees of freedom) of spacecraft in Earth orbit.
The simulation takes into account all relevant acting forces and torques, including aerodynamic drag, solar radiation pressure, gravitational influences of Earth, Sun and Moon, eddy current damping, impulse and momentum transfer from space debris or micrometeoroid impact, as well as the optional definition of particular spacecraft-specific influences like tank sloshing, reaction wheel behaviour, magnetic torquer activity and thruster firing. The purpose of ιOTA is to provide high-accuracy short-term simulations to support observers and potential ADR missions, as well as medium- and long-term simulations to study the significance of the particular internal and external influences on the attitude, especially damping factors and momentum transfer. The simulation will also enable the investigation of the altitude dependency of the particular external influences. ιOTA's post-processing modules will generate synthetic measurements for observers and for software validation. The validation of the software will be done by cross-calibration with observations and measurements acquired by the project partners.
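
As a minimal illustration of one effect listed above, eddy-current damping causes the spin rate of a conductive tumbling body to decay roughly exponentially to first order, dω/dt = -ω/τ. The time constant below is invented for illustration; it is not a project result:

```python
import math

# First-order eddy-current damping sketch: exponential decay of the
# tumble rate. tau_days = 100 is a hypothetical placeholder value.

def spin_rate(omega0_deg_s, t_days, tau_days=100.0):
    """Spin rate [deg/s] after t_days of purely exponential damping."""
    return omega0_deg_s * math.exp(-t_days / tau_days)

# A 10 deg/s tumble decays to roughly 10/e ~ 3.7 deg/s after one tau.
after_tau = spin_rate(10.0, 100.0)
```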

Relevance:

30.00%

Publisher:

Abstract:

The cometary coma is a unique phenomenon in the solar system: a planetary atmosphere influenced by little or no gravity. As a comet approaches the Sun, water vapor together with some fraction of other gases sublimates, generating a cloud of gas, ice and other refractory materials (rocky and organic dust) ejected from the surface of the nucleus. Sublimating gas molecules undergo frequent collisions and photochemical processes in the near-nucleus region. Owing to their negligible gravity, comets produce a large and highly variable dusty coma with a size much larger than the characteristic size of the cometary nucleus. The Rosetta spacecraft is en route to comet 67P/Churyumov-Gerasimenko for a rendezvous, landing, and extensive orbital phase beginning in 2014. Both the interpretation of measurements and spacecraft safety considerations require modeling of the comet's dusty gas environment. In this work we present results of a numerical study of the multispecies gaseous and electrically charged dust environment of comet Churyumov-Gerasimenko. Both the gas and dust phases of the coma are simulated kinetically. Photolytic reactions are taken into account. Parameters of the ambient plasma as well as the distribution of electric and magnetic fields are obtained from an MHD simulation [1] of the coma connected to the solar wind. Trajectories of ions and electrically charged dust grains are simulated by accounting for the Lorentz force and the nucleus' gravity.
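
The trajectory integration described above can be sketched with a simple explicit scheme for dv/dt = (q/m)(E + v × B) + g. All field values, the charge-to-mass ratio and the step size below are illustrative placeholders, not 67P model inputs:

```python
import math

# Toy charged-grain trajectory integrator: Lorentz force q(E + v x B)
# plus a gravity term g, stepped with semi-implicit Euler. Every
# numerical value here is a hypothetical placeholder.

def cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def integrate(pos, vel, q_over_m, E, B, g, dt, steps):
    for _ in range(steps):
        vxB = cross(vel, B)
        acc = tuple(q_over_m * (E[i] + vxB[i]) + g[i] for i in range(3))
        vel = tuple(vel[i] + dt * acc[i] for i in range(3))
        pos = tuple(pos[i] + dt * vel[i] for i in range(3))
    return pos, vel

# A grain gyrating in a uniform magnetic field (E = g = 0): the velocity
# vector rotates while its magnitude stays nearly constant.
pos, vel = integrate(pos=(0.0, 0.0, 0.0), vel=(1.0, 0.0, 0.0),
                     q_over_m=0.01, E=(0.0, 0.0, 0.0),
                     B=(0.0, 0.0, 1.0), g=(0.0, 0.0, 0.0),
                     dt=0.1, steps=1000)
```

In a real coma model the fields E and B would be interpolated from the MHD solution at each grain position rather than held uniform.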

Relevance:

30.00%

Publisher:

Abstract:

Domestic dog rabies is an endemic disease in large parts of the developing world and also epidemic in previously free regions. For example, it continues to spread in eastern Indonesia and currently threatens adjacent rabies-free regions with high densities of free-roaming dogs, including remote northern Australia. Mathematical and simulation disease models are useful tools to provide insights on the most effective control strategies and to inform policy decisions. Existing rabies models typically focus on long-term control programs in endemic countries. However, simulation models describing the dog rabies incursion scenario in regions where rabies is still exotic are lacking. We here describe such a stochastic, spatially explicit rabies simulation model that is based on individual dog information collected in two remote regions in northern Australia. Illustrative simulations produced plausible results with epidemic characteristics expected for rabies outbreaks in disease-free regions (mean R0 of 1.7, epidemic peak 97 days post-incursion, vaccination as the most effective response strategy). Systematic sensitivity analysis identified that model outcomes were most sensitive to seven of the 30 model parameters tested. This model is suitable for exploring rabies spread and control before an incursion in populations of largely free-roaming dogs that live close together with their owners. It can be used for ad hoc contingency or response planning prior to and shortly after incursion of dog rabies in previously free regions. One challenge that remains is model parameterisation, particularly how dogs' roaming, contact and biting behaviours change following a rabies incursion in a previously rabies-free population.
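
The effect of vaccination in an incursion scenario can be illustrated with a toy, non-spatial branching sketch. The real model is spatially explicit and individual-based; the population size and the Poisson contact assumption below are invented for illustration, while R0 = 1.7 comes from the abstract:

```python
import math, random

def _poisson(lam, rng):
    """Knuth's algorithm; adequate for the small means used here."""
    if lam <= 0:
        return 0
    L = math.exp(-lam)
    k, p = 0, 1.0
    while p > L:
        k += 1
        p *= rng.random()
    return k - 1

def outbreak_size(n_dogs=500, r0=1.7, vacc_cover=0.0, seed=1):
    """Total cases from one introduced rabid dog; vaccination removes a
    fraction of dogs up front, and the effective reproduction number
    shrinks as susceptibles deplete."""
    rng = random.Random(seed)
    susceptible = int(n_dogs * (1.0 - vacc_cover))
    infectious, total = 1, 1
    while infectious and susceptible:
        new = 0
        for _ in range(infectious):
            cases = min(_poisson(r0 * susceptible / n_dogs, rng),
                        susceptible)
            susceptible -= cases
            new += cases
        infectious, total = new, total + new
    return total

no_control = outbreak_size()
with_vacc = outbreak_size(vacc_cover=0.7)
```

With 70% coverage the initial effective reproduction number drops to about 1.7 × 0.3 ≈ 0.5, below the epidemic threshold, which is consistent with the abstract's finding that vaccination is the most effective response strategy.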

Relevance:

30.00%

Publisher:

Abstract:

Syndromic surveillance (SyS) systems currently exploit various sources of health-related data, most of which are collected for purposes other than surveillance (e.g. economic). Several European SyS systems use data collected during meat inspection for syndromic surveillance of animal health, as some diseases may be more easily detected post-mortem than at their point of origin or during the ante-mortem inspection upon arrival at the slaughterhouse. In this paper we use simulation to evaluate the performance of a quasi-Poisson regression (also known as an improved Farrington) algorithm for the detection of disease outbreaks during post-mortem inspection of slaughtered animals. When parameterizing the algorithm based on retrospective analyses of 6 years of historic data, the probability of detection was satisfactory for large (range 83-445 cases) outbreaks but poor for small (range 20-177 cases) outbreaks. Varying the amount of historical data used to fit the algorithm can help increase the probability of detection for small outbreaks. However, while the use of a 0.975 quantile generated a low false-positive rate, in most cases more than 50% of outbreak cases had already occurred at the time of detection. High variance observed in the whole-carcass condemnation time series, and lack of flexibility in terms of the temporal distribution of simulated outbreaks resulting from low reporting frequency (monthly), constitute major challenges for early detection of outbreaks in the livestock population based on meat inspection data. Reporting frequency should be increased in the future to improve timeliness of the SyS system, while increased sensitivity may be achieved by integrating meat inspection data into a multivariate system simultaneously evaluating multiple sources of data on livestock health.
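
The core detection idea can be sketched as follows. This is a strong simplification of the improved Farrington algorithm (no trend, seasonality modelling, or down-weighting of past outbreaks); z = 1.96 approximates the 0.975 quantile mentioned above, and the monthly counts are invented:

```python
import math, statistics

# Simplified quasi-Poisson alarm: estimate a baseline mean and an
# overdispersion factor phi from historical counts, then flag the
# current count if it exceeds mu + z * sqrt(phi * mu) (normal
# approximation to the quasi-Poisson upper quantile).

def quasi_poisson_alarm(history, current, z=1.96):
    mu = statistics.mean(history)
    phi = max(1.0, statistics.variance(history) / mu)  # overdispersion
    threshold = mu + z * math.sqrt(phi * mu)
    return current > threshold, threshold

# Hypothetical monthly whole-carcass condemnation counts.
history = [12, 15, 9, 14, 11, 13, 16, 10, 12, 14]
alarm, thr = quasi_poisson_alarm(history, current=30)
```

With monthly reporting, a whole month of outbreak cases accumulates before a count can even be tested, which illustrates the timeliness problem the paper identifies.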

Relevance:

30.00%

Publisher:

Abstract:

Direct Simulation Monte Carlo (DSMC) is a powerful numerical method to study rarefied gas flows such as cometary comae and has been used by several authors over the past decade to study cometary outflow. However, exploring the parameter space in simulations can be time consuming since 3D DSMC is computationally highly intensive. For the target of ESA's Rosetta mission, comet 67P/Churyumov-Gerasimenko, we have identified to what extent modifications of several parameters influence the 3D flow and gas temperature fields, and have attempted to establish the reliability of inferences about the initial conditions from in situ and remote sensing measurements. A large number of DSMC runs have been completed with varying input parameters. In this work, we present the simulation results and assess the sensitivity of the solutions to certain inputs. It is found that, for the water outgassing cases, the surface production rate distribution is the most influential variable for the flow field.
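
A back-of-the-envelope relation that links the most influential input identified above (the gas production rate) to the flow field is the free radial outflow density, n(r) = Q / (4π r² v). The values of Q and v below are typical published orders of magnitude for a comet like 67P, used purely for illustration; they are not the DSMC inputs of this study:

```python
import math

# Steady free radial outflow: number density at cometocentric distance
# r for total production rate Q [molecules/s] and outflow speed v [m/s].
# Q = 1e27 and v = 500 are illustrative order-of-magnitude values.

def coma_density(r_m, Q_s=1e27, v_ms=500.0):
    """Number density [molecules / m^3] at distance r from the nucleus."""
    return Q_s / (4.0 * math.pi * r_m ** 2 * v_ms)

n_10km = coma_density(10e3)   # density 10 km from the nucleus
```

A full DSMC treatment replaces this spherically symmetric estimate with a kinetic solution that resolves the non-uniform surface production distribution, collisions and temperature fields.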