11 results for Simulation experiments
in BORIS: Bern Open Repository and Information System - Bern - Switzerland
Abstract:
We study state-based video communication where a client simultaneously informs the server about the presence status of various packets in its buffer. In sender-driven transmission, the client periodically sends to the server a single acknowledgement packet that provides information about all packets that have arrived at the client by the time the acknowledgement is sent. In receiver-driven streaming, the client periodically sends to the server a single request packet that comprises a transmission schedule for sending missing data to the client over a horizon of time. We develop a comprehensive optimization framework that enables computing packet transmission decisions that maximize the end-to-end video quality for the given bandwidth resources, in both scenarios. The core step of the optimization consists of computing the probability that a single packet will be communicated in error as a function of the expected transmission redundancy (or cost) used to communicate the packet. Through comprehensive simulation experiments, we carefully examine the performance advances that our framework enables relative to state-of-the-art scheduling systems that employ regular acknowledgement or request packets. Consistent gains in video quality of up to 2 dB are demonstrated across a variety of content types. We show that there is a direct analogy between the error-cost efficiency of streaming a single packet and the overall rate-distortion performance of streaming the whole content. In the case of sender-driven transmission, we develop an effective modeling approach that accurately characterizes the end-to-end performance as a function of the packet loss rate on the backward channel and the source encoding characteristics.
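To make the error-cost notion above concrete, the following minimal Python sketch assumes a memoryless channel with loss rate p in which a packet sent with redundancy r is lost only if all r copies are lost; the loss model and the importance weight are illustrative assumptions, not the paper's framework, but the resulting error-cost curve is the kind of per-packet quantity such an optimization trades off against expected distortion.

# Minimal sketch under an assumed memoryless loss model (not the paper's method):
# with loss rate p and redundancy r, the packet is in error only if all r copies
# are lost, i.e. with probability p**r, at a transmission cost of r.

def error_probability(p: float, redundancy: int) -> float:
    """Probability that all copies of the packet are lost."""
    return p ** redundancy

def expected_distortion(importance: float, p: float, redundancy: int) -> float:
    """Expected distortion contribution of one packet (hypothetical importance weight)."""
    return importance * error_probability(p, redundancy)

for r in (1, 2, 3, 4):
    print(r, error_probability(0.1, r), expected_distortion(4.0, 0.1, r))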
Abstract:
Numerical simulation experiments give insight into the evolving energy partitioning during high-strain torsion experiments of calcite. Our numerical experiments are designed to derive a generic macroscopic grain-size-sensitive flow law capable of describing the full evolution from the transient regime to steady state. The transient regime is crucial for understanding the importance of microstructural processes that may lead to strain localization phenomena in deforming materials. This is particularly important in geological and geodynamic applications, where the phenomenon of strain localization happens outside the time frame that can be observed under controlled laboratory conditions. Our method is based on an extension of the paleowattmeter approach to the transient regime. We add an empirical hardening law using the Ramberg-Osgood approximation and assess the experiments by an evolution test function of stored over dissipated energy (lambda factor). Parameter studies of strain hardening, dislocation creep parameter, strain rates, temperature, and lambda factor, as well as mesh sensitivity, are presented to explore the sensitivity of the newly derived transient/steady-state flow law. Our analysis can be seen as one of the first steps in a hybrid computational-laboratory-field modeling workflow. The analysis could be improved through independent verification by thermographic analysis in physical laboratory experiments to assess lambda factor evolution under laboratory conditions.
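For reference, a generic form of the Ramberg-Osgood approximation and of the lambda factor mentioned above can be written as follows; the symbols E, sigma_0, alpha and n are the usual generic coefficients of this relation, and the calibration used in the study may differ.

\varepsilon \;=\; \frac{\sigma}{E} \;+\; \alpha\,\frac{\sigma_0}{E}\left(\frac{\sigma}{\sigma_0}\right)^{n},
\qquad
\lambda \;=\; \frac{E_{\mathrm{stored}}}{E_{\mathrm{dissipated}}}

Here the first term is the elastic strain, the second an empirical hardening term, and lambda tracks the ratio of stored to dissipated energy during deformation.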
Abstract:
This paper studies the energy efficiency and service characteristics of a recently developed energy-efficient MAC protocol for wireless sensor networks, both in simulation and on a real sensor hardware testbed. We use this opportunity to illustrate how simulation models can be verified by cross-comparing simulation results with real-world experiment results. The paper demonstrates that by careful calibration of simulation model parameters, the inevitable gap between simulation models and real-world conditions can be reduced. It concludes with guidelines for a methodology for model calibration and validation of sensor network simulation models.
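As a rough illustration of the calibration guideline described above (the metric names, parameter grid, and squared-error objective are assumptions made for this sketch, not the paper's procedure), one can sweep candidate simulation parameters and retain the set that best reproduces the testbed measurements:

from itertools import product

def calibrate(simulate, measured, param_grid):
    """Return the parameter set whose simulated metrics best match the testbed measurements."""
    best_params, best_err = None, float("inf")
    keys = list(param_grid)
    for values in product(*(param_grid[k] for k in keys)):
        params = dict(zip(keys, values))
        simulated = simulate(**params)  # e.g. {"energy_mJ": ..., "latency_ms": ...}
        # Squared error over the metrics measured on the real testbed
        err = sum((simulated[m] - measured[m]) ** 2 for m in measured)
        if err < best_err:
            best_params, best_err = params, err
    return best_params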
Abstract:
Cloud Computing is an enabler for delivering large-scale, distributed enterprise applications with strict requirements in terms of performance. It is often the case that such applications have complex scaling and Service Level Agreement (SLA) management requirements. In this paper we present a simulation approach for validating and comparing SLA-aware scaling policies using the CloudSim simulator, using data from an actual Distributed Enterprise Information System (dEIS). We extend CloudSim with concurrent and multi-tenant task simulation capabilities. We then show how different scaling policies can be used for simulating multiple dEIS applications. We present multiple experiments depicting the impact of VM scaling on both datacenter energy consumption and dEIS performance indicators.
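As a point of reference only, the sketch below shows the shape of a simple threshold-based scaling policy of the kind such a simulation can compare; the thresholds, VM limits, and utilisation metric are assumptions for illustration and do not correspond to the policies evaluated in the paper or to the CloudSim API.

from dataclasses import dataclass

@dataclass
class ScalingPolicy:
    scale_out_util: float = 0.80   # add a VM above this average CPU utilisation
    scale_in_util: float = 0.30    # remove a VM below this utilisation
    min_vms: int = 1
    max_vms: int = 10

    def decide(self, avg_cpu_util: float, current_vms: int) -> int:
        """Return the target number of VMs for the next control interval."""
        if avg_cpu_util > self.scale_out_util and current_vms < self.max_vms:
            return current_vms + 1
        if avg_cpu_util < self.scale_in_util and current_vms > self.min_vms:
            return current_vms - 1
        return current_vms

policy = ScalingPolicy()
print(policy.decide(avg_cpu_util=0.9, current_vms=3))  # -> 4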
Abstract:
Upconverter materials and upconverter solar devices were recently investigated with broadband excitation, revealing the great potential of upconversion to enhance the efficiency of solar cells at comparatively low solar concentration factors. In this work, first attempts are made to simulate the behavior of the upconverter β-NaYF4 doped with Er3+ under broadband excitation. An existing model was adapted to account for the lower absorption of broader excitation spectra. While the same trends as observed in the experiments were found in the simulation, the absolute values are fairly different. This makes an upconversion model that specifically considers the line shape function of the ground state absorption indispensable for achieving accurate simulations of upconverter materials and upconverter solar cell devices with broadband excitation, such as solar radiation.
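In line with the conclusion above, the relevant first-order quantity is the excitation spectrum weighted by the ground-state-absorption (GSA) line shape; in generic, assumed notation:

\Phi_{\mathrm{abs}} \;=\; \int \phi(\lambda)\,\sigma_{\mathrm{GSA}}(\lambda)\,\mathrm{d}\lambda

where phi(lambda) is the spectral photon flux of the broadband excitation and sigma_GSA(lambda) the ground-state-absorption cross-section; a model that replaces the line shape by a single effective absorption value cannot capture this weighting, which is consistent with the discrepancy reported above.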
Abstract:
BACKGROUND Electrochemical conversion of xenobiotics has been shown to mimic human phase I metabolism for a few compounds. MATERIALS & METHODS Twenty-one compounds were analyzed with a semiautomated electrochemical setup and mass spectrometry detection. RESULTS The system was able to mimic some metabolic pathways, such as oxygen gain, dealkylation and deiodination, but many of the expected and known metabolites were not produced. CONCLUSION Electrochemical conversion is a useful approach for the preparative synthesis of some types of metabolites, but as a screening method for unknown phase I metabolites, the method is, in our opinion, inferior to incubation with human liver microsomes and in vivo experiments with laboratory animals, for example.
Abstract:
Phase-sensitive X-ray imaging shows a high sensitivity towards electron density variations, making it well suited for imaging of soft tissue. However, there are still open questions about the details of the image formation process. Here, a framework for numerical simulations of phase-sensitive X-ray imaging is presented which takes both particle- and wave-like properties of X-rays into consideration. It follows a split approach that combines a Monte Carlo (MC) based sample part with a wave-optics-based propagation part. The framework can be adapted to different phase-sensitive imaging methods and has been validated through comparisons with experiments for grating interferometry and propagation-based imaging. The validation shows that the combination of wave optics and MC has been successfully implemented and yields good agreement between measurements and simulations. This demonstrates that the physical processes relevant for developing a deeper understanding of scattering in the context of phase-sensitive imaging are modelled in a sufficiently accurate manner. The framework can thus be used to simulate, for instance, grating interferometry or propagation-based imaging.
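As background for the wave-optics part, the following minimal sketch implements generic free-space propagation of a sampled complex field with the angular spectrum method; this is a standard technique, the grid, wavelength, and distance parameters are illustrative, and it is not the framework's actual code.

import numpy as np

def angular_spectrum_propagate(u0, wavelength, pixel_size, distance):
    """Propagate a complex field u0 (2D array) over `distance` in free space."""
    ny, nx = u0.shape
    fx = np.fft.fftfreq(nx, d=pixel_size)
    fy = np.fft.fftfreq(ny, d=pixel_size)
    FX, FY = np.meshgrid(fx, fy)
    # Keep only propagating components; evanescent waves are suppressed.
    arg = 1.0 - (wavelength * FX) ** 2 - (wavelength * FY) ** 2
    kz = 2 * np.pi / wavelength * np.sqrt(np.maximum(arg, 0.0))
    transfer = np.exp(1j * kz * distance) * (arg > 0)
    return np.fft.ifft2(np.fft.fft2(u0) * transfer)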
Abstract:
This study aims at assessing the skill of several climate field reconstruction (CFR) techniques to reconstruct past precipitation over continental Europe and the Mediterranean at seasonal time scales over the last two millennia from proxy records. A number of pseudoproxy experiments are performed within the virtual reality of a regional paleoclimate simulation at 45 km resolution to analyse different aspects of reconstruction skill. Canonical Correlation Analysis (CCA), two versions of an Analog Method (AM) and Bayesian hierarchical modeling (BHM) are applied to reconstruct precipitation from a synthetic network of pseudoproxies that are contaminated with various types of noise. The skill of the derived reconstructions is assessed through comparison with the precipitation simulated by the regional climate model. Unlike BHM, CCA systematically underestimates the variance. The AM can be adjusted to overcome this shortcoming, presenting an intermediate behaviour between the two aforementioned techniques. However, a trade-off between reconstruction-target correlations and reconstructed variance is the drawback of all CFR techniques. CCA (BHM) presents the largest (lowest) skill in preserving the temporal evolution, whereas the AM can be tuned to reproduce better correlation at the expense of losing variance. While BHM has been shown to perform well for temperatures, it relies heavily on prescribed spatial correlation lengths; this assumption is valid for temperature, but it is hardly warranted for precipitation. In general, none of the methods outperforms the others. All experiments agree that a dense and regularly distributed proxy network is required to reconstruct precipitation accurately, reflecting its high spatial and temporal variability. This is especially true in summer, when localised summertime convective precipitation events cause an especially short de-correlation distance from the proxy locations.
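As an illustration of the pseudoproxy construction mentioned above, a common choice is white-noise contamination at a prescribed signal-to-noise ratio; the SNR definition used in this sketch is an assumption, and the noise types applied in the study may differ.

import numpy as np

def make_pseudoproxy(target, snr, rng=None):
    """Degrade a model time series at a proxy location with white noise (SNR = std ratio)."""
    rng = np.random.default_rng() if rng is None else rng
    noise_std = target.std() / snr
    return target + rng.normal(0.0, noise_std, size=target.shape)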
Abstract:
Gaussian random field (GRF) conditional simulation is a key ingredient in many spatial statistics problems for computing Monte Carlo estimators and quantifying uncertainties on non-linear functionals of GRFs conditional on data. Conditional simulations are known to often be computationally intensive, especially when appealing to matrix decomposition approaches with a large number of simulation points. This work studies settings where conditioning observations are assimilated batch sequentially, with one point or a batch of points at each stage. Assuming that conditional simulations have been performed at a previous stage, the goal is to take advantage of already available sample paths and by-products to produce updated conditional simulations at minimal cost. Explicit formulae are provided, which allow updating an ensemble of sample paths conditioned on n ≥ 0 observations to an ensemble conditioned on n + q observations, for arbitrary q ≥ 1. Compared to direct approaches, the proposed formulae prove to substantially reduce computational complexity. Moreover, these formulae explicitly exhibit how the q new observations update the old sample paths. Detailed complexity calculations highlighting the benefits of this approach with respect to state-of-the-art algorithms are provided and are complemented by numerical experiments.
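For background, the classical conditioning-by-kriging identity on which such residual-based updates build (not the paper's batch-sequential update formulae) reads:

Z_{\mathrm{cond}}(x) \;=\; \widehat{Z}_{\mathrm{data}}(x) \;+\; \bigl( Z_{\mathrm{sim}}(x) - \widehat{Z}_{\mathrm{sim}}(x) \bigr)

where Z_sim is an unconditional sample path, the first kriging term is built from the n observations, and the second applies the same predictor to the values of Z_sim at the observation locations; moving from n to n + q observations then amounts to revising these kriging terms, which is where efficient update formulae come into play.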