937 results for C15 - Statistical Simulation Methods
Abstract:
The files accompanying my document were produced with the LaTeX software, and the simulations were carried out in Splus(R).
Abstract:
Spanning avalanches in the 3D Gaussian Random Field Ising Model (3D-GRFIM) with metastable dynamics at T=0 have been studied. Statistical analysis of the field values at which avalanches occur has enabled a Finite-Size Scaling (FSS) study of the avalanche density. Furthermore, a direct measurement of the geometrical properties of the avalanches has confirmed an earlier hypothesis that several types of spanning avalanches with two different fractal dimensions coexist at the critical point. Finally, we compare the phase diagram of the 3D-GRFIM with metastable dynamics to that of the same model in equilibrium at T=0.
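For context, the following is a minimal sketch (not the authors' code) of T=0 single-spin-flip metastable dynamics in the 3D Gaussian RFIM: the external field is ramped to the next instability and the resulting avalanche is propagated at fixed field. The lattice size, disorder strength, and ramping protocol are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
L, J, sigma = 8, 1.0, 2.3               # lattice size, coupling, disorder strength (assumed)
h = rng.normal(0.0, sigma, (L, L, L))   # quenched Gaussian random fields
s = -np.ones((L, L, L), dtype=int)      # start fully magnetized down

def neighbours(i, j, k):
    """Six nearest neighbours with periodic boundaries."""
    return [((i + 1) % L, j, k), ((i - 1) % L, j, k),
            (i, (j + 1) % L, k), (i, (j - 1) % L, k),
            (i, j, (k + 1) % L), (i, j, (k - 1) % L)]

def nb_sum(i, j, k):
    return sum(s[p] for p in neighbours(i, j, k))

sizes = []
while (s < 0).any():
    # ramp the external field H to the next instability among the down spins
    H, seed = min((-h[p] - J * nb_sum(*p), p) for p in zip(*np.where(s < 0)))
    s[seed], size, stack = 1, 1, list(neighbours(*seed))
    while stack:                         # propagate the avalanche at fixed H
        p = stack.pop()
        if s[p] < 0 and h[p] + J * nb_sum(*p) + H > 0:
            s[p] = 1
            size += 1
            stack.extend(neighbours(*p))
    sizes.append(size)

print("avalanches:", len(sizes), "largest:", max(sizes))
```

Recording avalanche sizes and the fields H at which they occur, over many disorder realizations and several L, is the raw material for the FSS analysis of the avalanche density described above.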
Abstract:
Collective dynamic properties of Lennard-Jones crystals are investigated by molecular dynamics simulation. The study focuses on properties such as the dynamic structure factors, the longitudinal and transverse currents, and the density of states. The influence of structural disorder on these properties is analyzed by comparing the results for one-component crystals with those for liquids and supercooled liquids under analogous conditions. The effects of species disorder on the collective properties of binary crystals are also discussed.
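As a minimal sketch of one such collective measurement, the snippet below estimates the vibrational density of states from the velocity autocorrelation function of a trajectory. The toy velocity data and reduced units stand in for a real MD trajectory and are assumptions, not the study's data.

```python
import numpy as np

# toy stand-in for an MD trajectory: n_frames x n_atoms x 3 velocities
rng = np.random.default_rng(1)
n_frames, n_atoms, dt = 2048, 32, 0.005      # dt in reduced LJ time units (assumed)
t = np.arange(n_frames) * dt
freqs = rng.uniform(5.0, 15.0, n_atoms)      # fake vibrational frequencies (rad/time)
vel = np.cos(np.outer(t, freqs))[:, :, None] * rng.normal(size=(1, n_atoms, 3))

# velocity autocorrelation function, averaged over atoms and components
nlag = 512
vacf = np.array([(vel[:n_frames - k] * vel[k:]).mean() for k in range(nlag)])
vacf /= vacf[0]

# density of states ~ cosine transform of the VACF
omega = 2 * np.pi * np.fft.rfftfreq(nlag, d=dt)
dos = np.fft.rfft(vacf).real
print("dominant angular frequency:", omega[np.argmax(dos)])
```

The same trajectory-processing pattern, with density and current fluctuations in place of velocities, yields the dynamic structure factors and the longitudinal and transverse current spectra mentioned in the abstract.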
Abstract:
While channel coding is a standard method of improving a system’s energy efficiency in digital communications, its practice does not extend to high-speed links. Increasing demands on network speeds place a large burden on the energy efficiency of high-speed links and make the benefit of channel coding for these systems a timely subject. The low error rates of interest and the presence of residual intersymbol interference (ISI) caused by hardware constraints impede the analysis and simulation of coded high-speed links. Focusing on the residual ISI and combined noise as the dominant error mechanisms, this paper analyses error correlation through the concepts of error region, channel signature, and correlation distance. This framework provides deeper insight into joint error behaviours in high-speed links, extends the range of statistical simulation for coded high-speed links, and provides a case against the use of biased Monte Carlo methods in this setting.
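As a toy illustration of the phenomenon the paper studies (not its framework), the sketch below simulates a binary link with an assumed residual-ISI pulse response and measures how slicer errors cluster: the conditional error probability at small symbol separations versus the marginal bit error rate.

```python
import numpy as np

rng = np.random.default_rng(2)
h = np.array([1.0, 0.35, -0.18, 0.08])   # assumed pulse response: main tap + residual ISI
sigma_n, n_bits = 0.35, 2_000_000        # noise std dev and simulation length (assumed)

x = rng.choice([-1.0, 1.0], n_bits)                      # random binary symbols
y = np.convolve(x, h)[:n_bits] + rng.normal(0.0, sigma_n, n_bits)
errors = np.flatnonzero(np.sign(y) != x)                 # slicer error positions

# empirical error correlation: P(error at k+d | error at k) versus the marginal BER
ber = errors.size / n_bits
for d in range(1, 6):
    joint = np.intersect1d(errors + d, errors).size / max(errors.size, 1)
    print(f"d={d}: P(err | err {d} symbols back) = {joint:.4f}  vs  BER = {ber:.4f}")
```

When residual ISI dominates, the conditional probabilities at small d sit well above the marginal BER, which is exactly the joint error behaviour that complicates coded-link analysis and biased Monte Carlo estimation.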
Statistical evaluation of the fixed concentration procedure for acute inhalation toxicity assessment
Abstract:
The conventional method for the assessment of acute inhalation toxicity (OECD Test Guideline 403, 1981) uses death of animals as an endpoint to identify the median lethal concentration (LC50). A new OECD Testing Guideline called the Fixed Concentration Procedure (FCP) is being prepared to provide an alternative to Test Guideline 403. Unlike Test Guideline 403, the FCP does not provide a point estimate of the LC50, but aims to identify an airborne exposure level that causes clear signs of nonlethal toxicity. This is then used to assign classification according to the new Globally Harmonized System of Classification and Labelling (GHS). The FCP has been validated using statistical simulation rather than by in vivo testing. The statistical simulation approach predicts the GHS classification outcome and the numbers of deaths and animals used in the test for imaginary substances with a range of LC50 values and dose-response curve slopes. This paper describes the FCP and reports the results of the statistical simulation study assessing its properties. It is shown that the procedure will be completed with considerably less death and suffering than Test Guideline 403, and will classify substances either in the same or a more stringent GHS class than that assigned on the basis of the LC50 value.
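The following is a deliberately simplified sketch of this style of simulation study, not the validated FCP protocol: it assumes a probit (log-normal) dose-response model, invented fixed concentration levels, and a naive stopping rule, merely to show how outcomes and animal deaths can be estimated for imaginary substances.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(3)

def p_death(conc, lc50, slope):
    # probit (log-normal) dose-response model: an assumption of this sketch
    return norm.cdf(slope * np.log10(conc / lc50))

def simulate_test(lc50, slope, levels=(0.5, 2.0, 10.0, 20.0), n_animals=5):
    # invented staged rule: test rising fixed levels, stop at the first deaths;
    # the real FCP stops on evident (nonlethal) toxicity under different rules
    deaths_total = 0
    for level in levels:
        deaths = rng.binomial(n_animals, p_death(level, lc50, slope))
        deaths_total += deaths
        if deaths > 0:
            return level, deaths_total
    return levels[-1], deaths_total

# imaginary substances spanning a range of LC50 values (mg/L, assumed units)
for lc50 in (1.0, 5.0, 25.0):
    runs = [simulate_test(lc50, slope=2.0) for _ in range(2000)]
    print(f"LC50={lc50} mg/L: mean deaths per test = "
          f"{np.mean([d for _, d in runs]):.2f}")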
Abstract:
This article introduces a quantitative approach to e-commerce system evaluation based on the theory of process simulation. The general concept of e-commerce system simulation is motivated by several limitations in e-commerce system development, such as the large initial investment of time and money and the long period from business planning through system development, testing, and operation to actual returns. In other words, currently used system analysis and development methods cannot tell investors how good their e-commerce system could be, what return on investment they could expect, and which areas of the initial business plan they should improve. To examine the value and potential effects of an e-commerce business plan, a quantitative evaluation approach is necessary, and the authors believe that process simulation is an appropriate option. The overall objective of this article is to apply the theory of process simulation to e-commerce system evaluation, which the authors achieve through an experimental study of a business plan for online construction and demolition waste exchange. The methodologies adopted include literature review, system analysis and development, simulation modelling and analysis, and a case study. The results include the concept of e-commerce system simulation, a comprehensive review of simulation methods adopted in e-commerce system evaluation, and a real case study applying simulation to e-commerce system evaluation. The authors hope that the adoption and implementation of the process simulation approach can effectively support business decision-making and improve the efficiency of e-commerce systems.
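As a minimal illustration of process simulation in this spirit (not the authors' waste-exchange model), the sketch below runs a single-server queueing simulation of a hypothetical order-processing step and reports how mean waiting time grows with utilization; all rates are invented.

```python
import random

random.seed(4)

def simulate_orders(arrival_rate, service_rate, n_orders=50_000):
    """Single-server FIFO queue standing in for one e-commerce process step."""
    t_arrival, server_free, waits = 0.0, 0.0, []
    for _ in range(n_orders):
        t_arrival += random.expovariate(arrival_rate)   # next order arrives
        start = max(t_arrival, server_free)             # waits if server busy
        waits.append(start - t_arrival)
        server_free = start + random.expovariate(service_rate)
    return sum(waits) / len(waits)

# hypothetical what-if experiment for a business-plan scenario
for lam in (0.5, 0.8, 0.95):
    print(f"utilization ~ {lam:.2f}: mean wait = {simulate_orders(lam, 1.0):.2f}")
```

Sweeping such parameters before any system is built is precisely the kind of question ("how good could the system be, where should the plan improve?") that the article argues process simulation can answer.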
Abstract:
Researchers often rely on the t-statistic to make inference on parameters in statistical models. It is common practice to obtain critical values by simulation techniques. This paper proposes a novel numerical method to obtain an approximately similar test. This test rejects the null hypothesis when the test statistic is larger than a critical value function (CVF) of the data. We illustrate this procedure when regressors are highly persistent, a case in which commonly used simulation methods encounter difficulties controlling size uniformly. Our approach works satisfactorily, controls size, and yields a test which outperforms the two other known similar tests.
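A minimal sketch of the standard simulation practice the paper starts from: simulating the null distribution of the t-statistic in an AR(1) model and reading off critical values, which drift substantially as the autoregressive root approaches one. The CVF construction itself is beyond this sketch; the model and sample size are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(5)

def t_stat_ar1(rho, T=200):
    """t-statistic for H0: rho = rho0 in the regression y_t = rho * y_{t-1} + e_t."""
    e = rng.normal(size=T)
    y = np.empty(T)
    y[0] = e[0]
    for t in range(1, T):
        y[t] = rho * y[t - 1] + e[t]
    x, z = y[:-1], y[1:]
    rho_hat = (x @ z) / (x @ x)
    resid = z - rho_hat * x
    se = np.sqrt(resid @ resid / (len(x) - 1) / (x @ x))
    return (rho_hat - rho) / se

# simulated critical values are strongly rho-dependent near the unit root,
# which is why a single tabulated cutoff fails to control size uniformly
for rho in (0.0, 0.9, 0.99):
    draws = np.array([t_stat_ar1(rho) for _ in range(5000)])
    print(rho, np.percentile(draws, [2.5, 97.5]).round(2))
```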
Abstract:
Purpose: To evaluate the root fracture strength of human single-rooted premolars restored with customized fiberglass post-core systems after fatigue simulation. Methods: 40 human premolars had their crowns cut and the root length was standardized to 13 mm. The teeth were endodontically treated and embedded in acrylic resin. The specimens were distributed into four groups (n=10) according to the restorative material used: prefabricated fiber post (PFP), PFP+accessory fiber posts (PFPa), PFP+unidirectional fiberglass (PFPf), and unidirectional fiberglass customized post (CP). All posts were luted using resin cement and the cores were built up with a resin composite. The samples were stored for 24 hours at 37 degrees C and 100% relative humidity and then submitted to mechanical cycling. The specimens were then compressive-loaded in a universal testing machine at a crosshead speed of 0.5 mm/minute until fracture. The failure patterns were analyzed and classified. Data were submitted to one-way ANOVA and Tukey's test (alpha = 0.05). Results: The mean values of maximum load (N) were: PFP - 811.4 +/- 124.3; PFPa - 729.2 +/- 157.2; PFPf - 747.5 +/- 204.7; CP - 762.4 +/- 110. No statistically significant differences were observed among the groups. All groups showed favorable, restorable failures. The customized fiberglass post did not show improved fracture resistance or different failure patterns compared with prefabricated glass fiber posts. (Am J Dent 2012;25:35-38).
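For readers who want to reproduce the style of analysis, here is a minimal sketch using synthetic normal samples drawn to match the reported group means and standard deviations (n=10 per group); it uses simulated draws, not the study's raw data.

```python
import numpy as np
from scipy.stats import f_oneway
from statsmodels.stats.multicomp import pairwise_tukeyhsd

rng = np.random.default_rng(6)

# synthetic normal samples matching the reported means/SDs (n=10 each)
spec = {'PFP':  (811.4, 124.3), 'PFPa': (729.2, 157.2),
        'PFPf': (747.5, 204.7), 'CP':   (762.4, 110.0)}
data = {g: rng.normal(m, sd, 10) for g, (m, sd) in spec.items()}

print(f_oneway(*data.values()))          # one-way ANOVA across the four groups

values = np.concatenate(list(data.values()))
labels = np.repeat(list(data), 10)
print(pairwise_tukeyhsd(values, labels, alpha=0.05))
```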
Abstract:
Background: The purpose of this study is to analyze the tension distribution in bone tissue around implants with different angulations (0 degrees, 17 degrees, and 30 degrees) and connections (external hexagon and tapered) through three-dimensional finite element and statistical analyses. Methods: Twelve configurations of three-dimensional finite element models, covering three implant inclinations (0 degrees, 17 degrees, and 30 degrees), two connections (external hexagon and tapered), and two load applications (axial and oblique), were simulated. The maximum principal stress values for cortical bone were measured at the mesial, distal, buccal, and lingual regions around the implant for each analyzed situation, totaling 48 groups. Loads of 200 and 100 N were applied at the occlusal surface in the axial and oblique directions, respectively. Maximum principal stress values were measured at the bone crest and statistically analyzed using analysis of variance. Stress patterns in the bone tissue around the implant were analyzed qualitatively. Results: Under oblique loading, the external hexagon connection showed significantly higher stress concentrations in the bone tissue (P < 0.05) than the tapered connection. Moreover, the buccal and mesial regions of the cortical bone showed significantly higher stress concentrations (P < 0.005) for the external hexagon implant type. Under oblique loading, increased external hexagon implant angulation induced a significantly higher stress concentration (P = 0.045). Conclusions: The study results show that: 1) the oblique load was more damaging to bone tissue, mainly when associated with external hexagon implants; and 2) there was a higher stress concentration in the buccal region than in all other regions under oblique load.
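A brief note on the quantity reported: the maximum principal stress is the largest eigenvalue of the symmetric stress tensor at a point. The sketch below shows that calculation for a hypothetical stress state, not a result from the study.

```python
import numpy as np

# maximum principal stress = largest eigenvalue of the (symmetric) stress tensor
sigma = np.array([[30.0,  5.0,  0.0],      # hypothetical stress state in MPa
                  [ 5.0, 12.0,  3.0],
                  [ 0.0,  3.0, -8.0]])
principal = np.linalg.eigvalsh(sigma)       # eigenvalues, sorted ascending
print("principal stresses (MPa):", principal, "-> max:", principal[-1])
```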
Abstract:
Monte Carlo simulation methods were used to study the conformational properties of partially ionized polyelectrolyte chains with Debye-Hückel screening in a 1:1 electrolyte solution at room temperature. Configurational properties such as the probability distributions of the squared end-to-end distance, the squared radius of gyration, and the angles between polyion bonds were investigated as functions of the chain ionization and the salt concentration.
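A minimal sketch of this kind of calculation (not the original code): Metropolis Monte Carlo for a bead-spring polyelectrolyte interacting through a screened Debye-Hückel pair potential. The chain length, ionization fraction, screening constant, and bond stiffness below are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(7)
N, f, kappa, lb = 40, 0.5, 0.5, 2.0        # beads, ionization, screening, Bjerrum length
q = (rng.random(N) < f).astype(float)      # randomly ionized monomers
pos = np.cumsum(rng.normal(size=(N, 3)), axis=0)  # initial random-walk configuration

def energy(pos):
    """Screened Coulomb (Debye-Hückel) energy plus stiff harmonic bonds."""
    d = np.linalg.norm(pos[:, None] - pos[None, :], axis=-1)
    iu = np.triu_indices(N, k=1)
    r = d[iu]
    u_el = lb * np.sum(q[iu[0]] * q[iu[1]] * np.exp(-kappa * r) / r)
    bonds = np.linalg.norm(np.diff(pos, axis=0), axis=1)
    return u_el + 100.0 * np.sum((bonds - 1.0) ** 2)

E = energy(pos)
for step in range(20000):                  # Metropolis single-bead moves
    i = rng.integers(N)
    trial = pos.copy()
    trial[i] += rng.normal(0.0, 0.2, 3)
    dE = energy(trial) - E
    if dE < 0 or rng.random() < np.exp(-dE):
        pos, E = trial, E + dE

Ree2 = np.sum((pos[-1] - pos[0]) ** 2)     # squared end-to-end distance sample
Rg2 = np.mean(np.sum((pos - pos.mean(0)) ** 2, axis=1))
print(Ree2, Rg2)
```

Collecting Ree2 and Rg2 over many decorrelated configurations, while varying f and kappa, yields exactly the distributions studied as functions of chain ionization and salt concentration.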
Abstract:
Brazil is the largest sugarcane producer in the world and is well positioned to supply national and international markets. To maintain high sugarcane production, it is fundamental to improve crop-season forecasting models through the use of alternative technologies such as remote sensing. Thus, the main purpose of this article is to assess the results of two different statistical forecasting methods applied to an agroclimatic index (the water requirement satisfaction index, WRSI) and to the sugarcane spectral response (normalized difference vegetation index, NDVI) registered in National Oceanic and Atmospheric Administration Advanced Very High Resolution Radiometer (NOAA-AVHRR) satellite images. We also evaluated the cross-correlation between these two indexes. According to the results obtained, there are meaningful correlations between NDVI and WRSI at certain time lags. Additionally, the adjusted model for NDVI produced more accurate results than the forecasting models for WRSI. Finally, the analyses indicate that NDVI is more predictable due to its seasonality, whereas the WRSI values are more variable and thus harder to forecast.
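A minimal sketch of the lagged cross-correlation analysis, using synthetic series in which WRSI is assumed to lead NDVI by a few compositing periods; the series construction and the lead time are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(8)

# synthetic stand-ins: WRSI assumed to lead NDVI by ~3 composites
n, lead = 200, 3
wrsi = np.sin(2 * np.pi * np.arange(n) / 24) + 0.3 * rng.normal(size=n)
ndvi = np.roll(wrsi, lead) + 0.1 * rng.normal(size=n)

def lagged_corr(x, y, lag):
    """Pearson correlation between x_t and y_{t+lag}."""
    if lag > 0:
        x, y = x[:-lag], y[lag:]
    elif lag < 0:
        x, y = x[-lag:], y[:lag]
    return np.corrcoef(x, y)[0, 1]

lags = range(-6, 7)
cc = [lagged_corr(wrsi, ndvi, k) for k in lags]
best = max(zip(cc, lags))
print(f"peak correlation {best[0]:.2f} at lag {best[1]}")
```

On real data the same scan over lags identifies how far ahead the agroclimatic index anticipates the vegetation response, which is what makes the lagged relationship useful for forecasting.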
Abstract:
While the use of statistical physics methods to analyze large corpora has been useful in unveiling many patterns in texts, no comprehensive investigation has been performed on the interdependence between syntactic and semantic factors. In this study we propose a framework for determining whether a text (e.g., one written in an unknown alphabet) is compatible with a natural language and, if so, to which language it could belong. The approach is based on three types of statistical measurements: those obtained from first-order statistics of word properties in a text, from the topology of complex networks representing texts, and from intermittency concepts in which the text is treated as a time series. Comparative experiments were performed with the New Testament in 15 different languages and with distinct books in English and Portuguese in order to quantify the dependence of the different measurements on the language and on the story being told in the book. The metrics found to be informative in distinguishing real texts from their shuffled versions include assortativity, degree, and selectivity of words. As an illustration, we analyze an undeciphered medieval manuscript known as the Voynich Manuscript. We show that it is mostly compatible with natural languages and incompatible with random texts. We also obtain candidates for keywords of the Voynich Manuscript, which could be helpful in the effort to decipher it. Because we were able to identify statistical measurements that depend more on the syntax than on the semantics, the framework may also serve for text analysis in language-dependent applications.
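A toy sketch of the network side of such measurements: build a word-adjacency network from a tiny stand-in text and compute degree assortativity plus a simple frequency/degree proxy for word selectivity. The selectivity definition used here is one of several in the literature, and the text is a placeholder for a full corpus.

```python
import networkx as nx
from collections import Counter

text = ("in the beginning was the word and the word was with god "
        "and the word was god").split()     # toy stand-in for a full corpus

# adjacency network: an edge links each pair of consecutive words
G = nx.Graph(list(zip(text, text[1:])))

print("degree assortativity:", nx.degree_assortativity_coefficient(G))

# selectivity proxy: word frequency divided by its number of distinct
# neighbours (high for words reused in few contexts)
freq = Counter(text)
selectivity = {w: freq[w] / G.degree(w) for w in G}
print("most selective words:", sorted(selectivity, key=selectivity.get,
                                      reverse=True)[:3])
```

Because such topological metrics survive when word identities are unknown, they can be applied to an undeciphered script like the Voynich Manuscript exactly as to a known language.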
Abstract:
This thesis deals with the development of a novel simulation technique for macromolecules in electrolyte solutions, with the aim of improving performance over current molecular-dynamics-based simulation methods. In solutions containing charged macromolecules and salt ions, the complex interplay of electrostatic interactions and hydrodynamics determines the equilibrium and non-equilibrium behavior. However, the treatment of the solvent and dissolved ions makes up the major part of the computational effort, so an efficient model of both components is essential for the performance of a method. In the novel method we treat the solvent in a coarse-grained fashion and replace the explicit-ion description with a dynamic mean-field treatment. We thus combine particle- and field-based descriptions in a hybrid method that effectively solves the electrokinetic equations. The developed algorithm is tested extensively in terms of accuracy and performance, and suitable parameter sets are determined. As a first application we study charged polymer solutions (polyelectrolytes) in shear flow, with a focus on their viscoelastic properties; here we also include semidilute solutions, which are computationally demanding. Second, we study electro-osmotic flow on superhydrophobic surfaces, where we perform a detailed comparison with theoretical predictions.
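The full hybrid algorithm is involved, but one mean-field electrostatic ingredient of such electrokinetic descriptions can be illustrated compactly: a finite-difference relaxation of the 1D Poisson-Boltzmann equation in reduced units, with an assumed surface potential, compared against its Debye-Hückel linearization. This is a sketch of the general idea, not the thesis's method.

```python
import numpy as np

# 1D Poisson-Boltzmann relaxation for the potential near a charged plate,
# in reduced units (phi in kT/e, lengths in Debye lengths)
L, n = 10.0, 400
x = np.linspace(0.0, L, n)
dx = x[1] - x[0]
phi = np.zeros(n)
phi[0] = 4.0                       # assumed surface potential; phi -> 0 far away

for _ in range(20000):             # Jacobi-style relaxation of phi'' = sinh(phi)
    phi[1:-1] = 0.5 * (phi[2:] + phi[:-2] - dx**2 * np.sinh(phi[1:-1]))

# the Debye-Hückel (linearized) solution overestimates the far field
# when the surface potential is large
mid = n // 2
print("nonlinear PB:", phi[mid], " linearized DH:", phi[0] * np.exp(-x[mid]))
```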
Abstract:
Abelian and non-Abelian gauge theories are of central importance in many areas of physics. In condensed matter physics, Abelian U(1) lattice gauge theories arise in the description of certain quantum spin liquids. In quantum information theory, Kitaev’s toric code is a Z(2) lattice gauge theory. In particle physics, Quantum Chromodynamics (QCD), the non-Abelian SU(3) gauge theory of the strong interactions between quarks and gluons, is nonperturbatively regularized on a lattice. Quantum link models extend the concept of lattice gauge theories beyond the Wilson formulation, and are well suited for both digital and analog quantum simulation using ultracold atomic gases in optical lattices. Since quantum simulators do not suffer from the notorious sign problem, they open the door to studies of the real-time evolution of strongly coupled quantum systems, which are impossible with classical simulation methods. A plethora of interesting lattice gauge theories suggests itself for quantum simulation, which should allow us to address very challenging problems, ranging from confinement and deconfinement, or chiral symmetry breaking and its restoration at finite baryon density, to color superconductivity and the real-time evolution of heavy-ion collisions, first in simpler model gauge theories and ultimately in QCD.
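As a small concrete illustration of the Z(2) case, the sketch below builds one star and one plaquette operator of a 2x2 toric code (with an assumed edge labelling) and checks numerically that they commute, which holds because they share exactly two edges.

```python
import numpy as np

# single-qubit Pauli matrices
I2 = np.eye(2)
X = np.array([[0, 1], [1, 0]])
Z = np.diag([1, -1])

def op_on(qubits, pauli, n):
    """Tensor product acting with `pauli` on the listed qubits of n qubits."""
    out = np.array([[1.0]])
    for q in range(n):
        out = np.kron(out, pauli if q in qubits else I2)
    return out

# 2x2 toric code: 8 edge qubits; one star (vertex) and one adjacent plaquette,
# under an assumed edge labelling
n = 8
star = op_on([0, 1, 4, 6], X, n)
plaq = op_on([0, 1, 5, 7], Z, n)

print(np.allclose(star @ plaq, plaq @ star))  # True: two shared edges commute
```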