74 results for Stochastic simulation methods

in the Biblioteca Digital da Produção Intelectual da Universidade de São Paulo (BDPI/USP)


Relevance:

90.00%

Abstract:

The Madden-Julian oscillation (MJO) is the most prominent form of tropical intraseasonal variability. This study investigated the following questions. Do inter-annual-to-decadal variations in tropical sea surface temperature (SST) lead to substantial changes in MJO activity? Was there a change in the MJO in the 1970s? Can this change be associated with SST anomalies? What was the level of MJO activity in the pre-reanalysis era? These questions were investigated with a stochastic model of the MJO. Reanalysis data (1948-2008) were used to develop a nine-state first-order Markov model capable of simulating the non-stationarity of the MJO. The model is driven by observed SST anomalies, and a large ensemble of simulations was performed to infer the activity of the MJO in the instrumental period (1880-2008). The model is capable of reproducing the activity of the MJO during the reanalysis period. The simulations indicate that the MJO exhibited a regime of near-normal activity in 1948-1972 (3.4 events per year) and two regimes of high activity in 1973-1989 (3.9 events) and 1990-2008 (4.6 events). Stochastic simulations indicate decadal shifts with near-normal levels in 1880-1895 (3.4 events), low activity in 1896-1917 (2.6 events) and a return to near-normal levels during 1918-1947 (3.3 events). The results also point to significant decadal changes in the probability of very active years (5 or more MJO events): 0.214 (1880-1895), 0.076 (1896-1917), 0.197 (1918-1947) and 0.193 (1948-1972). After a change in behavior in the 1970s, this probability increased to 0.329 (1973-1989) and 0.510 (1990-2008). The observational and stochastic simulations presented here call attention to the need to further understand the variability of the MJO on a wide range of time scales.
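
As a rough illustration of the modelling approach described in this abstract, the sketch below simulates a nine-state first-order Markov chain; the transition matrix and the event counting are placeholders, not the fitted, SST-conditioned probabilities of the paper.

```python
# Minimal sketch of a nine-state first-order Markov chain simulation, in the
# spirit of the MJO model described above. The transition matrix is an
# illustrative placeholder, not the fitted values of the paper.
import numpy as np

rng = np.random.default_rng(42)

n_states = 9
# Placeholder transition matrix: nearly persistent states with small
# probabilities of jumping to any other state (rows sum to 1).
P = np.full((n_states, n_states), 0.01)
np.fill_diagonal(P, 0.0)
P += np.diag(np.full(n_states, 1.0 - 0.01 * (n_states - 1)))

def simulate_chain(P, n_steps, start=0):
    """Simulate a first-order Markov chain with transition matrix P."""
    states = np.empty(n_steps, dtype=int)
    states[0] = start
    for t in range(1, n_steps):
        states[t] = rng.choice(len(P), p=P[states[t - 1]])
    return states

# One realization of roughly a year of daily states; in the paper the
# transition probabilities would additionally be conditioned on SST anomalies.
states = simulate_chain(P, n_steps=365)
print(np.bincount(states, minlength=n_states))   # occupancy of each state
```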

Relevance:

90.00%

Abstract:

The issue of smoothing in kriging has been addressed either by estimation or by simulation. The solution via estimation calls for postprocessing kriging estimates in order to correct the smoothing effect. Stochastic simulation provides equiprobable images presenting no smoothing and reproducing the covariance model. Consequently, these images reproduce both the sample histogram and the sample semivariogram. However, there is still a problem, which is the lack of local accuracy of simulated images. In this paper, a postprocessing algorithm for correcting the smoothing effect of ordinary kriging estimates is compared with sequential Gaussian simulation realizations. Based on samples drawn from exhaustive data sets, the postprocessing algorithm is shown to be superior to any individual simulation realization, yet at the expense of providing a single deterministic estimate of the random function.
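
For illustration only, the sketch below shows one common postprocessing idea for the smoothing effect, an affine correction that restores a target mean and variance to kriging estimates; the specific algorithm evaluated in the paper is not reproduced here.

```python
# Minimal sketch of an affine smoothing correction: rescale smoothed kriging
# estimates so that they match the sample mean and variance. This illustrates
# the concept only; it is not the paper's postprocessing algorithm.
import numpy as np

def affine_correction(kriged, sample_mean, sample_var):
    """Rescale smoothed estimates to the target mean and variance."""
    k_mean, k_var = kriged.mean(), kriged.var()
    if k_var == 0:
        return np.full_like(kriged, sample_mean)
    return sample_mean + (kriged - k_mean) * np.sqrt(sample_var / k_var)

# Example: kriging typically shrinks the variance towards the mean.
rng = np.random.default_rng(0)
truth = rng.normal(10.0, 4.0, size=1000)   # "exhaustive" reference values
kriged = 10.0 + 0.5 * (truth - 10.0)       # crude stand-in for smoothed estimates
corrected = affine_correction(kriged, truth.mean(), truth.var())
print(kriged.var(), corrected.var(), truth.var())
```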

Relevance:

80.00%

Abstract:

This paper presents a new hybrid method for risk assessment regarding interruptions of sensitive processes due to faults in electric power distribution systems. The method determines indices related to long-duration interruptions and short-duration voltage variations (SDVV), such as voltage sags and swells, for each customer supplied by the distribution network. The frequency of such occurrences and their impact on customer processes are determined for each bus and classified according to the corresponding magnitude and duration. The method is based on information regarding network configuration, system parameters and protective devices. It randomly generates a number of fault scenarios in order to assess risk areas regarding long-duration interruptions and voltage sags and swells, including the frequency of events according to their magnitude and duration. Based on sensitivity curves, the method determines frequency indices for disruptions in customer processes, representing equipment malfunction and possible process interruptions due to voltage sags and swells. Such an approach allows for the assessment of the annual costs associated with each of the evaluated power quality indices.
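
A minimal sketch of the Monte Carlo idea is given below: random fault scenarios are generated, each mapped to a sag magnitude and duration, and counted against a customer sensitivity curve. The sag model, fault statistics and sensitivity threshold are illustrative placeholders, not the method's actual system data.

```python
# Minimal sketch: generate random fault scenarios, derive a sag magnitude and
# duration per scenario, and count events that violate a simple sensitivity
# curve. All numbers below are placeholders for illustration only.
import numpy as np

rng = np.random.default_rng(1)

def simulate_sag_events(n_scenarios=10_000):
    trips = 0
    for _ in range(n_scenarios):
        # Placeholder sag model: residual voltage (p.u.) and duration (s).
        magnitude = rng.uniform(0.1, 1.0)
        duration = rng.exponential(0.1)
        # Placeholder sensitivity curve: equipment trips for deep, long sags.
        if magnitude < 0.7 and duration > 0.05:
            trips += 1
    return trips / n_scenarios

print(f"estimated probability of process disruption per fault: {simulate_sag_events():.3f}")
```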

Relevance:

80.00%

Abstract:

The ever-increasing robustness and reliability of flow-simulation methods have consolidated CFD as a major tool in virtually all branches of fluid mechanics. Traditionally, those methods have played a crucial role in the analysis of flow physics. In more recent years, though, the subject has broadened considerably, with the development of optimization and inverse design applications. Since then, the search for efficient ways to evaluate flow-sensitivity gradients has received the attention of numerous researchers. In this scenario, the adjoint method has emerged as, quite possibly, the most powerful tool for the job, which heightens the need for a clear understanding of its conceptual basis. Yet, some of its underlying aspects are still subject to debate in the literature, despite all the research that has been carried out on the method. Such is the case with the adjoint boundary and internal conditions, in particular. The present work aims to shed more light on that topic, with emphasis on the need for an internal shock condition. By following the path of previous authors, the quasi-1D Euler problem is used as a vehicle to explore those concepts. The results clearly indicate that the behavior of the adjoint solution through a shock wave ultimately depends upon the nature of the objective functional.
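
For reference, the block below states the generic discrete adjoint formulation alluded to in this abstract; the quasi-1D Euler derivation and the internal shock condition discussed in the paper are not reproduced.

```latex
% Generic statement of the adjoint approach to flow-sensitivity gradients;
% the quasi-1D Euler derivation and the shock condition of the paper are
% not reproduced here.
\begin{align}
  &\text{minimize } J(U,\alpha) \quad \text{subject to the flow residual } R(U,\alpha) = 0, \\
  &\mathcal{L}(U,\alpha,\psi) = J(U,\alpha) + \psi^{\mathsf{T}} R(U,\alpha), \\
  &\left(\frac{\partial R}{\partial U}\right)^{\!\mathsf{T}} \psi
     = -\left(\frac{\partial J}{\partial U}\right)^{\!\mathsf{T}}
     \quad \text{(adjoint equation),} \\
  &\frac{\mathrm{d}J}{\mathrm{d}\alpha}
     = \frac{\partial J}{\partial \alpha}
     + \psi^{\mathsf{T}} \frac{\partial R}{\partial \alpha}
     \quad \text{(one adjoint solve gives the whole gradient).}
\end{align}
```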

Relevance:

30.00%

Abstract:

The interest in using titanium to fabricate removable partial denture (RPD) frameworks has increased, but there are few studies evaluating the effects of casting methods on clasp behavior. OBJECTIVE: This study compared the occurrence of porosities and the retentive force of commercially pure titanium (CP Ti) and cobalt-chromium (Co-Cr) removable partial denture circumferential clasps cast by induction/centrifugation and plasma/vacuum-pressure. MATERIAL AND METHODS: 72 frameworks were cast from CP Ti (n=36) and Co-Cr alloy (n=36; control group). For each material, 18 frameworks were cast by electromagnetic induction and injected by centrifugation, whereas the other 18 were cast by plasma and injected by vacuum-pressure. For each casting method, three subgroups (n=6) were formed: 0.25 mm, 0.50 mm, and 0.75 mm undercuts. The specimens were radiographed and subjected to an insertion/removal test simulating 5 years of framework use. Data were analyzed by ANOVA and Tukey's test to compare materials and casting methods (α=0.05). RESULTS: Three of the 18 specimens cast by induction/centrifugation and 9 of the 18 cast by plasma/vacuum-pressure presented porosities, but only 1 and 7 specimens, respectively, were rejected for the simulation test. For the Co-Cr alloy, no defects were found. Comparing the casting methods, statistically significant differences (p<0.05) were observed only for the Co-Cr alloy with 0.25 mm and 0.50 mm undercuts. Significant differences between materials were found for the 0.25 mm and 0.75 mm undercuts. For the 0.50 mm undercut, significant differences were found when the materials were induction cast. CONCLUSION: Although both casting methods produced satisfactory CP Ti RPD frameworks, the occurrence of porosities was greater with the plasma/vacuum-pressure method than with induction/centrifugation, the latter resulting in higher clasp rigidity and hence higher retention force values.

Relevance:

30.00%

Abstract:

Our purpose is to analyze the effect of explicit diffusion processes in a predator-prey stochastic lattice model. More precisely, we wish to investigate the possible effects of diffusion on the thresholds of coexistence of species, i.e., possible changes in the transition between the active state and the absorbing state devoid of predators. To accomplish this task we have performed time-dependent simulations and dynamic mean-field approximations. Our results indicate that the diffusive process can enhance species coexistence.
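
The sketch below is a toy version of such a model: a predator-prey stochastic lattice with an explicit diffusion (site-exchange) step. The update rules and rates are illustrative placeholders, not the paper's exact dynamics.

```python
# Minimal sketch of a predator-prey stochastic lattice model with an explicit
# diffusion step. Rules and rates are placeholders for illustration only.
import numpy as np

EMPTY, PREY, PRED = 0, 1, 2
rng = np.random.default_rng(7)

def step(lattice, birth=0.5, predation=0.5, death=0.1, diffusion=0.3):
    L = lattice.shape[0]
    moves = [(1, 0), (-1, 0), (0, 1), (0, -1)]
    for _ in range(L * L):
        x, y = rng.integers(L, size=2)
        dx, dy = moves[rng.integers(4)]
        nx, ny = (x + dx) % L, (y + dy) % L
        if rng.random() < diffusion:                       # explicit diffusion: swap sites
            lattice[x, y], lattice[nx, ny] = lattice[nx, ny], lattice[x, y]
            continue
        s, n = lattice[x, y], lattice[nx, ny]
        if s == PREY and n == EMPTY and rng.random() < birth:
            lattice[nx, ny] = PREY                         # prey reproduction
        elif s == PRED and n == PREY and rng.random() < predation:
            lattice[nx, ny] = PRED                         # predation + predator birth
        elif s == PRED and rng.random() < death:
            lattice[x, y] = EMPTY                          # predator death
    return lattice

lattice = rng.integers(0, 3, size=(64, 64))
for t in range(200):
    step(lattice)
print("prey:", np.sum(lattice == PREY), "predators:", np.sum(lattice == PRED))
```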

Relevance:

30.00%

Abstract:

Aims. We create a catalogue of simulated fossil groups and study their properties, in particular the merging histories of their first-ranked galaxies. We compare the simulated fossil group properties with those of both simulated non-fossil and observed fossil groups. Methods. Using simulations and a mock galaxy catalogue, we searched for massive (> 5 × 10^13 h^-1 M_⊙) fossil groups in the Millennium Simulation Galaxy Catalogue. In addition, we attempted to identify observed fossil groups in the Sloan Digital Sky Survey Data Release 6 using identical selection criteria. Results. Our predictions on the basis of the simulation data are: (a) fossil groups comprise about 5.5% of the total population of groups/clusters with masses larger than 5 × 10^13 h^-1 M_⊙. This fraction is consistent with the fraction of fossil groups identified in the SDSS, after all observational biases have been taken into account; (b) about 88% of the dominant central objects in fossil groups are elliptical galaxies with a median R-band absolute magnitude of about -23.5 - 5 log h, which is typical of the observed fossil groups known in the literature; (c) first-ranked galaxies of systems with M > 5 × 10^13 h^-1 M_⊙, whether fossil or non-fossil, are mainly formed by gas-poor mergers; (d) although fossil groups, in general, assembled most of their virial masses at higher redshifts than non-fossil groups, first-ranked galaxies in fossil groups merged later, i.e. at lower redshifts, than their non-fossil-group counterparts. Conclusions. We therefore expect to observe a number of luminous galaxies in the centres of fossil groups that show signs of a recent major merger.
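
For illustration, the sketch below applies the fossil-group selection criterion summarized in this abstract (halo mass above 5 × 10^13 h^-1 M_⊙ and a two-magnitude gap between the two brightest galaxies within half the virial radius); the field names of the group record are hypothetical.

```python
# Minimal sketch of the fossil-group selection criterion described above.
# Field names of the group record are hypothetical; real catalogues use
# their own column names.
def is_fossil_group(group, min_mass=5e13, mag_gap=2.0):
    """group: dict with 'mass' (h^-1 Msun), 'r_vir', and member galaxies
    given as (distance_to_centre, R_band_abs_mag) tuples."""
    if group["mass"] <= min_mass:
        return False
    inner = [mag for r, mag in group["members"] if r <= 0.5 * group["r_vir"]]
    if len(inner) < 2:
        return False
    inner.sort()                      # brightest = most negative magnitude
    return (inner[1] - inner[0]) >= mag_gap

example = {
    "mass": 8e13,
    "r_vir": 1.0,
    "members": [(0.05, -23.6), (0.3, -21.4), (0.4, -21.2), (0.9, -22.5)],
}
print(is_fossil_group(example))   # True: gap of 2.2 mag inside 0.5 r_vir
```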

Relevance:

30.00%

Abstract:

Context. Fossil systems are defined as X-ray bright galaxy groups (or clusters) with a two-magnitude difference between their two brightest galaxies within half the projected virial radius, and represent an interesting extreme of the population of galaxy agglomerations. However, the physical conditions and processes leading to their formation are still poorly constrained. Aims. We compare the outskirts of fossil systems with those of normal groups to understand whether environmental conditions play a significant role in their formation. We study groups of galaxies in both numerical simulations and observations. Methods. We use a variety of statistical tools, including the spatial cross-correlation function and the local density parameter Δ_5, to probe differences in the density and structure of the environments of "normal" and "fossil" systems in the Millennium simulation. Results. We find that the number density of galaxies surrounding fossil systems evolves from greater than that observed around normal systems at z = 0.69 to lower than that of normal systems by z = 0. Both fossil and normal systems exhibit an increment in their otherwise radially declining local density measure (Δ_5) at distances of order 2.5 r_vir from the system centre. We show that this increment is more noticeable for fossil systems than for normal systems and demonstrate that this difference is linked to the earlier formation epoch of fossil groups. Despite the importance of the assembly time, we show that the environment differs between fossil and non-fossil systems of similar masses and formation times throughout their evolution. We also confirm that the physical characteristics identified in the Millennium simulation can be detected in SDSS observations. Conclusions. Our results confirm the commonly held belief that fossil systems assembled earlier than normal systems, but also show that the surroundings of fossil groups could be responsible for the formation of their large magnitude gap.
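
As an illustration, the sketch below computes a generic fifth-nearest-neighbour local density of the kind denoted Δ_5 above; the exact definition used in the paper (projected vs. three-dimensional, normalisation by a mean density) may differ.

```python
# Minimal sketch of a fifth-nearest-neighbour local density estimate.
# This is the generic idea, not necessarily the paper's exact definition.
import numpy as np

def delta5(positions, targets, k=5):
    """Local density ~ k / (4/3 pi d_k^3), with d_k the distance to the
    k-th nearest neighbour."""
    positions = np.asarray(positions)
    densities = []
    for p in np.asarray(targets):
        d = np.sort(np.linalg.norm(positions - p, axis=1))
        d_k = d[k]        # d[0] is the point itself when it belongs to `positions`
        densities.append(k / (4.0 / 3.0 * np.pi * d_k**3))
    return np.array(densities)

rng = np.random.default_rng(3)
gals = rng.uniform(0, 100, size=(5000, 3))   # toy galaxy positions (Mpc)
print(delta5(gals, gals[:3]))
```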

Relevance:

30.00%

Abstract:

We present four estimators of the shared information (or interdependency) in ground states, given that the coefficients appearing in the wave function are all real non-negative numbers and can therefore be interpreted as probabilities of configurations. Such ground states of Hermitian and non-Hermitian Hamiltonians can be given, for example, by superpositions of valence bond states, which can describe equilibrium but also stationary states of stochastic models. We consider the latter case in detail, the system being classical rather than quantum. Using analytical and numerical methods, we compare the values of the estimators in the directed polymer and the raise and peel models, which have massive, conformally invariant and non-conformally-invariant massless phases. We show that, as in the quantum problem, the estimators verify the area law with logarithmic corrections when phase transitions take place.

Relevance:

30.00%

Abstract:

In this paper, the Galerkin method and the Askey-Wiener scheme are used to obtain approximate solutions to the stochastic displacement response of Kirchhoff plates with uncertain parameters. Theoretical and numerical results are presented. The Lax-Milgram lemma is used to express the conditions for existence and uniqueness of the solution. Uncertainties in plate and foundation stiffness are modeled respecting these conditions, hence using Legendre polynomials indexed in uniform random variables. The space of approximate solutions is built using density results relating the space of continuous functions and Sobolev spaces. Approximate Galerkin solutions are compared with results of Monte Carlo simulation in terms of first- and second-order moments and of histograms of the displacement response. Numerical results for two example problems show very fast convergence to the exact solution, with excellent accuracy. The Askey-Wiener Galerkin scheme developed herein is able to reproduce the histogram of the displacement response, and is shown to be a theoretically sound and efficient method for the solution of stochastic problems in engineering. (C) 2009 Elsevier Ltd. All rights reserved.
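
As a scalar illustration of the Askey-Wiener (Legendre polynomials in uniform random variables) representation used here, the sketch below expands a response with an uncertain stiffness in Legendre polynomials and checks its first two moments against Monte Carlo; the actual paper solves the Kirchhoff plate problem by a Galerkin projection, which is not reproduced.

```python
# Scalar analogue: u(xi) = F / k(xi), uncertain stiffness k(xi) = k0*(1 + a*xi),
# xi ~ U(-1, 1). Coefficients of the Legendre expansion are obtained here by
# simple quadrature so that the moments can be checked against Monte Carlo.
import numpy as np
from numpy.polynomial.legendre import legval, leggauss

k0, a, F = 1.0, 0.3, 1.0
u = lambda xi: F / (k0 * (1.0 + a * xi))

# Legendre expansion u(xi) ~ sum_n c_n P_n(xi) via Gauss-Legendre quadrature.
order, nq = 8, 32
nodes, weights = leggauss(nq)
coeffs = np.array([
    (2 * n + 1) / 2.0 * np.sum(weights * u(nodes) * legval(nodes, np.eye(order + 1)[n]))
    for n in range(order + 1)
])

mean_pc = coeffs[0]                                            # E[P_0] = 1
var_pc = np.sum(coeffs[1:] ** 2 / (2 * np.arange(1, order + 1) + 1))

# Monte Carlo reference.
rng = np.random.default_rng(0)
samples = u(rng.uniform(-1, 1, size=200_000))
print(mean_pc, samples.mean())
print(var_pc, samples.var())
```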

Relevance:

30.00%

Abstract:

This paper presents an accurate and efficient solution for the random transverse and angular displacement fields of uncertain Timoshenko beams. Approximate numerical solutions are obtained using the Galerkin method and chaos polynomials. The Chaos-Galerkin scheme is constructed by respecting the theoretical conditions for existence and uniqueness of the solution. Numerical results show fast convergence to the exact solution, with excellent accuracy. The developed Chaos-Galerkin scheme accurately approximates the complete cumulative distribution function of the displacement responses and is a theoretically sound and efficient method for the solution of stochastic problems in engineering. (C) 2011 Elsevier Ltd. All rights reserved.

Relevance:

30.00%

Abstract:

In this paper, the Askey-Wiener scheme and the Galerkin method are used to obtain approximate solutions to stochastic beam bending on a Winkler foundation. The study addresses Euler-Bernoulli beams with uncertainty in the bending stiffness modulus and in the stiffness of the foundation. Uncertainties are represented by parameterized stochastic processes, and the random behavior of the beam response is modeled using the Askey-Wiener scheme. One contribution of the paper is a sketch of the proof of existence and uniqueness of the solution to problems involving fourth-order operators applied to random fields. From the approximate Galerkin solution, the expected value and variance of the beam displacement response are derived and compared with corresponding estimates obtained via Monte Carlo simulation. Results show very fast convergence and excellent accuracy in comparison with Monte Carlo simulation. The Askey-Wiener Galerkin scheme presented herein is shown to be a theoretically sound and numerically efficient method for the solution of stochastic problems in engineering.

Relevance:

30.00%

Abstract:

In the last decades, the air traffic system has been changing to adapt itself to new social demands, mainly the safe growth of worldwide traffic capacity. Those changes are ruled by the Communication, Navigation, Surveillance/Air Traffic Management (CNS/ATM) paradigm, based on digital communication technologies (mainly satellites) as a way of improving communication, surveillance, navigation and air traffic management services. However, CNS/ATM poses new challenges and needs, mainly related to the safety assessment process. In face of these new challenges, and considering the main characteristics of CNS/ATM, a methodology is proposed in this work that combines the "absolute" and "relative" safety assessment methods adopted by the International Civil Aviation Organization (ICAO) in ICAO Doc. 9689 [14], using Fluid Stochastic Petri Nets (FSPN) as the modeling formalism, and compares the safety metrics estimated from the simulation of both the proposed (in analysis) and the legacy system models. To demonstrate its usefulness, the proposed methodology was applied to the Automatic Dependent Surveillance-Broadcast (ADS-B) based air traffic control system. In conclusion, the proposed methodology proved able to assess CNS/ATM system safety properties, with the FSPN formalism providing important modeling capabilities and discrete event simulation allowing the estimation of the desired safety metrics. (C) 2011 Elsevier Ltd. All rights reserved.
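
For illustration only, the sketch below estimates a simple safety metric by discrete-event simulation of a small stochastic Petri net; the places, transitions and rates are hypothetical, and the fluid part of the FSPN formalism is not modelled.

```python
# Minimal sketch: estimate a safety metric by discrete-event (Gillespie-style)
# simulation of a purely stochastic Petri net. Structure and rates are
# hypothetical; the paper's FSPN models of the ADS-B case study are far richer.
import random

random.seed(0)

# Transitions: (preconditions, postconditions, firing rate per unit time).
transitions = [
    ({"ok": 1}, {"degraded": 1}, 1e-3),      # surveillance degradation
    ({"degraded": 1}, {"ok": 1}, 1e-1),      # recovery
    ({"degraded": 1}, {"unsafe": 1}, 1e-4),  # degraded state escalates
]

def simulate(horizon=1e4):
    """Return True if the 'unsafe' place is ever marked before the horizon."""
    marking, t = {"ok": 1, "degraded": 0, "unsafe": 0}, 0.0
    while t < horizon:
        enabled = [(pre, post, r) for pre, post, r in transitions
                   if all(marking[p] >= n for p, n in pre.items())]
        if not enabled:
            return False
        total = sum(r for *_, r in enabled)
        t += random.expovariate(total)                 # time to next event
        u, acc = random.uniform(0, total), 0.0
        for pre, post, r in enabled:                   # pick which transition fires
            acc += r
            if u <= acc:
                for p, n in pre.items():
                    marking[p] -= n
                for p, n in post.items():
                    marking[p] += n
                break
        if marking["unsafe"]:
            return True
    return False

runs = 2000
print(sum(simulate() for _ in range(runs)) / runs)     # estimated P(unsafe within horizon)
```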

Relevance:

30.00%

Abstract:

The motivation for this research is to compare dynamic results of a free railway wheelset derailment against safety limits. For this purpose, a numerical simulation of a wheelset derailment under increasing lateral force is compared with the safety limit using different criteria. A simplified wheelset model is used to simulate derailments under different adhesion conditions. The contact force components, including the longitudinal and spin effects, are identified in a steady-state condition on the verge of derailment. The contact force ratios are used in a three-dimensional (3D) analytical formula to calculate the safety limits. Simulation results obtained with two contact methods were compared with published results, and the safety limit was identified with the two criteria. Results confirm the conservative nature of Nadal's criterion and show that the 3D analytical safety formula yields slightly higher safety limits for lower friction coefficients and lower limits for high friction, in comparison with the simulation results obtained with Fastsim.
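
For reference, the sketch below evaluates Nadal's single-wheel limit, the conservative criterion mentioned in this abstract; the 3D analytical formula used in the paper, which also accounts for longitudinal and spin creep forces, is not reproduced.

```python
# Nadal's single-wheel derailment criterion: limiting lateral-to-vertical
# force ratio L/V for flange climb, given the flange contact angle and the
# wheel-rail friction coefficient.
import math

def nadal_limit(flange_angle_deg, mu):
    """(L/V)_limit = (tan(delta) - mu) / (1 + mu * tan(delta))."""
    tan_d = math.tan(math.radians(flange_angle_deg))
    return (tan_d - mu) / (1.0 + mu * tan_d)

# Typical values: 68 deg flange angle, friction coefficient 0.3 -> L/V ~ 1.25.
print(round(nadal_limit(68.0, 0.3), 2))
```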

Relevance:

30.00%

Abstract:

We examine the representation of judgements of stochastic independence in probabilistic logics. We focus on a relational logic where (i) judgements of stochastic independence are encoded by directed acyclic graphs, and (ii) probabilistic assessments are flexible in the sense that they are not required to specify a single probability measure. We discuss issues of knowledge representation and inference that arise from our particular combination of graphs, stochastic independence, logical formulas and probabilistic assessments. (C) 2007 Elsevier B.V. All rights reserved.
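
As a minimal illustration of how a directed acyclic graph encodes independence judgements, the sketch below factorises a joint probability into local conditionals (the Markov condition); the relational logic and set-valued (non-unique) probability assessments considered in the paper are not modelled.

```python
# Minimal sketch: a DAG encodes the judgement that each variable is independent
# of its non-descendants given its parents, so a joint assessment factorises
# into local conditionals. The three-variable network below (A -> B, A -> C,
# with B and C independent given A) and its numbers are hypothetical.
parents = {"A": [], "B": ["A"], "C": ["A"]}
cpt = {
    "A": {(): 0.3},                      # P(A = 1)
    "B": {(0,): 0.2, (1,): 0.7},         # P(B = 1 | A)
    "C": {(0,): 0.5, (1,): 0.9},         # P(C = 1 | A)
}

def joint(assignment):
    """P(assignment) = prod_v P(v | parents(v)), the Markov factorisation."""
    p = 1.0
    for v, val in assignment.items():
        key = tuple(assignment[u] for u in parents[v])
        p_v = cpt[v][key]
        p *= p_v if val == 1 else 1.0 - p_v
    return p

# P(A=1, B=1, C=0) = 0.3 * 0.7 * 0.1
print(joint({"A": 1, "B": 1, "C": 0}))
```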