996 results for Statistical Convergence


Relevance: 20.00%

Abstract:

The rapid evolution of nanotechnology calls for an understanding of the global response of nanoscale systems based on atomic interactions, and hence necessitates novel, sophisticated, and physically based approaches to bridge the gaps between different length and time scales. In this paper, we propose a group of statistical thermodynamics methods for the simulation of nanoscale systems under quasi-static loading at finite temperature: the molecular statistical thermodynamics (MST) method, the cluster statistical thermodynamics (CST) method, and the hybrid molecular/cluster statistical thermodynamics (HMCST) method. By treating atoms simultaneously as oscillators and as particles, and by grouping them into clusters, these methods span different spatial and temporal scales in a unified framework. One appealing feature of these methods is their "seamlessness": all regions, whether composed of atoms or of clusters, rest on the same underlying atomistic model, so ghost forces are avoided in the simulation. Moreover, compared with conventional MD simulations, their high computational efficiency is very attractive, as demonstrated by simulations of uniaxial compression and nanoindentation. (C) 2008 Elsevier Ltd. All rights reserved.
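As a rough illustration of the oscillator treatment described above, the sketch below (an assumption-laden toy, not the authors' code) relaxes a small Lennard-Jones chain by minimizing a classical local-harmonic free energy, F = U + kB T Σ_i ln(ħω_i / kB T), instead of integrating equations of motion; the 1D chain, the finite-difference Hessian, and all parameter values are illustrative.

```python
# Minimal sketch of quasi-static, finite-temperature relaxation in the spirit
# of "atoms as oscillators": minimize a local-harmonic free energy, not U alone.
import numpy as np
from scipy.optimize import minimize

KB, HBAR, MASS = 1.0, 1.0, 1.0       # reduced units (illustrative only)

def lj_energy(x):
    """Total Lennard-Jones energy of a 1D chain with positions x (epsilon = sigma = 1)."""
    r = x[1:] - x[:-1]
    return np.sum(4.0 * (r**-12 - r**-6))

def local_stiffness(x, h=1e-4):
    """Finite-difference diagonal of the Hessian: each atom's local 'oscillator' stiffness."""
    k = np.zeros_like(x)
    for i in range(len(x)):
        e = np.zeros_like(x); e[i] = h
        k[i] = (lj_energy(x + e) - 2.0 * lj_energy(x) + lj_energy(x - e)) / h**2
    return np.maximum(k, 1e-8)       # clip so frequencies stay real

def free_energy(x, T=0.1):
    """Classical local-harmonic free energy: F = U + kB*T * sum_i ln(hbar*w_i / (kB*T))."""
    w = np.sqrt(local_stiffness(x) / MASS)
    return lj_energy(x) + KB * T * np.sum(np.log(HBAR * w / (KB * T)))

# Quasi-static loading: fix the ends of a 10-atom chain at 5% compression and
# relax the interior atoms by minimizing F rather than running dynamics.
x0 = np.arange(10) * 2.0**(1/6)      # near the LJ equilibrium spacing
left, right = x0[0], x0[-1] * 0.95
res = minimize(lambda xi: free_energy(np.concatenate(([left], xi, [right]))),
               x0[1:-1] * 0.95, method="Nelder-Mead",
               options={"maxiter": 20000, "xatol": 1e-8, "fatol": 1e-10})
print("relaxed free energy at T=0.1:", res.fun)
```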

Relevance: 20.00%

Abstract:

A relative displacement between the grid points of the optical field and those of the phase screens may occur when simulating light propagation through the turbulent atmosphere. This paper proposes a statistical interpolator to solve the problem. The interpolator is evaluated through the phase structure function and through numerical experiments of light propagation through atmospheric turbulence with and without adaptive optics (AO), and it is compared with the well-known linear interpolator under the same conditions. The phase structure function results show that the statistical interpolator is more accurate than the linear one, especially in the high-frequency region. More importantly, the long-exposure results of light propagation through the turbulent atmosphere with and without AO also show that the statistical interpolator is more accurate and reliable than the linear one. (C) 2009 Optical Society of America.
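A minimal sketch of the kind of evaluation described above, under stated assumptions: a power-law random screen stands in for the turbulent phase screen, an exact FFT shift provides ground truth for a sub-pixel grid offset, and a baseline linear interpolator (np.interp) is scored by its phase structure function error. The paper's statistical interpolator is not reproduced here; it would be a drop-in replacement for the resampling step.

```python
import numpy as np

rng = np.random.default_rng(0)
N, dx = 1024, 1.0
f = np.fft.fftfreq(N, dx)
fs = f.copy(); fs[0] = fs[1]          # regularized copy for the power-law spectrum
spec = np.abs(fs)**(-11/12) * np.exp(2j * np.pi * rng.random(N))
screen = np.fft.ifft(spec).real       # stand-in "phase screen" with power-law statistics
screen /= screen.std()

shift = 0.37 * dx                     # sub-pixel mismatch between field and screen grids
exact = np.fft.ifft(np.fft.fft(screen) * np.exp(2j * np.pi * f * shift)).real
x = np.arange(N) * dx
linear = np.interp(x + shift, np.append(x, N * dx), np.append(screen, screen[0]))  # periodic

def D(p, max_lag=32):
    """Phase structure function D(r) = <[p(x+r) - p(x)]^2> at integer lags (periodic)."""
    return np.array([np.mean((np.roll(p, -k) - p)**2) for k in range(1, max_lag)])

err = np.abs(D(linear) - D(exact)) / D(exact)
print("linear interpolator: max relative D(r) error =", err.max())
```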

Relevance: 20.00%

Abstract:

This paper considers a system of coupled oscillators and its time discretization with constant stepsize h. Under certain conditions, it is shown that the discrete systems have one-dimensional global attractors l_h that converge to l, the global attractor of the continuous system.
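The abstract does not specify the oscillator system, so the hypothetical example below uses two damped, coupled, periodically forced oscillators discretized with constant stepsize h; trajectories started from different initial conditions collapse onto the same orbit, the discrete analogue l_h of the continuous attractor l.

```python
import numpy as np

def step(state, t, h, k=1.0, c=0.5, eps=0.3):
    """One explicit-Euler step of two coupled, damped, periodically forced oscillators."""
    x1, v1, x2, v2 = state
    a1 = -k * x1 - c * v1 + eps * (x2 - x1) + np.cos(t)
    a2 = -k * x2 - c * v2 + eps * (x1 - x2)
    return state + h * np.array([v1, a1, v2, a2])

h, T = 0.01, 200.0                          # constant stepsize, long horizon
n = int(T / h)
finals = []
for seed in range(3):                       # several random initial conditions
    rng = np.random.default_rng(seed)
    s = rng.uniform(-2, 2, size=4)
    for i in range(n):
        s = step(s, i * h, h)
    finals.append(s)

# After transients die out, all runs land on (nearly) the same state at time T,
# consistent with a one-dimensional (periodic-orbit) attractor of the map:
for s in finals:
    print(np.round(s, 3))
```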

Relevance: 20.00%

Abstract:

Fracture owing to the coalescence of numerous microcracks can be described by a simple statistical model in which coalescence events occur stochastically as the number density of nucleated microcracks increases. Both numerical simulation and statistical analysis reveal that the microcrack coalescence process may display avalanche behavior and that the final failure is catastrophic. The cumulative distribution of coalescence events in the vicinity of critical fracture follows a power law, and the fracture profile has a self-affine fractal character. Based on the statistical analysis of coalescence events, some macromechanical quantities may be traced back to, and extracted from, the mesoscopic process.
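A toy version of such a model (an assumption, not the paper's formulation): microcracks nucleate at random sites of a 1D lattice and coalesce when they become adjacent, each merge being recorded as a coalescence event of the merged cluster size; the cumulative size distribution of events near final failure is then fit on log-log axes.

```python
import numpy as np

rng = np.random.default_rng(1)
N = 100_000
broken = np.zeros(N, dtype=bool)
size = {}                      # cluster root -> cluster length
parent = np.arange(N)          # union-find over lattice sites

def find(i):
    """Root of site i's cluster, with path halving."""
    while parent[i] != i:
        parent[i] = parent[parent[i]]
        i = parent[i]
    return i

events = []
for site in rng.permutation(N):            # nucleate microcracks one by one
    broken[site] = True
    s = 1
    for nb in (site - 1, site + 1):        # merge with any adjacent crack
        if 0 <= nb < N and broken[nb]:
            r = find(nb)
            s += size[r]
            parent[r] = site
    size[site] = s
    if s > 1:
        events.append(s)                   # a coalescence event of the merged size

late = np.array(events[-2000:])            # events in the vicinity of final failure
sizes = np.unique(late)
cumulative = np.array([(late >= s).sum() for s in sizes])
slope = np.polyfit(np.log(sizes), np.log(cumulative), 1)[0]
print("cumulative distribution exponent ~", round(slope, 2))
```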

Relevance: 20.00%

Abstract:

The stress release model, a stochastic version of elastic rebound theory, is applied to the large events from four synthetic earthquake catalogs generated by models with various levels of disorder in the distribution of fault zone strength (Ben-Zion, 1996). These include models with uniform properties (U), a Parkfield-type asperity (A), fractal brittle properties (F), and multi-size-scale heterogeneities (M). The results show that the degree of regularity or predictability in the assumed fault properties, based on both the Akaike information criterion and simulations, follows the order U, F, A, and M, in good agreement with the ordering obtained by pattern recognition techniques applied to the full set of synthetic data. Data simulated from the best-fitting stress release models reproduce, both visually and in distributional terms, the main features of the original catalogs. The differences in character and in quality of prediction between the four cases are shown to depend on two main aspects: the parameter controlling the sensitivity to departures from the mean stress level, and the frequency-magnitude distribution, which differs substantially between the four cases. In particular, it is shown that the predictability of the data is strongly affected by the form of the frequency-magnitude distribution, being greatly reduced if a pure Gutenberg-Richter form is assumed to hold out to high magnitudes.
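For concreteness, a hedged sketch of the standard stress release model follows, with illustrative (not fitted) parameters: stress X(t) builds linearly at rate rho, events occur with conditional intensity lambda(t) = exp(mu + nu*X(t)), where nu is the sensitivity parameter discussed above, and each event releases stress that grows with a Gutenberg-Richter magnitude.

```python
import numpy as np

rng = np.random.default_rng(42)
mu, nu, rho = -3.0, 2.0, 0.1         # illustrative parameters, not fitted values
b = 1.0                              # Gutenberg-Richter b-value for magnitudes

def waiting_time(X):
    """Exact inversion of the integrated hazard: solve
    integral_0^W exp(mu + nu*(X + rho*w)) dw = E for W, with E ~ Exp(1)."""
    E = rng.exponential()
    a = np.exp(mu + nu * X)
    return np.log(1.0 + E * nu * rho / a) / (nu * rho)

t, X, catalog = 0.0, 0.0, []
while t < 500.0:
    w = waiting_time(X)
    t += w
    X += rho * w                                        # stress accumulated while waiting
    m = 4.0 + rng.exponential(1.0 / (b * np.log(10)))   # G-R magnitudes above M4
    X -= 10 ** (0.75 * (m - 4.0)) * 0.05                # stress drop grows with magnitude
    catalog.append((t, m))

print(len(catalog), "events; largest magnitude:",
      round(max(m for _, m in catalog), 2))
```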

Relevance: 20.00%

Abstract:

The stress release model, a stochastic version of elastic rebound theory, is applied to historical earthquake data from three strong-earthquake-prone regions of China: the North China, Southwest China, and Taiwan seismic regions. The results show that seismicity along a plate boundary (Taiwan) is more active than in intraplate regions (North and Southwest China). The degree of predictability or regularity of seismic events in these regions, based on both the Akaike information criterion (AIC) and the fitted sensitivity parameters, follows the order Taiwan, Southwest China, and North China, an ordering further confirmed by numerical simulations. (C) 2004 Elsevier Ltd. All rights reserved.
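The ranking rests on the Akaike information criterion, AIC = 2k - 2 ln(L), compared across fitted models; the sketch below illustrates that comparison step with hypothetical placeholder log-likelihoods, not the paper's fitted values.

```python
def aic(k, loglik):
    """Akaike information criterion for a model with k parameters."""
    return 2 * k - 2 * loglik

# (parameters, log-likelihood) per region -- hypothetical placeholder values
stress_release = {"Taiwan": (4, -210.3), "Southwest China": (4, -185.9), "North China": (4, -170.2)}
poisson        = {"Taiwan": (1, -248.0), "Southwest China": (1, -193.5), "North China": (1, -172.0)}

for region in stress_release:
    gain = aic(*poisson[region]) - aic(*stress_release[region])
    print(f"{region}: AIC gain over a memoryless Poisson baseline = {gain:.1f}")
# A larger gain means the stress release model captures more regularity,
# i.e., the region's seismicity is more predictable under this criterion.
```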

Relevance: 20.00%

Abstract:

Optimal Bayesian multi-target filtering is, in general, computationally impractical owing to the high dimensionality of the multi-target state. The Probability Hypothesis Density (PHD) filter propagates the first moment of the multi-target posterior distribution. While this reduces the dimensionality of the problem, the PHD filter still involves intractable integrals in many cases of interest. Several authors have proposed Sequential Monte Carlo (SMC) implementations of the PHD filter. However, these implementations are the equivalent of the Bootstrap Particle Filter, which is well known to be inefficient. Drawing on ideas from the Auxiliary Particle Filter (APF), Whiteley et al. proposed an SMC implementation of the PHD filter that employs auxiliary variables to enhance its efficiency, supporting the claim with numerical examples for two scenarios, including a challenging nonlinear observation model. This paper studies the theoretical properties of that auxiliary particle implementation: $\mathbb{L}_p$ error bounds are established, from which almost sure convergence follows.
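To make the auxiliary-variable idea concrete, the sketch below applies it to a single-target nonlinear model rather than the multi-target PHD setting analyzed in the paper; the transition, observation map, and noise levels are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(7)
N, T = 500, 50
sig_x, sig_y = 1.0, 0.5

def f(x): return 0.9 * x + np.sin(x)         # nonlinear state transition (illustrative)
def h(x): return x**2 / 20.0                 # nonlinear observation map (illustrative)
def loglik(yt, x): return -0.5 * ((yt - h(x)) / sig_y) ** 2

# simulate a trajectory and noisy observations
x_true = np.zeros(T); y = np.zeros(T)
for t in range(1, T):
    x_true[t] = f(x_true[t-1]) + sig_x * rng.standard_normal()
    y[t] = h(x_true[t]) + sig_y * rng.standard_normal()

particles = rng.standard_normal(N)
w = np.full(N, 1.0 / N)
est = np.zeros(T)
for t in range(1, T):
    # first stage: weights that look ahead at y[t] via the transition mean
    lam = w * np.exp(loglik(y[t], f(particles)))
    lam /= lam.sum()
    idx = rng.choice(N, size=N, p=lam)                     # auxiliary indices
    # second stage: propagate chosen particles and correct the weights
    new = f(particles[idx]) + sig_x * rng.standard_normal(N)
    w = np.exp(loglik(y[t], new) - loglik(y[t], f(particles[idx])))
    w /= w.sum()
    particles = new
    est[t] = np.sum(w * particles)

print("RMSE:", np.sqrt(np.mean((est[1:] - x_true[1:]) ** 2)))
```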

Relevance: 20.00%

Abstract:

Simulated annealing is a popular method for approaching the solution of a global optimization problem. Existing results on its performance apply to discrete combinatorial optimization, where the optimization variables can assume only a finite set of possible values. We introduce a new general formulation of simulated annealing which allows one to guarantee finite-time performance in the optimization of functions of continuous variables. The results hold universally for any optimization problem on a bounded domain and establish a connection between simulated annealing and recent theory on the convergence of Markov chain Monte Carlo methods on continuous domains. This work is inspired by the concept of finite-time learning with known accuracy and confidence developed in statistical learning theory.
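A minimal sketch of simulated annealing on a bounded continuous domain follows; the test function, Gaussian proposal, and logarithmic-style cooling schedule are common illustrative choices, not the specific schedule for which the paper's finite-time guarantees are derived.

```python
import numpy as np

rng = np.random.default_rng(3)

def objective(x):
    """A multimodal test function on the unit square (illustrative)."""
    return np.sin(13 * x[0]) * np.sin(27 * x[0]) * np.cos(11 * x[1]) + x.sum()

def anneal(n_iter=20_000, d=2, step=0.1):
    x = rng.random(d)                       # start uniformly in the bounded domain [0,1]^d
    fx = objective(x)
    best, fbest = x.copy(), fx
    for k in range(1, n_iter + 1):
        T = 1.0 / np.log(k + 1)             # slow cooling (illustrative choice)
        prop = np.clip(x + step * rng.standard_normal(d), 0.0, 1.0)  # stay in the domain
        fp = objective(prop)
        if fp < fx or rng.random() < np.exp((fx - fp) / T):          # Metropolis rule
            x, fx = prop, fp
        if fx < fbest:
            best, fbest = x.copy(), fx
    return best, fbest

x_star, f_star = anneal()
print("approximate minimizer:", np.round(x_star, 3), "value:", round(f_star, 3))
```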