903 results for Geo-statistical model


Relevance:

30.00%

Publisher:

Abstract:

The rapid evolution of nanotechnology calls for an understanding of the global response of nanoscale systems based on atomic interactions, and hence necessitates novel, sophisticated, physically based approaches to bridge the gaps between various length and time scales. In this paper, we propose a group of statistical thermodynamics methods for the simulation of nanoscale systems under quasi-static loading at finite temperature: the molecular statistical thermodynamics (MST) method, the cluster statistical thermodynamics (CST) method, and the hybrid molecular/cluster statistical thermodynamics (HMCST) method. By treating atoms simultaneously as oscillators and as particles, and grouping them into clusters, these methods span different spatial and temporal scales in a unified framework. One appealing feature is their "seamlessness": all regions, whether composed of atoms or clusters, share the same underlying atomistic model, so the methods avoid ghost forces in the simulation. Moreover, compared with conventional MD simulations, their high computational efficiency is very attractive, as demonstrated by simulations of uniaxial compression and nanoindentation. (C) 2008 Elsevier Ltd. All rights reserved.

A general analytical model for a composite with an isotropic matrix and two populations of spherical inclusions is proposed. The method is based on the second-order moment of stress for evaluating the homogenised effective stress in the matrix and on the secant-moduli concept for plastic deformation. With Weibull's statistical law for the strength of SiCp particles, the model can quantitatively predict the influence of particle fracture on the mechanical properties of PMMCs. Application of the proposed model to particle clusters shows that clustering has a negligible influence on the stress-strain curves of the composite. (C) 1998 Elsevier Science B.V.
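The Weibull strength law invoked above can be sketched numerically. A minimal illustration, assuming the standard two-parameter form P_f(s) = 1 - exp(-(s/s0)^m); the scale s0 and modulus m below are hypothetical, not the paper's fitted values:

```python
import math

def weibull_fracture_probability(stress, sigma0, m):
    """Two-parameter Weibull fracture probability for a particle loaded
    to `stress`, with scale parameter sigma0 and Weibull modulus m."""
    return 1.0 - math.exp(-((stress / sigma0) ** m))

# Hypothetical parameters for illustration only (not from the paper).
sigma0, m = 1500.0, 5.0  # MPa, dimensionless
for s in (750.0, 1500.0, 2250.0):
    print(f"P_f({s:.0f} MPa) = {weibull_fracture_probability(s, sigma0, m):.3f}")
```

At the scale stress s = s0 the fracture probability is 1 - 1/e, regardless of m; larger m sharpens the transition around s0.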

The stress release model, a stochastic version of the elastic rebound theory, is applied to the large events from four synthetic earthquake catalogs generated by models with various levels of disorder in the distribution of fault zone strength (Ben-Zion, 1996). They include models with uniform properties (U), a Parkfield-type asperity (A), fractal brittle properties (F), and multi-size-scale heterogeneities (M). The results show that the degree of regularity or predictability in the assumed fault properties, based on both the Akaike information criterion and simulations, follows the order U, F, A, and M, in good agreement with that obtained by pattern recognition techniques applied to the full set of synthetic data. Data simulated from the best-fitting stress release models reproduce, both visually and in distributional terms, the main features of the original catalogs. The differences in character and in quality of prediction between the four cases are shown to depend on two main aspects: the parameter controlling the sensitivity to departures from the mean stress level, and the frequency-magnitude distribution, which differs substantially between the four cases. In particular, it is shown that the predictability of the data is strongly affected by the form of the frequency-magnitude distribution, being greatly reduced if a pure Gutenberg-Richter form is assumed to hold out to high magnitudes.
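The Akaike information criterion used to rank the fitted models is AIC = 2k - 2 ln L, where k is the number of fitted parameters and ln L the maximized log-likelihood. A minimal sketch of the ranking step, with hypothetical log-likelihoods (the paper's fitted values are not reproduced here):

```python
def aic(num_params, max_log_likelihood):
    """Akaike information criterion: lower values indicate a better
    trade-off between fit quality and model complexity."""
    return 2.0 * num_params - 2.0 * max_log_likelihood

# Hypothetical fits for illustration only: (catalog, k, max log-likelihood).
fits = [("U", 3, -120.4), ("A", 3, -131.9), ("F", 3, -125.2), ("M", 3, -140.7)]
ranked = sorted(fits, key=lambda f: aic(f[1], f[2]))
print([name for name, _, _ in ranked])  # most to least predictable
```

With equal parameter counts the ranking reduces to ordering by log-likelihood; AIC matters when candidate models differ in complexity.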

A generalized model for the effective thermal conductivity of porous media is derived from the fact that statistical self-similarity exists in porous media. The proposed model assumes that porous media consist of two portions: randomly distributed non-touching particles, and self-similarly distributed particles contacting each other with thermal resistance. The latter are simulated by Sierpinski carpets with side length L = 13 and cutout size C = 3, 5, 7, or 9, depending on the porosity concerned. Recursive formulae are presented and expressed as functions of porosity, ratio of areas, ratio of component thermal conductivities, and contact resistance; the model contains no empirical constant, and every parameter has a clear physical meaning. The model predictions are compared with existing experimental data, and good agreement is found over a wide porosity range of 0.14-0.80, which verifies the validity of the proposed model.
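The porosity of such a carpet follows from simple geometric bookkeeping, assuming the usual generalized-Sierpinski construction in which each stage removes a central C-by-C block from every remaining L-by-L solid cell (the recursive conductivity formulae themselves are not reproduced here):

```python
def carpet_porosity(L, C, stages):
    """Porosity of a generalized Sierpinski carpet after `stages`
    construction steps, where each stage removes a central C-by-C
    block from every remaining L-by-L solid cell."""
    solid_fraction_per_stage = (L * L - C * C) / (L * L)
    return 1.0 - solid_fraction_per_stage ** stages

# Side length L = 13 with the cutout sizes quoted in the abstract.
for C in (3, 5, 7, 9):
    print(f"C = {C}: stage-2 porosity = {carpet_porosity(13, C, 2):.3f}")
```

Varying C (and the stage number) sweeps the porosity, which is how a single carpet family can be matched to samples across a broad porosity range.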

An approximate fractal geometry model for the effective thermal conductivity of three-phase/unsaturated porous media is proposed, based on the thermal-electrical analogy technique and on the statistical self-similarity of porous media. The proposed thermal conductivity model is expressed as a function of porosity (related to stage n of the Sierpinski carpet), ratio of areas, ratio of component thermal conductivities, and saturation. The recursive algorithm for computing the thermal conductivity from the proposed model is presented and is quite simple. The model predictions are compared with existing measurements, and good agreement is found between the present model predictions and the experimental data, which verifies the validity of the proposed model. (C) 2004 American Institute of Physics.

The stress release model, a stochastic version of the elastic-rebound theory, is applied to historical earthquake data from three strong earthquake-prone regions of China: the North China, Southwest China, and Taiwan seismic regions. The results show that seismicity along a plate boundary (Taiwan) is more active than in intraplate regions (North and Southwest China). The degree of predictability or regularity of seismic events in these regions, based on both the Akaike information criterion (AIC) and the fitted sensitivity parameters, follows the order Taiwan, Southwest China, and North China, which is further confirmed by numerical simulations. (c) 2004 Elsevier Ltd. All rights reserved.

A model of dynamical processes with stochastic jumps has been put forward to study pattern evolution in damage-fracture. According to the final states of the evolution processes, the evolution modes can be classified as globally stable (GS) modes and evolution-induced catastrophic (EIC) modes; the latter are responsible for fracture. A statistical description is introduced in this paper to clarify the pattern evolution. It is shown that the appearance of fracture in disordered materials should be described by a probability distribution function.

In order to understand the mechanism of incipient spallation in rolled metals, a one-dimensional statistical model of the evolution of microcracks in spallation was proposed. The crack length appears to be the fundamental variable in the statistical description. Two dynamic processes, crack nucleation and crack growth, were involved in the model of damage evolution. A simplified case was examined, and a preliminary correlation with experimental observations of spallation was made.

The method of statistical mechanics is applied to the study of the one-dimensional model of turbulence proposed in an earlier paper. The closure problem is solved by the variational approach developed for the three-dimensional case, yielding two integral equations for two unknown functions. By solving these two integral equations, the Kolmogorov k^(-5/3) law is derived and the (one-dimensional) Kolmogorov constant Ko is evaluated, giving Ko = 0.55, in good agreement with the results of numerical experiments on one-dimensional turbulence.
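The scaling just described can be illustrated directly. A minimal sketch of the inertial-range spectrum E(k) = Ko * eps^(2/3) * k^(-5/3), using the Ko = 0.55 value quoted above and a hypothetical dissipation rate eps:

```python
def kolmogorov_spectrum(k, epsilon, Ko=0.55):
    """Inertial-range energy spectrum E(k) = Ko * eps^(2/3) * k^(-5/3)."""
    return Ko * epsilon ** (2.0 / 3.0) * k ** (-5.0 / 3.0)

epsilon = 1.0  # hypothetical dissipation rate, for illustration only
# Tripling the wavenumber octave count (k -> 8k) should reduce E(k)
# by the factor 8^(5/3) = 32, the signature of the -5/3 law.
print(kolmogorov_spectrum(1.0, epsilon) / kolmogorov_spectrum(8.0, epsilon))
```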

This paper analyzes the cyclical properties of a generalized version of the Uzawa-Lucas endogenous growth model. We study the dynamic features of different cyclical components of this model, characterized by a variety of decomposition methods. The decomposition methods considered fall into two groups. On the one hand, we consider three statistical filters: the Hodrick-Prescott filter, the Baxter-King filter, and the Gonzalo-Granger decomposition. On the other hand, we use four model-based decomposition methods. The latter procedures share the property that the cyclical components they produce preserve the log-linear approximation of the Euler-equation restrictions imposed by the agent's intertemporal optimization problem. The paper shows that both model dynamics and model performance vary substantially across decomposition methods. A parallel exercise is carried out with a standard real business cycle model. The results should help researchers better understand the performance of the Uzawa-Lucas model relative to standard business cycle models under alternative definitions of the business cycle.
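Of the statistical filters listed, the Hodrick-Prescott filter is the simplest to sketch. A minimal implementation of the standard penalized least-squares formulation (a generic illustration, not the authors' code): the trend solves (I + lam * D'D) trend = y, with D the second-difference operator.

```python
import numpy as np

def hp_filter(y, lam=1600.0):
    """Hodrick-Prescott filter: split series y into trend and cycle by
    solving (I + lam * D'D) trend = y, where D is the second-difference
    operator. lam = 1600 is the conventional quarterly-data choice."""
    y = np.asarray(y, dtype=float)
    n = len(y)
    # Second-difference matrix D of shape (n-2, n).
    D = np.zeros((n - 2, n))
    for i in range(n - 2):
        D[i, i:i + 3] = [1.0, -2.0, 1.0]
    trend = np.linalg.solve(np.eye(n) + lam * (D.T @ D), y)
    return trend, y - trend

# A purely linear series has zero second differences, so the HP trend
# reproduces it exactly and the cycle is (numerically) zero.
t = np.arange(20.0)
trend, cycle = hp_filter(2.0 + 0.5 * t)
```

The dense solve shown here is O(n^3); production implementations exploit the pentadiagonal structure of I + lam * D'D, but the decomposition itself is identical.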

We present a method of image-speckle contrast for the non-precalibrated measurement of the root-mean-square roughness and the lateral correlation length of random surfaces with Gaussian correlation. The theoretical analysis uses a simplified model of the speckle fields produced by a weakly scattering object. The explicit mathematical relation shows that the saturation value of the image-speckle contrast at a large aperture radius determines the roughness, while the variation of the contrast with the aperture radius determines the lateral correlation length. In the experiments, we fabricate random-surface samples with Gaussian correlation, measure the square of the image-speckle contrast versus the radius of the aperture in the 4f system, and extract the roughness and the lateral correlation length by fitting the theoretical result to the experimental data. Comparison with measurements by an atomic force microscope shows that our method has satisfactory accuracy. (C) 2002 Optical Society of America.

This thesis explores the problem of mobile robot navigation in dense human crowds. We begin by considering a fundamental impediment to classical motion planning algorithms called the freezing robot problem: once the environment surpasses a certain level of complexity, the planner decides that all forward paths are unsafe, and the robot freezes in place (or performs unnecessary maneuvers) to avoid collisions. Since a feasible path typically exists, this behavior is suboptimal. Existing approaches have focused on reducing predictive uncertainty by employing higher fidelity individual dynamics models or heuristically limiting the individual predictive covariance to prevent overcautious navigation. We demonstrate that both the individual prediction and the individual predictive uncertainty have little to do with this undesirable navigation behavior. Additionally, we provide evidence that dynamic agents are able to navigate in dense crowds by engaging in joint collision avoidance, cooperatively making room to create feasible trajectories. We accordingly develop interacting Gaussian processes, a prediction density that captures cooperative collision avoidance, and a "multiple goal" extension that models the goal driven nature of human decision making. Navigation naturally emerges as a statistic of this distribution.

Most importantly, we empirically validate our models in the Chandler dining hall at Caltech during peak hours, and in the process carry out the first extensive quantitative study of robot navigation in dense human crowds (collecting data on 488 runs). The multiple-goal interacting Gaussian processes algorithm performs comparably with human teleoperators at crowd densities nearing 1 person/m², while a state-of-the-art noncooperative planner exhibits unsafe behavior more than 3 times as often as the multiple-goal extension, and twice as often as the basic interacting Gaussian process approach. Furthermore, a reactive planner based on the widely used dynamic window approach proves insufficient for crowd densities above 0.55 people/m². For inclusive validation purposes, we also show that either our noncooperative planner or our reactive planner captures the salient characteristics of nearly any existing dynamic navigation algorithm. Based on these experimental results and theoretical observations, we conclude that a cooperation model is critical for safe and efficient robot navigation in dense human crowds.

Finally, we produce a large database of ground truth pedestrian crowd data. We make this ground truth database publicly available for further scientific study of crowd prediction models, learning from demonstration algorithms, and human robot interaction models in general.

In the measurement of the Higgs boson decaying into two photons, the parametrization of an appropriate background model is essential for fitting the Higgs signal mass peak over a continuous background. This diphoton background modeling is crucial in the statistical process of calculating exclusion limits and the significance of observations relative to a background-only hypothesis. It is therefore desirable to know the physical shape of the background mass distribution, as use of an improper function can bias the observed limits. Using an information-theoretic (I-T) approach to valid inference, we apply the Akaike Information Criterion (AIC) as a measure of a fitted model's separation from the data. We then implement a multi-model inference ranking method to build a fit model that best represents the Standard Model background in 2013 diphoton data recorded by the Compact Muon Solenoid (CMS) experiment at the Large Hadron Collider (LHC). Potential applications and extensions of this model-selection technique are discussed with reference to CMS detector performance measurements as well as potential physics analyses at future detectors.
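The multi-model ranking step described above is commonly done with AIC differences and Akaike weights, w_i = exp(-(AIC_i - AIC_min)/2) / sum_j exp(-(AIC_j - AIC_min)/2). A minimal sketch with hypothetical AIC values for a few candidate background shapes (the analysis's actual fitted values are not reproduced here):

```python
import math

def akaike_weights(aic_values):
    """Convert AIC scores into Akaike weights: the relative likelihood
    of each candidate model, normalized to sum to one."""
    best = min(aic_values)
    rel = [math.exp(-(a - best) / 2.0) for a in aic_values]
    total = sum(rel)
    return [r / total for r in rel]

# Hypothetical AIC scores for candidate background fit functions.
candidates = {"power law": 1012.3, "exponential": 1009.8, "Bernstein poly": 1010.5}
weights = akaike_weights(list(candidates.values()))
for name, w in zip(candidates, weights):
    print(f"{name}: weight = {w:.3f}")
```

The weights support either picking the single best-supported shape or averaging over the candidate set, which is the essence of multi-model inference.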