1000 results for ASEP models
Abstract:
Approximate Bayesian computation (ABC) is a popular technique for analysing data for complex models where the likelihood function is intractable. It involves using simulation from the model to approximate the likelihood, with this approximate likelihood then being used to construct an approximate posterior. In this paper, we consider methods that estimate the parameters by maximizing the approximate likelihood used in ABC. We give a theoretical analysis of the asymptotic properties of the resulting estimator. In particular, we derive results analogous to those of consistency and asymptotic normality for standard maximum likelihood estimation. We also discuss how sequential Monte Carlo methods provide a natural method for implementing our likelihood-based ABC procedures.
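The likelihood-maximisation idea above can be illustrated with a minimal sketch. The toy model (normal data summarised by a sample mean), the Gaussian kernel, the tolerance `eps`, and the grid search are all illustrative assumptions, not the paper's actual procedure:

```python
import numpy as np

rng = np.random.default_rng(0)

def abc_log_likelihood(theta, s_obs, n_sim=500, eps=0.3):
    """Monte Carlo estimate of the ABC log-likelihood at theta.

    Toy model: data are N(theta, 1); the summary statistic is the
    sample mean of 20 draws.  The intractable likelihood is replaced
    by the average of a Gaussian kernel applied to the distance
    between simulated and observed summaries.
    """
    s_sim = rng.normal(theta, 1.0, size=(n_sim, 20)).mean(axis=1)
    kernel = np.exp(-0.5 * ((s_sim - s_obs) / eps) ** 2)
    return np.log(kernel.mean() + 1e-300)

# Observed summary generated with "true" theta = 1.0.
s_obs = rng.normal(1.0, 1.0, size=20).mean()

# Estimate theta by maximising the approximate likelihood over a grid.
grid = np.linspace(-2.0, 4.0, 121)
theta_hat = grid[np.argmax([abc_log_likelihood(t, s_obs) for t in grid])]
```

A real implementation would replace the grid search with the sequential Monte Carlo machinery the paper discusses; the grid merely makes the "maximise the ABC likelihood" step explicit.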
Abstract:
Based on the scaling criteria for polymer flooding reservoirs obtained in our previous work, in which gravity and capillary forces, compressibility, non-Newtonian behavior, adsorption, dispersion, and diffusion are considered, eight partial similarity models are designed. A new numerical approach to sensitivity analysis is suggested to quantify the dominance degree of the relaxed dimensionless parameters in a partial similarity model, and a sensitivity factor quantifying this dominance degree is defined. By solving the dimensionless governing equations including all dimensionless parameters, the sensitivity factor of each relaxed dimensionless parameter is calculated for each partial similarity model; thus, the dominance degree of each relaxed parameter is quantitatively determined. Based on the sensitivity analysis, the effect coefficient of a partial similarity model is defined as the sum, over the relaxed dimensionless parameters, of the product of the sensitivity factor and the relative relaxation quantity. The effect coefficient is used as a criterion to evaluate each partial similarity model; the model with the smallest effect coefficient can then be singled out as the best approximation to the prototype. Results show that the precision of a partial similarity model is determined not only by the number of satisfied dimensionless parameters but also by the relative relaxation quantities of the relaxed ones.
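The effect-coefficient criterion reduces to a simple weighted sum. A minimal sketch, with entirely hypothetical sensitivity factors and relaxation quantities (the paper's values come from solving the dimensionless governing equations):

```python
def effect_coefficient(sensitivity, relaxation):
    """Sum over the relaxed dimensionless parameters of
    (sensitivity factor) * (relative relaxation quantity),
    as defined in the abstract."""
    if len(sensitivity) != len(relaxation):
        raise ValueError("one entry per relaxed dimensionless parameter")
    return sum(s * r for s, r in zip(sensitivity, relaxation))

# Two candidate partial similarity models, each relaxing three
# dimensionless parameters (hypothetical numbers for illustration):
model_a = effect_coefficient([0.8, 0.1, 0.3], [0.50, 0.20, 0.10])
model_b = effect_coefficient([0.2, 0.4, 0.1], [0.30, 0.25, 0.40])

# The model with the smallest effect coefficient best approximates
# the prototype.
best = min(("A", model_a), ("B", model_b), key=lambda kv: kv[1])[0]
```

Note how model B, despite relaxing parameters by larger amounts, wins because its relaxed parameters have small sensitivity factors, matching the abstract's point that precision depends on both counts and relaxation quantities.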
Abstract:
The transitions among the different contact models, including the Hertz, Bradley, Johnson-Kendall-Roberts (JKR), Derjaguin-Muller-Toporov (DMT), and Maugis-Dugdale (MD) models, are revealed by analyzing their contact pressure profiles and surface interactions. Inside the contact area, the surface interaction (adhesion) induces tensile contact pressure near the contact edge. Outside the contact area, whether or not the surface interaction is considered has a significant influence on the equilibrium of the contact system. The difference in contact pressure due to the surface interaction inside the contact area, and the equilibrium as influenced by the surface interaction outside it, are physically responsible for the differing predictions of the models. A systematic study of the transitions between the models is presented by analyzing the contact pressure profiles and the surface interactions both inside and outside the contact area. The definitions of contact radius and the flatness of contact surfaces are also discussed. (C) Koninklijke Brill NV, Leiden, 2008.
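As a baseline for the adhesive models above, the classical adhesionless Hertz solution can be sketched as follows; the numerical inputs are illustrative, not from the paper:

```python
import math

def hertz_contact(F, R, E_star):
    """Classical Hertz solution for a sphere of radius R pressed with
    load F against an elastic half-space with effective modulus E*.
    Returns the contact radius a and peak pressure p0.  Inside the
    contact area the pressure profile is p(r) = p0*sqrt(1 - (r/a)**2),
    purely compressive -- the tensile (adhesive) contribution near the
    contact edge is precisely what JKR, DMT, and MD add on top.
    """
    a = (3.0 * F * R / (4.0 * E_star)) ** (1.0 / 3.0)
    p0 = 3.0 * F / (2.0 * math.pi * a * a)
    return a, p0

# Illustrative numbers: 1 N load, 1 mm sphere, E* = 1 GPa.
a, p0 = hertz_contact(F=1.0, R=1e-3, E_star=1e9)
```

Integrating the Hertzian pressure profile over the contact circle recovers the applied load, `(2/3) * pi * a**2 * p0 = F`, which is a quick consistency check on the two formulas.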
Abstract:
Two types of peeling experiments are performed in the present research. One is for the Al film/Al2O3 substrate system with an adhesive layer between the film and the substrate; the other is for the Cu film/Al2O3 substrate system without an adhesive layer, in which the Cu films are electroplated onto the Al2O3 substrates. For the case with an adhesive layer, two kinds of adhesives are selected, both mixtures of epoxy and polyimide, with mass ratios of 1:1.5 and 1:1, respectively. The relationships between the energy release rate, the film thickness, and the adhesive layer thickness are measured during the steady-state peeling process, and the effects of the adhesive layer on the energy release rate are analyzed. Using the experimental results, several analytical criteria for steady-state peeling, based on the bending model and on the two-dimensional finite element model, are critically assessed. Through this assessment we find that the cohesive zone criterion based on the beam bending model is suitable for the weak-interface-strength case and describes a macroscale fracture process zone, while the two-dimensional finite element model is effective for both strong and weak interfaces and describes a small-scale fracture process zone. (C) 2007 Elsevier Ltd. All rights reserved.
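For orientation, the simplest textbook estimate of the steady-state peel energy release rate is the Kendall relation; this is a generic sketch with illustrative numbers, not the paper's bending-model or finite-element criterion:

```python
import math

def kendall_release_rate(F, width, angle_deg, thickness=None, modulus=None):
    """Steady-state energy release rate for film peeling (Kendall, 1975):
    G = (F/b) * (1 - cos(theta)) for an inextensible film of width b
    peeled at angle theta under force F, plus an elastic stretching
    term F**2 / (2 * b**2 * t * E) when the film thickness t and
    modulus E are supplied."""
    theta = math.radians(angle_deg)
    G = (F / width) * (1.0 - math.cos(theta))
    if thickness is not None and modulus is not None:
        G += F ** 2 / (2.0 * width ** 2 * thickness * modulus)
    return G

# 90-degree peel of a 10 mm wide film at 2 N (illustrative numbers):
G = kendall_release_rate(F=2.0, width=0.01, angle_deg=90.0)
```

At a 90-degree peel angle the cosine term vanishes and G is simply the peel force per unit width, which is why peel tests report force/width directly; the paper's analysis goes beyond this by accounting for film bending and the adhesive layer.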
Abstract:
Many problems in control and signal processing can be formulated as sequential decision problems for general state space models. However, except for some simple models, one cannot obtain analytical solutions and has to resort to approximation. In this thesis we investigate problems where Sequential Monte Carlo (SMC) methods can be combined with gradient-based search to solve online optimisation problems. The main contributions of the thesis are as follows. Chapter 4 focuses on the sensor scheduling problem cast as a controlled Hidden Markov Model. We consider the case in which the state, observation, and action spaces are continuous; this general case is important as it is the natural framework for many applications. In sensor scheduling, our aim is to minimise the variance of the estimation error of the hidden state with respect to the action sequence. We present a novel SMC method that uses a stochastic gradient algorithm to find optimal actions, in contrast to existing works in the literature that only solve approximations to the original problem. In Chapter 5 we show how SMC can be used to solve a risk-sensitive control problem. We adopt the Feynman-Kac representation of a controlled Markov chain flow and exploit the properties of the logarithmic Lyapunov exponent, which leads to a policy gradient solution for the parameterised problem. The resulting SMC algorithm follows a structure similar to that of the Recursive Maximum Likelihood (RML) algorithm for online parameter estimation. In Chapters 6, 7, and 8, dynamic graphical models are combined with state space models for the purpose of online decentralised inference. We concentrate on the distributed parameter estimation problem using two maximum likelihood techniques, namely Recursive Maximum Likelihood (RML) and Expectation Maximization (EM).
The resulting algorithms can be interpreted as an extension of the Belief Propagation (BP) algorithm to the computation of likelihood gradients. To design an SMC algorithm, Chapter 8 uses a nonparametric approximation of Belief Propagation. The algorithms are successfully applied to the sensor localisation problem for sensor networks of small and medium size.
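The common SMC building block underlying these contributions is the bootstrap particle filter. A toy stand-in on a linear-Gaussian state space model, with illustrative parameter values (the thesis's actual algorithms add gradient estimation and control on top of this):

```python
import numpy as np

rng = np.random.default_rng(1)

def bootstrap_filter(y, n_particles=1000, phi=0.9, sig_x=1.0, sig_y=0.5):
    """Bootstrap SMC filter for the model
    x_t = phi * x_{t-1} + N(0, sig_x^2),  y_t = x_t + N(0, sig_y^2).
    Returns the filtering means E[x_t | y_{1:t}]."""
    x = rng.normal(0.0, sig_x, n_particles)
    means = []
    for yt in y:
        x = phi * x + rng.normal(0.0, sig_x, n_particles)   # propagate
        logw = -0.5 * ((yt - x) / sig_y) ** 2               # weight by likelihood
        w = np.exp(logw - logw.max())
        w /= w.sum()
        means.append(float(np.dot(w, x)))
        x = x[rng.choice(n_particles, n_particles, p=w)]    # multinomial resample
    return means

# Simulate a short trajectory and filter it:
x_true, y = [0.0], []
for _ in range(50):
    x_true.append(0.9 * x_true[-1] + rng.normal(0.0, 1.0))
    y.append(x_true[-1] + rng.normal(0.0, 0.5))
m = bootstrap_filter(y)
```

The resampling step is what keeps the particle population concentrated on probable states; gradient-based extensions such as RML differentiate quantities computed from these weighted particles.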
Abstract:
The stress release model, a stochastic version of the elastic rebound theory, is applied to the large events from four synthetic earthquake catalogs generated by models with various levels of disorder in the distribution of fault zone strength (Ben-Zion, 1996). These include models with uniform properties (U), a Parkfield-type asperity (A), fractal brittle properties (F), and multi-size-scale heterogeneities (M). The results show that the degree of regularity or predictability in the assumed fault properties, based on both the Akaike information criterion and simulations, follows the order U, F, A, M, in good agreement with that obtained by pattern recognition techniques applied to the full set of synthetic data. Data simulated from the best-fitting stress release models reproduce, both visually and in distributional terms, the main features of the original catalogs. The differences in character and in quality of prediction between the four cases are shown to depend on two main aspects: the parameter controlling the sensitivity to departures from the mean stress level, and the frequency-magnitude distribution, which differs substantially between the four cases. In particular, it is shown that the predictability of the data is strongly affected by the form of the frequency-magnitude distribution, being greatly reduced if a pure Gutenberg-Richter form is assumed to hold out to high magnitudes.
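The mechanism of the stress release model can be sketched in a few lines: stress accumulates linearly and events occur with an intensity that grows exponentially with the current stress level. The parameter values, the discretised thinning scheme, and the exponential stress drops below are illustrative assumptions, not the paper's fitted models:

```python
import math
import random

random.seed(0)

def simulate_stress_release(t_max=500.0, dt=0.01, rho=1.0,
                            alpha=-5.0, beta=1.0):
    """Discretised simulation of a stress release process: stress X(t)
    rises at tectonic rate rho and drops at events, which occur with
    conditional intensity lambda(t) = exp(alpha + beta * X(t)).
    beta plays the role of the sensitivity parameter the abstract
    refers to.  Returns the list of (event time, stress drop)."""
    x, t, events = 0.0, 0.0, []
    while t < t_max:
        # Bernoulli approximation of the point process on a small step.
        if random.random() < math.exp(alpha + beta * x) * dt:
            drop = random.expovariate(1.0)   # illustrative stress-drop law
            events.append((t, drop))
            x -= drop
        x += rho * dt
        t += dt
    return events

events = simulate_stress_release()
```

With a large `beta` the intensity switches on sharply near a threshold stress and the event sequence becomes quasi-periodic (characteristic, predictable); with a small `beta` events are nearly Poissonian, mirroring the predictability differences the abstract discusses.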
Abstract:
We introduce a conceptual model for the in-plane physics of an earthquake fault. The model employs cellular automaton techniques to simulate tectonic loading, earthquake rupture, and strain redistribution. The effect of a hypothetical crustal elastodynamic Green's function is approximated by a long-range strain redistribution law with an r^(-p) dependence. We investigate the influence of the effective elastodynamic interaction range on the dynamical behaviour of the model by conducting experiments with different values of the exponent p. The results indicate that this model has two distinct, stable modes of behaviour. The first mode produces a characteristic earthquake distribution, with moderate to large events preceded by an interval of time in which the rate of energy release accelerates. A correlation function analysis reveals that these accelerating sequences are associated with a systematic, global evolution of strain energy correlations within the system. The second stable mode produces Gutenberg-Richter (GR) statistics, with near-linear energy release and no significant global correlation evolution. A model with effectively short-range interactions preferentially displays Gutenberg-Richter behaviour, whereas models with long-range interactions appear to switch between the characteristic and GR modes. As the range of elastodynamic interactions is increased, characteristic behaviour begins to dominate GR behaviour. These models demonstrate that evolution of strain energy correlations may occur within systems with a fixed elastodynamic interaction range. If similar mode-switching dynamical behaviour occurs within earthquake faults, then intermediate-term forecasting of large earthquakes may be feasible for some earthquakes but not for others, in line with certain empirical seismological observations. Further numerical investigation of dynamical models of this type may lead to advances in earthquake forecasting research and theoretical seismology.
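The r^(-p) redistribution law can be illustrated with a toy one-dimensional version; the 1-D geometry, the normalisation, and the cell values are illustrative simplifications of the automaton described above:

```python
import numpy as np

def redistribute(strain, failed, p=2.0):
    """Redistribute the strain released by a failed cell to all other
    cells of a 1-D fault, with weights proportional to r**(-p) where r
    is the distance to the failed cell.  Small p approximates a
    long-range (near-uniform) elastodynamic interaction; large p
    concentrates the transfer on nearest neighbours."""
    n = strain.size
    r = np.abs(np.arange(n) - failed).astype(float)
    w = np.where(r > 0, r ** (-p), 0.0)   # zero weight on the failed cell
    w /= w.sum()                          # conserve the released strain
    released = strain[failed]
    strain = strain + released * w
    strain[failed] = 0.0
    return strain

# A uniform 11-cell fault; the centre cell fails and sheds its strain.
s = np.full(11, 1.0)
s_after = redistribute(s, failed=5, p=2.0)
```

Because the weights are normalised, total strain is conserved across a rupture; sweeping `p` in such a model is what changes the effective interaction range and, per the abstract, tips the system between characteristic and Gutenberg-Richter behaviour.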