971 results for Stochastic modelling


Relevance:

30.00%

Publisher:

Abstract:

A new multi-scale model of brittle fracture growth in an Ag plate with macroscopic dimensions is proposed, in which crack propagation is identified with the stochastic drift-diffusion motion of the crack-tip atom through the material. The model couples molecular dynamics simulations, based on many-body interatomic potentials, with continuum-based theories of fracture mechanics. The Itô stochastic differential equation is used to advance the tip position on a macroscopic scale before each nano-scale simulation is performed. Well-known crack characteristics, such as the roughening transitions of the crack surfaces, as well as the macroscopic crack trajectories, are obtained.
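The drift-diffusion advance of the crack tip can be sketched with a generic Euler-Maruyama integration of an Itô SDE. The constant drift and diffusion coefficients below are hypothetical placeholders for the quantities the model would extract from the nano-scale simulations:

```python
import numpy as np

def euler_maruyama(x0, drift, diffusion, dt, n_steps, rng):
    """Integrate the Ito SDE dX = a(X) dt + b(X) dW step by step."""
    x = np.empty(n_steps + 1)
    x[0] = x0
    for i in range(n_steps):
        dw = rng.normal(0.0, np.sqrt(dt))          # Wiener increment
        x[i + 1] = x[i] + drift(x[i]) * dt + diffusion(x[i]) * dw
    return x

rng = np.random.default_rng(0)
# Constant drift (mean crack advance) and diffusion (surface roughening)
# coefficients -- placeholders, not values from the paper.
path = euler_maruyama(0.0, lambda x: 1.0, lambda x: 0.5,
                      dt=1e-3, n_steps=1000, rng=rng)
```

In the paper's scheme each macroscopic step of this kind would be followed by a nano-scale molecular dynamics simulation at the new tip position.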


Vacuum Arc Remelting (VAR) is the accepted method for producing the homogeneous, fine, inclusion-free microstructures required for rotating-grade applications. However, as ingot sizes increase, INCONEL 718 becomes increasingly susceptible to defects such as freckles, tree rings, and white spots in large-diameter billets. Predictive models of these defects are therefore required to allow optimization of process parameters. In this paper, a multiscale and multi-physics model is presented to predict the development of microstructures in the VAR ingot during solidification. At the microscale, a stochastic nucleation approach combined with a finite difference solution of solute diffusion is applied in the semi-solid zone of the VAR ingot. The micromodel is coupled with a solution of the macroscale heat transfer, fluid flow and electromagnetism in the VAR process through the temperature, pressure and fluid flow fields. The main objective of this study is to achieve a better understanding of defect formation in VAR by quantifying the influence of VAR processing parameters on grain nucleation and dendrite growth. In particular, the effect of different ingot growth velocities on microstructure formation was investigated. It was found that reducing the velocity produces significantly coarser grains.
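The stochastic nucleation step can be sketched as a per-cell Bernoulli trial whose probability grows with local undercooling. The rate law and its constants below are illustrative assumptions, not the paper's calibrated micromodel:

```python
import numpy as np

def stochastic_nucleation(undercooling, dt, a=1e-3, dT_c=2.0, rng=None):
    """Bernoulli nucleation trial per grid cell.

    Assumes an illustrative rate law: the nucleation rate per cell grows
    exponentially with local undercooling dT, rate = a * exp(dT / dT_c),
    and the probability of nucleating within dt is 1 - exp(-rate * dt).
    """
    if rng is None:
        rng = np.random.default_rng(42)
    rate = a * np.exp(undercooling / dT_c)
    p = 1.0 - np.exp(-rate * dt)
    return rng.random(undercooling.shape) < p

# Undercooling increasing from left (0 K) to right (10 K) of the domain.
dT = np.tile(np.linspace(0.0, 10.0, 50), (50, 1))
nuclei = stochastic_nucleation(dT, dt=1.0)
```

In a full model, each nucleated cell would seed a grain whose dendritic growth is then advanced with the finite difference solute-diffusion solution.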


Non-market effects of agriculture are often estimated using discrete choice models from stated preference surveys. In this context, we propose two ways of modelling attribute non-attendance. The first involves constraining coefficients to zero in a latent class framework, whereas the second is based on stochastic attribute selection and grounded in Bayesian estimation. Their implications are explored in the context of a stated preference survey designed to value landscapes in Ireland. Accounting for attribute non-attendance with these data improves fit and tends to involve two attributes, one of which is likely to be cost, thereby leading to substantive changes in derived welfare estimates.
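The first approach, constraining a coefficient to zero for a non-attending class, can be illustrated with a plain multinomial logit: zeroing the cost coefficient changes the implied choice probabilities, and hence any welfare estimates derived from them. All attribute values and coefficients below are made up for illustration:

```python
import numpy as np

def logit_probs(X, beta):
    """Multinomial logit choice probabilities; rows of X are alternatives."""
    v = X @ beta
    e = np.exp(v - v.max())        # subtract max for numerical stability
    return e / e.sum()

# Hypothetical alternatives described by [landscape quality, cost].
X = np.array([[1.0, 5.0],
              [0.5, 2.0],
              [0.0, 0.0]])
beta_full = np.array([0.8, -0.3])  # respondent attends to both attributes
beta_na = np.array([0.8, 0.0])     # cost coefficient constrained to zero
p_full = logit_probs(X, beta_full)
p_na = logit_probs(X, beta_na)
```

Under non-attendance to cost, the expensive high-quality alternative becomes more attractive, which is why ignoring non-attendance distorts welfare measures.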


A Monte Carlo simulation model has been constructed to assess a public health scheme involving mobile volunteer cardiac first responders. The scheme aims to improve survival of Sudden Cardiac Arrest (SCA) patients by reducing the time until administration of life-saving defibrillation treatment, with volunteers being paged to respond to possible SCA incidents alongside the Emergency Medical Services. The need for a model, for example to assess the impact of the scheme in different geographical regions, became apparent upon collection of observational trial data, given that the data exhibited stochastic and spatial complexities. The simulation model was validated and then used to assess the scheme's benefits in an alternative rural region that was not part of the original trial. These illustrative results suggest that the scheme may not be the most efficient use of National Health Service resources in this geographical region, thus demonstrating the importance and usefulness of simulation modelling in aiding decision making.
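The structure of such an assessment can be sketched as a simple Monte Carlo comparison of defibrillation times with and without volunteers. The response-time distributions and the survival curve below are invented for illustration and are not the trial's fitted values:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 10_000  # simulated cardiac-arrest incidents

# Hypothetical lognormal response-time distributions (minutes).
ems = rng.lognormal(mean=2.1, sigma=0.4, size=n)        # ambulance
volunteer = rng.lognormal(mean=1.7, sigma=0.6, size=n)  # paged volunteer

def survival(t):
    """Illustrative survival model: ~10% relative decline per minute."""
    return 0.9 ** t

time_to_defib = np.minimum(ems, volunteer)  # whoever arrives first
gain = survival(time_to_defib).mean() - survival(ems).mean()
```

A real model would add the spatial component (incident and responder locations) and region-specific dispatch logic; the point here is only the shape of the Monte Carlo comparison.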


Reliable prediction of long-term medical device performance using computer simulation requires consideration of variability in the surgical procedure, as well as patient-specific factors. However, even deterministic simulation of long-term failure processes for such devices is time- and resource-consuming, so including variability can lead to excessive times to achieve useful predictions. This study investigates the use of an accelerated probabilistic framework for predicting the likely performance envelope of a device and applies it to femoral prosthesis loosening in cemented hip arthroplasty.
A creep and fatigue damage failure model for bone cement, in conjunction with an interfacial fatigue model for the implant–cement interface, was used to simulate loosening of a prosthesis within a cement mantle. A deterministic set of trial simulations was used to account for variability of a set of surgical and patient factors, and a response surface method was used to perform and accelerate a Monte Carlo simulation to achieve an estimate of the likely range of prosthesis loosening. The proposed framework was used to conceptually investigate the influence of prosthesis selection and surgical placement on prosthesis migration.
Results demonstrate that the response surface method is capable of dramatically reducing the time to achieve convergence in the mean and variance of predicted response variables. A critical requirement for realistic predictions is the size and quality of the initial training dataset used to generate the response surface; further work is required to determine recommendations for a minimum number of initial trials. Results of this conceptual application predicted that loosening was sensitive to implant size and femoral width. Furthermore, different rankings of implant performance were predicted when only individual simulations (e.g. an average condition) were used to rank implants, compared with when stochastic simulations were used. In conclusion, the proposed framework provides a viable approach to predicting realistic ranges of loosening behaviour for orthopaedic implants in reduced timeframes compared with conventional Monte Carlo simulations.
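The accelerated framework can be sketched generically: fit a quadratic response surface to a small set of deterministic trials of an expensive model, then run the Monte Carlo on the cheap surrogate. The "expensive model" below is a trivial stand-in, not the finite element loosening simulation:

```python
import numpy as np

rng = np.random.default_rng(7)

def expensive_model(x1, x2):
    """Stand-in for a long-running FE simulation of prosthesis migration."""
    return 1.0 + 0.5 * x1 - 0.3 * x2 + 0.1 * x1 * x2

# Small deterministic trial set spanning two (hypothetical) input factors.
x1, x2 = rng.uniform(-1.0, 1.0, (2, 30))
y = expensive_model(x1, x2)

# Quadratic response surface fitted by least squares.
A = np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1 ** 2, x2 ** 2])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

# Monte Carlo on the cheap surrogate instead of the expensive model.
s1, s2 = rng.standard_normal((2, 100_000)) * 0.3
S = np.column_stack([np.ones_like(s1), s1, s2, s1 * s2, s1 ** 2, s2 ** 2])
pred = S @ coef
mc_mean, mc_std = pred.mean(), pred.std()
```

The 100,000 surrogate evaluations cost milliseconds, which is the source of the speed-up over running the expensive model inside the Monte Carlo loop.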


When studying heterogeneous aquifer systems, especially at regional scale, a degree of generalization is anticipated. This can be due to sparse sampling regimes, complex depositional environments or lack of access for measuring the subsurface, and can lead to an inaccurate conceptualization, which is detrimental when applied to groundwater flow models. It is important that numerical models are based on observed, accurate geological information and do not rely on distributions of artificial aquifer properties. This can still be problematic, as data will be modelled at a different scale from that at which they were collected. It is proposed here that integrating geophysics and upscaling techniques can assist in building a more realistic, deterministic groundwater flow model. In this study, the sedimentary aquifer of the Lagan Valley in Northern Ireland is chosen because of intruding sub-vertical dolerite dykes, which are of lower permeability than the sandstone aquifer. The use of airborne magnetics allows the delineation of heterogeneities, confirmed by field analysis. Permeability measured at the field scale is then upscaled to different levels using a correlation with the geophysical data, creating equivalent parameters, including directional equivalent permeabilities and anisotropy, that can be directly imported into numerical groundwater flow models. Several stages of upscaling are modelled using finite elements. Initial modelling provides promising results, especially at the intermediate scale, suggesting an accurate distribution of aquifer properties. This deterministic methodology is being expanded to include stochastic methods of locating heterogeneities based on airborne geophysical data, through the Direct Sampling method of Multiple-Point Statistics (MPS), which uses the magnetic data as a training image to computationally determine a probabilistic occurrence of heterogeneity.
There is also a need to apply the method to alternate geological contexts where the heterogeneity is of a higher permeability than the host rock.
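A minimal form of the upscaling step, the classical arithmetic and harmonic bounds on equivalent permeability for flow parallel and perpendicular to layering, can be sketched as follows. The permeability values and dyke geometry are hypothetical, not the Lagan Valley data:

```python
import numpy as np

def upscale_bounds(k, axis=0):
    """Equivalent permeability bounds for a heterogeneous grid:
    arithmetic mean (flow parallel to layering) and harmonic mean
    (flow perpendicular to layering)."""
    k_para = k.mean(axis=axis).mean()
    k_perp = (1.0 / (1.0 / k).mean(axis=axis)).mean()
    return k_para, k_perp

rng = np.random.default_rng(3)
k = rng.lognormal(mean=-27.6, sigma=1.0, size=(64, 64))  # sandstone-like, m^2
k[:, 30:32] = 1e-16   # low-permeability sub-vertical dyke (illustrative)
k_para, k_perp = upscale_bounds(k, axis=1)
```

The strong directional contrast the dyke induces (harmonic mean collapses, arithmetic mean barely changes) is exactly the anisotropy the equivalent parameters are meant to capture.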


This study investigates the effects of ground heterogeneity, considering permeability as a random variable, on an intruding saltwater (SW) wedge using Monte Carlo simulations. Random permeability fields were generated, using the method of Local Average Subdivision (LAS), based on a lognormal probability density function. The LAS method allows the creation of spatially correlated random fields, generated using coefficients of variation (COV) and horizontal and vertical scales of fluctuation (SOF). The numerical modelling code SUTRA was employed to solve the coupled flow and transport problem. The well-defined 2D dispersive Henry problem was used as the test case for the method. The intruding SW wedge is characterised by two key parameters, the toe penetration length (TL) and the width of the mixing zone (WMZ). These parameters were compared to the results of a homogeneous case simulated using effective permeability values. The simulation results revealed that: (1) an increase in COV resulted in a seaward movement of the TL; (2) the WMZ extended with increasing COV; (3) a general increase in horizontal and vertical SOF produced a seaward movement of the TL, with the WMZ increasing slightly; and (4) as the anisotropy ratio increased, the TL intruded further inland and the WMZ reduced in size. The results show that for large values of COV, effective permeability parameters are inadequate for reproducing the effects of heterogeneity on SW intrusion.
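LAS itself is fairly involved, but its essential ingredients, a lognormal marginal parameterised by the COV and spatial correlation on the order of the SOF, can be mimicked with a simple moving-average Gaussian field. This is an illustrative stand-in, not the LAS algorithm:

```python
import numpy as np

def lognormal_field(shape, mean_k, cov, sof, dx=1.0, rng=None):
    """Spatially correlated lognormal field: Gaussian white noise is
    smoothed with a moving average of width ~SOF, re-standardised, and
    exponentiated so the marginal mean and COV match the targets."""
    if rng is None:
        rng = np.random.default_rng(11)
    sigma2 = np.log(1.0 + cov ** 2)          # log-space variance from COV
    mu = np.log(mean_k) - 0.5 * sigma2
    g = rng.standard_normal(shape)
    w = max(1, round(sof / dx))              # smoothing window ~ SOF
    kernel = np.ones(w) / w
    for ax in range(len(shape)):
        g = np.apply_along_axis(np.convolve, ax, g, kernel, mode="same")
    g = (g - g.mean()) / g.std()             # re-standardise after smoothing
    return np.exp(mu + np.sqrt(sigma2) * g)

k = lognormal_field((64, 64), mean_k=1e-11, cov=0.5, sof=8.0)
```

In a Monte Carlo study of the Henry problem, one such realisation would be generated per simulation and TL and WMZ recorded for each.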


One of the most widely used techniques in computer vision for foreground detection is to model each background pixel as a Mixture of Gaussians (MoG). While this is effective for a static camera with a fixed or slowly varying background, it fails to handle fast, dynamic movement in the background. In this paper, we propose a generalised framework, called region-based MoG (RMoG), that takes neighbouring pixels into consideration when generating the model of the observed scene. The model equations are derived from Expectation Maximisation theory for batch mode, and stochastic approximation is used for online mode updates. We evaluate our region-based approach against ten sequences containing dynamic backgrounds, and show that it provides a performance improvement over the traditional single-pixel MoG. For equal feature and region sizes, the effect of increasing the learning rate is to reduce both true and false positives. Comparison with four state-of-the-art approaches shows that RMoG outperforms the others in reducing false positives whilst still maintaining reasonable foreground definition. Lastly, using the ChangeDetection (CDNet 2014) benchmark, we evaluated RMoG against numerous surveillance scenes and found it to be amongst the leading performers for dynamic background scenes, whilst providing comparable performance for other commonly occurring surveillance scenes.
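The single-pixel baseline that RMoG generalises can be sketched as an online Stauffer-Grimson style MoG update. This is a per-pixel sketch of the idea, not the RMoG model, and the parameter values are conventional defaults rather than the paper's:

```python
import numpy as np

class PixelMoG:
    """Per-pixel online Mixture-of-Gaussians background model in the
    Stauffer-Grimson style; RMoG extends this to pixel neighbourhoods."""

    def __init__(self, k=3, lr=0.05, match_sigmas=2.5, bg_weight=0.7):
        self.w = np.ones(k) / k                 # component weights
        self.mu = np.linspace(0.0, 255.0, k)    # component means
        self.var = np.full(k, 400.0)            # component variances
        self.lr, self.match, self.bg_weight = lr, match_sigmas, bg_weight

    def update(self, x):
        """Update with intensity x; return True if x looks like background."""
        d = np.abs(x - self.mu) / np.sqrt(self.var)
        hit = int(np.argmin(d))
        matched = bool(d[hit] < self.match)
        self.w *= 1.0 - self.lr
        if matched:
            self.w[hit] += self.lr
            self.mu[hit] += self.lr * (x - self.mu[hit])
            self.var[hit] += self.lr * ((x - self.mu[hit]) ** 2 - self.var[hit])
        else:                                   # replace weakest component
            worst = int(np.argmin(self.w))
            self.mu[worst], self.var[worst], self.w[worst] = x, 400.0, self.lr
        self.w /= self.w.sum()
        # Background components: heaviest weights summing to bg_weight.
        order = np.argsort(self.w)[::-1]
        n_bg = int(np.searchsorted(np.cumsum(self.w[order]), self.bg_weight)) + 1
        return matched and hit in set(order[:n_bg].tolist())

model = PixelMoG()
for _ in range(200):
    model.update(100.0)          # learn a static background at intensity 100
is_bg = model.update(100.0)      # familiar intensity -> background
is_fg = not model.update(250.0)  # novel bright pixel -> foreground
```

RMoG's contribution is to pool evidence over a region rather than deciding from one pixel's history alone, which is what makes it robust to dynamic backgrounds.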


Predicting life expectancy has become of utmost importance in society. Pension providers, insurance companies, government bodies and individuals in the developed world have a vested interest in understanding how long people will live. This desire to better understand life expectancy has resulted in an explosion of stochastic mortality models, many of which identify linear trends in mortality rates over time. In using such models for forecasting, we rely on the assumption that the direction of the linear trend (determined from the data used for fitting) will not change in the future; recent literature has started to question this assumption. In this paper we carry out a comprehensive investigation of these types of models using male and female data from 30 countries, applying the theory of structural breaks to identify changes in the extracted trends over time. We find that structural breaks are present in a substantial number of cases, that they are more prevalent in male data than in female data, that the introduction of additional period factors into the model reduces their presence, and that allowing for changes in the trend improves the fit and forecast substantially.
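A structural-break search on an extracted period trend can be illustrated with a simple least-squares grid over candidate break dates. The synthetic index below (trend slopes, break date, noise level) is invented for the example and is not mortality data:

```python
import numpy as np

def best_break(t, y, trim=5):
    """Grid search for a single structural break: fit separate linear
    trends before and after each candidate date, keep the lowest-SSE split."""
    def sse(tt, yy):
        A = np.column_stack([np.ones_like(tt), tt])
        resid = yy - A @ np.linalg.lstsq(A, yy, rcond=None)[0]
        return float(resid @ resid)
    candidates = range(trim, len(t) - trim)
    return min(candidates, key=lambda i: sse(t[:i], y[:i]) + sse(t[i:], y[i:]))

# Synthetic period index: downward trend whose slope steepens at year 30.
t = np.arange(60.0)
kappa = np.where(t < 30, -0.5 * t, -15.0 - 1.5 * (t - 30))
kappa = kappa + np.random.default_rng(5).normal(0.0, 0.5, t.shape)
i_star = best_break(t, kappa)
```

Formal break tests add critical values and allow multiple breaks, but the fitting-and-comparing step is the same in spirit.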


The algorithmic approach to data modelling has developed rapidly in recent years; in particular, methods based on data mining and machine learning have been used in a growing number of applications. These methods follow a data-driven methodology, aiming at providing the best possible generalization and predictive abilities instead of concentrating on the properties of the data model. One of the most successful groups of such methods is known as Support Vector algorithms. Following the fruitful developments in applying Support Vector algorithms to spatial data, this paper introduces a new extension of the traditional support vector regression (SVR) algorithm. This extension allows for the simultaneous modelling of environmental data at several spatial scales. The joint influence of environmental processes presenting different patterns at different scales is here learned automatically from data, providing the optimum mixture of short- and large-scale models. The method is adaptive to the spatial scale of the data. With this advantage, it can provide an efficient means of modelling local anomalies that may typically arise at an early phase of an environmental emergency. However, the proposed approach still requires some prior knowledge of the possible existence of such short-scale patterns; this is a possible limitation of the method for its implementation in early warning systems. The purpose of this paper is to present the multi-scale SVR model and to illustrate its use with an application to the mapping of Cs-137 activity, given the measurements taken in the region of Briansk following the Chernobyl accident.
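The multi-scale idea can be sketched with kernel ridge regression using a sum of RBF kernels at two length scales. This is a simplified stand-in for the paper's multi-scale SVR (squared loss replaces SVR's epsilon-insensitive loss for brevity), and the synthetic field, mixing a smooth trend with a narrow anomaly, is invented:

```python
import numpy as np

def rbf(X, Y, s):
    """Gaussian RBF kernel matrix with length-scale s."""
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * s ** 2))

def fit_multiscale(X, y, scales=(0.1, 1.0), lam=1e-3):
    """Kernel ridge regression with a sum of RBF kernels at two scales."""
    K = sum(rbf(X, X, s) for s in scales)
    alpha = np.linalg.solve(K + lam * np.eye(len(X)), y)
    return lambda Xq: sum(rbf(Xq, X, s) for s in scales) @ alpha

rng = np.random.default_rng(2)
X = rng.uniform(0.0, 1.0, (200, 2))
# Synthetic field: smooth large-scale trend plus a narrow local anomaly.
y = np.sin(2 * np.pi * X[:, 0]) + 2.0 * np.exp(-((X - 0.5) ** 2).sum(1) / 0.005)
model = fit_multiscale(X, y)
resid = y - model(X)
```

The fine-scale kernel captures the anomaly that a single broad kernel would smooth away, which is the mechanism the paper exploits for emergency mapping.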


This paper, prepared for the Handbook of Statistics (Vol. 14: Statistical Methods in Finance), surveys the subject of stochastic volatility. The following subjects are covered: volatility in financial markets (instantaneous volatility of asset returns, implied volatilities in option prices and related stylized facts), statistical modelling in discrete and continuous time and, finally, statistical inference (method of moments, quasi-maximum likelihood, likelihood-based and Bayesian methods, and indirect inference).


The classical methods of analysing time series by the Box-Jenkins approach assume that the observed series fluctuates around changing levels with constant variance; that is, the time series is assumed to be homoscedastic. However, financial time series exhibit heteroscedasticity in the sense that the conditional variance, given past observations, is non-constant. The analysis of financial time series therefore requires the modelling of such variances, which may depend on time-dependent factors or on their own past values. This led to the introduction of several classes of models to study the behaviour of financial time series; see Taylor (1986), Tsay (2005), Rachev et al. (2007). The class of models used to describe the evolution of conditional variances is referred to as stochastic volatility models. The stochastic models available to analyse conditional variances are based on either normal or log-normal distributions. One of the objectives of the present study is to explore the possibility of employing some non-Gaussian distributions to model the volatility sequences and then study the behaviour of the resulting return series. This led us to the related problem of statistical inference, which is the main contribution of the thesis.
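The canonical log-normal SV model the passage refers to can be simulated directly, making the heteroscedasticity visible as autocorrelation in squared returns while the returns themselves remain serially uncorrelated. The parameter values are typical illustrative choices:

```python
import numpy as np

def simulate_sv(n, mu=-1.0, phi=0.97, sigma_eta=0.2, rng=None):
    """Log-normal SV model: h_t = mu + phi*(h_{t-1} - mu) + sigma_eta*eta_t,
    r_t = exp(h_t / 2) * eps_t, with eta_t, eps_t i.i.d. standard normal."""
    if rng is None:
        rng = np.random.default_rng(9)
    h = np.empty(n)
    h[0] = mu
    for t in range(1, n):
        h[t] = mu + phi * (h[t - 1] - mu) + sigma_eta * rng.standard_normal()
    return np.exp(h / 2) * rng.standard_normal(n)

def lag1_autocorr(x):
    x = x - x.mean()
    return float((x[:-1] * x[1:]).mean() / (x * x).mean())

r = simulate_sv(20_000)
clustering = lag1_autocorr(r ** 2)  # positive: volatility clustering
```

Replacing the Gaussian innovations with a heavier-tailed law is the kind of non-Gaussian extension the thesis pursues; inference is harder because the likelihood involves integrating out the latent h_t path.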