932 results for Stochastic Extension


Relevance: 100.00%

Abstract:

The effects of stochastic extension on the statistical evolution of an ideal microcrack system are discussed. First, a general theoretical formulation and an expression for the transition probability of the extension process are presented; then the features of the evolution in the stochastic model are demonstrated by several numerical results and compared with those of the deterministic model.
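As a rough illustration of the distinction drawn above, the sketch below evolves an ensemble of microcracks whose extension at each step is a random event and compares the ensemble statistics with a deterministic mean-rate growth law. It is not the paper's formulation: the transition probability p_ext(a) and all parameters are invented for illustration.

import numpy as np

rng = np.random.default_rng(0)
n_cracks, n_steps, da = 10_000, 200, 0.01
a_stoch = np.full(n_cracks, 1.0)          # stochastic ensemble of crack lengths
a_det = 1.0                                # single deterministic crack length

def p_ext(a, dt=1.0):
    """Illustrative transition probability of extension in one time step."""
    return 1.0 - np.exp(-0.05 * a * dt)

for _ in range(n_steps):
    extend = rng.random(n_cracks) < p_ext(a_stoch)
    a_stoch[extend] += da                  # stochastic extension event
    a_det += da * p_ext(a_det)             # deterministic (mean-rate) growth

print(f"stochastic mean {a_stoch.mean():.3f}, std {a_stoch.std():.3f}, "
      f"deterministic {a_det:.3f}")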

Relevance: 70.00%

Abstract:

We extend the Sznajd model for opinion formation by introducing persuasion probabilities for opinions. Moreover, we couple the system to an environment which mimics the application of the opinion. This results in a feedback representing single-state opinion transitions, as opposed to the two-state opinion transitions used when persuading other people. We call this model opinion formation in an open community (OFOC). It can be seen as a stochastic extension of the Sznajd model for an open community, because a special choice of parameters recovers the original Sznajd model. We demonstrate the effect of feedback in the OFOC model by applying it to a scenario in which, for example, opinion B is worse than opinion A but easier to explain to other people. Casually formulated, we analyse the question of how much better one has to be at persuading other people, given that one's opinion is worse. Our results reveal a linear relation between the transition probability for opinion B and the influence of the environment on B.
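The following one-dimensional sketch only illustrates the ingredients named above with made-up parameters; it is not the OFOC paper's exact update rule. Pairs of agreeing neighbours persuade their outer neighbours with an opinion-dependent probability (two-state transitions), while the environment drives single-agent transitions at its own rates.

import numpy as np

rng = np.random.default_rng(1)
N, steps = 200, 100_000
p_persuade = {+1: 0.5, -1: 0.7}   # opinion A = +1, opinion B = -1 (B easier to explain)
p_env = {+1: 0.010, -1: 0.002}    # environment-driven single-agent flips towards A / B

spins = rng.choice([-1, +1], size=N)
for _ in range(steps):
    i = rng.integers(N - 1)
    if spins[i] == spins[i + 1] and rng.random() < p_persuade[spins[i]]:
        spins[(i - 1) % N] = spins[i]          # two-state (pair) persuasion
        spins[(i + 2) % N] = spins[i]
    j = rng.integers(N)
    other = -spins[j]
    if rng.random() < p_env[other]:            # single-state environment feedback
        spins[j] = other

print("fraction holding opinion A:", np.mean(spins == +1))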

Relevance: 40.00%

Abstract:

The last 30 years have seen Fuzzy Logic (FL) emerge as a method either complementing or challenging stochastic methods, the traditional means of modelling uncertainty. But the circumstances under which FL or stochastic methods should be used remain a matter of disagreement: the areas of application of statistical and FL methods overlap, with differing opinions as to when each method should be used. Practically relevant case studies comparing the two methods are lacking. This work compares stochastic and FL methods for the assessment of spare capacity, using pharmaceutical high purity water (HPW) utility systems as an example. The goal of this study was to find the most appropriate method for modelling uncertainty in industrial-scale HPW systems. The results provide evidence suggesting that stochastic methods are superior to FL methods for simulating uncertainty in chemical plant utilities, including HPW systems, in the typical case where extreme events (for example peaks in demand) or day-to-day variation, rather than average values, are of interest. The average production output or other statistical measures may, for instance, be of interest in the assessment of workshops. Furthermore, the results indicate that a stochastic model should be used only if a deterministic simulation shows it to be necessary. Consequently, this thesis concludes that either deterministic or stochastic methods should be used to simulate uncertainty in chemical plant utility systems, and by extension some process systems, because extreme events or the modelling of day-to-day variation are important in capacity extension projects. Other reasons supporting the suggestion that stochastic HPW models are preferable to FL HPW models include:

1. The computer code for a stochastic model is typically less complex than for an FL model, reducing code maintenance and validation issues.

2. In many respects FL models are similar to deterministic models. The need for an FL model over a deterministic model is therefore questionable in the case of industrial-scale HPW systems as presented here (as well as other similar systems), since the latter requires simpler models.

3. An FL model may be difficult to "sell" to an end-user, as its results represent "approximate reasoning", a definition of which is, however, lacking.

4. Stochastic models may be applied, with relatively minor modifications, to other systems, whereas FL models may not. For instance, the stochastic HPW model could be used to model municipal drinking water systems, whereas the FL HPW model could not (or should not) be used on such systems. This is because the FL and stochastic model philosophies of an HPW system are fundamentally different. The stochastic model treats schedule and volume uncertainties as random phenomena described by statistical distributions based on either estimated or historical data. The FL model, on the other hand, simulates schedule uncertainties based on estimated operator behaviour, e.g. the tiredness of the operators and their working schedule. But in a municipal drinking water distribution system the notion of "operator" breaks down.

5. Stochastic methods can account for uncertainties that are difficult to model with FL. The FL HPW system model does not account for dispensed-volume uncertainty, as there appears to be no reasonable way to capture it with FL, whereas the stochastic model includes volume uncertainty.
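To make the stochastic modelling philosophy described above concrete, here is a minimal Monte Carlo sketch with purely illustrative distributions (it is not the thesis model): batch start times and dispensed volumes are sampled from distributions, and the quantity of interest is the peak hourly demand, the kind of extreme event an average-based deterministic model would not expose.

import numpy as np

rng = np.random.default_rng(2)
n_days, n_batches = 10_000, 8
peak_demand = np.empty(n_days)

for d in range(n_days):
    hours = rng.integers(0, 24, size=n_batches)            # random batch schedule
    volumes = rng.lognormal(mean=np.log(2.0), sigma=0.3,    # dispensed volume (m^3), illustrative
                            size=n_batches)
    hourly = np.bincount(hours, weights=volumes, minlength=24)
    peak_demand[d] = hourly.max()

print(f"mean peak {peak_demand.mean():.2f} m^3/h, "
      f"99th percentile {np.percentile(peak_demand, 99):.2f} m^3/h")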

Relevance: 30.00%

Abstract:

In this paper we extend the ideas of Brugnano, Iavernaro and Trigiante in their development of HBVM($s,r$) methods to construct symplectic Runge-Kutta methods for all values of $s$ and $r$ with $s\geq r$. These methods, however, do not attain the dramatic performance improvement that HBVMs can achieve. Nevertheless, in the case of additive stochastic Hamiltonian problems, an extension of these ideas, which requires the simulation of an independent Wiener process at each stage of a Runge-Kutta method, leads to methods with very favourable properties. These ideas are illustrated by some simple numerical tests for the modified midpoint rule.
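For context, the sketch below shows one midpoint-type step for an additively forced Hamiltonian system (a harmonic oscillator), solved by fixed-point iteration. It is only a generic illustration of the setting; the stage-wise independent Wiener increments of the HBVM-based construction described above are not reproduced here, and the stepsize, noise level and iteration count are arbitrary.

import numpy as np

def f(y):                       # Hamiltonian vector field, H = (p^2 + q^2) / 2
    q, p = y
    return np.array([p, -q])

def midpoint_step(y, h, sigma, dW):
    """One implicit-midpoint step for dy = f(y) dt + sigma dW (additive noise)."""
    y_new = y.copy()
    for _ in range(50):                         # simple fixed-point iteration
        y_new = y + h * f(0.5 * (y + y_new)) + sigma * dW
    return y_new

rng = np.random.default_rng(3)
h, sigma, y = 0.01, 0.1, np.array([1.0, 0.0])
for _ in range(1000):
    dW = rng.normal(scale=np.sqrt(h), size=2)
    y = midpoint_step(y, h, sigma, dW)
print("state after T = 10:", y)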

Relevance: 30.00%

Abstract:

Recent work on the numerical solution of stochastic differential equations (SDEs) has focused on the development of numerical methods with good stability and order properties. These numerical implementations have used a fixed stepsize, but there are many situations in which a fixed stepsize is not appropriate. In the numerical solution of ordinary differential equations, much work has been carried out on developing robust implementation techniques using variable stepsize. In the deterministic case it has been necessary to consider the "best" choice for an initial stepsize, as well as to develop effective strategies for stepsize control; the same, of course, must be done in the stochastic case. In this paper, proportional-integral (PI) control is applied to a variable stepsize implementation of an embedded pair of stochastic Runge-Kutta methods used to obtain numerical solutions of nonstiff SDEs. For stiff SDEs, the embedded pair of the balanced Milstein and balanced implicit methods is implemented in variable stepsize mode using a predictive controller for the stepsize change. The extension of these stepsize controllers, viewed from a digital filter theory point of view, to PI with derivative (PID) control is also implemented. The implementations show the improvement in efficiency that can be attained when using these control theory approaches compared with a regular stepsize change strategy.
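As a minimal sketch of the kind of PI stepsize controller referred to above (generic textbook gains, not the paper's tuned values), the function below takes the current and previous error estimates from an embedded pair, scaled so that err <= 1 means the step is acceptable at the given tolerance, and returns a new stepsize.

def pi_step_control(h, err, err_prev, k_i=0.3, k_p=0.4,
                    fac_min=0.2, fac_max=5.0, safety=0.9):
    """Return (accept, new stepsize) for a proportional-integral (PI) controller."""
    err = max(err, 1e-10)                    # guard against a zero error estimate
    err_prev = max(err_prev, 1e-10)
    accept = err <= 1.0
    factor = safety * err ** (-k_i) * (err_prev / err) ** k_p   # integral and proportional terms
    factor = min(fac_max, max(fac_min, factor))
    return accept, h * factor

# Advance with the new stepsize if the step is accepted; otherwise redo the
# step with the (smaller) suggested stepsize.
accept, h_next = pi_step_control(h=1e-3, err=0.7, err_prev=0.9)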

Relevance: 30.00%

Abstract:

A spatial process observed over a lattice or a set of irregular regions is usually modeled using a conditionally autoregressive (CAR) model. The neighborhoods within a CAR model are generally formed deterministically using the inter-distances or boundaries between the regions. An extension of the CAR model is proposed in this article in which the selection of the neighborhood depends on unknown parameter(s). This extension is called a Stochastic Neighborhood CAR (SNCAR) model. The resulting model shows flexibility in accurately estimating covariance structures for data generated from a variety of spatial covariance models. Specific examples are illustrated using data generated from some common spatial covariance functions, as well as real data concerning radioactive contamination of the soil in Switzerland after the Chernobyl accident.
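One simple way a neighborhood could depend on an unknown parameter is sketched below: the adjacency of two regions is determined by a distance threshold theta, and the usual CAR precision matrix is built from that adjacency. The SNCAR article's exact parameterisation may well differ; this only illustrates the construction, with made-up coordinates and parameter values.

import numpy as np

def car_precision(coords, theta, rho=0.9, tau=1.0):
    """CAR precision Q = tau * (D - rho * W), with W_ij = 1 iff 0 < dist_ij <= theta."""
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    W = ((d <= theta) & (d > 0)).astype(float)     # theta-dependent neighborhood
    D = np.diag(W.sum(axis=1))
    return tau * (D - rho * W)

rng = np.random.default_rng(4)
coords = rng.uniform(0, 10, size=(50, 2))          # irregular region centroids
Q = car_precision(coords, theta=2.5)
print("average number of neighbors per region:", np.diag(Q).mean())  # diag(Q) = tau * degree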

Relevance: 30.00%

Abstract:

In this paper, we introduce the Stochastic Adams-Bashforth (SAB) and Stochastic Adams-Moulton (SAM) methods as an extension of the tau-leaping framework that makes use of past information. Using the theta-trapezoidal tau-leap method of weak order two as a starting procedure, we show that the k-step SAB method with k >= 3 is of order three in the mean and correlation, while a predictor-corrector implementation of the SAM method is of weak order three in the mean but only order one in the correlation. These convergence results have been derived analytically for linear problems and successfully tested numerically for both linear and non-linear systems. A series of additional examples has been implemented to demonstrate the efficacy of this approach.
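For orientation, here is a minimal sketch of a single explicit tau-leap step, the framework that the SAB/SAM multistep methods above extend with past information. The reaction system and rate constants are illustrative, and the multistep weights of the SAB/SAM schemes themselves are not reproduced here.

import numpy as np

rng = np.random.default_rng(5)

def tau_leap_step(x, propensities, nu, tau):
    """x: state vector; propensities(x): rates a_j(x); nu: stoichiometry (reactions x species)."""
    a = propensities(x)
    k = rng.poisson(a * tau)              # number of firings of each reaction channel
    return x + k @ nu

# Illustrative birth-death system:  0 -> X  (rate c1),  X -> 0  (rate c2 * X)
c1, c2 = 10.0, 0.1
propensities = lambda x: np.array([c1, c2 * x[0]])
nu = np.array([[+1], [-1]])

x = np.array([0.0])
for _ in range(1000):
    x = np.maximum(tau_leap_step(x, propensities, nu, tau=0.1), 0.0)
print("population after T = 100:", x[0], "(theoretical mean c1/c2 =", c1 / c2, ")")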

Relevance: 30.00%

Abstract:

Stochastic structural systems having a stochastic distribution of material properties and stochastic external loadings in space are analysed when a crack of deterministic size is present. The material properties and external loadings are considered to constitute independent, two-dimensional, univariate, real, homogeneous stochastic fields. The stochastic fields are characterized by their means, variances, autocorrelation functions or the equivalent power spectral density functions, and scales of fluctuation. The Young's modulus and Poisson's ratio are treated as stochastic quantities, and the external loading is treated as a stochastic field in space. The energy release rate is derived using the method of virtual crack extension. A deterministic relationship is derived to represent the sensitivities of the energy release rate with respect to both virtual crack extension and real system parameter fluctuations. A Taylor series expansion, truncated at first order, is used; this leads to the determination of the second-order properties of the output quantities to first order. Using linear perturbations about the mean values, statistical information about the energy release rates, stress intensity factors (SIFs) and crack opening displacements is obtained. Both plane stress and plane strain cases are considered. General expressions for the SIF in all three fracture modes are derived and a more detailed analysis is conducted for the mode I situation. A numerical example is given.
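A minimal sketch of the first-order (linear perturbation) propagation described above, applied to the textbook plane-stress mode-I expression G = sigma^2 * pi * a / E with a unit geometry factor, is given below. The paper's structural-system formulation is far more general (the sensitivities come from virtual crack extension on the discretised structure); this only illustrates truncating the Taylor expansion at first order, with invented numbers.

import numpy as np

a = 0.01                                        # deterministic crack size (m)
mean = {"E": 200e9, "sigma": 100e6}             # means of Young's modulus (Pa) and load (Pa)
var = {"E": (10e9) ** 2, "sigma": (5e6) ** 2}   # variances, independent inputs assumed

def G(E, sigma):
    return sigma ** 2 * np.pi * a / E

# Sensitivities evaluated at the mean (analytic for this simple expression).
dG_dE = -mean["sigma"] ** 2 * np.pi * a / mean["E"] ** 2
dG_dsigma = 2 * mean["sigma"] * np.pi * a / mean["E"]

mean_G = G(mean["E"], mean["sigma"])                              # first-order mean
var_G = dG_dE ** 2 * var["E"] + dG_dsigma ** 2 * var["sigma"]     # first-order variance
print(f"E[G] ~ {mean_G:.1f} J/m^2, std[G] ~ {np.sqrt(var_G):.1f} J/m^2")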

Relevance: 30.00%

Abstract:

In this paper, we study two multi-dimensional goodness-of-fit tests for spectrum sensing in cognitive radios. The multi-dimensional scenario refers to multiple CR nodes, each with multiple antennas, that record multiple observations from multiple primary users for spectrum sensing. These tests, viz. the interpoint distance (ID) based test and the h, f distance based test, are constructed using the properties of stochastic distances. The ID test is studied in detail for the single CR node case, and a possible extension to handle multiple nodes is discussed. The h, f test, on the other hand, is applicable in a multi-node setup. A robustness feature of the KL distance based test is discussed, which has connections with Middleton's class A model. Through Monte Carlo simulations, the proposed tests are shown to outperform existing techniques such as the eigenvalue ratio based test, John's test, and the sphericity test in several scenarios.
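The sketch below only illustrates the kind of interpoint-distance comparison such tests build on; it is a generic energy-distance style statistic, not the paper's ID test statistic, and the signal model (4 antennas, a constant offset for the primary-user signal) is purely illustrative. Large values of the statistic indicate that the received samples differ from noise-only calibration samples.

import numpy as np
from scipy.spatial.distance import cdist

def interpoint_statistic(x, y):
    """Energy-distance style statistic built from interpoint distances of samples x and y."""
    dxy = cdist(x, y).mean()
    dxx = cdist(x, x).mean()
    dyy = cdist(y, y).mean()
    return 2 * dxy - dxx - dyy

rng = np.random.default_rng(6)
noise_ref = rng.normal(size=(200, 4))              # calibration samples: noise only
h0_obs = rng.normal(size=(200, 4))                 # observation, no primary user
h1_obs = rng.normal(size=(200, 4)) + 0.5           # observation with a primary-user signal
print("statistic under H0:", interpoint_statistic(h0_obs, noise_ref))
print("statistic under H1:", interpoint_statistic(h1_obs, noise_ref))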

Relevance: 30.00%

Abstract:

In this work we extend to the multistage case two recent risk-averse measures for two-stage stochastic programs based on first- and second-order stochastic dominance constraints induced by mixed-integer linear recourse. Additionally, we consider Time Stochastic Dominance (TSD) along a given horizon. Given the dimensions of medium-sized problems augmented by the new variables and constraints required by those risk measures, it is unrealistic to solve the problem to optimality by plain use of MIP solvers within a reasonable computing time. Instead, decomposition algorithms of some type should be used. We present an extension of our Branch-and-Fix Coordination algorithm, named BFC-TSD, in which special treatment is given to cross-scenario-group constraints that link variables from different scenario groups. Broad computational experience is presented comparing the risk-neutral approach and the tested risk-averse strategies. The performance of the new version of the BFC algorithm versus the plain use of a state-of-the-art MIP solver is also reported.
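For readers unfamiliar with the underlying relation, here is a minimal sketch of the first-order stochastic dominance check for finite scenario sets: distribution X dominates benchmark Y to first order iff P(X <= t) <= P(Y <= t) for every threshold t. The paper embeds such conditions as mixed-integer constraints inside a multistage program; the code only illustrates the check itself, with invented scenario data.

import numpy as np

def first_order_dominates(x, x_prob, y, y_prob):
    """True if the discrete distribution (x, x_prob) first-order dominates (y, y_prob)."""
    thresholds = np.union1d(x, y)
    cdf_x = np.array([x_prob[x <= t].sum() for t in thresholds])
    cdf_y = np.array([y_prob[y <= t].sum() for t in thresholds])
    return bool(np.all(cdf_x <= cdf_y + 1e-12))

profits = np.array([50.0, 80.0, 120.0])          # scenario profits of a candidate policy
benchmark = np.array([40.0, 80.0, 100.0])        # benchmark profits
p = np.array([1 / 3, 1 / 3, 1 / 3])
print(first_order_dominates(profits, p, benchmark, p))   # True: the policy dominates the benchmark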

Relevance: 30.00%

Abstract:

We report a Monte Carlo representation of the long-term inter-annual variability of monthly snowfall on a detailed (1 km) grid of points throughout the southwest. An extension of the local climate model of the southwestern United States (Stamm and Craig 1992) provides spatially based estimates of the mean and variance of monthly temperature and precipitation. The mean is the expected value from a canonical regression using independent variables that represent controls on climate in this area, including orography. The variance is computed as the standard error of the prediction and provides site-specific measures of (1) natural sources of variation and (2) errors due to limitations of the data and poor distribution of climate stations. Simulation of monthly temperature and precipitation over a sequence of years is achieved by drawing from a bivariate normal distribution. The conditional expectation of precipitation, given temperature in each month, is the basis of a numerical integration of the normal probability distribution of log precipitation below a threshold temperature (3°C) to determine snowfall as a percentage of total precipitation. Snowfall predictions are tested at stations for which long-term records are available. At Donner Memorial State Park (elevation 1811 meters) a 34-year simulation, matching the length of the instrumental record, is within 15 percent of the observed mean annual snowfall. We also compute the resulting snowpack using a variation of the model of Martinec et al. (1983). This allows additional tests by examining the spatial patterns of predicted snowfall and snowpack and their hydrologic implications.
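A minimal sketch of the simulation step described above follows: monthly temperature and log precipitation are drawn from a bivariate normal with a site-specific mean and covariance, and snowfall is taken as the share of precipitation falling when temperature is below the 3°C threshold. The means, variances and correlation below are invented, and the snow share is approximated by Monte Carlo rather than the paper's numerical integration of the conditional distribution.

import numpy as np

rng = np.random.default_rng(7)
mu = np.array([1.0, np.log(80.0)])        # mean monthly temperature (degC), mean log precipitation (mm)
sd = np.array([3.0, 0.5])
corr = -0.3                               # illustrative temperature-precipitation correlation
cov = np.array([[sd[0] ** 2, corr * sd[0] * sd[1]],
                [corr * sd[0] * sd[1], sd[1] ** 2]])

t, log_p = rng.multivariate_normal(mu, cov, size=100_000).T
precip = np.exp(log_p)
snow_fraction = precip[t < 3.0].sum() / precip.sum()
print(f"snowfall as a share of total precipitation: {snow_fraction:.1%}")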

Relevance: 30.00%

Abstract:

Major research on equity index dynamics has investigated only US indices (usually the S&P 500) and has provided contradictory results. In this paper a clarification and extension of that previous research is given. We find that European equity indices have quite different dynamics from the S&P 500. Each of the European indices considered may be satisfactorily modelled using either an affine model with price and volatility jumps or a GARCH volatility process without jumps. The S&P 500 dynamics are much more difficult to capture in a jump-diffusion framework.
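As a pointer to one of the two model classes mentioned above, here is a minimal GARCH(1,1) return simulation. The parameter values are illustrative only and are not the paper's estimates for any index.

import numpy as np

rng = np.random.default_rng(8)
omega, alpha, beta = 1e-6, 0.08, 0.90      # persistence alpha + beta < 1
n = 2500
r = np.empty(n)
h = omega / (1 - alpha - beta)             # start at the unconditional variance
for t in range(n):
    r[t] = np.sqrt(h) * rng.standard_normal()        # daily return
    h = omega + alpha * r[t] ** 2 + beta * h          # next-day conditional variance
print(f"annualised volatility ~ {np.sqrt(252) * r.std():.1%}")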

Relevance: 30.00%

Abstract:

This paper considers an extension of the skew-normal model through the inclusion of an additional parameter which can lead to both uni- and bi-modal distributions. The paper presents various basic properties of this family of distributions and provides a stochastic representation which is useful for obtaining theoretical properties and for simulating from the distribution. Moreover, the singularity of the Fisher information matrix is investigated, and maximum likelihood estimation for a random sample with no covariates is considered. The main motivation is thus to avoid using mixtures in fitting bimodal data, as these are well known to be complicated to deal with, particularly because of identifiability problems. Data-based illustrations show that such a model can be useful. Copyright (C) 2009 John Wiley & Sons, Ltd.
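For background, the sketch below uses the classical stochastic representation of the skew-normal law that constructions like the one above build on: with delta = alpha / sqrt(1 + alpha^2), Z = delta * |U0| + sqrt(1 - delta^2) * U1 with U0, U1 iid N(0, 1) is skew-normal with shape alpha. The paper's additional parameter and the resulting bimodal family are not reproduced here.

import numpy as np

def rskewnormal(n, alpha, rng):
    """Simulate n skew-normal(0, 1, alpha) variates via the stochastic representation."""
    delta = alpha / np.sqrt(1.0 + alpha ** 2)
    u0, u1 = rng.standard_normal((2, n))
    return delta * np.abs(u0) + np.sqrt(1.0 - delta ** 2) * u1

rng = np.random.default_rng(9)
z = rskewnormal(100_000, alpha=4.0, rng=rng)
print(f"sample mean {z.mean():.3f}, sample skewness "
      f"{((z - z.mean()) ** 3).mean() / z.std() ** 3:.3f}")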

Relevance: 30.00%

Abstract:

We consider risk-averse convex stochastic programs expressed in terms of extended polyhedral risk measures. We derive computable confidence intervals on the optimal value of such stochastic programs using the Robust Stochastic Approximation and the Stochastic Mirror Descent (SMD) algorithms. When the objective functions are uniformly convex, we also propose a multistep extension of the Stochastic Mirror Descent algorithm and obtain confidence intervals on both the optimal values and optimal solutions. Numerical simulations show that our confidence intervals are much less conservative and are quicker to compute than previously obtained confidence intervals for SMD and that the multistep Stochastic Mirror Descent algorithm can obtain a good approximate solution much quicker than its nonmultistep counterpart. Our confidence intervals are also more reliable than asymptotic confidence intervals when the sample size is not much larger than the problem size.
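For orientation, here is a minimal sketch of stochastic mirror descent with the entropic mirror map on the probability simplex, the kind of algorithm for which such confidence intervals are derived. The toy objective, stepsize and noise level are illustrative and not taken from the paper.

import numpy as np

rng = np.random.default_rng(10)
c = np.array([0.3, 0.1, 0.5, 0.2])        # minimise E[(c + noise) . x] over the simplex
n_iter, gamma = 5000, 0.05

x = np.full(len(c), 1.0 / len(c))         # start at the uniform distribution
x_avg = np.zeros_like(x)
for _ in range(n_iter):
    g = c + 0.1 * rng.standard_normal(len(c))      # unbiased stochastic gradient
    x = x * np.exp(-gamma * g)                     # entropic (multiplicative) update
    x /= x.sum()                                   # renormalise onto the simplex
    x_avg += x / n_iter
print("averaged iterate:", np.round(x_avg, 3), "-> mass concentrates on argmin c")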

Relevance: 30.00%

Abstract:

In this work, we report some results on the stochastic quantization of the spherical model. We start by reviewing some basic aspects of this method, with emphasis on the connection between the Langevin equation and supersymmetric quantum mechanics, aiming at applying this connection to the spherical model. An intuitive idea is that, when applied to the spherical model, this gives rise to a supersymmetric version that is identified with the one studied in Phys. Rev. E 85, 061109 (2012). Before investigating this aspect in detail, we study the stochastic quantization of the mean spherical model, which is simpler to implement than the model with the strict constraint. We also highlight some points concerning more traditional methods discussed in the literature, such as canonical and path-integral quantization. To produce a supersymmetric version grounded in the Nicolai map, we investigate the stochastic quantization of the strict spherical model. We show that the result of this process is an off-shell supersymmetric extension of the quantum spherical model (with the precise supersymmetric constraint structure). This analysis establishes a connection between the classical model and its supersymmetric quantum counterpart. The supersymmetric version constructed in this way is more natural and gives further support and motivation to investigate similar connections in other models in the literature.
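To fix ideas, the sketch below shows the stochastic-quantization mechanism mentioned above in its simplest form: a field evolves in a fictitious time via a Langevin equation whose drift is minus the gradient of the action, so that the stationary distribution is proportional to exp(-S). A zero-dimensional toy action is used; the spherical-model constraint and the supersymmetric structure discussed in the abstract are not reproduced here.

import numpy as np

rng = np.random.default_rng(11)
m2, lam = 1.0, 0.5                          # toy action S = m2 * phi^2 / 2 + lam * phi^4 / 4
dS = lambda phi: m2 * phi + lam * phi ** 3  # gradient of the action

dt, n_steps, n_burn = 1e-3, 500_000, 50_000
phi, samples = 0.0, []
for step in range(n_steps):
    # Euler-Maruyama step of d(phi) = -dS(phi) dtau + sqrt(2) dW
    phi += -dS(phi) * dt + np.sqrt(2 * dt) * rng.standard_normal()
    if step >= n_burn:
        samples.append(phi)
print("<phi^2> from the Langevin stationary state:", np.mean(np.square(samples)))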