986 results for Stochastic Approach


Relevance:

30.00%

Publisher:

Abstract:

Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)

Relevance:

30.00%

Publisher:

Abstract:

Ionospheric scintillations are caused by time-varying electron density irregularities in the ionosphere, occurring more often at equatorial and high latitudes. This paper focuses exclusively on experiments undertaken in Europe, at geographic latitudes between ~50°N and ~80°N, where a network of GPS receivers capable of monitoring Total Electron Content and ionospheric scintillation parameters was deployed. The widely used ionospheric scintillation indices S4 and σφ represent a practical measure of the intensity of amplitude and phase scintillation affecting GNSS receivers. However, they do not provide sufficient information regarding the actual tracking errors that degrade GNSS receiver performance. Suitable receiver tracking models, sensitive to ionospheric scintillation, allow the computation of the variance of the output error of the receiver PLL (Phase Locked Loop) and DLL (Delay Locked Loop), which expresses the quality of the range measurements used by the receiver to calculate user position. The ability of such models to incorporate phase and amplitude scintillation effects into the variance of these tracking errors underpins our proposed method of applying relative weights to measurements from different satellites. This gives the least-squares stochastic model used for position computation a more realistic representation than the otherwise 'equal weights' model. For pseudorange processing, relative weights were computed so that a 'scintillation-mitigated' solution could be performed and compared to the (non-mitigated) 'equal weights' solution. An improvement of between 17% and 38% in height accuracy was achieved when an epoch-by-epoch differential solution was computed over baselines ranging from 1 to 750 km. The method was then compared with alternative approaches that can be used to improve the least-squares stochastic model, such as weighting according to satellite elevation angle or by the inverse of the square of the standard deviation of the code/carrier divergence (σCCDiv). The influence of multipath effects on the proposed mitigation approach is also discussed. With the use of high-rate scintillation data in addition to the scintillation indices, a carrier-phase-based mitigated solution was also implemented and compared with the conventional solution. During a period of high phase scintillation it was observed that problems related to ambiguity resolution can be reduced by the use of the proposed mitigated solution.
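
At its core, the proposed mitigation is a reweighting of the least-squares stochastic model: each satellite's measurement is weighted by the inverse of its scintillation-sensitive PLL/DLL tracking-error variance rather than receiving an equal weight. The Python sketch below illustrates only that idea; the matrix shapes, variable names and the function itself are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def weighted_position_fix(G, prefit_residuals, tracking_var):
    """One weighted least-squares position/clock update.

    G                -- (n_sat x 4) design matrix (line-of-sight unit vectors + clock column)
    prefit_residuals -- observed-minus-computed pseudoranges (metres)
    tracking_var     -- per-satellite tracking-error variances (m^2), e.g. derived
                        from S4 and sigma_phi through a PLL/DLL tracking model
    """
    W = np.diag(1.0 / np.asarray(tracking_var, dtype=float))  # inverse-variance weights
    N = G.T @ W @ G                                           # weighted normal matrix
    dx = np.linalg.solve(N, G.T @ W @ prefit_residuals)       # state update
    return dx, np.linalg.inv(N)                               # estimate and its formal covariance

# The 'equal weights' solution is recovered by passing tracking_var = np.ones(n_sat).
```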

Relevance:

30.00%

Publisher:

Abstract:

Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)

Relevance:

30.00%

Publisher:

Abstract:

Using the Langevin approach for stochastic processes, we study the renormalizability of the massive Thirring model. At finite fictitious time, we prove the absence of induced quadrilinear counterterms by verifying the cancellation of the divergences of graphs with four external lines. This implies that the vanishing of the renormalization group beta function already occurs at finite times.
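
For context, the Parisi-Wu scheme referred to here evolves the fields in a fictitious time τ through a Langevin equation; written for a generic bosonic field it reads (for fermionic fields such as those of the Thirring model, Grassmann-valued fields and a suitable kernel are used instead):

$$
\frac{\partial \phi(x,\tau)}{\partial \tau} = -\frac{\delta S[\phi]}{\delta \phi(x,\tau)} + \eta(x,\tau),
\qquad
\langle \eta(x,\tau)\,\eta(x',\tau') \rangle = 2\,\delta(x-x')\,\delta(\tau-\tau'),
$$

and equilibrium (Euclidean) correlation functions are recovered in the limit τ → ∞; the result quoted above concerns the behaviour already at finite τ.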

Relevance:

30.00%

Publisher:

Abstract:

We study the 1/N expansion of field theories in the stochastic quantization method of Parisi and Wu using the supersymmetric functional approach. This formulation provides a systematic procedure to implement the 1/N expansion which resembles those used in the equilibrium theory. The 1/N perturbation theory for the nonlinear sigma model in two dimensions is worked out as an example.
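
For reference, the example mentioned is usually set up as follows (a common convention; the paper's normalization may differ): the two-dimensional O(N) nonlinear sigma model

$$
S[\varphi] = \frac{N}{2g}\int d^2x\; \partial_\mu \varphi^a\,\partial^\mu \varphi^a,
\qquad \varphi^a \varphi^a = 1,\quad a = 1,\dots,N,
$$

where the constraint is enforced by a Lagrange-multiplier field whose propagator is of order 1/N, so that, with N factored out of the action, diagrams organize themselves into powers of 1/N.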

Relevance:

30.00%

Publisher:

Abstract:

Wind-excited vibrations in the frequency range of 10 to 50 Hz due to vortex shedding often cause fatigue failures in the cables of overhead transmission lines. Damping devices, such as Stockbridge dampers, have long been in use for suppressing these vibrations. The dampers are conveniently modelled by means of their driving-point impedance, measured in the lab over the frequency range under consideration. The cables can be modelled as strings with an additional small bending stiffness. The main difficulty in modelling the vibrations, however, lies in the aerodynamic forces, which are usually approximated by the forces acting on a rigid cylinder in planar flow. In the present paper, the wind forces are represented by stochastic processes with arbitrary cross-correlation in space; the case of a Kármán vortex street on a rigid cylinder in planar flow is contained as a limit case in this approach. The authors believe that this new view of the problem may yield useful results, particularly concerning the reliability of the lines and the probability of fatigue damage. © 1987.
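
A standard way to write such a model, consistent with the description above (generic notation, not necessarily the paper's), is a taut string with a small bending-stiffness correction driven by a distributed random lift force:

$$
m\,\ddot w(x,t) + EI\,w''''(x,t) - T\,w''(x,t) = f(x,t),
\qquad
f_s = \mathrm{St}\,\frac{U}{D},
$$

where w(x,t) is the transverse cable displacement, T the tension, EI the (small) bending stiffness, f(x,t) the wind force modelled as a stochastic process with prescribed cross-correlation in x, and f_s the Kármán vortex-shedding frequency for wind speed U, conductor diameter D and Strouhal number St ≈ 0.2.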

Relevance:

30.00%

Publisher:

Abstract:

Power-law distributions, i.e. Lévy flights, have been observed in various economic, biological, and physical systems in the high-frequency regime. These distributions can be successfully explained via the gradually truncated Lévy flight (GTLF). In general, these systems converge to a Gaussian distribution in the low-frequency regime. In the present work, we develop a model for the physical basis of the cut-off length in the GTLF and its variation with respect to the time interval between successive observations. We observe that the GTLF automatically approaches a Gaussian distribution in the low-frequency regime. We applied the present method to analyze time series in some physical and financial systems. The agreement between the experimental results and the theoretical curves is excellent. The present method can be applied to analyze time series in a variety of fields, which in turn provides a basis for the development of further microscopic models for the system. © 2000 Elsevier Science B.V. All rights reserved.
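
A gradually truncated Lévy flight keeps the power-law body of the step distribution but damps the tail smoothly beyond a cut-off length, so that sums of many steps (the low-frequency regime) become Gaussian. The sketch below is a minimal grid-based illustration of that idea; the particular power-law body, damping function and parameter names are assumptions, not the specific GTLF form developed in the paper.

```python
import numpy as np

def gtlf_steps(n, alpha=1.5, cutoff=5.0, k=2.0, x_max=50.0, n_grid=20001, seed=None):
    """Draw steps from a heavy-tailed density whose tail is gradually truncated.

    The body decays like |x|^-(1+alpha); for |x| > cutoff the density is
    multiplied by exp(-((|x| - cutoff)/k)^2), damping the tail smoothly."""
    rng = np.random.default_rng(seed)
    x = np.linspace(-x_max, x_max, n_grid)
    body = (1.0 + np.abs(x)) ** (-(1.0 + alpha))
    damp = np.where(np.abs(x) > cutoff,
                    np.exp(-((np.abs(x) - cutoff) / k) ** 2), 1.0)
    p = body * damp
    p /= p.sum()
    return rng.choice(x, size=n, p=p)

# Aggregating steps mimics the low-frequency regime: the sums are close to Gaussian.
low_freq = gtlf_steps(100_000, seed=1).reshape(1000, 100).sum(axis=1)
print(low_freq.mean(), low_freq.std())
```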

Relevance:

30.00%

Publisher:

Abstract:

This paper deals with exponential stability of discrete-time singular systems with Markov jump parameters. We propose a set of coupled generalized Lyapunov equations (CGLE) that provides sufficient conditions to check this property for this class of systems. A method for solving the obtained CGLE is also presented, based on iterations of standard singular Lyapunov equations. A numerical example is also presented to illustrate the effectiveness of the proposed approach.
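
The flavour of such coupled equations and of their iterative solution can be seen in the non-singular special case (E = I), for which the coupled Lyapunov equations of a discrete-time Markov jump linear system admit a simple fixed-point iteration. The sketch below covers only that special case; the singular case treated in the paper requires the generalized equations.

```python
import numpy as np

def coupled_lyapunov(A, Pi, Q, n_iter=1000, tol=1e-10):
    """Fixed-point iteration for A_i^T (sum_j Pi[i,j] P_j) A_i - P_i + Q_i = 0.

    A  -- list of mode matrices A_i
    Pi -- (N x N) Markov transition probability matrix
    Q  -- list of symmetric positive-definite matrices Q_i
    Converges to positive-definite solutions iff the jump system is mean-square stable."""
    N = len(A)
    P = [np.array(Qi, dtype=float) for Qi in Q]
    for _ in range(n_iter):
        coupled = [sum(Pi[i, j] * P[j] for j in range(N)) for i in range(N)]
        P_new = [A[i].T @ coupled[i] @ A[i] + Q[i] for i in range(N)]
        if max(np.max(np.abs(P_new[i] - P[i])) for i in range(N)) < tol:
            return P_new
        P = P_new
    return P

# Two-mode example with a mean-square-stable jump system:
A = [np.array([[0.5, 0.1], [0.0, 0.3]]), np.array([[0.2, 0.0], [0.1, 0.4]])]
Pi = np.array([[0.9, 0.1], [0.3, 0.7]])
Q = [np.eye(2), np.eye(2)]
print(coupled_lyapunov(A, Pi, Q)[0])
```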

Relevance:

30.00%

Publisher:

Abstract:

The transcription process is crucial to life, and the enzyme RNA polymerase (RNAP) is the major component of the transcription machinery. The development of single-molecule techniques, such as magnetic and optical tweezers, atomic-force microscopy and single-molecule fluorescence, has increased our understanding of the transcription process and complements traditional biochemical studies. Based on these studies, theoretical models have been proposed to explain and predict the kinetics of the RNAP during polymerization, highlighting the results achieved by models based on the thermodynamic stability of the transcription elongation complex. However, experiments showed that if more than one RNAP initiates from the same promoter, the transcription behavior changes slightly and new phenomena are observed. We proposed and implemented a theoretical model that considers collisions between RNAPs and predicts their cooperative behavior during multi-round transcription, generalizing the Bai et al. stochastic sequence-dependent model. In our approach, collisions between elongating enzymes modify their transcription rate values. We performed the simulations in Mathematica® and compared the results of single- and multiple-molecule transcription with experimental results and other theoretical models. Our multi-round approach recovers several expected behaviors, showing that the transcription process for the studied sequences can be accelerated by up to 48% when collisions are allowed: the dwell times at pause sites are reduced, as is the distance that the RNAPs backtrack from backtracking sites. © 2013 Costa et al.
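
The qualitative mechanism (a trailing polymerase catching up with a paused one and pushing it forward) can be conveyed by a toy exclusion-process simulation. The sketch below is not the Bai et al. model nor the authors' Mathematica implementation; the rates, pause sites, footprint and collision rule are illustrative assumptions.

```python
import numpy as np

def simulate_rnaps(n_pol=3, length=300, k_step=10.0, k_pause_exit=0.5,
                   pause_sites=(80, 180), footprint=35, t_max=300.0, seed=None):
    """Toy multi-RNAP elongation with steric exclusion (positions in base pairs).

    Each RNAP steps forward at rate k_step, except at pause sites, where it
    escapes at k_pause_exit unless the trailing RNAP has caught up ('collision'),
    in which case it escapes at the faster rate k_step."""
    rng = np.random.default_rng(seed)
    pos = np.array([-i * footprint for i in range(n_pol)])   # staggered loading
    t = 0.0
    while t < t_max and pos.min() < length:
        rates = np.zeros(n_pol)
        for i in range(n_pol):
            blocked = i > 0 and pos[i - 1] - pos[i] <= footprint
            pushed = i < n_pol - 1 and pos[i] - pos[i + 1] <= footprint
            if blocked or pos[i] >= length:
                continue                                     # cannot move this step
            rates[i] = k_step if (pos[i] not in pause_sites or pushed) else k_pause_exit
        if rates.sum() == 0.0:
            break
        t += rng.exponential(1.0 / rates.sum())              # Gillespie-style time step
        pos[rng.choice(n_pol, p=rates / rates.sum())] += 1
    return pos, t

print(simulate_rnaps(seed=7))
```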

Relevance:

30.00%

Publisher:

Abstract:

Consider a one-dimensional environment with N randomly distributed sites. An agent explores this random medium moving deterministically with a spatial memory μ. A crossover from local to global exploration occurs in one dimension at a well-defined memory value μ₁ = log₂ N. In its stochastic version, the dynamics is ruled by the memory and by a temperature T, which affects the hopping displacement. This dynamics also shows a crossover in one dimension, obtained computationally, between exploration schemes, now also characterized by the trajectory size N_p (an aging effect). In this paper we provide an analytical approach considering a modified stochastic version in which the parameter T plays the role of a maximum hopping distance. This modification allows us to obtain a general analytical expression for the crossover as a function of the parameters μ, T, and N_p. In contrast to what has been proposed by previous studies, we find that the crossover occurs in any dimension d. These results have been validated by numerical experiments and may be of great value for fixing optimal parameters in search algorithms. © 2013 American Physical Society.
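
One concrete way to read the modified stochastic dynamics described above is: from the current site, jump to the nearest site that was not visited in the previous μ steps and that lies no farther than the maximum hopping distance T. The sketch below implements only that reading, with illustrative parameters; it is not the authors' exact model or their analytical derivation.

```python
import numpy as np

def explore(n_sites=1000, mu=5, T=0.02, n_steps=2000, seed=None):
    """Walk over N sites placed uniformly at random on [0, 1): at each step, move
    to the nearest site not visited within the last mu steps and within distance T.
    Returns the visited trajectory (its length plays the role of N_p)."""
    rng = np.random.default_rng(seed)
    sites = np.sort(rng.random(n_sites))
    path = [int(rng.integers(n_sites))]
    for _ in range(n_steps):
        d = np.abs(sites - sites[path[-1]])
        d[path[-mu:]] = np.inf          # memory: recently visited sites are forbidden
        d[d > T] = np.inf               # maximum hopping distance
        if not np.isfinite(d).any():
            break                       # trapped: no admissible site within range
        path.append(int(np.argmin(d)))
    return path

print(len(explore(seed=3)))
```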

Relevance:

30.00%

Publisher:

Abstract:

Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)

Relevance:

30.00%

Publisher:

Abstract:

We model complex non-linear interactions between banks and assets by two time-dependent Erdős-Rényi network models, where each node, representing a bank, can invest either in a single asset (model I) or in multiple assets (model II). We use a dynamical network approach to evaluate the collective financial failure (systemic risk), quantified by the fraction of active nodes. The systemic risk can be calculated over any future time period, divided into sub-periods, where within each sub-period banks may contiguously fail due to links to either i) assets or ii) other banks, controlled by two parameters: the probability of internal failure p and the threshold T_h (a "solvency" parameter). The systemic risk decreases with the average network degree faster when all assets are equally distributed across banks than when assets are randomly distributed. The more inactive banks each bank can sustain (smaller T_h), the smaller the systemic risk; for some T_h values in model I we report a discontinuity in the systemic risk. When the contiguous spreading ii) becomes stochastic, controlled by a probability p_2 (so that the condition for a bank to be solvent/active is itself stochastic), the systemic risk decreases with decreasing p_2. We analyse the asset allocation for the U.S. banks. Copyright © EPLA, 2014
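
A minimal simulation of the single-asset variant (model I) conveys the dynamics: banks sit on an Erdős-Rényi graph and, in each sub-period, a bank fails either internally with probability p or through contagion when too few of its neighbours remain active relative to the threshold T_h. The failure rule, parameters and names below are illustrative assumptions, not the authors' specification.

```python
import numpy as np

def active_fraction(n_banks=200, avg_degree=6.0, p_internal=0.01, th=0.5,
                    n_periods=20, seed=None):
    """Fraction of banks still active after n_periods of a toy contagion process
    on an Erdos-Renyi interbank network (the abstract gauges systemic risk by
    the fraction of active nodes)."""
    rng = np.random.default_rng(seed)
    adj = np.triu(rng.random((n_banks, n_banks)) < avg_degree / (n_banks - 1), 1)
    adj = adj | adj.T                                        # symmetric, no self-links
    active = np.ones(n_banks, dtype=bool)
    for _ in range(n_periods):
        internal = rng.random(n_banks) < p_internal          # i) idiosyncratic failures
        degree = adj.sum(axis=1)
        frac_active = (adj & active[None, :]).sum(axis=1) / np.maximum(degree, 1)
        contagion = (degree > 0) & (frac_active < th)        # ii) neighbour-driven failures
        active &= ~(internal | contagion)
    return active.mean()

print(active_fraction(seed=0))
```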

Relevance:

30.00%

Publisher:

Abstract:

Item response theory (IRT) comprises a set of statistical models which are useful in many fields, especially when there is an interest in studying latent variables (or latent traits). Usually such latent traits are assumed to be random variables and a convenient distribution is assigned to them. A very common choice for such a distribution has been the standard normal. Recently, Azevedo et al. [Bayesian inference for a skew-normal IRT model under the centred parameterization, Comput. Stat. Data Anal. 55 (2011), pp. 353-365] proposed a skew-normal distribution under the centred parameterization (SNCP), as studied in [R.B. Arellano-Valle and A. Azzalini, The centred parametrization for the multivariate skew-normal distribution, J. Multivariate Anal. 99(7) (2008), pp. 1362-1382], to model the latent trait distribution. This approach allows one to represent any asymmetric behaviour of the latent trait distribution. They also developed a Metropolis-Hastings within Gibbs sampling (MHWGS) algorithm based on the density of the SNCP and showed that the algorithm recovers all parameters properly. Their results indicated that, in the presence of asymmetry, the proposed model and the estimation algorithm perform better than the usual model and estimation methods. Our main goal in this paper is to propose another type of MHWGS algorithm based on a stochastic representation (hierarchical structure) of the SNCP studied in [N. Henze, A probabilistic representation of the skew-normal distribution, Scand. J. Statist. 13 (1986), pp. 271-275]. Our algorithm has only one Metropolis-Hastings step, in contrast to the algorithm developed by Azevedo et al., which has two such steps. This not only makes the implementation easier but also reduces the number of proposal densities to be used, which can be a problem in the implementation of MHWGS algorithms, as can be seen in [R.J. Patz and B.W. Junker, A straightforward approach to Markov Chain Monte Carlo methods for item response models, J. Educ. Behav. Stat. 24(2) (1999), pp. 146-178; R.J. Patz and B.W. Junker, The applications and extensions of MCMC in IRT: Multiple item types, missing data, and rated responses, J. Educ. Behav. Stat. 24(4) (1999), pp. 342-366; A. Gelman, G.O. Roberts, and W.R. Gilks, Efficient Metropolis jumping rules, Bayesian Stat. 5 (1996), pp. 599-607]. Moreover, we consider a modified beta prior (which generalizes the one considered in [3]) and a Jeffreys prior for the asymmetry parameter. Furthermore, we study the sensitivity of such priors as well as the use of different kernel densities for this parameter. Finally, we assess the impact of the number of examinees, the number of items and the asymmetry level on parameter recovery. Results of the simulation study indicated that our approach performs as well as that in [3] in terms of parameter recovery, mainly when using the Jeffreys prior. They also indicated that the asymmetry level has the highest impact on parameter recovery, even though it is relatively small. A real data analysis is considered jointly with the development of model-fit assessment tools. The results are compared with those obtained by Azevedo et al., and indicate that the hierarchical approach allows us to implement MCMC algorithms more easily, facilitates convergence diagnostics, and can be very useful for fitting more complex skew IRT models.
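
The stochastic representation referred to (Henze, 1986) is easy to state and to sample from: under the direct parameterization with shape parameter λ, a standard skew-normal variate is δ|U| + √(1 - δ²)V, with U, V independent standard normals and δ = λ/√(1 + λ²). The sketch below only illustrates that representation; mapping it to the centred parameterization and embedding it in the MHWGS algorithm are separate steps described in the paper.

```python
import numpy as np

def rskew_normal(n, lam=3.0, seed=None):
    """Sample from the standard skew-normal SN(0, 1, lam) via Henze's
    stochastic representation: X = delta*|U| + sqrt(1 - delta^2)*V,
    with U, V iid N(0, 1) and delta = lam / sqrt(1 + lam^2)."""
    rng = np.random.default_rng(seed)
    delta = lam / np.sqrt(1.0 + lam ** 2)
    u = np.abs(rng.standard_normal(n))            # half-normal component
    v = rng.standard_normal(n)
    return delta * u + np.sqrt(1.0 - delta ** 2) * v

x = rskew_normal(100_000, lam=3.0, seed=0)
print(x.mean(), np.sqrt(2 / np.pi) * (3.0 / np.sqrt(10.0)))   # empirical vs theoretical mean
```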

Relevance:

30.00%

Publisher:

Abstract:

Competitive learning is an important machine learning approach which is widely employed in artificial neural networks. In this paper, we present a rigorous definition of a new type of competitive learning scheme realized on large-scale networks. The model consists of several particles walking within the network and competing with each other to occupy as many nodes as possible, while attempting to reject intruder particles. Each particle's walking rule is a stochastic combination of random and preferential movements. The model has been applied to solve community detection and data clustering problems. Computer simulations reveal that the proposed technique achieves high precision in community and cluster detection, as well as low computational complexity. Moreover, we have developed an efficient method for estimating the most likely number of clusters by using an evaluator index that monitors the information generated by the competition process itself. We hope this paper will provide an alternative approach to the study of competitive learning.
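
The walking rule described above mixes a uniformly random move with a preferential move biased towards nodes the particle already dominates. The single-step sketch below shows just that combined rule; the mixing parameter, the domination matrix and the surrounding bookkeeping (updating domination levels, rejecting or resetting particles) are illustrative assumptions rather than the authors' exact definitions.

```python
import numpy as np

def particle_step(adj, domination, particle, node, lam=0.6, rng=None):
    """One move of a competing particle on the network.

    adj        -- (n x n) 0/1 adjacency matrix
    domination -- (n_particles x n) nonnegative domination levels
    particle   -- index of the moving particle; node -- its current node
    lam        -- probability of a preferential (exploitation) move
    """
    rng = np.random.default_rng(rng)
    neighbors = np.flatnonzero(adj[node])
    if rng.random() < lam:
        w = domination[particle, neighbors] + 1e-12   # preferential: favour dominated nodes
    else:
        w = np.ones(len(neighbors))                   # random: uniform over neighbours
    return int(rng.choice(neighbors, p=w / w.sum()))
```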

Relevance:

30.00%

Publisher:

Abstract:

Semisupervised learning is a machine learning approach that is able to employ both labeled and unlabeled samples in the training process. In this paper, we propose a semisupervised data classification model based on a combined random-preferential walk of particles in a network (graph) constructed from the input dataset. Particles of the same class cooperate among themselves, while particles of different classes compete with each other to propagate class labels to the whole network. A rigorous model definition is provided via a nonlinear stochastic dynamical system, and a mathematical analysis of its behavior is carried out. A numerical validation presented in this paper confirms the theoretical predictions. An interesting feature brought by the competitive-cooperative mechanism is that the proposed model can achieve good classification rates while exhibiting low computational complexity in comparison to other network-based semisupervised algorithms. Computer simulations conducted on synthetic and real-world datasets reveal the effectiveness of the model.
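
In generic terms (this is a common way to formalize such rules, not necessarily the authors' exact definition), the combined movement of a particle of class c can be written as a convex mixture of a preferential and a random transition matrix:

$$
P^{(c)} = \lambda\,P^{(c)}_{\mathrm{pref}} + (1-\lambda)\,P_{\mathrm{rand}},
\qquad
\big(P_{\mathrm{rand}}\big)_{ij} = \frac{a_{ij}}{\sum_k a_{ik}},
$$

where a_ij is the adjacency matrix of the graph built from the dataset, P^{(c)}_pref biases the walk towards nodes currently dominated by class c (so that particles of the same class cooperate while particles of different classes compete), and λ ∈ [0, 1] balances exploration and exploitation.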