70 results for GENERALIZED SYNCHRONIZATION
Abstract:
We analyze the collective behavior of a lattice model of pulse-coupled oscillators. By means of computer simulations we find the relation between the intrinsic dynamics of each member of the population and their mutual interactions that ensures, in a general context, the existence of a fully synchronized regime. This condition turns out to be the same as that obtained for the globally coupled population. When the condition is not completely satisfied we find different spatial structures. This also gives some hints about self-organized criticality.
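A minimal numerical sketch of such a pulse-coupled oscillator lattice, in the spirit of Mirollo-Strogatz integrate-and-fire units on a one-dimensional ring (all parameter values and names are illustrative, not taken from the paper):

```python
import numpy as np

def simulate_ring(n=20, eps=0.3, steps=2000, dt=0.01, seed=0):
    """Discrete-time sketch of pulse-coupled oscillators on a 1-D ring.

    Phases grow at unit rate; an oscillator that reaches 1 fires, resets
    to 0, and pulls each lattice neighbour's phase up by eps (phases are
    absorbed at 1, so a pulled neighbour fires during the same step)."""
    rng = np.random.default_rng(seed)
    phase = rng.random(n)
    for _ in range(steps):
        phase += dt
        fired = phase >= 1.0
        while fired.any():          # propagate an avalanche of firings
            idx = np.flatnonzero(fired)
            phase[idx] = 0.0
            kick = np.zeros(n)
            np.add.at(kick, (idx - 1) % n, eps)
            np.add.at(kick, (idx + 1) % n, eps)
            # oscillators that already fired this step stay at 0
            phase = np.where(phase > 0.0, phase + kick, phase)
            fired = phase >= 1.0
    return phase
```

Whether the population ends up fully synchronized or organized into spatial structures depends, as the abstract notes, on the interplay between the intrinsic dynamics and the coupling strength `eps`.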
Abstract:
We develop a general theory for percolation in directed random networks with arbitrary two-point correlations and bidirectional edges, that is, edges pointing in both directions simultaneously. These two ingredients alter the previously known scenario and open new views and perspectives on percolation phenomena. Equations for the percolation threshold and the sizes of the giant components are derived in the most general case. We also present simulation results for a particular example of an uncorrelated network with bidirectional edges, confirming the theoretical predictions.
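The percolation setting described here can also be probed numerically. The following is a hedged sketch, not the paper's method: it builds a directed random graph in which each unordered pair may additionally receive a bidirectional edge (both directions at once), and measures the largest strongly connected component with Kosaraju's algorithm. All names and parameter values are illustrative.

```python
import random
from collections import defaultdict

def largest_scc(n, p, q, seed=0):
    """Largest strongly connected component of a directed random graph.

    Each ordered pair gets a one-way edge with probability p; each
    unordered pair additionally gets a bidirectional edge (both
    directions simultaneously) with probability q."""
    rng = random.Random(seed)
    fwd, bwd = defaultdict(list), defaultdict(list)
    def add(u, v):
        fwd[u].append(v); bwd[v].append(u)
    for u in range(n):
        for v in range(u + 1, n):
            if rng.random() < p: add(u, v)
            if rng.random() < p: add(v, u)
            if rng.random() < q: add(u, v); add(v, u)
    # Pass 1: order vertices by DFS finish time on the forward graph.
    seen, order = set(), []
    for s in range(n):
        if s in seen:
            continue
        seen.add(s)
        stack = [(s, iter(fwd[s]))]
        while stack:
            u, it = stack[-1]
            advanced = False
            for v in it:
                if v not in seen:
                    seen.add(v)
                    stack.append((v, iter(fwd[v])))
                    advanced = True
                    break
            if not advanced:
                order.append(u)
                stack.pop()
    # Pass 2: sweep the transpose graph in reverse finishing order.
    seen, best = set(), 0
    for s in reversed(order):
        if s in seen:
            continue
        size, stack = 0, [s]
        seen.add(s)
        while stack:
            u = stack.pop()
            size += 1
            for v in bwd[u]:
                if v not in seen:
                    seen.add(v)
                    stack.append(v)
        best = max(best, size)
    return best
```

Above the percolation threshold (mean degree greater than one) a giant strongly connected component appears; bidirectional edges (`q > 0`) promote it, since they create two-cycles directly.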
Abstract:
We analyze the physical mechanisms leading either to synchronization or to the formation of spatiotemporal patterns in a lattice model of pulse-coupled oscillators. In order to make the system tractable from a mathematical point of view, we study a one-dimensional ring with unidirectional coupling. In such a situation, exact results concerning the stability of the fixed points of the dynamic evolution of the lattice can be obtained. Furthermore, we show that this stability is responsible for the different behaviors.
Abstract:
We study synchronization dynamics of a population of pulse-coupled oscillators. In particular, we focus our attention on the interplay between topological disorder and synchronization features of networks. First, we analyze synchronization time T in random networks, and find a scaling law which relates T to network connectivity. Then, we compare synchronization time for several other topological configurations, characterized by a different degree of randomness. The analysis shows that regular lattices perform better than disordered networks. This fact can be understood by considering the variability in the number of links between two adjacent neighbors. This phenomenon is equivalent to having a nonrandom topology with a distribution of interactions, and it can be removed by an adequate local normalization of the couplings.
Abstract:
We study the motion of a particle governed by a generalized Langevin equation. We show that, when no fluctuation-dissipation relation holds, the long-time behavior of the particle may range from stationary to superdiffusive, passing through the subdiffusive and diffusive regimes. When the random force is Gaussian, we derive the exact equations for the joint and marginal probability density functions for the position and velocity of the particle and find their solutions.
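As a point of reference for the regimes mentioned above, the memoryless special case with a fluctuation-dissipation relation gives ordinary diffusion: the mean-squared displacement grows linearly at long times. A toy check of this baseline (Euler-Maruyama integration; not the paper's general memory-kernel setting, and all parameter values are illustrative):

```python
import numpy as np

def langevin_msd(n_paths=2000, steps=400, dt=0.01, gamma=1.0, T=1.0, seed=1):
    """Euler-Maruyama integration of the memoryless Langevin equation
    dv = -gamma*v*dt + sqrt(2*gamma*kB*T)*dW   (with m = kB = 1),
    returning the ensemble mean-squared displacement at the final time."""
    rng = np.random.default_rng(seed)
    v = np.zeros(n_paths)
    x = np.zeros(n_paths)
    for _ in range(steps):
        noise = rng.standard_normal(n_paths)
        v += -gamma * v * dt + np.sqrt(2.0 * gamma * T * dt) * noise
        x += v * dt
    return float(np.mean(x**2))
```

At t = 4 with D = T/gamma = 1 the diffusive law ⟨x²⟩ ≈ 2Dt (minus a short-time ballistic correction) puts the result near 5, which the ensemble average reproduces.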
Abstract:
In this paper we find the quantities that are adiabatic invariants of any desired order for a general slowly time-dependent Hamiltonian. In a preceding paper, we chose a quantity that was initially an adiabatic invariant to first order, and sought the conditions to be imposed upon the Hamiltonian so that the quantum mechanical adiabatic theorem would be valid to mth order. [We found that this occurs when the first (m - 1) time derivatives of the Hamiltonian at the initial and final time instants are equal to zero.] Here we look for a quantity that is an adiabatic invariant to mth order for any Hamiltonian that changes slowly in time, and that does not fulfill any special condition (its first time derivatives are not zero initially and finally).
Abstract:
Generalized Kerr-Schild space-times for a perfect-fluid source are investigated. New Petrov type D perfect-fluid solutions are obtained starting from conformally flat perfect-fluid metrics.
Abstract:
Petrov types D and II perfect-fluid solutions are obtained starting from conformally flat perfect-fluid metrics and by using a generalized Kerr-Schild ansatz. Most of the Petrov type D metrics obtained have the property that the velocity of the fluid does not lie in the two-space defined by the principal null directions of the Weyl tensor. The properties of the perfect-fluid sources are studied. Finally, a detailed analysis of a new class of spherically symmetric static perfect-fluid metrics is given.
Abstract:
We analyze the emergence of synchronization in a population of moving integrate-and-fire oscillators. Oscillators, while moving on a plane, interact with their nearest neighbor upon firing. We discover a nonmonotonic dependence of the synchronization time on the velocity of the agents. Moreover, we find that the mechanisms that drive synchronization are different for different dynamical regimes. We report the extreme situation where an interplay between the time scales involved in the dynamical processes completely inhibits the achievement of a coherent state. We also provide estimators for the transitions between the different regimes.
Abstract:
We study a Kuramoto model in which the oscillators are associated with the nodes of a complex network and the interactions include a phase frustration, thus preventing full synchronization. The system organizes into a regime of remote synchronization where pairs of nodes with the same network symmetry are fully synchronized, despite their distance on the graph. We provide analytical arguments to explain this result, and we show how the frustration parameter affects the distribution of phases. An application to brain networks suggests that anatomical symmetry plays a role in neural synchronization by determining correlated functional modules across distant locations.
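A Kuramoto model with phase frustration is commonly written with the Kuramoto-Sakaguchi coupling sin(theta_j - theta_i - alpha). The sketch below is only an illustration of symmetry-induced synchronization, not the paper's brain-network application: on a three-node star with identical natural frequencies, the two leaves are exchange-symmetric and lock to the same phase even though they are not directly connected. All parameter values are illustrative.

```python
import numpy as np

def kuramoto_sakaguchi(adj, omega, K=2.0, alpha=0.3, dt=0.01,
                       steps=8000, seed=2):
    """Euler integration of the Kuramoto-Sakaguchi model
    dtheta_i/dt = omega_i + K * sum_j A_ij * sin(theta_j - theta_i - alpha)."""
    rng = np.random.default_rng(seed)
    n = len(omega)
    theta = rng.uniform(0.0, 2.0 * np.pi, n)
    for _ in range(steps):
        # diff[i, j] = theta_j - theta_i - alpha
        diff = theta[None, :] - theta[:, None] - alpha
        theta = theta + dt * (omega + K * np.sum(adj * np.sin(diff), axis=1))
    return theta % (2.0 * np.pi)

# Three-node star: node 0 is the hub, nodes 1 and 2 are symmetric leaves.
adj = np.array([[0, 1, 1],
                [1, 0, 0],
                [1, 0, 0]], dtype=float)
theta = kuramoto_sakaguchi(adj, omega=np.ones(3))
```

Starting from random phases, the phase difference between the two leaves decays toward zero, while the frustration alpha keeps the hub at a fixed offset from them, so full synchronization is prevented.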
Abstract:
In this paper we analyze the time of ruin in a risk process with the interclaim times being Erlang(n) distributed and a constant dividend barrier. We obtain an integro-differential equation for the Laplace transform of the time of ruin. Explicit solutions for the moments of the time of ruin are presented when the individual claim amounts have a distribution with rational Laplace transform. Finally, some numerical results and a comparison with the classical risk model, with interclaim times following an exponential distribution, are given.
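The model described above can also be explored by simulation. A hedged Monte Carlo sketch (all default values are illustrative, and exponential claims stand in for the general rational-Laplace-transform class): the surplus grows at premium rate c between Erlang(n) interclaim times, is capped at the barrier b (the excess is paid out as dividends), and drops by a claim amount at each claim epoch.

```python
import numpy as np

def ruin_time(u=1.0, b=3.0, c=1.5, n_erlang=2, rate=2.0,
              claim_mean=1.0, t_max=500.0, rng=None):
    """One Monte Carlo sample of the time of ruin for a risk process
    with Erlang(n) interclaim times, exponential claim amounts and a
    constant dividend barrier b. Returns np.inf if no ruin occurs
    before t_max."""
    if rng is None:
        rng = np.random.default_rng(0)
    t, surplus = 0.0, u
    while t < t_max:
        w = rng.gamma(n_erlang, 1.0 / rate)  # Erlang(n, rate) waiting time
        t += w
        surplus = min(surplus + c * w, b)    # barrier caps the surplus
        surplus -= rng.exponential(claim_mean)
        if surplus < 0.0:
            return t
    return np.inf
```

With a finite barrier the surplus is bounded, so ruin is certain in the long run; averaging many samples estimates the moments of the time of ruin that the paper derives analytically.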
Abstract:
The present study focuses on single-case data analysis, and specifically on two procedures for quantifying differences between baseline and treatment measurements. The first technique tested is based on generalized least squares regression analysis and is compared to a proposed non-regression technique, which allows obtaining similar information. The comparison is carried out in the context of generated data representing a variety of patterns (i.e., independent measurements, different serial dependence underlying processes, constant or phase-specific autocorrelation and data variability, different types of trend, and slope and level change). The results suggest that the two techniques perform adequately for a wide range of conditions and that researchers can use both of them with certain guarantees. The regression-based procedure offers more efficient estimates, whereas the proposed non-regression procedure is more sensitive to intervention effects. Considering current and previous findings, some tentative recommendations are offered to applied researchers in order to help them choose among the plurality of single-case data analysis techniques.
Abstract:
A new model for dealing with decision making under risk, considering subjective and objective information in the same formulation, is presented here. The uncertain probabilistic weighted average (UPWA) is also presented. Its main advantage is that it unifies the probability and the weighted average in the same formulation, considering the degree of importance that each case has in the analysis. Moreover, it is able to deal with uncertain environments represented in the form of interval numbers. We study some of its main properties and particular cases. The applicability of the UPWA is also studied, and it is seen to be very broad because all the previous studies that use the probability or the weighted average can be revised with this new approach. Focus is placed on a multi-person decision-making problem regarding the selection of strategies by using the theory of expertons.
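An illustrative toy version of such an operator (not the paper's exact UPWA definition; the mixing parameter `beta` and the interval representation are assumptions for the sketch): arguments are interval numbers [lo, hi], and the effective weight of each argument is a convex combination of its probability and its importance weight.

```python
def upwa(intervals, prob, weights, beta=0.5):
    """Toy uncertain probabilistic weighted average.

    intervals : list of (lo, hi) interval numbers
    prob      : objective probabilities of the cases (sum to 1)
    weights   : subjective importance weights (sum to 1)
    beta      : degree of importance given to the probabilistic part

    The effective weight of the i-th argument is
    beta * p_i + (1 - beta) * w_i, and the intervals are averaged
    endpoint by endpoint."""
    mix = [beta * p + (1.0 - beta) * w for p, w in zip(prob, weights)]
    lo = sum(m * a for m, (a, _) in zip(mix, intervals))
    hi = sum(m * b for m, (_, b) in zip(mix, intervals))
    return (lo, hi)
```

With beta = 1 the operator reduces to a probabilistic aggregation, and with beta = 0 to a plain weighted average, which is the unification the abstract describes.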
Abstract:
Standard practice in wave-height hazard analysis often pays little attention to the uncertainty of assessed return periods and occurrence probabilities. This fact favors the opinion that, when large events happen, the hazard assessment should change accordingly. However, the uncertainty of the hazard estimates is normally able to hide the effect of those large events. This is illustrated using data from the Mediterranean coast of Spain, where the last years have been extremely disastrous. Thus, it is possible to compare the hazard assessment based on data prior to those years with the analysis that includes them. With our approach, no significant change is detected when the statistical uncertainty is taken into account. The hazard analysis is carried out with a standard model. The time occurrence of events is assumed to be Poisson distributed. The wave height of each event is modelled as a random variable whose upper tail follows a Generalized Pareto Distribution (GPD). Moreover, wave heights are assumed independent from event to event and also independent of their occurrence in time. A threshold for excesses is assessed empirically. The other three parameters (the Poisson rate and the shape and scale parameters of the GPD) are jointly estimated using Bayes' theorem. The prior distribution accounts for physical features of ocean waves in the Mediterranean Sea and experience with these phenomena. The posterior distribution of the parameters allows us to obtain posterior distributions of derived parameters such as occurrence probabilities and return periods. Predictive distributions are also available. Computations are carried out using the program BGPE v2.0.
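For a fixed set of parameters, the Poisson-GPD model above gives the return period of a wave height z in closed form: if events occur at Poisson rate lambda and excesses over the threshold follow a GPD, then T(z) = 1 / (lambda * P(X > z)). A minimal sketch of this point estimate (parameter names are illustrative; this is not the BGPE program, which propagates the full Bayesian posterior):

```python
import numpy as np

def return_period(z, threshold, rate, shape, scale):
    """Return period of wave height z under the Poisson-GPD model:
    occurrences are Poisson(rate) and excesses over `threshold`
    follow a GPD with the given shape and scale parameters."""
    excess = z - threshold
    if shape == 0.0:
        tail = np.exp(-excess / scale)           # exponential limit case
    else:
        arg = 1.0 + shape * excess / scale
        if arg <= 0.0:
            return np.inf   # z beyond the upper endpoint of the GPD support
        tail = arg ** (-1.0 / shape)             # GPD survival function
    return 1.0 / (rate * tail)
```

In the Bayesian analysis described in the abstract, this quantity would be evaluated over the joint posterior of (rate, shape, scale), yielding a posterior distribution of return periods rather than a single number.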