242 results for ergodic axiom
Abstract:
Different axioms underlie efficient market theory and Keynes's liquidity preference theory. Efficient market theory assumes the ergodic axiom. Consequently, today's decision makers can calculate with actuarial precision the future value of all possible outcomes resulting from today's decisions. Since in an efficient market world decision makers "know" their intertemporal budget constraints, decision makers never default on a loan, i.e., systemic defaults, insolvencies, and bankruptcies are impossible. Keynes's liquidity preference theory rejects the ergodic axiom: the future is ontologically uncertain. Accordingly, systemic defaults and insolvencies can occur but can never be predicted in advance.
Abstract:
The existence of a reversed magnetic shear in tokamaks improves plasma confinement through the formation of internal transport barriers that reduce radial particle and heat transport. However, the poloidal transport profile is strongly influenced by the presence of chaotic magnetic field lines at the plasma edge caused by external perturbations. Contrary to many expectations, it has been observed that such a chaotic region does not uniformize heat and particle deposition on the inner tokamak wall. The deposition is instead characterized by structured patterns called magnetic footprints, here investigated for a nonmonotonic analytical plasma equilibrium perturbed by an ergodic limiter. The magnetic footprints appear due to the underlying mathematical skeleton of chaotic magnetic field lines determined by the manifold tangles. For the investigated edge safety factor ranges, these effects on the wall are associated with field line stickiness and escape channels due to internal island chains near the flux surfaces. Comparisons between magnetic footprints and escape basins for different equilibrium and ergodic limiter characteristic parameters show that highly concentrated magnetic footprints can be avoided by properly choosing these parameters. (c) 2008 American Institute of Physics.
Abstract:
We prove that, once a perfect-simulation algorithm is available for a stationary and ergodic random field F taking values in S^{Z^d}, where S is a bounded subset of R^n, the convergence in the mean ergodic theorem occurs exponentially fast for F. Applications from (non-equilibrium) statistical mechanics and interacting particle systems are presented.
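The mean ergodic theorem in question concerns time averages along a single realization converging to the ensemble mean. As a hedged illustration of that statement (not the paper's perfect-simulation construction), the sketch below estimates the stationary mean of a simple two-state ergodic Markov chain from one long run; the chain and all parameters are invented for the example.

```python
import random

def simulate_chain(steps, a=0.3, b=0.2, seed=1):
    """Two-state ergodic Markov chain: P(0 -> 1) = a, P(1 -> 0) = b.
    The stationary probability of state 1 is a / (a + b).
    Returns the time average of the indicator of state 1."""
    rng = random.Random(seed)
    state, total = 0, 0
    for _ in range(steps):
        if state == 0:
            if rng.random() < a:
                state = 1
        else:
            if rng.random() < b:
                state = 0
        total += state
    return total / steps

# The ergodic time average approaches the stationary mean a/(a+b) = 0.6.
estimate = simulate_chain(50_000)
```

For this chain the time average settles near 0.6; the paper's point is that, with a perfect-simulation algorithm in hand, the deviation from the limit decays exponentially fast.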
Abstract:
Perspectives are what we have, whether we discuss the text or the cybertext. Ricoeur said that the text, as a singular whole, can be compared to an object seen from several sides but never from all of them simultaneously. We always decide to look in a certain way. Yet we live in a time when, overnight, various proposals, new perspectives, and new forms of textuality emerge. This calls for a terminology more consistent than the forms that arise.
Abstract:
We present a simple randomized procedure for the prediction of a binary sequence. The algorithm uses ideas from recent developments in the theory of prediction of individual sequences. We show that if the sequence is a realization of a stationary and ergodic random process, then the average number of mistakes converges, almost surely, to that of the optimum, given by the Bayes predictor.
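To convey the flavor of such prediction schemes (this is a hedged stand-in, not the paper's algorithm), the sketch below uses a first-order context predictor that counts observed transitions and predicts the majority continuation. On an i.i.d. Bernoulli(0.7) source its mistake rate approaches the Bayes error of 0.3; the source and parameters are illustrative.

```python
import random

def predict_sequence(bits):
    """Predict each bit from counts of what has followed the previous bit
    so far; return the fraction of prediction mistakes."""
    counts = {0: [0, 0], 1: [0, 0]}  # counts[context][next_bit]
    mistakes, context = 0, 0
    for bit in bits:
        guess = 1 if counts[context][1] >= counts[context][0] else 0
        if guess != bit:
            mistakes += 1
        counts[context][bit] += 1   # learn from the revealed bit
        context = bit
    return mistakes / len(bits)

rng = random.Random(7)
bits = [1 if rng.random() < 0.7 else 0 for _ in range(5000)]
rate = predict_sequence(bits)  # approaches the Bayes error 0.3
```

For a genuinely stationary ergodic (non-i.i.d.) source, the paper's procedure additionally randomizes and grows the context length so the mistake rate converges to the Bayes predictor's almost surely.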
Abstract:
A fluctuation relation for aging systems is introduced and verified by extensive numerical simulations. It is based on the hypothesis of partial equilibration over phase-space regions in a scenario of entropy-driven relaxation. The relation provides a simple alternative method, amenable to experimental implementation, to measure replica symmetry breaking parameters in aging systems. The connection with the effective temperatures obtained from the fluctuation-dissipation theorem is discussed.
Abstract:
We consider the billiard dynamics in a non-compact subset of ℝ^d that is constructed as a bi-infinite chain of translated copies of the same d-dimensional polytope. A random configuration of semi-dispersing scatterers is placed in each copy. The ensemble of dynamical systems thus defined, one for each global realization of the scatterers, is called a quenched random Lorentz tube. Under some fairly general conditions, we prove that every system in the ensemble is hyperbolic and almost every system is recurrent, ergodic, and enjoys some higher chaotic properties.
Abstract:
Consider a one-dimensional environment with N randomly distributed sites. An agent explores this random medium, moving deterministically with a spatial memory μ. A crossover from local to global exploration occurs in one dimension at a well-defined memory value μ1 = log2 N. In the stochastic version, the dynamics is ruled by the memory and by a temperature T, which affects the hopping displacement. This dynamics also shows a crossover in one dimension, obtained computationally, between exploration schemes, now characterized as well by the trajectory size Np (an aging effect). In this paper we provide an analytical approach, considering a modified stochastic version in which the parameter T plays the role of a maximum hopping distance. This modification allows us to obtain a general analytical expression for the crossover as a function of the parameters μ, T, and Np. Differently from what has been proposed by previous studies, we find that the crossover occurs in any dimension d. These results have been validated by numerical experiments and may be of great value for fixing optimal parameters in search algorithms. © 2013 American Physical Society.
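A hedged sketch of the deterministic version of such an exploration: a walker on N random sites in one dimension that always jumps to the nearest site not visited in the last μ steps. The site distribution, seed, and parameters are illustrative; the sketch only shows the qualitative effect that a memory of order N forces global coverage, while a short memory confines the walker.

```python
import random

def explore(n_sites, mu, n_steps, seed=3):
    """Deterministic walk on random 1D sites with spatial memory mu:
    jump to the nearest site outside the window of the last mu visits.
    Returns the number of distinct sites visited."""
    rng = random.Random(seed)
    sites = sorted(rng.random() for _ in range(n_sites))
    current, recent, visited = 0, [0], {0}
    for _ in range(n_steps):
        window = set(recent[-mu:]) if mu > 0 else set()
        window.add(current)  # never stay put
        candidates = [i for i in range(n_sites) if i not in window]
        if not candidates:
            break
        current = min(candidates, key=lambda i: abs(sites[i] - sites[current]))
        recent.append(current)
        visited.add(current)
    return len(visited)

# Full memory (mu = N - 1) forces the walker to visit every site;
# a short memory typically traps it in a small local cycle.
global_coverage = explore(64, 63, 100)
local_coverage = explore(64, 1, 100)
```

With μ = N − 1 the window always excludes every previously visited site until all N are covered, so coverage is complete well within 100 steps, matching the local-to-global crossover at μ1 = log2 N described above.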
Abstract:
In the present paper, we solve a twist symplectic map for the action of an ergodic magnetic limiter in a large aspect-ratio tokamak. In this model, we study the bifurcation scenarios that occur in the remnant regular islands that coexist with chaotic magnetic surfaces. The onset of atypical local bifurcations created by secondary shearless tori is identified through numerical profiles of the internal rotation number, and we observe that their rupture can reduce the usual magnetic field line escape at the tokamak plasma edge.
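The internal rotation number profiles mentioned above are computed by averaging the winding per iteration of the map. The sketch below does this for a generic nonmonotonic twist map, not the specific limiter map of the paper; the profile ν(I) and all parameters are illustrative stand-ins.

```python
import math

def nu(I):
    """Illustrative nonmonotonic winding-number profile (reversed-shear-like)."""
    return 0.4 - 0.3 * (I - 1.0) ** 2

def rotation_number(theta0, I0, K, n_iter):
    """Average winding per iteration of the perturbed twist map
         I'     = I + K sin(theta)
         theta' = theta + 2*pi*nu(I')   (mod 2*pi)
    which estimates the rotation number of the orbit."""
    theta, I, winding = theta0, I0, 0.0
    for _ in range(n_iter):
        I = I + K * math.sin(theta)
        winding += nu(I)
        theta = (theta + 2.0 * math.pi * nu(I)) % (2.0 * math.pi)
    return winding / n_iter

# In the integrable limit K = 0 the action I is conserved, so the
# computed rotation number equals nu(I0) exactly.
rot_integrable = rotation_number(0.5, 1.2, 0.0, 1000)
```

Scanning I0 at fixed small K yields the numerical rotation number profile; local extrema of that profile flag the secondary shearless tori whose bifurcations the paper tracks.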
Abstract:
A problem with the practical application of Varian's Weak Axiom of Cost Minimization (WACM) is that an observed violation may be due to random variation in the output quantities produced by firms rather than to inefficiency on the part of the firm. In this paper, unlike in Varian (1985), the output rather than the input quantities are treated as random, and an alternative statistical test of the violation of WACM is proposed. We assume that there is no technical inefficiency and provide a test of the hypothesis that an observed violation of WACM is merely due to random variations in the output levels of the firms being compared. We suggest an intuitive approach for specifying a value of the variance of the noise term that is needed for the test. The paper includes an illustrative example utilizing a data set relating to a number of U.S. airlines.
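WACM requires that a cost-minimizing firm's observed input bundle cost no more, at its own prices, than any bundle used by a firm producing at least as much output. The sketch below implements only this deterministic check, not the paper's statistical test layered on top of it; the two-firm data set is made up.

```python
def wacm_violations(firms):
    """firms: list of (w, x, y) = (input prices, input quantities, output).
    Firm i violates WACM if some firm j with y_j >= y_i uses a bundle
    that is cheaper than x_i at firm i's own prices w_i."""
    def cost(w, x):
        return sum(wi * xi for wi, xi in zip(w, x))
    violations = []
    for i, (wi, xi, yi) in enumerate(firms):
        for j, (_, xj, yj) in enumerate(firms):
            if j != i and yj >= yi and cost(wi, xj) < cost(wi, xi):
                violations.append((i, j))
    return violations

# Firm 1 produces more output (12 > 10) with a bundle that costs less
# at firm 0's prices (2 < 4), so firm 0 violates WACM against firm 1.
data = [((1.0, 1.0), (2.0, 2.0), 10.0),
        ((1.0, 1.0), (1.0, 1.0), 12.0)]
found = wacm_violations(data)  # [(0, 1)]
```

The paper's question is whether such a flagged pair reflects true cost inefficiency or merely noise in the observed outputs y; the statistical test needed to decide that is not reproduced here.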
Abstract:
We analyse a class of estimators of the generalized diffusion coefficient for fractional Brownian motion Bt of known Hurst index H, based on weighted functionals of the single-time squared displacement. We show that for a certain choice of the weight function these functionals possess an ergodic property and thus provide the true, ensemble-averaged, generalized diffusion coefficient to any necessary precision from single-trajectory data, but at the expense of a progressively higher experimental resolution. Convergence is fastest around H ≈ 0.30, a value in the subdiffusive regime.
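A hedged sketch of the single-trajectory idea: recover the diffusion coefficient from a time-averaged squared displacement. For simplicity it uses ordinary Brownian motion (H = 1/2), for which MSD(Δ) = 2DΔ, rather than general fBm with the paper's weighted functionals; all parameters are illustrative.

```python
import random

def brownian_path(n_steps, D, dt=1.0, seed=5):
    """Ordinary Brownian motion (H = 1/2): Gaussian increments
    with variance 2*D*dt per step."""
    rng = random.Random(seed)
    sigma = (2.0 * D * dt) ** 0.5
    x, path = 0.0, [0.0]
    for _ in range(n_steps):
        x += rng.gauss(0.0, sigma)
        path.append(x)
    return path

def diffusion_estimate(path, lag, dt=1.0):
    """Time-averaged squared displacement at the given lag,
    converted to a diffusion coefficient via TAMSD = 2*D*lag*dt."""
    disp2 = [(path[t + lag] - path[t]) ** 2 for t in range(len(path) - lag)]
    return sum(disp2) / len(disp2) / (2.0 * lag * dt)

path = brownian_path(20_000, D=0.5)
D_est = diffusion_estimate(path, lag=1)  # close to the true D = 0.5
```

For H = 1/2 the time average is ergodic and converges to the true D from one trajectory; the paper's contribution is the choice of weight functions that restores this property for general H.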
Abstract:
In this paper we define the notion of an axiom dependency hypergraph, which explicitly represents how axioms are included into a module by the algorithm for computing locality-based modules. A locality-based module of an ontology corresponds to a set of connected nodes in the hypergraph, and atoms of an ontology to strongly connected components. Collapsing the strongly connected components into single nodes yields a condensed hypergraph that comprises a representation of the atomic decomposition of the ontology. To speed up the condensation of the hypergraph, we first reduce its size by collapsing the strongly connected components of its graph fragment employing a linear time graph algorithm. This approach helps to significantly reduce the time needed for computing the atomic decomposition of an ontology. We provide an experimental evaluation for computing the atomic decomposition of large biomedical ontologies. We also demonstrate a significant improvement in the time needed to extract locality-based modules from an axiom dependency hypergraph and its condensed version.
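The condensation step on the hypergraph's graph fragment, collapsing strongly connected components into single nodes, can be sketched with a standard linear-time SCC algorithm. Below is a minimal Kosaraju-style version on a toy directed graph; the graph and its labels are invented and do not come from any ontology.

```python
from collections import defaultdict

def condense(edges):
    """Collapse the SCCs of a directed graph (Kosaraju's algorithm).
    Returns (component label per node, edge set of the condensed DAG)."""
    graph, rev, nodes = defaultdict(list), defaultdict(list), set()
    for u, v in edges:
        graph[u].append(v)
        rev[v].append(u)
        nodes.update((u, v))
    order, seen = [], set()
    def dfs1(u):                      # first pass: record finish order
        seen.add(u)
        for v in graph[u]:
            if v not in seen:
                dfs1(v)
        order.append(u)
    for u in sorted(nodes):
        if u not in seen:
            dfs1(u)
    comp = {}
    def dfs2(u, label):               # second pass: label SCCs on the
        comp[u] = label               # reversed graph
        for v in rev[u]:
            if v not in comp:
                dfs2(v, label)
    label = 0
    for u in reversed(order):
        if u not in comp:
            dfs2(u, label)
            label += 1
    dag = {(comp[u], comp[v]) for u, v in edges if comp[u] != comp[v]}
    return comp, dag

# Two SCCs, {a, b} and {c, d}, joined by a single condensed edge.
comp, dag = condense([("a", "b"), ("b", "a"),
                      ("b", "c"), ("c", "d"), ("d", "c")])
```

In the paper's setting the SCCs correspond to the atoms of the ontology, and the condensed DAG is the representation of its atomic decomposition; the speedup comes from running exactly this kind of linear-time collapse before condensing the full hypergraph.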