61 results for LOG-S DISTRIBUTIONS
Abstract:
We have performed a detailed study of the zenith angle dependence of the regeneration factor and distributions of events at SNO and SK for different solutions of the solar neutrino problem. In particular, we discuss the oscillatory behavior and the synchronization effect in the distribution for the LMA solution, the parametric peak for the LOW solution, etc. A physical interpretation of the effects is given. We suggest a new binning of events which emphasizes the distinctive features of the zenith angle distributions for the different solutions. We also find the correlations between the integrated day-night asymmetry and the rates of events in different zenith angle bins. The study of these correlations strengthens the identification power of the analysis.
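For reference, the integrated day-night asymmetry discussed above is usually quoted in terms of the day (D) and night (N) event rates; a minimal statement of that conventional definition follows (sign and normalization conventions vary between analyses):

```latex
% Conventional integrated day-night asymmetry in terms of the day (D) and night (N) rates;
% some analyses use the opposite sign convention.
A_{DN} = \frac{2\,(N - D)}{N + D}
```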
Abstract:
Acoustic emission avalanche distributions are studied in different alloy systems that exhibit a phase transition from a bcc to a close-packed structure. After a small number of thermal cycles through the transition, the distributions become stable and exhibit critical (power-law) behavior, characterized by an exponent alpha. The values of alpha can be classified into universality classes, which depend exclusively on the symmetry of the resulting close-packed structure.
Abstract:
An experimental study of the acoustic emission generated during a martensitic transformation is presented. A statistical analysis of the amplitude and lifetime of a large number of signals has revealed power-law behavior for both quantities. The exponents of these distributions have been evaluated and, through independent measurements of the statistical dependence of lifetime on amplitude, we have checked the scaling relation between the exponents. Our results are discussed in terms of current ideas on avalanche dynamics.
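A worked sketch of the kind of scaling relation being checked, assuming pure power laws for the amplitude and lifetime distributions and a power-law statistical dependence between the two (the symbols alpha, tau, and gamma are chosen here for illustration and need not match the paper's notation):

```latex
% Assume P(A) \propto A^{-\alpha}, P(T) \propto T^{-\tau}, and A \propto T^{\gamma}.
% Conservation of probability under the change of variable A(T) gives:
P(T)\,dT = P(A)\,dA
\;\Longrightarrow\;
T^{-\tau} \propto \bigl(T^{\gamma}\bigr)^{-\alpha}\, T^{\gamma - 1}
\;\Longrightarrow\;
\tau = \gamma\,(\alpha - 1) + 1 .
```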
Abstract:
We study the problem of the partition of a system of initial size V into a sequence of fragments s1, s2, s3, .... By assuming a scaling hypothesis for the probability p(s;V) of obtaining a fragment of a given size, we deduce that the final distribution of fragment sizes exhibits power-law behavior. This minimal model is useful for understanding the distribution of avalanche sizes in first-order phase transitions at low temperatures.
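As a concrete illustration of such a sequential partition, the sketch below breaks a system of size V into fragments one at a time and histograms the resulting sizes; the uniform-fraction splitting rule and the cutoff s_min are illustrative assumptions, not the paper's specific p(s;V):

```python
import math
import random
from collections import Counter

def fragment(V=1.0, s_min=1e-4):
    """Sequentially break a system of size V into fragments s1, s2, s3, ...
    Each step splits off a uniformly random fraction of what remains
    (an illustrative rule, not the paper's specific p(s; V))."""
    fragments = []
    remaining = V
    while remaining > s_min:
        s = remaining * random.random()
        fragments.append(s)
        remaining -= s
    fragments.append(remaining)
    return fragments

# Crude log-binned histogram of the aggregated fragment sizes; with this splitting
# rule the counts per decade are roughly constant, i.e. p(s) ~ 1/s, the simplest
# power-law decay.
sizes = [s for _ in range(5000) for s in fragment() if s > 0]
counts = Counter(math.floor(math.log10(s)) for s in sizes)
for decade in sorted(counts, reverse=True):
    print(f"1e{decade} <= s < 1e{decade + 1}: {counts[decade]} fragments")
```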
Abstract:
The concepts of void and cluster for an arbitrary point distribution in a domain D are defined and characterized by some parameters such as volume, density, number of points belonging to them, shape, etc. After assigning a weight to each void and cluster (a function of its characteristics), the concept of distance between two point configurations S1 and S2 in D is introduced, both with and without the help of a lattice in the domain D. This defines a topology for the point distributions in D, which is different for the different characterizations of the voids and clusters.
Abstract:
We report variational calculations, in the hypernetted-chain (HNC)-Fermi-HNC scheme, of one-body density matrices and one-particle momentum distributions for 3He-4He mixtures described by a Jastrow correlated wave function. The 4He condensate fractions and the 3He strength poles are examined and compared with the available Monte Carlo results. The agreement has been found to be very satisfactory. Their density dependence is also studied.
Abstract:
Methods for generating beams with arbitrary polarization based on the use of liquid crystal displays have recently attracted interest from a wide range of sources. In this paper we present a technique for generating beams with arbitrary polarization and shape distributions at a given plane using a Mach-Zehnder setup. The transverse components of the incident beam are processed independently by means of spatial light modulators placed in each path of the interferometer. The modulators display computer generated holograms designed to dynamically encode any amplitude value and polarization state for each point of the wavefront in a given plane. The steps required to design such beams are described in detail. Several beams exhibiting different polarization and intensity landscapes have been experimentally implemented. The results obtained demonstrate the capability of the proposed technique to tailor the amplitude and polarization of the beam simultaneously.
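A small sketch of the decomposition step implied by this scheme: the target transverse field is split into its x and y components, which would then be encoded independently (via computer generated holograms) on the modulator in each interferometer arm. The radially polarized Gaussian target and the grid parameters are illustrative assumptions:

```python
import numpy as np

# Illustrative target: a radially polarized Gaussian beam sampled on a square grid.
N, w0 = 512, 1.0                          # grid size and beam waist (arbitrary units)
x = np.linspace(-3.0, 3.0, N)
X, Y = np.meshgrid(x, x)
R, PHI = np.hypot(X, Y), np.arctan2(Y, X)
envelope = R * np.exp(-R**2 / w0**2)      # common amplitude profile

# Transverse components processed independently, one per SLM / interferometer path.
Ex = envelope * np.cos(PHI)
Ey = envelope * np.sin(PHI)

# Each component carries its own amplitude and phase; a hologram per component would
# encode these point-by-point values on the corresponding spatial light modulator.
print("max |Ex|:", np.abs(Ex).max(), " max |Ey|:", np.abs(Ey).max())
```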
Abstract:
We apply the formalism of the continuous-time random walk to the study of financial data. The entire distribution of prices can be obtained once two auxiliary densities are known. These are the probability density for the pausing time between successive jumps and the corresponding probability density for the magnitude of a jump. We have applied the formalism to data on the U.S. dollar-deutsche mark futures exchange, finding good agreement between theory and the observed data.
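A minimal simulation sketch of the continuous-time random walk just described: price changes accumulate through jumps separated by random pausing times, each drawn from its own density. The exponential waiting-time and Gaussian jump densities below are illustrative placeholders, not the densities fitted to the exchange data:

```python
import random

def ctrw_price_change(t_max, mean_pause=1.0, jump_sigma=0.01, rng=random):
    """One CTRW path: wait a random pausing time, then apply a random jump
    to the (log-)price; return the total change accumulated up to t_max."""
    t, x = 0.0, 0.0
    while True:
        t += rng.expovariate(1.0 / mean_pause)   # pausing-time density (assumed exponential)
        if t > t_max:
            return x
        x += rng.gauss(0.0, jump_sigma)          # jump-magnitude density (assumed Gaussian)

# Empirical distribution of price changes over a fixed horizon.
samples = [ctrw_price_change(t_max=10.0) for _ in range(10000)]
mean = sum(samples) / len(samples)
var = sum((s - mean) ** 2 for s in samples) / len(samples)
print(f"mean change: {mean:.5f}   variance: {var:.6f}")
```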
Abstract:
We present a generator of random networks where both the degree-dependent clustering coefficient and the degree distribution are tunable. Following the same philosophy as in the configuration model, the degree distribution and the clustering coefficient for each class of nodes of degree k are fixed ad hoc and a priori. The algorithm generates corresponding topologies by applying first a closure of triangles and second the classical closure of remaining free stubs. The procedure unveils a universal relation between clustering and degree-degree correlations for all networks, where the level of assortativity establishes an upper limit to the level of clustering. Maximum assortativity ensures no restriction on the decay of the clustering coefficient, whereas disassortativity sets a stronger constraint on its behavior. Correlation measures in real networks are seen to obey this structural bound.
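A much-simplified sketch of the two-stage construction described above: a fraction of each node's stubs is spent closing triangles first, and the remaining free stubs are then paired at random, configuration-model style. The fixed triangle_fraction below is an illustrative stand-in for fixing the degree-dependent clustering coefficient a priori, and no attempt is made to reproduce the paper's exact algorithm:

```python
import itertools
import random

def generate(degrees, triangle_fraction=0.4, seed=0):
    """Two-stage sketch: reserve part of each node's stubs for triangle closure,
    then close the remaining free stubs at random (classical configuration model)."""
    rng = random.Random(seed)
    edges = set()
    tri_slots, stubs = [], []
    for node, k in enumerate(degrees):
        n_tri = (int(triangle_fraction * k) // 2) * 2   # triangle stubs, kept even
        tri_slots.extend([node] * (n_tri // 2))         # each slot consumes two stubs
        stubs.extend([node] * (k - n_tri))

    # Stage 1: closure of triangles among the reserved slots.
    rng.shuffle(tri_slots)
    while len(tri_slots) >= 3:
        trio = {tri_slots.pop(), tri_slots.pop(), tri_slots.pop()}
        if len(trio) == 3:
            edges.update(tuple(sorted(p)) for p in itertools.combinations(trio, 2))

    # Stage 2: classical closure of the remaining free stubs.
    rng.shuffle(stubs)
    while len(stubs) >= 2:
        u, v = stubs.pop(), stubs.pop()
        if u != v:
            edges.add(tuple(sorted((u, v))))
    return edges

print(len(generate(degrees=[5] * 200)), "edges generated")
```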
Abstract:
We study the motion of an unbound particle under the influence of a random force modeled as Gaussian colored noise with an arbitrary correlation function. We derive exact equations for the joint and marginal probability density functions and find the associated solutions. We analyze in detail anomalous diffusion behaviors along with the fractal structure of the trajectories of the particle and explore possible connections between dynamical exponents of the variance and the fractal dimension of the trajectories.
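A minimal numerical sketch of this setup: an unbound particle is driven by Gaussian colored noise, here generated as an Ornstein-Uhlenbeck process (an exponentially correlated example of the arbitrary correlation functions treated analytically), and the growth of the position variance between two times hints at the diffusion regime:

```python
import math
import random

def position_variance(t_max, dt=0.01, tau=1.0, D=1.0, n_paths=500, seed=1):
    """Integrate dx/dt = F(t), with F(t) Ornstein-Uhlenbeck (exponentially
    correlated Gaussian) noise of correlation time tau; return Var[x(t_max)]."""
    rng = random.Random(seed)
    n_steps = int(t_max / dt)
    finals = []
    for _ in range(n_paths):
        x, f = 0.0, 0.0
        for _ in range(n_steps):
            f += (-f / tau) * dt + (math.sqrt(2.0 * D * dt) / tau) * rng.gauss(0.0, 1.0)
            x += f * dt
        finals.append(x)
    mean = sum(finals) / n_paths
    return sum((p - mean) ** 2 for p in finals) / n_paths

# Normal diffusion doubles the variance when the time doubles; anomalous regimes do not.
print(position_variance(10.0), position_variance(20.0))
```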
Abstract:
We study the motion of a particle governed by a generalized Langevin equation. We show that, when no fluctuation-dissipation relation holds, the long-time behavior of the particle may range from stationary to superdiffusive, passing through the subdiffusive and diffusive regimes. When the random force is Gaussian, we derive the exact equations for the joint and marginal probability density functions for the position and velocity of the particle and find their solutions.
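For concreteness, the generalized Langevin equation referred to here has the standard form below, with a memory kernel gamma(t) and a random force xi(t); when no fluctuation-dissipation relation ties the correlation of xi to gamma, the long-time exponent of the mean-square displacement is not fixed to the diffusive value:

```latex
% Standard form of the generalized Langevin equation and the long-time regimes discussed:
m\,\dot{v}(t) = -m \int_{0}^{t} \gamma(t - t')\, v(t')\, dt' + \xi(t),
\qquad
\langle x^{2}(t) \rangle \sim t^{\delta}, \quad t \to \infty,
% \delta = 0: stationary, 0 < \delta < 1: subdiffusive, \delta = 1: diffusive, \delta > 1: superdiffusive.
```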
Abstract:
In a recent paper, Komaki studied the second-order asymptotic properties of predictive distributions, using the Kullback-Leibler divergence as a loss function. He showed that estimative distributions with asymptotically efficient estimators can be improved by predictive distributions that do not belong to the model. The model is assumed to be a multidimensional curved exponential family. In this paper we generalize the result, taking as a loss function any f-divergence. A relationship arises between alpha-connections and optimal predictive distributions. In particular, using an alpha-divergence to measure the goodness of a predictive distribution, the optimal shift of the estimative distribution is related to alpha-covariant derivatives. The expression that we obtain for the asymptotic risk is also useful for studying the higher-order asymptotic properties of an estimator, within the mentioned class of loss functions.
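For reference, the alpha-divergence mentioned above is conventionally written as follows (one common normalization; the Kullback-Leibler divergence is recovered in the limit alpha → ±1):

```latex
% A common convention for the alpha-divergence between densities p and q:
D_{\alpha}(p \,\|\, q) =
\frac{4}{1 - \alpha^{2}}
\left( 1 - \int p(x)^{\frac{1-\alpha}{2}}\, q(x)^{\frac{1+\alpha}{2}}\, dx \right),
\qquad \alpha \neq \pm 1 .
```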
Abstract:
The present study explores the statistical properties of a randomization test based on the random assignment of the intervention point in a two-phase (AB) single-case design. The focus is on randomization distributions constructed from the values of the test statistic for all possible random assignments and used to obtain p-values. The shape of those distributions is investigated for each specific data division defined by the moment at which the intervention is introduced. Another aim of the study was to test for the detection of nonexistent effects (i.e., the production of false alarms) in autocorrelated data series, in which the assumption of exchangeability between observations may be untenable. In this way, it was possible to compare nominal and empirical Type I error rates in order to obtain evidence on the statistical validity of the randomization test for each individual data division. The results suggest that when either of the two phases has considerably fewer measurement times, Type I errors may be too probable and, hence, the decision-making process carried out by applied researchers may be jeopardized.
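A compact sketch of the randomization distribution being described: the test statistic is computed for every admissible intervention point, and the p-value of the actually assigned point is read off from that distribution. The absolute phase-mean difference and the minimum phase length are illustrative choices, not the study's exact specification:

```python
def randomization_test(y, actual_start, min_phase=3):
    """AB single-case randomization test: treat the intervention point as randomly
    assigned among all admissible positions; the p-value is the proportion of
    assignments whose statistic is at least as extreme as the observed one."""
    def phase_mean_diff(start):
        a, b = y[:start], y[start:]
        return abs(sum(b) / len(b) - sum(a) / len(a))

    candidates = range(min_phase, len(y) - min_phase + 1)
    distribution = [phase_mean_diff(s) for s in candidates]
    observed = phase_mean_diff(actual_start)
    p_value = sum(d >= observed for d in distribution) / len(distribution)
    return observed, p_value

# Example: 12 measurement times, intervention actually introduced at the 7th occasion.
data = [3, 4, 3, 5, 4, 4, 7, 8, 7, 9, 8, 8]
print(randomization_test(data, actual_start=6))
```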
Abstract:
This paper introduces a mixture model based on the beta distribution, without preestablished means and variances, to analyze a large set of Beauty-Contest data obtained from diverse groups of experiments (Bosch-Domenech et al. 2002). This model gives a better fit of the experimental data, and more precision to the hypothesis that a large proportion of individuals follow a common pattern of reasoning, described as iterated best reply (degenerate), than mixture models based on the normal distribution. The analysis shows that the means of the distributions across the groups of experiments are pretty stable, while the proportions of choices at different levels of reasoning vary across groups.
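A sketch of fitting such a beta mixture without preestablished means and variances, using an EM-style loop; the weighted method-of-moments M-step and the synthetic data are simplifying assumptions, not the paper's estimation procedure:

```python
import math
import random

def beta_pdf(x, a, b):
    """Beta density via the gamma function (x strictly inside (0, 1))."""
    norm = math.gamma(a) * math.gamma(b) / math.gamma(a + b)
    return x ** (a - 1) * (1 - x) ** (b - 1) / norm

def fit_beta_mixture(data, k=2, n_iter=50, seed=0):
    """EM-style fit of a k-component beta mixture; the M-step uses a weighted
    method-of-moments update for (a, b), a simplification of full maximum likelihood."""
    rng = random.Random(seed)
    params = [(1.0 + 4.0 * rng.random(), 1.0 + 4.0 * rng.random()) for _ in range(k)]
    weights = [1.0 / k] * k
    for _ in range(n_iter):
        # E-step: responsibility of each component for each observation.
        resp = []
        for x in data:
            lik = [w * beta_pdf(x, a, b) for w, (a, b) in zip(weights, params)]
            total = sum(lik) or 1e-300
            resp.append([l / total for l in lik])
        # M-step: weighted mean/variance -> (a, b); mixing weights from responsibilities.
        for j in range(k):
            r = [row[j] for row in resp]
            n_j = sum(r) or 1e-12
            m = sum(ri * x for ri, x in zip(r, data)) / n_j
            v = sum(ri * (x - m) ** 2 for ri, x in zip(r, data)) / n_j or 1e-6
            common = min(max(m * (1.0 - m) / v - 1.0, 1e-3), 100.0)
            params[j] = (m * common, (1.0 - m) * common)
            weights[j] = n_j / len(data)
    return weights, params

# Illustrative data on (0, 1), e.g. Beauty-Contest choices rescaled from [0, 100].
gen = random.Random(1)
data = [min(max(gen.betavariate(2, 5), 1e-3), 1.0 - 1e-3) for _ in range(300)]
print(fit_beta_mixture(data))
```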