989 results for Statistical Theory
Abstract:
‘Modern’ Phillips curve theories predict that inflation is an integrated, or near integrated, process. However, inflation appears bounded above and below in developed economies, so it cannot be ‘truly’ integrated and is more likely stationary around a shifting mean. If agents believe inflation is integrated, as in the ‘modern’ theories, then they are making systematic errors about the statistical process of inflation. An alternative theory of the Phillips curve is developed that is consistent with the ‘true’ statistical process of inflation. It is demonstrated that United States inflation data are consistent with the alternative theory but not with the existing ‘modern’ theories.
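As a rough illustration of the statistical distinction at issue (not the paper's own model), the following minimal Python sketch contrasts an integrated, random-walk process with one that is stationary around an occasionally shifting mean; the regime means, persistence, and noise level are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(0)
T = 600

# Integrated ("unit root") process: shocks accumulate without bound.
integrated = np.cumsum(rng.normal(0.0, 0.3, T))

# Stationary around a shifting mean: the series reverts toward a target
# that occasionally jumps (hypothetical regime means below).
mean = np.repeat([2.0, 4.0, 2.5], T // 3)
stationary = np.empty(T)
stationary[0] = mean[0]
for t in range(1, T):
    # AR(1)-style reversion toward the current regime mean
    stationary[t] = mean[t] + 0.8 * (stationary[t - 1] - mean[t]) + rng.normal(0.0, 0.3)

# The integrated series wanders arbitrarily far; the second stays bounded
# around its (shifting) mean, matching the 'bounded inflation' observation.
print(integrated.min(), integrated.max())
print(stationary.min(), stationary.max())
```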
Abstract:
The first discussion of compositional data analysis is attributable to Karl Pearson, in 1897. However, notwithstanding the recent developments in the algebraic structure of the simplex, more than twenty years after Aitchison’s idea of log-transformations of closed data, the scientific literature is again full of statistical treatments of this type of data using traditional methodologies. This is particularly true in environmental geochemistry, where, besides the problem of closure, the spatial structure (dependence) of the data has to be considered. In this work we propose the use of log-contrast values, obtained by a simplicial principal component analysis, as indicators of given environmental conditions. The investigation of the log-contrast frequency distributions allows pointing out the statistical laws able to generate the values and to govern their variability. The changes, if compared, for example, with the mean values of the random variables assumed as models, or other reference parameters, allow defining monitors to be used to assess the extent of possible environmental contamination. A case study on running and ground waters from Chiavenna Valley (Northern Italy), using Na+, K+, Ca2+, Mg2+, HCO3-, SO42- and Cl- concentrations, is illustrated.
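For orientation only, here is a minimal Python sketch of one common way to obtain log-contrast scores from closed data: a centered log-ratio (clr) transform followed by principal component analysis. The data are synthetic and the simplicial PCA used in the paper may differ in details.

```python
import numpy as np

# Hypothetical closed (compositional) data: rows sum to 1.
# Columns stand for ion concentrations such as Na+, K+, Ca2+, Mg2+.
rng = np.random.default_rng(1)
raw = rng.lognormal(size=(50, 4))
comp = raw / raw.sum(axis=1, keepdims=True)        # closure to a constant sum

# Centered log-ratio (clr) transform moves the data off the simplex.
clr = np.log(comp) - np.log(comp).mean(axis=1, keepdims=True)

# PCA on clr-transformed data: each principal axis is a log-contrast,
# i.e. a linear combination of log-parts whose coefficients sum to zero.
clr_centered = clr - clr.mean(axis=0)
U, s, Vt = np.linalg.svd(clr_centered, full_matrices=False)
scores = clr_centered @ Vt.T                        # log-contrast values per sample

print(Vt[0].round(3), Vt[0].sum().round(6))         # first log-contrast; coefficients sum to ~0
```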
Abstract:
This book gives a general view of sequence analysis, the statistical study of successions of states or events. It includes innovative contributions on life course studies, transitions into and out of employment, contemporaneous and historical careers, and political trajectories. The approach presented in this book is now central to the life-course perspective and the study of social processes more generally. This volume promotes the dialogue between approaches to sequence analysis that developed separately, within traditions contrasted in space and disciplines. It includes the latest developments in sequential concepts, coding, atypical datasets and time patterns, optimal matching and alternative algorithms, survey optimization, and visualization. Field studies include original sequential material related to parenting in 19th-century Belgium, higher education and work in Finland and Italy, family formation before and after German reunification, French Jews persecuted in occupied France, long-term trends in electoral participation, and regime democratization. Overall the book reassesses the classical uses of sequences and it promotes new ways of collecting, formatting, representing and processing them. The introduction provides basic sequential concepts and tools, as well as a history of the method. Chapters are presented in a way that is both accessible to the beginner and informative to the expert.
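As a small illustration of the optimal-matching idea mentioned above (not code from the book), the sketch below computes a classic edit distance between two categorical state sequences via dynamic programming; the state codes and the insertion/deletion and substitution costs are hypothetical.

```python
def optimal_matching(seq_a, seq_b, indel=1.0, sub=2.0):
    """Edit distance between two state sequences (classic dynamic programming)."""
    n, m = len(seq_a), len(seq_b)
    d = [[0.0] * (m + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        d[i][0] = i * indel
    for j in range(1, m + 1):
        d[0][j] = j * indel
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = 0.0 if seq_a[i - 1] == seq_b[j - 1] else sub
            d[i][j] = min(d[i - 1][j] + indel,      # deletion
                          d[i][j - 1] + indel,      # insertion
                          d[i - 1][j - 1] + cost)   # substitution (or match)
    return d[n][m]

# Two hypothetical employment trajectories coded year by year (E = employed, U = unemployed).
print(optimal_matching("EEEUUE", "EEUUEE"))
```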
Abstract:
Background: Recent advances in high-throughput technologies have produced a vast number of protein sequences, while the number of high-resolution structures has seen only a limited increase. This has impelled the production of many strategies to build protein structures from their sequences, generating a considerable number of alternative models. The selection of the model closest to the native conformation has thus become crucial for structure prediction. Several methods have been developed to score protein models by energies, knowledge-based potentials and combinations of both. Results: Here, we present and demonstrate a theory to split knowledge-based potentials into biologically meaningful scoring terms and to combine them into new scores to predict near-native structures. Our strategy allows circumventing the problem of defining the reference state. In this approach we give the proof for a simple and linear application that can be further improved by optimizing the combination of Z-scores. Using the simplest composite score we obtained predictions similar to state-of-the-art methods. Besides, our approach has the advantage of identifying the most relevant terms involved in the stability of the protein structure. Finally, we also use the composite Z-scores to assess the conformation of models and to detect local errors. Conclusion: We have introduced a method to split knowledge-based potentials and to solve the problem of defining a reference state. The new scores have detected near-native structures as accurately as state-of-the-art methods and have successfully identified wrongly modeled regions of many near-native conformations.
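To make the composite-score idea concrete, here is a generic Python sketch of standardizing several scoring terms as Z-scores across a set of candidate models and combining them linearly; the term values, weights, and sign convention are hypothetical and do not reproduce the paper's specific decomposition.

```python
import numpy as np

def composite_zscore(term_matrix, weights=None):
    """term_matrix: rows = candidate models, columns = individual scoring terms.
    Each term is converted to a Z-score across models and the Z-scores are
    combined linearly; here a lower (more negative) composite means a better model."""
    z = (term_matrix - term_matrix.mean(axis=0)) / term_matrix.std(axis=0)
    if weights is None:
        weights = np.ones(term_matrix.shape[1])
    return z @ weights

# Hypothetical energies of 5 models under 3 knowledge-based terms.
terms = np.array([[-120.0, -30.0, -15.0],
                  [-100.0, -25.0, -18.0],
                  [-140.0, -35.0, -12.0],
                  [ -90.0, -20.0, -20.0],
                  [-130.0, -32.0, -14.0]])
scores = composite_zscore(terms)
print("best model index:", int(np.argmin(scores)))
```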
Abstract:
In the past 20 years the theory of robust estimation has become an important topic of mathematical statistics. We discuss here some basic concepts of this theory with the help of simple examples. Furthermore we describe a subroutine library for the application of robust statistical procedures, which was developed with the support of the Swiss National Science Foundation.
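As a simple textbook-style example of a robust procedure (not the subroutine library described in the abstract), the sketch below computes a Huber M-estimate of location by iteratively reweighted averaging, with the scale taken from the median absolute deviation; the tuning constant and data are illustrative.

```python
import numpy as np

def huber_location(x, k=1.345, tol=1e-8, max_iter=100):
    """Huber M-estimate of location, with scale fixed from the MAD."""
    x = np.asarray(x, dtype=float)
    scale = 1.4826 * np.median(np.abs(x - np.median(x)))   # robust scale estimate
    mu = np.median(x)
    for _ in range(max_iter):
        r = (x - mu) / scale
        w = np.minimum(1.0, k / np.maximum(np.abs(r), 1e-12))  # Huber weights
        mu_new = np.sum(w * x) / np.sum(w)
        if abs(mu_new - mu) < tol:
            break
        mu = mu_new
    return mu

data = np.array([9.8, 10.1, 10.0, 9.9, 10.2, 35.0])         # one gross outlier
print(np.mean(data), huber_location(data))                   # the mean is pulled, the M-estimate is not
```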
Abstract:
Fractal mathematics has been used to characterize water and solute transport in porous media and also to characterize and simulate porous media properties. The objective of this study was to evaluate the correlation between the soil infiltration parameters sorptivity (S) and time exponent (n) and the parameters fractal dimension (D) and Hurst exponent (H). For this purpose, ten horizontal columns with pure (either clay or loam) and heterogeneous porous media (clay and loam distributed in layers in the column) were simulated following the distribution of a deterministic Cantor Bar with a fractal dimension of approximately 0.63. Horizontal water infiltration experiments were then simulated using the Hydrus 2D software. The sorptivity (S) and time exponent (n) parameters of the Philip equation were estimated for each simulation using the nonlinear regression procedure of the statistical software package SAS®. Sorptivity increased with the loam content of the columns, which was attributed to the relation of S with the capillary radius. The time exponent estimated by nonlinear regression was found to be less than the traditional value of 0.5. The fractal dimension estimated from the Hurst exponent was 17.5 % lower than the fractal dimension of the Cantor Bar used to generate the columns.
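For illustration, a Philip-type relation I(t) = S·t^n can be fitted to cumulative infiltration data by nonlinear least squares. The Python sketch below uses scipy instead of the SAS procedure named in the abstract, and the infiltration record is synthetic.

```python
import numpy as np
from scipy.optimize import curve_fit

def philip(t, S, n):
    """Cumulative horizontal infiltration I(t) = S * t**n (sorptivity S, time exponent n)."""
    return S * t**n

# Synthetic infiltration record: generated with S = 0.8, n = 0.45 plus small noise.
t = np.linspace(0.1, 10.0, 40)
rng = np.random.default_rng(2)
I = 0.8 * t**0.45 + rng.normal(0.0, 0.02, t.size)

(S_hat, n_hat), _ = curve_fit(philip, t, I, p0=(1.0, 0.5))
print(f"S = {S_hat:.3f}, n = {n_hat:.3f}")   # n below 0.5 indicates deviation from the classical value
```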
Abstract:
We have investigated the structure of vertically coupled double quantum dots at zero magnetic field within local-spin-density functional theory. The dots are identical and have a finite width, and the whole system is axially symmetric. We first discuss the effect of thickness on the addition spectrum of a single dot. Next we describe the structure of coupled dots as a function of the interdot distance for different electron numbers. Addition spectra, Hund's rule, and molecular-type configurations are discussed. It is shown that self-interaction corrections to the density-functional results do not play a very important role in the calculated addition spectra.
Abstract:
We consider the effects of external, multiplicative white noise on the relaxation time of a general representation of a bistable system from the points of view provided by two, quite different, theoretical approaches: the classical Stratonovich decoupling of correlations and the new method due to Jung and Risken. Experimental results, obtained from a bistable electronic circuit, are compared to the theoretical predictions. We show that the phenomenon of critical slowing down appears as a function of the noise parameters, thereby providing a correct characterization of a noise-induced transition.
Abstract:
The general theory of nonlinear relaxation times is developed for the case of Gaussian colored noise. General expressions are obtained and applied to the study of the characteristic decay time of unstable states in different situations, including white and colored noise, with emphasis on the distributed initial conditions. Universal effects of the coupling between colored noise and random initial conditions are predicted.
Abstract:
A theory is presented to explain the statistical properties of the growth of dye-laser radiation. Results are in agreement with recent experimental findings. The different roles of pump-noise intensity and correlation time are elucidated.
Abstract:
We present the relationship between nonlinear-relaxation-time (NLRT) and quasideterministic approaches to characterize the decay of an unstable state. The universal character of the NLRT is established. The theoretical results are applied to study the dynamical relaxation of the Landau model in one and n variables and also a laser model.
Abstract:
We extend the recent microscopic analysis of extremal dyonic Kaluza-Klein (D0-D6) black holes to cover the regime of fast rotation in addition to slow rotation. Rapidly rotating black holes, in contrast to slow ones, have nonzero angular velocity and possess ergospheres, so they are more similar to the Kerr black hole. The D-brane model reproduces their entropy exactly, but the mass gets renormalized from weak to strong coupling, in agreement with recent macroscopic analyses of rotating attractors. We discuss how the existence of the ergosphere and superradiance manifest themselves within the microscopic model. In addition, we show in full generality how Myers-Perry black holes are obtained as a limit of Kaluza-Klein black holes, and discuss the slow and fast rotation regimes and superradiance in this context.
Abstract:
In the 1920s, Ronald Fisher developed the theory behind the p value, and Jerzy Neyman and Egon Pearson developed the theory of hypothesis testing. These distinct theories have provided researchers with important quantitative tools to confirm or refute their hypotheses. The p value is the probability of obtaining an effect equal to or more extreme than the one observed, presuming the null hypothesis of no effect is true; it gives researchers a measure of the strength of evidence against the null hypothesis. As commonly used, investigators will select a threshold p value below which they reject the null hypothesis. The theory of hypothesis testing allows researchers to reject a null hypothesis in favor of an alternative hypothesis of some effect. As commonly used, investigators choose Type I error (rejecting the null hypothesis when it is true) and Type II error (accepting the null hypothesis when it is false) levels and determine a critical region. If the test statistic falls into that critical region, the null hypothesis is rejected in favor of the alternative hypothesis. Despite similarities between the two, the p value and the theory of hypothesis testing are different theories that are often misunderstood and confused, leading researchers to improper conclusions. Perhaps the most common misconception is to consider the p value as the probability that the null hypothesis is true, rather than the probability of obtaining the observed difference, or one more extreme, given that the null is true. Another concern is the risk that a substantial proportion of statistically significant results are falsely significant. Researchers should have a minimum understanding of these two theories so that they are better able to plan, conduct, interpret, and report scientific experiments.
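As a small generic illustration of the two frameworks on the same data (not taken from the article), the Python sketch below reports a two-sample t-test p value in the Fisher sense and then makes a fixed-alpha reject/fail-to-reject decision in the Neyman-Pearson sense; the data and the alpha level are arbitrary.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
treatment = rng.normal(1.0, 2.0, 30)   # hypothetical outcomes under treatment
control   = rng.normal(0.0, 2.0, 30)   # hypothetical outcomes under control

# Fisher-style: the p value measures the strength of evidence against H0 (no difference).
t_stat, p_value = stats.ttest_ind(treatment, control)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")

# Neyman-Pearson-style: fix alpha in advance and make a binary decision.
alpha = 0.05
print("reject H0" if p_value < alpha else "fail to reject H0")

# Note: p is NOT the probability that H0 is true; it is the probability of
# data at least this extreme, computed under the assumption that H0 is true.
```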