22 results for Bayesian statistical decision theory
in the Biblioteca Digital da Produção Intelectual da Universidade de São Paulo
Abstract:
Knowing which individuals are the most efficient at spreading a pathogen throughout a given environment is a fundamental question in disease control. Indeed, over recent years the spread of epidemic diseases and its relationship with the topology of the underlying system have been a recurrent topic in complex network theory, taking into account both network models and real-world data. In this paper we explore possible correlations between the heterogeneous spread of an epidemic disease governed by the susceptible-infected-recovered (SIR) model and several attributes of the originating vertices, considering Erdős-Rényi (ER), Barabási-Albert (BA) and random geometric graphs (RGG), as well as a real case study, the US air transportation network, which comprises the 500 busiest airports in the US along with their interconnections. Initially, the heterogeneity of the spreading is obtained from the RGG networks, for which we analytically derive an expression for the distribution of the spreading rates among the established contacts, by assuming that such rates decay exponentially with the distance separating the individuals. This distribution is also adopted for the ER and BA models, where we observe topological effects on the correlations. In the case of the airport network, the spreading rates are defined empirically, assumed to be directly proportional to seat availability. For both the theoretical and the real networks considered, we observe a high correlation between the total epidemic prevalence and the degree, strength and accessibility of the epidemic sources. For attributes such as betweenness centrality and the k-shell index, however, the correlation depends on the topology considered.
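For illustration, the following is a minimal sketch (not the authors' code) of the kind of dynamics described above: a discrete-time SIR simulation on a random geometric graph in which each edge transmits with a probability decaying exponentially with the Euclidean distance between its endpoints. The parameters beta0, d0 and gamma are illustrative assumptions.

```python
# Sketch: discrete-time SIR on a random geometric graph, with edge
# transmission probabilities decaying exponentially with distance.
# beta0, d0 and gamma are illustrative values, not taken from the paper.
import math
import random
import networkx as nx

def sir_prevalence(G, pos, source, beta0=0.8, d0=0.1, gamma=0.2, rng=random.Random(0)):
    state = {v: "S" for v in G}
    state[source] = "I"
    infected = {source}
    total = 1
    while infected:
        new_inf, recovered = set(), set()
        for i in infected:
            for j in G.neighbors(i):
                if state[j] == "S":
                    d = math.dist(pos[i], pos[j])
                    if rng.random() < beta0 * math.exp(-d / d0):  # distance-decaying rate
                        new_inf.add(j)
            if rng.random() < gamma:
                recovered.add(i)
        for j in new_inf:
            state[j] = "I"
        for i in recovered:
            state[i] = "R"
        total += len(new_inf)
        infected = (infected | new_inf) - recovered
    return total / G.number_of_nodes()  # total epidemic prevalence from this source

G = nx.random_geometric_graph(500, radius=0.1, seed=1)
pos = nx.get_node_attributes(G, "pos")
print(sir_prevalence(G, pos, source=0))
```

Running the same function over every source vertex and correlating the resulting prevalences with degree, strength, betweenness, etc. reproduces the kind of analysis the abstract describes.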
Abstract:
Until now, the raise-and-peel model was the only known example of a one-dimensional stochastic process in which one can observe conformal invariance. The model has one parameter. Depending on its value, one has a gapped phase, a critical point where one has conformal invariance, and a gapless phase with changing values of the dynamical critical exponent z. In this model, adsorption is local but desorption is not. The raise-and-strip model presented here, in which desorption is also nonlocal, has the same phase diagram. The critical exponents are different, as are some physical properties of the model. Our study suggests the possible existence of a whole class of stochastic models in which one can observe conformal invariance.
Abstract:
In this paper we have quantified the consistency of word usage in written texts represented as complex networks, where words were taken as nodes, by measuring the degree of preservation of each node's neighborhood. Words were considered highly consistent if the authors used them with the same neighborhood. When ranked according to the consistency of use, the words obeyed a log-normal distribution, in contrast to the frequency of use, which follows Zipf's law. Consistency correlated positively with familiarity and frequency of use, and negatively with ambiguity and age of acquisition. An inspection of some highly consistent words confirmed that they are used in very limited semantic contexts. A comparison of consistency indices for eight authors indicated that these indices may be employed for author recognition. Indeed, as expected, authors of novels could be distinguished from those who wrote scientific texts. Our analysis demonstrated the suitability of the consistency indices, which can now be applied to other tasks, such as emotion recognition.
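The abstract does not give the exact definition of the consistency index, but a minimal sketch of the underlying idea, using the Jaccard overlap of a word's co-occurrence neighbourhoods in two halves of a text as a stand-in measure, could look as follows (the tokenizer and window size are illustrative assumptions).

```python
# Sketch of a neighbourhood-preservation ("consistency") measure:
# build a word co-occurrence neighbourhood for each half of a text
# and compare each word's neighbour sets with Jaccard overlap.
# Tokenization and the window size w are illustrative assumptions.
from collections import defaultdict

def neighbourhoods(tokens, w=2):
    nbrs = defaultdict(set)
    for i, word in enumerate(tokens):
        for j in range(max(0, i - w), min(len(tokens), i + w + 1)):
            if j != i:
                nbrs[word].add(tokens[j])
    return nbrs

def consistency(text):
    tokens = text.lower().split()
    half = len(tokens) // 2
    n1, n2 = neighbourhoods(tokens[:half]), neighbourhoods(tokens[half:])
    scores = {}
    for word in set(n1) & set(n2):
        inter = n1[word] & n2[word]
        union = n1[word] | n2[word]
        scores[word] = len(inter) / len(union)  # Jaccard overlap in [0, 1]
    return scores

sample = "the cat sat on the mat and the cat ate on the mat"
print(sorted(consistency(sample).items(), key=lambda kv: -kv[1]))
```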
Abstract:
Rényi and von Neumann entropies quantifying the amount of entanglement in ground states of critical spin chains are known to satisfy a universal law given by the conformal field theory (CFT) describing their scaling regime. This law can be generalized to excitations described by primary fields in CFT, as was done by Alcaraz et al in 2011 (see reference [1], of which this work is a completion). An alternative derivation is presented, together with numerical verifications of our results in different models belonging to the c = 1 and c = 1/2 universality classes. Oscillations of the Rényi entropy in excited states are also discussed.
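For reference, the universal ground-state law referred to above is, in its best-known form (Calabrese and Cardy): for a block of $\ell$ sites in a periodic critical chain of length $L$,

```latex
\[
S_n(\ell) = \frac{c}{6}\left(1 + \frac{1}{n}\right)
            \ln\!\left[\frac{L}{\pi}\sin\!\left(\frac{\pi \ell}{L}\right)\right] + c_n ,
\]
```

where $c$ is the central charge, $n$ is the Rényi index ($n \to 1$ recovers the von Neumann entropy) and $c_n$ is a non-universal constant. The generalization discussed in the paper replaces this scaling by an analogous expression for states created by primary fields.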
Abstract:
The mechanisms responsible for containing activity in systems represented by networks are crucial in various phenomena, for example in diseases such as epilepsy, which affect neuronal networks, and in information dissemination in social networks. The first models to account for contained activity included triggering and inhibition processes, but they cannot be applied to social networks, where inhibition is clearly absent. A recent model showed that contained activity can be achieved with no need for inhibition processes, provided that the network is subdivided into modules (communities). In this paper, we introduce a new concept inspired by Hebbian theory, through which containment of activity is achieved by incorporating a decaying-activity dynamics into a random walk mechanism preferential to node activity. Upon selecting the decay coefficient within a proper range, we observed sustained activity in all the networks tested, namely random, Barabási-Albert and geographical networks. The generality of this finding was confirmed by showing that modularity is no longer needed if the integrate-and-fire dynamics incorporates the decay factor. Taken together, these results provide a proof of principle that persistent, restrained network activation may occur in the absence of any particular topological structure. This may be the reason why neuronal activity does not spread out to the entire neuronal network, even when no special topological organization exists.
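The exact update rule is defined in the paper; one plausible reading of "a decaying activity in a random walk mechanism preferential to the node activity" is sketched below: node activities decay geometrically at each step, the walker hops to neighbours with probability proportional to their activity, and each visit reinforces the visited node. The constants lam, boost and eps are illustrative assumptions.

```python
# Sketch of an activity-preferential random walk with a decay factor:
# activities decay by lam each step, the walker prefers active
# neighbours, and visits reinforce activity. lam, boost and eps are
# illustrative assumptions, not the paper's calibrated values.
import random
import networkx as nx

def run_activity_walk(G, steps=1000, lam=0.9, boost=1.0, eps=1e-3, seed=0):
    rng = random.Random(seed)
    activity = {v: 0.0 for v in G}
    node = rng.choice(list(G))
    activity[node] = boost
    history = []
    for _ in range(steps):
        nbrs = list(G.neighbors(node))
        weights = [activity[v] + eps for v in nbrs]  # preferential to activity
        node = rng.choices(nbrs, weights=weights)[0]
        for v in G:
            activity[v] *= lam                        # exponential decay
        activity[node] += boost                       # reinforcement on visit
        history.append(sum(activity.values()))
    return history

G = nx.barabasi_albert_graph(200, 3, seed=1)
print(run_activity_walk(G)[-5:])  # total activity stays bounded (contained)
```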
Abstract:
Exact results on particle densities, as well as correlators, in two models of immobile particles, containing either a single species or two distinct species, are derived. The models evolve by a descent dynamics through pair annihilation, in which each particle interacts at most once throughout its entire history. The resulting large number of stationary states leads to a non-vanishing configurational entropy. Our results are established for arbitrary initial conditions and are derived via a generating function method. The single-species model is dual to the 1D zero-temperature kinetic Ising model with Kimball-Deker-Haake dynamics. In this way, finite and semi-infinite chains, as well as the Bethe lattice, can be analysed. The relationship with the random sequential adsorption of dimers and with weakly tapped granular materials is discussed.
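As a concrete illustration of the single-species dynamics (a simulation sketch, not the authors' generating-function calculation): starting from a fully occupied chain, nearest-neighbour pairs annihilate in random sequential order until only isolated particles survive. By the duality with random sequential adsorption of dimers mentioned above, the surviving density should approach the classical jamming value e^{-2} ≈ 0.1353.

```python
# Sketch of the single-species dynamics: random sequential annihilation
# of nearest-neighbour pairs on an initially full chain. Since sites only
# empty over time, one random pass over the bonds is equivalent to
# repeated random picking until jamming.
import random

def final_density(n=100_000, seed=0):
    rng = random.Random(seed)
    occ = [True] * n
    bonds = list(range(n - 1))       # bond b joins sites b and b+1
    rng.shuffle(bonds)
    for b in bonds:
        if occ[b] and occ[b + 1]:    # each particle interacts at most once
            occ[b] = occ[b + 1] = False
    return sum(occ) / n

print(final_density())               # ~ 0.1353, i.e. e^{-2}
```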
Abstract:
Spectral decomposition has rarely been used to investigate complex networks. In this work we apply this concept to define two kinds of link-directed attacks and quantify their respective effects on the topology. Several other, more traditional kinds of attacks are also adopted and compared. These attacks had substantially different effects, depending on the specific network (both models and real-world structures). It is also shown that the spectrally based attacks are particularly effective in affecting the transitivity of the networks.
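The paper defines the attacks precisely; a minimal sketch of one plausible spectrally based link attack, which ranks edges by their contribution |v_i v_j| to the leading adjacency eigenvector and removes the top-ranked fraction, is given below. The ranking rule and removal fraction are assumptions for illustration.

```python
# Sketch of a spectrally based link attack: rank edges by their
# contribution |v_i * v_j| to the leading eigenvector of the adjacency
# matrix, remove the top fraction, then measure the transitivity.
import numpy as np
import networkx as nx

def spectral_link_attack(G, fraction=0.1):
    nodes = list(G)
    A = nx.to_numpy_array(G, nodelist=nodes)
    eigvals, eigvecs = np.linalg.eigh(A)
    v = eigvecs[:, -1]                       # leading eigenvector
    idx = {u: i for i, u in enumerate(nodes)}
    edges = sorted(G.edges(),
                   key=lambda e: abs(v[idx[e[0]]] * v[idx[e[1]]]),
                   reverse=True)
    H = G.copy()
    H.remove_edges_from(edges[: int(fraction * len(edges))])
    return H

G = nx.barabasi_albert_graph(300, 4, seed=2)
print("before:", nx.transitivity(G),
      "after:", nx.transitivity(spectral_link_attack(G)))
```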
Abstract:
The ground-state phase diagram of an Ising spin-glass model on a random graph with an arbitrary fraction w of ferromagnetic interactions is analysed in the presence of an external field. Using the replica method, and performing a stability analysis of the replica-symmetric solution, it is shown that w = 1/2, corresponding to an unbiased spin glass, is a singular point in the phase diagram, separating a region with a spin-glass phase (w < 1/2) from a region with spin-glass, ferromagnetic, mixed and paramagnetic phases (w > 1/2).
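In this setting, "an arbitrary fraction w of ferromagnetic interactions" presumably corresponds to the standard ±J bond distribution,

```latex
\[
P(J_{ij}) = w\,\delta(J_{ij} - J) + (1 - w)\,\delta(J_{ij} + J), \qquad J > 0,
\]
```

so that w = 1/2 is the unbiased spin glass and w = 1 the pure ferromagnet.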
Abstract:
The development of new statistical and computational methods is increasingly making it possible to bridge the gap between the hard sciences and the humanities. In this study, we propose an approach based on a quantitative evaluation of attributes of objects in fields of the humanities, from which concepts such as dialectics and opposition are formally defined in mathematical terms. As case studies, we analyzed the temporal evolution of classical music and philosophy, obtaining data for 8 features characterizing the corresponding fields for 7 well-known composers and philosophers, which were treated with multivariate statistics and pattern recognition methods. A bootstrap method was applied to avoid the statistical bias caused by the small sample: hundreds of artificial composers and philosophers were generated, influenced by the 7 names originally chosen. Upon defining indices for opposition, skewness and counter-dialectics, we confirmed the intuitive analysis of historians that classical music evolved according to a master-apprentice tradition, while in philosophy changes were driven by opposition. Though these case studies were meant only to show the possibility of treating phenomena in the humanities quantitatively, including a quantitative measure of concepts such as dialectics and opposition, the results are encouraging for further application of the approach presented here to many other areas, since it is entirely generic.
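The paper details the exact generation procedure; a minimal sketch of the general idea, resampling the 8-dimensional feature vectors of the 7 original names and perturbing them with Gaussian noise to create hundreds of artificial composers or philosophers, might look as follows (the noise scale sigma and the placeholder data are assumptions).

```python
# Sketch of the bootstrap idea: generate artificial composers or
# philosophers by resampling the 7 original 8-feature vectors with
# replacement and adding Gaussian noise. The data and sigma are
# illustrative placeholders.
import numpy as np

rng = np.random.default_rng(0)
originals = rng.uniform(0, 10, size=(7, 8))   # placeholder for the real 7x8 data

def bootstrap_names(X, n=300, sigma=0.5):
    idx = rng.integers(0, len(X), size=n)     # resample originals with replacement
    return X[idx] + rng.normal(0, sigma, size=(n, X.shape[1]))

artificial = bootstrap_names(originals)
print(artificial.shape)                        # (300, 8)
# `artificial` can then be fed to the multivariate / pattern-recognition steps
```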
Abstract:
What does it take to make a coalition successful? Bigger coalitions are more likely to be successful because the GATT/WTO is a consensus-based institution and countries are informally penalized if they isolate themselves. Through a Bayesian statistical analysis, the article corroborates the above hypothesis. To further investigate the research question, qualitative case studies of the G-10 in the Uruguay Round and the Public Health Coalition in the Doha Round are conducted. These cases show that the more convincing the framing of a position, the better are the chances of coalitions keeping a large number of followers and supporters, thereby affecting their odds of success. By building a unique database and applying a new research design to the topic, the study rigorously tests theories about coalitions that had previously only been proposed but not empirically analyzed.
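The abstract does not specify the statistical model; a minimal sketch of one natural choice, a Bayesian logistic regression of coalition success on coalition size fitted with a random-walk Metropolis sampler, is given below. The data, priors and tuning constants are illustrative assumptions, not the article's actual dataset or specification.

```python
# Sketch: Bayesian logistic regression of coalition success on size,
# fitted with random-walk Metropolis. Data, priors and step size are
# illustrative assumptions only.
import numpy as np

rng = np.random.default_rng(0)
size = rng.integers(2, 60, 200)                                   # toy coalition sizes
success = (rng.random(200) < 1 / (1 + np.exp(-(0.05 * size - 1)))).astype(float)

def log_post(theta, x=size, y=success):
    a, b = theta
    p = 1 / (1 + np.exp(-(a + b * x)))
    p = np.clip(p, 1e-12, 1 - 1e-12)
    loglik = np.sum(y * np.log(p) + (1 - y) * np.log(1 - p))
    return loglik - (a**2 + b**2) / (2 * 10**2)                    # N(0, 10^2) priors

theta, cur = np.zeros(2), None
cur = log_post(theta)
samples = []
for _ in range(20_000):
    prop = theta + rng.normal(0, 0.05, 2)
    lp = log_post(prop)
    if np.log(rng.random()) < lp - cur:                            # Metropolis accept
        theta, cur = prop, lp
    samples.append(theta)
post = np.array(samples[5_000:])                                   # drop burn-in
print("P(coalition-size effect > 0) =", (post[:, 1] > 0).mean())
```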
Abstract:
A common interest in gene expression data analysis is to identify, from a large pool of candidate genes, the genes that present significant changes in expression level between a treatment and a control biological condition. Usually, this is done with a test statistic and a cutoff value that separate the differentially from the nondifferentially expressed genes. In this paper, we propose a Bayesian approach that identifies differentially expressed genes by sequentially calculating credibility intervals from predictive densities; these densities are constructed using the sampled mean treatment effect of all genes in the study, excluding the treatment effect of genes previously identified as showing statistical evidence of difference. We compare our Bayesian approach with the standard ones based on the t-test and modified t-tests via a simulation study, using the small sample sizes that are common in gene expression data analysis. The results provide evidence that the proposed approach performs better than the standard ones, especially in cases with mean differences and with increases in the treatment variance relative to the control variance. We also apply the methodologies to a well-known publicly available data set on the bacterium Escherichia coli.
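A minimal sketch of the sequential idea (using a normal approximation as a stand-in for the paper's predictive density): estimate each gene's treatment effect, fit the predictive density to the genes not yet flagged, flag the genes falling outside its central 95% credibility interval, and repeat until no new gene is flagged.

```python
# Sketch of the sequential scheme: repeatedly fit a predictive density
# to the mean treatment effects of the not-yet-flagged genes and flag
# genes outside its central 95% interval. The normal approximation and
# toy data are illustrative assumptions.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n_genes = 1000
effects = rng.normal(0, 1, n_genes)      # mean treatment - control effects
effects[:30] += 4.0                      # toy differentially expressed genes

flagged = np.zeros(n_genes, dtype=bool)
while True:
    pool = effects[~flagged]             # exclude previously identified genes
    mu, sd = pool.mean(), pool.std(ddof=1)
    lo, hi = stats.norm.ppf([0.025, 0.975], loc=mu, scale=sd)
    new = (~flagged) & ((effects < lo) | (effects > hi))
    if not new.any():
        break
    flagged |= new

print("flagged genes:", flagged.sum())
```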
Abstract:
Item response theory (IRT) comprises a set of statistical models which are useful in many fields, especially when there is an interest in studying latent variables (or latent traits). Usually such latent traits are assumed to be random variables and a convenient distribution is assigned to them. A very common choice for such a distribution has been the standard normal. Recently, Azevedo et al. [Bayesian inference for a skew-normal IRT model under the centred parameterization, Comput. Stat. Data Anal. 55 (2011), pp. 353-365] proposed a skew-normal distribution under the centred parameterization (SNCP), as studied in [R. B. Arellano-Valle and A. Azzalini, The centred parametrization for the multivariate skew-normal distribution, J. Multivariate Anal. 99(7) (2008), pp. 1362-1382], to model the latent trait distribution. This approach allows one to represent any asymmetric behaviour of the latent trait distribution. They also developed a Metropolis-Hastings within Gibbs sampling (MHWGS) algorithm based on the density of the SNCP and showed that the algorithm recovers all parameters properly. Their results indicated that, in the presence of asymmetry, the proposed model and the estimation algorithm perform better than the usual model and estimation methods. Our main goal in this paper is to propose another type of MHWGS algorithm based on a stochastic representation (hierarchical structure) of the SNCP studied in [N. Henze, A probabilistic representation of the skew-normal distribution, Scand. J. Statist. 13 (1986), pp. 271-275]. Our algorithm has only one Metropolis-Hastings step, in contrast to the algorithm developed by Azevedo et al., which has two such steps. This not only makes the implementation easier but also reduces the number of proposal densities to be used, which can be a problem in the implementation of MHWGS algorithms, as can be seen in [R.J. Patz and B.W. Junker, A straightforward approach to Markov Chain Monte Carlo methods for item response models, J. Educ. Behav. Stat. 24(2) (1999), pp. 146-178; R. J. Patz and B. W. Junker, The applications and extensions of MCMC in IRT: Multiple item types, missing data, and rated responses, J. Educ. Behav. Stat. 24(4) (1999), pp. 342-366; A. Gelman, G.O. Roberts, and W.R. Gilks, Efficient Metropolis jumping rules, Bayesian Stat. 5 (1996), pp. 599-607]. Moreover, we consider a modified beta prior (which generalizes the one considered in [3]) and a Jeffreys prior for the asymmetry parameter. Furthermore, we study the sensitivity of such priors as well as the use of different kernel densities for this parameter. Finally, we assess the impact of the number of examinees, the number of items and the asymmetry level on parameter recovery. Results of the simulation study indicated that our approach performed as well as that in [3] in terms of parameter recovery, mainly when using the Jeffreys prior. They also indicated that the asymmetry level has the highest impact on parameter recovery, even though this impact is relatively small. A real data analysis is presented, jointly with the development of model-fitting assessment tools, and the results are compared with the ones obtained by Azevedo et al. The results indicate that the hierarchical approach makes MCMC algorithms easier to implement, facilitates convergence diagnosis, and can be very useful for fitting more complex skew IRT models.
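For reference, the stochastic representation due to Henze that underlies the hierarchical algorithm is: if $Z_1, Z_2$ are independent standard normal variables, then

```latex
\[
X = \delta\,|Z_1| + \sqrt{1-\delta^{2}}\,Z_2 \;\sim\; \mathrm{SN}(\lambda),
\qquad \lambda = \frac{\delta}{\sqrt{1-\delta^{2}}}, \quad \delta \in (-1,1),
\]
```

which expresses a skew-normal variate through a half-normal component and an independent normal component (stated here in the direct parameterization; the SNCP re-expresses it in centred parameters). Conditional on $|Z_1|$, the latent trait is normal, which is the two-stage structure the single Metropolis-Hastings step exploits.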
Abstract:
We explore the meaning of information about quantities of interest. Our approach is divided into two scenarios: the analysis of observations and the planning of an experiment. First, we review the Sufficiency, Conditionality and Likelihood principles and how they relate to trivial experiments. Next, we review Blackwell Sufficiency and show that sampling without replacement is Blackwell Sufficient for sampling with replacement. Finally, we unify the two scenarios by presenting an extension of the relationship between Blackwell Equivalence and the Likelihood Principle.
Abstract:
We consider general d-dimensional lattice ferromagnetic spin systems with nearest-neighbor interactions in the high-temperature region (β ≪ 1). Each model is characterized by a single-site a priori spin distribution, taken to be even. We also take the parameter α = ⟨S^4⟩ − 3⟨S^2⟩^2 > 0, i.e. in the region which we call Gaussian subjugation, where ⟨S^k⟩ denotes the kth moment of the a priori distribution. Associated with the model is a lattice quantum field theory known to contain a particle of asymptotic mass −ln β and a bound state below the two-particle threshold. We develop a β-analytic perturbation theory for the binding energy of this bound state. As a key ingredient in obtaining our result, we show that the Fourier transform of the two-point function is a meromorphic function, with a simple pole, in a suitable complex spectral parameter, and that the coefficients of its Laurent expansion are analytic in β.
Abstract:
The purpose of this paper is to develop a Bayesian analysis for right-censored survival data when immune or cured individuals may be present in the population from which the data are taken. In our approach the number of competing causes of the event of interest follows the Conway-Maxwell-Poisson distribution, which generalizes the Poisson distribution. Markov chain Monte Carlo (MCMC) methods are used to develop a Bayesian procedure for the proposed model. Model selection is also discussed, and the methodology is illustrated with a real data set.
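For reference, the Conway-Maxwell-Poisson distribution used for the number N of competing causes has probability mass function

```latex
\[
P(N = n) = \frac{\lambda^{n}}{(n!)^{\nu}\,Z(\lambda,\nu)}, \qquad
Z(\lambda,\nu) = \sum_{j=0}^{\infty}\frac{\lambda^{j}}{(j!)^{\nu}},
\qquad n = 0, 1, 2, \dots,
\]
```

which recovers the Poisson distribution for ν = 1. In the cure-rate setting, an individual with N = 0 competing causes never experiences the event, so the cured fraction is P(N = 0) = 1/Z(λ, ν).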