43 results for statistical mechanics many-body inverse problem graph-theory
in the Biblioteca Digital da Produção Intelectual da Universidade de São Paulo
Abstract:
The Sznajd model is a sociophysics model used to describe opinion propagation and consensus formation in societies. Its main feature is that its rules favor larger groups of agreeing people. In a previous work, we generalized the bounded confidence rule in order to model biases and prejudices in discrete opinion models, applied this modification to the Sznajd model, and presented some preliminary results. The present work extends that paper. We present results linking many properties of the mean-field fixed points with only a few qualitative aspects of the confidence rule (the biases and prejudices modeled), finding an interesting connection with graph theory problems. More precisely, we link the existence of fixed points with the notion of strongly connected graphs, and the stability of fixed points with the problem of finding the maximal independent sets of a graph. We state these results and present comparisons between the mean field and simulations on Barabási-Albert networks, followed by the main mathematical ideas and appendices with the rigorous proofs of our claims and some graph theory concepts, together with examples. We also show that there is no qualitative difference in the mean-field results if we require that a group of size q > 2, instead of a pair, of agreeing agents be formed before they attempt to convince other sites (for the mean field, this coincides with the q-voter model).
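The two graph notions the abstract invokes can be made concrete with a minimal, self-contained sketch (pure Python; the tiny opinion graphs, node labels, and brute-force enumeration below are illustrative choices, not the paper's implementation):

```python
from itertools import combinations

def is_strongly_connected(nodes, edges):
    """True if every node reaches every other along directed edges:
    check reachability from one node in the graph and in its reverse."""
    def reach(adj, start):
        seen, stack = {start}, [start]
        while stack:
            u = stack.pop()
            for v in adj.get(u, []):
                if v not in seen:
                    seen.add(v)
                    stack.append(v)
        return seen
    fwd, rev = {}, {}
    for u, v in edges:
        fwd.setdefault(u, []).append(v)
        rev.setdefault(v, []).append(u)
    start = next(iter(nodes))
    return reach(fwd, start) == set(nodes) == reach(rev, start)

def maximal_independent_sets(nodes, edges):
    """Brute force for small graphs: independent sets (no two members
    adjacent) that are not contained in a larger independent set."""
    edge_set = {frozenset(e) for e in edges}
    nodes = list(nodes)
    indep = [set(c) for r in range(1, len(nodes) + 1)
             for c in combinations(nodes, r)
             if not any(frozenset(p) in edge_set for p in combinations(c, 2))]
    return [s for s in indep if not any(s < t for t in indep)]
```

On a directed 3-cycle the first test succeeds; dropping one edge breaks strong connectivity. For the undirected path 1-2-3, the maximal independent sets are {2} and {1, 3}.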
Abstract:
In the present work we present a methodology that applies the many-body expansion to decrease the computational cost of ab initio molecular dynamics while keeping acceptable accuracy. We implemented this methodology in a program we called ManBo. In the many-body expansion approach, the total energy E of the system is partitioned into contributions of one body, two bodies, three bodies, and so on, up to the contribution of the Nth body [1-3]: E = E1 + E2 + E3 + ... + EN. The E1 term is the sum of the internal energies of the molecules; the E2 term is the energy due to the interaction between all pairs of molecules; E3 is the energy due to the interaction among all trios of molecules; and so on. In ManBo we chose to truncate the expansion at the two- or three-body contribution, both for the calculation of the energy and for the calculation of the atomic forces. In order to partially recover the many-body interactions neglected by truncating the expansion, an electrostatic embedding can be included in the electronic structure calculations, instead of treating the monomers, pairs and trios as isolated molecules in space. In our simulations we chose to simulate water molecules and used Gaussian 09 as the external program to calculate the atomic forces and energy of the system, as well as the reference program for assessing the accuracy of the results obtained with ManBo. The results show that the many-body expansion seems to be an interesting approach for reducing the still prohibitive computational cost of ab initio molecular dynamics. The errors introduced in the atomic forces by this methodology are very small. The inclusion of an electrostatic embedding seems to be a good way to improve the results with only a small increase in simulation time.
As we increase the level of calculation, the simulation time of ManBo tends to decrease greatly relative to a conventional BOMD simulation in Gaussian, due to the better scalability of the presented methodology. References: [1] E. E. Dahlke and D. G. Truhlar, J. Chem. Theory Comput. 3, 46 (2007). [2] E. E. Dahlke and D. G. Truhlar, J. Chem. Theory Comput. 4, 1 (2008). [3] R. Rivelino, P. Chaudhuri and S. Canuto, J. Chem. Phys. 118, 10593 (2003).
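As a rough illustration of the truncated expansion described above (not the ManBo code itself), the following sketch assembles E from one-body terms plus two- and optionally three-body corrections. Here `energy` is any callable standing in for an external electronic-structure call such as Gaussian, and `toy_energy` is a made-up pairwise-additive stand-in used only to exercise the bookkeeping:

```python
from itertools import combinations

def mbe_energy(fragments, energy, order=2):
    """Many-body expansion of the total energy, truncated at `order` bodies:
    E ~ sum_i E_i + sum_{i<j} dE_ij (+ sum_{i<j<k} dE_ijk, ...),
    where dE_ij = E(ij) - E_i - E_j is the two-body interaction correction."""
    n = len(fragments)
    e1 = [energy([f]) for f in fragments]
    total = sum(e1)
    d2 = {}
    if order >= 2:
        for i, j in combinations(range(n), 2):
            d2[(i, j)] = energy([fragments[i], fragments[j]]) - e1[i] - e1[j]
        total += sum(d2.values())
    if order >= 3:
        for i, j, k in combinations(range(n), 3):
            # three-body correction: full trimer energy minus all lower-order terms
            total += (energy([fragments[i], fragments[j], fragments[k]])
                      - d2[(i, j)] - d2[(i, k)] - d2[(j, k)]
                      - e1[i] - e1[j] - e1[k])
    return total

def toy_energy(cluster):
    """Made-up test energy: sum of fragment values plus all pair products.
    Being strictly pairwise additive, the 2-body truncation is exact for it."""
    return sum(cluster) + sum(a * b for a, b in combinations(cluster, 2))
```

For the toy system `[1.0, 2.0, 3.0]` the exact total is 17.0, and both the two- and three-body truncations recover it, since the toy energy has no genuine three-body term.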
Abstract:
The present work proposes an inverse method to estimate the heat sources in the transient two-dimensional heat conduction problem in a rectangular domain with convective boundaries. The nonhomogeneous partial differential equation (PDE) is solved using the Integral Transform Method. The test function for the heat generation term is obtained from the chip geometry and thermomechanical cutting. The heat generation term is then estimated by the conjugate gradient method (CGM) with an adjoint problem for parameter estimation. The experimental trials were organized to perform six different conditions providing heat sources of different intensities. The method was compared with others in the literature and its advantages are discussed. (C) 2012 Elsevier Ltd. All rights reserved.
Abstract:
The ground-state phase diagram of an Ising spin-glass model on a random graph with an arbitrary fraction w of ferromagnetic interactions is analysed in the presence of an external field. Using the replica method, and performing an analysis of stability of the replica-symmetric solution, it is shown that w = 1/2, corresponding to an unbiased spin glass, is a singular point in the phase diagram, separating a region with a spin-glass phase (w < 1/2) from a region with spin-glass, ferromagnetic, mixed and paramagnetic phases (w > 1/2).
Abstract:
The development of new statistical and computational methods is increasingly making it possible to bridge the gap between the hard sciences and the humanities. In this study, we propose an approach based on a quantitative evaluation of attributes of objects in humanities fields, from which concepts such as dialectics and opposition are given formal mathematical definitions. As case studies, we analyzed the temporal evolution of classical music and philosophy, obtaining data for 8 features characterizing the corresponding fields for 7 well-known composers and philosophers, which were treated with multivariate statistics and pattern recognition methods. A bootstrap method was applied to avoid the statistical bias caused by the small sample data set, with which hundreds of artificial composers and philosophers were generated, influenced by the 7 names originally chosen. Upon defining indices for opposition, skewness and counter-dialectics, we confirmed the intuitive analysis of historians in that classical music evolved according to a master-apprentice tradition, while changes in philosophy were driven by opposition. Though these case studies were meant only to show the possibility of treating phenomena in the humanities quantitatively, including a quantitative measure of concepts such as dialectics and opposition, the results are encouraging for further application of the approach presented here to many other areas, since it is entirely generic.
Abstract:
In this work, we report the construction of potential energy surfaces for the ³A″ and ³A′ states of the system O(³P) + HBr. These surfaces are based on extensive ab initio calculations employing the MRCI+Q/CBS+SO level of theory. The complete basis set energies were estimated from extrapolation of MRCI+Q/aug-cc-pVnZ(-PP) (n = Q, 5) results and corrections due to spin-orbit effects obtained at the CASSCF/aug-cc-pVTZ(-PP) level of theory. These energies, calculated over a region of the configuration space relevant to the study of the reaction O(³P) + HBr → OH + Br, were used to generate functions based on the many-body expansion. The three-body potentials were interpolated using the reproducing kernel Hilbert space method. The resulting surface for the ³A″ electronic state contains van der Waals minima in the entrance and exit channels and a transition state 6.55 kcal/mol higher than the reactants. This barrier height was then scaled to reproduce the value of 5.01 kcal/mol, which was estimated from coupled cluster benchmark calculations performed to include high-order and core-valence correlation, as well as scalar relativistic effects. The ³A′ surface was also scaled, based on the fact that at the collinear saddle point geometry these two electronic states are degenerate. The vibrationally adiabatic barrier heights are 3.44 kcal/mol for the ³A″ state and 4.16 kcal/mol for the ³A′ state. (C) 2012 American Institute of Physics. [http://dx.doi.org/10.1063/1.4705428]
Abstract:
This paper addresses the functional reliability and the complexity of reconfigurable antennas using graph models. A correlation between complexity and reliability for any given reconfigurable antenna is defined. Two methods are proposed to reduce failures and improve the reliability of reconfigurable antennas, where failures are caused by the reconfiguration technique or by the surrounding environment. The proposed failure-reduction methods are tested, and examples verifying them are given.
Abstract:
Knowing which individuals can be most efficient in spreading a pathogen throughout a given environment is a fundamental question in disease control. Indeed, over recent years the spread of epidemic diseases and its relationship with the topology of the underlying system have been a recurrent topic in complex network theory, taking into account both network models and real-world data. In this paper we explore possible correlations between the heterogeneous spread of an epidemic disease governed by the susceptible-infected-recovered (SIR) model and several attributes of the originating vertices, considering Erdős-Rényi (ER), Barabási-Albert (BA) and random geometric graphs (RGG), as well as a real case study, the US air transportation network, which comprises the 500 busiest airports in the US along with their interconnections. Initially, the heterogeneity of the spreading is achieved by considering the RGG networks, for which we analytically derive an expression for the distribution of the spreading rates among the established contacts, by assuming that such rates decay exponentially with the distance separating the individuals. Such a distribution is also considered for the ER and BA models, where we observe topological effects on the correlations. In the case of the airport network, the spreading rates are empirically defined, assumed to be directly proportional to seat availability. Among both the theoretical and real networks considered, we observe a high correlation between the total epidemic prevalence and the degree, as well as the strength and the accessibility, of the epidemic sources. For attributes such as the betweenness centrality and the k-shell index, however, the correlation depends on the topology considered.
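A minimal discrete-time SIR sketch on a contact network shows how the prevalence reached from a given source vertex can be measured; the update rule, graph encoding, and parameters below are illustrative assumptions, not the paper's (more detailed) model:

```python
import random

def sir_prevalence(adj, source, beta, gamma=1.0, rng=None):
    """Discrete-time SIR on a contact network {node: [neighbors]}.

    Each step, every infected node transmits to each susceptible neighbor
    with probability beta, then recovers with probability gamma.
    Returns the final epidemic prevalence: the fraction ever infected."""
    rng = rng or random.Random(0)  # seeded for reproducibility
    state = {u: 'S' for u in adj}
    state[source] = 'I'
    infected = {source}
    while infected:
        for u in list(infected):
            for v in adj[u]:
                if state[v] == 'S' and rng.random() < beta:
                    state[v] = 'I'
        for u in infected:
            if rng.random() < gamma:
                state[u] = 'R'
        infected = {u for u, s in state.items() if s == 'I'}
    return sum(s != 'S' for s in state.values()) / len(state)
```

With beta = 1 the dynamics are deterministic: the epidemic covers exactly the connected component of the source, so the prevalence directly reflects the source's position in the topology.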
Abstract:
In this work, we present theoretical photoluminescence (PL) spectra for p-doped GaAs/InGaAsN nanostructure arrays. We apply a self-consistent method in the framework of effective mass theory. Solving a full 8 x 8 Kane Hamiltonian, generalized to treat different materials, in conjunction with the Poisson equation, we calculate the optical properties of these systems. The trends in the calculated PL spectra, due to many-body effects within the quasi-two-dimensional hole gas, are analyzed as a function of the acceptor doping concentration and the well width. Effects of temperature on the PL spectra are also investigated. This is the first attempt to present theoretical luminescence spectra for GaAs/InGaAsN nanostructures, and it can serve as a guide for the design of nanostructured devices such as optoelectronic devices, solar cells, and others.
Abstract:
Many discussions have enlarged the Bibliometrics literature since Hirsch's proposal of the so-called h-index. Ranking papers according to their citations, this index quantifies a researcher solely by the largest number h of his or her papers that are each cited at least h times. A closed formula for the h-index distribution that can be applied to distinct databases is not yet known. In fact, to obtain such a distribution, knowledge of the citation distribution of the authors and its specificities is required. Instead of dealing with randomly chosen researchers, here we address different groups based on distinct databases. The first group is composed of physicists and biologists, with data extracted from the Institute for Scientific Information (ISI). The second group is composed of computer scientists, with data extracted from the Google Scholar system. In this paper, we obtain a general formula for the h-index probability density function (pdf) for groups of authors by using generalized exponentials in the context of escort probability. Our analysis includes the use of several statistical methods to estimate the necessary parameters, as well as an exhaustive comparison among the candidate distributions used to describe the way citations are distributed among authors. The h-index pdf can be used to classify groups of researchers from a quantitative point of view, which is of interest as a replacement for obscure qualitative methods. (C) 2011 Elsevier B.V. All rights reserved.
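The h-index itself is straightforward to compute from a citation list; a short sketch, independent of either database:

```python
def h_index(citations):
    """Largest h such that the author has h papers cited at least h times."""
    h = 0
    for i, c in enumerate(sorted(citations, reverse=True), start=1):
        if c >= i:
            h = i  # the i-th most-cited paper still has >= i citations
        else:
            break
    return h
```

For example, citation counts [10, 8, 5, 4, 3] give h = 4, since four papers have at least 4 citations each but not five papers with at least 5.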
Abstract:
This work proposes a method for data clustering based on complex network theory. A data set is represented as a network by considering different metrics to establish the connection between each pair of objects. The clusters are obtained by taking into account five community detection algorithms. The network-based clustering approach is applied to two real-world databases and two sets of artificially generated data. The obtained results suggest that the exponential of the Minkowski distance is the most suitable metric to quantify the similarities between pairs of objects. In addition, the community identification method based on greedy optimization provides the best cluster solution. We compare the network-based clustering approach with some traditional clustering algorithms and verify that it provides the lowest classification error rate. (C) 2012 Elsevier B.V. All rights reserved.
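A stripped-down sketch of the network construction step: similarity is the exponential of the Minkowski distance and edges are created above a threshold. As a simple stand-in for the five community-detection algorithms used in the paper, clusters are read off here as connected components:

```python
from math import exp

def minkowski(a, b, p=2):
    """Minkowski distance of order p between two feature vectors."""
    return sum(abs(x - y) ** p for x, y in zip(a, b)) ** (1 / p)

def network_clusters(points, threshold, p=2):
    """Link objects whose similarity exp(-d) exceeds `threshold`, then
    return connected components as clusters (illustrative stand-in for
    the community-detection step described in the abstract)."""
    n = len(points)
    adj = {i: [] for i in range(n)}
    for i in range(n):
        for j in range(i + 1, n):
            if exp(-minkowski(points[i], points[j], p)) > threshold:
                adj[i].append(j)
                adj[j].append(i)
    seen, clusters = set(), []
    for i in range(n):
        if i in seen:
            continue
        comp, stack = [], [i]
        seen.add(i)
        while stack:
            u = stack.pop()
            comp.append(u)
            for v in adj[u]:
                if v not in seen:
                    seen.add(v)
                    stack.append(v)
        clusters.append(sorted(comp))
    return clusters
```

Two tight pairs of points far from each other come out as two clusters, since the cross-pair similarities fall below the threshold.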
Abstract:
Up to now, the raise-and-peel model has been the only known example of a one-dimensional stochastic process in which one can observe conformal invariance. The model has one parameter. Depending on its value, one has a gapped phase, a critical point where one has conformal invariance, and a gapless phase with changing values of the dynamical critical exponent z. In this model, adsorption is local but desorption is not. The raise-and-strip model presented here, in which desorption is also nonlocal, has the same phase diagram. The critical exponents are different, as are some physical properties of the model. Our study suggests the possible existence of a whole class of stochastic models in which one can observe conformal invariance.
Abstract:
This paper compares the effectiveness of the Tsallis entropy over the classic Boltzmann-Gibbs-Shannon entropy for general pattern recognition, and proposes a multi-q approach to improve pattern analysis using entropy. A series of experiments were carried out for the problem of classifying image patterns. Given a dataset of 40 pattern classes, the goal of our image case study is to assess how well the different entropies can be used to determine the class of a newly given image sample. Our experiments show that the Tsallis entropy using the proposed multi-q approach has great advantages over the Boltzmann-Gibbs-Shannon entropy for pattern classification, boosting image recognition rates by a factor of 3. We discuss the reasons behind this success, shedding light on the usefulness of the Tsallis entropy and the multi-q approach. (C) 2012 Elsevier B.V. All rights reserved.
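A minimal sketch of the Tsallis entropy and the multi-q idea; the feature-vector construction below is an illustrative reading of the abstract, not the authors' exact pipeline:

```python
from math import log

def tsallis_entropy(p, q):
    """S_q = (1 - sum_i p_i^q) / (q - 1), which reduces to the
    Boltzmann-Gibbs-Shannon entropy in the limit q -> 1."""
    if abs(q - 1.0) < 1e-12:
        return -sum(pi * log(pi) for pi in p if pi > 0)
    return (1.0 - sum(pi ** q for pi in p)) / (q - 1.0)

def multi_q_signature(p, qs):
    """Multi-q idea: describe one distribution by a vector of Tsallis
    entropies over several q values instead of a single scalar."""
    return [tsallis_entropy(p, q) for q in qs]
```

For the uniform distribution over n states, S_q = (1 - n^(1-q)) / (q - 1); with n = 4 and q = 2 this gives 0.75, and as q approaches 1 it tends to ln 4.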
Abstract:
In this paper, a new algebraic-graph method for the identification of islanding in power system grids is proposed. The proposed method identifies all possible cases of islanding due to the loss of a piece of equipment, by means of a factorization of the bus-branch incidence matrix. The main features of this new method include: (i) simple implementation, (ii) high speed, (iii) real-time adaptability, (iv) identification of all islanding cases and (v) identification of the buses that compose each island in case of island formation. The method was successfully tested on large-scale systems such as the reduced south Brazilian system (45 buses/72 branches) and the south-southeast Brazilian system (810 buses/1340 branches). (C) 2011 Elsevier Ltd. All rights reserved.
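The paper's method works through a factorization of the bus-branch incidence matrix; as a hedged illustration of the same question (which buses end up in which island after a branch outage), here is a plain component search, which is not the authors' algorithm:

```python
def islands_after_outage(buses, branches, lost):
    """Return the islands (lists of buses) formed when branch `lost`,
    given as a (bus, bus) pair, is removed from the grid graph.
    Illustrative component search; the paper instead factorizes the
    bus-branch incidence matrix."""
    adj = {b: set() for b in buses}
    for u, v in branches:
        if {u, v} != set(lost):  # drop the lost branch
            adj[u].add(v)
            adj[v].add(u)
    seen, islands = set(), []
    for b in buses:
        if b in seen:
            continue
        comp, stack = [], [b]
        seen.add(b)
        while stack:
            u = stack.pop()
            comp.append(u)
            for w in adj[u]:
                if w not in seen:
                    seen.add(w)
                    stack.append(w)
        islands.append(sorted(comp))
    return islands
```

Losing a radial branch splits off an island containing the downstream bus, while losing a branch inside a loop leaves the grid connected.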
Abstract:
In this paper we have quantified the consistency of word usage in written texts represented by complex networks, where words were taken as nodes, by measuring the degree of preservation of the node neighborhood. Words were considered highly consistent if the authors used them with the same neighborhood. When ranked according to the consistency of use, the words obeyed a log-normal distribution, in contrast to Zipf's law that applies to the frequency of use. Consistency correlated positively with the familiarity and frequency of use, and negatively with ambiguity and age of acquisition. An inspection of some highly consistent words confirmed that they are used in very limited semantic contexts. A comparison of consistency indices for eight authors indicated that these indices may be employed for author recognition. Indeed, as expected, authors of novels could be distinguished from those who wrote scientific texts. Our analysis demonstrated the suitability of the consistency indices, which can now be applied in other tasks, such as emotion recognition.
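One plausible way to instantiate the neighborhood-preservation idea (the abstract does not give the exact index; the co-occurrence window and the Jaccard overlap below are assumptions for illustration):

```python
def neighborhoods(text, window=1):
    """Map each word to the set of words within `window` positions of it."""
    words = text.lower().split()
    nbr = {}
    for i, w in enumerate(words):
        ctx = words[max(0, i - window):i] + words[i + 1:i + 1 + window]
        nbr.setdefault(w, set()).update(ctx)
    return nbr

def consistency(text_a, text_b, word, window=1):
    """Jaccard overlap of a word's neighborhoods in two texts: 1.0 means
    the word is always used with the same neighbors (one plausible
    consistency index; the abstract does not specify the formula)."""
    na = neighborhoods(text_a, window).get(word, set())
    nb = neighborhoods(text_b, window).get(word, set())
    if not na and not nb:
        return 0.0
    return len(na & nb) / len(na | nb)
```

For instance, "black" in "the black cat" versus "the black dog" shares one of three distinct neighbors, giving a consistency of 1/3, whereas identical usage gives 1.0.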