1000 results for statistical physics
Abstract:
Inference and optimisation of real-valued edge variables in sparse graphs are studied using tree-based optimisation algorithms derived from the Bethe approximation. Equilibrium states of general energy functions involving a large set of real edge variables that interact at the network nodes are obtained for networks in various cases. These include different cost functions, connectivity values, constraints on the edge bandwidth and the case of multiclass optimisation.
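A minimal sketch of the kind of computation involved, here as exact min-sum dynamic programming for discretized edge variables on a small path graph (the quadratic edge and node costs, the grid and the weights are illustrative assumptions, not the specific model of the paper):

    # Min-sum optimisation of real-valued edge variables on a path graph 0-1-2-3.
    # Assumed costs: edge cost phi_e(x) = w_e * x**2 and node cost
    # psi_i(s) = (s - R_i)**2, with s the sum of the edge variables at node i.
    # On a tree this dynamic programme is exact, the regime in which the
    # Bethe approximation is exact as well.
    import numpy as np

    grid = np.linspace(-2.0, 2.0, 81)      # discretisation of each edge variable
    w = [1.0, 0.5, 1.0]                    # edge weights (assumed)
    R = [1.0, 0.0, 0.0, -1.0]              # node targets (assumed)

    phi = [wi * grid**2 for wi in w]       # edge costs on the grid
    psi = lambda s, Ri: (s - Ri)**2        # node mismatch cost

    # Forward (leaf-to-root) min-sum messages along the chain of edge variables.
    m1 = np.array([np.min(phi[0] + psi(grid, R[0]) + psi(grid + x1, R[1]))
                   for x1 in grid])        # m1(x1) after eliminating x0
    m2 = np.array([np.min(phi[1] + m1 + psi(grid + x2, R[2]))
                   for x2 in grid])        # m2(x2) after eliminating x1

    # Root minimisation and backtracking of the arg-minima.
    x2 = grid[np.argmin(phi[2] + m2 + psi(grid, R[3]))]
    x1 = grid[np.argmin(phi[1] + m1 + psi(grid + x2, R[2]))]
    x0 = grid[np.argmin(phi[0] + psi(grid, R[0]) + psi(grid + x1, R[1]))]
    print("optimal edge variables:", x0, x1, x2)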
Abstract:
Networking encompasses a variety of tasks related to the communication of information on networks; it has a substantial economic and societal impact on a broad range of areas including transportation systems, wired and wireless communications and a range of Internet applications. As transportation and communication networks become increasingly more complex, the ever-increasing demand for congestion control, higher traffic capacity, quality of service, robustness and reduced energy consumption requires new tools and methods to meet these conflicting requirements. The new methodology should serve for gaining a better understanding of the properties of networking systems at the macroscopic level, as well as for the development of new principled optimization and management algorithms at the microscopic level. Methods of statistical physics seem best placed to provide new approaches as they have been developed specifically to deal with nonlinear large-scale systems. This review aims at presenting an overview of tools and methods that have been developed within the statistical physics community and that can be readily applied to address the emerging problems in networking. These include diffusion processes, methods from disordered systems and polymer physics, and probabilistic inference, which have direct relevance to network routing, file and frequency distribution, the exploration of network structures and vulnerability, and various other practical networking applications. © 2013 IOP Publishing Ltd.
Abstract:
Motivation: Within bioinformatics, the textual alignment of amino acid sequences has long dominated the determination of similarity between proteins, with all that implies for shared structure, function, and evolutionary descent. Despite the relative success of modern-day sequence alignment algorithms, so-called alignment-free approaches offer a complementary means of determining and expressing similarity, with potential benefits in certain key applications, such as regression analysis of protein structure-function studies, where alignment-based similarity has performed poorly. Results: Here, we offer a fresh, statistical physics-based perspective focusing on the question of alignment-free comparison, in the process adapting results on the “first passage probability distribution” to summarize statistics of ensemble-averaged amino acid propensity values. In this paper, we introduce and elaborate this approach.
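As a rough illustration of comparing proteins without alignment, the sketch below maps each residue to a propensity value, summarizes the resulting series by a few simple statistics (including a first-passage index of the cumulative propensity walk) and compares sequences through those summaries; the propensity table, threshold and choice of statistics are placeholder assumptions, not the construction of the paper:

    # Generic alignment-free comparison: per-residue propensity series summarized
    # by simple statistics; no alignment is performed at any point.
    import numpy as np

    # Placeholder propensity scale over the 20 amino acids (not a published scale).
    PROPENSITY = dict(zip("ACDEFGHIKLMNPQRSTVWY", np.linspace(-1.0, 1.0, 20)))

    def first_passage(values, threshold=2.0):
        """Index at which the cumulative propensity walk first exceeds threshold."""
        walk = np.cumsum(values)
        hits = np.where(walk >= threshold)[0]
        return hits[0] if hits.size else len(values)

    def summary(seq):
        vals = np.array([PROPENSITY[aa] for aa in seq if aa in PROPENSITY])
        return np.array([vals.mean(), vals.std(), first_passage(vals)])

    def distance(seq_a, seq_b):
        return np.linalg.norm(summary(seq_a) - summary(seq_b))

    print(distance("MKTAYIAKQR", "MKTAYIAKQK"))   # similar sequences: small value
    print(distance("MKTAYIAKQR", "GGGGGGGGGG"))   # different composition: larger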
Abstract:
The extended Gaussian ensemble (EGE) is introduced as a generalization of the canonical ensemble. This ensemble is a further extension of the Gaussian ensemble introduced by Hetherington [J. Low Temp. Phys. 66, 145 (1987)]. The statistical mechanical formalism is derived both from the analysis of the system attached to a finite reservoir and from the maximum statistical entropy principle. The probability of each microstate depends on two parameters β and γ, which allow one to fix, independently, the mean energy of the system and the energy fluctuations, respectively. We establish the Legendre transform structure for the generalized thermodynamic potential and propose a stability criterion. We also compare the EGE probability distribution with the q-exponential distribution. As an example, an application to a system with few independent spins is presented.
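For concreteness, maximizing the Gibbs entropy under constraints on the first two moments of the energy yields microstate weights of the form (notation ours, sketching the structure described above):

    \[
    p_i \;=\; \frac{1}{Z(\beta,\gamma)}\, e^{-\beta E_i - \gamma E_i^2},
    \qquad
    Z(\beta,\gamma) \;=\; \sum_i e^{-\beta E_i - \gamma E_i^2},
    \]

with β conjugate to the mean energy and γ controlling the size of the energy fluctuations; setting γ = 0 recovers the canonical ensemble.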
Abstract:
During plastic deformation of crystalline materials, the collective dynamics of interacting dislocations gives rise to various patterning phenomena. A crucial and still open question is whether the long-range dislocation-dislocation interactions, which do not have an intrinsic range, can lead to spatial patterns which may exhibit well-defined characteristic scales. It is demonstrated for a general model of two-dimensional dislocation systems that spontaneously emerging dislocation pair correlations introduce a length scale which is proportional to the mean dislocation spacing. General properties of the pair correlation functions are derived, and explicit calculations are performed for a simple special case, viz. pair correlations in single-glide dislocation dynamics. It is shown that in this case the dislocation system exhibits a patterning instability leading to the formation of walls normal to the glide plane. The results are discussed in terms of their general implications for dislocation patterning.
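A generic way to make the pair-correlation statistics concrete: the standard radial distribution function g(r) of a two-dimensional set of dislocation positions, estimated below for an uncorrelated (Poisson) point set in a periodic box (a textbook estimator, not the paper's single-glide calculation):

    # Radial pair correlation g(r) for a 2D point set in a periodic square box;
    # deviations of g(r) from 1 would signal a characteristic correlation length.
    import numpy as np

    def pair_correlation(points, box, rmax, nbins=50):
        n = len(points)
        density = n / box**2
        d = points[:, None, :] - points[None, :, :]
        d -= box * np.round(d / box)                  # minimum-image convention
        r = np.sqrt((d**2).sum(-1))[np.triu_indices(n, k=1)]
        hist, edges = np.histogram(r, bins=nbins, range=(0.0, rmax))
        shell_area = np.pi * (edges[1:]**2 - edges[:-1]**2)
        # Normalize the n(n-1)/2 observed pairs by the ideal-gas expectation.
        g = hist / (0.5 * n * density * shell_area)
        return 0.5 * (edges[1:] + edges[:-1]), g

    rng = np.random.default_rng(0)
    pts = rng.uniform(0.0, 10.0, size=(400, 2))       # random "dislocations"
    r, g = pair_correlation(pts, box=10.0, rmax=3.0)
    print(np.round(g[:5], 2))                         # close to 1: no correlations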
Abstract:
An immense variety of problems in theoretical physics are of the non-linear type. Non-linear partial differential equations (NPDEs) have almost become the rule rather than the exception in diverse branches of physics such as fluid mechanics, field theory, particle physics, statistical physics and optics, and the construction of exact solutions of these equations constitutes one of the most vigorous activities in theoretical physics today. The thesis entitled ‘Some Non-linear Problems in Theoretical Physics’ addresses various aspects of this problem at the classical level. For obtaining exact solutions we have used mathematical tools like the bilinear operator method, the base equation technique and the similarity method, with emphasis on their group-theoretical aspects. The thesis deals with certain methods of finding exact solutions of a number of non-linear partial differential equations of importance to theoretical physics. Some of these new solutions are of relevance from the applications point of view in diverse branches such as elementary particle physics, field theory, solid state physics and non-linear optics, and give some insight into the stable or unstable behavior of dynamical systems. The thesis consists of six chapters.
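A textbook example of the kind of exact solution such techniques produce (quoted for illustration, not a result specific to this thesis): for the Korteweg-de Vries equation, Hirota's bilinear substitution with a single exponential yields the one-soliton solution,

    \[
    u_t + 6\,u\,u_x + u_{xxx} = 0,
    \qquad
    u = 2\,\partial_x^2 \ln\!\left(1 + e^{\,k x - k^3 t + \delta}\right)
      = \frac{k^2}{2}\,\operatorname{sech}^2\!\left[\frac{k}{2}\,(x - k^2 t) + \frac{\delta}{2}\right].
    \]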
Abstract:
We consider the general response theory recently proposed by Ruelle for describing the impact of small perturbations to the non-equilibrium steady states resulting from Axiom A dynamical systems. We show that the causality of the response functions entails the possibility of writing a set of Kramers-Kronig (K-K) relations for the corresponding susceptibilities at all orders of nonlinearity. Nonetheless, only a special class of directly observable susceptibilities obey K-K relations. Specific results are provided for the case of arbitrary-order harmonic response, which allows for a very comprehensive K-K analysis and the establishment of sum rules connecting the asymptotic behavior of the harmonic generation susceptibility to the short-time response of the perturbed system. These results place previous findings obtained for optical systems and simple mechanical models in a more general theoretical framework, and shed light on the very general impact of considering the principle of causality for testing self-consistency: the described dispersion relations constitute unavoidable benchmarks that any experimental and model-generated dataset must obey. The theory exposed in the present paper is dual to the time-dependent theory of perturbations to equilibrium states and to non-equilibrium steady states, and has in principle a similar range of applicability and limitations. In order to connect the equilibrium and the non-equilibrium steady-state case, we show how to rewrite the classical response theory by Kubo so that response functions formally identical to those proposed by Ruelle, apart from the measure involved in the phase space integration, are obtained. These results, taking into account the chaotic hypothesis by Gallavotti and Cohen, might be relevant in several fields, including climate research. In particular, whereas the fluctuation-dissipation theorem does not work for non-equilibrium systems, because of the non-equivalence between internal and external fluctuations, K-K relations might be robust tools for the definition of a self-consistent theory of climate change.
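For reference, in the linear (first-order) case the dispersion relations referred to above take the familiar Kramers-Kronig form: for a causal susceptibility χ(ω) that decays at high frequency,

    \[
    \mathrm{Re}\,\chi(\omega) \;=\; \frac{2}{\pi}\,\mathcal{P}\!\int_0^{\infty}
    \frac{\omega'\,\mathrm{Im}\,\chi(\omega')}{\omega'^2 - \omega^2}\,d\omega',
    \qquad
    \mathrm{Im}\,\chi(\omega) \;=\; -\frac{2}{\pi}\,\mathcal{P}\!\int_0^{\infty}
    \frac{\omega\,\mathrm{Re}\,\chi(\omega')}{\omega'^2 - \omega^2}\,d\omega',
    \]

with the integrals understood as principal values; the relations discussed in the paper generalize this structure to higher orders and to harmonic response.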
Abstract:
The nearest-neighbor spacing distributions proposed by four models, namely, the Berry-Robnik, Caurier-Grammaticos-Ramani, Lenz-Haake, and the deformed Gaussian orthogonal ensemble, as well as the ansatz by Brody, are applied to the transition between chaos and order that occurs in the isotropic quartic oscillator. The advantages and disadvantages of these five descriptions are discussed. In addition, the results of a simple extension of the expression for the Dyson-Mehta statistic Δ₃ are compared with those of a more popular one, usually associated with the Berry-Robnik formalism. ©1999 The American Physical Society.
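For concreteness, the Brody ansatz mentioned above interpolates between the Poisson (q = 0) and Wigner (q = 1) nearest-neighbor spacing distributions through

    \[
    P_q(s) \;=\; (q+1)\,A\,s^{\,q}\,\exp\!\left(-A\,s^{\,q+1}\right),
    \qquad
    A \;=\; \left[\Gamma\!\left(\tfrac{q+2}{q+1}\right)\right]^{q+1},
    \]

with A fixed so that the distribution is normalized and the mean spacing equals one.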
Abstract:
While the use of statistical physics methods to analyze large corpora has been useful to unveil many patterns in texts, no comprehensive investigation has been performed on the interdependence between syntactic and semantic factors. In this study we propose a framework for determining whether a text (e.g., written in an unknown alphabet) is compatible with a natural language and to which language it could belong. The approach is based on three types of statistical measurements, i.e. obtained from first-order statistics of word properties in a text, from the topology of complex networks representing texts, and from intermittency concepts where text is treated as a time series. Comparative experiments were performed with the New Testament in 15 different languages and with distinct books in English and Portuguese in order to quantify the dependency of the different measurements on the language and on the story being told in the book. The metrics found to be informative in distinguishing real texts from their shuffled versions include assortativity, degree and selectivity of words. As an illustration, we analyze an undeciphered medieval manuscript known as the Voynich Manuscript. We show that it is mostly compatible with natural languages and incompatible with random texts. We also obtain candidates for keywords of the Voynich Manuscript which could be helpful in the effort of deciphering it. Because we were able to identify statistical measurements that are more dependent on the syntax than on the semantics, the framework may also serve for text analysis in language-dependent applications.
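A minimal sketch of the kind of measurement used in such a framework: build a word co-occurrence network from a text, compute the degree and the selectivity of each word (selectivity taken here as node strength divided by degree, one common definition), and compare against a shuffled version of the same text. The tokenizer, the adjacency window and the input file name are illustrative assumptions:

    # Compare network statistics of a text with those of its shuffled version.
    import random
    import re
    from collections import Counter, defaultdict

    def tokens(text):
        return re.findall(r"[a-z]+", text.lower())

    def network_stats(words):
        weight = defaultdict(int)
        for a, b in zip(words, words[1:]):            # adjacent co-occurrence
            if a != b:
                weight[tuple(sorted((a, b)))] += 1
        degree, strength = Counter(), Counter()
        for (a, b), w in weight.items():
            degree[a] += 1; degree[b] += 1
            strength[a] += w; strength[b] += w
        selectivity = {v: strength[v] / degree[v] for v in degree}
        return degree, selectivity

    words = tokens(open("sample.txt", encoding="utf-8").read())   # any plain text
    shuffled = words[:]
    random.shuffle(shuffled)

    for label, seq in [("original", words), ("shuffled", shuffled)]:
        degree, selectivity = network_stats(seq)
        top = max(selectivity, key=selectivity.get)
        print(label, "word with highest selectivity:", top,
              round(selectivity[top], 2))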
Abstract:
The subject of this work is the study of the immigration phenomenon, with emphasis on the aspects related to the integration of an immigrant population into a host one. The aim of this work is to show the forecasting ability of a recent approach in which the behavior of integration quantifiers was analyzed and investigated with a mathematical model of statistical-physics origin (a generalization of the monomer-dimer model). After providing a detailed literature review of the model, we show that not only is such a model able to identify the social mechanism that drives a particular integration process, but it also provides correct forecasts. The research reported here proves that the proposed model of integration and its forecasting framework are simple and effective tools for reducing uncertainties about how integration phenomena emerge and how they are likely to develop in response to increased migration levels in the future.
Abstract:
Statistical physicists assume a probability distribution over micro-states to explain thermodynamic behavior. The question of this paper is whether these probabilities are part of a best system and can thus be interpreted as Humean chances. I consider two strategies, viz. a globalist one, as suggested by Loewer, and a localist one, as advocated by Frigg and Hoefer. Both strategies fail because the systems they are part of have rivals that are roughly equally good, while ontic probabilities should be part of a clearly winning system. I conclude with the diagnosis that well-defined micro-probabilities underestimate the robust character of explanations in statistical physics.
Abstract:
Using methods of Statistical Physics, we investigate the generalization performance of support vector machines (SVMs), which have been recently introduced as a general alternative to neural networks. For nonlinear classification rules, the generalization error saturates on a plateau when the number of examples is too small to properly estimate the coefficients of the nonlinear part. When trained on simple rules, we find that SVMs overfit only weakly. The performance of SVMs is strongly enhanced when the distribution of the inputs has a gap in feature space.
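An empirical illustration of the learning-curve behaviour described above, using a generic kernel-SVM implementation rather than the statistical-physics calculation of the paper (the teacher rule, kernel and parameters are illustrative assumptions):

    # Estimate the generalization error of a kernel SVM as the number of
    # training examples grows, for a simple (linearly separable) teacher rule.
    import numpy as np
    from sklearn.svm import SVC

    rng = np.random.default_rng(0)
    dim, n_test = 20, 2000
    teacher = rng.normal(size=dim)                    # the simple target rule

    def sample(n):
        x = rng.normal(size=(n, dim))
        return x, np.sign(x @ teacher)

    x_test, y_test = sample(n_test)
    for n_train in [20, 50, 100, 200, 500, 1000]:
        x, y = sample(n_train)
        clf = SVC(kernel="rbf", C=10.0, gamma="scale").fit(x, y)
        err = np.mean(clf.predict(x_test) != y_test)
        print(f"n = {n_train:5d}   test error = {err:.3f}")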
Abstract:
Using techniques from Statistical Physics, the annealed VC entropy for hyperplanes in high-dimensional spaces is calculated as a function of the margin for a spherical Gaussian distribution of inputs.
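For orientation, at zero margin the count behind the annealed entropy reduces, for points in general position, to Cover's classical number of dichotomies of n points realizable by hyperplanes through the origin in d dimensions (a standard result quoted here for context, not taken from the paper):

    \[
    C(n,d) \;=\; 2\sum_{k=0}^{d-1}\binom{n-1}{k},
    \]

so the annealed entropy at zero margin is ln C(n,d); imposing a minimum margin reduces this count, which is what the calculation above tracks as a function of the margin.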