17 results for statistical mechanics many-body inverse problem graph-theory

at Universidade Federal do Rio Grande do Norte (UFRN)


Relevance:

100.00%

Publisher:

Abstract:

In the first part of this work, our concern was to investigate thermal effects in organic crystals using the theory of polarons. To analyze such effects, we used the Fröhlich Hamiltonian, which describes the dynamics of polarons, in a quantum-mechanical treatment that elucidates the electron-phonon interaction. There are many ways to analyze the polaronic phenomenon; however, measurement of the dielectric function can supply important information about the small-polaron hopping process. Moreover, the dielectric function measures the response to an applied external electric field, and it is an important tool for understanding many-body effects in the normal state of a polaronic system. We calculated the dielectric function and its dependence on temperature using the Hartree-Fock decoupling method, and we depict this temperature dependence in a 3D graph. We also analyzed the so-called Arrhenius resistivity as a function of temperature, an important tool for characterizing the conductivity of an organic molecule. In the second part, we analyzed two perovskite-type crystalline oxides, namely triclinic cadmium silicate (CdSiO3) and orthorhombic calcium plumbate (CaPbO3). These materials are commonly denoted ABO3 and have been investigated especially for displaying ferroelectric, piezoelectric, dielectric, semiconducting and superconducting properties. We obtained our results through ab initio methods within density functional theory (DFT) in the GGA-PBE and LDA-CAPZ approximations. After optimizing the geometry of the two structures in both approximations, we found the structural parameters and compared them with experimental data. We also determined the bond angles for the two analyzed cases.
After the convergence of the energy, we determined their band structures, fundamental information for characterizing the nature of a material, as well as their dielectric functions, optical absorption, partial densities of states, and effective masses of electrons and holes.
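The Arrhenius-type resistivity mentioned above can be sketched numerically. This is a minimal illustration, assuming the common thermally activated form ρ(T) = ρ0·exp(Ea/kB·T) for small-polaron hopping; ρ0 and the activation energy Ea below are hypothetical values, not results from this work.

```python
import math

K_B = 8.617333262e-5  # Boltzmann constant, eV/K

def arrhenius_resistivity(temperature_k, rho0, activation_ev):
    """Thermally activated (Arrhenius) resistivity,
    rho(T) = rho0 * exp(Ea / (kB * T)). rho0 and Ea are hypothetical
    illustration values, not results from the work summarized above."""
    return rho0 * math.exp(activation_ev / (K_B * temperature_k))

# On an Arrhenius plot, ln(rho) is linear in 1/T with slope Ea/kB,
# so the activation energy can be read back from two points:
t1, t2 = 250.0, 400.0
r1 = arrhenius_resistivity(t1, 1.0, 0.2)
r2 = arrhenius_resistivity(t2, 1.0, 0.2)
slope = (math.log(r1) - math.log(r2)) / (1.0 / t1 - 1.0 / t2)
recovered_ea = slope * K_B
```

The straight-line Arrhenius plot is what makes this form a practical tool for characterizing hopping conduction in organic molecules.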

Relevance:

100.00%

Publisher:

Abstract:

We address the generalization of q-deformed thermodynamic quantities through a q-algebra that describes a general algebra for bosons and fermions. The motivation for our study stems from an interest in strengthening our initial ideas and in a possible experimental application. Along the way, we encountered a recently proposed generalization of the formalism of the q-calculus: the application of a generalized sequence described by two independent, positive, real deformation parameters, q1 and q2, known as Fibonacci oscillators. We apply it to the well-known problem of Landau diamagnetism immersed in a D-dimensional space, which still generates good discussions by its nature; its dependence on the number of dimensions D enables us in the future to extend the application to extra-dimensional systems, such as those of modern cosmology, particle physics and string theory. We compare our results with some obtained experimentally, finding good agreement. We also use the oscillator formalism for the Einstein and Debye models of solids, strengthening the interpretation of the q-deformation as a factor of disturbance or impurity in a given system that modifies its properties. Our results show that the insertion of two disorder parameters allows a wider range of adjustment, i.e., it enables changing only the desired property, e.g., the thermal conductivity of a given element, without losing its essential character.
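A minimal sketch of the two-parameter deformation underlying the Fibonacci oscillators may help. It assumes the standard basic-number convention [n] = (q1^(2n) − q2^(2n))/(q1² − q2²) from the Fibonacci-oscillator literature; the exact convention used in the work may differ.

```python
def fib_basic_number(n, q1, q2):
    """Two-parameter (Fibonacci-oscillator) basic number,
    [n] = (q1**(2n) - q2**(2n)) / (q1**2 - q2**2).
    The ordinary integer n is recovered as q1, q2 -> 1. This is the
    standard convention from the Fibonacci-oscillator literature; the
    work above may use a slightly different one."""
    if abs(q1 - q2) < 1e-12:
        return n * q1 ** (2 * n - 2)  # limiting case q1 == q2
    return (q1 ** (2 * n) - q2 ** (2 * n)) / (q1 ** 2 - q2 ** 2)

# The deformed numbers satisfy a Fibonacci-like recurrence,
# [n+1] = (q1^2 + q2^2)[n] - (q1*q2)^2 [n-1]:
q1, q2 = 1.1, 0.9
for n in range(1, 6):
    lhs = fib_basic_number(n + 1, q1, q2)
    rhs = (q1 ** 2 + q2 ** 2) * fib_basic_number(n, q1, q2) \
        - (q1 * q2) ** 2 * fib_basic_number(n - 1, q1, q2)
    assert abs(lhs - rhs) < 1e-9
```

The two independent knobs q1 and q2 are what give the wider range of adjustment described above: varying them separately deforms the spectrum in two distinct ways.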

Relevance:

100.00%

Publisher:

Abstract:

Verbal fluency is the ability to produce a satisfying sequence of spoken words during a given time interval. The core of verbal fluency lies in the capacity to manage the executive aspects of language. The standard scores of the semantic verbal fluency test are broadly used in the neuropsychological assessment of the elderly, and different analytical methods are likely to extract even more information from the data generated in this test. Graph theory, a mathematical approach to analyzing relations between items, represents a promising tool for understanding a variety of neuropsychological states. This study reports a graph analysis of data generated by the semantic verbal fluency test by cognitively healthy elderly (NC), patients with Mild Cognitive Impairment - amnestic (aMCI) and amnestic multiple-domain (a+mdMCI) subtypes - and patients with Alzheimer's disease (AD). Sequences of words were represented as a speech graph in which every word corresponded to a node and temporal links between words were represented by directed edges. To characterize the structure of the data we calculated 13 speech graph attributes (SGAs). The individuals were compared when divided into three (NC - MCI - AD) and four (NC - aMCI - a+mdMCI - AD) groups. When the three groups were compared, significant differences were found in the standard measure of correct words produced and in three SGAs: diameter, average shortest path, and network density. The SGAs sorted the elderly groups with good specificity and sensitivity. When the four groups were compared, they differed significantly in network density, except between the two MCI subtypes and between NC and aMCI. The diameter of the network and the average shortest path were significantly different between NC and AD, and between aMCI and AD. The SGAs sorted the elderly into their groups with good specificity and sensitivity, performing better than the standard score of the task.
These findings provide support for a new methodological framework to assess the strength of semantic memory through the verbal fluency task, with potential to amplify the predictive power of this test. Graph analysis is likely to become clinically relevant in neurology and psychiatry, and may be particularly useful in the differential diagnosis of the elderly.
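The graph construction described above can be sketched in a few lines. This illustrative pure-Python version treats edges as undirected and computes only three of the thirteen attributes; the study itself used directed edges and a richer attribute set.

```python
from collections import deque

def speech_graph_attributes(words):
    """Build a speech graph from a word sequence (each distinct word a
    node, each temporal transition between words an edge) and compute
    three of the attributes discussed above. Pure-Python sketch treating
    edges as undirected; the original study used directed edges and a
    larger set of 13 attributes."""
    nodes = sorted(set(words))
    # undirected edge set; immediate repetitions would be self-loops
    edges = {tuple(sorted(p)) for p in zip(words, words[1:]) if p[0] != p[1]}
    adj = {v: set() for v in nodes}
    for a, b in edges:
        adj[a].add(b)
        adj[b].add(a)

    def bfs_dists(src):
        """Shortest-path lengths from src by breadth-first search."""
        dist = {src: 0}
        queue = deque([src])
        while queue:
            u = queue.popleft()
            for v in adj[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    queue.append(v)
        return dist

    n = len(nodes)
    all_d = [d for v in nodes for u, d in bfs_dists(v).items() if u != v]
    return {
        "density": 2 * len(edges) / (n * (n - 1)),
        "diameter": max(all_d),
        "average_shortest_path": sum(all_d) / len(all_d),
    }

# Example: a short semantic-fluency run in which "dog" is revisited
attrs = speech_graph_attributes(["dog", "cat", "horse", "dog", "cow"])
```

Revisiting a word ("dog" above) creates extra edges around an existing node, which is exactly what shifts density, diameter and average shortest path between clinical groups.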

Relevance:

100.00%

Publisher:

Abstract:

Systems whose spectra are fractals or multifractals have received a lot of attention in recent years. A complete understanding of the behavior of many physical properties of these systems is still lacking because of their complexity. Thus, new applications and new methods for studying their spectra have been proposed, shedding light on their properties and enabling a better understanding of these systems. In this work we first present the basic theoretical framework needed to calculate the energy spectrum of elementary excitations in some systems, especially quasiperiodic ones. We then show, by using the Schrödinger equation in the tight-binding approximation, results for the specific heat of electrons within Boltzmann-Gibbs statistical mechanics for one-dimensional quasiperiodic systems grown following the Fibonacci and Double Period rules. Structures of this type have already been extensively studied; however, the non-extensive statistical mechanics proposed by Constantino Tsallis is well suited to systems that have a fractal profile, and therefore our main objective was to apply it to the calculation of thermodynamic quantities, extending somewhat further the understanding of the properties of these systems. Accordingly, we calculate, analytically and numerically, the generalized specific heat of electrons in one-dimensional quasiperiodic systems (quasicrystals) generated by the Fibonacci and Double Period sequences. The electronic spectra were obtained by solving the Schrödinger equation in the tight-binding approach. Numerical results are presented for the two types of systems with different values of the nonextensivity parameter q.
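The quasiperiodic chains mentioned above are grown by substitution (inflation) rules, which can be sketched directly. The rules below are the standard Fibonacci (A → AB, B → A) and Double Period (A → AB, B → AA) rules; in the tight-binding calculation the letters are then mapped to site energies or hopping amplitudes.

```python
def substitute(seed, rules, generations):
    """Grow a quasiperiodic letter chain by repeated substitution
    (inflation); in the tight-binding calculation each letter is then
    mapped to a site energy or hopping amplitude."""
    seq = seed
    for _ in range(generations):
        seq = "".join(rules[c] for c in seq)
    return seq

FIBONACCI = {"A": "AB", "B": "A"}       # A -> AB, B -> A
DOUBLE_PERIOD = {"A": "AB", "B": "AA"}  # A -> AB, B -> AA

fib5 = substitute("A", FIBONACCI, 5)     # "ABAABABAABAAB" (length 13)
dp3 = substitute("A", DOUBLE_PERIOD, 3)  # "ABAAABAB" (length 8)
```

The Fibonacci chain lengths follow the Fibonacci numbers (1, 2, 3, 5, 8, 13, ...), which is what gives the resulting electronic spectrum its fractal, Cantor-like structure.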

Relevance:

100.00%

Publisher:

Abstract:

In this thesis, we analyzed the formation of Maxwellian tails of the distributions of rotational velocity in the context of out-of-equilibrium Boltzmann-Gibbs statistical mechanics. We start from a unified model for the angular momentum loss rate, which made possible the construction of a general theory for rotational decay in which, finally, by combining the standard Maxwellian with the rotational decay relation, we defined the (_, _) Maxwellian distributions. The results reveal that out-of-equilibrium Boltzmann-Gibbs statistics supplies results as good as those of the Tsallis and Kaniadakis generalized statistics, besides allowing fits controlled by physical properties extracted from the theory of stellar rotation itself. In addition, our results point out that these generalized statistics converge to Boltzmann-Gibbs statistics when we insert, into their respective distribution functions, a rotational velocity defined as a distribution.

Relevance:

100.00%

Publisher:

Abstract:

Considering a non-relativistic ideal gas, the standard foundations of kinetic theory are investigated in the context of the non-Gaussian statistical mechanics introduced by Kaniadakis. The new formalism is based on a generalization of the Boltzmann H-theorem and on the deduction of Maxwell's statistical distribution. The calculated power-law distribution is parameterized through a parameter κ measuring the degree of non-Gaussianity. In the limit κ = 0, the theory of the Gaussian Maxwell-Boltzmann distribution is recovered. Two physical applications of the non-Gaussian effects have been considered. In the first, the κ-Doppler broadening of spectral lines from an excited gas is obtained from analytical expressions. In the second, a mathematical relationship between the entropic index κ and the stellar polytropic index is shown by using the thermodynamic formulation for self-gravitating systems.
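The κ-deformed formalism can be illustrated with the Kaniadakis κ-exponential, exp_κ(x) = (√(1+κ²x²) + κx)^(1/κ), which replaces exp(x) in the generalized Maxwellian and reduces to it as κ → 0. A minimal sketch:

```python
import math

def kappa_exp(x, kappa):
    """Kaniadakis kappa-exponential,
    exp_k(x) = (sqrt(1 + k^2 x^2) + k*x)**(1/k),
    which reduces to the ordinary exponential as k -> 0 and gives
    power-law tails for k != 0."""
    if kappa == 0.0:
        return math.exp(x)
    return (math.sqrt(1.0 + kappa ** 2 * x ** 2) + kappa * x) ** (1.0 / kappa)

# In the kappa-Maxwellian, exp(-v^2/v0^2) is replaced by exp_k(-v^2/v0^2);
# the k != 0 tail is heavier than the Gaussian one:
gaussian_tail = math.exp(-5.0)
kappa_tail = kappa_exp(-5.0, 0.3)
```

The heavier, power-law tail is precisely what modifies the Doppler line profile and links κ to the stellar polytropic index in the two applications above.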

Relevance:

100.00%

Publisher:

Abstract:

In general, an inverse problem consists in finding an element x in a suitable vector space, given a vector y that measures it in some sense. When we discretize the problem, it usually boils down to solving an equation system f(x) = y, where f : U ⊂ R^m → R^n represents the step function on some domain U of the appropriate R^m. As a general rule, we arrive at an ill-posed problem. The resolution of inverse problems has been widely researched over the last decades, because many problems in science and industry consist in determining unknowns that we try to know by observing their effects under certain indirect measures. The general subject of this dissertation is the choice of the Tikhonov regularization parameter of a poorly conditioned linear problem, as discussed in Chapter 1, focusing on the three most popular methods in the current literature of the area. Our more specific focus consists in the simulations reported in Chapter 2, which aim to compare the performance of the three methods in the recovery of images measured with the Radon transform and perturbed by the addition of Gaussian i.i.d. noise. We chose a difference operator as the regularizer of the problem. The contribution we try to make in this dissertation consists mainly in the discussion of the numerical simulations we executed, as exposed in Chapter 2. We understand that the significance of this dissertation lies much more in the questions it raises than in saying something definitive about the subject: partly for being based on numerical experiments with no new mathematical results associated with them, partly for being about numerical experiments made with a single operator. On the other hand, we obtained some observations on the performed simulations that seemed interesting to us, considering the literature of the area.
In particular, we highlight observations, summarized in the conclusion of this work, about the different vocations of methods such as GCV and the L-curve, and also about the tendency, observed in the L-curve method, of the optimal parameters to group themselves in a small gap, strongly correlated with the behavior of the generalized singular value decomposition curve of the involved operators, under reasonably broad regularity conditions on the images to be recovered.
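The Tikhonov setup with a difference-operator regularizer can be sketched on a toy problem. Here the forward operator is the identity (plain denoising) rather than the Radon transform used in the dissertation, and λ is a hypothetical fixed value rather than one chosen by GCV or the L-curve:

```python
def tikhonov_denoise(y, lam):
    """Solve min_x ||x - y||^2 + lam^2 * ||D x||^2, with D the
    first-difference operator. The normal equations
    (I + lam^2 * D^T D) x = y are tridiagonal, so the Thomas algorithm
    solves them in O(n). The forward operator here is the identity
    (plain denoising); the dissertation uses the Radon transform, and
    lam would be chosen by GCV, the L-curve or a similar method."""
    n = len(y)
    l2 = lam * lam
    # diagonal of I + lam^2 * D^T D; off-diagonals are all -lam^2
    diag = [1.0 + l2 * (1.0 if i in (0, n - 1) else 2.0) for i in range(n)]
    off = -l2
    c = [0.0] * n  # modified superdiagonal (forward sweep)
    d = [0.0] * n  # modified right-hand side
    c[0] = off / diag[0]
    d[0] = y[0] / diag[0]
    for i in range(1, n):
        denom = diag[i] - off * c[i - 1]
        c[i] = off / denom
        d[i] = (y[i] - off * d[i - 1]) / denom
    x = [0.0] * n  # back substitution
    x[-1] = d[-1]
    for i in range(n - 2, -1, -1):
        x[i] = d[i] - c[i] * x[i + 1]
    return x

noisy = [1.0, 3.0, 2.0, 4.0, 3.0]
smooth = tikhonov_denoise(noisy, 2.0)  # larger lam -> smoother result
```

With λ = 0 the data are returned unchanged; as λ grows the solution flattens toward the mean, which is the bias-stability trade-off the regularization-parameter choice methods negotiate.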

Relevance:

100.00%

Publisher:

Abstract:

The history matching procedure in an oil reservoir is of paramount importance for obtaining a characterization of the reservoir parameters (static and dynamic) that leads to more accurate production forecasts. Throughout this process one seeks reservoir model parameters that are able to reproduce the behaviour of a real reservoir. This reservoir model may then be used to predict production and can aid oil field management. During the history matching procedure the reservoir model parameters are modified and, for every new set of parameters found, a fluid flow simulation is performed so that it is possible to evaluate whether or not this new set of parameters reproduces the observations in the actual reservoir. The reservoir is said to be matched when the discrepancies between the model predictions and the observations of the real reservoir are below a certain tolerance. The determination of the model parameters via history matching requires the minimisation of an objective function (the difference between the observed and simulated productions according to a chosen norm) in a parameter space populated by many local minima. In other words, more than one set of reservoir model parameters fits the observations. With respect to this non-uniqueness of the solution, the inverse problem associated with history matching is ill-posed. In order to reduce this ambiguity, it is necessary to incorporate a priori information and constraints on the reservoir model parameters to be determined. In this dissertation, the regularization of the inverse problem associated with history matching was performed via the introduction of a smoothness constraint on the following parameters: permeability and porosity. This constraint has the geological bias of asserting that these two properties vary smoothly in space.
In this sense, it is necessary to find the right relative weight of this constraint in the objective function, one that stabilizes the inversion and yet introduces minimal bias. A sequential search method called COMPLEX was used to find the reservoir model parameters that best reproduce the observations of a semi-synthetic model. This method does not require the use of derivatives when searching for the minimum of the objective function. Here, it is shown that the judicious introduction of the smoothness constraint in the objective function formulation reduces the associated ambiguity and introduces minimal bias in the estimates of the permeability and porosity of the semi-synthetic reservoir model.

Relevance:

100.00%

Publisher:

Abstract:

The pioneering work of Skumanich (1972) showed that the projected mean rotational velocity <v sin i> for solar-type stars follows a decay law given by t^(-1/2), where t is the stellar age. This relationship is consistent with theories of angular momentum loss through the ionized stellar wind, which in turn is coupled to the star through its magnetic field. Several authors (e.g., Silva et al. 2013 and de Freitas et al. 2014) have analyzed possible matches between the rotational decay and the profile of the velocity distribution. These authors arrived at a simple heuristic relationship, but did not build a direct path between the exponent of the rotational decay (j) and the exponent of the distribution of rotational velocities (q). The whole theoretical scenario has been proposed using an efficient and robust statistical mechanics known as non-extensive statistical mechanics. The present dissertation proposes to effectively close this issue by elaborating a theoretical way to modify the q-Maxwellian distributions into q-Maxwellians with physical links extracted from the theory of magnetic braking. In order to test our distributions we used the Geneva-Copenhagen Survey data, with approximately 6000 F and G field stars limited by age. As a result, we obtained that the exponents of the decay law and of the distribution follow a relationship similar to that proposed by Silva et al. (2013).
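The relation between the decay law and the q-Maxwellian can be sketched with its two basic ingredients: the Tsallis q-exponential and a generalized Skumanich decay v ∝ t^(−j). The parameter values below are illustrative, not fitted values from the Geneva-Copenhagen sample:

```python
import math

def q_exp(x, q):
    """Tsallis q-exponential, exp_q(x) = [1 + (1-q)x]**(1/(1-q)),
    the building block of the q-Maxwellian; it reduces to exp(x)
    as q -> 1 and is cut off where the bracket becomes negative."""
    if abs(q - 1.0) < 1e-12:
        return math.exp(x)
    base = 1.0 + (1.0 - q) * x
    return base ** (1.0 / (1.0 - q)) if base > 0.0 else 0.0

def rotational_velocity(v0, t0, t, j=0.5):
    """Generalized Skumanich decay <v sin i> ~ t**(-j); j = 1/2 gives
    the classic 1972 law. v0 and t0 are hypothetical normalizations."""
    return v0 * (t / t0) ** (-j)
```

For q > 1 the q-exponential tail is a power law rather than Gaussian, and the dissertation's contribution is the theoretical link tying that q to the braking exponent j.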


Relevance:

100.00%

Publisher:

Abstract:

Injectivity decline, which can be caused by particle retention, generally occurs during water injection or reinjection in oil fields. Several mechanisms, including straining, are responsible for particle retention and pore blocking, causing formation damage and injectivity decline. Predicting formation damage and injectivity decline is essential in waterflooding projects. The Classic Model (CM), which incorporates filtration coefficients and formation damage functions, has been widely used to predict injectivity decline. However, various authors have reported significant discrepancies between the Classic Model and experimental results, motivating the development of deep bed filtration models that consider multiple particle retention mechanisms (Santos & Barros, 2010; SBM). In this dissertation, the solution of the inverse problem was studied and software for experimental data treatment was developed. Finally, experimental data were fitted using both the CM and the SBM. The results showed that, depending on the formation damage function, the predictions for injectivity decline using the CM and the SBM can be significantly different.
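For a constant filtration coefficient λ, the Classic Model predicts an exponential decay of suspended particle concentration with depth. The sketch below shows this simplest instance of the CM with hypothetical parameter values, leaving out the formation damage function:

```python
import math

def suspended_concentration(c0, lam, x):
    """Classic Model profile with a constant filtration coefficient lam:
    the steady-state suspended particle concentration decays
    exponentially with depth, c(x) = c0 * exp(-lam * x). Parameter
    values here are hypothetical; the SBM generalizes this picture with
    multiple retention mechanisms."""
    return c0 * math.exp(-lam * x)

# Depth at which half of the injected particles have been retained
lam = 0.5
half_depth = math.log(2.0) / lam
```

The inverse problem treated in the dissertation runs this logic backwards: estimating λ and the formation damage function from measured effluent concentrations and injectivity histories.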

Relevance:

100.00%

Publisher:

Abstract:

Ensuring dependability requirements is essential for industrial applications, since faults may cause failures whose consequences result in economic losses, environmental damage or harm to people. Therefore, given the relevance of the topic, this thesis proposes a methodology for the dependability evaluation of industrial wireless networks (WirelessHART, ISA100.11a, WIA-PA) at the early design phase; the proposal can, however, be easily adapted to the maintenance and expansion stages of a network. The proposal uses graph theory and the fault tree formalism to automatically create, from a given industrial wireless network topology, an analytical model on which dependability can be evaluated. The supported evaluation metrics are reliability, availability, MTTF (mean time to failure), importance measures of devices, redundancy aspects and common cause failures. It must be emphasized that the proposal is independent of any particular tool for quantitatively evaluating the target metrics; however, for validation purposes a tool widely accepted in academia (SHARPE) was used. In addition, an algorithm to generate the minimal cut sets, originally applied in graph theory, was adapted to the fault tree formalism to guarantee the scalability of the methodology in industrial wireless network environments (< 100 devices). Finally, the proposed methodology was validated on typical scenarios found in industrial environments, such as star, line, cluster and mesh topologies. Scenarios with common cause failures were also evaluated, along with best practices to guide the design of an industrial wireless network. To verify the scalability requirements, the performance of the methodology was analyzed in different scenarios, and the results showed the applicability of the proposal to networks typically found in industrial environments.
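The fault-tree side of the methodology can be illustrated with the standard minimal-cut-set approximation for the top-event probability, assuming independent component failures. The topology and failure probabilities below are hypothetical:

```python
def system_unreliability(cut_sets, q):
    """Approximate the top-event (system failure) probability from
    minimal cut sets, assuming independent component failures:
    Q_sys ~= 1 - prod_over_cuts(1 - prod of q_i in the cut).
    Exact for disjoint cut sets; otherwise a standard approximation
    used in fault tree analysis."""
    prob_no_cut_fails = 1.0
    for cut in cut_sets:
        q_cut = 1.0
        for comp in cut:
            q_cut *= q[comp]
        prob_no_cut_fails *= 1.0 - q_cut
    return 1.0 - prob_no_cut_fails

# Hypothetical topology: a gateway in series with a redundant device
# pair, giving minimal cut sets {GW} and {D1, D2}
q = {"GW": 0.01, "D1": 0.1, "D2": 0.1}
q_sys = system_unreliability([{"GW"}, {"D1", "D2"}], q)
```

The example makes the redundancy aspect concrete: the device pair only fails the system jointly (probability 0.01), so the single-point gateway dominates, which is what importance measures would flag at design time.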

Relevance:

100.00%

Publisher:

Abstract:

In this thesis we investigate physical problems that present a high degree of complexity, using tools and models of statistical mechanics. We give special attention to systems with long-range interactions, such as one-dimensional long-range bond percolation, complex networks without a metric, and vehicular traffic. Flow in a linear chain (percolation) with bonds between first neighbors only happens if pc = 1, but when we consider long-range interactions the situation is completely different, i.e., the transition between the percolating and non-percolating phases happens for pc < 1. This kind of transition happens even when the system is diluted (dilution of sites). Some of these effects are investigated in this work, for example the extensivity of the system and the relation between critical properties and the dilution. In particular, we show that the dilution does not change the universality of the system. In another work, we analyze the implications of using a power-law quality distribution for vertices in the growth dynamics of a network studied by Bianconi and Barabási, which incorporates into the preferential attachment the different ability (fitness) of the nodes to compete for links. Finally, we study vehicular traffic on road networks submitted to an increasing flux of cars. To this end, we develop two models that enable the analysis of the total flux on each road as well as the flux leaving the system and the behavior of the total number of congested roads.
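The Bianconi-Barabási growth with node fitness can be sketched as follows. The fitness distribution used here (η = u² with u uniform, i.e. a density ∝ η^(−1/2)) is only an illustrative power-law-like choice, not the distribution studied in the thesis:

```python
import random

def fitness_network(n, m, draw_fitness):
    """Sketch of Bianconi-Barabasi-type growth: each new node links to
    m existing nodes chosen with probability proportional to
    fitness_i * degree_i, so high-quality nodes compete better for
    links. `draw_fitness` samples the node quality distribution."""
    degree = [1, 1]                      # start from two linked nodes
    eta = [draw_fitness(), draw_fitness()]
    edges = [(0, 1)]
    for new in range(2, n):
        weights = [eta[i] * degree[i] for i in range(new)]
        targets = set()
        while len(targets) < min(m, new):
            targets.add(random.choices(range(new), weights=weights)[0])
        degree.append(len(targets))
        eta.append(draw_fitness())
        for t in targets:
            degree[t] += 1
            edges.append((new, t))
    return degree, edges

random.seed(42)
deg, edges = fitness_network(200, 2, lambda: random.random() ** 2)
```

Compared with plain preferential attachment, the η·degree weighting lets a late-arriving, high-fitness node overtake older hubs, the key qualitative effect the quality distribution controls.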

Relevance:

100.00%

Publisher:

Abstract:

This dissertation briefly presents random graphs and the main quantities calculated from them. At the same time, basic thermodynamic quantities such as energy and temperature are associated with some of their characteristics. Approaches commonly used in statistical mechanics are employed, and rules that describe a time evolution for the graphs are proposed in order to study their ergodicity and a possible thermal equilibrium between them.

Relevance:

100.00%

Publisher:

Abstract:

In this work, a study of social networks based on the analysis of family names is presented. A basic approach to the mathematical formalism of graphs is developed, and the main theoretical models for complex networks are then presented, in order to support the analysis of surname network models. These, in turn, are analyzed so as to extract the leading quantities, such as the aggregation (clustering) coefficient, the minimum average path length and the connectivity distribution. Based on these quantities, it can be stated that surname networks are an example of complex networks, showing important features such as preferential attachment and small-world character.