24 results for entropia

at Universidade Federal do Rio Grande do Norte (UFRN)


Relevance:

20.00%

Publisher:

Abstract:

The rational construction required to systematize scientific knowledge in physics introduces difficulties in understanding some of its concepts. One concept that exemplifies this difficulty in learning and teaching is entropy. This thesis proposes the construction of a didactic route, a historical and epistemological path to entropy, intended to contribute to the teaching of this concept as well as of other physics concepts. The basic assumption behind this route is that a historical review of the development of the concept, along the lines suggested by Bachelard's (1884-1962) epistemology, can make the subjects to be taught and learned more meaningful. Initially I composed a brief biographical note to give the reader an idea of the issues, interests and reflections related to science, how I dealt with them in my private and professional life, and the role they played in leading me to write this thesis. The strategy for constructing the route to entropy was to split the usual contents of basic thermodynamics into three moments so that they constitute epistemological units, each identified by the way of thinking of the corresponding moment of scientific knowledge production: a technical and empiricist moment, a rationalist and positivist moment, and a post-positivist rationalist one. The transition between moments is characterized by a rupture with the former way of thinking, yet the progress in the construction of knowledge in the area is evident. In the final part of this work I present an analysis of the elements of Bachelard's epistemology that are present in each moment. This analysis is the basic component of the didactic route I set out to build. The way this route to entropy was made could contribute to the construction of other didactic routes in physics and other sciences, as a means of unveiling hidden meanings and as a tool for humanizing scientific knowledge.

Relevance:

20.00%

Publisher:

Abstract:

The position that the renowned Boltzmann-Gibbs (BG) statistics occupies in the scientific landscape is indisputable, with a very broad range of applicability. However, many physical phenomena cannot be described by this formalism. This is due, in part, to the fact that BG statistics deals with phenomena at thermodynamic equilibrium. In regimes where thermal equilibrium does not prevail, other statistical formalisms must be used. Two of these formalisms emerged in the last two decades and are commonly called q-statistics and κ-statistics; the former was conceived by Constantino Tsallis at the end of the 1980s and the latter by Giorgio Kaniadakis in 2001. These formalisms are generalizations and therefore contain BG statistics as a particular case for a suitable choice of certain parameters. These two formalisms, in particular that of Tsallis, also lead us to reflect critically on concepts as deeply rooted in BG statistics as the additivity and extensivity of certain physical quantities. The scope of this work is centered on the second of these formalisms. The κ-statistics is not only a generalization of BG statistics but, through the foundation of the Kinetic Interaction Principle (KIP), encompasses the celebrated Fermi-Dirac and Bose-Einstein quantum statistics, as well as q-statistics itself. In this work we present some conceptual aspects of q-statistics and, mainly, of κ-statistics. We use these concepts, together with the concept of block information, to present an entropic functional modeled on the Kaniadakis formalism, which is later used to describe the informational content of Cantor-like fractals. In particular, we are interested in the relations between fractal parameters, such as the fractal dimension, and the deformation parameter. Despite its simplicity, this will allow us, in future work, to statistically describe more complex structures such as DNA, superlattices and complex systems.
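
As an illustration of the generalized entropic functionals mentioned in the abstract above, the sketch below (not taken from the thesis) implements the standard Tsallis and Kaniadakis entropies for a discrete probability distribution and checks numerically that both recover the Boltzmann-Gibbs (Shannon) entropy in the limits q → 1 and κ → 0. The probability vector is a made-up example.

```python
# Illustrative only: standard Tsallis and Kaniadakis entropy functionals for a
# discrete distribution, with a numerical check of the Boltzmann-Gibbs limit.
import numpy as np

def entropy_bg(p):
    """Boltzmann-Gibbs / Shannon entropy S = -sum p ln p (k_B = 1)."""
    p = p[p > 0]
    return -np.sum(p * np.log(p))

def entropy_tsallis(p, q):
    """Tsallis entropy S_q = (1 - sum p**q) / (q - 1); S_q -> S_BG as q -> 1."""
    p = p[p > 0]
    if np.isclose(q, 1.0):
        return entropy_bg(p)
    return (1.0 - np.sum(p**q)) / (q - 1.0)

def entropy_kaniadakis(p, kappa):
    """Kaniadakis entropy S_k = -sum p ln_k(p), ln_k(x) = (x**k - x**-k)/(2k)."""
    p = p[p > 0]
    if np.isclose(kappa, 0.0):
        return entropy_bg(p)
    ln_k = (p**kappa - p**(-kappa)) / (2.0 * kappa)
    return -np.sum(p * ln_k)

if __name__ == "__main__":
    p = np.array([0.5, 0.25, 0.125, 0.125])                  # made-up distribution
    print("BG        :", entropy_bg(p))
    print("Tsallis   :", entropy_tsallis(p, q=1.001))        # close to the BG value
    print("Kaniadakis:", entropy_kaniadakis(p, kappa=1e-3))  # close to the BG value
```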

Relevance:

20.00%

Publisher:

Abstract:

Various physical systems have dynamics that can be modeled by percolation processes. Percolation is used to study issues ranging from fluid diffusion through disordered media to the fragmentation of a computer network caused by hacker attacks. A common feature of all these systems is the presence of two non-coexistent regimes associated with certain properties of the system. For example, a disordered medium may or may not allow the flow of a fluid depending on its porosity. The change from one regime to the other characterizes the percolation phase transition. The standard way of analyzing this transition uses the order parameter, a variable related to some characteristic of the system that is zero in one of the regimes and nonzero in the other. The proposal introduced in this thesis is that this phase transition can be investigated without explicit use of the order parameter, but rather through the Shannon entropy, a measure of the degree of uncertainty in the information content of a probability distribution. The proposal is evaluated in the context of cluster formation in random graphs, and the method is applied to both classical (Erdős–Rényi) percolation and explosive percolation. It is based on the computation of the entropy of the cluster-size probability distribution, and the results show that the critical point of the transition is related to the derivatives of the entropy. Furthermore, the difference between the smooth and abrupt character of the classical and explosive percolation transitions, respectively, is reinforced by the observation that the entropy has a maximum at the critical point of the classical transition, while no such correspondence occurs for explosive percolation.
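
A minimal sketch of the idea described above, under assumptions of my own: the Shannon entropy of the cluster-size distribution of an Erdős–Rényi graph is tracked as the mean degree grows through the percolation threshold. The entropy definition used here weights each cluster by its size; the exact normalization adopted in the thesis may differ, and the graph size and degree grid are illustrative.

```python
# Illustrative sketch: Shannon entropy of the cluster-size distribution of an
# Erdős–Rényi graph as the mean degree <k> crosses the percolation threshold.
import numpy as np
import networkx as nx

def cluster_size_entropy(G):
    """Entropy of the cluster-size distribution (clusters weighted by size).

    This is one plausible normalization; the thesis's exact choice may differ.
    """
    sizes = np.array([len(c) for c in nx.connected_components(G)], dtype=float)
    p = sizes / sizes.sum()
    return -np.sum(p * np.log(p))

def er_entropy_curve(n=2000, mean_degrees=np.linspace(0.2, 2.0, 10), seed=0):
    """Entropy versus mean degree <k> = p (n - 1) for classical (ER) percolation."""
    rng = np.random.default_rng(seed)
    return [(k, cluster_size_entropy(
                nx.gnp_random_graph(n, k / (n - 1), seed=int(rng.integers(1_000_000)))))
            for k in mean_degrees]

if __name__ == "__main__":
    for k, H in er_entropy_curve():
        print(f"<k> = {k:.2f}   H = {H:.3f}")
```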

Relevance:

10.00%

Publisher:

Abstract:

Conselho Nacional de Desenvolvimento Científico e Tecnológico

Relevance:

10.00%

Publisher:

Abstract:

Currently, one of the biggest challenges in the field of data mining is performing cluster analysis on complex data. Several techniques have been proposed, but in general they only achieve good results within specific domains, and there is no consensus on the best way to group this kind of data. In general, these techniques fail because of unrealistic assumptions about the true probability distribution of the data. Based on this, this thesis proposes a new measure, based on the Cross Information Potential, that uses representative points of the dataset and statistics extracted directly from the data to measure the interaction between groups. The proposed approach allows us to use all the advantages of this information-theoretic descriptor while overcoming the limitations imposed by its own nature. From this, two cost functions and three algorithms are proposed to perform cluster analysis. Because the use of information theory captures the relationship between different patterns regardless of assumptions about the nature of that relationship, the proposed approach achieved better performance than the main algorithms in the literature. These results hold both for synthetic data designed to test the algorithms in specific situations and for real data drawn from problems in different fields.
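
For reference, the sketch below implements the standard Cross Information Potential from information-theoretic learning (a Gaussian-kernel Parzen estimate of the interaction between two groups of points); it is not the modified measure proposed in the thesis, and the kernel width sigma and the sample data are assumptions.

```python
# Illustrative only: the standard Cross Information Potential (CIP) between two
# groups of points, estimated with a Gaussian (Parzen) kernel of width sigma.
import numpy as np

def cross_information_potential(X, Y, sigma=1.0):
    """CIP(X, Y) = (1 / (Nx * Ny)) * sum_ij G_sigma(x_i - y_j)."""
    X, Y = np.atleast_2d(X), np.atleast_2d(Y)
    d2 = np.sum((X[:, None, :] - Y[None, :, :]) ** 2, axis=-1)  # pairwise ||x - y||^2
    dim = X.shape[1]
    norm = (2.0 * np.pi * sigma**2) ** (dim / 2.0)               # Gaussian normalization
    return np.mean(np.exp(-d2 / (2.0 * sigma**2)) / norm)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    A = rng.normal(loc=0.0, scale=0.5, size=(100, 2))   # two synthetic groups
    B = rng.normal(loc=3.0, scale=0.5, size=(100, 2))
    # Well-separated groups interact weakly (small CIP); overlapping ones strongly.
    print("CIP(A, B):", cross_information_potential(A, B))
    print("CIP(A, A):", cross_information_potential(A, A))
```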

Relevance:

10.00%

Publisher:

Abstract:

In this work we present a new clustering method that groups the points of a data set into classes. The method is based on an algorithm that links auxiliary clusters obtained with traditional vector quantization techniques. Several approaches developed during the work are described, based on measures of distance or dissimilarity (divergence) between the auxiliary clusters. The new method uses only two pieces of a priori information: the number of auxiliary clusters Na and a threshold distance dt used to decide whether or not to link auxiliary clusters. The number of classes can be found automatically by the method, based on the chosen threshold distance dt, or it can be given as additional information to help in the choice of the correct threshold. Some analyses are carried out and the results are compared with traditional clustering methods. Different dissimilarity metrics are analyzed, and a new one is proposed based on the concept of negentropy. Besides grouping the points of a set into classes, a method is proposed for statistically modeling the classes in order to obtain an expression for the probability that a point belongs to each class. Experiments with several values of Na and dt are carried out on test sets, and the results are analyzed with the aim of studying the robustness of the method and proposing heuristics for the choice of the correct threshold. Aspects of information theory applied to the calculation of the divergences are explored, specifically the different measures of information and divergence based on the Rényi entropy. The results obtained with the different metrics are compared and discussed. The work also has an appendix presenting real applications of the proposed method.
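
A minimal sketch of the general linkage idea, not the thesis algorithm itself: Na auxiliary clusters are obtained by plain k-means vector quantization and then merged whenever their centroids are closer than the threshold dt, using a union-find structure. The Euclidean distance stands in for the divergence-based dissimilarities (e.g., negentropy) studied in the work; Na, dt and the sample data are illustrative.

```python
# Illustrative sketch of the general idea only (not the thesis algorithm):
# k-means provides Na auxiliary clusters, which are then linked whenever their
# centroids lie closer than the threshold distance dt.
import numpy as np
from scipy.cluster.vq import kmeans2

def link_auxiliary_clusters(data, Na=20, dt=1.0, seed=0):
    centroids, labels = kmeans2(data, Na, minit="++", seed=seed)
    parent = list(range(Na))                     # union-find over auxiliary clusters

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i

    for i in range(Na):
        for j in range(i + 1, Na):
            if np.linalg.norm(centroids[i] - centroids[j]) < dt:
                parent[find(i)] = find(j)        # link the two auxiliary clusters

    roots = sorted({find(i) for i in range(Na)})
    class_of = {r: c for c, r in enumerate(roots)}
    return np.array([class_of[find(l)] for l in labels])   # final class of each point

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    data = np.vstack([rng.normal(0, 0.3, (200, 2)), rng.normal(4, 0.3, (200, 2))])
    classes = link_auxiliary_clusters(data, Na=10, dt=1.5)
    print("number of classes found:", len(set(classes)))
```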

Relevance:

10.00%

Publisher:

Abstract:

Discussions about pollution caused by vehicle emissions are old and have developed over the years. The search for cleaner technologies and frequent climate changes have led industry and government agencies to impose much more rigorous limits on the contaminant content of fuels, which has a direct impact on atmospheric emissions. Nowadays, fuel quality with respect to sulfur content is controlled through hydrodesulfurization. Adsorption processes also represent an interesting alternative route for sulfur removal; such processes are simpler and operate at atmospheric temperature and pressure. This work studies the synthesis and characterization of aluminophosphates impregnated with zinc, molybdenum or both, and their application to sulfur removal from gasoline by adsorption, using a model gasoline containing isooctane and thiophene. The adsorbents were characterized by X-ray diffraction, differential thermal analysis (DTG), X-ray fluorescence and scanning electron microscopy (SEM). The specific area, pore volume and pore diameter were determined by the BET (Brunauer-Emmett-Teller) and t-plot methods. Sulfur was quantified by elemental analysis using an ANTEK 9000 NS analyzer. The adsorption process was evaluated as a function of temperature and initial sulfur content through the adsorption isotherms and their thermodynamic parameters. The entropy variation (ΔS), enthalpy variation (ΔH) and Gibbs free energy (ΔG) were calculated from the plot of ln(Kd) versus 1/T. The Langmuir, Freundlich and Langmuir-Freundlich models were fitted to the experimental data, with the last one giving the best results. The thermodynamic tests were carried out at different temperatures (30, 40 and 50 °C), from which it was concluded that the adsorption process is spontaneous and exothermic. The adsorption kinetics was studied over 24 h and showed that the adsorption capacity of the adsorbents follows the order MoZnPO > MoPO > ZnPO > AlPO. The maximum adsorption capacity was 4.91 mg/g for MoZnPO, with an adsorption efficiency of 49%.
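
The thermodynamic step described above (ΔH, ΔS and ΔG from a plot of ln(Kd) versus 1/T) follows the standard van 't Hoff treatment, sketched below with made-up Kd values rather than the thesis data: the slope of the linear fit gives −ΔH/R and the intercept gives ΔS/R, from which ΔG = ΔH − TΔS follows at each temperature.

```python
# Illustrative van 't Hoff sketch with hypothetical Kd values (not thesis data).
import numpy as np

R = 8.314  # gas constant, J mol^-1 K^-1

def vant_hoff(T_kelvin, Kd):
    """Return (dH, dS, dG) from distribution coefficients Kd measured at T."""
    x = 1.0 / np.asarray(T_kelvin, dtype=float)
    y = np.log(np.asarray(Kd, dtype=float))
    slope, intercept = np.polyfit(x, y, 1)       # ln Kd = -dH/(R T) + dS/R
    dH = -slope * R                              # J/mol
    dS = intercept * R                           # J/(mol K)
    dG = dH - np.asarray(T_kelvin, dtype=float) * dS   # J/mol at each temperature
    return dH, dS, dG

if __name__ == "__main__":
    T = np.array([303.15, 313.15, 323.15])       # 30, 40 and 50 °C, as in the text
    Kd = np.array([2.5, 2.1, 1.8])               # hypothetical values only
    dH, dS, dG = vant_hoff(T, Kd)
    print(f"dH = {dH / 1000:.2f} kJ/mol, dS = {dS:.2f} J/(mol K)")
    print("dG (kJ/mol):", np.round(dG / 1000, 2))   # negative => spontaneous
```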

Relevance:

10.00%

Publisher:

Abstract:

Nonionic surfactants in aqueous solution have the property of separating into two phases: a dilute phase, with a low surfactant concentration, and a surfactant-rich phase called the coacervate. The application of this kind of surfactant to extraction processes from aqueous solutions has been growing, which calls for knowledge of the thermodynamic properties of these surfactants. In this study the cloud points of polyethoxylated surfactants from the nonylphenol polyethoxylate family (ethoxylation degrees 9.5, 10, 11, 12 and 13), the octylphenol polyethoxylate family (10 and 11) and polyethoxylated lauryl alcohol (6, 7, 8 and 9) were determined as a function of the degree of ethoxylation. The cloud point was determined by observing the turbidity of the solution under a heating ramp of 0.1 °C/minute; for the pressure studies a high-pressure cell (maximum 300 bar) was used. The Flory-Huggins, UNIQUAC and NRTL models were used to describe the experimental cloud-point curves of the studied surfactants, and the influence of NaCl concentration and pressure on the cloud point was investigated. The latter parameter is important for oil recovery processes, in which surfactant solutions are used at high pressures, while the effect of NaCl brings the cloud point closer to room temperature, making it possible to run processes without temperature control. The numerical method used to adjust the parameters was Levenberg-Marquardt. For the Flory-Huggins model the fitted parameters were the enthalpy of mixing, the entropy of mixing and the aggregation number. For the UNIQUAC and NRTL models, interaction parameters aij with a quadratic temperature dependence were adjusted. The obtained parameters gave a good fit to the experimental data (RMSD < 0.3%). The results showed that both the degree of ethoxylation and the pressure increase the cloud point, whereas NaCl decreases it.
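
A minimal sketch of the parameter-estimation machinery only, not the full Flory-Huggins/UNIQUAC/NRTL cloud-point calculation: scipy's Levenberg-Marquardt solver fits an interaction parameter aij(T) with the quadratic temperature dependence mentioned above to dummy data; in the thesis the residuals would instead be calculated-minus-experimental cloud points.

```python
# Illustrative Levenberg-Marquardt fit of a quadratic-in-T interaction
# parameter aij(T) = a0 + a1*T + a2*T**2, using dummy data.
import numpy as np
from scipy.optimize import least_squares

def aij(T, a0, a1, a2):
    """Quadratic temperature dependence of a binary interaction parameter."""
    return a0 + a1 * T + a2 * T**2

def residuals(params, T, observed):
    # In the thesis the residuals would come from the cloud-point calculation;
    # here aij(T) is fitted directly just to show the LM machinery.
    return aij(T, *params) - observed

if __name__ == "__main__":
    T = np.linspace(300.0, 360.0, 13)                        # K
    true = (120.0, -0.55, 8.0e-4)                            # hypothetical values
    data = aij(T, *true) + np.random.default_rng(0).normal(0.0, 0.05, T.size)
    fit = least_squares(residuals, x0=(100.0, 0.0, 0.0),
                        args=(T, data), method="lm")          # Levenberg-Marquardt
    rmsd = np.sqrt(np.mean(fit.fun**2))
    print("fitted a0, a1, a2:", np.round(fit.x, 5), " RMSD:", round(rmsd, 4))
```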

Relevance:

10.00%

Publisher:

Abstract:

In the present work we use a Tsallis maximum-entropy distribution law to fit observations of projected rotational velocity measurements of stars in the Pleiades open cluster. This new distribution function, which generalizes the Maxwell-Boltzmann one, is derived from the non-extensivity of the Boltzmann-Gibbs entropy. We also present a comparison between results from the generalized distribution and the Maxwellian law, and show that the generalized distribution fits the observational data more closely. In addition, we compare the q values of the generalized distribution determined for the V sin i distribution of main-sequence stars (Pleiades) with those found for the observed distribution of evolved stars (subgiants). We then observe a correlation between the q values and the stellar evolutionary stage for a certain range of stellar masses.
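
The sketch below shows one assumed functional form for the generalized distribution, a Tsallis q-Gaussian factor exp_q(−v²/σ²) that reduces to the Maxwellian (Gaussian) factor as q → 1; the exact normalization and the form actually fitted to the V sin i data in the thesis may differ, and the velocity grid, σ and q values are illustrative.

```python
# Illustrative q-Gaussian factor; the exact form fitted in the thesis may differ.
import numpy as np

def exp_q(x, q):
    """Tsallis q-exponential [1 + (1 - q) x]_+^(1/(1-q)); -> exp(x) as q -> 1."""
    x = np.asarray(x, dtype=float)
    if np.isclose(q, 1.0):
        return np.exp(x)
    base = 1.0 + (1.0 - q) * x
    out = np.zeros_like(base)
    mask = base > 0.0
    out[mask] = base[mask] ** (1.0 / (1.0 - q))
    return out

def q_gaussian_factor(v, sigma, q):
    """Unnormalized q-Gaussian factor exp_q(-(v/sigma)^2) for a velocity v."""
    return exp_q(-(np.asarray(v, dtype=float) / sigma) ** 2, q)

if __name__ == "__main__":
    v = np.linspace(0.0, 60.0, 7)                 # km/s grid, for illustration only
    for q in (1.0, 1.3):                          # q = 1 recovers the Maxwellian case
        print(f"q = {q}:", np.round(q_gaussian_factor(v, sigma=15.0, q=q), 6))
```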

Relevance:

10.00%

Publisher:

Abstract:

In this work we study a connection between a non-Gaussian statistics, the Kaniadakis statistics, and complex networks. We show that the degree distribution P(k) of a scale-free network can be calculated using the maximization of an information entropy in the context of non-Gaussian statistics. As an example, a numerical analysis based on the preferential attachment growth model is discussed, and the numerical behavior of the Kaniadakis and Tsallis degree distributions is compared. We also analyze the diffusive epidemic process (DEP) on a one-dimensional regular lattice. The model is composed of A (healthy) and B (sick) species that diffuse independently on the lattice with diffusion rates DA and DB, following the probabilistic dynamical rules A + B → 2B and B → A. This model belongs to the category of non-equilibrium systems with an absorbing state and a phase transition between active and inactive states. We investigate the critical behavior of the DEP using an auto-adaptive algorithm to find critical points: the method of automatic searching for critical points (MASCP). We compare our results with the literature and find that the MASCP successfully finds the critical exponents 1/ν and 1/zν in all the cases studied (DA = DB and DA ≠ DB). The simulations show that the DEP has the same critical exponents as expected from field-theoretical arguments. Moreover, we find that, contrary to a renormalization-group prediction, the system does not show a discontinuous phase transition in the regime DA > DB.
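
A minimal sketch (not the thesis analysis) of the numerical comparison mentioned above: a preferential-attachment (Barabási–Albert) network is generated, its empirical degree distribution P(k) extracted, and the Kaniadakis κ-exponential is provided as the deformed function one could use to model the heavy tail; the network size, the decay rate and κ are assumed values.

```python
# Illustrative sketch: degree distribution of a preferential-attachment network
# and the Kaniadakis kappa-exponential, exp_k(x) = (sqrt(1 + k^2 x^2) + k x)^(1/k).
import numpy as np
import networkx as nx

def degree_distribution(G):
    """Empirical P(k) of a graph as (degrees, probabilities)."""
    degrees = np.array([d for _, d in G.degree()])
    ks, counts = np.unique(degrees, return_counts=True)
    return ks, counts / counts.sum()

def exp_kappa(x, kappa):
    """Kaniadakis kappa-exponential; reduces to exp(x) as kappa -> 0."""
    x = np.asarray(x, dtype=float)
    if np.isclose(kappa, 0.0):
        return np.exp(x)
    return (np.sqrt(1.0 + kappa**2 * x**2) + kappa * x) ** (1.0 / kappa)

if __name__ == "__main__":
    G = nx.barabasi_albert_graph(n=5000, m=3, seed=42)   # preferential attachment
    ks, pk = degree_distribution(G)
    print("first (k, P(k)) pairs:", list(zip(ks[:5].tolist(), np.round(pk[:5], 4))))
    # A kappa-deformed decay, with rate 0.1 and kappa = 0.5 as assumed values:
    print("exp_kappa(-0.1 k):", np.round(exp_kappa(-0.1 * ks[:5], kappa=0.5), 4))
```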

Relevance:

10.00%

Publisher:

Abstract:

The standard kinetic theory for a nonrelativistic dilute gas is generalized in the spirit of the nonextensive statistical distribution introduced by Tsallis. The new formalism depends on an arbitrary parameter q measuring the degree of nonextensivity. In the limit q = 1, the extensive Maxwell-Boltzmann theory is recovered. Starting from a purely kinetic deduction of the velocity q-distribution function, the Boltzmann H-theorem is generalized to include the possibility of nonextensive out-of-equilibrium effects. Based on this investigation, it is proved that Tsallis' distribution is the necessary and sufficient condition defining a thermodynamic equilibrium state in the nonextensive context. This result follows naturally from the generalized transport equation and also from the extended H-theorem. Two physical applications of the nonextensive effects are considered. Closed analytic expressions were obtained for the Doppler broadening of spectral lines from an excited gas, as well as for the dispersion relations describing electrostatic oscillations in a dilute electron plasma. In the latter case, a comparison with experimental results strongly suggests a Tsallis distribution with the q parameter smaller than unity. A complementary study concerns the thermodynamic behavior of a relativistic imperfect simple fluid. Using nonequilibrium thermodynamics, we show how the basic primary variables, namely the energy-momentum tensor and the particle and entropy fluxes, depend on the several dissipative processes present in the fluid. The temperature variation law for this moving imperfect fluid is also obtained, and the Eckart and Landau-Lifshitz formulations are recovered as particular cases.
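
As a rough illustration of the Doppler-broadening application, the sketch below builds an unnormalized q-Gaussian line profile from the line-of-sight velocity v = c(ν − ν0)/ν0; the functional form and all numerical values (line center, thermal velocity, q) are assumptions, not the closed expressions derived in the thesis.

```python
# Illustrative Doppler line profile from a q-deformed velocity distribution.
import numpy as np

C = 2.998e8  # speed of light, m/s

def doppler_profile_q(nu, nu0, v_th, q):
    """Unnormalized q-Gaussian Doppler profile; q -> 1 gives the usual Gaussian."""
    v = C * (nu - nu0) / nu0                      # line-of-sight velocity
    x = -(v / v_th) ** 2
    if np.isclose(q, 1.0):
        return np.exp(x)
    base = np.maximum(1.0 + (1.0 - q) * x, 0.0)   # q-exponential support cutoff
    return base ** (1.0 / (1.0 - q))

if __name__ == "__main__":
    nu0 = 5.0e14                                  # Hz, hypothetical line center
    nu = nu0 * (1.0 + np.linspace(-3e-4, 3e-4, 7))
    for q in (1.0, 0.9):                          # q < 1, as suggested by the plasma data
        print(f"q = {q}:", np.round(doppler_profile_q(nu, nu0, v_th=3.0e4, q=q), 5))
```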

Relevance:

10.00%

Publisher:

Abstract:

In this work we have studied the effects of random biquadratic couplings and random fields in spin-glass models using the replica method. The effect of a random biquadratic coupling was studied in two spin-1 spin-glass models: in one the interactions occur between pairs of spins, whereas in the second they occur between p spins and the limit p → ∞ is considered. Both couplings (spin glass and biquadratic) have zero-mean Gaussian probability distributions. In the first model, the replica-symmetric assumption reveals that the system presents two phases, namely paramagnetic and spin glass, separated by a continuous transition line. The stability analysis of the replica-symmetric solution yields, besides the usual instability associated with spin-glass ordering, a new phase due to the random biquadratic couplings between the spins. For the case p → ∞, the replica-symmetric assumption again yields only two phases, namely paramagnetic and quadrupolar. In both of these phases the spin-glass parameter is zero; moreover, they are shown to be stable under the Almeida-Thouless stability analysis, although one of them presents negative entropy at low temperatures. We developed one step of replica symmetry breaking and found that a new phase, the biquadratic glass phase, emerges. In this way we obtained the correct phase diagram, with three first-order transition lines that merge at a common triple point. The effects of random fields were studied in the Sherrington-Kirkpatrick model in the presence of an external random magnetic field following a trimodal distribution, P(hi) = p+δ(hi − h0) + p0δ(hi) + p−δ(hi + h0). It is shown that the border of the ferromagnetic phase may present, for conveniently chosen values of p0 and h0, first-order phase transitions, as well as tricritical points at finite temperatures. It is verified that the first-order phase transitions are directly related to the dilution in the fields: the extent of these transitions is reduced for increasing values of p0. In fact, the threshold value of p0 above which all phase transitions are continuous is calculated analytically. The stability analysis of the replica-symmetric solution is performed and the regions of validity of such a solution are identified.
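
For concreteness, the sketch below simply draws random fields from the trimodal distribution quoted above; the probabilities, field strength and sample size are arbitrary, and no replica calculation is attempted.

```python
# Illustrative sampling from the trimodal random-field distribution quoted above.
import numpy as np

def sample_trimodal_fields(n, h0, p_plus, p0, seed=0):
    """Draw n fields taking the values +h0, 0 and -h0 with weights p+, p0, p-."""
    p_minus = 1.0 - p_plus - p0                  # the three weights must sum to one
    assert p_minus >= 0.0, "p_plus + p0 must not exceed 1"
    rng = np.random.default_rng(seed)
    return rng.choice([h0, 0.0, -h0], size=n, p=[p_plus, p0, p_minus])

if __name__ == "__main__":
    h = sample_trimodal_fields(n=10_000, h0=1.0, p_plus=0.3, p0=0.4)   # arbitrary values
    values, counts = np.unique(h, return_counts=True)
    print(dict(zip(values.tolist(), (counts / h.size).round(3))))      # ~ {-1: 0.3, 0: 0.4, 1: 0.3}
```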

Relevance:

10.00%

Publisher:

Abstract:

Significant observational effort has been directed to unveiling the nature of the so-called dark energy. However, given the large number of theoretical possibilities, it is possible that this task cannot be accomplished on the basis of observational data alone. In this thesis we investigate dark energy via a thermodynamic approach, i.e., we discuss some thermodynamic properties of this energy component assuming a general time-dependent equation-of-state (EoS) parameter w(a) = w0 + wa f(a), where w0 and wa are constants and f(a) may assume different forms. We show that very restrictive bounds can be placed on the w0-wa space when current observational data are combined with the thermodynamic constraints derived. Moreover, we include a non-zero chemical potential μ and a varying EoS parameter of the more general type ω(a) = ω0 + F(a) in this thermodynamic description. We derive generalized expressions for the entropy density and chemical potential, noting that the dark energy temperature T and μ evolve in the same way in the course of the cosmic expansion. The positivity of the entropy S is used to impose thermodynamic bounds on the EoS parameter ω(a). In particular, we find that a phantom-like behavior, ω(a) < −1, is allowed only when the chemical potential is negative (μ < 0). From the thermodynamic point of view, a complete treatment is proposed when the interaction between matter and dark energy is taken into account.
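
The sketch below only illustrates the notation w(a) = w0 + wa f(a): it evaluates the EoS parameter for one common assumed choice, f(a) = 1 − a (the CPL form), with hypothetical w0 and wa; the thesis keeps f(a) general and derives the thermodynamic bounds analytically.

```python
# Illustrative evaluation of w(a) = w0 + wa * f(a) for the assumed CPL choice f(a) = 1 - a.
import numpy as np

def w_of_a(a, w0, wa, f=lambda a: 1.0 - a):
    """Time-dependent dark-energy EoS parameter w(a) = w0 + wa * f(a)."""
    return w0 + wa * f(np.asarray(a, dtype=float))

if __name__ == "__main__":
    a = np.linspace(0.25, 1.0, 4)                 # scale factor (a = 1 today)
    w = w_of_a(a, w0=-1.0, wa=0.2)                # hypothetical parameter values
    print(dict(zip(np.round(a, 2).tolist(), np.round(w, 3).tolist())))
    # Any region with w(a) < -1 would correspond to the phantom-like regime that
    # the thermodynamic argument ties to a negative chemical potential.
```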

Relevance:

10.00%

Publisher:

Abstract:

This work presents a brief discussion of methods for estimating the parameters of the Generalized Pareto distribution (GPD). The following techniques are addressed: Moments, Maximum Likelihood (MLE), Biased Probability Weighted Moments (PWMB), Unbiased Probability Weighted Moments (PWMU), Minimum Density Power Divergence (MDPD), Median (MED), Pickands (PICKANDS), Maximum Penalized Likelihood (MPLE), Maximum Goodness-of-Fit (MGF) and the Maximum Entropy technique (POME), the focus of this manuscript. As an illustration, the Generalized Pareto distribution was fitted to a sequence of intraplate earthquakes that occurred in the city of João Câmara, in northeastern Brazil, which was monitored continuously for two years (1987 and 1988). It was found that MLE and POME were the most efficient methods, giving essentially the same mean squared errors. Based on a threshold of 1.5 degrees, the seismic risk for the city was estimated, together with the return levels for earthquakes of magnitude 1.5, 2.0, 2.5 and 3.0 and for the most intense earthquake ever registered in the city, which occurred in November 1986 with a magnitude of about 5.2.
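
A minimal sketch of the MLE step discussed above, on synthetic data rather than the João Câmara catalogue: a Generalized Pareto distribution is fitted to the exceedances over a threshold with scipy, and a return level is computed from the fitted parameters; the POME estimator is not implemented here.

```python
# Illustrative MLE fit of a GPD to threshold exceedances, on synthetic data.
import numpy as np
from scipy.stats import genpareto

def fit_gpd_and_return_level(magnitudes, threshold, n_return):
    """MLE fit of the GPD to excesses over `threshold`, plus an n-observation return level."""
    excesses = magnitudes[magnitudes > threshold] - threshold
    shape, _, scale = genpareto.fit(excesses, floc=0.0)      # location fixed at 0
    rate = excesses.size / magnitudes.size                   # exceedance probability
    # Level exceeded on average once every n_return observations:
    level = threshold + genpareto.ppf(1.0 - 1.0 / (n_return * rate),
                                      shape, loc=0.0, scale=scale)
    return shape, scale, level

if __name__ == "__main__":
    rng = np.random.default_rng(7)
    mags = 1.0 + rng.exponential(scale=0.4, size=2000)       # hypothetical catalogue
    shape, scale, level = fit_gpd_and_return_level(mags, threshold=1.5, n_return=1000)
    print(f"shape = {shape:.3f}, scale = {scale:.3f}, 1000-observation level = {level:.2f}")
```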