35 results for Entropy of Tsallis
in the Biblioteca Digital da Produção Intelectual da Universidade de São Paulo (BDPI/USP)
Abstract:
The time evolution of the out-of-equilibrium Mott insulator is investigated numerically through calculations of space-time-resolved density and entropy profiles resulting from the release of a gas of ultracold fermionic atoms from an optical trap. For adiabatic, moderate, and sudden switching-off of the trapping potential, the out-of-equilibrium dynamics of the Mott insulator is found to differ profoundly from that of the band insulator and the metallic phase, displaying a self-induced stability that is robust within a wide range of densities, system sizes, and interaction strengths. The connection between the entanglement entropy and changes of phase, known for equilibrium situations, is found to extend to the out-of-equilibrium regime. Finally, the relation between the system's long-time behavior and the thermalization limit is analyzed. Copyright (C) EPLA, 2011
Abstract:
Complex networks obtained from real-world systems are often characterized by incompleteness and noise, consequences of imperfect sampling as well as artifacts in the acquisition process. Because the characterization, analysis and modeling of complex systems underlain by complex networks are critically affected by the quality and completeness of the initial structures, it becomes imperative to devise methodologies for identifying and quantifying the effects of sampling on the network structure. One way to evaluate these effects is through an analysis of the sensitivity of complex network measurements to perturbations in the topology of the network. In this paper, measurement sensitivity is quantified in terms of the relative entropy of the respective distributions. Three particularly important kinds of progressive perturbation to the network are considered, namely edge suppression, addition and rewiring. The measurements allowing the best balance of stability (smaller sensitivity to perturbations) and discriminability (separation between different network topologies) are identified with respect to each type of perturbation. The analysis covers eight different measurements applied to six different complex network models and three real-world networks. This approach allows one to choose the appropriate measurements in order to obtain accurate results for networks where sampling bias cannot be avoided, a very frequent situation in research on complex networks.
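A minimal numerical sketch of this kind of perturbation analysis, assuming the measurement distribution is the degree distribution and the perturbation is progressive edge suppression (the graph size, edge fractions and smoothing constant are illustrative choices, not values from the paper):

import numpy as np

rng = np.random.default_rng(0)

def er_adjacency(n, p):
    # symmetric adjacency matrix of an Erdos-Renyi random graph
    a = np.triu(rng.random((n, n)) < p, 1)
    return a | a.T

def degree_dist(adj, n_bins):
    # smoothed degree distribution, so the relative entropy stays finite
    hist = np.bincount(adj.sum(axis=1), minlength=n_bins).astype(float) + 1e-9
    return hist / hist.sum()

def relative_entropy(p, q):
    # Kullback-Leibler divergence D(p || q)
    return float(np.sum(p * np.log(p / q)))

n = 500
adj = er_adjacency(n, 0.02)
p0 = degree_dist(adj, n)
edges = np.argwhere(np.triu(adj, 1))
for frac in (0.05, 0.10, 0.20):              # progressively stronger suppression
    k = int(frac * len(edges))
    removed = edges[rng.choice(len(edges), size=k, replace=False)]
    pert = adj.copy()
    pert[removed[:, 0], removed[:, 1]] = False
    pert[removed[:, 1], removed[:, 0]] = False
    print(frac, relative_entropy(degree_dist(pert, n), p0))

A stable measurement keeps this divergence small as the suppressed fraction grows; edge addition and rewiring can be probed by replacing the deletion step.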
Abstract:
In this work we investigate the dynamical Casimir effect in a nonideal cavity by deriving an effective Hamiltonian. We first compute a general expression for the average number of created particles, applicable to any law of motion of the cavity boundary, under the sole restriction of small velocities. We also compute a general expression for the linear entropy of an arbitrary state prepared in a selected mode, likewise applicable to any law of motion of a slowly moving boundary. As an application of our results we analyze both the average number of created particles and the linear entropy for a particular oscillatory motion of the cavity boundary. On the basis of these expressions we develop a comprehensive analysis of the resonances in the number of created particles in the nonideal dynamical Casimir effect. We also demonstrate the occurrence of resonances in the loss of purity of the initial state and estimate the decoherence times associated with these resonances. Since our results were obtained in the framework of perturbation theory, they are restricted, under resonant conditions, to a short-time approximation. (C) 2009 Elsevier Inc. All rights reserved.
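For concreteness, the linear entropy of a state rho is S_L = 1 - Tr(rho^2), zero for pure states and positive for mixed ones. A minimal sketch with a two-level truncation of a cavity mode and a made-up exponential dephasing law (an illustration of the quantity, not the paper's effective-Hamiltonian calculation):

import numpy as np

def linear_entropy(rho):
    # S_L = 1 - Tr(rho^2)
    return 1.0 - np.real(np.trace(rho @ rho))

# pure superposition state of a two-level truncation of the selected mode
psi = np.array([1.0, 1.0]) / np.sqrt(2)
rho_pure = np.outer(psi, psi.conj())

# toy decoherence: off-diagonal coherences decay as exp(-t) (placeholder law)
for t in (0.0, 0.5, 2.0):
    rho = rho_pure.copy()
    rho[0, 1] *= np.exp(-t)
    rho[1, 0] *= np.exp(-t)
    print(t, linear_entropy(rho))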
Abstract:
The effects of alkali treatment on the structural characteristics of cotton linters and sisal cellulose samples have been studied. Mercerization results in a decrease in the indices of crystallinity and the degrees of polymerization, and an increase in the alpha-cellulose contents of the samples. The relevance of the structural properties of cellulose to its dissolution is probed by studying the kinetics of cellulose decrystallization prior to its solubilization in LiCl/N,N-dimethylacetamide (DMAc). Our data show that the decrystallization rate constants and activation parameters are only slightly dependent on the physico-chemical properties of the starting celluloses. This multi-step reaction is accompanied by a small enthalpy of activation and a large, negative entropy of activation. These results are analyzed in terms of the interactions within the biopolymer chains during decrystallization, as well as those between the two ions of the electrolyte and both DMAc and cellulose.
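Activation parameters like these are typically extracted from rate constants measured at several temperatures via an Eyring plot, ln(k/T) = -ΔH‡/(RT) + ln(kB/h) + ΔS‡/R. A sketch with made-up rate constants (placeholders, not data from the paper):

import numpy as np

R = 8.314            # J mol^-1 K^-1
kB = 1.380649e-23    # J K^-1
h = 6.62607015e-34   # J s

# hypothetical decrystallization rate constants (s^-1) vs temperature (K)
T = np.array([298.0, 308.0, 318.0, 328.0])
k = np.array([1.2e-4, 2.0e-4, 3.1e-4, 4.8e-4])

slope, intercept = np.polyfit(1.0 / T, np.log(k / T), 1)
dH = -slope * R                         # activation enthalpy, J mol^-1
dS = (intercept - np.log(kB / h)) * R   # activation entropy, J mol^-1 K^-1
print(dH / 1000, "kJ/mol;", dS, "J/(mol K)")

For these placeholder values the fit returns a modest enthalpy and a large, negative entropy of activation, the pattern the abstract describes.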
Abstract:
The thermodynamic properties of dark energy fluids described by an equation of state parameter omega = p/rho are rediscussed in the context of FRW-type geometries. Contrary to previous claims, it is argued here that the phantom regime omega < -1 is not physically possible, since both the temperature and the entropy of any physical fluid must always be positive definite. This means that one cannot appeal to negative temperature in order to save the phantom dark energy hypothesis, as has recently been done in the literature. Such a result remains true as long as the chemical potential is zero. However, if the phantom fluid is endowed with a non-null chemical potential, the phantom field hypothesis becomes thermodynamically consistent, that is, there are macroscopic equilibrium states with T > 0 and S > 0 in the course of the Universe expansion. (C) 2008 Elsevier B.V. All rights reserved.
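As a worked version of the zero-chemical-potential argument (a standard Euler-relation calculation consistent with the abstract, not an excerpt from the paper): for a fluid with p = \omega\rho and \mu = 0,

\[
  Ts = \rho + p = (1+\omega)\rho
  \quad\Longrightarrow\quad
  S = \frac{(1+\omega)\rho V}{T},
\]

so for \rho > 0 and \omega < -1 the entropy and the temperature must have opposite signs, and T > 0 and S > 0 cannot hold simultaneously. With \mu \neq 0 the Euler relation becomes Ts = (1+\omega)\rho - \mu n, and a suitably negative chemical potential can restore S > 0 in the phantom regime.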
Abstract:
Texture is one of the most important visual attributes for image analysis and has been widely used in image analysis and pattern recognition. A partially self-avoiding deterministic walk has recently been proposed as an approach for texture analysis, with promising results. This approach uses walkers (called tourists) to exploit the gray-scale contexts of an image at several levels. Here, we present an approach to generate graphs from the trajectories produced by the tourist walks. The generated graphs embody important characteristics related to tourist transitivity in the image. Statistical measures of position (mean degree) and dispersion (entropy of the number of vertices with the same degree) computed from these graphs are used as texture descriptors. A comparison with traditional texture analysis methods is performed to illustrate the high performance of this novel approach. (C) 2011 Elsevier Ltd. All rights reserved.
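A sketch of the walk rule itself (descriptor extraction and graph construction are omitted; the 8-neighborhood, tie-breaking by scan order, and the trap condition are assumptions of this illustration):

import numpy as np

def tourist_walk(img, start, mu, n_steps=200):
    # deterministic partially self-avoiding walk: move to the neighboring
    # pixel with the closest gray level among those not visited in the
    # last mu steps
    h, w = img.shape
    path = [start]
    for _ in range(n_steps):
        y, x = path[-1]
        recent = set(path[-mu:]) if mu > 0 else set()
        best, best_d = None, None
        for dy in (-1, 0, 1):
            for dx in (-1, 0, 1):
                ny, nx = y + dy, x + dx
                if (dy, dx) == (0, 0) or not (0 <= ny < h and 0 <= nx < w):
                    continue
                if (ny, nx) in recent:
                    continue
                d = abs(int(img[ny, nx]) - int(img[y, x]))
                if best_d is None or d < best_d:
                    best, best_d = (ny, nx), d
        if best is None:        # trapped: all neighbors recently visited
            break
        path.append(best)
    return path

rng = np.random.default_rng(1)
img = rng.integers(0, 256, size=(64, 64), dtype=np.uint8)
print(len(tourist_walk(img, (32, 32), mu=2)))

Each trajectory would then contribute edges between successively visited pixels to the texture graph.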
Abstract:
Canalizing genes possess such broad regulatory power, and their action sweeps across such a wide swath of processes, that the full set of affected genes is not highly correlated under normal conditions. When not active, the controlling gene will not be predictable to any significant degree by its subject genes, either alone or in groups, since their behavior will be highly varied relative to the inactive controlling gene. When the controlling gene is active, its behavior is not well predicted by any one of its targets, but can be very well predicted by groups of genes under its control. To investigate this question, we introduce in this paper the concept of intrinsically multivariate predictive (IMP) genes, and present a mathematical study of IMP in the context of binary genes with respect to the coefficient of determination (CoD), which measures the predictive power of a set of genes with respect to a target gene. A set of predictor genes is said to be IMP for a target gene if all properly contained subsets of the predictor set are bad predictors of the target but the full predictor set predicts the target with great accuracy. We show that the logic of prediction, predictive power, covariance between predictors, and the entropy of the joint probability distribution of the predictors jointly affect the appearance of IMP genes. In particular, we show that high predictive power, small covariance among predictors, a large entropy of the joint probability distribution of predictors, and certain logics, such as XOR in the 2-predictor case, are factors that favor the appearance of IMP. The IMP concept is applied to characterize the behavior of the gene DUSP1, which exhibits control over a central, process-integrating signaling pathway, thereby providing preliminary evidence that IMP can be used as a criterion for discovery of canalizing genes.
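The XOR case is easy to reproduce numerically. Below is a sampled sketch of the CoD for binary variables, CoD = (eps0 - eps)/eps0, where eps0 is the error of the best constant prediction of the target and eps the error of the optimal predictor built on the given gene set (a simulation, not the paper's analytical treatment):

import numpy as np

rng = np.random.default_rng(0)
n = 100_000
x1 = rng.integers(0, 2, n)
x2 = rng.integers(0, 2, n)
y = x1 ^ x2                        # XOR target gene

def cod(predictors, target):
    # error of the best constant prediction of the target
    eps0 = min(np.mean(target != 0), np.mean(target != 1))
    # optimal predictor: majority vote of the target per predictor pattern
    keys = np.ravel_multi_index(predictors, (2,) * len(predictors))
    eps = 0.0
    for kv in np.unique(keys):
        sub = target[keys == kv]
        eps += min(np.mean(sub != 0), np.mean(sub != 1)) * np.mean(keys == kv)
    return (eps0 - eps) / eps0

print(cod((x1,), y))       # ~0: either gene alone tells nothing
print(cod((x2,), y))       # ~0
print(cod((x1, x2), y))    # ~1: the pair predicts the target exactly

The pair (x1, x2) is therefore IMP for y: every proper subset has CoD near zero while the full set has CoD near one.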
Abstract:
Measurements of X-ray diffraction, electrical resistivity, and magnetization are reported across the Jahn-Teller phase transition in LaMnO3. Using a thermodynamic equation, we obtained the pressure derivative of the critical temperature (T_JT), dT_JT/dP = -28.3 K GPa^-1. This approach also reveals that 5.7(3) J mol^-1 K^-1 of the entropy change comes from the volume change and 0.8(2) J mol^-1 K^-1 from the change in the magnetic exchange interaction across the phase transition. Around T_JT, a robust increase in the electrical conductivity takes place, and the electronic entropy change, which is assumed to be negligible for the majority of electronic systems, was found to be 1.8(3) J mol^-1 K^-1.
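The thermodynamic equation in question is presumably the Clausius-Clapeyron relation for a first-order transition (an inference from the abstract, not a quotation):

\[
  \frac{dT_{\mathrm{JT}}}{dP} = \frac{\Delta V}{\Delta S}
  \quad\Longrightarrow\quad
  \Delta S = \frac{\Delta V}{dT_{\mathrm{JT}}/dP},
\]

which converts the measured volume change and pressure derivative into the transition entropy, subsequently decomposed into volume, magnetic-exchange and electronic contributions.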
Abstract:
An entropy-based image segmentation approach is introduced and applied to color images obtained from Google Earth. Segmentation refers to the process of partitioning a digital image in order to locate different objects and regions of interest. The application to satellite images paves the way to automated monitoring of ecological catastrophes, urban growth, agricultural activity, maritime pollution, climate change and general surveillance. Regions representing aquatic, rural and urban areas are identified and the accuracy of the proposed segmentation methodology is evaluated. The comparison with gray-level images revealed that the color information is fundamental to obtaining an accurate segmentation. (C) 2010 Elsevier B.V. All rights reserved.
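A minimal sketch of one entropy-based segmentation scheme in this spirit: windowed color-histogram entropy followed by quantile thresholds (window size, bin count and the three-class split are illustrative assumptions, not the paper's exact pipeline):

import numpy as np

def window_entropy(img, win=16, bins=8):
    # Shannon entropy of the color histogram in non-overlapping windows;
    # heterogeneous regions (e.g. urban) score high, uniform ones
    # (e.g. water) score low
    h, w, c = img.shape
    out = np.zeros((h // win, w // win))
    for i in range(h // win):
        for j in range(w // win):
            patch = img[i*win:(i+1)*win, j*win:(j+1)*win].reshape(-1, c)
            hist, _ = np.histogramdd(patch, bins=bins, range=[(0, 256)] * c)
            p = hist[hist > 0] / hist.sum()
            out[i, j] = -(p * np.log2(p)).sum()
    return out

rng = np.random.default_rng(0)
img = rng.integers(0, 256, size=(128, 128, 3))  # stand-in for an RGB tile
ent = window_entropy(img)
labels = np.digitize(ent, np.quantile(ent, [1/3, 2/3]))  # three region classes
print(labels.shape, np.bincount(labels.ravel()))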
Abstract:
Four types of maltodextrin-encapsulated dehydrated blackberry fruit powders were obtained using vibrofluidized bed drying (VF), spray drying (SD), vacuum drying (VD), and freeze drying (FD). Moisture equilibrium data of blackberry pulp powders with 18% maltodextrin were determined at 20, 30, 40, and 50 degrees C using the static gravimetric method over the water activity range 0.06-0.90. Experimental equilibrium moisture content versus water activity data were fitted to the Guggenheim-Anderson-de Boer (GAB) model, with good agreement between experimental and calculated values. The isosteric heat of sorption of water was determined from the equilibrium data using the Clausius-Clapeyron equation; isosteric heats of sorption were found to increase with increasing temperature and could be adjusted by an exponential relationship. For the freeze-dried, vibrofluidized, and vacuum-dried pulp powder samples, the isosteric heats of sorption were lower (more negative) than those calculated for spray-dried samples. The enthalpy-entropy compensation theory was applied to the sorption isotherms, and plots of Delta H versus Delta S provided the isokinetic temperatures, indicating an enthalpy-controlled sorption process.
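A sketch of the GAB fit together with a two-temperature Clausius-Clapeyron estimate of the isosteric heat (all numerical values are made-up placeholders, not the paper's measurements):

import numpy as np
from scipy.optimize import curve_fit

def gab(aw, xm, c, k):
    # Guggenheim-Anderson-de Boer isotherm: equilibrium moisture content
    # as a function of water activity aw
    return xm * c * k * aw / ((1 - k * aw) * (1 - k * aw + c * k * aw))

# hypothetical sorption data: water activity vs moisture content (dry basis)
aw = np.array([0.11, 0.23, 0.33, 0.44, 0.58, 0.69, 0.75, 0.84, 0.90])
x = np.array([3.1, 4.6, 5.8, 7.2, 9.5, 12.4, 14.8, 20.3, 27.9])
params, _ = curve_fit(gab, aw, x, p0=(5.0, 10.0, 0.9))
print("xm, C, K =", params)

# isosteric heat at fixed moisture from isotherms at two temperatures:
# ln(aw2/aw1) = -(qst/R) * (1/T2 - 1/T1)
R = 8.314
aw1, aw2, T1, T2 = 0.40, 0.46, 293.15, 303.15   # hypothetical readings
qst = -R * np.log(aw2 / aw1) / (1/T2 - 1/T1)
print("q_st =", qst / 1000, "kJ/mol")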
Abstract:
The main goal of this work was to evaluate the thermodynamic parameters of the soybean oil extraction process using ethanol as solvent. The experimental treatments were as follows: aqueous solvents with water contents varying from 0 to 13% (mass basis) and extraction temperatures varying from 50 to 100 degrees C. The distribution coefficients of oil at equilibrium were used to calculate the enthalpy, entropy, and free energy changes. The results indicate that the oil extraction process with ethanol is feasible and spontaneous, mainly at higher temperatures. The influence of the water level in the solvent and of temperature was also analysed using response surface methodology (RSM); the extraction yield was highly affected by both independent variables. A joint analysis of the thermodynamic parameters and the RSM results indicates the optimal levels of solvent hydration and temperature at which to perform the extraction process.
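A sketch of how such parameters follow from equilibrium distribution coefficients, via ΔG = -RT ln K and a van 't Hoff plot of ln K against 1/T (the coefficients below are made-up placeholders, not the paper's data):

import numpy as np

R = 8.314  # J mol^-1 K^-1

# hypothetical oil distribution coefficients at several temperatures (K)
T = np.array([323.15, 333.15, 343.15, 353.15, 363.15, 373.15])
K = np.array([0.62, 0.74, 0.88, 1.05, 1.24, 1.46])

dG = -R * T * np.log(K)                 # Gibbs energy change at each T
slope, intercept = np.polyfit(1.0 / T, np.log(K), 1)
dH = -R * slope                         # van 't Hoff enthalpy change
dS = R * intercept                      # entropy change
print(dG / 1000)                        # kJ/mol; negative = spontaneous
print(dH / 1000, "kJ/mol;", dS, "J/(mol K)")

For these placeholder coefficients ΔG turns negative only at the higher temperatures, mirroring the abstract's observation that extraction is spontaneous mainly at higher temperature.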
An improved estimate of leaf area index based on the histogram analysis of hemispherical photographs
Abstract:
Leaf area index (LAI) is a key parameter that affects the surface fluxes of energy, mass, and momentum over vegetated land, but observational measurements are scarce, especially in remote areas with complex canopy structure. In this paper we present an indirect method to calculate LAI based on the analysis of histograms of hemispherical photographs. The optimal threshold value (OTV), the gray level required to separate the background (sky) from the foreground (leaves), was calculated analytically using the entropy crossover method (Sahoo, P.K., Slaaf, D.W., Albert, T.A., 1997. Threshold selection using a minimal histogram entropy difference. Optical Engineering 36(7), 1976-1981). The OTV was then used to calculate LAI via the well-known gap fraction method. The methodology was tested in two different ecosystems, Amazon forest and pasturelands in Brazil. In general, the error between observed and calculated LAI was approximately 6%. The methodology presented is suitable for the calculation of LAI since it is responsive to sky conditions, automatic, easy to implement, faster than commercially available software, and requires less data storage. (C) 2008 Elsevier B.V. All rights reserved.
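A runnable sketch of the pipeline, with two caveats: the threshold below uses the Kapur-style maximum-entropy criterion as a stand-in for Sahoo et al.'s entropy crossover formula, and the gap fraction is converted to LAI by the simple Beer-Lambert form LAI = -ln(P)/k with an assumed extinction coefficient k = 0.5:

import numpy as np

def entropy_threshold(gray):
    # pick the gray level maximizing the summed Shannon entropies of the
    # background and foreground histograms (Kapur-style criterion)
    hist = np.bincount(gray.ravel(), minlength=256).astype(float)
    p = hist / hist.sum()
    best_t, best_h = 1, -np.inf
    for t in range(1, 256):
        pb, pf = p[:t].sum(), p[t:].sum()
        if pb == 0 or pf == 0:
            continue
        pbn, pfn = p[:t][p[:t] > 0] / pb, p[t:][p[t:] > 0] / pf
        h_sum = -(pbn * np.log(pbn)).sum() - (pfn * np.log(pfn)).sum()
        if h_sum > best_h:
            best_t, best_h = t, h_sum
    return best_t

rng = np.random.default_rng(0)
gray = rng.integers(0, 256, size=(256, 256), dtype=np.uint8)  # stand-in photo
t = entropy_threshold(gray)
gap_fraction = np.mean(gray >= t)       # bright pixels taken as visible sky
lai = -np.log(gap_fraction) / 0.5
print(t, gap_fraction, lai)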
Abstract:
Evidence of jet precession in many galactic and extragalactic sources has been reported in the literature. Much of this evidence is based on studies of the kinematics of the jet knots, which depends on the correct identification of the components to determine their respective proper motions and position angles on the plane of the sky. Identification problems related to fitting procedures, as well as observations poorly sampled in time, may hamper the follow-up of the components in time and consequently contribute to a misinterpretation of the data. In order to deal with these limitations, we introduce a very powerful statistical tool to analyse jet precession: the cross-entropy method for continuous multi-extremal optimization. Based only on the raw data of the jet components (right ascension and declination offsets from the core), the cross-entropy method searches for the precession model parameters that best represent the data. In this work we present a large number of tests to validate this technique, using synthetic precessing jets built from a given set of precession parameters. With the aim of recovering these parameters, we applied the cross-entropy method to our precession model, varying exhaustively the quantities associated with the method. Our results show that even in the most challenging tests, the cross-entropy method was able to find the correct parameters at the 1 per cent level. Even for a non-precessing jet, our optimization method successfully indicated the lack of precession.
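A generic sketch of the continuous cross-entropy method itself (the toy objective recovers the frequency and phase of a sinusoid standing in for a precession signature; the precession model is not reproduced here):

import numpy as np

def cross_entropy_min(objective, dim, n_samples=200, n_elite=20,
                      n_iter=100, seed=0):
    # sample candidates from a Gaussian, keep the elite with the lowest
    # objective, refit the Gaussian to the elite, and repeat until it
    # concentrates on a (near-)optimal parameter vector
    rng = np.random.default_rng(seed)
    mu, sigma = np.zeros(dim), np.full(dim, 5.0)
    for _ in range(n_iter):
        x = rng.normal(mu, sigma, size=(n_samples, dim))
        elite = x[np.argsort([objective(v) for v in x])[:n_elite]]
        mu, sigma = elite.mean(axis=0), elite.std(axis=0) + 1e-9
    return mu

t = np.linspace(0, 10, 200)
data = np.sin(1.3 * t + 0.7)
obj = lambda p: np.sum((np.sin(p[0] * t + p[1]) - data) ** 2)
print(cross_entropy_min(obj, dim=2))   # one of the equivalent optima,
                                       # e.g. (1.3, 0.7) up to sign/phase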
Abstract:
We present a new technique for obtaining model fittings to very long baseline interferometric images of astrophysical jets. The method minimizes a performance function proportional to the sum of the squared differences between the model and observed images. The model image is constructed by summing N_s elliptical Gaussian sources, each characterized by six parameters: two-dimensional peak position, peak intensity, eccentricity, amplitude, and orientation angle of the major axis. We present results for the fitting of two benchmark jets: the first constructed from three individual Gaussian sources, the second formed by five Gaussian sources. Both jets were analyzed by our cross-entropy technique in finite and infinite signal-to-noise regimes, with the background noise chosen to mimic that found in interferometric radio maps. These images were constructed to simulate most of the conditions encountered in interferometric images of active galactic nuclei. We show that the cross-entropy technique is capable of recovering the parameters of the sources with accuracy similar to that obtained with the traditional Astronomical Image Processing System task IMFIT when the image is relatively simple (e.g., few components). For more complex interferometric maps, our method displays superior performance in recovering the parameters of the jet components. Our methodology is also able to quantify the number of individual components present in an image. An additional application of the cross-entropy technique to a real image of a BL Lac object is shown and discussed. Our results indicate that our cross-entropy model-fitting technique must be used in situations involving the analysis of complex emission regions with more than three sources, even though it is substantially slower than current model-fitting tasks (at least 10,000 times slower on a single processor, depending on the number of sources to be optimized). As with any model fitting performed in the image plane, caution is required in analyzing images constructed from a poorly sampled (u, v) plane.
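A sketch of the model image and the performance function (the abstract does not spell out the six-parameter convention, so the parameterization below, peak position, peak intensity, semi-major axis, eccentricity and orientation angle, is an assumption):

import numpy as np

def model_image(shape, components):
    # sum of elliptical Gaussian sources on a pixel grid
    y, x = np.mgrid[:shape[0], :shape[1]]
    img = np.zeros(shape)
    for (x0, y0, peak, a, ecc, theta) in components:
        b = a * np.sqrt(1 - ecc**2)            # semi-minor axis
        xr = (x - x0) * np.cos(theta) + (y - y0) * np.sin(theta)
        yr = -(x - x0) * np.sin(theta) + (y - y0) * np.cos(theta)
        img += peak * np.exp(-0.5 * ((xr / a) ** 2 + (yr / b) ** 2))
    return img

def performance(params, observed):
    # sum of squared residuals between model and observed images,
    # the quantity the cross-entropy optimizer minimizes
    return np.sum((model_image(observed.shape, params) - observed) ** 2)

truth = [(20, 24, 1.0, 4.0, 0.6, 0.3), (40, 30, 0.5, 6.0, 0.8, 1.2)]
obs = (model_image((64, 64), truth)
       + np.random.default_rng(0).normal(0, 0.01, (64, 64)))
print(performance(truth, obs))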
Abstract:
Non-linear methods for estimating variability in time series are currently in widespread use. Among such methods are approximate entropy (ApEn) and sample entropy (SampEn). The applicability of ApEn and SampEn in analyzing data is evident and their use is increasing. However, consistency is a point of concern with these tools: the classification of the temporal organization of a data set might indicate one series as relatively less ordered than another when the opposite is true. As highlighted by their proponents themselves, ApEn and SampEn might produce incorrect results due to this lack of consistency. In this study, we present a method which gains consistency by applying ApEn repeatedly over a wide range of combinations of window lengths and matching error tolerances. The tool is called volumetric approximate entropy, vApEn. We analyze nine artificially generated prototypical time series with different degrees of temporal order (combinations of sine waves, logistic maps with different control parameter values, and random noise). While ApEn/SampEn clearly fail to consistently identify the temporal order of the sequences, vApEn does so correctly. To validate the tool we performed shuffled and surrogate data analyses. Statistical analysis confirmed the consistency of the method. (C) 2008 Elsevier Ltd. All rights reserved.
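A sketch of ApEn and of the volumetric sweep (whether "window length" refers to the embedding dimension m, as assumed here, or to the series length is not settled by the abstract; the (m, r) grids are illustrative):

import numpy as np

def apen(x, m, r):
    # approximate entropy with window length m and tolerance r
    # (Pincus' definition, Chebyshev distance, self-matches included)
    x = np.asarray(x, dtype=float)
    def phi(mm):
        n = len(x) - mm + 1
        emb = np.array([x[i:i + mm] for i in range(n)])
        c = [np.mean(np.max(np.abs(emb - w), axis=1) <= r) for w in emb]
        return np.mean(np.log(c))
    return phi(m) - phi(m + 1)

def vapen(x, ms=(1, 2, 3), rs=(0.1, 0.15, 0.2, 0.25)):
    # volumetric ApEn sketch: aggregate ApEn over a grid of window
    # lengths and tolerances (given as fractions of the series SD)
    sd = np.std(x)
    return sum(apen(x, m, r * sd) for m in ms for r in rs)

rng = np.random.default_rng(0)
noise = rng.normal(size=500)
sine = np.sin(np.linspace(0, 20 * np.pi, 500))
print(vapen(sine), vapen(noise))   # the ordered signal scores lower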