924 results for Renyi’s entropy
Abstract:
Herein, we present the physical and chemical characterization of the supramolecular complex formed between beta-cyclodextrin (beta CD) and bradykinin potentiating nonapeptide (BPP9a), an endogenous toxin found in Bothrops jararaca. Circular dichroism results indicate a conformational change in the BPP9a secondary structure upon its complexation with beta CD. Nuclear magnetic resonance results, mainly from NOESY experiments, and theoretical calculations showed a favorable interaction between the tryptophan residue of BPP9a and the beta CD cavity. Thermodynamic inclusion parameters were investigated by isothermal titration calorimetry, demonstrating that beta CD/BPP9a complex formation is an exothermic process that results in a reduction in entropy. Additionally, an in vitro degradation study of BPP9a against trypsin (37 degrees C, pH 7.2) showed higher stability of the peptide in the presence of beta CD. This beta CD/BPP9a complex, which presents new chemical properties arising from the peptide inclusion process, may be useful as an antihypertensive drug in oral pharmaceutical formulations. (C) 2011 Elsevier B.V. All rights reserved.
Abstract:
This paper compares the effectiveness of the Tsallis entropy over the classic Boltzmann-Gibbs-Shannon entropy for general pattern recognition, and proposes a multi-q approach to improve pattern analysis using entropy. A series of experiments were carried out for the problem of classifying image patterns. Given a dataset of 40 pattern classes, the goal of our image case study is to assess how well the different entropies can be used to determine the class of a newly given image sample. Our experiments show that the Tsallis entropy using the proposed multi-q approach has great advantages over the Boltzmann-Gibbs-Shannon entropy for pattern classification, boosting image recognition rates by a factor of 3. We discuss the reasons behind this success, shedding light on the usefulness of the Tsallis entropy and the multi-q approach. (C) 2012 Elsevier B.V. All rights reserved.
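To make the quantity in the abstract above concrete, here is a minimal sketch of the Tsallis entropy and a multi-q feature vector. The `multi_q_features` helper and the choice of q values are illustrative assumptions, not the paper's actual implementation; the histogram-based image patch is a toy stand-in for the pattern dataset.

```python
import numpy as np

def tsallis_entropy(p, q):
    """Tsallis entropy S_q of a probability distribution p.
    Reduces to the Boltzmann-Gibbs-Shannon entropy as q -> 1."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    if np.isclose(q, 1.0):
        return -np.sum(p * np.log(p))          # BGS limit
    return (1.0 - np.sum(p ** q)) / (q - 1.0)

def multi_q_features(p, qs):
    """Hypothetical multi-q descriptor: one Tsallis entropy per q value."""
    return np.array([tsallis_entropy(p, q) for q in qs])

# Toy example: gray-level histogram of a random 16x16 image patch
patch = np.random.default_rng(0).integers(0, 256, size=(16, 16))
hist, _ = np.histogram(patch, bins=256, range=(0, 256))
p = hist / hist.sum()
features = multi_q_features(p, qs=[0.5, 1.0, 1.5, 2.0])
```

A classifier would then compare such multi-q vectors across classes rather than a single scalar entropy.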
Abstract:
The amount of information exchanged per unit of time between two nodes in a dynamical network or between two data sets is a powerful concept for analysing complex systems. This quantity, known as the mutual information rate (MIR), is calculated from the mutual information, which is rigorously defined only for random systems. Moreover, the definition of mutual information is based on probabilities of significant events. This work offers a simple alternative way to calculate the MIR in dynamical (deterministic) networks or between two time series (not fully deterministic), and to calculate its upper and lower bounds without having to calculate probabilities, but rather in terms of well known and well defined quantities in dynamical systems. As possible applications of our bounds, we study the relationship between synchronisation and the exchange of information in a system of two coupled maps and in experimental networks of coupled oscillators.
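The abstract's starting point, mutual information estimated from probabilities of events, can be sketched as follows. This is a crude histogram plug-in estimator applied to two coupled logistic maps; the coupling strength and map are illustrative assumptions, and the paper's contribution is precisely that the MIR can be bounded without estimating such probabilities.

```python
import numpy as np

def mutual_information(x, y, bins=16):
    """Histogram (plug-in) estimate of I(X;Y) in nats."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = pxy / pxy.sum()
    px = pxy.sum(axis=1)                      # marginal of x
    py = pxy.sum(axis=0)                      # marginal of y
    nz = pxy > 0
    return np.sum(pxy[nz] * np.log(pxy[nz] / np.outer(px, py)[nz]))

# Two diffusively coupled logistic maps (illustrative dynamical network)
rng = np.random.default_rng(1)
n, eps = 5000, 0.1
f = lambda u: 4.0 * u * (1.0 - u)
x = np.empty(n); y = np.empty(n)
x[0], y[0] = rng.random(), rng.random()
for t in range(n - 1):
    x[t + 1] = (1 - eps) * f(x[t]) + eps * f(y[t])
    y[t + 1] = (1 - eps) * f(y[t]) + eps * f(x[t])
mi = mutual_information(x[100:], y[100:])     # discard transient
```

Dividing such an estimate by the sampling time gives a naive rate; the bounds in the paper avoid the binning bias this estimator suffers from.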
Abstract:
Financial markets can be viewed as a highly complex evolving system that is very sensitive to economic instabilities. The complex organization of the market can be represented in a suitable fashion in terms of complex networks, which can be constructed from stock prices such that each pair of stocks is connected by a weighted edge that encodes the distance between them. In this work, we propose an approach to analyze the topological and dynamic evolution of financial networks based on the stock correlation matrices. An entropy-related measurement is adopted to quantify the robustness of the evolving financial market organization. It is verified that the network topological organization undergoes strong variation during financial instabilities, and the networks in such periods become less robust. A robust statistical regression model is proposed to quantify the relationship between the network structure and resilience. The coefficients obtained for this model indicate that the average shortest path length is the measurement most related to the network resilience coefficient. This result indicates that a collective behavior is observed between stocks during financial crises. More specifically, stocks tend to synchronize their price evolution, leading to a high correlation between pairs of stock prices, which contributes to the increase in distance between them and, consequently, decreases the network resilience. (C) 2012 American Institute of Physics. [doi:10.1063/1.3683467]
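The network construction described above can be sketched in a few lines. The distance metric d_ij = sqrt(2(1 - rho_ij)) is a common choice for correlation-based financial networks and is an assumption here; the abstract does not spell out the exact metric used.

```python
import numpy as np

def correlation_network(prices):
    """Distance matrix for a fully connected stock network.
    Uses d_ij = sqrt(2 (1 - rho_ij)) on the correlation matrix of
    log-returns (an assumed, commonly used metric)."""
    returns = np.diff(np.log(prices), axis=0)   # log-returns per stock
    rho = np.corrcoef(returns, rowvar=False)    # pairwise correlations
    return np.sqrt(np.clip(2.0 * (1.0 - rho), 0.0, None))

# Toy price series: 5 stocks driven by a shared market factor
rng = np.random.default_rng(2)
common = rng.normal(size=250)
prices = np.exp(np.cumsum(
    0.01 * (common[:, None] + rng.normal(size=(250, 5))), axis=0))
dist = correlation_network(prices)
```

High correlation (synchronized prices during a crisis) pushes d_ij toward 0 under this metric; note the abstract states the opposite sign convention for its own distance, so the metric there likely differs in orientation.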
Abstract:
We explore the meaning of information about quantities of interest. Our approach is divided in two scenarios: the analysis of observations and the planning of an experiment. First, we review the Sufficiency, Conditionality and Likelihood principles and how they relate to trivial experiments. Next, we review Blackwell Sufficiency and show that sampling without replacement is Blackwell Sufficient for sampling with replacement. Finally, we unify the two scenarios presenting an extension of the relationship between Blackwell Equivalence and the Likelihood Principle.
Abstract:
Renyi and von Neumann entropies quantifying the amount of entanglement in ground states of critical spin chains are known to satisfy a universal law which is given by the conformal field theory (CFT) describing their scaling regime. This law can be generalized to excitations described by primary fields in CFT, as was done by Alcaraz et al in 2011 (see reference [1], of which this work is a completion). An alternative derivation is presented, together with numerical verifications of our results in different models belonging to the c = 1, 1/2 universality classes. Oscillations of the Renyi entropy in excited states are also discussed.
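For reference, the universal ground-state law alluded to above is, in the standard Calabrese-Cardy form for an interval of length l in a periodic critical chain of length L (quoted here as background; the paper's generalization to primary-field excitations modifies this expression):

```latex
% Renyi entropy of an interval of length l in a periodic chain of
% length L, ground state of a CFT with central charge c:
S_n(l) = \frac{c}{6}\left(1 + \frac{1}{n}\right)
         \ln\!\left[\frac{L}{\pi}\,\sin\frac{\pi l}{L}\right] + c'_n
```

The von Neumann entropy is recovered in the limit n -> 1, giving the familiar (c/3) ln prefactor.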
Abstract:
We analyzed the effectiveness of linear short- and long-term variability time domain parameters, an index of sympatho-vagal balance (SDNN/RMSSD) and entropy in differentiating fetal heart rate patterns (fHRPs) on fetal heart rate (fHR) series of 5, 3 and 2 min duration reconstructed from 46 fetal magnetocardiograms. Gestational age (GA) varied from 21 to 38 weeks. fHRPs were classified based on the fHR standard deviation. In sleep states, we observed that vagal influence increased with GA, and entropy significantly increased (decreased) with GA (SDNN/RMSSD), demonstrating that a prevalence of vagal activity with autonomous nervous system maturation may be associated with increased sleep state complexity. In active wakefulness, we observed a significant negative (positive) correlation of short-term (long-term) variability parameters with SDNN/RMSSD. ANOVA statistics demonstrated that long-term irregularity and the standard deviation of normal-to-normal beat intervals (SDNN) best differentiated among fHRPs. Our results confirm that short- and long-term variability parameters are useful to differentiate between quiet and active states, and that entropy improves the characterization of sleep states. All measures differentiated fHRPs more effectively on very short HR series, as a result of the fMCG high temporal resolution and of the intrinsic timescales of the events that originate the different fHRPs.
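The two time-domain parameters underlying the SDNN/RMSSD balance index above have simple standard definitions, sketched here; the simulated NN-interval series is purely illustrative and has no relation to the paper's fetal data.

```python
import numpy as np

def sdnn(nn):
    """Standard deviation of normal-to-normal (NN) beat intervals, ms.
    Reflects overall (long-term) variability."""
    return float(np.std(nn, ddof=1))

def rmssd(nn):
    """Root mean square of successive NN-interval differences, ms.
    Reflects short-term, vagally mediated variability."""
    return float(np.sqrt(np.mean(np.diff(nn) ** 2)))

# Illustrative NN series: slow drift around 430 ms plus beat-to-beat jitter
rng = np.random.default_rng(3)
nn = 430 + np.cumsum(rng.normal(0, 2, size=300))
balance = sdnn(nn) / rmssd(nn)   # sympatho-vagal balance index
```

A lower SDNN/RMSSD, as the abstract reports for later gestational ages in sleep states, corresponds to relatively stronger short-term (vagal) variability.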
Abstract:
The present paper offers a theoretical analysis of a cross flow heat exchanger with a new flow arrangement comprising several tube rows. The thermal performance of the proposed flow arrangement is compared with that of a typical counter cross flow arrangement used in the chemical, refrigeration, automotive and air conditioning industries. The comparison has been performed in terms of the following parameters: heat exchanger effectiveness and efficiency, dimensionless entropy generation, entransy dissipation number, and dimensionless local temperature differences. It is also shown that the uniformity of the temperature difference field leads to a higher thermal performance of the heat exchanger; in the present case this is accomplished through a different organization of the in-tube fluid circuits in the heat exchanger. The relation between the recently introduced "entransy dissipation number" and the conventional thermal effectiveness has been obtained in terms of the "number of transfer units". A case study has been solved to quantitatively obtain the temperature difference distribution over two-row units for both the proposed arrangement and the counter cross flow one. It has been shown that the proposed arrangement presents better thermal performance regardless of the comparison parameter. (C) 2012 Elsevier Masson SAS. All rights reserved.
Abstract:
Exact results on particle densities as well as correlators in two models of immobile particles, containing either a single species or two distinct species, are derived. The models evolve following a descent dynamics through pair annihilation in which each particle interacts at most once throughout its entire history. The resulting large number of stationary states leads to a non-vanishing configurational entropy. Our results are established for arbitrary initial conditions and are derived via a generating function method. The single-species model is the dual of the 1D zero-temperature kinetic Ising model with Kimball-Deker-Haake dynamics. In this way, finite and semi-infinite chains, as well as the Bethe lattice, can be analysed. The relationship with the random sequential adsorption of dimers and with weakly tapped granular materials is discussed.
Abstract:
We discuss a new interacting model for the cosmological dark sector in which the attenuated dilution of cold dark matter scales as a^(-3) f(a), where f(a) is an arbitrary function of the cosmic scale factor a. From thermodynamic arguments, we show that f(a) is proportional to the entropy source of the particle creation process. In order to investigate the cosmological consequences of this kind of interacting models, we expand f(a) in a power series, and viable cosmological solutions are obtained. Finally, we use current observational data to place constraints on the interacting function f(a).
Abstract:
We investigate how the initial geometry of a heavy-ion collision is transformed into final flow observables by solving event-by-event ideal hydrodynamics with realistic fluctuating initial conditions. We study quantitatively to what extent anisotropic flow (v_n) is determined by the initial eccentricity ε_n for a set of realistic simulations, and we discuss which definition of ε_n gives the best estimator of v_n. We find that the common practice of using an r^2 weight in the definition of ε_n in general results in a poorer predictor of v_n than when using an r^n weight, for n > 2. We similarly study the importance of additional properties of the initial state. For example, we show that in order to correctly predict v_4 and v_5 for noncentral collisions, one must take into account nonlinear terms proportional to ε_2^2 and ε_2 ε_3, respectively. We find that it makes no difference whether one calculates the eccentricities over a range of rapidity or in a single slice at z = 0, nor is it important whether one uses an energy or entropy density weight. This knowledge will be important for making a more direct link between experimental observables and hydrodynamic initial conditions, the latter being poorly constrained at present.
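The eccentricity compared above has a standard moment definition that can be sketched directly; the sampled Gaussian "participants" and unit weights are illustrative assumptions, standing in for an energy or entropy density profile.

```python
import numpy as np

def eccentricity(x, y, w, n, radial_power=None):
    """Eccentricity eps_n of a weighted point cloud (x, y).
    radial_power defaults to n (the r^n weight favoured above for
    n > 2); pass 2 to recover the common r^2-weighted definition."""
    m = n if radial_power is None else radial_power
    x = x - np.average(x, weights=w)           # recentre the event
    y = y - np.average(y, weights=w)
    r = np.hypot(x, y)
    phi = np.arctan2(y, x)
    num = np.abs(np.sum(w * r**m * np.exp(1j * n * phi)))
    return num / np.sum(w * r**m)

# Toy fluctuating initial condition: elongated Gaussian participant cloud
rng = np.random.default_rng(4)
x = rng.normal(0, 3, 2000) * 1.4               # elongated along x
y = rng.normal(0, 3, 2000)
w = np.ones_like(x)
eps2 = eccentricity(x, y, w, n=2)              # ellipticity
```

An azimuthally symmetric cloud gives eps_n near zero (up to finite-sampling fluctuations), while the elongated cloud above yields a sizeable eps_2.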
Abstract:
In this work, we probe the stability of a z = 3 three-dimensional Lifshitz black hole by using scalar and spinorial perturbations. We found an analytical expression for the quasinormal frequencies of the scalar probe field, which perfectly agree with the behavior of the quasinormal modes obtained numerically. The results for the numerical analysis of the spinorial perturbations reinforce the conclusion of the scalar analysis, i.e., the model is stable under scalar and spinor perturbations. As an application we found the area spectrum of the Lifshitz black hole, which turns out to be equally spaced.
Abstract:
Native bees are important providers of pollination services, but there is cumulative evidence of their decline. Global changes such as habitat losses, invasions of exotic species and climate change have been suggested as the main causes of the decline of pollinators. In this study, the influence of climate change on the distribution of 10 species of Brazilian bees was estimated with species distribution modelling. We used the Maxent algorithm (maximum entropy) and two different scenarios, an optimistic and a pessimistic one, for the years 2050 and 2080. We also evaluated the percentage reduction of species habitat based on the future scenarios of climate change through a Geographic Information System (GIS). Results showed that the total area of suitable habitats decreased for all species but one under the different future scenarios. The greatest reductions in habitat area were found for Melipona bicolor bicolor and Melipona scutellaris, which occur predominantly in areas originally related to the Atlantic Moist Forest. The species analysed have been reported to be pollinators of some regional crops, and the consequence of their decrease for these crops needs further clarification. (C) 2012 Elsevier B.V. All rights reserved.
Abstract:
A chaotic encryption algorithm is proposed based on "Life-like" cellular automata (CA), which act as a pseudo-random number generator (PRNG). The paper's main focus is the application of chaos theory to cryptography; thus, the CA were explored in search of this "chaos" property. Accordingly, the manuscript concentrates on tests such as the Lyapunov exponent, entropy and Hamming distance to measure chaos in the CA, as well as statistical analyses such as the DIEHARD and ENT suites. Our results achieved higher randomness quality than other ciphers in the literature. These results reinforce the supposition of a strong relationship between chaos and randomness quality. Thus, the "chaos" property of CA is a good reason for them to be employed in cryptography, in addition to their simplicity, low implementation cost and respectable encryption power. (C) 2012 Elsevier Ltd. All rights reserved.
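The CA-as-PRNG idea above can be illustrated with a much simpler automaton. The sketch below uses the elementary Rule 30 (a classic chaotic CA) rather than the paper's two-dimensional "Life-like" rules, which is an assumption made purely to keep the example short, and measures the entropy of the resulting bit stream as one of the randomness checks the abstract mentions.

```python
import numpy as np

def ca_prng_bits(rule=30, width=64, steps=256, seed_cell=32):
    """Bit stream from an elementary cellular automaton: the centre
    cell is recorded at each step (Rule 30 stands in for the paper's
    Life-like 2D automata)."""
    table = [(rule >> i) & 1 for i in range(8)]   # rule lookup table
    state = np.zeros(width, dtype=np.uint8)
    state[seed_cell] = 1                          # single-seed start
    bits = []
    for _ in range(steps):
        bits.append(int(state[width // 2]))
        l, r = np.roll(state, 1), np.roll(state, -1)
        state = np.array([table[4*a + 2*b + c]
                          for a, b, c in zip(l, state, r)], dtype=np.uint8)
    return bits

def shannon_entropy(bits):
    """Entropy of the bit stream in bits/symbol (1.0 is ideal)."""
    p1 = sum(bits) / len(bits)
    if p1 in (0.0, 1.0):
        return 0.0
    return -(p1 * np.log2(p1) + (1 - p1) * np.log2(1 - p1))

stream = ca_prng_bits()
h = shannon_entropy(stream)
```

An entropy close to 1 bit/symbol is a necessary (though far from sufficient) condition for cryptographic quality; the DIEHARD and ENT batteries cited above test much stronger properties.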
Abstract:
Statistical methods have been widely employed to assess the capabilities of credit scoring classification models in order to reduce the risk of wrong decisions when granting credit facilities to clients. The predictive quality of a classification model can be evaluated based on measures such as sensitivity, specificity, predictive values, accuracy, correlation coefficients and information theoretical measures, such as relative entropy and mutual information. In this paper we analyze the performance of a naive logistic regression model (Hosmer & Lemeshow, 1989) and a logistic regression with state-dependent sample selection model (Cramer, 2004) applied to simulated data. Also, as a case study, the methodology is illustrated on a data set extracted from a Brazilian bank portfolio. Our simulation results so far revealed that there is no statistically significant difference in terms of predictive capacity between the naive logistic regression models and the logistic regression with state-dependent sample selection models. However, there is strong difference between the distributions of the estimated default probabilities from these two statistical modeling techniques, with the naive logistic regression models always underestimating such probabilities, particularly in the presence of balanced samples. (C) 2012 Elsevier Ltd. All rights reserved.
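The naive logistic regression baseline discussed above can be sketched with simulated scoring data. The gradient-descent fit, the single covariate and the coefficient values are all illustrative assumptions; the paper's state-dependent sample selection variant is not reproduced here.

```python
import numpy as np

def fit_logistic(X, y, lr=0.1, iters=3000):
    """Plain (naive) logistic regression fitted by gradient descent,
    a minimal stand-in for the Hosmer & Lemeshow-style model."""
    X1 = np.column_stack([np.ones(len(X)), X])    # add intercept column
    w = np.zeros(X1.shape[1])
    for _ in range(iters):
        p = 1.0 / (1.0 + np.exp(-X1 @ w))         # predicted probabilities
        w += lr * X1.T @ (y - p) / len(y)         # ascend the log-likelihood
    return w

def predict_default_prob(w, X):
    """Estimated default probabilities for new applicants."""
    X1 = np.column_stack([np.ones(len(X)), X])
    return 1.0 / (1.0 + np.exp(-X1 @ w))

# Simulated portfolio: one covariate driving the true default probability
rng = np.random.default_rng(5)
X = rng.normal(size=(1000, 1))
true_p = 1.0 / (1.0 + np.exp(-(-1.0 + 2.0 * X[:, 0])))
y = (rng.random(1000) < true_p).astype(float)
w = fit_logistic(X, y)
p_hat = predict_default_prob(w, X)
```

Comparing the distribution of `p_hat` under different sampling schemes (balanced versus unbalanced) is the kind of exercise in which the abstract reports the naive model underestimating default probabilities.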