924 results for Shannon's entropy
Abstract:
The realization that statistical physics methods can be applied to analyze written texts represented as complex networks has led to several developments in natural language processing, including automatic summarization and evaluation of machine translation. Most importantly, so far only a few metrics of complex networks have been used and therefore there is ample opportunity to enhance the statistics-based methods as new measures of network topology and dynamics are created. In this paper, we employ for the first time the metrics betweenness, vulnerability and diversity to analyze written texts in Brazilian Portuguese. Using strategies based on diversity metrics, a better performance in automatic summarization is achieved in comparison to previous work employing complex networks. With an optimized method the Rouge score (an automatic evaluation method used in summarization) was 0.5089, which is the best value ever achieved for an extractive summarizer with statistical methods based on complex networks for Brazilian Portuguese. Furthermore, the diversity metric can detect keywords with high precision, which is why we believe it is suitable to produce good summaries. It is also shown that incorporating linguistic knowledge through a syntactic parser does enhance the performance of the automatic summarizers, as expected, but the increase in the Rouge score is only minor. These results reinforce the suitability of complex network methods for improving automatic summarizers in particular, and treating text in general. (C) 2011 Elsevier B.V. All rights reserved.
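The extractive strategy described above (build a network from the text, score each sentence by a topological metric, keep the top-ranked sentences) can be sketched in a few lines. This is a hypothetical minimal example that ranks sentences by node strength (sum of edge weights) in a sentence-similarity network; the paper's diversity, betweenness and vulnerability metrics are more elaborate, and all names below are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch: extractive summarization by node strength in a
# sentence-similarity network (a simplified stand-in for the paper's
# diversity-based strategies).

def summarize(sentences, n_keep=2):
    # Tokenize each sentence into a set of lowercase words.
    word_sets = [set(s.lower().split()) for s in sentences]
    # Edge weight = number of shared words between a pair of sentences;
    # a node's strength is the sum of its edge weights.
    strength = [0] * len(sentences)
    for i in range(len(sentences)):
        for j in range(i + 1, len(sentences)):
            w = len(word_sets[i] & word_sets[j])
            strength[i] += w
            strength[j] += w
    # Keep the n_keep best-connected sentences, in original order.
    ranked = sorted(range(len(sentences)), key=lambda i: -strength[i])
    keep = sorted(ranked[:n_keep])
    return [sentences[i] for i in keep]
```

Scoring sentences by how strongly they connect to the rest of the text is the intuition behind using network centrality for summarization; a metric like diversity refines what "strongly connected" means.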
Abstract:
We present the results of an operational use of experimentally measured optical tomograms to determine state characteristics (purity), avoiding any reconstruction of quasiprobabilities. We also develop a natural way to estimate the errors (both statistical and systematic) by analyzing the experimental data themselves. The precision of the experiment can be increased by postselecting the data with minimal (systematic) errors. We demonstrate these techniques on coherent and photon-added coherent states measured via time-domain improved homodyne detection. The operational use and precision of the data allowed us to check purity-dependent uncertainty relations and uncertainty relations for the Shannon and Renyi entropies.
Abstract:
Background: Heavy-flavor production in p + p collisions is a good test of perturbative-quantum-chromodynamics (pQCD) calculations. Modification of heavy-flavor production in heavy-ion collisions relative to binary-collision scaling from p + p results, quantified with the nuclear-modification factor (R-AA), provides information on both cold- and hot-nuclear-matter effects. Midrapidity heavy-flavor R-AA measurements at the Relativistic Heavy Ion Collider have challenged parton-energy-loss models and resulted in upper limits on the viscosity-entropy ratio that are near the quantum lower bound. Such measurements have not been made in the forward-rapidity region. Purpose: Determine transverse-momentum (p(T)) spectra and the corresponding R-AA for muons from heavy-flavor meson decay in p + p and Cu + Cu collisions at root s(NN) = 200 GeV and y = 1.65. Method: Results are obtained using the semileptonic decay of heavy-flavor mesons into negative muons. The PHENIX muon-arm spectrometers measure the p(T) spectra of inclusive muon candidates. Backgrounds, primarily due to light hadrons, are determined with a Monte Carlo calculation using a set of input hadron distributions tuned to match measured hadron distributions in the same detector, and are statistically subtracted. Results: The charm-production cross section in p + p collisions at root s = 200 GeV, integrated over p(T) and in the rapidity range 1.4 < y < 1.9, is found to be d(sigma_cc-bar)/dy = 0.139 +/- 0.029 (stat)(-0.058)(+0.051) (syst) mb. This result is consistent with a perturbative fixed-order-plus-next-to-leading-log calculation within scale uncertainties and is also consistent with expectations based on the corresponding midrapidity charm-production cross section measured by PHENIX. The R-AA for heavy-flavor muons in Cu + Cu collisions is measured in three centrality bins for 1 < p(T) < 4 GeV/c. Suppression relative to binary-collision scaling (R-AA < 1) increases with centrality.
Conclusions: Within experimental and theoretical uncertainties, the measured charm yield in p + p collisions is consistent with state-of-the-art pQCD calculations. Suppression in central Cu + Cu collisions suggests the presence of significant cold-nuclear-matter effects and final-state energy loss.
Abstract:
Exergy analysis is applied to assess the energy conversion processes that take place in the human body, aiming at developing indicators of health and performance based on the concepts of exergy destruction rate and exergy efficiency. The thermal behavior of the human body is simulated by a model composed of 15 cylinders with elliptical cross section representing: head, neck, trunk, arms, forearms, hands, thighs, legs, and feet. For each, a combination of tissues is considered. The energy equation is solved for each cylinder, making it possible to obtain the transient response of the body to a variation in environmental conditions. With this model, it is possible to obtain heat and mass flow rates to the environment due to radiation, convection, evaporation and respiration. The exergy balances provide the exergy variation due to heat and mass exchange over the body, and the exergy variation over time for each compartment's tissue and blood, the sum of which leads to the total variation of the body. Results indicate that the destroyed exergy and the exergy efficiency decrease over the lifespan, and that the human body is more efficient and destroys less exergy at lower relative humidities and higher temperatures. (C) 2012 Elsevier Ltd. All rights reserved.
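One building block of such exergy balances is the exergy content of a heat flow, given by the standard Carnot-factor relation Ex = Q(1 - T0/T). The sketch below is a generic textbook expression, not the authors' 15-cylinder model; the function name and example temperatures are assumptions for illustration.

```python
def heat_exergy(q_watts, t_source_k, t_env_k):
    """Exergy rate (W) carried by a heat flow q_watts released at
    absolute temperature t_source_k into an environment at t_env_k,
    via the Carnot factor (1 - T0/T)."""
    return q_watts * (1.0 - t_env_k / t_source_k)
```

For example, 100 W of metabolic heat leaving skin at roughly 306 K into a 293 K environment carries only a few watts of exergy, which is why small environmental changes noticeably shift the body's exergy efficiency.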
Abstract:
The jet quenching parameter of an anisotropic plasma depends on the relative orientation between the anisotropic direction, the direction of motion of the parton, and the direction along which the momentum broadening is measured. We calculate the jet quenching parameter of an anisotropic, strongly coupled N = 4 plasma by means of its gravity dual. We present the results for arbitrary orientations and arbitrary values of the anisotropy. The anisotropic value can be larger or smaller than the isotropic one, and this depends on whether the comparison is made at equal temperatures or at equal entropy densities. We compare our results to analogous calculations for the real-world quark-gluon plasma and find agreement in some cases and disagreement in others.
Abstract:
We introduce a five-parameter continuous model, called the McDonald inverted beta distribution, to extend the two-parameter inverted beta distribution and provide new four- and three-parameter sub-models. We give a mathematical treatment of the new distribution including expansions for the density function, moments, generating and quantile functions, mean deviations, entropy and reliability. The model parameters are estimated by maximum likelihood and the observed information matrix is derived. An application of the new model to real data shows that it can consistently give a better fit than other important lifetime models. (C) 2012 The Franklin Institute. Published by Elsevier Ltd. All rights reserved.
Abstract:
For any continuous baseline distribution G, Cordeiro and de Castro [G. M. Cordeiro and M. de Castro, A new family of generalized distributions, J. Statist. Comput. Simul. 81 (2011), pp. 883-898] proposed a new generalized distribution (denoted here with the prefix 'Kw-G' (Kumaraswamy-G)) with two extra positive parameters. They studied some of its mathematical properties and presented special sub-models. We derive a simple representation for the Kw-G density function as a linear combination of exponentiated-G distributions. Some new distributions are proposed as sub-models of this family, for example, the Kw-Chen [Z.A. Chen, A new two-parameter lifetime distribution with bathtub shape or increasing failure rate function, Statist. Probab. Lett. 49 (2000), pp. 155-161], Kw-XTG [M. Xie, Y. Tang, and T.N. Goh, A modified Weibull extension with bathtub failure rate function, Reliab. Eng. System Safety 76 (2002), pp. 279-285] and Kw-Flexible Weibull [M. Bebbington, C. D. Lai, and R. Zitikis, A flexible Weibull extension, Reliab. Eng. System Safety 92 (2007), pp. 719-726]. New properties of the Kw-G distribution are derived, including asymptotes, shapes, moments, moment generating function, mean deviations, Bonferroni and Lorenz curves, reliability, Renyi entropy and Shannon entropy. New properties of the order statistics are investigated. We discuss the estimation of the parameters by maximum likelihood. We provide two applications to real data sets and discuss a bivariate extension of the Kw-G distribution.
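The Kw-G construction itself is compact: given a baseline cdf G with density g, the Kw-G cdf is F(x) = 1 - (1 - G(x)^a)^b and the density is f(x) = a*b*g(x)*G(x)^(a-1)*(1 - G(x)^a)^(b-1). A minimal sketch follows, with an exponential baseline chosen purely for illustration (the baseline and parameter values are assumptions, not from the paper's applications).

```python
import math

def kw_g_cdf(x, a, b, G):
    """Kumaraswamy-G cdf: F(x) = 1 - (1 - G(x)^a)^b."""
    return 1.0 - (1.0 - G(x) ** a) ** b

def kw_g_pdf(x, a, b, G, g):
    """Kumaraswamy-G pdf: f(x) = a*b*g(x)*G(x)^(a-1)*(1 - G(x)^a)^(b-1)."""
    Gx = G(x)
    return a * b * g(x) * Gx ** (a - 1) * (1.0 - Gx ** a) ** (b - 1)

# Illustrative baseline: standard exponential distribution (rate 1).
G = lambda x: 1.0 - math.exp(-x)
g = lambda x: math.exp(-x)
```

Any continuous baseline G can be plugged in the same way, which is exactly how sub-models such as Kw-Chen or Kw-Flexible Weibull arise.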
Abstract:
This article introduces generalized beta-generated (GBG) distributions. Sub-models include all classical beta-generated, Kumaraswamy-generated and exponentiated distributions. They are maximum entropy distributions under three intuitive conditions, which show that the classical beta generator skewness parameters only control tail entropy and an additional shape parameter is needed to add entropy to the centre of the parent distribution. This parameter controls skewness without necessarily differentiating tail weights. The GBG class also has tractable properties: we present various expansions for moments, generating function and quantiles. The model parameters are estimated by maximum likelihood and the usefulness of the new class is illustrated by means of some real data sets. (c) 2011 Elsevier B.V. All rights reserved.
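Assuming the standard GBG density f(x) = c/B(a,b) * g(x) * G(x)^(ac-1) * (1 - G(x)^c)^(b-1), the generator can be sketched as below. The exponential baseline is an illustrative assumption; setting a = b = 1 recovers the exponentiated-G sub-model mentioned above, whose cdf is G(x)^c.

```python
import math

def beta_fn(a, b):
    """Beta function B(a, b) = Gamma(a)Gamma(b)/Gamma(a+b)."""
    return math.gamma(a) * math.gamma(b) / math.gamma(a + b)

def gbg_pdf(x, a, b, c, G, g):
    """Generalized beta-generated density (assumed standard form):
    f(x) = c/B(a,b) * g(x) * G(x)^(a*c-1) * (1 - G(x)^c)^(b-1)."""
    Gx = G(x)
    return c / beta_fn(a, b) * g(x) * Gx ** (a * c - 1) * (1.0 - Gx ** c) ** (b - 1)

# Illustrative baseline: standard exponential distribution (rate 1).
G = lambda x: 1.0 - math.exp(-x)
g = lambda x: math.exp(-x)
```

The third shape parameter c is the one the abstract credits with adding entropy to the centre of the parent distribution, independently of the tail-controlling parameters a and b.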
Abstract:
Known as the "king of spices", black pepper (Piper nigrum), a perennial crop of the tropics, is economically the most important and the most widely used spice crop in the world. To understand its suitable bioclimatic distribution, maximum-entropy ecological niche modeling was used to model the bioclimatic niches of the species in its Asian range. Based on known occurrences, bioclimatic areas with higher probabilities are mainly located on the eastern and western coasts of the Indian Peninsula, the east of Sumatra Island, some areas in the Malay Archipelago, and the southeast coastal areas of China. Some undocumented places were also predicted to be suitable areas. According to the jackknife procedure, the minimum temperature of the coldest month, the mean monthly temperature range, and the precipitation of the wettest month were identified as highly effective factors in the distribution of black pepper and could possibly account for the crop's distribution pattern. Such climatic requirements have inhibited this species from dispersing and gaining a larger geographical range.
Abstract:
In past decades, most efforts to quantify the complexity of systems with a general tool have relied on Shannon's classical information framework, addressing the disorder of the system through the Boltzmann-Gibbs-Shannon entropy or one of its extensions. In recent years, however, there have been attempts to quantify algorithmic complexity in quantum systems based on the Kolmogorov algorithmic complexity, with results that diverge from the classical approach. We therefore propose a complexity measure built on the quantum information formalism, which retains the generality of the classical-based complexities and expresses the complexity of these systems in a framework other than the algorithmic one. To this end, the Shiner-Davison-Landsberg (SDL) complexity framework is combined with the linear entropy of the density operators representing the analyzed systems, along with the tangle as the entanglement measure. The proposed measure is then applied to a family of maximally entangled mixed states.
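The SDL ingredient can be made concrete: with disorder Delta defined as a normalized entropy, the SDL complexity is Gamma = Delta*(1 - Delta), vanishing for both pure order and pure disorder. The sketch below evaluates it with the normalized linear entropy on the two-qubit Werner family, used here only as a convenient stand-in for a family of entangled mixed states (the Werner states and the analytic purity formula are illustrative assumptions, not the paper's exact family or tangle-based measure).

```python
def werner_purity(p):
    """Purity Tr(rho^2) of the two-qubit Werner state
    rho = p*|Phi+><Phi+| + (1-p)*I/4, in closed form."""
    return p * p + p * (1.0 - p) / 2.0 + (1.0 - p) ** 2 / 4.0

def sdl_complexity(purity, dim=4):
    """SDL complexity Gamma = Delta*(1 - Delta), taking as disorder
    Delta the normalized linear entropy S_L = dim/(dim-1)*(1 - purity)."""
    delta = dim / (dim - 1) * (1.0 - purity)
    return delta * (1.0 - delta)
```

As expected for an SDL-type measure, the complexity is zero both for the pure state (p = 1, purity 1) and for the maximally mixed state (p = 0, purity 1/4), and peaks in between.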
Abstract:
Herein, we demonstrate the physical and chemical characterizations of the supramolecular complex formed between beta-cyclodextrin (beta CD) and bradykinin potentiating nonapeptide (BPP9a), an endogenous toxin found in Bothrops jararaca. Circular dichroism results indicate a conformational change in the BPP9a secondary structure upon its complexation with beta CD. Nuclear magnetic resonance results, mainly from NOESY experiments, and theoretical calculations showed a favorable interaction between the tryptophan residue of BPP9a and the beta CD cavity. Thermodynamic inclusion parameters were investigated by isothermal titration calorimetry, demonstrating that beta CD/BPP9a complex formation is an exothermic process that results in a reduction in entropy. Additionally, an in vitro degradation study of BPP9a against trypsin (37 degrees C, pH 7.2) showed higher stability of the peptide in the presence of beta CD. This beta CD/BPP9a complex, which presents new chemical properties arising from the peptide inclusion process, may be useful as an antihypertensive drug in oral pharmaceutical formulations. (C) 2011 Elsevier B.V. All rights reserved.
Abstract:
This paper compares the effectiveness of the Tsallis entropy over the classic Boltzmann-Gibbs-Shannon entropy for general pattern recognition, and proposes a multi-q approach to improve pattern analysis using entropy. A series of experiments were carried out for the problem of classifying image patterns. Given a dataset of 40 pattern classes, the goal of our image case study is to assess how well the different entropies can be used to determine the class of a newly given image sample. Our experiments show that the Tsallis entropy using the proposed multi-q approach has great advantages over the Boltzmann-Gibbs-Shannon entropy for pattern classification, boosting image recognition rates by a factor of 3. We discuss the reasons behind this success, shedding light on the usefulness of the Tsallis entropy and the multi-q approach. (C) 2012 Elsevier B.V. All rights reserved.
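The multi-q idea can be made concrete: compute the Tsallis entropy S_q = (1 - sum_i p_i^q)/(q - 1) of a pattern's probability distribution for several values of q, and use the resulting vector as a feature signature. A hedged sketch follows; the q grid and function names are assumptions, not the paper's configuration.

```python
import math

def tsallis_entropy(p, q):
    """Tsallis entropy S_q = (1 - sum_i p_i^q) / (q - 1) of a
    probability distribution p; the q -> 1 limit recovers the
    Boltzmann-Gibbs-Shannon entropy -sum_i p_i * ln(p_i)."""
    if abs(q - 1.0) < 1e-12:
        return -sum(pi * math.log(pi) for pi in p if pi > 0)
    return (1.0 - sum(pi ** q for pi in p)) / (q - 1.0)

def multi_q_signature(p, qs=(0.5, 1.0, 1.5, 2.0)):
    """Multi-q feature vector: one Tsallis entropy per q value."""
    return [tsallis_entropy(p, q) for q in qs]
```

Because each q weights the distribution differently (small q emphasizes rare events, large q emphasizes dominant ones), the vector captures more structure than any single entropy value, which is the intuition behind the reported classification gains.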
Abstract:
The amount of information exchanged per unit of time between two nodes in a dynamical network or between two data sets is a powerful concept for analysing complex systems. This quantity, known as the mutual information rate (MIR), is calculated from the mutual information, which is rigorously defined only for random systems. Moreover, the definition of mutual information is based on probabilities of significant events. This work offers a simple alternative way to calculate the MIR in dynamical (deterministic) networks or between two time series (not fully deterministic), and to calculate its upper and lower bounds without having to calculate probabilities, but rather in terms of well-known and well-defined quantities in dynamical systems. As possible applications of our bounds, we study the relationship between synchronisation and the exchange of information in a system of two coupled maps and in experimental networks of coupled oscillators.
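For contrast with the probability-free bounds proposed here, the conventional probability-based route the authors avoid can be sketched: estimate the mutual information from the joint histogram of two symbol sequences, then divide by the observation time to obtain a naive rate. The plug-in estimator below is a standard textbook construction, not the paper's method, and all names are illustrative.

```python
import math
from collections import Counter

def mutual_information(x, y):
    """Plug-in estimate (in bits) of the mutual information between two
    equal-length symbol sequences, using their empirical histograms:
    I = sum_ab p(a,b) * log2( p(a,b) / (p(a) p(b)) )."""
    n = len(x)
    px, py, pxy = Counter(x), Counter(y), Counter(zip(x, y))
    mi = 0.0
    for (a, b), c in pxy.items():
        p_ab = c / n
        # p_ab / (p_a * p_b) simplifies to c*n / (count_a * count_b).
        mi += p_ab * math.log2(p_ab * n * n / (px[a] * py[b]))
    return mi
```

Such estimates require binning continuous data into symbols and suffer finite-sample bias, which is precisely the kind of difficulty that motivates bounds expressed through dynamical-systems quantities instead of probabilities.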
Abstract:
Financial markets can be viewed as a highly complex evolving system that is very sensitive to economic instabilities. The complex organization of the market can be represented in a suitable fashion in terms of complex networks, which can be constructed from stock prices such that each pair of stocks is connected by a weighted edge that encodes the distance between them. In this work, we propose an approach to analyze the topological and dynamic evolution of financial networks based on the stock correlation matrices. An entropy-related measurement is adopted to quantify the robustness of the evolving financial market organization. It is verified that the network topological organization undergoes strong variation during financial instabilities, and the networks in such periods become less robust. A robust statistical regression model is proposed to quantify the relationship between the network structure and resilience. The coefficients of this model indicate that the average shortest path length is the measurement most closely related to the network resilience coefficient. This result indicates that a collective behavior is observed between stocks during financial crises. More specifically, stocks tend to synchronize their price evolution, leading to a high correlation between pairs of stock prices, which contributes to the increase in distance between them and, consequently, to a decrease in network resilience. (C) 2012 American Institute of Physics. [doi:10.1063/1.3683467]
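The network construction described above is commonly done with the standard correlation-to-distance mapping d = sqrt(2(1 - rho)), which sends perfectly correlated stock pairs to distance 0 and anti-correlated pairs to distance 2. A minimal illustration in pure Python (the series and function names are hypothetical; the paper's entropy-related robustness measure is a separate step built on top of such a network):

```python
import math

def correlation(xs, ys):
    """Pearson correlation coefficient of two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(xs, ys))
    vx = math.sqrt(sum((a - mx) ** 2 for a in xs))
    vy = math.sqrt(sum((b - my) ** 2 for b in ys))
    return cov / (vx * vy)

def correlation_distance(xs, ys):
    """Edge weight d = sqrt(2 * (1 - rho)): 0 for rho = 1, 2 for rho = -1."""
    rho = correlation(xs, ys)
    # Guard against rounding pushing rho marginally past 1.
    return math.sqrt(max(0.0, 2.0 * (1.0 - rho)))
```

When a crisis synchronizes price movements, pairwise correlations rise toward 1 within subsets of the market while the overall network geometry reorganizes, which is what the robustness analysis above tracks over time.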
Abstract:
We explore the meaning of information about quantities of interest. Our approach is divided into two scenarios: the analysis of observations and the planning of an experiment. First, we review the Sufficiency, Conditionality and Likelihood principles and how they relate to trivial experiments. Next, we review Blackwell Sufficiency and show that sampling without replacement is Blackwell Sufficient for sampling with replacement. Finally, we unify the two scenarios by presenting an extension of the relationship between Blackwell Equivalence and the Likelihood Principle.