924 results for Renyi’s entropy


Relevance: 10.00%

Abstract:

Assessment of the suitability of anthropogenic landscapes for wildlife species is crucial for setting priorities for biodiversity conservation. This study aimed to analyse the environmental suitability of a highly fragmented region of the Brazilian Atlantic Forest, one of the world's 25 recognized biodiversity hotspots, for forest bird species. Eight forest bird species were selected for the analyses, based on point counts (n = 122) conducted in April-September 2006 and January-March 2009. Six additional variables (landscape diversity, distance from forest, distance from streams, aspect, elevation and slope) were modelled in Maxent for (1) the actual and (2) a simulated land cover, the latter based on the forest expansion required by existing Brazilian forest legislation. Models were evaluated by bootstrap or jackknife methods and their performance was assessed by AUC, omission error, binomial probability or p value. All predictive models were statistically significant, with high AUC values and low omission errors. A small proportion of the actual landscape (24.41 ± 6.31%) was suitable for forest bird species. The simulated landscapes led to an increase of c. 30% in total suitable area. On average, models predicted a small increase (23.69 ± 6.95%) in the area of suitable native forest for bird species. Being close to forest increased the environmental suitability of landscapes for all bird species; landscape diversity was also a significant factor for some species. In conclusion, this study demonstrates that species distribution modelling (SDM) successfully predicted bird distribution across a heterogeneous landscape at fine spatial resolution, as all models were biologically relevant and statistically significant. The use of landscape variables as predictors contributed significantly to the results, particularly for species distributions over small extents and at fine scales. This is the first study to evaluate the environmental suitability of the remaining Brazilian Atlantic Forest for bird species in an agricultural landscape, and it provides important additional data for regional environmental planning.
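
The model evaluation described here rests on AUC and omission error. Below is a minimal sketch of both metrics for presence/background suitability scores; the scores and the 10th-percentile threshold are illustrative, not the study's data.

```python
import numpy as np

def auc(presence_scores, background_scores):
    """Rank-based AUC: probability that a presence point scores higher
    than a background point (equivalent to the Mann-Whitney U statistic)."""
    p = np.asarray(presence_scores, dtype=float)
    b = np.asarray(background_scores, dtype=float)
    greater = (p[:, None] > b[None, :]).sum()   # pairwise comparisons
    ties = (p[:, None] == b[None, :]).sum()
    return (greater + 0.5 * ties) / (p.size * b.size)

def omission_error(presence_scores, threshold):
    """Fraction of known presences predicted as unsuitable at a given threshold."""
    p = np.asarray(presence_scores, dtype=float)
    return float((p < threshold).mean())

rng = np.random.default_rng(0)
presences = rng.beta(5, 2, size=122)     # hypothetical suitability at the point counts
background = rng.beta(2, 5, size=1000)   # hypothetical suitability at random cells
thr = np.percentile(presences, 10)       # e.g. a 10th-percentile training threshold
print(f"AUC = {auc(presences, background):.3f}, omission = {omission_error(presences, thr):.3f}")
```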

Relevance: 10.00%

Abstract:

The brain's structural and functional systems, protein-protein interaction, and gene networks are examples of biological systems that share some features of complex networks, such as highly connected nodes, modularity, and small-world topology. Recent studies indicate that some pathologies present topological network alterations relative to norms seen in the general population. Therefore, methods to discriminate the processes that generate the different classes of networks (e.g., normal and disease) might be crucial for the diagnosis, prognosis, and treatment of the disease. It is known that several topological properties of a network (graph) can be described by the distribution of the spectrum of its adjacency matrix. Moreover, large networks generated by the same random process have the same spectrum distribution, allowing us to use it as a "fingerprint". Based on this relationship, we introduce the entropy of a graph spectrum to measure the "uncertainty" of a random graph and the Kullback-Leibler and Jensen-Shannon divergences between graph spectra to compare networks. We also introduce general methods for model selection and network model parameter estimation, as well as a statistical procedure to test the nullity of divergence between two classes of complex networks. Finally, we demonstrate the usefulness of the proposed methods by applying them to (1) protein-protein interaction networks of different species and (2) networks derived from children diagnosed with Attention Deficit Hyperactivity Disorder (ADHD) and typically developing children. We conclude that scale-free networks best describe all the protein-protein interaction networks. We also show that the proposed measures succeeded in identifying topological changes in the networks where other commonly used measures (number of edges, clustering coefficient, average path length) failed.
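
A minimal sketch of the spectral comparison described here (not the paper's implementation): estimate each graph's adjacency-spectrum density, compute its Shannon entropy, and compare two graphs through the Jensen-Shannon divergence of their spectral densities. Graph sizes, the eigenvalue normalization and the bin grid are illustrative choices.

```python
import numpy as np
import networkx as nx

def spectral_density(graph, bins):
    """Histogram estimate of the adjacency-spectrum density of a graph."""
    eig = np.linalg.eigvalsh(nx.to_numpy_array(graph))
    # Normalize eigenvalues by sqrt(n) so graphs of different sizes share a support.
    hist, _ = np.histogram(eig / np.sqrt(graph.number_of_nodes()), bins=bins)
    p = hist.astype(float)
    return p / p.sum()

def shannon_entropy(p):
    p = p[p > 0]
    return float(-(p * np.log(p)).sum())

def js_divergence(p, q):
    m = 0.5 * (p + q)
    def kl(a, b):
        mask = a > 0
        return (a[mask] * np.log(a[mask] / b[mask])).sum()
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

bins = np.linspace(-2, 2, 81)                    # common support for both spectra
g1 = nx.barabasi_albert_graph(300, 3, seed=1)    # scale-free "model"
g2 = nx.gnp_random_graph(300, 6 / 299, seed=1)   # Erdos-Renyi "model"
p, q = spectral_density(g1, bins), spectral_density(g2, bins)
print(f"spectral entropies: {shannon_entropy(p):.3f}, {shannon_entropy(q):.3f}")
print(f"JS divergence between spectra: {js_divergence(p, q):.4f}")
```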

Relevance: 10.00%

Abstract:

The kinetics of the homogeneous acylation of microcrystalline cellulose, MCC, with carboxylic acid anhydrides of different acyl chain length (Nc; ethanoic to hexanoic) in LiCl/N,N-dimethylacetamide have been studied by conductivity measurements from 65 to 85 °C. We employed cyclohexylmethanol, CHM, and trans-1,2-cyclohexanediol, CHD, as model compounds for the hydroxyl groups of the anhydroglucose unit of cellulose. The ratios of the rate constants of acylation of primary (CHM; Prim-OH) and secondary (CHD; Sec-OH) groups were employed, after correction, to split the overall rate constants of the reaction of MCC into contributions from the discrete OH groups. For the model compounds, we found that k(Prim-OH)/k(Sec-OH) > 1, akin to reactions of cellulose under heterogeneous conditions; this ratio increases with increasing Nc. The overall and partial rate constants of the acylation of MCC decrease from ethanoic to butanoic anhydride and then increase for pentanoic and hexanoic anhydride, owing to subtle changes in, and compensations of, the enthalpy and entropy of activation.
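
As a rough illustration of the rate-constant partition described here, suppose the overall acylation rate constant is the additive contribution of one primary OH (C6) and two secondary OH (C2, C3) groups per anhydroglucose unit; the model-compound ratio k(Prim-OH)/k(Sec-OH) then splits it. The additivity assumption and the numbers are illustrative, not the paper's corrected procedure.

```python
def split_rate_constant(k_overall, r_prim_over_sec):
    """Return (k_primary, k_secondary) assuming k_overall = k_prim + 2 * k_sec."""
    k_sec = k_overall / (r_prim_over_sec + 2.0)
    return r_prim_over_sec * k_sec, k_sec

k_overall = 3.0e-3   # hypothetical overall rate constant, L mol^-1 s^-1
r = 1.8              # hypothetical CHM/CHD ratio, > 1 and growing with Nc
k_prim, k_sec = split_rate_constant(k_overall, r)
print(f"k_prim = {k_prim:.2e}, k_sec = {k_sec:.2e} L mol^-1 s^-1")
```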

Relevance: 10.00%

Abstract:

We calculate the relic abundance of mixed axion/neutralino cold dark matter, which arises in R-parity-conserving supersymmetric (SUSY) models wherein the strong CP problem is solved by the Peccei-Quinn (PQ) mechanism with a concomitant axion/saxion/axino supermultiplet. By numerically solving the coupled Boltzmann equations, we include the combined effects of (1) thermal axino production with cascade decays to a neutralino LSP, (2) thermal saxion production and production via coherent oscillations, along with cascade decays and entropy injection, (3) thermal neutralino production and re-annihilation after both axino and saxion decays, (4) gravitino production and decay, and (5) axion production both thermally and via oscillations. For SUSY models with too high a standard neutralino thermal abundance, we find that the combined effect of the SUSY PQ particles is not enough to lower the neutralino abundance to its measured value while at the same time respecting bounds on late-decaying neutral particles from BBN. However, models with a standard neutralino underabundance can now be allowed, with either neutralino or axion domination of the dark matter, and furthermore these models allow the PQ breaking scale f_a to be pushed up into the 10^14-10^15 GeV range, which is where it is typically expected to be in string theory models.
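
For orientation, the sketch below integrates a single toy freeze-out Boltzmann equation, not the paper's coupled axion/saxion/axino/neutralino/gravitino system; the annihilation strength and equilibrium prefactor are illustrative.

```python
import numpy as np
from scipy.integrate import solve_ivp

lam = 1.0e9   # dimensionless annihilation strength (illustrative)

def y_eq(x):
    # Equilibrium yield Y_eq(x) of a non-relativistic species, x = m/T.
    return 0.145 * x**1.5 * np.exp(-x)

def boltzmann(x, y):
    # dY/dx = -(lam/x^2) * (Y^2 - Y_eq^2): annihilations drive Y toward equilibrium;
    # the 1/x^2 factor (Hubble dilution) eventually wins and the yield freezes out.
    return [-(lam / x**2) * (y[0]**2 - y_eq(x)**2)]

sol = solve_ivp(boltzmann, (1.0, 1000.0), [y_eq(1.0)],
                method="LSODA", rtol=1e-6, atol=1e-12)
print(f"frozen-out yield Y(x = 1000) ~ {sol.y[0, -1]:.2e}")
```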

Relevance: 10.00%

Abstract:

(Isothermal seed germination of Adenanthera pavonina). This work reports aspects of the seed germination, at different temperatures, of Adenanthera pavonina L., a woody Southeast Asian Leguminosae. Germination was studied by measuring the final percentages, the rate, the rate variance and the synchronisation of the individual seeds, the latter calculated as the informational entropy of the frequency distribution of seed germination. Overlapping the germinability range with the ranges giving the highest germination rates and the minimal informational entropy of the frequency distribution, we found that the best temperature for the germination of A. pavonina seeds is 35 °C. The slope μ of the Arrhenius plot of the germination rates is positive for T < 35 °C and negative for T > 35 °C. The activation enthalpies, estimated from closely spaced points, show that |ΔH| < 12 Cal mol^-1 for temperatures between 25 °C and 40 °C. The ecological implication of these results is that this species may germinate very fast in tropical areas during the summer season, which may be an advantage for the establishment of this species under the climatic conditions of those areas.
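
A minimal sketch of the synchronization index used here: the informational (Shannon) entropy of the relative frequencies of germination per observation interval, with lower entropy meaning more synchronous germination. The daily counts are made up for illustration.

```python
import numpy as np

def germination_entropy(counts, base=2.0):
    """Shannon entropy (bits by default) of germination counts per time interval."""
    counts = np.asarray(counts, dtype=float)
    f = counts[counts > 0] / counts.sum()   # relative frequencies f_i
    return float(-(f * np.log(f)).sum() / np.log(base))

synchronous = [0, 1, 18, 3, 0, 0]   # hypothetical counts: most seeds germinate on day 3
spread_out  = [3, 4, 4, 4, 4, 3]    # hypothetical counts: germination spread over days
print(f"E(synchronous) = {germination_entropy(synchronous):.3f} bits")
print(f"E(spread out)  = {germination_entropy(spread_out):.3f} bits")
```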

Relevance: 10.00%

Abstract:

The realization that statistical physics methods can be applied to analyze written texts represented as complex networks has led to several developments in natural language processing, including automatic summarization and evaluation of machine translation. So far, however, only a few metrics of complex networks have been used, so there is ample opportunity to enhance the statistics-based methods as new measures of network topology and dynamics are created. In this paper, we employ for the first time the metrics betweenness, vulnerability and diversity to analyze written texts in Brazilian Portuguese. Using strategies based on diversity metrics, a better performance in automatic summarization is achieved in comparison with previous work employing complex networks. With an optimized method, the Rouge score (an automatic evaluation method used in summarization) was 0.5089, the best value yet achieved for an extractive summarizer with statistical methods based on complex networks for Brazilian Portuguese. Furthermore, the diversity metric can detect keywords with high precision, which is why we believe it is suitable for producing good summaries. It is also shown that incorporating linguistic knowledge through a syntactic parser does enhance the performance of the automatic summarizers, as expected, but the increase in the Rouge score is only minor. These results reinforce the suitability of complex network methods for improving automatic summarizers in particular and for treating text in general.
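
A minimal sketch of the complex-network route to extractive summarization (not the authors' system): sentences become nodes, sentences sharing words are connected, and a centrality metric ranks the sentences. Betweenness stands in here for the paper's metrics; the toy document is illustrative.

```python
import networkx as nx

def summarize(sentences, n_keep=2):
    graph = nx.Graph()
    graph.add_nodes_from(range(len(sentences)))
    words = [set(s.lower().replace(".", "").split()) for s in sentences]
    for i in range(len(sentences)):
        for j in range(i + 1, len(sentences)):
            if words[i] & words[j]:          # connect sentences that share words
                graph.add_edge(i, j)
    rank = nx.betweenness_centrality(graph)  # rank sentences by centrality
    keep = sorted(sorted(rank, key=rank.get, reverse=True)[:n_keep])
    return [sentences[i] for i in keep]      # keep top sentences in document order

doc = [
    "Complex networks can model written texts.",
    "Sentences become nodes and shared words become edges.",
    "Centrality metrics such as betweenness rank the sentences.",
    "The highest ranked sentences form the extractive summary.",
]
print("\n".join(summarize(doc)))
```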

Relevance: 10.00%

Abstract:

We present the results of an operational use of experimentally measured optical tomograms to determine state characteristics (purity) while avoiding any reconstruction of quasiprobabilities. We also develop a natural way to estimate the errors (both statistical and systematic) from an analysis of the experimental data themselves. The precision of the experiment can be increased by postselecting the data with minimal (systematic) errors. We demonstrate these techniques by considering coherent and photon-added coherent states measured via time-domain improved homodyne detection. The operational use and precision of the data allowed us to check purity-dependent uncertainty relations and uncertainty relations for the Shannon and Renyi entropies.
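
As a toy illustration of the entropic checks mentioned here (not the experimental analysis), the sketch below estimates Shannon and Renyi differential entropies of two conjugate quadratures from sampled data and compares their sum with ln(pi*e), which a coherent state saturates. The Gaussian samples, hbar = 1 and quadrature variance 1/2 are assumed conventions.

```python
import numpy as np

def differential_entropy(samples, alpha=1.0, bins=200):
    """Histogram estimate of the Shannon (alpha=1) or Renyi-alpha differential entropy."""
    hist, edges = np.histogram(samples, bins=bins, density=True)
    dx = edges[1] - edges[0]
    p = hist[hist > 0] * dx                    # probability mass per bin
    if np.isclose(alpha, 1.0):
        return float(-(p * np.log(p / dx)).sum())
    return float(np.log(((p / dx) ** alpha * dx).sum()) / (1.0 - alpha))

rng = np.random.default_rng(7)
x = rng.normal(0.0, np.sqrt(0.5), 200_000)     # x-quadrature of a coherent state
p = rng.normal(0.0, np.sqrt(0.5), 200_000)     # conjugate p-quadrature
h_sum = differential_entropy(x) + differential_entropy(p)
print(f"H_x + H_p = {h_sum:.3f}  vs  ln(pi*e) = {np.log(np.pi * np.e):.3f}")
print(f"Renyi-2 entropy of x: {differential_entropy(x, alpha=2):.3f}")
```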

Relevance: 10.00%

Abstract:

Background: Heavy-flavor production in p+p collisions is a good test of perturbative-quantum-chromodynamics (pQCD) calculations. Modification of heavy-flavor production in heavy-ion collisions relative to binary-collision scaling from p+p results, quantified with the nuclear-modification factor R_AA, provides information on both cold- and hot-nuclear-matter effects. Midrapidity heavy-flavor R_AA measurements at the Relativistic Heavy Ion Collider have challenged parton-energy-loss models and resulted in upper limits on the viscosity-to-entropy ratio that are near the quantum lower bound. Such measurements have not been made in the forward-rapidity region. Purpose: Determine transverse-momentum (p_T) spectra and the corresponding R_AA for muons from heavy-flavor meson decay in p+p and Cu+Cu collisions at √s_NN = 200 GeV and y = 1.65. Method: Results are obtained using the semileptonic decay of heavy-flavor mesons into negative muons. The PHENIX muon-arm spectrometers measure the p_T spectra of inclusive muon candidates. Backgrounds, primarily due to light hadrons, are determined with a Monte Carlo calculation using a set of input hadron distributions tuned to match measured hadron distributions in the same detector, and are statistically subtracted. Results: The charm-production cross section in p+p collisions at √s = 200 GeV, integrated over p_T and in the rapidity range 1.4 < y < 1.9, is found to be dσ_cc̄/dy = 0.139 ± 0.029 (stat) +0.051/-0.058 (syst) mb. This result is consistent with a perturbative fixed-order-plus-next-to-leading-log calculation within scale uncertainties and is also consistent with expectations based on the corresponding midrapidity charm-production cross section measured by PHENIX. The R_AA for heavy-flavor muons in Cu+Cu collisions is measured in three centrality bins for 1 < p_T < 4 GeV/c. Suppression relative to binary-collision scaling (R_AA < 1) increases with centrality. Conclusions: Within experimental and theoretical uncertainties, the measured charm yield in p+p collisions is consistent with state-of-the-art pQCD calculations. Suppression in central Cu+Cu collisions suggests the presence of significant cold-nuclear-matter effects and final-state energy loss.
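
The central quantity here is the nuclear-modification factor, R_AA(p_T) = (dN_AA/dp_T) / (⟨N_coll⟩ dN_pp/dp_T). A minimal sketch of how it is formed from measured yields follows; the yields and the Glauber ⟨N_coll⟩ below are illustrative numbers, not PHENIX data.

```python
import numpy as np

def r_aa(yield_aa, yield_pp, n_coll):
    """Per-p_T-bin nuclear-modification factor R_AA."""
    return np.asarray(yield_aa, dtype=float) / (n_coll * np.asarray(yield_pp, dtype=float))

pt_bins = np.array([1.5, 2.5, 3.5])            # GeV/c bin centers (illustrative)
dn_pp = np.array([2.0e-3, 4.0e-4, 9.0e-5])     # hypothetical p+p invariant yields
dn_cucu = np.array([1.4e-1, 2.2e-2, 4.0e-3])   # hypothetical central Cu+Cu yields
n_coll = 100.0                                 # hypothetical Glauber <N_coll>
for pt, r in zip(pt_bins, r_aa(dn_cucu, dn_pp, n_coll)):
    print(f"p_T = {pt:.1f} GeV/c : R_AA = {r:.2f}")
```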

Relevance: 10.00%

Abstract:

Exergy analysis is applied to assess the energy conversion processes that take place in the human body, aiming at developing indicators of health and performance based on the concepts of exergy destruction rate and exergy efficiency. The thermal behavior of the human body is simulated by a model composed of 15 cylinders with elliptical cross sections representing the head, neck, trunk, arms, forearms, hands, thighs, legs and feet; for each, a combination of tissues is considered. The energy equation is solved for each cylinder, making it possible to obtain the transitory response of the body to a variation in environmental conditions. With this model it is possible to obtain the heat and mass flow rates to the environment due to radiation, convection, evaporation and respiration. The exergy balances provide the exergy variation due to heat and mass exchange over the body, and the exergy variation over time for the tissue and blood of each compartment, the sum of which gives the total variation for the body. Results indicate that the exergy destroyed and the exergy efficiency decrease over the lifespan, and that the human body is more efficient and destroys less exergy at lower relative humidities and higher temperatures.
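
A hedged sketch of the kind of steady-state exergy balance the paper builds per body segment, here for a single lumped body with a textbook formulation rather than the authors' 15-cylinder model; all numbers are illustrative.

```python
# The exergy carried by a heat flow Q leaving the skin at temperature T is
# Q * (1 - T0/T); at steady state, whatever part of the metabolic exergy input
# is not transferred to the environment is destroyed.
T0 = 293.15            # K, environment temperature (illustrative)
T_skin = 307.15        # K, mean skin temperature (illustrative)
B_metabolism = 100.0   # W, exergy rate of metabolism (illustrative, near basal rate)
Q_loss = 95.0          # W, heat lost by convection + radiation + evaporation (illustrative)

B_to_environment = Q_loss * (1.0 - T0 / T_skin)   # exergy leaving with the heat flow
B_destroyed = B_metabolism - B_to_environment     # steady state: no accumulation term
eta_exergy = B_to_environment / B_metabolism      # one possible exergy-efficiency definition
print(f"exergy destroyed = {B_destroyed:.1f} W, exergy efficiency = {eta_exergy:.3f}")
```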

Relevance: 10.00%

Abstract:

The jet quenching parameter of an anisotropic plasma depends on the relative orientation between the anisotropic direction, the direction of motion of the parton, and the direction along which the momentum broadening is measured. We calculate the jet quenching parameter of an anisotropic, strongly coupled N = 4 plasma by means of its gravity dual. We present the results for arbitrary orientations and arbitrary values of the anisotropy. The anisotropic value can be larger or smaller than the isotropic one, and this depends on whether the comparison is made at equal temperatures or at equal entropy densities. We compare our results to analogous calculations for the real-world quark-gluon plasma and find agreement in some cases and disagreement in others.

Relevance: 10.00%

Abstract:

We introduce a five-parameter continuous model, called the McDonald inverted beta distribution, to extend the two-parameter inverted beta distribution and provide new four- and three-parameter sub-models. We give a mathematical treatment of the new distribution, including expansions for the density function, moments, generating and quantile functions, mean deviations, entropy and reliability. The model parameters are estimated by maximum likelihood and the observed information matrix is derived. An application of the new model to real data shows that it can consistently give a better fit than other important lifetime models.
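
A sketch of the general fitting workflow described here, applied to the two-parameter inverted beta (beta-prime) baseline that the McDonald inverted beta extends; the five-parameter density itself is not reproduced, and the simulated data are illustrative.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
data = stats.betaprime.rvs(3.0, 5.0, size=500, random_state=rng)   # stand-in "real data"

# Maximum-likelihood fit of the baseline inverted beta (location/scale fixed).
a_hat, b_hat, loc, scale = stats.betaprime.fit(data, floc=0, fscale=1)
loglik = stats.betaprime.logpdf(data, a_hat, b_hat, loc, scale).sum()
aic = 2 * 2 - 2 * loglik   # two free parameters
print(f"alpha = {a_hat:.2f}, beta = {b_hat:.2f}, log-likelihood = {loglik:.1f}, AIC = {aic:.1f}")
```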

Relevance: 10.00%

Abstract:

For any continuous baseline distribution G, Cordeiro and de Castro [A new family of generalized distributions, J. Statist. Comput. Simul. 81 (2011), pp. 883-898] proposed a new generalized distribution (denoted here with the prefix 'Kw-G' (Kumaraswamy-G)) with two extra positive parameters. They studied some of its mathematical properties and presented special sub-models. We derive a simple representation for the Kw-G density function as a linear combination of exponentiated-G distributions. Some new distributions are proposed as sub-models of this family, for example the Kw-Chen [Z.A. Chen, A new two-parameter lifetime distribution with bathtub shape or increasing failure rate function, Statist. Probab. Lett. 49 (2000), pp. 155-161], Kw-XTG [M. Xie, Y. Tang, and T.N. Goh, A modified Weibull extension with bathtub failure rate function, Reliab. Eng. System Safety 76 (2002), pp. 279-285] and Kw-Flexible Weibull [M. Bebbington, C.D. Lai, and R. Zitikis, A flexible Weibull extension, Reliab. Eng. System Safety 92 (2007), pp. 719-726]. New properties of the Kw-G distribution are derived, including asymptotes, shapes, moments, the moment generating function, mean deviations, Bonferroni and Lorenz curves, reliability, Renyi entropy and Shannon entropy. New properties of the order statistics are investigated. We discuss the estimation of the parameters by maximum likelihood. We provide two applications to real data sets and discuss a bivariate extension of the Kw-G distribution.
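
A minimal sketch of the Kw-G construction: for a baseline cdf G with density g, F(x) = 1 - [1 - G(x)^a]^b and f(x) = a b g(x) G(x)^(a-1) [1 - G(x)^a]^(b-1). The Weibull baseline and parameter values below are purely illustrative.

```python
import numpy as np
from scipy import stats

def kw_g_pdf(x, a, b, baseline):
    G, g = baseline.cdf(x), baseline.pdf(x)
    return a * b * g * G**(a - 1) * (1.0 - G**a)**(b - 1)

def kw_g_cdf(x, a, b, baseline):
    return 1.0 - (1.0 - baseline.cdf(x)**a)**b

baseline = stats.weibull_min(c=1.5)   # baseline G: Weibull with shape 1.5
x = np.linspace(0.01, 4.0, 5)
print("Kw-Weibull pdf:", np.round(kw_g_pdf(x, a=2.0, b=3.0, baseline=baseline), 4))
print("Kw-Weibull cdf:", np.round(kw_g_cdf(x, a=2.0, b=3.0, baseline=baseline), 4))
```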

Relevance: 10.00%

Abstract:

This article introduces generalized beta-generated (GBG) distributions. Sub-models include all classical beta-generated, Kumaraswamy-generated and exponentiated distributions. They are maximum entropy distributions under three intuitive conditions, which show that the classical beta generator skewness parameters only control tail entropy and an additional shape parameter is needed to add entropy to the centre of the parent distribution. This parameter controls skewness without necessarily differentiating tail weights. The GBG class also has tractable properties: we present various expansions for moments, generating function and quantiles. The model parameters are estimated by maximum likelihood and the usefulness of the new class is illustrated by means of some real data sets.
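
A hedged sketch of the GBG construction as it is commonly written (the exact form is an assumption recalled from the GBG literature, not stated in this abstract): F(x) = I_{G(x)^c}(a, b), the regularized incomplete beta function evaluated at G(x)^c, so that c = 1 recovers the beta-generated family and a = 1 the Kumaraswamy-generated family, consistent with the sub-models listed above. The normal baseline and parameter values are illustrative.

```python
import numpy as np
from scipy import stats, special

def gbg_cdf(x, a, b, c, baseline):
    # Regularized incomplete beta function evaluated at G(x)^c.
    return special.betainc(a, b, baseline.cdf(x)**c)

baseline = stats.norm()   # baseline G: standard normal
x = np.linspace(-2, 2, 5)
print("GBG cdf       :", np.round(gbg_cdf(x, a=2.0, b=0.5, c=1.5, baseline=baseline), 4))
print("beta-generated:", np.round(gbg_cdf(x, a=2.0, b=0.5, c=1.0, baseline=baseline), 4))
```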

Relevance: 10.00%

Abstract:

Known as the "king of spices", black pepper (Piper nigrum), a perennial crop of the tropics, is economically the most important and the most widely used spice crop in the world. To understand its suitable bioclimatic distribution, maximum-entropy-based ecological niche modeling was used to model the bioclimatic niches of the species in its Asian range. Based on known occurrences, bioclimatic areas with higher probabilities are mainly located on the eastern and western coasts of the Indian Peninsula, in the east of Sumatra Island, in some areas of the Malay Archipelago, and in the southeast coastal areas of China. Some undocumented places were also predicted as suitable areas. According to the jackknife procedure, the minimum temperature of the coldest month, the mean monthly temperature range, and the precipitation of the wettest month were identified as highly effective factors in the distribution of black pepper and could possibly account for the crop's distribution pattern. Such climatic requirements inhibited this species from dispersing and gaining a larger geographical range.
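
A hedged sketch of the jackknife variable-importance idea reported here: refit the niche model leaving out one bioclimatic variable at a time and compare the drop in (in-sample) AUC. A logistic regression stands in for Maxent, and the synthetic presence/background data and variable names are illustrative.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
names = ["min_temp_coldest_month", "mean_monthly_temp_range", "precip_wettest_month"]
X = rng.normal(size=(600, 3))
logit = 2.0 * X[:, 0] - 1.5 * X[:, 1] + 0.5 * X[:, 2]   # synthetic "true" response
y = rng.random(600) < 1.0 / (1.0 + np.exp(-logit))      # presence (True) / background (False)

full_auc = roc_auc_score(y, LogisticRegression().fit(X, y).predict_proba(X)[:, 1])
for i, name in enumerate(names):
    X_drop = np.delete(X, i, axis=1)                    # jackknife: drop one variable
    auc = roc_auc_score(y, LogisticRegression().fit(X_drop, y).predict_proba(X_drop)[:, 1])
    print(f"without {name:<26s} AUC drops by {full_auc - auc:.3f}")
```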

Relevance: 10.00%

Abstract:

In past decades, efforts to quantify the complexity of systems with a general tool have usually relied on Shannon's classical information framework, addressing the disorder of the system through the Boltzmann-Gibbs-Shannon entropy or one of its extensions. In recent years, however, there have been attempts to quantify algorithmic complexities in quantum systems based on the Kolmogorov algorithmic complexity, which obtained results discrepant with the classical approach. Therefore, a complexity measure is proposed here using the quantum-information formalism, taking advantage of the generality of the classical-based complexities and expressing the complexity of these systems in a framework other than its algorithmic counterparts. To do so, the Shiner-Davison-Landsberg (SDL) complexity framework is considered jointly with the linear entropy of the density operators representing the analyzed systems, along with the tangle as the entanglement measure. The proposed measure is then applied to a family of maximally entangled mixed states.
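
A hedged sketch of the ingredients named here, computed for a two-qubit Werner state as a simple stand-in for the maximally-entangled-mixed-state family: the normalized linear entropy S_L = d/(d-1) (1 - Tr rho^2), the SDL complexity Delta*(1 - Delta) with the disorder Delta taken as S_L, and the tangle as the squared Wootters concurrence. These conventions are assumptions, not necessarily the paper's.

```python
import numpy as np

def linear_entropy(rho):
    # Normalized linear entropy: 0 for pure states, 1 for the maximally mixed state.
    d = rho.shape[0]
    return d / (d - 1) * (1.0 - np.trace(rho @ rho).real)

def sdl_complexity(disorder):
    # SDL complexity vanishes for complete order (0) and complete disorder (1).
    return disorder * (1.0 - disorder)

def tangle(rho):
    # Squared Wootters concurrence for a two-qubit density matrix.
    sy = np.array([[0, -1j], [1j, 0]])
    spin_flip = np.kron(sy, sy)
    r = rho @ spin_flip @ rho.conj() @ spin_flip
    lam = np.sqrt(np.clip(np.linalg.eigvals(r).real, 0, None))
    lam = np.sort(lam)[::-1]
    concurrence = max(0.0, lam[0] - lam[1] - lam[2] - lam[3])
    return concurrence**2

phi_plus = np.array([1, 0, 0, 1]) / np.sqrt(2)   # Bell state (|00> + |11>)/sqrt(2)
for p in (0.2, 0.6, 1.0):                        # Werner mixing parameter
    rho = p * np.outer(phi_plus, phi_plus) + (1 - p) * np.eye(4) / 4
    s_l = linear_entropy(rho)
    print(f"p = {p:.1f}: S_L = {s_l:.3f}, SDL = {sdl_complexity(s_l):.3f}, tangle = {tangle(rho):.3f}")
```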