955 results for Conditional entropy


Relevance: 20.00%

Abstract:

Multi-objective particle swarm optimization (MOPSO) is a search algorithm based on social behavior. Most of the existing multi-objective particle swarm optimization schemes are based on Pareto optimality and aim to obtain a representative non-dominated Pareto front for a given problem. Several approaches have been proposed to study the convergence and performance of the algorithm, particularly by assessing the final results. In the present paper, a different approach is proposed: Shannon entropy is used to analyze the MOPSO dynamics during the algorithm execution. The results indicate that Shannon entropy can be used as an indicator of diversity and convergence for MOPSO problems.
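
A minimal sketch of the underlying idea, assuming a two-objective space: partition the objective space into bins, build an occupancy histogram of the swarm, and track its Shannon entropy across iterations. The function name, the binning scheme, and the example data are illustrative assumptions, not the paper's actual procedure.

    import numpy as np

    def swarm_entropy(objectives, bins=10):
        """Shannon entropy (bits) of the swarm's occupancy histogram in a
        two-objective space; higher values indicate a more diverse swarm."""
        hist, _, _ = np.histogram2d(objectives[:, 0], objectives[:, 1], bins=bins)
        p = hist.ravel() / hist.sum()
        p = p[p > 0]                      # drop empty bins
        return -np.sum(p * np.log2(p))

    # A spread-out swarm yields higher entropy than a collapsed (converged) one.
    rng = np.random.default_rng(0)
    spread = rng.uniform(0.0, 1.0, size=(100, 2))
    clustered = 0.5 + 0.01 * rng.standard_normal((100, 2))
    print(swarm_entropy(spread), swarm_entropy(clustered))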

Relevance: 20.00%

Abstract:

Catastrophic events, such as wars and terrorist attacks, tornadoes and hurricanes, earthquakes, tsunamis, floods and landslides, are always accompanied by a large number of casualties. The size distributions of these casualties have separately been shown to follow approximate power law (PL) distributions. In this paper, we analyze the statistical distributions of the number of victims of catastrophic phenomena, in particular terrorism, and find double PL behavior, meaning that the data sets are better approximated by two PLs instead of a single one. We plot the PL parameters corresponding to several events and observe an interesting pattern in the charts, where the lines that connect each pair of points defining the double PLs are almost parallel to each other. A complementary data analysis is performed by computing the entropy. The results reveal relationships hidden in the data that may trigger a future comprehensive explanation of this type of phenomenon.
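
As a rough illustration of the double PL idea, the sketch below fits two straight lines to the empirical complementary CDF in log-log coordinates, split at a user-chosen crossover point. The crossover value, the least-squares fits, and the synthetic Pareto data are assumptions for demonstration, not the estimation procedure used in the paper.

    import numpy as np

    def double_power_law_fit(x, crossover):
        """Fit separate power laws below and above a given crossover value,
        using straight-line fits to the complementary CDF in log-log space."""
        x = np.sort(np.asarray(x, dtype=float))
        ccdf = 1.0 - np.arange(1, x.size + 1) / x.size
        keep = ccdf > 0
        lx, ly = np.log10(x[keep]), np.log10(ccdf[keep])
        low, high = lx <= np.log10(crossover), lx > np.log10(crossover)
        slope_low, _ = np.polyfit(lx[low], ly[low], 1)
        slope_high, _ = np.polyfit(lx[high], ly[high], 1)
        return slope_low, slope_high      # PL exponents of the two regimes

    # Synthetic heavy-tailed "casualty" data (Pareto samples).
    rng = np.random.default_rng(1)
    casualties = rng.pareto(1.5, size=2000) + 1.0
    print(double_power_law_fit(casualties, crossover=5.0))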

Relevance: 20.00%

Abstract:

This paper studies the chromosome information of twenty-five species, namely mammals, fishes, birds, insects, nematodes, a fungus, and one plant. A quantifying scheme inspired by the state-space representation of dynamical systems is formulated. Based on this algorithm, the information of each chromosome is converted into a bidimensional distribution. The plots are then analyzed and characterized by means of Shannon entropy. The large volume of information is integrated by averaging the lengths and entropy quantities of each species. The results can be easily visualized, revealing quantitative global genomic information.
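
A chaos-game-style mapping is used below as a stand-in for the paper's state-space-inspired quantifying scheme: the sequence is mapped onto the unit square, binned into a bidimensional distribution, and characterized by Shannon entropy. The corner assignment, the bin count, and the random test sequence are illustrative assumptions.

    import numpy as np

    # Corner assignment for a chaos-game-style mapping (an illustrative choice).
    CORNERS = {'A': (0.0, 0.0), 'C': (0.0, 1.0), 'G': (1.0, 1.0), 'T': (1.0, 0.0)}

    def sequence_to_points(seq):
        """Map a nucleotide sequence onto points in the unit square."""
        points, x, y = [], 0.5, 0.5
        for base in seq:
            if base not in CORNERS:
                continue                  # skip ambiguous symbols such as N
            cx, cy = CORNERS[base]
            x, y = (x + cx) / 2.0, (y + cy) / 2.0
            points.append((x, y))
        return np.array(points)

    def distribution_entropy(points, bins=64):
        """Shannon entropy (bits) of the binned bidimensional distribution."""
        hist, _, _ = np.histogram2d(points[:, 0], points[:, 1], bins=bins,
                                    range=[[0, 1], [0, 1]])
        p = hist.ravel() / hist.sum()
        p = p[p > 0]
        return -np.sum(p * np.log2(p))

    rng = np.random.default_rng(2)
    sequence = ''.join(rng.choice(list('ACGT'), size=100000))
    print(distribution_entropy(sequence_to_points(sequence)))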

Relevance: 20.00%

Abstract:

This paper investigates the adoption of entropy for analyzing the dynamics of a multiple independent particles system. Several entropy definitions and types of particle dynamics with integer and fractional behavior are studied. The results reveal the adequacy of the entropy concept in the analysis of complex dynamical systems.
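
A minimal sketch of the approach, assuming ordinary (integer-order) random-walk dynamics for the particles: their positions are histogrammed at successive instants and the Shannon entropy of that histogram is tracked. The fractional-order dynamics and the alternative entropy definitions studied in the paper are not reproduced.

    import numpy as np

    def position_entropy(x, bins=50, lim=10.0):
        """Shannon entropy (nats) of the particle position histogram."""
        hist, _ = np.histogram(x, bins=bins, range=(-lim, lim))
        p = hist / hist.sum()
        p = p[p > 0]
        return -np.sum(p * np.log(p))

    # Independent particles performing a simple random walk.
    rng = np.random.default_rng(3)
    n_particles, n_steps = 5000, 200
    x = np.zeros(n_particles)
    for t in range(1, n_steps + 1):
        x += 0.1 * rng.standard_normal(n_particles)
        if t % 50 == 0:
            print(t, position_entropy(x))   # entropy grows as the particles spread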

Relevance: 20.00%

Abstract:

When considering time series data of variables describing agent interactions in social neurobiological systems, measures of regularity can provide a global understanding of such system behaviors. Approximate entropy (ApEn) was introduced as a nonlinear measure to assess the complexity of a system behavior by quantifying the regularity of the generated time series. However, ApEn is not reliable when assessing and comparing the regularity of data series with short or inconsistent lengths, which often occur in studies of social neurobiological systems, particularly in dyadic human movement systems. Here, the authors present two normalized, nonmodified measures of regularity derived from the original ApEn, which are less dependent on time series length. The validity of the suggested measures was tested in well-established series (random and sine) prior to their empirical application, describing the dyadic behavior of athletes in team games. The authors consider one of the normalized ApEn measures to generate the 95th percentile envelopes that can be used to test whether a particular social neurobiological system is highly complex (i.e., generates highly unpredictable time series). Results demonstrated that the suggested measures may be considered valid instruments for measuring and comparing complexity in systems that produce time series with inconsistent lengths.
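
The normalized measures proposed by the authors are not reproduced here; the sketch below implements the classical ApEn on which they are based, assuming the customary choices m = 2 and r = 0.2 times the standard deviation of the series.

    import numpy as np

    def approximate_entropy(series, m=2, r_factor=0.2):
        """Classical ApEn(m, r) with r = r_factor * std(series)."""
        x = np.asarray(series, dtype=float)
        r = r_factor * np.std(x)

        def phi(k):
            n = x.size - k + 1
            emb = np.array([x[i:i + k] for i in range(n)])   # embedded vectors
            # Chebyshev distance between every pair of embedded vectors
            dist = np.max(np.abs(emb[:, None, :] - emb[None, :, :]), axis=2)
            c = np.sum(dist <= r, axis=1) / n                # self-matches included
            return np.mean(np.log(c))

        return phi(m) - phi(m + 1)

    rng = np.random.default_rng(4)
    t = np.arange(500)
    print(approximate_entropy(np.sin(0.1 * t)))           # regular series: low ApEn
    print(approximate_entropy(rng.standard_normal(500)))  # random series: high ApEn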

Relevance: 20.00%

Abstract:

The dynamics of catalytic networks have been widely studied over the last decades because of their implications in several fields, such as prebiotic evolution, virology, neural networks, immunology, and ecology. One of the most studied mathematical frameworks for catalytic networks was initially formulated in the context of prebiotic evolution, by means of the hypercycle theory. The hypercycle is a set of self-replicating species able to catalyze other replicator species within a cyclic architecture. Hypercyclic organization might arise from a quasispecies as a way to increase the information content beyond the so-called error threshold. The catalytic coupling between replicators makes all the species behave like a single, coherent evolutionary multimolecular unit. The inherent nonlinearities of catalytic interactions are responsible for the emergence of several types of dynamics, among them chaos. In this article we begin with a brief review of the hypercycle theory, focusing on its evolutionary implications as well as on the different dynamics associated with different types of small catalytic networks. Then we study the properties of chaotic hypercycles with error-prone replication using symbolic dynamics theory, characterizing, by means of the theory of topological Markov chains, the topological entropy and the periods of the orbits of unimodal-like iterated maps obtained from the strange attractor. We will focus our study on some key parameters responsible for the structure of the catalytic network: mutation rates, autocatalytic and cross-catalytic interactions.
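
For the entropy step alone, a minimal sketch follows: once a Markov partition has been obtained from the symbolic dynamics, the topological entropy of the resulting topological Markov chain is the logarithm of the spectral radius of its 0-1 transition matrix. The example matrix (the golden-mean shift) is illustrative and is not taken from the hypercycle model.

    import numpy as np

    def topological_entropy(transition_matrix):
        """Topological entropy of a topological Markov chain: log of the
        spectral radius (Perron eigenvalue) of its 0-1 transition matrix."""
        eigenvalues = np.linalg.eigvals(np.asarray(transition_matrix, dtype=float))
        return np.log(np.max(np.abs(eigenvalues)))

    # Golden-mean shift: entropy equals the log of the golden ratio.
    A = [[1, 1],
         [1, 0]]
    print(topological_entropy(A), np.log((1 + np.sqrt(5)) / 2))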

Relevance: 20.00%

Abstract:

A new method, based on linear correlation and phase diagrams, was developed for processes such as sedimentation, where the deposition phase can have different durations, represented by repeated values in a series, and where erosion can play an important role by deleting values from the series. The sampling process itself can be the cause of repeated values (large strata sampled twice) or deleted values (thin strata falling between two consecutive samples). We developed a mathematical procedure which, based on the evolution of chemical composition with depth, allows the boundaries and the periodicity of different sedimentary environments to be established. The basic tool is simply a linear correlation analysis that allows us to detect possible evolution rules connected with cyclical phenomena within the time series (with depth treated as time), with the final objective of prediction. A very interesting finding was the phenomenon of repeated sliding windows, which represent quasi-cycles of the series with associated quasi-periods. An accurate forecast can be obtained when inside a quasi-cycle: the remaining elements of the cycle can be predicted with a probability related to the number of repeated and deleted points. Because this is an innovative methodology, its efficiency is being tested in several case studies, with remarkable results that show its efficacy. Keywords: sedimentary environments, sequence stratigraphy, data analysis, time series, conditional probability.
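
A minimal sketch of the sliding-window correlation idea, not the authors' full procedure: a reference window of the series is correlated with every lagged window of the same length, and lags with high linear correlation are reported as candidate quasi-periods. The window length, the threshold, and the synthetic depth profile are assumptions.

    import numpy as np

    def quasi_period_candidates(series, window=20, threshold=0.9):
        """Lags at which a sliding window is strongly linearly correlated with
        the initial window; such lags are candidate quasi-periods."""
        x = np.asarray(series, dtype=float)
        reference = x[:window]
        lags = []
        for lag in range(1, x.size - window + 1):
            r = np.corrcoef(reference, x[lag:lag + window])[0, 1]
            if r >= threshold:
                lags.append(lag)
        return lags

    # A noisy periodic "composition versus depth" profile with period 25.
    rng = np.random.default_rng(5)
    depth = np.arange(300)
    profile = np.sin(2 * np.pi * depth / 25) + 0.1 * rng.standard_normal(depth.size)
    print(quasi_period_candidates(profile))   # lags cluster near multiples of 25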

Relevance: 20.00%

Abstract:

In this work, we associate a p-periodic nonautonomous graph to each p-periodic nonautonomous Lorenz system with finite critical orbits. We develop Perron-Frobenius theory for nonautonomous graphs and use it to calculate their entropy. Finally, we prove that the topological entropy of a p-periodic nonautonomous Lorenz system is equal to the entropy of its associated nonautonomous graph.
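
One natural way to compute such an entropy, sketched below under the assumption that the p-periodic graph is given as p 0-1 transition matrices, is to take (1/p) times the log of the spectral radius of their ordered product; the precise Perron-Frobenius construction used in the paper may differ, and the example matrices are illustrative.

    import numpy as np
    from functools import reduce

    def periodic_graph_entropy(transition_matrices):
        """Entropy of a p-periodic sequence of 0-1 transition matrices, taken
        here as (1/p) * log of the spectral radius of their ordered product."""
        p = len(transition_matrices)
        matrices = [np.asarray(A, dtype=float) for A in transition_matrices]
        product = reduce(np.matmul, matrices)
        spectral_radius = np.max(np.abs(np.linalg.eigvals(product)))
        return np.log(spectral_radius) / p

    # An illustrative 2-periodic pair of graphs on two vertices.
    A1 = [[1, 1],
          [1, 0]]
    A2 = [[1, 1],
          [1, 1]]
    print(periodic_graph_entropy([A1, A2]))   # log(3) / 2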

Relevance: 20.00%

Abstract:

Dissertation submitted to obtain the Master's degree in Biomedical Engineering.

Relevance: 20.00%

Abstract:

The 30th ACM/SIGAPP Symposium on Applied Computing (SAC 2015), Embedded Systems, 13–17 April 2015, Salamanca, Spain.

Relevance: 20.00%

Abstract:

This paper studies the statistical distributions of worldwide earthquakes from 1963 up to 2012. A Cartesian grid, dividing Earth into geographic regions, is considered. Entropy and the Jensen–Shannon divergence are used to analyze and compare real-world data. Hierarchical clustering and multi-dimensional scaling techniques are adopted for data visualization. Entropy-based indices have the advantage of leading to a single parameter expressing the relationships between the seismic data. Classical and generalized (fractional) entropy and Jensen–Shannon divergence are tested. The generalized measures lead to a clear identification of patterns embedded in the data and contribute to a better understanding of earthquake distributions.
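
A minimal sketch of the Jensen–Shannon step, assuming per-cell magnitude histograms are already available: the divergence between two regions is computed from their normalized histograms. The grid construction, the generalized (fractional) variants, the hierarchical clustering, and the MDS visualization are not reproduced, and the synthetic histograms are assumptions.

    import numpy as np

    def shannon_entropy(p):
        """Shannon entropy (bits) of a normalized discrete distribution."""
        p = p[p > 0]
        return -np.sum(p * np.log2(p))

    def jensen_shannon_divergence(p, q):
        """Jensen-Shannon divergence (bits) between two discrete distributions."""
        p = np.asarray(p, dtype=float) / np.sum(p)
        q = np.asarray(q, dtype=float) / np.sum(q)
        m = 0.5 * (p + q)
        return shannon_entropy(m) - 0.5 * (shannon_entropy(p) + shannon_entropy(q))

    # Compare magnitude histograms of two synthetic geographic cells.
    rng = np.random.default_rng(6)
    bins = np.linspace(4.0, 9.0, 26)
    cell_a, _ = np.histogram(4.0 + rng.exponential(0.5, 1000), bins=bins)
    cell_b, _ = np.histogram(4.0 + rng.exponential(0.8, 1000), bins=bins)
    print(jensen_shannon_divergence(cell_a, cell_b))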

Relevance: 20.00%

Abstract:

Complex industrial plants exhibit multiple interactions among smaller parts and with human operators. Failure in one part can propagate across subsystem boundaries, causing a serious disaster. This paper analyzes industrial accident data series from the perspective of dynamical systems. First, we process real-world data and show that the statistics of the number of fatalities reveal features that are well described by power law (PL) distributions. For early years, the data reveal double PL behavior, while, for more recent time periods, a single PL fits the experimental data better. Second, we analyze the entropy of the data series statistics over time. Third, we use the Kullback–Leibler divergence to compare the empirical data, together with multidimensional scaling (MDS) techniques for data analysis and visualization. Entropy-based analysis is adopted to assess complexity, having the advantage of yielding a single parameter to express relationships between the data. The classical and the generalized (fractional) entropy and Kullback–Leibler divergence are used. The generalized measures allow a clear identification of patterns embedded in the data.
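
The sketch below illustrates the divergence and generalized-entropy ingredients on synthetic fatality histograms. The Kullback–Leibler formula is standard; the Tsallis entropy is used only as a stand-in for the fractional entropy adopted in the paper, and the data, bins, and parameter values are assumptions.

    import numpy as np

    def kullback_leibler(p, q, eps=1e-12):
        """Kullback-Leibler divergence D(P || Q), in nats, between histograms."""
        p = np.asarray(p, dtype=float) + eps
        q = np.asarray(q, dtype=float) + eps
        p, q = p / p.sum(), q / q.sum()
        return np.sum(p * np.log(p / q))

    def tsallis_entropy(p, q_index=0.7, eps=1e-12):
        """Tsallis entropy, a generalized entropy that recovers the Shannon
        entropy (in nats) as q_index tends to 1."""
        p = np.asarray(p, dtype=float) + eps
        p = p / p.sum()
        return (1.0 - np.sum(p ** q_index)) / (q_index - 1.0)

    # Fatality histograms for two synthetic periods.
    rng = np.random.default_rng(7)
    bins = np.logspace(0, 3, 16)
    early, _ = np.histogram(rng.pareto(1.2, 500) + 1, bins=bins)
    recent, _ = np.histogram(rng.pareto(1.6, 500) + 1, bins=bins)
    print(kullback_leibler(early, recent), tsallis_entropy(recent))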

Relevance: 20.00%

Abstract:

A Work Project, presented as part of the requirements for the award of a Master's degree in Finance from the NOVA – School of Business and Economics.

Relevance: 20.00%

Abstract:

The thermodynamic stability of black holes, described by the Rényi formula as an equilibrium-compatible entropy function, is investigated. It is shown that, within this approach, asymptotically flat Schwarzschild black holes can be in stable equilibrium with thermal radiation at a fixed temperature. This implies that the canonical ensemble exists, just as in anti-de Sitter space, and that nonextensive effects can stabilize the black holes in a very similar way to the gravitational potential of an anti-de Sitter space. Furthermore, it is also shown that a Hawking–Page-like black hole phase transition occurs at a critical temperature which depends on the q-parameter of the Rényi formula.
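
The black-hole analysis itself is not reproduced here; the sketch below only shows the Rényi entropy formula on a discrete distribution and its Shannon limit as the q-parameter tends to 1, with the example distribution chosen arbitrarily.

    import numpy as np

    def renyi_entropy(p, q):
        """Renyi entropy S_q = ln(sum_i p_i**q) / (1 - q), in nats; it tends
        to the Shannon entropy as q -> 1."""
        p = np.asarray(p, dtype=float)
        p = p / p.sum()
        if np.isclose(q, 1.0):
            p = p[p > 0]
            return -np.sum(p * np.log(p))
        return np.log(np.sum(p ** q)) / (1.0 - q)

    distribution = np.array([0.5, 0.25, 0.125, 0.125])
    for q in (0.5, 0.999, 2.0):
        print(q, renyi_entropy(distribution, q))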

Relevance: 20.00%

Abstract:

The receiver operating characteristic (ROC) curve is the most widely used measure for evaluating the performance of a diagnostic biomarker when predicting a binary disease outcome. The ROC curve displays the true positive rate (or sensitivity) and the false positive rate (or 1-specificity) for different cut-off values used to classify an individual as healthy or diseased. In time-to-event studies, however, the disease status of an individual (e.g. dead or alive) is not a fixed characteristic and varies over the course of the study. In such cases, when evaluating the performance of the biomarker, several issues should be taken into account: first, the time-dependent nature of the disease status; and second, the presence of incomplete data (e.g. censored data, typically present in survival studies). Accordingly, to assess the discrimination power of continuous biomarkers for time-dependent disease outcomes, time-dependent extensions of the true positive rate, the false positive rate, and the ROC curve have recently been proposed. In this work, we present new nonparametric estimators of the cumulative/dynamic time-dependent ROC curve that allow accounting for the possible modifying effect of current or past covariate measures on the discriminatory power of the biomarker. The proposed estimators can accommodate right-censored data, as well as covariate-dependent censoring. The behavior of the estimators proposed in this study will be explored through simulations and illustrated using data from a cohort of patients who suffered from acute coronary syndrome.
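
A deliberately naive sketch of the cumulative/dynamic construction, ignoring censoring and covariate adjustment (which are precisely the paper's contributions): at time t, subjects with an event by t are cases and the remaining subjects are controls, and the ROC curve is traced over marker cutoffs. All names and the synthetic data are assumptions.

    import numpy as np

    def cumulative_dynamic_roc(marker, event_time, t):
        """Naive cumulative/dynamic ROC at time t: cases have event_time <= t,
        controls have event_time > t; censoring is ignored in this sketch."""
        marker = np.asarray(marker, dtype=float)
        cases = np.asarray(event_time) <= t
        cutoffs = np.sort(np.unique(marker))[::-1]
        tpr = np.array([np.mean(marker[cases] > c) for c in cutoffs])
        fpr = np.array([np.mean(marker[~cases] > c) for c in cutoffs])
        return fpr, tpr

    rng = np.random.default_rng(8)
    biomarker = rng.standard_normal(500)
    times = rng.exponential(np.exp(-biomarker))      # higher marker, earlier event
    fpr, tpr = cumulative_dynamic_roc(biomarker, times, t=1.0)
    auc = np.sum(np.diff(fpr) * (tpr[1:] + tpr[:-1]) / 2.0)   # trapezoidal AUC(t)
    print(round(auc, 3))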