892 results for hierarchical entropy


Relevance: 20.00%

Abstract:

This paper presents a novel segmentation method for cuboidal cell nuclei in images of prostate tissue stained with hematoxylin and eosin. The proposed method segments normal, hyperplastic, and cancerous prostate images in three steps: pre-processing, segmentation of cuboidal cell nuclei, and post-processing. The pre-processing step consists of applying contrast stretching to the red (R) channel to highlight the contrast of cuboidal cell nuclei. The aim of the second step is to apply global thresholding based on minimum cross entropy to generate a binary image with candidate regions for cuboidal cell nuclei. In the post-processing step, false positives are removed using the connected component method. The proposed segmentation method was applied to an image bank of 105 samples, and the resulting sensitivity, specificity, and accuracy were compared with those provided by other segmentation approaches in the specialized literature. The results are promising and demonstrate that the proposed method segments cuboidal cell nuclei with a mean accuracy of 97%. © 2013 Elsevier Ltd. All rights reserved.
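
The core of the second step, global thresholding by minimum cross entropy (Li's criterion), can be sketched with a brute-force search over grey levels. This is an illustrative reimplementation under assumed conventions (8-bit histogram, first minimizer kept), not the authors' code:

```python
import numpy as np

def min_cross_entropy_threshold(image):
    """Brute-force minimum cross-entropy (Li) threshold over the
    0..255 grey levels of an 8-bit image."""
    hist, _ = np.histogram(image.ravel(), bins=256, range=(0, 256))
    levels = np.arange(256)
    best_t, best_eta = 0, np.inf
    for t in range(1, 256):
        w1, w2 = hist[:t].sum(), hist[t:].sum()
        if w1 == 0 or w2 == 0:
            continue
        mu1 = (levels[:t] * hist[:t]).sum() / w1   # mean below threshold
        mu2 = (levels[t:] * hist[t:]).sum() / w2   # mean above threshold
        if mu1 <= 0:                               # log undefined at zero
            continue
        eta = -((levels[:t] * hist[:t]).sum() * np.log(mu1)
                + (levels[t:] * hist[t:]).sum() * np.log(mu2))
        if eta < best_eta:
            best_eta, best_t = eta, t
    return best_t

# Synthetic bimodal image: background at 60, nuclei at 190 (illustrative).
img = np.concatenate([np.full(600, 60.0), np.full(400, 190.0)])
t = min_cross_entropy_threshold(img)
```

In a full pipeline, the binary image produced by this threshold would then be cleaned of false positives with connected-component labelling, as the abstract describes.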

Relevance: 20.00%

Abstract:

The influenza virus has been a challenge to science due to its ability to withstand new environmental conditions. Given the growth of virus sequence databases, computational approaches can help in understanding virus behavior over time and can suggest new directions for dealing with influenza. This work presents triplet entropy analysis as a potential phylodynamic tool to quantify the nucleotide organization of viral sequences. Applying this measure to hemagglutinin (HA) and neuraminidase (NA) segments of the H1N1 and H3N2 virus subtypes revealed variability effects along the timeline, supporting inferences about virus evolution. Sequences were grouped by year and compared between virus subtypes (H1N1 and H3N2), using the nonparametric Mann-Whitney test for comparisons between groups. Results show that differentiation in entropy precedes differentiation in GC content for both groups. For the HA fragment, both triplet entropy and GC content intersect in 2009, the year of the recent pandemic. Some conclusions about possible flu evolutionary lines were drawn. © 2013 Elsevier B.V.
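
Reading "triplet entropy" as the Shannon entropy of the non-overlapping codon distribution — an assumption; the paper's exact definition may differ — a minimal sketch is:

```python
from collections import Counter
from math import log2

def triplet_entropy(seq):
    """Shannon entropy (bits) of the distribution of non-overlapping
    nucleotide triplets (codons) in a sequence."""
    triplets = [seq[i:i + 3] for i in range(0, len(seq) - 2, 3)]
    n = len(triplets)
    return -sum((c / n) * log2(c / n) for c in Counter(triplets).values())
```

A sequence of one repeated codon gives entropy 0; two equally frequent codons give 1 bit, and richer codon usage drives the value up.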

Relevance: 20.00%

Abstract:

Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)

Relevance: 20.00%

Abstract:

Visual perception and action are strongly linked with parallel processing channels connecting the retina, the lateral geniculate nucleus, and the input layers of the primary visual cortex. Achromatic vision is provided by at least two such channels, formed by the M and P neurons. These cell pathways are similarly organized in primates with different lifestyles, including diurnal and nocturnal species and species exhibiting a variety of color vision phenotypes. We describe the M and P cell properties by 3D Gabor functions and their 3D Fourier transforms. The M and P cells occupy different loci in the Gabor information diagram, or Fourier space. This separation allows the M and P pathways to transmit visual signals with distinct 6D joint entropy for space, spatial frequency, time, and temporal frequency. By combining the M and P inputs to cortical neurons beyond the V1 input layers, the cortical pathways are able to process aspects of visual stimuli with better precision than would be possible using the M or P pathway alone. This performance fulfils the requirements of different behavioral tasks.
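
The separation of M and P cells in the Gabor information diagram rests on the fact that Gabor functions minimise the joint space-frequency uncertainty. A 1D numerical check can illustrate this: with intensity-weighted standard deviations, the product of spatial and spectral spreads of a Gabor function approaches the lower bound 1/(4π). Grid size, span, and carrier frequency below are illustrative choices:

```python
import numpy as np

def uncertainty_product(sigma, f0, n=4096, span=40.0):
    """Space-frequency uncertainty product of the 1D Gabor function
    g(x) = exp(-x^2 / (2 sigma^2)) cos(2 pi f0 x), with spreads taken
    as intensity-weighted (|g|^2, |G|^2) standard deviations."""
    x = np.linspace(-span, span, n)
    g = np.exp(-x**2 / (2 * sigma**2)) * np.cos(2 * np.pi * f0 * x)
    # spatial spread from the normalized intensity |g|^2
    p = g**2 / np.sum(g**2)
    spread_x = np.sqrt(np.sum(p * x**2))        # mean is 0 by symmetry
    # spectral spread around the carrier, from the one-sided spectrum
    G2 = np.abs(np.fft.rfft(g))**2
    f = np.fft.rfftfreq(n, d=x[1] - x[0])
    q = G2 / np.sum(G2)
    f_mean = np.sum(q * f)
    spread_f = np.sqrt(np.sum(q * (f - f_mean)**2))
    return spread_x * spread_f
```

For a well-resolved carrier (f0 well above the envelope bandwidth), the returned product sits very close to 1/(4π) ≈ 0.0796, the minimum the information diagram is built around.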

Relevance: 20.00%

Abstract:

This study evaluated alternatives for producing erosion susceptibility maps, considering different weight combinations for an environment's attributes according to four different points of view. The attributes considered were landform, steepness, soils, rocks, and land occupation. The alternatives considered were: (1) equal weights, the more traditional approach; (2) different weights, according to a previous study in the area; (3) different weights, based on other works in the literature; and (4) different weights, based on the analytic hierarchy process. The study area was the Prosa Basin, located in Campo Grande, Mato Grosso do Sul State, Brazil. The results showed that the assessed alternatives can be used together, or in different stages of studies, aiming at urban planning and decision-making on the interventions to be applied.
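
Alternative (4), the analytic hierarchy process, derives attribute weights from a pairwise-comparison matrix via its principal eigenvector, with Saaty's consistency ratio guarding against contradictory judgments. A minimal sketch — the comparison values below are illustrative, not those of the study:

```python
import numpy as np

def ahp_weights(pairwise):
    """Priority weights from an AHP pairwise-comparison matrix,
    taken as the principal right eigenvector normalised to sum to 1."""
    A = np.asarray(pairwise, dtype=float)
    vals, vecs = np.linalg.eig(A)
    w = np.abs(vecs[:, np.argmax(vals.real)].real)
    return w / w.sum()

def consistency_ratio(pairwise):
    """Saaty consistency ratio CR = CI / RI with CI = (lmax - n)/(n - 1);
    judgments are usually deemed acceptable when CR < 0.10."""
    RI = {3: 0.58, 4: 0.90, 5: 1.12}      # Saaty's random indices
    A = np.asarray(pairwise, dtype=float)
    n = A.shape[0]
    lmax = np.max(np.linalg.eigvals(A).real)
    return ((lmax - n) / (n - 1)) / RI[n]

# Perfectly consistent matrix built from target weights (0.5, 0.3, 0.2);
# real AHP matrices come from expert judgments on Saaty's 1-9 scale.
A = np.array([[1.0, 5 / 3, 5 / 2],
              [3 / 5, 1.0, 3 / 2],
              [2 / 5, 2 / 3, 1.0]])
w = ahp_weights(A)
```

The recovered weights would then multiply the rasterised attribute scores (landform, steepness, soils, rocks, land occupation) to produce the susceptibility map.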

Relevance: 20.00%

Abstract:

Recently, a group of researchers proposed the concept of entransy, by analogy with the electrical energy stored in a capacitor, entransy being a measure of the ability of a body or a system to transfer heat. In comparative terms, the entransy dissipation rate is related to the loss of heat transfer ability just as the exergy destruction rate is proportional to the loss of work ability, both losses being caused by the irreversibilities of the thermodynamic processes. Some authors have questioned the need for the concept of entransy, claiming that it is only an extension of a well-established theory of heat transfer. The objective of this work is to show the equivalence between the application of the concepts of entransy and entropy generation rate, which can be verified using various application examples. The application examples used here are the thermodynamic modeling of three physical models of solar energy collectors and a physical model of a sensible heat storage system. Analytical results are shown and compared. The results showed that applying the concept of entransy produced expressions identical to those obtained with the concept of entropy generation, indicating a duplication of concepts. (C) 2014 Elsevier Ltd. All rights reserved.

Relevance: 20.00%

Abstract:

Technologies are developing rapidly, but some aspects of computers, such as their processing capacity, are reaching physical limits. Quantum computation may offer solutions to these limitations and to issues that may arise. In the field of information security, encryption is of paramount importance, which motivates the development of quantum methods in place of classical ones, given the computational power offered by quantum computing. In the quantum world, physical states can be interrelated, giving rise to the phenomenon called entanglement. This study presents both a theoretical essay on quantum mechanics, computing, information, cryptography and quantum entropy, and some simulations, implemented in the C language, of the effects of the entropy of entanglement of photons in a data transmission, using the von Neumann entropy and the Tsallis entropy.
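
The two entropies named above can be illustrated on the reduced state of one photon of a maximally entangled pair. The study's simulations were written in C; the sketch below shows the same computation in Python (base-2 logarithm for the von Neumann entropy is a convention chosen here):

```python
import numpy as np

def von_neumann_entropy(rho):
    """S(rho) = -Tr(rho log2 rho), from the eigenvalues of rho (in bits)."""
    lam = np.linalg.eigvalsh(rho)
    lam = lam[lam > 1e-12]          # drop numerically zero eigenvalues
    return float(-np.sum(lam * np.log2(lam)))

def tsallis_entropy(rho, q):
    """Quantum Tsallis entropy S_q(rho) = (1 - Tr(rho^q)) / (q - 1)."""
    lam = np.linalg.eigvalsh(rho)
    lam = lam[lam > 1e-12]
    return float((1.0 - np.sum(lam ** q)) / (q - 1.0))

# Maximally entangled photon pair (|00> + |11>)/sqrt(2); tracing out the
# partner photon leaves the maximally mixed single-qubit state I/2.
bell = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2.0)
rho_ab = np.outer(bell, bell).reshape(2, 2, 2, 2)   # indices (a, b, a', b')
rho_a = np.trace(rho_ab, axis1=1, axis2=3)          # partial trace over b
```

For this Bell pair the entanglement is maximal: the von Neumann entropy of the reduced state is exactly 1 bit, and the q = 2 Tsallis entropy is 1/2.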

Relevance: 20.00%

Abstract:

Analyses of ecological data should account for the uncertainty in the process(es) that generated the data. However, accounting for these uncertainties is a difficult task, since ecology is known for its complexity. Measurement and/or process errors are often the only sources of uncertainty modeled when addressing complex ecological problems, yet analyses should also account for uncertainty in sampling design, in model specification, in parameters governing the specified model, and in initial and boundary conditions. Only then can we be confident in the scientific inferences and forecasts made from an analysis. Probability and statistics provide a framework that accounts for multiple sources of uncertainty. Given the complexities of ecological studies, the hierarchical statistical model is an invaluable tool. This approach is not new in ecology, and there are many examples (both Bayesian and non-Bayesian) in the literature illustrating the benefits of this approach. In this article, we provide a baseline for concepts, notation, and methods, from which discussion on hierarchical statistical modeling in ecology can proceed. We have also planted some seeds for discussion and tried to show where the practical difficulties lie. Our thesis is that hierarchical statistical modeling is a powerful way of approaching ecological analysis in the presence of inevitable but quantifiable uncertainties, even if practical issues sometimes require pragmatic compromises.
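
A minimal hierarchical model in the sense described — explicit data, process, and parameter levels — can be simulated in a few lines. This sketch assumes known hyperparameters and uses the classical normal-normal partial-pooling estimate; a full hierarchical analysis (Bayesian or not) would also estimate mu, tau, and sigma, and all numbers here are illustrative:

```python
import numpy as np

rng = np.random.default_rng(42)

# Parameter level: hyperparameters governing the site-level process.
mu, tau = 10.0, 2.0      # population mean and between-site sd
sigma = 1.0              # within-site (measurement) sd
n_sites, n_obs = 8, 5

# Process level: each site's latent mean is drawn from the population.
theta = rng.normal(mu, tau, size=n_sites)

# Data level: noisy observations conditional on the latent means.
y = rng.normal(theta[:, None], sigma, size=(n_sites, n_obs))

# Partial pooling: the normal-normal posterior mean shrinks each raw
# site mean toward mu in proportion to its sampling variance.
ybar = y.mean(axis=1)
shrink = (sigma**2 / n_obs) / (sigma**2 / n_obs + tau**2)
theta_hat = shrink * mu + (1 - shrink) * ybar
```

The shrinkage factor makes the hierarchical logic concrete: noisy sites borrow strength from the population, which is exactly the behaviour the article advocates for ecological analyses with multiple uncertainty sources.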

Relevance: 20.00%

Abstract:

For many tree species, mating system analyses have indicated potential variations in the selfing rate and paternity correlation among fruits within individuals, among individuals within populations, among populations, and from one flowering event to another. In this study, we used eight microsatellite markers to investigate mating systems at two hierarchical levels (fruits within individuals and individuals within populations) for the insect-pollinated Neotropical tree Tabebuia roseo-alba. We found that T. roseo-alba has a mixed mating system with predominantly outcrossed mating. The outcrossing rates at the population level were similar across two T. roseo-alba populations; however, the rates varied considerably among individuals within populations. The correlated paternity results at different hierarchical levels showed that there is a high probability of shared paternal parentage when comparing seeds within fruits and among fruits within plants, and full-sibs occur in much higher proportion within fruits than among fruits. Significant levels of fixation index were found in both populations, and biparental inbreeding is believed to be the main cause of the observed inbreeding. The number of pollen donors contributing to mating was low. Furthermore, open-pollinated seeds varied according to relatedness, including half-sibs, full-sibs, self-sibs and self-half-sibs. In both populations, the effective population size within a family (seed-tree and its offspring) was lower than expected for panmictic populations. Thus, seeds for ex situ conservation genetics, progeny tests and reforestation must be collected from a large number of seed-trees to guarantee an adequate effective population size in the sample.

Relevance: 20.00%

Abstract:

Complexity in time series is an intriguing feature of living dynamical systems, with potential use for identification of system state. Although various methods have been proposed for measuring physiologic complexity, uncorrelated time series are often assigned high values of complexity, erroneously classifying them as complex physiological signals. Here, we propose and discuss a method for complex system analysis based on generalized statistical formalism and surrogate time series. Sample entropy (SampEn) was rewritten, inspired by the Tsallis generalized entropy, as a function of the q parameter (qSampEn). qSDiff curves were calculated, which consist of the differences between the qSampEn of the original and surrogate series. We evaluated qSDiff for 125 real heart rate variability (HRV) dynamics, divided into groups of 70 healthy, 44 congestive heart failure (CHF), and 11 atrial fibrillation (AF) subjects, and for simulated series of stochastic and chaotic processes. The evaluations showed that, for nonperiodic signals, qSDiff curves have a maximum point (qSDiff_max) for q ≠ 1. The values of q at which the maximum point occurs and at which qSDiff is zero were also evaluated. Only the qSDiff_max values were capable of distinguishing the HRV groups (p-values 5.10 × 10⁻³, 1.11 × 10⁻⁷, and 5.50 × 10⁻⁷ for healthy vs. CHF, healthy vs. AF, and CHF vs. AF, respectively), consistent with the concept of physiologic complexity and suggesting a potential use for chaotic system analysis. (C) 2012 American Institute of Physics. [http://dx.doi.org/10.1063/1.4758815]
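
The baseline SampEn on which qSampEn builds can be sketched as follows; the Tsallis-generalized qSampEn itself follows the paper's own definition and is not reproduced here. Template-counting conventions (N − m templates, Chebyshev distance, tolerance r·std) are the common ones but vary slightly across implementations:

```python
import numpy as np

def sample_entropy(x, m=2, r=0.2):
    """SampEn(m, r) = -ln(A / B): B counts pairs of length-m templates
    within Chebyshev distance r * std(x); A does the same for length
    m + 1. Self-matches are excluded by counting only pairs i < j."""
    x = np.asarray(x, dtype=float)
    tol = r * x.std()
    def pair_count(length):
        # N - m templates for both lengths, the usual SampEn convention
        templ = np.lib.stride_tricks.sliding_window_view(x, length)[: len(x) - m]
        d = np.max(np.abs(templ[:, None, :] - templ[None, :, :]), axis=2)
        iu = np.triu_indices(len(templ), k=1)
        return np.sum(d[iu] <= tol)
    a, b = pair_count(m + 1), pair_count(m)
    return float(-np.log(a / b))

periodic = np.sin(np.linspace(0, 20 * np.pi, 500))   # highly regular
noise = np.random.default_rng(1).standard_normal(500)  # uncorrelated
```

A regular signal yields a much lower SampEn than uncorrelated noise — precisely the behaviour that makes plain SampEn mislabel noise as "complex", which the qSDiff surrogate comparison is designed to correct.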

Relevance: 20.00%

Abstract:

We present a stochastic approach to nonequilibrium thermodynamics based on the expression of the entropy production rate advanced by Schnakenberg for systems described by a master equation. From the microscopic Schnakenberg expression we obtain the macroscopic bilinear form for the entropy production rate in terms of fluxes and forces. This is performed by placing the system in contact with two reservoirs with distinct sets of thermodynamic fields and by assuming an appropriate form for the transition rate. The approach is applied to an interacting lattice gas model in contact with two heat and particle reservoirs. On a square lattice, a continuous symmetry-breaking phase transition takes place such that, in the nonequilibrium ordered phase, a heat flow sets in even when the temperatures of the reservoirs are the same. The entropy production rate is found to have a singularity of the linear-logarithm type at the critical point.
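
Schnakenberg's expression can be evaluated directly for a small master equation. The sketch below (a generic illustration, not the paper's lattice gas model) computes the stationary distribution and the rate Pi = (1/2) Σ_ij (W_ij p_j − W_ji p_i) ln(W_ij p_j / W_ji p_i), which vanishes under detailed balance and is positive for a driven cycle:

```python
import numpy as np

def stationary_dist(W):
    """Stationary distribution of dp/dt = L p for a master equation
    with transition rates W[i, j] (from state j to state i)."""
    L = np.array(W, dtype=float)
    np.fill_diagonal(L, 0.0)
    L -= np.diag(L.sum(axis=0))          # escape rates on the diagonal
    vals, vecs = np.linalg.eig(L)
    p = np.abs(vecs[:, np.argmin(np.abs(vals))].real)
    return p / p.sum()

def schnakenberg_rate(W, p):
    """Pi = (1/2) sum_ij (W_ij p_j - W_ji p_i) ln(W_ij p_j / W_ji p_i)."""
    total = 0.0
    for i in range(len(p)):
        for j in range(len(p)):
            if i != j and W[i][j] > 0 and W[j][i] > 0:
                fwd, bwd = W[i][j] * p[j], W[j][i] * p[i]
                total += 0.5 * (fwd - bwd) * np.log(fwd / bwd)
    return total

# Two-state system: detailed balance holds, so Pi vanishes.
W2 = np.array([[0.0, 1.0], [2.0, 0.0]])
pi2 = schnakenberg_rate(W2, stationary_dist(W2))

# Driven three-state cycle (clockwise rate 2, counter-clockwise 1):
# detailed balance is broken and Pi = ln 2 at the uniform stationary state.
W3 = np.zeros((3, 3))
for i in range(3):
    W3[(i + 1) % 3, i], W3[i, (i + 1) % 3] = 2.0, 1.0
pi3 = schnakenberg_rate(W3, stationary_dist(W3))
```

The nonzero value for the driven cycle is the elementary analogue of the heat flow that sets in at the model's nonequilibrium ordered phase.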

Relevance: 20.00%

Abstract:

Item response theory (IRT) comprises a set of statistical models which are useful in many fields, especially when there is an interest in studying latent variables (or latent traits). Usually such latent traits are assumed to be random variables and a convenient distribution is assigned to them. A very common choice for such a distribution has been the standard normal. Recently, Azevedo et al. [Bayesian inference for a skew-normal IRT model under the centred parameterization, Comput. Stat. Data Anal. 55 (2011), pp. 353-365] proposed a skew-normal distribution under the centred parameterization (SNCP), as had been studied in [R. B. Arellano-Valle and A. Azzalini, The centred parametrization for the multivariate skew-normal distribution, J. Multivariate Anal. 99(7) (2008), pp. 1362-1382], to model the latent trait distribution. This approach allows one to represent any asymmetric behaviour concerning the latent trait distribution. They also developed a Metropolis-Hastings within Gibbs sampling (MHWGS) algorithm based on the density of the SNCP, and showed that the algorithm recovers all parameters properly. Their results indicated that, in the presence of asymmetry, the proposed model and the estimation algorithm perform better than the usual model and estimation methods. Our main goal in this paper is to propose another type of MHWGS algorithm based on a stochastic representation (hierarchical structure) of the SNCP studied in [N. Henze, A probabilistic representation of the skew-normal distribution, Scand. J. Statist. 13 (1986), pp. 271-275]. Our algorithm has only one Metropolis-Hastings step, in contrast to the algorithm developed by Azevedo et al., which has two such steps. This not only makes the implementation easier but also reduces the number of proposal densities to be used, which can be a problem in the implementation of MHWGS algorithms, as can be seen in [R.J. Patz and B.W. Junker, A straightforward approach to Markov Chain Monte Carlo methods for item response models, J. Educ. Behav. Stat. 24(2) (1999), pp. 146-178; R. J. Patz and B. W. Junker, The applications and extensions of MCMC in IRT: Multiple item types, missing data, and rated responses, J. Educ. Behav. Stat. 24(4) (1999), pp. 342-366; A. Gelman, G.O. Roberts, and W.R. Gilks, Efficient Metropolis jumping rules, Bayesian Stat. 5 (1996), pp. 599-607]. Moreover, we consider a modified beta prior (which generalizes the one considered in [3]) and a Jeffreys prior for the asymmetry parameter. Furthermore, we study the sensitivity of such priors as well as the use of different kernel densities for this parameter. Finally, we assess the impact of the number of examinees, the number of items and the asymmetry level on parameter recovery. Results of the simulation study indicated that our approach performed as well as that in [3] in terms of parameter recovery, mainly when using the Jeffreys prior. They also indicated that the asymmetry level has the highest impact on parameter recovery, even though it is relatively small. A real data analysis is considered jointly with the development of model fitting assessment tools, and the results are compared with those obtained by Azevedo et al. The results indicate that using the hierarchical approach allows us to implement MCMC algorithms more easily, facilitates convergence diagnostics, and can be very useful for fitting more complex skew IRT models.
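
Henze's stochastic representation, for the skew-normal in its direct parameterization, draws X = δ|Z0| + √(1 − δ²)·Z1 with δ = λ/√(1 + λ²) and Z0, Z1 independent standard normals; the centred-parameterization version used by the authors adds a reparameterization step not shown in this sketch:

```python
import numpy as np

def rskewnorm(lam, size, rng):
    """Henze's stochastic representation of the skew-normal SN(lam)
    (direct parameterization): X = delta*|Z0| + sqrt(1 - delta^2)*Z1,
    with delta = lam / sqrt(1 + lam^2) and Z0, Z1 iid standard normal."""
    delta = lam / np.sqrt(1.0 + lam**2)
    z0 = np.abs(rng.standard_normal(size))   # half-normal component
    z1 = rng.standard_normal(size)
    return delta * z0 + np.sqrt(1.0 - delta**2) * z1

# Moment check against the known SN(lam) mean and variance.
rng = np.random.default_rng(0)
lam = 5.0
delta = lam / np.sqrt(1.0 + lam**2)
x = rskewnorm(lam, 200_000, rng)
```

The sample mean and variance converge to δ√(2/π) and 1 − 2δ²/π, the known moments of SN(λ). It is this conditional, two-stage structure (a latent half-normal, then a normal) that lets the authors' MCMC scheme replace one Metropolis-Hastings step with a Gibbs draw.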

Relevance: 20.00%

Abstract:

Using the density matrix renormalization group, we calculated the finite-size corrections of the entanglement alpha-Renyi entropy of a single interval for several critical quantum chains. We considered models with U(1) symmetry, such as the spin-1/2 XXZ and spin-1 Fateev-Zamolodchikov models, as well as models with discrete symmetries, such as the Ising, Blume-Capel, and three-state Potts models. These corrections contain physically relevant information. Their amplitudes, which depend on the value of alpha, are related to the dimensions of operators in the conformal field theory governing the long-distance correlations of the critical quantum chains. The obtained results, together with earlier exact and numerical ones, allow us to formulate some general conjectures about the operator responsible for the leading finite-size correction of the alpha-Renyi entropies. We conjecture that the exponent of the leading finite-size correction of the alpha-Renyi entropies is p(alpha) = 2X_epsilon/alpha for alpha > 1 and p(1) = nu, where X_epsilon denotes the dimension of the energy operator of the model and nu = 2 for all the models.
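
The alpha-Renyi entropy itself follows directly from the eigenvalues of the reduced density matrix of the interval (e.g. as obtained from a DMRG calculation); a minimal sketch, using natural logarithms:

```python
import numpy as np

def renyi_entropy(lam, alpha):
    """alpha-Renyi entropy from the eigenvalues lam of a reduced
    density matrix: S_alpha = ln(sum_i lam_i^alpha) / (1 - alpha);
    the alpha -> 1 limit is the von Neumann entropy."""
    lam = np.asarray(lam, dtype=float)
    lam = lam[lam > 1e-15]          # drop numerically zero weights
    if np.isclose(alpha, 1.0):
        return float(-np.sum(lam * np.log(lam)))
    return float(np.log(np.sum(lam**alpha)) / (1.0 - alpha))
```

For a maximally mixed two-level spectrum (0.5, 0.5), every alpha gives ln 2; in the chains studied above it is the alpha-dependent subleading corrections to such values that encode the operator dimensions.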

Relevance: 20.00%

Abstract:

The nonequilibrium stationary state of an irreversible spherical model is investigated on hypercubic lattices. The model is defined by Langevin equations similar to those of the reversible case, but with asymmetric transition rates. Although the model is irreversible, we have succeeded in finding an explicit form for the stationary probability distribution, which turns out to be of the Boltzmann-Gibbs type. This enables one to evaluate the exact form of the entropy production rate at the stationary state, which is non-zero whenever the dynamical rules of the transition rates are asymmetric.

Relevance: 20.00%

Abstract:

The mechanisms responsible for containing activity in systems represented by networks are crucial in various phenomena, for example, in diseases such as epilepsy that affect neuronal networks, and for information dissemination in social networks. The first models to account for contained activity included triggering and inhibition processes, but they cannot be applied to social networks, where inhibition is clearly absent. A recent model showed that contained activity can be achieved with no need for inhibition processes, provided that the network is subdivided into modules (communities). In this paper, we introduce a new concept, inspired by Hebbian theory, through which containment of activity is achieved by incorporating a decaying-activity dynamics into a random walk mechanism preferential to node activity. Upon selecting the decay coefficient within a proper range, we observed sustained activity in all the networks tested, namely random, Barabasi-Albert, and geographical networks. The generality of this finding was confirmed by showing that modularity is no longer needed if the integrate-and-fire dynamics incorporates the decay factor. Taken together, these results provide a proof of principle that persistent, restrained network activation can occur in the absence of any particular topological structure. This may be the reason why neuronal activity does not spread out to the entire neuronal network, even when no special topological organization exists.
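
A toy version of the proposed mechanism — a random walk preferential to node activity, with multiplicative decay — can be sketched as below. The activity-update rule, decay value, and small additive term are assumptions for illustration, not the paper's exact model:

```python
import numpy as np

def simulate_activity(adj, decay=0.9, steps=2000, seed=0):
    """Random walk preferential to node activity, with multiplicative
    decay: the walker moves to a neighbour with probability proportional
    to its activity, the visited node is boosted, and every activity is
    damped by `decay` each step. Returns the visit frequencies."""
    rng = np.random.default_rng(seed)
    n = adj.shape[0]
    activity = np.ones(n)
    visits = np.zeros(n)
    node = 0
    for _ in range(steps):
        nbrs = np.flatnonzero(adj[node])
        w = activity[nbrs] + 1e-6          # keep probabilities well defined
        node = rng.choice(nbrs, p=w / w.sum())
        activity *= decay                  # the decay contains the activity
        activity[node] += 1.0              # firing boosts the visited node
        visits[node] += 1.0
    return visits / steps

# A 10-node ring: no modular structure, yet activity stays contained,
# since the decay bounds any node's activity by 1/(1 - decay).
ring = np.zeros((10, 10))
for i in range(10):
    ring[i, (i + 1) % 10] = ring[(i + 1) % 10, i] = 1.0
freq = simulate_activity(ring)
```

The multiplicative decay caps the total activity regardless of topology, which is the paper's central point: containment without either inhibition or community structure.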