33 results for TSALLIS ENTROPY


Relevance:

10.00%

Abstract:

We studied the distribution of Palearctic green toads (Bufo viridis subgroup), an anuran species group with three ploidy levels inhabiting the Central Asian Amudarya River drainage. Various approaches (one-way and multivariate analyses, variance components analyses, and maximum entropy modelling) were used to estimate the effect of altitude, precipitation, temperature and vegetation cover on the distribution of toads. It is usually assumed that polyploid species occur in regions with harsher climatic conditions (higher latitudes, elevations, etc.), but for the green toad complex we revealed a more intricate situation. The diploid species (Bufo shaartusiensis and Bufo turanensis) inhabit the arid lowlands (from 44 to 789 m a.s.l.), while the tetraploid Bufo pewzowi was recorded in mountainous regions (340-3492 m a.s.l.) with usually lower temperatures and higher precipitation rates than in the region inhabited by the diploid species. The triploid species Bufo baturae was found in the Pamirs (Tajikistan) at the highest altitudes (2503-3859 m a.s.l.), under the harshest climatic conditions.

Relevance:

10.00%

Abstract:

1. As trees in a given cohort progress through ontogeny, many individuals die. This risk of mortality is unevenly distributed across species because of many processes such as habitat filtering, interspecific competition and negative density dependence. Here, we predict and test the patterns that such ecological processes should inscribe on both species and phylogenetic diversity as plants recruit from saplings to the canopy. 2. We compared species and phylogenetic diversity of sapling and tree communities at two sites in French Guiana. We surveyed 2084 adult trees in four 1-ha tree plots and 943 saplings in sixteen 16-m² subplots nested within the tree plots. Species diversity was measured using Fisher's alpha (species richness) and Simpson's index (species evenness). Phylogenetic diversity was measured using Faith's phylogenetic diversity (phylogenetic richness) and Rao's quadratic entropy index (phylogenetic evenness). The phylogenetic diversity indices were inferred using four phylogenetic hypotheses: two based on rbcLa plastid DNA sequences obtained from the inventoried individuals with different branch lengths, a global phylogeny available from the Angiosperm Phylogeny Group, and a combination of both. 3. Taxonomic identification of the saplings was performed by combining morphological and DNA barcoding techniques using three plant DNA barcodes (psbA-trnH, rpoC1 and rbcLa). DNA barcoding enabled us to improve species assignment and to assign unidentified saplings to molecular operational taxonomic units. 4. Species richness was similar between saplings and trees, but in about half of our comparisons, species evenness was higher in trees than in saplings. This suggests that negative density dependence plays an important role during the sapling-to-tree transition. 5. Phylogenetic richness increased between saplings and trees in about half of the comparisons.
Phylogenetic evenness increased significantly between saplings and trees in a few cases (4 out of 16) and only with the most resolved phylogeny. These results suggest that negative density dependence operates largely independently of the phylogenetic structure of communities. 6. Synthesis. By contrasting species richness and evenness across size classes, we suggest that negative density dependence drives shifts in composition during the sapling-to-tree transition. In addition, we found little evidence for a change in phylogenetic diversity across age classes, suggesting that the observed patterns are not phylogenetically constrained.
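The two evenness measures named above can be computed directly from species abundances and a pairwise (e.g. phylogenetic) distance matrix. A minimal sketch, with function names of my own choosing and toy data rather than the French Guiana inventories:

```python
def simpson_diversity(abundances):
    """Simpson's index expressed as 1 - sum(p_i^2): higher = more even."""
    total = sum(abundances)
    return 1.0 - sum((n / total) ** 2 for n in abundances)

def rao_quadratic_entropy(abundances, distance):
    """Rao's Q = sum_ij d_ij * p_i * p_j, with d_ij a pairwise
    (e.g. phylogenetic) distance between species i and j."""
    total = sum(abundances)
    p = [n / total for n in abundances]
    q = 0.0
    for i in range(len(p)):
        for j in range(len(p)):
            q += distance[i][j] * p[i] * p[j]
    return q
```

Note that when every pair of distinct species is assigned distance 1, Rao's Q reduces exactly to Simpson's index, which is why the two are read as taxonomic and phylogenetic analogues.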

Relevance:

10.00%

Abstract:

The vast territories that were radioactively contaminated during the 1986 Chernobyl accident provide a substantial data set of radioactive monitoring data, which can be used for the verification and testing of the different spatial estimation (prediction) methods involved in risk assessment studies. Using the Chernobyl data set for such a purpose is motivated by its heterogeneous spatial structure (the data are characterized by large-scale correlations, short-scale variability, spotty features, etc.). The present work is concerned with the application of the Bayesian Maximum Entropy (BME) method to estimate the extent and the magnitude of the radioactive soil contamination by 137Cs due to the Chernobyl fallout. The powerful BME method allows rigorous incorporation of a wide variety of knowledge bases into the spatial estimation procedure, leading to informative contamination maps. Exact measurements ("hard" data) are combined with secondary information on local uncertainties (treated as "soft" data) to generate a science-based uncertainty assessment of soil contamination estimates at unsampled locations. BME describes uncertainty in terms of the posterior probability distributions generated across space, while no assumption about the underlying distribution is made and non-linear estimators are automatically incorporated. Traditional estimation variances based on the assumption of an underlying Gaussian distribution (analogous, e.g., to the kriging variance) can be derived as a special case of the BME uncertainty analysis. The BME estimates obtained using hard and soft data are compared with the BME estimates obtained using only hard data. The comparison involves both the accuracy of the estimation maps using the exact data and the assessment of the associated uncertainty using repeated measurements. Furthermore, a comparison of the spatial estimation accuracy obtained by the two methods was carried out using a validation data set of hard data.
Finally, a separate uncertainty analysis was conducted to evaluate the ability of the posterior probabilities to reproduce the distribution of the raw repeated measurements available in certain populated sites. The analysis illustrates the improvement in mapping accuracy obtained by adding soft data to the existing hard data and, in general, demonstrates that the BME method performs well both in terms of estimation accuracy and of estimation error assessment, which are both useful features for the Chernobyl fallout study.
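As a toy, single-location illustration of how a soft interval datum reshapes an estimate beyond what hard data alone provide: the actual BME formalism operates on the full spatial random field, while the function below, its Gaussian prior (standing in for a hard-data-based estimate) and the grid resolution are assumptions of mine.

```python
import math

def posterior_with_soft_interval(prior_mean, prior_sd, low, high, n=10001):
    """Grid-based posterior mean/sd for a value whose prior (from hard data)
    is Gaussian and for which a soft datum says the value lies in [low, high].
    A one-point toy: the soft interval truncates the prior, the posterior is
    renormalized, and its moments are computed numerically on a grid."""
    lo, hi = prior_mean - 6 * prior_sd, prior_mean + 6 * prior_sd
    step = (hi - lo) / (n - 1)
    xs = [lo + i * step for i in range(n)]
    w = [math.exp(-0.5 * ((x - prior_mean) / prior_sd) ** 2)
         if low <= x <= high else 0.0 for x in xs]
    z = sum(w)
    mean = sum(x * wi for x, wi in zip(xs, w)) / z
    var = sum((x - mean) ** 2 * wi for x, wi in zip(xs, w)) / z
    return mean, math.sqrt(var)
```

A symmetric interval around the prior mean leaves the estimate in place but shrinks its uncertainty; a one-sided interval shifts the estimate, mimicking how soft data sharpen contamination maps at unsampled locations.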

Relevance:

10.00%

Abstract:

Background and Aims: The NS5A protein of HCV is known to be involved in viral replication and assembly, and probably in the resistance to interferon-based therapy. Previous studies identified insertions or deletions of 1 to 12 nucleotides in several genomic regions. In a multicenter study (17 French and 1 Swiss virology laboratories), we identified for the first time a 31-amino-acid insertion, leading to a duplication of the V3 domain in the NS5A region, with a high prevalence. Quasispecies of each strain with the duplication were characterized and the inserted V3 domain was identified.
Methods: Between 2006 and 2008, 1067 patients chronically infected with genotype 1b HCV were consecutively included in the study. We first amplified the V3 region by RT-PCR to detect the duplication (919 samples successfully amplified). The entire NS5A region was then amplified, cloned and sequenced in strains bearing the duplication. V3 sequences (called R1 and R2) from each clone were analyzed with BioEdit and compared to a V3 consensus sequence (C) built from the Los Alamos Hepatitis C database. Entropy was determined at each position.
Results: V3 duplications were identified in 25 patients, representing a prevalence of 2.72%. We sequenced 2043 clones, of which 776 had a complete NS5A coding sequence (corresponding to a mean of 30 clones per patient). At the intra-individual level, 6 to 17 variants were identified per V3 region, with a maximum of 3 different amino acids. At the inter-individual level, a difference of 7 and 2 amino acids was observed between C and the R1 and R2 sequences, respectively. Moreover, few positions presented entropy higher than 1 (4 for R1, 2 for R2 and 2 for C). Among all the sequenced clones, more than 60% were defective viruses (partial NS5A fragment or stop codon).
Conclusions: We identified a duplication of the V3 domain in genotype 1b HCV with a high prevalence. The R2 domain, which was the most similar to the C region, is probably the "original" domain, whereas R1 would be the inserted domain. Phylogenetic analyses are in progress to confirm this hypothesis.
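The per-position entropy used to compare the R1, R2 and C sequences can be sketched as a generic Shannon-entropy scan over an alignment; the function name and the toy sequences are illustrative, not the study's pipeline.

```python
import math
from collections import Counter

def positional_entropy(sequences):
    """Shannon entropy (bits) at each column of an aligned set of
    equal-length sequences: a column where every clone shows the same
    residue scores 0; maximal variability scores log2(#symbols)."""
    length = len(sequences[0])
    out = []
    for pos in range(length):
        counts = Counter(seq[pos] for seq in sequences)
        n = sum(counts.values())
        h = -sum((c / n) * math.log2(c / n) for c in counts.values())
        out.append(h)
    return out
```

With this convention, "entropy higher than 1" flags columns where variability exceeds an even two-way split among the clones.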

Relevance:

10.00%

Abstract:

In this paper, we propose two active learning algorithms for semiautomatic definition of training samples in remote sensing image classification. Based on predefined heuristics, the classifier ranks the unlabeled pixels and automatically chooses those that are considered the most valuable for its improvement. Once the pixels have been selected, the analyst labels them manually and the process is iterated. Starting with a small and nonoptimal training set, the model itself builds the optimal set of samples which minimizes the classification error. We have applied the proposed algorithms to a variety of remote sensing data, including very high resolution and hyperspectral images, using support vector machines. Experimental results confirm the consistency of the methods. The required number of training samples can be reduced to 10% using the methods proposed, reaching the same level of accuracy as larger data sets. A comparison with a state-of-the-art active learning method, margin sampling, is provided, highlighting advantages of the methods proposed. The effect of spatial resolution and separability of the classes on the quality of the selection of pixels is also discussed.
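The ranking step shared by such heuristics can be sketched in a few lines: score each unlabeled sample by how close it sits to the decision boundary and hand the most ambiguous ones to the analyst. This mirrors margin sampling; the toy linear decision function in the test is a stand-in for a trained SVM's decision function, and the names are my own.

```python
def margin_sampling(unlabeled, decision_value, batch=5):
    """Rank unlabeled samples by |f(x)|, the magnitude of the classifier's
    decision value, and return the `batch` samples closest to the boundary,
    i.e. those considered most valuable for labeling."""
    ranked = sorted(unlabeled, key=lambda x: abs(decision_value(x)))
    return ranked[:batch]
```

In the iterative scheme described above, the analyst labels the returned batch, the classifier is retrained, and the selection is repeated until the accuracy plateaus.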

Relevance:

10.00%

Abstract:

Visual analysis of electroencephalography (EEG) background and reactivity during therapeutic hypothermia provides important outcome information, but is time-consuming and not always consistent between reviewers. Automated EEG analysis may help quantify the brain damage. Forty-six comatose patients undergoing therapeutic hypothermia after cardiac arrest were included in the study. EEG background was quantified with the burst-suppression ratio (BSR) and approximate entropy, both used to monitor anesthesia. Reactivity was detected through changes in the power spectrum of the signal before and after stimulation. The automatic results showed almost perfect agreement (discontinuity) to substantial agreement (background reactivity) with the visual scores of EEG-certified neurologists. The burst-suppression ratio was better suited than approximate entropy to distinguish a continuous EEG background from burst-suppression in this specific population. Automatic EEG background and reactivity measures were significantly related to good and poor outcome. We conclude that quantitative EEG measurements can provide promising information regarding the current state of the patient and the clinical outcome, but further work is needed before routine application in a clinical setting.
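The burst-suppression ratio itself is simple to compute: the fraction of the recording spent in suppression, i.e. in sufficiently long runs of low-amplitude samples. A hedged sketch, where the amplitude threshold and minimum run length are illustrative defaults, not the clinical settings used in the study:

```python
def burst_suppression_ratio(amplitudes, threshold=5.0, min_len=2):
    """Fraction of samples lying in suppression, defined as runs of at
    least `min_len` consecutive samples with |amplitude| < `threshold`.
    Runs shorter than `min_len` are treated as bursts, not suppression."""
    n = len(amplitudes)
    suppressed = [abs(a) < threshold for a in amplitudes]
    bsr_samples = 0
    i = 0
    while i < n:
        if suppressed[i]:
            j = i
            while j < n and suppressed[j]:
                j += 1            # extend the current low-amplitude run
            if j - i >= min_len:
                bsr_samples += j - i
            i = j
        else:
            i += 1
    return bsr_samples / n
```

A BSR near 0 corresponds to a continuous background; values approaching 1 indicate a heavily suppressed EEG.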

Relevance:

10.00%

Abstract:

DNA condensation observed in vitro with the addition of polyvalent counterions is due to intermolecular attractive forces. We introduce a quantitative model of these forces in a Brownian dynamics simulation in addition to a standard mean-field Poisson-Boltzmann repulsion. The comparison of a theoretical value of the effective diameter calculated from the second virial coefficient in cylindrical geometry with some experimental results allows a quantitative evaluation of the one-parameter attractive potential. We show afterward that with a sufficient concentration of divalent salt (typically approximately 20 mM MgCl₂), supercoiled DNA adopts a collapsed form where opposing segments of interwound regions present zones of lateral contact. However, under the same conditions the same plasmid without torsional stress does not collapse. The condensed molecules present coexisting open and collapsed plectonemic regions. Furthermore, simulations show that circular DNA in 50% methanol solutions with 20 mM MgCl₂ aggregates without the requirement of torsional energy. This confirms known experimental results. Finally, a simulated DNA molecule confined in a box of variable size also presents some local collapsed zones in 20 mM MgCl₂ above a critical concentration of the DNA. Conformational entropy reduction obtained either by supercoiling or by confinement seems thus to play a crucial role in all forms of condensation of DNA.

Relevance:

10.00%

Abstract:

In Neo-Darwinism, variation and natural selection are the two evolutionary mechanisms which propel biological evolution. Our previous article presented a histogram model [1] consisting of populations of individuals whose numbers changed under the influence of variation and/or fitness, the total population remaining constant. Individuals are classified into bins, and the content of each bin is calculated generation after generation by an Excel spreadsheet. Here, we apply the histogram model to a stable population with fitness F(1)=1.00 in which one or two fitter mutants emerge. In the first scenario, a single mutant whose fitness was greater than 1.00 emerged in the population. The simulations ended when the original population was reduced to a single individual. The histogram model was validated by the excellent agreement between its predictions and those of a classical continuous function (Eqn. 1) which predicts the number of generations needed for a favorable mutation to spread throughout a population. But in contrast to Eqn. 1, our histogram model is adaptable to more complex scenarios, as demonstrated here. In the second and third scenarios, the original population was present at time zero together with two mutants which differed from the original population by two higher and distinct fitness values. In the fourth scenario, the large original population was present at time zero together with one fitter mutant. After a number of generations, when the mutant offspring had multiplied, a second mutant was introduced whose fitness was even greater. The histogram model also allows Shannon entropy (SE) to be monitored continuously as the information content of the total population decreases or increases. The results of these simulations illustrate, in a graphically didactic manner, the influence of natural selection, operating through relative fitness, on the emergence and dominance of a fitter mutant.
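A minimal re-implementation of the first scenario, under assumptions of mine (two bins, a constant total population of 1000, fitness values 1.00 vs 1.10, and termination when the original population falls to one individual), showing selection driving the mutant to dominance while Shannon entropy can be monitored throughout:

```python
import math

def select_generation(pop, fitness):
    """One generation of selection over bins: each bin grows in proportion
    to its fitness, then the population is rescaled to a constant total."""
    grown = [n * f for n, f in zip(pop, fitness)]
    scale = sum(pop) / sum(grown)
    return [n * scale for n in grown]

def shannon_entropy(pop):
    """Shannon entropy (bits) of the bin occupancy distribution."""
    total = sum(pop)
    return -sum((n / total) * math.log2(n / total) for n in pop if n > 0)

# Scenario 1 (my toy parameters): one fitter mutant (1.10) invading a
# resident population (1.00); run until the resident is down to one.
pop, fit = [999.0, 1.0], [1.00, 1.10]
gens = 0
while pop[0] > 1.0:
    pop = select_generation(pop, fit)
    gens += 1
```

Along the way, `shannon_entropy(pop)` rises toward 1 bit as the two subpopulations approach parity and falls back toward 0 as the mutant takes over, which is the behaviour the abstract describes graphically.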

Relevance:

10.00%

Abstract:

Understanding the relative importance of historical and environmental processes in the structure and composition of communities is one of the longest quests in ecological research. Increasingly, researchers are relying on the functional and phylogenetic β-diversity of natural communities to provide concise explanations of the mechanistic basis of community assembly and the drivers of trait variation among species. The present study investigated how plant functional and phylogenetic β-diversity change along key environmental and spatial gradients in the Western Swiss Alps.
Methods: Using the quadratic diversity measure based on six functional traits (specific leaf area (SLA), leaf dry matter content (LDMC), plant height (H), leaf carbon content (C), leaf nitrogen content (N), and leaf carbon-to-nitrogen ratio (C/N)), alongside a species-resolved phylogenetic tree, we relate variations in climate, spatial geographic, land use and soil gradients to plant functional and phylogenetic turnover in mountain communities of the Western Swiss Alps.
Important findings: Our study highlights two main points. First, climate and land use factors play an important role in mountain plant community turnover. Second, the overlap between plant functional and phylogenetic turnover along these gradients correlates with the low phylogenetic signal in the traits, suggesting that in mountain landscapes trait lability is likely an important factor driving plant community assembly. Overall, we demonstrate the importance of climate and land use factors in plant functional and phylogenetic community turnover, and provide valuable complementary insights into understanding patterns of β-diversity along several ecological gradients.

Relevance:

10.00%

Abstract:

In Neo-Darwinism, variation and natural selection are the two evolutionary mechanisms which propel biological evolution. Our previous reports presented a histogram model to simulate the evolution of populations of individuals classified into bins according to an unspecified, quantifiable phenotypic character, and whose number in each bin changed generation after generation under the influence of fitness, while the total population was maintained constant. The histogram model also allowed Shannon entropy (SE) to be monitored continuously as the information content of the total population decreased or increased. Here, a simple Perl (Practical Extraction and Reporting Language) application was developed to carry out these computations, with the critical feature of an added random factor in the percent of individuals whose offspring moved to a vicinal bin. The results of the simulations demonstrate that the random factor mimicking variation increased considerably the range of values covered by Shannon entropy, especially when the percentage of changed offspring was high. This increase in information content is interpreted as facilitated adaptability of the population.
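The added random factor can be sketched as a post-selection step in which a random fraction of each bin's offspring moves to a vicinal bin. The uniform draw up to `pct_moved` and the ±1 neighbour choice below are my assumptions, not the published Perl code:

```python
import random

def vary_and_select(pop, fitness, pct_moved, rng):
    """One generation: fitness-proportional selection rescaled to a
    constant total, then a randomized fraction (up to `pct_moved`) of each
    bin's offspring moves to a randomly chosen vicinal bin, mimicking
    variation. Moves that would leave the histogram are discarded."""
    grown = [n * f for n, f in zip(pop, fitness)]
    scale = sum(pop) / sum(grown)
    pop = [n * scale for n in grown]
    moved = [0.0] * len(pop)
    for i, n in enumerate(pop):
        amount = n * rng.uniform(0.0, pct_moved)
        j = i + rng.choice((-1, 1))   # vicinal bin, left or right
        if 0 <= j < len(pop):
            moved[i] -= amount
            moved[j] += amount
    return [n + d for n, d in zip(pop, moved)]
```

Because the moves redistribute but never create or destroy individuals, the total population stays constant while the spread of bin occupancies (and hence Shannon entropy) fluctuates, which is the effect the abstract reports.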

Relevance:

10.00%

Abstract:

Hsp70s are highly conserved ATPase molecular chaperones mediating the correct folding of de novo synthesized proteins, the translocation of proteins across membranes, the disassembly of some native protein oligomers, and the active unfolding and disassembly of stress-induced protein aggregates. Here, we present thermodynamic arguments and biochemical evidence for a unifying mechanism named entropic pulling, based on the entropy loss due to excluded-volume effects, by which Hsp70 molecules can convert the energy of ATP hydrolysis into a force capable of accelerating the local unfolding of various protein substrates and, thus, perform disparate cellular functions. By means of entropic pulling, individual Hsp70 molecules can accelerate the unfolding and pulling of translocating polypeptides into mitochondria in the absence of a molecular fulcrum, thus settling former contradictions between the power-stroke and Brownian-ratchet models of Hsp70-mediated protein translocation across membranes. Moreover, in a very different context devoid of membranes and components of the import pore, the same physical principles apply to the forceful unfolding, solubilization, and assisted native refolding of stable protein aggregates by individual Hsp70 molecules, thus providing a mechanism for Hsp70-mediated protein disaggregation.

Relevance:

10.00%

Abstract:

Hsp70s are conserved molecular chaperones that can prevent protein aggregation, actively unfold and solubilize aggregates, pull translocating proteins across membranes and remodel native protein complexes. Disparate mechanisms have been proposed for the various modes of Hsp70 action: passive prevention of aggregation by kinetic partitioning, peptide-bond isomerization, Brownian ratcheting or active power-stroke pulling. Recently, we put forward a unifying mechanism named 'entropic pulling', which proposes that Hsp70 uses the energy of ATP hydrolysis to recruit a force of entropic origin to locally unfold aggregates or pull proteins across membranes. The entropic pulling mechanism reproduces the expected phenomenology that inspired the other, disparate mechanisms and is, moreover, simple.

Relevance:

10.00%

Abstract:

A wide range of modelling algorithms is used by ecologists, conservation practitioners, and others to predict species ranges from point locality data. Unfortunately, the amount of data available is limited for many taxa and regions, making it essential to quantify the sensitivity of these algorithms to sample size. This is the first study to address this need by rigorously evaluating a broad suite of algorithms with independent presence-absence data from multiple species and regions. We evaluated predictions from 12 algorithms for 46 species (from six different regions of the world) at three sample sizes (100, 30, and 10 records). We used data from natural history collections to run the models, and evaluated the quality of model predictions with area under the receiver operating characteristic curve (AUC). With decreasing sample size, model accuracy decreased and variability increased across species and between models. Novel modelling methods that incorporate both interactions between predictor variables and complex response shapes (i.e. GBM, MARS-INT, BRUTO) performed better than most methods at large sample sizes but not at the smallest sample sizes. Other algorithms were much less sensitive to sample size, including an algorithm based on maximum entropy (MAXENT) that had among the best predictive power across all sample sizes. Relative to other algorithms, a distance metric algorithm (DOMAIN) and a genetic algorithm (OM-GARP) had intermediate performance at the largest sample size and among the best performance at the lowest sample size. No algorithm predicted consistently well with small sample size (n < 30) and this should encourage highly conservative use of predictions based on small sample size and restrict their use to exploratory modelling.
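AUC as used above has a convenient rank interpretation: the probability that a randomly chosen presence record is scored above a randomly chosen absence, with ties counting one half. A minimal sketch via the Mann-Whitney U statistic (function name mine):

```python
def auc(labels, scores):
    """Area under the ROC curve from the Mann-Whitney U statistic:
    the probability that a random positive (presence) scores above a
    random negative (absence); ties contribute 0.5."""
    pos = [s for l, s in zip(labels, scores) if l == 1]
    neg = [s for l, s in zip(labels, scores) if l == 0]
    wins = sum(1.0 if p > n else 0.5 if p == n else 0.0
               for p in pos for n in neg)
    return wins / (len(pos) * len(neg))
```

An AUC of 0.5 corresponds to a model no better than chance, which is the baseline against which the small-sample (n < 10) predictions degrade.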

Relevance:

10.00%

Abstract:

The quality of environmental data analysis and the propagation of errors are heavily affected by the representativity of the initial sampling design [CRE 93, DEU 97, KAN 04a, LEN 06, MUL 07]. Geostatistical methods such as kriging rely on field samples, whose spatial distribution is crucial for the correct detection of the phenomena. The literature on the design of environmental monitoring networks (MN) is extensive, and several interesting books have recently been published [GRU 06, LEN 06, MUL 07] to clarify the basic principles of spatial sampling design; an approach to monitoring network optimization based on Support Vector Machines has also been proposed. Nonetheless, modellers often receive real data from environmental monitoring networks that suffer from problems of non-homogeneity (clustering). Clustering can be related to preferential sampling or to the impossibility of reaching certain regions.
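A standard remedy for such clustering is cell declustering: samples that share a grid cell are down-weighted so that dense clusters do not dominate global statistics. A sketch under assumptions of mine (2-D points, a single cell size, which in practice is itself a tuning parameter):

```python
from collections import Counter

def cell_declustering_weights(points, cell_size):
    """Cell-declustering weights: each sample is weighted by the inverse
    of the number of samples sharing its grid cell, then the weights are
    normalized to sum to the number of samples."""
    cells = [(int(x // cell_size), int(y // cell_size)) for x, y in points]
    counts = Counter(cells)
    raw = [1.0 / counts[c] for c in cells]
    scale = len(points) / sum(raw)
    return [w * scale for w in raw]
```

Samples from a tight cluster end up with smaller weights than an isolated sample, partially correcting for preferential sampling when computing global means or histograms.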

Relevance:

10.00%

Abstract:

BACKGROUND: So far, none of the existing treatments of Murray's law deals with the non-Newtonian behavior of blood flow, although the non-Newtonian approach to blood flow modelling is more accurate. MODELING: In the present paper, Murray's law, which is applicable to an arterial bifurcation, is generalized to a non-Newtonian blood flow model (power-law model). When the vessel size reaches the capillary limit, blood can be modeled using a non-Newtonian constitutive equation. Two different constraints are assumed in addition to the pumping power: the volume constraint or the surface constraint (related to the internal surface of the vessel). For the sake of generality, the relationships are given for an arbitrary number of daughter vessels. It is shown that for a cost function including the volume constraint, the classical Murray's law remains valid (i.e. ΣR^c = const with c = 3, independent of n, the dimensionless index in the viscosity equation; R being the radius of the vessel). On the contrary, for a cost function including the surface constraint, different values of c may be calculated depending on the value of n. RESULTS: We find that c varies for blood from 2.42 to 3 depending on the constraint and the fluid properties. For the Newtonian model, the surface constraint leads to c = 2.5. The cost function (based on the surface constraint) can be related to entropy generation by dividing it by the temperature. CONCLUSION: It is demonstrated that the entropy generated in all the daughter vessels is greater than the entropy generated in the parent vessel. Furthermore, it is shown that the difference in entropy generation between the parent and daughter vessels is smaller for a non-Newtonian fluid than for a Newtonian one.
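The exponent c in ΣR^c = const can be recovered numerically from measured radii: find the c at which the daughters' radii raised to c sum to the parent's. A sketch by bisection (function name, bracketing interval and tolerance are my choices; it assumes the daughter radii are smaller than the parent's, so the gap is monotone in c):

```python
def murray_exponent(parent_r, daughter_rs, tol=1e-6):
    """Solve sum(r_i^c) = R^c for the exponent c by bisection.
    Classical Murray's law corresponds to c = 3 (volume constraint);
    the surface constraint can yield c between roughly 2.42 and 3."""
    def gap(c):
        return sum(r ** c for r in daughter_rs) - parent_r ** c
    lo, hi = 1.0, 10.0            # gap(lo) > 0 > gap(hi) for r_i < R
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if gap(mid) > 0:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2
```

For a symmetric bifurcation obeying classical Murray's law, each daughter radius is 2^(-1/3) of the parent's, and the recovered exponent is 3.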