965 results for Boolean Functions, Nonlinearity, Evolutionary Computation, Equivalence Classes


Relevance:

30.00%

Publisher:

Abstract:

It has been proved, for several classes of continuous and discrete dynamical systems, that the presence of a positive (resp. negative) circuit in the interaction graph of a system is a necessary condition for the presence of multiple stable states (resp. a cyclic attractor). A positive (resp. negative) circuit is said to be functional when it "generates" several stable states (resp. a cyclic attractor). However, there are no definite mathematical frameworks translating the underlying meaning of "generates." Focusing on Boolean networks, we recall and propose some definitions concerning the notion of functionality along with associated mathematical results.
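The circuit-functionality statements summarized above can be made concrete with a toy network. The following sketch is illustrative only (it assumes a synchronous Boolean update scheme and is not taken from the paper): a two-node network whose interaction graph carries a positive circuit admits two stable states, while the corresponding negative circuit admits none, only a cyclic attractor.

    # Minimal sketch (assumption: synchronous Boolean dynamics; not from the paper).
    # Positive circuit: x and y activate each other -> two stable states.
    from itertools import product

    def step_pos(state):
        x, y = state
        return (y, x)                  # x' = y, y' = x (mutual activation)

    print([s for s in product((0, 1), repeat=2) if step_pos(s) == s])  # [(0, 0), (1, 1)]

    # Negative circuit: x activates y, y inhibits x -> no stable state, only the
    # cyclic attractor (0,0) -> (0,1) -> (1,1) -> (1,0) -> (0,0).
    def step_neg(state):
        x, y = state
        return (y, 1 - x)

    print([s for s in product((0, 1), repeat=2) if step_neg(s) == s])  # []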

Relevance:

30.00%

Publisher:

Abstract:

This paper is concerned with the realism of mechanisms that implement social choice functions in the traditional sense. Will agents actually play the equilibrium assumed by the analysis? As an example, we study the convergence and stability properties of Sjöström's (1994) mechanism, on the assumption that boundedly rational players find their way to equilibrium using monotonic learning dynamics and also with fictitious play. This mechanism implements most social choice functions in economic environments using as a solution concept the iterated elimination of weakly dominated strategies (only one round of deletion of weakly dominated strategies is needed). There are, however, many sets of Nash equilibria whose payoffs may be very different from those desired by the social choice function. With monotonic dynamics we show that many equilibria in all the sets of equilibria we describe are the limit points of trajectories that have completely mixed initial conditions. The initial conditions that lead to these equilibria need not be very close to the limiting point. Furthermore, even if the dynamics converge to the "right" set of equilibria, they can still converge to quite a poor outcome in welfare terms. With fictitious play, if the agents have completely mixed prior beliefs, beliefs and play converge to the outcome the planner wants to implement.
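As a toy illustration of the learning dynamics invoked above, the sketch below runs plain two-player fictitious play in a 2x2 common-interest game; it is not the Sjöström mechanism, and all payoffs are invented. Each player best-responds to the empirical frequency of the opponent's past actions, starting from completely mixed (uniform) prior beliefs.

    # Fictitious play in a 2x2 common-interest game (illustrative only).
    import numpy as np

    M = np.array([[2.0, 0.0],
                  [0.0, 1.0]])              # payoff to both players

    row_counts = np.ones(2)                 # column player's prior counts over row actions
    col_counts = np.ones(2)                 # row player's prior counts over column actions
    for _ in range(200):
        q = col_counts / col_counts.sum()   # row player's belief about the column player
        p = row_counts / row_counts.sum()   # column player's belief about the row player
        a_row = int(np.argmax(M @ q))       # row player's best reply
        a_col = int(np.argmax(M.T @ p))     # column player's best reply
        row_counts[a_row] += 1
        col_counts[a_col] += 1

    # Empirical frequencies converge to the pure equilibrium (action 0, action 0).
    print(row_counts / row_counts.sum(), col_counts / col_counts.sum())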

Relevance:

30.00%

Publisher:

Abstract:

Gene copies that stem from the mRNAs of parental source genes have long been viewed as evolutionary dead-ends with little biological relevance. Here we review a range of recent studies that have unveiled a significant number of functional retroposed gene copies in both mammalian and some non-mammalian genomes. These studies have not only revealed previously unknown mechanisms for the emergence of new genes and their functions but have also provided fascinating general insights into molecular and evolutionary processes that have shaped genomes. For example, analyses of chromosomal gene movement patterns via RNA-based gene duplication have shed fresh light on the evolutionary origin and biology of our sex chromosomes.

Relevance:

30.00%

Publisher:

Abstract:

Insect gustatory and odorant receptors (GRs and ORs) form a superfamily of novel transmembrane proteins, which are expressed in chemosensory neurons that detect environmental stimuli. Here we identify homologues of GRs (Gustatory receptor-like (Grl) genes) in genomes across Protostomia, Deuterostomia and non-Bilateria. Surprisingly, two Grls in the cnidarian Nematostella vectensis, NvecGrl1 and NvecGrl2, are expressed early in development, in the blastula and gastrula, but not at later stages when a putative chemosensory organ forms. NvecGrl1 transcripts are detected around the aboral pole, considered the equivalent to the head-forming region of Bilateria. Morpholino-mediated knockdown of NvecGrl1 causes developmental patterning defects of this region, leading to animals lacking the apical sensory organ. A deuterostome Grl from the sea urchin Strongylocentrotus purpuratus displays similar patterns of developmental expression. These results reveal an early evolutionary origin of the insect chemosensory receptor family and raise the possibility that their ancestral role was in embryonic development.

Relevance:

30.00%

Publisher:

Abstract:

Pedotransfer functions (PTFs) were developed to estimate the parameters (α, n, θr and θs) of the van Genuchten (1980) model describing soil water retention curves. The data came from various sources, mainly from studies conducted by universities in Northeast Brazil, by the Brazilian Agricultural Research Corporation (Embrapa) and by the corporation for the development of the São Francisco and Parnaíba river basins (Codevasf), totaling 786 retention curves, which were divided into two data sets: 85 % for the development of the PTFs and 15 % for testing and validation, treated as independent data. Aside from general PTFs developed for all soils together, specific PTFs were developed for the soil classes Ultisols, Oxisols, Entisols, and Alfisols by multiple regression, using a stepwise procedure (forward and backward) to select the best predictors. Two types of PTFs were developed: the first included all predictors (bulk density and the proportions of sand, silt, clay, and organic matter), and the second only the proportions of sand, silt and clay. The adequacy of the PTFs was evaluated by the correlation coefficient (R) and the Willmott index (d). To evaluate the PTFs for the moisture content at specific pressure heads, we used the root mean square error (RMSE). The PTF-based prediction of the retention curve was relatively poor, except for the residual water content. The inclusion of organic matter as a PTF predictor improved the prediction of the van Genuchten parameter α. The performance of the soil-class-specific PTFs was not better than that of the general PTFs. Except for the water content of saturated soil estimated from particle size distribution, the tested models for predicting water content at specific pressure heads proved satisfactory. Predictions of water content at pressure heads more negative than -0.6 m, using a PTF based on particle size distribution only, are only slightly lower than those obtained with PTFs that include bulk density and organic matter content.
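For reference, the van Genuchten (1980) retention model whose parameters (α, n, θr, θs) the PTFs estimate is, in its usual form with the Mualem restriction m = 1 - 1/n,

    \theta(h) = \theta_r + \frac{\theta_s - \theta_r}{\left[1 + (\alpha |h|)^{n}\right]^{m}}, \qquad m = 1 - \frac{1}{n},

where \theta(h) is the volumetric water content at pressure head h.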

Relevance:

30.00%

Publisher:

Abstract:

Studies on water retention and availability are scarce for subtropical or humid temperate climate regions of the southern hemisphere. The aims of this study were to evaluate the relations of the soil physical, chemical, and mineralogical properties with water retention and availability for the generation and validation of continuous point pedotransfer functions (PTFs) for soils of the State of Santa Catarina (SC) in the South of Brazil. Horizons of 44 profiles were sampled in areas under different cover crops and regions of SC, to determine: field capacity (FC, 10 kPa), permanent wilting point (PWP, 1,500 kPa), available water content (AW, by difference), saturated hydraulic conductivity, bulk density, aggregate stability, particle size distribution (seven classes), organic matter content, and particle density. Chemical and mineralogical properties were obtained from the literature. Spearman's rank correlation analysis and path analysis were used in the statistical analyses. The point PTFs for estimation of FC, PWP and AW were generated for the soil surface and subsurface through multiple regression analysis, followed by robust regression analysis, using two sets of predictive variables. Soils with finer texture and/or greater organic matter content retain more moisture, and organic matter is the property that mainly controls the water availability to plants in soil surface horizons. Path analysis was useful in understanding the relationships between soil properties for FC, PWP and AW. The predictive power of the generated PTFs to estimate FC and PWP was good for all horizons, while AW was best estimated by more complex models with better prediction for the surface horizons of soils in Santa Catarina.
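The continuous point PTFs described above are, at their core, multiple regressions of a retention point on basic soil properties. A minimal sketch of the idea follows; the data and resulting coefficients are invented and have nothing to do with the Santa Catarina PTFs.

    # Minimal point-PTF sketch (illustrative; data are made up).
    # Field capacity (FC, m3/m3 at 10 kPa) regressed on clay, organic matter
    # and bulk density by ordinary least squares.
    import numpy as np

    # columns: clay (%), organic matter (%), bulk density (Mg/m3)
    X = np.array([[45.0, 2.5, 1.20],
                  [20.0, 1.0, 1.45],
                  [60.0, 3.5, 1.10],
                  [10.0, 0.5, 1.55],
                  [35.0, 4.0, 1.15],
                  [25.0, 1.5, 1.40]])
    fc = np.array([0.38, 0.24, 0.42, 0.15, 0.36, 0.27])   # measured FC

    A = np.column_stack([np.ones(len(fc)), X])            # add intercept
    coef, *_ = np.linalg.lstsq(A, fc, rcond=None)         # fit the point PTF
    rmse = np.sqrt(np.mean((fc - A @ coef) ** 2))         # error measure used for validation
    print(np.round(coef, 4), round(float(rmse), 4))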

Relevance:

30.00%

Publisher:

Abstract:

Preface. The starting point for this work, and eventually the subject of the whole thesis, was the question: how to estimate the parameters of affine stochastic volatility jump-diffusion models. These models are very important for contingent claim pricing. Their major advantage, the availability of analytical solutions for characteristic functions, made them the models of choice for many theoretical constructions and practical applications. At the same time, estimation of the parameters of stochastic volatility jump-diffusion models is not a straightforward task. The problem comes from the variance process, which is not observable. There are several estimation methodologies that deal with the estimation of latent variables. One appeared particularly interesting: it proposes an estimator that, in contrast to the other methods, requires neither discretization nor simulation of the process: the Continuous Empirical Characteristic Function estimator (ECF) based on the unconditional characteristic function. However, the procedure had been derived only for stochastic volatility models without jumps. Thus, it became the subject of my research. This thesis consists of three parts. Each one is written as an independent and self-contained article. At the same time, the questions answered by the second and third parts of this work arise naturally from the issues investigated and the results obtained in the first one. The first chapter is the theoretical foundation of the thesis. It proposes an estimation procedure for stochastic volatility models with jumps in both the asset price and the variance processes. The estimation procedure is based on the joint unconditional characteristic function of the stochastic process. The major analytical result of this part, as well as of the whole thesis, is the closed-form expression for the joint unconditional characteristic function for stochastic volatility jump-diffusion models. The empirical part of the chapter suggests that, besides stochastic volatility, jumps in both the mean and the volatility equation are relevant for modelling returns of the S&P500 index, which was chosen as a general representative of the stock asset class. Hence, the next question is: which jump process should be used to model returns of the S&P500? The decision about the jump process in the framework of affine jump-diffusion models boils down to defining the intensity of the compound Poisson process, either a constant or some function of the state variables, and to choosing the distribution of the jump size. While the jump in the variance process is usually assumed to be exponential, there are at least three distributions of the jump size currently used for asset log-prices: normal, exponential and double exponential. The second part of this thesis shows that normal jumps in the asset log-returns should be used if the S&P500 index is to be modelled by a stochastic volatility jump-diffusion model. This is a surprising result: the exponential distribution has fatter tails, and for this reason either the exponential or the double exponential jump size was expected to provide the best fit of the stochastic volatility jump-diffusion models to the data. The idea of testing the efficiency of the Continuous ECF estimator on simulated data had already appeared when the first estimation results of the first chapter were obtained: in the absence of a benchmark or any ground for comparison, it is unreasonable to be sure that our parameter estimates coincide with the true parameters of the models.
The conclusion of the second chapter provides one more reason to perform that kind of test. Thus, the third part of this thesis concentrates on the estimation of the parameters of stochastic volatility jump-diffusion models on the basis of asset price time series simulated from various "true" parameter sets. The goal is to show that the Continuous ECF estimator based on the joint unconditional characteristic function is capable of recovering the true parameters, and the third chapter shows that our estimator indeed has the ability to do so. Once it is clear that the Continuous ECF estimator based on the unconditional characteristic function works, the next question arises immediately: can the computational effort be reduced without affecting the efficiency of the estimator, or can the efficiency of the estimator be improved without dramatically increasing the computational burden? The efficiency of the Continuous ECF estimator depends on the number of dimensions of the joint unconditional characteristic function used in its construction. Theoretically, the more dimensions there are, the more efficient the estimation procedure is. In practice, however, this relationship is not so straightforward, due to the increasing computational difficulties. The second chapter, for example, in addition to the choice of the jump process, discusses the possibility of using the marginal, i.e. one-dimensional, unconditional characteristic function in the estimation instead of the joint, bi-dimensional, unconditional characteristic function. As a result, the preference for one or the other depends on the model to be estimated; thus, the computational effort can be reduced in some cases without affecting the efficiency of the estimator. Improving the estimator's efficiency by increasing its dimensionality faces more difficulties. The third chapter of this thesis, in addition to what was discussed above, compares the performance of estimators based on bi- and three-dimensional unconditional characteristic functions on the simulated data. It shows that the theoretical efficiency of the Continuous ECF estimator based on the three-dimensional unconditional characteristic function is not attainable in practice, at least for the moment, due to limitations on the computing power and optimization toolboxes available to the general public. Thus, the Continuous ECF estimator based on the joint, bi-dimensional, unconditional characteristic function has every reason to exist and to be used for the estimation of the parameters of stochastic volatility jump-diffusion models.
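For orientation only (this is the generic construction, not the closed-form result derived in the thesis), the continuous ECF approach matches the model's unconditional characteristic function to its empirical counterpart over a continuum of arguments:

    \hat{\varphi}_n(u) = \frac{1}{n} \sum_{j=1}^{n} e^{\, i\, u^{\top} X_j}, \qquad
    \hat{\theta} = \arg\min_{\theta} \int \left| \hat{\varphi}_n(u) - \varphi_{\theta}(u) \right|^{2} w(u)\, du,

where the X_j are blocks of observed returns (pairs for the bi-dimensional version, triples for the three-dimensional one), \varphi_{\theta} is the model's joint unconditional characteristic function and w is a weighting function.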

Relevance:

30.00%

Publisher:

Abstract:

Background: The degree of metal-binding specificity in metalloproteins such as metallothioneins (MTs) can be crucial for their functional accuracy. Unlike most other animal species, pulmonate molluscs possess homometallic MT isoforms loaded with Cu+ or Cd2+. They have, so far, been obtained as native metal-MT complexes from snail tissues, where they are involved in the metabolism of the metal ion species bound to the respective isoform. However, it has not yet been discerned whether their specific metal occupation is the result of a rigid control of metal availability, of isoform expression programming in the host tissues, or of structural differences of the respective peptides determining the coordination options for the different metal ions. In this study, the Roman snail (Helix pomatia) Cu-loaded and Cd-loaded isoforms (HpCuMT and HpCdMT) were used as model molecules in order to elucidate the biochemical and evolutionary mechanisms permitting pulmonate MTs to achieve specificity for their cognate metal ion. Results: HpCuMT and HpCdMT were recombinantly synthesized in the presence of Cd2+, Zn2+ or Cu2+, and the corresponding metal complexes were analysed by electrospray mass spectrometry, circular dichroism (CD) and ultraviolet-visible (UV-Vis) spectrophotometry. Both MT isoforms were only able to form unique, homometallic and stable complexes (Cd6-HpCdMT and Cu12-HpCuMT) with their cognate metal ions. Yeast complementation assays demonstrated that the two isoforms assumed metal-specific functions, in agreement with their binding preferences, in heterologous eukaryotic environments. In the snail organism, the functional metal specificity of HpCdMT and HpCuMT was supported by metal-specific transcription programming and cell-specific expression. Sequence elucidation and phylogenetic analysis of MT isoforms from a number of snail species revealed that they possess one unspecific and two metal-specific MT isoforms, whose metal specificity was achieved exclusively by evolutionary modulation of non-cysteine amino acid positions. Conclusion: The Roman snail HpCdMT and HpCuMT isoforms can thus be regarded as prototypes of isoform families that evolved genuine metal specificity within pulmonate molluscs. Diversification into these isoforms may have been initiated by gene duplication, followed by speciation and selection towards opposite needs for protecting copper-dominated metabolic pathways from nonessential cadmium. The mechanisms enabling these proteins to be metal-specific could also be relevant for other metalloproteins.

Relevance:

30.00%

Publisher:

Abstract:

In recent years, protein-ligand docking has become a powerful tool for drug development. Although several approaches suitable for high-throughput screening are available, there is a need for methods able to identify binding modes with high accuracy. This accuracy is essential to reliably compute the binding free energy of the ligand. Such methods are needed when the binding mode of lead compounds is not determined experimentally but is needed for structure-based lead optimization. We present here a new docking software, called EADock, that aims at this goal. It uses a hybrid evolutionary algorithm with two fitness functions, in combination with a sophisticated management of diversity. EADock is interfaced with the CHARMM package for energy calculations and coordinate handling. A validation was carried out on 37 crystallized protein-ligand complexes featuring 11 different proteins. The search space was defined as a sphere of 15 Å around the center of mass of the ligand position in the crystal structure, and, in contrast to other benchmarks, our algorithm was fed with optimized ligand positions up to 10 Å root mean square deviation (RMSD) from the crystal structure, excluding the latter. This validation illustrates the efficiency of our sampling strategy, as correct binding modes, defined by an RMSD to the crystal structure lower than 2 Å, were identified and ranked first for 68% of the complexes. The success rate increases to 78% when considering the five best-ranked clusters, and 92% when all clusters present in the last generation are taken into account. Most failures could be explained by the presence of crystal contacts in the experimental structure. Finally, the ability of EADock to accurately predict binding modes in a real application was illustrated by the successful docking of the RGD cyclic pentapeptide on the alphaVbeta3 integrin, starting far away from the binding pocket.
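For reference, the RMSD used above to decide whether a binding mode is correct is the standard deviation measure between a docked pose and the crystal pose; the function below is a generic definition, not EADock-specific code.

    # Heavy-atom RMSD between a docked pose and the crystal pose (in Å),
    # assuming the two coordinate arrays are atom-matched and expressed in the
    # same reference frame (no superposition), as is usual when the receptor
    # is kept fixed during docking.
    import numpy as np

    def rmsd(pose: np.ndarray, reference: np.ndarray) -> float:
        """pose, reference: (n_atoms, 3) arrays of Cartesian coordinates."""
        diff = pose - reference
        return float(np.sqrt((diff * diff).sum() / len(reference)))

    # A pose counts as a correct binding mode when rmsd(pose, crystal) < 2.0 Å.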

Relevance:

30.00%

Publisher:

Abstract:

Visual perception is initiated in the photoreceptor cells of the retina via the phototransduction system. This system has shown marked evolution during mammalian divergence in such complex attributes as activation time and recovery time. We have performed a molecular evolutionary analysis of proteins involved in mammalian phototransduction in order to unravel how the action of natural selection has been distributed throughout the system to evolve such traits. We found selective pressures to be non-randomly distributed according to both a simple protein classification scheme and a protein-interaction network representation of the signaling pathway. Proteins which are topologically central in the signaling pathway, such as the G proteins, as well as retinoid cycle chaperones and proteins involved in photoreceptor cell-type determination, were found to be more constrained in their evolution. Proteins peripheral to the pathway, such as ion channels and exchangers, as well as the retinoid cycle enzymes, have experienced a relaxation of selective pressures. Furthermore, signals of positive selection were detected in two genes: the short-wave (blue) opsin (OPN1SW) in hominids and the rod-specific Na+/Ca2+,K+ ion exchanger (SLC24A1) in rodents. The functions of the proteins involved in phototransduction and the topology of the interactions between them have imposed non-random constraints on their evolution. Thus, in shaping or conserving system-level phototransduction traits, natural selection has targeted the underlying proteins in a concerted manner.

Relevance:

30.00%

Publisher:

Abstract:

In plants, an oligogene family encodes NADP-malic enzymes (NADP-me), which are responsible for various functions and exhibit different kinetics and expression patterns. In particular, a chloroplast isoform of NADP-me plays a key role in one of the three biochemical subtypes of C4 photosynthesis, an adaptation to warm environments that evolved several times independently during angiosperm diversification. By combining genomic and phylogenetic approaches, this study aimed at identifying the molecular mechanisms linked to the recurrent evolution of C4-specific NADP-me in grasses (Poaceae). Genes encoding NADP-me (nadpme) were retrieved from the genomes of model grasses and isolated from a large sample of C3 and C4 grasses. Genomic and phylogenetic analyses showed that 1) the grass nadpme gene family is composed of four main lineages, one of which is expressed in plastids (nadpme-IV), 2) C4-specific NADP-me evolved at least five times independently from nadpme-IV, and 3) some codons driven by positive selection underwent parallel changes during the multiple C4 origins. Because the C4 NADP-me is expressed in chloroplasts, its recurrent evolution was probably constrained to start from the only plastidial nadpme lineage, and this common starting point limited the number of evolutionary paths toward a C4-optimized enzyme, resulting in genetic convergence. In light of the history of nadpme genes, an evolutionary scenario for the C4 phenotype using NADP-me is discussed.

Relevance:

30.00%

Publisher:

Abstract:

Given that retroposed copies of genes are presumed to lack the regulatory elements required for their expression, retroposition has long been considered a mechanism without functional relevance. However, through an in silico assay for transcriptional activity, we identify here >1,000 transcribed retrocopies in the human genome, of which at least approximately 120 have evolved into bona fide genes. Among these, approximately 50 retrogenes have evolved functions in testes, more than half of which were recruited as functional autosomal counterparts of X-linked genes during spermatogenesis. Generally, retrogenes emerge "out of the testis," because they are often initially transcribed in testis and later evolve stronger and sometimes more diverse spatial expression patterns. We find a significant excess of transcribed retrocopies close to other genes or within introns, suggesting that retrocopies can exploit the regulatory elements and/or open chromatin of neighboring genes to become transcribed. In direct support of this hypothesis, we identify 36 retrocopy-host gene fusions, including primate-specific chimeric genes. Strikingly, 27 intergenic retrogenes have acquired untranslated exons de novo during evolution to achieve high expression levels. Notably, our screen for highly transcribed retrocopies also uncovered a retrogene linked to a human recessive disorder, gelatinous drop-like corneal dystrophy, a form of blindness. These functional implications for retroposition notwithstanding, we find that the insertion of retrocopies into genes is generally deleterious, because it may interfere with the transcription of host genes. Our results demonstrate that natural selection has been fundamental in shaping the retrocopy repertoire of the human genome.

Relevance:

30.00%

Publisher:

Abstract:

Conservation biology is commonly associated with the protection of small, endangered populations. Nevertheless, overabundant or potentially expanding populations may also require management, to prevent the negative effects of overpopulation. As there are both qualitative and quantitative differences between protecting small populations and controlling large ones, distinct methods and models are needed. The aim of this work was to develop predictive models of large-population dynamics, as well as software tools to estimate the parameters of these models and to test management scenarios. The Alpine ibex (Capra ibex ibex), which has expanded markedly in Switzerland since its reintroduction at the beginning of the 20th century, was used as a paradigm species. This task was achieved in three steps. First, a local population dynamics model was developed specifically for the ibex: the underlying age- and sex-structured model is based on a Leslie matrix approach with the addition of density dependence, environmental stochasticity and culling. This model was implemented in a management-support software package named SIM-Ibex, allowing the maintenance of census data, automated parameter estimation, and the tuning and simulation of culling strategies. However, population dynamics is driven not only by demographic factors but also by dispersal and the colonization of new areas, so habitat suitability and obstacles to dispersal also had to be modelled. A software package named Biomapper was therefore developed. Its central module is based on the Ecological Niche Factor Analysis (ENFA), whose principle is to compute marginality and specialization factors of the ecological niche from a set of environmental predictors and species presence data. All Biomapper modules are linked to Geographic Information Systems (GIS); they cover all operations of data importation, predictor preparation, ENFA and habitat suitability map computation, and validation and further processing of the results; a module also allows the mapping of dispersal barriers and corridors. The application domain of the ENFA was then explored by means of a simulated species distribution. Compared with a commonly used habitat suitability method, the Generalised Linear Model (GLM), the ENFA proved better suited for spreading or cryptic species. Demographic and landscape information was finally merged into a global model. To cope with landscape realism and the technical constraints of large-population modelling, a cellular automaton approach was chosen: the study area is modelled by a lattice of hexagonal cells, each characterised by a few fixed properties (a carrying capacity and six impermeability rates quantifying exchanges between adjacent cells) and one variable, the population density. The latter varies according to local reproduction, survival and dispersal dynamics, modified by density dependence and stochasticity. A software tool named HexaSpace was developed to perform two functions: (1) calibrating the automaton on the basis of local population dynamics models (e.g., computed by SIM-Ibex) and a habitat suitability map (e.g., computed by Biomapper), and (2) running simulations. It allows the spread of an invading species to be studied across a complex landscape made up of areas of varying suitability and of dispersal barriers. This model was applied to the history of the ibex reintroduction in the Bernese Alps (Switzerland). SIM-Ibex is now used by governmental wildlife managers to prepare and verify culling plans. Biomapper has been applied to several species (both plants and animals) around the world. Similarly, although HexaSpace was originally designed for terrestrial animal species, it could easily be extended to model plant propagation or the dispersal of flying animals. Because these tools were designed to proceed from raw data to a complex, realistic model, and because they offer an intuitive user interface, they lend themselves to many applications in conservation biology. Moreover, theoretical questions in the fields of population and landscape ecology can also be addressed with these approaches.
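As a rough illustration of the demographic core described above (an age-structured Leslie projection with a crude density-dependent term), the sketch below uses invented vital rates and is not the actual SIM-Ibex model.

    # Toy age-structured projection (illustrative only; all rates are invented).
    import numpy as np

    fecundity = np.array([0.0, 0.4, 0.8, 0.8])   # offspring per individual per age class
    survival = np.array([0.6, 0.8, 0.85])        # survival from age class i to i+1
    K = 500.0                                    # carrying capacity

    L = np.zeros((4, 4))
    L[0, :] = fecundity                          # top row: reproduction
    for i, s in enumerate(survival):
        L[i + 1, i] = s                          # sub-diagonal: survival

    n = np.array([100.0, 50.0, 30.0, 20.0])      # initial age distribution
    for _ in range(50):
        n = L @ n                                # Leslie projection
        n *= 1.0 / (1.0 + n.sum() / K)           # simple density-dependent damping

    print(np.round(n, 1), round(float(n.sum()), 1))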

Relevance:

30.00%

Publisher:

Abstract:

An analytical approach for the interpretation of multicomponent heterogeneous adsorption or complexation isotherms in terms of multidimensional affinity spectra is presented. A Fourier transform, applied to analyze the corresponding integral equation, leads to an inversion formula which allows the computation of the multicomponent affinity spectrum underlying a given competitive isotherm. Although a different mathematical methodology is used, this procedure can be seen as the extension to multicomponent systems of the classical work of Sips devoted to monocomponent systems. Furthermore, a methodology which yields analytical expressions for the main statistical properties (mean free energies of binding and covariance matrix) of multidimensional affinity spectra is reported. Thus, the level of binding correlation between the different components can be quantified. It has to be highlighted that the reported methodology does not require knowledge of the affinity spectrum to calculate the means, variances, and covariances of the binding energies of the different components. The nonideal competitive consistent adsorption (NICCA) isotherm, widely used for metal/proton competitive complexation to environmental macromolecules, and the competitive Frumkin isotherm are selected to illustrate the application of the reported results. Explicit analytical expressions for the affinity spectrum as well as for the correlation matrix are obtained for the NICCA case. © 2004 American Institute of Physics.
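For orientation, the classical monocomponent problem solved by Sips, of which the present work is the multicomponent generalization, has the standard form

    \Theta(c) = \int_{-\infty}^{+\infty} \frac{K c}{1 + K c}\; p(\log K)\, d\log K,

where \Theta(c) is the overall binding as a function of the free concentration c, Kc/(1+Kc) is the local Langmuir isotherm and p(\log K) is the affinity spectrum recovered by inversion; the paper extends this type of integral equation, and its Fourier-based inversion, to several competing components.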

Relevance:

30.00%

Publisher:

Abstract:

The perceived low levels of genetic diversity, poor interspecific competitive and defensive ability, and loss of dispersal capacities of insular lineages have driven the view that oceanic islands are evolutionary dead ends. Focusing on the Atlantic bryophyte flora distributed across the archipelagos of the Azores, Madeira, the Canary Islands, Western Europe, and northwestern Africa, we used an integrative approach with species distribution modeling and population genetic analyses based on approximate Bayesian computation to determine whether this view applies to organisms with inherently high dispersal capacities. Genetic diversity was found to be higher in island than in continental populations, contributing to mounting evidence that, contrary to theoretical expectations, island populations are not necessarily genetically depauperate. Patterns of genetic variation among island and continental populations consistently fitted those simulated under a scenario of de novo foundation of continental populations from insular ancestors better than those expected if islands represented a sink or a refugium of continental biodiversity. We suggest that the northeastern Atlantic archipelagos have played a key role as a stepping stone for transoceanic migrants. Our results challenge the traditional notion that oceanic islands are the end of the colonization road and illustrate the significant role of oceanic islands as reservoirs of novel biodiversity for the assembly of continental floras.
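The scenario comparison described above relies on approximate Bayesian computation. As a very rough sketch of the rejection flavour of ABC (purely illustrative: the summary statistic, tolerance and simulator are invented and bear no relation to the coalescent simulations used in the study):

    # Toy ABC rejection sketch (illustrative only).
    import random

    observed_diversity = 0.62        # hypothetical observed summary statistic
    tolerance = 0.05

    def simulate(scenario: str) -> float:
        """Crude stand-in for simulating genetic diversity under a scenario."""
        base = {"islands_as_source": 0.60, "islands_as_sink": 0.45}[scenario]
        return random.gauss(base, 0.10)

    accepted = {"islands_as_source": 0, "islands_as_sink": 0}
    for scenario in accepted:
        for _ in range(10_000):
            if abs(simulate(scenario) - observed_diversity) < tolerance:
                accepted[scenario] += 1

    total = sum(accepted.values())
    print({k: round(v / total, 3) for k, v in accepted.items()})  # relative scenario support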