965 results for Boolean Functions, Nonlinearity, Evolutionary Computation, Equivalence Classes


Relevance:

30.00%

Publisher:

Abstract:

We know little about the genomic events that led to the advent of a multicellular grade of organization in animals, one of the most dramatic transitions in evolution. Metazoan multicellularity is correlated with the evolution of embryogenesis, which presumably was underpinned by a gene regulatory network reliant on the differential activation of signaling pathways and transcription factors. Many transcription factor genes that play critical roles in bilaterian development largely appear to have evolved before the divergence of cnidarian and bilaterian lineages. In contrast, sponges seem to have a more limited suite of transcription factors, suggesting that the developmental regulatory gene repertoire changed markedly during early metazoan evolution. Using whole-genome information from the sponge Amphimedon queenslandica, a range of eumetazoans, and the choanoflagellate Monosiga brevicollis, we investigate the genesis and expansion of homeobox, Sox, T-box, and Fox transcription factor genes. Comparative analyses reveal that novel transcription factor domains (such as Paired, POU, and T-box) arose very early in metazoan evolution, prior to the separation of extant metazoan phyla but after the divergence of choanoflagellate and metazoan lineages. Phylogenetic analyses indicate that transcription factor classes then gradually expanded at the base of Metazoa before the bilaterian radiation, with each class following a different evolutionary trajectory. Based on the limited number of transcription factors in the Amphimedon genome, we infer that the genome of the metazoan last common ancestor included fewer gene members in each class than are present in extant eumetazoans. Transcription factor orthologues present in sponge, cnidarian, and bilaterian genomes may represent part of the core metazoan regulatory network underlying the origin of animal development and multicellularity.

Relevance:

30.00%

Publisher:

Abstract:

1. The management of threatened species is an important practical way in which conservationists can intervene in the extinction process and reduce the loss of biodiversity. Understanding the causes of population declines (past, present and future) is pivotal to designing effective practical management. This is the declining-population paradigm identified by Caughley. 2. There are three broad classes of ecological tool used by conservationists to guide management decisions for threatened species: statistical models of habitat use, demographic models and behaviour-based models. Each of these is described here, illustrated with a case study and evaluated critically in terms of its practical application. 3. These tools are fundamentally different. Statistical models of habitat use and demographic models both use descriptions of patterns in abundance and demography, in relation to a range of factors, to inform management decisions. In contrast, behaviour-based models describe the evolutionary processes underlying these patterns, and derive such patterns from the strategies employed by individuals when competing for resources under a specific set of environmental conditions. 4. Statistical models of habitat use and demographic models have been used successfully to make management recommendations for declining populations. To do this, assumptions are made about population growth or vital rates that will apply when environmental conditions are restored, based on either past data collected under favourable environmental conditions or estimates of these parameters when the agent of decline is removed. As a result, they can only be used to make reliable quantitative predictions about future environments when a comparable environment has been experienced by the population of interest in the past. 5. Many future changes in the environment driven by management will not have been experienced by a population in the past. 
Under these circumstances, vital rates and their relationship with population density will change in the future in a way that is not predictable from past patterns. Reliable quantitative predictions about population-level responses then need to be based on an explicit consideration of the evolutionary processes operating at the individual level. 6. Synthesis and applications. It is argued that evolutionary theory underpins Caughley's declining-population paradigm, and that it needs to become much more widely used within mainstream conservation biology. This will help conservationists examine critically the reliability of the tools they have traditionally used to aid management decision-making. It will also give them access to alternative tools, particularly when predictions are required for changes in the environment that have not been experienced by a population in the past.

Relevance:

30.00%

Publisher:

Abstract:

We describe and evaluate a new estimator of the effective population size (N-e), a critical parameter in evolutionary and conservation biology. This new "SummStat" N-e estimator is based upon the use of summary statistics in an approximate Bayesian computation framework to infer N-e. Simulations of a Wright-Fisher population with known N-e show that the SummStat estimator is useful across a realistic range of individuals and loci sampled, generations between samples, and N-e values. We also address the paucity of information about the relative performance of N-e estimators by comparing the SummStat estimator to two recently developed likelihood-based estimators and a traditional moment-based estimator. The SummStat estimator is the least biased of the four estimators compared. In 32 of 36 parameter combinations investigated using initial allele frequencies drawn from a Dirichlet distribution, it has the lowest bias. The relative mean square error (RMSE) of the SummStat estimator was generally intermediate to the others. All of the estimators had RMSE > 1 when small samples (n = 20, five loci) were collected a generation apart. In contrast, when samples were separated by three or more generations and N-e less than or equal to 50, the SummStat and likelihood-based estimators all had greatly reduced RMSE. Under the conditions simulated, SummStat confidence intervals were more conservative than the likelihood-based estimators and more likely to include true N-e. The greatest strength of the SummStat estimator is its flexible structure. This flexibility allows it to incorporate any potentially informative summary statistic from population genetic data.
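As an illustration of the approximate Bayesian computation framework described above, here is a minimal rejection-ABC sketch for a temporal N-e estimate. All function names, the choice of summary statistic (standardized variance of allele-frequency change), the uniform prior, and the acceptance rule are ours, not the paper's:

```python
import numpy as np

rng = np.random.default_rng(0)

def wright_fisher(ne, p0, generations):
    """Drift allele frequencies for `generations` in a Wright-Fisher
    population of effective size `ne` (2*ne gene copies per locus)."""
    p = np.array(p0, dtype=float)
    for _ in range(generations):
        p = rng.binomial(2 * ne, p) / (2 * ne)
    return p

def summary_stat(p_then, p_now):
    """Standardized variance of allele-frequency change, one classic
    temporal summary; the SummStat framework can use any such statistic."""
    pbar = (p_then + p_now) / 2
    denom = pbar * (1 - pbar)
    mask = denom > 0
    return np.mean((p_then[mask] - p_now[mask]) ** 2 / denom[mask])

def abc_estimate_ne(obs_then, obs_now, generations, prior=(10, 500),
                    n_sims=20_000, accept_frac=0.01):
    """Rejection-ABC: draw N-e from a uniform prior, simulate, and keep
    the draws whose summary statistic is closest to the observed one."""
    s_obs = summary_stat(obs_then, obs_now)
    draws = rng.integers(prior[0], prior[1], size=n_sims)
    dist = np.array([abs(summary_stat(obs_then,
                                      wright_fisher(ne, obs_then, generations))
                         - s_obs)
                     for ne in draws])
    keep = draws[np.argsort(dist)[: int(n_sims * accept_frac)]]
    return np.median(keep), keep   # point estimate + posterior sample
```

The flexibility the abstract emphasizes lives in `summary_stat`: any informative summary statistic can be swapped in without changing the inference machinery.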

Relevance:

30.00%

Publisher:

Abstract:

Many evolutionary algorithm applications involve either fitness functions with high time complexity or large dimensionality (hence very many fitness evaluations will typically be needed), or both. In such circumstances, there is a dire need to tune the various features of the algorithm well so that performance and time savings are optimized. However, these are precisely the circumstances in which prior tuning is very costly in time and resources. There is hence a need for methods which enable fast prior tuning in such cases. We describe a candidate technique for this purpose, in which we model a landscape as a finite state machine, inferred from preliminary sampling runs. In prior algorithm-tuning trials, we can replace the 'real' landscape with the model, enabling extremely fast tuning and saving far more time than was required to infer the model. Preliminary results indicate much promise, though work remains to establish the conditions under which the technique can be most beneficially used. A main limitation of the method as described here is its restriction to mutation-only algorithms, but there are various ways to address this and other limitations.
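A deliberately simplified sketch of the idea, with all modelling choices ours: a (1+1)-EA on a toy OneMax landscape, and a state machine whose states are fitness values. Preliminary sampling runs record fitness-to-fitness transitions; tuning trials can then be replayed on the inferred model without any further real fitness evaluations.

```python
import random
from collections import defaultdict

# Toy stand-in for an expensive "real" fitness function.
def real_fitness(bits):
    return sum(bits)

def mutate(bits, rate):
    return [b ^ (random.random() < rate) for b in bits]

# 1. Preliminary sampling: record (fitness -> next fitness) transitions.
def infer_fsm(n=30, rate=0.05, samples=5000):
    """Infer a finite-state model of the landscape: state = current
    fitness, transition distribution = fitness after one mutation."""
    trans = defaultdict(list)
    bits = [random.randint(0, 1) for _ in range(n)]
    f = real_fitness(bits)
    for _ in range(samples):
        child = mutate(bits, rate)
        fc = real_fitness(child)
        trans[f].append(fc)
        if fc >= f:                  # simple (1+1)-EA acceptance
            bits, f = child, fc
    return trans

# 2. Tuning trials run on the model: no real evaluations needed.
def simulate_on_fsm(trans, start, steps):
    """Replay a (1+1)-EA run on the inferred state machine."""
    f = start
    for _ in range(steps):
        options = trans.get(f)
        if not options:
            break                    # unseen state: the model says stop
        fc = random.choice(options)
        if fc >= f:
            f = fc
    return f
```

A real application would condition states on more than bare fitness and compare many parameter settings on the surrogate before touching the expensive function again.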

Relevance:

30.00%

Publisher:

Abstract:

The climate belongs to the class of non-equilibrium forced and dissipative systems, for which most results of quasi-equilibrium statistical mechanics, including the fluctuation-dissipation theorem, do not apply. In this paper we show for the first time how the Ruelle linear response theory, developed for studying rigorously the impact of perturbations on general observables of non-equilibrium statistical mechanical systems, can be applied with great success to analyze the climatic response to general forcings. The crucial value of the Ruelle theory lies in the fact that it allows one to compute the response of the system in terms of expectation values of explicit and computable functions of the phase space averaged over the invariant measure of the unperturbed state. We choose as test bed a classical version of the Lorenz 96 model, which, in spite of its simplicity, has a well-recognized prototypical value, as it is a spatially extended one-dimensional model and presents the basic ingredients, such as dissipation, advection and the presence of an external forcing, of the actual atmosphere. We recapitulate the main aspects of the general response theory and propose some new general results. We then analyze the frequency dependence of the response of both local and global observables to perturbations having localized as well as global spatial patterns. We derive analytically several properties of the corresponding susceptibilities, such as asymptotic behavior, validity of Kramers-Kronig relations, and sum rules, whose main ingredient is the causality principle. We show that all the coefficients of the leading asymptotic expansions as well as the integral constraints can be written as linear functions of parameters that describe the unperturbed properties of the system, such as its average energy. Some newly obtained empirical closure equations for such parameters allow these properties to be defined as explicit functions of the unperturbed forcing parameter alone for a general class of chaotic Lorenz 96 models. We then verify the theoretical predictions from the outputs of the simulations up to a high degree of precision. The theory is used to explain differences in the response of local and global observables, to define the intensive properties of the system, which do not depend on the spatial resolution of the Lorenz 96 model, and to generalize the concept of climate sensitivity to all time scales. We also show how to reconstruct the linear Green function, which maps perturbations of general time patterns into changes in the expectation value of the considered observable for finite as well as infinite time. Finally, we propose a simple yet general methodology to study general climate change problems on virtually any time scale by resorting only to well-selected simulations, and by taking full advantage of ensemble methods. The specific case of globally averaged surface temperature response to a general pattern of change of the CO2 concentration is discussed. We believe that the proposed approach may constitute a mathematically rigorous and practically very effective way to approach the problem of climate sensitivity, climate prediction, and climate change from a radically new perspective.
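The Lorenz 96 test bed itself is compact enough to state in a few lines. A minimal sketch (integrator choice and names ours, not necessarily the paper's):

```python
import numpy as np

def lorenz96_tendency(x, forcing):
    """Tendency of the Lorenz 96 model with cyclic indices:
    dx_i/dt = (x_{i+1} - x_{i-2}) * x_{i-1} - x_i + F."""
    return (np.roll(x, -1) - np.roll(x, 2)) * np.roll(x, 1) - x + forcing

def rk4_step(x, dt, forcing):
    """One fourth-order Runge-Kutta integration step."""
    k1 = lorenz96_tendency(x, forcing)
    k2 = lorenz96_tendency(x + 0.5 * dt * k1, forcing)
    k3 = lorenz96_tendency(x + 0.5 * dt * k2, forcing)
    k4 = lorenz96_tendency(x + dt * k3, forcing)
    return x + dt / 6 * (k1 + 2 * k2 + 2 * k3 + k4)

def average_energy(x_traj):
    """Average energy E = sum_i(x_i^2) / 2, one of the unperturbed-state
    properties the response coefficients are expressed in terms of."""
    return 0.5 * np.mean(np.sum(x_traj ** 2, axis=1))
```

With the classical forcing F = 8 the model is chaotic, which is what makes it a useful prototype for the response experiments described above.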

Relevance:

30.00%

Publisher:

Abstract:

Background: Serine proteases are major components of viper venom and target various stages of the blood coagulation system in victims and prey. A better understanding of the diversity of serine proteases and other enzymes present in snake venom will help to understand how the complexity of snake venom has evolved and will aid the development of novel therapeutics for treating snake bites. Methodology and Principal Findings: Four serine protease-encoding genes from the venom gland transcriptome of Bitis gabonica rhinoceros were amplified and sequenced. Mass spectrometry suggests the four enzymes corresponding to these genes are present in the venom of B. g. rhinoceros. Two of the enzymes, rhinocerases 2 and 3, have substitutions to two of the serine protease catalytic triad residues and are thus unlikely to be catalytically active, though they may have evolved other toxic functions. The other two enzymes, rhinocerases 4 and 5, have classical serine protease catalytic triad residues and thus are likely to be catalytically active; however, they have glycine rather than the more typical aspartic acid at the base of the primary specificity pocket (position 189). Based on a detailed analysis of these sequences, we suggest that alternative splicing together with individual amino acid mutations may have been involved in their evolution. Changes within amino acid segments which were previously proposed to undergo accelerated change in venom serine proteases have also been observed. Conclusions and Significance: Our study provides further insight into the diversity of serine protease isoforms present within snake venom and discusses their possible functions and how they may have evolved. These multiple serine protease isoforms with different substrate specificities may enhance the envenomation effects and help the snake to adapt to new habitats and diets. Our findings have potential for helping the future development of improved therapeutics for snake bites.

Relevance:

30.00%

Publisher:

Abstract:

We consider the general response theory recently proposed by Ruelle for describing the impact of small perturbations to the non-equilibrium steady states resulting from Axiom A dynamical systems. We show that the causality of the response functions entails the possibility of writing a set of Kramers-Kronig (K-K) relations for the corresponding susceptibilities at all orders of nonlinearity. Nonetheless, only a special class of directly observable susceptibilities obey K-K relations. Specific results are provided for the case of arbitrary-order harmonic response, which allows for a very comprehensive K-K analysis and the establishment of sum rules connecting the asymptotic behavior of the harmonic generation susceptibility to the short-time response of the perturbed system. These results place previous findings obtained for optical systems and simple mechanical models in a more general theoretical framework, and shed light on the very general impact of considering the principle of causality for testing self-consistency: the described dispersion relations constitute unavoidable benchmarks that any experimental and model-generated dataset must obey. The theory exposed in the present paper is dual to the time-dependent theory of perturbations to equilibrium states and to non-equilibrium steady states, and has in principle a similar range of applicability and limitations. In order to connect the equilibrium and the non-equilibrium steady-state cases, we show how to rewrite the classical response theory by Kubo so that response functions formally identical to those proposed by Ruelle, apart from the measure involved in the phase space integration, are obtained. These results, taking into account the chaotic hypothesis by Gallavotti and Cohen, might be relevant in several fields, including climate research. In particular, whereas the fluctuation-dissipation theorem does not work for non-equilibrium systems, because of the non-equivalence between internal and external fluctuations, K-K relations might be robust tools for the definition of a self-consistent theory of climate change.
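For reference, at first order the causality of the response function (analyticity of the susceptibility χ(ω) in the upper half of the complex-ω plane) yields the standard K-K pair, where P denotes the Cauchy principal value:

```latex
\operatorname{Re}\chi(\omega) = \frac{1}{\pi}\,
  \mathrm{P}\!\int_{-\infty}^{\infty}
  \frac{\operatorname{Im}\chi(\omega')}{\omega' - \omega}\,\mathrm{d}\omega',
\qquad
\operatorname{Im}\chi(\omega) = -\frac{1}{\pi}\,
  \mathrm{P}\!\int_{-\infty}^{\infty}
  \frac{\operatorname{Re}\chi(\omega')}{\omega' - \omega}\,\mathrm{d}\omega'.
```

The higher-order relations discussed in the paper have the same principal-value structure, applied to the nonlinear harmonic susceptibilities.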

Relevance:

30.00%

Publisher:

Abstract:

Cloud imagery is not currently used in numerical weather prediction (NWP) to extract the type of dynamical information that experienced forecasters have extracted subjectively for many years. For example, rapidly developing mid-latitude cyclones have characteristic signatures in the cloud imagery that are most fully appreciated from a sequence of images rather than from a single image. The Met Office is currently developing a technique to extract dynamical development information from satellite imagery using their full incremental 4D-Var (four-dimensional variational data assimilation) system. We investigate a simplified form of this technique in a fully nonlinear framework. We convert information on the vertical wind field, w(z), and profiles of temperature, T(z, t), and total water content, qt(z, t), as functions of height, z, and time, t, to a single brightness temperature by defining a 2D (vertical and time) variational assimilation testbed. The profiles of w, T and qt are updated using a simple vertical advection scheme. We define a basic cloud scheme to obtain the fractional cloud amount and, when combined with the temperature field, we convert this information into a brightness temperature, having developed a simple radiative transfer scheme. With the exception of some matrix inversion routines, all our code is developed from scratch. Throughout the development process we test all aspects of our 2D assimilation system, and then run identical twin experiments to try to recover information on the vertical velocity from a sequence of observations of brightness temperature. This thesis contains a comprehensive description of our nonlinear models and assimilation system, and the first experimental results.
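The simple vertical advection step mentioned above can be sketched as a first-order upwind update of a profile q(z) by a vertical wind w(z); the scheme choice and names here are ours, not necessarily the thesis's:

```python
import numpy as np

def upwind_advect(q, w, dz, dt):
    """One first-order upwind step for dq/dt = -w * dq/dz on a uniform
    vertical grid: use the backward difference where w >= 0 and the
    forward difference where w < 0, and hold the boundaries fixed."""
    dqdz_up = (q - np.roll(q, 1)) / dz      # backward difference
    dqdz_dn = (np.roll(q, -1) - q) / dz     # forward difference
    dqdz = np.where(w >= 0, dqdz_up, dqdz_dn)
    qn = q - dt * w * dqdz
    qn[0], qn[-1] = q[0], q[-1]             # fixed boundary values
    return qn
```

Stability requires the usual CFL condition |w| dt / dz <= 1; in an assimilation testbed the same step would be applied to each of the w, T and qt profiles.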

Relevance:

30.00%

Publisher:

Abstract:

Mathematics in Defence 2011 Abstract. We review transreal arithmetic and present transcomplex arithmetic. These arithmetics have no exceptions. This leads to incremental improvements in computer hardware and software. For example, the range of real numbers, encoded by floating-point bits, is doubled when all of the Not-a-Number (NaN) states in IEEE 754 arithmetic are replaced with real numbers. The task of programming such systems is simplified and made safer by discarding the unordered relational operator, leaving only the operators less-than, equal-to, and greater-than. The advantages of using a transarithmetic in a computation, or transcomputation as we prefer to call it, may be had by making small changes to compilers and processor designs. However, radical change is possible by exploiting the reliability of transcomputations to make pipelined dataflow machines with a large number of cores. Our initial designs are for a machine with order one million cores. Such a machine can complete the execution of multiple in-line programs each clock tick.
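A toy model of the key property, totality of division, can make the idea concrete. The class below is our own illustration (the paper's concern is hardware encodings, not Python): every division returns a transreal result, so no exception paths exist.

```python
import math

INF = float('inf')

class Transreal:
    """A transreal number: an ordinary float, +/-infinity, or nullity
    (Phi). Encoding and class name are ours; this is only a sketch of
    the arithmetic reviewed in the paper."""

    def __init__(self, value=0.0, nullity=False):
        self.nullity = nullity
        self.value = 0.0 if nullity else float(value)

    def __truediv__(self, other):
        # Division is total: every case returns a transreal number.
        if self.nullity or other.nullity:
            return Transreal(nullity=True)        # Phi propagates
        if math.isinf(self.value) and math.isinf(other.value):
            return Transreal(nullity=True)        # inf / inf = Phi
        if other.value != 0:
            return Transreal(self.value / other.value)
        if self.value > 0:
            return Transreal(INF)                 # k / 0 = +inf, k > 0
        if self.value < 0:
            return Transreal(-INF)                # k / 0 = -inf, k < 0
        return Transreal(nullity=True)            # 0 / 0 = Phi

    def __repr__(self):
        return "Phi" if self.nullity else repr(self.value)
```

Because no operand combination raises, a pipeline built on such an arithmetic never needs exception-handling control flow, which is the property the dataflow-machine designs above exploit.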

Relevance:

30.00%

Publisher:

Abstract:

A number of recent papers have employed the BDS test as a general test for mis-specification of linear and nonlinear models. We show that for a particular class of conditionally heteroscedastic models, the BDS test is unable to detect a common mis-specification. Our results also demonstrate that specific rather than portmanteau diagnostics are required to detect neglected asymmetry in volatility. However, for both classes of tests, reasonable power is obtained only with very large sample sizes.

Relevance:

30.00%

Publisher:

Abstract:

Explicitly orbital-dependent approximations to the exchange-correlation energy functional of density functional theory typically depend not only on the single-particle Kohn-Sham orbitals but also on their occupation numbers in the ground-state Slater determinant. The variational calculation of the corresponding exchange-correlation potentials with the optimized effective potential (OEP) method therefore also requires a variation of the occupation numbers with respect to a variation in the effective single-particle potential, which is usually not taken into account. Here it is shown under which circumstances this procedure is justified.

Relevance:

30.00%

Publisher:

Abstract:

We provide a complete isomorphic classification of the Banach spaces of continuous functions on the compact spaces 2^m ⊕ [0, α], the topological sums of Cantor cubes 2^m, with m smaller than the first sequential cardinal, and intervals of ordinal numbers [0, α]. In particular, we prove that it is relatively consistent with ZFC that the only isomorphism classes of C(2^m ⊕ [0, α]) spaces with m ≥ ℵ₀ and α ≥ ω₁ are the trivial ones. This result leads to some elementary questions on large cardinals.

Relevance:

30.00%

Publisher:

Abstract:

This paper uses Shannon's information theory to give a quantitative definition of information flow in systems that transform inputs to outputs. For deterministic systems, the definition is shown to specialise to a simpler form when the information source and the known inputs jointly determine the inputs. For this special case, the definition is related to the classical security condition of non-interference and an equivalence is established between non-interference and independence of random variables. Quantitative information flow for deterministic systems is then presented in relational form. With this presentation, it is shown how relational parametricity can be used to derive upper and lower bounds on information flows through families of functions defined in the second-order lambda calculus.
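A minimal sketch of the quantitative definition for deterministic systems, under the assumption of a uniformly distributed secret (all names ours): since the output is a function of the secret and the fixed known input, the flow for each known input is just the Shannon entropy of the induced output distribution.

```python
from collections import Counter
from math import log2

def channel_leakage(program, secrets, known_inputs):
    """Information flow (in bits) from a uniformly distributed secret
    into the observable output of a deterministic program. For each
    fixed known input the flow equals H(output); we report the worst
    case over the known inputs as an upper bound on the leak."""
    worst = 0.0
    for low in known_inputs:
        counts = Counter(program(high, low) for high in secrets)
        n = len(secrets)
        h = -sum(c / n * log2(c / n) for c in counts.values())
        worst = max(worst, h)
    return worst
```

For example, a program that returns a 6-bit secret outright leaks all 6 bits, while an equality check of the secret against a fixed guess leaks well under one bit per run.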

Relevance:

30.00%

Publisher:

Abstract:

This paper investigates which properties money-demand functions have to satisfy to be consistent with multidimensional extensions of Lucas's (2000) versions of the Sidrauski (1967) and the shopping-time models. We also investigate how such classes of models relate to each other regarding the rationalization of money demands. We conclude that money-demand functions rationalizable by the shopping-time model are always rationalizable by the Sidrauski model, but that the converse is not true. The log-log money demand with an interest-rate elasticity greater than or equal to one and the semi-log money demand are counterexamples.
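In the notation we adopt here (m real balances, r the nominal interest rate, η and ξ positive parameters, A and B constants), the two counterexamples are the standard functional forms:

```latex
\ln m = A - \eta \ln r,\quad \eta \ge 1 \quad \text{(log-log)},
\qquad
\ln m = B - \xi r \quad \text{(semi-log)}.
```

These are the forms that the shopping-time model fails to rationalize even though the Sidrauski model can.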

Relevance:

30.00%

Publisher:

Abstract:

The basic aim of our study is to identify the principles of one modality of education: compensatory/remedial instruction, intended for children of the "disadvantaged" classes, the "deprived" pupils, and the creation of a space deemed necessary for the education of these children, officially designated "Adaptation Classes" (Classes de Adaptação). To pursue this aim, we take as our object of study pedagogical discourse, as the dominant discourse, highlighting the formulations concerning "deprivation", the questions raised around the creation of the "Adaptation Classes", and the discernment of the social function of their creation and use. The analysis of these formulations draws on the relevant categories: "cultural marginalization", "cultural deprivation", the "culturally marginalized", and the "culture of poverty", a "theoretical network" (cf. Foucault) that helped us to think through the importance of the theme of "deprivation" (and of its complement, "compensation"). The analysis shows that, under the pretext of "compensating" for the "privations" (or "deprivations") of "disadvantaged" children, on account of the learning difficulties they display at school, these children are channelled into the Adaptation Classes, whose aim is to discipline them, that is, to render them useful and docile in the service of the system of production. Understanding this perspective leads us to see the difficulties these children present at school not as "cultural maladjustment", a conception that generally reproduces the version of the dominant ideology diffused by the school, but as a political problem, in which social origin carries fundamental weight in their identification as "deprived".