81 results for Evolutionary computation
Abstract:
Starch is the most widespread and abundant storage carbohydrate in crops, and its production is critical to both crop yield and quality. As regards the starch content in the seeds of crop plants, there are distinct differences between grasses (Poaceae) and dicots. However, few studies have described the evolutionary pattern of genes in the starch biosynthetic pathway in these two groups of plants. In this study, therefore, an attempt was made to compare the evolutionary rate, gene duplication and selective pattern of the key genes involved in this pathway between the two groups, using five grasses and five dicots as materials. The results showed (i) distinct differences in patterns of gene duplication and loss between grasses and dicots: duplication in grasses mainly occurred prior to the divergence of grasses, whereas duplication mostly occurred in individual species within the dicots, and there is less gene loss in grasses than in dicots; (ii) a considerably higher evolutionary rate in grasses than in dicots in most gene families analyzed; (iii) evidence of a different selective pattern between grasses and dicots: positive selection may have occurred asymmetrically in grasses in some gene families, e.g. the AGPase small subunit. Therefore, we deduced that gene duplication contributes to, and a higher evolutionary rate is associated with, the higher starch content in grasses. In addition, two novel aspects of the evolution of the starch biosynthetic pathway were observed.
Abstract:
Evolutionary developmental genetics brings together systematists, morphologists and developmental geneticists; it will therefore impact on each of these component disciplines. The goals and methods of phylogenetic analysis are reviewed here, and the contribution of evolutionary developmental genetics to morphological systematics, in terms of character conceptualisation and primary homology assessment, is discussed. Evolutionary developmental genetics, like its component disciplines phylogenetic systematics and comparative morphology, is concerned with homology concepts. Phylogenetic concepts of homology and their limitations are considered here, and the need for independent homology statements at different levels of biological organisation is evaluated. The role of systematics in evolutionary developmental genetics is outlined. Phylogenetic systematics and comparative morphology will suggest effective sampling strategies to developmental geneticists. Phylogenetic systematics provides hypotheses of character evolution (including parallel evolution and convergence), stimulating investigations into the evolutionary gains and losses of morphologies. Comparative morphology identifies those structures that are not easily amenable to typological categorisation, and that may be of particular interest in terms of developmental genetics. The concepts of latent homology and genetic recall may also prove useful in the evolutionary interpretation of developmental genetic data.
Abstract:
Why does music pervade our lives and those of all known human beings living today and in the recent past? Why do we feel compelled to engage in musical activity, or at least simply enjoy listening to music even if we choose not to actively participate? I argue that this is because musicality—communication using variations in pitch, rhythm, dynamics and timbre, by a combination of the voice, body (as in dance), and material culture—was essential to the lives of our pre-linguistic hominin ancestors. As a consequence we have inherited a desire to engage with music, even if this has no adaptive benefit for us today as a species whose communication system is dominated by spoken language. In this article I provide a summary of the arguments to support this view.
Abstract:
Whole-genome sequencing offers new insights into the evolution of bacterial pathogens and the etiology of bacterial disease. Staphylococcus aureus is a major cause of bacteria-associated mortality and invasive disease and is carried asymptomatically by 27% of adults. Eighty percent of bacteremias match the carried strain. However, the role of evolutionary change in the pathogen during the progression from carriage to disease is incompletely understood. Here we use high-throughput genome sequencing to discover the genetic changes that accompany the transition from nasal carriage to fatal bloodstream infection in an individual colonized with methicillin-sensitive S. aureus. We found a single, cohesive population exhibiting a repertoire of 30 single-nucleotide polymorphisms and four insertion/deletion variants. Mutations accumulated at a steady rate over a 13-mo period, except for a cluster of mutations preceding the transition to disease. Although bloodstream bacteria differed by just eight mutations from the original nasally carried bacteria, half of those mutations caused truncation of proteins, including a premature stop codon in an AraC-family transcriptional regulator that has been implicated in pathogenicity. Comparison with evolution in two asymptomatic carriers supported the conclusion that clusters of protein-truncating mutations are highly unusual. Our results demonstrate that bacterial diversity in vivo is limited but nonetheless detectable by whole-genome sequencing, enabling the study of evolutionary dynamics within the host. Regulatory or structural changes that occur during carriage may be functionally important for pathogenesis; therefore identifying those changes is a crucial step in understanding the biological causes of invasive bacterial disease.
Abstract:
1. It has been postulated that climate warming may pose the greatest threat to species in the tropics, where ectotherms have evolved more thermal specialist physiologies. Although species could rapidly respond to environmental change through adaptation, little is known about the potential for thermal adaptation, especially in tropical species. 2. In the light of the limited empirical evidence available and predictions from mutation-selection theory, we might expect tropical ectotherms to have limited genetic variance to enable adaptation. However, as a consequence of thermodynamic constraints, we might expect this disadvantage to be at least partially offset by a fitness advantage, that is, the ‘hotter-is-better’ hypothesis. 3. Using an established quantitative genetics model and metabolic scaling relationships, we integrate the consequences of the opposing forces of thermal specialization and thermodynamic constraints on adaptive potential by evaluating extinction risk under climate warming. We conclude that the potential advantage of a higher maximal development rate can in theory more than offset the potential disadvantage of lower genetic variance associated with a thermal specialist strategy. 4. Quantitative estimates of extinction risk are fundamentally very sensitive to estimates of generation time and genetic variance. However, our qualitative conclusion that the relative risk of extinction is likely to be lower for tropical species than for temperate species is robust to assumptions regarding the effects of effective population size, mutation rate and birth rate per capita. 5. With a view to improving ecological forecasts, we use this modelling framework to review the sensitivity of our predictions to the model’s underpinning theoretical assumptions and the empirical basis of macroecological patterns that suggest thermal specialization and fitness increase towards the tropics. We conclude by suggesting priority areas for further empirical research.
Abstract:
Evolutionary meta-algorithms for pulse shaping of broadband femtosecond-duration laser pulses are proposed. The genetic algorithm searching the evolutionary landscape for desired pulse shapes consists of a population of waveforms (genes), each made from two concatenated vectors specifying phases and magnitudes, respectively, over a range of frequencies. Frequency-domain operators such as mutation, two-point crossover, average crossover, polynomial phase mutation, creep and three-point smoothing, as well as a time-domain crossover, are combined to produce fitter offspring at each iteration step. The algorithm applies roulette wheel selection, elitism and linear fitness scaling to the gene population. A differential evolution (DE) operator that provides a source of directed mutation, and new wavelet operators, are proposed. Using properly tuned parameters for DE, the meta-algorithm is used to solve a waveform-matching problem. Tuning allows either a greedy directed search near the best known solution or a robust search across the entire parameter space.
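The gene layout and core operators named in this abstract can be illustrated with a stripped-down genetic algorithm for a toy waveform-matching problem. The sketch below is not the authors' meta-algorithm: it keeps only the concatenated phase/magnitude gene, roulette-wheel selection, elitism, two-point crossover and Gaussian mutation, and every name, population size and the fitness function are illustrative assumptions.

```python
import random

N_FREQ = 8  # number of frequency bins (illustrative)

def random_gene():
    # A gene is two concatenated vectors: per-frequency phases, then magnitudes.
    phases = [random.uniform(-3.1416, 3.1416) for _ in range(N_FREQ)]
    mags = [random.uniform(0.0, 1.0) for _ in range(N_FREQ)]
    return phases + mags

def fitness(gene, target):
    # Toy waveform-matching objective: smaller squared error -> higher fitness.
    err = sum((g - t) ** 2 for g, t in zip(gene, target))
    return 1.0 / (1.0 + err)

def roulette_select(pop, scores):
    # Roulette-wheel selection: pick with probability proportional to fitness.
    r = random.uniform(0.0, sum(scores))
    acc = 0.0
    for gene, s in zip(pop, scores):
        acc += s
        if acc >= r:
            return gene
    return pop[-1]

def two_point_crossover(a, b):
    # Swap the segment between two random cut points.
    i, j = sorted(random.sample(range(len(a)), 2))
    return a[:i] + b[i:j] + a[j:]

def mutate(gene, rate=0.1, scale=0.2):
    # Gaussian creep mutation on a fraction of the elements.
    return [g + random.gauss(0.0, scale) if random.random() < rate else g
            for g in gene]

def evolve(target, pop_size=30, generations=200, n_elite=2):
    pop = [random_gene() for _ in range(pop_size)]
    for _ in range(generations):
        scores = [fitness(g, target) for g in pop]
        ranked = [g for _, g in sorted(zip(scores, pop), key=lambda p: -p[0])]
        nxt = ranked[:n_elite]  # elitism: best genes survive unchanged
        while len(nxt) < pop_size:
            child = two_point_crossover(roulette_select(pop, scores),
                                        roulette_select(pop, scores))
            nxt.append(mutate(child))
        pop = nxt
    return max(fitness(g, target) for g in pop)

random.seed(0)
best = evolve([0.5] * (2 * N_FREQ))  # arbitrary target waveform
```

Because the elite genes are carried over unchanged, the best fitness in the population is non-decreasing across generations; the DE operator and wavelet operators from the abstract would slot in as additional ways of producing children.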
Abstract:
Studying the pathogenesis of an infectious disease like colibacillosis requires an understanding of the responses of target hosts to the organism both as a pathogen and as a commensal. The mucosal immune system constitutes the primary line of defence against luminal micro-organisms. The immunoglobulin-superfamily-based adaptive immune system evolved in the earliest jawed vertebrates, and the adaptive and innate immune system of humans, mice, pigs and ruminants co-evolved in common ancestors for approximately 300 million years. The divergence occurred only 100 mya and, as a consequence, most of the fundamental immunological mechanisms are very similar. However, since pressure on the immune system comes from rapidly evolving pathogens, immune systems must also evolve rapidly to maintain the ability of the host to survive and reproduce. As a consequence, there are a number of areas of detail where mammalian immune systems have diverged markedly from each other, such that results obtained in one species are not always immediately transferable to another. Thus, animal models of specific diseases need to be selected carefully, and the results interpreted with caution. Selection is made simpler where specific host species like cattle and pigs can be both target species and reservoirs for human disease, as in infections with Escherichia coli.
Abstract:
Approximate Bayesian computation (ABC) methods make use of comparisons between simulated and observed summary statistics to overcome the problem of computationally intractable likelihood functions. As the practical implementation of ABC requires computations based on vectors of summary statistics, rather than full data sets, a central question is how to derive low-dimensional summary statistics from the observed data with minimal loss of information. In this article we provide a comprehensive review and comparison of the performance of the principal methods of dimension reduction proposed in the ABC literature. The methods are split into three classes, which are not mutually exclusive, consisting of best subset selection methods, projection techniques and regularization. In addition, we introduce two new methods of dimension reduction. The first is a best subset selection method based on Akaike and Bayesian information criteria, and the second uses ridge regression as a regularization procedure. We illustrate the performance of these dimension reduction techniques through the analysis of three challenging models and data sets.
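The regularization idea can be sketched in miniature: regress the parameter on a vector of candidate summaries with an L2 penalty, and use the fitted linear combination as a single low-dimensional summary. The toy Normal-mean model below is an assumption for illustration, not one of the paper's examples, and all names and constants are invented; note the sample variance is deliberately uninformative here (the variance is fixed), so ridge should give it little weight.

```python
import random

def simulate(theta, n=20):
    # Toy model: n draws from Normal(theta, 1); theta is the parameter.
    return [random.gauss(theta, 1.0) for _ in range(n)]

def candidates(x):
    # Two candidate summaries: the sample mean (informative about theta)
    # and the sample variance (uninformative, since the variance is fixed at 1).
    m = sum(x) / len(x)
    v = sum((xi - m) ** 2 for xi in x) / (len(x) - 1)
    return [m, v]

def ridge(X, y, lam=1.0):
    # Closed-form ridge for two predictors: solve (X'X + lam*I) b = X'y.
    a = sum(r[0] * r[0] for r in X) + lam
    b = sum(r[0] * r[1] for r in X)
    d = sum(r[1] * r[1] for r in X) + lam
    c0 = sum(r[0] * yi for r, yi in zip(X, y))
    c1 = sum(r[1] * yi for r, yi in zip(X, y))
    det = a * d - b * b
    return [(d * c0 - b * c1) / det, (a * c1 - b * c0) / det]

random.seed(0)
thetas = [random.uniform(-3, 3) for _ in range(500)]
X = [candidates(simulate(t)) for t in thetas]
beta = ridge(X, thetas)
# beta defines one regularized summary s(x) = beta . candidates(x),
# which an ABC sampler would compare in place of the full candidate vector.
```

The weight on the informative summary dominates, which is the point of using the regression fit rather than the raw candidate vector as the ABC distance input.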
Abstract:
Many modern statistical applications involve inference for complex stochastic models, where it is easy to simulate from the models, but impossible to calculate likelihoods. Approximate Bayesian computation (ABC) is a method of inference for such models. It replaces calculation of the likelihood by a step which involves simulating artificial data for different parameter values, and comparing summary statistics of the simulated data with summary statistics of the observed data. Here we show how to construct appropriate summary statistics for ABC in a semi-automatic manner. We aim for summary statistics which will enable inference about certain parameters of interest to be as accurate as possible. Theoretical results show that optimal summary statistics are the posterior means of the parameters. Although these cannot be calculated analytically, we use an extra stage of simulation to estimate how the posterior means vary as a function of the data; and we then use these estimates of our summary statistics within ABC. Empirical results show that our approach is a robust method for choosing summary statistics that can result in substantially more accurate ABC analyses than the ad hoc choices of summary statistics that have been proposed in the literature. We also demonstrate advantages over two alternative methods of simulation-based inference.
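A minimal sketch of the semi-automatic construction described above: a pilot regression of the parameter on features of simulated data estimates the posterior mean, and that estimate is then used as the summary statistic inside plain rejection ABC. The toy Normal model, uniform prior, feature choice and every name below are illustrative assumptions, not the paper's examples.

```python
import random

def simulate(theta, n=15):
    # Toy model: data are Normal(theta, 1); pretend the likelihood is intractable.
    return [random.gauss(theta, 1.0) for _ in range(n)]

def features(x):
    # Pilot statistics fed to the regression (an illustrative choice).
    m = sum(x) / len(x)
    return [1.0, m, m * m]

def solve3(A, b):
    # Solve a 3x3 linear system by Gaussian elimination with partial pivoting.
    M = [row[:] + [bi] for row, bi in zip(A, b)]
    for i in range(3):
        p = max(range(i, 3), key=lambda r: abs(M[r][i]))
        M[i], M[p] = M[p], M[i]
        for r in range(i + 1, 3):
            f = M[r][i] / M[i][i]
            for c in range(i, 4):
                M[r][c] -= f * M[i][c]
    x = [0.0, 0.0, 0.0]
    for i in range(2, -1, -1):
        x[i] = (M[i][3] - sum(M[i][c] * x[c] for c in range(i + 1, 3))) / M[i][i]
    return x

def fit_posterior_mean(n_train=2000):
    # Stage 1: least-squares regression of theta on features of simulated data;
    # the fitted values estimate E[theta | data], the near-optimal summary.
    thetas = [random.uniform(-3, 3) for _ in range(n_train)]
    F = [features(simulate(t)) for t in thetas]
    XtX = [[sum(f[i] * f[j] for f in F) for j in range(3)] for i in range(3)]
    Xty = [sum(f[i] * t for f, t in zip(F, thetas)) for i in range(3)]
    return solve3(XtX, Xty)

def abc(observed, beta, n_sim=5000, eps=0.2):
    # Stage 2: rejection ABC comparing only the regression-based summary.
    s = lambda x: sum(b * f for b, f in zip(beta, features(x)))
    s_obs = s(observed)
    accepted = [t for t in (random.uniform(-3, 3) for _ in range(n_sim))
                if abs(s(simulate(t)) - s_obs) < eps]
    return sum(accepted) / len(accepted)

random.seed(2)
observed = simulate(1.0)  # true theta = 1.0
beta = fit_posterior_mean()
estimate = abc(observed, beta)
```

The accepted parameter values concentrate near the true value because the learned summary tracks the posterior mean, which is the theoretical optimum identified in the abstract.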
Abstract:
The Code for Sustainable Homes (the Code) will require new homes in the United Kingdom to be ‘zero carbon’ from 2016. Drawing upon an evolutionary innovation perspective, this paper contributes to a gap in the literature by investigating which low and zero carbon technologies are actually being used by house builders, rather than the prevailing emphasis on the potentiality of these technologies. Using the results from a questionnaire three empirical contributions are made. First, house builders are selecting a narrow range of technologies. Second, these choices are made to minimise the disruption to their standard design and production templates (SDPTs). Finally, the coalescence around a small group of technologies is expected to intensify with solar-based technologies predicted to become more important. This paper challenges the dominant technical rationality in the literature that technical efficiency and cost benefits are the primary drivers for technology selection. These drivers play an important role but one which is mediated by the logic of maintaining the SDPTs of the house builders. This emphasises the need for construction diffusion of innovation theory to be problematized and developed within the context of business and market regimes constrained and reproduced by resilient technological trajectories.
Abstract:
Computational formalisms have been pushing the boundaries of the field of computing for the last 80 years and much debate has surrounded what computing entails; what it is, and what it is not. This paper seeks to explore the boundaries of the ideas of computation and provide a framework for enabling a constructive discussion of computational ideas. First, a review of computing is given, ranging from Turing Machines to interactive computing. Then, a variety of natural physical systems are considered for their computational qualities. From this exploration, a framework is presented under which all dynamical systems can be considered as instances of the class of abstract computational platforms. An abstract computational platform is defined by both its intrinsic dynamics and how it allows computation that is meaningful to an external agent through the configuration of constraints upon those dynamics. It is asserted that a platform’s computational expressiveness is directly related to the freedom with which constraints can be placed. Finally, the requirements for a formal constraint description language are considered and it is proposed that Abstract State Machines may provide a reasonable basis for such a language.
Abstract:
Through a close analysis of sociobiologist Sarah Blaffer Hrdy’s work on motherhood and ‘mirror neurons’, it is argued that Hrdy’s claims exemplify how research that ostensibly bases itself on neuroscience, including ‘literary Darwinism’ in literary studies, relies not on scientific but on political assumptions, namely on underlying, unquestioned claims about the autonomous, transparent, liberal agent of consumer capitalism. These underpinning assumptions, it is further argued, involve the suppression or overlooking of an alternative, prior tradition of feminist theory, including feminist science criticism.
Abstract:
This paper seeks to chronicle the roots of corporate governance from its narrow shareholder perspective to the current burgeoning stakeholder approach, while giving cognizance to institutional investors and their effective role in ESG in light of the King Report III of South Africa. It is aimed at a critical review of the extant literature from the shareholder Cadbury epoch to the present-day King Report novelty. We aim to: (i) offer an analytical state of corporate governance in the Anglo-Saxon world, the Middle East and North Africa (MENA), Far East Asia and Africa; and (ii) illuminate the lead role the King Report of South Africa is playing as the bellwether of the stakeholder approach to corporate governance, as well as guiding the role of institutional investors in ESG.
Abstract:
There is accumulating evidence that macroevolutionary patterns of mammal evolution during the Cenozoic follow similar trajectories on different continents. This would suggest that such patterns are strongly determined by global abiotic factors, such as climate, or by basic eco-evolutionary processes such as the filling of niches by specialization. The similarity of pattern would be expected to extend to the history of individual clades. Here, we investigate the temporal distribution of the maximum size observed within individual orders globally and on separate continents. While the maximum sizes of individual orders of large land mammals differ and are drawn from several families, the times at which orders reach their maximum size show strong congruence, peaking in the Middle Eocene, the Oligocene and the Plio-Pleistocene. The Eocene peak occurs when global temperature and land mammal diversity are high and is best explained as a result of niche expansion rather than abiotic forcing. Since the Eocene, there is a significant correlation between maximum size frequency and global temperature proxy. The Oligocene peak is not statistically significant and may in part be due to sampling issues. The peak in the Plio-Pleistocene occurs when global temperature and land mammal diversity are low; it is statistically the most robust and is best explained by global cooling. We conclude that the macroevolutionary patterns observed are a result of the interplay between eco-evolutionary processes and abiotic forcing.