991 results for Evolutionary techniques


Relevance: 30.00%

Abstract:

In a large number of problems, the high dimensionality of the search space, the vast number of variables and economic constraints limit the ability of classical techniques to reach the optimum of a function, known or unknown. In this thesis we investigate the possibility of combining approaches from advanced statistics with optimization algorithms so as to better explore the combinatorial search space and increase performance. To this purpose we propose two methods: (i) Model Based Ant Colony Design and (ii) Naïve Bayes Ant Colony Optimization. We test the performance of the two proposed solutions in a simulation study and apply the novel techniques to an application in the field of Enzyme Engineering and Design.
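Both proposed methods build on the standard ant colony optimization (ACO) loop, in which candidate solutions are sampled from a pheromone model that is then reinforced by the best solutions found so far. The sketch below is a minimal, generic binary ACO, not the Model Based Ant Colony Design or Naïve Bayes variants themselves; the objective `onemax` and all parameter values are illustrative assumptions.

```python
import random

def onemax(x):
    # Illustrative objective: count of ones (stands in for an unknown fitness).
    return sum(x)

def ant_colony_optimization(n_bits=30, n_ants=20, n_iter=100, rho=0.1):
    # Pheromone = probability of setting each bit to 1 (binary ACO).
    tau = [0.5] * n_bits
    best_x, best_f = None, float("-inf")
    for _ in range(n_iter):
        ants = [[1 if random.random() < tau[i] else 0 for i in range(n_bits)]
                for _ in range(n_ants)]
        for x in ants:
            f = onemax(x)
            if f > best_f:
                best_x, best_f = x, f
        # Evaporate pheromone and reinforce it with the best-so-far solution.
        tau = [(1 - rho) * t + rho * b for t, b in zip(tau, best_x)]
    return best_x, best_f

if __name__ == "__main__":
    x, f = ant_colony_optimization()
    print(f, x)
```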

Relevance: 30.00%

Abstract:

Two Amerindian populations, from the Peruvian Amazon (Yanesha) and from the rural lowlands of the Argentinean Gran Chaco (Wichi), were analyzed as two case studies of South American genetic variability. The Yanesha represent a model of a population isolated for a long time in the Amazon rainforest, characterized by environmental and altitudinal stratification. The Wichi represent a model of a population living in an area recently colonized by Europeans (the Criollos being the admixed descendant population); here the aim was to depict the native ancestral gene pool and the degree of admixture, in relation to the very high prevalence of Chagas disease. Standard genotyping methods were used for the Y-chromosome markers (paternal lineage) and the mitochondrial markers (maternal lineage). The phylogeographically diagnostic polymorphisms were determined by the classical techniques of PCR, restriction enzymes, sequencing and specific mini-sequencing. A new method for the detection of the protozoan Trypanosoma cruzi was developed by means of nested PCR. The main results show patterns of genetic stratification in the Yanesha forest communities, referable to different migrations at different times, as estimated by Bayesian analyses. In particular, the Yanesha can be considered a population of transition between the Amazon basin and the Andean Cordillera, as assessed through potential migration routes and the separation of clusters of communities with different genetic bio-ancestry. As for the Wichi, the gene pool analyzed appears clearly differentiated from that of the admixed sympatric Criollos, owing to strict social practices (analyzed in depth with the support of cultural anthropological tools) that have preserved the native identity at a diachronic level. No pattern of distribution of seropositivity emerges in relation to the different phylogenetic lineages (adaptation in evolutionary terms), neither Amerindian nor European, but rather in relation to the environmental and living conditions of the two distinct subpopulations.

Relevance: 30.00%

Abstract:

With the discovery that DNA can be successfully recovered from museum collections, a new source of genetic information has become available to extend our understanding of the evolutionary history of species. However, historical specimens are often mislabeled or carry incorrect information about their origin, so accurate identification of specimens is essential. Because ancient DNA is highly damaged, many pitfalls exist and particular precautions need to be taken in order to perform genetic analysis. In this study we analyzed 208 historical remains of pelagic fishes collected at the beginning of the 20th century. By adapting existing protocols, usually applied to human remains, we managed to retrieve valuable genetic material from almost all of the examined samples using a guanidine and silica column-based approach. The combined use of two mitochondrial markers, cytochrome oxidase 1 (mtDNA COI) and the control region (mtDNA CR), and the nuclear marker first internal transcribed spacer (ITS1), allowed us to identify the majority of the examined specimens using traditional PCR and Sanger sequencing techniques. The primers designed to amplify heavily degraded DNA have great potential for future use, in both ancient and modern investigations. The methodologies developed in this study can in fact be applied to other ancient fish specimens as well as to cooked or canned samples.

Relevance: 30.00%

Abstract:

Academic and industrial research in the late 1990s brought about an exponential explosion of DNA sequence data. Automated expert systems are being created to help biologists extract patterns, trends and links from this ever-deepening ocean of information. Two such systems, aimed at retrieving and subsequently utilizing phylogenetically relevant information, were developed in this dissertation, the major objective of which was to automate the often difficult and confusing phylogenetic reconstruction process. Popular phylogenetic reconstruction methods, such as distance-based methods, attempt to find an optimal tree topology (one that reflects the relationships among related sequences and their evolutionary history) by searching through the topology space. Various compromises between fast (but incomplete) and exhaustive (but computationally prohibitive) search heuristics have been suggested. An intelligent compromise algorithm that relies on a flexible "beam" search principle from the Artificial Intelligence domain and uses pre-computed local topology reliability information to adjust the beam search space continuously is described in the second chapter of this dissertation. However, sometimes even a (virtually) complete distance-based method is inferior to the significantly more elaborate (and computationally expensive) maximum likelihood (ML) method. In fact, depending on the nature of the sequence data in question, either method might prove superior. It is therefore difficult (even for an expert) to tell a priori which phylogenetic reconstruction method (distance-based, ML or perhaps maximum parsimony, MP) should be chosen for any particular data set. A number of factors, often hidden, influence the performance of a method. For example, it is generally understood that for a phylogenetically "difficult" data set, more sophisticated methods (e.g., ML) tend to be more effective and thus should be chosen. However, it is the interplay of many factors that one needs to consider in order to avoid choosing an inferior method (a potentially costly mistake, both in terms of computational expense and in terms of reconstruction accuracy). Chapter III of this dissertation details a phylogenetic reconstruction expert system that selects the proper method automatically: it uses a classifier (a decision-tree-inducing algorithm) to map a new data set to a suitable phylogenetic reconstruction method.
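As a rough illustration of the beam-search idea behind Chapter II, the sketch below keeps only the k best partial solutions at each step, a compromise between exhaustive search (keep everything) and greedy search (keep one). It operates on a toy scoring problem rather than on actual tree topologies, and the reliability-driven adjustment of the beam width is only hinted at in a comment; all names and parameters are illustrative assumptions, not the dissertation's implementation.

```python
def beam_search(expand, score, start, beam_width=5, steps=10):
    """Generic beam search: keep the beam_width best candidates at each step.

    expand(state) -> iterable of successor states
    score(state)  -> numeric quality (higher is better)
    """
    beam = [start]
    for _ in range(steps):
        candidates = [s for state in beam for s in expand(state)]
        if not candidates:
            break
        # Keep only the best beam_width candidates; a wider beam explores more
        # of the space, a narrower one is faster (the dissertation tunes this
        # trade-off using pre-computed local reliability information).
        beam = sorted(candidates, key=score, reverse=True)[:beam_width]
    return max(beam, key=score)

if __name__ == "__main__":
    # Toy problem: grow a bit string that maximizes the number of ones.
    best = beam_search(
        expand=lambda s: [s + (0,), s + (1,)],
        score=lambda s: sum(s),
        start=(),
        beam_width=3,
        steps=8,
    )
    print(best, sum(best))
```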

Relevance: 30.00%

Abstract:

Essential biological processes are governed by organized, dynamic interactions between multiple biomolecular systems. Complexes are formed to enable a biological function and are disassembled once the process is completed. Examples of such processes include the translation of messenger RNA into protein by the ribosome, the folding of proteins by chaperonins, and the entry of viruses into host cells. Understanding these fundamental processes by characterizing the molecular mechanisms that enable them would allow the better design of therapies and drugs. Such molecular mechanisms may be revealed through the structural elucidation of the biomolecular assemblies at the core of these processes. Various experimental techniques may be applied to investigate the molecular architecture of biomolecular assemblies. High-resolution techniques, such as X-ray crystallography, may solve the atomic structure of the system, but are typically constrained to biomolecules of limited flexibility and size. In particular, X-ray crystallography requires the sample to form a three-dimensional (3D) crystal lattice, which is technically difficult, if not impossible, to obtain, especially for large, dynamic systems. Often these techniques solve the structures of the constituent components of an assembly, but encounter difficulties when investigating the entire system. On the other hand, imaging techniques such as cryo-electron microscopy (cryo-EM) are able to depict large systems in a near-native environment, without requiring the formation of crystals. The structures solved by cryo-EM cover a wide range of resolutions, from very low levels of detail where only the overall shape of the system is visible, to high resolutions that approach, but do not yet reach, atomic detail. In this dissertation, several modeling methods are introduced to either integrate cryo-EM datasets with structural data from X-ray crystallography or to directly interpret the cryo-EM reconstruction. These computational techniques were developed with the goal of creating an atomic model for the cryo-EM data. Low-resolution reconstructions lack the level of detail needed for a direct atomic interpretation, i.e. one cannot reliably locate the atoms or amino-acid residues within the structure obtained by cryo-EM. One therefore needs to consider additional information, for example structural data from other sources such as X-ray crystallography, to enable such a high-resolution interpretation. Modeling techniques are thus developed to integrate the structural data from the different biophysical sources; examples include the work described in manuscripts I and II of this dissertation. At intermediate and high resolution, cryo-EM reconstructions depict consistent 3D folds, such as tubular features that generally correspond to alpha-helices. Such features can be annotated and later used to build the atomic model of the system; see manuscript III for an alternative approach. Three manuscripts are presented as part of this PhD dissertation, each introducing a computational technique that facilitates the interpretation of cryo-EM reconstructions. The first manuscript is an application paper that describes a heuristic to generate an atomic model of the protein envelope of the Rift Valley fever virus. The second manuscript introduces evolutionary tabu search strategies to enable the integration of multiple component atomic structures with the cryo-EM map of their assembly.
Finally, the third manuscript develops the latter technique further and applies it to annotate consistent 3D patterns in intermediate-resolution cryo-EM reconstructions. The first manuscript, titled An assembly model for Rift Valley fever virus, was submitted for publication in the Journal of Molecular Biology. The cryo-EM structure of the Rift Valley fever virus was previously solved at 27 Å resolution by Dr. Freiberg and collaborators. This reconstruction shows the overall shape of the virus envelope, yet the reduced level of detail prevents a direct atomic interpretation. High-resolution structures are not yet available for the entire virus nor for the two component glycoproteins that form its envelope. However, homology models can be generated for these glycoproteins based on similar structures that are available at atomic resolution. The manuscript presents the steps required to identify an atomic model of the entire virus envelope, based on the low-resolution cryo-EM map of the envelope and the homology models of the two glycoproteins. Starting from the results of an exhaustive search to place the two glycoproteins, the model is built iteratively by running multiple multi-body refinements to hierarchically generate models for the different regions of the envelope. The resulting atomic model is supported by prior knowledge of the virus biology and contains valuable information about the molecular architecture of the system. It provides the basis for further investigations seeking to reveal the different processes in which the virus is involved, such as assembly or fusion. The second manuscript was recently published in the Journal of Structural Biology (doi:10.1016/j.jsb.2009.12.028) under the title Evolutionary tabu search strategies for the simultaneous registration of multiple atomic structures in cryo-EM reconstructions. This manuscript introduces evolutionary tabu search strategies applied to enable multi-body registration. The technique is a hybrid approach that combines a genetic algorithm with a tabu search strategy to promote proper exploration of the high-dimensional search space. As with the Rift Valley fever virus, it is common that the structure of a large multi-component assembly is available at low resolution from cryo-EM, while high-resolution structures are solved for the individual components but not for the entire system. Evolutionary tabu search strategies enable the building of an atomic model for the entire system by considering the different components simultaneously. Such a registration indirectly introduces spatial constraints, as all components need to be placed within the assembly, enabling their proper docking in the low-resolution map of the entire assembly. Along with the method description, the manuscript covers the validation of the approach, presenting its benefits in both synthetic and experimental test cases. The approach successfully docked multiple components at resolutions as low as 40 Å. The third manuscript is entitled Evolutionary Bidirectional Expansion for the Annotation of Alpha Helices in Electron Cryo-Microscopy Reconstructions and was submitted for publication in the Journal of Structural Biology. The modeling approach described in this manuscript applies the evolutionary tabu search strategies in combination with a bidirectional expansion to annotate secondary structure elements in intermediate-resolution cryo-EM reconstructions.
In particular, secondary structure elements such as alpha helices show consistent patterns in cryo-EM data and are visible as rod-like regions of high density. The evolutionary tabu search strategy is applied to identify the placement of the different alpha helices, while the bidirectional expansion characterizes their length and curvature. The manuscript presents the validation of the approach at resolutions ranging between 6 and 14 Å, a level of detail at which alpha helices are visible. Up to a resolution of 12 Å, the method achieves sensitivities between 70 and 100% in experimental test cases, i.e. 70-100% of the alpha helices were correctly and automatically predicted in the experimental data. The three manuscripts presented in this PhD dissertation cover different computational methods for the integration and interpretation of cryo-EM reconstructions. The methods were developed in the molecular modeling software Sculptor (http://sculptor.biomachina.org) and are available to the scientific community interested in the multi-resolution modeling of cryo-EM data. The work spans a wide range of resolutions, covering multi-body refinement and registration at low resolution along with the annotation of consistent patterns at higher resolution. Such methods are essential for the modeling of cryo-EM data and may be applied in other fields where similar spatial problems are encountered, such as medical imaging.
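The hybrid "evolutionary tabu search" idea (a genetic algorithm whose offspring are filtered by a tabu memory of recently visited solutions) can be illustrated on a generic continuous placement problem. The sketch below is a minimal, generic hybrid and not the Sculptor implementation; the objective function, the coarse tabu criterion, and all parameters are illustrative assumptions.

```python
import random

def fitness(x):
    # Stand-in scoring function (a cross-correlation with a density map in the
    # real application); here: a simple quadratic test function.
    return -sum((xi - 1.0) ** 2 for xi in x)

def tabu_key(x, digits=1):
    # Coarse discretization so that similar placements map to the same key.
    return tuple(round(xi, digits) for xi in x)

def evolutionary_tabu_search(dim=6, pop_size=30, generations=200, tabu_size=500):
    pop = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(pop_size)]
    tabu = set()
    best = max(pop, key=fitness)
    for _ in range(generations):
        offspring = []
        while len(offspring) < pop_size:
            a, b = random.sample(pop, 2)
            cut = random.randrange(1, dim)
            child = a[:cut] + b[cut:]                            # one-point crossover
            child = [xi + random.gauss(0, 0.3) for xi in child]  # Gaussian mutation
            key = tabu_key(child)
            if key in tabu:
                continue  # tabu: skip regions of the space visited recently
            tabu.add(key)
            if len(tabu) > tabu_size:
                tabu.pop()  # bounded tabu memory (arbitrary eviction in this sketch)
            offspring.append(child)
        pop = sorted(pop + offspring, key=fitness, reverse=True)[:pop_size]
        if fitness(pop[0]) > fitness(best):
            best = pop[0]
    return best, fitness(best)

if __name__ == "__main__":
    x, f = evolutionary_tabu_search()
    print(round(f, 4), [round(v, 2) for v in x])
```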

Relevance: 30.00%

Abstract:

Angiosperm paleobotany has widened its horizons, incorporated new techniques, developed new databases, and accepted new questions that can now focus on the evolution of the group. The fossil record of early flowering plants is now playing an active role in addressing questions of angiosperm phylogeny, angiosperm origins, and angiosperm radiations. Three basic nodes of angiosperm radiations are identified: (i) the closed carpel and showy radially symmetrical flower, (ii) the bilateral flower, and (iii) fleshy fruits and nutritious nuts and seeds. These are all coevolutionary events and spread out through time during angiosperm evolution. The proposal is made that the genetics of the angiosperms pressured the evolution of the group toward reproductive systems that favored outcrossing. This resulted in the strongest selection in the angiosperms being directed toward the flower, fruits, and seeds. That is why these organs often provide the best systematic characters for the group.

Relevance: 30.00%

Abstract:

We compare two methods for predicting inflation rates in Europe. One method uses a standard back-propagation neural network and the other uses an evolutionary approach, in which both the network weights and the network architecture are evolved. Results indicate that back-propagation produces superior results. However, the evolved network still produces reasonable results, with the advantage that the experimental set-up is minimal. Also of interest is the fact that the Divisia measure of money is superior to simple sum as a predictive tool.
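As a rough illustration of the evolutionary alternative to back-propagation described above, the sketch below evolves the weights of a tiny fixed-architecture feed-forward network with a simple (mu+lambda)-style evolution strategy on a toy one-step-ahead prediction task. It does not evolve the architecture and has no connection to the paper's data or code; the network size, mutation scale, and synthetic series are all illustrative assumptions.

```python
import math, random

def predict(weights, inputs):
    # Tiny 2-input, 2-hidden, 1-output feed-forward net; weights as a flat list.
    w = iter(weights)
    hidden = [math.tanh(sum(next(w) * x for x in inputs) + next(w)) for _ in range(2)]
    return sum(next(w) * h for h in hidden) + next(w)

def mse(weights, series):
    # One-step-ahead prediction: predict series[t] from series[t-2], series[t-1].
    errs = [(predict(weights, series[t - 2:t]) - series[t]) ** 2
            for t in range(2, len(series))]
    return sum(errs) / len(errs)

def evolve_weights(series, pop_size=30, generations=300, sigma=0.1):
    n_weights = 2 * 3 + 3          # hidden layer 2*(2+1) plus output layer (2+1)
    pop = [[random.gauss(0, 1) for _ in range(n_weights)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda w: mse(w, series))
        parents = pop[:pop_size // 2]
        children = [[wi + random.gauss(0, sigma) for wi in random.choice(parents)]
                    for _ in range(pop_size - len(parents))]
        pop = parents + children
    return min(pop, key=lambda w: mse(w, series))

if __name__ == "__main__":
    # Synthetic "inflation-like" series: slow sine trend plus noise.
    series = [0.02 + 0.01 * math.sin(0.3 * t) + random.gauss(0, 0.001)
              for t in range(120)]
    best = evolve_weights(series)
    print("training MSE:", round(mse(best, series), 8))
```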

Relevance: 30.00%

Abstract:

This paper compares two methods to predict inflation rates in Europe. One method uses a standard back propagation neural network and the other uses an evolutionary approach, where the network weights and the network architecture are evolved. Results indicate that back propagation produces superior results. However, the evolving network still produces reasonable results with the advantage that the experimental set-up is minimal. Also of interest is the fact that the Divisia measure of money is superior as a predictive tool over simple sum.

Relevance: 30.00%

Abstract:

Agent-based modelling techniques have been employed for some time in macroeconomics. This paper tests some popular saving rules in an adaptive-evolutionary context by looking at their relative survival value. Three types of household are introduced: a prudent type, a short-sighted type, and one behaving according to the permanent-income hypothesis. Where selection pressure is extremely high, the prudent type clearly crowds out the other two. The second most resilient type appears to be the short-sighted one, but at medium levels of selection pressure none of the types disappears. At the levels of capital efficiency usually assumed, the prudent type drives the economy towards excessive investment, and the economy reaches a savings rate above the golden-rule rate. Relaxing credit constraints can lead to even greater over-investment: as the volume of credit grows, capital owners effectively let themselves be "exploited" by those who have no capital income. In terms of long-run average consumption, a balanced mix of the three types gives the best result, although it comes with considerably higher volatility than when only prudent households exist.
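A minimal sketch of this kind of adaptive-evolutionary selection among saving rules is given below: households following three hard-coded rules accumulate wealth in a toy economy, and each period some households imitate the rule of an agent with higher cumulative consumption, with a tunable selection pressure. This is a deliberately stripped-down illustration, not the paper's model; the rules, the fixed interest and wage values, and the imitation mechanism are all assumptions made for illustration.

```python
import random
from collections import Counter

# Three stylized saving rules (fraction of income and wealth saved each period).
RULES = {
    "prudent":          lambda wealth, income: 0.30 * income,
    "short_sighted":    lambda wealth, income: 0.05 * income,
    "permanent_income": lambda wealth, income: 0.15 * income + 0.02 * wealth,
}

def simulate(n_agents=300, periods=200, selection_pressure=0.1,
             interest=0.03, wage=1.0):
    agents = [{"type": random.choice(list(RULES)), "wealth": 0.0, "cons": 0.0}
              for _ in range(n_agents)]
    for _ in range(periods):
        for a in agents:
            income = wage + interest * a["wealth"]
            saving = min(RULES[a["type"]](a["wealth"], income), income)
            a["wealth"] += saving
            a["cons"] += income - saving          # cumulative consumption
        # Selection: occasionally imitate the rule of an agent whose cumulative
        # consumption is higher (higher pressure = more frequent imitation).
        for a in agents:
            if random.random() < selection_pressure:
                other = random.choice(agents)
                if other["cons"] > a["cons"]:
                    a["type"] = other["type"]
    return Counter(a["type"] for a in agents)

if __name__ == "__main__":
    for pressure in (0.02, 0.2, 0.9):
        print(pressure, dict(simulate(selection_pressure=pressure)))
```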

Relevance: 30.00%

Abstract:

Water-alternating-gas (WAG) is an enhanced oil recovery method combining the improved macroscopic sweep of water flooding with the improved microscopic displacement of gas injection. The optimal design of the WAG parameters is usually based on numerical reservoir simulation via trial and error, limited by the reservoir engineer's availability. Employing optimisation techniques can guide the simulation runs and reduce the number of function evaluations. In this study, robust evolutionary algorithms are utilised to optimise hydrocarbon WAG performance in the E-segment of the Norne field. The first objective function is selected to be the net present value (NPV), and two global semi-random search strategies, a genetic algorithm (GA) and particle swarm optimisation (PSO), are tested on case studies with different numbers of controlling variables, which are sampled from the set of water and gas injection rates, bottom-hole pressures of the oil production wells, cycle ratio, cycle time, the composition of the injected hydrocarbon gas (miscible/immiscible WAG) and the total WAG period. In progressive experiments, the number of decision variables is increased, increasing the problem complexity while potentially improving the efficacy of the WAG process. The second objective function is selected to be the incremental recovery factor (IRF) within a fixed total WAG simulation time, and it is optimised using the same optimisation algorithms. The results from the two optimisation techniques are analysed, and their performance, convergence speed and the quality of the optimal solutions found in multiple trials are compared for each experiment. The distinctions between the optimal WAG parameters resulting from NPV and oil recovery optimisation are also examined. This is the first known work optimising over this complete set of WAG variables; the first use of PSO to optimise a WAG project at the field scale is also illustrated. Compared to the reference cases, the best overall values of the objective functions found by GA and PSO were 13.8% and 14.2% higher, respectively, when NPV is optimised over all the above variables, and 14.2% and 16.2% higher, respectively, when IRF is optimised.
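Particle swarm optimisation, one of the two algorithms compared above, can be summarised in a few lines: each particle carries a velocity that is pulled toward its own best position and toward the swarm's best position. The sketch below optimises a stand-in objective rather than an NPV computed from reservoir simulation; the bounds, inertia and acceleration coefficients are common textbook defaults and are assumptions here, not the study's settings.

```python
import random

def npv_proxy(x):
    # Stand-in for the expensive reservoir-simulation-based NPV evaluation:
    # a smooth function with a known maximum at x = (2, 2, ..., 2).
    return -sum((xi - 2.0) ** 2 for xi in x)

def pso(objective, dim=6, bounds=(0.0, 5.0), n_particles=20, iters=200,
        w=0.7, c1=1.5, c2=1.5):
    lo, hi = bounds
    pos = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_f = [objective(p) for p in pos]
    g = max(range(n_particles), key=lambda i: pbest_f[i])
    gbest, gbest_f = pbest[g][:], pbest_f[g]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] = min(max(pos[i][d] + vel[i][d], lo), hi)  # clamp to bounds
            f = objective(pos[i])
            if f > pbest_f[i]:
                pbest[i], pbest_f[i] = pos[i][:], f
                if f > gbest_f:
                    gbest, gbest_f = pos[i][:], f
    return gbest, gbest_f

if __name__ == "__main__":
    best_x, best_f = pso(npv_proxy)
    print(round(best_f, 4), [round(v, 2) for v in best_x])
```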

Relevance: 30.00%

Abstract:

Evolutionary robotics is a branch of artificial intelligence concerned with the automatic generation of autonomous robots. Usually the form of the robot is predefined and various computational techniques are used to control the machine's behaviour. One aspect is the spontaneous generation of walking in legged robots, and this can be used to investigate the mechanical requirements for efficient walking in bipeds. This paper demonstrates a bipedal simulator that spontaneously generates walking and running gaits. The model can be customized to represent a range of hominoid morphologies and used to predict performance parameters such as preferred speed and metabolic energy cost. Because it does not require any motion capture data, it is particularly suitable for investigating locomotion in fossil animals. The predictions for modern humans are highly accurate in terms of energy cost for a given speed, and thus the values predicted for other bipeds are likely to be good estimates. To illustrate this, the cost of transport is calculated for Australopithecus afarensis. The model allows the degree of maximum extension at the knee to be varied, causing the model to adopt walking gaits ranging from chimpanzee-like to human-like. The energy costs associated with these gait choices can thus be calculated, and this information used to evaluate possible locomotor strategies in early hominids.
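For reference, the cost of transport reported by such simulators is usually defined as metabolic energy per unit distance (J m⁻¹), or per unit distance and body mass (J kg⁻¹ m⁻¹) when comparing animals of different sizes. A minimal helper computing both from a simulator's outputs is sketched below; the function name and the example numbers are illustrative, not values from the paper.

```python
def cost_of_transport(metabolic_power_w, speed_m_s, body_mass_kg=None):
    """Cost of transport from metabolic power and speed.

    Returns J per metre; if body_mass_kg is given, J per kg per metre instead.
    """
    cot = metabolic_power_w / speed_m_s          # (J/s) / (m/s) = J/m
    if body_mass_kg is not None:
        cot /= body_mass_kg                      # J/(kg*m)
    return cot

# Illustrative numbers only: a 65 kg biped walking at 1.3 m/s with a
# metabolic power of 260 W.
print(cost_of_transport(260, 1.3))        # about 200 J per metre
print(cost_of_transport(260, 1.3, 65))    # about 3.1 J per kg per metre
```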

Relevance: 30.00%

Abstract:

To understand the evolution of bipedalism among the hominoids in an ecological context, we need to be able to estimate the energetic cost of locomotion in fossil forms. Ideally such an estimate would be based entirely on morphology since, except for the rare instances where footprints are preserved, this is the only primary source of evidence available. In this paper we use evolutionary robotics techniques (genetic algorithms, pattern generators and mechanical modeling) to produce a biomimetic simulation of bipedalism based on human body dimensions. The mechanical simulation is a seven-segment, two-dimensional model with motive force provided by tension generators representing the major muscle groups acting around the lower-limb joints. Metabolic energy costs are calculated from the muscle model, and bipedal gait is generated using a finite-state pattern generator whose parameters are produced using a genetic algorithm with locomotor economy (maximum distance for a fixed energy cost) as the fitness criterion. The model is validated by comparing the values it generates with those for modern humans. The result (a maximum efficiency of 200 J m⁻¹) is within 15% of the experimentally derived value, which is very encouraging and suggests that this is a useful analytical technique for investigating the locomotor behaviour of fossil forms. Initial work suggests that in the future this technique could be used to estimate other locomotor parameters such as top speed. In addition, the animations produced by this technique are qualitatively very convincing, which suggests that this may also be a useful technique for visualizing bipedal locomotion.
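A bare-bones version of the optimization loop described above (a genetic algorithm searching pattern-generator parameters with locomotor economy as fitness) is sketched below. The walking simulation itself is replaced by a trivial placeholder, since the real seven-segment mechanical model is far beyond an abstract-sized sketch; every function and constant here is an assumption made purely for illustration.

```python
import random

N_PARAMS = 8          # e.g., finite-state pattern-generator timings/activations
ENERGY_BUDGET = 1000  # fixed energy cost (arbitrary units) for the fitness trial

def simulate_walk(params, energy_budget):
    # Placeholder for the mechanical simulation: returns distance travelled
    # before the energy budget is exhausted. A real model would integrate the
    # equations of motion of a multi-segment biped driven by these parameters.
    quality = -sum((p - 0.5) ** 2 for p in params)   # toy "gait quality"
    return max(0.0, energy_budget * (0.1 + quality))

def locomotor_economy(params):
    # Fitness: maximum distance covered for a fixed energy cost.
    return simulate_walk(params, ENERGY_BUDGET)

def genetic_algorithm(pop_size=40, generations=100, mut_sigma=0.05):
    pop = [[random.random() for _ in range(N_PARAMS)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=locomotor_economy, reverse=True)
        parents = pop[:pop_size // 2]
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            child = [random.choice(pair) for pair in zip(a, b)]   # uniform crossover
            child = [min(1.0, max(0.0, g + random.gauss(0, mut_sigma)))
                     for g in child]                               # bounded mutation
            children.append(child)
        pop = parents + children
    return max(pop, key=locomotor_economy)

if __name__ == "__main__":
    best = genetic_algorithm()
    print("best distance:", round(locomotor_economy(best), 2))
```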

Relevance: 30.00%

Abstract:

Evolutionary algorithms alone cannot solve optimization problems very efficiently, since these algorithms involve many random (not very rational) decisions. Combining evolutionary algorithms with other techniques has proven to be an efficient optimization methodology. In this talk, I will explain the basic ideas of three of our algorithms along this line: (1) the orthogonal genetic algorithm, which treats crossover/mutation as an experimental design problem; (2) the multiobjective evolutionary algorithm based on decomposition (MOEA/D), which uses decomposition techniques from traditional mathematical programming in a multiobjective evolutionary algorithm; and (3) regularity-model-based multiobjective estimation of distribution algorithms (RM-MEDA), which use the regularity property and machine learning methods to improve multiobjective evolutionary algorithms.
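The core idea of MOEA/D mentioned in item (2) is to decompose a multiobjective problem into many scalar subproblems (for example via weighted-sum or weighted Tchebycheff aggregation) and to let neighbouring subproblems share good solutions. The sketch below shows a minimal weighted-sum decomposition on a toy two-objective problem; it omits many details of the published algorithm (mating restrictions, reference-point updates, replacement limits), and all parameters are illustrative assumptions.

```python
import random

def objectives(x):
    # Toy biobjective problem: trade-off between being close to 0 and close to 1.
    f1 = sum(xi ** 2 for xi in x)
    f2 = sum((xi - 1.0) ** 2 for xi in x)
    return f1, f2

def scalarize(fs, weight):
    # Weighted-sum aggregation (MOEA/D also supports Tchebycheff and others).
    return weight[0] * fs[0] + weight[1] * fs[1]

def moead(dim=5, n_sub=21, neighbours=3, generations=200, sigma=0.1):
    weights = [(i / (n_sub - 1), 1 - i / (n_sub - 1)) for i in range(n_sub)]
    pop = [[random.random() for _ in range(dim)] for _ in range(n_sub)]
    fit = [objectives(x) for x in pop]
    for _ in range(generations):
        for i in range(n_sub):
            # Neighbourhood of subproblem i: subproblems with close weight vectors.
            nb = list(range(max(0, i - neighbours), min(n_sub, i + neighbours + 1)))
            a, b = random.sample(nb, 2)
            # Offspring: blend two neighbouring solutions, then mutate and clamp.
            child = [min(1.0, max(0.0, 0.5 * (pop[a][d] + pop[b][d])
                                  + random.gauss(0, sigma))) for d in range(dim)]
            cf = objectives(child)
            # The offspring may replace solutions of neighbouring subproblems
            # if it is better on *their* scalarized objective.
            for j in nb:
                if scalarize(cf, weights[j]) < scalarize(fit[j], weights[j]):
                    pop[j], fit[j] = child, cf
    return pop, fit

if __name__ == "__main__":
    _, front = moead()
    for f1, f2 in sorted(front):
        print(round(f1, 3), round(f2, 3))
```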

Relevance: 30.00%

Abstract:

In Part 1 of this thesis, we propose that biochemical cooperativity is a fundamentally non-ideal process. We show quantal effects underlying biochemical cooperativity and highlight apparent ergodic breaking at small volumes. This apparent ergodic breaking manifests itself in a divergence between deterministic and stochastic models. We further predict that this divergence reflects a failure of the deterministic methods rather than an issue with the stochastic simulations.

Ergodic breaking at small volumes may allow these molecular complexes to function as switches to a greater degree than has previously been shown. We propose that this ergodic breaking is a phenomenon that the synapse might exploit to differentiate Ca²⁺ signals that lead either to the strengthening or to the weakening of a synapse. Techniques such as lattice-based statistics and rule-based modeling allow us to confront this non-ideality directly. A natural next step toward understanding the chemical physics underlying these processes is to consider in silico methods, specifically atomistic simulation methods, that might augment our modeling efforts.
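The divergence between deterministic and stochastic descriptions at small copy numbers can be illustrated with a reaction far simpler than the cooperative signalling modules studied in the thesis: reversible dimerization 2A ⇌ A₂ with only ten molecules. The deterministic rate law uses A², whereas the chemical master equation uses the propensity A(A−1)/2, so the mean copy numbers differ systematically at small volumes. The sketch below computes both exactly; it is a generic textbook-style example with arbitrary rate constants, not the thesis's model.

```python
# Reversible dimerization 2A <-> A2 in a tiny volume, N_TOTAL = A + 2*A2 copies.
KF, KR = 1.0, 1.0    # arbitrary stochastic rate constants
N_TOTAL = 10         # small copy number, where discreteness matters

def stochastic_mean_A():
    # Exact stationary distribution over k = number of dimers (birth-death chain),
    # built from detailed balance: pi(k+1)/pi(k) = forward(k)/backward(k+1).
    max_k = N_TOTAL // 2
    pi = [1.0]
    for k in range(max_k):
        a = N_TOTAL - 2 * k
        forward = KF * a * (a - 1) / 2.0   # propensity to form one more dimer
        backward = KR * (k + 1)            # propensity to break a dimer (state k+1)
        pi.append(pi[-1] * forward / backward)
    z = sum(pi)
    return sum(p * (N_TOTAL - 2 * k) for k, p in enumerate(pi)) / z

def deterministic_A():
    # Deterministic steady state solves KF*A^2 = KR*(N_TOTAL - A); the root of
    # this monotone function is found by bisection on [0, N_TOTAL].
    lo, hi = 0.0, float(N_TOTAL)
    for _ in range(80):
        mid = 0.5 * (lo + hi)
        if KF * mid * mid > KR * (N_TOTAL - mid):
            hi = mid
        else:
            lo = mid
    return 0.5 * (lo + hi)

if __name__ == "__main__":
    print("stochastic  <A>:", round(stochastic_mean_A(), 3))
    print("deterministic A:", round(deterministic_A(), 3))
```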

In the second part of this thesis, we use evolutionary algorithms to optimize in silico methods that might be used to describe biochemical processes at the subcellular and molecular levels. While we have applied evolutionary algorithms to several methods, this thesis focuses on the optimization of charge equilibration methods. Accurate charges are essential to understanding the electrostatic interactions involved in ligand binding, as discussed frequently in the first part of this thesis.
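To make the idea of evolutionary optimization of charge equilibration concrete, the sketch below fits per-element electronegativity and hardness parameters of a toy QEq-style charge model to a set of reference charges using a simple evolution strategy. The charge model, the reference data, and all numerical values are placeholders invented for illustration; the thesis's actual parameterization and training data are not reproduced here.

```python
import random

# Toy "molecules": element pairs with reference partial charges on the first atom.
# These reference values are invented for illustration only.
REFERENCE = [
    (("H", "O"), 0.35),
    (("H", "N"), 0.30),
    (("C", "O"), 0.40),
    (("C", "N"), 0.25),
    (("C", "H"), -0.05),
]
ELEMENTS = ["H", "C", "N", "O"]

def predicted_charge(params, pair):
    # Minimal QEq-style model for a two-atom system with zero total charge:
    # q1 = (chi2 - chi1) / (J1 + J2), i.e. charge flows toward the more
    # electronegative atom and is damped by the atomic hardnesses.
    e1, e2 = pair
    chi1, J1 = params[e1]
    chi2, J2 = params[e2]
    return (chi2 - chi1) / (J1 + J2)

def loss(params):
    return sum((predicted_charge(params, pair) - q_ref) ** 2
               for pair, q_ref in REFERENCE)

def random_params():
    return {e: (random.uniform(0, 10), random.uniform(5, 15)) for e in ELEMENTS}

def mutate(params, sigma=0.2):
    # Perturb electronegativity and hardness; keep hardness positive.
    return {e: (chi + random.gauss(0, sigma), max(1.0, J + random.gauss(0, sigma)))
            for e, (chi, J) in params.items()}

def evolve(pop_size=40, generations=400):
    pop = [random_params() for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=loss)
        parents = pop[:pop_size // 4]
        pop = parents + [mutate(random.choice(parents))
                         for _ in range(pop_size - len(parents))]
    return min(pop, key=loss)

if __name__ == "__main__":
    best = evolve()
    print("fit error:", round(loss(best), 5))
    for pair, q_ref in REFERENCE:
        print(pair, "reference", q_ref, "model", round(predicted_charge(best, pair), 3))
```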