969 results for Evolutionary Algorithm
Abstract:
1. The management of threatened species is an important practical way in which conservationists can intervene in the extinction process and reduce the loss of biodiversity. Understanding the causes of population declines (past, present and future) is pivotal to designing effective practical management. This is the declining-population paradigm identified by Caughley.
2. There are three broad classes of ecological tool used by conservationists to guide management decisions for threatened species: statistical models of habitat use, demographic models and behaviour-based models. Each of these is described here, illustrated with a case study and evaluated critically in terms of its practical application.
3. These tools are fundamentally different. Statistical models of habitat use and demographic models both use descriptions of patterns in abundance and demography, in relation to a range of factors, to inform management decisions. In contrast, behaviour-based models describe the evolutionary processes underlying these patterns, and derive such patterns from the strategies employed by individuals when competing for resources under a specific set of environmental conditions.
4. Statistical models of habitat use and demographic models have been used successfully to make management recommendations for declining populations. To do this, assumptions are made about population growth or vital rates that will apply when environmental conditions are restored, based on either past data collected under favourable environmental conditions or estimates of these parameters when the agent of decline is removed. As a result, they can only be used to make reliable quantitative predictions about future environments when a comparable environment has been experienced by the population of interest in the past.
5. Many future changes in the environment driven by management will not have been experienced by a population in the past. Under these circumstances, vital rates and their relationship with population density will change in the future in a way that is not predictable from past patterns. Reliable quantitative predictions about population-level responses then need to be based on an explicit consideration of the evolutionary processes operating at the individual level.
6. Synthesis and applications. It is argued that evolutionary theory underpins Caughley's declining-population paradigm, and that it needs to become much more widely used within mainstream conservation biology. This will help conservationists examine critically the reliability of the tools they have traditionally used to aid management decision-making. It will also give them access to alternative tools, particularly when predictions are required for changes in the environment that have not been experienced by a population in the past.
Abstract:
We estimate the body sizes of direct ancestors of extant carnivores, and examine selected aspects of life history as a function not only of species' current size, but also of recent changes in size. Carnivore species that have undergone marked recent evolutionary size change show life history characteristics typically associated with species closer to the ancestral body size. Thus, phyletic giants tend to mature earlier and have larger litters of smaller offspring at shorter intervals than do species of the same body size that are not phyletic giants. Phyletic dwarfs, by contrast, have slower life histories than nondwarf species of the same body size. We discuss two possible mechanisms for the legacy of recent size change: lag (in which life history variables cannot evolve as quickly as body size, leading to species having the 'wrong' life history for their body size) and body size optimization (in which life history and hence body size evolve in response to changes in energy availability); at present, we cannot distinguish between these alternatives. Our finding that recent body size changes help explain residual variation around life history allometries shows that a more dynamic view of character change enables comparative studies to make more precise predictions about species traits in the context of their evolutionary background.
Abstract:
We have developed a novel hill-climbing genetic algorithm (GA) for simulation of protein folding. The program (written in C) builds a set of Cartesian points to represent an unfolded polypeptide's backbone. The dihedral angles determining the chain's configuration are stored in an array of chromosome structures that is copied and then mutated. The fitness of the mutated chain's configuration is determined by its radius of gyration. A four-helix bundle was used to optimise simulation conditions, and the program was compared with other, larger genetic algorithms on a variety of structures. The program ran 50% faster than other GA programs. Overall, tests on 100 non-redundant structures gave results comparable to those of other genetic algorithms, with the hill-climbing program running between 20% and 50% faster. Examples including crambin, cytochrome c, cytochrome B and hemerythrin gave good secondary-structure fits, with overall alpha-carbon RMS deviations of between 5 and 5.6 Angstrom when an optimised hydrophobic term was used in the fitness function. (C) 2003 Elsevier Ltd. All rights reserved.
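The mutate-and-keep-if-better loop the abstract describes can be sketched in miniature. This is a toy illustration only: the 2-D chain geometry, the Gaussian mutation width, and all parameter values are assumptions for demonstration, not the paper's C implementation, which works with full backbone dihedrals and an additional hydrophobic fitness term.

```python
import math
import random

def chain_coords(angles):
    """Build a 2-D backbone of unit-length bonds; each gene is a turn angle."""
    x, y, theta = 0.0, 0.0, 0.0
    coords = [(x, y)]
    for a in angles:
        theta += a
        x += math.cos(theta)
        y += math.sin(theta)
        coords.append((x, y))
    return coords

def radius_of_gyration(coords):
    """RMS distance of the chain's points from their centroid."""
    n = len(coords)
    cx = sum(p[0] for p in coords) / n
    cy = sum(p[1] for p in coords) / n
    return math.sqrt(sum((p[0] - cx) ** 2 + (p[1] - cy) ** 2
                         for p in coords) / n)

def hill_climb(n_res=30, steps=2000, seed=1):
    """Hill-climbing GA: mutate one gene, keep the child only if it
    compacts the chain (lower radius of gyration)."""
    rng = random.Random(seed)
    angles = [0.0] * n_res      # start from a fully extended chain
    best = radius_of_gyration(chain_coords(angles))
    for _ in range(steps):
        child = angles[:]       # copy the "chromosome"
        child[rng.randrange(n_res)] += rng.gauss(0.0, 0.5)
        rg = radius_of_gyration(chain_coords(child))
        if rg < best:           # accept improvements only
            angles, best = child, rg
    return best
```

Because only improving mutations are accepted, the loop never backtracks, which is what makes the single-parent scheme cheaper than a full population-based GA.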
Abstract:
Liquid chromatography-mass spectrometry (LC-MS) datasets can be compared or combined following chromatographic alignment. Here we describe a simple solution to the specific problem of aligning one LC-MS dataset and one LC-MS/MS dataset, acquired on separate instruments from an enzymatic digest of a protein mixture, using feature extraction and a genetic algorithm. First, the LC-MS dataset is searched within a few ppm of the calculated theoretical masses of peptides confidently identified by LC-MS/MS. A piecewise linear function is then fitted to these matched peptides using a genetic algorithm with a fitness function that is insensitive to incorrect matches but sufficiently flexible to adapt to the discrete shifts common when comparing LC datasets. We demonstrate the utility of this method by aligning ion trap LC-MS/MS data with accurate LC-MS data from an FTICR mass spectrometer and show how hybrid datasets can improve peptide and protein identification by combining the speed of the ion trap with the mass accuracy of the FTICR, similar to using a hybrid ion trap-FTICR instrument. We also show that the high resolving power of FTICR can improve precision and linear dynamic range in quantitative proteomics. The alignment software, msalign, is freely available as open source.
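The key design element above is a fitness function that is "insensitive to incorrect matches". One common way to get that property is to cap each residual so a gross mismatch contributes at most a fixed penalty; the sketch below takes that form, with knot placement, cap value, and function names as illustrative assumptions rather than msalign's actual objective.

```python
import bisect

def piecewise_linear(xs, ys, x):
    """Evaluate a piecewise linear retention-time warp defined by knots
    (xs must be sorted ascending)."""
    if x <= xs[0]:
        return ys[0]
    if x >= xs[-1]:
        return ys[-1]
    i = bisect.bisect_right(xs, x)
    t = (x - xs[i - 1]) / (xs[i] - xs[i - 1])
    return ys[i - 1] + t * (ys[i] - ys[i - 1])

def robust_fitness(ys, xs, matches, cap=1.0):
    """Negated sum of capped residuals: a wrong peptide match costs at
    most `cap` minutes, so outliers cannot dominate the fitted warp."""
    return -sum(min(cap, abs(piecewise_linear(xs, ys, a) - b))
                for a, b in matches)
```

A GA would then evolve the knot heights `ys` to maximise this fitness; the cap is what keeps the optimum anchored to the correctly matched peptides.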
Abstract:
The convergence speed of the standard least mean square (LMS) adaptive array may be degraded in mobile communication environments. Several conventional variable step size LMS algorithms have been proposed to enhance the convergence speed while maintaining a low steady-state error. In this paper, a new variable step size LMS algorithm based on the concept of accumulated instantaneous error is proposed. In the proposed algorithm, the accumulated instantaneous error is used to vary the step size parameter of the standard LMS. Simulation results show that the proposed algorithm is simpler and yields better performance than conventional variable step size LMS algorithms.
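The idea of driving the step size from an accumulated error can be sketched as follows. The leaky-accumulation form, the clipping bounds, and all parameter values are assumptions chosen to illustrate the mechanism, not the paper's exact update rule.

```python
import random

def vss_lms(x, d, n_taps=4, gamma=0.9, mu_min=0.001, mu_max=0.2):
    """Variable step size LMS: the step tracks a leaky accumulation of the
    instantaneous squared error, so a large recent error gives a large step
    (fast convergence) and a small one gives a small step (low steady-state
    error)."""
    w = [0.0] * n_taps
    acc = 0.0
    errors = []
    for n in range(n_taps - 1, len(x)):
        u = x[n - n_taps + 1:n + 1][::-1]           # tap-delay line
        y = sum(wi * ui for wi, ui in zip(w, u))    # adaptive filter output
        e = d[n] - y                                # instantaneous error
        acc = gamma * acc + (1 - gamma) * e * e     # accumulated error energy
        mu = min(mu_max, max(mu_min, acc))          # clipped variable step
        w = [wi + mu * e * ui for wi, ui in zip(w, u)]
        errors.append(e)
    return w, errors
```

Used for noise-free system identification of a short FIR channel, the error drops quickly while the step size settles toward its lower bound.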
Abstract:
This paper represents the first step in ongoing work on designing an unsupervised method based on a genetic algorithm for intrusion detection. Its main role in a broader system is to flag unusual traffic and thereby provide the possibility of detecting unknown attacks. Most of the machine-learning techniques deployed for intrusion detection are supervised, as these techniques are generally more accurate, but this implies the need to label the data for training and testing, which is time-consuming and error-prone. Hence, our goal is to devise an anomaly detector that is unsupervised but at the same time robust and accurate. Genetic algorithms are robust and able to avoid getting stuck in local optima, unlike many other clustering techniques. The model is verified on the KDD99 benchmark dataset, generating a solution competitive with state-of-the-art results, which demonstrates the potential of the proposed method.
Abstract:
This paper deals with the design of optimal multiple gravity assist trajectories with deep space manoeuvres. A pruning method which considers the sequential nature of the problem is presented. The method locates feasible vectors using local optimization and applies a clustering algorithm to find reduced bounding boxes which can be used in a subsequent optimization step. Since multiple local minima remain within the pruned search space, the use of a global optimization method, such as Differential Evolution, is suggested for finding solutions which are likely to be close to the global optimum. Two case studies are presented.
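For readers unfamiliar with the suggested global optimizer, the classic DE/rand/1/bin scheme can be sketched as below. This is generic textbook Differential Evolution applied to a toy objective, not the paper's pruning-and-clustering pipeline; all parameter values are conventional defaults.

```python
import random

def differential_evolution(f, bounds, pop_size=20, F=0.7, CR=0.9,
                           generations=300, seed=0):
    """DE/rand/1/bin: mutate with a scaled difference of two random members,
    cross over coordinate-wise, keep the trial if it is no worse."""
    rng = random.Random(seed)
    dim = len(bounds)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    cost = [f(x) for x in pop]
    for _ in range(generations):
        for i in range(pop_size):
            a, b, c = rng.sample([j for j in range(pop_size) if j != i], 3)
            j_rand = rng.randrange(dim)     # at least one mutated coordinate
            trial = []
            for j in range(dim):
                if rng.random() < CR or j == j_rand:
                    v = pop[a][j] + F * (pop[b][j] - pop[c][j])
                else:
                    v = pop[i][j]
                lo, hi = bounds[j]
                trial.append(min(hi, max(lo, v)))   # clamp to the search box
            fc = f(trial)
            if fc <= cost[i]:               # greedy one-to-one selection
                pop[i], cost[i] = trial, fc
    best = min(range(pop_size), key=cost.__getitem__)
    return pop[best], cost[best]
```

In the trajectory-design setting, the bounds would be the reduced bounding boxes produced by the pruning step, which is precisely what makes a population method like DE affordable there.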
Abstract:
This paper proposes a new iterative algorithm for OFDM joint data detection and phase noise (PHN) cancellation based on minimum mean square prediction error. We particularly highlight the problem of "overfitting", whereby the iterative approach may converge to a trivial solution. Although addressing it is essential for this joint approach, the overfitting problem has received relatively little attention in existing algorithms. In this paper, specifically, we apply a hard-decision procedure at every iterative step to overcome overfitting. Moreover, compared with existing algorithms, a more accurate Padé approximation is used to represent the phase noise, and a more robust and compact fast process based on Givens rotations is proposed to reduce the complexity to a practical level. Numerical simulations are given to verify the proposed algorithm.
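The hard-decision step amounts to snapping each detected symbol to the nearest constellation point before re-estimating the phase noise. A minimal sketch for QPSK follows; the choice of QPSK and the unit-energy normalisation are assumptions for illustration, as the abstract does not specify the modulation.

```python
import math

def qpsk_hard_decision(z):
    """Snap a noisy complex sample to the nearest unit-energy QPSK point.
    Forcing symbols onto the constellation at each iteration prevents the
    phase-noise estimate from absorbing the data (the trivial-solution
    overfitting failure mode)."""
    return complex(1.0 if z.real >= 0 else -1.0,
                   1.0 if z.imag >= 0 else -1.0) / math.sqrt(2.0)
```

Because the decided symbols are constrained to a finite alphabet, the subsequent phase-noise fit can no longer drive the prediction error to zero by distorting the data estimates.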
Abstract:
Using the classical Parzen window (PW) estimate as the target function, the sparse kernel density estimator is constructed in a forward constrained regression manner. The leave-one-out (LOO) test score is used for kernel selection. A jackknife parameter estimator, subject to a positivity-constraint check, is used to estimate a single parameter at each forward step. As such, the proposed approach is simple to implement and the associated computational cost is very low. An illustrative example demonstrates that the proposed approach is effective in constructing sparse kernel density estimators with accuracy comparable to that of the classical Parzen window estimate.
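The target function here, the classical Parzen window estimate, is simple enough to state in a few lines. The Gaussian kernel and the bandwidth value in the sketch are conventional illustrative choices, not the paper's settings; the paper's contribution, the sparse forward-regression approximation of this estimate, is not reproduced here.

```python
import math
import random

def parzen_window(data, h):
    """Classical Parzen-window density estimate with a Gaussian kernel of
    bandwidth h: the full-sample target that a sparse estimator would
    approximate with far fewer kernels."""
    n = len(data)
    norm = 1.0 / (n * h * math.sqrt(2.0 * math.pi))
    def p(x):
        # one kernel per data point -- O(n) cost per evaluation
        return norm * sum(math.exp(-0.5 * ((x - xi) / h) ** 2) for xi in data)
    return p
```

The O(n)-kernels-per-evaluation cost visible in `p` is exactly the motivation for a sparse estimator that keeps only a small subset of kernels.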
Abstract:
In this paper, we present an on-line estimation algorithm for an uncertain time delay in a continuous system, based on observed input-output data subject to observation noise. A first-order Padé approximation is used to approximate the time delay. At each time step, the algorithm combines the well-known Kalman filter and the recursive instrumental variable least squares (RIVLS) algorithm in cascade form. The instrumental variable least squares algorithm is used to achieve consistency of the delay parameter estimate, since an errors-in-variables model is involved. An illustrative example demonstrates the efficacy of the proposed approach.
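The first-order Padé approximation used above replaces the irrational delay transfer function exp(-s*tau) with the rational all-pass (1 - s*tau/2)/(1 + s*tau/2), which is accurate at low frequencies. A quick numerical check of that claim (the frequency and delay values are arbitrary illustrations):

```python
import cmath

def pade_delay(tau, s):
    """First-order Pade approximation of a pure delay:
    exp(-s*tau) ~= (1 - s*tau/2) / (1 + s*tau/2)."""
    return (1.0 - s * tau / 2.0) / (1.0 + s * tau / 2.0)

tau = 0.5
s = 0.2j                          # evaluate on the jw-axis, omega = 0.2 rad/s
exact = cmath.exp(-s * tau)       # true delay response
approx = pade_delay(tau, s)       # rational approximation
```

On the imaginary axis the numerator and denominator are complex conjugates, so the approximation has unit magnitude at every frequency, matching the pure delay exactly in gain and deviating only in phase at higher frequencies.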