980 results for Artificial Selection


Relevance:

30.00%

Publisher:

Abstract:

Tuta absoluta (Meyrick, 1917) is one of the key pests of tomato and other solanaceous crops in South America and, currently, in Eurasia and Africa as well. Because of the heavy losses it causes to the crop, insecticides are the main means of control. However, in the search for more sustainable strategies, biological control has become increasingly important as a component of integrated pest management. Developing such strategies requires a method for rearing T. absoluta in the laboratory on an artificial diet, without the natural host, which is often difficult to obtain and maintain in the laboratory, and which is essential for producing parasitoids specific to this pest. Among the parasitoids most widely used against lepidopteran eggs is Trichogramma pretiosum Riley, 1879, which is employed in applied biological control of this pest. With T. absoluta as the main focus, this work investigated 1) the selection of an artificial diet for this lepidopteran based on physical and chemical characteristics, evaluating its performance over several generations in the laboratory, and 2) the biological and reproductive traits of T. pretiosum parasitizing T. absoluta eggs, together with physical traits of the plant (trichomes), in order to understand biological control of this pest on tomato. A diet based on wheat germ, casein, and cellulose proved suitable for rearing this lepidopteran, since the insect adapted to it over successive generations according to biological and life-table parameters; in addition, eggs from T. absoluta reared on the artificial diet were comparable to those from the natural diet with respect to parasitism by T. pretiosum. Regarding biological control, it was shown that this parasitoid, when reared on T. absoluta eggs, decreases in size and performance over the generations and shows lower flight capacity than insects produced on A. kuehniella, so that high densities of parasitoids per pest egg must be released. Although parasitism of T. absoluta eggs by T. pretiosum was higher on varieties with few trichomes, a high density of these structures did not prevent control of the target pest, depending on their arrangement. Biological control of T. absoluta with T. pretiosum has only a short-lived effect, and frequent releases are needed because parasitoids reared on the pest are less competitive than those from the alternative host, whose eggs are larger than those of T. absoluta.

Relevance:

30.00%

Publisher:

Abstract:

Natural coral reefs are in a state of serious decline worldwide. The pressures of overfishing, recreational activities, environmental pollutants, and global warming have stressed these marine ecosystems to the breaking point. One of the oldest methods of augmenting natural reef systems is the implementation of artificial reefs. These projects are not as simple as dumping waste or scrap materials in offshore areas: proper material selection is vital to produce a healthy artificial marine habitat that is completed on schedule and on budget. This Capstone Project evaluates the most commonly used materials and provides a comparison of their strengths and weaknesses. This comparison provides a valuable tool for project managers as they begin the reef planning process.

Relevance:

30.00%

Publisher:

Abstract:

In this paper, we propose a novel filter for feature selection. The filter relies on the estimation of the mutual information between features and classes. We bypass the estimation of the probability density function with the aid of the entropic-graphs approximation of the Rényi entropy, and the subsequent approximation of the Shannon entropy. The complexity of this bypass does not depend on the number of dimensions but on the number of patterns/samples, and thus the curse of dimensionality is circumvented. We show that it is then possible to outperform a greedy algorithm based on the maximal-relevance, minimal-redundancy criterion. We successfully test our method in the contexts of both image classification and microarray data classification.
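The entropic-graph machinery above is what makes the estimate tractable for continuous, high-dimensional features; the filtering idea itself can be illustrated with a much simpler plug-in (histogram) estimate of mutual information over discrete features. A minimal sketch, in which the toy data, the discretization, and the function names are illustrative assumptions rather than details from the paper:

```python
import math
from collections import Counter

def mutual_information(xs, ys):
    """Plug-in MI estimate (in bits) between two discrete sequences."""
    n = len(xs)
    px, py, pxy = Counter(xs), Counter(ys), Counter(zip(xs, ys))
    mi = 0.0
    for (x, y), c in pxy.items():
        pj = c / n
        mi += pj * math.log2(pj / ((px[x] / n) * (py[y] / n)))
    return mi

def mi_filter(features, labels, k):
    """Rank each feature by its MI with the class labels; keep the top k."""
    order = sorted(range(len(features)),
                   key=lambda i: mutual_information(features[i], labels),
                   reverse=True)
    return order[:k]

# Toy data: feature 0 determines the class, feature 1 is pure noise.
labels = [0, 0, 0, 0, 1, 1, 1, 1]
features = [
    [0, 0, 0, 0, 1, 1, 1, 1],   # perfectly informative
    [0, 1, 0, 1, 0, 1, 0, 1],   # independent of the class
]
top = mi_filter(features, labels, 1)   # the informative feature ranks first
```

The paper's contribution is precisely to replace the counting step with an entropic-graph estimate, so that no histogram or density estimate over a high-dimensional feature space is ever needed.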

Relevance:

30.00%

Publisher:

Abstract:

Prototype Selection (PS) algorithms allow faster Nearest Neighbor classification by keeping only the most profitable prototypes of the training set. In turn, these schemes typically lower classification accuracy. In this work a new strategy for multi-label classification tasks is proposed to solve this accuracy drop without the need to use the whole training set. To this end, given a new instance, the PS algorithm is used as a fast recommender system which retrieves the most likely classes. The actual classification is then performed considering only the prototypes from the initial training set that belong to the suggested classes. Results show that this strategy provides a large set of trade-off solutions which fill the gap between PS-based classification efficiency and conventional kNN accuracy. Furthermore, this scheme is not only able, at best, to reach the performance of conventional kNN with barely a third of the distances computed, but also outperforms the latter in noisy scenarios, proving to be a much more robust approach.
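The two-stage scheme can be sketched with 1-NN as the base classifier. Everything below — the toy data, one prototype per class, and the choice of m = 2 recommended classes — is an illustrative assumption, not a detail taken from the paper:

```python
import math

def candidate_classes(x, prototypes, proto_labels, m=2):
    """Stage 1: the reduced prototype set acts as a fast recommender,
    returning the classes of the m nearest prototypes."""
    order = sorted(range(len(prototypes)),
                   key=lambda i: math.dist(x, prototypes[i]))
    return {proto_labels[i] for i in order[:m]}

def classify(x, train, train_labels, prototypes, proto_labels, m=2):
    """Stage 2: 1-NN over the full training set, restricted to the
    candidate classes suggested by the prototypes."""
    cands = candidate_classes(x, prototypes, proto_labels, m)
    pool = [i for i in range(len(train)) if train_labels[i] in cands]
    best = min(pool, key=lambda i: math.dist(x, train[i]))
    return train_labels[best]

# Toy data: three well-separated classes in 2D.
train = [(0, 0), (0, 1), (5, 5), (5, 6), (10, 0), (10, 1)]
train_labels = [0, 0, 1, 1, 2, 2]
prototypes = [(0, 0.5), (5, 5.5), (10, 0.5)]   # one prototype per class
proto_labels = [0, 1, 2]
pred = classify((0.2, 0.3), train, train_labels, prototypes, proto_labels)
```

Only the small prototype set is scanned in full; the expensive distances to the whole training set are computed solely for the recommended classes, which is where the reported factor-of-three saving comes from.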

Relevance:

30.00%

Publisher:

Abstract:

In the current Information Age, data production and processing demands are ever increasing. This has motivated the appearance of large-scale distributed information, a phenomenon that also applies to Pattern Recognition, so that classic and common algorithms, such as the k-Nearest Neighbour, cannot be applied directly. To improve the efficiency of this classifier, Prototype Selection (PS) strategies can be used. Nevertheless, current PS algorithms were not designed to deal with distributed data, and their performance under these conditions is therefore unknown. This work carries out an experimental study on a simulated framework in which PS strategies can be compared under classical conditions as well as those expected in distributed scenarios. Our results report a general behaviour that degrades as conditions approach more realistic scenarios. However, our experiments also show that some methods are able to achieve a performance fairly similar to that of the non-distributed scenario. Thus, although there is a clear need to develop PS methodologies and algorithms specifically for these situations, those that showed higher robustness against such conditions may be good candidates from which to start.

Relevance:

30.00%

Publisher:

Abstract:

Stabilizing selection is a fundamental concept in evolutionary biology. In the presence of a single intermediate optimum phenotype (fitness peak) on the fitness surface, stabilizing selection should cause the population to evolve toward such a peak. This prediction has seldom been tested, particularly for suites of correlated traits. The lack of tests for an evolutionary match between population means and adaptive peaks may be due, at least in part, to problems associated with empirically detecting multivariate stabilizing selection and with testing whether population means are at the peak of multivariate fitness surfaces. Here we show how canonical analysis of the fitness surface, combined with the estimation of confidence regions for stationary points on quadratic response surfaces, may be used to define multivariate stabilizing selection on a suite of traits and to establish whether natural populations reside on the multivariate peak. We manufactured artificial advertisement calls of the male cricket Teleogryllus commodus and played them back to females in laboratory phonotaxis trials to estimate the linear and nonlinear sexual selection that female phonotactic choice imposes on male call structure. Significant nonlinear selection on the major axes of the fitness surface was convex in nature and displayed an intermediate optimum, indicating multivariate stabilizing selection. The mean phenotypes of four independent samples of males, from the same population as the females used in phonotaxis trials, were within the 95% confidence region for the fitness peak. These experiments indicate that stabilizing sexual selection may play an important role in the evolution of male call properties in natural populations of T. commodus.
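The canonical analysis referred to above diagonalizes the matrix γ of quadratic selection gradients; multivariate stabilizing selection corresponds to all canonical eigenvalues being negative (downward curvature along every canonical axis). A minimal numerical sketch for two traits, with a purely hypothetical γ — the values below are illustrative and not taken from the study:

```python
import math

def eig2_sym(g11, g12, g22):
    """Eigenvalues of the symmetric 2x2 matrix [[g11, g12], [g12, g22]],
    via the trace/determinant closed form."""
    tr = g11 + g22
    det = g11 * g22 - g12 * g12
    disc = math.sqrt(tr * tr / 4 - det)
    return tr / 2 - disc, tr / 2 + disc

# Hypothetical quadratic selection gradients (gamma) for two call traits.
l1, l2 = eig2_sym(-0.20, 0.05, -0.10)
# Both canonical eigenvalues are negative here, i.e. the fitness surface
# curves downward along both canonical axes: multivariate stabilizing
# selection around an intermediate optimum.
```

Testing whether the population mean sits at the peak then amounts to checking whether the mean phenotype falls inside the confidence region for the stationary point of the fitted quadratic surface, as the authors do.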

Relevance:

30.00%

Publisher:

Abstract:

Topological measures of large-scale complex networks are applied to a specific artificial regulatory network model created through a whole-genome duplication and divergence mechanism. This class of networks shares topological features with natural transcriptional regulatory networks. Specifically, these networks display scale-free and small-world topology and possess subgraph distributions similar to those of natural networks. Thus, the topologies inherent in natural networks may be due in part to their method of creation rather than being exclusively shaped by subsequent evolution under selection. The evolvability of the dynamics of these networks is also examined by evolving networks in simulation to obtain three simple types of output dynamics. The networks obtained from this process show a wide variety of topologies and numbers of genes, indicating that it is relatively easy to evolve these classes of dynamics in this model.

Relevance:

30.00%

Publisher:

Abstract:

This paper reports on the development of an artificial neural network (ANN) method to detect laminar defects, following the pattern-matching approach and utilizing dynamic measurement. Although structural health monitoring (SHM) using ANNs has attracted much attention in the last decade, the problem of how to select the optimal class of ANN models has not been investigated in great depth. The lack of a rigorous ANN design methodology is one of the main reasons for the delay in the successful application of this promising technique in SHM. In this paper, a Bayesian method is applied to the selection of the optimal class of ANN models for a given set of input/target training data. The ANN design method is demonstrated for the case of the detection and characterisation of laminar defects in carbon fibre-reinforced beams, using flexural vibration data for beams with and without non-symmetric delamination damage.

Relevance:

30.00%

Publisher:

Abstract:

Non-technical losses (NTL) identification and prediction are important tasks for many utilities. Data from a customer information system (CIS) can be used for NTL analysis; however, the original CIS data need to be pre-processed before any detailed NTL analysis can be carried out accurately and efficiently. In this paper, we propose a feature-selection-based method for CIS data pre-processing that extracts the most relevant information for further analysis such as clustering and classification. By removing irrelevant and redundant features, feature selection is an essential step in the data mining process: it finds an optimal subset of features that improves the quality of the results, giving faster processing, higher accuracy, and simpler models with fewer features. A detailed feature selection analysis is presented in the paper. Both time-domain and load-shape data are compared based on accuracy, consistency, and the statistical dependencies between features.

Relevance:

30.00%

Publisher:

Abstract:

This thesis presents a thorough and principled investigation into the application of artificial neural networks to the biological monitoring of freshwater. It contains original ideas on the classification and interpretation of benthic macroinvertebrates, and aims to demonstrate their superiority over the biotic systems currently used in the UK to report river water quality. The conceptual basis of a new biological classification system is described, and a full review and analysis of a number of river data sets is presented. The biological classification is compared to the common biotic systems using data from the Upper Trent catchment. This data set contained 292 expertly classified invertebrate samples identified to mixed taxonomic levels. The neural network experimental work concentrates on the classification of the invertebrate samples into biological class, where only a subset of the sample is used to form the classification. Further experimentation is conducted into the identification of novel input samples, the classification of samples from different biotopes, and the use of prior information in the neural network models. The biological classification is shown to provide an intuitive interpretation of a graphical representation, generated without reference to the class labels, of the Upper Trent data. The selection of key indicator taxa is considered using three different approaches: one novel, one from information theory, and one from classical statistics. Good indicators of quality class based on these analyses are found to agree well with those chosen by a domain expert. The change in information associated with different levels of identification and enumeration of taxa is quantified. The feasibility of using neural network classifiers and predictors to develop numeric criteria for the biological assessment of sediment contamination in the Great Lakes is also investigated.

Relevance:

30.00%

Publisher:

Abstract:

When composing stock portfolios, managers frequently choose among hundreds of stocks. The stocks' risk properties are analyzed with statistical tools, and managers try to combine them to meet the investors' risk profiles. A recently developed tool for performing such optimization is full-scale optimization (FSO). This methodology is very flexible with respect to investor preferences, but computational limitations have until now made it infeasible when many stocks are considered. We apply the artificial intelligence technique of differential evolution to solve FSO-type stock selection problems of 97 assets. Differential evolution finds optimal solutions by self-learning from randomly drawn candidate solutions. We show that this search technique makes the large-scale problem computationally feasible and that the solutions retrieved are stable. The study also gives further merit to the FSO technique, as it shows that the solutions suit investor risk profiles better than portfolios retrieved from traditional methods.
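Differential evolution itself is compact enough to sketch. The version below is the standard DE/rand/1/bin scheme minimizing a simple sphere function, which stands in for the (negated) FSO utility evaluated on candidate portfolio weights; the population size, F, CR, and all names are illustrative choices, not those of the study:

```python
import random

def differential_evolution(f, dim, bounds, pop_size=30, F=0.7, CR=0.9,
                           generations=200, seed=1):
    """Minimal DE/rand/1/bin minimiser over a box-constrained search space."""
    rng = random.Random(seed)
    lo, hi = bounds
    pop = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(pop_size)]
    for _ in range(generations):
        for i in range(pop_size):
            # Mutation: combine three distinct candidates (rand/1).
            a, b, c = rng.sample([j for j in range(pop_size) if j != i], 3)
            jr = rng.randrange(dim)          # force at least one crossover
            trial = []
            for j in range(dim):
                if j == jr or rng.random() < CR:   # binomial crossover
                    v = pop[a][j] + F * (pop[b][j] - pop[c][j])
                    trial.append(min(max(v, lo), hi))
                else:
                    trial.append(pop[i][j])
            if f(trial) <= f(pop[i]):        # greedy one-to-one selection
                pop[i] = trial
    return min(pop, key=f)

sphere = lambda w: sum(x * x for x in w)     # stand-in objective
best = differential_evolution(sphere, dim=3, bounds=(-5, 5))
```

The "self-learning" mentioned above is visible in the mutation step: search directions are drawn from differences between current candidates, so the step distribution adapts as the population converges.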

Relevance:

30.00%

Publisher:

Abstract:

A new LIBS quantitative analysis method based on adaptive analytical line selection and a Relevance Vector Machine (RVM) regression model is proposed. First, a scheme for adaptively selecting analytical lines is put forward in order to overcome the drawback of high dependency on a priori knowledge. The candidate analytical lines are automatically selected based on the built-in characteristics of the spectral lines, such as spectral intensity, wavelength, and width at half height. The analytical lines used as input variables of the regression model are determined adaptively according to the samples for both training and testing. Second, an LIBS quantitative analysis method based on RVM is presented. The intensities of the analytical lines and the elemental concentrations of certified standard samples are used to train the RVM regression model. The predicted elemental concentrations are given in the form of a confidence interval of the predictive distribution, which is helpful for evaluating the uncertainty contained in the measured spectra. Chromium concentration analysis experiments on 23 certified standard high-alloy steel samples were carried out. The multiple correlation coefficient of the prediction was up to 98.85%, and the average relative error of the prediction was 4.01%. The experimental results showed that the proposed LIBS quantitative analysis method achieved better prediction accuracy and modeling robustness than methods based on partial least squares regression, artificial neural networks, and standard support vector machines.
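A full RVM is beyond a short sketch, but the probabilistic output described above — a predicted concentration with a confidence interval — can be illustrated with closed-form Bayesian linear regression on a single analytical-line intensity, which is the same predictive machinery an RVM uses over kernel basis functions. The calibration data, the prior precision alpha, and the noise precision beta below are hypothetical:

```python
import math

def bayes_linear_fit(xs, ts, alpha=1e-4, beta=25.0):
    """Bayesian linear regression t ~ N(w0 + w1*x, 1/beta) with an
    isotropic N(0, 1/alpha) prior on the weights. Returns a predictor
    yielding the predictive mean and standard deviation."""
    n = len(xs)
    # Posterior precision S_N^{-1} = alpha*I + beta * Phi^T Phi, Phi rows (1, x)
    s00 = alpha + beta * n
    s01 = beta * sum(xs)
    s11 = alpha + beta * sum(x * x for x in xs)
    det = s00 * s11 - s01 * s01
    i00, i01, i11 = s11 / det, -s01 / det, s00 / det   # S_N (2x2 inverse)
    # Posterior mean m_N = beta * S_N * Phi^T t
    b0 = beta * sum(ts)
    b1 = beta * sum(x * t for x, t in zip(xs, ts))
    m0 = i00 * b0 + i01 * b1
    m1 = i01 * b0 + i11 * b1
    def predict(x):
        mean = m0 + m1 * x
        var = 1.0 / beta + i00 + 2 * i01 * x + i11 * x * x   # phi^T S_N phi
        return mean, math.sqrt(var)
    return predict

# Hypothetical calibration: line intensity (a.u.) -> Cr concentration (wt%).
intensity = [0.1, 0.2, 0.3, 0.4, 0.5]
conc      = [1.1, 2.0, 3.1, 3.9, 5.0]
predict = bayes_linear_fit(intensity, conc)
mean, sd = predict(0.35)   # mean plus a predictive standard deviation
```

Reporting `mean ± 1.96 * sd` gives the kind of 95% interval the method returns; in the RVM, the sparsity over basis functions comes from learning a separate alpha per weight.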

Relevance:

30.00%

Publisher:

Abstract:

This research first evaluated the levels and types of herbivory experienced by Centrosema virginianum plants in their native habitat and how florivory affected pollinator activity. I found that populations of C. virginianum in two pine rockland habitat fragments experienced higher herbivory levels (15% and 22%) than plants in the protected study site (8.6%). I found that bees (Hymenoptera) pollinated butterfly pea, and that florivores had a negative effect on pollinator visitation rates and therefore on the seed set of the population.
I then conducted a study using a greenhouse population of C. virginianum, applying artificial herbivory treatments: control, mild herbivory, and severe herbivory. Flower size, pollen produced, ovules produced, and seeds produced were negatively affected by herbivory. I did not find differences among treatments in the volume or quality of nectar produced by flowers. Surprisingly, severely damaged plants produced flowers with larger pollen than those from mildly damaged and undamaged plants. Results showed that plants tolerated mild and severe herbivory with 6% and 17% reductions in total fitness components, respectively. However, the investment of resources was not equal between sexual functions.
A comparison of the ability to sire seeds between large and small pollen was necessary to establish the biological consequence of size for pollen performance. I found that fruits produced an average of 18.7 ± 1.52 and 17.7 ± 1.50 seeds from large- and small-pollen fertilization, respectively. These findings supported a pollen number-size trade-off in plants under severe herbivory treatments. As far as I know, this result has not previously been reported.
Lastly, I tested how herbivory influenced seed abortion patterns, examining how resources are allocated to different regions within fruits under artificial herbivory treatments. I found that self-fertilized fruits had greater seed abortion rates than cross-fertilized fruits. The proportion of seeds aborted was lower in the middle regions of cross-fertilized fruits, which produced more vigorous progeny; self-fertilized fruits did not show patterns of seedling vigor. I also found that early abortion was higher closer to the peduncular end of the fruit. The position of seeds within fruits could be important in the seed dispersal mechanism characteristic of this species.

Relevance:

30.00%

Publisher:

Abstract:

Traffic incidents are non-recurring events that can cause a temporary reduction in roadway capacity. They have been recognized as a major contributor to traffic congestion on our nation’s highway systems. To alleviate their impacts on capacity, automatic incident detection (AID) has been applied as an incident management strategy to reduce the total incident duration. AID relies on an algorithm to identify the occurrence of incidents by analyzing real-time traffic data collected from surveillance detectors. Significant research has been performed to develop AID algorithms for incident detection on freeways; however, similar research on major arterial streets remains largely at the initial stage of development and testing. This dissertation research aims to identify design strategies for the deployment of an Artificial Neural Network (ANN) based AID algorithm for major arterial streets. A section of the US-1 corridor in Miami-Dade County, Florida was coded in the CORSIM microscopic simulation model to generate data for both model calibration and validation. To better capture the relationship between the traffic data and the corresponding incident status, Discrete Wavelet Transform (DWT) and data normalization were applied to the simulated data. Multiple ANN models were then developed for different detector configurations, historical data usage, and the selection of traffic flow parameters. To assess the performance of different design alternatives, the model outputs were compared based on both detection rate (DR) and false alarm rate (FAR). The results show that the best models were able to achieve a high DR of between 90% and 95%, a mean time to detect (MTTD) of 55-85 seconds, and a FAR below 4%. The results also show that a detector configuration including only the mid-block and upstream detectors performs almost as well as one that also includes a downstream detector. 
In addition, DWT was found to improve model performance, and the use of historical data from previous time cycles improved the detection rate. Speed was found to have the most significant impact on the detection rate, while volume contributed the least. The results from this research provide useful insights into the design of AID for arterial street applications.
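The DWT pre-processing step can be illustrated with a one-level Haar transform; the abstract does not specify the wavelet, so Haar and the toy speed readings below are assumptions for illustration:

```python
def haar_dwt(signal):
    """One level of an (unnormalized) Haar discrete wavelet transform:
    split an even-length signal into pairwise averages (approximation
    coefficients) and half-differences (detail coefficients)."""
    a = [(signal[i] + signal[i + 1]) / 2 for i in range(0, len(signal), 2)]
    d = [(signal[i] - signal[i + 1]) / 2 for i in range(0, len(signal), 2)]
    return a, d

# Hypothetical 30-second speed readings (mph) at a mid-block detector:
# the sudden drop mid-series is the kind of change an incident produces.
speeds = [42, 41, 43, 42, 40, 18, 17, 16]
approx, detail = haar_dwt(speeds)
# The abrupt speed drop shows up as one large detail coefficient, giving
# the downstream ANN a localized, denoised view of the disturbance.
```

This separation of slow trend (approximation) from abrupt change (detail) is a plausible reason DWT features improved the detection rate in the study.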
