984 results for SELECTION PRESSURE


Relevance: 60.00%

Abstract:

The strong selection pressure exerted by intensive glyphosate use in cultivated areas has selected populations of the Rubiaceae weed species Borreria latifolia (Aubl.) K.Schum. (broadleaf buttonweed), Galianthe chodatiana (Standl.) E.L. Cabral (galianthe) and Richardia brasiliensis Gomes (Brazilian pusley) with differential sensitivity to this herbicide in the South region of Brazil. Control of these weeds with herbicides is troublesome, signalling the need to incorporate management practices for ruderal flora and crops that are more sustainable and result in more efficient long-term control. It is therefore important to expand the information available on their biology and management. This study aimed to: (a) select efficient methods to overcome dormancy of B. latifolia and G. chodatiana and determine how they influence the kinetics of seed germination; (b) analyze the effects of temperature, irradiance, pH, aluminum and salinity on seed germination and initial seedling growth of B. latifolia, G. chodatiana and R. brasiliensis; (c) evaluate tolerance to glyphosate in biotypes of B. latifolia, G. chodatiana and R. brasiliensis through dose-response curves and compare two methods of evaluating herbicidal control; and (d) evaluate the effectiveness of alternative herbicides applied in pre-emergence and in early and late post-emergence of the three species. Treatment with KNO3 2%/3 h + gibberellic acid 400 ppm resulted in the highest percentage of G. chodatiana seed germination. This treatment, and also dry heat (60 °C/30 min) + KNO3 2%/3 h, was the most effective in overcoming dormancy of B. latifolia. G. chodatiana and R. brasiliensis tolerate lower temperatures during germination, while B. latifolia tolerates higher temperatures. B. latifolia and R. brasiliensis are positively photoblastic, while G. chodatiana is indifferent to photoperiod. B. latifolia shows higher germination and early development at pH 3, while G. chodatiana and R. brasiliensis prefer a pH between 5 and 7. B. latifolia and G. chodatiana were more tolerant of aluminum during germination than R. brasiliensis. Low salt levels were sufficient to reduce seed germination in all three species. Some biotypes of B. latifolia and R. brasiliensis showed medium-to-high glyphosate tolerance and were not controlled even by doses higher than recommended. G. chodatiana was not controlled by the highest dose used, showing high glyphosate tolerance. The herbicides sulfentrazone, S-metolachlor and saflufenacil sprayed in pre-emergence showed high efficacy on both B. latifolia and R. brasiliensis, while chlorimuron-ethyl and diclosulam were effective only on R. brasiliensis. In early post-emergence, fomesafen, lactofen and flumioxazin efficiently controlled plants of all three species, while bentazon showed high efficacy only on B. latifolia. The susceptibility of G. chodatiana to applications in early post-emergence is noteworthy, because control effectiveness and the number of effective herbicides decrease with increasing plant age. Many treatments involving tank mixes or sequential applications with glyphosate were effective in controlling B. latifolia and R. brasiliensis plants at an advanced stage of development.
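Dose-response curves of the kind used in objective (c) are commonly fitted with a four-parameter log-logistic model. A minimal sketch in Python; the doses, biomass responses and starting values below are invented for illustration, not the study's data:

```python
import numpy as np
from scipy.optimize import curve_fit

def log_logistic(dose, lower, upper, ed50, slope):
    # Four-parameter log-logistic dose-response model:
    # response falls from `upper` to `lower` around the ED50
    return lower + (upper - lower) / (1.0 + (dose / ed50) ** slope)

# Hypothetical response data: shoot biomass as % of the untreated check
doses = np.array([62.5, 125.0, 250.0, 500.0, 1000.0, 2000.0, 4000.0])  # g a.e./ha
biomass = np.array([98.0, 93.0, 80.0, 55.0, 30.0, 12.0, 5.0])

# Bounds keep ED50 and slope positive during optimisation
params, _ = curve_fit(log_logistic, doses, biomass,
                      p0=[1.0, 100.0, 500.0, 1.0],
                      bounds=([0.0, 50.0, 1.0, 0.1], [30.0, 120.0, 5000.0, 5.0]))
lower, upper, ed50, slope = params
print(f"Estimated ED50: {ed50:.0f} g a.e./ha")
```

Comparing fitted ED50 values between biotypes is one standard way to quantify the differential glyphosate sensitivity the abstract describes.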

Relevance: 60.00%

Abstract:

A Bayesian optimisation algorithm for a nurse scheduling problem is presented, which involves choosing a suitable scheduling rule from a set for each nurse's assignment. When a human scheduler works, he normally builds a schedule systematically following a set of rules. After much practice, the scheduler gradually masters the knowledge of which solution parts go well with others. He can identify good parts and is aware of the solution quality even if the scheduling process is not yet completed, thus having the ability to finish a schedule by using flexible, rather than fixed, rules. In this paper, we design a more human-like scheduling algorithm, by using a Bayesian optimisation algorithm to implement explicit learning from past solutions. A nurse scheduling problem from a UK hospital is used for testing. Unlike our previous work that used Genetic Algorithms to implement implicit learning [1], the learning in the proposed algorithm is explicit, i.e. we identify and mix building blocks directly. The Bayesian optimisation algorithm is applied to implement such explicit learning by building a Bayesian network of the joint distribution of solutions. The conditional probability of each variable in the network is computed according to an initial set of promising solutions. Subsequently, each new instance for each variable is generated by using the corresponding conditional probabilities, until all variables have been generated, i.e. in our case, new rule strings have been obtained. Sets of rule strings are generated in this way, some of which will replace previous strings based on fitness. If stopping conditions are not met, the conditional probabilities for all nodes in the Bayesian network are updated again using the current set of promising rule strings. For clarity, consider the following toy example of scheduling five nurses with two rules (1: random allocation, 2: allocate nurse to low-cost shifts). 
At the beginning of the search, the probabilities of choosing rule 1 or 2 for each nurse are equal, i.e. 50%. After a few iterations, due to selection pressure and reinforcement learning, we observe two solution pathways: because purely low-cost or purely random allocation produces low-quality solutions, either rule 1 is used for the first 2-3 nurses and rule 2 for the remainder, or vice versa. In essence, the Bayesian network learns 'use rule 2 after using rule 1 two to three times', or vice versa. It should be noted that for our scheduling problem, and most others, the structure of the network model is known and all variables are fully observed. In this case, the goal of learning is to find the rule values that maximize the likelihood of the training data, so learning amounts to 'counting' in the case of multinomial distributions. For our problem, we use four rules: Random, Cheapest Cost, Best Cover, and Balance of Cost and Cover. In more detail, the steps of our Bayesian optimisation algorithm for nurse scheduling are: 1. Set t = 0 and generate an initial population P(0) at random; 2. Use roulette-wheel selection to choose a set of promising rule strings S(t) from P(t); 3. Compute the conditional probabilities of each node according to this set of promising solutions; 4. Assign each nurse using roulette-wheel selection based on the rules' conditional probabilities; a set of new rule strings O(t) is generated in this way; 5. Create a new population P(t+1) by replacing some rule strings in P(t) with O(t), and set t = t+1; 6. If the termination conditions are not met (we use 2000 generations), go to step 2. Computational results from 52 real data instances demonstrate the success of this approach. They also suggest that the learning mechanism in the proposed approach might be suitable for other scheduling problems. Another direction for further research is to see whether there is a good constructing sequence for individual data instances, given a fixed nurse scheduling order. 
If so, the good patterns could be recognised and extracted as new domain knowledge. Using this extracted knowledge, we could assign specific rules to the corresponding nurses beforehand and schedule only the remaining nurses with all available rules, reducing the solution space. Acknowledgements: This work was funded by the UK Government's major funding agency, the Engineering and Physical Sciences Research Council (EPSRC), under grant GR/R92899/01. References: [1] Aickelin U, "An Indirect Genetic Algorithm for Set Covering Problems", Journal of the Operational Research Society, 53(10): 1118-1126,
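The six numbered steps above can be illustrated with a short Python sketch. This is a toy reconstruction, not the authors' implementation: the fitness function, rule encoding and population sizes are invented, and the Bayesian network is reduced to first-order conditional probabilities between consecutive nurses, in the spirit of the two-rule example:

```python
import random

random.seed(42)

RULES = [1, 2, 3, 4]   # hypothetical encoding: Random, Cheapest Cost, Best Cover, Balance
N_NURSES = 5
POP_SIZE = 50

def fitness(string):
    # Toy surrogate for schedule quality: rewards mixing rules and using rule 2,
    # mimicking the 'rule 1 first, then rule 2' pattern from the example above
    return sum(1 for a, b in zip(string, string[1:]) if a != b) + string.count(2)

def select_promising(pop, k):
    # Step 2: roulette-wheel selection of k promising rule strings
    weights = [fitness(s) + 1 for s in pop]
    return random.choices(pop, weights=weights, k=k)

def learn(promising):
    # Step 3: 'counting' estimate of P(rule at nurse i | rule at nurse i-1),
    # with add-one smoothing so unseen transitions remain possible
    counts = [{r: {q: 1 for q in RULES} for r in RULES} for _ in range(N_NURSES)]
    for s in promising:
        for i in range(1, N_NURSES):
            counts[i][s[i - 1]][s[i]] += 1
    return counts

def sample(counts, promising):
    # Step 4: build a new rule string nurse by nurse from the learned probabilities
    string = [random.choice(promising)[0]]
    for i in range(1, N_NURSES):
        dist = counts[i][string[-1]]
        string.append(random.choices(list(dist), weights=list(dist.values()))[0])
    return string

# Steps 1, 5 and 6: initialise, then iterate selection / learning / sampling
pop = [[random.choice(RULES) for _ in range(N_NURSES)] for _ in range(POP_SIZE)]
for t in range(100):
    promising = select_promising(pop, 20)
    counts = learn(promising)
    offspring = [sample(counts, promising) for _ in range(POP_SIZE // 2)]
    pop = sorted(pop, key=fitness, reverse=True)[:POP_SIZE // 2] + offspring

best = max(pop, key=fitness)
```

In the real algorithm the fitness would come from evaluating the schedule each rule string produces against hospital constraints; here it only serves to make the counting-and-resampling loop concrete.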

Relevance: 60.00%

Abstract:

Understanding the evolution of sociality in humans and other species requires understanding how selection on social behaviour varies with group size. However, the effects of group size are frequently obscured in the theoretical literature, which often makes assumptions that are at odds with empirical findings. In particular, mechanisms are suggested as supporting large-scale cooperation when they would in fact rapidly become ineffective with increasing group size. Here we review the literature on the evolution of helping behaviours (cooperation and altruism) and frame it using a simple synthetic model that allows us to delineate how the three main components of the selection pressure on helping must vary with increasing group size. The first component is the marginal benefit of helping to group members, which determines both direct fitness benefits to the actor and indirect fitness benefits to recipients. While this is often assumed to be independent of group size, marginal benefits are in practice likely to be maximal at intermediate group sizes for many types of collective action problems, and will eventually become very small in large groups due to the law of diminishing returns. The second component is the response of social partners to the past play of an actor, which underlies conditional behaviour under repeated social interactions. We argue that under realistic conditions on the transmission of information in a population, this response to past play decreases rapidly with increasing group size, so that reciprocity alone (whether direct, indirect, or generalised) cannot sustain cooperation in very large groups. The final component is the relatedness between actor and recipient, which, according to the rules of inheritance, again decreases rapidly with increasing group size. 
These results explain why helping behaviours in very large social groups are limited to cases where the number of reproducing individuals is small, as in social insects, or where there are social institutions that can promote (possibly through sanctioning) large-scale cooperation, as in human societies. Finally, we discuss how individually devised institutions can foster the transition from small-scale to large-scale cooperative groups in human evolution.

Relevance: 60.00%

Abstract:

Silverleaf whitefly (SLW) is a major late-season pest of cotton because of its potential to contaminate cotton lint with honeydew. To prevent this, management often relies on insecticides to control SLW populations. Under selection pressure, SLW develops resistance to the insecticides it is exposed to, resulting in spray failures. Our lab tests resistance levels in SLW populations collected from across the cotton industry. In this presentation I will provide an update on the emerging SLW resistance issues the cotton industry is facing.

Relevance: 60.00%

Abstract:

Adeno-associated viral (AAV) vectors are among the most widely used gene transfer systems in basic and pre-clinical research and have been employed in more than 160 clinical trials. AAV vectors are commonly produced in producer cell lines such as HEK293 by co-transfection with a so-called vector plasmid and one (in this work) or two so-called helper plasmids. The vector plasmid contains the transgene cassette of interest (TEC) flanked by AAV's inverted terminal repeats (ITRs), which serve as packaging signals, whereas the helper plasmid provides the required AAV and helper virus functions in trans. A pivotal aspect of AAV vectorology is the manufacturing of AAV vectors free from impurities arising during the production process. Such impurities include capsids containing prokaryotic sequences, e.g. antibiotic resistance genes originating from the producer plasmids. In the first part of the thesis we aimed to improve the safety of AAV vectors. As we found that encapsidated prokaryotic sequences (using the ampicillin resistance gene as an indicator) cannot be removed by standard purification methods, we investigated whether the producer plasmids could be replaced by minicircles (MCs). MCs are circular DNA constructs that contain no functional or coding prokaryotic sequences; they consist only of the TEC and a short sequence required for production and purification. MC counterparts of a vector plasmid encoding enhanced green fluorescent protein (eGFP) and a helper plasmid encoding AAV serotype 2 (AAV2) and helper adenovirus (Ad) genes were designed and produced by PlasmidFactory (Bielefeld, Germany). Using all four possible combinations of plasmids and MCs, single-stranded AAV2 vectors (ssAAV) and self-complementary AAV vectors (scAAV) were produced and characterized for vector quantity, quality and functionality. 
The analyses showed that plasmids can be replaced by MCs without decreasing vector production efficiency or vector quality. MC-derived scAAV vector preparations even exceeded plasmid-derived preparations, displaying up to 30-fold improved transduction efficiencies. Using MCs as tools, we found that the vector plasmid is the main source of encapsidated prokaryotic sequences. Remarkably, plasmid-derived scAAV vector preparations contained a much higher relative amount of prokaryotic sequences (up to 26.1%, relative to TEC) than ssAAV vector preparations (up to 2.9%). By replacing both plasmids with MCs, the amount of functional prokaryotic sequences could be decreased to below the limit of quantification. Additional analyses for DNA impurities other than prokaryotic sequences showed that scAAV vectors generally contained a higher amount of non-vector DNA (e.g. adenoviral sequences) than ssAAV vectors. For both ssAAV and scAAV vector preparations, MC-derived vectors tended to contain lower amounts of foreign DNA. None of the vectors tested was found to induce immunogenicity. In summary, we demonstrated that the quality of AAV vector preparations can be significantly improved by replacing producer plasmids with MCs. Upon transduction of a target tissue, AAV vector genomes predominantly remain in an episomal state, as duplex DNA circles or concatemers. These episomal forms mediate long-term transgene expression in terminally differentiated cells but are lost in proliferating cells due to cell division. Therefore, in the second part of the thesis, in cooperation with Claudia Hagedorn and Hans J. Lipps (University Witten/Herdecke), an AAV vector genome was equipped with an autonomous replication element, a scaffold/matrix attachment region (S/MAR). 
AAV-S/MAR, encoding eGFP and a blasticidin resistance gene, and a control vector with the same TEC but lacking the S/MAR element (AAV-ΔS/MAR) were produced and transduced into highly proliferative HeLa cells. Antibiotic pressure was employed to select for cells stably maintaining the vector genome. AAV-S/MAR-transduced cells yielded a higher number of colonies than AAV-ΔS/MAR-transduced cells. Colonies derived from each vector transduction were picked and cultured further. They remained eGFP-positive (up to 70 days, the maximum cultivation period) even in the absence of antibiotic selection pressure. Interestingly, the mitotic stability of both AAV-S/MAR and the control vector AAV-ΔS/MAR was found to result from episomal maintenance of the vector genome. This finding indicates that, under specific conditions such as the mild selection pressure we employed, “common” AAV vectors persist episomally. Thus, the S/MAR element increases the establishment frequency of stable episomes but is not a prerequisite.

Relevance: 60.00%

Abstract:

Metastasis formation is the endpoint of a Darwinian process in which tumour cells undergo genetic and epigenetic alterations with the sole aim of preserving a proliferative advantage. The hypoxic environment characteristic of solid tumours acts as a selection pressure and a determining factor in tumour progression. Faced with hypoxia, one of the major adaptations of tumour cells is a disruption of cellular pH that leads to metastasis formation and chemotherapy resistance. This thesis highlights new molecular links between hypoxia and pH regulation in the contexts of cell invasion and chemoresistance. The ion exchangers NHE1 and NHE6 are at the heart of these studies, in which new roles in cancer progression were attributed to them. First, we examined the influence of hypoxia on the regulation of NHE1 by p90RSK and the functional consequences of this interaction for cell invasion via invadopodia. Under hypoxic conditions, NHE1 is activated by p90RSK, resulting in extracellular acidification. By modifying pH, NHE1 stimulates invadopodia formation and degradation of the extracellular matrix. The phosphorylation of NHE1 by p90RSK under hypoxia thus appears to be a potential biomarker of metastatic cancers. Although little studied, endosomal pH can play a role in chemoresistance, but the mechanisms are unknown. We developed a method to measure endosomal pH precisely by microscopy. This shed light on a new hypoxia-induced resistance mechanism featuring the exchanger NHE6. Hypoxia promotes the interaction of NHE6 with RACK1 at the plasma membrane, preventing the endosomal localisation of the exchanger. This interaction leads to the sequestration of doxorubicin in over-acidified endosomes. 
This work demonstrates for the first time the role of endosomal pH and the exchanger NHE6 as central elements of hypoxia-induced chemoresistance. This thesis thus reinforces the idea that the interactions between tumour cells and the hypoxic microenvironment are the "Achilles heel" of cancer, and that the regulation of cellular pH is essential to the adaptation of cells to hypoxia and the establishment of the malignant cancer phenotype. The discovery of new pro-tumour roles for NHE1 and NHE6 places them at the forefront of the development of therapeutic strategies directed against metastasis formation and chemoresistance.

Relevance: 30.00%

Abstract:

This paper contributes to the recent debate about the role of referees in the home advantage phenomenon. Specifically, it aims to provide a convincing answer to the newly posed question of whether individual differences exist among referees in terms of the home advantage (Boyko, Boyko, & Boyko, 2007; Johnston, 2008). Using multilevel modelling on a large and representative dataset, we find that (1) the home advantage effect differs significantly among referees, and (2) this relationship is moderated by the size of the crowd. These new results suggest that part of the home advantage is due to the effect of the crowd on the referees, and that some referees are more prone to be influenced by the crowd than others. This provides strong evidence that referees are a significant contributing factor in the home advantage. The implications of these findings are discussed both in terms of the relevant social psychological research and with respect to the selection, assessment, and training of referees.
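A multilevel model of the kind described might be sketched as follows. This is fitted to simulated data, since the original dataset is not available; the referee counts, effect sizes and variable names are all invented, and the referee-level variation is reduced to a random intercept per referee:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
rows = []
for referee in range(40):                      # hypothetical pool of referees
    ref_effect = rng.normal(0.0, 0.3)          # referee-specific home advantage
    for _ in range(60):                        # matches officiated by this referee
        crowd = rng.uniform(0.0, 1.0)          # standardised crowd size
        # home goal difference: baseline advantage + referee effect
        # + crowd moderation + match-level noise
        goal_diff = 0.3 + ref_effect + 0.5 * crowd + rng.normal(0.0, 1.0)
        rows.append({"referee": referee, "crowd": crowd, "goal_diff": goal_diff})
df = pd.DataFrame(rows)

# Random intercept per referee; fixed effect of crowd size on the home margin
result = smf.mixedlm("goal_diff ~ crowd", df, groups=df["referee"]).fit()
print(result.params["crowd"])
```

A non-zero estimated referee-level variance corresponds to finding (1), and a positive crowd coefficient to the crowd moderation in finding (2).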

Relevance: 30.00%

Abstract:

Under pressure from both the ever-increasing level of market competition and the global financial crisis, clients in the consumer electronics (CE) industry are keen to understand how to choose the most appropriate procurement method and hence improve their competitiveness. Four rounds of a Delphi questionnaire survey were conducted with 12 experts in order to identify the most appropriate procurement method in the Hong Kong CE industry. Five key selection criteria in the CE industry are highlighted: product quality, capability, price competition, flexibility and speed. The study also revealed that product quality was the most important criterion for "First type used commercially" and "Major functional improvements" projects. For "Minor functional improvements" projects, price competition was the most crucial factor to consider during procurement method selection. These findings provide owners with useful insights for selecting procurement strategies.

Relevance: 30.00%

Abstract:

Characterization of the combustion products released during the burning of commonly used engineering metallic materials may aid in material selection and risk assessment for the design of oxygen systems. Characterizing the combustion products with regard to size distribution and morphology also gives useful information for fire detection systems. Aluminum rods (3.2-mm-diameter cylinders) were mounted vertically inside a combustion chamber and ignited in pressurized oxygen by resistively heating an aluminum/palladium igniter wire attached to the bottom of the test sample. This paper describes the experimental work conducted to establish the particle size distribution and morphology of the combustion products, which were collected after burning was complete and subsequently analyzed. In general, the combustion products consisted of re-solidified oxidized slag and many small hollow spheres ranging from about 500 nm to 1000 µm in diameter, surfaced with quenched dendritic and grain-like structures. The combustion products were characterized using optical and scanning electron microscopy.

Relevance: 30.00%

Abstract:

Mobile teledermatoscopy (MTD) for the early detection of skin cancer uses smartphones with dermatoscope attachments to magnify, capture, and transfer images remotely [1]. Using the asymmetry–color variation (AC) rule, consumers achieve dermoscopy sensitivity of 92.9% to 94.0% and specificity of 62.0% to 64.2% for melanoma [2]. This pilot randomized trial assessed lesions of concern selected by consumers at high risk of melanoma using MTD plus the AC rule (intervention, n = 10) or the AC rule alone (control, n = 12) during skin self-examination (SSE). Also measured were lesion location patterns, lesions overlooked by participants, provisional clinical diagnoses, likelihood of malignant tumor, and participant pressure to excise lesions.
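The sensitivity and specificity figures quoted above follow from simple confusion-matrix arithmetic. A minimal sketch; the counts are invented to land in the reported range and are not the trial's data:

```python
def sensitivity_specificity(tp, fn, tn, fp):
    """Diagnostic accuracy of a rule such as the AC rule.

    sensitivity = TP / (TP + FN): proportion of melanomas correctly flagged.
    specificity = TN / (TN + FP): proportion of benign lesions correctly cleared.
    """
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical counts: 100 melanomas, 100 benign lesions
sens, spec = sensitivity_specificity(tp=93, fn=7, tn=62, fp=38)
print(f"sensitivity {sens:.1%}, specificity {spec:.1%}")  # 93.0%, 62.0%
```

The asymmetry between the two figures reflects the usual screening trade-off: the rule is tuned to miss few melanomas at the cost of flagging more benign lesions.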

Relevance: 30.00%

Abstract:

Cane fibre content has increased over the past ten years. Some of that increase can be attributed to new varieties selected for release. This paper reviews the existing methods for quantifying the fibre characteristics of a variety, including fibre content and fibre quality measurements – shear strength, impact resistance and short fibre content. The variety selection process is presented and it is reported that fibre content has zero weighting in the current selection index. An updated variety selection approach is proposed, potentially replacing the existing selection process relating to fibre. This alternative approach involves the use of a more complex mill area level model that accounts for harvesting, transport and processing equipment, taking into account capacity, efficiency and operational impacts, along with the end use for the bagasse. The approach will ultimately determine a net economic value for the variety. The methodology lends itself to a determination of the fibre properties that have a significant impact on the economic value so that variety tests can better target the critical properties. A low-pressure compression test is proposed as a good test to provide an assessment of the impact of a variety on milling capacity. NIR methodology is proposed as a technology to lead to a more rapid assessment of fibre properties, and hence the opportunity to more comprehensively test for fibre impacts at an earlier stage of variety development.