10 results for Pathogen-driven selection
in CentAUR: Central Archive at the University of Reading - UK
Abstract:
Several models have proposed that an action can be imitated via one of two routes: a direct visuospatial route, which can in principle mediate imitation of both meaningful (MF) and meaningless (ML) actions, and an indirect semantic route, which can be used only for MF actions. The present study investigated whether selection between the direct and indirect routes is strategic or stimulus driven. Tessari and Rumiati (J Exp Psychol Hum Percept Perform 30:1107–1116, 2004) have previously shown, using accuracy measures, that imitation of MF actions is superior to imitation of ML actions when the two action types are presented in separate blocks, and that the advantage of MF over ML items is smaller or absent when they are presented in mixed blocks. We first replicated this finding using an automated reaction time (RT), as well as accuracy, measure. We then examined imitation of MF and ML actions in the mixed condition as a function of the action type presented in the previous trial and in relation to the number of previous test trials. These analyses showed that (1) for both action types, performance was worse immediately after ML than MF trials, and (2) even at the beginning of the mixed condition, responding to MF actions was no better than responding to ML items. These results suggest that the properties of the action stimulus play a substantial role in determining whether imitation is mediated by the direct or the indirect route, and that effects of block composition on imitation need not be generated through strategic switching between routes.
Abstract:
Over the last 50 years, Spanish Atlantic salmon (Salmo salar) populations have been in decline. In order to bolster these populations, rivers were stocked with fish of northern European origin during the period 1974-1996, probably also introducing the furunculosis-inducing pathogen, Aeromonas salmonicida. Here we assess the relative importance of processes influencing mitochondrial (mt)DNA variability in these populations from 1948 to 2002. Genetic material collected over this period from four rivers in northern Spain (Cantabria) was used to detect variability at the mtDNA ND1 gene. Before stocking, a single haplotype was found at high frequency (0.980). Following stocking, haplotype diversity (h) increased in all rivers (mean h before stocking was 0.041, and 0.245 afterwards). These increases were due principally to the dramatic increase in frequency of a previously very low frequency haplotype, reported at higher frequencies in northern European populations proximate to those used to stock Cantabrian rivers. Genetic structuring increased after stocking: among-river differentiation was low before stocking (1950s/1960s Phi(ST) = -0.00296 to 0.00284), increasing considerably at the height of stocking (1980s Phi(ST) = 0.18932) and decreasing post-stocking (1990s/2002 Phi(ST) = 0.04934 to 0.03852). Gene flow from stocked fish therefore seems to have had a substantial role in increasing mtDNA variability. Additionally, we found significant differentiation between individuals that had probably died from infectious disease and apparently healthy, angled fish, suggesting a possible role for pathogen-driven selection of mtDNA variation. Our results suggest that stocking with non-native fish may increase genetic diversity in the short term, but may not reverse population declines.
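The haplotype diversity values (h) quoted above are conventionally computed with Nei's unbiased estimator, h = n/(n-1) * (1 - sum of squared haplotype frequencies). A minimal Python sketch; the sample below is hypothetical and not the Cantabrian data:

```python
from collections import Counter

def haplotype_diversity(haplotypes):
    """Nei's haplotype (gene) diversity: h = n/(n-1) * (1 - sum(p_i^2)),
    where p_i is the sample frequency of haplotype i."""
    n = len(haplotypes)
    counts = Counter(haplotypes)
    sum_p2 = sum((c / n) ** 2 for c in counts.values())
    return n / (n - 1) * (1 - sum_p2)

# Hypothetical pre-stocking-like sample: one common haplotype, one rare variant
sample = ["H1"] * 49 + ["H2"]
print(round(haplotype_diversity(sample), 3))  # a low diversity, as before stocking
```

A sample dominated by a single haplotype gives h near zero; adding haplotypes at appreciable frequencies, as stocking did here, pushes h upward.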
Abstract:
Seamless phase II/III clinical trials are conducted in two stages with treatment selection at the first stage. In the first stage, patients are randomized to a control or to one of k > 1 experimental treatments. At the end of this stage, interim data are analysed and a decision is made concerning which experimental treatment should continue to the second stage. If the primary endpoint is observable only after some period of follow-up, then at the interim analysis data may be available on an early outcome for a larger number of patients than those for whom the primary endpoint is available. These early endpoint data can thus be used for treatment selection. For two previously proposed approaches, the power has been shown to be greater for one or the other method depending on the true treatment effects and correlations. We propose a new approach that builds on the previously proposed approaches: it uses data available at the interim analysis to estimate these parameters and then, on the basis of these estimates, chooses the treatment selection method with the highest probability of correctly selecting the most effective treatment. This method is shown to perform well compared with the two previously described methods for a wide range of true parameter values. In most cases, the performance of the new method is similar to, and in some cases better than, that of the two previously proposed methods.
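As a toy illustration of the first-stage selection step only (not the authors' adaptive method), an interim rule that carries forward the experimental arm with the best mean early-endpoint response might look like the following; the arm names and data are invented:

```python
import statistics

def select_treatment(early_data):
    """Naive 'select the best' interim rule: return the experimental arm
    with the largest mean early-endpoint response."""
    return max(early_data, key=lambda arm: statistics.mean(early_data[arm]))

# Hypothetical early-endpoint observations per experimental arm at interim
interim = {
    "T1": [0.8, 1.1, 0.9],
    "T2": [1.4, 1.2, 1.6],
    "T3": [0.5, 0.7, 0.6],
}
print(select_treatment(interim))  # arm carried forward to stage 2
```

The abstract's contribution is precisely that this naive rule is not always best when early and primary endpoints are imperfectly correlated, motivating a data-driven choice between selection methods.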
Abstract:
A size-structured plant population model is developed to study the evolution of pathogen-induced leaf shedding under various environmental conditions. The evolutionarily stable strategy (ESS) of the leaf shedding rate is determined for two scenarios: i) a constant leaf shedding strategy and ii) an infection-load-driven leaf shedding strategy. The model predicts that ESS leaf shedding rates increase with nutrient availability. No effect of plant density on the ESS leaf shedding rate is found, even though disease severity increases with plant density. When auto-infection, that is, increased infection due to spores produced on the plant itself, plays a key role in further disease increase on the plant, shedding leaves removes disease that would otherwise contribute to disease increase on the plant itself. Consequently, leaf shedding responses to infections may evolve. When external infection, that is, infection due to immigrant spores, is the key determinant, shedding a leaf does not reduce the force of infection on the leaf-shedding plant. In this case leaf shedding will not evolve. Under a low external disease pressure, adopting an infection-driven leaf shedding strategy is more efficient than adopting a constant leaf shedding strategy, since a plant adopting an infection-driven leaf shedding strategy does not shed any leaves in the absence of infection, even when leaf shedding rates are high. A plant adopting a constant leaf shedding rate sheds the same number of leaves regardless of the presence of infection. Based on the results we develop two hypotheses that can be tested if the appropriate plant material is available.
Abstract:
An outdoor experiment was conducted to increase understanding of apical leaf necrosis in the presence of pathogen infection. Holcus lanatus seeds and Puccinia coronata spores were collected from two adjacent and otherwise similar habitats with differing long-term N fertilization levels. After inoculation, disease and necrosis dynamics were observed during the plant growing seasons of 2003 and 2006. In both years, high nutrient availability resulted in earlier disease onset, a higher pathogen population growth rate, earlier onset of physiological apical leaf necrosis, and a reduced time between disease onset and apical leaf necrosis onset. Necrosis rate was shown to be independent of nutrient availability. The results showed that in these nutrient-rich habitats H. lanatus plants adopted necrosis mechanisms that wasted more nutrients. There was some indication that these necrosis mechanisms were subject to local selection pressures, but these results were not conclusive. The findings of this study are consistent with apical leaf necrosis being an evolved defence mechanism.
Abstract:
Costs of resistance are widely assumed to be important in the evolution of parasite and pathogen defence in animals, but they have been demonstrated experimentally on very few occasions. Endoparasitoids are insects whose larvae develop inside the bodies of other insects where they defend themselves from attack by their hosts' immune systems (especially cellular encapsulation). Working with Drosophila melanogaster and its endoparasitoid Leptopilina boulardi, we selected for increased resistance in four replicate populations of flies. The percentage of flies surviving attack increased from about 0.5% to between 40% and 50% in five generations, revealing substantial additive genetic variation in resistance in the field population from which our culture was established. In comparison with four control lines, flies from selected lines suffered from lower larval survival under conditions of moderate to severe intraspecific competition.
Abstract:
This paper analyzes the use of linear and neural network models for financial distress classification, with emphasis on the issues of input variable selection and model pruning. A data-driven method for selecting input variables (financial ratios, in this case) is proposed. A case study involving 60 British firms in the period 1997-2000 is used for illustration. It is shown that the use of the Optimal Brain Damage pruning technique can considerably improve the generalization ability of a neural model. Moreover, the set of financial ratios obtained with the proposed selection procedure is shown to be an appropriate alternative to the ratios usually employed by practitioners.
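Optimal Brain Damage ranks weights by the saliency s_i = 0.5 * H_ii * w_i**2, using a diagonal approximation to the Hessian of the loss, and removes the least salient ones. A minimal sketch of that ranking step, with illustrative weight and Hessian values rather than the paper's trained network:

```python
def obd_saliencies(weights, hessian_diag):
    """Optimal Brain Damage saliency per weight:
    s_i = 0.5 * H_ii * w_i**2 (diagonal-Hessian approximation)."""
    return [0.5 * h * w ** 2 for w, h in zip(weights, hessian_diag)]

def prune_lowest(weights, hessian_diag, k):
    """Zero out the k weights with the smallest saliency."""
    s = obd_saliencies(weights, hessian_diag)
    drop = set(sorted(range(len(s)), key=lambda i: s[i])[:k])
    return [0.0 if i in drop else w for i, w in enumerate(weights)]

# Hypothetical weights and Hessian diagonal for a tiny layer
w = [0.9, -0.05, 0.4, 0.01]
h = [1.0, 2.0, 0.5, 3.0]
print(prune_lowest(w, h, 2))  # the two least salient weights are zeroed
```

In practice pruning alternates with retraining: remove the lowest-saliency weights, retrain the smaller network, and repeat, which is how the generalization gains reported above are typically obtained.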
Abstract:
Bacteria possess a range of mechanisms to move in different environments, and these mechanisms have important direct and correlated impacts on the virulence of opportunistic pathogens. Bacteria use two surface organelles to facilitate motility: a single polar flagellum, and type IV pili, enabling swimming in aqueous habitats and twitching along hard surfaces, respectively. Here, we address whether there are trade-offs between these motility mechanisms, and hence whether different environments could select for altered motility. We experimentally evolved initially isogenic Pseudomonas aeruginosa under conditions that favored the different types of motility, and found evidence for a trade-off mediated by antagonistic pleiotropy between swimming and twitching. Moreover, changes in motility resulted in correlated changes in other behaviors, including biofilm formation and growth within an insect host. This suggests that the environmental origin of a motile opportunistic pathogen could predictably influence its motility and virulence.
Abstract:
The evolution of fungicide resistance in the cereal pathogen Zymoseptoria tritici is a serious threat to the sustainability and profitability of wheat production in Europe. Application of azole fungicides has been shown to affect fitness of Z. tritici variants differentially, so it has been hypothesised that combinations of azoles could slow the evolution of resistance. This work was initiated to assess the effects of dose, mixtures and alternations of two azoles on selection for isolates with reduced sensitivity and on disease control. Naturally infected field trials were carried out at six sites across Ireland and the sensitivity of Z. tritici isolates was monitored pre- and post-treatment. The azoles epoxiconazole and metconazole were applied as solo products, in alternation with each other and as a pre-formulated mixture. Full and half label doses were tested. The two azoles were partially cross-resistant, with a common azole resistance principal component accounting for 75% of the variation between isolates. Selection for isolates with reduced azole sensitivity was correlated with disease control. Decreased doses were related to decreases in sensitivity, but the effect was barely significant (P = 0.1) and control was reduced. Single applications of an active ingredient (a.i.) caused smaller decreases in sensitivity than double applications. Shifts in sensitivity to the a.i. applied to a plot were greater than to the a.i. not applied, and the decrease in sensitivity was greater to the a.i. applied at the second timing. These results confirm the need to mix a.i.s with different modes of action.