Abstract:
Many pathogens transmit to new hosts by both infection (horizontal transmission) and transfer to the infected host's offspring (vertical transmission). These two transmission modes require specific adaptations of the pathogen that can be mutually exclusive, resulting in a trade-off between horizontal and vertical transmission. We show that in mathematical models such trade-offs can lead to the simultaneous existence of two evolutionarily stable states (evolutionary bi-stability) of allocation of resources to the two modes of transmission. We also show that jumping between evolutionarily stable states can be induced by gradual environmental changes. Using quantitative PCR-based estimates of abundance in seed and vegetative parts, we show that the pathogen of wheat, Phaeosphaeria nodorum, has jumped between two distinct states of transmission mode twice in the past 160 years, which, based on published evidence, we interpret as adaptation to environmental change. The finding of evolutionary bi-stability has implications for human, animal and other plant diseases. An ill-judged change in a disease control programme could cause the pathogen to evolve a new, and possibly more damaging, combination of transmission modes. Similarly, environmental changes can shift the balance between transmission modes, with adverse effects on human, animal and plant health.
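The bi-stability argument can be illustrated with a toy allocation model; the sketch below uses assumed fitness functions, not the authors' published equations. A hypothetical pathogen splits a fixed transmission budget between horizontal (allocation a) and vertical (1 - a) routes; when returns on allocation accelerate, the fitness landscape has two local maxima, so selection can settle at either extreme depending on where it starts.

```python
# Toy illustration (not the paper's model): fitness of a pathogen that splits a
# fixed transmission budget between horizontal (a) and vertical (1 - a) routes.
# With accelerating returns (gamma > 1) the trade-off is convex and the fitness
# landscape has two local maxima ("evolutionary bi-stability"); with diminishing
# returns (gamma < 1) there is a single intermediate optimum.
import numpy as np

def fitness(a, gamma, b_h=1.0, b_v=1.0):
    """Total transmission success for allocation a to the horizontal route."""
    return b_h * a**gamma + b_v * (1.0 - a)**gamma

def local_maxima(gamma, n=1001):
    a = np.linspace(0.0, 1.0, n)
    w = fitness(a, gamma)
    # A point is a local maximum if it is not smaller than its neighbours
    # (endpoints are compared with their single neighbour).
    return [a[i] for i in range(n)
            if (i == 0 or w[i] >= w[i - 1]) and (i == n - 1 or w[i] >= w[i + 1])]

print("gamma = 2.0 (convex trade-off):", local_maxima(2.0))   # two maxima, a = 0 and a = 1
print("gamma = 0.5 (concave trade-off):", local_maxima(0.5))  # single maximum near a = 0.5
```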
Abstract:
Two models for predicting Septoria tritici on winter wheat (cv. Riband) were developed using a program based on an iterative search of correlations between disease severity and weather. Data from four consecutive cropping seasons (1993/94 to 1996/97) at nine sites throughout England were used. A qualitative model predicted the presence or absence of Septoria tritici (at a 5% severity threshold within the top three leaf layers) using winter temperature (January/February) and wind speed up to about the first node detectable growth stage. For sites above the disease threshold, a quantitative model predicted severity of Septoria tritici using rainfall during stem elongation. A test statistic was derived to test the validity of the iterative search used to obtain both models. This statistic was used in combination with bootstrap analyses, in which the search program was rerun using weather data from previous years (and therefore uncorrelated with the disease data), to investigate how likely correlations such as those found in our models would have been in the absence of genuine relationships.
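The search-plus-validation logic can be sketched as follows; this is a minimal illustration of the general idea, not the authors' program, and all data here are randomly generated stand-ins. The point of the permutation step is that searching many candidate weather windows inflates the apparent strength of the best correlation, so that "best" correlation must be compared with the best correlations found when no genuine relationship exists.

```python
# Minimal sketch (not the authors' program) of a window search plus a null check:
# scan candidate weather windows for the one most correlated with end-of-season
# severity, then repeat the whole search with shuffled severities to see how often
# an equally strong "best" correlation arises when no real relationship exists.
import numpy as np

rng = np.random.default_rng(0)

n_years, n_days = 20, 150                          # hypothetical site-seasons and daily records
weather = rng.normal(size=(n_years, n_days))       # e.g. daily rainfall anomalies
severity = rng.normal(size=n_years)                # observed disease severity

def best_window_correlation(x, y, width=30):
    """Largest |correlation| between y and x summed over any window of the given width."""
    best = 0.0
    for start in range(x.shape[1] - width + 1):
        summed = x[:, start:start + width].sum(axis=1)
        best = max(best, abs(np.corrcoef(summed, y)[0, 1]))
    return best

observed = best_window_correlation(weather, severity)

# Null distribution: rerun the whole search with severities shuffled, so any
# apparent relationship is spurious (analogous to using weather from other years).
null = [best_window_correlation(weather, rng.permutation(severity)) for _ in range(200)]
p_value = np.mean([b >= observed for b in null])
print(f"best |r| = {observed:.2f}, search-adjusted p = {p_value:.2f}")
```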
Abstract:
The recent decline in the effectiveness of some azole fungicides in controlling the wheat pathogen Mycosphaerella graminicola has been associated with mutations in the CYP51 gene encoding the azole target, the eburicol 14 alpha-demethylase (CYP51), an essential enzyme of the ergosterol biosynthesis pathway. In this study, analysis of the sterol content of M. graminicola isolates carrying different variants of the CYP51 gene has revealed quantitative differences in sterol intermediates, particularly the CYP51 substrate eburicol. Together with CYP51 gene expression studies, these data suggest that mutations in the CYP51 gene impact on the activity of the CYP51 protein.
Abstract:
A method was developed to evaluate crop disease predictive models for their economic and environmental benefits. Benefits were quantified as the value of a prediction, measured by costs saved and fungicide dose saved. The value of prediction was defined as the net gain made by using predictions, measured as the difference between a scenario where predictions are available and used and a scenario without prediction. Comparable 'with' and 'without' scenarios were created with the use of risk levels. These risk levels were derived from a probability distribution fitted to observed disease severities, and the fitted distributions were used to calculate the probability that a given disease-induced economic loss was incurred. The method was exemplified by using it to evaluate a model developed for Mycosphaerella graminicola risk prediction. Based on the value of prediction, the tested model may have economic and environmental benefits to growers if used to guide treatment decisions on resistant cultivars. The value of prediction, measured by fungicide dose saved and costs saved, was shown to be constant across risk levels. The method could also be used to evaluate similar crop disease predictive models.
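The 'with prediction' versus 'without prediction' comparison can be sketched numerically; every number below (treatment cost, loss rate, control efficacy, the fitted severity distribution) is an illustrative assumption rather than a value from the study, and the model is assumed, for simplicity, to flag exactly those seasons in which severity exceeds the damage threshold.

```python
# Illustrative sketch (all numbers are assumptions, not the paper's) of valuing a
# predictive model as the difference in expected cost between a grower who always
# sprays and one who sprays only when the model predicts a damaging epidemic.
# Severity risk is drawn from a fitted probability distribution.
import numpy as np

rng = np.random.default_rng(1)

spray_cost = 30.0          # cost of a fungicide treatment (per ha, hypothetical)
loss_per_severity = 8.0    # value of yield loss per % severity if left untreated
control = 0.8              # fraction of loss prevented by spraying

severity = rng.gamma(shape=1.5, scale=3.0, size=100_000)   # fitted risk distribution
predicted_high = severity > 5.0   # assume the model flags every damaging epidemic

cost_always_spray = spray_cost + (1 - control) * loss_per_severity * severity
cost_with_model = np.where(predicted_high,
                           spray_cost + (1 - control) * loss_per_severity * severity,
                           loss_per_severity * severity)

value_of_prediction = cost_always_spray.mean() - cost_with_model.mean()
dose_saved = 1.0 - predicted_high.mean()   # fraction of sprays avoided
print(f"value of prediction ~ {value_of_prediction:.1f} per ha; "
      f"sprays avoided ~ {dose_saved:.0%}")
```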
Abstract:
Disease-weather relationships influencing Septoria leaf blotch (SLB) preceding growth stage (GS) 31 were identified using data from 12 sites in the UK covering 8 years. Based on these relationships, an early-warning predictive model for SLB on winter wheat was formulated to predict the occurrence of a damaging epidemic (defined as disease severity of 5% or more on the top three leaf layers). The final model was based on accumulated rain > 3 mm in the 80-day period preceding GS 31 (roughly from early February to the end of April) and accumulated minimum temperature with a 0 °C base in the 50-day period starting 120 days before GS 31 (approximately January and February). The model was validated on an independent data set, on which the prediction accuracy was influenced by cultivar resistance. Over all observations, the model had a true positive proportion of 0.61, a true negative proportion of 0.73, a sensitivity of 0.83, and a specificity of 0.18. The true negative proportion increased to 0.85 for resistant cultivars and decreased to 0.50 for susceptible cultivars. Potential fungicide savings are most likely to be made with resistant cultivars, but such benefits would need to be identified with an in-depth evaluation.
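Constructing the two weather predictors described above is straightforward once daily records are aligned to the GS 31 date; the sketch below uses hypothetical daily rainfall and minimum-temperature series and does not reproduce the model's fitted coefficients or decision threshold.

```python
# Sketch of constructing the two weather predictors described in the abstract
# from hypothetical daily records; the model's actual coefficients and decision
# threshold are not reproduced here. Day 0 is taken as the date GS 31 was reached.
import numpy as np

rng = np.random.default_rng(2)

days_before_gs31 = 150
rain = rng.gamma(shape=0.6, scale=4.0, size=days_before_gs31)   # mm/day, most recent last
tmin = rng.normal(loc=3.0, scale=4.0, size=days_before_gs31)    # daily minimum temperature, °C

# Accumulated rain on days with > 3 mm over the 80 days preceding GS 31.
rain_window = rain[-80:]
acc_rain = rain_window[rain_window > 3.0].sum()

# Accumulated minimum temperature above a 0 °C base in the 50-day window
# starting 120 days before GS 31 (i.e. days -120 to -71).
tmin_window = tmin[-120:-70]
acc_tmin = np.clip(tmin_window, 0.0, None).sum()

print(f"accumulated rain (>3 mm days): {acc_rain:.0f} mm")
print(f"accumulated minimum temperature (0 °C base): {acc_tmin:.0f} degree-days")
```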
Abstract:
An extensive study was conducted to determine where in the production chain Rhizoctonia solani became associated with UK module-raised Brassica oleracea plants. In total, 2600 plants from 52 crops were sampled directly from propagators and repeat-sampled from the field. Additional soil, compost and water samples were collected from propagation nurseries and screened using conventional agar isolation methods. No isolates of R. solani were recovered from any samples collected from propagation nurseries. Furthermore, nucleic acid preparations from samples of soil and compost from propagation nurseries gave negative results when tested for R. solani using real-time PCR. Conversely, R. solani was recovered from 116 of 1300 stem bases collected from field crops. All the data collected suggested that R. solani became associated with B. oleracea in the field rather than during propagation. Parsimony and Bayesian phylogenetic studies of ribosomal DNA suggested that the majority of the isolates classified further belonged to anastomosis groups AG-2-1 (48/57) and AG-4HGII (8/57), groups known to be pathogenic on Brassica spp. in other countries. Many R. solani isolates were recovered from symptomless plant material, and possible explanations for such an association are discussed.
Abstract:
Real-time PCR protocols were developed to detect and discriminate 11 anastomosis groups (AGs) of Rhizoctonia solani using ribosomal internal transcribed spacer (ITS) regions (AG-1-IA, AG-1-IC, AG-2-1, AG-2-2, AG-4HGI+II, AG-4HGIII, AG-8) or beta-tubulin (AG-3, AG-4HGII, AG-5 and AG-9) sequences. All real-time assays were target-group specific, except AG-2-2, which showed a weak cross-reaction with AG-2tabac. In addition, methods were developed for the high-throughput extraction of DNA from soil and compost samples. The DNA extraction method was used with the AG-2-1 assay and shown to be quantitative with a detection threshold of 10⁻⁷ g of R. solani per g of soil. A similar DNA extraction efficiency was observed for samples from three contrasting soil types. The developed methods were then used to investigate the spatial distribution of R. solani AG-2-1 in field soils. Soil from shallow depths of a field planted with Brassica oleracea tested positive for R. solani AG-2-1 more frequently than soil collected from greater depths. Quantification of R. solani inoculum in field samples proved challenging due to low levels of inoculum in naturally occurring soils. The potential uses of real-time PCR and DNA extraction protocols to investigate the epidemiology of R. solani are discussed.
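Quantification in such assays typically proceeds through a standard curve relating Ct values to known amounts of the target; the sketch below uses an assumed slope and intercept (for standards of soil spiked with known amounts of fungal tissue), not the published assay's calibration, and applies the 10⁻⁷ g per g detection threshold reported above.

```python
# Hedged sketch of real-time PCR quantification against a standard curve built
# from soil spiked with known amounts of fungal tissue; the slope and intercept
# used here are illustrative assumptions, not the published assay's values.
slope = -3.32      # Ct change per 10-fold change in fungal biomass (assumed)
intercept = 18.0   # Ct expected for 1 g R. solani per g soil (assumed, by extrapolation)

def ct_to_g_per_g(ct):
    """Convert a Ct value to grams of R. solani per gram of soil using the
    standard curve Ct = slope * log10(quantity) + intercept."""
    return 10 ** ((ct - intercept) / slope)

detection_limit = 1e-7   # g of R. solani per g of soil, as reported in the abstract

for ct in (30.0, 36.0, 42.0):
    q = ct_to_g_per_g(ct)
    status = "quantifiable" if q >= detection_limit else "below detection limit"
    print(f"Ct {ct:4.1f} -> {q:.1e} g/g soil ({status})")
```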
Abstract:
We developed a stochastic simulation model incorporating most processes likely to be important in the spread of Phytophthora ramorum and similar diseases across the British landscape (covering Rhododendron ponticum in woodland and nurseries, and Vaccinium myrtillus in heathland). The simulation allows for movements of diseased plants within a realistically modelled trade network and for long-distance natural dispersal. A series of simulation experiments was run with the model, varying epidemic pressure and the linkage between natural vegetation and horticultural trade, with or without disease spread in commercial trade, and with or without inspections-with-eradication, giving a 2 x 2 x 2 x 2 factorial started at 10 arbitrary locations spread across England. Fifty replicate simulations were made at each set of parameter values. Individual epidemics varied dramatically in size due to stochastic effects throughout the model. Across a range of epidemic pressures, the size of the epidemic was 5-13 times larger when commercial movement of plants was included. A key unknown factor in the system is the area of susceptible habitat outside the nursery system. Inspections, with a probability of detection and efficiency of infected-plant removal of 80% and made at 90-day intervals, reduced the size of epidemics by about 60% across the three sectors with a density of 1% susceptible plants in broadleaf woodland and heathland. Reducing this density to 0.1% largely isolated the trade network, so that inspections reduced the final epidemic size by over 90%, and most epidemics ended without escape into nature. Even in this case, however, major wild epidemics developed in a few percent of cases. Provided the number of new introductions remains low, the current inspection policy will control most epidemics. However, as the rate of introduction increases, it can overwhelm any reasonable inspection regime, largely due to spread prior to detection.
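The design of the simulation experiment, rather than the epidemic model itself, can be sketched as a factorial loop; the epidemic function below is a crude placeholder with invented effect sizes, standing in for the full trade-network and dispersal model.

```python
# Structural sketch of the simulation experiment: a 2 x 2 x 2 x 2 factorial
# (epidemic pressure, trade-nature linkage, spread via trade, inspections) with
# 50 stochastic replicates per combination. run_epidemic() is a placeholder for
# the full trade-network/dispersal model, which is far more elaborate.
import itertools
import random

random.seed(42)

def run_epidemic(pressure, linkage, trade_spread, inspections):
    """Placeholder: return a stochastic 'final epidemic size' under the settings."""
    size = random.lognormvariate(mu=3.0, sigma=1.0)   # stochastic baseline
    size *= 3.0 if pressure == "high" else 1.0
    size *= 2.0 if linkage == "strong" else 1.0
    size *= 8.0 if trade_spread else 1.0              # trade movement inflates spread
    size *= 0.4 if inspections else 1.0               # inspections shrink epidemics
    return size

levels = {
    "pressure": ["low", "high"],
    "linkage": ["weak", "strong"],
    "trade_spread": [False, True],
    "inspections": [False, True],
}

results = {}
for combo in itertools.product(*levels.values()):
    sizes = [run_epidemic(*combo) for _ in range(50)]   # 50 replicates per cell
    results[combo] = sum(sizes) / len(sizes)

# Report the four treatment combinations with the largest mean epidemics.
for combo, mean_size in sorted(results.items(), key=lambda kv: kv[1], reverse=True)[:4]:
    print(dict(zip(levels, combo)), f"mean final size ~ {mean_size:.0f}")
```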
Abstract:
It makes economic sense to use as little fungicide as possible on a crop. In many settings, it is common to apply less than the manufacturer's recommended dose. If sources of disease are scarce, or conditions are unsuitable for it to increase, the reduced control from a low dose may be adequate. In other cases, a big reduction in dose may cause little reduction in control, again permitting savings - especially for growers prepared to run a little risk. But the label recommendations for most fungicides state that to avoid resistance, a full dose must always be used. Are individual cost-savings therefore endangering everyone's access to an exceptionally useful tool? The emergence of fungicide resistance is evolution in action. In all cases, it involves the genetic replacement of the original susceptible population of the pathogen by a new population with genetically distinct biochemistry, which confers resistance. The resistant biochemistry originates in rare genetic mutations, so rare that initially the population is hardly altered. Replacement of susceptible forms by resistant ones happens because, with fungicide present, the resistant form multiplies more rapidly than the susceptible form. The key point to notice is that only the relative rates of multiplication of the resistant and susceptible types are involved in the evolution of resistance. The absolute rates are irrelevant.
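The point about relative versus absolute multiplication rates is easy to verify numerically: if both forms grow (or decline) exponentially, the resistant fraction after any number of generations depends only on the ratio of their rates. The rates below are illustrative.

```python
# Small numerical check of the point made above: the rise of the resistant
# fraction depends only on the ratio of the multiplication rates of resistant
# and susceptible strains, not on their absolute values. Rates are illustrative.
def resistant_fraction(generations, rate_resistant, rate_susceptible, start=1e-6):
    """Frequency of the resistant form after exponential growth of both strains."""
    r = start * rate_resistant ** generations
    s = (1 - start) * rate_susceptible ** generations
    return r / (r + s)

# Same 2:1 ratio of rates, very different absolute multiplication rates:
print(resistant_fraction(20, rate_resistant=2.0, rate_susceptible=1.0))   # ~0.51
print(resistant_fraction(20, rate_resistant=0.2, rate_susceptible=0.1))   # ~0.51 (identical)
# Different ratio, same absolute rate for the susceptible form:
print(resistant_fraction(20, rate_resistant=1.1, rate_susceptible=1.0))   # still rare
```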
Abstract:
Key weather factors determining the occurrence and severity of powdery mildew and yellow rust epidemics on winter wheat were identified. Empirical models were formulated to qualitatively predict a damaging epidemic (>5% severity) and to quantitatively predict disease severity given that a damaging epidemic occurred. The disease data used were from field experiments at 12 locations in the UK covering the period from 1994 to 2002, with matching data from weather stations within a 5 km range. Wind in December to February was the most influential factor for a damaging epidemic of powdery mildew, while disease severity was best described by a model with temperature, humidity, and rain in April to June. For yellow rust, temperature in February to June was the most influential factor for both a damaging epidemic and disease severity. The qualitative models identified favorable circumstances for damaging epidemics, but damaging epidemics did not always occur in such circumstances, probably due to other factors such as the availability of initial inoculum and cultivar resistance.