19 results for epidemics
Abstract:
Two models for predicting Septoria tritici on winter wheat (cv. Riband) were developed using a program based on an iterative search for correlations between disease severity and weather. Data from four consecutive cropping seasons (1993/94 to 1996/97) at nine sites throughout England were used. A qualitative model predicted the presence or absence of Septoria tritici (at a 5% severity threshold within the top three leaf layers) using winter temperature (January/February) and wind speed up to about the first node detectable growth stage. For sites above the disease threshold, a quantitative model predicted severity of Septoria tritici using rainfall during stem elongation. A test statistic was derived to test the validity of the iterative search used to obtain both models. This statistic was used in combination with bootstrap analyses, in which the search program was rerun using weather data from previous years (and therefore uncorrelated with the disease data), to investigate how likely correlations such as those found in our models would have been in the absence of genuine relationships.
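The bootstrap validity check described above can be sketched in a few lines: rerun the same correlation search against weather data that cannot be related to the disease observations, and compare the best correlation found by the real search against that null distribution. The data, sample size, and number of candidate weather variables below are invented for illustration; the study's actual search procedure and datasets differ.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: disease severity at a set of site-years, and a pool
# of candidate weather variables the iterative search can choose from.
n_obs, n_vars = 36, 50
severity = rng.normal(size=n_obs)
weather = rng.normal(size=(n_obs, n_vars))

def best_abs_correlation(y, X):
    """Strongest |r| found when searching over all candidate variables."""
    r = [np.corrcoef(y, X[:, j])[0, 1] for j in range(X.shape[1])]
    return np.abs(r).max()

observed = best_abs_correlation(severity, weather)

# Null distribution: repeat the search against weather data uncorrelated
# with the disease observations (the paper reran it on earlier years'
# weather; here we simply simulate fresh noise).
null = np.array([
    best_abs_correlation(severity, rng.normal(size=(n_obs, n_vars)))
    for _ in range(199)
])

# How often would a search over pure noise do as well as the real search?
p_value = (1 + np.sum(null >= observed)) / (1 + len(null))
```

Because the search itself inflates the best correlation found, comparing against the null distribution of searched maxima, rather than against a textbook critical value for a single correlation, is what makes the test honest.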
Abstract:
Leaf blotch, caused by Rhynchosporium secalis, was studied in a range of winter barley cultivars using a combination of traditional plant pathological techniques and newly developed multiplex and real-time polymerase chain reaction (PCR) assays. Using PCR, symptomless leaf blotch colonization was shown to occur throughout the growing season in the resistant winter barley cv. Leonie. The dynamics of colonization throughout the growing season were similar in both Leonie and Vertige, a susceptible cultivar. However, pathogen DNA levels were approximately 10-fold higher in the susceptible cultivar, which expressed symptoms throughout the growing season. Visual assessments and PCR were also used to determine levels of R. secalis colonization and infection in samples from a field experiment used to test a range of winter barley cultivars with different levels of leaf blotch resistance. The correlation between the PCR and visual assessment data was better at higher infection levels (R² = 0.81 for leaf samples with >0.3% disease). Although resistance ratings did not correlate well with levels of disease across all cultivars tested, low levels of infection were observed in the cultivar with the highest resistance rating and high levels in the cultivar with the lowest resistance rating.
Abstract:
Potato varieties with contrasting levels of resistance were planted in pure or mixed stands in four experiments over 3 years. Three experiments compared late blight severity and progress in mixtures with those in pure stands. Disease on susceptible or moderately resistant varieties typical of those in commercial use was similar in mixtures and pure stands. In 2 of 3 years, there were slight reductions on cv. Sante, which is moderately susceptible, in mixture with cv. Cara, which is moderately resistant; Cara was unaffected by this mixture. Mixtures of an immune or near-immune partner with Cara or Sante substantially reduced disease on the latter. The effect of the size of plots of individual varieties or mixtures on blight severity was compared in two experiments. Larger plots had a greater area under the disease progress curve, but the average rate of disease progress was greater in smaller plots; this may be because most disease progress in the smaller plots took place later, under more favourable conditions. In one experiment, two planting densities were used. Density had no effect on disease and did not interact with mixture effects. The overall conclusion is that, while mixtures of potato varieties may be desirable for other reasons, they do not offer any improvement on the average of the disease resistance of the components.
Abstract:
Both airborne spores of Rhynchosporium secalis and seed infection have been implicated as major sources of primary inoculum for barley leaf blotch (scald) epidemics in fields with no previous history of barley cropping. However, little is known about their relative importance in the onset of disease. Results from both quantitative real-time PCR and visual assessments indicated that seed infection was the main source of inoculum in the field trial conducted in this study. Glasshouse studies established that the pathogen can be transmitted from infected seeds into roots, shoots and leaves without causing symptoms. Plants in the field trial remained symptomless for approximately four months before symptoms were observed in the crop. Covering the crop during part of the growing season was shown to prevent pathogen growth, despite the use of infected seed, indicating that changes in the physiological condition of the plant and/or environmental conditions may trigger disease development. However, once the disease appeared in the field it quickly became uniform throughout the cropping area. Only small amounts of R. secalis DNA were measured in 24 h spore-trap tape samples using PCR. Inoculum levels equivalent to spore concentrations of 30–60 spores per m³ of air were detected on only three occasions during the growing season. The temporal pattern and level of detection of R. secalis DNA in spore tape samples indicated that airborne inoculum was limited and most likely represented rain-splashed conidia rather than putative ascospores.
Abstract:
This review assesses the impacts, both direct and indirect, of man-made changes to the composition of the air over a 200 year period on the severity of arable crop disease epidemics. The review focuses on two well-studied UK arable crops, wheat and oilseed rape, relating these examples to worldwide food security. In wheat, impacts of changes in concentrations of SO2 in air on two septoria diseases are discussed using data obtained from historical crop samples and unpublished experimental work. Changes in SO2 seem to alter septoria disease spectra both through direct effects on infection processes and through indirect effects on soil S status. Work on the oilseed rape diseases phoma stem canker and light leaf spot illustrates indirect impacts of increasing concentrations of greenhouse gases, mediated through climate change. It is projected that, by the 2050s, if diseases are not controlled, climate change will increase yields in Scotland but halve yields in southern England. These projections are discussed in relation to strategies for adaptation to environmental change. Since many strategies take 10–15 years to implement, it is important to take appropriate decisions soon. Furthermore, it is essential to make appropriate investment in collation of long-term data, modelling and experimental work to guide such decision-making by industry and government, as a contribution to worldwide food security.
Abstract:
The incidence-severity relationship for cashew gummosis, caused by Lasiodiplodia theobromae, was studied to determine the feasibility of using disease incidence to estimate disease severity indirectly, in order to establish the potential damage caused by this disease in semiarid north-eastern Brazil. Epidemics were monitored in two cashew orchards: from 1995 to 1998 in an experimental field composed of 28 dwarf clones, and from 2000 to 2002 in a commercial orchard of a single clone. The two sites were located 10 km from each other. Logarithmic transformation achieved the best linear adjustment of incidence and severity data, as determined by coefficients of determination for place, age and pooled data. A very high correlation between incidence and severity was found in both fields, with different disease pressures, different cashew genotypes, different ages and at several epidemic stages. Thus, the easily assessed gummosis incidence could be used to estimate gummosis severity levels.
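As a sketch of the transformation step: fitting a relation of the form severity = a·incidence^b amounts to an ordinary linear regression on log-transformed data. The incidence and severity values below are invented for illustration and are not the study's measurements.

```python
import numpy as np

# Hypothetical plot-level gummosis data: incidence (% plants affected)
# and mean severity; values invented for illustration.
incidence = np.array([5.0, 10, 20, 35, 50, 65, 80, 95])
severity = np.array([0.4, 0.9, 2.0, 4.1, 6.5, 9.0, 12.5, 16.0])

# The log transformation linearises severity = a * incidence**b, so a
# least-squares line fit recovers the exponent b and log(a).
b, log_a = np.polyfit(np.log(incidence), np.log(severity), 1)

# Coefficient of determination on the transformed (linearised) scale.
fitted = log_a + b * np.log(incidence)
resid = np.log(severity) - fitted
r2 = 1 - resid.var() / np.log(severity).var()

# Estimate severity indirectly from an easily assessed incidence value.
predicted_severity = np.exp(log_a) * 40 ** b
```

The back-transform in the last line is what makes the easily assessed incidence usable as an indirect estimate of severity.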
Abstract:
Foot and mouth disease (FMD) is a major threat, not only to countries whose economies rely on agricultural exports, but also to industrialised countries that maintain a healthy domestic livestock industry by eliminating major infectious diseases from their livestock populations. Traditional methods of controlling diseases such as FMD require the rapid detection and slaughter of infected animals, and any susceptible animals with which they may have been in contact, either directly or indirectly. During the 2001 epidemic of FMD in the United Kingdom (UK), this approach was supplemented by a culling policy driven by unvalidated predictive models. The epidemic and its control resulted in the death of approximately ten million animals, public disgust with the magnitude of the slaughter, and political resolve to adopt alternative options, notably including vaccination, to control any future epidemics. The UK experience provides a salutary warning of how models can be abused in the interests of scientific opportunism.
Abstract:
We developed a stochastic simulation model incorporating most processes likely to be important in the spread of Phytophthora ramorum and similar diseases across the British landscape (covering Rhododendron ponticum in woodland and nurseries, and Vaccinium myrtillus in heathland). The simulation allows for movements of diseased plants within a realistically modelled trade network and for long-distance natural dispersal. A series of simulation experiments was run with the model, varying epidemic pressure and the linkage between natural vegetation and the horticultural trade, with or without disease spread through commercial trade, and with or without inspections with eradication, giving a 2 × 2 × 2 × 2 factorial design started at 10 arbitrary locations spread across England. Fifty replicate simulations were made at each set of parameter values. Individual epidemics varied dramatically in size due to stochastic effects throughout the model. Across a range of epidemic pressures, the size of the epidemic was 5–13 times larger when commercial movement of plants was included. A key unknown factor in the system is the area of susceptible habitat outside the nursery system. Inspections, with a probability of detection and efficiency of infected-plant removal of 80% and made at 90-day intervals, reduced the size of epidemics by about 60% across the three sectors with a density of 1% susceptible plants in broadleaf woodland and heathland. Reducing this density to 0.1% largely isolated the trade network, so that inspections reduced the final epidemic size by over 90%, and most epidemics ended without escape into nature. Even in this case, however, major wild epidemics developed in a few percent of cases. Provided the number of new introductions remains low, the current inspection policy will control most epidemics. However, as the rate of introduction increases, it can overwhelm any reasonable inspection regime, largely because of spread prior to detection.
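A toy version of such a stochastic, factorial simulation experiment can be sketched as below. The one-dimensional landscape, dispersal probabilities, inspection interval and removal rate are all invented placeholders, far simpler than the trade-network model described; the point is only the experimental design: vary factors factorially, run many stochastic replicates, and compare mean epidemic sizes.

```python
import itertools
import random

L = 1000  # finite ring of susceptible sites (a stand-in for the landscape)

def simulate_epidemic(trade_spread, inspections, rng, steps=60):
    """Toy stochastic spread from a single introduction: short-range
    natural dispersal, optional long-range 'trade' jumps, and optional
    periodic inspections that each remove ~80% of infected sites."""
    infected = {0}
    for t in range(steps):
        new = set()
        for site in infected:
            if rng.random() < 0.3:  # natural dispersal to a neighbour
                new.add((site + rng.choice([-1, 1])) % L)
            if trade_spread and rng.random() < 0.1:  # trade jump anywhere
                new.add(rng.randrange(L))
        infected |= new
        if inspections and t % 20 == 0:  # inspection removes ~80% of sites
            infected = {s for s in infected if rng.random() > 0.8}
    return len(infected)

rng = random.Random(42)
mean_size = {}
for trade, insp in itertools.product([False, True], repeat=2):
    sizes = [simulate_epidemic(trade, insp, rng) for _ in range(50)]
    mean_size[(trade, insp)] = sum(sizes) / len(sizes)
```

Even this toy shows the qualitative pattern reported: replicate sizes vary widely between stochastic runs, and long-range trade movement inflates epidemics by a large factor.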
Abstract:
Tick-borne encephalitis virus (TBEV) causes human epidemics across Eurasia. Clinical manifestations range from inapparent infections and fevers to fatal encephalitis, but the factors that determine disease severity are currently undefined. TBEV is characteristically a hemagglutinating (HA) virus; the ability to agglutinate erythrocytes tentatively reflects virion receptor/fusion activity. However, over the past few years many atypical HA-deficient strains have been isolated from patients and also from the natural European host tick, Ixodes persulcatus. By analysing the sequences of HA-deficient strains we identified three unique amino acid substitutions (D67G, E122G or D277A) in the envelope protein, each of which increases the net charge and hydrophobicity of the virion surface. Therefore, we genetically engineered virus mutants, each containing one of these three substitutions; all exhibited HA deficiency. Unexpectedly, each genetically modified non-HA virus demonstrated increased TBEV reproduction in feeding Ixodes ricinus, not the recognised tick host for these strains. Moreover, virus transmission efficiency between infected and uninfected ticks co-feeding on mice was also intensified by each substitution. Retrospectively, the mutation D67G was identified in viruses isolated from patients with encephalitis. We propose that the emergence of atypical Siberian HA-deficient TBEV strains in Europe is linked to their molecular adaptation to local ticks. This process appears to be driven by the selection of single mutations that change the virion surface, thus enhancing the receptor/fusion function essential for TBEV entry into the unfamiliar tick species. As a consequence of this adaptive mutagenesis, some of these mutations also appear to enhance the ability of TBEV to cross the human blood-brain barrier, a likely explanation for fatal encephalitis.
Future research will reveal if these emerging Siberian TBEV strains continue to disperse westwards across Europe by adaptation to the indigenous tick species and if they are associated with severe forms of TBE.
Abstract:
Epidemics of tick-borne encephalitis involving thousands of humans occur annually in the forested regions of Europe and Asia. Despite the importance of this disease, the underlying basis for the development of encephalitis remains undefined. Here, we demonstrate the key role of CD8+ T-cells in the immunopathology of tick-borne encephalitis, as shown by the prolonged survival of SCID or CD8-/- mice following infection compared with immunocompetent mice or mice with adoptively transferred CD8+ T-cells. The results imply that tick-borne encephalitis is an immunopathological disease and that the inflammatory reaction significantly contributes to the fatal outcome of the infection.
Abstract:
Key weather factors determining the occurrence and severity of powdery mildew and yellow rust epidemics on winter wheat were identified. Empirical models were formulated to qualitatively predict a damaging epidemic (>5% severity) and to quantitatively predict disease severity given that a damaging epidemic occurred. The disease data were from field experiments at 12 locations in the UK covering the period from 1994 to 2002, with matching data from weather stations within a 5 km range. Wind in December to February was the most influential factor for a damaging epidemic of powdery mildew; disease severity was best predicted by a model with temperature, humidity, and rain in April to June. For yellow rust, temperature in February to June was the most influential factor for both a damaging epidemic and disease severity. The qualitative models identified circumstances favorable for damaging epidemics, but damaging epidemics did not always occur in such circumstances, probably because of other factors such as the availability of initial inoculum and cultivar resistance.
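The two-stage structure above, a qualitative presence/absence classifier followed by a quantitative severity regression, can be sketched with a single weather factor. All numbers below are invented for illustration; the study fitted richer multi-factor models from real station data.

```python
import numpy as np

# Hypothetical site-year records: one winter weather factor (mean wind
# speed, m/s) against observed mildew severity (%). Values are invented.
wind = np.array([2.1, 3.5, 4.0, 4.8, 5.5, 6.2, 7.0, 7.8])
severity = np.array([0.5, 1.0, 2.0, 4.5, 6.0, 9.0, 12.0, 15.0])

# Qualitative model: predict a damaging epidemic (>5% severity) from the
# weather factor, choosing the cut-off that maximises classification accuracy.
damaging = severity > 5.0
cutoffs = (wind[:-1] + wind[1:]) / 2
accuracy = [np.mean((wind > c) == damaging) for c in cutoffs]
best_cutoff = cutoffs[int(np.argmax(accuracy))]

# Quantitative model: given a damaging epidemic, predict its severity by
# regressing severity on the weather factor over the damaging site-years.
slope, intercept = np.polyfit(wind[damaging], severity[damaging], 1)
```

Splitting the problem this way mirrors the abstract: the classifier answers "will a damaging epidemic occur?", and the regression answers "how severe, given that it does?".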
Abstract:
Long distance dispersal (LDD) plays an important role in many population processes such as colonization, range expansion, and epidemics. LDD of small particles such as fungal spores is often a result of turbulent wind dispersal and is best described by functions with power-law behavior in the tails ("fat tailed"). The influence of fat-tailed LDD on population genetic structure is reported in this article. In computer simulations, the population structure generated by power-law dispersal with exponents in the range of -2 to -1, in distinct contrast to that generated by exponential dispersal, has a fractal structure. As the power-law exponent becomes smaller, the distribution of individual genotypes becomes more self-similar at different scales. Common statistics like GST are not well suited to summarizing differences between the population genetic structures. Instead, fractal and self-similarity statistics demonstrated differences in structure arising from fat-tailed and exponential dispersal. When dispersal is fat tailed, a log-log plot of the Simpson index against distance between subpopulations has an approximately constant gradient over a large range of spatial scales. The fractal dimension D2 is linearly inversely related to the power-law exponent, with a slope of approximately -2. In a large simulation arena, fat-tailed LDD allows colonization of the entire space by all genotypes, whereas exponentially bounded dispersal eventually confines all descendants of a single clonal lineage to a relatively small area.
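To make the "fat tailed" contrast concrete, here is a minimal sketch comparing exponential dispersal with power-law dispersal at exponent -2 (within the -2 to -1 range studied), using inverse-transform sampling; the sample size and minimum distance are arbitrary choices for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000

# Exponentially bounded dispersal distances (thin tail), mean 1.
exp_d = rng.exponential(scale=1.0, size=n)

# Fat-tailed dispersal with p(d) proportional to d**-2 for d >= d_min.
# The CDF is F(d) = 1 - d_min/d, so inverse-transform sampling gives
# d = d_min / u for u uniform on (0, 1).
d_min = 1.0
pow_d = d_min / rng.random(n)

# The fat tail is dominated by rare, extremely long jumps: the largest
# power-law distance dwarfs the largest exponential one.
ratio = pow_d.max() / exp_d.max()
```

Those rare long jumps are what seed distant foci, which is why the simulations find fractal, self-similar genotype patterns under fat-tailed kernels while exponentially bounded kernels keep clonal lineages confined.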
Abstract:
Influenza virus epidemics occur on an annual basis and cause severe disease in the very young and old. The vaccine administered to high-risk groups is generated by amplifying, in eggs, reassortant viruses carrying chronologically relevant viral surface antigens. Every 20 years or so, influenza pandemics occur, causing widespread fatality in all age groups. These viruses display novel viral surface antigens acquired from a zoonotic source, and vaccination against them poses new issues, since production of large amounts of a respiratory virus containing novel surface antigens could be dangerous for those involved in manufacture. To minimise risks, it is advisable to use a virus whose genetic backbone is highly attenuated in man. Traditionally, the A/PR/8/34 strain of virus is used; however, the genetic basis of its attenuation is unclear. Cold-adapted (CA) strains of the influenza virus are all based on the H2N2 subtype, itself a virus with pandemic potential, and again the genetic basis of temperature sensitivity is not yet established. Reverse genetics technology allows us to engineer designer influenza viruses to order. Using this technology, we have been investigating mutations in several different gene segments to effectively attenuate potential vaccine strains, allowing the safe production of vaccine to protect against the next pandemic.
Abstract:
The Group on Earth Observations System of Systems, GEOSS, is a co-ordinated initiative by many nations to address the needs for earth-system information expressed by the 2002 World Summit on Sustainable Development. We discuss the role of earth-system modelling and data assimilation in transforming earth-system observations into the predictive and status-assessment products required by GEOSS, across many areas of socio-economic interest. First we review recent gains in the predictive skill of operational global earth-system models, on time-scales of days to several seasons. We then discuss recent work to develop from the global predictions a diverse set of end-user applications which can meet GEOSS requirements for information of socio-economic benefit; examples include forecasts of coastal storm surges, floods in large river basins, seasonal crop yield forecasts and seasonal lead-time alerts for malaria epidemics. We note ongoing efforts to extend operational earth-system modelling and assimilation capabilities to atmospheric composition, in support of improved services for air-quality forecasts and for treaty assessment. We next sketch likely GEOSS observational requirements in the coming decades. In concluding, we reflect on the cost of earth observations relative to the modest cost of transforming the observations into information of socio-economic value.
Abstract:
Epidemics of community-acquired Staphylococcus aureus are caused by strains producing high concentrations of phenol-soluble modulins (PSMs). How neutrophils sense PSMs is revealed in this issue of Cell Host & Microbe. Such interactions are key to infection outcome and may be the basis for development of new treatment strategies.