985 results for disease progress
Abstract:
Colletotrichum gossypii var. cephalosporioides, the fungus that causes ramulosis disease of cotton, is widespread in Brazil and can cause severe yield loss. Because weather conditions greatly affect disease development, the objective of this work was to develop weather-based models to assess disease favorability. Latent period, incidence, and severity of ramulosis symptoms were evaluated in controlled-environment experiments using factorial combinations of temperature (15, 20, 25, 30, and 35°C) and leaf wetness duration (0, 4, 8, 16, 32, and 64 h after inoculation). Severity was modeled as an exponential function of leaf wetness duration and temperature. At the optimum temperature for disease development, 27°C, the average latent period was 10 days. Maximum ramulosis severity occurred from 20 to 30°C, with sharp decreases at lower and higher temperatures. Ramulosis severity increased as wetness periods were lengthened from 4 to 32 h. In field experiments at Piracicaba, São Paulo State, Brazil, cotton plots were inoculated (10⁵ conidia ml⁻¹) and ramulosis severity was evaluated weekly. The model obtained from the controlled-environment study was used to generate a disease favorability index for comparison with disease progress rate in the field. Hourly measurements of solar radiation, temperature, relative humidity, leaf wetness duration, rainfall, and wind speed were also evaluated as possible explanatory variables. Both the disease favorability model and a model based on rainfall explained ramulosis growth rate well, with R² of 0.89 and 0.91, respectively. They are proposed as models of ramulosis development rate on cotton in Brazil, and the weather-disease relationships revealed by this work can form the basis of a warning system for ramulosis development.
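As an illustration of the kind of weather-favorability model described in this abstract, the sketch below (Python) combines a beta-type temperature response peaking near 27°C with an exponential saturation in leaf wetness duration. The functional form, parameter names, and example observations are assumptions for demonstration only, not the authors' fitted equation or data.

```python
# Hypothetical sketch: severity = Smax * beta(T) * (1 - exp(-k * W)),
# a common way to couple a temperature optimum with a wetness-duration response.
import numpy as np
from scipy.optimize import curve_fit

def favorability(X, Smax, k, Tmin, Topt, Tmax):
    """Severity (%) from temperature T (deg C) and wetness duration W (h)."""
    T, W = X
    a = 1.0                                   # fixed shape exponent, kept simple
    b = a * (Topt - Tmin) / (Tmax - Topt)     # makes the beta curve peak at Topt
    x1 = np.clip((Tmax - T) / (Tmax - Topt), 0.0, None)
    x2 = np.clip((T - Tmin) / (Topt - Tmin), 0.0, None)
    beta = x1 ** a * x2 ** b                  # 1.0 at Topt, 0 outside (Tmin, Tmax)
    return Smax * beta * (1.0 - np.exp(-k * W))

# Made-up controlled-environment observations: temperature (C), wetness (h), severity (%)
T_obs = np.array([15, 20, 25, 30, 35, 25, 25, 25], dtype=float)
W_obs = np.array([32, 32, 32, 32, 32, 4, 8, 16], dtype=float)
S_obs = np.array([2.0, 18.0, 30.0, 25.0, 1.0, 6.0, 12.0, 22.0])

popt, _ = curve_fit(favorability, (T_obs, W_obs), S_obs,
                    p0=[30.0, 0.1, 10.0, 27.0, 40.0],
                    bounds=([1.0, 0.01, 5.0, 20.0, 36.0], [100.0, 2.0, 14.0, 32.0, 60.0]))
print(dict(zip(["Smax", "k", "Tmin", "Topt", "Tmax"], np.round(popt, 2))))
```

A fitted surface of this kind can then be driven with hourly temperature and wetness records to produce a daily favorability index of the sort compared with field progress rates.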
Abstract:
The increase in incidence of charcoal rot, caused by Macrophomina phaseolina, on soybeans (Glycine max) was followed for four seasons in conventional and no-till cropping systems. In the 1997/98 and 2000/01 seasons, total precipitation between sowing and harvest reached 876.3 and 846.9 mm, respectively. For these seasons, disease incidence did not differ significantly between the no-till and conventional systems. In 1998/99 and 1999/00, precipitation totaled 689.9 and 478.3 mm, respectively. In 1998/99, disease incidence was 43.7% in the no-till system and 53.1% in the conventional system. In 1999/00, the final incidence was 68.7% and 81.2% for the no-till and conventional systems, respectively. For these two seasons, precipitation was lower than that required for soybean crops (840 mm), and average disease incidence was significantly higher in the conventional system. The concentration of microsclerotia in soil was higher in samples collected from the conventional system at 0-10 cm depth. However, analysis of microsclerotia in roots showed no difference in years with adequate rain. In dry years, however, roots from plants grown under the conventional system had significantly more microsclerotia. Because of the wide host range of M. phaseolina and the long survival of its microsclerotia, crop rotation would probably have little benefit in reducing charcoal rot. Under these study conditions, a better alternative may be to suppress charcoal rot by using the no-till cropping system to conserve soil moisture and reduce disease progress.
Abstract:
The progress of severity of southern rust of maize (Zea mays), caused by Puccinia polysora, was quantified in staggered plantings in different geographical areas of Brazil, from October to May, over two years (1995-1996 and 1996-1997). The logistic model, fitted to the data, described the disease progress curves better than the Gompertz model. Four components of the disease progress curves (maximum disease severity; area under the disease progress curve, AUDPC; area under the disease progress curve around the inflection point, AUDPCi; and epidemic rate) were used to compare the epidemics in different areas and at different times of planting. The AUDPC, AUDPCi, and the epidemic rate were analyzed in relation to the weather variables (temperature, relative humidity, hours of relative humidity >90%, and rainfall) recorded during the trials. Disease severity reached levels greater than 30% in Piracicaba and Guaíra in the plantings between December and January. Lower values of AUDPC occurred in later plantings at both locations. The epidemic rate was positively correlated (P < 0.05) with mean daily temperature and negatively correlated with hours of relative humidity >90%. The AUDPC was not correlated with any weather variable. The AUDPCi was negatively related to both humidity variables, but not to rainfall. Long periods (mostly >13 h day⁻¹) of relative humidity >90% (which corresponded to leaf wetness) occurred in Castro, where severity of southern rust was always low, hence the negative correlations between disease and the two humidity variables.
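As a brief, hedged illustration of the model comparison used above, the sketch below fits the standard logistic and Gompertz forms to a disease progress curve and reports the rate parameter and coefficient of determination for each. The weekly severity values are hypothetical placeholders, not data from these trials.

```python
# Logistic vs. Gompertz fits to a (made-up) southern rust progress curve.
import numpy as np
from scipy.optimize import curve_fit

def logistic(t, y0, r):
    # y0 = initial severity (proportion), r = logistic rate; asymptote fixed at 1
    return 1.0 / (1.0 + ((1.0 - y0) / y0) * np.exp(-r * t))

def gompertz(t, y0, r):
    # Gompertz form with the same parameterization style
    return np.exp(np.log(y0) * np.exp(-r * t))

t = np.array([0, 7, 14, 21, 28, 35, 42, 49], dtype=float)          # days after onset
y = np.array([0.01, 0.02, 0.05, 0.11, 0.19, 0.27, 0.33, 0.36])     # severity proportion

def r_squared(model, params):
    resid = y - model(t, *params)
    return 1.0 - np.sum(resid ** 2) / np.sum((y - y.mean()) ** 2)

for name, model in (("logistic", logistic), ("Gompertz", gompertz)):
    p, _ = curve_fit(model, t, y, p0=[0.01, 0.1], bounds=([1e-6, 1e-4], [0.5, 2.0]))
    print(f"{name}: y0 = {p[0]:.4f}, rate = {p[1]:.4f} per day, R2 = {r_squared(model, p):.3f}")
```

The better-fitting model is typically chosen on R2 and residual patterns, and its rate parameter serves as the epidemic rate compared across plantings.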
Abstract:
Asian soybean rust, caused by the fungus Phakopsora pachyrhizi, was reported at epidemic levels in 2003/2004 and is the main soybean disease in Brazil. The aim of this study was to investigate the spread of Asian soybean rust and to quantify airborne urediniospores in the region of Campo Mourão, Paraná State, Brazil. Three experiments were conducted under field conditions during the 2007/08 and 2008/09 crop seasons. Using the disease gradient method, provided by the application of increasing levels of the fungicide tebuconazole, four Asian soybean rust epidemics of different intensities were obtained in each experiment. To quantify the urediniospores, weathercock-type spore collectors were installed during and between the two crop seasons. Disease progress curves were plotted for each epidemic, and maximum severity was estimated. The curves were fitted to the logistic model, which provided higher coefficients of determination and more randomly distributed residuals over time. Analyses of the area under the disease progress curve showed that the largest epidemics occurred in the 2007/2008 crop season and that progress rates for severity were higher, even among plants protected with the fungicide. The number of urediniospores collected in the air was related to the presence of soybean plants in the cultivated crops. The quantity of urediniospores was also positively correlated with disease severity and incidence, as well as with cumulative rainfall and the number of days favorable for P. pachyrhizi infection.
Abstract:
Periodontal disease progresses through destructive acute phases intercalated with reparative chronic phases. The aim of this study was to investigate the clinical and histological evidence of the reparative phase of periodontal disease by analyzing bone wall conditions inside periodontal pockets and histological images of periodontal pockets identified in relevant publications. Eighty-one patients with periodontitis were randomly assigned to this study. Clinical and radiographic parameters were established to diagnose periodontal disease, providing a sample of 133 diseased areas, which were treated by modified Widman flap. Documentation by digital photography was recorded during surgery. Relevant publications showing histological images of periodontal pockets were identified in the Medline, PubMed and Google databases, then scanned and digitized. All images obtained were evaluated, and the presence of reparative evidence in the zone around the underlying destroyed alveolar bone was critically analyzed. All periodontal bone defects showed cortical bone repair at different levels inside the defects. All histological images of periodontal pockets identified in relevant publications showed repaired gingival-attached connective tissue localized above the underlying destroyed alveolar bone. All the evidence analyzed in this study suggests that periodontal disease is predominantly chronic and quiescent, showing reparative phases at different levels.
Abstract:
Acknowledgments: This work was supported by the Croatian Science Foundation grant no. IP-2014-09-9730 (“Tau protein hyperphosphorylation, aggregation, and trans-synaptic transfer in Alzheimer’s disease: cerebrospinal fluid analysis and assessment of protective neuroprotective compounds”) and European Cooperation in Science and Technology (COST) Action CM1103 (“Structure-based drug design for diagnosis and treatment of neurological diseases: dissecting and modulating complex function in the monoaminergic systems of the brain”). PRH is supported in part by NIH grant P50 AG005138.
Abstract:
The aim of this study was to verify whether fungigation via drip irrigation is an alternative to the conventional spraying method for controlling early blight on tomato. Tomato plants (variety Santa Clara) were grown in pots inside a greenhouse. Fifty days after transplanting, the plants were inoculated with Alternaria solani and treated with four different fungicides: azoxystrobin (8 g 100 L⁻¹), difenoconazole (50 mL 100 L⁻¹), metiram + pyraclostrobin (200 g 100 L⁻¹) and tebuconazole (100 mL 100 L⁻¹), using two application methods: conventional spraying and drip fungigation. The control plants did not receive fungicide. Disease severity was assessed with a rating scale and expressed as the area under the disease progress curve (AUDPC), together with production factors such as number, weight and average diameter of fruits, and productivity. The experimental design was completely randomized in a 4 x 2 + 1 factorial scheme with eight replicates. Each plot consisted of one plant in one pot. A 27% reduction in disease severity was observed compared with the control plants, with no significant difference between application methods. The number of fruits did not differ statistically between treatments. The average weight and diameter of the fruits were greater in plants that received fungicide than in the control plants, reflecting an increase in productivity. Fungigation through drip irrigation is an alternative to the conventional spraying method on cultivated tomato.
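For reference, the AUDPC summary used in this and several other studies listed here is a simple trapezoidal integration of severity over the assessment dates. The sketch below uses hypothetical weekly values, not the experiment's data.

```python
# Trapezoidal AUDPC from repeated severity assessments (illustrative values only).
def audpc(days, severity):
    """Sum of mean severity between consecutive assessments times the interval length."""
    if len(days) != len(severity) or len(days) < 2:
        raise ValueError("need at least two paired assessments")
    return sum((severity[i] + severity[i + 1]) / 2.0 * (days[i + 1] - days[i])
               for i in range(len(days) - 1))

days = [0, 7, 14, 21, 28]           # days after inoculation
control = [2, 10, 25, 45, 60]       # severity (%) without fungicide
treated = [2, 6, 14, 28, 40]        # severity (%) with fungicide
print("AUDPC control:", audpc(days, control), "%-days")
print("AUDPC treated:", audpc(days, treated), "%-days")
```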
Abstract:
Maize breeding programmes in Brazil and elsewhere seek reliable methods to identify genotypes resistant to Phaeosphaeria leaf spot. The area under the disease progress curve (AUDPC) is an accurate method to evaluate the severity of foliar diseases. However, at least three data points are required to calculate the AUDPC, which is unfeasible when there are thousands of genotypes to be assessed. The aim of this work was to estimate the heritability of disease resistance, evaluate disease severity at different times using a nine-point scale in comparison with the AUDPC, and establish the most suitable phenological period for disease assessment. A repeated experiment was conducted in an 11 x 11 lattice design with three replications. Disease assessments were carried out at flowering and at 15 and 30 days post-anthesis for the parental lines DS95 and DAS21, the F1 generation and 118 F2:3 progenies. The AUDPC was then obtained and the results compared with the single-point evaluations used to calculate it. Individual and joint analyses of variance were conducted to obtain heritability estimates. The assessments performed after the flowering stage gave higher estimates of heritability and correlation with the AUDPC. We conclude that one assessment between the 15th and 30th day after flowering can provide enough information to distinguish maize genotypes for resistance to Phaeosphaeria leaf spot under tropical conditions.
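As a hedged aside on the heritability estimates mentioned above, a common ANOVA-based estimate on a progeny-mean basis is h² = (MS_genotype - MS_error) / MS_genotype. The sketch below uses made-up mean squares, not the study's values.

```python
# Broad-sense heritability of genotype means from ANOVA mean squares (illustrative only).
def heritability_progeny_mean(ms_genotype, ms_error):
    """h2 = (MS_genotype - MS_error) / MS_genotype, truncated at zero."""
    if ms_genotype <= 0:
        raise ValueError("MS_genotype must be positive")
    return max(0.0, (ms_genotype - ms_error) / ms_genotype)

# Hypothetical mean squares from an analysis of severity scores in a lattice design
ms_g, ms_e = 2.40, 0.85
print(f"h2 (progeny-mean basis) = {heritability_progeny_mean(ms_g, ms_e):.2f}")
```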
Abstract:
The effects of copper sprays on annual and polyetic progress of citrus canker, caused by Xanthomonas citri subsp. citri, in the presence of the Asian citrus leafminer (Phyllocnistis citrella), were evaluated in a study conducted in a commercial orchard in northwest Paraná State, Brazil, where citrus canker is endemic. Nonlinear monomolecular, logistic and Gompertz models were fitted to monthly disease incidence data (proportion of leaves with symptoms) for each treatment for three seasons. The logistic model provided the best estimate of disease progress for all years and treatments evaluated, and logistic parameter estimates were used to describe polyetic disease dynamics. Although citrus canker incidence increased during each of the seasons studied, it decreased over the whole study period, more so in copper-treated trees than in water-sprayed controls. Copper treatment reduced disease incidence compared with controls in every year, especially 2004-2005, when incidence was ca. 10-fold higher in controls than in treated plots (estimated asymptote values 0.82 and 0.07, respectively). Copper treatment also reduced estimated initial disease incidence and epidemic growth rates every year.
Abstract:
Nutrition of bean plants and anthracnose intensity as a function of silicon and copper application. The objective of this work was to evaluate the effect of calcium silicate and copper sulfate on anthracnose intensity and on the nutrition of bean plants. The experiment was conducted in a randomized block design following a 4 x 4 factorial arrangement (four levels of calcium silicate and four levels of copper sulfate) plus two additional treatments (plants without inoculation and plants sprayed with benomyl). Four evaluations of the incidence and severity of anthracnose were made, in addition to measuring total leaf area. At the end of the evaluations, incidence and severity data were integrated over time to obtain the area under the disease progress curve (AUDPC). Contents of N, P, K, Ca, Mg, B, Cu, Fe, Mn, Zn, Si and lignin were determined in the aerial part. A linear decrease of the intensity AUDPC was observed with increasing doses of calcium silicate. The severity AUDPC was influenced by the copper doses, with a reduction of 35% at the highest dose. The supply of silicon and copper altered the contents of K, Mg, S, Zn, Ca and Si in the aerial part of the bean plants.
Abstract:
Some patients with liver disease progress to cirrhosis, but the risk factors for cirrhosis development are unknown. Dyskeratosis congenita, an inherited bone marrow failure syndrome associated with mucocutaneous anomalies, pulmonary fibrosis, and cirrhosis, is caused by germline mutations of genes in the telomerase complex. We examined whether telomerase mutations also occurred in sporadic cirrhosis. In all, 134 patients with cirrhosis of common etiologies treated at the Liver Research Institute, University of Arizona, between May 2008 and July 2009, and 528 healthy subjects were screened for variation in the TERT and TERC genes by direct sequencing; an additional 1,472 controls were examined for the most common genetic variation observed in patients. Telomere length of leukocytes was measured by quantitative polymerase chain reaction. Functional effects of genetic changes were assessed by transfection of mutation-containing vectors into telomerase-deficient cell lines, and telomerase activity was measured in cell lysates. Nine of the 134 patients with cirrhosis (7%) carried a missense variant in TERT, resulting in a cumulative carrier frequency significantly higher than in controls (P = 0.0009). One patient was homozygous and eight were heterozygous. The allele frequency for the most common missense TERT variant was significantly higher in patients with cirrhosis (2.6%) than in 2,000 controls (0.7%; P = 0.0011). One additional patient carried a TERC mutation. The mean telomere length of leukocytes in patients with cirrhosis, including six mutant cases, was shorter than in age-matched controls (P = 0.0004). Conclusion: Most TERT gene variants reduced telomerase enzymatic activity in vitro. Loss-of-function telomerase gene variants associated with short telomeres are risk factors for sporadic cirrhosis. (HEPATOLOGY 2011;53:1600-1607)
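As context for the qPCR telomere measurement mentioned above, relative telomere length is commonly expressed as a T/S ratio from telomere and single-copy-gene Ct values (a Cawthon-style delta-delta-Ct calculation). The sketch below is a generic illustration with made-up Ct values, not the study's protocol or data.

```python
# Relative telomere length (T/S) from qPCR cycle-threshold values (illustrative only).
def relative_telomere_length(ct_tel, ct_scg, ct_tel_ref, ct_scg_ref):
    """T/S ratio of a sample relative to a reference DNA via the delta-delta-Ct method."""
    delta_sample = ct_tel - ct_scg        # telomere minus single-copy gene, sample
    delta_ref = ct_tel_ref - ct_scg_ref   # same difference for the reference DNA
    return 2.0 ** -(delta_sample - delta_ref)

# Hypothetical Ct values: a patient sample compared with pooled reference DNA
print(f"T/S = {relative_telomere_length(18.2, 22.5, 17.1, 22.4):.2f}")  # < 1 means shorter telomeres
```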
Abstract:
A plant's nutritional balance can influence its resistance to diseases. To evaluate the effect of increasing doses of N and K on yield and on the severity of maize white spot, two field experiments were installed, one in Ijaci, Minas Gerais, and the other in Sete Lagoas, Minas Gerais. The experimental design was randomized blocks in a 5 x 5 factorial arrangement with four replications. The treatments consisted of five doses of N (20, 40, 80, 150 and 190 kg ha⁻¹ of N in experiments 1 and 2) and five doses of K (15, 30, 60, 120 and 180 kg ha⁻¹ of K in experiment 1 and 8.75, 17.5, 35, 50 and 100 kg ha⁻¹ of K in experiment 2). The susceptible cultivar 30P70 was planted in both experiments. Each plot consisted of four rows 5 m long, with a useful area of the two central rows of 3 m each. Evaluations began 43 days after emergence (DAE) in the first experiment and 56 DAE in the second. There was no significant interaction between the doses of N and K for disease progress; an effect was observed only for N. K did not influence yield or disease severity in these experiments. Larger areas under the white spot severity progress curve and higher yields were observed with increasing doses of N. Thus, with increasing doses of N, both white spot severity and yield increased.
Abstract:
Spot blotch, caused by Bipolaris sorokiniana, is an important wheat disease, mainly in hot and humid regions. The aim of this study was to evaluate the response of wheat to different sources and modes of Si application, as related to the severity of wheat spot blotch and plant growth, in two Si-deficient Latosols (Oxisols). A greenhouse experiment was arranged in a 2 x 5 factorial completely randomized design, with eight replications. The treatments consisted of two soils (Yellow Latosol and Red Latosol) and five Si supply modes (no Si application; Si applied as calcium silicate or monosilicic acid to the soil; and Si applied as potassium silicate or monosilicic acid to wheat leaves). No significant differences were observed between the two soils. When Si was applied to the soil, regardless of the Si source, the disease incubation period, shoot dry matter yield and Si content in leaves were greater. Additionally, the final spot blotch severity was lower, and the area under the spot blotch disease progress curve and the leaf insertion angle were smaller. Results of foliar Si application were similar to those observed in the control plants.
Abstract:
Four field trials were conducted from 1995 to 1997 with the objective of studying the response of four upland rice cultivars to foliar fungicide application in relation to panicle blast control, grain yield and sustainability. Differential disease control and yield responses of the cultivars to fungicide treatment were obtained. Losses in grain yield of cultivars IAC 202, Caiapó, Rio Paranaíba and Araguaia due to panicle blast were 44.8%, 27.4%, 24.4% and 18.2%, respectively. Two applications of tricyclazole or benomyl controlled panicle blast, as indicated by lower values of the disease progress curve and relative panicle blast severity, and increased grain yield of the cultivar IAC 202. With two sprays of tricyclazole, the losses in 100-panicle grain weight and grain yield were significantly reduced by 22.3% and 25.1% in IAC 202 and by 23.6% and 20.5% in Caiapó, respectively. The sustainable value index for yield was greatest with two applications of tricyclazole (0.59), followed by one application at booting (0.46) and one at heading (0.40) in cultivar IAC 202. Results showed no yield response of the cultivars Rio Paranaíba and Araguaia to fungicide applications for panicle blast control.