23 results for SOIL NEMATODE COMMUNITY
Abstract:
This is part of a GRDC funded project led by Dr Jeremy Whish of CSIRO Ecosystem Sciences. The project aims to build a root-lesion nematode module into the crop growth simulation program APSIM (Agricultural Production Systems Simulator). This will utilise existing nematode and crop data from field, glasshouse and laboratory research led by Dr John Thompson. New data will be collected to validate and extend the model.
Abstract:
A field experiment was established in which an amendment of poultry manure and sawdust (200 t/ha) was incorporated into some plots but not others, and then a permanent pasture or a sequence of biomass-producing crops was grown with and without tillage, with all biomass being returned to the soil. After 4 years, soil C levels were highest in amended plots, particularly those that had been cropped using minimum tillage, and lowest in non-amended and fallowed plots, regardless of how they had been tilled. When ginger was planted, symphylans caused severe damage in all treatments, indicating that cropping, tillage and organic matter management practices commonly used to improve soil health are not necessarily effective for all crops or soils. During the rotational phase of the experiment, the development of suppressiveness to three key pathogens of ginger was monitored using bioassays. Results for root-knot nematode (Meloidogyne javanica) indicated that for the first 2 years, amended soil was more suppressive than non-amended soil from the same cropping and tillage treatment, whereas under pasture, the amendment only enhanced suppressiveness in the first year. Suppressiveness was generally associated with higher C levels and enhanced biological activity (as measured by the rate of fluorescein diacetate (FDA) hydrolysis and numbers of free-living nematodes). Reduced tillage also enhanced suppressiveness, as gall ratings and egg counts in the second and third years were usually significantly lower in cropped soils under minimum rather than conventional tillage. Additionally, soil that was not disturbed during the process of setting up bioassays was more suppressive than soil that had been gently mixed by hand. Results of bioassays with Fusarium oxysporum f. sp. zingiberi were too inconsistent to draw firm conclusions, but the severity of fusarium yellows was generally higher in fumigated fallow soil than in other treatments, with soil management practices having little impact on disease severity. With regard to Pythium myriotylum, biological factors capable of reducing rhizome rot were present, but were not effective enough to suppress the disease under environmental conditions that were ideal for disease development.
Abstract:
The root-lesion nematode Pratylenchus thornei is widely distributed in Australian wheat (Triticum aestivum) producing regions and can reduce yield by more than 50%, costing the industry AU$50 million per year. Genetic resistance is the most effective form of management, but no commercial cultivars are resistant (R) and the best parental lines are only moderately R. The wild relatives of wheat have evolved in P. thornei-infested soil for millennia and may have superior levels of resistance that can be transferred to commercial wheats. To evaluate this hypothesis, a collection of 251 accessions of wheat and related species was tested for resistance to P. thornei under controlled conditions in glasshouse pot experiments over two consecutive years. Diploid accessions were more R than tetraploid accessions, which in turn were more R than hexaploid accessions. Of the diploid accessions, 11 (52%) Aegilops speltoides (S genome, closely related to the B genome), 10 (43%) Triticum monococcum (Am genome) and 5 (24%) Triticum urartu (Au genome) accessions were R. One tetraploid accession (Triticum dicoccoides) was R. This establishes for the first time that P. thornei resistance is located on the A genome and confirms resistance on the B genome. Since previous research has shown that the moderate levels of P. thornei resistance in hexaploid wheat are dose-dependent, additive and located on the B and D genomes, it would seem efficient to target A-genome resistance for introduction to hexaploid lines through direct crossing, using durum wheat as a bridging species and/or through the development of amphiploids. This would allow resistances from each genome to be combined to generate a higher level of resistance than is currently available in hexaploid wheat.
Abstract:
The root-lesion nematode, Pratylenchus thornei, can reduce wheat yields by >50%. Although this nematode has a broad host range, crop rotation can be an effective tool for its management if the host status of crops and cultivars is known. The summer crops grown in the northern grain region of Australia are poorly characterised for their resistance to P. thornei and their role in crop sequencing to improve wheat yields. In a 4-year field experiment, we prepared plots with high or low populations of P. thornei by growing susceptible wheat or partially resistant canaryseed (Phalaris canariensis); after an 11-month, weed-free fallow, several cultivars of eight summer crops were grown. Following another 15-month, weed-free fallow, the P. thornei-intolerant wheat cv. Strzelecki was grown. Populations of P. thornei were determined to 150 cm soil depth throughout the experiment. When two partially resistant crops were grown in succession, e.g. canaryseed followed by panicum (Setaria italica), P. thornei populations were <739/kg soil and subsequent wheat yields were 3245 kg/ha. In contrast, after two susceptible crops, e.g. wheat followed by soybean, P. thornei populations were 10,850/kg soil and subsequent wheat yields were just 1383 kg/ha. Regression analysis showed a linear, negative response of wheat biomass and grain yield to increasing P. thornei populations, with predicted losses of 77% for biomass and 62% for grain yield. The best predictor of wheat yield loss was the P. thornei population at 0-90 cm soil depth. Crop rotation can be used to reduce P. thornei populations and increase wheat yield, with the greatest gains following two partially resistant crops grown in sequence.
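The contrast between the two rotation extremes can be sketched as a simple straight-line yield response. The two (population, yield) pairs below are the treatment means reported above, but the two-point fit is illustrative only; the study's regression was fitted across all treatments, so its predicted losses differ from this sketch.

```python
# Treatment means reported in the abstract:
# (P. thornei/kg soil, subsequent wheat grain yield in kg/ha)
low = (739, 3245)     # two partially resistant crops in succession
high = (10850, 1383)  # two susceptible crops in succession

# Straight line through the two points (illustrative; the study
# regressed yield on population across all treatments).
slope = (high[1] - low[1]) / (high[0] - low[0])
intercept = low[1] - slope * low[0]

def predicted_yield(p_thornei_per_kg):
    """Sketch of yield (kg/ha) as a linear function of nematode density."""
    return intercept + slope * p_thornei_per_kg

# Relative yield loss moving from the low- to the high-population plots
relative_loss = 1 - high[1] / low[1]
print(f"slope: {slope:.3f} kg/ha yield per nematode/kg soil")
print(f"relative yield loss, low -> high population: {relative_loss:.0%}")
```

The negative slope reproduces the direction of the reported response; the magnitude of loss differs from the study's 62% figure because only two of the many treatment means are used here.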
Abstract:
Natural biological suppression of soil-borne diseases is a function of the activity and composition of soil microbial communities. Soil microbe and phytopathogen interactions can occur prior to crop sowing and/or in the rhizosphere, subsequently influencing both plant growth and productivity. Research on suppressive microbial communities has concentrated on bacteria, although fungi can also influence soil-borne disease. Fungi were analyzed in co-located soils 'suppressive' or 'non-suppressive' for disease caused by Rhizoctonia solani AG 8 at two sites in South Australia using 454 pyrosequencing targeting the fungal 28S LSU rRNA gene. DNA was extracted from a minimum of 125 g of soil per replicate to reduce micro-scale community variability, and from soil samples taken at sowing and from the rhizosphere at 7 weeks to cover the peak Rhizoctonia infection period. A total of ∼994,000 reads were classified into 917 genera covering 54% of the RDP Fungal Classifier database, a high diversity for an alkaline, low organic matter soil. Statistical analyses and community ordinations revealed significant differences in fungal community composition between suppressive and non-suppressive soils and between soil types/locations. The majority of differences associated with suppressive soils were attributed to fewer than 40 genera, including a number of endophytic species with plant-pathogen-suppression potential and mycoparasites such as Xylaria spp. Non-suppressive soils were dominated by Alternaria, Gibberella and Penicillium. Pyrosequencing generated a detailed description of fungal community structure and identified candidate taxa that may influence pathogen-plant interactions in stable disease suppression. © 2014 Penton et al.
Abstract:
Soil biogeochemical cycles are largely mediated by microorganisms, while fire significantly modifies these cycles, mainly by altering microbial communities and substrate availability. The majority of studies on fire effects have focused on the surface soil; therefore, our understanding of the vertical distribution of microbial communities and the impacts of fire on nitrogen (N) dynamics in the soil profile is limited. Here, we examined changes in soil denitrification capacity (DNC) and denitrifying communities with depth under different burning regimes, and their interaction with environmental gradients along the soil profile. Results showed that soil depth had a more pronounced impact than burning treatment on bacterial community size. The abundance of 16S rRNA and denitrification genes (narG, nirK, and nirS) declined exponentially with soil depth. Surprisingly, nosZ-harboring denitrifiers were enriched in the deeper soil layers, suggesting that they can better adapt to stress conditions (e.g. oxygen deficiency and nutrient limitation) than other denitrifiers. Soil nutrients, including dissolved organic carbon (DOC), total soluble N (TSN), ammonium (NH4+), and nitrate (NO3−), declined significantly with soil depth, which probably contributed to the vertical distribution of denitrifying communities. Soil DNC also decreased significantly with soil depth and was negligible below 20 cm. These findings provide new insights into the niche separation of N-cycling functional guilds along the soil profile under a varied fire disturbance regime.
Abstract:
Pratylenchus thornei is a major pathogen of wheat in Australia. Two glasshouse experiments with four wheat cultivars that had different final populations (Pf) of P. thornei in the field were used to optimise conditions for assessing resistance. With different initial populations (Pi) ranging up to 5250 P. thornei/kg soil, Pf increased up to 16 weeks after sowing, and then decreased at 20 weeks in some cultivar x Pi combinations. The population dynamics of P. thornei up to 16 weeks were best described by a modified exponential equation, Pf(t) = a·Pi·e^(kt), where Pf(t) is the final population density at time t, Pi is the initial population density, a is the proportion of Pi that initiates population development, and k is the intrinsic rate of increase of the population. The cultivar GS50a had very low k values at Pi of 5250 and 1050, indicating its resistance; Suneca and Potam had high k values, indicating susceptibility; whereas intolerant Gatcher had a low value at the higher Pi and a high value at the lower Pi. Nitrate fertiliser increased plant growth and Pf values of susceptible cultivars, but in unplanted soil it decreased Pf. Nematicide (aldicarb, 5 mg/kg soil) killed P. thornei more effectively in planted than in unplanted soil and increased plant growth, particularly in the presence of N fertiliser. In both experiments, the wheat cultivars Suneca and Potam were more susceptible than the cultivar GS50a, reflecting field results. The method chosen to discriminate wheat cultivars was to assess Pf after growth for 16 weeks in soil with Pi of ~1050–5250 P. thornei/kg soil, fertilised with 200 mg NO3–N/kg soil.
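The modified exponential model above can be sketched numerically. The parameter values below are hypothetical, chosen only to illustrate how a low k (resistant-like response) and a high k (susceptible-like response) diverge by the 16-week assessment point; they are not fitted values from the experiments.

```python
import math

def pf(pi, a, k, t):
    """Modified exponential model Pf(t) = a * Pi * e^(k*t).

    pi: initial population density (P. thornei/kg soil)
    a:  proportion of Pi that initiates population development
    k:  intrinsic rate of increase (per week)
    t:  time after sowing (weeks)
    """
    return a * pi * math.exp(k * t)

# Hypothetical parameters for illustration only.
pi = 1050  # an initial density within the range tested
resistant_like = pf(pi, a=0.5, k=0.05, t=16)    # low k
susceptible_like = pf(pi, a=0.5, k=0.25, t=16)  # high k
print(f"low k:  {resistant_like:.0f} P. thornei/kg soil")
print(f"high k: {susceptible_like:.0f} P. thornei/kg soil")
```

Given observed (t, Pf) data, k can be recovered as the slope of a log-linear regression of ln(Pf/Pi) on t, which is how resistance and susceptibility separate cleanly in this model.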
Abstract:
Two trials were conducted in this project. One was a continuation of work started under a previous GRDC/SRDC-funded activity, 'Strategies to improve the integration of legumes into cane based farming systems'. This trial aimed to assess the impact of trash and tillage management options and nematicide application on nematodes and crop performance. Methods and results are contained in the following publication: Halpin NV, Stirling GR, Rehbein WE, Quinn B, Jakins A, Ginns SP. The impact of trash and tillage management options and nematicide application on crop performance and plant-parasitic nematode populations in a sugarcane/peanut farming system. Proc. Aust. Soc. Sugar Cane Technol. 37, 192-203. Nematicide application in the plant crop significantly reduced total numbers of plant-parasitic nematodes (PPN) but had no impact on yield. Application of nematicide to the ratoon crop significantly reduced sugar yield. The study confirmed other work demonstrating that strategies such as reduced tillage lowered total PPN populations, suggesting that the soil in those treatments was more suppressive to PPN. The second trial, a variety trial, demonstrated the limited value of nematicide application in sugarcane farming systems. This study has highlighted that growers should not view nematicides as a 'cure-all' for paddocks that have historically had high PPN numbers. Nematicides have high mammalian toxicity, have the potential to contaminate groundwater (Kookana et al. 1995), and are costly. The cost of nematicide used in R1 was approx. $320–$350/ha, adding $3.50/t of cane in a 100 t/ha crop. Our study also demonstrated that a single nematicide treatment at the application rate registered for sugarcane is not very effective in reducing populations of nematode pests. There appears to be some level of resistance to nematodes within the current suite of varieties available to the southern canelands.
For example, soil in plots growing Q183 had 560% more root-knot nematodes/200 mL soil than plots that grew Q245. The authors see great value in investing in a nematode screening program that could rate varieties into groups of susceptibility to both major sugarcane nematode pests. Such a rating could then be built into a decision-support 'tree' or tool to better enable producers to select varieties on a paddock-by-paddock basis.
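The per-tonne cost figure in the trial summary above follows directly from the per-hectare cost and crop yield; a minimal check of that arithmetic, using the upper end of the reported cost range:

```python
# Per-tonne nematicide cost, using the figures reported above.
cost_per_ha = 350.0          # upper end of the reported $320-$350/ha
cane_yield_t_per_ha = 100.0  # the 100 t/ha crop assumed in the abstract
cost_per_tonne = cost_per_ha / cane_yield_t_per_ha
print(f"${cost_per_tonne:.2f}/t of cane")  # $3.50/t of cane
```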