88 results for soil CO2 efflux
Abstract:
Workshops to increase participants' (farmers') understanding and knowledge of the role of soil health in supporting sustainable farm businesses and healthy catchments through variable circumstances.
Abstract:
Radopholus similis is a major constraint to banana production in Australia and growers have relied on nematicides to manage production losses. The use of organic amendments is one method that may reduce the need for nematicides, but there is limited knowledge of the influence of organic amendments on endo-migratory nematodes, such as R. similis. Nine different amendments, namely mill mud, mill ash, biosolids, municipal waste compost, banana residue, grass hay, legume hay, molasses and calcium silicate, were applied to the three major soil types of the wet tropics region used for banana production. The nutrient content of the amendments was also determined. Banana plants were inoculated with R. similis and grown in the soil-amendment mix for 12 weeks in a glasshouse experiment. Assessments of plant growth, plant-parasitic nematodes and soil nematode community characteristics were made at the termination of the experiment. Significant suppression of plant-parasitic nematodes occurred in soils amended with legume hay, grass hay, banana residue and mill mud relative to untreated soil. These amendments were found to have the highest N and C contents. The application of banana residue and mill mud significantly increased shoot dry weight at the termination of the experiment relative to untreated soil. Furthermore, the application of banana residue, grass hay, mill mud and municipal waste compost increased the potential for suppression of plant-parasitic nematodes through antagonistic activity. The application of amendments that are high in C and N appeared to induce suppression of plant-parasitic nematodes in bananas by developing a more favourable environment for antagonistic organisms.
Abstract:
Parthenium weed (Parthenium hysterophorus L.) is an erect, branched, annual plant of the family Asteraceae. It is native to the tropical Americas but is now widely distributed throughout Africa, Asia, Oceania, and Australasia. Because of its allelopathic and toxic characteristics, parthenium weed is considered a weed of global significance. Its effects occur across agriculture (crops and pastures) and natural ecosystems, and it has impacts on human and animal health. Although integrated weed management (IWM) of parthenium weed has had some success, the weed's tolerance of, and adaptability to, temperature, precipitation, and CO2 mean it is predicted to become more vigorous under a changing climate, resulting in an altered canopy architecture. From the viewpoint of IWM, the altered canopy architecture may not only be associated with improved competitive ability and replacement, but may also alter the effectiveness of biocontrol agents and other management strategies. This paper reports a preliminary study of parthenium weed canopy architecture under three temperature regimes (day/night 22/15 °C, 27/20 °C, and 32/25 °C, on a 12 h/12 h cycle) and establishes a three-dimensional (3D) canopy model using Lindenmayer systems (L-systems). The experiment was conducted in a series of controlled-environment rooms, with parthenium weed plants grown in a heavy clay soil. A sonic digitizer system was used to record the morphology, topology, and geometry of the plants for model construction. The main findings are the determination of the phyllochron, which enables the prediction of parthenium weed growth under different temperature regimes, and that increased temperature enhances growth and enlarges the plant canopy size and structure. The developed 3D canopy model provides a tool to simulate and predict weed growth in response to temperature, and can be adjusted for studies of other climatic variables such as precipitation and CO2. Further studies are planned to investigate the effects of other climatic variables and the predicted changes in the effectiveness of the pathogenic biocontrol agent.
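As an illustration of the modelling approach described above, the following minimal Python sketch combines a phyllochron-based leaf count (accumulated thermal time divided by the phyllochron) with a deterministic L-system rewriting step. The base temperature, phyllochron value and production rules are hypothetical placeholders, not parameters from the study.

```python
# Sketch of two building blocks of a phyllochron-driven L-system canopy model:
# (1) thermal-time accumulation and a phyllochron-based leaf count, and
# (2) a deterministic L-system rewriting step that could drive canopy construction.
# Base temperature, phyllochron and production rules below are illustrative only.

def thermal_time(daily_mean_temps, t_base=10.0):
    """Accumulate growing degree-days above a base temperature."""
    return sum(max(0.0, t - t_base) for t in daily_mean_temps)

def leaf_number(daily_mean_temps, phyllochron_dd=45.0, t_base=10.0):
    """Predict main-stem leaf number as thermal time / phyllochron."""
    return int(thermal_time(daily_mean_temps, t_base) // phyllochron_dd)

def rewrite(axiom, rules, iterations):
    """Apply deterministic L-system production rules repeatedly."""
    s = axiom
    for _ in range(iterations):
        s = "".join(rules.get(ch, ch) for ch in s)
    return s

# Toy branching grammar: F = internode, [ ] = branch, + / - = turns.
rules = {"F": "F[+F]F[-F]"}
temps = [28.5, 29.0, 27.5, 30.0] * 10          # hypothetical daily mean temperatures (°C)
n_leaves = leaf_number(temps)
print(n_leaves, rewrite("F", rules, 2))        # leaf count and a two-step canopy string
```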
Abstract:
A field experiment was established in which an amendment of poultry manure and sawdust (200 t/ha) was incorporated into some plots but not others and then a permanent pasture or a sequence of biomass-producing crops was grown with and without tillage, with all biomass being returned to the soil. After 4 years, soil C levels were highest in amended plots, particularly those that had been cropped using minimum tillage, and lowest in non-amended and fallowed plots, regardless of how they had been tilled. When ginger was planted, symphylans caused severe damage to all treatments, indicating that cropping, tillage and organic matter management practices commonly used to improve soil health are not necessarily effective for all crops or soils. During the rotational phase of the experiment, the development of suppressiveness to three key pathogens of ginger was monitored using bioassays. Results for root-knot nematode (Meloidogyne javanica) indicated that for the first 2 years, amended soil was more suppressive than non-amended soil from the same cropping and tillage treatment, whereas under pasture, the amendment only enhanced suppressiveness in the first year. Suppressiveness was generally associated with higher C levels and enhanced biological activity (as measured by the rate of fluorescein diacetate (FDA) hydrolysis and numbers of free-living nematodes). Reduced tillage also enhanced suppressiveness, as gall ratings and egg counts in the second and third years were usually significantly lower in cropped soils under minimum rather than conventional tillage. Additionally, soil that was not disturbed during the process of setting up bioassays was more suppressive than soil which had been gently mixed by hand. Results of bioassays with Fusarium oxysporum f. sp. zingiberi were too inconsistent to draw firm conclusions, but the severity of fusarium yellows was generally higher in fumigated fallow soil than in other treatments, with soil management practices having little impact on disease severity. With regard to Pythium myriotylum, biological factors capable of reducing rhizome rot were present, but were not effective enough to suppress the disease under environmental conditions that were ideal for disease development.
Abstract:
This manual identifies simple, practical tests to measure soil health and outlines the use of an on-farm testing kit to perform these tests. This testing is designed so that banana producers or agricultural consultants can assess or monitor the health of the soil inexpensively and without the need for a laboratory.
Abstract:
On-going, high-profile public debate about climate change has focussed attention on how to monitor the soil organic carbon stock (Cs) of rangelands (savannas). Unfortunately, optimal sampling of the rangelands for baseline Cs - the critical first step towards efficient monitoring - has received relatively little attention to date. Moreover, in the rangelands of tropical Australia relatively little is known about how Cs is influenced by the practice of cattle grazing. To address these issues we used linear mixed models to: (i) unravel how grazing pressure (over a 12-year period) and soil type have affected Cs and the stable carbon isotope ratio of soil organic carbon (δ13C), a measure of the relative contributions of C3 and C4 vegetation to Cs; (ii) examine the spatial covariation of Cs and δ13C; and (iii) explore the amount of soil sampling required to adequately determine baseline Cs. Modelling was done in the context of the material coordinate system for the soil profile; therefore the depths reported, while conventional, are only nominal. Linear mixed models revealed that soil type and grazing pressure interacted to influence Cs to a depth of 0.3 m in the profile. At a depth of 0.5 m there was no effect of grazing on Cs, but the soil type effect on Cs was significant. Soil type influenced δ13C to a soil depth of 0.5 m, but there was no effect of grazing at any depth examined. The linear mixed model also revealed a strong negative correlation of Cs with δ13C, particularly to a depth of 0.1 m in the soil profile. This suggested that increased Cs at the study site was associated with increased input of C from C3 trees and shrubs relative to the C4 perennial grasses; as the latter form the bulk of the cattle diet, we contend that C sequestration may be negatively correlated with forage production. Our baseline Cs sampling recommendation for cattle-grazing properties of the tropical rangelands of Australia is to: (i) divide the property into units of apparently uniform soil type and grazing management; and (ii) use stratified simple random sampling to spread at least 25 soil sampling locations about each unit, with at least two samples collected per stratum. This will be adequate to accurately estimate baseline mean Cs to within 20% of the true mean, to a nominal depth of 0.3 m in the profile.
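As a hedged, back-of-the-envelope sketch of the sampling-intensity question in point (iii): under simple random sampling, the number of samples needed to estimate a mean to within ±20% of its true value at roughly 95% confidence depends on the coefficient of variation (CV) of Cs. The CV values below are hypothetical and the calculation only indicates the kind of reasoning involved; it is not the study's geostatistical analysis.

```python
# Samples needed so that the half-width of an approximate 95% confidence
# interval equals rel_error * mean, for a given coefficient of variation.
import math

def n_required(cv, rel_error=0.20, z=1.96):
    """n such that z * (cv / sqrt(n)) <= rel_error."""
    return math.ceil((z * cv / rel_error) ** 2)

for cv in (0.3, 0.4, 0.5):          # hypothetical CVs of soil C stocks within a unit
    print(f"CV={cv:.0%}: n >= {n_required(cv)}")
```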
Abstract:
Macfadyena unguis-cati (L.) Gentry (Bignoniaceae) is a major environmental weed in coastal Queensland, Australia. There is a lack of quantitative data on its leaf chemistry and its impact on soil properties. Soils from infested vs uninfested areas, and leaves of M. unguis-cati and three co-occurring vine species (one exotic, two native), were collected at six sites (riparian and non-riparian) in south-eastern Queensland. Effects of invasion status, species, site and habitat type were examined using univariate and multivariate analyses. Habitat type had a greater effect on soil nutrients than on leaf chemistry. The invasion effect of M. unguis-cati on soil chemistry was more pronounced in non-riparian than in riparian habitat. Significantly higher values were obtained in M. unguis-cati-infested (vs uninfested) soils for ~50% of traits. Leaf ion concentrations differed significantly between exotic and native vines. The observed higher leaf-nutrient load (especially nitrogen, phosphorus and potassium) in exotic plants aligns with the preference of invasive plant species for disturbed habitats with higher nutrient input. The higher load of trace elements (aluminium, boron, cadmium and iron) in its leaves suggests that the cycling of heavy-metal ions, many of which are potentially toxic at excess levels, could be accelerated in soils of M. unguis-cati-invaded landscapes. Although inferences from the present study are based on correlative data, the consistency of the patterns across many sites suggests that M. unguis-cati may improve soil fertility and influence nutrient cycling, perhaps through legacy effects of its own litter input.
Abstract:
The longevity of seed in the soil is a key determinant of the cost and length of weed eradication programs. Soil seed bank information and ongoing research have provided input into the planning and reporting of two nationally cost-shared weed eradication programs based in tropical north Queensland. These eradication programs are targeting serious weeds such as Chromolaena odorata, Mikania micrantha, Miconia calvescens, Clidemia hirta and Limnocharis flava. Various methods are available for estimating soil seed persistence. Field methods to estimate total and germinable soil seed densities include seed packet burial trials, extracting seed from field soil samples, germinating seed in field soil samples and observations from native-range seed bank studies. Interrogating field control records can also indicate the length of the control and monitoring periods needed to exhaust the seed bank. Recently, laboratory tests that rapidly age seed have provided an additional indicator of relative seed persistence. Each method has its advantages, drawbacks and logistical constraints.
Abstract:
Bellyache bush (Jatropha gossypifolia L.) is an invasive shrub that adversely impacts agricultural and natural systems of northern Australia. While several techniques are available to control bellyache bush, depletion of soil seed banks is central to its management. A 10-year study determined the persistence of intact and ant-discarded bellyache bush seeds buried in shade-cloth packets at six depths (ranging from 0 to 40 cm) under both natural rainfall and rainfall-excluded conditions. A second study monitored changes in seedling emergence over time, to provide an indication of the natural rate of seed bank depletion at two sites (rocky and heavy clay) following the physical removal of all bellyache bush plants. Persistence of seed in the burial trial varied depending on seed type, rainfall conditions and burial depth. No viable seeds of bellyache bush remained after 72 months, irrespective of seed type, under natural rainfall conditions. When rainfall was excluded, seeds persisted for much longer, with a small proportion (0.4%) of ant-discarded seeds still viable after 120 months. Seed persistence was prolonged (> 96 months to decline to < 1% viability) at all burial depths under rainfall-excluded conditions. In contrast, under natural rainfall, surface-located seeds took twice as long (70 months) to decline to 1% viability compared with buried seeds (35 months). No seedling emergence was observed after 58 months and 36 months at the rocky and heavy clay soil sites, respectively. These results suggest that the required duration of control programs for bellyache bush may vary due to the effect of biotic and abiotic factors on the persistence of soil seed banks.
Abstract:
Statistical studies of rainfed maize yields in the United States(1) and elsewhere(2) have indicated two clear features: a strong negative yield response to the accumulation of temperatures above 30 °C (extreme degree days, EDD), and a relatively weak response to seasonal rainfall. Here we show that the process-based Agricultural Production Systems Simulator (APSIM) is able to reproduce both of these relationships in the Midwestern United States and provide insight into the underlying mechanisms. The predominant effects of EDD in APSIM are associated with increased vapour pressure deficit, which contributes to water stress in two ways: by increasing demand for soil water to sustain a given rate of carbon assimilation, and by reducing the future supply of soil water by raising transpiration rates. APSIM computes daily water stress as the ratio of water supply to demand, and during the critical month of July this ratio is three times more responsive to 2 °C of warming than to a 20% reduction in precipitation. The results suggest a relatively minor role for direct heat stress on reproductive organs at present temperatures in this region. Effects of elevated CO2 on transpiration efficiency should reduce yield sensitivity to EDD in the coming decades, but by at most 25%.
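To make the two quantities concrete, here is a minimal Python sketch of extreme degree days accumulated above the 30 °C threshold and of a supply/demand water-stress ratio. The simple daily-maximum formulation of EDD and the example numbers are illustrative assumptions, not APSIM's internal calculations.

```python
# Extreme degree days (EDD) above a 30 °C threshold, and a daily water-stress
# index expressed as the ratio of soil water supply to transpiration demand
# (capped at 1, i.e. no stress). Numbers are hypothetical.
T_HIGH = 30.0  # °C threshold for extreme degree days

def extreme_degree_days(daily_max_temps):
    """Accumulate degrees above the 30 °C threshold over the season."""
    return sum(max(0.0, t - T_HIGH) for t in daily_max_temps)

def water_stress(supply_mm, demand_mm):
    """Supply/demand ratio; values < 1 indicate water-limited growth."""
    return min(1.0, supply_mm / demand_mm) if demand_mm > 0 else 1.0

july_tmax = [29.0, 31.5, 34.0, 36.2, 33.1]        # hypothetical daily maxima (°C)
print("EDD:", extreme_degree_days(july_tmax))     # degree-days above 30 °C
print("stress:", water_stress(3.2, 5.0))          # e.g. 3.2 mm supply vs 5.0 mm demand
```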
Abstract:
TRFLP (terminal restriction fragment length polymorphism) was used to assess whether management practices that improved disease suppression and/or yield in a 4-year ginger field trial were related to changes in soil microbial community structure. Bacterial and fungal community profiles were defined by the presence and abundance of terminal restriction fragments (TRFs), where each TRF represents one or more species. Results indicated that inclusion of an organic amendment and minimum tillage increased the relative diversity of dominant fungal populations in a system-dependent way. Inclusion of an organic amendment increased bacterial species richness in the pasture treatment. Redundancy analysis showed shifts in microbial community structure associated with different management practices, and treatments grouped according to TRF abundance in relation to yield and disease incidence. ANOVA also indicated that the abundance of certain TRFs was significantly affected by farming system management practices, and a number of these TRFs were also correlated with yield or disease suppression. Further analyses are required to determine whether the identified TRFs can be used as general or soil-type-specific bio-indicators of productivity (increased and decreased) and of Pythium myriotylum suppressiveness.
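As a simple illustration of the community metrics referred to above (richness and diversity derived from TRF presence and abundance), the following sketch computes TRF richness and Shannon diversity from a hypothetical T-RFLP profile; it is not the analysis pipeline used in the study.

```python
# Richness and Shannon diversity from a hypothetical T-RFLP profile, where
# each terminal restriction fragment (TRF) has a relative abundance.
import math

profile = {"TRF_87": 0.40, "TRF_142": 0.25, "TRF_231": 0.20, "TRF_310": 0.15}

richness = sum(1 for p in profile.values() if p > 0)                 # TRFs detected
shannon = -sum(p * math.log(p) for p in profile.values() if p > 0)   # Shannon H'

print(f"richness = {richness}, Shannon H' = {shannon:.2f}")
```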
Abstract:
Fire is an important driver of nutrient cycling in savannas. Here, we determined the impact of fire frequency on total and soluble soil nitrogen (N) pools in tropical savanna. The study sites consisted of 1-ha experimental plots near Darwin, Australia, which remained unburnt for at least 14 years or were burnt at 1-, 2- or 5-year intervals over the past 6 years. Soil was analysed from patches underneath tree canopies and in inter-canopy patches at 1, 12, 28, 55 and 152 days after fire. Patch type had a significant effect on all soil N pools, with greater concentrations of total and soluble (nitrate, ammonium, amino acids) N under tree canopies than in inter-canopy patches. The time since the last fire had no significant effect on N pools. Fire frequency similarly did not affect total soil N, but it did influence soluble soil N. Soil amino acids were most prominent in burnt savanna, ammonium was highest in infrequently burnt (5-year interval) savanna and nitrate was highest in unburnt savanna. We suggest that the main effect of fire on soil N relations occurs indirectly through altered tree-grass dynamics. Previous studies have shown that high fire frequencies reduce tree cover by lowering recruitment and increasing mortality. Our findings suggest that these changes in tree cover could result in a 30% reduction in total soil N and 10-60% reductions in soluble N pools. This finding is consistent with studies from savannas globally, providing further evidence for a general theory of patchiness as a key driver of nutrient cycling in the savanna biome.
Abstract:
This study aimed to unravel the effects of climate, topography, soil, and grazing management on soil organic carbon (SOC) stocks in the grazing lands of north-eastern Australia. We sampled for SOC stocks at 98 sites from 18 grazing properties across Queensland, Australia. These samples covered four nominal grazing management classes (Continuous, Rotational, Cell, and Exclosure), eight broad soil types, and a strong tropical to subtropical climatic gradient. Temperature and vapour-pressure deficit explained >80% of the variability of SOC stocks at cumulative equivalent mineral masses nominally representing 0-0.1 and 0-0.3 m depths. Once detrended of climatic effects, SOC stocks were strongly influenced by total standing dry matter, soil type, and the dominant grass species. At 0-0.3 m depth only, there was a weak negative association between stocking rate and climate-detrended SOC stocks, and Cell grazing was associated with smaller SOC stocks than Continuous grazing and Exclosure. In future, collection of quantitative information on stocking intensity, frequency, and duration may help to improve understanding of the effect of grazing management on SOC stocks. Further exploration of the links between grazing management and above- and below-ground biomass, perhaps inferred through remote sensing and/or simulation modelling, may assist large-area mapping of SOC stocks in northern Australia. © CSIRO 2013.
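The "detrend for climate, then examine management" workflow described above can be sketched as follows. The synthetic data, coefficients and ordinary least-squares fit are stand-ins (the study itself used linear mixed models), so this is illustrative only.

```python
# Regress SOC stock on temperature and vapour-pressure deficit, then examine
# the climate-detrended residuals by grazing class. All values are synthetic.
import numpy as np

rng = np.random.default_rng(0)
n = 98
temp = rng.uniform(18, 27, n)            # mean annual temperature (°C), hypothetical
vpd = rng.uniform(0.8, 2.2, n)           # vapour-pressure deficit (kPa), hypothetical
grazing = rng.choice(["Continuous", "Rotational", "Cell", "Exclosure"], n)
soc = 60 - 1.5 * temp - 8.0 * vpd + rng.normal(0, 3, n)   # synthetic SOC stock (t C/ha)

X = np.column_stack([np.ones(n), temp, vpd])
beta, *_ = np.linalg.lstsq(X, soc, rcond=None)   # climate-only regression
residuals = soc - X @ beta                        # climate-detrended SOC

for cls in ("Continuous", "Rotational", "Cell", "Exclosure"):
    print(cls, round(residuals[grazing == cls].mean(), 2))
```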
Abstract:
More than 1200 wheat and 120 barley experiments conducted in Australia to examine yield responses to applied nitrogen (N) fertiliser are contained in a national database of field crops nutrient research (BFDC National Database). The yield responses are accompanied by various pre-plant soil test data to quantify plant-available N and other indicators of soil fertility status or mineralisable N. A web application (BFDC Interrogator), developed to access the database, enables construction of calibrations between relative crop yield (RY = (Y0/Ymax) × 100) and N soil test value. In this paper we report the critical soil test values for 90% RY (CV90) and the associated critical ranges (CR90, defined as the 70% confidence interval around that CV90) derived from analysis of various subsets of these winter cereal experiments. Experimental programs were conducted throughout Australia’s main grain-production regions in different eras, starting from the 1960s in Queensland through to Victoria during the 2000s. Improved management practices adopted during the period were reflected in increasing potential yields with research era, increasing from an average Ymax of 2.2 t/ha in Queensland in the 1960s and 1970s, to 3.4 t/ha in South Australia (SA) in the 1980s, to 4.3 t/ha in New South Wales (NSW) in the 1990s, and 4.2 t/ha in Victoria in the 2000s. Various sampling depths (0.1–1.2 m) and methods of quantifying available N (nitrate-N or mineral-N) from pre-planting soil samples were used and provided useful guides to the need for supplementary N. The most regionally consistent relationships were established using nitrate-N (kg/ha) in the top 0.6 m of the soil profile, with regional and seasonal variation in CV90 largely accounted for through impacts on experimental Ymax. The CV90 for nitrate-N within the top 0.6 m of the soil profile for wheat crops increased from 36 to 110 kg nitrate-N/ha as Ymax increased over the range 1 to >5 t/ha. Apparent variation in CV90 with seasonal moisture availability was entirely consistent with impacts on experimental Ymax. Further analyses of wheat trials with available grain protein (~45% of all experiments) established that grain yield and not grain N content was the major driver of crop N demand and CV90. Subsets of data explored the impact of crop management practices such as crop rotation or fallow length on both pre-planting profile mineral-N and CV90. Analyses showed that while management practices influenced profile mineral-N at planting and the likelihood and size of yield response to applied N fertiliser, they had no significant impact on CV90. A level of risk is involved with the use of pre-plant testing to determine the need for supplementary N application in all Australian dryland systems. In southern and western regions, where crop performance is based almost entirely on in-crop rainfall, this risk is offset by the management opportunity to split N applications during crop growth in response to changing crop yield potential. In northern cropping systems, where stored soil moisture at sowing is indicative of minimum yield potential, erratic winter rainfall increases uncertainty about actual yield potential as well as reducing the opportunity for effective in-season applications.
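For illustration, a minimal sketch of how a critical value like CV90 can be read off a fitted yield-response calibration. The data points and the Mitscherlich-type functional form are hypothetical stand-ins, not the BFDC Interrogator's actual fitting procedure.

```python
# Fit RY = 100 * (1 - exp(-c * (N - N0))) between relative yield and
# pre-plant nitrate-N (0-0.6 m), then invert the curve at RY = 90 to get CV90.
# The data are hypothetical.
import numpy as np
from scipy.optimize import curve_fit

nitrate_n = np.array([15, 25, 40, 60, 80, 110, 150], dtype=float)  # kg nitrate-N/ha
rel_yield = np.array([45, 60, 76, 86, 92, 96, 99], dtype=float)    # % of Ymax

def mitscherlich(n, c, n0):
    return 100.0 * (1.0 - np.exp(-c * (n - n0)))

(c, n0), _ = curve_fit(mitscherlich, nitrate_n, rel_yield, p0=[0.02, 0.0])

cv90 = n0 - np.log(1.0 - 0.90) / c   # nitrate-N at which fitted RY reaches 90%
print(f"CV90 ≈ {cv90:.0f} kg nitrate-N/ha")
```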