89 results for Soil disturbance
in eResearch Archive - Queensland Department of Agriculture
Abstract:
Negative potassium (K) balances in all broadacre grain cropping systems in northern Australia are resulting in a decline in the plant-available reserves of K and necessitating a closer examination of strategies to detect and respond to developing K deficiency in clay soils. Grain growers on the Red Ferrosol soils have increasingly encountered K deficiency over the last 10 years due to lower available K reserves in these soils in their native condition. However, the problem is now increasingly evident on the medium-heavy clay soils (Black and Grey Vertosols) and is made more complicated by the widespread adoption of direct drill cropping systems and the resulting strong stratification of available K reserves in the top 0.05-0.1 m of the soil profile. This paper reports glasshouse studies examining the fate of applied K fertiliser in key cropping soils of the inland Burnett region of south-east Queensland, and uses the resultant understanding of K dynamics to interpret results of field trials assessing the effectiveness of K application strategies in terms of K availability to crop plants. At similar concentrations of exchangeable K (K-exch), soil solution K concentrations and activity of K in the soil solution (AR(K)) varied 6-7-fold between soil types. When K-exch arising from different rates of fertiliser application was expressed as a percentage of the effective cation exchange capacity (i.e. K saturation), there was evidence of greater selective adsorption of K on the exchange complex of Red Ferrosols than of Black and Grey Vertosols or Brown Dermosols. Both soil solution K and AR(K) were much less responsive to increasing K-exch in the Black Vertosols; this is indicative of these soils having a high K buffer capacity (KBC). These contrasting properties have implications for the rate of diffusive supply of K to plant roots and the likely impact of K application strategies (banding v. broadcast and incorporation) on plant K uptake.
Field studies investigating K application strategies (banding v. broadcasting) and the interaction with the degree of soil disturbance/mixing of different soil types are discussed in relation to K dynamics derived from glasshouse studies. Greater propensity to accumulate luxury K in crop biomass was observed in a Brown Ferrosol with a KBC lower than that of a Black Vertosol, consistent with more efficient diffusive supply to plant roots in the Ferrosol. This luxury K uptake, when combined with crops exhibiting low proportional removal of K in the harvested product (i.e. low K harvest index coarse grains and winter cereals) and residue retention, can lead to rapid re-development of stratified K profiles. There was clear evidence that some incorporation of K fertiliser into soil was required to facilitate root access and crop uptake, although there was no evidence of a need to incorporate K fertiliser any deeper than achieved by conventional disc tillage (i.e. 0.1-0.15 m). Recovery of fertiliser K applied in deep (0.25-0.3 m) bands in combination with N and P to facilitate root proliferation was quite poor in Red Ferrosols and Grey or Black Vertosols with moderate effective cation exchange capacity (ECEC, 25-35 cmol(+)/kg), was reasonable but not enough to overcome K deficiency in a Brown Dermosol (ECEC 11 cmol(+)/kg), but was quite good on a Black Vertosol (ECEC 50-60 cmol(+)/kg). Collectively, results suggest that frequent small applications of K fertiliser, preferably with some soil mixing, are an effective fertiliser application strategy on lighter clay soils with low KBC and an effective diffusive supply mechanism. Alternatively, concentrated K bands and enhanced root proliferation around them may be a more effective strategy in Vertosol soils with high KBC and limited diffusive supply. Further studies to assess this hypothesis are needed.
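The K saturation measure used above is a simple ratio: exchangeable K expressed as a percentage of the effective cation exchange capacity. A minimal sketch in Python; the function name and example values are hypothetical, not taken from the study:

```python
def k_saturation(k_exch, ecec):
    """Exchangeable K as a percentage of the effective cation exchange
    capacity (ECEC); both inputs in cmol(+)/kg."""
    return 100.0 * k_exch / ecec

# Illustrative values only: 0.6 cmol(+)/kg K-exch on a soil with
# ECEC 50 cmol(+)/kg gives a K saturation of 1.2%.
print(k_saturation(0.6, 50.0))
```

The same ratio lets contrasting soils (e.g. a Brown Dermosol at ECEC 11 versus a Black Vertosol at ECEC 50-60 cmol(+)/kg) be compared on a common scale despite very different absolute K-exch values.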
Abstract:
Herbicide runoff from cropping fields has been identified as a threat to the Great Barrier Reef ecosystem. A field investigation was carried out to monitor the changes in runoff water quality resulting from four different sugarcane cropping systems that included different herbicides and contrasting tillage and trash management practices. These were (i) Conventional - tillage of beds and inter-rows, with residual herbicides used; (ii) Improved - only the beds were tilled (zonal), with reduced residual herbicide use; (iii) Aspirational - minimum tillage (one pass of a single tine ripper before planting) with trash mulch, no residual herbicides and a legume intercrop after cane establishment; and (iv) New Farming System (NFS) - minimum tillage as in the Aspirational practice, with a grain legume rotation and a combination of residual and knockdown herbicides. Results suggest soil and trash management had a larger effect on herbicide losses in runoff than the physico-chemical properties of the herbicides. Improved practices, with 30% lower atrazine application rates than used in conventional systems, reduced runoff volumes by 40% and atrazine loss by 62%. There was a 2-fold variation in atrazine loads and a >10-fold variation in metribuzin loads in runoff water between reduced tillage systems differing in soil disturbance and surface residue cover from the previous rotation crops, despite the same herbicide application rates. The elevated risk of offsite losses from herbicides was illustrated by the high concentrations of diuron (14 µg/L) recorded in runoff that occurred >2.5 months after herbicide application in a 1st ratoon crop. A cropping system employing less persistent non-selective herbicides and an inter-row soybean mulch resulted in no residual herbicide contamination in runoff water, but recorded a 12.3% lower yield than the Conventional practice.
These findings reveal a trade-off between achieving good water quality with minimal herbicide contamination and maintaining farm profitability with good weed control.
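The load comparisons above rest on simple arithmetic: an event load is the flow-weighted concentration multiplied by the runoff volume. A minimal sketch, with function names and numbers that are illustrative assumptions rather than the study's data:

```python
def runoff_load_g_per_ha(conc_ug_per_l, runoff_mm):
    """Herbicide load (g/ha) from a flow-weighted concentration (ug/L)
    and runoff depth (mm); 1 mm of runoff over 1 ha equals 10,000 L."""
    litres_per_ha = runoff_mm * 10_000.0
    return conc_ug_per_l * litres_per_ha / 1e6  # ug -> g

def percent_reduction(conventional, improved):
    """Relative reduction of an improved practice versus conventional."""
    return 100.0 * (conventional - improved) / conventional

# Illustrative only: 14 ug/L diuron carried in 20 mm of runoff.
print(runoff_load_g_per_ha(14.0, 20.0))
```

This is why reduced runoff volume and reduced concentration compound: the 40% lower runoff and lower atrazine rates together produced the 62% load reduction reported above.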
Abstract:
Effects of fire on biogeochemical cycling in terrestrial ecosystems are widely acknowledged, but few studies have focused on the bacterial community under the disturbance of long-term frequent prescribed fire. In this study, three treatments (burning every two years (B2), burning every four years (B4) and no burning (B0)) were applied for 38 years in an Australian wet sclerophyll forest. Results showed that bacterial alpha diversity (i.e. bacterial OTU richness) in the top soil (0-10 cm) was significantly higher in the B2 treatment than in the B0 and B4 treatments. Non-metric multidimensional scaling (NMDS) of the bacterial community showed clear separation of soil bacterial community structure among the different fire frequency regimes and between depths. Fire frequency did not have a substantial effect on bacterial composition at the phylum level or on bacterial 16S rRNA gene abundance. Soil pH and C:N ratio were the major drivers of bacterial community structure in the most frequent fire treatment (B2), while other factors (EC, DOC, DON, MBC, NH4+, TC and TN) were significant in the less frequent burning and no burning treatments (B4 and B0). This study suggests that more frequent burning had a dramatic impact on bacterial diversity but not on abundance.
Abstract:
Soil biogeochemical cycles are largely mediated by microorganisms, and fire significantly modifies these cycles, mainly by altering the microbial community and substrate availability. The majority of studies on fire effects have focused on the surface soil; therefore, our understanding of the vertical distribution of microbial communities and the impacts of fire on nitrogen (N) dynamics in the soil profile is limited. Here, we examined the changes in soil denitrification capacity (DNC) and denitrifying communities with depth under different burning regimes, and their interaction with environmental gradients along the soil profile. Results showed that soil depth had a more pronounced impact than the burning treatment on the bacterial community size. The abundance of 16S rRNA and denitrification genes (narG, nirK, and nirS) declined exponentially with soil depth. Surprisingly, the nosZ-harboring denitrifiers were enriched in the deeper soil layers, likely indicating that the nosZ-harboring denitrifiers could better adapt to the stress conditions (i.e., oxygen deficiency, nutrient limitation, etc.) than other denitrifiers. Soil nutrients, including dissolved organic carbon (DOC), total soluble N (TSN), ammonium (NH4+), and nitrate (NO3−), declined significantly with soil depth, which probably contributed to the vertical distribution of denitrifying communities. Soil DNC decreased significantly with soil depth and was negligible below 20 cm. These findings provide new insights into niche separation of the N-cycling functional guilds along the soil profile under a varied fire disturbance regime.
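The exponential decline of gene abundance with depth reported above is typically summarised by fitting a log-linear model, N(z) = N0 * exp(-k z). The sketch below uses invented copy numbers (not the study's data) and estimates the decay rate with an ordinary least-squares fit on the log-transformed abundances:

```python
import numpy as np

# Invented gene copy numbers per g soil at increasing depth (cm),
# for illustration only; a log-linear fit recovers the decay rate k.
depth = np.array([5.0, 15.0, 25.0, 35.0, 45.0])
copies = np.array([1.0e9, 4.0e8, 1.5e8, 6.0e7, 2.5e7])

slope, intercept = np.polyfit(depth, np.log(copies), 1)
k = -slope               # per-cm exponential decay rate
n0 = np.exp(intercept)   # extrapolated surface abundance
print(round(k, 3))
```

A fitted k of roughly 0.09 per cm, as in this invented example, would mean abundance halves about every 7-8 cm of depth.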
Abstract:
Seed production and soil seed banks of H. contortus were studied in a subset of treatments within an extensive grazing study conducted in H. contortus pasture in southern Queensland between 1990 and 1996. Seed production of H. contortus in autumn ranged from 260 to 1800 seeds/m2, with much of this variation due to differences in rainfall between years. Seed production was generally higher in the silver-leaved ironbark than in the narrow-leaved ironbark land class and was also influenced by a consistent stocking rate × pasture type interaction. Inflorescence density was the main factor contributing to the variable seed production and was related to the rainfall received during February. The number of seeds per inflorescence was unaffected by seasonal rainfall, landscape position, stocking rate or legume oversowing. Seed viability was related to the rainfall received during March. Soil seed banks in spring varied from 130 to 520 seeds/m2 between 1990 and 1995, with generally more seed present in the silver-leaved ironbark than in the narrow-leaved ironbark land class. There were poor relationships between viable seed production and the size of the soil seed bank, and between the size of the soil seed bank and seedling recruitment. This study indicates that H. contortus has the potential to produce relatively large amounts of seed and showed that the seasonal pattern of rainfall plays a major role in achieving this potential.
Abstract:
Recolonisation and succession in a multi-species tropical seagrass meadow was examined by creating gaps (50×50 cm) in the meadow and manipulating the supply of sexual and asexual propagules. Measurements of leaf shoot density and estimates of above-ground biomass were conducted monthly to measure recovery of gaps between September 1995 and November 1997. Measurements of the seeds stored in the sediment (seed bank) and horizontal rhizome growth of colonising species were also conducted to determine their role in the recovery process. Asexual colonisation through horizontal rhizome growth from the surrounding meadow was the main mechanism for colonisation of gaps created in the meadow. The seed bank played no role in recolonisation of cleared plots. Total shoot density and above-ground biomass (all species pooled) of cleared plots recovered asexually to the level of the undisturbed controls in 10 and 7 months, respectively. There was some sexual recruitment into cleared plots where asexual colonisation was prevented but seagrass abundance (shoot density and biomass) did not reach the level of unmanipulated controls. Seagrass species did not appear to form seed banks despite some species being capable of producing long-lived seeds. The species composition of cleared plots remained different to the undisturbed controls throughout the 26-month experiment. Syringodium isoetifolium was a rapid asexual coloniser of disturbed plots and remained at higher abundances than in the control treatments for the duration of the study. S. isoetifolium had the fastest horizontal rhizome growth of species asexually colonising cleared plots (6.9 mm day−1). Halophila ovalis was the most successful sexual coloniser but was displaced by asexually colonising species. H. ovalis was the only species observed to produce fruits during the study. Small disturbances in the meadow led to long-term (>2 years) changes in community composition. 
This study demonstrated that succession in tropical seagrass communities was not a deterministic process. Variations in recovery observed for different tropical seagrass communities highlighted the importance of understanding the life history characteristics of species within individual communities to effectively predict their response to disturbance. A reproductive strategy involving clonal growth and production of long-lived, locally dispersed seeds is suggested, which may provide an evolutionary advantage to plants growing in tropical environments subject to temporally unpredictable major disturbances such as cyclones.
Abstract:
Attention is directed at land application of piggery effluent (containing urine, faeces, water, and wasted feed) as a potential source of water resource contamination with phosphorus (P). This paper summarises P-related properties of soil from 0-0.05 m depth at 11 piggery effluent application sites, in order to explore the impact that effluent application has had on the potential for run-off transport of P. The sites investigated were situated on Alfisol, Mollisol, Vertisol, and Spodosol soils in areas that received effluent for 1.5-30 years (estimated effluent-P applications of 100-310000 kg P/ha in total). Total (PT), bicarbonate extractable (PB), and soluble P forms were determined for the soil (0-0.05 m) at paired effluent and no-effluent sites, as well as texture, oxalate-extractable Fe and Al, organic carbon, and pH. All forms of soil P at 0-0.05 m depth increased with effluent application (PB at effluent sites was 1.7-15 times that at no-effluent sites) at 10 of the 11 sites. Increases in PB were strongly related to net P applications (regression analysis of log values for 7 sites with complete data sets: 82.6% of variance accounted for, P < 0.01). Effluent irrigation tended to increase the proportion of soil PT in dilute CaCl2-extractable forms (PTC: effluent average 2.0%; no-effluent average 0.6%). The proportion of PTC in non-molybdate reactive forms (centrifuged supernatant) decreased (no-effluent average, 46.4%; effluent average, 13.7%). Anaerobic lagoon effluent did not reliably acidify soil, since no consistent relationship was observed for pH with effluent application. Soil organic carbon was increased in most of the effluent areas relative to the no-effluent areas. The four effluent areas where organic carbon was reduced had undergone intensive cultivation and cropping. Current effluent management at many of the piggeries failed to maximise the potential for waste P recapture.
Ten of the case-study effluent application areas have received effluent-P in excess of crop uptake. While this may not represent a significant risk of leaching where sorption retains P, it has increased the risk of transport of P by run-off. Where such sites are close to surface water, run-off P loads should be managed.
Abstract:
Soil nitrogen (N) supply in the Vertosols of southern Queensland, Australia has steadily declined as a result of long-term cereal cropping without N fertiliser application or rotations with legumes. Nitrogen-fixing legumes such as lucerne may enhance soil N supply and therefore could be used in lucerne-wheat rotations. However, lucerne leys in this subtropical environment can create a soil moisture deficit, which may persist for a number of seasons. Therefore, we evaluated the effect of varying the duration of a lucerne ley (for up to 4 years) on soil N increase, N supply to wheat, soil water changes, wheat yields and wheat protein on a fertility-depleted Vertosol in a field experiment between 1989 and 1996 at Warra (26°47'S, 150°53'E), southern Queensland. The experiment consisted of a wheat-wheat rotation, and 8 treatments of lucerne leys starting in 1989 (phase 1) or 1990 (phase 2) for 1, 2, 3 or 4 years duration, followed by wheat cropping. Lucerne DM yield and N yield increased with increasing duration of lucerne leys. Soil N increased over time following 2 years of lucerne but there was no further significant increase after 3 or 4 years of lucerne ley. Soil nitrate concentrations increased significantly with all lucerne leys and moved progressively downward in the soil profile from 1992 to 1995. Soil water, especially at 0.9-1.2 m depth, remained significantly lower for the next 3 years after the termination of the 4 year lucerne ley than under continuous wheat. No significant increase in wheat yields was observed from 1992 to 1995, irrespective of the lucerne ley. However, wheat grain protein concentrations were significantly higher under lucerne-wheat than under wheat-wheat rotations for 3-5 years. The lucerne yield and soil water and nitrate-N concentrations were satisfactorily simulated with the APSIM model.
Although significant N accretion occurred in the soil following lucerne leys, recharge of the drier soil profile left by long-duration lucerne took 3 years in drier seasons. Consequently, 3- and 4-year lucerne-wheat rotations resulted in more variable wheat yields than wheat-wheat rotations in this region. The remaining challenge in using lucerne-wheat rotations is balancing the N accretion benefits with plant-available water deficits, which are most likely to occur in the highly variable rainfall conditions of this region.
Abstract:
The size of the soil microbial biomass carbon (SMBC) has been proposed as a sensitive indicator for measuring the adverse effects of contaminants on the soil microbial community. In this study of Australian agricultural systems, we demonstrated that field variability of SMBC measured using the fumigation-extraction procedure limited its use as a robust ecotoxicological endpoint. The SMBC varied up to 4-fold across control samples collected from a single field site, due to small-scale spatial heterogeneity in the soil physicochemical environment. Power analysis revealed that large numbers of replicates (3-93) were required to identify 20% or 50% decreases in the size of the SMBC of contaminated soil samples relative to their uncontaminated control samples at the 0.05 level of statistical significance. We question the value of the routine measurement of SMBC as an ecotoxicological endpoint at the field scale, and suggest more robust and predictive microbiological indicators.
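The power analysis described above can be sketched with a standard two-sample normal approximation, using only the Python standard library. The coefficient-of-variation values below are illustrative assumptions, not the study's estimates:

```python
import math
from statistics import NormalDist

def replicates_needed(cv, fractional_decrease, alpha=0.05, power=0.80):
    """Replicates per group needed to detect a given fractional decrease
    in mean SMBC, given between-replicate variability cv = SD/mean
    (two-sided, two-sample z approximation)."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)
    z_power = NormalDist().inv_cdf(power)
    n = 2 * ((z_alpha + z_power) * cv / fractional_decrease) ** 2
    return math.ceil(n)

# Illustrative: with cv = 0.5, detecting a 50% decrease needs far fewer
# replicates than detecting a 20% decrease.
print(replicates_needed(0.5, 0.50), replicates_needed(0.5, 0.20))
```

The quadratic dependence on cv/effect size is exactly why the 4-fold field variability reported above pushes replicate numbers so high for small (20%) decreases.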
Abstract:
Rainfall simulation experiments were carried out to measure runoff and soil water fluxes of suspended solids, total nitrogen, total phosphorus, dissolved organic carbon and total iron from sites in Pinus plantations on the coastal lowlands of south-eastern Queensland subjected to various operations (treatments). The operations investigated were cultivated and nil-cultivated site preparation, fertilised site preparation, clearfall harvesting and prescribed burning; these treatments were compared with an 8-y-old established plantation. Flow-weighted mean concentrations of total nitrogen and total phosphorus in surface runoff from the cultivated and nil-cultivated site-preparation, clearfall harvest, prescribed burning and 8-y-old established plantation treatments were very similar. However, both the soil water and the runoff from the fertilised site preparation treatment contained more nitrogen (N) and phosphorus (P) than the other treatments - with 3.10 mg N L-1 and 4.32 mg P L-1 (4 and 20 times more) in the runoff. Dissolved organic carbon concentrations in runoff from the nil-cultivated site-preparation and prescribed burn treatments were elevated. Iron concentrations were highest in runoff from the nil-cultivated site-preparation and 8-y-old established plantation treatments. Concentrations of suspended solids in runoff were higher from the cultivated site preparation and prescribed burn treatments, reflecting the greater disturbance of surface soil at these sites. The concentrations of all analytes were highest in initial runoff from plots, and generally decreased with time. Total nitrogen (mean 7.28, range 0.11-13.27 mg L-1) and total phosphorus (mean 11.60, range 0.06-83.99 mg L-1) concentrations in soil water were between 2 and 10 times greater than in surface runoff, which highlights the potential for nutrient fluxes in interflow (i.e. in the soil above the water table) through the general plantation area.
Implications in regard to forest management are discussed, along with results of larger catchment-scale studies.
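Flow-weighted mean concentrations of the kind reported above are computed by weighting each sampled concentration by the flow over its interval, rather than taking a simple average. A minimal sketch with invented numbers:

```python
def flow_weighted_mean(concs_mg_per_l, flows_l):
    """Flow-weighted mean concentration: sum(c_i * q_i) / sum(q_i),
    over paired samples of concentration (mg/L) and flow volume (L)."""
    total_flow = sum(flows_l)
    return sum(c * q for c, q in zip(concs_mg_per_l, flows_l)) / total_flow

# Illustrative only: early runoff is concentrated, later runoff is dilute,
# so the flow-weighted mean sits below the simple average of 6.0 mg/L.
print(flow_weighted_mean([10.0, 2.0], [100.0, 400.0]))
```

This weighting matters for the pattern reported above, where analyte concentrations were highest in initial runoff and decreased with time.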
Abstract:
In south-eastern Queensland, Australia, sorghum planted in early spring usually escapes sorghum midge, Stenodiplosis sorghicola, attack. Experiments were conducted to better understand the role of winter diapause in the population dynamics of this pest. Emergence patterns of adult midge from diapausing larvae on the soil surface and at various depths were investigated during spring to autumn of 1987/88–1989/90. From 1987/88 to 1989/90, 89%, 65% and 98% of adult emergence, respectively, occurred during November and December. Adult emergence from larvae diapausing on the soil surface was severely reduced due to high mortality attributed to surface soil temperatures in excess of 40°C, with much of this mortality occurring between mid-September and mid-October. Emergence of adults from the soil surface was considerably delayed in the 1988/89 season compared with larvae buried at 5 or 10 cm which had similar emergence patterns for all three seasons. In 1989/90, when a 1-cm-deep treatment was included, there was a 392% increase in adult emergence from this treatment compared with deeper treatments. Some diapausing larvae on the surface did not emerge at the end of summer in only 1 year (1989/90), when 28.0% of the larvae on the surface remained in diapause, whereas only 0.8% of the buried larvae remained in diapause. We conclude that the pattern of emergence explains why spring plantings of sorghum in south-eastern Queensland usually escape sorghum midge attack.
Abstract:
An urgent need exists for indicators of soil health and patch functionality in extensive rangelands that can be measured efficiently and at low cost. Soil mites are candidate indicators, but their identification and handling are so specialised and time-consuming that their inclusion in routine monitoring is unlikely. The aim of this study was to measure the relationship between patch type and mite assemblages using a conventional approach. An additional aim was to determine if a molecular approach traditionally used for soil microbes could be adapted for soil mites to overcome some of the bottlenecks associated with soil fauna diversity assessment. Soil mite species abundance and diversity were measured using conventional ecological methods in soil from patches with perennial grass and litter cover (PGL), and compared to soil from bare patches with annual grasses and/or litter cover (BAL). Soil mite assemblages were also assessed using a molecular method called terminal-restriction fragment length polymorphism (T-RFLP) analysis. The conventional data showed a relationship between patch type and mite assemblage. The Prostigmata and Oribatida were well represented in the PGL sites, particularly the Aphelacaridae (Oribatida). For T-RFLP analysis, the mite community was represented by a series of DNA fragment lengths that reflected mite sequence diversity. The T-RFLP data showed a distinct difference in the mite assemblage between the patch types. Where possible, T-RFLP peaks were matched to mite families using a reference 18S rDNA database, and the Aphelacaridae prevalent in the conventional samples at PGL sites were identified, as were prostigmatids and oribatids. We identified limits to the T-RFLP approach, including an inability to distinguish some species whose DNA sequences were similar.
Despite these limitations, the data still showed a clear difference between sites, and the molecular taxonomic inferences also compared well with the conventional ecological data. The results from this study indicated that the T-RFLP approach was effective in measuring mite assemblages in this system. The power of this technique lies in the fact that species diversity and abundance data can be obtained quickly, because the time taken to process hundreds of samples, from soil DNA extraction to data output on the gene analyser, can be as little as 4 days.
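The peak-to-family matching step described above amounts to a nearest-neighbour lookup of each terminal fragment length against a reference table. A hypothetical sketch; the fragment lengths and tolerance below are invented for illustration and are not values from the study's 18S rDNA database:

```python
def match_peaks(peaks_bp, reference_bp, tol=1.0):
    """Assign each T-RFLP peak (fragment length, bp) to the closest
    reference taxon within +/- tol bp; unmatched peaks map to None."""
    assigned = {}
    for peak in peaks_bp:
        taxon, length = min(reference_bp.items(),
                            key=lambda item: abs(item[1] - peak))
        assigned[peak] = taxon if abs(length - peak) <= tol else None
    return assigned

# Invented reference fragment lengths (bp), for illustration only.
reference = {"Aphelacaridae": 212.0, "Prostigmata sp.": 250.0}
print(match_peaks([212.4, 300.0], reference))
```

A lookup like this also makes the stated limitation concrete: two taxa whose fragment lengths fall within the tolerance of each other cannot be distinguished.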
Abstract:
To improve the sustainability and environmental accountability of the banana industry there is a need to develop a set of soil health indicators that integrate physical, chemical and biological soil properties. These indicators would allow banana growers, extension and research workers to improve soil health management practices. To determine changes in soil properties due to the cultivation of bananas, a paired site survey was conducted comparing soil properties under conventional banana systems to less intensively managed vegetation systems, such as pastures and forest. Measurements were made on physical, chemical and biological soil properties at seven locations in tropical and sub-tropical banana producing areas. Soil nematode community composition was used as a bioindicator of the biological properties of the soil. Soils under conventional banana production tended to have a greater soil bulk density, with less soil organic carbon (C) (both total C and labile C), greater exchangeable cations, higher extractable P, greater numbers of plant-parasitic nematodes and less nematode diversity, relative to less intensively managed plant systems. The organic banana production systems at two locations had greater labile C, relative to conventional banana systems, but there was no significant change in nematode community composition. There were significant interactions between physical, chemical and nematode community measurements in the soil, particularly with soil C measurements, confirming the need for a holistic set of indicators to aid soil management. There was no single indicator of soil health for the Australian banana industry, but a set of soil health indicators, which would allow the measurement of soil improvements, should include: bulk density, soil C, pH, EC, total N, extractable P, ECEC and soil nematode community structure.
Abstract:
Heavy wheel traffic causes soil compaction, which adversely affects crop production and may persist for several years. We applied known compaction forces to entire plots annually for 5 years, and then determined the duration of the adverse effects on the properties of a Vertisol and the performance of crops under no-till dryland cropping with residue retention. For up to 5 years after a final treatment with a 10 Mg axle load on wet soil, soil shear strength at 70-100 mm and cone index at 180-360 mm were significantly (P < 0.05) higher than in a control treatment, and soil water storage and grain yield were lower. We conclude that compaction effects persisted because (1) there were insufficient wet-dry cycles to swell and shrink the entire compacted layer, (2) soil loosening by tillage was absent and (3) there were fewer earthworms in the compacted soil. Compaction of dry soil with 6 Mg had little effect at any time, indicating that by using wheel traffic only when the soil is dry, problems can be avoided. Unfortunately, such a restriction is not always possible because sowing, tillage and harvest operations often need to be done when the soil is wet. A more generally applicable solution, which also ensures timely operations, is the permanent separation of wheel zones and crop zones in the field, the practice known as controlled traffic farming. Where a compacted layer already exists, even on a clay soil, management options to hasten repair should be considered, e.g. tillage, deep ripping, sowing a ley pasture or sowing crop species more effective at repairing compacted soil.
Abstract:
The leaching of phosphorus (P) within soils can be a limiting consideration for the sustainable operation of intensive livestock enterprises. Sorption curves are widely used to assist estimation of P retention, though the effect of effluent constituents on their accuracy is not well understood. We conducted a series of P-sorption-desorption batch experiments with an Oxic Haplustalf (soil 1), Haplusterts (soils 2 and 3), and a Natrustalf (soil 4). Phosphorus sources included effluent, orthophosphate-P in a matrix replicating the effluent's salt constituents (the reference solution), and an orthophosphate-P solution. Treated soils were incubated for up to 193 days before sequential desorption extraction. Effluent constituents, probably the organic or particulate components, temporarily increased the vulnerability of sorbed-P to desorption. The increase in vulnerability was removed by 2-113 days of incubation (25 degrees C). Despite vigorous extraction for 20 consecutive days, some P sorbed as part of the treatments of soils 1 and 2 was not desorbed. The increased vulnerability due to effluent constituents lasted a maximum of about one cropping season and, for all other treatments, adsorption curves overestimated vulnerability to desorption. Therefore, adsorption curves provide a conservative estimate of vulnerability to desorption where effluent is used in continued crop production in these soils.
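Sorption curves of the kind used above are commonly summarised with a Freundlich isotherm, S = Kf * C^n, which is linear in log-log space and so can be fitted with a simple least-squares regression. The batch data below are invented for illustration and are not the study's measurements:

```python
import numpy as np

# Invented batch data: equilibrium solution P (mg/L) and sorbed P (mg/kg).
C = np.array([0.1, 0.5, 1.0, 5.0, 10.0])
S = np.array([30.0, 80.0, 120.0, 300.0, 450.0])

# Freundlich isotherm S = Kf * C**n is linear after log transformation:
# log10(S) = log10(Kf) + n * log10(C)
n, log_kf = np.polyfit(np.log10(C), np.log10(S), 1)
kf = 10.0 ** log_kf
print(round(n, 2), round(kf, 1))
```

A fitted exponent n well below 1, as here, indicates that the soil's affinity for P declines as loading increases, which is one reason adsorption curves measured on clean orthophosphate solutions can misestimate behaviour under effluent loading.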