82 results for soil sorption
in eResearch Archive - Queensland Department of Agriculture
Abstract:
The leaching of phosphorus (P) within soils can be a limiting consideration for the sustainable operation of intensive livestock enterprises. Sorption curves are widely used to assist estimation of P retention, though the effect of effluent constituents on their accuracy is not well understood. We conducted a series of P-sorption-desorption batch experiments with an Oxic Haplustalf (soil 1), Haplusterts (soils 2 and 3), and a Natrustalf (soil 4). Phosphorus sources included effluent, orthophosphate-P in a matrix replicating the effluent's salt constituents (the reference solution), and an orthophosphate-P solution. Treated soils were incubated for up to 193 days before sequential desorption extraction. Effluent constituents, probably the organic or particulate components, temporarily increased the vulnerability of sorbed-P to desorption. The increase in vulnerability was removed by 2-113 days of incubation (25 degrees C). Despite vigorous extraction for 20 consecutive days, some P sorbed as part of the treatments of soils 1 and 2 was not desorbed. The increased vulnerability due to effluent constituents lasted a maximum of about one cropping season and, for all other treatments, adsorption curves overestimated vulnerability to desorption. Therefore, adsorption curves provide a conservative estimate of vulnerability to desorption where effluent is used in continued crop production in these soils.
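Sorption curves like those used in this study are often summarised with a Freundlich isotherm fitted to batch-experiment data. A minimal sketch, using invented equilibrium data for illustration rather than the study's measurements:

```python
import numpy as np

# Hypothetical batch-experiment data: equilibrium solution P (mg/L)
# and sorbed P (mg/kg). Values are illustrative, not from the study.
c_eq = np.array([0.5, 1.0, 2.0, 5.0, 10.0])
s_sorbed = np.array([55.0, 80.0, 118.0, 190.0, 270.0])

# Fit the Freundlich isotherm S = Kf * C^n by linear regression
# on log-transformed data: ln S = ln Kf + n * ln C.
n_exp, log_kf = np.polyfit(np.log(c_eq), np.log(s_sorbed), 1)
kf = np.exp(log_kf)

print(f"Kf = {kf:.1f} (mg/kg)/(mg/L)^n, n = {n_exp:.2f}")
```

The fitted curve can then be compared against sequential desorption extracts to judge how well an adsorption curve predicts vulnerability to desorption.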
Abstract:
In Australia, communities are concerned about atrazine being detected in drinking water supplies. It is important to understand mechanisms by which atrazine is transported from paddocks to waterways if we are to reduce movement of agricultural chemicals from the site of application. Two paddocks cropped with grain sorghum on a Black Vertosol were monitored for atrazine, potassium chloride (KCl) extractable atrazine, desethylatrazine (DEA), and desisopropylatrazine (DIA) at 4 soil depths (0-0.05, 0.05-0.10, 0.10-0.20, and 0.20-0.30 m) and in runoff water and runoff sediment. Atrazine + DEA + DIA (total atrazine) had a half-life in soil of 16-20 days, dissipating more rapidly than in many earlier reports. Atrazine extracted in dilute potassium chloride, considered available for weed control, was initially 34% of the total and had a half-life of 15-20 days until day 30, after which it dissipated rapidly with a half-life of 6 days. We conclude that, in this region, atrazine may not pose a risk for groundwater contamination, as only 0.5% of applied atrazine moved deeper than 0.20 m into the soil, where it dissipated rapidly. In runoff (including suspended sediment) atrazine concentrations were greatest during the first runoff event (57 days after application) (85 μg/L) and declined with time. After 160 days, the total atrazine lost in runoff was 0.4% of the initial application. The total atrazine concentration in runoff was strongly related to the total concentration in soil, as expected. Even after 98% of the KCl-extractable atrazine had dissipated (and no longer provided weed control), runoff concentrations still exceeded the human health guideline value of 40 μg/L. For total atrazine in soil (0-0.05 m), the range for the coefficient of soil sorption (Kd) was 1.9-28.4 mL/g and for soil organic carbon sorption (KOC) was 100-2184 mL/g, increasing with time of contact with the soil and rapid dissipation of the more soluble, available phase.
Partition coefficients in runoff for total atrazine were initially 3, increasing to 32 and 51 with time, values for DEA being half these. To minimise atrazine losses, cultural practices that maximise rain infiltration, and thereby minimise runoff, and minimise concentrations in the soil surface should be adopted.
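The sorption coefficients and half-lives reported above follow standard definitions (Kd as the ratio of sorbed to solution concentration, KOC as Kd normalised to organic carbon, first-order dissipation). A minimal sketch, with assumed input values rather than the study's data:

```python
import math

# Illustrative inputs only (the study reports Kd of 1.9-28.4 mL/g
# and KOC of 100-2184 mL/g; these specific numbers are assumptions).
c_sorbed = 0.95         # atrazine on soil, mg/kg
c_solution = 0.10       # atrazine in solution, mg/L
organic_carbon = 0.012  # soil organic carbon fraction (1.2%)

kd = c_sorbed / c_solution  # mL/g, since (mg/kg) / (mg/L)
koc = kd / organic_carbon   # normalised to organic carbon content

# First-order dissipation: C(t) = C0 * exp(-k t); half-life = ln 2 / k.
k = math.log(2) / 18.0      # rate constant for an assumed 18-day half-life
fraction_left_after_30d = math.exp(-k * 30)

print(f"Kd = {kd:.1f} mL/g, KOC = {koc:.0f} mL/g")
print(f"Fraction remaining after 30 days: {fraction_left_after_30d:.2f}")
```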
Abstract:
Attention is directed at land application of piggery effluent (containing urine, faeces, water, and wasted feed) as a potential source of water resource contamination with phosphorus (P). This paper summarises P-related properties of soil from 0-0.05 m depth at 11 piggery effluent application sites, in order to explore the impact that effluent application has had on the potential for run-off transport of P. The sites investigated were situated on Alfisol, Mollisol, Vertisol, and Spodosol soils in areas that received effluent for 1.5-30 years (estimated effluent-P applications of 100-310 000 kg P/ha in total). Total (PT), bicarbonate extractable (PB), and soluble P forms were determined for the soil (0-0.05 m) at paired effluent and no-effluent sites, as well as texture, oxalate-extractable Fe and Al, organic carbon, and pH. All forms of soil P at 0-0.05 m depth increased with effluent application (PB at effluent sites was 1.7-15 times that at no-effluent sites) at 10 of the 11 sites. Increases in PB were strongly related to net P applications (regression analysis of log values for 7 sites with complete data sets: 82.6% of variance accounted for, P < 0.01). Effluent irrigation tended to increase the proportion of soil PT in dilute CaCl2-extractable forms (PTC: effluent average 2.0%; no-effluent average 0.6%). The proportion of PTC in non-molybdate reactive forms (centrifuged supernatant) decreased (no-effluent average, 46.4%; effluent average, 13.7%). Anaerobic lagoon effluent did not reliably acidify soil, since no consistent relationship was observed for pH with effluent application. Soil organic carbon was increased in most of the effluent areas relative to the no-effluent areas. The four effluent areas where organic carbon was reduced had undergone intensive cultivation and cropping. Current effluent management at many of the piggeries failed to maximise the potential for waste P recapture.
Ten of the case-study effluent application areas have received effluent-P in excess of crop uptake. While this may not represent a significant risk of leaching where sorption retains P, it has increased the risk of transport of P by run-off. Where such sites are close to surface water, run-off P loads should be managed.
Abstract:
Soil testing is the most widely used tool to predict the need for fertiliser phosphorus (P) application to crops. This study examined factors affecting critical soil P concentrations and confidence intervals for wheat and barley grown in Australian soils by interrogating validated data from 1777 wheat and 150 barley field treatment series now held in the BFDC National Database. To narrow confidence intervals associated with estimated critical P concentrations, filters for yield, crop stress, or low pH were applied. Once treatment series with low yield (<1 t/ha), severe crop stress, or pHCaCl2 <4.3 were screened out, critical concentrations were relatively insensitive to wheat yield (>1 t/ha). There was a clear increase in critical P concentration from early trials when full tillage was common compared with those conducted in 1995–2011, which corresponds to a period of rapid shift towards adoption of minimum tillage. For wheat, critical Colwell-P concentrations associated with 90 or 95% of maximum yield varied among Australian Soil Classification (ASC) Orders and Sub-orders: Calcarosol, Chromosol, Kandosol, Sodosol, Tenosol and Vertosol. Soil type, based on ASC Orders and Sub-orders, produced critical Colwell-P concentrations at 90% of maximum relative yield from 15 mg/kg (Grey Vertosol) to 47 mg/kg (Supracalcic Calcarosols), with other soils having values in the range 19–27 mg/kg. Distinctive differences in critical P concentrations were evident among Sub-orders of Calcarosols, Chromosols, Sodosols, Tenosols, and Vertosols, possibly due to differences in soil properties related to P sorption. However, insufficient data were available to develop a relationship between P buffering index (PBI) and critical P concentration. In general, there was no evidence that critical concentrations for barley would be different from those for wheat on the same soils. 
Significant knowledge gaps to fill to improve the relevance and reliability of soil P testing for winter cereals were: lack of data for oats; the paucity of treatment series reflecting current cropping practices, especially minimum tillage; and inadequate metadata on soil texture, pH, growing season rainfall, gravel content, and PBI. The critical concentrations determined illustrate the importance of recent experimental data and of soil type, but also provide examples of interrogation pathways into the BFDC National Database to extract locally relevant critical P concentrations for guiding P fertiliser decision-making in wheat and barley.
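Critical soil P concentrations such as those above are typically read off a fitted yield-response curve. A hedged sketch using a Mitscherlich-type response with illustrative parameters (not values fitted to the BFDC National Database):

```python
import math

# Hypothetical Mitscherlich-type response fitted to trial data.
# Parameters a, b, c are assumptions chosen for illustration.
a, b, c = 100.0, 80.0, 0.08  # RY(P) = a - b * exp(-c * P)

def relative_yield(p):
    """Relative yield (%) at Colwell-P concentration p (mg/kg)."""
    return a - b * math.exp(-c * p)

def critical_p(target_ry):
    """Invert RY(P) = target: P = -ln((a - target) / b) / c."""
    return -math.log((a - target_ry) / b) / c

p90 = critical_p(90.0)
print(f"Critical Colwell-P for 90% relative yield: {p90:.1f} mg/kg")
```

With these assumed parameters the critical concentration lands near the 19-27 mg/kg range reported for most soils, but the real values depend on the soil-type-specific fits held in the database.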
Abstract:
Seed production and soil seed banks of H. contortus were studied in a subset of treatments within an extensive grazing study conducted in H. contortus pasture in southern Queensland between 1990 and 1996. Seed production of H. contortus in autumn ranged from 260 to 1800 seeds/m2, with much of this variation due to differences in rainfall between years. Seed production was generally higher in the silver-leaved ironbark than in the narrow-leaved ironbark land class and was also influenced by a consistent stocking rate × pasture type interaction. Inflorescence density was the main factor contributing to the variable seed production and was related to the rainfall received during February. The number of seeds per inflorescence was unaffected by seasonal rainfall, landscape position, stocking rate or legume oversowing. Seed viability was related to the rainfall received during March. Soil seed banks in spring varied from 130 to 520 seeds/m2 between 1990 and 1995, with generally more seed present in the silver-leaved ironbark than in the narrow-leaved ironbark land class. There were poor relationships between viable seed production and the size of the soil seed bank, and between the size of the soil seed bank and seedling recruitment. This study indicates that H. contortus has the potential to produce relatively large amounts of seed and showed that the seasonal pattern of rainfall plays a major role in achieving this potential.
Abstract:
Land application of piggery effluent (containing urine, faeces, water, and wasted feed) is under close scrutiny as a potential source of water resource contamination with phosphorus (P). This paper investigates two case studies of the impact of long-term piggery effluent-P application to soil. A Natrustalf (Sodosol) at P1 has received a net load of 3700 kg effluent P/ha over 19 years. The Haplustalf (Dermosol) selected (P2) has received a net load of 310 000 kg P/ha over 30 years. Total, bicarbonate extractable, and soluble P forms were determined throughout the soil profiles for paired (irrigated and unirrigated) sites at P1 and P2, as well as P sorption and desorption characteristics. Surface bicarbonate-extractable P (PB, 0-0.05 m depth) and dilute CaCl2-extractable molybdate-reactive P (PC) were significantly elevated by effluent irrigation (P1: PB unirrigated 23±1, irrigated 290±6; PC unirrigated 0.03±0.00, irrigated 23.9±0.2. P2: PB unirrigated 72±48, irrigated 3950±1960; PC unirrigated 0.7±0.0, irrigated 443±287 mg P/kg; mean±s.d.). Phosphorus enrichment to 1.5 m, detected as PB, was observed at P2. Elevated concentrations of CaCl2-extractable organic P forms (POC; estimated by non-molybdate reactive P in centrifuged supernatants) were observed from the soil surface of P1 to a depth of 0.4 m. Despite the extent of effluent application at both of these sites, only P1 displayed evidence of significant accumulation of POC. The increase in surface soil total P (0-0.05 m depth) due to effluent irrigation was much greater than laboratory P sorption (>25 times for P1; >57 times for P2) for a comparable range of final solution concentrations (desorption extracts ranged from 1-5 mg P/L for P1 and 50-80 mg P/L for P2). Precipitation of sparingly soluble P phases was evidenced in the soils of the P2 effluent application area.
Abstract:
Soil nitrogen (N) supply in the Vertosols of southern Queensland, Australia has steadily declined as a result of long-term cereal cropping without N fertiliser application or rotations with legumes. Nitrogen-fixing legumes such as lucerne may enhance soil N supply and therefore could be used in lucerne-wheat rotations. However, lucerne leys in this subtropical environment can create a soil moisture deficit, which may persist for a number of seasons. Therefore, we evaluated the effect of varying the duration of a lucerne ley (for up to 4 years) on soil N increase, N supply to wheat, soil water changes, wheat yields and wheat protein on a fertility-depleted Vertosol in a field experiment between 1989 and 1996 at Warra (26°47′S, 150°53′E), southern Queensland. The experiment consisted of a wheat-wheat rotation, and 8 treatments of lucerne leys starting in 1989 (phase 1) or 1990 (phase 2) for 1, 2, 3, or 4 years duration, followed by wheat cropping. Lucerne DM yield and N yield increased with increasing duration of lucerne leys. Soil N increased over time following 2 years of lucerne but there was no further significant increase after 3 or 4 years of lucerne ley. Soil nitrate concentrations increased significantly with all lucerne leys and moved progressively downward in the soil profile from 1992 to 1995. Soil water, especially at 0.9-1.2 m depth, remained significantly lower for the next 3 years after the termination of the 4-year lucerne ley than under continuous wheat. No significant increase in wheat yields was observed from 1992 to 1995, irrespective of the lucerne ley. However, wheat grain protein concentrations were significantly higher under lucerne-wheat than under wheat-wheat rotations for 3-5 years. The lucerne yield and soil water and nitrate-N concentrations were satisfactorily simulated with the APSIM model.
Although significant N accretion occurred in the soil following lucerne leys, recharge of the dried soil profile after long-duration lucerne took up to 3 years in drier seasons. Consequently, 3- and 4-year lucerne-wheat rotations resulted in more variable wheat yields than wheat-wheat rotations in this region. The remaining challenge in using lucerne-wheat rotations is balancing the N accretion benefits with plant-available water deficits, which are most likely to occur in the highly variable rainfall conditions of this region.
Abstract:
The size of the soil microbial biomass carbon (SMBC) has been proposed as a sensitive indicator for measuring the adverse effects of contaminants on the soil microbial community. In this study of Australian agricultural systems, we demonstrated that field variability of SMBC measured using the fumigation-extraction procedure limited its use as a robust ecotoxicological endpoint. The SMBC varied up to 4-fold across control samples collected from a single field site, due to small-scale spatial heterogeneity in the soil physicochemical environment. Power analysis revealed that large numbers of replicates (3-93) were required to identify 20% or 50% decreases in the size of the SMBC of contaminated soil samples relative to their uncontaminated control samples at the 0.05 level of statistical significance. We question the value of the routine measurement of SMBC as an ecotoxicological endpoint at the field scale, and suggest more robust and predictive microbiological indicators.
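The replicate numbers such a power analysis yields can be sketched with a normal-approximation sample-size formula for a two-sample comparison. The CV and effect sizes below are assumptions for illustration, not the paper's measured values:

```python
import math

# Normal-approximation sample size per group:
# n ≈ 2 * ((z_alpha/2 + z_beta) * sigma / delta)^2
z_alpha = 1.96  # two-sided alpha = 0.05
z_beta = 0.84   # power = 0.80

def replicates_needed(cv, relative_decrease):
    """Replicates per group to detect a given relative decrease,
    with sigma and the effect both expressed as fractions of the mean."""
    sigma = cv
    delta = relative_decrease
    return math.ceil(2 * ((z_alpha + z_beta) * sigma / delta) ** 2)

# Assumed high field variability (CV = 60%):
print(replicates_needed(0.60, 0.20))  # small (20%) decrease in SMBC
print(replicates_needed(0.60, 0.50))  # large (50%) decrease in SMBC
```

This illustrates the paper's point: detecting a modest decrease against high spatial variability demands far more replicates than detecting a large one.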
Abstract:
In south-eastern Queensland, Australia, sorghum planted in early spring usually escapes sorghum midge, Stenodiplosis sorghicola, attack. Experiments were conducted to better understand the role of winter diapause in the population dynamics of this pest. Emergence patterns of adult midge from diapausing larvae on the soil surface and at various depths were investigated during spring to autumn of 1987/88–1989/90. From 1987/88 to 1989/90, 89%, 65% and 98% of adult emergence, respectively, occurred during November and December. Adult emergence from larvae diapausing on the soil surface was severely reduced due to high mortality attributed to surface soil temperatures in excess of 40°C, with much of this mortality occurring between mid-September and mid-October. Emergence of adults from the soil surface was considerably delayed in the 1988/89 season compared with larvae buried at 5 or 10 cm which had similar emergence patterns for all three seasons. In 1989/90, when a 1-cm-deep treatment was included, there was a 392% increase in adult emergence from this treatment compared with deeper treatments. Some diapausing larvae on the surface did not emerge at the end of summer in only 1 year (1989/90), when 28.0% of the larvae on the surface remained in diapause, whereas only 0.8% of the buried larvae remained in diapause. We conclude that the pattern of emergence explains why spring plantings of sorghum in south-eastern Queensland usually escape sorghum midge attack.
Abstract:
An urgent need exists for indicators of soil health and patch functionality in extensive rangelands that can be measured efficiently and at low cost. Soil mites are candidate indicators, but their identification and handling is so specialised and time-consuming that their inclusion in routine monitoring is unlikely. The aim of this study was to measure the relationship between patch type and mite assemblages using a conventional approach. An additional aim was to determine if a molecular approach traditionally used for soil microbes could be adapted for soil mites to overcome some of the bottlenecks associated with soil fauna diversity assessment. Soil mite species abundance and diversity were measured using conventional ecological methods in soil from patches with perennial grass and litter cover (PGL), and compared to soil from bare patches with annual grasses and/or litter cover (BAL). Soil mite assemblages were also assessed using a molecular method called terminal-restriction fragment length polymorphism (T-RFLP) analysis. The conventional data showed a relationship between patch type and mite assemblage. The Prostigmata and Oribatida were well represented in the PGL sites, particularly the Aphelacaridae (Oribatida). For T-RFLP analysis, the mite community was represented by a series of DNA fragment lengths that reflected mite sequence diversity. The T-RFLP data showed a distinct difference in the mite assemblage between the patch types. Where possible, T-RFLP peaks were matched to mite families using a reference 18S rDNA database, and the Aphelacaridae prevalent in the conventional samples at PGL sites were identified, as were prostigmatids and oribatids. We identified limits to the T-RFLP approach and this included an inability to distinguish some species whose DNA sequences were similar. 
Despite these limitations, the data still showed a clear difference between sites, and the molecular taxonomic inferences also compared well with the conventional ecological data. The results from this study indicated that the T-RFLP approach was effective in measuring mite assemblages in this system. The power of this technique lies in the fact that species diversity and abundance data can be obtained quickly, because the time taken to process hundreds of samples, from soil DNA extraction to data output on the gene analyser, can be as little as 4 days.
Abstract:
BACKGROUND: In spite of the extensive use of phosphine fumigation around the world to control insects in stored grain, and the knowledge that grain sorbs phosphine, the effect of concentration on sorption has not been quantified. A laboratory study was undertaken, therefore, to investigate the effect of phosphine dose on sorption in wheat. Wheat was added to glass flasks to achieve filling ratios of 0.25-0.95, and the flasks were sealed and injected with phosphine at 0.1-1.5 mg/L based on flask volume. Phosphine concentration was monitored for 8 days at 25°C and 55% RH. RESULTS: When sorption occurred, phosphine concentration declined with time and was approximately first order, i.e. the data fitted an exponential decay equation. Percentage sorption per day was directly proportional to filling ratio, and was negatively correlated with dose for any given filling ratio. Based on the results, a tenfold increase in dose would result in a halving of the sorption constant and the percentage daily loss. Wheat was less sorptive if it was fumigated for a second time. CONCLUSIONS: The results have implications for the use of phosphine for control of insects in stored wheat. This study shows that dose is a factor that must be considered when trying to understand the impact of sorption on phosphine concentration, and that there appears to be a limit to the capacity of wheat to sorb phosphine.
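The first-order decline described in the results can be recovered by linear regression on log-transformed concentration. A sketch with synthetic monitoring data (not the study's measurements):

```python
import numpy as np

# Synthetic monitoring data loosely mimicking first-order phosphine
# loss over 8 days; the concentrations (mg/L) are assumptions.
days = np.array([0, 1, 2, 4, 6, 8], dtype=float)
conc = np.array([1.00, 0.86, 0.74, 0.55, 0.41, 0.30])

# Fit C(t) = C0 * exp(-k t) by linear regression of ln C on t.
neg_k, ln_c0 = np.polyfit(days, np.log(conc), 1)
k = -neg_k
daily_loss_pct = (1 - np.exp(-k)) * 100  # percentage sorbed per day

print(f"Sorption constant k = {k:.3f} /day")
print(f"Daily loss = {daily_loss_pct:.1f}%")
```

Repeating such fits across filling ratios and doses is one way to quantify the dose and filling-ratio dependence of the sorption constant that the study reports.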
Abstract:
To improve the sustainability and environmental accountability of the banana industry there is a need to develop a set of soil health indicators that integrate physical, chemical and biological soil properties. These indicators would allow banana growers, extension and research workers to improve soil health management practices. To determine changes in soil properties due to the cultivation of bananas, a paired site survey was conducted comparing soil properties under conventional banana systems to less intensively managed vegetation systems, such as pastures and forest. Measurements were made on physical, chemical and biological soil properties at seven locations in tropical and sub-tropical banana producing areas. Soil nematode community composition was used as a bioindicator of the biological properties of the soil. Soils under conventional banana production tended to have a greater soil bulk density, with less soil organic carbon (C) (both total C and labile C), greater exchangeable cations, higher extractable P, greater numbers of plant-parasitic nematodes and less nematode diversity, relative to less intensively managed plant systems. The organic banana production systems at two locations had greater labile C, relative to conventional banana systems, but there was no significant change in nematode community composition. There were significant interactions between physical, chemical and nematode community measurements in the soil, particularly with soil C measurements, confirming the need for a holistic set of indicators to aid soil management. There was no single indicator of soil health for the Australian banana industry, but a set of soil health indicators that would allow the measurement of soil improvements should include: bulk density, soil C, pH, EC, total N, extractable P, ECEC, and soil nematode community structure.
Abstract:
Heavy wheel traffic causes soil compaction, which adversely affects crop production and may persist for several years. We applied known compaction forces to entire plots annually for 5 years, and then determined the duration of the adverse effects on the properties of a Vertisol and the performance of crops under no-till dryland cropping with residue retention. For up to 5 years after a final treatment with a 10 Mg axle load on wet soil, soil shear strength at 70-100 mm and cone index at 180-360 mm were significantly (P < 0.05) higher than in a control treatment, and soil water storage and grain yield were lower. We conclude that compaction effects persisted because (1) there were insufficient wet-dry cycles to swell and shrink the entire compacted layer, (2) soil loosening by tillage was absent and (3) there were fewer earthworms in the compacted soil. Compaction of dry soil with 6 Mg had little effect at any time, indicating that by using wheel traffic only when the soil is dry, problems can be avoided. Unfortunately such a restriction is not always possible because sowing, tillage and harvest operations often need to be done when the soil is wet. A more generally applicable solution, which also ensures timely operations, is the permanent separation of wheel zones and crop zones in the field: the practice known as controlled traffic farming. Where a compacted layer already exists, even on a clay soil, management options to hasten repair should be considered, e.g. tillage, deep ripping, sowing a ley pasture or sowing crop species more effective at repairing compacted soil.
Abstract:
Salinity, sodicity, acidity, and phytotoxic levels of chloride (Cl) in subsoils are major constraints to crop production in many soils of north-eastern Australia because they reduce the ability of crop roots to extract water and nutrients from the soil. The complex interactions and correlations among soil properties result in multicollinearity between soil properties and crop yield that makes it difficult to determine which constraint is the major limitation. We used ridge-regression analysis to overcome collinearity to evaluate the contribution of soil factors and water supply to the variation in the yields of 5 winter crops on soils with various levels and combinations of subsoil constraints in the region. Subsoil constraints measured were soil Cl, electrical conductivity of the saturation extract (ECse), and exchangeable sodium percentage (ESP). The ridge regression procedure selected several of the variables used in a descriptive model, which included in-crop rainfall, plant-available soil water at sowing in the 0.90-1.10 m soil layer, and soil Cl in the 0.90-1.10 m soil layer, and accounted for 77-85% of the variation in the grain yields of the 5 winter crops. Inclusion of ESP of the topsoil (0.0-0.10 m soil layer) marginally increased the descriptive capability of the models for bread wheat, barley and durum wheat. Subsoil Cl concentration was found to be an effective substitute for subsoil water extraction. The estimates of the critical levels of subsoil Cl for a 10% reduction in the grain yield were 492 mg Cl/kg for chickpea, 662 mg Cl/kg for durum wheat, 854 mg Cl/kg for bread wheat, 980 mg Cl/kg for canola, and 1012 mg Cl/kg for barley, thus suggesting that chickpea and durum wheat were more sensitive to subsoil Cl than bread wheat, barley, and canola.
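Ridge regression stabilises coefficient estimates under collinearity by adding a penalty term to the normal equations. A minimal sketch of the closed-form estimator with random placeholder predictors (not the trial data), where one column is made nearly collinear with another:

```python
import numpy as np

# Ridge estimator: beta = (X'X + lam * I)^(-1) X'y. The penalty lam
# shrinks coefficients, which stabilises them when predictors are
# correlated (as with subsoil Cl, ECse and ESP in the study).
rng = np.random.default_rng(0)
n, p = 40, 3
X = rng.normal(size=(n, p))
X[:, 2] = X[:, 0] + 0.05 * rng.normal(size=n)  # induce collinearity
y = X @ np.array([1.0, -0.5, 0.5]) + 0.1 * rng.normal(size=n)

lam = 1.0
beta = np.linalg.solve(X.T @ X + lam * np.eye(p), X.T @ y)
print("ridge coefficients:", np.round(beta, 2))
```

With lam = 0 this reduces to ordinary least squares, whose estimates become unstable as the collinearity between columns 0 and 2 tightens.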
Abstract:
The fate of nitrogen (N) applied in biosolids was investigated in a forage production system on an alluvial clay loam soil in south-eastern Queensland, Australia. Biosolids were applied in October 2002 at rates of 6, 12, 36, and 54 dry t/ha for aerobically digested biosolids (AE) and 8, 16, 48, and 72 dry t/ha for anaerobically digested biosolids (AN). Rates were based on multiples of the Nitrogen Limited Biosolids Application rate (0.5, 1, 3, and 4.5 NLBAR) for each type of biosolid. The experiment included an unfertilised control and a fertilised control that received multiple applications of synthetic fertiliser. Forage sorghum was planted 1 week after biosolids application and harvested 4 times between December 2002 and May 2003. Dry matter production was significantly greater from the biosolids-treated plots (21-27 t/ha) than from the unfertilised (16 t/ha) and fertilised (18 t/ha) controls. The harvested plant material removed an extra 148-488 kg N from the biosolids-treated plots. Partial N budgets were calculated for the 1 NLBAR and 4.5 NLBAR treatments for each biosolids type at the end of the crop season. Crop removal only accounted for 25-33% of the applied N in the 1 NLBAR treatments and as low as 8-15% with 4.5 NLBAR. Residual biosolids N was predominantly in the form of organic N (38-51% of applied biosolids N), although there was also a significant proportion (10-23%) as NO3-N, predominantly in the top 0.90 m of the soil profile. From 12 to 29% of applied N was unaccounted for, and presumed to be lost as ammonia through volatilisation or as gaseous nitrogen through denitrification. In-season mineralisation of organic N in biosolids was 43-59% of the applied organic N, which was much greater than the 15% (AN)-25% (AE) expected, based on current NLBAR calculation methods. Excessive biosolids application produced little additional biomass but led to high soil mineral N concentrations that were vulnerable to multiple loss pathways.
Queensland Guidelines need to account for higher rates of mineralisation and losses via denitrification and volatilisation and should therefore encourage lower application rates to achieve optimal plant growth and minimise the potential for detrimental impacts on the environment.
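The partial N budget arithmetic described above can be sketched as a simple mass balance, where the unaccounted fraction is the residual after crop removal and measured soil pools are subtracted from the applied load. All values below are illustrative, not the paper's measurements:

```python
# Partial N budget sketch for a single biosolids treatment.
# Inputs are assumptions chosen for illustration.
applied_n = 1200.0        # kg N/ha applied in biosolids
crop_removal = 300.0      # kg N/ha removed in harvested forage
residual_organic = 550.0  # kg N/ha remaining as organic N
residual_nitrate = 180.0  # kg N/ha remaining as NO3-N in the profile

unaccounted = applied_n - crop_removal - residual_organic - residual_nitrate
print(f"Crop removal: {crop_removal / applied_n:.0%} of applied N")
print(f"Unaccounted (presumed gaseous loss): {unaccounted / applied_n:.0%}")
```

In the study, this unaccounted fraction (12-29% of applied N) is what was attributed to ammonia volatilisation and denitrification.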