79 results for soil deterioration
in eResearch Archive - Queensland Department of Agriculture
Abstract:
Five species of commercial prawns (Penaeus plebejus, P. merguiensis, P. semisulcatus, P. esculentus and Metapenaeus bennettae) were obtained from South-East and North Queensland, chilled soon after capture and then stored either whole or deheaded, on ice or in ice slurry, until spoilage. Total bacterial counts, total volatile nitrogen, K-values and total demerit scores were assessed at regular intervals. Shelf lives ranged from 10-17 days on ice to more than 20 days in ice slurry. Initial bacterial flora on prawns from shallower waters (4-15 m) were dominated by Gram-positives, with lag periods of around 7 days, whereas flora on prawns from deeper waters (100 m) were dominated by Pseudomonas spp., with no lag period in bacterial growth. The dominant spoiler on ice was mainly Pseudomonas fragi, whereas the main spoiler in ice slurry was Shewanella putrefaciens. Bacterial interactions appear to play a major role in the patterns of spoilage in relation to capture environment and storage method.
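The K-value cited above is a nucleotide-based freshness index. As a point of reference, the sketch below computes it under the conventional definition (the percentage of ATP breakdown products present as inosine and hypoxanthine); the abstract does not give the formula, and the function name and example values are illustrative only.

def k_value(atp, adp, amp, imp, hxr, hx):
    """K-value (%) = (HxR + Hx) / (ATP + ADP + AMP + IMP + HxR + Hx) * 100,
    where HxR is inosine and Hx is hypoxanthine (e.g. in umol/g muscle)."""
    total = atp + adp + amp + imp + hxr + hx
    return 100.0 * (hxr + hx) / total

# Illustrative values only: fresh crustacean muscle is dominated by IMP.
print(f"K-value: {k_value(0.1, 0.2, 0.3, 6.0, 0.8, 0.4):.1f}%")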
Abstract:
Seed production and soil seed banks of H. contortus were studied in a subset of treatments within an extensive grazing study conducted in H. contortus pasture in southern Queensland between 1990 and 1996. Seed production of H. contortus in autumn ranged from 260 to 1800 seeds/m2, with much of this variation due to differences in rainfall between years. Seed production was generally higher in the silver-leaved ironbark than in the narrow-leaved ironbark land class and was also influenced by a consistent stocking rate x pasture type interaction. Inflorescence density was the main factor contributing to the variable seed production and was related to the rainfall received during February. The number of seeds per inflorescence was unaffected by seasonal rainfall, landscape position, stocking rate or legume oversowing. Seed viability was related to the rainfall received during March. Soil seed banks in spring varied from 130 to 520 seeds/m2 between 1990 and 1995, with generally more seed present in the silver-leaved ironbark than in the narrow-leaved ironbark land class. There were poor relationships between viable seed production and the size of the soil seed bank, and between the size of the soil seed bank and seedling recruitment. This study indicates that H. contortus has the potential to produce relatively large amounts of seed and showed that the seasonal pattern of rainfall plays a major role in achieving this potential.
Abstract:
Attention is directed at land application of piggery effluent (containing urine, faeces, water, and wasted feed) as a potential source of water resource contamination with phosphorus (P). This paper summarises P-related properties of soil from 0-0.05 m depth at 11 piggery effluent application sites, in order to explore the impact that effluent application has had on the potential for run-off transport of P. The sites investigated were situated on Alfisol, Mollisol, Vertisol, and Spodosol soils in areas that received effluent for 1.5-30 years (estimated effluent-P applications of 100-310000 kg P/ha in total). Total (PT), bicarbonate extractable (PB), and soluble P forms were determined for the soil (0-0.05 m) at paired effluent and no-effluent sites, as well as texture, oxalate-extractable Fe and Al, organic carbon, and pH. All forms of soil P at 0-0.05 m depth increased with effluent application (PB at effluent sites was 1.7-15 times that at no-effluent sites) at 10 of the 11 sites. Increases in PB were strongly related to net P applications (regression analysis of log values for 7 sites with complete data sets: 82.6% of variance accounted for, P < 0.01). Effluent irrigation tended to increase the proportion of soil PT in dilute CaCl2-extractable forms (PTC: effluent average 2.0%; no-effluent average 0.6%). The proportion of PTC in non-molybdate reactive forms (centrifuged supernatant) decreased (no-effluent average, 46.4%; effluent average, 13.7%). Anaerobic lagoon effluent did not reliably acidify soil, since no consistent relationship was observed for pH with effluent application. Soil organic carbon was increased in most of the effluent areas relative to the no-effluent areas. The four effluent areas where organic carbon was reduced had undergone intensive cultivation and cropping. Current effluent management at many of the piggeries failed to maximise the potential for waste P recapture. Ten of the case-study effluent application areas have received effluent-P in excess of crop uptake. While this may not represent a significant risk of leaching where sorption retains P, it has increased the risk of transport of P by run-off. Where such sites are close to surface water, run-off P loads should be managed.
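The regression described, increases in bicarbonate-extractable P against net P applications on log-transformed values for the 7 sites with complete data, can be sketched as follows; the site values here are invented for illustration, not the paper's data.

import numpy as np
from scipy import stats

# Hypothetical paired-site data: net effluent-P applied (kg P/ha) and the
# resulting increase in bicarbonate-extractable P (mg P/kg) at 0-0.05 m.
net_p_applied = np.array([100, 450, 1200, 5000, 20000, 90000, 310000])
delta_pb = np.array([8, 25, 60, 150, 420, 900, 2100])

# Log-log linear regression, as in the abstract's 7-site analysis.
res = stats.linregress(np.log10(net_p_applied), np.log10(delta_pb))
print(f"slope={res.slope:.2f}, r^2={res.rvalue**2:.3f}, p={res.pvalue:.4f}")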
Abstract:
Soil nitrogen (N) supply in the Vertosols of southern Queensland, Australia has steadily declined as a result of long-term cereal cropping without N fertiliser application or rotations with legumes. Nitrogen-fixing legumes such as lucerne may enhance soil N supply and therefore could be used in lucerne-wheat rotations. However, lucerne leys in this subtropical environment can create a soil moisture deficit, which may persist for a number of seasons. Therefore, we evaluated the effect of varying the duration of a lucerne ley (for up to 4 years) on soil N increase, N supply to wheat, soil water changes, wheat yields and wheat protein on a fertility-depleted Vertosol in a field experiment between 1989 and 1996 at Warra (26°47'S, 150°53'E), southern Queensland. The experiment consisted of a wheat-wheat rotation, and 8 treatments of lucerne leys starting in 1989 (phase 1) or 1990 (phase 2) for 1, 2, 3 or 4 years duration, followed by wheat cropping. Lucerne DM yield and N yield increased with increasing duration of lucerne leys. Soil N increased over time following 2 years of lucerne, but there was no further significant increase after 3 or 4 years of lucerne ley. Soil nitrate concentrations increased significantly with all lucerne leys and moved progressively downward in the soil profile from 1992 to 1995. Soil water, especially at 0.9-1.2 m depth, remained significantly lower for the next 3 years after the termination of the 4-year lucerne ley than under continuous wheat. No significant increase in wheat yields was observed from 1992 to 1995, irrespective of the lucerne ley. However, wheat grain protein concentrations were significantly higher under lucerne-wheat than under wheat-wheat rotations for 3-5 years. The lucerne yield and soil water and nitrate-N concentrations were satisfactorily simulated with the APSIM model. Although significant N accretion occurred in the soil following lucerne leys, recharge of the drier soil profile left by long-duration lucerne took up to 3 years in drier seasons. Consequently, 3- and 4-year lucerne-wheat rotations resulted in more variable wheat yields than wheat-wheat rotations in this region. The remaining challenge in using lucerne-wheat rotations is balancing the N accretion benefits with plant-available water deficits, which are most likely to occur in the highly variable rainfall conditions of this region.
Abstract:
The size of the soil microbial biomass carbon (SMBC) has been proposed as a sensitive indicator for measuring the adverse effects of contaminants on the soil microbial community. In this study of Australian agricultural systems, we demonstrated that field variability of SMBC measured using the fumigation-extraction procedure limited its use as a robust ecotoxicological endpoint. The SMBC varied up to 4-fold across control samples collected from a single field site, due to small-scale spatial heterogeneity in the soil physicochemical environment. Power analysis revealed that large numbers of replicates (3-93) were required to identify 20% or 50% decreases in the size of the SMBC of contaminated soil samples relative to their uncontaminated control samples at the 0.05 level of statistical significance. We question the value of the routine measurement of SMBC as an ecotoxicological endpoint at the field scale, and suggest that more robust and predictive microbiological indicators are needed.
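A power analysis of the kind described, asking how many replicates are needed to detect a 20% or 50% decrease in SMBC against high field variability, might look like the sketch below; the control mean, the 40% coefficient of variation, the target power of 0.8 and the one-sided test are assumptions for illustration.

from statsmodels.stats.power import TTestIndPower

# Assumed field variability: control SMBC mean of 300 mg C/kg soil with a
# 40% coefficient of variation (the reported 4-fold range implies high CV).
mean_control, cv = 300.0, 0.40
sd = mean_control * cv

power = TTestIndPower()
for decrease in (0.20, 0.50):
    effect_size = (mean_control * decrease) / sd   # Cohen's d
    n = power.solve_power(effect_size=effect_size, alpha=0.05, power=0.8,
                          alternative='larger')
    print(f"{decrease:.0%} decrease: ~{n:.0f} replicates per group")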
Abstract:
In south-eastern Queensland, Australia, sorghum planted in early spring usually escapes attack by sorghum midge, Stenodiplosis sorghicola. Experiments were conducted to better understand the role of winter diapause in the population dynamics of this pest. Emergence patterns of adult midge from diapausing larvae on the soil surface and at various depths were investigated during spring to autumn of 1987/88-1989/90. From 1987/88 to 1989/90, 89%, 65% and 98% of adult emergence, respectively, occurred during November and December. Adult emergence from larvae diapausing on the soil surface was severely reduced by high mortality attributed to surface soil temperatures in excess of 40°C, with much of this mortality occurring between mid-September and mid-October. Emergence of adults from the soil surface was considerably delayed in the 1988/89 season compared with larvae buried at 5 or 10 cm, which had similar emergence patterns in all three seasons. In 1989/90, when a 1-cm-deep treatment was included, adult emergence from this treatment was 392% higher than from the deeper treatments. Only in 1 year (1989/90) did some diapausing larvae on the surface fail to emerge by the end of summer: 28.0% of the larvae on the surface remained in diapause, whereas only 0.8% of the buried larvae did so. We conclude that this pattern of emergence explains why spring plantings of sorghum in south-eastern Queensland usually escape sorghum midge attack.
Abstract:
An urgent need exists for indicators of soil health and patch functionality in extensive rangelands that can be measured efficiently and at low cost. Soil mites are candidate indicators, but their identification and handling are so specialised and time-consuming that their inclusion in routine monitoring is unlikely. The aim of this study was to measure the relationship between patch type and mite assemblages using a conventional approach. An additional aim was to determine if a molecular approach traditionally used for soil microbes could be adapted for soil mites to overcome some of the bottlenecks associated with soil fauna diversity assessment. Soil mite species abundance and diversity were measured using conventional ecological methods in soil from patches with perennial grass and litter cover (PGL), and compared to soil from bare patches with annual grasses and/or litter cover (BAL). Soil mite assemblages were also assessed using a molecular method called terminal-restriction fragment length polymorphism (T-RFLP) analysis. The conventional data showed a relationship between patch type and mite assemblage. The Prostigmata and Oribatida were well represented in the PGL sites, particularly the Aphelacaridae (Oribatida). For T-RFLP analysis, the mite community was represented by a series of DNA fragment lengths that reflected mite sequence diversity. The T-RFLP data showed a distinct difference in the mite assemblage between the patch types. Where possible, T-RFLP peaks were matched to mite families using a reference 18S rDNA database, and the Aphelacaridae prevalent in the conventional samples at PGL sites were identified, as were prostigmatids and oribatids. We identified limits to the T-RFLP approach, including an inability to distinguish some species whose DNA sequences were similar. Despite these limitations, the data still showed a clear difference between sites, and the molecular taxonomic inferences compared well with the conventional ecological data. The results from this study indicated that the T-RFLP approach was effective in measuring mite assemblages in this system. The power of this technique lies in the fact that species diversity and abundance data can be obtained quickly: the time taken to process hundreds of samples, from soil DNA extraction to data output on the gene analyser, can be as little as 4 days.
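The peak-matching step of the T-RFLP workflow, assigning observed terminal-restriction fragment lengths to taxa via a reference 18S rDNA database, can be sketched as below; the reference entries and the +/- 1 bp tolerance are invented for illustration. Note how taxa with near-identical fragment lengths collapse onto one peak, which is exactly the resolution limit the study reports.

# Hypothetical reference of predicted terminal-restriction fragment (T-RF)
# lengths (bp) per mite taxon, derived from 18S rDNA sequences.
reference = {
    "Aphelacaridae (Oribatida)": [212, 215],
    "Prostigmata (several families)": [188, 190, 341],
}

def match_peaks(peaks, ref, tol=1):
    """Assign each observed T-RF peak to any taxon within +/- tol bp."""
    hits = {}
    for peak in peaks:
        for taxon, frags in ref.items():
            if any(abs(peak - f) <= tol for f in frags):
                hits.setdefault(peak, []).append(taxon)
    return hits

# The peak at 502 bp has no reference match and stays unassigned.
print(match_peaks([189, 214, 502], reference))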
Abstract:
To improve the sustainability and environmental accountability of the banana industry, there is a need to develop a set of soil health indicators that integrate physical, chemical and biological soil properties. These indicators would allow banana growers, extension and research workers to improve soil health management practices. To determine changes in soil properties due to the cultivation of bananas, a paired site survey was conducted comparing soil properties under conventional banana systems to less intensively managed vegetation systems, such as pastures and forest. Measurements were made on physical, chemical and biological soil properties at seven locations in tropical and sub-tropical banana producing areas. Soil nematode community composition was used as a bioindicator of the biological properties of the soil. Soils under conventional banana production tended to have a greater soil bulk density, less soil organic carbon (C) (both total C and labile C), greater exchangeable cations, higher extractable P, greater numbers of plant-parasitic nematodes and less nematode diversity, relative to less intensively managed plant systems. The organic banana production systems at two locations had greater labile C, relative to conventional banana systems, but there was no significant change in nematode community composition. There were significant interactions between physical, chemical and nematode community measurements in the soil, particularly with soil C measurements, confirming the need for a holistic set of indicators to aid soil management. There was no single indicator of soil health for the Australian banana industry; rather, a set of soil health indicators that would allow the measurement of soil improvements should include bulk density, soil C, pH, EC, total N, extractable P, ECEC and soil nematode community structure.
Abstract:
Heavy wheel traffic causes soil compaction, which adversely affects crop production and may persist for several years. We applied known compaction forces to entire plots annually for 5 years, and then determined the duration of the adverse effects on the properties of a Vertisol and the performance of crops under no-till dryland cropping with residue retention. For up to 5 years after a final treatment with a 10 Mg axle load on wet soil, soil shear strength at 70-100 mm and cone index at 180-360 mm were significantly (P < 0.05) higher than in a control treatment, and soil water storage and grain yield were lower. We conclude that compaction effects persisted because (1) there were insufficient wet-dry cycles to swell and shrink the entire compacted layer, (2) soil loosening by tillage was absent, and (3) there were fewer earthworms in the compacted soil. Compaction of dry soil with 6 Mg had little effect at any time, indicating that by using wheel traffic only when the soil is dry, problems can be avoided. Unfortunately such a restriction is not always possible because sowing, tillage and harvest operations often need to be done when the soil is wet. A more generally applicable solution, which also ensures timely operations, is the permanent separation of wheel zones and crop zones in the field, a practice known as controlled traffic farming. Where a compacted layer already exists, even on a clay soil, management options to hasten repair should be considered, e.g. tillage, deep ripping, sowing a ley pasture or sowing crop species more effective at repairing compacted soil.
Abstract:
The leaching of phosphorus (P) within soils can be a limiting consideration for the sustainable operation of intensive livestock enterprises. Sorption curves are widely used to assist estimation of P retention, though the effect of effluent constituents on their accuracy is not well understood. We conducted a series of P sorption-desorption batch experiments with an Oxic Haplustalf (soil 1), Haplusterts (soils 2 and 3), and a Natrustalf (soil 4). Phosphorus sources included effluent, orthophosphate-P in a matrix replicating the effluent's salt constituents (the reference solution), and an orthophosphate-P solution. Treated soils were incubated for up to 193 days before sequential desorption extraction. Effluent constituents, probably the organic or particulate components, temporarily increased the vulnerability of sorbed P to desorption. The increase in vulnerability was removed by 2-113 days of incubation (25°C). Despite vigorous extraction for 20 consecutive days, some P sorbed as part of the treatments of soils 1 and 2 was not desorbed. The increased vulnerability due to effluent constituents lasted a maximum of about one cropping season and, for all other treatments, adsorption curves overestimated vulnerability to desorption. Therefore, adsorption curves provide a conservative estimate of vulnerability to desorption where effluent is used in continued crop production in these soils.
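Sorption curves of the kind used here are commonly fitted with an empirical isotherm; the abstract does not name a model, so the Freundlich form and the batch data below are assumptions for illustration.

import numpy as np
from scipy.optimize import curve_fit

def freundlich(c, kf, n):
    """Freundlich isotherm: sorbed P (mg/kg) as a function of solution P (mg/L)."""
    return kf * c**n

# Hypothetical batch-experiment data for one soil.
c_solution = np.array([0.05, 0.1, 0.5, 1.0, 5.0, 10.0])   # mg P/L
s_sorbed = np.array([18, 30, 75, 105, 220, 300])          # mg P/kg

(kf, n), _ = curve_fit(freundlich, c_solution, s_sorbed, p0=(50.0, 0.5))
print(f"Kf = {kf:.1f}, n = {n:.2f}")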
Abstract:
Salinity, sodicity, acidity, and phytotoxic levels of chloride (Cl) in subsoils are major constraints to crop production in many soils of north-eastern Australia because they reduce the ability of crop roots to extract water and nutrients from the soil. The complex interactions and correlations among soil properties result in multicollinearity between soil properties and crop yield that makes it difficult to determine which constraint is the major limitation. We used ridge-regression analysis to overcome collinearity and evaluate the contribution of soil factors and water supply to the variation in the yields of 5 winter crops on soils with various levels and combinations of subsoil constraints in the region. Subsoil constraints measured were soil Cl, electrical conductivity of the saturation extract (ECse), and exchangeable sodium percentage (ESP). The ridge-regression procedure selected several of the variables used in a descriptive model, which included in-crop rainfall, plant-available soil water at sowing in the 0.90-1.10 m soil layer, and soil Cl in the 0.90-1.10 m soil layer, and accounted for 77-85% of the variation in the grain yields of the 5 winter crops. Inclusion of ESP of the topsoil (0.0-0.10 m soil layer) marginally increased the descriptive capability of the models for bread wheat, barley and durum wheat. Subsoil Cl concentration was found to be an effective substitute for subsoil water extraction. The estimates of the critical levels of subsoil Cl for a 10% reduction in grain yield were 492 mg Cl/kg for chickpea, 662 mg Cl/kg for durum wheat, 854 mg Cl/kg for bread wheat, 980 mg Cl/kg for canola, and 1012 mg Cl/kg for barley, suggesting that chickpea and durum wheat were more sensitive to subsoil Cl than bread wheat, barley, and canola.
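A ridge regression of the form described, with standardised predictors so that correlated soil variables do not destabilise the coefficients, might be set up as in this sketch; the predictors follow the abstract, but the data, coefficients and penalty value are invented.

import numpy as np
from sklearn.linear_model import Ridge
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n_sites = 60
# Hypothetical site data: in-crop rainfall (mm), plant-available water at
# sowing in the 0.90-1.10 m layer (mm), subsoil Cl (mg/kg), topsoil ESP (%).
X = np.column_stack([
    rng.uniform(150, 450, n_sites),    # in-crop rainfall
    rng.uniform(0, 40, n_sites),       # plant-available water, 0.90-1.10 m
    rng.uniform(50, 1500, n_sites),    # subsoil Cl
    rng.uniform(1, 20, n_sites),       # topsoil ESP
])
# Invented response: yield rises with water supply, falls with subsoil Cl.
y = 1.5 + 0.004*X[:, 0] + 0.02*X[:, 1] - 0.001*X[:, 2] + rng.normal(0, 0.3, n_sites)

X_std = StandardScaler().fit_transform(X)
model = Ridge(alpha=1.0).fit(X_std, y)   # alpha is the ridge penalty
print(dict(zip(["rain", "PAW", "Cl", "ESP"], model.coef_.round(3))))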
Abstract:
The fate of nitrogen (N) applied in biosolids was investigated in a forage production system on an alluvial clay loam soil in south-eastern Queensland, Australia. Biosolids were applied in October 2002 at rates of 6, 12, 36, and 54 dry t/ha for aerobically digested biosolids (AE) and 8, 16, 48, and 72 dry t/ha for anaerobically digested biosolids (AN). Rates were based on multiples of the Nitrogen Limited Biosolids Application rate (0.5, 1, 3, and 4.5 NLBAR) for each type of biosolid. The experiment included an unfertilised control and a fertilised control that received multiple applications of synthetic fertiliser. Forage sorghum was planted 1 week after biosolids application and harvested 4 times between December 2002 and May 2003. Dry matter production was significantly greater from the biosolids-treated plots (21-27 t/ha) than from the unfertilised (16 t/ha) and fertilised (18 t/ha) controls. The harvested plant material removed an extra 148-488 kg N from the biosolids-treated plots. Partial N budgets were calculated for the 1 NLBAR and 4.5 NLBAR treatments for each biosolids type at the end of the crop season. Crop removal accounted for only 25-33% of the applied N in the 1 NLBAR treatments and as little as 8-15% with 4.5 NLBAR. Residual biosolids N was predominantly in the form of organic N (38-51% of applied biosolids N), although there was also a significant proportion (10-23%) as NO3-N, predominantly in the top 0.90 m of the soil profile. From 12 to 29% of applied N was unaccounted for, and presumed to be lost as ammonia and/or gaseous nitrogen as a consequence of volatilisation or denitrification, respectively. In-season mineralisation of organic N in biosolids was 43-59% of the applied organic N, which was much greater than the 15% (AN) to 25% (AE) expected on the basis of current NLBAR calculation methods. Excessive biosolids application produced little additional biomass but led to high soil mineral N concentrations that were vulnerable to multiple loss pathways. Queensland Guidelines need to account for higher rates of mineralisation and losses via denitrification and volatilisation, and should therefore encourage lower application rates to achieve optimal plant growth and minimise the potential for detrimental impacts on the environment.
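The partial N budget reported reduces to simple bookkeeping against applied N; the sketch below illustrates it with rounded, invented figures that sit within the ranges given in the abstract.

def partial_n_budget(applied_n, crop_removal, residual_organic, residual_no3):
    """Each N fate as a percentage of applied N (kg N/ha); the remainder is
    presumed lost via volatilisation and/or denitrification."""
    pct = lambda x: round(100.0 * x / applied_n, 1)
    accounted = crop_removal + residual_organic + residual_no3
    return {
        "crop removal %": pct(crop_removal),
        "residual organic N %": pct(residual_organic),
        "residual NO3-N %": pct(residual_no3),
        "unaccounted (gaseous losses) %": pct(applied_n - accounted),
    }

# Illustrative high-rate (4.5 NLBAR-style) treatment: 2000 kg N/ha applied.
print(partial_n_budget(2000, crop_removal=240, residual_organic=900,
                       residual_no3=380))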
Abstract:
The first larval instar has been identified as a critical stage for population mortality in Lepidoptera, yet because of the small body size of these larvae, the factors that contribute to mortality under field conditions are still not clear. Dispersal behaviour has been suggested as a significant but overlooked factor contributing to mortality in first-instar lepidopteran larvae. The impact that leaving the host plant has on the mortality rate of Helicoverpa armigera neonates was examined in field crops and laboratory trials. In this study the following were examined: (1) the effects of soil surface temperature, and the level of shade within the crop, on the mortality of neonates on the soil after dropping off from the host plant; (2) the percentage of neonates that dropped off from a host plant and landed on the soil; and (3) the effects of exposure to different soil surface temperatures on the development and mortality of neonates. The findings of this study showed that: (1) on the soil, surface temperatures above 43°C were lethal for neonates, and exposure to these temperatures contributed greatly to the overall mortality rate observed; however, the fate of neonates on the soil varied significantly depending on canopy closure within the crop; (2) at least 15% of neonates dropped off from the host plant and landed on the soil, meaning that the proportion of neonates exposed to these conditions is not trivial; and (3) 30 min of exposure to soil surface temperatures approaching the lethal level (>43°C) had no significant negative effects on the development and mortality of larvae through to the second instar. Overall, leaving the plant through drop-off contributes to first-instar mortality in crops with open canopies; however, survival of neonates that have lost contact with a host plant is possible, and becomes more likely later in the crop growing season.
Abstract:
One of the pathways for transfer of cadmium (Cd) through the food chain is addition of urban wastewater solids (biosolids) to soil, and many countries have restrictions on biosolid use to minimize crop Cd contamination. The basis of these restrictions often lies in laboratory or glasshouse experimentation of soil-plant transfer of Cd, but these studies are confounded by artefacts from growing crops in controlled laboratory conditions. This study examined soil to plant (wheat grain) transfer of Cd under a wide range of field environments under typical agronomic conditions, and compared the solubility and bioavailability of Cd in biosolids to soluble Cd salts. Solubility of biosolid Cd (measured by examining Cd partitioning between soil and soil solution) was found to be equal to or greater than that of soluble Cd salts, possibly due to competing ions added with the biosolids. Conversely, bioavailability of Cd to wheat and transfer to grain was less than that of soluble Cd salts, possibly due to addition of Zn with the biosolids, causing reduced plant uptake or grain loading, or due to complexation of soluble Cd2+ by dissolved organic matter.
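Partitioning between soil and soil solution is conventionally summarised as a distribution coefficient Kd; a minimal sketch under that assumption is below (the abstract does not state how partitioning was expressed, and the concentrations are invented). A lower Kd corresponds to the greater solubility reported for biosolid Cd.

def kd(cd_sorbed_mg_per_kg, cd_solution_mg_per_l):
    """Distribution coefficient Kd (L/kg) = sorbed Cd / solution Cd."""
    return cd_sorbed_mg_per_kg / cd_solution_mg_per_l

# Illustrative values: equal total sorbed Cd, more in solution for biosolids.
print("biosolid-amended soil Kd:", kd(0.80, 0.005), "L/kg")
print("Cd-salt-amended soil Kd: ", kd(0.80, 0.004), "L/kg")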
Abstract:
Thirty-seven surface (0-0.10 or 0-0.20 m) soils covering a wide range of soil types (16 Vertosols, 6 Ferrosols, 6 Dermosols, 4 Hydrosols, 2 Kandosols, 1 Sodosol, 1 Rudosol, and 1 Chromosol) were exhaustively cropped in 2 glasshouse experiments. The test species were Panicum maximum cv. Green Panic in Experiment A and Avena sativa cv. Barcoo in Experiment B. Successive forage harvests were taken until the plants could no longer grow in most soils because of severe potassium (K) deficiency. Soil samples were taken prior to cropping and after the final harvest in both experiments, and also after the initial harvest in Experiment B. Samples were analysed for solution K, exchangeable K (Exch K), tetraphenyl borate extractable K for extraction periods of 15 min (TBK15) and 60 min (TBK60), and boiling nitric acid extractable K (Nitric K). Inter-correlations between the initial levels of the various soil K parameters indicated that the following pools were in sequential equilibrium: solution K, Exch K, fast release fixed K [estimated as (TBK15-Exch K)], and slow release fixed K [estimated as (TBK60-TBK15)]. Structural K [estimated as (Nitric K-TBK60)] was not correlated with any of the other pools. However, following exhaustive drawdown of soil K by cropping, structural K became correlated with solution K, suggesting dissolution of K minerals when solution K was low. The change in the various K pools following cropping was correlated with K uptake at Harvest 1 (Experiment B only) and cumulative K uptake (both experiments). The change in Exch K for 30 soils was linearly related to cumulative K uptake (r = 0.98), although on average, K uptake was 35% higher than the change in Exch K. For the remaining 7 soils, K uptake considerably exceeded the change in Exch K. However, the changes in TBK15 and TBK60 were both highly linearly correlated with K uptake across all soils (r = 0.95 and 0.98, respectively). The slopes of the regression lines were not significantly different from unity, and the y-axis intercepts were very small. These results indicate that the plant is removing K from the TBK pool. Although the change in Exch K did not consistently equate with K uptake across all soils, initial Exch K was highly correlated with K uptake (r = 0.99) if one Vertosol was omitted. Exchangeable K is therefore a satisfactory diagnostic indicator of soil K status for the current crop. However, the change in Exch K following K uptake is soil-dependent, and many soils with large amounts of TBK relative to Exch K were able to buffer changes in Exch K. These soils tended to be Vertosols occurring on floodplains. In contrast, 5 soils (a Dermosol, a Rudosol, a Kandosol, and 2 Hydrosols) with large amounts of TBK did not buffer decreases in Exch K caused by K uptake, indicating that the TBK pool in these soils was unavailable to plants under the conditions of these experiments. It is likely that K fertiliser recommendations will need to take account of whether the soil has TBK reserves, and the availability of these reserves, when deciding rates required to raise exchangeable K status to adequate levels.
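The K pools in this abstract are defined as simple differences between extractions, so they can be computed directly; the sketch below follows those definitions exactly, with illustrative values only.

def k_pools(exch_k, tbk15, tbk60, nitric_k):
    """K pools (mg K/kg) from the abstract's definitions:
    fast-release fixed K = TBK15 - Exch K
    slow-release fixed K = TBK60 - TBK15
    structural K         = Nitric K - TBK60"""
    return {
        "fast-release fixed K": tbk15 - exch_k,
        "slow-release fixed K": tbk60 - tbk15,
        "structural K": nitric_k - tbk60,
    }

# Illustrative floodplain-Vertosol-like soil with large TBK reserves.
print(k_pools(exch_k=250, tbk15=900, tbk60=1400, nitric_k=2600))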