19 results for SURFACE CONTAMINATION
in eResearch Archive - Queensland Department of Agriculture
Abstract:
Herbicide runoff from cropping fields has been identified as a threat to the Great Barrier Reef ecosystem. A field investigation was carried out to monitor the changes in runoff water quality resulting from four different sugarcane cropping systems that included different herbicides and contrasting tillage and trash management practices. These were (i) Conventional - tillage (beds and inter-rows) with residual herbicides used; (ii) Improved - only the beds were tilled (zonal) with reduced residual herbicides used; (iii) Aspirational - minimum tillage (one pass of a single tine ripper before planting) with trash mulch, no residual herbicides and a legume intercrop after cane establishment; and (iv) New Farming System (NFS) - minimum tillage as in Aspirational practice with a grain legume rotation and a combination of residual and knockdown herbicides. Results suggest soil and trash management had a larger effect on the herbicide losses in runoff than the physico-chemical properties of herbicides. Improved practices with 30% lower atrazine application rates than used in conventional systems reduced runoff volumes by 40% and atrazine loss by 62%. There was a 2-fold variation in atrazine and >10-fold variation in metribuzin loads in runoff water between reduced tillage systems differing in soil disturbance and surface residue cover from the previous rotation crops, despite the same herbicide application rates. The elevated risk of offsite losses from herbicides was illustrated by the high concentrations of diuron (14 μg/L) recorded in runoff that occurred >2.5 months after herbicide application in a 1st ratoon crop. A cropping system employing less persistent non-selective herbicides and an inter-row soybean mulch resulted in no residual herbicide contamination in runoff water, but recorded 12.3% lower yield compared to Conventional practice.
These findings reveal a trade-off between achieving good water quality with minimal herbicide contamination and maintaining farm profitability with good weed control.
Abstract:
Land application of piggery effluent (containing urine, faeces, water, and wasted feed) is under close scrutiny as a potential source of water resource contamination with phosphorus (P). This paper investigates two case studies of the impact of long-term piggery effluent-P application to soil. A Natrustalf (Sodosol) at P1 has received a net load of 3700 kg effluent P/ha over 19 years. The Haplustalf (Dermosol) selected (P2) has received a net load of 310 000 kg P/ha over 30 years. Total, bicarbonate extractable, and soluble P forms were determined throughout the soil profiles for paired (irrigated and unirrigated) sites at P1 and P2, as well as P sorption and desorption characteristics. Surface bicarbonate (PB, 0 - 0.05 m depth) and dilute CaCl2 extractable molybdate-reactive P (PC) have been significantly elevated by effluent irrigation (P1: PB unirrigated 23±1, irrigated 290±6; PC unirrigated 0.03±0.00, irrigated 23.9±0.2. P2: PB unirrigated 72±48, irrigated 3950±1960; PC unirrigated 0.7±0.0, irrigated 443±287 mg P/kg; mean±s.d.). Phosphorus enrichment to 1.5 m, detected as PB, was observed at P2. Elevated concentrations of CaCl2 extractable organic P forms (POC; estimated by non-molybdate reactive P in centrifuged supernatants) were observed from the soil surface of P1 to a depth of 0.4 m. Despite the extent of effluent application at both of these sites, only P1 displayed evidence of significant accumulation of POC. The increase in surface soil total P (0 - 0.05 m depth) due to effluent irrigation was much greater than laboratory P sorption (>25 times for P1; >57 times for P2) for a comparable range of final solution concentrations (desorption extracts ranged from 1-5 mg P/L for P1 and 50-80 mg P/L for P2). Precipitation of sparingly soluble P phases was evidenced in the soils of the P2 effluent application area.
Abstract:
Attention is directed at land application of piggery effluent (containing urine, faeces, water, and wasted feed) as a potential source of water resource contamination with phosphorus (P). This paper summarises P-related properties of soil from 0-0.05 m depth at 11 piggery effluent application sites, in order to explore the impact that effluent application has had on the potential for run-off transport of P. The sites investigated were situated on Alfisol, Mollisol, Vertisol, and Spodosol soils in areas that received effluent for 1.5-30 years (estimated effluent-P applications of 100-310000 kg P/ha in total). Total (PT), bicarbonate extractable (PB), and soluble P forms were determined for the soil (0-0.05 m) at paired effluent and no-effluent sites, as well as texture, oxalate-extractable Fe and Al, organic carbon, and pH. All forms of soil P at 0-0.05 m depth increased with effluent application (PB at effluent sites was 1.7-15 times that at no-effluent sites) at 10 of the 11 sites. Increases in PB were strongly related to net P applications (regression analysis of log values for 7 sites with complete data sets: 82.6% of variance accounted for, P < 0.01). Effluent irrigation tended to increase the proportion of soil PT in dilute CaCl2-extractable forms (PTC: effluent average 2.0%; no-effluent average 0.6%). The proportion of PTC in non-molybdate reactive forms (centrifuged supernatant) decreased (no-effluent average, 46.4%; effluent average, 13.7%). Anaerobic lagoon effluent did not reliably acidify soil, since no consistent relationship was observed for pH with effluent application. Soil organic carbon was increased in most of the effluent areas relative to the no-effluent areas. The four effluent areas where organic carbon was reduced had undergone intensive cultivation and cropping. Current effluent management at many of the piggeries failed to maximise the potential for waste P recapture.
Ten of the case-study effluent application areas have received effluent-P in excess of crop uptake. While this may not represent a significant risk of leaching where sorption retains P, it has increased the risk of transport of P by run-off. Where such sites are close to surface water, run-off P loads should be managed.
Abstract:
A panel of 19 monoclonal antibodies (mAbs) was used to study the immunological variability of Lettuce mosaic virus (LMV), a member of the genus Potyvirus, and to perform a first epitope characterization of this virus. Based on their specificity of recognition against a panel of 15 LMV isolates, the mAbs could be clustered in seven reactivity groups. Surface plasmon resonance analysis indicated the presence, on the LMV particles, of at least five independent recognition/binding regions, correlating with the seven mAb reactivity groups. The results demonstrate that LMV shows significant serological variability and shed light on the LMV epitope structure. The various mAbs should prove a new and efficient tool for LMV diagnostics and field epidemiology studies.
Abstract:
We examined the effect of surface-applied treatments on the above-ground decay resistance of the tenon of mortice-and-tenon timber joints designed to simulate joinery that is exposed to the weather. Joints made from untreated radiata pine, Douglas-fir, brush box, spotted gum and copper-chrome-arsenic (CCA) treated radiata pine were exposed to the weather for 9 y on above-ground racks at five sites throughout eastern Australia. Results indicate (1) a poorly maintained external paint film generally accelerated decay, (2) a brush coat of water-repellent preservative inside the joints often extended serviceability (in some cases by a factor of up to seven times that of untreated joints) and (3) the level of protection provided by a coat of primer applied inside the joint varied and in most cases was not as effective as the water-repellent preservative treatment.
Abstract:
High-value fruit crops are exposed to a range of environmental conditions that can reduce fruit quality. Solar injury (SI) or sunburn is a common disorder in tropical, sub-tropical, and temperate climates and is related to: 1) high fruit surface temperature; 2) high visible light intensity; and 3) ultraviolet radiation (UV). Positional changes in fruit that are caused by increased weight or abrupt changes that result from summer pruning, limb breakage, or other damage to the canopy can expose fruit to solar radiation levels, fruit surface temperatures, and UV exposure higher than the conditions to which they are adapted. In our studies, we examined the effects of high fruit surface temperature, saturating photosynthetically-active radiation (PAR), and short-term UV exposure on chlorophyll fluorescence, respiration, and photosynthesis of fruit peel tissues from tropical and temperate fruit in a simulation of these acute environmental changes. All tropical fruits (citrus, macadamia, avocado, pineapple, and custard apple) and the apple cultivars 'Gala', 'Gold Rush', and 'Granny Smith' increased dark respiration (A0) when exposed to UV, suggesting that UV repair mechanisms were induced. The maximum quantum efficiency of photosystem II (Fv/Fm) and the quantum efficiency of photosystem II (ΦII) were unaffected, indicating no adverse effects on photosystem II (PSII). In contrast, 'Braeburn' apple had a reduced Fv/Fm with no increase in A0 on all sampling dates. There was a consistent pattern in all studies. When Fv/Fm was unaffected by UV treatment, A0 increased significantly. Conversely, when Fv/Fm was reduced by UV treatment, then A0 was unaffected. The pattern suggests that when UV repair mechanisms are effective, PSII is adequately protected, and that this protection occurs at the cost of higher respiration.
However, when the UV repair mechanisms are ineffective, not only is PSII damaged, but there is additional short-term damage to the repair mechanisms, indicated by a lack of respiration to provide energy.
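The fluorescence parameters cited above follow the standard dark-adapted definitions; as a minimal sketch (the standard formula and illustrative values, not data from this study):

```python
def fv_fm(f0: float, fm: float) -> float:
    """Maximum quantum efficiency of PSII: Fv/Fm = (Fm - F0) / Fm,
    from dark-adapted minimal (F0) and maximal (Fm) fluorescence."""
    if not fm > f0 > 0:
        raise ValueError("require Fm > F0 > 0")
    return (fm - f0) / fm

# Illustrative values only; unstressed leaves typically approach ~0.83
print(fv_fm(f0=300.0, fm=1500.0))  # 0.8
```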
Abstract:
Aflatoxins are highly carcinogenic mycotoxins produced by two fungi, Aspergillus flavus and A. parasiticus, under specific moisture and temperature conditions before harvest and/or during storage of a wide range of crops including maize. Modelling of interactions between host plant and environment during the season can enable quantification of preharvest aflatoxin risk and its potential management. A model was developed to quantify climatic risks of aflatoxin contamination in maize using principles previously used for peanuts. The model outputs an aflatoxin risk index in response to seasonal temperature and soil moisture during the maize grain filling period using APSIM's maize module. The model performed well in simulating climatic risk of aflatoxin contamination in maize as indicated by a significant R2 (P ≤ 0.01) between aflatoxin risk index and the measured aflatoxin B1 in crop samples, which was 0.69 for a range of rainfed Australian locations and 0.62 when irrigated locations were also included in the analysis. The model was further applied to determine probabilities of exceeding a given aflatoxin risk in four non-irrigated maize growing locations of Queensland using 106 years of historical climatic data. Locations with both dry and hot climates had a much higher probability of higher aflatoxin risk compared with locations having either dry or hot conditions alone. Scenario analysis suggested that under non-irrigated conditions the risk of aflatoxin contamination could be minimised by adjusting sowing time or selecting an appropriate hybrid to better match the grain filling period to coincide with lower temperature and water stress conditions.
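The exceedance probabilities described above were derived from simulations over 106 years of climate data; the final step can be sketched independently of the crop model (hypothetical index values, not the study's output):

```python
def exceedance_probability(risk_indices, threshold):
    """Empirical probability that a season's aflatoxin risk index
    exceeds a given threshold, over a set of simulated seasons."""
    if not risk_indices:
        raise ValueError("no seasons supplied")
    return sum(r > threshold for r in risk_indices) / len(risk_indices)

# Hypothetical seasonal risk indices, for illustration only
seasons = [0.2, 0.9, 0.4, 0.7, 0.1, 0.8]
print(exceedance_probability(seasons, 0.6))  # 0.5
```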
Abstract:
Recent incidents of mycotoxin contamination (particularly aflatoxins and fumonisins) have demonstrated a need for an industry-wide management system to ensure Australian maize meets the requirements of all domestic users and export markets. Results of recent surveys are presented, demonstrating overall good conformity with nationally accepted industry marketing standards but with occasional samples exceeding these levels. This paper describes mycotoxin-related hazards inherent in the Australian maize production system and a methodology combining good agricultural practices and the hazard analysis critical control point framework to manage risk.
Abstract:
The first larval instar has been identified as a critical stage for population mortality in Lepidoptera, yet due to the body size of these larvae, the factors that contribute to mortality under field conditions are still not clear. Dispersal behaviour has been suggested as a significant but overlooked factor contributing to mortality in first-instar lepidopteran larvae. The impact that leaving the host plant has on the mortality rate of Helicoverpa armigera neonates was examined in field crops and laboratory trials. In this study the following are examined: (1) the effects of soil surface temperature, and the level of shade within the crop, on the mortality of neonates on the soil after dropping off from the host plant; (2) the percentage of neonates that dropped off from a host plant and landed on the soil; and (3) the effects of exposure to different soil surface temperatures on the development and mortality of neonates. The findings of this study showed that: (1) on the soil, surface temperatures above 43°C were lethal for neonates, and exposure to these temperatures contributed greatly to the overall mortality rate observed; however, the fate of neonates on the soil varied significantly depending on canopy closure within the crop; (2) at least 15% of neonates dropped off from the host plant and landed on the soil, meaning that the proportion of neonates exposed to these conditions is not trivial; and (3) 30 min exposure to soil surface temperatures approaching the lethal level (>43°C) had no significant negative effects on the development and mortality of larvae through to the second instar. Overall, leaving the plant through drop-off contributes to first-instar mortality in crops with open canopies; however, survival of neonates that have lost contact with a host plant is possible, and becomes more likely later in the crop growing season.
Abstract:
In Australia, communities are concerned about atrazine being detected in drinking water supplies. It is important to understand mechanisms by which atrazine is transported from paddocks to waterways if we are to reduce movement of agricultural chemicals from the site of application. Two paddocks cropped with grain sorghum on a Black Vertosol were monitored for atrazine, potassium chloride (KCl) extractable atrazine, desethylatrazine (DEA), and desisopropylatrazine (DIA) at 4 soil depths (0-0.05, 0.05-0.10, 0.10-0.20, and 0.20-0.30 m) and in runoff water and runoff sediment. Atrazine + DEA + DIA (total atrazine) had a half-life in soil of 16-20 days, dissipating more rapidly than in many earlier reports. Atrazine extracted in dilute potassium chloride, considered available for weed control, was initially 34% of the total and had a half-life of 15-20 days until day 30, after which it dissipated rapidly with a half-life of 6 days. We conclude that, in this region, atrazine may not pose a risk for groundwater contamination, as only 0.5% of applied atrazine moved deeper than 0.20 m into the soil, where it dissipated rapidly. In runoff (including suspended sediment) atrazine concentrations were greatest (85 μg/L) during the first runoff event (57 days after application) and declined with time. After 160 days, the total atrazine lost in runoff was 0.4% of the initial application. The total atrazine concentration in runoff was strongly related to the total concentration in soil, as expected. Even after 98% of the KCl-extractable atrazine had dissipated (and no longer provided weed control), runoff concentrations still exceeded the human health guideline value of 40 μg/L. For total atrazine in soil (0-0.05 m), the range for coefficient of soil sorption (Kd) was 1.9-28.4 mL/g and for soil organic carbon sorption (KOC) was 100-2184 mL/g, increasing with time of contact with the soil and rapid dissipation of the more soluble, available phase.
Partition coefficients in runoff for total atrazine were initially 3, increasing to 32 and 51 with time; values for DEA were half these. To minimise atrazine losses, cultural practices should be adopted that maximise rain infiltration (thereby minimising runoff) and minimise concentrations in the soil surface.
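The half-lives above imply first-order dissipation; assuming (hypothetically) that runoff concentration declines in proportion to soil concentration, a sketch of the implied timing against the 40 μg/L guideline:

```python
import math

def concentration(c0: float, half_life_days: float, t_days: float) -> float:
    """First-order dissipation: C(t) = C0 * 0.5 ** (t / t_half)."""
    return c0 * 0.5 ** (t_days / half_life_days)

# Illustration only: starting from the 85 ug/L first-event runoff
# concentration and an assumed 18-day half-life (mid-range of the
# 16-20 days reported for soil), days until 40 ug/L would be reached:
t_days = 18 * math.log(85 / 40, 2)
print(round(t_days, 1))  # 19.6
```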
Abstract:
Hybrids between Corymbia torelliana (F.Muell.) K.D.Hill & L.A.S.Johnson and C. citriodora subsp. variegata (F.Muell.) A.R.Bean & M.W.McDonald are used extensively to establish forestry plantations in subtropical Australia. Methods were developed for in vitro seed germination, shoot multiplication and plantlet formation that could be used to establish in vitro and ex vitro clone banks of juvenile Corymbia hybrids. Effects of sodium hypochlorite concentration and exposure time on seed contamination and germination, and effects of cytokinin and auxin concentrations on shoot multiplication and subsequent rooting, were assessed. A two-step surface sterilisation procedure, involving 70% ethanol followed by 1% sodium hypochlorite, provided almost no contamination and at least 88% germination. A novel method of cytokinin-free node culture proved most effective for in vitro propagation. Lateral bud break of primary shoots was difficult to induce by using cytokinin, but primary shoots rooted prolifically, elongated rapidly and produced multiple nodes in the absence of exogenous cytokinin. Further multiplication was obtained either by elongating lateral shoots of nodal explants in cytokinin-free medium or by inducing organogenic callus and axillary shoot proliferation with 2.2 µM benzyladenine. Plantlets were produced using an in vitro soil-less method that provided extensive rooting in sterile propagation mixture. These methods provide a means for simultaneous laboratory storage and field-testing of clones before selection and multiplication of desired genotypes.
Abstract:
Surface losses of nitrogen from horticulture farms in coastal Queensland, Australia, may have the potential to cause eutrophication of nearby sensitive coastal marine habitats. A case study of the potential extent of such losses was investigated in a coastal macadamia plantation. Nitrogen losses were quantified in 5 consecutive runoff events during the 13-month study. Irrigation did not contribute to surface flows. Runoff was generated by storms at combined intensities and durations that were 20–40 mm/h for >9 min. These intensities and durations were within expected short-term (1 year) and long-term (up to 20 years) frequencies of rainfall in the study area. Surface flow volumes were 5.3 ± 1.1% of the episodic rainfall generated by such storms. Therefore, the largest part of each rainfall event was attributed to infiltration and drainage in this farm soil (Kandosol). The estimated annual loss of total nitrogen in runoff was 0.26 kg N/ha.year, representing a minimal loading of nitrogen in surface runoff when compared to other studies. The weighted average concentrations of total sediment nitrogen (TSN) and total dissolved nitrogen (TDN) generated in the farm runoff were 2.81 ± 0.77% N and 1.11 ± 0.27 mg N/L, respectively. These concentrations were considerably greater than ambient levels in an adjoining catchment waterway. Concentrations of TSN and TDN in the waterway were 0.11 ± 0.02% N and 0.50 ± 0.09 mg N/L, respectively. The steep concentration gradient of TSN and TDN between the farm runoff and the waterway demonstrated the occurrence of nutrient loading from the farming landscapes to the waterway. The TDN levels in the stream exceeded the current specified threshold of 0.2–0.3 mg N/L for eutrophication of such a waterway.
Therefore, while the estimate of annual loading of N from runoff losses was comparatively low, it was evident that the stream catchment and associated agricultural land uses were already characterised by significant nitrogen loadings that pose eutrophication risks. The reported levels of nitrogen and the proximity of such waterways (8 km) to the coastline may also have implications for the nearshore (oligotrophic) marine environment during periods of turbulent flow.
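The annual load estimate above combines runoff depth and concentration; as a minimal sketch of that arithmetic (the 20 mm runoff depth is hypothetical, chosen only to illustrate the unit conversions):

```python
def dissolved_n_load_kg_per_ha(runoff_mm: float, tdn_mg_per_l: float) -> float:
    """Dissolved N load in runoff: 1 mm of runoff over 1 ha = 10,000 L,
    and mg -> kg is a factor of 1e-6."""
    litres_per_ha = runoff_mm * 10_000.0
    return litres_per_ha * tdn_mg_per_l * 1e-6

# Hypothetical 20 mm of annual runoff at the reported 1.11 mg N/L
print(round(dissolved_n_load_kg_per_ha(20.0, 1.11), 3))  # 0.222
```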
Abstract:
Interest in cashew production in Australia has been stimulated by domestic and export market opportunities and suitability of large areas of tropical Australia. Economic models indicate that cashew production is profitable at 2.8 t ha-1 nut-in-shell (NIS). Balanced plant nutrition is essential to achieve economic yields in Australia, with nitrogen (N) of particular importance because of its capacity to modify growth, affect nut yield and cause environmental degradation through soil acidification and off-site contamination. The study on a commercial cashew plantation at Dimbulah, Australia, investigated the effect of N rate and timing on cashew growth, nut production, N leaching and soil chemical properties over five growth cycles (1995-1999). Nitrogen was applied during the main periods of vegetative (December-April) and reproductive (June-October) growth. Commercial NIS yields (up to 4.4 t ha-1 from individual trees) that exceeded the economic threshold of 2.8 t ha-1 were achieved. The yield response was mainly determined by canopy size, as mean nut weight, panicle density and nuts per panicle were largely unaffected by N treatments. Nitrogen application confined to the main period of vegetative growth (December-April) produced a seasonal growth pattern that corresponded most consistently with highest NIS yield. This N timing also reduced late season flowering and undesirable post-November nut drop. Higher yields were not produced at N rates greater than 17 g m-2 of canopy surface area (equating to 210 kg N ha-1 for mature size trees). High yields were attained when N concentrations in Mveg leaves in May-June were about 2%, but this assessment occurs at a time when it is not feasible to correct N deficiency. The Mflor leaf of the preceding November, used in conjunction with the Mveg leaf, was proposed as a diagnostic tool to guide N rate decisions. Leaching of nitrate-N and acidification of the soil profile were recorded to 0.9 m.
This is an environmental and sustainability hazard, and demonstrates that improved methods of N management are required.
Abstract:
Herbicide contamination from agriculture is a major issue worldwide, and has been identified as a threat to freshwater and marine environments in the Great Barrier Reef World Heritage Area in Australia. The triazine herbicides are of particular concern because of potential adverse effects on both photosynthetic organisms and vertebrate development. To date a number of bioremediation strategies have been proposed for triazine herbicides, but they are unlikely to be implemented due to their reliance upon the release of genetically modified organisms. We propose an alternative strategy using a free-enzyme bioremediant, which is unconstrained by the issues surrounding the use of live organisms. Here we report an initial field trial with an enzyme-based product, demonstrating that the technology is capable of remediating water bodies contaminated with the most common triazine herbicide, atrazine.
Abstract:
The impact of three cropping histories (sugarcane, maize and soybean) and two tillage practices (conventional tillage and direct drill) on plant-parasitic and free-living nematodes in the following sugarcane crop was examined in a field trial at Bundaberg. Soybean reduced populations of lesion nematode (Pratylenchus zeae) and root-knot nematode (Meloidogyne javanica) in comparison to previous crops of sugarcane or maize but increased populations of spiral nematode (Helicotylenchus dihystera) and maintained populations of dagger nematode (Xiphinema elongatum). However, the effect of soybean on P. zeae and M. javanica was no longer apparent 15 weeks after planting sugarcane, while later in the season, populations of these nematodes following soybean were as high as or higher than after maize or sugarcane. Populations of P. zeae were initially reduced by cultivation but due to strong resurgence tended to be higher in conventionally tilled than direct drill plots at the end of the plant crop. Even greater tillage effects were observed with M. javanica and X. elongatum, as nematode populations were significantly higher in conventionally tilled than direct drill plots late in the season. Populations of free-living nematodes in the upper 10 cm of soil were initially highest following soybean, but after 15, 35 and 59 weeks were lower than after sugarcane and contained fewer omnivorous and predatory nematodes. Conventional tillage increased populations of free-living nematodes in soil in comparison to direct drill and was also detrimental to omnivorous and predatory nematodes. These results suggest that crop rotation and tillage not only affect plant-parasitic nematodes directly, but also have indirect effects by impacting on natural enemies that regulate nematode populations. More than 2 million nematodes/m(2) were often present in crop residues on the surface of direct drill plots.
Bacterial-feeding nematodes were predominant in residues early in the decomposition process but fungal-feeding nematodes predominated after 15 weeks. This indicates that fungi become an increasingly important component of the detritus food web as decomposition proceeds, and that the rate of nutrient cycling decreases with time. Correlations between total numbers of free-living nematodes and mineral N concentrations in crop residues and surface soil suggested that the free-living nematode community may provide an indication of the rate of mineralisation of N from organic matter.