79 results for Herbicide runoff
in eResearch Archive - Queensland Department of Agriculture
Abstract:
The off-site transport of agricultural chemicals, such as herbicides, into freshwater and marine ecosystems is a world-wide concern. The adoption of farm management practices that minimise herbicide transport in rainfall-runoff is a priority for the Australian sugarcane industry, particularly in the coastal catchments draining into the World Heritage listed Great Barrier Reef (GBR) lagoon. In this study, residual herbicide runoff and infiltration were measured using a rainfall simulator in a replicated trial on a brown Chromosol with 90–100% cane trash blanket cover in the Mackay Whitsunday region, Queensland. Management treatments included conventional 1.5 m spaced sugarcane beds with a single row of sugarcane (CONV) and 2 m spaced, controlled traffic sugarcane beds with dual sugarcane rows (0.8 m apart) (2mCT). The aim was to simulate the first rainfall event after the application of the photosystem II inhibiting (PSII) herbicides ametryn, atrazine, diuron and hexazinone, applied by broadcast (100% coverage, on bed and furrow) and banded (50–60% coverage, on bed only) methods. These events comprised heavy rainfall 1 day after herbicide application, considered a worst-case scenario, or rainfall 21 days after application. The 2mCT rows had significantly (P < 0.05) less runoff (38%) and lower peak runoff rates (43%) than CONV rows for an average rainfall of 93 mm at 100 mm h−1 (1:20 yr Average Recurrence Interval). Additionally, final infiltration rates were higher in 2mCT rows than CONV rows, at 72 and 52 mm h−1 respectively. This resulted in load reductions of 60, 55, 47, and 48% for ametryn, atrazine, diuron and hexazinone from 2mCT rows, respectively. Herbicide losses in runoff were also reduced by 32–42% when applications were banded rather than broadcast. When rainfall was experienced 1 day after application, a large percentage of the herbicides was washed off the cane trash. However, by day 21, concentrations of herbicide residues on cane trash were lower and more resistant to washoff, resulting in lower losses in runoff. Consequently, ametryn and atrazine event mean concentrations in runoff were approximately 8-fold lower at day 21 compared with day 1, whilst diuron and hexazinone were only 1.6–1.9-fold lower, suggesting longer persistence of these chemicals. Runoff collected at the end of the paddock in natural rainfall events showed treatment differences consistent with, though smaller than, those in the rainfall simulation study. Overall, the combination of early application, banding and controlled traffic was most effective in reducing herbicide losses in runoff. Crown copyright © 2012
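The event mean concentration (EMC) comparisons above follow the usual definition of EMC as the total herbicide load of an event divided by its total runoff volume. A minimal flow-weighted sketch in Python, using hypothetical sample values rather than the study's data:

```python
# Flow-weighted event mean concentration (EMC) of a herbicide in runoff.
# EMC = total event load / total runoff volume; the sample values below
# are hypothetical, not measurements from this study.

def event_mean_concentration(concs_ug_per_L, flows_L_per_s, interval_s):
    """EMC from paired concentration/flow samples taken at a fixed interval."""
    load_ug = sum(c * q * interval_s
                  for c, q in zip(concs_ug_per_L, flows_L_per_s))
    volume_L = sum(q * interval_s for q in flows_L_per_s)
    return load_ug / volume_L  # micrograms per litre

# Hypothetical 5-minute samples across one simulated rainfall event
concs = [120.0, 95.0, 60.0, 30.0]  # herbicide concentration, ug/L
flows = [0.5, 1.8, 1.2, 0.4]       # runoff rate, L/s
print(f"EMC = {event_mean_concentration(concs, flows, 300):.1f} ug/L")
```

An 8-fold drop in EMC between the day-1 and day-21 events then corresponds simply to the day-21 ratio of load to volume being one-eighth of the day-1 ratio.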
Abstract:
Herbicide runoff from cropping fields has been identified as a threat to the Great Barrier Reef ecosystem. A field investigation was carried out to monitor the changes in runoff water quality resulting from four different sugarcane cropping systems that included different herbicides and contrasting tillage and trash management practices. These were: (i) Conventional - tillage of beds and inter-rows, with residual herbicides used; (ii) Improved - only the beds were tilled (zonal), with reduced residual herbicide use; (iii) Aspirational - minimum tillage (one pass of a single-tine ripper before planting) with trash mulch, no residual herbicides and a legume intercrop after cane establishment; and (iv) New Farming System (NFS) - minimum tillage as in the Aspirational practice, with a grain legume rotation and a combination of residual and knockdown herbicides. Results suggest that soil and trash management had a larger effect on herbicide losses in runoff than the physico-chemical properties of the herbicides. Improved practices, with atrazine application rates 30% lower than those used in conventional systems, reduced runoff volumes by 40% and atrazine loss by 62%. There was a 2-fold variation in atrazine and a >10-fold variation in metribuzin loads in runoff water between reduced tillage systems differing in soil disturbance and surface residue cover from the previous rotation crops, despite the same herbicide application rates. The elevated risk of offsite losses from herbicides was illustrated by the high concentration of diuron (14 μg L−1) recorded in runoff that occurred >2.5 months after herbicide application in a first ratoon crop. A cropping system employing less persistent non-selective herbicides and an inter-row soybean mulch resulted in no residual herbicide contamination in runoff water, but recorded a 12.3% lower yield compared with the Conventional practice. These findings reveal a trade-off between achieving good water quality with minimal herbicide contamination and maintaining farm profitability with good weed control.
Abstract:
Grass and broad-leaved weeds can reduce both yields and product marketability of desmanthus (Desmanthus virgatus) seed crops, even when cultural control strategies are used. Selective herbicides might economically control these weeds, but, prior to this study, the few herbicides tolerated by desmanthus did not control key weed contaminants of desmanthus seed crops. In this study, the tolerance of desmanthus cv. Marc to 55 herbicides used for selective weed control in other leguminous crops was assessed in 1 pot trial and 3 Queensland field trials. One field trial assessed the tolerance of desmanthus seedlings to combinations of the most promising pre-emergent and post-emergent herbicides. The pre-emergent herbicides imazaquin, imazethapyr, pendimethalin, oryzalin and trifluralin gave useful weed control with very little crop damage. The post-emergent herbicides haloxyfop, clethodim, propyzamide, carbetamide and dalapon were safe for controlling grass weeds in desmanthus. Selective post-emergence control of broad-leaved weeds was achieved using bentazone, bromoxynil and imazethapyr. One trial investigated salvaging second-year desmanthus crops from mature perennial weeds; atrazine, terbacil and hexazinone showed some potential in this role. Overall, our results show that desmanthus tolerates herbicides which collectively control a wide range of weeds encountered in Queensland. These, in combination with cultural weed control strategies, should control most weeds in desmanthus seed crops.
Abstract:
Runoff and sediment loss from forest roads were monitored for a two-year period in a Pinus plantation in southeast Queensland. Two classes of road were investigated: a gravelled road, which is used as a primary daily haulage route for the logging area, and an ungravelled road, which provides the main access route for individual logging compartments and is intensively used as a haulage route only during the harvest of these areas (approximately every 30 years). Both roads were subjected to routine traffic loads and maintenance during the study. Surface runoff in response to natural rainfall was measured and samples taken for the determination of sediment and nutrient (total nitrogen, total phosphorus, dissolved organic carbon and total iron) loads from each road. Results revealed that the mean runoff coefficient (runoff depth/rainfall depth) was consistently higher from the gravelled road plot (0.57) than from the ungravelled road (0.38). Total sediment loss over the two-year period was greater from the gravelled road plot (5.7 t km−1) than from the ungravelled road plot (3.9 t km−1). Suspended solids contributed 86% of the total sediment loss from the gravelled road, and 72% from the ungravelled road, over the two years. Nitrogen loads from the two roads were both relatively constant throughout the study, averaging 5.2 and 2.9 kg km−1 from the gravelled and ungravelled road, respectively. Mean annual phosphorus loads were 0.6 kg km−1 from the gravelled road and 0.2 kg km−1 from the ungravelled road. Organic carbon and total iron loads increased in the second year of the study, which was a much wetter year; these increases are thought to reflect the breakdown of organic matter in roadside drains and increased sediment generation, respectively. When road and drain maintenance (grading) was performed, runoff and sediment loss increased from both road types. Additionally, the breakdown of the gravel road base due to high traffic intensity during wet conditions resulted in the formation of deep (10 cm) ruts, which increased erosion. The Water Erosion Prediction Project (WEPP):Road model was used to compare predicted with observed runoff and sediment loss from the two road classes investigated. For individual rainfall events, WEPP:Road predictions showed strong agreement with observed values of runoff and sediment loss. WEPP:Road predictions for annual sediment loss from the entire forestry road network in the study area also showed reasonable agreement with the extrapolated observed values.
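The runoff coefficient and per-kilometre loads reported above reduce to simple ratios. A minimal sketch, where the plot length and sediment mass are hypothetical values chosen only to reproduce the reported figures:

```python
# Runoff coefficient and road-length-normalised sediment load.
# The plot dimensions and masses below are hypothetical illustrations.

def runoff_coefficient(runoff_depth_mm, rainfall_depth_mm):
    """Runoff coefficient = runoff depth / rainfall depth (dimensionless)."""
    return runoff_depth_mm / rainfall_depth_mm

def sediment_load_t_per_km(sediment_kg, road_length_m):
    """Normalise a measured sediment mass to tonnes per kilometre of road."""
    return (sediment_kg / 1000.0) / (road_length_m / 1000.0)

print(runoff_coefficient(22.8, 40.0))       # 0.57, as for the gravelled road
print(sediment_load_t_per_km(285.0, 50.0))  # 5.7 t/km from a 50 m plot
```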
Abstract:
In Australia, communities are concerned about atrazine being detected in drinking water supplies. It is important to understand the mechanisms by which atrazine is transported from paddocks to waterways if we are to reduce the movement of agricultural chemicals from the site of application. Two paddocks cropped with grain sorghum on a Black Vertosol were monitored for atrazine, potassium chloride (KCl) extractable atrazine, desethylatrazine (DEA), and desisopropylatrazine (DIA) at 4 soil depths (0-0.05, 0.05-0.10, 0.10-0.20, and 0.20-0.30 m) and in runoff water and runoff sediment. Atrazine + DEA + DIA (total atrazine) had a half-life in soil of 16-20 days, a more rapid dissipation than in many earlier reports. Atrazine extracted in dilute potassium chloride, considered available for weed control, was initially 34% of the total and had a half-life of 15-20 days until day 30, after which it dissipated rapidly with a half-life of 6 days. We conclude that, in this region, atrazine may not pose a risk for groundwater contamination, as only 0.5% of applied atrazine moved deeper than 0.20 m into the soil, where it dissipated rapidly. In runoff (including suspended sediment), atrazine concentrations were greatest (85 μg/L) during the first runoff event (57 days after application) and declined with time. After 160 days, the total atrazine lost in runoff was 0.4% of the initial application. The total atrazine concentration in runoff was strongly related to the total concentration in soil, as expected. Even after 98% of the KCl-extractable atrazine had dissipated (and no longer provided weed control), runoff concentrations still exceeded the human health guideline value of 40 μg/L. For total atrazine in soil (0-0.05 m), the coefficient of soil sorption (Kd) ranged from 1.9 to 28.4 mL/g and the soil organic carbon sorption coefficient (KOC) from 100 to 2184 mL/g, increasing with time of contact with the soil and rapid dissipation of the more soluble, available phase. Partition coefficients in runoff for total atrazine were initially 3, increasing to 32 and 51 with time; values for DEA were half these. To minimise atrazine losses, cultural practices should be adopted that maximise rain infiltration (thereby minimising runoff) and minimise concentrations in the soil surface.
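The half-life, Kd and KOC figures above rest on standard relationships: first-order decay C(t) = C0·exp(−kt) with half-life t½ = ln 2 / k, and KOC = Kd divided by the organic carbon fraction of the soil. A minimal sketch with illustrative inputs rather than the study's measurements:

```python
import math

# Standard first-order dissipation and sorption relationships; the inputs
# below are illustrative, not values measured in this study.

def half_life_days(c0, ct, elapsed_days):
    """Half-life from two concentrations, assuming C(t) = C0 * exp(-k * t)."""
    k = math.log(c0 / ct) / elapsed_days  # first-order rate constant, 1/day
    return math.log(2) / k

def koc(kd_mL_per_g, organic_carbon_percent):
    """Organic carbon sorption coefficient: KOC = Kd / f_oc."""
    return kd_mL_per_g / (organic_carbon_percent / 100.0)

print(half_life_days(1.0, 0.25, 36))  # two half-lives over 36 days -> 18.0
print(koc(1.9, 1.9))                  # Kd 1.9 mL/g at an assumed 1.9% OC -> 100
```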
Abstract:
Surface losses of nitrogen from horticulture farms in coastal Queensland, Australia, may have the potential to cause eutrophication of sensitive coastal marine habitats nearby. A case-study of the potential extent of such losses was investigated in a coastal macadamia plantation. Nitrogen losses were quantified in 5 consecutive runoff events during the 13-month study. Irrigation did not contribute to surface flows. Runoff was generated by storms at combined intensities and durations of 20–40 mm/h for >9 min. These intensities and durations were within expected short-term (1 year) and long-term (up to 20 years) frequencies of rainfall in the study area. Surface flow volumes were 5.3 ± 1.1% of the episodic rainfall generated by such storms; the largest part of each rainfall event was therefore attributed to infiltration and drainage in this farm soil (Kandosol). The estimated annual loss of total nitrogen in runoff was 0.26 kg N/ha.year, representing a minimal loading of nitrogen in surface runoff when compared to other studies. The weighted average concentrations of total sediment nitrogen (TSN) and total dissolved nitrogen (TDN) generated in the farm runoff were 2.81 ± 0.77% N and 1.11 ± 0.27 mg N/L, respectively. These concentrations were considerably greater than ambient levels in an adjoining catchment waterway, where concentrations of TSN and TDN were 0.11 ± 0.02% N and 0.50 ± 0.09 mg N/L, respectively. The steep concentration gradient of TSN and TDN between the farm runoff and the waterway demonstrated the occurrence of nutrient loading from the farming landscapes to the waterway. The TDN levels in the stream exceeded the current specified threshold of 0.2–0.3 mg N/L for eutrophication of such a waterway. Therefore, while the estimate of annual loading of N from runoff losses was comparatively low, it was evident that the stream catchment and associated agricultural land uses were already characterised by significant nitrogen loadings that pose eutrophication risks. The reported levels of nitrogen and the proximity of such waterways (8 km) to the coastline may also have implications for the nearshore (oligotrophic) marine environment during periods of turbulent flow.
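An annual load of the kind estimated above is, in principle, the sum over events of runoff volume times concentration (1 mm of runoff over 1 ha is 10,000 L). A back-of-envelope sketch with hypothetical event values:

```python
# Annual dissolved-N load in runoff assembled from per-event depths and
# concentrations; all event values below are hypothetical.

def annual_n_load_kg_per_ha(event_depths_mm, event_concs_mg_per_L):
    """Sum event loads: 1 mm runoff over 1 ha = 10,000 L; mg/L * L = mg."""
    total_mg = sum(d * 10_000 * c
                   for d, c in zip(event_depths_mm, event_concs_mg_per_L))
    return total_mg / 1e6  # mg -> kg

depths = [4.0, 6.5, 3.0, 5.5, 4.5]  # runoff depth per event, mm
concs = [1.2, 1.0, 1.3, 0.9, 1.1]   # total dissolved N per event, mg/L
print(f"{annual_n_load_kg_per_ha(depths, concs):.2f} kg N/ha.year")
```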
Abstract:
The present study set out to test, through field and simulation studies, the hypothesis that the incorporation of short-term summer legumes, particularly the annual legume lablab (Lablab purpureus cv. Highworth), in a fallow-wheat cropping system will improve the overall economic and environmental benefits in south-west Queensland. Replicated, large plot experiments were established at five commercial properties using their machinery, and two smaller plot experiments were established at two intensively researched sites (Roma and St George). A detailed study of various other biennial and perennial summer forage legumes in rotation with wheat, as influenced by phosphorus (P) supply (10 and 40 kg P/ha), was also carried out at the two research sites. The other legumes were lucerne (Medicago sativa), butterfly pea (Clitoria ternatea) and burgundy bean (Macroptilium bracteatum). After the legumes, spring wheat (Triticum aestivum) was sown into the legume stubble. The annual lablab produced the highest forage yield, whereas germination, establishment and production of the other biennial and perennial legumes were poor, particularly in the red soil at St George. At the commercial sites, only lablab-wheat rotations were tested, with an increased supply of P in subsurface soil (20 kg P/ha). The lablab grown at the commercial sites yielded between 3 and 6 t/ha of forage over 2-3 month periods, whereas the following wheat crop, with no applied fertiliser, yielded between 0.5 and 2.5 t/ha. The wheat following lablab yielded 30% less, on average, than the wheat in a fallow plot, yet its profitability was slightly higher than that of the wheat following fallow because of the greater costs associated with fallow management. The profitability of the lablab-wheat phase was determined after accounting for the input costs and the additional costs associated with the management of fallow and in-crop herbicide applications for a fallow-wheat system. The economic and environmental benefits of forage lablab and wheat cropping were also assessed through simulations over a long-term climatic pattern using economic (PreCAPS) and biophysical (Agricultural Production Systems Simulator, APSIM) decision support models. Analysis of the long-term rainfall pattern (70% in summer and 30% in winter) and the simulation studies indicated that ~50% of the time a wheat crop would not be planted, or would fail to produce a profitable crop (grain yield less than 1 t/ha), because of low and unreliable rainfall in winter, whereas forage lablab in summer would produce a profitable crop (forage yield of more than 3 t/ha) ~90% of the time. Only 14 wheat crops (of 26 growing seasons, i.e. 54%) were profitable, compared with 22 forage lablab crops (of 25 seasons, i.e. 90%). An opportunistic double-cropping of lablab in summer and wheat in winter is also viable and profitable in 50% of years. Simulation studies also indicated that an opportunistic lablab-wheat cropping system can reduce the potential runoff+drainage by more than 40% in the Roma region, leading to improved economic and environmental benefits.
Abstract:
Runoff, soil loss, and nutrient loss were assessed on a Red Ferrosol in tropical Australia over 3 years. The experiment was conducted using bounded, 100-m² field plots cropped to peanuts, maize, or grass. A bare plot, without cover or crop, was also established as an extreme treatment. Results showed the importance of cover in reducing runoff, soil loss, and nutrient loss from these soils. Runoff ranged from 13% of incident rainfall for the conventional cultivation to 29% under bare conditions during the highest rainfall year, and was well correlated with event rainfall and rainfall energy. Soil loss ranged from 30 t/ha.year under bare conditions to <6 t/ha.year under cropping. Nutrient losses of 35 kg N and 35 kg P/ha.year under bare conditions and 17 kg N and 11 kg P/ha.year under cropping were measured. Soil carbon analyses showed a relationship with treatment runoff, suggesting that soil properties influenced the rainfall-runoff response. The cropping systems model PERFECT was calibrated using runoff, soil loss, and soil water data. Runoff and soil loss showed good agreement with observed data in the calibration, and soil water and yield showed reasonable agreement. Long-term runs using historical weather data showed the episodic nature of runoff and soil loss events in this region and emphasise the need to manage land using protective measures such as conservation cropping practices. Farmers involved in related action-learning activities wished to incorporate conservation cropping findings into their systems but also needed clear production benefits to hasten practice change.
Abstract:
The previous projects (phases I-III) highlighted that northern region wheat and barley cultivars differ considerably in their sensitivity to herbicides. The new project will focus on increased screening of advanced breeding lines and new cultivars of barley, chickpea and wheat against commonly used herbicides. Studies on the impact of environment on herbicide × genotype responses will also be undertaken with the national team. The new information will be added to the existing information package on herbicide tolerance. Thus, adverse impacts of herbicides on productivity in the northern region will be reduced, as growers and agronomists will select safer herbicides for their sown variety, or more tolerant varieties for their important herbicides.
Abstract:
A national focus on strategic and applied research to minimise herbicide resistance in Australian cropping.
Abstract:
The project will evaluate seed bank depletion of key northern herbicide resistant weeds under different environments, cropping systems, crop agronomies and non-chemical control tactics. The project will also evaluate soil biology and seed bank relationships to explain differences in seed bank persistence.
Abstract:
The threat and management of glyphosate-resistant weeds are major issues facing northern region growers. At present, five weeds are confirmed glyphosate-resistant: barnyard grass, liverseed grass, windmill grass, annual ryegrass and flaxleaf fleabane. This project used 25 experiments to investigate the ecology of the grass weeds, plus new or improved chemical and non-chemical control tactics for them. The refined glyphosate resistance model developed in this project used the experiments' findings to predict the long-term impacts on the evolution of resistance and on seed bank numbers of resistant weeds. These data led to revised management and resistance avoidance strategies, which were published in the Reporter newsletter and via an online risk assessment tool. See more at: http://finalreports.grdc.com.au/UQ00054
Abstract:
During the past 10 years, this project tested 23 barley and 51 wheat varieties with 19 and 34 registered herbicides, respectively. It concentrated on new varieties and herbicides. The research highlighted that Northern Region (NR) wheat and barley varieties differed considerably in their sensitivity to these herbicides. Overall, 9 per cent of wheat variety × herbicide combinations and 6 per cent of barley variety × herbicide combinations had significant yield losses (3 to 38%) from herbicides at recommended rates and crop stages. In addition, 21 to 23 per cent had significant yield losses from herbicides at double rates, indicating a narrow margin of crop safety.
Abstract:
The introduction of glyphosate tolerant cotton has significantly improved the flexibility and management of a number of problem weeds in cotton systems. However, reliance on glyphosate poses risks to the industry in terms of glyphosate resistance and species shift. The aims of this project were to identify these risks, and to determine strategies to prevent and mitigate the potential for resistance evolution. Field surveys identified fleabane as the most common weed now in both irrigated and dryland systems. Sowthistle has also increased in prevalence, and bladder ketmia and peachvine remained common. The continued reliance on glyphosate has favoured small-seeded and glyphosate tolerant species. Fleabane is both, with populations confirmed resistant in grains systems in Queensland and NSW. When species were assessed for their resistance risk, fleabane, liverseed grass, feathertop Rhodes grass, sowthistle and barnyard grass were determined to have high risk ratings. Management practices were also determined to rely heavily on glyphosate, and therefore to be high risk, in summer fallows and in dryland glyphosate tolerant and conventional cotton. Situations where these high-risk species are present in high-risk cropping phases need particular attention. The confirmation of a glyphosate-resistant barnyard grass population in a dryland glyphosate tolerant cotton system means resistance is now a reality for the cotton industry. Experiments have shown, however, that resistant populations can be managed with other herbicide options currently available. The options for fleabane management in cotton are still limited: although some selective residual herbicides are showing promise, the majority of fleabane control tactics can only be used in other phases of the cotton rotation. An online glyphosate resistance tool has been developed. This tool allows growers to assess their individual glyphosate resistance risks and how they can adjust their practices to reduce them. It also provides researchers with current information on weed species present and practices used across the industry, and will be extremely useful in tailoring future research and extension efforts. Simulations from the expanded glyphosate resistance model have shown that glyphosate resistance can be prevented and managed in glyphosate-tolerant cotton farming systems, but successful strategies require effort. Simulations have shown the importance of controlling survivors of glyphosate applications, using effective glyphosate alternatives in fallows, and combining several effective glyphosate alternatives in crop; these are the key to the prevention and management of glyphosate resistance.