72 results for herbicide leaching
Abstract:
The main weeds and weed management practices undertaken in broadacre dryland cropping areas of north-eastern Australia have been identified. The information was collected in a comprehensive postal survey of both growers and agronomists from Dubbo in New South Wales (NSW) through to Clermont in central Queensland; 237 surveys were returned. A very diverse weed flora of 105 weeds from 91 genera was identified for the three cropping zones within the region (central Queensland, southern Queensland and northern NSW). Twenty-three weeds were common to all cropping zones. The major common weeds were Sonchus oleraceus, Rapistrum rugosum, Echinochloa spp. and Urochloa panicoides. The main weeds were identified for summer and winter fallows and for sorghum, wheat and chickpea crops in each of the zones, with some commonality as well as floral uniqueness recorded. More genera were recorded in fallows than in crops, and more in summer fallows than in winter fallows. Across the region, weed management relied heavily on herbicides. In fallows, glyphosate and mixes with glyphosate were very common, although the importance of the glyphosate mix partner differed among the cropping zones. Use and importance of pre-emergence herbicides in-crop varied considerably among the zones. In wheat, more graminicides were used in northern NSW than in southern Queensland, and virtually none were used in central Queensland, reflecting the differences in winter grass weed flora across the region. Atrazine was the major herbicide used in sorghum, although metolachlor was also used, predominantly in northern NSW. Fallow and inter-row cultivation were used more often in the southern areas of the region. Grazing of fallows was more prominent in northern NSW. High crop seeding rates were not commonly recorded, indicating that growers are not using crop competition as a tool for weed management. Although many management practices were recorded overall, few growers were using integrated weed management, and herbicide resistance has been and continues to be an issue for the region.
Abstract:
Growing agricultural crops at wide row spacings has been widely adopted to conserve water, to control pests and diseases, and to minimise problems associated with sowing into stubble. The development of herbicide resistance, combined with the advent of precision agriculture, has provided a further reason to adopt wide row spacings: weed control. Increased row spacing enables two different methods of weed control to be implemented, with non-selective chemical and physical control methods used in the wide inter-row zone, with or without selective chemicals applied to the crop row only. However, continual application of herbicides and tillage in the inter-row zone brings risks of herbicide resistance, species shifts and/or changes in species dominance, crop damage, increased costs, yield losses, and more expensive weed management technology.
Abstract:
Quilpie mesquite (Prosopis velutina) is an invasive woody weed believed to have been introduced into south-west Queensland in the 1930s. Following the withdrawal of 2,4,5-T, research on P. pallida resulted in revised recommendations for control of all Prosopis spp. in Queensland. Adoption of many of these recommendations for Quilpie mesquite control produced substandard results. Following a pilot trial, a shade-house experiment was conducted to determine the differences in susceptibility of two species of mesquite, P. velutina and P. pallida, to commonly available herbicides. It was hypothesized that P. velutina was less susceptible than P. pallida, based upon claims that the registered chemical recommendations for Prosopis spp. were not sufficiently effective on P. velutina. Nine foliar herbicide treatments were applied to potted shade-house plants. Treatment effects indicated differing susceptibility between the two species: P. velutina consistently showed less response than P. pallida to metsulfuron, fluroxypyr, 2,4-D/picloram and triclopyr/picloram, whereas negligible differences between the two species occurred with the glyphosate formulations. The response to glyphosate was poor at all rates in this experiment. Re-application of herbicides to surviving plants indicated that susceptibility can decrease when follow-up application is made in autumn and the time since the initial application is short. The relationship between leaf structure and the volume of spray adhering to a plant was assessed across species. The herbicide captured by similar-sized plants of each species differed, with P. pallida retaining a greater volume of herbicide.
Abstract:
Background and Aims: The evolution of resistance to herbicides is a substantial problem in contemporary agriculture. Solutions to this problem generally consist of practices to control a resistant population once it has evolved, and/or preventative measures instituted before populations become resistant. Herbicide resistance evolves in populations over years or decades, so predicting the effectiveness of preventative strategies in particular relies on computational modelling approaches. While models of herbicide resistance already exist, none deals with the complex regional variability of the northern Australian sub-tropical grains farming region. For this reason, a new computer model was developed. Methods: The model consists of an age- and stage-structured population model of weeds, with an existing crop model used to simulate plant growth and competition, and extensions to the crop model added to simulate seed bank ecology and population genetics. Using awnless barnyard grass (Echinochloa colona) as a test case, the model was used to investigate the likely rate of evolution under conditions expected to produce high selection pressure. Key Results: Simulating continuous summer fallows with glyphosate as the only means of weed control resulted in predicted resistant weed populations after approximately 15 years. Validation of the model against the paddock history of the first real-world glyphosate-resistant awnless barnyard grass population shows that the model predicted resistance evolution to within a few years of the real situation. Conclusions: This validation work shows that empirical validation of herbicide resistance models is problematic. However, the model simulates the complexities of sub-tropical grains farming in Australia well, and can be used to investigate, generate and improve glyphosate resistance prevention strategies.
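For readers unfamiliar with this class of model, the Python sketch below illustrates the kind of population-genetics bookkeeping such a simulation performs. It is a minimal toy, not the published model: the paper couples an age- and stage-structured weed population to a crop model, whereas this standalone version tracks only a genotype-split seed bank under continuous glyphosate-only summer fallows, and every parameter value is an illustrative assumption rather than a value from the study.

```python
# Minimal sketch, not the published model: tracks a seed bank split across three
# genotypes of one major resistance gene (R = resistance allele) under continuous
# glyphosate-only summer fallows. Deterministic, no density dependence; all
# parameter values are illustrative assumptions.

GENOTYPES = ("RR", "Rr", "rr")

# Assumed glyphosate-induced mortality by genotype.
GLYPHOSATE_KILL = {"RR": 0.50, "Rr": 0.75, "rr": 0.90}


def allele_freq(cohort):
    """Frequency of the R allele in a genotype-keyed cohort of seed/plant counts."""
    total = sum(cohort.values())
    if total == 0:
        return 0.0
    return (cohort["RR"] + 0.5 * cohort["Rr"]) / total


def random_mating_seed(parents, total_seeds):
    """Split the season's seed rain into genotypes assuming random mating
    (Hardy-Weinberg proportions at the parental R allele frequency)."""
    p = allele_freq(parents)
    q = 1.0 - p
    return {"RR": total_seeds * p * p,
            "Rr": total_seeds * 2.0 * p * q,
            "rr": total_seeds * q * q}


def years_to_resistance(years=40,
                        seed_bank_size=10_000.0,   # seeds per unit area
                        initial_r_freq=1e-6,       # rare resistance allele
                        germination=0.3,           # fraction of the bank emerging per year
                        seed_decay=0.3,            # annual mortality of ungerminated seed
                        seeds_per_survivor=200.0,
                        detection_freq=0.2):       # R frequency treated as "field resistance"
    p, q = initial_r_freq, 1.0 - initial_r_freq
    bank = {"RR": seed_bank_size * p * p,
            "Rr": seed_bank_size * 2.0 * p * q,
            "rr": seed_bank_size * q * q}
    for year in range(1, years + 1):
        emerged = {g: bank[g] * germination for g in GENOTYPES}
        # Glyphosate is the only control applied to the fallow each year.
        survivors = {g: emerged[g] * (1.0 - GLYPHOSATE_KILL[g]) for g in GENOTYPES}
        seed_rain = random_mating_seed(
            survivors, sum(survivors.values()) * seeds_per_survivor)
        # Ungerminated seed carries over (with decay); fresh seed tops up the bank.
        bank = {g: (bank[g] - emerged[g]) * (1.0 - seed_decay) + seed_rain[g]
                for g in GENOTYPES}
        if allele_freq(bank) >= detection_freq:
            return year
    return None


if __name__ == "__main__":
    print("Predicted years until resistance:", years_to_resistance())
```

With these invented parameters the toy run flags resistance after roughly a decade and a half of continuous glyphosate-only fallows; the point is the structure (genotype-split seed bank, genotype-specific survival, Hardy-Weinberg seed rain), not the particular numbers, which the published model instead derives from crop-model simulations and measured weed ecology.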
Abstract:
The present study set out to test, through field and simulation studies, the hypothesis that incorporating short-term summer legumes, particularly the annual legume lablab (Lablab purpureus cv. Highworth), in a fallow-wheat cropping system will improve the overall economic and environmental benefits in south-west Queensland. Replicated large-plot experiments were established at five commercial properties using the properties' own machinery, and two smaller plot experiments were established at two intensively researched sites (Roma and St George). A detailed study of various other biennial and perennial summer forage legumes in rotation with wheat, as influenced by phosphorus (P) supply (10 and 40 kg P/ha), was also carried out at the two research sites. The other legumes were lucerne (Medicago sativa), butterfly pea (Clitoria ternatea) and burgundy bean (Macroptilium bracteatum). After the legumes, spring wheat (Triticum aestivum) was sown into the legume stubble. The annual lablab produced the highest forage yield, whereas germination, establishment and production of the other biennial and perennial legumes were poor, particularly in the red soil at St George. At the commercial sites, only lablab-wheat rotations were tested, with an increased supply of P in the subsurface soil (20 kg P/ha). The lablab grown at the commercial sites produced forage yields of 3-6 t/ha over 2-3 month periods, whereas the following wheat crop, with no applied fertiliser, yielded 0.5-2.5 t/ha. The wheat following lablab yielded 30% less, on average, than the wheat in a fallow plot, yet the profitability of wheat following lablab was slightly higher than that of wheat following fallow because of the greater costs associated with fallow management. The profitability of the lablab-wheat phase was determined after accounting for the input costs and the additional costs associated with fallow management and in-crop herbicide applications in a fallow-wheat system. The economic and environmental benefits of forage lablab and wheat cropping were also assessed through simulations over a long-term climatic pattern using economic (PreCAPS) and biophysical (Agricultural Production Systems Simulator, APSIM) decision support models. Analysis of the long-term rainfall pattern (70% in summer and 30% in winter) and the simulation studies indicated that about 50% of the time a wheat crop would not be planted or would fail to produce a profitable crop (grain yield less than 1 t/ha) because of low and unreliable winter rainfall, whereas forage lablab in summer would produce a profitable crop (forage yield of more than 3 t/ha) about 90% of the time. Only 14 wheat crops (of 26 growing seasons, i.e. 54%) were profitable, compared with 22 forage lablab crops (of 25 seasons, i.e. 90%). Opportunistic double-cropping of lablab in summer and wheat in winter is also viable and profitable in 50% of years. The simulation studies also indicated that opportunistic lablab-wheat cropping can reduce potential runoff plus drainage by more than 40% in the Roma region, leading to improved economic and environmental benefits.
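The profitability frequencies quoted above are obtained by counting simulated seasons whose yield clears a threshold (grain above 1 t/ha for wheat, forage above 3 t/ha for lablab). A minimal sketch of that bookkeeping follows; the yield series are invented placeholders standing in for APSIM/PreCAPS output, not data from the study.

```python
# Counts "profitable" seasons in a simulated yield series. The yield lists are
# made-up placeholders for model output; the thresholds follow the abstract
# (wheat grain > 1 t/ha, lablab forage > 3 t/ha).

def profitable_seasons(yields_t_ha, threshold_t_ha):
    """Return (number of profitable seasons, total seasons, fraction)."""
    hits = sum(1 for y in yields_t_ha if y > threshold_t_ha)
    return hits, len(yields_t_ha), hits / len(yields_t_ha)


wheat_grain = [0.4, 1.8, 0.0, 2.3, 1.1, 0.7, 2.9, 0.2, 1.6, 0.9]    # hypothetical t/ha
lablab_forage = [4.2, 3.6, 2.1, 5.0, 3.9, 4.7, 3.1, 6.2, 2.8, 4.4]  # hypothetical t/ha

for crop, yields, threshold in (("wheat", wheat_grain, 1.0),
                                ("lablab", lablab_forage, 3.0)):
    hits, n, frac = profitable_seasons(yields, threshold)
    print(f"{crop}: {hits} of {n} seasons profitable ({frac:.0%})")
```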
Abstract:
The efficacy of individual tree treatment (stem-injection), aerially applied root-absorbed herbicide and mechanical felling (with and without subsequent fire) in controlling woody plants was compared in a poplar box (Eucalyptus populnea) woodland community in central Queensland, Australia. All treatments reduced woody plant populations and basal area relative to the untreated control. Chemical control and 'mechanical felling plus fire' treatments were equally effective in reducing woody plant basal area 7 years after the treatments were imposed. However, mechanical felling alone was less effective. There was a clear tendency for the scattered tree (80% thinning) treatment to recover woody plant basal area towards pre-treatment levels faster than other clearing strategies, although this response was not significantly different from 20% clump retention and mechanical felling (without burning) treatments.
Abstract:
Interest in cashew production in Australia has been stimulated by domestic and export market opportunities and by the suitability of large areas of tropical Australia. Economic models indicate that cashew production is profitable at 2.8 t ha-1 nut-in-shell (NIS). Balanced plant nutrition is essential to achieve economic yields in Australia, with nitrogen (N) of particular importance because of its capacity to modify growth, affect nut yield and cause environmental degradation through soil acidification and off-site contamination. This study, on a commercial cashew plantation at Dimbulah, Australia, investigated the effect of N rate and timing on cashew growth, nut production, N leaching and soil chemical properties over five growth cycles (1995-1999). Nitrogen was applied during the main periods of vegetative (December-April) and reproductive (June-October) growth. Commercial NIS yields (up to 4.4 t ha-1 from individual trees) that exceeded the economic threshold of 2.8 t ha-1 were achieved. The yield response was mainly determined by canopy size, as mean nut weight, panicle density and nuts per panicle were largely unaffected by N treatments. Nitrogen application confined to the main period of vegetative growth (December-April) produced a seasonal growth pattern that corresponded most consistently with the highest NIS yield. This N timing also reduced late-season flowering and undesirable post-November nut drop. Higher yields were not produced at N rates greater than 17 g m-2 of canopy surface area (equating to 210 kg N ha-1 for mature-sized trees). High yields were attained when N concentrations in Mveg leaves in May-June were about 2%, but this assessment occurs at a time when it is not feasible to correct N deficiency. The Mflor leaf of the preceding November, used in conjunction with the Mveg leaf, was proposed as a diagnostic tool to guide N rate decisions. Leaching of nitrate-N and acidification of the soil profile were recorded to 0.9 m. This is an environmental and sustainability hazard, and demonstrates that improved methods of N management are required.
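The nitrogen rate above is expressed per square metre of canopy surface, so the per-hectare equivalent depends on how much canopy surface a mature planting carries; the stated equivalence (17 g N m-2 of canopy to about 210 kg N ha-1) implies roughly 12,000-12,500 m2 of canopy surface per hectare. The short sketch below shows the arithmetic, using an assumed canopy area per tree and planting density, neither of which is taken from the study.

```python
# Converts a canopy-area-based N rate to a per-hectare rate. The canopy area per
# tree and the planting density are assumptions for illustration only.

n_rate_g_per_m2_canopy = 17.0    # g N per m2 of canopy surface (from the abstract)
canopy_area_per_tree_m2 = 62.0   # assumed canopy surface area of a mature tree
trees_per_ha = 200               # assumed planting density

n_per_tree_g = n_rate_g_per_m2_canopy * canopy_area_per_tree_m2
n_per_ha_kg = n_per_tree_g * trees_per_ha / 1000.0

print(f"N per tree: {n_per_tree_g:.0f} g")     # 1054 g with these assumptions
print(f"N per hectare: {n_per_ha_kg:.0f} kg")  # ~211 kg, close to the 210 kg quoted
```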
Abstract:
In February 2004, Redland Shire Council, with help from a Horticulture Australia research project, established a stable grass cover of seashore paspalum (Paspalum vaginatum) on a Birkdale park where the soil had previously proved too salty to grow anything else. Following on from the success of this small 0.2 ha demonstration area, Redland Shire has since invested hundreds of thousands of dollars in successfully turfing other similarly “impossible” park areas with seashore paspalum. Urban salinity can arise for different reasons in different places. In inland areas such as southern NSW and the WA wheatbelt, the usual cause is rising groundwater bringing salt to the surface. In coastal sites, salt spray or periodic tidal inundation can cause problems. In Redland Shire's case, the issue was compacted marine sediments (mainly mud) dug up and dumped to create foreshore parkland in the course of artificial canal developments. At Birkdale, this had created a site that was both strongly acid and too salty for most plants. Bare saline scalds were interspersed with areas of unthrifty grass. Finding a salt-tolerant grass is no "silver bullet" or easy solution to salinity problems. Rather, it buys time to implement sustainable long-term establishment and maintenance practices, which are even more critical than with conventional turfgrasses. These practices include annual slicing or coring in conjunction with gypsum/dolomite amendment and light topdressing with sandy loam soil (to about 1 cm depth), adequate maintenance fertiliser, weed control measures, regular leaching irrigation to flush salts below the root zone, and irrigation scheduling to maximise infiltration and minimise run-off. Three other halophytic turfgrass species were also identified, each adapted to different environments, management regimes and uses. These have been shortlisted for larger-scale plantings in future work.
Abstract:
Soft-leaf buffalo grass is increasing in popularity as an amenity turfgrass in Australia. This project was instigated to assess its adaptation and to establish management guidelines for its use across Australia's vast array of growing environments. There is an extensive selection of soft-leaf buffalo grass cultivars throughout Australia, and with the country's climates ranging from temperate in the south to tropical in the north, not all cultivars are going to be adapted to all regions. The project evaluated 19 buffalo grass cultivars along with other warm-season grasses including green couch, kikuyu and sweet smother grass. The soft-leaf buffalo grasses were evaluated for their growth and adaptation in a number of regions throughout Australia, including Western Australia, Victoria, the ACT, NSW and Queensland. The growth habit of the individual cultivars was examined along with their level of shade tolerance, water use, herbicide tolerance, resistance to wear, response to nitrogen applications and growth potential in highly alkaline (high pH) soils. The growth habit of the various cultivars currently commercially available in Australia differs considerably, from the more robust types that spread more quickly and are thicker in appearance (Sir Walter, Kings Pride, Ned Kelly and Jabiru) to the dwarf types that are shorter and thinner in appearance (AusTine and AusDwarf). The soft-leaf buffalo grass types tested do not differ in water use when compared to old-style common buffalo grass. Thus, soft-leaf buffalo grasses, like other warm-season turfgrass species, are efficient in water use. These grasses also recover after periods of low water availability. Individual cultivar differences were not discernible. In high-pH (alkaline) soils, some elements essential for plant growth (e.g. iron and manganese) may be deficient, causing turfgrass to appear pale green and visually unacceptable. When 14 soft-leaf buffalo grass genotypes were grown on a highly alkaline soil (pH 7.5-7.9), cultivars differed in leaf iron, but not in leaf manganese, concentrations. Nitrogen is critical to the production of quality turf. The methods for applying this essential element can be manipulated to minimise maintenance inputs (mowing) during the peak growing period (summer). By applying the greatest proportion of the turf's total nitrogen requirement in early spring, peak summer growth can be reduced, resulting in a corresponding reduction in mowing requirements. Soft-leaf buffalo grass cultivars are more shade and wear tolerant than other warm-season turfgrasses being used by homeowners; there are, however, differences between the individual buffalo grass varieties. The majority of types currently available would be classified as having moderate levels of shade tolerance, and they wear reasonably well with good recovery rates. The impact of wear in a shaded environment was not tested and needs to be investigated, as shade is a typical growing environment for many homeowners. The use of herbicides is required to maintain quality soft-leaf buffalo grass turf. The development of softer herbicides for other turfgrasses has seen an increase in their popularity. The buffalo grass cultivars currently available have shown varying levels of susceptibility to the chemicals tested. The majority of the cultivars evaluated demonstrated low levels of phytotoxicity in response to the herbicides chlorsulfuron (Glean) and fluroxypyr (Starane and Comet).
In general, soft-leaf buffalo grasses are varied in their makeup and have demonstrated varying levels of tolerance, susceptibility and adaptation to the conditions they are grown under. Consequently, there is a need to choose the cultivar most suited to the environment it is expected to perform in and the management style it will be exposed to. Future work is required to assess how the structure of the different cultivars affects their wear tolerance, shade tolerance, water use and herbicide tolerance. The development of a growth model may provide the solution.
Abstract:
Soil water repellency occurs widely in horticultural and agricultural soils when they are very dry. The gradual accumulation and breakdown of surface organic matter over time produces wax-like organic acids, which coat soil particles and prevent the uniform entry of water into the soil. Water repellency is usually managed by regular surfactant applications. Surfactants are, literally, surface active agents (SURFace ACTive AgeNTS). Their mode of action is to reduce the surface tension of water, allowing it to penetrate and wet the soil more easily and completely. This practice improves water use efficiency (by requiring less water to wet the soil and by capturing rainfall and irrigation more effectively and rapidly). It also reduces nutrient losses through run-off, erosion or leaching; these nutrients have the potential to pollute the surrounding environment and watercourses. This project investigated potential improvements to standard practices (product combination and scheduling) for surfactant use to overcome localised dry spots on water-repellent soils and thus improve turf quality and water use efficiency. Weather conditions for the duration of the trial prevented the identification of improved practices in terms of combination and scheduling. However, the findings support previous research showing that soil surfactants decreased the time for water to infiltrate dry soil samples taken from a previously severely hydrophobic site. Data will continue to be collected from this trial site on a private contractual basis, with the hope that improvements to standard practices will be observed during the drier winter months, when moisture availability is a limiting factor for turfgrass growth and quality.
Abstract:
The impacts of cropping history (sugarcane, maize and soybean), tillage practice (conventional tillage and direct drill) and fertiliser N in the plant and first ratoon (1R) crops of sugarcane were examined in field trials at Bundaberg and Ingham. Average yields at Ingham (Q200) and Bundaberg (Q151) were quite similar in both the plant crop (83 t/ha and 80 t/ha, respectively) and the 1R crop (89 and 94 t/ha, respectively), with only minor treatment effects on CCS at each site. Cane yield responses to tillage, break history and N fertiliser varied significantly between sites. There was a 27% yield increase in the plant crop from the soybean fallow at Ingham, with soybeans producing a yield advantage over continuous cane, but there were no clear break effects at Bundaberg, possibly due to a complex of pathogenic nematodes that responded differently to the soybean and maize breaks. There was no carryover benefit of the soybean break into the 1R crop at Ingham, while at Bundaberg the maize break produced a 15% yield advantage over soybeans and continuous cane. The Ingham site recorded positive responses to N fertiliser addition in both the plant (20% yield increase) and 1R (34% yield increase) crops, but there was negligible carryover benefit from plant-crop N in the 1R crop, nor evidence of a reduced N response after a soybean rotation. By contrast, the Bundaberg site showed no N response in any history in the plant crop, and only a small (5%) yield increase with N applied in the 1R crop. There was again no evidence of a reduced N response in the 1R crop after a soybean fallow. There were no significant effects of tillage on cane yields at either site, although there were some minor interactions between tillage, breaks and N management in the 1R crop at both sites. Crop N contents at Bundaberg were more than three times those recorded at Ingham in both the plant and 1R crops, with N concentrations in millable stalk at Ingham suggesting N deficiencies in all treatments. There was negligible additional N recovered in crop biomass from N fertiliser application or soybean residues at the Ingham site. There was additional N recovered in crop biomass in response to N fertiliser and soybean breaks at Bundaberg, but the effects were small and fertiliser use efficiencies were poor. Loss pathways could not be quantified, but denitrification or losses in runoff were the likely causes at Ingham, while leaching predominated at Bundaberg. The results highlight the complexity involved in developing sustainable farming systems for contrasting soil types and climatic conditions. A better understanding of key sugarcane pathogens and their host ranges, as well as improved capacity to predict in-crop N mineralisation, will be key factors in future improvements to sugarcane farming systems.
Abstract:
Herbicide contamination from agriculture is a major issue worldwide and has been identified as a threat to freshwater and marine environments in the Great Barrier Reef World Heritage Area in Australia. The triazine herbicides are of particular concern because of their potential adverse effects both on photosynthetic organisms and on vertebrate development. To date, a number of bioremediation strategies have been proposed for triazine herbicides, but they are unlikely to be implemented because they rely on the release of genetically modified organisms. We propose an alternative strategy using a free-enzyme bioremediant, which is unconstrained by the issues surrounding the use of live organisms. Here we report an initial field trial of an enzyme-based product, demonstrating that the technology is capable of remediating water bodies contaminated with the most common triazine herbicide, atrazine.
Abstract:
Navua sedge, a member of the Cyperaceae family, is an aggressive weed of pastures in Fiji, Sri Lanka, the Malay Peninsula, Vanuatu, Samoa, the Solomon Islands and Tahiti, and is now a weed of pastures and roadsides in north Queensland, Australia. Primarily restricted to areas with an annual rainfall exceeding 2500 mm, Navua sedge is capable of forming dense stands that smother many tropical pasture species. Seventeen herbicides were field tested at three sites in north Queensland, with glyphosate, halosulfuron, hexazinone, imazapic, imazapyr and MSMA the most effective for Navua sedge control. Environmental problems such as persistence in soil, lack of selectivity and off-site movement may occur when some herbicides are used at the rates predicted to give 90% control (LC90). In a seasonality trial, halosulfuron (97.5 g ai/ha) gave better Navua sedge control when sprayed from March to September (84%) than at other times of the year (50%). In a frequency trial, sequential glyphosate applications (2,160 g ae/ha) every two months were more effective for continued Navua sedge control (67%) than a single application of glyphosate (36%), though loss of ground cover would occur. In a management trial, single applications of glyphosate (2,160 to 3,570 g ae/ha) using either a rope wick, ground foliar spraying or a rotary rope wick gave 59 to 73% control, while other treatments (rotary hoe (3%), slashing (-13%) or crushing (-30%)) were less effective. In a second management trial, four monthly rotary wick applications were much more effective (98%) than four monthly crushing applications (42%). An effective management plan must include regular herbicide treatments to prevent Navua sedge seed from being added to the soil seed bank. Treatments that result in seed burial, for example discing, are likely to prolong seed persistence and should be avoided. The sprouting activity of vegetative propagules and root fragmentation also need to be considered when selecting control options.
Abstract:
Field studies were conducted at two locations in southern Queensland, Australia, during the 2003-2004 and 2004-2005 growing seasons to determine the differential competitiveness of sorghum (Sorghum bicolor (L.) Moench) cultivars and crop densities against weeds and the sorghum yield loss due to weeds. Weed competition was investigated by growing sorghum in the presence or absence of a model grass weed, Japanese millet (Echinochloa esculenta). Correlation analyses showed that the early growth traits of sorghum (height, shoot biomass, and daily growth rate of the shoot biomass) adversely affected the height, biomass, and seed production of millet, as measured at maturity. "MR Goldrush" and "Bonus MR" were the most competitive cultivars, resulting in reduced weed biomass, weed density, and weed seed production. The density of sorghum also had a significant effect on the crop's ability to compete with millet. Compared with a density of 4.5 plants per m2, sorghum planted at 7.5 plants per m2 suppressed the density, biomass, and seed production of millet by 22%, 27% and 38%, respectively. Millet caused a significant yield loss in comparison with the weed-free plots. The combined weed-suppressive effects of competitive cultivars, such as MR Goldrush, and high crop densities minimized the yield losses from the weeds. These results indicate that sorghum competition against grass weeds can be improved by choosing competitive cultivars and by using a high crop density of > 7.5 plants per m2. These non-chemical options should be included in an integrated weed management program for better weed management, particularly where control options are limited by the evolution of herbicide resistance.
Abstract:
This project will develop and deliver improved integrated weed management strategies for weeds at risk of glyphosate resistance and species shift in transgenic farming landscapes. It will also facilitate the stewardship of glyphosate and transgenic technology, improving the sustainability of both the herbicide and the genes.