17 results for surface runoff
in eResearch Archive - Queensland Department of Agriculture
Abstract:
Surface losses of nitrogen from horticulture farms in coastal Queensland, Australia, have the potential to cause eutrophication of nearby sensitive coastal marine habitats. A case study of the potential extent of such losses was conducted in a coastal macadamia plantation. Nitrogen losses were quantified in 5 consecutive runoff events during the 13-month study. Irrigation did not contribute to surface flows. Runoff was generated by storms with combined intensities and durations of 20–40 mm/h for >9 min. These intensities and durations were within expected short-term (1 year) and long-term (up to 20 years) frequencies of rainfall in the study area. Surface flow volumes were 5.3 ± 1.1% of the episodic rainfall generated by such storms; the largest part of each rainfall event was therefore attributed to infiltration and drainage in this farm soil (Kandosol). The estimated annual loss of total nitrogen in runoff was 0.26 kg N/ha.year, representing a minimal loading of nitrogen in surface runoff compared with other studies. The weighted average concentrations of total sediment nitrogen (TSN) and total dissolved nitrogen (TDN) generated in the farm runoff were 2.81 ± 0.77% N and 1.11 ± 0.27 mg N/L, respectively. These concentrations were considerably greater than ambient levels in an adjoining catchment waterway, where TSN and TDN were 0.11 ± 0.02% N and 0.50 ± 0.09 mg N/L, respectively. The steep concentration gradient of TSN and TDN between the farm runoff and the waterway demonstrated nutrient loading from the farming landscapes to the waterway. The TDN levels in the stream exceeded the current specified threshold of 0.2–0.3 mg N/L for eutrophication of such a waterway.
Therefore, while the estimated annual loading of N from runoff losses was comparatively low, it was evident that the stream catchment and associated agricultural land uses were already characterised by significant nitrogen loadings that pose eutrophication risks. The reported levels of nitrogen, and the proximity of such waterways (8 km) to the coastline, may also have implications for the nearshore (oligotrophic) marine environment during periods of turbulent flow.
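The per-event load estimate above combines runoff volume with the weighted mean concentration. A minimal sketch of that arithmetic in Python, using the reported runoff fraction (5.3%) and TDN concentration (1.11 mg N/L); the 40 mm storm depth is an assumed illustration, not a value from the study:

```python
def tdn_load_kg_per_ha(storm_rain_mm, runoff_fraction, conc_mg_per_l):
    """Dissolved-N load (kg/ha) exported in runoff from a single storm."""
    runoff_mm = storm_rain_mm * runoff_fraction
    runoff_litres = runoff_mm * 10_000  # 1 mm depth over 1 ha = 10,000 L
    return runoff_litres * conc_mg_per_l / 1e6  # mg -> kg

# Reported values: runoff was 5.3% of storm rainfall, TDN = 1.11 mg N/L.
# The 40 mm storm depth is hypothetical.
load = tdn_load_kg_per_ha(40.0, 0.053, 1.11)
print(round(load, 4))  # per-event dissolved-N load in kg N/ha
```

Summing such per-event loads over a year is what yields an annual figure like the 0.26 kg N/ha.year reported above.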
Abstract:
Herbicide runoff from cropping fields has been identified as a threat to the Great Barrier Reef ecosystem. A field investigation was carried out to monitor the changes in runoff water quality resulting from four different sugarcane cropping systems with different herbicides and contrasting tillage and trash management practices: (i) Conventional - tillage of beds and inter-rows, with residual herbicides; (ii) Improved - only the beds tilled (zonal), with reduced residual herbicide use; (iii) Aspirational - minimum tillage (one pass of a single-tine ripper before planting) with trash mulch, no residual herbicides and a legume intercrop after cane establishment; and (iv) New Farming System (NFS) - minimum tillage as in the Aspirational practice, with a grain legume rotation and a combination of residual and knockdown herbicides. Results suggest soil and trash management had a larger effect on herbicide losses in runoff than the physico-chemical properties of the herbicides. The Improved practice, with atrazine application rates 30% lower than the Conventional system, reduced runoff volumes by 40% and atrazine loss by 62%. There was a 2-fold variation in atrazine and a >10-fold variation in metribuzin loads in runoff water between reduced tillage systems differing in soil disturbance and surface residue cover from the previous rotation crops, despite the same herbicide application rates. The elevated risk of offsite losses from herbicides was illustrated by the high concentration of diuron (14 μg/L) recorded in runoff that occurred >2.5 months after herbicide application in a 1st ratoon crop. A cropping system employing less persistent non-selective herbicides and an inter-row soybean mulch resulted in no residual herbicide contamination in runoff water, but recorded 12.3% lower yield than the Conventional practice.
These findings reveal a trade-off between achieving good water quality with minimal herbicide contamination and maintaining farm profitability with good weed control.
Abstract:
Runoff and sediment loss from forest roads were monitored for a two-year period in a Pinus plantation in southeast Queensland. Two classes of road were investigated: a gravelled road, which is used as a primary daily haulage route for the logging area, and an ungravelled road, which provides the main access route for individual logging compartments and is intensively used as a haulage route only during the harvest of these areas (approximately every 30 years). Both roads were subjected to routine traffic loads and maintenance during the study. Surface runoff in response to natural rainfall was measured and samples taken for the determination of sediment and nutrient (total nitrogen, total phosphorus, dissolved organic carbon and total iron) loads from each road. Results revealed that the mean runoff coefficient (runoff depth/rainfall depth) was consistently higher from the gravelled road plot (0.57) than from the ungravelled road (0.38). Total sediment loss over the two-year period was greater from the gravelled road plot (5.7 t km−1) than from the ungravelled road plot (3.9 t km−1). Suspended solids contributed 86% of the total sediment loss from the gravelled road, and 72% from the ungravelled road, over the two years. Nitrogen loads from the two roads were relatively constant throughout the study, averaging 5.2 and 2.9 kg km−1 from the gravelled and ungravelled road, respectively. Mean annual phosphorus loads were 0.6 kg km−1 from the gravelled road and 0.2 kg km−1 from the ungravelled road. Organic carbon and total iron loads increased in the second year of the study, which was a much wetter year, and are thought to reflect the breakdown of organic matter in roadside drains and increased sediment generation, respectively. When road and drain maintenance (grading) was performed, runoff and sediment loss increased from both road types.
Additionally, the breakdown of the gravel road base due to high traffic intensity during wet conditions resulted in the formation of deep (10 cm) ruts which increased erosion. The Water Erosion Prediction Project (WEPP):Road model was used to compare predicted to observed runoff and sediment loss from the two road classes investigated. For individual rainfall events, WEPP:Road predicted output showed strong agreement with observed values of runoff and sediment loss. WEPP:Road predictions for annual sediment loss from the entire forestry road network in the study area also showed reasonable agreement with the extrapolated observed values.
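The runoff coefficient reported above is defined directly in the abstract as runoff depth divided by rainfall depth. A minimal sketch; the event depths below are hypothetical, chosen only to reproduce the reported two-year means:

```python
def runoff_coefficient(runoff_depth_mm, rainfall_depth_mm):
    """Runoff coefficient as defined in the abstract: runoff depth / rainfall depth."""
    return runoff_depth_mm / rainfall_depth_mm

# Hypothetical 50 mm rainfall event; runoff depths chosen to match
# the reported mean coefficients of 0.57 (gravelled) and 0.38 (ungravelled).
gravelled = runoff_coefficient(28.5, 50.0)
ungravelled = runoff_coefficient(19.0, 50.0)
```

A higher coefficient for the gravelled road is consistent with its compacted, less permeable surface shedding a larger share of each event's rainfall.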
Abstract:
Rainfall simulation experiments were carried out to measure runoff and soil water fluxes of suspended solids, total nitrogen, total phosphorus, dissolved organic carbon and total iron from sites in Pinus plantations on the coastal lowlands of south-eastern Queensland subjected to various operations (treatments). The operations investigated were cultivated and nil-cultivated site preparation, fertilised site preparation, clearfall harvesting and prescribed burning; these treatments were compared with an 8-y-old established plantation. Flow-weighted mean concentrations of total nitrogen and total phosphorus in surface runoff from the cultivated and nil-cultivated site-preparation, clearfall harvest, prescribed burning and 8-y-old established plantation treatments were very similar. However, both the soil water and the runoff from the fertilised site-preparation treatment contained more nitrogen (N) and phosphorus (P) than the other treatments, with 3.10 mg N/L and 4.32 mg P/L (4 and 20 times more) in the runoff. Dissolved organic carbon concentrations in runoff from the nil-cultivated site-preparation and prescribed burn treatments were elevated. Iron concentrations were highest in runoff from the nil-cultivated site-preparation and 8-y-old established plantation treatments. Concentrations of suspended solids in runoff were higher from the cultivated site-preparation and prescribed burn treatments, reflecting the greater disturbance of surface soil at these sites. The concentrations of all analytes were highest in initial runoff from plots and generally decreased with time. Total nitrogen (mean 7.28, range 0.11-13.27 mg/L) and total phosphorus (mean 11.60, range 0.06-83.99 mg/L) concentrations in soil water were between 2 and 10 times greater than in surface runoff, which highlights the potential for nutrient fluxes in interflow (i.e. in the soil above the water table) through the general plantation area.
Implications in regard to forest management are discussed, along with results of larger catchment-scale studies.
Abstract:
In Australia, communities are concerned about atrazine being detected in drinking water supplies. It is important to understand the mechanisms by which atrazine is transported from paddocks to waterways if we are to reduce the movement of agricultural chemicals from the site of application. Two paddocks cropped with grain sorghum on a Black Vertosol were monitored for atrazine, potassium chloride (KCl) extractable atrazine, desethylatrazine (DEA), and desisopropylatrazine (DIA) at 4 soil depths (0-0.05, 0.05-0.10, 0.10-0.20, and 0.20-0.30 m) and in runoff water and runoff sediment. Atrazine + DEA + DIA (total atrazine) had a half-life in soil of 16-20 days, a more rapid dissipation than in many earlier reports. Atrazine extracted in dilute potassium chloride, considered available for weed control, was initially 34% of the total and had a half-life of 15-20 days until day 30, after which it dissipated rapidly with a half-life of 6 days. We conclude that, in this region, atrazine may not pose a risk for groundwater contamination, as only 0.5% of applied atrazine moved deeper than 0.20 m into the soil, where it dissipated rapidly. In runoff (including suspended sediment), atrazine concentrations were greatest during the first runoff event (57 days after application; 85 μg/L) and declined with time. After 160 days, the total atrazine lost in runoff was 0.4% of the initial application. The total atrazine concentration in runoff was strongly related to the total concentration in soil, as expected. Even after 98% of the KCl-extractable atrazine had dissipated (and no longer provided weed control), runoff concentrations still exceeded the human health guideline value of 40 μg/L. For total atrazine in soil (0-0.05 m), the coefficient of soil sorption (Kd) ranged from 1.9 to 28.4 mL/g and the soil organic carbon sorption coefficient (KOC) from 100 to 2184 mL/g, increasing with time of contact with the soil and rapid dissipation of the more soluble, available phase.
Partition coefficients in runoff for total atrazine were initially 3, increasing to 32 and 51 with time; values for DEA were about half these. To minimise atrazine losses, cultural practices should be adopted that maximise rain infiltration (thereby minimising runoff) and minimise concentrations in the soil surface.
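The half-life and sorption figures above follow the standard first-order dissipation and linear partitioning relationships. A minimal sketch of both; the 18-day half-life sits within the reported 16-20 day range, and the Kd concentrations are illustrative assumptions:

```python
import math

def remaining_fraction(t_days, half_life_days):
    """First-order dissipation: fraction of chemical remaining after t days."""
    k = math.log(2) / half_life_days  # first-order rate constant (1/day)
    return math.exp(-k * t_days)

def kd(sorbed_mg_per_kg, solution_mg_per_l):
    """Linear sorption coefficient Kd (mL/g) = sorbed / solution concentration."""
    return sorbed_mg_per_kg / solution_mg_per_l

half = remaining_fraction(18, 18)     # one half-life: 0.5 remaining
frac_98 = remaining_fraction(98, 18)  # ~98% dissipated by day 98
```

With a half-life near 18 days, only about 2% of total atrazine remains by day 98, which is consistent with the abstract's observation that runoff concentrations can still exceed the 40 μg/L guideline even after most of the available atrazine has dissipated.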
Abstract:
A panel of 19 monoclonal antibodies (mAbs) was used to study the immunological variability of Lettuce mosaic virus (LMV), a member of the genus Potyvirus, and to perform a first epitope characterization of this virus. Based on their specificity of recognition against a panel of 15 LMV isolates, the mAbs could be clustered in seven reactivity groups. Surface plasmon resonance analysis indicated the presence, on the LMV particles, of at least five independent recognition/binding regions, correlating with the seven mAb reactivity groups. The results demonstrate that LMV shows significant serological variability and shed light on the LMV epitope structure. The various mAbs should prove to be a new and efficient tool for LMV diagnostics and field epidemiology studies.
Abstract:
We examined the effect of surface-applied treatments on the above-ground decay resistance of the tenon of mortice-and-tenon timber joints designed to simulate joinery that is exposed to the weather. Joints made from untreated radiata pine, Douglas-fir, brush box, spotted gum and copper-chrome-arsenic (CCA) treated radiata pine were exposed to the weather for 9 y on above-ground racks at five sites throughout eastern Australia. Results indicate (1) a poorly maintained external paint film generally accelerated decay, (2) a brush coat of water-repellent preservative inside the joints often extended serviceability (in some cases by a factor of up to seven times that of untreated joints) and (3) the level of protection provided by a coat of primer applied inside the joint varied and in most cases was not as effective as the water-repellent preservative treatment.
Abstract:
High-value fruit crops are exposed to a range of environmental conditions that can reduce fruit quality. Solar injury (SI), or sunburn, is a common disorder in tropical, sub-tropical, and temperate climates and is related to: 1) high fruit surface temperature; 2) high visible light intensity; and 3) ultraviolet (UV) radiation. Positional changes in fruit that are caused by increased weight, or abrupt changes that result from summer pruning, limb breakage, or other damage to the canopy, can expose fruit to solar radiation levels, fruit surface temperatures, and UV exposure higher than the conditions to which they are adapted. In our studies, we examined the effects of high fruit surface temperature, saturating photosynthetically-active radiation (PAR), and short-term UV exposure on chlorophyll fluorescence, respiration, and photosynthesis of fruit peel tissues from tropical and temperate fruit in a simulation of these acute environmental changes. All tropical fruits (citrus, macadamia, avocado, pineapple, and custard apple) and the apple cultivars 'Gala', 'Gold Rush', and 'Granny Smith' increased dark respiration (A0) when exposed to UV, suggesting that UV repair mechanisms were induced. The maximum quantum efficiency of photosystem II (Fv/Fm) and the quantum efficiency of photosystem II (ΦII) were unaffected, indicating no adverse effects on photosystem II (PSII). In contrast, 'Braeburn' apple had a reduced Fv/Fm with no increase in A0 on all sampling dates. There was a consistent pattern in all studies: when Fv/Fm was unaffected by UV treatment, A0 increased significantly; conversely, when Fv/Fm was reduced by UV treatment, A0 was unaffected. The pattern suggests that when UV repair mechanisms are effective, PSII is adequately protected, and that this protection occurs at the cost of higher respiration.
However, when the UV repair mechanisms are ineffective, not only is PSII damaged, but there is additional short-term damage to the repair mechanisms, indicated by a lack of respiration to provide energy.
Abstract:
The first larval instar has been identified as a critical stage for population mortality in Lepidoptera, yet, owing to the small body size of these larvae, the factors that contribute to mortality under field conditions are still not clear. Dispersal behaviour has been suggested as a significant, but ignored, factor contributing to mortality in first-instar lepidopteran larvae. The impact that leaving the host plant has on the mortality rate of Helicoverpa armigera neonates was examined in field crops and laboratory trials. This study examined: (1) the effects of soil surface temperature, and the level of shade within the crop, on the mortality of neonates on the soil after dropping off from the host plant; (2) the percentage of neonates that dropped off from a host plant and landed on the soil; and (3) the effects of exposure to different soil surface temperatures on the development and mortality of neonates. The findings showed that: (1) on the soil, surface temperatures above 43°C were lethal for neonates, and exposure to these temperatures contributed greatly to the overall mortality rate observed; however, the fate of neonates on the soil varied significantly depending on canopy closure within the crop; (2) at least 15% of neonates dropped off from the host plant and landed on the soil, meaning that the proportion of neonates exposed to these conditions is not trivial; and (3) 30 min of exposure to soil surface temperatures approaching the lethal level (>43°C) had no significant negative effects on the development and mortality of larvae through to the second instar. Overall, leaving the plant through drop-off contributes to first-instar mortality in crops with open canopies; however, survival of neonates that have lost contact with a host plant is possible, and becomes more likely later in the crop growing season.
Abstract:
Adoption of conservation tillage practices on Red Ferrosol soils in the inland Burnett area of south-east Queensland has been shown to reduce runoff and subsequent soil erosion. However, the improved infiltration resulting from these measures has not improved crop performance, and there are suggestions of increased loss of soil water via deep drainage. This paper reports soil water data monitored under real and artificial rainfall events in commercial fields and long-term tillage experiments, and uses the data to explore the rate and mechanisms of deep drainage in this soil type. Soils were characterised by large drainable porosities (≥0.10 m3/m3) in all parts of the profile to depths of 1.50 m, with drainable porosity similar to available water content (AWC) at 0.25 and 0.75 m, but >60% higher than AWC at 1.50 m. Hydraulic conductivity immediately below the tilled layer, in both continuously cropped soils and those after a ley pasture phase, was shown to decline with increasing soil moisture content, although the rate of decline was much greater in continuously cropped soil. At moisture contents approaching the drained upper limit (pore water pressure = -100 cm H2O), estimates of saturated hydraulic conductivity after a ley pasture were 3-5 times greater than in continuously cropped soil, suggesting much greater rates of deep drainage in the former when soils are moist. Hydraulic tensiometers and fringe capacitance sensors monitored during real and artificial rainfall events showed evidence of soils approaching saturation in the surface layers (top 0.30-0.40 m), but there was no evidence of soil moisture exceeding the drained upper limit (i.e. pore water pressures ≤ -100 cm H2O) in deeper layers. Recovery of applied soil water within the top 1.00-1.20 m of the profile during or immediately after rainfall events declined as the starting profile moisture content increased. These effects were consistent with very rapid rates of internal drainage.
Sensors deeper in the profile were unable to detect this drainage due to either non-uniformity of conducting macropores (i.e. bypass flow) or unsaturated conductivities in deeper layers that far exceed the saturated hydraulic conductivity of the infiltration throttle at the bottom of the cultivated layer. Large increases in unsaturated hydraulic conductivities are likely with only small increases in water content above the drained upper limit. Further studies with drainage lysimeters and large banks of hydraulic tensiometers are planned to quantify drainage risk in these soil types.
Abstract:
Runoff, soil loss, and nutrient loss were assessed on a Red Ferrosol in tropical Australia over 3 years. The experiment was conducted using bounded, 100-m2 field plots cropped to peanuts, maize, or grass. A bare plot, without cover or crop, was also established as an extreme treatment. Results showed the importance of cover in reducing runoff, soil loss, and nutrient loss from these soils. Runoff ranged from 13% of incident rainfall under conventional cultivation to 29% under bare conditions during the highest-rainfall year, and was well correlated with event rainfall and rainfall energy. Soil loss ranged from 30 t/ha.year under bare conditions to <6 t/ha.year under cropping. Nutrient losses of 35 kg N and 35 kg P/ha.year under bare conditions and 17 kg N and 11 kg P/ha.year under cropping were measured. Soil carbon analyses showed a relationship with treatment runoff, suggesting that soil properties influenced the rainfall-runoff response. The cropping systems model PERFECT was calibrated using runoff, soil loss, and soil water data. Runoff and soil loss showed good agreement with observed data in the calibration, and soil water and yield showed reasonable agreement. Long-term runs using historical weather data showed the episodic nature of runoff and soil loss events in this region and emphasise the need to manage land using protective measures such as conservation cropping practices. Farmers involved in related action-learning activities wished to incorporate conservation cropping findings into their systems but also needed clear production benefits to hasten practice change.
Abstract:
The impact of three cropping histories (sugarcane, maize and soybean) and two tillage practices (conventional tillage and direct drill) on plant-parasitic and free-living nematodes in the following sugarcane crop was examined in a field trial at Bundaberg. Soybean reduced populations of lesion nematode (Pratylenchus zeae) and root-knot nematode (Meloidogyne javanica) in comparison to previous crops of sugarcane or maize, but increased populations of spiral nematode (Helicotylenchus dihystera) and maintained populations of dagger nematode (Xiphinema elongatum). However, the effect of soybean on P. zeae and M. javanica was no longer apparent 15 weeks after planting sugarcane, and later in the season populations of these nematodes following soybean were as high as or higher than after maize or sugarcane. Populations of P. zeae were initially reduced by cultivation but, due to strong resurgence, tended to be higher in conventionally tilled than direct drill plots at the end of the plant crop. Even greater tillage effects were observed with M. javanica and X. elongatum, as nematode populations were significantly higher in conventionally tilled than direct drill plots late in the season. Populations of free-living nematodes in the upper 10 cm of soil were initially highest following soybean, but after 15, 35 and 59 weeks were lower than after sugarcane and contained fewer omnivorous and predatory nematodes. Conventional tillage increased populations of free-living nematodes in soil in comparison to direct drill and was also detrimental to omnivorous and predatory nematodes. These results suggest that crop rotation and tillage not only affect plant-parasitic nematodes directly, but also have indirect effects by impacting on the natural enemies that regulate nematode populations. More than 2 million nematodes/m2 were often present in crop residues on the surface of direct drill plots.
Bacterial-feeding nematodes were predominant in residues early in the decomposition process, but fungal-feeding nematodes predominated after 15 weeks. This indicates that fungi become an increasingly important component of the detritus food web as decomposition proceeds, and that the rate of nutrient cycling decreases with time. Correlations between total numbers of free-living nematodes and mineral N concentrations in crop residues and surface soil suggested that the free-living nematode community may provide an indication of the rate of mineralisation of N from organic matter.
Abstract:
In 2002, AFL Queensland (AFLQ) and the Brisbane Lions Football Club approached the Department of Primary Industries and Fisheries (Queensland) for advice on improving their Premier League sports fields. They were concerned about player safety and dissatisfaction with playing surfaces, particularly uneven turf cover and variable under-foot conditions. They wanted to get the best from new investments in ground maintenance equipment and irrigation infrastructure. Their sports fields were representative of community-standard, multi-use venues throughout Australia; generally ‘natural’ soil fields, with low maintenance budgets, managed by volunteers. Improvements such as reconstruction, drainage, or regular re-turfing are generally not affordable. Our project aimed to: (a) Review current world practice and performance benchmarks; (b) Demonstrate best-practice management for community-standard fields; (c) Adapt relevant methods for surface performance testing; (d) Assess current soils, and investigate useful amendments; (e) Improve irrigation system performance; and (f) Build industry capacity and encourage patterns for ongoing learning. Most global sports field research focuses on elite, sand-based fields. We adjusted elite standards for surface performance (hardness, traction, soil moisture, evenness, sward cover/height) and maintenance programs to suit community-standard fields with lesser input resources. In regularly auditing ground conditions across 12 AFLQ fields in SE QLD, we discovered surface hardness (measured by Clegg Hammer) was the No. 1 factor affecting player safety and surface performance. Other important indices were turf coverage and surface compaction (measured by penetrometer). AFLQ now regularly audits affiliated fields, and closes grounds with hardness readings greater than 190 Gmax. Aerating every two months was the primary mechanical practice improving surface condition and reducing hardness levels to <110 Gmax on the renovated project fields.
With irrigation installation, these fields now record surface conditions comparable to elite fields. These improvements encouraged many other sporting organisations to seek advice and assistance from the project team, and AFLQ has since invested substantially in an expanded ground improvement program to cater for the increased demand. In auditing irrigation systems across project fields, we identified poor maintenance (with <65% of sprinklers operating optimally) as a major problem. Retrofitting better nozzles and adjusting sprinklers improved irrigation distribution uniformity to 75-80%. Research showed that reducing irrigation frequency to weekly, and a preparedness to withhold irrigation longer after rain, reduced the irrigation requirement by 30-50% compared with industry benchmarks of 5-6 ML/ha/annum. Project team consultation with regulatory authorities enhanced irrigation efficiency under imposed regional water restrictions. Laboratory studies showed that incorporated biosolids/composts, or topdressed crumb rubber, improved the compaction resistance of soils. Field evaluations confirmed compost incorporation significantly reduced surface hardness of high-wear areas in dry conditions, whilst crumb rubber assisted turf persistence into early winter. Neither amendment was a panacea for poor agronomic practices. Under the auspices of the project trade mark Sureplay®, we published >80 articles and held >100 extension activities involving >2,000 participants. Sureplay® has developed a multi-level curator training structure and resource materials, subject to commercial implementation. The partnerships with industry bodies (particularly AFLQ), frequent extension activities, and engagement with government/regulatory sectors have been very successful, and are encouraged for any future work.
Specific aspects of sports field management for further research include: (a) Understanding of factors affecting turf wear resistance and recovery, to improve turf persistence under wear; (b) Simple tests for pinpointing areas of fields with high hardness risk; and (c) Evaluation of new irrigation infrastructure, ‘water-saving’ devices, and irrigation protocols, in improving water use and turf cover outcomes.
Abstract:
The off-site transport of agricultural chemicals, such as herbicides, into freshwater and marine ecosystems is a worldwide concern. The adoption of farm management practices that minimise herbicide transport in rainfall-runoff is a priority for the Australian sugarcane industry, particularly in the coastal catchments draining into the World Heritage listed Great Barrier Reef (GBR) lagoon. In this study, residual herbicide runoff and infiltration were measured using a rainfall simulator in a replicated trial on a brown Chromosol with 90–100% cane trash blanket cover in the Mackay Whitsunday region, Queensland. Management treatments included conventional 1.5 m spaced sugarcane beds with a single row of sugarcane (CONV) and 2 m spaced, controlled traffic sugarcane beds with dual sugarcane rows 0.8 m apart (2mCT). The aim was to simulate the first rainfall event after the application of the photosynthesis-inhibiting (PSII) herbicides ametryn, atrazine, diuron and hexazinone, applied by broadcast (100% coverage, on bed and furrow) and banded (50–60% coverage, on bed only) methods. The simulated events comprised either heavy rainfall 1 day after herbicide application, considered a worst-case scenario, or rainfall 21 days after application. The 2mCT rows had significantly (P < 0.05) less runoff (38%) and lower peak runoff rates (43%) than CONV rows for a rainfall average of 93 mm at 100 mm h−1 (1:20 yr average return interval). Additionally, final infiltration rates were higher in 2mCT rows than CONV rows, at 72 and 52 mm h−1 respectively. This resulted in load reductions of 60, 55, 47, and 48% for ametryn, atrazine, diuron and hexazinone from 2mCT rows, respectively. Herbicide losses in runoff were also reduced by 32–42% when applications were banded rather than broadcast. When rainfall was experienced 1 day after application, a large percentage of herbicides was washed off the cane trash.
However, by day 21, concentrations of herbicide residues on cane trash were lower and more resistant to washoff, resulting in lower losses in runoff. Consequently, ametryn and atrazine event mean concentrations in runoff were approximately 8-fold lower at day 21 than at day 1, whilst diuron and hexazinone were only 1.6–1.9-fold lower, suggesting longer persistence of these chemicals. Runoff collected at the end of the paddock in natural rainfall events showed consistent, though smaller, treatment differences compared with the rainfall simulation study. Overall, the combination of early application, banding and controlled traffic was most effective in reducing herbicide losses in runoff.
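The percentage load reductions and fold changes quoted above reduce to simple ratios. A minimal sketch; the absolute loads and concentrations below are hypothetical, chosen only to reproduce the reported ~60% ametryn load reduction and ~8-fold drop in event mean concentration (EMC):

```python
def percent_reduction(conventional, improved):
    """Load reduction (%) of a treatment relative to the conventional system."""
    return 100.0 * (conventional - improved) / conventional

def fold_lower(early_emc, late_emc):
    """How many times lower the later event mean concentration is."""
    return early_emc / late_emc

# Hypothetical ametryn loads (g/ha): CONV vs 2mCT rows.
ametryn_reduction = percent_reduction(10.0, 4.0)
# Hypothetical atrazine EMCs (ug/L): day 1 vs day 21.
atrazine_fold = fold_lower(80.0, 10.0)
```

The same two ratios underlie all the treatment comparisons in the abstract, whether the quantity compared is a load, a runoff volume, or a concentration.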
Abstract:
Historical stocking methods of continuous, season-long grazing of pastures, with little account taken of growing conditions, have caused some degradation within grazed landscapes in northern Australia. Alternative stocking methods have been implemented to address this degradation and raise the productivity and profitability of the principal livestock, cattle. Because information comparing stocking methods is limited, an evaluation was undertaken to quantify the effects of stocking methods on pastures, soils and grazing capacity. The approach was to monitor existing stocking methods on nine commercial beef properties in north and south Queensland. Environments included native and exotic pastures and eucalypt (lighter soil) and brigalow (heavier soil) land types. Breeding and growing cattle were grazed under each method. The owners/managers, formally trained in pasture and grazing management, made all management decisions affecting the study sites. Three stocking methods were compared: continuous (with rest), extensive rotation and intensive rotation (commonly referred to as 'cell grazing'). There were two or three stocking methods examined on each property: in total, 21 methods (seven continuous, six extensive rotations and eight intensive rotations) were monitored over 74 paddocks between 2006 and 2009. Pasture and soil surface measurements were made in the autumns of 2006, 2007 and 2009, while paddock grazing was analysed from property records for the period from 2006 to 2009. The first 2 years had drought conditions (average rainfall decile of 3.4) but were followed by 2 years of above-average rainfall. There were no consistent differences between stocking methods across all sites over the 4 years for herbage mass, plant species composition, total and litter cover, or landscape function analysis (LFA) indices.
There were large responses to rainfall in the last 2 years, with mean herbage mass in the autumn increasing from 1970 kg DM/ha in 2006-07 to 3830 kg DM/ha in 2009. Over the same period, ground and litter cover and LFA indices increased. Across all sites and 4 years, mean grazing capacity was similar for the three stocking methods. There were, however, significant differences in grazing capacity between stocking methods at four sites, but these differences were not consistent between stocking methods or sites. Both the continuous and intensive rotation methods supported the highest average annual grazing capacity at different sites. The results suggest that cattle producers can obtain similar ecological responses and carry similar numbers of livestock under any of the three stocking methods.