988 results for No tillage
Abstract:
Long-term loss of soil C stocks under conventional tillage (CT) and accrual of soil C following adoption of no-tillage (NT) have been well documented. No-tillage use is spreading, but it is common to occasionally till within a no-till regime or to regularly alternate between till and no-till practices within a rotation of different crops. Short-term studies indicate that substantial amounts of C can be lost from the soil immediately following a tillage event, but there are few field studies that have investigated the impact of infrequent tillage on soil C stocks. How much of the C sequestered under no-tillage is likely to be lost if the soil is tilled? What are the longer-term impacts of continued infrequent tillage? If producers are to be compensated for sequestering C in soil following adoption of conservation tillage practices, the impacts of infrequent tillage need to be quantified. A few studies have examined the short-term impacts of tillage on soil C and several have investigated the impacts of adoption of continuous no-tillage. We present: (1) results from a modeling study carried out to address these questions more broadly than the published literature allows, (2) a review of the literature examining the short-term impacts of tillage on soil C, (3) a review of published studies on the physical impacts of tillage and (4) a synthesis of these components to assess how infrequent tillage impacts soil C stocks and how changes in tillage frequency could impact soil C stocks and C sequestration. Results indicate that soil C declines significantly following even one tillage event (1–11% of soil C lost). Longer-term losses increase as frequency of tillage increases. Model analyses indicate that cultivating and ripping are less disruptive than moldboard plowing; soil C for those treatments averages just 6% less than under continuous NT, compared with 27% less for CT. Most (80%) of the soil C gains of NT can be realized when NT is coupled with biennial cultivating or ripping.
© 2007 Elsevier B.V. All rights reserved.
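The percentages in the modelling results above can be checked for internal consistency. The sketch below uses only the figures reported in the abstract (6% and 27% below continuous NT); the absolute NT stock is a hypothetical value, since the abstract reports relative differences only.

```python
# Consistency check on the relative soil C levels reported in the abstract.
# nt_stock is a hypothetical absolute value (arbitrary units); the percentages are from the text.

nt_stock = 100.0                    # soil C under continuous no-till (hypothetical)
ct_stock = nt_stock * (1 - 0.27)    # conventional tillage: ~27% less than NT
cult_rip = nt_stock * (1 - 0.06)    # cultivating/ripping every second year: ~6% less than NT

nt_gain = nt_stock - ct_stock       # C gain attributable to NT over CT
retained = cult_rip - ct_stock      # gain retained under infrequent cultivating/ripping
fraction = retained / nt_gain       # fraction of NT gains realized

print(round(fraction, 2))  # 0.78, close to the ~80% stated in the abstract
```

The two reported percentages thus imply the ~80% figure directly, independent of the absolute soil C stock assumed.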
Abstract:
No-tillage (NT) management has been promoted as a practice capable of offsetting greenhouse gas (GHG) emissions because of its ability to sequester carbon in soils. However, true mitigation is only possible if the overall impact of NT adoption reduces the net global warming potential (GWP) determined by fluxes of the three major biogenic GHGs (i.e. CO2, N2O, and CH4). We compiled all available data from soil-derived GHG emission comparisons between conventional tilled (CT) and NT systems for humid and dry temperate climates. Newly converted NT systems increase GWP relative to CT practices, in both humid and dry climate regimes, and longer-term adoption (>10 years) only significantly reduces GWP in humid climates. Mean cumulative GWP over a 20-year period is also reduced under continuous NT in dry areas, but with a high degree of uncertainty. Emissions of N2O drive much of the trend in net GWP, suggesting improved nitrogen management is essential to realize the full benefit from carbon storage in the soil for purposes of global warming mitigation. Our results indicate a strong time dependency in the GHG mitigation potential of NT agriculture, demonstrating that GHG mitigation by adoption of NT is much more variable and complex than previously considered, and policy plans to reduce global warming through this land management practice need further scrutiny to ensure success.
Abstract:
The effect of a change of tillage and crop residue management practice on the chemical and microbiological properties of a cereal-producing red duplex soil was investigated by superimposing each of three management practices (CC: conventional cultivation, stubble burnt, crop conventionally sown; DD: direct-drilling, stubble retained, no cultivation, crop direct-drilled; SI: stubble incorporated with a single cultivation, crop conventionally sown), for a 3-year period on plots previously managed with each of the same three practices for 14 years. A change from DD to CC or SI practice resulted in a significant decline, in the top 0-5 cm of soil, in organic C, total N, electrical conductivity, NH4-N, NO3-N, soil moisture holding capacity, microbial biomass and CO2 respiration as well as a decline in the microbial quotient (the ratio of microbial biomass C to organic C; P <0.05). In contrast, a change from SI to DD or CC practice or a change from CC to DD or SI practice had only negligible impact on soil chemical properties (P >0.05). However, there was a significant increase in microbial biomass and the microbial quotient in the top 0-5 cm of soil following the change from CC to DD or SI practice and with the change from SI to DD practice (P <0.05). Analysis of ester-linked fatty acid methyl esters (EL-FAMEs) extracted from the 0- to 5-cm and 5- to 10-cm layers of the soils of the various treatments detected changes in the FAME profiles following a change in tillage practice. A change from DD practice to SI or CC practice was associated with a significant decline in the ratio of fungal to bacterial fatty acids in the 0- to 5-cm soil (P <0.05). The results show that a change in tillage practice, particularly the cultivation of a previously minimum-tilled (direct-drilled) soil, will result in significant changes in soil chemical and microbiological properties within a 3-year period.
They also show that soil microbiological properties are sensitive indicators of a change in tillage practice.
Abstract:
The effects of tillage practices and the methods of chemical application on atrazine and alachlor losses through run-off were evaluated for five treatments: conservation (untilled) with surface application (US), disk with surface application, plow with surface application, disk with preplant incorporation, and plow with preplant incorporation. A rainfall simulator was used to create 63.5 mm/h of rainfall for 60 min and 127 mm/h for 15 min. Rainfall simulation occurred 24-36 h after chemical application. There was no significant difference in the run-off volume among the treatments but the untilled treatment significantly reduced erosion loss. The untilled treatments had the highest herbicide concentration and the disk treatments were higher than the plow treatments. The surface treatments showed a higher concentration than the incorporated treatments. The concentration of herbicides in the water decreased with time. Among the experimental sites, the one with sandy loam soil produced the greatest losses, both in terms of the run-off volume and herbicide loss. The US treatments had the highest loss and the herbicide incorporation treatments had smaller losses through run-off as the residue cover was effective in preventing herbicide losses. Incorporation might be a favorable method of herbicide application to reduce the herbicide losses by run-off.
Abstract:
Continuous cultivation and cereal cropping of southern Queensland soils previously supporting native vegetation have resulted in reduced soil nitrogen supply, and consequently decreased cereal grain yields and low grain protein. To enhance yields and protein concentrations of wheat, management practices involving N fertiliser application, with no-tillage and stubble retention, grain legumes, and legume leys were evaluated from 1987 to 1998 on a fertility-depleted Vertosol at Warra, southern Queensland. The objective of this study was to examine the effect of lucerne in a 2-year lucerne–wheat rotation for its nitrogen and disease-break benefits to subsequent grain yield and protein content of wheat as compared with continuous wheat cropping. Dry matter production and nitrogen yields of lucerne were closely correlated with the total rainfall for October–September as well as March–September rainfall. Each 100 mm of total rainfall resulted in 0.97 t/ha of dry matter and 26 kg/ha of nitrogen yield. For the March–September rainfall, the corresponding values were 1.26 t/ha of dry matter and 36 kg/ha of nitrogen yield. The latter values were 10% lower than those produced by annual medics during a similar period. Compared with wheat–wheat cropping, significant increases in total soil nitrogen were observed only in 1990, 1992 and 1994 but increases in soil mineralisable nitrogen were observed in most years following lucerne. Similarly, pre-plant nitrate nitrogen in the soil profile following lucerne was higher by 74 kg/ha (9–167 kg N/ha) than that of wheat–wheat without N fertiliser in all years except 1996. Consequently, higher wheat grain protein (7 out of 9 seasons) and grain yield (4 out of 9 seasons) were produced compared with continuous wheat. There was significant depression in grain yield in 2 (1993 and 1995) out of 9 seasons attributed to soil moisture depletion and/or low growing season rainfall. 
Consequently, the overall responses in yield were lower than those of 50 kg/ha of fertiliser nitrogen applied to wheat–wheat crops, 2-year medic–wheat or chickpea–wheat rotation, although grain protein concentrations were higher following lucerne. The incidence and severity of the soilborne disease, common root rot of wheat caused by Bipolaris sorokiniana, were generally higher in lucerne–wheat than in continuous wheat with no nitrogen fertiliser applications, since severity was significantly correlated with plant available water at sowing. No significant incidence of crown rot or root lesion nematode was observed. Thus, productivity, which was mainly due to nitrogen accretion in this experiment, can be maintained where short duration lucerne leys are grown in rotations with wheat.
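The linear rainfall-to-production relationships reported for lucerne above can be written as a small function. The slope coefficients come straight from the abstract; the 500 mm season used in the example is a made-up input for illustration.

```python
# Linear lucerne production model, per the slopes reported in the abstract
# (0.97 t/ha dry matter and 26 kg/ha N per 100 mm of October-September rainfall).

def lucerne_production(total_rainfall_mm):
    """Estimate lucerne dry matter (t/ha) and N yield (kg/ha) from total rainfall."""
    dm = 0.97 * total_rainfall_mm / 100  # dry matter, t/ha
    n = 26 * total_rainfall_mm / 100     # nitrogen yield, kg/ha
    return dm, n

dm, n = lucerne_production(500)  # hypothetical 500 mm season
print(round(dm, 2), round(n))    # 4.85 130
```

For the March–September rainfall window, the abstract's alternative slopes (1.26 t/ha and 36 kg/ha per 100 mm) could be substituted in the same way.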
Abstract:
Soil nitrogen (N) supply in the Vertosols of southern Queensland, Australia has steadily declined as a result of long-term cereal cropping without N fertiliser application or rotations with legumes. Nitrogen-fixing legumes such as lucerne may enhance soil N supply and therefore could be used in lucerne-wheat rotations. However, lucerne leys in this subtropical environment can create a soil moisture deficit, which may persist for a number of seasons. Therefore, we evaluated the effect of varying the duration of a lucerne ley (for up to 4 years) on soil N increase, N supply to wheat, soil water changes, wheat yields and wheat protein on a fertility-depleted Vertosol in a field experiment between 1989 and 1996 at Warra (26°47'S, 150°53'E), southern Queensland. The experiment consisted of a wheat-wheat rotation, and 8 treatments of lucerne leys starting in 1989 (phase 1) or 1990 (phase 2) for 1, 2, 3 or 4 years duration, followed by wheat cropping. Lucerne dry matter (DM) yield and N yield increased with increasing duration of lucerne leys. Soil N increased over time following 2 years of lucerne but there was no further significant increase after 3 or 4 years of lucerne ley. Soil nitrate concentrations increased significantly with all lucerne leys and moved progressively downward in the soil profile from 1992 to 1995. Soil water, especially at 0.9-1.2 m depth, remained significantly lower for the next 3 years after the termination of the 4 year lucerne ley than under continuous wheat. No significant increase in wheat yields was observed from 1992 to 1995, irrespective of the lucerne ley. However, wheat grain protein concentrations were significantly higher under lucerne-wheat than under wheat-wheat rotations for 3-5 years. The lucerne yield and soil water and nitrate-N concentrations were satisfactorily simulated with the APSIM model.
Although significant N accretion occurred in the soil following lucerne leys, in drier seasons the soil profile dried by long-duration lucerne did not recharge until 3 years after termination. Consequently, 3- and 4-year lucerne-wheat rotations resulted in more variable wheat yields than wheat-wheat rotations in this region. The remaining challenge in using lucerne-wheat rotations is balancing the N accretion benefits with plant-available water deficits, which are most likely to occur in the highly variable rainfall conditions of this region.
Abstract:
Tillage is defined here in a broad sense, including disturbance of the soil and crop residues, wheel traffic and sowing opportunities. In sub-tropical, semi-arid cropping areas in Australia, tillage systems have evolved from intensively tilled bare fallow systems, with high soil losses, to reduced and no tillage systems. In recent years, the use of controlled traffic has also increased. These conservation tillage systems are successful in reducing water erosion of soil and sediment-bound chemicals. Control of runoff of dissolved nutrients and weakly sorbed chemicals is less certain. Adoption of new practices appears to have been related to practical and economic considerations, and proved to be more profitable after a considerable period of research and development. However, there are still challenges. One challenge is to ensure that systems that reduce soil erosion, which may involve greater use of chemicals, do not degrade water quality in streams. Another challenge is to ensure that systems that improve water entry do not increase drainage below the crop root zone, which would increase the risk of salinity. Better understanding of how tillage practices influence soil hydrology, runoff and erosion processes should lead to better tillage systems and enable better management of risks to water quality and soil health. Finally, the need to determine the effectiveness of in-field management practices in achieving stream water quality targets in large, multi-land use catchments will challenge our current knowledge base and the tools available.
Abstract:
Reduced supplies of nitrogen (N) in many soils of southern Queensland that were cropped exhaustively with cereals over many decades have been the focus of much research to avoid declines in profitability and sustainability of farming systems. A 45-month period of mixed grass (purple pigeon grass, Setaria incrassata Stapf; Rhodes grass, Chloris gayana Kunth.) and legume (lucerne, Medicago sativa L.; annual medics, M. scutellata (L.) Mill. and M. truncatula Gaertn.) pasture was one of several options that were compared at a fertility-depleted Vertosol at Warra, southern Queensland, to improve grain yields or increase grain protein concentration of subsequent wheat crops. Objectives of the study were to measure the productivity of a mixed grass and legume pasture grown over 45 months (cut and removed over 36 months) and its effects on yield and protein concentrations of the following wheat crops. Pasture production (DM t/ha) and aboveground plant N yield (kg/ha) for grass, legume (including a small amount of weeds) and total components of pasture responded linearly to total rainfall over the duration of each of 3 pastures sown in 1986, 1987 and 1988. Averaged over the 3 pastures, each 100 mm of rainfall resulted in 0.52 t/ha of grass, 0.44 t/ha of legume and 0.97 t/ha of total pasture DM, there being little variation between the 3 pastures. Aboveground plant N yield of the 3 pastures ranged from 17.2 to 20.5 kg/ha per 100 mm rainfall. Aboveground legume N in response to total rainfall was similar (10.6–13.2 kg/ha per 100 mm rainfall) across the 3 pastures in spite of very different populations of legumes and grasses at establishment. Aboveground grass N yield was 5.2–7.0 kg/ha per 100 mm rainfall. In most wheat crops following pasture, wheat yields were similar to that of unfertilised wheat except in 1990 and 1994, when grain yields were significantly higher but similar to that for continuous wheat fertilised with 75 kg N/ha.
In contrast, grain protein concentrations of most wheat crops following pasture responded positively, being substantially higher than unfertilised wheat but similar to that of wheat fertilised with 75 kg N/ha. Grain protein averaged over all years of assay was increased by 25–40% compared with that of unfertilised wheat. Stored water supplies after pasture were < 134 mm (< 55% of plant available water capacity); for most assay crops water storages were 67–110 mm, an equivalent wet soil depth of only 0.3–0.45 m. Thus, the crop assays of pasture benefits were limited by low water supply to wheat crops. Moreover, the severity of common root rot in wheat crops was not reduced by pasture–wheat rotation.
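The per-100 mm dry matter partition reported in the abstract above can be checked arithmetically; the values below are the abstract's own averages over the 3 pastures.

```python
# Per-100 mm rainfall DM partition from the abstract (t/ha per 100 mm).
grass_dm = 0.52   # grass component
legume_dm = 0.44  # legume component (includes a small amount of weeds)
total_dm = 0.97   # total pasture DM, as reported

# Components nearly sum to the reported total; the small gap reflects rounding in the source.
print(round(grass_dm + legume_dm, 2))  # 0.96, vs reported total of 0.97
```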
Abstract:
No-tillage (NT) practice, where straw is retained on the soil surface, is increasingly being used in cereal cropping systems in Australia and elsewhere. Compared to conventional tillage (CT), where straw is mixed with the ploughed soil, NT practice may reduce straw decomposition, increase nitrogen immobilisation and increase organic carbon in the soil. This study examined 15N-labelled wheat straw (stubble) decomposition in four treatments (NT v. CT, with N rates of 0 and 75 kg/ha.year) and assessed the tillage and fertiliser N effects on mineral N and organic C and N levels over a 10-year period in a field experiment. NT practice decreased the rate of straw decomposition while fertiliser N application increased it. However, there was no tillage practice × N interaction. The mean residence time of the straw N in soil was more than twice as long under the NT (1.2 years) as compared to the CT practice (0.5 years). In comparison, differences in mean residence time due to N fertiliser treatment were small. However, tillage had generally very little effect on either the amounts of mineral N at sowing or soil organic C (and N) over the study period. While application of N fertiliser increased mineral N, it had very little effect on organic C over the 10-year period. Relatively rapid decomposition of straw and short mean residence time of straw N in a Vertisol is likely to have very little long-term effect on N immobilisation and organic C level in an annual cereal cropping system in a subtropical, semiarid environment. Thus, changing the tillage practice from CT to NT may not create an additional N requirement unless use is made of additional stored water in the soil or mineral N loss due to increased leaching is compensated for in N supply to crops.
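The mean residence times (MRTs) reported above translate into turnover rates if straw N loss is treated as roughly first-order decay. That exponential model is an assumption for illustration, not something stated in the abstract; only the MRT values (1.2 years under NT, 0.5 years under CT) come from the text.

```python
import math

# Assuming first-order decay (an illustrative assumption), an MRT implies a rate k = 1/MRT,
# so the fraction of straw N remaining after t years is exp(-t / MRT).

def fraction_remaining(years, mrt_years):
    """Fraction of straw N remaining after `years` under first-order turnover."""
    return math.exp(-years / mrt_years)

print(round(fraction_remaining(1.0, 1.2), 2))  # NT (MRT 1.2 y): 0.43 remaining after 1 year
print(round(fraction_remaining(1.0, 0.5), 2))  # CT (MRT 0.5 y): 0.14 remaining after 1 year
```

Under this reading, roughly three times as much straw N persists through the first year under NT as under CT, consistent with the slower decomposition the study observed.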
Abstract:
Winter cereal cropping is marginal in south-west Queensland because of low and variable rainfall and declining soil fertility. Increasing the soil water storage and the efficiency of water and nitrogen (N) use is essential for sustainable cereal production. The effect of zero tillage and N fertiliser application on these factors was evaluated in wheat and barley from 1996 to 2001 on a grey Vertosol. Annual rainfall was above average in 1996, 1997, 1998 and 1999 and below average in 2000 and 2001. Due to drought, no crop was grown in the 2000 winter cropping season. Zero tillage improved fallow soil water storage by a mean value of 20 mm over 4 years, compared with conventional tillage. However, mean grain yield and gross margin of wheat were similar under conventional and zero tillage. Wheat grain yield and/or grain protein increased with N fertiliser application in all years, resulting in an increase in mean gross margin over 5 years from $86/ha, with no N fertiliser applied, to $250/ha, with N applied to target ≥13% grain protein. A similar increase in gross margin occurred in barley where N fertiliser was applied to target malting grade. The highest N fertiliser application rate in wheat resulted in a residual benefit to soil N supply for the following crop. This study has shown that profitable responses to N fertiliser addition in wheat and barley can be obtained on long-term cultivated Vertosols in south-west Queensland when soil water reserves at sowing are at least 60% of plant available water capacity, or rainfall during the growing season is above average. An integrative benchmark for improved N fertiliser management appears to be the gross margin/water use of ~$1/ha.mm. Greater fallow soil water storage or crop water use efficiency under zero tillage has the potential to improve winter cereal production in drier growing seasons than experienced during the period of this study.
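The ~$1/ha.mm benchmark mentioned in the abstract above is a simple ratio of gross margin to crop water use. In the sketch below the $250/ha gross margin is from the abstract, while the 250 mm crop water use is a hypothetical input chosen only to illustrate the calculation.

```python
# Gross margin per unit of crop water use ($/ha.mm), the benchmark quoted in the abstract.

def gross_margin_per_mm(gross_margin_dollars_ha, water_use_mm):
    """Return gross margin divided by crop water use."""
    return gross_margin_dollars_ha / water_use_mm

# $250/ha mean gross margin (from the abstract) over an assumed 250 mm of crop water use:
ratio = gross_margin_per_mm(250, 250)
print(ratio)  # 1.0 -> on the ~$1/ha.mm benchmark
```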
Abstract:
The impacts of cropping histories (sugarcane, maize and soybean), tillage practices (conventional tillage and direct drill) and fertiliser N in the plant and 1st ratoon (1R) crops of sugarcane were examined in field trials at Bundaberg and Ingham. Average yields at Ingham (Q200) and Bundaberg (Q151) were quite similar in both the plant crop (83 t/ha and 80 t/ha, respectively) and the 1R (89 t/ha and 94 t/ha, respectively), with only minor treatment effects on CCS at each site. Cane yield responses to tillage, break history and N fertiliser varied significantly between sites. There was a 27% yield increase in the plant crop from the soybean fallow at Ingham, with soybeans producing a yield advantage over continuous cane, but there were no clear break effects at Bundaberg - possibly due to a complex of pathogenic nematodes that responded differently to soybean and maize breaks. There was no carryover benefit of the soybean break into the 1R crop at Ingham, while at Bundaberg the maize break produced a 15% yield advantage over soybeans and continuous cane. The Ingham site recorded positive responses to N fertiliser addition in both the plant (20% yield increase) and 1R (34% yield increase) crops, but there was negligible carryover benefit from plant-crop N in the 1R crop, and no evidence of a reduced N response after a soybean rotation. By contrast, the Bundaberg site showed no N response in any history in the plant crop, and only a small (5%) yield increase with N applied in the 1R crop. There was again no evidence of a reduced N response in the 1R crop after a soybean fallow. There were no significant effects of tillage on cane yields at either site, although there were some minor interactions between tillage, breaks and N management in the 1R crop at both sites. Crop N contents at Bundaberg were more than 3 times those recorded at Ingham in both the plant and 1R crops, with N concentrations in millable stalk at Ingham suggesting N deficiencies in all treatments.
There was negligible additional N recovered in crop biomass from N fertiliser application or soybean residues at the Ingham site. There was additional N recovered in crop biomass in response to N fertiliser and soybean breaks at Bundaberg, but effects were small and fertiliser use efficiencies poor. Loss pathways could not be quantified, but denitrification or losses in runoff were the likely causes at Ingham while leaching predominated at Bundaberg. Results highlight the complexity involved in developing sustainable farming systems for contrasting soil types and climatic conditions. A better understanding of key sugarcane pathogens and their host range, as well as improved capacity to predict in-crop N mineralisation, will be key factors in future improvements to sugarcane farming systems.
Abstract:
The impact of three cropping histories (sugarcane, maize and soybean) and two tillage practices (conventional tillage and direct drill) on plant-parasitic and free-living nematodes in the following sugarcane crop was examined in a field trial at Bundaberg. Soybean reduced populations of lesion nematode (Pratylenchus zeae) and root-knot nematode (Meloidogyne javanica) in comparison to previous crops of sugarcane or maize but increased populations of spiral nematode (Helicotylenchus dihystera) and maintained populations of dagger nematode (Xiphinema elongatum). However, the effect of soybean on P. zeae and M. javanica was no longer apparent 15 weeks after planting sugarcane, while later in the season, populations of these nematodes following soybean were as high as or higher than maize or sugarcane. Populations of P. zeae were initially reduced by cultivation but due to strong resurgence tended to be higher in conventionally tilled than direct drill plots at the end of the plant crop. Even greater tillage effects were observed with M. javanica and X. elongatum, as nematode populations were significantly higher in conventionally tilled than direct drill plots late in the season. Populations of free-living nematodes in the upper 10 cm of soil were initially highest following soybean, but after 15, 35 and 59 weeks were lower than after sugarcane and contained fewer omnivorous and predatory nematodes. Conventional tillage increased populations of free-living nematodes in soil in comparison to direct drill and was also detrimental to omnivorous and predatory nematodes. These results suggest that crop rotation and tillage not only affect plant-parasitic nematodes directly, but also have indirect effects by impacting on natural enemies that regulate nematode populations. More than 2 million nematodes/m² were often present in crop residues on the surface of direct drill plots.
Bacterial-feeding nematodes were predominant in residues early in the decomposition process but fungal-feeding nematodes predominated after 15 weeks. This indicates that fungi become an increasingly important component of the detritus food web as decomposition proceeds, and that the rate of nutrient cycling decreases with time. Correlations between total numbers of free-living nematodes and mineral N concentrations in crop residues and surface soil suggested that the free-living nematode community may provide an indication of the rate of mineralisation of N from organic matter.
Abstract:
A field experiment was established in which an amendment of poultry manure and sawdust (200 t/ha) was incorporated into some plots but not others and then a permanent pasture or a sequence of biomass-producing crops was grown with and without tillage, with all biomass being returned to the soil. After 4 years, soil C levels were highest in amended plots, particularly those that had been cropped using minimum tillage, and lowest in non-amended and fallowed plots, regardless of how they had been tilled. When ginger was planted, symphylans caused severe damage to all treatments, indicating that cropping, tillage and organic matter management practices commonly used to improve soil health are not necessarily effective for all crops or soils. During the rotational phase of the experiment, the development of suppressiveness to three key pathogens of ginger was monitored using bioassays. Results for root-knot nematode (Meloidogyne javanica) indicated that for the first 2 years, amended soil was more suppressive than non-amended soil from the same cropping and tillage treatment, whereas under pasture, the amendment only enhanced suppressiveness in the first year. Suppressiveness was generally associated with higher C levels and enhanced biological activity (as measured by the rate of fluorescein diacetate (FDA) hydrolysis and numbers of free-living nematodes). Reduced tillage also enhanced suppressiveness, as gall ratings and egg counts in the second and third years were usually significantly lower in cropped soils under minimum rather than conventional tillage. Additionally, soil that was not disturbed during the process of setting up bioassays was more suppressive than soil which had been gently mixed by hand. Results of bioassays with Fusarium oxysporum f. sp. zingiberi were too inconsistent to draw firm conclusions, but the severity of fusarium yellows was generally higher in fumigated fallow soil than in other treatments, with soil management practices having little impact on disease severity. With regard to Pythium myriotylum, biological factors capable of reducing rhizome rot were present, but were not effective enough to suppress the disease under environmental conditions that were ideal for disease development.
Abstract:
Dairy farms located in the subtropical cereal belt of Australia rely on winter and summer cereal crops, rather than pastures, for their forage base. Crops are mostly established in tilled seedbeds and the system is vulnerable to fertility decline and water erosion, particularly over summer fallows. Field studies were conducted over 5 years on contrasting soil types, a Vertosol and Sodosol, in the 650-mm annual-rainfall zone to evaluate the benefits of a modified cropping program on forage productivity and the soil-resource base. Growing forage sorghum as a double-crop with oats increased total mean annual production over that of winter sole-crop systems by 40% and 100% on the Vertosol and Sodosol sites respectively. However, mean annual winter crop yield was halved and overall forage quality was lower. Ninety per cent of the variation in winter crop yield was attributable to fallow and in-crop rainfall. Replacing forage sorghum with the annual legume lablab reduced fertiliser nitrogen (N) requirements and increased forage N concentration, but reduced overall annual yield. Compared with sole-cropped oats, double-cropping reduced the risk of erosion by extending the duration of soil water deficits and increasing the time ground was under plant cover. When grown as a sole-crop, well fertilised forage sorghum achieved a mean annual cumulative yield of 9.64 and 6.05 t DM/ha on the Vertosol and Sodosol, respectively, being about twice that of sole-cropped oats. Forage sorghum established using zero-tillage practices and fertilised at 175 kg N/ha per crop achieved a significantly higher yield and forage N concentration than did the industry-standard forage sorghum (conventional tillage and 55 kg N/ha per crop) on the Vertosol but not on the Sodosol. On the Vertosol, mean annual yield increased from 5.65 to 9.64 t DM/ha (33 kg DM/kg N fertiliser applied above the base rate); the difference in the response between the two sites was attributed to soil type and fertiliser history.
Changing both tillage practices and N-fertiliser rate had no effect on fallow water-storage efficiency but did improve fallow ground cover. When forage sorghum, grown as a sole crop, was replaced with lablab in 3 of the 5 years, overall forage N concentration increased significantly, and on the Vertosol, yield and soil nitrate-N reserves also increased significantly relative to industry-standard sorghum. All forage systems maintained or increased the concentration of soil nitrate-N (0-1.2-m soil layer) over the course of the study. Relative to sole-crop oats, alternative forage systems were generally beneficial to the concentration of surface-soil (0-0.1 m) organic carbon and systems that included sorghum showed most promise for increasing soil organic carbon concentration. We conclude that an emphasis on double- or summer sole-cropping rather than winter sole-cropping will advantage both farm productivity and the soil-resource base.
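The fertiliser-response figure quoted in the abstract above (33 kg DM per kg of N applied above the base rate) can be reproduced from the yields and rates it reports; all inputs below are taken directly from the abstract.

```python
# Check of the Vertosol fertiliser response quoted in the abstract.
base_yield = 5.65   # t DM/ha at the industry-standard N rate (55 kg N/ha per crop)
high_yield = 9.64   # t DM/ha at the higher N rate (175 kg N/ha per crop)
extra_n = 175 - 55  # kg N/ha applied above the base rate

# Yield response per kg of extra N, converted from t to kg of dry matter:
response = (high_yield - base_yield) * 1000 / extra_n
print(round(response))  # 33, matching the reported 33 kg DM/kg N
```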