41 results for Soil organic C
Abstract:
Increasing organic carbon inputs to agricultural soils through the use of pastures or crop residues has been suggested as a means of restoring soil organic carbon lost via anthropogenic activities, such as land use change. However, the decomposition and retention of different plant residues in soil, and how these processes are affected by soil properties and nitrogen fertiliser application, are not fully understood. We evaluated the rate and extent of decomposition of 13C pulse-labelled plant material in response to nitrogen addition in four pasture soils of varying physico-chemical characteristics. Microbial respiration of buffel grass (Cenchrus ciliaris L.), wheat (Triticum aestivum L.) and lucerne (Medicago sativa L.) residues was monitored over 365 days. A double exponential model fitted to the data suggested that microbial respiration occurred in an early rapid stage and a late slow stage. A weighted three-compartment mixing model estimated the decomposition of both soluble and insoluble plant 13C (mg C kg−1 soil). Total plant material decomposition followed the alkyl C:O-alkyl C ratio of the plant material, as determined by solid-state 13C nuclear magnetic resonance spectroscopy. Urea-N addition increased the decomposition of insoluble plant 13C in some soils (≤0.1% total nitrogen) but not in others (0.3% total nitrogen). Principal components regression analysis indicated that 26% of the variability in plant material decomposition was explained by soil physico-chemical characteristics (P = 0.001), primarily the soil C:N ratio. We conclude that plant materials with a higher alkyl C:O-alkyl C ratio are better retained as soil organic matter, and that the C:N stoichiometry of soils determines whether N addition leads to increases in soil organic carbon stocks.
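A minimal sketch, on hypothetical data, of how a two-pool double exponential model of the kind described above can be fitted; the data values, starting parameters and variable names are illustrative assumptions, not figures from the study:

```python
# Minimal sketch (not the authors' code): fit a two-pool double
# exponential model to cumulative respiration data.
import numpy as np
from scipy.optimize import curve_fit

def double_exponential(t, c_fast, k_fast, c_slow, k_slow):
    """Cumulative respired C: a fast pool plus a slow pool, each first-order."""
    return c_fast * (1 - np.exp(-k_fast * t)) + c_slow * (1 - np.exp(-k_slow * t))

# Hypothetical observations: days since residue addition and cumulative
# CO2-C respired (mg C per kg soil); values are illustrative only.
t = np.array([7.0, 14, 30, 60, 120, 240, 365])
c_resp = np.array([120.0, 180, 250, 310, 370, 430, 470])

popt, _ = curve_fit(double_exponential, t, c_resp, p0=[200, 0.1, 400, 0.005])
c_fast, k_fast, c_slow, k_slow = popt
print(f"fast pool: {c_fast:.0f} mg C/kg, k = {k_fast:.3f}/day")
print(f"slow pool: {c_slow:.0f} mg C/kg, k = {k_slow:.4f}/day")
```

The early rapid stage is captured by the fast pool (large rate constant) and the late slow stage by the slow pool (small rate constant).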
Abstract:
Attention is directed at land application of piggery effluent (containing urine, faeces, water, and wasted feed) as a potential source of water resource contamination with phosphorus (P). This paper summarises P-related properties of soil from 0-0.05 m depth at 11 piggery effluent application sites, in order to explore the impact that effluent application has had on the potential for run-off transport of P. The sites investigated were situated on Alfisol, Mollisol, Vertisol, and Spodosol soils in areas that received effluent for 1.5-30 years (estimated effluent-P applications of 100-310000 kg P/ha in total). Total (PT), bicarbonate-extractable (PB), and soluble P forms were determined for the soil (0-0.05 m) at paired effluent and no-effluent sites, as well as texture, oxalate-extractable Fe and Al, organic carbon, and pH. All forms of soil P at 0-0.05 m depth increased with effluent application (PB at effluent sites was 1.7-15 times that at no-effluent sites) at 10 of the 11 sites. Increases in PB were strongly related to net P applications (regression analysis of log values for 7 sites with complete data sets: 82.6% of variance accounted for, P < 0.01). Effluent irrigation tended to increase the proportion of soil PT in dilute CaCl2-extractable forms (PTC: effluent average 2.0%; no-effluent average 0.6%). The proportion of PTC in non-molybdate reactive forms (centrifuged supernatant) decreased (no-effluent average, 46.4%; effluent average, 13.7%). Anaerobic lagoon effluent did not reliably acidify soil, since no consistent relationship was observed for pH with effluent application. Soil organic carbon was increased in most of the effluent areas relative to the no-effluent areas. The four effluent areas where organic carbon was reduced had undergone intensive cultivation and cropping. Current effluent management at many of the piggeries failed to maximise the potential for waste P recapture. Ten of the case-study effluent application areas have received effluent-P in excess of crop uptake. While this may not represent a significant risk of leaching where sorption retains P, it has increased the risk of transport of P by run-off. Where such sites are close to surface water, run-off P loads should be managed.
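One plausible reading of the reported regression, hedged because the abstract gives neither the exact transformation nor the coefficients, is a log-log fit of bicarbonate-extractable P on net P applied:

$$\log(P_B) = a + b\,\log(P_{\mathrm{applied}})$$

with 82.6% of variance accounted for (P < 0.01) across the 7 sites with complete data; the intercept $a$ and slope $b$ are not given in the abstract.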
Abstract:
Piggery pond sludge (PPS) was applied, as-collected (Wet PPS) and following stockpiling for 12 months (Stockpiled PPS), to a sandy Sodosol and clay Vertosol at sites on the Darling Downs of Queensland. Laboratory measures of N availability were carried out on unamended and PPS-amended soils to investigate their value in estimating supplementary N needs of crops in Australia's northern grains region. Cumulative net N mineralised from the long-term (30 weeks) leached aerobic incubation was described by a first-order single exponential model. The mineralisation rate constant (0.057/week) was not significantly different between Control and PPS treatments or across soil types, when the amounts of initial mineral N applied in PPS treatments were excluded. Potentially mineralisable N (No) was significantly increased by the application of Wet PPS, and increased with increasing rate of application. Application of Wet PPS significantly increased the total amount of inorganic N leached compared with the Control treatments. Mineral N applied in Wet PPS contributed as much to the total mineral N status of the soil as did that which mineralised over time from organic N. Rates of CO2 evolution during 30 weeks of aerobic leached incubation indicated that the Stockpiled PPS was more stabilised (19-28% of applied organic C mineralised) than the Wet PPS (35-58% of applied organic C mineralised), due to the higher lignin content of the former. Net nitrate-N produced following 12 weeks of aerobic non-leached incubation was highly correlated with net nitrate-N leached during 12 weeks of aerobic incubation (R^2 = 0.96), although it was <60% of the latter in both sandy and clay soils. Anaerobically mineralisable N determined by waterlogged incubation of laboratory PPS-amended soil samples increased with increasing application rate of Wet PPS. Anaerobically mineralisable N from field-moist soil was well correlated with net N mineralised during 30 weeks of aerobic leached incubation (R^2 = 0.90, sandy soil; R^2 = 0.93, clay soil). In the clay soil, the amount of mineral N produced from all the laboratory incubations was significantly correlated with field-measured nitrate-N in the soil profile (0-1.5 m depth) after 9 months of weed-free fallow following PPS application. In contrast, only anaerobic mineralisable N was significantly correlated with field nitrate-N in the sandy soil. Anaerobic incubation would, therefore, be suitable as a rapid practical test to estimate potentially mineralisable N following applications of different PPS materials in the field.
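The first-order single exponential model named above has the standard form

$$N_{\mathrm{min}}(t) = N_0\left(1 - e^{-kt}\right)$$

where $N_{\mathrm{min}}(t)$ is cumulative net N mineralised at time $t$, $N_0$ is potentially mineralisable N, and $k$ is the mineralisation rate constant (0.057/week in this study, invariant across treatments and soil types).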
Abstract:
Assessing the sustainability of crop and soil management practices in wheat-based rotations requires a well-tested model with the demonstrated ability to sensibly predict crop productivity and changes in the soil resource. The Agricultural Production Systems Simulator (APSIM) suite of models was parameterised and subsequently used to predict biomass production, yield, crop water and nitrogen (N) use, as well as long-term soil water and organic matter dynamics, in wheat/chickpea systems at Tel Hadya, north-western Syria. The model satisfactorily simulated the productivity and the water and N use of wheat and chickpea crops grown under different N and/or water supply levels in the 1998-99 and 1999-2000 experimental seasons. Analysis of soil-water dynamics showed that the 2-stage soil evaporation model in APSIM's cascading water-balance module did not sufficiently explain the actual soil drying following crop harvest under conditions where unused water remained in the soil profile. This might have been related to evaporation from soil cracks in the montmorillonitic clay soil, a process not explicitly simulated by APSIM. Soil-water dynamics in wheat-fallow and wheat-chickpea rotations (1987-98) were nevertheless well simulated when the soil water content in the 0-0.45 m soil depth was set to 'air dry' at the end of the growing season each year. The model satisfactorily simulated the amounts of NO3-N in the soil, whereas it underestimated the amounts of NH4-N. Ammonium fixation might be part of the soil mineral-N dynamics at the study site because montmorillonite is the major clay mineral; this process is not simulated by APSIM's nitrogen module. APSIM was capable of predicting long-term trends (1985-98) in soil organic matter in wheat-fallow and wheat-chickpea rotations at Tel Hadya as reported in the literature. Overall, the results showed that the model is generic and mature enough to be extended to this set of environmental conditions and can therefore be applied to assess the sustainability of wheat-chickpea rotations at Tel Hadya.
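For context, 2-stage soil evaporation models of the Ritchie type used in cascading water balances work as follows: stage 1 evaporation proceeds at the potential rate until cumulative evaporation reaches a threshold (the parameter U in APSIM's SoilWat convention), after which stage 2 evaporation is supply-limited and accumulates with the square root of time:

$$E_{s,2} = \mathrm{CONA} \times \sqrt{t_2}$$

where $t_2$ is days since stage 2 began and CONA is the stage 2 coefficient (mm/day^0.5). A single square-root-of-time term has no pathway for water loss through soil cracks, consistent with the abstract's note that crack evaporation is not explicitly simulated.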
Abstract:
Prescribed fire is one of the most widely used management tools for reducing fuel loads in managed forests. However, the long-term effects of repeated prescribed fires on soil carbon (C) and nitrogen (N) pools are poorly understood. This study aimed to investigate how different fire frequency regimes influence C and N pools in the surface soils (0–10 cm). A prescribed fire field experiment established in 1972 in a wet sclerophyll forest in southeast Queensland was used in this study. The fire frequency regimes included long unburnt (NB), burnt every 2 years (2yrB) and burnt every 4 years (4yrB), with four replications. Compared with the NB treatment, the 2yrB treatment lowered soil total C by 44%, total N by 54%, HCl-hydrolysable C and N by 48% and 59%, KMnO4-oxidizable C by 81%, microbial biomass C and N by 42% and 33%, cumulative CO2–C by 28%, NaOCl-non-oxidizable C and N by 41% and 51%, and charcoal-C by 17%, respectively. The 4yrB and NB treatments showed no significant differences for these soil C and N pools. All labile, biologically active, recalcitrant and total soil C and N pools were positively correlated with each other and with soil moisture content, but negatively correlated with soil pH. The C:N ratios of different C and N pools were greater in the burned treatments than in the NB treatment. This study highlights that prescribed burning at a four-year interval is a more sustainable management practice for this subtropical forest ecosystem.
Abstract:
With the aim of increasing peanut production in Australia, the Australian peanut industry has recently considered growing peanuts in rotation with maize at Katherine in the Northern Territory—a location with a semi-arid tropical climate and surplus irrigation capacity. We used the well-validated APSIM model to examine potential agronomic benefits and long-term risks of this strategy under the current and warmer climates of the new region. Yield of the two crops, irrigation requirement, total soil organic carbon (SOC), nitrogen (N) losses and greenhouse gas (GHG) emissions were simulated. Sixteen climate stressors were used; these were generated using the global climate models ECHAM5, GFDL2.1, GFDL2.0 and MRIGCM232 with a median sensitivity under two emissions scenarios from the Special Report on Emissions Scenarios over the 2030 and 2050 timeframes, plus the current climate (baseline) for Katherine. Effects were compared at three levels of irrigation and three levels of N fertiliser applied to maize grown in rotations of wet-season peanut and dry-season maize (WPDM), and wet-season maize and dry-season peanut (WMDP). The climate stressors projected average temperature increases of 1°C to 2.8°C in the dry (baseline 24.4°C) and wet (baseline 29.5°C) seasons for the 2030 and 2050 timeframes, respectively. Increased temperature caused a reduction in yield of both crops in both rotations. However, the overall yield advantage of WPDM increased from 41% to up to 53% compared with the industry-preferred sequence of WMDP under the worst climate projection. Increased temperature increased the irrigation requirement by up to 11% in WPDM, but caused a smaller reduction in total SOC accumulation and smaller increases in N losses and GHG emissions compared with WMDP. We conclude that although increased temperature will reduce productivity and total SOC accumulation, and increase N losses and GHG emissions in Katherine or similar northern Australian environments, the WPDM sequence should be preferred over the industry-preferred sequence because of its overall yield and sustainability advantages in warmer climates. Any constraints on irrigation water resulting from climate change could, however, limit these advantages.
Abstract:
Land-use change can have a major influence on soil organic carbon (SOC) and above-ground C pools. We assessed the effects of a change from native vegetation to introduced Pinus species plantations on C pools using eight paired sites. At each site we determined the impacts on 0–50 cm below-ground (SOC, charcoal C, organic matter C, particulate organic C, humic organic C, resistant organic C) and above-ground (litter, coarse woody debris, standing trees and woody understorey plants) C pools. In an analysis across the different study sites there was no significant difference (P > 0.05) in SOC or above-ground tree C stocks between paired native vegetation and pine plantations, although significant differences did exist at specific sites. SOC (calculated on an equivalent soil mass basis) was higher in the pine plantations at two sites, higher in the native vegetation at two sites and did not differ at the other four sites. The site-to-site variation in SOC across the landscape was far greater than the variation observed with a change from native vegetation to introduced Pinus plantation. Differences between sites were not explained by soil type, although tree basal area was positively correlated with 0–50 cm SOC. Indeed, in the native vegetation there was a significant linear relationship between above-ground biomass and SOC that explained 88.8% of the variation in the data. Fine litter C (0–25 mm diameter) tended to be higher in the pine forest than in the adjacent native vegetation and was significantly higher in the pine forest at five of the eight paired sites. Total litter C (0–100 mm diameter) increased significantly with plantation age (R2 = 0.64). Carbon stored in understorey woody plants (2.5–10 cm DBH) was higher in the native vegetation than in the adjacent pine forest. Total site C varied greatly across the study area, from 58.8 Mg ha−1 at a native heathland site to 497.8 Mg ha−1 at a native eucalypt forest site. Our findings suggest that the effects of a change from native vegetation to introduced Pinus sp. forest are highly site-specific and may be positive, negative, or neutral for various C pools, depending on local site characteristics (e.g. plantation age and type of native vegetation).
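As background to the equivalent-soil-mass comparison, a fixed-depth SOC stock is commonly computed as

$$\mathrm{SOC\ stock\ (Mg\ ha^{-1})} = \mathrm{C\ concentration\ (\%)} \times \rho_b\ (\mathrm{g\ cm^{-3}}) \times \mathrm{depth\ (cm)}$$

and the equivalent-mass approach then adjusts the depth term so that the same mass of mineral soil is compared at each paired site. This is the standard formulation, assumed here for illustration; it removes the bias that land-use-driven differences in bulk density $\rho_b$ would otherwise introduce into fixed-depth comparisons.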
Abstract:
Two field experiments were carried out in Taveuni, Fiji, to study the effects of mucuna (Mucuna pruriens) and grass fallow systems at 6- and 12-month durations on changes in soil properties (Experiment 1) and taro yields (Experiment 2). Biomass accumulation of the mucuna fallow crop was significantly higher (P < 0.05) than that of the grass fallow crop at both 6- and 12-month durations. The longer fallow duration resulted in higher (P < 0.05) total soil organic carbon, total soil nitrogen and earthworm numbers regardless of fallow type. Weed suppression in taro grown under mucuna was significantly greater (P < 0.05) than under natural grass fallow. Taro grown under mucuna fallow significantly outyielded taro grown under grass fallow (11.8 vs. 8.8 t ha-1). Also, the gross margin of taro grown under mucuna fallow was 52% higher than that of taro grown under grass fallow.
Abstract:
Forty-four study sites were established in remnant woodland in the Burdekin River catchment in tropical north-east Queensland, Australia, to assess recent (decadal) vegetation change. The aim of this study was to further evaluate whether wide-scale vegetation 'thickening' (proliferation of woody plants in formerly more open woodlands) had occurred during the last century, coinciding with significant changes in land management. Soil samples from several depth intervals were size-separated into different soil organic carbon (SOC) fractions, which differed from one another in chemical composition and turnover times. Tropical (C4) grasses dominate in the Burdekin catchment, and thus δ13C analyses of SOC fractions with different turnover times can be used to assess whether the relative proportion of trees (C3) and grasses (C4) had changed over time. However, a method was required to permit standardized assessment of the δ13C data for the individual sites within the 13 Mha catchment, which varied in soil and vegetation characteristics. Thus, an index was developed using data from three detailed study sites and the global literature to standardize individual isotopic data from different soil depths and SOC fractions to reflect only the changed proportion of trees (C3) to grasses (C4) over decadal timescales. When applied to the 44 individual sites distributed throughout the Burdekin catchment, 64% of the sites were shown to have experienced decadal vegetation thickening, while 29% had remained stable and the remaining 7% had thinned. The development of this index thus enabled regional-scale assessment and comparison of decadal vegetation patterns without having to rely on prior knowledge of vegetation changes or aerial photography.
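The calculation underlying such assessments is typically a two-end-member isotope mixing model; the end-member values below are typical literature figures, not site-specific values from this study:

$$f_{C3} = \frac{\delta^{13}C_{\mathrm{SOC}} - \delta^{13}C_{C4}}{\delta^{13}C_{C3} - \delta^{13}C_{C4}}$$

where $f_{C3}$ is the fraction of SOC derived from trees (C3 plants, typically near −27‰) and $1 - f_{C3}$ the fraction from tropical grasses (C4 plants, typically near −13‰). A higher $f_{C3}$ in fast-turnover SOC fractions than in slow-turnover fractions indicates recent thickening.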
Abstract:
Two species of root-lesion nematode (predominantly Pratylenchus thornei but also P. neglectus) are widespread pathogens of wheat and other crops in Australia's northern grain belt, a subtropical region with deep, fertile clay soils and a summer-dominant rainfall pattern. Losses in grain yield from P. thornei can be as high as 70% for intolerant wheat cultivars. This review focuses on research which has led to the development of effective integrated management programs for these nematodes. It highlights the importance of correct identification in managing Pratylenchus species, reviews the plant breeding work done in developing tolerant and resistant cultivars, outlines the methods used to screen for tolerance and resistance, and discusses how planned crop sequencing with tolerant and partially resistant wheat cultivars, together with crops such as sorghum, sunflower, millets and canaryseed, can be used to reduce nematode populations and limit crop damage. The declining levels of soil organic matter in cropped soils are also discussed with reference to their effect on soil health and biological suppression of root-lesion nematodes.
Abstract:
In Australia, communities are concerned about atrazine being detected in drinking water supplies. It is important to understand the mechanisms by which atrazine is transported from paddocks to waterways if we are to reduce movement of agricultural chemicals from the site of application. Two paddocks cropped with grain sorghum on a Black Vertosol were monitored for atrazine, potassium chloride (KCl) extractable atrazine, desethylatrazine (DEA), and desisopropylatrazine (DIA) at 4 soil depths (0-0.05, 0.05-0.10, 0.10-0.20, and 0.20-0.30 m) and in runoff water and runoff sediment. Atrazine + DEA + DIA (total atrazine) had a half-life in soil of 16-20 days, a more rapid dissipation than in many earlier reports. Atrazine extracted in dilute potassium chloride, considered available for weed control, was initially 34% of the total and had a half-life of 15-20 days until day 30, after which it dissipated rapidly with a half-life of 6 days. We conclude that, in this region, atrazine may not pose a risk for groundwater contamination, as only 0.5% of applied atrazine moved deeper than 0.20 m into the soil, where it dissipated rapidly. In runoff (including suspended sediment), atrazine concentrations were greatest during the first runoff event, 57 days after application (85 μg/L), and declined with time. After 160 days, the total atrazine lost in runoff was 0.4% of the initial application. The total atrazine concentration in runoff was strongly related to the total concentration in soil, as expected. Even after 98% of the KCl-extractable atrazine had dissipated (and no longer provided weed control), runoff concentrations still exceeded the human health guideline value of 40 μg/L. For total atrazine in soil (0-0.05 m), the coefficient of soil sorption (Kd) ranged from 1.9 to 28.4 mL/g and the organic carbon-normalised sorption coefficient (KOC) from 100 to 2184 mL/g, increasing with time of contact with the soil and rapid dissipation of the more soluble, available phase. Partition coefficients in runoff for total atrazine were initially 3, increasing to 32 and 51 with time; values for DEA were half these. To minimise atrazine losses, cultural practices that maximise rain infiltration (and thereby minimise runoff) and minimise concentrations in the soil surface should be adopted.
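The dissipation and sorption quantities reported above follow standard definitions:

$$C(t) = C_0\,e^{-kt}, \qquad t_{1/2} = \frac{\ln 2}{k}, \qquad K_d = \frac{C_{\mathrm{sorbed}}}{C_{\mathrm{solution}}}, \qquad K_{OC} = \frac{K_d \times 100}{\%\,\mathrm{OC}}$$

As an illustrative check (assuming the extreme $K_d$ and $K_{OC}$ values come from the same samples), $K_d = 28.4$ mL/g with $K_{OC} = 2184$ mL/g implies roughly 1.3% soil organic carbon.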
Abstract:
An experiment using herds of ~20 cows (farmlets) assessed the effects of high stocking rates on production and profitability of feeding systems based on dryland and irrigated perennial ryegrass-based pastures in a Mediterranean environment in South Australia over 4 years. A target level of milk production of 7000 L/cow.year was set, based on predicted intakes of 2.7 t DM/cow.year as concentrates, pasture intakes from 1.5 to 2.7 t/cow.year and purchased fodder. In years 1 and 2, up to 1.5 t DM/cow.year of purchased fodder was used, and in years 3 and 4 the amounts were increased if necessary to enable levels of milk production per cow to be maintained at target levels. Cows in dryland farmlets calved in March to May inclusive and were stocked at 2.5, 2.9, 3.3, 3.6 and 4.1 cows/ha, while those in irrigated farmlets calved in August to October inclusive and were stocked at 4.1, 5.2, 6.3 and 7.4 cows/ha. In the first 2 years, when inputs of purchased fodder were limited, milk production per cow was reduced at higher stocking rates (P < 0.01), but in years 3 and 4 there were no differences. Mean production was 7149 kg/cow.year in years 1 and 2, and 8162 kg/cow.year in years 3 and 4. Production per hectare was very closely related to stocking rate in all years (P < 0.01), increasing from 18 to 34 t milk/ha.year for dryland farmlets (1300 to 2200 kg milk solids/ha) and from 30 to 60 t milk/ha.year for irrigated farmlets (2200 to 4100 kg milk solids/ha). Almost all of these increases were attributed to the increases in grain and purchased fodder inputs associated with the increases in stocking rate. Net pasture accumulation rates and pasture harvest were generally not altered by stocking rate, though as stocking rate increased there was a shift towards more of the pasture being grazed and less conserved in both dryland and irrigated farmlets. Total pasture harvest averaged ~8 and 14 t DM/ha.year for dryland and irrigated pastures, respectively. An exception was at the highest stocking rate under irrigation, where pugging during winter was associated with a 14% reduction in annual pasture growth. There were several indications that these high stocking rates may not be sustainable without substantial changes in management practice. There were large and positive nutrient balances and associated increases in soil mineral content (P < 0.01), especially for phosphorus and nitrate nitrogen, with both stocking rate and succeeding years. Levels under irrigation were considerably higher (up to 90 and 240 mg/kg of soil for nitrate nitrogen and phosphorus, respectively) than under dryland pastures (60 and 140 mg/kg, respectively). Soil organic carbon levels did not change with stocking rate, indicating a high level of utilisation of forage grown. Weed ingress was also high (up to 22% DM) in all treatments, and especially in heavily stocked irrigated pastures during winter. It was concluded that the higher stocking rates used exceeded those that are feasible for Mediterranean pastures in this environment, and upper levels of stocking are suggested to be 2.5 cows/ha for dryland pastures and 5.2 cows/ha for irrigated pastures. To sustain these suggested stocking rates will require further development of management practices to avoid large increases in soil minerals and weed invasion of pastures.
Abstract:
We investigated aspects of the reproductive ecology of Ochna serrulata (Hochst.) Walp., an invasive plant in eastern Australia. O. serrulata drupes were similar in size to fleshy fruits of other local invasive plants, but showed some distinct differences in quality, with a very high pulp lipid content (32.8% of dry weight) and little sugar and water. Seeds were dispersed by figbirds, Sphecotheres viridis Vieillot, a locally abundant frugivore, and comprised between 10 and 50% of all non-Ficus spp. fruit consumed during October and November. The rate of removal of O. serrulata drupes was greater in bushland than suburban habitats, indicating that control in bushland habitats should be a priority, but also that suburban habitats are likely to act as significant seed sources for reinvasion of bushland. Germination occurred under all seed-processing treatments (with and without pulp, and figbird gut passage), suggesting that although frugivores are important for dispersal, they are not essential for germination. Recruitment of buried and surface-sown seed differed between greenhouse and field experiments, with minimal recruitment of surface-sown seed in the field. Seed persistence was low, particularly under field conditions, with 0.75% seed viability after 6 months and 0% at 12 months. This provides an opportunity to target control efforts in south-eastern Queensland in spring, before fruit set, when few viable seeds are predicted to remain in the soil.
Abstract:
The project uses participatory methods to engage primary producers and advisers in central Queensland, southern Queensland, and north-east New South Wales in on-farm trials and demonstrations to adapt mixed farming systems to changed climate conditions. The focus is adaptation to climate change, but the project will also support abatement of greenhouse gas emissions by building soil carbon and better managing soil nitrogen and soil organic carbon. Data will be collected and integrated with data from Round 1 of the Climate Change Research Program to extend industry understanding beyond a general awareness of 'climate change'. Nitrous oxide and soil carbon data will help farmers and advisers understand the implications of climate change and develop adaptation strategies for a more sustainable, climate-sensitive future.
Abstract:
Dairy farms located in the subtropical cereal belt of Australia rely on winter and summer cereal crops, rather than pastures, for their forage base. Crops are mostly established in tilled seedbeds and the system is vulnerable to fertility decline and water erosion, particularly over summer fallows. Field studies were conducted over 5 years on contrasting soil types, a Vertosol and a Sodosol, in the 650-mm annual-rainfall zone to evaluate the benefits of a modified cropping program on forage productivity and the soil-resource base. Growing forage sorghum as a double-crop with oats increased total mean annual production over that of winter sole-crop systems by 40% and 100% on the Vertosol and Sodosol sites, respectively. However, mean annual winter crop yield was halved and overall forage quality was lower. Ninety per cent of the variation in winter crop yield was attributable to fallow and in-crop rainfall. Replacing forage sorghum with the annual legume lablab reduced fertiliser nitrogen (N) requirements and increased forage N concentration, but reduced overall annual yield. Compared with sole-cropped oats, double-cropping reduced the risk of erosion by extending the duration of soil water deficits and increasing the time ground was under plant cover. When grown as a sole crop, well-fertilised forage sorghum achieved a mean annual cumulative yield of 9.64 and 6.05 t DM/ha on the Vertosol and Sodosol, respectively, about twice that of sole-cropped oats. Forage sorghum established using zero-tillage practices and fertilised at 175 kg N/ha.crop achieved a significantly higher yield and forage N concentration than did the industry-standard forage sorghum (conventional tillage and 55 kg N/ha.crop) on the Vertosol but not on the Sodosol. On the Vertosol, mean annual yield increased from 5.65 to 9.64 t DM/ha (33 kg DM/kg N fertiliser applied above the base rate); the difference in the response between the two sites was attributed to soil type and fertiliser history. Changing both tillage practices and N-fertiliser rate had no effect on fallow water-storage efficiency but did improve fallow ground cover. When forage sorghum, grown as a sole crop, was replaced with lablab in 3 of the 5 years, overall forage N concentration increased significantly, and on the Vertosol, yield and soil nitrate-N reserves also increased significantly relative to industry-standard sorghum. All forage systems maintained or increased the concentration of soil nitrate-N (0-1.2-m soil layer) over the course of the study. Relative to sole-crop oats, alternative forage systems were generally beneficial to the concentration of surface-soil (0-0.1 m) organic carbon, and systems that included sorghum showed the most promise for increasing soil organic carbon concentration. We conclude that an emphasis on double- or summer sole-cropping rather than winter sole-cropping will advantage both farm productivity and the soil-resource base.
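The quoted fertiliser-response efficiency follows directly from the abstract's own figures:

$$\frac{(9.64 - 5.65) \times 1000\ \mathrm{kg\ DM/ha}}{(175 - 55)\ \mathrm{kg\ N/ha}} = \frac{3990\ \mathrm{kg\ DM}}{120\ \mathrm{kg\ N}} \approx 33\ \mathrm{kg\ DM\ per\ kg\ N}$$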