959 results for Soil nutrient
Abstract:
Irrigation is known to stimulate soil microbial carbon and nitrogen turnover and potentially the emissions of nitrous oxide (N2O) and carbon dioxide (CO2). We conducted a study to evaluate the effect of three different irrigation intensities on soil N2O and CO2 fluxes and to determine if irrigation management can be used to mitigate N2O emissions from irrigated cotton on black vertisols in South-Eastern Queensland, Australia. Fluxes were measured over the entire 2009/2010 cotton growing season with a fully automated chamber system that measured emissions on a sub-daily basis. Irrigation intensity had a significant effect on CO2 emissions. More frequent irrigation stimulated soil respiration, and seasonal CO2 fluxes ranged from 2.7 to 4.1 Mg-C ha−1 for the treatments with the lowest and highest irrigation frequency, respectively. N2O emissions were episodic, with the highest emissions occurring when heavy rainfall or irrigation coincided with elevated soil mineral N levels; seasonal emissions ranged from 0.80 to 1.07 kg N2O-N ha−1 for the different treatments. Emission factors (EF = proportion of N fertilizer emitted as N2O) over the cotton cropping season, uncorrected for background emissions, ranged from 0.40 to 0.53% of total N applied for the different treatments. There was no significant effect of the different irrigation treatments on soil N2O fluxes, because in all treatments the highest emissions followed heavy rainfall from a series of summer thunderstorms, which overrode the effect of the irrigation treatment. However, higher irrigation intensity increased the cotton yield and therefore reduced the N2O intensity (N2O emission per unit lint yield) of this cropping system. Our data suggest that there is only limited scope to reduce absolute N2O emissions by different irrigation intensities in irrigated cotton systems with summer-dominated rainfall.
However, the significant impact of the irrigation treatments on the N2O intensity clearly shows that irrigation can easily be used to optimize the N2O intensity of such a system.
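The two metrics compared in the abstract above, the emission factor (EF, the share of applied fertilizer N lost as N2O-N, uncorrected for background) and the N2O intensity (emission per unit lint yield), can be sketched as follows. The fertilizer rate of 200 kg N/ha is an assumption for illustration (it is consistent with the reported EF and emission ranges but is not stated in the abstract).

```python
# Sketch of the abstract's two metrics. The 200 kg N/ha rate is assumed,
# not stated in the abstract; values are illustrative only.

def emission_factor(seasonal_n2o_n_kg_ha: float, n_applied_kg_ha: float) -> float:
    """EF (%) = seasonal N2O-N emitted / fertilizer N applied * 100."""
    return seasonal_n2o_n_kg_ha / n_applied_kg_ha * 100.0

def n2o_intensity(seasonal_n2o_n_kg_ha: float, lint_yield_t_ha: float) -> float:
    """N2O intensity: kg N2O-N emitted per tonne of lint produced."""
    return seasonal_n2o_n_kg_ha / lint_yield_t_ha

# With the abstract's lowest seasonal emission and an assumed 200 kg N/ha:
ef = emission_factor(0.80, 200.0)
print(f"EF = {ef:.2f} %")  # prints "EF = 0.40 %"
```

Note that a higher-yielding treatment lowers `n2o_intensity` even when the absolute emission is unchanged, which is the abstract's point about irrigation intensity.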
Abstract:
Rainfall simulation experiments were carried out to measure runoff and soil water fluxes of suspended solids, total nitrogen, total phosphorus, dissolved organic carbon and total iron from sites in Pinus plantations on the coastal lowlands of south-eastern Queensland subjected to various operations (treatments). The operations investigated were cultivated and nil-cultivated site preparation, fertilised site preparation, clearfall harvesting and prescribed burning; these treatments were compared with an 8-y-old established plantation. Flow-weighted mean concentrations of total nitrogen and total phosphorus in surface runoff from the cultivated and nil-cultivated site-preparation, clearfall harvest, prescribed burning and 8-y-old established plantation treatments were very similar. However, both the soil water and the runoff from the fertilised site preparation treatment contained more nitrogen (N) and phosphorus (P) than the other treatments - with 3.10 mg N L-1 and 4.32 mg P L-1 (4 and 20 times more) in the runoff. Dissolved organic carbon concentrations in runoff from the nil-cultivated site-preparation and prescribed burn treatments were elevated. Iron concentrations were highest in runoff from the nil-cultivated site-preparation and 8-y-old established plantation treatments. Concentrations of suspended solids in runoff were higher from cultivated site preparation and prescribed burn treatments, and reflect the great disturbance of surface soil at these sites. The concentrations of all analytes were highest in initial runoff from plots, and generally decreased with time. Total nitrogen (mean 7.28, range 0.11-13.27 mg L-1) and total phosphorus (mean 11.60, range 0.06-83.99 mg L-1) concentrations in soil water were between 2 and 10 times greater than in surface runoff, which highlights the potential for nutrient fluxes in interflow (i.e. in the soil above the water table) through the general plantation area. 
Implications in regard to forest management are discussed, along with results of larger catchment-scale studies.
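The flow-weighted mean concentrations compared across treatments above are total analyte load divided by total flow, so samples taken during high flow count for more than low-flow samples. A minimal sketch with made-up sample values (not the study's data):

```python
# Flow-weighted mean concentration: FWMC = sum(C_i * Q_i) / sum(Q_i).
# Concentrations (mg/L) and flow volumes (L) below are illustrative.

def flow_weighted_mean(concs_mg_l, flows_l):
    """Total load divided by total flow across the sampled intervals."""
    load = sum(c * q for c, q in zip(concs_mg_l, flows_l))
    return load / sum(flows_l)

# As in the abstract, concentrations are highest in initial runoff and
# decline with time; the FWMC weights each sample by its flow volume.
print(flow_weighted_mean([10.0, 4.0, 2.0], [50.0, 100.0, 150.0]))  # prints 4.0
```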
Abstract:
Nutrient mass balances have been used to assess a variety of land resource scenarios at various scales. They are widely used as a simple basis for policy, planning, and regulatory decisions, but it is not clear how accurately they reflect reality. This study provides a critique of broad-scale nutrient mass balances, with particular application to the use of beef lot-feeding manure as fertiliser in Queensland. Mass balances completed at the district and farm scale were found to misrepresent actual manure management behaviour and, potentially, the risk of nutrient contamination of water resources. The difficulties of handling stockpiled manure and concerns about soil compaction mean that manure is spread thickly over a few paddocks at a time, not evenly across a whole farm. Consequently, higher nutrient loads were applied to a single paddock less frequently than annually. This resulted in years with excess nitrogen, phosphorus, and potassium remaining in the soil profile. This conclusion was supported by evidence of significant nutrient movement in several of the soil profiles studied. Spreading manure is profitable, but maximum returns can be associated with increased risk of nutrient leaching relative to conventional inorganic fertiliser practices. Bio-economic simulations found this increased risk where manure was applied to supply crop nitrogen requirements (the practice of the case study farms, 200-5000 head lot-feeders). Thus, the use of broad-scale mass balances can be misleading because paddock management is spatially heterogeneous, and this leads to increased local potential for nutrient loss. In response to the effect of spatial heterogeneity, policy makers who intend to use mass balance techniques to estimate the potential for nutrient contamination should apply these techniques conservatively.
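The scale effect criticised above can be illustrated with a toy nitrogen balance: the same farm-wide manure application implies a modest surplus when averaged over the whole farm, but a much larger surplus on the single paddock that actually receives it. All numbers below are illustrative, not from the study.

```python
# Toy N mass balance: per-hectare surplus = applied N / area - crop removal.
# Illustrative values only; not the study's data.

def n_surplus_kg_ha(applied_kg: float, crop_removal_kg_ha: float, area_ha: float) -> float:
    """Nitrogen surplus per hectare over the stated area."""
    return applied_kg / area_ha - crop_removal_kg_ha

applied_n = 20_000.0  # kg N in one season's manure (assumed)
removal = 150.0       # kg N/ha removed in harvested crop (assumed)

farm_scale = n_surplus_kg_ha(applied_n, removal, area_ha=100.0)   # 50.0 kg/ha if spread farm-wide
paddock_scale = n_surplus_kg_ha(applied_n, removal, area_ha=20.0) # 850.0 kg/ha on one paddock
print(farm_scale, paddock_scale)
```

The farm-scale figure looks benign while the paddock-scale figure signals a real leaching risk, which is the abstract's argument for applying broad-scale balances conservatively.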
Abstract:
Single or multiple factors implicated in subsoil constraints including salinity, sodicity, and phytotoxic concentrations of chloride (Cl) are present in many Vertosols including those occurring in Queensland, Australia. The variable distribution and the complex interactions that exist between these constraints limit the agronomic or management options available to manage the soil with these subsoil constraints. The identification of crops and cultivars adapted to these adverse subsoil conditions and/or able to exploit subsoil water may be an option to maintain productivity of these soils. We evaluated relative performance of 5 winter crop species, in terms of grain yields, nutrient concentration, and ability to extract soil water, grown on soils with various levels and combinations of subsoil constraints in 19 field experiments over 2 years. Subsoil constraints were measured by levels of soil Cl, electrical conductivity of the saturation extract (ECse), and exchangeable sodium percentage (ESP). Increasing levels of subsoil constraints significantly decreased maximum depth of water extraction, grain yield, and plant-available water capacity for all the 5 crops and more so for chickpea and durum wheat than bread wheat, barley, or canola. Increasing soil Cl levels had a greater restricting effect on water availability than did ECse and ESP. We developed empirical relationships between soil Cl, ECse, and ESP and crop lower limit (CLL) for estimating subsoil water extraction by 5 winter crops. However, the presence of gypsum influenced the ability to predict CLL based on the levels of ECse. Stronger relationships between apparent unused plant-available water (CLL - LL15; LL15 is lower limit at -1.5 MPa) and soil Cl concentrations than ESP or ECse suggested that the presence of high Cl in these soils most likely inhibited the subsoil water extraction by the crops. 
This was supported by increased sodium (Na) and Cl concentration with a corresponding decrease in calcium (Ca) and potassium (K) in young mature leaf of bread wheat, durum wheat, and chickpea with increasing levels of subsoil constraints. Of the 2 ions, Na and Cl, the latter appears to be more damaging than the former, resulting in plant dieback and reduced grain yields.
Abstract:
Soil water repellency occurs widely in horticultural and agricultural soils when they are very dry. The gradual accumulation and breakdown of surface organic matter over time produces wax-like organic acids, which coat soil particles and prevent uniform entry of water into the soil. Water repellency is usually managed by regular surfactant applications. Surfactants (literally, SURFace ACTive AgeNTS) act by reducing the surface tension of water, allowing it to penetrate and wet the soil more easily and completely. This practice improves water use efficiency (by requiring less water to wet the soil and by capturing rainfall and irrigation more effectively and rapidly). It also reduces nutrient losses through run-off, erosion or leaching. These nutrients have the potential to pollute the surrounding environment and water courses. This project investigated potential improvements to standard practices (product combination and scheduling) for surfactant use to overcome localised dry spots on water repellent soils and thus improve turf quality and water use efficiency. Weather conditions for the duration of the trial prevented the identification of improved practices in terms of combination and scheduling. However, the findings support previous research that the use of soil surfactants decreased the time for water to infiltrate dry soil samples taken from a previously severely hydrophobic site. Data will be continually collected from this trial site on a private contractual basis, with the hope that improvements to standard practices will be observed during the drier winter months when moisture availability is a limiting factor for turfgrass growth and quality.
Abstract:
Lantana camara is a recognized weed of worldwide significance due to its extensive distribution and its impacts on primary industries and nature conservation. However, quantitative data on the impact of the weed on soil ecosystem properties are scarce, especially in SE Australia, despite the pervasive presence of the weed along its coastal and inland regions. Consequently, mineral soils for physicochemical analyses were collected beneath and away from L. camara infestations in four sites west of Brisbane, SE Australia. These sites (a hoop pine plantation, a cattle farm, and two eucalyptus forests with occasional grazing and a fire regime, respectively) vary in landscape and land-use types. A significant site effect was observed more frequently than an effect of invasion status. Nonetheless, after controlling for site differences, ~50% of the 23 soil traits examined differed significantly between infested and non-infested soils. Moisture, pH, Ca, total and organic C, and total N (but not exchangeable N in the form of NO3-) were significantly elevated, while sodium, chloride, copper, iron, sulfur, and manganese, many of which can be toxic to plant growth if present at excess levels, were present at lower levels in soils supporting L. camara compared to soils lacking the weed. These results indicate that L. camara can improve soil fertility and influence nutrient cycling, making the substratum ideal for its own growth, which may explain the ability of the weed to outcompete other species, especially native ones.
Abstract:
The impact of three cropping histories (sugarcane, maize and soybean) and two tillage practices (conventional tillage and direct drill) on plant-parasitic and free-living nematodes in the following sugarcane crop was examined in a field trial at Bundaberg. Soybean reduced populations of lesion nematode (Pratylenchus zeae) and root-knot nematode (Meloidogyne javanica) in comparison to previous crops of sugarcane or maize, but increased populations of spiral nematode (Helicotylenchus dihystera) and maintained populations of dagger nematode (Xiphinema elongatum). However, the effect of soybean on P. zeae and M. javanica was no longer apparent 15 weeks after planting sugarcane, while later in the season, populations of these nematodes following soybean were as high as or higher than after maize or sugarcane. Populations of P. zeae were initially reduced by cultivation but, due to strong resurgence, tended to be higher in conventionally tilled than direct drill plots at the end of the plant crop. Even greater tillage effects were observed with M. javanica and X. elongatum, as nematode populations were significantly higher in conventionally tilled than direct drill plots late in the season. Populations of free-living nematodes in the upper 10 cm of soil were initially highest following soybean, but after 15, 35 and 59 weeks were lower than after sugarcane and contained fewer omnivorous and predatory nematodes. Conventional tillage increased populations of free-living nematodes in soil in comparison to direct drill and was also detrimental to omnivorous and predatory nematodes. These results suggest that crop rotation and tillage not only affect plant-parasitic nematodes directly, but also have indirect effects by impacting on natural enemies that regulate nematode populations. More than 2 million nematodes/m² were often present in crop residues on the surface of direct drill plots.
Bacterial-feeding nematodes were predominant in residues early in the decomposition process, but fungal-feeding nematodes predominated after 15 weeks. This indicates that fungi become an increasingly important component of the detritus food web as decomposition proceeds, and that the rate of nutrient cycling decreases with time. Correlations between total numbers of free-living nematodes and mineral N concentrations in crop residues and surface soil suggested that the free-living nematode community may provide an indication of the rate of mineralisation of N from organic matter.
Abstract:
The main outputs anticipated include enhanced knowledge of key water-nutrient dynamics in relation to key soil management techniques, and a suite of improved and practical soil management options for sweet potatoes.
Abstract:
The potential for fertiliser use in the Lockyer Valley's intensive vegetable production to impact on the Moreton Bay Waterways (MBW) is not well defined. Notwithstanding this, nutrient runoff through soil erosion of agricultural lands has been identified as a process that significantly contributes artificial fertiliser to the MBW (SEQ Healthy Waterways Draft Strategy 2006). In order to better understand this issue, the present study undertakes a nutrient mass balance to evaluate nitrogen use efficiency in the intensive horticultural industry of the Lockyer Valley.
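Nitrogen use efficiency, the quantity the mass balance above is designed to evaluate, is commonly expressed as the fraction of applied N removed in harvested product. That definition is an assumption here (the abstract does not state which NUE formulation the study used), and the numbers are illustrative:

```python
# A common partial-balance form of nitrogen use efficiency (NUE).
# The definition and values are assumptions for illustration only.

def nitrogen_use_efficiency(n_removed_kg_ha: float, n_applied_kg_ha: float) -> float:
    """NUE (%) = N removed in harvested product / fertiliser N applied * 100."""
    return n_removed_kg_ha / n_applied_kg_ha * 100.0

print(nitrogen_use_efficiency(60.0, 150.0))  # prints 40.0; the rest is at risk of loss
```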
Abstract:
Radopholus similis is a major constraint to banana production in Australia and growers have relied on nematicides to manage production losses. The use of organic amendments is one method that may reduce the need for nematicides, but there is limited knowledge of the influence of organic amendments on endo-migratory nematodes, such as R. similis. Nine different amendments, namely, mill mud, mill ash, biosolids, municipal waste compost, banana residue, grass hay, legume hay, molasses and calcium silicate, were applied to the three major soil types of the wet tropics region used for banana production. The nutrient content of the amendments was also determined. Banana plants were inoculated with R. similis and grown in the soil-amendment mix for 12 weeks in a glasshouse experiment. Assessments of plant growth, plant-parasitic nematodes and soil nematode community characteristics were made at the termination of the experiment. Significant suppression of plant-parasitic nematodes occurred in soils amended with legume hay, grass hay, banana residue and mill mud relative to untreated soil. These amendments were found to have the highest N and C content. The application of banana residue and mill mud significantly increased shoot dry weight at the termination of the experiment relative to untreated soil. Furthermore, the applications of banana residue, grass hay, mill mud and municipal waste compost increased the potential for suppression of plant-parasitic nematodes through antagonistic activity. The application of amendments that are high in C and N appeared able to induce suppression of plant-parasitic nematodes in bananas by developing a more favourable environment for antagonistic organisms.
Abstract:
Increased sediment and nutrient losses resulting from unsustainable grazing management in the Burdekin River catchment are major threats to water quality in the Great Barrier Reef Lagoon. To test the effects of grazing management on soil and nutrient loss, five 1 ha mini-catchments were established in 1999 under different grazing strategies on a sedimentary landscape near Charters Towers. Reference samples were also collected from watercourses in the Burdekin catchment during major flow events. Soil and nutrient loss were relatively low across all grazing strategies due to a combination of good cover, low slope and low rainfall intensities. Total soil loss varied from 3 to 20 kg ha⁻¹ per event, while losses of N and P ranged from 10 to 1900 g ha⁻¹ and from 1 to 71 g ha⁻¹ per event, respectively. Water quality of runoff was considered moderate across all strategies, with relatively low levels of total suspended sediment (range: 8-1409 mg L⁻¹), total N (range: 101-4000 µg L⁻¹) and total P (range: 14-609 µg L⁻¹). However, treatment differences are likely to emerge with time as the impacts of the different grazing strategies on land condition become more apparent. Samples collected opportunistically from rivers and creeks during flow events displayed significantly higher levels of total suspended sediment (range: 10-6010 mg L⁻¹), total N (range: 650-6350 µg L⁻¹) and total P (range: 50-1500 µg L⁻¹) than those collected at the grazing trial. These differences can largely be attributed to variation in slope, geology and cover between the grazing trial and the different catchments. In particular, watercourses draining hillier, grano-diorite landscapes with low cover had markedly higher sediment and nutrient loads compared to those draining flatter, sedimentary landscapes. These preliminary data suggest that on relatively flat, sedimentary landscapes, extensive cattle grazing is compatible with achieving water quality targets, provided high levels of ground cover are maintained.
In contrast, sediment and nutrient loss under grazing on more erodible land types is cause for serious concern. Long-term empirical research and monitoring will be essential to quantify the impacts of changed land management on water quality in the spatially and temporally variable Burdekin River catchment.
Abstract:
Fire is an important driver of nutrient cycling in savannas. Here, we determined the impact of fire frequency on total and soluble soil nitrogen (N) pools in tropical savanna. The study sites consisted of 1-ha experimental plots near Darwin, Australia, which remained unburnt for at least 14 years or were burnt at 1-, 2- or 5-year intervals over the past 6 years. Soil was analysed from patches underneath tree canopies and in inter-canopy patches at 1, 12, 28, 55 and 152 days after fire. Patch type had a significant effect on all soil N pools, with greater concentrations of total and soluble (nitrate, ammonium, amino acids) N under tree canopies than inter-canopy patches. The time since the last fire had no significant effect on N pools. Fire frequency similarly did not affect total soil N but it did influence soluble soil N. Soil amino acids were most prominent in burnt savanna, ammonium was highest in infrequently burnt (5-year interval) savanna and nitrate was highest in unburnt savanna. We suggest that the main effect of fire on soil N relations occurs indirectly through altered tree-grass dynamics. Previous studies have shown that high fire frequencies reduce tree cover by lowering recruitment and increasing mortality. Our findings suggest that these changes in tree cover could result in a 30% reduction in total soil N and 10-60% reductions in soluble N pools. This finding is consistent with studies from savannas globally, providing further evidence for a general theory of patchiness as a key driver of nutrient cycling in the savanna biome.
Abstract:
In boreal forests, microorganisms have a pivotal role in the nutrient and water supply of trees as well as in litter decomposition and nutrient cycling. This reinforces the link between above-ground and below-ground communities in the context of sustainable productivity of forest ecosystems. In northern boreal forests, the diversity of microbes associated with the trees is high compared to the number of distinct tree species. In this thesis, the aim was to study whether conspecific tree individuals harbour different soil microbes and whether the growth of the trees and the community structure of the associated microbes are connected. The study was performed in a clonal field trial of Norway spruce, which was established in a randomized block design in a clear-cut area. Since out-planting in 1994, the spruce clones showed two-fold growth differences. The fast-growing spruce clones were associated with a more diverse community of ectomycorrhizal fungi than the slow-growing spruce clones. These growth performance groups also differed with respect to other aspects of the associated soil microorganisms: the species composition of ectomycorrhizal fungi, the amount of extraradical fungal mycelium, the structure of the bacterial community associated with the mycelium, and the structure of the microbial community in the organic layer. The communities of fungi colonizing needle litter of the spruce clones in the field did not differ, and the loss of litter mass after two years of decomposition was equal. In vitro, needles of the slow-growing spruce clones were colonized by a more diverse community of endophytic fungi that were shown to be significant needle decomposers. This study showed a relationship between the growth of Norway spruce clones and the community structure of the associated soil microbes. Spatial heterogeneity in the soil microbial community was connected with intraspecific variation of trees. The latter may therefore influence soil biodiversity in monospecific forests.
Abstract:
More than 1200 wheat and 120 barley experiments conducted in Australia to examine yield responses to applied nitrogen (N) fertiliser are contained in a national database of field crops nutrient research (BFDC National Database). The yield responses are accompanied by various pre-plant soil test data to quantify plant-available N and other indicators of soil fertility status or mineralisable N. A web application (BFDC Interrogator), developed to access the database, enables construction of calibrations between relative crop yield ((Y0/Ymax) × 100) and N soil test value. In this paper we report the critical soil test values for 90% RY (CV90) and the associated critical ranges (CR90, defined as the 70% confidence interval around that CV90) derived from analysis of various subsets of these winter cereal experiments. Experimental programs were conducted throughout Australia’s main grain-production regions in different eras, starting from the 1960s in Queensland through to Victoria during 2000s. Improved management practices adopted during the period were reflected in increasing potential yields with research era, increasing from an average Ymax of 2.2 t/ha in Queensland in the 1960s and 1970s, to 3.4 t/ha in South Australia (SA) in the 1980s, to 4.3 t/ha in New South Wales (NSW) in the 1990s, and 4.2 t/ha in Victoria in the 2000s. Various sampling depths (0.1–1.2 m) and methods of quantifying available N (nitrate-N or mineral-N) from pre-planting soil samples were used and provided useful guides to the need for supplementary N. The most regionally consistent relationships were established using nitrate-N (kg/ha) in the top 0.6 m of the soil profile, with regional and seasonal variation in CV90 largely accounted for through impacts on experimental Ymax. The CV90 for nitrate-N within the top 0.6 m of the soil profile for wheat crops increased from 36 to 110 kg nitrate-N/ha as Ymax increased over the range 1 to >5 t/ha. 
Apparent variation in CV90 with seasonal moisture availability was entirely consistent with impacts on experimental Ymax. Further analyses of wheat trials with available grain protein (~45% of all experiments) established that grain yield and not grain N content was the major driver of crop N demand and CV90. Subsets of data explored the impact of crop management practices such as crop rotation or fallow length on both pre-planting profile mineral-N and CV90. Analyses showed that while management practices influenced profile mineral-N at planting and the likelihood and size of yield response to applied N fertiliser, they had no significant impact on CV90. A level of risk is involved with the use of pre-plant testing to determine the need for supplementary N application in all Australian dryland systems. In southern and western regions, where crop performance is based almost entirely on in-crop rainfall, this risk is offset by the management opportunity to split N applications during crop growth in response to changing crop yield potential. In northern cropping systems, where stored soil moisture at sowing is indicative of minimum yield potential, erratic winter rainfall increases uncertainty about actual yield potential as well as reducing the opportunity for effective in-season applications.
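The calibration idea described above (relating relative yield to a pre-plant soil nitrate-N test and reading off the critical value for 90% RY) can be sketched with a simple response curve. The Mitscherlich-type form RY = 100 × (1 − exp(−c·N)) and the coefficient below are assumptions for illustration; the abstract does not specify the BFDC Interrogator's fitting method.

```python
import math

# Relative yield and CV90 under an assumed Mitscherlich-type calibration
# RY = 100 * (1 - exp(-c * N)); the curve form and c are illustrative.

def relative_yield(y0: float, ymax: float) -> float:
    """RY (%) = (unfertilised yield Y0 / maximum yield Ymax) * 100."""
    return y0 / ymax * 100.0

def cv90(c: float) -> float:
    """Soil nitrate-N (kg/ha) where the fitted curve reaches 90% RY.

    Solving 0.90 = 1 - exp(-c * N) gives N = -ln(0.10) / c.
    """
    return -math.log(0.10) / c

print(relative_yield(3.1, 4.2))  # an unfertilised crop at ~74% of potential
print(round(cv90(0.021)))        # prints 110 (kg nitrate-N/ha) for c = 0.021
```

The steepness coefficient `c` plays the role that experimental Ymax plays in the abstract: a flatter response (smaller `c`, higher yield potential) pushes CV90 upward, consistent with the reported 36 to 110 kg nitrate-N/ha range.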
Abstract:
Fusarium wilt, caused by Fusarium oxysporum f. sp. cubense (Foc), is one of the most destructive diseases of banana. One potential method to manage fusarium wilt of banana is by manipulating the nutrient status of the soil. This study was conducted to determine the quality of Foc-suppressive and -conducive soil, and the influence of soil application of silica and manure on the incidence of fusarium wilt of banana. Surveys were conducted in five banana plantations in three provinces in Indonesia: Lampung-Sumatra, West Java and Central Java. Of the five locations, one (Salaman-Central Java) was heavily infected by Foc, another (NTF Lampung-Sumatra) was slightly infected by Foc, while the rest (Sarampad-West Java, Talaga-West Java and GGP Lampung-Sumatra) were healthy banana plantations without Foc infection. Labile carbon analysis showed that the Foc-suppressive soil had greater labile carbon content than the conducive soil. Also, analyses of fluorescein diacetate hydrolysis (FDA) and β-glucosidase showed greater microbial activity in suppressive soil than in conducive soil. Observations of the incidence of necrotic rhizome of the Foc-susceptible 'Ambon Kuning' (AAA) banana cultivar showed that in the suppressive soil taken from Sarampad-West Java, the application of silica and manure helped suppress fusarium wilt disease development. In the conducive soil taken from Salaman-Central Java, silica and manure applications were not able to suppress disease incidence. The results of this study indicated that in suppressive soil, the application of silica can increase plant resistance to Foc infection, while manure application can increase soil microbial activity and suppress Foc development.