145 results for "Soil applied"
Abstract:
Soil testing is the most widely used tool to predict the need for fertiliser phosphorus (P) application to crops. This study examined factors affecting critical soil P concentrations and confidence intervals for wheat and barley grown in Australian soils by interrogating validated data from 1777 wheat and 150 barley field treatment series now held in the BFDC National Database. To narrow confidence intervals associated with estimated critical P concentrations, filters for yield, crop stress, or low pH were applied. Once treatment series with low yield (<1 t/ha), severe crop stress, or pHCaCl2 <4.3 were screened out, critical concentrations were relatively insensitive to wheat yield (>1 t/ha). There was a clear increase in critical P concentration from early trials when full tillage was common compared with those conducted in 1995–2011, which corresponds to a period of rapid shift towards adoption of minimum tillage. For wheat, critical Colwell-P concentrations associated with 90 or 95% of maximum yield varied among Australian Soil Classification (ASC) Orders and Sub-orders: Calcarosol, Chromosol, Kandosol, Sodosol, Tenosol and Vertosol. Soil type, based on ASC Orders and Sub-orders, produced critical Colwell-P concentrations at 90% of maximum relative yield from 15 mg/kg (Grey Vertosol) to 47 mg/kg (Supracalcic Calcarosols), with other soils having values in the range 19–27 mg/kg. Distinctive differences in critical P concentrations were evident among Sub-orders of Calcarosols, Chromosols, Sodosols, Tenosols, and Vertosols, possibly due to differences in soil properties related to P sorption. However, insufficient data were available to develop a relationship between P buffering index (PBI) and critical P concentration. In general, there was no evidence that critical concentrations for barley would be different from those for wheat on the same soils. 
Significant knowledge gaps to fill to improve the relevance and reliability of soil P testing for winter cereals were: lack of data for oats; the paucity of treatment series reflecting current cropping practices, especially minimum tillage; and inadequate metadata on soil texture, pH, growing season rainfall, gravel content, and PBI. The critical concentrations determined illustrate the importance of recent experimental data and of soil type, but also provide examples of interrogation pathways into the BFDC National Database to extract locally relevant critical P concentrations for guiding P fertiliser decision-making in wheat and barley.
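The idea of a critical soil-test value at 90% of maximum relative yield can be illustrated with a small sketch. Assuming a Mitscherlich-type response curve (a common form in soil-test calibration, not necessarily the model fitted in this study), the critical Colwell-P follows by inverting the fitted curve; the coefficients below are hypothetical.

```python
import math

def critical_p(b, c, target_ry=0.90):
    """Invert a Mitscherlich-type response RY = 1 - b*exp(-c*P) to find
    the soil test P giving the target relative yield.
    b, c are hypothetical fitted coefficients, not values from the study."""
    return math.log(b / (1.0 - target_ry)) / c

# Illustrative coefficients chosen so the answer lands in the 19-27 mg/kg
# range quoted in the abstract.
print(round(critical_p(0.9, 0.1), 1))  # -> 22.0 (mg Colwell-P/kg)
```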
Abstract:
Nitrous oxide is the foremost greenhouse gas (GHG) generated by land-applied manures and chemical fertilisers (Australian Government 2013). This research project was part of the National Agricultural Manure Management Program and investigated the potential for sorbers (i.e. specific naturally occurring minerals) to decrease GHG emissions from spent piggery litter (as well as other manures) applied to soils. The sorbers investigated in this research were vermiculite and bentonite. Both are clays with high cation exchange capacities of approximately 100–150 cmol/kg (Faure 1998). The hypothesis tested in this study was that the sorbers bind ammonium in soil solution, thereby suppressing ammonia (NH3) volatilisation and, in doing so, slowing the kinetics of nitrate formation and associated nitrous oxide (N2O) emissions. A series of laboratory, glasshouse and field experiments was conducted to assess the sorbers' effectiveness. The laboratory experiments comprised 64 vessels containing manure and sorber/manure ratios ranging from 1:10 to 1:1 incorporated into a sandy Sodosol via mixing. The glasshouse trial involved 240 pots comprising manure/sorber incubations placed 5 cm below the soil surface, two soil types (sandy Sodosol and Ferrosol) and two different nitrogen (N) application rates (50 kg N/ha and 150 kg N/ha) with a model plant (kikuyu grass). The field trial consisted of 96 plots of 2 m × 2 m on a Ferrosol site with digit grass used as a model plant. Manure/sorber mixtures were applied in trenches (5 cm below the surface) to these plots at increasing sorber levels at an N loading rate of 200 kg/ha. Gas produced in all experiments was plumbed into a purpose-built automated gas analysis (N2O, NH3, CH4, CO2) system. In the laboratory experiments, the sorbers showed strong capacity to decrease NH3 emissions (up to an 80% decrease). Ammonia emissions were close to the detection limit in all treatments in the glasshouse and field trials. 
In all experiments, considerable N2O decreases (>40%) were achieved by the sorbers. As an example, mean N2O emission decreases from the field trial phase of the project are shown in Fig. 1a. The decrease in GHG emissions brought about by the clays did not negatively impact agronomic performance. Both vermiculite and bentonite resulted in a significant increase in dry matter yields in the field trial (Fig. 1b). Continuing work will optimise the sorber technology for improved environmental and agronomic performance across a range of soils (Vertosol and Dermosol, in addition to Ferrosol and Sodosol) and environmental parameters (moisture, temperature, porosity, pH).
Abstract:
As in all high-yielding farming systems, nitrogen (N) is a key component of productivity and profitability, and Australian irrigated cotton growers tend to apply more N than is required for the level of lint yield being achieved. This suggests either over-application of N or inefficient systems limiting the response of cotton to N inputs. To investigate this, four replicated trials were established in commercial fields during the 2014/15 season. The trials aimed to measure the difference in response of irrigated cotton to N application under flood and overhead irrigation systems. The treatments comprised eight upfront rates of applied N, ranging from 0 kg N/ha to a maximum of 410 kg N/ha, with three of the four trials receiving a grower-determined in-crop application of N in the irrigation water. The two flood irrigation systems had lower lint yields from similar levels of N input than one of the overhead irrigated sites; the result from the second overhead site was affected by disease. This paper discusses the response of plant N uptake, lint yield and fertiliser N recovery to N application.
Abstract:
Effects of fire on biogeochemical cycling in terrestrial ecosystems are widely acknowledged, but few studies have focused on the bacterial community under the disturbance of long-term frequent prescribed fire. In this study, three treatments (burning every two years (B2), burning every four years (B4) and no burning (B0)) were applied for 38 years in an Australian wet sclerophyll forest. Results showed that bacterial alpha diversity (i.e. bacterial OTU richness) in the top soil (0-10 cm) was significantly higher in the B2 treatment than in the B0 and B4 treatments. Non-metric multidimensional scaling (NMDS) of the bacterial community showed clear separation of soil bacterial community structure among fire frequency regimes and between depths. Fire frequency did not have a substantial effect on bacterial composition at the phylum level or on bacterial 16S rRNA gene abundance. Soil pH and C:N ratio were the major drivers of bacterial community structure in the most frequent fire treatment (B2), while other factors (EC, DOC, DON, MBC, NH4+, TC and TN) were significant in the less frequent burning and no burning treatments (B4 and B0). This study suggests that more frequent burning had a dramatic impact on bacterial diversity but not on abundance.
Abstract:
The mechanisms and control of hardseededness in the 3 Australian cultivars of the genus Desmanthus were investigated in a series of experiments in which the effects of various seed-softening treatments, particularly boiling water, were measured. Desmanthus seed is predominantly hard; normally only defective seeds are otherwise. As it has only very brief, early embryo dormancy, hardseededness is the only serious barrier to germination. Seed is most readily softened through rupture of the palisade at the lens (strophiole). The lens is of a typically mimosaceous type which is readily ruptured by immersion in boiling water, or less readily by application of pressure to adjacent parts of the testa. Ruptures may consist only of separation of the palisade from underlying tissue, which alone does not confer permeability; mostly they also result in fractures to the palisade that then render seeds irreversibly permeable. The palisade becomes reflective as it separates, which allows the event to be witnessed at the moment of separation if suitable pressure is applied to the testa of an individual seed while it is viewed under magnification. Brief (4–10 seconds) immersion of high-quality seed in boiling water consistently softened a high proportion of seeds without causing serious damage. Extending the duration of immersion led to a progressive increase in the proportion of seed deaths. Neither previous boiling water treatment nor scarification damage to the testa materially affected results of treatment, but immature and small seeds behaved differently, being more vulnerable to damage than mature seed, and less likely to undergo lens rupture. Adaptation of boiling water treatment to farm-scale seed handling was simple and reliable. 
Commercial treatment of seed by an alternative method suitable for greater bulks and consisting of passage through a rice-whitener was checked and found to be successful through a combination of gentle scarification and lens rupture, both attributable to the numerous minor impacts of the process. Percentage emergence of seedlings from soil in the greenhouse closely followed percentage laboratory germination, except when inferior seed grades were included in the comparison, when emergence was poor. Very little seed softened in soil. Already-permeable seed either germinated rapidly or died, while buried hard seed mostly remained hard and viable even more than a year after sowing.
Abstract:
Seed production and soil seed banks of H. contortus were studied in a subset of treatments within an extensive grazing study conducted in H. contortus pasture in southern Queensland between 1990 and 1996. Seed production of H. contortus in autumn ranged from 260 to 1800 seeds/m2, with much of this variation due to differences in rainfall between years. Seed production was generally higher in the silver-leaved ironbark than in the narrow-leaved ironbark land class and was also influenced by a consistent stocking rate × pasture type interaction. Inflorescence density was the main factor contributing to the variable seed production and was related to the rainfall received during February. The number of seeds per inflorescence was unaffected by seasonal rainfall, landscape position, stocking rate or legume oversowing. Seed viability was related to the rainfall received during March. Soil seed banks in spring varied from 130 to 520 seeds/m2 between 1990 and 1995, with generally more seed present in the silver-leaved ironbark than in the narrow-leaved ironbark land class. There were poor relationships between viable seed production and the size of the soil seed bank, and between the size of the soil seed bank and seedling recruitment. This study indicates that H. contortus has the potential to produce relatively large amounts of seed and showed that the seasonal pattern of rainfall plays a major role in achieving this potential.
Abstract:
Continuous cultivation and cereal cropping of southern Queensland soils previously supporting native vegetation have resulted in reduced soil nitrogen supply, and consequently decreased cereal grain yields and low grain protein. To enhance yields and protein concentrations of wheat, management practices involving N fertiliser application, with no-tillage and stubble retention, grain legumes, and legume leys were evaluated from 1987 to 1998 on a fertility-depleted Vertosol at Warra, southern Queensland. The objective of this study was to examine the effect of lucerne in a 2-year lucerne–wheat rotation for its nitrogen and disease-break benefits to subsequent grain yield and protein content of wheat as compared with continuous wheat cropping. Dry matter production and nitrogen yields of lucerne were closely correlated with the total rainfall for October–September as well as March–September rainfall. Each 100 mm of total rainfall resulted in 0.97 t/ha of dry matter and 26 kg/ha of nitrogen yield. For the March–September rainfall, the corresponding values were 1.26 t/ha of dry matter and 36 kg/ha of nitrogen yield. The latter values were 10% lower than those produced by annual medics during a similar period. Compared with wheat–wheat cropping, significant increases in total soil nitrogen were observed only in 1990, 1992 and 1994 but increases in soil mineralisable nitrogen were observed in most years following lucerne. Similarly, pre-plant nitrate nitrogen in the soil profile following lucerne was higher by 74 kg/ha (9–167 kg N/ha) than that of wheat–wheat without N fertiliser in all years except 1996. Consequently, higher wheat grain protein (7 out of 9 seasons) and grain yield (4 out of 9 seasons) were produced compared with continuous wheat. There was significant depression in grain yield in 2 (1993 and 1995) out of 9 seasons attributed to soil moisture depletion and/or low growing season rainfall. 
Consequently, the overall responses in yield were lower than those of 50 kg/ha of fertiliser nitrogen applied to wheat–wheat crops, 2-year medic–wheat or chickpea–wheat rotation, although grain protein concentrations were higher following lucerne. The incidence and severity of the soilborne disease, common root rot of wheat caused by Bipolaris sorokiniana, was generally higher in lucerne–wheat than in continuous wheat with no nitrogen fertiliser applications, since its severity was significantly correlated with plant available water at sowing. No significant incidence of crown rot or root lesion nematode was observed. Thus, productivity, which was mainly due to nitrogen accretion in this experiment, can be maintained where short duration lucerne leys are grown in rotations with wheat.
Abstract:
Attention is directed at land application of piggery effluent (containing urine, faeces, water, and wasted feed) as a potential source of water resource contamination with phosphorus (P). This paper summarises P-related properties of soil from 0-0.05 m depth at 11 piggery effluent application sites, in order to explore the impact that effluent application has had on the potential for run-off transport of P. The sites investigated were situated on Alfisol, Mollisol, Vertisol, and Spodosol soils in areas that received effluent for 1.5-30 years (estimated effluent-P applications of 100-310000 kg P/ha in total). Total (PT), bicarbonate-extractable (PB), and soluble P forms were determined for the soil (0-0.05 m) at paired effluent and no-effluent sites, as well as texture, oxalate-extractable Fe and Al, organic carbon, and pH. All forms of soil P at 0-0.05 m depth increased with effluent application (PB at effluent sites was 1.7-15 times that at no-effluent sites) at 10 of the 11 sites. Increases in PB were strongly related to net P applications (regression analysis of log values for 7 sites with complete data sets: 82.6% of variance accounted for, P < 0.01). Effluent irrigation tended to increase the proportion of soil PT in dilute CaCl2-extractable forms (PTC: effluent average 2.0%; no-effluent average 0.6%). The proportion of PTC in non-molybdate reactive forms (centrifuged supernatant) decreased (no-effluent average, 46.4%; effluent average, 13.7%). Anaerobic lagoon effluent did not reliably acidify soil, since no consistent relationship was observed for pH with effluent application. Soil organic carbon was increased in most of the effluent areas relative to the no-effluent areas. The four effluent areas where organic carbon was reduced had undergone intensive cultivation and cropping. Current effluent management at many of the piggeries failed to maximise the potential for waste P recapture. 
Ten of the case-study effluent application areas have received effluent-P in excess of crop uptake. While this may not represent a significant risk of leaching where sorption retains P, it has increased the risk of transport of P by run-off. Where such sites are close to surface water, run-off P loads should be managed.
Abstract:
Soil nitrogen (N) supply in the Vertosols of southern Queensland, Australia has steadily declined as a result of long-term cereal cropping without N fertiliser application or rotations with legumes. Nitrogen-fixing legumes such as lucerne may enhance soil N supply and therefore could be used in lucerne-wheat rotations. However, lucerne leys in this subtropical environment can create a soil moisture deficit, which may persist for a number of seasons. Therefore, we evaluated the effect of varying the duration of a lucerne ley (for up to 4 years) on soil N increase, N supply to wheat, soil water changes, wheat yields and wheat protein on a fertility-depleted Vertosol in a field experiment between 1989 and 1996 at Warra (26°47'S, 150°53'E), southern Queensland. The experiment consisted of a wheat-wheat rotation, and 8 treatments of lucerne leys starting in 1989 (phase 1) or 1990 (phase 2) for 1, 2, 3 or 4 years duration, followed by wheat cropping. Lucerne DM yield and N yield increased with increasing duration of lucerne leys. Soil N increased over time following 2 years of lucerne but there was no further significant increase after 3 or 4 years of lucerne ley. Soil nitrate concentrations increased significantly with all lucerne leys and moved progressively downward in the soil profile from 1992 to 1995. Soil water, especially at 0.9-1.2 m depth, remained significantly lower for the next 3 years after the termination of the 4-year lucerne ley than under continuous wheat. No significant increase in wheat yields was observed from 1992 to 1995, irrespective of the lucerne ley. However, wheat grain protein concentrations were significantly higher under lucerne-wheat than under wheat-wheat rotations for 3-5 years. The lucerne yield and soil water and nitrate-N concentrations were satisfactorily simulated with the APSIM model. 
Although significant N accretion occurred in the soil following lucerne leys, recharge of the drier soil profile following long-duration lucerne took 3 years in drier seasons. Consequently, 3- and 4-year lucerne-wheat rotations resulted in more variable wheat yields than wheat-wheat rotations in this region. The remaining challenge in using lucerne-wheat rotations is balancing the N accretion benefits with plant-available water deficits, which are most likely to occur in the highly variable rainfall conditions of this region.
Abstract:
Piggery pond sludge (PPS) was applied, as-collected (Wet PPS) and following stockpiling for 12 months (Stockpiled PPS), to a sandy Sodosol and clay Vertosol at sites on the Darling Downs of Queensland. Laboratory measures of N availability were carried out on unamended and PPS-amended soils to investigate their value in estimating supplementary N needs of crops in Australia's northern grains region. Cumulative net N mineralised from the long-term (30 weeks) leached aerobic incubation was described by a first-order single exponential model. The mineralisation rate constant (0.057/week) was not significantly different between Control and PPS treatments or across soil types, when the amounts of initial mineral N applied in PPS treatments were excluded. Potentially mineralisable N (No) was significantly increased by the application of Wet PPS, and increased with increasing rate of application. Application of Wet PPS significantly increased the total amount of inorganic N leached compared with the Control treatments. Mineral N applied in Wet PPS contributed as much to the total mineral N status of the soil as did that which mineralised over time from organic N. Rates of CO2 evolution during 30 weeks of aerobic leached incubation indicated that the Stockpiled PPS was more stabilised (19-28% of applied organic C mineralised) than the Wet PPS (35-58% of applied organic C mineralised), due to higher lignin content in the former. Net nitrate-N produced following 12 weeks of aerobic non-leached incubation was highly correlated with net nitrate-N leached during 12 weeks of aerobic incubation (R^2 = 0.96), although it was <60% of the latter in both sandy and clayey soils. Anaerobically mineralisable N determined by waterlogged incubation of laboratory PPS-amended soil samples increased with increasing application rate of Wet PPS. 
Anaerobically mineralisable N from field-moist soil was well correlated with net N mineralised during 30 weeks of aerobic leached incubation (R^2 = 0.90, sandy soil; R^2 = 0.93, clay soil). In the clay soil, the amount of mineral N produced from all the laboratory incubations was significantly correlated with field-measured nitrate-N in the soil profile (0-1.5 m depth) after 9 months of weed-free fallow following PPS application. In contrast, only anaerobic mineralisable N was significantly correlated with field nitrate-N in the sandy soil. Anaerobic incubation would, therefore, be suitable as a rapid practical test to estimate potentially mineralisable N following applications of different PPS materials in the field.
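The first-order single exponential model referred to above can be written N(t) = No(1 - e^(-kt)). A minimal sketch, using the reported rate constant of 0.057/week but an illustrative No value (the study's actual No values are not given in the abstract):

```python
import math

def n_mineralised(n0_mg_kg, k_per_week, weeks):
    """Cumulative net N mineralised under a first-order single
    exponential model: N(t) = No * (1 - exp(-k * t))."""
    return n0_mg_kg * (1.0 - math.exp(-k_per_week * weeks))

no = 100.0  # hypothetical potentially mineralisable N (mg/kg)
k = 0.057   # rate constant per week, as reported in the abstract
print(round(n_mineralised(no, k, 30), 1))  # N released after the 30-week incubation
```

With these figures roughly 82% of No is released by week 30, which is why the long incubation is needed to estimate No reliably.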
Abstract:
The chemical control of groundnut white grubs, Holotrichia serrata F. and H. reynaudi Blanchard (Coleoptera: Scarabaeidae), was studied in south-central India. Microplot trials demonstrated that chlorpyrifos and imidacloprid seed-dressings were effective against H. serrata at rates as low as 0.6 and 3.5 g a.i. kg-1, respectively, while microplot and on-farm trials showed that 1.2 and 3.5 g a.i. kg-1 of chlorpyrifos and imidacloprid, respectively, were required for H. reynaudi. Chlorpyrifos residue analyses indicated that at 20 days after sowing (d.a.s.), rates up to 5.0 g a.i. kg-1 produced residues in soil and groundnut seedlings markedly below the relevant MRL, and no detectable residues at harvest under the southern Indian rainy-season environment. A farmer survey found that in Andhra Pradesh (AP), insecticides (chlorpyrifos and phorate) were applied for white grub control in 37.5% of farms sampled, while no insecticides were applied for this purpose in Karnataka and Tamil Nadu. The white grub density on farms in AP where insecticide had been applied averaged 0.07 larvae m-2, compared to 1.04 larvae m-2 in the remaining AP farms. In AP, Karnataka and Tamil Nadu, 70%, 42% and 39% of currently untreated groundnut fields, respectively, exceeded the provisional economic threshold. A survey in the Anantapur district of AP found that farmers' target and achieved rates for seed treatment averaged 0.44 and 0.52 g a.i. kg-1, both below optimal rates determined in microplot experiments. These data provide the foundation for an effective and sustainable program of management for groundnut white grubs in south-central India by providing key efficacy data and baseline data on farmer insecticide-use patterns.
Abstract:
The size of the soil microbial biomass carbon (SMBC) has been proposed as a sensitive indicator for measuring the adverse effects of contaminants on the soil microbial community. In this study of Australian agricultural systems, we demonstrated that field variability of SMBC measured using the fumigation-extraction procedure limited its use as a robust ecotoxicological endpoint. The SMBC varied up to 4-fold across control samples collected from a single field site, due to small-scale spatial heterogeneity in the soil physicochemical environment. Power analysis revealed that large numbers of replicates (3-93) were required to identify 20% or 50% decreases in the size of the SMBC of contaminated soil samples relative to their uncontaminated control samples at the 0.05 level of statistical significance. We question the value of the routine measurement of SMBC as an ecotoxicological endpoint at the field scale, and suggest more robust and predictive microbiological indicators.
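The power-analysis point can be illustrated with a textbook normal-approximation sample-size formula for comparing two means; the formula and the coefficient of variation used below are illustrative assumptions, not the study's actual calculation.

```python
import math
from statistics import NormalDist

def replicates_needed(cv, rel_decrease, alpha=0.05, power=0.8):
    """Normal-approximation replicates per group for a two-sample
    comparison of means: n = 2 * ((z_{1-a/2} + z_power) / d)^2,
    where d = relative decrease / coefficient of variation."""
    z = NormalDist().inv_cdf
    d = rel_decrease / cv  # standardised effect size
    n = 2 * ((z(1 - alpha / 2) + z(power)) / d) ** 2
    return math.ceil(n)

# With a hypothetical 50% coefficient of variation (plausible given the
# 4-fold spread reported), a 50% decrease needs far fewer replicates
# than a 20% decrease:
print(replicates_needed(0.5, 0.5))  # -> 16
print(replicates_needed(0.5, 0.2))  # -> 99
```

The steep rise in replicates for smaller effects is the practical obstacle the abstract highlights.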
Abstract:
We examined the effect of surface-applied treatments on the above-ground decay resistance of the tenon of mortice-and-tenon timber joints designed to simulate joinery that is exposed to the weather. Joints made from untreated radiata pine, Douglas-fir, brush box, spotted gum and copper-chrome-arsenic (CCA) treated radiata pine were exposed to the weather for 9 y on above-ground racks at five sites throughout eastern Australia. Results indicate (1) a poorly maintained external paint film generally accelerated decay, (2) a brush coat of water-repellent preservative inside the joints often extended serviceability (in some cases by a factor of up to seven times that of untreated joints) and (3) the level of protection provided by a coat of primer applied inside the joint varied and in most cases was not as effective as the water-repellent preservative treatment.
Abstract:
Forty-four study sites were established in remnant woodland in the Burdekin River catchment in tropical north-east Queensland, Australia, to assess recent (decadal) vegetation change. The aim of this study was to further evaluate whether wide-scale vegetation 'thickening' (proliferation of woody plants in formerly more open woodlands) had occurred during the last century, coinciding with significant changes in land management. Soil samples from several depth intervals were size separated into different soil organic carbon (SOC) fractions, which differed from one another in chemical composition and turnover times. Tropical (C4) grasses dominate in the Burdekin catchment, and thus δ13C analyses of SOC fractions with different turnover times can be used to assess whether the relative proportion of trees (C3) and grasses (C4) had changed over time. However, a method was required to permit standardized assessment of the δ13C data for the individual sites within the 13 Mha catchment, which varied in soil and vegetation characteristics. Thus, an index was developed using data from three detailed study sites and global literature to standardize individual isotopic data from different soil depths and SOC fractions to reflect only the changed proportion of trees (C3) to grasses (C4) over decadal timescales. When applied to the 44 individual sites distributed throughout the Burdekin catchment, 64% of the sites were shown to have experienced decadal vegetation thickening, while 29% had remained stable and the remaining 7% had thinned. Thus, the development of this index enabled regional-scale assessment and comparison of decadal vegetation patterns without having to rely on prior knowledge of vegetation changes or aerial photography.
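The δ13C approach rests on two-end-member isotope mixing between C3 (woody) and C4 (grass) carbon sources. A minimal sketch using typical literature end-member values, not the study's standardised index:

```python
def c3_fraction(delta_sample, delta_c3=-27.0, delta_c4=-13.0):
    """Two-end-member mixing: fraction of SOC derived from C3 (woody)
    vegetation, given the bulk delta13C of a SOC fraction (permil).
    End-member values are typical literature figures, not from the study."""
    return (delta_sample - delta_c4) / (delta_c3 - delta_c4)

# A SOC fraction measuring -20 permil sits midway between the end members,
# implying roughly equal C3 and C4 contributions.
print(round(c3_fraction(-20.0), 2))  # -> 0.5
```

Comparing this fraction between fast- and slow-turnover SOC fractions is what lets a shift toward trees (thickening) be inferred without historical vegetation records.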
Abstract:
In south-eastern Queensland, Australia, sorghum planted in early spring usually escapes attack by sorghum midge, Stenodiplosis sorghicola. Experiments were conducted to better understand the role of winter diapause in the population dynamics of this pest. Emergence patterns of adult midge from diapausing larvae on the soil surface and at various depths were investigated during spring to autumn of 1987/88–1989/90. From 1987/88 to 1989/90, 89%, 65% and 98% of adult emergence, respectively, occurred during November and December. Adult emergence from larvae diapausing on the soil surface was severely reduced due to high mortality attributed to surface soil temperatures in excess of 40°C, with much of this mortality occurring between mid-September and mid-October. Emergence of adults from the soil surface was considerably delayed in the 1988/89 season compared with larvae buried at 5 or 10 cm, which had similar emergence patterns for all three seasons. In 1989/90, when a 1-cm-deep treatment was included, there was a 392% increase in adult emergence from this treatment compared with deeper treatments. Only in 1 year (1989/90) did some diapausing larvae on the surface fail to emerge by the end of summer: 28.0% of the larvae on the surface remained in diapause, whereas only 0.8% of the buried larvae did. We conclude that the pattern of emergence explains why spring plantings of sorghum in south-eastern Queensland usually escape sorghum midge attack.