28 results for nanoparticles in soil
in eResearch Archive - Queensland Department of Agriculture
Abstract:
The impact of three cropping histories (sugarcane, maize and soybean) and two tillage practices (conventional tillage and direct drill) on plant-parasitic and free-living nematodes in the following sugarcane crop was examined in a field trial at Bundaberg. Soybean reduced populations of lesion nematode (Pratylenchus zeae) and root-knot nematode (Meloidogyne javanica) in comparison to previous crops of sugarcane or maize but increased populations of spiral nematode (Helicotylenchus dihystera) and maintained populations of dagger nematode (Xiphinema elongatum). However, the effect of soybean on P. zeae and M. javanica was no longer apparent 15 weeks after planting sugarcane, while later in the season, populations of these nematodes following soybean were as high as or higher than after maize or sugarcane. Populations of P. zeae were initially reduced by cultivation but, due to strong resurgence, tended to be higher in conventionally tilled than direct drill plots at the end of the plant crop. Even greater tillage effects were observed with M. javanica and X. elongatum, as nematode populations were significantly higher in conventionally tilled than direct drill plots late in the season. Populations of free-living nematodes in the upper 10 cm of soil were initially highest following soybean, but after 15, 35 and 59 weeks were lower than after sugarcane and contained fewer omnivorous and predatory nematodes. Conventional tillage increased populations of free-living nematodes in soil in comparison to direct drill and was also detrimental to omnivorous and predatory nematodes. These results suggest that crop rotation and tillage not only affect plant-parasitic nematodes directly, but also have indirect effects by impacting on the natural enemies that regulate nematode populations. More than 2 million nematodes/m² were often present in crop residues on the surface of direct drill plots.
Bacterial-feeding nematodes were predominant in residues early in the decomposition process but fungal-feeding nematodes predominated after 15 weeks. This indicates that fungi become an increasingly important component of the detritus food web as decomposition proceeds, and that the rate of nutrient cycling decreases with time. Correlations between total numbers of free-living nematodes and mineral N concentrations in crop residues and surface soil suggested that the free-living nematode community may provide an indication of the rate of mineralisation of N from organic matter.
Abstract:
TRFLP (terminal restriction fragment length polymorphism) was used to assess whether management practices that improved disease suppression and/or yield in a 4-year ginger field trial were related to changes in soil microbial community structure. Bacterial and fungal community profiles were defined by presence and abundance of terminal restriction fragments (TRFs), where each TRF represents one or more species. Results indicated that inclusion of an organic amendment and minimum tillage increased the relative diversity of dominant fungal populations in a system-dependent way. Inclusion of an organic amendment increased bacterial species richness in the pasture treatment. Redundancy analysis showed shifts in microbial community structure associated with different management practices, and treatments grouped according to TRF abundance in relation to yield and disease incidence. ANOVA also indicated the abundance of certain TRFs was significantly affected by farming system management practices, and a number of these TRFs were also correlated with yield or disease suppression. Further analyses are required to determine whether identified TRFs can be used as general or soil-type specific bio-indicators of productivity (increased and decreased) and Pythium myriotylum suppressiveness.
Abstract:
Reforestation of agricultural land with mixed-species environmental plantings (native trees and shrubs) can contribute to mitigation of climate change through sequestration of carbon. Although soil carbon sequestration following reforestation has been investigated at site- and regional-scales, there are few studies across regions where the impact of a broad range of site conditions and management practices can be assessed. We collated new and existing data on soil organic carbon (SOC, 0–30 cm depth, N = 117 sites) and litter (N = 106 sites) under mixed-species plantings and an agricultural pair or baseline across southern and eastern Australia. Sites covered a range of previous land uses, initial SOC stocks, climatic conditions and management types. Differences in total SOC stocks following reforestation were significant at 52% of sites, with a mean rate of increase of 0.57 ± 0.06 Mg C ha⁻¹ y⁻¹. Increases were largely in the particulate fraction, which increased significantly at 46% of sites compared with increases at 27% of sites for the humus fraction. Although the relative increase was highest in the particulate fraction, the humus fraction was the largest proportion of total SOC, and so absolute differences in both fractions were similar. Accumulation rates of carbon in litter were 0.39 ± 0.02 Mg C ha⁻¹ y⁻¹, increasing the total (soil + litter) annual rate of carbon sequestration by 68%. Previously-cropped sites accumulated more SOC than previously-grazed sites. The explained variance differed widely among empirical models of differences in SOC stocks following reforestation according to SOC fraction and depth for previously-grazed (R² = 0.18–0.51) and previously-cropped (R² = 0.14–0.60) sites. For previously-grazed sites, differences in SOC following reforestation were negatively related to total SOC in the pasture. By comparison, for previously-cropped sites, differences in SOC were positively related to mean annual rainfall.
This improved broad-scale understanding of the magnitude and predictors of changes in stocks of soil and litter C following reforestation is valuable for the development of policy on carbon markets and the establishment of future mixed-species environmental plantings.
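The arithmetic linking the reported soil and litter rates to the 68% figure is a simple ratio; as a minimal back-of-envelope sketch (the two rates are the abstract's own figures, the variable names are illustrative):

```python
# Back-of-envelope check of the sequestration rates reported above.
# The 0.57 and 0.39 Mg C/ha/yr figures come from the abstract; the
# combined rate and the relative boost from litter follow directly.

SOIL_RATE = 0.57    # Mg C ha^-1 yr^-1, mean SOC accumulation after reforestation
LITTER_RATE = 0.39  # Mg C ha^-1 yr^-1, carbon accumulation in litter

total_rate = SOIL_RATE + LITTER_RATE               # soil + litter combined
litter_boost_pct = LITTER_RATE / SOIL_RATE * 100   # relative increase from litter

print(f"total: {total_rate:.2f} Mg C/ha/yr, litter adds {litter_boost_pct:.0f}%")
```

Counting litter as well as soil thus raises the combined sequestration rate to roughly 0.96 Mg C ha⁻¹ y⁻¹, consistent with the 68% increase stated in the abstract.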
Abstract:
Highly productive sown pasture systems can result in high growth rates of beef cattle and lead to increases in soil nitrogen and the production of subsequent crops. The nitrogen dynamics and growth of grain sorghum following grazed annual legume leys or a grass pasture were investigated in a no-till system in the South Burnett district of Queensland. Two years of the tropical legumes Macrotyloma daltonii and Vigna trilobata (both self-regenerating annual legumes) and Lablab purpureus (a resown annual legume) resulted in soil nitrate N (0-0.9 m depth), at sorghum sowing, ranging from 35 to 86 kg/ha compared with 4 kg/ha after pure grass pastures. Average grain sorghum production in the 4 cropping seasons following the grazed legume leys ranged from 2651 to 4012 kg/ha. Following the grass pasture, grain sorghum production in the first and second years was <1900 kg/ha, and by the third year grain yield was comparable to the legume systems. Simulation studies utilising the farming systems model APSIM indicated that the soil N and water dynamics following 2-year ley phases could be closely represented over 4 years and the prediction of sorghum growth during this time was reasonable. In simulated unfertilised sorghum crops grown from 1954 to 2004, grain yield did not exceed 1500 kg/ha in 50% of seasons following a grass pasture, while following 2-year legume leys, grain yield exceeded 3000 kg/ha in 80% of seasons. It was concluded that mixed farming systems that utilise short-term legume-based pastures for beef production in rotation with crop production enterprises can be highly productive.
Abstract:
Cultivation and cropping of soils results in a decline in soil organic carbon and soil nitrogen, and can lead to reduced crop yields. The CENTURY model was used to simulate the effects of continuous cultivation and cereal cropping on total soil organic matter (C and N), carbon pools, nitrogen mineralisation, and crop yield from 6 locations in southern Queensland. The model was calibrated for each replicate from the original datasets, allowing comparisons for each replicate rather than site averages. The CENTURY model was able to satisfactorily predict the impact of long-term cultivation and cereal cropping on total organic carbon, but was less successful in simulating the different fractions and nitrogen mineralisation. The model firstly over-predicted the initial (pre-cropping) soil carbon and nitrogen concentration of the sites. To account for the unique shrinking and swelling characteristics of the Vertosol soils, the default annual decomposition rates of the slow and passive carbon pools were doubled, and then the model accurately predicted initial conditions. The ability of the model to predict carbon pool fractions varied, demonstrating the difficulty inherent in predicting the size of these conceptual pools. The strength of the model lies in the ability to closely predict the starting soil organic matter conditions, and the ability to predict the impact of clearing, cultivation, fertiliser application, and continuous cropping on total soil carbon and nitrogen.
Abstract:
Soil testing is the most widely used tool to predict the need for fertiliser phosphorus (P) application to crops. This study examined factors affecting critical soil P concentrations and confidence intervals for wheat and barley grown in Australian soils by interrogating validated data from 1777 wheat and 150 barley field treatment series now held in the BFDC National Database. To narrow confidence intervals associated with estimated critical P concentrations, filters for yield, crop stress, or low pH were applied. Once treatment series with low yield (<1 t/ha), severe crop stress, or pH(CaCl₂) < 4.3 were screened out, critical concentrations were relatively insensitive to wheat yield (>1 t/ha). There was a clear increase in critical P concentration from early trials, when full tillage was common, compared with those conducted in 1995–2011, a period of rapid shift towards adoption of minimum tillage. For wheat, critical Colwell-P concentrations associated with 90 or 95% of maximum yield varied among Australian Soil Classification (ASC) Orders and Sub-orders: Calcarosol, Chromosol, Kandosol, Sodosol, Tenosol and Vertosol. Soil type, based on ASC Orders and Sub-orders, produced critical Colwell-P concentrations at 90% of maximum relative yield ranging from 15 mg/kg (Grey Vertosol) to 47 mg/kg (Supracalcic Calcarosols), with other soils having values in the range 19–27 mg/kg. Distinctive differences in critical P concentrations were evident among Sub-orders of Calcarosols, Chromosols, Sodosols, Tenosols, and Vertosols, possibly due to differences in soil properties related to P sorption. However, insufficient data were available to develop a relationship between P buffering index (PBI) and critical P concentration. In general, there was no evidence that critical concentrations for barley would differ from those for wheat on the same soils.
Significant knowledge gaps to fill to improve the relevance and reliability of soil P testing for winter cereals were: lack of data for oats; the paucity of treatment series reflecting current cropping practices, especially minimum tillage; and inadequate metadata on soil texture, pH, growing season rainfall, gravel content, and PBI. The critical concentrations determined illustrate the importance of recent experimental data and of soil type, but also provide examples of interrogation pathways into the BFDC National Database to extract locally relevant critical P concentrations for guiding P fertiliser decision-making in wheat and barley.
Abstract:
Recolonisation of soil by macrofauna (especially ants, termites and earthworms) in rehabilitated open-cut mine sites is inevitable and, in terms of habitat restoration and function, typically of great value. In these highly disturbed landscapes, soil invertebrates play a major role in soil development (macropore configuration, nutrient cycling, bioturbation, etc.) and can influence hydrological processes such as infiltration, seepage, runoff generation and soil erosion. Understanding and quantifying these ecosystem processes is important in rehabilitation design, establishment and subsequent management to ensure progress to the desired end goal, especially in waste cover systems designed to prevent water reaching and transporting underlying hazardous waste materials. However, soil macrofauna are typically overlooked during hydrological modelling, possibly due to uncertainty about the extent of their influence, which can lead to failure of waste cover systems or rehabilitation activities. We propose that scientific experiments under controlled conditions and field trials on post-mining lands are required to quantify (i) macrofauna–soil structure interactions, (ii) functional dynamics of macrofauna taxa, and (iii) changes in macrofauna and soil development over time. Such knowledge would provide crucial information for soil water models, which would increase confidence in mine waste cover design recommendations and eventually lead to a higher likelihood of rehabilitation success of open-cut mining land.
Abstract:
In vitro experimental environments are used to study interactions between microorganisms and to predict dynamics in natural ecosystems. This study highlights that experimental in vitro environments should be selected to closely match the natural environment of interest, to strengthen extrapolations about aflatoxin production by Aspergillus and competing organisms. Fungal competition and aflatoxin accumulation were studied in soil, cotton wool or tube (water-only) environments, for Aspergillus flavus in competition with Penicillium purpurogenum, Fusarium oxysporum or Sarocladium zeae within maize grains. Inoculated grains were incubated in each environment at two temperature regimes (25 °C and 30 °C). Competition experiments showed an interaction between the main effects of aflatoxin accumulation and environment at 25 °C, but not at 30 °C. However, competition experiments showed fungal populations were always interacting with their environments. Fungal survival differed after the 72-hour incubation in the different experimental environments: all fungi incubated within the soil environment survived, whereas in the cotton-wool environment none of the competitors of A. flavus survived at 30 °C. F. oxysporum was the only fungus able to suppress aflatoxin production at both temperatures; this occurred only in the soil environment, where fumonisins accumulated instead. Smallholder farmers in developing countries face serious mycotoxin contamination of their grains, and soil is a natural reservoir for the associated fungal propagules as well as a drying and storage surface for grains on these farms. Studying fungal dynamics in the soil environment and other environments in vitro can provide insights into aflatoxin accumulation post-harvest.
Abstract:
Two field experiments were carried out in Taveuni, Fiji to study the effects of mucuna (Mucuna pruriens) and grass fallow systems at 6 and 12 month durations on changes in soil properties (Experiment 1) and taro yields (Experiment 2). Biomass accumulation of mucuna fallow crop was significantly higher (P<0.05) than grass fallow crop at both 6 and 12 month durations. The longer fallow duration resulted in higher (P<0.05) total soil organic carbon, total soil nitrogen and earthworm numbers regardless of fallow type. Weed suppression in taro grown under mucuna was significantly greater (P<0.05) than under natural grass fallow. Taro grown under mucuna fallow significantly outyielded taro grown under grass fallow (11.8 vs. 8.8 t ha⁻¹). Also, the gross margin of taro grown under mucuna fallow was 52% higher than that of taro grown under grass fallow. © ISHS.
Abstract:
The mechanisms and control of hardseededness in the 3 Australian cultivars of the genus Desmanthus were investigated in a series of experiments in which the effects of various seed-softening treatments, particularly boiling water, were measured. Desmanthus seed is predominantly hard, only defective seeds normally being otherwise. As it has only very brief, early embryo dormancy, hardseededness is the only serious barrier to germination. Seed is most readily softened through rupture of the palisade at the lens (strophiole). The lens is of a typically mimosaceous type which is readily ruptured by immersion in boiling water or, less readily, by application of pressure to adjacent parts of the testa. Ruptures may consist only of separation of the palisade from underlying tissue, which alone does not confer permeability; mostly they also result in fractures to the palisade that then render seeds irreversibly permeable. The palisade becomes reflective as it separates, which allows the event to be witnessed at the moment of separation if suitable pressure is applied to the testa of an individual seed while it is viewed under magnification. Brief (4–10 seconds) immersion of high-quality seed in boiling water consistently softened a high proportion of seeds without causing serious damage. Extending the duration of immersion led to a progressive increase in the proportion of seed deaths. Neither previous boiling water treatment nor scarification damage to the testa materially affected results of treatment, but immature and small seeds behaved differently, being more vulnerable to damage than mature seed and less likely to undergo lens rupture. Adaptation of boiling water treatment to farm-scale seed handling was simple and reliable.
Commercial treatment of seed by an alternative method suitable for greater bulks and consisting of passage through a rice-whitener was checked and found to be successful through a combination of gentle scarification and lens rupture, both attributable to the numerous minor impacts of the process. Percentage emergence of seedlings from soil in the greenhouse closely followed percentage laboratory germination, except when inferior seed grades were included in the comparison, when emergence was poor. Very little seed softened in soil. Already-permeable seed either germinated rapidly or died, while buried hard seed mostly remained hard and viable even more than a year after sowing.
Abstract:
Two-spotted mite, Tetranychus urticae Koch, was until recently regarded as a minor and infrequent pest of papaya in Queensland through the dry late winter/early summer months. The situation has changed over the past 4-5 years, so that now some growers consider spider mites significant pests all year round. This altered pest status corresponded with a substantial increase in the use of fungicides to control black spot (Asperisporium caricae). A project was initiated in 1998 to examine the potential reasons for escalating mite problems in commercially-grown papaya, which included regular sampling over a 2-year period for mites, mite damage and beneficial arthropods on a number of farms on the wet tropical coast and the drier Atherton Tableland. Differences in soil type, papaya variety, chemical use and some agronomic practices were included in this assessment. Monthly visits were made to each site, where 20 randomly-selected plants from each of 2 papaya lines (yellow and red types) were surveyed. Three leaves were selected from each plant, one from each of the bottom, middle and top strata of leaves. The numbers of mobile predators were recorded, along with visual estimates of the percentage and age of mite damage on each leaf. Leaves were then sprayed with hairspray to fix the mites and immature predators to the leaf surface. Four leaf disks, 25 mm in diameter, were then punched from each leaf into a 50 ml storage container with a purpose-built disk-cutting tool. Disks from each leaf position were separated by tissue paper within the container. On return to the laboratory, each leaf disk was scrutinised under a binocular microscope to determine the numbers of two-spotted mites and eggs, predatory mites and eggs, and the immature stages of predatory insects (mainly Stethorus, Halmus and lacewings). A total of 2160 leaf disks were examined each month. All data were entered into an Access database to facilitate comparisons between sites.
Abstract:
Continuous cultivation and cereal cropping of southern Queensland soils previously supporting native vegetation have resulted in reduced soil nitrogen supply, and consequently decreased cereal grain yields and low grain protein. To enhance yields and protein concentrations of wheat, management practices involving N fertiliser application, with no-tillage and stubble retention, grain legumes, and legume leys were evaluated from 1987 to 1998 on a fertility-depleted Vertosol at Warra, southern Queensland. The objective of this study was to examine the effect of lucerne in a 2-year lucerne–wheat rotation for its nitrogen and disease-break benefits to subsequent grain yield and protein content of wheat as compared with continuous wheat cropping. Dry matter production and nitrogen yields of lucerne were closely correlated with the total rainfall for October–September as well as March–September rainfall. Each 100 mm of total rainfall resulted in 0.97 t/ha of dry matter and 26 kg/ha of nitrogen yield. For the March–September rainfall, the corresponding values were 1.26 t/ha of dry matter and 36 kg/ha of nitrogen yield. The latter values were 10% lower than those produced by annual medics during a similar period. Compared with wheat–wheat cropping, significant increases in total soil nitrogen were observed only in 1990, 1992 and 1994 but increases in soil mineralisable nitrogen were observed in most years following lucerne. Similarly, pre-plant nitrate nitrogen in the soil profile following lucerne was higher by 74 kg/ha (9–167 kg N/ha) than that of wheat–wheat without N fertiliser in all years except 1996. Consequently, higher wheat grain protein (7 out of 9 seasons) and grain yield (4 out of 9 seasons) were produced compared with continuous wheat. There was significant depression in grain yield in 2 (1993 and 1995) out of 9 seasons attributed to soil moisture depletion and/or low growing season rainfall. 
Consequently, the overall responses in yield were lower than those of 50 kg/ha of fertiliser nitrogen applied to wheat–wheat crops, 2-year medic–wheat or chickpea–wheat rotations, although grain protein concentrations were higher following lucerne. The incidence and severity of the soilborne disease common root rot of wheat, caused by Bipolaris sorokiniana, was generally higher in lucerne–wheat than in continuous wheat with no nitrogen fertiliser application, since its severity was significantly correlated with plant-available water at sowing. No significant incidence of crown rot or root lesion nematode was observed. Thus, productivity, which was mainly due to nitrogen accretion in this experiment, can be maintained where short-duration lucerne leys are grown in rotation with wheat.
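The rainfall relationships reported for the lucerne ley are simple linear slopes, so they can be expressed directly. A minimal sketch, assuming strict proportionality (the slopes are the abstract's own figures; the helper name and the 500 mm example are hypothetical):

```python
# Illustrative linear model of lucerne production from rainfall, using the
# slopes reported above: 0.97 t/ha dry matter and 26 kg/ha N per 100 mm of
# total October-September rainfall, or 1.26 t/ha and 36 kg/ha per 100 mm
# of March-September rainfall. The function name is hypothetical.

def lucerne_production(rain_mm: float, period: str = "oct_sep") -> tuple[float, float]:
    """Return (dry matter t/ha, nitrogen yield kg/ha) for a given rainfall."""
    slopes = {
        "oct_sep": (0.97, 26.0),  # per 100 mm, total October-September rainfall
        "mar_sep": (1.26, 36.0),  # per 100 mm, March-September rainfall
    }
    dm_per_100, n_per_100 = slopes[period]
    return rain_mm / 100 * dm_per_100, rain_mm / 100 * n_per_100

dm, n = lucerne_production(500)  # e.g. 500 mm of October-September rainfall
```

Under this sketch, 500 mm of annual rainfall would correspond to about 4.85 t/ha of dry matter and 130 kg/ha of nitrogen yield.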
Abstract:
The chemical control of groundnut white grubs, Holotrichia serrata F. and H. reynaudi Blanchard (Coleoptera: Scarabaeidae), was studied in south-central India. Microplot trials demonstrated that chlorpyrifos and imidacloprid seed dressings were effective against H. serrata at rates as low as 0.6 and 3.5 g a.i. kg⁻¹, respectively, while microplot and on-farm trials showed that 1.2 and 3.5 g a.i. kg⁻¹ of chlorpyrifos and imidacloprid, respectively, were required for H. reynaudi. Chlorpyrifos residue analyses indicated that at 20 days after sowing (d.a.s.) rates up to 5.0 g a.i. kg⁻¹ produced residues in soil and groundnut seedlings markedly below the relevant MRL, and no detectable residues at harvest under the southern Indian rainy-season environment. A farmer survey found that in Andhra Pradesh (AP), insecticides (chlorpyrifos and phorate) were applied for white grub control on 37.5% of farms sampled, while no insecticides were applied for this purpose in Karnataka and Tamil Nadu. The white grub density on AP farms where insecticide had been applied averaged 0.07 larvae m⁻², compared with 1.04 larvae m⁻² on the remaining AP farms. In AP, Karnataka and Tamil Nadu, 70%, 42% and 39% of currently untreated groundnut fields, respectively, exceeded the provisional economic threshold. A survey in the Anantapur district of AP found that farmers' target and achieved rates for seed treatment averaged 0.44 and 0.52 g a.i. kg⁻¹, both below the optimal rates determined in microplot experiments. These data provide the foundation for an effective and sustainable management program for groundnut white grubs in south-central India by providing key efficacy data and baseline data on farmer insecticide-use patterns.
Abstract:
A study was undertaken from 2004 to 2007 to investigate factors associated with decreased efficacy of metalaxyl for managing damping-off of cucumber in Oman. A survey over six growing seasons showed that growers lost up to 14.6% of seedlings following application of metalaxyl. No resistance to metalaxyl was found among Pythium isolates. Damping-off disease in the surveyed greenhouses followed two patterns. In most (69%) greenhouses, seedling mortality occurred shortly after transplanting and decreased thereafter (Phase-I). However, a second phase of seedling mortality (Phase-II) appeared 9-14 d after transplanting in about 31% of the surveyed greenhouses. Analysis of the rate of biodegradation of metalaxyl in six greenhouses indicated a significant increase in the rate of metalaxyl biodegradation in greenhouses that encountered Phase-II damping-off. The half-life of metalaxyl dropped from 93 d in soil that had received no previous metalaxyl treatment to 14 d in soil that had received metalaxyl for eight consecutive seasons, indicating an enhanced rate of metalaxyl biodegradation after repeated use. Multiple applications of metalaxyl helped reduce the appearance of Phase-II damping-off. This appears to be the first report of rapid biodegradation of metalaxyl in greenhouse soils and the first report of its association with the appearance of a second phase of mortality in cucumber seedlings.
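The practical consequence of the half-life shift can be sketched under first-order decay kinetics, which the half-life figures imply. The 93 d and 14 d half-lives are the abstract's own; the 30-day comparison point is an illustrative assumption:

```python
# First-order decay sketch of metalaxyl persistence. The 93 d and 14 d
# half-lives come from the study above; the 30-day time point is an
# illustrative assumption, not a figure from the abstract.

def residual_fraction(days: float, half_life_d: float) -> float:
    """Fraction of applied metalaxyl remaining after `days` of decay."""
    return 0.5 ** (days / half_life_d)

naive = residual_fraction(30, 93)    # soil never treated: ~80% remains
adapted = residual_fraction(30, 14)  # soil treated for 8 seasons: ~23% remains
```

Thirty days after application, the previously untreated soil would still hold most of the fungicide, while the adapted soil would retain under a quarter, consistent with the loss of damping-off control reported above.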
Abstract:
An urgent need exists for indicators of soil health and patch functionality in extensive rangelands that can be measured efficiently and at low cost. Soil mites are candidate indicators, but their identification and handling is so specialised and time-consuming that their inclusion in routine monitoring is unlikely. The aim of this study was to measure the relationship between patch type and mite assemblages using a conventional approach. An additional aim was to determine whether a molecular approach traditionally used for soil microbes could be adapted for soil mites to overcome some of the bottlenecks associated with soil fauna diversity assessment. Soil mite species abundance and diversity were measured using conventional ecological methods in soil from patches with perennial grass and litter cover (PGL), and compared with soil from bare patches with annual grasses and/or litter cover (BAL). Soil mite assemblages were also assessed using a molecular method called terminal-restriction fragment length polymorphism (T-RFLP) analysis. The conventional data showed a relationship between patch type and mite assemblage. The Prostigmata and Oribatida were well represented at the PGL sites, particularly the Aphelacaridae (Oribatida). For T-RFLP analysis, the mite community was represented by a series of DNA fragment lengths that reflected mite sequence diversity. The T-RFLP data showed a distinct difference in the mite assemblage between the patch types. Where possible, T-RFLP peaks were matched to mite families using a reference 18S rDNA database, and the Aphelacaridae prevalent in the conventional samples at PGL sites were identified, as were prostigmatids and oribatids. We identified limits to the T-RFLP approach, including an inability to distinguish some species whose DNA sequences were similar.
Despite these limitations, the data still showed a clear difference between sites, and the molecular taxonomic inferences also compared well with the conventional ecological data. The results from this study indicated that the T-RFLP approach was effective in measuring mite assemblages in this system. The power of this technique lies in the fact that species diversity and abundance data can be obtained quickly: the time taken to process hundreds of samples, from soil DNA extraction to data output on the gene analyser, can be as little as 4 days.