14 results for Cycling.
in eResearch Archive - Queensland Department of Agriculture
Abstract:
Seeds in the field experience wet-dry cycling that is akin to the well-studied commercial process of seed priming in which seeds are hydrated and then re-dried to standardise their germination characteristics. To investigate whether the persistence (defined as in situ longevity) and antioxidant capacity of seeds are influenced by wet-dry cycling, seeds of the global agronomic weed Avena sterilis ssp. ludoviciana were subjected to (1) controlled ageing at 60% relative humidity and 53.5°C for 31 days, (2) controlled ageing then priming, or (3) ageing in the field in three soils for 21 months. Changes in seed viability (total germination), mean germination time, seedling vigour (mean seedling length), and the concentrations of the glutathione (GSH) / glutathione disulphide (GSSG) redox couple were recorded over time. As controlled-aged seeds lost viability, GSH levels declined and the relative proportion of GSSG contributing to total glutathione increased, indicative of a failing antioxidant capacity. Subjecting seeds that were aged under controlled conditions to a wet-dry cycle (to −1 MPa) prevented viability loss and increased GSH levels. Field-aged seeds that underwent numerous wet-dry cycles due to natural rainfall maintained high viability and high GSH levels. Thus wet-dry cycles in the field may enhance seed longevity and persistence coincident with re-synthesis of protective compounds such as GSH.
Abstract:
Grazing is a major land use in Australia's rangelands. The 'safe' livestock carrying capacity (LCC) required to maintain resource condition is strongly dependent on climate. We reviewed: the approaches for quantifying LCC; current trends in climate and their effect on components of the grazing system; implications of the 'best estimates' of climate change projections for LCC; the agreement and disagreement between the current trends and projections; and the adequacy of current models of forage production in simulating the impact of climate change. We report the results of a sensitivity study of climate change impacts on forage production across the rangelands, and we discuss the more general issues facing grazing enterprises associated with climate change, such as 'known uncertainties' and adaptation responses (e.g. use of climate risk assessment). We found that the method of quantifying LCC from a combination of estimates (simulations) of long-term (>30 years) forage production and successful grazier experience has been well tested across northern Australian rangelands with different climatic regions. This methodology provides a sound base for the assessment of climate change impacts, even though there are many identified gaps in knowledge. The evaluation of current trends indicated substantial differences in the trends of annual rainfall (and simulated forage production) across Australian rangelands, with general increases in most of western Australian rangelands (including northern regions of the Northern Territory) and decreases in eastern Australian rangelands and south-western Western Australia. Some of the projected changes in rainfall and temperature appear small compared with year-to-year variability. Nevertheless, the impacts on rangeland production systems are expected to be important in terms of required managerial and enterprise adaptations.
Some important aspects of climate systems science remain unresolved, and we suggest that a risk-averse approach to rangeland management, based on the 'best estimate' projections, in combination with appropriate responses to short-term (1-5 years) climate variability, would reduce the risk of resource degradation. Climate change projections - including changes in rainfall, temperature, carbon dioxide and other climatic variables - if realised, are likely to affect forage and animal production, and ecosystem functioning. The major known uncertainties in quantifying climate change impacts are: (i) carbon dioxide effects on forage production, quality, nutrient cycling and competition between life forms (e.g. grass, shrubs and trees); and (ii) the future role of woody plants, including effects of fire, climatic extremes and management for carbon storage. In a simple example of simulating climate change impacts on forage production, we found that increased temperature (3°C) was likely to result in a decrease in forage production for most rangeland locations (e.g. -21% calculated as an unweighted average across 90 locations). The increase in temperature exacerbated or reduced the effects of a 10% decrease/increase in rainfall respectively (-33% or -9%). Estimates of the beneficial effects of increased CO2 (from 350 to 650 ppm) on forage production and water use efficiency indicated enhanced forage production (+26%). The increase was approximately equivalent to the decline in forage production associated with a 3°C temperature increase. The large magnitude of these opposing effects emphasised the importance of the uncertainties in quantifying the impacts of these components of climate change. We anticipate decreases in LCC, given that the 'best estimate' of climate change across the rangelands is for a decline (or little change) in rainfall and an increase in temperature.
As a consequence, we suggest that public policy should have regard for the implications for livestock enterprises, regional communities, potential resource damage, animal welfare and human distress. However, the capability to quantify these warnings is yet to be developed, and this important task remains a challenge for rangeland and climate systems science.
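The abstract's reported effect sizes (temperature -21%, CO2 +26%) can be combined in a rough back-of-envelope way to show why the two effects are described as "approximately equivalent" and opposing. This is an illustrative sketch only: the helper function and the multiplicative combination rule are assumptions made here for illustration, not the simulation method used in the study.

```python
# Illustrative only: combines the abstract's reported fractional effects
# multiplicatively. The paper's figures come from forage-production
# simulations, not from this arithmetic.

def combine(*effects):
    """Multiplicatively combine fractional changes, e.g. -0.21 and +0.26."""
    net = 1.0
    for e in effects:
        net *= 1.0 + e
    return net - 1.0

temp_effect = -0.21  # +3 degrees C, unweighted average across 90 locations
co2_effect = +0.26   # CO2 increase from 350 to 650 ppm

net = combine(temp_effect, co2_effect)
print(f"net change in forage production: {net:+.1%}")
```

Under this simple combination the two effects nearly cancel (a net change of well under 1%), which mirrors the abstract's point that the uncertainty in each component dominates the combined estimate.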
Abstract:
Bemisia tabaci biotype B, commonly known as the silverleaf whitefly (SLW), is an alien species that invaded Australia in the mid-1990s. This paper reports on the invasion ecology of SLW and the factors that are likely to have contributed to the first outbreak of this major pest in an Australian cotton cropping system. Population dynamics of SLW within whitefly-susceptible crop (cotton and cucurbit) and non-crop vegetation (sowthistle, Sonchus spp.) components of the cropping system were investigated over four consecutive growing seasons (September-June) from 2001/02 to 2004/05 in the Emerald Irrigation Area (EIA) of Queensland, Australia. Based on fixed geo-referenced sampling sites, variation in the spatial and temporal abundance of SLW within each system component was quantified to provide baseline data for the development of ecologically sustainable pest management strategies. Parasitism of large (3rd and 4th instar) SLW nymphs by native aphelinid wasps was quantified to determine the potential for natural control of SLW populations. Following the initial outbreak in 2001/02, SLW abundance declined and stabilised over the next three seasons. The population dynamics of SLW are characterised by inter-seasonal population cycling between the non-crop (weed) and cotton components of the EIA cropping system. Cotton was the largest sink for and source of SLW during the study period. Over-wintering populations dispersed from weed host plant sources to cotton in spring, followed by a reverse dispersal in late summer and autumn to broad-leaved crops and weeds. A basic spatial source-sink analysis showed that SLW adult and nymph densities were higher throughout spring in cotton fields closer to over-wintering weed sources than in fields further away. Cucurbit fields were not significant sources of SLW and did not appear to contribute significantly to the regional population dynamics of the pest.
Substantial parasitism of nymphal stages throughout the study period indicates that native parasitoid species and other natural enemies are important sources of SLW mortality in Australian cotton production systems. Weather conditions and the use of broad-spectrum insecticides for pest control are implicated in the initial outbreak and ongoing pest status of SLW in the region.
Abstract:
Lantana camara is a recognised weed of worldwide significance due to its extensive distribution and its impacts on primary industries and nature conservation. However, quantitative data on the impact of the weed on soil ecosystem properties are scarce, especially in SE Australia, despite the pervasive presence of the weed along its coastal and inland regions. Consequently, mineral soils for physicochemical analyses were collected beneath and away from L. camara infestations at four sites west of Brisbane, SE Australia. These sites (a hoop pine plantation, a cattle farm, and two eucalyptus forests with occasional grazing and a fire regime, respectively) vary in landscape and land-use types. A significant site effect was observed more frequently than an effect of invasion status. Nonetheless, after controlling for site differences, ~50% of the 23 soil traits examined differed significantly between infested and non-infested soils. Moisture, pH, Ca, total and organic C, and total N (but not exchangeable N in the form of NO3-) were significantly elevated, while sodium, chloride, copper, iron, sulfur and manganese, many of which can be toxic to plant growth if present at excess levels, were present at lower levels in soils supporting L. camara than in soils lacking the weed. These results indicate that L. camara can improve soil fertility and influence nutrient cycling, making the substratum ideal for its own growth, which might explain the ability of the weed to outcompete other species, especially native ones.
Abstract:
The impact of three cropping histories (sugarcane, maize and soybean) and two tillage practices (conventional tillage and direct drill) on plant-parasitic and free-living nematodes in the following sugarcane crop was examined in a field trial at Bundaberg. Soybean reduced populations of lesion nematode (Pratylenchus zeae) and root-knot nematode (Meloidogyne javanica) in comparison to previous crops of sugarcane or maize, but increased populations of spiral nematode (Helicotylenchus dihystera) and maintained populations of dagger nematode (Xiphinema elongatum). However, the effect of soybean on P. zeae and M. javanica was no longer apparent 15 weeks after planting sugarcane, while later in the season, populations of these nematodes following soybean were as high as or higher than after maize or sugarcane. Populations of P. zeae were initially reduced by cultivation but, due to strong resurgence, tended to be higher in conventionally tilled than direct drill plots at the end of the plant crop. Even greater tillage effects were observed with M. javanica and X. elongatum, as nematode populations were significantly higher in conventionally tilled than direct drill plots late in the season. Populations of free-living nematodes in the upper 10 cm of soil were initially highest following soybean, but after 15, 35 and 59 weeks were lower than after sugarcane and contained fewer omnivorous and predatory nematodes. Conventional tillage increased populations of free-living nematodes in soil in comparison to direct drill and was also detrimental to omnivorous and predatory nematodes. These results suggest that crop rotation and tillage not only affect plant-parasitic nematodes directly, but also have indirect effects by impacting on the natural enemies that regulate nematode populations. More than 2 million nematodes/m² were often present in crop residues on the surface of direct drill plots.
Bacterial-feeding nematodes were predominant in residues early in the decomposition process, but fungal-feeding nematodes predominated after 15 weeks. This indicates that fungi become an increasingly important component of the detritus food web as decomposition proceeds, and that the rate of nutrient cycling decreases with time. Correlations between total numbers of free-living nematodes and mineral N concentrations in crop residues and surface soil suggested that the free-living nematode community may provide an indication of the rate of mineralisation of N from organic matter.
Abstract:
There are many potential bioremediation approaches that may be suitable for prawn farms in Queensland. Although most share generally accepted bioremediation principles, advocacy for different methods tends to vary widely. This diversity of approach is particularly driven by the availability and knowledge of functional species at different localities around the world. In Australia, little is known about the abilities of many native species in this regard, and translocation and biosecurity issues prevent the use of exotic species that have shown potential in other countries. Species selected must be tolerant of eutrophic conditions and ecological shifts, because prawn pond nutrient levels and pathways can vary with different assemblages of autotrophic and heterotrophic organisms. Generally, they would be included in a constructed ecosystem because of their functional contributions to nutrient cycling and uptake, and to create nutrient sinks in the form of harvestable biomass. Wide salinity, temperature and water quality tolerances are also valuable attributes for selected species due to the sometimes-pronounced effects of environmental extremes, and to provide over-wintering options and adequate safety margins in avoiding mass mortalities. To practically achieve these bioremediation polycultures on a large scale, and in concert with the operations of a prawn farm, methods involving seed production, stock management, and a range of other farm engineering and product handling systems need to be reliably achievable and economically viable. Research funding provided by the Queensland Government through the Aquaculture Industry Development Initiative (AIDI) 2002-04 has enabled a number of technical studies into biological systems to treat prawn farm effluent for recirculation and improved environmental sustainability.
AIDI bioremediation research in southern Queensland was based at the Bribie Island Aquaculture Research Centre (BIARC), and was conducted in conjunction with AIDI genetics and selection research, and a Natural Heritage Trust (NHT) funded program (Coast and Clean Seas Project No.717757). This report compilation provides a summary of some of the work conducted within these programs.
Abstract:
Macfadyena unguis-cati (L.) Gentry (Bignoniaceae) is a major environmental weed in coastal Queensland, Australia. There is a lack of quantitative data on its leaf chemistry and its impact on soil properties. Soils from infested vs uninfested areas, and leaves of M. unguis-cati and three co-occurring vine species (one exotic, two native), were collected at six sites (riparian and non-riparian) in south-eastern Queensland. Effects of invasion status, species, site and habitat type were examined using univariate and multivariate analyses. Habitat type had a greater effect on soil nutrients than on leaf chemistry. The invasion effect of M. unguis-cati on soil chemistry was more pronounced in non-riparian than in riparian habitat. Significantly higher values were obtained in M. unguis-cati-infested (vs uninfested) soils for ~50% of traits. Leaf ion concentrations differed significantly between exotic and native vines. The observed higher leaf-nutrient load (especially nitrogen, phosphorus and potassium) in exotic plants aligns with the preference of invasive plant species for disturbed habitats with higher nutrient input. The higher load of trace elements (aluminium, boron, cadmium and iron) in its leaves suggests that cycling of heavy-metal ions, many of which are potentially toxic at excess levels, could be accelerated in soils of M. unguis-cati-invaded landscapes. Although inferences from the present study are based on correlative data, the consistency of the patterns across many sites suggests that M. unguis-cati may improve soil fertility and influence nutrient cycling, perhaps through legacy effects of its own litter input.
Abstract:
Leaf-litter thrips were much more common and diverse in dry sclerophyll forest than in wetter forest types in subtropical southeast Queensland, Australia. In dry sclerophyll forest, the species composition of thrips in leaf-litter was strongly differentiated from the thrips fauna associated with bark of the trees Eucalyptus major and Acacia melanoxylon (4 of 34 species in common). The species composition of bark-dwelling thrips was similar across the two tree species and also across two eucalypts with different bark types, Eucalyptus major (flaky) and Eucalyptus siderophloia (rough). The diversity of thrips from the leaf-litter was not differentiated across all of these tree species. Virtually all thrips collected were Phlaeothripidae, subfamilies Idolothripinae and Phlaeothripinae. Idolothripinae were associated almost exclusively with leaf-litter, but Phlaeothripinae were in leaf-litter and bark. The association of fungal-feeding thrips with dry sclerophyll forest raises questions about their ecological requirements and the role they play in nutrient cycling.
Abstract:
Fire is an important driver of nutrient cycling in savannas. Here, we determined the impact of fire frequency on total and soluble soil nitrogen (N) pools in tropical savanna. The study sites consisted of 1-ha experimental plots near Darwin, Australia, which remained unburnt for at least 14 years or were burnt at 1-, 2- or 5-year intervals over the past 6 years. Soil was analysed from patches underneath tree canopies and in inter-canopy patches at 1, 12, 28, 55 and 152 days after fire. Patch type had a significant effect on all soil N pools, with greater concentrations of total and soluble (nitrate, ammonium, amino acids) N under tree canopies than in inter-canopy patches. The time since the last fire had no significant effect on N pools. Fire frequency similarly did not affect total soil N, but it did influence soluble soil N. Soil amino acids were most prominent in burnt savanna, ammonium was highest in infrequently burnt (5-year interval) savanna and nitrate was highest in unburnt savanna. We suggest that the main effect of fire on soil N relations occurs indirectly through altered tree-grass dynamics. Previous studies have shown that high fire frequencies reduce tree cover by lowering recruitment and increasing mortality. Our findings suggest that these changes in tree cover could result in a 30% reduction in total soil N and 10-60% reductions in soluble N pools. This finding is consistent with studies from savannas globally, providing further evidence for a general theory of patchiness as a key driver of nutrient cycling in the savanna biome.
Abstract:
Fire is a major driver of ecosystem change and can disproportionately affect the cycling of different nutrients. Thus, a stoichiometric approach to investigate the relationships between nutrient availability and microbial resource use during decomposition is likely to provide insight into the effects of fire on ecosystem functioning. We conducted a field litter bag experiment to investigate the long-term impact of repeated fire on the stoichiometry of leaf litter C, N and P pools, and nutrient-acquiring enzyme activities during decomposition in a wet sclerophyll eucalypt forest in Queensland, Australia. Fire frequency treatments have been maintained since 1972, including burning every two years (2yrB), burning every four years (4yrB) and no burning (NB). C:N ratios in freshly fallen litter were 29-42% higher and C:P ratios were 6-25% lower for 2yrB than NB during decomposition, with correspondingly lower 2yrB N:P ratios (27-32) than for NB (34-49). Trends in litter soluble and microbial N:P ratios were similar to the overall litter N:P ratios across fire treatments. Consistent with this, the ratio of activities for N-acquiring to P-acquiring enzymes in litter was higher for 2yrB than NB, while 4yrB was generally intermediate between 2yrB and NB. Decomposition rates of freshly fallen litter were significantly lower for 2yrB (72±2% mass remaining at the end of the experiment) than for 4yrB (59±3%) and NB (62±3%), a difference that may be related to effects of N limitation, lower moisture content, and/or litter C quality. Results for older mixed-age litter were similar to those for freshly fallen litter, although treatment differences were less pronounced. Overall, these findings show that frequent fire (2yrB) decoupled N and P cycling, as manifested in litter C:N:P stoichiometry and in microbial biomass N:P ratio and enzymatic activities. These data indicate that fire induced a transient shift to N-limited ecosystem conditions during the post-fire recovery phase.
Abstract:
Assessing storage impacts on manure properties is relevant to research associated with nutrient-use efficiency and greenhouse gas (GHG) emissions. We examined the impact of cold storage on the physicochemical properties, biochemical methane potential (BMP) and composition of microbial communities of beef feedlot manure and poultry broiler litter. Manures were analysed within 2 days of collection and after 2 and 8 weeks in refrigerated (4 °C) or frozen (–20 °C) storage. Compared with fresh manure, stored manures had statistically significant (p < 0.05) but comparatively minor (<10%) changes in electrical conductivity, chloride and ammonium concentrations. Refrigeration and freezing did not significantly affect (p > 0.05) BMP in either manure type. We did not detect ammonium- or nitrite-oxidising bacterial taxa (AOB, NOB) using fluorescence in situ hybridisation (FISH). Importantly, the viability of microbes was unchanged by storage. We conclude that storage at –20 °C or 4 °C adequately preserves the investigated traits of the studied manures for research aimed at improving nutrient cycling and reducing GHG emissions.
Abstract:
Methane is a potent greenhouse gas with a global warming potential ∼28 times that of carbon dioxide. Consequently, sources and sinks that influence the concentration of methane in the atmosphere are of great interest. In Australia, agriculture is the primary source of anthropogenic methane emissions (60.4% of national emissions, or 3260 kt methane year-1, between 1990 and 2011), and cropping and grazing soils represent Australia's largest potential terrestrial methane sink. As of 2011, the expansion of agricultural soils, which are ∼70% less efficient at consuming methane than undisturbed soils, to 59% of Australia's land mass (456 Mha), and increasing livestock densities in northern Australia, suggest negative implications for the national methane flux. Plant biomass burning does not appear to have long-term negative effects on methane flux unless soils are converted for agricultural purposes. Rice cultivation contributes marginally to national methane emissions, and this fluctuates depending on water availability. Significant available research into biological, geochemical and agronomic factors has been pertinent for developing effective methane mitigation strategies. We discuss methane-flux feedback mechanisms in relation to climate change drivers such as temperature, atmospheric carbon dioxide and methane concentrations, precipitation and extreme weather events. Future research should focus on quantifying the role of Australian cropping and grazing soils as methane sinks in the national methane budget, linking the biodiversity and activity of methane-cycling microbes to environmental factors, and quantifying how a combination of climate change drivers will affect total methane flux in these systems.
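The two figures quoted in the abstract (a global warming potential of ~28 and average agricultural emissions of 3260 kt CH4 per year) can be combined into a CO2-equivalent estimate. This is a back-of-envelope illustration of the standard GWP conversion, not a calculation from the paper itself.

```python
# Illustrative GWP conversion using the abstract's quoted figures.
GWP_CH4 = 28            # 100-year global warming potential of methane (per abstract)
ch4_kt_per_year = 3260  # average annual agricultural CH4 emissions, 1990-2011

# kt CH4 * GWP gives kt CO2-equivalent; divide by 1000 for Mt.
co2e_mt_per_year = ch4_kt_per_year * GWP_CH4 / 1000
print(f"agricultural CH4 ≈ {co2e_mt_per_year:.1f} Mt CO2-e per year")  # ≈ 91.3 Mt
```

This order of magnitude (roughly 90 Mt CO2-e annually) is why agricultural methane sources and soil methane sinks receive the attention the abstract describes.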
Abstract:
Soil biogeochemical cycles are largely mediated by microorganisms, while fire significantly modifies biogeochemical cycles, mainly by altering microbial communities and substrate availability. The majority of studies on fire effects have focused on the surface soil; therefore, our understanding of the vertical distribution of microbial communities and the impacts of fire on nitrogen (N) dynamics in the soil profile is limited. Here, we examined the changes in soil denitrification capacity (DNC) and denitrifying communities with depth under different burning regimes, and their interaction with environmental gradients along the soil profile. Results showed that soil depth had a more pronounced impact than the burning treatment on bacterial community size. The abundance of 16S rRNA and denitrification genes (narG, nirK, and nirS) declined exponentially with soil depth. Surprisingly, the nosZ-harboring denitrifiers were enriched in the deeper soil layers, which likely indicates that they can better adapt to stress conditions (i.e., oxygen deficiency, nutrient limitation, etc.) than other denitrifiers. Soil nutrients, including dissolved organic carbon (DOC), total soluble N (TSN), ammonium (NH4+) and nitrate (NO3-), declined significantly with soil depth, which probably contributed to the vertical distribution of denitrifying communities. Soil DNC decreased significantly with soil depth and was negligible below 20 cm. These findings provide new insights into the niche separation of N-cycling functional guilds along the soil profile under a varied fire disturbance regime.
Abstract:
Recolonisation of soil by macrofauna (especially ants, termites and earthworms) in rehabilitated open-cut mine sites is inevitable and, in terms of habitat restoration and function, typically of great value. In these highly disturbed landscapes, soil invertebrates play a major role in soil development (macropore configuration, nutrient cycling, bioturbation, etc.) and can influence hydrological processes such as infiltration, seepage, runoff generation and soil erosion. Understanding and quantifying these ecosystem processes is important in rehabilitation design, establishment and subsequent management to ensure progress towards the desired end goal, especially in waste cover systems designed to prevent water reaching and transporting underlying hazardous waste materials. However, soil macrofauna are typically overlooked during hydrological modelling, possibly due to uncertainty about the extent of their influence, which can lead to failure of waste cover systems or rehabilitation activities. We propose that scientific experiments under controlled conditions and field trials on post-mining lands are required to quantify (i) macrofauna-soil structure interactions, (ii) the functional dynamics of macrofauna taxa, and (iii) their effects on macrofauna and soil development over time. Such knowledge would provide crucial information for soil water models, which would increase confidence in mine waste cover design recommendations and eventually lead to a higher likelihood of rehabilitation success on open-cut mining land.