44 results for Carbon storage
in eResearch Archive - Queensland Department of Agriculture
Abstract:
Grazing is a major land use in Australia's rangelands. The 'safe' livestock carrying capacity (LCC) required to maintain resource condition is strongly dependent on climate. We reviewed: the approaches for quantifying LCC; current trends in climate and their effect on components of the grazing system; implications of the 'best estimates' of climate change projections for LCC; the agreement and disagreement between the current trends and projections; and the adequacy of current models of forage production in simulating the impact of climate change. We report the results of a sensitivity study of climate change impacts on forage production across the rangelands, and we discuss the more general issues facing grazing enterprises associated with climate change, such as 'known uncertainties' and adaptation responses (e.g. use of climate risk assessment). We found that the method of quantifying LCC from a combination of estimates (simulations) of long-term (>30 years) forage production and successful grazier experience has been well tested across northern Australian rangelands with different climatic regions. This methodology provides a sound base for the assessment of climate change impacts, even though there are many identified gaps in knowledge. The evaluation of current trends indicated substantial differences in the trends of annual rainfall (and simulated forage production) across Australian rangelands, with general increases in most of western Australian rangelands (including northern regions of the Northern Territory) and decreases in eastern Australian rangelands and south-western Western Australia. Some of the projected changes in rainfall and temperature appear small compared with year-to-year variability. Nevertheless, the impacts on rangeland production systems are expected to be important in terms of required managerial and enterprise adaptations.
Some important aspects of climate systems science remain unresolved, and we suggest that a risk-averse approach to rangeland management, based on the 'best estimate' projections, in combination with appropriate responses to short-term (1-5 years) climate variability, would reduce the risk of resource degradation. Climate change projections - including changes in rainfall, temperature, carbon dioxide and other climatic variables - if realised, are likely to affect forage and animal production, and ecosystem functioning. The major known uncertainties in quantifying climate change impacts are: (i) carbon dioxide effects on forage production, quality, nutrient cycling and competition between life forms (e.g. grass, shrubs and trees); and (ii) the future role of woody plants, including effects of fire, climatic extremes and management for carbon storage. In a simple example of simulating climate change impacts on forage production, we found that increased temperature (3 degrees C) was likely to result in a decrease in forage production for most rangeland locations (e.g. -21% calculated as an unweighted average across 90 locations). The increase in temperature exacerbated or reduced the effects of a 10% decrease/increase in rainfall respectively (-33% or -9%). Estimates of the beneficial effects of increased CO2 (from 350 to 650 ppm) on forage production and water use efficiency indicated enhanced forage production (+26%). The increase was approximately equivalent to the decline in forage production associated with a 3 degrees C temperature increase. The large magnitude of these opposing effects emphasised the importance of the uncertainties in quantifying the impacts of these components of climate change. We anticipate decreases in LCC given that the 'best estimate' of climate change across the rangelands is for a decline (or little change) in rainfall and an increase in temperature.
As a consequence, we suggest that public policy have regard for the implications for livestock enterprises, regional communities, potential resource damage, animal welfare and human distress. However, the capability to quantify these warnings is yet to be developed, and this important task remains a challenge for rangeland and climate systems science.
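The -21% figure quoted above is described as an unweighted average across 90 locations. As a minimal sketch of that bookkeeping (the per-location values below are illustrative placeholders, not the study's data):

```python
# Sketch: unweighted averaging of simulated forage-production changes across
# locations, each location weighted equally. Illustrative values only; the
# study's actual simulation outputs are not reproduced here.
def unweighted_mean_change(pct_changes):
    """Mean percentage change across locations (equal weights)."""
    return sum(pct_changes) / len(pct_changes)

# Hypothetical per-location changes under a +3 degrees C scenario:
changes = [-30.0, -15.0, -18.0]
print(round(unweighted_mean_change(changes), 1))  # -21.0
```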
Abstract:
Five species of commercial prawns, Penaeus plebejus, P. merguiensis, P. semisulcatus, P. esculentus and M. bennettae, were obtained from South-East and North Queensland, chilled soon after capture and then stored either whole or deheaded on ice or in ice slurry until spoilage. Total bacterial counts, total volatile nitrogen, K-values and total demerit scores were assessed at regular intervals. Shelf lives ranged from 10 to 17 days on ice and exceeded 20 days in ice slurry. Initial bacterial flora on prawns from shallower waters (4-15 m) were dominated by Gram-positive bacteria, with lag periods of around 7 days, whereas flora on prawns from deeper waters (100 m) were dominated by Pseudomonas spp., with no lag periods in bacterial growth. The dominant spoiler on ice was mainly Pseudomonas fragi, whereas the main spoiler in ice slurry was Shewanella putrefaciens. Bacterial interactions seem to play a major role in the patterns of spoilage in relation to capture environment and storage method.
Abstract:
Passionfruit (Passiflora edulis) concentrates (542 g/kg soluble solids) prepared in a wiped-film evaporator were stored for up to 6 months at -18°, 4° and 20°C. Yeast and mould counts were taken and colour changes noted during storage. When suitably diluted, the concentrate's colour and flavour were acceptable for 1 month at 20°C, 3 months at 4°C and 6 months at -18°C. Commercial short-term storage of concentrate at temperatures above -18°C appears to be feasible. An address presented to the 20th Annual Convention AIFST, Albury NSW, 16th-20th May, 1987.
Abstract:
The effect of moisture content and storage temperature on the high-quality storage life of macadamia nut-in-shell (NIS), and the subsequent influence of NIS storage on the shelf-life of roasted kernel, is being investigated. Macadamia integrifolia 'Keauhou' (HAES 246) NIS is being stored at 5°, 25° and 40°C at moisture contents of 15.0, 12.5, 10.0, 7.5 and 3.5% for a maximum of 12 months. Preliminary results showed that unacceptable levels of visual mould developed on NIS with 15.0 and 12.5% moisture at 25°C following relatively short periods of storage. Discolouration and the production of an off-flavour in the raw kernel resulted after 1 month's storage of NIS with a moisture content of 10.0% at 40°C. Roasting times were reduced with increased storage duration of NIS with a moisture content of 15.0, 12.5 and 10.0% at 25°C, 15.0 and 12.5% at 5°C, and 3.5% at 40°C. The percentage of roasted kernel rejects increased with increased storage duration of NIS with a moisture content of 15.0 and 12.5% at 25°C.
Abstract:
The effect of cold storage on glucosinolate concentration was examined in 7-day-old seed-sprouts of broccoli, kohl rabi, white radish and rocket. Principal glucosinolates identified were glucoraphanin and glucoerucin (in broccoli, kohl rabi and rocket), glucoiberin (in broccoli and kohl rabi), and glucoraphenin and glucodehydroerucin (in white radish). Generally, sprouts showed no significant changes in individual glucosinolate concentrations during storage at 4°C for 3 weeks. The exception to this was rocket, which showed a significant decline in glucoerucin and glucoraphanin after 1 and 2 weeks, respectively. These preliminary results indicate that as there is no significant loss of glucosinolates in broccoli, radish and kohl rabi sprouts, these sprouts may be stored under domestic refrigeration conditions without significant loss of potential anti-cancer compounds. Rocket sprouts, on the other hand, should be consumed soon after purchase if glucosinolate levels are to be maintained.
Abstract:
The size of the soil microbial biomass carbon (SMBC) has been proposed as a sensitive indicator for measuring the adverse effects of contaminants on the soil microbial community. In this study of Australian agricultural systems, we demonstrated that field variability of SMBC measured using the fumigation-extraction procedure limited its use as a robust ecotoxicological endpoint. The SMBC varied up to 4-fold across control samples collected from a single field site, due to small-scale spatial heterogeneity in the soil physicochemical environment. Power analysis revealed that large numbers of replicates (3-93) were required to identify 20% or 50% decreases in the size of the SMBC of contaminated soil samples relative to their uncontaminated control samples at the 0.05 level of statistical significance. We question the value of the routine measurement of SMBC as an ecotoxicological endpoint at the field scale, and suggest that more robust and predictive microbiological indicators be sought.
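The link between field variability and required replication can be sketched with a standard two-sample normal-approximation power formula. This is an assumption for illustration: the abstract does not state which power-analysis method the authors used, and the CV and effect sizes below are illustrative.

```python
# Sketch: replicates per group needed to detect a relative decrease in a mean,
# via the normal-approximation sample-size formula for a two-sample comparison.
# Illustrative only; not the study's actual power analysis.
from math import ceil
from statistics import NormalDist

def replicates_needed(cv, effect_frac, alpha=0.05, power=0.8):
    """Approximate replicates per group to detect a decrease of `effect_frac`
    of the control mean, given a coefficient of variation `cv`."""
    z = NormalDist()
    za = z.inv_cdf(1 - alpha / 2)   # two-sided critical value
    zb = z.inv_cdf(power)           # power quantile
    # sigma/delta = cv / effect_frac when both are relative to the mean
    return ceil(2 * ((za + zb) * cv / effect_frac) ** 2)

# A highly variable endpoint (CV = 60%, illustrative) needs far more
# replicates to detect a 20% decrease than a 50% one:
print(replicates_needed(0.6, 0.2), replicates_needed(0.6, 0.5))  # 142 23
```

This mirrors the abstract's point: halving the detectable effect size roughly quadruples the replication required, which is why small field trials struggle to detect modest SMBC decreases.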
Abstract:
Detailed data on seagrass distribution, abundance, growth rates and community structure were collected at Orman Reefs in March 2004 to estimate the above-ground productivity and carbon assimilated by seagrass meadows. Seagrass meadows were re-examined in November 2004 for comparison at the seasonal extremes of seagrass abundance. Ten seagrass species were identified in the meadows on Orman Reefs. Extensive seagrass coverage was found in March (18,700 ha) and November (21,600 ha), with seagrass covering the majority of the intertidal reef-top areas and a large proportion of the subtidal areas examined. There were marked differences in seagrass above-ground biomass, distribution and species composition between the two surveys. Major changes between March and November included a substantial decline in biomass for intertidal meadows and an expansion in area of subtidal meadows. Changes were most likely a result of greater tidal exposure of intertidal meadows prior to November leading to desiccation and temperature-related stress. The Orman Reef seagrass meadows had a total above-ground productivity of 259.8 t DW day-1 and estimated carbon assimilation of 89.4 t C day-1 in March. The majority of this production came from the intertidal meadows, which accounted for 81% of the total production. Intra-annual changes in seagrass species composition, shoot density and size of meadows measured in this study were likely to have a strong influence on the total above-ground production during the year. The net estimated above-ground productivity of Orman Reefs meadows in March 2004 (1.19 g C m-2 day-1) was high compared with other tropical seagrass areas that have been studied and also higher than many other marine, estuarine and terrestrial plant communities.
Abstract:
Swordfish are kept chilled, not frozen, for up to 15 days before being unloaded at Australian ports. Swordfish landed alive, and to a lesser extent pre-rigor, have better quality when unloaded. Warmer fishing waters did not lead to poorer quality at unloading. There was a serious loss of quality during long fishing trips. Sex had no influence on swordfish quality. Three methods of chilling were evaluated: refrigerated seawater (RSW) chilling for up to 2 days followed by storage under ice; refrigerated brine (seawater with extra salt added) for up to 2 days followed by storage in a freshwater ice slurry; and ice slurry (freshwater ice mixed with seawater) for up to 2 days followed by storage under ice only. Two fishing trips were monitored for each method. The freshness indicator K value was used to determine which method produced the best quality swordfish when unloaded at the factory. Storage method played a larger role in quality loss than capture conditions. Refrigerated brine produced the best quality swordfish when the machinery functioned properly, closely followed by RSW. Ice slurry chilling of large fish such as swordfish exhibited initial delays in the reduction of core temperature, which led to lower quality. This method could be improved with the addition of mechanical circulation. Mechanical problems, which resulted in minor increases of temperature during brine storage, led to a much larger loss of quality than would be expected.
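The K value used above is a widely used freshness index, conventionally defined as the proportion of inosine (HxR) plus hypoxanthine (Hx) in the total pool of ATP breakdown products. A minimal sketch of that definition (the concentrations shown are illustrative, not measurements from this study):

```python
# Sketch of the conventional K-value freshness index: the fraction of ATP
# degradation products that has reached inosine (HxR) and hypoxanthine (Hx).
# Lower K means fresher fish. Inputs in any consistent concentration unit.
def k_value(atp, adp, amp, imp, hxr, hx):
    """K value (%) from nucleotide and nucleoside concentrations."""
    return 100.0 * (hxr + hx) / (atp + adp + amp + imp + hxr + hx)

# Illustrative values (umol/g), not data from the swordfish trials:
print(round(k_value(0.1, 0.2, 0.3, 6.0, 1.0, 0.4), 1))  # 17.5
```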
Abstract:
Clonal forestry is the approach used for deployment of Pinus elliottii x P. caribaea hybrids in Queensland, Australia. Clonal forestry relies on the ability to maintain juvenility of stock plants while selections are made in field tests, so that genetic gains are not eroded by the effects of stock plant maturation. Two parallel approaches are employed in Queensland to maintain juvenility of clonal material. Firstly, the ortet and several ramets of each clone are maintained as archive hedges <20-cm height for the duration of field tests. Secondly, shoots from archive hedges are stored in tissue culture at low temperature and low irradiance to slow growth and slow maturation. Once the best clones have been identified, production hedges are derived from both archive hedges and tissue culture shoots. About 6 million rooted cuttings are produced annually, representing almost the entire planting program of Pinus in subtropical Queensland.
Abstract:
Cultivation and cropping of soils result in a decline in soil organic carbon and soil nitrogen, and can lead to reduced crop yields. The CENTURY model was used to simulate the effects of continuous cultivation and cereal cropping on total soil organic matter (C and N), carbon pools, nitrogen mineralisation, and crop yield at 6 locations in southern Queensland. The model was calibrated for each replicate from the original datasets, allowing comparisons for each replicate rather than site averages. The CENTURY model was able to satisfactorily predict the impact of long-term cultivation and cereal cropping on total organic carbon, but was less successful in simulating the different fractions and nitrogen mineralisation. The model initially over-predicted the pre-cropping soil carbon and nitrogen concentrations of the sites. To account for the unique shrinking and swelling characteristics of the Vertosol soils, the default annual decomposition rates of the slow and passive carbon pools were doubled, after which the model accurately predicted initial conditions. The ability of the model to predict carbon pool fractions varied, demonstrating the difficulty inherent in predicting the size of these conceptual pools. The strength of the model lies in its ability to closely predict the starting soil organic matter conditions, and to predict the impact of clearing, cultivation, fertiliser application, and continuous cropping on total soil carbon and nitrogen.
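CENTURY-style models represent soil carbon as conceptual pools (active, slow, passive) that each decay at a first-order rate, which is why doubling the slow and passive rate constants lowers the equilibrium carbon the model predicts. A minimal sketch of that pool structure, with illustrative rate constants and pool sizes (not CENTURY's defaults, and omitting the climate/texture modifiers and inter-pool transfers of the real model):

```python
# Sketch: first-order decay of conceptual soil carbon pools, the structure
# underlying CENTURY-style models. Annual time step; transfers between pools
# and environmental modifiers are omitted for brevity. Values illustrative.
def step_pools(pools, rates, dt=1.0):
    """Decay each pool by its first-order rate; return new sizes and C lost."""
    new_pools, lost = {}, 0.0
    for name, c in pools.items():
        d = c * rates[name] * dt
        new_pools[name] = c - d
        lost += d
    return new_pools, lost

pools = {"active": 2.0, "slow": 30.0, "passive": 18.0}   # t C/ha, illustrative
rates = {"active": 0.7, "slow": 0.02, "passive": 0.001}  # per year, illustrative
# Doubling the slow and passive rates, as done for the Vertosols in the study,
# makes those pools shed carbon twice as fast per step:
faster = {**rates, "slow": rates["slow"] * 2, "passive": rates["passive"] * 2}
```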
Abstract:
No-tillage (NT) practice, where straw is retained on the soil surface, is increasingly being used in cereal cropping systems in Australia and elsewhere. Compared to conventional tillage (CT), where straw is mixed with the ploughed soil, NT practice may reduce straw decomposition, increase nitrogen immobilisation and increase organic carbon in the soil. This study examined 15N-labelled wheat straw (stubble) decomposition in four treatments (NT v. CT, with N rates of 0 and 75 kg/ha.year) and assessed the tillage and fertiliser N effects on mineral N and organic C and N levels over a 10-year period in a field experiment. NT practice decreased the rate of straw decomposition while fertiliser N application increased it. However, there was no tillage practice x N interaction. The mean residence time of the straw N in soil was more than twice as long under the NT (1.2 years) as compared to the CT practice (0.5 years). In comparison, differences in mean residence time due to N fertiliser treatment were small. However, tillage had generally very little effect on either the amounts of mineral N at sowing or soil organic C (and N) over the study period. While application of N fertiliser increased mineral N, it had very little effect on organic C over a 10-year period. Relatively rapid decomposition of straw and short mean residence time of straw N in a Vertisol is likely to have very little long-term effect on N immobilisation and organic C level in an annual cereal cropping system in a subtropical, semiarid environment. Thus, changing the tillage practice from CT to NT may not necessitate additional N requirement unless use is made of additional stored water in the soil or mineral N loss due to increased leaching is compensated for in N supply to crops.
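For first-order decomposition, the mean residence time (MRT) is the reciprocal of the decay rate constant, so the fraction of straw N remaining after time t is exp(-t/MRT). A short sketch using the MRT values reported in the abstract (1.2 years under NT, 0.5 years under CT):

```python
# Sketch: fraction of straw N remaining under first-order decomposition,
# using the mean residence times reported in the abstract.
from math import exp

def fraction_remaining(t_years, mrt_years):
    """Fraction of the original straw N pool left after t_years."""
    return exp(-t_years / mrt_years)

# After one year, substantially more straw N remains under no-tillage
# (MRT 1.2 y) than under conventional tillage (MRT 0.5 y):
nt = fraction_remaining(1.0, 1.2)  # ~0.43
ct = fraction_remaining(1.0, 0.5)  # ~0.14
```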
Abstract:
Parthenium hysterophorus L. (Asteraceae) is a weed of national significance in Australia. Among the several arthropod agents introduced into Australia to control populations of P. hysterophorus biologically, Epiblema strenuana Walker (Lepidoptera: Tortricidae) is the most widespread and abundant agent. By intercepting the normal transport mechanisms of P. hysterophorus, the larvae of E. strenuana drain nutrients, other metabolic products, and energy, and place the host plant under intense metabolic stress. In this study, determinations of total non-structural carbohydrate (TNC) levels and carbon and nitrogen isotope ratios of fixed products in different parts of the plant tissue, including the gall, were made to establish the function of the gall as a sink for nutrients. Values of δ13C and δ15N in galls were significantly different from those in proximal and distal stems, whereas differences in TNC levels were not significant when measured across the total population of P. hysterophorus, regardless of plant age. However, carbon, nitrogen, and TNC signatures differed significantly when assayed in different developmental stages of P. hysterophorus. Carbon isotope ratios in galls were consistently more negative than those from the compared plant organs. Nitrogen isotope ratios in galls, on the contrary, were either similar to or less negative than those of the compared plant organs, especially within a single host-plant stage population (i.e., either rosette, preflowering, or flowering stage). TNC levels varied within compared plant populations. The stem distal to the gall functioned more efficiently as a nodal channel than the stem proximal to the gall, especially in the translocation of nitrogenous nutrients. Our findings indicate that the gall induced by E. strenuana functions as a sink for the assayed nutrients, although some variations were observed in the patterns of nutrient mobilization. By creating a sink for the nutrients in the gall, E. strenuana is able to place the overall plant metabolism under stress, and this ability indicates that E. strenuana has the necessary potential for use as a biological-control agent.
Abstract:
BACKGROUND: Wheat can be stored for many months before being fumigated with phosphine to kill insects, so a study was undertaken to investigate whether the sorptive capacity of wheat changes as it ages. Wheat was stored at 15 or 25°C and 55% RH for up to 5.5 months, and samples were fumigated at intervals to determine sorption. Sealed glass flasks (95% full) were injected with 1.5 mg L-1 of phosphine based on flask volume. Concentrations were monitored for 11 days beginning 2 h after injection. Some wheat samples were refumigated after a period of ventilation. Several fumigations of wheat were conducted to determine the pattern of sorption during the first 24 h. RESULTS: Phosphine concentration declined exponentially with time from 2 h after injection. Rate of sorption decreased with time spent in storage at either 15 or 25°C and 55% RH. Rate of sorption tended to be lower when wheat was refumigated, but this could be explained by time in storage rather than by refumigation per se. The data from the 24 h fumigations did not fit a simple exponential decay equation. Instead, there was a rapid decline in the first hour, with phosphine concentration falling much more slowly thereafter. CONCLUSIONS: The results have implications for phosphine fumigation of insects in stored wheat. Both the time wheat has spent in storage and the temperature at which it has been stored are factors that must be considered when trying to understand the impact of sorption on phosphine concentrations in commercial fumigations.
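The exponential decline reported above corresponds to C(t) = C0 * exp(-k t), so a sorption rate constant k can be estimated from any two concentration measurements. A minimal sketch under that assumption (the concentrations and times below are illustrative, not the study's data):

```python
# Sketch: estimating a first-order sorption rate constant from two phosphine
# concentration measurements, assuming C(t) = C0 * exp(-k t) from 2 h onward.
# Illustrative numbers only.
from math import log

def sorption_rate(c1, t1, c2, t2):
    """First-order rate constant (per day) from two measurements (mg/L, days)."""
    return log(c1 / c2) / (t2 - t1)

k = sorption_rate(1.4, 2 / 24, 0.7, 5.0)  # ~1.4 mg/L at 2 h, 0.7 mg/L at day 5
half_life_days = log(2) / k
```

Note that the rapid initial decline in the first hour means a single-exponential fit like this only applies after the fast early sorption phase, which matches the biphasic pattern the study reports.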
Abstract:
Background and Aims: Success of invasive plant species is thought to be linked with their higher leaf carbon fixation strategy, enabling them to capture and utilize resources better than native species, and thus pre-empt and maintain space. However, these traits are not well defined for invasive woody vines. Methods: In a glasshouse setting, experiments were conducted to examine how leaf carbon gain strategies differ between non-indigenous invasive and native woody vines of south-eastern Australia, by investigating their biomass gain and leaf structural, nutrient and physiological traits under changing light and moisture regimes. Key Results: Leaf construction cost (CC), calorific value and carbon : nitrogen (C : N) ratio were lower in the invasive group, while ash content, N, maximum photosynthesis, light-use efficiency, photosynthetic energy-use efficiency (PEUE) and specific leaf area (SLA) were higher in this group relative to the native group. Trait plasticity, relative growth rate (RGR), photosynthetic nitrogen-use efficiency and water-use efficiency did not differ significantly between the groups. However, across the light treatments, regression analyses indicated that at a common (same) leaf CC and PEUE, a higher biomass RGR resulted for the invasive group; also, at a common SLA, a lower CC but higher N resulted for the invasive group. Overall, trait co-ordination (using pair-wise correlation analyses) was better in the invasive group. Ordination using 16 leaf traits indicated that the major axis of invasive-native dichotomy is primarily driven by SLA and CC (including its components and/or derivative of PEUE) and was significantly linked with RGR. Conclusions: These results demonstrated that while not all measures of leaf resource traits may differ between the two groups, the higher level of trait correlation and higher revenue returned (RGR) per unit of major resource need (CC) and use (PEUE) in the invasive group is in line with their rapid spread where introduced.
Abstract:
Exotic and invasive woody vines are major environmental weeds of riparian areas, rainforest communities and remnant natural vegetation in coastal eastern Australia, where they smother standing vegetation, including large trees, and cause canopy collapse. We investigated, through glasshouse resource manipulative experiments, the ecophysiological traits that might facilitate faster growth, better resource acquisition and/or utilization and thus dominance of four exotic and invasive vines of South East Queensland, Australia, compared with their native counterparts. Relative growth rate was not significantly different between the two groups but water use efficiency (WUE) was higher in the native species while the converse was observed for light use efficiency (quantum efficiency, AQE) and maximum photosynthesis on a mass basis (Amax mass). The invasive species, as a group, also exhibited higher respiration load, higher light compensation point and higher specific leaf area. There were stronger correlations of leaf traits and greater structural (but not physiological) plasticity in invasive species than in their native counterparts. The scaling coefficients of resource use efficiencies (WUE, AQE and respiration efficiency) as well as those of fitness (biomass accumulated) versus many of the performance traits examined did not differ between the two species-origin groups, but there were indications of significant shifts in elevation (intercept values) and shifts along common slopes in many of these relationships – signalling differences in carbon economy (revenue returned per unit energy invested) and/or resource usage. Using ordination and based on 14 ecophysiological attributes, a fair level of separation between the two groups was achieved (51.5% explanatory power), with AQE, light compensation point, respiration load, WUE, specific leaf area and leaf area ratio, in decreasing order, being the main drivers. 
This study suggests similarity in trait plasticity, especially for physiological traits, but there appear to be fundamental differences in carbon economy and resource conservation between native and invasive vine species.