24 results for Fonction cumulative


Relevance: 10.00%

Abstract:

Dairy farms located in the subtropical cereal belt of Australia rely on winter and summer cereal crops, rather than pastures, for their forage base. Crops are mostly established in tilled seedbeds and the system is vulnerable to fertility decline and water erosion, particularly over summer fallows. Field studies were conducted over 5 years on contrasting soil types, a Vertosol and Sodosol, in the 650-mm annual-rainfall zone to evaluate the benefits of a modified cropping program on forage productivity and the soil-resource base. Growing forage sorghum as a double-crop with oats increased total mean annual production over that of winter sole-crop systems by 40% and 100% on the Vertosol and Sodosol sites, respectively. However, mean annual winter crop yield was halved and overall forage quality was lower. Ninety per cent of the variation in winter crop yield was attributable to fallow and in-crop rainfall. Replacing forage sorghum with the annual legume lablab reduced fertiliser nitrogen (N) requirements and increased forage N concentration, but reduced overall annual yield. Compared with sole-cropped oats, double-cropping reduced the risk of erosion by extending the duration of soil water deficits and increasing the time ground was under plant cover. When grown as a sole-crop, well-fertilised forage sorghum achieved a mean annual cumulative yield of 9.64 and 6.05 t DM/ha on the Vertosol and Sodosol, respectively, about twice that of sole-cropped oats. Forage sorghum established using zero-tillage practices and fertilised at 175 kg N/ha per crop achieved a significantly higher yield and forage N concentration than did the industry-standard forage sorghum (conventional tillage and 55 kg N/ha per crop) on the Vertosol but not on the Sodosol. On the Vertosol, mean annual yield increased from 5.65 to 9.64 t DM/ha (33 kg DM/kg N fertiliser applied above the base rate); the difference in the response between the two sites was attributed to soil type and fertiliser history. Changing both tillage practices and N-fertiliser rate had no effect on fallow water-storage efficiency but did improve fallow ground cover. When forage sorghum, grown as a sole crop, was replaced with lablab in 3 of the 5 years, overall forage N concentration increased significantly, and on the Vertosol, yield and soil nitrate-N reserves also increased significantly relative to industry-standard sorghum. All forage systems maintained or increased the concentration of soil nitrate-N (0-1.2-m soil layer) over the course of the study. Relative to sole-crop oats, alternative forage systems were generally beneficial to the concentration of surface-soil (0-0.1 m) organic carbon, and systems that included sorghum showed most promise for increasing soil organic carbon concentration. We conclude that an emphasis on double- or summer sole-cropping rather than winter sole-cropping will advantage both farm productivity and the soil-resource base.
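A quick arithmetic check of the fertiliser-response figure quoted for the Vertosol, assuming it compares the 55 and 175 kg N/ha per crop rates reported above (a sketch, not the authors' calculation):

```python
# Worked check of the ~33 kg DM/kg N response quoted above (assumption: the
# comparison is between the 55 and 175 kg N/ha per crop rates on the Vertosol).
extra_n = 175 - 55                # kg N/ha per crop applied above the base rate
extra_dm = (9.64 - 5.65) * 1000   # kg DM/ha gained (t DM/ha converted to kg)
print(round(extra_dm / extra_n))  # -> 33 kg DM per kg of additional fertiliser N
```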

Relevance: 10.00%

Abstract:

Phosphine fumigation is commonly used to disinfest grain of insect pests. In fumigations that allow insect survival, the question of whether sublethal exposure to phosphine affects reproduction is important for predicting population recovery and the spread of resistance. Two laboratory experiments addressed this question using strongly phosphine-resistant lesser grain borer, Rhyzopertha dominica (F.). Offspring production was examined in individual females that had been allowed to mate before being fumigated for 48 h at 0.25 mg/L. Surviving females produced offspring, but at a reduced rate during a two-week period post fumigation compared with unfumigated controls. Cumulative fecundity of fumigated females over 4 weeks of oviposition post fumigation was 25% lower than the cumulative fecundity of unfumigated females. Mating potential post fumigation was examined when virgin adults (either or both sexes) were fumigated individually (48 h at 0.25 mg/L) and the survivors were allowed to mate and reproduce in wheat. All mating combinations produced offspring, but production in the first week post fumigation was significantly suppressed compared with the unfumigated controls. Offspring suppression was greatest when both sexes were exposed to phosphine, followed by the pairing of fumigated females with unfumigated males; the least suppression was observed when only males were fumigated. Cumulative fecundity over 4 weeks of oviposition post fumigation for fumigated females paired with fumigated males was 17% lower than the fecundity of unfumigated adult pairings. Both experiments confirmed that sublethal exposure to phosphine can reduce fecundity in R. dominica.

Relevance: 10.00%

Abstract:

This study aimed to unravel the effects of climate, topography, soil, and grazing management on soil organic carbon (SOC) stocks in the grazing lands of north-eastern Australia. We sampled for SOC stocks at 98 sites from 18 grazing properties across Queensland, Australia. These samples covered four nominal grazing management classes (Continuous, Rotational, Cell, and Exclosure), eight broad soil types, and a strong tropical to subtropical climatic gradient. Temperature and vapour-pressure deficit explained >80% of the variability in SOC stocks at cumulative equivalent mineral masses nominally representing 0-0.1 and 0-0.3 m depths. Once detrended of climatic effects, SOC stocks were strongly influenced by total standing dry matter, soil type, and the dominant grass species. At 0-0.3 m depth only, there was a weak negative association between stocking rate and climate-detrended SOC stocks, and Cell grazing was associated with smaller SOC stocks than Continuous grazing and Exclosure. In future, collection of quantitative information on stocking intensity, frequency, and duration may help to improve understanding of the effect of grazing management on SOC stocks. Further exploration of the links between grazing management and above- and below-ground biomass, perhaps inferred through remote sensing and/or simulation modelling, may assist large-area mapping of SOC stocks in northern Australia.
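A minimal sketch of the climate-detrending step described above, assuming hypothetical column names and a hypothetical site-level data file (this is not the authors' code): SOC stocks are regressed on temperature and vapour-pressure deficit, and the residuals are taken as the climate-detrended stocks, which can then be related to management class.

```python
# Illustrative sketch (not the authors' code) of climate detrending: SOC stocks
# are regressed on temperature and vapour-pressure deficit, and the residuals
# are treated as climate-detrended SOC stocks. File and column names are
# hypothetical.
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("soc_sites.csv")  # hypothetical site-level dataset

X = sm.add_constant(df[["mean_temperature", "vapour_pressure_deficit"]])
climate_model = sm.OLS(df["soc_stock_0_30cm"], X).fit()
print(f"Variance explained by climate: R^2 = {climate_model.rsquared:.2f}")

# Climate-detrended SOC stocks are the residuals of the climate-only model;
# these can then be compared across grazing management classes.
df["soc_detrended"] = climate_model.resid
print(df.groupby("grazing_class")["soc_detrended"].mean())
```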

Relevance: 10.00%

Abstract:

The in vivo faecal egg count reduction test (FECRT) is the most commonly used test to detect anthelmintic resistance (AR) in gastrointestinal nematodes (GIN) of ruminants in pasture-based systems. However, there are several variations on the method, some more appropriate than others in specific circumstances. While in some cases labour and time can be saved by collecting only post-drench faecal worm egg counts (FEC) of treatment groups with controls, or pre- and post-drench FEC of a treatment group with no controls, there are circumstances when pre- and post-drench FEC of an untreated control group as well as of the treatment groups are necessary. Computer simulation techniques were used to determine the most appropriate of several methods for calculating AR when there is continuing larval development during the testing period, as often occurs when anthelmintic treatments against genera of GIN with high biotic potential or high re-infection rates, such as Haemonchus contortus of sheep and Cooperia punctata of cattle, are less than 100% efficacious. Three field FECRT experimental designs were investigated: (I) post-drench FEC of treatment and control groups, (II) pre- and post-drench FEC of a treatment group only, and (III) pre- and post-drench FEC of treatment and control groups. To investigate the performance of methods of indicating AR for each of these designs, simulated animal FEC were generated from negative binomial distributions, with subsequent sampling from binomial distributions to account for drench effect, with varying parameters for worm burden, larval development and drench resistance. Calculations of percent reductions and confidence limits were based on those of the Standing Committee for Agriculture (SCA) guidelines. For the two field methods with pre-drench FEC, confidence limits were also determined from cumulative inverse Beta distributions of FEC, for eggs per gram (epg) and the number of eggs counted, at detection levels of 50 and 25. Two rules for determining AR were also assessed: (1) %reduction (%R) <95% and lower confidence limit <90%; and (2) upper confidence limit <95%. For each combination of worm burden, larval development and drench resistance parameters, 1000 simulations were run to determine the number of times the theoretical percent reduction fell within the estimated confidence limits and the number of times resistance would have been declared. When continuing larval development occurs during the testing period of the FECRT, the simulations showed that AR should be calculated from pre- and post-drench worm egg counts of an untreated control group as well as from the treatment group. If the widely used resistance rule 1 is used to assess resistance, rule 2 should also be applied, especially when %R is in the range 90-95% and resistance is suspected.
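A minimal sketch of the kind of simulation described above, with illustrative (hypothetical) values for egg-count mean, aggregation and drench efficacy; it is not the authors' implementation. Individual FEC are drawn from a negative binomial distribution, the drench effect is applied by binomial thinning, and the percent reduction is computed against an untreated control group (design I).

```python
# Hypothetical sketch of the simulation step described above: individual faecal
# egg counts (FEC) are drawn from a negative binomial distribution and the
# drench effect is applied by binomial thinning of each count. All parameter
# values are illustrative, not those used in the study.
import numpy as np

rng = np.random.default_rng(1)

def simulate_fecrt(n_animals=15, mean_epg=500, k=1.0, drench_efficacy=0.92,
                   n_sims=1000):
    """Return simulated percent reductions of a treated group vs. untreated controls."""
    reductions = []
    for _ in range(n_sims):
        # Negative binomial parameterised by mean (mean_epg) and aggregation (k).
        p = k / (k + mean_epg)
        treated_pre = rng.negative_binomial(k, p, n_animals)
        control = rng.negative_binomial(k, p, n_animals)
        # Post-drench counts: each egg survives with probability (1 - efficacy).
        treated_post = rng.binomial(treated_pre, 1 - drench_efficacy)
        # Percent reduction relative to the untreated control group (design I).
        if control.mean() > 0:
            reductions.append(100 * (1 - treated_post.mean() / control.mean()))
    return np.array(reductions)

pct_r = simulate_fecrt()
print(f"median %R = {np.median(pct_r):.1f}; "
      f"%R < 95% in {100 * np.mean(pct_r < 95):.0f}% of runs")
```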

Relevance: 10.00%

Abstract:

Vegetable cropping systems are often characterised by high inputs of nitrogen fertiliser. Elevated emissions of nitrous oxide (N2O) can be expected as a consequence. In order to mitigate N2O emissions from fertilised agricultural fields, the use of nitrification inhibitors, in combination with ammonium-based fertilisers, has been promoted. However, no data are currently available on the use of nitrification inhibitors in sub-tropical vegetable systems. A field experiment was conducted to investigate the effect of the nitrification inhibitor 3,4-dimethylpyrazole phosphate (DMPP) on N2O emissions and yield from broccoli production in sub-tropical Australia. Soil N2O fluxes were monitored continuously (3 h sampling frequency) with fully automated, pneumatically operated measuring chambers linked to a sampling control system and a gas chromatograph. Cumulative N2O emissions over the 5-month observation period amounted to 298 g-N/ha, 324 g-N/ha, 411 g-N/ha and 463 g-N/ha in the conventional fertiliser (CONV), the DMPP treatment (DMPP), the DMPP treatment with a 10% reduced fertiliser rate (DMPP-red) and the zero fertiliser (0N) treatments, respectively. The temporal variation of N2O fluxes showed only low emissions over the broccoli cropping phase, but significantly elevated emissions were observed in all treatments following incorporation of broccoli residues into the soil. Overall, 70–90% of the total emissions occurred in this 5-week fallow phase. There was a significant inhibition effect of DMPP on N2O emissions and soil mineral N content over the broccoli cropping phase, where the application of DMPP reduced N2O emissions by 75% compared with the standard practice. However, there was no statistical difference between the treatments during the fallow phase or when the whole season was considered. This study shows that DMPP has the potential to reduce N2O emissions from intensive vegetable systems, but also highlights the importance of post-harvest emissions from incorporated vegetable residues. N2O mitigation strategies in vegetable systems need to target these post-harvest emissions, and a better evaluation of the effect of nitrification inhibitors over the fallow phase is needed.
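A minimal sketch of how sub-daily chamber fluxes can be aggregated into a cumulative seasonal emission figure such as those above, assuming fluxes expressed in g N2O-N/ha/day at a 3-hour sampling interval and simple trapezoidal integration (an illustration, not the study's own data processing):

```python
# Sketch of cumulative N2O emission from sub-daily chamber fluxes, assuming
# fluxes in g N2O-N/ha/day measured every 3 hours and trapezoidal integration.
# Not the monitoring system's own software; values below are made up.
import numpy as np

def cumulative_n2o(flux_g_n_ha_day, interval_hours=3.0):
    """Trapezoidal integration of a flux time series, returning g N2O-N/ha."""
    flux = np.asarray(flux_g_n_ha_day, dtype=float)
    dt_days = interval_hours / 24.0
    return float(np.sum((flux[1:] + flux[:-1]) / 2.0) * dt_days)

# Example: low background fluxes plus a short post-harvest pulse after residue
# incorporation dominate the seasonal total.
fluxes = np.concatenate([np.full(800, 0.5), np.full(120, 15.0), np.full(300, 0.8)])
print(f"Cumulative emission: {cumulative_n2o(fluxes):.0f} g N2O-N/ha")
```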

Relevance: 10.00%

Abstract:

Hendra virus (HeV), a highly pathogenic zoonotic paramyxovirus that recently emerged from bats, is a major concern to the horse industry in Australia. Previous research has shown that higher temperatures lead to lower virus survival rates in the laboratory. We developed a model of the survival of HeV in the environment as influenced by temperature, and used 20 years of daily temperature data at six locations spanning the geographic range of reported HeV incidents to simulate the temporal and spatial impacts of temperature on HeV survival. At any location, simulated virus survival was greater in winter than in summer, and in any month of the year, survival was higher at higher latitudes. At any location, year-to-year variation in virus survival 24 h post-excretion was substantial and was as large as the difference between locations. Survival was higher in microhabitats with lower than ambient temperature, and when environmental exposure was shorter. The within-year pattern of virus survival mirrored the cumulative within-year occurrence of reported HeV cases, although there were no overall differences in survival between HeV case years and non-case years. The model examines the effect of temperature in isolation; actual virus survivability will reflect the effect of additional environmental factors.
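A hypothetical sketch of a temperature-driven survival calculation in the spirit of the model described above; the first-order decay form and the coefficients are assumptions chosen only to reproduce the qualitative pattern (lower survival at higher temperatures) and are not the published parameterisation:

```python
# Hypothetical temperature-driven survival calculation. The exponential decay
# and the coefficients a, b are illustrative assumptions, not published values.
import numpy as np

def fraction_surviving(daily_temp_c, hours=24.0, a=0.01, b=0.10):
    """Fraction of excreted virus surviving after `hours` at a given temperature,
    assuming a decay rate that increases exponentially with temperature."""
    temp = np.asarray(daily_temp_c, dtype=float)
    decay_rate_per_hour = a * np.exp(b * temp)  # assumed temperature dependence
    return np.exp(-decay_rate_per_hour * hours)

# Example: survival 24 h post-excretion on a cool winter day vs. a hot summer
# day, reproducing the qualitative winter > summer pattern reported above.
print(fraction_surviving([15.0, 35.0]))
```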

Relevance: 10.00%

Abstract:

Indospicine is a non-proteinogenic amino acid that occurs in Indigofera species with widespread prevalence in grazing pastures across tropical Africa, Asia, Australia, and the Americas. It accumulates in the tissues of grazing livestock after ingestion of Indigofera. It is a competitive inhibitor of arginase and causes both liver degeneration and abortion. Indospicine hepatotoxicity occurs universally across animal species, but the degree varies considerably between species, with dogs being particularly sensitive. The magnitude of canine sensitivity is such that ingestion of naturally indospicine-contaminated horse and camel meat has caused secondary poisoning of dogs, raising significant industry concern. The impacts of indospicine on the health and production of grazing animals per se have been less widely documented. Livestock grazing Indigofera have a chronic and cumulative exposure to this toxin, and such exposure has been shown experimentally to induce both hepatotoxicity and embryo-lethal effects in cattle and sheep. In extensive pasture systems, where animals are not closely monitored, the resultant toxicosis may well occur after prolonged exposure but either go undetected or, even if detected, not be attributed to a particular cause. Indospicine should be considered as a possible cause of poor animal performance, particularly reduced weight gain or reproductive losses, in pastures where Indigofera are prevalent.

Relevance: 10.00%

Abstract:

Prescribed fire is one of the most widely used management tools for reducing fuel loads in managed forests. However, the long-term effects of repeated prescribed fires on soil carbon (C) and nitrogen (N) pools are poorly understood. This study aimed to investigate how different fire frequency regimes influence C and N pools in the surface soils (0–10 cm). A prescribed fire field experiment in a wet sclerophyll forest, established in 1972 in southeast Queensland, was used in this study. The fire frequency regimes included long unburnt (NB), burnt every 2 years (2yrB) and burnt every 4 years (4yrB), with four replications. Compared with the NB treatment, the 2yrB treatment lowered soil total C by 44%, total N by 54%, HCl-hydrolysable C and N by 48% and 59%, KMnO4-oxidizable C by 81%, microbial biomass C and N by 42% and 33%, cumulative CO2–C by 28%, NaOCl-non-oxidizable C and N by 41% and 51%, and charcoal-C by 17%. The 4yrB and NB treatments showed no significant differences for these soil C and N pools. All labile, biologically active, recalcitrant and total soil C and N pools were positively correlated with each other and with soil moisture content, but negatively correlated with soil pH. The C:N ratios of the different C and N pools were greater in the burnt treatments than in the NB treatment. This study highlights that prescribed burning at a four-year interval is a more sustainable management practice for this subtropical forest ecosystem.

Relevance: 10.00%

Abstract:

Clays could underpin a viable agricultural greenhouse gas (GHG) abatement technology given their affinity for nitrogen and carbon compounds. We provide the first investigation into the efficacy of clays for decreasing agricultural nitrogen GHG emissions (i.e., N2O and NH3). Via laboratory experiments using an automated closed-vessel analysis system, we tested the capacity of two clays (vermiculite and bentonite) to decrease N2O and NH3 emissions and organic carbon losses from livestock manures (beef, pig, poultry, and egg layer) incorporated into an agricultural soil. Clay addition levels varied, with a maximum clay-to-manure ratio of 1:1 (dry weight). Cumulative gas emissions were modeled using the biological logistic function, with 15 of 16 treatments successfully fitted (P < 0.05) by this model. When assessing all of the manures together, NH3 emissions were lower (×2) at the highest clay addition level than with no clay addition, but this difference was not significant (P = 0.17). Nitrous oxide emissions were significantly lower (×3; P < 0.05) at the highest clay addition level than with no clay addition. When assessing manures individually, we observed generally decreasing trends in NH3 and N2O emissions with increasing clay addition, albeit with widely varying statistical significance between manure types. Most of the treatments also showed strong evidence of increased C retention with increasing clay additions, with up to 10 times more carbon retained in treatments containing clay than in treatments containing no clay. This preliminary assessment of the efficacy of clays to mitigate agricultural GHG emissions indicates strong promise.
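A minimal sketch of fitting a logistic function to a cumulative-emission time series as described above, with made-up data points and starting values (not the authors' fitting code):

```python
# Illustrative fit of a logistic function to cumulative gas-emission data.
# The data points, units and starting values are hypothetical.
import numpy as np
from scipy.optimize import curve_fit

def logistic(t, c_max, k, t_mid):
    """Logistic cumulative emission: asymptote c_max, rate k, inflection time t_mid."""
    return c_max / (1.0 + np.exp(-k * (t - t_mid)))

days = np.array([0, 3, 7, 10, 14, 21, 28, 35, 42], dtype=float)
cumulative = np.array([0.1, 0.8, 3.5, 7.0, 10.5, 13.0, 13.8, 14.0, 14.1])  # hypothetical units

params, _ = curve_fit(logistic, days, cumulative, p0=[14.0, 0.3, 10.0])
print("Fitted c_max, k, t_mid:", np.round(params, 2))
```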