979 results for "Fonction cumulative"


Relevance:

10.00%

Publisher:

Abstract:

In recent years, mirids and stinkbugs have emerged as important sucking pests of cotton. While stinkbugs damage bolls, mirids damage seedlings, squares and bolls. With the increasing adoption of Bollgard II and IPM approaches, the use of broad-spectrum chemicals to kill Helicoverpa has been reduced, and as a result mirids and stinkbugs are building to damaging levels at later crop growth stages. Studies on stinkbugs by Dr Moazzem Khan revealed that the green vegetable bug (GVB) causes significant boll damage and yield loss. A preliminary study by Dr Khan on mirids revealed that high mirid numbers at later growth stages also cause significant boll damage, and that the damage caused by mirids and GVB is similar. Mirids and stinkbugs therefore demand greater attention in order to minimise the losses they cause and to develop IPM strategies against them, protecting the gains in IPM already made with Bt-transgenic cotton. Progress in this area of research will help maintain the sustainability and profitability of the Australian cotton industry. Mirid damage at early growth stages of cotton (up to squaring) has been studied in detail by Dr Khan. He found that all ages of mirids damage young plants and that damage by mirid nymphs is cumulative, with maximum damage occurring when the insect reaches the 4th and 5th nymphal stages. He also found that mirid feeding causes shedding of small and medium squares, and that damaged large squares develop into 'parrot beak' bolls. Detailed studies at the boll stage, such as which mirid stage is most damaging or which boll age is most vulnerable to feeding, are lacking. This information is a prerequisite to developing an IPM strategy for the pest at later crop growth stages. Understanding population change of the pest over time in relation to crop development, which is also lacking for mirids in Bollgard II, is likewise essential for developing management strategies. 
Predators and parasitoids are integral components of any IPM system and play an important part in regulating pest populations. Some generalist predators, such as ants, spiders, damsel bugs and assassin bugs, are known to prey on mirids. Nothing is known about parasitoids of mirids. Since the green mirid (GM), Creontiades dilutus, is indigenous to Australia, it is likely that one or more parasitoids of this mirid exist in Australia, but that possibility has not yet been investigated. The impact of the GVB adult parasitoid, Trichopoda giacomellii, has been studied by Dr Khan, who found that the fly is established in the release areas and continues to spread. However, for a wider and greater impact, the fly should be released in new locations across the valleys. The insecticides registered for mirids and stinkbugs are mostly non-selective and are extremely disruptive to a wide range of beneficial insects. Using these insecticides at stages I and II would undermine existing IPM programs. Less disruptive control tactics, including soft chemicals, are therefore needed for mirids and stinkbugs. Along with soft chemicals, salt mixtures, biopesticides based on fungal pathogens and attractants based on plant volatiles may be useful tools for managing mirids and stinkbugs with little or no disruption. Dr Khan has investigated salt mixtures against mirids and GVB. While salt mixtures are quite effective and less disruptive, they are highly chemical-specific: not all chemicals mixed with salt will give the desired benefit. Further investigation is therefore needed to identify the chemicals that are effective in salt mixtures against mirids and GVB. Dr Caroline Hauxwell of DPI&F is working on fungal pathogen-based biopesticides against mirids and GVB, and Drs Peter Gregg and Alice Del Socorro of the Australian Cotton CRC are working on plant volatile-based attractants against mirids. 
Depending on their findings, fungal-based biopesticides and plant volatile-based attractants could become important components of an IPM approach to managing mirids and stinkbugs in cotton.


The effectiveness of pre-plant dips of crowns in potassium phosphonate and phosphorous acid was investigated systematically to develop an effective strategy for the control of root and heart rot diseases caused by Phytophthora cinnamomi in the pineapple hybrids 'MD2' and '73-50' and the cultivar 'Smooth Cayenne'. Our results clearly indicate that a high-volume spray at planting was much less effective than a pre-plant dip. 'Smooth Cayenne' was more resistant to heart rot than 'MD2' and '73-50', and was more responsive to treatment with potassium phosphonate. Based on cumulative heart rot incidence over time, 'MD2' was more susceptible to heart rot than '73-50' and was more responsive to an application of phosphorous acid. The highest levels of phosphonate in roots were reached one month after planting, and levels declined during the next two months. Pre-plant dipping of crowns is highly effective in controlling root and heart rot in the first few months, but is not sufficient to maintain the health of the mother plant's root system up until plant crop harvest when weather conditions continue to favour infection.


Dairy farms located in the subtropical cereal belt of Australia rely on winter and summer cereal crops, rather than pastures, for their forage base. Crops are mostly established in tilled seedbeds and the system is vulnerable to fertility decline and water erosion, particularly over summer fallows. Field studies were conducted over 5 years on contrasting soil types, a Vertosol and a Sodosol, in the 650-mm annual-rainfall zone to evaluate the benefits of a modified cropping program on forage productivity and the soil-resource base. Growing forage sorghum as a double-crop with oats increased total mean annual production over that of winter sole-crop systems by 40% and 100% on the Vertosol and Sodosol sites, respectively. However, mean annual winter crop yield was halved and overall forage quality was lower. Ninety per cent of the variation in winter crop yield was attributable to fallow and in-crop rainfall. Replacing forage sorghum with the annual legume lablab reduced fertiliser nitrogen (N) requirements and increased forage N concentration, but reduced overall annual yield. Compared with sole-cropped oats, double-cropping reduced the risk of erosion by extending the duration of soil water deficits and increasing the time ground was under plant cover. When grown as a sole-crop, well-fertilised forage sorghum achieved a mean annual cumulative yield of 9.64 and 6.05 t DM/ha on the Vertosol and Sodosol, respectively, being about twice that of sole-cropped oats. Forage sorghum established using zero-tillage practices and fertilised at 175 kg N/ha per crop achieved a significantly higher yield and forage N concentration than did the industry-standard forage sorghum (conventional tillage and 55 kg N/ha per crop) on the Vertosol but not on the Sodosol. On the Vertosol, mean annual yield increased from 5.65 to 9.64 t DM/ha (33 kg DM/kg N fertiliser applied above the base rate); the difference in the response between the two sites was attributed to soil type and fertiliser history. 
Changing both tillage practices and N-fertiliser rate had no effect on fallow water-storage efficiency but did improve fallow ground cover. When forage sorghum, grown as a sole crop, was replaced with lablab in 3 of the 5 years, overall forage N concentration increased significantly, and on the Vertosol, yield and soil nitrate-N reserves also increased significantly relative to industry-standard sorghum. All forage systems maintained or increased the concentration of soil nitrate-N (0-1.2-m soil layer) over the course of the study. Relative to sole-crop oats, alternative forage systems were generally beneficial to the concentration of surface-soil (0-0.1 m) organic carbon, and systems that included sorghum showed most promise for increasing soil organic carbon concentration. We conclude that an emphasis on double- or summer sole-cropping rather than winter sole-cropping will benefit both farm productivity and the soil-resource base.
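The bracketed fertiliser response on the Vertosol can be verified with one line of arithmetic; a quick check using only the figures reported above:

```python
# Reported Vertosol response: mean annual yield rose from 5.65 to 9.64 t DM/ha
# when the N rate rose from the 55 kg N/ha base to 175 kg N/ha per crop.
yield_gain_kg = (9.64 - 5.65) * 1000   # t DM/ha converted to kg DM/ha
extra_n = 175 - 55                     # kg N/ha applied above the base rate
print(round(yield_gain_kg / extra_n))  # 33 kg DM per kg N, matching the text
```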


Phosphine fumigation is commonly used to disinfest grain of insect pests. In fumigations that allow insect survival, the question of whether sublethal exposure to phosphine affects reproduction is important for predicting population recovery and the spread of resistance. Two laboratory experiments addressed this question using strongly phosphine-resistant lesser grain borer, Rhyzopertha dominica (F.). Offspring production was examined in individual females that had been allowed to mate before being fumigated for 48 h at 0.25 mg/L. Surviving females produced offspring, but at a reduced rate during the two-week period post-fumigation compared with unfumigated controls. Cumulative fecundity of fumigated females over 4 weeks of oviposition post-fumigation was 25% lower than that of unfumigated females. Mating potential post-fumigation was examined when virgin adults (either or both sexes) were fumigated individually (48 h at 0.25 mg/L) and the survivors were allowed to mate and reproduce in wheat. All mating combinations produced offspring, but production in the first week post-fumigation was significantly suppressed compared with the unfumigated controls. Offspring suppression was greatest when both sexes were exposed to phosphine, followed by the pairing of fumigated females with unfumigated males; the least suppression was observed when only males were fumigated. Cumulative fecundity over 4 weeks of oviposition post-fumigation of fumigated females paired with fumigated males was 17% lower than the fecundity of unfumigated adult pairings. Both experiments confirmed that sublethal exposure to phosphine can reduce fecundity in R. dominica.


This study addresses three important issues in tree bucking optimization in the context of cut-to-length harvesting. (1) Would the fit between the log demand and log output distributions be better if the price and/or demand matrices controlling the bucking decisions on modern cut-to-length harvesters were adjusted to the unique conditions of each individual stand? (2) In what ways can we generate stand- and product-specific price and demand matrices? (3) What alternatives do we have to measure the fit between the log demand and log output distributions, and what would be an ideal goodness-of-fit measure? Three iterative search systems were developed for seeking stand-specific price and demand matrix sets: (1) a fuzzy logic control system for calibrating the price matrix of one log product for one stand at a time (the stand-level one-product approach); (2) a genetic algorithm system for adjusting the price matrices of one log product in parallel for several stands (the forest-level one-product approach); and (3) a genetic algorithm system for dividing the overall demand matrix of each of several log products into stand-specific sub-demands simultaneously for several stands and products (the forest-level multi-product approach). The stem material used for testing the performance of the stand-specific price and demand matrices against that of the reference matrices comprised 9155 Norway spruce (Picea abies (L.) Karst.) sawlog stems gathered by harvesters from 15 mature spruce-dominated stands in southern Finland. The reference price and demand matrices were either direct copies or slightly modified versions of those used by two Finnish sawmilling companies. Two types of stand-specific bucking matrices were compiled for each log product: one from the harvester-collected stem profiles and the other from the pre-harvest inventory data. 
Four goodness-of-fit measures were analyzed for their appropriateness in determining the similarity between the log demand and log output distributions: (1) the apportionment degree (index), (2) the chi-square statistic, (3) the Laspeyres quantity index, and (4) the price-weighted apportionment degree. The study confirmed that any improvement in the fit between the log demand and log output distributions can only be realized at the expense of log volumes produced. Stand-level pre-control of price matrices was found to be advantageous, provided the control is done with perfect stem data. Forest-level pre-control of price matrices resulted in no improvement in the cumulative apportionment degree. Cutting stands under the control of stand-specific demand matrices yielded a better total fit between the demand and output matrices at the forest level than was obtained by cutting each stand with non-stand-specific reference matrices. The theoretical and experimental analyses suggest that none of the three alternative goodness-of-fit measures clearly outperforms the traditional apportionment degree measure. Keywords: harvesting, tree bucking optimization, simulation, fuzzy control, genetic algorithms, goodness-of-fit
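In its usual formulation, the apportionment degree compares the two distributions class by class: it is the sum, over all log length/diameter classes, of the smaller of the demand share and the output share (1.0 means a perfect fit). A minimal sketch with hypothetical matrices; the function name and volumes are illustrative, not taken from the study:

```python
def apportionment_degree(demand, output):
    """Sum over classes of min(demand share, output share); 1.0 = perfect fit.

    Both arguments map a (length, diameter) log class to a volume.
    """
    d_total = sum(demand.values())
    o_total = sum(output.values())
    classes = set(demand) | set(output)
    return sum(min(demand.get(c, 0.0) / d_total, output.get(c, 0.0) / o_total)
               for c in classes)

# Hypothetical demand and output volumes for three log classes:
demand = {(43, 20): 30.0, (43, 26): 50.0, (49, 20): 20.0}
output = {(43, 20): 25.0, (43, 26): 60.0, (49, 20): 15.0}
print(apportionment_degree(demand, output))  # 0.25 + 0.50 + 0.15 = 0.90
```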


This study aimed to unravel the effects of climate, topography, soil, and grazing management on soil organic carbon (SOC) stocks in the grazing lands of north-eastern Australia. We sampled for SOC stocks at 98 sites from 18 grazing properties across Queensland, Australia. These samples covered four nominal grazing management classes (Continuous, Rotational, Cell, and Exclosure), eight broad soil types, and a strong tropical to subtropical climatic gradient. Temperature and vapour-pressure deficit explained >80% of the variability of SOC stocks at cumulative equivalent mineral masses nominally representing 0-0.1 and 0-0.3 m depths. Once detrended of climatic effects, SOC stocks were strongly influenced by total standing dry matter, soil type, and the dominant grass species. At 0-0.3 m depth only, there was a weak negative association between stocking rate and climate-detrended SOC stocks, and Cell grazing was associated with smaller SOC stocks than Continuous grazing and Exclosure. In future, collection of quantitative information on stocking intensity, frequency, and duration may help to improve understanding of the effect of grazing management on SOC stocks. Further exploration of the links between grazing management and above- and below-ground biomass, perhaps inferred through remote sensing and/or simulation modelling, may assist large-area mapping of SOC stocks in northern Australia. © CSIRO 2013.


- Background In the UK, women aged 50–73 years are invited for screening by mammography every 3 years. In 2009–10, more than 2.24 million women in this age group in England were invited to take part in the programme, of whom 73% attended a screening clinic. Of these, 64,104 women were recalled for assessment. Of those recalled, 81% did not have breast cancer; these women are described as having a false-positive mammogram. - Objective The aim of this systematic review was to identify the psychological impact on women of false-positive screening mammograms and any evidence for the effectiveness of interventions designed to reduce this impact. We were also looking for evidence of effects in subgroups of women. - Data sources MEDLINE, MEDLINE In-Process & Other Non-Indexed Citations, EMBASE, Health Management Information Consortium, Cochrane Central Register for Controlled Trials, Cochrane Database of Systematic Reviews, Centre for Reviews and Dissemination (CRD) Database of Abstracts of Reviews of Effects, CRD Health Technology Assessment (HTA), Cochrane Methodology, Web of Science, Science Citation Index, Social Sciences Citation Index, Conference Proceedings Citation Index-Science, Conference Proceeding Citation Index-Social Science and Humanities, PsycINFO, Cumulative Index to Nursing and Allied Health Literature, Sociological Abstracts, the International Bibliography of the Social Sciences, the British Library's Electronic Table of Contents and others. Initial searches were carried out between 8 October 2010 and 25 January 2011. Update searches were carried out on 26 October 2011 and 23 March 2012. - Review methods Based on the inclusion criteria, titles and abstracts were screened independently by two reviewers. Retrieved papers were reviewed and selected using the same independent process. Data were extracted by one reviewer and checked by another. Each included study was assessed for risk of bias. 
- Results Eleven studies were found from 4423 titles and abstracts. Studies that used disease-specific measures found a negative psychological impact lasting up to 3 years. Distress increased with the level of invasiveness of the assessment procedure. Studies using instruments designed to detect clinical levels of morbidity did not find this effect. Women with false-positive mammograms were less likely to return for the next round of screening [relative risk (RR) 0.97; 95% confidence interval (CI) 0.96 to 0.98] than those with normal mammograms, were more likely to have interval cancer [odds ratio (OR) 3.19 (95% CI 2.34 to 4.35)] and were more likely to have cancer detected at the next screening round [OR 2.15 (95% CI 1.55 to 2.98)]. - Limitations This study was limited to UK research and by the robustness of the included studies, which frequently failed to report quality indicators, for example failure to consider the risk of bias or confounding, or failure to report participants' demographic characteristics. - Conclusions We conclude that the experience of having a false-positive screening mammogram can cause breast cancer-specific psychological distress that may endure for up to 3 years, and reduce the likelihood that women will return for their next round of mammography screening. These results should be treated cautiously owing to inherent weakness of observational designs and weaknesses in reporting. Future research should include a qualitative interview study and observational studies that compare generic and disease-specific measures, collect demographic data and include women from different social and ethnic groups.


- Background Nilotinib and dasatinib are now being considered as alternative treatments to imatinib for the first-line treatment of chronic myeloid leukaemia (CML). - Objective This technology assessment reviews the available evidence for the clinical effectiveness and cost-effectiveness of dasatinib, nilotinib and standard-dose imatinib for the first-line treatment of Philadelphia chromosome-positive CML. - Data sources Databases [including MEDLINE (Ovid), EMBASE, Current Controlled Trials, ClinicalTrials.gov, the US Food and Drug Administration website and the European Medicines Agency website] were searched from the search end date of the last technology appraisal report on this topic (October 2002) to September 2011. - Review methods A systematic review of clinical effectiveness and cost-effectiveness studies; a review of surrogate relationships with survival; a review and critique of manufacturer submissions; and a model-based economic analysis. - Results Two clinical trials (dasatinib vs imatinib and nilotinib vs imatinib) were included in the effectiveness review. Survival was not significantly different for dasatinib or nilotinib compared with imatinib with the 24-month follow-up data available. The rates of complete cytogenetic response (CCyR) and major molecular response (MMR) were higher for patients receiving dasatinib than for those receiving imatinib at 12 months' follow-up (CCyR 83% vs 72%, p < 0.001; MMR 46% vs 28%, p < 0.0001). The rates of CCyR and MMR were higher for patients receiving nilotinib than for those receiving imatinib at 12 months' follow-up (CCyR 80% vs 65%, p < 0.001; MMR 44% vs 22%, p < 0.0001). An indirect comparison analysis showed no difference between dasatinib and nilotinib in CCyR or MMR rates at 12 months' follow-up (CCyR, odds ratio 1.09, 95% CI 0.61 to 1.92; MMR, odds ratio 1.28, 95% CI 0.77 to 2.16). 
There is observational association evidence from imatinib studies supporting the use of CCyR and MMR at 12 months as surrogates for overall all-cause survival and progression-free survival in patients with CML in chronic phase. In the cost-effectiveness modelling, scenario analyses were provided to reflect the extensive structural uncertainty and different approaches to estimating overall survival (OS). First-line dasatinib is predicted to provide very poor value for money compared with first-line imatinib, with deterministic incremental cost-effectiveness ratios (ICERs) of between £256,000 and £450,000 per quality-adjusted life-year (QALY). Conversely, first-line nilotinib provided favourable ICERs at the willingness-to-pay threshold of £20,000-30,000 per QALY. - Limitations Immaturity of empirical trial data relative to life expectancy, forcing reliance either on surrogate relationships or on cumulative survival/treatment duration assumptions. - Conclusions From the two trials available, dasatinib and nilotinib have a statistically significant advantage over imatinib as measured by MMR or CCyR. Taking into account the treatment pathways for patients with CML, i.e. assuming the use of second-line nilotinib, first-line nilotinib appears to be more cost-effective than first-line imatinib. Dasatinib was not cost-effective compared with imatinib and nilotinib if decision thresholds of £20,000 or £30,000 per QALY were used. Uncertainty in the cost-effectiveness analysis would be substantially reduced with better and more UK-specific data on the incidence and cost of stem cell transplantation in patients with chronic CML. - Funding The Health Technology Assessment Programme of the National Institute for Health Research.
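The ICERs quoted above follow the standard incremental cost-effectiveness arithmetic, ICER = (difference in cost) / (difference in QALYs) between two treatments. A minimal sketch; the costs and QALYs below are hypothetical, not figures from the assessment:

```python
def icer(cost_new, qaly_new, cost_old, qaly_old):
    """Incremental cost-effectiveness ratio: extra cost per extra QALY."""
    return (cost_new - cost_old) / (qaly_new - qaly_old)

# Hypothetical comparison: the new drug costs 35,000 more and adds 0.7 QALYs,
# so it costs about 50,000 per QALY, above a 20,000-30,000 threshold.
print(round(icer(cost_new=95_000, qaly_new=8.2, cost_old=60_000, qaly_old=7.5)))
```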


The in vivo faecal egg count reduction test (FECRT) is the most commonly used test for detecting anthelmintic resistance (AR) in gastrointestinal nematodes (GIN) of ruminants in pasture-based systems. However, there are several variations on the method, some more appropriate than others in specific circumstances. While in some cases labour and time can be saved by collecting only post-drench faecal worm egg counts (FEC) of treatment groups with controls, or pre- and post-drench FEC of a treatment group with no controls, there are circumstances when pre- and post-drench FEC of an untreated control group as well as of the treatment groups are necessary. Computer simulation techniques were used to determine the most appropriate of several methods for calculating AR when there is continuing larval development during the testing period, as often occurs when anthelmintic treatments against genera of GIN with high biotic potential or high re-infection rates, such as Haemonchus contortus of sheep and Cooperia punctata of cattle, are less than 100% efficacious. Three field FECRT experimental designs were investigated: (I) post-drench FEC of treatment and control groups; (II) pre- and post-drench FEC of a treatment group only; and (III) pre- and post-drench FEC of treatment and control groups. To investigate the performance of methods of indicating AR for each of these designs, simulated animal FEC were generated from negative binomial distributions, with subsequent sampling from binomial distributions to account for drench effect, and with varying parameters for worm burden, larval development and drench resistance. Calculations of percent reductions and confidence limits were based on those of the Standing Committee for Agriculture (SCA) guidelines. For the two field methods with pre-drench FEC, confidence limits were also determined from cumulative inverse Beta distributions of FEC, for eggs per gram (epg) and for the number of eggs counted, at detection levels of 50 and 25. 
Two rules for declaring AR were also assessed: (1) %reduction (%R) < 95% and lower confidence limit < 90%; and (2) upper confidence limit < 95%. For each combination of worm burden, larval development and drench resistance parameters, 1000 simulations were run to determine the number of times the theoretical percent reduction fell within the estimated confidence limits and the number of times resistance would have been declared. When continuing larval development occurs during the testing period of the FECRT, the simulations showed that AR should be calculated from pre- and post-drench worm egg counts of an untreated control group as well as from the treatment group. If the widely used resistance rule 1 is used to assess resistance, rule 2 should also be applied, especially when %R is in the range 90-95% and resistance is suspected.
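For design III (pre- and post-drench counts in both treatment and control groups), percent reduction is commonly corrected for the control group's change over the test period. A sketch assuming the widely cited formula %R = 100 × (1 − (T2/T1) × (C1/C2)), where T1/T2 are the treatment group's pre/post-drench mean FEC and C1/C2 the control group's; the egg counts are illustrative, and the SCA guidelines give the exact calculation used in the study:

```python
def percent_reduction(t_pre, t_post, c_pre, c_post):
    """Control-corrected %R = 100 * (1 - (T2/T1) * (C1/C2)) (assumed formula)."""
    return 100.0 * (1.0 - (t_post / t_pre) * (c_pre / c_post))

# Illustrative mean FECs (eggs per gram): continuing larval development doubles
# the control count (400 -> 800 epg) while the treated group falls 400 -> 40 epg.
print(percent_reduction(t_pre=400, t_post=40, c_pre=400, c_post=800))  # 95.0
```

Without the control correction, the same treatment counts would suggest a 90% reduction; the rising control group shows part of that apparent survival is new larval development rather than drench failure.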


Vegetable cropping systems are often characterised by high inputs of nitrogen fertiliser. Elevated emissions of nitrous oxide (N2O) can be expected as a consequence. In order to mitigate N2O emissions from fertilised agricultural fields, the use of nitrification inhibitors in combination with ammonium-based fertilisers has been promoted. However, no data are currently available on the use of nitrification inhibitors in sub-tropical vegetable systems. A field experiment was conducted to investigate the effect of the nitrification inhibitor 3,4-dimethylpyrazole phosphate (DMPP) on N2O emissions and yield from broccoli production in sub-tropical Australia. Soil N2O fluxes were monitored continuously (3 h sampling frequency) with fully automated, pneumatically operated measuring chambers linked to a sampling control system and a gas chromatograph. Cumulative N2O emissions over the 5-month observation period amounted to 298 g-N/ha, 324 g-N/ha, 411 g-N/ha and 463 g-N/ha in the conventional fertiliser (CONV), DMPP, DMPP with a 10% reduced fertiliser rate (DMPP-red) and zero fertiliser (0N) treatments, respectively. The temporal variation of N2O fluxes showed only low emissions over the broccoli cropping phase, but significantly elevated emissions were observed in all treatments after broccoli residues were incorporated into the soil. Overall, 70–90% of the total emissions occurred in this 5-week fallow phase. There was a significant inhibition effect of DMPP on N2O emissions and soil mineral N content over the broccoli cropping phase, where the application of DMPP reduced N2O emissions by 75% compared with the standard practice. However, there was no statistical difference between the treatments during the fallow phase or when the whole season was considered. 
This study shows that DMPP has the potential to reduce N2O emissions from intensive vegetable systems, but also highlights the importance of post-harvest emissions from incorporated vegetable residues. N2O mitigation strategies in vegetable systems need to target these post-harvest emissions, and a better evaluation of the effect of nitrification inhibitors over the fallow phase is needed.


Hendra virus (HeV), a highly pathogenic zoonotic paramyxovirus that recently emerged from bats, is a major concern to the horse industry in Australia. Previous research has shown that higher temperatures led to lower virus survival rates in the laboratory. We develop a model of survival of HeV in the environment as influenced by temperature. We used 20 years of daily temperature at six locations spanning the geographic range of reported HeV incidents to simulate the temporal and spatial impacts of temperature on HeV survival. At any location, simulated virus survival was greater in winter than in summer, and in any month of the year, survival was higher at higher latitudes. At any location, year-to-year variation in virus survival 24 h post-excretion was substantial and was as large as the difference between locations. Survival was higher in microhabitats with lower-than-ambient temperature, and when environmental exposure was shorter. The within-year pattern of virus survival mirrored the cumulative within-year occurrence of reported HeV cases, although there were no overall differences in survival between HeV case years and non-case years. The model examines the effect of temperature in isolation; actual virus survivability will reflect the effect of additional environmental factors.
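The fitted model itself is not reproduced here, but the qualitative behaviour described (lower survival at higher temperature, higher survival in cooler microhabitats) can be illustrated with a simple exponential-decay sketch; the decay rate, Q10 value and reference temperature below are assumptions for illustration, not the authors' parameters:

```python
import math

def survival_24h(temp_c, k_ref=0.05, q10=3.0, t_ref=20.0):
    """Fraction of virus surviving 24 h at temp_c, assuming exponential decay
    with a rate that increases q10-fold per 10 degree C rise (illustrative)."""
    k = k_ref * q10 ** ((temp_c - t_ref) / 10.0)  # assumed decay rate per hour
    return math.exp(-k * 24.0)

# Cooler conditions give higher simulated 24 h survival:
print(survival_24h(15.0) > survival_24h(30.0))  # True
```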


Indospicine is a non-proteinogenic amino acid that occurs in Indigofera species, which are widespread in grazing pastures across tropical Africa, Asia, Australia and the Americas. It accumulates in the tissues of grazing livestock after ingestion of Indigofera. It is a competitive inhibitor of arginase and causes both liver degeneration and abortion. Indospicine hepatotoxicity occurs universally across animal species, but the degree varies considerably between species, with dogs being particularly sensitive. The magnitude of canine sensitivity is such that ingestion of naturally indospicine-contaminated horse and camel meat has caused secondary poisoning of dogs, raising significant industry concern. The impacts of indospicine on the health and production of grazing animals per se have been less widely documented. Livestock grazing Indigofera have chronic and cumulative exposure to this toxin, and such exposure has been shown experimentally to induce both hepatotoxic and embryo-lethal effects in cattle and sheep. In extensive pasture systems, where animals are not closely monitored, the resultant toxicosis may well occur after prolonged exposure but either go undetected or, even if detected, not be attributed to a particular cause. Indospicine should be considered as a possible cause of poor animal performance, particularly reduced weight gain or reproductive losses, in pastures where Indigofera are prevalent.


The data obtained in the earlier parts of this series for the donor- and acceptor-end parameters of N–H...O and O–H...O hydrogen bonds have been utilised to obtain a qualitative working criterion for classifying hydrogen bonds into three categories: "very good" (VG), "moderately good" (MG) and "weak" (W). The general distribution curves for all four parameters are found to be nearly Gaussian. Assuming that VG hydrogen bonds lie between 0 and ±1σ, MG hydrogen bonds between ±1σ and ±2σ, and W hydrogen bonds beyond ±2σ (where σ is the standard deviation), suitable cut-off limits for classifying hydrogen bonds into the three categories have been derived. These limits are used to obtain VG and MG ranges for the four parameters: the distance and angle at the donor end and the two angular parameters at the acceptor end. The qualitative strength of a hydrogen bond is decided by the cumulative application of the criteria to all four parameters. The criterion has been further applied to some practical examples in conformational studies, such as the α-helix, and can be used to obtain suitable locations of hydrogen atoms to form good hydrogen bonds. An empirical approach to the energy of hydrogen bonds in the three categories has also been presented.
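One plausible reading of the classification rule, sketched in code: each parameter is scored by how many standard deviations it lies from the mean of its distribution, and the bond is assigned the weakest category among its four parameters. The weakest-category rule, the means and the standard deviations here are all illustrative assumptions, not values from the series:

```python
def category(value, mean, sigma):
    """VG within 1 sigma of the mean, MG within 2 sigma, W beyond that."""
    z = abs(value - mean) / sigma
    if z <= 1.0:
        return "VG"
    if z <= 2.0:
        return "MG"
    return "W"

def classify_bond(params):
    """params: (value, mean, sigma) for each of the four bond parameters;
    the bond takes the weakest category among them (illustrative rule)."""
    order = {"VG": 0, "MG": 1, "W": 2}
    return max((category(*p) for p in params), key=order.get)

# Hypothetical donor-end distance/angle and two acceptor-end angles:
bond = [(2.90, 2.85, 0.10), (15.0, 10.0, 8.0),
        (120.0, 125.0, 12.0), (30.0, 12.0, 10.0)]
print(classify_bond(bond))  # last parameter is 1.8 sigma out -> MG
```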


Prescribed fire is one of the most widely used management tools for reducing fuel loads in managed forests. However, the long-term effects of repeated prescribed fires on soil carbon (C) and nitrogen (N) pools are poorly understood. This study aimed to investigate how different fire frequency regimes influence C and N pools in the surface soils (0–10 cm). A prescribed fire field experiment in a wet sclerophyll forest, established in 1972 in southeast Queensland, was used in this study. The fire frequency regimes included long unburnt (NB), burnt every 2 years (2yrB) and burnt every 4 years (4yrB), with four replications. Compared with the NB treatment, the 2yrB treatment lowered soil total C by 44%, total N by 54%, HCl-hydrolysable C and N by 48% and 59%, KMnO4-oxidizable C by 81%, microbial biomass C and N by 42% and 33%, cumulative CO2–C by 28%, NaOCl-non-oxidizable C and N by 41% and 51%, and charcoal-C by 17%, respectively. The 4yrB and NB treatments showed no significant differences in these soil C and N pools. All soil labile, biologically active, recalcitrant and total C and N pools were correlated positively with each other and with soil moisture content, but negatively correlated with soil pH. The C:N ratios of the different C and N pools were greater in the burned treatments than in the NB treatment. This study highlights that prescribed burning at a four-year interval is a more sustainable management practice for this subtropical forest ecosystem.


Clays could underpin a viable agricultural greenhouse gas (GHG) abatement technology given their affinity for nitrogen and carbon compounds. We provide the first investigation into the efficacy of clays in decreasing agricultural nitrogen GHG emissions (i.e., N2O and NH3). Via laboratory experiments using an automated closed-vessel analysis system, we tested the capacity of two clays (vermiculite and bentonite) to decrease N2O and NH3 emissions and organic carbon losses from livestock manures (beef, pig, poultry and egg layer) incorporated into an agricultural soil. Clay addition levels varied, up to a maximum clay-to-manure ratio of 1:1 (dry weight). Cumulative gas emissions were modeled using the biological logistic function, with 15 of 16 treatments successfully fitted (P < 0.05) by this model. When assessing all of the manures together, NH3 emissions were twofold lower at the highest clay addition level than with no clay addition, but this difference was not significant (P = 0.17). Nitrous oxide emissions were significantly lower, by threefold (P < 0.05), at the highest clay addition level than with no clay addition. When assessing manures individually, we observed generally decreasing trends in NH3 and N2O emissions with increasing clay addition, albeit with widely varying statistical significance between manure types. Most of the treatments also showed strong evidence of increased C retention with increasing clay addition, with up to 10 times more carbon retained in treatments containing clay than in treatments containing none. This preliminary assessment of the efficacy of clays in mitigating agricultural GHG emissions indicates strong promise.
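The biological logistic function used for the cumulative emission curves has the form C(t) = Cmax / (1 + exp(-r(t - t0))), with asymptote Cmax, rate constant r and inflection time t0. A minimal sketch with hypothetical parameters (not fitted values from the experiments):

```python
import math

def cumulative_emission(t, c_max, r, t0):
    """Logistic cumulative emission at time t (days): c_max = asymptote,
    r = rate constant, t0 = inflection point where half of c_max is emitted."""
    return c_max / (1.0 + math.exp(-r * (t - t0)))

# At the inflection point exactly half of the asymptote has been emitted:
print(cumulative_emission(t=10.0, c_max=200.0, r=0.4, t0=10.0))  # 100.0
```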