53 results for Nitrogen excretion
Abstract:
Cyperus iria is a weed of rice with widespread occurrence throughout the world. Because of concerns about excessive and injudicious use of herbicides, cultural weed management approaches that are safe and economical are needed. Developing such approaches will require a better understanding of weed biology and ecology, as well as of weed response to increases in crop density and nutrition. Knowledge of the effects of nitrogen (N) fertilizer on crop-weed competitive interactions could also help in the development of integrated weed management strategies. The present study was conducted in a screenhouse to determine the effects of rice planting density (0, 5, 10, and 20 plants pot−1) and N rate (0, 50, 100, and 150 kg ha−1) on the growth of C. iria. Tiller number per plant decreased by 73–88%, leaf number by 85–94%, leaf area by 85–98%, leaf biomass by 92–99%, and inflorescence biomass by 96–99% when weed plants were grown at 20 rice plants pot−1 (i.e., 400 plants m−2) compared with weed plants grown alone. All of these parameters increased when N rates were increased. On average, weed biomass increased by 118–389% and rice biomass by 121–275% with application of 50–150 kg N ha−1, compared with the control. Addition of N favored weed biomass production relative to rice biomass. Increased N rates reduced the root-to-shoot weight ratio of C. iria. Rice interference reduced weed growth and biomass and completely suppressed C. iria when no N was applied at the high planting density (i.e., 20 plants pot−1). The weed showed phenotypic plasticity in response to N application, and the addition of N increased the competitive ability of the weed over rice at densities of 5 and 10 rice plants pot−1 compared with 20 plants pot−1. The results of the present study suggest that high rice density (i.e., 400 plants m−2) can help suppress C. iria growth even at high N rates (150 kg ha−1).
Abstract:
Pratylenchus thornei is a major pathogen of wheat in Australia. Two glasshouse experiments with four wheat cultivars that had different final populations (Pf) of P. thornei in the field were used to optimise conditions for assessing resistance. With different initial populations (Pi) ranging up to 5250 P. thornei/kg soil, Pf of P. thornei increased up to 16 weeks after sowing and then decreased at 20 weeks in some cultivar × Pi combinations. The population dynamics of P. thornei up to 16 weeks were best described by a modified exponential equation Pf(t) = a·Pi·e^(kt), where Pf(t) is the final population density at time t, Pi is the initial population density, a is the proportion of Pi that initiates population development, and k is the intrinsic rate of increase of the population. The cultivar GS50a had very low k values at Pi of 5250 and 1050, indicating its resistance; Suneca and Potam had high k values, indicating susceptibility; the intolerant Gatcher had a low value at the higher Pi and a high value at the lower Pi. Nitrate fertiliser increased plant growth and the Pf values of susceptible cultivars, but in unplanted soil it decreased Pf. Nematicide (aldicarb, 5 mg/kg soil) killed P. thornei more effectively in planted than in unplanted soil and increased plant growth, particularly in the presence of N fertiliser. In both experiments, the wheat cultivars Suneca and Potam were more susceptible than the cultivar GS50a, reflecting field results. The method chosen to discriminate wheat cultivars was to assess Pf after growth for 16 weeks in soil with Pi of ~1050–5250 P. thornei/kg soil, fertilised with 200 mg NO3–N/kg soil.
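The modified exponential model can be fitted to observed nematode counts to estimate a and k for each cultivar × Pi combination. The sketch below illustrates one way to do this with a standard nonlinear least-squares routine; the counts used are hypothetical and are not data from the study.

```python
# Minimal sketch of fitting the modified exponential model Pf(t) = a * Pi * e^(k t).
# The nematode counts below are hypothetical and only illustrate the fitting step.
import numpy as np
from scipy.optimize import curve_fit

def population_model(X, a, k):
    """Predicted final population Pf given time t (weeks) and initial population Pi."""
    t, Pi = X
    return a * Pi * np.exp(k * t)

# Hypothetical observations: sampling time (weeks), initial and final densities (P. thornei/kg soil)
t  = np.array([4.0, 8.0, 12.0, 16.0, 4.0, 8.0, 12.0, 16.0])
Pi = np.array([1050.0] * 4 + [5250.0] * 4)
Pf = np.array([900.0, 2100.0, 5200.0, 12500.0, 4200.0, 9800.0, 23000.0, 52000.0])

# a = proportion of Pi initiating population development; k = intrinsic rate of increase
(a_hat, k_hat), _ = curve_fit(population_model, (t, Pi), Pf, p0=(1.0, 0.1))
print(f"a = {a_hat:.2f}, k = {k_hat:.3f} per week")
```

Comparing fitted k values across cultivars would reproduce the ranking described above: low k for the resistant GS50a, high k for the susceptible Suneca and Potam.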
Abstract:
Pteropid bats or flying-foxes (Chiroptera: Pteropodidae) are the natural host of Hendra virus (HeV), which sporadically causes fatal disease in horses and humans in eastern Australia. While there is strong evidence that urine is an important infectious medium that likely drives bat-to-bat transmission and bat-to-horse transmission, there is uncertainty about the relative importance of alternative routes of excretion such as nasal and oral secretions, and faeces. Identifying the potential routes of HeV excretion in flying-foxes is important for effectively mitigating equine exposure risk at the bat-horse interface and for determining transmission rates in host-pathogen models. The aim of this study was to identify the major routes of HeV excretion in naturally infected flying-foxes and, secondarily, to identify between-species variation in excretion prevalence. A total of 2840 flying-foxes from three of the four Australian mainland species (Pteropus alecto, P. poliocephalus and P. scapulatus) were captured and sampled at multiple roost locations in the eastern states of Queensland and New South Wales between 2012 and 2014. A range of biological samples (urine and serum, and urogenital, nasal, oral and rectal swabs) were collected from anaesthetized bats and tested for HeV RNA using a qRT-PCR assay targeting the M gene. Forty-two of the 1410 P. alecto sampled had HeV RNA detected in at least one sample, yielding a total of 78 positive samples and an overall detection rate of 1.76% across all samples tested in this species (78/4436). The rate of detection and the amount of viral RNA were highest in urine samples (urine > serum and packed haemocytes > faecal > nasal > oral), identifying urine as the most plausible source of infection for flying-foxes and for horses. Detection in a urine sample was more efficient than detection in urogenital swabs, identifying the former as the preferred diagnostic sample. The detection of HeV RNA in serum is consistent with haematogenous spread, and with hypothesised latency and recrudescence in flying-foxes. There were no detections in P. poliocephalus (n = 1168 animals; n = 2958 samples) or P. scapulatus (n = 262 animals; n = 985 samples), suggesting (consistent with other recent studies) that these species are epidemiologically less important than P. alecto in HeV infection dynamics. The study is unprecedented in terms of the individual animal approach, the large sample size, and the use of a molecular assay to directly determine infection status. These features provide a high level of confidence in the veracity of our findings, and a sound basis from which to more precisely target equine risk mitigation strategies.
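As a quick check on the headline figures, the sample-level and animal-level detection rates for P. alecto can be recomputed directly from the counts reported above. The Wilson confidence interval in the sketch is an illustrative addition and was not part of the original analysis.

```python
# Recomputing the P. alecto detection rates from the counts given in the abstract.
# The Wilson 95% interval is an illustrative addition, not from the study.
import math

def wilson_ci(positives: int, n: int, z: float = 1.96) -> tuple[float, float]:
    """Wilson score interval for a binomial proportion."""
    p = positives / n
    denom = 1 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    half = z * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2)) / denom
    return centre - half, centre + half

# 78 HeV RNA-positive samples of 4436 tested; 42 positive bats of 1410 sampled
for label, positives, n in [("samples", 78, 4436), ("individual bats", 42, 1410)]:
    lo, hi = wilson_ci(positives, n)
    print(f"{label}: {100 * positives / n:.2f}% positive (95% CI {100 * lo:.2f}-{100 * hi:.2f}%)")
```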
Abstract:
Extensive cattle grazing is the dominant land use in northern Australia. It has been suggested that grazing intensity and rainfall have profound effects on the dynamics of soil nutrients in northern Australia’s semi-arid rangelands. Previous studies have found positive, neutral and negative effects of grazing pressure on soil nutrients. These inconsistencies could be due to short-term experiments that do not capture the slow dynamics of some soil nutrients and the effects of interannual variability in rainfall. In a long-term cattle grazing trial in northern Australia on a Brown Sodosol–Yellow Kandosol complex, we analysed soil organic matter and mineral nitrogen in surface soils (0–10 cm depth) 11, 12 and 16 years after trial establishment on experimental plots representing moderate stocking (stocked at the long-term carrying capacity for the region) and heavy stocking (stocked at twice the long-term carrying capacity). Higher soil organic matter was found under heavy stocking, although the grazing treatment had little effect on mineral and total soil nitrogen. Interannual variability had a large effect on soil mineral nitrogen, but not on soil organic matter, suggesting that soil nitrogen levels observed in this soil complex may be affected by other indirect pathways, such as climate. The effect of interannual variability in rainfall and the effects of other soil types need to be explored further.
Abstract:
Increasing organic carbon inputs to agricultural soils through the use of pastures or crop residues has been suggested as a means of restoring soil organic carbon lost via anthropogenic activities, such as land use change. However, the decomposition and retention of different plant residues in soil, and how these processes are affected by soil properties and nitrogen fertiliser application, are not fully understood. We evaluated the rate and extent of decomposition of 13C-pulse labelled plant material in response to nitrogen addition in four pasture soils of varying physico-chemical characteristics. Microbial respiration of buffel grass (Cenchrus ciliaris L.), wheat (Triticum aestivum L.) and lucerne (Medicago sativa L.) residues was monitored over 365 days. A double exponential model fitted to the data suggested that microbial respiration occurred in an early, rapid stage and a late, slow stage. A weighted three-compartment mixing model estimated the decomposition of both soluble and insoluble plant 13C (mg C kg−1 soil). Total plant material decomposition followed the alkyl C:O-alkyl C ratio of the plant material, as determined by solid-state 13C nuclear magnetic resonance spectroscopy. Urea-N addition increased the decomposition of insoluble plant 13C in some soils (≤0.1% total nitrogen) but not others (0.3% total nitrogen). Principal components regression analysis indicated that 26% of the variability in plant material decomposition was explained by soil physico-chemical characteristics (P = 0.001), primarily the C:N ratio. We conclude that residues of plant species with a higher alkyl C:O-alkyl C ratio are better retained as soil organic matter, and that the C:N stoichiometry of soils determines whether N addition leads to increases in soil organic carbon stocks.
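The abstract does not give the exact form of the double exponential model, so the sketch below assumes a common two-pool formulation for cumulative residue-derived CO2-C, with a fast (early, rapid) pool and a slow (late) pool; the data and starting values are hypothetical and serve only to illustrate the fitting step.

```python
# Two-pool (double exponential) model for cumulative residue-C respiration:
#   CO2-C(t) = C_fast * (1 - e^(-k_fast * t)) + C_slow * (1 - e^(-k_slow * t))
# This functional form and the data below are assumptions for illustration only.
import numpy as np
from scipy.optimize import curve_fit

def two_pool(t, c_fast, k_fast, c_slow, k_slow):
    """Cumulative CO2-C respired (mg C/kg soil) after t days."""
    return c_fast * (1 - np.exp(-k_fast * t)) + c_slow * (1 - np.exp(-k_slow * t))

days = np.array([7.0, 14.0, 30.0, 60.0, 120.0, 240.0, 365.0])
co2c = np.array([120.0, 190.0, 260.0, 310.0, 350.0, 380.0, 395.0])  # hypothetical

params, _ = curve_fit(two_pool, days, co2c, p0=(200.0, 0.1, 250.0, 0.005), bounds=(0, np.inf))
c_fast, k_fast, c_slow, k_slow = params
print(f"fast pool: {c_fast:.0f} mg C/kg, k = {k_fast:.3f}/day; "
      f"slow pool: {c_slow:.0f} mg C/kg, k = {k_slow:.4f}/day")
```

Comparing fitted pool sizes and rate constants across residue types and N treatments would mirror the early rapid versus late slow stages described above.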
Abstract:
Prescribed fire is one of the most widely used management tools for reducing fuel loads in managed forests. However, the long-term effects of repeated prescribed fires on soil carbon (C) and nitrogen (N) pools are poorly understood. This study aimed to investigate how different fire frequency regimes influence C and N pools in the surface soils (0–10 cm). The study used a prescribed fire field experiment established in 1972 in a wet sclerophyll forest in southeast Queensland. The fire frequency regimes included long unburnt (NB), burnt every 2 years (2yrB) and burnt every 4 years (4yrB), with four replications. Compared with the NB treatment, the 2yrB treatment lowered soil total C by 44%, total N by 54%, HCl hydrolysable C and N by 48% and 59%, KMnO4 oxidizable C by 81%, microbial biomass C and N by 42% and 33%, cumulative CO2–C by 28%, NaOCl-non-oxidizable C and N by 41% and 51%, and charcoal-C by 17%, respectively. The 4yrB and NB treatments showed no significant differences for these soil C and N pools. All of the labile, biologically active, recalcitrant and total soil C and N pools were positively correlated with each other and with soil moisture content, but negatively correlated with soil pH. The C:N ratios of the different C and N pools were greater in the burned treatments than in the NB treatments. This study highlights that prescribed burning at a four-year interval is a more sustainable management practice for this subtropical forest ecosystem.