8 results for stress-based forming limit
in eResearch Archive - Queensland Department of Agriculture
Abstract:
Nitrogen (N) is the largest agricultural input in many Australian cropping systems, and applying the right amount of N in the right place at the right physiological stage is a significant challenge for wheat growers. Optimizing N uptake could reduce input costs and minimize potential off-site movement. Since N uptake is dependent on soil and plant water status, ideally, N should be applied only to areas within paddocks with sufficient plant available water. To quantify N and water stress, spectral and thermal crop stress detection methods were explored using hyperspectral, multispectral and thermal remote sensing data collected at a research field site in Victoria, Australia. Wheat was grown over two seasons with two levels of water inputs (rainfall/irrigation) and either four levels (in 2004: 0, 17, 39 and 163 kg/ha N) or two levels (in 2005: 0 and 39 kg/ha N) of nitrogen. The Canopy Chlorophyll Content Index (CCCI) and modified Spectral Ratio planar index (mSRpi), two indices designed to measure canopy-level N, were calculated from canopy-level hyperspectral data in 2005. They accounted for 76% and 74% of the variability of crop N status, respectively, just prior to stem elongation (Zadoks 24). The Normalised Difference Red Edge (NDRE) index and CCCI, calculated from airborne multispectral imagery, accounted for 41% and 37% of variability in crop N status, respectively. Greater scatter in the airborne data was attributable to the difference in scale of the ground and aerial measurements (i.e., small-area plant samples against whole-plot means from imagery). Nevertheless, the analysis demonstrated that canopy-level theory can be transferred to airborne data, which could ultimately be of more use to growers. Thermal imagery showed that mean plot temperatures of rainfed treatments were 2.7 °C warmer than those of irrigated treatments (P < 0.001) at full cover. For partially vegetated fields, the two-dimensional Crop Water Stress Index (2D CWSI) was calculated using the Vegetation Index-Temperature (VIT) trapezoid method to reduce the contribution of soil background to image temperature. Results showed rainfed plots were consistently more stressed than irrigated plots. Future work is needed to improve the ability of the CCCI and VIT methods to detect N and water stress, and to apply both indices simultaneously at the paddock scale to test whether N can be targeted based on water status. Use of these technologies has significant potential for maximising the spatial and temporal efficiency of N applications for wheat growers. In ‘Ground-breaking Stuff’: Proceedings of the 13th Australian Society of Agronomy Conference, 10-14 September 2006, Perth, Western Australia.
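The abstract does not give the index formulas, but they are standard in the remote-sensing literature. Below is a minimal Python sketch of NDRE, the planar-domain CCCI, and the CWSI; the reflectance and temperature values, band choices and variable names are illustrative assumptions, not details taken from the study.

```python
# Illustrative sketch of the spectral and thermal stress indices discussed above.
# All input values are hypothetical examples; the study's exact band definitions
# and baseline calibration are not reproduced here.

def ndre(nir: float, red_edge: float) -> float:
    """Normalised Difference Red Edge index."""
    return (nir - red_edge) / (nir + red_edge)

def ccci(nir: float, red_edge: float, ndre_min: float, ndre_max: float) -> float:
    """Canopy Chlorophyll Content Index: NDRE rescaled between the minimum and
    maximum NDRE expected at the observed canopy cover (in the planar-domain
    approach these bounds are derived from a vegetation index such as NDVI)."""
    return (ndre(nir, red_edge) - ndre_min) / (ndre_max - ndre_min)

def cwsi(t_canopy: float, t_wet: float, t_dry: float) -> float:
    """Crop Water Stress Index: 0 = unstressed (wet baseline), 1 = fully stressed.
    In the VIT trapezoid method, t_wet and t_dry are read from the trapezoid
    edges at the pixel's vegetation-index value to discount soil background."""
    return (t_canopy - t_wet) / (t_dry - t_wet)

# Example with made-up reflectances (0-1) and canopy temperatures (deg C):
print(ndre(0.45, 0.30))        # ~0.20
print(cwsi(28.0, 24.0, 36.0))  # ~0.33 -> mild water stress
```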
Abstract:
Information on the effects of growing cotton (Gossypium hirsutum L.)-based crop rotations on soil quality of dryland Vertisols is sparse. The objective of this study was to quantify the effects of growing cereal and leguminous crops in rotation with dryland cotton on physical and chemical properties of a grey Vertisol near Warra, SE Queensland, Australia. The experimental treatments, selected after consultations with local cotton growers, were continuous cotton (T1), cotton-sorghum (Sorghum bicolor (L.) Moench) (T2), cotton-wheat (Triticum aestivum L.) double cropped (T3), cotton-chickpea (Cicer arietinum L.) double cropped followed by wheat (T4) and cotton-wheat (T5). From 1993 to 1996, land preparation was by chisel ploughing to about 0.2 m followed by two to four cultivations with a Gyral tyne cultivator. Thereafter, all crops were sown with zero tillage, except for cultivation with a chisel plough to about 0.07-0.1 m after cotton picking to control heliothis moth pupae. Soil was sampled from 1996 to 2004, and physical (air-filled porosity of oven-dried soil, an indicator of soil compaction; plastic limit; linear shrinkage; dispersion index) and chemical (pH in 0.01 M CaCl2, organic carbon, exchangeable Ca, Mg, K and Na contents) properties were measured. Crop rotation affected soil properties only with respect to exchangeable Na content and air-filled porosity. In the surface 0.15 m during 2000 and 2001, the lowest air-filled porosity occurred with T1 (average of 34.6 m³/100 m³) and the highest with T3 (average of 38.9 m³/100 m³). Air-filled porosity at the same depth decreased between 1997 and 1998 from 45.0 to 36.1 m³/100 m³, presumably due to smearing and compaction caused by shallow cultivation in wet soil. In the subsoil, T1 and T2 frequently had lower air-filled porosity values than T3, T4 and T5, particularly during the early stages of the experiment, although values under T1 increased subsequently. In general, compaction was less under rotations which included a wheat crop (T3, T4, T5). For example, average air-filled porosity (in m³/100 m³) in the 0.15-0.30 m depth from 1996 to 1999 was 19.8 with both T1 and T2, 21.2 with T3, 21.1 with T4 and 21.5 with T5. From 2000 to 2004, average air-filled porosity (in m³/100 m³) at the same depth was 21.3 with T1, 19.0 with T2, 19.8 with T3, 20.0 with T4 and 20.5 with T5. The rotation which included chickpea (T4) resulted in the lowest exchangeable Na content, although differences among rotations were small. Where only a cereal crop with a fibrous root system was sown in rotation with cotton (T2, T3, T5), linear shrinkage in the 0.45-0.60 m depth was lower than in the rotation which included a tap-rooted crop such as chickpea (T4) or in continuous cotton (T1). Dispersion index and organic carbon decreased, and plastic limit increased, with time. Soil organic carbon stocks decreased at a rate of 1.2 Mg/ha/year. The lowest average cotton lint yield occurred with T2 (0.54 Mg/ha) and the highest wheat yield with T3 (2.8 Mg/ha). Rotations which include a wheat crop are more likely to result in better soil structure and cotton lint yield than cotton-sorghum or continuous cotton.
Abstract:
The productivity of a fisheries resource can be quantified from estimates of recruitment, individual growth and natural and fisheries-related mortality, assuming the spatial extent of the resource has been quantified and there is minimal immigration or emigration. The sustainability of a fisheries resource is facilitated by management controls such as minimum and maximum size limits and total allowable catch. Minimum size limits are often set to allow individuals the opportunity to reproduce at least once before the chance of capture. Total allowable catches are a proportion of the population biomass, which is estimated based on known reproduction, recruitment, mortality and growth rates. In some fisheries, however, management actions are put in place without quantification of the resource through the stock assessment process. This occurs because species-specific information, for example individual growth, may not be available. In these circumstances, management actions need to be precautionary to protect against future resource collapse, but this often means that the resource is lightly exploited. Consequently, the productivity of the resource is not fully realised. Australia’s most valuable fisheries are invertebrate fisheries (Australian Department of Agriculture Fisheries and Forestry, 2008). For example, Australian fisheries (i.e. excluding aquaculture) production of crustaceans (largely prawns, rock lobster and crab) was 41,000 tonnes in 2006/7, worth $778 million. Production from mollusc (largely abalone, scallops, oysters and squid) fisheries was 39,000 tonnes, worth $502 million. Together, crustacean and mollusc fisheries represented 58% of the total value of Australian wild fisheries production in 2006/7. Sustainable management of Australia’s invertebrate fisheries is frustrated by the lack of data on species-specific growth rates. This project investigated a new method to estimate age, and hence individual growth rates, in invertebrate fisheries species. The principle behind the new ageing method was that telomeres (i.e. the DNA end-caps of chromosomes) get shorter as an individual gets older. We studied commercial crustacean and molluscan species. A vertebrate fish species (silver perch, Bidyanus bidyanus) was used as a control to standardise our work against the literature. We found a clear relationship between telomere length and shell size for temperate abalone (Haliotis rubra). Further research is recommended before the method can be implemented to assist management of wild-harvested abalone populations: age needs to be substituted for shell size in the relationship, and the relationship needs to be studied for abalone from several regions. This project showed that telomere length declined with increasing age in Sydney rock oysters (Saccostrea glomerata) and was affected by regional variation. A relationship was not apparent between telomere length and age (or size as a surrogate for age) for crustacean species (school prawns, Metapenaeus macleayi; eastern rock lobster, Sagmariasus verreauxi; southern rock lobster, Jasus edwardsii; and spanner crabs, Ranina ranina). For school prawns, there was no difference in telomere length between males and females. Further research is recommended, however, as telomeric DNA from crustaceans was difficult to analyse using the terminal restriction fragment (TRF) assay. Telomere lengths of spanner crabs and lobsters were at the upper limit of resolution of the assay used, and results were affected by degradation and possible contamination of telomeric DNA.
It is possible that telomere length is an indicator of remaining lifespan in molluscan and crustacean individuals, as suggested for some vertebrate species (e.g. Monaghan, 2010). Among abalone of similar shell size and among lobster pueruli, there was evidence of individuals having significantly longer or shorter telomeres than the group average. At a population level, this may serve as a surrogate for estimates of future natural mortality, which may be useful in the management of those populations. The method used to assay telomere length (the terminal restriction fragment assay) performed adequately for most species, but it was too expensive and time-consuming to be considered a useful tool for gathering information for fisheries management. Research on alternative methods is strongly recommended.
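As a sketch of the calibration step the abstract calls for (substituting age for shell size in the telomere-length relationship), a simple linear fit could look like the following; the data values and the linear form are assumptions for illustration, not measurements from the project.

```python
# Minimal sketch of calibrating TRF telomere length against known age, then
# inverting the fit to estimate age. All values are hypothetical.
import numpy as np

age_years = np.array([1, 2, 3, 5, 7, 10], dtype=float)  # hypothetical known ages
trf_kb    = np.array([9.8, 9.1, 8.7, 7.9, 7.2, 6.1])    # hypothetical TRF lengths (kb)

# Least-squares fit: TRF length = a * age + b
a, b = np.polyfit(age_years, trf_kb, 1)

def estimate_age(trf: float) -> float:
    """Estimate age from a new telomere-length measurement via the fitted line."""
    return (trf - b) / a

print(f"slope = {a:.2f} kb/year, intercept = {b:.2f} kb")
print(f"estimated age at TRF 8.3 kb: {estimate_age(8.3):.1f} years")
```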
Abstract:
Genotype-environment interactions (GEI) limit genetic gain for complex traits such as tolerance to drought. Characterization of the crop environment is an important step in understanding GEI. A modelling approach is proposed here to characterize drought-related environmental stresses both broadly (large geographic area, long-term period) and locally (field experiment), which enables breeders to analyse their experimental trials with regard to the broad population of environments that they target. Water-deficit patterns experienced by wheat crops were determined for drought-prone north-eastern Australia, using the APSIM crop model to account for the interactions of crops with their environment (e.g. the feedback of plant growth on water depletion). Simulations based on more than 100 years of historical climate data were conducted for representative locations, soils and management systems, for a check cultivar, Hartog. The three main environment types identified differed in their patterns of simulated water stress around flowering and during grain filling. Over the entire region, the terminal drought-stress pattern was most common (50% of production environments), followed by a flowering stress (24%), although the frequencies of occurrence of the three types varied greatly across regions, years and management. This environment classification was applied to 16 trials relevant to late-stage testing of a breeding programme. Incorporating the independently determined environment types in a statistical analysis assisted interpretation of the GEI for yield among the 18 representative genotypes by reducing the relative effect of GEI compared with genotypic variance, and helped to identify opportunities to improve breeding and germplasm-testing strategies for this region.
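The abstract does not specify how simulated water-stress patterns were grouped into environment types; as a hedged illustration only, a clustering step over stress trajectories might look like the sketch below. The stress series, the cluster count and the use of k-means are assumptions, not the authors' procedure.

```python
# Illustrative sketch: cluster simulated water-stress trajectories into
# environment types, in the spirit of the APSIM-based characterization above.
# The series are made up; the paper's actual classification method may differ.
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)

# Each row is one simulated season's water-stress index (0 = none, 1 = severe)
# at five stages centred on flowering; the three shapes below are hypothetical.
stress_patterns = np.clip(np.vstack([
    rng.normal([0.1, 0.2, 0.4, 0.7, 0.9], 0.05, (4, 5)),  # terminal-drought-like
    rng.normal([0.2, 0.6, 0.8, 0.4, 0.3], 0.05, (4, 5)),  # flowering-stress-like
    rng.normal([0.1, 0.1, 0.2, 0.2, 0.3], 0.05, (4, 5)),  # low-stress-like
]), 0.0, 1.0)

# Group the 12 simulated seasons into three environment types.
env_types = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(stress_patterns)
print(env_types)  # cluster label per season (label numbering is arbitrary)
```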
Abstract:
The effect of partially replacing rolled barley (86.6% of the control diet) with 20% wheat dried distillers grains plus solubles (DDGS), 40% wheat DDGS, 20% corn DDGS or 40% corn DDGS (dietary DM basis) on rumen fluid fatty acid (FA) composition and some rumen bacterial communities was evaluated using 100 steers (20 per treatment). Wheat DDGS increased the 11t- to 10t-18:1 ratio (P < 0.05) in rumen fluid, and there was evidence that the conversion of trans-18:1 to 18:0 was reduced in the control and wheat DDGS diets but not in the corn DDGS diet. Bacterial community profiles obtained using denaturing gradient gel electrophoresis and evaluated by Pearson correlation similarity matrices were not consistent with diet and therefore could not be linked to specific rumen FA. This inconsistency may be related to the nature of the diets fed (the dominant effect of barley), the limited change in dietary composition resulting from DDGS inclusion, large animal-to-animal variation, and possibly additional stress as a result of transport just before slaughter. Ruminal densities of a key fiber-digesting bacterial species that produces 11t-18:1 from linoleic and linolenic acids (Butyrivibrio fibrisolvens), and of a lactate producer originally thought responsible for production of 10t,12c-18:2 (Megasphaera elsdenii), were not influenced by diet (P > 0.05).
Abstract:
An examination of ex-type and authentic cultures of 34 species of Bipolaris and Curvularia by phylogenetic analysis of four loci (EF-1α, GAPDH, ITS and LSU) resulted in nine new combinations in Curvularia, as well as new synonymies for some species of Bipolaris and Curvularia. Lectotypes are designated for Bipolaris secalis and Curvularia richardiae, and an epitype is designated for Curvularia crustacea. A new monotypic genus, Johnalcornia, is introduced to accommodate Bipolaris aberrans, which clusters sister to the newly described Porocercospora. Johnalcornia differs morphologically from this taxon by producing distinctive conidia-like chlamydospores as well as comparatively thick-walled, geniculate conidiophores, with conidiogenous cells that have conspicuous scars. Johnalcornia further differs from related genera by forming the second conidial septum in the apical cell.
Abstract:
Probiotic supplements are single- or mixed-strain cultures of live microorganisms that benefit the host by improving the properties of the indigenous microflora (Seo et al. 2010). In a pilot study at the University of Queensland, Norton et al. (2008) found that Bacillus amyloliquefaciens strain H57 (H57), primarily investigated as an inoculum to make high-quality hay, improved feed intake and nitrogen utilisation over several weeks in pregnant ewes. The purpose of the following study was to further challenge the potential of H57: to show that it survives the steam-pelleting process, and that it improves the performance of ewes fed pellets based on an agro-industrial by-product with a reputation for poor palatability, palm kernel meal (PKM) (McNeill 2013). Thirty-two first-parity White Dorper ewes (day 37 of pregnancy, mean liveweight = 47.3 kg, mean age = 15 months) were inducted into individual pens in the animal house at the University of Queensland, Gatton. They were adjusted onto PKM-based pellets (g/kg dry matter (DM): PKM, 408; sorghum, 430; chickpea hulls, 103; minerals and vitamins; crude protein, 128; ME, 11.1 MJ/kg DM) until day 89 of pregnancy and thereafter fed a predominantly pelleted diet with or without H57 spores (10⁹ colony-forming units (cfu)/kg pellet, as fed), plus 100 g/ewe/day oaten chaff, until day 7 of lactation. From day 7 to 20 of lactation the pelleted component of the diet was steadily reduced and replaced by a 50:50 mix of lucerne and oaten chaff, fed ad libitum, plus 100 g/ewe/day of ground sorghum grain with or without H57 (10⁹ cfu/ewe/day). The period of adjustment in pregnancy (day 37-89) extended beyond expectations owing to some evidence of mild ruminal acidosis after initially high intakes that were followed by low intakes. During that time the diet was modified, in an attempt to improve palatability, by the addition of oaten chaff and the removal of an acidifying agent (NH4Cl) that had been added initially to reduce the risk of urinary calculi. Eight ewes were removed due to inappetence, leaving 24 ewes to start the trial at day 90 of pregnancy. From day 90 of pregnancy until day 63 of lactation, liveweights of the ewes and their lambs were determined weekly and at parturition. Feed intakes of the ewes were determined weekly. Once lambing began, 1 ewe was removed because it gave birth to twin lambs (whereas the rest gave birth to a single lamb), 4 due to the loss of their lambs (2 to dystocia), and 1 due to copper toxicity. The PKM pellets were suspected to be the cause of the copper toxicity and so were removed in early lactation. Hence, the final statistical analysis using STATISTICA 8 (repeated-measures ANOVA for feed intake; one-way ANOVA for liveweight change and birth weight) was completed on 23 ewes for the pregnancy period (n = 11 fed H57; n = 12 control), and 18 ewes or lambs for the lactation period (n = 8 fed H57; n = 10 control). From day 90 of pregnancy until parturition the H57-supplemented ewes ate 17% more DM (g/day: 1041 vs 889, sed = 42.4, P = 0.04) and gained more liveweight (g/day: 193 vs 24.0, sed = 25.4, P = 0.0002), but produced lambs with a similar birthweight (kg: 4.18 vs 3.99, sed = 0.19, P = 0.54). Over the 63 days of lactation the H57 ewes ate similar amounts of DM but grew more slowly than the control ewes (g/day: 1.5 vs 97.0, sed = 21.7, P = 0.012). The lambs of the H57 ewes grew faster than those of the control ewes for the first 21 days of lactation (g/day: 356 vs 265, sed = 16.5, P = 0.006).
These data support the findings of Norton et al. (2008) and Kritas et al. (2006) that certain Bacillus spp. supplements can improve the performance of pregnant and lactating ewes. In the current study we particularly highlighted the capacity of H57 to stimulate immature ewes to continue to grow maternal tissue through pregnancy, possibly through an enhanced appetite. This in turn appeared to give them a greater capacity to partition nutrients to their lambs through milk, at least for the first few weeks of lactation, a critical time for optimising lamb survival. To conclude, H57 can survive the steam-pelleting process and improve feed intake and maternal liveweight gain in late pregnancy, and performance in early lactation, of first-parity ewes fed a diet based on PKM.
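For readers who want to reproduce this style of analysis without STATISTICA, a minimal sketch of a one-way ANOVA on two treatment groups (as used here for liveweight change and birth weight) follows; the group values are hypothetical, not the trial data.

```python
# Minimal sketch of the one-way ANOVA named above. The study used STATISTICA 8;
# scipy's f_oneway performs the same F-test. Values below are hypothetical.
from scipy import stats

h57_gain     = [180, 210, 195, 170, 205, 188, 200, 192, 175, 198, 183]  # g/day, n = 11
control_gain = [30, 15, 45, 10, 35, 22, 28, 18, 40, 12, 25, 8]          # g/day, n = 12

f_stat, p_value = stats.f_oneway(h57_gain, control_gain)
print(f"F = {f_stat:.1f}, P = {p_value:.2g}")
```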
Abstract:
Two trials were conducted in this project. One was a continuation of work started under a previous GRDC/SRDC-funded activity, 'Strategies to improve the integration of legumes into cane based farming systems'. This trial aimed to assess the impact of trash and tillage management options and nematicide application on nematodes and crop performance. Methods and results are contained in the following publication: Halpin NV, Stirling GR, Rehbein WE, Quinn B, Jakins A, Ginns SP. The impact of trash and tillage management options and nematicide application on crop performance and plant-parasitic nematode populations in a sugarcane/peanut farming system. Proc. Aust. Soc. Sugar Cane Technol. 37, 192-203. Nematicide application in the plant crop significantly reduced total numbers of plant-parasitic nematodes (PPN) but had no impact on yield. Application of nematicide to the ratoon crop significantly reduced sugar yield. The study confirmed other work demonstrating that implementation of strategies such as reduced tillage lowered populations of total PPN, suggesting that the soil was more suppressive to PPN in those treatments. The second trial, a variety trial, demonstrated the limited value of nematicide application in sugarcane farming systems. This study has highlighted that growers should not view nematicides as a ‘cure-all’ for paddocks that have historically had high PPN numbers. Nematicides have high mammalian toxicity, have the potential to contaminate groundwater (Kookana et al. 1995) and are costly. The cost of the nematicide used in R1 was approximately $320-$350/ha, adding $3.50/t of cane in a 100 t/ha crop. Also, our study demonstrated that a single nematicide treatment at the application rate registered for sugarcane is not very effective in reducing populations of nematode pests. There appears to be some level of resistance to nematodes within the current suite of varieties available to the southern canelands. For example, the soil in plots that were growing Q183 had 560% more root-knot nematodes per 200 mL of soil than plots that grew Q245. The authors see great value in investment in a nematode-screening program that could rate varieties into groups of susceptibility to both major sugarcane nematode pests. Such a rating could then be built into a decision-support ‘tree’ or tool to better enable producers to select varieties on a paddock-by-paddock basis.
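The per-tonne figure quoted above is a direct consequence of the per-hectare cost and the crop yield; a minimal sketch of the arithmetic:

```python
# Worked arithmetic behind the nematicide cost figure quoted above.
nematicide_cost_per_ha = 350.0  # $/ha (upper end of the quoted $320-$350 range)
cane_yield = 100.0              # t/ha

cost_per_tonne = nematicide_cost_per_ha / cane_yield
print(f"${cost_per_tonne:.2f}/t of cane")  # $3.50/t, as quoted
```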