29 results for critical period


Relevance:

20.00%

Publisher:

Abstract:

After more than 30 years in which ‘Tifgreen’ and ‘Tifdwarf’ were the only greens-quality varieties available, the choice for golf courses and bowls clubs in northern Australia has been expanded to include six new Cynodon hybrids [Cynodon dactylon (L.) Pers. x Cynodon transvaalensis Burtt-Davy]. Five of these – ‘Champion Dwarf’ (Texas), ‘MS-Supreme’ (Mississippi), FloraDwarf™ (Florida), ‘TifEagle’ (Georgia) and MiniVerde™ (Arizona) – are from US breeding programs, while the sixth, ‘TL2’ (marketed as Novotek™), was selected in north Queensland. The finer, denser and lower growing habit of the “ultradwarf” cultivars allows very low mowing heights (e.g. 2.5 mm) to be imposed, resulting in denser and smoother putting and bowls surfaces. In addition to the Cynodon hybrids, four new greens-quality seashore paspalum (Paspalum vaginatum O. Swartz) cultivars – ‘Sea Isle 2000’, Sea Isle Supreme™, Velvetene™ and Sea Dwarf™ – expand the range of choices for greens in difficult environments where tolerance of salty water is required. The project was developed to determine (a) the appropriate choice of cultivar for different environments and budgets, and (b) best management practices for the new cultivars, which differ from the Cynodon hybrid industry standards ‘Tifgreen’ and ‘Tifdwarf’. Management practices, particularly fertilising, mowing heights and frequency, and thatch control, were investigated to determine optimum management inputs and provide high-quality playing surfaces with the new grasses. To enable effective trialling of these new and old cultivars it was essential to have a number of regional sites participating in the study. Drought and financial hardship presented an initial setback: numerous clubs wanted to be involved in the study but were unable to commit because of their financial position at the time.
The study was fortunate to have seven regional sites from Queensland, New South Wales, Victoria and South Australia volunteer to be involved, adding to the results collected at the centralised test facility constructed at DEEDI’s Redlands Research Station. The major research findings from the eight trial sites included:
• All of the new second-generation “ultradwarf” couchgrasses tend to produce a large amount of thatch, with MiniVerde™ the greatest thatch producer, particularly compared with ‘Tifdwarf’ and ‘Tifgreen’. Maintenance of the new Cynodon hybrids will require a program of regular dethatching/grooming as well as regular light dustings of sand. Thatch prevention should begin 3 to 4 weeks after planting a new “ultradwarf” couchgrass green, with an emphasis on prevention rather than control.
• The “ultradwarfs” produced faster green speeds than the current industry standards ‘Tifgreen’ and ‘Tifdwarf’. All Cynodon hybrids were considerably faster than the seashore paspalums under trial conditions (a speed difference comparable to that between bentgrass and couchgrass). Green speed was fastest when cut at 3.5 mm and rolled (compared with 3.5 mm cut, no roll, and 2.7 mm cut, no roll).
• All trial sites reported disease in the Cynodon hybrids, with the main incidence occurring during the dormancy period (autumn and winter). The main issue reported was “patch diseases”, which include both Gaeumannomyces and Rhizoctonia species. There were differences in disease severity between cultivars, but these were not consistent across sites and are largely attributed to an environment (location) effect. Disease incidence was less severe where fertility was higher (about 3 kg N/100 m2/year) or where a preventative fungicide program was adopted.
• Cynodon hybrid and seashore paspalum cultivars maintained an acceptable to ideal surface when cut between 2.7 mm and 5.0 mm. “Ultradwarf” cultivars can tolerate mowing heights as low as 2.5 mm for short periods, but this places the plant under high levels of stress. Maintaining greens of both species at a continually lower cutting height (e.g. 2.7 mm) is achievable, but they would need to be cut daily for best results. Seashore paspalums performed best when cut at between 2.7 mm and 3.0 mm. If a lower cutting height is adopted, regular and repeated mowings are required to reduce scalping and produce a smooth surface.
• At this point in time the optimum rate of nitrogen (N) is 3 kg/100 m2/year for the Cynodon hybrids and 2 to 3 kg/100 m2/year for the seashore paspalums.
• Dormancy occurred in all Cynodon and seashore paspalum cultivars from Brisbane (QLD) in the north to the Mornington Peninsula (VIC) in the south and Novar Gardens (SA) in the west. Cynodon and Paspalum growth in Victoria and South Australia was less favourable as a result of the cooler climates.
• Combining the data collected from all eight sites indicated that turfgrass quality, colour, disease resistance and performance can vary with site and climatic conditions. Such evidence highlights the need to undertake genotype by environment (G x E) studies on new and old cultivars prior to conversion or establishment.
• A club looking to select either a Cynodon hybrid or seashore paspalum cultivar needs to:
- Review the research data.
- Look at trial plots.
- Inspect greens in play that have the new grasses.
- Select 2 to 3 cultivars that are considered to be the better types.
- Establish them in large (large enough to putt on) plots/nursery/practice putter; ideally the area should be subjected to wear.
- Maintain them exactly as they would be on the golf course/lawn bowls green. This is a critical aspect: regular mowing, fertilising etc. is essential.
- Assess them over at least 2 to 3 years.
- Make a selection and establish it in a playing green so that it is subjected to typical wear.

Relevance:

20.00%

Publisher:

Abstract:

1. Weed eradication efforts often must be sustained for long periods owing to the existence of persistent seed banks, among other factors. Decision makers need to consider both the amount of investment required and the period over which investment must be maintained when determining whether to commit to (or continue) an eradication programme. However, a basis for estimating eradication programme duration from simple data has been lacking. Here, we present a stochastic dynamic model that can provide such estimates.
2. The model is based upon the rates of progression of infestations from the active to the monitoring state (i.e. no plants detected for at least 12 months), the rates of reversion of infestations from monitoring to the active state, and the frequency distribution of time since last detection for all infestations. Isoquants are generated that illustrate the combinations of progression and reversion parameters corresponding to eradication within different time frames.
3. The model is applied to ongoing eradication programmes targeting branched broomrape Orobanche ramosa and chromolaena Chromolaena odorata. The minimum periods in which eradication could potentially be achieved were 22 and 23 years, respectively. On the basis of programme performance until 2008, however, eradication is predicted to take considerably longer for both species (on average, 62 and 248 years, respectively). Performance of the branched broomrape programme could best be improved by reducing rates of reversion to the active state; for chromolaena, boosting rates of progression to the monitoring state is more important.
4. Synthesis and applications. Our model for estimating weed eradication programme duration, which captures critical transitions between a limited number of states, is readily applicable to any weed. A particular strength of the method lies in its minimal data requirements.
These comprise estimates of maximum seed persistence and infested area, plus consistent annual records of the detection (or otherwise) of the weed in each infestation. This work provides a framework for identifying where improvements in management are needed and a basis for testing the effectiveness of alternative tactics. If adopted, our approach should help improve decision making with regard to eradication as a management strategy.
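The two-state structure described in the abstract above (active vs. monitoring infestations, with progression and reversion rates) can be illustrated with a toy simulation. This is a minimal sketch under my own assumptions: the parameter values are illustrative, not estimates for broomrape or chromolaena, and a simple stand-in criterion (every infestation undetected for a fixed run of years) replaces the paper's seed-persistence rule.

```python
import random

def simulate_eradication(n_infestations, p_progress, p_revert,
                         clear_years=5, max_years=500, seed=1):
    """Toy two-state model: each infestation is 'active' or 'monitoring'
    (no plants detected for >= 12 months). Eradication is declared when
    every infestation has stayed in monitoring for `clear_years` in a
    row. Returns the year eradication is declared (capped at max_years).
    Parameters are illustrative, not those of the cited programmes."""
    rng = random.Random(seed)
    # years each infestation has spent continuously in monitoring (-1 = active)
    state = [-1] * n_infestations
    for year in range(1, max_years + 1):
        for i in range(n_infestations):
            if state[i] < 0:  # active: may progress to monitoring
                if rng.random() < p_progress:
                    state[i] = 0
            else:             # monitoring: may revert to active
                if rng.random() < p_revert:
                    state[i] = -1
                else:
                    state[i] += 1
        if all(s >= clear_years for s in state):
            return year
    return max_years

# Hypothetical programme: 10 infestations, modest progression, rare reversion
years = simulate_eradication(10, 0.3, 0.01, clear_years=3)
```

Even in this toy, raising the reversion rate lengthens the predicted programme disproportionately, which mirrors the paper's finding that reversion control mattered most for branched broomrape.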

Relevance:

20.00%

Publisher:

Abstract:

This is the first of two projects generating the chlorothalonil and difenoconazole residue data required to potentially reduce the withholding periods from 7 days down to 3 or 5 days. This project funds the generation of pesticide residue samples in papaya, which will be analysed under project PP09007. These residue data are required to support a reduction in the withholding period for the papaya industry for chlorothalonil (trade names including Bravo and Barrack) and difenoconazole.

Relevance:

20.00%

Publisher:

Abstract:

This project focussed on the phosphorus (P) and potassium (K) status of northern cropping soils. Stores of P and K have been depleted by crop removal and limited fertiliser application, with depletion most significant in the subsoil. Soil testing strategies are confounded by slowly available mineral reserves with uncertain availability. The utility of new soil tests was assessed to measure these reserves, their availability to plants quantified and a regional sampling strategy undertaken to identify areas of greatest P and K deficit. Fertiliser application strategies for P and K have been tested and the interactions between these and other nutrients have been determined in a large field program.

Relevance:

20.00%

Publisher:

Abstract:

On-going, high-profile public debate about climate change has focussed attention on how to monitor the soil organic carbon stock (C_s) of rangelands (savannas). Unfortunately, optimal sampling of the rangelands for baseline C_s - the critical first step towards efficient monitoring - has received relatively little attention to date. Moreover, in the rangelands of tropical Australia relatively little is known about how C_s is influenced by the practice of cattle grazing. To address these issues we used linear mixed models to: (i) unravel how grazing pressure (over a 12-year period) and soil type have affected C_s and the stable carbon isotope ratio of soil organic carbon (δ13C) (a measure of the relative contributions of C3 and C4 vegetation to C_s); (ii) examine the spatial covariation of C_s and δ13C; and (iii) explore the amount of soil sampling required to adequately determine baseline C_s. Modelling was done in the context of the material coordinate system for the soil profile; therefore the depths reported, while conventional, are only nominal. Linear mixed models revealed that soil type and grazing pressure interacted to influence C_s to a depth of 0.3 m in the profile. At a depth of 0.5 m there was no effect of grazing on C_s, but the soil type effect was significant. Soil type influenced δ13C to a soil depth of 0.5 m, but there was no effect of grazing at any depth examined. The linear mixed model also revealed the strong negative correlation of C_s with δ13C, particularly to a depth of 0.1 m in the soil profile. This suggested that increased C_s at the study site was associated with increased input of C from C3 trees and shrubs relative to the C4 perennial grasses; as the latter form the bulk of the cattle diet, we contend that C sequestration may be negatively correlated with forage production.
Our baseline C_s sampling recommendation for cattle-grazing properties of the tropical rangelands of Australia is to: (i) divide the property into units of apparently uniform soil type and grazing management; and (ii) use stratified simple random sampling to spread at least 25 soil sampling locations about each unit, with at least two samples collected per stratum. This will be adequate to accurately estimate baseline mean C_s to within 20% of the true mean, to a nominal depth of 0.3 m in the profile.
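The "at least 25 locations for +/-20% precision" recommendation above can be rationalised with the standard sample-size planning formula n = (z * CV / e)^2. The sketch below is illustrative: the coefficient of variation supplied in the usage line is an assumption of mine, not a figure reported in the study.

```python
import math

def n_for_relative_error(cv, rel_error=0.20, z=1.96):
    """Number of sampling locations needed to estimate a mean (e.g. soil
    carbon stock of one property unit) to within `rel_error` of the true
    mean at ~95% confidence, given coefficient of variation `cv`.
    Standard planning formula n = (z * cv / rel_error)^2."""
    return math.ceil((z * cv / rel_error) ** 2)

# A unit with an assumed cv of 0.5 needs about 25 locations for +/-20%
n = n_for_relative_error(0.5)
```

Note that a within-unit CV of 50% reproduces the paper's minimum of 25 locations; quieter units would need fewer, which is one motivation for stratifying by soil type and management first.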

Relevance:

20.00%

Publisher:

Abstract:

There is a large gap between the refined approaches used to characterise genotypes and the common use of location and season as a coarse surrogate for environmental characterisation of breeding trials. As a framework for breeding, the aim of this paper is to quantify the spatial and temporal patterns of thermal and water stress for field pea in Australia. We compiled a dataset for yield of the cv. Kaspa measured in 185 environments, and investigated the associations between yield and seasonal patterns of actual temperature and modelled water stress. Correlations between yield and temperature indicated two distinct stages. In the first stage, during crop establishment and canopy expansion before flowering, yield was positively associated with minimum temperature. Mean minimum temperatures below ~7 degrees C suggest that crops were under suboptimal temperature for both canopy expansion and radiation-use efficiency during a significant part of this early growth period. In the second stage, during critical reproductive phases, grain yield was negatively associated with maximum temperatures over 25 degrees C. Correlations between yield and the modelled water supply/demand ratio showed a consistent pattern with three phases: no correlation at early stages of the growth cycle, a progressive increase in the association that peaked as the crop approached the flowering window, and a progressive decline at later reproductive stages. Using long-term weather records (1957-2010) and modelled water stress for 104 locations, we identified three major patterns of water deficit nationwide. Environment type 1 (ET1) represents the most favourable condition, with no stress during most of the pre-flowering phase and gradual development of mild stress after flowering. Type 2 (ET2) is characterised by increasing water deficit between 400 degree-days before flowering and 200 degree-days after flowering, with rainfall that relieves stress late in the season.
Type 3 (ET3) represents the most stressful condition, with increasing water deficit between 400 degree-days before flowering and maturity. Across Australia, the frequency of occurrence was 24% for ET1, 32% for ET2 and 43% for ET3, highlighting the dominance of the most stressful pattern. Actual yield averaged 2.2 t/ha for ET1, 1.9 t/ha for ET2 and 1.4 t/ha for ET3, and the frequency of each pattern varied substantially among locations. Shifting from a nominal (i.e. location and season) to a quantitative (i.e. stress type) characterisation of environments could help improve the breeding efficiency of field pea in Australia.
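The three environment types above are defined by the shape of the seasonal water supply/demand trace around flowering. A minimal sketch of how a season might be binned is given below; the three summary features and all thresholds are my own illustrative assumptions, not the fitted rules from the paper.

```python
def classify_environment(pre_min, post_min, final):
    """Bin a season into one of three water-deficit patterns using three
    summary features of the modelled water supply/demand ratio
    (1 = no stress, 0 = maximal stress):
      pre_min  - minimum ratio in the ~400 degree-days before flowering
      post_min - minimum ratio after flowering
      final    - ratio late in the season, near maturity
    Thresholds are illustrative assumptions, not fitted values.
    ET1: little pre-flowering stress, mild stress developing later.
    ET2: stress around flowering, relieved by late-season rainfall.
    ET3: stress around flowering persisting through to maturity."""
    if pre_min > 0.8:
        return "ET1"
    if final > post_min + 0.2:  # late rainfall relieves the deficit
        return "ET2"
    return "ET3"
```

Classifying historical seasons this way, rather than by location name, is what lets the frequencies (24/32/43%) and mean yields per type be tabulated.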

Relevance:

20.00%

Publisher:

Abstract:

Top-predators have been reported to play an important role in structuring food webs and maintaining ecological processes for the benefit of biodiversity at lower trophic levels. This is thought to be achieved through their suppressive effects on sympatric mesopredators and prey. Great scientific and public interest surrounds the potential use of top-predators as biodiversity conservation tools, and it can often be difficult to separate what we think we know from what we really know about their ecological utility. Not all the claims made about the ecological roles of top-predators can be substantiated by current evidence. We review the methodology underpinning empirical data on the ecological roles of Australian dingoes (Canis lupus dingo and hybrids) to provide a comprehensive and objective benchmark for knowledge of the ecological roles of Australia's largest terrestrial predator. Given the wide variety of methodological flaws, sampling biases and experimental design constraints inherent in 38 of the 40 field studies we assessed, we demonstrate that there is presently unreliable and inconclusive evidence for the dingo's role as a biodiversity regulator. We also discuss the widespread (both taxonomically and geographically) and direct negative effects of dingoes on native fauna, and the few robust studies investigating their positive roles. In light of the highly variable and context-specific impacts of dingoes on faunal biodiversity and the inconclusive state of the literature, we strongly caution against the positive management of dingoes in the absence of a supporting evidence base for such action.

Relevance:

20.00%

Publisher:

Abstract:

Statistical studies of rainfed maize yields in the United States(1) and elsewhere(2) have indicated two clear features: a strong negative yield response to accumulation of temperatures above 30 degrees C (or extreme degree days (EDD)), and a relatively weak response to seasonal rainfall. Here we show that the process-based Agricultural Production Systems Simulator (APSIM) is able to reproduce both of these relationships in the Midwestern United States and provide insight into underlying mechanisms. The predominant effects of EDD in APSIM are associated with increased vapour pressure deficit, which contributes to water stress in two ways: by increasing demand for soil water to sustain a given rate of carbon assimilation, and by reducing future supply of soil water by raising transpiration rates. APSIM computes daily water stress as the ratio of water supply to demand, and during the critical month of July this ratio is three times more responsive to 2 degrees C warming than to a 20% precipitation reduction. The results suggest a relatively minor role for direct heat stress on reproductive organs at present temperatures in this region. Effects of elevated CO2 on transpiration efficiency should reduce yield sensitivity to EDD in the coming decades, but at most by 25%.
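The EDD covariate used in these statistical yield studies accumulates heat above a 30 degrees C threshold over the season. A minimal daily version is sketched below; published studies often interpolate temperatures within the day, which this simplification omits.

```python
def extreme_degree_days(tmax_series, threshold=30.0):
    """Sum of daily maximum temperature excess above `threshold`
    (degree C * days) over a season of daily maxima. A simplified
    daily version of the EDD covariate described in the abstract."""
    return sum(max(0.0, tmax - threshold) for tmax in tmax_series)

# Three days with maxima of 28, 31 and 35 C contribute 0 + 1 + 5 = 6 EDD
edd = extreme_degree_days([28.0, 31.0, 35.0])
```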

Relevance:

20.00%

Publisher:

Abstract:

Increasing resistance to phosphine (PH3) in insect pests, including the lesser grain borer (Rhyzopertha dominica), has become a critical issue, and development of effective and sustainable strategies to manage resistance is crucial. In practice, the same grain store may be fumigated multiple times, but usually for the same exposure period and concentration. Simulating a single fumigation allows us to look more closely at the effects of this standard treatment. We used an individual-based, two-locus model to investigate three key questions about the use of phosphine fumigant in relation to the development of PH3 resistance. First, which is more effective for insect control: a long exposure time with a low concentration, or a short exposure period with a high concentration? Our results showed that extending exposure duration is a much more efficient control tactic than increasing the phosphine concentration. Second, how long should the fumigation period be extended to deal with higher frequencies of resistant insects in the grain? Our results indicated that if the original frequency of resistant insects is increased n times, then the fumigation needs to be extended by, at most, n days to achieve the same level of insect control. Third, how does the presence of varying numbers of insects inside grain storages affect the effectiveness of phosphine fumigation? We found that, for a given fumigation, as the initial population number was increased, the final survival of resistant insects increased proportionally. To control initial populations of insects that were n times larger, it was necessary to increase the fumigation time by about n days. Our results indicate that, in a 2-gene mediated resistance where dilution of resistance gene frequencies through immigration of susceptibles has a greater effect, extending fumigation times to reduce survival of homozygous resistant insects will have a significant impact on delaying the development of resistance.
© 2012 Elsevier Ltd.
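The scaling questions in the abstract above (how much longer to fumigate when the resistant population is n times larger) can be illustrated with a much cruder model than the study's individual-based two-locus simulation: assume a constant fraction of insects dies each day. Both the model form and the default kill rate below are my assumptions; note the toy predicts log10(n) extra days at a 90% daily kill, which sits within the study's "at most n days" bound.

```python
import math

def extra_days(n_fold, daily_kill=0.9):
    """Extra fumigation days needed to bring an n-fold larger (or more
    resistant) population down to the same survivor count, in a toy
    model where a constant fraction `daily_kill` dies each day.
    Illustrative only; not an output of the study's model."""
    return math.log(n_fold) / -math.log(1.0 - daily_kill)

# At 90% daily kill, a 10-fold larger population costs ~1 extra day
d = extra_days(10)
```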

Relevance:

20.00%

Publisher:

Abstract:

More than 1200 wheat and 120 barley experiments conducted in Australia to examine yield responses to applied nitrogen (N) fertiliser are contained in a national database of field crops nutrient research (BFDC National Database). The yield responses are accompanied by various pre-plant soil test data to quantify plant-available N and other indicators of soil fertility status or mineralisable N. A web application (BFDC Interrogator), developed to access the database, enables construction of calibrations between relative crop yield ((Y0/Ymax) × 100) and N soil test value. In this paper we report the critical soil test values for 90% RY (CV90) and the associated critical ranges (CR90, defined as the 70% confidence interval around that CV90) derived from analysis of various subsets of these winter cereal experiments. Experimental programs were conducted throughout Australia’s main grain-production regions in different eras, starting from the 1960s in Queensland through to Victoria during the 2000s. Improved management practices adopted during the period were reflected in increasing potential yields with research era, increasing from an average Ymax of 2.2 t/ha in Queensland in the 1960s and 1970s, to 3.4 t/ha in South Australia (SA) in the 1980s, to 4.3 t/ha in New South Wales (NSW) in the 1990s, and 4.2 t/ha in Victoria in the 2000s. Various sampling depths (0.1–1.2 m) and methods of quantifying available N (nitrate-N or mineral-N) from pre-planting soil samples were used and provided useful guides to the need for supplementary N. The most regionally consistent relationships were established using nitrate-N (kg/ha) in the top 0.6 m of the soil profile, with regional and seasonal variation in CV90 largely accounted for through impacts on experimental Ymax. The CV90 for nitrate-N within the top 0.6 m of the soil profile for wheat crops increased from 36 to 110 kg nitrate-N/ha as Ymax increased over the range 1 to >5 t/ha.
Apparent variation in CV90 with seasonal moisture availability was entirely consistent with impacts on experimental Ymax. Further analyses of wheat trials with available grain protein (~45% of all experiments) established that grain yield and not grain N content was the major driver of crop N demand and CV90. Subsets of data explored the impact of crop management practices such as crop rotation or fallow length on both pre-planting profile mineral-N and CV90. Analyses showed that while management practices influenced profile mineral-N at planting and the likelihood and size of yield response to applied N fertiliser, they had no significant impact on CV90. A level of risk is involved with the use of pre-plant testing to determine the need for supplementary N application in all Australian dryland systems. In southern and western regions, where crop performance is based almost entirely on in-crop rainfall, this risk is offset by the management opportunity to split N applications during crop growth in response to changing crop yield potential. In northern cropping systems, where stored soil moisture at sowing is indicative of minimum yield potential, erratic winter rainfall increases uncertainty about actual yield potential as well as reducing the opportunity for effective in-season applications.
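A CV90 of the kind described above is read off a fitted response calibration at 90% relative yield. The sketch below uses a hypothetical Mitscherlich-type curve to show the mechanics; the parameters a and b are illustrative assumptions, not values fitted from the BFDC National Database.

```python
import math

def relative_yield(nitrate_n, a=100.0, b=0.025):
    """Hypothetical Mitscherlich-type calibration: relative yield (%)
    of the unfertilised crop as a function of pre-plant nitrate-N
    (kg/ha, 0-0.6 m). a (asymptote) and b (curvature) are illustrative."""
    return a * (1.0 - math.exp(-b * nitrate_n))

def cv90(a=100.0, b=0.025):
    """Critical soil test value at 90% relative yield: invert the curve."""
    return -math.log(1.0 - 90.0 / a) / b

critical = cv90()  # nitrate-N (kg/ha) giving 90% RY under these parameters
```

With these assumed parameters the critical value lands near the middle of the 36-110 kg nitrate-N/ha span the abstract reports across the Ymax range.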

Relevance:

20.00%

Publisher:

Abstract:

Soil testing is the most widely used tool to predict the need for fertiliser phosphorus (P) application to crops. This study examined factors affecting critical soil P concentrations and confidence intervals for wheat and barley grown in Australian soils by interrogating validated data from 1777 wheat and 150 barley field treatment series now held in the BFDC National Database. To narrow confidence intervals associated with estimated critical P concentrations, filters for yield, crop stress, or low pH were applied. Once treatment series with low yield (<1 t/ha), severe crop stress, or pHCaCl2 <4.3 were screened out, critical concentrations were relatively insensitive to wheat yield (>1 t/ha). There was a clear increase in critical P concentration from early trials when full tillage was common compared with those conducted in 1995–2011, which corresponds to a period of rapid shift towards adoption of minimum tillage. For wheat, critical Colwell-P concentrations associated with 90 or 95% of maximum yield varied among Australian Soil Classification (ASC) Orders and Sub-orders: Calcarosol, Chromosol, Kandosol, Sodosol, Tenosol and Vertosol. Soil type, based on ASC Orders and Sub-orders, produced critical Colwell-P concentrations at 90% of maximum relative yield from 15 mg/kg (Grey Vertosol) to 47 mg/kg (Supracalcic Calcarosols), with other soils having values in the range 19–27 mg/kg. Distinctive differences in critical P concentrations were evident among Sub-orders of Calcarosols, Chromosols, Sodosols, Tenosols, and Vertosols, possibly due to differences in soil properties related to P sorption. However, insufficient data were available to develop a relationship between P buffering index (PBI) and critical P concentration. In general, there was no evidence that critical concentrations for barley would be different from those for wheat on the same soils. 
Significant knowledge gaps that must be filled to improve the relevance and reliability of soil P testing for winter cereals were: the lack of data for oats; the paucity of treatment series reflecting current cropping practices, especially minimum tillage; and inadequate metadata on soil texture, pH, growing season rainfall, gravel content, and PBI. The critical concentrations determined illustrate the importance of recent experimental data and of soil type, but also provide examples of interrogation pathways into the BFDC National Database to extract locally relevant critical P concentrations for guiding P fertiliser decision-making in wheat and barley.
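The soil-type-specific critical values reported in the abstract above lend themselves to a simple lookup; the two named entries below are the figures quoted in the abstract, and soils not listed fall back to the reported 19-27 mg/kg range for "other soils".

```python
# Critical Colwell-P (mg/kg) at 90% of maximum relative yield, as
# reported in the abstract for the two extreme soil types.
CRITICAL_COLWELL_P = {
    "Grey Vertosol": 15,
    "Supracalcic Calcarosol": 47,
}

def critical_p(soil_type):
    """Look up the critical Colwell-P for an ASC soil type; for soils
    not listed, return the 19-27 mg/kg range reported for other soils."""
    return CRITICAL_COLWELL_P.get(soil_type, (19, 27))
```

In practice such a table would be keyed on ASC Sub-orders as well, since the abstract notes distinctive differences among Sub-orders.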

Relevance:

20.00%

Publisher:

Abstract:

A disease outbreak investigation was conducted in western Queensland to investigate a rare suspected outbreak of pyrrolizidine alkaloid (PA) toxicosis in horses. Thirty-five of 132 horses depastured on five properties on the Mitchell grass plains of western Queensland died in the first six months of 2010. Clinical and pathological findings were consistent with PA toxicosis. A local variety of Crotalaria medicaginea was the only hepatotoxic plant found growing on the affected properties. Pathology reports and the departure and arrival dates of two brood mares provided evidence of a pre-wet-season exposure period. All five affected properties experienced a very dry spring and early summer preceded by a large summer wet season. The outbreak was characterised as a point epidemic, with a sudden peak of deaths in March followed by mortalities steadily declining until the end of June. The estimated morbidity rate (serum GGT > 50 IU/L) was 76%. Average crude mortality was 27%, but was higher in young horses (67%) and brood mares (44%). Logistic regression analysis showed that young horses, brood mares and those grazing denuded pastures in December were most strongly associated with dying, whereas those fed hay and/or grain-based supplements were less likely to die. This is the first detailed study of an outbreak of PA toxicosis in central western Queensland and the first to provide evidence that environmental determinants were associated with mortality, that the critical exposure period was towards the end of the dry season, that supplementary feeding is protective, and that denuded pastures and the horses' physiological protein requirement are risk factors.
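The logistic-regression effects described in the abstract above are conventionally summarised as odds ratios. As a quick illustration, the crude proportions quoted (67% mortality in young horses vs 27% overall) can be compared this way; this is a back-of-envelope calculation on the abstract's figures, not the study's adjusted estimates.

```python
def odds_ratio(p_group, p_reference):
    """Odds ratio for an outcome (here, mortality) in one group relative
    to another: the effect measure behind logistic-regression coefficients."""
    odds = lambda p: p / (1.0 - p)
    return odds(p_group) / odds(p_reference)

# Young horses (67%) vs the overall crude mortality (27%)
or_young = odds_ratio(0.67, 0.27)  # roughly 5.5-fold higher odds of death
```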

Relevance:

20.00%

Publisher:

Abstract:

Probiotic supplements are single- or mixed-strain cultures of live microorganisms that benefit the host by improving the properties of the indigenous microflora (Seo et al 2010). In a pilot study at the University of Queensland, Norton et al (2008) found that Bacillus amyloliquefaciens strain H57 (H57), primarily investigated as an inoculum for making high-quality hay, improved feed intake and nitrogen utilisation over several weeks in pregnant ewes. The purpose of the present study was to further test the potential of H57: to show that it survives the steam-pelleting process, and that it improves the performance of ewes fed pellets based on an agro-industrial by-product with a reputation for poor palatability, palm kernel meal (PKM) (McNeill 2013). Thirty-two first-parity White Dorper ewes (day 37 of pregnancy, mean liveweight = 47.3 kg, mean age = 15 months) were inducted into individual pens in the animal house at the University of Queensland, Gatton. They were adjusted onto PKM-based pellets (g/kg dry matter (DM): PKM, 408; sorghum, 430; chick pea hulls, 103; minerals and vitamins; crude protein, 128; ME, 11.1 MJ/kg DM) until day 89 of pregnancy, and thereafter fed a predominantly pelleted diet with or without H57 spores (10^9 colony forming units (cfu)/kg pellet, as fed), plus 100 g/ewe/day oaten chaff, until day 7 of lactation. From day 7 to 20 of lactation the pelleted component of the diet was steadily reduced and replaced by a 50:50 mix of lucerne:oaten chaff, fed ad libitum, plus 100 g/ewe/day of ground sorghum grain with or without H57 (10^9 cfu/ewe/day). The period of adjustment in pregnancy (day 37-89) extended beyond expectations owing to some evidence of mild ruminal acidosis after initially high intakes that were followed by low intakes.
During that time the diet was modified, in an attempt to improve palatability, by the addition of oaten chaff and the removal of an acidifying agent (NH4Cl) that had been added initially to reduce the risk of urinary calculi. Eight ewes were removed due to inappetence, leaving 24 ewes to start the trial at day 90 of pregnancy. From day 90 of pregnancy until day 63 of lactation, liveweights of the ewes and their lambs were determined weekly and at parturition. Feed intakes of the ewes were determined weekly. Once lambing began, 1 ewe was removed because it gave birth to twin lambs (whereas the rest gave birth to single lambs), 4 due to the loss of their lambs (2 to dystocia), and 1 due to copper toxicity. The PKM pellets were suspected to be the cause of the copper toxicity and so were removed in early lactation. Hence, the final statistical analysis using STATISTICA 8 (repeated-measures ANOVA for feed intake; one-way ANOVA for liveweight change and birth weight) was completed on 23 ewes for the pregnancy period (n = 11 fed H57; n = 12 control), and 18 ewes or lambs for the lactation period (n = 8 fed H57; n = 10 control). From day 90 of pregnancy until parturition the H57-supplemented ewes ate 17% more DM (g/day: 1041 vs 889, sed = 42.4, P = 0.04) and gained more liveweight (g/day: 193 vs 24.0, sed = 25.4, P = 0.0002), but produced lambs with a similar birthweight (kg: 4.18 vs 3.99, sed = 0.19, P = 0.54). Over the 63 days of lactation the H57 ewes ate similar amounts of DM but grew more slowly than the control ewes (g/day: 1.5 vs 97.0, sed = 21.7, P = 0.012). The lambs of the H57 ewes grew faster than those of the control ewes for the first 21 days of lactation (g/day: 356 vs 265, sed = 16.5, P = 0.006). These data support the findings of Norton et al (2008) and Kritas et al (2006) that certain Bacillus spp. supplements can improve the performance of pregnant and lactating ewes.
In the current study we particularly highlighted the capacity of H57 to stimulate immature ewes to continue to grow maternal tissue through pregnancy, possibly through an enhanced appetite, which then appeared to stimulate a greater capacity to partition nutrients to their lambs through milk, at least for the first few weeks of lactation, a critical time for optimising lamb survival. To conclude, H57 can survive the steam-pelleting process and improve feed intake and maternal liveweight gain in late pregnancy, and performance in early lactation, of first-parity ewes fed a diet based on PKM.
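The trial results above are reported as group means with a standard error of the difference (sed), from which the underlying test statistic can be recovered as t = (mean difference)/sed. A minimal sketch, using the reported pregnancy feed-intake figures as the worked example:

```python
def t_from_sed(mean_a, mean_b, sed):
    """Approximate t statistic for a two-group comparison reported as
    group means plus a standard error of the difference (sed), the
    reporting format used for the ewe-trial results above."""
    return (mean_a - mean_b) / sed

# Pregnancy DM intake, H57 vs control (g/day): 1041 vs 889, sed = 42.4
t_intake = t_from_sed(1041.0, 889.0, 42.4)  # about 3.6
```

A t of this size on roughly 20 degrees of freedom is comfortably significant, consistent with the reported P = 0.04 for the repeated-measures analysis.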

Relevance:

20.00%

Publisher:

Abstract:

Strong statistical evidence was found for differences in tolerance to natural infections of Tobacco streak virus (TSV) in sunflower hybrids. Data from 470 plots involving 23 different sunflower hybrids tested in multiple trials over 5 years in Australia were analysed. Using a Bayesian hierarchical logistic regression model for the analysis provided: (i) a rigorous method for investigating the relative effects of hybrid, seasonal rainfall and proximity to inoculum source on the incidence of severe TSV disease; (ii) a natural method for estimating the probability distributions of disease incidence in different hybrids under historical rainfall conditions; and (iii) a method for undertaking all pairwise comparisons of disease incidence between hybrids while controlling the familywise error rate without any drastic reduction in statistical power. The tolerance identified in field trials was effective against the main TSV strain associated with disease outbreaks, TSV-parthenium. Glasshouse tests indicate that this tolerance is also effective against the other TSV strain found in central Queensland, TSV-crownbeard. The use of tolerant germplasm is critical to minimise the risk of TSV epidemics in sunflower in this region. We found strong statistical evidence that rainfall during the early growing months of March and April had a negative effect on the incidence of severe infection, with greatly reduced disease incidence in years that had high rainfall during this period.
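The familywise error-rate problem mentioned above is driven by the number of pairwise hybrid comparisons, which grows quadratically: with the 23 hybrids analysed there are C(23, 2) = 253 comparisons, which is why naive per-comparison corrections (e.g. Bonferroni) would drastically cut power and a hierarchical model is attractive. The count itself:

```python
from math import comb

def n_pairwise(n_hybrids):
    """Number of pairwise comparisons a familywise error-rate control
    must cover when every hybrid is compared with every other."""
    return comb(n_hybrids, 2)

n = n_pairwise(23)  # 253 pairwise hybrid comparisons
```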