Abstract:
A restricted maximum likelihood analysis applied to an animal model showed no significant differences (P > 0.05) in pH value of the longissimus dorsi measured at 24 h post-mortem (pH24) between high and low lines of Large White pigs selected over 4 years for post-weaning growth rate on restricted feeding. Genetic and phenotypic correlations between pH24 and production and carcass traits were estimated using all performance testing records combined with the pH24 measurements (5.05-7.02) on slaughtered animals. The estimate of heritability for pH24 was moderate (0.29 ± 0.18). Genetic correlations between pH24 and production or carcass composition traits, except for ultrasonic backfat (UBF), were not significantly different from zero. UBF had a moderate, positive genetic correlation with pH24 (0.24 ± 0.33). These genetic correlation estimates indicate that selection for increased growth rate on restricted feeding is likely to result in only limited changes in pH24 and pork quality, since the selection places little emphasis on reduced fatness.
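For readers unfamiliar with the method, the single-trait animal model underlying such a REML analysis has the standard form shown below (a textbook formulation, not an equation reproduced from the paper):

```latex
\mathbf{y} = \mathbf{Xb} + \mathbf{Za} + \mathbf{e}, \qquad
\mathbf{a} \sim N(\mathbf{0},\, \mathbf{A}\sigma^2_a), \quad
\mathbf{e} \sim N(\mathbf{0},\, \mathbf{I}\sigma^2_e), \qquad
h^2 = \frac{\sigma^2_a}{\sigma^2_a + \sigma^2_e}
```

Here y holds the pH24 records, b the fixed effects, a the additive genetic effects, and A is the pedigree-derived numerator relationship matrix. The reported h² of 0.29 therefore implies that additive genetic variance accounts for roughly 29% of the phenotypic variance in pH24.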
Abstract:
The objective of this study was to examine genetic changes in reproduction traits in sows (total number born (TNB), number born alive (NBA), average piglet birth weight (ABW), number of piglets weaned (NW), body weight prior to mating (MW), gestation length (GL) and daily food intake during lactation (DFI)) in lines of Large White pigs divergently selected over 4 years for high and low post-weaning growth rate on a restricted ration. Heritabilities and repeatabilities of the reproduction traits were also determined. The analyses were carried out on 913 litter records using the average information restricted maximum likelihood (AI-REML) method applied to single-trait animal models. Estimates of heritability for most traits were small, except for ABW (0.33) and MW (0.35). Estimates of repeatability were slightly higher than those of heritability for TNB, NBA and NW, but they were almost identical for ABW, MW, GL and DFI. After 4 years of selection, the high growth line sows had significantly heavier body weight prior to mating and produced significantly more piglets born alive with heavier average birth weight than the low line sows. There were, however, no statistical differences between the selected lines in TNB or NW. The lower food intake of high relative to low line sows during lactation was not significant, indicating that the daily food intake differences found between grower pigs in the high and low lines (2.71 v. 2.76 kg/day, s.e.d. 0.024) on ad libitum feeding were not fully expressed in lactating sows. It is concluded that selection for growth rate on the restricted ration resulted in beneficial effects on important measures of reproductive performance of the sows.
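Heritability and repeatability as contrasted here are the usual variance ratios once a permanent environmental effect is added to the animal model (again a textbook definition, not quoted from the paper):

```latex
h^2 = \frac{\sigma^2_a}{\sigma^2_a + \sigma^2_{pe} + \sigma^2_e},
\qquad
r = \frac{\sigma^2_a + \sigma^2_{pe}}{\sigma^2_a + \sigma^2_{pe} + \sigma^2_e}
```

Since r ≥ h² by construction, repeatabilities slightly above heritabilities for TNB, NBA and NW point to a small permanent environmental variance, while near-identical values for ABW, MW, GL and DFI imply a permanent environmental variance close to zero.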
Abstract:
Arbuscular mycorrhizal (AM) fungi, commonly found in long-term cane-growing fields in northern Queensland, are linked with both negative and positive growth responses by sugarcane (Saccharum spp.), depending on P supply. A glasshouse trial was established to examine whether AM density might also have an important influence on these growth responses. Mycorrhizal spores (Glomus clarum), isolated from a long-term cane block in northern Queensland, were introduced into a pasteurised low-P cane soil at 5 densities (0, 0.06, 0.25, 1, 4 spores/g soil) and with 4 P treatments (0, 8.2, 25, and 47 mg/kg). At 83 days after planting, sugarcane tops responded positively to P fertiliser, although responses attributable to spore density were rarely observed. In one case, addition of 4 spores/g led to a 53% yield response over those without AM at 8.2 mg P/kg, or a relative benefit of 17 mg P/kg. Root colonisation was reduced for plants with nil or 47 mg P/kg. For those without AM, P concentration in the topmost visible dewlap (TVD) leaf increased significantly with fertiliser P (0.07 v. 0.15%). However, P concentration increased further with the presence of AM spores. Irrespective of AM, the critical P concentration in the TVD leaf was 0.18%. This study confirms earlier reports that sugarcane is poorly responsive to AM. Spore density, up to 4 spores/g soil, appears unable to influence this responsiveness, either positively or negatively. Attempts to gain P benefits by increasing AM density through rotation seem unlikely to lead to yield increases by sugarcane. Conversely, sugarcane grown in fields with high spore densities and high plant-available P, such as long-term cane-growing soils, is unlikely to suffer a yield reduction from mycorrhizal fungi.
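The "relative benefit of 17 mg P/kg" is most naturally read as a fertiliser-equivalent: adding 4 spores/g at the 8.2 mg P/kg rate produced a yield comparable to that of AM-free plants receiving roughly

```latex
8.2 + 17 \approx 25\ \text{mg P/kg}
```

i.e. the next fertiliser treatment in the trial. This reading is our inference from the reported treatment levels, not an explicit statement in the abstract.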
Abstract:
Winter cereal cropping is marginal in south-west Queensland because of low and variable rainfall and declining soil fertility. Increasing the soil water storage and the efficiency of water and nitrogen (N) use is essential for sustainable cereal production. The effect of zero tillage and N fertiliser application on these factors was evaluated in wheat and barley from 1996 to 2001 on a grey Vertosol. Annual rainfall was above average in 1996, 1997, 1998 and 1999 and below average in 2000 and 2001. Due to drought, no crop was grown in the 2000 winter cropping season. Zero tillage improved fallow soil water storage by a mean value of 20 mm over 4 years, compared with conventional tillage. However, mean grain yield and gross margin of wheat were similar under conventional and zero tillage. Wheat grain yield and/or grain protein increased with N fertiliser application in all years, resulting in an increase in mean gross margin over 5 years from $86/ha, with no N fertiliser applied, to $250/ha, with N applied to target ≥13% grain protein. A similar increase in gross margin occurred in barley where N fertiliser was applied to target malting grade. The highest N fertiliser application rate in wheat resulted in a residual benefit to soil N supply for the following crop. This study has shown that profitable responses to N fertiliser addition in wheat and barley can be obtained on long-term cultivated Vertosols in south-west Queensland when soil water reserves at sowing are at least 60% of plant available water capacity, or rainfall during the growing season is above average. An integrative benchmark for improved N fertiliser management appears to be a gross margin per unit of crop water use of ~$1/ha.mm. Greater fallow soil water storage or crop water use efficiency under zero tillage has the potential to improve winter cereal production in drier growing seasons than experienced during the period of this study.
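The ~$1/ha.mm benchmark is simply gross margin divided by seasonal crop water use. As an illustrative calculation (the 250 mm water-use figure is hypothetical, chosen to match the reported margin):

```latex
\frac{\text{gross margin}}{\text{water use}} = \frac{\$250/\text{ha}}{250\ \text{mm}} = \$1/\text{ha.mm}
```

so a crop using 250 mm over the season would need a gross margin of about $250/ha, the figure achieved when N was applied to target ≥13% grain protein, to meet the benchmark.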
Abstract:
Negative potassium (K) balances in all broadacre grain cropping systems in northern Australia are resulting in a decline in the plant-available reserves of K and necessitating a closer examination of strategies to detect and respond to developing K deficiency in clay soils. Grain growers on the Red Ferrosol soils have increasingly encountered K deficiency over the last 10 years due to lower available K reserves in these soils in their native condition. However, the problem is now increasingly evident on the medium-heavy clay soils (Black and Grey Vertosols) and is made more complicated by the widespread adoption of direct drill cropping systems and the resulting strong stratification of available K reserves in the top 0.05-0.1 m of the soil profile. This paper reports glasshouse studies examining the fate of applied K fertiliser in key cropping soils of the inland Burnett region of south-east Queensland, and uses the resultant understanding of K dynamics to interpret results of field trials assessing the effectiveness of K application strategies in terms of K availability to crop plants. At similar concentrations of exchangeable K (K-exch), soil solution K concentrations and activity of K in the soil solution (AR(K)) varied by 6-7-fold between soil types. When K-exch arising from different rates of fertiliser application was expressed as a percentage of the effective cation exchange capacity (i.e. K saturation), there was evidence of greater selective adsorption of K on the exchange complex of Red Ferrosols than Black and Grey Vertosols or Brown Dermosols. Both soil solution K and AR(K) were much less responsive to increasing K-exch in the Black Vertosols; this is indicative of these soils having a high K buffer capacity (KBC). These contrasting properties have implications for the rate of diffusive supply of K to plant roots and the likely impact of K application strategies (banding v. broadcast and incorporation) on plant K uptake. Field studies investigating K application strategies (banding v. broadcasting) and the interaction with the degree of soil disturbance/mixing of different soil types are discussed in relation to K dynamics derived from glasshouse studies. Greater propensity to accumulate luxury K in crop biomass was observed in a Brown Ferrosol with a KBC lower than that of a Black Vertosol, consistent with more efficient diffusive supply to plant roots in the Ferrosol. This luxury K uptake, when combined with crops exhibiting low proportional removal of K in the harvested product (i.e. the low K harvest index of coarse grains and winter cereals) and residue retention, can lead to rapid re-development of stratified K profiles. There was clear evidence that some incorporation of K fertiliser into soil was required to facilitate root access and crop uptake, although there was no evidence of a need to incorporate K fertiliser any deeper than achieved by conventional disc tillage (i.e. 0.1-0.15 m). Recovery of fertiliser K applied in deep (0.25-0.3 m) bands in combination with N and P to facilitate root proliferation was quite poor in Red Ferrosols and Grey or Black Vertosols with moderate effective cation exchange capacity (ECEC, 25-35 cmol(+)/kg), was reasonable but not enough to overcome K deficiency in a Brown Dermosol (ECEC 11 cmol(+)/kg), but was quite good on a Black Vertosol (ECEC 50-60 cmol(+)/kg).
Collectively, these results suggest that frequent small applications of K fertiliser, preferably with some soil mixing, are an effective fertiliser application strategy on lighter clay soils with low KBC and an effective diffusive supply mechanism. Alternatively, concentrated K bands and enhanced root proliferation around them may be a more effective strategy in Vertosol soils with high KBC and limited diffusive supply. Further studies to assess this hypothesis are needed.
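For reference, the K saturation measure used above is the exchangeable-K share of the effective cation exchange capacity (a standard definition, not an equation from the paper):

```latex
\text{K saturation (\%)} = \frac{K_{\text{exch}}}{ECEC} \times 100
```

with both terms in cmol(+)/kg. The same K-exch therefore represents a much smaller saturation on a Black Vertosol of ECEC 50-60 cmol(+)/kg than on a Brown Dermosol of ECEC 11 cmol(+)/kg, which is part of why soil solution K responded so differently between soil types.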
Abstract:
Rainfall variability is a challenge to sustainable and profitable cattle production in northern Australia. Strategies recommended to manage for rainfall variability, like light or variable stocking, are not widely adopted. This is due partly to the perception that sustainability and profitability are incompatible. A large, long-term grazing trial was initiated in 1997 in north Queensland, Australia, to test the effect of different grazing strategies on cattle production. These strategies are: (i) constant light stocking (LSR) at long-term carrying capacity (LTCC); (ii) constant heavy stocking (HSR) at twice LTCC; (iii) rotational wet-season spelling (R/Spell) at 1.5 LTCC; (iv) variable stocking (VAR), with stocking rates adjusted in May based on available pasture; and (v) a Southern Oscillation Index (SOI) variable strategy, with stocking rates adjusted in November, based on available pasture and SOI seasonal forecasts. Animal performance varied markedly over the 10 years for which data are presented, due to pronounced differences in rainfall and pasture availability. Nonetheless, lighter stocking at or about LTCC consistently gave the best individual liveweight gain (LWG), condition score and skeletal growth; mean LWG per annum was thus highest in the LSR (113 kg), intermediate in the R/Spell (104 kg) and lowest in the HSR (86 kg). Mean LWG was 106 kg in the VAR and 103 kg in the SOI but, in all years, the relative performance of these strategies was dependent upon the stocking rate applied. After 2 years on the trial, steers from lightly stocked strategies were 60-100 kg heavier and received appreciable carcass price premiums at the meatworks compared with those under heavy stocking. In contrast, LWG per unit area was greatest at stocking rates of about twice LTCC; mean LWG/ha was thus greatest in the HSR (21 kg/ha), but this strategy required drought feeding in four of the 10 years and was unsustainable. Although LWG/ha was lower in the LSR (mean 14 kg/ha), or in strategies that reduced stocking rates in dry years like the VAR (mean 18 kg/ha) and SOI (mean 17 kg/ha), these strategies did not require drought feeding and appeared sustainable. The R/Spell strategy (mean 16 kg/ha) was compromised by an ill-timed fire, but also performed satisfactorily. The present results provide important evidence challenging the assumption that sustainable management in a variable environment is unprofitable. Further research is required to fully quantify the long-term effects of these strategies on land condition and profitability and to extrapolate the results to breeder performance at the property level.
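As a consistency check on the per-head and per-hectare figures, dividing LWG/ha by LWG/head recovers the implied stocking densities (a back-calculation of ours, not numbers reported in the abstract):

```latex
\text{HSR: } \frac{21\ \text{kg/ha}}{86\ \text{kg/head}} \approx 0.24\ \text{head/ha},
\qquad
\text{LSR: } \frac{14\ \text{kg/ha}}{113\ \text{kg/head}} \approx 0.12\ \text{head/ha}
```

a ratio of about 2:1, matching the design of constant heavy stocking at twice the long-term carrying capacity.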
Abstract:
Grazing is a major land use in Australia's rangelands. The 'safe' livestock carrying capacity (LCC) required to maintain resource condition is strongly dependent on climate. We reviewed: the approaches for quantifying LCC; current trends in climate and their effect on components of the grazing system; implications of the 'best estimates' of climate change projections for LCC; the agreement and disagreement between the current trends and projections; and the adequacy of current models of forage production in simulating the impact of climate change. We report the results of a sensitivity study of climate change impacts on forage production across the rangelands, and we discuss the more general issues facing grazing enterprises associated with climate change, such as 'known uncertainties' and adaptation responses (e.g. use of climate risk assessment). We found that the method of quantifying LCC from a combination of estimates (simulations) of long-term (>30 years) forage production and successful grazier experience has been well tested across northern Australian rangelands with different climatic regions. This methodology provides a sound base for the assessment of climate change impacts, even though there are many identified gaps in knowledge. The evaluation of current trends indicated substantial differences in the trends of annual rainfall (and simulated forage production) across Australian rangelands, with general increases in most western Australian rangelands (including northern regions of the Northern Territory) and decreases in eastern Australian rangelands and south-western Western Australia. Some of the projected changes in rainfall and temperature appear small compared with year-to-year variability. Nevertheless, the impacts on rangeland production systems are expected to be important in terms of required managerial and enterprise adaptations. Some important aspects of climate systems science remain unresolved, and we suggest that a risk-averse approach to rangeland management, based on the 'best estimate' projections, in combination with appropriate responses to short-term (1-5 years) climate variability, would reduce the risk of resource degradation. Climate change projections - including changes in rainfall, temperature, carbon dioxide and other climatic variables - if realised, are likely to affect forage and animal production, and ecosystem functioning. The major known uncertainties in quantifying climate change impacts are: (i) carbon dioxide effects on forage production, quality, nutrient cycling and competition between life forms (e.g. grass, shrubs and trees); and (ii) the future role of woody plants, including effects of fire, climatic extremes and management for carbon storage. In a simple example of simulating climate change impacts on forage production, we found that increased temperature (3°C) was likely to result in a decrease in forage production for most rangeland locations (e.g. -21% calculated as an unweighted average across 90 locations). The increase in temperature exacerbated or reduced the effects of a 10% decrease/increase in rainfall respectively (-33% or -9%). Estimates of the beneficial effects of increased CO2 (from 350 to 650 ppm) on forage production and water use efficiency indicated enhanced forage production (+26%). The increase was approximately equivalent to the decline in forage production associated with a 3°C temperature increase.
The large magnitude of these opposing effects emphasised the importance of the uncertainties in quantifying the impacts of these components of climate change. We anticipate decreases in LCC given that the 'best estimate' of climate change across the rangelands is for a decline (or little change) in rainfall and an increase in temperature. As a consequence, we suggest that public policy have regard for the implications for livestock enterprises, regional communities, potential resource damage, animal welfare and human distress. However, the capability to quantify these warnings is yet to be developed, and this important task remains a challenge for rangeland and climate systems science.
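The statement that the CO2 benefit roughly offsets the 3°C temperature penalty can be checked by treating the reported average sensitivities as multiplicative factors (an illustrative simplification; the underlying simulations do not necessarily combine effects this way):

```latex
(1 - 0.21)\times(1 + 0.26) = 0.79 \times 1.26 \approx 1.0
```

which is why the abstract stresses that uncertainty in either component dominates the net projection.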
Abstract:
BACKGROUND: Wheat can be stored for many months before being fumigated with phosphine to kill insects, so a study was undertaken to investigate whether the sorptive capacity of wheat changes as it ages. Wheat was stored at 15 or 25°C and 55% RH for up to 5.5 months, and samples were fumigated at intervals to determine sorption. Sealed glass flasks (95% full) were injected with 1.5 mg L-1 of phosphine based on flask volume. Concentrations were monitored for 11 days beginning 2 h after injection. Some wheat samples were refumigated after a period of ventilation. Several fumigations of wheat were conducted to determine the pattern of sorption during the first 24 h. RESULTS: Phosphine concentration declined exponentially with time from 2 h after injection. Rate of sorption decreased with time spent in storage at either 15 or 25°C and 55% RH. Rate of sorption tended to be lower when wheat was refumigated, but this could be explained by time in storage rather than by refumigation per se. The data from the 24 h fumigations did not fit a simple exponential decay equation. Instead, there was a rapid decline in the first hour, with phosphine concentration falling much more slowly thereafter. CONCLUSIONS: The results have implications for phosphine fumigation of insects in stored wheat. Both the time wheat has spent in storage and the temperature at which it has been stored are factors that must be considered when trying to understand the impact of sorption on phosphine concentrations in commercial fumigations.
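A minimal sketch of the kind of first-order decay fit implied by "declined exponentially with time", using scipy; all concentration values below are hypothetical placeholders, not data from the study:

```python
# Sketch: fit C(t) = C0 * exp(-k * t) to phosphine concentration readings.
# The data points below are hypothetical placeholders, not values from the study.
import numpy as np
from scipy.optimize import curve_fit

def decay(t, c0, k):
    """Simple first-order sorption model."""
    return c0 * np.exp(-k * t)

t_days = np.array([0.083, 1, 2, 4, 7, 11])              # from 2 h after injection
conc = np.array([1.45, 1.30, 1.18, 0.98, 0.75, 0.52])   # mg/L, hypothetical

(c0, k), _ = curve_fit(decay, t_days, conc, p0=(1.5, 0.1))
print(f"C0 = {c0:.2f} mg/L, sorption rate k = {k:.3f} per day")
# A poor fit over the first 24 h (rapid initial drop, then a much slower
# decline) would indicate the biphasic sorption pattern the study reports.
```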
Abstract:
Understanding the effects of different types and quality of data on bioclimatic modelling predictions is vital to ascertaining the value of existing models, and to improving future models. Bioclimatic models were constructed using the CLIMEX program, using different data types – seasonal dynamics, geographic (overseas) distribution, and a combination of the two – for two biological control agents for the major weed Lantana camara L. in Australia. The models for one agent, Teleonemia scrupulosa Stål (Hemiptera: Tingidae), were based on a higher quality and quantity of data than the models for the other agent, Octotoma scabripennis Guérin-Méneville (Coleoptera: Chrysomelidae). Predictions of the geographic distribution for Australia showed that T. scrupulosa models exhibited greater accuracy, with a progressive improvement from seasonal dynamics data, to the model based on overseas distribution, and finally the model combining the two data types. In contrast, O. scabripennis models were of low accuracy, and showed no clear trends across the various model types. These case studies demonstrate the importance of high-quality data for developing models, and of supplementing distributional data with species seasonal dynamics data wherever possible. Seasonal dynamics data allow the modeller to focus on the species' response to climatic trends, while distributional data enable easier fitting of stress parameters by restricting the species envelope to the described distribution. It is apparent that CLIMEX models based on low-quality seasonal dynamics data, together with a small quantity of distributional data, are of minimal value in predicting the spatial extent of species distribution.
Abstract:
Management of the commercial harvest of kangaroos relies on quotas set annually as a proportion of regular estimates of population size. Surveys to generate these estimates are expensive and, in the larger states, logistically difficult; a cheaper alternative is desirable. Rainfall is a disappointingly poor predictor of kangaroo rate of increase in many areas, but harvest statistics (sex ratio, carcass weight, skin size and animals shot per unit time) potentially offer cost-effective indirect monitoring of population abundance (and therefore trend) and status (i.e. under- or over-harvest). Furthermore, because harvest data are collected continuously and throughout the harvested areas, they offer the promise of more intensive and more representative coverage of harvest areas than aerial surveys do. To be useful, harvest statistics would need to have a close and known relationship with either population size or harvest rate. We assessed this using long-term (11-22 years) data for three kangaroo species (Macropus rufus, M. giganteus and M. fuliginosus) and common wallaroos (M. robustus) across South Australia, New South Wales and Queensland. Regional variation in kangaroo body size, population composition, shooter efficiency and selectivity required separate analyses in different regions. Two approaches were taken. First, monthly harvest statistics were modelled as a function of a number of explanatory variables, including kangaroo density, harvest rate and rainfall. Second, density and harvest rate were modelled as a function of harvest statistics. Both approaches incorporated a correlated error structure. Many but not all regions had relationships with sufficient precision to be useful for indirect monitoring. However, there was no single relationship that could be applied across an entire state or across species. Combined with rainfall-driven population models and applied at a regional level, these relationships could be used to reduce the frequency of aerial surveys without compromising decisions about harvest management.
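One plausible way to implement "modelled as a function of explanatory variables" with "a correlated error structure" is a regression with AR(1) errors. The sketch below uses statsmodels' GLSAR on synthetic data; the variable names, coefficients and data are hypothetical, and the paper's actual model specification is not reproduced here:

```python
# Sketch: regress a monthly harvest statistic (e.g. mean carcass weight)
# on kangaroo density, harvest rate and rainfall, with AR(1) errors.
# All data below are synthetic placeholders, not values from the study.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 120                                   # ten years of monthly records
density = rng.uniform(5, 25, n)           # kangaroos per km^2 (hypothetical)
harvest_rate = rng.uniform(0.05, 0.20, n) # proportion harvested (hypothetical)
rainfall = rng.gamma(2.0, 20.0, n)        # mm per month (hypothetical)

# Build a synthetic response with autocorrelated (AR(1)) noise.
eps = np.zeros(n)
for i in range(1, n):
    eps[i] = 0.6 * eps[i - 1] + rng.normal(0.0, 1.0)
carcass_wt = 20 + 0.3 * density - 15 * harvest_rate + 0.01 * rainfall + eps

X = sm.add_constant(np.column_stack([density, harvest_rate, rainfall]))
model = sm.GLSAR(carcass_wt, X, rho=1)    # rho=1 -> AR(1) error structure
results = model.iterative_fit(maxiter=10) # alternate OLS fit and rho update
print(results.params)                      # regression coefficients
print(model.rho)                           # estimated error autocorrelation
```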
Abstract:
Since their release over 100 years ago, camels have spread across central Australia and increased in number. Increasingly, they are being seen as a pest, with observed impacts from overgrazing and damage to infrastructure such as fences. Irregular aerial surveys since 1983 and an interview-based survey in 1966 suggest that camels have been increasing at close to their maximum rate. A comparison of three models of population growth fitted to these, albeit limited, data suggests that the Northern Territory population has indeed been growing at an annual exponential rate of r = 0.074, or 8% per year, with little evidence of a density-dependent brake. A stage-structured model using life history data from a central Australian camel population suggests that this rate approximates the theoretical maximum. Elasticity analysis indicates that adult survival is by far the biggest influence on rate of increase and that a 9% reduction in survival from 96% is needed to stop the population growing. In contrast, at least 70% of mature females need to be sterilised to have a similar effect. In a benign environment, a population of large mammals such as camels is expected to grow exponentially until close to carrying capacity. This will frustrate control programs, because the longer culling or harvesting effort is delayed, the more animals will need to be removed each year to achieve zero growth. A population projection for 2008 suggests ~10 500 animals need to be harvested across the Northern Territory. Current harvests are well short of this. The ability of commercial harvesting to control camel populations in central Australia will depend on the value of animals, access to animals and the presence of alternative species to harvest when camels are at low density.
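The arithmetic behind the "ever-increasing number" point: with exponential growth at r = 0.074, holding a population of size N at zero growth requires removing the annual increment (a standard calculation; N is left symbolic because the abstract does not state the population size):

```latex
N(t) = N_0\, e^{rt}, \qquad \text{annual removals for zero growth} = N\left(e^{r} - 1\right) \approx 0.077\,N
```

so the required harvest scales with N and grows each year that removals fall short. On this reading, the ~10 500 animals quoted for 2008 would correspond to a population of very roughly 10 500/0.077 ≈ 135 000, a back-of-envelope inference rather than a figure from the abstract.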
Abstract:
Interest in cashew production in Australia has been stimulated by domestic and export market opportunities and suitability of large areas of tropical Australia. Economic models indicate that cashew production is profitable at 2.8 t ha-1 nut-in-shell (NIS). Balanced plant nutrition is essential to achieve economic yields in Australia, with nitrogen (N) of particular importance because of its capacity to modify growth, affect nut yield and cause environmental degradation through soil acidification and off-site contamination. The study on a commercial cashew plantation at Dimbulah, Australia, investigated the effect of N rate and timing on cashew growth, nut production, N leaching and soil chemical properties over five growth cycles (1995-1999). Nitrogen was applied during the main periods of vegetative (December-April) and reproductive (June-October) growth. Commercial NIS yields (up to 4.4 t ha-1 from individual trees) that exceeded the economic threshold of 2.8 t ha-1 were achieved. The yield response was mainly determined by canopy size, as mean nut weight, panicle density and nuts per panicle were largely unaffected by N treatments. Nitrogen application confined to the main period of vegetative growth (December-April) produced a seasonal growth pattern that corresponded most consistently with highest NIS yield. This N timing also reduced late season flowering and undesirable post-November nut drop. Higher yields were not produced at N rates greater than 17 g m-2 of canopy surface area (equating to 210 kg N ha-1 for mature-size trees). High yields were attained when N concentrations in Mveg leaves in May-June were about 2%, but this assessment occurs at a time when it is not feasible to correct N deficiency. The Mflor leaf of the preceding November, used in conjunction with the Mveg leaf, was proposed as a diagnostic tool to guide N rate decisions. Leaching of nitrate-N and acidification of the soil profile were recorded to 0.9 m. This is an environmental and sustainability hazard, and demonstrates that improved methods of N management are required.
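The two N rates quoted are mutually consistent if mature trees carry on the order of 12 000 m2 of canopy surface per hectare, which is plausible because canopy surface area is a three-dimensional measure and can exceed ground area (this back-calculation is ours, not the paper's):

```latex
\frac{210\ \text{kg N ha}^{-1}}{17\ \text{g N m}^{-2}}
= \frac{210\,000\ \text{g N ha}^{-1}}{17\ \text{g N m}^{-2}}
\approx 12\,350\ \text{m}^{2}\ \text{canopy surface ha}^{-1}
```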
Abstract:
Wear resistance and recovery of 8 bermudagrass (Cynodon dactylon (L.) Pers.) and hybrid bermudagrass (C. dactylon × C. transvaalensis Burtt-Davy) cultivars grown on a sand-based soil profile near Brisbane, Australia, were assessed in 4 wear trials conducted over a two-year period. Wear was applied on a 7-day or a 14-day schedule by a modified Brinkman Traffic Simulator for 6-14 weeks at a time, either during winter-early spring or during summer-early autumn. The more frequent wear under the 7-day treatment was more damaging to the turf than the 14-day wear treatment, particularly during winter when its capacity for recovery from wear was severely restricted. There were substantial differences in wear tolerance among the 8 cultivars investigated, and the wear tolerance rankings of some cultivars changed between years. Wear tolerance was associated with high shoot density, a dense stolon mat strongly rooted to the ground surface, high cell wall strength as indicated by high total cell wall content, and high levels of lignin and neutral detergent fibre. Wear tolerance was also affected by turf age, planting sod quality, and wet weather. Resistance to wear and recovery from wear are both important components of wear tolerance, but the relative importance of their contributions to overall wear tolerance varies seasonally with turf growth rate.
Abstract:
Exotic and invasive woody vines are major environmental weeds of riparian areas, rainforest communities and remnant natural vegetation in coastal eastern Australia, where they smother standing vegetation, including large trees, and cause canopy collapse. We investigated, through glasshouse resource manipulative experiments, the ecophysiological traits that might facilitate faster growth, better resource acquisition and/or utilization and thus dominance of four exotic and invasive vines of South East Queensland, Australia, compared with their native counterparts. Relative growth rate was not significantly different between the two groups but water use efficiency (WUE) was higher in the native species while the converse was observed for light use efficiency (quantum efficiency, AQE) and maximum photosynthesis on a mass basis (Amax mass). The invasive species, as a group, also exhibited higher respiration load, higher light compensation point and higher specific leaf area. There were stronger correlations of leaf traits and greater structural (but not physiological) plasticity in invasive species than in their native counterparts. The scaling coefficients of resource use efficiencies (WUE, AQE and respiration efficiency) as well as those of fitness (biomass accumulated) versus many of the performance traits examined did not differ between the two species-origin groups, but there were indications of significant shifts in elevation (intercept values) and shifts along common slopes in many of these relationships – signalling differences in carbon economy (revenue returned per unit energy invested) and/or resource usage. Using ordination and based on 14 ecophysiological attributes, a fair level of separation between the two groups was achieved (51.5% explanatory power), with AQE, light compensation point, respiration load, WUE, specific leaf area and leaf area ratio, in decreasing order, being the main drivers. This study suggests similarity in trait plasticity, especially for physiological traits, but there appear to be fundamental differences in carbon economy and resource conservation between native and invasive vine species.
Abstract:
Seeds in the field experience wet-dry cycling that is akin to the well-studied commercial process of seed priming in which seeds are hydrated and then re-dried to standardise their germination characteristics. To investigate whether the persistence (defined as in situ longevity) and antioxidant capacity of seeds are influenced by wet-dry cycling, seeds of the global agronomic weed Avena sterilis ssp. ludoviciana were subjected to (1) controlled ageing at 60% relative humidity and 53.5°C for 31 days, (2) controlled ageing then priming, or (3) ageing in the field in three soils for 21 months. Changes in seed viability (total germination), mean germination time, seedling vigour (mean seedling length), and the concentrations of the glutathione (GSH) / glutathione disulphide (GSSG) redox couple were recorded over time. As controlled-aged seeds lost viability, GSH levels declined and the relative proportion of GSSG contributing to total glutathione increased, indicative of a failing antioxidant capacity. Subjecting seeds that were aged under controlled conditions to a wet-dry cycle (to −1 MPa) prevented viability loss and increased GSH levels. Field-aged seeds that underwent numerous wet-dry cycles due to natural rainfall maintained high viability and high GSH levels. Thus wet-dry cycles in the field may enhance seed longevity and persistence coincident with re-synthesis of protective compounds such as GSH.
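The "relative proportion of GSSG contributing to total glutathione" is conventionally computed on a GSH-equivalents basis, since each GSSG molecule carries two glutathione units (a standard biochemical convention, not a formula quoted from the paper):

```latex
\%\,\text{GSSG} = \frac{2\,[\text{GSSG}]}{[\text{GSH}] + 2\,[\text{GSSG}]} \times 100
```

A rising value indicates a more oxidised glutathione pool and hence the failing antioxidant capacity described above.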