80 results for Dry off period
Abstract:
Remote drafting technology now available for sheep makes possible targeted supplementation of individuals within a grazing flock. This system was evaluated by using 68 Merino wethers grazing dry-season, native Mitchell grass pasture (predominantly Astrebla spp.) as a group and receiving access to lupin grain through a remote drafter 0, 1, 2, 4 or 7 days/week for 8 weeks. The sole paddock watering point was separately fenced and access was via a one-way flow gate. Sheep exited the watering point through a remote drafter operated by solar power and were drafted by radio frequency identification (RFID) tag, according to treatment, either back into the paddock or into a common supplement yard where lupins were provided ad libitum in a self-feeder. Sheep were drafted into the supplement yard on only their first time through the drafter during the prescribed 24-h period and exited the supplement yard via one-way flow gates in their own time. The remote drafter operated with high accuracy, with only 2.1% incorrect drafts recorded during the experimental period out of a total of 7027 sheep passes through the remote drafter. The actual number of accesses to supplement for each treatment group, in order, was generally less than intended, i.e. 0.02, 0.69, 1.98, 3.35 and 6.04 days/week. Deviations from the intended number of accesses to supplement were mainly due to sheep not coming through to water on their allocated day of treatment access, although some instances were due to incorrect drafts. There was a non-linear response in growth rate to increased frequency of access to lupins, with the growth rate response plateauing at ~3 actual accesses per week, corresponding to a growth rate of 72.5 g/head.day. This experiment has demonstrated the application of the remote drafting supplementation system for the first time under grazing conditions and with the drafter operated completely from solar power. The experiment demonstrates a growth response to increasing frequency of access to supplement and provides a starting point from which to develop feeding strategies to achieve sheep weight-change targets.
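As a rough illustration of the drafting rule described in this abstract (a sheep is sent to the supplement yard only on its first pass during the prescribed 24-h period, and only on its treatment's allocated days), the Python sketch below shows one way the decision could be expressed. The treatment labels, weekday allocations and function names are hypothetical and are not taken from the published system.

```python
from datetime import date

# Hypothetical mapping of treatment group to allocated weekdays (0 = Monday).
ACCESS_DAYS = {
    "0d": set(),
    "1d": {0},
    "2d": {0, 3},
    "4d": {0, 2, 4, 6},
    "7d": {0, 1, 2, 3, 4, 5, 6},
}

def draft_destination(tag_id: str, treatment: str, today: date,
                      drafted_today: set) -> str:
    """Decide where one sheep goes on one pass through the drafter.

    A sheep is drafted into the supplement yard only on its first pass
    during the prescribed 24-h period, and only on an allocated access day;
    otherwise it is returned to the paddock.
    """
    if today.weekday() in ACCESS_DAYS[treatment] and tag_id not in drafted_today:
        drafted_today.add(tag_id)
        return "supplement_yard"
    return "paddock"

# Example: a sheep in the 2 days/week group, first pass on a Monday
print(draft_destination("AU123", "2d", date(2024, 1, 1), set()))  # supplement_yard
```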
Abstract:
The potential of beef producers to profitably produce 500-kg steers at 2.5 years of age in northern Australia's dry tropics to meet specifications of high-value markets, using a high-input management (HIM) system was examined. HIM included targeted high levels of fortified molasses supplementation, short seasonal mating and the use of growth promotants. Using herds of 300-400 females plus steer progeny at three sites, HIM was compared at a business level to prevailing best-practice, strategic low-input management (SLIM) in which there is a relatively low usage of energy concentrates to supplement pasture intake. The data presented for each breeding-age cohort within management system at each site include: annual pregnancy rates (range: 14-99%), time of conception, mortalities (range: 0-10%), progeny losses between confirmed pregnancy and weaning (range: 0-29%), and weaning rates (range: 14-92%) over the 2-year observation period. Annual changes in weight and relative net worth were calculated for all breeding and non-breeding cohorts. Reasons for outcomes are discussed. Compared with SLIM herds, both weaning weights and annual growth were >= 30 kg higher, enabling 86-100% of HIM steers to exceed 500 kg at 2.5 years of age. Very few contemporary SLIM steers reached this target. HIM was most profitably applied to steers. Where HIM was able to achieve high pregnancy rates in yearlings, its application was recommended in females. Well managed, appropriate HIM systems increased profits by around $15/adult equivalent at prevailing beef and supplement prices. However, a 20% supplement price rise without a commensurate increase in values for young slaughter steers would generally eliminate this advantage. This study demonstrated the complexity of profitable application of research outcomes to commercial business, even when component research suggests that specific strategies may increase growth and reproductive efficiency and/or be more profitable. Because of the higher level of management required, higher costs and returns, and higher susceptibility to market changes and disease, HIM systems should only be applied after SLIM systems are well developed. To increase profitability, any strategy must ultimately either increase steer growth and sale values and/or enable a shift to high pregnancy rates in yearling heifers.
Abstract:
The Wet Tropics bioregion of north Queensland has been identified as an area of global significance. The world-heritage-listed rainforests have been invaded by feral pigs (Sus scrofa) that are perceived to cause substantial environmental damage. A community perception exists of an annual altitudinal migration of the feral-pig population. The present study describes the movements of 29 feral pigs in relation to altitudinal migration (highland, transitional and lowland areas). Feral pigs were sedentary and stayed within their home range throughout a 4-year study period. No altitudinal migration was detected; pigs moved no more than a mean distance of 1.0 km from the centre of their calculated home ranges. There was no significant difference between the mean (+/- 95% confidence interval) aggregate home ranges for males (8.7 +/- 4.3 km², n = 15) and females (7.2 +/- 1.8 km², n = 14). No difference in home range was detected among the three altitudinal areas: 7.2 +/- 2.4 km² for highland, 6.2 +/- 3.9 km² for transitional and 9.9 +/- 5.3 km² for lowland areas. The aggregate mean home range for all pigs in the present study was 8.0 +/- 2.4 km². The study also assessed the influence seasons had on the home range of eight feral pigs on the rainforest boundary; home ranges did not significantly vary in size between the tropical wet and dry seasons, although the mean home range in the dry season (7.7 +/- 6.9 km²) was more than twice the home range in the wet season (2.9 +/- 0.8 km²). Heavier pigs tended to have larger home ranges. The results of the present study suggest that feral pigs are sedentary throughout the year so broad-scale control techniques need to be applied over sufficient areas to encompass individual home ranges. Control strategies need a coordinated approach if a long-term reduction in the pig population is to be achieved.
Abstract:
Rainfall variability is a challenge to sustainable and profitable cattle production in northern Australia. Strategies recommended to manage for rainfall variability, like light or variable stocking, are not widely adopted. This is due partly to the perception that sustainability and profitability are incompatible. A large, long-term grazing trial was initiated in 1997 in north Queensland, Australia, to test the effect of different grazing strategies on cattle production. These strategies are: (i) constant light stocking (LSR) at long-term carrying capacity (LTCC); (ii) constant heavy stocking (HSR) at twice LTCC; (iii) rotational wet-season spelling (R/Spell) at 1.5 LTCC; (iv) variable stocking (VAR), with stocking rates adjusted in May based on available pasture; and (v) a Southern Oscillation Index (SOI) variable strategy, with stocking rates adjusted in November, based on available pasture and SOI seasonal forecasts. Animal performance varied markedly over the 10 years for which data are presented, due to pronounced differences in rainfall and pasture availability. Nonetheless, lighter stocking at or about LTCC consistently gave the best individual liveweight gain (LWG), condition score and skeletal growth; mean LWG per annum was thus highest in the LSR (113 kg), intermediate in the R/Spell (104 kg) and lowest in the HSR (86 kg). Mean LWG was 106 kg in the VAR and 103 kg in the SOI but, in all years, the relative performance of these strategies was dependent upon the stocking rate applied. After 2 years on the trial, steers from lightly stocked strategies were 60-100 kg heavier and received appreciable carcass price premiums at the meatworks compared to those under heavy stocking. In contrast, LWG per unit area was greatest at stocking rates of about twice LTCC; mean LWG/ha was thus greatest in the HSR (21 kg/ha), but this strategy required drought feeding in four of the 10 years and was unsustainable. Although LWG/ha was lower in the LSR (mean 14 kg/ha), or in strategies that reduced stocking rates in dry years like the VAR (mean 18 kg/ha) and SOI (mean 17 kg/ha), these strategies did not require drought feeding and appeared sustainable. The R/Spell strategy (mean 16 kg/ha) was compromised by an ill-timed fire, but also performed satisfactorily. The present results provide important evidence challenging the assumption that sustainable management in a variable environment is unprofitable. Further research is required to fully quantify the long-term effects of these strategies on land condition and profitability and to extrapolate the results to breeder performance at the property level.
Abstract:
For pasture growth in the semi-arid tropics of north-east Australia, where up to 80% of annual rainfall occurs between December and March, the timing and distribution of rainfall events is often more important than the total amount. In particular, the timing of the 'green break of the season' (GBOS) at the end of the dry season, when new pasture growth becomes available as forage and a live-weight gain is measured in cattle, affects several important management decisions that prevent overgrazing and pasture degradation. Currently, beef producers in the region use a GBOS rule based on rainfall (e.g. 40 mm of rain over three days by 1 December) to define the event and make their management decisions. A survey of 16 beef producers in north-east Queensland shows that three-quarters of respondents use a rainfall amount that occurs in only half or less than half of all years at their location. In addition, only half the producers expect the GBOS to occur within two weeks of the median date calculated by the CSIRO plant growth days model GRIM. This result suggests that in the producer rules, either the rainfall quantity or the period of time over which the rain is expected is unrealistic. Despite only 37% of beef producers indicating that they use a southern oscillation index (SOI) forecast in their decisions, cross-validated LEPS (linear error in probability space) analyses showed both the average 3-month July-September SOI and the 2-month August-September SOI have significant forecast skill in predicting the probability of both the amount of wet season rainfall and the timing of the GBOS. The communication and implementation of a rigorous and realistic definition of the GBOS, and the likely impacts of anthropogenic climate change on the region, are discussed in the context of the sustainable management of northern Australian rangelands.
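The producer-style rainfall rule quoted above (e.g. 40 mm of rain over three days) can be expressed as a simple scan over daily rainfall. The sketch below is illustrative only, uses made-up data, and does not reproduce any individual respondent's rule or the GRIM model.

```python
def green_break_day(daily_rain, threshold_mm=40.0, window_days=3):
    """Return the index of the first day on which rainfall over the trailing
    `window_days` reaches `threshold_mm`, or None if it never occurs."""
    for i in range(window_days - 1, len(daily_rain)):
        if sum(daily_rain[i - window_days + 1 : i + 1]) >= threshold_mm:
            return i
    return None

# Example: rain on days 4-6 totals 45 mm, so the break is declared on day index 6
rain = [0, 2, 0, 0, 10, 15, 20, 0]
print(green_break_day(rain))  # 6
```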
Abstract:
The project assembled basic information to allow effective management and manipulation of native pastures in the southern Maranoa region of Queensland. This involved a range of plant studies, including a grazing trial, to quantify the costs of poor pasture composition. While the results focus on perennial grasses, we recognise the important dietary role played by broad-leaved herbs. The plant manipulation studies focussed on ways to change the proportions of plants in a grazed pasture, e.g. by recruitment or accelerated morbidity of existing plants. As most perennial grasses have a wide range of potential flowering times outside of mid-winter, rainfall exerts the major influence on flowering and seedset; exceptions are black speargrass, rough speargrass and golden beardgrass, which flower only for a restricted period each year. This simplifies potential control options through reducing seedset. Data from field growth studies of four pasture grasses have been used to refine the State's pasture production model GRASP. We also provide detailed data on the forage value of many native species at different growth stages. Wiregrass dominance in pastures on a sandy red earth reduced wool value by only 5-10% at Roma in 1994/95 when winters were very dry and grass seed problems were minimal.
Abstract:
A total of 2115 heifers from two tropical genotypes (1007 Brahman and 1108 Tropical Composite) raised in four locations in northern Australia were ovarian-scanned every 4-6 weeks to determine the age at the first-observed corpus luteum (CL) and this was used to define the age at puberty for each heifer. Other traits recorded at each time of ovarian scanning were liveweight, fat depths and body condition score. Reproductive tract size was measured close to the start of the first joining period. Results showed significant effects of location and birth month on the age at first CL and associated puberty traits. Genotypes did not differ significantly for the age or weight at first CL; however, Brahman were fatter at first CL and had a smaller reproductive tract size than Tropical Composite. Genetic analyses estimated the age at first CL to be moderately to highly heritable for Brahman (0.57) and Tropical Composite (0.52). The associated traits were also moderately heritable, except for reproductive tract size in Brahmans (0.03) and, for Tropical Composite, the presence of an observed CL on the scanning day closest to the start of joining (0.07). Genetic correlations among puberty traits were mostly moderate to high and generally larger in magnitude for Brahman than for Tropical Composite. Genetic correlations between the age at CL and heifer- and steer-production traits showed important genotype differences. For Tropical Composite, the age at CL was negatively correlated with the heifer growth rate in their first postweaning wet season (-0.40) and carcass marbling score (-0.49), but was positively correlated with carcass P8 fat depth (0.43). For Brahman, the age at CL was moderately negatively genetically correlated with heifer measures of bodyweight, fatness, body condition score and IGF-I, in both their first postweaning wet and second dry seasons, but was positively correlated with the dry-season growth rate. For Brahman, genetic correlations between the age at CL and steer traits showed possible antagonisms with feedlot residual feed intake (-0.60) and meat colour (0.73). Selection can be used to change the heifer age at puberty in both genotypes, with few major antagonisms with steer- and heifer-production traits.
Abstract:
Maintenance of green leaf area during grain filling can increase grain yield of sorghum grown under terminal water limitation. This 'stay-green' trait has been related to the nitrogen (N) supply-demand balance during grain filling. This study quantifies the N demand of grain and N translocation rates from leaves and stem and explores effects of genotype and N stress on onset and rate of leaf senescence during the grain filling period. Three hybrids differing in potential height were grown at three levels of N supply under well-watered conditions. Vertical profiles of biomass, leaf area, and N% of leaves, stem and grain were measured at regular intervals. Weekly SPAD chlorophyll readings on main shoot leaves were correlated with observed specific leaf nitrogen (SLN) to derive seasonal patterns of leaf N content. For all hybrids, individual grain N demand was sink determined and was initially met through N translocation from the stem and rachis. Only if this was insufficient did leaf N translocation occur. Maximum N translocation rates from leaves and stem were dependent on their N status. However, the supply of N at canopy scale was also related to the amount of leaf area senescing at any one time. This supply-demand framework for N dynamics explained effects of N stress and genotype on the onset and rate of leaf senescence.
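A schematic of the supply-demand framework described above, in which sink-determined grain N demand is met first from stem and rachis and only then from leaves, might look like the following daily-step sketch; the pool names, units and rate limits are illustrative assumptions, not values from the study.

```python
def grain_fill_n_step(grain_demand, stem_n, leaf_n,
                      stem_max_rate, leaf_max_rate):
    """Allocate one day of grain N demand (all quantities in g N/m2 or g N/m2/day).

    Demand is met first by translocation from the stem + rachis pool; only the
    unmet remainder is drawn from leaves, which drives leaf senescence.
    Returns (from_stem, from_leaf, unmet_demand).
    """
    from_stem = min(grain_demand, stem_n, stem_max_rate)
    remaining = grain_demand - from_stem
    from_leaf = min(remaining, leaf_n, leaf_max_rate)
    unmet = remaining - from_leaf
    return from_stem, from_leaf, unmet

# Example: demand of 0.20 g N/m2/day against a nearly depleted stem pool
print(grain_fill_n_step(0.20, stem_n=0.05, leaf_n=1.2,
                        stem_max_rate=0.10, leaf_max_rate=0.08))
# stem supplies 0.05, leaves 0.08, leaving ~0.07 unmet
```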
Abstract:
The effects of using a variable defoliation schedule, as a means of optimising the quality of the tall fescue component of simple and complex temperate pasture mixtures in a subtropical environment, on yield, botanical composition and persistence were studied in a small plot cutting experiment at Gatton Research Station in south-east Queensland. A management schedule of 2-, 3- and 4-weekly defoliations in summer, autumn, and spring and winter, respectively, was imposed on 5 temperate pasture mixtures: 2 simple mixtures including tall fescue (Festuca arundinacea) and white clover (Trifolium repens); 2 mixtures including perennial ryegrass (Lolium perenne), tall fescue and white clover; and a complex mixture, which included perennial ryegrass, tall fescue, white, red (T. pratense) and Persian (T. resupinatum) clovers and chicory (Cichorium intybus). Yield from the variable cutting schedule was 9% less than with a standard 4-weekly defoliation. This loss resulted from reductions in both the clover component (13%) and cumulative grass yield (6%). There was no interaction between cutting schedule and sowing mixture, with simple and complex sowing mixtures reacting in a similar manner to both cutting schedules. The experiment also demonstrated that, in complex mixtures, the cutting schedules used failed to give balanced production from all sown components. This was especially true of the grass and white clover components of the complex mixture, as the chicory and Persian clover components dominated the mixtures, particularly in the first year. Quality measurements (made only in the final summer) suggested that variable management had achieved a quality improvement, with increases in yields of digestible crude protein (19%) and digestible dry matter (9%) of the total forage produced in early summer. The improvements in the yields of digestible crude protein and digestible dry matter of the tall fescue component in late summer were even greater (28 and 19%, respectively). While advantages at other times of the year were expected to be smaller, the data suggested that the small loss in total yield was likely to be offset by increases in digestibility of available forage for grazing stock, especially in the critical summer period.
Abstract:
Insights into the relative importance of various aspects of product quality can be provided through quantitative analysis of consumer preference and choice of fruit. In this study, methods previously used to establish taste preferences for kiwifruit (Harker et al., 2008) and conjoint approaches were used to determine the influence of three key aspects of avocado quality on consumer liking and willingness to purchase fruit: dry matter percentage (DM), level of ripeness (firmness) and internal defects (bruising). One hundred and seven consumers tasted avocados with a range of DM levels from ~20% (minimally mature) to nearly 40% (very mature), and at a range of fruit firmness (ripeness) stages (firm-ripe to soft-ripe). Responses to bruising, a common quality defect in fruit obtained from the retail shelf, were examined using a conjoint approach in which consumers were presented with photographs showing fruit affected by damage of varying severity. In terms of DM, consumers showed a progressive increase in liking and intent to buy avocados as the DM increased. In terms of ripeness, liking and purchase intent were higher for avocados that had softened to a firmness of 6.5 N or below (hand-rating 5). For internal defects, conjoint analysis revealed that price, level of bruising and incidence of bruising all significantly reduced consumers' likelihood of future purchase, but the latter two factors had a greater impact than price. These results indicate the usefulness of the methodology, and also provide realistic targets for Hass avocado quality on the retail shelf.
Abstract:
When exposed to hot (22-35 degrees C) and dry climatic conditions in the field during the final 4-6 weeks of pod filling, peanuts (Arachis hypogaea L.) can accumulate highly carcinogenic and immuno-suppressing aflatoxins. Forecasting of the risk posed by these conditions can assist in minimizing pre-harvest contamination. A model was therefore developed as part of the Agricultural Production Systems Simulator (APSIM) peanut module, which calculated an aflatoxin risk index (ARI) using four temperature response functions when fractional available soil water was <0.20 and the crop was in the last 0.40 of the pod-filling phase. ARI explained 0.95 (P <= 0.05) of the variation in aflatoxin contamination, which varied from 0 to c. 800 µg/kg in 17 large-scale sowings in tropical and four sowings in sub-tropical environments carried out in Australia between 13 November and 16 December 2007. ARI also explained 0.96 (P <= 0.01) of the variation in the proportion of aflatoxin-contaminated loads (>15 µg/kg) of peanuts in the Kingaroy region of Australia during the period between the 1998/99 and 2007/08 seasons. Simulation of ARI using historical climatic data from 1890 to 2007 indicated a three-fold increase in its value since 1980 compared to the entire previous period. The increase was associated with increases in ambient temperature and decreases in rainfall. To facilitate routine monitoring of aflatoxin risk by growers in near real time, a web interface of the model was also developed. The ARI predicted using this interface for eight growers correlated significantly with the level of contamination in crops (r = 0.95, P <= 0.01). These results suggest that ARI simulated by the model is a reliable indicator of aflatoxin contamination that can be used in aflatoxin research as well as a decision-support tool to monitor pre-harvest aflatoxin risk in peanuts.
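The gating logic described for ARI (risk accumulates only when fractional available soil water is <0.20 and the crop is in the last 0.40 of pod filling) could be sketched as below. The temperature response shown is a placeholder, since the four temperature response functions used in the APSIM peanut module are not given in the abstract, and the inputs are made up.

```python
def daily_ari_increment(frac_avail_soil_water, frac_podfill_complete,
                        mean_temp, temp_response):
    """Accumulate risk only when soil water is scarce (<0.20 fractional
    available soil water) and the crop is in the last 0.40 of pod filling."""
    in_risk_window = (frac_avail_soil_water < 0.20
                      and frac_podfill_complete >= 0.60)
    return temp_response(mean_temp) if in_risk_window else 0.0

def placeholder_temp_response(t):
    """Illustrative only: risk rises linearly between 22 and 35 degrees C."""
    return max(0.0, min(1.0, (t - 22.0) / (35.0 - 22.0)))

ari = sum(daily_ari_increment(0.15, 0.8, t, placeholder_temp_response)
          for t in (28, 31, 34))
print(round(ari, 2))  # 2.08 with these made-up inputs
```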
Abstract:
Interest in cashew production in Australia has been stimulated by domestic and export market opportunities and suitability of large areas of tropical Australia. Economic models indicate that cashew production is profitable at 2.8 t ha⁻¹ nut-in-shell (NIS). Balanced plant nutrition is essential to achieve economic yields in Australia, with nitrogen (N) of particular importance because of its capacity to modify growth, affect nut yield and cause environmental degradation through soil acidification and off-site contamination. The study on a commercial cashew plantation at Dimbulah, Australia, investigated the effect of N rate and timing on cashew growth, nut production, N leaching and soil chemical properties over five growth cycles (1995-1999). Nitrogen was applied during the main periods of vegetative (December-April) and reproductive (June-October) growth. Commercial NIS yields (up to 4.4 t ha⁻¹ from individual trees) that exceeded the economic threshold of 2.8 t ha⁻¹ were achieved. The yield response was mainly determined by canopy size, as mean nut weight, panicle density and nuts per panicle were largely unaffected by N treatments. Nitrogen application confined to the main period of vegetative growth (December-April) produced a seasonal growth pattern that corresponded most consistently with highest NIS yield. This N timing also reduced late season flowering and undesirable post-November nut drop. Higher yields were not produced at N rates greater than 17 g m⁻² of canopy surface area (equating to 210 kg N ha⁻¹ for mature-size trees). High yields were attained when N concentrations in Mveg leaves in May-June were about 2%, but this assessment occurs at a time when it is not feasible to correct N deficiency. The Mflor leaf of the preceding November, used in conjunction with the Mveg leaf, was proposed as a diagnostic tool to guide N rate decisions. Leaching of nitrate-N and acidification of the soil profile was recorded to 0.9 m. This is an environmental and sustainability hazard, and demonstrates that improved methods of N management are required.
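As a quick arithmetic check of the rates quoted above, the stated equivalence of 17 g N per m² of canopy surface and 210 kg N ha⁻¹ implies the canopy surface area per hectare assumed for mature trees; the figure derived below is not stated in the abstract.

```python
# Back-of-envelope check: 17 g N/m2 of canopy surface equating to 210 kg N/ha
# implies the assumed canopy surface area per hectare (a derived figure only).
n_rate_g_per_m2 = 17.0
n_rate_kg_per_ha = 210.0
implied_canopy_m2_per_ha = n_rate_kg_per_ha * 1000.0 / n_rate_g_per_m2
print(round(implied_canopy_m2_per_ha))  # ~12353 m2 of canopy surface per hectare
```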
Abstract:
Soil water repellency occurs widely in horticultural and agricultural soils when very dry. The gradual accumulation and breakdown of surface organic matter over time produces wax-like organic acids, which coat soil particles preventing uniform entry of water into the soil. Water repellency is usually managed by regular surfactant applications. Surfactants, literally, are surface active agents (SURFace ACTive AgeNTS). Their mode of action is to reduce the surface tension of water, allowing it to penetrate and wet the soil more easily and completely. This practice improves water use efficiency (by requiring less water to wet the soil and by capturing rainfall and irrigation more effectively and rapidly). It also reduces nutrient losses through run-off erosion or leaching. These nutrients have the potential to pollute the surrounding environment and water courses. This project investigated potential improvements to standard practices (product combination and scheduling) for surfactant use to overcome localised dry spots on water repellent soils and thus improve turf quality and water use efficiency. Weather conditions for the duration of the trial prevented the identification of improved practices in terms of combination and scheduling. However, the findings support previous research that the use of soil surfactants decreased the time for water to infiltrate dry soil samples taken from a previously severely hydrophobic site. Data will be continually collected from this trial site on a private contractual basis, with the hope that improvements to standard practices will be observed during the drier winter months when moisture availability is a limiting factor for turfgrass growth and quality.
Abstract:
Salinity is an increasingly important issue in both rural and urban areas throughout much of Australia. The use of recycled/reclaimed water and other sources of poorer quality water to irrigate turf is also increasing. Hybrid Bermudagrass (Cynodon dactylon (L.) Pers. x C. transvaalensis Burtt Davey), together with the parent species C. dactylon, are amongst the most widely used warm-season turf grass groups. Twelve hybrid Bermudagrass genotypes and one accession each of Bermudagrass (C. dactylon), African Bermudagrass (C. transvaalensis) and seashore paspalum (Paspalum vaginatum Sw.) were grown in a glasshouse experiment with six different salinity treatments applied hydroponically through the irrigation water (ECw = <0.1, 6, 12, 18, 24 or 30 dS m⁻¹) in a flood-and-drain system. Each pot was clipped progressively at 2-weekly intervals over the 12-week experimental period to determine dry matter production; leaf firing was rated visually on 3 occasions during the last 6 weeks of salinity treatment. At the end of the experiment, dry weights of roots and crowns below clipping height were also determined. Clipping yields declined sharply after about the first 6 weeks of salinity treatment, but then remained stable at substantially lower levels of dry matter production from weeks 8 to 12. Growth data over this final 4-week experimental period are therefore a more accurate guide to the relative salinity tolerance of the 15 entries than data from the preceding 8 weeks. Based on these data, the 12 hybrid Bermudagrass genotypes showed moderate salinity tolerance, with FloraDwarf™, 'Champion Dwarf', Novotek™ and 'TifEagle' ranking as the most salt tolerant and 'Patriot', 'Santa Ana', 'Tifgreen' and TifSport™ the least tolerant within the hybrid group. Nevertheless, Santa Ana, for example, maintained relatively strong root growth as salinity increased, and so may show better salt tolerance in practice than predicted from the growth data alone. The 12 hybrid Bermudagrasses and the single African Bermudagrass genotype were all ranked above FloraTeX™ Bermudagrass in terms of salt tolerance. However, seashore paspalum, which is widely acknowledged as a halophytic species showing high salt tolerance, ranked well above all 14 Cynodon genotypes in terms of salinity tolerance.
Abstract:
The nitrogen-driven trade-off between nitrogen utilisation efficiency (yield per unit nitrogen uptake) and water use efficiency (yield per unit evapotranspiration) is widespread and results from well established, multiple effects of nitrogen availability on the water, carbon and nitrogen economy of crops. Here we used a crop model (APSIM) to simulate the yield, evapotranspiration, soil evaporation and nitrogen uptake of wheat, and analysed yield responses to water, nitrogen and climate using a framework analogous to the rate-duration model of determinate growth. The relationship between modelled grain yield (Y) and evapotranspiration (ET) was fitted to a linear-plateau function to derive three parameters: maximum yield (Ymax), the ET break-point when yield reaches its maximum (ET#), and the rate of yield response in the linear phase (ΔY/ΔET). Against this framework, we tested the hypothesis that nitrogen deficit reduces maximum yield by reducing both the rate (ΔY/ΔET) and the range of yield response to evapotranspiration, i.e. ET# - Es, where Es is modelled median soil evaporation. Modelled data reproduced the nitrogen-driven trade-off between nitrogen utilisation efficiency and water use efficiency in a transect from Horsham (36°S) to Emerald (23°S) in eastern Australia. Increasing nitrogen supply from 50 to 250 kg N ha⁻¹ reduced yield per unit nitrogen uptake from 29 to 12 kg grain kg⁻¹ N and increased yield per unit evapotranspiration from 6 to 15 kg grain ha⁻¹ mm⁻¹ at Emerald. The same increment in nitrogen supply reduced yield per unit nitrogen uptake from 30 to 25 kg grain kg⁻¹ N and increased yield per unit evapotranspiration from 6 to 25 kg grain ha⁻¹ mm⁻¹ at Horsham. Maximum yield ranged from 0.9 to 6.4 t ha⁻¹. Consistent with our working hypothesis, reductions in maximum yield with nitrogen deficit were associated with both reduction in the rate of yield response to ET and compression of the range of yield response to ET. Against the notion of managing crops to maximise water use efficiency in low rainfall environments, we emphasise the trade-off between water use efficiency and nitrogen utilisation efficiency, particularly under conditions of high nitrogen-to-grain price ratio. The rate-range framework to characterise the relationship between yield and evapotranspiration is useful to capture this trade-off as the parameters were responsive to both nitrogen supply and climatic factors.
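A minimal sketch of fitting the linear-plateau yield-ET relationship described above (rate ΔY/ΔET, break-point ET#, maximum yield Ymax) is shown below using SciPy; the data points are invented for illustration and do not come from the APSIM simulations.

```python
import numpy as np
from scipy.optimize import curve_fit

def linear_plateau(et, slope, et_break, y_max):
    """Yield rises linearly with ET up to et_break, then plateaus at y_max."""
    return np.where(et < et_break,
                    y_max - slope * (et_break - et),
                    y_max)

# Invented yield-ET points for illustration only
et = np.array([150, 200, 250, 300, 350, 400, 450], dtype=float)  # mm
y = np.array([1.0, 2.1, 3.0, 4.2, 5.0, 5.1, 5.0])                # t/ha

params, _ = curve_fit(linear_plateau, et, y, p0=[0.02, 350.0, 5.0])
slope, et_break, y_max = params
print(f"dY/dET = {slope:.3f} t/ha per mm, ET# = {et_break:.0f} mm, Ymax = {y_max:.2f} t/ha")
```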