952 results for pasture deferment
Abstract:
To evaluate the role of forage, shade and shelterbelts in attracting birds into the range, three trials were undertaken with free-range layers, both at a research facility and on commercial farms. Each of the trials at the free-range research facility in South Australia used a total of 120 laying hens (Hy-Line Brown). Birds were housed in an eco-shelter with 6 internal pens of equal size and a free-range area adjoining the shelter. The on-farm trials were undertaken on commercial free-range layer farms in the Darling Downs in south-east Queensland, with flock sizes ranging from 2,000 to 6,800 hens. The first research trial examined the role of shaded areas in the range, the second examined the role of forage, and the third examined the influence of shelterbelts. These treatments were compared with a free-range area with no enrichment. Aggressive feather pecking was observed on only a few occasions across the trials, owing to the low bird numbers housed. Enriching the free-range environment attracted more birds into the range. Shaded areas were used by 18% of the hens, with a tendency (p = 0.07) for more hens to be in the paddock. When forage was provided in paddocks, more control birds (55%) were observed in the range in the morning than in the afternoon (30%), whereas for the forage treatments 45% of the birds were in the range during both the morning and the afternoon. When shelterbelts were provided, a significantly (p < 0.05) higher percentage of birds was in the range (43% vs. 24%), and greater numbers of birds were observed in areas further away from the poultry house. The results from the on-farm trials mirrored the research trials. Overall, 3 times more hens used the shaded areas than the non-shaded areas, with slightly more using the shade in the morning than in the afternoon. As the environmental temperature increased, the number of birds using the outdoor shade also increased. Overall, 17 times more hens used the shelterbelt areas than the control areas, with slightly more using the shelterbelts in the afternoon than in the morning. Approximately 17 times more birds used the forage areas than the corresponding control areas in the range. There were 8 times more birds in a hay-bale-enriched area than in the area with no hay bales. Forage sources (including hay bales) were the most successful on-farm method of attracting birds into the range, followed by shelterbelts and artificial shade. Free-range egg farmers are encouraged to provide pasture, shaded areas and shelterbelts to attract birds into the range.
Abstract:
There is a large gap between the refined approaches used to characterise genotypes and the common use of location and season as a coarse surrogate for environmental characterisation of breeding trials. As a framework for breeding, this paper aims to quantify the spatial and temporal patterns of thermal and water stress for field pea in Australia. We compiled a dataset of yield for cv. Kaspa measured in 185 environments, and investigated the associations between yield and seasonal patterns of actual temperature and modelled water stress. Correlations between yield and temperature indicated two distinct stages. In the first stage, during crop establishment and canopy expansion before flowering, yield was positively associated with minimum temperature. Mean minimum temperatures below ~7°C suggest that crops were under suboptimal temperature for both canopy expansion and radiation-use efficiency during a significant part of this early growth period. In the second stage, during critical reproductive phases, grain yield was negatively associated with maximum temperatures over 25°C. Correlations between yield and the modelled water supply/demand ratio showed a consistent pattern with three phases: no correlation at early stages of the growth cycle, a progressive increase in the association that peaked as the crop approached the flowering window, and a progressive decline at later reproductive stages. Using long-term weather records (1957-2010) and modelled water stress for 104 locations, we identified three major patterns of water deficit nationwide. Environment type 1 (ET1) represents the most favourable condition, with no stress during most of the pre-flowering phase and gradual development of mild stress after flowering. Type 2 is characterised by increasing water deficit between 400 degree-days before flowering and 200 degree-days after flowering, and rainfall that relieves stress late in the season. Type 3 represents the most stressful condition, with increasing water deficit between 400 degree-days before flowering and maturity. Across Australia, the frequency of occurrence was 24% for ET1, 32% for ET2 and 43% for ET3, highlighting the dominance of the most stressful condition. Actual yield averaged 2.2 t/ha for ET1, 1.9 t/ha for ET2 and 1.4 t/ha for ET3, and the frequency of each pattern varied substantially among locations. Shifting from a nominal (i.e. location and season) to a quantitative (i.e. stress type) characterisation of environments could help improve breeding efficiency of field pea in Australia.
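As a rough illustration of how a single season's modelled stress trace could be binned into these three types, here is a minimal Python sketch. The thresholds, window boundaries and the `classify_environment` helper are illustrative assumptions; the paper derived the environment types by pattern analysis of modelled stress over 1957-2010, not by fixed cut-offs.

```python
import numpy as np

def classify_environment(tt, sd_ratio):
    """Bin one season's water supply/demand trace into ET1/ET2/ET3.

    tt       : thermal time relative to flowering (degree-days; flowering = 0)
    sd_ratio : modelled water supply/demand ratio (1 = unstressed)

    Thresholds are illustrative only; the paper identified the types by
    pattern analysis of modelled stress, not by fixed cut-offs.
    """
    pre = sd_ratio[(tt >= -400) & (tt < 0)].mean()    # pre-flowering window
    post = sd_ratio[(tt >= 0) & (tt <= 200)].mean()   # early post-flowering
    late = sd_ratio[tt > 200].mean()                  # grain filling to maturity
    if pre > 0.8 and post > 0.6:
        return "ET1"  # favourable: mild stress developing only after flowering
    if late > post:
        return "ET2"  # stress around flowering relieved by late-season rain
    return "ET3"      # stress deepening from pre-flowering through maturity

# Example: a deficit deepening steadily from 400 degree-days before flowering
tt = np.arange(-600, 601, 50)
trace = np.clip(1 - 0.001 * (tt + 400), 0, 1)
print(classify_environment(tt, trace))  # -> "ET3"
```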
Abstract:
Hip height, body condition, subcutaneous fat, eye muscle area, percentage Bos taurus, fetal age and diet digestibility data were collected at 17 372 assessments of 2181 Brahman and tropical composite (average 28% Brahman) female cattle aged between 0.5 and 7.5 years at five sites across Queensland. The study validated the subtraction of previously published estimates of gravid uterine weight to correct liveweight to non-pregnant status. Hip height and liveweight were linearly related (Brahman: P < 0.001, R² = 58%; tropical composite: P < 0.001, R² = 67%). Liveweight varied by 12-14% per body condition score (5-point scale) as cows differed from moderate condition (P < 0.01). Parallel effects were found for subcutaneous rump fat depth and eye muscle area, which were highly correlated with each other and with body condition score (r = 0.7-0.8). Liveweight differed from average by 1.65-1.66% per mm of rump fat depth and 0.71-0.76% per cm² of eye muscle area (P < 0.01). Estimated dry matter digestibility of the pasture consumed had no consistent effect in predicting liveweight and was therefore excluded from the final models. A method developed to estimate the full liveweight of post-weaning-age female beef cattle from the other measures predicted liveweight to within 10% and 23% of that recorded for 65% and 95% of cases, respectively. A 95% chance of the predicted group-average liveweight (with body condition score used) being within 5, 4, 3, 2 and 1% of the actual group average required 23, 36, 62, 137 and 521 females, respectively, if the precision and accuracy of measurements match those achieved in the research. Non-pregnant Bos taurus female cattle were calculated to be 10-40% heavier than Brahmans at the same hip height and body condition, indicating a substantial conformational difference. The liveweight prediction method was applied to a validation population of 83 unrelated groups of cattle weighed in extensive commercial situations on 119 days over 18 months (20 917 assessments). Predicted liveweight in the validation population exceeded the average recorded liveweight of weigh groups by an average of 19 kg (~6%), demonstrating the difficulty of achieving accurate and precise animal measurements under extensive commercial grazing conditions.
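The percentage effects reported above can be combined into a back-of-envelope liveweight adjustment. The sketch below assumes multiplicative effects around illustrative reference points (moderate condition score of 3, nominal fat and eye-muscle baselines) and uses the midpoints of the published ranges; the actual fitted model, its reference values and its breed terms are not given in the abstract.

```python
def adjust_liveweight(base_lw_kg, condition_score, rump_fat_mm, ema_cm2,
                      ref_score=3.0, ref_fat_mm=5.0, ref_ema_cm2=55.0):
    """Scale a baseline liveweight by the percentage effects reported above.

    Coefficients are midpoints of the published ranges; the reference
    points (moderate condition = 3 on a 5-point scale, 5 mm rump fat,
    55 cm^2 eye muscle area) are illustrative assumptions.
    """
    lw = base_lw_kg
    lw *= 1 + 0.13 * (condition_score - ref_score)   # 12-14% per condition score
    lw *= 1 + 0.0165 * (rump_fat_mm - ref_fat_mm)    # 1.65-1.66% per mm rump fat
    lw *= 1 + 0.0073 * (ema_cm2 - ref_ema_cm2)       # 0.71-0.76% per cm^2 EMA
    return lw

# e.g. a 420 kg cow one condition score above moderate, 8 mm fat, 60 cm^2 EMA:
# adjust_liveweight(420, 4, 8, 60) -> ~516 kg
```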
Abstract:
In Finland, suckler cow production is carried out in circumstances characterised by a long winter period and a short grazing period. The traditional winter housing for suckler cows has been insulated or uninsulated buildings, but there is demand for less expensive housing systems. In addition, more information is needed on new winter feeding strategies, carried out in inexpensive winter facilities with conventional (hay, grass silage, straw) or alternative (treated straw, industrial by-products, whole-crop silage) feeds. New feeding techniques should not have any detrimental effects on animal welfare if they are to be acceptable to both farmers and consumers. Furthermore, no official feeding recommendations for suckler cows are available in Finland, so recommendations for dairy cows have been used; this may lead to over- or underfeeding of suckler cows and, ultimately, to decreased economic output. In Experiment I, second-calf beef-dairy suckler cows were used to compare the effects of diets based on hay (H) or urea-treated straw (US) at two feeding levels (moderate, M, vs. low, L) on the performance of cows and calves. Live weight (LW) gain during indoor feeding was lower for cows on level L than on level M. Cows on diet US lost more LW indoors than those on diet H. The cows replenished the LW losses on good pasture. Calf LW gain and cow milk production were unaffected by the treatments. Conception rate was unaffected by the treatments but was only 69%. Urea-treated straw proved to be a suitable winter feed for spring-calving suckler cows. Experiment II studied the effects of feeding accuracy on the performance of first- and second-calf beef-dairy cows and calves. In II-1, the day-to-day variation in the roughage offered ranged up to ±40%. In II-2, the same variation was applied over two-week periods. Variation in the roughage offered had minor effects on cow performance. Reproduction was unaffected by feeding accuracy. Accurate feeding is not necessary for young beef-dairy crosses if the total amount of energy offered over a period of a few weeks fulfils the energy requirements. The effects of feeding strategies with alternative feeds on the performance of mature beef-dairy and beef cows and calves were evaluated in Experiment III. Two studies each compared two feeding strategies (step-up vs. flat-rate) and two diets (control vs. alternative). There were no differences between treatments in cow LW, body condition score (BCS), calf pre-weaning LW gain or cow reproduction. A flat-rate strategy can be practised in the nutrition of mature suckler cows. Oat-hull-based flour-mill by-product can partly replace grass silage and straw in the winter diet. Whole-crop barley silage can be offered as a sole feed to suckler cows. Experiment IV evaluated the effects of replacing grass silage with whole-crop barley or oat silage on mature beef cow and calf performance during the winter feeding period. Both whole-crop silages were suitable winter feeds for suckler cows in cold outdoor winter conditions. Experiment V assessed the effects of daily feeding vs. feeding every third day on the performance of mature beef cows and calves. No differences between the treatments were observed in cow LW, BCS, milk production or calf LW. Serum concentrations of urea and long-chain fatty acids were increased on the third day after feeding in the cows fed every third day.
Despite this, feeding every third day is an acceptable feeding strategy for mature suckler cows. Experiment VI studied the effects of feeding level and long-term cold climatic conditions on mature beef cows and calves. The cows were overwintered in outdoor facilities or in an uninsulated indoor facility. Whole-crop barley silage was offered either ad libitum or restricted. All the facilities offered adequate shelter for the cows. The restricted offering of whole-crop barley silage provided enough energy for the cows. The Finnish energy recommendations for dairy cows were too high for mature beef-breed suckler cows in good body condition at housing, even in cold conditions; feeding recommendations specific to suckler cows therefore need to be determined for Finland. The results showed that the required amount of energy can be offered to the cows using conventional or alternative feeds provided at a lower feeding level, with inaccurate feeding, flat-rate feeding or a feeding-every-third-day strategy. The cows must have an opportunity to replenish LW and BCS losses at pasture before the next winter. Production in cold conditions can be practised in inexpensive facilities when shelter against rain and wind, a dry resting place, adequate amounts of feed suitable for cold conditions, and water are provided for the animals, as was done in the present study.
Abstract:
TRFLP (terminal restriction fragment length polymorphism) was used to assess whether management practices that improved disease suppression and/or yield in a 4-year ginger field trial were related to changes in soil microbial community structure. Bacterial and fungal community profiles were defined by the presence and abundance of terminal restriction fragments (TRFs), where each TRF represents one or more species. Results indicated that inclusion of an organic amendment and minimum tillage increased the relative diversity of dominant fungal populations in a system-dependent way. Inclusion of an organic amendment increased bacterial species richness in the pasture treatment. Redundancy analysis showed shifts in microbial community structure associated with different management practices, and treatments grouped according to TRF abundance in relation to yield and disease incidence. ANOVA also indicated that the abundance of certain TRFs was significantly affected by farming-system management practices, and a number of these TRFs were also correlated with yield or disease suppression. Further analyses are required to determine whether the identified TRFs can be used as general or soil-type-specific bio-indicators of productivity (increased and decreased) and of Pythium myriotylum suppressiveness.
Abstract:
Inter-annual rainfall variability is a major challenge to sustainable and productive grazing management on rangelands. In Australia, rainfall variability is particularly pronounced, and failure to manage for it appropriately leads to major economic loss and environmental degradation. Recommended strategies for managing sustainably include stocking at long-term carrying capacity (LTCC) or varying stock numbers with forage availability. These strategies are conceptually simple but difficult to implement, given the scale and spatial heterogeneity of grazing properties and the uncertainty of the climate. This paper presents insights from northern Australia gained from research and modelling on managing for rainfall variability. A method to objectively estimate LTCC in large, heterogeneous paddocks is discussed, and guidelines and tools to tactically adjust stocking rates are presented. The possible use of seasonal climate forecasts (SCF) in management is also considered. Results from a 13-year grazing trial in Queensland show that constant stocking at LTCC was far more profitable, and largely maintained land condition, compared with heavy stocking. Variable stocking with or without the use of SCF was marginally more profitable, but income variability was greater and land condition poorer than under constant stocking at LTCC. Two commercial-scale trials in the Northern Territory with breeder cows highlighted the practical difficulties of variable stocking and provided evidence that heavier pasture utilisation rates depress reproductive performance. Simulation modelling across a range of regions in northern Australia also showed a decline in resource condition and profitability under heavy stocking rates. Modelling further suggested that the relative value of variable v. constant stocking depends on stocking rate and land condition. Importantly, variable stocking may allow slightly higher stocking rates without pasture degradation. Enterprise-level simulations run for breeder herds nevertheless show that poor economic performance can occur under constant stocking, and even under variable stocking in some circumstances. Modelling and research results both suggest that a form of constrained flexible stocking should be applied to manage for climate variability. Active adaptive management and research will be required as future climate changes make managing for rainfall variability increasingly challenging.
Abstract:
Predicting which species are likely to cause serious impacts in the future is crucial for targeting management efforts, but the characteristics of such species remain largely unconfirmed. We use data and expert opinion on tropical and subtropical grasses naturalised in Australia since European settlement to identify naturalised and high-impact species, and subsequently to test whether high-impact species are predictable. High-impact species for the three main affected sectors (environment, pastoral and agriculture) were determined by assessing evidence against pre-defined criteria. Twenty-one of the 155 naturalised species (14%) were classified as high-impact, including four that affected more than one sector. High-impact species were more likely to have faster spread rates (regions invaded per decade) and to be semi-aquatic. Spread rate was best explained by whether species had been actively spread (as pasture) and by time since naturalisation, but may not be truly explanatory, as it was tightly correlated with range size and incidence rate. Giving more weight to minimising the chance of overlooking high-impact species, a priority for biosecurity, meant that a wider range of predictors was required to identify high-impact species, and the predictive power of the models was reduced. By-sector analysis of the predictors of high-impact species was limited by their relative rarity, but showed differences between sectors, including in the universal predictors (spread rate and habitat) and life history. Furthermore, the species causing high impact to agriculture have changed in the past 10 years with changes in farming practice, highlighting the importance of context in determining impact. A rationale for invasion ecology is to improve the prediction of, and response to, future threats. Although our study identifies some universal predictors, it suggests that improved prediction will require a far greater emphasis on impact rather than invasiveness, and will need to account for the individual circumstances of affected sectors and the relative rarity of high-impact species.
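The asymmetry described here, where overlooking a high-impact species costs more than a false alarm, maps naturally onto cost-sensitive classification. A minimal sketch on synthetic data follows; the predictors, labels and the 10:1 class weight are illustrative assumptions, not the authors' analysis.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 155  # the number of naturalised species in the study
X = np.column_stack([
    rng.gamma(2.0, 1.0, n),   # spread rate: regions invaded per decade
    rng.integers(0, 2, n),    # semi-aquatic habitat (0/1)
])
# Synthetic label standing in for the expert-assessed high-impact status
y = (X[:, 0] + 2 * X[:, 1] + rng.normal(0, 1, n) > 4).astype(int)

# Penalise missing a high-impact species (class 1) ten times more heavily
# than a false alarm; this widens the set of flagged species at the cost
# of lower overall predictive power, as the abstract describes.
model = LogisticRegression(class_weight={0: 1, 1: 10}).fit(X, y)
flagged = int(model.predict(X).sum())
```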
Abstract:
Remote detection of management-related trend in the presence of inter-annual climatic variability in the rangelands is difficult. Minimally disturbed reference areas provide a useful guide, but suitable benchmarks are usually difficult to identify. We describe a method that uses a unique conceptual framework to identify reference areas from multi-temporal sequences of ground cover derived from Landsat TM and ETM+ imagery. The method requires neither ground-based reference sites nor GIS layers describing management. We calculate a minimum ground-cover image across all years to identify the locations of most persistent ground cover in the years of lowest rainfall. We then use a moving-window approach to calculate the difference between the window's central pixel and its surrounding reference pixels. This difference estimates ground-cover change between successive below-average rainfall years, which provides a seasonally interpreted measure of management effects. We examine the approach's sensitivity to window size and to the cover-index percentiles used to define persistence. The method successfully detected management-related change in ground cover in Queensland tropical savanna woodlands in two case studies: (1) a grazing trial where heavy stocking resulted in substantial decline in ground cover in small paddocks, and (2) commercial paddocks where wet-season spelling (destocking) resulted in increased ground cover. At a larger scale, there was broad agreement between our analysis of ground-cover change and ground-based assessment of land condition change for commercial beef properties with different a priori ratings of initial condition, but there was also some disagreement where changing condition reflected pasture composition rather than ground cover. We conclude that the method is sufficiently robust to analyse grazing effects on ground cover across the 1.3 × 10⁶ km² of Queensland's rangelands.
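A minimal raster sketch of the reference-pixel idea follows, assuming a square moving window and a high local percentile of the multi-year minimum-cover image as the reference level; the paper's actual cover index, persistence percentiles and window sizes differ and are explicitly tested for sensitivity.

```python
import numpy as np
from scipy.ndimage import percentile_filter

def management_change_signal(cover_stack, window=25, ref_percentile=90):
    """cover_stack: (years, rows, cols) array of annual ground-cover images.

    Returns each pixel's departure from its local reference; strongly
    negative values flag candidate management-related cover decline.
    Window size and percentile are illustrative assumptions.
    """
    # Persistent cover: the minimum across years isolates the cover that
    # remains in the lowest-rainfall years, when management effects dominate.
    min_cover = cover_stack.min(axis=0)
    # Local reference: the most persistent pixels within the moving window.
    reference = percentile_filter(min_cover, percentile=ref_percentile,
                                  size=window)
    return min_cover - reference
```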
Abstract:
The global importance of grasslands is indicated by their extent: they comprise some 26% of total land area and 80% of agriculturally productive land. The majority of grasslands are located in tropical developing countries, where they are particularly important to the livelihoods of some one billion poor people. Grasslands clearly provide the feed base for grazing livestock and thus numerous high-quality foods, but such livestock also provide products such as fertiliser, transport, traction, fibre and leather. In addition, grasslands provide important services and roles, including as water catchments, biodiversity reserves, for cultural and recreational needs, and potentially as a carbon sink to alleviate greenhouse gas emissions. Inevitably, such functions may conflict with management for the production of livestock products. Much of the increasing global demand for meat and milk, particularly from developing countries, will have to be supplied from grassland ecosystems, and this will pose difficult challenges. Increased production of meat and milk generally requires increased intake of metabolisable energy, and thus increased voluntary intake and/or digestibility of the diets selected by grazing animals. These will require more widespread and effective application of improved management. Strategies to improve productivity include fertiliser application, grazing management, greater use of crop by-products, legumes and supplements, and manipulation of stocking rate and herbage allowance. However, it is often difficult to predict the efficiency and cost-effectiveness of such strategies, particularly in tropical developing-country production systems. Evaluation and ongoing adjustment of grazing systems require appropriate and reliable assessment criteria, but these are often lacking. A number of emerging technologies may contribute to timely, low-cost acquisition of quantitative information to better understand soil-pasture-animal interactions and animal management in grassland systems. Developments in remote imaging of vegetation, global positioning technology, improved diet markers, near-infrared spectroscopy and modelling provide improved tools for knowledge-based decisions on the productivity constraints of grazing animals. Individual electronic identification of animals offers opportunities for precision management on an individual-animal basis for improved productivity. Improved outcomes in the form of livestock products, services and/or other benefits from grasslands should be possible, but clearly a diversity of solutions is needed for the vast range of environments and social circumstances of global grasslands.
Abstract:
More than 1200 wheat and 120 barley experiments conducted in Australia to examine yield responses to applied nitrogen (N) fertiliser are contained in a national database of field crops nutrient research (the BFDC National Database). The yield responses are accompanied by various pre-plant soil test data to quantify plant-available N and other indicators of soil fertility status or mineralisable N. A web application (BFDC Interrogator), developed to access the database, enables construction of calibrations between relative crop yield (RY, (Y0/Ymax) × 100) and N soil test value. In this paper we report the critical soil test values for 90% RY (CV90) and the associated critical ranges (CR90, defined as the 70% confidence interval around that CV90) derived from analysis of various subsets of these winter cereal experiments. Experimental programs were conducted throughout Australia's main grain-production regions in different eras, starting from the 1960s in Queensland through to Victoria in the 2000s. Improved management practices adopted over this period were reflected in potential yields increasing with research era, from an average Ymax of 2.2 t/ha in Queensland in the 1960s and 1970s, to 3.4 t/ha in South Australia in the 1980s, 4.3 t/ha in New South Wales in the 1990s, and 4.2 t/ha in Victoria in the 2000s. Various sampling depths (0.1-1.2 m) and methods of quantifying available N (nitrate-N or mineral-N) from pre-planting soil samples were used, and these provided useful guides to the need for supplementary N. The most regionally consistent relationships were established using nitrate-N (kg/ha) in the top 0.6 m of the soil profile, with regional and seasonal variation in CV90 largely accounted for through impacts on experimental Ymax. The CV90 for nitrate-N within the top 0.6 m of the soil profile for wheat crops increased from 36 to 110 kg nitrate-N/ha as Ymax increased over the range 1 to >5 t/ha. Apparent variation in CV90 with seasonal moisture availability was entirely consistent with impacts on experimental Ymax. Further analyses of the wheat trials with grain protein available (~45% of all experiments) established that grain yield, and not grain N content, was the major driver of crop N demand and CV90. Subsets of data were used to explore the impact of crop management practices, such as crop rotation or fallow length, on both pre-planting profile mineral-N and CV90. These analyses showed that while management practices influenced profile mineral-N at planting and the likelihood and size of the yield response to applied N fertiliser, they had no significant impact on CV90. A level of risk is involved in the use of pre-plant testing to determine the need for supplementary N application in all Australian dryland systems. In southern and western regions, where crop performance is based almost entirely on in-crop rainfall, this risk is offset by the management opportunity to split N applications during crop growth in response to changing crop yield potential. In northern cropping systems, where stored soil moisture at sowing is indicative of minimum yield potential, erratic winter rainfall increases uncertainty about actual yield potential and reduces the opportunity for effective in-season applications.
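Mechanically, deriving a CV90 amounts to fitting a yield-response calibration and inverting it at 90% relative yield. The sketch below assumes a Mitscherlich response form for illustration; the BFDC Interrogator's actual fitting procedure, and the method behind the CR90 confidence intervals, are not described in this abstract.

```python
import numpy as np
from scipy.optimize import curve_fit

def mitscherlich(x, a, b, c):
    """Relative yield (%) as an asymptotic function of soil nitrate-N (kg/ha)."""
    return a * (1.0 - np.exp(-c * (x - b)))

def cv90(soil_nitrate_n, relative_yield):
    """Fit the calibration and solve for the soil test value at 90% RY.

    Assumes the fitted asymptote exceeds 90%; the response form and
    starting values are illustrative assumptions.
    """
    (a, b, c), _ = curve_fit(mitscherlich, soil_nitrate_n, relative_yield,
                             p0=[100.0, 0.0, 0.02], maxfev=10000)
    # Invert 90 = a * (1 - exp(-c * (x - b))) for x:
    return b - np.log(1.0 - 90.0 / a) / c
```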
Abstract:
Soil testing is the most widely used tool to predict the need for fertiliser phosphorus (P) application to crops. This study examined the factors affecting critical soil P concentrations and their confidence intervals for wheat and barley grown in Australian soils, by interrogating validated data from 1777 wheat and 150 barley field treatment series now held in the BFDC National Database. To narrow the confidence intervals associated with estimated critical P concentrations, filters for yield, crop stress, or low pH were applied. Once treatment series with low yield (<1 t/ha), severe crop stress, or pH(CaCl2) <4.3 were screened out, critical concentrations were relatively insensitive to wheat yield (>1 t/ha). There was a clear increase in critical P concentration from the early trials, when full tillage was common, to those conducted in 1995-2011, a period of rapid shift towards the adoption of minimum tillage. For wheat, the critical Colwell-P concentrations associated with 90 or 95% of maximum yield varied among Australian Soil Classification (ASC) Orders and Sub-orders: Calcarosol, Chromosol, Kandosol, Sodosol, Tenosol and Vertosol. Soil type, based on ASC Orders and Sub-orders, produced critical Colwell-P concentrations at 90% of maximum relative yield ranging from 15 mg/kg (Grey Vertosol) to 47 mg/kg (Supracalcic Calcarosol), with other soils having values in the range 19-27 mg/kg. Distinctive differences in critical P concentrations were evident among Sub-orders of Calcarosols, Chromosols, Sodosols, Tenosols and Vertosols, possibly due to differences in soil properties related to P sorption. However, insufficient data were available to develop a relationship between P buffering index (PBI) and critical P concentration. In general, there was no evidence that critical concentrations for barley would differ from those for wheat on the same soils. Significant knowledge gaps that must be filled to improve the relevance and reliability of soil P testing for winter cereals were: the lack of data for oats; the paucity of treatment series reflecting current cropping practices, especially minimum tillage; and inadequate metadata on soil texture, pH, growing-season rainfall, gravel content and PBI. The critical concentrations determined illustrate the importance of recent experimental data and of soil type, but also provide examples of interrogation pathways into the BFDC National Database to extract locally relevant critical P concentrations for guiding P fertiliser decision-making in wheat and barley.
Abstract:
We review here research on semiochemicals for cotton pest management carried out in successive Cotton Co-operative Research Centres from 1998 to 2012. Australian cotton is now dominated by transgenic (Bt) varieties, which provide a strong platform for integrated pest management of key pests such as Helicoverpa spp., but new technologies are required to manage the development of resistance in Helicoverpa spp. to transgenic cotton and the problems posed by emerging and secondary pests, especially sucking insects. A long-range attractant for Helicoverpa moths, based on plant volatiles, has been commercialised as Magnet®. The product has substantial area-wide impacts on moth populations, and only limited effects on beneficial insects. Potential roles are being investigated for this product in resistance management of Helicoverpa spp. on transgenic cotton. Short-range, non-volatile compounds on organ surfaces of plants that do not support development of Helicoverpa spp. have been identified; these compounds deter feeding or oviposition, or are toxic to insect pests. One such product, Sero X®, is effective on Helicoverpa spp. and sucking pests such as whiteflies (Bemisia tabaci), green mirids (Creontiades dilutus), and other hemipteran insects, and is in the advanced stages of commercialisation.
Abstract:
Australian cotton (Gossypium hirsutum L.) is predominantly grown on heavy clay soils (Vertosols). Cotton grown on Vertosols often experiences episodes of low oxygen concentration in the root-zone, particularly after irrigation events. Under subsurface drip irrigation (SDI), cotton receives frequent irrigation and sustained wetting fronts develop in the rhizosphere. This can lead to poor diffusion of oxygen in the soil, causing temporal and spatial hypoxia. As cotton is sensitive to waterlogging, exposure to this condition can result in a significant yield penalty. Use of aerated water for drip irrigation ('oxygation') can ameliorate hypoxia in the wetting front and therefore overcome the negative effects of poor soil aeration. The efficacy of oxygation, delivered via SDI to broadacre cotton, was evaluated over seven seasons (2005-06 to 2012-13). Oxygation of irrigation water with a Mazzei air injector produced significantly (P < 0.001) higher yields (200.3 v. 182.7 g/m²) and water-use efficiencies. Averaged over the seven years, the yield and gross production water-use index of oxygated cotton exceeded those of the control by 10% and 7%, respectively. The improvements in yield and water-use efficiency in response to oxygation could be ascribed to greater root development and increased light interception by the crop canopies, which enhanced crop physiological performance by ameliorating exposure to hypoxia. Oxygation of SDI improved both yields and water-use efficiency, which may contribute to greater economic feasibility of SDI for broadacre cotton production on Vertosols.
Abstract:
The aim of this review is to report changes in irrigated cotton water use arising from research projects and on-farm practice-change programs in Australia, in relation to both plant-based and irrigation-engineering disciplines. At least 80% of the Australian cotton-growing area is irrigated using gravity surface-irrigation systems. This review found that, over 23 years, cotton crops have used 6-7 ML/ha of irrigation water, depending on the amount of seasonal rain received. The seasonal evapotranspiration of surface-irrigated crops averaged 729 mm over this period. Over the past decade, water-use productivity by Australian cotton growers has improved by 40%. This has been achieved by both yield increases and more efficient water-management systems. The whole-farm irrigation efficiency index improved from 57% to 70%, and the crop water use index is >3 kg/mm.ha, which is high by international standards. Yield increases over the last decade can be attributed to plant-breeding advances, the adoption of genetically modified varieties, and improved crop management. There has also been increased use of irrigation scheduling tools and of furrow-irrigation system optimisation evaluations, which has reduced in-field deep-drainage losses. The largest loss component of the farm water balance on cotton farms is evaporation from on-farm water storages. Some farmers are changing to alternative systems such as centre pivots and lateral-move machines, and increasing adoption of these alternatives is expected. These systems can achieve considerable labour and water savings, but have significantly higher energy costs associated with water pumping and machine operation. Optimising the interactions between water, soils, labour, carbon emissions and energy efficiency requires more research and on-farm evaluation. Standardisation of water-use efficiency measures and improved water-measurement techniques for surface irrigation are important research outcomes needed to enable valid irrigation benchmarks to be established and compared. Water-use performance is highly variable between cotton farmers, between fields and across regions, so site-specific measurement is important. The range in the presented datasets indicates potential for further improvement in water-use efficiency and productivity on Australian cotton farms.
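For reference, the two headline indices quoted above reduce to simple ratios. The definitions and numbers below are illustrative assumptions (the review's point about standardisation is precisely that such definitions vary between studies).

```python
def crop_water_use_index(yield_kg_ha, et_mm):
    """Yield produced per mm of crop evapotranspiration (kg/mm.ha)."""
    return yield_kg_ha / et_mm

def whole_farm_irrigation_efficiency(water_to_crop_ml, water_diverted_ml):
    """Share of water diverted onto the farm that reaches the crop (%).
    Definition assumed for illustration; published indices may differ."""
    return 100.0 * water_to_crop_ml / water_diverted_ml

# Hypothetical example: 2400 kg/ha over 700 mm ET -> ~3.4 kg/mm.ha,
# and 7.0 of 10.0 ML delivered to the crop -> 70% farm efficiency.
print(crop_water_use_index(2400, 700))            # ~3.43
print(whole_farm_irrigation_efficiency(7.0, 10.0))  # 70.0
```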
Abstract:
Glyphosate resistance is a rapidly developing threat to profitability in Australian cotton farming. Resistance causes an immediate reduction in the effectiveness of in-crop weed control in glyphosate-resistant transgenic cotton and in summer fallows. Although strategies for delaying glyphosate resistance and those for managing resistant populations are qualitatively similar, the longer resistance can be delayed, the longer cotton growers will retain choice over which tactics to apply and when to apply them. Effective strategies to avoid, delay and manage resistance are thus of substantial value. We used a model of glyphosate resistance dynamics to simulate resistance evolution in Sonchus oleraceus (common sowthistle) and Echinochloa colona (awnless barnyard grass) under a range of resistance prevention, delaying and management strategies. From these simulations, we identified several elements that could contribute to effective glyphosate resistance prevention and management strategies. (i) Controlling glyphosate survivors is the most robust approach to delaying or preventing resistance: high-efficacy, high-frequency survivor control almost doubled the useful lifespan of glyphosate, from 13 to 25 years, even with glyphosate alone used in summer fallows. (ii) Two non-glyphosate tactics in-crop plus two in summer fallows is the minimum intervention required for long-term delays in resistance evolution. (iii) Pre-emergence herbicides are important, but should be backed up with non-glyphosate knockdowns and strategic tillage; replacing a late-season pre-emergence herbicide with inter-row tillage was predicted to delay glyphosate resistance by 4 years in awnless barnyard grass. (iv) Weed species' ecological characteristics, particularly seed-bank dynamics, affect the effectiveness of resistance strategies; S. oleraceus, because of its propensity to emerge year-round, was less exposed to glyphosate selection than E. colona, resulting in an extra 5 years of glyphosate usefulness (18 v. 13 years) even in the most rapid cases of resistance evolution. Delaying tactics are thus available that can provide some or many years of continued glyphosate efficacy. If glyphosate-resistant cotton cropping is to remain profitable in Australian farming systems in the long term, however, growers must adapt to the probability that they will have to deal with summer weeds that are no longer susceptible to glyphosate. Robust resistance management systems will need to include a diversity of weed-control options, used appropriately.
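To make the resistance-dynamics logic concrete, here is a heavily simplified single-gene, seed-bank sketch; all parameters (initial resistance frequency, kill rates, fecundity, seed-bank carryover) are illustrative assumptions, and the published model tracks far more detail (cohorts, tactic timing, tillage, and both species).

```python
def years_of_glyphosate_usefulness(init_r_freq=1e-8, seedbank=1000.0,
                                   kill_s=0.95, kill_r=0.05,
                                   survivor_control=0.0, fecundity=50.0,
                                   carryover=0.3, threshold=0.2,
                                   max_years=40):
    """Years until resistant plants exceed `threshold` of glyphosate survivors.

    Each year: seeds emerge and are sprayed with glyphosate, survivors may
    be removed by a follow-up tactic (tillage, non-glyphosate knockdown),
    and the remaining plants return seed to the bank. All parameter values
    are illustrative, not taken from the published model.
    """
    s = seedbank * (1.0 - init_r_freq)  # susceptible seeds per m^2
    r = seedbank * init_r_freq          # resistant seeds per m^2
    for year in range(1, max_years + 1):
        s_surv = s * (1 - kill_s) * (1 - survivor_control)
        r_surv = r * (1 - kill_r) * (1 - survivor_control)
        if r_surv / max(s_surv + r_surv, 1e-12) > threshold:
            return year
        s = s * carryover + s_surv * fecundity  # seed return + carryover
        r = r * carryover + r_surv * fecundity
    return max_years

# Controlling survivors delays resistance in this toy model too: compare
# years_of_glyphosate_usefulness(survivor_control=0.0) with
# years_of_glyphosate_usefulness(survivor_control=0.9).
```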