134 results for Water season
in eResearch Archive - Queensland Department of Agriculture
Abstract:
Cabomba caroliniana is a submersed aquatic macrophyte that originates from the Americas and is currently invading temperate, subtropical, and tropical freshwater habitats around the world. Despite being a nuisance in many countries, little is known about its ecology. We monitored C. caroliniana populations in three reservoirs in subtropical Queensland, Australia, over 5.5 years. Although biomass, stem length, and plant density of the C. caroliniana stands fluctuated over time, they did not exhibit clear seasonal patterns. Water depth was the most important environmental factor explaining C. caroliniana abundance. Plant biomass was greatest at depths from 2–4 m and rooted plants were not found beyond 5 m. Plant density was greatest in shallow water and decreased with depth, most likely as a function of decreasing light and increasing physical stress. We tested the effect of a range of water physico-chemical parameters. The concentration of phosphorus in the water column was the variable that explained most of the variation in C. caroliniana population parameters. We found that in subtropical Australia, C. caroliniana abundance does not appear to be affected by seasonal conditions but is influenced by other environmental variables such as water depth and nutrient loading. Therefore, further spread will more likely be governed by local habitat rather than climatic conditions.
Abstract:
Variable-rate technologies and site-specific crop nutrient management require real-time spatial information about the potential for response to in-season crop management interventions. Thermal and spectral properties of canopies can provide relevant information for non-destructive measurement of crop water and nitrogen stresses. In previous studies, foliage temperature was successfully estimated from canopy-scale (mixed foliage and soil) temperatures, and the multispectral Canopy Chlorophyll Content Index (CCCI) was effective in measuring canopy-scale N status in rainfed wheat (Triticum aestivum L.) systems in Horsham, Victoria, Australia. In the present study, results showed that under irrigated wheat systems in Maricopa, Arizona, USA, the theoretical derivation of foliage temperature unmixing produced relationships similar to those in Horsham. Derivation of the CCCI led to an r² = 0.53 relationship with chlorophyll a after Zadoks stage 43. This was later than the relationship (r² = 0.68) developed for Horsham after Zadoks stage 33, but early enough to be used for potential mid-season N fertilizer recommendations. Additionally, ground-based hyperspectral data estimated plant N (g kg⁻¹) in Horsham with r² = 0.86, but the relationship was confounded by water supply and N interactions. By combining canopy thermal and spectral properties, varying water and N status can potentially be identified, eventually permitting targeted N applications to those parts of a field where N can be used most efficiently by the crop.
Abstract:
Quantifying the local crop response to irrigation is important for establishing adequate irrigation management strategies. This study evaluated the effect of irrigation applied with subsurface drip irrigation on field corn (Zea mays L.) evapotranspiration (ETc), yield, water use efficiencies (WUE = yield/ETc, and IWUE = yield/irrigation), and dry matter production in the semiarid climate of west central Nebraska. Eight treatments were imposed, with irrigation amounts ranging from 53 to 356 mm in 2005 and from 22 to 226 mm in 2006. A soil water balance approach (based on FAO-56) was used to estimate daily soil water and ETc. Treatments resulted in seasonal ETc of 580-663 mm and 466-656 mm in 2005 and 2006, respectively. Yields among treatments differed by as much as 22% in 2005 and 52% in 2006. In both seasons, irrigation significantly affected yields, which increased with irrigation up to a point where irrigation became excessive. Distinct relationships were obtained each season. Yields increased linearly with seasonal ETc (R² = 0.89) and ETc/ETp (R² = 0.87) (ETp = ETc with no water stress). The yield response factor (ky), which indicates the relative reduction in yield to relative reduction in ETc, averaged 1.58 over the two seasons. WUE increased non-linearly with seasonal ETc and with yield. WUE was more sensitive to irrigation during the drier 2006 season, compared with 2005. In both seasons, IWUE decreased sharply with irrigation. Irrigation significantly affected dry matter production and partitioning into the different plant components (grain, cob, and stover). On average, the grain accounted for the majority of the above-ground plant dry mass (≈59%), followed by the stover (≈33%) and the cob (≈8%). The dry mass of the plant and that of each plant component tended to increase with seasonal ETc.
The good relationships obtained in the study between crop performance indicators and seasonal ETc demonstrate that accurate estimates of ETc on a daily and seasonal basis can be valuable for making tactical in-season irrigation management decisions and for strategic irrigation planning and management.
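The efficiency metrics defined in this abstract can be sketched in a few lines of Python. This is a minimal illustration with made-up numbers, not the study's data; the ky relation follows the standard FAO-33 form implied by the definition quoted above.

```python
# Hedged sketch of the water-use efficiency metrics described in the abstract.
# All numeric inputs below are illustrative, not measured values from the study.

def wue(yield_kg_ha: float, etc_mm: float) -> float:
    """Water use efficiency: yield per unit of crop evapotranspiration (kg/ha per mm)."""
    return yield_kg_ha / etc_mm

def iwue(yield_kg_ha: float, irrigation_mm: float) -> float:
    """Irrigation water use efficiency: yield per unit of irrigation applied."""
    return yield_kg_ha / irrigation_mm

def ky(actual_yield: float, max_yield: float, actual_et: float, max_et: float) -> float:
    """Yield response factor: relative yield reduction per relative ET reduction
    (FAO-33 form: 1 - Ya/Ym = ky * (1 - ETa/ETm))."""
    return (1 - actual_yield / max_yield) / (1 - actual_et / max_et)

# Hypothetical deficit treatment vs. a fully watered reference:
print(wue(12000, 600))                    # kg/ha per mm
print(iwue(12000, 200))
print(round(ky(10000, 12000, 550, 650), 2))
```

A ky above 1 (the study's two-season average was 1.58) means yield falls proportionally faster than ETc, i.e. the crop is sensitive to water deficit.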
Abstract:
Two field experiments using maize (Pioneer 31H50) were conducted at Gatton, Australia, during the 2003-04 season: experiment 1 compared three watering regimes [(i) irrigated for the whole crop cycle, (ii) irrigated until anthesis, and (iii) not irrigated], and experiment 2 compared fully irrigated and rain-grown crops over the whole crop cycle. Data on crop ontogeny, leaf, sheath and internode lengths and leaf width, and senescence were collected at 1- to 3-day intervals. A glasshouse experiment during 2003 quantified the responses of leaf shape and leaf presentation to various levels of water stress. Data from experiment 1 were used to modify and parameterise an architectural model of maize (ADEL-Maize) to incorporate the impact of water stress on maize canopy characteristics. The modified model produced accurate fitted values for experiment 1 for final leaf area and plant height, but values for leaf area during development were lower than observed data. Crop duration was reasonably well fitted, and differences between the fully irrigated and rain-grown crops were accurately predicted. Final representations of maize crop canopies were realistic. Possible explanations for the low values of leaf area are provided. The model requires further development using data from the glasshouse study before being validated against data from experiment 2 and other independent data. It will then be used to extend functionality in architectural models of maize. With further research and development, the model should be particularly useful in examining the response of maize production to water stress, including improved prediction of total biomass and grain yield. This will facilitate improved simulation of plant growth and development processes, allowing investigation of genotype by environment interactions under conditions of suboptimal water supply.
Abstract:
We investigated the influence of rainfall patterns on the water-use efficiency of wheat in a transect between Horsham (36°S) and Emerald (23°S) in eastern Australia. Water-use efficiency was defined in terms of biomass and transpiration, WUEB/T, and grain yield and evapotranspiration, WUEY/ET. Our working hypothesis is that latitudinal trends in WUEY/ET of water-limited crops are the complex result of southward increasing WUEB/T and soil evaporation, and season-dependent trends in harvest index. Our approach included: (a) analysis of long-term records to establish latitudinal gradients of amount, seasonality, and size-structure of rainfall; and (b) modelling wheat development, growth, yield, water budget components, and derived variables including WUEB/T and WUEY/ET. Annual median rainfall declined from around 600 mm in northern locations to 380 mm in the south. Median seasonal rain (from sowing to harvest) doubled between Emerald and Horsham, whereas median off-season rainfall (harvest to sowing) ranged from 460 mm at Emerald to 156 mm at Horsham. The contribution of small events (≤ 5 mm) to seasonal rainfall was negligible at Emerald (median 15 mm) and substantial at Horsham (105 mm). Power law coefficients (τ), i.e. the slopes of the regression between size and number of events on a log-log scale, captured the latitudinal gradient, characterised by an increasing dominance of small events from north to south during the growing season. Median modelled WUEB/T increased from 46 kg/ha.mm at Emerald to 73 kg/ha.mm at Horsham, in response to decreasing atmospheric demand. Median modelled soil evaporation during the growing season increased from 70 mm at Emerald to 172 mm at Horsham. This was explained by the size-structure of rainfall characterised by parameter τ, rather than by the total amount of rainfall. Median modelled harvest index ranged from 0.25 to 0.34 across locations, and had a season-dependent latitudinal pattern, i.e.
it was greater in northern locations in dry seasons in association with wetter soil profiles at sowing. There was a season-dependent latitudinal pattern in modelled WUEY/ET. In drier seasons, high soil evaporation, driven by a very strong dominance of small events, and a lower harvest index override the putative advantage of low atmospheric demand and associated higher WUEB/T in southern locations, hence the significant southwards decrease in WUEY/ET. In wetter seasons, when large events contribute a significant proportion of seasonal rain, higher WUEB/T in southern locations may translate into high WUEY/ET. Linear boundary functions (French-Schultz type models), with latitudinal gradients in their parameters (slope and x-intercept), were fitted to scatter-plots of modelled yield v. evapotranspiration. The x-intercept of the model is re-interpreted in terms of rainfall size structure, and the slope or efficiency multiplier is described in terms of the radiation, temperature, and air humidity properties of the environment. Implications for crop management and breeding are discussed.
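The size-structure parameter τ described above is the slope of a log-log regression of event count against event size class. A minimal sketch of that fit, using only the standard library and a synthetic heavy-tailed rainfall series (not the study's data):

```python
# Hedged sketch: fit the power-law slope (tau) between rainfall event size
# and event frequency on a log-log scale, as described in the abstract.
# The event series below is synthetic, purely for illustration.
import math
import random

def tau(event_sizes_mm, bin_edges):
    """Least-squares slope of log10(count) vs log10(size-class centre)."""
    counts = [0] * (len(bin_edges) - 1)
    for s in event_sizes_mm:
        for i in range(len(bin_edges) - 1):
            if bin_edges[i] <= s < bin_edges[i + 1]:
                counts[i] += 1
                break
    xs, ys = [], []
    for i, c in enumerate(counts):
        if c > 0:  # log10 is undefined for empty classes
            centre = 0.5 * (bin_edges[i] + bin_edges[i + 1])
            xs.append(math.log10(centre))
            ys.append(math.log10(c))
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    return sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)

random.seed(1)
events = [random.paretovariate(1.5) for _ in range(1000)]  # heavy-tailed sizes (mm)
print(round(tau(events, [1, 2, 4, 8, 16, 32, 64]), 2))     # negative: small events dominate
```

A steeper (more negative) τ means small events dominate the size distribution, which in the abstract's argument drives higher soil evaporation at the southern sites.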
Abstract:
Stephen Setter, Melissa Setter, Michael Graham and Joe Vitelli recently published their paper 'Buoyancy and germination of pond apple (Annona glabra L.) propagules in fresh and salt water' in Proceedings of the 16th Australian Weeds Conference. Stephen also presented this paper at the conference. Pond apple is an aggressive woody weed which has invaded many wetlands, drainage lines and riparian systems across the Wet Tropics bioregion of Far North Queensland. Most fruit and seed produced by pond apple during the summer wet season fall directly into creeks, river banks, flood plains and swamps from where they are dispersed. They reported that pond apple seeds can float for up to 12 months in either fresh or salt water, with approximately 38% of these seeds germinating in a soil medium once removed from the experimental water tanks at South Johnstone. Their study suggested that the removal of reproductive trees from areas adjacent to creeks and rivers will have an immediate impact on potential spread of pond apple by limiting seed input into flowing water bodies.
Abstract:
Soft-leaf buffalo grass is increasing in popularity as an amenity turfgrass in Australia. This project was instigated to assess its adaptation and establish management guidelines for its use in Australia's vast array of growing environments. There is an extensive selection of soft-leaf buffalo grass cultivars throughout Australia, and with the country's climates ranging from temperate in the south to tropical in the north, not all cultivars will be adapted to all regions. The project evaluated 19 buffalo grass cultivars along with other warm-season grasses including green couch, kikuyu and sweet smother grass. The soft-leaf buffalo grasses were evaluated for their growth and adaptation in a number of regions throughout Australia, including Western Australia, Victoria, the ACT, NSW and Queensland. The growth habit of the individual cultivars was examined along with their level of shade tolerance, water use, herbicide tolerance, resistance to wear, response to nitrogen applications and growth potential in highly alkaline soils. The growth habit of the various cultivars currently commercially available in Australia differs considerably, from the more robust types that spread quicker and are thicker in appearance (Sir Walter, Kings Pride, Ned Kelly and Jabiru) to the dwarf types that are shorter and thinner in appearance (AusTine and AusDwarf). The soft-leaf buffalo grass types tested do not differ in water use when compared to old-style common buffalo grass. Thus, soft-leaf buffalo grasses, like other warm-season turfgrass species, are efficient in water use. These grasses also recover after periods of low water availability. Individual cultivar differences were not discernible. In high pH (i.e. alkaline) soils, some elements essential for plant growth (e.g. iron and manganese) may be deficient, causing turfgrass to appear pale green and visually unacceptable.
When 14 soft-leaf buffalo grass genotypes were grown on a highly alkaline soil (pH 7.5-7.9), cultivars differed in leaf iron, but not in leaf manganese, concentrations. Nitrogen is critical to the production of quality turf. The methods for applying this essential element can be manipulated to minimise the maintenance inputs (mowing) during the peak growing period (summer). By applying the greatest proportion of the turf's total nitrogen requirements in early spring, peak summer growth can be reduced, resulting in a corresponding reduction in mowing requirements. Soft-leaf buffalo grass cultivars are more shade and wear tolerant than other warm-season turfgrasses being used by homeowners. However, there are differences between the individual buffalo grass varieties. The majority of types currently available would be classified as having moderate levels of shade tolerance, and wear reasonably well with good recovery rates. The impact of wear in a shaded environment was not tested, and there is a need to investigate this as it is a typical growing environment for many homeowners. The use of herbicides is required to maintain quality soft-leaf buffalo grass turf. The development of softer herbicides for other turfgrasses has seen an increase in their popularity. The buffalo grass cultivars currently available have shown varying levels of susceptibility to the chemicals tested. The majority of the cultivars evaluated have demonstrated low levels of phytotoxicity to the herbicides chlorsulfuron (Glean) and fluroxypyr (Starane and Comet). In general, soft-leaf buffalo grasses are varied in their makeup and have demonstrated varying levels of tolerance/susceptibility/adaptation to the conditions they are grown under. Consequently, there is a need to choose the cultivar most suited to the environment it is expected to perform in and the management style it will be exposed to.
Future work is required to assess how the structure of the different cultivars impacts on their capacity to tolerate wear, varying shade levels, water use and herbicide tolerance. The development of a growth model may provide the solution.
Abstract:
After more than 30 years in which ‘Tifgreen’ and ‘Tifdwarf’ were the only greens-quality varieties available, the choice for golf courses and bowls clubs in northern Australia has been expanded to include six new Cynodon hybrids [Cynodon dactylon (L.) Pers x Cynodon transvaalensis Burtt-Davy]. Five of these – ‘Champion Dwarf’ (Texas), ‘MS-Supreme’ (Mississippi), FloraDwarf™ (Florida), ‘TifEagle’ (Georgia) and MiniVerde™ (Arizona) – are from US breeding programs, while the sixth, ‘TL2’ (marketed as Novotek™), was selected in north Queensland. The finer, denser and lower growing habit of the “ultradwarf” cultivars allows very low mowing heights (e.g. 2.5 mm) to be imposed, resulting in denser and smoother putting and bowls surfaces. In addition to the Cynodon hybrids, four new greens-quality seashore paspalum (Paspalum vaginatum O. Swartz) cultivars – ‘Sea Isle 2000’, Sea Isle Supreme™, Velvetene™ and Sea Dwarf™ (where tolerance of salty water is required) – expand the range of choices for greens in difficult environments. The project was developed to determine (a) the appropriate choice of cultivar for different environments and budgets, and (b) best management practices for the new cultivars, which differ from the Cynodon hybrid industry standards ‘Tifgreen’ and ‘Tifdwarf’. Management practices, particularly fertilising, mowing heights and frequency, and thatch control, were investigated to determine optimum management inputs and provide high quality playing surfaces with the new grasses. To enable effective trialling of these new and old cultivars it was essential to have a number of regional sites participating in the study. Drought and the financial hardship of many clubs presented an initial setback, with numerous clubs wanting to be involved in the study but unable to commit due to their financial position at the time.
The study was fortunate to have seven regional sites from Queensland, New South Wales, Victoria and South Australia volunteer to be involved, adding to the results collected at the centralised test facility constructed at DEEDI’s Redlands Research Station. The major research findings from the eight trial sites included: • All of the new second-generation “ultradwarf” couchgrasses tend to produce a large amount of thatch, with MiniVerde™ being the greatest thatch producer, particularly compared to ‘Tifdwarf’ and ‘Tifgreen’. The maintenance of the new Cynodon hybrids will require a program of regular dethatching/grooming as well as regular light dustings of sand. Thatch prevention should begin 3 to 4 weeks after planting a new “ultradwarf” couchgrass green, with an emphasis on prevention rather than control. • The “ultradwarfs” produced faster green speeds than the current industry standards ‘Tifgreen’ and ‘Tifdwarf’. However, all Cynodon hybrids were considerably faster than the seashore paspalums (comparable to the speed difference between bentgrass and couchgrass) under trial conditions. Green speed was fastest when cut at 3.5 mm and rolled (compared with 3.5 mm cut, no roll, and 2.7 mm cut, no roll). • All trial sites reported the occurrence of disease in the Cynodon hybrids, with the main incidence of disease occurring during the dormancy period (autumn and winter). The main disease issue reported was “patch diseases”, which include both Gaeumannomyces and Rhizoctonia species. There were differences in the severity of the disease between cultivars; however, the severity was not consistent between cultivars and is largely attributed to an environment (location) effect. In terms of managing the occurrence of disease, the incidence of disease is less severe where there is a higher fertility rate (about 3 kg N/100 m²/year) or a preventative fungicide program is adopted.
• Cynodon hybrid and seashore paspalum cultivars maintained an acceptable to ideal surface when cut between 2.7 mm and 5.0 mm. “Ultradwarf” cultivars can tolerate mowing heights as low as 2.5 mm for short periods, but this places the plant under high levels of stress. Maintaining greens of both species at a continually lower cutting height (e.g. 2.7 mm) is achievable, but they would need to be cut daily for best results. Seashore paspalums performed best when cut at a height of between 2.7 mm and 3.0 mm. If a lower cutting height is adopted, regular and repeated mowings are required to reduce scalping and produce a smooth surface. • At this point in time the optimum rate of nitrogen (N) for the Cynodon hybrids is 3 kg/100 m²/year, while for the seashore paspalums it is 2 to 3 kg/100 m²/year. • Dormancy occurred in all Cynodon and seashore paspalum cultivars from Brisbane (Qld) in the north to the Mornington Peninsula (Vic.) in the south and west to Novar Gardens (SA). Cynodon and Paspalum growth in both Victoria and South Australia was less favourable as a result of the cooler climates. • After combining the data collected from all eight sites, the results indicated that there can be variation (e.g. in turfgrass quality, colour, disease resistance and performance) depending on the site and climatic conditions. Such evidence highlights the need to undertake genotype by environment (G x E) studies on new and old cultivars prior to conversion or establishment. • A club looking to select either a Cynodon hybrid or seashore paspalum cultivar needs to: - Review the research data. - Look at trial plots. - Inspect greens in play that have the new grasses. - Select 2 to 3 cultivars that are considered to be the better types. - Establish them in large (large enough to putt on) plots/nursery/practice putter. Ideally the area should be subjected to wear. - Maintain them exactly as they would be on the golf course/lawn bowls green. This is a critical aspect.
Regular mowing, fertilising etc. is essential. - Assess them over at least 2 to 3 years. - Make a selection and establish it in a playing green so that it is subjected to typical wear.
Abstract:
There is a large gap between the refined approaches used to characterise genotypes and the common use of location and season as a coarse surrogate for environmental characterisation of breeding trials. As a framework for breeding, the aim of this paper is to quantify the spatial and temporal patterns of thermal and water stress for field pea in Australia. We compiled a dataset for yield of the cv. Kaspa measured in 185 environments, and investigated the associations between yield and seasonal patterns of actual temperature and modelled water stress. Correlations between yield and temperature indicated two distinct stages. In the first stage, during crop establishment and canopy expansion before flowering, yield was positively associated with minimum temperature. Mean minimum temperature below ~7°C suggests that crops were under suboptimal temperature for both canopy expansion and radiation-use efficiency during a significant part of this early growth period. In the second stage, during critical reproductive phases, grain yield was negatively associated with maximum temperature over 25°C. Correlations between yield and modelled water supply/demand ratio showed a consistent pattern with three phases: no correlation at early stages of the growth cycle, a progressive increase in the association that peaked as the crop approached the flowering window, and a progressive decline at later reproductive stages. Using long-term weather records (1957-2010) and modelled water stress for 104 locations, we identified three major patterns of water deficit nationwide. Environment type 1 (ET1) represents the most favourable condition, with no stress during most of the pre-flowering phase and gradual development of mild stress after flowering. Type 2 (ET2) is characterised by increasing water deficit between 400 degree-days before flowering and 200 degree-days after flowering, and rainfall that relieves stress late in the season.
Type 3 (ET3) represents the most stressful condition, with increasing water deficit between 400 degree-days before flowering and maturity. Across Australia, the frequency of occurrence was 24% for ET1, 32% for ET2 and 43% for ET3, highlighting the dominance of the most stressful condition. Actual yield averaged 2.2 t/ha for ET1, 1.9 t/ha for ET2 and 1.4 t/ha for ET3, and the frequency of each pattern varied substantially among locations. Shifting from a nominal (i.e. location and season) to a quantitative (i.e. stress type) characterisation of environments could help improve breeding efficiency of field pea in Australia.
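As a quick arithmetic check, the environment-type frequencies and mean yields quoted above imply a frequency-weighted national mean yield. The weighting itself is my illustration, not an analysis from the paper:

```python
# Frequencies and mean yields as reported in the abstract (frequencies sum to
# 0.99 as published, presumably due to rounding), weighted to a national mean.
freq = {"ET1": 0.24, "ET2": 0.32, "ET3": 0.43}
yield_t_ha = {"ET1": 2.2, "ET2": 1.9, "ET3": 1.4}

weighted = sum(freq[e] * yield_t_ha[e] for e in freq) / sum(freq.values())
print(round(weighted, 2))  # 1.76 t/ha
```

The weighted mean sits much closer to the ET3 figure than to ET1, reflecting the dominance of the most stressful environment type.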
Abstract:
During the post-rainy (rabi) season in India around 3 million tonnes of sorghum grain is produced from 5.7 million ha of cropping. This underpins the livelihood of about 5 million households. Severe drought is common, as the crop grown in these areas relies largely on soil moisture stored during the preceding rainy season. Improvement of rabi sorghum cultivars through breeding has been slow, but could be accelerated if drought scenarios in the production regions were better understood. The sorghum crop model within the APSIM (Agricultural Production Systems sIMulator) platform was used to simulate crop growth and yield and the pattern of crop water status through each season using available historical weather data. The current model credibly reproduced the observed yield variation across the production region (R² = 0.73). The simulated trajectories of drought stress through each crop season were clustered into five different drought stress patterns. A majority of trajectories indicated terminal drought (43%), with various timings of onset during the crop cycle. The most severe droughts (25% of seasons) occurred when stress began before flowering, and resulted in failure of grain production in most cases, although biomass production was not affected so severely. The frequencies of drought stress types were analysed for selected locations throughout the rabi tract and showed that different zones had different predominating stress patterns. This knowledge can help better focus the search for adaptive traits and management practices on specific stress situations and thus accelerate improvement of rabi sorghum via targeted specific adaptation. The case study presented here is applicable to other sorghum growing environments. © 2012 Elsevier B.V.
Abstract:
Bellyache bush (Jatropha gossypifolia L. (Euphorbiaceae)) is a serious weed of dry tropical regions of northern Australia, with the potential to spread over much of the tropical savannah. It is well adapted to the harsh conditions of the dry tropics, defoliating during the dry season and rapidly producing new leaves with the onset of the wet season. In this study we examined the growth and biomass allocation of three Queensland biotypes (Queensland Green, Queensland Bronze and Queensland Purple) under three water regimes (water-stressed, weekly watering and constant water). Bellyache bush plants have a high capacity to adjust to water stress. The impact of water stress was consistent across the three biotypes. Water-stressed plants produced significantly less biomass than plants with constant water, and increased their biomass allocation to both roots and leaf material. Queensland Purple plants allocated more resources to roots and less to shoots than Queensland Green (Queensland Bronze being intermediate). Queensland Green produced less root biomass than the other two biotypes.
Abstract:
The Cotton and Grain Adoption Program of the Queensland Rural Water Use Efficiency Initiative is targeting five major irrigation regions in the state with the objective to develop better irrigation water use efficiency (WUE) through the adoption of best management practices in irrigation. The major beneficiaries of the program will be industries, irrigators and local communities. The benefits will flow via two avenues: increased production and profit resulting from improved WUE and improved environmental health as a consequence of greatly reduced runoff of irrigation tailwater into rivers and streams. This in turn will reduce the risk of nutrient and pesticide contamination of waterways. As a side effect, the work is likely to contribute to an improved public image of the cotton and grain industries. In each of the five regions, WUE officers have established grower groups to assist in providing local input into the specific objectives of extension and demonstration activities. The groups also assist in developing growers' perceptions of ownership of the work. Activities are based around four on-farm demonstration sites in each region where irrigation management techniques and hardware are showcased. A key theme of the program is monitoring water use. This is applied both to on-farm storage and distribution as well as to application methods and in-field management. This paper describes the project, its activities and successes.
Abstract:
The influence of barley and oat grain supplements on hay dry matter intake (DMI), carcass components gain and meat quality in lambs fed a low quality basal diet was examined. Thirty-five crossbred wether lambs (9 months of age) were divided into four groups. After adaptation to a basal diet of 85% oat hay and 15% lucerne hay for one week, an initial group of 11 was slaughtered. The weights of carcass components and digesta-free empty body weight (EBW) of this group were used to estimate the weight of carcass components of the other three experimental groups at the start of the experiment. The remaining three groups were randomly assigned to pens and fed ad libitum the basal diet alone (basal), the basal diet with 300 g air-dry barley grain (barley), or the basal diet with 300 g air-dry oat grain (oat). Supplements were fed twice weekly (i.e., 900 g on Tuesday and 1200 g on Friday). After 13 weeks of feeding, animals were slaughtered and, at 24 h post-mortem, meat quality and subcutaneous fat colour were measured. Samples of longissimus muscle were collected for determination of sarcomere length and meat tenderness. Hay DMI was reduced (P<0.01) by both barley and oat supplements. Lambs fed barley or oat had higher digestibility of DM, and higher intakes of CP (P<0.05) and ME (P<0.01), than basal lambs. Final live weight of barley and oat lambs was higher (P<0.05) than basal lambs, but this was not reflected in EBW or hot carcass weight. Lambs fed barley or oat had increases in protein (P<0.01) and water (P<0.001) in the carcass, but fat gain was not changed (P>0.05). There were no differences in eye muscle area or fat depth (total muscle and adipose tissue depth at the 12th rib, 110 mm from the midline; GR) among groups. The increased levels of protein and water components in the carcass of barley- and oat-fed lambs, associated with improved muscle production, were small and did not alter (P>0.05) any of the carcass/meat quality attributes compared to lambs fed a low quality forage diet.
Feeding barley or oat grain at 0.9–1% of live weight daily to lambs consuming poor quality hay may not substantially improve carcass quality, but may be useful in maintaining body condition of lambs through the dry season for slaughter out of season.
Abstract:
Continuous cultivation and cereal cropping of southern Queensland soils previously supporting native vegetation have resulted in reduced soil nitrogen supply, and consequently decreased cereal grain yields and low grain protein. To enhance yields and protein concentrations of wheat, management practices involving N fertiliser application, with no-tillage and stubble retention, grain legumes, and legume leys were evaluated from 1987 to 1998 on a fertility-depleted Vertosol at Warra, southern Queensland. The objective of this study was to examine the effect of lucerne in a 2-year lucerne–wheat rotation for its nitrogen and disease-break benefits to subsequent grain yield and protein content of wheat as compared with continuous wheat cropping. Dry matter production and nitrogen yields of lucerne were closely correlated with the total rainfall for October–September as well as March–September rainfall. Each 100 mm of total rainfall resulted in 0.97 t/ha of dry matter and 26 kg/ha of nitrogen yield. For the March–September rainfall, the corresponding values were 1.26 t/ha of dry matter and 36 kg/ha of nitrogen yield. The latter values were 10% lower than those produced by annual medics during a similar period. Compared with wheat–wheat cropping, significant increases in total soil nitrogen were observed only in 1990, 1992 and 1994 but increases in soil mineralisable nitrogen were observed in most years following lucerne. Similarly, pre-plant nitrate nitrogen in the soil profile following lucerne was higher by 74 kg/ha (9–167 kg N/ha) than that of wheat–wheat without N fertiliser in all years except 1996. Consequently, higher wheat grain protein (7 out of 9 seasons) and grain yield (4 out of 9 seasons) were produced compared with continuous wheat. There was significant depression in grain yield in 2 (1993 and 1995) out of 9 seasons attributed to soil moisture depletion and/or low growing season rainfall. 
Consequently, the overall responses in yield were lower than those to 50 kg/ha of fertiliser nitrogen applied to wheat–wheat crops, or to 2-year medic–wheat or chickpea–wheat rotations, although grain protein concentrations were higher following lucerne. The incidence and severity of the soilborne disease common root rot of wheat, caused by Bipolaris sorokiniana, was generally higher in lucerne–wheat than in continuous wheat with no nitrogen fertiliser applications, since its severity was significantly correlated with plant-available water at sowing. No significant incidence of crown rot or root lesion nematode was observed. Thus, productivity, which was mainly due to nitrogen accretion in this experiment, can be maintained where short-duration lucerne leys are grown in rotations with wheat.
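The per-100 mm rainfall coefficients quoted in this abstract imply a simple proportional scaling of lucerne dry matter and nitrogen yield with rainfall. A toy calculation under that linearity assumption (the coefficients come from the abstract; the function and the 500 mm example are only illustrations):

```python
# Linear rainfall-to-lucerne-production coefficients reported in the abstract:
# per 100 mm of total (October-September) rainfall, 0.97 t/ha dry matter and
# 26 kg/ha nitrogen yield. Proportional scaling is an assumption for illustration.
DM_PER_100MM = 0.97   # t/ha dry matter per 100 mm rainfall
N_PER_100MM = 26.0    # kg/ha nitrogen yield per 100 mm rainfall

def lucerne_production(total_rain_mm: float) -> tuple[float, float]:
    """Estimated dry matter (t/ha) and nitrogen yield (kg/ha) from total rainfall."""
    scale = total_rain_mm / 100.0
    return DM_PER_100MM * scale, N_PER_100MM * scale

dm, n = lucerne_production(500)  # a hypothetical 500 mm year
print(f"{dm:.2f} t/ha DM, {n:.0f} kg/ha N")  # 4.85 t/ha DM, 130 kg/ha N
```

The abstract's separate March–September coefficients (1.26 t/ha and 36 kg/ha per 100 mm) could be substituted to estimate production from cool-season rainfall alone.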