35 results for Growth and remodeling
in eResearch Archive - Queensland Department of Agriculture
Abstract:
The influence of barley and oat grain supplements on hay dry matter intake (DMI), carcass component gain and meat quality in lambs fed a low-quality basal diet was examined. Thirty-five crossbred wether lambs (9 months of age) were divided into four groups. After adaptation to a basal diet of 85% oat hay and 15% lucerne hay for one week, an initial group of 11 was slaughtered. The weights of carcass components and digesta-free empty body weight (EBW) of this group were used to estimate the weight of carcass components of the other three experimental groups at the start of the experiment. The remaining three groups were randomly assigned to pens and fed ad libitum the basal diet alone (basal), the basal diet with 300 g air-dry barley grain (barley), or the basal diet with 300 g air-dry oat grain (oat). Supplements were fed twice weekly (i.e., 900 g on Tuesday and 1200 g on Friday). After 13 weeks of feeding, animals were slaughtered and, at 24 h post-mortem, meat quality and subcutaneous fat colour were measured. Samples of longissimus muscle were collected for determination of sarcomere length and meat tenderness. Hay DMI was reduced (P<0.01) by both barley and oat supplements. Lambs fed barley or oat had a moderately higher digestibility of DM, and a higher intake of CP (P<0.05) and ME (P<0.01), than basal lambs. Final live weight of barley and oat lambs was higher (P<0.05) than that of basal lambs, but this was not reflected in EBW or hot carcass weight. Lambs fed barley or oat had increases in protein (P<0.01) and water (P<0.001) in the carcass, but fat gain was not changed (P>0.05). There were no differences in eye muscle area or fat depth (total muscle and adipose tissue depth at the 12th rib, 110 mm from the midline; GR) among groups. The increases in the protein and water components of the carcass of barley- and oat-fed lambs, associated with improved muscle production, were small and did not alter (P>0.05) any of the carcass/meat quality attributes compared to lambs fed a low-quality forage diet. Feeding barley or oat grain at 0.9–1% of live weight daily to lambs consuming poor-quality hay may not substantially improve carcass quality, but may be useful in maintaining body condition of lambs through the dry season for slaughter out of season.
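A quick check (not from the paper) of the supplement arithmetic behind the 0.9–1% of live weight figure, as a minimal Python sketch; the roughly 30 kg live weight assumed below is illustrative and is not given in the abstract:

    # Twice-weekly feeding (900 g Tuesday + 1200 g Friday) averaged over the week
    weekly_grain = 900 + 1200              # g of grain fed per week
    daily_grain = weekly_grain / 7         # = 300 g/day, the rate stated in the abstract
    assumed_liveweight_g = 30_000          # ~30 kg lamb; an assumption, not from the paper
    print(daily_grain / assumed_liveweight_g * 100)   # about 1% of live weight per day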
Abstract:
In many designed experiments with animals, liveweight is recorded several times during the trial. Such data are commonly referred to as repeated measures data. An aim of such experiments is generally to compare the growth patterns for the applied treatments. This paper discusses some of the methods of analysing repeated measures data and illustrates the use of cubic smoothing splines to describe irregular cattle growth data. Animal production for a consuming world: proceedings of the 9th Congress of the Asian-Australasian Association of Animal Production Societies [AAAP] and 23rd Biennial Conference of the Australian Society of Animal Production [ASAP] and 17th Annual Symposium of the University of Sydney, Dairy Research Foundation [DRF]. 2-7 July 2000, Sydney, Australia.
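As a rough illustration of the spline approach described above, the minimal Python sketch below fits a cubic smoothing spline to hypothetical, irregularly spaced liveweight records for one animal; scipy's UnivariateSpline stands in for the mixed-model spline machinery the authors would have used, and the data and smoothing factor are invented:

    import numpy as np
    from scipy.interpolate import UnivariateSpline

    days = np.array([0, 28, 55, 90, 118, 151, 182], dtype=float)          # irregular weighing days
    weight = np.array([210, 228, 241, 236, 250, 272, 290], dtype=float)   # liveweight (kg), hypothetical

    spline = UnivariateSpline(days, weight, k=3, s=60)   # cubic smoothing spline, smoothing factor s
    grid = np.linspace(days.min(), days.max(), 100)
    fitted = spline(grid)                       # smoothed growth curve
    growth_rate = spline.derivative()(grid)     # instantaneous growth rate (kg/day)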
Abstract:
In recent years, significant numbers of Australian goats have been harvested from the feral population to supply a strong demand for export of meat. In addition, large numbers of feral does have been domesticated to increase breeding herds in western Queensland. Introduction of the Boer breed to Australia as a specialist meat goat may provide a genetic means for improving the productive performance of the Australian feral goat. The present paper reports growth and carcase attributes of feral and Boer x feral genotypes born in 1998 and the birthweight of those born in 1999. Animal production for a consuming world: proceedings of the 9th Congress of the Asian-Australasian Association of Animal Production Societies [AAAP] and 23rd Biennial Conference of the Australian Society of Animal Production [ASAP] and 17th Annual Symposium of the University of Sydney, Dairy Research Foundation [DRF]. 2-7 July 2000, Sydney, Australia.
Abstract:
Isolates of Sclerotinia sclerotiorum were collected from infected lentil plants from 2 agro-ecological zones of Syria and used to study their comparative growth on culture media and pathogenicity on different lentil genotypes. The growth studies were carried out on Potato Dextrose Agar (PDA) growth media under laboratory conditions. Mycelial radial growth and sclerotial production were the parameters used to compare the isolates. Pathogenicity studies were carried out with selected isolates on 10 lentil genotypes, infected as detached shoots and as whole potted plants in the plastic house. The isolates showed considerable variation in cultural characteristics through mycelial growth, mycelial pigmentation and sclerotial production on the media plates. There were significant differences in the growth and sclerotial production of most of the isolates, but no apparent correlation between mycelial growth and sclerotial production among the isolates. Genotype by isolate interactions were significant for the isolates tested for pathogenicity. These interactions, however, appeared to be caused by differences in virulence of the isolates and did not suggest the occurrence of distinct pathogenic races of the pathogen.
Abstract:
This study investigated whether mixed-species designs can increase the growth of a tropical eucalypt when compared to monocultures. Monocultures of Eucalyptus pellita (E) and Acacia peregrina (A) and mixtures in various proportions (75E:25A, 50E:50A, 25E:75A) were planted in a replacement series design on the Atherton Tablelands of north Queensland, Australia. High mortality in the establishment phase, due to repeated damage by tropical cyclones, altered the trial design. Effects of the experimental designs on tree growth were estimated using a linear mixed-effects model with restricted maximum likelihood (REML) analysis. Volume growth of individual eucalypt trees was positively affected by the presence of acacia trees at age 5 years, and this effect generally increased with time up to age 10 years. However, stand volume and basal area increased with increasing proportions of E. pellita, due to its larger individual tree size. Conventional analysis did not offer convincing support for mixed-species designs. Preliminary individual-based modelling using a modified Hegyi competition index offered a solution and an equation indicating that acacias have positive ecological interactions (facilitation or competitive reduction) and do not compete in the way a neighbouring eucalypt does. These results suggest that significant increases in growth rates could be achieved with mixed-species designs. This statistical methodology could enable a better understanding of species interactions in similarly altered experiments, or in undesigned mixed-species plantations.
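For readers unfamiliar with the index, the Python sketch below computes the standard Hegyi competition index (the sum over neighbours of relative diameter divided by distance); the paper used a modified form whose details are not given in the abstract, so the 6 m search radius and the example trees are assumptions:

    import math

    def hegyi_index(subject, competitors, radius=6.0):
        """Standard Hegyi index: sum of (DBH_j / DBH_i) / distance_ij within a search radius."""
        ci = 0.0
        for comp in competitors:
            dist = math.hypot(comp["x"] - subject["x"], comp["y"] - subject["y"])
            if 0.0 < dist <= radius:
                ci += (comp["dbh"] / subject["dbh"]) / dist
        return ci

    # Illustrative example: a subject eucalypt with an acacia and a eucalypt neighbour
    subject = {"x": 0.0, "y": 0.0, "dbh": 18.0}
    neighbours = [{"x": 3.0, "y": 0.0, "dbh": 12.0},
                  {"x": 0.0, "y": 4.0, "dbh": 20.0}]
    print(hegyi_index(subject, neighbours))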
Abstract:
The problem of cannibalism in communally reared crabs can be eliminated by separating the growing crabs into holding compartments. There is currently no information on optimal compartment size for growing crabs individually. A total of 136 second-instar crablets (Portunus sanguinolentus) (C2, ca. 7-10 mm carapace width (CW)) were grown for 90 days in 10 different-sized opaque- and transparent-walled acrylic compartments. The base area for each compartment ranged from small (32 mm × 32 mm) to large (176 mm × 176 mm). Effects of holding space and wall transparency on survival, CW, moult increment, intermoult period and average weekly gain (AWG) were examined. Most crabs reached instars C9-C10 (50-70 mm CW) by the end of the experiment. The final survival rate in the smallest compartment was 25%, mainly due to moult-related mortality predominantly occurring at the C9 instar. However, crabs in these smaller compartments had earlier produced significantly larger moult increments from instar to instar than those in the larger compartments (P < 0.05). Crabs in the smaller compartments (<65 mm × 65 mm) also showed significantly longer intermoult periods (P < 0.05). The net result was that AWG in CW was 5.22 mm week-1 in the largest compartment and 5.15 mm week-1 in the smallest, and did not differ significantly between compartment size groups (P = 0.916). Wall transparency had no impact on survival (P = 0.530) but a slight impact on AWG (P = 0.014). Survival rate was the best indicator of the minimum acceptable compartment size (≥43 mm × 43 mm) for C10 crablets, because below this size death occurred before growth rate was significantly affected. For further growth, it would be necessary to transfer the crablets to larger compartments.
Abstract:
Quantitative information regarding nitrogen (N) accumulation and its distribution to leaves, stems and grains under varying environmental and growth conditions is limited for chickpea (Cicer arietinum L.). The information is required for the development of crop growth models and also for assessment of the contribution of chickpea to N balances in cropping systems. Accordingly, these processes were quantified in chickpea under different environmental and growth conditions (but without water or N deficit) using four field experiments and 1325 N measurements. N concentration ([N]) in green leaves was 50 mg g-1 up to the beginning of seed growth, and then declined linearly to 30 mg g-1 at the end of the seed growth phase. [N] in senesced leaves was 12 mg g-1. Stem [N] decreased from 30 mg g-1 early in the season to 8 mg g-1 in senesced stems at maturity. Pod [N] was constant (35 mg g-1), but grain [N] decreased from 60 mg g-1 early in seed growth to 43 mg g-1 at maturity. Total N accumulation ranged between 9 and 30 g m-2. N accumulation was closely linked to biomass accumulation until maturity. N accumulation efficiency (N accumulation relative to biomass accumulation) was 0.033 g g-1 during the early growth period when total biomass was <218 g m-2, but it decreased to 0.0176 g g-1 during the later growth period when total biomass was >218 g m-2. During vegetative growth (up to first-pod), 58% of N was partitioned to leaves and 42% to stems. Depending on growth conditions, 37-72% of leaf N and 12-56% of stem N was remobilized to the grains. The parameter estimates and functions obtained in this study can be used in chickpea simulation models to simulate N accumulation and distribution.
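A minimal Python sketch of how the reported [N] functions and the two-stage N accumulation efficiency could be encoded in a simple chickpea N module; driving the linear declines with a 0-1 "fraction of phase elapsed" is an assumption, as the abstract does not state the developmental scale actually used:

    import numpy as np

    def green_leaf_n(frac_seed_fill):
        """Green leaf [N] (mg/g): 50 up to the start of seed growth, declining linearly to 30."""
        return np.interp(frac_seed_fill, [0.0, 1.0], [50.0, 30.0])

    def stem_n(frac_season):
        """Stem [N] (mg/g): 30 early in the season declining to 8 in senesced stems at maturity."""
        return np.interp(frac_season, [0.0, 1.0], [30.0, 8.0])

    def grain_n(frac_seed_fill):
        """Grain [N] (mg/g): 60 early in seed growth declining to 43 at maturity."""
        return np.interp(frac_seed_fill, [0.0, 1.0], [60.0, 43.0])

    SENESCED_LEAF_N = 12.0   # mg/g
    POD_WALL_N = 35.0        # mg/g, constant

    def n_accumulated(delta_biomass, total_biomass):
        """N taken up (g/m2) for a biomass increment, using the reported two-stage efficiency."""
        efficiency = 0.033 if total_biomass <= 218.0 else 0.0176   # g N per g biomass
        return efficiency * delta_biomass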
Abstract:
Residue retention is an important issue in evaluating the sustainability of production forestry. However, its long-term impacts have not been studied extensively, especially in sub-tropical environments. This study investigated the long-term impact of harvest residue retention on tree nutrition, growth and productivity of an F1 hybrid (Pinus elliottii var. elliottii × Pinus caribaea var. hondurensis) exotic pine plantation in sub-tropical Australia, under three harvest residue management regimes: (1) residue removal, RR0; (2) single residue retention, RR1; and (3) double residue retention, RR2. The experiment, established in 1996, is a randomised complete block design with 4 replicates. Tree growth measurements in this study were carried out at ages 2, 4, 6, 8 and 10 years, while foliar nutrient analyses were carried out at ages 2, 4, 6 and 10 years. Litter production and litter nitrogen (N) and phosphorus (P) measurements were carried out quarterly over a 15-month period between ages 9 and 10 years. Results showed that total tree growth was still greater in residue-retained treatments compared to the RR0 treatment. However, mean annual increments of diameter at breast height (MAID) and basal area (MAIB) declined significantly after age 4 years, to about 68-78% at age 10 years. Declining foliar N and P concentrations accounted for 62% (p < 0.05) of the variation in growth rates after age 4 years, and foliar N and P concentrations were either marginal or below critical concentrations. In addition, litter production and litter N and P contents were not significantly different among the treatments. This study suggests that the impact of residue retention on tree nutrition and growth rates might be limited over a longer period, and that the integration of alternative forest management practices is necessary to sustain the benefits of harvest residues until the end of the rotation.
Abstract:
Despite an abundance of polyembryonic genotypes and the need for rootstocks that improve scion yield and productivity, simultaneous field testing of a wide range of mango (Mangifera indica L.) genotypes as rootstocks has not previously been reported. In this experiment, we examined the growth and yield of 'Kensington Pride' on 64 mango genotypes of diverse origin during the first four seasons of fruit production to identify those worth longer-term assessment. We also recorded morphological characteristics of seedlings of 46 of these genotypes in an attempt to relate these measures to subsequent field performance. Tree canopy development on the most vigorous rootstocks was almost double that on the least vigorous. Growth rates differed by more than 160%. Cumulative marketable yield ranged from 36 kg/tree for the lowest yielding rootstock to 181 kg/tree for the most productive. Yield efficiency also differed markedly among the 64 rootstocks, with the best treatment being 3.5 times more efficient than the poorest treatment. No relationship was found between yield efficiency and tree size, suggesting it is possible to select highly efficient rootstocks of differing vigor. Two genotypes ('Brodie' and 'MYP') stood out as providing high yield efficiency with small tree size. A further two genotypes ('B' and 'Watertank') were identified as offering high yield efficiency and large tree size and should provide high early yields at traditional tree spacing. Efforts to relate the morphology of seedlings of different genotypes to their subsequent performance as rootstocks showed that nursery performance of mango seedlings is no indication of their likely behavior as a rootstock. The economic cost of poor yields and low yield efficiencies during the early years of commercial orchard production provides a rationale for culling many of the rootstock treatments in this experiment and concentrating future assessment on the top ~20% of the 64 treatments. Of these, 'MYP', 'B', 'Watertank', 'Manzano', and 'Pancho' currently show the most promise.
Abstract:
An understanding of the growth and photosynthetic responses of subtropical rainforest species to variations in light environment can be useful for determining the sequence of species introductions in rainforest restoration projects and mixed-species plantations. We examined the growth and physiology of six Australian subtropical rainforest tree species in a greenhouse providing three artificial light environments (10%, 30%, and 60% of full sunlight). Morphological responses followed the typical sun-shade dichotomy, with early and late secondary species (Elaeocarpus grandis, Flindersia brayleyana, Flindersia schottiana, and Gmelina leichhardtii) displaying higher relative growth rates (RGR) than mature-stage species (Cryptocarya erythroxylon and Heritiera trifoliolatum). Growth and photosynthetic performance of most species reached a maximum in 30-60% full sunlight. Physiological responses provided limited evidence of a distinct dichotomy between early and late successional species. E. grandis and F. brayleyana provided a clear representation of early successional species, with a marked increase in Amax in high light and an ability to down-regulate photosynthetic machinery in low light conditions. The remaining species (F. schottiana, G. leichhardtii, and H. trifoliolatum) were better represented as falling along a shade-tolerance continuum, with limited ability to adjust physiologically to an increase or decrease in light, maintaining similar Amax across all light environments. Results show that most species belong to a shade-tolerant constituency, with an ability to grow and persist across a wide range of light environments. The species offer a wide range of potential planting scenarios and silvicultural options, with ample potential to achieve rapid canopy closure and rainforest restoration goals.
Abstract:
The effect of defoliation on Amarillo (Arachis pintoi cv. Amarillo) was studied in a glasshouse and in mixed swards with 2 tropical grasses. In the glasshouse, Amarillo plants grown in pots were subjected to a 30/20°C or 25/15°C temperature regime and to defoliation at 10-, 20- or 30-day intervals for 60 days. Two field plot studies were conducted on Amarillo with either irrigated kikuyu (Pennisetum clandestinum) in autumn and spring or dryland Pioneer rhodes grass (Chloris gayana) over summer and autumn. Treatments imposed were 3 defoliation intervals (7, 14 and 28 days) and 2 residual heights (5 and 10 cm for kikuyu; 3 and 10 cm for rhodes grass), with extra treatments (56 days to 3 cm for both grasses and 21 days to 5 cm for kikuyu). Defoliation interval had no significant effect on accumulated Amarillo leaf dry matter (DM) at either temperature regime. At the higher temperature, frequent defoliation reduced root dry weight (DW) and increased crude protein (CP) but had no effect on stolon DW or in vitro organic matter digestibility (OMD). On the other hand, at the lower temperature, frequent defoliation reduced stolon DW and increased OMD but had no effect on root DW or CP. Irrespective of temperature and defoliation, water-soluble carbohydrate levels were higher in stolons than in roots (4.70 vs 3.65%), whereas for starch the reverse occurred (5.37 vs 9.44%). Defoliating the Amarillo-kikuyu sward once at 56 days to 3 cm produced the highest DM yield in autumn and spring (582 and 7121 kg/ha DM, respectively), although the Amarillo component and OMD were substantially reduced. The highest DM yields (1726 kg/ha) were also achieved in the Amarillo-rhodes grass sward when defoliated every 56 days to 3 cm, although the Amarillo component was unaffected. In a mixed sward with either kikuyu or rhodes grass, the Amarillo component was maintained up to a 28-day defoliation interval and was higher when more severely defoliated. The results show that Amarillo can tolerate frequent defoliation and that it can co-exist with tropical grasses of differing growth habits, provided the Amarillo-tropical grass sward is subjected to frequent and severe defoliation.
Abstract:
Fibre diameter can vary dramatically along a wool staple, especially in the Mediterranean environment of southern Australia with its dry summers and abundance of green feed in spring. Other research results have shown a very low phenotypic correlation between fibre diameter grown in different seasons. Many breeders use short staples to measure fibre diameter for breeding purposes and also to promote animals for sale. The effectiveness of this practice is determined by the relative response to selection from measuring fibre traits on a full 12-month wool staple compared with measuring them on only part of a staple. If a high genetic correlation exists between the part record and the full record, then using part records may be acceptable to identify genetically superior animals. No information is available on the effectiveness of part records. This paper investigated whether wool growth and fibre diameter traits of Merino wool grown at different times of the year in a Mediterranean environment are genetically the same traits. The work was carried out on about 7 dyebanded wool sections per animal per year, on ewes from weaning to hogget age, in the Katanning Merino resource flocks over 6 years. Relative clean wool growth of the different sections had very low heritability estimates of less than 0.10, and the sections were phenotypically and genetically poorly correlated with 6- or 12-month wool growth. This indicates that part-record measurement of clean wool growth of these sections will be ineffective as an indirect selection criterion to improve wool growth genetically. Staple length growth, as measured by the length between dyebands, would be more effective, with heritability estimates of between 0.20 and 0.30. However, these measurements were shown to have a low genetic correlation with wool grown for 12 months, which implies that these staple length measurements would only be half as efficient as the wool weight for 6 or 12 months in improving total clean wool weight. Heritability estimates of fibre diameter, coefficient of variation of fibre diameter and fibre curvature were relatively high and were genetically and phenotypically highly correlated across sections. High positive phenotypic and genetic correlations were also found between fibre diameter, coefficient of variation of fibre diameter and fibre curvature of the different sections and similar measurements for wool grown over 6 or 12 months. Coefficient of variation of fibre diameter of the sections also had a moderate negative phenotypic and genetic correlation with staple strength of wool staples grown over 6 months, indicating that coefficient of variation of fibre diameter of any section would be as good an indirect selection criterion to improve staple strength as coefficient of variation of fibre diameter for wool grown over 6 or 12 months. The results indicate that fibre diameter, coefficient of variation of fibre diameter and fibre curvature of wool grown over short periods of time have virtually the same heritability as wool grown over 12 months, and that the genetic correlation between fibre diameter, coefficient of variation of fibre diameter and fibre curvature on part and on full records is very high (rg > 0.85). This indicates that fibre diameter, coefficient of variation of fibre diameter and fibre curvature on part records can be used as selection criteria to improve these traits.
However, part records of greasy and clean wool growth would be much less efficient than fleece weight for wool grown over 6 or 12 months because of the low heritability of part records and the low genetic correlation between these traits on part records and on wool grown for 12 months.
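The "half as efficient" comparisons above follow from the standard indirect-selection formula, sketched below in Python: the efficiency of selecting on a part record relative to direct selection on the full record equals the genetic correlation multiplied by the ratio of the square roots of the heritabilities (equal selection intensities assumed). The numbers used here are illustrative, not estimates from the paper:

    import math

    def relative_efficiency(r_g, h2_part, h2_full):
        """Correlated response from part-record selection / direct response on the full record."""
        return r_g * math.sqrt(h2_part) / math.sqrt(h2_full)

    # e.g. a part record with h2 = 0.25 and a genetic correlation of 0.5 with the
    # full 12-month trait (h2 = 0.30) recovers roughly half the direct response
    print(relative_efficiency(r_g=0.5, h2_part=0.25, h2_full=0.30))   # about 0.46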
Abstract:
It has been reported that high-density planting of sugarcane can improve cane and sugar yield by promoting rapid canopy closure and increasing radiation interception earlier in crop growth. It is widely known that controlling adverse soil biota through fumigation (which removes soil biological constraints and improves soil health) can improve cane and sugar yield. Whether the responses to high-density planting and improved soil health are additive or interactive has important implications for the sugarcane production system. Field experiments established at Bundaberg and Mackay, Queensland, Australia, involved all combinations of two row spacings (0.5 and 1.5 m), two planting densities (27 000 and 81 000 two-eyed setts/ha), and two soil fumigation treatments (fumigated and non-fumigated). The Bundaberg experiment had two cultivars (Q124, Q155), was fully irrigated, and was harvested 15 months after planting. The Mackay experiment had one cultivar (Q117), was grown under rainfed conditions, and was harvested 10 months after planting. High-density planting (81 000 setts/ha in 0.5-m rows) did not produce any more cane or sugar yield at harvest than low-density planting (27 000 setts/ha in 1.5-m rows), regardless of location, crop duration (15 v. 10 months), water supply (irrigated v. rainfed), or soil health (fumigated v. non-fumigated). Conversely, soil fumigation generally increased cane and sugar yields regardless of site, row spacing, and planting density. In the Bundaberg experiment there was a large fumigation × cultivar × density interaction (P<0.01). Cultivar Q155 responded positively to higher planting density in non-fumigated soil but not in fumigated soil, while Q124 showed a negative response to higher planting density in non-fumigated soil but no response in fumigated soil. In the Mackay experiment, Q117 showed a non-significant trend of increasing yield in response to increasing planting density in non-fumigated soil, similar to the Q155 response in non-fumigated soil at Bundaberg. The similarity in yield across the range of row spacings and planting densities within experiments was largely due to compensation between stalk number and stalk weight, particularly when fumigation was used to address soil health. Further, the different cultivars (Q124 and Q155 at Bundaberg and Q117 at Mackay) exhibited differing physiological responses to the fumigation, row spacing, and planting density treatments. These included the rate of tiller initiation and subsequent loss, changes in stalk weight, and propensity to lodging. These responses suggest that there may be potential for selecting cultivars suited to different planting configurations.
Abstract:
The promotion of controlled traffic (matching wheel and row spacing) in the Australian sugar industry is necessitating a widening of row spacing beyond the standard 1.5 m. As all cultivars grown in the Australian industry have been selected under the standard row spacing, there are concerns that at least some cultivars may not be suitable for wider rows. To address this issue, experiments were established in northern and southern Queensland in which cultivars with different growth characteristics, recommended for each region, were grown under a range of different row configurations. In the northern Queensland experiment at Gordonvale, cultivars Q187, Q200, Q201, and Q218 were grown in 1.5-m single rows, 1.8-m single rows, 1.8-m dual rows (50 cm between duals), and 2.3-m dual rows (80 cm between duals). In the southern Queensland experiment at Farnsfield, cvv. Q138, Q205, Q222 and Q188 were also grown in 1.5-m single rows, 1.8-m single rows and 1.8-m dual rows (50 cm between duals), while 1.8-m wide-throat planted single-row and 2.0-m dual-row (80 cm between duals) configurations were also included. There was no difference in yield between the different row configurations at Farnsfield, but there was a significant row configuration × cultivar interaction at Gordonvale, due to good yields in 1.8-m single and dual rows with Q201 and poor yields with Q200 at the same row spacings. There was no significant difference between the two cultivars in 1.5-m single and 2.3-m dual rows. The experiments once again demonstrated the compensatory capacity that exists in sugarcane to manipulate stalk number and individual stalk weight as a means of producing similar yields across a range of row configurations and planting densities. There was evidence of different growth patterns between cultivars in response to different row configurations (viz. propensity to tiller, susceptibility to lodging, ability to compensate between stalk number and stalk weight), suggesting that there may be genetic differences in response to row configuration. It is argued that there is a need to evaluate potential cultivars under a wider range of row configurations than the standard 1.5-m single rows. Cultivars that perform well in row configurations ranging from 1.8 to 2.0 m are essential if the adverse effects of soil compaction are to be managed through the adoption of controlled traffic.