55 results for Growth And Development
Abstract:
The influence of barley and oat grain supplements on hay dry matter intake (DMI), carcass component gain and meat quality in lambs fed a low-quality basal diet was examined. Thirty-five crossbred wether lambs (9 months of age) were divided into four groups. After adaptation to a basal diet of 85% oat hay and 15% lucerne hay for one week, an initial group of 11 was slaughtered. The weights of carcass components and the digesta-free empty body weight (EBW) of this group were used to estimate the weight of carcass components of the other three experimental groups at the start of the experiment. The remaining three groups were randomly assigned to pens and fed ad libitum the basal diet alone (basal), the basal diet with 300 g/day of air-dry barley grain (barley), or the basal diet with 300 g/day of air-dry oat grain (oat). Supplements were fed twice weekly (i.e., 900 g on Tuesday and 1200 g on Friday). After 13 weeks of feeding, animals were slaughtered and, at 24 h post-mortem, meat quality and subcutaneous fat colour were measured. Samples of longissimus muscle were collected for determination of sarcomere length and meat tenderness. Hay DMI was reduced (P<0.01) by both barley and oat supplements. Lambs fed barley or oat had moderately higher digestibility of DM, and higher intakes of CP (P<0.05) and ME (P<0.01), than basal lambs. Final live weight of barley and oat lambs was higher (P<0.05) than that of basal lambs, but this was not reflected in EBW or hot carcass weight. Lambs fed barley or oat had increases in protein (P<0.01) and water (P<0.001) in the carcass, but fat gain was not changed (P>0.05). There were no differences in eye muscle area or fat depth (total muscle and adipose tissue depth at the 12th rib, 110 mm from the midline; GR) among groups. The increased levels of protein and water in the carcass of barley- and oat-fed lambs, associated with improved muscle production, were small and did not alter (P>0.05) any of the carcass/meat quality attributes compared to lambs fed a low-quality forage diet. Feeding barley or oat grain at 0.9–1% of live weight daily to lambs consuming poor-quality hay may not substantially improve carcass quality, but may be useful in maintaining body condition of lambs through the dry season for slaughter out of season.
Abstract:
In many designed experiments with animals, liveweight is recorded several times during the trial. Such data are commonly referred to as repeated measures data. An aim of such experiments is generally to compare the growth patterns for the applied treatments. This paper discusses some of the methods of analysing repeated measures data and illustrates the use of cubic smoothing splines to describe irregular cattle growth data. Animal production for a consuming world: proceedings of the 9th Congress of the Asian-Australasian Association of Animal Production Societies (AAAP), the 23rd Biennial Conference of the Australian Society of Animal Production (ASAP) and the 17th Annual Symposium of the University of Sydney Dairy Research Foundation (DRF), 2-7 July 2000, Sydney, Australia.
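As an illustration of the spline approach described above (not the authors' actual analysis), the following minimal Python sketch fits a cubic smoothing spline to irregular liveweight records; the weigh dates and weights are hypothetical.

```python
# Minimal sketch: fitting a cubic smoothing spline to irregular liveweight
# records for one animal. Hypothetical data; not the paper's dataset or code.
import numpy as np
from scipy.interpolate import UnivariateSpline

days = np.array([0, 14, 27, 45, 60, 82, 101, 120])               # irregular weigh dates
liveweight = np.array([215, 222, 231, 236, 244, 255, 260, 268])  # kg

# k=3 gives a cubic spline; the smoothing factor s trades fidelity for smoothness
spline = UnivariateSpline(days, liveweight, k=3, s=len(days) * 4.0)

grid = np.linspace(days.min(), days.max(), 200)
smoothed = spline(grid)                   # fitted growth curve
growth_rate = spline.derivative()(grid)   # instantaneous growth rate, kg/day

print(f"Estimated growth rate at day 60: {float(spline.derivative()(60.0)):.2f} kg/day")
```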
Abstract:
In recent years, significant numbers of Australian goats have been harvested from the feral population to supply a strong demand for export of meat. In addition, large numbers of feral does have been domesticated to increase breeding herds in western Queensland. Introduction of the Boer breed to Australia as a specialist meat goat may provide a genetic means for improving the productive performance of the Australian feral goat. The present paper reports growth and carcase attributes of feral and Boer × feral genotypes born in 1998 and the birthweight of those born in 1999. Animal production for a consuming world: proceedings of the 9th Congress of the Asian-Australasian Association of Animal Production Societies (AAAP), the 23rd Biennial Conference of the Australian Society of Animal Production (ASAP) and the 17th Annual Symposium of the University of Sydney Dairy Research Foundation (DRF), 2-7 July 2000, Sydney, Australia.
Abstract:
The development of innovative methods of stock assessment is a priority for State and Commonwealth fisheries agencies. It is driven by the need to facilitate sustainable exploitation of naturally occurring fisheries resources for the current and future economic, social and environmental well-being of Australia. This project was initiated in this context and took advantage of considerable recent achievements in genomics that are shaping our comprehension of the DNA of humans and animals. The basic idea behind this project was that genetic estimates of effective population size, which can be made from empirical measurements of genetic drift, are equivalent to estimates of the number of successful spawners, an important parameter in the process of fisheries stock assessment. The broad objectives of this study were to: (1) critically evaluate a variety of mathematical methods of calculating effective spawner numbers (Ne) by (a) conducting comprehensive computer simulations and (b) analysing empirical data collected from the Moreton Bay population of tiger prawns (P. esculentus); (2) lay the groundwork for the application of the technology in the northern prawn fishery (NPF); and (3) produce software for the calculation of Ne and make it widely available. The project pulled together a range of mathematical models for estimating current effective population size from diverse sources. Some of them had recently been implemented with the latest statistical methods (e.g. the Bayesian framework of Berthier, Beaumont et al. 2002), while others had lower profiles (e.g. Pudovkin, Zaykin et al. 1996; Rousset and Raymond 1995). Computer code, and later software with a user-friendly interface (NeEstimator), was produced to implement the methods. This was used as a basis for simulation experiments to evaluate the performance of the methods with an individual-based model of a prawn population. Following the guidelines suggested by the computer simulations, the tiger prawn population in Moreton Bay (south-east Queensland) was sampled for genetic analysis with eight microsatellite loci in three successive spring spawning seasons in 2001, 2002 and 2003. As predicted by the simulations, the estimates had non-infinite upper confidence limits, which is a major achievement for the application of the method to a naturally occurring, short-generation, highly fecund invertebrate species. The genetic estimate of the number of successful spawners was around 1000 individuals in two consecutive years. This contrasts with about 500,000 prawns participating in spawning. It is not possible to distinguish successful from non-successful spawners, so we suggest a high level of protection for the entire spawning population. We interpret the difference in numbers between successful and non-successful spawners as reflecting a large variation in the number of offspring per family that survive: a large number of families have no surviving offspring, while a few have a large number. We explored various ways in which Ne can be useful in fisheries management. It can be a surrogate for spawning population size, assuming the ratio between Ne and spawning population size has been previously calculated for that species. Alternatively, it can be a surrogate for recruitment, again assuming that the ratio between Ne and recruitment has been previously determined. The number of species that can be analysed in this way, however, is likely to be small because of species-specific life history requirements that need to be satisfied for accuracy.
The most universal approach would be to integrate Ne with spawning stock-recruitment models, so that these models are more accurate when applied to fisheries populations. A pathway to achieve this was established in this project, which we predict will significantly improve fisheries sustainability in the future. Regardless of the success of integrating Ne into spawning stock-recruitment models, Ne could be used as a fisheries monitoring tool. Declines in spawning stock size, or increases in natural or harvest mortality, would be reflected by a decline in Ne. This would be valuable for data-poor fisheries and provides fishery-independent information; however, we suggest a species-by-species approach, as some species may be too numerous or experiencing too much migration for the method to work. During the project, two important theoretical studies of the simultaneous estimation of effective population size and migration were published (Vitalis and Couvet 2001b; Wang and Whitlock 2003). These methods, combined with the collection of preliminary genetic data from the tiger prawn population in the southern Gulf of Carpentaria and a computer simulation study that evaluated the effect of differing reproductive strategies on genetic estimates, suggest that this technology could make an important contribution to the stock assessment process in the northern prawn fishery (NPF). Advances in genomics are rapid, and a cheaper, more reliable substitute for microsatellite loci in this technology is already available. Digital data from single nucleotide polymorphisms (SNPs) are likely to supersede 'analogue' microsatellite data, making it cheaper and easier to apply the method to species with large population sizes.
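As a worked illustration of the temporal (genetic drift) approach to estimating effective population size, the sketch below implements the classical Nei & Tajima (1981) variance estimator with the Waples (1989) sampling correction. This is the standard textbook formulation, offered only as an assumed example of the kind of calculation software such as NeEstimator performs, not the project's exact estimator; the allele frequencies, sample sizes and generation interval are hypothetical.

```python
# Sketch of the classical temporal (allele-frequency variance) estimator of Ne.
# Standard Nei & Tajima (1981) Fc with Waples (1989) bias correction (plan II).
# Allele frequencies and sample sizes below are hypothetical.
import numpy as np

def temporal_ne(p0, pt, s0, st, generations):
    """Estimate effective population size from two temporally spaced samples.

    p0, pt       : allele frequencies at time 0 and time t (one locus)
    s0, st       : number of individuals sampled at each time point
    generations  : number of generations separating the samples
    """
    p0, pt = np.asarray(p0, float), np.asarray(pt, float)
    # Nei & Tajima's standardised variance of allele-frequency change
    fc = np.mean((p0 - pt) ** 2 / ((p0 + pt) / 2.0 - p0 * pt))
    # Remove sampling noise, then invert to get Ne (Waples 1989, plan II)
    f_drift = fc - 1.0 / (2 * s0) - 1.0 / (2 * st)
    return np.inf if f_drift <= 0 else generations / (2.0 * f_drift)

# Hypothetical microsatellite allele frequencies at one locus, two spawning seasons
p_2001 = [0.42, 0.31, 0.17, 0.10]
p_2002 = [0.45, 0.27, 0.18, 0.10]
print(temporal_ne(p_2001, p_2002, s0=400, st=400, generations=1))
```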
Abstract:
Isolates of Sclerotinia sclerotiorum were collected from infected lentil plants from 2 agro-ecological zones of Syria and used to study their comparative growth on culture media and pathogenicity on different lentil genotypes. The growth studies were carried out on Potato Dextrose Agar (PDA) growth media under laboratory conditions. Mycelial radial growth and sclerotial production were the parameters used to compare the isolates. Pathogenicity studies were carried out with selected isolates on 10 lentil genotypes, inoculated as detached shoots and as whole potted plants in the plastic house. The isolates showed considerable variation in cultural characteristics in terms of mycelial growth, mycelial pigmentation and sclerotial production on the media plates. There were significant differences in the growth and sclerotial production of most of the isolates, but no apparent correlation between mycelial growth and sclerotial production among the isolates. Genotype-by-isolate interactions were significant for the isolates tested for pathogenicity. These interactions, however, appeared to be caused by differences in virulence of the isolates and did not suggest the occurrence of distinct pathogenic races among the isolates.
Abstract:
This study investigated whether mixed-species designs can increase the growth of a tropical eucalypt when compared to monocultures. Monocultures of Eucalyptus pellita (E) and Acacia peregrina (A) and mixtures in various proportions (75E:25A, 50E:50A, 25E:75A) were planted in a replacement series design on the Atherton Tablelands of north Queensland, Australia. High mortality in the establishment phase, due to repeated damage by tropical cyclones, altered the trial design. Effects of the experimental designs on tree growth were estimated using a linear mixed-effects model with restricted maximum likelihood analysis (REML). Volume growth of individual eucalypt trees was positively affected by the presence of acacia trees at age 5 years, and this effect generally increased with time up to age 10 years. However, stand volume and basal area increased with increasing proportions of E. pellita, due to its larger individual tree size. Conventional analysis did not offer convincing support for mixed-species designs. Preliminary individual-based modelling using a modified Hegyi competition index offered a solution and an equation indicating that acacias have positive ecological interactions (facilitation or competitive reduction) and do not impose the competition that a eucalypt neighbour does. These results suggest that significant increases in growth rates could be achieved with mixed-species designs. This statistical methodology could enable a better understanding of species interactions in similarly altered experiments, or in undesigned mixed-species plantations.
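As context for the individual-based modelling mentioned above, the sketch below computes the basic Hegyi (1974) distance-weighted competition index; the paper uses a modified form, so this is only an assumed illustration, with hypothetical tree coordinates and diameters.

```python
# Sketch of the basic Hegyi (1974) distance-weighted competition index.
# The paper uses a *modified* Hegyi index; this shows only the standard form,
# with hypothetical tree coordinates and diameters.
import numpy as np

def hegyi_index(dbh, x, y, subject, radius=10.0):
    """Competition index for one subject tree: sum over neighbours j within
    `radius` of (dbh_j / dbh_i) / distance_ij."""
    dbh, x, y = map(np.asarray, (dbh, x, y))
    dist = np.hypot(x - x[subject], y - y[subject])
    neighbours = (dist > 0) & (dist <= radius)
    return float(np.sum((dbh[neighbours] / dbh[subject]) / dist[neighbours]))

# Hypothetical plot: a eucalypt (index 0) with acacia and eucalypt neighbours
dbh = [18.0, 12.0, 14.0, 20.0]          # cm
x   = [0.0,  3.0, -4.0,  6.0]           # m
y   = [0.0,  2.5,  1.0, -5.0]           # m
print(f"Hegyi CI for tree 0: {hegyi_index(dbh, x, y, subject=0):.2f}")
```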
Abstract:
The problem of cannibalism in communally reared crabs can be eliminated by separating the growing crabs into holding compartments. There is currently no information on the optimal compartment size for growing crabs individually. A total of 136 second-instar crablets (Portunus sanguinolentus) (C2, ca. 7-10 mm carapace width (CW)) were grown for 90 days in 10 different-sized opaque- and transparent-walled acrylic compartments. The base area for each compartment ranged from small (32 mm × 32 mm) to large (176 mm × 176 mm). Effects of holding space and wall transparency on survival, CW, moult increment, intermoult period and average weekly gain (AWG) were examined. Most crabs reached instars C9-C10 (50-70 mm CW) by the end of the experiment. The final survival rate in the smallest compartment was 25%, mainly due to moult-related mortality predominantly occurring at the C9 instar. However, crabs in these smaller compartments had earlier produced significantly larger moult increments from instar to instar than those in the larger compartments (P < 0.05). Crabs in the smaller compartments (<65 mm × 65 mm) also showed significantly longer intermoult periods (P < 0.05). The net result was that AWG in CW was 5.22 mm/week for the largest compartment and 5.15 mm/week in the smallest, and did not differ significantly between compartment size groups (P = 0.916). Wall transparency had no impact on survival (P = 0.530) but a slight impact on AWG (P = 0.014). Survival rate was the best indicator of the minimum acceptable compartment size (≥43 mm × 43 mm) for C10 crablets, because below this size death occurred before growth rate was significantly affected. For further growth, it would be necessary to transfer the crablets to larger compartments.
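For readers unfamiliar with the growth metrics reported above, the following sketch shows how moult increment, intermoult period and average weekly gain (AWG) in carapace width might be derived from moult records; the values are hypothetical and not the study's data.

```python
# Sketch of how moult increment, intermoult period and average weekly gain (AWG)
# might be derived from carapace-width (CW) records; values are hypothetical,
# not the study's data.
moult_records = [  # (day of moult, CW in mm after the moult)
    (0, 8.0), (9, 12.5), (19, 18.0), (31, 25.0), (45, 33.5),
    (61, 43.0), (79, 54.0), (90, 63.0),
]

increments = [b[1] - a[1] for a, b in zip(moult_records, moult_records[1:])]
intermoult = [b[0] - a[0] for a, b in zip(moult_records, moult_records[1:])]

total_days = moult_records[-1][0] - moult_records[0][0]
awg = (moult_records[-1][1] - moult_records[0][1]) / (total_days / 7.0)

print(f"moult increments (mm): {increments}")
print(f"intermoult periods (days): {intermoult}")
print(f"AWG: {awg:.2f} mm/week")
```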
Abstract:
The parasitic weed Orobanche crenata inflicts major damage on faba bean, lentil, pea and other crops in Mediterranean environments. The development of methods to control O. crenata is to a large extent hampered by the complexity of host-parasite systems. Using a model of host-parasite interactions can help to explain and understand this intricacy. This paper reports on the evaluation and application of a model simulating host-parasite competition, as affected by environment and management, that was implemented in the framework of the Agricultural Production Systems Simulator (APSIM). Model-predicted faba bean and O. crenata growth and development were evaluated against independent data. The APSIM-Fababean and -Parasite modules displayed a good capability to reproduce effects of pedoclimatic conditions, faba bean sowing date and O. crenata infestation on host-parasite competition. The r² values throughout exceeded 0.84 (RMSD: 5.36 days) for phenological, 0.85 (RMSD: 223.00 g m⁻²) for host growth and 0.78 (RMSD: 99.82 g m⁻²) for parasite growth parameters. Inaccuracies in simulated faba bean root growth that caused some bias in predicted parasite number and host yield loss may be dealt with by more flexibly simulating vertical root distribution. The model was applied in simulation experiments to determine optimum sowing windows for infected and non-infected faba bean in Mediterranean environments. Simulation results proved realistic and testified to the capability of APSIM to contribute to the development of tactical approaches in parasitic weed control.
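The abstract reports model fit as r² and RMSD. A minimal sketch of how these two statistics can be computed from observed and simulated values is given below; the biomass figures are hypothetical, not the evaluation data.

```python
# Sketch of the two goodness-of-fit statistics quoted in the abstract (r² and
# RMSD) for comparing simulated against observed values; data are hypothetical.
import numpy as np

def rmsd(observed, predicted):
    observed, predicted = np.asarray(observed, float), np.asarray(predicted, float)
    return float(np.sqrt(np.mean((predicted - observed) ** 2)))

def r_squared(observed, predicted):
    observed, predicted = np.asarray(observed, float), np.asarray(predicted, float)
    return float(np.corrcoef(observed, predicted)[0, 1] ** 2)

# Hypothetical faba bean biomass (g/m^2): observed vs APSIM-simulated
obs  = [120, 310, 540, 820, 990]
pred = [105, 335, 500, 860, 1010]
print(f"RMSD = {rmsd(obs, pred):.1f} g/m^2, r^2 = {r_squared(obs, pred):.2f}")
```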
Abstract:
Residue retention is an important issue in evaluating the sustainability of production forestry. However, its long-term impacts have not been studied extensively, especially in sub-tropical environments. This study investigated the long-term impact of harvest residue retention on tree nutrition, growth and productivity of an F1 hybrid (Pinus elliottii var. elliottii × Pinus caribaea var. hondurensis) exotic pine plantation in sub-tropical Australia, under three harvest residue management regimes: (1) residue removal, RR0; (2) single residue retention, RR1; and (3) double residue retention, RR2. The experiment, established in 1996, is a randomised complete block design with 4 replicates. Tree growth measurements in this study were carried out at ages 2, 4, 6, 8 and 10 years, while foliar nutrient analyses were carried out at ages 2, 4, 6 and 10 years. Litter production and litter nitrogen (N) and phosphorus (P) measurements were carried out quarterly over a 15-month period between ages 9 and 10 years. Results showed that total tree growth was still greater in the residue-retained treatments than in the RR0 treatment. However, mean annual increments of diameter at breast height (MAID) and basal area (MAIB) declined significantly after age 4 years, to about 68-78% at age 10 years. Declining foliar N and P concentrations accounted for 62% (p < 0.05) of the variation in growth rates after age 4 years, and foliar N and P concentrations were either marginal or below critical concentrations. In addition, litter production and litter N and P contents were not significantly different among the treatments. This study suggests that the impact of residue retention on tree nutrition and growth rates might be limited over a longer period, and that the integration of alternative forest management practices is necessary to sustain the benefits of harvest residues until the end of the rotation.
Abstract:
An understanding of the growth and photosynthetic responses of subtropical rainforest species to variations in light environment can be useful for determining the sequence of species introductions in rainforest restoration projects and mixed-species plantations. We examined the growth and physiology of six Australian subtropical rainforest tree species in a greenhouse under three artificial light environments (10%, 30% and 60% of full sunlight). Morphological responses followed the typical sun-shade dichotomy, with early and late secondary species (Elaeocarpus grandis, Flindersia brayleyana, Flindersia schottiana and Gmelina leichhardtii) displaying higher relative growth rates (RGR) compared to mature-stage species (Cryptocarya erythroxylon and Heritiera trifoliolatum). Growth and photosynthetic performance of most species reached a maximum in 30-60% of full sunlight. Physiological responses provided limited evidence of a distinct dichotomy between early and late successional species. E. grandis and F. brayleyana provided a clear representation of early successional species, with a marked increase in Amax in high light and an ability to down-regulate the photosynthetic machinery in low-light conditions. The remaining species (F. schottiana, G. leichhardtii and H. trifoliolatum) were better represented as falling along a shade-tolerance continuum, with limited ability to adjust physiologically to an increase or decrease in light, maintaining similar Amax across all light environments. Results show that most species belong to a shade-tolerant constituency, with an ability to grow and persist across a wide range of light environments. The species offer a wide range of potential planting scenarios and silvicultural options, with ample potential to achieve rapid canopy closure and rainforest restoration goals.
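Relative growth rate (RGR) as used above is conventionally calculated from log-transformed weights at two harvest times; a minimal sketch of that standard calculation follows, with hypothetical seedling dry weights (this assumes the classical formula, not the authors' specific protocol).

```python
# Sketch of the classical relative growth rate (RGR) calculation,
# RGR = (ln W2 - ln W1) / (t2 - t1); biomass values are hypothetical.
import math

def relative_growth_rate(w1, w2, t1, t2):
    """RGR in g/g/day from two whole-plant dry weights (g) at days t1 and t2."""
    return (math.log(w2) - math.log(w1)) / (t2 - t1)

# Hypothetical seedling dry weights at 0 and 90 days under 30% full sunlight
print(f"RGR = {relative_growth_rate(2.1, 14.8, 0, 90):.4f} g/g/day")
```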
Abstract:
The effect of defoliation on Amarillo (Arachis pintoi cv. Amarillo) was studied in a glasshouse and in mixed swards with 2 tropical grasses. In the glasshouse, Amarillo plants grown in pots were subjected to a 30/20°C or 25/15°C temperature regime and to defoliation at 10-, 20- or 30-day intervals for 60 days. Two field plot studies were conducted on Amarillo with either irrigated kikuyu (Pennisetum clandestinum) in autumn and spring or dryland Pioneer rhodes grass (Chloris gayana) over summer and autumn. Treatments imposed were 3 defoliation intervals (7, 14 and 28 days) and 2 residual heights (5 and 10 cm for kikuyu; 3 and 10 cm for rhodes grass), with extra treatments (56 days to 3 cm for both grasses and 21 days to 5 cm for kikuyu). Defoliation interval had no significant effect on accumulated Amarillo leaf dry matter (DM) at either temperature regime. At the higher temperature, frequent defoliation reduced root dry weight (DW) and increased crude protein (CP) but had no effect on stolon DW or in vitro organic matter digestibility (OMD). On the other hand, at the lower temperature, frequent defoliation reduced stolon DW and increased OMD but had no effect on root DW or CP. Irrespective of temperature and defoliation, water-soluble carbohydrate levels were higher in stolons than in roots (4.70 vs 3.65%), whereas for starch the reverse occurred (5.37 vs 9.44%). Defoliating the Amarillo-kikuyu sward once at 56 days to 3 cm produced the highest DM yield in autumn and spring (582 and 7121 kg/ha DM, respectively), although the Amarillo component and OMD were substantially reduced. The highest DM yields (1726 kg/ha) were also achieved in the Amarillo-rhodes grass sward when defoliated every 56 days to 3 cm, although the Amarillo component was unaffected. In a mixed sward with either kikuyu or rhodes grass, the Amarillo component in the sward was maintained up to a 28-day defoliation interval and was higher when more severely defoliated. The results show that Amarillo can tolerate frequent defoliation and that it can co-exist with tropical grasses of differing growth habits, provided the Amarillo-tropical grass sward is subject to frequent and severe defoliation.
Abstract:
Aconophora compressa (Hemiptera: Membracidae), a biological control agent introduced against the weed Lantana camara (Verbenaceae) in Australia, has since been observed on several non-target plant species, including the native mangrove Avicennia marina (Acanthaceae). In this study we evaluated the suitability of two native mangroves, A. marina and Aegiceras corniculatum (Myrsinaceae), for the survival and development of A. compressa through no-choice field cage studies. The longevity of females was significantly higher on L. camara (57.7 ± 3.8 days) than on A. marina (43.3 ± 3.3 days) and A. corniculatum (45.7 ± 3.8 days). The proportion of females laying eggs was highest on L. camara (72%), followed by A. marina (36%) and A. corniculatum (17%). More egg batches per female were laid on L. camara than on A. marina and A. corniculatum. Although more nymphs per shoot emerged on L. camara (29.9 ± 2.8) than on A. marina (13 ± 4.8) and A. corniculatum (10 ± 5.3), the number of nymphs that developed through to adults was not significantly different. The duration of nymphal development was longer on A. marina (67 ± 5.8 days) than on L. camara (48 ± 4 days) and A. corniculatum (43 ± 4.6 days). The results, which contrast with those from previous glasshouse and quarantine trials, provide evidence that A. compressa adults can survive, lay eggs and complete nymphal development on the two non-target native mangroves in the field under no-choice conditions.
Abstract:
Fibre diameter can vary dramatically along a wool staple, especially in the Mediterranean environment of southern Australia with its dry summers and abundance of green feed in spring. Other research results have shown a very low phenotypic correlation between fibre diameter grown in different seasons. Many breeders use short staples to measure fibre diameter for breeding purposes and also to promote animals for sale. The effectiveness of this practice is determined by the relative response to selection from measuring fibre traits on a full 12-month wool staple compared with measuring them on only part of a staple. If a high genetic correlation exists between the part record and the full record, then using part records may be acceptable to identify genetically superior animals. No information is available on the effectiveness of part records. This paper investigated whether wool growth and fibre diameter traits of Merino wool grown at different times of the year in a Mediterranean environment are genetically the same traits. The work was carried out on about 7 dyebanded wool sections per animal per year, on ewes from weaning to hogget age, in the Katanning Merino resource flocks over 6 years. Relative clean wool growth of the different sections had very low heritability estimates of less than 0.10, and the sections were phenotypically and genetically poorly correlated with 6- or 12-month wool growth. This indicates that part-record measurement of clean wool growth of these sections will be ineffective as an indirect selection criterion to improve wool growth genetically. Staple length growth, as measured by the length between dyebands, would be more effective, with heritability estimates of between 0.20 and 0.30. However, these measurements were shown to have a low genetic correlation with wool grown for 12 months, which implies that these staple length measurements would only be half as efficient as wool weight for 6 or 12 months in improving total clean wool weight. Heritability estimates of fibre diameter, coefficient of variation of fibre diameter and fibre curvature were relatively high and were genetically and phenotypically highly correlated across sections. High positive phenotypic and genetic correlations were also found between fibre diameter, coefficient of variation of fibre diameter and fibre curvature of the different sections and similar measurements for wool grown over 6 or 12 months. Coefficient of variation of fibre diameter of the sections also had a moderate negative phenotypic and genetic correlation with staple strength of wool staples grown over 6 months, indicating that coefficient of variation of fibre diameter of any section would be as good an indirect selection criterion to improve staple strength as coefficient of variation of fibre diameter for wool grown over 6 or 12 months. The results indicate that fibre diameter, coefficient of variation of fibre diameter and fibre curvature of wool grown over short periods of time have virtually the same heritability as that of wool grown over 12 months, and that the genetic correlation between fibre diameter, coefficient of variation of fibre diameter and fibre curvature on part and on full records is very high (rg > 0.85). This indicates that fibre diameter, coefficient of variation of fibre diameter and fibre curvature on part records can be used as selection criteria to improve these traits.
However, part records of greasy and clean wool growth would be much less efficient than fleece weight for wool grown over 6 or 12 months because of the low heritability of part records and the low genetic correlation between these traits on part records and on wool grown for 12 months.
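The efficiency comparisons above (e.g. part records being "only half as efficient") follow from the standard correlated-response formula of quantitative genetics. A minimal sketch, assuming equal selection intensities and hypothetical parameter values rather than the paper's estimates, is given below.

```python
# Sketch of the standard quantitative-genetics result behind statements such as
# "only half as efficient": with equal selection intensity, the efficiency of
# indirect selection on trait X to improve trait Y, relative to direct
# selection on Y, is  r_g * h_X / h_Y  (square roots of the heritabilities).
# Heritabilities and genetic correlation below are hypothetical illustrations,
# not the estimates reported in the paper.
import math

def relative_efficiency(h2_part, h2_full, r_g):
    """Correlated response in the full-record trait from selecting on the part
    record, relative to direct selection on the full record."""
    return r_g * math.sqrt(h2_part) / math.sqrt(h2_full)

# e.g. a part-record trait with h^2 = 0.25, full record h^2 = 0.35, r_g = 0.6
print(f"relative efficiency = {relative_efficiency(0.25, 0.35, 0.6):.2f}")
```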
Abstract:
It has been reported that high-density planting of sugarcane can improve cane and sugar yield by promoting rapid canopy closure and increasing radiation interception earlier in crop growth. It is widely known that the control of adverse soil biota through fumigation (which removes soil biological constraints and improves soil health) can improve cane and sugar yield. Whether the responses to high-density planting and improved soil health are additive or interactive has important implications for the sugarcane production system. Field experiments established at Bundaberg and Mackay, Queensland, Australia, involved all combinations of two row spacings (0.5 and 1.5 m), two planting densities (27 000 and 81 000 two-eyed setts/ha), and two soil fumigation treatments (fumigated and non-fumigated). The Bundaberg experiment had two cultivars (Q124, Q155), was fully irrigated, and was harvested 15 months after planting. The Mackay experiment had one cultivar (Q117), was grown under rainfed conditions, and was harvested 10 months after planting. High-density planting (81 000 setts/ha in 0.5-m rows) did not produce any more cane or sugar yield at harvest than low-density planting (27 000 setts/ha in 1.5-m rows), regardless of location, crop duration (15 v. 10 months), water supply (irrigated v. rainfed), or soil health (fumigated v. non-fumigated). Conversely, soil fumigation generally increased cane and sugar yields regardless of site, row spacing, and planting density. In the Bundaberg experiment there was a large fumigation × cultivar × density interaction (P<0.01). Cultivar Q155 responded positively to higher planting density in non-fumigated soil but not in fumigated soil, while Q124 showed a negative response to higher planting density in non-fumigated soil but no response in fumigated soil. In the Mackay experiment, Q117 showed a non-significant trend of increasing yield in response to increasing planting density in non-fumigated soil, similar to the Q155 response in non-fumigated soil at Bundaberg. The similarity in yield across the range of row spacings and planting densities within experiments was largely due to compensation between stalk number and stalk weight, particularly when fumigation was used to address soil health. Further, the different cultivars (Q124 and Q155 at Bundaberg and Q117 at Mackay) exhibited differing physiological responses to the fumigation, row spacing, and planting density treatments. These included the rate of tiller initiation and subsequent loss, changes in stalk weight, and the propensity to lodging. These responses suggest that there may be potential for selecting cultivars suited to different planting configurations.