34 results for Feeding techniques
Abstract:
Laboratory colonies of 15 economically important species of multi-host fruit flies (Diptera: Tephritidae) have been established in eight South Pacific island countries for the purpose of undertaking biological studies, particularly host status testing and research on quarantine treatments. Laboratory rearing techniques are based on the development of artificial diets for larvae consisting predominantly of the pulp of locally available fruits, including pawpaw, breadfruit and banana. The pawpaw diet is the standard diet and is used in seven countries for rearing 11 species. Diet ingredients are standard proportions of fruit pulp, hydrolysed protein and a bacterial and fungal inhibitor. The diet is particularly suitable for post-harvest treatment studies when larvae of known age are required. Another major development in the laboratory rearing system is the use of pure strains of Enterobacteriaceae bacterial cultures as important adult-feeding supplements. These bacterial cultures are dissected out of the crop of wild females, isolated by sub-culturing, and identified before being supplied to adults on peptone yeast extract agar plates. Most species are egged using thin plastic receptacles perforated with 1 mm oviposition holes, with fruit juice or larval diet smeared internally as an oviposition stimulant. Laboratory rearing techniques have been standardised across all of the Pacific countries. Quality control monitoring is based on acceptable ranges in per cent egg hatch, pupal weight and pupal mortality. Colonies are rejuvenated every 6 to 12 months by crossing wild males with laboratory-reared females and vice versa. The standard rearing techniques, equipment and ingredients used in the collection, establishment, maintenance and quality control of these fruit fly species are detailed in this paper.
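As a rough illustration of the quality control step described above, the sketch below (Python, with placeholder acceptable ranges rather than values from the paper) checks a colony's weekly measurements of per cent egg hatch, pupal weight and pupal mortality against their acceptable ranges.

```python
# Hypothetical sketch of range-based quality control monitoring for a rearing
# colony. The acceptable ranges are placeholders, not values from the paper.
ACCEPTABLE = {
    "egg_hatch_pct":       (70.0, 100.0),  # placeholder range
    "pupal_weight_mg":     (8.0, 14.0),    # placeholder range
    "pupal_mortality_pct": (0.0, 20.0),    # placeholder range
}

def colony_within_spec(measurements: dict) -> bool:
    """Return True if every monitored trait falls inside its acceptable range."""
    return all(low <= measurements[trait] <= high
               for trait, (low, high) in ACCEPTABLE.items())

weekly = {"egg_hatch_pct": 82.5, "pupal_weight_mg": 10.1, "pupal_mortality_pct": 12.0}
print(colony_within_spec(weekly))  # True: this week's colony passes the check
```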
Abstract:
Lates calcarifer supports important fisheries throughout tropical Australia. Community-driven fish stocking has resulted in the creation of impoundment fisheries and supplemental stocking of selected wild riverine populations. Using predominantly tag-recapture methods, condition assessment and stomach flushing techniques, this study compared the growth of stocked and wild L. calcarifer in a tropical Australian river (Johnstone River) and of stocked fish in a nearby impoundment (Lake Tinaroo). Growth of L. calcarifer in the Johnstone River appeared resource-limited, with juvenile fish in its lower freshwater reaches feeding mainly on small atyid shrimp and limited quantities of fish. Growth was probably greater in estuarine and coastal areas than in the lower freshwater river. Fish in Lake Tinaroo, where prey availability was greater, grew faster than either wild or stocked fish in the lower freshwater areas of the Johnstone River. Growth of L. calcarifer was highly seasonal, with marked declines in the cooler months. This was reflected in both stomach fullness and the percentage of fish with empty stomachs, but the condition of L. calcarifer was similar across most sites. In areas where food resources appear stretched, adverse effects on resident L. calcarifer populations and their attendant prey species should be minimised by adopting more conservative stocking practices or by ceasing stocking altogether.
Abstract:
From the findings of McPhee et al. (1988), there is an expectation that selection in the growing pig for bodyweight gain measured on restricted feeding will result in favourable responses in the rate and efficiency of growth of lean pork on different levels of feeding. This paper examines this expectation in two lines of Australian Large White pigs that have undergone 3 years of selection for high and for low growth rate over a 6-week period starting at 50 kg liveweight. Over this test period, pigs of both lines are fed the same total amount of grower food, restricted to an estimated 80% of average ad libitum intake. 'Animal production for a consuming world': proceedings of the 9th Congress of the AAAP Societies, the 23rd Biennial Conference of the ASAP and the 17th Annual Symposium of the University of Sydney Dairy Research Foundation (DRF), Sydney, Australia.
Abstract:
The widespread and increasing resistance of internal parasites to anthelmintic control is a serious problem for the Australian sheep and wool industry. As part of control programmes, laboratories use the Faecal Egg Count Reduction Test (FECRT) to determine resistance to anthelmintics. It is important to have confidence in the measure of resistance, not only for the producer planning a drenching programme but also for companies investigating the efficacy of their products. The determination of resistance and the corresponding confidence limits as given in the anthelmintic efficacy guidelines of the Standing Committee on Agriculture (SCA) is based on a number of assumptions. This study evaluated the appropriateness of these assumptions for typical data and compared the effectiveness of the standard FECRT procedure with that of alternative procedures. Several sets of historical experimental data from sheep and goats were analysed to determine that a negative binomial distribution was more appropriate than a normal distribution for describing pre-treatment helminth egg counts in faeces. Simulated egg counts for control animals were generated stochastically from negative binomial distributions, and those for treated animals from negative binomial and binomial distributions. Three methods for determining resistance when percent reduction is based on arithmetic means were applied: the first was that advocated in the SCA guidelines; the second was similar to the first but based the variance estimates on negative binomial distributions; and the third used Wadley's method with the distribution of the response variate assumed negative binomial and a logit link transformation. These were also compared with a fourth method, recommended by the International Co-operation on Harmonisation of Technical Requirements for Registration of Veterinary Medicinal Products (VICH) programme, in which percent reduction is based on geometric means. A wide selection of parameters was investigated and, for each set, 1000 simulations were run. Percent reduction and confidence limits were then calculated for each method, together with the number of times in each set of 1000 simulations that the theoretical percent reduction fell within the estimated confidence limits and the number of times resistance would have been declared. These simulations provide the basis for setting conditions under which the methods could be recommended. The authors show that, given the distribution of helminth egg counts found in Queensland flocks, the method based on arithmetic rather than geometric means should be used, and suggest that resistance be redefined as occurring when the upper confidence limit of percent reduction is less than 95%. At least ten animals per group are required in most circumstances, though even 20 may be insufficient where the effectiveness of the product is close to the cut-off point for defining resistance.
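As a rough sketch of the kind of calculation being compared here (not the authors' code, and with illustrative parameter values), the Python below simulates egg counts for control and treated groups from negative binomial distributions, computes percent reduction from arithmetic means, and derives approximate confidence limits from the variance of the log ratio of means, a commonly used arithmetic-mean FECRT formulation; resistance is then flagged using the suggested criterion of an upper confidence limit below 95%.

```python
# Minimal FECRT sketch: negative binomial egg counts, arithmetic-mean
# percent reduction, approximate 95% confidence limits. Illustrative only.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

def neg_binom(mean, k, size):
    # numpy parameterises NB by (n, p); convert from mean and dispersion k
    p = k / (k + mean)
    return rng.negative_binomial(k, p, size)

n = 10                                           # animals per group
control = neg_binom(mean=400.0, k=0.8, size=n)   # untreated egg counts
treated = neg_binom(mean=20.0,  k=0.8, size=n)   # ~95% efficacious product

xc, xt = control.mean(), treated.mean()
reduction = 100.0 * (1.0 - xt / xc)              # arithmetic-mean reduction

# Approximate 95% confidence limits via the variance of log(xt/xc).
var_log_ratio = treated.var(ddof=1) / (n * xt**2) + control.var(ddof=1) / (n * xc**2)
t_crit = stats.t.ppf(0.975, df=2 * (n - 1))
lower = 100.0 * (1.0 - (xt / xc) * np.exp(t_crit * np.sqrt(var_log_ratio)))
upper = 100.0 * (1.0 - (xt / xc) * np.exp(-t_crit * np.sqrt(var_log_ratio)))

print(f"reduction = {reduction:.1f}%, 95% CI = ({lower:.1f}%, {upper:.1f}%)")
print("resistance suspected" if upper < 95.0 else "no resistance declared")
```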
Abstract:
Fruit-piercing moths are significant pests of a range of fruit crops throughout much of the world's tropics and subtropics. Feeding damage by the adult moths is most widely reported in varieties of citrus. In 2003 and 2004, fruit-piercing moth activity was observed regularly at night in citrus crops in northeast Australia to determine the level of maturity (based on rind colour) and soundness of the fruit attacked. 'Navelina' and 'Washington' navel orange, grapefruit and mixed citrus crops were assessed, and fruit was rated and placed into five categories: green, colouring, ripe, overripe and damaged. There were no significant differences in the percentage of fruit attacked in each category across crops. However, within individual crops, significant proportions of green 'Navelina' fruit (58.7%) and green mixed citrus (57.1%) were attacked in 2004. Among all the crops assessed, 25.1% of moth feeding occurred on overripe or damaged fruit. Crops started to be attacked at least 8 weeks before picking, but in two crops there were large influxes of moths (reaching 27 and 35 moths/100 trees, respectively) immediately before harvest. Moth activity was most intense between late February and late March. Eudocima fullonia (Clerck) represented 79.1% of all moths recorded on fruit, with Eudocima materna (L.), Eudocima salaminia (Cramer) and Serrodes campana (Guen.) the only other species observed that were capable of inflicting primary damage. Our results suggest that growers should monitor moth activity from 8 weeks before harvest and consider remedial action if moth numbers increase substantially as the crop matures or if there is a history of moth problems. The number of fruit pickings could be increased to progressively remove ripe fruit, or early harvest of the entire crop could be contemplated where late influxes of moths are known to occur.
Abstract:
Objective: To assess the impact of feeding different amounts of sorghum ergot to sows before farrowing. Design: Fifty-one pregnant sows from a continually farrowing piggery were sequentially inducted into the experiment each week in groups of four to seven as they came within 14 days of farrowing. Diets containing sorghum ergot sclerotia within the range of 0 (control) up to 1.5% w/w (1.5% ergot provided 7 mg alkaloids/kg, including 6 mg dihydroergosine/kg) were randomly allocated and individually fed to sows. Ergot concentrations were varied with each subsequent group until an acceptable level of tolerance was achieved. Diets with ergot were replaced with control diets after farrowing. Post-farrowing milk production was assessed by direct palpation and observation of udders, and by piglet responses and growth. Blood samples were taken from sows on three days each week for prolactin estimation. Results: Three sows fed 1.5% ergot for 6 to 10 days preceding farrowing produced no milk, and 87% of their piglets died despite supplementary feeding of natural and artificial colostrum and milk replacer, and attempts to foster them onto normally lactating sows. Ergot inclusions of 0.6% to 1.2% caused lesser problems in milk release and neonatal piglet mortality. Of 23 sows fed either 0.3% or 0.6% ergot, the lactation of only two first-litter sows was affected. Ergot caused pronounced reductions in blood prolactin, and first-litter sows had lower plasma prolactin than multiparous sows, increasing their susceptibility to ergot. Conclusion: Sorghum ergot should not exceed 0.3% (1 mg alkaloid/kg) in diets fed to multiparous sows before farrowing, and should be limited to 0.1% for primiparous sows, or avoided completely.
Abstract:
Nutrient mass balances have been used to assess a variety of land resource scenarios at various scales. They are widely used as a simple basis for policy, planning, and regulatory decisions, but it is not clear how accurately they reflect reality. This study provides a critique of broad-scale nutrient mass balances, with particular application to the use of beef lot-feeding manure as fertiliser in Queensland. Mass balances completed at the district and farm scale were found to misrepresent actual manure management behaviour and, potentially, the risk of nutrient contamination of water resources. The difficulties of handling stockpiled manure and concerns about soil compaction mean that manure is spread thickly over a few paddocks at a time rather than evenly across a whole farm. Consequently, higher nutrient loads were applied to a single paddock less frequently than annually. This resulted in years with excess nitrogen, phosphorus, and potassium remaining in the soil profile, a conclusion supported by evidence of significant nutrient movement in several of the soil profiles studied. Spreading manure is profitable, but maximum returns can be associated with increased risk of nutrient leaching relative to conventional inorganic fertiliser practices. Bio-economic simulations found this increased risk where manure was applied to supply crop nitrogen requirements (the practice of the case study farms, which were 200-5000 head lot-feeders). Thus, the use of broad-scale mass balances can be misleading because paddock management is spatially heterogeneous, and this leads to increased local potential for nutrient loss. Given this spatial heterogeneity, policy makers who intend to use mass balance techniques to estimate the potential for nutrient contamination should apply them conservatively.
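To make the scale argument concrete, here is a minimal sketch with purely hypothetical numbers (not data from the study): a whole-farm nitrogen balance can look benign while the few paddocks actually receiving the manure carry a large surplus.

```python
# Illustrative sketch (hypothetical numbers): why a farm-scale nutrient mass
# balance can hide paddock-scale surpluses when feedlot manure is spread
# thickly on a few paddocks rather than evenly across the farm.
farm_area_ha = 500.0
manure_n_kg = 50_000.0            # total N in manure applied this year
crop_n_removal_kg_per_ha = 120.0  # N exported in harvested crop

# Farm-scale balance: looks like a small deficit per hectare.
farm_surplus = (manure_n_kg - crop_n_removal_kg_per_ha * farm_area_ha) / farm_area_ha
print(f"farm-scale N balance: {farm_surplus:.0f} kg N/ha")        # -20 kg N/ha

# Paddock-scale reality: the same manure goes onto 50 ha in one application.
spread_area_ha = 50.0
paddock_loading = manure_n_kg / spread_area_ha
paddock_surplus = paddock_loading - crop_n_removal_kg_per_ha
print(f"paddock-scale N surplus: {paddock_surplus:.0f} kg N/ha")  # 880 kg N/ha excess
```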
Abstract:
Milk obtained from cows on 2 subtropical dairy feeding systems was compared for its suitability for Cheddar cheese manufacture. Cheeses were made in a small-scale cheesemaking plant capable of making 2 blocks (about 2 kg each) of Cheddar cheese concurrently. Its repeatability was tested over 10 separate cheesemaking days, with no significant differences being found between the 2 vats in cheesemaking parameters or cheese characteristics. In the feeding trial, 16 pairs of Holstein-Friesian cows were used in 2 feeding systems (M1, rain-grown tropical grass pastures and oats; and M5, a feedlot based on maize/barley silage and lucerne hay) over 2 seasons (spring and autumn, corresponding to early and late lactation, respectively). Total dry matter, crude protein (kg/cow.day) and metabolisable energy (MJ/cow.day) intakes were 17, 2.7 and 187 for M1 and 24, 4 and 260 for M5, respectively. M5 cows produced higher milk yields and milk with higher protein and casein levels than the M1 cows, but total solids and fat levels were similar (P > 0.05) for both M1 and M5 cows. The yield and yield efficiency of cheese produced from the 2 feeding systems were also not significantly different. The results suggest that intensive tropical pasture systems can produce milk suitable for Cheddar cheese manufacture when cows are supplemented with a high energy concentrate. Season and stage of lactation had a much greater effect than feeding system on milk and cheesemaking characteristics, with autumn (late lactation) milk having higher protein and fat contents and producing higher cheese yields.
Abstract:
Diets containing 3% sorghum ergot (16 mg alkaloids/kg, including 14 mg dihydroergosine/kg) were fed to 12 sows from 14 days post-farrowing until weaning 14 days later, and their performance was compared with that of 10 control sows. Ergot-fed sows displayed a smaller weight loss during lactation of 24 kg/head vs. 29 kg/head in control sows (p > 0.05), despite their feed consumption being lower (61 kg/head total feed intake vs. 73 kg/head by control sows; p < 0.05). Ergot-fed sows had poorer weight gain of litters over the 14-day period (16.6 kg/litter vs. 28.3 kg/litter for controls; p < 0.05) despite an increase in consumption of creep feed by the piglets from the ergot-fed sows (1.9 kg/litter compared with 1.1 kg/litter by the controls; p > 0.05). Sow plasma prolactin was reduced by ergot feeding after 7 days to 4.8 μg/l compared with 15.1 μg/l in the control sows (p < 0.01), and at weaning was 4.9 μg/l compared with 8.0 μg/l in the control sows (p < 0.01). Two sows fed ergot ceased lactating early, and the sow feed intakes and bodyweight losses, together with the litter weight gains and creep consumption, indirectly indicate an ergot effect on milk production.
Abstract:
The objective of this study was to examine genetic changes in reproduction traits in sows (total number born (TNB), number born alive (NBA), average piglet birth weight (ABW), number of piglets weaned (NW), body weight prior to mating (MW), gestation length (GL) and daily food intake during lactation (DFI)) in lines of Large White pigs divergently selected over 4 years for high and low post-weaning growth rate on a restricted ration. Heritabilities and repeatabilities of the reproduction traits were also determined. The analyses were carried out on 913 litter records using the average information restricted maximum likelihood (AI-REML) method applied to single-trait animal models. Estimates of heritability for most traits were small, except for ABW (0·33) and MW (0·35). Estimates of repeatability were slightly higher than those of heritability for TNB, NBA and NW, but they were almost identical for ABW, MW, GL and DFI. After 4 years of selection, the high growth line sows had significantly heavier body weight prior to mating and produced significantly more piglets born alive with heavier average birth weight than the low line sows. There were, however, no significant differences between the selected lines in TNB or NW. The lower food intake of high relative to low line sows during lactation was not significant, indicating that the daily food intake differences found between grower pigs in the high and low lines (2·71 v. 2·76 kg/day, s.e.d. 0·024) on ad libitum feeding were not fully expressed in lactating sows. It is concluded that selection for growth rate on the restricted ration resulted in beneficial effects on important measures of reproductive performance of the sows.
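For readers unfamiliar with the two parameters, the standard quantitative-genetic definitions below (textbook formulas, not equations quoted from the paper) show why repeatability estimated from repeated litter records can only equal or exceed the corresponding heritability, consistent with the pattern reported above.

```latex
% Definitions for an animal model with additive genetic (a), permanent
% environmental (pe) and residual (e) variance components; standard
% quantitative-genetics formulas, not taken from the paper itself.
\[
h^2 \;=\; \frac{\sigma^2_a}{\sigma^2_a + \sigma^2_{pe} + \sigma^2_e},
\qquad
r \;=\; \frac{\sigma^2_a + \sigma^2_{pe}}{\sigma^2_a + \sigma^2_{pe} + \sigma^2_e},
\qquad\text{so}\quad r \ge h^2 .
\]
```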
Abstract:
Some whole-leaf clearing and staining techniques are described for the microscopic observation of the origin of powdery mildew conidiophores, i.e. whether they arise from external mycelium or from internal mycelium emerging through stomata. These techniques enable separation of the two genera, Oidiopsis and Streptopodium, in the Erysiphaceae.
Abstract:
In a study that included C-4 tropical grasses, C-3 temperate grasses and C-3 pasture legumes, the in vitro dry matter digestibility of extrusa, measured as in vitro dry matter loss (IVDML) during incubation, was greater than that of the forage consumed for grass extrusa but not for legume extrusa. The increase in digestibility was not caused by mastication or by the freezing of extrusa samples during storage but by the action of saliva. Comparable increases in IVDML were achieved merely by mixing bovine saliva with ground forage samples. The differences were greater than could be explained by increases due to completely digestible salivary DM. There was no significant difference between animals in the saliva effect on IVDML and, except for some minor differences, similar saliva effects on IVDML were measured using either the pepsin-cellulase or the rumen fluid-pepsin in vitro technique. For both C-4 and C-3 grasses, the magnitude of the differences was inversely related to the IVDML of the feed: there was little or no difference between extrusa and feed at high digestibilities (>70%), whereas differences of more than 10 percentage units were measured on low quality grass forages. The data did not suggest that the extrusa or saliva effect on digestibility differed between C-3 and C-4 grasses, but data on C-3 grasses were limited to a few species and to high digestibility samples. For legume forages there was no saliva effect when the pepsin-cellulase method was used, but there was a small yet significant positive effect using the rumen fluid-pepsin method. It was concluded that when samples of extrusa are analysed using in vitro techniques, the predicted in vivo digestibility of the feed consumed will often be overestimated, especially for low quality grass diets. The implications of overestimating in vivo digestibility and suggestions for overcoming such errors are discussed.
Abstract:
The response of vegetative soybean (Glycine max) to Helicoverpa armigera feeding was studied in irrigated field cages over three years in eastern Australia to determine the relationship between larval density and yield loss, and to develop economic injury levels. Rather than using artificial defoliation techniques, plants were infested with either eggs or larvae of H. armigera, and the larvae were allowed to feed until death or pupation. Larvae were counted and sized regularly, and infestation intensity was calculated in Helicoverpa injury equivalent (HIE) units, where 1 HIE was the consumption of one larva from the start of the infestation period to pupation. In the two experiments where yield loss occurred, the upper threshold for zero yield loss was 7.51 ± 0.21 HIEs and 6.43 ± 1.08 HIEs, respectively. In the third experiment, infestation intensity was lower and no loss of seed yield was detected up to 7.0 HIEs. The rate of yield loss per HIE beyond the zero yield loss threshold varied between Experiments 1 and 2 (-9.44 ± 0.80 g and -23.17 ± 3.18 g, respectively). H. armigera infestation also affected plant height and various yield components (including pod and seed numbers and seeds/pod) but did not affect seed size in any experiment. Leaf area loss on plants averaged 841 and 1025 cm2/larva in the two experiments, compared with 214 and 302 cm2/larva for cohort larvae feeding on detached leaves at the same time, making clear that artificial defoliation techniques are unsuitable for determining H. armigera economic injury levels on vegetative soybean. Analysis of canopy leaf area and pod profiles indicated that leaf and pod loss occurred from the top of the plant downwards. However, there was an increase in pod numbers closer to the ground at higher pest densities as the plant attempted to compensate for damage. Defoliation at the damage threshold was 18.6 and 28.0% in Experiments 1 and 2, indicating that yield loss from H. armigera feeding occurred at much lower levels of defoliation than previously indicated by artificial defoliation studies. Based on these results, the economic injury level for H. armigera on vegetative soybean is approximately 7.3 HIEs/row-metre in 91 cm rows or 8.0 HIEs/m2.
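As a quick, illustrative check of the unit conversion in the final sentence (assumed to follow directly from the stated row spacing), one metre of row in 91 cm rows occupies 0.91 m2 of ground, so dividing the per-row-metre threshold by the row spacing gives the per-square-metre value.

```python
# Illustrative conversion of the economic injury level from a per-row-metre
# basis to a per-square-metre basis, using the 91 cm row spacing quoted above.
row_spacing_m = 0.91
eil_per_row_metre = 7.3                          # HIEs per metre of row
eil_per_m2 = eil_per_row_metre / row_spacing_m   # one row-metre covers 0.91 m^2
print(f"{eil_per_m2:.1f} HIEs/m^2")              # prints 8.0, matching the abstract
```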
Abstract:
Navua sedge, a member of the Cyperaceae family, is an aggressive weed of pastures in Fiji, Sri Lanka, the Malay Peninsula, Vanuatu, Samoa, the Solomon Islands and Tahiti, and is now a weed of pastures and roadsides in north Queensland, Australia. Primarily restricted to areas with an annual rainfall exceeding 2500 mm, Navua sedge is capable of forming dense stands that smother many tropical pasture species. Seventeen herbicides were field tested at three sites in north Queensland, with glyphosate, halosulfuron, hexazinone, imazapic, imazapyr and MSMA the most effective for Navua sedge control. Environmental problems such as persistence in soil, lack of selectivity and off-site movement may occur when some herbicides are used at the rates predicted to give LC90-level control. A seasonality trial using halosulfuron (97.5 g ai/ha) gave better Navua sedge control when spraying from March to September (84%) than when spraying at other times (50%). In a frequency trial, sequential glyphosate applications (2,160 g ae/ha) every two months were more effective for continued Navua sedge control (67%) than a single application of glyphosate (36%), though loss of ground cover would occur. In a management trial, single applications of glyphosate (2,160 to 3,570 g ae/ha) using either a rope wick, ground foliar spraying or a rotary rope wick gave 59 to 73% control, while other treatments (rotary hoeing (3%), slashing (-13%) and crushing (-30%)) were less effective. In a second management trial, four monthly rotary wick applications were much more effective (98%) than four monthly crushing applications (42%). An effective management plan must include regular herbicide treatments to prevent Navua sedge seed being added to the soil seed bank. Treatments that result in seed burial, for example discing, are likely to prolong seed persistence and should be avoided. The sprouting activity of vegetative propagules and root fragmentation also need to be considered when selecting control options.
Abstract:
Coastal seagrass habitats in tropical and subtropical regions support aggregations of resident green turtles (Chelonia mydas) from several genetically distinct breeding populations. Migration of individuals to their respective dispersed breeding sites produces a complex pattern of migratory connectivity among the nesting and feeding habitats of this species. An understanding of this pattern is important in regions where the persistence of populations is under threat from anthropogenic impacts. The present study uses mitochondrial DNA and mixed-stock analyses to assess the connectivity between seven feeding grounds across the north Australian coast and adjacent areas and 17 genetically distinct breeding populations from the Indo-Pacific region. It was hypothesised that large and geographically proximate breeding populations would dominate at nearby feeding grounds. As expected, each sampled feeding area appears to support multiple breeding populations, with two aggregations dominated by a local breeding population. Geographic distance between breeding and feeding habitat strongly influenced whether a breeding population contributed to a feeding ground (wi = 0.654); however, neither the distance nor the size of a breeding population was a good predictor of the extent of its contribution. The differential proportional contributions suggest that the impact of anthropogenic mortality at feeding grounds should be assessed on a case-by-case basis.
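The wi value quoted above reads like an Akaike model weight for the geographic-distance model; on that assumption (an interpretation, not stated in the abstract), the weight is defined as follows.

```latex
% Akaike weight of model i among R candidate models, where Delta_i is the
% AIC difference between model i and the best-supported model. Interpreting
% w_i = 0.654 this way would mean roughly 65% of the total model weight
% falls on the geographic-distance model. This reading is an assumption.
\[
w_i \;=\; \frac{\exp\!\left(-\tfrac{1}{2}\,\Delta_i\right)}
               {\sum_{j=1}^{R} \exp\!\left(-\tfrac{1}{2}\,\Delta_j\right)}
\]
```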