28 results for NAIDOC Week
in eResearch Archive - Queensland Department of Agriculture
Abstract:
The influence of barley and oat grain supplements on hay dry matter intake (DMI), gain of carcass components and meat quality in lambs fed a low-quality basal diet was examined. Thirty-five crossbred wether lambs (9 months of age) were divided into four groups. After adaptation to a basal diet of 85% oat hay and 15% lucerne hay for one week, an initial group of 11 was slaughtered. The weights of carcass components and digesta-free empty body weight (EBW) of this group were used to estimate the weight of carcass components of the other three experimental groups at the start of the experiment. The remaining three groups were randomly assigned to pens and fed ad libitum the basal diet alone (basal), the basal diet plus 300 g/day of air-dry barley grain (barley), or the basal diet plus 300 g/day of air-dry oat grain (oat). Supplements were fed twice weekly (i.e., 900 g on Tuesday and 1200 g on Friday). After 13 weeks of feeding, animals were slaughtered and, at 24 h post-mortem, meat quality and subcutaneous fat colour were measured. Samples of longissimus muscle were collected for determination of sarcomere length and meat tenderness. Hay DMI was reduced (P<0.01) by both barley and oat supplements. Lambs fed barley or oat had a moderately higher digestibility of DM, and a higher intake of CP (P<0.05) and ME (P<0.01), than basal lambs. Final live weight of barley and oat lambs was higher (P<0.05) than basal, but this was not reflected in EBW or hot carcass weight. Lambs fed barley or oat had increases in protein (P<0.01) and water (P<0.001) in the carcass, but fat gain was not changed (P>0.05). There were no differences in eye muscle area or fat depth (total muscle and adipose tissue depth at the 12th rib, 110 mm from the midline; GR) among groups. The increased levels of protein and water components in the carcass of barley- and oat-fed lambs, associated with improved muscle production, were small and did not alter (P>0.05) any of the carcass/meat quality attributes compared to lambs fed a low-quality forage diet. Feeding barley or oat grain at 0.9–1% of live weight daily to lambs consuming poor-quality hay may not substantially improve carcass quality, but may be useful in maintaining body condition of lambs through the dry season for slaughter out of season.
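As a quick arithmetic check (illustrative only; the approximate 30-33 kg live weight below is back-calculated from the abstract's own figures and is not stated in it), the twice-weekly allocations average out to the stated daily rate:

# 900 g (Tuesday) + 1200 g (Friday) per week averages out to the stated 300 g/day.
weekly_grain_g = 900 + 1200
daily_grain_g = weekly_grain_g / 7                      # = 300 g/day
# At 0.9-1% of live weight per day, 300 g/day implies lambs of roughly 30-33 kg
# (back-calculated; live weight is not reported in the abstract).
implied_live_weight_kg = (daily_grain_g / 0.01 / 1000, daily_grain_g / 0.009 / 1000)
print(daily_grain_g, implied_live_weight_kg)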
Abstract:
From the findings of McPhee et al. (1988), there is an expectation that selection in the growing pig for bodyweight gain measured on restricted feeding will result in favourable responses in the rate and efficiency of growth of lean pork at different levels of feeding. This paper examines this expectation in two lines of Australian Large White pigs which have undergone 3 years of selection for high and for low growth rate over a 6-week test period starting at 50 kg liveweight. Over this test period, pigs of both lines are all fed the same total amount of grower food, restricted to an estimated 80% of average ad libitum intake. In: 'Animal production for a consuming world': proceedings of the 9th Congress of the AAAP Societies, the 23rd Biennial Conference of the ASAP and the 17th Annual Symposium of the University of Sydney Dairy Research Foundation (DRF), Sydney, Australia.
Abstract:
Traps baited with synthetic aggregation pheromones of Carpophilus hemipterus (L.), Carpophilus mutilatus Erichson and Carpophilus davidsoni Dobson and fermenting bread dough were used to identify the fauna and monitor the seasonal abundance of Carpophilus spp. in insecticide-treated peach and nectarine orchards in the Gosford area of coastal New South Wales. In four orchards, 67 178 beetles were trapped during 1994–1995, with C. davidsoni (82%) and Carpophilus gaveni (Dobson) (12.2%) dominating catches. Five species (C. hemipterus, C. mutilatus, Carpophilus marginellus Motschulsky, Carpophilus humeralis (F.) and an unidentified species) each accounted for 0.2–3.2% of trapped beetles. Carpophilus davidsoni was most abundant during late September–early October but numbers declined rapidly during October, usually before insecticides were applied. Spring populations of Carpophilus spp. were very large in 1994–1995 (1843–2588 per trap per week). However, despite a preharvest population decline of approximately 95% and 2–11 applications of insecticide, 14–545 beetles per trap per week (above the arbitrary fruit damage threshold of 10 beetles per trap per week) were recorded during the harvest period and fruit damage occurred at three of the four orchards. Lower preharvest populations in 1995–1996 (< 600 per trap per week) and up to six applications of insecticide resulted in < 10 beetles per trap per week during most of the harvest period and minimal or no fruit damage. The implications of these results for the integrated management of Carpophilus spp. in coastal and inland areas of southeastern Australia are discussed.
Abstract:
Fermenting apple juice (FAJ) contained within polyacrylamide granules was an effective pheromone coattractant for Carpophilus davidsoni in trapping experiments conducted in stone fruit orchards in southern New South Wales. Fermenting apple juice-baited traps captured as many beetles as traps baited with the 'standard' coattractant fermenting bread dough (FBD), either alone or in combination with aggregation pheromone. Increasing the interval of FAJ replacement to 2 weeks instead of 1 week, as is necessary for FBD, did not reduce trapping efficiency. Replacement of FAJ every three weeks did not affect captures of C. davidsoni in one experiment but did reduce captures of Carpophilus mutilatus. In a second experiment, captures of C. davidsoni were also reduced. Fermenting apple juice contained within polyacrylamide granules replaced at fortnightly intervals is an effective, convenient and practical pheromone coattractant for Carpophilus spp.
Abstract:
Traps baited with synthetic aggregation pheromone and fermenting bread dough were used to monitor seasonal incidence and abundance of the ripening fruit pests, Carpophilus hemipterus (L.), C. mutilatus Erichson and C. davidsoni Dobson, in stone fruit orchards in the Leeton district of southern New South Wales during five seasons (1991-96). Adult beetles were trapped from September-May, but abundance varied considerably between years, with the amount of rainfall in December-January having a major influence on population size and damage potential during the canning peach harvest (late February-March). Below-average rainfall in December-January was associated with mean trap catches of < 10 beetles/trap/week in low-dose pheromone traps during the harvest period in 1991/92 and 1993/94 and no reported damage to ripening fruit. Rainfall in December-January 1992/93 was more than double the average and mean trap catches ranged from 8-27 beetles/trap/week during the harvest period, with substantial damage to the peach crop. December-January rainfall was also above average in 1994/95 and 1995/96 and means of 50-300 beetles/trap/week were recorded in high-dose pheromone traps during harvest periods. Carpophilus spp. caused economic damage to peach crops in both seasons. These data indicate that it may be possible to predict the likelihood of Carpophilus beetle damage to ripening stone fruit in inland areas of southern Australia by routine pheromone-based monitoring of beetle populations combined with summer temperature and rainfall records.
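The reported association between December-January rainfall, harvest-period trap catches and fruit damage can be sketched as a simple decision aid; the 10 beetles/trap/week figure is the arbitrary damage threshold cited in these abstracts, while the rule and the example rainfall values are purely illustrative, not the authors' published model:

def damage_risk(dec_jan_rainfall_mm, long_term_average_mm, beetles_per_trap_per_week):
    """Illustrative decision aid only, not the authors' model.
    Below-average December-January rainfall with harvest-period catches below
    ~10 beetles/trap/week was associated with no reported fruit damage, while
    above-average rainfall and higher catches coincided with economic damage."""
    if dec_jan_rainfall_mm < long_term_average_mm and beetles_per_trap_per_week < 10:
        return "low"
    if dec_jan_rainfall_mm > long_term_average_mm and beetles_per_trap_per_week >= 10:
        return "high"
    return "uncertain"

# Example resembling the 1992/93 pattern (rainfall values hypothetical).
print(damage_risk(dec_jan_rainfall_mm=160, long_term_average_mm=70, beetles_per_trap_per_week=20))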
Abstract:
Piggery pond sludge (PPS) was applied, as-collected (Wet PPS) and following stockpiling for 12 months (Stockpiled PPS), to a sandy Sodosol and clay Vertosol at sites on the Darling Downs of Queensland. Laboratory measures of N availability were carried out on unamended and PPS-amended soils to investigate their value in estimating supplementary N needs of crops in Australia's northern grains region. Cumulative net N mineralised from the long-term (30 weeks) leached aerobic incubation was described by a first-order single exponential model. The mineralisation rate constant (0.057/week) was not significantly different between Control and PPS treatments or across soil types, when the amounts of initial mineral N applied in PPS treatments were excluded. Potentially mineralisable N (No) was significantly increased by the application of Wet PPS, and increased with increasing rate of application. Application of Wet PPS significantly increased the total amount of inorganic N leached compared with the Control treatments. Mineral N applied in Wet PPS contributed as much to the total mineral N status of the soil as did that which mineralised over time from organic N. Rates of CO2 evolution during 30 weeks of aerobic leached incubation indicated that the Stockpiled PPS was more stabilised (19-28% of applied organic C mineralised) than the Wet PPS (35-58% of applied organic C mineralised), due to the higher lignin content of the former. Net nitrate-N produced following 12 weeks of aerobic non-leached incubation was highly correlated with net nitrate-N leached during 12 weeks of aerobic incubation (R^2 = 0.96), although it was <60% of the latter in both sandy and clayey soils. Anaerobically mineralisable N determined by waterlogged incubation of laboratory PPS-amended soil samples increased with increasing application rate of Wet PPS. Anaerobically mineralisable N from field-moist soil was well correlated with net N mineralised during 30 weeks of aerobic leached incubation (R^2 = 0.90, sandy soil; R^2 = 0.93, clay soil). In the clay soil, the amount of mineral N produced from all the laboratory incubations was significantly correlated with field-measured nitrate-N in the soil profile (0-1.5 m depth) after 9 months of weed-free fallow following PPS application. In contrast, only anaerobically mineralisable N was significantly correlated with field nitrate-N in the sandy soil. Anaerobic incubation would, therefore, be suitable as a rapid practical test to estimate potentially mineralisable N following applications of different PPS materials in the field.
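A minimal sketch of the first-order single exponential model used to describe cumulative net N mineralised, taking the reported rate constant of 0.057/week; the potentially mineralisable N (No) value used in the example is a hypothetical placeholder, since fitted No values are not given in the abstract:

import numpy as np

def cumulative_n_mineralised(t_weeks, n0_mg_kg, k_per_week=0.057):
    """First-order single exponential model: Nmin(t) = No * (1 - exp(-k * t)).

    k = 0.057/week is the rate constant reported in the abstract; the
    potentially mineralisable N (No) value passed in below is hypothetical.
    """
    return n0_mg_kg * (1.0 - np.exp(-k_per_week * np.asarray(t_weeks)))

# Cumulative net N mineralised over the 30-week incubation,
# assuming a hypothetical No of 100 mg N/kg soil.
weeks = np.arange(0, 31)
print(cumulative_n_mineralised(weeks, n0_mg_kg=100.0))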
Abstract:
Instances of morbidity amongst rock lobsters (Panulirus cygnus) arriving at factories in Western Australia (WA) have been attributed to stress during post-harvest handling. This study used discriminant analysis to determine whether physiological correlates of stress following a period of simulated post-harvest handling had any validity as predictors of future rejection or morbidity of western rock lobsters. Groups of 230 western rock lobsters were stored for 6 h in five environments (submerged/flowing sea water, submerged/re-circulating sea water, humid air, flowing sea water spray, and re-circulated sea water spray). The experiment was conducted in late spring (ambient sea water 22°C), and repeated in early autumn (ambient sea water 26°C). After the 6 h treatment, each lobster was graded for acceptability for live export, numbered, and its hemolymph was sampled. The samples were analysed for a number of physiological and health-status parameters. The lobsters were then stored for a week in tanks in the live lobster factory to record mortality. The mortality of lobsters in the factory was associated with earlier deviations in hemolymph parameters as they emerged from the storage treatments. Discriminant analysis (DA) of the hemolymph assays enabled the fate of 80-90% of the lobsters to be correctly categorised within each experiment. However, functions derived from one experiment were less accurate at predicting mortality when applied to the other experiments. One of the reasons for this was the higher mortality and the more severe patho-physiological changes observed in lobsters stored in humid air or sprays at the higher temperature. The analysis identified lactate accumulation during emersion, and associated physiological and hemocyte-related effects, as a major correlate of mortality. Reducing these deviations, for example by submerged transport, is expected to ensure high levels of survival. None of the indicators tested predicted mortality with total accuracy. The simplest and most accurate means of comparing emersed treatments was to count the mortality afterwards.
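The classification step can be illustrated with a linear discriminant analysis on hemolymph variables; the feature names and data below are hypothetical placeholders, and the published analysis may have used a different DA variant or variable set:

import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

# Hypothetical hemolymph measurements (rows = lobsters); the real study assayed
# a number of physiological and health-status parameters, with lactate
# accumulation identified as a major correlate of mortality.
rng = np.random.default_rng(0)
X = rng.normal(size=(230, 4))            # e.g. lactate, glucose, pH, haemocyte count (assumed)
died = (X[:, 0] + rng.normal(scale=0.5, size=230)) > 1.0   # 1 = died within a week (synthetic)

lda = LinearDiscriminantAnalysis()
# Cross-validated accuracy within one "experiment"; the abstract reports
# 80-90% of lobsters correctly categorised within each experiment.
print(cross_val_score(lda, X, died, cv=5).mean())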
Abstract:
In order to develop an efficient and reliable biolistics transformation system for pineapple, parameters need to be optimised for growth, survival and development of explants pre- and post-transformation. We have optimised in vitro conditions for culture media for the various stages of plant and callus initiation and development, and for effective selection of putative transgenic material. Shoot multiplication and proliferation are best on medium containing MS basic nutrients and vitamins with the addition of 0.1 mg/L myo-inositol, 20 g/L sucrose, 2.5 mg/L BAP and 3 g/L Phytagel, followed by transfer to basic MS medium for further development. Callus production on leaf base explants is best on MS nutrients and vitamins, to which 10 mg/L each of BAP and NAA were added. Optimum explant age for bombardment is 17-35-week-old callus, while a pre-bombardment osmoticum treatment in the medium is not required. By comparing several antibiotics as selective agents, it has been established that a two-step selection of 2 fortnightly sub-cultures on 50 μg/mL of geneticin in the culture medium, followed by monthly sub-cultures on 100 μg/mL geneticin, is optimal for survival of transgenic callus. Shoot regeneration from callus cultures is optimal on medium containing MS nutrients and vitamins, 5% coconut water and 400 mg/L casein hydrolysate. Plants can be readily regenerated and multiplied from transgenic callus through organogenesis. Rooting of shoots does not require the addition of any plant hormones to the medium. A transformation efficiency of 1-3.5% can be achieved, depending on the gene construct applied.
Abstract:
Creontiades spp. (Hemiptera: Miridae) are sucking pests that attack buds, flowers and young pods in mungbeans, Vigna radiata (L.), causing these structures to subsequently abort. If left uncontrolled, mirids can cause 25-50% yield loss. Traditional industry practice has involved prophylactic applications of dimethoate to control mirids at budding and again a week later. The present trial was initiated to highlight the dangers of such a practice, in particular the risk of a subsequent outbreak of the lepidopteran pest Helicoverpa spp. A single application of dimethoate halved the population of important natural enemies of Helicoverpa spp., and caused an above-threshold outbreak of Helicoverpa spp. within 11 days. This shows that even a moderate (e.g. 50%) reduction in natural enemies may be sufficient to increase Helicoverpa spp. populations in mungbeans. As a result, prophylactic sprays should not be used for the control of mirids in mungbeans, and dimethoate should be applied only when mirids are above the economic threshold. Indoxacarb was also tested to establish its effect on Helicoverpa spp., mirids and natural enemies. Indoxacarb showed potential for Helicoverpa spp. control and suppression of mirids and had little impact on natural enemies.
Abstract:
Alternaria leaf blight is the most prevalent disease of cotton in northern Australia. A trial was conducted at Katherine Research Station, Northern Territory, Australia, to determine the effects of foliar application of potassium nitrate (KNO3) on the suppression of Alternaria leaf blight of cotton. Disease incidence, severity and leaf shedding were assessed at the bottom (1-7 nodes), middle (8-14 nodes) and the top (15+ nodes) of plants at weekly intervals from 7 July to 22 September 2004. Disease incidence, severity and shedding at the middle canopy level were significantly higher for all treatments than those from bottom and top canopies. Foliar KNO3, applied at 13 kg/ha, significantly (P < 0.05) reduced the mean disease incidence, severity and leaf shedding assessed during the trial period. KNO3 significantly (P < 0.001) reduced the disease severity and leaf shedding at the middle canopy level. Almost all leaves in the middle canopy became infected in the first week of July in contrast to infection levels of 50-65% at the bottom and top of the canopy. Disease severity and leaf shedding in the middle canopy were significantly (P < 0.05) lower in KNO3-treated plots than the control plots from the second and third weeks of July to the second and third weeks of August. This study demonstrates that foliar application of KNO3 may be effective in reducing the effect of Alternaria leaf blight of cotton in northern Australia.
Abstract:
The problem of cannibalism in communally reared crabs can be eliminated by separating the growing crabs into holding compartments. There is currently no information on optimal compartment size for growing crabs individually. A total of 136 second-instar crablets (Portunus sanguinolentus) (C2 ca. 7-10 mm carapace width (CW)) were grown for 90 days in 10 different-sized opaque- and transparent-walled acrylic compartments. The base area for each compartment ranged from small (32 mm × 32 mm) to large (176 mm × 176 mm). Effects of holding space and wall transparency on survival, CW, moult increment, intermoult period and average weekly gain (AWG) were examined. Most crabs reached instars C9-C10 (50-70 mm CW) by the end of the experiment. The final survival rate in the smallest compartment was 25%, mainly due to moult-related mortality predominantly occurring at the C9 instar. However, crabs in these smaller compartments had earlier produced significantly larger moult increments from instar to instar than those in the larger compartments (P < 0.05). Crabs in the smaller compartments (<65 mm × 65 mm) also showed significantly longer intermoult periods (P < 0.05). The net result was that AWG in CW was 5.22 mm week^-1 for the largest compartment and 5.15 mm week^-1 in the smallest, and did not differ significantly between compartment-size groups (P = 0.916). Wall transparency had no impact on survival (P = 0.530) but a slight impact on AWG (P = 0.014). Survival rate was the best indicator of minimum acceptable compartment size (≥43 mm × 43 mm) for C10 crablets because below this size death occurred before growth rate was significantly affected. For further growth, it would be necessary to transfer the crablets to larger compartments.
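Average weekly gain in carapace width can be computed in a straightforward way; the sketch below assumes a simple (final - initial)/weeks definition with hypothetical measurements, and the study's exact calculation may differ:

def average_weekly_gain(initial_cw_mm, final_cw_mm, days):
    """Average weekly gain (AWG) in carapace width, assuming a simple
    (final - initial) / elapsed-weeks definition; the study's exact
    formula may differ (e.g. it may be based on per-moult increments)."""
    weeks = days / 7.0
    return (final_cw_mm - initial_cw_mm) / weeks

# Hypothetical numbers only: a crablet growing from 8 mm to 60 mm CW in 90 days.
print(round(average_weekly_gain(8.0, 60.0, 90), 2))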
Abstract:
Objective: To assess the impact of feeding different amounts of sorghum ergot to sows before farrowing. Design: Fifty-one pregnant sows from a continuously farrowing piggery were sequentially inducted into the experiment each week in groups of four to seven, as they approached within 14 days of farrowing. Diets containing sorghum ergot sclerotia within the range of 0 (control) up to 1.5% w/w (1.5% ergot provided 7 mg alkaloids/kg, including 6 mg dihydroergosine/kg) were randomly allocated and individually fed to sows. Ergot concentrations were varied with each subsequent group until an acceptable level of tolerance was achieved. Diets with ergot were replaced with control diets after farrowing. Post-farrowing milk production was assessed by direct palpation and observation of udders, and by piglet responses and growth. Blood samples were taken from sows on three days each week for prolactin estimation. Results: Three sows fed 1.5% ergot for 6 to 10 days preceding farrowing produced no milk, and 87% of their piglets died despite supplementary feeding of natural and artificial colostrum, milk replacer, and attempts to foster them onto normally lactating sows. Ergot inclusions of 0.6% to 1.2% caused less severe problems with milk release and neonatal piglet mortality. Of 23 sows fed either 0.3% or 0.6% ergot, the lactation of only two first-litter sows was affected. Ergot caused pronounced reductions in blood prolactin, and first-litter sows had lower plasma prolactin than multiparous sows, increasing their susceptibility to ergot. Conclusion: Sorghum ergot should not exceed 0.3% (1 mg alkaloid/kg) in diets of multiparous sows fed before farrowing, and should be limited to 0.1% for primiparous sows, or avoided completely.
Abstract:
BACKGROUND: Field studies of diuron and its metabolites 3-(3,4-dichlorophenyl)-1-methylurea (DCPMU), 3,4-dichlorophenylurea (DCPU) and 3,4-dichloroaniline (DCA) were conducted in a farm soil and in stream sediments in coastal Queensland, Australia. RESULTS: During a 38-week period after a 1.6 kg ha^-1 diuron application, 70-100% of detected compounds were within the top 0-15 cm of the farm soil, and 3-10% reached the 30-45 cm depth. First-order degradation half-lives (t1/2) averaged 49 ± 0.9 days for the 0-15, 0-30 and 0-45 cm soil depths. Farm runoff was collected in the first 13-50 min of episodes lasting 55-90 min. Average concentrations of diuron, DCPU and DCPMU in runoff were 93, 30 and 83-825 µg L^-1, respectively. Their total loading in all runoff was >0.6% of applied diuron. Diuron and DCPMU concentrations in stream sediments were between 3-22 and 4-31 µg kg^-1 soil, respectively. The DCPMU/diuron sediment ratio was >1. CONCLUSION: Retention of diuron and its metabolites in farm topsoil indicated their negligible potential for groundwater contamination. Minimal amounts of diuron and DCPMU escaped in farm runoff. This may nevertheless entail a significant loading into the wider environment at annual rates of application. The concentrations and ratio of diuron and DCPMU in stream sediments indicated that they had prolonged residence times and potential for accumulation in sediments. The higher ecotoxicity of DCPMU compared with diuron and the combined presence of both compounds in stream sediments suggest that together they would have a greater impact on sensitive aquatic species than is currently apportioned by assessments based upon diuron alone.
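The first-order degradation reported here corresponds to C(t) = C0 exp(-kt) with k = ln 2 / t1/2; a minimal sketch using the reported average half-life of 49 days (the initial concentration is a hypothetical placeholder):

import numpy as np

def diuron_remaining(t_days, c0, half_life_days=49.0):
    """First-order decay: C(t) = C0 * exp(-k * t), with k = ln(2) / t1/2.

    The 49-day half-life is the average reported for the 0-15, 0-30 and
    0-45 cm soil depths; the initial concentration c0 is a placeholder.
    """
    k = np.log(2.0) / half_life_days
    return c0 * np.exp(-k * np.asarray(t_days))

# Fraction of the applied amount remaining after the 38-week (266-day) monitoring period.
print(diuron_remaining(266, c0=1.0))  # about 0.02, i.e. roughly 2% remaining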
Abstract:
The ability of blocking ELISAs and haemagglutination-inhibition (HI) tests to detect antibodies in sera from chickens challenged with either Avibacterium (Haemophilus) paragallinarum isolate Hp8 (serovar A) or H668 (serovar C) was compared. Serum samples were examined weekly over the 9 weeks following infection. The results showed that the positive rate of serovar A specific antibody in the B-ELISA remained at 100% from the second week to the ninth week. In chickens given the serovar C challenge, the highest positive rate of serovar C specific antibody in the B-ELISA appeared at the seventh week (60% positive) and was then followed by a rapid decrease. The B-ELISA gave significantly more positives at weeks 2, 3, 7, 8 and 9 post-infection for serovar A and at week 7 post-infection for serovar C. In qualitative terms, for both serovar A and serovar C infections, the HI tests gave a lower percentage of positive sera at all time points except at 9 weeks post-infection with serovar C. The highest positive rate for serovar A HI antibodies was 70% of sera at the fourth and fifth weeks post-infection. The highest rate of serovar C HI antibodies was 20% at the fifth and sixth weeks post-infection. The results have provided further evidence of the suitability of the serovar A and C B-ELISAs for the diagnosis of infectious coryza.
Abstract:
Batches of glasshouse-grown flowering sorghum plants were placed in circular plots for 24 h at two field sites in southeast Queensland, Australia, on 38 occasions in 2003 and 2004, to trap aerial inoculum of Claviceps africana. Plants were located 20-200 m from the centre of the plots. Batches of sorghum plants with secondary conidia of C. africana on inoculated spikelets were placed at the centre of each plot on some dates as a local point source of inoculum. Plants exposed to field inoculum were returned to a glasshouse, incubated at near-100% relative humidity for 48 h and then at ambient relative humidity for another week before counting infected spikelets to estimate pathogen dispersal. Three times as many spikelets became infected when inoculum was present within 200 m of trap plants, but the number of infected spikelets did not decline with increasing distance from the local source within the 200 m. Spikelets also became infected on all 10 dates when plants were exposed without a local source of infected plants, indicating that infection can occur from conidia surviving in the atmosphere. In 2005, when trap plants were placed at 14 locations along a 280 km route, the number of infected spikelets diminished with increasing distance from sorghum paddocks and infection was sporadic for distances over 1 km. Multiple regression analysis showed a significant influence of moisture-related weather variables on inoculum dispersal. Results suggest that sanitation measures can help reduce ergot severity at the local level, but sustainable management will require better understanding of long-distance dispersal of C. africana inoculum.
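The multiple regression step can be sketched as an ordinary least-squares fit of infected-spikelet counts against distance and moisture-related weather variables; the predictor names and data below are hypothetical placeholders, not the variables actually fitted in the study:

import numpy as np
import statsmodels.api as sm

# Hypothetical exposure records: distance from the nearest sorghum paddock (km)
# plus moisture-related weather variables (relative humidity, rainfall) on the
# exposure date. The abstract reports a significant influence of moisture-related
# variables on inoculum dispersal; the values here are synthetic placeholders.
rng = np.random.default_rng(1)
n = 38
distance_km = rng.uniform(0, 280, n)
humidity_pct = rng.uniform(40, 100, n)
rain_mm = rng.exponential(5, n)
infected_spikelets = 50 - 0.1 * distance_km + 0.3 * humidity_pct + rng.normal(0, 5, n)

X = sm.add_constant(np.column_stack([distance_km, humidity_pct, rain_mm]))
model = sm.OLS(infected_spikelets, X).fit()
print(model.summary())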