15 results for 90-21-GC1

in eResearch Archive - Queensland Department of Agriculture


Relevance:

20.00%

Publisher:

Abstract:

The potential of near infra-red (NIR) spectroscopy for non-invasive measurement of fruit quality of pineapple (Ananas comosus var. Smooth Cayenne) and mango (Mangifera indica var. Kensington) fruit was assessed. A remote reflectance fibre-optic probe, placed in contact with the fruit skin surface in a light-proof box, was used to deliver monochromatic light to the fruit and to collect NIR reflectance spectra (760–2500 nm). The probe illuminated and collected reflected radiation from an area of about 16 cm². The NIR spectral attributes were correlated with pineapple juice Brix and with mango flesh dry matter (DM) measured from fruit flesh directly underlying the scanned area. The highest correlations for both fruit were found using the second derivative of the spectra (d² log 1/R) and an additive calibration equation. Multiple linear regression (MLR) on pineapple fruit spectra (n = 85) gave a calibration equation using d² log 1/R at wavelengths of 866, 760, 1232 and 832 nm, with a multiple coefficient of determination (R²) of 0.75 and a standard error of calibration (SEC) of 1.21 °Brix. Modified partial least squares (MPLS) regression analysis yielded a calibration equation with R² = 0.91, SEC = 0.69 and a standard error of cross-validation (SECV) of 1.09 °Brix. For mango, MLR gave a calibration equation using d² log 1/R at 904, 872, 1660 and 1516 nm with R² = 0.90, SEC = 0.85% DM and a bias of 0.39. Using MPLS analysis, a calibration equation with R² = 0.98, SEC = 0.54 and SECV = 1.19 was obtained. We conclude that NIR technology offers the potential to assess fruit sweetness in intact whole pineapple to within 1 °Brix and flesh DM in intact mango to within 1% DM, and could be used for the grading of fruit in fruit packing sheds.
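For readers wanting to see the chemometric step in code, the sketch below illustrates the kind of MLR calibration this abstract describes: regressing juice Brix on the second derivative of log(1/R) at the four reported wavelengths. It is a minimal Python illustration; the spectra, smoothing parameters and variable names are placeholders, with only the wavelengths taken from the abstract.

```python
# Minimal sketch of the MLR calibration step described above: regress
# pineapple juice Brix on the second derivative of log(1/R) at the four
# reported wavelengths. The spectra and smoothing parameters below are
# placeholders; only the wavelengths (866, 760, 1232, 832 nm) come from
# the abstract.
import numpy as np
from scipy.signal import savgol_filter
from sklearn.linear_model import LinearRegression
from sklearn.metrics import r2_score

wavelengths = np.arange(760, 2501, 2)             # assumed instrument grid (nm)
log_inv_R = np.random.rand(85, wavelengths.size)  # placeholder spectra, n = 85
brix = np.random.uniform(10, 20, 85)              # placeholder reference Brix

# Second derivative of log(1/R); window/polyorder choices are assumptions
d2 = savgol_filter(log_inv_R, window_length=11, polyorder=2, deriv=2, axis=1)

# Columns nearest the wavelengths reported for the pineapple MLR equation
cols = [int(np.argmin(np.abs(wavelengths - nm))) for nm in (866, 760, 1232, 832)]

mlr = LinearRegression().fit(d2[:, cols], brix)
pred = mlr.predict(d2[:, cols])
# The abstract reports R^2 = 0.75 and SEC = 1.21 °Brix on real spectra;
# random placeholder data will not reproduce that.
print(f"R^2 = {r2_score(brix, pred):.2f}")
```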

Relevance:

20.00%

Publisher:

Abstract:

Techniques for introducing transgenes to control blackheart, by particle bombardment and by Agrobacterium co-transformation, have been developed for pineapple cv. Smooth Cayenne. Polyphenol oxidase (PPO) is the enzyme responsible for blackheart development in pineapple fruit following chilling injury. Sense, antisense and hairpin constructs were used to suppress PPO expression in plants. Average transformation efficiency was approximately 1% for biolistics and approximately 1.5% for Agrobacterium. These results were considered acceptable given the high regeneration potential of 80–90% from callus cultures. Southern blot analysis revealed stable integration of the transgenes, with lower copy numbers in plants transformed with Agrobacterium than in those transformed by biolistics. Over 5000 plants from 55 transgenic lines are now undergoing field evaluation in Australia.

Relevance:

20.00%

Publisher:

Abstract:

A competitive enzyme-linked immunosorbent assay (cELISA) based on a broadly conserved, species-specific, B-cell epitope within the C terminus of Babesia bigemina rhoptry-associated protein 1a was validated for international use. Receiver operating characteristic analysis revealed 16% inhibition as the threshold for a negative result, with an associated specificity of 98.3% and sensitivity of 94.7%. Increasing the threshold to 21% increased the specificity to 100% but modestly decreased the sensitivity to 87.2%. Using 21% inhibition, the positive predictive values ranged from 90.7% (10% prevalence) to 100% (95% prevalence), and the negative predictive values ranged from 97.0% (10% prevalence) to 48.2% (95% prevalence). The assay was able to detect serum antibody as early as 7 days after intravenous inoculation. The cELISA was distributed to five different laboratories along with a reference set of 100 defined bovine serum samples, including known positives, known negatives, and field samples. The pairwise concordance among the five laboratories ranged from 97% to 100%, and all kappa values were above 0.8, indicating a high degree of reliability. Overall, the cELISA appears to have the attributes necessary for international application.
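The prevalence dependence of the predictive values follows from Bayes' theorem. The sketch below shows the standard calculation; it is a generic illustration using the 16%-threshold sensitivity and specificity as example inputs, not a reproduction of the paper's exact figures, which derive from the validated sample set.

```python
# Standard Bayes relation between a test's sensitivity (Se), specificity (Sp)
# and its predictive values at a given true prevalence. This is a generic
# illustration of why PPV rises and NPV falls with prevalence, as seen in
# the abstract; it is not a reproduction of the paper's exact figures.
def predictive_values(se: float, sp: float, prevalence: float):
    tp = se * prevalence                   # true positive fraction
    fp = (1 - sp) * (1 - prevalence)       # false positive fraction
    fn = (1 - se) * prevalence             # false negative fraction
    tn = sp * (1 - prevalence)             # true negative fraction
    return tp / (tp + fp), tn / (tn + fn)  # (PPV, NPV)

# Example inputs: the 16%-inhibition threshold values from the abstract
for prev in (0.10, 0.95):
    ppv, npv = predictive_values(0.947, 0.983, prev)
    print(f"prevalence {prev:.0%}: PPV {ppv:.1%}, NPV {npv:.1%}")
```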

Relevance:

20.00%

Publisher:

Abstract:

Environmental heat can reduce conception rates (the proportion of services that result in pregnancy) in lactating dairy cows. The study objectives were to identify the periods of exposure relative to the service date in which environmental heat is most closely associated with conception rates, and to assess whether the total time cows are exposed to high environmental heat within each 24-h period is more closely associated with conception rates than is the maximum environmental heat for each 24-h period. A retrospective observational study was conducted in 25 predominantly Holstein-Friesian commercial dairy herds located in Australia. Associations between weather and conception rates were assessed using 16,878 services performed over a 21-mo period. Services were classified as successful based on rectal palpation. Two measures of heat load were defined for each 24-h period: the maximum temperature-humidity index (THI) for the period, and the number of hours in the 24-h period when the THI was >72. Conception rates were reduced when cows were exposed to a high heat load from the day of service to 6 d after service, and in the week before service (wk −1). Heat loads in wk −3 to −5 were also associated with reduced conception rates. Thus, management interventions to ameliorate the effects of heat load on conception rates should be implemented at least 5 wk before the anticipated service and should continue until at least 1 wk after service. High autocorrelations existed between successive daily values of both measures, and associations between the day of heat load relative to the service day and conception rates differed substantially when ridge regression was used to account for this autocorrelation. This indicates that when assessing the effects of heat load on conception rates, the autocorrelation in heat load between days should be accounted for in the analyses. The results suggest that either weekly averages or totals summarizing the daily heat load are adequate to describe heat load when assessing effects on conception rates in lactating dairy cows.
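The two daily heat-load measures defined in the study are simple to compute from hourly weather records, as in the minimal sketch below. The abstract does not state which THI formula was used, so the NRC-style dairy THI in the code, and the synthetic example day, are assumptions.

```python
# The two daily heat-load measures defined in the abstract: the maximum
# THI in each 24-h period, and the number of hours with THI > 72.
# The abstract does not give the THI formula; the one below is a commonly
# used dairy THI (NRC-style) and is an assumption here.
from typing import List, Tuple

def thi(temp_c: float, rh_pct: float) -> float:
    return 0.8 * temp_c + (rh_pct / 100.0) * (temp_c - 14.4) + 46.4

def daily_heat_load(hourly: List[Tuple[float, float]]) -> Tuple[float, int]:
    """hourly: 24 (temperature degC, relative humidity %) pairs for one day."""
    values = [thi(t, rh) for t, rh in hourly]
    return max(values), sum(1 for v in values if v > 72)

# Synthetic example: a hot, humid afternoon pushes several hours above THI 72
day = [(22 + 8 * (1 - abs(h - 14) / 14), 70) for h in range(24)]
max_thi, hours_over = daily_heat_load(day)
print(f"max THI = {max_thi:.1f}, hours with THI > 72 = {hours_over}")
```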

Relevance:

20.00%

Publisher:

Abstract:

Dairy farms in subtropical Australia use irrigated, annually sown short-term ryegrass (Lolium multiflorum), or mixtures of short-term ryegrass and white (Trifolium repens) and Persian (shaftal) (T. resupinatum) clover, during the winter-spring period in year-round milk production systems. A series of small-plot cutting experiments was conducted in 3 dairying regions (tropical upland, north Queensland, and subtropical south-east Queensland and northern New South Wales) to determine the most effective rate and frequency of application of nitrogen (N) fertiliser. The experiments were not grazed, nor was harvested material returned to the plots after sampling. Rates up to 100 kg N/ha.month (as urea or calcium ammonium nitrate) and up to 200 kg N/ha every 2 months (as urea) were applied to pure stands of ryegrass in 1991. In 1993 and 1994, urea, at rates up to 150 kg N/ha.month and up to 200 kg N/ha every 2 months, was applied to pure stands of ryegrass; urea, at rates up to 50 kg N/ha.month, was also applied to ryegrass-clover mixtures. The results indicate that applications of 50-85 kg N/ha.month can be recommended for short-term ryegrass pastures throughout the subtropics and tropical uplands of eastern Australia, irrespective of soil type. At this rate, dry matter yields will reach about 90% of their potential, forage nitrogen concentration will be increased, the risk to stock from nitrate poisoning will be minimal, and there will be no substantial increase in soil N. The recommended rate of N for ryegrass-clover pastures is slightly higher than for pure ryegrass but, at these rates, the clover component will be suppressed. However, increased ryegrass yields and higher forage nitrogen concentrations will compensate for the reduced clover component. At application rates up to 100 kg N/ha.month, the build-up of NO3⁻-N and NH4⁺-N in soil was generally restricted to the surface layers (0-20 cm), but there was a substantial increase throughout the soil profile at 150 kg N/ha.month. The build-up of NO3⁻-N and NH4⁺-N was greater, and occurred at lower application rates, on the lighter soil compared with the heavy clays. Generally, most of the soil N was in the NO3⁻-N form and most was in the top 20 cm.

Relevance:

20.00%

Publisher:

Abstract:

Sporobolus pyramidalis, S. africanus, S. natalensis, S. fertilis and S. jacquemontii, known collectively as the weedy Sporobolus grasses, are exotic weeds causing serious economic losses in grazing areas along Australia's entire eastern coast. In one of the first attempts to provide biological control for a grass, the potential of a smut, Ustilago sporoboli-indici, as a biological control agent for all five weedy Sporobolus spp. found in Australia was evaluated in glasshouse studies. Application of basidiospores to 21-day-old Sporobolus seedlings and subsequent incubation in a moist chamber (26 °C, 90% RH, 48 h) resulted in infection of S. pyramidalis, S. africanus, S. natalensis and S. fertilis, but not of S. jacquemontii. Host-range trials with 13 native Australian Sporobolus spp. resulted in infection of four native species. Evaluation of the damage caused by the smut on two Australian native and two weedy Sporobolus spp. showed that the total numbers of infected flowers for the four grasses ranked S. creber > S. fertilis > S. elongatus > S. natalensis, with percentage flower infections of 21%, 14%, 12% and 3%, respectively. Significant differences (P = 0.001) were found when the numbers of infected flowers were compared among treatments. The infection of the four native Sporobolus spp. indicated that the smut was not sufficiently host specific for release in Australia, and the organism was rejected as a potential biological control agent. The implications of these results are discussed.

Relevance:

20.00%

Publisher:

Abstract:

Grazing is a major land use in Australia's rangelands. The 'safe' livestock carrying capacity (LCC) required to maintain resource condition is strongly dependent on climate. We reviewed: the approaches for quantifying LCC; current trends in climate and their effect on components of the grazing system; implications of the 'best estimates' of climate change projections for LCC; the agreement and disagreement between the current trends and projections; and the adequacy of current models of forage production in simulating the impact of climate change. We report the results of a sensitivity study of climate change impacts on forage production across the rangelands, and we discuss the more general issues facing grazing enterprises associated with climate change, such as 'known uncertainties' and adaptation responses (e.g. use of climate risk assessment). We found that the method of quantifying LCC from a combination of estimates (simulations) of long-term (>30 years) forage production and successful grazier experience has been well tested across northern Australian rangelands with different climatic regions. This methodology provides a sound base for the assessment of climate change impacts, even though there are many identified gaps in knowledge. The evaluation of current trends indicated substantial differences in the trends of annual rainfall (and simulated forage production) across Australian rangelands, with general increases in most western Australian rangelands (including northern regions of the Northern Territory) and decreases in eastern Australian rangelands and south-western Western Australia. Some of the projected changes in rainfall and temperature appear small compared with year-to-year variability. Nevertheless, the impacts on rangeland production systems are expected to be important in terms of required managerial and enterprise adaptations. Some important aspects of climate systems science remain unresolved, and we suggest that a risk-averse approach to rangeland management, based on the 'best estimate' projections, in combination with appropriate responses to short-term (1-5 years) climate variability, would reduce the risk of resource degradation. Climate change projections - including changes in rainfall, temperature, carbon dioxide and other climatic variables - if realised, are likely to affect forage and animal production, and ecosystem functioning. The major known uncertainties in quantifying climate change impacts are: (i) carbon dioxide effects on forage production, quality, nutrient cycling and competition between life forms (e.g. grasses, shrubs and trees); and (ii) the future role of woody plants, including the effects of fire, climatic extremes and management for carbon storage. In a simple example of simulating climate change impacts on forage production, we found that increased temperature (3°C) was likely to result in a decrease in forage production for most rangeland locations (e.g. −21%, calculated as an unweighted average across 90 locations). The increase in temperature exacerbated or reduced the effects of a 10% decrease or increase in rainfall, respectively (−33% or −9%). Estimates of the beneficial effects of increased CO2 (from 350 to 650 ppm) on forage production and water use efficiency indicated enhanced forage production (+26%). The increase was approximately equivalent to the decline in forage production associated with a 3°C temperature increase.
The large magnitude of these opposing effects emphasised the importance of the uncertainties in quantifying the impacts of these components of climate change. We anticipate decreases in LCC, given that the 'best estimate' of climate change across the rangelands is for a decline (or little change) in rainfall and an increase in temperature. As a consequence, we suggest that public policy have regard for the implications for livestock enterprises, regional communities, potential resource damage, animal welfare and human distress. However, the capability to quantify these warnings is yet to be developed, and this important task remains a challenge for rangeland and climate systems science.

Relevance:

20.00%

Publisher:

Abstract:

In Queensland, Australia, strawberries (Fragaria ×ananassa Duchesne) are grown in open fields, and rainfall events can damage fruit. Cultivars that are resistant to rain damage may reduce losses and lower risk for growers. However, little is known about the genetic control of resistance, and in a subtropical climate unpredictable rainfall events hamper evaluation. Rain damage was evaluated on seedling and clonal trials of one breeding population comprising 645 seedling genotypes and 94 clones, and on a second clonal population comprising 46 clones from an earlier crossing, to make preliminary estimates of heritability. The incidence of field damage from rainfall and of damage after laboratory soaking was evaluated to determine whether the soaking method could be used to evaluate resistance to rain damage. Narrow-sense heritability of resistance to rain damage calculated for seedlings was low (0.21 ± 0.15) and not significantly different from zero; however, broad-sense heritability estimates were moderate in both seedlings (0.49 ± 0.16) and clones (0.45 ± 0.08) from the first population, and similar in clones (0.56 ± 0.21) from the second population. Immersion of fruit in deionized water produced symptoms consistent with rain damage in the field. Lengthening the duration of soaking of 'Festival' fruit in deionized water exponentially increased the proportion of damaged fruit, across ripeness stages from immature to ripe, during the first 6 h of soaking. When eight genotypes were evaluated, the proportion of sound fruit after soaking in deionized water in the laboratory for up to 5 h was linearly related (r² = 0.90) to the proportion of sound fruit in the field after 89 mm of rain. The proportions of sound fruit of the breeding genotype '2008-208' and of 'Festival' were about the same under soaking (0.67, 0.60) and field (0.52, 0.43) evaluations, respectively, and these genotypes may be useful sources of resistance to rain damage.
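As a rough illustration of how broad-sense heritability can be estimated from a clonal trial, the sketch below derives H² = Vg/(Vg + Ve) from one-way ANOVA variance components. The paper's actual analysis is not specified in the abstract, so this is a generic example with simulated data and assumed trial dimensions.

```python
# Sketch of estimating broad-sense heritability from a balanced clonal
# trial via one-way ANOVA variance components: H^2 = Vg / (Vg + Ve).
# The paper's actual mixed-model analysis is not given in the abstract,
# so this is a generic illustration with made-up data (Vg = Ve = 1).
import numpy as np

rng = np.random.default_rng(1)
n_clones, n_reps = 40, 4
clone_effects = rng.normal(0, 1.0, n_clones)  # genotypic spread
scores = clone_effects[:, None] + rng.normal(0, 1.0, (n_clones, n_reps))

ms_between = n_reps * scores.mean(axis=1).var(ddof=1)  # among-clone mean square
ms_within = ((scores - scores.mean(axis=1, keepdims=True)) ** 2).sum() / (
    n_clones * (n_reps - 1))                           # within-clone mean square
vg = (ms_between - ms_within) / n_reps                 # genotypic variance
ve = ms_within                                         # environmental variance
print(f"H^2 = {vg / (vg + ve):.2f}")  # expect ~0.5 here; abstract reports 0.45-0.56
```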

Relevance:

20.00%

Publisher:

Abstract:

Choy sum (Brassica rapa subsp. parachinensis) is a dark green leafy vegetable that contains high folate (vitamin B9) levels, comparable to spinach. Folate is essential for the maintenance of human health and is obtained solely through dietary means. Analysis of the edible portion of choy sum by both microbiological assay and LC-MS/MS indicated that total folate activity did not change significantly over 3 weeks of storage at 4 °C. Inedible fractions consisted primarily of outer leaves, which showed signs of rotting after 14 d, and a combination of rotting and yellowing after 21 d, contributing to 20% and 40% of product removal, respectively. Following deconjugation of the folate present in choy sum to monoglutamate and diglutamate derivatives, the principal forms (vitamers) of folate detected were 5-methyltetrahydrofolate and 5-formyltetrahydrofolate, followed by tetrahydrofolate (THF), 5,10-methenyl-THF and 10-formyl folic acid. During storage, a significant decline in 5-formyl-THF was observed, with a slight but non-significant increase in the combined 5-methyl-THF derivatives. The decline in 5-formyl-THF relative to the other folate vitamers present may indicate that 5-formyl-THF acts as a folate storage reserve, being interconverted to more metabolically active forms of folate, such as 5-methyl-THF. Although the folate vitamer profile changed over the storage period, total folate activity did not change significantly. From a human nutritional perspective this is important: while particular folate vitamers (e.g. 5-methyl-THF) are necessary for maintaining vital aspects of plant metabolism, the specific vitamer profile is less important in the human diet, as humans can absorb and interconvert multiple forms of folate. The current trial indicates that it is possible to store choy sum for up to 3 weeks at 4 °C without significantly affecting the total folate concentration of the edible portion.

Relevance:

20.00%

Publisher:

Abstract:

Ammonia can accumulate in highly stocked sheep accommodation, for example during live-export shipments, and could affect sheep health and welfare. Thus, the objective of this experiment was to test the effects of four NH3 concentrations, 4 (control), 12, 21 and 34 mg/m³, on the physiology and behavior of wether sheep. Sheep were held for 12 d under a micro-climate and stocking density similar to shipboard conditions recorded on voyages from Australia to the Middle East during the northern-hemisphere summer. Ammonia increased macrophage activity in transtracheal aspirations, indicating active pulmonary inflammation; however, it had no effect (P > 0.05) on hematological variables. Feed intake decreased (P = 0.002) in proportion to ammonia concentration, and BW gain decreased (P < 0.001) at the two greatest concentrations. Exposure to ammonia increased (P = 0.03) the frequency of sneezing, and at the greatest ammonia concentration sheep were less active, with less locomotion, pawing, and panting. Twenty-eight days after exposure to NH3, the pulmonary macrophage activity and BW of the sheep had returned to those of sheep exposed to only 4 mg/m³. It was concluded that NH3 induced a temporary inflammatory response of the respiratory system and reduced BW gain, which together indicated a transitory adverse effect on the welfare of the sheep.

Relevance:

20.00%

Publisher:

Abstract:

We isolated and characterized 21 microsatellite loci in the vulnerable and iconic Australian lungfish, Neoceratodus forsteri. Loci were screened across eight individuals from the Burnett River and 40 individuals from the Pine River. Genetic diversity was low, with between one and six alleles per locus within populations and a maximum expected heterozygosity of 0.774. These loci will now be available to assess effective population sizes and genetic structure in N. forsteri across its natural range in South East Queensland, Australia.
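Expected heterozygosity, the diversity statistic reported here, is computed per locus from allele frequencies as He = 1 − Σp². A short sketch follows; the frequencies shown are hypothetical, and the authors may have used Nei's sample-size-corrected estimator rather than this simple form.

```python
# Expected heterozygosity at a single locus under Hardy-Weinberg
# equilibrium: He = 1 - sum(p_i^2). The allele frequencies below are
# hypothetical; the authors may have used Nei's sample-size-corrected
# estimator, which scales this by n/(n - 1).
def expected_heterozygosity(freqs):
    assert abs(sum(freqs) - 1.0) < 1e-9, "allele frequencies must sum to 1"
    return 1.0 - sum(p * p for p in freqs)

# A fairly even six-allele locus gives He near the reported maximum of 0.774
print(expected_heterozygosity([0.36, 0.20, 0.15, 0.12, 0.10, 0.07]))  # ≈ 0.78
```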

Relevance:

20.00%

Publisher:

Abstract:

An important focus of biosecurity is anticipating future risks, but time lags between introduction, naturalisation and (ultimately) impact mean that future risks can be strongly influenced by history. We conducted a comprehensive historical analysis of the tropical grasses (n = 155) that have naturalised in Australia since European settlement (1788) to determine what factors shaped historical patterns of naturalisation and future risks, including for the 21 species that cause serious negative impacts. Most naturalised species were from the Old World (78%), were introduced for use in pasture (64.5%), were first recorded prior to 1940 (84.5%) and naturalised before 1980 (90.3%). Patterns for high-impact species were similar, with all first recorded in Australia by 1940 and only seven naturalised since then, five of which were intentionally introduced as pasture species. Counter to expectations, we found no evidence for increased naturalisation with increasing trade, including for species introduced unintentionally, for which the link was expected to be strongest. No new pathways have emerged since the 1930s despite substantial shifts in trading patterns. Furthermore, introduction and naturalisation rates are now at or approaching historically low levels. Three reasons were identified: (1) the often long lag phase between introduction and reported naturalisation means that naturalisation rates reflect historical trends in introduction rates; (2) important introduction pathways are not directly related to trade volume and globalisation; and (3) species pools may become depleted. The last of these appears to be the case for the most important pathway for tropical grasses, the intentional introduction of useful pasture species. Assuming that no new pathways arise that might increase naturalisation rates, and that current at-border biosecurity practices remain in place, we conclude that most future high-impact tropical grass species are already present in Australia. Our results highlight the need to continually test underlying assumptions regarding future naturalisation rates of high-impact invasive species, as the conclusions have important implications for how best to manage future biosecurity risks.

Relevance:

20.00%

Publisher:

Abstract:

Probiotic supplements are single- or mixed-strain cultures of live microorganisms that benefit the host by improving the properties of the indigenous microflora (Seo et al. 2010). In a pilot study at the University of Queensland, Norton et al. (2008) found that Bacillus amyloliquefaciens strain H57 (H57), primarily investigated as an inoculum for making high-quality hay, improved feed intake and nitrogen utilisation over several weeks in pregnant ewes. The purpose of the present study was to further test the potential of H57: to show that it survives the steam-pelleting process, and that it improves the performance of ewes fed pellets based on an agro-industrial by-product with a reputation for poor palatability, palm kernel meal (PKM) (McNeill 2013). Thirty-two first-parity White Dorper ewes (day 37 of pregnancy, mean liveweight 47.3 kg, mean age 15 months) were inducted into individual pens in the animal house at the University of Queensland, Gatton. They were adjusted onto PKM-based pellets (g/kg dry matter (DM): PKM, 408; sorghum, 430; chick pea hulls, 103; minerals and vitamins; crude protein, 128; ME 11.1 MJ/kg DM) until day 89 of pregnancy, and thereafter fed a predominantly pelleted diet with or without H57 spores (10⁹ colony-forming units (cfu)/kg pellet, as fed), plus 100 g/ewe/day of oaten chaff, until day 7 of lactation. From day 7 to day 20 of lactation the pelleted component of the diet was steadily replaced by a 50:50 mix of lucerne and oaten chaff, fed ad libitum, plus 100 g/ewe/day of ground sorghum grain with or without H57 (10⁹ cfu/ewe/day). The period of adjustment in pregnancy (days 37-89) extended beyond expectations owing to some evidence of mild ruminal acidosis after some initially high intakes that were followed by low intakes. During that time the diet was modified, in an attempt to improve palatability, by the addition of oaten chaff and the removal of an acidifying agent (NH4Cl) that had been added initially to reduce the risk of urinary calculi. Eight ewes were removed because of inappetence, leaving 24 ewes to start the trial at day 90 of pregnancy. From day 90 of pregnancy until day 63 of lactation, the liveweights of the ewes and their lambs were determined weekly and at parturition, and feed intakes of the ewes were determined weekly. Once lambing began, one ewe was removed because it gave birth to twin lambs (the rest gave birth to single lambs), four because of the loss of their lambs (two to dystocia), and one because of copper toxicity. The PKM pellets were suspected to be the cause of the copper toxicity and so were removed in early lactation. Hence, the final statistical analysis using STATISTICA 8 (repeated-measures ANOVA for feed intake; one-way ANOVA for liveweight change and birth weight) was completed on 23 ewes for the pregnancy period (n = 11 fed H57; n = 12 control) and 18 ewes or lambs for the lactation period (n = 8 fed H57; n = 10 control). From day 90 of pregnancy until parturition, the H57-supplemented ewes ate 17% more DM (g/day: 1041 vs 889, sed = 42.4, P = 0.04) and gained more liveweight (g/day: 193 vs 24.0, sed = 25.4, P = 0.0002), but produced lambs with a similar birthweight (kg: 4.18 vs 3.99, sed = 0.19, P = 0.54). Over the 63 days of lactation the H57 ewes ate similar amounts of DM but grew more slowly than the control ewes (g/day: 1.5 vs 97.0, sed = 21.7, P = 0.012). The lambs of the H57 ewes grew faster than those of the control ewes for the first 21 days of lactation (g/day: 356 vs 265, sed = 16.5, P = 0.006).
These data support the findings of Norton et al. (2008) and Kritas et al. (2006) that certain Bacillus spp. supplements can improve the performance of pregnant and lactating ewes. The current study particularly highlights the capacity of H57 to stimulate immature ewes to continue to grow maternal tissue through pregnancy, possibly through an enhanced appetite, which then appeared to give them a greater capacity to partition nutrients to their lambs through milk, at least for the first few weeks of lactation, a critical time for optimising lamb survival. To conclude, H57 can survive the steam-pelleting process and improve feed intake and maternal liveweight gain in late pregnancy, and performance in early lactation, of first-parity ewes fed a diet based on PKM.

Relevance:

20.00%

Publisher:

Abstract:

Castration of cattle using rubber rings is becoming increasingly popular because of the perceived ease of the procedure and greater operator safety compared with surgical castration. Few comparative studies have investigated the effects of different castration methods and calf age on welfare outcomes, particularly in a tropical environment. Thirty 3-month-old (liveweight 71–119 kg) and thirty 6-month-old (liveweight 141–189 kg) Belmont Red (a tropically adapted breed) calves were assigned to a two (age) × three (castration method: surgical, ring or sham) factorial study (Surg3, Surg6, Ring3, Ring6, Sham3 and Sham6; n = 10 per treatment group). Welfare outcomes were assessed post-castration using behaviour for 2 weeks, blood parameters (cortisol and haptoglobin concentrations) to 4 weeks, wound healing to 5 weeks, and liveweights to 6 weeks. More Surg calves struggled during castration than Sham and Ring calves (P < 0.05; 90 ± 7% vs. 20 ± 9% and 24 ± 10%) and performed more struggles (1.9 ± 0.2, 1.1 ± 0.3 and 1.1 ± 0.3 for Surg, Sham and Ring, respectively), suggesting that surgical castration caused the most pain during performance of the procedure. A significant (P < 0.05) time × castration method × age interaction for plasma cortisol revealed that concentrations decreased most rapidly in Sham calves; the Ring6 calves failed to show reduced cortisol concentrations at 2 h post-castration, unlike the other treatment groups. By 7 h post-castration, all treatment groups had similar concentrations. A significant (P < 0.01) interaction between time and castration method showed that haptoglobin concentrations increased slightly to 0.89 and 0.84 mg/mL for Surg and Ring, respectively, over the first 3 days post-castration. Concentrations for Surg then decreased to levels similar to Sham by day 21 and, although concentrations for Ring decreased on day 7 to 0.76 mg/mL, they increased significantly on day 14 to 0.97 mg/mL before falling to concentrations similar to the other groups (0.66 mg/mL) by day 21. Significantly (P < 0.05) more of the wounds of the 3-month-old than of the 6-month-old calves were scored as 'healed' at day 7 (74% vs. 39%), while more (P = 0.062) of the Surg than of the Ring wounds were scored as 'healed' at day 21 (60% vs. 29%). At day 14 there were significantly (P < 0.05) fewer healed wounds in Ring6 than in the other treatment groups (13% vs. 40–60%). Liveweight gain was significantly (P < 0.05) greater in 3-month-old (0.53 kg/day) than in 6-month-old calves (0.44 kg/day), and in Sham calves (P < 0.001; 0.54 kg/day) than in Ring (0.44 kg/day) and Surg (0.48 kg/day) calves. Overall, welfare outcomes were slightly better for Surg than for Ring calves, owing to reduced inflammation and faster wound healing, with little difference between age groups.
