14 results for Adverse outcomes
in eResearch Archive - Queensland Department of Agriculture
Abstract:
A high proportion of the Australian and New Zealand dairy industry is based on a relatively simple, low input and low cost pasture feedbase. These factors enable this type of production system to remain internationally competitive. However, a key limitation of pasture-based dairy systems is periodic imbalances between herd intake requirements and pasture DM production, caused by strong seasonality and high inter-annual variation in feed supply. This disparity can be moderated to a certain degree through strategic management of the herd (altering calving dates and stocking rates) and of the feedbase (conserving excess forage and irrigating to flatten seasonal forage availability). Australasian dairy systems are experiencing emerging market and environmental challenges, which include increased competition for land and water resources, decreasing terms of trade, a changing and variable climate, and an increasing environmental focus that requires improved nutrient and water-use efficiency and lower greenhouse gas emissions. The integration of complementary forages has long been viewed as a means to manipulate the home-grown feed supply, to improve the nutritive value and DM intake of the diet, and to increase the efficiency of inputs utilised. Only recently has integrating complementary forages at the whole-farm system level received the significant attention and investment required to examine their potential benefit. Recent whole-of-farm research undertaken in both Australia and New Zealand has highlighted the importance of understanding the challenges of the current feedbase and the level of complementarity between forage types required to improve profit, manage risk and/or mitigate adverse outcomes.
This paper reviews the most recent systems-level research into complementary forages, discusses approaches to modelling their integration at the whole-farm level and highlights the potential of complementary forages to address the major challenges currently facing pasture-based dairy systems.
Abstract:
The majority of Australian weeds are exotic plant species that were intentionally introduced for a variety of horticultural and agricultural purposes. A border weed risk assessment system (WRA) was implemented in 1997 in order to reduce the high economic costs and massive environmental damage associated with introducing serious weeds. We review the behaviour of this system with regard to eight years of data collected from the assessment of species proposed for importation or held within genetic resource centres in Australia. From a taxonomic perspective, species from the Chenopodiaceae and Poaceae were most likely to be rejected and those from the Arecaceae and Flacourtiaceae were most likely to be accepted. Dendrogram analysis and classification and regression tree (TREE) models were also used to analyse the data. The latter revealed that a small subset of the 35 variables assessed was highly associated with the outcome of the original assessment. The TREE model examining all of the data contained just five variables: unintentional human dispersal, congeneric weed, weed elsewhere, tolerates or benefits from mutilation, cultivation or fire, and reproduction by vegetative propagation. It gave the same outcome as the full WRA model for 71% of species. Weed elsewhere was not the first splitting variable in this model, indicating that the WRA has a capacity for capturing species that have no history of weediness. A reduced TREE model (in which human-mediated variables had been removed) contained four variables: broad climate suitability, reproduction in less than or equal to 1 year, self-fertilisation, and tolerates or benefits from mutilation, cultivation or fire. It yielded the same outcome as the full WRA model for 65% of species. Data inconsistencies and the relative importance of questions are discussed, with some recommendations made for improving the use of the system.
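As a rough illustration of how a TREE (classification and regression tree) analysis picks a first splitting variable, the sketch below scores two hypothetical WRA-style questions by Gini impurity reduction on an invented toy dataset; the question names and data are illustrative assumptions, not the study's 35 variables or its assessment records.

```python
# Illustrative sketch: choosing a tree's first split by Gini impurity
# reduction. Dataset (question answers vs. assessment outcome) is invented.

def gini(labels):
    """Gini impurity of a set of binary outcome labels."""
    n = len(labels)
    if n == 0:
        return 0.0
    p = sum(labels) / n
    return 2 * p * (1 - p)

def gini_gain(answers, labels):
    """Impurity reduction from splitting the labels on a yes/no question."""
    yes = [l for a, l in zip(answers, labels) if a]
    no = [l for a, l in zip(answers, labels) if not a]
    split = (len(yes) * gini(yes) + len(no) * gini(no)) / len(labels)
    return gini(labels) - split

# Columns: hypothetical WRA-style questions; rows: assessed species
weed_elsewhere = [1, 1, 1, 0, 0, 0]
congeneric     = [1, 0, 1, 1, 0, 0]
rejected       = [1, 1, 1, 0, 0, 0]  # outcome of each assessment

gains = {"weed elsewhere": gini_gain(weed_elsewhere, rejected),
         "congeneric weed": gini_gain(congeneric, rejected)}
print(max(gains, key=gains.get))  # the question with the larger gain splits first
```

In this toy data "weed elsewhere" separates the outcomes perfectly, so it would be chosen as the first split; the interesting finding in the abstract is that, on the real data, it was not.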
Abstract:
Over recent decades, Australian piggeries have commonly employed anaerobic ponds to treat effluent to a standard suitable for recycling for shed flushing purposes and for irrigation onto nearby agricultural land. Anaerobic ponds are generally sized according to the Rational Design Standard (RDS) developed by Barth (1985), resulting in large ponds, which can be expensive to construct, occupy large land areas, and are difficult and expensive to desludge, potentially disrupting the whole piggery operation. Limited anecdotal and scientific evidence suggests that anaerobic ponds that are undersized according to the RDS operate satisfactorily, without excessive odour emission, impaired biological function or high rates of solids accumulation. Based on these observations, this paper questions the validity of rigidly applying the principles of the RDS and presents a number of alternative design approaches resulting in smaller, more highly loaded ponds that are easier and cheaper to construct and manage. Based on limited data on pond odour emission, it is suggested that higher pond loading rates may reduce overall odour emission by decreasing the pond volume and surface area. Other management options that could be implemented to reduce pond volumes include permeable pond covers, various solids separation methods, and bio-digesters with impermeable covers, used in conjunction with biofilters and/or systems designed for biogas recovery. To ensure that new effluent management options are accepted by regulatory authorities, it is important for researchers to address both industry and regulator concerns and uncertainties regarding new technology, and to demonstrate, beyond reasonable doubt, that new technologies do not increase the risk of adverse impacts on the environment or community amenity.
Further development of raw research outcomes to produce relatively simple, practical guidelines and implementation tools also increases the potential for acceptance and implementation of new technology by regulators and industry.
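The trade-off between loading rate and pond size described above can be sketched with a simple volumetric loading calculation, where pond volume is the daily volatile-solids load divided by the design loading rate; the formulation and all figures below are illustrative assumptions, not the RDS equations or data from the paper.

```python
# Minimal sketch (assumed formulation): pond volume from a volumetric
# loading-rate target. All numbers are hypothetical, for illustration only.

def pond_volume(vs_load_kg_per_day, loading_rate_kg_per_m3_day):
    """Pond volume (m3) = daily volatile-solids load / volumetric loading rate."""
    return vs_load_kg_per_day / loading_rate_kg_per_m3_day

vs_load = 120.0  # kg volatile solids entering the pond per day (hypothetical)
conservative = pond_volume(vs_load, 0.04)   # low, conservative loading rate
highly_loaded = pond_volume(vs_load, 0.20)  # higher-rate alternative design
print(round(conservative), round(highly_loaded))  # the higher rate gives a 5x smaller pond
```

The point the abstract makes falls straight out of the arithmetic: a fivefold increase in the permitted loading rate shrinks the required volume (and hence the emitting surface area) fivefold.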
Abstract:
Microsatellite markers were used to examine spatio-temporal genetic variation in the endangered eastern freshwater cod Maccullochella ikei in the Clarence River system, eastern Australia. High levels of population structure were detected. A model-based clustering analysis of multilocus genotypes identified four populations that were highly differentiated by F-statistics (FST = 0.09–0.49; P < 0.05), suggesting fragmentation and restricted dispersal particularly among upstream sites. Hatchery breeding programmes were used to re-establish locally extirpated populations and to supplement remnant populations. Bayesian and frequency-based analyses of hatchery fingerling samples provided evidence for population admixture in the hatchery, with the majority of parental stock sourced from distinct upstream sites. Comparison between historical and contemporary wild-caught samples showed a significant loss of heterozygosity (21%) and allelic richness (24%) in the Mann and Nymboida Rivers since the commencement of stocking. Fragmentation may have been a causative factor; however, temporal shifts in allele frequencies suggest swamping with hatchery-produced M. ikei has contributed to the genetic decline in the largest wild population. This study demonstrates the importance of using information on genetic variation and population structure in the management of breeding and stocking programmes, particularly for threatened species.
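For readers unfamiliar with the F-statistics quoted above, the sketch below computes Wright's FST for a single biallelic locus as FST = (HT − HS)/HT, from per-population allele frequencies; the frequencies are invented for illustration and are not the study's microsatellite data.

```python
# Minimal sketch (assumed formulation): Wright's F_ST for one biallelic locus.
# H_S = mean within-population expected heterozygosity, H_T = total expected
# heterozygosity at the pooled allele frequency.

def fst(freqs):
    """F_ST across subpopulations, given each one's frequency of one allele."""
    hs = sum(2 * p * (1 - p) for p in freqs) / len(freqs)  # mean within-pop H
    p_bar = sum(freqs) / len(freqs)                        # pooled frequency
    ht = 2 * p_bar * (1 - p_bar)                           # total expected H
    return (ht - hs) / ht

# Strongly diverged demes give a high F_ST; identical demes give 0
print(round(fst([0.9, 0.1]), 2))  # two highly differentiated populations
print(fst([0.5, 0.5]))            # two identical populations
```

Values approaching the upper end of the study's range (0.49) thus indicate populations whose allele frequencies have drifted far apart, consistent with fragmentation and restricted dispersal.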
Abstract:
Khaya senegalensis, African mahogany, a high-value hardwood, was introduced in the Northern Territory (NT) in the 1950s; included in various trials there and at Weipa, Qld, in the 1960s–1970s; planted on ex-mine sites at Weipa (160 ha) until 1985; revived in farm plantings in Queensland and in trials in the NT in the 1990s; adopted for large-scale, annual planting in the Douglas-Daly region, NT from 2006; and is to have the planted area in the NT extended to at least 20,000 ha. The recent serious interest from plantation growers, including Forest Enterprises Australia Ltd (FEA), has seen the establishment of some large-scale commercial plantations. FEA initiated the current study to process relatively young plantation stands from both Northern Territory and Queensland plantations to investigate the sawn wood and veneer recovery and quality from trees ranging from 14 years (NT – 36 trees) to 18–20 years (North Queensland – 31 trees). Field measures of tree size and straightness were complemented with log end splitting assessment and cross-sectional disc sample collection for laboratory wood properties measurements including colour and shrinkage. End-splitting scores assessed on sawn logs were relatively low compared to fast-grown plantation eucalypts and did not impact processing negatively. Heartwood proportion in individual trees ranged from 50% up to 92% of butt cross-sectional disc area for the visually assessed dark-coloured central heartwood and lighter-coloured transition wood combined. Dark central heartwood proportion was positively related to tree size (R2 = 0.57). Chemical tests failed to assist in determining the heartwood–sapwood boundary. Mean basic density of whole disc samples was 658 kg/m3 and ranged among trees from 603 to 712 kg/m3. When freshly sawn, the heartwood of African mahogany was orange-red to red. Transition wood appeared pinkish and the sapwood was a pale yellow colour.
Once air dried, the heartwood colour generally darkens to pinkish-brown or orange-brown, and the effect of prolonged time and sun exposure is to darken and change the heartwood to a red-brown colour. A portable colour measurement spectrophotometer was used to objectively assess colour variation in CIE L*, a* and b* values over time with drying and exposure to sunlight. Capacity to predict standard colour values accurately after varying periods of direct sunlight exposure using results obtained on initial air-dried surfaces decreased with increasing duration of sun exposure. The predictions were more accurate for L* values, which represent brightness, than for variation in the a* values (red spectrum). Selection of superior breeding trees for colour is likely to be based on dried samples exposed to sunlight to reliably highlight wood colour differences. A generally low ratio between tangential and radial shrinkages was found, which was reflected in a low incidence of board distortion (particularly cupping) during drying. A preliminary experiment was carried out to investigate the quality of NIR models to predict shrinkage and density. NIR spectra correlated reasonably well with radial shrinkage and air-dried density. When calibration models were applied to their validation sets, radial shrinkage was predicted to an accuracy of 76% with a Standard Error of Prediction of 0.21%. There was also strong predictive power for wood density. These are encouraging results suggesting that NIR spectroscopy has good potential to be used as a non-destructive method to predict shrinkage and wood density using 12 mm diameter increment core samples. Average green off-saw recovery was 49.5% (range 40 to 69%) for Burdekin Agricultural College (BAC) logs and 41.9% (range 20 to 61%) for Katherine (NT) logs. These figures are about 10% higher than those from the 30-year-old Khaya study by Armstrong et al.
(2007); however, they are inflated as the green boards were not docked to remove wane prior to being tallied. Of the recovered sawn, dried and dressed volume from the BAC logs, based on the cambial face of boards, 27% could potentially be used for select grade, 40% for medium feature grade and 26% for high feature grade. The heart faces had a slightly higher recovery of select (30%) and medium feature (43%) grade boards, with a reduction in the volume of high feature (22%) and reject (6%) grade boards. Distribution of board grades for the NT site aged 14 years followed very similar trends to those of the BAC site boards, with, on average (between heart and cambial faces), 27% potentially usable for select grade, 42% for medium feature grade, 26% for high feature grade and 5% reject. Relative to some other subtropical eucalypts, there was a low incidence of borer attack. The major grade-limiting defects for both medium and high feature grade boards recovered from the BAC site were knots and wane. The presence of large knots may reflect both management practices and the nature of the genetic material at the site. This stand was not managed for timber production, with a very late pruning implemented at about age 12 years. The large number of wane-affected boards is indicative of logs with a large taper and the presence of significant sweep. Wane, knots and skip were the major grade-limiting defects for the NT site, reflecting considerable amounts of sweep with large taper, as might be expected in younger trees. The green veneer recovered from billets of seven Khaya trees rotary peeled on a spindleless lathe represented 83% of green billet volume. Dried veneer recovery ranged from 40 to 74% per billet, with an average of 64%. All of the recovered grades were suitable for use in structural ply in accordance with AS/NZS 2269:2008. The majority of veneer sheets recovered from all billets were C grade (27%), with 20% making D grade and 13% B grade.
Total dry sliced veneer recovery from the two largest logs from each location was estimated to be 41.1%. Very positive results have been recorded in this small-scale study. The amount of colour development observed and the very reasonable recoveries of both sawn and veneer products, with a good representation of higher grades in the product distribution, are encouraging. The prospects for significant improvement in these results from well-managed and productive stands grown for high-quality timber should be high. Additionally, the study has shown the utility of non-destructive evaluation techniques for use in tree improvement programs to improve the quality of future plantations. A few trees combined several of the traits desired of individuals for a first breeding population. Fortunately, the two most promising trees (32, 19) had already been selected for breeding on external traits, and grafts of them are established in the seed orchard.
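The colour tracking in CIE L*, a* and b* described above rests on distances in CIELAB space; as a small aside, the sketch below computes the common CIE76 colour difference (ΔE*ab) between two surface measurements. The L*, a*, b* values are invented examples, not measurements from the study.

```python
# Illustrative sketch: CIE76 colour difference (Delta E*ab) between an
# air-dried and a sun-exposed heartwood reading (hypothetical values).
import math

def delta_e(lab1, lab2):
    """Euclidean distance in CIELAB space (CIE76 Delta E*ab)."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(lab1, lab2)))

air_dried   = (62.0, 14.0, 22.0)  # hypothetical pinkish-brown surface
sun_exposed = (48.0, 20.0, 18.0)  # hypothetical darker red-brown surface
print(round(delta_e(air_dried, sun_exposed), 2))
```

A large ΔE*ab like this, dominated by the drop in L* (brightness), matches the abstract's observation that sun exposure darkens the heartwood and that L* is the most predictable component.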
Abstract:
Objectives: To develop a method to mark hatchery-reared saucer scallops to distinguish them from animals derived from wild populations. Outcomes achieved: Juvenile saucer scallop (Amusium balloti) shells have been successfully marked en masse using three chemicals, namely alizarin red S, calcein and oxytetracycline (OTC). Considering spat survival, mark quality and mark duration collectively, the most successful chemical was OTC. Scallop spat immersed for three days in 200 or 300 mg L-1 OTC showed good mark incorporation and high survival. Tris was an effective means of buffering pH change during OTC treatment, with no apparent adverse effects on the scallops. The marks from OTC treatment were still visible in live scallops for at least 10 months, even with exposure to natural filtered light during that period. A second discernible shell mark was added 27 days after the first with no evident toxicity to the scallops. A simulated seabed system was designed which provided marked improvements in scallop juvenile survival and growth. Advice on shell marking has been given to QSS by DPI&F, and the first commercial trials have now commenced, with initial results showing successful marking of juvenile scallops at QSS. This research will allow the industry to monitor the survival, growth and movement of specific cohorts of deployed scallops. This will provide valuable feedback to assess the value of the ranching venture, to optimise release strategies, and to develop improved species management plans.
Abstract:
Improving development outcomes for smallholder farmers through closer collaboration between landcare and other ACIAR projects.
Abstract:
The impact of excessive sediment loads entering the Great Barrier Reef lagoon has led to increased awareness of land condition in grazing lands. Improved ground cover and land condition have been identified as two important factors in reducing sediment loads. This paper reports the economics of land regeneration using case studies for two different land types in the Fitzroy Basin. The results suggest that, for sediment reduction to be achieved from land regeneration of more fertile land types (brigalow blackbutt), the most efficient method of allocating funds would be through extension and education. However, for less productive country (narrow-leaved ironbark woodlands), incentives will be required. The analysis also highlights the need for further scientific data to undertake similar financial assessments of land regeneration for other locations in Queensland.
Abstract:
This guide applies to spotted gum - ironbark forests and woodlands. Topics covered in the guide include: *The spotted gum - ironbark ecosystem; *General effects of burning practices; *Understanding the effects of fire management; *Timber production; *Livestock grazing production; *Balancing production and biodiversity; *Fire management planning for the property; *Recommendations for landholders. These guidelines have been prepared for spotted gum - ironbark forests and woodlands and are not necessarily applicable to other forest and woodland ecosystems. The recommendations provided in these guidelines should be used as a guide only.
Abstract:
The welfare outcomes for Bos indicus cattle (100 heifers and 50 cows) spayed by either the dropped ovary technique (DOT) or ovariectomy via flank laparotomy (FL) were compared with cattle subjected to physical restraint (PR), restraint by electroimmobilization in conjunction with PR (EIM), and PR and mock AI (MAI). Welfare assessment used measures of morbidity, mortality, BW change, and behavior and physiology indicative of pain and stress. One FL heifer died at d 5 from peritonitis. In the 8-h period postprocedures, plasma bound cortisol concentrations of FL, DOT, and EIM cows were not different and were greater (P < 0.05) than PR and MAI. Similarly, FL and DOT heifers had greater (P < 0.05) concentrations than PR and MAI, with EIM intermediate. Creatine kinase and aspartate aminotransferase concentrations were greater (P < 0.05) in FL and EIM heifers compared with the other treatments, with a similar pattern seen in the cows. Haptoglobin concentrations were significantly (P < 0.05) increased in the FL heifers compared with other treatments in the 8- to 24-h and 24- to 96-h periods postprocedures, and in cows were significantly (P < 0.05) increased in the FL and DOT compared with PR in the 24- to 96-h period. Behavioral responses complemented the physiological responses; standing head down was shown by more (P < 0.05) FL cows and heifers to 3 d postprocedures compared with other treatments, although there was no difference between FL and DOT heifers at the end of the day of procedures. At this same time, fewer (P < 0.05) FL and DOT heifers and cows were observed feeding compared with other treatments, although in cows there was no difference between FL, DOT, and EIM. There were no significant differences (P > 0.05) between treatments in BW changes. For both heifers and cows, FL and DOT spaying caused similar levels of acute pain, but FL had longer-lasting adverse impacts on welfare. 
Electroimmobilization during FL contributed to the pain and stress of the procedure. We conclude that: i) FL and DOT spaying should not be conducted without measures to manage the associated pain and stress; ii) DOT spaying is preferable to FL spaying; iii) spaying heifers is preferable to spaying cows; and iv) electroimmobilization causes pain and stress and should not be routinely used as a method of restraint.
Abstract:
Tension banding castration of cattle is gaining favour because it is relatively simple to perform and is promoted by retailers of the banders as a humane castration method. Two experiments were conducted under tropical conditions using Bos indicus bulls, comparing tension banding (Band) and surgical (Surgical) castration of weaner (7–10 months old) and mature (22–25 months old) bulls with and without pain management (NSAID (ketoprofen) or saline injected intramuscularly immediately prior to castration). Welfare outcomes were assessed using a range of measures; this paper reports on some physiological, morbidity and productivity-related responses to augment the behavioural responses reported in an accompanying paper. Blood samples were taken on the day of castration (day 0) at the time of restraint (0 min) and 30 min (weaners) or 40 min (mature bulls), 2 h, and 7 h; and days 1, 2, 3, 7, 14, 21 and 28 post-castration. Plasmas from day 0 were assayed for cortisol, creatine kinase, total protein and packed cell volume. Plasmas from the other samples were assayed for cortisol and haptoglobin (plus the 0 min sample). Liveweights were recorded approximately weekly to 6 weeks and at 2 and 3 months post-castration. Castration sites were checked at these same times to 2 months post-castration to score the extent of healing and presence of sepsis. Cortisol concentrations (mean ± s.e. nmol/L) were significantly (P < 0.05) higher in the Band (67 ± 4.5) compared with Surgical weaners (42 ± 4.5) at 2 h post-castration, but at 24 h post-castration were greater in the Surgical (43 ± 3.2) compared with the Band weaners (30 ± 3.2). The main effect of ketoprofen was on the cortisol concentrations of the mature Surgical bulls; concentrations were significantly reduced at 40 min (47 ± 7.2 vs. 71 ± 7.2 nmol/L for saline) and 2 h post-castration (24 ± 7.2 vs. 87 ± 7.2 nmol/L for saline).
Ketoprofen, however, had no effect on the Band mature bulls, with their cortisol concentrations averaging 54 ± 5.1 nmol/L at 40 min and 92 ± 5.1 nmol/L at 2 h. Cortisol concentrations were also significantly elevated in the Band (83 ± 3.0 nmol/L) compared with Surgical mature bulls (57 ± 3.0 nmol/L) at weeks 2–4 post-castration. The timing of this elevation coincided with significantly elevated haptoglobin concentrations (mg/mL) in the Band bulls (2.97 ± 0.102 for mature bulls and 1.71 ± 0.025 for weaners, vs. 2.10 ± 0.102 and 1.45 ± 0.025 respectively for the Surgical treatment) and evidence of slow wound healing and sepsis in both the weaner (0.81 ± 0.089 not healed at week 4 for Band, 0.13 ± 0.078 for Surgical) and mature bulls (0.81 ± 0.090 at week 4 for Band, 0.38 ± 0.104 for Surgical). Overall, liveweight gains of both age groups were not affected by castration method. The findings of acute pain, chronic inflammation and possibly chronic pain in the mature bulls at least, together with poor wound healing in the Band bulls support behavioural findings reported in the accompanying paper and demonstrate that tension banding produces inferior welfare outcomes for weaner and mature bulls compared with surgical castration.
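The physiological results above are reported as mean ± s.e.; as a reminder of how that summary statistic is formed, the sketch below computes a mean and standard error of the mean for a small invented set of cortisol readings (not the study's raw data).

```python
# Minimal sketch: mean and standard error of the mean (s.e.), the summary
# form used for the cortisol values above. Readings are hypothetical.
import math

def mean_se(values):
    """Return (mean, s.e.) of a sample, using the n-1 sample variance."""
    n = len(values)
    mean = sum(values) / n
    var = sum((v - mean) ** 2 for v in values) / (n - 1)  # sample variance
    return mean, math.sqrt(var / n)                       # s.e. = s / sqrt(n)

cortisol = [60, 72, 65, 70, 68]  # hypothetical nmol/L readings
m, se = mean_se(cortisol)
print(f"{m:.1f} ± {se:.1f} nmol/L")
```

Comparing treatment means against their standard errors is what underlies the significance statements (P < 0.05) quoted throughout the abstract.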
Abstract:
Tension-band castration of cattle is gaining favour because it is relatively simple to perform and is promoted by retailers of the devices as a humane castration method. Furthermore, retailers encourage delaying castration to exploit the superior growth rates of bulls compared with steers. Two experiments were conducted, under tropical conditions, comparing tension banding and surgical castration of weaner (7–10 months old) and mature (22–25 months old) Bos indicus bulls with and without pain management (ketoprofen or saline injected intramuscularly immediately prior to castration). Welfare outcomes were assessed using a wide range of measures; this paper reports on the behavioural responses of the bulls and an accompanying paper reports on other measures. Behavioural data were collected at intervals by direct observation and continuously via data loggers on the hind leg of the bulls to 4 weeks post-castration. Tension-banded bulls performed less movement in the crush/chute than the surgically castrated bulls during the procedures (weaner: 2.63 vs. 5.69, P < 0.001; mature: 1.00 vs. 5.94; P < 0.001 for tension-band and surgical castration, respectively), indicating that tension banding was less painful than surgical castration during conduct. To 1.5 h post-castration, tension-banded bulls performed significantly (all P < 0.05) more active behavioural responses indicative of pain compared with surgical castrates, e.g., percentage time walking forwards (weaner: 15.0% vs. 8.1%; mature: 22.3% vs. 15.1%), walking backwards (weaner: 4.3% vs. 1.4%; mature: 2.4% vs. 0.5%), numbers of tail movements (weaner: 21.9 vs. 1.4; mature: 51.5 vs. 39.4) and leg movements (weaner: 12.9 vs. 0.9; mature: 8.5 vs. 1.5), respectively. In contrast, surgically castrated bulls performed more immobile behaviours compared with tension-banded bulls (e.g., standing in mature bulls was 56.6% vs. 34.4%, respectively, P = 0.002).
Ketoprofen administration appeared effective in moderating pain-related behaviours in the mature bulls from 1.5 to 3 h, e.g., reducing abnormal standing (0.0% vs. 7.7%, P = 0.009) and increasing feeding (12.7% vs. 0.0%, P = 0.048) in NSAID- and saline-treated bulls, respectively. There were few behavioural differences subsequent to 24 h post-castration, but some limited evidence of chronic pain (3–4 weeks post-castration) with both methods. Interpretation, however, was difficult from behaviours alone. Thus, tension banding is less painful than surgical castration during conduct of the procedures and pain-related behavioural responses differ with castration method (active restlessness in response to tension banding and minimisation of movement in response to surgical castration). Ketoprofen administered immediately prior to castration was somewhat effective in reducing pain, particularly in the mature bulls.
Abstract:
Castration of cattle using rubber rings is becoming increasingly popular due to the perceived ease of the procedure and greater operator safety when compared with surgical castration. Few comparative studies have investigated the effects of different castration methods and calf age on welfare outcomes, particularly in a tropical environment. Thirty 3-month-old (liveweight 71–119 kg) and thirty 6-month-old (liveweight 141–189 kg) Belmont Red calves (a tropically adapted breed) were assigned to a two (age) × three (castration method: surgical, ring and sham) factorial study (Surg3, Surg6, Ring3, Ring6, Sham3 and Sham6, n = 10 for each treatment group). Welfare outcomes were assessed post-castration using: behaviour for 2 weeks; blood parameters (cortisol and haptoglobin concentrations) to 4 weeks; wound healing to 5 weeks; and liveweights to 6 weeks. More Surg calves struggled during castration compared with Sham and Ring (P < 0.05, 90 ± 7% vs. 20 ± 9% and 24 ± 10%) and performed more struggles (1.9 ± 0.2, 1.1 ± 0.3 and 1.1 ± 0.3 for Surg, Sham and Ring, respectively), suggesting that surgical castration caused most pain during performance of the procedure. A significant (P < 0.05) time × castration method × age interaction for plasma cortisol revealed that concentrations decreased most rapidly in Sham; the Ring6 calves failed to show reduced cortisol concentrations at 2 h post-castration, unlike other treatment groups. By 7 h post-castration, all treatment groups had similar concentrations. A significant (P < 0.01) interaction between time and castration method showed that haptoglobin concentrations increased slightly to 0.89 and 0.84 mg/mL for Surg and Ring, respectively, over the first 3 days post-castration.
Concentrations for Surg then decreased to levels similar to Sham by day 21 and, although concentrations for Ring decreased on day 7 to 0.76 mg/mL, they increased significantly on day 14 to 0.97 mg/mL before reducing to concentrations similar to the other groups (0.66 mg/mL) by day 21. Significantly (P < 0.05) more of the wounds of the 3-month compared with the 6-month calves scored as ‘healed’ at day 7 (74% vs. 39%), while more (P = 0.062) of the Surg than Ring scored as ‘healed’ at day 21 (60% vs. 29%). At day 14 there were significantly (P < 0.05) fewer healed wounds in Ring6 compared with other treatment groups (13% vs. 40–60%). Liveweight gain was significantly (P < 0.05) greater in 3-month (0.53 kg/day) than in 6-month calves (0.44 kg/day) and in Sham calves (P < 0.001, 0.54 kg/day), than in Ring (0.44 kg/day) and Surg (0.48 kg/day) calves. Overall, welfare outcomes were slightly better for Surg than Ring calves due to reduced inflammation and faster wound healing, with little difference between age groups.