42 results for frog decline
Abstract:
Grower groups working together have proven to be a successful means of improving the utilization of farm resources and accelerating the adoption of Sugar Yield Decline Joint Venture (SYDJV) principles. The Pinnacle Precision Farming Group was formed in 2004 with the aim of bringing together the ideas, knowledge and resources of growers in the Herbert region. Along with their common interest in controlled traffic, minimal tillage and crop rotations, the grower group uses a farm machinery contractor to carry out some of their major farming operations. This paper provides an insight into the changes made by the Pinnacle Precision Farming Group and their journey to adopt the new farming system practices. It also details the changes made by the group's machinery contractor and compares the old and new farming systems used by a group member. A focus of the paper is the impact of the new farming system on the economic, social and environmental components of the farming business. Analysis of the new farming system with a legume crop rotation revealed an increase in the farm gross margin of AU$22 024 and, in addition, a reduction in tractor operation time of 38% across the whole farm. This represents a return on marginal capital of 14.68 times the original capital outlay required by the group member. Using the new farming system without a legume crop will still improve the group member's whole-of-farm gross margin by AU$6 839 and reduce tractor operation time by 43% across the whole farm. The Pinnacle Precision Farming Group recognizes the need to continually improve their farming businesses and believes that the new farming system principles are critical for the long-term viability of the industry. [US$1 = AU$1.19].
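Return on marginal capital relates the gross-margin gain to the capital outlay; a minimal sketch that back-calculates the implied outlay from the reported figures (the outlay itself is an inference, not a number stated in the paper):

```python
# Sketch relating the reported gross-margin gain to return on marginal capital.
# The implied capital outlay is back-calculated and is an inference, not a
# figure stated in the paper.
gross_margin_gain_aud = 22024        # reported whole-of-farm gross margin increase (AU$)
return_on_marginal_capital = 14.68   # reported ratio

implied_outlay_aud = gross_margin_gain_aud / return_on_marginal_capital
print(round(implied_outlay_aud))     # ~1500 implied AU$ capital outlay

# Conversion at the quoted exchange rate (US$1 = AU$1.19)
gain_usd = gross_margin_gain_aud / 1.19
```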
Abstract:
Detailed data on seagrass distribution, abundance, growth rates and community structure information were collected at Orman Reefs in March 2004 to estimate the above-ground productivity and carbon assimilated by seagrass meadows. Seagrass meadows were re-examined in November 2004 for comparison at the seasonal extremes of seagrass abundance. Ten seagrass species were identified in the meadows on Orman Reefs. Extensive seagrass coverage was found in March (18,700 ha) and November (21,600 ha), with seagrass covering the majority of the intertidal reef-top areas and a large proportion of the subtidal areas examined. There were marked differences in seagrass above-ground biomass, distribution and species composition between the two surveys. Major changes between March and November included a substantial decline in biomass for intertidal meadows and an expansion in area of subtidal meadows. Changes were most likely a result of greater tidal exposure of intertidal meadows prior to November leading to desiccation and temperature-related stress. The Orman Reef seagrass meadows had a total above-ground productivity of 259.8 t DW day-1 and estimated carbon assimilation of 89.4 t C day-1 in March. The majority of this production came from the intertidal meadows which accounted for 81% of the total production. Intra-annual changes in seagrass species composition, shoot density and size of meadows measured in this study were likely to have a strong influence on the total above-ground production during the year. The net estimated above-ground productivity of Orman Reefs meadows in March 2004 (1.19 g C m-2 day-1) was high compared with other tropical seagrass areas that have been studied and also higher than many other marine, estuarine and terrestrial plant communities.
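Scaling areal productivity to a meadow-level total is a unit conversion from g C m-2 day-1 to t C day-1; a sketch with an illustrative area (the study's totals weight intertidal and subtidal meadows separately, so this does not reproduce its figures):

```python
# Unit-conversion sketch: areal carbon productivity (g C per m^2 per day) scaled
# to a meadow-level total (t C per day). The 1,000 ha area is illustrative only;
# the study's actual totals weight intertidal and subtidal meadows separately.
def total_carbon_t_per_day(areal_g_c_m2_day, area_ha):
    m2_per_ha = 10_000
    grams_per_day = areal_g_c_m2_day * area_ha * m2_per_ha
    return grams_per_day / 1_000_000  # g -> t

print(total_carbon_t_per_day(1.19, 1000))  # 11.9 t C per day over 1,000 ha
```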
Abstract:
Batches of glasshouse-grown flowering sorghum plants were placed in circular plots for 24 h at two field sites in southeast Queensland, Australia on 38 occasions in 2003 and 2004, to trap aerial inoculum of Claviceps africana. Plants were located 20-200 m from the centre of the plots. Batches of sorghum plants with secondary conidia of C. africana on inoculated spikelets were placed at the centre of each plot on some dates as a local point source of inoculum. Plants exposed to field inoculum were returned to a glasshouse, incubated at near-100% relative humidity for 48 h and then at ambient relative humidity for another week before counting infected spikelets to estimate pathogen dispersal. Three times as many spikelets became infected when inoculum was present within 200 m of trap plants, but the number of infected spikelets did not decline with increasing distance from the local source within the 200 m. Spikelets also became infected on all 10 dates when plants were exposed without a local source of infected plants, indicating that infection can occur from conidia surviving in the atmosphere. In 2005, when trap plants were placed at 14 locations along a 280 km route, infected spikelets diminished with increasing distance from sorghum paddocks and infection was sporadic at distances over 1 km. Multiple regression analysis showed a significant influence of moisture-related weather variables on inoculum dispersal. Results suggest that sanitation measures can help reduce ergot severity at the local level, but sustainable management will require a better understanding of long-distance dispersal of C. africana inoculum.
Abstract:
Several species of marine mammals are at risk of extinction from being captured as bycatch in commercial fisheries. Various approaches have been developed and implemented to address this bycatch problem, including devices and gear changes, time and area closures and fisheries moratoria. Most of these solutions are difficult to implement effectively, especially for artisanal fisheries in developing countries and remote regions. Re-zoning of the Great Barrier Reef World Heritage Area (GBRWHA) in 2004 closed 33% of the region to extractive activities, including commercial fishing. However, the impact of re-zoning and the associated industry restructuring on a threatened marine mammal, the dugong (Dugong dugon), is difficult to quantify. Accurate information on dugong bycatch in commercial nets is unavailable because of the large geographic extent of the GBRWHA, the remoteness of the region adjacent to the Cape York Peninsula where most dugongs occur and the artisanal nature of the fishery. In the face of this uncertainty, a spatial risk-assessment approach was used to evaluate the re-zoning and associated industry restructuring for their ability to reduce the risk of dugong bycatch from commercial fisheries netting. The new zoning arrangements appreciably reduced the risk of dugong bycatch by reducing the total area where commercial netting is permitted. Netting is currently not permitted in 67% of dugong habitats of high conservation value, a 56% improvement over the former arrangements. Re-zoning and industry restructuring also contributed to a 22% decline in the spatial extent of netting activity. Spatial risk-assessment approaches that evaluate the risk to mobile marine mammals from bycatch are applicable to other situations where there is limited information on the location and intensity of bycatch, including remote regions and developing countries where resources are limited.
Abstract:
Three data sets were examined to define the level of interaction of reef-associated sharks with the commercial Coral Reef Fin Fish Fishery within the Great Barrier Reef (GBR). Data were examined from fishery logbooks, an observer program within the fishery and a fishery-independent survey conducted as part of the Effects of Line Fishing (ELF) Experiment. The majority of the identified catch comprised grey reef (62-72%), whitetip reef (16-29%) and blacktip reef (6-13%) sharks. Logbook data revealed spatially and temporally variable landings of shark from the GBR. Catch per unit effort (CPUE) through time was stable for the period from 1989 to 2006, with no evidence of increase or decline. Data from the observer and ELF data sets indicated no differences in CPUE among regions. The ELF data set demonstrated that CPUE was higher in Marine National Park zones (no fishing) than in General Use zones (open to fishing). The ongoing and consistent catches of reef sharks in the fishery and the effectiveness of no-fishing zones suggest that management zones within the GBR Marine Park are effective at protecting a portion of the reef shark population from exploitation.
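CPUE is simply catch divided by fishing effort; a minimal sketch of computing annual CPUE from logbook-style records (field names and units are hypothetical placeholders, not the fishery logbook's actual schema):

```python
# Minimal CPUE (catch per unit effort) sketch from logbook-style records.
# Record fields and units are invented for illustration.
records = [
    {"year": 1989, "catch_kg": 120.0, "effort_days": 4.0},
    {"year": 1990, "catch_kg": 90.0,  "effort_days": 3.0},
]

def cpue(catch, effort):
    return catch / effort  # kg per fishing day

annual = {r["year"]: cpue(r["catch_kg"], r["effort_days"]) for r in records}
print(annual)  # {1989: 30.0, 1990: 30.0}
```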
Abstract:
The effects of the hydrological regime on temporal changes to physical characteristics of substratum habitat, the sediment texture of surface sediments (<10 cm), were investigated in a sub-tropical headwater stream over four years. Surface discharge was measured together with vertical hydraulic gradient and groundwater depth in order to explore features of sediment habitat that extend beyond the streambed surface. Whilst the typical discharge pattern was one of intermittent base flows and infrequent flow events associated with monsoonal rain patterns, the study period also encompassed a drought and a one-in-a-hundred-year flood. Rainfall and discharge did not necessarily reflect the actual conditions in the stream. Although surface waters persisted long after discharge ceased, the streambed was completely dry on several occasions. Shallow groundwater was present at variable depths throughout the study period, being absent only at the height of the drought. The streambed sediments were mainly gravels, sand and clay. Finer sediment fractions showed a marked change in grain size over time, although bedload movement was limited to a single high-discharge event. In response to a low discharge regime (drought), sediments characteristically showed non-normal distributions and were dominated by finer materials. A high-energy discharge event produced a coarsening of sands and a diminished clay fraction in the streambed. Particulate organic matter from sediments showed trends of build-up and decline with the high and low discharge regimes, respectively. Within the surface sediment interstices, three potential categories of invertebrate habitat were recognised, each with dynamic spatial and temporal boundaries.
Abstract:
Adoption of conservation tillage practices on Red Ferrosol soils in the inland Burnett area of south-east Queensland has been shown to reduce runoff and subsequent soil erosion. However, improved infiltration resulting from these measures has not improved crop performance and there are suggestions of increased loss of soil water via deep drainage. This paper reports data from monitoring soil water under real and artificial rainfall events in commercial fields and long-term tillage experiments, and uses the data to explore the rate and mechanisms of deep drainage in this soil type. Soils were characterised by large drainable porosities (≥0.10 m3/m3) in all parts of the profile to depths of 1.50 m, with drainable porosity similar to available water content (AWC) at 0.25 and 0.75 m, but >60% higher than AWC at 1.50 m. Hydraulic conductivity immediately below the tilled layer in both continuously cropped soils and those after a ley pasture phase was shown to decline with increasing soil moisture content, although the rate of decline was much greater in continuously cropped soil. At moisture contents approaching the drained upper limit (pore water pressure = -100 cm H2O), estimates of saturated hydraulic conductivity after a ley pasture were 3-5 times greater than in continuously cropped soil, suggesting much greater rates of deep drainage in the former when soils are moist. Hydraulic tensiometers and fringe capacitance sensors monitored during real and artificial rainfall events showed evidence of soils approaching saturation in the surface layers (top 0.30-0.40 m), but there was no evidence of soil moisture exceeding the drained upper limit (i.e. pore water pressures ≤ -100 cm H2O) in deeper layers. Recovery of applied soil water within the top 1.00-1.20 m of the profile during or immediately after rainfall events declined as the starting profile moisture content increased. These effects were consistent with very rapid rates of internal drainage.
Sensors deeper in the profile were unable to detect this drainage due to either non-uniformity of conducting macropores (i.e. bypass flow) or unsaturated conductivities in deeper layers that far exceed the saturated hydraulic conductivity of the infiltration throttle at the bottom of the cultivated layer. Large increases in unsaturated hydraulic conductivities are likely with only small increases in water content above the drained upper limit. Further studies with drainage lysimeters and large banks of hydraulic tensiometers are planned to quantify drainage risk in these soil types.
Abstract:
The rumen degradability parameters of the diet selected by two to four oesophageal-fistulated Brahman steers grazing a range of tropical pastures were determined by incubation of extrusa in nylon bags suspended in the rumen of rumen-fistulated (RF) Brahman steers. The effective protein degradability (Edg) was determined by measuring the rate of disappearance of neutral detergent insoluble nitrogen (NDIN) less acid detergent insoluble nitrogen (ADIN) in the incubated extrusa. Six to eight RF steers also grazed each of the pastures along with the oesophageal-fistulated steers, to allow determination of key rumen parameters and rumen particulate matter fractional outflow rates (FOR). The seven pastures studied included: native tropical grass (C4) pasture (major species Heteropogon contortus and Bothriochloa bladhii), studied in the early wet (NPEW), the wet/dry transition (NPT) and the dry (NPD) seasons; introduced tropical grass (C4) pasture (Bothriochloa insculpta), studied in the mid wet season (BB); the introduced tropical legumes (C3), Lablab purpureus (LL) and Clitoria ternatea (BP); and the temperate grass (C3) pasture, ryegrass (Lolium multiflorum, RG). Using the measured particle FOR values in calculations, the Edg estimates were very high for both C4 and C3 species: 0.82–0.91 and 0.95–0.98 g/g crude protein (CP), respectively. Substitution of an assumed FOR (kp = 0.02/h) for the measured values for each pasture type did not markedly affect estimates of Edg. However, C4 tropical grasses had much lower effective rumen degradable protein (ERDP) fractions (23–66 g/kg DM) than the C3 pasture species RG and LL (356 and 243 g/kg DM, respectively). This was associated with a lower potential degradability and degradation rate of organic matter (OM) in sacco, lower in vitro organic matter digestibility (IVOMD) and CP concentrations in the extrusa, and lower ammonia-N and branched-chain fatty acid concentrations in rumen fluid for the tropical grasses. 
As tropical grass pastures senesced, there was a decline in Edg, the ERDP and rumen undegradable protein (UDP) fractions, the potential degradability and degradation rate of OM and the IVOMD. These results provide useful data for estimating protein supply to cattle grazing tropical pastures.
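The effective degradability (Edg) figures computed with an assumed outflow rate follow the standard Ørskov–McDonald effective-degradability relationship, E = a + b·c/(c + kp). A minimal sketch, with illustrative parameter values that are assumptions rather than those fitted in the study:

```python
# Effective degradability sketch using the standard Orskov-McDonald form,
# E = a + b*c / (c + kp). Parameter values below are illustrative assumptions,
# not values fitted in the study; kp = 0.02/h is the assumed particulate
# outflow rate that the abstract mentions substituting for measured values.
def effective_degradability(a, b, c, kp):
    """a: soluble fraction; b: potentially degradable fraction;
    c: degradation rate (/h); kp: particulate outflow rate (/h)."""
    return a + b * c / (c + kp)

print(effective_degradability(a=0.25, b=0.65, c=0.06, kp=0.02))  # ~0.7375 g/g CP
```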
Abstract:
Cultivation and cropping of soils result in a decline in soil organic carbon and soil nitrogen, and can lead to reduced crop yields. The CENTURY model was used to simulate the effects of continuous cultivation and cereal cropping on total soil organic matter (C and N), carbon pools, nitrogen mineralisation, and crop yield at 6 locations in southern Queensland. The model was calibrated for each replicate from the original datasets, allowing comparisons for each replicate rather than site averages. The CENTURY model was able to satisfactorily predict the impact of long-term cultivation and cereal cropping on total organic carbon, but was less successful in simulating the different fractions and nitrogen mineralisation. The model at first over-predicted the initial (pre-cropping) soil carbon and nitrogen concentrations of the sites. To account for the unique shrinking and swelling characteristics of the Vertosol soils, the default annual decomposition rates of the slow and passive carbon pools were doubled, after which the model accurately predicted initial conditions. The ability of the model to predict carbon pool fractions varied, demonstrating the difficulty inherent in predicting the size of these conceptual pools. The strength of the model lies in its ability to closely predict the starting soil organic matter conditions, and to predict the impact of clearing, cultivation, fertiliser application, and continuous cropping on total soil carbon and nitrogen.
Abstract:
Negative potassium (K) balances in all broadacre grain cropping systems in northern Australia are resulting in a decline in the plant-available reserves of K and necessitating a closer examination of strategies to detect and respond to developing K deficiency in clay soils. Grain growers on the Red Ferrosol soils have increasingly encountered K deficiency over the last 10 years due to lower available K reserves in these soils in their native condition. However, the problem is now increasingly evident on the medium-heavy clay soils (Black and Grey Vertosols) and is made more complicated by the widespread adoption of direct drill cropping systems and the resulting strong stratification of available K reserves in the top 0.05-0.1 m of the soil profile. This paper reports glasshouse studies examining the fate of applied K fertiliser in key cropping soils of the inland Burnett region of south-east Queensland, and uses the resultant understanding of K dynamics to interpret results of field trials assessing the effectiveness of K application strategies in terms of K availability to crop plants. At similar concentrations of exchangeable K (K-exch), soil solution K concentrations and activity of K in the soil solution (AR(K)) varied by 6-7-fold between soil types. When K-exch arising from different rates of fertiliser application was expressed as a percentage of the effective cation exchange capacity (i.e. K saturation), there was evidence of greater selective adsorption of K on the exchange complex of Red Ferrosols than Black and Grey Vertosols or Brown Dermosols. Both soil solution K and AR(K) were much less responsive to increasing K-exch in the Black Vertosols; this is indicative of these soils having a high K buffer capacity (KBC). These contrasting properties have implications for the rate of diffusive supply of K to plant roots and the likely impact of K application strategies (banding v. broadcast and incorporation) on plant K uptake.
Field studies investigating K application strategies (banding v. broadcasting) and the interaction with the degree of soil disturbance/mixing of different soil types are discussed in relation to K dynamics derived from glasshouse studies. Greater propensity to accumulate luxury K in crop biomass was observed in a Brown Ferrosol with a KBC lower than that of a Black Vertosol, consistent with more efficient diffusive supply to plant roots in the Ferrosol. This luxury K uptake, when combined with crops exhibiting low proportional removal of K in the harvested product (i.e. low-K-harvest-index coarse grains and winter cereals) and residue retention, can lead to rapid re-development of stratified K profiles. There was clear evidence that some incorporation of K fertiliser into soil was required to facilitate root access and crop uptake, although there was no evidence of a need to incorporate K fertiliser any deeper than achieved by conventional disc tillage (i.e. 0.1-0.15 m). Recovery of fertiliser K applied in deep (0.25-0.3 m) bands in combination with N and P to facilitate root proliferation was quite poor in Red Ferrosols and Grey or Black Vertosols with moderate effective cation exchange capacity (ECEC, 25-35 cmol(+)/kg), was reasonable but not enough to overcome K deficiency in a Brown Dermosol (ECEC 11 cmol(+)/kg), but was quite good on a Black Vertosol (ECEC 50-60 cmol(+)/kg). Collectively, results suggest that frequent small applications of K fertiliser, preferably with some soil mixing, are an effective fertiliser application strategy on lighter clay soils with low KBC and an effective diffusive supply mechanism. Alternatively, concentrated K bands and enhanced root proliferation around them may be a more effective strategy in Vertosol soils with high KBC and limited diffusive supply. Further studies to assess this hypothesis are needed.
Abstract:
Our evaluation of calf predation by wild dogs in the 1990s found that both the number of calves killed and the frequency of years in which calf losses occurred were higher in baited areas than in adjoining, non-baited areas of similar size. Calf losses were highest with poor seasonal conditions, low prey numbers and where baited areas were re-colonised by wild dogs soon after baiting. We monitored wild dog "activity" before and after 35 baiting programs in southwest, central west and far north Queensland between 1994 and 2006 and found that the change in activity depends on the timing of the baiting. Baiting programs conducted between October and April showed an increase in dog activity post-baiting (average increase of 219.1%, SEM 100.9, n=9, for programs conducted in October and November; an increase of 82.5%, SEM 54.5, n=7, for programs conducted in March and April; and a decrease in activity of 46.5%, SEM 10.2, n=19, for programs conducted between May and September). We have monitored the seasonal activity and dispersal of wild dogs fitted with satellite transmitters from 2006 to the present. We have found that:
• Activity of breeding males and females, whilst rearing and nurturing pups, is focussed around the den between July and September and away from areas of human activity. Breeding groups appear to avoid locations of human activity until juveniles become independent (around late November).
• While independent and solitary yearlings often have unstable, elliptically shaped territories in less favourable areas, members of breeding groups have territories that appear seasonally stable and circular, located in more favourable habitats.
• Extra-territorial forays of solitary yearlings can be huge, in excess of 200 km. The largest forays we have monitored have occurred when the activity of pack members is focussed around rearing pups and juveniles (August to November).
• Where wild dogs have dispersed or had significant territorial expansion, it has occurred within days of baiting programs and onto recently baited properties.
• The wild dogs we have tracked have followed netting barrier fences for hundreds of kilometres and lived adjacent to, or bypassed, numerous grids in the barrier.
Based on these studies, we conclude that a proportion of the perceived decline in dog activity between May and September, post-baiting, is due to a decline in dog activity in areas associated with human activity. The increase in dog activity post-baiting between October and May (and increased calf predation on baited properties) is likely caused by wild dogs dispersing (juveniles and yearlings) or expanding (adults) their territories into baited, now 'vacant', areas. We hypothesise that baiting programs should be focussed in summer and autumn, commencing in late November as soon as juveniles become independent of adults. We also hypothesise that, instead of large annual or semi-annual baiting programs, laying the same number of baits over 4-6 weeks may be more effective. These hypotheses need to be tested through an adaptive management project.
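The activity changes above are percentage changes of a post-baiting activity index relative to its pre-baiting value; a minimal sketch with invented index values (only the formula is the point):

```python
# Percent-change sketch for activity indices before and after baiting.
# Index values below are invented for illustration.
def percent_change(pre, post):
    return (post - pre) / pre * 100.0

print(percent_change(10.0, 31.91))  # ~+219.1%, matching the reported Oct-Nov average
print(percent_change(10.0, 5.35))   # ~-46.5%, matching the May-Sep average
```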
Abstract:
Grazing is a major land use in Australia's rangelands. The 'safe' livestock carrying capacity (LCC) required to maintain resource condition is strongly dependent on climate. We reviewed: the approaches for quantifying LCC; current trends in climate and their effect on components of the grazing system; implications of the 'best estimates' of climate change projections for LCC; the agreement and disagreement between the current trends and projections; and the adequacy of current models of forage production in simulating the impact of climate change. We report the results of a sensitivity study of climate change impacts on forage production across the rangelands, and we discuss the more general issues facing grazing enterprises associated with climate change, such as 'known uncertainties' and adaptation responses (e.g. use of climate risk assessment). We found that the method of quantifying LCC from a combination of estimates (simulations) of long-term (>30 years) forage production and successful grazier experience has been well tested across northern Australian rangelands with different climatic regions. This methodology provides a sound base for the assessment of climate change impacts, even though there are many identified gaps in knowledge. The evaluation of current trends indicated substantial differences in the trends of annual rainfall (and simulated forage production) across Australian rangelands, with general increases in most of the western Australian rangelands (including northern regions of the Northern Territory) and decreases in eastern Australian rangelands and south-western Western Australia. Some of the projected changes in rainfall and temperature appear small compared with year-to-year variability. Nevertheless, the impacts on rangeland production systems are expected to be important in terms of required managerial and enterprise adaptations.
Some important aspects of climate systems science remain unresolved, and we suggest that a risk-averse approach to rangeland management, based on the 'best estimate' projections, in combination with appropriate responses to short-term (1-5 years) climate variability, would reduce the risk of resource degradation. Climate change projections - including changes in rainfall, temperature, carbon dioxide and other climatic variables - if realised, are likely to affect forage and animal production, and ecosystem functioning. The major known uncertainties in quantifying climate change impacts are: (i) carbon dioxide effects on forage production, quality, nutrient cycling and competition between life forms (e.g. grass, shrubs and trees); and (ii) the future role of woody plants, including the effects of fire, climatic extremes and management for carbon storage. In a simple example of simulating climate change impacts on forage production, we found that increased temperature (3 degrees C) was likely to result in a decrease in forage production for most rangeland locations (e.g. -21% calculated as an unweighted average across 90 locations). The increase in temperature exacerbated or reduced the effects of a 10% decrease/increase in rainfall respectively (-33% or -9%). Estimates of the beneficial effects of increased CO2 (from 350 to 650 ppm) on forage production and water use efficiency indicated enhanced forage production (+26%). The increase was approximately equivalent to the decline in forage production associated with a 3 degrees C temperature increase. The large magnitude of these opposing effects emphasised the importance of the uncertainties in quantifying the impacts of these components of climate change. We anticipate decreases in LCC given that the 'best estimate' of climate change across the rangelands is for a decline (or little change) in rainfall and an increase in temperature.
As a consequence, we suggest that public policy have regard for: the implications for livestock enterprises, regional communities, potential resource damage, animal welfare and human distress. However, the capability to quantify these warnings is yet to be developed and this important task remains as a challenge for rangeland and climate systems science.
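The reported forage-production sensitivities can be illustrated as simple percentage deltas applied to a baseline. This sketch only reproduces the reported magnitudes, not the underlying simulation model, and the baseline value is arbitrary:

```python
# Scenario sketch: applying the reported forage-production sensitivities as
# simple percentage deltas to an arbitrary baseline. How the original model
# combines effects is more complex; this only illustrates the magnitudes.
deltas = {
    "warming_3C": -21.0,             # 3 degrees C warming
    "warming_3C_rain_minus10": -33.0,
    "warming_3C_rain_plus10": -9.0,
    "co2_350_to_650": +26.0,         # CO2 fertilisation effect
}

def apply_delta(baseline, pct):
    return baseline * (1.0 + pct / 100.0)

baseline = 100.0  # arbitrary units of forage production
print(apply_delta(baseline, deltas["warming_3C"]))  # 79.0
# The +26% CO2 effect roughly offsets the -21% warming effect:
print(apply_delta(apply_delta(baseline, -21.0), +26.0))  # ~99.5
```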
Abstract:
Chytridiomycosis is an emerging infectious disease of amphibians caused by the fungal pathogen Batrachochytrium dendrobatidis, and its role in causing population declines and species extinctions worldwide has created an urgent need for methods to detect it. Several reports indicate that in anurans chytridiomycosis can cause the depigmentation of tadpole mouthparts, but the accuracy of using depigmentation to determine disease status remains uncertain. Our objective was to determine for the Mountain Yellow-legged Frog (Rana muscosa) whether visual inspections of the extent of tadpole mouthpart depigmentation could be used to accurately categorize individual tadpoles or R. muscosa populations as B. dendrobatidis-positive or negative. This was accomplished by assessing the degree of mouthpart depigmentation in tadpoles of known disease status (based on PCR assays). The depigmentation of R. muscosa tadpole mouthparts was associated with the presence of B. dendrobatidis, and this association was particularly strong for upper jaw sheaths. Using a rule that classifies tadpoles with upper jaw sheaths that are 100% pigmented as uninfected and those with jaw sheaths that are <100% pigmented as infected resulted in the infection status of 86% of the tadpoles being correctly classified. By applying this rule to jaw sheath pigmentation scores averaged across all tadpoles inspected per site, we were able to correctly categorize the infection status of 92% of the study populations. Similar research on additional anurans is critically needed to determine how broadly applicable our results for R. muscosa are to other species.
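The pigmentation rule is a simple threshold classifier, and the reported 86% accuracy is the fraction of tadpoles whose rule-based label matches their PCR status. A minimal sketch with invented records (`pcr_positive` is a hypothetical stand-in for the PCR reference result):

```python
# Sketch of the pigmentation classification rule: a tadpole whose upper jaw
# sheath is 100% pigmented is called uninfected; anything less is called
# infected. Example records are invented; pcr_positive stands in for the
# PCR reference test used in the study.
tadpoles = [
    {"pigment_pct": 100, "pcr_positive": False},
    {"pigment_pct": 60,  "pcr_positive": True},
    {"pigment_pct": 100, "pcr_positive": True},   # a misclassification
    {"pigment_pct": 85,  "pcr_positive": True},
]

def predict_infected(pigment_pct):
    return pigment_pct < 100

correct = sum(predict_infected(t["pigment_pct"]) == t["pcr_positive"] for t in tadpoles)
accuracy = correct / len(tadpoles)
print(accuracy)  # 0.75 in this toy example (the study reports 86%)
```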
Abstract:
The genetics of heifer performance in tropical 'wet' and 'dry' seasons, and relationships with steer performance, were studied in Brahman (BRAH) and Tropical Composite (TCOMP) (50% Bos indicus, African Sanga or other tropically adapted Bos taurus; 50% non-tropically adapted Bos taurus) cattle of northern Australia. Data were from 2159 heifers (1027 BRAH, 1132 TCOMP), representing 54 BRAH and 51 TCOMP sires. Heifers were assessed after post-weaning 'wet' (ENDWET) and 'dry' (ENDDRY) seasons. Steers were assessed post-weaning, at feedlot entry, over a 70-day feed test, and after ~120-day finishing. Measures studied in both heifers and steers were liveweight (LWT), scanned rump fat, rib fat and M. longissimus area (SEMA), body condition score (CS), hip height (HH), serum insulin-like growth factor-I concentration (IGF-I), and average daily gains (ADG). Additional steer measures were scanned intra-muscular fat%, flight time, and daily (DFI) and residual feed intake (RFI). Uni- and bivariate analyses were conducted for combined genotypes and for individual genotypes. Genotype means were predicted for a subset of data involving 34 BRAH and 26 TCOMP sires. A meta-analysis of genetic correlation estimates examined how these were related to the difference between measurement environments for specific traits. There were genotype differences at the level of means, variances and genetic correlations. BRAH heifers were significantly (P < 0.05) faster-growing in the 'wet' season, slower-growing in the 'dry' season, lighter at ENDDRY, and taller and fatter with greater CS and IGF-I at both ENDWET and ENDDRY. Heritabilities were generally in the 20 to 60% range for both genotypes. Phenotypic and genetic variances, and genetic correlations, were commonly lower for BRAH. Differences were often explained by the long period of tropical adaptation of B. indicus.
Genetic correlations were high between corresponding measures at ENDWET and ENDDRY, positive between fat and muscle measures in TCOMP but negative in BRAH (mean of 13 estimates 0.50 and -0.19, respectively), and approximately zero between steer feedlot ADG and heifer ADG in BRAH. Numerous genetic correlations between heifers and steers differed substantially from unity, especially in BRAH, suggesting there may be scope to select differently in the sexes where that would aid the differing roles of heifers and steers in production. Genetic correlations declined as measurement environments became more different, the rates of decline (environment sensitivity) sometimes differing with genotype. Similar measures (LWT, HH and ADG; IGF-I at ENDWET in TCOMP) were genetically correlated with steer DFI in heifers as in steers. Heifer SEMA was genetically correlated with steer feedlot RFI in BRAH (0.75 +/- 0.27 at ENDWET, 0.66 +/- 0.24 at ENDDRY). Selection to reduce steer RFI would reduce SEMA in BRAH heifers but otherwise have only small effects on heifers before their first joining.
Abstract:
Numerous tests have been used to measure beef cattle temperament, but limited research has addressed the relationship between such tests and whether temperament can be modified. One hundred and forty-four steers were given one of three human handling and yarding experiences on six occasions during a 12-month grazing period post-weaning (backgrounding): Good handling/yarding, Poor handling/yarding and Minimal handling/yarding. At the end of this phase the cattle were lot-fed for 78 days, with no handling/yarding treatments imposed, before being transported for commercial slaughter. Temperament was assessed at the start of the experiment, during backgrounding and lot-feeding by flight speed (FS) and a fear-of-humans test, which measured the proximity to a stimulus person (zone average; ZA), the closest approach to the person (CA) and the amount the cattle moved around the test arena (total transitions; TT). During backgrounding, FS decreased for all treatments and at the end of backgrounding there was no difference between them. The rate of decline, however, was greatest in the Good group, smallest in the Minimal group with the Poor intermediate. In contrast, ZA was affected by treatment, with a greater reduction for the Good group than the others (P = 0.012). During lot-feeding, treatment did not affect FS, but all groups showed a decrease in ZA, with the greatest change in the Poor group, the least in the Good and the Minimal intermediate (P = 0.052). CA was positively correlated with ZA (r = 0.18 to 0.66) and negatively with TT (r = -0.180 to -0.659). FS was consistently correlated with TT only (r = 0.17 to 0.49). These findings suggest that FS and TT measure a similar characteristic, as do ZA and CA, but that these characteristics are different from one another, indicating that temperament is not a unitary trait, but has different facets. FS and TT measure one facet that we suggest is general agitation, whilst ZA and CA measure fear of people.
Thus, the cattle became less agitated during backgrounding, but the effect was not permanently influenced by the quantity and quality of handling/yarding. However, Good handling/yarding reduced fearfulness of people. Fear of people was also reduced during lot-feeding, probably as a consequence of frequent exposure to humans in a situation that was neutral or positive for the cattle.