89 results for Arid tropical environment
in eResearch Archive - Queensland Department of Agriculture
Abstract:
The influence of grazing management on total soil organic carbon (SOC) and soil total nitrogen (TN) in tropical grasslands is an issue of considerable ecological and economic interest. Here we have used linear mixed models to investigate the effect of grazing management on stocks of SOC and TN in the top 0.5 m of the soil profile. The study site was a long-term pasture utilization experiment, 26 years after the experiment was established for sheep grazing on native Mitchell grass (Astrebla spp.) pasture in northern Australia. The pasture utilization rates were between 0% (exclosure) and 80%, assessed visually. We found that a significant amount of TN had been lost from the top 0.1 m of the soil profile as a result of grazing, with 80% pasture utilization resulting in a loss of 84 kg ha−1 over the 26-year period. There was no significant effect of pasture utilization rate on TN when greater soil depths were considered. There was no significant effect of pasture utilization rate on stocks of SOC and soil particulate organic carbon (POC), or the C:N ratio at any depth; however, visual trends in the data suggested some agreement with the literature, whereby increased grazing pressure appeared to: (i) decrease SOC and POC stocks; and, (ii) increase the C:N ratio. Overall, the statistical power of the study was limited, and future research would benefit from a more comprehensive sampling scheme. Previous studies at the site have found that a pasture utilization rate of 30% is sustainable for grazing production on Mitchell grass; however, given our results, we conclude that N inputs (possibly through management of native N2-fixing pasture legumes) should be made for long-term maintenance of soil health, and pasture productivity, within this ecosystem.
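The analysis described above is a standard mixed-model setup. As a minimal, hedged sketch (the file name, column names and single-depth subset are assumptions, not the study's actual data or code), the utilisation effect on topsoil TN stocks could be fitted along these lines:

```python
# Minimal sketch of a linear mixed model for the grazing-utilisation effect,
# assuming hypothetical columns: plot, utilisation_pct (0-80), depth_m,
# tn_kg_ha (total N stock). Not the study's actual data or code.
import pandas as pd
import statsmodels.formula.api as smf

soil = pd.read_csv("mitchell_grass_soil.csv")   # hypothetical input file
top = soil[soil["depth_m"] <= 0.1]              # top 0.1 m layer only

# Fixed effect of pasture utilisation rate; plot as a random intercept
# to account for multiple cores taken within each plot.
model = smf.mixedlm("tn_kg_ha ~ utilisation_pct", data=top, groups=top["plot"])
fit = model.fit()
print(fit.summary())  # a significant negative slope would match the reported TN loss
```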
Abstract:
Better understanding of seed-bank dynamics of Echinochloa colona, Urochloa panicoides and Hibiscus trionum, major crop weeds in sub-tropical Australia, was needed to improve weed control. Emergence patterns and seed persistence were investigated, with viable seeds sown at different depths in large in-ground pots. Seedlings of all species emerged between October and March when mean soil temperatures were 21-23°C. However, E. colona emerged as a series of flushes predominantly in the first year, with most seedlings emerging from 0-2 cm. Urochloa panicoides emerged mostly as a single large flush in the first two years, with most seedlings emerging from 5 cm. Hibiscus trionum emerged as a series of flushes over three seasons, initially with the majority from 5 cm and then from 0-2 cm in the later seasons. Longevity of the grass seeds was short, with <5% remaining after burial at 0-2 cm for 24 months. In contrast, 38% of H. trionum seeds remained viable after the same period. Persistence of all species increased significantly with burial depth. These data highlight that management strategies need to be tailored for each species, particularly relating to the need for monitoring, application times for control tactics, impact of tillage, and time needed to reduce the seed-bank to low numbers.
Abstract:
Castration of cattle using rubber rings is becoming increasingly popular due to the perceived ease of the procedure and greater operator safety when compared with surgical castration. Few comparative studies have investigated the effects of different castration methods and calf age on welfare outcomes, particularly in a tropical environment. Thirty Belmont Red (a tropically adapted breed), 3-month-old (liveweight 71–119 kg) and 30, 6-month-old (liveweight 141–189 kg) calves were assigned to a two age × three castration (surgical, ring and sham) treatment factorial study (Surg3, Surg6, Ring3, Ring6, Sham3 and Sham6, n = 10 for each treatment group). Welfare outcomes were assessed post-castration using: behaviour for 2 weeks; blood parameters (cortisol and haptoglobin concentrations) to 4 weeks; wound healing to 5 weeks; and liveweights to 6 weeks. More Surg calves struggled during castration compared with Sham and Ring (P < 0.05, 90 ± 7% vs. 20 ± 9% and 24 ± 10%) and performed more struggles (1.9 ± 0.2, 1.1 ± 0.3 and 1.1 ± 0.3 for Surg, Sham and Ring, respectively), suggesting that surgical castration caused most pain during performance of the procedure. A significant (P < 0.05) time × castration method × age interaction for plasma cortisol revealed that concentrations decreased most rapidly in Sham; the Ring6 calves failed to show reduced cortisol concentrations at 2 h post-castration, unlike other treatment groups. By 7 h post-castration, all treatment groups had similar concentrations. A significant (P < 0.01) interaction between time and castration method showed that haptoglobin concentrations increased slightly to 0.89 and 0.84 mg/mL for Surg and Ring, respectively over the first 3 days post-castration. Concentrations for Surg then decreased to levels similar to Sham by day 21 and, although concentrations for Ring decreased on day 7 to 0.76 mg/mL, they increased significantly on day 14 to 0.97 mg/mL before reducing to concentrations similar to the other groups (0.66 mg/mL) by day 21. Significantly (P < 0.05) more of the wounds of the 3-month compared with the 6-month calves scored as ‘healed’ at day 7 (74% vs. 39%), while more (P = 0.062) of the Surg than Ring scored as ‘healed’ at day 21 (60% vs. 29%). At day 14 there were significantly (P < 0.05) fewer healed wounds in Ring6 compared with other treatment groups (13% vs. 40–60%). Liveweight gain was significantly (P < 0.05) greater in 3-month (0.53 kg/day) than in 6-month calves (0.44 kg/day) and in Sham calves (P < 0.001, 0.54 kg/day), than in Ring (0.44 kg/day) and Surg (0.48 kg/day) calves. Overall, welfare outcomes were slightly better for Surg than Ring calves due to reduced inflammation and faster wound healing, with little difference between age groups.
Abstract:
With the aim of increasing peanut production in Australia, the Australian peanut industry has recently considered growing peanuts in rotation with maize at Katherine in the Northern Territory, a location with a semi-arid tropical climate and surplus irrigation capacity. We used the well-validated APSIM model to examine potential agronomic benefits and long-term risks of this strategy under the current and warmer climates of the new region. Yield of the two crops, irrigation requirement, total soil organic carbon (SOC), nitrogen (N) losses and greenhouse gas (GHG) emissions were simulated. Sixteen climate stressors were used; these were generated by using global climate models ECHAM5, GFDL2.1, GFDL2.0 and MRIGCM232 with a median sensitivity under two emission scenarios from the Special Report on Emissions Scenarios over the 2030 and 2050 timeframes, plus the current climate (baseline) for Katherine. Effects were compared at three levels of irrigation and three levels of N fertiliser applied to maize grown in rotations of wet-season peanut and dry-season maize (WPDM), and wet-season maize and dry-season peanut (WMDP). The climate stressors projected average temperature increases of 1°C to 2.8°C in the dry (baseline 24.4°C) and wet (baseline 29.5°C) seasons for the 2030 and 2050 timeframes, respectively. Increased temperature caused a reduction in yield of both crops in both rotations. However, the overall yield advantage of WPDM increased from 41% to up to 53% compared with the industry-preferred sequence of WMDP under the worst climate projection. Increased temperature increased the irrigation requirement by up to 11% in WPDM, but caused a smaller reduction in total SOC accumulation and smaller increases in N losses and GHG emissions compared with WMDP. We conclude that although increased temperature will reduce productivity and total SOC accumulation, and increase N losses and GHG emissions in Katherine or similar northern Australian environments, the WPDM sequence should be preferred over the industry-preferred sequence because of its overall yield and sustainability advantages in warmer climates. Any limitations of irrigation resulting from climate change could, however, limit these advantages.
Abstract:
Objective To identify measures that most closely relate to hydration in healthy Brahman-cross neonatal calves that experience milk deprivation. Methods In a dry tropical environment, eight neonatal Brahman-cross calves were prevented from suckling for 2–3 days during which measurements were performed twice daily. Results Mean body water, as estimated by the mean urea space, was 74 ± 3% of body weight at full hydration. The mean decrease in hydration was 7.3 ± 1.1% per day. The rate of decrease was more than three-fold higher during the day than at night. At an ambient temperature of 39°C, the decrease in hydration averaged 1.1% hourly. Measures that were most useful in predicting the degree of hydration in both simple and multiple-regression prediction models were body weight, hindleg length, girth, ambient and oral temperatures, eyelid tenting, alertness score and plasma sodium. These parameters are different to those recommended for assessing calves with diarrhoea. Single-measure predictions had a standard error of at least 5%, which reduced to 3–4% if multiple measures were used. Conclusion We conclude that simple assessment of non-suckling Brahman-cross neonatal calves can estimate the severity of dehydration, but the estimates are imprecise. Dehydration in healthy neonatal calves that do not have access to milk can exceed 20% (>15% weight loss) in 1–3 days under tropical conditions and at this point some are unable to recover without clinical intervention.
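For illustration only, the multiple-regression predictor described above might be built as in the following sketch; the column names and input file are hypothetical, not the study's data or code.

```python
# Illustrative multiple-regression predictor of hydration (% of full hydration),
# using the kinds of measures the abstract lists as most useful.
# Column names and input file are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

calves = pd.read_csv("calf_hydration.csv")
formula = ("hydration_pct ~ body_weight_kg + hindleg_length_cm + girth_cm"
           " + ambient_temp_c + oral_temp_c + eyelid_tent_s"
           " + alertness_score + plasma_na_mmol_l")
fit = smf.ols(formula, data=calves).fit()

print(fit.summary())                 # combined predictor (SE of 3-4% per the abstract)
print(fit.predict(calves.head(1)))   # predicted hydration for one calf
```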
Abstract:
Tillage is defined here in a broad sense, including disturbance of the soil and crop residues, wheel traffic and sowing opportunities. In sub-tropical, semi-arid cropping areas in Australia, tillage systems have evolved from intensively tilled bare fallow systems, with high soil losses, to reduced and no tillage systems. In recent years, the use of controlled traffic has also increased. These conservation tillage systems are successful in reducing water erosion of soil and sediment-bound chemicals. Control of runoff of dissolved nutrients and weakly sorbed chemicals is less certain. Adoption of new practices appears to have been related to practical and economic considerations, and proved to be more profitable after a considerable period of research and development. However there are still challenges. One challenge is to ensure that systems that reduce soil erosion, which may involve greater use of chemicals, do not degrade water quality in streams. Another challenge is to ensure that systems that improve water entry do not increase drainage below the crop root zone, which would increase the risk of salinity. Better understanding of how tillage practices influence soil hydrology, runoff and erosion processes should lead to better tillage systems and enable better management of risks to water quality and soil health. Finally, the need to determine the effectiveness of in-field management practices in achieving stream water quality targets in large, multi-land use catchments will challenge our current knowledge base and the tools available.
Abstract:
In semi-arid sub-tropical areas, a number of studies concerning no-till (NT) farming systems have demonstrated advantages in economic, environmental and soil quality aspects over conventional tillage (CT). However, adoption of continuous NT has contributed to the build-up of herbicide resistant weed populations, increased incidence of soil- and stubble-borne diseases, and stratification of nutrients and organic carbon near the soil surface. Some farmers often resort to an occasional strategic tillage (ST) to manage these problems of NT systems. However, farmers who practice strict NT systems are concerned that even one-time tillage may undo positive soil condition benefits of NT farming systems. We reviewed the pros and cons of the use of occasional ST in NT farming systems. Impacts of occasional ST on agronomy, soil and environment are site-specific and depend on many interacting soil, climatic and management conditions. Most studies conducted in North America and Europe suggest that introducing occasional ST in continuous NT farming systems could improve productivity and profitability in the short term; however in the long-term, the impact is negligible or may be negative. The short term impacts immediately following occasional ST on soil and environment include reduced protective cover, soil loss by erosion, increased runoff, loss of C and water, and reduced microbial activity with little or no detrimental impact in the long-term. A potential negative effect immediately following ST would be reduced plant available water which may result in unreliability of crop sowing in variable seasons. The occurrence of rainfall between the ST and sowing or immediately after the sowing is necessary to replenish soil water lost from the seed zone. Timing of ST is likely to be critical and must be balanced with optimising soil water prior to seeding. The impact of occasional ST varies with the tillage implement used; for example, inversion tillage using mouldboard tillage results in greater impacts as compared to chisel or disc. Opportunities for future research on occasional ST with the most commonly used implements such as tine and/or disc in Australia’s northern grains-growing region are presented in the context of agronomy, soil and the environment.
Abstract:
Analysis of headspace volatiles by gas chromatography/mass spectrometry from king (Penaeus plebejus), banana (P. merguiensis), tiger (P. esculentus/semisulcatus) and greasy (Metapenaeus bennettae) prawns stored in ice or ice slurry, which is effectively an environment of low oxygen tension, indicated the presence of amines at the early stages of storage (less than 8 days) irrespective of the nature of the storage media. Esters were more prevalent in prawns stored on ice (normal oxygen conditions) at the latter stages of storage (more than 8 days) and were only produced by Pseudomonas fragi, whereas sulphides and amines occurred whether the predominant spoilage organism was Ps.fragi or Shewanella putrefaciens. The free amino acid profiles of banana and king prawns were high in arginine (12–14%) and low in cysteine (0.1–0.17%) and methionine (0.1–0.2%). Filter sterilised raw banana prawn broth inoculated with a total of 15 cultures of Ps. fragi and S. putrefaciens and incubated for two weeks at 5°C, showed the presence of 17 major compounds in the headspace volatiles analysed using gas chromatography/mass spectrometry (GC/MS). These were mainly amines, sulphides, ketones and esters. Principal Component Analysis of the results for the comparative levels of the volatiles produced by pure cultures, inoculated into sterile prawn broth, indicated three subgroupings of the organisms; I, Ps. fragi from a particular geographic location; II, S. putrefaciens from another geographic location; and III, a mixture of Ps. fragi and S. putrefaciens from different geographic locations. The sensory impression created by the cultures was strongly related to the chemical profile as determined by GC/MS. Organisms, even within the same subgrouping classified as identical by the usual tests, produced a different range of volatiles in the same uniform substrate.
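The multivariate step reported above (Principal Component Analysis of comparative volatile levels) could be reproduced along the following lines; the table of peak areas and its file name are hypothetical.

```python
# Sketch of PCA on volatile profiles: rows = cultures, columns = GC/MS peak
# areas for the 17 major compounds. Input file is hypothetical.
import pandas as pd
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

volatiles = pd.read_csv("prawn_broth_volatiles.csv", index_col="culture")
X = StandardScaler().fit_transform(volatiles)    # centre and scale each compound

pca = PCA(n_components=2)
scores = pca.fit_transform(X)
print(pca.explained_variance_ratio_)             # variance captured by PC1, PC2

# Plotting or clustering the scores would separate the cultures into the
# subgroupings described above (Ps. fragi vs. S. putrefaciens vs. mixtures).
for culture, (pc1, pc2) in zip(volatiles.index, scores):
    print(f"{culture}: PC1={pc1:.2f}, PC2={pc2:.2f}")
```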
Abstract:
The recent introduction to Australia of superior sheep meat breeds from South Africa provides a basis for improving the quality and amount of sheep meat grown in Queensland's semi-arid area. Alternatively, suitable breeds from existing Australian stocks of dual-purpose and traditional terminal meat sheep may bring the attributes required by the marketplace. There has been no critical assessment of sheep meat breeds suitably adapted to the rangeland environment of western Queensland. Animal production for a consuming world: proceedings of the 9th Congress of the Asian-Australasian Association of Animal Production Societies [AAAP], the 23rd Biennial Conference of the Australian Society of Animal Production [ASAP] and the 17th Annual Symposium of the University of Sydney Dairy Research Foundation [DRF], 2-7 July 2000, Sydney, Australia.
Abstract:
Previous research on P leaf analysis for detecting deficiencies in cotton (Gossypium hirsutum L.) has not considered temperature as a determining factor. This is despite correlations between leaf P content and temperature being observed in other crops. As part of research into a new cotton farming system for the semi-arid tropics of Australia, we conducted two P fertiliser rate experiments on recently cleared un-cropped (bicarbonate P < 5 mg kg−1) and previously cropped (bicarbonate P 26 mg kg−1) soil. They aimed to develop P requirements and, more importantly, to determine whether temperature affects the leaf P concentrations used to diagnose P deficiencies. In 2002, optimal yield on un-cropped, low-P soil was achieved with a 60 kg P ha−1 rate. In 2003, residual P from the 40 kg P ha−1 treatment produced optimal yield. On cropped, high-P soil there was no yield response to treatments up to 100 kg P ha−1. On low-P soil, a positive correlation was observed between P concentration in the youngest fully-unfurled leaf (YFUL), fertiliser rate, and mean diurnal temperature in the seven days prior to sampling. On high-P soil, a positive correlation was observed between YFUL P concentration and mean diurnal temperature; however, there was no correlation with fertiliser rate. These results show that YFUL analysis can be used to diagnose P deficiencies in cotton, provided the temperature prior to sampling is considered.
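As a hedged sketch of the diagnostic relationship reported above (hypothetical column names and input file, not the experiment's code), YFUL P concentration can be regressed on fertiliser rate and pre-sampling temperature separately for the two soils:

```python
# Regress YFUL leaf P on fertiliser rate and mean diurnal temperature in the
# 7 days before sampling, separately for low-P and high-P soils.
# Column names and input file are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

leaves = pd.read_csv("cotton_leaf_p.csv")
# assumed columns: soil ("low_P"/"high_P"), yful_p_pct, p_rate_kg_ha, mean_diurnal_temp_c

for soil_class, subset in leaves.groupby("soil"):
    fit = smf.ols("yful_p_pct ~ p_rate_kg_ha + mean_diurnal_temp_c",
                  data=subset).fit()
    print(soil_class, fit.params, sep="\n")
# Per the abstract, the temperature coefficient should be positive on both
# soils, while the fertiliser-rate coefficient matters only on low-P soil.
```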
Abstract:
Milk obtained from cows on 2 subtropical dairy feeding systems was compared for its suitability for Cheddar cheese manufacture. Cheeses were made in a small-scale cheesemaking plant capable of making 2 blocks (about 2 kg each) of Cheddar cheese concurrently. Its repeatability was tested over 10 separate cheesemaking days, with no significant differences being found between the 2 vats in cheesemaking parameters or cheese characteristics. In the feeding trial, 16 pairs of Holstein-Friesian cows were used in 2 feeding systems (M1, rain-grown tropical grass pastures and oats; and M5, a feedlot based on maize/barley silage and lucerne hay) over 2 seasons (spring and autumn, corresponding to early and late lactation, respectively). Total dry matter, crude protein (kg/cow.day) and metabolisable energy (MJ/cow.day) intakes were 17, 2.7 and 187 for M1 and 24, 4 and 260 for M5, respectively. M5 cows produced higher milk yields and milk with higher protein and casein levels than the M1 cows, but the total solids and fat levels were similar (P > 0.05) for both M1 and M5 cows. The yield and yield efficiency of cheese produced from the 2 feeding systems were also not significantly different. The results suggest that intensive tropical pasture systems can produce milk suitable for Cheddar cheese manufacture when cows are supplemented with a high-energy concentrate. Season and stage of lactation had a much greater effect than feeding system on milk and cheesemaking characteristics, with autumn (late lactation) milk having higher protein and fat contents and producing higher cheese yields.
Abstract:
This special issue of Continental Shelf Research contains 20 papers giving research results produced as part of Australia's Torres Strait Co-operative Research Centre (CRC) Program, which was funded over a three-year period during 2003-2006. Marine biophysical, fisheries, socioeconomic-cultural and extension research in the Torres Strait region of northeastern Australia was carried out to meet three aims: 1) support the sustainable development of marine resources and minimize impacts of resource use in Torres Strait; 2) enhance the conservation of the marine environment and the social, cultural and economic well being of all stakeholders, particularly the Torres Strait peoples; and 3) contribute to effective policy formulation and management decision making. Subjects covered, including commercial and traditional fisheries management, impacts of anthropogenic sediment inputs on seagrass meadows and communication of science results to local communities, have broad applications to other similar environments.
Abstract:
The effects of fertilisers on 8 tropical turfgrasses growing in 100-L bags of sand were studied over winter in Murrumba Downs, just north of Brisbane in southern Queensland (latitude 27.4°S, longitude 153.1°E). The species used were: Axonopus compressus (broad-leaf carpetgrass), Cynodon dactylon (bermudagrass 'Winter Green') and C. dactylon x C. transvaalensis hybrid ('Tifgreen'), Digitaria didactyla (Queensland blue couch), Paspalum notatum (bahiagrass '38824'), Stenotaphrum secundatum (buffalograss 'Palmetto'), Eremochloa ophiuroides (centipedegrass 'Centec') and Zoysia japonica (zoysiagrass 'ZT-11'). Control plots were fertilised with complete fertilisers every month from May to September (72 kg N/ha, 31 kg P/ha, 84 kg K/ha, 48 kg S/ha, 30 kg Ca/ha and 7.2 kg Mg/ha), and unfertilised plots received no fertiliser. Carpetgrass and standard bermudagrass were the most sensitive species to nutrient supply, with lower shoot dry weights in the unfertilised plots (shoots mowed to thatch level) compared with the fertilised plots in June. There were lower shoot dry weights in the unfertilised plots in July for all species, except for buffalograss, centipedegrass and zoysiagrass, and lower shoot dry weights in the unfertilised plots in August for all species, except for centipedegrass. At the end of the experiment in September, unfertilised plots were 11% of the shoot dry weights of fertilised plots, with all species affected. Mean shoot nitrogen concentrations fell from 3.2 to 1.7% in the unfertilised plots from May to August, below the sufficiency range for turfgrasses (2.8-3.5%). There were also declines in P (0.45-0.36%), K (2.4-1.5%), S (0.35-0.25%), Mg (0.24-0.18%) and B (9-6 mg/kg), which were all in the sufficiency range. The shoots in the control plots took up the following levels (kg/ha.month) of nutrients: N, 10.0-27.0; P, 1.6-4.0; K, 8.2-19.8; S, 1.0-4.2; Ca, 1.1-3.3; and Mg, 0.8-2.2, compared with applications (kg/ha.month) of: N, 72; P, 31; K, 84; S, 48; Ca, 30; and Mg, 7.2, indicating a recovery of 14-38% for N, 5-13% for P, 10-24% for K, 2-9% for S, 4-11% for Ca and 11-30% for Mg. These results suggest that buffalograss, centipedegrass and zoysiagrass are less sensitive to low nutrient supply than carpetgrass, bermudagrass, blue couch and bahiagrass. Data on nutrient uptake showed that the less sensitive species required only half or less of the nitrogen required to maintain the growth of the other grasses, indicating potential savings for turf managers in fertiliser costs and the environment in terms of nutrients entering waterways.
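The recovery percentages quoted above follow directly from the uptake and application rates; a quick arithmetic check, using only the figures given in the abstract:

```python
# Recovery (%) = shoot nutrient uptake / nutrient applied x 100, per month.
applied = {"N": 72, "P": 31, "K": 84, "S": 48, "Ca": 30, "Mg": 7.2}  # kg/ha.month
uptake = {  # kg/ha.month, ranges across the 8 grasses
    "N": (10.0, 27.0), "P": (1.6, 4.0), "K": (8.2, 19.8),
    "S": (1.0, 4.2), "Ca": (1.1, 3.3), "Mg": (0.8, 2.2),
}
for nutrient, (low, high) in uptake.items():
    print(f"{nutrient}: {100 * low / applied[nutrient]:.0f}-"
          f"{100 * high / applied[nutrient]:.0f}% recovery")
# Output matches the quoted ranges (Mg computes to ~11-31%, quoted as 11-30%).
```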
Abstract:
Rainfall variability is a challenge to sustainable and profitable cattle production in northern Australia. Strategies recommended to manage for rainfall variability, like light or variable stocking, are not widely adopted. This is due partly to the perception that sustainability and profitability are incompatible. A large, long-term grazing trial was initiated in 1997 in north Queensland, Australia, to test the effect of different grazing strategies on cattle production. These strategies are: (i) constant light stocking (LSR) at long-term carrying capacity (LTCC); (ii) constant heavy stocking (HSR) at twice LTCC; (iii) rotational wet-season spelling (R/Spell) at 1.5 LTCC; (iv) variable stocking (VAR), with stocking rates adjusted in May based on available pasture; and (v) a Southern Oscillation Index (SOI) variable strategy, with stocking rates adjusted in November based on available pasture and SOI seasonal forecasts. Animal performance varied markedly over the 10 years for which data are presented, due to pronounced differences in rainfall and pasture availability. Nonetheless, lighter stocking at or about LTCC consistently gave the best individual liveweight gain (LWG), condition score and skeletal growth; mean LWG per annum was thus highest in the LSR (113 kg), intermediate in the R/Spell (104 kg) and lowest in the HSR (86 kg). Mean LWG was 106 kg in the VAR and 103 kg in the SOI but, in all years, the relative performance of these strategies was dependent upon the stocking rate applied. After 2 years on the trial, steers from lightly stocked strategies were 60-100 kg heavier and received appreciable carcass price premiums at the meatworks compared to those under heavy stocking. In contrast, LWG per unit area was greatest at stocking rates of about twice LTCC; mean LWG/ha was thus greatest in the HSR (21 kg/ha), but this strategy required drought feeding in four of the 10 years and was unsustainable. Although LWG/ha was lower in the LSR (mean 14 kg/ha), or in strategies that reduced stocking rates in dry years like the VAR (mean 18 kg/ha) and SOI (mean 17 kg/ha), these strategies did not require drought feeding and appeared sustainable. The R/Spell strategy (mean 16 kg/ha) was compromised by an ill-timed fire, but also performed satisfactorily. The present results provide important evidence challenging the assumption that sustainable management in a variable environment is unprofitable. Further research is required to fully quantify the long-term effects of these strategies on land condition and profitability and to extrapolate the results to breeder performance at the property level.
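The per-head versus per-hectare trade-off reported above is simple arithmetic (LWG/ha equals LWG per head times stocking rate), so the implied stocking rates can be back-calculated from the quoted means:

```python
# Back-calculate implied stocking rates (head/ha) from the reported means.
reported = {  # strategy: (mean LWG kg/head/year, mean LWG kg/ha/year)
    "LSR": (113, 14), "R/Spell": (104, 16), "HSR": (86, 21),
    "VAR": (106, 18), "SOI": (103, 17),
}
for strategy, (per_head, per_ha) in reported.items():
    print(f"{strategy}: ~{per_ha / per_head:.2f} head/ha implied")
# HSR's implied stocking rate comes out at roughly double LSR's, consistent
# with heavy stocking at about twice the long-term carrying capacity (LTCC).
```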