50 results for DRE DEGREES
Abstract:
Rabbits released in Australia in 1859 spread to most areas of suitable habitat by 1910, causing great damage to the environment and primary industries. Measurement of damage is essential to justify spending money and utilising resources to remove rabbits. Damage to pasture and biodiversity may be irreversible and therefore difficult to measure without comparison with an area that has never suffered such damage. A rabbit-proof fence completed in 1906 protected a large part of south-east Queensland from rabbits. The Darling Downs Moreton Rabbit Board (DDMRB) continues to maintain the fence and keep the area relatively free of rabbits. This area is unique because it is highly suitable for rabbits and yet has never experienced the damage caused by plagues of uncontrolled rabbits. A study site was established where the DDMRB fence separates an area heavily used by rabbits (the ‘dirty side’) from an area that has never been infested by rabbits (the ‘clean side’). The number and location of all rabbit warrens and log piles were recorded. The absence of warrens from the ‘clean side’ shows clearly that the rabbit-proof fence has prevented rabbits from establishing warren systems. The ‘dirty side’ is characterised by a high number of warrens, a high density of rabbits, fewer pasture species and low macropod activity. Future work will determine whether the rabbit populations are viable in the absence of rabbit warrens. We plan to radio-collar rabbits on both sides of the fence to measure their survival rate. In selected warrens and log piles of varying degrees of complexity and size, rabbits will be trapped and information on reproduction and age structure will be collected. This will allow better targeting of the source of rabbits during control operations. Once the initial comparative analysis of the site has been completed, all rabbit warrens will be destroyed on the ‘dirty side’ of the fence. After rabbits are removed from this area, monitoring will continue to determine whether pasture and biodiversity on opposite sides of the fence begin to mirror each other.
Abstract:
When recapturing satellite-collared wild dogs that had been trapped one month previously in padded foothold traps, we noticed varying degrees of pitting on the pads of their trapped paw. Veterinary advice, based on images taken of the injuries, suggested that the necrosis was caused by vascular compromise. Five of the six dingoes we recaptured had varying degrees of necrosis restricted to the trapped foot, ranging from single 5 mm holes to 25% sections of the toe pads missing or deformed, including loss of nails. The traps used were rubber-padded, two-coiled Victor Soft Catch #3 traps. The springs were not standard Victor springs but Beefer springs; this modification slightly increases trap speed and the jaw pressure on the trapped foot. Despite this modification, the spring pressure is still relatively mild in comparison to conventional long-spring or four-coiled wild dog traps. The five wild dogs that developed necrosis were trapped in November 2006 at 5-6 months of age. Traps were checked each morning, so the dogs were unlikely to have been restrained in the trap for more than 12 hours. All dogs exhibited a small degree of paw damage at capture, which presented as a swollen paw and compression at the capture point. In contrast, eight wild dogs, 7-8 months old, were captured two months later in February. Upon their release, on advice from a veterinarian, we massaged the trapped foot to get blood flow back into it and applied a bruise treatment (Heparinoid 8.33 mg/ml) to assist in restoring blood flow. These animals were subsequently recaptured several months later and showed no signs of necrosis. While post-capture foot injuries are unlikely to be an issue in conventional control programs where the animal is immediately destroyed, caution needs to be used when releasing accidentally captured domestic dogs or research animals captured in rubber-padded traps. We have demonstrated that 7-8 month-old dogs can be trapped and released without any evidence of subsequent necrosis following minimal veterinary treatment. We suspect that the rubber padding on traps may increase the tourniquet effect by wrapping around the paw and recommend evaluation of offset laminated steel-jaw traps as an alternative. Offset laminated steel-jaw traps have been shown to be relatively humane, producing as few foot injuries as rubber-jawed traps.
Abstract:
Grazing is a major land use in Australia's rangelands. The 'safe' livestock carrying capacity (LCC) required to maintain resource condition is strongly dependent on climate. We reviewed: the approaches for quantifying LCC; current trends in climate and their effect on components of the grazing system; implications of the 'best estimates' of climate change projections for LCC; the agreement and disagreement between the current trends and projections; and the adequacy of current models of forage production in simulating the impact of climate change. We report the results of a sensitivity study of climate change impacts on forage production across the rangelands, and we discuss the more general issues facing grazing enterprises associated with climate change, such as 'known uncertainties' and adaptation responses (e.g. use of climate risk assessment). We found that the method of quantifying LCC from a combination of estimates (simulations) of long-term (>30 years) forage production and successful grazier experience has been well tested across northern Australian rangelands with different climatic regions. This methodology provides a sound base for the assessment of climate change impacts, even though there are many identified gaps in knowledge. The evaluation of current trends indicated substantial differences in the trends of annual rainfall (and simulated forage production) across Australian rangelands with general increases in most of western Australian rangelands (including northern regions of the Northern Territory) and decreases in eastern Australian rangelands and south-western Western Australia. Some of the projected changes in rainfall and temperature appear small compared with year-to-year variability. Nevertheless, the impacts on rangeland production systems are expected to be important in terms of required managerial and enterprise adaptations. Some important aspects of climate systems science remain unresolved, and we suggest that a risk-averse approach to rangeland management, based on the 'best estimate' projections, in combination with appropriate responses to short-term (1-5 years) climate variability, would reduce the risk of resource degradation. Climate change projections - including changes in rainfall, temperature, carbon dioxide and other climatic variables - if realised, are likely to affect forage and animal production, and ecosystem functioning. The major known uncertainties in quantifying climate change impacts are: (i) carbon dioxide effects on forage production, quality, nutrient cycling and competition between life forms (e.g. grass, shrubs and trees); and (ii) the future role of woody plants including effects of fire, climatic extremes and management for carbon storage. In a simple example of simulating climate change impacts on forage production, we found that increased temperature (3 degrees C) was likely to result in a decrease in forage production for most rangeland locations (e.g. -21% calculated as an unweighted average across 90 locations). The increase in temperature exacerbated or reduced the effects of a 10% decrease/increase in rainfall respectively (-33% or -9%). Estimates of the beneficial effects of increased CO2 (from 350 to 650 ppm) on forage production and water use efficiency indicated enhanced forage production (+26%). The increase was approximately equivalent to the decline in forage production associated with a 3 degrees C temperature increase.
The large magnitude of these opposing effects emphasised the importance of the uncertainties in quantifying the impacts of these components of climate change. We anticipate decreases in LCC given that the 'best estimate' of climate change across the rangelands is for a decline (or little change) in rainfall and an increase in temperature. As a consequence, we suggest that public policy have regard for the implications for livestock enterprises, regional communities, potential resource damage, animal welfare and human distress. However, the capability to quantify these warnings is yet to be developed, and this important task remains a challenge for rangeland and climate systems science.
Abstract:
Listeria and Salmonella are important foodborne pathogens normally associated with the shrimp production chain. This study investigated the potential of Salmonella Typhimurium, Salmonella Senftenberg, and Listeria monocytogenes (Scott A and V7) to attach to and colonize shrimp carapace. Attachment and colonization of Listeria and Salmonella were demonstrated. Shrimp abdominal carapaces showed higher levels of bacterial attachment (P < 0.05) than did head carapaces. Listeria consistently exhibited greater attachment (P < 0.05) than did Salmonella on all surfaces. Chitinase activity of all strains was tested and found not to occur at the three temperatures (10, 25, and 37 degrees C) tested. The surface physicochemical properties of bacterial cells and shrimp carapace were studied to determine their role in attachment and colonization. Salmonella had a significantly (P < 0.05) more positive cell surface charge (-3.9 and -6.0 mV) than Listeria (-18 and -22.8 mV). Both bacterial species were found to be hydrophilic (<35%) when measured by the bacterial adherence to hydrocarbon method and by contact angle (theta) measurements (Listeria, 21.3 and 24.8 degrees, and Salmonella, 14.5 and 18.9 degrees). The percentage of cells retained by Phenyl-Sepharose was lower for Salmonella (12.8 to 14.8%) than for Listeria (26.5 to 31.4%). The shrimp carapace was found to be hydrophobic (theta = 74.5 degrees), and a significant (P < 0.05) difference in surface roughness between carapace types was noted. There was a linear correlation between bacterial cell surface charge (r(2) = 0.95) and hydrophobicity (r(2) = 0.85) and initial attachment (P < 0.05) of Listeria and Salmonella to carapaces. However, the same properties could not be related to subsequent colonization.
Abstract:
Rabbit Haemorrhagic Disease Virus (RHDV) was introduced to Australia in 1995 for the control of wild rabbits. Initial outbreaks greatly reduced rabbit numbers and the virus has continued to control rabbits to varying degrees in different parts of Australia. However, recent field evidence suggests that the virus may be becoming less effective in those areas that have previously experienced repeated epizootics causing high mortality. There are also reports of rabbits returning to pre-1995 density levels. Virus and host can be expected to co-evolve: the host will develop resistance to the virus, with the virus subsequently changing to overcome that resistance. It has been 12 years since the release of RHDV and it is an opportune time to examine where the dynamic between RHDV and rabbits currently stands. Laboratory challenge tests have indicated that resistance to RHDV has developed to different degrees in populations throughout Australia. In one population a low dose (1:25 dilution) of Czech strain RHDV failed to infect a single susceptible rabbit, yet it infected a low to high percentage (up to 73%) of rabbits across the other populations tested. Different selection pressures are present in these populations and will be driving the levels of resistance being seen. The mechanisms and genetics behind the development of resistance are also important, as the ongoing use of RHDV as a control tool in the management of rabbits relies on our understanding of the factors influencing the efficacy of the virus. Understanding how resistance has developed may provide clues on how best to use the virus to circumvent these mechanisms. Similarly, it will help in managing populations that have yet to develop high levels of resistance.
Abstract:
Climate change projections for Australia predict increasing temperatures, changes to rainfall patterns, and elevated atmospheric carbon dioxide (CO2) concentrations. The aims of this study were to predict plant production responses to elevated CO2 concentrations using the SGS Pasture Model and DairyMod, and then to quantify the effects of climate change scenarios for 2030 and 2070 on predicted pasture growth, species composition, and soil moisture conditions of 5 existing pasture systems in climates ranging from cool temperate to subtropical, relative to a historical baseline. Three future climate scenarios were created for each site by adjusting historical climate data according to temperature and rainfall change projections for 2030, 2070 mid- and 2070 high-emission scenarios, using output from the CSIRO Mark 3 global climate model. In the absence of other climate changes, mean annual pasture production at an elevated CO2 concentration of 550 ppm was predicted to be 24-29% higher than at 380 ppm CO2 in temperate (C-3) species-dominant pastures in southern Australia, with lower mean responses in a mixed C-3/C-4 pasture at Barraba in northern New South Wales (17%) and in a C-4 pasture at Mutdapilly in south-eastern Queensland (9%). In the future climate scenarios at the Barraba and Mutdapilly sites in subtropical and subhumid climates, respectively, where climate projections indicated warming of up to 4.4 degrees C, with little change in annual rainfall, modelling predicted increased pasture production and a shift towards C-4 species dominance. In Mediterranean, temperate, and cool temperate climates, climate change projections indicated warming of up to 3.3 degrees C, with annual rainfall reduced by up to 28%. Under future climate scenarios at Wagga Wagga, NSW, and Ellinbank, Victoria, our study predicted increased winter and early spring pasture growth rates, but this was counteracted by a predicted shorter spring growing season, with annual pasture production higher than the baseline under the 2030 climate scenario, but reduced by up to 19% under the 2070 high scenario. In a cool temperate environment at Elliott, Tasmania, annual production was higher than the baseline in all 3 future climate scenarios, but highest in the 2070 mid scenario. At the Wagga Wagga, Ellinbank, and Elliott sites the effect of rainfall declines on pasture production was moderated by a predicted reduction in drainage below the root zone and, at Ellinbank, the use of deeper rooted plant systems was shown to be an effective adaptation to mitigate some of the effect of lower rainfall.
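The abstract describes building the 2030 and 2070 scenarios by adjusting historical climate data according to projected temperature and rainfall changes. Below is a minimal sketch of one such "change factor" adjustment; the function names, the perturbation scheme (additive for temperature, multiplicative for rainfall) and the delta values are illustrative assumptions, not the paper's site-specific method or CSIRO Mark 3 outputs.

```python
# Hedged sketch of a simple "change factor" perturbation of daily climate data,
# one plausible way to build the 2030/2070 scenarios described above.
# The exact adjustment scheme and all values below are assumptions, not
# the paper's site-specific CSIRO Mark 3 projections.

from dataclasses import dataclass

@dataclass
class ScenarioDelta:
    temp_increase_c: float      # projected warming added to daily temperatures
    rainfall_change_pct: float  # projected % change applied to daily rainfall

# Placeholder deltas for illustration only.
SCENARIOS = {
    "2030":      ScenarioDelta(temp_increase_c=1.0, rainfall_change_pct=-5.0),
    "2070_mid":  ScenarioDelta(temp_increase_c=2.0, rainfall_change_pct=-15.0),
    "2070_high": ScenarioDelta(temp_increase_c=3.3, rainfall_change_pct=-28.0),
}

def perturb_daily_record(tmax_c, tmin_c, rain_mm, delta: ScenarioDelta):
    """Apply an additive temperature shift and a multiplicative rainfall
    scaling to one day of historical climate data."""
    scale = 1.0 + delta.rainfall_change_pct / 100.0
    return (tmax_c + delta.temp_increase_c,
            tmin_c + delta.temp_increase_c,
            rain_mm * scale)

# Example: perturb one historical day under the 2070 high-emission scenario.
print(perturb_daily_record(24.0, 11.0, 6.2, SCENARIOS["2070_high"]))
```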
Abstract:
Glucosinolates are sulphur-containing glycosides found in brassicaceous plants that can be hydrolysed enzymatically by plant myrosinase or non-enzymatically to form primarily isothiocyanates and/or simple nitriles. From a human health perspective, isothiocyanates are quite important because they are major inducers of carcinogen-detoxifying enzymes. Two of the most potent inducers are benzyl isothiocyanate (BITC) present in garden cress (Lepidium sativum), and phenylethyl isothiocyanate (PEITC) present in watercress (Nasturtium officinale). Previous studies on these salad crops have indicated that significant amounts of simple nitriles are produced at the expense of the isothiocyanates. These studies also suggested that nitrile formation may occur by different pathways: (1) under the control of specifier protein in garden cress and (2) by an unspecified, non-enzymatic path in watercress. In an effort to understand more about the mechanisms involved in simple nitrile formation in these species, we analysed their seeds for specifier protein and myrosinase activities, endogenous iron content and glucosinolate degradation products after addition of different iron species, specific chelators and various heat treatments. We confirmed that simple nitrile formation was predominantly under specifier protein control (thiocyanate-forming protein) in garden cress seeds. Limited thermal degradation of the major glucosinolate, glucotropaeolin (benzyl glucosinolate), occurred when seed material was heated to >120 degrees C. In the watercress seeds, however, we show for the first time that gluconasturtiin (phenylethyl glucosinolate) undergoes a non-enzymatic, iron-dependent degradation to a simple nitrile. On heating the seeds to 120 degrees C or greater, thermal degradation of this heat-labile glucosinolate increased simple nitrile levels many fold.
Abstract:
Time to first root in cuttings varies under different environmental conditions, and understanding these differences is critical for optimizing propagation of commercial forestry species. Temperature environment (15, 25, 30 or 35 +/- 2 degrees C) had no effect on the cellular stages of root formation in the Slash x Caribbean Pine hybrid over 16 weeks, as determined by histology. Initially callus cells formed in the cortex, then tracheids developed and formed primordia leading to external roots. However, speed of development followed a growth curve, with the fastest development occurring at 25 degrees C and the slowest at 15 degrees C, with rooting percentages at week 12 of 80 and 0% respectively. Cutting survival was good in the three cooler temperature regimes (>80%) but reduced to 59% at 35 degrees C. Root formation appeared to be dependent on the initiation of tracheids, because all un-rooted cuttings had callus tissue but no tracheids, irrespective of temperature treatment and clone.
Abstract:
When exposed to hot (22-35 degrees C) and dry climatic conditions in the field during the final 4-6 weeks of pod filling, peanuts (Arachis hypogaea L.) can accumulate highly carcinogenic and immuno-suppressing aflatoxins. Forecasting of the risk posed by these conditions can assist in minimizing pre-harvest contamination. A model was therefore developed as part of the Agricultural Production Systems Simulator (APSIM) peanut module, which calculated an aflatoxin risk index (ARI) using four temperature response functions when fractional available soil water was <0.20 and the crop was in the last 0.40 of the pod-filling phase. ARI explained 0.95 (P <= 0.05) of the variation in aflatoxin contamination, which varied from 0 to c. 800 mu g/kg in 17 large-scale sowings in tropical and four sowings in sub-tropical environments carried out in Australia between 13 November and 16 December 2007. ARI also explained 0.96 (P <= 0.01) of the variation in the proportion of aflatoxin-contaminated loads (>15 mu g/kg) of peanuts in the Kingaroy region of Australia during the period between the 1998/99 and 2007/08 seasons. Simulation of ARI using historical climatic data from 1890 to 2007 indicated a three-fold increase in its value since 1980 compared to the entire previous period. The increase was associated with increases in ambient temperature and decreases in rainfall. To facilitate routine monitoring of aflatoxin risk by growers in near real time, a web interface of the model was also developed. The ARI predicted using this interface for eight growers correlated significantly with the level of contamination in crops (r = 0.95, P <= 0.01). These results suggest that ARI simulated by the model is a reliable indicator of aflatoxin contamination that can be used in aflatoxin research as well as a decision-support tool to monitor pre-harvest aflatoxin risk in peanuts.
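As a rough illustration of the rule stated above (risk accrues only when fractional available soil water is below 0.20 and the crop is in the last 0.40 of the pod-filling phase), the sketch below accumulates a daily index. The APSIM peanut module's four temperature response functions are not given in the abstract, so a single assumed triangular response stands in for them; the cardinal temperatures and all names are hypothetical.

```python
# Hedged sketch of how a daily aflatoxin risk index (ARI) might be accumulated.
# The single triangular temperature response below is a placeholder with
# assumed cardinal temperatures, not the published APSIM functions.

def temp_response(mean_temp_c, t_min=22.0, t_opt=30.0, t_max=38.0):
    """Assumed 0-1 triangular temperature response (illustrative only)."""
    if mean_temp_c <= t_min or mean_temp_c >= t_max:
        return 0.0
    if mean_temp_c <= t_opt:
        return (mean_temp_c - t_min) / (t_opt - t_min)
    return (t_max - mean_temp_c) / (t_max - t_opt)

def accumulate_ari(daily_records):
    """daily_records: iterable of (mean_temp_c, fractional_asw, pod_fill_fraction),
    where pod_fill_fraction runs from 0 at the start to 1 at the end of pod filling."""
    ari = 0.0
    for mean_temp_c, fractional_asw, pod_fill_fraction in daily_records:
        # Risk only accrues on dry days in the last 0.40 of pod filling.
        if fractional_asw < 0.20 and pod_fill_fraction >= 0.60:
            ari += temp_response(mean_temp_c)
    return ari

# Example: three hot, dry days late in pod filling contribute to the index.
days = [(31.0, 0.15, 0.70), (33.0, 0.10, 0.80), (28.0, 0.25, 0.90)]
print(accumulate_ari(days))
```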
Abstract:
Raw data from SeaScan™ transects off Wide Bay (south Queensland) taken in August 2007 as part of a study of ecological factors influencing the distribution of spanner crabs (Ranina ranina). The dataset (comma-delimited ascii file) comprises the following fields:
1. record number
2. date-time (GMT)
3. date-time (AEST)
4. latitude (signed decimal degrees)
5. longitude (decimal degrees)
6. speed over ground (knots)
7. depth (m)
8. seabed roughness (v)
9. hardness (v)
Indices of roughness and hardness (from the first and second echoes respectively) were obtained using a SeaScan™ 100 system (un-referenced) on board the Research Vessel Tom Marshall, with the ship’s Furuno FCV 1100 echo sounder and 1 kW, 50 kHz transducer. Generally vessel speed was kept below about 14 kt (typically ~12 kt), and the echo-sounder range set to 80 m. The data were filtered to remove errors due to data drop-out, straying beyond system depth limits (min. 10 m), or transducer interference.
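A minimal sketch of reading and filtering the comma-delimited file described above. The field order follows the listing; the file name, the absence of a header row, and treating drop-outs as rows with unparseable numeric fields are assumptions made for illustration.

```python
# Hedged sketch of reading the comma-delimited SeaScan transect file described
# above. Field names follow the dataset listing; filtering criteria here are
# simplified assumptions, not the study's exact quality-control procedure.

import csv

FIELDS = ["record", "datetime_gmt", "datetime_aest", "lat_deg", "lon_deg",
          "speed_kt", "depth_m", "roughness_v", "hardness_v"]

def read_seascan(path, min_depth_m=10.0):
    """Yield records as dicts, skipping rows shallower than the stated minimum
    system depth or with unparseable numeric fields (data drop-outs)."""
    with open(path, newline="") as f:
        for row in csv.reader(f):
            if len(row) != len(FIELDS):
                continue  # malformed row
            rec = dict(zip(FIELDS, row))
            try:
                depth = float(rec["depth_m"])
            except ValueError:
                continue  # data drop-out
            if depth < min_depth_m:
                continue  # beyond system depth limits
            yield rec

# Example usage (hypothetical file name):
# for rec in read_seascan("seascan_wide_bay_aug2007.csv"):
#     print(rec["lat_deg"], rec["lon_deg"], rec["roughness_v"])
```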
Abstract:
Pseudocercospora macadamiae causes husk spot of macadamia. Husk spot control would be improved by verifying the stages in fruit development susceptible to infection and determining some of the climatic conditions likely to lead to high disease pressure periods in the field. Our results showed that the percentage of conidia germinating and the growth of germ tubes and mycelia of P. macadamiae were greatest at 26 degrees C, with better conidia germination associated with high relative humidity and free water. The exposure of match-head-sized and pea-sized fruit stages to natural P. macadamiae inoculum in the field led to 2.5-fold increases in husk spot incidence, and up to 8.5-fold increases in premature abscission, compared with unexposed fruit. Exposure of fruit stages later than match-head-sized and pea-sized fruit generally caused no further increases in disease incidence or premature abscission. Climatic conditions were found to have a strong influence on the behaviour of P. macadamiae, the host, oil accumulation, and the subsequent impact of husk spot on premature abscission. Our findings suggest that fungicide application should target fruit at the match-head-sized stage of development in order to best reduce yield losses, particularly in seasons where oil accumulation in fruit is prolonged and climatic conditions are optimal for P. macadamiae.
Abstract:
This paper quantifies gaseous N losses due to ammonia volatilisation and denitrification under controlled conditions at 30 degrees C and 75% to 150% of Field Capacity (FC). Biosolids were mixed with two contrasting soils from subtropical Australia at a rate designed to meet crop N requirements for irrigated cotton or maize (i.e., equivalent to 180 kg N ha(-1)). In the first experiment, aerobically (AE) and anaerobically (AN) digested biosolids were mixed into a heavy Vertosol soil and then incubated for 105 days. Ammonia volatilization over 72 days accounted for less than 4% of the applied NH4-N, but 24% (AN) to 29% (AE) of the total applied biosolids' N was lost through denitrification in 105 days. In the second experiment, AN biosolids with and without added polyacrylamide polymer were mixed with either a heavy Vertosol or a lighter Red Ferrosol and then incubated for 98 days. The N loss was higher from the Vertosol (16-29% of total N applied) than from the Red Ferrosol (7-10% of total N applied), while addition of polymer to the biosolids increased N loss from 7 to 10% and from 16 to 29% in the Red Ferrosol and Vertosol, respectively. A major product from the denitrification process was N-2 gas, accounting for >90% of the emitted N gases from both experiments. Our findings demonstrate that denitrification could be a major pathway of gaseous N losses under warm and moist conditions.
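For scale, the reported loss fractions can be converted to absolute amounts, assuming the 24-29% figures apply to the full 180 kg N/ha application rate (an interpretation, since the abstract reports only percentages):

```python
# Quick arithmetic sketch converting the reported denitrification losses to
# kg N per hectare, under the stated assumption about the application rate.

applied_n = 180.0  # kg N/ha, rate designed to meet crop N requirements

for label, fraction in [("AN biosolids", 0.24), ("AE biosolids", 0.29)]:
    lost = applied_n * fraction
    print(f"{label}: ~{lost:.0f} kg N/ha lost via denitrification over 105 days")
```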
Abstract:
Sceliodes cordalis, the eggfruit caterpillar, is an important pest of eggplant in Australia, but little information was available on its biology. This study was conducted to determine the effect of temperature on the development of eggs, larvae and pupae on eggplant. Insects were reared at five constant temperatures from 20.5°C to 30.5°C with a 12:12 L:D photoperiod and the thermal summation model was fitted to the developmental rate data. Developmental zeroes and thermal constants of 11.22°C and 61.32 day-degrees for eggs, 12.03°C and 179.60 day-degrees for larvae, and 14.43°C and 107.03 day-degrees for pupae were determined. Several larvae reared at 20.5°C entered diapause.
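The fitted thermal summation model implies a development time of roughly K / (T - T0) days at a constant temperature T, where T0 is the developmental zero and K the thermal constant. Below is a small sketch using the values reported above; applying it outside the tested 20.5-30.5°C range would be an extrapolation.

```python
# Hedged sketch of the thermal summation (degree-day) relationship fitted above.
# Stage parameters are taken from the abstract; the example temperature is
# arbitrary and the linear model is only valid within the tested range.

STAGES = {            # (developmental zero T0 in deg C, thermal constant K in day-degrees)
    "egg":   (11.22, 61.32),
    "larva": (12.03, 179.60),
    "pupa":  (14.43, 107.03),
}

def days_to_complete(stage, temp_c):
    """Predicted days for a stage at a constant rearing temperature."""
    t0, k = STAGES[stage]
    if temp_c <= t0:
        return float("inf")  # no development below the developmental zero
    return k / (temp_c - t0)

# Example: predicted egg-to-adult duration at a constant 25.5 deg C.
total = sum(days_to_complete(s, 25.5) for s in STAGES)
print(f"{total:.1f} days")  # roughly 4.3 + 13.3 + 9.7, about 27 days
```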
Abstract:
The physicochemical and functional properties of flours from 25 Papua New Guinean and Australian sweetpotato cultivars were evaluated. The cultivars (white-, orange-, cream-, and purple-fleshed, with dry matter contents from 15 to 28 g/100 g) were obovate, oblong, elliptic, curved or irregular in shape, and essentially thin-cortexed (1-2 mm). Flour yield was less than 90 g/100 g solids, while starch, protein, amylose, water absorption and solubility indices, as well as total sugars, varied significantly (p < 0.05). Potassium, sodium, calcium, and phosphorus were the major minerals measured, and there were differences in the pasting properties, which showed four classes of shear-thinning and shear-thickening behaviours. Differential scanning calorimetry showed single-stage gelatinisation behaviour, with cultivar-dependent temperatures (61-84 degrees C) and enthalpies (12-27 J/g dry starch). Oval-, round- and angular-shaped granules were observed with a scanning electron microscope, while X-ray diffraction revealed an A-type diffraction pattern in the cultivars, with about 30% crystallinity. This study shows a wide range of sweetpotato properties, reported for the first time.
Abstract:
The fertility of cryopreserved Lates calcarifer sperm was studied to increase the availability of semen for routine fertilization of stripped eggs and to provide a tool for selective breeding. Semen diluted (1:4 v/v) and frozen (-196 degrees C) with 5% dimethylsulfoxide (DMSO) or 10% glycerol (final concentration) as cryoprotectants was used to inseminate freshly stripped ova. Frozen-thawed sperm were motile for about 4 min after being mixed with seawater. In the DMSO medium, post-thaw sperm activation was immediate after dilution with seawater, but in the glycerol medium maximum motility intensity was delayed for up to 1 min. When eggs and sperm were mixed before the addition of seawater, semen frozen with DMSO as cryoprotectant gave a mean hatch rate (84.1%) no different (P > 0.05) from that of unfrozen semen diluted with Ringer's solution (80.7%) or with DMSO (83.7%), but higher (P < 0.05) than that of semen frozen with glycerol (60.9%). Adding sperm to seawater 30 s before mixing with eggs did not improve the fertility of sperm cryopreserved with glycerol. Eggs inseminated with glycerol-cryoprotected sperm showed higher mortality during incubation than those inseminated with DMSO-cryoprotected sperm. Sperm held in liquid nitrogen for 90 days with DMSO as cryoprotectant yielded acceptable fertilization and hatching rates with semen-to-ova ratios of up to 1:100 (v/v), and produced fish with no apparent abnormalities over a 29-day period after hatch. These results show that cryopreservation of L. calcarifer sperm is feasible and well suited to a variety of hatchery purposes.