8 results for 7-(2-hydroxyethyl)guanine
in eResearch Archive - Queensland Department of Agriculture
Abstract:
Tension banding castration of cattle is gaining favour because it is relatively simple to perform and is promoted by retailers of the banders as a humane castration method. Two experiments were conducted under tropical conditions using Bos indicus bulls, comparing tension banding (Band) and surgical (Surgical) castration of weaner (7–10 months old) and mature (22–25 months old) bulls, with and without pain management (NSAID (ketoprofen) or saline injected intramuscularly immediately prior to castration). Welfare outcomes were assessed using a range of measures; this paper reports on some physiological, morbidity and productivity-related responses to augment the behavioural responses reported in an accompanying paper. Blood samples were taken on the day of castration (day 0) at the time of restraint (0 min) and at 30 min (weaners) or 40 min (mature bulls), 2 h and 7 h, and on days 1, 2, 3, 7, 14, 21 and 28 post-castration. Plasma from the day 0 samples was assayed for cortisol, creatine kinase, total protein and packed cell volume. Plasma from the other samples (plus the 0 min sample) was assayed for cortisol and haptoglobin. Liveweights were recorded approximately weekly to 6 weeks and at 2 and 3 months post-castration. Castration sites were checked at these same times to 2 months post-castration to score the extent of healing and presence of sepsis. Cortisol concentrations (mean ± s.e. nmol/L) were significantly (P < 0.05) higher in the Band (67 ± 4.5) compared with the Surgical weaners (42 ± 4.5) at 2 h post-castration, but at 24 h post-castration were greater in the Surgical (43 ± 3.2) compared with the Band weaners (30 ± 3.2). The main effect of ketoprofen was on the cortisol concentrations of the mature Surgical bulls; concentrations were significantly reduced at 40 min (47 ± 7.2 vs. 71 ± 7.2 nmol/L for saline) and at 2 h post-castration (24 ± 7.2 vs. 87 ± 7.2 nmol/L for saline). Ketoprofen, however, had no effect on the Band mature bulls, whose cortisol concentrations averaged 54 ± 5.1 nmol/L at 40 min and 92 ± 5.1 nmol/L at 2 h. Cortisol concentrations were also significantly elevated in the Band (83 ± 3.0 nmol/L) compared with the Surgical mature bulls (57 ± 3.0 nmol/L) at weeks 2–4 post-castration. The timing of this elevation coincided with significantly elevated haptoglobin concentrations (mg/mL) in the Band bulls (2.97 ± 0.102 for mature bulls and 1.71 ± 0.025 for weaners, vs. 2.10 ± 0.102 and 1.45 ± 0.025, respectively, for the Surgical treatment) and with evidence of slow wound healing and sepsis in both the weaner bulls (proportion not healed at week 4: 0.81 ± 0.089 for Band vs. 0.13 ± 0.078 for Surgical) and the mature bulls (0.81 ± 0.090 for Band vs. 0.38 ± 0.104 for Surgical at week 4). Overall, liveweight gains of both age groups were not affected by castration method. The findings of acute pain, chronic inflammation and, in the mature bulls at least, possibly chronic pain, together with poor wound healing in the Band bulls, support the behavioural findings reported in the accompanying paper and demonstrate that tension banding produces inferior welfare outcomes for weaner and mature bulls compared with surgical castration.
Abstract:
Phosphonate fungicides are used widely in the control of diseases caused by Phytophthora cinnamomi Rands. For the most part, phosphonate is seen as safe to use on crops, with phytotoxicity rare. However, recent research has shown that phosphonate has detrimental effects on the floral biology of some indigenous Australian plants. Since phosphonate fungicides are regularly used for the control of Phytophthora root rot in avocados, research was carried out to study the translocation of phosphonate fungicide in 'Hass' trees and any effects on their floral biology. Field-grown trees were sprayed with 0, 0.06 or 0.12 M mono-dipotassium phosphonate (pH 7.2) at summer flush maturity, floral bud break or anthesis. Following treatment, phosphonic acid concentrations were determined in leaves, roots, inflorescence rachises and flowers, and in vitro pollen germination and pollen tube growth were studied. Phosphonic acid concentration in the roots and floral parts was related to their sink strength at the respective times of application, with the concentration in roots highest (36.9 mg/g) after treatment at summer flush maturity and in flowers (234.7 mg/g) after treatment during early anthesis. Phosphonate at >0.03 M was found to be significantly phytotoxic to in vitro pollen germination and pollen tube growth. However, this rate gave a concentration far in excess of that measured in plant tissues following standard commercial applications of mono-dipotassium phosphonate fungicide. There was a small effect on pollen germination and pollen tube growth when 0.06 and 0.12 M mono-dipotassium phosphonate was applied during early anthesis, but under favourable pollination and fruit set conditions this is not expected to have a commercial impact on tree yield. However, there may be detrimental commercial implications from phosphonate sprays at early anthesis if unfavourable climatic conditions for pollination and fruit set subsequently occur. A commercial implication from this study is that phosphonic acid root concentrations can be elevated and maintained with strategic foliar applications of phosphonate fungicide timed to coincide with peaks in root sink strength. These occur at the end of the spring and summer flushes when shoot growth is relatively quiescent. Additional foliar applications may be advantageous under high disease-pressure situations but, where possible, should be timed to minimize overlap with other significant growth events in the tree, such as rapid inflorescence and fruit development and major vegetative flushing.
Abstract:
The effects of fertilisers on 8 tropical turfgrasses growing in 100-L bags of sand were studied over winter at Murrumba Downs, just north of Brisbane in southern Queensland (latitude 27.4°S, longitude 153.1°E). The species used were: Axonopus compressus (broad-leaf carpetgrass), Cynodon dactylon (bermudagrass 'Winter Green') and a C. dactylon × C. transvaalensis hybrid ('Tifgreen'), Digitaria didactyla (Queensland blue couch), Paspalum notatum (bahiagrass '38824'), Stenotaphrum secundatum (buffalograss 'Palmetto'), Eremochloa ophiuroides (centipedegrass 'Centec') and Zoysia japonica (zoysiagrass 'ZT-11'). Control plots were fertilised with a complete fertiliser every month from May to September (72 kg N/ha, 31 kg P/ha, 84 kg K/ha, 48 kg S/ha, 30 kg Ca/ha and 7.2 kg Mg/ha each month), and unfertilised plots received no fertiliser. Carpetgrass and standard bermudagrass were the species most sensitive to nutrient supply, with lower shoot dry weights in the unfertilised plots (shoots mowed to thatch level) than in the fertilised plots in June. Shoot dry weights were lower in the unfertilised plots in July for all species except buffalograss, centipedegrass and zoysiagrass, and lower in the unfertilised plots in August for all species except centipedegrass. At the end of the experiment in September, shoot dry weights in the unfertilised plots were 11% of those in the fertilised plots, with all species affected. Mean shoot nitrogen concentrations in the unfertilised plots fell from 3.2 to 1.7% between May and August, below the sufficiency range for turfgrasses (2.8-3.5%). There were also declines in P (0.45-0.36%), K (2.4-1.5%), S (0.35-0.25%), Mg (0.24-0.18%) and B (9-6 mg/kg), although these remained within the sufficiency range. The shoots in the control plots took up the following amounts (kg/ha.month) of nutrients: N, 10.0-27.0; P, 1.6-4.0; K, 8.2-19.8; S, 1.0-4.2; Ca, 1.1-3.3; and Mg, 0.8-2.2, compared with applications (kg/ha.month) of: N, 72; P, 31; K, 84; S, 48; Ca, 30; and Mg, 7.2, indicating a recovery of 14-38% for N, 5-13% for P, 10-24% for K, 2-9% for S, 4-11% for Ca and 11-30% for Mg. These results suggest that buffalograss, centipedegrass and zoysiagrass are less sensitive to low nutrient supply than carpetgrass, bermudagrass, blue couch and bahiagrass. The nutrient uptake data showed that the less sensitive species required only half or less of the nitrogen needed to maintain the growth of the other grasses, indicating potential savings for turf managers in fertiliser costs and benefits for the environment through fewer nutrients entering waterways.
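The recovery figures quoted above follow directly from dividing monthly shoot uptake by the monthly application rate. A minimal Python sketch of that calculation, using only the values reported in the abstract (the code itself is illustrative and not from the study), reproduces the quoted ranges:

```python
# Monthly nutrient application (kg/ha.month) and the range of shoot uptake
# (kg/ha.month) reported in the abstract.
applied = {"N": 72, "P": 31, "K": 84, "S": 48, "Ca": 30, "Mg": 7.2}
uptake = {"N": (10.0, 27.0), "P": (1.6, 4.0), "K": (8.2, 19.8),
          "S": (1.0, 4.2), "Ca": (1.1, 3.3), "Mg": (0.8, 2.2)}

for nutrient, (low, high) in uptake.items():
    rec_low = 100 * low / applied[nutrient]    # % of applied nutrient recovered in shoots
    rec_high = 100 * high / applied[nutrient]
    print(f"{nutrient}: recovery {rec_low:.0f}-{rec_high:.0f}%")
# Matches the recoveries quoted in the abstract (to rounding):
# N 14-38%, P 5-13%, K 10-24%, S 2-9%, Ca 4-11%, Mg ~11-30%
```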
Abstract:
The Wet Tropics bioregion of north Queensland has been identified as an area of global significance. The world-heritage-listed rainforests have been invaded by feral pigs (Sus scrofa) that are perceived to cause substantial environmental damage, and a community perception exists of an annual altitudinal migration of the feral-pig population. The present study describes the movements of 29 feral pigs in relation to altitudinal migration (highland, transitional and lowland areas). Feral pigs were sedentary and stayed within their home range throughout the 4-year study period. No altitudinal migration was detected; pigs moved no more than a mean distance of 1.0 km from the centre of their calculated home ranges. There was no significant difference between the mean (± 95% confidence interval) aggregate home ranges of males (8.7 ± 4.3 km², n = 15) and females (7.2 ± 1.8 km², n = 14). No difference in home range was detected among the three altitudinal areas: 7.2 ± 2.4 km² for highland, 6.2 ± 3.9 km² for transitional and 9.9 ± 5.3 km² for lowland areas. The aggregate mean home range for all pigs in the present study was 8.0 ± 2.4 km². The study also assessed the influence of season on the home ranges of eight feral pigs on the rainforest boundary; home ranges did not vary significantly in size between the tropical wet and dry seasons, although the mean home range in the dry season (7.7 ± 6.9 km²) was more than twice that in the wet season (2.9 ± 0.8 km²). Heavier pigs tended to have larger home ranges. The results of the present study suggest that feral pigs are sedentary throughout the year, so broad-scale control techniques need to be applied over areas large enough to encompass individual home ranges. Control strategies need a coordinated approach if a long-term reduction in the pig population is to be achieved.
Abstract:
Since meat from poultry colonized with Campylobacter spp. is a major cause of bacterial gastroenteritis, human exposure should be reduced by, among other things, preventing colonization of broiler flocks. To obtain more insight into possible sources of introduction of Campylobacter into broiler flocks, it is essential to estimate the moment at which the first bird in a flock is colonized. If the rate of transmission within a flock were known, such an estimate could be derived from the change in the prevalence of colonized birds in a flock over time. The aim of this study was to determine the rate of transmission of Campylobacter using field data gathered over 5 years from Australian broiler flocks. We used unique sampling data for 42 Campylobacter jejuni-colonized flocks and estimated the transmission rate, defined as the number of secondary infections caused by one colonized bird per day. The estimate was 2.37 ± 0.295 infections per infectious bird per day, which implies that, in our study population, colonized flocks of 20,000 broilers would reach a within-flock prevalence of 95% within 4.4 to 7.2 days after colonization of the first broiler. Using Bayesian analysis, the moment of colonization of the first bird in a flock was estimated to be from 21 days of age onward in all flocks in the study. This study provides an important quantitative estimate of the rate of transmission of Campylobacter in broiler flocks, which could be helpful in future studies on the epidemiology of Campylobacter in the field.
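The step from a transmission rate of 2.37 ± 0.295 per infectious bird per day to "95% prevalence within 4.4 to 7.2 days" can be reproduced with a simple closed-form SI (logistic) within-flock model. The sketch below assumes that model and treats the 4.4-7.2 day range as reflecting an approximate 95% confidence interval on the rate; it is an illustration rather than the authors' method:

```python
import math

N = 20_000        # flock size (broilers), as in the abstract
BETA = 2.37       # estimated transmission rate (infections per infectious bird per day)
SE = 0.295        # reported uncertainty of the estimate

def days_to_prevalence(beta, n=N, i0=1, p=0.95):
    """Time for a logistic SI epidemic, dI/dt = beta*S*I/N, to grow from
    i0 colonized birds to within-flock prevalence p (closed-form solution)."""
    return math.log((n / i0 - 1) * p / (1 - p)) / beta

for b in (BETA, BETA + 1.96 * SE, BETA - 1.96 * SE):
    print(f"beta = {b:.2f}/day -> {days_to_prevalence(b):.1f} days to 95% prevalence")
# approximately 5.4, 4.4 and 7.2 days, consistent with the abstract
```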
Abstract:
Water regulations have decreased irrigation water supplies in Nebraska and some other areas of the US Great Plains. When the available water is not enough to meet crop water requirements over the entire growing cycle, it becomes critical to know the irrigation timing that maximizes yields and profits. This study evaluated the effect of the timing of a deficit-irrigation allocation (150 mm) on crop evapotranspiration (ETc), yield, water use efficiency (WUE = yield/ETc), irrigation water use efficiency (IWUE = yield/irrigation), and dry mass (DM) of corn (Zea mays L.) irrigated with subsurface drip irrigation in the semiarid climate of North Platte, NE. During 2005 and 2006, a total of sixteen irrigation treatments (eight each year) were evaluated, which received different percentages of the water allocation during July, August, and September. In both years, all treatments resulted in no crop stress during the vegetative period and stress during the reproductive stages, which affected ETc, DM, yield, WUE and IWUE. Among treatments, ETc varied by 7.2 and 18.8%; yield by 17 and 33%; WUE by 12 and 22%; and IWUE by 18 and 33% in 2005 and 2006, respectively. Yield and WUE both increased linearly with ETc and with ETc/ETp (where ETp is seasonal ETc with no water stress), and WUE increased linearly with yield. The yield response factor (ky) averaged 1.50 over the two seasons. Irrigation timing affected the DM of the plant, grain and cob, but not that of the stover. It also affected the percentage of DM partitioned to the grain (harvest index), which increased linearly with ETc and averaged 56.2% over the two seasons, but did not affect the percentage allocated to the cob or stover. Irrigation applied in July had the highest positive coefficient of determination (R²) with yield; this positive correlation decreased considerably for irrigation applied in August and became negative for irrigation applied in September. The strongest positive correlation between the soil water deficit factor (Ks) and yield occurred during weeks 12-14 after crop emergence, during the "milk" and "dough" growth stages. Yield was poorly correlated with stress during weeks 15 and 16, and the correlation became negative after week 17. Dividing the 150 mm allocation about evenly among July, August and September was a good strategy, producing the highest yields in 2005 but not in 2006. Applying a larger proportion of the allocation in July was a good strategy in both years, whereas applying a large proportion of the allocation in September gave the opposite result. The different results obtained between years indicate that flexible irrigation scheduling techniques should be adopted rather than relying on fixed timing strategies.
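For readers unfamiliar with the indices, WUE and IWUE are the simple ratios defined in the abstract, while the yield response factor ky is conventionally obtained from the FAO-33 relation (1 - Ya/Ym) = ky(1 - ETa/ETp). The abstract does not restate that equation, so the short sketch below, with made-up numbers, is only an illustration of the standard definitions:

```python
def wue(yield_kg_ha, etc_mm):
    """Water use efficiency as defined in the abstract: yield / crop evapotranspiration."""
    return yield_kg_ha / etc_mm

def iwue(yield_kg_ha, irrigation_mm):
    """Irrigation water use efficiency: yield / irrigation water applied."""
    return yield_kg_ha / irrigation_mm

def yield_response_factor(ya, ym, eta, etp):
    """ky from the standard FAO-33 relation (1 - Ya/Ym) = ky * (1 - ETa/ETp),
    where Ya/ETa are water-limited values and Ym/ETp are unstressed values."""
    return (1 - ya / ym) / (1 - eta / etp)

# Hypothetical numbers purely for illustration (not data from the paper):
print(yield_response_factor(ya=10.0, ym=12.5, eta=520.0, etp=600.0))  # -> ~1.5
```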
Abstract:
The recent emergence of heritable, high-level resistance to phosphine in stored-grain pests is a serious concern for major grain-growing countries around the world. Here we describe the genetics of phosphine resistance in the rust red flour beetle Tribolium castaneum (Herbst), a pest of stored grain as well as a genetic model organism. We investigated three field-collected strains of T. castaneum: one susceptible (QTC4), one weakly resistant (QTC1012) and one strongly resistant (QTC931) to phosphine. The dose-mortality responses of their test- and inter-cross progeny revealed that most resistance in the weakly resistant (3.2x) strain was conferred by a single major resistance gene. This gene was also found in the strongly resistant (431x) strain, together with a second major resistance gene and additional minor factors. The second major gene by itself confers only 12-206x resistance, suggesting that a strong synergistic epistatic interaction between the two genes is responsible for the high level of resistance (431x) observed in the strongly resistant strain. Phosphine resistance is not sex-linked and is inherited as an incompletely recessive, autosomal trait. Analysis of the phenotypic fitness response of a population derived from a single-pair inter-strain cross between the susceptible and strongly resistant strains indicated changes in the level of response of the strong resistance phenotype; however, this effect was not consistent and was apparently masked by the genetic background of the weakly resistant strain. The results from this work will inform phosphine resistance management strategies and provide a basis for the identification of the resistance genes.
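A classification such as "incompletely recessive" is typically based on a degree-of-dominance statistic calculated from log-transformed LC50 values of the parental strains and their F1 progeny (Stone's formula). The abstract does not spell out the calculation, so the sketch below, with hypothetical LC50 values, is an assumption about the standard approach rather than the authors' code:

```python
import math

def degree_of_dominance(lc50_resistant, lc50_f1, lc50_susceptible):
    """Stone's degree of dominance: D = (2*logRS - logR - logS) / (logR - logS).
    D = -1 is completely recessive, 0 semi-dominant, +1 completely dominant."""
    log_r = math.log10(lc50_resistant)
    log_rs = math.log10(lc50_f1)
    log_s = math.log10(lc50_susceptible)
    return (2 * log_rs - log_r - log_s) / (log_r - log_s)

# Hypothetical LC50 values (mg/L) purely for illustration:
d = degree_of_dominance(lc50_resistant=4.3, lc50_f1=0.05, lc50_susceptible=0.01)
print(f"degree of dominance = {d:.2f}")  # between -1 and 0 reads as incompletely recessive
```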
Abstract:
Phosphine is the only economically viable fumigant for routine control of insect pests of stored food products, but its continued use is now threatened by the worldwide emergence of high-level resistance in key pest species. Phosphine has a unique mode of action relative to well-characterised contact pesticides, and the selective pressures that lead to resistance against field sprays differ dramatically from those encountered during fumigation. The consequences of these differences have not been investigated adequately. We determine the genetic basis of phosphine resistance in Rhyzopertha dominica strains collected from New South Wales and South Australia and compare this with resistance in a previously characterised strain from Queensland. The resistance levels were 225 and 100 times the baseline response of a sensitive reference strain. Moreover, molecular and phenotypic data indicate that high-level resistance arose independently in each of the three widely separated geographical regions. Despite the independent origins, resistance was due to two interacting genes in each instance. Furthermore, complementation analysis reveals that all three strains contain an incompletely recessive resistance allele of the autosomal rph1 resistance gene. This is particularly noteworthy as a resistance allele at rph1 was previously proposed to be a necessary first step in the evolution of high-level resistance. Despite the capacity of phosphine to disrupt a wide range of enzymes and biological processes, it is remarkable that the initial step in the selection of resistance is so similar in isolated outbreaks.