Abstract:
Top-predators can be important components of resilient ecosystems, but they are still controlled in many places to mitigate a variety of economic, environmental and/or social impacts. Lethal control is often achieved through the broad-scale application of poisoned baits. Understanding the direct and indirect effects of such lethal control on subsequent movements and behaviour of survivors is an important pre-requisite for interpreting the efficacy and ecological outcomes of top-predator control. In this study, we use GPS tracking collars to investigate the fine-scale and short-term movements of dingoes (Canis lupus dingo and other wild dogs) in response to a routine poison-baiting program as an example of how a common, social top-predator can respond (behaviourally) to moderate levels of population reduction. We found no consistent control-induced differences in home range size or location, daily distance travelled, speed of travel, temporal activity patterns or road/trail usage for the seven surviving dingoes we monitored immediately before and after a typical lethal control event. These data suggest that the spatial behaviour of surviving dingoes was not altered in ways likely to affect their detectability, and if control-induced changes in dingoes' ecological function did occur, these may not be related to altered spatial behaviour or movement patterns.
Abstract:
Post-release survival of line-caught pearl perch (Glaucosoma scapulare) was assessed via field experiments where fish were angled using methods similar to those used by commercial, recreational and charter fishers. One hundred and eighty-three individuals were caught during four experiments, of which >91% survived up to three days post-capture. Hook location was found to be the best predictor of survival, with the survival of throat- or stomach-hooked pearl perch significantly (P < 0.05) lower than those hooked in either the mouth or lip. Post-release survival was similar for both legal (≥35 cm) and sub-legal (<35 cm) pearl perch, while those individuals showing no signs of barotrauma were more likely to survive in the short term. Examination of the swim bladders in the laboratory, combined with observations in the field, revealed that swim bladders rupture during ascent from depth allowing swim bladder gases to escape into the gut cavity. As angled fish approach the surface, the alimentary tract ruptures near the anus allowing swim bladder gases to escape the gut cavity. As a result, very few pearl perch exhibit barotrauma symptoms and no barotrauma mitigation strategies were recommended. The results of this study show that pearl perch are relatively resilient to catch-and-release suggesting that post-release mortality would not contribute significantly to total fishing mortality. We recommend the use of circle hooks, fished actively on tight lines, combined with minimal handling in order to maximise the post-release survival of pearl perch.
Abstract:
Previous short-term studies predict that the use of fire to manage lantana (Lantana camara) may promote its abundance. We tested this prediction by examining long-term recruitment patterns of lantana in a dry eucalypt forest in Australia from 1959 to 2007 in three fire frequency treatments: repeated annual burning, repeated triennial burning and long unburnt. The dataset was divided into two periods (1959–1972, 1974–2007) due to logging that occurred at the study site between 1972 and 1974 and the establishment of the triennial burn treatment in 1973. Our results showed that repeated burning decreased lantana regeneration under an annual burn regime in the pre- and post-logging periods and maintained low levels of regeneration in the triennial burn compartment during the post-logging period. In the absence of fire, lantana recruitment exhibited a dome-shaped response over time, with the total population peaking in 1982 before declining to 2007. In addition to fire regime, soil pH and carbon to nitrogen ratio, the density of taller conspecifics and the interaction between rainfall and fire regime were found to influence lantana regeneration change over time. The results suggest that the reported positive association between fire disturbance and abundance of lantana does not hold for all forest types and that fire should be considered as part of an integrated weed management strategy for lantana in more fire-tolerant ecosystems.
Abstract:
Glyphosate resistance is a rapidly developing threat to profitability in Australian cotton farming. Resistance causes an immediate reduction in the effectiveness of in-crop weed control in glyphosate-resistant transgenic cotton and summer fallows. Although strategies for delaying glyphosate resistance and those for managing resistant populations are qualitatively similar, the longer resistance can be delayed, the longer cotton growers will have choice over which tactics to apply and when to apply them. Effective strategies to avoid, delay, and manage resistance are thus of substantial value. We used a model of glyphosate resistance dynamics to perform simulations of resistance evolution in Sonchus oleraceus (common sowthistle) and Echinochloa colona (awnless barnyard grass) under a range of resistance prevention, delaying, and management strategies. From these simulations, we identified several elements that could contribute to effective glyphosate resistance prevention and management strategies. (i) Controlling glyphosate survivors is the most robust approach to delaying or preventing resistance. High-efficacy, high-frequency survivor control almost doubled the useful lifespan of glyphosate from 13 to 25 years even with glyphosate alone used in summer fallows. (ii) Two non-glyphosate tactics in-crop plus two in-summer fallows is the minimum intervention required for long-term delays in resistance evolution. (iii) Pre-emergence herbicides are important, but should be backed up with non-glyphosate knockdowns and strategic tillage; replacing a late-season, pre-emergence herbicide with inter-row tillage was predicted to delay glyphosate resistance by 4 years in awnless barnyard grass. (iv) Weed species' ecological characteristics, particularly seed bank dynamics, have an impact on the effectiveness of resistance strategies; S. oleraceus, because of its propensity to emerge year-round, was less exposed to selection with glyphosate than E. colona, resulting in an extra 5 years of glyphosate usefulness (18 v. 13 years) even in the most rapid cases of resistance evolution. Delaying tactics are thus available that can provide some or many years of continued glyphosate efficacy. If glyphosate-resistant cotton cropping is to remain profitable in Australian farming systems in the long-term, however, growers must adapt to the probability that they will have to deal with summer weeds that are no longer susceptible to glyphosate. Robust resistance management systems will need to include a diversity of weed control options, used appropriately.
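The survivor-control finding above can be illustrated with a toy annual-cycle seed-bank model. This is not the published model: the germination, kill, decay, and fecundity rates, the initial resistance frequency, and the seed-bank structure are all invented for the sketch.

```python
def years_until_resistance(survivor_control=0.0, glyphosate_kill=0.95,
                           fecundity=1000.0, bank_decay=0.5, germination=0.3,
                           init_res_freq=1e-8, threshold=0.2, max_years=60):
    """Toy weed seed-bank model with susceptible (S) and resistant (R) seeds.

    Each year a fraction of the bank germinates, glyphosate kills susceptible
    plants only, 'survivor control' (a follow-up tactic) kills a fraction of
    ALL survivors, and surviving plants return seed to the bank. Returns the
    first year in which resistant seeds exceed `threshold` of the bank -- a
    toy analogue of the end of glyphosate's useful lifespan. All parameter
    values are illustrative assumptions, not those of the published model.
    """
    bank_s = (1.0 - init_res_freq) * 1e6
    bank_r = init_res_freq * 1e6
    for year in range(1, max_years + 1):
        em_s, em_r = germination * bank_s, germination * bank_r
        bank_s, bank_r = bank_s - em_s, bank_r - em_r
        surv_s = em_s * (1.0 - glyphosate_kill) * (1.0 - survivor_control)
        surv_r = em_r * (1.0 - survivor_control)  # resistant plants survive glyphosate
        bank_s = bank_s * (1.0 - bank_decay) + surv_s * fecundity
        bank_r = bank_r * (1.0 - bank_decay) + surv_r * fecundity
        if bank_r / (bank_s + bank_r) > threshold:
            return year
    return max_years

# Survivor control slows enrichment of resistant seed in the bank:
lifespan_plain = years_until_resistance(survivor_control=0.0)
lifespan_ctrl = years_until_resistance(survivor_control=0.9)
```

Because survivor control removes resistant plants before they set seed, new seed input (which is strongly enriched for resistance) is diluted by the older, mostly susceptible seed bank, so the resistant fraction climbs more slowly; the toy numbers are not comparable to the 13-versus-25-year figures reported above.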
Abstract:
In 2001, the red imported fire ant (Solenopsis invicta Buren) was identified in Brisbane, Australia. An eradication program involving broadcast bait treatment with two insect growth regulators and a metabolic inhibitor began in September of that year and is currently ongoing. To gauge the impacts of these treatments on local ant populations, we examined long-term monitoring data and quantified abundance patterns of S. invicta and common local ant genera using a linear mixed-effects model. For S. invicta, presence in pitfalls reduced over time to zero on every site. Significantly higher numbers of S. invicta workers were collected on high-density polygyne sites, which took longer to disinfest compared with monogyne and low-density polygyne sites. For local ants, nine genus groups of the 10 most common genera analyzed either increased in abundance or showed no significant trend. Five of these genus groups were significantly less abundant at the start of monitoring on high-density polygyne sites compared with monogyne and low-density polygyne sites. The genus Pheidole significantly reduced in abundance over time, suggesting that it was affected by treatment efforts. These results demonstrate that the treatment regime used at the time successfully removed S. invicta from these sites in Brisbane, and that most local ant genera were not seriously impacted by the treatment. These results have important implications for current and future prophylactic treatment efforts, and suggest that native ants remain in treated areas to provide some biological resistance to S. invicta.
Abstract:
PURPOSE To quantify the influence of short-term wear of miniscleral contact lenses on the morphology of the corneo-scleral limbus, the conjunctiva, episclera and sclera. METHODS OCT images of the anterior eye were captured before, immediately following 3 h of wear, and 3 h after removal of a miniscleral contact lens for 10 young (27±5 years) healthy participants (neophyte rigid lens wearers). The region of analysis encompassed 1 mm anterior to 3.5 mm posterior to the scleral spur. Natural diurnal variations in thickness were measured on a separate day and compensated for in subsequent analyses. RESULTS Following 3 h of lens wear, statistically significant tissue thinning was observed across all quadrants, with a mean decrease in thickness of 24.1±3.6 μm (p<0.001), which diminished but did not return to baseline 3 h after lens removal (16.9±1.9 μm, p<0.001). The largest tissue compression was observed in the superior quadrant (49.9±8.5 μm, p<0.01) and in the annular zone 1.5 mm from the scleral spur (48.2±5.7 μm), corresponding to the approximate edge of the lens landing zone. Compression of the conjunctiva/episclera accounted for about 70% of the changes. CONCLUSIONS Optimally fitting miniscleral contact lenses worn for three hours resulted in significant tissue compression in young healthy eyes, with the greatest thinning observed superiorly, potentially due to the additional force of the eyelid, and with partial recovery of compression 3 h after lens removal. Most of the morphological changes occurred in the conjunctiva/episclera layers.
Abstract:
Immediate and residual effects of two lengths of low plane of nutrition (PON) on the synthesis of milk protein and protein fractions were studied at the Mutdapilly Research Station, in south-east Queensland. Thirty-six multiparous Holstein-Friesian cows, between 46 and 102 days in milk (DIM) initially, were used in a completely randomised design experiment with three treatments. All cows were fed on a basal diet of ryegrass pasture (7.0 kg DM/cow.day), barley-sorghum concentrate mix (2.7 kg DM/cow.day) and a canola meal-mineral mix (1.3 kg DM/cow.day). To increase PON, 5.0 kg DM/cow.day supplemental maize and forage sorghum silage was added to the basal diet. The three treatments were (C) high PON (basal diet + supplemental silage); (L9) low PON (basal diet only) for a period of 9 weeks; and (L3) low PON (basal diet only) for a period of 3 weeks. The experiment comprised three periods (1) covariate – high PON, all groups (5 weeks), (2) period of low PON for either 3 weeks (L3) or 9 weeks (L9), and (3) period of high PON (all groups) to assess ability of cows to recover any production lost as a result of treatments (5 weeks). The low PON treatment periods for L3 and L9 were end-aligned so that all treatment groups began Period 3 together. Although there was a significant effect of L9 on yields of milk, protein, fat and lactose, and concentrations of true protein, whey protein and urea, these were not significantly different from L3. There were no residual effects of L3 or L9 on protein concentration or nitrogen distribution after 5 weeks of realimentation. There was no significant effect of low PON for 3 or 9 weeks on casein concentration or composition.
Abstract:
Background Hyperferritinemia-cataract syndrome (HCS) is a rare Mendelian condition characterized by bilateral cataract and high levels of serum ferritin in the absence of iron overload. Methods HCS was diagnosed in three adult siblings. In two of them it was possible to assess lens changes initially in 1995 and again in 2013. Serum ferritin, iron and transferrin concentrations and transferrin saturation percentage were also measured, and the Iron Responsive Element (IRE) region of the L-ferritin gene (FTL) was studied. Results Serum ferritin concentrations were considerably elevated, while serum iron, transferrin and transferrin saturation levels were within the normal range in each sibling. Cataract changes in our patients were consistent with those previously reported in the literature. Progression of the cataract, an aspect little studied in this syndrome, appeared to be quite limited in extent. The heterozygous +32G to T (-168G>T) substitution in the IRE of the FTL gene was detected in this family. Conclusions Ophthalmic and biochemical studies together with genetic testing confirmed HCS in three family members. Although the disorder has been extensively described in recent years, little is known regarding cataract evolution over time. In our cases, lens evaluations encompassed many years, identified bilateral cataract of typical morphology and supported the hypothesis that this unique clinical feature of the disease tends to be slowly progressive in nature, at least in adults.
Abstract:
Soil biogeochemical cycles are largely mediated by microorganisms, and fire significantly modifies these cycles, mainly by altering microbial communities and substrate availability. The majority of studies on fire effects have focused on the surface soil; therefore, our understanding of the vertical distribution of microbial communities and of the impacts of fire on nitrogen (N) dynamics through the soil profile is limited. Here, we examined changes in soil denitrification capacity (DNC) and denitrifying communities with depth under different burning regimes, and their interaction with environmental gradients along the soil profile. Results showed that soil depth had a more pronounced impact than burning treatment on bacterial community size. The abundance of 16S rRNA and denitrification genes (narG, nirK, and nirS) declined exponentially with soil depth. Surprisingly, nosZ-harboring denitrifiers were enriched in the deeper soil layers, likely indicating that they can better adapt to stress conditions (e.g., oxygen deficiency and nutrient limitation) than other denitrifiers. Soil nutrients, including dissolved organic carbon (DOC), total soluble N (TSN), ammonium (NH4+), and nitrate (NO3−), declined significantly with soil depth, which probably contributed to the vertical distribution of denitrifying communities. Soil DNC also decreased significantly with soil depth and was negligible below 20 cm. These findings provide new insights into niche separation of the N-cycling functional guilds along the soil profile under a varied fire disturbance regime.
Abstract:
Spot measurements of methane emission rate (n = 18 700) by 24 Angus steers fed mixed rations from GrowSafe feeders were made over 3- to 6-min periods by a GreenFeed emission monitoring (GEM) unit. The data were analysed to estimate daily methane production (DMP; g/day) and derived methane yield (MY; g/kg dry matter intake (DMI)). A one-compartment dose model of spot emission rate v. time since the preceding meal was compared with the models of Wood (1967) and Dijkstra et al. (1997) and the average of spot measures. Fitted values for DMP were calculated from the area under the curves. Two methods of relating methane and feed intakes were then studied: the classical calculation of MY as DMP/DMI (kg/day); and a novel method of estimating DMP from the time and size of preceding meals using either the data for only the two meals preceding a spot measurement, or all meals for the 3 days prior. Two approaches were also used to estimate DMP from spot measurements: fitting of splines on a 'per-animal per-day' basis, and an alternative approach of modelling DMP after each feed event by least squares (using Solver), summing (for each animal) the contributions from each feed event by best-fitting a one-compartment model. Time since the preceding meal was of limited value in estimating DMP. Even when the meal sizes and time intervals between a spot measurement and all feeding events in the previous 72 h were assessed, only 16.9% of the variance in spot emission rate measured by GEM was explained by this feeding information. While using the preceding meal alone gave a biased estimate (an underestimate) of DMP, allowing for a longer feed history removed this bias. A power analysis taking into account the sources of variation in DMP indicated that to obtain an estimate of DMP with a 95% confidence interval within 5% of the observed 64-day mean of spot measures would require 40 animals measured over 45 days (two spot measurements per day) or 30 animals measured over 55 days. These numbers suggest that spot measurements could be made in association with feed efficiency tests made over 70 days. Spot measurements of enteric emissions can be used to define DMP, but the number of animals and samples required is larger than is needed when day-long measures are made.
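The "sum of one-compartment responses to each feed event" idea can be sketched as follows. This is a minimal illustration, not the study's implementation: the rate constants, meal data, observation values, and the single fitted scale parameter are all invented for the example.

```python
import math

def meal_response(t, k_abs=2.0, k_el=0.1):
    """Unit-dose one-compartment response (per kg DMI) at t hours after a meal.
    k_abs and k_el are illustrative rate constants; the prefactor is chosen so
    the response integrates to exactly 1 over 0..infinity."""
    if t < 0:
        return 0.0
    return k_abs * k_el / (k_abs - k_el) * (math.exp(-k_el * t) - math.exp(-k_abs * t))

def predicted_rate(t, meals, scale):
    """Spot emission rate at time t (h) as the scaled sum of responses to all
    preceding meals; meals is a list of (time_h, size_kg_DMI) tuples."""
    return scale * sum(size * meal_response(t - mt) for mt, size in meals)

def fit_scale(observations, meals):
    """Least-squares estimate of the single scale parameter. The model is
    linear in scale, so the fit is closed-form: sum(obs*basis)/sum(basis^2)."""
    basis = [predicted_rate(t, meals, 1.0) for t, _ in observations]
    num = sum(b * y for b, (_, y) in zip(basis, observations))
    den = sum(b * b for b in basis)
    return num / den

# Hypothetical data: three meals in a day, four GEM spot measurements (g/h).
meals = [(0.0, 4.0), (8.0, 3.0), (16.0, 3.0)]
obs = [(2.0, 9.5), (6.0, 6.1), (10.0, 8.8), (20.0, 7.4)]
scale = fit_scale(obs, meals)

# Since the unit response integrates to 1 per kg DMI, daily methane
# production is approximately scale * total DMI for the day.
dmp = scale * sum(size for _, size in meals)
```

Because the unit response integrates to one, the fitted scale plays the role of methane yield per kg of DMI in this toy version, and DMP follows from total intake; the real analysis fitted such curves per animal and compared them against spline fits and day-long averages.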
Abstract:
Prescribed fire is one of the most widely used management tools for reducing fuel loads in managed forests. However, the long-term effects of repeated prescribed fires on soil carbon (C) and nitrogen (N) pools are poorly understood. This study aimed to investigate how different fire frequency regimes influence C and N pools in the surface soils (0–10 cm). A prescribed fire field experiment in a wet sclerophyll forest established in 1972 in southeast Queensland was used in this study. The fire frequency regimes included long unburnt (NB), burnt every 2 years (2yrB) and burnt every 4 years (4yrB), with four replications. Compared with the NB treatment, the 2yrB treatment lowered soil total C by 44%, total N by 54%, HCl-hydrolysable C and N by 48% and 59%, KMnO4-oxidizable C by 81%, microbial biomass C and N by 42% and 33%, cumulative CO2–C by 28%, NaOCl-non-oxidizable C and N by 41% and 51%, and charcoal-C by 17%, respectively. The 4yrB and NB treatments showed no significant differences for these soil C and N pools. All soil labile, biologically active, recalcitrant and total C and N pools were correlated positively with each other and with soil moisture content, but negatively correlated with soil pH. The C:N ratios of different C and N pools were greater in the burned treatments than in the NB treatment. This study highlights that prescribed burning at a four-year interval is a more sustainable management practice for this subtropical forest ecosystem.
Abstract:
Bovine Viral Diarrhoea Virus (BVDV) is one of the most serious pathogens of cattle, causing tremendous economic loss to the cattle industry worldwide and meriting the development of improved subunit vaccines. The structural glycoprotein E2 is reported to be a major immunogenic determinant of the BVDV virion. We have developed a novel hollow silica vesicle (SV) based platform to administer BVDV-1 Escherichia coli-expressed optimised E2 (oE2) antigen as a nanovaccine formulation. The SV-140 vesicles (diameter 50 nm, wall thickness 6 nm, perforated by pores of entrance size 16 nm and total pore volume of 0.934 cm³/g) have proven to be ideal candidates to load oE2 antigen and generate an immune response. The current study for the first time demonstrates the ability of freeze-dried (FD) as well as non-FD oE2/SV-140 nanovaccine formulations to induce long-term balanced antibody and cell-mediated memory responses for at least 6 months with a shortened dosing regimen of two doses in a small animal model. The in vivo ability of oE2 (100 μg)/SV-140 (500 μg) and FD oE2 (100 μg)/SV-140 (500 μg) to induce long-term immunity was compared to immunisation with oE2 (100 μg) together with the conventional adjuvant Quil-A from Quillaja saponaria (10 μg) in mice. The oE2/SV-140 as well as the FD oE2/SV-140 nanovaccine generated oE2-specific antibody and cell-mediated responses for up to six months after the final (second) immunisation. Significantly, the cell-mediated responses were consistently high in mice immunised with oE2/SV-140 (1500 SFU/million cells) at the six-month time point. Histopathology studies showed no morphological changes at the site of injection or in the different organs harvested from the mice immunised with 500 μg SV-140 nanovaccine compared to the unimmunised control. The platform has the potential for developing single-dose vaccines without the requirement of cold-chain storage for veterinary and human applications.
Abstract:
This study investigated long-term use of custom-made orthopedic shoes (OS) at 1.5 years follow-up. In addition, the association between short-term outcomes and long-term use was studied. Patients from a previously published study who did use their first-ever pair of OS 3 months after delivery received another questionnaire after 1.5 years. Patients with different pathologies were included in the study (n = 269, response = 86%). Mean age was 63 ± 14 years, and 38% were male. After 1.5 years, 87% of the patients still used their OS (78% frequently [4-7 days/week] and 90% occasionally [1-3 days/week]) and 13% of the patients had ceased using their OS. Patients who were using their OS frequently after 1.5 years had significantly higher scores for 8 of 10 short-term usability outcomes (p-values ranged from <0.001 to 0.046). The largest differences between users and nonusers were found for scores on the short-term outcomes of OS fit and communication with the medical specialist and shoe technician (effect size range = 0.16 to 0.46). We conclude that patients with worse short-term usability outcomes for their OS are more likely to use their OS only occasionally or not at all at long-term follow-up.
Abstract:
Objective To determine mortality rates after a first lower limb amputation and explore the rates for different subpopulations. Methods Retrospective cohort study of all people who underwent a first amputation at or proximal to the transtibial level, in an area of 1.7 million people. Analysis with Kaplan-Meier curves and log-rank tests for univariate associations of psycho-social and health variables. Logistic regression for odds of death at 30 days, 1 year and 5 years. Results 299 people were included. Median time to death was 20.3 months (95%CI: 13.1; 27.5). 30-day mortality = 22%; odds of death 2.3 times higher in people with a history of cerebrovascular disease (95%CI: 1.2; 4.7, P = 0.016). 1-year mortality = 44%; odds of death 3.5 times higher for people with renal disease (95%CI: 1.8; 7.0, P < 0.001). 5-year mortality = 77%; odds of death 5.4 times higher for people with renal disease (95%CI: 1.8; 16.0, P = 0.003). Variation in mortality rates was most apparent between age groups, with people aged 75–84 years having better short-term outcomes than those both younger and older. Conclusions Mortality rates demonstrated the frailty of this population, with almost one quarter of people dying within 30 days, and almost half within 1 year. People with cerebrovascular disease had higher odds of death at 30 days, and those with renal disease at 1 and 5 years.
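The odds ratios above come from logistic regression; with a single binary predictor, the odds ratio and its Woolf 95% confidence interval can equivalently be computed straight from a 2×2 table. The counts below are hypothetical, chosen only to give an odds ratio of similar magnitude, and are not the study's data.

```python
import math

def odds_ratio(exp_dead, exp_alive, unexp_dead, unexp_alive):
    """Odds ratio for death given exposure (e.g. cerebrovascular disease)
    from a 2x2 table of counts."""
    return (exp_dead / exp_alive) / (unexp_dead / unexp_alive)

def or_ci95(a, b, c, d):
    """Woolf (log) 95% confidence interval for the odds ratio:
    exp(ln(OR) +/- 1.96 * sqrt(1/a + 1/b + 1/c + 1/d))."""
    log_or = math.log(odds_ratio(a, b, c, d))
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    return math.exp(log_or - 1.96 * se), math.exp(log_or + 1.96 * se)

# Hypothetical 30-day counts: deaths/survivors by disease status.
a, b, c, d = 20, 40, 45, 194
or_hat = odds_ratio(a, b, c, d)   # ~2.16
lo, hi = or_ci95(a, b, c, d)
```

A full logistic regression generalises this to several predictors at once (each exponentiated coefficient is an adjusted odds ratio), which is why the reported intervals cannot be reproduced from marginal counts alone.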