25 results for Spill Over Effect

in eResearch Archive - Queensland Department of Agriculture


Relevance:

100.00%

Abstract:

The membracid Aconophora compressa Walker, a biological control agent released in 1995 to control Lantana camara (Verbenaceae) in Australia, has since been collected on several nontarget plant species. Our survey suggests that sustained populations of A. compressa are found only on the introduced nontarget ornamental Citharexylum spinosum (Verbenaceae) and the target weed L. camara. It is found on other nontarget plant species only when populations on C. spinosum and L. camara are high, suggesting that the presence of populations on nontarget species may be a spill-over effect. Some of the incidence and abundance on nontarget plants could have been anticipated from host specificity studies done on this agent before release, whereas others could not. This raises important issues about predicting risks posed by weed biological control agents and the need for long-term post-introduction monitoring on nontarget species.

Relevance:

80.00%

Abstract:

Viruses that originate in bats may be the most notorious emerging zoonoses that spill over from wildlife into domestic animals and humans. Understanding how these infections filter through ecological systems to cause disease in humans is of profound importance to public health. Transmission of viruses from bats to humans requires a hierarchy of enabling conditions that connect the distribution of reservoir hosts, viral infection within these hosts, and exposure and susceptibility of recipient hosts. For many emerging bat viruses, spillover also requires viral shedding from bats, and survival of the virus in the environment. Focusing on Hendra virus, but also addressing Nipah virus, Ebola virus, Marburg virus and coronaviruses, we delineate this cross-species spillover dynamic from the within-host processes that drive virus excretion to land-use changes that increase interaction among species. We describe how land-use changes may affect co-occurrence and contact between bats and recipient hosts. Two hypotheses may explain temporal and spatial pulses of virus shedding in bat populations: episodic shedding from persistently infected bats or transient epidemics that occur as virus is transmitted among bat populations. Management of livestock also may affect the probability of exposure and disease. Interventions to decrease the probability of virus spillover can be implemented at multiple levels from targeting the reservoir host to managing recipient host exposure and susceptibility.
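
The "hierarchy of enabling conditions" described above lends itself to a simple multiplicative view of risk: if every layer (reservoir distribution, infection in the reservoir, shedding, environmental survival, recipient exposure, recipient susceptibility) must be satisfied for spillover, the joint probability is the product of the layer probabilities. Below is a minimal illustrative sketch of that idea; the layer names and example values are assumptions for demonstration only, not data from the paper.

```python
# Minimal sketch of the layered-barrier view of cross-species spillover:
# every enabling condition must hold, so the joint probability is the
# product of per-layer probabilities. All values are illustrative.

layers = {
    "reservoir_host_present": 0.9,   # bats occur at the site
    "infection_in_reservoir": 0.3,   # virus circulating in the colony
    "virus_shedding":         0.2,   # an excretion pulse is under way
    "environmental_survival": 0.5,   # virus stays viable long enough
    "recipient_exposure":     0.1,   # horse/human contacts contaminated material
    "recipient_susceptible":  0.8,   # exposed host can be infected
}

def spillover_probability(layers):
    """Joint probability that all enabling conditions align."""
    p = 1.0
    for prob in layers.values():
        p *= prob
    return p

print(f"P(spillover) = {spillover_probability(layers):.5f}")
# Any intervention that lowers a single layer (e.g. reducing livestock
# exposure) lowers the product proportionally, which is why the authors
# note that interventions can be applied at multiple levels.
```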

Relevance:

80.00%

Abstract:

The achievement and measurement of improvements and innovations is not often an overt practice in the design and delivery of government services other than in health services. There is a need for specific mechanisms proven to increase the rate and scale of improvements and innovations in organisations, communities, regions and industries. This paper describes a model for the design, measurement and management of projects and services as systems for achieving and sustaining outcomes, improvements and innovations. The development of the model involved the practice of continuous improvement and innovation within and across a number of agricultural development projects in Australia and internationally. Key learnings from the development and use of the model are: (1) all elements and factors critical for success can be implemented, measured and managed; (2) the design of a meaningful systemic measurement framework is possible; (3) all project partners can achieve and sustain rapid improvements and innovations; (4) outcomes can be achieved from early in the life of projects; and (5) significant spill-over benefits can be achieved beyond the scope, scale and timeframe of projects.

Relevance:

30.00%

Abstract:

Species of Liposcelis psocids have emerged as major pests of stored grain in Australia in recent years. Several populations have been detected with high resistance to phosphine, the major chemical treatment. The highest resistance has been detected in the cosmopolitan species Liposcelis bostrychophila. As part of a national resistance management strategy to maintain the viability of phosphine, we are developing minimum effective dosage regimes (concentration × time) required to control all life stages of resistant L. bostrychophila at a range of grain temperatures. Four concentrations of phosphine, 0.1, 0.17, 0.3 and 1 mg/L, were evaluated for their effectiveness against strongly resistant L. bostrychophila at a series of fumigation temperatures: 20, 25, 30 and 35°C. Results were recorded as the least number of days taken to achieve population extinction. We found that, at any fixed concentration of phosphine, time to population extinction decreased as fumigation temperature increased from 20 to 30°C. For example, at 0.1 mg/L, it took more than 14 days at 20°C to completely control these insects, whereas at 30°C it took only seven days. An increase in fumigation temperature from 25°C to 30°C dramatically reduced the exposure period needed to achieve population extinction of resistant psocids. For example, a dose of 0.17 mg/L over six days at 30°C completely controlled strongly resistant L. bostrychophila populations that can survive at 1 mg/L and 25°C over the same exposure period. Findings from our study will be used to formulate recommendations for registered dosage rates and fumigation periods for use in Australia.
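
In practice, concentration-by-temperature results like these reduce to a lookup of minimum exposure days. The sketch below illustrates that, using only the example figures quoted in the abstract (0.1 mg/L: more than 14 days at 20°C versus 7 days at 30°C; 0.17 mg/L: 6 days at 30°C); any cell not reported above is left unfilled rather than invented.

```python
# Sketch of a minimum-exposure lookup (days to population extinction)
# for strongly resistant L. bostrychophila. Only cells quoted in the
# abstract are filled; missing combinations return None.

days_to_extinction = {
    # (phosphine mg/L, grain temperature °C): minimum days
    (0.10, 20): 15,   # abstract says "more than 14 days"; 15 used as a floor
    (0.10, 30): 7,
    (0.17, 30): 6,
}

def minimum_fumigation_days(conc, temp):
    """Return the minimum exposure period, if the trial reported it."""
    return days_to_extinction.get((conc, temp))

print(minimum_fumigation_days(0.10, 30))  # 7
print(minimum_fumigation_days(0.30, 25))  # None -> not reported in the abstract
```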

Relevance:

30.00%

Abstract:

Soil nitrogen (N) supply in the Vertosols of southern Queensland, Australia has steadily declined as a result of long-term cereal cropping without N fertiliser application or rotations with legumes. Nitrogen-fixing legumes such as lucerne may enhance soil N supply and therefore could be used in lucerne-wheat rotations. However, lucerne leys in this subtropical environment can create a soil moisture deficit, which may persist for a number of seasons. Therefore, we evaluated the effect of varying the duration of a lucerne ley (for up to 4 years) on soil N increase, N supply to wheat, soil water changes, wheat yields and wheat protein on a fertility-depleted Vertosol in a field experiment between 1989 and 1996 at Warra (26°47'S, 150°53'E), southern Queensland. The experiment consisted of a wheat-wheat rotation, and 8 treatments of lucerne leys starting in 1989 (phase 1) or 1990 (phase 2) for 1, 2, 3 or 4 years duration, followed by wheat cropping. Lucerne DM yield and N yield increased with increasing duration of lucerne leys. Soil N increased over time following 2 years of lucerne but there was no further significant increase after 3 or 4 years of lucerne ley. Soil nitrate concentrations increased significantly with all lucerne leys and moved progressively downward in the soil profile from 1992 to 1995. Soil water, especially at 0.9-1.2 m depth, remained significantly lower for the next 3 years after the termination of the 4-year lucerne ley than under continuous wheat. No significant increase in wheat yields was observed from 1992 to 1995, irrespective of the lucerne ley. However, wheat grain protein concentrations were significantly higher under lucerne-wheat than under wheat-wheat rotations for 3-5 years. The lucerne yield and soil water and nitrate-N concentrations were satisfactorily simulated with the APSIM model. Although significant N accretion occurred in the soil following lucerne leys, in drier seasons, recharge of the drier soil profile following long-duration lucerne occurred after 3 years. Consequently, 3- and 4-year lucerne-wheat rotations resulted in more variable wheat yields than wheat-wheat rotations in this region. The remaining challenge in using lucerne-wheat rotations is balancing the N accretion benefits with plant-available water deficits, which are most likely to occur in the highly variable rainfall conditions of this region.

Relevance:

30.00%

Abstract:

Tick infestation occurs over 1.3 × 10⁶ km² in northern Australia. It has been difficult to estimate the economic effects of ticks due to a lack of information on their effects on growth and reproduction (Anon 1975). 12th Biennial Conference. February 1978. Melbourne, Victoria

Relevance:

30.00%

Abstract:

Objective: To assess the value of s-methylmethionine sulphonium chloride (SMMSC) (200 mg/kg) on nutritional performance of pigs and as prevention or therapy for oesophagogastric ulcers. Design: Sixty pigs from a high health status herd with continuing oesophagogastric ulcer problems were endoscopically assessed for the presence or absence of oesophagogastric ulcers. Forty-eight pigs were then selected and allocated according to an initial oesophagogastric epithelial (ulcer score) classification to replicated treatment groups in a 2 × 2 factorial design. Weight gain and feed intake were measured over 49 d, after which pigs were killed and stomachs were collected, re-examined and scored for oesophagogastric ulceration. Results: There was no difference over the 49 d in weight gain, feed intake and backfat between pigs with and without SMMSC supplementation, or between pigs with or without fully developed oesophagogastric ulcers at the start of the study. In pigs with an initially low ulcer score, feeding SMMSC did not prevent further oesophagogastric ulcer development. No significant effect of SMMSC was apparent when final mean oesophagogastric ulcer scores were compared in pigs with an existing high ulcer score. However, further analysis of the changes in individual pig oesophagogastric ulcer scores during the experiment showed that the observed reduction in scores of the high ulcer group was significantly different from all other groups. Conclusion: This study has indicated that supplementation of pig diets with SMMSC cannot be justified unless the slight ulcer score improvement observed could be translated to some commercial production advantage such as a reduction in pig mortalities due to oesophagogastric ulcers. This study has further confirmed the benefit of endoscopy as a tool to enable objective assessment of oesophagogastric health.

Relevance:

30.00%

Abstract:

Quantifying the local crop response to irrigation is important for establishing adequate irrigation management strategies. This study evaluated the effect of irrigation applied with subsurface drip irrigation on field corn (Zea mays L.) evapotranspiration (ETc), yield, water use efficiencies (WUE = yield/ETc, and IWUE = yield/irrigation), and dry matter production in the semiarid climate of west central Nebraska. Eight treatments were imposed with irrigation amounts ranging from 53 to 356 mm in 2005 and from 22 to 226 mm in 2006. A soil water balance approach (based on FAO-56) was used to estimate daily soil water and ETc. Treatments resulted in seasonal ETc of 580-663 mm and 466-656 mm in 2005 and 2006, respectively. Yields among treatments differed by as much as 22% in 2005 and 52% in 2006. In both seasons, irrigation significantly affected yields, which increased with irrigation up to a point where irrigation became excessive. Distinct relationships were obtained each season. Yields increased linearly with seasonal ETc (R² = 0.89) and ETc/ETp (R² = 0.87) (ETp = ETc with no water stress). The yield response factor (ky), which indicates the relative reduction in yield to relative reduction in ETc, averaged 1.58 over the two seasons. WUE increased non-linearly with seasonal ETc and with yield. WUE was more sensitive to irrigation during the drier 2006 season, compared with 2005. Both seasons, IWUE decreased sharply with irrigation. Irrigation significantly affected dry matter production and partitioning into the different plant components (grain, cob, and stover). On average, the grain accounted for the majority of the above-ground plant dry mass (≈59%), followed by the stover (≈33%) and the cob (≈8%). The dry mass of the plant and that of each plant component tended to increase with seasonal ETc. The good relationships obtained in the study between crop performance indicators and seasonal ETc demonstrate that accurate estimates of ETc on a daily and seasonal basis can be valuable for making tactical in-season irrigation management decisions and for strategic irrigation planning and management.
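
The yield response factor reported here follows the standard FAO-33 relationship between relative yield loss and relative evapotranspiration deficit. As a worked illustration (the formula below is the standard FAO form, not quoted from the paper), the two-season mean ky = 1.58 implies that a 10% shortfall in seasonal ETc costs roughly 16% of yield:

```latex
% FAO-33 yield response to water (standard form)
1 - \frac{Y_a}{Y_p} = k_y \left( 1 - \frac{ET_c}{ET_p} \right)
% Worked example with k_y = 1.58 and ET_c / ET_p = 0.90:
% 1 - Y_a / Y_p = 1.58 \times 0.10 = 0.158, i.e. about a 15.8% yield loss.
```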

Relevance:

30.00%

Abstract:

The effect of fungal endophyte (Neotyphodium lolii) infection on the performance of perennial ryegrass (Lolium perenne) growing under irrigation in a subtropical environment was investigated. Seed of 4 cultivars, infected with standard (common toxic or wild-type) endophyte or the novel endophyte AR1, or free of endophyte (Nil), was sown in pure swards, which were fertilised with 50 kg N/ha per month. Seasonal and total yield, persistence, and rust susceptibility were assessed over 3 years, along with details of the presence of endophyte and alkaloids in plant shoots. Endophyte occurrence in tillers in both the standard and AR1 treatments was above 95% for Bronsyn and Impact throughout and rose to that level in Samson by the end of the second year. Meridian AR1 only reached 93% while, in the standard treatment, the endophyte had mostly died before sowing. Nil endophyte treatments carried an average of <0.6% infection throughout. Infection of the standard endophyte was associated with increased dry matter (DM) yields in all 3 years compared with no endophyte. AR1 also significantly increased yields in the second and third years. Over the full 3 years, standard and AR1 increased yields by 18% and 11%, respectively. Infection with both endophytes was associated with increased yields in all 4 seasons, the effects increasing in intensity over time. There was 27% better persistence in standard infected plants compared with Nil at the end of the first year, increasing to 198% by the end of the experiment, while for AR1 the improvements were 20 and 134%, respectively. The effect of endophyte on crown rust (Puccinia coronata) infection was inconsistent, with endophyte increasing rust damage on one occasion and reducing it on another. Cultivar differences in rust infection were greater than endophyte effects. Plants infected with the AR1 endophyte had no detectable ergovaline or lolitrem B in leaf, pseudostem, or dead tissue. In standard infected plants, ergovaline and lolitrem B were highest in pseudostem and considerably lower in leaf. Dead tissue had very low or no detectable ergovaline but high lolitrem B concentrations. Peramine concentration was high and at similar levels in leaf and pseudostem, but not detectable in dead material. Concentration was similar in both AR1 and standard infected plants. Endophyte presence appeared to have a similar effect in the subtropics as has been demonstrated in temperate areas, in terms of improving yields and persistence and increasing tolerance of plants to stress factors.

Relevance:

30.00%

Abstract:

Diets containing 3% sorghum ergot (16 mg alkaloids/kg, including 14 mg dihydroergosine/kg) were fed to 12 sows from 14 days post-farrowing until weaning 14 days later, and their performance was compared with that of 10 control sows. Ergot-fed sows displayed a smaller weight loss during lactation of 24 kg/head vs. 29 kg/head in control sows (p > 0.05) despite feed consumption being less (61 kg/head total feed intake vs. 73 kg/head by control sows; p < 0.05). Ergot-fed sows had poorer weight gain of litters over the 14-day period (16.6 kg/litter vs. 28.3 kg/litter for controls; p < 0.05) despite an increase in consumption of creep feed by the piglets from the ergot-fed sows (1.9 kg/litter compared with 1.1 kg/litter by the control; p > 0.05). Sow plasma prolactin was reduced with ergot feeding after 7 days to 4.8 μg/l compared with 15.1 μg/l in the control sows (p < 0.01) and then at weaning was 4.9 μg/l compared with 8.0 μg/l (p < 0.01) in the control sows. Two sows fed ergot ceased lactation early, and the above sow feed intakes, body weight losses with litter weight gains and creep consumption indirectly indicate an ergot effect on milk production.

Relevance:

30.00%

Abstract:

The effect of defoliation on Amarillo (Arachis pintoi cv. Amarillo) was studied in a glasshouse and in mixed swards with 2 tropical grasses. In the glasshouse, Amarillo plants grown in pots were subjected to a 30/20°C or 25/15°C temperature regime and to defoliation at 10-, 20- or 30-day intervals for 60 days. Two field plot studies were conducted on Amarillo with either irrigated kikuyu (Pennisetum clandestinum) in autumn and spring or dryland Pioneer rhodes grass (Chloris gayana) over summer and autumn. Treatments imposed were 3 defoliation intervals (7, 14 and 28 days) and 2 residual heights (5 and 10 cm for kikuyu; 3 and 10 cm for rhodes grass) with extra treatments (56 days to 3 cm for both grasses and 21 days to 5 cm for kikuyu). Defoliation interval had no significant effect on accumulated Amarillo leaf dry matter (DM) at either temperature regime. At the higher temperature, frequent defoliation reduced root dry weight (DW) and increased crude protein (CP) but had no effect on stolon DW or in vitro organic matter digestibility (OMD). On the other hand, at the lower temperature, frequent defoliation reduced stolon DW and increased OMD but had no effect on root DW or CP. Irrespective of temperature and defoliation, water-soluble carbohydrate levels were higher in stolons than in roots (4.70 vs 3.65%), whereas for starch the reverse occurred (5.37 vs 9.44%). Defoliating the Amarillo-kikuyu sward once at 56 days to 3 cm produced the highest DM yield in autumn and spring (582 and 7121 kg/ha DM, respectively), although the Amarillo component and OMD were substantially reduced. Highest DM yields (1726 kg/ha) were also achieved in the Amarillo-rhodes grass sward when defoliated every 56 days to 3 cm, although the Amarillo component was unaffected. In a mixed sward with either kikuyu or rhodes grass, the Amarillo component in the sward was maintained up to a 28-day defoliation interval and was higher when more severely defoliated. The results show that Amarillo can tolerate frequent defoliation and that it can co-exist with tropical grasses of differing growth habits, provided the Amarillo-tropical grass sward is subject to frequent and severe defoliation.

Relevance:

30.00%

Abstract:

Rainfall variability is a challenge to sustainable and profitable cattle production in northern Australia. Strategies recommended to manage for rainfall variability, like light or variable stocking, are not widely adopted. This is due partly to the perception that sustainability and profitability are incompatible. A large, long-term grazing trial was initiated in 1997 in north Queensland, Australia, to test the effect of different grazing strategies on cattle production. These strategies are: (i) constant light stocking (LSR) at long-term carrying capacity (LTCC); (ii) constant heavy stocking (HSR) at twice LTCC; (iii) rotational wet-season spelling (R/Spell) at 1.5 LTCC; (iv) variable stocking (VAR), with stocking rates adjusted in May based on available pasture; and (v) a Southern Oscillation Index (SOI) variable strategy, with stocking rates adjusted in November, based on available pasture and SOI seasonal forecasts. Animal performance varied markedly over the 10 years for which data are presented, due to pronounced differences in rainfall and pasture availability. Nonetheless, lighter stocking at or about LTCC consistently gave the best individual liveweight gain (LWG), condition score and skeletal growth; mean LWG per annum was thus highest in the LSR (113 kg), intermediate in the R/Spell (104 kg) and lowest in the HSR (86 kg). Mean LWG was 106 kg in the VAR and 103 kg in the SOI but, in all years, the relative performance of these strategies was dependent upon the stocking rate applied. After 2 years on the trial, steers from lightly stocked strategies were 60-100 kg heavier and received appreciable carcass price premiums at the meatworks compared to those under heavy stocking. In contrast, LWG per unit area was greatest at stocking rates of about twice LTCC; mean LWG/ha was thus greatest in the HSR (21 kg/ha), but this strategy required drought feeding in four of the 10 years and was unsustainable. Although LWG/ha was lower in the LSR (mean 14 kg/ha), or in strategies that reduced stocking rates in dry years like the VAR (mean 18 kg/ha) and SOI (mean 17 kg/ha), these strategies did not require drought feeding and appeared sustainable. The R/Spell strategy (mean 16 kg/ha) was compromised by an ill-timed fire, but also performed satisfactorily. The present results provide important evidence challenging the assumption that sustainable management in a variable environment is unprofitable. Further research is required to fully quantify the long-term effects of these strategies on land condition and profitability and to extrapolate the results to breeder performance at the property level.
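
The per-head versus per-hectare trade-off in these results is simple arithmetic: gain per hectare is gain per head times stocking density, so heavy stocking can win per hectare while losing per head. The sketch below works through that using only the mean figures quoted above; the implied densities are back-calculated for illustration (the abstract does not state absolute head/ha).

```python
# Sketch: per-head vs per-hectare liveweight gain (LWG) trade-off,
# using the trial's reported means. Implied stocking density is
# back-calculated as (kg/ha) / (kg/head) for illustration only.

strategies = {
    #          LWG/head (kg), LWG/ha (kg)
    "LSR":     (113, 14),
    "HSR":     (86, 21),
    "VAR":     (106, 18),
    "SOI":     (103, 17),
    "R/Spell": (104, 16),
}

for name, (per_head, per_ha) in strategies.items():
    implied_density = per_ha / per_head  # head/ha (kg/ha divided by kg/head)
    print(f"{name:8s} {per_head:4d} kg/head  {per_ha:3d} kg/ha  "
          f"~{implied_density:.2f} head/ha implied")

# HSR's implied density (~0.24 head/ha) is roughly double LSR's (~0.12),
# matching the stated 2x LTCC design - but it gave up 27 kg/head and
# needed drought feeding in four of the ten years.
```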

Relevance:

30.00%

Abstract:

Water regulations have decreased irrigation water supplies in Nebraska and some other areas of the USA Great Plains. When available water is not enough to meet crop water requirements during the entire growing cycle, it becomes critical to know the proper irrigation timing that would maximize yields and profits. This study evaluated the effect of timing of a deficit-irrigation allocation (150 mm) on crop evapotranspiration (ETc), yield, water use efficiency (WUE = yield/ETc), irrigation water use efficiency (IWUE = yield/irrigation), and dry mass (DM) of corn (Zea mays L.) irrigated with subsurface drip irrigation in the semiarid climate of North Platte, NE. During 2005 and 2006, a total of sixteen irrigation treatments (eight each year) were evaluated, which received different percentages of the water allocation during July, August, and September. During both years, all treatments resulted in no crop stress during the vegetative period and stress during the reproductive stages, which affected ETc, DM, yield, WUE and IWUE. Among treatments, ETc varied by 7.2 and 18.8%; yield by 17 and 33%; WUE by 12 and 22%, and IWUE by 18 and 33% in 2005 and 2006, respectively. Yield and WUE both increased linearly with ETc and with ETc/ETp (ETp = seasonal ETc with no water stress), and WUE increased linearly with yield. The yield response factor (ky) averaged 1.50 over the two seasons. Irrigation timing affected the DM of the plant, grain, and cob, but not that of the stover. It also affected the percent of DM partitioned to the grain (harvest index), which increased linearly with ETc and averaged 56.2% over the two seasons, but did not affect the percent allocated to the cob or stover. Irrigation applied in July had the highest positive coefficient of determination (R²) with yield. This high positive correlation decreased considerably for irrigation applied in August, and became negative for irrigation applied in September. The best positive correlation between the soil water deficit factor (Ks) and yield occurred during weeks 12-14 from crop emergence, during the "milk" and "dough" growth stages. Yield was poorly correlated to stress during weeks 15 and 16, and the correlation became negative after week 17. Dividing the 150 mm allocation about evenly among July, August and September was a good strategy resulting in the highest yields in 2005, but not in 2006. Applying a larger proportion of the allocation in July was a good strategy during both years, and the opposite resulted when applying a large proportion of the allocation in September. The different results obtained between years indicate that flexible irrigation scheduling techniques should be adopted, rather than relying on fixed timing strategies.
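
The harvest index referenced here is the grain share of above-ground dry mass (grain + cob + stover). Stated as a formula (the standard definition, not quoted from the paper), the reported two-season mean of 56.2% means a crop producing, say, 20 t/ha of above-ground dry mass would allocate about 11.2 t/ha to grain:

```latex
% Harvest index: grain fraction of above-ground dry mass
HI = \frac{DM_{\mathrm{grain}}}{DM_{\mathrm{grain}} + DM_{\mathrm{cob}} + DM_{\mathrm{stover}}}
% Reported two-season mean: HI \approx 0.562
% Example: 0.562 \times 20\ \mathrm{t/ha} \approx 11.2\ \mathrm{t/ha} of grain.
```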

Relevance:

30.00%

Abstract:

The effect of ergot (Claviceps africana) in naturally infected sorghum was assessed in feedlot rations. Thirty-two Hereford steers (Bos taurus) in individual pens with access to shade were adapted to feedlot conditions and then offered one of four rations containing 0, 4.4, 8.8 or 17.6 mg/kg of ergot alkaloids (84% dihydroergosine, 10% dihydroelymoclavine and 6% festuclavine), equivalent to ~0, 10, 20 or 40 g/kg ergot (sclerotia/sphacelia) in the rations. These rations were withdrawn at noon on the second day because of severe hyperthermia and almost complete feed refusal in ergot-fed steers. After recovery on ergot-free rations for 5 days, treatment groups were incrementally introduced, over a further 3–12 days, to rations containing 0, 1.1, 2.2 or 4.4 mg/kg of alkaloids (~0, 2.5, 5 or 10 g/kg ergot, respectively). Relative exposure to ergot was maintained, so that the zero- (control), low-, medium- and high-ergot groups remained so. Steers were individually fed ad libitum, and water was freely available. Steers in all ergot-fed groups had significantly elevated rectal temperatures at 0800–1000 hours, even when the temperature–humidity index was only moderate (~70), and displayed other signs of hyperthermia (increased respiration rate, mouth breathing, excessive salivation and urination), as the temperature–humidity index increased to 73–79 during the day. Plasma prolactin was significantly reduced in ergot-fed groups. Voluntary feed intakes (liveweight basis) of the ergot-fed groups were significantly reduced, averaging 94, 86 and 86%, respectively, of the feed intakes of the control group. Hair coats were rough. While the control steers grew from a mean initial liveweight of 275 kg to a suitable slaughter weight of 455 kg in 17 weeks (growth rate 1.45 kg/day), ergot-fed groups gained only 0.77–1.10 kg/day and took at least 5 weeks longer to reach the slaughter weight, despite removal of ergot at the same time as control steers were sent to slaughter. Sorghum ergot, even at low concentrations (1.1 mg alkaloids/kg feed) is severely detrimental to the performance of steers in the feedlot.
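
The extra time on feed follows directly from the growth rates quoted: gaining 180 kg (275 kg to 455 kg) at 1.45 kg/day takes about 124 days, whereas at 0.77-1.10 kg/day it takes roughly 164-234 days. A minimal arithmetic sketch of that calculation (the 180 kg gain is derived from the stated start and slaughter weights):

```python
# Sketch: days on feed to reach slaughter weight at the average daily
# gains (ADG) quoted in the abstract (275 kg start, 455 kg slaughter).

start_kg, slaughter_kg = 275, 455
gain_needed = slaughter_kg - start_kg  # 180 kg

for label, adg in [("control", 1.45), ("ergot, best", 1.10), ("ergot, worst", 0.77)]:
    days = gain_needed / adg
    print(f"{label:12s} {adg:.2f} kg/day -> {days:5.0f} days (~{days / 7:.1f} weeks)")

# Control: ~124 days (~17.7 weeks). The best-performing ergot group needs
# ~164 days, i.e. ~5.6 weeks longer, consistent with the abstract's
# "at least 5 weeks longer"; the slowest group needs ~234 days.
```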