19 results for Emergence Traps
in eResearch Archive - Queensland Department of Agriculture
Abstract:
In south-eastern Queensland, Australia, sorghum planted in early spring usually escapes sorghum midge, Stenodiplosis sorghicola, attack. Experiments were conducted to better understand the role of winter diapause in the population dynamics of this pest. Emergence patterns of adult midge from diapausing larvae on the soil surface and at various depths were investigated during spring to autumn of 1987/88–1989/90. From 1987/88 to 1989/90, 89%, 65% and 98% of adult emergence, respectively, occurred during November and December. Adult emergence from larvae diapausing on the soil surface was severely reduced due to high mortality attributed to surface soil temperatures in excess of 40°C, with much of this mortality occurring between mid-September and mid-October. Emergence of adults from the soil surface was considerably delayed in the 1988/89 season compared with larvae buried at 5 or 10 cm which had similar emergence patterns for all three seasons. In 1989/90, when a 1-cm-deep treatment was included, there was a 392% increase in adult emergence from this treatment compared with deeper treatments. Some diapausing larvae on the surface did not emerge at the end of summer in only 1 year (1989/90), when 28.0% of the larvae on the surface remained in diapause, whereas only 0.8% of the buried larvae remained in diapause. We conclude that the pattern of emergence explains why spring plantings of sorghum in south-eastern Queensland usually escape sorghum midge attack.
Abstract:
Wheat is occasionally exposed to freezing temperatures during ear emergence and can suffer severe frost damage. Few studies have attempted to understand the characteristics of freezing and frost damage to wheat during late development stages. Wheat was shown to have an inherent frost resistance to temperatures down to −5 °C but was extensively damaged below this temperature. Acclimation, whilst increasing the frost resistance of winter wheat in a vegetative state, was incapable of increasing the frost resistance of plants at ear emergence. It is proposed that the ability to upregulate frost resistance is lost once the vernalisation requirement is fulfilled. Culms and ears of wheat were able to escape frost damage at temperatures below −5 °C by supercooling, even to as low as −15 °C, and evidence collected by infrared thermography suggested that individual culms on a plant froze as independent units, with little or no cross ice-nucleation. Strategies to protect wheat from frost damage in the field therefore appear to revolve around avoiding ice nucleation.
Abstract:
When recapturing satellite-collared wild dogs that had been trapped one month previously in padded foothold traps, we noticed varying degrees of pitting on the pads of the trapped paw. Veterinary advice, based on images taken of the injuries, suggests that the necrosis was caused by vascular compromise. Five of six dingoes we recaptured had varying degrees of necrosis restricted to the trapped foot, ranging from single 5 mm holes to 25% sections of the toe pads missing or deformed, including loss of nails. The traps used were rubber-padded, two-coiled Victor Soft Catch #3 traps. The springs were not standard Victor springs but Beefer springs; this modification slightly increases trap speed and the jaw pressure on the trapped foot. Despite this modification, the spring pressure is still relatively mild in comparison to conventional long-spring or four-coiled wild dog traps. The five wild dogs developing necrosis were trapped in November 2006 at 5-6 months of age. Traps were checked each morning, so the dogs were unlikely to have been restrained in the trap for more than 12 hours. All dogs exhibited a small degree of paw damage at capture, which presented as a swollen paw and compression at the capture point. In contrast, eight wild dogs, 7-8 months old, were captured two months later in February. Upon their release, on advice from a veterinarian, we massaged the trapped foot to get blood flow back into the foot and applied a bruise treatment (Heparinoid 8.33 mg/ml) to assist in restoring blood flow. These animals were subsequently recaptured several months later and showed no signs of necrosis. While post-capture foot injuries are unlikely to be an issue in conventional control programs where the animal is immediately destroyed, caution needs to be used when releasing accidentally captured domestic dogs or research animals captured in rubber-padded traps.
We have demonstrated that 7-8 month-old dogs can be trapped and released without any evidence of subsequent necrosis following minimal veterinary treatment. We suspect that the rubber padding on traps may increase the tourniquet effect by wrapping around the paw, and recommend the evaluation of offset laminated steel-jaw traps as an alternative. Offset laminated steel-jaw traps have been shown to be relatively humane, producing as few foot injuries as rubber-jawed traps.
Abstract:
Seed persistence is poorly quantified for invasive plants of subtropical and tropical environments, and Lantana camara, one of the world's worst weeds, is no exception. We investigated germination, seedling emergence, and seed survival of two lantana biotypes (pink and pink-edged red [PER]) in southeastern Queensland, Australia. Controlled experiments were undertaken in 2002 and repeated in 2004, with treatments comprising two differing environmental regimes (irrigated and natural rainfall) and sowing depths (0 and 2 cm). Seed survival and seedling emergence were significantly affected by all factors (time, biotype, environment, sowing depth, and cohort) (P < 0.001). Seed dormancy varied with treatment (environment, sowing depth, biotype, and cohort) (P < 0.001), but declined rapidly after 6 mo. Significant differential responses by the two biotypes to sowing depth and environment were detected for both seed survival and seedling emergence (P < 0.001). Seed mass was consistently lower in the PER biotype at the population level (P < 0.001), but this variation did not adequately explain the differential responses. Moreover, under natural rainfall the magnitude of the biotype effect was unlikely to result in ecologically significant differences. Seed survival after 36 mo under natural rainfall ranged from 6.8 to 21.3%. Best-fit regression analysis of the decline in seed survival over time yielded a five-parameter exponential decay model with a lower asymptote approaching −0.38 (% seed survival = [(55 − (−0.38)) · e^(k · t)] + (−0.38); R² = 88.5%; 9 df). Environmental conditions and burial affected the slope parameter (k value) significantly (P < 0.01). Seed survival projections from the model were greatest for buried seeds under natural rainfall (11 yr) and least under irrigation (3 yr). Experimental data and model projections suggest that lantana has a persistent seed bank, and this should be considered in management programs, particularly those aimed at eradication.
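The decay model quoted in the abstract can be sketched numerically. The intercept (55) and lower asymptote (−0.38) are reported above, but the fitted rate constant k is not; the `k` default and the function name below are hypothetical, for illustration only.

```python
import math

def seed_survival(t_months, y0=55.0, c=-0.38, k=-0.15):
    """Percent seed survival after t months, using the exponential decay
    form reported for the best-fit model:
        survival(t) = (y0 - c) * exp(k * t) + c
    y0 (initial survival, %) and c (lower asymptote) come from the
    abstract; k is NOT reported there, so this default is a hypothetical
    value chosen only to illustrate the curve's shape."""
    return (y0 - c) * math.exp(k * t_months) + c

print(round(seed_survival(0), 1))   # → 55.0 (initial survival)
print(round(seed_survival(36), 1))  # decays toward the asymptote by 36 mo
```

Since the abstract reports 6.8-21.3% survival at 36 months under natural rainfall, the k values actually fitted for those treatments would be smaller in magnitude than the illustrative default above.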
Abstract:
Sonchus oleraceus (common sowthistle) is a dominant weed that has increased in prevalence in conservation cropping systems of the subtropical grain region of Australia. Four experiments were undertaken to define the environmental factors that favor its germination, emergence, and seed persistence. Seeds were germinated at constant temperatures between 5 and 35°C and water potentials between 0 and -1.4 MPa. Maximum germination of 86-100% occurred at 0 and -0.2 MPa, irrespective of temperature, when seeds were exposed to light (12 h light/dark photoperiod), but germination was reduced by 72% without light. At water potentials of -0.6 to -0.8 MPa, germination was reduced substantially by higher temperatures; no seed germinated at water potentials of -1.0 MPa or lower. Emergence and seed persistence were measured over 30 months following seed burial at 0 (surface), 1, 2, 5, and 10 cm depths in large pots that were buried in a south-eastern Queensland field. Seedlings emerged readily from the surface and 1 cm depth, with no emergence from below 2 cm. Seedlings emerged in any season following rain but predominantly within 6 months of planting. Seed persistence was short-term on the soil surface, with 2% of seeds remaining after 6 months, but it increased with burial depth, with 12% remaining after 30 months at 10 cm. Thus, minimal seed burial under reduced tillage, together with increased surface soil water from stubble retention, has favored the proliferation of this weed in any season in this subtropical environment. However, diligent management that prevents seed replenishment will greatly reduce this weed problem within a short period.
Abstract:
A better understanding of the seed-bank dynamics of Echinochloa colona, Urochloa panicoides and Hibiscus trionum, major crop weeds in sub-tropical Australia, was needed to improve weed control. Emergence patterns and seed persistence were investigated, with viable seeds sown at different depths in large in-ground pots. Seedlings of all species emerged between October and March, when mean soil temperatures were 21-23°C. However, E. colona emerged as a series of flushes predominantly in the first year, with most seedlings emerging from 0-2 cm. Urochloa panicoides emerged mostly as a single large flush in the first two years, with most seedlings emerging from 5 cm. Hibiscus trionum emerged as a series of flushes over three seasons, initially with the majority from 5 cm and then from 0-2 cm in the later seasons. Longevity of the grass seeds was short, with <5% remaining after burial at 0-2 cm for 24 months. In contrast, 38% of H. trionum seeds remained viable after the same period. Persistence of all species increased significantly with burial depth. These data highlight that management strategies need to be tailored for each species, particularly relating to the need for monitoring, application times for control tactics, the impact of tillage, and the time needed to reduce the seed-bank to low numbers.
Abstract:
Post head-emergence frost causes substantial losses for Australian barley producers. Varieties with improved resistance would have a significant positive impact on Australian cropping enterprises. Five barley genotypes previously tested for reproductive frost resistance in southern Australia were tested, post head-emergence, in the northern grain region of Australia and compared with the typical northern control cultivars, Gilbert and Kaputar. All tested genotypes suffered severe damage to whole heads and stems at plant minimum temperatures below -8°C. In 2003, 2004 and 2005, frost events reaching a plant minimum temperature of ~-6.5°C did not result in the complete loss of grain yield; rather, partial seed set was observed. The control genotype, Gilbert, exhibited seed set greater than or equal to that of any genotype in each year, as did Kaputar when tested in 2005. Thus, Gilbert and Kaputar were at least as resistant as any tested genotype. This contrasts with trial results from the southern grain region, where Gilbert was reported to be less resistant than Franklin, Amagi Nijo and Haruna Nijo. Hence, rankings for post head-emergence frost damage in the northern grain region differ from those previously reported. These results indicate that Franklin, Amagi Nijo and Haruna Nijo are unlikely to provide useful sources of frost resistance, or markers, to develop improved varieties for the northern grain region of Australia.
Abstract:
This study was initiated in response to a scarcity of data on the efficiency, selectivity and discard mortality of baited traps targeting Scylla serrata. Five replicates of four traps, including "hoop nets", rigid "wire pots", and collapsible "round" and "rectangular" pots, were deployed for 3, 6 and 24 h in two Australian estuaries. Trapped S. serrata were "discarded" into cages and monitored with controls over 3 d. All S. serrata were assessed for damage, while subsets of immediately caught and monitored individuals had haemolymph constituents quantified as stress indices. All traps retained similar-sized (8.1-19.1 cm carapace width) S. serrata, with catches positively correlated to deployment duration. Round pots were the most efficient for S. serrata and for fish, mostly Acanthopagrus australis (3% mortality). Hoop nets were the least efficient and were often damaged. No S. serrata died, but 18 were wounded (biased towards hoop nets), typically involving a missing swimmeret. Physiological responses were mild and mostly affected by biological factors. The results validate discarding unwanted S. serrata for controlling exploitation, but larger mesh sizes or escape vents in pots, and restrictions on hoop nets, would minimise unnecessary catches, pollution and ghost fishing. © 2012 International Council for the Exploration of the Sea. Published by Oxford University Press. All rights reserved.
Abstract:
Spontaneous sequence changes and the selection of beneficial mutations are driving forces of gene diversification and key factors of evolution. In highly dynamic co-evolutionary processes such as plant-pathogen interactions, the plant's ability to rapidly adapt to newly emerging pathogens is paramount. The hexaploid wheat gene Lr34, which encodes an ATP-binding cassette (ABC) transporter, confers durable field resistance against four fungal diseases. Despite its extensive use in breeding and agriculture, no increase in virulence towards Lr34 has been described over the last century. The wheat gene pool contains two predominant Lr34 alleles, of which only one confers disease resistance. The two alleles, located on chromosome 7DS, differ by only two exon polymorphisms. Putatively functional homoeologs and orthologs of Lr34 are found on the B-genome of wheat and in rice and sorghum, but not in maize, barley and Brachypodium. In this study we present a detailed haplotype analysis of homoeologous and orthologous Lr34 genes in genetically and geographically diverse selections of wheat, rice and sorghum accessions. We found that the resistant Lr34 haplotype is unique to the wheat D-genome and is not found in the B-genome of wheat or in rice and sorghum. Furthermore, we found only the susceptible Lr34 allele in a set of 252 genotypes of Ae. tauschii, the progenitor of the wheat D-genome. These data provide compelling evidence that the Lr34 multi-pathogen resistance is the result of recent gene diversification occurring after the formation of hexaploid wheat about 8,000 years ago.
Abstract:
Cereal crops can suffer substantial damage if frosts occur at heading. Identification of post-head-emergence frost (PHEF) resistance in cereals poses a number of unique and difficult challenges. Many decades of research have failed to identify genotypes with PHEF resistance that could offer economically significant benefit to growers. Research and breeding gains have been limited by the available screening systems. Using traditional frost screening systems, genotypes that escape frost injury in trials due to spatial temperature differences and/or small differences in phenology can be misidentified as resistant. We believe that by improving techniques to minimize frost escapes, such 'false-positive' results can be confidently identified and eliminated. Artificial freezing chambers or manipulated natural frost treatments offer many potential advantages but are not yet at the stage where they can be reliably used for frost screening in breeding programmes. Here we describe the development of a novel photoperiod gradient method (PGM) that facilitates screening of genotypes of different phenology under natural field frosts at matched developmental stages. By identifying frost escapes and increasing the efficiency of field screening, the PGM ensures that research effort can be focused on finding genotypes with improved PHEF resistance. To maximize the likelihood of identifying PHEF resistance, we propose that the PGM form part of an integrated strategy to (i) source germplasm; (ii) facilitate high-throughput screening; and (iii) permit detailed validation. The PGM may also be useful in other studies where a range of developmental stages and/or synchronized development is desired.
Abstract:
Emerging zoonoses threaten global health, yet the processes by which they emerge are complex and poorly understood. Nipah virus (NiV) is an important threat owing to its broad host and geographical range, high case fatality, potential for human-to-human transmission and lack of effective prevention or therapies. Here, we investigate the origin of the first identified outbreak of NiV encephalitis in Malaysia and Singapore. We analyse data on livestock production from the index site (a commercial pig farm in Malaysia) prior to and during the outbreak, on Malaysian agricultural production, and from surveys of NiV's wildlife reservoir (flying foxes). Our analyses suggest that repeated introduction of NiV from wildlife changed infection dynamics in pigs. Initial viral introduction produced an explosive epizootic that drove itself to extinction but primed the population for enzootic persistence upon reintroduction of the virus. The resultant within-farm persistence permitted regional spread and increased the number of human infections. This study refutes an earlier hypothesis that anomalous El Niño Southern Oscillation-related climatic conditions drove emergence, and suggests that priming for persistence drove the emergence of a novel zoonotic pathogen. Thus, we provide empirical evidence for a causative mechanism previously proposed as a precursor to widespread infection with H5N1 avian influenza and other emerging pathogens.
Abstract:
Effective arbovirus surveillance is essential to ensure the implementation of control strategies, such as mosquito suppression, vaccination, or dissemination of public warnings. Traditional strategies employed for arbovirus surveillance, such as detection of virus or virus-specific antibodies in sentinel animals, or detection of virus in hematophagous arthropods, have limitations as an early-warning system. A system was recently developed that involves collecting mosquitoes in CO2-baited traps, where the insects expectorate virus on sugar-baited nucleic acid preservation cards. The cards are then submitted for virus detection using molecular assays. We report the application of this system for detecting flaviviruses and alphaviruses in wild mosquito populations in northern Australia. This study was the first to employ nonpowered passive box traps (PBTs) that were designed to house cards baited with honey as the sugar source. Overall, 20/144 (13.9%) of PBTs from different weeks contained at least one virus-positive card. West Nile virus Kunjin subtype (WNVKUN), Ross River virus (RRV), and Barmah Forest virus (BFV) were detected, being identified in 13/20, 5/20, and 2/20 of positive PBTs, respectively. Importantly, sentinel chickens deployed to detect flavivirus activity did not seroconvert at two Northern Territory sites where four PBTs yielded WNVKUN. Sufficient WNVKUN and RRV RNA was expectorated onto some of the honey-soaked cards to provide a template for gene sequencing, enhancing the utility of the sugar-bait surveillance system for investigating the ecology, emergence, and movement of arboviruses. © 2014, Mary Ann Liebert, Inc.
Abstract:
In Sudan, Chickpea chlorotic dwarf virus (CpCDV, genus Mastrevirus, family Geminiviridae) is an important pathogen of pulses that are grown both for local consumption and for export. Although a few studies have characterised CpCDV genomes from countries in the Middle East, Africa and the Indian subcontinent, little is known about CpCDV diversity in any of the major chickpea production areas in these regions. Here we analyse the diversity of 146 CpCDV isolates characterised from pulses collected across the chickpea-growing regions of Sudan. Although we find that seven of the twelve known CpCDV strains are present within the country, strain CpCDV-H alone accounted for ∼73% of the infections analysed. Additionally, we identified four new strains (CpCDV-M, -N, -O and -P) and show that recombination has played a significant role in the diversification of CpCDV, at least in this region. Accounting for observed recombination events, we use the large amount of data generated here to compare patterns of natural selection within protein-coding regions of CpCDV and other dicot-infecting mastrevirus species.
Abstract:
There is limited understanding of how insect movement patterns are influenced by landscape features, and how landscapes can be managed to suppress pest phytophage populations in crops. Theory suggests that the relative timing of pest and natural enemy arrival in crops may influence pest suppression; however, there is a lack of data to substantiate this claim. We investigate the movement patterns of insects from native vegetation (NV) and discuss the implications of these patterns for pest control services. Using bi-directional interception traps, we quantified the number of insects crossing an NV/crop ecotone relative to a control crop/crop interface in two agricultural regions early in the growing season. We used these data to infer patterns of movement and net flux. At the community level, insect movement patterns were influenced by ecotone in two of three year-by-region combinations. At the functional-group level, pests and parasitoids showed similar movement patterns from NV very soon after crop emergence; however, movement across the control interface increased towards the end of the early-season sampling period. Predators consistently moved more often from NV into crops than vice versa, even after crop emergence. Not all species showed a significant response to ecotone; however, when a response was detected, these species showed similar patterns between the two regions. Our results highlight the importance of NV for the recruitment of natural enemies for early-season immigration into crops, which may be important for pest suppression. However, NV was also associated with crop immigration by some pest species. Hence, NV offers both opportunities and risks for pest management. The development of targeted NV management may reduce the risk of crop immigration by pests, but not by natural enemies.
Abstract:
Reducing crop row spacing and delaying the time of weed emergence may give crops a competitive edge over weeds. Field experiments were conducted to evaluate the effects of crop row spacing (11, 15, and 23 cm) and weed emergence time (0, 20, 35, 45, 55, and 60 days after wheat emergence; DAWE) on Galium aparine and Lepidium sativum growth and wheat yield losses. Season-long weed-free and crop-free treatments were also established to provide baselines for wheat yield and weed growth, respectively. Row spacing and weed emergence time significantly affected the growth of both weed species and wheat grain yield. For both weed species, maximum plant height, shoot biomass, and seed production were observed in the crop-free plots, and delayed emergence decreased these variables. In weed-crop competition plots, maximum weed growth was observed when weeds emerged simultaneously with the crop in rows spaced 23 cm apart. Both weed species grew less in narrow rows (11 cm) than in wider rows (15 and 23 cm), and produced fewer than 5 seeds plant-1 in 11-cm wheat rows when they emerged at 60 DAWE. The presence of weeds in the crop, especially at early stages, was devastating for wheat yield; accordingly, maximum grain yield (4.91 t ha-1) was recorded in the weed-free treatment at 11 cm row spacing. Delaying weed emergence and narrowing row spacing reduced weed growth and seed production and enhanced wheat grain yield, suggesting that these strategies could contribute to weed management in wheat.