14 results for High Risk

in eResearch Archive - Queensland Department of Agriculture


Relevance: 100.00%

Abstract:

We used an established seagrass monitoring programme to examine the short- and longer-term impacts of an oil spill event on intertidal seagrass meadows. Results for potentially impacted seagrass areas were compared with existing monitoring data and with control seagrass meadows located outside the oil spill area. Seagrass meadows were not significantly affected by the oil spill. Declines in seagrass biomass and area 1 month post-spill were consistent between control and impact meadows, and eight months post-spill seagrass density and area had increased to within historical ranges. The declines in seagrass meadows were likely attributable to natural seasonal variation and a combination of climatic and anthropogenic impacts. The lack of impact from the oil spill was due to several mitigating factors rather than a lack of toxic effects on seagrasses. The study demonstrates the value of long-term monitoring of critical habitats in high-risk areas to effectively assess impacts.
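
To make the comparison concrete, the sketch below illustrates one simple way an impact-versus-control contrast of post-spill biomass change could be tested; the meadow values, group sizes and choice of a Welch t-test are illustrative assumptions, not the monitoring programme's actual data or analysis.

```python
# Minimal sketch of a control-versus-impact comparison of seagrass biomass change.
# The meadow values below are hypothetical; the real programme's design and
# statistical tests are not described in the abstract.
import numpy as np
from scipy import stats

# Percentage change in biomass, 1 month post-spill, per monitored meadow
control_change = np.array([-22.0, -18.5, -25.1, -20.3])  # meadows outside the spill area
impact_change = np.array([-24.2, -19.8, -21.7, -23.5])   # meadows inside the spill area

# If declines are consistent between groups, the spill adds no detectable effect
t_stat, p_value = stats.ttest_ind(control_change, impact_change, equal_var=False)
print(f"Welch t-test: t = {t_stat:.2f}, p = {p_value:.3f}")
```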

Relevance: 70.00%

Abstract:

Given the limited resources available for weed management, a strategic approach is required to get the best return from every dollar spent. The current study incorporates: (1) a model ensemble approach to identify areas of uncertainty and commonality regarding a species' invasive potential, (2) the current distribution of the invading species, and (3) connectivity of systems, to identify target regions and focus efforts for more effective management. Uncertainty in the prediction of suitable habitat for H. amplexicaulis (the study species) in Australia was addressed with an ensemble-forecasting approach comparing distributional scenarios from four models (CLIMATCH; CLIMEX; boosted regression trees [BRT]; maximum entropy [Maxent]). Models were built using subsets of occurrence and environmental data. Catchment risk was determined by incorporating habitat suitability, the current abundance and distribution of H. amplexicaulis, and catchment connectivity. Our results indicate geographic differences between the predictions of the different approaches. Despite these differences, a number of catchments in northern, central, and southern Australia were identified as being at high risk of invasion or further spread by all models, suggesting they should be given priority for the management of H. amplexicaulis. The study also highlighted the utility of ensemble approaches in identifying areas of uncertainty and commonality regarding a species' invasive potential.
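
As a rough illustration of the catchment-prioritisation logic described above, the sketch below combines hypothetical habitat-suitability scores from four models with occupancy and connectivity into a single risk score; the threshold, weights and all values are assumptions, since the abstract does not specify how the study weighted these components.

```python
# Sketch of combining habitat-suitability predictions from several models into a
# per-catchment risk score. Model outputs, thresholds and weights are hypothetical;
# the abstract does not specify how CLIMATCH, CLIMEX, BRT and Maxent were combined.
import numpy as np

catchments = ["A", "B", "C"]
# Rows: catchments; columns: suitability (0-1) from CLIMATCH, CLIMEX, BRT, Maxent
suitability = np.array([
    [0.82, 0.75, 0.90, 0.88],
    [0.40, 0.65, 0.35, 0.30],
    [0.15, 0.20, 0.10, 0.05],
])
occupied = np.array([1, 0, 0])            # current presence of H. amplexicaulis
connectivity = np.array([0.9, 0.6, 0.2])  # connection to occupied catchments

threshold = 0.5
agreement = (suitability >= threshold).mean(axis=1)  # fraction of models calling it suitable
risk = agreement * 0.5 + occupied * 0.3 + connectivity * 0.2

for name, a, r in zip(catchments, agreement, risk):
    print(f"catchment {name}: model agreement {a:.2f}, combined risk {r:.2f}")
```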

Relevance: 60.00%

Abstract:

Hazard site surveillance is a system for post-border detection of new pest incursions, targeting sites considered potentially at high risk of such introductions. Globalisation, increased volumes of containerised freight and competition for space at domestic ports mean that goods are increasingly being first opened at premises some distance from the port of entry, thus dispersing risk away from the main inspection point. Hazard site surveillance acts as a backstop to border control, ensuring that new incursions are detected early enough for the full range of management options, including eradication and containment, to be considered. This is particularly important for some of the more cryptic forest pests, whose presence in a forest often is not discovered until populations are already high and the pest is well established. General requirements for a hazard site surveillance program are discussed using a program developed in Brisbane, Australia, in 2006 as a case study, and some early results from the Brisbane program are presented. In total, 67 species and 5757 individuals of wood-boring beetles have been trapped and identified during the program to date. Scolytines are the most abundant group, making up 83% of the catch. No new exotics have been trapped, but 19 of the species and 60% of all specimens caught are exotics already established in Australia.

Relevance: 60.00%

Abstract:

The introduction of glyphosate-tolerant cotton has significantly improved the flexibility and management of a number of problem weeds in cotton systems. However, reliance on glyphosate poses risks to the industry in terms of glyphosate resistance and species shift. The aims of this project were to identify these risks and to determine strategies to prevent and mitigate the potential for resistance evolution. Field surveys identified fleabane as now the most common weed in both irrigated and dryland systems. Sowthistle has also increased in prevalence, and bladder ketmia and peachvine remained common. The continued reliance on glyphosate has favoured small-seeded and glyphosate-tolerant species; fleabane is both, with populations confirmed resistant in grains systems in Queensland and NSW. When species were assessed for their resistance risk, fleabane, liverseed grass, feathertop Rhodes grass, sowthistle and barnyard grass were determined to have high risk ratings. Management practices in summer fallows and in dryland glyphosate-tolerant and conventional cotton were also found to rely heavily on glyphosate and therefore to be high risk. Situations where these high-risk species are present in high-risk cropping phases need particular attention. The confirmation of a glyphosate-resistant barnyard grass population in a dryland glyphosate-tolerant cotton system means resistance is now a reality for the cotton industry. Experiments have shown that resistant populations can be managed with other herbicide options currently available. However, the options for fleabane management in cotton are still limited: although some selective residual herbicides are showing promise, the majority of fleabane control tactics can only be used in other phases of the cotton rotation. An online glyphosate resistance tool has been developed. This tool allows growers to assess their individual glyphosate resistance risks and to see how they can adjust their practices to reduce those risks. It also provides researchers with current information on the weed species present and the practices used across the industry, and will be extremely useful in tailoring future research and extension efforts. Simulations from the expanded glyphosate resistance model have shown that glyphosate resistance can be prevented and managed in glyphosate-tolerant cotton farming systems, although successful strategies require some effort. The simulations highlight the importance of controlling survivors of glyphosate applications, using effective glyphosate alternatives in fallows, and combining several effective glyphosate alternatives in crop; these are the keys to the prevention and management of glyphosate resistance.
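
The following is a minimal sketch of the kind of resistance-evolution simulation referred to above, showing why controlling glyphosate survivors with combined alternative tactics limits the build-up of resistant individuals; all parameters (kill rates, seed production, initial resistance frequency) are illustrative assumptions and not values from the expanded glyphosate resistance model.

```python
# Illustrative simulation of resistance build-up in a weed seedbank under repeated
# glyphosate use, with and without effective alternative tactics applied to survivors.
# All parameters are assumptions for illustration, not values from the cotton model.
# There is no density dependence, so absolute seedbank numbers are only indicative.

def simulate(years, control_survivors, seedbank=10_000.0, resistant_frac=1e-4):
    glyphosate_kill_susceptible = 0.95  # glyphosate controls susceptible plants well
    glyphosate_kill_resistant = 0.05    # but barely affects resistant plants
    combined_alternative_kill = 0.99    # several alternative tactics combined on survivors
    seeds_per_survivor = 50
    for _ in range(years):
        resistant = seedbank * resistant_frac
        susceptible = seedbank - resistant
        surv_s = susceptible * (1 - glyphosate_kill_susceptible)
        surv_r = resistant * (1 - glyphosate_kill_resistant)
        if control_survivors:           # the key step: don't let survivors set seed
            surv_s *= (1 - combined_alternative_kill)
            surv_r *= (1 - combined_alternative_kill)
        survivors = surv_s + surv_r
        seedbank = survivors * seeds_per_survivor
        resistant_frac = surv_r / survivors if survivors > 0 else 0.0
    return seedbank, resistant_frac

for strategy in (False, True):
    bank, frac = simulate(10, strategy)
    label = "with" if strategy else "without"
    print(f"{label} control of survivors: seedbank = {bank:.2e}, "
          f"resistant fraction = {frac:.2f}")
```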

Relevance: 60.00%

Abstract:

The Fitzroy Basin is the second largest catchment area in Australia, covering approximately 143,000 km², and is the largest catchment draining to the Great Barrier Reef lagoon (Karfs et al., 2009). The Great Barrier Reef is the largest reef system in the world, covering approximately 225,000 km² of the northern Queensland continental shelf, with approximately 750 reefs within 40 km of the Queensland coast (Haynes et al., 2007). Changes in water quality have been attributed primarily to grazing, with beef production the largest single land use, comprising 90% of the land area (Karfs et al., 2009). In response to declining water quality in the reef, a Reef Water Quality plan was developed by the Australian and Queensland governments in 2003. The plan targets as a priority sediment contributions from cattle grazing in high-risk catchments (The State of Queensland and Commonwealth of Australia, 2003). The economic incentive strategy includes analysing the costs and benefits of best management practices that will lead to improved water quality (The State of Queensland and Commonwealth of Australia, 2003).

Relevance: 60.00%

Abstract:

Converting from an existing irrigation system is often seen as high risk by the landowner. The significant financial investment and the long period over which the investment runs are further complicated by the uncertainty associated with long-term input costs (such as energy), crop production, and continually evolving natural resource management rules and policy. Irrigation plays a pivotal part in the Burdekin sugarcane farming system. At present, furrow irrigation is by far the most common form owing to its ease of use, relatively low operating cost and the well-established infrastructure currently in place. The Mulgrave Area Farmer Integrated Action (MAFIA) grower group, located near Clare in the lower Burdekin region, identified the need to learn about sustainable farming systems with a focus on environmental, social and economic implications. In early 2007, Hesp Farming established a site to investigate the use of overhead irrigation as an alternative to furrow irrigation and its integration with new farming system practices, including Green Cane Trash Blanketing (GCTB). Although significant environmental and social benefits exist, the preliminary investment analysis indicates that the Overhead Low Pressure (OHLP) irrigation system is not adding financial value to the Hesp Farming business: a combination of high capital costs and other offsetting factors meant the benefits were not fully realised. A different outcome is achieved if Hesp Farming is able to realise value on the water saved, with both OHLP irrigation systems displaying a positive net present value (NPV). This case study provides a framework for further investigating the economics of OHLP irrigation in sugarcane, and it is anticipated that with additional data a more definitive outcome will be developed in the future.
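
A minimal net-present-value sketch of the kind of investment comparison described above is shown below; the capital cost, annual benefits, value of saved water, discount rate and horizon are hypothetical figures, not those from the Hesp Farming analysis.

```python
# Simple net present value comparison for converting furrow to overhead low-pressure
# (OHLP) irrigation. All dollar figures, the discount rate and the horizon are
# illustrative assumptions, not the values from the Hesp Farming case study.

def npv(rate, cashflows):
    """Discount a list of cashflows (year 0 first) back to present value."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

capital_cost = -450_000          # year-0 outlay for the OHLP system
annual_net_benefit = 28_000      # labour, pumping and yield benefits net of extra energy
water_saved_value = 30_000       # extra annual value if saved water can be traded or reused
horizon_years = 15
discount_rate = 0.07

without_water_value = [capital_cost] + [annual_net_benefit] * horizon_years
with_water_value = [capital_cost] + [annual_net_benefit + water_saved_value] * horizon_years

print(f"NPV without valuing saved water: ${npv(discount_rate, without_water_value):,.0f}")
print(f"NPV when saved water is valued:  ${npv(discount_rate, with_water_value):,.0f}")
```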

Relevance: 60.00%

Abstract:

Understanding the host range of all of the fruit fly species within the South Pacific region is vital to establishing trade and quarantine protocols. This is important for the countries within the region and their trade partners. A significant aspect of the Australian Centre for International Agricultural Research (ACIAR) and Regional Fruit Fly Projects (RFFP) has been host fruit collecting, which has provided information on fruit fly host records in the seven participating countries. This work is continuing in all project countries at different intensities. In the Cook Islands, Fiji, Tonga and Western Samoa, fruit surveys have assumed a quarantine surveillance role, with a focus on high-risk fruits such as guava, mango, citrus, bananas, cucurbits and solanaceous fruits. In the Solomon Islands, Vanuatu and the Federated States of Micronesia (FSM), fruit surveys are still at a stage where host ranges are far from complete. By the end of the current project a more complete picture of the fruit fly hosts in these countries will have been gained. A brief summary of the data collected to date is as follows: 23 947 fruit samples; 2181 positive host fruit records; 31 fruit fly species reared from fruit; 12 species reared from commercial fruit. A commercial fruit is classed as an edible fruit with potential for trade at either a local or international level; this allows for the inclusion of endemic fruit species that have cultural significance as a food source. On the basis of these results, there are fruit fly species of major economic importance in the South Pacific region. However, considerably more fruit survey work is required to establish a detailed understanding of all the pest species.

Relevance: 60.00%

Abstract:

Multiple Trichinella species are reported from the Australasian region, although mainland Australia has never confirmed an indigenous case of Trichinella infection in humans or animals. Wildlife surveys in high-risk regions are essential to truly determine the presence or absence of Trichinella, but in mainland Australia they are largely lacking. In this study, a survey for the presence of Trichinella was conducted in wild pigs from mainland Australia's Cape York Peninsula and the Torres Strait region, given the proximity of a Trichinella papuae reservoir in nearby Papua New Guinea (PNG). We report the detection of a Trichinella infection in a pig from an Australian island in the Torres Strait, the narrow waterway that separates the islands of New Guinea and continental Australia. The larvae were characterised as T. papuae (Kikori strain) by PCR and sequence analysis. No Trichinella parasites were found in any pigs from the Cape York Peninsula. These results highlight the role the Torres Strait may play in providing a passage for the introduction of Trichinella parasites from the Australasian region to the Australian mainland.

Relevance: 60.00%

Abstract:

Data from 9296 calves born to 2078 dams over 9 years across five sites were used to investigate factors associated with calf mortality for tropically adapted breeds (Brahman and Tropical Composite) recorded in extensive production systems, using multivariate logistic regression. Average calf mortality pre-weaning was 9.5% of calves born, varying from 1.5% to 41% across sites and years. In total, 67% of calves that died did so within a week of birth, with the cause of death most frequently recorded as unknown. The major factors significantly (P < 0.05) associated with mortality for potentially large numbers of calves included the specific production environment represented by site-year, low calf birthweight (more so than high birthweight) and horn status at branding. Almost all calf deaths post-branding (assessed from n = 8348 calves) occurred in calves that were dehorned, totalling 2.1% of dehorned calves and 15.9% of all calf deaths recorded. Breed effects on calf mortality were primarily the result of breed differences in calf birthweight and, to a lesser extent, large teat size of cows; however, differences in other breed characteristics could be important. Twin births and calves assisted at birth had a very high risk of mortality, but <1% of calves were twins and few calves were assisted at birth; conversely, it could not be established how many calves would have benefited from assistance at birth. Cow age group and outcome from the previous season were also associated with current calf mortality; maiden or young cows (<4 years old) had increased calf losses overall. More mature cows with a previous outcome of calf loss were also more likely to lose another calf in the subsequent year, and this should be considered in culling decisions. Closer attention to the management of younger cows is warranted to improve calf survival.
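
As an illustration of the multivariate logistic regression approach, the sketch below fits a calf-level mortality model to simulated data with a few of the factors mentioned above (birthweight, young dams, twin births); the data, effect sizes and variable names are invented for the example, and the real analysis included many more factors.

```python
# Sketch of a calf-level logistic regression of mortality on birthweight, dam age and
# twin status, in the spirit of the multivariate analysis described above. The data
# are simulated; the real study modelled site-year, horn status, teat size and more.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 2000
birthweight = rng.normal(32, 5, n)        # kg
young_dam = rng.integers(0, 2, n)         # 1 = dam < 4 years old
twin = rng.binomial(1, 0.01, n)           # twins are rare but high risk

# Simulated mortality risk: low birthweight, young dams and twins raise the odds
logit_p = -2.0 - 0.12 * (birthweight - 32) + 0.6 * young_dam + 2.0 * twin
died = rng.binomial(1, 1 / (1 + np.exp(-logit_p)))

df = pd.DataFrame(dict(died=died, birthweight=birthweight, young_dam=young_dam, twin=twin))
model = smf.logit("died ~ birthweight + young_dam + twin", data=df).fit(disp=False)
print(model.summary().tables[1])          # coefficients as log-odds
print("Odds ratios:\n", np.exp(model.params))
```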

Relevance: 60.00%

Abstract:

Trichinella surveillance in wildlife relies on muscle digestion of large samples, which are logistically difficult to store and transport in remote and tropical regions as well as labour-intensive to process. Serological methods such as enzyme-linked immunosorbent assays (ELISAs) offer rapid, cost-effective alternatives for surveillance but should be paired with additional tests because of the high false-positive rates encountered in wildlife. We investigated the utility of ELISAs coupled with Western blot (WB) in providing evidence of Trichinella exposure or infection in wild boar. Serum samples were collected from 673 wild boar from high- and low-risk regions for Trichinella introduction within mainland Australia, which is considered Trichinella-free. Sera were examined using both an 'in-house' and a commercially available indirect ELISA, each based on excretory-secretory (E/S) antigens. Cut-off values for positive results were determined using sera from the low-risk population. All wild boar from the high-risk region (352) and 139/321 (43.3%) of the wild boar from the low-risk region were tested by artificial digestion. Testing by Western blot using E/S antigens, and by a Trichinella-specific real-time PCR, was also carried out on all ELISA-positive samples. The two ELISAs correctly classified all positive controls as well as one naturally infected wild boar from Gabba Island in the Torres Strait. In both the high- and low-risk populations, the ELISA results showed substantial agreement (kappa = 0.66), which increased to very good (kappa = 0.82) when only WB-positive samples were compared. Testing of sera collected from the Australian mainland showed a Trichinella seroprevalence of 3.5% (95% CI 0.0-8.0) and 2.3% (95% CI 0.0-5.6) using the in-house and commercial ELISAs coupled with WB, respectively. These estimates were significantly higher (P < 0.05) than the artificial digestion estimate of 0.0% (95% CI 0.0-1.1). Real-time PCR testing of muscle from seropositive animals did not detect Trichinella DNA in any mainland animals, but did reveal the presence of a second larvae-positive wild boar on Gabba Island, supporting its utility as an alternative, highly sensitive method of muscle examination. The serology results suggest Australian wildlife may have been exposed to Trichinella parasites. However, because of the possibility of non-specific reactions with other parasitic infections, more work using well-defined cohorts of positive and negative samples is required. Even if the specificity of the ELISAs proves to be low, their ability to correctly classify the small number of true positive sera in this study indicates utility in screening wild boar populations for reactive sera, which can then be followed up with additional testing.
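
Two of the calculations underlying results like these (agreement between two ELISAs and an apparent seroprevalence with a confidence interval) are sketched below with hypothetical counts; the functions and numbers are illustrative, not the study's data or exact statistical methods.

```python
# Sketch of two calculations used in serosurveys like the one above: Cohen's kappa for
# agreement between two ELISAs, and an apparent seroprevalence with a normal-approximation
# confidence interval. The counts below are hypothetical, not the study's data.
import math

def cohens_kappa(both_pos, first_only, second_only, both_neg):
    n = both_pos + first_only + second_only + both_neg
    observed = (both_pos + both_neg) / n
    p_first = (both_pos + first_only) / n
    p_second = (both_pos + second_only) / n
    expected = p_first * p_second + (1 - p_first) * (1 - p_second)
    return (observed - expected) / (1 - expected)

def prevalence_ci(positives, tested, z=1.96):
    p = positives / tested
    se = math.sqrt(p * (1 - p) / tested)
    return p, max(0.0, p - z * se), min(1.0, p + z * se)

print(f"kappa = {cohens_kappa(10, 4, 3, 304):.2f}")
p, lo, hi = prevalence_ci(11, 321)
print(f"seroprevalence = {100*p:.1f}% (95% CI {100*lo:.1f}-{100*hi:.1f}%)")
```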

Relevance: 60.00%

Abstract:

A recent report to the Australian Government identified concerns about Australia's capacity to respond to a medium to large outbreak of foot-and-mouth disease (FMD). To assess the resources required, the AusSpread disease simulation model was used to develop a plausible outbreak scenario that included 62 infected premises in five states at the time of detection, 28 days after the disease entered the first property in Victoria. Movements of infected animals and/or contaminated product and equipment led to smaller outbreaks in NSW, Queensland, South Australia and Tasmania. With unlimited staff resources, the outbreak was eradicated in 63 days with 54 infected premises and a 98% chance of eradication within 3 months; this unconstrained response was estimated to involve 2724 personnel. Because unlimited personnel was considered unrealistic, the course of the outbreak was modelled using three levels of staffing and the probability of achieving eradication within 3 or 6 months of introduction was determined. Under the baseline staffing level, there was only a 16% probability that the outbreak would be eradicated within 3 months, and a 60% probability of eradication within 6 months. Deployment of an additional 60 personnel in the first 3 weeks of the response increased the likelihood of eradication within 3 months to 68%, and to 100% within 6 months. Deployment of further personnel incrementally increased the likelihood of timely eradication and decreased the duration and size of the outbreak. Targeted use of vaccination in high-risk areas, coupled with the baseline personnel resources, increased the probability of eradication within 3 months to 74% and to 100% within 6 months; this required 25 vaccination teams commencing 12 days into the control program, increasing to 50 vaccination teams 3 weeks later. Deploying an equal number of additional personnel to surveillance and infected premises operations was equally effective in reducing the outbreak size and duration.
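
A crude Monte Carlo sketch of how the probability of eradication within a deadline can be estimated under different staffing levels is given below; the simple branching dynamics, culling capacity per staff member and parameter values are assumptions for illustration only and bear no relation to AusSpread's internals.

```python
# Crude Monte Carlo sketch of estimating the probability of eradicating an outbreak
# within a deadline under different staffing levels. The simple dynamics and all
# parameter values are assumptions for illustration; AusSpread is far more detailed.
import random

def run_outbreak(staff, days, initial_infected=62, seed=None):
    rng = random.Random(seed)
    active = initial_infected
    for day in range(days):
        new_cases = sum(1 for _ in range(active) if rng.random() < 0.05)  # daily spread
        resolved = min(active, staff // 50)  # assume ~50 staff clear one premises per day
        active = active + new_cases - resolved
        if active <= 0:
            return True                      # eradicated within the horizon
    return False

def prob_eradication(staff, deadline_days, runs=1000):
    wins = sum(run_outbreak(staff, deadline_days, seed=i) for i in range(runs))
    return wins / runs

for staff in (100, 150, 300):
    print(f"{staff} staff: P(eradication within 90 days) = "
          f"{prob_eradication(staff, 90):.2f}")
```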

Relevance: 60.00%

Abstract:

Grasses, legumes, saltbushes and herbs were evaluated at 6 sites in southern inland Queensland to identify potential pasture and forage plants for use on marginal cropping soils. The region experiences summer heat waves and severe winter frosts. Emphasis was on perennial plants, and native species were included. Seedlings were transplanted into unfertilized fields in either summer or autumn to suit the growing season of the plants, and watered to ensure establishment. Summer-growing grasses were the most successful group, while cool-season-growing perennials mostly failed. Summer legumes were disappointing, with Stylosanthes scabra and Indigofera schimperi performing best. Some lines, such as I. schimperi and the Eragrostis hybrid cv. Cochise, were assessed as potential weeds owing to low animal acceptance. The native Rhynchosia minima grew well at some sites and deserves more study. Cenchrus ciliaris was always easy to establish and produced the highest yields. Persistence of some Digitaria and Bothriochloa species, Eragrostis curvula and Fingerhuthia africana at specific sites was encouraging, but potential weediness needs careful assessment. Standard species, such as Austrostipa scabra for cool-season-growing grasses, were identified to represent the main forage types for incorporation into future trials with new genetic material. The early field-testing protocol used here should be considered for use elsewhere where unreliable rainfall poses a high risk of establishment failure from scarce seed.

Relevance: 60.00%

Abstract:

Lethal control of wild dogs - that is, Dingoes (Canis lupus dingo) and Dingo/Dog (Canis lupus familiaris) hybrids - to reduce livestock predation in Australian rangelands is claimed to cause continental-scale impacts on biodiversity. Although top predator populations may recover numerically after baiting, they are predicted to be functionally different and incapable of fulfilling critical ecological roles. This study reports the impact of baiting programmes on wild dog abundance, age structures and the prey of wild dogs during large-scale manipulative experiments. Wild dog relative abundance almost always decreased after baiting, but reductions were variable and short-lived unless the prior baiting programme was particularly effective or there were follow-up baiting programmes within a few months. However, age structures of wild dogs in baited and nil-treatment areas were demonstrably different, and prey populations in baited areas did diverge from those in nil-treatment areas. Re-analysed observations from a separate study of wild dogs preying on kangaroos show that successful chases resulting in attacks on kangaroos occurred when mean wild dog age was higher and mean group size was larger. It is likely that the impacts of lethal control on wild dog numbers, group sizes and age structures compromise their ability to handle large, difficult-to-catch prey. Under certain circumstances, these changes lead to increased calf loss (Bos indicus/B. taurus genotypes) and increased kangaroo numbers. Rangeland beef producers could consider controlling wild dogs in high-risk periods when predation is more likely and avoid baiting at other times.

Relevance: 60.00%

Abstract:

Wheat is at peak quality soon after harvest. Subsequently, diverse biota use wheat as a resource in storage, including insects and mycotoxin-producing fungi. Transportation networks for stored grain are crucial to food security and provide a model system for an analysis of the population structure, evolution, and dispersal of biota in networks. We evaluated the structure of rail networks for grain transport in the United States and Eastern Australia to identify the shortest paths for the anthropogenic dispersal of pests and mycotoxins, as well as the major sources, sinks, and bridges for movement. We found important differences in the risk profile in these two countries and identified priority control points for sampling, detection, and management. An understanding of these key locations and roles within the network is a new type of basic research result in postharvest science and will provide insights for the integrated pest management of high-risk subpopulations, such as pesticide-resistant insect pests.
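
To illustrate the network measures mentioned above, the sketch below builds a toy directed graph of grain movements and computes a shortest dispersal path and betweenness centrality to flag bridge nodes; the nodes, edges and weights are hypothetical, not the US or Australian rail data.

```python
# Toy sketch of the network analysis described above: build a small directed graph of
# grain movements, then find shortest paths and the nodes that bridge the most flow.
# The nodes, edges and weights are hypothetical, not the US or Australian rail data.
import networkx as nx

edges = [  # (origin, destination, rail distance used as an edge weight)
    ("FarmA", "CountryElevator1", 40), ("FarmB", "CountryElevator1", 55),
    ("FarmC", "CountryElevator2", 30), ("CountryElevator1", "RegionalHub", 120),
    ("CountryElevator2", "RegionalHub", 90), ("RegionalHub", "PortTerminal", 300),
    ("CountryElevator2", "FeedMill", 150),
]
G = nx.DiGraph()
G.add_weighted_edges_from(edges)

# Shortest dispersal path from a source farm to the export terminal
path = nx.shortest_path(G, "FarmA", "PortTerminal", weight="weight")
print("shortest path FarmA -> PortTerminal:", " -> ".join(path))

# Betweenness centrality flags 'bridge' nodes through which most grain (and pests) pass
centrality = nx.betweenness_centrality(G, weight="weight")
for node, score in sorted(centrality.items(), key=lambda kv: -kv[1])[:3]:
    print(f"{node}: betweenness = {score:.2f}")
```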