21 results for Risk management tools

in eResearch Archive - Queensland Department of Agriculture


Relevance:

100.00%

Publisher:

Abstract:

Aim: Decision-making in weed management involves consideration of limited budgets, long time horizons, conflicting priorities, and as a result, trade-offs. Economics provides tools that allow these issues to be addressed and is thus integral to management of the risks posed by weeds. One of the critical issues in weed risk management during the early stages of an invasion concerns feasibility of eradication. We briefly review how economics may be used in weed risk management, concentrating on this management strategy. Location: Australia. Methods: A range of innovative studies that investigate aspects of weed risk management are reviewed. We show how these could be applied to newly invading weeds, focussing on methods for investigating eradication feasibility. In particular, eradication feasibility is analysed in terms of cost and duration of an eradication programme, using a simulation model based on field-derived parameter values for chromolaena, Chromolaena odorata. Results: The duration of an eradication programme can be reduced by investing in progressively higher amounts of search effort per hectare, but increasing search area will become relatively more expensive as search effort increases. When variation in survey and control success is taken into account, increasing search effort also reduces uncertainty around the required duration of the eradication programme. Main conclusions: Economics is integral to the management of the risks posed by weeds. Decision analysis, based on economic principles, is now commonly used to tackle key issues that confront weed managers. For eradication feasibility, duration and cost of a weed eradication programme are critical components; the dimensions of both factors can usefully be estimated through simulation. © 2013 John Wiley & Sons Ltd.
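The eradication-feasibility analysis this abstract describes can be sketched as a toy Monte-Carlo simulation of programme duration and cost. The detection probability, cost rate and declaration rule below are illustrative placeholders, not the field-derived chromolaena parameter values used in the study.

```python
import random
import statistics

def simulate_eradication(search_effort, area_ha=100.0, detectability=0.05,
                         cost_per_hour=40.0, clear_years_required=3,
                         runs=1000, seed=0):
    """Monte-Carlo sketch of weed-eradication duration and cost.

    Each year the whole area is searched with `search_effort` hours/ha;
    the chance of finding and controlling the residual infestation rises
    with effort. Eradication is declared after a run of detection-free
    years. All parameter values are illustrative placeholders.
    """
    rng = random.Random(seed)
    # Per-year detection probability increases with search effort per ha.
    p_detect = 1.0 - (1.0 - detectability) ** search_effort
    durations, costs = [], []
    for _ in range(runs):
        years, clear_run, extant = 0, 0, True
        while clear_run < clear_years_required:
            years += 1
            if extant and rng.random() < p_detect:
                extant = False                 # found and controlled
            clear_run = 0 if extant else clear_run + 1
        durations.append(years)
        costs.append(years * area_ha * search_effort * cost_per_hour)
    return statistics.mean(durations), statistics.mean(costs)
```

Running this for increasing effort levels reproduces the qualitative result above: higher search effort per hectare shortens the programme (and narrows the spread of durations), while the annual search bill grows, so total cost embodies a trade-off.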

Relevance:

100.00%

Publisher:

Abstract:

Decision-making in agriculture is carried out in an uncertain environment, with farmers often seeking information to reduce risk. As a result of the extreme variability of rainfall and stream-flows in north-eastern Australia, water supplies for irrigated agriculture are a limiting factor and a source of risk. The present study examined the use of seasonal climate forecasting (SCF) when calculating planting areas for irrigated cotton in the northern Murray Darling Basin. Results show that minimising risk by adjusting planting areas in response to SCF can lead to significant gains in gross margin returns. However, how farmers respond to SCF depends on several other factors, including irrigators’ attitude towards risk.

Relevance:

100.00%

Publisher:

Abstract:

Recent incidents of mycotoxin contamination (particularly aflatoxins and fumonisins) have demonstrated a need for an industry-wide management system to ensure Australian maize meets the requirements of all domestic users and export markets. Results of recent surveys are presented, demonstrating overall good conformity with nationally accepted industry marketing standards but with occasional samples exceeding these levels. This paper describes mycotoxin-related hazards inherent in the Australian maize production system and a methodology combining good agricultural practices and the hazard analysis critical control point framework to manage risk.

Relevance:

100.00%

Publisher:

Abstract:

The complexity, variability and vastness of the northern Australian rangelands make it difficult to assess the risks associated with climate change. In this paper we present a methodology to help industry and primary producers assess risks associated with climate change and to assess the effectiveness of adaptation options in managing those risks. Our assessment involved three steps. Initially, the impacts and adaptation responses were documented in matrices by ‘experts’ (rangeland and climate scientists). Then, a modified risk management framework was used to develop risk management matrices that identified important impacts, areas of greatest vulnerability (combination of potential impact and adaptive capacity) and priority areas for action at the industry level. The process was easy to implement and useful for arranging and analysing large amounts of information (both complex and interacting). Lastly, regional extension officers (after minimal ‘climate literacy’ training) could build on existing knowledge provided here and implement the risk management process in workshops with rangeland land managers. Their participation is likely to identify relevant and robust adaptive responses that are most likely to be included in regional and property management decisions. The process developed here for the grazing industry could be modified and used in other industries and sectors. By 2030, some areas of northern Australia will experience more droughts and lower summer rainfall. This poses a serious threat to the rangelands. Although the impacts and adaptive responses will vary between ecological and geographic systems, climate change is expected to have noticeable detrimental effects: reduced pasture growth and surface water availability; increased competition from woody vegetation; decreased production per head (beef and wool) and gross margin; and adverse impacts on biodiversity. 
Further research and development is needed to identify the most vulnerable regions, and to inform policy in time to facilitate transitional change and enable land managers to implement those changes.

Relevance:

100.00%

Publisher:

Abstract:

Because of the variable and changing environment, advisors and farmers are seeking systems that provide risk management support at a number of time scales. The Agricultural Production Systems Research Unit, Toowoomba, Australia has developed a suite of tools to assist advisors and farmers to better manage risk in cropping. These tools range from simple rainfall analysis tools (Rainman, HowWet, HowOften) through crop simulation tools (WhopperCropper and YieldProphet) to the most complex, APSFarm, a whole-farm analysis tool. Most are derivatives of the APSIM crop model. These tools encompass a range of complexity and potential benefit to both the farming community and for government policy. This paper describes the development and usage of two specific products: WhopperCropper and APSFarm. WhopperCropper facilitates simulation-aided discussion of growers' exposure to risk when comparing alternative crop input options. The user can readily generate 'what-if' scenarios that separate the major influences whilst holding other factors constant. Interactions of the major inputs can also be tested. A manager can examine the effects of input levels (and Southern Oscillation Index phase) to broadly determine input levels that match their attitude to risk. APSFarm has been used to demonstrate that management changes can have different effects in short and long time periods. It can be used to test local advisors' and farmers' knowledge and experience of their desired rotation system. This study has shown that crop type has a larger influence than more conservative minimum soil water triggers in the long term. However, in short-term dry periods, minimum soil water triggers and maximum area of the various crops can give significant financial gains.

Relevance:

100.00%

Publisher:

Abstract:

Spotted gum dominant forests occur from Cooktown in northern Queensland (Qld) to Orbost in Victoria (Boland et al. 2006), and these forests are commercially very important: spotted gum is the most commonly harvested hardwood timber in Qld and one of the most important in New South Wales (NSW). Spotted gum has a wide range of end uses, from solid wood products through to power transmission poles, and generally has excellent sawing and timber qualities (Hopewell 2004). The private native forest resource in southern Qld and northern NSW is a critical component of the hardwood timber industry (Anon 2005, Timber Qld 2006), and currently half or more of the native forest timber resource harvested in northern NSW and Qld is sourced from private land. However, in many cases productivity on private lands is well below what could be achieved with appropriate silvicultural management. This project provides silvicultural management tools to assist extension staff, land owners and managers in the south east Qld and north eastern NSW regions. The intent was that this would lead to improvement of the productivity of the private estate through implementation of appropriate management. The other intention of this project was to implement a number of silvicultural experiments and demonstration sites to provide data on growth rates of managed and unmanaged forests so that landholders can make informed decisions on the future management of their forests. To assist forest managers and improve the ability to predict forest productivity in the private resource, the project has developed:
• A set of spotted gum specific silvicultural guidelines for timber production on private land that cover both silvicultural treatment and harvesting. The guidelines were developed for extension officers and property owners.
• A simple decision support tool, referred to as the spotted gum productivity assessment tool (SPAT), that allows an estimation of: 1. Tree growth productivity on specific sites, based on the analysis of site and growth data collected from a large number of yield and experimental plots on Crown land across a wide range of spotted gum forest types; growth algorithms were developed using tree growth and site data, and the algorithms were used to formulate basic economic predictors. 2. Pasture development under a range of tree stockings, and the expected livestock carrying capacity at nominated tree stockings for a particular area. 3. Above-ground tree biomass and carbon stored in trees.
• A series of experiments in spotted gum forests on private lands across the study area to quantify growth and to provide measures of the effect of silvicultural thinning and different agro-forestry regimes.
The adoption and use of these tools by farm forestry extension officers and private land holders, in both field operations and in training exercises, will, over time, improve the commercial management of spotted gum forests for both timber and grazing. Future measurement of the experimental sites at ages five, 10 and 15 years will provide longer term data on the effects of various stocking rates and thinning regimes and facilitate modification and improvement of these silvicultural prescriptions.

Relevance:

100.00%

Publisher:

Abstract:

Hendra virus is a highly pathogenic novel paramyxovirus causing sporadic fatal infection in horses and humans in Australia. Species of fruit-bats (genus Pteropus), commonly known as flying-foxes, are the natural host of the virus. We undertook a survey of horse owners in the states of Queensland and New South Wales, Australia to assess the level of adoption of recommended risk management strategies and to identify impediments to adoption. Survey questionnaires were completed by 1431 respondents from the target states, and from a spectrum of industry sectors. Hendra virus knowledge varied with sector, but was generally limited, with only 13% of respondents rating their level of knowledge as high or very high. The majority of respondents (63%) had seen their state’s Hendra virus information for horse owners, and a similar proportion found the information useful. Fifty-six percent of respondents thought it moderately, very or extremely likely that a Hendra virus case could occur in their area, yet only 37% said they would consider Hendra virus if their horse was sick. Only 13% of respondents stabled their horses overnight, although a further 24% said it would be easy or very easy to do so. Only 13% and 15% of respondents respectively had horse feed bins and water points under solid cover. Responses varied significantly with state, likely reflecting different Hendra virus history. The survey identified inconsistent awareness and/or adoption of available knowledge, confusion in relation to Hendra virus risk perception, with both over- and under-estimation of true risk, and lag in the uptake of recommended risk minimisation strategies, even when these were readily implementable. However, we also identified frustration and potential alienation among horse owners who found the recommended strategies impractical, onerous and prohibitively expensive. The insights gained from this survey have broader application to other complex risk-management scenarios.

Relevance:

90.00%

Publisher:

Abstract:

The shelf life of mangoes is limited by two main postharvest diseases when these are not consistently managed: anthracnose (Colletotrichum gloeosporioides) and stem end rot (SER) (Fusicoccum parvum). Management of these diseases has relied mainly on fungicides, applied either as field sprays or as postharvest dips. These have served the industry fairly well, allowing fruit to be transported, stored and sold at markets distant from the areas of production. There are, however, concerns about the continued use of these fungicides as the main or only tool for managing these diseases. This has necessitated a re-think of how these diseases could be sustainably managed into the future using a systems approach that focuses on integrated crop management. It is a holistic approach that considers all the crop protection management strategies, including the genetics of the plant and its ability to naturally defend itself from infection with the help of plant activators and growth regulators. It also considers other cultural or agronomic management tools, such as crop nutrition, timely application of irrigation water and regular pruning of trees, as means of reducing inoculum levels in orchards. The ultimate aim of this approach is to increase yields and obtain long-term sustainable production. It is guided by the sustainable crop production principle, which states that producers should apply as little input as possible but as much as needed.

Relevance:

90.00%

Publisher:

Abstract:

Hendra virus (HeV) was first described in 1994 in an outbreak of acute and highly lethal disease in horses and humans in Australia. Equine cases continue to be diagnosed periodically, yet the predisposing factors for infection remain unclear. We undertook an analysis of equine submissions tested for HeV by the Queensland government veterinary reference laboratory over a 20-year period to identify and investigate any patterns. We found a marked increase in testing from July 2008, primarily reflecting a broadening of the HeV clinical case definition. Peaks in submissions for testing, and visitations to the Government HeV website, were associated with reported equine incidents. Significantly differing between-year HeV detection rates in north and south Queensland suggest a fundamental difference in risk exposure between the two regions. The statistical association between HeV detection and stockhorse type may suggest that husbandry is a more important risk determinant than breed per se. The detection of HeV in horses with neither neurological nor respiratory signs poses a risk management challenge for attending veterinarians and laboratory staff, reinforcing animal health authority recommendations that appropriate risk management strategies be employed for all sick horses, and by anyone handling sick horses or associated biological samples.

Relevance:

80.00%

Publisher:

Abstract:

Reliability of supply of feed grain has become a high priority issue for industry in the northern region. Expansion by major intensive livestock and industrial users of grain, combined with high inter-annual variability in seasonal conditions, has generated concern in the industry about reliability of supply. This paper reports on a modelling study undertaken to analyse the reliability of supply of feed grain in the northern region. Feed grain demand was calculated for major industries (cattle feedlots, pigs, poultry, dairy) based on their current size and rate of grain usage. Current demand was estimated to be 2.8 Mt. With the development of new industrial users (ethanol), and by projecting the current growth rate of the various intensive livestock industries, it was estimated that demand would grow to 3.6 Mt in three years' time. Feed grain supply was estimated using shire scale yield prediction models for wheat and sorghum that had been calibrated against recent ABS production data. Other crops that contribute to a lesser extent to the total feed grain pool (barley, maize) were included by considering their production relative to the major winter and summer grains, with estimates based on available production records. This modelling approach allowed simulation of a 101-year time series of yield that showed the extent of the impact of inter-annual climate variability on yield levels. Production estimates were developed from this yield time series by including planted crop area. Area planted data were obtained from ABS and ABARE records. Total production amounts were adjusted to allow for exports and end uses that were not feed grain (flour, malt etc.). The median feed grain supply for an average area planted was about 3.1 Mt, but this varied greatly from year to year depending on seasonal conditions and area planted. These estimates indicated that supply would not meet current demand in about 30% of years if a median area crop were planted. Two thirds of the years with a supply shortfall were El Niño years. This proportion of years was halved (i.e. 15%) if the area planted increased to that associated with the best 10% of years. Should demand grow as projected in this study, there would be few years where it could be met if a median crop area was planted. With area planted similar to the best 10% of years, there would still be a shortfall in nearly 50% of all years (and 80% of El Niño years). The implications of these results for supply/demand, risk management, and investment in research and development are briefly discussed.
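The shortfall calculation underlying these figures can be illustrated in miniature: given a yield time series and a planted area, count the fraction of years in which production falls below demand. The yield distribution and area used below are hypothetical stand-ins, not the shire-scale model output.

```python
import random

def shortfall_frequency(yields_t_per_ha, area_ha, demand_t):
    """Fraction of years in which production (yield x area) falls short of demand."""
    supply = [y * area_ha for y in yields_t_per_ha]
    return sum(s < demand_t for s in supply) / len(supply)

# Illustrative 101-year yield series (t/ha), truncated at a low floor;
# NOT the calibrated wheat/sorghum model output from the study.
rng = random.Random(1)
yields = [max(0.2, rng.gauss(1.7, 0.6)) for _ in range(101)]

median_area = 1.8e6        # ha, hypothetical 'median area planted'
high_area = 2.2e6          # ha, hypothetical 'best 10% of years' area
current_demand = 2.8e6     # t, the estimated current demand

print(shortfall_frequency(yields, median_area, current_demand))
print(shortfall_frequency(yields, high_area, current_demand))
```

Because supply scales linearly with planted area, the second frequency can never exceed the first, which mirrors the abstract's finding that a larger planted area roughly halves the proportion of shortfall years.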

Relevance:

80.00%

Publisher:

Abstract:

Many statistical forecast systems are available to interested users. In order to be useful for decision-making, these systems must be based on evidence of underlying mechanisms. Once causal connections between the mechanism and their statistical manifestation have been firmly established, the forecasts must also provide some quantitative evidence of 'quality'. However, the quality of statistical climate forecast systems (forecast quality) is an ill-defined and frequently misunderstood property. Often, providers and users of such forecast systems are unclear about what 'quality' entails and how to measure it, leading to confusion and misinformation. Here we present a generic framework to quantify aspects of forecast quality using an inferential approach to calculate nominal significance levels (p-values) that can be obtained either by directly applying non-parametric statistical tests such as Kruskal-Wallis (KW) or Kolmogorov-Smirnov (KS) or by using Monte-Carlo methods (in the case of forecast skill scores). Once converted to p-values, these forecast quality measures provide a means to objectively evaluate and compare temporal and spatial patterns of forecast quality across datasets and forecast systems. Our analysis demonstrates the importance of providing p-values rather than adopting some arbitrarily chosen significance levels such as p < 0.05 or p < 0.01, which is still common practice. This is illustrated by applying non-parametric tests (such as KW and KS) and skill scoring methods (LEPS and RPSS) to the 5-phase Southern Oscillation Index classification system using historical rainfall data from Australia, the Republic of South Africa and India. The selection of quality measures is solely based on their common use and does not constitute endorsement. We found that non-parametric statistical tests can be adequate proxies for skill measures such as LEPS or RPSS. The framework can be implemented anywhere, regardless of dataset, forecast system or quality measure. Eventually such inferential evidence should be complemented by descriptive statistical methods in order to fully assist in operational risk management.
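The inferential approach described here can be sketched in plain Python: a two-sample Kolmogorov-Smirnov statistic with a Monte-Carlo (permutation) p-value. In practice one would reach for library implementations of the KS and Kruskal-Wallis tests; this self-contained version only makes the mechanics explicit, e.g. for comparing rainfall in one SOI phase against the remaining years.

```python
import random

def ks_statistic(a, b):
    """Two-sample Kolmogorov-Smirnov statistic: the maximum gap
    between the two empirical cumulative distribution functions."""
    a, b = sorted(a), sorted(b)
    def ecdf(xs, t):
        return sum(x <= t for x in xs) / len(xs)
    return max(abs(ecdf(a, t) - ecdf(b, t)) for t in a + b)

def mc_p_value(a, b, n_perm=2000, seed=0):
    """Monte-Carlo (permutation) p-value for the KS statistic:
    how often does a random relabelling of the pooled data produce
    a statistic at least as extreme as the one observed?"""
    rng = random.Random(seed)
    observed = ks_statistic(a, b)
    pooled = list(a) + list(b)
    hits = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)
        if ks_statistic(pooled[:len(a)], pooled[len(a):]) >= observed:
            hits += 1
    return (hits + 1) / (n_perm + 1)  # add-one correction keeps p > 0
```

A small p-value indicates the two samples (e.g. rainfall under one SOI phase versus the rest) differ in distribution, i.e. the classification carries forecast information; reporting the p-value itself, rather than a pass/fail at p < 0.05, is exactly the point the abstract makes.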

Relevance:

80.00%

Publisher:

Abstract:

In recent years, there have been significant developments in climate science relevant to agriculture and natural resource management. Assessing impacts of climate variability and use of seasonal climate forecasts have become increasingly important elements in the management "toolkit" for many Australian farmers. Consideration of climate change further increases the need for improved management strategies. While climate risk extension activities have kept pace with advances in climate science, a national review of the Vocational Education and Training system in Australia in relation to "weather and climate" showed that these topics were "poorly represented" at the management level in the Australian Qualifications Framework, and needed increased emphasis. Consequently, a new Unit of Competency concerning management of climatic risk was developed and accredited to address this deficiency. The objective of the unit was to build knowledge and skills for better management of climate variability via the elements of surveying climatic and enterprise data; analysing climatic risks and opportunities; and developing climatic risk management strategies. This paper describes establishment of a new unit for vocational education that is designed to harness recent developments in applied climate science for better management of Australia's highly variable climate. The main benefits of the new unit of competency, "Developing climatic risk management strategies," were seen as improving decisions in climate and agriculture, and reducing climate risk exposure to enhance sustainable agriculture. The educational unit is now within the scope of agricultural colleges, universities, and registered training organisations as an accredited unit.

Relevance:

80.00%

Publisher:

Abstract:

The APSIM-Wheat module was used to investigate our present capacity to simulate wheat yields in a semi-arid region of eastern Australia (the Victorian Mallee), where hostile subsoils associated with salinity, sodicity, and boron toxicity are known to limit grain yield. In this study we tested whether the effects of subsoil constraints on wheat growth and production could be modelled with APSIM-Wheat by assuming that either: (a) root exploration within a particular soil layer was reduced by the presence of toxic concentrations of salts, or (b) soil water uptake from a particular soil layer was reduced by high concentration of salts through osmotic effects. After evaluating the improved predictive capacity of the model we applied it to study the interactions between subsoil constraints and seasonal conditions, and to estimate the economic effect that subsoil constraints have on wheat farming in the Victorian Mallee under different climatic scenarios. Although the soils had high levels of salinity, sodicity, and boron, the observed variability in root abundance at different soil layers was mainly related to soil salinity. We concluded that: (i) whether the effect of subsoil limitations on growth and yield of wheat in the Victorian Mallee is driven by toxic, osmotic, or both effects acting simultaneously still requires further research; (ii) at present, the performance of APSIM-Wheat in the region can be improved either by assuming increased values of the lower limit for soil water extraction, or by modifying the pattern of root exploration in the soil profile, both as a function of soil salinity. The effect of subsoil constraints on wheat yield and gross margin can be expected to be higher during drier than wetter seasons. In this region the interaction between climate and soil properties makes rainfall information alone of little use for risk management and farm planning when not integrated with cropping systems models.
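The second modelling route described above, raising the crop lower limit of soil water extraction as a function of soil salinity, can be sketched as a simple adjustment function. The functional form, salinity threshold and slope below are hypothetical illustrations, not APSIM-Wheat parameters.

```python
def adjusted_lower_limit(ll_vol, dul_vol, ece_ds_per_m,
                         ece_threshold=4.0, slope=0.08):
    """Raise the crop lower limit of soil water extraction with salinity.

    ll_vol, dul_vol: volumetric lower limit and drained upper limit (mm/mm).
    ece_ds_per_m: saturation-extract electrical conductivity (dS/m).
    Above the threshold, the plant-available range (DUL - LL) shrinks
    linearly; the threshold and slope are illustrative only.
    """
    stress = max(0.0, ece_ds_per_m - ece_threshold) * slope
    fraction_lost = min(1.0, stress)
    return ll_vol + fraction_lost * (dul_vol - ll_vol)

# Plant-available water in a 300 mm layer, fresh vs saline subsoil:
for ece in (2.0, 10.0):
    ll = adjusted_lower_limit(0.20, 0.35, ece)
    print(ece, round((0.35 - ll) * 300, 1))  # available water, mm
```

Feeding the adjusted lower limit back into a water-balance model shrinks extractable water in saline layers, which is why the yield penalty the study reports is larger in dry seasons, when the crop depends most on subsoil water.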

Relevance:

80.00%

Publisher:

Abstract:

The traditional reductionist approach to science has a tendency to create 'islands of knowledge in a sea of ignorance', with a much stronger focus on analysis of scientific inputs rather than synthesis of socially relevant outcomes. This might be the principal reason why intended end users of climate information generally fail to embrace what the climate science community has to offer. The translation of climate information into real-life action requires 3 essential components: salience (the perceived relevance of the information), credibility (the perceived technical quality of the information) and legitimacy (the perceived objectivity of the process by which the information is shared). We explore each of these components using 3 case studies focused on dryland cropping in Australia, India and Brazil. With regard to 'salience' we discuss the challenge for climate science to be 'policy-relevant', using Australian drought policy as an example. In a village in southern India 'credibility' was gained through engagement between scientists and risk managers with the aim of building social capital, achieved only at high cost to science institutions. Finally, in Brazil we found that 'legitimacy' is a fragile, yet renewable resource that needs to be part of the package for successful climate applications; legitimacy can be easily eroded but is difficult to recover. We conclude that climate risk management requires holistic solutions derived from cross-disciplinary and participatory, user-oriented research. Approaches that combine climate, agroecological and socioeconomic models provide the scientific capabilities for establishment of 'borderless' institutions without disciplinary constraints. Such institutions could provide the necessary support and flexibility to deliver the social benefits of climate science across diverse contexts. Our case studies show that this type of solution is already being applied, and suggest that the climate science community attempt to address existing institutional constraints, which still impede climate risk management.