16 results for Downside Risk Management
Abstract:
Decision-making in agriculture is carried out in an uncertain environment, with farmers often seeking information to reduce risk. As a result of the extreme variability of rainfall and stream-flows in north-eastern Australia, water supplies for irrigated agriculture are a limiting factor and a source of risk. The present study examined the use of seasonal climate forecasting (SCF) when calculating planting areas for irrigated cotton in the northern Murray Darling Basin. Results show that minimising risk by adjusting planting areas in response to SCF can lead to significant gains in gross margin returns. However, how farmers respond to SCF depends on several other factors, including irrigators' attitude towards risk.
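A minimal sketch of the planting-area decision described above, assuming invented water allocations, gross margins and SCF phase probabilities (none of these figures come from the study): it compares the expected gross margin of a fixed planting area against an area adjusted to the forecast phase.

```python
# Hypothetical sketch: compare gross margins when planting area is fixed versus
# adjusted to a seasonal climate forecast (SCF) phase. All numbers are illustrative,
# not taken from the study.

# Forecast phases with assumed probabilities and water allocations (ML).
scf_phases = {
    "wet":     {"probability": 0.3, "water_ML": 1200},
    "neutral": {"probability": 0.4, "water_ML": 800},
    "dry":     {"probability": 0.3, "water_ML": 400},
}

WATER_PER_HA = 6.0      # ML of irrigation water needed per hectare of cotton (assumed)
GM_PER_HA = 2500.0      # gross margin per fully irrigated hectare (assumed, $/ha)
PENALTY_PER_HA = 900.0  # loss per hectare planted but not fully watered (assumed, $/ha)

def gross_margin(area_ha, water_ML):
    """Gross margin when `area_ha` is planted and `water_ML` is available."""
    watered = min(area_ha, water_ML / WATER_PER_HA)
    stressed = area_ha - watered
    return watered * GM_PER_HA - stressed * PENALTY_PER_HA

# Strategy 1: fixed area sized to the average (neutral) allocation.
fixed_area = 800 / WATER_PER_HA

# Strategy 2: area adjusted to the forecast phase.
adjusted_area = {p: v["water_ML"] / WATER_PER_HA for p, v in scf_phases.items()}

fixed_gm = sum(v["probability"] * gross_margin(fixed_area, v["water_ML"])
               for v in scf_phases.values())
adaptive_gm = sum(v["probability"] * gross_margin(adjusted_area[p], v["water_ML"])
                  for p, v in scf_phases.items())

print(f"Expected GM, fixed area:   ${fixed_gm:,.0f}")
print(f"Expected GM, SCF-adjusted: ${adaptive_gm:,.0f}")
```

Under these assumed numbers the SCF-adjusted strategy earns a higher expected gross margin because it avoids planting unirrigable area in dry phases; the study itself uses irrigated cotton gross-margin modelling rather than this toy rule.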
Abstract:
Recent incidents of mycotoxin contamination (particularly aflatoxins and fumonisins) have demonstrated a need for an industry-wide management system to ensure Australian maize meets the requirements of all domestic users and export markets. Results of recent surveys are presented, demonstrating overall good conformity with nationally accepted industry marketing standards, but with occasional samples exceeding the accepted limits. This paper describes mycotoxin-related hazards inherent in the Australian maize production system and a methodology combining good agricultural practices and the hazard analysis critical control point framework to manage risk.
Abstract:
The complexity, variability and vastness of the northern Australian rangelands make it difficult to assess the risks associated with climate change. In this paper we present a methodology to help industry and primary producers assess the risks associated with climate change and the effectiveness of adaptation options in managing those risks. Our assessment involved three steps. Initially, the impacts and adaptation responses were documented in matrices by 'experts' (rangeland and climate scientists). Then, a modified risk management framework was used to develop risk management matrices that identified important impacts, areas of greatest vulnerability (a combination of potential impact and adaptive capacity) and priority areas for action at the industry level. The process was easy to implement and useful for arranging and analysing large amounts of complex and interacting information. Lastly, regional extension officers (after minimal 'climate literacy' training) could build on the knowledge provided here and implement the risk management process in workshops with rangeland land managers; their participation should help identify relevant and robust adaptive responses that are most likely to be incorporated into regional and property management decisions. The process developed here for the grazing industry could be modified and used in other industries and sectors. By 2030, some areas of northern Australia will experience more droughts and lower summer rainfall. This poses a serious threat to the rangelands. Although the impacts and adaptive responses will vary between ecological and geographic systems, climate change is expected to have noticeable detrimental effects: reduced pasture growth and surface water availability; increased competition from woody vegetation; decreased production per head (beef and wool) and gross margin; and adverse impacts on biodiversity. Further research and development is needed to identify the most vulnerable regions, and to inform policy in time to facilitate transitional change and enable land managers to implement those changes.
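The risk-matrix step lends itself to a small illustration. The sketch below scores vulnerability as a combination of potential impact and (inverse) adaptive capacity and ranks impacts to flag priorities; the impact names echo those listed in the abstract, but the scores and the scoring rule are invented for illustration.

```python
# Illustrative risk-matrix sketch: vulnerability combines potential impact with
# adaptive capacity, and impacts are ranked to flag priority areas for action.
# Scores (1-5) and the scoring rule are hypothetical, not from the paper.

impacts = [
    # (impact, potential_impact 1-5, adaptive_capacity 1-5; higher capacity = lower vulnerability)
    ("Reduced pasture growth",           4, 2),
    ("Lower surface water availability", 5, 3),
    ("Woody vegetation encroachment",    3, 2),
    ("Reduced production per head",      4, 4),
    ("Biodiversity loss",                5, 1),
]

def vulnerability(potential_impact, adaptive_capacity):
    """Simple rule: high impact combined with low adaptive capacity gives high vulnerability."""
    return potential_impact * (6 - adaptive_capacity)

ranked = sorted(impacts, key=lambda row: vulnerability(row[1], row[2]), reverse=True)

print(f"{'Impact':<35}{'Vulnerability':>14}")
for name, pi, ac in ranked:
    print(f"{name:<35}{vulnerability(pi, ac):>14}")
```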
Abstract:
Aim: Decision-making in weed management involves consideration of limited budgets, long time horizons, conflicting priorities, and as a result, trade-offs. Economics provides tools that allow these issues to be addressed and is thus integral to management of the risks posed by weeds. One of the critical issues in weed risk management during the early stages of an invasion concerns feasibility of eradication. We briefly review how economics may be used in weed risk management, concentrating on this management strategy. Location: Australia. Methods: A range of innovative studies that investigate aspects of weed risk management are reviewed. We show how these could be applied to newly invading weeds, focussing on methods for investigating eradication feasibility. In particular, eradication feasibility is analysed in terms of cost and duration of an eradication programme, using a simulation model based on field-derived parameter values for chromolaena, Chromolaena odorata. Results: The duration of an eradication programme can be reduced by investing in progressively higher amounts of search effort per hectare, but increasing search area will become relatively more expensive as search effort increases. When variation in survey and control success is taken into account, increasing search effort also reduces uncertainty around the required duration of the eradication programme. Main conclusions: Economics is integral to the management of the risks posed by weeds. Decision analysis, based on economic principles, is now commonly used to tackle key issues that confront weed managers. For eradication feasibility, duration and cost of a weed eradication programme are critical components; the dimensions of both factors can usefully be estimated through simulation. © 2013 John Wiley & Sons Ltd.
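The eradication-feasibility analysis can be illustrated with a toy Monte Carlo model of the trade-off the abstract describes: higher search effort per hectare costs more each year but detects more plants and so shortens the programme. All parameter values below are placeholders, not the field-derived chromolaena parameters used in the study.

```python
# Toy Monte Carlo sketch of eradication feasibility: duration and cost of an
# eradication programme as a function of search effort per hectare.
# Parameter values are illustrative placeholders only.
import random

def simulate_programme(search_effort, n_runs=2000,
                       initial_infestations=50, area_ha=500,
                       cost_per_ha_per_unit_effort=12.0, max_years=50):
    """Return mean duration (years) and mean total cost over `n_runs` simulations."""
    durations, costs = [], []
    for _ in range(n_runs):
        remaining = initial_infestations
        years, cost = 0, 0.0
        while remaining > 0 and years < max_years:
            years += 1
            cost += area_ha * search_effort * cost_per_ha_per_unit_effort
            # Detection probability rises with search effort but saturates.
            p_detect = 1 - 0.5 ** search_effort
            found = sum(random.random() < p_detect for _ in range(remaining))
            # Found plants are controlled; a few new recruits emerge from the seed bank.
            remaining = remaining - found + random.randint(0, 2)
        durations.append(years)
        costs.append(cost)
    return sum(durations) / n_runs, sum(costs) / n_runs

for effort in (0.5, 1.0, 2.0, 4.0):
    years, cost = simulate_programme(effort)
    print(f"search effort {effort:>3}: ~{years:4.1f} years, ~${cost:,.0f}")
```

The point of the toy model matches the abstract's conclusion: raising search effort shortens (and makes more certain) the programme duration, while each year of searching becomes more expensive.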
Abstract:
Hendra virus is a highly pathogenic novel paramyxovirus causing sporadic fatal infection in horses and humans in Australia. Species of fruit-bats (genus Pteropus), commonly known as flying-foxes, are the natural host of the virus. We undertook a survey of horse owners in the states of Queensland and New South Wales, Australia, to assess the level of adoption of recommended risk management strategies and to identify impediments to adoption. Survey questionnaires were completed by 1431 respondents from the target states and from a spectrum of industry sectors. Hendra virus knowledge varied with sector but was generally limited, with only 13% of respondents rating their level of knowledge as high or very high. The majority of respondents (63%) had seen their state's Hendra virus information for horse owners, and a similar proportion found the information useful. Fifty-six percent of respondents thought it moderately, very or extremely likely that a Hendra virus case could occur in their area, yet only 37% said they would consider Hendra virus if their horse was sick. Only 13% of respondents stabled their horses overnight, although another 24% said it would be easy or very easy to do so but had not done so. Only 13% and 15% of respondents, respectively, had horse feed bins and water points under solid cover. Responses varied significantly with state, likely reflecting different Hendra virus history. The survey identified inconsistent awareness and/or adoption of available knowledge, confusion in relation to Hendra virus risk perception (with both over- and under-estimation of true risk), and a lag in the uptake of recommended risk minimisation strategies, even when these were readily implementable. However, we also identified frustration and potential alienation among horse owners who found the recommended strategies impractical, onerous and prohibitively expensive. The insights gained from this survey have broader application to other complex risk-management scenarios.
Abstract:
Because of the variable and changing environment, advisors and farmers are seeking systems that provide risk management support at a number of time scales. The Agricultural Production Systems Research Unit, Toowoomba, Australia, has developed a suite of tools to assist advisors and farmers to better manage risk in cropping. These tools range from simple rainfall analysis tools (Rainman, HowWet, HowOften) through crop simulation tools (WhopperCropper and YieldProphet) to the most complex, APSFarm, a whole-farm analysis tool. Most are derivatives of the APSIM crop model. These tools encompass a range of complexity and offer potential benefit both to the farming community and to government policy. This paper describes the development and usage of two specific products: WhopperCropper and APSFarm. WhopperCropper facilitates simulation-aided discussion of growers' exposure to risk when comparing alternative crop input options. The user can readily generate 'what-if' scenarios that separate the major influences whilst holding other factors constant. Interactions of the major inputs can also be tested. A manager can examine the effects of input levels (and Southern Oscillation Index phase) to broadly determine input levels that match their attitude to risk. APSFarm has been used to demonstrate that management changes can have different effects over short and long time periods. It can be used to test local advisors' and farmers' knowledge and experience of their desired rotation system. This study has shown that crop type has a larger influence than more conservative minimum soil water triggers in the long term. However, in short-term dry periods, minimum soil water triggers and the maximum area of the various crops can give significant financial gains.
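A stripped-down illustration of the WhopperCropper-style 'what-if' comparison: vary one input (here a nitrogen rate) across SOI phases while holding everything else constant, then summarise the spread of gross margins. The yields, prices and costs below are invented for illustration; the actual tool draws on APSIM simulation output.

```python
# 'What-if' scenario sketch: one input varied, other factors held constant,
# with outcomes summarised per Southern Oscillation Index (SOI) phase.
# All yields, prices and costs are hypothetical.
import statistics

# Hypothetical simulated yields (t/ha) for each nitrogen rate under two SOI phases.
sim_yields = {
    ("low N",  "SOI negative"): [1.4, 1.8, 2.0, 1.6],
    ("low N",  "SOI positive"): [2.0, 2.3, 2.5, 2.2],
    ("high N", "SOI negative"): [1.3, 2.1, 2.6, 1.7],
    ("high N", "SOI positive"): [2.8, 3.4, 3.9, 3.1],
}

PRICE_PER_T = 280.0                              # grain price, $/t (assumed)
INPUT_COST = {"low N": 60.0, "high N": 140.0}    # input cost, $/ha (assumed)

for (n_rate, phase), yields in sim_yields.items():
    margins = [y * PRICE_PER_T - INPUT_COST[n_rate] for y in yields]
    print(f"{n_rate:>6} | {phase:<13} "
          f"median GM ${statistics.median(margins):6.0f}/ha, "
          f"worst year ${min(margins):6.0f}/ha")
```

Comparing the median against the worst year for each input level is one way a manager could match input decisions to their attitude to risk, as the abstract describes.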
Abstract:
Hendra virus (HeV) was first described in 1994 in an outbreak of acute and highly lethal disease in horses and humans in Australia. Equine cases continue to be diagnosed periodically, yet the predisposing factors for infection remain unclear. We undertook an analysis of equine submissions tested for HeV by the Queensland government veterinary reference laboratory over a 20-year period to identify and investigate any patterns. We found a marked increase in testing from July 2008, primarily reflecting a broadening of the HeV clinical case definition. Peaks in submissions for testing, and visitations to the Government HeV website, were associated with reported equine incidents. Significantly differing between-year HeV detection rates in north and south Queensland suggest a fundamental difference in risk exposure between the two regions. The statistical association between HeV detection and stockhorse type may suggest that husbandry is a more important risk determinant than breed per se. The detection of HeV in horses with neither neurological nor respiratory signs poses a risk management challenge for attending veterinarians and laboratory staff, reinforcing animal health authority recommendations that appropriate risk management strategies be employed for all sick horses, and by anyone handling sick horses or associated biological samples.
Abstract:
Reliability of supply of feed grain has become a high-priority issue for industry in the northern region. Expansion by major intensive livestock and industrial users of grain, combined with high inter-annual variability in seasonal conditions, has generated concern in the industry about reliability of supply. This paper reports on a modelling study undertaken to analyse the reliability of supply of feed grain in the northern region. Feed grain demand was calculated for the major industries (cattle feedlots, pigs, poultry, dairy) based on their current size and rate of grain usage. Current demand was estimated to be 2.8 Mt. With the development of new industrial users (ethanol) and by projecting the current growth rate of the various intensive livestock industries, it was estimated that demand would grow to 3.6 Mt in three years' time. Feed grain supply was estimated using shire-scale yield prediction models for wheat and sorghum that had been calibrated against recent ABS production data. Other crops that contribute to a lesser extent to the total feed grain pool (barley, maize) were included by considering their production relative to the major winter and summer grains, with estimates based on available production records. This modelling approach allowed simulation of a 101-year time series of yield that showed the extent of the impact of inter-annual climate variability on yield levels. Production estimates were developed from this yield time series by including planted crop area. Area planted data were obtained from ABS and ABARE records. Total production amounts were adjusted to allow for exports and end uses that were not feed grain (flour, malt, etc.). The median feed grain supply for an average area planted was about 3.1 Mt, but this varied greatly from year to year depending on seasonal conditions and area planted. These estimates indicated that supply would not meet current demand in about 30% of years if a median-area crop were planted. Two-thirds of the years with a supply shortfall were El Niño years. This proportion of years was halved (i.e. 15%) if the area planted increased to that associated with the best 10% of years. Should demand grow as projected in this study, there would be few years in which it could be met if a median crop area were planted. With area planted similar to the best 10% of years, there would still be a shortfall in nearly 50% of all years (and 80% of El Niño years). The implications of these results for supply/demand, risk management and investment in research and development are briefly discussed.
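The supply-reliability calculation reduces to counting shortfall years in a long simulated series. The sketch below mimics that step with a synthetic 101-year yield series and assumed planted area, non-feed share and demand; the numbers are illustrative stand-ins for the shire-scale model output and ABS/ABARE data used in the study.

```python
# Illustrative supply/demand reliability sketch: simulate a yield series, convert it
# to feed grain supply, and count years in which supply falls short of demand.
# All figures are assumed placeholders, not the study's estimates.
import random

random.seed(1)
N_YEARS = 101
yields_t_ha = [max(0.5, random.gauss(2.2, 0.7)) for _ in range(N_YEARS)]  # stand-in for model output

MEDIAN_AREA_HA = 1.7e6   # assumed planted area (ha)
NON_FEED_SHARE = 0.15    # assumed share of production exported or used for flour/malt
DEMAND_T = 2.8e6         # current feed grain demand (t), as quoted in the abstract

supply = [y * MEDIAN_AREA_HA * (1 - NON_FEED_SHARE) for y in yields_t_ha]
shortfall_years = sum(s < DEMAND_T for s in supply)

print(f"Median supply: {sorted(supply)[N_YEARS // 2] / 1e6:.1f} Mt")
print(f"Years with supply below demand: {shortfall_years} of {N_YEARS} "
      f"({100 * shortfall_years / N_YEARS:.0f}%)")
```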
Abstract:
Many statistical forecast systems are available to interested users. In order to be useful for decision-making, these systems must be based on evidence of underlying mechanisms. Once causal connections between the mechanism and its statistical manifestation have been firmly established, the forecasts must also provide some quantitative evidence of 'quality'. However, the quality of statistical climate forecast systems (forecast quality) is an ill-defined and frequently misunderstood property. Often, providers and users of such forecast systems are unclear about what 'quality' entails and how to measure it, leading to confusion and misinformation. Here we present a generic framework to quantify aspects of forecast quality using an inferential approach to calculate nominal significance levels (p-values), which can be obtained either by directly applying non-parametric statistical tests such as Kruskal-Wallis (KW) or Kolmogorov-Smirnov (KS), or by using Monte Carlo methods (in the case of forecast skill scores). Once converted to p-values, these forecast quality measures provide a means to objectively evaluate and compare temporal and spatial patterns of forecast quality across datasets and forecast systems. Our analysis demonstrates the importance of providing p-values rather than adopting arbitrarily chosen significance levels such as p < 0.05 or p < 0.01, which is still common practice. This is illustrated by applying non-parametric tests (such as KW and KS) and skill scoring methods (LEPS and RPSS) to the 5-phase Southern Oscillation Index classification system, using historical rainfall data from Australia, the Republic of South Africa and India. The selection of quality measures is based solely on their common use and does not constitute endorsement. We found that non-parametric statistical tests can be adequate proxies for skill measures such as LEPS or RPSS. The framework can be implemented anywhere, regardless of dataset, forecast system or quality measure. Eventually, such inferential evidence should be complemented by descriptive statistical methods in order to fully assist in operational risk management.
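The inferential step described above can be reproduced in a few lines with standard non-parametric tests. The sketch below groups a synthetic rainfall series by SOI phase and reports Kruskal-Wallis and Kolmogorov-Smirnov p-values; a real application would use the historical rainfall records, all five phases and the Monte Carlo treatment of skill scores described in the paper.

```python
# Sketch of the inferential approach: express forecast quality as p-values from
# non-parametric tests applied to rainfall grouped by SOI phase.
# Synthetic rainfall stands in for the historical records used in the paper.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical seasonal rainfall totals (mm) for 3 of the 5 SOI phases.
rain_by_phase = {
    "consistently negative": rng.gamma(shape=2.0, scale=40.0, size=30),
    "near zero":             rng.gamma(shape=2.5, scale=45.0, size=30),
    "consistently positive": rng.gamma(shape=3.0, scale=50.0, size=30),
}

# Kruskal-Wallis: do the phase-conditional rainfall distributions differ at all?
kw_stat, kw_p = stats.kruskal(*rain_by_phase.values())
print(f"Kruskal-Wallis p-value: {kw_p:.4f}")

# Kolmogorov-Smirnov: pairwise comparison of two phases.
ks_stat, ks_p = stats.ks_2samp(rain_by_phase["consistently negative"],
                               rain_by_phase["consistently positive"])
print(f"KS p-value (negative vs positive phase): {ks_p:.4f}")
```

Reporting the p-values themselves, rather than a pass/fail verdict at p < 0.05, is the practice the abstract argues for.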
Abstract:
In recent years, there have been significant developments in climate science relevant to agriculture and natural resource management. Assessing the impacts of climate variability and using seasonal climate forecasts have become increasingly important elements in the management "toolkit" for many Australian farmers. Consideration of climate change further increases the need for improved management strategies. While climate risk extension activities have kept pace with advances in climate science, a national review of the Vocational Education and Training system in Australia in relation to "weather and climate" showed that these topics were "poorly represented" at the management level in the Australian Qualifications Framework and needed increased emphasis. Consequently, a new Unit of Competency concerning management of climatic risk was developed and accredited to address this deficiency. The objective of the unit was to build knowledge and skills for better management of climate variability via the elements of surveying climatic and enterprise data, analysing climatic risks and opportunities, and developing climatic risk management strategies. This paper describes the establishment of a new unit for vocational education that is designed to harness recent developments in applied climate science for better management of Australia's highly variable climate. The main benefits of the new unit of competency, "Developing climatic risk management strategies", were seen as improving decisions in climate and agriculture, and reducing climate risk exposure to enhance sustainable agriculture. The educational unit is now within the scope of agricultural colleges, universities, and registered training organisations as an accredited unit.
Abstract:
The APSIM-Wheat module was used to investigate our present capacity to simulate wheat yields in a semi-arid region of eastern Australia (the Victorian Mallee), where hostile subsoils associated with salinity, sodicity, and boron toxicity are known to limit grain yield. In this study we tested whether the effects of subsoil constraints on wheat growth and production could be modelled with APSIM-Wheat by assuming that either: (a) root exploration within a particular soil layer was reduced by the presence of toxic concentrations of salts, or (b) soil water uptake from a particular soil layer was reduced by high concentrations of salts through osmotic effects. After evaluating the improved predictive capacity of the model, we applied it to study the interactions between subsoil constraints and seasonal conditions, and to estimate the economic effect that subsoil constraints have on wheat farming in the Victorian Mallee under different climatic scenarios. Although the soils had high levels of salinity, sodicity, and boron, the observed variability in root abundance at different soil layers was mainly related to soil salinity. We concluded that: (i) whether the effect of subsoil limitations on growth and yield of wheat in the Victorian Mallee is driven by toxic, osmotic, or both effects acting simultaneously still requires further research; (ii) at present, the performance of APSIM-Wheat in the region can be improved either by assuming increased values of the lower limit for soil water extraction, or by modifying the pattern of root exploration in the soil profile, both as a function of soil salinity. The effect of subsoil constraints on wheat yield and gross margin can be expected to be higher during drier than wetter seasons. In this region the interaction between climate and soil properties makes rainfall information alone of little use for risk management and farm planning when it is not integrated with cropping systems models.
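A hedged sketch of the kind of adjustment described above (reduced soil water availability as a function of salinity), worked outside APSIM itself: per soil layer, plant-available water capacity is scaled down once salinity (ECe) exceeds a threshold. The soil profile, threshold and slope are illustrative, not the Victorian Mallee parameterisation used in the study.

```python
# Illustrative salinity adjustment: scale down each layer's plant-available water
# capacity (PAWC = drained upper limit minus crop lower limit) as ECe increases.
# Profile values, threshold and slope are hypothetical.

layers = [
    {"depth_cm": 30,  "dul_mm": 60.0, "ll_mm": 25.0, "ece": 2.0},
    {"depth_cm": 60,  "dul_mm": 55.0, "ll_mm": 24.0, "ece": 6.0},
    {"depth_cm": 90,  "dul_mm": 50.0, "ll_mm": 22.0, "ece": 10.0},
    {"depth_cm": 120, "dul_mm": 45.0, "ll_mm": 20.0, "ece": 16.0},
]

def salinity_factor(ece, threshold=4.0, slope=0.07):
    """Fraction of the layer still exploitable; declines linearly above a salinity threshold."""
    return max(0.0, min(1.0, 1.0 - slope * max(0.0, ece - threshold)))

print(f"{'depth':>6} {'PAWC (mm)':>10} {'adjusted PAWC (mm)':>20}")
total, total_adj = 0.0, 0.0
for lyr in layers:
    pawc = lyr["dul_mm"] - lyr["ll_mm"]
    adj = pawc * salinity_factor(lyr["ece"])
    total += pawc
    total_adj += adj
    print(f"{lyr['depth_cm']:>6} {pawc:>10.1f} {adj:>20.1f}")
print(f"{'total':>6} {total:>10.1f} {total_adj:>20.1f}")
```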
Abstract:
The traditional reductionist approach to science has a tendency to create 'islands of knowledge in a sea of ignorance', with a much stronger focus on analysis of scientific inputs than on synthesis of socially relevant outcomes. This might be the principal reason why intended end users of climate information generally fail to embrace what the climate science community has to offer. The translation of climate information into real-life action requires 3 essential components: salience (the perceived relevance of the information), credibility (the perceived technical quality of the information) and legitimacy (the perceived objectivity of the process by which the information is shared). We explore each of these components using 3 case studies focused on dryland cropping in Australia, India and Brazil. In regard to 'salience', we discuss the challenge for climate science to be 'policy-relevant', using Australian drought policy as an example. In a village in southern India, 'credibility' was gained through engagement between scientists and risk managers with the aim of building social capital, achieved only at high cost to science institutions. Finally, in Brazil we found that 'legitimacy' is a fragile, yet renewable, resource that needs to be part of the package for successful climate applications; legitimacy can be easily eroded but is difficult to recover. We conclude that climate risk management requires holistic solutions derived from cross-disciplinary and participatory, user-oriented research. Approaches that combine climate, agroecological and socioeconomic models provide the scientific capabilities for the establishment of 'borderless' institutions without disciplinary constraints. Such institutions could provide the necessary support and flexibility to deliver the social benefits of climate science across diverse contexts. Our case studies show that this type of solution is already being applied, and we suggest that the climate science community attempt to address the existing institutional constraints that still impede climate risk management.
Abstract:
This project provided information on the genetics of crown rot (CR) resistance to support breeding work, identified new parent lines in wheat and barley, and provided insight, for risk management purposes, into the yield losses that occur in commercial varieties as levels of CR increase. Genetic experiments found that some highly resistant lines were poor parents and that CR resistance is genetically complex. The best parent lines and many specific crosses were identified for further work. New potential parent lines were identified in wheat and barley, some of which are now used in breeding programs. Yield loss can be severe even with low levels of CR when combined with drought stress, and CR can reduce yield even with a wet finish.
Abstract:
Nipah virus (NiV) (genus Henipavirus) is a recently emerged zoonotic virus that causes severe disease in humans and has been found in bats of the genus Pteropus. Whilst NiV has not been detected in Australia, evidence of NiV infection has been found in pteropid bats in some of Australia's closest neighbours. The aim of this study was to determine the occurrence of henipaviruses in fruit bat (family Pteropodidae) populations to the north of Australia. In particular, we tested the hypothesis that Nipah virus is restricted to west of Wallace's Line. Fruit bats from Australia, Papua New Guinea, East Timor and Indonesia were tested for the presence of antibodies to Hendra virus (HeV) and Nipah virus, and for the presence of HeV, NiV or henipavirus RNA by PCR. Evidence was found for the presence of Nipah virus in both Pteropus vampyrus and Rousettus amplexicaudatus populations from East Timor. Serology and PCR also suggested the presence of a henipavirus that was neither HeV nor NiV in Pteropus alecto and Acerodon celebensis. The results demonstrate the presence of NiV in fruit bat populations on the eastern side of Wallace's Line and within 500 km of Australia. They indicate the presence of non-NiV, non-HeV henipaviruses in fruit bat populations of Sulawesi and Sumba, and possibly in Papua New Guinea. It appears that NiV is present where P. vampyrus occurs, such as in the fruit bat populations of Timor, but where this bat species is absent other henipaviruses may be present, as on Sulawesi and Sumba. Evidence was also obtained for the presence of henipaviruses in the non-pteropid species R. amplexicaudatus and in A. celebensis. The findings of this work fill some gaps in knowledge of the geographical and species distribution of henipaviruses in Australasia, which will contribute to the planning of risk management and surveillance activities.