200 results for Predation risk assessment
Abstract:
Public and private sector organisations worldwide are putting strategies in place to manage the commercial and operational risks of climate change. However, community organisations lag behind in their understanding and preparedness, despite being among the most exposed to climate change impacts and related regulation. This poster presents a proposal for a multidisciplinary study that addresses this issue by developing, testing and applying a novel climate risk assessment methodology tailored to the needs of Australia’s community sector and its clients. Strategies to mitigate risks and to build resilience and adaptive capacity will be identified, including new opportunities afforded by urban informatics, social media, and technologies of scale making.
Abstract:
Background: Despite declining rates of cardiovascular disease (CVD) mortality in developed countries, lower socioeconomic groups continue to experience a greater burden of the disease. There are now many evidence-based treatments and prevention strategies for the management of CVD, and if socioeconomic inequalities in CVD are to be reduced it is essential that their impact on more disadvantaged groups is understood. Aims: To determine whether key interventions for CVD prevention and treatment are effective among lower socioeconomic groups, and to describe barriers to their effectiveness and the potential or actual impact of these interventions on the socioeconomic gradient in CVD. Methods: Interventions were selected from four stages of the CVD continuum: smoking reduction strategies, absolute risk assessment, cardiac rehabilitation, secondary prevention medications, and heart failure self-management programmes. Electronic searches were conducted using terms for each intervention combined with terms for socioeconomic status (SES). Results: Only limited evidence was found for the effectiveness of the selected interventions among lower SES groups, and there was little exploration of socioeconomic-related barriers to their uptake. Some broad themes and key messages were identified. In the majority of the findings examined, it was clear that the underlying material, social and environmental factors associated with disadvantage are a significant barrier to the effectiveness of interventions. Conclusion: Opportunities to reduce socioeconomic inequalities occur at all stages of the CVD continuum. Despite this, current treatment and prevention strategies may be contributing to a widening socioeconomic gradient in CVD. Further research into the impact of best-practice interventions for CVD upon lower SES groups is required.
Abstract:
A method of eliciting prior distributions for Bayesian models using expert knowledge is proposed. Elicitation is a widely studied problem, from a psychological as well as a statistical perspective. Here, we are interested in combining opinions from more than one expert using an explicitly model-based approach, so that we may account for the various sources of variation affecting elicited expert opinions. We use a hierarchical model to achieve this. We apply this approach to two problems. The first is a food safety risk assessment involving dose-response modelling for Listeria monocytogenes contamination of mice. The second concerns the time taken by PhD students to submit their theses in a particular school.
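The hierarchical pooling idea can be sketched numerically. In the simplest normal-normal case (a minimal sketch with made-up numbers, not the paper's actual model or data), each expert's stated opinion is treated as a noisy observation of the underlying parameter, with an extra between-expert variance term, and the pooled posterior is a precision-weighted average:

```python
import numpy as np

# Hypothetical elicited opinions: each expert gives a mean and a
# standard deviation for a dose-response parameter theta.
# (Illustrative values only, not from the paper.)
expert_means = np.array([2.1, 2.6, 1.8])
expert_sds = np.array([0.4, 0.3, 0.5])

tau = 0.2  # assumed between-expert variability (hierarchical level)

# Under a normal hierarchical model with a flat prior on theta,
# the pooled posterior is a precision-weighted average of opinions,
# with each expert's variance inflated by the between-expert term.
w = 1.0 / (expert_sds**2 + tau**2)
pooled_mean = np.sum(w * expert_means) / np.sum(w)
pooled_sd = np.sqrt(1.0 / np.sum(w))

print(pooled_mean, pooled_sd)  # pooled estimate lies among the opinions
```

With a richer hierarchy (for example, expert-specific biases or non-normal likelihoods), the closed form disappears and the posterior would instead be sampled by MCMC, which is closer in spirit to the model-based approach the abstract describes.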
Abstract:
Intelligible and accurate risk-based decision-making requires a complex balance of information from different sources, appropriate statistical analysis of this information, and consequent intelligent inference and decisions made on the basis of these analyses. Importantly, this requires an explicit acknowledgement of uncertainty in the inputs and outputs of the statistical model. The aim of this paper is to progress a discussion of these issues in the context of several motivating problems related to the wider scope of agricultural production. These problems include biosecurity surveillance design, pest incursion, environmental monitoring and import risk assessment. The information to be integrated includes observational and experimental data, remotely sensed data and expert information. We describe our efforts in addressing these problems using Bayesian models and Bayesian networks. These approaches provide a coherent and transparent framework for modelling complex systems, combining the different information sources, and allowing for uncertainty in inputs and outputs. While the theory underlying Bayesian modelling has a long and well-established history, its application to complex problems is only now becoming feasible, due to the increased availability of methodological and computational tools. Of course, there are still hurdles and constraints, which we also address by sharing our endeavours and experiences.
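As a toy illustration of the kind of Bayesian-network reasoning described above (all probabilities hypothetical, not taken from the paper), consider updating the probability of a pest incursion after a surveillance program detects nothing:

```python
# Two-node discrete network: incursion -> detection.
# Illustrative numbers only.
p_incursion = 0.02          # prior probability of a pest incursion
p_detect_given_inc = 0.7    # surveillance sensitivity
p_detect_given_no = 0.01    # false-positive rate

# Bayes' rule: P(incursion | no detection).
p_no_detect = (1 - p_detect_given_inc) * p_incursion + \
              (1 - p_detect_given_no) * (1 - p_incursion)
posterior = (1 - p_detect_given_inc) * p_incursion / p_no_detect

# A negative surveillance result lowers, but does not eliminate,
# the incursion probability.
print(round(posterior, 4))
```

The same calculation extends to larger networks, where software propagates evidence through many such conditional probability tables; the uncertainty in each table can itself be modelled, as in the approaches the abstract describes.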
Abstract:
The naturally low stream salinity in the Nebine-Mungallala Catchment, the extent of vegetation retention, and the relatively low rainfall and high evaporation indicate that there is a relatively low risk of rising shallow groundwater tables in the catchment. Scalding caused by wind and water erosion exposing highly saline sub-soils is a more important regional issue, such as in the Homeboin area. Local salinisation associated with evaporation of bore water from free-flowing bore drains and bores is also an important land degradation issue, particularly in the lower Nebine, Wallam and Mungallala Creeks. The replacement of free-flowing artesian bores and bore drains with capped bores and piped water systems under the Great Artesian Basin bore rehabilitation program is addressing local salinisation and scalding in the vicinity of bore drains and preventing the discharge of saline bore water to streams. Three principles for the prevention and control of salinity in the Nebine-Mungallala catchment have been identified in this review:
• Avoid salinity through avoiding scalds, i.e. not exposing near-surface salt in the landscape through land degradation;
• Riparian zone management: scalding often occurs within 200m or so of watering lines. Natural drainage lines are the most likely areas to be overstocked, and thus have potential for scalding. Scalding begins when vegetation is removed; without that binding cover, wind and water erosion exposes the subsoil; and
• Monitoring of exposed or grazed soil areas.
Based on the findings of the study, we make the following recommendations:
1. Undertake a geotechnical study of existing maps and other data to help identify and target the areas most at risk of rising water tables causing salinity. Selected monitoring should then be established using piezometers as an early warning system.
2. SW NRM should financially support scald reclamation activity through its various funding programs. However, for this to have any validity in the overall management of salinity risk, it is critical that such funding require the landholder to undertake a salinity hazard/risk assessment of their holding.
3. A staged approach to funding may be appropriate. In the first instance, it would be reasonable to commence funding some pilot scald reclamation work with a view to further developing and piloting the farm hazard/risk assessment tools, and to exploring how subsequent grazing management strategies could be incorporated within other extension and management activities. Once the details of the necessary farm-level activities have been more clearly defined, and following the outcomes of the geotechnical review recommended above, a more comprehensive funding package could be rolled out to priority areas.
4. We recommend that the best-practice grazing management training currently on offer be enhanced with information about salinity risk in scald-prone areas and ways of minimising the likelihood of scald formation.
5. We recommend that course material be developed for local students in Years 6 and 7, and that arrangements be made with local schools to present this information. Given the constraints of existing syllabi, we envisage that negotiations may have to be undertaken with the Department of Education for this material to be permitted for use. We have contact with key people who could help with this if required.
6. We recommend that SW NRM continue to support existing extension activities such as Grazing Land Management and the Monitoring Made Easy tools. These aids should be easily expandable to incorporate techniques for monitoring, addressing and preventing salinity and scalding. At the time of writing, SW NRM staff were actively involved in this process. It is important that these activities are adequately resourced so that landholders come to see salinity as an issue that needs to be addressed as part of everyday management.
7. We recommend that SW NRM consider investing in the development and deployment of a scenario-modelling learning support tool as part of its awareness-raising and education activities. Secondary salinity is a dynamic process that results from ongoing human activity which mobilises and/or exposes salt occurring naturally in the landscape. Time scales can be short to very long, and the benefits of management actions can similarly be immediate or very long term. One way to help explain the dynamics of these processes is through scenario modelling.
Abstract:
Different international plant protection organisations advocate different schemes for conducting pest risk assessments. Most of these schemes use structured questionnaire in which experts are asked to score several items using an ordinal scale. The scores are then combined using a range of procedures, such as simple arithmetic mean, weighted averages, multiplication of scores, and cumulative sums. The most useful schemes will correctly identify harmful pests and identify ones that are not. As the quality of a pest risk assessment can depend on the characteristics of the scoring system used by the risk assessors (i.e., on the number of points of the scale and on the method used for combining the component scores), it is important to assess and compare the performance of different scoring systems. In this article, we proposed a new method for assessing scoring systems. Its principle is to simulate virtual data using a stochastic model and, then, to estimate sensitivity and specificity values from these data for different scoring systems. The interest of our approach was illustrated in a case study where several scoring systems were compared. Data for this analysis were generated using a probabilistic model describing the pest introduction process. The generated data were then used to simulate the outcome of scoring systems and to assess the accuracy of the decisions about positive and negative introduction. The results showed that ordinal scales with at most 5 or 6 points were sufficient and that the multiplication-based scoring systems performed better than their sum-based counterparts. The proposed method could be used in the future to assess a great diversity of scoring systems.
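The simulation-based assessment can be sketched as follows (a minimal sketch: the three-component introduction model, the 1-5 scale, and both thresholds are illustrative assumptions, not the article's actual model). Virtual pests are generated with a known "true" status, scored on a coarse ordinal scale, combined by multiplication, and the resulting decisions are tallied into sensitivity and specificity:

```python
import random

random.seed(1)  # reproducible simulation

def to_ordinal(p, points=5):
    """Map a probability in [0, 1] to a 1..points ordinal score."""
    return min(points, int(p * points) + 1)

def simulate(n=5000, truth_threshold=0.15, flag_threshold=27):
    # Each virtual pest has probabilities of entry, establishment,
    # and spread; it is "truly harmful" if their product is large.
    tp = fn = tn = fp = 0
    for _ in range(n):
        probs = [random.random() for _ in range(3)]
        harmful = probs[0] * probs[1] * probs[2] > truth_threshold
        scores = [to_ordinal(p) for p in probs]
        # Multiplication-based combination of the ordinal scores.
        flagged = scores[0] * scores[1] * scores[2] >= flag_threshold
        if harmful and flagged:
            tp += 1
        elif harmful:
            fn += 1
        elif flagged:
            fp += 1
        else:
            tn += 1
    return tp / (tp + fn), tn / (tn + fp)

sens, spec = simulate()
print(sens, spec)
```

A sum-based system would be compared by swapping the combination rule (e.g. `sum(scores) >= threshold`) and re-running the same simulation, which is the style of comparison the abstract describes.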
Abstract:
In 2008, a three-year pilot ‘pay for performance’ (P4P) program, known as the ‘Clinical Practice Improvement Payment’ (CPIP), was introduced into Queensland Health (QHealth). QHealth is a large public health sector provider of acute, community, and public health services in Queensland, Australia. The organisation has recently embarked on a significant reform agenda, including a review of existing funding arrangements (Duckett et al., 2008). Partly in response to this reform agenda, a casemix funding model has been implemented to reconnect health care funding with outcomes. CPIP was conceptualised as a performance-based scheme that rewarded quality with financial incentives. This is the first time such a scheme has been implemented in the public health sector in Australia with a focus on rewarding quality, and it is unique in that it has a large state-wide focus and includes 15 Districts. CPIP initially targeted five acute and community clinical areas: Mental Health, Discharge Medication, Emergency Department, Chronic Obstructive Pulmonary Disease, and Stroke. The CPIP scheme was designed around key concepts including the identification of clinical indicators that met the set criteria of: high disease burden, a well-defined single diagnostic group or intervention, significant variations in clinical outcomes and/or practices, a good evidence base, and clinician control and support (Ward, Daniels, Walker & Duckett, 2007). This evaluative research targeted Phase One of implementation of the CPIP scheme, from January 2008 to March 2009. A formative evaluation utilising a mixed methodology and complementarity analysis was undertaken. The research involved three research questions and aimed to determine the knowledge, understanding, and attitudes of clinicians; to identify improvements to the design, administration, and monitoring of CPIP; and to determine the financial and economic costs of the scheme.
Three key studies were undertaken to address the research questions. Firstly, a survey of clinicians was undertaken to examine their levels of knowledge and understanding and their attitudes to the scheme. Secondly, the study sought to apply Statistical Process Control (SPC) to the process indicators to assess whether this enhanced the scheme. Thirdly, a simple economic cost analysis was undertaken. The CPIP survey of clinicians elicited 192 respondents. Over 70% of these respondents were supportive of the continuation of the CPIP scheme. This finding was also supported by the results of a quantitative attitude survey, which identified positive attitudes in 6 of the 7 domains, with impact, awareness and understanding, and clinical relevance all scored positively across the combined respondent group. SPC as a trending tool may play an important role in the early identification of indicator weakness for the CPIP scheme. This evaluative research study supports a previously identified need in the literature for a phased introduction of pay-for-performance (P4P) type programs. It further highlights the value of undertaking a formal risk assessment of clinician, management, and systemic levels of literacy and competency with the measurement and monitoring of quality prior to a phased implementation. This phasing can then be guided by a P4P Design Variable Matrix, which provides a selection of program design options such as indicator targets and payment mechanisms. It became evident that a clear process is required to standardise how clinical indicators evolve over time and to direct movement towards more rigorous ‘pay for performance’ targets and the development of an optimal funding model. Use of this matrix will enable the scheme to mature and build the literacy and competency of clinicians and the organisation as implementation progresses.
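The role of SPC as a trending tool can be illustrated with a simple p-chart (monthly indicator figures below are hypothetical, not the evaluation's data): compute the mean proportion of compliant cases, place three-sigma control limits around it, and flag any month showing special-cause variation:

```python
import math

# Hypothetical monthly counts for a process indicator, e.g. patients
# receiving discharge medication out of those eligible.
monthly_numerators = [78, 81, 74, 80, 60, 79, 82, 77]
monthly_denominators = [100] * 8

# Centre line: overall mean proportion across all months.
p_bar = sum(monthly_numerators) / sum(monthly_denominators)

signals = []
for i, (x, n) in enumerate(zip(monthly_numerators, monthly_denominators)):
    sigma = math.sqrt(p_bar * (1 - p_bar) / n)  # binomial sd for month i
    lcl, ucl = p_bar - 3 * sigma, p_bar + 3 * sigma
    if not lcl <= x / n <= ucl:
        signals.append(i)  # month flagged as special-cause variation

print(signals)  # the month with 60/100 falls below the lower limit
```

Flagged months would prompt investigation of the indicator or the underlying process rather than an automatic payment decision, which matches the "early identification of indicator weakness" role described above.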
Furthermore, the research identified that CPIP created a spotlight on clinical indicators, and incentive payments of over five million dollars, from a potential ten million, were secured across the five clinical areas in the first 15 months of the scheme. This indicates that quality was rewarded in the new QHealth funding model and that, despite issues being identified with the payment mechanism, funding was distributed. The economic model used identified a relatively low cost of reporting (under $8,000) compared with funds secured of over $300,000 for mental health, as an example. Movement to a full cost-effectiveness study of CPIP is supported. Overall, the introduction of the CPIP scheme into QHealth has been a positive and effective strategy for engaging clinicians in quality and has been the catalyst for the identification and monitoring of valuable clinical process indicators. This research has highlighted that clinicians are supportive of the scheme in general; however, there are some significant risks, including the functioning of the CPIP payment mechanism. Given clinician support for the use of a pay-for-performance methodology in QHealth, the CPIP scheme has the potential to be a powerful addition to a multi-faceted suite of quality improvement initiatives within QHealth.
Abstract:
There is no doubt that fraud in relation to land transactions is a problem that resonates amongst land academics, practitioners, and stakeholders involved in conveyancing. As each land registration and conveyancing process increasingly moves towards a fully electronic environment, we need to make sure that we understand and guard against the frauds that can occur. This paper examines the types of fraud that have occurred in paper-based conveyancing systems in Australia and considers how they might be undertaken in the National Electronic Conveyancing System (NECS) currently under development. Whilst no system can ever be infallible, it is suggested that by correctly imposing the responsibility for identity verification on the appropriate individual, the conveyancing system adopted can achieve the optimum level of fairness in terms of allocation of responsibility and loss. As we sit on the cusp of a new era of electronic conveyancing, the framework suggested here provides a model for minimising the risk of forged mortgages and appropriately allocating the loss. Importantly, it also recognises that the electronic environment will present new opportunities for those with criminal intent to undermine the integrity of land transactions. An appreciation of this now can ensure that appropriate measures are put in place to minimise the risk.
Abstract:
The research objectives of this thesis were to contribute to Bayesian statistical methodology, in risk assessment and in spatial and spatio-temporal modelling, by modelling error structures using complex hierarchical models. Specifically, I hoped to consider two applied areas and to use these applications as a springboard for developing new statistical methods, as well as undertaking analyses that might give answers to particular applied questions. Thus, this thesis considers a series of models, firstly in the context of risk assessments for recycled water, and secondly in the context of water usage by crops. The research objective was to model error structures using hierarchical models in two problems: risk assessment analyses for wastewater, and a four-dimensional dataset assessing differences between cropping systems over time and over three spatial dimensions. The aim was to use the simplicity and insight afforded by Bayesian networks to develop appropriate models for risk scenarios, and to use Bayesian hierarchical models to explore the necessarily complex modelling of the four-dimensional agricultural data. The specific objectives of the research were: to develop a method for the calculation of credible intervals for the point estimates of Bayesian networks; to develop a model structure that incorporates all the experimental uncertainty associated with various constants, thereby allowing the calculation of more credible credible intervals for a risk assessment; to model a single day’s data from the agricultural dataset in a way that satisfactorily captured the complexities of the data; to build a model for several days’ data, in order to consider how the full data might be modelled; and finally to build a model for the full four-dimensional dataset and to consider the time-varying nature of the contrast of interest, having satisfactorily accounted for possible spatial and temporal autocorrelations.
This work forms five papers, two of which have been published, two submitted, and the final paper still in draft. The first two objectives were met by recasting the risk assessments as directed acyclic graphs (DAGs). In the first case, we elicited uncertainty for the conditional probabilities needed by the Bayesian net, incorporated these into a corresponding DAG, and used Markov chain Monte Carlo (MCMC) to find credible intervals for all the scenarios and outcomes of interest. In the second case, we incorporated the experimental data underlying the risk assessment constants into the DAG, and also treated some of those data as an ‘errors-in-variables’ problem [Fuller, 1987]. This illustrated a simple method for the incorporation of experimental error into risk assessments. In considering one day of the three-dimensional agricultural data, it became clear that geostatistical models or conditional autoregressive (CAR) models over the three dimensions were not the best way to approach the data. Instead, CAR models were used with neighbours only in the same depth layer. This gave flexibility to the model, allowing both the spatially structured and non-structured variances to differ at all depths. We call this model the CAR layered model. Given the experimental design, the fixed part of the model could have been modelled as a set of means by treatment and by depth, but doing so would allow little insight into how the treatment effects vary with depth. Hence, a number of essentially non-parametric approaches were taken to examine the effects of depth on treatment, with the model of choice incorporating an errors-in-variables approach for depth in addition to a non-parametric smooth. The statistical contribution here was the introduction of the CAR layered model; the applied contribution, the analysis of moisture over depth and estimation of the contrast of interest together with its credible intervals.
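The "neighbours only in the same depth layer" structure of the CAR layered model can be sketched as an adjacency construction (a minimal sketch with illustrative grid dimensions; the actual model then places a CAR prior on each layer's spatial effects, which is not shown here):

```python
# Cells are indexed (row, col, depth). Neighbours are the 4-connected
# horizontally adjacent cells in the SAME depth layer only, so the
# structured and unstructured variance components can differ freely
# between depths. Grid dimensions are illustrative.
ROWS, COLS, DEPTHS = 4, 4, 3

def neighbours(r, c, d):
    """4-connected neighbours of (r, c, d), restricted to layer d."""
    out = []
    for dr, dc in ((-1, 0), (1, 0), (0, -1), (0, 1)):
        rr, cc = r + dr, c + dc
        if 0 <= rr < ROWS and 0 <= cc < COLS:
            out.append((rr, cc, d))  # same d: edges never cross layers
    return out

adjacency = {
    (r, c, d): neighbours(r, c, d)
    for r in range(ROWS) for c in range(COLS) for d in range(DEPTHS)
}

# Sanity check: no edge crosses a depth boundary.
assert all(nb[2] == d for (r, c, d), nbs in adjacency.items() for nb in nbs)
print(len(adjacency))
```

In a full implementation, this per-layer adjacency would be passed to a CAR prior (for example via WinBUGS's `car.normal`, as the fitted models used WinBUGS), with one pair of variance parameters per depth layer.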
These models were fitted using WinBUGS [Lunn et al., 2000]. The work in the fifth paper deals with the fact that, for large datasets, the use of WinBUGS becomes problematic because of its highly correlated term-by-term updating. In this work, we introduce a Gibbs sampler with block updating for the CAR layered model. The Gibbs sampler was implemented by Chris Strickland using pyMCMC [Strickland, 2010]. This framework is then used to consider five days’ data, and we show that soil moisture under each of the various treatments reaches a level particular to that treatment at a depth of 200 cm and thereafter stays constant, albeit with variance increasing with depth. In an analysis across three spatial dimensions and across time, there are many interactions of time with the spatial dimensions to be considered. Hence, we chose to use a daily model and to repeat the analysis at all time points, effectively creating an interaction model of time by the daily model. Such an approach allows great flexibility. However, it does not allow insight into the way in which the parameter of interest varies over time. Hence, a two-stage approach was also used, with estimates from the first stage analysed as a set of time series. We see this spatio-temporal interaction model as a useful approach to data measured across three spatial dimensions and time, since it does not assume additivity of the random spatial or temporal effects.