815 results for Predation risk assessment


Relevance: 80.00%

Abstract:

Background: Despite declining rates of cardiovascular disease (CVD) mortality in developed countries, lower socioeconomic groups continue to experience a greater burden of the disease. There are now many evidence-based treatments and prevention strategies for the management of CVD, and it is essential that their impact on more disadvantaged groups is understood if socioeconomic inequalities in CVD are to be reduced. Aims: To determine whether key interventions for CVD prevention and treatment are effective among lower socioeconomic groups, and to describe barriers to their effectiveness and the potential or actual impact of these interventions on the socioeconomic gradient in CVD. Methods: Interventions were selected from four stages of the CVD continuum: smoking reduction strategies, absolute risk assessment, cardiac rehabilitation, secondary prevention medications, and heart failure self-management programmes. Electronic searches were conducted using terms for each intervention combined with terms for socioeconomic status (SES). Results: Only limited evidence was found for the effectiveness of the selected interventions among lower SES groups, and there was little exploration of socioeconomic barriers to their uptake. Some broad themes and key messages were identified. In the majority of findings examined, it was clear that the underlying material, social and environmental factors associated with disadvantage are a significant barrier to the effectiveness of interventions. Conclusion: Opportunities to reduce socioeconomic inequalities occur at all stages of the CVD continuum. Despite this, current treatment and prevention strategies may be contributing to the widening socioeconomic CVD gradient. Further research into the impact of best-practice interventions for CVD on lower SES groups is required.

Relevance: 80.00%

Abstract:

A method is proposed for eliciting prior distributions for Bayesian models using expert knowledge. Elicitation is a widely studied problem, from a psychological as well as a statistical perspective. Here, we are interested in combining opinions from more than one expert using an explicitly model-based approach, so that we may account for the various sources of variation affecting elicited expert opinions. We use a hierarchical model to achieve this. We apply the approach to two problems. The first is a food safety risk assessment problem, modelling the dose-response relationship for Listeria monocytogenes contamination in mice. The second concerns the time taken by PhD students in a particular school to submit their theses.
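
To make the pooling idea concrete, the following is a minimal sketch assuming a simple hierarchical Gaussian model, in which each expert's stated mean and standard deviation are treated as noisy observations of the quantity of interest; the numbers, the flat priors and the grid approximation are illustrative assumptions, not the authors' model.

```python
import numpy as np

# Hypothetical elicited opinions: each expert states a best guess (mean)
# and an uncertainty (standard deviation) for the same quantity.
expert_mean = np.array([2.1, 2.8, 1.9, 3.4])
expert_sd   = np.array([0.5, 0.4, 0.6, 0.8])

# Hierarchical model: x_j ~ N(theta, s_j^2 + tau^2), where tau captures
# between-expert variation beyond each expert's stated uncertainty.
theta_grid = np.linspace(0, 6, 301)
tau_grid   = np.linspace(0.01, 3, 150)
T, U = np.meshgrid(theta_grid, tau_grid, indexing="ij")

log_post = np.zeros_like(T)
for m, s in zip(expert_mean, expert_sd):
    var = s**2 + U**2
    log_post += -0.5 * np.log(2 * np.pi * var) - (m - T) ** 2 / (2 * var)
# Flat priors on theta and tau (an assumption made for this sketch).

post = np.exp(log_post - log_post.max())
post /= post.sum()

theta_marginal = post.sum(axis=1)
print("Posterior mean of theta:", np.sum(theta_grid * theta_marginal))
```

The between-expert term tau is what distinguishes this hierarchical pooling from a simple precision-weighted average of the experts' stated opinions.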

Relevance: 80.00%

Abstract:

Intelligible and accurate risk-based decision-making requires a complex balance of information from different sources, appropriate statistical analysis of this information, and intelligent inference and decisions made on the basis of these analyses. Importantly, this requires explicit acknowledgement of uncertainty in the inputs and outputs of the statistical model. The aim of this paper is to progress a discussion of these issues in the context of several motivating problems related to the wider scope of agricultural production. These problems include biosecurity surveillance design, pest incursion, environmental monitoring and import risk assessment. The information to be integrated includes observational and experimental data, remotely sensed data and expert information. We describe our efforts in addressing these problems using Bayesian models and Bayesian networks. These approaches provide a coherent and transparent framework for modelling complex systems, combining the different information sources, and allowing for uncertainty in inputs and outputs. While the theory underlying Bayesian modelling has a long and well-established history, its application to complex problems is only now becoming feasible, due to the increased availability of methodological and computational tools. Of course, there are still hurdles and constraints, which we also address by sharing our endeavours and experiences.
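
As a toy illustration of the kind of query such models support, here is a minimal two-node Bayesian network for an import risk scenario, evaluated by enumeration; the structure and probabilities are hypothetical rather than taken from the paper.

```python
# Minimal two-node Bayesian network for an import risk scenario, queried by
# enumeration. All numbers are hypothetical.
p_pest = 0.02                  # prior: consignment carries the pest
p_detect_given_pest = 0.70     # surveillance sensitivity
p_detect_given_clean = 0.01    # false alarm rate

# Bayes' rule: probability the pest is present given NO detection.
p_no_detect_and_pest = (1 - p_detect_given_pest) * p_pest
p_no_detect_and_clean = (1 - p_detect_given_clean) * (1 - p_pest)
p_pest_given_no_detect = p_no_detect_and_pest / (
    p_no_detect_and_pest + p_no_detect_and_clean)
print(f"P(pest | no detection) = {p_pest_given_no_detect:.4f}")
```

A full network would chain further nodes (establishment, spread, impact) in the same way, with the uncertainty in each conditional probability carried through to the output.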

Relevance: 80.00%

Abstract:

The naturally low stream salinity in the Nebine-Mungallala Catchment, the extent of vegetation retention, and the relatively low rainfall and high evaporation indicate that there is a relatively low risk of rising shallow groundwater tables in the catchment. Scalding caused by wind and water erosion exposing highly saline sub-soils is a more important regional issue, for example in the Homeboin area. Local salinisation associated with the evaporation of bore water from free-flowing bores and bore drains is also an important land degradation issue, particularly along the lower Nebine, Wallam and Mungallala Creeks. The replacement of free-flowing artesian bores and bore drains with capped bores and piped water systems under the Great Artesian Basin bore rehabilitation program is addressing local salinisation and scalding in the vicinity of bore drains and preventing the discharge of saline bore water to streams.

Three principles for the prevention and control of salinity in the Nebine-Mungallala catchment have been identified in this review:

• Avoid salinity by avoiding scalds, i.e. by not exposing near-surface salt in the landscape through land degradation;
• Riparian zone management: scalding often occurs within 200 m or so of watering lines. Natural drainage lines are the most likely to be overstocked, and thus have potential for scalding. Scalding begins when vegetation is removed; without that binding cover, wind and water erosion exposes the subsoil; and
• Monitor exposed or grazed soil areas.

Based on the findings of the study, we make the following recommendations:

1. Undertake a geotechnical study of existing maps and other data to help identify and target the areas most at risk of rising water tables causing salinity. Selected monitoring should then be established, using piezometers as an early warning system.
2. SW NRM should financially support scald reclamation activity through its various funding programs. However, for this to have any validity in the overall management of salinity risk, it is critical that such funding require the landholder to undertake a salinity hazard/risk assessment of his or her holding.
3. A staged approach to funding may be appropriate. In the first instance, it would be reasonable to commence funding some pilot scald reclamation work, with a view to further developing and piloting the farm hazard/risk assessment tools and exploring how subsequent grazing management strategies could be incorporated within other extension and management activities. Once the details of the necessary farm-level activities have been more clearly defined, and following the outcomes of the geotechnical review recommended above, a more comprehensive funding package could be rolled out to priority areas.
4. Best-practice grazing management training currently on offer should be enhanced with information about salinity risk in scald-prone areas and about ways of minimising the likelihood of scald formation.
5. Course material should be developed for local students in Years 6 and 7, and arrangements made with local schools to present this information. Given the constraints of existing syllabi, negotiations may have to be undertaken with the Department of Education for this material to be permitted for use. We have contact with key people who could help with this if required.
6. SW NRM should continue to support existing extension activities such as Grazing Land Management and the Monitoring Made Easy tools. These aids should be easily extendable to incorporate techniques for monitoring, addressing and preventing salinity and scalding. At the time of writing, SW NRM staff were actively involved in this process. It is important that these activities are adequately resourced, so that landholders come to see salinity as an issue to be addressed as part of everyday management.
7. SW NRM should consider investing in the development and deployment of a scenario-modelling learning support tool as part of its awareness-raising and education activities. Secondary salinity is a dynamic process resulting from ongoing human activity that mobilises and/or exposes salt occurring naturally in the landscape. Time scales can be short to very long, and the benefits of management actions can similarly be realised over immediate or very long time frames. One way to help explain the dynamics of these processes is through scenario modelling, as sketched below.
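
A toy version of such a scenario model might look as follows; every parameter, rate and threshold here is invented for illustration and is not calibrated to the Nebine-Mungallala catchment.

```python
# Hypothetical scenario model of scald formation under a destocking scenario.
# Functional forms and all numbers are illustrative assumptions only.
years = 50
cover = 0.8               # fraction of ground with vegetation cover
scald_area = 0.0          # fraction of paddock scalded
grazing_pressure = 0.06   # annual loss of cover under heavy stocking
regrowth = 0.03           # annual recovery of cover when destocked
destock_year = 25         # scenario: management intervention at year 25

history = []
for year in range(years):
    if year < destock_year:
        cover = max(0.0, cover - grazing_pressure)
    else:
        cover = min(1.0, cover + regrowth)
    # Once cover falls below a binding threshold, erosion exposes saline
    # subsoil and the scalded area grows; recovery is much slower.
    if cover < 0.3:
        scald_area = min(1.0, scald_area + 0.05 * (0.3 - cover))
    else:
        scald_area = max(0.0, scald_area - 0.005)
    history.append((year, cover, scald_area))

for year, c, s in history[::10]:
    print(f"year {year:2d}: cover={c:.2f}, scald fraction={s:.3f}")
```

Even a simple model like this conveys the key message for landholders: scald area responds slowly and asymmetrically, so prevention is far cheaper than reclamation.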

Relevance: 80.00%

Abstract:

Different international plant protection organisations advocate different schemes for conducting pest risk assessments. Most of these schemes use a structured questionnaire in which experts are asked to score several items on an ordinal scale. The scores are then combined using a range of procedures, such as simple arithmetic means, weighted averages, multiplication of scores, and cumulative sums. The most useful schemes will correctly identify harmful pests while screening out those that are not. As the quality of a pest risk assessment can depend on the characteristics of the scoring system used by the risk assessors (i.e., on the number of points of the scale and on the method used for combining the component scores), it is important to assess and compare the performance of different scoring systems. In this article, we propose a new method for assessing scoring systems. Its principle is to simulate virtual data using a stochastic model and then to estimate sensitivity and specificity values from these data for different scoring systems. The value of our approach is illustrated in a case study comparing several scoring systems. Data for this analysis were generated using a probabilistic model describing the pest introduction process. The generated data were then used to simulate the outcome of scoring systems and to assess the accuracy of the resulting positive and negative decisions about introduction. The results showed that ordinal scales with at most 5 or 6 points were sufficient and that multiplication-based scoring systems performed better than their sum-based counterparts. The proposed method could be used in the future to assess a great diversity of scoring systems.
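
The simulation logic can be sketched as follows. This is an illustrative reconstruction, not the authors' model: latent component risks generate virtual pests, experts score each component on a k-point ordinal scale, and sensitivity and specificity are compared for sum- and product-based combinations under a simple median threshold.

```python
import numpy as np

rng = np.random.default_rng(42)

# Simulate virtual pests: each has three latent component probabilities
# (e.g. entry, establishment, spread) whose product is the true risk.
n = 10_000
p_components = rng.beta(2, 5, size=(n, 3))   # latent component risks
p_intro = p_components.prod(axis=1)          # true probability of introduction
introduced = rng.random(n) < p_intro         # simulated outcome

# Experts score each component on a k-point ordinal scale.
def to_ordinal(p, k):
    return np.ceil(p * k).clip(1, k)

for k in (3, 5, 9):
    scores = to_ordinal(p_components, k)
    sum_score = scores.sum(axis=1)
    prod_score = scores.prod(axis=1)
    for name, s in (("sum", sum_score), ("product", prod_score)):
        flagged = s >= np.median(s)          # simple decision threshold
        sens = (flagged & introduced).sum() / introduced.sum()
        spec = (~flagged & ~introduced).sum() / (~introduced).sum()
        print(f"k={k} {name:7s} sensitivity={sens:.2f} specificity={spec:.2f}")
```

Because the data are simulated, the true introduction status of each virtual pest is known, which is what makes sensitivity and specificity estimable at all.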

Relevance: 80.00%

Abstract:

In 2008, a three-year pilot ‘pay for performance’ (P4P) program, known as the ‘Clinical Practice Improvement Payment’ (CPIP), was introduced into Queensland Health (QHealth). QHealth is a large public health sector provider of acute, community, and public health services in Queensland, Australia. The organisation has recently embarked on a significant reform agenda, including a review of existing funding arrangements (Duckett et al., 2008). Partly in response to this reform agenda, a casemix funding model has been implemented to reconnect health care funding with outcomes. CPIP was conceptualised as a performance-based scheme that rewarded quality with financial incentives. This is the first time such a scheme has been implemented in the public health sector in Australia with a focus on rewarding quality, and it is unique in its state-wide focus across 15 Districts. CPIP initially targeted five acute and community clinical areas: Mental Health, Discharge Medication, Emergency Department, Chronic Obstructive Pulmonary Disease, and Stroke. The CPIP scheme was designed around key concepts, including the identification of clinical indicators that met set criteria: high disease burden; a well-defined single diagnostic group or intervention; significant variations in clinical outcomes and/or practices; a good evidence base; and clinician control and support (Ward, Daniels, Walker & Duckett, 2007). This evaluative research targeted Phase One of implementation of the CPIP scheme, from January 2008 to March 2009. A formative evaluation utilising a mixed methodology and complementarity analysis was undertaken. The research involved three research questions and aimed to determine the knowledge, understanding, and attitudes of clinicians; to identify improvements to the design, administration, and monitoring of CPIP; and to determine the financial and economic costs of the scheme. Three key studies were undertaken to address these questions. Firstly, a survey of clinicians examined their levels of knowledge and understanding and their attitudes to the scheme. Secondly, the study sought to apply Statistical Process Control (SPC) to the process indicators, to assess whether this enhanced the scheme. Thirdly, a simple economic cost analysis was undertaken. The CPIP survey elicited 192 clinician respondents, over 70% of whom supported the continuation of the CPIP scheme. This finding was also supported by the results of a quantitative attitude survey that identified positive attitudes in 6 of the 7 domains, including impact, awareness and understanding, and clinical relevance, all scored positive across the combined respondent group. SPC as a trending tool may play an important role in the early identification of indicator weakness in the CPIP scheme. This evaluative research supports a previously identified need in the literature for a phased introduction of P4P-type programs. It further highlights the value of undertaking a formal risk assessment of clinician, management, and systemic levels of literacy and competency with the measurement and monitoring of quality prior to a phased implementation. This phasing can then be guided by a P4P Design Variable Matrix, which provides a selection of program design options such as indicator targets and payment mechanisms.
It became evident that a clear process is required to standardise how clinical indicators evolve over time and to direct movement towards more rigorous ‘pay for performance’ targets and the development of an optimal funding model. Use of this matrix will enable the scheme to mature and build the literacy and competency of clinicians and the organisation as implementation progresses. Furthermore, the research identified that CPIP created a spotlight on clinical indicators, and that incentive payments of over $5 million, from a potential $10 million, were secured across the five clinical areas in the first 15 months of the scheme. This indicates that quality was rewarded in the new QHealth funding model and, despite issues being identified with the payment mechanism, funding was distributed. The economic model identified a relatively low cost of reporting (under $8,000) compared with funds secured of over $300,000 for Mental Health, as an example. Movement to a full cost-effectiveness study of CPIP is supported. Overall, the introduction of the CPIP scheme into QHealth has been a positive and effective strategy for engaging clinicians in quality, and has been the catalyst for the identification and monitoring of valuable clinical process indicators. This research has highlighted that clinicians are supportive of the scheme in general; however, there are some significant risks, including the functioning of the CPIP payment mechanism. Given clinician support for the use of a pay-for-performance methodology in QHealth, the CPIP scheme has the potential to be a powerful addition to a multi-faceted suite of quality improvement initiatives within QHealth.
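
For readers unfamiliar with SPC as a trending tool, the following is a minimal sketch of a p-chart applied to a hypothetical monthly process indicator; the indicator, sample sizes and counts are invented and are not CPIP data.

```python
import numpy as np

# p-chart for a hypothetical monthly process indicator, e.g. the proportion
# of discharges meeting a Discharge Medication criterion. Numbers invented.
compliant = np.array([78, 82, 75, 80, 85, 79, 66, 81, 84, 77, 83, 80])
sampled   = np.array([100] * 12)

p = compliant / sampled
p_bar = compliant.sum() / sampled.sum()          # centre line
sigma = np.sqrt(p_bar * (1 - p_bar) / sampled)   # per-month standard error
ucl, lcl = p_bar + 3 * sigma, p_bar - 3 * sigma  # 3-sigma control limits

for month, (pi, lo, hi) in enumerate(zip(p, lcl, ucl), start=1):
    flag = "OUT OF CONTROL" if not (lo <= pi <= hi) else ""
    print(f"month {month:2d}: p={pi:.2f} limits=({lo:.2f}, {hi:.2f}) {flag}")
```

Month 7 falls below the lower control limit, illustrating how a chart like this can flag indicator weakness early, before an annual review would catch it.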

Relevance: 80.00%

Abstract:

There is no doubt that fraud in relation to land transactions is a problem that resonates amongst land academics, practitioners, and stakeholders involved in conveyancing. As land registration and conveyancing increasingly move towards a fully electronic environment, we need to make sure that we understand, and guard against, the frauds that can occur. This paper examines the types of fraud that have occurred in paper-based conveyancing systems in Australia and considers how they might be undertaken in the National Electronic Conveyancing System (NECS) currently under development. Whilst no system can ever be infallible, it is suggested that by correctly imposing the responsibility for identity verification on the appropriate individual, the conveyancing system adopted can achieve the optimum level of fairness in the allocation of responsibility and loss. As we sit on the cusp of a new era of electronic conveyancing, the framework suggested here provides a model for minimising the risk of forged mortgages and appropriately allocating the loss. Importantly, it also recognises that the electronic environment will present new opportunities for those with criminal intent to undermine the integrity of land transactions. An appreciation of this now will allow the appropriate measures to be put in place to minimise the risk.

Relevance: 80.00%

Abstract:

The research objectives of this thesis were to contribute to Bayesian statistical methodology, both in risk assessment and in spatial and spatio-temporal methodology, by modelling error structures using complex hierarchical models. Specifically, I hoped to consider two applied areas and to use these applications as a springboard for developing new statistical methods, as well as undertaking analyses which might answer particular applied questions. Thus, this thesis considers a series of models, firstly in the context of risk assessments for recycled water, and secondly in the context of water usage by crops. The research objective was to model error structures using hierarchical models in two problems: risk assessment analyses for wastewater; and a four-dimensional dataset assessing differences between cropping systems over time and over three spatial dimensions. The aim was to use the simplicity and insight afforded by Bayesian networks to develop appropriate models for risk scenarios, and to use Bayesian hierarchical models for the necessarily complex modelling of the four-dimensional agricultural data. The specific objectives of the research were: to develop a method for the calculation of credible intervals for the point estimates of Bayesian networks; to develop a model structure incorporating all the experimental uncertainty associated with various constants, thereby allowing the calculation of more credible credible intervals for a risk assessment; to model a single day's data from the agricultural dataset in a way that satisfactorily captured the complexities of the data; to build a model for several days' data, in order to consider how the full data might be modelled; and finally to build a model for the full four-dimensional dataset and to consider the time-varying nature of the contrast of interest, having satisfactorily accounted for possible spatial and temporal autocorrelations. This work forms five papers: two have been published, two are submitted, and the final paper is still in draft. The first two objectives were met by recasting the risk assessments as directed acyclic graphs (DAGs). In the first case, we elicited uncertainty for the conditional probabilities needed by the Bayesian net, incorporated these into a corresponding DAG, and used Markov chain Monte Carlo (MCMC) to find credible intervals for all the scenarios and outcomes of interest. In the second case, we incorporated the experimental data underlying the risk assessment constants into the DAG, and treated some of those data as an ‘errors-in-variables’ problem [Fuller, 1987]. This illustrated a simple method for incorporating experimental error into risk assessments. In considering one day of the three-dimensional agricultural data, it became clear that geostatistical models or conditional autoregressive (CAR) models over the three dimensions were not the best way to approach the data. Instead, CAR models were used with neighbours only in the same depth layer. This gave flexibility to the model, allowing both the spatially structured and unstructured variances to differ at all depths. We call this model the CAR layered model. Given the experimental design, the fixed part of the model could have been modelled as a set of means by treatment and by depth, but doing so allows little insight into how the treatment effects vary with depth.
Hence, a number of essentially non-parametric approaches were taken to examine the effects of depth on treatment, with the model of choice incorporating an errors-in-variables approach for depth in addition to a non-parametric smooth. The statistical contribution here was the introduction of the CAR layered model; the applied contribution was the analysis of moisture over depth and the estimation of the contrast of interest together with its credible intervals. These models were fitted using WinBUGS [Lunn et al., 2000]. The work in the fifth paper addresses the fact that, with large datasets, the use of WinBUGS becomes more problematic because of its highly correlated term-by-term updating. In this work, we introduce a Gibbs sampler with block updating for the CAR layered model. The Gibbs sampler was implemented by Chris Strickland using pyMCMC [Strickland, 2010]. This framework is then used to consider five days' data, and we show that soil moisture for all the various treatments reaches levels particular to each treatment at a depth of 200 cm and thereafter stays constant, albeit with variances that increase with depth. In an analysis across three spatial dimensions and across time, there are many interactions of time and the spatial dimensions to be considered. Hence, we chose to use a daily model and to repeat the analysis at all time points, effectively creating an interaction model of time by the daily model. Such an approach allows great flexibility. However, it does not allow insight into the way in which the parameter of interest varies over time. Hence, a two-stage approach was also used, with estimates from the first stage analysed as a set of time series. We see this spatio-temporal interaction model as a useful approach to data measured across three spatial dimensions and time, since it does not assume additivity of the random spatial or temporal effects.
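
To make the CAR layered structure concrete, here is a minimal sketch of its precision matrix, assuming an intrinsic CAR prior within each depth layer and no neighbours across layers; the lattice size and per-layer precisions are illustrative, and the thesis's actual parameterisation may differ.

```python
import numpy as np

def car_precision_2d(nrow, ncol):
    """Intrinsic CAR precision for an nrow x ncol lattice: Q = D - W."""
    n = nrow * ncol
    W = np.zeros((n, n))
    for r in range(nrow):
        for c in range(ncol):
            i = r * ncol + c
            for dr, dc in ((1, 0), (0, 1)):   # right / down neighbours
                rr, cc = r + dr, c + dc
                if rr < nrow and cc < ncol:
                    j = rr * ncol + cc
                    W[i, j] = W[j, i] = 1.0
    return np.diag(W.sum(axis=1)) - W

nrow, ncol, n_layers = 4, 5, 3
tau_by_layer = np.array([1.0, 0.5, 0.2])   # spatial precision per depth layer

# Block-diagonal precision: layers are conditionally independent, so the
# spatially structured variance can differ at every depth.
Q_layer = car_precision_2d(nrow, ncol)
Q = np.block([[tau_by_layer[i] * Q_layer if i == j else np.zeros_like(Q_layer)
               for j in range(n_layers)] for i in range(n_layers)])
print("Full precision matrix shape:", Q.shape)   # (60, 60)
```

The block-diagonal form is also what makes block updating in a Gibbs sampler attractive: each layer's spatial effects can be updated jointly rather than term by term.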

Relevance: 80.00%

Abstract:

The World Health Organization recommends that the majority of water monitoring laboratories in the world test for E. coli daily, since thermotolerant coliforms and E. coli are key indicators for the risk assessment of recreational waters. Recently, we developed a new SNP method for typing E. coli strains, by which human-specific genotypes were identified. Here, we report the presence of these previously described specific SNP profiles in environmental water sourced from the Coomera River, in South East Queensland, Australia, over a period of two years. This study tested for the presence of human-specific E. coli to ascertain whether hydrologic and anthropogenic activity plays a key role in the pollution of the investigated watershed or whether the pollution is from other sources. We found six human-specific SNP profiles and one animal-specific SNP profile consistently across sampling sites and times. We have demonstrated that our SNP genotyping method is able to rapidly identify and characterise human- and animal-specific E. coli isolates in water sources.
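
The source-attribution step can be pictured with a toy lookup like the following; the SNP profiles shown are entirely invented, since the abstract does not list the actual profiles.

```python
# Toy sketch of matching isolate SNP profiles against reference sets.
# All profiles below are made up for illustration.
human_profiles = {("A", "G", "T", "C"), ("A", "G", "C", "C")}
animal_profiles = {("G", "G", "T", "T")}

isolates = {
    "site1_sample": ("A", "G", "T", "C"),
    "site2_sample": ("G", "G", "T", "T"),
    "site3_sample": ("A", "A", "T", "C"),
}

for name, profile in isolates.items():
    if profile in human_profiles:
        source = "human-associated"
    elif profile in animal_profiles:
        source = "animal-associated"
    else:
        source = "unclassified"
    print(f"{name}: {source}")
```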

Relevance: 80.00%

Abstract:

This paper discusses human factors issues of low-cost railway level crossings in Australia. Several issues are discussed, including safety at passive railway level crossings, human factors considerations associated with the unavailability of a warning device, and a conceptual model of how safety could be compromised at railway level crossings following prolonged or frequent unavailability. The research plans to quantify safety risk to motorists at level crossings using a Human Reliability Assessment (HRA) method, supported by data collected using an advanced driving simulator. This method aims to identify human error within the tasks and task units identified as part of the task analysis process. It is anticipated that, by modelling driver behaviour, the current study will be able to quantify meaningful task variability, including temporal parameters, both between and within participants. The performance of complex tasks such as driving through a level crossing is fundamentally context-bound. Therefore, this study also aims to quantify the performance-shaping factors that contribute to vehicle-train collisions by highlighting changes in the task units and in driver physiology. Finally, we also consider a number of variables germane to ensuring the external validity of our results. Without this inclusion, such an analysis could seriously underestimate the probabilistic risk.
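
The quantification step of an HRA can be illustrated with a SPAR-H-style calculation; the abstract does not name the specific HRA method used, and the multipliers below are hypothetical.

```python
# Hedged sketch of one common HRA quantification scheme (SPAR-H style).
nominal_hep = 0.01   # nominal human error probability for a diagnosis task

# Performance-shaping factor multipliers for a driver approaching a passive
# crossing (hypothetical values): time pressure, poor sighting, experience.
psf_multipliers = [2.0, 5.0, 1.0]

composite = 1.0
for m in psf_multipliers:
    composite *= m

# SPAR-H-style adjustment keeps the probability below 1 when the composite
# multiplier is large.
hep = (nominal_hep * composite) / (nominal_hep * (composite - 1) + 1)
print(f"Adjusted human error probability: {hep:.4f}")
```

In the proposed study, the simulator data would supply empirical grounding for the task decomposition and the performance-shaping factors, rather than relying on generic tabulated multipliers.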

Relevance: 80.00%

Abstract:

Book summary: In a constantly evolving context of performance management, accountability and risk assessment, police organisations and frontline police officers are required to pay careful attention to what have come to be known as ‘at-risk people’, ‘vulnerable populations’ or ‘vulnerable people’. Vulnerable people have become a key focus of policy. Concurrently, there have been stronger demands on police, and a steep increase in police powers in relation to their interaction with vulnerable people. The premise of this protectionist and interventionist agenda is threefold: to protect the rights of vulnerable individuals; to proactively cater for their vulnerability within the justice system; and to secure police operations and protocols within strict guidelines. This collection unpacks ‘vulnerable people policing’ in theory and practice and guides the reader through the policing process as it is experienced by police officers, victims, offenders, witnesses and justice stakeholders. Each chapter features a single step of the policing process, from police recruit education through to custody and the final transfer of vulnerable people to courts and sentencing. This edited collection provides analytical, theoretical and empirical insights on vulnerable people policing, and reflects on critical issues in a domain that is increasingly subject to speedy conversion from policy to practice and to heightened media and political scrutiny. It breaks down policing practices, operations and procedures that have vulnerable populations as a focus, bringing together original and innovative academic research and literature, practitioner experience and discussion of policy implications, from local and international perspectives. The particular nature of this collection highlights the multi-disciplinary nature of police work, sheds light on how specific, mandatory policies guide police officers' steps in their interactions with vulnerable populations, and discusses the practicalities of police decision-making at key points in this process.

Relevance: 80.00%

Abstract:

Airports represent the epitome of complex systems, with multiple stakeholders, multiple jurisdictions and complex interactions between many actors. The large number of existing models that capture different aspects of the airport is a testament to this. However, these existing models do not systematically consider modelling requirements, nor how stakeholders such as airport operators or airlines would make use of the models. This can detrimentally affect the verification and validation of models and makes it difficult to develop extensible and reusable modelling tools. This paper develops, from the Concept of Operations (CONOPS) framework, a methodology to help structure the review and development of modelling capabilities and usage scenarios. The method is applied to a review of existing airport terminal passenger models. It is found that existing models can be broadly categorised according to four usage scenarios: capacity planning; operational planning and design; security policy and planning; and airport performance review. The models, the performance metrics they evaluate and their usage scenarios are discussed. Capacity and operational planning models predominantly focus on performance metrics such as waiting time, service time and congestion, whereas performance review models attempt to link those to passenger satisfaction outcomes. Security policy models, on the other hand, focus on probabilistic risk assessment. However, there is an emerging focus on the need to capture trade-offs between multiple criteria, such as security and processing time. Based on the CONOPS framework and the literature findings, guidance is provided for the development of future airport terminal models.
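
As a flavour of the waiting-time metrics that capacity planning models evaluate, here is a minimal M/M/c (Erlang C) estimate for a hypothetical checkpoint; the arrival and service rates are invented, and such a stationary queue is only a coarse stand-in for the simulation models reviewed in the paper.

```python
import math

# Expected waiting time at a checkpoint modelled as an M/M/c queue
# (Erlang C). Arrival and service rates are hypothetical.
lam = 10.0   # passenger arrivals per minute
mu = 0.8     # passengers served per minute, per lane
c = 14       # open lanes

rho = lam / (c * mu)
assert rho < 1, "system is unstable: open more lanes"

a = lam / mu  # offered load in Erlangs
erlang_c_num = (a**c / math.factorial(c)) * (1 / (1 - rho))
erlang_c_den = sum(a**k / math.factorial(k) for k in range(c)) + erlang_c_num
p_wait = erlang_c_num / erlang_c_den        # P(an arriving passenger waits)
wq = p_wait / (c * mu - lam)                # mean wait in queue (minutes)
print(f"P(wait) = {p_wait:.2f}, mean queue wait = {wq:.2f} min")
```

Re-running the calculation with one fewer lane shows how sharply waiting time grows near saturation, which is precisely the sensitivity capacity planners care about.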

Relevance: 80.00%

Abstract:

In response to developments in international trade and an increased focus on international transfer-pricing issues, Canada’s minister of finance announced in the 1997 budget that the Department of Finance would undertake a review of the transfer-pricing provisions in the Income Tax Act. On September 11, 1997, the Department of Finance released draft transfer-pricing legislation and Revenue Canada released revised draft Information Circular 87-2R. The legislation was subsequently amended and included in Bill C-28, which received first reading on December 10, 1997. The new rules are intended to update Canada’s international transfer-pricing practices. In particular, they attempt to harmonize the standards in the Income Tax Act with the arm’s-length principle established in the OECD’s transfer-pricing guidelines. The new rules also set out contemporaneous documentation requirements in respect of cross-border related-party transactions, facilitate administration of the law by Revenue Canada, and provide for a penalty where transfer prices do not comply with the arm’s-length principle. The Australian tax authorities have similarly reviewed and updated their transfer-pricing practices. Since 1992, the Australian commissioner of taxation has issued three rulings and seven draft rulings directly relating to international transfer pricing. These rulings outline the selection and application of transfer-pricing methodologies, documentation requirements, and penalties for non-compliance. The Australian Taxation Office supports the use of advance pricing agreements (APAs) and has expanded its audit strategy by conducting transfer-pricing risk assessment reviews. This article presents a detailed review of Australia’s transfer-pricing policy and practices, which address essentially the same concerns as those at which the new Canadian rules are directed. This review provides a framework for comparison of the approaches adopted in the two jurisdictions. The author concludes that although these approaches differ in some respects, ultimately they produce a similar result. Both regimes set a clear standard to be met by multinational enterprises in establishing transfer prices. Both provide for audits and penalties in the event of non-compliance. And both offer the alternative of an APA as a means of avoiding transfer-pricing disputes with the Australian and Canadian tax authorities.