971 results for Hierarchical Bayesian
Abstract:
Keeping exotic plant pests out of our country relies on good border control or quarantine. However, with increasing globalisation and mobility, some things slip through, and then the back-up systems become important. These can include an expensive form of surveillance that purposively targets particular pests. A much wider net is provided by general surveillance, which is assimilated into everyday activities, like farmers checking the health of their crops. In fact, farmers and even home gardeners have provided a front-line warning system for some pests (e.g. the European wasp) that could otherwise have wreaked havoc. Mathematics is used to model how surveillance works in various situations. Within this virtual world we can play with various surveillance and management strategies to "see" how they would work, or how to make them work better. One of our greatest challenges is estimating some of the input parameters: because the pest hasn't been here before, it is hard to predict how it might behave: whether it establishes, how it spreads, and what types of symptoms it might express. So we rely on experts to help us with this. This talk will look at the mathematical, psychological and logical challenges of helping experts to quantify what they think. We show how the subjective Bayesian approach is useful for capturing expert uncertainty, ultimately providing a more complete picture of what they think... and what they don't!
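As a minimal illustration of the subjective Bayesian approach to capturing expert uncertainty, the sketch below fits a Beta prior to two hypothetical elicited judgements about a pest's establishment probability: a best guess (median) and a value the expert is 95% sure will not be exceeded. All numbers are invented for illustration.

```python
from scipy import stats, optimize

# Hypothetical elicited judgements about a pest's establishment probability:
# the expert's best guess (median 0.20) and a value they are 95% sure it
# will not exceed (0.50). Both numbers are invented for illustration.
median_guess, upper_95 = 0.20, 0.50

def quantile_gap(params):
    # Squared mismatch between the Beta quantiles and the elicited values
    a, b = params
    return ((stats.beta.ppf(0.50, a, b) - median_guess) ** 2
            + (stats.beta.ppf(0.95, a, b) - upper_95) ** 2)

res = optimize.minimize(quantile_gap, x0=[2.0, 5.0],
                        bounds=[(0.01, None), (0.01, None)])
a, b = res.x
print(f"fitted prior: Beta({a:.2f}, {b:.2f})")
print("check -> median:", round(stats.beta.ppf(0.50, a, b), 3),
      "95th percentile:", round(stats.beta.ppf(0.95, a, b), 3))
```

Matching elicited quantiles to a parametric prior in this way encodes not just the expert's best guess but also their uncertainty around it.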
Abstract:
This thesis developed and applied Bayesian models for the analysis of survival data. Gene expression measurements were included as explanatory variables within the Bayesian survival model, which can be considered the new contribution in the analysis of such data. The censoring that is inherent in survival data was also addressed in terms of its impact on the fit of a finite mixture of Weibull distributions, with and without covariates. To investigate this, simulation studies were carried out under several censoring percentages. A censoring percentage as high as 80% is acceptable here because the work involved high-dimensional data. Lastly, a Bayesian model averaging approach was developed to incorporate model uncertainty in the prediction of survival.
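A minimal sketch of the kind of censored data generation such a simulation study relies on, assuming a two-component Weibull mixture and administrative (right) censoring; the weights, shapes, scales and the 80% censoring target are illustrative only, not the thesis's actual settings:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 5000

# Two-component Weibull mixture (weights, shapes and scales are invented)
weights = np.array([0.6, 0.4])
shapes = np.array([1.5, 0.8])
scales = np.array([10.0, 3.0])

comp = rng.choice(2, size=n, p=weights)                # mixture labels
event_times = scales[comp] * rng.weibull(shapes[comp], size=n)

# Administrative right-censoring: choose the censoring time so that roughly
# the target fraction (here 80%, as in the high-dimensional setting) is cut.
target = 0.80
c_time = np.quantile(event_times, 1 - target)
observed = np.minimum(event_times, c_time)
status = (event_times <= c_time).astype(int)           # 1 = event, 0 = censored

print(f"achieved censoring rate: {1 - status.mean():.3f}")
```

The pair (observed, status) is the standard input for survival likelihoods, with censored records contributing only the survival function at the censoring time.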
Abstract:
Soil-based emissions of nitrous oxide (N2O), a well-known greenhouse gas, have been associated with changes in soil water-filled pore space (WFPS) and soil temperature in many previous studies. However, it is acknowledged that the environment-N2O relationship is complex and still relatively poorly understood. In this article, we employed a Bayesian model selection approach (reversible jump Markov chain Monte Carlo) to develop a data-informed model of the relationship between daily N2O emissions and daily WFPS and soil temperature measurements between March 2007 and February 2009 from a soil under pasture in Queensland, Australia, taking seasonal factors and time-lagged effects into account. The model indicates a very strong relationship between a hybrid seasonal structure and daily N2O emission, with the latter substantially increased in summer. Given the other variables in the model, daily soil WFPS, lagged by a week, had a negative influence on daily N2O; there was evidence of a nonlinear positive relationship between daily soil WFPS and daily N2O emission; and daily soil temperature tended to have a linear positive relationship with daily N2O emission when daily soil temperature was above a threshold of approximately 19°C. We suggest that this flexible Bayesian modeling approach could facilitate greater understanding of the shape of the covariate-N2O flux relationship and the detection of effect thresholds in the natural temporal variation of environmental variables on N2O emission.
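Reversible jump MCMC itself is too involved for a short sketch; as a crude stand-in for covariate selection, the code below scores every subset of candidate covariates (temperature, WFPS, week-lagged WFPS) by BIC and keeps the best, on simulated data where the true drivers are temperature and the lagged term. All series and coefficients are invented for illustration and this is not the article's actual model:

```python
import itertools
import numpy as np

rng = np.random.default_rng(2)

# Invented daily series: emissions driven by temperature and week-lagged WFPS
n = 700
temp = 20 + 5 * np.sin(2 * np.pi * np.arange(n) / 365) + rng.normal(0, 1, n)
wfps = rng.normal(50, 10, n)
lag7 = np.roll(wfps, 7)
y = 0.3 * temp - 0.05 * lag7 + rng.normal(0, 1, n)
y, temp, wfps, lag7 = y[7:], temp[7:], wfps[7:], lag7[7:]  # drop wrapped rows

# Crude stand-in for trans-dimensional model selection: score every covariate
# subset by BIC and keep the lowest-scoring model.
candidates = {"temp": temp, "wfps": wfps, "wfps_lag7": lag7}
best = None
for r in range(len(candidates) + 1):
    for subset in itertools.combinations(candidates, r):
        X = np.column_stack([np.ones(len(y))] + [candidates[k] for k in subset])
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        sigma2 = ((y - X @ beta) ** 2).mean()
        bic = len(y) * np.log(sigma2) + X.shape[1] * np.log(len(y))
        if best is None or bic < best[0]:
            best = (bic, subset)

print("selected covariates:", best[1])
```

RJMCMC explores the same model space stochastically, with posterior model probabilities in place of a single BIC winner.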
Abstract:
Endotoxins can significantly affect the air quality in school environments. However, there is currently no reliable method for the measurement of endotoxins and there is a lack of reference values for endotoxin concentrations to aid in the interpretation of measurement results in school settings. We benchmarked the “baseline” range of endotoxin concentration in indoor air, together with endotoxin load in floor dust, and evaluated the correlation between endotoxin levels in indoor air and settled dust, as well as the effects of temperature and humidity on these levels in subtropical school settings. Bayesian hierarchical modeling indicated that the concentration in indoor air and the load in floor dust were generally (<95th percentile) < 13 EU/m3 and < 24,570 EU/m2, respectively. Exceeding these levels would indicate abnormal sources of endotoxins in the school environment, and the need for further investigation. Metaregression indicated no relationship between endotoxin concentration and load, which points to the necessity for measuring endotoxin levels in both the air and settled dust. Temperature increases were associated with lower concentrations in indoor air and higher loads in floor dust. Higher levels of humidity may be associated with lower airborne endotoxin concentrations.
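A minimal sketch of Bayesian hierarchical modelling of this kind: a normal hierarchical model for simulated log-scale endotoxin levels across schools, fitted with a small Gibbs sampler using conjugate updates (the within-school variance is treated as known for brevity). All values are simulated, not the study's data:

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated log-scale endotoxin levels for 8 schools (values are invented)
n_schools, n_obs = 8, 20
true_means = rng.normal(1.5, 0.5, n_schools)
y = true_means[:, None] + rng.normal(0, 0.7, (n_schools, n_obs))

sigma2 = 0.7 ** 2                 # within-school variance, treated as known
theta = y.mean(axis=1)            # initialise school-level means
mu, tau2 = theta.mean(), theta.var() + 0.01

keep_mu = []
for it in range(4000):
    # theta_j | rest: conjugate normal update (precision weighting)
    prec = n_obs / sigma2 + 1.0 / tau2
    mean = (y.sum(axis=1) / sigma2 + mu / tau2) / prec
    theta = rng.normal(mean, np.sqrt(1.0 / prec))
    # mu | rest: normal update under a flat prior on mu
    mu = rng.normal(theta.mean(), np.sqrt(tau2 / n_schools))
    # tau2 | rest: inverse-gamma update under a weak IG(1, 1) prior
    shape = 1 + n_schools / 2
    rate = 1 + 0.5 * np.sum((theta - mu) ** 2)
    tau2 = 1.0 / rng.gamma(shape, 1.0 / rate)
    if it >= 1000:                # discard burn-in
        keep_mu.append(mu)

print("posterior mean of the overall level:", round(float(np.mean(keep_mu)), 3))
```

The school-level means are partially pooled toward the overall level, which is what makes a benchmark such as a 95th percentile more stable than per-school estimates alone.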
Abstract:
Current Bayesian network software packages provide a good graphical interface for users who design and develop Bayesian networks for various applications. However, the intended end-users of these networks may not necessarily find such an interface appealing, and at times it can be overwhelming, particularly when the number of nodes in the network is large. To circumvent this problem, this paper presents an intuitive dashboard which provides an additional layer of abstraction, enabling end-users to easily perform inferences over the Bayesian networks. Unlike most software packages, which display the nodes and arcs of the network, the developed tool organises the nodes based on their cause-and-effect relationships, making the user interaction more intuitive and friendly. In addition to performing various types of inferences, users can conveniently use the tool to verify the behaviour of the developed Bayesian network. The tool has been developed using the QT and SMILE libraries in C++.
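The kind of inference such a dashboard exposes can be illustrated with exact enumeration over a toy cause-and-effect network; all probabilities are invented, and this is not the paper's tool (which uses the SMILE library in C++):

```python
# Toy cause-and-effect network with invented probabilities:
# Infection -> Fever, Infection -> Rash
P_infection = {True: 0.1, False: 0.9}
P_fever = {True: {True: 0.8, False: 0.2},      # P(fever | infection)
           False: {True: 0.1, False: 0.9}}
P_rash = {True: {True: 0.6, False: 0.4},       # P(rash | infection)
          False: {True: 0.05, False: 0.95}}

def joint(infection, fever, rash):
    # Factorised joint probability of one full assignment
    return (P_infection[infection]
            * P_fever[infection][fever]
            * P_rash[infection][rash])

def posterior_infection(fever):
    """P(Infection | Fever = fever), marginalising Rash by enumeration."""
    scores = {inf: sum(joint(inf, fever, rash) for rash in (True, False))
              for inf in (True, False)}
    z = sum(scores.values())
    return {inf: s / z for inf, s in scores.items()}

post = posterior_infection(fever=True)
print("P(Infection | Fever) =", round(post[True], 3))   # -> 0.471
```

A dashboard wraps exactly this question-and-answer loop (set evidence, read updated beliefs) without exposing the graph itself to the end-user.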
Abstract:
Indirect inference (II) is a methodology for estimating the parameters of an intractable (generative) model on the basis of an alternative parametric (auxiliary) model that is both analytically and computationally easier to deal with. Such an approach has been well explored in the classical literature but has received substantially less attention in the Bayesian paradigm. The purpose of this paper is to compare and contrast a collection of what we call parametric Bayesian indirect inference (pBII) methods. One class of pBII methods uses approximate Bayesian computation (referred to here as ABC II), where the summary statistic is formed on the basis of the auxiliary model, using ideas from II. Another approach proposed in the literature, referred to here as parametric Bayesian indirect likelihood (pBIL), we show to be a fundamentally different approach from ABC II. We devise new theoretical results for pBIL to give extra insight into its behaviour and its differences from ABC II. Furthermore, we examine in more detail the assumptions required to use each pBII method. The results, insights and comparisons developed in this paper are illustrated on simple examples and on two substantive applications. The first of the substantive examples involves performing inference for complex quantile distributions based on simulated data, while the second estimates the parameters of a trivariate stochastic process describing the evolution of macroparasites within a host based on real data. We create a novel framework called Bayesian indirect likelihood (BIL) which encompasses pBII as well as general ABC methods, so that the connections between the methods can be established.
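A minimal sketch of the ABC II idea: the summary statistic is the fitted parameter vector of a tractable auxiliary model (here a normal, summarised by its MLE mean and standard deviation), while the "intractable" generative model is stood in for by a Laplace distribution with unknown scale. All settings are illustrative, not the paper's examples:

```python
import numpy as np

rng = np.random.default_rng(1)

# Stand-in "intractable" model: Laplace(0, theta); auxiliary model: normal,
# summarised by its MLE (sample mean and standard deviation).
true_theta = 2.0
obs = rng.laplace(0.0, true_theta, size=500)

def aux_summary(x):
    # Auxiliary-model MLE used as the ABC summary statistic
    return np.array([x.mean(), x.std()])

s_obs = aux_summary(obs)

n_sims = 20000
thetas = rng.uniform(0.1, 5.0, n_sims)      # uniform prior on the scale
accepted = []
for th in thetas:
    sim = rng.laplace(0.0, th, size=500)
    if np.linalg.norm(aux_summary(sim) - s_obs) < 0.1:   # ABC tolerance
        accepted.append(th)

accepted = np.array(accepted)
print(len(accepted), "accepted draws; posterior mean:",
      round(float(accepted.mean()), 3))
```

The auxiliary model never needs to be correct; it only has to compress the data into a low-dimensional statistic that is informative about the generative parameters.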
Abstract:
This paper presents a novel framework for the modelling of passenger facilitation in a complex environment. The research is motivated by the challenges in the airport complex system, where there are multiple stakeholders, differing operational objectives, and complex interactions and interdependencies between different parts of the airport system. Traditional methods for airport terminal modelling do not explicitly address the need for understanding causal relationships in a dynamic environment. Additionally, existing Bayesian Network (BN) models, which provide a means for capturing causal relationships, only present a static snapshot of a system. A method to integrate a BN complex systems model with stochastic queuing theory is developed based on the properties of the Poisson and Exponential distributions. The resultant Hybrid Queue-based Bayesian Network (HQBN) framework enables the simulation of arbitrary factors, their relationships, and their effects on passenger flow and vice versa. A case study implementation of the framework is demonstrated on the inbound passenger facilitation process at Brisbane International Airport. The predicted outputs of the model, in terms of cumulative passenger flow at intermediary and end points in the inbound process, are found to have an $R^2$ goodness of fit of 0.9994 and 0.9982 respectively over a 10-hour test period. The utility of the framework is demonstrated on a number of usage scenarios, including real-time monitoring and 'what-if' analysis. This framework provides the ability to analyse and simulate a dynamic complex system, and can be applied to other socio-technical systems such as hospitals.
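The queueing side of such a hybrid model rests on Poisson arrivals and exponential service times. A minimal single-server sketch (rates are illustrative, not Brisbane data) simulates passenger waiting times via the Lindley recursion and checks them against the M/M/1 formula rho/(mu - lambda):

```python
import numpy as np

rng = np.random.default_rng(7)

# Single processing point: Poisson arrivals, exponential service
lam, mu = 0.8, 1.0              # arrival and service rates (per minute)
n = 200_000

inter = rng.exponential(1 / lam, n)     # inter-arrival times
service = rng.exponential(1 / mu, n)    # service times

# Lindley recursion for each passenger's waiting time in the queue
w = np.zeros(n)
for i in range(1, n):
    w[i] = max(0.0, w[i - 1] + service[i - 1] - inter[i])

rho = lam / mu
theory = rho / (mu - lam)       # M/M/1 mean wait in queue
print(f"simulated mean wait {w.mean():.2f} vs theory {theory:.2f}")
```

In the HQBN setting, the arrival rate itself would be driven by upstream factors (e.g. flight schedules) captured in the BN layer rather than held constant as here.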
Abstract:
Utility functions in Bayesian experimental design are usually based on the posterior distribution. When the posterior is found by simulation, it must be sampled from for each future data set drawn from the prior predictive distribution. Many thousands of posterior distributions are often required. A popular technique in the Bayesian experimental design literature to rapidly obtain samples from the posterior is importance sampling, using the prior as the importance distribution. However, importance sampling will tend to break down if there is a reasonable number of experimental observations and/or the model parameter is high dimensional. In this paper we explore the use of Laplace approximations in the design setting to overcome this drawback. Furthermore, we consider using the Laplace approximation to form the importance distribution to obtain a more efficient importance distribution than the prior. The methodology is motivated by a pharmacokinetic study which investigates the effect of extracorporeal membrane oxygenation on the pharmacokinetics of antibiotics in sheep. The design problem is to find 10 near optimal plasma sampling times which produce precise estimates of pharmacokinetic model parameters/measures of interest. We consider several different utility functions of interest in these studies, which involve the posterior distribution of parameter functions.
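A minimal sketch of the central device, under a much simpler model than the pharmacokinetic one: a Poisson rate with a normal prior on its log, where the Laplace approximation (posterior mode plus curvature) supplies the importance distribution in place of the prior. All data and prior settings are invented:

```python
import numpy as np
from scipy import optimize, stats

rng = np.random.default_rng(3)
y = rng.poisson(4.0, size=30)       # invented data, true rate 4

# Model: y_i ~ Poisson(exp(phi)), prior phi ~ N(0, 2^2); all choices invented.
def neg_log_post(phi):
    loglik = np.sum(y) * phi - len(y) * np.exp(phi)
    return -loglik + 0.5 * (phi / 2.0) ** 2

# Laplace approximation: posterior mode plus curvature at the mode
mode = optimize.minimize_scalar(neg_log_post).x
h = 1e-4
curv = (neg_log_post(mode + h) - 2 * neg_log_post(mode)
        + neg_log_post(mode - h)) / h ** 2
sd = np.sqrt(1.0 / curv)

# Importance sampling with the Laplace Gaussian as the importance distribution
phis = rng.normal(mode, sd, size=20000)
log_w = -neg_log_post(phis) - stats.norm.logpdf(phis, mode, sd)
w = np.exp(log_w - log_w.max())     # stabilise before normalising
post_mean_rate = np.sum(w * np.exp(phis)) / np.sum(w)
print("posterior mean of the rate:", round(float(post_mean_rate), 3))
```

Because the proposal already sits on the posterior, the weights stay nearly uniform, which is what makes this usable inside a design loop that must process thousands of simulated data sets.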
Abstract:
Toxic blooms of Lyngbya majuscula occur in coastal areas worldwide and have major ecological, health and economic consequences. The exact causes and combinations of factors which lead to these blooms are not clearly understood. Lyngbya experts and stakeholders are a particularly diverse group, including ecologists, scientists, state and local government representatives, community organisations, catchment industry groups and local fishermen. An integrated Bayesian Network approach was developed to better understand and model this complex environmental problem, identify knowledge gaps, prioritise future research and evaluate management options.
Abstract:
Conservation of free-ranging cheetah (Acinonyx jubatus) populations is multifaceted and needs to be addressed from an ecological, biological and management perspective. There is a wealth of published research, each study focusing on a particular aspect of cheetah conservation. Identifying the most important factors, making sense of various (and sometimes contrasting) findings, and taking decisions when little or no empirical data are available, are everyday challenges facing conservationists. Bayesian networks (BNs) provide a statistical modeling framework that enables analysis and integration of information addressing different aspects of conservation. There has been increased interest in the use of BNs to model conservation issues; however, the development of more sophisticated BNs, utilizing object-oriented (OO) features, is still at the frontier of ecological research. We describe an integrated, parallel modeling process followed during a BN modeling workshop held in Namibia to combine expert knowledge and data about free-ranging cheetahs. The aim of the workshop was to obtain a more comprehensive view of the current viability of the free-ranging cheetah population in Namibia, and to predict the effect different scenarios may have on the future viability of this population. A complementary aim was to identify influential parameters of the model, to more effectively target those parameters having the greatest impact on population viability. The BN was developed by aggregating diverse perspectives from local and independent scientists, agents from the national ministry, conservation agency members and local fieldworkers. This integrated BN approach facilitates OO modeling in a multi-expert context which lends itself to a series of integrated, yet independent, subnetworks describing different scientific and management components.
We created three subnetworks in parallel: a biological, ecological and human factors network, which were then combined to create a complete representation of free-ranging cheetah population viability. Such OOBNs have widespread relevance to the effective and targeted conservation management of vulnerable and endangered species.
Abstract:
Bayesian networks (BNs) provide a statistical modelling framework which is ideally suited for modelling the many factors and components of complex problems such as healthcare-acquired infections. The methicillin-resistant Staphylococcus aureus (MRSA) organism is particularly troublesome since it is resistant to standard treatments for Staph infections. Overcrowding and understaffing are believed to increase infection transmission rates and also to inhibit the effectiveness of disease control measures. Clearly the mechanisms behind MRSA transmission and containment are very complicated, and control strategies may only be effective when used in combination. BNs are growing in popularity in general and in the medical sciences in particular. A recent Current Contents search of the number of published BN journal articles showed a fivefold increase in general and a sixfold increase in medical and veterinary science from 2000 to 2009. This chapter introduces the reader to Bayesian network (BN) modelling and an iterative modelling approach used to build and test the BN created to investigate the possible role of high bed occupancy in the transmission of MRSA, while simultaneously taking other risk factors into account.
Abstract:
There has been considerable scientific interest in personal exposure to ultrafine particles (UFP). In this study, the inhaled particle surface area doses and dose relative intensities in the tracheobronchial and alveolar regions of the lungs were calculated using the measured 24-hour UFP time series of school children's personal exposures for each recorded activity. Bayesian hierarchical modelling was used to determine mean doses and dose intensities for the various microenvironments. Analysis of measured personal exposures for 137 participating children from 25 schools in the Brisbane Metropolitan Area showed similar trends for all the participating children. Bayesian regression modelling was performed to calculate the daily proportion of children's total doses in different microenvironments. The proportions of alveolar doses in the total daily dose for the home, school, commuting and other microenvironments were 55.3%, 35.3%, 4.5% and 5.0%, respectively, with the home microenvironment contributing the majority of children's total daily dose. Children's mean indoor dose was never higher than the outdoor dose at any of the schools, indicating that there were no persistent indoor particle sources in the classrooms during the measurements. Outdoor activities, eating/cooking at home and commuting were the three activities with the highest dose intensities. Personal exposure was influenced more by ambient particle levels than by immediate traffic.
Abstract:
This paper investigates the business cycle co-movement across countries and regions since 1950 as a measure for quantifying the economic interdependence in the ongoing globalisation process. Our methodological approach is based on analysis of a correlation matrix and the networks it contains. Such an approach summarises the interaction and interdependence of all elements, and it represents a more accurate measure of the global interdependence involved in an economic system. Our results show (1) the dynamics of interdependence has been driven more by synchronisation in regional growth patterns than by the synchronisation of the world economy, and (2) world crisis periods dramatically increase the global co-movement in the world economy.
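The correlation-matrix-plus-network idea can be sketched on simulated growth series in which a shared regional factor induces within-region co-movement; the two "regions" of four countries each and the 0.3 link threshold are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(5)

# Invented growth-rate series: two "regions" of four countries, with a
# shared regional factor creating within-region co-movement.
T, k = 240, 4
factor_a = rng.normal(0, 1, (T, 1))
factor_b = rng.normal(0, 1, (T, 1))
region_a = 0.8 * factor_a + 0.6 * rng.normal(0, 1, (T, k))
region_b = 0.8 * factor_b + 0.6 * rng.normal(0, 1, (T, k))
growth = np.hstack([region_a, region_b])

corr = np.corrcoef(growth, rowvar=False)        # 8 x 8 correlation matrix

# Build a network: link two countries when their correlation exceeds 0.3
adj = (corr > 0.3) & ~np.eye(2 * k, dtype=bool)
within = int(adj[:k, :k].sum() + adj[k:, k:].sum())
between = int(adj[:k, k:].sum() + adj[k:, :k].sum())
print("within-region links:", within, "between-region links:", between)
```

The dominance of within-region over between-region links is the network signature of regionally synchronised growth, the pattern the paper reports for the world economy outside crisis periods.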
Abstract:
Discretization of a geographical region is quite common in spatial analysis. However, there have been few studies into the impact of different geographical scales on the outcome of spatial models for different spatial patterns. This study aims to investigate the impact of spatial scales and spatial smoothing on the outcomes of modelling spatial point-based data. Given a spatial point-based dataset (such as occurrences of a disease), we study the geographical variation in residual disease risk using regular grid cells. The individual disease risk is modelled using a logistic model with the inclusion of spatially unstructured and/or spatially structured random effects. Three spatial smoothness priors for the spatially structured component are employed in modelling, namely an intrinsic Gaussian Markov random field, a second-order random walk on a lattice, and a Gaussian field with Matérn correlation function. We investigate how changes in grid cell size affect model outcomes under different spatial structures and different smoothness priors for the spatial component. A realistic example (the Humberside data) is analysed and a simulation study is described. Bayesian computation is carried out using the integrated nested Laplace approximation (INLA). The results suggest that the performance and predictive capacity of the spatial models improve as the grid cell size decreases for certain spatial structures. It also appears that different spatial smoothness priors should be applied for different patterns of point data.
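A minimal sketch of the first smoothness prior mentioned, assuming rook (4-neighbour) adjacency on a regular grid: the precision matrix of a first-order intrinsic Gaussian Markov random field (ICAR), whose rows sum to zero and whose null space is the constant vector:

```python
import numpy as np

def icar_precision(nrow, ncol):
    """Precision matrix (D - A) of a first-order intrinsic GMRF on a
    regular nrow x ncol grid with rook (4-neighbour) adjacency."""
    n = nrow * ncol
    Q = np.zeros((n, n))
    for r in range(nrow):
        for c in range(ncol):
            i = r * ncol + c
            for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                rr, cc = r + dr, c + dc
                if 0 <= rr < nrow and 0 <= cc < ncol:
                    Q[i, i] += 1               # degree on the diagonal
                    Q[i, rr * ncol + cc] = -1  # -1 for each neighbour
    return Q

Q = icar_precision(4, 4)
eigvals = np.linalg.eigvalsh(Q)
# The ICAR precision is singular: one zero eigenvalue (the constant vector),
# so the prior is improper and only contrasts between cells are identified.
print("two smallest eigenvalues:", np.round(eigvals[:2], 4))
```

Refining the grid simply rebuilds Q at a larger size, which is how the scale sensitivity examined in the study enters the prior.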