985 results for Bayesian techniques
Abstract:
The measurement error model is a well-established statistical method for regression problems in the medical sciences, although it is rarely used in ecological studies. While the situations in which it is appropriate may be less common in ecology, there are instances in which it can benefit prediction and the estimation of parameters of interest. We explore this topic using a conditional independence model in a Bayesian framework, fitted with a Gibbs sampler, as this gives a great deal of flexibility and allows us to analyse a number of different models without losing generality. Using simulations and two examples, we show how the conditional independence model can be used in ecology, and when it is appropriate.
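The attenuation bias that motivates measurement error models can be illustrated with a small simulation. This is a sketch using a classical method-of-moments correction, not the Bayesian conditional independence model of the abstract, and all parameter values (`beta`, `sigma_x`, `sigma_u`, `sigma_e`) are hypothetical:

```python
import random

random.seed(1)

# Errors-in-variables setup: the true covariate x is observed only through
# a noisy surrogate w = x + u (all values hypothetical).
n, beta = 5000, 2.0
sigma_x, sigma_u, sigma_e = 1.0, 0.5, 0.2
x = [random.gauss(0, sigma_x) for _ in range(n)]
w = [xi + random.gauss(0, sigma_u) for xi in x]
y = [beta * xi + random.gauss(0, sigma_e) for xi in x]

def slope(u, v):
    """Ordinary least-squares slope of v regressed on u."""
    mu, mv = sum(u) / len(u), sum(v) / len(v)
    cov = sum((a - mu) * (b - mv) for a, b in zip(u, v))
    var = sum((a - mu) ** 2 for a in u)
    return cov / var

naive = slope(w, y)                           # attenuated towards zero
lam = sigma_x**2 / (sigma_x**2 + sigma_u**2)  # reliability ratio
corrected = naive / lam                       # classical attenuation correction
```

Regressing y on the noisy surrogate w shrinks the slope by the reliability ratio; a Bayesian treatment like the one described above addresses the same bias while also propagating the uncertainty in the correction.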
Abstract:
In general, the benefits of cooperative learning include academic achievement, communication skills, problem-solving, social skills and student motivation. Yet cooperative learning, as a Western educational concept, may be ineffective in a different learning system. This study investigates scaffolding techniques for cooperative learning in Thai primary education. A program was designed to foster Thai primary school teachers' implementation of cooperative learning, covering the basic tenets of cooperative learning and socio-cognitive based learning. Two teachers were invited to participate in this experimental teacher training program for one and a half weeks. The teachers then implemented cooperative learning in their mathematics classes for six weeks. Data from teacher interviews and classroom observations indicated that both teachers were able to use questions to scaffold their students' engagement in cooperative learning. This initial study suggests that the difficulty or failure of implementing cooperative learning in Thai education may not derive from cultural difference. The paper discusses the techniques the participant teachers applied (proactive scaffolding, reactive scaffolding and scaffolding questions) that can be used to facilitate the implementation of cooperative learning in Thai schools.
Abstract:
Inverse problems based on using experimental data to estimate unknown parameters of a system often arise in biological and chaotic systems. In this paper, we consider parameter estimation in systems biology involving linear and non-linear complex dynamical models, including the Michaelis–Menten enzyme kinetic system, a dynamical model of competence induction in Bacillus subtilis bacteria and a model of feedback bypass in B. subtilis bacteria. We propose some novel techniques for inverse problems. Firstly, we establish an approximation of a non-linear differential algebraic equation that corresponds to the given biological systems. Secondly, we use the Picard contraction mapping, collage methods and numerical integration techniques to convert the parameter estimation into a minimization problem of the parameters. We propose two optimization techniques: a grid approximation method and a modified hybrid Nelder–Mead simplex search and particle swarm optimization (MH-NMSS-PSO) for non-linear parameter estimation. The two techniques are used for parameter estimation in a model of competence induction in B. subtilis bacteria with noisy data. The MH-NMSS-PSO scheme is applied to a dynamical model of competence induction in B. subtilis bacteria based on experimental data and the model for feedback bypass. Numerical results demonstrate the effectiveness of our approach.
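As a rough illustration of the swarm component of the hybrid optimizer, here is a minimal particle swarm optimization sketch minimizing a simple test function. It is not the paper's MH-NMSS-PSO scheme, and all settings (`swarm`, `iters`, the inertia and acceleration coefficients) are hypothetical defaults:

```python
import random

random.seed(0)

def sphere(p):
    """Simple convex test function with minimum 0 at the origin."""
    return sum(v * v for v in p)

def pso(f, dim=2, swarm=20, iters=200, lo=-5.0, hi=5.0, w=0.7, c1=1.5, c2=1.5):
    """Minimal particle swarm optimization (hypothetical default settings)."""
    pos = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(swarm)]
    vel = [[0.0] * dim for _ in range(swarm)]
    best = [p[:] for p in pos]              # per-particle best positions
    best_f = [f(p) for p in pos]
    g = best[best_f.index(min(best_f))][:]  # global best position
    for _ in range(iters):
        for i in range(swarm):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (best[i][d] - pos[i][d])
                             + c2 * r2 * (g[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            fi = f(pos[i])
            if fi < best_f[i]:              # update personal best
                best_f[i], best[i] = fi, pos[i][:]
                if fi < f(g):               # update global best
                    g = pos[i][:]
    return g

opt = pso(sphere)
```

In a parameter estimation setting, `sphere` would be replaced by the collage-based distance between model and data described in the abstract.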
Abstract:
This paper examines the feasibility of applying an Imputer in a multiple-choice answer-sheet marking system based on image-processing techniques.
Abstract:
A time series method for the determination of combustion chamber resonant frequencies is outlined. This technique employs Markov chain Monte Carlo (MCMC) to infer parameters in a chosen model of the data. The development of the model is included, and the resonant frequency is characterised as a function of time. Potential applications for cycle-by-cycle analysis are discussed, and the bulk temperature of the gas and the trapped mass in the combustion chamber are evaluated as a function of time from the resonant frequency information.
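A toy version of this inference can be sketched with a random-walk Metropolis sampler (one MCMC variant) recovering the frequency of a noisy sinusoid; the signal model, noise level and tuning constants below are hypothetical stand-ins for real combustion-pressure data:

```python
import math
import random

random.seed(42)

# Synthetic "pressure" signal: a sinusoid at an unknown resonant frequency
# plus Gaussian noise (all values hypothetical).
f_true, sigma = 10.0, 0.1
t = [i / 200.0 for i in range(200)]
y = [math.sin(2 * math.pi * f_true * ti) + random.gauss(0, sigma) for ti in t]

def log_lik(f):
    """Gaussian log-likelihood of the data under frequency f (up to a constant)."""
    return -sum((yi - math.sin(2 * math.pi * f * ti)) ** 2
                for ti, yi in zip(t, y)) / (2 * sigma ** 2)

# Random-walk Metropolis over the frequency parameter.
f_cur, ll_cur, samples = 9.5, log_lik(9.5), []
for _ in range(5000):
    f_prop = f_cur + random.gauss(0, 0.02)            # random-walk proposal
    ll_prop = log_lik(f_prop)
    if math.log(random.random()) < ll_prop - ll_cur:  # Metropolis accept step
        f_cur, ll_cur = f_prop, ll_prop
    samples.append(f_cur)

f_hat = sum(samples[2000:]) / len(samples[2000:])     # mean after burn-in
```

The posterior samples give both a point estimate of the resonant frequency and a direct measure of its uncertainty, which is what makes the downstream temperature and trapped-mass estimates tractable.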
Abstract:
Fibre composite structures have become an increasingly attractive candidate for civil engineering applications. Fibre reinforced polymer (FRP) composite materials have been used to rehabilitate or replace old, degrading traditional structures and to build new structures. However, the lack of design standards for civil infrastructure limits their structural applications. The majority of existing applications have been designed based on research and guidelines provided by the fibre composite manufacturers, or on the designer's experience. As a result, the final structure tends to be over-designed. This paper reviews the available studies on the design optimization of fibre composite structures used in civil engineering, such as plates, beams, box beams, sandwich panels, bridge girders, and bridge decks. Various optimization methods are presented and compared, and the importance of using the appropriate optimization technique is discussed. An improved methodology, which considers experimental testing, numerical modelling, and design constraints, is proposed for the design optimization of composite structures.
Abstract:
Radiotherapy is a cancer treatment modality in which a dose of ionising radiation is delivered to a tumour. The accurate calculation of the dose to the patient is very important in the design of an effective therapeutic strategy. This study aimed to systematically examine the accuracy of the radiotherapy dose calculations performed by clinical treatment planning systems by comparison against Monte Carlo simulations of the treatment delivery. A suite of software tools known as MCDTK (Monte Carlo DICOM ToolKit) was developed for this purpose, and is capable of:
• Importing DICOM-format radiotherapy treatment plans and producing Monte Carlo simulation input files (allowing simple simulation of complex treatments), and calibrating the results;
• Analysing the predicted doses of, and deviations between, the Monte Carlo simulation results and treatment planning system calculations in regions of interest (tumours and organs-at-risk), and generating dose-volume histograms so that conformity with dose prescriptions can be evaluated.
The code has been tested against various treatment planning systems, linear accelerator models and treatment complexities. Six clinical head and neck cancer treatments were simulated and the results analysed using this software. The deviations were greatest where the treatment volume encompassed tissues on both sides of an air cavity. This was likely due to the method the planning system used to model low-density media.
Abstract:
Modelling an environmental process involves creating a model structure and parameterising the model with appropriate values to accurately represent the process. Determining accurate parameter values for environmental systems can be challenging. Existing methods for parameter estimation typically make assumptions regarding the form of the likelihood, and will often ignore any uncertainty around estimated values. This can be problematic, particularly in complex problems where likelihoods may be intractable. In this paper we demonstrate an Approximate Bayesian Computation (ABC) method for estimating the parameters of a stochastic cellular automaton (CA). As an example, we use a CA constructed to simulate a range expansion such as might occur after a biological invasion, making parameter estimates using only count data such as could be gathered from field observations. We demonstrate that ABC is a highly useful method for parameter estimation, giving accurate estimates of parameters that are important for the management of invasive species, such as the intrinsic rate of increase and the point in a landscape where a species has invaded. We also show that the method is capable of estimating the probability of long-distance dispersal, a characteristic of biological invasions that is very influential in determining spread rates but has until now proved difficult to estimate accurately.
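A minimal ABC rejection sketch conveys the core idea: draw a parameter from the prior, simulate count data, and keep the draw when the simulated count is close to the observation. The binomial spread model, prior range and tolerance below are hypothetical and far simpler than the stochastic CA in the paper:

```python
import random

random.seed(7)

def simulate(p, steps=15, start=10):
    """Toy range expansion: each occupied cell colonises one new cell
    with probability p at each step; returns the final count."""
    n = start
    for _ in range(steps):
        n += sum(1 for _ in range(n) if random.random() < p)
    return n

p_true = 0.1
observed = simulate(p_true)        # the "field count" we get to see

# ABC rejection: keep prior draws whose simulated count is near the observation.
accepted = []
for _ in range(4000):
    p = random.uniform(0.0, 0.3)                       # draw from the prior
    if abs(simulate(p) - observed) <= 0.1 * observed:  # tolerance check
        accepted.append(p)

p_hat = sum(accepted) / len(accepted)   # approximate posterior mean
```

No likelihood is ever evaluated, which is exactly why ABC suits models (like the CA above) whose likelihoods are intractable; accuracy then hinges on the choice of summary statistics and tolerance.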
Abstract:
The research objectives of this thesis were to contribute to Bayesian statistical methodology by contributing to risk assessment statistical methodology, and to spatial and spatio-temporal methodology, by modelling error structures using complex hierarchical models. Specifically, I hoped to consider two applied areas, and use these applications as a springboard for developing new statistical methods as well as undertaking analyses which might give answers to particular applied questions. Thus, this thesis considers a series of models, firstly in the context of risk assessments for recycled water, and secondly in the context of water usage by crops. The research objective was to model error structures using hierarchical models in two problems, namely risk assessment analyses for wastewater, and secondly, in a four-dimensional dataset, assessing differences between cropping systems over time and over three spatial dimensions. The aim was to use the simplicity and insight afforded by Bayesian networks to develop appropriate models for risk scenarios, and again to use Bayesian hierarchical models to explore the necessarily complex modelling of four-dimensional agricultural data. The specific objectives of the research were to develop a method for the calculation of credible intervals for the point estimates of Bayesian networks; to develop a model structure to incorporate all the experimental uncertainty associated with various constants, thereby allowing the calculation of more credible credible intervals for a risk assessment; to model a single day's data from the agricultural dataset which satisfactorily captured the complexities of the data; to build a model for several days' data, in order to consider how the full data might be modelled; and finally to build a model for the full four-dimensional dataset and to consider the time-varying nature of the contrast of interest, having satisfactorily accounted for possible spatial and temporal autocorrelations.
This work forms five papers, two of which have been published, with two submitted, and the final paper still in draft. The first two objectives were met by recasting the risk assessments as directed acyclic graphs (DAGs). In the first case, we elicited uncertainty for the conditional probabilities needed by the Bayesian net, incorporated these into a corresponding DAG, and used Markov chain Monte Carlo (MCMC) to find credible intervals for all the scenarios and outcomes of interest. In the second case, we incorporated the experimental data underlying the risk assessment constants into the DAG, and also treated some of that data as needing to be modelled as an ‘errors-in-variables’ problem [Fuller, 1987]. This illustrated a simple method for the incorporation of experimental error into risk assessments. In considering one day of the three-dimensional agricultural data, it became clear that geostatistical models or conditional autoregressive (CAR) models over the three dimensions were not the best way to approach the data. Instead, CAR models were used with neighbours only in the same depth layer. This gave flexibility to the model, allowing both the spatially structured and non-structured variances to differ at all depths. We call this model the CAR layered model. Given the experimental design, the fixed part of the model could have been modelled as a set of means by treatment and by depth, but doing so would allow little insight into how the treatment effects vary with depth. Hence, a number of essentially non-parametric approaches were taken to see the effects of depth on treatment, with the model of choice incorporating an errors-in-variables approach for depth in addition to a non-parametric smooth. The statistical contribution here was the introduction of the CAR layered model; the applied contribution was the analysis of moisture over depth and the estimation of the contrast of interest together with its credible intervals.
These models were fitted using WinBUGS [Lunn et al., 2000]. The work in the fifth paper deals with the fact that with large datasets, the use of WinBUGS becomes more problematic because of its highly correlated term-by-term updating. In this work, we introduce a Gibbs sampler with block updating for the CAR layered model. The Gibbs sampler was implemented by Chris Strickland using pyMCMC [Strickland, 2010]. This framework is then used to consider five days' data, and we show that moisture in the soil for all the various treatments reaches levels particular to each treatment at a depth of 200 cm and thereafter stays constant, albeit with increasing variances with depth. In an analysis across three spatial dimensions and across time, there are many interactions of time and the spatial dimensions to be considered. Hence, we chose to use a daily model and to repeat the analysis at all time points, effectively creating an interaction model of time by the daily model. Such an approach allows great flexibility. However, this approach does not allow insight into the way in which the parameter of interest varies over time. Hence, a two-stage approach was also used, with estimates from the first stage being analysed as a set of time series. We see this spatio-temporal interaction model as being a useful approach to data measured across three spatial dimensions and time, since it does not assume additivity of the random spatial or temporal effects.
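The motivation for block updating can be sketched on a toy target: coordinate-wise Gibbs updates on a highly correlated bivariate normal produce strongly autocorrelated draws, whereas a joint (block) draw produces independent ones. The correlation value and sample sizes below are illustrative only:

```python
import math
import random

random.seed(3)

rho = 0.95  # illustrative correlation; higher values slow Gibbs mixing further

# Coordinate-wise Gibbs sampler for a standard bivariate normal with
# correlation rho: alternate draws from the two full conditionals.
x = y = 0.0
gibbs = []
for _ in range(10000):
    x = random.gauss(rho * y, math.sqrt(1 - rho ** 2))  # x | y
    y = random.gauss(rho * x, math.sqrt(1 - rho ** 2))  # y | x
    gibbs.append(x)

# Block update: draw (x, y) jointly, so successive samples are independent.
block = []
for _ in range(10000):
    z1, z2 = random.gauss(0, 1), random.gauss(0, 1)
    bx = z1
    by = rho * z1 + math.sqrt(1 - rho ** 2) * z2  # Cholesky-style joint draw
    block.append(bx)

def lag1_autocorr(s):
    """Sample lag-1 autocorrelation."""
    m = sum(s) / len(s)
    num = sum((a - m) * (b - m) for a, b in zip(s, s[1:]))
    den = sum((a - m) ** 2 for a in s)
    return num / den

slow = lag1_autocorr(gibbs)   # close to rho**2 for the single-site chain
fast = lag1_autocorr(block)   # close to zero for the block sampler
```

The same effect, at much larger scale, is what makes term-by-term updating of a CAR layered model slow in WinBUGS and block updating of its spatial effects attractive.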
Abstract:
Background: Breastfeeding is the internationally accepted ideal in infant feeding. Ensuring mothers and babies receive optimal benefits, in both the short and long term, is dependent upon the successful establishment of breastfeeding in the first week. Many maternal and infant challenges can occur during the establishment of breastfeeding (Lactogenesis II). There are also many methods and devices (alternative techniques) which can be used to help, but the majority do not have an evidence base. The mother's self-confidence (self-efficacy) can be challenged by these unexpected circumstances, but understanding of the relationship is unclear. Method: This descriptive study used a mail survey (including the Breastfeeding Self-Efficacy Scale – Short Form) to obtain mothers' reports of their self-efficacy and their breastfeeding experience during the first week following birth, as well as actual use of alternative techniques. This study included all mothers of full-term healthy singleton infants from one private hospital in Brisbane who began any breastfeeding. The data collection took place from November 2008 to February 2009. Ethical approval was granted from the research site and QUT Human Research Ethics Committee. Results: A total of 128 questionnaires were returned, a response rate of 56.9%. The sample was dissimilar to the Queensland population with regard to age, income, and education level, all of which were higher in this study. The sample was similar to the Queensland population in terms of parity and marital status. The rate of use of alternative techniques was 48.3%. The mean breastfeeding self-efficacy score of those who used any alternative technique was 43.43 (SD=12.19), and for those who did not, it was 58.32 (SD=7.40). Kruskal-Wallis analysis identified that the median self-efficacy score for those who used alternative techniques was significantly lower than the median self-efficacy score for those who did not use alternative techniques.
The reasons women used alternative techniques varied widely, and their knowledge of alternative techniques was good. Conclusion: This study is the first to document the breastfeeding self-efficacy of women who used alternative techniques to support their breastfeeding goals in the first week postpartum. An individualised clinical intervention to develop women's self-efficacy with breastfeeding is important to assist mother/infant dyads encountering challenges to breastfeeding in the first week postpartum.
Abstract:
An introduction to thinking about and understanding probability that highlights the main pitfalls and traps that beset logical reasoning.
Abstract:
An introduction to elicitation of experts' probabilities, which illustrates common problems with reasoning and how to circumvent them during elicitation.