88 results for Averaging Theorem
at Queensland University of Technology - ePrints Archive
Abstract:
Harmful Algal Blooms (HABs) are a worldwide problem that has been increasing in frequency and extent over the past several decades. HABs severely damage aquatic ecosystems by destroying benthic habitat, reducing invertebrate and fish populations, and affecting larger species such as dugong that rely on seagrasses for food. Few statistical models for predicting HAB occurrences have been developed and, in common with most predictive models in ecology, those that have been developed do not fully account for uncertainties in parameters and model structure. This makes management decisions based on these predictions riskier than might be supposed. We used a probit time series model and Bayesian Model Averaging (BMA) to predict occurrences of blooms of Lyngbya majuscula, a toxic cyanophyte, in Deception Bay, Queensland, Australia. We found a suite of useful predictors for HAB occurrence: temperature figured prominently in the models with the majority of posterior support, and a model consisting of the single covariate, average monthly minimum temperature, showed by far the greatest posterior support. We compared alternative model averaging strategies: one used the full posterior distribution, while a simpler approach used the majority of the posterior distribution for predictions but with vastly fewer models. Both BMA approaches showed excellent predictive performance, with little difference in their predictive capacity. Applications of BMA are still rare in ecology, particularly in management settings. This study demonstrates the power of BMA as an important management tool that is capable of high predictive performance while fully accounting for both parameter and model uncertainty.
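The core BMA operation the abstract describes can be sketched in a few lines: predictions from competing models are combined with weights given by each model's posterior probability. All numbers below are illustrative, not values from the study.

```python
import numpy as np

# Hypothetical posterior model probabilities P(M_k | data) for three
# candidate bloom models (illustrative values, not taken from the study).
posterior = np.array([0.70, 0.20, 0.10])

# Each model's predicted probability of a bloom next month (illustrative).
predictions = np.array([0.82, 0.55, 0.40])

# Bayesian model averaging: the BMA prediction is the posterior-weighted
# average of the individual model predictions.
bma_prediction = float(posterior @ predictions)
print(round(bma_prediction, 3))  # → 0.724
```

Because the weights sum to one, the averaged prediction always lies within the range of the individual model predictions while reflecting model uncertainty.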
Abstract:
Volume measurements are useful in many branches of science and medicine. They are usually accomplished by acquiring a sequence of cross sectional images through the object using an appropriate scanning modality, for example x-ray computed tomography (CT), magnetic resonance (MR) or ultrasound (US). In the cases of CT and MR, a dividing cubes algorithm can be used to describe the surface as a triangle mesh. However, such algorithms are not suitable for US data, especially when the image sequence is multiplanar (as it usually is). This problem may be overcome by manually tracing regions of interest (ROIs) on the registered multiplanar images and connecting the points into a triangular mesh. In this paper we describe and evaluate a new discrete form of Gauss’ theorem which enables the calculation of the volume of any enclosed surface described by a triangular mesh. The volume is calculated by summing the vector product of the centroid, area and normal of each surface triangle. The algorithm was tested on computer-generated objects, US-scanned balloons, livers and kidneys and CT-scanned clay rocks. The results, expressed as the mean percentage difference ± one standard deviation, were 1.2 ± 2.3, 5.5 ± 4.7, 3.0 ± 3.2 and −1.2 ± 3.2% for balloons, livers, kidneys and rocks respectively. The results compare favourably with other volume estimation methods such as planimetry and tetrahedral decomposition.
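The discrete Gauss (divergence) theorem calculation the abstract describes can be sketched directly: summing, over triangles, the dot product of each triangle's centroid with its area-weighted normal, then dividing by three, recovers the enclosed volume. This is a minimal sketch of the general idea, not the paper's implementation.

```python
import numpy as np

def mesh_volume(vertices, faces):
    """Volume enclosed by a closed triangle mesh via a discrete form of
    Gauss' theorem: V = (1/3) * sum over triangles of centroid . (area * normal).
    Faces must be consistently oriented so normals point outward."""
    v = np.asarray(vertices, dtype=float)
    total = 0.0
    for i, j, k in faces:
        a, b, c = v[i], v[j], v[k]
        centroid = (a + b + c) / 3.0
        # The cross product of two edges is twice the triangle area
        # times the unit normal.
        area_normal = 0.5 * np.cross(b - a, c - a)
        total += centroid @ area_normal
    return total / 3.0

# Unit right tetrahedron (exact volume 1/6), faces oriented outward.
verts = [(0, 0, 0), (1, 0, 0), (0, 1, 0), (0, 0, 1)]
faces = [(0, 2, 1), (0, 1, 3), (0, 3, 2), (1, 2, 3)]
print(mesh_volume(verts, faces))  # → 0.1666... (= 1/6)
```

The same routine applies unchanged to the manually traced, triangulated ROI meshes described in the abstract, provided the face winding is consistent.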
Abstract:
This study considered the problem of predicting survival, based on three alternative models: a single Weibull, a mixture of Weibulls and a cure model. Instead of the common procedure of choosing a single “best” model, where “best” is defined in terms of goodness of fit to the data, a Bayesian model averaging (BMA) approach was adopted to account for model uncertainty. This was illustrated using a case study in which the aim was the description of lymphoma cancer survival with covariates given by phenotypes and gene expression. The results of this study indicate that if the sample size is sufficiently large, one of the three models emerges as having the highest probability given the data, as indicated by the goodness-of-fit measure, the Bayesian information criterion (BIC). However, when the sample size was reduced, no single model was revealed as “best”, suggesting that a BMA approach would be appropriate. Although a BMA approach can compromise on goodness of fit to the data (when compared to the true model), it can provide robust predictions and facilitate more detailed investigation of the relationships between gene expression and patient survival. Keywords: Bayesian modelling; Bayesian model averaging; Cure model; Markov chain Monte Carlo; Mixture model; Survival analysis; Weibull distribution
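A common way to turn BIC values into approximate posterior model probabilities, as used when BIC serves as the model-comparison criterion, is the weight exp(-BIC/2) normalised across models. The BIC values below are hypothetical, purely to illustrate the mechanics.

```python
import numpy as np

# Illustrative BIC values for three candidate survival models
# (hypothetical numbers, not from the study):
# single Weibull, Weibull mixture, cure model.
bic = np.array([412.3, 410.1, 414.8])

# Approximate posterior model probabilities: proportional to exp(-BIC/2).
# Subtracting the minimum first avoids numerical underflow.
delta = bic - bic.min()
weights = np.exp(-delta / 2.0)
weights /= weights.sum()
print(weights.round(3))  # the Weibull mixture carries most of the weight here
```

With a large sample, one model's weight typically approaches one (a single "best" model emerges); with smaller samples the weights spread out, which is exactly the regime where averaging over models pays off.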
Abstract:
Nitrous oxide (N2O) is one of the greenhouse gases that can contribute to global warming. Spatial variability of N2O can lead to large uncertainties in prediction. However, previous studies have often ignored the spatial dependency when quantifying the relationships between N2O and environmental factors. Few studies have examined the impacts of various spatial correlation structures (e.g. independence, distance-based and neighbourhood-based) on spatial prediction of N2O emissions. This study aimed to assess the impact of three spatial correlation structures on spatial predictions and to calibrate the spatial prediction using Bayesian model averaging (BMA) based on replicated, irregular point-referenced data. The data were measured in 17 chambers randomly placed across a 271 m² field between October 2007 and September 2008 in the southeast of Australia. We used a Bayesian geostatistical model and a Bayesian spatial conditional autoregressive (CAR) model to investigate and accommodate spatial dependency, and to estimate the effects of environmental variables on N2O emissions across the study site. We compared these with a Bayesian regression model with independent errors. The three approaches resulted in different derived maps of spatial prediction of N2O emissions. We found that incorporating spatial dependency in the model not only substantially improved predictions of N2O emission from soil, but also better quantified uncertainties of soil parameters in the study. The hybrid model structure obtained by BMA improved the accuracy of spatial prediction of N2O emissions across this study region.
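The neighbourhood-based structure mentioned above is typically encoded as a CAR prior, whose precision matrix is built from an adjacency matrix. The sketch below constructs a proper CAR precision matrix for a tiny hypothetical neighbourhood graph (four sites on a line; the study itself used 17 chambers).

```python
import numpy as np

# Hypothetical adjacency matrix W for four sites in a row:
# W[i, j] = 1 if sites i and j are neighbours.
W = np.array([
    [0, 1, 0, 0],
    [1, 0, 1, 0],
    [0, 1, 0, 1],
    [0, 0, 1, 0],
], dtype=float)

rho, tau2 = 0.9, 1.0           # spatial dependence and conditional variance
D = np.diag(W.sum(axis=1))     # diagonal matrix of neighbour counts

# Proper CAR prior on the spatial effects x: x ~ N(0, tau2 * inv(D - rho*W)).
# For |rho| < 1 this precision matrix is diagonally dominant, hence
# positive definite, so the prior is a valid multivariate normal.
precision = (D - rho * W) / tau2
covariance = np.linalg.inv(precision)
print(np.allclose(covariance, covariance.T))  # → True (valid covariance)
```

Sites that are neighbours in W end up strongly correlated in the implied covariance, which is how the model "borrows strength" across nearby chambers.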
Abstract:
This paper discusses how fundamentals of number theory, such as unique prime factorization and greatest common divisor can be made accessible to secondary school students through spreadsheets. In addition, the three basic multiplicative functions of number theory are defined and illustrated through a spreadsheet environment. Primes are defined simply as those natural numbers with just two divisors. One focus of the paper is to show the ease with which spreadsheets can be used to introduce students to some basics of elementary number theory. Complete instructions are given to build a spreadsheet to enable the user to input a positive integer, either with a slider or manually, and see the prime decomposition. The spreadsheet environment allows students to observe patterns, gain structural insight, form and test conjectures, and solve problems in elementary number theory.
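The spreadsheet constructions described above translate directly into short functions: trial division yields the unique prime factorization, from which multiplicative functions follow by their product formulas. The abstract does not name its three multiplicative functions; divisor count and divisor sum are shown here as common examples.

```python
def prime_factorization(n):
    """Unique prime factorization of n as {prime: exponent}, by trial division."""
    factors, d = {}, 2
    while d * d <= n:
        while n % d == 0:
            factors[d] = factors.get(d, 0) + 1
            n //= d
        d += 1
    if n > 1:
        factors[n] = factors.get(n, 0) + 1  # remaining n is itself prime
    return factors

def num_divisors(n):
    """tau(n): multiplicative, equal to the product of (exponent + 1)."""
    result = 1
    for e in prime_factorization(n).values():
        result *= e + 1
    return result

def sum_divisors(n):
    """sigma(n): multiplicative, product of (p**(e+1) - 1) // (p - 1)."""
    result = 1
    for p, e in prime_factorization(n).items():
        result *= (p ** (e + 1) - 1) // (p - 1)
    return result

print(prime_factorization(360))            # → {2: 3, 3: 2, 5: 1}
print(num_divisors(360), sum_divisors(360))  # → 24 1170
```

A slider-driven spreadsheet version exposes exactly the same patterns: as the input changes, students watch the exponent vector and the derived function values update together.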
Abstract:
This article lays down the foundations of the renormalization group (RG) approach for differential equations characterized by multiple scales. The renormalization of constants through an elimination process and the subsequent derivation of the amplitude equation [Chen, Phys. Rev. E 54, 376 (1996)] are given a rigorous but not abstract mathematical form whose justification is based on the implicit function theorem. Developing the theoretical framework that underlies the RG approach leads to a systematization of the renormalization process and to the derivation of explicit closed-form expressions for the amplitude equations that can be carried out with symbolic computation for both linear and nonlinear scalar differential equations and first order systems but independently of their particular forms. Certain nonlinear singular perturbation problems are considered that illustrate the formalism and recover well-known results from the literature as special cases. © 2008 American Institute of Physics.
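As a concrete illustration of the kind of amplitude equation the abstract refers to, consider the standard weakly nonlinear oscillator example from the multiple-scales literature (this example is generic, not taken from the paper itself):

```latex
% Van der Pol oscillator with a small parameter:
\ddot{y} + y = \varepsilon\,(1 - y^{2})\,\dot{y}, \qquad 0 < \varepsilon \ll 1.
% Writing y \approx R(t)\cos\bigl(t + \theta(t)\bigr), the RG (equivalently,
% multiple-scales) procedure yields the amplitude equations
\frac{dR}{dt} = \frac{\varepsilon}{2}\,R\left(1 - \frac{R^{2}}{4}\right),
\qquad
\frac{d\theta}{dt} = O(\varepsilon^{2}),
% whose stable fixed point R = 2 recovers the well-known
% limit-cycle amplitude of the van der Pol oscillator.
```

The renormalization in the paper's sense amounts to absorbing the secular (unbounded-in-time) terms of the naive perturbation series into the slowly varying amplitude $R$ and phase $\theta$.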
Contrast transfer function correction applied to cryo-electron tomography and sub-tomogram averaging
Abstract:
Cryo-electron tomography together with averaging of sub-tomograms containing identical particles can reveal the structure of proteins or protein complexes in their native environment. The resolution of this technique is limited by the contrast transfer function (CTF) of the microscope. The CTF is not routinely corrected in cryo-electron tomography because of difficulties including CTF detection, due to the low signal-to-noise ratio, and CTF correction, since images are characterised by a spatially variant CTF. Here we simulate the effects of the CTF on the resolution of the final reconstruction, before and after CTF correction, and consider the effect of errors and approximations in defocus determination. We show that errors in defocus determination are well tolerated when correcting a series of tomograms collected at a range of defocus values. We apply methods for determining the CTF parameters in low signal-to-noise images of tilted specimens, for monitoring defocus changes using observed magnification changes, and for correcting the CTF prior to reconstruction. Using bacteriophage PRD1 as a test sample, we demonstrate that this approach gives an improvement in the structure obtained by sub-tomogram averaging from cryo-electron tomograms.
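The oscillatory CTF being corrected here has a standard closed form in terms of defocus, spherical aberration and electron wavelength. The sketch below evaluates one common sign convention; the parameter values (300 kV wavelength, Cs = 2.0 mm, 7% amplitude contrast) are typical illustrative choices, not the paper's settings.

```python
import numpy as np

def ctf(k, defocus_um, wavelength_A=0.0197, cs_mm=2.0, amp_contrast=0.07):
    """Contrast transfer function at spatial frequency k (in 1/Angstrom).

    Uses the common phase-aberration form
        chi(k) = pi*lambda*dz*k^2 - (pi/2)*Cs*lambda^3*k^4
    with underfocus positive. Sign conventions vary between packages;
    this is one widely used choice, shown for illustration only.
    """
    lam = wavelength_A
    dz = defocus_um * 1e4          # micrometres -> Angstroms
    cs = cs_mm * 1e7               # millimetres -> Angstroms
    chi = np.pi * lam * dz * k**2 - 0.5 * np.pi * cs * lam**3 * k**4
    return -(np.sqrt(1.0 - amp_contrast**2) * np.sin(chi)
             + amp_contrast * np.cos(chi))

k = np.linspace(0.0, 0.3, 1000)    # spatial frequencies, 1/Angstrom
c = ctf(k, defocus_um=4.0)         # a high defocus typical of tomography
print(c[0])                        # at k = 0 only amplitude contrast remains
```

The zero crossings of this curve are the frequencies at which information vanishes; collecting tomograms over a range of defocus values, as the abstract describes, shifts the zeros so that the averaged data cover all frequencies.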
Abstract:
The population Monte Carlo algorithm is an iterative importance sampling scheme for solving static problems. We examine the population Monte Carlo algorithm in a simplified setting, a single step of the general algorithm, and study a fundamental problem that occurs in applying importance sampling to high-dimensional problems. The precision of the computed estimate in the simplified setting is measured by the asymptotic variance of the estimate, under conditions on the importance function. We demonstrate the exponential growth of the asymptotic variance with the dimension and show that the optimal covariance matrix for the importance function can be estimated in special cases.
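The exponential growth of the variance with dimension is easy to exhibit in a toy Gaussian setting (target N(0, I), proposal N(0, 2I)); for this pair the second moment of the importance weight factorises across coordinates, so the variance grows geometrically. This is an illustrative example under stated assumptions, not the paper's analysis.

```python
import numpy as np

rng = np.random.default_rng(0)

def weight_variance(dim, sigma=np.sqrt(2.0), n=200_000):
    """Monte Carlo estimate of Var(w) for importance weights w = p/q,
    with target p = N(0, I_dim) and proposal q = N(0, sigma^2 I_dim)."""
    x = rng.normal(0.0, sigma, size=(n, dim))
    s = np.sum(x**2, axis=1)
    # log w = log p(x) - log q(x); the (2*pi) normalisers cancel.
    log_w = -0.5 * s * (1.0 - 1.0 / sigma**2) + dim * np.log(sigma)
    w = np.exp(log_w)
    return w.var()

# Closed form for this Gaussian pair: E[w^2] = (sigma^2/sqrt(2*sigma^2-1))^dim,
# so Var(w) = (2/sqrt(3))^dim - 1 when sigma^2 = 2: exponential in dim.
for dim in (1, 5, 10, 20):
    exact = (2.0 / np.sqrt(3.0))**dim - 1.0
    print(dim, round(exact, 3), round(weight_variance(dim), 3))
```

In low dimension the Monte Carlo estimate tracks the closed form closely; in high dimension even the variance estimate itself becomes unreliable, which is precisely the pathology the abstract studies.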
Abstract:
Objectives: To evaluate the validity, reliability and responsiveness of electronic data capture (EDC) using the WOMAC® NRS 3.1 Index on Motorola V3 mobile phones. ---------- Methods: Patients with osteoarthritis (OA) undergoing primary unilateral hip or knee joint replacement surgery were assessed pre-operatively and 3-4 months post-operatively. Patients completed the WOMAC® Index in paper (p-WOMAC®) and electronic (m-WOMAC®) format in random order. ---------- Results: 24 men and 38 women with hip and knee OA participated and successfully completed the m-WOMAC® questionnaire. Pearson correlations between the summated total index scores for the p-WOMAC® and m-WOMAC® pre- and post-surgery were 0.98 and 0.99 (p<0.0001). There was no clinically important or statistically significant between-method difference in the adjusted total summated scores, pre- and post-surgery (adjusted mean difference = 4.44, p = 0.474 and 1.73, p = 0.781). Internal consistency estimates of m-WOMAC® reliability were 0.87-0.98. The m-WOMAC® detected clinically important, statistically significant (p<0.0001) improvements in pain, stiffness, function and total index score. ---------- Conclusions: Sixty-two patients with hip and knee OA successfully completed EDC by Motorola V3 mobile phone using the m-WOMAC® NRS 3.1 Index, with completion times averaging only 1-1.5 minutes longer than for the p-WOMAC® Index. Data were successfully and securely transmitted from patients in Australia to a server in the USA. There was close agreement and no significant differences between m-WOMAC® and p-WOMAC® scores. This study confirms the validity, reliability and responsiveness of the Exco InTouch engineered, Java-based m-WOMAC® Index application. EDC with the m-WOMAC® Index provides unique opportunities for quantitative measurement in clinical research and practice.