803 results for Bayesian modelling


Relevance: 20.00%

Abstract:

The export of sediments from coastal catchments can have detrimental impacts on estuaries and near-shore reef ecosystems such as the Great Barrier Reef. Catchment management approaches aimed at reducing sediment loads require monitoring to evaluate their effectiveness in reducing loads over time. However, load estimation is not a trivial task due to the complex behaviour of constituents in natural streams, the variability of water flows and the often limited amount of data. Regression is commonly used for load estimation and provides a fundamental tool for trend estimation by standardising for other time-specific covariates such as flow. This study investigates whether load estimates and the resultant power to detect trends can be enhanced by (i) modelling the error structure so that temporal correlation can be better quantified, (ii) making use of predictive variables, and (iii) identifying an efficient and feasible sampling strategy that may be used to reduce sampling error. To achieve this, we propose a new regression model that includes an innovative compounding-errors model structure and uses two additional predictive variables (average discounted flow and turbidity). By combining this modelling approach with a new, regularly optimised sampling strategy, which adds uniformity to the event sampling strategy, the predictive power was increased to 90%. Using the enhanced regression model proposed here, it was possible to detect a trend of 20% over 20 years. This result is in stark contrast to previous conclusions presented in the literature.
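
The abstract does not give the full model specification, so the sketch below only illustrates the general approach it builds on: a log-scale load regression with an extra turbidity covariate and temporally correlated (here AR(1)) errors. The synthetic data, covariate effects and error structure are assumptions, not the paper's compounding-errors model.

```python
# Minimal sketch of regression-based load estimation with temporally correlated
# errors. Covariates, AR(1) structure and data are illustrative assumptions only.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 500
log_flow = rng.normal(2.0, 0.8, n)                  # hypothetical daily log-flow
turbidity = 5 + 3 * log_flow + rng.normal(0, 1, n)  # turbidity loosely tracks flow

# True log-concentration with AR(1) noise (assumed for illustration only)
eps = np.zeros(n)
for t in range(1, n):
    eps[t] = 0.6 * eps[t - 1] + rng.normal(0, 0.3)
log_conc = 0.5 + 0.4 * log_flow + 0.05 * turbidity + eps

X = sm.add_constant(np.column_stack([log_flow, turbidity]))
model = sm.GLSAR(log_conc, X, rho=1)       # linear regression with AR(1) errors
fit = model.iterative_fit(maxiter=10)
print(fit.params)                          # intercept, flow and turbidity effects
print("estimated AR(1) coefficient:", model.rho)

# Load estimate: concentration x flow, back-transformed from logs (no bias correction)
load = np.exp(fit.fittedvalues) * np.exp(log_flow)
print("estimated mean daily load:", load.mean())
```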

Relevance: 20.00%

Abstract:

A flexible and simple Bayesian decision-theoretic design for dose-finding trials is proposed in this paper. To reduce the computational burden, we adopt a working model with conjugate priors, which is flexible enough to fit all monotonic dose-toxicity curves and produces analytic posterior distributions. We also discuss how to use a proper utility function to reflect the interest of the trial. Patients are allocated based not only on the utility function but also on the chosen dose selection rule. The most popular dose selection rule is the one-step-look-ahead (OSLA), which selects the best-so-far dose. A more complicated rule, such as the two-step-look-ahead, is theoretically more efficient than the OSLA only when the required distributional assumptions are met, which is often not the case in practice. We carried out extensive simulation studies to evaluate these two dose selection rules and found that OSLA was often more efficient than the two-step-look-ahead under the proposed Bayesian structure. Moreover, our simulation results show that the proposed Bayesian method's performance is superior to several popular Bayesian methods and that the negative impact of prior misspecification can be managed in the design stage.
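
The working model and utility function are not spelled out in the abstract. The sketch below illustrates only the one-step-look-ahead idea, under an assumed independent Beta-Binomial toxicity model per dose (which, unlike the paper's working model, does not enforce monotonicity), selecting the dose whose posterior mean toxicity is closest to a target.

```python
# One-step-look-ahead (OSLA) dose selection under an assumed independent
# Beta-Binomial toxicity model (not the paper's working model or utility).
n_patients   = [3, 6, 4, 0, 0]    # hypothetical patients treated at each dose
n_toxicities = [0, 1, 2, 0, 0]    # hypothetical toxicities observed at each dose
target = 0.30                     # assumed target toxicity probability
prior_a, prior_b = 0.5, 1.5       # assumed Beta prior on each dose's toxicity rate

def osla_next_dose(n, y, target, a, b):
    """Pick the dose whose posterior mean toxicity is closest to the target."""
    post_means = [(a + yi) / (a + b + ni) for ni, yi in zip(n, y)]
    best = min(range(len(n)), key=lambda d: abs(post_means[d] - target))
    return best, post_means

dose, post_means = osla_next_dose(n_patients, n_toxicities, target, prior_a, prior_b)
print("posterior mean toxicity per dose:", [round(p, 3) for p in post_means])
print("next recommended dose index:", dose)
# Real designs add safety constraints (e.g. no dose skipping) on top of this rule.
```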

Relevance: 20.00%

Abstract:

So far, most Phase II trials have been designed and analysed under a frequentist framework. Under this framework, a trial is designed so that the overall Type I and Type II errors of the trial are controlled at some desired levels. Recently, a number of articles have advocated the use of Bayesian designs in practice. Under a Bayesian framework, a trial is designed so that the trial stops when the posterior probability of treatment is within certain prespecified thresholds. In this article, we argue that trials under a Bayesian framework can also be designed to control frequentist error rates. We introduce a Bayesian version of Simon's well-known two-stage design to achieve this goal. We also consider two other errors, called Bayesian errors in this article because of their similarity to posterior probabilities. We show that our method can also control these Bayesian-type errors. We compare our method with other recent Bayesian designs in a numerical study and discuss the implications of different designs on error rates. An example of a clinical trial for patients with nasopharyngeal carcinoma is used to illustrate the differences between the designs.
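
As a reminder of the frequentist error rates such designs aim to control, the sketch below computes Type I and Type II error for a generic Simon-type two-stage design (stop at stage 1 unless enough responses are seen; declare the treatment promising only if the total responses exceed a final cutoff). The design parameters and response rates are illustrative, not those of the paper's Bayesian version.

```python
# Frequentist error rates of a generic Simon-type two-stage design.
# Design parameters and response rates below are illustrative assumptions.
from scipy.stats import binom

def prob_declare_promising(p, n1, r1, n2, r):
    """P(pass stage 1 and exceed r total responses) when the true response rate is p."""
    total = 0.0
    for x1 in range(r1 + 1, n1 + 1):
        # need more than r - x1 responses out of n2 patients in stage 2
        total += binom.pmf(x1, n1, p) * binom.sf(r - x1, n2, p)
    return total

n1, r1, n2, r = 19, 4, 35, 15          # hypothetical two-stage design
p0, p1 = 0.20, 0.40                    # assumed null and alternative response rates
type_I  = prob_declare_promising(p0, n1, r1, n2, r)
type_II = 1 - prob_declare_promising(p1, n1, r1, n2, r)
print(f"Type I error: {type_I:.3f}, Type II error: {type_II:.3f}")
```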

Relevance: 20.00%

Abstract:

This project provides a stepping stone towards understanding the mechanisms that govern particulate fouling in metal foam heat exchangers. The method is based on the development of an advanced computational fluid dynamics model together with analytical validation. This novel method allows an engineer to better optimize heat exchanger designs, thereby mitigating fouling, reducing the energy consumption caused by fouling, economizing capital expenditure on heat exchanger maintenance, and reducing operational downtime. The robust model leads to the establishment of an alternative heat exchanger configuration with a lower pressure drop and a reduced propensity for particulate deposition.

Relevance: 20.00%

Abstract:

Stallard (1998, Biometrics 54, 279-294) recently used Bayesian decision theory for sample-size determination in phase II trials. His design maximizes the expected financial gains in the development of a new treatment. However, it results in a very high probability (0.65) of recommending an ineffective treatment for phase III testing. On the other hand, the expected gain using his design is more than 10 times that of a design that tightly controls the false positive error (Thall and Simon, 1994, Biometrics 50, 337-349). Stallard's design maximizes the expected gain per phase II trial, but it does not maximize the rate of gain or total gain for a fixed length of time because the rate of gain depends on the proportion of treatments forwarded to the phase III study. We suggest maximizing the rate of gain, and the resulting optimal one-stage design is twice as efficient as Stallard's one-stage design. Furthermore, the new design has a probability of only 0.12 of passing an ineffective treatment to the phase III study.

Relevance: 20.00%

Abstract:

Records of shrimp growth and water quality made during 12 crops from each of 48 ponds, over a period of 6.5 years, were provided by a commercial shrimp farm in Queensland, Australia. These data were analysed with a new growth model derived from the Gompertz model. The results indicate that water temperature, mortality and pond age significantly affect growth rates. After 180 days, shrimp reach 34 g at a constant 30 degrees C, but only 15 g after the same amount of time at 20 degrees C. Mortality, by thinning the density of shrimp in the ponds, increased the growth rate, but the effect was small. With continual production, growth rates at first remained steady, then appeared to decrease for the sixth and seventh crops, after which they increased steadily with each crop. It appears that conservative pond management, together with a gradual improvement in husbandry techniques, particularly feed management, brought about this change. This has encouraging implications for the long-term sustainability of the farming methods used. The growth model can be used to predict the productivity, and hence profitability, of new aquaculture locations or new production strategies.
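
The abstract does not report the fitted parameters, so the sketch below only shows a plain Gompertz curve with a temperature-dependent growth-rate constant. All parameter values are illustrative assumptions, tuned merely to echo the reported ~34 g (30 degrees C) vs ~15 g (20 degrees C) contrast after 180 days.

```python
# Gompertz growth curve W(t) = W_inf * exp(-b * exp(-k * t)) with a
# temperature-dependent rate constant k. Parameter values are illustrative
# assumptions, not the paper's fitted estimates.
import numpy as np

def gompertz_weight(t_days, w_inf=45.0, b=4.0, k=0.0148):
    """Shrimp weight (g) after t_days under a Gompertz growth curve."""
    return w_inf * np.exp(-b * np.exp(-k * t_days))

k_30C = 0.0148   # assumed growth-rate constant at 30 degrees C
k_20C = 0.0072   # assumed growth-rate constant at 20 degrees C

print("weight after 180 days at 30 C:", round(gompertz_weight(180, k=k_30C), 1), "g")
print("weight after 180 days at 20 C:", round(gompertz_weight(180, k=k_20C), 1), "g")
```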

Relevance: 20.00%

Abstract:

This study uses agent-based modelling to simulate worker interactions within a workplace and to investigate how these interactions affect workplace dynamics. Two new models (a Bounded Confidence with Bias model and a Relative Agreement with Bias model) are built on the theoretical foundation of two existing models. A new factor, bias, is added to the new models, which raises several issues to be studied.
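
The thesis' exact bias formulation is not given in the abstract. The sketch below shows only a standard bounded-confidence (Deffuant-style) pairwise opinion update, with a hypothetical bias weight added so that one agent discounts the other's influence; the bias term is an assumed illustration, not the model studied here.

```python
# Bounded-confidence opinion dynamics with a hypothetical bias term.
# The Deffuant-style update is standard; the 'bias' weighting is assumed.
import numpy as np

rng = np.random.default_rng(0)
n_agents, steps = 50, 5000
epsilon = 0.3     # confidence bound: agents interact only if opinions are this close
mu = 0.5          # convergence rate
bias = 0.2        # hypothetical asymmetry: agent i discounts agent j's influence

opinions = rng.uniform(0, 1, n_agents)
for _ in range(steps):
    i, j = rng.choice(n_agents, size=2, replace=False)
    d = opinions[j] - opinions[i]
    if abs(d) < epsilon:
        opinions[i] += mu * (1 - bias) * d   # biased agent moves less toward j
        opinions[j] -= mu * d                # agent j moves symmetrically toward i

# Count emergent opinion clusters (opinions rounded to one decimal place)
print("final opinion clusters (approx.):", np.unique(np.round(opinions, 1)))
```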

Relevance: 20.00%

Abstract:

This paper proposes solutions to three issues pertaining to the estimation of finite mixture models with an unknown number of components: the non-identifiability induced by overfitting the number of components, the mixing limitations of standard Markov chain Monte Carlo (MCMC) sampling techniques, and the related label-switching problem. An overfitting approach is used to estimate the number of components in a finite mixture model via the Zmix algorithm. Zmix provides a bridge between multidimensional samplers and test-based estimation methods, whereby priors are chosen to encourage extra groups to have weights approaching zero. MCMC sampling is made possible by the implementation of prior parallel tempering, an extension of parallel tempering. Zmix can accurately estimate the number of components, posterior parameter estimates and allocation probabilities given a sufficiently large sample size. The results reflect uncertainty in the final model and report the range of possible candidate models and their respective estimated probabilities from a single run. Label switching is resolved with a computationally lightweight method, Zswitch, developed for overfitted mixtures by exploiting the intuitiveness of allocation-based relabelling algorithms and the precision of label-invariant loss functions. Four simulation studies are included to illustrate Zmix and Zswitch, as well as three case studies from the literature. All methods are available as part of the R package Zmix, which can currently be applied to univariate Gaussian mixture models.
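
Zmix's full machinery (prior parallel tempering, Zswitch relabelling) is beyond a short sketch, but the core overfitting idea, deliberately specifying too many components and using a sparse Dirichlet prior on the weights so that superfluous components are emptied, can be illustrated. The sketch below uses scikit-learn's variational BayesianGaussianMixture as a stand-in for Zmix's MCMC; the data and prior values are assumed.

```python
# Overfitted finite mixture: more components than needed plus a sparse Dirichlet
# prior on the weights, so extra components shrink towards zero weight.
# Uses scikit-learn's variational approximation, not Zmix's MCMC.
import numpy as np
from sklearn.mixture import BayesianGaussianMixture

rng = np.random.default_rng(42)
# Synthetic univariate data from a 3-component Gaussian mixture
data = np.concatenate([rng.normal(-4, 1.0, 300),
                       rng.normal(0, 0.5, 200),
                       rng.normal(5, 1.5, 250)]).reshape(-1, 1)

fit = BayesianGaussianMixture(
    n_components=10,                                   # deliberately overfitted
    weight_concentration_prior_type="dirichlet_distribution",
    weight_concentration_prior=0.01,                   # sparse prior empties spare groups
    max_iter=500, random_state=0).fit(data)

weights = np.round(fit.weights_, 3)
print("estimated weights:", weights)
print("effective number of components:", int(np.sum(weights > 0.01)))
```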

Relevance: 20.00%

Abstract:

This article presents some remarks on models currently used in low-speed manoeuvring and dynamic positioning problems. It discusses the relationship between the classical hydrodynamic equations for manoeuvring and seakeeping, and offers insight into the models used for simulation and control system design.

Relevance: 20.00%

Abstract:

Pseudo-marginal methods such as the grouped independence Metropolis-Hastings (GIMH) and Markov chain within Metropolis (MCWM) algorithms have been introduced in the literature as an approach to performing Bayesian inference in latent variable models. These methods replace intractable likelihood calculations with unbiased estimates within Markov chain Monte Carlo algorithms. The GIMH method has the posterior of interest as its limiting distribution, but suffers from poor mixing if it is too computationally intensive to obtain high-precision likelihood estimates. The MCWM algorithm has better mixing properties, but less theoretical support. In this paper we propose to use Gaussian processes (GPs) to accelerate the GIMH method, using a short pilot run of MCWM to train the GP. Our new method, GP-GIMH, is illustrated on simulated data from a stochastic volatility model and a gene network model.
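
As background on the GIMH mechanism (independent of the GP acceleration proposed here), the sketch below runs a pseudo-marginal Metropolis-Hastings sampler on a toy latent-variable model: the intractable likelihood is replaced by an unbiased Monte Carlo estimate, and the estimate attached to the current state is stored and reused between iterations. The toy model, prior and tuning values are assumptions.

```python
# Pseudo-marginal (GIMH-style) Metropolis-Hastings on a toy latent-variable model:
# y_i | x_i ~ N(x_i, 1), x_i ~ N(theta, 1). The marginal likelihood is estimated
# unbiasedly by averaging over latent draws; the current state's estimate is
# stored and reused, which is what makes the sampler pseudo-marginal.
import numpy as np

rng = np.random.default_rng(3)
y = rng.normal(1.5, np.sqrt(2.0), size=40)               # synthetic data, true theta = 1.5

def log_lik_hat(theta, y, m=100):
    """Log of an unbiased importance-sampling estimate of the likelihood."""
    x = rng.normal(theta, 1.0, size=(m, len(y)))          # latents drawn from their prior
    log_w = -0.5 * (y - x) ** 2 - 0.5 * np.log(2 * np.pi) # N(y; x, 1) densities
    # average the m draws per observation on the likelihood scale, then sum the logs
    return np.sum(np.logaddexp.reduce(log_w, axis=0) - np.log(m))

def log_prior(theta):
    return -0.5 * theta ** 2 / 10.0                       # N(0, 10) prior on theta

theta, ll = 0.0, log_lik_hat(0.0, y)
samples = []
for _ in range(5000):
    prop = theta + rng.normal(0, 0.3)                     # random-walk proposal
    ll_prop = log_lik_hat(prop, y)
    if np.log(rng.uniform()) < ll_prop + log_prior(prop) - ll - log_prior(theta):
        theta, ll = prop, ll_prop                         # accept: keep the new estimate
    samples.append(theta)

print("posterior mean of theta:", np.mean(samples[1000:]))
```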

Relevance: 20.00%

Abstract:

The intervertebral disc withstands large compressive loads (up to nine times bodyweight in humans) while providing flexibility to the spinal column. At a microstructural level, the outer sheath of the disc (the annulus fibrosus) comprises 12–20 annular layers of alternately crisscrossed collagen fibres embedded in a soft ground matrix. The centre of the disc (the nucleus pulposus) consists of a hydrated gel rich in proteoglycans. The disc is the largest avascular structure in the body and is of much interest biomechanically due to the high societal burden of disc degeneration and back pain. Although the disc has been well characterized at the whole joint scale, it is not clear how the disc tissue microstructure confers its overall mechanical properties. In particular, there have been conflicting reports regarding the level of attachment between adjacent lamellae in the annulus, and the importance of these interfaces to the overall integrity of the disc is unknown. We used a polarized light micrograph of the bovine tail disc in transverse cross-section to develop an image-based finite element model incorporating sliding and separation between layers of the annulus, and subjected the model to axial compressive loading. Validation experiments were also performed on four bovine caudal discs. Interlamellar shear resistance had a strong effect on disc compressive stiffness, with a 40% drop in stiffness when the interface shear resistance was changed from fully bonded to freely sliding. By contrast, interlamellar cohesion had no appreciable effect on overall disc mechanics. We conclude that shear resistance between lamellae confers disc mechanical resistance to compression, and degradation of the interlamellar interface structure may be a precursor to macroscopic disc degeneration.

Relevance: 20.00%

Abstract:

Traffic incidents are recognised as one of the key sources of non-recurrent congestion, which often leads to a reduction in travel time reliability (TTR), a key metric of roadway performance. A method is proposed here to quantify the impacts of traffic incidents on TTR on freeways. The method uses historical data to establish recurrent speed profiles and identifies non-recurrent congestion based on its negative impact on speeds. The locations and times of incidents are used to identify incidents among non-recurrent congestion events. Buffer time is employed to measure TTR. Extra buffer time is defined as the extra delay caused by traffic incidents. This reliability measure indicates how much extra travel time is required by travellers to arrive at their destination on time with 95% certainty in the case of an incident, over and above the travel time that would have been required under recurrent conditions. An extra buffer time index (EBTI) is defined as the ratio of extra buffer time to recurrent travel time, with zero being the best case (no delay). A Tobit model is used to identify and quantify factors that affect EBTI using a selected freeway segment in the Southeast Queensland, Australia, network. Both fixed- and random-parameter Tobit specifications are tested. The estimation results reveal that models with random parameters offer a superior statistical fit for all types of incidents, suggesting the presence of unobserved heterogeneity across segments. Which factors influence EBTI depends on the type of incident. In addition, changes in TTR as a result of traffic incidents are related to the characteristics of the incidents (multiple vehicles involved, incident duration, major incidents, etc.) and to traffic characteristics.
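
A minimal sketch of the reliability measures as described: buffer time is the gap between the 95th-percentile travel time and the typical (recurrent) travel time, extra buffer time is the additional buffer attributable to incident conditions, and EBTI scales this by the recurrent travel time. Using the mean as the recurrent travel time, and the synthetic travel-time samples, are assumptions rather than the paper's exact conventions.

```python
# Buffer time, extra buffer time and EBTI, following the definitions sketched in
# the abstract. The synthetic travel times and the use of the mean as the
# recurrent travel time are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(7)
tt_recurrent = rng.lognormal(mean=np.log(10), sigma=0.12, size=2000)   # minutes
tt_incident  = rng.lognormal(mean=np.log(12), sigma=0.30, size=300)    # minutes

def buffer_time(tt, recurrent_tt):
    """Extra time needed to arrive on time with 95% certainty, beyond the typical trip."""
    return np.percentile(tt, 95) - recurrent_tt

recurrent_tt = tt_recurrent.mean()
bt_recurrent = buffer_time(tt_recurrent, recurrent_tt)
bt_incident  = buffer_time(tt_incident, recurrent_tt)

extra_buffer_time = bt_incident - bt_recurrent
ebti = extra_buffer_time / recurrent_tt          # zero is the best case (no extra delay)
print(f"extra buffer time: {extra_buffer_time:.1f} min, EBTI: {ebti:.2f}")
```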

Relevance: 20.00%

Abstract:

Between-subject and within-subject variability is ubiquitous in biology and physiology, and understanding and dealing with it is one of the biggest challenges in medicine. At the same time, it is difficult to investigate this variability by experiments alone. A recent modelling and simulation approach, known as population of models (POM), allows this exploration to take place by building a mathematical model consisting of multiple parameter sets calibrated against experimental data. However, finding such sets within the high-dimensional parameter space of complex electrophysiological models is computationally challenging. By placing the POM approach within a statistical framework, we develop a novel and efficient algorithm based on sequential Monte Carlo (SMC). We compare the SMC approach with Latin hypercube sampling (LHS), a method commonly adopted in the literature for obtaining the POM, in terms of efficiency and output variability in the presence of a drug block, through an in-depth investigation via the Beeler-Reuter cardiac electrophysiological model. We show improved efficiency via SMC and that it produces similar responses to LHS when making out-of-sample predictions in the presence of a simulated drug block.
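
The Beeler-Reuter model is too large to reproduce here, so the sketch below illustrates only the population-of-models construction step that both methods target: Latin hypercube sampling of parameter sets for a toy two-parameter model, keeping those whose outputs fall within assumed calibration ranges. The toy model, ranges and bounds are assumptions; the paper's SMC sampler targets the same set of accepted models far more efficiently than this brute-force search.

```python
# Population of models (POM) by brute-force Latin hypercube sampling: keep the
# parameter sets whose model outputs fall within experimental calibration ranges.
# The two-parameter toy model and the ranges are assumptions standing in for the
# Beeler-Reuter model.
import numpy as np
from scipy.stats import qmc

def toy_model(g1, g2):
    """Hypothetical biomarkers (stand-ins for action potential duration and amplitude)."""
    apd = 300.0 / g1 + 20.0 * g2
    amp = 100.0 * g1 - 5.0 * g2
    return apd, amp

# Assumed calibration ranges a candidate must satisfy to join the population
apd_range, amp_range = (250.0, 350.0), (80.0, 120.0)

sampler = qmc.LatinHypercube(d=2, seed=0)
unit = sampler.random(5000)
params = qmc.scale(unit, l_bounds=[0.5, 0.0], u_bounds=[2.0, 5.0])   # g1, g2 bounds

accepted = []
for g1, g2 in params:
    apd, amp = toy_model(g1, g2)
    if apd_range[0] <= apd <= apd_range[1] and amp_range[0] <= amp <= amp_range[1]:
        accepted.append((g1, g2))

print(f"population size: {len(accepted)} of {len(params)} sampled parameter sets")
```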

Relevance: 20.00%

Abstract:

This thesis studies how conceptual process models - that is, graphical documentations of an organisation's business processes - can enable and constrain the actions of their users. The results from a case study and an experiment indicate that model design decisions and people's characteristics influence how these opportunities for action are perceived and acted upon in practice.

Relevance: 20.00%

Abstract:

Being able to accurately predict the risk of falling is crucial in patients with Parkinson's disease (PD). This is due to the unfavorable effects of falls, which can lower quality of life as well as directly impact survival. Three methods considered for predicting falls are decision trees (DT), Bayesian networks (BN) and support vector machines (SVM). Data from a 1-year prospective study conducted at IHBI, Australia, for 51 people with PD are used. Data processing is conducted using the rpart and e1071 packages in R for DT and SVM, respectively, and Bayes Server 5.5 for the BN. The results show that BN and SVM produce consistently higher accuracy over the 12 monthly evaluation time points (average sensitivity and specificity > 92%) than DT (average sensitivity 88%, average specificity 72%). DT is prone to problems with imbalanced data and so needs adjustment for the misclassification cost. However, DT provides a straightforward, interpretable result and is thus appealing for helping to identify important items related to falls and to generate fallers' profiles.
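
The study itself used the rpart and e1071 R packages plus Bayes Server for the Bayesian network. As a rough Python analogue of the decision tree vs SVM comparison (on synthetic, imbalanced stand-in data; the BN is omitted), a cross-validated comparison might look like the sketch below.

```python
# Rough Python analogue of the DT vs SVM comparison; the study used rpart/e1071
# in R and Bayes Server for the BN. The synthetic features and imbalance are assumed.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.tree import DecisionTreeClassifier
from sklearn.svm import SVC

# Imbalanced synthetic stand-in for faller/non-faller data (51 patients, ~30% fallers)
X, y = make_classification(n_samples=51, n_features=10, n_informative=5,
                           weights=[0.7, 0.3], random_state=0)

# Class weights compensate for imbalance (the abstract's misclassification-cost adjustment)
dt  = DecisionTreeClassifier(class_weight="balanced", random_state=0)
svm = SVC(kernel="rbf", class_weight="balanced", random_state=0)

for name, clf in [("decision tree", dt), ("SVM", svm)]:
    scores = cross_val_score(clf, X, y, cv=5, scoring="balanced_accuracy")
    print(f"{name}: balanced accuracy {scores.mean():.2f} +/- {scores.std():.2f}")
```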