7 results for Tempered MCMC

in BORIS: Bern Open Repository and Information System - Bern - Switzerland


Relevance:

20.00%

Publisher:

Abstract:

Monte Carlo simulation was used to evaluate properties of a simple Bayesian MCMC analysis of the random effects model for single-group Cormack-Jolly-Seber capture-recapture data. The MCMC method is applied to the model via a logit link, so the parameters p and S are on a logit scale, where logit(S) is assumed to be generated from a normal distribution with mean μ and variance σ². Marginal prior distributions on logit(p) and μ were independent normal with mean zero and standard deviation 1.75 for logit(p) and 100 for μ, hence minimally informative. The marginal prior distribution on σ² was placed on τ² = 1/σ² as a gamma distribution with α = β = 0.001. The study design has 432 points spread over 5 factors: occasions (t), new releases per occasion (u), p, μ, and σ. At each design point 100 independent trials were completed (hence 43,200 trials in total), each with sample size n = 10,000 from the parameter posterior distribution. At 128 of these design points, comparisons are made to previously reported results from a method-of-moments procedure. We looked at properties of point and interval inference on μ and σ based on the posterior mean, median, and mode and the equal-tailed 95% credibility interval. Bayesian inference did very well for the parameter μ, but under the conditions used here, MCMC inference performance for σ was mixed: poor for sparse data (i.e., only 7 occasions) or σ = 0, but good when there were sufficient data and σ was not small.
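The prior structure described in the abstract is easy to sketch. The snippet below is a minimal illustration (not the authors' code) of drawing from those marginal priors and from the random-effects distribution for logit(S); NumPy is an assumed dependency, and μ = 0.5, σ = 0.3 is an arbitrary example design point:

```python
import numpy as np

rng = np.random.default_rng(42)

def inv_logit(z):
    """Map a logit-scale value back to a probability in (0, 1)."""
    return 1.0 / (1.0 + np.exp(-z))

n_draws = 100_000

# Marginal priors as described in the abstract (on the logit scale):
#   logit(p) ~ Normal(0, sd = 1.75)          -- minimally informative for p
#   mu       ~ Normal(0, sd = 100)
#   tau^2 = 1/sigma^2 ~ Gamma(0.001, 0.001)  -- vague precision prior
logit_p = rng.normal(0.0, 1.75, n_draws)
mu_draws = rng.normal(0.0, 100.0, n_draws)
precision = rng.gamma(shape=0.001, scale=1.0 / 0.001, size=n_draws)

# Random effect for survival at one arbitrary design point:
# logit(S) ~ Normal(mu = 0.5, sd = 0.3)
S = inv_logit(rng.normal(0.5, 0.3, n_draws))

# The sd-1.75 prior on logit(p) is roughly flat on the probability scale,
# which is why the abstract calls it minimally informative.
print(round(float(inv_logit(logit_p).mean()), 2))
```

The sd = 1.75 choice is a common trick: a wider normal on the logit scale would pile prior mass near 0 and 1 rather than being uninformative.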

Relevance:

10.00%

Publisher:

Abstract:

Background: The estimation of demographic parameters from genetic data often requires the computation of likelihoods. However, the likelihood function is computationally intractable for many realistic evolutionary models, and the use of Bayesian inference has therefore been limited to very simple models. The situation changed recently with the advent of Approximate Bayesian Computation (ABC) algorithms, which allow one to obtain parameter posterior distributions based on simulations, without requiring likelihood computations. Results: Here we present ABCtoolbox, a series of open-source programs to perform Approximate Bayesian Computation (ABC). It implements various ABC algorithms, including rejection sampling, MCMC without likelihood, a particle-based sampler, and ABC-GLM. ABCtoolbox is bundled with, but not limited to, a program that allows parameter inference in a population genetics context and the simultaneous use of different types of markers with different ploidy levels. In addition, ABCtoolbox can interact with most simulation and summary-statistics computation programs. The usability of ABCtoolbox is demonstrated by inferring the evolutionary history of two evolutionary lineages of Microtus arvalis. Using nuclear microsatellites and mitochondrial sequence data in the same estimation procedure enabled us to infer sex-specific population sizes and migration rates, and to find that males show smaller population sizes but much higher levels of migration than females. Conclusion: ABCtoolbox allows a user to perform all the necessary steps of a full ABC analysis: sampling parameters from prior distributions, simulating data, computing summary statistics, estimating posterior distributions, choosing among models, validating the estimation procedure, and visualizing the results.
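The rejection-sampling variant of ABC mentioned in the abstract is simple enough to sketch directly. The toy example below is my own illustration, not part of ABCtoolbox: it infers the mean of a normal model without evaluating a likelihood, by keeping prior draws whose simulated summary statistic (the sample mean) lands within a tolerance of the observed one:

```python
import numpy as np

rng = np.random.default_rng(0)

# "Observed" data and its summary statistic (here, the sample mean).
observed = rng.normal(3.0, 1.0, size=200)
s_obs = observed.mean()

def simulate(theta, n=200):
    """Simulate a dataset under the model N(theta, 1) and return its summary."""
    return rng.normal(theta, 1.0, size=n).mean()

# ABC rejection sampling:
#   1. draw theta from the prior (here Uniform(-10, 10)),
#   2. simulate data under theta and compute the summary,
#   3. accept theta if the summary is within `tolerance` of s_obs.
prior_draws = rng.uniform(-10.0, 10.0, size=50_000)
tolerance = 0.1
accepted = np.array(
    [t for t in prior_draws if abs(simulate(t) - s_obs) < tolerance]
)

# The accepted draws approximate the posterior of theta.
posterior_mean = accepted.mean()
```

Tightening `tolerance` makes the approximation closer to the true posterior at the cost of accepting fewer draws, which is the basic trade-off the more elaborate samplers (MCMC without likelihood, particle-based, ABC-GLM) are designed to soften.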

Relevance:

10.00%

Publisher:

Abstract:

Animal and early clinical studies of gene therapy for tissue ischaemia suggested that this approach might benefit patients with coronary artery disease not amenable to traditional revascularization. This enthusiasm was tempered by the subsequent disappointing results of randomized clinical trials, which led researchers to develop strategies using progenitor cells as an alternative means of improving collateral function. However, the recent publication of several randomized clinical trials reporting either negative or weakly positive results with this approach has led to questions regarding its effectiveness. Several factors need to be considered in explaining the discordance between the positive studies of such treatments in animals and the disappointing results seen in randomized patient trials. Aside from the practical issues of arteriogenic therapies, such as effective delivery, vascular remodelling is an extraordinarily complex process, and administering a single agent or cell in the hope that it would produce lasting physiological effects may be far too simplistic an approach. In addition, evidence now suggests that many traditional cardiovascular risk factors, such as age and hypercholesterolemia, may impair the host response not only to ischaemia but, critically, also to treatment. This review discusses the evidence and mechanisms for these observations and highlights future directions that might be taken in an effort to provide more effective therapies.

Relevance:

10.00%

Publisher:

Abstract:

One of the main problems of flood hazard assessment in ungauged or poorly gauged basins is the lack of runoff data. In an attempt to overcome this problem we have combined archival records, dendrogeomorphic time series and instrumental data (daily rainfall and discharge) from four ungauged and poorly gauged mountain basins in Central Spain with the aim of reconstructing and compiling information on 41 flash flood events since the end of the 19th century. Estimation of historical discharge and the incorporation of uncertainty for the at-site and regional flood frequency analysis were performed with an empirical rainfall–runoff assessment as well as stochastic and Bayesian Markov Chain Monte Carlo (MCMC) approaches. Results for each of the ungauged basins include flood frequency, severity, seasonality and triggers (synoptic meteorological situations). The reconstructed data series clearly demonstrates how uncertainty can be reduced by including historical information, but also points to the considerable influence of different approaches on quantile estimation. This uncertainty should be taken into account when these data are used for flood risk management.
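The Bayesian MCMC ingredient of such a flood frequency analysis can be sketched in a few lines. The example below is an illustration only, not the study's actual model or data: it fits a Gumbel distribution, a common choice for annual flood maxima, to a synthetic discharge record with a random-walk Metropolis sampler, and derives a posterior interval for the 100-year quantile; the Gumbel assumption, the flat priors, and the proposal scales are all my own choices:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic "annual maximum discharge" record (m^3/s); in a real study this
# would mix instrumental, archival, and dendrogeomorphic flood estimates.
data = rng.gumbel(loc=120.0, scale=40.0, size=60)

def log_likelihood(loc, scale, x):
    """Gumbel log-likelihood for annual maxima."""
    if scale <= 0:
        return -np.inf
    z = (x - loc) / scale
    return np.sum(-np.log(scale) - z - np.exp(-z))

# Random-walk Metropolis over (loc, scale) with flat priors.
n_iter = 20_000
samples = np.empty((n_iter, 2))
current = np.array([np.median(data), data.std()])
ll_current = log_likelihood(*current, data)

for i in range(n_iter):
    proposal = current + rng.normal(0.0, [5.0, 2.0])
    ll_prop = log_likelihood(*proposal, data)
    # Accept with probability min(1, exp(ll_prop - ll_current)).
    if np.log(rng.uniform()) < ll_prop - ll_current:
        current, ll_current = proposal, ll_prop
    samples[i] = current

burned = samples[5_000:]  # discard burn-in

# Posterior of the T-year quantile: Q_T = loc - scale * ln(-ln(1 - 1/T)).
T = 100
q100 = burned[:, 0] - burned[:, 1] * np.log(-np.log(1.0 - 1.0 / T))
lo, hi = np.percentile(q100, [2.5, 97.5])
```

Propagating the posterior draws through the quantile formula, rather than plugging in point estimates, is what lets this kind of analysis report the quantile uncertainty that the abstract emphasizes.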

Relevance:

10.00%

Publisher:

Abstract:

Treatment-resistant hypertension (TRH) affects between 3% and 30% of hypertensive patients, and its presence is associated with increased cardiovascular morbidity and mortality. Until recently, interest in these patients has been limited, because providing care for them is difficult and often frustrating. However, the arrival of new treatment options [i.e. catheter-based renal denervation (RDN) and baroreceptor stimulation] has revitalized interest in this topic. The very promising results of the initial uncontrolled studies on the blood pressure (BP)-lowering effect of RDN in TRH seemed to suggest that this intervention might represent an easy solution to a complex problem. Subsequently, however, data from controlled studies have tempered the enthusiasm of the medical community (and the industry). These new studies also emphasized some seminal aspects of this topic: (i) the key role of 24 h ambulatory BP and arterial stiffness measurement in identifying 'true' resistant patients; (ii) the high prevalence of secondary hypertension in this population; and (iii) the difficulty of identifying those patients who may profit from device-based interventions. Accordingly, the guidelines suggest referring patients with documented TRH to a hypertension specialist/centre for adequate work-up and treatment strategies. The aim of this review is to provide guidance for the cardiologist on how to identify patients with TRH and elucidate the prevailing underlying pathophysiological mechanism(s), to define a strategy for identifying patients with TRH who may benefit from device-based interventions and to discuss the results and limitations of these interventions, and finally to briefly summarize the different drug-based treatment strategies.