933 results for Bayesian mixture model


Relevance: 30.00%

Abstract:

We describe a method for evaluating an ensemble of predictive models given a sample of observations comprising the model predictions and the outcome event measured with error. Our formulation allows us to simultaneously estimate measurement error parameters, the true outcome (the gold standard) and a relative weighting of the predictive scores. We describe conditions necessary to estimate the gold standard and for these estimates to be calibrated, and detail how our approach is related to, but distinct from, standard model combination techniques. We apply our approach to data from a study evaluating a collection of BRCA1/BRCA2 gene mutation prediction scores. In this example, genotype is measured with error by one or more genetic assays. We estimate the true genotype for each individual in the dataset, the operating characteristics of the commonly used genotyping procedures and a relative weighting of the scores. Finally, we compare the scores against the gold-standard genotype and find that Mendelian scores are, on average, the most refined and best calibrated of those considered, and that the comparison is sensitive to measurement error in the gold standard.
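The core idea, jointly estimating a latent gold standard and the assays' operating characteristics, can be illustrated with a toy latent-class EM for several conditionally independent error-prone binary assays. This is a hypothetical sketch for intuition only, not the authors' Bayesian formulation; all function and variable names are invented here.

```python
import numpy as np

def em_latent_gold_standard(Y, n_iter=200):
    """EM for a latent-class model: each subject has an unobserved true binary
    status Z ~ Bernoulli(pi); assay j reports positive with probability se[j]
    when Z = 1 and negative with probability sp[j] when Z = 0, independently
    across assays given Z. Y is an (n subjects, k assays) 0/1 float array."""
    n, k = Y.shape
    pi, se, sp = 0.5, np.full(k, 0.8), np.full(k, 0.8)
    for _ in range(n_iter):
        # E-step: posterior probability that each subject is truly positive
        lik1 = pi * np.prod(se ** Y * (1 - se) ** (1 - Y), axis=1)
        lik0 = (1 - pi) * np.prod((1 - sp) ** Y * sp ** (1 - Y), axis=1)
        w = lik1 / (lik1 + lik0)
        # M-step: update prevalence, sensitivities and specificities
        pi = w.mean()
        se = (w[:, None] * Y).sum(axis=0) / w.sum()
        sp = ((1 - w)[:, None] * (1 - Y)).sum(axis=0) / (1 - w).sum()
    return pi, se, sp, w
```

With three or more conditionally independent assays this model is identifiable; the returned `w` plays the role of the estimated gold standard for each subject.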


The purpose of this study is to develop statistical methodology to facilitate indirect estimation of the concentration of antiretroviral drugs and viral loads in the prostate gland and the seminal vesicle. Differences in antiretroviral drug concentrations between these organs may lead to suboptimal concentrations in one gland compared to the other. Suboptimal levels of the antiretroviral drugs will fail to fully suppress the virus in that gland, leaving a source of sexually transmissible virus and increasing the chance of selecting for drug-resistant virus. This information may be useful in selecting an antiretroviral drug regimen that achieves optimal concentrations in most glands of the male genital tract. Using fractionally collected semen ejaculates, Lundquist (1949) measured levels of surrogate markers in each fraction that are uniquely produced by specific male accessory glands. To determine the original glandular concentrations of the surrogate markers, Lundquist solved a series of simultaneous linear equations. This method has several limitations: it does not yield a unique solution, it does not address measurement error, and it disregards inter-subject variability in the parameters. To cope with these limitations, we developed a mechanistic latent variable model based on the physiology of the male genital tract and the surrogate markers. We employ a Bayesian approach and perform a sensitivity analysis with regard to the distributional assumptions on the random effects and priors. The model and Bayesian approach are validated on experimental data in which the concentration of a drug should be (biologically) differentially distributed between the two glands. In this example, the Bayesian model-based conclusions are found to be robust to model specification, and this hierarchical approach leads to more scientifically valid conclusions than the original methodology. In particular, unlike existing methods, the proposed model-based approach was not affected by a common form of outliers.


The present distribution of freshwater fish in the Alpine region has been strongly affected by colonization events occurring after the last glacial maximum (LGM), some 20,000 years ago. We use a spatially explicit simulation framework to model and better understand their colonization dynamics in the Swiss Rhine basin. This approach is applied to the European bullhead (Cottus gobio), an ideal model organism for studying past demographic processes in fish since it has not been managed by humans. The molecular diversity of eight sampled populations is simulated and compared to data observed at six microsatellite loci under an approximate Bayesian computation framework to estimate the parameters of the colonization process. Our demographic estimates fit well with current knowledge about the biology of this species, but they suggest that the Swiss Rhine basin was colonized very recently, after the Younger Dryas, some 6600 years ago. We discuss the implications of this result, as well as the strengths and limits of the spatially explicit approach coupled with the approximate Bayesian computation framework.
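The approximate Bayesian computation step can be illustrated in miniature. The sketch below is a generic ABC rejection sampler, not the paper's spatially explicit pipeline; the toy problem (estimating a normal mean from its sample mean) and all names are assumptions made for illustration.

```python
import numpy as np

def abc_rejection(obs_stat, prior_draw, simulate, n_draws=20000, eps=0.05, seed=0):
    """Plain ABC rejection: draw parameters from the prior, simulate data,
    and keep draws whose simulated summary statistic falls within eps of the
    observed statistic. The kept draws approximate the posterior."""
    rng = np.random.default_rng(seed)
    kept = []
    for _ in range(n_draws):
        theta = prior_draw(rng)
        if abs(simulate(theta, rng) - obs_stat) <= eps:
            kept.append(theta)
    return np.array(kept)

# Toy usage: infer a normal mean from an observed sample mean of 0.5
post = abc_rejection(
    obs_stat=0.5,
    prior_draw=lambda rng: rng.uniform(-2, 2),
    simulate=lambda theta, rng: rng.normal(theta, 1, 50).mean(),
)
```

In the paper's setting, `simulate` would be the spatially explicit colonization simulator and the summary statistics would be computed from microsatellite diversity; the acceptance logic is the same.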


This dissertation presents experimental and numerical investigations of combustion initiation triggered by electrical-discharge-induced plasma within lean and dilute methane-air mixtures. This research topic is of interest for its potential to further the understanding and prediction of spark ignition quality in high-efficiency gasoline engines, which operate with lean and dilute fuel-air mixtures. The dissertation identifies the plasma-to-flame transition as the key process during the spark ignition event, yet it is also the most complicated and least understood. The investigation therefore focuses on the overlapping period when plasma and flame both exist in the system. The experimental study is divided into two parts. Part I focuses on the flame kernel resulting from the electrical discharge. A number of external factors are found to affect the growth of the flame kernel, resulting in complex correlations between discharge and flame kernel. Heat loss from the flame kernel to the cold ambient is found to be a dominant factor that quenches the flame kernel. The other experimental focus is the plasma channel. Electrical discharges into gases induce intense and highly transient plasma. Detailed observation of the size and contents of the discharge-induced plasma channel is performed. Given the complex correlations and the multi-disciplinary physical/chemical processes involved in the plasma-flame transition, the modeling principle adopted is to reproduce the transition in detail numerically with minimal analytical assumptions. Detailed measurements obtained from the experimental work enable a more accurate description of the initial reaction conditions. A novel spark source accounting for both energy and species deposition is defined in a justified manner; this is the key feature of the Ignition by Plasma (IBP) model. The results of the numerical simulation are intuitive, and the potential of numerical simulation to better resolve the complex spark ignition mechanism is demonstrated. Meanwhile, imperfections of the IBP model and the numerical simulation are identified and will require future attention.


Monte Carlo simulation was used to evaluate properties of a simple Bayesian MCMC analysis of the random effects model for single-group Cormack-Jolly-Seber capture-recapture data. The MCMC method is applied to the model via a logit link, so the parameters p and S are on a logit scale, where logit(S) is assumed to have, and is generated from, a normal distribution with mean μ and variance σ². Marginal prior distributions on logit(p) and μ were independent normal with mean zero and standard deviation 1.75 for logit(p) and 100 for μ, hence minimally informative. The marginal prior distribution on σ² was placed on τ² = 1/σ² as a gamma distribution with α = β = 0.001. The study design has 432 points spread over 5 factors: occasions (t), new releases per occasion (u), p, μ, and σ. At each design point 100 independent trials were completed (hence 43,200 trials in total), each with sample size n = 10,000 from the parameter posterior distribution. At 128 of these design points, comparisons are made to previously reported results from a method-of-moments procedure. We examined properties of point and interval inference on μ and σ based on the posterior mean, median, and mode and the equal-tailed 95% credibility interval. Bayesian inference did very well for the parameter μ, but under the conditions used here, MCMC inference performance for σ was mixed: poor for sparse data (i.e., only 7 occasions) or σ = 0, but good when there were sufficient data and σ was not small.
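The prior setup described above (a diffuse normal prior on μ and a Gamma(0.001, 0.001) prior on the precision 1/σ², a common parameterization) can be sketched with a toy random-walk Metropolis sampler for normally distributed logit-scale values. This is a minimal illustration of the (μ, σ) model only, not the full CJS likelihood; step sizes and names are arbitrary choices.

```python
import numpy as np

def metropolis_mu_sigma(x, n_samp=5000, seed=0):
    """Random-walk Metropolis for x_i ~ N(mu, sigma^2), with a N(0, 100^2)
    prior on mu and a Gamma(0.001, 0.001) prior on the precision
    tau = 1/sigma^2. Sampling is done on (mu, log sigma); a Jacobian term
    accounts for the change of variables from tau to log sigma."""
    rng = np.random.default_rng(seed)
    n = len(x)
    mu, log_s = x.mean(), np.log(x.std())

    def log_post(mu, log_s):
        s2 = np.exp(2.0 * log_s)
        tau = 1.0 / s2
        ll = -0.5 * n * np.log(s2) - 0.5 * np.sum((x - mu) ** 2) / s2
        lp_mu = -0.5 * (mu / 100.0) ** 2
        a = b = 0.001
        lp_tau = (a - 1.0) * np.log(tau) - b * tau
        jacobian = np.log(2.0 * tau)  # |d tau / d log sigma| = 2 tau
        return ll + lp_mu + lp_tau + jacobian

    draws = np.empty((n_samp, 2))
    lp = log_post(mu, log_s)
    for i in range(n_samp):
        mu_p, ls_p = mu + 0.1 * rng.normal(), log_s + 0.1 * rng.normal()
        lp_p = log_post(mu_p, ls_p)
        if np.log(rng.random()) < lp_p - lp:
            mu, log_s, lp = mu_p, ls_p, lp_p
        draws[i] = mu, np.exp(log_s)
    return draws
```

With informative data the posterior mean of μ tracks the sample mean closely, matching the study's finding that inference on μ is easy; inference on σ degrades as the data thin out.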


The significance of the adjacent cartilage in cartilage defect healing is not yet completely understood. Furthermore, it is unknown whether the adjacent cartilage can somehow be influenced into responding after cartilage damage. The present study was undertaken to investigate whether the adjacent cartilage can be better sustained after microfracturing in a cartilage defect model in the stifle joint of sheep using a transcutaneous treatment concept (Vetdrop®). Carprofen and chito-oligosaccharides were added either as single components or as a mixture to a vehicle suspension consisting of a herbal carrier oil in a water-in-oil phase. This mixture was administered onto the skin with the aid of a specific applicator over 6 weeks in 28 sheep, allocated into 6 different groups, that underwent microfracturing surgery on either the left or the right medial femoral condyle. Two groups served as controls and were either treated intravenously or sham treated with oxygen only. Sheep were sacrificed and their medial condyles histologically evaluated qualitatively and semi-quantitatively according to 4 different scoring systems (Mankin, ICRS, Little and O'Driscoll). The adjacent cartilage of animals of group 4, treated transcutaneously with vehicle, chito-oligosaccharides and carprofen, had better histological scores than all the other groups (Mankin 3.3±0.8, ICRS 15.7±0.7, Little 9.0±1.4). Complete defect filling was absent from the transcutaneous treatment groups. The experiment suggests that the adjacent cartilage is susceptible to treatment and that the combination of vehicle, chito-oligosaccharides and carprofen may sustain the adjacent cartilage during the recovery period.


The considerable search for synergistic agents in cancer research is motivated by the therapeutic benefits achieved by combining anti-cancer agents. Synergistic agents make it possible to reduce dosage while maintaining or enhancing a desired effect. Other favorable outcomes of synergistic agents include reduction in toxicity and minimizing or delaying drug resistance. Dose-response assessment and drug-drug interaction analysis play an important part in the drug discovery process; however, these analyses are often poorly done. This dissertation is an effort to notably improve dose-response assessment and drug-drug interaction analysis. The most commonly used method in published analyses is the Median-Effect Principle/Combination Index method (Chou and Talalay, 1984). This method leads to inefficiency by ignoring important sources of variation inherent in dose-response data and by discarding data points that do not fit the Median-Effect Principle. Previous work has shown that the conventional method yields a high rate of false positives (Boik, Boik, Newman, 2008; Hennessey, Rosner, Bast, Chen, 2010) and, in some cases, low power to detect synergy. There is a great need to improve the current methodology. We developed a Bayesian framework for dose-response modeling and drug-drug interaction analysis. First, we developed a hierarchical meta-regression dose-response model that accounts for various sources of variation and uncertainty and allows one to incorporate knowledge from prior studies into the current analysis, thus offering more efficient and reliable inference. Second, for cases in which parametric dose-response models do not fit the data, we developed a practical and flexible nonparametric regression method for meta-analysis of independently repeated dose-response experiments. Third, and lastly, we developed a method based on Loewe additivity that allows one to quantitatively assess the interaction between two agents combined at a fixed dose ratio. The proposed method gives a comprehensive and honest account of the uncertainty in drug interaction assessment. Extensive simulation studies show that the novel methodology improves the screening process for effective/synergistic agents and reduces the incidence of type I error. We consider an ovarian cancer cell line study that investigates the combined effect of DNA methylation inhibitors and histone deacetylation inhibitors in human ovarian cancer cell lines. The hypothesis is that the combination of DNA methylation inhibitors and histone deacetylation inhibitors will enhance antiproliferative activity in human ovarian cancer cell lines compared to treatment with either inhibitor alone. Applying the proposed Bayesian methodology, in vitro synergy was declared for the DNA methylation inhibitor 5-AZA-2'-deoxycytidine combined with one histone deacetylation inhibitor, suberoylanilide hydroxamic acid or trichostatin A, in the cell lines HEY and SKOV3. This suggests potential new epigenetic therapies for growth inhibition of ovarian cancer cells.
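For reference, the conventional Median-Effect/Combination Index calculation that the dissertation improves upon is simple to state. The sketch below assumes the median-effect parameters (Dm, m) have already been fitted for each single agent; CI < 1 is read as synergy, CI = 1 as Loewe additivity, and CI > 1 as antagonism.

```python
def median_effect_dose(fa, Dm, m):
    """Median-Effect Principle: fa/(1 - fa) = (D/Dm)^m, solved for the dose D
    of a single agent that produces affected fraction fa."""
    return Dm * (fa / (1.0 - fa)) ** (1.0 / m)

def combination_index(fa, d1, d2, Dm1, m1, Dm2, m2):
    """Chou-Talalay CI at effect level fa for the combination (d1, d2):
    CI = d1/Dx1 + d2/Dx2, where Dxi is the single-agent dose of drug i
    producing the same affected fraction fa."""
    return (d1 / median_effect_dose(fa, Dm1, m1)
            + d2 / median_effect_dose(fa, Dm2, m2))
```

The dissertation's criticism applies here: the point estimate carries no uncertainty, so a CI slightly below 1 can be declared "synergy" even when the data cannot support that claim; the proposed Bayesian framework quantifies that uncertainty instead.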


This dissertation explores phase I dose-finding designs in cancer trials from three perspectives: alternative Bayesian dose-escalation rules, a design based on a time-to-dose-limiting-toxicity (DLT) model, and a design based on a discrete-time multi-state (DTMS) model. We list alternative Bayesian dose-escalation rules and perform a simulation study for intra-rule and inter-rule comparisons based on two statistical models to identify the most appropriate rule under certain scenarios. We provide evidence that all the Bayesian rules outperform the traditional "3+3" design in the allocation of patients and selection of the maximum tolerated dose. The design based on a time-to-DLT model uses patients' DLT information over multiple treatment cycles in estimating the probability of DLT at the end of treatment cycle 1. Dose-escalation decisions are made whenever a cycle-1 DLT occurs, or two months after the previous checkpoint. Compared to a design based on a logistic regression model, the new design shows more safety benefits for trials in which more late-onset toxicities are expected. As a trade-off, the new design requires more patients on average. The design based on the DTMS model has three important attributes: (1) toxicities are categorized over a distribution of severity levels, (2) early toxicity may inform dose escalation, and (3) no suspension is required between accrual cohorts. The proposed model accounts for the difference in importance of the toxicity severity levels and for transitions between toxicity levels. We compare the operating characteristics of the proposed design with those of a similar design based on a fully-evaluated model that directly models the maximum observed toxicity level within each patient's entire assessment window. We describe settings in which, under comparable power, the proposed design shortens the trial. The proposed design offers more benefit than the alternative design as patient accrual becomes slower.
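As a minimal illustration of a Bayesian dose-escalation rule of the kind compared above, one can place an independent Beta prior on the DLT probability at each dose and assign the next cohort to the dose whose posterior mean is closest to a target toxicity rate. This toy rule is not one of the dissertation's models; the prior and target values below are assumptions for illustration.

```python
import numpy as np

def next_dose(tox, n, target=0.3, a0=0.5, b0=0.5):
    """tox[j] = observed DLTs and n[j] = patients treated at dose level j.
    Under independent Beta(a0, b0) priors, the posterior for dose j is
    Beta(a0 + tox[j], b0 + n[j] - tox[j]); the next cohort goes to the dose
    whose posterior mean DLT probability is closest to the target."""
    tox, n = np.asarray(tox, float), np.asarray(n, float)
    post_mean = (a0 + tox) / (a0 + b0 + n)
    return int(np.argmin(np.abs(post_mean - target)))
```

Unlike the "3+3" rule, this uses all accumulated toxicity data at every decision, which is the basic reason model-based rules allocate patients more efficiently.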


Background Different anesthesia regimes are commonly used in experimental models of cardiac arrest, but the effects of the various anesthetics on clinical outcome parameters are unknown. We conducted a study in which we subjected rats to cardiac arrest under medetomidine/ketamine or sevoflurane/fentanyl anesthesia. Methods Asystolic cardiac arrest for 8 minutes was induced in 73 rats with a mixture of potassium chloride and esmolol. Daily behavioral and neurological examination included the open field test (OFT), the tape removal test (TRT) and a neurodeficit score (NDS). Animals were randomized for sacrifice on day 2 or day 5, and brains were harvested for histology in the hippocampus cornu ammonis segment CA1. The inflammatory markers IL-6, TNF-α, MCP-1 and MIP-1α were assessed in cerebrospinal fluid (CSF). Proportions of survival were tested with Fisher's exact test, repeated measurements were assessed with Friedman's test, baseline values were tested using the Mann-Whitney U test, and differences in repeated measurements were compared between groups. Results In the 31 animals that survived beyond 24 hours, neither OFT, TRT nor NDS differed between the groups; histology was similar on day 2. On day 5, significantly more apoptosis in the CA1 segment of the hippocampus was found in the sevoflurane/fentanyl group. MCP-1 was higher on day 5 in the sevoflurane/fentanyl group (p = 0.04). All other cyto- and chemokines were below the detection threshold. Conclusion In our cardiac arrest model, neurological function was not influenced by the different anesthetic regimes; in contrast, anesthesia with sevoflurane/fentanyl resulted in increased CSF inflammation and histological damage at day 5 post cardiac arrest.


The nematode Caenorhabditis elegans is a well-known model organism used to investigate fundamental questions in biology. Motility assays of this small roundworm are designed to study the relationships between genes and behavior. Commonly, motility analysis is used to classify nematode movements and characterize them quantitatively. Over the past years, C. elegans' motility has been studied across a wide range of environments, including crawling on substrates, swimming in fluids, and locomoting through microfluidic substrates. However, each environment often requires customized image processing tools relying on heuristic parameter tuning. In the present study, we propose a novel Multi-Environment Model Estimation (MEME) framework for automated image segmentation that is versatile across various environments. The MEME platform is constructed around the concept of Mixture of Gaussian (MOG) models, where statistical models for both the background environment and the nematode appearance are explicitly learned and used to accurately segment a target nematode. Our method is designed to simplify the burden often imposed on users; here, only a single image which includes a nematode in its environment must be provided for model learning. In addition, our platform enables the extraction of nematode ‘skeletons’ for straightforward motility quantification. We test our algorithm on various locomotive environments and compare performances with an intensity-based thresholding method. Overall, MEME outperforms the threshold-based approach for the overwhelming majority of cases examined. Ultimately, MEME provides researchers with an attractive platform for C. elegans' segmentation and ‘skeletonizing’ across a wide range of motility assays.
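The Mixture of Gaussians idea behind MEME can be illustrated with a one-dimensional intensity-based sketch: fit a two-component GMM to pixel intensities by EM, then label each pixel with its most responsible component. This is a simplified stand-in for the paper's learned background and nematode-appearance models, not the MEME algorithm itself; all names are invented here.

```python
import numpy as np

def fit_gmm_1d(x, k=2, n_iter=100):
    """EM for a one-dimensional Gaussian mixture over pixel intensities;
    returns weights, means and variances. Means are initialized at evenly
    spaced quantiles of the data so the fit is deterministic."""
    mu = np.quantile(x, (np.arange(k) + 0.5) / k)
    var = np.full(k, x.var())
    w = np.full(k, 1.0 / k)
    for _ in range(n_iter):
        # E-step: responsibility of each component for each pixel intensity
        dens = w * np.exp(-0.5 * (x[:, None] - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)
        r = dens / dens.sum(axis=1, keepdims=True)
        # M-step: re-estimate mixture parameters from the responsibilities
        nk = r.sum(axis=0)
        w = nk / len(x)
        mu = (r * x[:, None]).sum(axis=0) / nk
        var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / nk
    return w, mu, var

def segment(image, w, mu, var):
    """Label each pixel with its most responsible component
    (e.g. nematode vs. background)."""
    x = image.ravel()[:, None]
    dens = w * np.exp(-0.5 * (x - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)
    return dens.argmax(axis=1).reshape(image.shape)
```

MEME's advantage over a fixed intensity threshold comes from learning such statistical models per environment; the same segmentation rule then transfers across crawling, swimming and microfluidic assays.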


Information on the relationship between cumulative fossil CO2 emissions and multiple climate targets is essential to design emission mitigation and climate adaptation strategies. In this study, the transient response of a climate or environmental variable per trillion tonnes of CO2 emissions, termed TRE, is quantified for a set of impact-relevant climate variables and from a large set of multi-forcing scenarios extended to year 2300 towards stabilization. An ~1000-member ensemble of the Bern3D-LPJ carbon-climate model is applied and model outcomes are constrained by 26 physical and biogeochemical observational data sets in a Bayesian, Monte Carlo-type framework. Uncertainties in TRE estimates include both scenario uncertainty and model response uncertainty. Cumulative fossil emissions of 1000 Gt C result in a global mean surface air temperature change of 1.9 °C (68% confidence interval (c.i.): 1.3 to 2.7 °C), a decrease in surface ocean pH of 0.19 (0.18 to 0.22), and a steric sea level rise of 20 cm (13 to 27 cm until 2300). Linearity between cumulative emissions and transient response is high for pH and reasonably high for surface air and sea surface temperatures, but less pronounced for changes in Atlantic meridional overturning, Southern Ocean and tropical surface water saturation with respect to biogenic structures of calcium carbonate, and carbon stocks in soils. The constrained model ensemble is also applied to determine the response to a pulse-like emission and in idealized CO2-only simulations. The transient climate response is constrained, primarily by long-term ocean heat observations, to 1.7 °C (68% c.i.: 1.3 to 2.2 °C) and the equilibrium climate sensitivity to 2.9 °C (2.0 to 4.2 °C). This is consistent with results from CMIP5 models but inconsistent with recent studies that relied on short-term air temperature data affected by natural climate variability.


OBJECTIVES Cerebral hypoxic-ischaemic injury following cardiac arrest is a devastating disease affecting thousands of patients each year. There is a complex interaction between post-resuscitation injury after whole-body ischaemia-reperfusion and cerebral damage which cannot be explored in in vitro systems alone; there is a need for animal models. In this study, we describe and evaluate the feasibility and efficiency of our simple rodent cardiac arrest model. METHODS Ten Wistar rats were subjected to 9 or 10 minutes of cardiac arrest. Cardiac arrest was induced with a mixture of the short-acting beta-blocking drug esmolol and potassium chloride. RESULTS All animals could be resuscitated within 1 minute and survived until day 5. General health score and neurobehavioural testing indicated substantial impairment after cardiac arrest, without differences between groups. Histological examination of the hippocampus CA1 segment, the most vulnerable segment of the cerebrum, demonstrated extensive damage in the cresyl violet staining, as well as in the Fluoro-Jade B and Iba-1 stainings, indicating recruitment of microglia after the hypoxic-ischaemic event. Again, there were no differences between the 9- and 10-minute cardiac arrest groups. DISCUSSION We were able to establish a simple and reproducible 9- and 10-minute rodent cardiac arrest model with a well-defined no-flow time. Extensive damage can be found in the hippocampus CA1 segment. The lack of difference between the 9- and 10-minute cardiac arrest times in the neuropsychological, open field and histological evaluations is mainly due to the small sample size.


Development of interfaces for sample introduction from high pressures is important for real-time online hyphenation of chromatographic and other separation devices with mass spectrometry (MS) or accelerator mass spectrometry (AMS). Momentum separators can reduce unwanted low-density gases and introduce the analyte into the vacuum. In this work, the axial jet separator, a new momentum interface, is characterized by theory and empirical optimization. The mathematical model describes the different axial penetration of the components of a jet-gas mixture and explains the empirical results for injections of CO2 in helium into MS and AMS instruments. We show that the performance of the new interface is sensitive to the nozzle size, in good qualitative agreement with the mathematical model. Smaller nozzle sizes are preferable due to their higher inflow capacity. The CO2 transmission efficiency of the interface into a MS instrument is ~14% (CO2/helium separation factor of 2.7). The interface receives and delivers flows of ~17.5 mL/min and ~0.9 mL/min, respectively. For the interfaced AMS instrument, the ionization and overall efficiencies are 0.7-3% and 0.1-0.4%, respectively, for CO2 amounts of 4-0.6 µg C, which is only slightly lower compared to conventional systems using intermediate trapping. The ionization efficiency depends on the carbon mass flow in the injected pulse and is suppressed at high CO2 flows. Relative to a conventional jet separator, the transmission efficiency of the axial jet separator is lower, but its performance is less sensitive to misalignments.


BACKGROUND The noble gas xenon is considered a neuroprotective agent, but availability of the gas is limited. Studies on neuroprotection with the abundant noble gases helium and argon have demonstrated mixed results, and data regarding neuroprotection after cardiac arrest are scant. We tested the hypothesis that administration of 50% helium or 50% argon for 24 h after resuscitation from cardiac arrest improves clinical and histological outcome in our 8 min rat cardiac arrest model. METHODS Forty animals had cardiac arrest induced with intravenous potassium/esmolol and were randomized to post-resuscitation ventilation with either helium/oxygen, argon/oxygen or air/oxygen for 24 h. Eight additional animals without cardiac arrest served as reference; these animals were not randomized and not included in the statistical analysis. The primary outcome was assessment of neuronal damage in histology of region I of the hippocampus proper (CA1) in those animals surviving until day 5. The secondary outcome was evaluation of neurobehaviour by daily testing with a Neurodeficit Score (NDS), the Tape Removal Test (TRT), a simple vertical pole test (VPT) and the Open Field Test (OFT). Because of the non-parametric distribution of the data, the histological assessments were compared with the Kruskal-Wallis test. The treatment effect in repeatedly measured assessments was estimated with a linear regression with clustered robust standard errors (SE), where normality is less important. RESULTS Twenty-nine out of 40 rats survived until day 5, with significant initial neurobehavioural deficits but rapid improvement within all groups randomized to cardiac arrest. There were no statistically significant differences between groups in either the histological or the neurobehavioural assessments. CONCLUSIONS The replacement of air with either helium or argon in a 50:50 air/oxygen mixture for 24 h did not improve histological or clinical outcome in rats subjected to 8 min of cardiac arrest.


This paper uses Bayesian vector autoregressive models to examine the usefulness of leading indicators in predicting US home sales. The benchmark Bayesian model includes home sales, the price of homes, the mortgage rate, real personal disposable income, and the unemployment rate. We evaluate the forecasting performance of six alternative leading indicators by adding each, in turn, to the benchmark model. Out-of-sample forecast performance over three periods shows that the model that includes building permits authorized consistently produces the most accurate forecasts. Thus, the intention to build in the future provides good information with which to predict home sales. Another finding suggests that leading indicators with longer leads outperform the short-leading indicators.
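A minimal sketch of the kind of model used here is a one-lag VAR with Minnesota-style shrinkage toward a random walk, implemented as a ridge posterior mean. This is an illustrative simplification (no intercepts, a single shrinkage scalar, one lag), not the paper's benchmark specification; all names are invented for the sketch.

```python
import numpy as np

def bvar1_fit(Y, lam=0.1):
    """One-lag VAR fitted with shrinkage: coefficients are pulled toward a
    random walk (identity own-lag matrix, zero cross-lags), which is the
    ridge/posterior-mean solution under a conjugate normal prior with
    precision proportional to lam. Y has shape (T periods, n variables)."""
    X, Z = Y[1:], Y[:-1]           # regress Y[t] on Y[t-1]
    n = Y.shape[1]
    prior_mean = np.eye(n)         # random-walk prior: y_t = y_{t-1} + noise
    A = np.linalg.solve(Z.T @ Z + lam * np.eye(n),
                        Z.T @ X + lam * prior_mean)
    return A                       # one-step rule: y_next = y_current @ A

def forecast(Y, A, steps):
    """Iterate the fitted one-step rule forward from the last observation."""
    y, out = Y[-1], []
    for _ in range(steps):
        y = y @ A
        out.append(y)
    return np.array(out)
```

In the paper's application, the variables would include home sales, home prices, the mortgage rate, income, unemployment and a candidate leading indicator such as building permits; out-of-sample forecasts from models with and without each indicator are then compared.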