933 results for Bayesian mixture model


Relevance: 30.00%

Abstract:

When conducting a randomized comparative clinical trial, ethical, scientific, or economic considerations often motivate the use of interim decision rules after successive groups of patients have been treated. These decisions may pertain to the comparative efficacy or safety of the treatments under study, cost considerations, the desire to accelerate the drug evaluation process, or the likelihood of therapeutic benefit for future patients. At each interim decision, an important question is whether patient enrollment should continue or be terminated, either because of a high probability that one treatment is superior to the other or because of a low probability that the experimental treatment will ultimately prove superior. The use of frequentist group sequential decision rules has become routine in the conduct of phase III clinical trials. In this dissertation, we present a new Bayesian decision-theoretic approach to designing a randomized group sequential clinical trial, focusing on two-arm trials with time-to-failure outcomes. Forward simulation is used to obtain optimal decision boundaries for each of a set of possible models. At each interim analysis, we use Bayesian model selection to adaptively choose the model with the largest posterior probability of being correct, and we then make the interim decision based on the boundaries that are optimal under the chosen model. We provide a simulation study comparing this method, which we call Bayesian Doubly Optimal Group Sequential (BDOGS), to corresponding frequentist designs using either O'Brien-Fleming (OF) or Pocock boundaries, as obtained from EaSt 2000. Our simulation results show that, over a wide variety of cases, BDOGS either performs at least as well as both OF and Pocock or on average requires a much smaller trial.
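The interim model-selection step described above can be illustrated with a minimal sketch. Purely for illustration, the candidate "models" here are two fixed binomial response rates with a uniform prior, not the dissertation's actual failure-time models:

```python
# Toy Bayesian model selection: pick the candidate model with the
# largest posterior probability given the interim data. The two
# fixed-rate models, the prior, and the data are illustrative
# assumptions, not part of the BDOGS design itself.
from math import comb

def posterior_model_probs(successes, n, rates, priors):
    """P(model | data) for binomial data under fixed-rate models."""
    likes = [comb(n, successes) * r**successes * (1 - r)**(n - successes)
             for r in rates]
    weighted = [l * p for l, p in zip(likes, priors)]
    total = sum(weighted)
    return [w / total for w in weighted]

# 14 responses in 20 patients: which candidate rate is more probable?
probs = posterior_model_probs(14, 20, rates=[0.5, 0.7], priors=[0.5, 0.5])
best = max(range(len(probs)), key=probs.__getitem__)
```

In the actual design, the interim decision would then use the boundaries that are optimal under model `best`.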

Relevance: 30.00%

Abstract:

The joint modeling of longitudinal and survival data is a new approach in many applications, such as HIV studies, cancer vaccine trials, and quality-of-life studies. There have been recent methodological developments for each component of the joint model, as well as for the statistical processes that link them. Among these, second-order polynomial random effects models and linear mixed effects models are the most commonly used for the longitudinal trajectory function. In this study, we first relax the parametric constraints of polynomial random effects models by using Dirichlet process priors, and we consider three longitudinal markers rather than a single marker in one joint model. Second, we use a linear mixed effects model for the longitudinal process in a joint model analyzing the three markers. These methods were applied to the primary biliary cirrhosis (PBC) sequential data, collected from a clinical trial of PBC of the liver conducted between 1974 and 1984 at the Mayo Clinic. The effects of three longitudinal markers, (1) total serum bilirubin, (2) serum albumin, and (3) serum glutamic-oxaloacetic transaminase (SGOT), on patients' survival were investigated. The proportion of treatment effect was also studied using the proposed joint modeling approaches.

Based on the results, we conclude that the proposed approaches yield a better fit to the data and less biased parameter estimates for the trajectory functions than previous methods. Model fit is also improved by considering three longitudinal markers instead of only one. The analysis of the proportion of treatment effect from these joint models supports the same conclusion as the final model of Fleming and Harrington (1991): bilirubin and albumin together have a stronger impact in predicting patients' survival and can serve as surrogate endpoints for treatment.
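The linear mixed-effects trajectory idea can be sketched with a toy simulation, assuming invented population effects, random-effect variances, and visit times rather than anything estimated from the PBC data: each subject's marker follows its own line around a population trend, and averaging per-subject least-squares slopes recovers the population slope.

```python
# Minimal sketch of a linear mixed-effects longitudinal trajectory:
# y_ij = (b0 + u_i) + (b1 + v_i) * t_j + noise. All numbers are
# illustrative assumptions, not values from the study.
import random

random.seed(42)
BETA0, BETA1 = 2.0, -0.3        # population intercept and yearly slope
times = [0, 1, 2, 3, 4]         # visit times in years

def simulate_subject():
    u = random.gauss(0, 0.5)    # random intercept
    v = random.gauss(0, 0.05)   # random slope
    return [(BETA0 + u) + (BETA1 + v) * t + random.gauss(0, 0.1)
            for t in times]

def ols_slope(ts, ys):
    tbar = sum(ts) / len(ts)
    ybar = sum(ys) / len(ys)
    num = sum((t - tbar) * (y - ybar) for t, y in zip(ts, ys))
    den = sum((t - tbar) ** 2 for t in ts)
    return num / den

slopes = [ols_slope(times, simulate_subject()) for _ in range(200)]
mean_slope = sum(slopes) / len(slopes)   # should be near BETA1
```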

Relevance: 30.00%

Abstract:

This study investigates a theoretical model in which a longitudinal process, which is a stationary Markov chain, and a Weibull survival process share a bivariate random effect. A quality-of-life-adjusted survival is calculated as a weighted sum of survival time. Theoretical values of the population mean adjusted survival under this model are computed numerically; the parameters of the bivariate random effect significantly affect these values. Maximum likelihood and Bayesian methods are applied to simulated data to estimate the model parameters. Based on the parameter estimates, the predicted population mean adjusted survival can then be calculated numerically and compared with the theoretical values. The Bayesian and maximum likelihood methods provide parameter estimates and population mean predictions with comparable accuracy; however, the Bayesian method suffers from poor convergence due to autocorrelation and inter-variable correlation.
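The weighted-sum construction of quality-adjusted survival can be sketched as follows; the health states, utility weights, and the (here pre-simulated) Markov-chain path are illustrative assumptions:

```python
# Quality-adjusted survival as a utility-weighted sum of the time
# spent in each health state along a Markov-chain path. States and
# utilities are invented for illustration.
UTILITY = {"healthy": 1.0, "impaired": 0.6, "severe": 0.3}

def adjusted_survival(path, interval=1.0):
    """Sum of (state utility) * (time spent in that state)."""
    return sum(UTILITY[state] * interval for state in path)

# A patient observed yearly over 5 years of survival:
path = ["healthy", "healthy", "impaired", "impaired", "severe"]
qas = adjusted_survival(path)   # 1 + 1 + 0.6 + 0.6 + 0.3 = 3.5
```

Averaging `qas` over many simulated paths and random effects would give the population mean adjusted survival compared against the theoretical values in the study.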

Relevance: 30.00%

Abstract:

This dissertation examined body mass index (BMI) growth trajectories and the effects of gender, ethnicity, dietary intake, and physical activity (PA) on those trajectories among 3rd to 12th graders (9-18 years of age). Growth curve model analysis was performed using data from the Child and Adolescent Trial for Cardiovascular Health (CATCH) study. The study population included 2909 students followed from grades 3 to 12. The main outcome was BMI at grades 3, 4, 5, 8, and 12.

The results revealed that BMI growth differed across two distinct developmental periods, childhood and adolescence. The rate of BMI growth was faster in middle childhood (9-11 years old, grades 3-5) than in adolescence (11-18 years old, grades 5-12). Students with higher BMI at 3rd grade (baseline) had faster rates of BMI growth. Three groups of students with distinct BMI growth trajectories were identified: high, average, and low.

Black and Hispanic children were more likely to be in the groups with higher baseline BMI and faster rates of BMI growth over time. The effects of gender and ethnicity on BMI growth differed across the three groups: the effects of ethnicity weakened as the children aged, and the effects of gender were attenuated in the groups with a large proportion of black and Hispanic children, i.e., the "high" and "average" BMI trajectory groups. After controlling for gender, ethnicity, and age at baseline, in the high BMI trajectory group the rate of yearly BMI growth in middle childhood increased by 0.102 for every 500 kcal increase in energy intake (p = 0.049). No significant effects of the percentage of energy from total fat or saturated fat on BMI growth were found. Baseline BMI increased by 0.041 for every 30-minute increase in moderate-to-vigorous PA (MVPA) in the low BMI trajectory group, while baseline BMI decreased by 0.345 for every 30-minute increase in vigorous PA (VPA) in the high BMI trajectory group.

Childhood overweight and obesity interventions should start at the earliest possible age, prior to 3rd grade, and continue through grade school. Interventions should focus on all children, but especially black and Hispanic children, who are most likely to be at high risk. Promoting VPA early in childhood is important for preventing overweight and obesity among children and adolescents. Interventions should target total energy intake rather than only the percentage of energy from total fat or saturated fat.

Relevance: 30.00%

Abstract:

In geographical epidemiology, maps of disease rates and disease risk provide a spatial perspective for researching disease etiology. For rare diseases, or when the population base is small, the rate and risk estimates may be unstable. Empirical Bayes (EB) methods have been used to spatially smooth the estimates by permitting an area's estimate to "borrow strength" from its neighbors. Such EB methods include the use of a Gamma model, a James-Stein estimator, and a conditional autoregressive (CAR) process. A fully Bayesian analysis of the CAR process is proposed. One advantage of this fully Bayesian analysis is that it can be implemented simply by repeated sampling from the posterior densities; a Markov chain Monte Carlo technique such as the Gibbs sampler is not necessary. Direct resampling from the posterior densities provides exact small-sample inferences instead of the approximate asymptotic analyses of maximum likelihood methods (Clayton & Kaldor, 1987). Further, the proposed CAR model allows covariates to be included in the model. A simulation demonstrates the effect of sample size on the fully Bayesian analysis of the CAR process. The methods are applied to lip cancer data from Scotland, and the results are compared.
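The "borrowing strength" idea behind the Gamma model mentioned above can be sketched in a few lines: with a Gamma(a, b) prior on an area's rate and a Poisson likelihood for its observed count y given population n, the posterior mean rate is (y + a) / (n + b), which shrinks small-area estimates toward the prior mean a / b. The counts and prior below are invented for illustration.

```python
# Gamma-Poisson empirical Bayes smoothing of area-level disease rates.
# Small areas are shrunk strongly toward the prior mean; large areas
# keep a rate close to their raw estimate. All numbers are invented.
def eb_smoothed_rate(y, n, a, b):
    """Posterior mean rate under Gamma(a, b) prior, Poisson(n * rate) data."""
    return (y + a) / (n + b)

a, b = 2.0, 20000.0                 # prior mean rate a/b = 1 per 10,000
small = eb_smoothed_rate(y=2, n=1000, a=a, b=b)       # raw rate 20 per 10,000
large = eb_smoothed_rate(y=200, n=100000, a=a, b=b)   # raw rate 20 per 10,000
# Both raw rates are identical, but the small area is pulled much
# closer to the prior mean than the large one.
```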

Relevance: 30.00%

Abstract:

In regression analysis, covariate measurement error occurs in many applications; the error-prone covariates are often referred to as latent variables. In this study, we extended the work of Chan et al. (2008) on recovering the latent slope in a simple regression model to the multiple regression setting. We present an approach that applies the Monte Carlo method within the Bayesian framework to a parametric regression model with measurement error in an explanatory variable. The proposed estimator uses the conditional expectation of the latent slope given the observed outcome and surrogate variables in the multiple regression model. A simulation study shows that the method produces an efficient estimator in the multiple regression model, especially when the measurement error variance of the surrogate variable is large.
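The conditional-expectation idea at the heart of this kind of correction can be sketched in its simplest normal-theory form, assuming (for illustration only) a scalar latent covariate X ~ N(mu, s2x) observed through a surrogate W = X + U with U ~ N(0, s2u): the best guess of X given W shrinks W toward the population mean by the reliability ratio.

```python
# E[X | W] for a normal latent covariate observed with additive
# normal measurement error. The numerical values are illustrative.
def e_latent_given_surrogate(w, mu, s2x, s2u):
    """Shrink surrogate w toward mu by the reliability s2x / (s2x + s2u)."""
    k = s2x / (s2x + s2u)
    return mu + k * (w - mu)

# Equal signal and error variance: the estimate lands halfway between
# the surrogate value (3.0) and the population mean (1.0).
x_hat = e_latent_given_surrogate(w=3.0, mu=1.0, s2x=1.0, s2u=1.0)
```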

Relevance: 30.00%

Abstract:

Treating patients with combined agents is a growing trend in cancer clinical trials, and evaluating the synergism of multiple drugs is often the primary motivation for such drug-combination studies. Focusing on drug-combination studies in early-phase clinical trials, our research comprises three parts: (1) we conduct a comprehensive comparison of four dose-finding designs in the two-dimensional toxicity probability space and propose using the Bayesian model averaging method to overcome the arbitrariness of the model specification and enhance the robustness of the design; (2) motivated by a recent drug-combination trial at MD Anderson Cancer Center with a continuous-dose standard-of-care agent and a discrete-dose investigational agent, we propose a two-stage Bayesian adaptive dose-finding design based on an extended continual reassessment method; and (3) by combining phase I and phase II clinical trials, we propose an extension of a single-agent dose-finding design, modeling time-to-event toxicity and efficacy to direct dose finding in two-dimensional drug-combination studies. We conduct extensive simulation studies to examine the operating characteristics of these designs and demonstrate their good performance in various practical scenarios.
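A single-agent continual reassessment method (CRM) step, the building block extended in part (2) above, can be sketched with a one-parameter power model p_d = skeleton_d ** exp(theta) and a discrete prior grid over theta. The skeleton, prior grid, target, and toy data below are all assumptions for illustration, not the trial's actual specification.

```python
# Toy CRM update: posterior over theta from observed toxicities, then
# posterior mean toxicity per dose, then pick the dose closest to the
# target toxicity rate. Skeleton, grid, and data are invented.
from math import exp

skeleton = [0.05, 0.10, 0.20, 0.35]   # prior guesses of toxicity per dose
target = 0.20
thetas = [-1.0, -0.5, 0.0, 0.5, 1.0]  # discrete uniform prior grid
# (dose index, had_toxicity) for each treated patient:
data = [(1, 0), (1, 0), (2, 0), (2, 1), (2, 0)]

def post_tox_by_dose(skeleton, thetas, data):
    weights = []
    for th in thetas:
        like = 1.0
        for d, y in data:
            p = skeleton[d] ** exp(th)
            like *= p if y else (1 - p)
        weights.append(like)
    z = sum(weights)
    weights = [w / z for w in weights]
    return [sum(w * skeleton[d] ** exp(th)
                for w, th in zip(weights, thetas))
            for d in range(len(skeleton))]

post = post_tox_by_dose(skeleton, thetas, data)
next_dose = min(range(len(post)), key=lambda d: abs(post[d] - target))
```

After each cohort, the posterior is refreshed and the next cohort is assigned to `next_dose`; the two-dimensional designs in the text extend this updating to a grid of dose pairs.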

Relevance: 30.00%

Abstract:

My dissertation focuses mainly on Bayesian adaptive designs for phase I and phase II clinical trials. It includes three specific topics: (1) proposing a novel two-dimensional dose-finding algorithm for biological agents, (2) developing Bayesian adaptive screening designs to provide more efficient and ethical clinical trials, and (3) incorporating missing late-onset responses into early stopping decisions.

Treating patients with novel biological agents is becoming a leading trend in oncology. Unlike cytotoxic agents, for which toxicity and efficacy monotonically increase with dose, biological agents may exhibit non-monotonic patterns in their dose-response relationships. Using a trial with two biological agents as an example, we propose a phase I/II trial design to identify the biologically optimal dose combination (BODC), defined as the dose combination of the two agents with the highest efficacy and tolerable toxicity. A change-point model is used to reflect the fact that the dose-toxicity surface of the combined agents may plateau at higher dose levels, and a flexible logistic model is proposed to accommodate a possibly non-monotonic dose-efficacy relationship. During the trial, we continuously update the posterior estimates of toxicity and efficacy and assign patients to the most appropriate dose combination, using a novel dose-finding algorithm that encourages sufficient exploration of untried dose combinations in the two-dimensional space. Extensive simulation studies show that the proposed design has desirable operating characteristics in identifying the BODC under various patterns of dose-toxicity and dose-efficacy relationships.

Trials of combination therapies for the treatment of cancer are playing an increasingly important role in the battle against this disease. To more efficiently handle the large number of combination therapies that must be tested, we propose a novel Bayesian phase II adaptive screening design to simultaneously select among possible treatment combinations involving multiple agents. Our design formulates the selection procedure as a Bayesian hypothesis testing problem in which the superiority of each treatment combination is equated to a single hypothesis. During the trial, we use the current posterior probabilities of all hypotheses to adaptively allocate patients to treatment combinations. Simulation studies show that the proposed design substantially outperforms the conventional multi-arm balanced factorial trial design: it yields a significantly higher probability of selecting the best treatment, allocates substantially more patients to efficacious treatments, and provides higher power to identify the best treatment at the end of the trial. The design is most appropriate for trials that combine multiple agents and screen for the efficacious combination to be investigated further.

Phase II studies are usually single-arm trials conducted to test the efficacy of experimental agents and to decide whether an agent is promising enough to be sent to a phase III trial. Interim monitoring is employed to stop a trial early for futility, to avoid assigning an unacceptable number of patients to inferior treatments. We propose a Bayesian single-arm phase II design with continuous monitoring for estimating the response rate of the experimental drug. To address the issue of late-onset responses, we use a piecewise exponential model to estimate the hazard function of the time-to-response data and handle the missing responses using a multiple imputation approach. We evaluate the operating characteristics of the proposed method through extensive simulation studies and show that it reduces the total trial duration and yields desirable operating characteristics for different physician-specified lower bounds of the response rate and different true response rates.

Relevance: 30.00%

Abstract:

There are two practical challenges in phase I clinical trial conduct: lack of transparency to physicians and late-onset toxicity. In my dissertation, Bayesian approaches are used to address these two problems in clinical trial design. The proposed simple optimal designs cast the dose-finding problem as a decision-making process for dose escalation and de-escalation, and they minimize the incorrect-decision error rate in finding the maximum tolerated dose (MTD). For the late-onset toxicity problem, a Bayesian adaptive dose-finding design for drug combinations is proposed. The dose-toxicity relationship is modeled using the Finney model; the unobserved delayed toxicity outcomes are treated as missing data, and Bayesian data augmentation is employed to handle the resulting missingness. Extensive simulation studies examine the operating characteristics of the proposed designs and demonstrate their good performance in various practical scenarios.

Relevance: 30.00%

Abstract:

Early-phase clinical trial designs have long been a focus of interest for clinicians and statisticians working in oncology. Several standard phase I and phase II designs have been widely implemented in medical practice. For phase I, the most commonly used methods are the 3+3 design and the continual reassessment method (CRM); a newly developed Bayesian model-based design, the modified toxicity probability interval (mTPI), is now used by an increasing number of hospitals and pharmaceutical companies. The advantages and disadvantages of these three leading phase I designs are discussed in this work, and their performances were compared using simulated data. The mTPI design exhibited superior performance in most scenarios compared with the 3+3 and CRM designs.

The next major part of my work proposes an innovative seamless phase I/II design that allows clinicians to conduct phase I and phase II clinical trials simultaneously, with a Bayesian framework implemented throughout. The phase I portion of the design adopts the mTPI method, with the addition of a futility rule that monitors the efficacy of the tested drug. Dose graduation rules allow doses to move forward from the phase I portion of the study to the phase II portion without interrupting the ongoing phase I dose-finding schema. Once a dose has graduated to phase II, adaptive randomization is used to allocate patients among treatment arms, with the intention of assigning more patients to the more promising dose(s). Simulations were performed to compare this seamless phase I/II design with a recently published phase I/II design as well as with the conventional phase I and phase II designs. The results indicated that the seamless design outperforms the competing methods in most scenarios, with higher power and a smaller required sample size, and that it significantly reduces the overall study time.

Like other early-phase clinical trial designs, the proposed seamless phase I/II design requires that the efficacy and safety outcomes be observable within a short time frame. This limitation can be overcome by using validated surrogate markers for the efficacy and safety endpoints.
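For reference, the classical 3+3 rule used as a comparator above is a simple lookup on the number of dose-limiting toxicities (DLTs) among the 3 or 6 patients treated at the current dose. This is the standard textbook rule, not any specific trial's protocol:

```python
# The textbook 3+3 escalation rule: decide the next action at the
# current dose from the DLT count among 3 or 6 treated patients.
def three_plus_three(n_treated, n_dlt):
    """Return 'escalate', 'expand', or 'deescalate' for the current dose."""
    if n_treated == 3:
        if n_dlt == 0:
            return "escalate"
        if n_dlt == 1:
            return "expand"        # treat 3 more patients at the same dose
        return "deescalate"
    if n_treated == 6:
        return "escalate" if n_dlt <= 1 else "deescalate"
    raise ValueError("3+3 decisions are made after 3 or 6 patients")
```

Model-based designs such as CRM and mTPI replace this fixed lookup with decisions driven by a posterior over the dose-toxicity curve, which is the source of the performance gains reported above.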

Relevance: 30.00%

Abstract:

Existing models for estimating the costs of oil spills at sea are based on historical data and usually lack a systematic approach. This makes them reactive and limits their ability to forecast how changes in the oil-combating fleet or the location of a spill affect spill costs. In this paper, we take a step toward a probabilistic and systematic model for estimating the costs of clean-up operations in the Gulf of Finland. For this purpose, we combine expert knowledge with the available data and information from the literature, integrating them into a single framework using a Bayesian belief network. Because of the lack of data, we validate the model by comparing its results with those of existing models, with which we find good agreement. We anticipate that the presented model can contribute to cost-effective optimization of the oil-combating fleet for the Gulf of Finland and can facilitate the estimation of accident consequences within the framework of formal safety assessment (FSA).

Relevance: 30.00%

Abstract:

We investigate changes in the delivery and oceanic transport of Amazon sediments related to terrestrial climate variations over the last 250 ka. We present high-resolution geochemical records from four marine sediment cores located between 5 and 12° N along the northern South American margin. The Amazon River is the sole source of terrigenous material for the sites at 5 and 9° N, while the core at 12° N receives a mixture of Amazon and Orinoco detrital particles. Using an endmember unmixing model, we estimated the relative proportions of Amazon Andean material ("%-Andes", at 5 and 9° N) and of Amazon material ("%-Amazon", at 12° N) within the terrigenous fraction. The %-Andes and %-Amazon records exhibit significant precessional variations over the last 250 ka that are more pronounced during interglacials than during glacial periods. High %-Andes values observed during periods of high austral summer insolation reflect the increased delivery of suspended sediments by Andean tributaries and enhanced Amazonian precipitation, in agreement with western Amazonian speleothem records. Increased Amazonian rainfall reflects the intensification of the South American monsoon in response to an enhanced land-ocean thermal gradient and moisture convergence. However, the low %-Amazon values obtained at 12° N during the same periods seem to contradict the increased delivery of Amazon sediments. We propose that reorganizations of surface ocean currents modulate the northwestward transport of Amazon material. In agreement with published records, the seasonal North Brazil Current retroflection is intensified (or prolonged in duration) during cold substages of the last 250 ka (which correspond to intervals of high DJF or low JJA insolation) and deflects the Amazon sediment and freshwater plume eastward.
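The endmember unmixing step can be sketched in its simplest two-endmember, linear form: if a sample's composition is a linear mixture of two endmembers, the mixing fraction follows from a least-squares inversion of the mixing equation over the measured tracers. The tracer values below are invented for illustration and are not the paper's geochemical data.

```python
# Two-endmember linear unmixing: solve sample = f*em1 + (1-f)*em2
# for the fraction f by least squares over several tracers.
# All tracer values are illustrative assumptions.
def endmember_fraction(sample, em1, em2):
    """Least-squares fraction f of em1 in a two-endmember mixture."""
    num = sum((s - b) * (a - b) for s, a, b in zip(sample, em1, em2))
    den = sum((a - b) ** 2 for a, b in zip(em1, em2))
    return num / den

andes = [10.0, 2.0]     # hypothetical tracer ratios, Andean endmember
lowland = [4.0, 1.0]    # hypothetical lowland endmember
sample = [7.0, 1.5]     # exactly halfway between the two endmembers
f_andes = endmember_fraction(sample, andes, lowland)   # -> 0.5
```

The "%-Andes" and "%-Amazon" records in the text are, conceptually, such fractions computed downcore for each site.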

Relevance: 30.00%

Abstract:

This paper estimates the elasticity of labor productivity with respect to employment density, a widely used measure of the agglomeration effect, in the Yangtze River Delta, China. A spatial Durbin model is presented that makes explicit, in a very simple way, the influences of spatial dependence and endogeneity bias. Results of Bayesian estimation using data for 2009 indicate that productivity is influenced by factors correlated with density rather than by density itself, and that spatial spillovers of these agglomeration factors play a significant role. These findings are consistent with those of Ke (2010) and Artis et al. (2011), which suggest the importance of taking into account spatial dependence and hitherto omitted variables.

Relevance: 30.00%

Abstract:

This paper estimates the impact of industrial agglomeration on firm-level productivity in Chinese manufacturing sectors. To account for spatial autocorrelation across regions, we formulate a hierarchical spatial model at the firm level and develop a Bayesian estimation algorithm, and we use a Bayesian instrumental-variables approach to address the endogeneity bias of agglomeration. Robust to these potential biases, our results show that agglomeration of the same industry (i.e., localization) has a productivity-boosting effect, whereas agglomeration of the urban population (i.e., urbanization) has no such effect. Additionally, the localization effects increase with the educational level of employees and with the share of intermediate inputs in gross output. These results suggest that agglomeration externalities occur through knowledge spillovers and input sharing among firms producing similar manufactures.

Relevance: 30.00%

Abstract:

Learning the structure of a graphical model from data is a common task in a wide range of practical applications. In this paper, we focus on Gaussian Bayesian networks, i.e., on continuous data and directed acyclic graphs with a joint probability density over all variables given by a Gaussian. We propose to work in an equivalence-class search space, specifically using the k-greedy equivalence search algorithm. This, combined with regularization techniques to guide the structure search, can learn sparse networks close to the one that generated the data. We provide results on synthetic networks and on modeling the gene network of the two biological pathways regulating isoprenoid biosynthesis in the plant Arabidopsis thaliana.
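The score-based search described above can be illustrated on the smallest possible case, assuming (for illustration only) two Gaussian variables and an invented data-generating process: compare the BIC score of the empty graph with that of the graph x -> y on data where y really depends on x. This is a toy scoring step, not the paper's k-greedy equivalence search.

```python
# BIC comparison of two candidate structures for a 2-node Gaussian
# network. The data-generating numbers are illustrative assumptions.
import math
import random

random.seed(0)
n = 500
xs = [random.gauss(0, 1) for _ in range(n)]
ys = [2.0 * x + random.gauss(0, 1) for x in xs]

def gauss_loglik(resid, var):
    return sum(-0.5 * (math.log(2 * math.pi * var) + r * r / var)
               for r in resid)

def bic(loglik, k):
    return loglik - 0.5 * k * math.log(n)

# Empty graph: y modeled by its marginal mean and variance (2 parameters).
my = sum(ys) / n
vy = sum((y - my) ** 2 for y in ys) / n
bic_empty = bic(gauss_loglik([y - my for y in ys], vy), 2)

# Graph x -> y: y regressed on x (intercept, slope, variance: 3 parameters).
mx = sum(xs) / n
beta = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        / sum((x - mx) ** 2 for x in xs))
resid = [y - (my + beta * (x - mx)) for x, y in zip(xs, ys)]
vr = sum(r * r for r in resid) / n
bic_edge = bic(gauss_loglik(resid, vr), 3)
# The extra parameter is worth it here: bic_edge exceeds bic_empty.
```

A greedy search scores many such local edge additions and removals; working over equivalence classes, as in the paper, avoids rescoring DAGs that encode the same set of distributions.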