7 results for Simulation methods

at Duke University


Relevance:

60.00%

Publisher:

Abstract:

First-order transitions of systems in which both lattice site occupancy and lattice spacing fluctuate, such as cluster crystals, cannot be efficiently studied by traditional simulation methods, which necessarily fix one of these two degrees of freedom. The difficulty, however, can be surmounted by the generalized [N]pT ensemble [J. Chem. Phys. 136, 214106 (2012)]. Here we show that histogram reweighting and the [N]pT ensemble can be used to study an isostructural transition between cluster crystals of different occupancy in the generalized exponential model of index 4 (GEM-4). Extending this scheme to finite-size scaling studies also allows us to accurately determine the critical point parameters and to verify that the transition belongs to the Ising universality class.
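
As a rough illustration of the histogram-reweighting step, the sketch below reweights observable samples from a constant-pressure run at pressure p0 to a nearby pressure p. It is a generic single-histogram estimator, not the paper's full [N]pT machinery (which also handles the fluctuating occupancy), and all names and values are illustrative.

```python
import numpy as np

def reweight_pressure(A, V, beta, p0, p):
    """Single-histogram reweighting of observable samples A (e.g., lattice
    occupancy) collected with volumes V in a run at pressure p0, to a
    nearby target pressure p. beta = 1/(kB*T)."""
    log_w = -beta * (p - p0) * np.asarray(V)   # relative configuration weights at p
    log_w -= log_w.max()                       # guard against overflow
    w = np.exp(log_w)
    return np.sum(w * np.asarray(A)) / np.sum(w)
```

The estimate is only reliable for target pressures close enough to p0 that the sampled histograms overlap, which is why finite-size scaling studies typically combine runs at several nearby state points.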

Relevance:

30.00%

Publisher:

Abstract:

Computer simulations of reaction processes in solution generally rely on the definition of a reaction coordinate and the determination of the thermodynamic changes of the system along that coordinate. The reaction coordinate is often built from characteristic geometrical properties of the reactive solute species, while the contributions of solvent molecules are implicitly included in the thermodynamics of the solute degrees of freedom. However, solvent dynamics can provide the driving force for the reaction process, and in such cases an explicit description of the solvent contribution to the free energy of the reaction becomes necessary. We report here a method that can be used to analyze the solvent contributions to reaction activation free energies from combined QM/MM minimum free-energy path simulations. The method was applied to the self-exchange S(N)2 reaction of CH(3)Cl + Cl(-), showing the importance of solvent-solute interactions to the reaction process. The results were further discussed in the context of coupling between solvent and solute molecules in reaction processes.
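
For context, free-energy changes along such a coordinate are often estimated from sampled interaction energies with an exponential (Zwanzig) average. The sketch below shows that generic free-energy-perturbation estimator, not the paper's specific QM/MM decomposition, and assumes an array dU of sampled perturbation energies in kcal/mol.

```python
import numpy as np

K_B = 0.0019872041  # Boltzmann constant in kcal/(mol*K)

def fep_free_energy(dU, T=298.15):
    """Zwanzig free-energy-perturbation estimate,
    Delta A = -kT * ln < exp(-beta * dU) >, from perturbation energies dU
    sampled in the reference state (e.g., a change in solute-solvent
    interaction between two points on the reaction coordinate)."""
    beta = 1.0 / (K_B * T)
    x = -beta * np.asarray(dU, dtype=float)
    # log-sum-exp keeps the exponential average numerically stable
    return -(1.0 / beta) * (np.logaddexp.reduce(x) - np.log(len(x)))
```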

Relevance:

30.00%

Publisher:

Abstract:

BACKGROUND: The ability to write clearly and effectively is of central importance to the scientific enterprise. Encouraged by the success of simulation environments in other biomedical sciences, we developed WriteSim TCExam, an open-source, Web-based, textual simulation environment for teaching effective writing techniques to novice researchers. We shortlisted and modified an existing open-source application, TCExam, to serve as a textual simulation environment. After testing usability internally in our team, we conducted formal field usability studies with novice researchers. These were followed by formal surveys with researchers fitting the roles of administrators and users (novice researchers). RESULTS: The development process was guided by feedback from usability tests within our research team. Online surveys and formal studies, involving members of the Research on Research group and selected novice researchers, show that the application is user-friendly. Additionally, it has been used to train 25 novice researchers in scientific writing to date and has generated encouraging results. CONCLUSION: WriteSim TCExam is the first Web-based, open-source textual simulation environment designed to complement traditional scientific writing instruction. While initial reviews by students and educators have been positive, a formal study is needed to measure its benefits in comparison to standard instructional methods.

Relevance:

30.00%

Publisher:

Abstract:

We describe a strategy for Markov chain Monte Carlo analysis of non-linear, non-Gaussian state-space models involving batch analysis for inference on dynamic, latent state variables and fixed model parameters. The key innovation is a Metropolis-Hastings method for the time series of state variables based on sequential approximation of filtering and smoothing densities using normal mixtures. These mixtures are propagated through the non-linearities using an accurate, local mixture approximation method, and we use a regenerating procedure to deal with potential degeneracy of mixture components. This provides accurate, direct approximations to sequential filtering and retrospective smoothing distributions, and hence a useful construction of global Metropolis proposal distributions for simulation of posteriors for the set of states. This analysis is embedded within a Gibbs sampler to include uncertain fixed parameters. We give an example motivated by an application in systems biology. Supplemental materials provide an example based on a stochastic volatility model as well as MATLAB code.
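
The global proposal enters the sampler through a standard independence Metropolis-Hastings step. The sketch below shows that step in generic form, assuming user-supplied functions for the log target density, the log density of the mixture-based proposal, and a sampler from that proposal; all names are hypothetical.

```python
import numpy as np

def independence_mh_step(x, log_target, log_proposal, sample_proposal, rng):
    """One independence Metropolis-Hastings update of the state path x.

    log_target : returns log p(x), the (unnormalized) posterior over the path
    log_proposal : returns log q(x) of the global proposal, e.g. a
        normal-mixture approximation to the smoothing density
    sample_proposal : draws a candidate path from q
    """
    x_new = sample_proposal(rng)
    log_alpha = (log_target(x_new) - log_target(x)
                 + log_proposal(x) - log_proposal(x_new))
    if np.log(rng.uniform()) < log_alpha:
        return x_new, True   # candidate accepted
    return x, False          # candidate rejected, keep current path
```

In the paper's setting, log_proposal and sample_proposal would come from the propagated normal-mixture approximation to the smoothing distribution, and this step sits inside a Gibbs sweep that also updates the uncertain fixed parameters.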

Relevance:

30.00%

Publisher:

Abstract:

BACKGROUND: Ritonavir inhibition of cytochrome P450 3A4 decreases the elimination clearance of fentanyl by 67%. We used a pharmacokinetic model developed from published data to simulate the effect of sample patient-controlled epidural labor analgesic regimens on plasma fentanyl concentrations in the absence and presence of ritonavir-induced cytochrome P450 3A4 inhibition. METHODS: Fentanyl absorption from the epidural space was modeled using tanks-in-series delay elements. Systemic fentanyl disposition was described using a three-compartment pharmacokinetic model. Parameters for epidural drug absorption were estimated by fitting the model to reported plasma fentanyl concentrations measured after epidural administration. The validity of the model was assessed by comparing predicted plasma concentrations after epidural administration to published data. The effect of ritonavir was modeled as a 67% decrease in fentanyl elimination clearance. Plasma fentanyl concentrations were simulated for six sample patient-controlled epidural labor analgesic regimens over 24 h using ritonavir and control models. Simulated data were analyzed to determine if plasma fentanyl concentrations producing a 50% decrease in minute ventilation (6.1 ng/mL) were achieved. RESULTS: Simulated plasma fentanyl concentrations in the ritonavir group were higher than those in the control group for all sample labor analgesic regimens. Maximum plasma fentanyl concentrations were 1.8 ng/mL and 3.4 ng/mL for the normal and ritonavir simulations, respectively, and did not reach concentrations associated with a 50% decrease in minute ventilation. CONCLUSION: Our model predicts that even with maximal clinical dosing regimens of epidural fentanyl over 24 h, ritonavir-induced cytochrome P450 3A4 inhibition is unlikely to produce plasma fentanyl concentrations associated with a decrease in minute ventilation.
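
A minimal sketch of the simulation structure is given below: a tanks-in-series delay chain feeding a three-compartment disposition model, with the ritonavir effect applied as a 67% reduction in the elimination rate constant. The rate constants, volume, and dose are placeholder values, not the fitted parameters reported in the study.

```python
import numpy as np
from scipy.integrate import solve_ivp

def fentanyl_model(n_tanks=3, ktr=2.0, k10=0.5, k12=0.3, k21=0.2,
                   k13=0.1, k31=0.05, v_central=20.0, ritonavir=False):
    """Tanks-in-series epidural absorption feeding a three-compartment
    disposition model. Rate constants are in 1/h and the central volume
    in L; all values here are illustrative placeholders."""
    if ritonavir:
        k10 *= 0.33  # 67% reduction in elimination clearance

    def rhs(t, y):
        tanks, central, p1, p2 = y[:n_tanks], y[n_tanks], y[n_tanks+1], y[n_tanks+2]
        dtanks = np.empty(n_tanks)
        dtanks[0] = -ktr * tanks[0]
        dtanks[1:] = ktr * (tanks[:-1] - tanks[1:])       # delay chain
        dcentral = (ktr * tanks[-1] - (k10 + k12 + k13) * central
                    + k21 * p1 + k31 * p2)
        dp1 = k12 * central - k21 * p1                    # shallow peripheral
        dp2 = k13 * central - k31 * p2                    # deep peripheral
        return np.concatenate([dtanks, [dcentral, dp1, dp2]])

    return rhs, v_central

# Example: a single 100 ug epidural bolus simulated over 24 h with ritonavir
rhs, vc = fentanyl_model(ritonavir=True)
y0 = np.zeros(3 + 3)
y0[0] = 100.0                          # dose placed in the first delay tank
sol = solve_ivp(rhs, (0.0, 24.0), y0, dense_output=True)
conc = sol.y[3] / vc                   # central-compartment concentration, ng/mL
```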

Relevance:

30.00%

Publisher:

Abstract:

A RET network consists of photo-active molecules, called chromophores, that can participate in inter-molecular resonance energy transfer (RET). RET networks are used in a variety of applications including cryptographic devices, storage systems, light harvesting complexes, biological sensors, and molecular rulers. In this dissertation, we focus on creating a RET device called the closed-diffusive exciton valve (C-DEV), in which the input-to-output transfer function is controlled by an external energy source, similar to a semiconductor transistor like the MOSFET. Due to their biocompatibility, molecular devices like the C-DEV can be used to introduce computing power into biological, organic, and aqueous environments such as living cells. Furthermore, the underlying physics of RET devices is stochastic in nature, making them suitable for stochastic computing, in which the generation of truly random distributions is critical.
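
For orientation, the pairwise physics underlying RET is the standard Förster rate, which falls off as the sixth power of the donor-acceptor separation. The sketch below evaluates that rate and samples transfer events stochastically; the Förster radius, lifetime, and separations are illustrative values, not parameters from the dissertation.

```python
import numpy as np

def forster_rate(r, r0=5.0, tau_donor=4.0):
    """Forster resonance energy transfer rate (1/ns) between a donor and an
    acceptor separated by r nm, for Forster radius r0 (nm) and donor
    excited-state lifetime tau_donor (ns)."""
    return (1.0 / tau_donor) * (r0 / r) ** 6

def transfer_probability(r, r0=5.0, tau_donor=4.0):
    """Probability that an excited donor relaxes by RET rather than by its
    intrinsic decay (the pairwise transfer efficiency)."""
    k_ret = forster_rate(r, r0, tau_donor)
    return k_ret / (k_ret + 1.0 / tau_donor)

# Stochastic (Monte Carlo) view: sample whether each excitation transfers
rng = np.random.default_rng(0)
transfers = rng.random(10_000) < transfer_probability(r=4.0)
print(transfers.mean())   # approx 5**6 / (5**6 + 4**6) ~ 0.79
```

Device-level simulations such as those for the C-DEV compose many such pairwise transfers across a network of chromophores.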

In order to determine a valid configuration of chromophores for the C-DEV, we developed a systematic process based on user-guided design space pruning techniques and built-in simulation tools. We show that our C-DEV is 15x better than C-DEVs designed using ad hoc methods that rely on limited data from prior experiments. We also show ways in which the C-DEV can be improved further and how different varieties of C-DEVs can be combined to form more complex logic circuits. Moreover, the systematic design process can be used to search for valid chromophore network configurations for a variety of RET applications.

We also describe a feasibility study for a technique used to control the orientation of chromophores attached to DNA. Being able to control the orientation can expand the design space for RET networks because it provides another parameter to tune their collective behavior. While results showed limited control over orientation, the analysis required the development of a mathematical model that can be used to determine the distribution of dipoles in a given sample of chromophore constructs. The model can be used to evaluate the feasibility of other potential orientation control techniques.

Relevance:

30.00%

Publisher:

Abstract:


Continuous variables are one of the major data types collected by survey organizations. They can be incomplete, requiring the data collectors to fill in the missing values, or they can contain sensitive information that needs protection from re-identification. One approach to protecting continuous microdata is to aggregate the values into cells defined by different features. In this thesis, I present novel methods of multiple imputation (MI) that can be applied to impute missing values and to synthesize confidential values for continuous and magnitude data.

The first method is for limiting the disclosure risk of the continuous microdata whose marginal sums are fixed. The motivation for developing such a method comes from the magnitude tables of non-negative integer values in economic surveys. I present approaches based on a mixture of Poisson distributions to describe the multivariate distribution so that the marginals of the synthetic data are guaranteed to sum to the original totals. At the same time, I present methods for assessing disclosure risks in releasing such synthetic magnitude microdata. The illustration on a survey of manufacturing establishments shows that the disclosure risks are low while the information loss is acceptable.
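
The key conditioning fact behind fixed-marginal synthesis of this kind is that independent Poisson counts, conditioned on their sum, follow a multinomial distribution. The sketch below uses that fact to draw synthetic cells that reproduce an original row total exactly; the rates are assumed to come from a fitted mixture component, and the values shown are illustrative rather than the thesis's actual synthesizer.

```python
import numpy as np

def synthesize_row(total, rates, rng):
    """Draw synthetic non-negative integer cells whose sum equals the
    original row total.

    Independent Poisson counts with means `rates`, conditioned on their sum,
    are multinomial with probabilities rates / rates.sum(), so sampling the
    multinomial directly guarantees the released marginal total is preserved.
    """
    rates = np.asarray(rates, dtype=float)
    return rng.multinomial(total, rates / rates.sum())

rng = np.random.default_rng(1)
print(synthesize_row(120, rates=[10.0, 30.0, 60.0], rng=rng))  # cells sum to 120
```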

The second method is for releasing synthetic continuous microdata by a nonstandard MI method. Traditionally, MI fits a model to the confidential values and then generates multiple synthetic datasets from this model. Its disclosure risk tends to be high, especially when the original data contain extreme values. I present a nonstandard MI approach conditioned on protective intervals. Its basic idea is to estimate the model parameters from these intervals rather than from the confidential values. The encouraging results of simple simulation studies suggest the potential of this new approach in limiting the posterior disclosure risk.
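
The interval-based estimation idea can be illustrated with a simple normal model: the likelihood of each record is the probability mass the model assigns to its protective interval rather than to a point value. The sketch below fits such a model by maximum likelihood; it is only a toy stand-in for the thesis's MI procedure, and the interval data are made up.

```python
import numpy as np
from scipy.stats import norm
from scipy.optimize import minimize

def interval_loglik(params, lower, upper):
    """Log-likelihood of a normal model given only protective intervals
    [lower_i, upper_i] known to contain each confidential value."""
    mu, log_sigma = params
    sigma = np.exp(log_sigma)
    p = norm.cdf(upper, mu, sigma) - norm.cdf(lower, mu, sigma)
    return np.sum(np.log(np.clip(p, 1e-300, None)))

# Fit mu and sigma from the intervals alone (illustrative data)
lower = np.array([0.0, 2.0, 5.0, 1.0])
upper = np.array([3.0, 6.0, 9.0, 4.0])
fit = minimize(lambda th: -interval_loglik(th, lower, upper), x0=[2.0, 0.0])
mu_hat, sigma_hat = fit.x[0], np.exp(fit.x[1])
```

Synthetic values would then be generated from the fitted model rather than from the confidential data, which is what limits the posterior disclosure risk.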

The third method is for imputing missing values in continuous and categorical variables. It extends a hierarchically coupled mixture model with local dependence, but separates the variables into non-focused (e.g., almost fully observed) and focused (e.g., subject to substantial missingness) ones. The sub-model structure for the focused variables is more complex than that for the non-focused ones; their cluster indicators are linked together by tensor factorization, and the focused continuous variables depend locally on non-focused values. The model properties suggest that moving strongly associated non-focused variables to the focused side can help improve estimation accuracy, which is examined in several simulation studies. The method is also applied to data from the American Community Survey.