936 results for "Microscopic simulation models"


Relevance: 30.00%

Abstract:

The prognosis for lung cancer patients remains poor: five-year survival rates of about 15% have been reported. Studies have shown that dose escalation to the tumor can lead to better local control and subsequently better overall survival. However, the dose to a lung tumor is limited by normal tissue toxicity, the most prevalent thoracic toxicity being radiation pneumonitis. In order to determine a safe dose that can be delivered to the healthy lung, researchers have turned to mathematical models predicting the rate of radiation pneumonitis. However, these models rely on simple metrics based on the dose-volume histogram and are not yet accurate enough to be used for dose escalation trials. The purpose of this work was to improve the fit of predictive risk models for radiation pneumonitis and to show the dosimetric benefit of using the models to guide patient treatment planning. The study was divided into three specific aims. The first two specific aims focused on improving the fit of the predictive model: in Specific Aim 1 we incorporated information about the spatial location of the lung dose distribution into a predictive model, and in Specific Aim 2 we incorporated ventilation-based functional information into a predictive pneumonitis model. In the third specific aim, a proof-of-principle virtual simulation was performed in which a model-determined limit was used to scale the prescription dose. The data showed that, for our patient cohort, the fit of the model was not improved by incorporating spatial information. Although we were not able to achieve a significant improvement in model fit using pre-treatment ventilation, we show some promising results indicating that ventilation imaging can provide useful information about lung function in lung cancer patients. The virtual simulation trial demonstrated that using a personalized lung dose limit derived from a predictive model results in a different prescription than the one achieved with the clinically used plan, demonstrating the utility of a normal tissue toxicity model in personalizing the prescription dose.
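
The abstract does not give the model equations, but a common DVH-based pneumonitis model of the kind described is a logistic risk function of mean lung dose. The Python sketch below uses hypothetical coefficients (not the dissertation's fitted model) to illustrate both the risk prediction and the Aim 3 idea of inverting the model into a personalized prescription limit.

```python
import numpy as np

# Hypothetical logistic coefficients; in practice they are fit by maximum
# likelihood to observed pneumonitis outcomes for a patient cohort.
B0, B1 = -3.87, 0.126

def ntcp(mean_lung_dose_gy):
    """Predicted radiation pneumonitis risk as a logistic function of the
    mean lung dose (MLD), a standard dose-volume-histogram metric."""
    return 1.0 / (1.0 + np.exp(-(B0 + B1 * mean_lung_dose_gy)))

def personalized_prescription(mld_per_prescribed_gy, risk_limit=0.15):
    """Invert the model: the largest MLD that keeps the predicted risk
    below risk_limit, converted to a prescription dose via the
    plan-specific ratio of MLD to prescribed dose (the Aim 3 idea)."""
    mld_limit = (np.log(risk_limit / (1.0 - risk_limit)) - B0) / B1
    return mld_limit / mld_per_prescribed_gy

# Example: a plan delivering 0.3 Gy of MLD per prescribed Gy
print(ntcp(20.0))                      # risk at MLD = 20 Gy (~0.21)
print(personalized_prescription(0.3))  # personalized prescription in Gy
```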

Relevance: 30.00%

Abstract:

Models of DNA sequence evolution and methods for estimating evolutionary distances are needed for studying the rate and pattern of molecular evolution and for inferring the evolutionary relationships of organisms or genes. In this dissertation, several new models and methods are developed. Rate variation among nucleotide sites: to obtain unbiased estimates of evolutionary distances, the rate heterogeneity among the nucleotide sites of a gene should be considered. Commonly, it is assumed that the substitution rate varies among sites according to a gamma distribution (gamma model) or, more generally, an invariant+gamma model that also includes some invariable sites. A maximum likelihood (ML) approach was developed for estimating the shape parameter of the gamma distribution ($\alpha$) and/or the proportion of invariable sites ($\theta$). Computer simulation showed that (1) under the gamma model, $\alpha$ can be well estimated from 3 or 4 sequences if the sequences are long; and (2) the distance estimate is unbiased and robust against violations of the assumptions of the invariant+gamma model. However, this ML method requires a huge amount of computational time and is useful only for fewer than 6 sequences. Therefore, I developed a fast method for estimating $\alpha$ that is easy to implement and requires no knowledge of the tree topology. A computer program was developed for estimating $\alpha$ and evolutionary distances that can handle as many as 30 sequences. Evolutionary distances under the stationary, time-reversible (SR) model: the SR model is a general model of nucleotide substitution that assumes (i) stationary nucleotide frequencies and (ii) time-reversibility. It can be extended to the SRV model, which allows rate variation among sites. I developed a method for estimating the distance under the SR or SRV model, as well as the variance-covariance matrix of the distances. Computer simulation showed that the SR method is better than a simpler method when the sequence length $L > 1000$ bp and is robust against deviations from time-reversibility. As expected, when the rate varies among sites, the SRV method is much better than the SR method. Evolutionary distances under nonstationary nucleotide frequencies: the statistical properties of the paralinear and LogDet distances under nonstationary nucleotide frequencies were studied. First, I developed formulas for correcting the estimation biases of the paralinear and LogDet distances. The performance of these formulas, and of the formulas for the sampling variances, was examined by computer simulation. Second, I developed a method for estimating the variance-covariance matrix of the paralinear distance, so that statistical tests of phylogenies can be conducted when the nucleotide frequencies are nonstationary. Third, a new method for testing the molecular clock hypothesis in the nonstationary case was developed.
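
For concreteness, under the Jukes-Cantor substitution model the gamma-corrected distance has the closed form $d = \frac{3}{4}\alpha\left[(1 - \tfrac{4}{3}p)^{-1/\alpha} - 1\right]$, where $p$ is the observed proportion of differing sites. The Python sketch below shows this standard correction (a textbook formula, not the dissertation's SR/SRV estimators):

```python
import numpy as np

def jc_distance(p):
    """Jukes-Cantor distance assuming a uniform rate across sites."""
    return -0.75 * np.log(1.0 - 4.0 * p / 3.0)

def jc_gamma_distance(p, alpha):
    """Jukes-Cantor distance with gamma-distributed rate variation among
    sites (shape alpha); approaches jc_distance as alpha -> infinity."""
    return 0.75 * alpha * ((1.0 - 4.0 * p / 3.0) ** (-1.0 / alpha) - 1.0)

# Example: 20% observed differences, strong rate heterogeneity (alpha = 0.5)
print(jc_distance(0.2))             # ~0.23
print(jc_gamma_distance(0.2, 0.5))  # ~0.32: ignoring rate variation
                                    # underestimates the distance
```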

Relevance: 30.00%

Abstract:

Application of biogeochemical models to the study of marine ecosystems is pervasive, yet objective quantification of these models' performance is rare. Here, 12 lower trophic level models of varying complexity are objectively assessed in two distinct regions (equatorial Pacific and Arabian Sea). Each model was run within an identical one-dimensional physical framework. A consistent variational adjoint implementation assimilating chlorophyll-a, nitrate, export, and primary productivity was applied, and the same metrics were used to assess model skill. Experiments were performed in which data were assimilated from each site individually and from both sites simultaneously. A cross-validation experiment was also conducted whereby data were assimilated from one site and the resulting optimal parameters were used to generate a simulation for the second site. When a single pelagic regime is considered, the simplest models fit the data as well as those with multiple phytoplankton functional groups. However, the models with multiple phytoplankton functional groups produce lower misfits when required to simulate both regimes using identical parameter values. The cross-validation experiments revealed that, as long as only a few key biogeochemical parameters were optimized, the models with greater phytoplankton complexity were generally more portable. Furthermore, models with multiple zooplankton compartments did not necessarily outperform models with single zooplankton compartments, even when zooplankton biomass data are assimilated. Finally, even when different models produced similar least squares model-data misfits, they often did so via very different element flow pathways, highlighting the need for more comprehensive data sets that uniquely constrain these pathways.
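
The misfit minimized in such variational adjoint assimilation is typically a weighted least-squares cost summed over data types. A generic Python sketch (the weighting and data types are illustrative, not the study's specific implementation):

```python
import numpy as np

def cost(model, obs, sigma):
    """Weighted least-squares model-data misfit summed over data types
    (e.g., chlorophyll-a, nitrate, export, primary productivity).

    model, obs, sigma: dicts mapping data-type name -> np.array;
    sigma holds the assumed observation uncertainties weighting each type.
    """
    J = 0.0
    for k in obs:
        J += np.sum(((model[k] - obs[k]) / sigma[k]) ** 2)
    return J
```

In a variational adjoint framework, the gradient of this cost with respect to the biogeochemical parameters is computed with the adjoint model and handed to a descent algorithm that optimizes the parameters.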

Relevance: 30.00%

Abstract:

The development of electrophoretic computer models and their use for the simulation of electrophoretic processes have increased significantly during the last few years. Recently, GENTRANS and SIMUL5 were extended with algorithms that describe chemical equilibria between solutes and a buffer additive as a fast 1:1 interaction process, an approach that enables simulation of the electrophoretic separation of enantiomers. For acidic cationic systems with sodium and H₃O⁺ as leading and terminating components, respectively, acetic acid as the counter component, charged weak bases as samples, and a neutral CD as the chiral selector, the new codes were used to investigate the dynamics of the isotachophoretic adjustment of enantiomers, enantiomer separation, the boundaries between enantiomers and between an enantiomer and a buffer constituent of like charge, and zone stability. The impact of the leader pH, the selector concentration, the free mobility of the weak base, the mobilities of the formed complexes, and the complexation constants could thereby be elucidated. For selected examples with methadone enantiomers as analytes and (2-hydroxypropyl)-β-CD as the selector, simulated zone patterns were found to compare well with those monitored experimentally in capillary setups with two conductivity detectors or an absorbance and a conductivity detector. Simulation represents an elegant way to gain insight into the formation of isotachophoretic boundaries and zone stability in the presence of complexation equilibria, in a hitherto inaccessible way.
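
For a fast 1:1 analyte-selector equilibrium, the standard effective-mobility relation is μ_eff = (μ_free + μ_complex·K·[C]) / (1 + K·[C]); two enantiomers separate because they differ in complexation constant and/or complex mobility. A minimal Python sketch with hypothetical parameter values:

```python
def effective_mobility(mu_free, mu_complex, K, selector_conc):
    """Effective mobility of a weak base undergoing fast 1:1 complexation
    with a neutral cyclodextrin at concentration selector_conc (mol/L);
    K is the complexation constant (L/mol)."""
    return (mu_free + mu_complex * K * selector_conc) / (1.0 + K * selector_conc)

# Hypothetical parameters for two enantiomers (same free mobility,
# different binding constants) -- illustrative only.
mu_free = 20e-9     # m^2 V^-1 s^-1
mu_complex = 5e-9
for label, K in [("R", 400.0), ("S", 250.0)]:
    print(label, effective_mobility(mu_free, mu_complex, K, 5e-3))
# Different effective mobilities -> the enantiomers form separate
# isotachophoretic zones.
```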

Relevance: 30.00%

Abstract:

We use quantum link models to construct a quantum simulator for U(N) and SU(N) lattice gauge theories. These models replace Wilson's classical link variables by quantum link operators, reducing the link Hilbert space to a finite number of dimensions. We show how to realize these quantum link models, including fermionic matter, with ultracold alkaline-earth atoms in optical lattices. Unlike classical simulations, a quantum simulator does not suffer from sign problems and can thus address the corresponding dynamics in real time. Using exact diagonalization results, we show that these systems share qualitative features with QCD, including chiral symmetry breaking, and we study the real-time expansion of a chirally restored region in space.
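
Exact diagonalization, used as the classical benchmark here, amounts to building the Hamiltonian matrix of a small lattice system and diagonalizing it directly. The generic Python sketch below does this for a small transverse-field Ising chain (a toy stand-in to illustrate the numerical method, not the paper's gauge model):

```python
import numpy as np

# Pauli matrices
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)
id2 = np.eye(2, dtype=complex)

def op_at(op, site, n):
    """Embed a single-site operator at `site` in an n-site chain."""
    mats = [id2] * n
    mats[site] = op
    out = mats[0]
    for m in mats[1:]:
        out = np.kron(out, m)
    return out

def ising_chain(n, J=1.0, h=0.5):
    """Transverse-field Ising chain: H = -J sum sz_i sz_{i+1} - h sum sx_i.
    A generic toy Hamiltonian illustrating exact diagonalization of a
    small quantum lattice model."""
    H = np.zeros((2**n, 2**n), dtype=complex)
    for i in range(n - 1):
        H -= J * op_at(sz, i, n) @ op_at(sz, i + 1, n)
    for i in range(n):
        H -= h * op_at(sx, i, n)
    return H

evals = np.linalg.eigvalsh(ising_chain(8))  # 256 x 256: trivially exact
print("ground-state energy:", evals[0])
```

The exponential growth of the Hilbert space (2^n here) is precisely why such exact benchmarks are limited to small systems, motivating the quantum simulator.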

Relevance: 30.00%

Abstract:

Cloud computing has evolved into an enabler for delivering access to large-scale distributed applications running on managed, network-connected computing systems. This makes it possible to host Distributed Enterprise Information Systems (dEISs) in cloud environments while enforcing strict performance and quality-of-service requirements, defined using Service Level Agreements (SLAs). SLAs define the performance boundaries of distributed applications and are enforced by a cloud management system (CMS) that dynamically allocates the available computing resources to the cloud services. We present two novel VM-scaling algorithms focused on dEIS systems, which detect the most appropriate scaling conditions using performance models of distributed applications derived from constant-workload benchmarks, together with SLA-specified performance constraints. We simulate the VM-scaling algorithms in a cloud simulator and compare them against trace-based performance models of dEISs. We compare a total of three SLA-based VM-scaling algorithms (one using prediction mechanisms) on a real-world application scenario involving a large, variable number of users. Our results show that it is beneficial to use autoregressive, predictive, SLA-driven scaling algorithms in cloud management systems for guaranteeing performance invariants of distributed cloud applications, as opposed to using only reactive SLA-based VM-scaling algorithms.
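
The abstract does not reproduce the algorithms themselves; the Python sketch below shows the general shape of an autoregressive, SLA-driven scaling decision (the AR(1) fit and the thresholds are hypothetical choices for illustration, not the authors' algorithm):

```python
import numpy as np

def ar1_forecast(history):
    """One-step-ahead AR(1) forecast of a response-time series via least
    squares; a simple stand-in for the paper's prediction mechanism."""
    x = np.asarray(history, dtype=float)
    y_prev, y_next = x[:-1], x[1:]
    dp = y_prev - y_prev.mean()
    phi = np.dot(dp, y_next - y_next.mean()) / np.dot(dp, dp)
    c = y_next.mean() - phi * y_prev.mean()
    return c + phi * x[-1]

def scaling_decision(response_times_ms, sla_limit_ms=500.0, headroom=0.8):
    """Scale out when the *predicted* response time approaches the SLA
    limit, instead of waiting for a reactive violation."""
    predicted = ar1_forecast(response_times_ms)
    if predicted > sla_limit_ms * headroom:
        return "scale-out"
    if predicted < sla_limit_ms * 0.3:
        return "scale-in"
    return "hold"

print(scaling_decision([200, 240, 290, 350, 420]))  # -> "scale-out"
```

The design point mirrors the paper's conclusion: acting on a forecast preserves the performance invariant, whereas a purely reactive policy only responds after the SLA boundary has already been approached or crossed.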

Relevance: 30.00%

Abstract:

N. Bostrom’s simulation argument and two additional assumptions imply that we are likely to live in a computer simulation. The argument is based upon the following assumption about the workings of realistic brain simulations: The hardware of a computer on which a brain simulation is run bears a close analogy to the brain itself. To inquire whether this is so, I analyze how computer simulations trace processes in their targets. I describe simulations as fictional, mathematical, pictorial, and material models. Even though the computer hardware does provide a material model of the target, this does not suffice to underwrite the simulation argument because the ways in which parts of the computer hardware interact during simulations do not resemble the ways in which neurons interact in the brain. Further, there are computer simulations of all kinds of systems, and it would be unreasonable to infer that some computers display consciousness just because they simulate brains rather than, say, galaxies.

Relevance: 30.00%

Abstract:

The near-nucleus coma of Comet 9P/Tempel 1 has been simulated with the 3D Direct Simulation Monte Carlo (DSMC) code PDSC++ (Su, C.-C. [2013]. Parallel Direct Simulation Monte Carlo (DSMC) Methods for Modeling Rarefied Gas Dynamics. PhD Thesis, National Chiao Tung University, Taiwan), and the derived column densities have been compared to observations of the water vapour distribution obtained with the infrared imaging spectrometer on the Deep Impact spacecraft (Feaga, L.M., A'Hearn, M.F., Sunshine, J.M., Groussin, O., Farnham, T.L. [2007]. Icarus 191(2), 134–145. http://dx.doi.org/10.1016/j.icarus.2007.04.038). Modelled total production rates are also compared to various observations made at the time of the Deep Impact encounter. Three different models were tested. For all models, the shape model constructed from the Deep Impact observations by Thomas et al. (Thomas, P.C., Veverka, J., Belton, M.J.S., Hidy, A., A'Hearn, M.F., Farnham, T.L., et al. [2007]. Icarus, 187(1), 4–15. http://dx.doi.org/10.1016/j.icarus.2006.12.013) was used. Outgassing that depends only on the cosine of the solar insolation angle on each shape-model facet is shown to provide an unsatisfactory model. Models constructed on the basis of the active areas suggested by Kossacki and Szutowicz (Kossacki, K., Szutowicz, S. [2008]. Icarus, 195(2), 705–724. http://dx.doi.org/10.1016/j.icarus.2007.12.014) are shown to be superior. The Kossacki and Szutowicz model, however, also shows deficits, which we have sought to improve upon. For the best model we investigate the properties of the outflow.
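
The simplest of the tested models ties each facet's gas flux to the cosine of the local solar insolation angle. Schematically (a conceptual Python sketch of that boundary condition, not the PDSC++ implementation):

```python
import numpy as np

def facet_flux(normals, areas, sun_dir, q_max, shadowed=None):
    """Per-facet water production for a triangulated shape model under a
    cosine-of-insolation law: flux ~ q_max * max(0, n . s) per unit area.

    normals: (N, 3) outward unit facet normals; areas: (N,) facet areas (m^2);
    sun_dir: unit vector toward the Sun; q_max: peak flux (molec m^-2 s^-1);
    shadowed: optional boolean mask for facets shadowed by local topography.
    """
    cosz = normals @ np.asarray(sun_dir)
    cosz = np.clip(cosz, 0.0, None)           # night side emits nothing
    if shadowed is not None:
        cosz[shadowed] = 0.0                  # self-shadowing by the nucleus
    flux = q_max * cosz                       # molecules m^-2 s^-1
    total_production = np.sum(flux * areas)   # molecules s^-1
    return flux, total_production
```

In the DSMC simulation, these facet fluxes serve as the surface boundary condition from which test particles are launched into the coma.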

Relevance: 30.00%

Abstract:

Mathematical models of disease progression predict disease outcomes and are useful epidemiological tools for planners and evaluators of health interventions. The R package gems is a tool that simulates disease progression in patients and predicts the effect of different interventions on patient outcome. Disease progression is represented by a series of events (e.g., diagnosis, treatment and death), displayed in a directed acyclic graph. The vertices correspond to disease states and the directed edges represent events. The package gems allows simulations based on a generalized multistate model that can be described by a directed acyclic graph with continuous transition-specific hazard functions. The user can specify an arbitrary hazard function and its parameters. The model includes parameter uncertainty, does not need to be a Markov model, and may take the history of previous events into account. Applications are not limited to the medical field and extend to other areas where multistate simulation is of interest. We provide a technical explanation of the multistate models used by gems, explain the functions of gems and their arguments, and show a sample application.
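
Conceptually, each simulated patient is walked through the directed acyclic graph by sampling a time for every outgoing transition from the current state and taking the earliest one (competing risks). A Python analogue of that idea (a conceptual sketch, not the gems R API; the states and hazard choices below are hypothetical):

```python
import numpy as np

rng = np.random.default_rng(1)

# Directed acyclic graph of disease states with transition-specific
# time-to-event samplers (here: simple parametric draws).
TRANSITIONS = {
    "diagnosis": [("treatment", lambda: rng.exponential(2.0)),
                  ("death",     lambda: rng.weibull(1.5) * 10.0)],
    "treatment": [("death",     lambda: rng.weibull(1.2) * 15.0)],
    "death": [],
}

def simulate_patient(start="diagnosis", horizon=20.0):
    """Walk the DAG: at each state, sample a time for every outgoing
    transition and follow the earliest one (competing risks)."""
    t, state = 0.0, start
    path = [(start, 0.0)]
    while TRANSITIONS[state]:
        times = [(t + sampler(), nxt) for nxt, sampler in TRANSITIONS[state]]
        t_next, nxt = min(times)
        if t_next > horizon:
            break
        t, state = t_next, nxt
        path.append((state, round(t, 2)))
    return path

print(simulate_patient())
```

Repeating this over a cohort, and redrawing the hazard parameters from their uncertainty distributions, yields the outcome predictions (with parameter uncertainty) that the package provides.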

Relevance: 30.00%

Abstract:

A search for an excess of events with multiple high transverse momentum objects, including charged leptons and jets, is presented, using 20.3 fb⁻¹ of proton-proton collision data recorded by the ATLAS detector at the Large Hadron Collider in 2012 at a centre-of-mass energy of √s = 8 TeV. No excess of events beyond Standard Model expectations is observed. Using extra-dimensional models for black hole and string ball production and decay, exclusion contours are determined as a function of the mass threshold for production and the fundamental gravity scale for two, four and six extra dimensions. For six extra dimensions, mass thresholds of 4.8–6.2 TeV are excluded at 95% confidence level, depending on the fundamental gravity scale and model assumptions. Upper limits on the fiducial cross-sections for non-Standard-Model production of these final states are set.

Relevance: 30.00%

Abstract:

In this article, we present a new microscopic theoretical approach to the description of spin crossover in molecular crystals. The spin-crossover crystals under consideration are composed of molecular fragments formed by the spin-crossover metal ion and its nearest ligand surroundings, exhibiting well-defined localized (molecular) vibrations. As distinguished from previous models of this phenomenon, the developed approach takes into account not only the interaction of the spin-crossover ions with phonons but also a strong coupling of the electronic shells with the molecular modes. This leads to an effective coupling of the local modes with the phonons, which is shown to be responsible for the cooperative spin transition accompanied by structural reorganization. The transition is characterized by two order parameters representing the mean values of the products of the diagonal electronic matrices and the coordinates of the local modes for the high- and low-spin states of the spin-crossover complex. Finally, we demonstrate that the approach provides a reasonable explanation of the observed spin transition in the [Fe(ptz)6](BF4)2 crystal. The theory reproduces well the observed abrupt low-spin → high-spin transition, the temperature dependence of the high-spin fraction over a wide temperature range, and the pronounced hysteresis loop. At the same time, within the limiting approximations adopted in the developed model, the evaluated high-spin fraction vs. T shows that the cooperative spin-lattice transition is incomplete, in the sense that the high-spin fraction does not reach its maximum value at high temperature.
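
The paper's two-order-parameter vibronic model is not spelled out in the abstract, but the phenomenology it reproduces (an abrupt transition with a hysteresis loop driven by cooperativity) can be illustrated with the much simpler mean-field Slichter-Drickamer picture, in which the high-spin fraction x satisfies x/(1−x) = exp[−(ΔH + Γ(1−2x) − TΔS)/(RT)]. A Python sketch with generic parameters chosen only so that Γ > 2RT₁/₂, the condition for hysteresis (an illustration of the cooperative mechanism, not the authors' microscopic model):

```python
import numpy as np

R = 8.314  # gas constant, J mol^-1 K^-1

def hs_fraction(T, x0, dH=12e3, dS=60.0, gamma=5e3, iters=200):
    """Self-consistent high-spin fraction x at temperature T in the
    mean-field Slichter-Drickamer model; fixed-point iteration converges
    to a stable branch depending on the starting value x0."""
    x = x0
    for _ in range(iters):
        x = 1.0 / (1.0 + np.exp((dH + gamma * (1.0 - 2.0 * x) - T * dS)
                                / (R * T)))
    return x

# Sweep T upward from the low-spin state, then downward from the
# high-spin state; the branches switch at different temperatures,
# tracing a hysteresis loop around T_1/2 = dH/dS = 200 K.
x, heating = 0.01, {}
for T in range(150, 251, 5):
    x = hs_fraction(T, x); heating[T] = x
cooling = {}
for T in range(250, 149, -5):
    x = hs_fraction(T, x); cooling[T] = x
print(heating[205], cooling[205])  # ~0.1 vs ~0.9: hysteresis at 205 K
```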

Relevance: 30.00%

Abstract:

This study compares gridded European seasonal series of surface air temperature (SAT) and precipitation (PRE) reconstructions with a regional climate simulation over the period 1500–1990. The area is analysed separately for nine subareas that represent the majority of the climate diversity in the European sector. In their spatial structure, an overall good agreement is found between the reconstructed and simulated climate features across Europe, supporting the consistency of both products. Systematic biases between the two data sets can be explained by a priori known deficiencies in the simulation. Simulations and reconstructions, however, largely differ in the temporal evolution of past climate for European subregions. In particular, the simulated anomalies during the Maunder and Dalton minima show a stronger response to changes in the external forcings than is recorded in the reconstructions. Although this disagreement is to some extent expected given the prominent role of internal variability in the evolution of regional temperature and precipitation, a certain degree of agreement is a priori expected in variables directly affected by external forcings. In this sense, the inability of the model to reproduce a warm period similar to that recorded in the reconstructions for the winters of the first decades of the 18th century is indicative of fundamental limitations in the simulation that preclude reproducing exceptionally anomalous conditions. Despite these limitations, the simulated climate is a physically consistent data set, which can be used as a benchmark to analyse the consistency and limitations of gridded reconstructions of different variables. A comparison of the leading modes of SAT and PRE variability indicates that the reconstructions are too simplistic, especially for precipitation; this is associated with the linear statistical techniques used to generate the reconstructions. The analysis of the co-variability between sea level pressure (SLP) and SAT and PRE in the simulation yields a result that resembles the canonical co-variability recorded in the observations for the 20th century. The same analysis for the reconstructions, however, exhibits anomalously low correlations, which points towards a lack of dynamical consistency between the independent reconstructions.

Relevance: 30.00%

Abstract:

67P/Churyumov-Gerasimenko (67P) is a Jupiter-family comet and the object of investigation of the European Space Agency mission Rosetta. This report presents the first full 3D simulation results for 67P's neutral gas coma. In this study we include results from a direct simulation Monte Carlo method, a hydrodynamic code, and a purely geometric calculation that computes the total illuminated surface area of the nucleus. All models include the triangulated 3D shape model of 67P as well as realistic illumination and shadowing conditions. The basic concept is the assumption that the illumination conditions on the nucleus are the main driver of the comet's gas activity; as a consequence, the total production rate of 67P varies as a function of solar insolation. The best agreement between model and data is achieved when gas fluxes on the night side are in the range of 7% to 10% of the maximum flux, accounting for contributions from the most volatile components. To validate the output of our numerical simulations we compare the results of all three models to in situ gas number density measurements from the ROSINA COPS instrument. We are able to reproduce the overall features of these local neutral number density measurements for the period between early August 2014 and 1 January 2015 with all three models. Some details in the measurements are not reproduced and warrant further investigation and refinement of the models. However, the overall assumption that the illumination conditions on the nucleus are at least an important driver of the gas activity is validated by the models. According to our simulation results, the total production rate of 67P was constant between August and November 2014, at a value of about 1 × 10²⁶ molecules s⁻¹.

Relevance: 30.00%

Abstract:

When considering data from many trials, it is likely that some of them present a markedly different intervention effect or exert an undue influence on the summary results. We develop a forward search algorithm for identifying outlying and influential studies in meta-analysis models. The forward search algorithm starts by fitting the hypothesized model to a small subset of likely outlier-free studies and proceeds by adding, one by one, the studies determined to be closest to the model fitted to the existing set. As each study is added, plots of the estimated parameters and measures of fit are monitored, and outliers are identified by sharp changes in these forward plots. We apply the proposed outlier detection method to two real data sets: a meta-analysis of 26 studies that examines the effect of writing-to-learn interventions on academic achievement, adjusting for three possible effect modifiers, and a meta-analysis of 70 studies that compares a fluoride toothpaste treatment to placebo for preventing dental caries in children. A simple simulated example is used to illustrate the steps of the proposed methodology, and a small-scale simulation study is conducted to evaluate the performance of the proposed method.
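
In outline, the procedure can be sketched as follows (a schematic fixed-effect version in Python; the paper's method also covers random-effects models and effect modifiers):

```python
import numpy as np

def fixed_effect(y, v):
    """Inverse-variance weighted fixed-effect summary estimate."""
    w = 1.0 / v
    return np.sum(w * y) / np.sum(w)

def forward_search(y, v, m0=5):
    """Schematic forward search: start from the m0 studies closest to the
    median effect (a crude 'likely outlier-free' subset), then add the
    remaining studies one at a time, always picking the study closest to
    the current fitted summary. Sharp jumps in the returned trace of
    estimates flag the entry of outlying or influential studies."""
    y, v = np.asarray(y, float), np.asarray(v, float)
    order = np.argsort(np.abs(y - np.median(y)))
    inset = list(order[:m0])
    outset = [i for i in range(len(y)) if i not in inset]
    trace = [fixed_effect(y[inset], v[inset])]
    while outset:
        mu = trace[-1]
        nxt = min(outset, key=lambda i: abs(y[i] - mu))
        inset.append(nxt)
        outset.remove(nxt)
        trace.append(fixed_effect(y[inset], v[inset]))
    return trace  # monitor/plot this to spot sharp changes near the end
```

Because genuine outliers are the studies furthest from the fitted model, they enter last, so abrupt changes at the end of the forward plots localize exactly which studies are outlying or influential.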

Relevance: 30.00%

Abstract:

The social processes that lead to destructive behavior in celebratory crowds can be studied through agent-based computer simulation. Riots are an increasingly common outcome of sports celebrations, and they pose the potential for harm to participants, bystanders, property, and the reputation of the groups with whom participants are associated. Rioting cannot necessarily be attributed to the negative emotions of individuals, such as anger, rage, frustration, and despair. For instance, the celebratory behavior (e.g., chanting, cheering, singing) during UConn's "Spring Weekend" and after the 2004 NCAA Championships resulted in several small fires and overturned cars. Further, not every individual in the area of a riot engages in violence, and those who do, do not do so continuously. Instead, small groups carry out the majority of violent acts in relatively short-lived episodes. Agent-based computer simulations are an ideal method for modeling complex group-level social phenomena, such as celebratory gatherings and riots, which emerge from the interaction of relatively "simple" individuals. By making simple assumptions about individuals' decision-making and behavior and allowing actors to affect one another, behavioral patterns emerge that cannot be predicted from the characteristics of individuals. The computer simulation developed here models celebratory riot behavior by repeatedly evaluating a single algorithm for each individual, whose inputs are affected by the characteristics of nearby actors. Specifically, the simulation assumes that (a) actors possess one of five distinct social identities (group memberships), (b) actors congregate with actors who possess the same identity, (c) the degree of social cohesion generated in the social context determines the stability of relationships within groups, and (d) actors' level of aggression is affected by the aggression of other group members. This simulation not only provides a systematic investigation of the effects of the initial distribution of aggression, social identification, and cohesiveness on riot outcomes, but also an analytic tool that others may use to investigate, visualize, and predict how various individual characteristics affect emergent crowd behavior.
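
A minimal agent-based sketch in that spirit, in Python (hypothetical update rules and parameter values loosely following assumptions (a)-(d) above, not the dissertation's actual algorithm):

```python
import numpy as np

rng = np.random.default_rng(0)
N, IDENTITIES, RADIUS, STEPS = 200, 5, 5.0, 100

pos = rng.uniform(0, 100, size=(N, 2))          # actor positions
identity = rng.integers(0, IDENTITIES, size=N)  # (a) 5 social identities
aggression = rng.beta(2, 8, size=N)             # initial aggression levels

for _ in range(STEPS):
    for i in range(N):
        d = np.linalg.norm(pos - pos[i], axis=1)
        near = (d < RADIUS) & (d > 0)
        same = near & (identity == identity[i])
        if same.any():
            # (d) aggression drifts toward the in-group local mean
            aggression[i] += 0.1 * (aggression[same].mean() - aggression[i])
            # (b) congregate: drift toward the in-group local centroid
            pos[i] += 0.2 * (pos[same].mean(axis=0) - pos[i])
        pos[i] += rng.normal(0, 0.5, size=2)     # random milling about

print("actors above a violence threshold:", np.sum(aggression > 0.6))
```

Even with these deliberately simple rules, aggression becomes spatially clustered within identity groups rather than uniform across the crowd, which is the kind of emergent, group-level pattern the simulation is designed to investigate.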