941 results for Simulation models


Relevance: 30.00%

Publisher:

Abstract:

The project presented here aims to define and implement a simulation model of the coordination and assignment of emergency services at traffic accidents. The model was defined using Coloured Petri Nets and implemented with the Rockwell Arena 7.0 software. The first simulation is a theoretical queue-based model, while the second is a more complete and realistic one thanks to a connection, via the CORBA platform, to a database holding geographic information on the fleets and routes. As a result of the study, and with the help of Google Earth, we can run graphical simulations showing the generated accidents, the service fleets, and the movement of vehicles from their bases to the accident sites.

Relevance: 30.00%

Publisher:

Abstract:

We present a stylized intertemporal forward-looking model that accommodates key regional economic features, an area where the literature is not well developed. The main difference from standard applications is the role of saving and its implication for the balance of payments: though agents maintain dynamic forward-looking behaviour, the rate of private saving is exogenously determined, so no neoclassical financial adjustment is needed. We also focus on the similarities and differences between myopic and forward-looking models, highlighting the divergences among the main adjustment equations and in the resulting simulation outcomes.

Relevance: 30.00%

Publisher:

Abstract:

Objectives: Several population pharmacokinetic (PPK) and pharmacokinetic-pharmacodynamic (PK-PD) analyses have been performed with the anticancer drug imatinib. Inspired by the approach of meta-analysis, we aimed to compare and combine results from published studies in a useful way, in particular to improve the clinical interpretation of imatinib concentration measurements in the scope of therapeutic drug monitoring (TDM). Methods: Original PPK analyses and PK-PD studies (PK surrogate: trough concentration Cmin; PD outcomes: optimal early response and specific adverse events) were searched systematically on MEDLINE. From each identified PPK model, a predicted concentration distribution under standard dosage was derived through 1000 simulations (NONMEM), after standardizing model parameters to common covariates. A "reference range" was calculated from pooled simulated concentrations in a semi-quantitative approach (without specific weighting) over the whole dosing interval. Meta-regression summarized relationships between Cmin and optimal/suboptimal early treatment response. Results: Nine PPK models and six relevant PK-PD reports in CML patients were identified. Model-based predicted median Cmin ranged from 555 to 1388 ng/ml (grand median: 870 ng/ml; inter-quartile range: 520-1390 ng/ml). The probability of achieving optimal early response was predicted to increase from 60 to 85% between 520 and 1390 ng/ml across PK-PD studies (odds ratio for doubling Cmin: 2.7). Reporting of specific adverse events was too heterogeneous to perform a regression analysis; the overall frequencies of anemia, rash and fluid retention nevertheless increased consistently with Cmin, though less steeply than response probability. Conclusions: Predicted drug exposure may differ substantially between PPK analyses; in this review, the heterogeneity was mainly attributable to two "outlying" models. The established reference range appears to cover the interval where both good efficacy and acceptable tolerance are expected for most patients. TDM-guided dose adjustment therefore appears justified for imatinib in CML patients; its usefulness now remains to be validated prospectively in a randomized trial.
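
As a worked illustration of the exposure-response relationship reported above, the following sketch converts the odds ratio of 2.7 per doubling of Cmin into predicted response probabilities. The logistic form, and its anchoring at a 60% response probability for Cmin = 520 ng/ml, are inferred from the figures quoted in the abstract; the helper name is hypothetical.

import math

# Worked example using the abstract's figures: response probability as a
# logistic function of imatinib trough concentration Cmin, with an odds
# ratio of 2.7 for each doubling of Cmin, anchored at p = 0.60 for
# Cmin = 520 ng/ml (the lower inter-quartile bound).
OR_PER_DOUBLING = 2.7
P_AT_520 = 0.60

def response_probability(cmin_ng_ml: float) -> float:
    """Predicted probability of optimal early response at a given Cmin."""
    logit_520 = math.log(P_AT_520 / (1 - P_AT_520))
    doublings = math.log2(cmin_ng_ml / 520.0)
    logit = logit_520 + doublings * math.log(OR_PER_DOUBLING)
    return 1.0 / (1.0 + math.exp(-logit))

for cmin in (520, 870, 1390):
    print(f"Cmin = {cmin:4d} ng/ml -> P(optimal response) ~ {response_probability(cmin):.2f}")
# Cmin = 1390 ng/ml gives ~0.86, consistent with the reported rise from 60 to 85%.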

Relevance: 30.00%

Publisher:

Abstract:

Given a model that can be simulated, conditional moments at a trial parameter value can be calculated with high accuracy by applying kernel smoothing methods to a long simulation. With such conditional moments in hand, standard method of moments techniques can be used to estimate the parameter. Because the conditional moments are calculated by kernel smoothing rather than simple averaging, the model need not be simulable subject to the conditioning information that defines the moment conditions. For this reason, the proposed estimator is applicable to general dynamic latent variable (DLV) models. We show that the estimator is consistent as the number of simulations diverges, and a higher-order expansion reveals the stochastic difference between the infeasible GMM estimator based on the same moment conditions and its simulated version. In particular, we show how to adjust standard errors to account for the simulations. Monte Carlo results illustrate how the estimator may be applied to a range of DLV models and show that it performs well in comparison with several other estimators proposed for such models.
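
A minimal sketch of the idea, for a toy AR(1) model rather than a genuine latent variable model: conditional moments at a trial parameter are estimated by Nadaraya-Watson kernel smoothing of a long simulation, and the resulting moment conditions are minimized. The function names, bandwidth, and identity weighting are illustrative assumptions, not the paper's implementation.

import numpy as np
from scipy.optimize import minimize_scalar

def simulate(rho, n, rng):
    """Simulate the toy AR(1) model y_t = rho*y_{t-1} + eps_t."""
    eps = rng.standard_normal(n)
    y = np.zeros(n)
    for t in range(1, n):
        y[t] = rho * y[t - 1] + eps[t]
    return y

def nw_conditional_mean(x_sim, y_sim, x_eval, h):
    """Nadaraya-Watson kernel estimate of E[y | x] at the points x_eval."""
    d = (x_eval[:, None] - x_sim[None, :]) / h
    w = np.exp(-0.5 * d ** 2)
    return (w * y_sim[None, :]).sum(axis=1) / w.sum(axis=1)

data = simulate(0.6, 200, np.random.default_rng(0))   # "observed" data, true rho = 0.6
x_d, y_d = data[:-1], data[1:]

def gmm_objective(rho, S=10_000, h=0.2):
    # Common random numbers: reusing the seed keeps the objective smooth in rho.
    sim = simulate(rho, S, np.random.default_rng(123))
    m = nw_conditional_mean(sim[:-1], sim[1:], x_d, h)
    g = y_d - m                       # moment condition: E[y_t - E_rho(y_t | y_{t-1})] = 0
    gbar = np.array([g.mean(), (g * x_d).mean()])
    return gbar @ gbar                # identity weighting, for simplicity

res = minimize_scalar(gmm_objective, bounds=(0.0, 0.95), method="bounded")
print("estimated rho:", round(res.x, 3))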

Relevance: 30.00%

Publisher:

Abstract:

BACKGROUND: In vitro aggregating brain cell cultures containing all types of brain cells have been shown to be useful for neurotoxicological investigations. The cultures are used to detect nervous system-specific effects of compounds by measuring multiple endpoints, including changes in enzyme activities, with concentration-dependent neurotoxicity determined at several time points. METHODS: A Markov model was set up to describe the dynamics of brain cell populations exposed to potentially neurotoxic compounds. Brain cells were assumed to be in either a healthy or a stressed state, with only stressed cells susceptible to cell death; cells could switch between these states or die, with concentration-dependent transition rates. Since cell numbers were not directly measurable, intracellular lactate dehydrogenase (LDH) activity was used as a surrogate. Assuming that changes in cell numbers are proportional to changes in intracellular LDH activity, stochastic enzyme activity models were derived. Maximum likelihood and least squares regression techniques were applied to estimate the transition rates, and likelihood ratio tests were performed to test hypotheses about them. Simulation studies were used to investigate the performance of the transition rate estimators and to analyze the error rates of the likelihood ratio tests. The stochastic time-concentration activity model, which describes transitions from healthy to stressed cells and from stressed cells to death, was applied to intracellular LDH activity measured after 7 and 14 days of continuous exposure to propofol. RESULTS: The model predicted that propofol would affect stressed cells more than healthy cells. Increasing the propofol concentration from 10 to 100 μM halved the mean waiting time for transition to the stressed state, from 14 to 7 days, whereas the mean time to cellular death fell far more sharply, from 2.7 days to 6.5 hours. CONCLUSION: The proposed stochastic modeling approach can be used to discriminate between different biological hypotheses regarding the effect of a compound on the transition rates, and the effects of different compounds on the estimated rates can be compared quantitatively. Data can also be extrapolated to late measurement time points, to investigate whether costly and time-consuming long-term experiments could be eliminated.
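
A minimal sketch of the healthy-to-stressed-to-dead structure, using the mean waiting times quoted above for propofol. The Monte Carlo helper and the cell count are illustrative assumptions, not the paper's estimation code.

import numpy as np

# Three-state Markov model: healthy -> stressed -> dead, with exponential
# waiting times whose means come from the propofol example in the abstract.
rng = np.random.default_rng(1)

RATES = {  # per-day transition rates = 1 / mean waiting time
    10.0:  {"h_to_s": 1 / 14.0, "s_to_d": 1 / 2.7},
    100.0: {"h_to_s": 1 / 7.0,  "s_to_d": 1 / (6.5 / 24.0)},  # 6.5 h in days
}

def fraction_alive(conc, t_days, n_cells=10_000):
    """Monte Carlo estimate of the fraction of cells still alive at t_days.
    Intracellular LDH activity is assumed proportional to this fraction."""
    r = RATES[conc]
    t_stress = rng.exponential(1 / r["h_to_s"], n_cells)   # healthy -> stressed
    t_death = t_stress + rng.exponential(1 / r["s_to_d"], n_cells)
    return float((t_death > t_days).mean())

for conc in (10.0, 100.0):
    for day in (7, 14):
        print(f"{conc:5.0f} uM, day {day:2d}: alive ~ {fraction_alive(conc, day):.2f}")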

Relevance: 30.00%

Publisher:

Abstract:

BACKGROUND: The ambition of most molecular biologists is to understand the intricate network of molecular interactions that control biological systems. As scientists uncover the components and the connectivity of these networks, it becomes possible to study their dynamical behavior as a whole and to discover the specific role of each component. Since the behavior of a network is by no means intuitive, computational models are necessary to understand it and to make predictions about it. Unfortunately, most current computational models describe small networks, owing to the scarcity of kinetic data. To overcome this problem, we previously published a methodology to convert a signaling network into a dynamical system, even in the total absence of kinetic information. In this paper we present a software implementation of that methodology. RESULTS: We developed SQUAD, a software package for the dynamic simulation of signaling networks using the standardized qualitative dynamical systems approach. SQUAD converts the network into a discrete dynamical system and uses a binary decision diagram algorithm to identify all of its steady states. The software then builds a continuous dynamical system and localizes its steady states, which lie near those of the discrete system. Simulations can be run on the continuous system, with several parameters open to modification. Importantly, SQUAD includes a framework for perturbing networks in a manner similar to experimental laboratory protocols, for example by activating receptors or knocking out molecular components. Using this software we have successfully reproduced the behavior of the regulatory network implicated in T-helper cell differentiation. CONCLUSION: The simulation of regulatory networks aims at predicting the behavior of a whole system under stimuli, such as drugs, or at determining the role of specific components within the network; the predictions can then be used to interpret and/or drive laboratory experiments. SQUAD provides a user-friendly graphical interface, accessible to both computational and experimental biologists, for the fast qualitative simulation of large regulatory networks for which kinetic data are not necessarily available.
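
To make the discrete half of the pipeline concrete, here is a brute-force steady-state search for a hypothetical three-gene Boolean network; SQUAD itself relies on a binary decision diagram algorithm precisely because such exhaustive enumeration does not scale to large networks.

from itertools import product

# Hypothetical 3-node Boolean network (not from the paper):
update = {
    "A": lambda s: s["C"],                 # A is activated by C
    "B": lambda s: s["A"] and not s["C"],  # B needs A and absence of C
    "C": lambda s: s["A"] or s["B"],       # C is activated by A or B
}

def steady_states(update):
    """Enumerate all states and keep those fixed under synchronous update."""
    nodes = sorted(update)
    fixed = []
    for bits in product([False, True], repeat=len(nodes)):
        state = dict(zip(nodes, bits))
        if all(update[n](state) == state[n] for n in nodes):
            fixed.append(state)
    return fixed

for s in steady_states(update):
    print({n: int(v) for n, v in s.items()})
# Prints the all-off state and the A=C=1, B=0 state for this toy network.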

Relevance: 30.00%

Publisher:

Abstract:

In this work we present numerical simulations of continuous-flow left ventricular assist device implantation, with the aim of comparing differences in flow rates and pressure patterns depending on the location of the anastomosis and the rotational speed of the device. Although the descending aorta anastomosis is less invasive, since it requires neither a sternotomy nor a cardiopulmonary bypass, its benefits are still controversial. Moreover, the device rotational speed must be chosen correctly to avoid anomalous flow rates and pressure distributions in specific locations of the cardiovascular tree. To assess the differences between the two approaches and between rotational speeds in terms of flow rate and pressure waveforms, we set up numerical simulations on a network of one-dimensional models in which an outflow cannula is anastomosed to different locations of the aorta. We then use the resulting network to compare the two cannulations for several stages of heart failure and several rotational speeds of the device. The inflow boundary data for the heart and the cannulas are obtained from a lumped-parameter model of the entire circulatory system with an assist device, validated against clinical data. The results show that ascending and descending aorta cannulations lead to similar waveforms and mean flow rates in all the considered cases. Moreover, regardless of the anastomosis region, the rotational speed of the device has an important impact on wave profiles; this effect is more pronounced at high RPM.
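
The lumped-parameter ingredient can be sketched as follows. This toy model, with a prescribed ventricular pressure waveform, a single RC windkessel afterload, and a linear pump characteristic with made-up coefficients, only illustrates how mean pressure and flow respond to rotational speed; it is not the validated closed-loop model used in the paper.

import numpy as np

T = 0.8              # cardiac period [s]
C = 1.2              # arterial compliance [ml/mmHg]
R = 1.0              # systemic resistance [mmHg*s/ml]

def p_lv(t):
    """Prescribed left-ventricular pressure waveform [mmHg]."""
    phase = (t % T) / T
    return 8.0 + 112.0 * np.sin(np.pi * phase / 0.4) ** 2 * (phase < 0.4)

def simulate(rpm, t_end=8.0, dt=1e-3):
    k_speed, k_head = 8.0e-3, 0.15        # hypothetical pump coefficients
    pa, pas, flows = 70.0, [], []
    for t in np.arange(0.0, t_end, dt):
        head = pa - p_lv(t)               # pressure rise the pump must supply
        q_pump = max(0.0, k_speed * rpm - k_head * head)
        q_valve = max(0.0, (p_lv(t) - pa) / 0.05)   # aortic valve as a low resistance
        pa += dt * (q_pump + q_valve - pa / R) / C  # RC windkessel update
        pas.append(pa)
        flows.append(q_pump + q_valve)
    last = int(T / dt)                    # average over the final cardiac cycle
    return np.mean(pas[-last:]), np.mean(flows[-last:]) * 60 / 1000

for rpm in (8000, 10000, 12000):
    map_, q = simulate(rpm)
    print(f"{rpm:5d} RPM: mean arterial pressure ~ {map_:5.1f} mmHg, flow ~ {q:.1f} l/min")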

Relevance: 30.00%

Publisher:

Abstract:

The identification of genetically homogeneous groups of individuals is a long-standing issue in population genetics. A recent Bayesian algorithm implemented in the software STRUCTURE allows the identification of such groups. However, the ability of this algorithm to detect the true number of clusters (K) in a sample of individuals has not been tested for cases where patterns of dispersal among populations are not homogeneous. The goal of this study is to carry out such tests, using various dispersal scenarios from data generated with an individual-based model. We found that in most cases the estimated 'log probability of data' does not provide a correct estimate of the number of clusters, K. However, using an ad hoc statistic, DeltaK, based on the rate of change in the log probability of data between successive K values, we found that STRUCTURE accurately detects the uppermost hierarchical level of structure for the scenarios we tested. As might be expected, the results are sensitive to the type of genetic marker used (AFLP vs. microsatellite), the number of loci scored, the number of populations sampled, and the number of individuals typed in each sample.
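
For concreteness, a sketch of the DeltaK statistic described above. One common formalization (assumed here) uses the absolute second difference of L(K) across successive K values, averaged over replicate runs and scaled by the standard deviation of L(K) across those runs; the log-probability values below are made up for illustration.

import numpy as np

logp = {  # K -> log P(data) from several replicate STRUCTURE runs (hypothetical)
    1: [-5200, -5205, -5198],
    2: [-4300, -4310, -4295],
    3: [-4250, -4240, -4260],
    4: [-4245, -4230, -4270],
}

def delta_k(logp):
    """DeltaK(K) = mean(|L(K+1) - 2 L(K) + L(K-1)|) / sd(L(K)) over runs."""
    ks = sorted(logp)
    out = {}
    for i, k in enumerate(ks[1:-1], start=1):
        second_diff = [a - 2 * b + c for a, b, c in
                       zip(logp[ks[i + 1]], logp[ks[i]], logp[ks[i - 1]])]
        out[k] = np.mean(np.abs(second_diff)) / np.std(logp[k])
    return out

for k, dk in delta_k(logp).items():
    print(f"K = {k}: DeltaK = {dk:.1f}")
# The K with the largest DeltaK is taken as the uppermost level of structure.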

Relevance: 30.00%

Publisher:

Abstract:

The pharmacokinetic determinants of successful antibiotic prophylaxis of endocarditis are not precisely known. Differences in antibiotic half-lives between animals and humans preclude extrapolating animal results to the human situation. To overcome this limitation, we mimicked in rats the amoxicillin kinetics observed in humans after a 3-g oral dose (as often used for prophylaxis of endocarditis) by delivering the drug through a computerized pump. Rats with catheter-induced vegetations were challenged with either of two strains of antibiotic-tolerant viridans group streptococci. Antibiotics were given either through the pump (to simulate the whole kinetic profile during prophylaxis in humans) or as an intravenous bolus that imitated only the peak level of amoxicillin in human serum (18 mg/liter). Prophylaxis by intravenous bolus was inoculum dependent and afforded limited protection only in rats challenged with the minimum inoculum size infecting ≥90% of untreated controls. In contrast, simulation of the human kinetics significantly protected animals challenged with 10 to 100 times the inoculum of either test organism infecting ≥90% of untreated controls. Thus, simulating the full profile of amoxicillin prophylaxis in human serum was more efficacious than merely imitating the transient peak level in rats. This confirms previous studies suggesting that the duration for which the serum amoxicillin level remains detectable, and not only the magnitude of the peak, is an important parameter in successful prophylaxis of endocarditis. The results also suggest that single-dose prophylaxis with 3 g of amoxicillin in humans might be more effective than predicted by conventional animal models in which only the peak human serum level of antibiotic was simulated.
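
The contrast between the two regimens can be sketched with a one-compartment model: a Bateman-type absorption curve standing in for the pump-reproduced human profile, versus a mono-exponential rat bolus matched only at the 18 mg/liter peak. All rate constants and the detection threshold are illustrative assumptions, not the study's measured values.

import numpy as np

ka, ke = 1.5, np.log(2) / 1.0       # absorption rate [1/h]; ~1 h human half-life
t = np.linspace(0.0, 12.0, 400)     # hours after dosing

def oral_profile(t, cmax=18.0):
    """Bateman function scaled so the peak matches 18 mg/liter."""
    shape = np.exp(-ke * t) - np.exp(-ka * t)
    return cmax * shape / shape.max()

def rat_bolus(t, cmax=18.0, ke_rat=np.log(2) / 0.3):   # ~20 min rat half-life
    return cmax * np.exp(-ke_rat * t)

for profile, name in ((oral_profile, "simulated human profile"),
                      (rat_bolus, "rat i.v. bolus")):
    c = profile(t)
    above = t[c >= 0.25]            # time above a nominal 0.25 mg/liter threshold
    print(f"{name}: detectable for ~ {above.max():.1f} h")
# The human-like profile stays detectable several times longer than the bolus,
# which is the pharmacokinetic point the study makes.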

Relevance: 30.00%

Publisher:

Abstract:

This thesis investigates the extremal properties of certain risk models of interest in various applications from insurance, finance and statistics. It develops along two principal lines. In the first part, we focus on two univariate risk models, namely a deflated risk model and a reinsurance risk model, and investigate their tail expansions under certain tail conditions on the common risks. The main results are illustrated by typical examples and numerical simulations, and are then applied in insurance, for instance to approximations of Value-at-Risk and conditional tail expectations. The second part of the thesis is devoted to three bivariate models. The first model concerns bivariate censoring of extreme events; for this model we propose a class of estimators for both the tail dependence coefficient and the tail probability. These estimators are flexible thanks to a tuning parameter, and their asymptotic distributions are obtained under second-order bivariate slow variation conditions on the model. We give some examples, present a small Monte Carlo simulation study, and apply the estimators to a real insurance data set. The objective of the second bivariate risk model is the investigation of the tail dependence coefficients of bivariate skew slash distributions. Such distributions are widely useful in statistical applications; they are generated mainly by normal mean-variance mixtures and scaled skew-normal mixtures, which distinguish the tail dependence structure, as shown by our principal results. The third bivariate risk model concerns the approximation of the component-wise maxima of skew elliptical triangular arrays; the theoretical results rest on certain tail assumptions on the underlying random radius.

Relevance: 30.00%

Publisher:

Abstract:

Early detection of landslide surface deformation with 3D remote sensing techniques such as terrestrial laser scanning (TLS) has become a great challenge during the last decade. To improve our understanding of landslide deformation, a series of analogue simulations was carried out on non-rigid bodies coupled with a 3D digitizer. All experiments were conducted under controlled conditions, such as water level and slope inclination. We were thus able to follow the 3D surface deformation of complex landslide bodies from precursory deformation through to larger failures. These experiments formed the basis for the development of a new algorithm that quantifies surface deformation by automatically tracking discrete points on the slope surface. To validate the algorithm, surface displacements obtained manually were compared with those produced by the algorithm. The outputs help in understanding 3D deformation during pre-failure stages and failure mechanisms, both fundamental aspects for the future implementation of 3D remote sensing techniques in early warning systems.
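
The point-tracking step can be illustrated with a simple nearest-neighbour matching between two synthetic scan epochs; this is a hypothetical stand-in, not the algorithm developed in the work.

import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(42)
epoch0 = rng.uniform(0.0, 1.0, size=(500, 3))          # synthetic scan, epoch 0
epoch1 = epoch0 + np.array([0.02, 0.0, -0.01])         # imposed downslope motion
epoch1 += rng.normal(0.0, 0.001, epoch1.shape)         # measurement noise

markers = epoch0[:20]                                  # discrete tracked points
tree = cKDTree(epoch1)                                 # nearest-neighbour index
_, idx = tree.query(markers)
displacement = epoch1[idx] - markers                   # 3D displacement vectors

print("mean displacement vector:", displacement.mean(axis=0).round(4))
print("mean 3D displacement magnitude:",
      round(float(np.linalg.norm(displacement, axis=1).mean()), 4))
# Recovers the imposed [0.02, 0, -0.01] motion because the displacement is
# small relative to the point spacing; real scans need a more robust matcher.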

Relevance: 30.00%

Publisher:

Abstract:

The dynamic properties of helix 12 in the ligand binding domain of nuclear receptors are a major determinant of AF-2 domain activity. We investigated the molecular and structural basis of helix 12 mobility, as well as the contribution of individual residues to the constitutive and ligand-dependent transcriptional activity of peroxisome proliferator-activated receptor alpha (PPARalpha). Functional assays of the activity of PPARalpha helix 12 mutants were combined with free-energy molecular dynamics simulations, and the agreement between the results of these two approaches allows us to make robust claims concerning the mechanisms that govern helix 12 functions. Our data support a model in which PPARalpha helix 12 transiently adopts a relatively stable active conformation even in the absence of a ligand. This conformation provides the interface for the recruitment of a coactivator and results in constitutive activity; receptor agonists stabilize it further and increase the transcription activation potential of PPARalpha. Finally, we identify residues in the PPARalpha AF-2 domain that determine the positioning of helix 12 in the active conformation in the absence of a ligand; substituting these residues suppresses PPARalpha constitutive activity without changing its ligand-dependent activation potential.

Relevance: 30.00%

Publisher:

Abstract:

Gene-on-gene regulations are key components of every living organism. Dynamical abstract models of genetic regulatory networks help explain the genome's evolvability and robustness. These properties can be attributed to the structural topology of the graph formed by genes, as vertices, and regulatory interactions, as edges. Moreover, the actual interactions of each gene are believed to play a key role in the stability of the structure. With advances in biology, effort has been devoted to developing update functions for Boolean models that incorporate this recent knowledge. Here we combine real-life gene interaction networks with novel update functions in a Boolean model. We use two sub-networks of biological organisms, the yeast cell cycle and the mouse embryonic stem cell, as topological support for our system. On these structures, we replace the original random update functions with a novel threshold-based dynamic function in which the promoting and repressing effect of each interaction is considered. We use a third real-life regulatory network, along with its inferred Boolean update functions, to validate the proposed update function; the results of this validation hint at the increased biological plausibility of the threshold-based function. To investigate the dynamical behavior of the new model, we visualized the phase transition between order and chaos through the critical regime using Derrida plots, and complemented their qualitative nature with an alternative measure, the criticality distance, which discriminates between regimes quantitatively. Simulations on both real-life genetic regulatory networks show that there exists a set of parameters that allows the systems to operate in the critical region. Because the new model includes experimentally derived biological information and recent discoveries, it is potentially useful for guiding experimental research. The update function confers additional realism on the model while reducing its complexity and solution space, making it easier to investigate.
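
A minimal sketch of a threshold-based update rule and of the data behind a Derrida plot, on a random signed network rather than the yeast or stem-cell topologies; the network, threshold, and sample sizes are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(3)
N = 50
W = rng.choice([-1, 0, 0, 1], size=(N, N))    # sparse signed interactions

def step(x, theta=0.0):
    """Threshold update: a gene turns on when promoting (+1) minus
    repressing (-1) inputs exceed theta."""
    return (W @ x > theta).astype(int)

def derrida_point(d, samples=200):
    """Mean one-step Hamming distance of state pairs starting d bits apart."""
    out = []
    for _ in range(samples):
        x = rng.integers(0, 2, N)
        y = x.copy()
        flip = rng.choice(N, size=d, replace=False)
        y[flip] ^= 1
        out.append(np.sum(step(x) != step(y)))
    return np.mean(out) / N

for d in (1, 2, 5, 10):
    print(f"H(t) = {d / N:.2f} -> H(t+1) ~ {derrida_point(d):.2f}")
# Points above the diagonal indicate chaotic dynamics, below it ordered
# dynamics; the critical regime lies along the diagonal.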

Relevance: 30.00%

Publisher:

Abstract:

The paper proposes an approach for detecting the model parameter combinations that give the most representative description of uncertainty in model performance. A classification problem is posed to find the regions of good-fitting models according to the values of a cost function. Support Vector Machine (SVM) classification in the parameter space is applied to decide whether a forward model simulation should be computed for a particular generated model. SVMs are specifically designed to tackle classification problems in high-dimensional spaces in a non-parametric and non-linear way. The SVM decision boundaries delineate the regions subject to the largest uncertainty in the cost function classification and therefore provide guidelines for further iterative exploration of the model space. The proposed approach is illustrated with a synthetic example of fluid flow through porous media, which features a highly variable response due to the combination of parameter values.
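
A minimal sketch of the loop, assuming scikit-learn's SVC and a cheap quadratic stand-in for the expensive forward flow simulation: label sampled parameter vectors by the cost function, fit the classifier, and use the decision function to select the next candidates, since points near the boundary carry the largest classification uncertainty.

import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(7)

def cost(theta):
    """Toy cost function standing in for the forward model misfit."""
    return np.sum((theta - 0.3) ** 2, axis=1)

X = rng.uniform(0, 1, size=(300, 2))    # sampled model parameters
y = (cost(X) < 0.05).astype(int)        # 1 = good-fitting model

clf = SVC(kernel="rbf", C=10.0).fit(X, y)

# Candidates nearest the decision boundary are the most uncertain, so they
# are the ones worth running the forward simulation on next.
candidates = rng.uniform(0, 1, size=(1000, 2))
margin = np.abs(clf.decision_function(candidates))
next_to_run = candidates[np.argsort(margin)[:10]]
print(next_to_run.round(3))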

Relevance: 30.00%

Publisher:

Abstract:

This paper illustrates the philosophy that forms the basis of calibration exercises in general equilibrium macroeconomic models and the details of the procedure, along with the advantages and disadvantages of the approach, with particular reference to the issue of testing "false" economic models. We provide an overview of the most recent simulation-based approaches to the testing problem and compare them to standard econometric methods used to test the fit of non-linear dynamic general equilibrium models. We illustrate how simulation-based techniques can be used to formally evaluate the fit of a calibrated model to the data, and to obtain ideas on how to improve the model design, using a standard problem in the international real business cycle literature: whether a model with complete financial markets and no restrictions on capital mobility is able to reproduce the second-order properties of aggregate saving and aggregate investment in an open economy.
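
The evaluation logic can be sketched as follows: simulate the calibrated model repeatedly, form the sampling distribution of a second-order statistic such as the saving-investment correlation, and locate the data value within it. The two-series AR(1) "economy" and every number below are illustrative assumptions, not the paper's model.

import numpy as np

rng = np.random.default_rng(11)
T, reps, rho = 120, 1000, 0.9           # quarters, replications, persistence

def simulate_economy():
    """Toy calibrated model: two persistent series sharing a common shock."""
    shocks = rng.standard_normal((2, T)) * 0.01
    series = np.zeros((2, T))           # row 0: saving, row 1: investment
    for t in range(1, T):
        common = 0.4 * shocks[0, t]     # common component links the series
        series[:, t] = rho * series[:, t - 1] + shocks[:, t] + common
    return series

sim_corr = np.array([np.corrcoef(simulate_economy())[0, 1] for _ in range(reps)])
data_corr = 0.65                        # hypothetical value measured in the data

tail_prob = np.mean(sim_corr >= data_corr)   # how often the model matches or exceeds it
print(f"model corr: median {np.median(sim_corr):.2f}, "
      f"P(sim >= data) = {tail_prob:.2f}")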