984 results for stochastic simulation
Abstract:
Objectives: Several population pharmacokinetic (PPK) and pharmacokinetic-pharmacodynamic (PK-PD) analyses have been performed with the anticancer drug imatinib. Inspired by the approach of meta-analysis, we aimed to compare and combine results from published studies in a useful way, in particular to improve the clinical interpretation of imatinib concentration measurements in the scope of therapeutic drug monitoring (TDM). Methods: Original PPK analyses and PK-PD studies (PK surrogate: trough concentration Cmin; PD outcomes: optimal early response and specific adverse events) were searched systematically on MEDLINE. From each identified PPK model, a predicted concentration distribution under standard dosage was derived through 1000 simulations (NONMEM), after standardizing model parameters to common covariates. A "reference range" was calculated from pooled simulated concentrations in a semi-quantitative approach (without specific weighting) over the whole dosing interval. Meta-regression summarized the relationships between Cmin and optimal/suboptimal early treatment response. Results: Nine PPK models and six relevant PK-PD reports in CML patients were identified. Model-based predicted median Cmin ranged from 555 to 1388 ng/ml (grand median: 870 ng/ml; inter-quartile range: 520-1390 ng/ml). Across PK-PD studies, the probability of achieving optimal early response was predicted to increase from 60% to 85% as Cmin rose from 520 to 1390 ng/ml (odds ratio for a doubling of Cmin: 2.7). Reporting of specific adverse events was too heterogeneous to perform a regression analysis. The overall frequency of anemia, rash, and fluid retention nevertheless increased consistently with Cmin, although less steeply than response probability. Conclusions: Predicted drug exposure may differ substantially between PPK analyses. In this review, heterogeneity was mainly attributed to two "outlying" models.
The established reference range seems to cover the range where both good efficacy and acceptable tolerance are expected for most patients. TDM-guided dose adjustment therefore appears justified for imatinib in CML patients. Its usefulness now remains to be validated prospectively in a randomized trial.
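As a rough illustration of the pooling step described above, the Python sketch below simulates Cmin distributions from several hypothetical lognormal models, pools them without weighting, and anchors a logistic exposure-response curve to the reported odds ratio of 2.7 per doubling of Cmin. The per-model medians, the variability assumption, and the 60% response anchor at 520 ng/ml are illustrative figures inspired by the abstract, not the published model parameters.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical per-model lognormal Cmin distributions (ng/ml); medians and
# variability are illustrative, not the published PPK model parameters.
model_medians = [555, 700, 870, 1000, 1388]
cv = 0.45  # assumed coefficient of variation of Cmin
sigma = np.sqrt(np.log(1 + cv ** 2))

# 1000 simulated patients per model, pooled without weighting,
# mirroring the semi-quantitative pooling described in the abstract.
pooled = np.concatenate([
    rng.lognormal(mean=np.log(m), sigma=sigma, size=1000)
    for m in model_medians
])

grand_median = np.median(pooled)
q1, q3 = np.percentile(pooled, [25, 75])
print(f"grand median: {grand_median:.0f} ng/ml, IQR: {q1:.0f}-{q3:.0f} ng/ml")

# Logistic exposure-response relation, anchored to the reported odds ratio
# of 2.7 per doubling of Cmin and an assumed 60% response at 520 ng/ml.
beta = np.log(2.7)                                   # per log2(Cmin)
alpha = np.log(0.6 / 0.4) - beta * np.log2(520)

def p_response(cmin):
    return 1 / (1 + np.exp(-(alpha + beta * np.log2(cmin))))

print(f"P(optimal response) at 1390 ng/ml: {p_response(1390):.2f}")
```

By construction, the curve passes through 60% at 520 ng/ml and reaches roughly the 85% reported at the upper end of the inter-quartile range.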
Abstract:
Given a model that can be simulated, conditional moments at a trial parameter value can be calculated with high accuracy by applying kernel smoothing methods to a long simulation. With such conditional moments in hand, standard method of moments techniques can be used to estimate the parameter. Because conditional moments are calculated using kernel smoothing rather than simple averaging, it is not necessary that the model be simulable subject to the conditioning information that is used to define the moment conditions. For this reason, the proposed estimator is applicable to general dynamic latent variable models. It is shown that as the number of simulations diverges, the estimator is consistent, and a higher-order expansion reveals the stochastic difference between the infeasible GMM estimator based on the same moment conditions and the simulated version. In particular, we show how to adjust standard errors to account for the simulations. Monte Carlo results show how the estimator may be applied to a range of dynamic latent variable (DLV) models, and that it performs well in comparison to several other estimators that have been proposed for DLV models.
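A minimal sketch of the core idea — estimating a conditional moment by Nadaraya-Watson kernel smoothing over a long simulated path and minimizing a GMM-style criterion — is given below for a toy latent-AR(1) model. The model, bandwidth, instrument, and grid are all illustrative choices, not those of the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate(theta, n):
    """Toy dynamic latent-variable model: latent AR(1) state h_t,
    observation y_t = theta * h_t + noise (illustrative only)."""
    h = np.zeros(n)
    y = np.zeros(n)
    for t in range(1, n):
        h[t] = 0.9 * h[t - 1] + rng.normal()
        y[t] = theta * h[t] + rng.normal()
    return y

def kernel_cond_mean(x_sim, y_sim, x_eval, bw=0.5):
    """Nadaraya-Watson (kernel-smoothed) estimate of E[y | x] from simulated pairs."""
    w = np.exp(-0.5 * ((x_eval[:, None] - x_sim[None, :]) / bw) ** 2)
    return (w @ y_sim) / w.sum(axis=1)

# "Observed" data generated at the true value theta = 1.0.
data = simulate(1.0, 500)
x_d, y_d = data[:-1], data[1:]

def criterion(theta, n_sim=5000):
    """Squared moment condition E[(y_t - m_theta(y_{t-1})) * y_{t-1}] ~ 0, with the
    conditional mean m_theta obtained by smoothing one long unconditional simulation;
    note the model never needs to be simulated conditionally on y_{t-1}."""
    sim = simulate(theta, n_sim)
    m_hat = kernel_cond_mean(sim[:-1], sim[1:], x_d)
    g = np.mean((y_d - m_hat) * x_d)
    return g ** 2

for th in (0.4, 1.0, 1.6):
    print(f"theta={th}: criterion={criterion(th):.3f}")
```

In a real application one would use several moment conditions, an optimizer instead of a grid, and the simulation-adjusted standard errors the paper derives.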
Abstract:
Many health care systems today are large, complex, and highly dynamic environments, particularly Emergency Departments (EDs). An ED operates 24 hours a day throughout the year with limited resources, and it is often overcrowded. Simulation is therefore essential to improve ED performance both qualitatively and quantitatively. This improvement can be achieved by modelling and simulating EDs with an Agent-Based Model (ABM) and optimising many different staff scenarios. This work optimises the staff configuration of an ED. To perform the optimisation, objective functions to minimise or maximise must be defined; one of them is to find the optimum staff configuration that minimises patient waiting time. The staff configuration comprises doctors, triage nurses, and admissions staff, in both number and type. Finding the best staff configuration is a combinatorial problem that can take a long time to solve. HPC was used to run the experiments, and encouraging results were obtained. However, even for the basic ED used in this work the search space is very large; as the problem size increases, more processing resources will be needed to obtain results in an acceptable time.
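To make the combinatorial nature of the staff-configuration search concrete, here is a heavily simplified Python sketch: a toy three-stage queue stands in for the agent-based ED model, and an exhaustive search runs over staff amounts per role. Service times, the arrival rate, and the staff ranges are invented for illustration and are not calibrated to any real ED.

```python
import itertools
import random

random.seed(1)

# Toy mean service times in minutes (illustrative, not from a real ED).
STAGES = {"admission": 5.0, "triage": 10.0, "doctor": 20.0}

def mean_waiting_time(staff, n_patients=300, mean_arrival_gap=6.0):
    """Very simplified stand-in for the ABM: patients flow through
    admission -> triage -> doctor; stage i has staff[i] parallel servers."""
    free = {"admission": [0.0] * staff[0],
            "triage":    [0.0] * staff[1],
            "doctor":    [0.0] * staff[2]}   # time each server becomes free
    total_wait, t = 0.0, 0.0
    for _ in range(n_patients):
        t += random.expovariate(1.0 / mean_arrival_gap)  # next arrival time
        ready = t
        for stage in ("admission", "triage", "doctor"):
            servers = free[stage]
            i = min(range(len(servers)), key=servers.__getitem__)
            start = max(ready, servers[i])               # wait if all busy
            total_wait += start - ready
            servers[i] = start + random.expovariate(1.0 / STAGES[stage])
            ready = servers[i]
    return total_wait / n_patients

# Exhaustive search over the combinatorial staff space (1-4 staff per role).
configs = itertools.product(range(1, 5), repeat=3)
best = min(configs, key=mean_waiting_time)
print("best (admissions, triage nurses, doctors):", best)
```

Even this toy version evaluates 64 configurations; with realistic roles, shifts, and patient mixes the space explodes, which is why the authors turn to HPC.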
Gaussian estimates for the density of the non-linear stochastic heat equation in any space dimension
Abstract:
In this paper, we establish lower and upper Gaussian bounds for the probability density of the mild solution to the stochastic heat equation with multiplicative noise and in any space dimension. The driving perturbation is a Gaussian noise which is white in time with some spatially homogeneous covariance. These estimates are obtained using tools of the Malliavin calculus. The most challenging part is the lower bound, which is obtained by adapting a general method developed by Kohatsu-Higa to the underlying spatially homogeneous Gaussian setting. Both lower and upper estimates have the same form: a Gaussian density with a variance which is equal to that of the mild solution of the corresponding linear equation with additive noise.
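Schematically, and with notation that is only suggestive of the paper's (here Φ_t denotes the variance of the mild solution of the linear additive-noise equation, F_0 a centering term, and c_1, c_2, C_1, C_2 constants), both bounds take a common Gaussian form:

```latex
c_1\,\Phi_t^{-1/2}\exp\!\left(-\frac{|y-F_0(t,x)|^2}{c_2\,\Phi_t}\right)
\;\le\; p_{t,x}(y) \;\le\;
C_1\,\Phi_t^{-1/2}\exp\!\left(-\frac{|y-F_0(t,x)|^2}{C_2\,\Phi_t}\right),
```

where p_{t,x} is the probability density of the mild solution u(t,x); the exact constants and centering are those of the paper, not the sketch above.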
Abstract:
In this paper, scales of classes of stochastic processes are introduced. New interpolation theorems and the boundedness of some transforms of stochastic processes are proved. An interpolation method for generously-monotonous processes is introduced. The conditions and statements of the interpolation theorems concern the fixed stochastic process, which differs from the classical results.
Abstract:
This paper presents the Juste-Neige system for predicting the snow height on the ski runs of a resort using multi-agent simulation software. Its aim is to facilitate snow cover management in order to i) reduce the production cost of artificial snow and improve the profit margin of the companies managing the ski resorts; and ii) reduce water and energy consumption, and thus the environmental impact, by producing only the snow needed for a good skiing experience. The software provides maps with the predicted snow heights for up to 13 days, on which the areas most exposed to snow erosion are highlighted. It proceeds in three steps: i) interpolation of snow height measurements with a neural network; ii) local meteorological forecasts for every ski resort; iii) simulation of the impact caused by skiers using a multi-agent system. The software has been evaluated in the Swiss ski resort of Verbier and provides useful predictions.
Abstract:
Introduction: Brugada syndrome, described in 1992 by Pedro and Josep Brugada, is a cardiac syndrome characterized by a particular ST-segment elevation associated with an atypical right bundle branch block in ECG leads V1 to V3. The ECG alterations of Brugada syndrome are classified into 3 types, of which only type 1 is diagnostic. The exact pathophysiological mechanisms of this syndrome remain controversial. Several hypotheses have been proposed in the literature, two of which attract particular attention: 1) the repolarization-disorder model postulates action potentials reduced in duration and amplitude, linked to a change in the distribution of potassium channels; 2) the depolarization-disorder model specifies a conduction delay resulting in delayed depolarization. In STEMI, an ST elevation resembling that of Brugada syndrome is explained by two theories: 1) the diastolic injury current suggests an elevation of the diastolic potential that is artificially transformed into ST elevation by the filters used in all ECG machines. Objective: To recreate the ECG manifestations of Brugada syndrome by applying the modifications of the cardiomyocyte action potential reported in the literature. Method: For this work, we used "ECGsim", a realistic computer ECG simulator freely available at www.ecgsim.org. This program is based on a reconstruction of the surface ECG using 1500 nodes, each representing the action potentials of the right and left ventricles, epicardial and endocardial. The simulated ECG can thus be seen as the integration of all these action potentials, taking into account the conductive properties of the tissues lying between the surface electrodes and the heart. In this program, we defined three zones of different sizes comprising the right ventricular outflow tract.
For each zone, we reproduced the action potential modifications described in the repolarization-disorder and depolarization-disorder models and in the systolic and diastolic injury current theories. In addition to the usual twelve leads, we used an electrode positioned at V2IC3 (i.e. 3rd intercostal space) on the virtual thorax of the ECGsim program. Results: For technical reasons, the repolarization-disorder model could not be fully implemented in this work. The depolarization-disorder model does not reproduce Brugada-type alterations but rather a more or less complete right bundle branch block. The diastolic injury current produces ST elevation by increasing the epicardial diastolic potential of the cardiomyocytes of the right ventricular outflow tract. T-wave inversion appears when the action potential duration is prolonged. The amplitude of the ST elevation depends on the value of the diastolic potential, on the size of the lesion, and on its epicardial or transmural location. The systolic injury current does not cause ST elevation but accentuates the amplitude of the T wave. Discussion and conclusion: In this work, elevation of the diastolic potential combined with prolongation of the action potential duration is the combination that best reproduces the ECG alterations of Brugada syndrome. Persistence of nodal-type cells in the right ventricular outflow tract could explain these particular action potential modifications. The risk of arrhythmia in Brugada syndrome could also be explained by abnormal automaticity of nodal-type cells. Thus, alterations of the cellular mechanisms involved in maintaining the diastolic potential could be present in Brugada syndrome, which, to our knowledge, has never been reported in the literature.
Abstract:
BACKGROUND: The ambition of most molecular biologists is to understand the intricate network of molecular interactions that control biological systems. As scientists uncover the components and the connectivity of these networks, it becomes possible to study their dynamical behavior as a whole and to discover the specific role of each component. Since the behavior of a network is by no means intuitive, computational models are needed to understand it and to make predictions about it. Unfortunately, most current computational models describe small networks, owing to the scarcity of available kinetic data. To overcome this problem, we previously published a methodology to convert a signaling network into a dynamical system, even in the total absence of kinetic information. In this paper we present a software implementation of that methodology. RESULTS: We developed SQUAD, a software package for the dynamic simulation of signaling networks using the standardized qualitative dynamical systems approach. SQUAD converts the network into a discrete dynamical system and uses a binary decision diagram algorithm to identify all the steady states of the system. The software then builds a continuous dynamical system and localizes the steady states that lie near those of the discrete system. Simulations can be run on the continuous system, with the possibility of modifying several parameters. Importantly, SQUAD includes a framework for perturbing networks in a manner similar to experimental laboratory protocols, for example by activating receptors or knocking out molecular components. Using this software we have been able to successfully reproduce the behavior of the regulatory network implicated in T-helper cell differentiation.
CONCLUSION: The simulation of regulatory networks aims at predicting the behavior of a whole system when subjected to stimuli such as drugs, or at determining the role of specific components within the network. The predictions can then be used to interpret and/or drive laboratory experiments. SQUAD provides a user-friendly graphical interface, accessible to both computational and experimental biologists, for the fast qualitative simulation of large regulatory networks for which kinetic data are not necessarily available.
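The discrete step of the pipeline described above — converting a network into a Boolean dynamical system and enumerating its steady states — can be sketched in a few lines of Python. The four-node network below is a hypothetical caricature of a Th1/Th2 switch, not the published T-helper model, and plain enumeration stands in for SQUAD's binary-decision-diagram algorithm.

```python
from itertools import product

# Hypothetical 4-node regulatory toy network (NOT the published T-helper model):
# each rule maps the current Boolean state to a node's next value.
rules = {
    "IFNg":  lambda s: s["Tbet"],
    "Tbet":  lambda s: (s["IFNg"] or s["Tbet"]) and not s["GATA3"],
    "IL4":   lambda s: s["GATA3"],
    "GATA3": lambda s: s["IL4"] and not s["Tbet"],
}
nodes = list(rules)

def step(state):
    """Synchronous update of the discrete dynamical system."""
    return {n: rules[n](state) for n in nodes}

# Exhaustive enumeration of all 2^4 states; a fixed point of `step`
# is a steady state of the discrete system.
steady = [s for bits in product([False, True], repeat=len(nodes))
          if (s := dict(zip(nodes, bits))) == step(s)]
for s in steady:
    print({n: int(v) for n, v in s.items()})
```

For this toy network the enumeration finds three steady states — an all-off "naive" state and two mutually exclusive on-states — mirroring the kind of attractor structure SQUAD then refines in the continuous system.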
Abstract:
Hidden Markov models (HMMs) are probabilistic models well adapted to many tasks in bioinformatics, for example predicting the occurrence of specific motifs in biological sequences. MAMOT is a command-line program for Unix-like operating systems, including MacOS X, that we developed to allow scientists to apply HMMs more easily in their research. One can define the architecture and initial parameters of the model in a text file and then use MAMOT for parameter optimization on example data, for decoding (e.g., predicting motif occurrences in sequences), and for generating stochastic sequences according to the probabilistic model. Two examples for which models are provided are coiled-coil domains in protein sequences and protein binding sites in DNA. Useful features include pseudocounts, state tying, fixing of selected parameters during learning, and the inclusion of prior probabilities in decoding. AVAILABILITY: MAMOT is implemented in C++ and distributed under the GNU General Public Licence (GPL). The software, documentation, and example model files can be found at http://bcf.isb-sib.ch/mamot
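MAMOT itself is a C++ command-line tool driven by a text model file; purely to illustrate the kind of computation such a tool performs, here is a Python sketch of a two-state HMM ("background" vs. a GC-rich "motif") decoded with the Viterbi algorithm. The states, probabilities, and sequence are invented for the example.

```python
import numpy as np

# Two-state toy HMM; all numbers are illustrative.
states = ["background", "motif"]
trans = np.array([[0.9, 0.1],     # P(next state | background)
                  [0.2, 0.8]])    # P(next state | motif)
start = np.array([0.9, 0.1])
emit = {"background": {"A": .25, "C": .25, "G": .25, "T": .25},
        "motif":      {"A": .05, "C": .45, "G": .45, "T": .05}}

def viterbi(seq):
    """Most probable state path, computed in log space."""
    n, k = len(seq), len(states)
    V = np.full((n, k), -np.inf)       # best log-probability per (time, state)
    back = np.zeros((n, k), dtype=int) # backpointers
    for j, s in enumerate(states):
        V[0, j] = np.log(start[j]) + np.log(emit[s][seq[0]])
    for t in range(1, n):
        for j, s in enumerate(states):
            scores = V[t - 1] + np.log(trans[:, j])
            back[t, j] = np.argmax(scores)
            V[t, j] = scores[back[t, j]] + np.log(emit[s][seq[t]])
    path = [int(np.argmax(V[-1]))]
    for t in range(n - 1, 0, -1):      # follow backpointers
        path.append(back[t, path[-1]])
    return [states[j] for j in reversed(path)]

print(viterbi("ATATGCGCGCGCGCATAT"))
```

On this sequence the decoder labels the central GC-rich run as "motif" and the AT-rich flanks as "background"; MAMOT adds, on top of this core, pseudocounts, state tying, and priors in decoding.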
Abstract:
We have studied human movements and sought a way to create these movements in real time in digital environments, so that the work artists and animators must carry out is reduced. We surveyed the different character-animation techniques currently found in the entertainment industry, as well as the main lines of research, studying in detail the most widely used technique: motion capture. Motion capture makes it possible to record a person's movements using optical sensors, magnetic sensors, and video cameras. This information is stored in files that can later be played back by a character in real time in a digital application. Every recorded movement must be associated with a character; this is the rigging process. One of the points we worked on was the creation of a semi-automatic system for associating the skeleton with the character's mesh, reducing the animator's work in this process. In real-time applications such as virtual reality, the environment in which the characters live is increasingly simulated using Newton's laws, so that every change in the motion of a body results from a force applied to it. Motion capture does not scale well to such environments, because it cannot create, from a recorded animation, new realistic animations that depend on interaction with the environment. The final goal of our work was to create animations from forces in real time, just as happens in reality. To this end, we introduced a muscle model and a balance system on the character, so that it can respond realistically to interactions with the environment simulated using Newton's laws.