144 results for simulation tools


Relevance: 20.00%

Abstract:

Whole-body counting is a technique of choice for assessing the intake of gamma-emitting radionuclides. An appropriate calibration is necessary, performed either by experimental measurement or by Monte Carlo (MC) calculation. The aim of this work was to validate an MC model for calibrating whole-body counters (WBCs) by comparing the results of computations with measurements performed on an anthropomorphic phantom, and to investigate the effect of a change in the phantom's position on the WBC counting sensitivity. The GEANT MC code was used for the calculations, and an IGOR phantom loaded with several types of radionuclides was used for the experimental measurements. The results show reasonable agreement between measurements and MC computations. A 1-cm error in phantom positioning changes the activity estimate by >2%. Considering that a 5-cm deviation in the positioning of the phantom may occur in a realistic counting scenario, the uncertainty of the activity measured by a WBC is ∼10-20%.
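The arithmetic behind the quoted uncertainty can be sketched directly. This is only an illustration, assuming (as a simplification not stated in the abstract) that the counting error grows roughly linearly with displacement:

```python
# Hypothetical illustration of the positioning sensitivity reported above:
# if a 1 cm positioning error shifts the activity estimate by ~2%, a
# first-order linear scaling reproduces the ~10% figure for a 5 cm offset.
error_per_cm = 0.02          # ~2% activity error per cm (from the study)
for displacement_cm in (1, 3, 5):
    rel_error = error_per_cm * displacement_cm
    print(f"{displacement_cm} cm -> ~{rel_error:.0%} activity error")
```

The linear-scaling assumption is the author's reported order of magnitude, not a detector model.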

Relevance: 20.00%

Abstract:

Political participation is often very low in Switzerland, especially among students and young citizens. In the run-up to the Swiss parliamentary election of October 2007, several online tools and campaigns were developed with the aim of increasing not only the level of information about the political programs of parties and candidates, but also the electoral participation of younger citizens. From a practical point of view, this paper describes the development, marketing efforts, distribution and use of two of these tools: the so-called "Parteienkompass" (party compass) and the "myVote" tool, an online voting assistance tool based on an issue-matching system that compares policy preferences between voters and candidates at an individual level. We also take a look at similar Voting Advice Applications (VAAs) in other Western European countries. The paper closes with the results of an evaluation and an outlook on further developments and ongoing projects in Switzerland.
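A typical issue-matching score of the kind such tools compute can be sketched in a few lines. This is a generic scheme (normalised city-block agreement), not the actual "myVote" algorithm, and the answer coding is invented:

```python
# Minimal sketch of a VAA issue-matching score (hypothetical scheme):
# positions on each issue are coded on an integer scale, and agreement
# is 100% minus the normalised city-block distance between voter and
# candidate answer vectors.
def match_score(voter, candidate, scale=4):
    """Return agreement in percent; answers are ints in [0, scale]."""
    assert len(voter) == len(candidate)
    distance = sum(abs(v - c) for v, c in zip(voter, candidate))
    return 100.0 * (1 - distance / (scale * len(voter)))

voter     = [0, 4, 2, 4, 1]   # made-up answers on five issues
candidate = [0, 3, 2, 4, 0]
print(f"match: {match_score(voter, candidate):.1f}%")  # → match: 90.0%
```

Real VAAs differ mainly in the distance metric and in issue weighting, but the ranking logic is the same: score every candidate and sort.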

Relevance: 20.00%

Abstract:

Introduction: Therapeutic drug monitoring (TDM) aims at optimizing treatment by individualizing the dosage regimen based on measured blood concentrations. Maintaining concentrations within a target range requires pharmacokinetic and clinical capabilities. Bayesian calculation represents a gold standard in the TDM approach but requires computing assistance. In recent decades, computer programs have been developed to assist clinicians in this task. The aim of this benchmarking was to assess and compare computer tools designed to support TDM clinical activities.

Method: A literature and Internet search was performed to identify software. All programs were tested on a common personal computer. Each program was scored against a standardized grid covering pharmacokinetic relevance, user-friendliness, computing aspects, interfacing, and storage. A weighting factor was applied to each criterion of the grid to reflect its relative importance. To assess the robustness of the software, six representative clinical vignettes were also processed through all of them.

Results: Twelve software tools were identified, tested and ranked, yielding a comprehensive review of the available software's characteristics. The number of drugs handled varies widely, and 8 programs allow the user to add their own drug models. Ten computer programs are able to compute Bayesian dosage adaptation based on a blood concentration (a posteriori adjustment), while 9 are also able to suggest an a priori dosage regimen (prior to any blood concentration measurement) based on individual patient covariates such as age, gender and weight. Among those applying Bayesian analysis, one uses a non-parametric approach. The top two software packages emerging from this benchmark are MwPharm and TCIWorks. The other programs evaluated also have good potential but are less sophisticated (e.g. in terms of storage or report generation) or less user-friendly.

Conclusion: Whereas two integrated programs are at the top of the ranked list, such complex tools may not fit all institutions, and each software tool must be regarded with respect to the individual needs of hospitals or clinicians. Interest in computing tools to support therapeutic monitoring is still growing. Although developers have put effort into them in recent years, there is still room for improvement, especially in terms of institutional information system interfacing, user-friendliness, data storage capacity and report generation.
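The Bayesian a posteriori adaptation these tools perform can be sketched under strong simplifying assumptions: a one-compartment bolus model with known volume, a log-normal population prior on clearance, and a single measured concentration. All numbers below are invented for illustration and belong to no specific program or drug:

```python
# Hedged sketch of Bayesian (MAP) individualisation: combine a population
# prior on clearance with one observed concentration to get an individual
# clearance estimate. One-compartment IV bolus model; invented values.
import math

V = 50.0                    # L, distribution volume (assumed known)
dose = 500.0                # mg, IV bolus
t_obs, c_obs = 8.0, 4.0     # measured concentration: 4 mg/L at 8 h
cl_pop, omega = 5.0, 0.3    # prior: CL ~ LogNormal(ln 5, 0.3)
sigma = 0.2                 # residual error SD on the log scale

def neg_log_posterior(log_cl):
    cl = math.exp(log_cl)
    c_pred = (dose / V) * math.exp(-(cl / V) * t_obs)   # 1-cpt bolus
    likelihood = (math.log(c_obs) - math.log(c_pred)) ** 2 / (2 * sigma**2)
    prior = (log_cl - math.log(cl_pop)) ** 2 / (2 * omega**2)
    return likelihood + prior

# crude grid search for the MAP estimate (a real tool would optimise)
grid = [math.log(cl_pop) + (i - 300) / 300 for i in range(601)]
cl_map = math.exp(min(grid, key=neg_log_posterior))
print(f"MAP clearance: {cl_map:.2f} L/h")
```

The prior pulls the estimate toward the population value; with richer data the likelihood dominates, which is exactly why these programs ask for patient covariates and all available levels.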

Relevance: 20.00%

Abstract:

Ski resorts are deploying more and more artificial snow systems. These tools are necessary to sustain an economically important activity for the high alpine valleys. However, artificial snow raises important environmental issues that can be mitigated by optimizing its production. This paper presents a software prototype based on artificial intelligence to help ski resorts better manage their snowpack. It combines, on the one hand, a General Neural Network for snow-cover analysis and spatial prediction with, on the other hand, a multi-agent simulation of skiers for analysing the spatial impact of ski practice. The prototype has been tested at the ski resort of Verbier (Switzerland).
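Assuming the "General Neural Network" here is a General Regression Neural Network (GRNN, i.e. Nadaraya-Watson kernel regression, the model named in related work), its spatial-prediction step can be sketched as follows. Coordinates, depths and bandwidth are invented:

```python
# Hedged sketch of GRNN-style spatial prediction of snow depth:
# a kernel-weighted average of the observed values, where weight decays
# with distance from the prediction point. All data are made up.
import math

def grnn_predict(x, samples, sigma=100.0):
    """Gaussian-kernel-weighted average of observed values at location x."""
    weights = [math.exp(-((x[0] - sx)**2 + (x[1] - sy)**2) / (2 * sigma**2))
               for (sx, sy), _ in samples]
    return sum(w * v for w, (_, v) in zip(weights, samples)) / sum(weights)

# (easting, northing) in metres -> snow depth in cm (hypothetical survey)
samples = [((0, 0), 120.0), ((200, 0), 80.0), ((0, 200), 150.0)]
print(grnn_predict((50, 50), samples, sigma=100.0))
```

The single bandwidth `sigma` is the only tuned parameter, which is why GRNNs are popular as quick exploratory spatial models.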

Relevance: 20.00%

Abstract:

Objectives: Therapeutic drug monitoring (TDM) aims at optimizing treatment by individualizing the dosage regimen based on blood concentration measurements. Maintaining concentrations within a target range requires pharmacokinetic (PK) and clinical capabilities. Bayesian calculation represents a gold standard in the TDM approach but requires computing assistance. The aim of this benchmarking was to assess and compare computer tools designed to support TDM clinical activities.

Methods: The literature and the Internet were searched to identify software. Each program was scored against a standardized grid covering pharmacokinetic relevance, user-friendliness, computing aspects, interfacing, and storage. A weighting factor was applied to each criterion of the grid to reflect its relative importance. To assess the robustness of the software, six representative clinical vignettes were also processed through all of them.

Results: Twelve software tools were identified, tested and ranked, yielding a comprehensive review of the available software characteristics. The number of drugs handled varies from 2 to more than 180, and integration of different population types is available for some programs. Eight programs offer the ability to add new drug models based on population PK data. Ten computer tools incorporate Bayesian computation to predict dosage regimens (individual parameters are calculated based on population PK models). All of them are able to compute Bayesian a posteriori dosage adaptation based on a blood concentration, while 9 are also able to suggest an a priori dosage regimen based only on individual patient covariates. Among those applying Bayesian analysis, MM-USC*PACK uses a non-parametric approach. The top two programs emerging from this benchmark are MwPharm and TCIWorks. The other programs evaluated also have good potential but are less sophisticated or less user-friendly.

Conclusions: Whereas two software packages are ranked at the top of the list, such complex tools may not fit all institutions, and each program must be regarded with respect to the individual needs of hospitals or clinicians. Programs should be easy and fast for routine activities, including for non-experienced users. Although interest in TDM tools is growing and effort has been put into them in recent years, there is still room for improvement, especially in terms of institutional information system interfacing, user-friendliness, data storage capability and automated report generation.

Relevance: 20.00%

Abstract:

Changes in intracellular Na(+) concentration underlie essential neurobiological processes, but few reliable tools exist for their measurement. Here we characterize a new synthetic Na(+)-sensitive fluorescent dye, Asante Natrium Green (ANG), with unique properties. This indicator was excitable in the visible spectrum and by two-photon illumination, suffered little photobleaching and localized to the cytosol, where it remained for long durations without noticeable unwanted effects on basic cell properties. When used in brain tissue, ANG yielded a bright fluorescent signal during physiological Na(+) responses in both neurons and astrocytes. Synchronous electrophysiological and fluorometric recordings showed that ANG produced accurate Na(+) measurements in situ. This new Na(+) indicator opens innovative ways of probing neuronal circuits.

Relevance: 20.00%

Abstract:

We study the dynamics of a water-oil meniscus moving from a smaller to a larger pore. The process is characterised by an abrupt change in configuration, yielding a sudden energy release. A theoretical study under static conditions provides analytical solutions for the surface energy content of the system. Although the configuration after the sudden energy release is energetically more favourable, an energy barrier must be overcome before the process can happen spontaneously. The energy barrier depends on the system geometry and on the flow parameters. The analytical results are compared to numerical simulations that solve the full Navier-Stokes equations in the pore space and employ the Volume of Fluid (VOF) method to track the evolution of the interface. First, the numerical simulations of a quasi-static process are validated by comparison with the analytical solutions for a static meniscus; then numerical simulations with varying injection velocity are used to investigate dynamic effects on the configuration change. During the sudden energy jump the system exhibits an oscillatory behaviour. Extension to more complex geometries might elucidate the mechanisms leading to a dynamic capillary pressure and to bifurcations in the final distributions of fluid phases in porous media.
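The static side of the configuration change can be illustrated with the Young-Laplace relation: the capillary pressure a meniscus sustains drops when it passes into a larger pore, consistent with the sudden energy release described above. This is not the paper's VOF model; the radii, interfacial tension and contact angle are assumed values:

```python
# Illustrative Young-Laplace estimate (spherical-cap meniscus):
# p_c = 2*sigma*cos(theta)/r, so the sustainable capillary pressure
# falls as the pore radius grows. All fluid properties are assumed.
import math

sigma = 0.05               # N/m, assumed water-oil interfacial tension
theta = math.radians(30)   # assumed contact angle
for r in (10e-6, 50e-6):   # pore radii: 10 um -> 50 um
    p_c = 2 * sigma * math.cos(theta) / r
    print(f"r = {r*1e6:.0f} um: p_c = {p_c:.0f} Pa")
```

The fivefold drop in sustainable pressure between the two radii is the static face of the energy barrier the paper quantifies dynamically.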

Relevance: 20.00%

Abstract:

Significant progress has been made with regard to the quantitative integration of geophysical and hydrological data at the local scale. However, extending the corresponding approaches to the regional scale represents a major, and as yet largely unresolved, challenge. To address this problem, we have developed a downscaling procedure based on a non-linear Bayesian sequential simulation approach. The basic objective of this algorithm is to estimate the value of the sparsely sampled hydraulic conductivity at non-sampled locations based on its relation to the electrical conductivity, which is available throughout the model space. The in situ relationship between the hydraulic and electrical conductivities is described through a non-parametric multivariate kernel density function. This method is then applied to the stochastic integration of low-resolution, regional-scale electrical resistivity tomography (ERT) data in combination with high-resolution, local-scale downhole measurements of the hydraulic and electrical conductivities. Finally, the overall viability of this downscaling approach is tested and verified by performing and comparing flow and transport simulations through the original and the downscaled hydraulic conductivity fields. Our results indicate that the proposed procedure does indeed yield remarkably faithful estimates of the regional-scale hydraulic conductivity structure and correspondingly reliable predictions of transport characteristics over relatively long distances.
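The non-parametric link between the two conductivities can be sketched as a kernel-weighted conditional mean: estimate log hydraulic conductivity at a point from the collocated electrical conductivity using borehole calibration pairs. The data values and bandwidth below are invented, and the actual algorithm embeds this step inside a Bayesian sequential simulation rather than using it alone:

```python
# Sketch of the kernel-based relation described above: E[log10 K | EC]
# from (EC, log10 K) calibration pairs with a Gaussian kernel in EC.
# All numbers are hypothetical.
import math

def conditional_mean_logK(ec, pairs, h=0.02):
    """Kernel-weighted mean of log10 K given an EC value."""
    w = [math.exp(-((ec - e) / h) ** 2 / 2) for e, _ in pairs]
    return sum(wi * k for wi, (_, k) in zip(w, pairs)) / sum(w)

# hypothetical borehole calibration: EC in S/m vs log10(K [m/s])
pairs = [(0.01, -5.0), (0.02, -4.5), (0.05, -3.8), (0.08, -3.2)]
print(conditional_mean_logK(0.03, pairs, h=0.02))
```

Working with the full kernel density (rather than just the conditional mean, as here) is what lets the published method draw stochastic realisations instead of a single smooth estimate.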

Relevance: 20.00%

Abstract:

The present research deals with an important public health threat: the pollution created by radon gas accumulation inside dwellings. The spatial modeling of indoor radon in Switzerland is particularly complex and challenging because of the many influencing factors that must be taken into account. Indoor radon data analysis must be addressed from both a statistical and a spatial point of view. As a multivariate process, it was important at first to define the influence of each factor, in particular that of geology, which is closely associated with indoor radon. This association was indeed observed for the Swiss data but not proved to be the sole determinant for the spatial modeling. The statistical analysis of the data, at both the univariate and multivariate level, was followed by an exploratory spatial analysis. Many tools proposed in the literature were tested and adapted, including fractality, declustering and moving-window methods. The use of the Quantité Morisita Index (QMI) was proposed as a procedure to evaluate data clustering as a function of the radon level. The existing declustering methods were revised and applied in an attempt to approach the global histogram parameters. The exploratory phase comes along with the definition of multiple scales of interest for indoor radon mapping in Switzerland. The analysis was done with a top-down resolution approach, from regional to local levels, in order to find the appropriate scales for modeling. In this sense, data partitioning was optimized in order to cope with the stationarity conditions of geostatistical models. Common methods of spatial modeling such as K Nearest Neighbors (KNN), variography and General Regression Neural Networks (GRNN) were proposed as exploratory tools. In the following section, different spatial interpolation methods were applied to a particular dataset.

A bottom-up approach to method complexity was adopted, and the results were analyzed together in order to find common definitions of continuity and neighborhood parameters. Additionally, a data filter based on cross-validation (the CVMF) was tested with the purpose of reducing noise at the local scale. At the end of the chapter, a series of tests of data consistency and method robustness was performed. This led to conclusions about the importance of data splitting and the limitations of generalization methods for reproducing statistical distributions. The last section was dedicated to modeling methods with probabilistic interpretations. Data transformations and simulations thus allowed the use of multi-Gaussian models and helped take the uncertainty of the indoor radon pollution data into consideration. The categorization transform was presented as a solution for modeling extreme values through classification. Simulation scenarios were proposed, including an alternative proposal for the reproduction of the global histogram based on the sampling domain. Sequential Gaussian simulation (SGS) was presented as the method giving the most complete information, while classification performed in a more robust way. An error measure was defined in relation to the decision function for hardening the data classification. Within the classification methods, probabilistic neural networks (PNN) showed themselves to be better adapted to modeling high-threshold categorization and to automation, whereas support vector machines (SVM) performed well under balanced category conditions. In general, it was concluded that no particular prediction or estimation method is better under all conditions of scale and neighborhood definition. Simulations should be the basis, while other methods can provide complementary information to accomplish efficient indoor radon decision making.
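The clustering diagnostic underlying the proposed QMI is the classical Morisita index: points are counted in Q quadrats, and an index above 1 indicates clustering while values near 1 indicate spatial randomness. The quadrat counts below are invented, and this sketch omits the radon-level conditioning that defines the QMI itself:

```python
# Hedged sketch of the Morisita clustering index (the basis of the QMI
# mentioned above): I_M = Q * sum(n_i*(n_i-1)) / (N*(N-1)) over Q
# quadrats with counts n_i and N total points. Counts are invented.
def morisita_index(counts):
    Q = len(counts)
    N = sum(counts)
    assert N > 1, "index undefined for fewer than two points"
    return Q * sum(n * (n - 1) for n in counts) / (N * (N - 1))

clustered = [9, 0, 0, 1]   # most points fall in one quadrat
uniform   = [3, 2, 3, 2]
print(morisita_index(clustered), morisita_index(uniform))
```

Repeating the computation at several quadrat sizes gives the scale dependence of clustering, which is what the multi-scale analysis above exploits.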

Relevance: 20.00%

Abstract:

Accurate prediction of transcription factor binding sites is needed to unravel the function and regulation of genes discovered in genome sequencing projects. To evaluate current computer prediction tools, we have begun a systematic study of the sequence-specific DNA-binding of a transcription factor belonging to the CTF/NFI family. Using a systematic collection of rationally designed oligonucleotides combined with an in vitro DNA binding assay, we found that the sequence specificity of this protein cannot be represented by a simple consensus sequence or weight matrix. For instance, CTF/NFI uses a flexible DNA binding mode that allows for variations of the binding site length. From the experimental data, we derived a novel prediction method using a generalised profile as a binding site predictor. Experimental evaluation of the generalised profile indicated that it accurately predicts the binding affinity of the transcription factor to natural or synthetic DNA sequences. Furthermore, the in vitro measured binding affinities of a subset of oligonucleotides were found to correlate with their transcriptional activities in transfected cells. The combined computational-experimental approach exemplified in this work thus resulted in an accurate prediction method for CTF/NFI binding sites potentially functioning as regulatory regions in vivo.
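The fixed-length weight-matrix model the study found insufficient for CTF/NFI can be illustrated in a few lines; generalised profiles extend this by additionally allowing insertions and deletions, which is what accommodates the variable binding-site length. The matrix below is invented for illustration, not a CTF/NFI model:

```python
# Simple position weight matrix (PWM) scoring: sum the per-position
# log-odds score of each base. The matrix values are hypothetical.
pwm = {
    'A': [ 1.2, -0.5, -1.0,  0.3],
    'C': [-0.8,  1.0, -0.2, -0.6],
    'G': [-0.4, -0.7,  1.5, -0.1],
    'T': [-0.9,  0.1, -1.2,  0.8],
}

def score(seq):
    """PWM score of a sequence of the matrix's length."""
    return sum(pwm[base][i] for i, base in enumerate(seq))

print(score("ACGT"))   # consensus of this matrix: best per-position base
```

Because a PWM assigns every site the same length and independent positions, it cannot represent the flexible binding mode described above, motivating the profile-based predictor.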

Relevance: 20.00%

Abstract:

Pharmacokinetic variability in drug levels represents, for some drugs, a major determinant of treatment success, since concentrations outside the therapeutic range may lead to toxic reactions, treatment discontinuation or inefficacy. This is true for most antiretroviral drugs, which exhibit high inter-patient variability in their pharmacokinetics that has been only partially explained by genetic and non-genetic factors. The population pharmacokinetic approach is a very useful tool for describing the dose-concentration relationship, quantifying variability in the target patient population and identifying influencing factors. It can thus be used to make predictions and to optimize dosage adjustment based on Bayesian therapeutic drug monitoring (TDM). This approach was used to characterize the pharmacokinetics of nevirapine (NVP) in 137 HIV-positive patients followed within the frame of a TDM program. Among the tested covariates, body weight, co-administration of a cytochrome (CYP) 3A4 inducer or of boosted atazanavir, as well as elevated aspartate transaminases, showed an effect on NVP elimination. In addition, a genetic polymorphism in CYP2B6 was associated with reduced NVP clearance. Altogether, these factors could explain 26% of the variability in NVP pharmacokinetics. Model-based simulations were used to compare the adequacy of different dosage regimens with respect to the therapeutic target associated with treatment efficacy. In conclusion, the population approach is very useful for characterizing the pharmacokinetic profile of drugs in a population of interest. Quantifying and identifying the sources of variability is a rational approach to making optimal dosage decisions for certain chronically administered drugs.
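A model-based simulation of the kind described can be sketched with a one-compartment model with first-order absorption at steady state, with a covariate (here a CYP inducer) acting multiplicatively on clearance. All parameter values are invented, not the published NVP model:

```python
# Hedged sketch of a covariate effect in a population PK simulation:
# steady-state one-compartment model with first-order absorption.
# Every number here is illustrative.
import math

def css(t, dose, tau, ka, cl, v):
    """Steady-state concentration at time t after a dose (superposition)."""
    ke = cl / v
    return (dose * ka / (v * (ka - ke))) * (
        math.exp(-ke * t) / (1 - math.exp(-ke * tau))
        - math.exp(-ka * t) / (1 - math.exp(-ka * tau)))

base_cl = 3.0   # L/h, illustrative typical clearance
for label, cl in [("no inducer", base_cl), ("with inducer", base_cl * 1.5)]:
    c_trough = css(12.0, 200.0, 12.0, ka=1.0, cl=cl, v=80.0)
    print(f"{label}: trough ~ {c_trough:.2f} mg/L")
```

Running such simulations over the distribution of individual parameters is how regimens are compared against a trough-concentration target in practice.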

Relevance: 20.00%

Abstract:

Human biomonitoring is a widely used method in the assessment of occupational exposure to chemical substances, and recommended biological limits are published periodically for interpretation and decision-making. However, it is increasingly recognized that a large variability is associated with biological monitoring, making interpretation less reliable than assumed. In order to improve the applicability of biological monitoring, the specific factors responsible for this variability should be identified and their contributions quantified. Among these factors, age and sex are easily identifiable, and present knowledge about pharmaceutical chemicals suggests that they play an important role in the toxicokinetics of occupational chemical agents, and therefore in biological monitoring results.

The aim of the present research project was to assess the influence of age and sex on biological indicators corresponding to organic solvents. This was done experimentally and by toxicokinetic computer simulation. Another purpose was to explore the effect of selected CYP2E1 polymorphisms on the toxicokinetic profile.

Age differences were identified by numerical simulations using a general toxicokinetic model from a previous study, applied to 14 chemicals representing 21 specific biological entities, including toluene, phenol, lead and mercury. These models were run with the modified parameters, indicating in some cases important differences due to age. The expected changes are mostly of the order of 10-20%, but differences of up to 50% were observed in some cases. These differences appear to depend on the chemical and on the biological entity considered.

Sex differences were quantified by controlled human exposures, carried out in a 12 m3 exposure chamber for three organic solvents separately: methyl ethyl ketone, 1-methoxy-2-propanol and 1,1,1-trichloroethane. The human volunteer groups were composed of ten young men and fifteen young women, the latter subdivided into those with and without hormonal contraceptives. They were exposed for six hours, at rest, at half of the threshold limit value. The kinetics of the parent compounds (organic volatiles) and their metabolite(s) were followed over time in blood, urine and expired air. Analyses of the solvents and their metabolites were performed using headspace gas chromatography, and CYP2E1 genotypes were determined using PCR-based RFLP methods. The experimental data were used to calibrate the toxicokinetic models developed for the three solvents. The results obtained for the different biomarkers of exposure mainly showed an effect of hormonal contraceptive use on the urinary levels of several biomarkers among women, with an increase of about 50% in the metabolism rate. The results also showed a difference due to the CYP2E1*6 genotype upon exposure to methyl ethyl ketone, with a tendency toward increased CYP2E1 activity in carriers of the mutant allele. Simulations showed that it is possible to use simple toxicokinetic tools to predict internal exposure to organic solvents. Our study suggests that not only physiological differences but also exogenous sex hormones could influence CYP2E1 enzyme activity. The variability in urinary biological indicator levels gives evidence of interindividual susceptibility, an aspect that should have its place in approaches to setting occupational exposure limits.

Relevance: 20.00%

Abstract:

Therapeutic drug monitoring (TDM) and pharmacogenetic tests play a major role in minimising adverse drug reactions and enhancing optimal therapeutic response. The response to medication varies greatly between individuals, according to genetic constitution, age, sex, co-morbidities, environmental factors including diet and lifestyle (e.g. smoking and alcohol intake), and drug-related factors such as pharmacokinetic or pharmacodynamic drug-drug interactions. Most adverse drug reactions are type A reactions, i.e. plasma-level dependent, and represent one of the major causes of hospitalisation, in some cases leading to death. However, they may be avoidable to some extent if pharmacokinetic and pharmacogenetic factors are taken into consideration. This article provides a review of the literature, describes how to apply and interpret TDM and certain pharmacogenetic tests, and is illustrated by case reports. An algorithm on the use of TDM and pharmacogenetic tests to help characterise adverse drug reactions is also presented. Although differences in drug response are increasingly recognised in the scientific community, there is an urgent need to translate this knowledge into clinical recommendations. Databases on drug-drug interactions and on the impact of pharmacogenetic polymorphisms, together with adverse drug reaction information systems, will help guide clinicians in individualised treatment choices.