971 results for Computer Modeling
Abstract:
Both structural and dynamical properties of 7Li at 470 and 843 K are studied by molecular dynamics simulation, and the results are compared with the available experimental data. Two effective interatomic potentials are used: a potential derived from the Ashcroft pseudopotential [Phys. Lett. 23, 48 (1966)] and a recently proposed potential deduced from the neutral pseudoatom method [J. Phys.: Condens. Matter 5, 4283 (1993)]. Although the shapes of the two potential functions are very different, most of the properties calculated from them are very similar. The differences between the results obtained with the two interaction models are carefully discussed.
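As an illustration of the structural side of such a comparison, the following minimal Python sketch (not from the paper) estimates the pair distribution function g(r) from a stored MD configuration, a basic quantity one would compare between the two interaction models. The cubic box length, bin width, and array layout are assumptions.

import numpy as np

def pair_distribution(positions, box, r_max, dr):
    """positions: (N, 3) array of atomic coordinates in a cubic box of side `box`."""
    n = len(positions)
    bins = np.arange(0.0, r_max + dr, dr)
    hist = np.zeros(len(bins) - 1)
    for i in range(n - 1):
        d = positions[i + 1:] - positions[i]
        d -= box * np.round(d / box)              # minimum-image convention
        r = np.linalg.norm(d, axis=1)
        hist += np.histogram(r[r < r_max], bins=bins)[0]
    rho = n / box**3                              # number density
    shell = 4.0 / 3.0 * np.pi * (bins[1:]**3 - bins[:-1]**3)
    # factor 2 because each pair was counted once above
    return bins[:-1] + 0.5 * dr, 2.0 * hist / (n * rho * shell)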
Abstract:
Estimation of soil load-bearing capacity from mathematical models that relate preconsolidation pressure (σp) to mechanical resistance to penetration (PR) and gravimetric soil water content (U) is important for defining strategies to prevent compaction of agricultural soils. Our objective was therefore to model σp and the compression index (CI) as functions of PR (measured with an impact penetrometer in the field and with a static penetrometer inserted at a constant rate in the laboratory) and U in a Rhodic Eutrudox. The experiment consisted of six treatments: no-tillage system (NT); NT with chiseling; and NT with additional compaction by combine traffic (4, 8, 10, and 20 passes). Soil bulk density, total porosity, PR (field and laboratory measurements), U, σp, and CI were determined in the 5.5-10.5 cm and 13.5-18.5 cm layers. Preconsolidation pressure (σp) and CI were modeled as functions of PR at different U. The σp increased and the CI decreased linearly with increasing PR. The correlations between σp and PR and between PR and CI are influenced by U. From these correlations, soil load-bearing capacity and compaction susceptibility can be estimated from PR readings taken at different U.
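A minimal sketch of the kind of regression described above, assuming a simple tabular layout with a water content class, PR, and σp per sample (the column names and data frame are invented, not the paper's data): a straight line σp = a + b·PR is fitted separately for each U class.

import numpy as np
import pandas as pd

def fit_sigma_p(df):
    """df columns (assumed): 'U' (water content class), 'PR' (MPa), 'sigma_p' (kPa)."""
    models = {}
    for u, grp in df.groupby("U"):
        slope, intercept = np.polyfit(grp["PR"], grp["sigma_p"], deg=1)
        models[u] = (intercept, slope)    # sigma_p ≈ intercept + slope * PR for this U
    return models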
Abstract:
Intrinsic equilibrium constants of 17 representative Brazilian Oxisols were estimated from potentiometric titration, measuring the adsorption of H+ and OH− on amphoteric surfaces in suspensions of varying ionic strength. The equilibrium constants were fitted to two surface complexation models: diffuse layer and constant capacitance. The former was fitted by calculating the total site concentration from curve-fitting estimates and by pH extrapolation of the intrinsic equilibrium constants to the PZNPC (hand calculation), considering one and two reactive sites, and also by the FITEQL software. The latter was fitted only with FITEQL, with one reactive site. Soil chemical and physical properties were correlated with the intrinsic equilibrium constants. Both surface complexation models fit our experimental data satisfactorily, but for results at low ionic strength the optimization did not converge in FITEQL. The data were incorporated into Visual MINTEQ, providing a modeling system that can predict protonation-dissociation reactions at the soil surface under changing environmental conditions.
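The "hand calculation" extrapolation step could look like the hedged Python sketch below: apparent protonation constants observed along the titration are extrapolated linearly to zero net surface charge (the PZNPC) to obtain an intrinsic constant. The linear form and variable names are assumptions for illustration, not the authors' procedure in detail.

import numpy as np

def intrinsic_log_k(surface_charge, log_k_apparent):
    """Extrapolate apparent log K values versus net surface charge to zero charge."""
    slope, intercept = np.polyfit(surface_charge, log_k_apparent, deg=1)
    return intercept    # value at zero net proton charge, taken as log K (intrinsic)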
Abstract:
Intrinsic equilibrium constants for 22 representative Brazilian Oxisols were estimated from a cadmium adsorption experiment. The equilibrium constants were fitted to two surface complexation models: diffuse layer and constant capacitance. Intrinsic equilibrium constants were optimized with FITEQL and by hand calculation using Visual MINTEQ in sweep mode and Excel spreadsheets. Data from both models were incorporated into Visual MINTEQ. Constants estimated by FITEQL and incorporated into Visual MINTEQ failed to predict the observed data accurately. However, the raw FITEQL output gave good results when predicted values were compared directly with observed values, instead of incorporating the estimated constants into Visual MINTEQ. Intrinsic equilibrium constants optimized by hand calculation and incorporated into Visual MINTEQ reliably predicted Cd adsorption reactions on soil surfaces under changing environmental conditions.
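The comparison of predicted against observed Cd adsorption can be summarized as in the short sketch below (illustrative only; the metrics and array names are assumptions, not the study's reported statistics).

import numpy as np

def goodness_of_fit(observed, predicted):
    """Return RMSE and R^2 between observed and model-predicted adsorbed Cd."""
    observed, predicted = np.asarray(observed, float), np.asarray(predicted, float)
    rmse = np.sqrt(np.mean((observed - predicted) ** 2))
    ss_res = np.sum((observed - predicted) ** 2)
    ss_tot = np.sum((observed - observed.mean()) ** 2)
    return rmse, 1.0 - ss_res / ss_tot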
Abstract:
Self- and cross-velocity correlation functions and related transport coefficients of molten salts are studied by molecular-dynamics simulation. Six representative systems are considered: the alkali halides NaCl and KCl, the noble-metal halides CuCl and CuBr, and the divalent metal halides SrCl2 and ZnCl2. Computer simulation results are compared with experimental self-diffusion coefficients and electrical conductivities. Special attention is paid to dynamic cross correlations and their dependence on the Coulomb interactions as well as on the size and mass differences between anions and cations.
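For context, the self-diffusion coefficient mentioned above is commonly obtained from the velocity autocorrelation function via the Green-Kubo relation D = (1/3) ∫ ⟨v(0)·v(t)⟩ dt. The Python sketch below illustrates that standard relation under assumed inputs (velocity array shape and time step are placeholders); it is not the authors' code.

import numpy as np

def self_diffusion(velocities, dt):
    """velocities: (n_steps, N, 3) trajectory for one ionic species; dt: time step."""
    n_steps = velocities.shape[0]
    n_lags = n_steps // 2
    vacf = np.empty(n_lags)
    for lag in range(n_lags):
        # average of v(t)·v(t+lag) over particles and time origins
        vacf[lag] = np.mean(
            np.sum(velocities[: n_steps - lag] * velocities[lag:], axis=2)
        )
    return np.trapz(vacf, dx=dt) / 3.0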
Abstract:
Underbody plows can be very useful tools in winter maintenance, especially when compacted snow or hard ice must be removed from the roadway. With significant down-force and an appropriate cutting edge angle, compacted snow and ice can be removed very effectively by such plows, with much greater efficiency than any other tool under those circumstances. However, the successful operation of an underbody plow requires considerable skill. If too little down pressure is applied to the plow, it will not cut the ice or compacted snow. If too much force is applied, either the cutting edge may gouge the road surface, often causing significant damage to both the road surface and the plow, or the plow may ride up on the cutting edge so that it is no longer controllable by the operator; in such situations the truck can easily spin. Further, excessive down-force results in rapid wear of the cutting edge. Given this need for a high level of operator skill, the operation of an underbody plow is a candidate for automation. To automate the operation of an underbody plow successfully, a control system must be developed that follows a set of rules representing appropriate operation of such a plow. These rules have been developed, based upon earlier work in which operational underbody plows were instrumented to determine the loading upon them (both vertical and horizontal) and the angle at which the blade was operating. The rules have been successfully coded into two different computer programs, both using MATLAB® software. In the first program, various load and angle inputs are analyzed to determine when, whether, and how they violate the rules of operation. This program is essentially deterministic in nature. In the second program, the Simulink® package in MATLAB® was used to implement the rules using fuzzy logic. Fuzzy logic essentially replaces a fixed, constant rule with one that varies in such a way as to improve operational control. The fuzzy logic in this simulation was implemented using routines provided by the software rather than being developed from scratch. The results of the computer testing and simulation indicate that a fully automated, computer-controlled underbody plow is indeed possible. The question of whether the next steps toward full automation should be taken (and by whom) has also been considered, and the possibility of a joint venture between a Department of Transportation and a vendor has been suggested.
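The study implemented its rules in MATLAB/Simulink; purely as an illustration of the fuzzy-rule idea, the Python sketch below maps a measured vertical load on the cutting edge to a down-force correction using two triangular membership functions. The membership breakpoints and output scale are invented, not the project's rule base.

import numpy as np

def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

def downforce_correction(vertical_load_kN):
    too_light = tri(vertical_load_kN, -5.0, 0.0, 20.0)    # plow not cutting
    too_heavy = tri(vertical_load_kN, 15.0, 40.0, 60.0)   # risk of gouging or ride-up
    weights = too_light + too_heavy
    if weights == 0.0:
        return 0.0
    # weighted-average defuzzification: positive -> add down pressure, negative -> relieve it
    return (too_light * +5.0 + too_heavy * -5.0) / weights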
Abstract:
Introduction: Therapeutic drug monitoring (TDM) aims at optimizing treatment by individualizing the dosage regimen based on measurement of blood concentrations. Maintaining concentrations within a target range requires pharmacokinetic and clinical capabilities. Bayesian calculation represents a gold standard in the TDM approach but requires computing assistance. In the last decades, computer programs have been developed to assist clinicians in this task. The aim of this benchmarking was to assess and compare computer tools designed to support TDM clinical activities. Method: A literature and Internet search was performed to identify software. All programs were tested on a common personal computer. Each program was scored against a standardized grid covering pharmacokinetic relevance, user-friendliness, computing aspects, interfacing, and storage. A weighting factor was applied to each criterion of the grid to account for its relative importance. To assess the robustness of the software, six representative clinical vignettes were also processed through all of them. Results: 12 software tools were identified, tested, and ranked, representing a comprehensive review of the available software's characteristics. The number of drugs handled varies widely, and 8 programs offer users the ability to add their own drug models. 10 computer programs are able to compute Bayesian dosage adaptation based on a blood concentration (a posteriori adjustment), while 9 are also able to suggest an a priori dosage regimen (prior to any blood concentration measurement) based on individual patient covariates such as age, gender, and weight. Among those applying Bayesian analysis, one uses the non-parametric approach. The top 2 software tools emerging from this benchmark are MwPharm and TCIWorks. The other programs evaluated also have good potential but are less sophisticated (e.g., in terms of storage or report generation) or less user-friendly. Conclusion: Whereas 2 integrated programs are at the top of the ranked list, such complex tools may not fit all institutions, and each software tool must be considered with respect to the individual needs of hospitals or clinicians. Interest in computing tools to support therapeutic monitoring is still growing. Although developers have put effort into them in recent years, there is still room for improvement, especially in terms of institutional information system interfacing, user-friendliness, data storage capacity, and report generation.
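To make the "a posteriori adjustment" idea concrete, the hedged Python sketch below computes a maximum a posteriori clearance for a one-compartment IV bolus model from a single measured concentration, a population prior, and log-normal parameter / proportional error assumptions. It is a generic illustration, not the algorithm of any of the benchmarked programs, and all parameter names and bounds are assumptions.

import numpy as np
from scipy.optimize import minimize_scalar

def map_clearance(c_obs, dose, t, vd, cl_pop, omega_cl, sigma_prop):
    """Return the MAP estimate of individual clearance CL."""
    def neg_log_post(log_cl):
        cl = np.exp(log_cl)
        c_pred = dose / vd * np.exp(-cl / vd * t)        # 1-compartment IV bolus prediction
        resid = (c_obs - c_pred) / (sigma_prop * c_pred) # proportional residual error
        prior = (log_cl - np.log(cl_pop)) ** 2 / omega_cl ** 2
        return 0.5 * (resid ** 2 + prior)
    res = minimize_scalar(neg_log_post,
                          bounds=(np.log(cl_pop) - 3.0, np.log(cl_pop) + 3.0),
                          method="bounded")
    return float(np.exp(res.x))

The estimated individual clearance can then be used to propose the dose that brings predicted concentrations back into the target range.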
Abstract:
Objectives: Therapeutic drug monitoring (TDM) aims at optimizing treatment by individualizing the dosage regimen based on blood concentration measurements. Maintaining concentrations within a target range requires pharmacokinetic (PK) and clinical capabilities. Bayesian calculation represents a gold standard in the TDM approach but requires computing assistance. The aim of this benchmarking was to assess and compare computer tools designed to support TDM clinical activities. Methods: The literature and the Internet were searched to identify software. Each program was scored against a standardized grid covering pharmacokinetic relevance, user-friendliness, computing aspects, interfacing, and storage. A weighting factor was applied to each criterion of the grid to account for its relative importance. To assess the robustness of the software, six representative clinical vignettes were also processed through all of them. Results: 12 software tools were identified, tested, and ranked, representing a comprehensive review of the available software characteristics. The number of drugs handled varies from 2 to more than 180, and integration of different population types is available for some programs. In addition, 8 programs offer the ability to add new drug models based on population PK data. 10 computer tools incorporate Bayesian computation to predict the dosage regimen (individual parameters are calculated based on population PK models). All of them are able to compute Bayesian a posteriori dosage adaptation based on a blood concentration, while 9 are also able to suggest an a priori dosage regimen based only on individual patient covariates. Among those applying Bayesian analysis, MM-USC*PACK uses a non-parametric approach. The top 2 programs emerging from this benchmark are MwPharm and TCIWorks. The other programs evaluated also have good potential but are less sophisticated or less user-friendly. Conclusions: Whereas 2 software packages are ranked at the top of the list, such complex tools may not fit all institutions, and each program must be considered with respect to the individual needs of hospitals or clinicians. Programs should be easy and fast for routine activities, including for non-experienced users. Although interest in TDM tools is growing and effort has been put into them in the last years, there is still room for improvement, especially in terms of institutional information system interfacing, user-friendliness, data storage capability, and automated report generation.
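The weighted scoring grid described above can be expressed as in this minimal sketch; the criterion names are taken from the abstract, but the weights and scores shown are placeholders, not the study's actual grid.

def weighted_score(scores, weights):
    """scores and weights: dicts keyed by criterion name; returns the total weighted score."""
    return sum(scores[c] * weights[c] for c in weights)

# Hypothetical weights for illustration only
example_weights = {"pharmacokinetic relevance": 0.35, "user-friendliness": 0.25,
                   "computing aspects": 0.15, "interfacing": 0.15, "storage": 0.10}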
Abstract:
Objectives: Acetate brain metabolism has the particularity of occurring specifically in glial cells. Labeling studies, using acetate labeled either with 13C (NMR) or 11C (PET), are governed by the same biochemical reactions and thus follow the same mathematical principles. In this study, the objective was to adapt an NMR acetate brain metabolism model to analyse [1-11C]acetate infusion in rats. Methods: Brain acetate infusion experiments were modeled using the two-compartment model approach used in NMR.1-3 The [1-11C]acetate labeling study was done using a beta scintillator.4 The measured radioactive signal represents the time evolution of the sum of all labeled metabolites in the brain. Using a coincidence counter in parallel, an arterial input curve was measured. The 11C at position C-1 of acetate is metabolized in the first turn of the TCA cycle to position 5 of glutamate (Figure 1A). Through the neurotransmission process, it is further transported to position 5 of glutamine and position 5 of neuronal glutamate. After the second turn of the TCA cycle, tracer from [1-11C]acetate (and also a part from glial [5-11C]glutamate) is transferred to glial [1-11C]glutamate and further to [1-11C]glutamine and neuronal glutamate through the neurotransmission cycle. Results: The standard acetate two-pool PET model describes the system by a plasma pool and a tissue pool linked by rate constants. The experimental data are not fully described with only one tissue compartment (Figure 1B). The modified NMR model was fitted successfully to tissue time-activity curves from 6 individual animals by varying the glial mitochondrial fluxes and the neurotransmission flux Vnt. A glial composite rate constant Kgtg = Vgtg/[Ace]plasma was extracted. Considering an average acetate concentration in plasma of 1 mmol/g5 and the negligible additional amount injected, we found an average Vgtg = 0.08 ± 0.02 (n = 6), in agreement with previous NMR measurements.1 The tissue time-activity curve is dominated by glial glutamate and later by glutamine (Figure 1B). Labeling of neuronal pools has a low influence, at least during the 20 min of beta-probe acquisition. Given the high diffusivity of CO2 across the blood-brain barrier, 11CO2 is not predominant in the total tissue curve, even though the brain CO2 pool is large compared with other metabolites, owing to its strong dilution by unlabeled CO2 from neuronal metabolism and diffusion from plasma. Conclusion: The two-compartment model presented here is also able to fit data from positron emission experiments and to extract specific glial metabolic fluxes. 11C-labeled acetate offers an alternative for faster measurements of glial oxidative metabolism compared with NMR, potentially applicable to human PET imaging. However, to quantify the relative value of the TCA cycle flux compared to the transmitochondrial flux, the chemical sensitivity of NMR is required. PET and NMR are thus complementary.
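As a generic illustration of this kind of kinetic fitting (not the adapted NMR model used in the study), the Python sketch below fits a plasma-plus-single-tissue compartment model, dC_t/dt = K1·C_p(t) − k2·C_t, to a measured tissue time-activity curve by nonlinear least squares; the input arrays, starting values, and bounds are assumptions.

import numpy as np
from scipy.integrate import odeint
from scipy.optimize import curve_fit

def tissue_model(t, K1, k2, t_plasma, c_plasma):
    c_p = lambda tau: np.interp(tau, t_plasma, c_plasma)   # arterial input curve
    dcdt = lambda c, tau: K1 * c_p(tau) - k2 * c
    return odeint(dcdt, 0.0, t)[:, 0]

def fit_rate_constants(t, tac, t_plasma, c_plasma):
    """t, tac: tissue time-activity curve; t_plasma, c_plasma: measured input function."""
    f = lambda tt, K1, k2: tissue_model(tt, K1, k2, t_plasma, c_plasma)
    popt, _ = curve_fit(f, t, tac, p0=[0.05, 0.05], bounds=(0.0, np.inf))
    return popt    # fitted K1, k2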
Abstract:
This research reviews the analysis and modeling of Swiss franc interest rate curves (IRC) using unsupervised (SOM, Gaussian mixtures) and supervised (MLP) machine learning algorithms. The IRC are considered as objects embedded into different feature spaces: maturities; maturity-date; and the parameters of the Nelson-Siegel model (NSM). Analysis of the NSM parameters and of their temporal and clustering structure helps to understand the relevance of the model and its potential use for forecasting. Mapping of the IRC in a maturity-date feature space is presented and analyzed for visualization and forecasting purposes.
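For reference, the Nelson-Siegel parameterization mentioned above writes the yield at maturity τ as y(τ) = β0 + β1·(1 − e^(−τ/λ))/(τ/λ) + β2·[(1 − e^(−τ/λ))/(τ/λ) − e^(−τ/λ)]. The sketch below fits the three β parameters to one day's observed rates by least squares with a fixed decay parameter λ; the value of λ and the data layout are assumptions, not the study's settings.

import numpy as np

def nelson_siegel(tau, b0, b1, b2, lam):
    x = tau / lam
    loading = (1.0 - np.exp(-x)) / x
    return b0 + b1 * loading + b2 * (loading - np.exp(-x))

def fit_curve(maturities, rates, lam=1.5):
    """maturities, rates: arrays for one maturity-date cross-section (maturities > 0)."""
    loading = (1.0 - np.exp(-maturities / lam)) / (maturities / lam)
    X = np.column_stack([np.ones_like(maturities), loading,
                         loading - np.exp(-maturities / lam)])
    beta, *_ = np.linalg.lstsq(X, rates, rcond=None)
    return beta    # b0, b1, b2 describing this curve as a point in NSM parameter space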
Abstract:
Debris flows and related landslide processes occur in many regions all over Norway and pose a significant hazard to inhabited areas. Within the framework of the development of a national debris-flow susceptibility map, we are working on a modeling approach suitable for Norway with nationwide coverage. The discrimination of source areas is based on an index approach, which includes topographic parameters and hydrological settings. For the runout modeling, we use the Flow-R model (IGAR, University of Lausanne), which is based on combined probabilistic and energetic algorithms for assessing the spreading of the flow and the maximum runout distances. First results for different test areas have shown that runout distances can be modeled reliably. For the selection of source areas, however, additional factors have to be considered, such as the lithological and Quaternary geological setting, in order to accommodate the strong variation in debris-flow activity across the different geological, geomorphological, and climatic regions of Norway.
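To illustrate the energetic side of such runout assessments (this is a generic energy-line sketch, not the Flow-R implementation), the Python snippet below updates the specific kinetic energy cell by cell along a flow path with a constant travel-angle friction loss and stops where the energy is exhausted; the cell size, friction angle, and velocity cap are illustrative assumptions.

import numpy as np

def runout_length(elevations, cell_size=10.0, travel_angle_deg=11.0, v_max=15.0):
    """elevations: terrain heights (m) along the flow path, one value per cell."""
    g = 9.81
    friction = np.tan(np.radians(travel_angle_deg))
    energy = 0.0                                    # specific kinetic energy (J/kg)
    for i in range(1, len(elevations)):
        drop = elevations[i - 1] - elevations[i]    # height loss over this cell
        energy += g * drop - g * friction * cell_size
        energy = min(energy, 0.5 * v_max**2)        # optional velocity limit
        if energy <= 0.0:
            return i * cell_size                    # flow stops here
    return (len(elevations) - 1) * cell_size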
Abstract:
Photopolymerization is commonly used in a broad range of bioapplications, such as drug delivery, tissue engineering, and surgical implants, where liquid materials are injected and then hardened by means of illumination to create a solid polymer network. However, photopolymerization using a probe, e.g., a needle guiding both the liquid and the curing illumination, has not been thoroughly investigated. We present a Monte Carlo model that takes into account the dynamic absorption and scattering parameters as well as the solid-liquid boundaries of the photopolymer to yield the shape and volume of minimally invasively injected, photopolymerized hydrogels. In the first part of the article, our model is validated using a set of well-known poly(ethylene glycol) dimethacrylate hydrogels, showing excellent agreement between simulated and experimental volume growth rates. In the second part, in situ experimental results and simulations for photopolymerization in tissue cavities are presented. It was found that a cavity with a volume of 152 mm3 could be photopolymerized from the output of a 0.28-mm2 fiber by adding scattering lipid particles, whereas only a volume of 38 mm3 (25%) was achieved without particles. The proposed model provides a simple and robust method for solving complex photopolymerization problems in which the dimension of the light source is much smaller than the volume of the photopolymerizable hydrogel.
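The core Monte Carlo step behind such light-transport models can be sketched as below; this is a far simpler, static illustration than the dynamic model described above, with exponential free paths set by the total attenuation coefficient, absorption decided by the single-scattering albedo, and isotropic scattering. The coefficients and launch geometry are assumptions.

import numpy as np

rng = np.random.default_rng(0)

def propagate_photon(mu_a, mu_s, n_max=1000):
    """Random-walk a single photon; returns the position where it is absorbed (or last scattered)."""
    mu_t = mu_a + mu_s
    pos = np.zeros(3)
    direction = np.array([0.0, 0.0, 1.0])            # launched along the fiber axis
    for _ in range(n_max):
        step = -np.log(1.0 - rng.random()) / mu_t    # exponential free path length
        pos = pos + step * direction
        if rng.random() < mu_a / mu_t:               # absorbed: energy deposited here
            return pos
        cos_t = 2.0 * rng.random() - 1.0             # isotropic scattering direction
        phi = 2.0 * np.pi * rng.random()
        sin_t = np.sqrt(1.0 - cos_t**2)
        direction = np.array([sin_t * np.cos(phi), sin_t * np.sin(phi), cos_t])
    return pos

Aggregating many such photon histories gives the spatial dose distribution that, in the full model, drives the local conversion of liquid to solid polymer.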