57 results for Sievert Pressione Assorbimento Desorbimento Idrogeno Volume Software Cinetica PCI Composizione

in University of Queensland eSpace - Australia


Relevance: 30.00%

Publisher:

Abstract:

The diffusion model for percutaneous absorption is developed for the specific case of delivery to the skin being limited by the application of a finite amount of solute. Two cases are considered: in the first there is a finite donor (vehicle) volume, and in the second there are solvent-deposited solids and a thin vehicle with a high partition coefficient. In both cases, the potential effect of an interfacial resistance at the stratum corneum surface is also considered. As in the previous paper, which was concerned with the application of a constant donor concentration, clearance limitations due to the viable epidermis, the in vitro sampling rate, or the perfusion rate in vivo are included. Numerical inversion of the Laplace domain solutions was used to simulate solute flux and cumulative amount absorbed and to model specific examples of percutaneous absorption of solvent-deposited solids. It was concluded that numerical inversion of the Laplace domain solutions for a diffusion model of percutaneous absorption, using standard scientific software (such as SCIENTIST, MicroMath Scientific Software) on modern personal computers, is a practical alternative to computation of infinite series solutions. Limits of the Laplace domain solutions were used to define the moments of the flux-time profiles for finite donor volumes and the slope of the terminal log flux-time profile. The mean transit time could be related to the diffusion time through the stratum corneum, viable epidermal, and donor diffusion layer resistances and to clearance from the receptor phase. Approximate expressions for the time to reach maximum flux (peak time) and for the maximum flux were also derived. The model was then validated against reported amount-time and flux-time profiles for finite doses applied to the skin.
It was concluded that for very small donor phase volumes or very large stratum corneum-vehicle partition coefficients (e.g., for solvent-deposited solids), the flux and amount of solute absorbed are affected by receptor conditions to a lesser extent than for a constant donor concentration. (C) 2001 Wiley-Liss, Inc. and the American Pharmaceutical Association. J Pharm Sci 90:504-520, 2001.
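The numerical Laplace inversion relied on above can be illustrated with the Gaver-Stehfest algorithm, one standard inversion scheme (the abstract does not say which algorithm the commercial software uses, so this choice is an assumption). The sketch inverts a transform with a known closed-form inverse as a check:

```python
import math

def stehfest_invert(F, t, N=12):
    """Invert a Laplace-domain function F(s) at time t using the
    Gaver-Stehfest algorithm (N must be even)."""
    ln2 = math.log(2.0)
    total = 0.0
    for k in range(1, N + 1):
        # Stehfest weight V_k
        v = 0.0
        for j in range((k + 1) // 2, min(k, N // 2) + 1):
            v += (j ** (N // 2) * math.factorial(2 * j)) / (
                math.factorial(N // 2 - j) * math.factorial(j)
                * math.factorial(j - 1) * math.factorial(k - j)
                * math.factorial(2 * j - k))
        v *= (-1) ** (k + N // 2)
        total += v * F(k * ln2 / t)
    return total * ln2 / t

# Check against a transform with a known inverse: L{exp(-t)} = 1/(s + 1)
approx = stehfest_invert(lambda s: 1.0 / (s + 1.0), t=1.0)
# approx is close to exp(-1) ≈ 0.36788
```

The same call, given the Laplace-domain flux of the diffusion model, would produce the flux-time profiles described in the abstract.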

Relevance: 30.00%

Publisher:

Abstract:

Computer-assisted learning has an important role in the teaching of pharmacokinetics to health sciences students because it transfers the emphasis from the purely mathematical domain to an 'experiential' domain, in which graphical and symbolic representations of actions and their consequences form the major focus for learning. Basic pharmacokinetic concepts can be taught by experimenting with the interplay of dose and dosage interval with drug absorption (e.g. absorption rate, bioavailability), drug distribution (e.g. volume of distribution, protein binding) and drug elimination (e.g. clearance), observing the effects on drug concentrations, using library ('canned') pharmacokinetic models. Such 'what if' approaches are found in calculator-simulators such as PharmaCalc, Practical Pharmacokinetics and PK Solutions. Others, such as SAAM II, ModelMaker, and Stella, represent the 'systems dynamics' genre, which requires the user to conceptualise a problem and formulate the model on-screen using symbols, icons, and directional arrows. The choice of software should be determined by the aims of the subject/course, the experience and background of the students in pharmacokinetics, and institutional factors, including the price and networking capabilities of the package(s). Enhanced learning may result if the computer teaching of pharmacokinetics is supported by tutorials, especially where the techniques are applied to solving problems in which the link with healthcare practice is clearly established.
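The 'what if' style of calculator-simulator described above can be sketched with a one-compartment oral-dosing model. This is a minimal illustration, not the implementation of any package named in the abstract, and all parameter values are invented:

```python
import math

def conc_repeated_oral(t, dose, tau, F, ka, V, CL):
    """Plasma concentration at time t (h) under repeated oral dosing of a
    one-compartment model, by superposition of single-dose profiles.
    Assumes first-order absorption (ka) and elimination, with ka != ke."""
    ke = CL / V                         # elimination rate constant (1/h)
    c = 0.0
    for i in range(int(t // tau) + 1):  # every dose given up to time t
        td = t - i * tau                # time since the i-th dose
        c += (F * dose * ka) / (V * (ka - ke)) * (
            math.exp(-ke * td) - math.exp(-ka * td))
    return c

# 'What if' the same daily dose is split over a shorter interval?
c_q24 = conc_repeated_oral(48.0, dose=500.0, tau=24.0,
                           F=0.9, ka=1.0, V=40.0, CL=4.0)
c_q12 = conc_repeated_oral(48.0, dose=250.0, tau=12.0,
                           F=0.9, ka=1.0, V=40.0, CL=4.0)
# Halving the interval at the same daily dose raises the trough level.
```

Varying dose, tau, V or CL and replotting the concentration profile is exactly the experiential loop such teaching software supports.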

Relevance: 20.00%

Publisher:

Abstract:

This report describes recent updates to the custom-built data-acquisition hardware operated by the Center for Hypersonics. In 2006, an ISA-to-USB bridging card was developed as part of Luke Hillyard's final-year thesis. This card allows the hardware to be connected to any recent personal computer via a serial (USB or RS232) port, and it provides a number of simple text-based commands for control of the hardware. A graphical user interface program was also updated to help the experimenter manage the data-acquisition functions. Sampled data are stored in text files compressed in the gzip format. To simplify the later archiving or transport of the data, all files specific to a shot are stored in a single directory. This includes a text file for the run description, the signal configuration file, and the individual sampled-data files, one for each signal that was recorded.
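The per-shot storage scheme described above can be sketched as follows; the file name and two-column data layout are assumptions for illustration, not the Center's actual format:

```python
import gzip
import os
import tempfile

# Hypothetical shot-directory layout: one gzip-compressed text file per
# recorded signal, each line holding "time value".
shot_dir = tempfile.mkdtemp(prefix="shot_")
with gzip.open(os.path.join(shot_dir, "signal_01.txt.gz"), "wt") as f:
    f.write("0.000 1.23\n0.001 1.31\n0.002 1.28\n")

def read_signal(path):
    """Read one compressed sampled-data file into (time, value) pairs."""
    with gzip.open(path, "rt") as f:
        return [tuple(float(x) for x in line.split()) for line in f]

data = read_signal(os.path.join(shot_dir, "signal_01.txt.gz"))
# data == [(0.0, 1.23), (0.001, 1.31), (0.002, 1.28)]
```

Because each shot lives in one directory, archiving it is a matter of tarring that directory; the gzip compression of each signal file is transparent to readers.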

Relevance: 20.00%

Publisher:

Abstract:

The XSophe-Sophe-XeprView® computer simulation software suite enables scientists to easily determine spin Hamiltonian parameters from isotropic, randomly oriented, and single-crystal continuous wave electron paramagnetic resonance (CW EPR) spectra of radicals and of isolated paramagnetic metal ion centers or clusters found in metalloproteins, chemical systems and materials science. XSophe provides an X-windows graphical user interface to the Sophe programme and allows creation of multiple input files, local and remote execution of Sophe, and display of the sophelog (output from Sophe) and of input parameters/files. Sophe is a sophisticated computer simulation programme employing a number of innovative technologies, including the Sydney OPera HousE (SOPHE) partition and interpolation schemes, a field segmentation algorithm, the mosaic misorientation linewidth model, parallelisation and spectral optimisation. In conjunction with the SOPHE partition scheme and the field segmentation algorithm, the SOPHE interpolation scheme and the mosaic misorientation linewidth model greatly increase the speed of simulations for most spin systems. Employing brute-force matrix diagonalisation to simulate an EPR spectrum from a high-spin Cr(III) complex with the spin Hamiltonian parameters ge = 2.00, D = 0.10 cm⁻¹, E/D = 0.25, Ax = 120.0, Ay = 120.0, Az = 240.0 × 10⁻⁴ cm⁻¹ requires a SOPHE grid size of N = 400 (to produce a good signal-to-noise ratio) and takes 229.47 s. In contrast, use of either the SOPHE interpolation scheme or the mosaic misorientation linewidth model requires a SOPHE grid size of only N = 18 and takes 44.08 s and 0.79 s, respectively. Results from Sophe are transferred via the Common Object Request Broker Architecture (CORBA) to XSophe and subsequently to XeprView®, where the simulated CW EPR spectra (1D and 2D) can be compared to the experimental spectra.
Energy level diagrams, transition roadmaps and transition surfaces aid the interpretation of complicated randomly oriented CW EPR spectra and can be viewed with a web browser and an OpenInventor scene graph viewer.
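The matrix-diagonalisation step mentioned above can be illustrated by building the S = 3/2 spin Hamiltonian with the zero-field parameters quoted in the abstract and diagonalising it at a single field point. The field magnitude and orientation are assumptions, hyperfine terms are omitted, and a real simulation sweeps field and orientation over the SOPHE grid; this is only the inner kernel:

```python
import numpy as np

S = 1.5                                # high-spin Cr(III), S = 3/2
m = np.arange(S, -S - 1.0, -1.0)       # basis |m>: 3/2, 1/2, -1/2, -3/2
dim = len(m)
Sz = np.diag(m)
# Raising operator: <m+1|S+|m> = sqrt(S(S+1) - m(m+1))
Sp = np.diag(np.sqrt(S * (S + 1) - m[1:] * (m[1:] + 1)), 1)
Sx = (Sp + Sp.T) / 2.0
Sy = (Sp - Sp.T) / 2.0j

g, D = 2.00, 0.10                      # values from the abstract (cm^-1)
E = 0.25 * D                           # E/D = 0.25
muB = 4.6686e-5                        # Bohr magneton in cm^-1 per gauss
B = 3400.0                             # assumed field (gauss) along z

# Zero-field splitting plus electron Zeeman terms (hyperfine omitted)
H = (D * (Sz @ Sz - S * (S + 1) / 3.0 * np.eye(dim))
     + E * (Sx @ Sx - Sy @ Sy)
     + g * muB * B * Sz)
levels = np.linalg.eigvalsh(H)         # four energy levels (cm^-1), ascending
```

Resonance fields are then found where level differences match the microwave quantum, which is what makes the brute-force approach expensive over a dense orientation grid.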

Relevance: 20.00%

Publisher:

Abstract:

One of the challenges in scientific visualization is to generate software libraries suitable for the large-scale data emerging from tera-scale simulations and instruments. We describe the efforts currently under way at SDSC and NPACI to address these challenges. The scope of the SDSC project spans data handling, graphics, visualization, and scientific application domains. Components of the research focus on the following areas: intelligent data storage, layout and handling, using an associated “Floor-Plan” (meta-data); performance optimization on parallel architectures; extension of SDSC’s scalable, parallel, direct volume renderer to allow perspective viewing; and interactive rendering of fractional images (“imagelets”), which facilitates the examination of large datasets. These concepts are coordinated within a data-visualization pipeline, which operates on component data blocks sized to fit within the available computing resources. A key feature of the scheme is that the meta-data that tag the data blocks can be propagated and applied consistently: at the disk level, in distributing the computations across parallel processors, in “imagelet” composition, and in feature tagging. The work reflects the emerging challenges and opportunities presented by the ongoing progress in high-performance computing (HPC) and the deployment of the data, computational, and visualization Grids.

Relevance: 20.00%

Publisher:

Abstract:

The artificial dissipation effects in some solutions obtained with Navier-Stokes flow solvers are demonstrated. The solvers were used to calculate the flow of an artificially dissipative fluid, that is, a fluid whose dissipative properties arise entirely from the solution method itself. This was done by setting the viscosity and heat-conduction coefficients in the Navier-Stokes solvers to zero everywhere inside the flow, while at the same time applying the usual no-slip and thermally conducting boundary conditions at solid boundaries. The result is an artificially dissipative flow solution, whose dissipation depends entirely on the solver itself. If the difference between the solutions obtained with the viscosity and thermal conductivity set to zero and set to their correct values is small, it is clear that the artificial dissipation is dominating and the solutions are unreliable.
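The phenomenon described above, dissipation contributed by the discretisation itself, can be reproduced with a far simpler scheme than a Navier-Stokes solver: first-order upwind advection of an inviscid profile. This is an illustrative analogue, not the solver studied in the paper:

```python
import numpy as np

# First-order upwind advection of an inviscid top-hat profile at unit
# speed on a periodic domain.  The physical viscosity is exactly zero,
# so any smearing of the profile is artificial dissipation of the scheme.
nx = 200
dx = 1.0 / nx
c = 1.0
cfl = 0.5
dt = cfl * dx / c
x = (np.arange(nx) + 0.5) * dx
u = np.where((x > 0.3) & (x < 0.35), 1.0, 0.0)   # narrow top-hat pulse
u0 = u.copy()

for _ in range(200):                              # advance to t = 0.5
    u = u - c * dt / dx * (u - np.roll(u, 1))     # upwind difference

# The exact inviscid solution is a pure shift of u0; instead the peak
# has decayed markedly, exposing the scheme's numerical diffusion.
```

The total amount of u is conserved, but the profile spreads exactly as if a real viscosity were present; comparing the zero-viscosity run with a finite-viscosity run is the diagnostic the abstract proposes.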

Relevance: 20.00%

Publisher:

Abstract:

Using Landsat imagery, forest canopy density (FCD) estimated with the FCD Mapper® was correlated with predominant height (PDH, measured as the average height of the tallest 50 trees per hectare) for 20 field plots measured in native forest at Noosa Heads, south-east Queensland, Australia. A corresponding image was used to calculate FCD on Leyte Island, the Philippines, and was validated on the ground for accuracy. The FCD Mapper was produced for the International Tropical Timber Organisation and estimates FCD as an index of canopy density using reflectance characteristics of Landsat Enhanced Thematic Mapper (ETM) images. The FCD Mapper is a ‘semi-expert’ computer program which uses interactive screens to allow the operator to make decisions concerning the classification of land into bare soil, grass and forest. At Noosa, a strong positive nonlinear relationship (r² = 0.86) was found between FCD and PDH for 15 field plots with variable PDH but complete canopy closure. An additional five field plots were measured in forest with a broken canopy, and the software assessed these plots as having a much lower FCD than forest with canopy closure. FCD estimates for forest and agricultural land on the island of Leyte and subsequent field validation showed that, at appropriate settings, the FCD Mapper differentiated between tropical rainforest and banana or coconut plantation. These findings suggest that in forests with a closed canopy this remote-sensing technique has promise for forest inventory and productivity assessment. The findings also suggest that the software has promise for discriminating between native forest with a complete canopy and forest with a broken canopy, such as coconut or banana plantation.

Relevance: 20.00%

Publisher:

Abstract:

A copolymer of 2-hydroxyethyl methacrylate (HEMA) with 2-ethoxyethyl methacrylate (EEMA) was synthesized, and its molecular mobility, free volume, and density were examined as a function of composition. These properties were correlated with the equilibrium water uptake in order to determine which of them were most influential in causing high water sorption, as these materials are suitable candidates for hydrogel systems. It was found that the polar HEMA repeat unit results in a rigid, glassy sample at room temperature, due to the high degree of hydrogen bonding between chains, whereas high EEMA content leads to rubbery samples with subambient glass transition temperatures. The free-volume properties on the molecular scale, measured by positron annihilation lifetime spectroscopy (PALS), showed that higher HEMA content led to smaller and fewer holes and a lower free-volume fraction than EEMA. Therefore the high water uptake of HEMA-containing copolymers is largely related to the high polarity of the HEMA unit compared with EEMA, despite the low content of free volume into which the water can initially diffuse. Trends in density with copolymer composition, measured on a macroscopic level, differ from those seen by PALS, indicating that the two techniques probe different scales of packing. (C) 1998 John Wiley & Sons, Inc.

Relevance: 20.00%

Publisher:

Abstract:

Plant cells are characterized by low water content, so the fraction of cell volume (volume fraction) in a vessel is large compared with other cell systems, even when the cell concentrations are the same. Therefore, the concentration of plant cells should preferably be expressed on a liquid-volume basis rather than on a total-vessel-volume basis. In this paper, a new model is proposed to analyze the behavior of a plant cell culture by dividing the cell suspension into biotic and abiotic phases. Using this model, we analyzed cell growth and alkaloid production by Catharanthus roseus. Large errors in the simulated results were observed if the phase segregation was not considered.
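The proposed liquid-volume basis amounts to rescaling a concentration expressed per total vessel volume by the abiotic (liquid) fraction. A minimal sketch with invented numbers:

```python
def liquid_basis(c_total, biotic_fraction):
    """Convert a concentration per total vessel volume into a
    concentration per liquid (abiotic-phase) volume, given the volume
    fraction of the vessel occupied by cells (the biotic phase)."""
    return c_total / (1.0 - biotic_fraction)

# Invented numbers: a dissolved product at 2.0 g/L of vessel volume,
# with cells occupying 40% of the vessel, is 2.0/0.6 ≈ 3.33 g/L in the
# liquid phase that the cells actually experience.
c_liq = liquid_basis(2.0, 0.40)
```

At the high volume fractions typical of plant cell cultures, the gap between the two bases is large, which is why ignoring phase segregation produced the large simulation errors reported.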

Relevance: 20.00%

Publisher:

Abstract:

Expokit provides a set of routines aimed at computing matrix exponentials. More precisely, it computes either a small matrix exponential in full, the action of a large sparse matrix exponential on an operand vector, or the solution of a system of linear ODEs with constant inhomogeneity. The backbone of the sparse routines consists of matrix-free Krylov subspace projection methods (the Arnoldi and Lanczos processes), which is why the toolkit is capable of coping with sparse matrices of large dimension. The software handles real and complex matrices and provides specific routines for symmetric and Hermitian matrices. The computation of matrix exponentials is a numerical issue of critical importance in the area of Markov chains, where, furthermore, the computed solution is subject to probabilistic constraints. In addition to addressing general matrix exponentials, particular attention is given to the computation of transient states of Markov chains.
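The central sparse operation, the action of a matrix exponential on a vector applied to the transient states of a Markov chain, can be sketched with SciPy's expm_multiply, an analogue of Expokit's sparse routines rather than Expokit itself:

```python
import numpy as np
from scipy.sparse import random as sparse_random
from scipy.sparse.linalg import expm_multiply

# Build the generator matrix Q of a random continuous-time Markov chain:
# non-negative off-diagonal rates, with each row summing to zero.
n = 500
Q = sparse_random(n, n, density=0.01, format="lil", random_state=0)
Q.setdiag(0.0)
Q.setdiag(-np.asarray(Q.sum(axis=1)).ravel())
Q = Q.tocsr()

# Transient distribution p(t) = p(0) exp(Qt): apply exp(Q^T t) to p(0)
# without ever forming the dense matrix exponential.
p0 = np.zeros(n)
p0[0] = 1.0
pt = expm_multiply(Q.transpose().tocsr(), p0)  # t = 1
# pt sums to 1: the generator preserves total probability.
```

This is exactly the "probabilistic constraints" setting the abstract mentions: the computed vector must remain a probability distribution, which a generator matrix guarantees up to roundoff.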

Relevance: 20.00%

Publisher:

Abstract:

Magnetic resonance imaging (MRI) was used to evaluate, and compare with anthropometry, a fundamental bioelectrical impedance analysis (BIA) method for predicting muscle and adipose tissue composition in the lower limb. Healthy volunteers (eight men and eight women), aged 41 to 62 years, with mean (S.D.) body mass indices of 28.6 (5.4) kg/m² and 25.1 (5.4) kg/m² respectively, were subjected to MRI leg scans, from which 20-cm sections of thigh and 10-cm sections of lower leg (calf) were analysed for muscle and adipose tissue content, using specifically developed software. Muscle and adipose tissue were also predicted from anthropometric measurements of circumferences and skinfold thicknesses, and by use of fundamental BIA equations involving section impedance at 50 kHz and tissue-specific resistivities. Anthropometric assessments of circumferences, cross-sectional areas and volumes for total constituent tissues closely matched MRI estimates. Muscle volume was substantially overestimated (bias: thigh, -40%; calf, -18%) and adipose tissue underestimated (bias: thigh, 43%; calf, 8%) by anthropometry, in contrast to generally better predictions by the fundamental BIA approach for muscle (bias: thigh, -12%; calf, 5%) and adipose tissue (bias: thigh, 17%; calf, -28%). However, both methods demonstrated considerable individual variability (95% limits of agreement 20-77%). In general, there was similar reproducibility for the anthropometric and fundamental BIA methods in the thigh (inter-observer residual coefficient of variation for muscle 3.5% versus 3.8%), but the latter was better in the calf (8.2% versus 4.5%). This study suggests that the fundamental BIA method has advantages over anthropometry for measuring lower-limb tissue composition in healthy individuals.
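The fundamental BIA approach models a limb segment as a uniform cylindrical conductor, so tissue volume follows from V = ρL²/Z. A minimal sketch with hypothetical numbers (the study's actual resistivities and impedances are not given in the abstract):

```python
def bia_tissue_volume(resistivity, length, impedance):
    """Fundamental BIA estimate of conductive tissue volume for a limb
    segment modelled as a uniform cylinder: V = rho * L^2 / Z,
    with resistivity in ohm.cm, length in cm and impedance in ohm."""
    return resistivity * length ** 2 / impedance   # volume in cm^3

# Hypothetical numbers: a 20-cm thigh section, muscle resistivity of
# 150 ohm.cm at 50 kHz, and a measured segment impedance of 12 ohm.
v = bia_tissue_volume(150.0, 20.0, 12.0)   # 5000.0 cm^3
```

Using a tissue-specific resistivity for muscle versus adipose tissue is what distinguishes this fundamental method from empirical whole-body BIA regressions.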

Relevance: 20.00%

Publisher:

Abstract:

Bioelectrical impedance analysis (BIA) has been reported to be insensitive to changes in water volumes in individual subjects. This study was designed to investigate the effect on the intracellular and extracellular resistances (Ri and Re) of body segments of subjects whose body water was changed without significant change to the total amount of electrolyte in the respective fluids. Twelve healthy adult subjects were recruited. Ri and Re of the leg, trunk, and arm of each subject were determined from BIA measures prior to the commencement of two separate studies that involved intervention, resulting in a loss or gain of body water effected either by a sauna followed by water intake (study 1) or by ingestion alone (study 2). Ri and Re of the segments were also determined at a number of times following these interventions. The mean change in body water, expressed as a percentage of body weight, was 0.9% in study 1 and 1.25% in study 2. For each study, the results for each subject were normalized, for each limb, to the initial (prestudy) value, and the normalized results for each segment were then pooled for all subjects. ANOVA of these pooled results failed to demonstrate any significant differences between the normalized mean values of Ri or Re of the segments measured through the course of each study. The failure to detect a change in Ri or Re is explained in terms of the basic theory of BIA.

Relevance: 20.00%

Publisher:

Relevance: 20.00%

Publisher:

Abstract:

In this and a preceding paper, we provide an introduction to the Fujitsu VPP range of vector-parallel supercomputers and to some of the computational chemistry software available for the VPP. Here, we consider the implementation and performance of seven popular chemistry application packages. The codes discussed range from classical molecular dynamics to semiempirical and ab initio quantum chemistry. All have evolved from sequential codes, and have typically been parallelised using a replicated-data approach. As such, they are well suited to the large-memory/fast-processor architecture of the VPP. For one code, CASTEP, a distributed-memory data-driven parallelisation scheme is presented. (C) 2000 Published by Elsevier Science B.V. All rights reserved.