970 results for Mathematical methods


Relevance: 20.00%

Publisher:

Abstract:

A number of mathematical models have been used to describe percutaneous absorption kinetics. In general, most of these models have used either diffusion-based or compartmental equations. The object of any mathematical model is to a) be able to represent the processes associated with absorption accurately, b) be able to describe/summarize experimental data with parametric equations or moments, and c) predict kinetics under varying conditions. However, in describing the processes involved, some developed models often suffer from being of too complex a form to be practically useful. In this chapter, we attempt to approach the issue of mathematical modeling in percutaneous absorption from four perspectives. These are to a) describe simple practical models, b) provide an overview of the more complex models, c) summarize some of the more important/useful models used to date, and d) examine some practical applications of the models. The range of processes involved in percutaneous absorption and considered in developing the mathematical models in this chapter is shown in Fig. 1. We initially address in vitro skin diffusion models and consider a) constant donor concentration and receptor conditions, b) the corresponding flux, donor, skin, and receptor amount-time profiles for solutions, and c) amount- and flux-time profiles when the donor phase is removed. More complex issues, such as finite-volume donor phase, finite-volume receptor phase, the presence of an efflux rate constant at the membrane-receptor interface, and two-layer diffusion, are then considered. We then look at specific models and issues concerned with a) release from topical products, b) use of compartmental models as alternatives to diffusion models, c) concentration-dependent absorption, d) modeling of skin metabolism, e) role of solute-skin-vehicle interactions, f) effects of vehicle loss, g) shunt transport, and h) in vivo diffusion, compartmental, physiological, and deconvolution models.
We conclude by examining topics such as a) deep tissue penetration, b) pharmacodynamics, c) iontophoresis, d) sonophoresis, and e) pitfalls in modeling.
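The simplest of the "simple practical models" mentioned above is steady-state diffusion through a homogeneous membrane under a constant donor concentration. A minimal sketch of the standard relations, with hypothetical parameter values (not taken from the chapter):

```python
# Steady-state skin permeation under constant donor concentration.
# Q(t) = Kp * Cv * (t - tlag) for t > tlag, with diffusional lag time
# tlag = h^2 / (6 D). Parameter values below are illustrative only.

def lag_time(h_cm, d_cm2_per_h):
    """Diffusional lag time (h) for a homogeneous membrane of thickness h_cm."""
    return h_cm ** 2 / (6.0 * d_cm2_per_h)

def cumulative_amount(t_h, kp_cm_per_h, cv_mg_per_cm3, lag_h):
    """Cumulative amount permeated per unit area (mg/cm^2) at time t_h,
    using the steady-state approximation (zero before the lag time)."""
    return max(0.0, kp_cm_per_h * cv_mg_per_cm3 * (t_h - lag_h))

# Example: 10 um membrane (1e-3 cm), D = 6e-6 cm^2/h.
tlag = lag_time(1e-3, 6e-6)
q10 = cumulative_amount(10.0, 1e-3, 5.0, 1.0)
```

Richer behaviour (finite donor volume, efflux rate constants, two-layer membranes) replaces this closed form with series solutions of the diffusion equation, which is where the more complex models in the chapter come in.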

Relevance: 20.00%

Publisher:

Abstract:

We use the finite element method to model the heat transfer phenomenon through permeable cracks in hydrothermal systems with upward throughflow. Since the finite element method is an approximate numerical method, it must be validated before it is used to solve any new kind of problem. However, analytical solutions that can be used to validate the finite element method and other numerical methods are rather limited in the literature, especially for the problem considered here. Keeping this in mind, we have derived analytical solutions for the temperature distribution along the vertical axis of a crack in a fluid-saturated porous layer. After the finite element method is validated by comparing the numerical solution with the analytical solution for the same benchmark problem, it is used to investigate the pore-fluid flow and heat transfer in layered hydrothermal systems with vertical permeable cracks. The related analytical and numerical results have demonstrated that vertical cracks are effective and efficient members to transfer heat energy from the bottom section to the top section in hydrothermal systems with upward throughflow.
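The validation step described above, comparing a finite element solution against an analytical benchmark, can be sketched on a much simpler problem than the one in the paper. The toy below solves 1-D steady heat conduction with fixed end temperatures, where linear elements reproduce the analytical linear profile at the nodes (this is a generic FEM sketch, not the paper's hydrothermal model):

```python
import numpy as np

def fem_steady_conduction(n_elems, t_left, t_right):
    """Linear-element FEM for d^2T/dx^2 = 0 on [0, 1] with Dirichlet ends."""
    n = n_elems + 1
    h = 1.0 / n_elems
    K = np.zeros((n, n))
    for e in range(n_elems):
        # Element stiffness matrix for a linear element of length h.
        K[e:e + 2, e:e + 2] += (1.0 / h) * np.array([[1.0, -1.0], [-1.0, 1.0]])
    f = np.zeros(n)
    # Impose the boundary temperatures directly.
    K[0, :] = 0.0;  K[0, 0] = 1.0;   f[0] = t_left
    K[-1, :] = 0.0; K[-1, -1] = 1.0; f[-1] = t_right
    return np.linalg.solve(K, f)

x = np.linspace(0.0, 1.0, 11)
numerical = fem_steady_conduction(10, 100.0, 20.0)
analytical = 100.0 + (20.0 - 100.0) * x   # benchmark solution
```

The same pattern scales up: derive an analytical solution for a benchmark case, confirm the discrete solution converges to it, then trust the solver on configurations with no closed-form answer.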

Relevance: 20.00%

Publisher:

Abstract:

Background and aim of the study: Results of valve re-replacement (reoperation) in 898 patients undergoing aortic valve replacement with cryopreserved homograft valves between 1975 and 1998 are reported. The study aim was to provide estimates of unconditional probability of valve reoperation and cumulative incidence function (actual risk) of reoperation. Methods: Valves were implanted by subcoronary insertion (n = 500), inclusion cylinder (n = 46), and aortic root replacement (n = 352). Probability of reoperation was estimated by adopting a mixture model framework within which estimates were adjusted for two risk factors: patient age at initial replacement, and implantation technique. Results: For a patient aged 50 years, the probability of reoperation in his/her lifetime was estimated as 44% and 56% for non-root and root replacement techniques, respectively. For a patient aged 70 years, estimated probability of reoperation was 16% and 25%, respectively. Given that a reoperation is required, patients with non-root replacement have a higher hazard rate than those with root replacement (hazard ratio = 1.4), indicating that non-root replacement patients tend to undergo reoperation earlier before death than root replacement patients. Conclusion: Younger patient age and root versus non-root replacement are risk factors for reoperation. Valve durability is much less in younger patients, while root replacement patients appear more likely to live longer and hence are more likely to require reoperation.
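The mixture model idea above separates two questions: whether a patient ever requires reoperation (the mixing proportion) and, if so, when (the conditional time distribution, where the hazard ratio of 1.4 applies). A much-simplified numerical sketch with an exponential conditional distribution and illustrative parameters (the study's actual model and covariate adjustments are richer than this):

```python
import math

def reop_probability(pi_reop, hazard_per_year, t_years):
    """Unconditional probability of reoperation by time t under a simple
    mixture model: a fraction pi_reop of patients eventually requires
    reoperation, with an exponential conditional time-to-reoperation."""
    return pi_reop * (1.0 - math.exp(-hazard_per_year * t_years))

# Illustrative (hypothetical) parameters for a 50-year-old:
# lifetime proportions 0.56 (root) and 0.44 (non-root), with the non-root
# group's conditional hazard 1.4 times the root group's.
base_hazard = 0.05
p_root_15y = reop_probability(0.56, base_hazard, 15.0)
p_nonroot_15y = reop_probability(0.44, 1.4 * base_hazard, 15.0)
```

Separating the "if" from the "when" is what lets the model report both a lifetime probability and a statement about how early reoperations occur within each group.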

Relevance: 20.00%

Publisher:

Abstract:

Background: A variety of methods for prediction of peptide binding to major histocompatibility complex (MHC) have been proposed. These methods are based on binding motifs, binding matrices, hidden Markov models (HMM), or artificial neural networks (ANN). There has been little prior work on the comparative analysis of these methods. Materials and Methods: We performed a comparison of the performance of six methods applied to the prediction of two human MHC class I molecules, including binding matrices and motifs, ANNs, and HMMs. Results: The selection of the optimal prediction method depends on the amount of available data (the number of peptides of known binding affinity to the MHC molecule of interest), the biases in the data set and the intended purpose of the prediction (screening of a single protein versus mass screening). When little or no peptide data are available, binding motifs are the most useful alternative to random guessing or use of a complete overlapping set of peptides for selection of candidate binders. As the number of known peptide binders increases, binding matrices and HMM become more useful predictors. ANN and HMM are the predictive methods of choice for MHC alleles with more than 100 known binding peptides. Conclusion: The ability of bioinformatic methods to reliably predict MHC binding peptides, and thereby potential T-cell epitopes, has major implications for clinical immunology, particularly in the area of vaccine design.
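Of the methods compared above, binding matrices are the easiest to state concretely: each (position, residue) pair gets a weight, and a peptide's score is the sum (in log space) of the weights along its sequence. A toy sketch with invented weights (not a real MHC matrix):

```python
# Toy position-specific scoring matrix for 9-mer peptides.
# Keys are (position, amino acid); values are illustrative log-weights.
toy_matrix = {
    (0, 'L'): 1.2,   # hypothetical anchor preference at position 1
    (1, 'M'): 0.9,
    (8, 'V'): 1.5,   # hypothetical C-terminal anchor
}

def score_peptide(peptide, matrix, default=0.0):
    """Sum position-specific weights over the peptide; unknown
    (position, residue) pairs contribute the default weight."""
    return sum(matrix.get((i, aa), default) for i, aa in enumerate(peptide))

s = score_peptide('LMAAAAAAV', toy_matrix)
```

A binding motif is the degenerate case of this (weights of one or zero at anchor positions only), while HMMs and ANNs replace the position-independence assumption with models that can capture correlations between positions, which is why they need more training peptides.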

Relevance: 20.00%

Publisher:

Abstract:

We study the transformation of maximally entangled states under the action of Lorentz transformations in a fully relativistic setting. By explicit calculation of the Wigner rotation, we describe the relativistic analog of the Bell states as viewed from two inertial frames moving with constant velocity with respect to each other. Though the finite dimensional matrices describing the Lorentz transformations are non-unitary, each single particle state of the entangled pair undergoes an effective, momentum dependent, local unitary rotation, thereby preserving the entanglement fidelity of the bipartite state. The details of how these unitary transformations are manifested are explicitly worked out for the Bell states comprised of massive spin 1/2 particles and massless photon polarizations. The relevance of this work to non-inertial frames is briefly discussed.
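The invariance claimed above can be stated compactly in standard notation (a sketch of the general mechanism, not the paper's explicit Wigner-rotation calculation):

```latex
% A Bell state shared between the two observers:
\left|\Phi^{+}\right\rangle
  = \tfrac{1}{\sqrt{2}}
    \bigl(\left|\uparrow\uparrow\right\rangle
        + \left|\downarrow\downarrow\right\rangle\bigr)

% A Lorentz transformation \Lambda acts on each single-particle state
% through a momentum-dependent Wigner rotation W(\Lambda, p):
U(\Lambda)\left|p,\sigma\right\rangle
  = \sum_{\sigma'}
    D_{\sigma'\sigma}\!\bigl(W(\Lambda,p)\bigr)
    \left|\Lambda p,\sigma'\right\rangle

% Because each D(W(\Lambda,p)) is unitary, the transformed pair
% (U_{1}\otimes U_{2})\left|\Phi^{+}\right\rangle
% has the same Schmidt coefficients as \left|\Phi^{+}\right\rangle,
% so the entanglement of the bipartite state is preserved.
\end{comment-free-sketch}
```

The non-unitarity mentioned in the abstract lives in the finite-dimensional boost matrices themselves; the physical single-particle states transform by the unitary rotation matrices \(D(W)\), which is why the entanglement fidelity survives the change of frame.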

Relevance: 20.00%

Publisher:

Abstract:

This paper is concerned with the use of scientific visualization methods for the analysis of feedforward neural networks (NNs). Inevitably, the kinds of data associated with the design and implementation of neural networks are of very high dimensionality, presenting a major challenge for visualization. A method is described using the well-known statistical technique of principal component analysis (PCA). This is found to be an effective and useful method of visualizing the learning trajectories of many learning algorithms such as back-propagation and can also be used to provide insight into the learning process and the nature of the error surface.
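The PCA-based visualization described above amounts to collecting the network's weight vector at each training step and projecting that high-dimensional trajectory onto its first two principal components. A minimal sketch with a synthetic trajectory standing in for real training snapshots:

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical training run: 50 snapshots of a 200-dimensional weight vector.
# A random walk stands in for the sequence of back-propagation updates.
snapshots = np.cumsum(rng.normal(size=(50, 200)), axis=0)

# PCA via SVD on the mean-centred snapshot matrix.
centred = snapshots - snapshots.mean(axis=0)
U, S, Vt = np.linalg.svd(centred, full_matrices=False)

# Learning trajectory projected onto the first two principal components.
trajectory_2d = centred @ Vt[:2].T

# Fraction of trajectory variance captured by each component.
explained = (S ** 2) / np.sum(S ** 2)
```

Plotting `trajectory_2d` as a connected path is what reveals the qualitative behaviour of the learning algorithm: smooth descent, oscillation around a minimum, or plateaus on the error surface.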

Relevance: 20.00%

Publisher:

Abstract:

The aim of this work was to exemplify the specific contribution of both two- and three-dimensional (3D) X-ray computed tomography to characterise earthworm burrow systems. To achieve this purpose we used 3D mathematical morphology operators to characterise burrow systems resulting from the activity of an anecic species (Aporrectodea nocturna) and an endogeic species (Allolobophora chlorotica), when the two species were introduced either separately or together into artificial soil cores. Images of these soil cores were obtained using a medical X-ray tomography scanner. Three-dimensional reconstructions of burrow systems were obtained using a specifically developed segmentation algorithm. To study the differences between burrow systems, a set of classical tools of mathematical morphology (granulometries) was used. So-called granulometries based on different structuring elements clearly separated the different burrow systems. They enabled us to show that burrows made by the anecic species were fatter, longer, more vertical, more continuous but less sinuous than burrows of the endogeic species. The granulometry transform of the soil matrix showed that burrows made by A. nocturna were more evenly distributed than those of A. chlorotica. Although a good discrimination was possible when only one species was introduced into the soil cores, it was not possible to separate burrows of the two species from each other in cases where both species were introduced into the same soil core. This limitation, partly due to the insufficient spatial resolution of the medical scanner, precluded the use of the morphological operators to study putative interactions between the two species.
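A granulometry measures how much of a binary image survives morphological opening with structuring elements of increasing size; thick features persist longer than thin ones. A 1-D sketch of the idea (opening with a length-k flat element removes runs of foreground shorter than k; the "burrow" profiles are invented):

```python
def run_lengths(binary):
    """Lengths of consecutive runs of 1s in a 1-D binary sequence."""
    runs, count = [], 0
    for v in binary:
        if v:
            count += 1
        elif count:
            runs.append(count)
            count = 0
    if count:
        runs.append(count)
    return runs

def granulometry(binary, max_k):
    """Remaining foreground area after opening with structuring elements
    of length 1..max_k (runs shorter than k are removed by the opening)."""
    runs = run_lengths(binary)
    return [sum(r for r in runs if r >= k) for k in range(1, max_k + 1)]

# Two hypothetical burrow cross-section profiles:
anecic   = [1] * 8 + [0] * 2 + [1] * 6        # fat, continuous segments
endogeic = [1, 1, 0, 1, 1, 0, 1, 1]           # thin, fragmented segments
g_anecic = granulometry(anecic, 8)
g_endogeic = granulometry(endogeic, 8)
```

The anecic-style curve stays high as the element grows, while the endogeic-style curve collapses early; it is this difference in decay, generalised to 3-D structuring elements, that separates the burrow systems in the study.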

Relevance: 20.00%

Publisher:

Abstract:

Estimating energy requirements is necessary in clinical practice when indirect calorimetry is impractical. This paper systematically reviews current methods for estimating energy requirements. Conclusions include: there is discrepancy between the characteristics of populations upon which predictive equations are based and current populations; tools are not well understood, and patient care can be compromised by inappropriate application of the tools. Data comparing tools and methods are presented and issues for practitioners are discussed. (C) 2003 International Life Sciences Institute.
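The predictive equations reviewed above are simple linear functions of weight, height, age, and sex. As one example, the original Harris-Benedict equations (coefficients as commonly quoted; the review's point is precisely that such coefficients were derived on historical populations):

```python
def harris_benedict(sex, weight_kg, height_cm, age_y):
    """Resting energy expenditure (kcal/day) from the original
    Harris-Benedict equations, coefficients as commonly quoted."""
    if sex == 'male':
        return 66.5 + 13.75 * weight_kg + 5.003 * height_cm - 6.755 * age_y
    return 655.1 + 9.563 * weight_kg + 1.850 * height_cm - 4.676 * age_y

bmr_m = harris_benedict('male', 70.0, 175.0, 40.0)
bmr_f = harris_benedict('female', 60.0, 165.0, 30.0)
```

In practice the result is then multiplied by activity and stress factors, which compounds the review's concern: errors in the base equation and in the factors multiply, and applying either to a population unlike the derivation sample can compromise care.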

Relevance: 20.00%

Publisher:

Abstract:

Taking functional programming to its extremities in search of simplicity still requires integration with other development (e.g. formal) methods. Induction is the key to deriving and verifying functional programs, but can be simplified by packaging proofs with functions, particularly folds, on data (structures). Totally Functional Programming (TFP) avoids the complexities of interpretation by directly representing data (structures) as platonic combinators - the functions characteristic to the data. The link between the two simplifications is that platonic combinators are a kind of partially-applied fold, which means that platonic combinators inherit fold-theoretic properties, but with some apparent simplifications due to the platonic combinator representation. However, although experience within functional programming suggests that TFP is widely applicable, significant work remains before TFP as such could be widely adopted.
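The "data as partially-applied fold" idea is easiest to see on natural numbers, where it reduces to Church numerals: the number n *is* the function that folds over it, applying a successor n times to a base value. A sketch in Python rather than a typed functional language (so the encoding, not TFP itself, is what is illustrated):

```python
# A natural number represented platonically as its own fold:
# n takes a "successor" s and a "zero" z and applies s to z exactly n times.

zero = lambda s, z: z

def succ(n):
    """Successor: one more application of s."""
    return lambda s, z: s(n(s, z))

def to_int(n):
    """Interpret the combinator by folding with ordinary arithmetic."""
    return n(lambda x: x + 1, 0)

def add(m, n):
    """Addition falls out of the representation:
    m applications of s on top of n applications."""
    return lambda s, z: m(s, n(s, z))

three = succ(succ(succ(zero)))
four = add(three, succ(zero))
```

Operations like `add` need no pattern matching or interpretation of a data structure; they inherit their correctness from fold-theoretic properties of the representation, which is the simplification the abstract refers to.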

Relevance: 20.00%

Publisher:

Abstract:

Objective: The Assessing Cost-Effectiveness - Mental Health (ACE-MH) study aims to assess, from a health sector perspective, whether there are options for change that could improve the effectiveness and efficiency of Australia's current mental health services by directing available resources toward 'best practice' cost-effective services. Method: The use of standardized evaluation methods addresses the reservations expressed by many economists about the simplistic use of League Tables based on economic studies confounded by differences in methods, context and setting. The cost-effectiveness ratio for each intervention is calculated using economic and epidemiological data. This includes systematic reviews and randomised controlled trials for efficacy, the Australian Surveys of Mental Health and Wellbeing for current practice and a combination of trials and longitudinal studies for adherence. The cost-effectiveness ratios are presented as cost (A$) per disability-adjusted life year (DALY) saved with a 95% uncertainty interval based on Monte Carlo simulation modelling. An assessment of interventions on 'second filter' criteria ('equity', 'strength of evidence', 'feasibility' and 'acceptability to stakeholders') allows broader concepts of 'benefit' to be taken into account, as well as factors that might influence policy judgements in addition to cost-effectiveness ratios. Conclusions: The main limitation of the study is in the translation of the effect size from trials into a change in the DALY disability weight, which required the use of newly developed methods. While comparisons within disorders are valid, comparisons across disorders should be made with caution. A series of articles is planned to present the results.
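The reporting format described above, a cost-per-DALY point estimate with a Monte Carlo 95% uncertainty interval, can be sketched generically. The cost and effect distributions below are purely illustrative, not ACE-MH inputs:

```python
import random
import statistics

random.seed(1)

def simulate_cer(n=10_000):
    """Cost (A$) per DALY saved, as a median with a 95% uncertainty
    interval from Monte Carlo simulation. Input distributions are
    illustrative stand-ins for the study's economic/epidemiological data."""
    ratios = []
    for _ in range(n):
        cost = random.gauss(5_000_000, 500_000)   # total intervention cost, A$
        dalys = random.gauss(400, 50)             # DALYs saved
        ratios.append(cost / dalys)
    ratios.sort()
    point = statistics.median(ratios)
    lo, hi = ratios[int(0.025 * n)], ratios[int(0.975 * n)]
    return point, lo, hi

point, lo, hi = simulate_cer()
```

Propagating uncertainty through the ratio this way, rather than dividing two point estimates, is what makes league-table comparisons between interventions more defensible.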

Relevance: 20.00%

Publisher:

Abstract:

Computational models complement laboratory experimentation for efficient identification of MHC-binding peptides and T-cell epitopes. Methods for prediction of MHC-binding peptides include binding motifs, quantitative matrices, artificial neural networks, hidden Markov models, and molecular modelling. Models derived by these methods have been successfully used for prediction of T-cell epitopes in cancer, autoimmunity, infectious disease, and allergy. For maximum benefit, the use of computer models must be treated as experiments analogous to standard laboratory procedures and performed according to strict standards. This requires careful selection of data for model building, and adequate testing and validation. A range of web-based databases and MHC-binding prediction programs are available. Although some available prediction programs for particular MHC alleles have reasonable accuracy, there is no guarantee that all models produce good quality predictions. In this article, we present and discuss a framework for modelling, testing, and applications of computational methods used in predictions of T-cell epitopes. (C) 2004 Elsevier Inc. All rights reserved.
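The "adequate testing and validation" called for above is usually reported as sensitivity and specificity against peptides of experimentally known binding status. A minimal sketch of that evaluation step (labels are invented for illustration):

```python
def evaluate(predicted, actual):
    """Sensitivity and specificity of binder/non-binder predictions
    against experimentally determined binding status."""
    tp = sum(p and a for p, a in zip(predicted, actual))
    tn = sum((not p) and (not a) for p, a in zip(predicted, actual))
    fp = sum(p and (not a) for p, a in zip(predicted, actual))
    fn = sum((not p) and a for p, a in zip(predicted, actual))
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical predictions for six test peptides vs laboratory results.
pred  = [True, True, False, False, True, False]
truth = [True, False, False, False, True, True]
sens, spec = evaluate(pred, truth)
```

Treating the model run like a laboratory experiment means the test peptides must be held out from model building; evaluating on training data is the modelling equivalent of an uncontrolled assay.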

Relevance: 20.00%

Publisher:

Abstract:

Analytical and bioanalytical methods of high-performance liquid chromatography with fluorescence detection (HPLC-FLD) were developed and validated for the determination of chloroaluminum phthalocyanine in different formulations of polymeric nanocapsules, plasma and livers of mice. Plasma and homogenized liver samples were extracted with ethyl acetate, and zinc phthalocyanine was used as internal standard. The results indicated that the methods were linear and selective for all matrices studied. Analysis of accuracy and precision showed adequate values, with variations lower than 10% in biological samples and lower than 2% in analytical samples. The recoveries were as high as 96% and 99% in the plasma and livers, respectively. The quantification limit of the analytical method was 1.12 ng/ml, and the limits of quantification of the bioanalytical method were 15 ng/ml and 75 ng/g for plasma and liver samples, respectively. The bioanalytical method developed was sensitive in the ranges of 15-100 ng/ml in plasma and 75-500 ng/g in liver samples and was applied to studies of biodistribution and pharmacokinetics of AlClPc. (C) 2011 Elsevier B.V. All rights reserved.
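The linearity and back-calculation at the heart of such a validated method come down to a least-squares calibration curve relating analyte concentration to detector response. A generic sketch with invented calibration data in the plasma working range quoted above (not the study's actual standards):

```python
def linear_fit(x, y):
    """Ordinary least-squares slope and intercept for a calibration curve."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
             / sum((xi - mx) ** 2 for xi in x))
    return slope, my - slope * mx

# Hypothetical calibration standards: concentration (ng/ml) vs peak area.
conc = [15.0, 25.0, 50.0, 75.0, 100.0]
area = [30.1, 50.2, 99.8, 150.3, 199.9]
slope, intercept = linear_fit(conc, area)

def back_calculate(peak_area):
    """Concentration of an unknown sample from its measured peak area."""
    return (peak_area - intercept) / slope
```

Accuracy and precision are then assessed by back-calculating quality-control samples of known concentration and checking that the deviations stay within the acceptance limits (here, under 10% for biological matrices).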

Relevance: 20.00%

Publisher:

Abstract:

The image reconstruction using the EIT (Electrical Impedance Tomography) technique is a nonlinear and ill-posed inverse problem which demands a powerful direct or iterative method. A typical approach for solving the problem is to minimize an error functional using an iterative method. In this case, an initial solution close enough to the global minimum is mandatory to ensure convergence to the correct minimum in an appropriate time interval. The aim of this paper is to present a new, simple, and low-cost technique (quadrant-searching) to reduce the search space and consequently to obtain an initial solution of the inverse problem of EIT. This technique calculates the error functional for four different contrast distributions, placing a large prospective inclusion in each of the four quadrants of the domain. Comparing the four values of the error functional, it is possible to draw conclusions about the internal electric contrast. For this purpose, we initially performed tests to assess the accuracy of the BEM (Boundary Element Method) when applied to the direct problem of EIT and to verify the behavior of the error functional surface in the search space. Finally, numerical tests have been performed to verify the new technique.
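The quadrant-searching idea can be sketched independently of the BEM machinery: place a trial inclusion in each quadrant, run the forward model, and keep the quadrant whose predicted boundary data best matches the measurements as the initial guess. The forward model below is a crude stand-in (a weighted sum over the contrast grid), not a boundary element solver:

```python
import numpy as np

def forward(contrast):
    """Stand-in forward problem: fake boundary 'measurements' as
    position-weighted column sums of the internal contrast map."""
    weights = np.linspace(1.0, 2.0, contrast.size).reshape(contrast.shape)
    return (weights * contrast).sum(axis=0)

def quadrant_contrast(n, quadrant):
    """Homogeneous background with a large prospective inclusion
    (contrast 3.0) placed in one of the four quadrants."""
    c = np.ones((n, n))
    half = n // 2
    r0 = 0 if quadrant in (0, 1) else half
    c0 = 0 if quadrant in (0, 2) else half
    c[r0:r0 + half, c0:c0 + half] = 3.0
    return c

n = 8
measured = forward(quadrant_contrast(n, 3))   # "true" inclusion in quadrant 3
errors = [float(np.sum((forward(quadrant_contrast(n, q)) - measured) ** 2))
          for q in range(4)]
best = int(np.argmin(errors))                 # starting quadrant for the solver
```

Only four forward solves are needed before the expensive iterative minimization starts, which is what makes the technique low-cost relative to searching the full contrast space.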