194 results for Methods: numerical



We use the finite element method to model heat transfer through permeable cracks in hydrothermal systems with upward throughflow. Since the finite element method is an approximate numerical method, it must be validated before it is used to solve any new kind of problem. However, analytical solutions that can be used to validate the finite element method and other numerical methods are rather limited in the literature, especially for the problem considered here. With this in mind, we have derived analytical solutions for the temperature distribution along the vertical axis of a crack in a fluid-saturated porous layer. After the finite element method is validated by comparing the numerical solution with the analytical solution for the same benchmark problem, it is used to investigate pore-fluid flow and heat transfer in layered hydrothermal systems with vertical permeable cracks. The related analytical and numerical results demonstrate that vertical cracks are effective and efficient conduits for transferring heat energy from the bottom section to the top section in hydrothermal systems with upward throughflow.
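The validation workflow this abstract describes — check the finite element solution against a closed-form benchmark before trusting it on a new class of problem — can be illustrated on a much simpler model problem. A minimal sketch (not the authors' code; a 1D Poisson problem stands in for the crack heat-transfer model):

```python
import numpy as np

def fem_1d_poisson(n, f=1.0):
    """Linear finite elements for -u'' = f on (0, 1) with u(0) = u(1) = 0."""
    h = 1.0 / n
    # Assemble the standard tridiagonal stiffness matrix for n-1 interior nodes.
    K = np.zeros((n - 1, n - 1))
    np.fill_diagonal(K, 2.0 / h)
    np.fill_diagonal(K[1:], -1.0 / h)      # sub-diagonal
    np.fill_diagonal(K[:, 1:], -1.0 / h)   # super-diagonal
    b = np.full(n - 1, f * h)              # consistent load for constant f
    u = np.zeros(n + 1)
    u[1:-1] = np.linalg.solve(K, b)
    return np.linspace(0.0, 1.0, n + 1), u

x, u = fem_1d_poisson(8)
u_exact = 0.5 * x * (1.0 - x)   # analytical solution for f = 1
print(np.max(np.abs(u - u_exact)))
```

For this 1D problem the nodal values agree with the analytical solution to rounding error, which is the kind of benchmark agreement the paper uses to justify applying the method to the layered hydrothermal geometry.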


Background: A variety of methods for predicting peptide binding to major histocompatibility complex (MHC) molecules have been proposed. These methods are based on binding motifs, binding matrices, hidden Markov models (HMMs), or artificial neural networks (ANNs). There has been little prior work on the comparative analysis of these methods. Materials and Methods: We compared the performance of six methods - binding matrices and motifs, ANNs, and HMMs - applied to the prediction of peptide binding to two human MHC class I molecules. Results: The selection of the optimal prediction method depends on the amount of available data (the number of peptides of known binding affinity to the MHC molecule of interest), the biases in the data set, and the intended purpose of the prediction (screening of a single protein versus mass screening). When little or no peptide data are available, binding motifs are the most useful alternative to random guessing or to the use of a complete overlapping set of peptides for selection of candidate binders. As the number of known peptide binders increases, binding matrices and HMMs become more useful predictors. ANNs and HMMs are the predictive methods of choice for MHC alleles with more than 100 known binding peptides. Conclusion: The ability of bioinformatic methods to reliably predict MHC-binding peptides, and thereby potential T-cell epitopes, has major implications for clinical immunology, particularly in the area of vaccine design.
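Of the method families compared, a binding matrix is the simplest to sketch: count residue frequencies per position over known binders and score candidates by log-odds against a background. The peptide sequences and pseudocount below are toy illustrations, not data from the study:

```python
import numpy as np

AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"
AA_INDEX = {a: i for i, a in enumerate(AMINO_ACIDS)}

def build_matrix(binders, pseudocount=1.0):
    """Log-odds position-specific scoring matrix from known 9-mer binders."""
    counts = np.full((9, 20), pseudocount)
    for pep in binders:
        for pos, aa in enumerate(pep):
            counts[pos, AA_INDEX[aa]] += 1.0
    freqs = counts / counts.sum(axis=1, keepdims=True)
    return np.log2(freqs * 20.0)   # background = uniform 1/20

def score(matrix, peptide):
    return sum(matrix[pos, AA_INDEX[aa]] for pos, aa in enumerate(peptide))

binders = ["KLWVTVYYG", "KLYQLEKEP", "KLTPLCVTL"]  # hypothetical training set
m = build_matrix(binders)
print(score(m, "KLTPLCVTL") > score(m, "GGGGGGGGG"))  # binder-like scores higher
```

With only a handful of training peptides such a matrix is crude, which is exactly the data-size dependence the comparison highlights: matrices and HMMs only pull ahead of simple motifs once enough known binders are available.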


The earth's tectonic plates are strong, viscoelastic shells which make up the outermost part of a thermally convecting, predominantly viscous layer. Brittle failure of the lithosphere occurs when stresses are high. In order to build a realistic simulation of the planet's evolution, the complete viscoelastic/brittle convection system needs to be considered. A particle-in-cell finite element method is demonstrated which can simulate very large deformation viscoelasticity with a strain-dependent yield stress. This is applied to a plate-deformation problem. Numerical accuracy is demonstrated relative to analytic benchmarks, and the characteristics of the method are discussed.
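The key ingredient named above — a strain-dependent yield stress carried on material particles — can be sketched in a few lines. The weakening law, parameter values, and timestep below are hypothetical stand-ins for illustration, not the paper's formulation:

```python
import numpy as np

def yield_stress(plastic_strain, tau0=100e6, tau_min=20e6, strain_ref=0.1):
    """Linear strain weakening from tau0 down to tau_min over strain_ref."""
    frac = np.clip(plastic_strain / strain_ref, 0.0, 1.0)
    return tau0 + (tau_min - tau0) * frac

def apply_yield(stress, plastic_strain, dt, eta=1e21):
    """Cap stress at the current yield value; accumulate plastic strain
    on particles where yielding occurs."""
    tau_y = yield_stress(plastic_strain)
    yielding = stress > tau_y
    # Plastic strain rate ~ excess stress relaxed over the viscous timescale.
    plastic_strain = plastic_strain + np.where(
        yielding, (stress - tau_y) / (2.0 * eta) * dt, 0.0)
    return np.minimum(stress, tau_y), plastic_strain

s = np.array([50e6, 150e6])     # Pa; second particle is over yield
ps = np.array([0.0, 0.05])      # accumulated plastic strain per particle
s2, ps2 = apply_yield(s, ps, dt=3.15e10)   # ~1000 yr step
print(s2)  # stress capped at the (weakened) yield value where exceeded
```

Because the history variable lives on the particles rather than the mesh, it survives the very large deformations that a fixed-grid formulation would smear out, which is the motivation for the particle-in-cell approach.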


We previously showed that 16-day-old rats exposed to a relatively high dose of ethanol at 10-15 postnatal days of age have fewer neurons in the hilus region of the hippocampus compared with controls. Dentate gyrus granule cell numbers, however, showed no statistically significant changes attributable to the ethanol treatment. It is possible that some of the changes in brain morphology brought about by exposure to ethanol during early life may not be manifested until later in life. This question has been further addressed in an extension to our previous study. Wistar rats were exposed to a relatively high daily dose of ethanol on postnatal days 10-15 by placement in a chamber containing ethanol vapour for 3 h/day. The blood ethanol concentration was found to be approximately 430 mg/dl at the end of the period of exposure. Groups of ethanol-treated (ET), separation control (SC), and mother-reared control (MRC) rats were anaesthetised and killed either at 16 or 30 days of age by perfusion with phosphate-buffered 2.5% glutaraldehyde. The Cavalieri principle and the physical disector method were used to estimate, respectively, the regional volumes and neuron numerical densities in the hilus and granule cell regions of the dentate gyrus. The total numbers of neurons in the hilus region and granule cell layer were computed from these estimates. It was found that 16-day-old animals had 398,000-441,000 granule cells, irrespective of group. The numbers of granule cells increased such that, by 30 days of age, rats had 487,000-525,500 granule cells. However, there were no significant differences between ethanol-treated rats and their age-matched controls in granule cell numbers. In contrast, ethanol-treated rats had slightly but significantly fewer neurons in the hilus region than did control animals at 16 days of age, but not at 30 days of age. Therefore, it appears that a short period of ethanol exposure during early life can affect the numbers of some hippocampal neurons, but not others. The effects on hilar neuron numbers observed as a result of such short periods of ethanol treatment appeared to be transitory. (C) 2003 Wiley-Liss, Inc.
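The two stereological estimators named in the abstract combine by simple arithmetic: the Cavalieri principle gives a regional volume from section areas and spacing, the physical disector gives a numerical density, and their product is the total neuron number. A sketch with hypothetical numbers (not the study's data):

```python
def cavalieri_volume(section_areas_mm2, section_spacing_mm):
    """Cavalieri estimator: V = t * sum(A_i) over systematic sections."""
    return section_spacing_mm * sum(section_areas_mm2)

def disector_density(counted, n_disectors, frame_area_mm2, height_mm):
    """Physical disector: Nv = sum(Q-) / (number * frame area * height)."""
    return counted / (n_disectors * frame_area_mm2 * height_mm)

# Illustrative numbers only:
V = cavalieri_volume([0.9, 1.1, 1.2, 1.0, 0.8], section_spacing_mm=0.2)  # mm^3
Nv = disector_density(counted=120, n_disectors=50,
                      frame_area_mm2=0.01, height_mm=0.01)               # mm^-3
print(round(Nv * V))   # total neuron number N = Nv * V
```

Estimating N as the product of the two quantities, rather than counting in a single section, is what makes the estimate unbiased by section thickness and cell size.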


The physical conditions required to provide for the tectonic stability of cratonic crust and for the relative longevity of deep cratonic lithosphere within a dynamic, convecting mantle are explored through a suite of numerical simulations. The simulations allow chemically distinct continents to reside within the upper thermal boundary layer of a thermally convecting mantle layer. A rheologic formulation, which models both brittle and ductile behavior, is incorporated to allow for plate-like behavior and the associated subduction of oceanic lithosphere. Several mechanisms that may stabilize cratons are considered. The two most often invoked mechanisms, chemical buoyancy and/or high viscosity of cratonic root material, are found to be relatively ineffective if cratons come into contact with subduction zones. High root viscosity can provide for stability and longevity but only within a thick root limit in which the thickness of chemically distinct, high-viscosity cratonic lithosphere exceeds the thickness of old oceanic lithosphere by at least a factor of 2. This end-member implies a very thick mechanical lithosphere for cratons. A high brittle yield stress for cratonic lithosphere as a whole, relative to oceanic lithosphere, is found to be an effective and robust means for providing stability and lithospheric longevity. This mode does not require exceedingly deep strength within cratons. A high yield stress for only the crustal or mantle component of the cratonic lithosphere is found to be less effective, as detachment zones can then form at the crust-mantle interface, which decreases the longevity potential of cratonic roots. The degree of yield stress variation between cratonic and oceanic lithosphere required for stability and longevity can be decreased if cratons are bordered by continental lithosphere that has a relatively low yield stress, i.e., mobile belts.
Simulations that combine all the mechanisms can lead to crustal stability and deep root longevity for model cratons over several mantle overturn times, but the dominant stabilizing factor remains a relatively high brittle yield stress for cratonic lithosphere.


The paper presents a theory for modelling flow in anisotropic, viscous rock. This theory was originally developed for the simulation of large-deformation processes, including the folding and kinking of multi-layered visco-elastic rock (Muhlhaus et al. [1,2]). The orientation of slip planes, in the context of crystallographic slip, is determined by the normal vector - the director - of these surfaces. The model is applied to simulate anisotropic mantle convection. We compare the evolution of flow patterns, Nusselt number and director orientations for isotropic and anisotropic rheologies. In the simulations we utilize two different finite element methodologies: the Lagrangian Integration Point method of Moresi et al. [8] and an Eulerian formulation, which we implemented in the finite element based PDE solver Fastflo (www.cmis.csiro.au/Fastflo/). The reason for utilizing two different finite element codes was, firstly, to study the influence of an anisotropic power-law rheology, which is not currently implemented in the Lagrangian Integration Point scheme [8], and secondly, to study the numerical performance of the Eulerian (Fastflo) and Lagrangian integration schemes [8]. It turned out that, whereas in the Lagrangian method the Nusselt number versus time plot reached only a quasi-steady state, in which the Nusselt number oscillates around a steady-state value, the Eulerian scheme reaches exact steady states and produces a high degree of alignment (director orientation locally orthogonal to the velocity vector almost everywhere in the computational domain). In the simulations, emergent anisotropy was strongest, in terms of modulus contrast, in the upwelling and downwelling plumes. Mechanisms for anisotropic material behavior in the mantle dynamics context are discussed by Christensen [3]. The dominant mineral phases in the mantle generally do not exhibit strong elastic anisotropy, but they may still be oriented by the convective flow. Thus viscous anisotropy (the main focus of this paper) may or may not correlate with elastic or seismic anisotropy.
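The director enters a constitutive law of this kind through a fourth-order orientation tensor that lowers the viscosity for shear on the layering. The sketch below uses the tensor form common to Muhlhaus-type transversely isotropic models; the viscosity values are arbitrary and the code is illustrative, not the paper's implementation:

```python
import numpy as np

def lambda_tensor(n):
    """Fourth-order orientation tensor for a transversely isotropic fluid
    with unit director n (Muhlhaus-type form)."""
    d = np.eye(3)
    L = 0.5 * (np.einsum('i,k,jl->ijkl', n, n, d)
             + np.einsum('j,k,il->ijkl', n, n, d)
             + np.einsum('i,l,jk->ijkl', n, n, d)
             + np.einsum('j,l,ik->ijkl', n, n, d))
    return L - 2.0 * np.einsum('i,j,k,l->ijkl', n, n, n, n)

def deviatoric_stress(D, n, eta, eta_s):
    """tau = 2*eta*D - 2*(eta - eta_s) * Lambda : D
    (weak shear viscosity eta_s on planes normal to the director)."""
    corr = np.einsum('ijkl,kl->ij', lambda_tensor(n), D)
    return 2.0 * eta * D - 2.0 * (eta - eta_s) * corr

# Simple shear across horizontal layering (director vertical):
d = 1.0
D = np.zeros((3, 3)); D[0, 1] = D[1, 0] = d
tau = deviatoric_stress(D, np.array([0.0, 1.0, 0.0]), eta=1.0, eta_s=0.1)
print(tau[0, 1])   # shear stress governed by the weak viscosity: 2*eta_s*d
```

For shear parallel to the layering the effective viscosity collapses to the weak value eta_s, which is the mechanism that lets convective flow align the fabric and produce the emergent anisotropy described above.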


We illustrate the flow behaviour of fluids with isotropic and anisotropic microstructure (internal length, layering with bending stiffness) by means of numerical simulations of silo discharge and flow alignment in simple shear. The Cosserat theory is used to provide an internal length in the constitutive model through bending stiffness to describe isotropic microstructure and this theory is coupled to a director theory to add specific orientation of grains to describe anisotropic microstructure. The numerical solution is based on an implicit form of the Material Point Method developed by Moresi et al. [1].


Estimating energy requirements is necessary in clinical practice when indirect calorimetry is impractical. This paper systematically reviews current methods for estimating energy requirements. Conclusions include: there is discrepancy between the characteristics of populations upon which predictive equations are based and current populations; tools are not well understood, and patient care can be compromised by inappropriate application of the tools. Data comparing tools and methods are presented and issues for practitioners are discussed. (C) 2003 International Life Sciences Institute.
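For illustration, one of the older predictive equations in this literature is the Harris-Benedict equation. The sketch below uses the commonly quoted coefficients for the original (1919) form; the subject is hypothetical, and the population-mismatch caveat raised in the review applies to exactly this kind of equation:

```python
def harris_benedict_bmr(sex, weight_kg, height_cm, age_yr):
    """Basal metabolic rate (kcal/day) from the original Harris-Benedict
    equations, coefficients as commonly quoted."""
    if sex == "male":
        return 66.47 + 13.75 * weight_kg + 5.003 * height_cm - 6.755 * age_yr
    return 655.1 + 9.563 * weight_kg + 1.850 * height_cm - 4.676 * age_yr

# Hypothetical 40-year-old, 70 kg, 175 cm male:
bmr = harris_benedict_bmr("male", weight_kg=70, height_cm=175, age_yr=40)
print(round(bmr))   # 1634 kcal/day; multiply by an activity/stress factor in practice
```

The ease of such a calculation is part of the problem the review identifies: the arithmetic is trivial, so the equation is easily applied to patients quite unlike the population it was derived from.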


Taking functional programming to its extremities in search of simplicity still requires integration with other development methods (e.g. formal methods). Induction is the key to deriving and verifying functional programs, but can be simplified by packaging proofs with functions, particularly folds, on data structures. Totally Functional Programming (TFP) avoids the complexities of interpretation by directly representing data structures as platonic combinators - the functions characteristic of the data. The link between the two simplifications is that platonic combinators are a kind of partially-applied fold, which means that platonic combinators inherit fold-theoretic properties, with some apparent simplifications due to the platonic combinator representation. However, although experience within functional programming suggests that TFP is widely applicable, significant work remains before TFP as such could be widely adopted.
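The idea that a data structure can be represented directly by its own fold - the "platonic combinator" - can be made concrete. A sketch in Python (rather than a typed functional language, so the fold-as-data idea is visible without type machinery):

```python
# A list is represented directly by its right-fold: the "platonic combinator"
# for [1, 2, 3] is the function that, given a combiner and a seed, folds them
# over the elements. There is no separate data structure to interpret.
nil = lambda c, n: n
cons = lambda h, t: (lambda c, n: c(h, t(c, n)))

xs = cons(1, cons(2, cons(3, nil)))   # the list [1, 2, 3] as a function

total = xs(lambda h, acc: h + acc, 0)               # sum via the fold
doubled = xs(lambda h, acc: cons(2 * h, acc), nil)  # map, also via the fold
as_python = doubled(lambda h, acc: [h] + acc, [])   # convert back for display
print(total, as_python)   # 6 [2, 4, 6]
```

Every list operation here is the list applied to arguments - a partially-applied fold, as the abstract puts it - so fold-theoretic laws (e.g. fold fusion) apply to the representation for free.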


Objective: The Assessing Cost-Effectiveness - Mental Health (ACE-MH) study aims to assess, from a health sector perspective, whether there are options for change that could improve the effectiveness and efficiency of Australia's current mental health services by directing available resources toward 'best practice' cost-effective services. Method: The use of standardized evaluation methods addresses the reservations expressed by many economists about the simplistic use of league tables based on economic studies confounded by differences in methods, context and setting. The cost-effectiveness ratio for each intervention is calculated using economic and epidemiological data, including systematic reviews and randomised controlled trials for efficacy, the Australian Surveys of Mental Health and Wellbeing for current practice, and a combination of trials and longitudinal studies for adherence. The cost-effectiveness ratios are presented as cost (A$) per disability-adjusted life year (DALY) saved, with a 95% uncertainty interval based on Monte Carlo simulation modelling. An assessment of interventions against 'second filter' criteria ('equity', 'strength of evidence', 'feasibility' and 'acceptability to stakeholders') allows broader concepts of 'benefit' to be taken into account, as well as factors that might influence policy judgements in addition to cost-effectiveness ratios. Conclusions: The main limitation of the study is the translation of the effect size from trials into a change in the DALY disability weight, which required the use of newly developed methods. While comparisons within disorders are valid, comparisons across disorders should be made with caution. A series of articles is planned to present the results.
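The headline output - a cost per DALY with a 95% uncertainty interval from Monte Carlo simulation - can be sketched as follows. The distributions, means, and spreads are invented for illustration and bear no relation to the study's estimates:

```python
import random

random.seed(0)

def cost_per_daly_samples(n=10000):
    """Propagate uncertainty in total cost and DALYs averted through the
    ratio. Parameters are illustrative placeholders."""
    samples = []
    for _ in range(n):
        cost = random.gauss(mu=5_000_000, sigma=500_000)   # A$ total cost
        dalys = random.gauss(mu=400, sigma=60)             # DALYs averted
        samples.append(cost / dalys)
    return sorted(samples)

s = cost_per_daly_samples()
median = s[len(s) // 2]
lo, hi = s[int(0.025 * len(s))], s[int(0.975 * len(s))]
print(f"A${median:,.0f} per DALY (95% UI A${lo:,.0f}-A${hi:,.0f})")
```

Reporting the interval rather than a point ratio is what lets league-table comparisons acknowledge that the underlying efficacy and cost inputs are themselves uncertain.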


Computational models complement laboratory experimentation for efficient identification of MHC-binding peptides and T-cell epitopes. Methods for prediction of MHC-binding peptides include binding motifs, quantitative matrices, artificial neural networks, hidden Markov models, and molecular modelling. Models derived by these methods have been successfully used for prediction of T-cell epitopes in cancer, autoimmunity, infectious disease, and allergy. For maximum benefit, the use of computer models must be treated as experiments analogous to standard laboratory procedures and performed according to strict standards. This requires careful selection of data for model building, and adequate testing and validation. A range of web-based databases and MHC-binding prediction programs are available. Although some available prediction programs for particular MHC alleles have reasonable accuracy, there is no guarantee that all models produce good quality predictions. In this article, we present and discuss a framework for modelling, testing, and applications of computational methods used in predictions of T-cell epitopes. (C) 2004 Elsevier Inc. All rights reserved.
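The "adequate testing and validation" the article calls for usually means checking discrimination on held-out peptides. A minimal sketch of a rank-sum AUC computed on hypothetical prediction scores (the scores are invented for illustration):

```python
def roc_auc(scores_pos, scores_neg):
    """Probability that a known binder outscores a non-binder
    (rank-sum / Mann-Whitney formulation of the ROC AUC)."""
    wins = ties = 0
    for p in scores_pos:
        for n in scores_neg:
            if p > n:
                wins += 1
            elif p == n:
                ties += 1
    return (wins + 0.5 * ties) / (len(scores_pos) * len(scores_neg))

# Hypothetical model scores for held-out binders vs non-binders:
auc = roc_auc([2.1, 1.7, 0.9, 1.4], [0.3, 1.1, -0.2, 0.5])
print(auc)   # 0.9375: the model ranks binders above non-binders 15/16 of the time
```

An AUC of 0.5 is random guessing; values near 1.0 indicate the kind of reliable discrimination required before a model's epitope predictions should guide laboratory work.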


Carbon monoxide, the chief killer in fires, and other species are modelled for a series of enclosure fires. The conditions emulate building fires where CO is formed in the rich, turbulent, nonpremixed flame and is transported frozen to lean mixtures by the ceiling jet which is cooled by radiation and dilution. Conditional moment closure modelling is used and computational domain minimisation criteria are developed which reduce the computational cost of this method. The predictions give good agreement for CO and other species in the lean, quenched-gas stream, holding promise that this method may provide a practical means of modelling real, three-dimensional fire situations. (c) 2005 The Combustion Institute. Published by Elsevier Inc. All rights reserved.