135 results for soft computing methods
Abstract:
A number of techniques have been developed to study the disposition of drugs in the head and, in particular, the role of the blood-brain barrier (BBB) in drug uptake. The techniques can be divided into three groups: in-vitro, in-vivo and in-situ. The most suitable method depends on the purpose(s) and requirements of the particular study being conducted. In-vitro techniques involve the isolation of cerebral endothelial cells so that direct investigations of these cells can be carried out. The most recent preparations are able to maintain structural and functional characteristics of the BBB by simultaneously culturing endothelial cells with astrocytic cells. The main advantages of the in-vitro methods are the elimination of anaesthetics and surgery. In-vivo methods consist of a diverse range of techniques and include the traditional Brain Uptake Index and indicator diffusion methods, as well as microdialysis and positron emission tomography. In-vivo methods maintain the cells and vasculature of an organ in their normal physiological states and anatomical position within the animal. However, the shortcomings include renal and hepatic elimination of solutes as well as the inability to control blood flow. In-situ techniques, including the perfused head, are more technically demanding. However, these models have the ability to vary the composition and flow rate of the artificial perfusate. This review is intended as a guide for selecting the most appropriate method for studying drug uptake in the brain.
Abstract:
Two factors generally reported to influence bone density are body composition and muscle strength. However, it is unclear if these relationships are consistent across race and sex, especially in older persons. If differences do exist by race and/or sex, then strategies to maintain bone mass or minimize bone loss in older adults may need to be modified accordingly. Therefore, we examined the independent effects of bone mineral-free lean mass (LM), fat mass (FM), and muscle strength on regional and whole body bone mineral density (BMD) in a cohort of 2619 well-functioning older adults participating in the Health, Aging, and Body Composition (Health ABC) Study with complete measures. Participants included 738 white women, 599 black women, 827 white men, and 455 black men aged 70-79 years. BMD (g/cm²) of the femoral neck, whole body, upper and lower limb, and whole body and upper limb bone mineral-free LM and FM were assessed by dual-energy X-ray absorptiometry (DXA). Handgrip strength and knee extensor torque were determined by dynamometry. In analyses stratified by race and sex and adjusted for a number of confounders, LM was a significant (p < 0.001) determinant of BMD, except in white women for the lower limb and whole body. In women, FM also was an independent contributor to BMD at the femoral neck, and both FM and muscle strength contributed to limb BMD. The following were the respective beta-weights (regression coefficients for standardized data, Std beta) and percent difference in BMD per unit (7.5 kg) of LM: femoral neck, 0.202-0.386 and 4.7-6.9%; lower limb, 0.209-0.357 and 2.9-3.5%; whole body, 0.239-0.484 and 3.0-4.7%; and upper limb (unit = 0.5 kg), 0.231-0.407 and 3.1-3.4%. Adjusting for bone size (bone mineral apparent density [BMAD]) or body size (BMD/height) diminished the importance of LM, and the contributory effect of FM became more pronounced. These results indicate that LM and FM were associated with bone mineral depending on the bone site and bone index used. Where differences did occur, they were primarily by sex, not race. To preserve BMD, maintaining or increasing LM in the elderly would appear to be an appropriate strategy, regardless of race or sex.
Abstract:
This paper presents a means of structuring specifications in real-time Object-Z: an integration of Object-Z with the timed refinement calculus. Incremental modification of classes using inheritance and composition of classes to form multi-component systems are examined. Two approaches to the latter are considered: using Object-Z's notion of object instantiation and introducing a parallel composition operator similar to those found in process algebras. The parallel composition operator approach is both more concise and allows more general modelling of concurrency. Its incorporation into the existing semantics of real-time Object-Z is presented.
Abstract:
Background: A variety of methods for prediction of peptide binding to major histocompatibility complex (MHC) have been proposed. These methods are based on binding motifs, binding matrices, hidden Markov models (HMM), or artificial neural networks (ANN). There has been little prior work on the comparative analysis of these methods. Materials and Methods: We performed a comparison of the performance of six methods applied to the prediction of two human MHC class I molecules, including binding matrices and motifs, ANNs, and HMMs. Results: The selection of the optimal prediction method depends on the amount of available data (the number of peptides of known binding affinity to the MHC molecule of interest), the biases in the data set and the intended purpose of the prediction (screening of a single protein versus mass screening). When little or no peptide data are available, binding motifs are the most useful alternative to random guessing or use of a complete overlapping set of peptides for selection of candidate binders. As the number of known peptide binders increases, binding matrices and HMM become more useful predictors. ANN and HMM are the predictive methods of choice for MHC alleles with more than 100 known binding peptides. Conclusion: The ability of bioinformatic methods to reliably predict MHC binding peptides, and thereby potential T-cell epitopes, has major implications for clinical immunology, particularly in the area of vaccine design.
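As an illustration of the simplest of the compared approaches, the sketch below (Python, with purely hypothetical matrix values that are not taken from the study) scores a 9-mer peptide against a position-specific binding matrix; the ANN and HMM methods replace this additive scoring with learned models.

# Minimal sketch of binding-matrix scoring; all scores are illustrative, not from the study.
def matrix_score(peptide, matrix):
    """Sum per-position scores for each residue; a higher total suggests stronger MHC binding."""
    return sum(matrix[pos].get(aa, -1.0) for pos, aa in enumerate(peptide))

# Toy matrix for 9-mers: matrix[position][amino_acid] -> score.
toy_matrix = [{aa: 0.0 for aa in "ACDEFGHIKLMNPQRSTVWY"} for _ in range(9)]
toy_matrix[1]["L"] = 2.0   # hypothetical anchor preference at peptide position 2
toy_matrix[8]["V"] = 1.5   # hypothetical anchor preference at the C-terminus

print(matrix_score("ALDKFGHIV", toy_matrix))  # 3.5 with these toy values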
Abstract:
Estimating energy requirements is necessary in clinical practice when indirect calorimetry is impractical. This paper systematically reviews current methods for estimating energy requirements. Conclusions include: there is a discrepancy between the characteristics of the populations upon which predictive equations are based and those of current populations; the tools are not well understood; and patient care can be compromised by inappropriate application of the tools. Data comparing tools and methods are presented and issues for practitioners are discussed. (C) 2003 International Life Sciences Institute.
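For concreteness, one widely quoted predictive equation of the kind such reviews compare against indirect calorimetry is the Harris-Benedict equation for resting energy expenditure; the abstract does not name it, so the sketch below (Python) should be read as an illustrative example using the commonly cited rounded coefficients.

def harris_benedict_ree(weight_kg, height_cm, age_yr, sex):
    # Resting energy expenditure (kcal/day); coefficients are the commonly quoted
    # rounded values and are given here for illustration only.
    if sex == "male":
        return 66.5 + 13.75 * weight_kg + 5.003 * height_cm - 6.755 * age_yr
    return 655.1 + 9.563 * weight_kg + 1.850 * height_cm - 4.676 * age_yr

print(round(harris_benedict_ree(70, 175, 40, "male")))  # about 1634 kcal/day with these inputs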
Abstract:
Taking functional programming to its extremities in search of simplicity still requires integration with other development (e.g. formal) methods. Induction is the key to deriving and verifying functional programs, but can be simplified through packaging proofs with functions, particularly folds, on data (structures). Totally Functional Programming (TFP) avoids the complexities of interpretation by directly representing data (structures) as platonic combinators - the functions characteristic to the data. The link between the two simplifications is that platonic combinators are a kind of partially-applied fold, which means that platonic combinators inherit fold-theoretic properties, but with some apparent simplifications due to the platonic combinator representation. However, although observable behaviour within functional programming suggests that TFP is widely applicable, significant work remains before TFP as such could be widely adopted.
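The central idea, representing a data structure directly by its characteristic fold, can be sketched in a few lines of Python (the names nil and cons below are illustrative and not taken from the paper):

# Sketch: a list represented "platonically" as a partially-applied fold rather than as data.
def nil(step, seed):
    # The empty list is its own fold: ignore the step function and return the seed.
    return seed

def cons(head, tail):
    # Prepending an element yields a new fold-like function over the structure.
    def folded(step, seed):
        return step(head, tail(step, seed))
    return folded

xs = cons(1, cons(2, cons(3, nil)))      # the "list" [1, 2, 3] as a combinator
print(xs(lambda h, acc: h + acc, 0))     # 6: summing is just applying the fold
print(xs(lambda h, acc: [h] + acc, []))  # [1, 2, 3]: recovering an ordinary list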
Abstract:
Objective: The Assessing Cost-Effectiveness - Mental Health (ACE-MH) study aims to assess, from a health sector perspective, whether there are options for change that could improve the effectiveness and efficiency of Australia's current mental health services by directing available resources toward 'best practice' cost-effective services. Method: The use of standardized evaluation methods addresses the reservations expressed by many economists about the simplistic use of League Tables based on economic studies confounded by differences in methods, context and setting. The cost-effectiveness ratio for each intervention is calculated using economic and epidemiological data. This includes systematic reviews and randomised controlled trials for efficacy, the Australian Surveys of Mental Health and Wellbeing for current practice, and a combination of trials and longitudinal studies for adherence. The cost-effectiveness ratios are presented as cost (A$) per disability-adjusted life year (DALY) saved, with a 95% uncertainty interval based on Monte Carlo simulation modelling. An assessment of interventions on 'second filter' criteria ('equity', 'strength of evidence', 'feasibility' and 'acceptability to stakeholders') allows broader concepts of 'benefit' to be taken into account, as well as factors that might influence policy judgements in addition to cost-effectiveness ratios. Conclusions: The main limitation of the study is in the translation of the effect size from trials into a change in the DALY disability weight, which required the use of newly developed methods. While comparisons within disorders are valid, comparisons across disorders should be made with caution. A series of articles is planned to present the results.
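The uncertainty-interval calculation described above can be sketched as follows (Python/NumPy, with invented input distributions used purely for illustration, not study data):

import numpy as np

rng = np.random.default_rng(0)
n = 10_000  # number of Monte Carlo draws

# Hypothetical distributions for one intervention; the values are illustrative only.
cost_aud = rng.normal(loc=2_000_000, scale=250_000, size=n)       # total cost in A$
dalys_saved = rng.lognormal(mean=np.log(400), sigma=0.3, size=n)  # DALYs saved

ratio = cost_aud / dalys_saved              # cost (A$) per DALY saved, one value per draw
median = np.median(ratio)
lo, hi = np.percentile(ratio, [2.5, 97.5])  # 95% uncertainty interval
print(f"A${median:,.0f}/DALY (95% UI A${lo:,.0f} to A${hi:,.0f})")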
Abstract:
Computational models complement laboratory experimentation for efficient identification of MHC-binding peptides and T-cell epitopes. Methods for prediction of MHC-binding peptides include binding motifs, quantitative matrices, artificial neural networks, hidden Markov models, and molecular modelling. Models derived by these methods have been successfully used for prediction of T-cell epitopes in cancer, autoimmunity, infectious disease, and allergy. For maximum benefit, the use of computer models must be treated as experiments analogous to standard laboratory procedures and performed according to strict standards. This requires careful selection of data for model building, and adequate testing and validation. A range of web-based databases and MHC-binding prediction programs are available. Although some available prediction programs for particular MHC alleles have reasonable accuracy, there is no guarantee that all models produce good quality predictions. In this article, we present and discuss a framework for modelling, testing, and applications of computational methods used in predictions of T-cell epitopes. (C) 2004 Elsevier Inc. All rights reserved.
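The testing and validation the article calls for amounts to evaluating predictions on peptides held out of model building, for example with the area under the ROC curve; the sketch below (Python, scikit-learn) uses made-up labels and scores purely to show the calculation.

import numpy as np
from sklearn.metrics import roc_auc_score

# Hypothetical held-out peptides: experimentally determined binder labels (1 = binder)
# and the model's predicted binding scores; the values are illustrative only.
held_out_labels = np.array([1, 0, 1, 1, 0, 0, 1, 0])
predicted_scores = np.array([0.91, 0.30, 0.76, 0.55, 0.42, 0.12, 0.88, 0.60])

print("held-out AUC:", roc_auc_score(held_out_labels, predicted_scores))  # ~0.94 with these values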
Abstract:
We present a scheme which offers a significant reduction in the resources required to implement linear optics quantum computing. The scheme is a variation of the proposal of Knill, Laflamme and Milburn, and makes use of an incremental approach to the error encoding to boost the probability of success.