151 results for mixed verification methods


Relevance: 20.00%

Abstract:

I noted with interest the article by Drs Perrin and Guex, entitled "Edema and leg volume: Methods of assessment," published in Angiology 51:9-12, 2000. This was a timely and comprehensive review of the various methods in clinical use for the assessment of peripheral edema, notably in the leg. I would like to take this opportunity to alert readers to a further technique useful for this purpose, namely bioelectrical impedance analysis. An early report [1] described its use for the measurement of edema in the leg but, other than its successful use for the assessment of edema in the arm following mastectomy [2,3], the potential of the method remains to be fully realized. This is unfortunate, since the method directly and quantifiably measures edema.
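
The letter's point that the method "directly and quantifiably measures edema" rests on a simple principle: at low frequencies the current is largely confined to extracellular fluid, so the impedance index L²/R of a limb segment tracks segmental fluid volume. A minimal sketch, with all numbers invented for illustration:

```python
# Minimal sketch of the bioimpedance principle (illustrative values only):
# at low frequency, current is largely confined to extracellular fluid, so the
# resistance R of a limb segment of length L falls as edema fluid accumulates.
# The impedance index L**2 / R is proportional to segmental fluid volume.

def impedance_index(length_cm: float, resistance_ohm: float) -> float:
    """Impedance index (cm^2/ohm), proportional to segmental fluid volume."""
    return length_cm ** 2 / resistance_ohm

# Compare the affected leg against the contralateral (control) leg.
affected = impedance_index(length_cm=38.0, resistance_ohm=210.0)
control = impedance_index(length_cm=38.0, resistance_ohm=255.0)
print(f"affected/control fluid ratio: {affected / control:.2f}")  # >1 suggests edema
```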

Relevance: 20.00%

Abstract:

Depending on the size and shape of the materials, the methods employed to achieve effective fluidization during fluid bed drying vary from simple hole-type distributors for small, lightweight materials to special techniques for larger and/or moist materials. This paper reviews common air distributors used in fluidized bed drying of food particulates, as well as special methods for fluidizing larger, irregular food particulates.
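
Although the review is descriptive, a first design check for any such distributor is the minimum fluidization velocity of the particles. A minimal sketch using the Wen and Yu (1966) correlation; the property values below are illustrative and not taken from the paper:

```python
import math

def u_mf_wen_yu(d_p, rho_p, rho_g, mu, g=9.81):
    """Minimum fluidization velocity (m/s) via the Wen & Yu correlation."""
    ar = d_p**3 * rho_g * (rho_p - rho_g) * g / mu**2   # Archimedes number
    re_mf = math.sqrt(33.7**2 + 0.0408 * ar) - 33.7     # particle Reynolds number
    return re_mf * mu / (rho_g * d_p)

# Example: 2 mm food particles (density 1200 kg/m^3) in hot drying air.
print(f"u_mf = {u_mf_wen_yu(d_p=2e-3, rho_p=1200.0, rho_g=1.06, mu=2.0e-5):.2f} m/s")
```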

Relevance: 20.00%

Abstract:

A number of techniques have been developed to study the disposition of drugs in the head and, in particular, the role of the blood-brain barrier (BBB) in drug uptake. The techniques can be divided into three groups: in-vitro, in-vivo and in-situ. The most suitable method depends on the purpose(s) and requirements of the particular study being conducted. In-vitro techniques involve the isolation of cerebral endothelial cells so that direct investigations of these cells can be carried out. The most recent preparations are able to maintain the structural and functional characteristics of the BBB by simultaneously culturing endothelial cells with astrocytic cells. The main advantages of the in-vitro methods are the elimination of anaesthetics and surgery. In-vivo methods consist of a diverse range of techniques, including the traditional Brain Uptake Index and indicator diffusion methods as well as microdialysis and positron emission tomography. In-vivo methods maintain the cells and vasculature of an organ in their normal physiological state and anatomical position within the animal. However, their shortcomings include renal and hepatic elimination of solutes, as well as the inability to control blood flow. In-situ techniques, including the perfused head, are more technically demanding. However, these models offer the ability to vary the composition and flow rate of the artificial perfusate. This review is intended as a guide for selecting the most appropriate method for studying drug uptake in the brain.
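
For illustration, the traditional Brain Uptake Index (BUI) mentioned above reduces to a ratio of ratios: uptake of the test compound relative to a co-injected, freely diffusible reference, normalized by their ratio in the injectate. A minimal sketch with invented counts:

```python
# Sketch of the Brain Uptake Index calculation. A test compound and a freely
# diffusible reference (e.g. tritiated water) are co-injected into the carotid
# artery and brain radioactivity is counted seconds later. Values invented.

def brain_uptake_index(test_brain, ref_brain, test_inject, ref_inject):
    """BUI (%) = 100 * (test/reference in brain) / (test/reference in injectate)."""
    return 100.0 * (test_brain / ref_brain) / (test_inject / ref_inject)

# A compound recovering 60% of the reference's relative uptake:
print(brain_uptake_index(test_brain=300.0, ref_brain=1000.0,
                         test_inject=500.0, ref_inject=1000.0))  # -> 60.0
```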

Relevance: 20.00%

Abstract:

The reactions of mercury(II) with the mixed-donor encapsulating ligands 3,6,16-trithia-6,11,19-triazabicyclo[6.6.6]icosane (AMN(3)S(3)sar) and 1-amino-8-methyl-6,19-dithia-3,10,13,16-tetraazabicyclo[6.6.6]icosane (AMN(4)S(2)sar) have been studied. NMR ligand-ligand competition experiments with the ligands 1,4,8,11-tetraazacyclotetradecane ([14]aneN(4)), 1-thia-4,7,10-triazacyclododecane ([12]aneN(3)S) and ethylenediaminetetraacetic acid (EDTA) with AMN(3)S(3)sar and Hg(II) indicated that [14]aneN(4) would be an appropriate competing ligand for the determination of the Hg(II) stability constant. Calculations indicated that the ratio of concentrations of AMN(3)S(3)sar, [14]aneN(4) and Hg(II) required for the determination of the stability constant ranged from 1:1:1 to 1:5:1. Refinement of the titration curves yielded log(10)K[Hg(AMN(3)S(3)sar)](2+) = 17.7. A similar competition titration resulted in the determination of the stability constant for the AMN(4)S(2)sar system as log(10)K[Hg(AMN(4)S(2)sar)](2+) = 19.5. The observed binding constants for the mixed N/S-donor systems and for the hexaaza analogues sar (3,6,10,13,16,19-hexaazabicyclo[6.6.6]icosane) and diamsar (1,8-diamino-3,6,10,13,16,19-hexaazabicyclo[6.6.6]icosane; log(10)K[Hg(diamsar)](2+) = 26.4, log(10)K[Hg(sar)](2+) = 28.1) differ by approximately ten orders of magnitude. The difference is ascribed not to a cryptate effect but to a mismatch between the Hg-N and Hg-S bond lengths in the N/S systems.
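
The scale of the comparison can be checked directly from the reported constants: in a ligand-ligand competition, the equilibrium ratio is set by the difference of the log K values. A small numeric check using only the values quoted above:

```python
# For the competition L1 + HgL2 <=> HgL1 + L2, the equilibrium ratio
# [HgL1][L2] / ([HgL2][L1]) equals K1/K2 = 10**(logK1 - logK2).
# The log K values below are those reported in the abstract.

log_k = {"AMN(3)S(3)sar": 17.7, "AMN(4)S(2)sar": 19.5,
         "diamsar": 26.4, "sar": 28.1}

for ns in ("AMN(3)S(3)sar", "AMN(4)S(2)sar"):
    for n6 in ("diamsar", "sar"):
        delta = log_k[n6] - log_k[ns]
        print(f"Hg(II) favours {n6} over {ns} by a factor of 10^{delta:.1f}")
```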

Relevance: 20.00%

Abstract:

Background: A variety of methods for prediction of peptide binding to the major histocompatibility complex (MHC) have been proposed. These methods are based on binding motifs, binding matrices, hidden Markov models (HMMs), or artificial neural networks (ANNs). There has been little prior work on the comparative analysis of these methods. Materials and Methods: We compared the performance of six methods, including binding matrices and motifs, ANNs, and HMMs, applied to the prediction of peptide binding to two human MHC class I molecules. Results: The selection of the optimal prediction method depends on the amount of available data (the number of peptides of known binding affinity to the MHC molecule of interest), the biases in the data set, and the intended purpose of the prediction (screening of a single protein versus mass screening). When little or no peptide data are available, binding motifs are the most useful alternative to random guessing or to the use of a complete overlapping set of peptides for selection of candidate binders. As the number of known peptide binders increases, binding matrices and HMMs become more useful predictors. ANNs and HMMs are the predictive methods of choice for MHC alleles with more than 100 known binding peptides. Conclusion: The ability of bioinformatic methods to reliably predict MHC-binding peptides, and thereby potential T-cell epitopes, has major implications for clinical immunology, particularly in the area of vaccine design.
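
As a concrete illustration of the simplest of the compared methods, a binding matrix assigns each peptide position an independent per-residue score and sums them over the peptide. A minimal sketch; the matrix values, peptide, and decision threshold below are invented, not taken from the study:

```python
import random

AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"
random.seed(0)

# A toy position-specific scoring matrix for 9-mer peptides (invented values).
matrix = [{aa: random.gauss(0.0, 1.0) for aa in AMINO_ACIDS} for _ in range(9)]

def score_peptide(peptide: str) -> float:
    """Sum of per-position log-odds scores for a 9-mer peptide."""
    assert len(peptide) == 9
    return sum(matrix[i][aa] for i, aa in enumerate(peptide))

peptide = "SIINFEKLV"
print(f"{peptide}: score = {score_peptide(peptide):.2f}")
# Peptides scoring above a calibrated threshold would be predicted binders.
```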

Relevance: 20.00%

Abstract:

A series of alpha-sialon (alpha') compositions containing mixed stabilising cations was prepared by introducing additional CaO into a basic Sm alpha-sialon composition. The thermal stability of these Sm-Ca-containing alpha-sialon phases was investigated using XRD, SEM and EDXS techniques. It was found that the addition of calcium into the Sm alpha-sialon system greatly improved the stability of the alpha-sialon phases. Calcium was found to be incorporated into the alpha-sialon structure, coexisting with the samarium, and partitioning of the calcium and samarium was observed between the alpha' phase and the grain boundary phases. This indicates a technique that may be used to improve the thermal stability of the alpha' phase while maintaining good refractory phases at the sialon grain boundaries. (C) 2003 Elsevier Science B.V. All rights reserved.

Relevance: 20.00%

Abstract:

Shear deformation of fault gouge or other particulate materials often results in observed strain localization, or more precisely, the localization of measured deformation gradients. In conventional elastic materials strain localization cannot take place, so this phenomenon is usually attributed to special types of non-elastic constitutive behaviour. For particulate materials, however, the Cosserat continuum, which accounts for microrotations independent of displacements, is a more appropriate model. In an elastic Cosserat continuum, localization in displacement gradients is possible for some combinations of the generalized Cosserat elastic moduli. The same combinations of parameters also correspond to considerable dispersion in shear wave propagation, which can be used for independent experimental verification of the proposed mechanism of apparent strain localization in fault gouge.

Relevance: 20.00%

Abstract:

Estimating energy requirements is necessary in clinical practice when indirect calorimetry is impractical. This paper systematically reviews current methods for estimating energy requirements. The conclusions include the following: there is a discrepancy between the characteristics of the populations on which predictive equations are based and those of current patient populations; the tools are not well understood; and patient care can be compromised by inappropriate application of the tools. Data comparing the tools and methods are presented, and issues for practitioners are discussed. (C) 2003 International Life Sciences Institute.
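
For illustration, one classic predictive equation of the kind this review covers is the Harris-Benedict equation (1919). A minimal sketch; the review does not single out any one equation, and, as it notes, the 1919 reference population differs from current patient populations:

```python
# Original Harris-Benedict equations: basal energy expenditure (kcal/day)
# from weight (kg), height (cm), age (years), and sex.

def harris_benedict(weight_kg: float, height_cm: float, age_y: float,
                    sex: str) -> float:
    """Basal energy expenditure (kcal/day), original Harris-Benedict equation."""
    if sex == "male":
        return 66.47 + 13.75 * weight_kg + 5.003 * height_cm - 6.755 * age_y
    return 655.1 + 9.563 * weight_kg + 1.850 * height_cm - 4.676 * age_y

# Example: 70 kg, 175 cm, 40-year-old man -> roughly 1630 kcal/day.
print(f"{harris_benedict(70.0, 175.0, 40.0, 'male'):.0f} kcal/day")
```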

Relevance: 20.00%

Abstract:

Taking functional programming to its extreme in search of simplicity still requires integration with other development methods (e.g. formal methods). Induction is the key to deriving and verifying functional programs, but it can be simplified by packaging proofs with functions, particularly folds, on data structures. Totally Functional Programming (TFP) avoids the complexities of interpretation by directly representing data structures as platonic combinators - the functions characteristic to the data. The link between the two simplifications is that platonic combinators are a kind of partially-applied fold, which means that they inherit fold-theoretic properties, but with some apparent simplifications due to the platonic combinator representation. However, although observable behaviour within functional programming suggests that TFP is widely applicable, significant work remains before TFP as such could be widely adopted.
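
The claim that platonic combinators are a kind of partially-applied fold can be made concrete: a list can be represented not as a cell structure but as the fold over its own contents. A minimal sketch, written in Python for consistency with the other examples here; the TFP work itself targets typed functional languages:

```python
# A "platonic" list is a function that, given a combining operation and a
# seed, folds itself; i.e. the list is its own partially-applied fold.

def nil(cons_op, nil_val):
    return nil_val

def cons(head, tail):
    return lambda c, n: c(head, tail(c, n))

xs = cons(1, cons(2, cons(3, nil)))

# Operations are just applications of the structure to fold arguments:
total = xs(lambda h, acc: h + acc, 0)      # sum     -> 6
as_py = xs(lambda h, acc: [h] + acc, [])   # decode  -> [1, 2, 3]
print(total, as_py)
```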

Relevance: 20.00%

Abstract:

Langmuir and Langmuir-Blodgett (LB) films of a cationic amphiphilic porphyrin mixed with the n-alkanes octadecane and hexatriacontane were prepared and characterized in order to examine the influence of the alkanes on film structure and stability. While the structure present in these films was controlled primarily by the porphyrin, the addition of the alkanes resulted in significant changes to both the phase behavior of the Langmuir films and the molecular arrangement of the LB films. These changes, as well as the observed chain-length effects, are explained in terms of the intermolecular interactions present in the films.

Relevance: 20.00%

Abstract:

Objective: The Assessing Cost-Effectiveness - Mental Health (ACE-MH) study aims to assess, from a health sector perspective, whether there are options for change that could improve the effectiveness and efficiency of Australia's current mental health services by directing available resources toward 'best practice' cost-effective services. Method: The use of standardized evaluation methods addresses the reservations expressed by many economists about the simplistic use of league tables based on economic studies confounded by differences in methods, context and setting. The cost-effectiveness ratio for each intervention is calculated using economic and epidemiological data, including systematic reviews and randomised controlled trials for efficacy, the Australian Surveys of Mental Health and Wellbeing for current practice, and a combination of trials and longitudinal studies for adherence. The cost-effectiveness ratios are presented as cost (A$) per disability-adjusted life year (DALY) saved, with a 95% uncertainty interval based on Monte Carlo simulation modelling. An assessment of interventions against 'second filter' criteria ('equity', 'strength of evidence', 'feasibility' and 'acceptability to stakeholders') allows broader concepts of 'benefit' to be taken into account, as well as factors that might influence policy judgements in addition to cost-effectiveness ratios. Conclusions: The main limitation of the study is the translation of effect sizes from trials into changes in the DALY disability weight, which required the use of newly developed methods. While comparisons within disorders are valid, comparisons across disorders should be made with caution. A series of articles is planned to present the results.
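
A minimal sketch of how such an uncertainty interval can be produced by Monte Carlo simulation; the cost and DALY distributions below are invented for illustration and are not ACE-MH figures:

```python
import random
import statistics

random.seed(1)
N = 10_000

# Sample intervention cost (A$) and DALYs saved from assumed distributions,
# then summarize the cost-effectiveness ratio with a 95% uncertainty interval.
ratios = []
for _ in range(N):
    cost = random.gauss(2.0e6, 0.3e6)    # A$, hypothetical intervention cost
    dalys = random.gauss(250.0, 40.0)    # hypothetical DALYs saved
    if dalys > 0:
        ratios.append(cost / dalys)

ratios.sort()
median = statistics.median(ratios)
lo, hi = ratios[int(0.025 * len(ratios))], ratios[int(0.975 * len(ratios))]
print(f"A${median:,.0f} per DALY (95% UI A${lo:,.0f} - A${hi:,.0f})")
```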

Relevance: 20.00%

Abstract:

Computational models complement laboratory experimentation for efficient identification of MHC-binding peptides and T-cell epitopes. Methods for prediction of MHC-binding peptides include binding motifs, quantitative matrices, artificial neural networks, hidden Markov models, and molecular modelling. Models derived by these methods have been used successfully for prediction of T-cell epitopes in cancer, autoimmunity, infectious disease, and allergy. For maximum benefit, the use of computer models must be treated as an experiment analogous to standard laboratory procedures and performed according to strict standards. This requires careful selection of data for model building, and adequate testing and validation. A range of web-based databases and MHC-binding prediction programs are available. Although some available prediction programs for particular MHC alleles have reasonable accuracy, there is no guarantee that all models produce good-quality predictions. In this article, we present and discuss a framework for the modelling, testing, and application of computational methods used in the prediction of T-cell epitopes. (C) 2004 Elsevier Inc. All rights reserved.
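
A minimal sketch of the testing step argued for above: evaluating a predictor against peptides of known binding status and reporting sensitivity and specificity at a chosen score threshold. All scores and labels below are invented:

```python
def sensitivity_specificity(scores, labels, threshold):
    """labels: True for known binders. Returns (sensitivity, specificity)."""
    tp = sum(1 for s, y in zip(scores, labels) if y and s >= threshold)
    fn = sum(1 for s, y in zip(scores, labels) if y and s < threshold)
    tn = sum(1 for s, y in zip(scores, labels) if not y and s < threshold)
    fp = sum(1 for s, y in zip(scores, labels) if not y and s >= threshold)
    return tp / (tp + fn), tn / (tn + fp)

# Hypothetical predictor scores for eight test peptides of known status.
scores = [0.9, 0.8, 0.7, 0.4, 0.6, 0.2, 0.1, 0.3]
labels = [True, True, False, True, False, False, False, False]
sens, spec = sensitivity_specificity(scores, labels, threshold=0.5)
print(f"sensitivity = {sens:.2f}, specificity = {spec:.2f}")
```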