996 results for export operation methods
Abstract:
A number of techniques have been developed to study the disposition of drugs in the head and, in particular, the role of the blood-brain barrier (BBB) in drug uptake. The techniques can be divided into three groups: in-vitro, in-vivo and in-situ. The most suitable method depends on the purpose(s) and requirements of the particular study being conducted. In-vitro techniques involve the isolation of cerebral endothelial cells so that direct investigations of these cells can be carried out. The most recent preparations are able to maintain structural and functional characteristics of the BBB by simultaneously culturing endothelial cells with astrocytic cells. The main advantages of the in-vitro methods are the elimination of anaesthetics and surgery. In-vivo methods consist of a diverse range of techniques and include the traditional Brain Uptake Index and indicator diffusion methods, as well as microdialysis and positron emission tomography. In-vivo methods maintain the cells and vasculature of an organ in their normal physiological states and anatomical position within the animal. However, the shortcomings include renal and hepatic elimination of solutes as well as the inability to control blood flow. In-situ techniques, including the perfused head, are more technically demanding. However, these models have the ability to vary the composition and flow rate of the artificial perfusate. This review is intended as a guide for selecting the most appropriate method for studying drug uptake in the brain.
Abstract:
Background: A variety of methods for prediction of peptide binding to major histocompatibility complex (MHC) have been proposed. These methods are based on binding motifs, binding matrices, hidden Markov models (HMM), or artificial neural networks (ANN). There has been little prior work on the comparative analysis of these methods. Materials and Methods: We performed a comparison of the performance of six methods applied to the prediction of peptide binding to two human MHC class I molecules, including binding matrices and motifs, ANNs, and HMMs. Results: The selection of the optimal prediction method depends on the amount of available data (the number of peptides of known binding affinity to the MHC molecule of interest), the biases in the data set and the intended purpose of the prediction (screening of a single protein versus mass screening). When little or no peptide data are available, binding motifs are the most useful alternative to random guessing or use of a complete set of overlapping peptides for selection of candidate binders. As the number of known peptide binders increases, binding matrices and HMM become more useful predictors. ANN and HMM are the predictive methods of choice for MHC alleles with more than 100 known binding peptides. Conclusion: The ability of bioinformatic methods to reliably predict MHC binding peptides, and thereby potential T-cell epitopes, has major implications for clinical immunology, particularly in the area of vaccine design.
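The binding-matrix approach compared in the abstract can be sketched as a position-specific scoring matrix built from known binders. This is a minimal illustration, not the study's implementation: the training peptides and pseudocount are invented for demonstration.

```python
# Minimal sketch of a binding-matrix (position-specific scoring matrix)
# peptide predictor. Training peptides and parameters are illustrative.
import math

def build_matrix(binders, alphabet="ACDEFGHIKLMNPQRSTVWY", pseudocount=1.0):
    """Log-odds matrix estimated from a set of equal-length binding peptides."""
    length = len(binders[0])
    background = 1.0 / len(alphabet)  # uniform background frequency
    matrix = []
    for pos in range(length):
        counts = {aa: pseudocount for aa in alphabet}
        for pep in binders:
            counts[pep[pos]] += 1
        total = sum(counts.values())
        matrix.append({aa: math.log((counts[aa] / total) / background)
                       for aa in alphabet})
    return matrix

def score(matrix, peptide):
    """Sum of per-position log-odds scores; higher means more binder-like."""
    return sum(col[aa] for col, aa in zip(matrix, peptide))

# Toy training set of 9-mer binders (hypothetical examples).
binders = ["SLYNTVATL", "GLCTLVAML", "FLPSDFFPL"]
pssm = build_matrix(binders)
print(score(pssm, "SLYNTVATL") > score(pssm, "AAAAAAAAA"))  # True: binder scores higher
```

With more training peptides the counts dominate the pseudocount, which mirrors the abstract's point that matrices become more useful as the number of known binders grows.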
Abstract:
Objectives and Methods: Reoperations are an integral part of a cardiac surgeon's practice. We share our experience of 546 reoperations over the last 21 years to January 2000, with the focus directed towards the timing of reoperation, reducing the mortality and morbidity of reoperation and rereplacement aortic valve surgery, and understanding the important risk factors. In addition, the precise technical steps that facilitate careful, successful explantation of various devices (allograft, stented and stentless xenografts, and mechanical valves) are detailed. Results: Optimal planned reoperation before deterioration to New York Heart Association Class III/IV levels and before unfavorable cardiac and general comorbid system failure occurs has produced low mortality and morbidity as compared with first operation results. However, unfavorable delays and late rereferral result in mortality rates of up to 22% for emergency redo AVR for degenerated bioprostheses. Conclusion: Cardiac surgical units have the opportunity to establish a closer patient-surgeon relationship, which favors, when necessary, the optimal timing of reoperation. Knowledge of the more important risk factors and adherence to specific technical steps at explantation of various devices enhances satisfactory reoperation outcomes.
Abstract:
Myb-binding protein 1a (Mybbp1a) is a novel nuclear protein localized predominantly, but not exclusively, in nucleoli. Although initially isolated as a c-Myb interacting protein, Mybbp1a is expressed ubiquitously, associates with a number of different transcription factors, and may play a role in both RNA polymerase I- and II-mediated transcriptional regulation. However, its precise function remains unclear. In this study we show that Mybbp1a is a nucleocytoplasmic shuttling protein and investigate the mechanisms responsible for both nuclear import and export. The carboxyl terminus of Mybbp1a, which contains seven short basic amino acid repeat sequences, is responsible for both nuclear and nucleolar localization, and this activity can be transferred to a heterologous protein. Deletion mapping demonstrated that these repeat sequences appear to act incrementally, with successive deletions resulting in a corresponding increase in the proportion of protein localized in the cytoplasm. Glutathione S-transferase pulldown experiments showed that the nuclear transport receptor importin-alpha/beta mediates Mybbp1a nuclear import. Interspecies heterokaryons were used to demonstrate that Mybbp1a was capable of shuttling between the nucleus and the cytoplasm. Deletion analysis and in vivo export studies using a heterologous assay system identified several nuclear export sequences which facilitate nuclear export of Mybbp1a by CRM1-dependent and -independent pathways. (C) 2003 Elsevier Science (USA). All rights reserved.
Abstract:
Estimating energy requirements is necessary in clinical practice when indirect calorimetry is impractical. This paper systematically reviews current methods for estimating energy requirements. Conclusions include: there is discrepancy between the characteristics of populations upon which predictive equations are based and current populations; tools are not well understood, and patient care can be compromised by inappropriate application of the tools. Data comparing tools and methods are presented and issues for practitioners are discussed. (C) 2003 International Life Sciences Institute.
Abstract:
Taking functional programming to its extremities in search of simplicity still requires integration with other development methods (e.g. formal methods). Induction is the key to deriving and verifying functional programs, but can be simplified through packaging proofs with functions, particularly folds, on data (structures). Totally Functional Programming avoids the complexities of interpretation by directly representing data (structures) as platonic combinators - the functions characteristic to the data. The link between the two simplifications is that platonic combinators are a kind of partially-applied fold, which means that platonic combinators inherit fold-theoretic properties, but with some apparent simplifications due to the platonic combinator representation. However, despite observable behaviour within functional programming that suggests that TFP is widely applicable, significant work remains before TFP as such could be widely adopted.
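The idea that a platonic combinator is a partially-applied fold can be made concrete with a Church-style encoding. A rough sketch (in Python rather than a pure functional language, and with names `nil`, `cons`, `to_py` invented here): a list is represented directly as the function that folds itself.

```python
# Sketch of data-as-combinator: a list represented as its own fold.
# Given a cons-step function and a nil-value, the "list" folds itself.

def nil(step, base):
    # The empty list's fold just returns the base value.
    return base

def cons(head, tail):
    # A non-empty list is a partially-applied fold over head and tail.
    return lambda step, base: step(head, tail(step, base))

def to_py(xs):
    """Interpret the combinator back into an ordinary Python list."""
    return xs(lambda h, t: [h] + t, [])

xs = cons(1, cons(2, cons(3, nil)))
print(to_py(xs))                  # [1, 2, 3]
print(xs(lambda h, t: h + t, 0))  # 6 -- summation is just another fold
```

Because the structure *is* the fold, any catamorphism comes for free by supplying a different step function, which is the fold-theoretic inheritance the abstract mentions.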
Abstract:
Objective: The Assessing Cost-Effectiveness - Mental Health (ACE-MH) study aims to assess, from a health sector perspective, whether there are options for change that could improve the effectiveness and efficiency of Australia's current mental health services by directing available resources toward 'best practice' cost-effective services. Method: The use of standardized evaluation methods addresses the reservations expressed by many economists about the simplistic use of League Tables based on economic studies confounded by differences in methods, context and setting. The cost-effectiveness ratio for each intervention is calculated using economic and epidemiological data. This includes systematic reviews and randomised controlled trials for efficacy, the Australian Surveys of Mental Health and Wellbeing for current practice and a combination of trials and longitudinal studies for adherence. The cost-effectiveness ratios are presented as cost (A$) per disability-adjusted life year (DALY) saved with a 95% uncertainty interval based on Monte Carlo simulation modelling. An assessment of interventions on 'second filter' criteria ('equity', 'strength of evidence', 'feasibility' and 'acceptability to stakeholders') allows broader concepts of 'benefit' to be taken into account, as well as factors that might influence policy judgements in addition to cost-effectiveness ratios. Conclusions: The main limitation of the study is in the translation of the effect size from trials into a change in the DALY disability weight, which required the use of newly developed methods. While comparisons within disorders are valid, comparisons across disorders should be made with caution. A series of articles is planned to present the results.
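The Monte Carlo uncertainty interval described above can be sketched in a few lines: draw cost and DALY inputs from distributions, form the ratio, and take empirical 2.5th/97.5th percentiles. The distributions and their parameters below are invented placeholders, not ACE-MH data.

```python
# Sketch of a cost-per-DALY 95% uncertainty interval via Monte Carlo.
# All distribution parameters are hypothetical placeholders.
import random

random.seed(42)  # deterministic for demonstration

def monte_carlo_cer(n=10_000):
    ratios = []
    for _ in range(n):
        cost = random.gauss(5_000_000, 500_000)  # A$ programme cost
        dalys = random.gauss(400, 50)            # DALYs averted
        ratios.append(cost / dalys)
    ratios.sort()
    lo, hi = ratios[int(0.025 * n)], ratios[int(0.975 * n)]
    return sum(ratios) / n, (lo, hi)

mean, (lo, hi) = monte_carlo_cer()
print(f"A${mean:,.0f} per DALY saved (95% UI A${lo:,.0f}-A${hi:,.0f})")
```

In a full analysis each input would carry its own evidence-based distribution (efficacy, adherence, costs), but the percentile mechanics are the same.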
Abstract:
Computational models complement laboratory experimentation for efficient identification of MHC-binding peptides and T-cell epitopes. Methods for prediction of MHC-binding peptides include binding motifs, quantitative matrices, artificial neural networks, hidden Markov models, and molecular modelling. Models derived by these methods have been successfully used for prediction of T-cell epitopes in cancer, autoimmunity, infectious disease, and allergy. For maximum benefit, the use of computer models must be treated as experiments analogous to standard laboratory procedures and performed according to strict standards. This requires careful selection of data for model building, and adequate testing and validation. A range of web-based databases and MHC-binding prediction programs are available. Although some available prediction programs for particular MHC alleles have reasonable accuracy, there is no guarantee that all models produce good quality predictions. In this article, we present and discuss a framework for modelling, testing, and applications of computational methods used in predictions of T-cell epitopes. (C) 2004 Elsevier Inc. All rights reserved.
Abstract:
Analytical and bioanalytical methods of high-performance liquid chromatography with fluorescence detection (HPLC-FLD) were developed and validated for the determination of chloroaluminum phthalocyanine in different formulations of polymeric nanocapsules, plasma and livers of mice. Plasma and homogenized liver samples were extracted with ethyl acetate, and zinc phthalocyanine was used as internal standard. The results indicated that the methods were linear and selective for all matrices studied. Analysis of accuracy and precision showed adequate values, with variations lower than 10% in biological samples and lower than 2% in analytical samples. The recoveries were as high as 96% and 99% in the plasma and livers, respectively. The quantification limit of the analytical method was 1.12 ng/ml, and the limits of quantification of the bioanalytical method were 15 ng/ml and 75 ng/g for plasma and liver samples, respectively. The bioanalytical method developed was sensitive in the ranges of 15-100 ng/ml in plasma and 75-500 ng/g in liver samples and was applied to studies of biodistribution and pharmacokinetics of AlClPc. (C) 2011 Elsevier B.V. All rights reserved.
Abstract:
UV-VIS spectrophotometric and spectrofluorimetric methods have been developed and validated allowing the quantification of chloroaluminum phthalocyanine (ClAlPc) in nanocarriers. In order to validate the methods, the linearity, limit of detection (LOD), limit of quantification (LOQ), precision, accuracy, and selectivity were examined according to USP 30 and ICH guidelines. Linearity ranges were found between 0.50-3.00 μg/mL (Y = 0.3829 X [ClAlPc, μg/mL] + 0.0126; r = 0.9992) for spectrophotometry, and 0.05-1.00 μg/mL (Y = 2.24 x 10^6 X [ClAlPc, μg/L] + 9.74 x 10^4; r = 0.9978) for spectrofluorimetry. In addition, ANOVA and lack-of-fit tests demonstrated that the regression equations were statistically significant (p < 0.05) and that the linear model is fully adequate for both analytical methods. The LOD values were 0.09 and 0.01 μg/mL, while the LOQ values were 0.27 and 0.04 μg/mL for the spectrophotometric and spectrofluorimetric methods, respectively. Repeatability and intermediate precision for the proposed methods showed relative standard deviations (RSD) between 0.58% and 4.80%. The percent recovery ranged from 98.9% to 102.7% for spectrophotometric analyses and from 94.2% to 101.2% for spectrofluorimetry. No interferences from common excipients were detected and both methods were considered specific. Therefore, the methods are accurate, precise, specific, and reproducible and hence can be applied for quantification of ClAlPc in nanoemulsions (NE) and nanocapsules (NC).
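The reported spectrophotometric calibration line (Y = 0.3829 X + 0.0126) can be inverted to back-calculate concentration from absorbance, and the ICH formulas LOD = 3.3 σ/slope and LOQ = 10 σ/slope reproduce limits of the order reported. The residual standard deviation σ used below is an assumed placeholder chosen for illustration, not a value from the paper.

```python
# Back-calculation sketch using the regression reported in the abstract
# (Y = 0.3829*X + 0.0126, X in ug/mL) plus ICH-style LOD/LOQ formulas.
SLOPE, INTERCEPT = 0.3829, 0.0126

def concentration(absorbance):
    """Invert the calibration line to get concentration in ug/mL."""
    return (absorbance - INTERCEPT) / SLOPE

def lod_loq(sigma, slope=SLOPE):
    """ICH limits: LOD = 3.3*sigma/slope, LOQ = 10*sigma/slope."""
    return 3.3 * sigma / slope, 10.0 * sigma / slope

# An absorbance of 0.7784 back-calculates to 2.00 ug/mL on this line:
print(round(concentration(0.7784), 2))  # 2.0
# With an assumed sigma of 0.0104 the limits round to ~0.09 and ~0.27 ug/mL,
# consistent with the spectrophotometric values quoted in the abstract:
print(lod_loq(sigma=0.0104))
```

The same inversion applies to the spectrofluorimetric line, with the caveat that its X is expressed in μg/L.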
Abstract:
Anemia screening before blood donation requires an accurate, quick, practical, and easy method with minimal discomfort for the donors. The aim of this study was to compare the accuracy of two quantitative methods of anemia screening: the HemoCue 201+ (Aktiebolaget Leo Diagnostics) hemoglobin (Hb) and microhematocrit (micro-Hct) tests. Two blood samples of a single fingerstick were obtained from 969 unselected potential female donors to determine the Hb by HemoCue 201+ and micro-Hct using HemataSTAT II (Separation Technology, Inc.), in alternating order. From each participant, a venous blood sample was drawn and run in an automatic hematology analyzer (ABX Pentra 60, ABX Diagnostics). Considering results of the ABX Pentra 60 as true values, the sensitivity and specificity of the HemoCue 201+ and micro-Hct as screening methods were compared, using a venous Hb level of 12.0 g per dL as cutoff for anemia. The sensitivities of the HemoCue 201+ and HemataSTAT II in detecting anemia were 56 percent (95% confidence interval [CI], 46.1%-65.5%) and 39.5 percent (95% CI, 30.2%-49.3%), respectively (p < 0.001). Analyzing only candidates with a venous Hb level lower than 11.0 g per dL, the deferral rate was 100 percent by HemoCue 201+ and 77 percent by HemataSTAT II. The specificities of the methods were 93.5 and 93.2 percent, respectively. The HemoCue 201+ showed greater discriminating power for detecting anemia in prospective blood donors than the micro-Hct method. Both presented equivalent deferral error rates of nonanemic potential donors. Compared to the micro-Hct, the HemoCue 201+ reduces the risk of anemic female donors giving blood, especially for those with lower Hb levels, without increasing the deferral of nonanemic potential donors.
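The accuracy measures compared above follow from a 2x2 table against the reference analyzer: sensitivity = TP/(TP+FN), specificity = TN/(TN+FP), with a normal-approximation 95% CI. The counts below are hypothetical illustrations, not the study's raw data.

```python
# Sketch of screening-accuracy measures with a normal-approximation CI.
# The 2x2 counts are invented for illustration.
import math

def proportion_ci(successes, total, z=1.96):
    """Point estimate and normal-approximation 95% CI for a proportion."""
    p = successes / total
    half = z * math.sqrt(p * (1 - p) / total)
    return p, (p - half, p + half)

def sensitivity(tp, fn):
    # Fraction of truly anemic donors flagged by the screening test.
    return proportion_ci(tp, tp + fn)

def specificity(tn, fp):
    # Fraction of nonanemic donors correctly passed.
    return proportion_ci(tn, tn + fp)

sens, (lo, hi) = sensitivity(tp=57, fn=45)   # hypothetical counts
spec, _ = specificity(tn=810, fp=57)         # hypothetical counts
print(f"sensitivity {sens:.1%} (95% CI {lo:.1%}-{hi:.1%}); specificity {spec:.1%}")
```

The study's reported intervals (e.g. 56%, 95% CI 46.1%-65.5%) are of exactly this form, and the wide intervals reflect the modest number of truly anemic candidates.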
Abstract:
This special issue represents a further exploration of some issues raised at a symposium entitled “Functional magnetic resonance imaging: From methods to madness” presented during the 15th annual Theoretical and Experimental Neuropsychology (TENNET XV) meeting in Montreal, Canada in June, 2004. The special issue’s theme is methods and learning in functional magnetic resonance imaging (fMRI), and it comprises 6 articles (3 reviews and 3 empirical studies). The first (Amaro and Barker) provides a beginner’s guide to fMRI and the BOLD effect (perhaps an alternative title might have been “fMRI for dummies”). While fMRI is now commonplace, there are still researchers who have yet to employ it as an experimental method and need some basic questions answered before they venture into new territory. This article should serve them well. A key issue of interest at the symposium was how fMRI could be used to elucidate cerebral mechanisms responsible for new learning. The next 4 articles address this directly, with the first (Little and Thulborn) an overview of data from fMRI studies of category-learning, and the second from the same laboratory (Little, Shin, Siscol, and Thulborn) an empirical investigation of changes in brain activity occurring across different stages of learning. While a role for medial temporal lobe (MTL) structures in episodic memory encoding has been acknowledged for some time, the different experimental tasks and stimuli employed across neuroimaging studies have, not surprisingly, produced conflicting data in terms of the precise subregion(s) involved. The next paper (Parsons, Haut, Lemieux, Moran, and Leach) addresses this by examining effects of stimulus modality during verbal memory encoding.
Typically, BOLD fMRI studies of learning are conducted over short time scales; however, the fourth paper in this series (Olson, Rao, Moore, Wang, Detre, and Aguirre) describes an empirical investigation of learning occurring over a longer than usual period, achieved by employing a relatively novel technique called perfusion fMRI. This technique shows considerable promise for future studies. The final article in this special issue (de Zubicaray) represents a departure from the more familiar cognitive neuroscience applications of fMRI, instead describing how neuroimaging studies might be conducted to both inform and constrain information processing models of cognition.