Abstract:
Estimating energy requirements is necessary in clinical practice when indirect calorimetry is impractical. This paper systematically reviews current methods for estimating energy requirements. Conclusions include: there is a discrepancy between the characteristics of the populations on which predictive equations are based and current populations; the tools are not well understood; and patient care can be compromised by inappropriate application of the tools. Data comparing tools and methods are presented, and issues for practitioners are discussed. (C) 2003 International Life Sciences Institute.
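As a rough illustration of the kind of predictive equation the review discusses, the widely cited Harris-Benedict equations estimate resting energy expenditure from sex, weight, height, and age. The coefficients below are the commonly quoted rounded values, and the function name is ours; this is a sketch, not the review's own method:

```python
def harris_benedict_bmr(sex, weight_kg, height_cm, age_yr):
    """Estimate basal metabolic rate (kcal/day) with the classic
    Harris-Benedict equations (commonly quoted rounded coefficients)."""
    if sex == "male":
        return 66.5 + 13.75 * weight_kg + 5.003 * height_cm - 6.775 * age_yr
    return 655.1 + 9.563 * weight_kg + 1.850 * height_cm - 4.676 * age_yr

# e.g. a 30-year-old woman, 60 kg, 165 cm
bmr = harris_benedict_bmr("female", 60.0, 165.0, 30.0)
```

The mismatch the paper notes is visible here: these coefficients were fitted to early-20th-century reference populations, so applying them to a modern patient group carries a systematic risk.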
Abstract:
Taking functional programming to its extremities in search of simplicity still requires integration with other development methods (e.g. formal methods). Induction is the key to deriving and verifying functional programs, but it can be simplified by packaging proofs with functions, particularly folds, on data structures. Totally Functional Programming (TFP) avoids the complexities of interpretation by directly representing data structures as platonic combinators - the functions characteristic to the data. The link between the two simplifications is that platonic combinators are a kind of partially-applied fold, which means that they inherit fold-theoretic properties, with some apparent simplifications due to the platonic-combinator representation. However, although observable behaviour within functional programming suggests that TFP is widely applicable, significant work remains before TFP as such could be widely adopted.
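The idea of representing a data structure directly by its fold - the abstract's "platonic combinator" - can be sketched in Python: a list is not a record of cells but the function that folds over it, and consumers such as length and sum fall out as partial applications. Names below are illustrative, not from the paper:

```python
# A "platonic" list is represented directly by its fold function,
# which takes a combining function f and a seed z.
def nil(f, z):
    return z  # folding the empty list yields the seed

def cons(x, xs):
    # folding (x : xs) applies f to x and the fold of the tail
    return lambda f, z: f(x, xs(f, z))

# Build the list [1, 2, 3] as a combinator.
lst = cons(1, cons(2, cons(3, nil)))

# Consumers are just partial applications of the fold.
length = lst(lambda x, acc: 1 + acc, 0)   # -> 3
total  = lst(lambda x, acc: x + acc, 0)   # -> 6
```

Because the data *is* its fold, fold-fusion style reasoning applies to it directly, which is the simplification the abstract attributes to the representation.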
Abstract:
Objective: The Assessing Cost-Effectiveness - Mental Health (ACE-MH) study aims to assess, from a health sector perspective, whether there are options for change that could improve the effectiveness and efficiency of Australia's current mental health services by directing available resources toward 'best practice' cost-effective services. Method: The use of standardized evaluation methods addresses the reservations expressed by many economists about the simplistic use of league tables based on economic studies confounded by differences in methods, context and setting. The cost-effectiveness ratio for each intervention is calculated using economic and epidemiological data, including systematic reviews and randomised controlled trials for efficacy, the Australian Surveys of Mental Health and Wellbeing for current practice, and a combination of trials and longitudinal studies for adherence. The cost-effectiveness ratios are presented as cost (A$) per disability-adjusted life year (DALY) saved, with a 95% uncertainty interval based on Monte Carlo simulation modelling. An assessment of interventions against 'second filter' criteria ('equity', 'strength of evidence', 'feasibility' and 'acceptability to stakeholders') allows broader concepts of 'benefit' to be taken into account, as well as factors that might influence policy judgements in addition to cost-effectiveness ratios. Conclusions: The main limitation of the study lies in translating effect sizes from trials into changes in the DALY disability weight, which required newly developed methods. While comparisons within disorders are valid, comparisons across disorders should be made with caution. A series of articles is planned to present the results.
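The Monte Carlo uncertainty interval the abstract mentions can be sketched as follows. The cost and effect distributions, their parameters, and the function name are all illustrative assumptions, not figures from the ACE-MH study:

```python
import random

def cost_per_daly_ratio(n=10000, seed=1):
    """Monte Carlo sketch: propagate uncertainty in cost and effect
    into a cost-per-DALY ratio with a 95% uncertainty interval.
    Distributions and parameters are illustrative placeholders."""
    random.seed(seed)
    ratios = sorted(
        random.gauss(5000.0, 800.0) /            # A$ cost per person treated
        max(random.gauss(0.10, 0.02), 1e-6)      # DALYs averted per person
        for _ in range(n)
    )
    median = ratios[n // 2]
    lower, upper = ratios[int(0.025 * n)], ratios[int(0.975 * n)]
    return median, (lower, upper)

median, (lower, upper) = cost_per_daly_ratio()
```

Reporting the 2.5th and 97.5th percentiles of the simulated ratio, rather than a point estimate, is what lets interventions be compared without the false precision of a single league-table number.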
Abstract:
Computational models complement laboratory experimentation for efficient identification of MHC-binding peptides and T-cell epitopes. Methods for prediction of MHC-binding peptides include binding motifs, quantitative matrices, artificial neural networks, hidden Markov models, and molecular modelling. Models derived by these methods have been successfully used for prediction of T-cell epitopes in cancer, autoimmunity, infectious disease, and allergy. For maximum benefit, the use of computer models must be treated as experiments analogous to standard laboratory procedures and performed according to strict standards. This requires careful selection of data for model building, and adequate testing and validation. A range of web-based databases and MHC-binding prediction programs are available. Although some available prediction programs for particular MHC alleles have reasonable accuracy, there is no guarantee that all models produce good quality predictions. In this article, we present and discuss a framework for modelling, testing, and applications of computational methods used in predictions of T-cell epitopes. (C) 2004 Elsevier Inc. All rights reserved.
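Of the prediction methods listed, the quantitative matrix is the simplest to illustrate: each position in the peptide contributes an independent score, and the sum ranks candidate binders. The toy 3-position matrix and scores below are invented for illustration; real MHC matrices cover 9-mer peptides and all 20 amino acids:

```python
# Quantitative-matrix (position-specific scoring) sketch for a
# toy 3-residue motif. Scores are illustrative log-odds-style values.
MATRIX = [
    {"A": 1.2, "L": 0.5, "K": -0.8},   # position 1 preferences
    {"A": -0.3, "L": 1.0, "K": 0.2},   # position 2 preferences
    {"A": 0.1, "L": -0.5, "K": 1.5},   # position 3 preferences
]

def score_peptide(peptide):
    """Sum per-position scores; higher means more binder-like.
    Unseen residues get a default penalty."""
    return sum(MATRIX[i].get(aa, -1.0) for i, aa in enumerate(peptide))

# Rank a few candidate peptides by predicted binding.
best = max(["ALK", "KLA", "LLL"], key=score_peptide)
```

The abstract's warning applies even to a sketch like this: the matrix is only as good as the binding data it was trained on, so testing and validation against held-out peptides is part of the method, not an afterthought.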
Abstract:
Analytical and bioanalytical methods of high-performance liquid chromatography with fluorescence detection (HPLC-FLD) were developed and validated for the determination of chloroaluminum phthalocyanine (AlClPc) in different formulations of polymeric nanocapsules, plasma and livers of mice. Plasma and homogenized liver samples were extracted with ethyl acetate, and zinc phthalocyanine was used as the internal standard. The results indicated that the methods were linear and selective for all matrices studied. Analysis of accuracy and precision showed adequate values, with variations lower than 10% in biological samples and lower than 2% in analytical samples. The recoveries were as high as 96% and 99% in the plasma and livers, respectively. The quantification limit of the analytical method was 1.12 ng/ml, and the limits of quantification of the bioanalytical method were 15 ng/ml and 75 ng/g for plasma and liver samples, respectively. The bioanalytical method was sensitive in the ranges of 15-100 ng/ml in plasma and 75-500 ng/g in liver samples and was applied to studies of the biodistribution and pharmacokinetics of AlClPc. (C) 2011 Elsevier B.V. All rights reserved.
Abstract:
UV-Vis spectrophotometric and spectrofluorimetric methods were developed and validated for the quantification of chloroaluminum phthalocyanine (ClAlPc) in nanocarriers. To validate the methods, linearity, limit of detection (LOD), limit of quantification (LOQ), precision, accuracy, and selectivity were examined according to USP 30 and ICH guidelines. Linear ranges were 0.50-3.00 µg/mL (Y = 0.3829 X [ClAlPc, µg/mL] + 0.0126; r = 0.9992) for spectrophotometry and 0.05-1.00 µg/mL (Y = 2.24 × 10^6 X [ClAlPc, µg/L] + 9.74 × 10^4; r = 0.9978) for spectrofluorimetry. In addition, ANOVA and lack-of-fit tests demonstrated that the regression equations were statistically significant (p < 0.05) and that the linear model is fully adequate for both analytical methods. The LOD values were 0.09 and 0.01 µg/mL, while the LOQ values were 0.27 and 0.04 µg/mL for the spectrophotometric and spectrofluorimetric methods, respectively. Repeatability and intermediate precision for the proposed methods showed relative standard deviations (RSD) between 0.58% and 4.80%. Recovery ranged from 98.9% to 102.7% for spectrophotometric analyses and from 94.2% to 101.2% for spectrofluorimetry. No interference from common excipients was detected, and both methods were considered specific. Therefore, the methods are accurate, precise, specific, and reproducible, and can be applied to the quantification of ClAlPc in nanoemulsions (NE) and nanocapsules (NC).
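One ICH-recognized way to obtain LOD and LOQ of the kind reported above is from the calibration line itself: LOD = 3.3·σ/S and LOQ = 10·σ/S, where S is the slope and σ the residual standard deviation of the regression. The calibration data below are made up for illustration; only the formulas follow the guideline:

```python
# ICH-style LOD/LOQ sketch from a least-squares calibration line.
# Calibration data here are illustrative, not the study's measurements.
def linreg(xs, ys):
    """Ordinary least squares: slope, intercept, and residual std dev."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx
    intercept = my - slope * mx
    resid = [y - (slope * x + intercept) for x, y in zip(xs, ys)]
    sigma = (sum(r * r for r in resid) / (n - 2)) ** 0.5
    return slope, intercept, sigma

conc = [0.5, 1.0, 1.5, 2.0, 2.5, 3.0]           # standard concentrations, µg/mL
signal = [0.20, 0.39, 0.58, 0.77, 0.97, 1.16]   # instrument response

slope, intercept, sigma = linreg(conc, signal)
lod = 3.3 * sigma / slope   # ICH: LOD = 3.3 * sigma / S
loq = 10 * sigma / slope    # ICH: LOQ = 10  * sigma / S
```

By construction LOQ/LOD = 10/3.3, which matches the roughly 3:1 ratio between the LOQ and LOD values the abstract reports for each method.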
Abstract:
This paper uses Bourdieu to develop theorizing about policy processes in education and to extend the policy cycle approach in a time of globalization. Use is made of Bourdieu's concept of social field and the argument is sustained that in the context of globalization the field of educational policy has reduced autonomy, with enhanced cross-field effects in educational policy production, particularly from the fields of the economy and journalism. Given the social rather than geographical character of Bourdieu's concept of social fields, it is also argued that the concept can be, and indeed has to be, stretched beyond the nation to take account of the emergent global policy field in education. Utilizing Bourdieu's late work on the globalization of the economy through neo-liberal politics, we argue that a non-reified account of the emergent global educational policy field can be provided.
Abstract:
Anemia screening before blood donation requires an accurate, quick, practical, and easy method with minimal discomfort for the donors. The aim of this study was to compare the accuracy of two quantitative methods of anemia screening: the HemoCue 201(+) (Aktiebolaget Leo Diagnostics) hemoglobin (Hb) and microhematocrit (micro-Hct) tests. Two blood samples from a single fingerstick were obtained from 969 unselected potential female donors to determine the Hb by HemoCue 201(+) and micro-Hct using HemataSTAT II (Separation Technology, Inc.), in alternating order. From each participant, a venous blood sample was drawn and run in an automatic hematology analyzer (ABX Pentra 60, ABX Diagnostics). Considering the results of the ABX Pentra 60 as true values, the sensitivity and specificity of HemoCue 201(+) and micro-Hct as screening methods were compared, using a venous Hb level of 12.0 g per dL as the cutoff for anemia. The sensitivities of the HemoCue 201(+) and HemataSTAT II in detecting anemia were 56 percent (95% confidence interval [CI], 46.1%-65.5%) and 39.5 percent (95% CI, 30.2%-49.3%), respectively (p < 0.001). Analyzing only candidates with a venous Hb level lower than 11.0 g per dL, the deferral rate was 100 percent by HemoCue 201(+) and 77 percent by HemataSTAT II. The specificities of the methods were 93.5 and 93.2 percent, respectively. The HemoCue 201(+) showed greater discriminating power for detecting anemia in prospective blood donors than the micro-Hct method. Both presented equivalent deferral error rates of nonanemic potential donors. Compared to the micro-Hct, HemoCue 201(+) reduces the risk of anemic female donors giving blood, especially for those with lower Hb levels, without increasing the deferral of nonanemic potential donors.
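The sensitivity and specificity figures above, with their 95% CIs, come from comparing each screening method against the analyzer reference on a 2x2 table. A minimal sketch, using a normal-approximation CI and illustrative counts chosen to echo the reported 56% sensitivity (not the study's raw data):

```python
import math

def prop_with_ci(successes, n):
    """Proportion with a normal-approximation (Wald) 95% CI."""
    p = successes / n
    half = 1.96 * math.sqrt(p * (1 - p) / n)
    return p, (p - half, p + half)

def sens_spec(tp, fn, tn, fp):
    """Sensitivity = TP/(TP+FN); specificity = TN/(TN+FP),
    each with its 95% CI, versus a reference-standard diagnosis."""
    return prop_with_ci(tp, tp + fn), prop_with_ci(tn, tn + fp)

# Illustrative counts: 56 of 100 true anemics flagged,
# 935 of 1000 non-anemics correctly cleared.
(sens, sens_ci), (spec, spec_ci) = sens_spec(tp=56, fn=44, tn=935, fp=65)
```

The study's contrast is then simply that HemoCue's sensitivity interval sits above HemataSTAT's while the two specificities are essentially equal.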
Abstract:
This special issue represents a further exploration of some issues raised at a symposium entitled “Functional magnetic resonance imaging: From methods to madness” presented during the 15th annual Theoretical and Experimental Neuropsychology (TENNET XV) meeting in Montreal, Canada in June, 2004. The special issue’s theme is methods and learning in functional magnetic resonance imaging (fMRI), and it comprises 6 articles (3 reviews and 3 empirical studies). The first (Amaro and Barker) provides a beginner’s guide to fMRI and the BOLD effect (perhaps an alternative title might have been “fMRI for dummies”). While fMRI is now commonplace, there are still researchers who have yet to employ it as an experimental method and need some basic questions answered before they venture into new territory. This article should serve them well. A key issue of interest at the symposium was how fMRI could be used to elucidate the cerebral mechanisms responsible for new learning. The next 4 articles address this directly, with the first (Little and Thulborn) an overview of data from fMRI studies of category-learning, and the second from the same laboratory (Little, Shin, Siscol, and Thulborn) an empirical investigation of changes in brain activity occurring across different stages of learning. While a role for medial temporal lobe (MTL) structures in episodic memory encoding has been acknowledged for some time, the different experimental tasks and stimuli employed across neuroimaging studies have, unsurprisingly, produced conflicting data in terms of the precise subregion(s) involved. The next paper (Parsons, Haut, Lemieux, Moran, and Leach) addresses this by examining effects of stimulus modality during verbal memory encoding.
Typically, BOLD fMRI studies of learning are conducted over short time scales; however, the fourth paper in this series (Olson, Rao, Moore, Wang, Detre, and Aguirre) describes an empirical investigation of learning over a longer-than-usual period, achieved by employing a relatively novel technique called perfusion fMRI. This technique shows considerable promise for future studies. The final article in this special issue (de Zubicaray) represents a departure from the more familiar cognitive neuroscience applications of fMRI, instead describing how neuroimaging studies might be conducted to both inform and constrain information processing models of cognition.
Abstract:
Purpose: Many methods exist in the literature for identifying the PEEP to set in ARDS patients following a lung recruitment maneuver (RM). We compared ten published parameters for setting PEEP following a RM. Methods: Lung injury was induced by bilateral lung lavage in 14 female Dorset sheep, yielding a PaO2 of 100-150 mmHg at FiO2 1.0 and PEEP 5 cmH2O. A quasi-static P-V curve was then obtained using the supersyringe method; PEEP was set to 20 cmH2O and a RM performed with pressure control ventilation (inspiratory pressure set to 40-50 cmH2O) until PaO2 + PaCO2 > 400 mmHg. Following the RM, a decremental PEEP trial was performed: PEEP was decreased in 1 cmH2O steps every 5 min until 15 cmH2O was reached. Parameters measured during the decremental PEEP trial were compared with parameters obtained from the P-V curve. Results: For setting PEEP, maximum dynamic tidal respiratory compliance, maximum PaO2, maximum PaO2 + PaCO2, and minimum shunt calculated during the decremental PEEP trial, and the lower Pflex and point of maximal compliance increase on the inflation limb of the P-V curve (Pmci,i), were statistically indistinguishable. The PEEP values obtained using the deflation upper Pflex and the point of maximal compliance decrease on the deflation limb were significantly higher, and the true inflection point on the inflation limb and minimum PaCO2 were significantly lower, than the other variables. Conclusion: In this animal model of ARDS, dynamic tidal respiratory compliance, maximum PaO2, maximum PaO2 + PaCO2, minimum shunt, inflation lower Pflex, and Pmci,i yield similar values for PEEP following a recruitment maneuver.
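One of the decremental-trial criteria compared above, maximum dynamic tidal compliance, reduces computationally to picking the PEEP step whose measured compliance is highest. A minimal sketch with invented numbers (the study's actual measurements are not reproduced here):

```python
# Decremental PEEP trial sketch: after a recruitment maneuver, PEEP is
# stepped down and compliance recorded at each step; the selected PEEP
# is the step with maximum dynamic tidal compliance.
# (PEEP in cmH2O, compliance in mL/cmH2O; values are illustrative.)
trial = [
    (20, 28.0),
    (19, 30.5),
    (18, 32.0),
    (17, 33.5),   # compliance peaks here, then falls as lung derecruits
    (16, 33.1),
    (15, 31.0),
]

best_peep, best_compliance = max(trial, key=lambda step: step[1])
```

The study's finding is that this simple bedside criterion lands on essentially the same PEEP as oxygenation-based criteria and the inflation-limb landmarks of the P-V curve.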
Coronary CT angiography using 64 detector rows: methods and design of the multi-centre trial CORE-64
Abstract:
Multislice computed tomography (MSCT) for the noninvasive detection of coronary artery stenoses is a promising candidate for widespread clinical application because of its non-invasive nature and high sensitivity and negative predictive value as found in several previous studies using 16 to 64 simultaneous detector rows. A multi-centre study of CT coronary angiography using 16 simultaneous detector rows has shown that 16-slice CT is limited by a high number of nondiagnostic cases and a high false-positive rate. A recent meta-analysis indicated a significant interaction between the size of the study sample and the diagnostic odds ratios suggestive of small study bias, highlighting the importance of evaluating MSCT using 64 simultaneous detector rows in a multi-centre approach with a larger sample size. In this manuscript we detail the objectives and methods of the prospective "CORE-64" trial ("Coronary Evaluation Using Multidetector Spiral Computed Tomography Angiography using 64 Detectors"). This multi-centre trial was unique in that it assessed the diagnostic performance of 64-slice CT coronary angiography in nine centres worldwide in comparison to conventional coronary angiography. In conclusion, the multi-centre, multi-institutional and multi-continental trial CORE-64 has great potential to ultimately assess the per-patient diagnostic performance of coronary CT angiography using 64 simultaneous detector rows.
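The diagnostic odds ratio (DOR) mentioned in connection with the meta-analysis is a single summary of a test's 2x2 table against the reference standard: the odds of a positive test in diseased patients divided by the odds of a positive test in non-diseased patients. The counts below are illustrative only:

```python
# Diagnostic odds ratio sketch from a 2x2 table (illustrative counts,
# not CORE-64 data). DOR = (TP/FN) / (FP/TN); higher = better
# discrimination, DOR = 1 means the test is uninformative.
tp, fn = 90, 10   # diseased: test positive / test negative
fp, tn = 20, 80   # non-diseased: test positive / test negative

dor = (tp / fn) / (fp / tn)   # (90/10) / (20/80) = 36.0
```

The small-study bias the abstract cites shows up as smaller samples reporting systematically larger DORs, which is exactly what a large multi-centre sample is designed to guard against.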
Abstract:
OBJECTIVE. Coronary MDCT angiography has been shown to be an accurate noninvasive tool for the diagnosis of obstructive coronary artery disease (CAD). Its sensitivity and negative predictive value for diagnosing percentage of stenosis are unsurpassed compared with those of other noninvasive testing methods. However, in its current form, it provides no information regarding the physiologic impact of CAD and is a poor predictor of myocardial ischemia. CORE320 is a multicenter multinational diagnostic study with the primary objective to evaluate the diagnostic accuracy of 320-MDCT for detecting coronary artery luminal stenosis and corresponding myocardial perfusion deficits in patients with suspected CAD compared with the reference standard of conventional coronary angiography and SPECT myocardial perfusion imaging. CONCLUSION. We aim to describe the CT acquisition, reconstruction, and analysis methods of the CORE320 study.