970 results for Computational methods
Abstract:
Analytical and bioanalytical methods of high-performance liquid chromatography with fluorescence detection (HPLC-FLD) were developed and validated for the determination of chloroaluminum phthalocyanine (AlClPc) in different formulations of polymeric nanocapsules and in plasma and livers of mice. Plasma and homogenized liver samples were extracted with ethyl acetate, and zinc phthalocyanine was used as the internal standard. The results indicated that the methods were linear and selective for all matrices studied. Analysis of accuracy and precision showed adequate values, with variations lower than 10% in biological samples and lower than 2% in analytical samples. The recoveries were as high as 96% and 99% in plasma and livers, respectively. The quantification limit of the analytical method was 1.12 ng/mL, and the limits of quantification of the bioanalytical method were 15 ng/mL and 75 ng/g for plasma and liver samples, respectively. The bioanalytical method was sensitive in the ranges of 15-100 ng/mL in plasma and 75-500 ng/g in liver samples and was applied to studies of the biodistribution and pharmacokinetics of AlClPc. (C) 2011 Elsevier B.V. All rights reserved.
Abstract:
UV-VIS spectrophotometric and spectrofluorimetric methods have been developed and validated for the quantification of chloroaluminum phthalocyanine (ClAlPc) in nanocarriers. To validate the methods, linearity, limit of detection (LOD), limit of quantification (LOQ), precision, accuracy, and selectivity were examined according to USP 30 and ICH guidelines. Linearity ranges were 0.50-3.00 μg/mL (Y = 0.3829X [ClAlPc, μg/mL] + 0.0126; r = 0.9992) for spectrophotometry and 0.05-1.00 μg/mL (Y = 2.24 × 10⁶ X [ClAlPc, μg/L] + 9.74 × 10⁴; r = 0.9978) for spectrofluorimetry. In addition, ANOVA and lack-of-fit tests demonstrated that the regression equations were statistically significant (p < 0.05) and that the linear model is fully adequate for both analytical methods. The LOD values were 0.09 and 0.01 μg/mL, while the LOQ values were 0.27 and 0.04 μg/mL for the spectrophotometric and spectrofluorimetric methods, respectively. Repeatability and intermediate precision for the proposed methods showed relative standard deviations (RSD) between 0.58% and 4.80%. The percent recovery ranged from 98.9% to 102.7% for spectrophotometric analyses and from 94.2% to 101.2% for spectrofluorimetry. No interferences from common excipients were detected, and both methods were considered specific. Therefore, the methods are accurate, precise, specific, and reproducible, and can hence be applied for the quantification of ClAlPc in nanoemulsions (NE) and nanocapsules (NC).
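LOD and LOQ figures like those reported above are conventionally derived from the calibration line using the ICH relations LOD = 3.3σ/S and LOQ = 10σ/S, where S is the slope and σ the residual standard deviation of the fit. A minimal sketch, using hypothetical calibration points (not the paper's data):

```python
import numpy as np

def lod_loq(conc, signal):
    """ICH-style estimates from a calibration line: LOD = 3.3*sigma/S and
    LOQ = 10*sigma/S, where S is the slope and sigma the residual
    standard deviation of the least-squares fit."""
    conc = np.asarray(conc, dtype=float)
    signal = np.asarray(signal, dtype=float)
    slope, intercept = np.polyfit(conc, signal, 1)
    residuals = signal - (slope * conc + intercept)
    sigma = np.sqrt(np.sum(residuals ** 2) / (len(conc) - 2))  # n-2 dof
    return 3.3 * sigma / slope, 10 * sigma / slope

# Hypothetical calibration points (concentration in ug/mL vs. absorbance),
# loosely shaped like the spectrophotometric line quoted above.
conc = [0.5, 1.0, 1.5, 2.0, 2.5, 3.0]
signal = [0.205, 0.396, 0.588, 0.779, 0.973, 1.162]
lod, loq = lod_loq(conc, signal)
```

Since both limits share the same σ/S, the LOQ is always 10/3.3 ≈ 3 times the LOD under this convention.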
Abstract:
Carbon monoxide, the chief killer in fires, and other species are modelled for a series of enclosure fires. The conditions emulate building fires where CO is formed in the rich, turbulent, nonpremixed flame and is transported frozen to lean mixtures by the ceiling jet which is cooled by radiation and dilution. Conditional moment closure modelling is used and computational domain minimisation criteria are developed which reduce the computational cost of this method. The predictions give good agreement for CO and other species in the lean, quenched-gas stream, holding promise that this method may provide a practical means of modelling real, three-dimensional fire situations. (c) 2005 The Combustion Institute. Published by Elsevier Inc. All rights reserved.
Abstract:
Anemia screening before blood donation requires an accurate, quick, practical, and easy method with minimal discomfort for donors. The aim of this study was to compare the accuracy of two quantitative methods of anemia screening: the HemoCue 201+ (Aktiebolaget Leo Diagnostics) hemoglobin (Hb) test and the microhematocrit (micro-Hct) test. Two blood samples from a single fingerstick were obtained from 969 unselected potential female donors to determine the Hb by HemoCue 201+ and the micro-Hct using HemataSTAT II (Separation Technology, Inc.), in alternating order. From each participant, a venous blood sample was drawn and run on an automatic hematology analyzer (ABX Pentra 60, ABX Diagnostics). Considering the results of the ABX Pentra 60 as true values, the sensitivity and specificity of the HemoCue 201+ and micro-Hct as screening methods were compared, using a venous Hb level of 12.0 g per dL as the cutoff for anemia. The sensitivities of the HemoCue 201+ and HemataSTAT II in detecting anemia were 56 percent (95% confidence interval [CI], 46.1%-65.5%) and 39.5 percent (95% CI, 30.2%-49.3%), respectively (p < 0.001). Analyzing only candidates with a venous Hb level lower than 11.0 g per dL, the deferral rate was 100 percent by HemoCue 201+ and 77 percent by HemataSTAT II. The specificities of the methods were 93.5 and 93.2 percent, respectively. The HemoCue 201+ showed greater discriminating power for detecting anemia in prospective blood donors than the micro-Hct method. Both presented equivalent deferral error rates for nonanemic potential donors. Compared to the micro-Hct, the HemoCue 201+ reduces the risk of anemic female donors giving blood, especially those with lower Hb levels, without increasing the deferral of nonanemic potential donors.
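The sensitivity/specificity comparison above reduces to standard 2×2-table arithmetic against the reference standard. A brief sketch with illustrative counts (not the study's raw data):

```python
import math

def screening_metrics(tp, fn, tn, fp):
    """Sensitivity and specificity of a screening test versus a reference
    standard, with normal-approximation 95% confidence intervals."""
    def prop_ci(successes, n):
        p = successes / n
        half = 1.96 * math.sqrt(p * (1 - p) / n)
        return p, (p - half, p + half)
    sens, sens_ci = prop_ci(tp, tp + fn)   # fraction of true anemics flagged
    spec, spec_ci = prop_ci(tn, tn + fp)   # fraction of nonanemics passed
    return sens, sens_ci, spec, spec_ci

# Illustrative 2x2 counts only -- not the study's raw data.
sens, sens_ci, spec, spec_ci = screening_metrics(tp=57, fn=45, tn=810, fp=57)
```

With the screened population heavily weighted toward nonanemic donors, a method can have high specificity yet still miss many true anemics, which is exactly the contrast the study quantifies.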
Abstract:
This special issue represents a further exploration of some issues raised at a symposium entitled “Functional magnetic resonance imaging: From methods to madness” presented during the 15th annual Theoretical and Experimental Neuropsychology (TENNET XV) meeting in Montreal, Canada, in June 2004. The special issue’s theme is methods and learning in functional magnetic resonance imaging (fMRI), and it comprises 6 articles (3 reviews and 3 empirical studies). The first (Amaro and Barker) provides a beginner’s guide to fMRI and the BOLD effect (perhaps an alternative title might have been “fMRI for dummies”). While fMRI is now commonplace, there are still researchers who have yet to employ it as an experimental method and need some basic questions answered before they venture into new territory. This article should serve them well. A key issue of interest at the symposium was how fMRI could be used to elucidate cerebral mechanisms responsible for new learning. The next 4 articles address this directly, with the first (Little and Thulborn) an overview of data from fMRI studies of category-learning, and the second from the same laboratory (Little, Shin, Siscol, and Thulborn) an empirical investigation of changes in brain activity occurring across different stages of learning. While a role for medial temporal lobe (MTL) structures in episodic memory encoding has been acknowledged for some time, the different experimental tasks and stimuli employed across neuroimaging studies have, not surprisingly, produced conflicting data in terms of the precise subregion(s) involved. The next paper (Parsons, Haut, Lemieux, Moran, and Leach) addresses this by examining effects of stimulus modality during verbal memory encoding.
Typically, BOLD fMRI studies of learning are conducted over short time scales; however, the fourth paper in this series (Olson, Rao, Moore, Wang, Detre, and Aguirre) describes an empirical investigation of learning occurring over a longer than usual period, achieving this by employing a relatively novel technique called perfusion fMRI. This technique shows considerable promise for future studies. The final article in this special issue (de Zubicaray) represents a departure from the more familiar cognitive neuroscience applications of fMRI, instead describing how neuroimaging studies might be conducted to both inform and constrain information processing models of cognition.
Abstract:
Purpose: Many methods exist in the literature for identifying the PEEP to set in ARDS patients following a lung recruitment maneuver (RM). We compared ten published parameters for setting PEEP following a RM. Methods: Lung injury was induced by bilateral lung lavage in 14 female Dorset sheep, yielding a PaO₂ of 100-150 mmHg at FiO₂ 1.0 and PEEP 5 cmH₂O. A quasi-static P-V curve was then performed using the supersyringe method; PEEP was set to 20 cmH₂O and a RM performed with pressure control ventilation (inspiratory pressure set to 40-50 cmH₂O) until PaO₂ + PaCO₂ > 400 mmHg. Following the RM, a decremental PEEP trial was performed: the PEEP was decreased in 1 cmH₂O steps every 5 min until 15 cmH₂O was reached. Parameters measured during the decremental PEEP trial were compared with parameters obtained from the P-V curve. Results: For setting PEEP, maximum dynamic tidal respiratory compliance, maximum PaO₂, maximum PaO₂ + PaCO₂, and minimum shunt calculated during the decremental PEEP trial, together with the lower Pflex and the point of maximal compliance increase on the inflation limb of the P-V curve (Pmci,i), were statistically indistinguishable. The PEEP values obtained using the deflation upper Pflex and the point of maximal compliance decrease on the deflation limb were significantly higher, and the true inflection point on the inflation limb and minimum PaCO₂ were significantly lower, than the other variables. Conclusion: In this animal model of ARDS, dynamic tidal respiratory compliance, maximum PaO₂, maximum PaO₂ + PaCO₂, minimum shunt, the inflation lower Pflex, and Pmci,i yield similar values for PEEP following a recruitment maneuver.
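One of the compared criteria, maximum dynamic tidal compliance during the decremental trial, amounts to picking the PEEP step that maximises C = Vt / (Pplat − PEEP). A small sketch; the pressures below are illustrative values, not the study's measurements:

```python
def best_peep_by_compliance(steps, tidal_volume_ml):
    """During a decremental PEEP trial, pick the PEEP step with maximum
    dynamic tidal compliance C = Vt / (Pplat - PEEP)."""
    return max(
        steps,
        key=lambda s: tidal_volume_ml / (s["plateau"] - s["peep"]),
    )["peep"]

# Illustrative decremental-trial pressures in cmH2O (not the study's data).
steps = [
    {"peep": 20, "plateau": 32.0},
    {"peep": 19, "plateau": 30.5},
    {"peep": 18, "plateau": 29.2},
    {"peep": 17, "plateau": 28.6},
    {"peep": 16, "plateau": 28.2},
    {"peep": 15, "plateau": 28.0},
]
peep = best_peep_by_compliance(steps, tidal_volume_ml=400)  # -> 18
```

The compliance curve here peaks mid-trial, reflecting the typical pattern where derecruitment below the optimum makes the lung stiffer again.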
Coronary CT angiography using 64 detector rows: methods and design of the multi-centre trial CORE-64
Abstract:
Multislice computed tomography (MSCT) for the noninvasive detection of coronary artery stenoses is a promising candidate for widespread clinical application because of its non-invasive nature and high sensitivity and negative predictive value as found in several previous studies using 16 to 64 simultaneous detector rows. A multi-centre study of CT coronary angiography using 16 simultaneous detector rows has shown that 16-slice CT is limited by a high number of nondiagnostic cases and a high false-positive rate. A recent meta-analysis indicated a significant interaction between the size of the study sample and the diagnostic odds ratios suggestive of small study bias, highlighting the importance of evaluating MSCT using 64 simultaneous detector rows in a multi-centre approach with a larger sample size. In this manuscript we detail the objectives and methods of the prospective "CORE-64" trial ("Coronary Evaluation Using Multidetector Spiral Computed Tomography Angiography using 64 Detectors"). This multi-centre trial was unique in that it assessed the diagnostic performance of 64-slice CT coronary angiography in nine centres worldwide in comparison to conventional coronary angiography. In conclusion, the multi-centre, multi-institutional and multi-continental trial CORE-64 has great potential to ultimately assess the per-patient diagnostic performance of coronary CT angiography using 64 simultaneous detector rows.
Abstract:
OBJECTIVE. Coronary MDCT angiography has been shown to be an accurate noninvasive tool for the diagnosis of obstructive coronary artery disease (CAD). Its sensitivity and negative predictive value for diagnosing percentage of stenosis are unsurpassed compared with those of other noninvasive testing methods. However, in its current form, it provides no information regarding the physiologic impact of CAD and is a poor predictor of myocardial ischemia. CORE320 is a multicenter multinational diagnostic study with the primary objective to evaluate the diagnostic accuracy of 320-MDCT for detecting coronary artery luminal stenosis and corresponding myocardial perfusion deficits in patients with suspected CAD compared with the reference standard of conventional coronary angiography and SPECT myocardial perfusion imaging. CONCLUSION. We aim to describe the CT acquisition, reconstruction, and analysis methods of the CORE320 study.
Abstract:
Minimal perfect hash functions are used for memory efficient storage and fast retrieval of items from static sets. We present an infinite family of efficient and practical algorithms for generating order preserving minimal perfect hash functions. We show that almost all members of the family construct space and time optimal order preserving minimal perfect hash functions, and we identify the one with minimum constants. Members of the family generate a hash function in two steps. First a special kind of function into an r-graph is computed probabilistically. Then this function is refined deterministically to a minimal perfect hash function. We give strong theoretical evidence that the first step uses linear random time. The second step runs in linear deterministic time. The family not only has theoretical importance, but also offers the fastest known method for generating perfect hash functions.
Abstract:
Little consensus exists in the literature regarding methods for determining the onset of electromyographic (EMG) activity. The aim of this study was to compare the relative accuracy of a range of computer-based techniques with respect to EMG onset determined visually by an experienced examiner. Twenty-seven methods were compared, which varied in terms of EMG processing (low-pass filtering at 10, 50 and 500 Hz), threshold value (1, 2 and 3 SD beyond the mean of baseline activity) and the number of samples for which the mean must exceed the defined threshold (20, 50 and 100 ms). Three hundred randomly selected trials of a postural task were evaluated using each technique. The visual determination of EMG onset was found to be highly repeatable between days. Linear regression equations were calculated for the values selected by each computer method, which indicated that the onset values selected by the majority of the parameter combinations deviated significantly from the visually derived onset values. Several methods accurately selected the time of onset of EMG activity and are recommended for future use. Copyright (C) 1996 Elsevier Science Ireland Ltd.
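The family of threshold-based detectors compared in the study can be sketched as follows. This is a generic illustration of one (k SD, window) parameter combination on synthetic data, not the authors' exact pipeline (low-pass filtering is omitted for brevity):

```python
import numpy as np

def emg_onset(signal, fs, baseline_ms=200, k=2, window_ms=50):
    """Threshold-style onset detector: onset is the first sample at which
    the mean of the rectified signal over the next `window_ms` exceeds
    the baseline mean by `k` standard deviations."""
    x = np.abs(np.asarray(signal, dtype=float))  # full-wave rectification
    nb = int(baseline_ms * fs / 1000)            # baseline samples
    w = int(window_ms * fs / 1000)               # decision-window samples
    thresh = x[:nb].mean() + k * x[:nb].std()
    for i in range(nb, len(x) - w):
        if x[i:i + w].mean() > thresh:
            return i                             # onset sample index
    return None                                  # no onset detected

# Synthetic trial: 0.5 s of quiet baseline, then a burst at sample 500.
rng = np.random.default_rng(0)
fs = 1000  # Hz
sig = np.concatenate([rng.normal(0, 0.05, 500), rng.normal(0, 1.0, 500)])
onset = emg_onset(sig, fs)
```

Varying `k` and `window_ms` (and the filtering) across the grid the study describes yields the twenty-seven candidate methods, each producing a somewhat different onset estimate on the same trial.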
Abstract:
The refinement calculus provides a framework for the stepwise development of imperative programs from specifications. In this paper we study a refinement calculus for deriving logic programs. Dealing with logic programs rather than imperative programs has the dual advantages that, due to the expressive power of logic programs, the final program is closer to the original specification, and each refinement step can achieve more. Together these reduce the overall number of derivation steps. We present a logic programming language extended with specification constructs (including general predicates, assertions, and types and invariants) to form a wide-spectrum language. General predicates allow non-executable properties to be included in specifications. Assertions, types and invariants make assumptions about the intended inputs of a procedure explicit, and can be used during refinement to optimize the constructed logic program. We provide a semantics for the extended logic programming language and derive a set of refinement laws. Finally we apply these to an example derivation.
Abstract:
This study investigated the effect of two anti-pronation taping techniques on vertical navicular height, an indicator of foot pronation, after application and after 20 min of exercise. The taping techniques were the low-Dye (LD) and the low-Dye with the addition of calcaneal slings and reverse sixes (LDCR). A repeated-measures design was used. LDCR was superior to LD and control immediately after application and after exercise. LD was better than control immediately after application but not after exercise. These findings provide practical direction to clinicians regularly using anti-pronation taping techniques.
Abstract:
Here, we examine morphological changes in cortical thickness of patients with Alzheimer's disease (AD) using image analysis algorithms for brain structure segmentation, and study automatic classification of AD patients using cortical and volumetric data. Cortical thickness of AD patients (n = 14) was measured using MRI cortical surface-based analysis and compared with healthy subjects (n = 20). Data were analyzed using an automated algorithm for tissue segmentation and classification. A Support Vector Machine (SVM) was applied over the volumetric measurements of subcortical and cortical structures to separate AD patients from controls. The group analysis showed cortical thickness reduction in the superior temporal lobe, parahippocampal gyrus, and entorhinal cortex in both hemispheres. We also found cortical thinning in the isthmus of the cingulate gyrus and the middle temporal gyrus in the right hemisphere, as well as a reduction of the cortical mantle in areas previously shown to be associated with AD. We also confirmed that automatic classification algorithms (SVM) can help distinguish AD patients from healthy controls. Moreover, the same areas implicated in the pathogenesis of AD were the main parameters driving the classification algorithm. While the patient sample used in this study was relatively small, we expect that using a database of regional volumes derived from MRI scans of a large number of subjects will increase the power of SVM-based AD patient identification.
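The patients-versus-controls SVM setup described above can be sketched with scikit-learn. The feature values below are synthetic stand-ins for regional morphometric measurements, invented purely for illustration, not patient data:

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Synthetic stand-ins for three regional thickness/volume features; the
# "patient" group is given thinner values in the first two features.
rng = np.random.default_rng(42)
controls = rng.normal([3.2, 2.8, 2.5], 0.15, size=(20, 3))  # n = 20 controls
patients = rng.normal([2.8, 2.4, 2.4], 0.15, size=(14, 3))  # n = 14 AD-like
X = np.vstack([controls, patients])
y = np.array([0] * 20 + [1] * 14)  # 0 = control, 1 = patient

# Standardize each feature, then fit a linear-kernel SVM over the
# morphometric measurements, mirroring the setup in the abstract.
clf = make_pipeline(StandardScaler(), SVC(kernel="linear", C=1.0))
clf.fit(X, y)
train_acc = clf.score(X, y)
```

With a sample this small, training accuracy overstates performance; in practice one would report a cross-validated estimate (e.g. `sklearn.model_selection.cross_val_score`), and the linear kernel's coefficients indicate which regions drive the separation, as the abstract notes.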