134 results for "Error correction methods"


Relevance:

30.00%

Abstract:

Anemia screening before blood donation requires an accurate, quick, practical, and easy method with minimal discomfort for the donors. The aim of this study was to compare the accuracy of two quantitative methods of anemia screening: the HemoCue 201(+) (Aktiebolaget Leo Diagnostics) hemoglobin (Hb) and microhematocrit (micro-Hct) tests. Two blood samples from a single fingerstick were obtained from 969 unselected potential female donors to determine the Hb by HemoCue 201(+) and the micro-Hct by HemataSTAT II (Separation Technology, Inc.), in alternating order. From each participant, a venous blood sample was drawn and run in an automatic hematology analyzer (ABX Pentra 60, ABX Diagnostics). Taking the ABX Pentra 60 results as true values, the sensitivity and specificity of HemoCue 201(+) and micro-Hct as screening methods were compared, using a venous Hb level of 12.0 g per dL as the cutoff for anemia. The sensitivities of the HemoCue 201(+) and HemataSTAT II in detecting anemia were 56 percent (95% confidence interval [CI], 46.1%-65.5%) and 39.5 percent (95% CI, 30.2%-49.3%), respectively (p < 0.001). Among candidates with a venous Hb level lower than 11.0 g per dL, the deferral rate was 100 percent by HemoCue 201(+) and 77 percent by HemataSTAT II. The specificities of the methods were 93.5 and 93.2 percent, respectively. The HemoCue 201(+) showed greater discriminating power for detecting anemia in prospective blood donors than the micro-Hct method. Both presented equivalent rates of erroneous deferral of nonanemic potential donors. Compared to the micro-Hct, the HemoCue 201(+) reduces the risk of anemic female donors giving blood, especially those with lower Hb levels, without increasing the deferral of nonanemic potential donors.
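
As a minimal sketch of the statistics this comparison rests on, the snippet below computes sensitivity and specificity with normal-approximation 95% confidence intervals from a confusion matrix; the counts are hypothetical, chosen only to roughly match the reported rates, not the study's raw data.

```python
# Sketch: comparing a screening method against a reference analyzer.
# Confusion-matrix counts below are HYPOTHETICAL illustrations.
import math

def sens_spec_ci(tp, fn, tn, fp, z=1.96):
    """Sensitivity and specificity with normal-approximation 95% CIs."""
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    se_sens = math.sqrt(sens * (1 - sens) / (tp + fn))
    se_spec = math.sqrt(spec * (1 - spec) / (tn + fp))
    return (sens, sens - z * se_sens, sens + z * se_sens), \
           (spec, spec - z * se_spec, spec + z * se_spec)

# "anemic" defined by venous Hb < 12.0 g/dL on the reference analyzer
(sens, lo, hi), (spec, slo, shi) = sens_spec_ci(tp=62, fn=49, tn=800, fp=58)
print(f"sensitivity {sens:.1%} (95% CI {lo:.1%}-{hi:.1%})")
print(f"specificity {spec:.1%} (95% CI {slo:.1%}-{shi:.1%})")
```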

Relevance:

30.00%

Abstract:

Background: Treatment of excessive gingival display usually involves procedures such as Le Fort impaction or maxillary gingivectomies. The authors propose an alternative technique that weakens the levator muscles of the upper lip and repositions the upper lip. Methods: Fourteen female patients with excessive gingival exposure were operated on between February of 2008 and March of 2009. They were filmed before and at least 6 months after the procedure. They were asked to perform their fullest smile, and the maximum gingival exposures were measured and analyzed using ImageJ software. Patients were operated on under local anesthesia. Their gingival mucosa was freed from the maxilla using a periosteum elevator. Skin and subcutaneous tissue were dissected bluntly from the underlying musculature of the upper lip. A frenuloplasty was performed to lengthen the upper lip. Both levator labii superioris muscles were dissected and divided. Results: The postoperative course was uneventful in all of the patients. The mean gingival exposure before surgery was 5.22 +/- 1.48 mm; 6 months after surgery, it was 1.91 +/- 1.50 mm. The mean gingival exposure reduction was 3.31 +/- 1.05 mm (p < 0.001), ranging from 1.59 to 4.83 mm. Conclusion: This study shows that the proposed technique was efficient in reducing the amount of exposed gum during smiling in all patients in this series. (Plast. Reconstr. Surg. 126: 1014, 2010.)
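
The analysis behind the reported mean reduction and p-value is a paired pre/post comparison; a minimal sketch with illustrative measurements (not the study's data):

```python
# Sketch: paired comparison of pre- vs postoperative gingival exposure.
# The 14 values per arm are ILLUSTRATIVE, not the published measurements.
import numpy as np
from scipy import stats

pre = np.array([5.1, 6.8, 4.2, 5.9, 3.8, 6.3, 5.0,
                4.7, 7.1, 5.5, 4.0, 6.0, 5.2, 4.9])
post = np.array([1.8, 2.9, 1.2, 2.3, 0.9, 2.6, 1.7,
                 1.5, 3.4, 2.0, 1.0, 2.4, 1.9, 1.6])

reduction = pre - post
t, p = stats.ttest_rel(pre, post)          # paired t-test
print(f"mean reduction {reduction.mean():.2f} "
      f"+/- {reduction.std(ddof=1):.2f} mm, p = {p:.2e}")
```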

Relevance:

30.00%

Abstract:

BACKGROUND AND PURPOSE: Functional brain variability has scarcely been investigated in cognitively healthy elderly subjects, and it is currently debated whether previous findings of regional metabolic variability are artifacts associated with brain atrophy. The primary purpose of this study was to test whether there is regional cerebral age-related hypometabolism specifically in later stages of life. MATERIALS AND METHODS: MR imaging and FDG-PET data were acquired from 55 cognitively healthy elderly subjects, and voxel-based linear correlations between age and GM volume or regional cerebral metabolism were computed using SPM5 in images with and without correction for partial volume effects (PVE). To investigate sex-specific differences in the pattern of brain aging, we repeated the above voxelwise calculations after dividing our sample by sex. RESULTS: Our analysis revealed 2 large clusters of age-related metabolic decrease in the overall sample, 1 in the left orbitofrontal cortex and the other in the right temporolimbic region, encompassing the hippocampus, the parahippocampal gyrus, and the amygdala. The division of our sample by sex revealed significant sex-specific age-related metabolic decrease in the left temporolimbic region of men and in the left dorsolateral frontal cortex of women. When we applied atrophy correction to our PET data, none of the above-mentioned correlations remained significant. CONCLUSIONS: Our findings suggest that age-related functional brain variability in cognitively healthy elderly individuals is largely secondary to the degree of regional brain atrophy, and they support the notion that appropriate PVE correction is a key tool in neuroimaging investigations.
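
A toy sketch of the mass-univariate idea behind such an SPM analysis: a Pearson correlation between age and signal computed independently at every voxel. The data are synthetic, and real pipelines add spatial smoothing, PVE correction, and multiple-comparison control.

```python
# Sketch: voxelwise age-vs-metabolism correlation on SYNTHETIC data.
import numpy as np

rng = np.random.default_rng(0)
n_subjects, shape = 55, (8, 8, 8)            # toy image grid
age = rng.uniform(60, 90, n_subjects)
images = rng.normal(size=(n_subjects, *shape))

# standardize, then average the products: Pearson r at every voxel
age_c = (age - age.mean()) / age.std()
img_c = (images - images.mean(0)) / images.std(0)
r_map = (age_c[:, None, None, None] * img_c).mean(0)
print("strongest negative correlation:", r_map.min())
```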

Relevance:

30.00%

Abstract:

Background: Biochemical analysis of fluid is the primary laboratory approach in pleural effusion diagnosis. Standardization of the steps between collection and laboratory analysis is fundamental to maintaining the quality of the results. We evaluated the influence of temperature and storage time on sample stability. Methods: Pleural fluid from 30 patients was analyzed for proteins, albumin, lactic dehydrogenase (LDH), cholesterol, triglycerides, and glucose. Aliquots were stored at 21, 4, and -20 degrees C, and concentrations were determined after 1, 2, 3, 4, 7, and 14 days. LDH isoenzymes were quantified in 7 random samples. Results: Due to the instability of isoenzymes 4 and 5, a decrease in LDH was observed in the first 24 h in samples maintained at -20 degrees C and after 2 days when maintained at 4 degrees C. Aside from glucose, all parameters were stable up to at least day 4 when stored at room temperature or 4 degrees C. Conclusions: Temperature and storage time are potential preanalytical sources of error in pleural fluid analyses, mainly considering the instability of glucose and LDH. The ideal procedure is to execute all the tests immediately after collection; however, most of the tests can be done on refrigerated samples, except for the LDH analysis. (C) 2010 Elsevier B.V. All rights reserved.
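
A minimal sketch of the kind of stability check reported: percent deviation of a measured concentration from its day-0 value, flagged against an acceptance limit. All values, including the 10% limit, are assumed for illustration.

```python
# Sketch: flagging analyte instability over storage time.
# Baseline, measurements, and the 10% limit are ILLUSTRATIVE.
import numpy as np

baseline = 250.0                                  # e.g., LDH (U/L) at collection
days = np.array([1, 2, 3, 4, 7, 14])
measured = np.array([198.0, 170.0, 150.0, 138.0, 120.0, 95.0])  # stored at -20 C

pct_change = 100 * (measured - baseline) / baseline
for d, p in zip(days, pct_change):
    flag = "UNSTABLE" if abs(p) > 10 else "ok"    # assumed acceptance limit
    print(f"day {d:2d}: {p:+6.1f}%  {flag}")
```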

Relevance:

30.00%

Abstract:

BACKGROUND CONTEXT: The vertebral spine angle in the frontal plane is an important parameter in the assessment of scoliosis and may be obtained from panoramic X-ray images. Technological advances have allowed for an increased use of digital X-ray images in clinical practice. PURPOSE: In this context, the objective of this study is to assess the reliability of computer-assisted Cobb angle measurements taken from digital X-ray images. STUDY DESIGN/SETTING: Clinical investigation quantifying scoliotic deformity with the Cobb method to evaluate the intra- and interobserver variability of manual and digital techniques. PATIENT SAMPLE: Forty-nine patients diagnosed with idiopathic scoliosis were chosen based on convenience, without predilection for gender, age, type, location, or magnitude of the curvature. OUTCOME MEASURES: Images were examined to evaluate Cobb angle variability, end plate selection, as well as intra- and interobserver errors. METHODS: Specific software was developed to digitally reproduce the Cobb method and semiautomatically calculate the degree of scoliotic deformity. During the study, three observers estimated the Cobb angle using both the digital and the traditional manual methods. RESULTS: The results showed that Cobb angle measurements may be reproduced on the computer as reliably as with the traditional manual method, in conditions similar to those found in clinical practice. CONCLUSIONS: The computer-assisted method (digital method) is clinically advantageous and appropriate for assessing the scoliotic curvature in the frontal plane using the Cobb method. (C) 2010 Elsevier Inc. All rights reserved.
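
The geometric core of such a digital tool is the angle between lines drawn along the endplates of the two end vertebrae. A minimal sketch with hypothetical digitized coordinates follows; this is not the authors' software.

```python
# Sketch: Cobb angle as the angle between two digitized endplate lines.
# Point coordinates are HYPOTHETICAL clicks on a digital radiograph.
import math

def cobb_angle(p1, p2, q1, q2):
    """Angle (degrees) between line p1-p2 and line q1-q2."""
    a1 = math.atan2(p2[1] - p1[1], p2[0] - p1[0])
    a2 = math.atan2(q2[1] - q1[1], q2[0] - q1[0])
    ang = abs(math.degrees(a1 - a2)) % 180
    return min(ang, 180 - ang)

# superior end vertebra endplate, then inferior end vertebra endplate
print(f"{cobb_angle((100, 120), (180, 95), (105, 300), (185, 330)):.1f} degrees")
```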

Relevance:

30.00%

Abstract:

We estimate the conditions for detectability of two planets in a 2/1 mean-motion resonance from radial velocity data, as a function of their masses, number of observations and the signal-to-noise ratio. Even for a data set of the order of 100 observations and standard deviations of the order of a few meters per second, we find that Jovian-size resonant planets are difficult to detect if the masses of the planets differ by a factor larger than ~4. This is consistent with the present population of real exosystems in the 2/1 commensurability, most of which have resonant pairs with similar minimum masses, and could indicate that many other resonant systems exist, but are currently beyond the detectability limit. Furthermore, we analyze the error distribution in masses and orbital elements of orbital fits from synthetic data sets for resonant planets in the 2/1 commensurability. For various mass ratios and numbers of data points we find that the eccentricity of the outer planet is systematically overestimated, although the inner planet's eccentricity suffers a much smaller effect. If the initial conditions correspond to small-amplitude oscillations around stable apsidal corotation resonances, the amplitudes estimated from the orbital fits are biased toward larger amplitudes, in accordance with results found in real resonant extrasolar systems.
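
For intuition, here is a sketch of a synthetic radial-velocity data set for two planets near the 2/1 commensurability, using circular orbits for simplicity (the paper fits full Keplerian, eccentric orbits); all amplitudes, periods, and the ~3 m/s noise level are illustrative.

```python
# Sketch: two-planet RV signal near the 2/1 period commensurability.
# K values, periods, and noise level are ILLUSTRATIVE.
import numpy as np

t = np.sort(np.random.default_rng(1).uniform(0, 1000, 100))  # 100 epochs (days)
K1, P1 = 40.0, 220.0     # inner planet: semi-amplitude (m/s), period (d)
K2, P2 = 25.0, 440.0     # outer planet: period ratio ~2/1

rv = K1 * np.sin(2 * np.pi * t / P1) + K2 * np.sin(2 * np.pi * t / P2)
rv_obs = rv + np.random.default_rng(2).normal(0, 3.0, t.size)  # ~3 m/s errors
print(f"rms of noiseless signal: {rv.std():.1f} m/s")
```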

Relevance:

30.00%

Abstract:

Predictive performance evaluation is a fundamental issue in the design, development, and deployment of classification systems. Because predictive performance evaluation is a multidimensional problem, single scalar summaries such as error rate, although quite convenient due to their simplicity, can seldom capture all the aspects that a complete and reliable evaluation must consider. For this reason, various graphical performance evaluation methods are increasingly drawing the attention of the machine learning, data mining, and pattern recognition communities. The main advantage of these methods resides in their ability to depict the trade-offs between evaluation aspects in a multidimensional space rather than reducing these aspects to an arbitrarily chosen (and often biased) single scalar measure. Furthermore, to select a suitable graphical method for a given task, it is crucial to identify each method's strengths and weaknesses. This paper surveys various graphical methods often used for predictive performance evaluation. By presenting these methods within the same framework, we hope this paper may shed some light on deciding which methods are more suitable to use in different situations.
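
As one concrete instance of such a graphical method, the sketch below builds an ROC curve, which displays the trade-off between true- and false-positive rates rather than collapsing performance to a single scalar; scores are synthetic and scikit-learn is assumed to be available.

```python
# Sketch: ROC curve from synthetic classifier scores.
import numpy as np
from sklearn.metrics import roc_curve, auc

rng = np.random.default_rng(0)
y_true = np.r_[np.ones(500), np.zeros(500)]            # labels
scores = np.r_[rng.normal(1.0, 1.0, 500),             # positives score higher
               rng.normal(0.0, 1.0, 500)]

fpr, tpr, _ = roc_curve(y_true, scores)                # trade-off curve
print(f"area under ROC curve: {auc(fpr, tpr):.3f}")
```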

Relevance:

30.00%

Abstract:

In this article, we are interested in evaluating different parameter estimation strategies for a multiple linear regression model. The model parameters were estimated with data from a clinical trial whose aim was to verify whether the mechanical property of maximum force (EM-FM) is associated with femoral mass, femoral diameter, and the experimental group of ovariectomized rats of the species Rattus norvegicus albinus, Wistar variety. Three methodologies are compared for estimating the model parameters: the classical methodology, based on the least squares method; the Bayesian methodology, based on Bayes' theorem; and the bootstrap method, based on resampling procedures.
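
A minimal sketch of two of the three strategies, classical least squares and the nonparametric bootstrap, on synthetic data standing in for the femoral measurements (a Bayesian fit would additionally require priors and a sampler):

```python
# Sketch: OLS vs. bootstrap for a multiple linear regression.
# Design matrix and coefficients are SYNTHETIC stand-ins.
import numpy as np

rng = np.random.default_rng(0)
n = 60
X = np.column_stack([np.ones(n), rng.normal(10, 2, n), rng.normal(4, 0.5, n)])
beta_true = np.array([5.0, 2.0, -1.5])
y = X @ beta_true + rng.normal(0, 1, n)

beta_ols, *_ = np.linalg.lstsq(X, y, rcond=None)       # classical estimate

boot = np.empty((2000, 3))                             # resampling cases
for b in range(2000):
    idx = rng.integers(0, n, n)
    boot[b], *_ = np.linalg.lstsq(X[idx], y[idx], rcond=None)

print("OLS:", beta_ols.round(3))
print("bootstrap 95% CI:", np.percentile(boot, [2.5, 97.5], axis=0).round(3))
```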

Relevance:

30.00%

Abstract:

In this paper we describe and evaluate a geometric mass-preserving redistancing procedure for the level set function on general structured grids. The proposed algorithm is adapted from a recent finite element-based method and preserves the mass by means of a localized mass correction. A salient feature of the scheme is the absence of adjustable parameters. The algorithm is tested in two and three spatial dimensions and compared with the widely used partial differential equation (PDE)-based redistancing method on structured Cartesian grids. Through the use of quantitative error measures of interest in level set methods, we show that the overall performance of the proposed geometric procedure is better than that of PDE-based reinitialization schemes, since it is more robust with comparable accuracy. We also show that the algorithm is well suited for the highly stretched curvilinear grids used in CFD simulations. Copyright (C) 2010 John Wiley & Sons, Ltd.
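
To illustrate the flavor of a mass correction, the sketch below restores the area enclosed by a 2-D level set after an artificial drift by bisecting on a constant shift. Note this toy correction is global, whereas the paper's scheme is localized.

```python
# Sketch: global mass (area) correction of a 2-D level set by a
# constant shift found via bisection. Toy illustration only.
import numpy as np

n = 200
x = np.linspace(-1, 1, n)
X, Y = np.meshgrid(x, x)
phi = np.sqrt(X**2 + Y**2) - 0.5          # signed distance to a circle
cell = (x[1] - x[0])**2

def area(phi):                            # area of the phi < 0 region
    return np.count_nonzero(phi < 0) * cell

target = area(phi)
phi_drifted = phi + 0.01                  # pretend reinitialization lost mass

# bisect on shift c so that area(phi_drifted + c) matches the target
lo, hi = -0.1, 0.1
for _ in range(50):
    c = 0.5 * (lo + hi)
    lo, hi = (c, hi) if area(phi_drifted + c) > target else (lo, c)
print(f"shift {c:+.4f}, area error {area(phi_drifted + c) - target:.2e}")
```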

Relevance:

30.00%

Abstract:

The multivariate skew-t distribution (J Multivar Anal 79:93-113, 2001; J R Stat Soc, Ser B 65:367-389, 2003; Statistics 37:359-363, 2003) includes the Student t, skew-Cauchy and Cauchy distributions as special cases and the normal and skew-normal ones as limiting cases. In this paper, we explore the use of Markov Chain Monte Carlo (MCMC) methods to develop a Bayesian analysis of repeated measures, pretest/post-test data, under a multivariate null intercept measurement error model (J Biopharm Stat 13(4):763-771, 2003) where the random errors and the unobserved value of the covariate (latent variable) follow Student t and skew-t distributions, respectively. The results and methods are numerically illustrated with an example in the field of dentistry.
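
A reduced sketch of the MCMC machinery involved: a random-walk Metropolis sampler for the location parameter under Student-t errors. The full model adds the skew-t latent covariate and the repeated-measures structure, which this toy omits.

```python
# Sketch: random-walk Metropolis for the location of t-distributed data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
y = stats.t.rvs(df=4, loc=2.0, scale=1.0, size=100, random_state=rng)

def log_post(mu):                          # flat prior on mu
    return stats.t.logpdf(y, df=4, loc=mu, scale=1.0).sum()

mu, chain = 0.0, []
for _ in range(5000):
    prop = mu + rng.normal(0, 0.3)         # random-walk proposal
    if np.log(rng.uniform()) < log_post(prop) - log_post(mu):
        mu = prop                          # Metropolis accept
    chain.append(mu)
print(f"posterior mean of mu: {np.mean(chain[1000:]):.3f}")  # after burn-in
```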

Relevance:

30.00%

Abstract:

The skew-normal distribution is a class of distributions that includes the normal distribution as a special case. In this paper, we explore the use of Markov Chain Monte Carlo (MCMC) methods to develop a Bayesian analysis in a multivariate, null intercept, measurement error model [R. Aoki, H. Bolfarine, J.A. Achcar, and D. Leao Pinto Jr, Bayesian analysis of a multivariate null intercept error-in-variables regression model, J. Biopharm. Stat. 13(4) (2003b), pp. 763-771] where the unobserved value of the covariate (latent variable) follows a skew-normal distribution. The results and methods are applied to a real dental clinical trial presented in [A. Hadgu and G. Koch, Application of generalized estimating equations to a dental randomized clinical trial, J. Biopharm. Stat. 9 (1999), pp. 161-178].
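
For concreteness, the sketch below generates a skew-normal latent covariate with scipy and checks its skewness; the shape parameter a = 0 recovers the normal special case mentioned above.

```python
# Sketch: sampling a skew-normal latent covariate with scipy.
import numpy as np
from scipy import stats

x_latent = stats.skewnorm.rvs(a=4, loc=0, scale=1, size=1000, random_state=0)
print(f"sample skewness: {stats.skew(x_latent):.2f}")      # > 0 for a > 0

x_normal = stats.skewnorm.rvs(0, size=1000, random_state=1)  # a = 0: normal
print(f"with a = 0, skewness ~ {stats.skew(x_normal):.2f}")
```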

Relevance:

30.00%

Abstract:

The purpose of this study was to evaluate the influence of different light sources and photo-activation methods on the degree of conversion (DC%) and polymerization shrinkage (PS) of a nanocomposite resin (Filtek (TM) Supreme XT, 3M/ESPE). Two light-curing units (LCUs), a halogen lamp (QTH) and a light-emitting diode (LED), and two photo-activation methods (continuous and gradual) were investigated. The specimens were divided into four groups: group 1, power density (PD) of 570 mW/cm2 for 20 s (QTH); group 2, PD of 0 to 570 mW/cm2 for 10 s + 10 s at 570 mW/cm2 (QTH); group 3, PD of 860 mW/cm2 for 20 s (LED); and group 4, PD of 125 mW/cm2 for 10 s + 10 s at 860 mW/cm2 (LED). An EMIC testing machine with rectangular steel bases (6 x 1 x 2 mm) was used to record the polymerization shrinkage forces (MPa) over a period that started with photo-activation and ended after two minutes of measurement. For each group, ten repetitions (n = 40) were performed. For the DC% measurements, five specimens (n = 20) per group were made in a metallic mold (2 mm thickness and 4 mm diameter, ISO 4049), then pulverized, pressed with potassium bromide (KBr), and analyzed with FT-IR spectroscopy. The PS data were analyzed by analysis of variance (ANOVA) with Welch's correction and Tamhane's test. The mean PS values (MPa) were 0.60 (G1), 0.47 (G2), 0.52 (G3), and 0.45 (G4), showing significant differences between the two photo-activation methods regardless of the light source used; the continuous method produced the highest PS values. The DC% data were analyzed by ANOVA and showed significant differences for the QTH LCU regardless of the photo-activation method used; the QTH provided the lowest DC% values. The gradual method yields lower polymerization shrinkage with either the halogen lamp or the LED, and the degree of conversion (%) under both photo-activation methods was influenced by the LCU. Thus, the results suggest that gradual photo-activation with the LED LCU suffices to ensure an adequate degree of conversion with minimal polymerization shrinkage.
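
A minimal sketch of the one-way ANOVA step on the shrinkage data; the group values are simulated around the reported means, and Welch's correction and Tamhane's post hoc test are omitted here.

```python
# Sketch: one-way ANOVA across the four curing groups.
# Group samples are SIMULATED near the reported PS means (MPa).
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
g1 = rng.normal(0.60, 0.05, 10)   # QTH, continuous
g2 = rng.normal(0.47, 0.05, 10)   # QTH, gradual
g3 = rng.normal(0.52, 0.05, 10)   # LED, continuous
g4 = rng.normal(0.45, 0.05, 10)   # LED, gradual

F, p = stats.f_oneway(g1, g2, g3, g4)
print(f"F = {F:.2f}, p = {p:.2e}")
```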

Relevance:

30.00%

Abstract:

This paper describes the first phase of a project attempting to construct an efficient general-purpose nonlinear optimizer using an augmented Lagrangian outer loop with a relative error criterion, and an inner loop employing a state-of-the-art conjugate gradient solver. The outer loop can also employ double regularized proximal kernels, a fairly recent theoretical development that leads to fully smooth subproblems. We first enhance the existing theory to show that our approach is globally convergent in both the primal and dual spaces when applied to convex problems. We then present an extensive computational evaluation using the CUTE test set, showing that some aspects of our approach are promising, but some are not. These conclusions in turn lead to additional computational experiments suggesting where to next focus our theoretical and computational efforts.
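
A toy sketch of the outer/inner structure described: an augmented Lagrangian outer loop whose unconstrained subproblems are solved by scipy's conjugate gradient method. The problem, fixed penalty, and stopping rule are simplified stand-ins for the paper's algorithm.

```python
# Sketch: augmented Lagrangian outer loop + CG inner loop for
# min f(x) subject to c(x) = 0. Toy quadratic problem.
import numpy as np
from scipy.optimize import minimize

f = lambda x: (x[0] - 2)**2 + (x[1] + 1)**2      # objective
c = lambda x: x[0] + x[1] - 3                    # equality constraint

x, lam, rho = np.zeros(2), 0.0, 10.0             # fixed penalty for simplicity
for _ in range(20):
    aug = lambda x: f(x) + lam * c(x) + 0.5 * rho * c(x)**2
    x = minimize(aug, x, method="CG").x          # inner loop: CG solve
    lam += rho * c(x)                            # multiplier update
    if abs(c(x)) < 1e-8:                         # simplified stopping rule
        break
print("solution:", x.round(4), "constraint:", f"{c(x):.1e}")
```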

Relevance:

30.00%

Abstract:

In this article, we discuss inferential aspects of measurement error regression models with null intercepts when the unknown quantity x (latent variable) follows a skew-normal distribution. We first examine the maximum-likelihood approach to estimation via the EM algorithm, exploring statistical properties of the model considered. Then, the marginal likelihood, the score function and the observed information matrix of the observed quantities are presented, allowing direct implementation of inference. To discuss some diagnostic techniques for this class of models, we derive the appropriate matrices for assessing the local influence on the parameter estimates under different perturbation schemes. The results and methods developed in this paper are illustrated using part of a real data set from Hadgu and Koch [1999, Application of generalized estimating equations to a dental randomized clinical trial. Journal of Biopharmaceutical Statistics, 9, 161-178].
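
As a reminder of the E-step/M-step pattern that such a maximum-likelihood fit relies on, here is EM in miniature for a two-component normal mixture; the paper's model requires its own model-specific conditional moments for the skew-normal latent structure.

```python
# Sketch: generic EM iteration on a latent-variable model
# (two-component normal mixture as a stand-in example).
import numpy as np

rng = np.random.default_rng(0)
y = np.r_[rng.normal(-2, 1, 300), rng.normal(3, 1, 700)]

def phi(y, mu, s2):                        # normal density
    return np.exp(-0.5 * (y - mu)**2 / s2) / np.sqrt(2 * np.pi * s2)

pi, mu1, mu2, s2 = 0.5, -1.0, 1.0, 1.0
for _ in range(200):
    # E-step: posterior responsibility of component 1 for each point
    r = pi * phi(y, mu1, s2) / (pi * phi(y, mu1, s2) + (1 - pi) * phi(y, mu2, s2))
    # M-step: maximize the expected complete-data log-likelihood
    pi = r.mean()
    mu1, mu2 = (r * y).sum() / r.sum(), ((1 - r) * y).sum() / (1 - r).sum()
    s2 = (r * (y - mu1)**2 + (1 - r) * (y - mu2)**2).sum() / y.size
print(f"pi={pi:.2f}, mu1={mu1:.2f}, mu2={mu2:.2f}, s2={s2:.2f}")
```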

Relevance:

30.00%

Abstract:

This work presents a Bayesian semiparametric approach for dealing with regression models where the covariate is measured with error. Given that (1) the error normality assumption is very restrictive, and (2) assuming a specific elliptical distribution for the errors (Student-t, for example) may be somewhat presumptuous, there is a need for more flexible methods that assume only symmetry of the errors (admitting unknown kurtosis). In this sense, the main advantage of this extended Bayesian approach is the possibility of considering generalizations of the elliptical family of models by using Dirichlet process priors in dependent and independent situations. Conditional posterior distributions are implemented, allowing the use of Markov Chain Monte Carlo (MCMC) to generate the posterior distributions. An interesting result is that the Dirichlet process prior is not updated in the case of the dependent elliptical model. Furthermore, an analysis of a real data set is reported to illustrate the usefulness of our approach in dealing with outliers. Finally, the proposed semiparametric models and the parametric normal model are compared graphically via the posterior densities of the coefficients. (C) 2009 Elsevier Inc. All rights reserved.
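
A small sketch of the nonparametric ingredient: a truncated stick-breaking draw from a Dirichlet process prior with a normal base measure. The concentration parameter, base measure, and truncation level are illustrative choices, not the paper's specification.

```python
# Sketch: truncated stick-breaking construction of G ~ DP(alpha, G0).
import numpy as np

rng = np.random.default_rng(0)
alpha, K = 1.0, 50                         # concentration, truncation level

v = rng.beta(1, alpha, K)                  # stick-breaking proportions
w = v * np.concatenate(([1.0], np.cumprod(1 - v)[:-1]))   # mixture weights
atoms = rng.normal(0, 2, K)                # atoms drawn from base measure G0

sample = rng.choice(atoms, size=1000, p=w / w.sum())      # draws from G
print(f"distinct atoms used: {np.unique(sample).size}")
```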