23 results for refractive error

in the Biblioteca Digital da Produção Intelectual da Universidade de São Paulo (BDPI/USP)


Relevance: 60.00%

Abstract:

Purpose: To study the oculometric parameters of hyperopia in children with esotropic amblyopia, comparing amblyopic eyes with fellow eyes. Methods: Thirty-seven patients (5-8 years old) with bilateral hyperopia and esotropic amblyopia underwent a comprehensive ophthalmic examination, including cycloplegic refraction, keratometry and A-scan ultrasonography. Anterior chamber depth, lens thickness, vitreous chamber depth and total axial length were recorded. The refractive power of the crystalline lens was calculated using Bennett's equations. Paired Student's t-tests were used to compare ocular biometric measurements between amblyopic eyes and their fellow eyes. The associations of biometric parameters with refractive errors were assessed using Pearson correlation coefficients and linear regression. Multivariable models including axial length, corneal power and lens power were also constructed. Results: Amblyopic eyes were found to have significantly more hyperopic refraction, less corneal power, greater lens power, shorter vitreous chamber depth and shorter axial length, despite similar anterior chamber depth and lens thickness. The strongest correlation with refractive error was observed for the axial length/corneal radius ratio (r(36) = -0.92, p < 0.001 for amblyopic eyes and r(36) = 0.87, p < 0.001 for fellow eyes). Axial length accounted for 39.2% (R²) of the refractive error variance in amblyopic eyes and 35.5% in fellow eyes. Adding corneal power to the model increased R² to 85.7% and 79.6%, respectively. A statistically significant correlation was found between axial length and corneal power, indicating decreasing corneal power with increasing axial length, and the correlations were similar for amblyopic eyes (r(36) = 0.53, p < 0.001) and fellow eyes (r(36) = -0.57, p < 0.001). A statistically significant correlation was also found between axial length and lens power, indicating decreasing lens power with increasing axial length (r(36) = -0.72, p < 0.001 for amblyopic eyes and r(36) = -0.69, p < 0.001 for fellow eyes). Conclusions: We observed that the correlations among the major oculometric parameters and their individual contributions to hyperopia in esotropic children were similar in amblyopic and non-amblyopic eyes. This finding suggests that the counterbalancing effect of greater corneal and lens power associated with shorter axial length is similar in both eyes of patients with esotropic amblyopia.
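A minimal sketch of the kind of analysis described above (Pearson correlation of refraction with the AL/CR ratio, then single- and two-predictor least-squares models compared by R²), run on synthetic biometric values; all numbers and variable names here are illustrative, not the study's data:

```python
# Sketch of the correlation/regression analysis described above, on
# synthetic data (real biometric values are not reproduced here).
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
n = 37
axial_length = rng.normal(21.5, 0.7, n)          # mm, illustrative
corneal_power = 46.0 - 0.5 * (axial_length - 21.5) + rng.normal(0, 0.5, n)
refraction = 60.0 - 2.5 * axial_length + 0.9 * corneal_power + rng.normal(0, 0.5, n)

# Pearson correlation of refraction with the axial length / corneal radius ratio
corneal_radius = 337.5 / corneal_power           # keratometric conversion, mm
ratio = axial_length / corneal_radius
r, p = stats.pearsonr(ratio, refraction)
print(f"AL/CR ratio vs refraction: r = {r:.2f}, p = {p:.3g}")

# Simple and multivariable least-squares models, compared by R^2
X1 = np.column_stack([np.ones(n), axial_length])
X2 = np.column_stack([np.ones(n), axial_length, corneal_power])
for X, label in [(X1, "AL only"), (X2, "AL + corneal power")]:
    beta, *_ = np.linalg.lstsq(X, refraction, rcond=None)
    resid = refraction - X @ beta
    r2 = 1 - resid.var() / refraction.var()
    print(f"{label}: R^2 = {r2:.3f}")
```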

Relevance: 20.00%

Abstract:

Background: Genome-wide association studies (GWAS) are becoming the approach of choice to identify genetic determinants of complex phenotypes and common diseases. The astonishing amount of generated data and the use of distinct genotyping platforms with variable genomic coverage are still analytical challenges. Imputation algorithms combine information from directly genotyped markers with the haplotypic structure of the population of interest to infer poorly genotyped or missing markers, and are considered a near-zero-cost approach to allow the comparison and combination of data generated in different studies. Several reports have stated that imputed markers have an overall acceptable accuracy, but no published report has performed a pairwise comparison of imputed and empirical association statistics for a complete set of GWAS markers. Results: In this report we identified a total of 73 imputed markers that yielded a nominally statistically significant association at P < 10⁻⁵ for type 2 diabetes mellitus and compared them with results obtained from empirical allelic frequencies. Interestingly, despite their overall high correlation, association statistics based on imputed frequencies were discordant for 35 of the 73 (47%) associated markers, considerably inflating the type I error rate of imputed markers. We comprehensively tested several quality thresholds, the haplotypic structure underlying imputed markers, and the use of flanking markers as predictors of inaccurate association statistics derived from imputed markers. Conclusions: Our results suggest that association statistics from imputed markers within specific MAF (minor allele frequency) ranges, located in weak linkage disequilibrium blocks, or strongly deviating from local patterns of association are prone to inflated false positive association signals. The present study highlights the potential of imputation procedures and proposes simple procedures for selecting the best imputed markers for follow-up genotyping studies.
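To illustrate the kind of pairwise comparison the study performs, here is a sketch of an allelic chi-square association test computed twice for one marker, once from empirical genotype counts and once from counts reconstructed from imputed dosages; all counts are synthetic:

```python
# Illustrative comparison of an allelic association test computed from
# empirical genotypes vs. imputed allele dosages (all numbers synthetic).
import numpy as np
from scipy import stats

def allelic_chi2(case_alt, case_n, ctrl_alt, ctrl_n):
    """2x2 allelic chi-square test from alternate-allele counts
    (case_n, ctrl_n are numbers of chromosomes, i.e. 2 * subjects)."""
    table = np.array([[case_alt, case_n - case_alt],
                      [ctrl_alt, ctrl_n - ctrl_alt]])
    chi2, p, _, _ = stats.chi2_contingency(table, correction=False)
    return chi2, p

# Empirical counts for one marker
chi2_e, p_e = allelic_chi2(480, 2000, 400, 2000)
# Counts reconstructed from imputed dosages drift when accuracy is low
chi2_i, p_i = allelic_chi2(510, 2000, 385, 2000)

print(f"empirical: chi2 = {chi2_e:.1f}, p = {p_e:.2e}")
print(f"imputed:   chi2 = {chi2_i:.1f}, p = {p_i:.2e}")
# Discordance between the two p-values at a fixed threshold is the kind of
# inflated type I error the study quantifies.
```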

Relevance: 20.00%

Abstract:

In this study, the innovation approach is used to estimate the total measurement error associated with power system state estimation. This is required because the power system equations are strongly correlated with one another and, as a consequence, part of the measurement error is masked. For that purpose an innovation index (II), which quantifies the amount of new information a measurement contains, is proposed. A critical measurement is the limiting case of a measurement with low II: its II is zero and its error is totally masked. In other words, such a measurement brings no innovation to the gross error test. Using the II of a measurement, the gross error masked by the state estimation is recovered; the total gross error of that measurement is then composed. Instead of the classical normalised measurement residual amplitude, the corresponding normalised composed measurement residual amplitude is used in the gross error detection and identification test, but with m degrees of freedom. The gross error processing turns out to be very simple to implement, requiring only a few adaptations to existing state estimation software. The IEEE 14-bus system is used to validate the proposed gross error detection and identification test.
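The following sketch illustrates residual masking in a toy linear weighted-least-squares state estimator; the innovation-index expression used here follows one common formulation in the state estimation literature and is an assumption, not necessarily the paper's exact definition:

```python
# Minimal linear WLS state-estimation sketch illustrating residual masking.
# The innovation-index formula below is an assumed common formulation.
import numpy as np

H = np.array([[1.0, 0.0],      # measurement Jacobian (3 measurements, 2 states)
              [0.0, 1.0],
              [1.0, 1.0]])
R = np.diag([0.01, 0.01, 0.01])                  # measurement error covariance
W = np.linalg.inv(R)

# Hat matrix K maps measurements to fitted measurements: z_hat = K z
G = H.T @ W @ H
K = H @ np.linalg.solve(G, H.T @ W)
S = np.eye(len(H)) - K                           # residual sensitivity matrix

z = np.array([1.02, 1.98, 3.30])                 # third measurement carries a gross error
x_hat = np.linalg.solve(G, H.T @ W @ z)
r = z - H @ x_hat

# Normalized residuals and an innovation-index-like quantity per measurement:
# low II means most of a measurement's error is masked by the estimator.
omega = S @ R                                    # residual covariance (S R S^T = S R here)
r_norm = r / np.sqrt(np.abs(np.diag(omega)))
II = np.sqrt(np.diag(S) / np.clip(np.diag(K), 1e-12, None))
print("normalized residuals:", np.round(r_norm, 2))
print("innovation indices:  ", np.round(II, 2))
```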

Relevance: 20.00%

Abstract:

With the relentless quest for improved performance driving ever tighter manufacturing tolerances, machine tools are sometimes unable to meet the desired requirements. One option for improving the tolerances of machine tools is to compensate for their errors. Among all possible sources of machine tool error, thermally induced errors are in general the most important for newer machines. The present work demonstrates the evaluation and modelling of the thermal error behaviour of a CNC cylindrical grinding machine during its warm-up period.
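As an illustration of warm-up thermal error modelling, the sketch below fits a first-order exponential rise to synthetic drift data; the functional form and all values are assumptions, since the paper's actual model and measurements are not reproduced here:

```python
# Fitting a first-order exponential warm-up model to synthetic drift data.
import numpy as np
from scipy.optimize import curve_fit

def warmup_drift(t, delta_max, tau):
    """Thermal error that saturates as the machine reaches thermal equilibrium."""
    return delta_max * (1.0 - np.exp(-t / tau))

t = np.linspace(0, 180, 37)                       # minutes since cold start
true = warmup_drift(t, 18.0, 45.0)                # micrometres, illustrative
measured = true + np.random.default_rng(1).normal(0, 0.5, t.size)

(delta_max, tau), _ = curve_fit(warmup_drift, t, measured, p0=[10.0, 30.0])
print(f"fitted saturation drift = {delta_max:.1f} um, time constant = {tau:.0f} min")
# The fitted model can then be turned into a compensation table: at time t,
# offset the axis command by -warmup_drift(t, delta_max, tau).
```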

Relevance: 20.00%

Abstract:

We describe a one-time signature scheme based on the hardness of the syndrome decoding problem, and prove it secure in the random oracle model. Our proposal can be instantiated on general linear error correcting codes, rather than restricted families like alternant codes for which a decoding trapdoor is known to exist. (C) 2010 Elsevier Inc. All rights reserved.
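A toy illustration of the underlying hard problem (not of the signature scheme itself): computing a syndrome s = He^T over GF(2) for a random parity-check matrix and a secret low-weight error vector:

```python
# Syndrome decoding in miniature: given (H, s), recovering the low-weight e
# with H e^T = s (mod 2) is the hard problem the scheme's security rests on.
import numpy as np

rng = np.random.default_rng(42)
n, k, w = 24, 12, 3                       # code length, dimension, error weight
H = rng.integers(0, 2, size=(n - k, n))   # random binary parity-check matrix

e = np.zeros(n, dtype=int)                # secret low-weight error vector
e[rng.choice(n, size=w, replace=False)] = 1

s = H @ e % 2                             # public syndrome
print("syndrome:", s)
# Brute-force recovery of e requires testing C(n, w) candidates; for
# cryptographic parameters this is infeasible.
```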

Relevance: 20.00%

Abstract:

The purpose of this article is to present a quantitative analysis of the human failure contribution to collisions and/or groundings of oil tankers, considering the recommendations of the "Guidelines for Formal Safety Assessment" of the International Maritime Organization. Initially, the methodology employed is presented, emphasizing the use of the technique for human error prediction to reach the desired objective. This methodology is then applied to a ship operating on the Brazilian coast, and the procedure for isolating the human actions with the greatest potential to reduce the risk of an accident is described. Finally, the management and organizational factors presented in the "International Safety Management Code" are associated with these selected actions. An operator will therefore be able to decide where to intervene in order to obtain an effective reduction in the probability of accidents. Even though this study does not present a new methodology, it can be considered a reference for human reliability analysis in the maritime industry, which, in spite of having some guides for risk analysis, has few studies on human reliability effectively applied to the sector.
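As a sketch of the quantitative human-reliability reasoning described above, the snippet below combines illustrative human error probabilities (HEPs) for a sequence of operator actions and ranks the actions by their risk-reduction potential; the task names and numbers are hypothetical, not taken from the paper:

```python
# Combining HEPs for sequential operator actions into an overall failure
# probability, then ranking actions by risk-reduction potential.
tasks = {
    "detect target on radar":     0.01,   # hypothetical HEPs
    "interpret collision course": 0.02,
    "order evasive manoeuvre":    0.005,
}

# Assuming independent sequential tasks: the sequence succeeds only if every
# task succeeds; any single error leads to the accident branch.
p_success = 1.0
for hep in tasks.values():
    p_success *= (1.0 - hep)
print(f"overall human-failure probability: {1.0 - p_success:.4f}")

# Ranking tasks by the failure probability remaining if their HEP were
# eliminated mirrors the step of isolating the actions with the greatest
# potential to reduce accident risk.
for task, hep in sorted(tasks.items(), key=lambda kv: -kv[1]):
    reduced = 1.0 - p_success / (1.0 - hep)
    print(f"eliminating '{task}' lowers failure probability to {reduced:.4f}")
```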

Relevance: 20.00%

Abstract:

Background: Biochemical analysis of fluid is the primary laboratory approach in pleural effusion diagnosis. Standardization of the steps between collection and laboratory analysis is fundamental to maintaining the quality of the results. We evaluated the influence of temperature and storage time on sample stability. Methods: Pleural fluid from 30 patients was submitted to analyses of proteins, albumin, lactate dehydrogenase (LDH), cholesterol, triglycerides, and glucose. Aliquots were stored at 21°C, 4°C, and -20°C, and concentrations were determined after 1, 2, 3, 4, 7, and 14 days. LDH isoenzymes were quantified in 7 random samples. Results: Owing to the instability of isoenzymes 4 and 5, a decrease in LDH was observed within the first 24 h in samples kept at -20°C and after 2 days in samples kept at 4°C. Aside from glucose, all parameters were stable through at least day 4 when stored at room temperature or 4°C. Conclusions: Temperature and storage time are potential sources of preanalytical error in pleural fluid analyses, mainly given the instability of glucose and LDH. The ideal procedure is to perform all tests immediately after collection. However, most of the tests can be done on refrigerated samples, except for LDH analysis. (C) 2010 Elsevier B.V. All rights reserved.

Relevance: 20.00%

Abstract:

PURPOSE: To analyze the effects of variations in femtosecond laser energy level on corneal stromal cell death and inflammatory cell influx following flap creation in a rabbit model. METHODS: Eighteen rabbits were stratified into three groups according to the level of energy applied for flap creation (six animals per group). Three energy levels were chosen for both the lamellar and side cuts: 2.7 μJ (high energy), 1.6 μJ (intermediate energy), and 0.5 μJ (low energy), with a 60 kHz, model II, femtosecond laser (IntraLase). The opposite eye of each rabbit served as a control. At the 24-hour time point after surgery, all rabbits were euthanized and the corneoscleral rims were analyzed for levels of cell death and inflammatory cell influx with the terminal deoxynucleotidyl transferase dUTP nick end labeling (TUNEL) assay and immunocytochemistry for the monocyte marker CD11b, respectively. RESULTS: The high energy group (31.9 ± 7.1 [standard error of the mean (SEM) 2.9]) had significantly more TUNEL-positive cells in the central flap than the intermediate (22.2 ± 1.9 [SEM 0.8], P = .004), low (17.9 ± 4.0 [SEM 1.6], P ≤ .001), and control eye (0.06 ± 0.02 [SEM 0.009], P ≤ .001) groups. The intermediate and low energy groups also had significantly more TUNEL-positive cells than the control groups (P ≤ .001). The difference between the intermediate and low energy levels was not significant (P = .56). The mean CD11b-positive cell count per 400× field at the flap edge was 26.1 ± 29.3 (SEM 11.9), 5.8 ± 4.1 (SEM 1.6), 1.6 ± 4.1 (SEM 1.6), and 0.005 ± 0.01 (SEM 0.005) for the high energy, intermediate energy, low energy, and control groups, respectively. Only the intermediate energy group showed statistically more inflammatory cells than control eyes (P = .015), most likely due to variability between eyes. CONCLUSIONS: Higher energy levels trigger greater cell death when the femtosecond laser is used to create corneal flaps. Greater corneal inflammatory cell infiltration is observed with higher femtosecond laser energy levels. [J Refract Surg. 2009;25:869-874.] doi:10.3928/1081597X-20090917-08

Relevance: 20.00%

Abstract:

Parenteral anticoagulation is a cornerstone in the management of venous and arterial thrombosis. Unfractionated heparin has a highly variable dose/response relationship, requiring frequent and troublesome laboratory follow-up. Because of these factors, low-molecular-weight heparin use has been increasing. Inadequate dosage has been pointed out as a potential problem, because the use of subjectively estimated weight instead of real, measured weight is common practice in the emergency department (ED). To evaluate the impact of inadequate weight estimation on enoxaparin dosage, we investigated the adequacy of anticoagulation of patients in a tertiary ED where subjective weight estimation is common practice. We obtained the estimated, informed, and measured weights of 28 patients in need of parenteral anticoagulation. Basal and steady-state (after the second subcutaneous injection of enoxaparin) anti-Xa activity was obtained as a measure of adequate anticoagulation. The patients were divided into 2 groups according to anticoagulation adequacy. Of the 28 patients enrolled, 75% (group 1, n = 21) received at least 0.9 mg/kg per dose BID and 25% (group 2, n = 7) received less than 0.9 mg/kg per dose BID of enoxaparin. Only 4 (14.3%) of all patients had anti-Xa activity below the lower limit of the therapeutic range (<0.5 IU/mL), all of them from group 2. In conclusion, when weight estimation was used to determine the enoxaparin dosage, 25% of the patients were inadequately anticoagulated (anti-Xa activity <0.5 IU/mL) during the crucial initial phase of treatment. (C) 2011 Elsevier Inc. All rights reserved.
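A minimal sketch of the dosing-adequacy arithmetic implied by the study: the dose is computed from the estimated weight but judged against the measured weight, using the 0.9 mg/kg threshold from the text (the 1 mg/kg BID regimen is a standard assumption):

```python
# Dose is calculated from the estimated weight but adequacy is judged
# against the patient's real (measured) weight.
def dose_adequate(estimated_kg: float, measured_kg: float,
                  dose_per_kg: float = 1.0, threshold: float = 0.9) -> bool:
    """True if the delivered dose reaches at least `threshold` mg/kg
    of the measured weight."""
    delivered = dose_per_kg * estimated_kg          # mg per shot
    return delivered / measured_kg >= threshold

print(dose_adequate(estimated_kg=70, measured_kg=72))   # True: within 10%
print(dose_adequate(estimated_kg=70, measured_kg=85))   # False: underdosed
```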

Relevance: 20.00%

Abstract:

We estimate the conditions for detectability of two planets in a 2/1 mean-motion resonance from radial velocity data, as a function of their masses, the number of observations, and the signal-to-noise ratio. Even for a data set of the order of 100 observations and standard deviations of the order of a few meters per second, we find that Jovian-size resonant planets are difficult to detect if the masses of the planets differ by a factor larger than ~4. This is consistent with the present population of real exosystems in the 2/1 commensurability, most of which have resonant pairs with similar minimum masses, and could indicate that many other resonant systems exist but are currently beyond the detectability limit. Furthermore, we analyze the error distribution in the masses and orbital elements of orbital fits from synthetic data sets for resonant planets in the 2/1 commensurability. For various mass ratios and numbers of data points we find that the eccentricity of the outer planet is systematically overestimated, although the inner planet's eccentricity suffers a much smaller effect. If the initial conditions correspond to small-amplitude oscillations around stable apsidal corotation resonances, the amplitudes estimated from the orbital fits are biased toward larger values, in accordance with results found in real resonant extrasolar systems.
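A short sketch of the kind of synthetic data set such a study relies on: radial velocities of two circular orbits with a 2/1 period ratio and a semi-amplitude ratio of ~4, sampled at ~100 epochs with a few m/s of noise; all parameter values are illustrative:

```python
# Synthetic radial-velocity curve of two planets near the 2/1 commensurability
# (circular orbits for simplicity; real fits use full Keplerian elements).
import numpy as np

rng = np.random.default_rng(7)
t = np.sort(rng.uniform(0, 1000, 100))            # ~100 epochs, days

K_inner, P_inner = 40.0, 220.0                    # m/s, days (illustrative)
K_outer, P_outer = 10.0, 440.0                    # 2/1 period ratio, amplitude ratio ~4

rv = (K_inner * np.sin(2 * np.pi * t / P_inner)
      + K_outer * np.sin(2 * np.pi * t / P_outer + 0.8))
rv_obs = rv + rng.normal(0, 3.0, t.size)          # few m/s noise, as in the text

# With K_outer comparable to the noise level, the weaker companion's signal
# is easily absorbed by other fit parameters, mirroring the outer-planet
# eccentricity bias the abstract reports.
print(f"outer-planet semi-amplitude / noise sigma = {K_outer / 3.0:.1f}")
```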

Relevance: 20.00%

Abstract:

In this paper we deal with robust inference in heteroscedastic measurement error models. Rather than the normal distribution, we postulate a Student t distribution for the observed variables. Maximum likelihood estimates are computed numerically. Consistent estimation of the asymptotic covariance matrices of the maximum likelihood and generalized least squares estimators is also discussed. Three test statistics are proposed for testing hypotheses of interest, with the asymptotic chi-square distribution guaranteeing correct asymptotic significance levels. Results of simulations and an application to a real data set are also reported. (C) 2009 The Korean Statistical Society. Published by Elsevier B.V. All rights reserved.
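For illustration, a numerically computed maximum likelihood estimate under a Student t error law, shown here for a simple location-scale model rather than the full heteroscedastic measurement error model of the paper:

```python
# Numerical MLE under a Student t likelihood: the heavy tails keep the
# location estimate robust to gross outliers, unlike the sample mean.
import numpy as np
from scipy import stats, optimize

rng = np.random.default_rng(3)
x = stats.t.rvs(df=4, loc=2.0, scale=1.5, size=200, random_state=rng)
x[:5] += 15.0                                     # a few gross outliers

def neg_loglik(theta):
    mu, log_sigma = theta
    return -np.sum(stats.t.logpdf(x, df=4, loc=mu, scale=np.exp(log_sigma)))

res = optimize.minimize(neg_loglik, x0=[0.0, 0.0])
mu_hat, sigma_hat = res.x[0], np.exp(res.x[1])
print(f"t-MLE:  mu = {mu_hat:.2f}, sigma = {sigma_hat:.2f}")
print(f"sample mean (non-robust): {x.mean():.2f}")   # dragged up by the outliers
```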

Relevance: 20.00%

Abstract:

The multivariate skew-t distribution (J Multivar Anal 79:93-113, 2001; J R Stat Soc, Ser B 65:367-389, 2003; Statistics 37:359-363, 2003) includes the Student t, skew-Cauchy and Cauchy distributions as special cases and the normal and skew-normal distributions as limiting cases. In this paper, we explore the use of Markov chain Monte Carlo (MCMC) methods to develop a Bayesian analysis of repeated-measures, pretest/post-test data under a multivariate null intercept measurement error model (J Biopharm Stat 13(4):763-771, 2003), where the random errors and the unobserved value of the covariate (latent variable) follow a Student t and a skew-t distribution, respectively. The results and methods are numerically illustrated with an example from the field of dentistry.
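A sampling sketch for the skew-t: the standard scale-mixture construction generates a skew-normal variate and divides it by the square root of an independent chi-square over its degrees of freedom (a known representation, though not necessarily the exact parametrization used in the paper):

```python
# Skew-t draws via the skew-normal / chi-square scale-mixture construction.
import numpy as np

def rskewnormal(alpha, size, rng):
    """Skew-normal draws via the additive representation:
    Z = delta*|U0| + sqrt(1 - delta^2)*U1, with U0, U1 iid N(0,1)."""
    delta = alpha / np.sqrt(1.0 + alpha**2)
    u0, u1 = rng.standard_normal((2, size))
    return delta * np.abs(u0) + np.sqrt(1.0 - delta**2) * u1

def rskewt(alpha, df, size, rng):
    z = rskewnormal(alpha, size, rng)
    w = rng.chisquare(df, size)
    return z / np.sqrt(w / df)

rng = np.random.default_rng(11)
draws = rskewt(alpha=5.0, df=4, size=10_000, rng=rng)
print(f"sample skewness sign: {np.sign(np.mean((draws - draws.mean())**3)):+.0f}")
```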

Relevance: 20.00%

Abstract:

The skew-normal distribution is a class of distributions that includes the normal distribution as a special case. In this paper, we explore the use of Markov chain Monte Carlo (MCMC) methods to develop a Bayesian analysis in a multivariate, null intercept, measurement error model [R. Aoki, H. Bolfarine, J.A. Achcar, and D. Leao Pinto Jr, Bayesian analysis of a multivariate null intercept errors-in-variables regression model, J. Biopharm. Stat. 13(4) (2003b), pp. 763-771] where the unobserved value of the covariate (latent variable) follows a skew-normal distribution. The results and methods are applied to a real dental clinical trial presented in [A. Hadgu and G. Koch, Application of generalized estimating equations to a dental randomized clinical trial, J. Biopharm. Stat. 9 (1999), pp. 161-178].
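A bare-bones Metropolis sampler for the location parameter of a skew-normal likelihood, just to illustrate the MCMC machinery; the paper's multivariate null-intercept model is far richer than this univariate toy:

```python
# Metropolis sampling of a location parameter under a skew-normal likelihood.
import numpy as np
from scipy import stats

rng = np.random.default_rng(5)
data = stats.skewnorm.rvs(a=4.0, loc=1.0, scale=1.0, size=150, random_state=rng)

def log_post(mu):
    # flat prior on mu; skew-normal likelihood with known shape and scale
    return np.sum(stats.skewnorm.logpdf(data, a=4.0, loc=mu, scale=1.0))

mu, lp, chain = 0.0, log_post(0.0), []
for _ in range(5000):
    prop = mu + rng.normal(0, 0.1)                # random-walk proposal
    lp_prop = log_post(prop)
    if np.log(rng.uniform()) < lp_prop - lp:      # Metropolis accept/reject
        mu, lp = prop, lp_prop
    chain.append(mu)

posterior = np.array(chain[1000:])                # drop burn-in
print(f"posterior mean of mu: {posterior.mean():.2f}")
```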

Relevance: 20.00%

Abstract:

The Z-scan technique is employed to obtain the nonlinear refractive index (n₂) of Ca₄REO(BO₃)₃ (RECOB, where RE = Gd and La) single crystals using 30 fs laser pulses centered at 780 nm, for the two orthogonal orientations determined by the optical axes (X and Z) relative to the direction of propagation of the laser beam (k ∥ Y ∥ crystallographic b-axis). The large values of n₂ indicate that both GdCOB and LaCOB are potential hosts for Yb:RECOB lasers operating in the Kerr-lens mode-locking (KLM) regime.
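For reference, the closed-aperture Z-scan trace follows the standard Sheik-Bahae expression, from which n₂ is extracted via the on-axis phase shift ΔΦ₀ = k n₂ I₀ L_eff; the Rayleigh range and phase shift below are illustrative, not the paper's values:

```python
# Closed-aperture Z-scan trace: T(x) = 1 + 4 x dPhi / ((x^2 + 1)(x^2 + 9)),
# with x = z/z0. The peak-valley difference obeys dT_pv ~ 0.406 * dPhi.
import numpy as np

z0 = 5e-3                          # m, Rayleigh range (illustrative)
dphi = 0.3                         # on-axis nonlinear phase shift, rad

z = np.linspace(-5 * z0, 5 * z0, 201)
x = z / z0
T = 1.0 + 4.0 * x * dphi / ((x**2 + 1.0) * (x**2 + 9.0))

dT_pv = T.max() - T.min()
print(f"dT_pv = {dT_pv:.3f}  (0.406 * dPhi = {0.406 * dphi:.3f})")
# In practice dphi is fitted from the measured trace and n2 follows from
# dphi = (2*pi/lambda) * n2 * I0 * L_eff at lambda = 780 nm.
```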

Relevance: 20.00%

Abstract:

The propagation of an optical beam through dielectric media induces changes in the refractive index, Δn, which cause self-focusing or self-defocusing. In the particular case of ion-doped solids, there are thermal and non-thermal lens effects, where the latter is due to the polarizability difference, Δα, between the excited and ground states, the so-called population lens (PL) effect. PL is a purely electronic contribution to the nonlinearity, while the thermal lens (TL) effect is caused by the conversion of part of the absorbed energy into heat. In time-resolved measurements such as Z-scan and TL transient experiments, it is not easy to separate these two contributions to the nonlinear refractive index because they usually have similar response times. In this work, we performed time-resolved measurements using both Z-scan and mode-mismatched TL in order to discriminate the thermal and electronic contributions to the laser-induced refractive index change of the Nd³⁺-doped strontium barium niobate (SrₓBa₁₋ₓNb₂O₆) laser crystal. Combining numerical simulations with experimental results, we could successfully distinguish between the two contributions to Δn. (C) 2007 Elsevier B.V. All rights reserved.
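A sketch of the separation idea: model the transient signal as an instantaneous electronic (PL) step plus a thermal (TL) component that builds up with a characteristic time, and fit both amplitudes; this simplified functional form is an assumption standing in for the full mode-mismatched TL model:

```python
# Discriminating an instantaneous (electronic, PL) step from a slowly
# building thermal (TL) component by fitting a two-term transient model.
import numpy as np
from scipy.optimize import curve_fit

def transient(t, a_pl, a_tl, tc):
    # a_pl: instantaneous electronic amplitude; a_tl, tc: thermal build-up
    return a_pl + a_tl * (1.0 - np.exp(-t / tc))

t = np.linspace(0, 5e-3, 200)                     # s
rng = np.random.default_rng(9)
signal = transient(t, 0.02, 0.05, 8e-4) + rng.normal(0, 1e-3, t.size)

(a_pl, a_tl, tc), _ = curve_fit(transient, t, signal, p0=[0.01, 0.01, 1e-3])
print(f"electronic (PL) amplitude = {a_pl:.3f}")
print(f"thermal (TL) amplitude    = {a_tl:.3f}, tc = {tc * 1e3:.2f} ms")
```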