4 results for smoothing

in Biblioteca Digital da Produção Intelectual da Universidade de São Paulo


Relevance: 10.00%

Abstract:

Purpose: To evaluate the relationship between glaucomatous structural damage assessed by the Cirrus Spectral Domain OCT (SDOCT) and functional loss as measured by standard automated perimetry (SAP). Methods: Four hundred twenty-two eyes (78 healthy, 210 suspects, 134 glaucomatous) of 250 patients were recruited from the longitudinal Diagnostic Innovations in Glaucoma Study and from the African Descent and Glaucoma Evaluation Study. All eyes underwent testing with the Cirrus SDOCT and SAP within a 6-month period. The relationship between parapapillary retinal nerve fiber layer (RNFL) thickness sectors and corresponding topographic SAP locations was evaluated using locally weighted scatterplot smoothing and regression analysis. SAP sensitivity values were evaluated on both linear and logarithmic scales. We also tested the fit of a previously proposed model (the Hood model) of the structure-function relationship in glaucoma. Results: Structure was significantly related to function for all but the nasal thickness sector. The relationship was strongest between superotemporal RNFL thickness and inferonasal sensitivity (R² = 0.314, P < 0.001). The Hood model fitted the data relatively well, with 88% of the eyes inside the 95% confidence interval predicted by the model. Conclusions: RNFL thinning measured by the Cirrus SDOCT was associated with corresponding visual field loss in glaucoma.
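
For readers unfamiliar with the smoothing technique named above, the following sketch illustrates locally weighted scatterplot smoothing (LOWESS) applied to a structure-function relationship of this kind. It uses synthetic data and hypothetical variable names (rnfl_thickness, sap_sensitivity_db); it is not the study's analysis code.

# Illustrative LOWESS sketch (Python/statsmodels); synthetic data, hypothetical names.
import numpy as np
from statsmodels.nonparametric.smoothers_lowess import lowess

rng = np.random.default_rng(0)

# Simulated superotemporal RNFL thickness (micrometers) and corresponding
# inferonasal SAP sensitivity (dB), with measurement noise added.
rnfl_thickness = rng.uniform(40, 140, size=300)
sap_sensitivity_db = (10 * np.log10(np.maximum(rnfl_thickness - 30, 1))
                      + rng.normal(0, 2, 300))

# Locally weighted scatterplot smoothing; frac sets the span of the local fits.
smoothed = lowess(sap_sensitivity_db, rnfl_thickness, frac=0.5, return_sorted=True)
smoothed_thickness, smoothed_sensitivity = smoothed[:, 0], smoothed[:, 1]

# Sensitivity on a linear (1/Lambert) scale for comparison with the dB scale.
sensitivity_linear = 10 ** (sap_sensitivity_db / 10)
slope, intercept = np.polyfit(rnfl_thickness, sensitivity_linear, 1)
print(f"linear-scale fit: slope={slope:.3f}, intercept={intercept:.1f}")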

Relevance: 10.00%

Abstract:

Electrothermomechanical MEMS are essentially microactuators that operate based on the thermoelastic effect induced by Joule heating of the structure. They can be easily fabricated and require relatively low excitation voltages. However, the actuation time of an electrothermomechanical microdevice is longer than that of devices based on electrostatic or piezoelectric actuation principles. Thus, in this research, we propose an optimization framework, based on the topology optimization method applied to transient problems, to design electrothermomechanical microactuators with reduced response time. The objective is to maximize the time integral of the actuator's output displacement. The finite element equations that govern the time response of the actuators are provided. Furthermore, the Solid Isotropic Material with Penalization (SIMP) model and Sequential Linear Programming are employed. Finally, a smoothing filter is implemented to control the solution. Results for two distinct applications suggest that the proposed approach can yield actuators that are more than 50% faster.
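
In SIMP-based topology optimization, the smoothing filter mentioned above is commonly implemented as a linear density filter that averages element densities within a radius rmin. The sketch below shows that standard construction on a regular element grid; it is a generic illustration under that assumption, not the authors' implementation.

# Generic linear density (smoothing) filter sketch for SIMP topology optimization.
import numpy as np

def density_filter(x, nelx, nely, rmin):
    """Smooth element densities x (length nely*nelx) with a cone-shaped
    weighting of radius rmin, measured in element widths."""
    x = x.reshape(nely, nelx)
    xf = np.zeros_like(x)
    r = int(np.ceil(rmin)) - 1
    for i in range(nelx):
        for j in range(nely):
            wsum = 0.0
            val = 0.0
            for ii in range(max(i - r, 0), min(i + r + 1, nelx)):
                for jj in range(max(j - r, 0), min(j + r + 1, nely)):
                    dist = np.hypot(i - ii, j - jj)
                    w = max(rmin - dist, 0.0)   # cone weight, zero outside rmin
                    wsum += w
                    val += w * x[jj, ii]
            xf[j, i] = val / wsum
    return xf.ravel()

# Example: filter a random 60x20 density field with a 1.5-element radius.
densities = np.random.default_rng(1).random(60 * 20)
filtered = density_filter(densities, nelx=60, nely=20, rmin=1.5)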

Relevance: 10.00%

Abstract:

In this paper we extend semiparametric mixed linear models with normal errors to elliptical errors, in order to accommodate distributions with heavier or lighter tails than the normal distribution. Penalized likelihood equations are applied to derive the maximum penalized likelihood estimates (MPLEs), which appear to be robust against outlying observations in the sense of the Mahalanobis distance. A reweighted iterative process based on the back-fitting method is proposed for parameter estimation, and local influence curvatures are derived under some usual perturbation schemes to study the sensitivity of the MPLEs. Two motivating examples, preliminarily analyzed under normal errors, are reanalyzed under appropriate elliptical errors. The local influence approach is used to compare the sensitivity of the model estimates.
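
As a rough illustration of the back-fitting idea underlying the proposed estimation procedure, the sketch below alternates a parametric least-squares step with a nonparametric smoothing step for a partially linear model. It substitutes ordinary least squares and a LOWESS smoother under normal errors for the penalized, elliptically reweighted steps described in the paper; all names and settings are illustrative assumptions.

# Minimal back-fitting sketch for a partially linear model y = X*beta + f(t) + error.
import numpy as np
from statsmodels.nonparametric.smoothers_lowess import lowess

rng = np.random.default_rng(2)
n = 200
X = rng.normal(size=(n, 2))
t = np.sort(rng.uniform(0, 1, n))
beta_true = np.array([1.5, -0.8])
y = X @ beta_true + np.sin(2 * np.pi * t) + rng.normal(0, 0.3, n)

f_hat = np.zeros(n)
for _ in range(20):                       # back-fitting iterations
    # Parametric step: regress the partial residuals on X.
    beta_hat, *_ = np.linalg.lstsq(X, y - f_hat, rcond=None)
    # Nonparametric step: smooth the remaining residuals against t.
    f_hat = lowess(y - X @ beta_hat, t, frac=0.3, return_sorted=False)
    f_hat -= f_hat.mean()                 # center f for identifiability

print("estimated beta:", beta_hat)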

Relevance: 10.00%

Abstract:

Background: With the development of DNA hybridization microarray technologies, it is now possible to assess simultaneously the expression levels of thousands to tens of thousands of genes. Quantitative comparison of microarrays uncovers distinct patterns of gene expression, which define different cellular phenotypes or cellular responses to drugs. Because of technical biases, normalization of the intensity levels is a prerequisite to further statistical analysis, so choosing a suitable normalization approach can be critical and deserves judicious consideration. Results: Here, we considered three commonly used normalization approaches, namely Loess, Splines and Wavelets, and two non-parametric regression methods that had not yet been used for normalization, namely Kernel smoothing and Support Vector Regression. The results were compared using artificial microarray data and benchmark studies. They indicate that Support Vector Regression is the most robust to outliers and that Kernel smoothing is the worst normalization technique, while no practical differences were observed between Loess, Splines and Wavelets. Conclusion: Given these results, Support Vector Regression is favored for microarray normalization because of its robustness in estimating the normalization curve.
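
To make the favored approach concrete, the sketch below uses Support Vector Regression to estimate and remove an intensity-dependent trend from a synthetic two-channel MA-plot, in the spirit of loess normalization. The data, parameters, and variable names are illustrative assumptions, not the benchmark pipeline of the study.

# SVR-based normalization sketch (Python/scikit-learn); synthetic two-channel data.
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(3)
n_genes = 2000
red = rng.lognormal(mean=7, sigma=1, size=n_genes)
bias = 0.05 * (np.log2(red) - np.mean(np.log2(red)))    # intensity-dependent dye bias
green = red * 2.0 ** (-bias) * rng.lognormal(0, 0.1, n_genes)

A = 0.5 * (np.log2(red) + np.log2(green))               # average log intensity
M = np.log2(red) - np.log2(green)                       # log ratio

# Fit the normalization curve M ~ f(A) with epsilon-SVR (RBF kernel) and
# subtract the fitted trend, analogous to loess normalization of an MA-plot.
svr = SVR(kernel="rbf", C=1.0, epsilon=0.05)
svr.fit(A.reshape(-1, 1), M)
M_normalized = M - svr.predict(A.reshape(-1, 1))

print("mean |M| before: %.3f  after: %.3f"
      % (np.abs(M).mean(), np.abs(M_normalized).mean()))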