651 results for SMOOTHING SPLINE
Abstract:
Given the importance of preserving the water quality of the Guarani Aquifer, this work, carried out in the hydrographic basin of the Jacaré-Guaçú and Jacaré-Pepira rivers in the central-northern region of São Paulo state, mapped hydraulic conductivity using empirical methods based on granulometric analysis and on in situ testing with the Guelph permeameter. All results were submitted to correlation analysis and subsequently mapped with the minimum-curvature method, which is based on spline numerical techniques. These procedures support studies of aquifer vulnerability and assist decision making in environmental projects and in guidelines for urban planning.
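For illustration only: minimum-curvature gridding of scattered measurements is closely related to thin-plate spline interpolation. The sketch below grids hypothetical point values of hydraulic conductivity with SciPy's radial basis function interpolator; the coordinates, conductivity values, and smoothing parameter are invented for the example and are not taken from the study.

```python
import numpy as np
from scipy.interpolate import Rbf  # thin-plate RBF ~ minimum-curvature surface

# Hypothetical Guelph-permeameter results: x, y in metres, K in m/day
x = np.array([2000.0, 3500.0, 5100.0, 7800.0, 9400.0])
y = np.array([1200.0, 4300.0, 2900.0, 6700.0, 3100.0])
k = np.array([0.12, 0.45, 0.08, 0.33, 0.21])

# Thin-plate spline of log-conductivity with a small smoothing term
surface = Rbf(x, y, np.log10(k), function="thin_plate", smooth=0.1)

# Evaluate on a regular grid to produce the hydraulic-conductivity map
gx, gy = np.meshgrid(np.linspace(x.min(), x.max(), 200),
                     np.linspace(y.min(), y.max(), 200))
k_map = 10 ** surface(gx, gy)
```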
Abstract:
The continuous advance of the Brazilian economy and the increased competition in the heavy-equipment market point to the need for accurate sales forecasting processes, which allow optimized strategic planning and therefore better overall results. The sales forecasting process therefore deserves to be studied and understood, since it plays a key role in corporate strategic planning. Accurate forecasting methods enable companies to circumvent management difficulties and variations in finished-goods inventory, making them more competitive. By analyzing the stages of sales forecasting it was possible to observe that this process is methodical, bureaucratic, and demands extensive training of the managers and professionals involved. In this paper we applied the method-selection process proposed by Armstrong to choose the most appropriate technique for two products of a heavy-equipment industry; through this process the triple exponential smoothing technique was chosen for both products. The forecasts obtained with the triple exponential smoothing technique were better than those prepared by the industry's experts.
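For context, triple (Holt-Winters) exponential smoothing combines level, trend, and seasonal components. The additive-seasonality sketch below is a generic illustration, not the model or data from the paper; the smoothing constants and the sales series are placeholders.

```python
def holt_winters_additive(y, period, alpha=0.3, beta=0.1, gamma=0.2, horizon=6):
    """Triple exponential smoothing with additive trend and seasonality."""
    # Initialise level, trend, and one full cycle of seasonal terms
    level = sum(y[:period]) / period
    trend = (sum(y[period:2 * period]) - sum(y[:period])) / period ** 2
    season = [y[i] - level for i in range(period)]

    for t, obs in enumerate(y):
        s = season[t % period]
        last_level = level
        level = alpha * (obs - s) + (1 - alpha) * (level + trend)
        trend = beta * (level - last_level) + (1 - beta) * trend
        season[t % period] = gamma * (obs - level) + (1 - gamma) * s

    # Out-of-sample forecasts for the next `horizon` points
    return [level + (h + 1) * trend + season[(len(y) + h) % period]
            for h in range(horizon)]

# Example: 24 months of hypothetical unit sales with yearly seasonality
sales = [42, 40, 45, 50, 58, 63, 68, 66, 60, 55, 48, 44,
         46, 44, 49, 55, 63, 69, 75, 72, 66, 60, 52, 48]
print(holt_winters_additive(sales, period=12))
```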
Abstract:
Graduate Program in Electrical Engineering - FEB
Abstract:
Graduate Program in Biometrics - IBB
Abstract:
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)
Abstract:
Objective: To evaluate the long-term effects of the standard (Class II) Balters bionator in growing patients with Class II malocclusion with mandibular retrusion by using morphometrics (thin-plate spline [TPS] analysis). Materials and Methods: Twenty-three Class II patients (8 male, 15 female) were treated consecutively with the Balters bionator (bionator group). The sample was evaluated at T0, start of treatment; T1, end of bionator therapy; and T2, long-term observation (including fixed appliances). Mean age at the start of treatment was 10 years 2 months (T0); at posttreatment, 12 years 3 months (T1); and at long-term follow-up, 18 years 2 months (T2). The control group consisted of 22 subjects (11 male, 11 female) with untreated Class II malocclusion. Lateral cephalograms were analyzed at the three time points for all groups. TPS analysis evaluated statistical differences (permutation tests) in the craniofacial shape and size between the bionator and control groups. Results: TPS analysis showed that treatment with the bionator is able to produce favorable mandibular shape changes (forward and downward displacement) that contribute significantly to the correction of the Class II dentoskeletal imbalance. These results are maintained at a long-term observation after completion of growth. The control group showed no statistically significant differences in the correction of Class II malocclusion. Conclusions: This study suggests that bionator treatment of Class II malocclusion produces favorable results over the long term with a combination of skeletal and dentoalveolar shape changes.
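As background to the method used above, thin-plate spline analysis models shape change as a smooth deformation that interpolates corresponding landmarks. The minimal sketch below fits the 2-D TPS mapping between two hypothetical landmark configurations; the coordinates are invented and do not represent the cephalometric data of the study.

```python
import numpy as np

def fit_tps(src, dst):
    """Fit a 2-D thin-plate spline mapping the src landmarks onto the dst landmarks."""
    n = len(src)
    d2 = np.sum((src[:, None, :] - src[None, :, :]) ** 2, axis=-1)
    with np.errstate(divide="ignore", invalid="ignore"):
        K = np.where(d2 > 0, 0.5 * d2 * np.log(d2), 0.0)   # kernel U(r) = r^2 log r
    P = np.hstack([np.ones((n, 1)), src])                   # affine part: 1, x, y
    L = np.zeros((n + 3, n + 3))
    L[:n, :n], L[:n, n:], L[n:, :n] = K, P, P.T
    rhs = np.vstack([dst, np.zeros((3, 2))])
    return np.linalg.solve(L, rhs)   # n warp coefficients + 3 affine terms per axis

# Hypothetical source (T0) and target (T1) landmark configurations
src = np.array([[0.0, 0.0], [1.0, 0.0], [1.0, 1.0], [0.0, 1.0], [0.5, 0.5]])
dst = src + np.array([[0.02, 0.0], [0.05, 0.01], [0.04, 0.03],
                      [0.0, 0.02], [0.06, 0.02]])
coeffs = fit_tps(src, dst)
```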
Abstract:
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)
Abstract:
Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)
Abstract:
The objective of this paper was to model variation in test-day milk yields of first-lactation Holstein cows by random regression (RR) using B-spline functions and Bayesian inference, in order to fit adequate and parsimonious models for the estimation of genetic parameters. A total of 152,145 test-day milk yield records from 7317 first lactations of Holstein cows were used. The model included additive genetic, permanent environmental, and residual random effects; contemporary group and the linear and quadratic effects of the age of the cow at calving were included as fixed effects. The average lactation curve of the population was modeled with a fourth-order orthogonal Legendre polynomial. The authors concluded that a cubic B-spline with seven random regression coefficients for both the additive genetic and permanent environmental effects was the best model according to the residual mean square and residual variance estimates. They also suggested that a lower-order model (a quadratic B-spline with seven random regression coefficients for both random effects) could be adopted, because it yielded practically the same genetic parameter estimates with greater parsimony. (C) 2012 Elsevier B.V. All rights reserved.
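For illustration, random regression with B-splines amounts to regressing each animal's test-day records on a B-spline basis evaluated at days in milk. The sketch below builds a cubic B-spline design matrix with the Cox-de Boor recursion; the knot placement is hypothetical (chosen only so that the example yields seven basis functions) and the full mixed model of the paper is not reproduced.

```python
import numpy as np

def bspline_design(x, knots, degree=3):
    """Design matrix of clamped B-spline basis functions evaluated at x."""
    t = np.r_[[knots[0]] * degree, knots, [knots[-1]] * degree]  # clamped knot vector
    x = np.asarray(x, dtype=float)
    m = len(t) - 1
    n_basis = m - degree

    # Degree-0 bases: indicators of the half-open knot spans
    N = np.zeros((len(x), m))
    for i in range(m):
        N[:, i] = (x >= t[i]) & (x < t[i + 1])
    N[x == knots[-1], n_basis - 1] = 1.0  # include the right endpoint

    # Cox-de Boor recursion, treating 0/0 terms as zero
    for p in range(1, degree + 1):
        N_next = np.zeros((len(x), m - p))
        for i in range(m - p):
            a = (x - t[i]) / (t[i + p] - t[i]) * N[:, i] if t[i + p] > t[i] else 0.0
            b = (t[i + p + 1] - x) / (t[i + p + 1] - t[i + 1]) * N[:, i + 1] \
                if t[i + p + 1] > t[i + 1] else 0.0
            N_next[:, i] = a + b
        N = N_next
    return N  # shape: (len(x), n_basis)

# Hypothetical: cubic basis over days in milk 5-305, five equally spaced knots
dim = np.linspace(5, 305, 50)
Z = bspline_design(dim, knots=np.linspace(5, 305, 5))  # 7 basis functions
```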
Abstract:
Purpose: To evaluate the relationship between glaucomatous structural damage assessed by the Cirrus spectral-domain OCT (SDOCT) and functional loss as measured by standard automated perimetry (SAP). Methods: Four hundred twenty-two eyes (78 healthy, 210 suspect, 134 glaucomatous) of 250 patients were recruited from the longitudinal Diagnostic Innovations in Glaucoma Study and from the African Descent and Glaucoma Evaluation Study. All eyes underwent testing with the Cirrus SDOCT and SAP within a 6-month period. The relationship between parapapillary retinal nerve fiber layer (RNFL) thickness sectors and corresponding topographic SAP locations was evaluated using locally weighted scatterplot smoothing and regression analysis. SAP sensitivity values were evaluated on both linear and logarithmic scales. We also tested the fit of a model (Hood) for the structure-function relationship in glaucoma. Results: Structure was significantly related to function for all but the nasal thickness sector. The relationship was strongest between superotemporal RNFL thickness and inferonasal sensitivity (R(2) = 0.314, P < 0.001). The Hood model fitted the data relatively well, with 88% of the eyes inside the 95% confidence interval predicted by the model. Conclusions: RNFL thinning measured by the Cirrus SDOCT was associated with corresponding visual field loss in glaucoma.
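As an aside, locally weighted scatterplot smoothing (LOWESS) fits the structure-function scatter without assuming a parametric form. The sketch below shows how such a curve might be fitted with statsmodels on simulated RNFL-thickness/SAP-sensitivity pairs; the variable names, values, and smoothing fraction are illustrative, not those of the study.

```python
import numpy as np
from statsmodels.nonparametric.smoothers_lowess import lowess

rng = np.random.default_rng(0)

# Simulated data: superotemporal RNFL thickness (um) vs inferonasal sensitivity (dB)
rnfl = rng.uniform(40, 140, 200)
sensitivity = 20 + 0.08 * rnfl + rng.normal(0, 1.5, 200)

# LOWESS curve describing the structure-function relationship
curve = lowess(sensitivity, rnfl, frac=0.5)   # returns sorted (x, fitted y) pairs
rnfl_sorted, fitted = curve[:, 0], curve[:, 1]
```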
Abstract:
Electrothermomechanical MEMS are essentially microactuators that operate based on the thermoelastic effect induced by Joule heating of the structure. They can be easily fabricated and require relatively low excitation voltages. However, the actuation time of an electrothermomechanical microdevice is longer than those associated with electrostatic and piezoelectric actuation principles. Thus, in this research, we propose an optimization framework, based on the topology optimization method applied to transient problems, to design electrothermomechanical microactuators with reduced response time. The objective is to maximize the integral over time of the output displacement of the actuator. The finite element equations that govern the time response of the actuators are provided. Furthermore, the Solid Isotropic Material with Penalization model and Sequential Linear Programming are employed, and a smoothing filter is implemented to control the solution. Results for two distinct applications suggest that the proposed approach can provide actuators that are more than 50% faster. (C) 2012 Elsevier B.V. All rights reserved.
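For illustration, smoothing (density) filters in topology optimization replace each element density with a distance-weighted average of its neighbours, suppressing checkerboard patterns and controlling the minimum feature size. The minimal 2-D sketch below uses this standard filter with a periodic wrap at the borders for brevity; it is not the authors' implementation, and the design values and filter radius are placeholders.

```python
import numpy as np

def density_filter(x, rmin):
    """Distance-weighted smoothing filter over a 2-D grid of element densities."""
    r = int(np.ceil(rmin))
    xf = np.zeros_like(x)
    wsum = np.zeros_like(x)
    for dy in range(-r, r + 1):
        for dx in range(-r, r + 1):
            w = rmin - np.hypot(dy, dx)          # linear "hat" weight
            if w <= 0:
                continue
            # np.roll wraps at the boundaries; a simplification for this sketch
            shifted = np.roll(np.roll(x, dy, axis=0), dx, axis=1)
            xf += w * shifted
            wsum += w
    return xf / wsum

# Hypothetical 60x30 design variables plus noise, filter radius 2.5 elements
design = np.clip(0.5 + 0.2 * np.random.default_rng(1).normal(size=(30, 60)), 0, 1)
smoothed = density_filter(design, rmin=2.5)
```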
Abstract:
Background: With the development of DNA hybridization microarray technologies, it is now possible to simultaneously assess the expression levels of thousands to tens of thousands of genes. Quantitative comparison of microarrays uncovers distinct patterns of gene expression, which define different cellular phenotypes or cellular responses to drugs. Because of technical biases, normalization of the intensity levels is a prerequisite for further statistical analyses, so choosing a suitable normalization approach can be critical and deserves judicious consideration. Results: We considered three commonly used normalization approaches, namely Loess, Splines, and Wavelets, and two non-parametric regression methods that had not yet been used for normalization, namely Kernel smoothing and Support Vector Regression. The results were compared using artificial microarray data and benchmark studies. They indicate that Support Vector Regression is the most robust to outliers and that Kernel smoothing is the worst normalization technique, while no practical differences were observed among Loess, Splines, and Wavelets. Conclusion: In light of these results, Support Vector Regression is favored for microarray normalization because of its robustness in estimating the normalization curve.
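For context, intensity-dependent normalization regresses the log-ratio M on the mean log-intensity A and subtracts the fitted trend. The sketch below performs that regression with scikit-learn's SVR on simulated data; the kernel, regularization settings, and bias curve are illustrative and are not the configuration benchmarked in the paper.

```python
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(42)

# Simulated two-channel microarray: A = mean log-intensity, M = log-ratio with a dye bias
A = rng.uniform(6, 14, 1000)
M = 0.4 * np.sin(A) + rng.normal(0, 0.3, 1000)   # curved intensity-dependent bias

# Fit the bias curve with Support Vector Regression and subtract it
svr = SVR(kernel="rbf", C=1.0, epsilon=0.1)
svr.fit(A.reshape(-1, 1), M)
M_normalized = M - svr.predict(A.reshape(-1, 1))
```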
Abstract:
Master's Degree in Intelligent Systems and Numerical Applications in Engineering (SIANI)
Abstract:
In recent years we have developed several methods for 3D reconstruction. We began with the problem of reconstructing a 3D scene from a stereoscopic pair of images, developing methods based on energy functionals that produce dense disparity maps while preserving discontinuities at image boundaries. We then addressed the problem of reconstructing a 3D scene from multiple views (more than two). The multiple-view reconstruction method relies on the stereoscopic reconstruction method: for every pair of consecutive images we estimate a disparity map and then apply a robust method that searches for good correspondences through the sequence of images. Recently we have proposed several methods for 3D surface regularization, a postprocessing step necessary for smoothing the final surface, which can be affected by noise or mismatched correspondences. These regularization methods are interesting because they use information from the reconstruction process and not only from the 3D surface. We have tackled all of these problems through an energy minimization approach: we investigate the Euler-Lagrange equation associated with the energy functional and approach the solution of the underlying partial differential equation (PDE) using a gradient descent method.
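For illustration, energy minimization of this kind leads, via the Euler-Lagrange equation, to a diffusion-type PDE that can be solved by explicit gradient descent. The sketch below regularizes a noisy disparity map with a simple quadratic smoothness energy plus a data term; it is a schematic example, not the discontinuity-preserving functional of the original work, and the map, weights, and step size are placeholders.

```python
import numpy as np

def regularize(disparity, lam=0.2, step=0.2, iters=200):
    """Gradient descent on E(u) = 0.5*|u - d|^2 + 0.5*lam*|grad u|^2."""
    u = disparity.copy()
    for _ in range(iters):
        # Discrete Laplacian with Neumann boundaries via edge padding
        p = np.pad(u, 1, mode="edge")
        lap = p[:-2, 1:-1] + p[2:, 1:-1] + p[1:-1, :-2] + p[1:-1, 2:] - 4 * u
        # Euler-Lagrange gradient: (u - d) - lam * Laplacian(u)
        u -= step * ((u - disparity) - lam * lap)
    return u

# Hypothetical noisy disparity map
rng = np.random.default_rng(0)
noisy = np.tile(np.linspace(0, 10, 64), (64, 1)) + rng.normal(0, 0.5, (64, 64))
smooth = regularize(noisy)
```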
Abstract:
The meccano method is a novel and promising mesh generation method for simultaneously creating adaptive tetrahedral meshes and volume parametrizations of a complex solid. The method requires minimal user intervention and has a low computational cost. It builds a 3-D triangulation of the solid as a deformation of an appropriate tetrahedral mesh of the meccano. The new mesh generator combines an automatic parametrization of surface triangulations, a local refinement algorithm for 3-D nested triangulations, and a simultaneous untangling and smoothing procedure. At present, the procedure is fully automatic for a genus-zero solid, in which case the meccano can be a single cube. The efficiency of the proposed technique is shown with several applications...
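As a loose illustration of the smoothing half of such procedures, Laplacian smoothing moves each free node toward the centroid of its edge-connected neighbours. The toy sketch below is a generic example on an invented node/edge set; it is not the simultaneous untangling-and-smoothing algorithm the abstract refers to.

```python
import numpy as np

def laplacian_smooth(nodes, edges, fixed, iters=20, relax=0.5):
    """Move each free node toward the average of its edge-connected neighbours."""
    pts = nodes.copy()
    neighbours = {i: set() for i in range(len(pts))}
    for a, b in edges:
        neighbours[a].add(b)
        neighbours[b].add(a)
    for _ in range(iters):
        new = pts.copy()
        for i, nbrs in neighbours.items():
            if i in fixed or not nbrs:
                continue
            centroid = pts[list(nbrs)].mean(axis=0)
            new[i] = (1 - relax) * pts[i] + relax * centroid
        pts = new
    return pts

# Toy example: a displaced interior node inside a fixed square boundary
nodes = np.array([[0.0, 0.0], [1.0, 0.0], [1.0, 1.0], [0.0, 1.0], [0.8, 0.9]])
edges = [(0, 1), (1, 2), (2, 3), (3, 0), (0, 4), (1, 4), (2, 4), (3, 4)]
smoothed = laplacian_smooth(nodes, edges, fixed={0, 1, 2, 3})
```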