888 results for "Least-squares technique"
Abstract:
A quantitative structure-activity relationship (QSAR) study of 19 quinone compounds with trypanocidal activity was performed by the Partial Least Squares (PLS) and Principal Component Regression (PCR) methods, with a leave-one-out cross-validation procedure used to build the regression models. The trypanocidal activity of the compounds is related to their first cathodic potential (Epc1). The PLS and PCR regression models built in this study were also used to predict the Epc1 of six new quinone compounds. The PLS model was built with three principal components that described 96.50% of the total variance and presented Q² = 0.83 and R² = 0.90. The results obtained with the PCR model were similar: it was also built with three principal components, which described 96.67% of the total variance, with Q² = 0.83 and R² = 0.90. The most important descriptors for the PLS and PCR models were HOMO-1 (energy of the molecular orbital below the HOMO), Q4 (atomic charge at position 4), MAXDN (maximal electrotopological negative difference), and HYF (hydrophilicity index).
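As a rough sketch of the model-building procedure described above, the following snippet builds a three-component PCR model with scikit-learn and computes R² together with a leave-one-out Q². The 19-row descriptor matrix and the response are synthetic stand-ins, not the actual quinone descriptors or cathodic potentials.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import LeaveOneOut
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
# Synthetic stand-ins: 19 "compounds" x 6 "descriptors" with a latent
# three-factor structure, and a correlated response playing the role of Epc1.
T = rng.normal(size=(19, 3))
X = T @ rng.normal(size=(3, 6)) + 0.05 * rng.normal(size=(19, 6))
y = T @ np.array([1.0, -0.6, 0.4]) + 0.05 * rng.normal(size=19)

# Three-principal-component PCR model, as in the abstract.
pcr = make_pipeline(PCA(n_components=3), LinearRegression())

# Leave-one-out cross-validation: Q^2 = 1 - PRESS / TSS.
residuals = []
for train, test in LeaveOneOut().split(X):
    pcr.fit(X[train], y[train])
    residuals.append(y[test][0] - pcr.predict(X[test])[0])
q2 = 1.0 - np.sum(np.square(residuals)) / np.sum((y - y.mean()) ** 2)

pcr.fit(X, y)          # refit on all data for the final R^2
r2 = pcr.score(X, y)
print(f"R2 = {r2:.2f}, Q2 = {q2:.2f}")
```

Swapping `PCA(n_components=3)` for a PLS estimator yields the companion model with the same validation loop.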
Abstract:
A low-cost computational procedure to determine the orbit of an artificial satellite using short-arc data from an onboard GPS receiver is proposed. Pseudoranges are used as measurements to estimate the orbit via a recursive least squares method. The algorithm applies orthogonal Givens rotations to solve the recursive, sequential orbit determination problem. To assess the procedure, it was applied to the TOPEX/POSEIDON satellite for data batches of one orbital period (approximately two hours), with force modelling based on the full JGM-2 gravity field model. When compared with the reference Precision Orbit Ephemeris (POE) of JPL/NASA, the results indicate that a precision better than 9 m is easily obtained, even when short batches of data are used.
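The Givens-rotation machinery can be sketched on a toy problem: each new scalar measurement z = h·x + noise is folded into an upper-triangular system R x = d by a sweep of rotations, so the estimate can be refreshed cheaply as data arrive. The orbit dynamics and pseudorange modelling of the paper are omitted; a simple polynomial model stands in.

```python
import numpy as np

def givens_update(R, d, h, z):
    """Rotate a measurement row (h, z) into the triangular factor (R, d) in place."""
    h, z = h.copy(), float(z)
    n = d.size
    for k in range(n):
        if h[k] == 0.0:
            continue
        r = np.hypot(R[k, k], h[k])
        c, s = R[k, k] / r, h[k] / r
        # One Givens rotation zeroes h[k] against the pivot R[k, k].
        R[k, k:], h[k:] = c * R[k, k:] + s * h[k:], -s * R[k, k:] + c * h[k:]
        d[k], z = c * d[k] + s * z, -s * d[k] + c * z
    # The rotated z now holds a residual contribution and is discarded.

rng = np.random.default_rng(1)
t = np.linspace(0.0, 2.0, 50)
H = np.column_stack([np.ones_like(t), t, t ** 2])   # design-matrix rows
x_true = np.array([3.0, -1.0, 0.5])
z_all = H @ x_true + 0.01 * rng.normal(size=t.size)

R, d = np.zeros((3, 3)), np.zeros(3)
for h_row, z in zip(H, z_all):                      # process one measurement at a time
    givens_update(R, d, h_row, z)

x_hat = np.linalg.solve(R, d)                       # back-substitution step
x_batch, *_ = np.linalg.lstsq(H, z_all, rcond=None)
print(np.max(np.abs(x_hat - x_batch)))              # sequential = batch solution
```

The sequential estimate matches the batch least-squares solution to machine precision, which is the point of the square-root (QR) formulation: no normal equations are ever formed.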
Abstract:
The quantitative structure-activity relationship of a set of 19 flavonoid compounds presenting antioxidant activity was studied by means of PLS (Partial Least Squares) regression. The optimization of the structures and the calculation of electronic properties were carried out with the semiempirical method AM1. A reliable model (r² = 0.806 and q² = 0.730) was obtained, and from this model it was possible to relate some structural aspects of the flavonoid compounds studied to their free radical scavenging ability. The quality of the PLS model obtained in this work indicates that it can be used to design new flavonoid compounds with the ability to scavenge free radicals.
Abstract:
Practical methods for the land grading design of a plane surface for rectangular and irregularly shaped fields, based on a least squares analysis, are presented. The least squares procedure leads to a system of three linear equations in three unknowns for determining the best-fit plane. The equations can be solved by determinants (Cramer's rule) using a procedure simple enough for many programmable calculators. The detailed computational process for determining the equation of the plane and a simple method for finding the centroid of an irregular field are also given. An illustrative example and design instructions demonstrate the application of the design procedure.
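The described procedure can be sketched directly: fit the plane z = a + b·x + c·y by forming the 3×3 normal equations and solving them by determinants (Cramer's rule). The grid coordinates and elevations below are hypothetical, not the paper's worked example.

```python
import numpy as np

def fit_plane(x, y, z):
    """Best-fit plane z = a + b*x + c*y via normal equations and Cramer's rule."""
    n = len(x)
    # Normal-equation matrix M and right-hand side v for the unknowns (a, b, c).
    M = np.array([
        [n,         np.sum(x),     np.sum(y)],
        [np.sum(x), np.sum(x * x), np.sum(x * y)],
        [np.sum(y), np.sum(x * y), np.sum(y * y)],
    ])
    v = np.array([np.sum(z), np.sum(x * z), np.sum(y * z)])
    D = np.linalg.det(M)
    coeffs = []
    for k in range(3):            # Cramer's rule: replace column k by v
        Mk = M.copy()
        Mk[:, k] = v
        coeffs.append(np.linalg.det(Mk) / D)
    return coeffs                 # a, b, c

# Hypothetical 20 m x 10 m grid of field elevations lying on an exact plane,
# so the least-squares fit recovers the plane's coefficients.
x = np.array([0.0, 10.0, 20.0, 0.0, 10.0, 20.0])
y = np.array([0.0, 0.0, 0.0, 10.0, 10.0, 10.0])
z = 5.0 + 0.02 * x - 0.01 * y
a, b, c = fit_plane(x, y, z)
print(round(a, 3), round(b, 4), round(c, 4))   # 5.0 0.02 -0.01
```

With surveyed (noisy) elevations the same call returns the least-squares grading plane rather than an exact interpolant.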
Abstract:
In this work, simulations of incompressible fluid flows were performed with a Least Squares Finite Element Method (LSFEM) using velocity-pressure-vorticity and velocity-pressure-stress formulations, named the u-p-ω and u-p-τ formulations, respectively. These formulations are preferred because the resulting equations are first-order partial differential equations, which is convenient for an LSFEM implementation. The main purpose of this work is the numerical computation of laminar, transitional and turbulent fluid flows through the large eddy simulation (LES) methodology with the LSFEM. The Navier-Stokes equations in the u-p-ω and u-p-τ formulations are filtered, and the Smagorinsky eddy viscosity model is used to model the sub-grid-scale stresses. Some benchmark problems are solved to validate the numerical code, and preliminary results are presented and compared with available results from the literature.
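The core LSFEM idea, minimizing the squared residual of a first-order equation over a finite element space, can be illustrated far from the Navier-Stokes setting with a one-dimensional toy problem u'(x) = f(x), u(0) = 0, on piecewise-linear elements. Minimizing ||u_h' - f||² gives normal equations K u = b with K_ij = ∫ φ_i' φ_j' dx and b_i = ∫ f φ_i' dx; this is only a sketch of the variational principle, not the u-p-ω or u-p-τ code.

```python
import numpy as np

n = 64                             # number of elements on [0, 1]
h = 1.0 / n
x = np.linspace(0.0, 1.0, n + 1)
f = lambda t: np.cos(np.pi * t)    # manufactured right-hand side

# Assemble the least-squares system for the free nodes 1..n (u_0 = 0 fixed).
K = np.zeros((n, n))
b = np.zeros(n)
for e in range(n):                             # element-by-element assembly
    xm = 0.5 * (x[e] + x[e + 1])               # midpoint quadrature point
    ke = np.array([[1.0, -1.0], [-1.0, 1.0]]) / h   # ∫ phi_a' phi_c' on element
    be = np.array([-1.0, 1.0]) * f(xm)              # ∫ f phi_a' ≈ f(xm)(±1/h)h
    for a_ in range(2):
        i = e + a_ - 1                         # global free index (node - 1)
        if i < 0:
            continue                           # skip the Dirichlet node
        b[i] += be[a_]
        for c in range(2):
            j = e + c - 1
            if j >= 0:
                K[i, j] += ke[a_, c]

u = np.concatenate([[0.0], np.linalg.solve(K, b)])
exact = np.sin(np.pi * x) / np.pi              # exact antiderivative of f
err = np.max(np.abs(u - exact))
print(err)                                     # small discretization error
```

The symmetric positive-definite K is what makes the least-squares formulation attractive: even for nonsymmetric first-order operators, the discrete system stays symmetric.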
Abstract:
This paper presents a performance comparison of known propagation models tuned by a least squares algorithm for the 5.8 GHz frequency band. The studied environment comprises 12 cities in the Amazon region. After the adjustments and simulations, the SUI model showed the smallest RMS error and standard deviation when compared with the COST231-Hata and ECC-33 models.
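Least-squares tuning of a propagation model usually means adjusting a model's free parameters to minimize the squared error against measured path loss, then reporting the RMS error. A generic log-distance model (not the SUI, COST231-Hata or ECC-33 forms themselves) with synthetic "measurements" makes the step concrete:

```python
import numpy as np

rng = np.random.default_rng(3)
# Synthetic drive-test stand-in: measured path loss (dB) versus distance (m),
# generated from PL(d) = PL0 + 10*n*log10(d/d0) with d0 = 50 m plus shadowing.
d = np.linspace(50.0, 1000.0, 40)
pl_meas = 70.0 + 10 * 3.2 * np.log10(d / 50.0) + 2.0 * rng.normal(size=d.size)

# Least-squares tuning: solve for the intercept PL0 and path-loss exponent n.
A = np.column_stack([np.ones_like(d), 10 * np.log10(d / 50.0)])
(pl0, n_exp), *_ = np.linalg.lstsq(A, pl_meas, rcond=None)

resid = pl_meas - A @ np.array([pl0, n_exp])
rms = np.sqrt(np.mean(resid ** 2))
print(f"PL0 = {pl0:.1f} dB, n = {n_exp:.2f}, RMS error = {rms:.2f} dB")
```

Fitting each candidate model this way and comparing the resulting RMS errors is the comparison the abstract describes.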
Abstract:
Dimensionality reduction is employed in visual data analysis either to obtain reduced spaces for high-dimensional data or to map data directly into 2D or 3D spaces. Although techniques have evolved to improve data segregation in reduced or visual spaces, they have limited capabilities for adjusting the results according to the user's knowledge. In this paper, we propose a novel approach that handles both dimensionality reduction and visualization of high-dimensional data while taking the user's input into account. It employs Partial Least Squares (PLS), a statistical tool, to retrieve latent spaces that focus on the discriminability of the data. The method uses a training set to build a highly precise model that can then be applied very effectively to a much larger data set. The reduced data set can be exhibited using various existing visualization techniques. The training data is important for coding the user's knowledge into the loop; however, this work also devises a strategy for calculating PLS reduced spaces when no training data is available. The approach produces increasingly precise visual mappings as the user feeds back his or her knowledge, and it is capable of working with small and unbalanced training sets.
Abstract:
In this work we study a polyenergetic and multimaterial model for breast image reconstruction in digital tomosynthesis, taking into consideration the variety of materials forming the object and the polyenergetic nature of the X-ray beam. The modelling of the problem leads to a high-dimensional nonlinear least-squares problem that, being an ill-posed inverse problem, requires some form of regularization. We test two main classes of methods: the Levenberg-Marquardt method (together with the Conjugate Gradient method for computing the descent direction) and two limited-memory BFGS-like methods (L-BFGS). We perform experiments for different values of the regularization parameter (constant or varying at each iteration), tolerances and stopping conditions. Finally, we analyse the performance of the methods by comparing relative errors, numbers of iterations, run times and the quality of the reconstructed images.
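The shape of such a regularized nonlinear reconstruction can be sketched on a toy problem: recover x from measurements b = F(x) + noise, with Tikhonov regularization λ‖x‖² appended as extra residual rows and the problem handed to a Levenberg-Marquardt solver. The polyenergetic tomosynthesis forward model is replaced here by a simple Beer-Lambert-style attenuation operator, so this is an illustration of the problem class, not the authors' reconstruction code.

```python
import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(5)
# Toy forward model: measurements b_i = exp(-(A x)_i) + noise, 30 rays, 8 voxels.
A = rng.uniform(size=(30, 8))
x_true = rng.uniform(0.2, 1.0, size=8)
b = np.exp(-A @ x_true) + 0.0005 * rng.normal(size=30)

lam = 1e-3                                  # Tikhonov regularization weight

def residuals(x):
    # Data misfit plus sqrt(lam)*x rows, so the objective is
    # ||exp(-A x) - b||^2 + lam * ||x||^2.
    return np.concatenate([np.exp(-A @ x) - b, np.sqrt(lam) * x])

sol = least_squares(residuals, x0=np.full(8, 0.5), method="lm")
print(np.max(np.abs(sol.x - x_true)))       # small reconstruction error
```

Varying `lam` per iteration, or swapping the solver for an L-BFGS minimizer of the same objective, reproduces the kind of comparison the abstract describes.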
Abstract:
Advances in computational biology have made simultaneous monitoring of thousands of features possible. High-throughput technologies not only provide a much richer information context in which to study various aspects of gene function, but they also present the challenge of analyzing data with a large number of covariates and few samples. As an integral part of machine learning, classification of samples into two or more categories is almost always of interest to scientists. In this paper, we address classification in this setting by extending partial least squares (PLS), a popular dimension reduction tool in chemometrics, to the context of generalized linear regression, building on a previous approach, Iteratively ReWeighted Partial Least Squares (IRWPLS; Marx, 1996). We compare our results with two-stage PLS (Nguyen and Rocke, 2002a; 2002b) and other classifiers. We show that by phrasing the problem in a generalized linear model setting and by applying a bias correction to the likelihood to avoid (quasi-)separation, we often obtain lower classification error rates.