57 results for the least squares distance method

in Biblioteca Digital da Produção Intelectual da Universidade de São Paulo (BDPI/USP)


Relevance:

100.00%

Abstract:

Sea surface gradients derived from the Geosat and ERS-1 satellite altimetry geodetic missions were integrated with marine gravity data from the National Geophysical Data Center and Brazilian national surveys. Using the least squares collocation method, models of free-air gravity anomaly and geoid height were calculated for the coast of Brazil with a resolution of 2′ × 2′. The integration of satellite and shipborne data showed better statistical results in regions near the coast than using satellite data only, suggesting an improvement over the state-of-the-art global gravity models. Furthermore, these results were obtained with considerably less input information than was used by those reference models. The least squares collocation presented a very low content of high-frequency noise in the predicted gravity anomalies. This may be considered essential to improve the high-resolution representation of the gravity field in regions of ocean-continent transition. (C) 2010 Elsevier Ltd. All rights reserved.
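As a rough illustration of the least squares collocation predictor referred to above — not the paper's actual covariance models or data — the following sketch predicts a signal at a new point from noisy observations, assuming a hypothetical Gaussian covariance function with made-up parameters:

```python
import numpy as np

def gaussian_cov(x1, x2, variance=1.0, corr_len=0.2):
    # Assumed Gaussian covariance model between two 1-D point sets.
    d = x1[:, None] - x2[None, :]
    return variance * np.exp(-((d / corr_len) ** 2))

def collocate(x_obs, t_obs, x_new, noise_var=0.01):
    # Collocation prediction: s = C_st (C_tt + C_nn)^-1 t.
    C_tt = gaussian_cov(x_obs, x_obs) + noise_var * np.eye(len(x_obs))
    C_st = gaussian_cov(x_new, x_obs)
    return C_st @ np.linalg.solve(C_tt, t_obs)

x_obs = np.linspace(0.0, 1.0, 20)
t_obs = np.sin(2 * np.pi * x_obs)          # stand-in for gravity observations
pred = collocate(x_obs, t_obs, np.array([0.25]))
```

The noise covariance term regularizes the solve and is what damps high-frequency noise in the predicted field.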

Relevance:

100.00%

Abstract:

This research presents a method for frequency estimation in power systems using an adaptive filter based on the least mean squares (LMS) algorithm. In order to analyze a power system, three-phase voltages were converted into a complex signal by applying the alpha-beta transform, and the results were used in an adaptive filtering algorithm. Although the use of the complex LMS algorithm is described in the literature, this paper deals with some practical aspects of its implementation. In order to reduce computing time, a coefficient generator was implemented. For the algorithm validation, a computing simulation of a power system was carried out using the ATP software. Many different situations were simulated for the performance analysis of the proposed methodology. The results were compared to a commercial relay for validation, showing the advantages of the new method. (C) 2009 Elsevier Ltd. All rights reserved.
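A minimal sketch of the idea (the paper's coefficient generator and practical refinements are not modeled): for a balanced three-phase system, the alpha-beta (Clarke) transform yields a rotating complex phasor, and a one-tap complex LMS predictor converges to exp(j·2π·f/fs), so the frequency can be read off the angle of the coefficient. All signal parameters below are hypothetical:

```python
import numpy as np

def clarke(va, vb, vc):
    # Alpha-beta transform of balanced three-phase voltages -> complex phasor.
    alpha = (2 * va - vb - vc) / 3.0
    beta = (vb - vc) / np.sqrt(3.0)
    return alpha + 1j * beta

def lms_frequency(v, fs, mu=0.05):
    # One-tap complex LMS predictor: v[n] ~ w * v[n-1].
    w = 1.0 + 0j
    for n in range(1, len(v)):
        x = v[n - 1]
        e = v[n] - w * x
        w += mu * e * np.conj(x)
    return np.angle(w) * fs / (2 * np.pi)

fs, f = 1000.0, 60.0
theta = 2 * np.pi * f * np.arange(2000) / fs
va = np.cos(theta)
vb = np.cos(theta - 2 * np.pi / 3)
vc = np.cos(theta + 2 * np.pi / 3)
f_hat = lms_frequency(clarke(va, vb, vc), fs)
```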

Relevance:

100.00%

Abstract:

The representation of interfaces by means of the algebraic moving-least-squares (AMLS) technique is addressed. This technique, in which the interface is represented by an unconnected set of points, is interesting for evolving fluid interfaces since there is no surface connectivity. The position of the surface points can thus be updated without concerns about the quality of any surface triangulation. We introduce a novel AMLS technique especially designed for evolving-interface applications, which we denote RAMLS (for Robust AMLS). The main advantages with respect to previous AMLS techniques are increased robustness, computational efficiency, and freedom from user-tuned parameters. Further, we propose a new front-tracking method based on the Lagrangian advection of the unconnected point set that defines the RAMLS surface. We assume that a background Eulerian grid is defined with some grid spacing h. The advection of the point set makes the surface evolve in time. The point cloud can be regenerated at any time (in particular, we regenerate it each time step) by intersecting the gridlines with the evolved surface, which guarantees that the density of points on the surface is always well balanced. The intersection algorithm is essentially a ray-tracing algorithm, well studied in computer graphics, in which a line (ray) is traced so as to detect all intersections with a surface. Also, the tracing of each gridline is independent and can thus be performed in parallel. Several tests are reported assessing first the accuracy of the proposed RAMLS technique, and then of the front-tracking method based on it. Comparisons with previous Eulerian, Lagrangian and hybrid techniques encourage further development of the proposed method for fluid mechanics applications. (C) 2008 Elsevier Inc. All rights reserved.
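To give a flavor of the moving-least-squares machinery underneath AMLS — in one dimension, not the algebraic surface representation of the paper — this sketch evaluates a weighted local polynomial fit at a query point, with an assumed Gaussian weight and hypothetical parameters:

```python
import numpy as np

def mls_eval(x_eval, xs, ys, h=0.1, degree=2):
    # Weighted least-squares polynomial fit centered at x_eval;
    # the constant term is the MLS value at the query point.
    w = np.sqrt(np.exp(-((xs - x_eval) / h) ** 2))
    A = np.vander(xs - x_eval, degree + 1)
    coef, *_ = np.linalg.lstsq(A * w[:, None], ys * w, rcond=None)
    return coef[-1]

xs = np.linspace(0.0, 1.0, 50)
ys = np.sin(2 * np.pi * xs)    # stand-in for sampled interface data
val = mls_eval(0.25, xs, ys)
```

Because each evaluation is an independent local solve, the unconnected point set never needs a triangulation — the property the abstract exploits for evolving interfaces.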

Relevance:

100.00%

Abstract:

The aim of this study was to compare REML/BLUP and least squares procedures in the prediction and estimation of genetic parameters and breeding values in soybean progenies. F(2:3) and F(4:5) progenies were evaluated in the 2005/06 growing season, and the F(2:4) and F(4:6) generations derived thereof were evaluated in 2006/07. These progenies originated from two semi-early experimental lines that differ in grain yield. The experiments were conducted in a lattice design, and plots consisted of a 2 m row, spaced 0.5 m apart. Grain yield per plot was the evaluated trait. It was observed that early selection is more efficient for the discrimination of the best lines from the F(4) generation onwards. No practical differences were observed between the least squares and REML/BLUP procedures in the case of the models and simplifications for REML/BLUP used here.
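The BLUP side of such a comparison can be sketched with Henderson's mixed-model equations for y = Xb + Zu + e, assuming the variance ratio λ = σ²e/σ²u is known (the REML estimation step is omitted, and the toy data below are hypothetical, not the soybean trial):

```python
import numpy as np

def blup(X, Z, y, lam):
    # Henderson's mixed-model equations:
    # [X'X  X'Z      ] [b]   [X'y]
    # [Z'X  Z'Z + lam*I] [u] = [Z'y]
    q = Z.shape[1]
    lhs = np.block([[X.T @ X, X.T @ Z],
                    [Z.T @ X, Z.T @ Z + lam * np.eye(q)]])
    rhs = np.concatenate([X.T @ y, Z.T @ y])
    sol = np.linalg.solve(lhs, rhs)
    return sol[:X.shape[1]], sol[X.shape[1]:]

# Toy data: 3 progenies, 5 replicates each, group means 1, 2, 3.
y = np.repeat([1.0, 2.0, 3.0], 5)
X = np.ones((15, 1))                     # overall mean
Z = np.kron(np.eye(3), np.ones((5, 1)))  # progeny incidence matrix
b, u = blup(X, Z, y, lam=2.0)
```

Unlike plain least squares, the λI term shrinks the progeny effects toward zero, which is the practical difference the study set out to quantify.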

Relevance:

100.00%

Abstract:

Context. We present spectroscopic ground-based observations of the early Be star HD 49330 obtained simultaneously with the CoRoT-LRA1 run, just before the burst observed in the CoRoT data. Aims. Ground-based spectroscopic observations of the early Be star HD 49330 obtained during the precursor phase and just before the start of an outburst allow us to disentangle stellar and circumstellar contributions and identify modes of stellar pulsations in this rapidly rotating star. Methods. Time series analysis (TSA) is performed on photospheric line profiles of He I and Si III by means of the least squares method. Results. We find two main frequencies, f1 = 11.86 c d(-1) and f2 = 16.89 c d(-1), which can be associated with high-order p-mode pulsations. We also detect a frequency f3 = 1.51 c d(-1), which can be associated with a low-order g-mode. Moreover, we show that the stellar line profile variability changed over the spectroscopic run. These results are in agreement with the results of the CoRoT data analysis, as shown in Huat et al. (2009). Conclusions. Our study of mid- and short-term spectroscopic variability allows the identification of p- and g-modes in HD 49330. It also allows us to display changes in the line profile variability before the start of an outburst. This brings new constraints for the seismic modelling of this star.
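Least squares frequency analysis of this kind amounts to fitting sinusoids at trial frequencies to unevenly sampled data and picking the frequency that maximizes the fitted power. A minimal sketch with synthetic data (the frequency, baseline, and sampling below are hypothetical, not the HD 49330 observations):

```python
import numpy as np

def ls_power(t, y, freqs):
    # Least-squares sinusoid fit at each trial frequency; power = |amplitude|^2.
    y = y - y.mean()
    power = []
    for f in freqs:
        A = np.column_stack([np.cos(2 * np.pi * f * t),
                             np.sin(2 * np.pi * f * t)])
        coef, *_ = np.linalg.lstsq(A, y, rcond=None)
        power.append(float(coef @ coef))
    return np.array(power)

rng = np.random.default_rng(0)
t = np.sort(rng.uniform(0.0, 10.0, 200))   # uneven sampling times
y = np.sin(2 * np.pi * 3.0 * t)            # one injected frequency
freqs = np.linspace(1.0, 5.0, 401)
f_best = freqs[np.argmax(ls_power(t, y, freqs))]
```

Unlike an FFT, this works directly on irregular time sampling, which is why it suits ground-based spectroscopic runs.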

Relevance:

100.00%

Abstract:

We present a novel array RLS algorithm with forgetting factor that circumvents the problem of fading regularization, inherent to the standard exponentially weighted RLS, by allowing for time-varying regularization matrices with generic structure. Simulations in finite precision show the algorithm's superiority as compared to alternative algorithms in the context of adaptive beamforming.
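For contrast, here is the standard exponentially weighted RLS that the abstract says suffers from fading regularization — a minimal sketch, not the proposed array algorithm. Note how the initial regularization δ⁻¹I of the inverse correlation matrix P is progressively forgotten as λ < 1:

```python
import numpy as np

def rls(x, d, order=2, lam=0.99, delta=100.0):
    # Standard exponentially weighted RLS for FIR system identification.
    w = np.zeros(order)
    P = delta * np.eye(order)   # initial regularization, fades over time
    for n in range(order - 1, len(x)):
        u = x[n - order + 1:n + 1][::-1]      # regressor [x[n], x[n-1], ...]
        k = P @ u / (lam + u @ P @ u)         # gain vector
        w = w + k * (d[n] - w @ u)            # coefficient update
        P = (P - np.outer(k, u @ P)) / lam    # inverse-correlation update
    return w

rng = np.random.default_rng(1)
x = rng.standard_normal(500)
w_true = np.array([0.5, -0.3])                # unknown system (toy example)
d = np.convolve(x, w_true)[:len(x)]           # noiseless desired signal
w_hat = rls(x, d)
```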

Relevance:

100.00%

Abstract:

The general objective of this study was to evaluate the ordered weighted averaging (OWA) method, integrated with a geographic information system (GIS), in the definition of priority areas for forest conservation in a Brazilian river basin, aiming to increase the regional biodiversity. We demonstrated how a range of alternatives can be obtained by applying OWA, including the one given by the weighted linear combination method, and how the analytic hierarchy process (AHP) can be used to structure the decision problem and assign importance to each criterion. The criteria considered important to this study were: proximity to forest patches; proximity among forest patches with larger core area; proximity to surface water; distance from roads; distance from urban areas; and vulnerability to erosion. OWA requires two sets of criteria weights: the weights of relative criterion importance and the order weights. Thus, a participatory technique was used to define the criteria set and the criterion importance (based on AHP). In order to obtain the second set of weights, we considered the influence of each criterion, as well as the importance of each one, on this decision-making process. The sensitivity analysis indicated coherence among the criterion importance weights, the order weights, and the solution. According to this analysis, only the proximity to surface water criterion is not important to identify priority areas for forest conservation. Finally, we can highlight that the OWA method is flexible, easy to implement and, mainly, facilitates a better understanding of the alternative land-use suitability patterns. (C) 2008 Elsevier B.V. All rights reserved.
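The order-weight mechanism can be sketched with Yager's plain OWA operator (criterion importance weights, which in the GIS workflow are applied to the criterion maps beforehand, are omitted here; the suitability scores are hypothetical):

```python
import numpy as np

def owa(values, order_weights):
    # Yager's OWA: order weights are applied to the criterion values
    # after sorting them in descending order, not to specific criteria.
    v = np.sort(np.asarray(values, float))[::-1]
    return float(v @ order_weights)

suitability = [0.2, 0.8, 0.5]                 # per-criterion scores for one cell
risky = owa(suitability, [1.0, 0.0, 0.0])     # "OR-like": best score dominates
safe = owa(suitability, [0.0, 0.0, 1.0])      # "AND-like": worst score dominates
wlc = owa(suitability, [1/3, 1/3, 1/3])       # uniform order weights = plain average
```

Sweeping the order weights between these extremes is what produces the "range of alternatives" the abstract mentions, with the weighted linear combination as the neutral midpoint.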

Relevance:

100.00%

Abstract:

This paper describes a chemotaxonomic analysis of a database of triterpenoid compounds from the Celastraceae family using principal component analysis (PCA). The numbers of occurrences of thirty types of triterpene skeleton in different tribes of the family were used as variables. The study shows that PCA applied to chemical data can contribute to an intrafamilial classification of Celastraceae, since some questionable taxon affinities were observed; the chemotaxonomic inferences about genera are in agreement with the previously proposed phylogeny. The inclusion of Hippocrateaceae within Celastraceae is supported by the triterpene chemistry.
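PCA on such an occurrence matrix (tribes × skeleton types) can be sketched via the SVD of the centered data; the toy matrix below is hypothetical, standing in for the paper's thirty-variable database:

```python
import numpy as np

def pca(X, k=2):
    # PCA via SVD of the column-centered data matrix.
    Xc = X - X.mean(axis=0)
    U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    scores = U[:, :k] * s[:k]            # sample coordinates on the first k PCs
    explained = s ** 2 / np.sum(s ** 2)  # fraction of variance per component
    return scores, Vt[:k], explained

# Toy "occurrence" matrix: 8 tribes x 3 skeleton types, rank 1 by construction.
counts = np.outer(np.arange(1.0, 9.0), [1.0, 2.0, 3.0])
scores, loadings, explained = pca(counts)
```

Plotting the scores and inspecting the loadings is what lets one read taxon affinities off the chemical data.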

Relevance:

100.00%

Abstract:

State of Sao Paulo Research Foundation (FAPESP)

Relevance:

100.00%

Abstract:

This work presents, with the aid of the natural approach, an extension of the force density method for the initial shape finding of cable and membrane structures, which leads to the solution of a system of linear equations. This method, here called the natural force density method, preserves the linearity that characterizes the original force density method. At the same time, it overcomes the difficulties that the original procedure presents in coping with irregular triangular finite element meshes. Furthermore, if this method is applied iteratively along the lines prescribed here, it leads to a viable initial configuration with a uniform, isotropic plane Cauchy stress state. This means that a minimal surface for the membrane can be achieved through a succession of equilibrated configurations. Several numerical examples illustrate the simplicity and robustness of the method. (C) 2008 Elsevier B.V. All rights reserved.
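The linearity being preserved can be seen in the original force density method itself: once a force density q = (cable force)/(cable length) is prescribed per cable, node equilibrium Σ_b q_b (x_free − x_fixed,b) = p is linear in the free-node coordinates. A hypothetical toy net (not the paper's natural extension): four fixed corners, one free node, one cable to each corner, a vertical load:

```python
import numpy as np

# Fixed corner nodes of a unit square (z = 0).
xyz_fixed = np.array([[0.0, 0.0, 0.0],
                      [1.0, 0.0, 0.0],
                      [1.0, 1.0, 0.0],
                      [0.0, 1.0, 0.0]])
q = np.ones(4)                    # assumed force densities, one per cable
p = np.array([0.0, 0.0, -1.0])    # external load on the single free node

# Linear system for the free node: (sum q_b) x_free = p + sum q_b x_fixed_b.
D = np.array([[q.sum()]])
rhs = p + q @ xyz_fixed
xyz_free = np.linalg.solve(D, rhs[None, :])[0]
```

With more free nodes, D becomes the usual sparse force-density matrix, but the system stays linear — no iteration is needed for a single equilibrated configuration.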

Relevance:

100.00%

Abstract:

The Golgi method has been used for over a century to describe the general morphology of neurons in the nervous system of different species. The "single-section" Golgi method of Gabbott and Somogyi (1984) and the modifications made by Izzo et al. (1987) are able to produce consistent results. Here, we describe procedures to show cortical and subcortical neurons of human brains immersed in formalin for months or even years. The tissue was sliced with a vibratome, post-fixed in a combination of paraformaldehyde and picric acid in phosphate buffer, followed by osmium tetroxide and potassium dichromate, "sandwiched" between cover slips, and immersed in silver nitrate. The whole procedure takes between 5 and 11 days to achieve good results. The Golgi method has its characteristic pitfalls but, with this procedure, neurons and glia appear well impregnated, allowing qualitative and quantitative studies under light microscopy. This contribution adds to the basic techniques for the study of human nervous tissue, with the same advantages described for the "single-section" Golgi method in other species: it is easy and fast, requires minimal equipment, and provides consistent results. (C) 2010 Elsevier B.V. All rights reserved.

Relevance:

100.00%

Abstract:

Purpose: The purpose of this study was to evaluate the amount of dentifrice applied to the toothbrush by school children using a liquid dentifrice (drop technique), when compared to toothpaste. Materials and Methods: A total of 178 school children (4-8 years old) from two cities in Brazil (Bauru and Bariri) participated in the present two-part crossover study. Children from Bauru received training regarding tooth-brushing techniques and use of dentifrice before data collection. In each phase, the amount of toothpaste or liquid dentifrice applied by the children to the toothbrush was measured, using a portable analytical balance (+/- 0.01 g). Data were tested by analysis of covariance (Ancova) and linear regression (p < 0.05). Results: The mean (+/- standard deviation) amounts of toothpaste and liquid dentifrice applied to the toothbrushes for children from Bauru were 0.41 +/- 0.20 g and 0.15 +/- 0.06 g, respectively. For children from Bariri, the amounts applied were 0.48 +/- 0.24 g and 0.14 +/- 0.05 g, respectively. The amount of toothpaste applied was significantly larger than the amount of liquid dentifrice for both cities. Children from Bariri applied a significantly larger amount of toothpaste, when compared to those from Bauru. However, for the liquid dentifrice, there was no statistically significant difference between the cities. A significant correlation between the amount of toothpaste applied and the age of the children was verified, but the same was not found for the liquid dentifrice. Conclusion: The use of the drop technique reduced and standardised the amount of dentifrice applied to the toothbrush, which could reduce the risk of dental fluorosis for young children.

Relevance:

100.00%

Abstract:

The DNA Checkerboard method enables the simultaneous identification of distinct microorganisms in a large number of samples and employs up to 45 whole genomic DNA probes to gram-negative and gram-positive bacterial species present in subgingival biofilms. Collectively, these species account for 55%-60% of the bacteria in subgingival biofilms. In this study, we present DNA Checkerboard hybridization as an alternative method for the detection and quantitation of Candida species in oral cavities. Our results reveal that DNA Checkerboard is sufficiently sensitive and constitutes a powerful and appropriate method for detecting and quantifying Candida species found in the oral cavity.

Relevance:

100.00%

Abstract:

The objective of this study was to test a device developed to improve the functionality, accuracy and precision of the original technique for sweating rate measurements proposed by Schleger and Turner [Schleger AV, Turner HG (1965) Aust J Agric Res 16:92-106]. A device was built for this purpose and tested against the original Schleger and Turner technique. Testing was performed by measuring sweating rates in an experiment involving six Mertolenga heifers subjected to four different thermal levels in a climatic chamber. The device exhibited no functional problems and the results obtained with its use were more consistent than with the Schleger and Turner technique. There was no difference in the reproducibility of the two techniques (same accuracy), but measurements performed with the new device had lower repeatability, corresponding to lower variability and, consequently, to higher precision. When utilizing this device, there is no need for physical contact between the operator and the animal to maintain the filter paper discs in position. This has important advantages: the animals stay quieter, and several animals can be evaluated simultaneously. This is a major advantage because it allows more measurements to be taken in a given period of time, increasing the precision of the observations and diminishing the error associated with temporal hiatus (e.g., the solar angle during field studies). The new device has higher functional versatility when taking measurements in large-scale studies (many animals) under field conditions. The results obtained in this study suggest that the technique using the device presented here could represent an advantageous alternative to the original technique described by Schleger and Turner.

Relevance:

100.00%

Abstract:

Partition of Unity Implicits (PUI) have been recently introduced for surface reconstruction from point clouds. In this work, we propose a PUI method that employs a set of well-observed solutions in order to produce geometrically pleasing results without requiring time-consuming or mathematically overloaded computations. One feature of our technique is the use of multivariate orthogonal polynomials in the least-squares approximation, which allows the recursive refinement of the local fittings in terms of the degree of the polynomial. However, since the use of high-order approximations based only on the number of available points is not reliable, we introduce the concept of coverage domain. In addition, the method relies on the use of an algebraically defined triangulation to handle two important tasks in PUI: the spatial decomposition and an adaptive polygonization. As the spatial subdivision is based on tetrahedra, the generated mesh may present poorly shaped triangles, which are improved in this work by means of a specific vertex displacement technique. Furthermore, we also address sharp features and raw data treatment. A further contribution is based on the PUI locality property, which leads to an intuitive scheme for improving or repairing the surface by means of editing local functions.
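The partition-of-unity idea itself — local least-squares fits blended by compactly supported weights that sum to one — can be sketched in 1D (plain polynomials rather than the paper's multivariate orthogonal polynomials, and curve fitting rather than implicit surfaces; all parameters hypothetical):

```python
import numpy as np

def pou_fit(xs, ys, centers, radius, degree=2):
    # One local least-squares polynomial fit per subdomain center.
    fits = [(c, np.polyfit(xs[np.abs(xs - c) < radius] - c,
                           ys[np.abs(xs - c) < radius], degree))
            for c in centers]

    def evaluate(x):
        # Blend the local fits with normalized compactly supported weights.
        w_sum = f_sum = 0.0
        for c, coef in fits:
            d = abs(x - c)
            if d < radius:
                w = (1.0 - d / radius) ** 2
                w_sum += w
                f_sum += w * np.polyval(coef, x - c)
        return f_sum / w_sum

    return evaluate

xs = np.linspace(0.0, 1.0, 100)
ys = xs ** 2                      # stand-in for sampled data
f = pou_fit(xs, ys, centers=np.linspace(0.0, 1.0, 6), radius=0.25)
```

Because the blend is a convex combination of local fits, editing one local function changes the result only inside that function's support — the locality property the abstract exploits for surface repair.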