45 results for Interpolation accuracy
at Universidad Politécnica de Madrid
Abstract:
This Doctoral Thesis deals with the application of meshless methods to eigenvalue problems, particularly free vibrations and buckling. The analysis focuses on aspects such as the numerical solution of the eigenvalue problem with these methods, the computational cost, and the feasibility of using non-consistent mass or geometric stiffness matrices. Furthermore, the error is analysed in detail, with the aim of identifying its main sources and obtaining the key factors that enable faster convergence. Although a wide variety of apparently independent meshless methods can currently be found in the literature, the relationships among them have been analysed; the outcome of this assessment is that those methods can be grouped into a limited number of categories and that the Element-Free Galerkin Method (EFGM) is representative of the most important one. Therefore, the EFGM has been selected as the reference for the numerical analyses. Many of the error sources of a meshless method stem from its interpolation/approximation algorithm. In the EFGM this algorithm is known as Moving Least Squares (MLS), a particular case of the Generalized Moving Least Squares (GMLS). The accuracy of the MLS depends on the following factors: the order of the polynomial basis p(x), the features of the weight function w(x), and the shape and size of the support domain of that weight function. The individual contribution of each of these factors, along with the interactions among them, has been studied in both regular and irregular arrangements of nodes, by reducing each contribution to a single quantifiable parameter. This assessment is applied to a range of one- and two-dimensional structural benchmark cases, and considers the error not only in terms of eigenvalues (natural frequencies or buckling loads, as appropriate), but also in terms of eigenvectors.
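To make the three accuracy factors named above concrete, the following minimal sketch builds 1-D MLS shape functions from a polynomial basis p(x), a weight function w(x) (here a cubic spline, one common choice) and a support radius. The function names and parameter values are illustrative assumptions, not the implementation used in the thesis.

import numpy as np

def cubic_spline_weight(r):
    """Cubic-spline weight, a common choice of w(x) in MLS; r = |x - x_i| / support."""
    w = np.zeros_like(r)
    m1 = r <= 0.5
    m2 = (r > 0.5) & (r <= 1.0)
    w[m1] = 2.0 / 3.0 - 4.0 * r[m1] ** 2 + 4.0 * r[m1] ** 3
    w[m2] = 4.0 / 3.0 - 4.0 * r[m2] + 4.0 * r[m2] ** 2 - (4.0 / 3.0) * r[m2] ** 3
    return w

def mls_shape_functions(x, nodes, support, order=1):
    """MLS shape functions phi_i(x) for 1-D nodes with a given support radius."""
    r = np.abs(x - nodes) / support
    w = cubic_spline_weight(r)                          # weight of each node at x
    P = np.vander(nodes, order + 1, increasing=True)    # polynomial basis p(x_i)
    px = np.vander(np.atleast_1d(x), order + 1, increasing=True)[0]
    A = (P * w[:, None]).T @ P                          # moment matrix A(x)
    B = (P * w[:, None]).T                              # B(x)
    return px @ np.linalg.solve(A, B)                   # phi_i(x)

# Usage: approximate u(x) = sin(x) on a regular nodal distribution.
nodes = np.linspace(0.0, np.pi, 11)
u = np.sin(nodes)
phi = mls_shape_functions(1.3, nodes, support=0.7, order=2)
print(phi.sum(), phi @ u, np.sin(1.3))  # partition of unity, MLS value, exact value

Increasing the basis order, changing the weight function, or enlarging the support radius changes the approximation error, which is the kind of parameter study the thesis reports.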
Abstract:
The classical Kramer sampling theorem, which provides a method for obtaining orthogonal sampling formulas, can be formulated in a more general nonorthogonal setting. In this setting, a challenging problem is to characterize the situations when the obtained nonorthogonal sampling formulas can be expressed as Lagrange-type interpolation series. In this article a necessary and sufficient condition is given in terms of the zero removing property. Roughly speaking, this property concerns the stability of the sampled functions on removing a finite number of their zeros.
Abstract:
The classical Kramer sampling theorem provides a method for obtaining orthogonal sampling formulas. In particular, when the involved kernel is analytic in the sampling parameter it can be stated in an abstract setting of reproducing kernel Hilbert spaces of entire functions which includes as a particular case the classical Shannon sampling theory. This abstract setting allows us to obtain a sort of converse result and to characterize when the sampling formula associated with an analytic Kramer kernel can be expressed as a Lagrange-type interpolation series. On the other hand, the de Branges spaces of entire functions satisfy orthogonal sampling formulas which can be written as Lagrange-type interpolation series. In this work some links between all these ideas are established.
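For reference, a Lagrange-type interpolation series of the kind discussed in the two abstracts above can be written, in a standard textbook form (not quoted from the cited works), in terms of an entire function G having only simple zeros {t_n}:

\[ f(t) \;=\; \sum_{n} f(t_n)\,\frac{G(t)}{G'(t_n)\,(t - t_n)}, \qquad G(t) = \frac{\sin \pi t}{\pi} \;\Rightarrow\; f(t) = \sum_{n \in \mathbb{Z}} f(n)\,\frac{\sin \pi (t - n)}{\pi (t - n)}, \]

the second expression being the classical Shannon sampling series recovered as a particular case.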
Abstract:
There exists an interest in performing pin-by-pin calculations coupled with thermal hydraulics so as to improve the accuracy of nuclear reactor analysis. In the framework of the EU NURISP project, INRNE and UPM have generated an experimental version of a few-group diffusion cross-section library with discontinuity factors intended for VVER analysis at the pin level with the COBAYA3 code. The transport code APOLLO2 was used to perform the branching calculations. As a first proof of principle, the library was created for fresh fuel and covers almost the full parameter space of steady-state and transient conditions. The main objective is to test the calculation schemes and post-processing procedures, including multi-pin branching calculations. Two library options are being studied: one based on linear table interpolation and another using a functional fitting of the cross sections. The libraries generated with APOLLO2 have been tested with the pin-by-pin diffusion model in COBAYA3 including discontinuity factors; first comparing 2D results against the APOLLO2 reference solutions, and afterwards using the libraries to compute a 3D assembly problem coupled with a simplified thermal-hydraulic model.
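As an illustration of the first library option (linear table interpolation), the sketch below interpolates a tabulated cross section over a two-parameter state grid; the grid variables, values and names are invented for the example and do not reproduce the APOLLO2/COBAYA3 library format.

import numpy as np
from scipy.interpolate import RegularGridInterpolator

# Illustrative state-parameter grid (values are made up, not APOLLO2 output):
fuel_temp = np.array([560.0, 900.0, 1200.0])     # fuel temperature, K
mod_density = np.array([0.65, 0.70, 0.75])       # moderator density, g/cm^3
# Tabulated macroscopic absorption cross section on the (T_fuel, rho_mod) grid.
sigma_a = np.array([[0.0101, 0.0105, 0.0109],
                    [0.0098, 0.0102, 0.0106],
                    [0.0096, 0.0100, 0.0104]])

# Linear table interpolation between branching-calculation points.
interp = RegularGridInterpolator((fuel_temp, mod_density), sigma_a, method="linear")
print(interp([[1000.0, 0.72]]))   # cross section at an intermediate state point

The functional-fitting option mentioned in the abstract would instead fit a closed-form expression in the state variables to the same tabulated data.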
Abstract:
This paper discusses the target localization problem in wireless visual sensor networks. Specifically, each node with a low-resolution camera extracts multiple feature points to represent the target at the sensor-node level. A statistical method is presented for merging the position information from different sensor nodes in order to select the most correlated feature-point pair at the base station. This method reduces the influence of the accuracy of target extraction on the accuracy of target localization in the universal coordinate system. Simulations show that, compared with related approaches, the proposed method achieves better target-localization accuracy and a better trade-off between camera-node usage and localization accuracy.
Abstract:
The aim of this paper is to discuss the influence of the selection of the interpolation kernel on the accuracy of the modeling of the internal viscous dissipation in free-surface flows. Simulations corresponding to a standing wave, for which an analytic solution is available, are presented. Wendland and renormalized Gaussian kernels are considered. The differences in the flow patterns and internal dissipation mechanisms are documented for a range of Reynolds numbers. It is shown that the simulations with Wendland kernels replicate the dissipation mechanisms more accurately than those with a renormalized Gaussian kernel. Although some explanations are hinted at, we have failed to clarify what the core structural reasons for such differences are.
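For readers unfamiliar with the two kernels compared in the paper, the following sketch evaluates a 2-D Wendland C2 kernel and a truncated, renormalized Gaussian kernel in commonly used SPH forms; the particular normalizations and names are assumptions of this example, not taken from the paper.

import numpy as np

def wendland_c2_2d(r, h):
    """Wendland C2 kernel in 2-D with support radius 2h (a standard SPH form)."""
    q = r / h
    w = (7.0 / (4.0 * np.pi * h**2)) * (1.0 - q / 2.0)**4 * (1.0 + 2.0 * q)
    return np.where(q < 2.0, w, 0.0)

def renormalized_gaussian_2d(r, h, cutoff=3.0):
    """Gaussian kernel truncated at cutoff*h and renormalized to unit integral in 2-D."""
    q = r / h
    # Closed-form normalization over the truncated support.
    norm = np.pi * h**2 * (1.0 - np.exp(-cutoff**2))
    return np.where(q < cutoff, np.exp(-q**2) / norm, 0.0)

# Both kernels integrate to ~1 over the plane; their shapes (and hence the computed
# velocity gradients and viscous dissipation) differ, which is what the paper studies.
r = np.linspace(0.0, 3.0, 4)
print(wendland_c2_2d(r, h=1.0))
print(renormalized_gaussian_2d(r, h=1.0))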
Abstract:
The accuracy of Tomás López's historical cartography of the Canary Islands, included in the “Atlas Particular” of the Kingdoms of Spain, Portugal and Adjacent Islands, is analyzed. For this purpose, we propose a methodology based on Geographic Information Systems (GIS) that compares population centres digitized from the historical cartography with current ones. This study shows that the linear error value is small for the smaller islands: Lanzarote, El Hierro, La Palma and La Gomera. In the larger islands of Tenerife, Fuerteventura and Gran Canaria, the error is smaller in the central zones but increases towards the coast. This indicates that Tomás López began his cartography from the central zones of each island, accumulating errors due to the lack of geodetic references as he moved toward the coast.
Abstract:
This Doctoral Thesis addresses the study and analysis of techniques and models for obtaining biophysical parameters and environmental indicators in an automated way from high temporal resolution satellite imagery. Firstly, the main Earth observation programmes are reviewed, with special attention to those providing such resolution. The methodologies and processes that yield quantitative parameters and qualitative products relating to various aspects of land cover are also reviewed, according to their adaptability to the peculiarities of the data. Secondly, a model for obtaining environmental parameters is proposed, which integrates information from space sensors and other ancillary sources, using to some extent the methodologies presented in previous sections and optimizing some of them or proposing new ones, so that the parameters can be obtained efficiently and systematically from the available data. After this review of methodologies and the proposal of the model, experiments were carried out in order to check its behaviour in different practical cases, refine the data and processing flows, and establish the situations that may affect the results; the evaluation of the model follows from all of this. The sensors considered in this work are MODIS, of high temporal resolution, and Thematic Mapper (TM), of medium spatial resolution, because they are reference instruments in environmental studies and because the duration of their data-recording missions allows the temporal evolution of certain biophysical parameters to be studied over long periods; moreover, the continuity of the corresponding programmes seems assured. Among the experiments carried out, a methodology for integrating data from both sensors was tested. A temporal interpolation method was also analysed that produces synthetic images with the spatial resolution of TM (30 m) and the temporal resolution of MODIS (1 day), extending the application range of the latter sensor. Likewise, some of the factors that affect the recorded data were analysed, such as the acquisition geometry and rainfall events, which alter the results obtained. The validity of the proposed model was also verified in the study of dynamic environmental phenomena, specifically the organic contamination of reservoir waters. Finally, the model showed good performance in all the cases tested, as well as flexibility, which allows it to adapt to new data sources or new calculation methodologies.
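A heavily simplified illustration of the idea behind the temporal interpolation experiment, blending the 30 m spatial detail of TM with the daily revisit of MODIS, is sketched below; it merely adds the coarse temporal change observed by MODIS to a fine TM image, and it is not the method developed in the thesis (scale factor, array sizes and names are illustrative assumptions).

import numpy as np

def predict_fine_image(tm_t0, modis_t0, modis_t1, scale=16):
    """Predict a fine-resolution image at date t1 by adding the coarse (MODIS)
    change between t0 and t1 to the fine (TM) image observed at t0."""
    # Upsample the coarse change to the fine grid by nearest-neighbour replication.
    delta = np.repeat(np.repeat(modis_t1 - modis_t0, scale, axis=0), scale, axis=1)
    return tm_t0 + delta

# Synthetic example: a 32x32 "TM" tile and the matching 2x2 "MODIS" pixels.
tm_t0 = np.random.rand(32, 32)
modis_t0 = tm_t0.reshape(2, 16, 2, 16).mean(axis=(1, 3))   # coarse view at t0
modis_t1 = modis_t0 + 0.05                                  # uniform change by t1
tm_t1_pred = predict_fine_image(tm_t0, modis_t0, modis_t1, scale=16)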
Abstract:
The singularities which arise when there is a sudden change of boundary conditions are modelled using spectral shape interpolation functions. The procedure can be used for elasticity as well as potential theory and to any degree of accuracy with respect to the smooth part of the curve.
Abstract:
Development of an interpolation algorithm based on octree decomposition and compactly supported radial basis functions for mesh motion in aeroelastic problems.
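A minimal sketch of the interpolation step of such an algorithm is given below: radial-basis-function mesh motion with a compactly supported Wendland C2 function, where the prescribed boundary displacements determine the RBF weights and the interior nodes are then moved. The octree decomposition used to localize the problem is omitted, and all names and values are illustrative assumptions.

import numpy as np

def wendland_c2(r, radius):
    """Compactly supported Wendland C2 radial function, zero for r >= radius."""
    q = np.clip(r / radius, 0.0, 1.0)
    return (1.0 - q)**4 * (4.0 * q + 1.0)

def rbf_mesh_motion(boundary_pts, boundary_disp, volume_pts, radius):
    """Fit RBF weights to boundary displacements, then move the volume nodes.
    With a compact support the interpolation matrix is sparse in practice."""
    d = np.linalg.norm(boundary_pts[:, None, :] - boundary_pts[None, :, :], axis=-1)
    Phi = wendland_c2(d, radius)                      # boundary-boundary interpolation matrix
    weights = np.linalg.solve(Phi, boundary_disp)     # RBF coefficients per coordinate
    d_vol = np.linalg.norm(volume_pts[:, None, :] - boundary_pts[None, :, :], axis=-1)
    return volume_pts + wendland_c2(d_vol, radius) @ weights

# Toy usage: lift the upper boundary of a small 2-D point cloud.
boundary = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
disp = np.array([[0.0, 0.0], [0.0, 0.0], [0.0, 0.1], [0.0, 0.1]])
interior = np.array([[0.5, 0.5], [0.25, 0.75]])
print(rbf_mesh_motion(boundary, disp, interior, radius=2.0))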
Abstract:
The aim of this study was to examine the effect of positioning on the correctness of decision making of top-class referees and assistant referees during international games. Match analyses were carried out during the Fédération Internationale de Football Association (FIFA) Confederations Cup 2009, and 380 foul play incidents and 165 offside situations were examined. The error percentage for the referees when indicating the incidents averaged 14%. The lowest error percentage occurred in the central area of the field, where the collaboration of the assistant referee is limited, and was achieved when indicating the incidents from a distance of 11–15 m, whereas this percentage peaked (23%) in the last 15-min match period. The error rate for the assistant referees was 13%. The distance of the assistant referee to the offside line did not have an impact on the quality of the offside decision. The risk of making incorrect decisions was reduced when the assistant referees viewed the offside situations from an angle between 46° and 60°. Incorrect offside decisions occurred twice as often in the second half as in the first half of the games. Perceptual-cognitive training sessions specific to the requirements of the game should be implemented in the weekly schedule of football officials to reduce the overall error rate.
Abstract:
In pressurized irrigation-water distribution networks, pressure regulating devices are needed to control the flow rate discharged by the irrigation units, owing to the variability of the flow rate. In addition, the applied water volume is usually controlled by operating the valve during a calculated time interval, assuming a constant flow rate. In general, a pressure regulating valve (PRV) is the pressure regulating device commonly used in a hydrant, and it also performs the open and close function. A hydrant feeds several irrigation units, which require a wide range of flow rates. Some flow meters are also available: one as a component of the hydrant and the rest placed downstream, each landowner having one flow meter per group of field plots downstream of the hydrant. Their readings could be used to refine the water balance, but their accuracy must be taken into account. An ideal PRV would maintain a constant downstream pressure; however, its true performance depends on both the upstream pressure and the discharged flow rate. The objective of this work is to assess the influence of PRV performance on the volume applied over the irrigation events of a whole year. The results of the study have been obtained by introducing the flow rate into a PRV model. Variations in flow rate are simulated by taking into account the consequences of variations in climate conditions and also decisions in irrigation operation, such as application duration and frequency. The model comprises the continuity, dynamic and energy equations of the components of the PRV.
Abstract:
The interpolation of points by means of information technology programs is a technical tool of some relevance in hydrogeology in general and in the study of wetlands in particular. Our aim has been the determination of the 3-D geometry of the deepest wetlands of the Rabasa Lakes. To estimate the topography of the lake bed, information was acquired in the field by means of sonar and GPS equipment. A total of 335 points were measured, both on the perimeter and in the lake bed. In a second stage, this information was used in a kriging program to obtain the bathymetry of the wetland. This methodology proves to be one of the most reliable and cost-efficient for the 3-D analysis of this type of water body. The bathymetric study of the zone allows us to characterize the mid- and long-term hydrological evolution of the lakes by means of depth-area-volume curves.
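As an illustration of the last step mentioned in the abstract, the sketch below derives a depth-area-volume curve from a gridded (for example, kriged) bathymetry; the grid, cell size and lake shape are synthetic assumptions, not the Rabasa Lakes data.

import numpy as np

def depth_area_volume(depth_grid, cell_size, n_levels=20):
    """Depth-area-volume curve from a gridded bathymetry.
    depth_grid: water depth (m) at the reference water level, NaN outside the lake.
    cell_size:  grid spacing (m) of the interpolated surface."""
    max_depth = np.nanmax(depth_grid)
    levels = np.linspace(0.0, max_depth, n_levels)         # drawdown below reference level
    curve = []
    for drop in levels:
        wet = depth_grid > drop                            # cells still flooded
        area = np.count_nonzero(wet) * cell_size**2        # m^2
        volume = np.sum(np.where(wet, depth_grid - drop, 0.0)) * cell_size**2  # m^3
        curve.append((max_depth - drop, area, volume))     # remaining depth, area, volume
    return curve

# Toy bathymetry: a paraboloid-shaped lake bed on a 50 m x 50 m grid of 1 m cells.
y, x = np.mgrid[0:50, 0:50]
depth = 3.0 - ((x - 25)**2 + (y - 25)**2) / 250.0
depth[depth < 0] = np.nan                                   # dry land outside the shoreline
for d, a, v in depth_area_volume(depth, cell_size=1.0, n_levels=5):
    print(f"depth {d:4.1f} m  area {a:7.0f} m2  volume {v:8.0f} m3")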
Abstract:
A methodology is established for evaluating the cartography of GIS layers.
Abstract:
Novel formulas are proposed for evaluating the accuracy of the cartography.