995 results for INTERPOLATION METHODS


Relevance:

60.00%

Publisher:

Abstract:

This study examines the use of different features derived from remotely sensed data in the segmentation of forest stands. Surface interpolation methods were applied to LiDAR points in order to represent the data in the form of grayscale images. Median and mean shift filtering were applied to the data for noise reduction. The ability of different compositions of rasters obtained from LiDAR data and an aerial image to maximize stand homogeneity in the segmentation was evaluated. The quality of the forest stand delineations was assessed by the Akaike information criterion. The research was performed in co-operation with Arbonaut Ltd., Joensuu, Finland.
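As a minimal illustration of the preprocessing described above (a sketch with hypothetical points, not the study's implementation), scattered LiDAR returns can be gridded into a grayscale raster with SciPy and then median-filtered for noise reduction:

import numpy as np
from scipy.interpolate import griddata
from scipy.ndimage import median_filter

def points_to_raster(points, cell_size=1.0):
    # Interpolate scattered (x, y, z) points onto a regular grid;
    # 'nearest' keeps the sketch simple, 'linear'/'cubic' are alternatives.
    x, y, z = points[:, 0], points[:, 1], points[:, 2]
    xi = np.arange(x.min(), x.max(), cell_size)
    yi = np.arange(y.min(), y.max(), cell_size)
    gx, gy = np.meshgrid(xi, yi)
    return griddata((x, y), z, (gx, gy), method='nearest')

points = np.random.rand(1000, 3) * 100          # stand-in for LiDAR returns
raster = points_to_raster(points)               # grayscale height image
smoothed = median_filter(raster, size=3)        # noise-reduction step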

Relevance:

60.00%

Publisher:

Abstract:

The super-resolution problem is an inverse problem and refers to the process of producing a high-resolution (HR) image from one or more low-resolution (LR) observations. It includes upsampling the image, thereby increasing the maximum spatial frequency, and removing degradations that arise during image capture, namely aliasing and blurring. The work presented in this thesis is based on learning-based single-image super-resolution. In learning-based super-resolution algorithms, a training set or database of available HR images is used to construct the HR image of an image captured with an LR camera. In the training set, images are stored as patches or as coefficients of feature representations such as the wavelet transform, DCT, etc. Single-frame image super-resolution can be used in applications where a database of HR images is available. The advantage of this method is that, by skilfully creating a database of suitable training images, one can improve the quality of the super-resolved image. A new super-resolution method based on the wavelet transform is developed, and it outperforms conventional wavelet-transform-based methods and standard interpolation methods. Super-resolution techniques based on a skewed anisotropic transform called the directionlet transform are developed to convert a small low-resolution image into a large high-resolution image. The super-resolution algorithm not only increases the size but also reduces the degradations that occur during image capture. This method outperforms the standard interpolation methods and the wavelet methods, both visually and in terms of SNR values. Artifacts such as aliasing and ringing effects are also eliminated by this method. The super-resolution methods are implemented using both critically sampled and oversampled directionlets. The conventional directionlet transform is computationally complex; hence a lifting scheme is used for the implementation of directionlets. The new single-image super-resolution method based on the lifting scheme reduces computational complexity and thereby reduces computation time. The quality of the super-resolved image depends on the type of wavelet basis used, and a study is conducted to find the effect of different wavelets on the single-image super-resolution method. Finally, this new method, implemented on grey images, is extended to colour images and noisy images.
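As a baseline for the wavelet-domain approach (a textbook sketch, not the thesis method), zero-padding super-resolution treats the LR image as the approximation (LL) subband, sets the unknown detail subbands to zero, and applies the inverse 2-D DWT; the wavelet argument is where the choice of basis, studied in the thesis, enters:

import numpy as np
import pywt

def wavelet_zero_padding_sr(lr_image, wavelet='haar'):
    # Treat the LR image as the LL subband and zero the detail subbands;
    # the inverse 2-D DWT then doubles each dimension ('haar' doubles exactly).
    zeros = np.zeros_like(lr_image, dtype=float)
    return pywt.idwt2((lr_image.astype(float), (zeros, zeros, zeros)), wavelet)

lr = np.random.rand(64, 64)        # stand-in for a low-resolution capture
hr = wavelet_zero_padding_sr(lr)   # 128 x 128 result with 'haar'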

Relevance:

60.00%

Publisher:

Abstract:

The Shuttle Radar Topography Mission (SRTM) was flown on the space shuttle Endeavour in February 2000, with the objective of acquiring a digital elevation model of all land between 60 degrees north latitude and 56 degrees south latitude, using interferometric synthetic aperture radar (InSAR) techniques. The SRTM data are distributed at a horizontal resolution of 1 arc-second (~30 m) for areas within the USA and at 3 arc-second (~90 m) resolution for the rest of the world. A resolution of 90 m can be considered suitable for small- or medium-scale analysis, but it is too coarse for more detailed purposes. One alternative is to interpolate the SRTM data at a finer resolution; this will not increase the level of detail of the original digital elevation model (DEM), but it will lead to a surface where the angular properties (i.e. slope, aspect) of neighbouring pixels are coherent, an important characteristic in terrain analysis. This work intends to show how variogram and kriging parameters, namely the nugget effect and the maximum distance within which values are used in the interpolation, can be properly adjusted to achieve quality results when resampling SRTM data from 3" to 1". We present results for a test area in western USA, including different adjustment schemes (changes in the nugget effect value and in the interpolation radius) and comparisons with the original 1" model of the area, with the National Elevation Dataset (NED) DEMs, and with other interpolation methods (splines and inverse distance weighting (IDW)). The basic concepts for using kriging to resample terrain data are: (i) working only with the immediate neighbourhood of the predicted point, owing to the high spatial correlation of the topographic surface and the omnidirectional behaviour of the variogram at short distances; (ii) adding a very small random variation to the coordinates of the points prior to interpolation, to avoid punctual artifacts generated by predicted points having the same location as original data points; and (iii) using a small nugget effect value, to avoid smoothing that can obliterate terrain features. Drainages derived from the surfaces interpolated by kriging and by splines agree well with streams derived from the 1" NED, with correct identification of watersheds, even though a few differences occur in the positions of some rivers in flat areas. Although the 1" surfaces resampled by kriging and splines are very similar, we consider the results produced by kriging superior, since the spline-interpolated surface still presented some noise and linear artifacts, which were removed by kriging.
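A minimal sketch of that three-point recipe using the pykrige package (all numbers are illustrative assumptions, not the paper's settings; pykrige's n_closest_points option requires the loop backend):

import numpy as np
from pykrige.ok import OrdinaryKriging

rng = np.random.default_rng(0)
x = rng.uniform(0, 1000, 500)                   # stand-in 3" SRTM posts
y = rng.uniform(0, 1000, 500)
z = 100 + 0.05 * x + 5 * np.sin(y / 50)

# (ii) tiny coordinate jitter so no prediction point coincides with a datum
x_j = x + rng.uniform(-0.01, 0.01, x.size)
y_j = y + rng.uniform(-0.01, 0.01, y.size)

# (iii) small nugget so interpolation does not smooth away terrain features
ok = OrdinaryKriging(x_j, y_j, z, variogram_model='spherical',
                     variogram_parameters={'sill': 30.0, 'range': 200.0,
                                           'nugget': 0.1})

gx = np.linspace(0, 1000, 100)
gy = np.linspace(0, 1000, 100)
# (i) local neighbourhood: only the nearest samples enter each estimate
z_fine, krig_var = ok.execute('grid', gx, gy,
                              backend='loop', n_closest_points=16)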

Relevance:

60.00%

Publisher:

Abstract:

The current work was developed on the dune systems of the Parque das Dunas and Barreira do Inferno, located in the municipalities of Natal and Parnamirim (RN, Brazil), respectively. The project had the purpose of developing a deterministic model of a specific blowout at Parque das Dunas, based on the geophysical interpretation of lines acquired with Ground Penetrating Radar (GPR) and on planialtimetric surveys of the topographic surface of the terrain. The vulnerability/susceptibility of these dune systems to human pressures was also analysed. Developing the deterministic model requires acquiring the inner and outer geometries of the blowout. The inner geometries beneath the surface are depicted with the GPR, with altimetric control used for topographic correction of the GPR lines. For the outer geometries, a geodesic GPS provides the planialtimetric points (x, y and z) with millimetric precision, resulting in high-resolution surfaces. By interpolating the planialtimetric points it was possible to create Digital Elevation Models (DEMs) of these surfaces. In total, 1,161.4 m of GPR lines were acquired on the blowout at Parque das Dunas and 3,735.27 m on the blowout at Barreira do Inferno. The lines were acquired with a 200 MHz antenna, except lines 7 and 8, for which a 100 MHz antenna was used. The acquired data were processed and interpreted, making it possible to identify boundary surfaces of first, second and third order. The first-order boundary surface is related to the contact between the rocks of the Barreiras Group and the aeolian deposits. These deposits were divided into two groups (Group 1 and Group 2) according to the geometry of the strata and the dip of their stratifications. Group 1 presented strata with sigmoidal and irregular geometries, involving bodies whose reflectors dipped from 20 to 28 degrees at the Parque das Dunas blowout and from 22 to 29 degrees at the Barreira do Inferno blowout; it is usually bounded at the base by the first-order surface and at the top by the second-order surface. Group 2 presented strata with trough, wedge or lens geometries, bounded at the base by the second-order surface, and its deposits showed smoother reflectors with low-angle dips. The deterministic and digital elevation models were developed from the integration and interpretation of the 2D data with the GOCAD® program. In the Digital Elevation Models it was possible to recognize, at both localities, corridor- or trough-shaped blowouts; in the deterministic model it was possible to see the first- and second-order boundary surfaces. For the vulnerability/susceptibility of the dune systems, the methodology proposed by Bodéré et al. (1991) was applied; however, it proved inadequate because it evaluates present-day coastal dunes, i.e. dunes in equilibrium with current environmental conditions. Therefore, a new methodology was proposed, which characterizes sediment supply and activity as well as human pressures. Under the methodology developed in this work, both localities showed good management. The Parque das Dunas was characterized as a relict dune system and the Barreira do Inferno as a palimpsest dune system. Two thematic maps were also elaborated for the environmental characterization of the studied dune systems, with the ArcGIS 8.3 software, together with their respective databases.
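As an illustration of the DEM-building step (a sketch with stand-in GPS points, not the study's workflow), the planialtimetric points can be triangulated and linearly interpolated onto a regular grid:

import numpy as np
from scipy.interpolate import LinearNDInterpolator

rng = np.random.default_rng(1)
xyz = np.column_stack([rng.uniform(0, 500, 2000),   # hypothetical GPS points
                       rng.uniform(0, 500, 2000),
                       rng.uniform(0, 30, 2000)])

tin = LinearNDInterpolator(xyz[:, :2], xyz[:, 2])   # Delaunay triangulation
gx, gy = np.meshgrid(np.arange(0, 500, 1.0), np.arange(0, 500, 1.0))
dem = tin(gx, gy)          # 1 m DEM; NaN outside the convex hull of the data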

Relevance:

60.00%

Publisher:

Abstract:

Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)

Relevance:

60.00%

Publisher:

Abstract:

The application of agricultural fertilizers at variable rates across the field can be made through previously elaborated fertility maps or through real-time sensors. In most cases, previously elaborated maps are used. These maps are derived from analyses of soil samples collected regularly (one sample per field cell) or irregularly across the field. At present, mathematical interpolation methods such as nearest neighbour, local average, inverse distance weighting, contouring and kriging are used to predict the variables involved in the elaboration of fertility maps. However, some of these methods present deficiencies that can generate different fertility maps for the same data set. Moreover, such methods can generate maps too imprecise to be used in precision farming. In this paper, artificial neural networks are applied to the elaboration and identification of precise fertility maps, which can reduce production costs and environmental impacts.
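As a minimal sketch of this approach (the network size and data are hypothetical assumptions; the paper does not specify an architecture here), a small multilayer perceptron can be trained on sample coordinates and a measured fertility attribute, then evaluated over a regular grid to draw the map:

import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(2)
coords = rng.uniform(0, 100, (120, 2))          # hypothetical sample locations
fert = 3 + 0.02 * coords[:, 0] + rng.normal(0, 0.1, 120)  # fertility attribute

model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(20, 20),
                                   max_iter=5000, random_state=0))
model.fit(coords, fert)

gx, gy = np.meshgrid(np.linspace(0, 100, 50), np.linspace(0, 100, 50))
grid = np.column_stack([gx.ravel(), gy.ravel()])
fert_map = model.predict(grid).reshape(gx.shape)   # the fertility map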

Relevance:

60.00%

Publisher:

Abstract:

Obtaining the correct spatial distribution of soil attributes is relevant in agricultural planning, particularly for the installation and maintenance of crops. The objective of this work was to compare a statistical interpolation method (ordinary kriging) and a deterministic method (inverse square distance) in the estimation of CEC and base saturation (V%) in a dystrophic yellow-red Latosol. The study was carried out in an experimental area of the Instituto Capixaba de Pesquisa, Assistência Técnica e Extensão Rural (INCAPER), on an irregular grid with 109 points. The data were collected in the 0-0.20 m layer within the canopy projection of the plants, in the upper part of the slope. The performance of the interpolators was obtained and compared using the mean error criterion. The observations are spatially dependent up to a maximum range of 14.1 m, assuming isotropy. IDW presented the larger estimation error; however, its difference relative to kriging was small for both variables.
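For reference, the deterministic interpolator compared in the study, inverse square distance, is short enough to write out; this is a generic sketch with stand-in data, not the study's code:

import numpy as np

def idw(xy_obs, z_obs, xy_new, power=2.0, eps=1e-12):
    # power=2 gives the inverse square distance interpolator of the study
    d = np.linalg.norm(xy_new[:, None, :] - xy_obs[None, :, :], axis=2)
    w = 1.0 / (d ** power + eps)                # eps avoids division by zero
    return (w @ z_obs) / w.sum(axis=1)

xy = np.random.rand(109, 2) * 50                # 109-point irregular grid
cec = np.random.rand(109) * 10                  # hypothetical CEC values
targets = np.random.rand(5, 2) * 50
print(idw(xy, cec, targets))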

Relevance:

60.00%

Publisher:

Abstract:

Many methods are used to estimate values at unsampled locations for the construction of contour maps. The aim of this study was to use kriging, inverse square distance, and polynomial interpolation to represent the spatial variability of soil pH under organic and conventional management in coffee cultivation. Irregular meshes were built for soil sampling at a depth of 0-0.10 m, totalling 40 sampling points in each area. To gauge the interpolation methods, 10% of the total points were set aside for each area. Initially, the data were evaluated through classical statistics (descriptive and exploratory) and spatial analysis. The inverse square distance and kriging methods had low error in estimating the data, and the kriging method presented low variation around the mean under the different managements.
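A sketch of the validation step described above, with hypothetical pH data: 10% of the points are withheld, predicted from the remaining 90% with a simple inverse-square-distance interpolator, and scored by mean absolute error.

import numpy as np

def idw_predict(xy_tr, z_tr, xy_te, power=2.0, eps=1e-12):
    d = np.linalg.norm(xy_te[:, None] - xy_tr[None, :], axis=2)
    w = 1.0 / (d ** power + eps)
    return (w @ z_tr) / w.sum(axis=1)

def holdout_error(xy, z, predict, frac=0.10, seed=0):
    # withhold frac of the points, predict them from the rest, score by MAE
    rng = np.random.default_rng(seed)
    test = rng.choice(len(z), size=max(1, int(frac * len(z))), replace=False)
    train = np.setdiff1d(np.arange(len(z)), test)
    return np.mean(np.abs(predict(xy[train], z[train], xy[test]) - z[test]))

xy = np.random.rand(40, 2) * 100                # 40 sampling points per area
ph = 4.5 + np.random.rand(40)                   # hypothetical soil-pH values
print(holdout_error(xy, ph, idw_predict))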

Relevance:

60.00%

Publisher:

Abstract:

Traditional methods of surveying submerged aquatic vegetation (SAV) are time-consuming and therefore costly. Optical remote sensing is an alternative, but it has limitations in the aquatic environment. Echosounder techniques, by contrast, are efficient at detecting submerged targets. Therefore, the aim of this study is to evaluate different interpolation approaches applied to SAV sample data collected by echosounder. The case study was performed in a region of the Uberaba River, Brazil. The interpolation methods evaluated in this work are: nearest neighbour, weighted average, triangular irregular network (TIN) and ordinary kriging. The best results were obtained with kriging interpolation. Thus, the use of geostatistics is recommended for spatial inference of SAV from sample data surveyed with echosounder techniques. © 2012 IEEE.
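Two of the four surfaces compared in the study can be built directly with SciPy; a sketch with hypothetical echosounder samples (the weighted-average and kriging surfaces would come from other tools):

import numpy as np
from scipy.interpolate import NearestNDInterpolator, LinearNDInterpolator

rng = np.random.default_rng(3)
pts = rng.uniform(0, 200, (300, 2))             # hypothetical echo soundings
val = np.sin(pts[:, 0] / 30) + rng.normal(0, 0.05, 300)

gx, gy = np.meshgrid(np.linspace(0, 200, 80), np.linspace(0, 200, 80))
surfaces = {
    'nearest': NearestNDInterpolator(pts, val)(gx, gy),
    'tin': LinearNDInterpolator(pts, val)(gx, gy),   # Delaunay-based TIN
}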

Relevance:

60.00%

Publisher:

Abstract:

Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)

Relevance:

60.00%

Publisher:

Abstract:

Graduate Program in Agronomy (Soil Science) - FCAV

Relevance:

60.00%

Publisher:

Abstract:

The objective of this work was to determine the adequate sample size for estimating the bole volume of forest species in a population of trees to be harvested under the forest management system of the company Cikel Brasil Verde Madeiras, Pará. Systematic sampling and the geostatistical estimator of ordinary kriging with sequential simulation were used, respectively, to select the samples and to estimate the bole volumes of the trees. The results showed that both methods can be used to calculate tree bole volumes. However, the kriging method presents a smoothing effect, which leads to an underestimation of the calculated volumes; in this case, a correction factor was applied to minimize the smoothing effect. Sequential indicator simulation produced more precise results than kriging, since it offers some advantages, such as not requiring normally distributed samples and being free of the smoothing effect characteristic of interpolation methods.
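The smoothing correction can take many forms, and the abstract does not give the factor used; the sketch below shows one simple variance-matching rescaling (an assumption for illustration only, not the paper's correction) that restores the sample spread to a smoothed kriged field:

import numpy as np

def variance_matching_correction(kriged, sample_values):
    # Rescale the kriged field about its mean so its spread matches the
    # sample variance; this restores variability lost to smoothing.
    m = kriged.mean()
    return m + (kriged - m) * (np.std(sample_values) / kriged.std())

kriged = np.random.normal(2.0, 0.3, 500)        # smoothed volume estimates
samples = np.random.normal(2.0, 0.5, 80)        # field-measured volumes
corrected = variance_matching_correction(kriged, samples)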

Relevance:

60.00%

Publisher:

Abstract:

Using the equivalent layer technique to interpolate potential-field data makes it possible to take into account that the anomaly being interpolated, gravimetric or magnetic, is a harmonic function. However, the technique's computational application is restricted to surveys with a small number of data, since it requires solving a least-squares problem whose order equals that number. To make the equivalent layer technique feasible for surveys with a large number of data, we developed the concept of equivalent observations and the EGTG method, which, respectively, reduce the computer memory demand and optimize the evaluation of the inner products inherent to solving the least-squares problems. Basically, the concept of equivalent observations consists of selecting some observations, among all the original observations, such that the least-squares fit that adjusts the selected observations automatically adjusts (within a pre-established tolerance criterion) all the remaining ones that were not chosen. The selected observations are called equivalent observations and the remaining ones are called redundant observations. This corresponds to partitioning the original linear system into two linear systems of lower order, the first containing only the equivalent observations and the second only the redundant observations, such that the least-squares solution obtained from the first system is also the solution of the second. This procedure makes it possible to fit all the sampled data using only the equivalent observations (rather than all the original observations), which reduces the number of operations and the computer memory usage. The EGTG method consists, first, of identifying the inner product as a discrete integration of a known analytic integral and, then, of replacing the discrete integration by the evaluation of the analytic integral. This method should be applied whenever evaluating the analytic integral requires fewer computations than evaluating the discrete integral. To determine the equivalent observations, we developed two iterative algorithms named DOE and DOEg. The first algorithm identifies the equivalent observations of the linear system as a whole, while the second identifies them in disjoint subsystems of the original linear system; each iteration of the DOEg algorithm consists of one application of the DOE algorithm to a partition of the original linear system. In interpolation, the DOE algorithm provides an interpolating surface that fits all the data, allowing global interpolation. The DOEg algorithm, on the other hand, optimizes local interpolation, since it employs only the equivalent observations, in contrast to existing local interpolation algorithms, which employ all the observations. The interpolation methods using the equivalent layer technique and the minimum curvature method were compared with respect to their ability to recover the true anomaly values during interpolation. The tests used synthetic data (produced by prismatic source models) from which the interpolated values on a regular grid were obtained. These interpolated values were compared with the theoretical values, computed from the source model on the same grid, allowing the efficiency of the interpolation method in recovering the true anomaly values to be assessed. In all tests, the equivalent layer method recovered the true anomaly values more faithfully than the minimum curvature method. Particularly in undersampled situations, the minimum curvature method proved unable to recover the true anomaly value in places where the anomaly exhibited more pronounced curvature. For data acquired at different heights, the minimum curvature method showed its worst performance, unlike the equivalent layer method, which performed interpolation and levelling simultaneously. Using the DOE algorithm it was possible to apply the equivalent layer technique to the (global) interpolation of 3137 free-air anomaly data from part of the Equant-2 marine survey and of 4941 total-field magnetic anomaly data from part of the Carauari-Norte aeromagnetic survey; the numbers of equivalent observations identified in each case were 294 and 299, respectively. Using the DOEg algorithm we optimized the (local) interpolation of all the data from both surveys. None of these interpolations would have been possible without the concept of equivalent observations. The ratio between the CPU time (running the programs in the same memory space) spent by the minimum curvature method and by the equivalent layer method (global interpolation) was 1:31; for local interpolation this ratio was practically 1:1.
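To make the equivalent-observations idea concrete, the sketch below implements a greedy selection loop in its spirit (a hypothetical reconstruction from the description above, not the authors' DOE algorithm): rows are added to the subset until the least-squares solution of the subset also fits every redundant row within the tolerance.

import numpy as np

def select_equivalent_observations(A, b, tol=1e-2):
    # Grow a subset of rows until its least-squares solution also fits,
    # within tol, every observation left out (the "redundant" ones).
    selected = [0]
    remaining = list(range(1, A.shape[0]))
    while True:
        x, *_ = np.linalg.lstsq(A[selected], b[selected], rcond=None)
        if not remaining:
            return np.array(selected), x
        res = np.abs(A[remaining] @ x - b[remaining])
        if res.max() <= tol:               # redundant observations fit too
            return np.array(selected), x
        selected.append(remaining.pop(int(res.argmax())))

A = np.random.rand(3137, 50)               # stand-in equivalent-layer kernel
b = A @ np.random.rand(50) + 1e-3 * np.random.randn(3137)
rows, coeffs = select_equivalent_observations(A, b)
print(len(rows), 'equivalent observations out of', len(b))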

Relevance:

60.00%

Publisher:

Abstract:

Graduate Program in Agronomy (Irrigation and Drainage) - FCA

Relevance:

60.00%

Publisher:

Abstract:

Categorical data cannot be interpolated directly because they are outcomes of discrete random variables. Thus, the types of a categorical variable are transformed into indicator functions that can be handled by interpolation methods, and the interpolated indicator values are then back-transformed to the original types of the categorical variable. However, aspects such as the variability and uncertainty of interpolated values of categorical data have never been considered. In this paper we show that the interpolation variance can be used to map an uncertainty zone around boundaries between types of categorical variables. Moreover, it is shown that the interpolation variance is a component of the total variance of the categorical variables, as measured by the coefficient of unalikeability. © 2011 Elsevier Ltd. All rights reserved.
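A minimal sketch of the indicator workflow described above (hypothetical data; a linear interpolator stands in for whatever interpolation method is used, and the final variance proxy is an assumption, not the paper's formula): one-hot indicators are interpolated, the back-transform takes the category with the largest interpolated indicator, and the spread of the winning indicator flags uncertain zones near boundaries.

import numpy as np
from scipy.interpolate import LinearNDInterpolator

rng = np.random.default_rng(4)
xy = rng.uniform(0, 100, (200, 2))              # sample locations
cat = rng.integers(0, 3, 200)                   # one of three soil classes

# 1) indicator transform: one 0/1 function per category
indicators = (cat[:, None] == np.arange(3)).astype(float)

# 2) interpolate each indicator over the map grid
gx, gy = np.meshgrid(np.linspace(0, 100, 60), np.linspace(0, 100, 60))
fields = np.stack([LinearNDInterpolator(xy, indicators[:, k])(gx, gy)
                   for k in range(3)])
fields = np.nan_to_num(fields, nan=1.0 / 3)     # outside hull: no information

# 3) back-transform: the category with the largest interpolated indicator
cat_map = fields.argmax(axis=0)

# a Bernoulli-style variance proxy (an assumption, not the paper's formula)
p = fields.max(axis=0)
uncertainty = p * (1 - p)                       # peaks near class boundaries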