916 results for digital terrain analysis


Relevance:

90.00%

Publisher:

Abstract:

This work presents the application of Vincenty's formulas to the computation of terrain corrections and the indirect effect, both of which play an important role in the construction of geoid charts. A processing program was implemented that performs numerical integration over a digital terrain model discretized into Delaunay triangular cells. The system was developed in the FORTRAN programming language, suited to running intensive numerical algorithms with free, robust compilers. For the computation of the indirect effect, the gravimetric reduction is based on Helmert's second condensation method, chosen because of the small indirect effect it produces in the geoid computation, a consequence of the change it induces in the gravity potential through the displacement of topographic mass. The SIRGAS 2000 geodetic system is used as the reference system for computing the corrections. To simplify the examination of the results obtained, the processing and development of the work are divided into stages: selection of geodetic tools for maximum precision of the results, development of subroutines, and comparison of results with previous computations. The results obtained were sound and satisfactory and can readily be employed in geoid computation for any area of the globe.
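The distance computation at the core of such corrections can be sketched compactly. The following Python port of Vincenty's standard inverse method (geodesic distance between two points on the ellipsoid) defaults to the GRS80 parameters adopted by SIRGAS 2000; it is an illustrative sketch, not the FORTRAN code described in the abstract.

```python
import math

def vincenty_inverse(lat1, lon1, lat2, lon2,
                     a=6378137.0, f=1/298.257222101,
                     tol=1e-12, max_iter=200):
    """Geodesic distance (m) on the ellipsoid via Vincenty's inverse method.
    Defaults to the GRS80 ellipsoid used by SIRGAS 2000."""
    b = a * (1.0 - f)
    U1 = math.atan((1.0 - f) * math.tan(math.radians(lat1)))
    U2 = math.atan((1.0 - f) * math.tan(math.radians(lat2)))
    L = math.radians(lon2 - lon1)
    sinU1, cosU1 = math.sin(U1), math.cos(U1)
    sinU2, cosU2 = math.sin(U2), math.cos(U2)

    lam = L
    for _ in range(max_iter):
        sin_lam, cos_lam = math.sin(lam), math.cos(lam)
        sin_sigma = math.hypot(cosU2 * sin_lam,
                               cosU1 * sinU2 - sinU1 * cosU2 * cos_lam)
        if sin_sigma == 0.0:
            return 0.0  # coincident points
        cos_sigma = sinU1 * sinU2 + cosU1 * cosU2 * cos_lam
        sigma = math.atan2(sin_sigma, cos_sigma)
        sin_alpha = cosU1 * cosU2 * sin_lam / sin_sigma
        cos2_alpha = 1.0 - sin_alpha ** 2
        if cos2_alpha == 0.0:
            cos_2sigma_m = 0.0  # geodesic runs along the equator
        else:
            cos_2sigma_m = cos_sigma - 2.0 * sinU1 * sinU2 / cos2_alpha
        C = f / 16.0 * cos2_alpha * (4.0 + f * (4.0 - 3.0 * cos2_alpha))
        lam_prev = lam
        lam = L + (1.0 - C) * f * sin_alpha * (
            sigma + C * sin_sigma * (
                cos_2sigma_m + C * cos_sigma * (-1.0 + 2.0 * cos_2sigma_m ** 2)))
        if abs(lam - lam_prev) < tol:
            break

    u2 = cos2_alpha * (a ** 2 - b ** 2) / b ** 2
    A = 1.0 + u2 / 16384.0 * (4096.0 + u2 * (-768.0 + u2 * (320.0 - 175.0 * u2)))
    B = u2 / 1024.0 * (256.0 + u2 * (-128.0 + u2 * (74.0 - 47.0 * u2)))
    delta_sigma = B * sin_sigma * (cos_2sigma_m + B / 4.0 * (
        cos_sigma * (-1.0 + 2.0 * cos_2sigma_m ** 2)
        - B / 6.0 * cos_2sigma_m * (-3.0 + 4.0 * sin_sigma ** 2)
        * (-3.0 + 4.0 * cos_2sigma_m ** 2)))
    return b * A * (sigma - delta_sigma)
```

For example, one degree of longitude along the equator evaluates to about 111 319.49 m, matching the analytic value a·(π/180).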

Relevance:

90.00%

Publisher:

Abstract:

The article addresses tumular structures, their formal and functional variability (funerary, dwelling site, etc.) and their differing chronology. It describes the specific cases of Txoritegi (Zerain, Gipuzkoa) and Galardi (Txoritegi, Gipuzkoa), motte-type tumular structures possibly used as watchtowers in the context of the period of late-medieval conflict. This is a novel element in the territory, and its interpretation was supported by a GIS-based viewshed analysis.
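A viewshed analysis of the kind mentioned above reduces, cell by cell, to a line-of-sight test along terrain profiles radiating from the observer. A minimal sketch of that test (the eye height and the sample profiles are hypothetical, not data from the study):

```python
def line_of_sight(profile, eye_height=1.7):
    """Check whether the last point of a terrain profile is visible from the
    first. `profile` is a list of ground elevations (m) sampled at equal
    spacing from observer to target."""
    n = len(profile)
    eye_z = profile[0] + eye_height
    target_z = profile[-1]
    for i in range(1, n - 1):
        # elevation of the straight sight line above the i-th sample
        sight_z = eye_z + (target_z - eye_z) * i / (n - 1)
        if profile[i] > sight_z:
            return False  # intervening terrain blocks the view
    return True
```

A full viewshed repeats this test for every cell of the DEM; GIS packages also apply earth-curvature and refraction corrections omitted here.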

Relevance:

90.00%

Publisher:

Abstract:

This thesis proposed a methodology for detecting landslide-susceptible areas from aerial images, culminating in the development of a computational tool, named SASD/T, to test the methodology. To motivate the research, a survey of the natural disasters in Brazilian history related to landslides, and of the methodologies used to detect and analyse landslide-susceptible areas, was carried out. Preliminary studies of 3D visualization and of concepts related to 3D mapping were conducted. Stereoscopy was implemented to visualize the selected region in three dimensions. Elevations were obtained through parallax, from the homologous points found by the SIFT algorithm. The experiments were performed with images of the city of Nova Friburgo. The initial experiment showed that the result obtained using SIFT together with the proposed filter compared very favourably with the results of Fernandes (2008) and Carmo (2010), owing to the number of homologous points found and the surface generated. To detect landslide-prone sites, information such as elevation, slope, aspect and curvature was extracted from the stereo pairs and, together with user-supplied variables, yielded an analysis of how susceptible a given area is to landslides. The proposed methodology can be extended to landslide risk assessment and prediction in any other region, since it allows user interaction, letting the user specify the characteristics, items and weightings required by the analysis at hand.
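The elevation-from-parallax step can be illustrated with the standard photogrammetric parallax-difference equation; the numbers in the test are hypothetical, not values from the thesis.

```python
def parallax_height(flying_height, base_parallax, dp):
    """Height difference from a parallax difference, using the standard
    photogrammetric relation dh = H * dp / (b + dp).
    flying_height: H, flying height above the datum (m);
    base_parallax: b, absolute stereoscopic parallax at the datum;
    dp: measured parallax difference between homologous points
        (same units as b, e.g. mm on the photo)."""
    return flying_height * dp / (base_parallax + dp)
```

With H = 1500 m, b = 90 mm and a measured difference of 1.8 mm, the point stands roughly 29.4 m above the datum.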

Relevance:

90.00%

Publisher:

Abstract:

This study volumetrically evaluated the presence of pores in three root canal sealers. For the porosity analysis, four cylinders of each sealer were prepared and scanned with a high-resolution micro-CT scanner (SkyScan 1174, Kontich, Belgium). Porosity was computed through analysis of digitized images of the sealer blocks, which were micro-CT scanned to build three-dimensional models. The presence of pores and voids was then assessed with the CT Analyser software. The results showed that i-Root SP had the lowest porosity index (0.07%), while AH Plus and MTA Fillapex showed no statistically significant difference from each other (p > 0.05). Although the total porosity indices of MTA Fillapex and AH Plus did not differ significantly (p > 0.05), the findings of this study showed that MTA Fillapex had significantly larger mean individual volumes of internal pores than AH Plus and i-Root SP. The results were tabulated and analysed statistically using ANOVA at a 5% significance level.
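The porosity index reported by such micro-CT analyses is, at bottom, a voxel ratio; a minimal sketch (the voxel counts are illustrative):

```python
def porosity_percent(pore_voxels, total_voxels):
    """Porosity index as used in micro-CT image analysis: the segmented pore
    volume (counted in voxels) as a percentage of the total specimen volume."""
    if total_voxels <= 0:
        raise ValueError("total_voxels must be positive")
    return 100.0 * pore_voxels / total_voxels
```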

Relevance:

90.00%

Publisher:

Abstract:

Despite the increased application of composite materials in aerospace due to their exceptional physical and mechanical properties, the machining of composites remains a challenge. Fibre-reinforced laminated composites are prone to various forms of damage during machining, such as delamination, fibre pull-out, microcracks and thermal damage. Optimizing the drilling process parameters can reduce the probability of such damage. In the current research, a 3D finite element (FE) model of the drilling process in carbon fibre reinforced composite (CFC) is developed. The FE model is used to investigate the effects of cutting speed and feed rate on thrust force, torque and delamination in the drilling of carbon fibre reinforced laminated composite. A mesoscale FE model, taking into account the differently oriented plies and interfaces, has been proposed to predict the different damage modes in the plies and delamination. For validation purposes, experimental drilling tests have been performed and compared with the results of the finite element analysis. A digital image analysis code has been developed in Matlab to assess the delamination factor produced in CFC as a result of drilling. © Springer Science+Business Media B.V. 2011.
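The delamination factor assessed by such image-analysis codes is conventionally the ratio of the maximum diameter of the delaminated zone to the nominal hole diameter, both measured from the binarized image. A minimal sketch of that metric (not the authors' Matlab code):

```python
def delamination_factor(d_max, d_nominal):
    """Widely used delamination factor Fd = Dmax / D0: the ratio of the
    maximum diameter of the damage zone around the drilled hole to the
    nominal hole diameter. Fd = 1.0 means no visible delamination."""
    if d_nominal <= 0:
        raise ValueError("nominal diameter must be positive")
    return d_max / d_nominal
```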

Relevance:

90.00%

Publisher:

Abstract:

Relative sea-level rise has been a major factor driving the evolution of reef systems during the Holocene. Most models of reef evolution suggest that reefs preferentially grow vertically during rising sea level, then laterally from windward to leeward once the reef flat reaches sea level. Continuous lagoonal sedimentation ("bucket fill") and sand apron progradation eventually lead to reef systems with totally filled lagoons. Lagoonal infilling of One Tree Reef (southern Great Barrier Reef) through sand apron accretion was examined in the context of late Holocene relative sea-level change. This analysis was conducted using sedimentological and digital terrain data supported by 50 radiocarbon ages from fossil microatolls, buried patch reefs, foraminifera and shells in sediment cores, and recalibrated previously published radiocarbon ages. This data set challenges the conceptual model of geologically continuous sediment infill during the Holocene through sand apron accretion. Rapid sand apron accretion occurred between 6000 and 3000 calibrated years before present (cal. yr B.P.), followed by only small amounts of sedimentation between 3000 cal. yr B.P. and the present, with no significant sand apron accretion in the past 2 k.y. This hiatus in sediment infill coincides with a sea-level fall of ~1-1.3 m during the late Holocene (ca. 2000 cal. yr B.P.), which would have caused the turn-off of highly productive live coral growth on the reef flats currently dominated by less productive rubble and algal flats, resulting in reduced sediment input to back-reef environments and the cessation of sand apron accretion. Given that relative sea-level variations of ~1 m were common throughout the Holocene, we suggest that this mode of sand apron development and carbonate production is applicable to most reef systems.

Relevance:

90.00%

Publisher:

Abstract:

Rapid in situ diagnosis of damage is a key issue in the preservation of stone-built cultural heritage. This is evident in the increasing number of congresses, workshops and publications dealing with this issue. With this increased activity has come, however, the realisation that for many culturally significant artefacts it is not possible either to remove samples for analysis or to affix surface markers for measurement. It is for this reason that there has been a growth of interest in non-destructive and minimally invasive techniques for characterising internal and external stone condition. With this interest has come the realisation that no single technique can adequately encompass the wide variety of parameters to be assessed or provide the range of information required to identify appropriate conservation. In this paper we describe a strategy to address these problems through the development of an integrated 'tool kit' of measurement and analytical techniques aimed specifically at linking object-specific research to appropriate intervention. The strategy is based initially upon the acquisition of accurate three-dimensional models of stone-built heritage at different scales using a combination of millimetre-accurate LiDAR and sub-millimetre-accurate object scanning that can be exported into a GIS or directly into CAD. These are currently used to overlay information on stone characteristics obtained through a combination of Ground Penetrating Radar, Surface Permeametry, Colorimetry and X-ray Fluorescence, but the possibility exists for adding to this array of techniques as appropriate. In addition to the integrated three-dimensional data array provided by superimposition upon Digital Terrain Models, there is the capability of accurate re-measurement to show patterns of surface loss and changes in material condition over time. 
Thus it is possible both to record and baseline condition and to identify areas that require either preventive maintenance or more significant pre-emptive intervention. In pursuit of these goals the authors are developing, through a UK Government supported collaboration between University Researchers and Conservation Architects, commercially viable protocols for damage diagnosis, condition monitoring and eventually mechanisms for prioritizing repairs to stone-built heritage. The understanding is, however, that such strategies are not age-constrained and can ultimately be applied to structures of any age.
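The re-measurement comparison described above amounts to differencing co-registered surface models and flagging cells whose apparent loss exceeds the measurement uncertainty. A toy sketch with hypothetical grids and a hypothetical threshold:

```python
def surface_loss(dtm_before, dtm_after, threshold=0.001):
    """Flag cells where a repeat scan shows material loss greater than
    `threshold` (same units as the models, e.g. metres). Both inputs are
    equal-sized 2D grids (lists of lists) from co-registered scans.
    Returns a list of (row, col, loss) tuples."""
    flagged = []
    for i, (row_b, row_a) in enumerate(zip(dtm_before, dtm_after)):
        for j, (zb, za) in enumerate(zip(row_b, row_a)):
            loss = zb - za          # positive = surface has receded
            if loss > threshold:
                flagged.append((i, j, loss))
    return flagged
```

In practice the threshold would be set from the registration error of the LiDAR or object-scan data rather than fixed a priori.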

Relevance:

90.00%

Publisher:

Abstract:

Final project for the degree of Master in Civil Engineering, in the specialization area of Hydraulics.

Relevance:

90.00%

Publisher:

Abstract:

This paper describes HidroSIG, a GIS platform developed by the Water Resources Program at Universidad Nacional de Colombia at Medellín. HidroSIG is a tool for the visualization and analysis of hydrological variables, with a set of modules that make it a powerful tool for hydrological modelling. HidroSIG provides tools for digital terrain model processing, water supply estimation using the long-term water balance in watersheds, a rainfall-runoff model, a model for landslide susceptibility estimation, a one-dimensional pollutant transport model, tools for homogeneity analysis in time series, and tools for satellite image classification. Tools still under development are also described.
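A long-term water-balance module of the kind listed above can be illustrated with a classical empirical closure: Turc's (1954) formula for mean annual actual evapotranspiration. This is a generic sketch of the approach, not necessarily the formulation HidroSIG implements.

```python
import math

def turc_runoff(precip_mm, temp_c):
    """Long-term annual water balance R = P - E, with actual
    evapotranspiration E from Turc's (1954) empirical formula.
    precip_mm: mean annual precipitation (mm);
    temp_c: mean annual air temperature (deg C)."""
    # Turc's temperature function L(T)
    L = 300.0 + 25.0 * temp_c + 0.05 * temp_c ** 3
    evap = precip_mm / math.sqrt(0.9 + (precip_mm / L) ** 2)
    evap = min(evap, precip_mm)   # E cannot exceed P in the long-term balance
    return precip_mm - evap
```

For a hypothetical tropical catchment with P = 1500 mm/yr and T = 20 °C the balance leaves roughly 544 mm/yr as runoff.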

Relevance:

90.00%

Publisher:

Abstract:

This thesis proposes a methodology for the probabilistic simulation of matrix failure in carbon-fibre-reinforced composite materials, based on the analysis of the random distribution of the fibres. The opening chapters review the state of the art in the mathematical modelling of random materials, the computation of effective properties, and transverse failure criteria for composite materials. The first step in the proposed methodology is determining the minimum size of a Statistical Representative Volume Element (SRVE). This determination is carried out by analysing, for microstructure models of different sizes, the fibre volume fraction, the effective elastic properties, the Hill condition, the statistics of the stress and strain components, the probability density function, and the statistical inter-fibre distance functions. Once this minimum size has been determined, a periodic model and a random model are compared to establish the magnitude of the differences observed between them. A methodology is also defined for the statistical analysis of the fibre distribution in the composite from digital images of the cross-section, and it is applied to four different materials. Finally, a two-scale computational method is proposed for simulating the transverse failure of unidirectional plies, which yields probability density functions for the mechanical variables. Some applications and possibilities of this method are described, and the simulation results are compared with experimental values.
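One of the inter-fibre distance statistics used in such analyses, the mean nearest-neighbour distance of fibre centres, is straightforward to compute once the centres have been segmented from a cross-section image. A minimal sketch (brute-force, adequate for a few thousand fibres):

```python
import math

def nearest_neighbour_stats(centres):
    """Mean nearest-neighbour distance of fibre centres, one of the
    statistics used to compare generated microstructure models against
    real micrographs. `centres` is a list of (x, y) fibre centres.
    Returns (mean, per-fibre nearest distances)."""
    dists = []
    for i, (xi, yi) in enumerate(centres):
        nearest = min(math.hypot(xi - xj, yi - yj)
                      for j, (xj, yj) in enumerate(centres) if j != i)
        dists.append(nearest)
    return sum(dists) / len(dists), dists
```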

Relevance:

90.00%

Publisher:

Abstract:

Flood modelling of urban areas is still at an early stage, partly because until recently topographic data of sufficiently high resolution and accuracy have been lacking in urban areas. However, Digital Surface Models (DSMs) generated from airborne scanning laser altimetry (LiDAR) having sub-metre spatial resolution have now become available, and these are able to represent the complexities of urban topography. The paper describes the development of a LiDAR post-processor for urban flood modelling based on the fusion of LiDAR and digital map data. The map data are used in conjunction with LiDAR data to identify different object types in urban areas, though pattern recognition techniques are also employed. Post-processing produces a Digital Terrain Model (DTM) for use as model bathymetry, and also a friction parameter map for use in estimating spatially-distributed friction coefficients. In vegetated areas, friction is estimated from LiDAR-derived vegetation height, and (unlike most vegetation removal software) the method copes with short vegetation less than ~1m high, which may occupy a substantial fraction of even an urban floodplain. The DTM and friction parameter map may also be used to help to generate an unstructured mesh of a vegetated urban floodplain for use by a 2D finite element model. The mesh is decomposed to reflect floodplain features having different frictional properties to their surroundings, including urban features such as buildings and roads as well as taller vegetation features such as trees and hedges. This allows a more accurate estimation of local friction. The method produces a substantial node density due to the small dimensions of many urban features.
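The friction parameter map described above can be thought of as a per-cell lookup from classified object type (and, for vegetation, LiDAR-derived height) to a Manning roughness coefficient. The class names and coefficient values below are illustrative assumptions, not those of the paper:

```python
def manning_n(object_class, veg_height=0.0):
    """Hypothetical friction-parameter lookup of the kind a LiDAR
    post-processor might produce. Values are illustrative Manning's n
    coefficients, not calibrated ones."""
    base = {"road": 0.013, "building": 0.015, "water": 0.030}
    if object_class in base:
        return base[object_class]
    if object_class == "vegetation":
        if veg_height < 0.1:   # short grass
            return 0.030
        if veg_height < 1.0:   # the sub-metre vegetation band
            return 0.050
        return 0.10            # trees and hedges
    return 0.035               # default floodplain surface
```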

Relevance:

90.00%

Publisher:

Abstract:

Sea-level rise (SLR) from global warming may have severe consequences for coastal cities, particularly when combined with predicted increases in the strength of tidal surges. Predicting the regional impact of SLR flooding is strongly dependent on the modelling approach and accuracy of topographic data. Here, the areas under risk of sea water flooding for London boroughs were quantified based on the projected SLR scenarios reported in Intergovernmental Panel on Climate Change (IPCC) fifth assessment report (AR5) and UK climatic projections 2009 (UKCP09) using a tidally-adjusted bathtub modelling approach. Medium- to very high-resolution digital elevation models (DEMs) are used to evaluate inundation extents as well as uncertainties. Depending on the SLR scenario and DEMs used, it is estimated that 3%–8% of the area of Greater London could be inundated by 2100. The boroughs with the largest areas at risk of flooding are Newham, Southwark, and Greenwich. The differences in inundation areas estimated from a digital terrain model and a digital surface model are much greater than the root mean square error differences observed between the two data types, which may be attributed to processing levels. Flood models from SRTM data underestimate the inundation extent, so their results may not be reliable for constructing flood risk maps. This analysis provides a broad-scale estimate of the potential consequences of SLR and uncertainties in the DEM-based bathtub type flood inundation modelling for London boroughs.
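A tidally-adjusted bathtub model of the kind used above differs from naive elevation thresholding in that flooded cells must also be hydrologically connected to the sea. The toy grid below (values and water level hypothetical) contains an enclosed low-lying basin that only the naive model floods:

```python
from collections import deque

def bathtub_inundation(dem, level):
    """Connected 'bathtub' flood sketch: a cell floods if its elevation is at
    or below the water level AND it is 4-neighbour connected to the open
    boundary of the grid, which stands in for the sea. Returns the set of
    flooded (row, col) cells."""
    rows, cols = len(dem), len(dem[0])
    flooded, queue = set(), deque()
    for r in range(rows):                     # seed from low boundary cells
        for c in range(cols):
            on_edge = r in (0, rows - 1) or c in (0, cols - 1)
            if on_edge and dem[r][c] <= level:
                flooded.add((r, c))
                queue.append((r, c))
    while queue:                              # breadth-first flood fill
        r, c = queue.popleft()
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if (0 <= nr < rows and 0 <= nc < cols
                    and (nr, nc) not in flooded and dem[nr][nc] <= level):
                flooded.add((nr, nc))
                queue.append((nr, nc))
    return flooded

dem = [[0.2, 0.5, 2.0, 0.4],
       [0.3, 2.0, 2.0, 0.3],
       [0.4, 2.0, 0.2, 2.0],   # the interior 0.2 cell is an enclosed basin
       [0.5, 2.0, 2.0, 2.0]]
connected = bathtub_inundation(dem, 1.0)
naive = {(r, c) for r in range(4) for c in range(4) if dem[r][c] <= 1.0}
```

The enclosed basin at (2, 2) appears only in the naive set, which is why connectivity-aware bathtub models give smaller (and more defensible) inundation extents.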

Relevance:

90.00%

Publisher:

Abstract:

The Shuttle Radar Topography Mission (SRTM) was flown on the space shuttle Endeavour in February 2000, with the objective of acquiring a digital elevation model of all land between 60 degrees north latitude and 56 degrees south latitude, using interferometric synthetic aperture radar (InSAR) techniques. The SRTM data are distributed at a horizontal resolution of 1 arc-second (~30 m) for areas within the USA and at 3 arc-second (~90 m) resolution for the rest of the world. A resolution of 90 m can be considered suitable for small- or medium-scale analysis, but it is too coarse for more detailed purposes. One alternative is to interpolate the SRTM data at a finer resolution; this will not increase the level of detail of the original digital elevation model (DEM), but it will lead to a surface in which the angular properties (i.e. slope, aspect) of neighbouring pixels are coherent, an important characteristic in terrain analysis. This work intends to show how the proper adjustment of variogram and kriging parameters, namely the nugget effect and the maximum distance within which values are used in interpolation, can be set to achieve quality results when resampling SRTM data from 3" to 1". We present results for a test area in western USA, including different adjustment schemes (changes in the nugget effect value and in the interpolation radius) and comparisons with the original 1" model of the area, with the national elevation dataset (NED) DEMs, and with other interpolation methods (splines and inverse distance weighting (IDW)). 
The basic concepts for using kriging to resample terrain data are: (i) working only with the immediate neighbourhood of the predicted point, owing to the high spatial correlation of the topographic surface and the omnidirectional behaviour of the variogram over short distances; (ii) adding a very small random variation to the coordinates of the points prior to interpolation, to avoid punctual artifacts generated when predicted points share the same location as original data points; and (iii) using a small nugget effect value, to avoid smoothing that could obliterate terrain features. Drainage networks derived from the surfaces interpolated by kriging and by splines agree well with streams derived from the 1" NED, with correct identification of watersheds, even though a few differences occur in the positions of some rivers in flat areas. Although the 1" surfaces resampled by kriging and by splines are very similar, we consider the results produced by kriging superior, since the spline-interpolated surface still presented some noise and linear artifacts, which were removed by kriging.
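A compact ordinary-kriging sketch shows where the parameters discussed above enter: the nugget appears in the variogram model, while the neighbourhood restriction and the coordinate jitter of steps (i) and (ii) would be applied before the system is assembled. Pure-Python, 1D, with illustrative variogram values, not the paper's fitted parameters:

```python
def spherical_gamma(h, nugget, sill, rng):
    """Spherical variogram; gamma(0) = 0 by definition."""
    if h == 0.0:
        return 0.0
    if h >= rng:
        return sill
    return nugget + (sill - nugget) * (1.5 * h / rng - 0.5 * (h / rng) ** 3)

def solve(A, b):
    """Gaussian elimination with partial pivoting (small systems only)."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        p = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[p] = M[p], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for k in range(col, n + 1):
                M[r][k] -= f * M[col][k]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][k] * x[k] for k in range(r + 1, n))) / M[r][r]
    return x

def ordinary_kriging_1d(xs, zs, x0, nugget=0.001, sill=1.0, rng=3.0):
    """Ordinary kriging estimate at x0 from samples (xs, zs). A small nugget
    is used per the paper's recommendation; in a full resampler the samples
    would first be restricted to x0's neighbourhood and their coordinates
    slightly jittered (steps (i) and (ii) above, omitted here)."""
    n = len(xs)
    A = [[spherical_gamma(abs(xs[i] - xs[j]), nugget, sill, rng)
          for j in range(n)] + [1.0] for i in range(n)]
    A.append([1.0] * n + [0.0])          # unbiasedness constraint row
    b = [spherical_gamma(abs(xs[i] - x0), nugget, sill, rng)
         for i in range(n)] + [1.0]
    w = solve(A, b)[:n]                  # drop the Lagrange multiplier
    return sum(wi * zi for wi, zi in zip(w, zs))
```

Because gamma(0) = 0, the estimator honours the data exactly at sample locations, which is why jitter is needed when a predicted grid node coincides with an input point.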

Relevance:

90.00%

Publisher:

Abstract:

Objective: To develop a method for objective quantification of Parkinson's disease (PD) motor symptoms related to Off episodes and peak-dose dyskinesias, using spiral data gathered with a touch screen telemetry device. The aim was to objectively characterize predominant motor phenotypes (bradykinesia and dyskinesia), to help in automating the process of visual interpretation of movement anomalies in spirals as rated by movement disorder specialists. Background: A retrospective analysis was conducted on recordings from 65 patients with advanced idiopathic PD from nine different clinics in Sweden, recruited from January 2006 until August 2010. In addition to the patient group, 10 healthy elderly subjects were recruited. Upper limb movement data were collected using a touch screen telemetry device in the home environments of the subjects. Measurements with the device were performed four times per day during week-long test periods. On each test occasion, the subjects were asked to trace pre-drawn Archimedean spirals using the dominant hand. The pre-drawn spiral was shown on the screen of the device. The spiral test was repeated three times per test occasion, and the subjects were instructed to complete it within 10 seconds. The device had a sampling rate of 10 Hz and measured both the position and time-stamps (in milliseconds) of the pen tip. Methods: Four independent raters (FB, DH, AJ and DN) used a web interface that animated the spiral drawings, allowed them to observe different kinematic features during the drawing process, and let them rate task performance. Initially, a number of kinematic features were assessed, including ‘impairment’, ‘speed’, ‘irregularity’ and ‘hesitation’, followed by marking the predominant motor phenotype on a 3-category scale: tremor, bradykinesia and/or choreatic dyskinesia. There were only 2 test occasions for which all four raters either classified them as tremor or could not identify the motor phenotype. 
Therefore, the two main motor phenotype categories were bradykinesia and dyskinesia. ‘Impairment’ was rated on a scale from 0 (no impairment) to 10 (extremely severe), whereas ‘speed’, ‘irregularity’ and ‘hesitation’ were rated on a scale from 0 (normal) to 4 (extremely severe). The proposed data-driven method consisted of the following steps. Initially, 28 spatiotemporal features were extracted from the time series signals before being presented to a Multilayer Perceptron (MLP) classifier. The features were based on different kinematic quantities of the spirals, including radius, angle, speed and velocity, with the aim of measuring the severity of involuntary symptoms and discriminating between PD-specific (bradykinesia) and/or treatment-induced (dyskinesia) symptoms. A Principal Component Analysis was applied to the features to reduce their dimensionality; 4 relevant principal components (PCs) were retained and used as inputs to the MLP classifier. Finally, the MLP classifier mapped these components to the corresponding visually assessed motor phenotype scores, automating the scoring of bradykinesia and dyskinesia in PD patients while they draw spirals using the touch screen device. For motor phenotype (bradykinesia vs. dyskinesia) classification, the stratified 10-fold cross-validation technique was employed. Results: There were good agreements between the four raters when rating the individual kinematic features, with intra-class correlation coefficients (ICC) of 0.88 for ‘impairment’, 0.74 for ‘speed’ and 0.70 for ‘irregularity’, and moderate agreement when rating ‘hesitation’, with an ICC of 0.49. When assessing the two main motor phenotype categories (bradykinesia or dyskinesia) in animated spirals, the agreements between the four raters ranged from fair to moderate. There were good correlations between the mean ratings of the four raters on individual kinematic features and the computed scores. 
The MLP classifier classified the motor phenotype (bradykinesia or dyskinesia) with an accuracy of 85% in relation to the visual classifications of the four movement disorder specialists. The test-retest reliability of the four PCs across the three spiral test trials was good, with Cronbach’s Alpha coefficients of 0.80, 0.82, 0.54 and 0.49, respectively. These results indicate that the computed scores are stable and consistent over time. Significant differences were found between the two groups (patients and healthy elderly subjects) in all the PCs except PC3. Conclusions: The proposed method automatically assessed the severity of unwanted symptoms and could reasonably well discriminate between PD-specific and/or treatment-induced motor symptoms, in relation to visual assessments of movement disorder specialists. The objective assessments could provide a time-effect summary score that could be useful for improving decision-making during symptom evaluation of individualized treatment, when the goal is to maximize functional On time for patients while minimizing their Off episodes and troublesome dyskinesias.
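Kinematic features of the kind rated here can be computed directly from the time-stamped pen positions. The sketch below derives mean drawing speed and a crude irregularity proxy from a 10 Hz trace; it is illustrative only, not the study's 28-feature set.

```python
import math

def spiral_speed_features(samples):
    """Two simple kinematic features from a digitised spiral trace:
    mean drawing speed and its coefficient of variation (a crude
    irregularity proxy). `samples` is a list of (x, y, t_ms) pen
    positions, as produced by a touch-screen trace."""
    speeds = []
    for (x0, y0, t0), (x1, y1, t1) in zip(samples, samples[1:]):
        dt = (t1 - t0) / 1000.0                 # ms -> s
        if dt > 0:
            speeds.append(math.hypot(x1 - x0, y1 - y0) / dt)
    mean = sum(speeds) / len(speeds)
    var = sum((s - mean) ** 2 for s in speeds) / len(speeds)
    cv = math.sqrt(var) / mean if mean else 0.0
    return mean, cv
```

A slow, steady bradykinetic trace would show a low mean speed with low variability, whereas a dyskinetic trace would tend toward high variability, which is the intuition the full feature set formalizes.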

Relevance:

90.00%

Publisher:

Abstract:

A challenge for the clinical management of advanced Parkinson’s disease (PD) patients is the emergence of fluctuations in motor performance, which represents a significant source of disability during activities of daily living of the patients. There is a lack of objective measurement of treatment effects for in-clinic and at-home use that can provide an overview of the treatment response. The objective of this paper was to develop a method for objective quantification of advanced PD motor symptoms related to off episodes and peak dose dyskinesia, using spiral data gathered by a touch screen telemetry device. More specifically, the aim was to objectively characterize motor symptoms (bradykinesia and dyskinesia), to help in automating the process of visual interpretation of movement anomalies in spirals as rated by movement disorder specialists. Digitized upper limb movement data of 65 advanced PD patients and 10 healthy (HE) subjects were recorded as they performed spiral drawing tasks on a touch screen device in their home environment settings. Several spatiotemporal features were extracted from the time series and used as inputs to machine learning methods. The methods were validated against ratings on animated spirals scored by four movement disorder specialists who visually assessed a set of kinematic features and the motor symptom. The ability of the method to discriminate between PD patients and HE subjects and the test-retest reliability of the computed scores were also evaluated. Computed scores correlated well with mean visual ratings of individual kinematic features. The best performing classifier (Multilayer Perceptron) classified the motor symptom (bradykinesia or dyskinesia) with an accuracy of 84% and area under the receiver operating characteristics curve of 0.86 in relation to visual classifications of the raters. 
In addition, the method provided high discriminating power when distinguishing between PD patients and HE subjects as well as had good test-retest reliability. This study demonstrated the potential of using digital spiral analysis for objective quantification of PD-specific and/or treatment-induced motor symptoms.