Abstract:
Soil surveys are the main source of spatial information on soils and have a range of applications, mainly in agriculture. The continuity of this activity, however, has been severely compromised, mainly by a lack of governmental funding. The purpose of this study was to evaluate the feasibility of two different classifiers (artificial neural networks and a maximum likelihood algorithm) for the prediction of soil classes in the northwest of the state of Rio de Janeiro. Terrain attributes such as elevation, slope, aspect, plan curvature and compound topographic index (CTI), together with indices of clay minerals, iron oxide and the Normalized Difference Vegetation Index (NDVI) derived from Landsat 7 ETM+ imagery, were used as discriminating variables. The two classifiers were trained and validated for each soil class with 300 and 150 samples, respectively, representing the characteristics of these classes in terms of the discriminating variables. According to the statistical tests, the accuracy of the classifier based on artificial neural networks (ANNs) was greater than that of the classic Maximum Likelihood Classifier (MLC). A comparison with 126 reference points showed that the ANN map (73.81 %) was superior to the MLC map (57.94 %). The main errors of the two classifiers were caused by: a) the geological heterogeneity of the area, coupled with problems related to the geological map; b) the depth of lithic contact and/or rock exposure; and c) problems with the environmental correlation model used, due to the polygenetic nature of the soils. This study confirms that the use of terrain attributes together with remote sensing data in an ANN approach can facilitate soil mapping in Brazil, primarily because of the availability of low-cost remote sensing data and the ease with which terrain attributes can be obtained.
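As a minimal sketch of the baseline method in this comparison, the following implements a Gaussian maximum-likelihood classifier of the kind the ANN was tested against. The feature values and both "soil classes" are synthetic illustrations, not the study's data; the 300 training samples per class mirror the abstract's setup.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two hypothetical soil classes described by 3 terrain/spectral covariates
# (e.g. elevation, slope, NDVI) -- illustrative values only.
X0 = rng.normal([200.0, 5.0, 0.3], 0.5, size=(300, 3))   # class 0 training set
X1 = rng.normal([350.0, 12.0, 0.6], 0.5, size=(300, 3))  # class 1 training set

def gaussian_ml_fit(X):
    """Per-class mean vector and covariance matrix."""
    return X.mean(axis=0), np.cov(X, rowvar=False)

def log_likelihood(x, mean, cov):
    """Log multivariate-normal density, up to an additive constant."""
    d = x - mean
    return -0.5 * (np.log(np.linalg.det(cov)) + d @ np.linalg.solve(cov, d))

params = [gaussian_ml_fit(X0), gaussian_ml_fit(X1)]

def classify(x):
    """Assign the class whose Gaussian model makes the pixel most likely."""
    return int(np.argmax([log_likelihood(x, m, c) for m, c in params]))

# A pixel whose covariates sit near class 1's signature:
label = classify(np.array([348.0, 11.5, 0.58]))
print(label)  # 1
```

In the study the per-pixel discriminating variables would come from the DEM-derived attributes and the ETM+ band indices rather than synthetic draws.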
Abstract:
Since different pedologists will draw different soil maps of the same area, it is important to compare mapping by specialists with mapping techniques such as the currently much-discussed digital soil mapping. Four detailed soil maps (scale 1:10,000) of a 182-ha sugarcane farm in the county of Rafard, São Paulo State, Brazil, were compared. The area has a large variation of soil formation factors. The maps were drawn independently by four soil scientists and compared with a fifth map obtained by a digital soil mapping technique. All pedologists were given the same set of information, and as many field expeditions and soil pits as each surveyor required were provided to define the mapping units (MUs). For the digital soil map (DSM), spectral data were extracted from Landsat 5 Thematic Mapper (TM) imagery, together with six terrain attributes derived from the topographic map of the area. These data were summarized by principal component analysis and grouped by fuzzy k-means clustering to generate the map design. Field observations were made to identify the soils in the MUs and classify them according to the Brazilian Soil Classification System (BSCS). To compare the conventional and DSM soil maps, they were crossed pairwise to generate confusion matrices, which were mapped. The categorical analysis at each classification level of the BSCS showed that agreement between the maps decreased towards the lower levels of classification, and that the surveyor strongly influenced both the mapping and the definition of MUs. The average correspondence between the conventional and DSM maps was similar. Therefore, the method used to obtain the DSM yielded results similar to those of the conventional technique, while providing additional information about the landscape of each soil that is useful for future surveys of similar areas.
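The pairwise crossing step above can be sketched in a few lines: two rasterized soil maps are crossed to build a confusion matrix, and overall agreement is the matrix trace over the pixel count. The 3x3 maps and mapping-unit codes here are hypothetical, not the study's MUs.

```python
import numpy as np

# Two rasterized soil maps of the same area, coded 0..2 (hypothetical MUs).
map_a = np.array([[0, 0, 1],
                  [0, 1, 1],
                  [2, 2, 1]])
map_b = np.array([[0, 1, 1],
                  [0, 1, 1],
                  [2, 2, 2]])

n_classes = 3
confusion = np.zeros((n_classes, n_classes), dtype=int)
for a, b in zip(map_a.ravel(), map_b.ravel()):
    confusion[a, b] += 1          # cross the two maps pixel by pixel

# Overall agreement: pixels on the diagonal over all pixels.
agreement = np.trace(confusion) / confusion.sum()
print(round(agreement, 3))  # 7 of 9 pixels agree -> 0.778
```

Crossing every pair of the five maps this way yields the agreement statistics compared at each BSCS level.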
Abstract:
Peatlands are soil environments that store carbon and large amounts of water, due to their composition (90 % water), low hydraulic conductivity and sponge-like behavior. It is estimated that peat bogs cover approximately 4.2 % of the Earth's surface and store 28.4 % of the planet's soil carbon. Approximately 612,000 ha of peatlands have been mapped in Brazil, but the peat bogs of the Serra do Espinhaço Meridional (SdEM) were not included. The objective of this study was to map the peat bogs of the northern part of the SdEM and estimate the organic matter pools and water volume they store. The peat bogs were pre-identified and mapped by GIS and remote sensing techniques, using ArcGIS 9.3, ENVI 4.5 and GPS Track Maker Pro software, and the maps were validated in the field. Six peat bogs were mapped in detail (1:20,000 and 1:5,000) along transects spaced 100 m apart; every 20 m along each transect, the UTM (Universal Transverse Mercator) coordinates and depth were determined and samples were collected for characterization and determination of organic matter, according to the Brazilian System of Soil Classification. In the northern part of the SdEM, 14,287.55 ha of peatlands were mapped, distributed over 1,180,109 ha, representing 1.2 % of the total area. These peatlands have a volume of 170,021,845 m³ and store 6,120,167 t (428.36 t ha-1) of organic matter and 142,138,262 m³ (9,948 m³ ha-1) of water. In the peat bogs of the Serra do Espinhaço Meridional, organic matter in an advanced stage of decomposition (sapric) predominates, followed by the intermediate stage (hemic). The vertical growth rate of the peatlands ranged between 0.04 and 0.43 mm year-1, while the carbon accumulation rate varied between 6.59 and 37.66 g m-2 year-1.
The peat bogs of the SdEM contain the headwaters of important water bodies in the basins of the Jequitinhonha and São Francisco Rivers and store large amounts of organic carbon and water, which is why the protection and preservation of these soil environments is an urgent and growing need.
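The per-hectare figures quoted above follow directly from the reported totals; this short check reproduces them from the numbers given in the abstract.

```python
# Totals reported for the northern SdEM peatlands.
area_ha = 14_287.55            # mapped peatland area, hectares
om_total_t = 6_120_167         # organic matter stock, tonnes
water_total_m3 = 142_138_262   # stored water, cubic metres

om_per_ha = om_total_t / area_ha          # tonnes of organic matter per hectare
water_per_ha = water_total_m3 / area_ha   # cubic metres of water per hectare

print(round(om_per_ha, 2))   # 428.36, as reported (t ha-1)
print(round(water_per_ha))   # 9948, as reported (m3 ha-1)
```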
Abstract:
In this study we evaluate the angular effects that alter the spectral response of land cover across multi-angle remote sensing image acquisitions. The shift in the statistical distribution of the pixels observed in an in-track sequence of WorldView-2 images is analyzed by means of a kernel-based measure of distance between probability distributions. Afterwards, the portability of supervised classifiers across the sequence is investigated by examining the evolution of the classification accuracy with respect to the changing observation angle. In this context, the efficiency of various physically and statistically based preprocessing methods in obtaining angle-invariant data spaces is compared, and possible synergies are discussed.
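The abstract does not name its exact kernel statistic, so as a plausible stand-in the sketch below uses the (biased) maximum mean discrepancy with an RBF kernel to quantify a distribution shift between pixel samples; the two synthetic "view angle" samples are assumptions for illustration.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=0.5):
    """RBF kernel matrix between two sample sets."""
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def mmd2(X, Y, gamma=0.5):
    """Biased squared maximum mean discrepancy between samples X and Y."""
    return (rbf_kernel(X, X, gamma).mean()
            + rbf_kernel(Y, Y, gamma).mean()
            - 2 * rbf_kernel(X, Y, gamma).mean())

rng = np.random.default_rng(1)
near_nadir = rng.normal(0.0, 1.0, size=(200, 4))  # pixel features at one view angle
off_nadir = rng.normal(0.6, 1.0, size=(200, 4))   # same scene, larger angle (shifted)

# The angular shift appears as a clearly non-zero distribution distance,
# while the distance of a sample to itself is exactly zero.
shift = mmd2(near_nadir, off_nadir)
assert shift > 0.01
assert abs(mmd2(near_nadir, near_nadir)) < 1e-12
```

Tracking such a distance along the in-track sequence gives a scalar summary of how far the pixel distribution drifts with observation angle.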
Abstract:
The epithelial Na(+) channel (ENaC) and the acid-sensing ion channels (ASICs) form subfamilies within the ENaC/degenerin family of Na(+) channels. ENaC mediates transepithelial Na(+) transport, thereby contributing to Na(+) homeostasis and the maintenance of blood pressure and the airway surface liquid level. ASICs are H(+)-activated channels found in central and peripheral neurons, where their activation induces neuronal depolarization. ASICs are involved in pain sensation, the expression of fear, and neurodegeneration after ischemia, making them potentially interesting drug targets. This review summarizes the biophysical properties, cellular functions, and physiologic and pathologic roles of the ASIC and ENaC subfamilies. The analysis of the homologies between ENaC and ASICs and the relation between functional and structural information shows many parallels between these channels, suggesting that some mechanisms that control channel activity are shared between ASICs and ENaC. The available crystal structures and the discovery of animal toxins acting on ASICs provide a unique opportunity to address the molecular mechanisms of ENaC and ASIC function to identify novel strategies for the modulation of these channels by pharmacologic ligands.
Abstract:
The objective of this work was to verify whether the energy reflected by soils can be used to characterize and discriminate them. A spectroradiometer (spectral range 400-2,500 nm) was used in the laboratory. The soils evaluated are located in the Bauru region, SP, Brazil, and are classified as Typic Argiudoll (TR), Typic Eutrorthox (LR), Typic Argiudoll (PE), Typic Haplortox (LE), Typic Paleudalf (PV) and Typic Quartzipsamment (AQ). They were characterized by their spectral reflectance as well as by conventional descriptive methods (Brazilian and international), according to the types of spectral curves, and a method for the descriptive spectral evaluation of soils was established. It was possible to characterize and discriminate the soils by their spectral reflectance, except for LR and TR. The spectral differences were best identified by the general shape of the spectral curves, the intensity of the absorption bands, and angle tendencies. These characteristics were influenced mainly by organic matter, iron, granulometry and mineralogical constituents. A reduction of iron and clay contents, which led to higher reflectance intensity and shape variations, occurred in the soils LR/TR, PE, LE, PV and AQ, in that sequence. Soils of the same group with different clay textures could be discriminated. The conventional descriptive evaluation of spectral curves was less efficient at discriminating soils. Simulated orbital data discriminated the soils mainly by bands 5 and 7.
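The "simulated orbital data" step can be sketched by averaging laboratory spectra over sensor band windows. The band limits below are the published Landsat TM ranges for bands 5 and 7; the two synthetic spectra (a darker clayey soil and a brighter sandy one) are illustrative assumptions, not the study's measured curves.

```python
import numpy as np

wavelengths = np.arange(400, 2501)                   # lab spectrum, 1 nm grid
clayey = 0.10 + 0.05 * (wavelengths - 400) / 2100    # low, flat reflectance curve
sandy = 0.25 + 0.20 * (wavelengths - 400) / 2100     # brighter, steeper curve

TM_BANDS = {5: (1550, 1750), 7: (2080, 2350)}        # SWIR band limits, nm

def simulate_band(spectrum, band):
    """Mean laboratory reflectance over one TM band window."""
    lo, hi = TM_BANDS[band]
    mask = (wavelengths >= lo) & (wavelengths <= hi)
    return spectrum[mask].mean()

for band in (5, 7):
    r_clay = simulate_band(clayey, band)
    r_sand = simulate_band(sandy, band)
    # The sandier (lower clay/iron) soil reflects more in both SWIR bands.
    assert r_sand > r_clay
```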
Abstract:
We investigate the relevance of morphological operators for the classification of land use in urban scenes using submetric panchromatic imagery. A support vector machine is used for the classification. Six types of filters have been employed: opening and closing, opening and closing by reconstruction, and opening and closing top hat. The type and scale of the filters are discussed, and a feature selection algorithm called recursive feature elimination is applied to decrease the dimensionality of the input data. The analysis performed on two QuickBird panchromatic images showed that simple opening and closing operators are the most relevant for classification at such a high spatial resolution. Moreover, mixed sets combining simple and reconstruction filters provided the best performance. Tests performed on both images, whose areas are characterized by different architectural styles, yielded similar results for both feature selection and classification accuracy, suggesting that the highlighted feature sets generalize.
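A minimal sketch of the simplest filter family named above: grayscale opening, closing and the opening top-hat with a square structuring element, implemented directly as sliding min/max (the study evaluates several filter families and scales; this shows only the basic pair on a toy scene).

```python
import numpy as np

def erode(img, size=3):
    """Sliding minimum over a size x size square structuring element."""
    pad = size // 2
    p = np.pad(img, pad, mode="edge")
    out = np.empty_like(img)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            out[i, j] = p[i:i + size, j:j + size].min()
    return out

def dilate(img, size=3):
    """Sliding maximum over a size x size square structuring element."""
    pad = size // 2
    p = np.pad(img, pad, mode="edge")
    out = np.empty_like(img)
    for i in range(img.shape[0]):
        for j in range(img.shape[1]):
            out[i, j] = p[i:i + size, j:j + size].max()
    return out

def opening(img, size=3):          # removes bright details smaller than the SE
    return dilate(erode(img, size), size)

def closing(img, size=3):          # removes dark details smaller than the SE
    return erode(dilate(img, size), size)

def opening_top_hat(img, size=3):  # keeps only the removed bright details
    return img - opening(img, size)

# A flat scene with one bright 1-pixel structure: the opening removes it,
# so the top-hat isolates it as a feature for the classifier.
scene = np.zeros((7, 7))
scene[3, 3] = 1.0
assert opening(scene).max() == 0.0
assert opening_top_hat(scene)[3, 3] == 1.0
```

Stacking such responses at several scales gives the feature vector that recursive feature elimination then prunes before the SVM.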
Abstract:
Of the approximately 25,000 bridges in Iowa, 28% are classified as structurally deficient, functionally obsolete, or both. The state of Iowa thus follows the national trend of an aging infrastructure in dire need of repair or replacement with a relatively limited funding base. Therefore, there is a need to develop new materials with properties that may lead to longer life spans and reduced life-cycle costs. In addition, new methods for determining the condition of structures are needed to monitor the structures effectively and identify when the useful life of the structure has expired or other maintenance is needed. High-performance steel (HPS) has emerged as a material with enhanced weldability, weathering capabilities, and fracture toughness compared to conventional structural steels. In 2004, the Iowa Department of Transportation opened Iowa's first HPS girder bridge, the East 12th Street Bridge over I-235 in Des Moines, Iowa. The objective of this project was to evaluate HPS as a viable option for use in Iowa bridges with a continuous structural health monitoring (SHM) system. The scope of the project included documenting the construction of the East 12th Street Bridge and concurrently developing a remote, continuous SHM system using fiber-optic sensing technology to evaluate the structural performance of the bridge. The SHM system included bridge evaluation parameters, similar to design parameters used by bridge engineers, for evaluating the structure. Through the successful completion of this project, a baseline of bridge performance was established that can be used for continued long-term monitoring of the structure. In general, the structural performance of the HPS bridge exceeded the design parameters and is performing well. Although some problems were encountered with the SHM system, the system functions well and recommendations for improving the system have been made.
Abstract:
Disasters are often perceived as fast and random events. While the triggers may be sudden, disasters are the result of an accumulation of consequences of inappropriate actions and decisions, and of global change. Changing this perception of risk requires advocacy tools. Quantitative methods have been developed to identify the distribution and the underlying factors of risk.

Disaster risk results from the intersection of hazards, exposure and vulnerability. The frequency and intensity of hazards can be influenced by climate change or by the decline of ecosystems; population growth increases exposure, while changes in the level of development affect vulnerability. Since each of these components may change, risk is dynamic and should be reviewed periodically by governments, insurance companies and development agencies. At the global level, such analyses are often performed using databases of reported losses. Our results show that these are likely to be biased, in particular by improvements in access to information. International loss databases are not exhaustive and give no information on exposure, intensity or vulnerability. A new approach, independent of reported losses, is therefore necessary.

The research presented here was mandated by the United Nations and by agencies working in development and the environment (UNDP, UNISDR, GTZ, UNEP and IUCN). These organizations needed a quantitative assessment of the underlying factors of risk, to raise awareness among policymakers and to prioritize disaster risk reduction projects.

The method is based on geographic information systems, remote sensing, databases and statistical analysis. It required a large amount of data (1.7 Tb, on both the physical environment and socio-economic parameters) and several thousand hours of processing. A comprehensive risk model was developed to reveal the distribution of hazards, exposure and risk, and to identify the underlying risk factors for several hazards (floods, tropical cyclones, earthquakes and landslides). Two multiple-risk indexes were generated to compare countries. The results include an evaluation of the roles of hazard intensity, exposure, poverty and governance in the patterns and trends of risk. It appears that the vulnerability factors change depending on the type of hazard and that, unlike exposure, their weight decreases as intensity increases.

At the local level, the method was tested to highlight the influence of climate change and ecosystem decline on the hazard itself. In northern Pakistan, deforestation increases landslide susceptibility. Research in Peru (based on satellite imagery and ground data collection) revealed rapid glacier retreat and provided an assessment of the remaining ice volume as well as scenarios of its possible evolution.

These results were presented to different audiences, including 160 governments. The results and generated data are available online through an open-source SDI (http://preview.grid.unep.ch). The method is flexible and easily transferable to other scales and issues, with good prospects for adaptation to other research areas.

The characterization of risk at the global level and the identification of the role of ecosystems in disaster risk are fast-developing fields. This research revealed many challenges; some were resolved, while others remain limitations. However, it is clear that the level of development, and moreover unsustainable development, configures a large part of disaster risk, and that the dynamics of risk are governed primarily by global change.
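The multiplicative structure of the risk model described above (risk as the intersection of hazard, exposure and vulnerability) can be sketched very simply. The function and all numbers below are invented for illustration; they are not the indexes or country values of the study.

```python
def risk_index(hazard_freq, exposed_population, vulnerability):
    """Toy risk score: hazard frequency x exposed population x vulnerability."""
    return hazard_freq * exposed_population * vulnerability

# Two hypothetical countries with identical hazard and exposure but
# different development levels, expressed here as vulnerability:
low_dev = risk_index(hazard_freq=0.2, exposed_population=1_000_000, vulnerability=0.8)
high_dev = risk_index(hazard_freq=0.2, exposed_population=1_000_000, vulnerability=0.1)

# The development level configures a large part of the resulting risk.
assert low_dev > high_dev
```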
Abstract:
In this paper, an advanced technique for the generation of deformation maps using synthetic aperture radar (SAR) data is presented. The algorithm estimates the linear and nonlinear components of the displacement, the error of the digital elevation model (DEM) used to cancel the topographic terms, and the atmospheric artifacts, from a reduced set of low spatial resolution interferograms. The pixel candidates are selected from those presenting a good coherence level in the whole set of interferograms, and the resulting nonuniform mesh is tessellated with the Delaunay triangulation to establish connections among them. The linear component of movement and the DEM error are estimated by adjusting a linear model to the data on the connections only. This information, once unwrapped to retrieve the absolute values, is then used to calculate the nonlinear component of movement and the atmospheric artifacts with alternate filtering techniques in both the temporal and spatial domains. The method presents high flexibility with respect to the required number of images and the baseline lengths, although better results are obtained with large datasets of short-baseline interferograms. The technique has been tested with European Remote Sensing (ERS) SAR data from an area of Catalonia (Spain) and validated with precise on-field leveling measurements.
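One step of the algorithm can be sketched in isolation: on each connection (arc) between two coherent pixels, a linear model is adjusted to the differential phases of the interferogram stack. Below, a relative linear velocity is recovered from noise-free synthetic arc phases; the C-band wavelength and temporal baselines are illustrative values, and the DEM-error term of the full model is omitted for brevity.

```python
import numpy as np

wavelength = 0.056                       # metres, C-band (as for ERS)
t = np.array([0.0, 0.5, 1.0, 1.5, 2.0])  # temporal baselines, years

true_v = 0.010                           # 10 mm/yr relative velocity on the arc
phase = (4 * np.pi / wavelength) * true_v * t   # synthetic differential phases

# Least-squares adjustment of the linear-velocity term on the connection:
A = (4 * np.pi / wavelength) * t[:, None]
v_hat, *_ = np.linalg.lstsq(A, phase, rcond=None)
print(round(float(v_hat[0]), 4))  # 0.01
```

In the full method this adjustment runs on every Delaunay arc, and the arc estimates are then integrated (unwrapped) to absolute values per pixel.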
Abstract:
A recently developed technique, polarimetric radar interferometry, is applied to tackle the problem of the detection of buried objects embedded in surface clutter. An experiment with a fully polarimetric radar in an anechoic chamber has been carried out using different frequency bands and baselines. The processed results show the ability of this technique to detect buried plastic mines and to measure their depth. This technique enables the detection of plastic mines even if their backscatter response is much lower than that of the surface clutter.
Abstract:
The objective of this work was to evaluate the seasonal variation of soil cover and rainfall erosivity, and their influence on the revised universal soil loss equation (Rusle), in order to estimate watershed soil losses on a temporal scale. Twenty-two Landsat 5 TM images from 1986 to 2009 were used to estimate the soil use and management factor (C factor). A corresponding rainfall erosivity factor (R factor) was considered for each image, and the other factors were obtained using the standard Rusle method. Estimated soil losses were grouped into classes and ranged from 0.13 Mg ha-1 on May 24, 2009 (dry season) to 62.0 Mg ha-1 on March 11, 2007 (rainy season). On these dates, maximum losses in the watershed were 2.2 and 781.5 Mg ha-1, respectively. Mean annual soil loss in the watershed was 109.5 Mg ha-1, but the central area, with a loss of nearly 300.0 Mg ha-1, was characterized as a site of high water-erosion risk. The use of the C factor obtained from remote sensing data, associated with the corresponding R factor, was fundamental for evaluating the soil erosion estimated by the Rusle in different seasons, unlike other studies, which keep these factors constant over time.
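The Rusle estimate is the product A = R x K x LS x C x P, and the seasonal contrast above comes from letting R and C vary per image date. The factor values in this worked sketch are illustrative, not the watershed's actual ones.

```python
def rusle(R, K, LS, C, P):
    """Soil loss A (Mg ha-1) as the product of the five Rusle factors."""
    return R * K * LS * C * P

K, LS, P = 0.03, 2.5, 1.0              # factors held fixed for one hillslope

# Seasonal R (erosivity) and C (cover) pairs, one per image date:
dry = rusle(R=500.0, K=K, LS=LS, C=0.01, P=P)    # low erosivity, dense cover
rainy = rusle(R=8000.0, K=K, LS=LS, C=0.25, P=P)  # erosive rains, sparse cover

print(round(dry, 3), round(rainy, 1))  # 0.375 150.0
```

Keeping R and C constant, as in the studies the abstract contrasts, would erase exactly this dry- vs rainy-season difference.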
Abstract:
In 1903, more than 30 million m³ of rock fell from the east slopes of Turtle Mountain in Alberta, Canada, causing a rock avalanche that killed about 70 people in the town of Frank. The Alberta Government, in response to continuing instabilities at the crest of the mountain, established a sophisticated field laboratory where state-of-the-art monitoring techniques have been installed and tested as part of an early-warning system. In this chapter, we provide an overview of the causes, trigger, and extreme mobility of the landslide. We then present new data relevant to the characterization and detection of the present-day instabilities on Turtle Mountain. Fourteen potential instabilities have been identified through field mapping and remote sensing. Lastly, we provide a detailed review of the different in-situ and remote monitoring systems that have been installed on the mountain. The implications of the new data for the future stability of Turtle Mountain and related landslide runout, and for monitoring strategies and risk management, are discussed.
Abstract:
This work presents new, efficient Markov chain Monte Carlo (MCMC) simulation methods for statistical analysis in various modelling applications. When using MCMC methods, the model is simulated repeatedly to explore the probability distribution describing the uncertainties in model parameters and predictions. In adaptive MCMC methods based on the Metropolis-Hastings algorithm, the proposal distribution needed by the algorithm learns from the target distribution as the simulation proceeds. Adaptive MCMC methods have lately been the subject of intensive research, as they make the methodology substantially easier to use; the lack of user-friendly computer programs has been a main obstacle to its wider acceptance. This work provides two new adaptive MCMC methods: DRAM and AARJ. The DRAM method was built especially to work in high-dimensional and non-linear problems. The AARJ method is an extension of DRAM for model selection problems, where the mathematical formulation of the model is uncertain and we want to fit several different models to the same observations simultaneously. The methods were developed with the needs of modelling applications typical of the environmental sciences in mind, and the development work was pursued within several application projects. The applications presented in this work are: a wintertime oxygen concentration model for Lake Tuusulanjärvi and adaptive control of the aerator; a nutrient model for Lake Pyhäjärvi and lake management planning; validation of the algorithms of the GOMOS ozone remote sensing instrument on board the Envisat satellite of the European Space Agency; and a study of the effects of aerosol model selection on the GOMOS algorithm.
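The adaptation idea behind DRAM can be sketched in greatly simplified form: a random-walk Metropolis sampler whose Gaussian proposal covariance is periodically re-estimated from the chain's own history. The standard-normal target, the adaptation schedule and all tuning constants below are illustrative assumptions; delayed rejection and the reversible-jump extension of AARJ are omitted entirely.

```python
import numpy as np

def log_target(x):
    """Log density of the target, here a standard normal for illustration."""
    return -0.5 * np.sum(x ** 2)

rng = np.random.default_rng(42)
dim, n_steps = 2, 5000
chain = np.empty((n_steps, dim))
x = np.zeros(dim)
cov = np.eye(dim)                       # initial proposal covariance

for i in range(n_steps):
    prop = rng.multivariate_normal(x, cov)
    # Metropolis accept/reject on the symmetric random-walk proposal:
    if np.log(rng.uniform()) < log_target(prop) - log_target(x):
        x = prop
    chain[i] = x
    if i >= 500 and i % 100 == 0:
        # Adapt: re-estimate the proposal covariance from the history so far,
        # with the usual 2.38^2/d scaling and a small regularizing jitter.
        cov = np.cov(chain[:i].T) * (2.38 ** 2 / dim) + 1e-6 * np.eye(dim)

# After burn-in, the adapted chain should roughly recover the target moments.
post = chain[1000:]
```

DRAM itself adds a delayed-rejection stage (a second, scaled-down proposal after each rejection) on top of this covariance adaptation.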