960 results for Hyperspectral remote sensing


Relevance:

80.00%

Publisher:

Abstract:

Disasters are often perceived as fast and random events. While the triggers may be sudden, disasters are the result of an accumulation of consequences of inappropriate actions and decisions, and of global change. To modify this perception of risk, advocacy tools are needed. Quantitative methods have been developed to identify the distribution and the underlying factors of risk.

Disaster risk results from the intersection of hazards, exposure and vulnerability. The frequency and intensity of hazards can be influenced by climate change or by the decline of ecosystems; population growth increases exposure, while changes in the level of development affect vulnerability. Since each of these components may change, risk is dynamic and should be reviewed periodically by governments, insurance companies or development agencies. At the global level, these analyses are often performed using databases of reported losses. Our results show that these are likely to be biased, in particular by improvements in access to information. International loss databases are not exhaustive and give no information on exposure, intensity or vulnerability. A new approach, independent of reported losses, is therefore necessary.

The research presented here was mandated by the United Nations and by agencies working in development and the environment (UNDP, UNISDR, GTZ, UNEP and IUCN). These organizations needed a quantitative assessment of the underlying factors of risk, both to raise awareness among policymakers and to prioritize disaster risk reduction projects.

The method is based on geographic information systems, remote sensing, databases and statistical analysis. It required a large amount of data (1.7 Tb, covering both the physical environment and socio-economic parameters) and several thousand hours of processing. A comprehensive risk model was developed to reveal the distribution of hazards, exposure and risk, and to identify underlying risk factors; these analyses were performed for several hazards (floods, tropical cyclones, earthquakes and landslides). Two different multiple-risk indexes were generated to compare countries. The results include an evaluation of the role of hazard intensity, exposure, poverty and governance in the pattern and trends of risk. It appears that vulnerability factors change depending on the type of hazard and that, unlike exposure, their weight decreases as intensity increases.

Locally, the method was tested to highlight the influence of climate change and ecosystem decline on the hazard. In northern Pakistan, deforestation increases landslide susceptibility. Research in Peru (based on satellite imagery and ground data collection) revealed rapid glacier retreat and provides an assessment of the remaining ice volume, as well as scenarios of its possible evolution.

These results were presented to different audiences, including 160 governments. The results and data generated are made available online through an open-source SDI (http://preview.grid.unep.ch). The method is flexible and easily transferable to different scales and issues, with good prospects for adaptation to other research areas. Risk characterization at the global level and identification of the role of ecosystems in disaster risk are developing rapidly. This research revealed many challenges; some were resolved, while others remain limitations. However, it is clear that the level of development, and moreover unsustainable development, shapes a large part of disaster risk, and that the dynamics of risk are primarily governed by global change.
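To make the hazard-exposure-vulnerability intersection concrete, here is a minimal sketch of the multiplicative risk formulation the abstract describes; the layer values and the interpretation of risk as expected people affected per year are illustrative assumptions, not the study's actual model.

```python
import numpy as np

# Hypothetical per-grid-cell layers; in the study these come from GIS and
# remote-sensing processing of hazard, population and socio-economic data.
hazard_frequency = np.array([0.1, 0.5, 0.9])    # expected events per year
exposed_population = np.array([1e4, 2e5, 5e3])  # people in the hazard zone
vulnerability = np.array([0.02, 0.10, 0.30])    # loss fraction per event

# Risk as the intersection (product) of hazard, exposure and vulnerability:
# expected number of people affected per year, per cell.
risk = hazard_frequency * exposed_population * vulnerability
print(risk)  # [20. 10000. 1350.]
```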

Relevance:

80.00%

Publisher:

Abstract:

In this paper, mixed spectral-structural kernel machines are proposed for the classification of very high resolution images. The simultaneous use of multispectral and structural features (computed using morphological filters) allows a significant increase in the classification accuracy of remote sensing images. Subsequently, weighted summation kernel support vector machines are proposed and applied in order to take into account the multiscale nature of the scene considered. Such classifiers use the Mercer property of kernel matrices to compute a new kernel matrix accounting simultaneously for two scale parameters. Tests on a Zurich QuickBird image show the relevance of the proposed method: using the mixed spectral-structural features, the classification accuracy increases by about 5%, achieving a Kappa index of 0.97. The proposed multikernel approach provides an overall accuracy of 98.90% with a Kappa index of 0.985.
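A minimal sketch of the weighted-summation-kernel idea: a convex combination of Mercer kernels is itself a Mercer kernel, so it can be fed to a precomputed-kernel SVM. The feature arrays, weights and gamma values below are placeholders, not the paper's settings.

```python
import numpy as np
from sklearn.metrics.pairwise import rbf_kernel
from sklearn.svm import SVC

# Toy stand-ins for the two feature groups (the paper uses multispectral
# bands and morphological-filter responses; these random arrays are only
# placeholders to make the sketch runnable).
rng = np.random.default_rng(0)
X_spectral = rng.normal(size=(100, 4))    # e.g. 4 spectral bands
X_structural = rng.normal(size=(100, 6))  # e.g. 6 morphological features
y = rng.integers(0, 3, size=100)          # 3 land-cover classes

# A weighted sum of Mercer kernels is itself a Mercer kernel, so the
# combined matrix can be given directly to a precomputed-kernel SVM.
w = 0.6  # relative weight of the spectral kernel (a free parameter)
K = w * rbf_kernel(X_spectral, gamma=0.5) + (1 - w) * rbf_kernel(X_structural, gamma=0.5)

clf = SVC(kernel="precomputed").fit(K, y)
print(clf.score(K, y))  # training accuracy on the toy data
```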

Relevance:

80.00%

Publisher:

Abstract:

Land use/cover classification is one of the most important applications in remote sensing. However, accurately mapping land use/cover spatial distribution is a challenge, particularly in moist tropical regions, due to the complex biophysical environment and limitations of remote sensing data per se. This paper reviews a decade of experiments related to land use/cover classification in the Brazilian Amazon. Through comprehensive analysis of the classification results, it is concluded that the spatial information inherent in remote sensing data plays an essential role in improving land use/cover classification. Incorporating suitable textural images into multispectral bands and using segmentation-based methods are valuable ways to improve land use/cover classification, especially for high spatial resolution images. Data fusion of multi-resolution images within optical sensor data is vital for visual interpretation, but may not improve classification performance. In contrast, integration of optical and radar data did improve classification performance when a proper data fusion method was used. Among the available classification algorithms, the maximum likelihood classifier remains an important method for providing reasonably good accuracy, but nonparametric algorithms, such as classification tree analysis, have the potential to provide better results; however, they often require more time for parameter optimization. Proper use of hierarchical methods is fundamental for developing accurate land use/cover classification, mainly from historical remotely sensed data.
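As an illustration of incorporating textural images into the classification stack, the sketch below computes one GLCM texture measure (contrast) over moving windows using scikit-image; the window size, grey levels and chosen statistic are assumptions for the example.

```python
import numpy as np
from skimage.feature import graycomatrix, graycoprops

# A hypothetical single band, quantized to 8 grey levels for the GLCM.
rng = np.random.default_rng(1)
band = rng.integers(0, 8, size=(64, 64)).astype(np.uint8)

def glcm_contrast(window):
    """GLCM contrast for one window (one of the textural measures
    commonly stacked with spectral bands before classification)."""
    glcm = graycomatrix(window, distances=[1], angles=[0], levels=8,
                        symmetric=True, normed=True)
    return graycoprops(glcm, "contrast")[0, 0]

# Compute texture over non-overlapping 8x8 windows (a moving window with
# stride 1 would give a full-resolution texture image instead).
texture = np.array([[glcm_contrast(band[i:i+8, j:j+8])
                     for j in range(0, 64, 8)]
                    for i in range(0, 64, 8)])
print(texture.shape)  # (8, 8) texture image, ready to stack with spectral bands
```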

Relevance:

80.00%

Publisher:

Abstract:

The objective of this work was to evaluate a simple, semi‑automated methodology for mapping cropland areas in the state of Mato Grosso, Brazil. A Fourier transform was applied over a time series of vegetation index products from the moderate resolution imaging spectroradiometer (Modis) sensor. This procedure allows for the evaluation of the amplitude of the periodic changes in vegetation response through time and the identification of areas with strong seasonal variation related to crop production. Annual cropland masks from 2006 to 2009 were generated, and municipal cropland areas were estimated through remote sensing. We observed good agreement with official statistics on planted area, especially for municipalities with more than 10% cropland cover (R² = 0.89), but poor agreement in municipalities with less than 5% crop cover (R² = 0.41). The assessed methodology can be used for annual cropland mapping over large production areas in Brazil.
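A minimal sketch of the harmonic-amplitude idea behind the method: the amplitude of the annual Fourier component separates strongly seasonal (cropped) pixels from stable ones. The synthetic NDVI series and the 0.15 threshold are illustrative, not the paper's calibration.

```python
import numpy as np

# Hypothetical 16-day NDVI time series for one pixel over one year
# (23 composites); real input would be a Modis vegetation-index stack.
t = np.arange(23)
ndvi_crop = 0.45 + 0.30 * np.sin(2 * np.pi * t / 23)    # strong annual cycle
ndvi_forest = 0.80 + 0.02 * np.sin(2 * np.pi * t / 23)  # nearly constant

def annual_amplitude(series):
    """Amplitude of the first (annual) harmonic of the series."""
    spectrum = np.fft.rfft(series - series.mean())
    return 2 * np.abs(spectrum[1]) / len(series)

# Pixels whose annual amplitude exceeds a threshold are flagged as cropland.
for name, s in [("crop", ndvi_crop), ("forest", ndvi_forest)]:
    print(name, round(annual_amplitude(s), 3))
# A simple mask would be: amplitude > 0.15 (the threshold is illustrative only).
```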

Relevance:

80.00%

Publisher:

Abstract:

In this paper, an advanced technique for the generation of deformation maps using synthetic aperture radar (SAR) data is presented. The algorithm estimates the linear and nonlinear components of the displacement, the error of the digital elevation model (DEM) used to cancel the topographic terms, and the atmospheric artifacts, from a reduced set of low spatial resolution interferograms. The pixel candidates are selected from those presenting a good coherence level in the whole set of interferograms, and the resulting nonuniform mesh is tessellated using Delaunay triangulation to establish connections among them. The linear component of movement and the DEM error are estimated by adjusting a linear model to the data on the connections only. Later on, this information, once unwrapped to retrieve the absolute values, is used to calculate the nonlinear component of movement and the atmospheric artifacts with alternate filtering techniques in both the temporal and spatial domains. The method presents high flexibility with respect to the required number of images and the baseline lengths. However, better results are obtained with large datasets of short-baseline interferograms. The technique has been tested with European Remote Sensing SAR data from an area of Catalonia (Spain) and validated with precise on-field leveling measurements.
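A simplified sketch of the per-arc linear adjustment: on each Delaunay connection, the differential phase is modeled as a linear-velocity term plus a DEM-error term and solved by least squares. The baselines and geometry below are invented ERS-like values, and unwrapped phases are assumed, whereas the actual algorithm works on wrapped phases by maximizing model coherence.

```python
import numpy as np

# Hypothetical stack of M low-resolution interferograms: for one arc
# (a Delaunay connection between two coherent pixel candidates) we model
# the phase difference as a velocity term plus a DEM-error term.
wavelength = 0.056                   # m, C-band (ERS)
M = 20
rng = np.random.default_rng(2)
T = rng.uniform(-1.5, 1.5, M)        # temporal baselines [years]
B_perp = rng.uniform(-150, 150, M)   # perpendicular baselines [m]
R, inc = 850e3, np.deg2rad(23)       # slant range and incidence (ERS-like)

# Design matrix: phase sensitivity to velocity increment dv and DEM error de.
A = (4 * np.pi / wavelength) * np.column_stack([T, B_perp / (R * np.sin(inc))])

true_dv, true_de = 0.005, 8.0        # 5 mm/yr and 8 m (simulated truth)
phase = A @ np.array([true_dv, true_de]) + rng.normal(0, 0.3, M)  # + noise

# Least-squares adjustment of the linear model on this connection.
dv, de = np.linalg.lstsq(A, phase, rcond=None)[0]
print(f"dv = {dv*1e3:.2f} mm/yr, de = {de:.2f} m")
```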

Relevance:

80.00%

Publisher:

Abstract:

In May 1999, the European Space Agency (ESA) selected the Earth Explorer Opportunity Soil Moisture and Ocean Salinity (SMOS) mission to obtain global and frequent soil moisture and ocean salinity maps. SMOS' single payload is the Microwave Imaging Radiometer by Aperture Synthesis (MIRAS), an L-band two-dimensional aperture synthesis radiometer with multiangular observation capabilities. At L-band, the brightness temperature sensitivity to the sea surface salinity (SSS) is low, approximately 0.5 K/psu at 20°C, decreasing to 0.25 K/psu at 0°C, comparable to the wind speed sensitivity of ~0.2 K/(m/s) at nadir. However, at a given time, the sea state does not depend only on local winds, but on the local wind history and the presence of waves traveling from far distances. The Wind and Salinity Experiment (WISE) 2000 and 2001 campaigns were sponsored by ESA to determine the impact of oceanographic and atmospheric variables on the L-band brightness temperature at vertical and horizontal polarizations. This paper presents the results of the analysis of three nonstationary sea state conditions: growing and decreasing sea, and the presence of swell. Measured sea surface spectra are compared with the theoretical ones, computed using the instantaneous wind speed. Differences can be minimized using an "effective wind speed" that makes the theoretical spectrum best match the measured one. The impact on the predicted brightness temperatures is then assessed using the small slope approximation/small perturbation method (SSA/SPM).
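The effective-wind-speed step can be sketched as a one-dimensional fit: choose the wind speed that minimizes the misfit between a modeled and a measured spectrum. The power-law spectrum model below is a toy placeholder; a real analysis would use a full sea-surface spectrum model.

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Placeholder sea-surface spectrum model S(k; U). This toy power law with a
# wind-dependent level only illustrates the fitting step in the abstract.
k = np.logspace(-2, 1, 50)             # wavenumbers [rad/m]
def model(U):
    return 1e-3 * U**2 * k**-3         # hypothetical S(k; U)

rng = np.random.default_rng(3)
measured = model(7.3) * (1 + 0.05 * rng.normal(size=k.size))  # synthetic "data"

# Effective wind speed: the U that makes the modeled spectrum best match
# the measured one (least squares on log-spectra).
def cost(U):
    return np.sum((np.log(model(U)) - np.log(measured)) ** 2)

res = minimize_scalar(cost, bounds=(1.0, 25.0), method="bounded")
print(f"effective wind speed = {res.x:.2f} m/s")
```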

Relevance:

80.00%

Publisher:

Abstract:

Compared to synthetic aperture radars (SARs), the angular resolution of microwave radiometers is quite poor. Traditionally, it has been limited by the physical size of the antenna. However, the angular resolution can be improved by means of aperture synthesis interferometric techniques: a narrow beam is synthesized during the image formation processing of the cross-correlations measured at zero lag between pairs of signals collected by an array of antennas. The angular resolution is then determined by the maximum antenna spacing normalized to the wavelength (the longest baseline). The next step in improving the angular resolution is the Doppler-Radiometer, conceptually related to super-synthesis radiometers and the Radiometer-SAR. This paper presents the concept of a three-antenna Doppler-Radiometer for 2D imaging. The performance of this instrument is evaluated in terms of angular/spatial resolution and radiometric sensitivity, and an illustrative L-band example is presented.
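A back-of-the-envelope illustration of the baseline-limited angular resolution mentioned above; the maximum baseline and orbit height are hypothetical, and the λ/d_max beamwidth is an order-of-magnitude rule, not the instrument's exact synthesized beam.

```python
import numpy as np

# Angular resolution set by the maximum antenna spacing in wavelengths
# (the longest baseline), as in aperture-synthesis radiometry.
c = 3e8
f = 1.4e9                 # L-band [Hz]
lam = c / f               # ~0.214 m
d_max = 8.0               # hypothetical maximum baseline [m]

theta = lam / d_max       # synthesized beamwidth [rad], order of magnitude
print(f"angular resolution ~ {np.degrees(theta):.2f} deg")

# From a hypothetical 750-km orbit, the corresponding ground resolution:
h = 750e3
print(f"ground resolution at nadir ~ {theta * h / 1e3:.1f} km")
```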

Relevance:

80.00%

Publisher:

Abstract:

This paper presents a model of the Stokes emission vector from the ocean surface. The ocean surface is described as an ensemble of facets with Cox and Munk's (1954) Gram-Charlier slope distribution. The study discusses the impact of different up-wind and cross-wind rms slopes, skewness, peakedness, foam cover models and atmospheric effects on the azimuthal variation of the Stokes vector, as well as the limitations of the model. Simulation results compare favorably, both in mean value and azimuthal dependence, with SSM/I data at a 53° incidence angle and with JPL's WINDRAD measurements at incidence angles from 30° to 65° and wind speeds from 2.5 to 11 m/s.
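For reference, a sketch of the Cox and Munk (1954) Gram-Charlier slope distribution that the facet model builds on; the coefficients are the classic published fits and should be treated as indicative.

```python
import numpy as np

def cox_munk_gram_charlier(zx, zy, U):
    """Gram-Charlier sea-surface slope PDF (after Cox & Munk, 1954).

    zx, zy: upwind and crosswind slopes; U: wind speed [m/s].
    The coefficients below are the classic Cox-Munk fits (indicative).
    """
    su2 = 3.16e-3 * U                 # upwind slope variance
    sc2 = 0.003 + 1.92e-3 * U         # crosswind slope variance
    eta = zx / np.sqrt(su2)           # normalized upwind slope
    xi = zy / np.sqrt(sc2)            # normalized crosswind slope
    c21 = 0.01 - 0.0086 * U           # skewness coefficients
    c03 = 0.04 - 0.033 * U
    c40, c22, c04 = 0.40, 0.12, 0.23  # peakedness coefficients
    gauss = np.exp(-0.5 * (xi**2 + eta**2)) / (2 * np.pi * np.sqrt(su2 * sc2))
    gc = (1
          - 0.5 * c21 * (xi**2 - 1) * eta
          - (c03 / 6) * (eta**3 - 3 * eta)
          + (c40 / 24) * (xi**4 - 6 * xi**2 + 3)
          + (c22 / 4) * (xi**2 - 1) * (eta**2 - 1)
          + (c04 / 24) * (eta**4 - 6 * eta**2 + 3))
    return gauss * gc

print(cox_munk_gram_charlier(0.0, 0.0, U=7.0))  # PDF at zero slope, 7 m/s wind
```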

Relevance:

80.00%

Publisher:

Abstract:

A recently developed technique, polarimetric radar interferometry, is applied to the detection of buried objects embedded in surface clutter. An experiment with a fully polarimetric radar in an anechoic chamber was carried out using different frequency bands and baselines. The processed results show the ability of this technique to detect buried plastic mines and to measure their depth. The technique enables the detection of plastic mines even when their backscatter response is much lower than that of the surface clutter.

Relevance:

80.00%

Publisher:

Abstract:

The problem of synthetic aperture radar interferometric phase noise reduction is addressed. A new technique based on discrete wavelet transforms is presented. This technique guarantees high-resolution phase estimation without resorting to phase image segmentation, and areas containing only noise are left largely unprocessed. Tests with synthetic and real interferograms are reported.
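A minimal sketch of wavelet-domain interferometric phase filtering in the spirit of the abstract (not the paper's exact algorithm): the complex phasor, rather than the wrapped phase, is denoised by soft-thresholding detail coefficients, which leaves low-signal areas largely untouched.

```python
import numpy as np
import pywt

# Synthetic noisy interferometric phase: a smooth fringe pattern plus noise.
rng = np.random.default_rng(4)
x, y = np.meshgrid(np.linspace(0, 4 * np.pi, 128), np.linspace(0, 4 * np.pi, 128))
noise = 0.5 * (rng.normal(size=x.shape) + 1j * rng.normal(size=x.shape))
phase = np.angle(np.exp(1j * (x + 0.5 * y)) + noise)

def wavelet_denoise(img, wavelet="db4", level=3, thr=0.2):
    """Soft-threshold the detail coefficients of a 2D wavelet transform."""
    coeffs = pywt.wavedec2(img, wavelet, level=level)
    coeffs = [coeffs[0]] + [tuple(pywt.threshold(d, thr, mode="soft") for d in detail)
                            for detail in coeffs[1:]]
    return pywt.waverec2(coeffs, wavelet)

# Filter the complex phasor (not the wrapped phase itself) to avoid
# artifacts at the 2*pi discontinuities, then recover the phase.
z = np.exp(1j * phase)
z_f = wavelet_denoise(z.real) + 1j * wavelet_denoise(z.imag)
phase_filtered = np.angle(z_f)
print(phase_filtered.shape)  # (128, 128)
```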

Relevance:

80.00%

Publisher:

Abstract:

The inverse scattering problem concerning the determination of the joint time-delay-Doppler-scale reflectivity density characterizing continuous target environments is addressed by recourse to generalized frame theory. A reconstruction formula, involving the echoes of a frame of outgoing signals and its corresponding reciprocal frame, is developed. A "realistic" situation with respect to the transmission of a finite number of signals is further considered. In such a case, our reconstruction formula is shown to yield the orthogonal projection of the reflectivity density onto a subspace generated by the transmitted signals.
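For reference, the standard frame-reconstruction identity that the abstract's formula instantiates; the finite-family statement below is a sketch of the projection result, assuming the reciprocal frame is computed for the transmitted family.

```latex
% For a frame \{\varphi_k\} of a Hilbert space with reciprocal (dual)
% frame \{\tilde{\varphi}_k\}, any element f admits the expansion
\[
  f = \sum_{k} \langle f, \varphi_k \rangle \, \tilde{\varphi}_k .
\]
% When only a finite family \{\varphi_k\}_{k \in \mathcal{K}} is transmitted
% and the reciprocal frame is computed for that family, the same expansion
% yields the orthogonal projection of f onto its span:
\[
  P_{\mathcal{K}} f = \sum_{k \in \mathcal{K}} \langle f, \varphi_k \rangle \, \tilde{\varphi}_k .
\]
```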

Relevance:

80.00%

Publisher:

Abstract:

The objective of this work was to evaluate the seasonal variation of soil cover and rainfall erosivity, and their influence on the revised universal soil loss equation (Rusle), in order to estimate watershed soil losses on a temporal scale. Twenty-two TM Landsat 5 images from 1986 to 2009 were used to estimate the soil use and management factor (C factor). A corresponding rainfall erosivity factor (R factor) was considered for each image, and the other factors were obtained using the standard Rusle method. Estimated soil losses were grouped into classes and ranged from 0.13 Mg ha-1 on May 24, 2009 (dry season) to 62.0 Mg ha-1 on March 11, 2007 (rainy season). On these dates, maximum losses in the watershed were 2.2 and 781.5 Mg ha-1, respectively. Mean annual soil loss in the watershed was 109.5 Mg ha-1, but the central area, with a loss of nearly 300.0 Mg ha-1, was characterized as a site of high water-erosion risk. The use of a C factor obtained from remote sensing data, associated with the corresponding R factor, was fundamental for evaluating the soil erosion estimated by the Rusle in different seasons, unlike other studies, which keep these factors constant over time.
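A minimal sketch of the per-pixel Rusle computation with date-dependent R and C factors, as described above; all layer values and the two (R, C) pairs are illustrative stand-ins, not the study's data.

```python
import numpy as np

# Rusle: per-pixel soil loss A = R * K * LS * C * P. The study varies
# R (rainfall erosivity) and C (cover-management, from Landsat) per image
# date while keeping the other factors fixed.
shape = (100, 100)
rng = np.random.default_rng(5)
K = rng.uniform(0.01, 0.05, shape)   # soil erodibility [Mg h MJ-1 mm-1]
LS = rng.uniform(0.5, 5.0, shape)    # slope length/steepness factor [-]
P = np.ones(shape)                   # support-practice factor [-]

# One (R, C) pair per image date; values here are illustrative only.
dates = {"2009-05-24": (150.0, rng.uniform(0.001, 0.01, shape)),   # dry season
         "2007-03-11": (1800.0, rng.uniform(0.05, 0.4, shape))}    # rainy season

for date, (R, C) in dates.items():
    A = R * K * LS * C * P           # soil loss [Mg ha-1] for that date
    print(date, f"mean loss = {A.mean():.2f} Mg/ha")
```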

Relevance:

80.00%

Publisher:

Abstract:

In this paper, we consider active sampling to label pixels grouped by hierarchical clustering. The objective of the method is to match the data relationships discovered by the clustering algorithm with the user's desired class semantics. The former are represented as a complete tree to be pruned, and the latter are iteratively provided by the user. The proposed active learning algorithm searches for the pruning of the tree that best matches the labels of the sampled points. By choosing the part of the tree to sample from according to the current pruning's uncertainty, sampling is focused on the most uncertain clusters. This way, large clusters whose class membership is already fixed are no longer queried, and sampling focuses on dividing clusters that show mixed labels. The model is tested on a VHR image in a multiclass classification setting. The method clearly outperforms random sampling in a transductive setting, but cannot generalize to unseen data, since it aims at optimizing the classification of a given cluster structure.
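A simplified sketch of uncertainty-driven sampling over a cluster hierarchy: labels are queried from the cluster whose sampled labels are most mixed. It uses a fixed flat cut rather than the paper's tree-pruning search, and the data and query budget are toy assumptions.

```python
import numpy as np
from scipy.cluster.hierarchy import linkage, fcluster

# Toy data: two features, three classes (stand-ins for image pixels).
rng = np.random.default_rng(6)
X = np.vstack([rng.normal(c, 0.5, size=(50, 2)) for c in (0, 3, 6)])
y_true = np.repeat([0, 1, 2], 50)          # oracle labels (the "user")

Z = linkage(X, method="ward")
clusters = fcluster(Z, t=6, criterion="maxclust")  # initial over-segmentation

labels = {}                                 # index -> queried label
for _ in range(30):                         # query budget of 30 labels
    def impurity(c):
        """Label mixedness of cluster c; unsampled clusters count as 1.0."""
        got = [labels[i] for i in np.flatnonzero(clusters == c) if i in labels]
        if not got:
            return 1.0
        _, counts = np.unique(got, return_counts=True)
        return 1.0 - counts.max() / counts.sum()

    target = max(np.unique(clusters), key=impurity)   # most uncertain cluster
    candidates = [i for i in np.flatnonzero(clusters == target) if i not in labels]
    if not candidates:
        break
    i = rng.choice(candidates)
    labels[i] = y_true[i]                   # ask the oracle for one label

# A transductive step would then propagate the majority queried label
# within each cluster.
print(f"{len(labels)} pixels queried")
```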

Relevance:

80.00%

Publisher:

Abstract:

This paper presents a novel image classification scheme for benthic coral reef images that can be applied to both single-image and composite mosaic datasets. The proposed method can be configured to the characteristics (e.g., the size of the dataset, number of classes, resolution of the samples, color information availability, class types, etc.) of individual datasets. The proposed method uses completed local binary pattern (CLBP), grey level co-occurrence matrix (GLCM), Gabor filter response, and opponent angle and hue channel color histograms as feature descriptors. For classification, either k-nearest neighbor (KNN), neural network (NN), support vector machine (SVM) or probability density weighted mean distance (PDWMD) is used. The combination of features and classifiers that attains the best results is presented, together with guidelines for selection. The accuracy and efficiency of the proposed method are compared with other state-of-the-art techniques using three benthic and three texture datasets. The proposed method achieves the highest overall classification accuracy of any of the tested methods and has moderate execution time. Finally, the proposed classification scheme is applied to a large-scale image mosaic of the Red Sea to create a completely classified thematic map of the reef benthos.
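A condensed sketch of the feature-plus-classifier pipeline: LBP, GLCM and Gabor descriptors are concatenated and fed to one of the candidate classifiers (here KNN). The descriptor choices and parameters are simplified stand-ins for the paper's CLBP/GLCM/Gabor/color configuration.

```python
import numpy as np
from skimage.feature import local_binary_pattern, graycomatrix, graycoprops
from skimage.filters import gabor
from sklearn.neighbors import KNeighborsClassifier

def describe(patch):
    """Concatenate an LBP histogram, GLCM statistics and Gabor energy for
    one patch (simplified stand-ins for the paper's descriptors)."""
    lbp = local_binary_pattern(patch, P=8, R=1, method="uniform")
    hist, _ = np.histogram(lbp, bins=10, range=(0, 10), density=True)
    glcm = graycomatrix((patch // 32).astype(np.uint8), [1], [0],
                        levels=8, normed=True)
    stats = [graycoprops(glcm, p)[0, 0] for p in ("contrast", "homogeneity")]
    real, _ = gabor(patch, frequency=0.2)
    return np.concatenate([hist, stats, [np.mean(real**2)]])

# Toy patches standing in for benthic image samples of two classes.
rng = np.random.default_rng(7)
patches = [rng.integers(0, 256, (32, 32)).astype(np.uint8) for _ in range(20)]
y = np.array([0] * 10 + [1] * 10)

F = np.array([describe(p) for p in patches])
clf = KNeighborsClassifier(n_neighbors=3).fit(F, y)
print(clf.score(F, y))  # training accuracy on the toy data
```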

Relevance:

80.00%

Publisher:

Abstract:

In 1903, more than 30 million m³ of rock fell from the east slopes of Turtle Mountain in Alberta, Canada, causing a rock avalanche that killed about 70 people in the town of Frank. The Alberta Government, in response to continuing instabilities at the crest of the mountain, established a sophisticated field laboratory where state-of-the-art monitoring techniques have been installed and tested as part of an early-warning system. In this chapter, we provide an overview of the causes, trigger, and extreme mobility of the landslide. We then present new data relevant to the characterization and detection of the present-day instabilities on Turtle Mountain. Fourteen potential instabilities have been identified through field mapping and remote sensing. Lastly, we provide a detailed review of the different in-situ and remote monitoring systems that have been installed on the mountain. The implications of the new data for the future stability of Turtle Mountain and related landslide runout, and for monitoring strategies and risk management, are discussed.