123 results for Ore deposits Remote-sensing maps


Relevance: 100.00%

Abstract:

This paper presents results of the AQL2004 project, which was developed within the GOFC-GOLD Latin American network of remote sensing and forest fires (RedLatif). The project aimed to obtain monthly burned-land maps of the entire region, from Mexico to Patagonia, using MODIS (Moderate Resolution Imaging Spectroradiometer) reflectance data. The project was organized in three phases: acquisition and preprocessing of satellite data; discrimination of burned pixels; and validation of results. In the first phase, input data consisting of 32-day composites of MODIS 500-m reflectance data generated by the Global Land Cover Facility (GLCF) of the University of Maryland (College Park, Maryland, U.S.A.) were collected and processed. The discrimination of burned areas was addressed in two steps: first, searching for "burned core" pixels using postfire spectral indices and multitemporal change detection; second, mapping full burned scars using contextual techniques. The validation phase was based on visual analysis of Landsat and CBERS (China-Brazil Earth Resources Satellite) images. Validation of the burned-land category showed an agreement ranging from 30% to 60%, depending on the ecosystem and vegetation species present. The total burned area for the entire year was estimated at 153,215 km². The countries most affected relative to their territory were Cuba, Colombia, Bolivia, and Venezuela. Burned areas were found in most land covers; herbaceous vegetation (savannas and grasslands) presented the highest proportions of burned area, while perennial forest had the lowest. The contribution of croplands to the total burned area should be interpreted with caution, since this cover presented the highest commission errors. The importance of generating systematic burned-land products for studies of different ecological processes is emphasized.
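
The two-step strategy, high-confidence "burned core" seeds followed by contextual growing, can be illustrated with a minimal sketch; the index, thresholds, and function names below are hypothetical stand-ins, not the AQL2004 implementation:

```python
import numpy as np
from scipy import ndimage

def map_burned_scars(pre_nbr, post_nbr, seed_drop=0.3, grow_drop=0.15):
    """Seed-and-grow burned-area mapping (illustrative sketch).

    pre_nbr, post_nbr: 2-D arrays of a pre-/postfire spectral index such
    as the Normalized Burn Ratio; both thresholds are hypothetical.
    """
    d_nbr = pre_nbr - post_nbr          # multitemporal change signal
    seeds = d_nbr > seed_drop           # high-confidence "burned core" pixels
    candidates = d_nbr > grow_drop      # looser, contextual criterion
    # Grow scars contextually: keep candidate regions containing a seed.
    labels, _ = ndimage.label(candidates)
    burned_ids = np.unique(labels[seeds])
    return np.isin(labels, burned_ids) & (labels > 0)
```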

Relevance: 100.00%

Abstract:

Land cover plays a key role in global to regional monitoring and modeling because it affects and is affected by climate change, and it has thus become one of the essential variables for climate change studies. National and international organizations require timely and accurate land cover information for reporting and management actions. The North American Land Change Monitoring System (NALCMS) is an international cooperation of organizations and entities in Canada, the United States, and Mexico to map land cover change across North America's changing environment. This paper presents the methodology used to derive the land cover map of Mexico for the year 2005, which was integrated into the NALCMS continental map. The complexity of the Mexican landscape required a specific approach to reflect land cover heterogeneity, based on a time series of 250 m Moderate Resolution Imaging Spectroradiometer (MODIS) data and an extensive sample database. To estimate the proportion of each land cover class for every pixel, several decision tree classifications were combined to obtain class membership maps, which were finally converted to a discrete map accompanied by a confidence estimate. The map yielded an overall accuracy of 82.5% (Kappa of 0.79) for pixels with at least 50% map confidence (71.3% of the data). An additional assessment with 780 randomly stratified samples, using primary and alternative calls in the reference data to account for ambiguity, indicated 83.4% overall accuracy (Kappa of 0.80). A high agreement of 83.6% for all pixels, and 92.6% for pixels with a map confidence of more than 50%, was found in the comparison between the land cover maps of 2005 and 2006. Further wall-to-wall comparisons to related land cover maps yielded 56.6% agreement with the MODIS land cover product and a congruence of 49.5% with GlobCover.
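
A minimal sketch of the fusion step, combining class-membership maps from several decision-tree runs into a discrete map with a confidence layer, might look as follows; the array layout and names are assumptions, not the NALCMS processing chain:

```python
import numpy as np

def fuse_memberships(memberships):
    """Fuse class-membership maps from several decision-tree classifiers.

    memberships: array of shape (n_trees, n_classes, rows, cols) holding
    each tree's estimated class proportions per pixel (assumed layout).
    Returns a discrete class map and a per-pixel confidence estimate.
    """
    mean_prop = memberships.mean(axis=0)     # (n_classes, rows, cols)
    class_map = mean_prop.argmax(axis=0)     # most supported class per pixel
    confidence = mean_prop.max(axis=0)       # proportion backing that call
    return class_map, confidence
```

Pixels whose confidence falls below 0.5 could then be flagged, mirroring the paper's separate accuracy figure for the at-least-50%-confidence subset.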

Relevance: 100.00%

Abstract:

Land cover data derived from satellites are commonly used to prescribe inputs to models of the land surface. Since such data inevitably contain errors, quantifying how uncertainties in the data affect a model's output is important. To do so, a spatial distribution of possible land cover values is required to propagate through the model's simulation. However, at large scales, such as those required for climate models, such spatial modelling can be difficult. Also, computer models often require land cover proportions at sites larger than the original map scale as inputs, and it is the uncertainty in these proportions that this article discusses. This paper describes a Monte Carlo sampling scheme that generates realisations of land cover proportions from the posterior distribution implied by a Bayesian analysis combining spatial information in the land cover map with its associated confusion matrix. The technique is computationally simple and has been applied previously to the Land Cover Map 2000 for the region of England and Wales. This article demonstrates the ability of the technique to scale up to large (global) satellite-derived land cover maps and reports its application to the GlobCover 2009 data product. The results show that, in general, the GlobCover data possess only small biases, with the largest belonging to non-vegetated surfaces. Among vegetated surfaces, the most prominent area of uncertainty is Southern Africa, which represents a complex heterogeneous landscape. It is also clear from this study that greater resources need to be devoted to the construction of comprehensive confusion matrices.
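
The core sampling idea can be sketched in a few lines. The version below ignores the spatial term of the paper's Bayesian analysis and simply resamples each pixel's true class from a normalised confusion matrix, so it is a simplified stand-in rather than the published scheme:

```python
import numpy as np

rng = np.random.default_rng(1)

def sample_true_proportions(mapped_classes, confusion, n_draws=100):
    """Monte Carlo realisations of true land cover proportions for a site.

    mapped_classes: 1-D array of mapped class labels for the site's pixels.
    confusion[i, j]: count of validation pixels mapped as i but truly j.
    """
    p_true = confusion / confusion.sum(axis=1, keepdims=True)  # P(true | mapped)
    n_classes = confusion.shape[0]
    draws = np.empty((n_draws, n_classes))
    for k in range(n_draws):
        sampled = np.array([rng.choice(n_classes, p=p_true[c])
                            for c in mapped_classes])
        draws[k] = np.bincount(sampled, minlength=n_classes) / sampled.size
    return draws  # spread across rows measures proportion uncertainty
```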

Relevance: 100.00%

Abstract:

Sea-level rise (SLR) from global warming may have severe consequences for coastal cities, particularly when combined with predicted increases in the strength of tidal surges. Predicting the regional impact of SLR flooding depends strongly on the modelling approach and the accuracy of topographic data. Here, the areas at risk of seawater flooding in the London boroughs were quantified based on the SLR scenarios projected in the Intergovernmental Panel on Climate Change (IPCC) Fifth Assessment Report (AR5) and the UK Climate Projections 2009 (UKCP09), using a tidally-adjusted bathtub modelling approach. Medium- to very high-resolution digital elevation models (DEMs) are used to evaluate inundation extents as well as their uncertainties. Depending on the SLR scenario and the DEM used, it is estimated that 3%–8% of the area of Greater London could be inundated by 2100. The boroughs with the largest areas at risk of flooding are Newham, Southwark, and Greenwich. The differences in inundation areas estimated from a digital terrain model and a digital surface model are much greater than the root mean square error differences observed between the two data types, which may be attributed to processing levels. Flood models based on SRTM (Shuttle Radar Topography Mission) data underestimate the inundation extent, so their results may not be reliable for constructing flood risk maps. This analysis provides a broad-scale estimate of the potential consequences of SLR and of the uncertainties in DEM-based bathtub-type flood inundation modelling for the London boroughs.
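
The tidally-adjusted bathtub approach reduces to thresholding the DEM at a given water level while enforcing hydrological connectivity to the sea. A minimal sketch, with all inputs and names assumed for illustration:

```python
import numpy as np
from scipy import ndimage

def bathtub_inundation(dem, water_level, sea_mask):
    """Connected-component bathtub flood extent (illustrative sketch).

    dem: 2-D elevation array (m); water_level: scalar (m), e.g. projected
    SLR plus tidal adjustment; sea_mask: boolean array of open-sea pixels.
    Only low-lying cells connected to the sea flood, which is the key
    refinement over naive elevation thresholding.
    """
    low = dem <= water_level
    labels, _ = ndimage.label(low)
    sea_ids = np.unique(labels[sea_mask & low])
    return np.isin(labels, sea_ids) & (labels > 0)
```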

Relevance: 100.00%

Abstract:

Recent work on the validation of general circulation models (GCMs) has focused on objective methods. A small number of authors have used the direct synoptic identification of phenomena together with a statistical analysis to perform objective comparisons between various datasets. This paper describes a general method for the synoptic identification of phenomena that can be used for an objective analysis of atmospheric, or oceanographic, datasets obtained from numerical models and remote sensing. Methods usually associated with image processing are used to segment the scene and to identify suitable feature points to represent the phenomena of interest. This is performed for each time level. A technique from dynamic scene analysis is then used to link the feature points to form trajectories. The method is fully automatic and should be applicable to a wide range of geophysical fields. An example is shown of results obtained by applying the method to data from a run of the Universities Global Atmospheric Modelling Project GCM.
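
As a rough illustration of the trajectory-building stage, the sketch below links feature points across time levels by greedy nearest-neighbour matching with a gating distance; this is a simplified stand-in for the dynamic-scene-analysis technique referred to above, and max_step is a hypothetical parameter:

```python
import numpy as np

def link_trajectories(frames, max_step=5.0):
    """Greedily link feature points into trajectories across time levels.

    frames: list of (n_i, 2) arrays of feature-point coordinates, one
    array per time level.
    """
    trajectories = [[tuple(p)] for p in frames[0]]
    for points in frames[1:]:
        unclaimed = [tuple(p) for p in points]
        for traj in trajectories:
            if not unclaimed:
                break
            last = np.asarray(traj[-1])
            dists = [np.linalg.norm(last - np.asarray(p)) for p in unclaimed]
            j = int(np.argmin(dists))
            if dists[j] <= max_step:        # only link plausible motions
                traj.append(unclaimed.pop(j))
        # leftover points could seed new trajectories (omitted for brevity)
    return trajectories
```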

Relevance: 100.00%

Abstract:

Data assimilation – the set of techniques whereby information from observing systems and models is combined optimally – is rapidly becoming prominent in endeavours to exploit Earth Observation for Earth sciences, including climate prediction. This paper explains the broad principles of data assimilation, outlining different approaches (optimal interpolation, three-dimensional and four-dimensional variational methods, the Kalman Filter), together with the approximations that are often necessary to make them practicable. After pointing out a variety of benefits of data assimilation, the paper then outlines some practical applications of the exploitation of Earth Observation by data assimilation in the areas of operational oceanography, chemical weather forecasting and carbon cycle modelling. Finally, some challenges for the future are noted.
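
For reference, the three-dimensional variational (3D-Var) method mentioned above seeks the state x minimising a cost function of the standard form (standard notation, not necessarily this paper's):

```latex
J(\mathbf{x}) = \tfrac{1}{2}(\mathbf{x}-\mathbf{x}_b)^{\mathsf{T}}\mathbf{B}^{-1}(\mathbf{x}-\mathbf{x}_b)
              + \tfrac{1}{2}\bigl(\mathbf{y}-H(\mathbf{x})\bigr)^{\mathsf{T}}\mathbf{R}^{-1}\bigl(\mathbf{y}-H(\mathbf{x})\bigr)
```

Here x_b is the background (prior forecast) state with error covariance B, y the observations with error covariance R, and H the observation operator; the minimiser is the analysis. Four-dimensional variational methods extend the observation term over a time window using the forecast model itself.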

Relevance: 100.00%

Abstract:

The Convective Storm Initiation Project (CSIP) is an international project to understand precisely where, when, and how convective clouds form and develop into showers in the mainly maritime environment of southern England. A major aim of CSIP is to compare the results of the very high resolution Met Office weather forecasting model with detailed observations of the early stages of convective clouds and to use the newly gained understanding to improve the model's predictions. A large array of ground-based instruments plus two instrumented aircraft, from the U.K. National Centre for Atmospheric Science (NCAS) and the German Institute for Meteorology and Climate Research (IMK), Karlsruhe, were deployed in southern England, over an area centered on the meteorological radars at Chilbolton, during the summers of 2004 and 2005. In addition to a variety of ground-based remote-sensing instruments, numerous rawinsondes were released at one- to two-hourly intervals from six closely spaced sites. The Met Office weather radar network and Meteosat satellite imagery were used to provide context for the observations made by the instruments deployed during CSIP. This article presents an overview of the CSIP field campaign and examples from CSIP of the types of convective initiation phenomena that are typical in the United Kingdom. It shows how certain kinds of observational data reveal these phenomena and explains how the analyses of data from the field campaign will be used to develop an improved very high resolution NWP model for operational use.

Relevance: 100.00%

Abstract:

An operational dust forecasting model is developed by including the Met Office Hadley Centre climate model's dust parameterization scheme within a Met Office regional numerical weather prediction (NWP) model. The model includes parameterizations for dust uplift, dust transport, and dust deposition in six discrete size bins and provides diagnostics such as the aerosol optical depth. The results are compared against surface and satellite remote sensing measurements, and against in situ measurements from the Facility for Airborne Atmospheric Measurements, for a case study in which a strong dust event was forecast. Comparisons are also performed against satellite and surface instrumentation for the entire month of August. The case study shows that this Saharan dust NWP model can provide very good guidance on dust events as much as 42 h ahead. The analysis of monthly data suggests that the mean and variability of dust in the model are also well represented.
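
Given the six-bin structure, a diagnostic such as aerosol optical depth (AOD) is essentially a mass-weighted sum over bins. A minimal sketch, with purely illustrative extinction coefficients rather than the Met Office scheme's values:

```python
import numpy as np

def dust_aod(column_mass, k_ext):
    """AOD from size-binned dust loadings (illustrative sketch).

    column_mass: (n_bins,) column-integrated dust mass per bin, kg m^-2.
    k_ext: (n_bins,) mass extinction coefficients, m^2 kg^-1, which vary
    with the bin's size range and the wavelength considered.
    """
    return float(np.sum(column_mass * k_ext))

# Six hypothetical size bins, coarse bins extinguishing less per unit mass
mass = np.array([1e-5, 3e-5, 5e-5, 4e-5, 2e-5, 1e-5])  # kg m^-2
kext = np.array([1200., 900., 600., 350., 180., 80.])  # m^2 kg^-1
print(f"AOD ~ {dust_aod(mass, kext):.3f}")             # ~0.087
```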

Relevance: 100.00%

Abstract:

Data assimilation provides techniques for combining observations and prior model forecasts to create initial conditions for numerical weather prediction (NWP). The relative weighting assigned to each observation in the analysis is determined by its associated error. Remote sensing data usually have correlated errors, but the correlations are typically ignored in NWP. Here, we describe three approaches to the treatment of observation error correlations. For an idealized data set, the information content under each simplified assumption is compared with that under a correct correlation specification. Treating the errors as uncorrelated results in a significant loss of information, whereas retaining even an approximated correlation gives clear benefits.
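
In the linear-Gaussian setting, the information content of a set of observations can be measured as the entropy reduction from background to analysis, which makes the cost of ignoring correlations easy to demonstrate. A small idealized sketch (not the paper's experiment):

```python
import numpy as np

def information_content(B, H, R):
    """Shannon information content 0.5 * ln(det B / det A), where
    A = (B^-1 + H^T R^-1 H)^-1 is the analysis error covariance."""
    A_inv = np.linalg.inv(B) + H.T @ np.linalg.inv(R) @ H
    _, logdet_A_inv = np.linalg.slogdet(A_inv)
    _, logdet_B = np.linalg.slogdet(B)
    return 0.5 * (logdet_B + logdet_A_inv)

# Ten direct observations with exponentially correlated errors
n, rho = 10, 0.5
B, H = np.eye(n), np.eye(n)
R_true = np.array([[rho ** abs(i - j) for j in range(n)] for i in range(n)])
print(information_content(B, H, R_true))     # correct specification
print(information_content(B, H, np.eye(n)))  # correlations ignored
```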

Relevance: 100.00%

Abstract:

Flooding is a major hazard in both rural and urban areas worldwide, but it is in urban areas that the impacts are most severe. An investigation of the ability of high-resolution TerraSAR-X Synthetic Aperture Radar (SAR) data to detect flooded regions in urban areas is described. The study uses a TerraSAR-X image of a 1-in-150-year flood near Tewkesbury, UK, in 2007, for which contemporaneous aerial photography exists for validation. The DLR SAR End-To-End Simulator (SETES) was used in conjunction with airborne scanning laser altimetry (LiDAR) data to estimate regions of the image in which water would not be visible due to shadow or layover caused by buildings and taller vegetation. A semi-automatic algorithm for the detection of floodwater in urban areas is described, together with its validation against the aerial photographs. Of the urban water pixels visible to TerraSAR-X, 76% were correctly detected, with an associated false positive rate of 25%. If all urban water pixels are considered, including those in shadow and layover regions, these figures fall to 58% and 19%, respectively. The algorithm is aimed at producing urban flood extents with which to calibrate and validate urban flood inundation models, and these findings indicate that TerraSAR-X is capable of providing useful data for this purpose.
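
The two reporting modes, visible-only pixels versus all urban water pixels, amount to evaluating the detection masks under different validity masks. A minimal sketch of how such statistics might be computed, with definitions that may differ from the paper's exact ones:

```python
import numpy as np

def detection_stats(detected, truth, valid):
    """Detection rate and false positive rate for flood masks (sketch).

    detected, truth, valid: boolean 2-D arrays; `valid` restricts the
    evaluation, e.g. to pixels outside SAR shadow and layover regions.
    """
    d, t = detected[valid], truth[valid]
    hits = np.logical_and(d, t).sum()
    detection_rate = hits / max(t.sum(), 1)            # share of water found
    false_pos_rate = np.logical_and(d, ~t).sum() / max(d.sum(), 1)
    return detection_rate, false_pos_rate
```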

Relevance: 100.00%

Abstract:

Measurements of the top-of-the-atmosphere outgoing longwave radiation (OLR) for July 2003 from Meteosat-7 are used to assess the performance of the numerical weather prediction version of the Met Office Unified Model. A significant difference is found over desert regions of northern Africa, where the model emits too much OLR, by up to 35 W m⁻² in the monthly mean. By cloud-screening the data we find an error of up to 50 W m⁻² associated with cloud-free areas, which suggests an error in the model's surface temperature, surface emissivity, or atmospheric transmission. By building a physical model of the radiative properties of mineral dust based on in situ, surface-based, and satellite remote sensing observations, we show that the most plausible explanation for the discrepancy in OLR is the neglect of mineral dust in the model. The calculations suggest that mineral dust can exert a longwave radiative forcing of as much as 50 W m⁻² in the monthly mean at 1200 UTC in cloud-free regions, which accounts for the discrepancy between the model and the Meteosat-7 observations. This suggests that including the radiative effects of mineral dust will lead to a significant improvement in the radiation balance of numerical weather prediction models, with subsequent improvements in performance.
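
The cloud-screened comparison described above amounts to a masked monthly-mean difference between model and observed OLR fields. A minimal sketch, with assumed array shapes:

```python
import numpy as np

def cloud_free_olr_bias(model_olr, obs_olr, cloud_mask):
    """Monthly-mean model-minus-observation OLR bias over cloud-free pixels.

    model_olr, obs_olr: (time, rows, cols) arrays in W m^-2;
    cloud_mask: boolean array of the same shape, True where cloudy.
    """
    diff = np.where(cloud_mask, np.nan, model_olr - obs_olr)
    return np.nanmean(diff, axis=0)  # per-pixel mean bias over the month
```

A positive bias of tens of W m⁻² over cloud-free desert pixels is the signature attributed above to neglected mineral dust.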