986 results for New Jersey--Remote-sensing maps.
Abstract:
Agricultural activities exert different pressures on natural resources. In some areas this has led to soil deterioration that affects the sustainability of agricultural systems. Lists of indicators have been proposed to assess soil degradation; however, a robust methodological tool adapted to regional soil and climatic conditions is lacking. In addition, there is demand from farmers and institutions interested in guiding actions to preserve the soil. The objective of this project is to assess the physical, chemical and biological degradation of soils in agroecosystems of south-central Córdoba. We therefore propose to develop a methodological tool consisting of a set of physical, chemical and biological indicators, with threshold values, integrated into degradation indices, to support decision makers and farmers in decisions concerning soil degradation. The study area is an agricultural region of south-central Córdoba with more than 100 years of agriculture. The methodology begins with the characterization of land use and management systems, their classification, and the production of base maps of uses and management by means of remote sensing and surveys. Sampling sites will be selected through a semi-directed methodology using a GIS, ensuring at least one sampling point per mapping unit. Reference sites will be chosen as close as possible to a natural condition. The indicators to be evaluated come from lists proposed in previous work by the group, selected according to international criteria and suited to the soils of the region. Core and complementary indicators will be used. Thresholds will be obtained, on the one hand, from values in the literature and, on the other, from the statistical distribution of each indicator in reference soils.
To standardize each indicator, a transformation function will be defined. Indicators will then be weighted by means of multivariate statistical analyses and integrated into physical, chemical and biological degradation indices, and an overall degradation index. The approach will conclude with the development of two decision-making instruments: one at the regional scale, consisting of degradation maps based on environmental, land-use and management-system mapping units; and one at the farm scale, reporting on the soil degradation of a particular plot compared with reference soils. Stakeholders will thus have robust tools for decision making regarding soil degradation at both regional and local scales.
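As a concrete illustration of the standardization and integration steps described above, the sketch below scores hypothetical indicators with a linear transformation function between illustrative thresholds and combines them into a weighted index. The indicator names, threshold values and weights are invented for the example, not taken from the project.

```python
import numpy as np

def score_more_is_better(x, lower, upper):
    """Linear transformation function: 0 below the lower (degraded) threshold,
    1 above the upper (reference) threshold, linear in between."""
    return np.clip((x - lower) / (upper - lower), 0.0, 1.0)

def degradation_index(scores, weights):
    """Weighted sum of standardized indicator scores (0 = degraded, 1 = reference).
    In the project, the weights would come from multivariate analysis."""
    weights = np.asarray(weights, dtype=float)
    weights = weights / weights.sum()
    return float(np.dot(scores, weights))

# Hypothetical indicators: organic carbon (%), aggregate stability (%),
# microbial biomass (mg kg-1) -- values and thresholds are illustrative only
sample = np.array([1.2, 45.0, 180.0])
lower = np.array([0.8, 20.0, 100.0])   # degraded-soil thresholds
upper = np.array([2.5, 70.0, 400.0])   # reference-soil thresholds

scores = score_more_is_better(sample, lower, upper)
index = degradation_index(scores, [0.4, 0.35, 0.25])
```

A plot scoring near 0 would flag strong degradation relative to the reference condition; near 1, a soil close to reference.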
Abstract:
Report for the scientific sojourn carried out at the University of New South Wales from February to June 2007. Two different biogeochemical models are coupled to a three-dimensional configuration of the Princeton Ocean Model (POM) for the Northwestern Mediterranean Sea (Ahumada and Cruzado, 2007). The first biogeochemical model (BLANES) is the three-dimensional version of the model described by Bahamon and Cruzado (2003) and computes the nitrogen fluxes through six compartments using semi-empirical descriptions of biological processes. The second biogeochemical model (BIOMEC) is the biomechanical NPZD model described in Baird et al. (2004), which uses a combination of physiological and physical descriptions to quantify the rates of planktonic interactions. Physical descriptions include, for example, the diffusion of nutrients to phytoplankton cells and the encounter rate of predators and prey. The link between physical and biogeochemical processes in both models is expressed by the advection-diffusion of the non-conservative tracers. The similarities in the mathematical formulation of the biogeochemical processes in the two models are exploited to determine the parameter set for the biomechanical model that best fits the parameter set used in the first model. Three years of integration have been carried out for each model to reach the so-called perpetual-year run for biogeochemical conditions. Outputs from both models are averaged monthly and then compared to chlorophyll images obtained from the MERIS sensor.
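The coupling described above transports non-conservative biogeochemical tracers by advection-diffusion. A toy one-dimensional version of such a transport step can be sketched as follows; this is not the POM implementation, and the grid, velocity and diffusivity values are arbitrary.

```python
import numpy as np

def step_tracer(c, u, kappa, dx, dt):
    """One explicit upwind advection-diffusion step for a 1D tracer on a
    periodic domain (assumes u > 0). Biogeochemical source/sink terms,
    which the NPZD compartments would supply, are omitted here."""
    adv = -u * (c - np.roll(c, 1)) / dx                                # upwind advection
    diff = kappa * (np.roll(c, -1) - 2 * c + np.roll(c, 1)) / dx**2    # central diffusion
    return c + dt * (adv + diff)

# Advect-diffuse an initial tracer spike around the periodic domain
c = np.zeros(50)
c[10] = 1.0
for _ in range(100):
    c = step_tracer(c, u=0.5, kappa=0.01, dx=1.0, dt=0.5)
```

The explicit scheme is stable here because both the CFL number (0.25) and the diffusion number (0.005) are small; a real ocean model uses more sophisticated transport schemes.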
Abstract:
Land cover classification is a key research field in remote sensing and land change science, as thematic maps derived from remotely sensed data have become the basis for analyzing many socio-ecological issues. However, land cover classification remains a difficult task, and it is especially challenging in heterogeneous tropical landscapes, where such maps are nonetheless of great importance. The present study aims to establish an efficient classification approach to accurately map all broad land cover classes in a large, heterogeneous tropical area of Bolivia, as a basis for further studies (e.g., land cover-land use change). Specifically, we compare the performance of parametric (maximum likelihood), non-parametric (k-nearest neighbour and four different support vector machines, SVM), and hybrid classifiers, using both hard and soft (fuzzy) accuracy assessments. In addition, we test whether the inclusion of a textural index (homogeneity) in the classifications improves their performance. We classified Landsat imagery for two dates corresponding to dry and wet seasons and found that non-parametric, and particularly SVM classifiers, outperformed both parametric and hybrid classifiers. We also found that the use of the homogeneity index along with reflectance bands significantly increased the overall accuracy of all the classifications, but particularly of SVM algorithms. We observed that improvements in producer's and user's accuracies through the inclusion of the homogeneity index differed depending on land cover class. Early-growth/degraded forests, pastures, grasslands and savanna were the classes most improved, especially with the SVM radial basis function and SVM sigmoid classifiers, though with both classifiers all land cover classes were mapped with producer's and user's accuracies of around 90%. 
Our approach seems very well suited to accurately map land cover in tropical regions, thus having the potential to contribute to conservation initiatives, climate change mitigation schemes such as REDD+, and rural development policies.
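A minimal sketch of the kind of SVM classification the study reports, using scikit-learn on synthetic "pixels" with six reflectance bands plus a homogeneity texture feature. The class statistics are invented and the data are not Landsat imagery; the point is only the feature layout and the RBF-kernel classifier.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)

# Synthetic "pixels": 6 reflectance bands + 1 homogeneity texture feature.
# Two hypothetical classes (e.g. forest vs. pasture) with invented statistics.
n = 600
forest = np.column_stack([rng.normal(0.3, 0.05, (n, 6)), rng.normal(0.8, 0.1, n)])
pasture = np.column_stack([rng.normal(0.5, 0.05, (n, 6)), rng.normal(0.4, 0.1, n)])
X = np.vstack([forest, pasture])
y = np.array([0] * n + [1] * n)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

# RBF-kernel SVM, the family of classifiers that performed best in the study
clf = SVC(kernel="rbf", gamma="scale").fit(X_tr, y_tr)
acc = accuracy_score(y_te, clf.predict(X_te))
```

In practice the homogeneity feature would be computed from a grey-level co-occurrence matrix over a moving window, then stacked with the reflectance bands exactly as above.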
Using 3D surface datasets to understand landslide evolution: From analogue models to real case study
Abstract:
Early detection of landslide surface deformation with 3D remote sensing techniques, such as terrestrial laser scanning (TLS), has become a major challenge over the last decade. To improve our understanding of landslide deformation, a series of analogue simulations has been carried out on non-rigid bodies coupled with a 3D digitizer. All experiments were carried out under controlled conditions, such as water level and slope inclination. We were able to follow the 3D surface deformation of complex landslide bodies from precursory deformation up to large failures. These experiments were the basis for the development of a new algorithm that quantifies surface deformation by automatically tracking discrete points on the slope surface. To validate the algorithm, manually obtained displacements were compared with the algorithm's results. The outputs will help in understanding 3D deformation during pre-failure stages and failure mechanisms, which are fundamental for the future implementation of 3D remote sensing techniques in early warning systems.
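The automatic point-tracking idea can be illustrated with a nearest-neighbour matcher on discrete surface markers. This is a simplified stand-in for the authors' algorithm, valid only when displacements are small relative to marker spacing, and the coordinates below are invented.

```python
import numpy as np

def track_displacements(pts_before, pts_after):
    """Match each marker in the 'before' surface to its nearest neighbour
    in the 'after' surface and return per-point 3D displacement vectors.
    Only valid when displacements are small relative to marker spacing."""
    disp = np.empty_like(pts_before)
    for i, p in enumerate(pts_before):
        j = np.argmin(np.linalg.norm(pts_after - p, axis=1))
        disp[i] = pts_after[j] - p
    return disp

# Hypothetical markers on a slope surface (x, y, z in metres),
# shifted 2 cm downslope and 1 cm down between scans
before = np.array([[0.0, 0.0, 1.0], [1.0, 0.0, 0.8], [2.0, 0.0, 0.6]])
after = before + np.array([0.02, 0.0, -0.01])
d = track_displacements(before, after)
```

A time series of such displacement fields is what lets precursory, accelerating deformation be distinguished from stable conditions.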
Abstract:
Remote sensing and geographical information technologies were used to discriminate areas of high and low risk for contracting kala-azar (visceral leishmaniasis). Satellite data were digitally processed to generate maps of land cover and spectral indices, such as the normalised difference vegetation index and the wetness index. To map estimated vector abundance and indoor climate data, local polynomial interpolations were used based on weighted values. Attribute layers were prepared from the illiteracy rate and the unemployed proportion of the population and associated with village boundaries. Pearson's correlation coefficient was used to estimate the relationship between environmental variables and disease incidence across the study area. The cell values for each input raster in the analysis were assigned values from an evaluation scale. Simple weightings/ratings based on the degree of favourable conditions for kala-azar transmission were used for all the variables, leading to a geo-environmental risk model. Variables such as land use/land cover, vegetation conditions, surface dampness, indoor climate, illiteracy rates and the size of the unemployed population were considered for inclusion in the geo-environmental kala-azar risk model. The risk model was stratified into areas of "risk" and "non-risk" for the disease, based on the calculation of risk indices. The described approach constitutes a promising tool for microlevel kala-azar surveillance and aids in directing control efforts.
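The weighting/rating overlay described above can be sketched as a weighted sum of reclassified rasters. The layers, weights and stratification threshold below are illustrative, not the study's values.

```python
import numpy as np

# Hypothetical reclassified 2x2 rasters (1 = least favourable ... 5 = most
# favourable for kala-azar transmission), one layer per geo-environmental variable
land_cover = np.array([[5, 4], [2, 1]])
wetness    = np.array([[4, 4], [3, 1]])
illiteracy = np.array([[5, 3], [2, 2]])

# Illustrative weights summing to 1
weights = {"land_cover": 0.5, "wetness": 0.3, "illiteracy": 0.2}

# Weighted overlay: a per-cell risk index
risk_index = (weights["land_cover"] * land_cover
              + weights["wetness"] * wetness
              + weights["illiteracy"] * illiteracy)

# Stratify into "risk" / "non-risk" at an illustrative threshold
risk_zone = risk_index >= 3.0
```

In a GIS this is the standard weighted-overlay operation applied cell by cell over co-registered rasters covering the study villages.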
Abstract:
Many regions of the world, including inland lakes, present suboptimal conditions for the remotely sensed retrieval of optical signals, thus challenging the limits of available satellite data-processing tools, such as atmospheric correction models (ACM) and water constituent-retrieval (WCR) algorithms. Working in such regions, however, can improve our understanding of remote-sensing tools and their applicability in new contexts, in addition to potentially offering useful information about aquatic ecology. Here, we assess and compare 32 combinations of two ACMs, two WCRs, and three binary categories of data quality standards to optimize a remotely sensed proxy of plankton biomass in Lake Kivu. Each parameter set is compared against the available ground-truth match-ups using Spearman's right-tailed ρ. Focusing on the best sets from each ACM-WCR combination, their performances are discussed with regard to data distribution, sample size, spatial completeness, and seasonality. The results of this study may be of interest both for ecological studies on Lake Kivu and for epidemiological studies of diseases, such as cholera, the dynamics of which have been associated with plankton biomass in other regions of the world.
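The match-up comparison can be reproduced in outline with SciPy's one-sided Spearman test; the satellite and in-situ values below are invented for illustration.

```python
import numpy as np
from scipy.stats import spearmanr

# Hypothetical match-ups: satellite chlorophyll proxy vs. in-situ plankton biomass
satellite = np.array([0.8, 1.1, 1.9, 2.4, 3.0, 3.6, 4.1])
in_situ   = np.array([10.0, 14.0, 22.0, 21.0, 35.0, 40.0, 52.0])

# Right-tailed test: the alternative hypothesis is a positive monotonic association
rho, p = spearmanr(satellite, in_situ, alternative="greater")
```

A parameter set whose ρ is high and whose one-sided p-value is small would be preferred, subject to the sample-size and spatial-completeness considerations the abstract mentions.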
Abstract:
Data mining can be defined as the extraction of previously unknown and potentially useful information from large datasets. The main principle is to devise computer programs that run through databases and automatically seek deterministic patterns. It is applied in many fields, e.g., remote sensing, biometry and speech recognition, but has seldom been applied to forensic case data. The intrinsic difficulty of using such data lies in its heterogeneity, which comes from the many different sources of information. The aim of this study is to highlight potential uses of pattern recognition that would provide relevant results from a criminal intelligence point of view. The role of data mining within a global crime analysis methodology is to detect all types of structures in a dataset. Once filtered and interpreted, those structures can point to previously unseen criminal activities. The interpretation of patterns for intelligence purposes is the final stage of the process. It allows the researcher to validate the whole methodology and to refine each step if necessary. An application to cutting agents found in illicit drug seizures was performed. A combinatorial approach was taken, using the presence and absence of products. Methods from graph theory were used to extract patterns from data consisting of links between products and the place and date of seizure. A data mining process carried out using graphing techniques is called "graph mining". Patterns were detected that had to be interpreted and compared with preliminary knowledge to establish their relevance. The illicit drug profiling process is in fact an intelligence process that uses preliminary illicit drug classes to classify new samples. The methods proposed in this study could be used a priori to compare structures from preliminary and post-detection patterns. 
This new knowledge of a repeated structure may provide valuable complementary information to profiling and become a source of intelligence.
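A minimal graph-mining sketch with NetworkX, building a graph of links between products and seizures and extracting connected components; the cutting agents and seizure labels are hypothetical.

```python
import networkx as nx

# Hypothetical links between cutting agents and seizure events
links = [
    ("caffeine", "seizure_A"), ("phenacetin", "seizure_A"),
    ("caffeine", "seizure_B"), ("phenacetin", "seizure_B"),
    ("lidocaine", "seizure_C"),
]

G = nx.Graph(links)

# Connected components separate unrelated seizure groups; a repeated
# agent combination recurring across seizures (caffeine + phenacetin here)
# is the kind of structure that may suggest a common source
components = [sorted(c) for c in nx.connected_components(G)]
```

In a real application, place and date attributes would be attached to edges so that detected structures can be interpreted against preliminary profiling knowledge.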
Abstract:
Usual image fusion methods inject features from a high-spatial-resolution panchromatic sensor into every low-spatial-resolution multispectral band, trying to preserve spectral signatures while improving spatial resolution to that of the panchromatic sensor. The objective is to obtain the image that would be observed by a sensor with the same spectral response (i.e., spectral sensitivity and quantum efficiency) as the multispectral sensors and the spatial resolution of the panchromatic sensor. In these methods, however, features from electromagnetic spectrum regions not covered by the multispectral sensors are injected into them, and the physical spectral responses of the sensors are not considered during the process. This produces undesirable effects, such as over-injection of spatial detail and slightly modified spectral signatures in some features. The authors present a technique that takes into account the physical electromagnetic spectrum responses of the sensors during the fusion process, producing images closer to that of the ideal sensor than those obtained by usual wavelet-based image fusion methods. This technique is used to define a new wavelet-based image fusion method.
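For orientation, the generic substitutive wavelet fusion scheme that such methods build on can be sketched with a hand-rolled one-level Haar transform: the multispectral approximation is kept and the panchromatic detail coefficients are injected. This is the baseline scheme the paper improves upon, not the authors' spectral-response-aware method, and the images are random arrays.

```python
import numpy as np

def haar2(img):
    """One-level 2D Haar transform: approximation (LL) and detail subbands."""
    a = (img[0::2, :] + img[1::2, :]) / 2.0
    d = (img[0::2, :] - img[1::2, :]) / 2.0
    LL = (a[:, 0::2] + a[:, 1::2]) / 2.0
    LH = (a[:, 0::2] - a[:, 1::2]) / 2.0
    HL = (d[:, 0::2] + d[:, 1::2]) / 2.0
    HH = (d[:, 0::2] - d[:, 1::2]) / 2.0
    return LL, (LH, HL, HH)

def ihaar2(LL, details):
    """Exact inverse of haar2."""
    LH, HL, HH = details
    a = np.empty((LL.shape[0], LL.shape[1] * 2))
    d = np.empty_like(a)
    a[:, 0::2], a[:, 1::2] = LL + LH, LL - LH
    d[:, 0::2], d[:, 1::2] = HL + HH, HL - HH
    out = np.empty((a.shape[0] * 2, a.shape[1]))
    out[0::2, :], out[1::2, :] = a + d, a - d
    return out

def fuse(ms_band, pan):
    """Substitutive fusion: multispectral approximation + panchromatic details."""
    ms_ll, _ = haar2(ms_band)
    _, pan_details = haar2(pan)
    return ihaar2(ms_ll, pan_details)

rng = np.random.default_rng(1)
ms = rng.random((8, 8))    # stand-in for an upsampled multispectral band
pan = rng.random((8, 8))   # stand-in for the panchromatic image
fused = fuse(ms, pan)
```

The effects criticized in the abstract arise precisely because the panchromatic details injected here may carry energy from spectral regions the multispectral band never sensed.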
Abstract:
Peatlands are soil environments that store carbon and large amounts of water, due to their composition (90 % water), low hydraulic conductivity and sponge-like behavior. It is estimated that peat bogs cover approximately 4.2 % of the Earth's surface and store 28.4 % of the planet's soil carbon. Approximately 612,000 ha of peatlands have been mapped in Brazil, but the peat bogs of the Serra do Espinhaço Meridional (SdEM) were not included. The objective of this study was to map the peat bogs of the northern part of the SdEM and estimate the organic matter pools and the water volume they store. The peat bogs were pre-identified and mapped by GIS and remote sensing techniques, using ArcGIS 9.3, ENVI 4.5 and GPS Track Maker Pro software, and the maps were validated in the field. Six peat bogs were mapped in detail (1:20,000 and 1:5,000) along transects spaced 100 m apart; every 20 m along each transect, the UTM (Universal Transverse Mercator) coordinates and depth were determined and samples were collected for characterization and determination of organic matter, according to the Brazilian System of Soil Classification. In the northern part of the SdEM, 14,287.55 ha of peatlands were mapped, distributed over 1,180,109 ha, representing 1.2 % of the total area. These peatlands have an average volume of 170,021,845 m³ and store 6,120,167 t (428.36 t ha⁻¹) of organic matter and 142,138,262 m³ (9,948 m³ ha⁻¹) of water. In the peat bogs of the Serra do Espinhaço Meridional, organic matter at an advanced stage of decomposition (sapric) predominates, followed by the intermediate stage (hemic). The vertical growth rate of the peatlands ranged between 0.04 and 0.43 mm year⁻¹, while the carbon accumulation rate varied between 6.59 and 37.66 g m⁻² year⁻¹.
The peat bogs of the SdEM contain the headwaters of important water bodies in the basins of the Jequitinhonha and São Francisco Rivers and store large amounts of organic carbon and water, which is why the protection and preservation of these soil environments is an urgent and increasing need.
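The reported per-hectare stocks can be cross-checked directly from the totals given above:

```python
# Cross-check of the per-hectare stocks reported for the mapped peatlands
area_ha = 14_287.55             # mapped peatland area (ha)
organic_matter_t = 6_120_167    # total organic matter stock (t)
water_m3 = 142_138_262          # total water volume (m³)

om_per_ha = organic_matter_t / area_ha     # ≈ 428.36 t ha⁻¹, as reported
water_per_ha = water_m3 / area_ha          # ≈ 9,948 m³ ha⁻¹, as reported
```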
Abstract:
Soil information is needed for managing the agricultural environment. The aim of this study was to apply artificial neural networks (ANNs) to the prediction of soil classes, using orbital remote sensing products, terrain attributes derived from a digital elevation model and local geology information as data sources. This approach to digital soil mapping was evaluated in an area with a high degree of lithologic diversity in the Serra do Mar. The neural network simulator used in this study was JavaNNS, with the backpropagation learning algorithm. For soil class prediction, different combinations of the selected discriminant variables were tested: elevation, slope, aspect, curvature, plan curvature, profile curvature, topographic index, solar radiation, LS topographic factor, local geology information, and the clay mineral and iron oxide indices and normalized difference vegetation index (NDVI) derived from a Landsat-7 Enhanced Thematic Mapper Plus (ETM+) image. Among the tested sets, the best results were obtained when all discriminant variables were associated with geological information (overall accuracy 93.2-95.6 %, Kappa index 0.924-0.951, for set 13). Excluding the variable profile curvature (set 12), overall accuracy ranged from 93.9 to 95.4 % and the Kappa index from 0.932 to 0.948. The maps based on the neural network classifier were consistent and similar to conventional soil maps drawn for the study area, although with more spatial detail. The results show the potential of ANNs for soil class prediction in mountainous areas with lithological diversity.
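A backpropagation-trained multilayer perceptron of the kind used in the study can be sketched with scikit-learn (standing in for JavaNNS); the terrain/spectral predictors and class statistics below are synthetic, not the study's data.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(42)

# Synthetic predictors for two hypothetical soil classes:
# [elevation (m), slope (degrees), NDVI] -- statistics are invented
n = 200
class_a = np.column_stack([rng.normal(800, 30, n),
                           rng.normal(5, 2, n),
                           rng.normal(0.7, 0.05, n)])
class_b = np.column_stack([rng.normal(1100, 30, n),
                           rng.normal(25, 5, n),
                           rng.normal(0.4, 0.05, n)])
X = StandardScaler().fit_transform(np.vstack([class_a, class_b]))
y = np.array([0] * n + [1] * n)

# Multilayer perceptron trained by backpropagation, as in the study's approach
clf = MLPClassifier(hidden_layer_sizes=(10,), max_iter=500,
                    random_state=0).fit(X, y)
train_acc = clf.score(X, y)
```

In the real workflow, each pixel's predictor vector would be assembled from the co-registered DEM derivatives, geology layer and ETM+-derived indices, and accuracy would be assessed on held-out reference points with overall accuracy and the Kappa index.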
Abstract:
Disasters are often perceived as fast and random events. While the triggers may be sudden, disasters themselves are the result of an accumulation of the consequences of inappropriate actions and decisions, and of global change. To modify this perception of risk, advocacy tools are needed. Quantitative methods have been developed to identify the distribution and the underlying factors of risk.
Disaster risk results from the intersection of hazards, exposure and vulnerability. The frequency and intensity of hazards can be influenced by climate change or by the decline of ecosystems; population growth increases exposure, while changes in the level of development affect vulnerability. Given that each of these components may change, risk is dynamic and should be reviewed periodically by governments, insurance companies or development agencies. At the global level, these analyses are often performed using databases of reported losses. Our results show that these are likely to be biased, in particular by improvements in access to information. International loss databases are not exhaustive and do not give information on exposure, intensity or vulnerability. A new approach, independent of reported losses, is therefore necessary.
The research presented here was mandated by the United Nations and by agencies working in development and the environment (UNDP, UNISDR, GTZ, UNEP and IUCN). These organizations needed a quantitative assessment of the underlying factors of risk, to raise awareness among policymakers and to prioritize disaster risk reduction projects.
The method is based on geographic information systems, remote sensing, databases and statistical analysis. It required a large amount of data (1.7 TB, on both the physical environment and socio-economic parameters) and several thousand hours of processing. A comprehensive risk model was developed to reveal the distribution of hazards, exposure and risk, and to identify underlying risk factors for several hazards (floods, tropical cyclones, earthquakes and landslides). Two multiple-risk indexes were generated to compare countries. The results include an evaluation of the role of hazard intensity, exposure, poverty and governance in the patterns and trends of risk. It appears that vulnerability factors change depending on the type of hazard and that, contrary to exposure, their weight decreases as intensity increases.
At the local level, the method was tested to highlight the influence of climate change and ecosystem decline on hazard. In northern Pakistan, deforestation increases landslide susceptibility. Research in Peru (based on satellite imagery and ground data collection) revealed rapid glacier retreat and provides an assessment of the remaining ice volume as well as scenarios of its possible evolution.
These results were presented to different audiences, including 160 governments. The results and the data generated are available online through an open-source SDI (http://preview.grid.unep.ch). The method is flexible and easily transferable to other scales and issues, with good prospects for adaptation to other research areas. Risk characterization at the global level and identification of the role of ecosystems in disaster risk are rapidly developing fields. This research revealed many challenges; some were resolved, while others remain limitations. However, it is clear that the level of development, and moreover unsustainable development, configures a large part of disaster risk, and that the dynamics of risk are governed primarily by global change.
Abstract:
In this paper, mixed spectral-structural kernel machines are proposed for the classification of very-high-resolution images. The simultaneous use of multispectral and structural features (computed using morphological filters) allows a significant increase in the classification accuracy of remote sensing images. Subsequently, weighted-summation kernel support vector machines are proposed and applied in order to take into account the multiscale nature of the scene considered. Such classifiers use the Mercer property of kernel matrices to compute a new kernel matrix accounting simultaneously for two scale parameters. Tests on a Zurich QuickBird image show the relevance of the proposed method: using the mixed spectral-structural features, the classification accuracy increases by about 5%, achieving a Kappa index of 0.97. The proposed multikernel approach provides an overall accuracy of 98.90% with a related Kappa index of 0.985.
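The weighted-summation kernel idea rests on the fact that a convex combination of Mercer kernels is again a Mercer kernel. The sketch below combines two kernels, one per hypothetical feature group (spectral and structural, rather than the paper's two scale parameters), on invented data.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.metrics.pairwise import rbf_kernel

rng = np.random.default_rng(0)

# Two hypothetical feature groups per pixel: spectral bands and
# structural (morphological) features -- invented statistics
n = 300
X_spec = np.vstack([rng.normal(0, 1, (n, 4)), rng.normal(2, 1, (n, 4))])
X_struct = np.vstack([rng.normal(0, 1, (n, 3)), rng.normal(2, 1, (n, 3))])
y = np.array([0] * n + [1] * n)

mu = 0.6  # weight of the spectral kernel; (1 - mu) goes to the structural one

# A weighted sum of two positive semidefinite (Mercer) kernels is again Mercer,
# so it can be fed to an SVM as a precomputed kernel
K = mu * rbf_kernel(X_spec) + (1 - mu) * rbf_kernel(X_struct)

clf = SVC(kernel="precomputed").fit(K, y)
train_acc = clf.score(K, y)
```

The weight mu is typically chosen by cross-validation; the same construction extends to summing kernels computed at different scale parameters, as in the paper.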
Abstract:
In May 1999, the European Space Agency (ESA) selected the Earth Explorer Opportunity Soil Moisture and Ocean Salinity (SMOS) mission to obtain global and frequent soil moisture and ocean salinity maps. SMOS' single payload is the Microwave Imaging Radiometer by Aperture Synthesis (MIRAS), an L-band two-dimensional aperture synthesis radiometer with multiangular observation capabilities. At L-band, the brightness temperature sensitivity to the sea surface salinity (SSS) is low, approximately 0.5 K/psu at 20 °C, decreasing to 0.25 K/psu at 0 °C, comparable to that of the wind speed, ~0.2 K/(m/s) at nadir. However, at a given time, the sea state does not depend only on local winds, but on the local wind history and the presence of waves traveling from far distances. The Wind and Salinity Experiment (WISE) 2000 and 2001 campaigns were sponsored by ESA to determine the impact of oceanographic and atmospheric variables on the L-band brightness temperature at vertical and horizontal polarizations. This paper presents the results of the analysis of three nonstationary sea state conditions: growing and decreasing seas, and the presence of swell. Measured sea surface spectra are compared with the theoretical ones, computed using the instantaneous wind speed. Differences can be minimized using an "effective wind speed" that makes the theoretical spectrum best match the measured one. The impact on the predicted brightness temperatures is then assessed using the small slope approximation/small perturbation method (SSA/SPM).
Abstract:
The problem of synthetic aperture radar interferometric phase noise reduction is addressed. A new technique based on discrete wavelet transforms is presented. This technique guarantees high-resolution phase estimation without using phase image segmentation. Areas containing only noise are barely processed. Tests with synthetic and real interferograms are reported.
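The wavelet-domain filtering idea can be illustrated with a one-level Haar decomposition and soft-thresholding of the detail subbands. This is a generic sketch of wavelet denoising, not the paper's technique, and the smooth phase ramp and noise level are invented.

```python
import numpy as np

def haar_denoise(img, thresh):
    """One-level 2D Haar decomposition with soft-thresholding of the detail
    subbands -- a minimal stand-in for wavelet-domain phase filtering."""
    # Forward transform
    a = (img[0::2, :] + img[1::2, :]) / 2.0
    d = (img[0::2, :] - img[1::2, :]) / 2.0
    LL = (a[:, 0::2] + a[:, 1::2]) / 2.0
    LH = (a[:, 0::2] - a[:, 1::2]) / 2.0
    HL = (d[:, 0::2] + d[:, 1::2]) / 2.0
    HH = (d[:, 0::2] - d[:, 1::2]) / 2.0
    # Soft-threshold the detail coefficients (where noise concentrates)
    soft = lambda c: np.sign(c) * np.maximum(np.abs(c) - thresh, 0.0)
    LH, HL, HH = soft(LH), soft(HL), soft(HH)
    # Inverse transform
    a = np.empty((LL.shape[0], LL.shape[1] * 2))
    d = np.empty_like(a)
    a[:, 0::2], a[:, 1::2] = LL + LH, LL - LH
    d[:, 0::2], d[:, 1::2] = HL + HH, HL - HH
    out = np.empty(img.shape)
    out[0::2, :], out[1::2, :] = a + d, a - d
    return out

rng = np.random.default_rng(3)
truth = np.tile(np.linspace(0, np.pi, 32), (32, 1))   # smooth phase ramp
noisy = truth + rng.normal(0, 0.3, truth.shape)       # additive phase noise
denoised = haar_denoise(noisy, thresh=0.2)
```

Real interferometric phase is wrapped modulo 2π, so practical methods filter the complex phasor rather than the raw phase; the sketch only shows the thresholding mechanism.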