12 results for fibrewise map and homotopy
in CentAUR: Central Archive University of Reading - UK
Abstract:
Land cover maps at different resolutions and mapping extents contribute to modeling and support decision-making processes. Because land cover affects and is affected by climate change, it is listed among the 13 terrestrial essential climate variables. This paper describes the generation of a land cover map for Latin America and the Caribbean (LAC) for the year 2008. It was developed in the framework of the project Latin American Network for Monitoring and Studying of Natural Resources (SERENA), which has been carried out within the GOFC-GOLD Latin American network of remote sensing and forest fires (RedLaTIF). The SERENA land cover map for LAC integrates: 1) the local expertise of SERENA network members to generate the training and validation data, 2) a methodology for land cover mapping based on decision trees using MODIS time series, and 3) class membership estimates to account for pixel heterogeneity issues. The discrete SERENA land cover product, derived from class memberships, yields an overall accuracy of 84% and includes an additional layer representing the estimated per-pixel confidence. The study demonstrates in detail the use of class memberships to better estimate the area of scarce classes with a scattered spatial distribution. The land cover map is already available as a printed wall map and will be released in digital format in the near future. The SERENA land cover map was produced with a legend and classification strategy similar to those used by the North American Land Change Monitoring System (NALCMS) to generate a land cover map of the North American continent, which will allow the two maps to be combined into consistent data across the Americas, facilitating continental monitoring and modeling.
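As an aside on the class-membership approach mentioned above, the following minimal sketch (hypothetical data and pixel size, not the SERENA processing chain) illustrates why summing per-pixel membership fractions can give a better area estimate for a scarce, spatially scattered class than counting hard-classified pixels, which misses the many pixels where the class is present but never dominant:

    import numpy as np

    # Hypothetical illustration: per-pixel membership (0-1) of a scarce,
    # scattered class over a 100 x 100 pixel tile. The class covers a small
    # fraction of many pixels but rarely dominates any single one.
    rng = np.random.default_rng(0)
    membership = rng.beta(a=1.0, b=6.0, size=(100, 100))

    pixel_area_km2 = 0.25  # assumed ~500 m pixel, for illustration only

    # Hard-classification estimate: count only pixels where the class "wins"
    hard_area = np.sum(membership > 0.5) * pixel_area_km2

    # Membership-based estimate: sum the fractional cover over all pixels
    soft_area = membership.sum() * pixel_area_km2

    print(f"hard-classified area:   {hard_area:.0f} km^2")
    print(f"membership-based area:  {soft_area:.0f} km^2")

With membership values drawn as above, the hard count misses most of the scattered cover, so the hard-classified area falls well below the membership-based estimate.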
Abstract:
Asynchronous Optical Sampling (ASOPS) [1,2] and frequency comb spectrometry [3] based on dual Ti:sapphire resonators operated in a master/slave mode have the potential to improve the signal-to-noise ratio in THz transient and IR spectrometry. The multimode Brownian oscillator time-domain response function described by state-space models is a mathematically robust framework that can be used to describe the dispersive phenomena governed by Lorentzian, Debye and Drude responses. In addition, the optical properties of an arbitrary medium can be expressed as a linear combination of simple multimode Brownian oscillator functions. The suitability of a range of signal processing schemes adopted from the Systems Identification and Control Theory community for further processing the recorded THz transients in the time or frequency domain will be outlined [4,5]. Since a femtosecond-duration pulse is capable of persistent excitation of the medium within which it propagates, such an approach is well justified. Several de-noising routines based on system identification will be shown. Furthermore, specifically developed apodization structures will be discussed; these are necessary because, owing to dispersion, the time-domain background and sample interferograms are non-symmetrical [6-8]. These procedures can lead to a more precise estimation of the complex insertion loss function. The algorithms are applicable to femtosecond spectroscopies across the EM spectrum. Finally, a methodology for femtosecond pulse shaping using genetic algorithms, aiming to map and control molecular relaxation processes, will be mentioned.
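For reference, the single-term textbook forms of the three dispersive responses named above (written here in one common sign convention for the complex permittivity; the state-space formulation in [4,5] may parameterise them differently) are:

    \varepsilon(\omega) = \varepsilon_\infty
      + \frac{\Delta\varepsilon_L \, \omega_0^2}{\omega_0^2 - \omega^2 - i\gamma\omega}   % Lorentzian
      + \frac{\Delta\varepsilon_D}{1 - i\omega\tau}                                       % Debye
      - \frac{\omega_p^2}{\omega^2 + i\gamma_p\omega}                                     % Drude

and the permittivity of an arbitrary medium is then approximated as a linear combination of such terms, which is what the multimode Brownian oscillator framework expresses in the time domain.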
Abstract:
Knowledge of tropical raptor habitat use is limited, and yet a thorough understanding is vital when trying to conserve endangered species. We used a well-studied, reintroduced population of the vulnerable Mauritius Kestrel Falco punctatus to investigate habitat preferences in a modified landscape. We constructed a high-resolution digital habitat map and radio-tracked 13 juvenile Kestrels to quantify habitat preferences. We distinguished seven habitat types in our study area and tracked Kestrels from 71 to 130 days old, during which they dispersed from their natal territory and settled within a home-range after reaching independence. Mean home-range size was 0.95 km², characterized by a bimodal pattern of intensity around the natal site and the post-independence home-range. Compositional analysis showed that home-ranges were located non-randomly with respect to habitat, but there was no evidence to suggest differential use of habitats within home-ranges. Native and semi-invaded forest and grassland were consistently preferred, whereas agriculture was used significantly less than other habitats. No difference was found between the length of edge dividing native forest and grassland available within a home-range and that available within a 2.35-km buffer around the nest-site, based on the maximum distance a juvenile was found to disperse. Repeating the analysis in three dimensions gave very similar results. Our results suggest that Mauritius Kestrels are not obligate forest dwellers, as was once thought, but can also exploit open habitats such as grassland. Kestrels may be using isolated mature trees within grassland as vantage points for hunting, in the same way as they use the natural stratified forest structure. We suggest that the avoidance of agriculture is partly due to a lack of such vantage points. The conservation implications of forest degradation and agricultural encroachment are highlighted, and comparisons with the habitat preferences of other tropical falcons are discussed.
Abstract:
Traditionally, functional magnetic resonance imaging (fMRI) has been used to map activity in the human brain by measuring increases in the Blood Oxygenation Level Dependent (BOLD) signal. Positive BOLD fMRI signal changes are often accompanied by sustained negative signal changes. Previous studies investigating the neurovascular coupling mechanisms of the negative BOLD phenomenon have used concurrent 2D optical imaging spectroscopy (2D-OIS) and electrophysiology (Boorman et al., 2010). These experiments suggested that the negative BOLD signal in response to whisker stimulation was a result of an increase in deoxy-haemoglobin and reduced multi-unit activity in the deep cortical layers. However, Boorman et al. (2010) did not measure the BOLD and haemodynamic responses concurrently and so could not quantitatively compare either the spatial maps or the 2D-OIS and fMRI time series directly. Furthermore, their study utilised a homogeneous tissue model, which is predominantly sensitive to haemodynamic changes in more superficial layers. Here we test whether the 2D-OIS technique is appropriate for studies of negative BOLD. We used concurrent fMRI and 2D-OIS to investigate the haemodynamics underlying the negative BOLD response at 7 Tesla. We investigated whether optical methods could be used to accurately map and measure the negative BOLD phenomenon by using 2D-OIS haemodynamic data to derive predictions from a biophysical model of BOLD signal changes. We showed that, despite the deep cortical origin of the negative BOLD response, if an appropriate heterogeneous tissue model is used in the spectroscopic analysis then 2D-OIS can be used to investigate the negative BOLD phenomenon.
Abstract:
Urbanization-related alterations to the surface energy balance impact urban warming (‘heat islands’), the growth of the boundary layer, and many other biophysical processes. Traditionally, in situ heat flux measurements have been used to quantify such processes, but these typically represent only a small local-scale area within the heterogeneous urban environment. For this reason, remote sensing approaches are very attractive for elucidating more spatially representative information. Here we use hyperspectral imagery from a new airborne sensor, the Operative Modular Imaging Spectrometer (OMIS), along with a survey map and meteorological data, to derive the land cover information and surface parameters required to map spatial variations in turbulent sensible heat flux (QH). The results from two spatially explicit flux retrieval methods, which use contrasting approaches and, to a large degree, different input data, are compared for a central urban area of Shanghai, China: (1) the Local-scale Urban Meteorological Parameterization Scheme (LUMPS) and (2) an Aerodynamic Resistance Method (ARM). Sensible heat fluxes are determined at the full 6 m spatial resolution of the OMIS sensor, and at lower resolutions via pixel aggregation and spatial averaging. At the 6 m spatial resolution, the sensible heat flux of rooftop-dominated pixels exceeds that of roads, water and vegetated areas, with values peaking at ∼350 W m⁻², whilst the storage heat flux is greatest for road-dominated pixels (peaking at around 420 W m⁻²). We investigate the use of both OMIS-derived land surface temperatures made using a Temperature–Emissivity Separation (TES) approach, and land surface temperatures estimated from air temperature measurements. Sensible heat flux differences between the two approaches over the entire 2 × 2 km study area are less than 30 W m⁻², suggesting that methods employing either strategy may be practical when operated using low spatial resolution (e.g. 1 km) data. Due to the differing methodologies, direct comparisons between results obtained with the LUMPS and ARM methods are most sensibly made at reduced spatial scales. At 30 m spatial resolution, both approaches produce similar results, with the smallest difference being less than 15 W m⁻² in mean QH averaged over the entire study area. This is encouraging given the differing architecture and data requirements of the LUMPS and ARM methods. Furthermore, in terms of mean study-area QH, the results obtained by averaging the original 6 m spatial resolution LUMPS-derived QH values to 30 and 90 m spatial resolution are within ∼5 W m⁻² of those derived from averaging the original surface parameter maps prior to input into LUMPS, suggesting that the use of much lower spatial resolution spaceborne imagery, for example from the Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER), is likely to be a practical solution for heat flux determination in urban areas.
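For orientation, the aerodynamic resistance approach referred to above rests on the standard bulk transfer relation (textbook form; the study's exact parameterisation of the resistance term is not reproduced here):

    Q_H = \rho \, c_p \, \frac{T_s - T_a}{r_a}

where ρ is air density, c_p the specific heat of air at constant pressure, T_s the remotely sensed surface temperature, T_a the air temperature, and r_a the aerodynamic resistance to heat transfer. With illustrative values of ρ ≈ 1.2 kg m⁻³, c_p ≈ 1005 J kg⁻¹ K⁻¹, T_s − T_a = 10 K and r_a = 35 s m⁻¹, this gives Q_H ≈ 345 W m⁻², of the same order as the rooftop peak quoted above.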
Abstract:
In this work we explore the synergistic use of the future MSI instrument on board the Sentinel-2 platform and the OLCI/SLSTR instruments on board the Sentinel-3 platform in order to improve the LST products currently derived from the single AATSR instrument on board the ENVISAT satellite. For this purpose, the high spatial resolution data from Sentinel-2/MSI will be used for a good characterization of the land surface sub-pixel heterogeneity, in particular for a precise parameterization of surface emissivity using a land cover map and spectral mixture techniques. On the other hand, the high spectral resolution of the OLCI instrument, suitable for a better characterization of the atmosphere, along with the dual view available in the SLSTR instrument, will allow a better atmospheric correction through improved aerosol/water vapor content retrievals and the implementation of novel cloud screening procedures. Effective emissivity and atmospheric corrections will allow accurate LST retrievals using the SLSTR thermal bands by developing a synergistic split-window/dual-angle algorithm. The ENVISAT MERIS and AATSR instruments and different high spatial resolution data (Landsat/TM, Proba/CHRIS, Terra/ASTER) will be used as benchmarks for the future OLCI, SLSTR and MSI instruments. Results will be validated using ground data collected in the framework of different field campaigns organized by ESA.
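A generic split-window structure of the kind alluded to (shown only to fix ideas, e.g. following Sobrino-type formulations; the coefficients and the exact dual-angle extension are part of the algorithm development described above) estimates LST from the two thermal brightness temperatures T_i and T_j as:

    T_s = T_i + c_1 (T_i - T_j) + c_2 (T_i - T_j)^2 + c_0
          + (c_3 + c_4 W)(1 - \varepsilon) + (c_5 + c_6 W)\,\Delta\varepsilon

where ε and Δε are the mean and difference of the channel emissivities (here to be supplied by the MSI-based land cover and spectral mixture analysis), W is the total column water vapour (from OLCI), and c_0…c_6 are coefficients fitted to radiative transfer simulations.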
Abstract:
A traditional method of validating the performance of a flood model, when remotely sensed data of the flood extent are available, is to compare the predicted flood extent to that observed. The performance measure employed often uses areal pattern-matching to assess the degree to which the two extents overlap. Recently, remote sensing of flood extents using synthetic aperture radar (SAR) and airborne scanning laser altimetry (LIDAR) has made the synoptic measurement of water surface elevations along flood waterlines more straightforward, and this has emphasised the possibility of using alternative performance measures based on height. This paper considers the advantages that can accrue from using a performance measure based on waterline elevations rather than one based on areal patterns of wet and dry pixels. The two measures were compared for their ability to estimate flood inundation uncertainty maps from a set of model runs carried out to span the acceptable model parameter range in a GLUE-based analysis. A 1-in-5-year flood on the Thames in 1992 was used as a test event. As is typical for UK floods, only a single SAR image of observed flood extent was available for model calibration and validation. A simple implementation of a two-dimensional flood model (LISFLOOD-FP) was used to generate model flood extents for comparison with that observed. The performance measure based on height differences of corresponding points along the observed and modelled waterlines was found to be significantly more sensitive to the channel friction parameter than the measure based on areal patterns of flood extent. The former was able to restrict the parameter range of acceptable model runs and hence reduce the number of runs necessary to generate an inundation uncertainty map. As a result, there was less uncertainty in the final flood risk map. The uncertainty analysis included the effects of uncertainties in the observed flood extent as well as in model parameters. The height-based measure was found to be more sensitive when increased heighting accuracy was achieved by requiring that observed waterline heights varied slowly along the reach. The technique allows for the decomposition of the reach into sections, with different effective channel friction parameters used in different sections, which in this case resulted in lower r.m.s. height differences between observed and modelled waterlines than those achieved by runs using a single friction parameter for the whole reach. However, a validation of the modelled inundation uncertainty using the calibration event showed a significant difference between the uncertainty map and the observed flood extent. While this was true for both measures, the difference was especially significant for the height-based one. This is likely to be due to the conceptually simple flood inundation model and the coarse application resolution employed in this case. The increased sensitivity of the height-based measure may place an increased onus on the model developer to produce a valid model.
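To make the comparison concrete, one common form of each class of measure (illustrative definitions, not necessarily the exact statistics used in the paper) is an areal fit score and a waterline-height error:

    F = \frac{|A_{obs} \cap A_{mod}|}{|A_{obs} \cup A_{mod}|}
    E_h = \sqrt{ \frac{1}{n} \sum_{i=1}^{n} \left( z_i^{obs} - z_i^{mod} \right)^2 }

where A_obs and A_mod are the observed and modelled inundated areas, and z_i^{obs}, z_i^{mod} are water surface elevations at n corresponding points along the observed and modelled waterlines. In a GLUE analysis, either score can be converted into a likelihood weight for each parameter set; the finding above is that the height-based score discriminates between channel friction values more sharply than the areal one.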
Abstract:
The Self-Organizing Map (SOM) is a popular unsupervised neural network able to provide effective clustering and data visualization for multidimensional input datasets. In this paper, we present an application of the simulated annealing procedure to the SOM learning algorithm with the aim of obtaining faster learning and better performance in terms of quantization error. The proposed learning algorithm is called the Fast Learning Self-Organized Map, and it does not affect the simplicity of the basic learning algorithm of the standard SOM. The proposed learning algorithm also improves the quality of the resulting maps by providing better clustering quality and topology preservation of multi-dimensional input data. Several experiments are used to compare the proposed approach with the original algorithm and some of its modifications and speed-up techniques.
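A minimal Python sketch of the general idea follows (a plain SOM whose learning rate and neighbourhood width are decayed by an annealing-style schedule; this is illustrative only and is not the Fast Learning Self-Organized Map algorithm defined in the paper):

    import numpy as np

    def train_som(data, grid=(10, 10), epochs=20, lr0=0.5, sigma0=3.0, seed=0):
        """Plain SOM training with exponentially decayed ('annealed') learning
        rate and neighbourhood width. Illustrative only, not the FLSOM schedule."""
        rng = np.random.default_rng(seed)
        rows, cols = grid
        weights = rng.random((rows, cols, data.shape[1]))
        # Grid coordinates, used for the neighbourhood function on the map
        coords = np.dstack(np.meshgrid(np.arange(rows), np.arange(cols),
                                       indexing="ij")).astype(float)
        n_steps = epochs * len(data)
        step = 0
        for _ in range(epochs):
            for x in data[rng.permutation(len(data))]:
                t = step / n_steps                 # training progress in [0, 1)
                lr = lr0 * np.exp(-3.0 * t)        # 'cooling' of the learning rate
                sigma = sigma0 * np.exp(-3.0 * t)  # shrinking neighbourhood width
                # Best matching unit (BMU)
                d = np.linalg.norm(weights - x, axis=2)
                bmu = np.unravel_index(np.argmin(d), d.shape)
                # Gaussian neighbourhood around the BMU on the map grid
                g = np.exp(-np.sum((coords - np.array(bmu)) ** 2, axis=2)
                           / (2.0 * sigma ** 2))
                weights += lr * g[..., None] * (x - weights)
                step += 1
        return weights

    def quantization_error(data, weights):
        """Mean distance from each sample to its best matching unit."""
        d = np.linalg.norm(weights[None] - data[:, None, None, :], axis=3)
        return d.reshape(len(data), -1).min(axis=1).mean()

    # Example on hypothetical data: 500 random 3-D samples
    X = np.random.default_rng(1).random((500, 3))
    W = train_som(X)
    print("quantization error:", quantization_error(X, W))

The quantization error computed at the end is the figure of merit the abstract refers to when comparing the annealed schedule against the standard SOM.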
Abstract:
Intensive land use reduces the diversity and abundance of many soil biota, with consequences for the processes that they govern and the ecosystem services that these processes underpin. Relationships between soil biota and ecosystem processes have mostly been found in laboratory experiments and are rarely observed in the field. Here, we quantified, across four countries of contrasting climatic and soil conditions in Europe, how differences in soil food web composition resulting from land use systems (intensive wheat rotation, extensive rotation, and permanent grassland) influence the functioning of soils and the ecosystem services that they deliver. Intensive wheat rotation consistently reduced the biomass of all components of the soil food web across all countries. Soil food web properties strongly and consistently predicted processes of C and N cycling across land use systems and geographic locations, and they were a better predictor of these processes than land use. Processes of carbon loss increased with soil food web properties that correlated with soil C content, such as earthworm biomass and the fungal/bacterial energy channel ratio, and were greatest in permanent grassland. In contrast, processes of N cycling were explained by soil food web properties independent of land use, such as arbuscular mycorrhizal fungi and bacterial channel biomass. Our quantification of the contribution of soil organisms to processes of C and N cycling across land use systems and geographic locations shows that soil biota need to be included in C and N cycling models and highlights the need to map and conserve soil biodiversity across the world.
Abstract:
Land cover data derived from satellites are commonly used to prescribe inputs to models of the land surface. Since such data inevitably contain errors, quantifying how uncertainties in the data affect a model’s output is important. To do so, a spatial distribution of possible land cover values is required to propagate through the model’s simulation. However, at large scales, such as those required for climate models, such spatial modelling can be difficult. Also, computer models often require, as inputs, land cover proportions at sites larger than the original map scale, and it is the uncertainty in these proportions that this article discusses. This paper describes a Monte Carlo sampling scheme that generates realisations of land cover proportions from the posterior distribution as implied by a Bayesian analysis that combines spatial information in the land cover map and its associated confusion matrix. The technique is computationally simple and has been applied previously to the Land Cover Map 2000 for the region of England and Wales. This article demonstrates the ability of the technique to scale up to large (global) satellite-derived land cover maps and reports its application to the GlobCover 2009 data product. The results show that, in general, the GlobCover data possess only small biases, with the largest belonging to non-vegetated surfaces. Among vegetated surfaces, the most prominent area of uncertainty is Southern Africa, which represents a complex heterogeneous landscape. It is also clear from this study that greater resources need to be devoted to the construction of comprehensive confusion matrices.
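The following sketch gives the flavour of such a Monte Carlo scheme under simplifying assumptions (hypothetical three-class confusion matrix, no spatial term, plain row-normalised conditional probabilities rather than the full Bayesian posterior described above): each pixel's true class is resampled conditional on its mapped class, and the realisations are aggregated into proportions for a coarse model grid cell.

    import numpy as np

    rng = np.random.default_rng(1)

    # Hypothetical 3-class confusion matrix: rows = mapped class, cols = true class
    confusion = np.array([[80.,  5., 15.],
                          [10., 60., 30.],
                          [ 5., 10., 85.]])

    # Conditional probability of the true class given the mapped class.
    # (The published scheme uses a fuller Bayesian treatment; plain row
    # proportions are used here purely for illustration.)
    p_true_given_mapped = confusion / confusion.sum(axis=1, keepdims=True)

    # Hypothetical mapped labels for one coarse model grid cell (20 x 20 pixels)
    mapped = rng.integers(0, 3, size=(20, 20))

    def sample_proportions(mapped, p, n_realisations=1000):
        """Monte Carlo realisations of true-class proportions for one site."""
        n_classes = p.shape[1]
        counts_per_mapped = np.bincount(mapped.ravel(), minlength=n_classes)
        out = np.zeros((n_realisations, n_classes))
        for m, n_m in enumerate(counts_per_mapped):
            if n_m:
                # Resample the true class of the n_m pixels mapped as class m
                out += rng.multinomial(n_m, p[m], size=n_realisations)
        return out / mapped.size

    props = sample_proportions(mapped, p_true_given_mapped)
    print("mean proportions:", props.mean(axis=0))
    print("95% intervals:", np.percentile(props, [2.5, 97.5], axis=0))

The spread of the sampled proportions is what would then be propagated through the land surface model in place of a single deterministic value.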
Abstract:
Purpose – Recognizing the heterogeneity of services, this paper aims to clarify the characteristics of the forward and corresponding reverse supply chains of different services. Design/methodology/approach – The paper develops a two-dimensional typology matrix, representing four main clusters of services according to the degree of input standardization and the degree of output tangibility. Based on this matrix, the paper develops a typology and parsimonious conceptual models illustrating the characteristics of the forward and corresponding reverse supply chains of each cluster of services. Findings – The four main clusters of service supply chains have different characteristics. This provides the basis for the identification, presentation and explanation of the different characteristics of their corresponding reverse service supply chains. Research limitations/implications – The findings of this research can help future researchers to analyse, map and model forward and reverse service supply chains, and to identify potential research gaps in the area. Practical implications – The findings of the research can help managers of service firms to gain better visibility of their forward and reverse supply chains, and to refine their business models to help extend their reverse/closed-loop activities. Furthermore, the findings can help managers to better optimize their service operations to reduce service gaps and potentially secure new value-adding opportunities. Originality/value – This paper is the first, to the authors' knowledge, to conceptualize the basic structure of the forward and reverse service supply chains while dealing with the high level of heterogeneity of services.