41 results for Spatial analysis of submerged macrophytes
Abstract:
This study focuses on the analysis of winter (October-November-December-January-February-March; ONDJFM) storm events and their changes due to increased anthropogenic greenhouse gas concentrations over Europe. To assess uncertainties due to model formulation, 4 regional climate models (RCMs) with 5 high-resolution experiments, and 4 global general circulation models (GCMs), are considered. Firstly, cyclone systems as synoptic-scale processes in winter are investigated, as they are a principal cause of extreme, damage-causing wind speeds. This is achieved by applying an objective cyclone identification and tracking algorithm to the GCMs. Secondly, changes in extreme near-surface wind speeds are analysed. Based on percentile thresholds, the studied extreme wind speed indices allow a consistent analysis over Europe that takes systematic deviations of the models into account. Relative changes in both the intensity and frequency of extreme winds and their related uncertainties are assessed and related to changing patterns of extreme cyclones. A common feature of all investigated GCMs is a reduced track density over central Europe under climate change conditions when all systems are considered. If only extreme (i.e. the strongest 5%) cyclones are taken into account, increased cyclone activity is apparent over western parts of central Europe; however, the climate change signal shows less spatial coherency than for all systems, with partially contradictory results. With respect to extreme wind speeds, significant positive changes in intensity and frequency are obtained over at least 3% and 20%, respectively, of the European domain under study (35–72°N and 15°W–43°E).
The location and extent of the affected areas (up to 60% and 50% of the domain for intensity and frequency, respectively), as well as the magnitudes of change (up to +15% and +200% for intensity and frequency, respectively), are shown to be highly dependent on the driving GCM, whereas differences between RCMs driven by the same GCM are relatively small.
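The percentile-based index approach described above can be sketched in a few lines: define a threshold from the control climate so that systematic model bias cancels, then compare exceedance intensity and frequency between periods. The series, the choice of the 98th percentile, and the change statistics below are illustrative assumptions, not the study's actual configuration:

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical daily winter wind-speed series (m/s) for a control and a
# scenario period at one grid cell; real data would come from the model runs.
ctl = rng.weibull(2.0, 5000) * 8.0
scn = rng.weibull(2.0, 5000) * 8.6

# The threshold is a percentile of the control climate, so a model's
# systematic bias cancels when its own scenario run is compared against it.
p98 = np.percentile(ctl, 98)

def intensity(v, thr):
    """Mean wind speed of the events exceeding the threshold."""
    return v[v > thr].mean()

def frequency(v, thr):
    """Number of events exceeding the threshold."""
    return int((v > thr).sum())

d_int = 100 * (intensity(scn, p98) / intensity(ctl, p98) - 1)
d_frq = 100 * (frequency(scn, p98) / frequency(ctl, p98) - 1)
print(f"intensity change {d_int:+.1f}%, frequency change {d_frq:+.1f}%")
```

Note how the frequency change responds much more strongly than the intensity change to the same shift in the wind distribution, which mirrors the asymmetry between the intensity and frequency figures quoted above.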
Abstract:
Global NDVI data are routinely derived from the AVHRR, SPOT-VGT, and MODIS/Terra earth observation records for a range of applications, from terrestrial vegetation monitoring to climate change modeling. This has led to a substantial interest in the harmonization of multisensor records. Most evaluations of the internal consistency and continuity of global multisensor NDVI products have focused on time-series harmonization in the spectral domain, often neglecting the spatial domain. We fill this void by applying variogram modeling (a) to evaluate the differences in spatial variability between 8-km AVHRR, 1-km SPOT-VGT, and 1-km, 500-m, and 250-m MODIS NDVI products over eight EOS (Earth Observing System) validation sites, and (b) to characterize the decay of spatial variability as a function of pixel size (i.e. data regularization) for spatially aggregated Landsat ETM+ NDVI products and a real multisensor dataset. First, we demonstrate that the conjunctive analysis of two variogram properties – the sill and the mean length scale metric – provides a robust assessment of the differences in spatial variability between multiscale NDVI products that are due to spatial (nominal pixel size, point spread function, and view angle) and non-spatial (sensor calibration, cloud clearing, atmospheric corrections, and length of multi-day compositing period) factors. Next, we show that as the nominal pixel size increases, the decay of spatial information content follows a logarithmic relationship, with a stronger fit for the spatially aggregated NDVI products (R2 = 0.9321) than for the native-resolution AVHRR, SPOT-VGT, and MODIS NDVI products (R2 = 0.5064). This relationship serves as a reference for evaluating the differences in spatial variability and length scales in multiscale datasets at native or aggregated spatial resolutions.
The outcomes of this study suggest that multisensor NDVI records cannot be integrated into a long-term data record without proper consideration of all factors affecting their spatial consistency. Hence, we propose an approach for selecting the spatial resolution at which differences in spatial variability between NDVI products from multiple sensors are minimized. This approach provides practical guidance for the harmonization of long-term multisensor datasets.
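A minimal illustration of the two variogram properties the study relies on, the sill and a length scale, computed on a synthetic 1-D transect. All inputs here are hypothetical stand-ins; real inputs would be rows of satellite NDVI imagery, and the 95%-of-sill rule is a crude proxy for the paper's mean length scale metric:

```python
import numpy as np

rng = np.random.default_rng(1)
# Hypothetical 1-D NDVI transect with ~21-pixel correlation length, standing
# in for one image row; real inputs would be AVHRR/SPOT-VGT/MODIS pixels.
n = 500
noise = rng.normal(0.0, 1.0, n + 20)
ndvi = 0.5 + 0.05 * np.convolve(noise, np.ones(21) / 21, mode="valid")

def empirical_variogram(z, max_lag):
    """Semivariance gamma(h) = 0.5 * mean((z[i+h] - z[i])**2) per lag h."""
    return np.array([0.5 * np.mean((z[h:] - z[:-h]) ** 2)
                     for h in range(1, max_lag + 1)])

gamma = empirical_variogram(ndvi, 60)
sill = ndvi.var()  # for a stationary series the variogram levels off here
# Crude length scale: first lag at which gamma reaches 95% of the sill.
length_scale = 1 + int(np.argmax(gamma >= 0.95 * sill))
print(f"sill={sill:.2e}, length scale={length_scale} pixels")
```

Comparing sills and length scales computed this way across products resampled to a common grid is the essence of the multiscale comparison described above.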
Abstract:
The results of high-resolution coupled global general circulation models (CGCMs) over South America are discussed. HiGEM1.2 and HadGEM1.2 simulations, with horizontal resolutions of ~90 and ~135 km, respectively, are compared. Precipitation estimates from CMAP (Climate Prediction Center—Merged Analysis of Precipitation), CPC (Climate Prediction Center) and GPCP (Global Precipitation Climatology Project) are used for validation. HiGEM1.2 and HadGEM1.2 simulate seasonal mean precipitation spatial patterns similar to CMAP. The positioning and migration of the Intertropical Convergence Zone and of the Pacific and Atlantic subtropical highs are correctly simulated by the models. In HiGEM1.2 and HadGEM1.2, the intensity and location of the South Atlantic Convergence Zone are in agreement with the observed dataset. The simulated annual cycles are in phase with the rainfall estimates for most of the six regions considered. An important result is that HiGEM1.2 and HadGEM1.2 eliminate a common problem of coarse-resolution CGCMs: the simulation of a semiannual cycle of precipitation due to the semiannual solar forcing. Comparatively, the use of high resolution in HiGEM1.2 reduces the dry biases in the central part of Brazil during austral winter and spring, and during most of the year over an oceanic box in eastern Uruguay.
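The two validation checks described above, mean bias and phase agreement of the annual cycle, reduce to simple statistics on 12-month climatologies. The numbers below are invented stand-ins, not HiGEM/HadGEM output or CMAP values:

```python
import numpy as np

# Hypothetical 12-month precipitation climatologies (mm/day) for one region;
# real values would be the model output and the CMAP/CPC/GPCP estimates.
months = np.arange(12)
obs = 4.0 + 2.5 * np.cos(2 * np.pi * months / 12)        # "observed" cycle
mdl = 3.6 + 2.4 * np.cos(2 * np.pi * (months - 1) / 12)  # "model" cycle

bias = mdl - obs                     # monthly mean bias (dry where < 0)
phase = np.corrcoef(mdl, obs)[0, 1]  # in-phase annual cycles give values near 1
print(f"mean bias {bias.mean():+.2f} mm/day, phase correlation {phase:.2f}")
```

A spurious semiannual cycle of the kind coarse CGCMs produce would show up here as a depressed phase correlation even when the mean bias is small.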
Energy exchange in a dense urban environment Part II: impact of spatial heterogeneity of the surface
Abstract:
The centres of cities, characterised by spatial and temporal complexity, are challenging environments for micrometeorological research. This paper considers the impact of sensor location and heterogeneity of the urban surface on flux observations in the dense city centre of London, UK. Data gathered at two sites in close vicinity, but with different measurement heights, were analysed to investigate the influence of source area characteristics on long-term radiation and turbulent heat fluxes. Combining consideration of diffuse radiation and the effects of specular reflections, the non-Lambertian urban surface is found to affect the measurements of surface albedo. Comparisons of observations from the two sites reveal that turbulent heat fluxes are similar under some flow conditions. However, the sites mostly observe processes at different scales due to their differing measurement heights, highlighting the critical importance of sensor siting in urban areas. A detailed source area analysis is presented to investigate the surface controls influencing the energy exchanges at the different scales.
Abstract:
The hippocampus plays a pivotal role in the formation and consolidation of episodic memories, and in spatial orientation. Historically, the adult hippocampus has been viewed as a very static anatomical region of the mammalian brain. However, recent findings have demonstrated that the dentate gyrus of the hippocampus is an area of tremendous plasticity in adults, involving not only modifications of existing neuronal circuits, but also adult neurogenesis. This plasticity is regulated by complex transcriptional networks, in which the transcription factor NF-κB plays a prominent role. To study and manipulate adult neurogenesis, a transgenic mouse model for forebrain-specific neuronal inhibition of NF-κB activity can be used. In this study, methods are described for the analysis of NF-κB-dependent neurogenesis, including its structural aspects, neuronal apoptosis and progenitor proliferation, and cognitive significance, which was specifically assessed via a dentate gyrus (DG)-dependent behavioral test, the spatial pattern separation-Barnes maze (SPS-BM). The SPS-BM protocol could be simply adapted for use with other transgenic animal models designed to assess the influence of particular genes on adult hippocampal neurogenesis. Furthermore, SPS-BM could be used in other experimental settings aimed at investigating and manipulating DG-dependent learning, for example, using pharmacological agents.
Abstract:
The network paradigm has been highly influential in spatial analysis in the globalisation era. As economies across the world have become increasingly integrated, so-called global cities have come to play a growing role as central nodes in the networked global economy. The idea that a city’s position in global networks benefits its economic performance has resulted in a competitive policy focus on promoting the economic growth of cities by improving their network connectivity. However, in spite of the attention being given to boosting city connectivity, little is known about whether this directly translates into improved city economic performance and, if so, how well connected a city needs to be in order to benefit. In this paper we test the relationship between network connectivity and economic performance between 2000 and 2008 for cities with over 500,000 inhabitants in Europe and the USA to inform European policy.
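At its simplest, a test of this kind regresses city economic growth on a network-connectivity measure. The sketch below uses entirely synthetic city data (the sample size, connectivity index, and effect size are all hypothetical), not the study's measured panel:

```python
import numpy as np

rng = np.random.default_rng(2)
# Hypothetical panel: 120 large cities with a network-connectivity index and
# a 2000-2008 growth rate; the actual study uses measured city data.
n = 120
connectivity = rng.uniform(0.0, 1.0, n)
growth = 1.0 + 0.8 * connectivity + rng.normal(0.0, 0.3, n)

# OLS of growth on connectivity; a positive slope would indicate that
# better-connected cities grew faster over the period.
X = np.column_stack([np.ones(n), connectivity])
beta, *_ = np.linalg.lstsq(X, growth, rcond=None)
print(f"intercept={beta[0]:.2f}, slope={beta[1]:.2f}")
```

The question of "how well connected a city needs to be" would correspond to testing for a nonlinear or threshold effect, e.g. by adding spline or quadratic terms to the design matrix.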
Abstract:
Fire activity has varied globally and continuously since the last glacial maximum (LGM) in response to long-term changes in global climate and shorter-term regional changes in climate, vegetation, and human land use. We have synthesized sedimentary charcoal records of biomass burning since the LGM and present global maps showing changes in fire activity for time slices during the past 21,000 years (as differences in charcoal accumulation values compared to pre-industrial). There is strong broad-scale coherence in fire activity after the LGM, but spatial heterogeneity in the signals increases thereafter. In North America, Europe and southern South America, charcoal records indicate less-than-present fire activity during the deglacial period, from 21,000 to ∼11,000 cal yr BP. In contrast, the tropical latitudes of South America and Africa show greater-than-present fire activity from ∼19,000 to ∼17,000 cal yr BP, and most sites from Indochina and Australia show greater-than-present fire activity from 16,000 to ∼13,000 cal yr BP. Many sites indicate greater-than-present or near-present activity during the Holocene, with the exception of eastern North America and eastern Asia from 8,000 to ∼3,000 cal yr BP, Indonesia and Australia from 11,000 to 4,000 cal yr BP, and southern South America from 6,000 to 3,000 cal yr BP, where fire activity was less than present. Regional coherence in the patterns of change in fire activity was evident throughout the post-glacial period. These complex patterns can largely be explained in terms of large-scale climate controls modulated by local changes in vegetation and fuel load.
Abstract:
This paper analyses the impact of several avoided deforestation policies within a patchy forested landscape. Central is the idea that deforestation choices in one area influence deforestation decisions in nearby patches. We explore the interplay between forest landscapes comprising heterogeneous patches, localised spatial displacement, and avoided deforestation policies. The landscape-level policies considered are: two Payments for Environmental Services (PES) policies, one focused on deforestation hotspots and the other equally available to all agents; a conservation area; and an agglomeration bonus. We demonstrate how the "best" policy, in terms of reduced leakage, depends on landscape heterogeneity. Agglomeration bonuses are shown to be more effective where there is less landscape heterogeneity, whilst conservation areas are most effective where there is more spatial heterogeneity.
Abstract:
Lipidomic analyses of milling and pearling fractions from wheat grain were carried out to determine differences in composition which could relate to the spatial distribution of lipids in the grain. Free fatty acids and triacylglycerols were major components in all fractions, but the relative contents of polar lipids varied, particularly lysophosphatidyl choline and digalactosyldiglyceride, which were enriched in flour fractions. By contrast, minor phospholipids were enriched in bran and offal fractions. The most abundant fatty acids in the analysed acyl lipids were C16:0 and C18:2 and their combinations, including C36:4 and C34:2. Phospholipids and galactolipids have been reported to have beneficial properties for bread making, while free fatty acids and triacylglycerols are considered detrimental. The subtle differences in the compositions of fractions determined in the present study could therefore underpin the production of flour fractions with optimised compositions for different end uses.
Abstract:
Considering the sea ice decline in the Arctic during the last decades, polynyas are of high research interest, since these features are core areas of new ice formation. The determination of ice formation requires accurate retrieval of polynya area and thin-ice thickness (TIT) distribution within the polynya. We use an established energy balance model to derive TITs from MODIS ice surface temperatures (Ts) and NCEP/DOE Reanalysis II data in the Laptev Sea for two winter seasons. Improvements to the algorithm mainly concern the implementation of an iterative approach to calculate the atmospheric flux components, taking the atmospheric stratification into account. Furthermore, a sensitivity study is performed to analyze the errors of the ice thickness. The results are the following: 1) 2-m air temperatures (Ta) and Ts have the highest impact on the retrieved ice thickness; 2) an overestimation of Ta yields smaller ice thickness errors than an underestimation of Ta; 3) NCEP Ta often shows a warm bias; and 4) the mean absolute error for ice thicknesses up to 20 cm is ±4.7 cm. Based on these results, we conclude that, despite the shortcomings of the NCEP data (coarse spatial resolution and no polynyas), this data set is appropriate, in combination with MODIS Ts, for the retrieval of TITs up to 20 cm in the Laptev Sea region. The TIT algorithm can be applied to other polynya regions and to past and future time periods. Our TIT product is a valuable data set for the verification of other model and remote sensing ice thickness data.
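At its core, an energy balance TIT retrieval inverts a flux balance for thickness. A minimal non-iterative sketch, assuming a fixed net atmospheric heat loss rather than the iteratively computed, stability-corrected fluxes described above; the constants and input values are illustrative only:

```python
# Equilibrium sketch: the conductive heat flux through thin ice,
# k_ice * (T_freeze - T_surface) / h, balances the net atmospheric heat
# loss Q_atm, so h = k_ice * (T_freeze - T_surface) / Q_atm.
K_ICE = 2.03       # thermal conductivity of sea ice, W m-1 K-1 (typical value)
T_FREEZE = -1.86   # freezing point of sea water, deg C

def thin_ice_thickness(t_surface, q_atm):
    """Thin-ice thickness (m) from the ice surface temperature (deg C) and
    the net atmospheric heat loss (W m-2, positive upward)."""
    if q_atm <= 0:
        raise ValueError("no net heat loss: thickness not retrievable")
    return K_ICE * (T_FREEZE - t_surface) / q_atm

h = thin_ice_thickness(t_surface=-15.0, q_atm=200.0)  # cold polynya case
print(f"thin-ice thickness: {100 * h:.1f} cm")
```

The sensitivity findings above follow directly from this form: thickness scales with the temperature difference, so errors in Ts (and, through Q_atm, in Ta) dominate the retrieval error.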
Abstract:
Land cover data derived from satellites are commonly used to prescribe inputs to models of the land surface. Since such data inevitably contain errors, quantifying how uncertainties in the data affect a model’s output is important. To do so, a spatial distribution of possible land cover values is required to propagate through the model’s simulation. However, at large scales, such as those required for climate models, such spatial modelling can be difficult. Also, computer models often require land cover proportions at sites larger than the original map scale as inputs, and it is the uncertainty in these proportions that this article discusses. This paper describes a Monte Carlo sampling scheme that generates realisations of land cover proportions from the posterior distribution implied by a Bayesian analysis combining spatial information in the land cover map and its associated confusion matrix. The technique is computationally simple and has been applied previously to the Land Cover Map 2000 for the region of England and Wales. This article demonstrates the ability of the technique to scale up to large (global) satellite-derived land cover maps and reports its application to the GlobCover 2009 data product. The results show that, in general, the GlobCover data possess only small biases, with the largest belonging to non-vegetated surfaces. Among vegetated surfaces, the most prominent area of uncertainty is Southern Africa, which represents a complex heterogeneous landscape. It is also clear from this study that greater resources need to be devoted to the construction of comprehensive confusion matrices.
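The Monte Carlo idea can be sketched as follows, under a deliberate simplification of the full Bayesian analysis: each pixel's true class is drawn from the row of a normalized confusion matrix corresponding to its mapped class (ignoring the spatial information the paper also exploits). The 3-class matrix and pixel counts are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(3)
# Hypothetical 3-class confusion matrix (rows: mapped class, columns: true
# class), standing in for a real validation matrix such as GlobCover's.
confusion = np.array([[80., 15., 5.],
                      [10., 85., 5.],
                      [5., 10., 85.]])
p_true = confusion / confusion.sum(axis=1, keepdims=True)  # P(true | mapped)

mapped = rng.integers(0, 3, size=1000)  # mapped classes of one site's pixels

def sample_proportions(mapped, p, rng, n_draws=500):
    """Monte Carlo realisations of the site's true land cover proportions."""
    k = p.shape[1]
    counts = np.bincount(mapped, minlength=k)  # pixels per mapped class
    draws = np.zeros((n_draws, k))
    for c, n_c in enumerate(counts):
        # true classes of the n_c pixels mapped as c follow multinomial(p[c])
        draws += rng.multinomial(n_c, p[c], size=n_draws)
    return draws / mapped.size

props = sample_proportions(mapped, p_true, rng)
print("mean:", props.mean(axis=0), "sd:", props.std(axis=0))
```

The spread of the realisations quantifies the uncertainty in the site-level proportions, and the gap between the mean realisation and the mapped proportions indicates the kind of bias reported above for non-vegetated surfaces.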