954 results for Spatial Information
Abstract:
Current force feedback haptic interface devices are generally limited to the display of low-frequency, high-amplitude spatial data. A typical device consists of a low-impedance framework of one or more degrees of freedom (dof), allowing a user to explore a pre-defined workspace via an end effector such as a handle, thimble, probe, or stylus. The movement of the device is then constrained using high-gain positional feedback, thus reducing the apparent dof of the device and conveying the illusion of hard contact to the user. Such devices are, however, limited to a narrow bandwidth of frequencies, typically below 30 Hz, and are not well suited to the display of surface properties such as object texture. This paper details a device that augments an existing force feedback haptic display with a vibrotactile display, thus providing a means of conveying low-amplitude, high-frequency spatial information about object surface properties.
1. Haptics and Haptic Interfaces
Haptics is the study of human touch and interaction with the external environment via touch. Information from the human sense of touch can be classified into two categories: cutaneous and kinesthetic. Cutaneous information is provided via the mechanoreceptive nerve endings in the glabrous skin of the human hand. It is primarily a means of relaying information about small-scale details in the form of skin stretch, compression, and vibration.
Abstract:
Monitoring Earth's terrestrial water conditions is critically important to many hydrological applications such as global food production; assessing water resources sustainability; and flood, drought, and climate change prediction. These needs have motivated the development of pilot monitoring and prediction systems for terrestrial hydrologic and vegetative states, but to date only at rather coarse spatial resolutions (∼10–100 km) over continental to global domains. Adequately addressing critical water cycle science questions and applications requires systems that are implemented globally at much higher resolutions, on the order of 1 km, resolutions referred to as hyperresolution in the context of global land surface models. This opinion paper sets forth the needs and benefits for a system that would monitor and predict the Earth's terrestrial water, energy, and biogeochemical cycles. We discuss six major challenges in developing such a system: improved representation of surface‐subsurface interactions due to fine‐scale topography and vegetation; improved representation of land‐atmospheric interactions and resulting spatial information on soil moisture and evapotranspiration; inclusion of water quality as part of the biogeochemical cycle; representation of human impacts from water management; utilizing massively parallel computer systems and recent computational advances in solving hyperresolution models that will have up to 10⁹ unknowns; and developing the required in situ and remote sensing global data sets. We deem the development of a global hyperresolution model for monitoring the terrestrial water, energy, and biogeochemical cycles a “grand challenge” to the community, and we call upon the international hydrologic community and the hydrological science support infrastructure to endorse the effort.
Abstract:
Infections involving Salmonella enterica subsp. enterica serovars have serious animal and human health implications, causing gastroenteritis in humans and clinical symptoms, such as diarrhoea and abortion, in livestock. In this study, an optical genetic mapping technique was used to screen 20 field isolate strains from four serovars implicated in disease outbreaks. The technique was able to distinguish between the serovars and the available sequenced strains and group them in agreement with similar data from microarrays and PFGE. The optical maps revealed variation in genome maps associated with antimicrobial resistance and prophage content in S. Typhimurium, and separated the S. Newport strains into two clear geographical lineages defined by the presence of prophage sequences. The technique was also able to detect novel insertions that may have had effects on the central metabolism of some strains. Overall, optical mapping allowed a greater level of differentiation of genomic content and spatial information than more traditional typing methods.
Abstract:
A favoured method of assimilating information from state-of-the-art climate models into integrated assessment models of climate impacts is to use the transient climate response (TCR) of the climate models as an input, sometimes accompanied by a pattern matching approach to provide spatial information. More recent approaches to the problem use TCR with another independent piece of climate model output: the land-sea surface warming ratio (φ). In this paper we show why the use of φ in addition to TCR has such utility. Multiple linear regressions of surface temperature change onto TCR and φ in 22 climate models from the CMIP3 multi-model database show that the inclusion of φ explains a much greater fraction of the inter-model variance than using TCR alone. The improvement is particularly pronounced in North America and Eurasia in the boreal summer season, and in the Amazon all year round. The use of φ as the second metric is beneficial for three reasons: firstly, it is uncorrelated with TCR in state-of-the-art climate models and can therefore be considered an independent metric; secondly, because of its projected time-invariance, the magnitude of φ is better constrained than TCR in the immediate future; thirdly, the use of two variables is much simpler than approaches such as pattern scaling from climate models. Finally, we show how using the latest estimates of φ from climate models, with a mean value of 1.6 (as opposed to previously reported values of 1.4), can significantly increase the mean time-integrated discounted damage projections in a state-of-the-art integrated assessment model by about 15%. When compared to damages calculated without the inclusion of the land-sea warming ratio, this figure rises to 65%, equivalent to almost 200 trillion dollars over 200 years.
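The regression step described above can be sketched in a few lines. This is a minimal illustration with synthetic numbers, not CMIP3 output: we regress a made-up "regional warming" for an ensemble of 22 models onto TCR and φ, and check that adding φ raises the explained inter-model variance relative to TCR alone.

```python
import numpy as np

# Synthetic ensemble of 22 models (values invented for illustration)
rng = np.random.default_rng(0)
n_models = 22
tcr = rng.uniform(1.2, 2.6, n_models)   # transient climate response, K
phi = rng.uniform(1.3, 1.9, n_models)   # land-sea warming ratio, dimensionless
# Synthetic regional warming depending on both predictors plus noise
dT = 0.9 * tcr + 1.1 * phi + rng.normal(0, 0.05, n_models)

# Ordinary least squares with an intercept, TCR and phi as predictors
X = np.column_stack([np.ones(n_models), tcr, phi])
coef, *_ = np.linalg.lstsq(X, dT, rcond=None)
r2 = 1 - (dT - X @ coef).var() / dT.var()

# TCR-only regression for comparison
X_tcr = np.column_stack([np.ones(n_models), tcr])
coef_tcr, *_ = np.linalg.lstsq(X_tcr, dT, rcond=None)
r2_tcr = 1 - (dT - X_tcr @ coef_tcr).var() / dT.var()
```

Because φ is drawn independently of TCR here, it carries information the TCR-only fit cannot capture, so `r2` exceeds `r2_tcr`, mirroring the paper's argument for using the two metrics together.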
Abstract:
Global NDVI data are routinely derived from the AVHRR, SPOT-VGT, and MODIS/Terra earth observation records for a range of applications from terrestrial vegetation monitoring to climate change modeling. This has led to a substantial interest in the harmonization of multisensor records. Most evaluations of the internal consistency and continuity of global multisensor NDVI products have focused on time-series harmonization in the spectral domain, often neglecting the spatial domain. We fill this void by applying variogram modeling (a) to evaluate the differences in spatial variability between 8-km AVHRR, 1-km SPOT-VGT, and 1-km, 500-m, and 250-m MODIS NDVI products over eight EOS (Earth Observing System) validation sites, and (b) to characterize the decay of spatial variability as a function of pixel size (i.e. data regularization) for spatially aggregated Landsat ETM+ NDVI products and a real multisensor dataset. First, we demonstrate that the conjunctive analysis of two variogram properties – the sill and the mean length scale metric – provides a robust assessment of the differences in spatial variability between multiscale NDVI products that are due to spatial (nominal pixel size, point spread function, and view angle) and non-spatial (sensor calibration, cloud clearing, atmospheric corrections, and length of multi-day compositing period) factors. Next, we show that as the nominal pixel size increases, the decay of spatial information content follows a logarithmic relationship, with a stronger fit for the spatially aggregated NDVI products (R² = 0.9321) than for the native-resolution AVHRR, SPOT-VGT, and MODIS NDVI products (R² = 0.5064). This relationship serves as a reference for evaluating the differences in spatial variability and length scales in multiscale datasets at native or aggregated spatial resolutions.
The outcomes of this study suggest that multisensor NDVI records cannot be integrated into a long-term data record without proper consideration of all factors affecting their spatial consistency. Hence, we propose an approach for selecting the spatial resolution at which differences in spatial variability between NDVI products from multiple sensors are minimized. This approach provides practical guidance for the harmonization of long-term multisensor datasets.
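The core mechanism, that block-averaging ("regularizing") a field to coarser pixels lowers the variogram sill, can be sketched directly. This is a toy example on a synthetic NDVI-like field, not the paper's Landsat ETM+ data: we compute an empirical semivariance at a fixed physical lag before and after 4x aggregation and observe the loss of fine-scale spatial variability.

```python
import numpy as np

def semivariance(field, lag):
    # Empirical semivariogram along one axis of a 2D field:
    # gamma(h) = 0.5 * mean((z(x+h) - z(x))^2)
    d = field[:, lag:] - field[:, :-lag]
    return 0.5 * np.mean(d ** 2)

def aggregate(field, k):
    # Block-average to k-times-coarser pixels (data regularization)
    n = (field.shape[0] // k) * k
    f = field[:n, :n]
    return f.reshape(n // k, k, n // k, k).mean(axis=(1, 3))

rng = np.random.default_rng(1)
x = np.arange(256)
# Synthetic NDVI-like field: smooth large-scale signal plus
# fine-scale variability that aggregation will average away
base = np.sin(np.outer(np.ones(256), x) / 20.0)
ndvi = base + 0.3 * rng.normal(size=(256, 256))

fine_sill = semivariance(ndvi, 32)        # native resolution, lag 32 pixels
coarse = aggregate(ndvi, 4)               # e.g. 250 m -> 1 km nominal pixels
coarse_sill = semivariance(coarse, 8)     # same physical lag on coarse grid
```

Repeating this for several aggregation factors would trace out the decay of spatial information content with pixel size that the study fits with a logarithmic relationship.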
Abstract:
The urban heat island is a well-known phenomenon that impacts a wide variety of city operations. With the greater availability of cheap meteorological sensors, it is possible to measure the spatial patterns of urban atmospheric characteristics with greater resolution. To develop robust and resilient networks, recognizing that sensors may malfunction, it is important to know when measurement points are providing additional information and also the minimum number of sensors needed to provide spatial information for particular applications. Here we consider the example of temperature data, and the urban heat island, through analysis of a network of sensors in the Tokyo metropolitan area (Extended METROS). The effect of reducing observation points from an existing meteorological measurement network is considered, using random sampling and sampling with clustering. The results indicated that sampling with hierarchical clustering can yield similar temperature patterns with up to a 30% reduction in measurement sites in Tokyo. The methods presented have broader utility in evaluating the robustness and resilience of existing urban temperature networks and in how networks can be enhanced by new mobile and open data sources.
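A thinning step of this kind can be sketched with standard hierarchical clustering: group stations whose temperature time series are similar and keep one representative per cluster. The station data below are synthetic (three invented thermal "regimes"), not Extended METROS observations, and the 70% retention target simply mirrors the ~30% reduction reported above.

```python
import numpy as np
from scipy.cluster.hierarchy import fcluster, linkage

rng = np.random.default_rng(2)
n_stations, n_hours = 50, 24 * 30
# Three synthetic thermal regimes (e.g. dense urban, suburban, coastal)
regime = rng.integers(0, 3, n_stations)
base = np.array([26.0, 24.0, 22.0])  # regime mean temperature, deg C
series = base[regime, None] + rng.normal(0, 0.3, (n_stations, n_hours))

# Ward linkage on station time series; cut the tree at 70% of the sites
Z = linkage(series, method="ward")
target = int(n_stations * 0.7)       # ~30% reduction in measurement sites
labels = fcluster(Z, t=target, criterion="maxclust")

# Keep one representative per cluster: the station closest to its
# cluster's mean time series
keep = []
for c in np.unique(labels):
    members = np.where(labels == c)[0]
    centroid = series[members].mean(axis=0)
    dists = np.linalg.norm(series[members] - centroid, axis=1)
    keep.append(members[np.argmin(dists)])
keep = np.array(sorted(keep))
```

Comparing a temperature map interpolated from `keep` against one from the full network would give the kind of robustness check the paper describes.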
Abstract:
Land cover data derived from satellites are commonly used to prescribe inputs to models of the land surface. Since such data inevitably contains errors, quantifying how uncertainties in the data affect a model’s output is important. To do so, a spatial distribution of possible land cover values is required to propagate through the model’s simulation. However, at large scales, such as those required for climate models, such spatial modelling can be difficult. Also, computer models often require land cover proportions at sites larger than the original map scale as inputs, and it is the uncertainty in these proportions that this article discusses. This paper describes a Monte Carlo sampling scheme that generates realisations of land cover proportions from the posterior distribution as implied by a Bayesian analysis that combines spatial information in the land cover map and its associated confusion matrix. The technique is computationally simple and has been applied previously to the Land Cover Map 2000 for the region of England and Wales. This article demonstrates the ability of the technique to scale up to large (global) satellite derived land cover maps and reports its application to the GlobCover 2009 data product. The results show that, in general, the GlobCover data possesses only small biases, with the largest belonging to non–vegetated surfaces. In vegetated surfaces, the most prominent area of uncertainty is Southern Africa, which represents a complex heterogeneous landscape. It is also clear from this study that greater resources need to be devoted to the construction of comprehensive confusion matrices.
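A minimal version of such a Monte Carlo scheme can be sketched as follows. The 3-class confusion matrix and pixel counts are invented for illustration; the approach, drawing each row's class-membership probabilities from a Dirichlet posterior and mixing by mapped-class frequencies, is one standard Bayesian treatment of a confusion matrix, not necessarily the exact formulation used for LCM2000 or GlobCover 2009.

```python
import numpy as np

rng = np.random.default_rng(3)

# confusion[i, j]: validation pixels mapped as class i whose true class
# was j (3 hypothetical classes: forest, crop, bare)
confusion = np.array([[80.0, 15.0, 5.0],
                      [10.0, 70.0, 20.0],
                      [5.0, 10.0, 85.0]])

mapped_counts = np.array([600, 300, 100])  # mapped pixels per class at a site

def sample_true_proportions(rng):
    # Draw P(true class | mapped class) for each row from a Dirichlet
    # posterior (flat prior), then mix rows by mapped-class frequencies
    total = mapped_counts.sum()
    props = np.zeros(3)
    for i in range(3):
        p_true_given_mapped = rng.dirichlet(confusion[i] + 1.0)
        props += (mapped_counts[i] / total) * p_true_given_mapped
    return props

# Realisations of the site's land cover proportions
draws = np.array([sample_true_proportions(rng) for _ in range(2000)])
mean_props = draws.mean(axis=0)
spread = draws.std(axis=0)   # per-class uncertainty in the proportions
```

Each realisation can then be propagated through a land surface model to see how confusion-matrix uncertainty affects its output.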
Abstract:
This paper presents an Interactive School Atlas prototype developed for cartography and environmental education. The methodology was based on the theoretical study of child mental development in Piaget's theory, in order to elaborate strategies that help students better comprehend spatial information. Sixth-grade students were chosen as the case study because they belong to the Formal Operational stage, in which children acquire the mental operations needed to understand key cartographic concepts. The Atlas was developed in two stages: cartographic design and Atlas production. Its implementation sought to use Multimedia Cartography and animation resources that may attract students and teachers, encouraging them to explore the tools and strategies that lead users to a correct interpretation of map contents. The Atlas was implemented using the Macromedia Flash and Visual Basic software and the MapObjects library. Though the Atlas has not been evaluated yet, it should be noted that it was designed according to theoretical and methodological knowledge of cognitive development and its relationship to cartographic concepts, aiming to adapt the product to children's cognitive skills.
Abstract:
Brain oscillations are not completely independent but can interact with each other through cross-frequency coupling (CFC) in at least four different ways: power-to-power, phase-to-phase, phase-to-frequency, and phase-to-power. Recent evidence suggests that not only the rhythms per se but also their interactions are involved in the execution of cognitive tasks, mainly those requiring selective attention, information flow, and memory consolidation. It was recently proposed that fast gamma oscillations (60–150 Hz) convey spatial information from the medial entorhinal cortex to the CA1 region of the hippocampus by means of theta (4–12 Hz) phase coupling. Despite these findings, however, little is known about the general characteristics of CFCs in several brain regions. In this work we recorded local field potentials using multielectrode arrays aimed at the CA1 region of the dorsal hippocampus for chronic recording. Cross-frequency coupling was evaluated using comodulogram analysis, a recently developed CFC tool (Tort et al. 2008, Tort et al. 2010). All data analyses were performed using MATLAB (MathWorks Inc.). Here we describe two functionally distinct oscillations within the fast gamma frequency range, both coupled to the theta rhythm during active exploration and REM sleep: an oscillation with peak activity at ~80 Hz, and a faster oscillation centered at ~140 Hz. The two oscillations are differentially modulated by the phase of theta depending on the CA1 layer: theta-80 Hz coupling is strongest at stratum lacunosum-moleculare, while theta-140 Hz coupling is strongest at stratum oriens-alveus. This laminar profile suggests that the ~80 Hz oscillation originates from entorhinal cortex inputs to deeper CA1 layers, while the ~140 Hz oscillation reflects CA1 activity in superficial layers. We further show that the ~140 Hz oscillation differs from sharp-wave-associated ripple oscillations in several key characteristics.
Our results demonstrate the existence of novel theta-associated high-frequency oscillations and suggest a redefinition of fast gamma oscillations.
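One cell of a comodulogram can be illustrated with the Tort et al. (2010) modulation index (MI): bin the fast-oscillation amplitude by slow-oscillation phase and measure the deviation of the binned distribution from uniform. The signal below is synthetic (an 80 Hz oscillation whose envelope is locked to an 8 Hz "theta"), and for brevity the known analytic phase and envelope replace the band-pass filtering and Hilbert transform used on real LFP data.

```python
import numpy as np

rng = np.random.default_rng(4)
fs = 1000.0
t = np.arange(0, 10, 1 / fs)

# Synthetic LFP: 8 Hz theta plus an 80 Hz fast gamma whose amplitude
# waxes at a fixed theta phase (phase-to-power coupling)
theta = np.sin(2 * np.pi * 8 * t)
gamma = (1 + 0.8 * np.cos(2 * np.pi * 8 * t)) * np.sin(2 * np.pi * 80 * t)
lfp = theta + 0.2 * gamma + 0.05 * rng.normal(size=t.size)

# On real data, phase and envelope come from band-pass + Hilbert;
# here we use the known analytic expressions
phase = (2 * np.pi * 8 * t) % (2 * np.pi)           # theta phase
amp = 1 + 0.8 * np.cos(2 * np.pi * 8 * t)           # fast gamma envelope

# Tort MI: normalized KL divergence of the phase-binned mean amplitude
# distribution from the uniform distribution
n_bins = 18
edges = np.linspace(0, 2 * np.pi, n_bins + 1)
bins = np.digitize(phase, edges) - 1
mean_amp = np.array([amp[bins == b].mean() for b in range(n_bins)])
p = mean_amp / mean_amp.sum()
mi = (np.log(n_bins) + np.sum(p * np.log(p))) / np.log(n_bins)
```

Sweeping the phase-frequency and amplitude-frequency bands and plotting MI for each pair yields the full comodulogram used in the study.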
Abstract:
In this paper we present the methodological procedures involved in the mesoscale digital imaging of a block of travertine rock of Quaternary age from the city of Acquasanta, located in the Apennines, Italy. This rocky block, called the T-Block, was stored in the courtyard of the Laboratório Experimental Petróleo "Kelsen Valente" (LabPetro) of the Universidade Estadual de Campinas (UNICAMP), so that scientific studies could be performed on it, mainly by research groups from universities and research centers working in reservoir characterization and 3D digital imaging in Brazil. The purpose of this work is the development of a Digital Solid Model through non-invasive 3D digital imaging of the internal and external surfaces of the T-Block. For imaging the external surfaces, LIDAR (Light Detection and Ranging) technology was used, while the internal imaging was done using Ground Penetrating Radar (GPR); in addition, profiles were obtained with a portable gamma-ray spectrometer. The goal of the 3D digital imaging was the identification and parameterization of geological surfaces and sedimentary facies that could represent depositional heterogeneities at mesoscale, based on the study of a rocky block with dimensions of approximately 1.60 m x 1.60 m x 2.70 m. The data acquired with the terrestrial laser scanner provided georeferenced spatial information on the surface of the block (X, Y, Z), together with the varying intensity values of the returned laser beam and high-resolution RGB data (3 mm x 3 mm), with a total of 28,505,106 points acquired. This information was used as an aid in the interpretation of the radargrams and is ready to be displayed in virtual reality rooms. With the GPR, 15 profiles of 2.3 m and two 3D grids were obtained, each grid with 24 horizontal sections of 1.3 m and 14 vertical sections of 2.3 m, acquired with both the 900 MHz antenna and the 2600 MHz antenna.
Finally, the use of GPR together with the laser scanner enabled the identification and 3D mapping of three different radarfacies, which were correlated with the three sedimentary facies defined at the outset. The six gamma-ray profiles showed low-amplitude variation in radioactivity values. This is likely due to the fact that the profiled sedimentary layers have the same mineralogical composition, being composed of carbonate sediments, with no clay-rich siliciclastic (pelitic) layers or other minerals carrying radioactive elements.
Abstract:
This paper describes a geostatistical method, known as factorial kriging analysis, which is well suited for analyzing multivariate spatial information. The method involves multivariate variogram modeling, principal component analysis, and cokriging. It uses several separate correlation structures, each corresponding to a specific spatial scale, and yields a set of regionalized factors summarizing the main features of the data for each spatial scale. This method is applied to an area of high manganese-ore mining activity in Amapa State, North Brazil. Two scales of spatial variation (0.33 and 2.0 km) are identified and interpreted. The results indicate that, for the short-range structure, manganese, arsenic, iron, and cadmium are associated with human activities due to the mining work, while for the long-range structure, the high aluminum, selenium, copper, and lead concentrations, seem to be related to the natural environment. At each scale, the correlation structure is analyzed, and regionalized factors are estimated by cokriging and then mapped.
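The nested correlation structure at the heart of factorial kriging can be made concrete with a two-structure variogram model. The sketch below uses the short (0.33 km) and long (2.0 km) ranges reported above, but the nugget and sill values are invented, and spherical structures are assumed for illustration.

```python
import numpy as np

def spherical(h, range_km, sill):
    # Spherical variogram structure: rises to `sill` at `range_km`,
    # then stays flat
    h = np.asarray(h, dtype=float)
    g = sill * (1.5 * h / range_km - 0.5 * (h / range_km) ** 3)
    return np.where(h < range_km, g, sill)

def nested(h, nugget=0.1, sill_short=0.5, sill_long=0.4):
    # Two scales of spatial variation, matching the 0.33 km and 2.0 km
    # structures identified in the study area (sills hypothetical)
    return nugget + spherical(h, 0.33, sill_short) + spherical(h, 2.0, sill_long)

lags = np.linspace(0.05, 3.0, 20)   # lag distances, km
gamma = nested(lags)                # modeled semivariance at each lag
```

Factorial kriging then estimates, by cokriging, a separate regionalized factor for each of these structures, which is how the short-range (mining-related) and long-range (natural background) element associations are mapped separately.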