6 results for "Mapping time"
in the Biblioteca Digital da Produção Intelectual da Universidade de São Paulo
Abstract:
This paper presents the results of electrical resistivity methods applied to the delineation of an area potentially contaminated by liquefaction products, also called putrefactive liquids, in the Vila Rezende municipal cemetery, Piracicaba, São Paulo, Brazil. The results indicate a water table depth between 3.1 and 5.1 m, with two groundwater flow directions, one to the SW and another to the SE. The suspicion of contamination in the area was confirmed by the probable contamination plumes, which follow the same groundwater flow directions, and by the conductive anomalies observed in the geoelectric sections. The probable plume to the SE extends beyond the limits of the cemetery. The locations of the conductive anomalies and of the probable contamination plumes showed that the contamination is linked to the depth of the water table and to the burial time. Mapping with the geostatistical method of ordinary kriging captured the structural characteristics of the regional phenomenon and the spatial behavior of the electrical resistivity data, resulting in continuous surfaces. Thus, the method proved to be an important tool for mapping contamination plumes in cemeteries.
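The ordinary kriging interpolation mentioned above can be sketched as follows. The spherical variogram model is a common choice for this kind of data, and the sample coordinates, resistivity values, and variogram parameters below are hypothetical stand-ins, not the survey's actual data:

```python
import numpy as np

def spherical_variogram(h, sill=1.0, rng=50.0, nugget=0.0):
    """Spherical semivariogram model (a common choice; parameters are illustrative)."""
    h = np.asarray(h, dtype=float)
    g = nugget + sill * (1.5 * h / rng - 0.5 * (h / rng) ** 3)
    return np.where(h >= rng, nugget + sill, g)

def ordinary_kriging(coords, values, target, variogram):
    """Predict the value at `target` from scattered samples by ordinary kriging."""
    n = len(values)
    # Pairwise sample distances and distances from each sample to the target.
    d = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)
    d0 = np.linalg.norm(coords - target, axis=-1)
    # Kriging system with a Lagrange multiplier enforcing unbiasedness
    # (the weights are constrained to sum to 1).
    A = np.ones((n + 1, n + 1))
    A[:n, :n] = variogram(d)
    A[n, n] = 0.0
    b = np.ones(n + 1)
    b[:n] = variogram(d0)
    w = np.linalg.solve(A, b)
    return float(w[:n] @ values)

# Hypothetical resistivity samples (x, y in metres; values in ohm·m).
coords = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0], [10.0, 10.0]])
values = np.array([120.0, 80.0, 100.0, 60.0])
est = ordinary_kriging(coords, values, np.array([5.0, 5.0]), spherical_variogram)
```

Evaluating the predictor on a regular grid of target points yields the kind of continuous surface the abstract describes.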
Abstract:
The dynamics of a driven stadium-like billiard is considered using the formalism of discrete mappings. The model presents a resonant velocity that depends on the rotation number around fixed points and on the external boundary perturbation, which plays an important separating role in the model. We show that particles exhibiting Fermi acceleration (initial velocity above the resonant one) are scaling invariant with respect to the initial velocity and the external perturbation. However, initial velocities below the resonant one lead the particles to decelerate; therefore, unlimited energy growth is not observed. This phenomenon may be interpreted as a specific Maxwell's demon that separates fast and slow billiard particles. (C) 2012 Elsevier B.V. All rights reserved.
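A discrete velocity-phase mapping of the general kind used here can be sketched as below. This is a generic Fermi-Ulam-style boundary-kick map, not the paper's exact stadium mapping, and the perturbation amplitude and initial conditions are illustrative only:

```python
import numpy as np

def simplified_map(v0, phi0, eps=1e-3, n_steps=10000):
    """Iterate a generic boundary-kick map: at each collision the particle's
    speed is kicked by the moving boundary, and the collision phase advances
    inversely with speed (slow particles see larger phase jumps).
    A Fermi-Ulam-style sketch, not the paper's stadium billiard map."""
    v, phi = v0, phi0
    vs = np.empty(n_steps)
    for n in range(n_steps):
        # Velocity kick from the boundary; the floor avoids division by zero.
        v = max(abs(v + eps * np.sin(phi)), 1e-12)
        phi = (phi + 2.0 / v) % (2.0 * np.pi)
        vs[n] = v
    return vs

fast = simplified_map(v0=1.0, phi0=0.5)    # initial velocity well above the kick scale
slow = simplified_map(v0=5e-3, phi0=0.5)   # initial velocity comparable to the kick scale
```

Comparing the two velocity series for different `eps` values is the kind of numerical experiment behind the scaling-invariance claim, with the resonant velocity separating the accelerating and decelerating regimes.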
Abstract:
The time is ripe for a comprehensive mission to explore and document Earth's species. This calls for a campaign to educate and inspire the next generation of professional and citizen species explorers, investments in cyber-infrastructure and collections to meet the unique needs of the producers and consumers of taxonomic information, and the formation and coordination of a multi-institutional, international, transdisciplinary community of researchers, scholars and engineers with the shared objective of creating a comprehensive inventory of species and a detailed map of the biosphere. We conclude that an ambitious goal to describe 10 million species in less than 50 years is attainable based on the strength of 250 years of progress, worldwide collections, existing experts, technological innovation and collaborative teamwork. Existing digitization projects are overcoming obstacles of the past, facilitating collaboration and mobilizing literature, data, images and specimens through cyber technologies. Charting the biosphere is enormously complex, yet the necessary expertise can be found through partnerships with engineers, information scientists, sociologists, ecologists, climate scientists, conservation biologists, industrial project managers and taxon specialists, from agrostologists to zoophytologists. The benefits to society of the proposed mission would be profound, immediate and enduring, from the detection of early responses of flora and fauna to climate change to opening access to evolutionary designs for solutions to countless practical problems. The impacts on the biodiversity, environmental and evolutionary sciences would be transformative, from ecosystem models calibrated in detail to a comprehensive understanding of the origin and evolution of life over its 3.8-billion-year history. The resultant cyber-enabled taxonomy, or cybertaxonomy, would open access to biodiversity data for developing nations, assure access to reliable data about species, and change how scientists and citizens alike access, use and think about biological diversity information.
Abstract:
The extraction of information about neural activity timing from BOLD signal is a challenging task as the shape of the BOLD curve does not directly reflect the temporal characteristics of electrical activity of neurons. In this work, we introduce the concept of neural processing time (NPT) as a parameter of the biophysical model of the hemodynamic response function (HRF). Through this new concept we aim to infer more accurately the duration of neuronal response from the highly nonlinear BOLD effect. The face validity and applicability of the concept of NPT are evaluated through simulations and analysis of experimental time series. The results of both simulation and application were compared with summary measures of HRF shape. The experiment that was analyzed consisted of a decision-making paradigm with simultaneous emotional distracters. We hypothesize that the NPT in primary sensory areas, like the fusiform gyrus, is approximately the stimulus presentation duration. On the other hand, in areas related to processing of an emotional distracter, the NPT should depend on the experimental condition. As predicted, the NPT in fusiform gyrus is close to the stimulus duration and the NPT in dorsal anterior cingulate gyrus depends on the presence of an emotional distracter. Interestingly, the NPT in right but not left dorsal lateral prefrontal cortex depends on the stimulus emotional content. The summary measures of HRF obtained by a standard approach did not detect the variations observed in the NPT. Hum Brain Mapp, 2012. (C) 2010 Wiley Periodicals, Inc.
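The idea of treating the duration of neural activity as a model parameter can be illustrated with a standard convolution sketch. The double-gamma HRF and the boxcar stand-in for neural activity below are conventional assumptions, not the nonlinear biophysical model the paper actually fits:

```python
import numpy as np
from math import gamma

def canonical_hrf(t, a1=6.0, a2=16.0, b=1.0, c=1.0 / 6.0):
    """SPM-style double-gamma hemodynamic response function (standard parameters)."""
    g1 = t ** (a1 - 1) * np.exp(-t / b) / (b ** a1 * gamma(a1))
    g2 = t ** (a2 - 1) * np.exp(-t / b) / (b ** a2 * gamma(a2))
    return g1 - c * g2

def bold_prediction(duration, dt=0.1, t_max=40.0):
    """Predicted BOLD time course: a neural boxcar of the given duration
    (a stand-in for a neural-processing-time parameter) convolved with the HRF."""
    t = np.arange(0.0, t_max, dt)
    neural = (t < duration).astype(float)   # boxcar of ongoing neural activity
    hrf = canonical_hrf(t)
    return np.convolve(neural, hrf)[: len(t)] * dt

short = bold_prediction(duration=1.0)   # e.g. response lasting only the stimulus
longer = bold_prediction(duration=4.0)  # e.g. prolonged processing of a distracter
```

Even in this linear sketch, a longer neural duration shifts and reshapes the BOLD curve, which is the kind of temporal information the NPT parameter is designed to recover.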
Abstract:
Background: Delignification pretreatments of biomass and methods to assess their efficacy are crucial for biomass-to-biofuels research and technology. Here, we applied confocal and fluorescence lifetime imaging microscopy (FLIM) using one- and two-photon excitation to map the lignin distribution within bagasse fibers pretreated with acid and alkali. The evaluated spectra and decay times are correlated with previously calculated lignin fractions. We have also investigated the influence of the pretreatment on the lignin distribution in the cell wall by analyzing the changes in the fluorescence characteristics under two-photon excitation. Eucalyptus fibers were also analyzed for comparison. Results: Fluorescence spectra and variations of the decay time correlate well with the delignification yield and the lignin distribution. The decays are modeled as two-exponential, with a fast (τ1) and a slow (τ2) decay time. The fast decay is associated with concentrated lignin in the bagasse and has a low sensitivity to the treatment. The fluorescence decay time became longer with increasing alkali concentration in the treatment, which corresponds to lignin emission in a less concentrated environment. In addition, the two-photon fluorescence spectrum is very sensitive to lignin content and accumulation in the cell wall, broadening with the acid pretreatment and narrowing with the alkali one. Heterogeneity of the pretreated cell wall was observed. Conclusions: Our results reveal lignin domains with different concentration levels. The acid pretreatment caused a disorder in the arrangement of lignin and its accumulation at the external border of the cell wall. The alkali pretreatment efficiently removed lignin from the middle of the bagasse fibers, but was less effective in its removal from their surfaces. Our results evidence a strong correlation between the decay times of the lignin fluorescence and its distribution within the cell wall. A new variety of lignin fluorescence states was accessed by two-photon excitation, which allowed a broader, but complementary, optical characterization of lignocellulosic materials. These results suggest that the lignin arrangement in untreated bagasse fiber is based on a well-organized nanoenvironment that favors a very low level of interaction between the molecules.
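The two-exponential decay analysis can be sketched as follows. The synthetic decay, the nanosecond time scale, and the grid-search fit below are illustrative assumptions (FLIM software would normally do the fitting, typically with deconvolution of the instrument response):

```python
import numpy as np

def biexp(t, a1, tau1, a2, tau2):
    """Two-exponential decay model: I(t) = a1*exp(-t/tau1) + a2*exp(-t/tau2)."""
    return a1 * np.exp(-t / tau1) + a2 * np.exp(-t / tau2)

# Synthetic decay: a fast component (concentrated lignin, low treatment
# sensitivity) plus a slow component. Time axis in ns, hypothetical scale.
t = np.linspace(0.0, 10.0, 500)
data = biexp(t, a1=0.7, tau1=0.3, a2=0.3, tau2=2.5)

# Variable-projection grid search: scan (tau1, tau2) pairs and solve the
# amplitudes by linear least squares at each pair.
best = (np.inf, None)
for t1 in np.linspace(0.1, 1.0, 40):
    for t2 in np.linspace(1.0, 4.0, 40):
        basis = np.stack([np.exp(-t / t1), np.exp(-t / t2)], axis=1)
        amps, *_ = np.linalg.lstsq(basis, data, rcond=None)
        err = np.sum((basis @ amps - data) ** 2)
        if err < best[0]:
            best = (err, (amps[0], t1, amps[1], t2))
err, (a1, tau1, a2, tau2) = best
```

A shift of the recovered slow component toward longer lifetimes after alkali treatment would reproduce, in this toy setting, the trend the abstract reports.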
Abstract:
Network virtualization is a promising technique for building the Internet of the future, since it enables the low-cost introduction of new features into network elements. An open issue in such virtualization is how to achieve an efficient mapping of virtual network elements onto those of the existing physical network, also called the substrate network. Mapping is an NP-hard problem, and existing solutions ignore various real network characteristics in order to solve the problem in a reasonable time frame. This paper introduces new algorithms for this problem based on 0-1 integer linear programming, built on a set of network parameters not taken into account by previous proposals. Approximation algorithms proposed here allow the mapping of virtual networks onto large network substrates. Simulation experiments give evidence of the efficiency of the proposed algorithms.
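The node-mapping core of the problem can be illustrated on a toy instance. The CPU capacities, link costs, and virtual links below are hypothetical, and the exhaustive enumeration of the 0-1 assignment variables merely stands in for the ILP solver a real instance would require:

```python
from itertools import permutations

# Toy instance (hypothetical): map 3 virtual nodes onto 4 substrate nodes.
v_cpu = [2, 1, 3]                 # CPU demanded by each virtual node
s_cpu = [3, 2, 4, 1]              # CPU available at each substrate node
# Substrate cost between node pairs (symmetric, 0 on the diagonal).
cost = [[0, 1, 3, 4],
        [1, 0, 1, 2],
        [3, 1, 0, 2],
        [4, 2, 2, 0]]
v_links = [(0, 1), (1, 2)]        # virtual links that must be supported

def mapping_cost(assign):
    """Total substrate cost of the virtual links under an assignment."""
    return sum(cost[assign[a]][assign[b]] for a, b in v_links)

# In the 0-1 ILP view, x[i][j] = 1 means "virtual node i on substrate node j",
# with capacity constraints and a cost objective. Here we enumerate all
# feasible one-to-one assignments instead of calling a solver.
best = min(
    (p for p in permutations(range(len(s_cpu)), len(v_cpu))
     if all(v_cpu[i] <= s_cpu[p[i]] for i in range(len(v_cpu)))),
    key=mapping_cost,
)
```

The factorial blow-up of this enumeration is exactly why the paper resorts to approximation algorithms for large substrates.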