942 results for Data Driven Modeling
Abstract:
Specific choices about how to represent complex networks can have a substantial impact on the execution time required for the construction and analysis of those structures. In this work we report a comparison of the effects of representing complex networks statically by adjacency matrices or dynamically by adjacency lists. Three theoretical models of complex networks are considered: two types of Erdos-Renyi model as well as the Barabasi-Albert model. We investigated the effect of the different representations on the construction and measurement of several topological properties (i.e. degree, clustering coefficient, shortest path length, and betweenness centrality). We found that the form of representation generally has a substantial effect on the execution time, with the sparse representation frequently resulting in remarkably superior performance.
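The trade-off between the two representations can be sketched in a few lines of Python. This is illustrative only: the graph size, edge probability, and the single Erdos-Renyi variant shown are arbitrary choices, not the paper's experimental setup.

```python
import random
from collections import defaultdict

# Build one Erdos-Renyi G(n, p) graph and store it both densely and sparsely.
n, p = 500, 0.01
random.seed(1)
edges = [(i, j) for i in range(n) for j in range(i + 1, n) if random.random() < p]

# Dense: n x n adjacency matrix -- O(n^2) memory, O(n) to scan one node's neighbors.
matrix = [[0] * n for _ in range(n)]
for i, j in edges:
    matrix[i][j] = matrix[j][i] = 1

# Sparse: adjacency list -- O(n + m) memory, O(degree) neighbor enumeration,
# which is why sparse storage tends to win on sparse networks.
adj = defaultdict(list)
for i, j in edges:
    adj[i].append(j)
    adj[j].append(i)

# The degree of a node computed from either structure must agree.
assert sum(matrix[0]) == len(adj[0])
print("edges:", len(edges), "mean degree:", 2 * len(edges) / n)
```

Measurements such as clustering or betweenness then iterate over neighbors, so their cost inherits the per-node scan cost of the chosen representation.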
Abstract:
As a result of urbanization, stormwater runoff flow rates and volumes are significantly increased due to increasing impervious land cover and the decreased availability of depression storage. Storage tanks are the basic devices for efficiently controlling the flow rate in drainage systems during wet weather. The concept of vacuum-driven detention tanks presented in this paper increases the storage capacity by using the space above the free-surface water elevation at the inlet channel. Partial vacuum storage makes it possible to gain cost savings by reducing both the horizontal area of the detention tank and the necessary foundation depth. A simulation model of a vacuum-driven storage tank has been developed to estimate the potential benefits of its application in an urban drainage system. Although SWMM5 has no direct option for vacuum tanks, existing functions (i.e. control rules) have been used to reflect its operation phases. Rainfall data used in the simulations were recorded at a rain gauge in Czestochowa during the years 2010-2012 with a time interval of 10 minutes. The simulation results give an overview of the practical operation and maintenance cost (energy demand) of vacuum-driven storage tanks depending on the ratio of vacuum-driven volume to total storage capacity. The following conclusions can be drawn from this investigation: vacuum-driven storage tanks are characterized by uncomplicated construction and control systems, and thus can be applied in newly developed as well as existing urban drainage systems; the application of vacuum in underground detention facilities makes it possible to increase the storage capacity of existing reservoirs by using the space above the maximum depth, and the possible increase in storage capacity can reach even a few dozen percent at relatively low investment cost; vacuum-driven storage tanks can be modeled in existing simulation software (i.e. SWMM) using options intended for pumping stations (including control and action rules).
Abstract:
In this article, proportional hazards and logistic models for grouped survival data are extended to incorporate time-dependent covariates. The extension was motivated by a forestry experiment designed to compare five different water stresses in Eucalyptus grandis seedlings, with seedling lifetime as the response. The data set was grouped, since there were just three occasions on which the seedlings were visited by the researcher. On each of these occasions the shoot height was also measured, and it therefore enters the models as a time-dependent covariate. Both extended models were applied to this example, and the results were very similar.
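A generic way to fit the logistic variant of such a model is the person-period expansion: each subject contributes one row per interval it was at risk in, carrying the covariate value measured at that occasion, and a discrete-time logistic hazard is fitted by maximum likelihood. The sketch below uses entirely synthetic data (numbers of seedlings, heights, and effect sizes are invented) and is a textbook discrete-time formulation, not the authors' exact extension.

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical grouped data: J = 3 inspection occasions, n seedlings.
rng = np.random.default_rng(0)
n, J = 200, 3
height = np.cumsum(rng.uniform(5, 15, size=(n, J)), axis=1)  # growing shoot heights
true_alpha, true_beta = np.array([-2.0, -1.5, -1.0]), -0.02
death = np.zeros(n, dtype=int)  # interval of death (1..J), 0 = survived all
for i in range(n):
    for j in range(J):
        p = 1 / (1 + np.exp(-(true_alpha[j] + true_beta * height[i, j])))
        if rng.random() < p:
            death[i] = j + 1
            break

# Person-period expansion: one row per subject per interval at risk.
rows, intervals, events = [], [], []
for i in range(n):
    last = death[i] if death[i] else J
    for j in range(last):
        rows.append(height[i, j])            # time-dependent covariate value
        intervals.append(j)                  # interval index (baseline hazard)
        events.append(1 if death[i] == j + 1 else 0)
X, jj, y = np.array(rows), np.array(intervals), np.array(events)

def negloglik(theta):
    # Logistic hazard: logit(lambda_ij) = alpha_j + beta * x_ij
    alpha, beta = theta[:J], theta[J]
    eta = alpha[jj] + beta * X
    return np.sum(np.logaddexp(0.0, eta) - y * eta)

fit = minimize(negloglik, np.zeros(J + 1), method="BFGS")
print("interval intercepts:", fit.x[:J], "height effect:", fit.x[J])
```

The proportional hazards (grouped Cox) variant differs only in the link function, replacing the logistic link with a complementary log-log link.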
Abstract:
We present a generic spatially explicit modeling framework to estimate carbon emissions from deforestation (INPE-EM). The framework incorporates the temporal dynamics related to the deforestation process and accounts for the biophysical and socioeconomic heterogeneity of the region under study. We build an emission model for the Brazilian Amazon combining annual maps of new clearings, four maps of biomass, and a set of alternative parameters based on the recent literature. The most important results are as follows: (a) Using different biomass maps leads to large differences in emission estimates; for the entire region of the Brazilian Amazon in the last decade, emission estimates from primary forest deforestation range from 0.21 to 0.26 Pg C yr-1. (b) Secondary vegetation growth has a small impact on the emission balance because of the short duration of secondary vegetation. On average, the balance is only 5% smaller than the primary forest deforestation emissions. (c) Deforestation rates decreased significantly in the Brazilian Amazon in recent years, from 27 Mkm2 in 2004 to 7 Mkm2 in 2010. INPE-EM process-based estimates reflect this decrease even though the agricultural frontier is moving to areas of higher biomass. The decrease is slower than a non-process instantaneous model would estimate, as INPE-EM considers residual emissions (slash, wood products, and secondary vegetation). The average balance, considering all biomass maps, decreases from 0.28 Pg C yr-1 in 2004 to 0.15 Pg C yr-1 in 2009; the non-process model estimates a decrease from 0.33 to 0.10 Pg C yr-1. We conclude that INPE-EM is a powerful tool for representing deforestation-driven carbon emissions. Biomass estimates are still the largest source of uncertainty in the effective use of this type of model for informing mechanisms such as REDD+.
The results also indicate that efforts to reduce emissions should focus not only on controlling primary forest deforestation but also on creating incentives for the restoration of secondary forests.
Abstract:
The multi-scale synoptic circulation system in the southeastern Brazil (SEBRA) region is presented using a feature-oriented approach. Prevalent synoptic circulation structures, or "features," are identified from previous observational studies. These features include the southward-flowing Brazil Current (BC), the eddies off Cabo Sao Tome (CST - 22 degrees S) and off Cabo Frio (CF - 23 degrees S), and the upwelling region off CF and CST. Their synoptic water-mass structures are characterized and parameterized to develop temperature-salinity (T-S) feature models. Following the methodology of [Gangopadhyay, A., Robinson, A.R., Haley, P.J., Leslie, W.J., Lozano, C.J., Bisagni, J., Yu, Z., 2003. Feature-oriented regional modeling and simulation (FORMS) in the Gulf of Maine and Georges Bank. Cont. Shelf Res. 23 (3-4), 317-353], a synoptic initialization scheme for feature-oriented regional modeling and simulation (FORMS) of the circulation in this region is then developed. First, the temperature and salinity feature-model profiles are placed on a regional circulation template and objectively analyzed with available background climatology in the deep region. These initialization fields are then used for dynamical simulations via the Princeton Ocean Model (POM). A few first applications of this methodology are presented in this paper, including the BC meandering, the BC-eddy interaction, and the meander-eddy-upwelling system (MEUS) simulations. Preliminary validation results include realistic wave growth, eddy formation, and sustained upwelling. Our future plans include the application of these feature models with satellite and in-situ data and advanced data-assimilation schemes for nowcasting and forecasting the SEBRA region.
Abstract:
The continental margin of southeast Brazil is elevated. Onshore Tertiary basins and Late Cretaceous/Paleogene intrusions are good evidence for post-breakup tectono-magmatic activity. To constrain the impact of post-rift reactivation on the geological history of the area, we carried out a new thermochronological study. Apatite fission track ages range from 60.7 ± 1.9 Ma to 129.3 ± 4.3 Ma, mean track lengths from 11.41 ± 0.23 µm to 14.31 ± 0.24 µm, and a subset of the (U-Th)/He ages range from 45.1 ± 1.5 to 122.4 ± 2.5 Ma. Results of inverse thermal history modeling generally support the conclusions from an earlier study for a Late Cretaceous phase of cooling. Around the onshore Taubate Basin, for a limited number of samples, the first detectable period of cooling occurred during the Early Tertiary. The inferred thermal histories for many samples also imply subsequent reheating followed by Neogene cooling. Given the uncertainty of the inversion results, we performed deterministic forward modeling to assess the range of possibilities for this Tertiary part of the thermal history. The evidence for reheating seems to be robust around the Taubate Basin, but elsewhere the data cannot discriminate between this and a less complex thermal history. However, forward modeling results and geological information support the conclusion that the whole area underwent cooling during the Neogene. The synchronicity of the cooling phases with Andean tectonics and those in NE Brazil leads us to infer a plate-wide compressional stress that reactivated inherited structures. The present-day topographic relief of the margin reflects a contribution from post-breakup reactivation and uplift.
Abstract:
A new method for analysis of scattering data from lamellar bilayer systems is presented. The method employs a form-free description of the cross-section structure of the bilayer and the fit is performed directly to the scattering data, introducing also a structure factor when required. The cross-section structure (electron density profile in the case of X-ray scattering) is described by a set of Gaussian functions and the technique is termed Gaussian deconvolution. The coefficients of the Gaussians are optimized using a constrained least-squares routine that induces smoothness of the electron density profile. The optimization is coupled with the point-of-inflection method for determining the optimal weight of the smoothness. With the new approach, it is possible to optimize simultaneously the form factor, structure factor and several other parameters in the model. The applicability of this method is demonstrated by using it in a study of a multilamellar system composed of lecithin bilayers, where the form factor and structure factor are obtained simultaneously, and the obtained results provided new insight into this very well known system.
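The core of the procedure described above can be sketched as follows: represent the cross-section profile as a sum of Gaussians (whose Fourier transform, and hence the form factor, is analytic) and fit the amplitudes to the scattering curve under a smoothness penalty. Everything below is synthetic and simplified (fixed Gaussian centers and a common width, a constant background, a hand-picked smoothness weight instead of the point-of-inflection selection, and no structure factor), so it illustrates the idea rather than the authors' full method.

```python
import numpy as np
from scipy.optimize import least_squares

# Gaussian basis for the symmetric electron-density profile across the bilayer.
z = np.linspace(-30, 30, 13)      # Gaussian centers (Angstrom)
w = 5.0                           # common Gaussian width (Angstrom)
q = np.linspace(0.02, 0.6, 200)   # scattering vector (1/Angstrom)

def form_factor(amps):
    # Analytic Fourier transform of a sum of Gaussians (symmetric profile):
    # F(q) = sum_i a_i * w * sqrt(2 pi) * exp(-q^2 w^2 / 2) * cos(q z_i)
    return (amps[None, :] * w * np.sqrt(2 * np.pi)
            * np.exp(-0.5 * (q[:, None] * w) ** 2)
            * np.cos(q[:, None] * z[None, :])).sum(axis=1)

# Synthetic "observed" intensity from a headgroup-peak / chain-trough profile.
true_amps = np.array([0, 0, .2, .8, .3, -.5, -.8, -.5, .3, .8, .2, 0, 0])
I_obs = form_factor(true_amps) ** 2 + 1e-4   # intensity + constant background

lam = 0.1  # smoothness weight (point-of-inflection selection not shown)
def residuals(amps):
    data_res = form_factor(amps) ** 2 + 1e-4 - I_obs
    smooth_res = lam * np.diff(amps, 2)  # penalize curvature of the profile
    return np.concatenate([data_res, smooth_res])

init = np.full(13, 0.1)
fit = least_squares(residuals, init)
```

Because the intensity depends on the square of the form factor, the recovered profile carries the usual overall sign ambiguity; the smoothness term regularizes the otherwise ill-posed amplitude estimation.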
Abstract:
With the increasing production of information from e-government initiatives, there is also the need to transform a large volume of unstructured data into useful information for society. All this information should be easily accessible and made available in a meaningful and effective way in order to achieve semantic interoperability in electronic government services, which is a challenge to be pursued by governments around the world. Our aim is to discuss the context of e-Government Big Data and to present a framework to promote semantic interoperability through the automatic generation of ontologies from unstructured information found on the Internet. We propose the use of fuzzy mechanisms to deal with natural language terms and present related work found in this area. The results achieved in this study consist of the architectural definition and the major components and requirements that compose the proposed framework. With this, it is possible to take advantage of the large volume of information generated by e-Government initiatives and use it to benefit society.
Abstract:
A trans-oceanic section at 24.5°N in the North Atlantic has been sampled at a decadal frequency. This work demonstrates that the wind-driven component of the Meridional Overturning Circulation (MOC) may be monitored using autonomous profiling floats deployed in the eastern North Atlantic Subtropical Gyre. More than 500 CTD vertical profiles from the surface to 2000 m depth, spanning one year (from April 2002 to March 2003), are used to compute the geostrophic transport stream function at 24.5°N. The baroclinic transport obtained from the autonomous profiling floats is not statistically different from that of three hydrographic cruises carried out in 1957, 1981 and 1992. Good agreement is found between the geostrophic transport stream function and the transport derived from the wind field through the Sverdrup relation.
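The Sverdrup relation mentioned above links the depth-integrated meridional transport to the wind-stress curl, beta * V = curl(tau) / rho. As a back-of-the-envelope sketch (all numbers below are idealized textbook values, not the study's wind field), the interior transport at 24.5°N can be estimated as:

```python
import numpy as np

# Sverdrup balance: beta * V = curl(tau) / rho, so the depth-integrated
# meridional transport per unit longitude is V = curl(tau) / (rho * beta).
rho = 1025.0              # seawater density, kg/m^3
omega = 7.2921e-5         # Earth's rotation rate, rad/s
R = 6.371e6               # Earth's radius, m
lat = np.deg2rad(24.5)
beta = 2 * omega * np.cos(lat) / R     # df/dy at 24.5 N, 1/(m s)

curl_tau = -1e-7          # typical subtropical wind-stress curl, N/m^3 (assumed)
V = curl_tau / (rho * beta)            # m^2/s per unit longitude
width = 6.0e6             # nominal basin width ~ 6000 km (assumed)
Sv = V * width / 1e6      # total interior transport in Sverdrups (1 Sv = 1e6 m^3/s)
print(f"Sverdrup transport ~ {Sv:.1f} Sv")
```

The negative (southward) interior transport of a few tens of Sverdrups is what the gyre's western boundary current must return northward.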
Abstract:
This work makes a theoretical and experimental contribution to the study of ester and alkane solutions. Experimental data for isobaric vapor-liquid equilibria (VLE) are presented at 101.3 kPa for binary systems of methyl ethanoate with six alkanes (from C5 to C10), together with excess volumes and mixing enthalpies, vE and hE.
Abstract:
Ontology design and population (core aspects of semantic technologies) have recently become fields of great interest due to the increasing need for domain-specific knowledge bases that can boost the use of the Semantic Web. For building such knowledge resources, the state-of-the-art tools for ontology design require a lot of human work. Producing meaningful schemas and populating them with domain-specific data is in fact a very difficult and time-consuming task, even more so if the task consists in modelling knowledge at web scale. The primary aim of this work is to investigate a novel and flexible methodology for automatically learning an ontology from textual data, lightening the human workload required for conceptualizing domain-specific knowledge and populating an extracted schema with real data, thus speeding up the whole ontology production process. Here computational linguistics plays a fundamental role, from automatically identifying facts in natural language and extracting frames of relations among recognized entities, to producing linked data with which to extend existing knowledge bases or create new ones. In the state of the art, automatic ontology learning systems are mainly based on plain pipelined linguistic classifiers performing tasks such as Named Entity recognition, Entity resolution, Taxonomy and Relation extraction [11]. These approaches present some weaknesses, especially in capturing the structures through which the meaning of complex concepts is expressed [24]. Humans, in fact, tend to organize knowledge in well-defined patterns, which include participant entities and meaningful relations linking entities with each other. In the literature, these structures have been called Semantic Frames by Fillmore [20], or more recently Knowledge Patterns [23]. Some NLP studies have recently shown the possibility of performing more accurate deep parsing with the ability to logically understand the structure of discourse [7].
In this work, some of these technologies have been investigated and employed to produce accurate ontology schemas. The long-term goal is to collect large amounts of semantically structured information from the web of crowds, through an automated process, in order to identify and investigate the cognitive patterns used by humans to organize their knowledge.
Abstract:
Hydrothermal fluids are a fundamental resource for understanding and monitoring volcanic and non-volcanic systems. This thesis focuses on the study of hydrothermal systems through numerical modeling with the geothermal simulator TOUGH2. Several simulations are presented, and the geophysical and geochemical observables arising from fluid circulation are analyzed in detail throughout the thesis. In a volcanic setting, the fluids feeding fumaroles and hot springs may play a key role in hazard evaluation. The evolution of fluid circulation is driven by a strong interaction between magmatic and hydrothermal systems. A simultaneous analysis of different geophysical and geochemical observables is a sound approach for interpreting monitored data and inferring a consistent conceptual model. The analyzed observables are ground displacement, gravity changes, electrical conductivity, the amount, composition and temperature of the gases emitted at the surface, and the extent of the degassing area. Results highlight the different temporal responses of the considered observables, as well as their different radial patterns of variation. However, the magnitude, temporal response and radial pattern of these signals depend not only on the evolution of fluid circulation; a main role is played by the assumed rock properties. Numerical simulations highlight the differences that arise from assuming different permeabilities, for both homogeneous and heterogeneous systems. Rock properties affect hydrothermal fluid circulation, controlling both the range of variation and the temporal evolution of the observable signals. Low-temperature fumaroles with low discharge rates may be affected by atmospheric conditions. Detailed parametric simulations were performed, aimed at understanding the effects of system properties, such as permeability and gas reservoir overpressure, on diffuse degassing when air temperature and barometric pressure changes are applied at the ground surface.
Hydrothermal circulation, however, is not exclusive to volcanic systems. Hot fluids are involved in several applied problems, such as geothermal engineering, nuclear waste propagation in porous media, and Geological Carbon Sequestration (GCS). The current concept for large-scale GCS is the direct injection of supercritical carbon dioxide into deep geological formations, which typically contain brine. Upward displacement of such brine from deep reservoirs, driven by pressure increases resulting from carbon dioxide injection, may occur through abandoned wells, permeable faults or permeable channels. Brine intrusion into aquifers may degrade groundwater resources. Numerical results show that the pressure rise drives dense water up into the conduits, but does not necessarily result in continuous flow. Rather, overpressure leads to a new hydrostatic equilibrium if the fluids are initially density stratified. If the warm and salty fluid does not cool while passing through the conduit, an oscillatory solution is possible. Parameter studies delineate the steady-state (static) and oscillatory solutions.
Abstract:
We use data from about 700 GPS stations in the Euro-Mediterranean region to investigate the present-day behavior of the Calabrian subduction zone within the Mediterranean-scale plate kinematics and to perform local-scale studies of the strain accumulation on active structures. We focus attention on the Messina Straits and Crati Valley faults, where GPS data show extensional velocity gradients of ∼3 mm/yr and ∼2 mm/yr, respectively. We use dislocation models and a non-linear constrained optimization algorithm to invert for fault geometric parameters and slip rates, and we evaluate the associated uncertainties adopting a bootstrap approach. Our analysis suggests the presence of two partially locked normal faults. To investigate the impact of elastic strain contributions from other nearby active faults on the observed velocity gradient we use a block modeling approach. Our models show that the inferred slip rates on the two analyzed structures are strongly impacted by the assumed locking width of the Calabrian subduction thrust. In order to frame the observed local deformation features within the present-day central Mediterranean kinematics, we perform a statistical analysis testing the independent motion (w.r.t. the African and Eurasian plates) of the Adriatic, Calabrian and Sicilian blocks. Our preferred model confirms microplate-like behaviour for all the investigated blocks. Within these kinematic boundary conditions we further investigate the Calabrian slab interface geometry using a combined approach of block modeling and the reduced chi-squared (χ²ν) statistic. Almost no information is obtained using only the horizontal GPS velocities, which prove to be an insufficient dataset for a multi-parametric inversion approach. To constrain the slab geometry more strongly, we estimate the predicted vertical velocities by performing suites of forward models of elastic dislocations varying the fault locking depth.
Comparison with the observed field suggests a maximum resolved locking depth of 25 km.
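The inversion-plus-bootstrap workflow described above can be illustrated with the classical screw-dislocation (arctangent) model of interseismic velocity across a fault. The data below are synthetic and the parameter values invented; this is a generic sketch of the approach, not the thesis' actual GPS velocities or fault geometry.

```python
import numpy as np
from scipy.optimize import curve_fit

# Screw-dislocation model: fault-parallel velocity across a locked fault.
def model(x, slip_rate, locking_depth):
    return (slip_rate / np.pi) * np.arctan(x / locking_depth)

# Synthetic "observed" velocities at 40 stations (mm/yr), with noise.
rng = np.random.default_rng(42)
x = np.linspace(-100, 100, 40)                 # station distance from fault (km)
v_obs = model(x, 3.0, 10.0) + rng.normal(0, 0.3, x.size)

# Non-linear least-squares inversion for slip rate and locking depth.
p_best, _ = curve_fit(model, x, v_obs, p0=[1.0, 5.0])
resid = v_obs - model(x, *p_best)

# Residual bootstrap: resample residuals with replacement and refit,
# using the scatter of the refitted parameters as uncertainty estimates.
boot = []
for _ in range(500):
    v_b = model(x, *p_best) + rng.choice(resid, size=resid.size, replace=True)
    p_b, _ = curve_fit(model, x, v_b, p0=p_best)
    boot.append(p_b)
boot = np.array(boot)
print("slip rate: %.2f +/- %.2f mm/yr" % (p_best[0], boot[:, 0].std()))
print("locking depth: %.1f +/- %.1f km" % (p_best[1], boot[:, 1].std()))
```

The bootstrap avoids relying on the linearized covariance from the optimizer, which can understate uncertainties for strongly non-linear parameters such as locking depth.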
Abstract:
Several countries have acquired, over the past decades, large amounts of area-covering Airborne Electromagnetic (AEM) data. The contribution of airborne geophysics to both groundwater resource mapping and management has dramatically increased, proving how appropriate these systems are for large-scale and efficient groundwater surveying. We start with the processing and inversion of two AEM datasets from two different systems collected over the Spiritwood Valley Aquifer area, Manitoba, Canada: the AeroTEM III dataset (commissioned by the Geological Survey of Canada in 2010) and the "Full waveform VTEM" dataset, collected and tested over the same survey area during the fall of 2011. We demonstrate that in the presence of multiple datasets, both AEM and ground data, careful processing, inversion, post-processing, data integration and data calibration constitute the proper approach, capable of providing reliable and consistent resistivity models. Our approach can be of interest to many end users, ranging from geological surveys and universities to private companies, which often hold large geophysical databases to be interpreted for geological and/or hydrogeological purposes. In this study we investigate in depth the role of the integration of several complementary types of geophysical data collected over the same survey area. We show that data integration can improve inversions, reduce ambiguity and deliver high-resolution results. We further attempt to use the final, most reliable output resistivity models as a solid basis for building a knowledge-driven 3D geological voxel-based model. A voxel approach allows a quantitative understanding of the hydrogeological setting of the area, and it can be further used to estimate aquifer volumes (i.e. the potential amount of groundwater resources) as well as for hydrogeological flow model prediction.
In addition, we investigated the impact of an AEM dataset on hydrogeological mapping and 3D hydrogeological modeling, compared to having only a ground-based TEM dataset and/or only borehole data.
Abstract:
A feature represents a functional requirement fulfilled by a system. Since many maintenance tasks are expressed in terms of features, it is important to establish the correspondence between a feature and its implementation in source code. Traditional approaches to establishing this correspondence exercise features to generate a trace of runtime events, which is then processed by post-mortem analysis. These approaches typically generate large amounts of data to analyze. Due to their static nature, these approaches do not support incremental and interactive analysis of features. We propose a radically different approach called live feature analysis, which provides a runtime model of features. Our approach analyzes features on a running system, makes it possible to grow feature representations by exercising different scenarios of the same feature, and identifies execution elements down to the sub-method level. We describe how live feature analysis is implemented effectively by annotating structural representations of code based on abstract syntax trees. We illustrate our live analysis with a case study where we achieve a more complete feature representation by exercising and merging variants of feature behavior, and demonstrate the efficiency of our technique with benchmarks.
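The core idea of growing a feature representation by tracing a running system and merging scenarios can be sketched in a few lines of Python using the interpreter's tracing hook. All names here are hypothetical, and the sketch records only function-level events, whereas the approach described above annotates abstract-syntax-tree nodes down to the sub-method level.

```python
import sys
from collections import defaultdict

# Map each feature name to the set of code elements it exercises.
feature_map = defaultdict(set)

def trace_feature(feature, scenario):
    # Run one scenario of a feature under a tracer that records every
    # Python-level function call; repeated runs merge into the same set.
    def tracer(frame, event, arg):
        if event == "call":
            feature_map[feature].add(frame.f_code.co_name)
        return tracer
    sys.settrace(tracer)
    try:
        scenario()
    finally:
        sys.settrace(None)

# Toy system under analysis.
def parse(s):
    return s.split(",")

def render(items):
    return " | ".join(items)

trace_feature("export", lambda: render(parse("a,b")))  # scenario 1
trace_feature("export", lambda: render(["x"]))         # scenario 2, merged in
print(sorted(feature_map["export"]))
```

Exercising further scenarios only ever grows the feature's set, which is what makes the analysis incremental rather than a one-shot post-mortem trace.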