10 results for Modeling Geomorphological Processes

in Biblioteca Digital da Produção Intelectual da Universidade de São Paulo


Relevance: 40.00%

Abstract:

The aim of this work is to use GIS data integration to characterize sedimentary processes in a subtropical lagoon environment. The study area was the Cananéia Inlet estuary in the southeastern section of the Cananéia Lagoon Estuarine System (CLES), state of São Paulo, Brazil (25°03'S/47°53'W). The area is formed by the confluence of two estuarine channels, forming a bay-shaped water body locally called "Trapandé Bay". The region is surrounded by one of the most preserved tracts of Atlantic Rain Forest in southwestern Brazil and presents well-developed mangroves and marshes. In this study a methodology was developed using an integrated GIS database based on bottom sediment parameters, geomorphological data, remote sensing images, hydrodynamic modeling data and geophysical parameters. The sediment grain-size parameters and the bottom morphology of the lagoon were also used to develop models of net sediment transport pathways. The sediment transport vectors based on the grain-size model correlated well with the transport model based on bottom topography features and the hydrodynamic model, especially in areas with stronger energetic conditions and a minor contribution of finer sediments. This relation is somewhat less evident near shallower banks and depositional features. In these regions the organic matter content of the sediments was a good complementary tool for inferring hydrodynamic and depositional conditions (i.e. primary productivity, sedimentation rates, sources, oxidation-reduction rates).
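The abstract does not name the grain-size trend technique used to derive the net transport vectors; the sketch below illustrates one common family of methods (a Gao-and-Collins-style grain-size trend analysis) on assumed station data, purely to show how transport vectors can be computed from mean size, sorting and skewness. Station coordinates, grain-size statistics and the characteristic distance are all hypothetical.

```python
# Minimal sketch of grain-size trend analysis (GSTA) for net sediment
# transport vectors, in the spirit of Gao & Collins (1992). All inputs are
# hypothetical; the source abstract does not specify the method used.
import numpy as np

# columns: x, y, mean grain size (phi), sorting (phi), skewness
stations = np.array([
    [0.0, 0.0, 2.1, 0.60,  0.10],
    [1.0, 0.2, 2.4, 0.55, -0.05],
    [2.1, 0.1, 2.8, 0.50, -0.15],
    [1.0, 1.1, 2.3, 0.58,  0.02],
])
Dcr = 1.5  # characteristic (sampling) distance, same units as x, y

def transport_vectors(st, dcr):
    """Sum unit vectors i -> j where sediment becomes finer, better
    sorted and more negatively skewed (the 'FB-' case)."""
    xy, mean, sort, skew = st[:, :2], st[:, 2], st[:, 3], st[:, 4]
    vec = np.zeros_like(xy)
    for i in range(len(st)):
        for j in range(len(st)):
            if i == j:
                continue
            d = xy[j] - xy[i]
            dist = np.hypot(*d)
            if dist > dcr:
                continue
            finer = mean[j] > mean[i]          # larger phi = finer
            better_sorted = sort[j] < sort[i]
            more_neg_skew = skew[j] < skew[i]
            if finer and better_sorted and more_neg_skew:
                vec[i] += d / dist             # unit vector toward j
    return vec

print(transport_vectors(stations, Dcr))
```

In a full workflow such vectors would then be smoothed over the sampling grid and compared against residual currents from the hydrodynamic model, which is the kind of cross-check the abstract describes.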

Relevance: 40.00%

Abstract:

Over the last few years, Business Process Management (BPM) has achieved increasing popularity and dissemination. An analysis of the underlying assumptions of BPM shows that it pursues two apparently contradictory goals: on the one hand, it aims at formalizing work practices into business process models; on the other hand, it intends to confer flexibility on the organization, i.e. to maintain its ability to respond to new and unforeseen situations. This paper analyses the relationship between formalization and flexibility in business process modelling by means of an empirical case study of a BPM project in an aircraft maintenance company. A qualitative approach is adopted based on Actor-Network Theory. The paper offers two major contributions: (a) it illustrates the sociotechnical complexity involved in BPM initiatives; (b) it points towards a multidimensional understanding of the relation between formalization and flexibility in BPM projects.

Relevance: 40.00%

Abstract:

Molecular modeling is growing as a research tool in Chemical Engineering studies, as can be seen from a simple search of recent publications in the field. Molecular investigations retrieve information on properties that are often accessible only through expensive and time-consuming experimental techniques, such as those involved in the study of radical-based chain reactions. In this work, different quantum chemical techniques were used to study phenol oxidation by hydroxyl radicals in Advanced Oxidation Processes used for wastewater treatment. The results obtained by applying a DFT-based model showed good agreement with the available experimental values and provided qualitative insights into the mechanism of the overall reaction chain. Solvation models were also tested, but were found to be of limited use for this reaction system at the considered theoretical level without further parameterization.
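As a rough illustration of how DFT results feed into such a mechanistic study, the sketch below combines placeholder electronic energies into a reaction energy for a single OH-addition step. The numbers are invented and the chosen step is only an example, not a result from the paper.

```python
# Hedged sketch: combining DFT electronic energies (computed externally with
# any quantum chemistry package) into a reaction energy for one step of the
# phenol + OH radical chain. The energy values below are placeholders, not
# results from the paper.
HARTREE_TO_KCAL = 627.509

# Hypothetical total electronic energies (hartree) for
#   phenol + OH*  ->  dihydroxycyclohexadienyl radical (OH adduct)
E_phenol = -307.464     # placeholder
E_oh_radical = -75.728  # placeholder
E_adduct = -383.225     # placeholder

delta_E = E_adduct - (E_phenol + E_oh_radical)           # hartree
print(f"Reaction energy: {delta_E * HARTREE_TO_KCAL:+.1f} kcal/mol")
```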

Relevance: 40.00%

Abstract:

Polynomial Chaos Expansion (PCE) is widely recognized as a flexible tool to represent different types of random variables and processes. However, applications to real, experimental data are still limited. In this article, PCE is used to represent the random time evolution of metal corrosion growth in marine environments. The PCE coefficients are determined so as to represent the data of 45 corrosion coupons tested by Jeffrey and Melchers (2001) at Taylors Beach, Australia. The accuracy of the representation and the possibilities for model extrapolation are considered in the study. Results show that reasonably accurate smooth representations of the corrosion process can be obtained; the accuracy is limited by the use of a smooth model to represent inherently non-smooth corrosion data. Random corrosion leads to time-variant reliability problems, due to resistance degradation over time. Time-variant reliability problems are not trivial to solve, especially under random process loading. Two example problems are solved herein, showing how the developed PCE representations can be employed in the reliability analysis of structures subject to marine corrosion. Monte Carlo simulation is used to solve the resulting time-variant reliability problems, and an accurate, more computationally efficient solution is also presented.
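A minimal illustration of the PCE idea, with synthetic data standing in for the corrosion coupons: map the empirical distribution of corrosion depths at one exposure time onto a standard normal germ and fit probabilists' Hermite coefficients by least squares. The fitting strategy and all numbers are assumptions for the sketch, not the paper's procedure.

```python
# Minimal sketch of a 1-D polynomial chaos expansion (PCE) fitted to
# scattered corrosion-depth data at one exposure time. The data are
# synthetic placeholders, not the Taylors Beach measurements.
import numpy as np
from numpy.polynomial.hermite_e import hermevander, hermeval
from scipy.stats import norm

rng = np.random.default_rng(0)
depth = rng.lognormal(mean=-2.0, sigma=0.3, size=45)  # fake coupon data (mm)

# Map empirical ranks to a standard normal germ xi ~ N(0, 1)
ranks = np.argsort(np.argsort(depth))
u = (ranks + 0.5) / len(depth)
xi = norm.ppf(u)

# Least-squares fit of probabilists' Hermite coefficients up to order 3
order = 3
Psi = hermevander(xi, order)                 # design matrix He_0..He_3
coeffs, *_ = np.linalg.lstsq(Psi, depth, rcond=None)

# The fitted PCE can now generate new realizations cheaply,
# e.g. for Monte Carlo reliability analysis
xi_new = rng.standard_normal(10000)
depth_samples = hermeval(xi_new, coeffs)
print(coeffs, depth_samples.mean(), depth.mean())
```

Repeating the fit at each measurement time would give time-dependent coefficients, which is the kind of random-process representation the abstract then feeds into the time-variant reliability analysis.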

Relevance: 30.00%

Abstract:

Exergy analysis is applied to assess the energy conversion processes that take place in the human body, aiming at developing indicators of health and performance based on the concepts of destroyed exergy rate and exergy efficiency. The thermal behavior of the human body is simulated by a model composed of 15 cylinders with elliptical cross sections representing the head, neck, trunk, arms, forearms, hands, thighs, legs and feet; for each, a combination of tissues is considered. The energy equation is solved for each cylinder, making it possible to obtain the transient response of the body to variations in environmental conditions. With this model, it is possible to obtain the heat and mass flow rates to the environment due to radiation, convection, evaporation and respiration. The exergy balances provide the exergy variation due to heat and mass exchange over the body, and the exergy variation over time of the tissue and blood in each compartment, the sum of which gives the total exergy variation of the body. Results indicate that the destroyed exergy and the exergy efficiency decrease over the lifespan, and that the human body is more efficient and destroys less exergy at lower relative humidities and higher temperatures.
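The sketch below is a back-of-the-envelope whole-body exergy balance, not the 15-cylinder model; all inputs are assumed round numbers, and it only shows how a destroyed-exergy rate and an efficiency fall out of such a balance.

```python
# Back-of-the-envelope sketch of a whole-body exergy balance. All numerical
# inputs are illustrative assumptions (rough orders of magnitude for a
# resting adult), not values from the paper.
T0 = 293.15          # environment (reference) temperature, K
T_skin = 307.15      # mean skin temperature, K

B_M = 100.0          # exergy rate of metabolism, W (assumed)
Q_skin = 70.0        # heat transfer rate from skin to environment, W (assumed)
B_evap_resp = 2.0    # exergy rate carried by evaporation + respiration, W (assumed)
dB_dt = 0.0          # steady state: no exergy accumulation in the body

# Exergy associated with heat released at skin temperature (Carnot factor)
B_Q = Q_skin * (1.0 - T0 / T_skin)

# Exergy balance: destruction = inputs - outputs - accumulation
B_destroyed = B_M - B_Q - B_evap_resp - dB_dt
eta_exergy = 1.0 - B_destroyed / B_M   # one common efficiency definition

print(f"B_Q = {B_Q:.1f} W, destroyed = {B_destroyed:.1f} W, "
      f"efficiency = {eta_exergy:.2f}")
```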

Relevance: 30.00%

Abstract:

Building facilities have become important infrastructures for modern productive plants dedicated to services. In this context, the control systems of intelligent buildings have evolved and their reliability has clearly improved. However, the occurrence of faults is inevitable in systems conceived, constructed and operated by humans, so a practical approach for reducing the consequences of faults is very useful. Yet only a few publications address intelligent building modeling processes that take into consideration the occurrence of faults and how to manage their consequences. In light of the foregoing, a procedure is proposed for modeling intelligent building control systems, considering their functional specifications both in normal operation and in the event of faults. The proposed procedure adopts the concepts of discrete event systems and holons, and explores Petri nets and their extensions to represent the structure and operation of control systems for intelligent buildings under normal and abnormal situations.
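To make the modeling ingredient concrete, the sketch below implements a bare place/transition Petri net with a marking and a firing rule on a made-up fail/detect/repair example; it is not one of the paper's holonic building models.

```python
# Minimal sketch of a place/transition Petri net with a marking and a firing
# rule. The net below (a device that can fail, be diagnosed and be repaired)
# is a made-up example, not one of the paper's building models.
from typing import Dict, List, Tuple

# transition -> (input places, output places)
Net = Dict[str, Tuple[List[str], List[str]]]

net: Net = {
    "fail":    (["operating"], ["faulty"]),
    "detect":  (["faulty"],    ["diagnosed"]),
    "repair":  (["diagnosed"], ["operating"]),
}
marking = {"operating": 1, "faulty": 0, "diagnosed": 0}

def enabled(net: Net, marking: Dict[str, int]) -> List[str]:
    """Transitions whose every input place holds at least one token."""
    return [t for t, (ins, _) in net.items()
            if all(marking[p] >= 1 for p in ins)]

def fire(net: Net, marking: Dict[str, int], t: str) -> Dict[str, int]:
    """Consume one token from each input place, add one to each output."""
    ins, outs = net[t]
    m = dict(marking)
    for p in ins:
        m[p] -= 1
    for p in outs:
        m[p] += 1
    return m

m = marking
for t in ("fail", "detect", "repair"):
    assert t in enabled(net, m)
    m = fire(net, m, t)
print(m)   # back to {'operating': 1, 'faulty': 0, 'diagnosed': 0}
```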

Relevance: 30.00%

Abstract:

The continental margin of southeast Brazil is elevated. Onshore Tertiary basins and Late Cretaceous/Paleogene intrusions are good evidence for post-breakup tectono-magmatic activity. To constrain the impact of post-rift reactivation on the geological history of the area, we carried out a new thermochronological study. Apatite fission track ages range from 60.7 ± 1.9 Ma to 129.3 ± 4.3 Ma, mean track lengths from 11.41 ± 0.23 µm to 14.31 ± 0.24 µm, and a subset of the (U-Th)/He ages ranges from 45.1 ± 1.5 to 122.4 ± 2.5 Ma. Results of inverse thermal history modeling generally support the conclusions of an earlier study for a Late Cretaceous phase of cooling. Around the onshore Taubaté Basin, for a limited number of samples, the first detectable period of cooling occurred during the Early Tertiary. The inferred thermal histories for many samples also imply subsequent reheating followed by Neogene cooling. Given the uncertainty of the inversion results, we performed deterministic forward modeling to assess the range of possibilities for the Tertiary part of the thermal history. The evidence for reheating seems robust around the Taubaté Basin, but elsewhere the data cannot discriminate between this and a less complex thermal history. However, forward modeling results and geological information support the conclusion that the whole area underwent cooling during the Neogene. The synchronicity of the cooling phases with Andean tectonics and with those in NE Brazil leads us to assume a plate-wide compressional stress that reactivated inherited structures. The present-day topographic relief of the margin reflects a contribution from post-breakup reactivation and uplift.
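The paper's forward models rest on fission-track annealing and He-diffusion kinetics; the sketch below is a far cruder stand-in that scores a candidate time-temperature path by approximating each thermochronometer age as the time of the last cooling through a nominal closure temperature. The candidate path, closure temperatures and "observed" ages are all illustrative assumptions, shown only to convey how forward modeling compares candidate histories against data.

```python
# Crude forward-modeling sketch: approximate a thermochronometer age as the
# time since a candidate time-temperature path last cooled below a nominal
# closure temperature, then score the path against observed ages. Real
# forward models are far more elaborate; all numbers are illustrative.
import numpy as np

def predicted_age(time_ma, temp_c, closure_c):
    """Time (Ma) since the path last cooled below closure_c, with linear
    interpolation between path nodes. time_ma is ordered old -> young."""
    if temp_c[-1] >= closure_c:
        return 0.0                       # still above closure today
    # find the most recent crossing from above to below closure
    for i in range(len(time_ma) - 1, 0, -1):
        if temp_c[i] < closure_c <= temp_c[i - 1]:
            frac = (temp_c[i - 1] - closure_c) / (temp_c[i - 1] - temp_c[i])
            return time_ma[i - 1] + frac * (time_ma[i] - time_ma[i - 1])
    return time_ma[0]                    # below closure for the whole path

def misfit(time_ma, temp_c, observations):
    """observations: list of (closure_c, observed_age_ma, sigma_ma)."""
    return sum(((predicted_age(time_ma, temp_c, tc) - age) / sig) ** 2
               for tc, age, sig in observations)

# Candidate history: cooling near breakup, mild reheating, Neogene cooling
time_ma = np.array([130, 90, 60, 30, 0])       # Ma before present
temp_c  = np.array([140, 105, 90, 100, 20.0])  # degrees C

obs = [(110.0, 85.0, 5.0),   # apatite fission track, nominal ~110 C
       (70.0, 20.0, 3.0)]    # apatite (U-Th)/He, nominal ~70 C
print(misfit(time_ma, temp_c, obs))
```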

Relevance: 30.00%

Abstract:

In this work, we study the performance evaluation of resource-aware business process models. We define a new framework that allows the generation of analytical models for performance evaluation from business process models annotated with resource management information. This framework is composed of a new notation that allows the specification of resource management constraints and a method to convert a business process specification and its resource constraints into Stochastic Automata Networks (SANs). We show that the analysis of the generated SAN model provides several performance indices, such as the average throughput of the system, average waiting time, average queue sizes, and resource utilization rates. Using the BP2SAN tool - our implementation of the proposed framework - and a SAN solver (such as the PEPS tool), we show through a simple use case how a business specialist with no skills in stochastic modeling can easily obtain performance indices that, in turn, can help to identify bottlenecks in the model, to perform workload characterization, to define the provisioning of resources, and to study other performance-related aspects of the business process.
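The sketch below is not the SAN formalism or the BP2SAN/PEPS toolchain; it solves a tiny continuous-time Markov chain (a bounded single-server queue standing in for one resource) for its steady state, which is the kind of numerical step from which utilization, throughput, queue size and waiting time are read off.

```python
# Minimal sketch: steady-state analysis of a small continuous-time Markov
# chain standing in for one resource with a bounded queue (M/M/1/3). All
# rates are made up; this is only an analogue of the analysis step.
import numpy as np

lam, mu, K = 2.0, 3.0, 3          # arrival rate, service rate, capacity

# Infinitesimal generator Q for states 0..K (number of cases at the resource)
Q = np.zeros((K + 1, K + 1))
for n in range(K + 1):
    if n < K:
        Q[n, n + 1] = lam              # arrival
    if n > 0:
        Q[n, n - 1] = mu               # service completion
    Q[n, n] = -Q[n].sum()

# Solve pi @ Q = 0 with sum(pi) = 1
A = np.vstack([Q.T, np.ones(K + 1)])
b = np.append(np.zeros(K + 1), 1.0)
pi, *_ = np.linalg.lstsq(A, b, rcond=None)

utilization = 1.0 - pi[0]                    # resource busy probability
throughput = mu * utilization                # completed cases per time unit
mean_number = np.dot(np.arange(K + 1), pi)   # mean cases at the resource
mean_sojourn = mean_number / throughput      # mean time in system (Little's law)
print(utilization, throughput, mean_number, mean_sojourn)
```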

Relevance: 30.00%

Abstract:

Background: To understand the molecular mechanisms underlying important biological processes, a detailed description of the networks of gene products involved is required. In order to define and understand such molecular networks, several statistical methods have been proposed in the literature to estimate gene regulatory networks from time-series microarray data. However, several problems still need to be overcome. First, information flow needs to be inferred, in addition to the correlation between genes. Second, we usually try to identify large networks from a large number of genes (parameters) originating from a smaller number of microarray experiments (samples). Due to this situation, which is rather frequent in bioinformatics, it is difficult to perform statistical tests using methods that model large gene-gene networks. In addition, most of the models rely on dimension reduction using clustering techniques; therefore, the resulting network is not a gene-gene network but a module-module network. Here, we present the Sparse Vector Autoregressive (SVAR) model as a solution to these problems.

Results: We have applied the Sparse Vector Autoregressive model to estimate gene regulatory networks based on gene expression profiles obtained from time-series microarray experiments. Through extensive simulations, applying the SVAR method to artificial regulatory networks, we show that SVAR can infer true positive edges even under conditions in which the number of samples is smaller than the number of genes. Moreover, it is possible to control for false positives, a significant advantage when compared to other methods described in the literature, which are based on ranks or score functions. By applying SVAR to actual HeLa cell cycle gene expression data, we were able to identify well-known transcription factor targets.

Conclusion: The proposed SVAR method is able to model gene regulatory networks in the frequent situation in which the number of samples is lower than the number of genes, making it possible to naturally infer partial Granger causalities without any a priori information. In addition, we present a statistical test to control the false discovery rate, which was not previously possible using other gene regulatory network models.
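A minimal sketch of the sparse VAR idea, using scikit-learn's Lasso as a stand-in estimator: each gene's expression at time t+1 is regressed on all genes at time t with an L1 penalty, and nonzero coefficients are read as candidate edges. The data are synthetic, and the paper's SVAR estimator and its false-discovery-rate test are not reproduced here.

```python
# Minimal sketch of a sparse vector autoregressive (VAR(1)) fit for network
# inference. Synthetic data; Lasso is a stand-in for the paper's estimator.
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(1)
n_times, n_genes = 20, 50                 # fewer samples than genes
X = rng.standard_normal((n_times, n_genes))
# inject one true dependency: gene 1 follows gene 0 with a lag of 1
X[1:, 1] += 0.8 * X[:-1, 0]

past, future = X[:-1], X[1:]
A = np.zeros((n_genes, n_genes))          # A[i, j]: effect of gene j on gene i
for i in range(n_genes):
    model = Lasso(alpha=0.1, max_iter=10000)
    model.fit(past, future[:, i])
    A[i] = model.coef_

edges = np.argwhere(np.abs(A) > 1e-8)
print(f"{len(edges)} candidate edges; A[1, 0] = {A[1, 0]:.2f}")
```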