956 results for Large modeling projects
Abstract:
Until the early 1990s, the simulation of fluid flow in oil reservoirs relied essentially on the finite-difference numerical technique. Since then, streamline-based simulation technology has developed considerably; it is now used in many cases and can represent the physical mechanisms that influence fluid flow, such as compressibility, capillarity, and gravitational segregation. Streamline-based flow simulation is a valuable tool for waterflood project management because it provides important information not available from traditional finite-difference simulation and shows directly the influence of each injector well on each producer well. This work applies a methodology published in the literature for optimizing water injection projects to the model of a Brazilian Potiguar Basin reservoir with a large number of wells. The methodology adjusts injection well rates over time based on information available through streamline simulation, reducing injection rates in less efficient wells and increasing them in more efficient wells. In the proposed model, the methodology was effective: the optimized alternatives presented higher oil recovery with a lower injected water volume, indicating better efficiency and, consequently, reduced costs. Given the wide use of water injection in oil fields, this positive outcome is important because it presents a case study in which oil recovery was increased simply through a better distribution of water injection rates.
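As a hedged illustration of the rate-reallocation step (not the published methodology's exact rules), the sketch below shifts a fraction of the injected volume from below-average injectors to above-average ones while keeping the total injection constant; the well names, efficiency values, and shift fraction are invented for the example.

```python
# Minimal sketch of efficiency-weighted reallocation of water injection
# rates, assuming per-injector efficiencies reported by a streamline
# simulator. Names and the shift fraction are illustrative, not the
# authors' exact formulation.

def reallocate_rates(rates, efficiencies, shift_fraction=0.1):
    """Move a fraction of the injection from below-average injectors
    to above-average ones, keeping the total volume constant."""
    avg = sum(efficiencies.values()) / len(efficiencies)
    donors = {w: e for w, e in efficiencies.items() if e < avg}
    receivers = {w: e for w, e in efficiencies.items() if e >= avg}

    # Volume taken from each inefficient well, proportional to its rate.
    taken = {w: rates[w] * shift_fraction for w in donors}
    pool = sum(taken.values())

    # Redistribute the pool to efficient wells, weighted by efficiency.
    weight = sum(receivers.values())
    new_rates = dict(rates)
    for w in donors:
        new_rates[w] -= taken[w]
    for w, e in receivers.items():
        new_rates[w] += pool * e / weight
    return new_rates

rates = {"INJ-1": 500.0, "INJ-2": 300.0, "INJ-3": 200.0}  # m3/day
effs = {"INJ-1": 0.35, "INJ-2": 0.60, "INJ-3": 0.75}      # from streamlines
print(reallocate_rates(rates, effs))
```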
Abstract:
Nowadays, the importance of using software processes is well consolidated and considered fundamental to the success of software development projects. Large and medium software projects demand the definition and continuous improvement of software processes in order to promote the productive development of high-quality software. Customizing and evolving existing software processes to address the variety of scenarios, technologies, cultures, and scales is a recurrent challenge in the software industry: it involves adapting software process models to the reality of each project, and it must also promote the reuse of past experiences in the definition and development of software processes for new projects. Adequate management and execution of software processes can bring better quality and productivity to the resulting software systems. This work explores the use and adaptation of consolidated software product line techniques to manage the variabilities of software process families. To achieve this aim: (i) a systematic literature review is conducted to identify and characterize variability management approaches for software processes; (ii) an annotative approach for the variability management of software process lines is proposed and developed; and (iii) empirical studies and a controlled experiment assess and compare the proposed annotative approach against a compositional one. The first study, a comparative qualitative study, analyzed the annotative and compositional approaches from different perspectives: modularity, traceability, error detection, granularity, uniformity, adoption, and systematic variability management. The second, a comparative quantitative study, considered internal attributes of software process line specifications, such as modularity, size, and complexity. Finally, a controlled experiment evaluated the effort to use, and the understandability of, the investigated approaches when modeling and evolving software process line specifications. The studies provide evidence of several benefits of the annotative approach, and of its potential for integration with the compositional approach, to assist the variability management of software process lines.
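To illustrate the general idea behind an annotative approach (not the notation proposed in this work), the sketch below tags process activities with the features that require them and derives a process variant by keeping only the activities whose annotations are satisfied by a feature selection; all process and feature names are invented.

```python
# Illustrative sketch of annotative variability in a software process
# line: each activity is annotated with the features that require it,
# and a variant is derived by filtering on a feature selection.

process_line = [
    ("Requirements elicitation", set()),             # mandatory
    ("Formal specification",     {"safety_critical"}),
    ("Code review",              {"safety_critical", "team_large"}),
    ("Daily stand-up",           {"agile"}),
    ("Acceptance testing",       set()),             # mandatory
]

def derive_variant(annotated_activities, selected_features):
    """Keep mandatory activities and those whose annotations are all
    contained in the selected feature set."""
    return [name for name, required in annotated_activities
            if required <= selected_features]

print(derive_variant(process_line, {"agile"}))
print(derive_variant(process_line, {"safety_critical", "team_large"}))
```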
Abstract:
We present a generic spatially explicit modeling framework to estimate carbon emissions from deforestation (INPE-EM). The framework incorporates the temporal dynamics of the deforestation process and accounts for the biophysical and socioeconomic heterogeneity of the region under study. We build an emission model for the Brazilian Amazon combining annual maps of new clearings, four maps of biomass, and a set of alternative parameters based on the recent literature. The most important results are as follows: (a) Using different biomass maps leads to large differences in emission estimates; for the entire Brazilian Amazon in the last decade, emission estimates from primary forest deforestation range from 0.21 to 0.26 Pg C yr⁻¹. (b) Secondary vegetation growth has a small impact on the emission balance because of the short life span of secondary vegetation: on average, the balance is only 5% smaller than the primary forest deforestation emissions. (c) Deforestation rates decreased significantly in the Brazilian Amazon in recent years, from 27 × 10³ km² in 2004 to 7 × 10³ km² in 2010. INPE-EM process-based estimates reflect this decrease even though the agricultural frontier is moving to areas of higher biomass. The decrease is slower than a non-process instantaneous model would estimate, as INPE-EM accounts for residual emissions (slash, wood products, and secondary vegetation). The average balance, considering all biomass maps, decreases from 0.28 Pg C yr⁻¹ in 2004 to 0.15 Pg C yr⁻¹ in 2009; the non-process model estimates a decrease from 0.33 to 0.10 Pg C yr⁻¹. We conclude that INPE-EM is a powerful tool for representing deforestation-driven carbon emissions. Biomass estimates are still the largest source of uncertainty in the effective use of this type of model for informing mechanisms such as REDD+. The results also indicate that efforts to reduce emissions should focus not only on controlling primary forest deforestation but also on creating incentives for the restoration of secondary forests.
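To make the contrast between process-based and instantaneous accounting concrete, here is a toy sketch (not INPE-EM itself): the instantaneous model books all carbon in the clearing year, while the process-based model spreads each year's committed emissions over subsequent years, so a drop in deforestation shows up more slowly. The clearing series is approximate PRODES data; the biomass density, carbon fraction, and release schedule are invented for the example.

```python
# Toy contrast between instantaneous and process-based ("committed")
# deforestation emission accounting. INPE-EM's real parameterization is
# richer (slash pools, wood products, secondary vegetation).

CARBON_FRACTION = 0.5  # tC per t dry biomass (common assumption)

def instantaneous(cleared_km2, biomass_t_per_km2):
    """All carbon released in the clearing year."""
    return [a * biomass_t_per_km2 * CARBON_FRACTION for a in cleared_km2]

def process_based(cleared_km2, biomass_t_per_km2, decay=0.5, horizon=4):
    """Each year's clearing releases its committed carbon over
    `horizon` years with geometrically decaying residual emissions."""
    raw = [decay ** t for t in range(horizon)]
    weights = [w / sum(raw) for w in raw]
    emissions = [0.0] * (len(cleared_km2) + horizon - 1)
    for year, area in enumerate(cleared_km2):
        committed = area * biomass_t_per_km2 * CARBON_FRACTION
        for t, w in enumerate(weights):
            emissions[year + t] += committed * w
    return emissions

cleared = [27e3, 19e3, 14e3, 12e3, 13e3, 7e3]  # km2/yr, ~2004-2009
biomass = 30e3                                 # t/km2 (= 300 t/ha, illustrative)
print([round(e / 1e9, 3) for e in instantaneous(cleared, biomass)])  # Pg C
print([round(e / 1e9, 3) for e in process_based(cleared, biomass)])  # Pg C
```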
Abstract:
Although cluster environments have enormous potential processing power, real applications that take advantage of this power remain an elusive goal. This is due, in part, to a lack of understanding of the characteristics of the applications best suited to these environments. This paper focuses on Master/Slave applications for large heterogeneous clusters. It defines application, cluster, and execution models to derive an analytic expression for the execution time. It defines speedup and derives speedup bounds based on the inherent parallelism of the application and the aggregated computing power of the cluster. The paper derives an analytical expression for efficiency and uses it to define the scalability of the algorithm-cluster combination based on the isoefficiency metric. Furthermore, it establishes necessary and sufficient conditions for an algorithm-cluster combination to be scalable, conditions that are easy to verify and use in practice. Finally, it covers the impact of network contention as the number of processors grows. © 2007 Elsevier B.V. All rights reserved.
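For reference, the standard definitions behind these quantities can be computed directly (this follows textbook definitions, not the paper's specific heterogeneous-cluster derivation): speedup S = T(1)/T(p) and efficiency E = S/P, where for a heterogeneous cluster the processor count is replaced by the aggregated computing power P expressed in equivalent reference processors.

```python
# Standard speedup/efficiency bookkeeping for a heterogeneous cluster,
# using aggregated relative computing power instead of a processor
# count. Textbook definitions, not the paper's exact model.

def aggregated_power(node_speeds, reference_speed):
    """Equivalent number of reference processors."""
    return sum(s / reference_speed for s in node_speeds)

def speedup(t_serial, t_parallel):
    return t_serial / t_parallel

def efficiency(t_serial, t_parallel, node_speeds, reference_speed):
    return (speedup(t_serial, t_parallel)
            / aggregated_power(node_speeds, reference_speed))

# Example: 4 nodes, two of them twice as fast as the reference machine.
speeds = [1.0, 1.0, 2.0, 2.0]   # relative node speeds
t1, tp = 600.0, 130.0           # seconds: serial vs parallel run
print(speedup(t1, tp))                   # ~4.62x
print(efficiency(t1, tp, speeds, 1.0))   # ~0.77 on 6 "reference processors"
```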
Abstract:
Genome sequencing efforts are providing us with complete genetic blueprints for hundreds of organisms. We are now faced with assigning, understanding, and modifying the functions of proteins encoded by these genomes. DBMODELING is a relational database of annotated comparative protein structure models and, when identified, their metabolic pathway characterization. This procedure was applied to complete genomes such as Mycobacterium tuberculosis and Xylella fastidiosa. The main interest in the study of metabolic pathways is that some of these pathways are not present in humans, which makes them selective targets for drug design, decreasing the impact of drugs on humans. The database currently holds 1116 proteins from two genomes and can be accessed by any researcher at http://www.biocristalografia.df.ibilce.unesp.br/tools/. This project confirms that homology modeling is a useful tool in structural bioinformatics and that it can be very valuable in annotating genome sequence information, contributing to structural and functional genomics, and analyzing protein-ligand docking.
Abstract:
When the food supply finishes, or when blowfly larvae complete their development and migrate prior to the total removal of the larval substrate, they disperse to find adequate places for pupation, a process known as post-feeding larval dispersal. Based on experimental data on the initial and final configurations of the dispersal, this spatio-temporal behavior is reproduced here by means of an evolutionary search for cellular automata with a distinct transition rule associated with each cell, also known as nonuniform cellular automata, and with two states per cell in the lattice. Two-dimensional regular lattices and multivalued states are considered, and a practical question is the need to discover a proper set of transition rules. Given that the number of rules is related to the number of cells in the lattice, the search space is very large, so an evolution strategy is used to optimize the parameters of the transition rules, with two transition rules per cell. As the parameters to be optimized admit a physical interpretation, the resulting computational model can be analyzed to raise hypothetical explanations of the observed spatio-temporal behavior. © 2006 IEEE.
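Below is a minimal sketch of a nonuniform (per-cell rule) binary cellular automaton on a 2-D lattice; the neighborhood, rule parameterization, and lattice size are invented for illustration, and the evolution-strategy search over rule parameters is omitted.

```python
import random

# Minimal nonuniform binary CA on a 2-D torus: each cell has its own
# threshold rule -- the per-cell parameters an evolution strategy could
# optimize. Neighborhood, rule form, and sizes are illustrative only.

SIZE = 10

def neighbors(i, j):
    """Moore neighborhood with wrap-around (toroidal lattice)."""
    return [((i + di) % SIZE, (j + dj) % SIZE)
            for di in (-1, 0, 1) for dj in (-1, 0, 1)
            if (di, dj) != (0, 0)]

def step(grid, thresholds):
    """Synchronous update: cell (i, j) becomes 1 when its live-neighbor
    count meets its own threshold -- a distinct rule per cell."""
    return [[1 if sum(grid[x][y] for x, y in neighbors(i, j))
                  >= thresholds[i][j] else 0
             for j in range(SIZE)]
            for i in range(SIZE)]

random.seed(0)
grid = [[random.randint(0, 1) for _ in range(SIZE)] for _ in range(SIZE)]
thresholds = [[random.randint(1, 8) for _ in range(SIZE)] for _ in range(SIZE)]
for _ in range(5):
    grid = step(grid, thresholds)
print(sum(map(sum, grid)), "live cells after 5 steps")
```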
Abstract:
Simulation of large and complex systems, such as computing grids, is a difficult task. Current simulators, despite providing accurate results, are significantly hard to use. They usually demand strong programming knowledge, which is not typical of today's users of grids and high-performance computing. This need for computer expertise prevents these users from simulating how the environment will respond to their applications, which may imply a large loss of efficiency, wasting precious computational resources. In this paper we introduce iSPD, the iconic Simulator of Parallel and Distributed Systems, a simulator in which grid models are produced through an iconic interface. We describe the simulator and its intermediate model languages. The results presented here provide insight into its ease of use and accuracy.
Abstract:
This paper describes strategies and techniques for the modeling and automatic mesh generation of the aorta artery and its tunics (adventitia, media, and intima walls) using open-source codes. The models were constructed in the Blender package, and Python scripts were used to export the data necessary for mesh generation in TetGen. The proposed strategies are able to provide meshes of complicated and irregular volumes involving a large number of mesh elements (approximately 12,000,000 tetrahedra). These meshes can be used to perform computational simulations by the Finite Element Method (FEM). © Published under licence by IOP Publishing Ltd.
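As a hedged sketch of the kind of Blender Python export step described (not the authors' actual scripts), the snippet below writes a mesh's vertices and faces to a TetGen-style surface file; the object name and output path are hypothetical, and the exact .smesh layout should be checked against TetGen's manual before use.

```python
# Minimal sketch of a Blender-to-TetGen export step, run inside
# Blender's Python environment (bpy). Object name and path are
# hypothetical; the output follows TetGen's .smesh layout in spirit.
import bpy

obj = bpy.data.objects["Aorta"]
mesh = obj.data

with open("/tmp/aorta.smesh", "w") as f:
    # Part 1: node list -- <#points> <dim 3> <#attrs> <marker flag>
    f.write(f"{len(mesh.vertices)} 3 0 0\n")
    for i, v in enumerate(mesh.vertices):
        x, y, z = obj.matrix_world @ v.co  # world-space coordinates
        f.write(f"{i} {x:.6f} {y:.6f} {z:.6f}\n")

    # Part 2: facet list -- <#facets> <marker flag>
    f.write(f"{len(mesh.polygons)} 0\n")
    for p in mesh.polygons:
        idx = " ".join(str(i) for i in p.vertices)
        f.write(f"{len(p.vertices)} {idx}\n")

    # Parts 3 and 4: no holes, no region attributes
    f.write("0\n0\n")
```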
Abstract:
Cellobiohydrolases hydrolyze cellulose, releasing cellobiose units. They are very important for a number of biotechnological applications, such as the production of cellulosic ethanol and cotton fiber processing. The Trichoderma cellobiohydrolase I (CBH1 or Cel7A) is an industrially important exocellulase. It exhibits a typical two-domain architecture, with a small C-terminal cellulose-binding domain and a large N-terminal catalytic core domain, connected by an O-glycosylated linker peptide. The mechanism by which the linker mediates the concerted action of the two domains remains a conundrum. Here, we probe the protein shape and domain organization of the CBH1 of Trichoderma harzianum (ThCel7A) by small-angle X-ray scattering (SAXS) and structural modeling. Our SAXS data show that the ThCel7A linker is partially extended in solution. Structural modeling suggests that this linker conformation is stabilized by inter- and intra-molecular interactions involving the linker peptide and its O-glycosylations. © 2013 Springer Science+Business Media Dordrecht.
Abstract:
Studies on innovation and technology management have emphasized the importance of integration between the research and development (R&D) department and others involved in the product development process (PDP) as a relevant practice for the good performance of technological innovation activities. This study addresses the transfer of technologies to new product projects and the integration practices between the R&D department and others involved in the PDP. A qualitative study was conducted, operationalized through two case studies at large high-tech companies: one Brazilian and one multinational subsidiary in Brazil. Among its main results, this paper presents and analyzes management practices favorable to integration in product development projects that demand the development and transfer of technologies, such as the participation of R&D personnel in market activities, the adoption of virtual interaction mechanisms, and the application of methods such as technology roadmaps. © Universidad Alberto Hurtado, Facultad de Economía y Negocios.
Abstract:
Over the last decade, the Santarém region has seen an increase in the area devoted to grain agriculture, especially rice, maize, and soybean. Given the region's land tenure structure, this dynamic has contributed to land concentration, as smallholdings are replaced by large capitalized properties. Territorial planning policies have created a mosaic of units with specific land-use rules: conservation units and different types of settlement projects. This work has two objectives: to study the process of landscape transformation after the introduction of capitalized grain agriculture, and to build future scenarios that analyze alternatives for containing the ongoing deforestation and land concentration. The study was carried out using remote sensing and geoprocessing techniques, based on Landsat 5 TM images from 1999, 2004, and 2007. Dynamic modeling techniques were employed to explore future scenarios (2015) considering territorial land-use rules. The results show that, until 2004, most of the mechanized agriculture was established in areas previously occupied by smallholder farming, pasture, and secondary regrowth (capoeira). After 2004, its expansion took place mainly over forest areas, especially within settlement projects. The analysis of land-use transitions across different settlement types shows that the land-use rules established by territorial planning measures have not been followed in many cases. The main methodological contribution of this work is the incorporation of institutional issues related to land tenure structure into the analysis of landscape transformation and scenario building. The results show that such an approach is essential for understanding the transformation processes under way in the region.
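As a hedged illustration of the kind of dynamic modeling used for scenario building (not the specific model of this work), the sketch below projects land-cover areas forward with a Markov-style transition matrix of the sort that would be estimated from two classified Landsat dates; all classes, areas, and transition rates are invented.

```python
# Illustrative Markov-style projection of land-cover areas, the
# simplest form of dynamic modeling for scenario building. A real
# study would estimate the matrix from classified Landsat maps and
# add spatial allocation rules (e.g., settlement-specific constraints).

classes = ["forest", "pasture", "mechanized_agriculture", "regrowth"]

# P[i][j]: fraction of class i converting to class j per time step.
P = [
    [0.96, 0.01, 0.02, 0.01],   # forest
    [0.00, 0.80, 0.15, 0.05],   # pasture
    [0.00, 0.02, 0.97, 0.01],   # mechanized agriculture
    [0.03, 0.10, 0.07, 0.80],   # regrowth
]

areas = [5000.0, 1200.0, 600.0, 300.0]  # km2 at the initial date

def project(areas, P, steps):
    """Apply the transition matrix `steps` times to the area vector."""
    for _ in range(steps):
        areas = [sum(areas[i] * P[i][j] for i in range(len(areas)))
                 for j in range(len(areas))]
    return areas

for name, km2 in zip(classes, project(areas, P, steps=3)):
    print(f"{name:24s} {km2:8.1f} km2")
```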