11 results for visualization tools
at the Universidade Federal do Rio Grande do Norte (UFRN)
Abstract:
In oil recovery, rock heterogeneity has a huge impact on how fluids move through the field, determining how much oil can be recovered. Percolation theory, which describes phenomena in terms of geometry and connectivity, is a very useful model for studying this variability. Percolation results are three-dimensional data that have no physical meaning until visualized as images or animations. Although many powerful and sophisticated visualization tools have been developed, they focus on the generation of planar 2D images. To interpret the data as they would appear in the real world, virtual reality techniques based on stereo images can be used. In this work we propose an interactive and helpful tool, named ZSweepVR, based on virtual reality techniques, that allows a better comprehension of the volumetric data generated by the simulation of dynamic percolation. The developed system renders images using two different techniques: surface rendering and volume rendering. Surface rendering is accomplished with OpenGL directives, and volume rendering with the ZSweep direct volume rendering engine. For volume rendering, we implemented an algorithm to generate stereo images. We also propose enhancements to the original percolation algorithm in order to improve its performance. We applied the developed tools to a mature-field database, obtaining satisfactory results. The use of stereoscopic and volumetric images brought valuable contributions to the interpretation and cluster-formation analysis in percolation, which could certainly lead to better decisions about exploration and recovery in oil fields.
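The abstract does not detail the stereo-image algorithm; a common approach for generating stereo pairs is an off-axis camera pair converging on a shared focal plane. A minimal Python sketch under that assumption, with all names hypothetical:

    import numpy as np

    def stereo_eyes(center, forward, up, eye_sep, focal_len):
        """Left/right eye positions for an off-axis stereo pair."""
        center, forward, up = (np.asarray(v, float) for v in (center, forward, up))
        right = np.cross(forward, up)
        right /= np.linalg.norm(right)
        half = 0.5 * eye_sep
        # Each eye is displaced along the view-plane "right" axis; the
        # projection is skewed so both frusta converge at focal_len.
        skew = half / focal_len
        return center - half * right, center + half * right, skew

    left, right, skew = stereo_eyes([0, 0, 5], [0, 0, -1], [0, 1, 0], 0.06, 5.0)

Rendering the volume once from each eye position, with the corresponding frustum skew, yields the stereo pair.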
Abstract:
VANTI, Nadia. Links hipertextuais na comunicação científica: análise webométrica dos sítios acadêmicos latino-americanos em Ciências Sociais [Hypertext links in scientific communication: a webometric analysis of Latin American academic websites in the Social Sciences]. Porto Alegre, 2007. 292 pp. Thesis (Doctorate in Communication and Information) – Universidade Federal do Rio Grande do Sul, Porto Alegre, 2007.
Abstract:
The increasing competitiveness of the construction industry, set in an economic environment in which supply now exceeds demand, causes the prices of many products and services to be strongly influenced by the production processes and by the final consumer. Thus, to become more competitive in the market, construction companies are seeking new alternatives to reduce and control costs, as well as production processes and tools that allow close monitoring of the construction schedule and, consequently, compliance with the deadline agreed with the client. In this scenario, the creation of tools for control, service management, and work planning emerges as an investment opportunity and as an area that can bring great benefits to construction companies. The goal of this work is to present a system for planning, service management, and cost control that, through worksheets, provides information on the production phase of the work, allowing the visualization of possible deviations in the planning and cost of the enterprise and enabling the company to take steps to achieve its goals and to correct deviations when necessary. The developed system was applied to a real-estate project in Rio Grande do Norte, and the results showed that its use allowed the construction company to track its results and take corrective and preventive actions during the production process, efficiently and effectively.
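The abstract does not detail the worksheet calculations; a minimal sketch of the planned-versus-actual comparison such a control worksheet typically performs, in the style of earned-value analysis (all figures and field names hypothetical):

    # Hypothetical worksheet rows: (service, planned_cost, actual_cost,
    # planned_progress, actual_progress), progress as fractions of 1.0.
    services = [
        ("foundations", 100_000.0, 108_000.0, 1.00, 1.00),
        ("masonry",      80_000.0,  35_000.0, 0.50, 0.40),
    ]

    for name, planned, actual, p_prog, a_prog in services:
        earned = planned * a_prog                # value of work actually done
        cost_var = earned - actual               # negative => over budget
        sched_var = (a_prog - p_prog) * planned  # negative => behind schedule
        print(f"{name}: cost variance {cost_var:+.0f}, "
              f"schedule variance {sched_var:+.0f}")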
Abstract:
This study aims to identify, through the application of webometric indicators, which graduate programs in Engineering recommended by the Coordination for the Improvement of Higher Education Personnel (CAPES) in Brazil stand out on the web with respect to the communication and dissemination of scientific information in the academic environment. To this end, we analyzed the content structure of the sites, their use (through searches and queries), the quality of the information made available, and the structure of the hypertext links among the sites in this universe of study. The tools and methodologies adopted were: search engines (Google, Yahoo), a link-mapping tool (Xenu Link Sleuth), and network analysis and visualization software (Ucinet 6 and NetDraw). Webometric indicators were also used, such as web-site size, visibility, web impact factor, luminosity, and network density. These instruments support the analysis and evaluation carried out in this webometric study. From the literature reviewed, it appears that this type of metric study offers many advantages in the so-called Information Society. The results obtained identify which graduate programs in engineering make their information most available on the Web, which of them stand out in the use of that information, which have the highest impact factor, and which offer the greatest number of links serving as information sources for their users, thereby contributing to the navigability of the network. In summary, the webometric study presents promising results that achieve the proposed objectives and identifies the factors that contribute significantly to the good visibility of these sites on the network, thus helping the dissemination of information and scientific communication through the Web.
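The web impact factor mentioned above is conventionally defined as the number of links a site receives divided by its number of pages; a minimal sketch of that indicator and of network density (all counts illustrative):

    # Webometric indicators, assuming counts were already collected
    # with a search engine and a link mapper.
    def web_impact_factor(inlinks: int, pages: int) -> float:
        # WIF: links received by the site divided by its number of pages.
        return inlinks / pages if pages else 0.0

    def network_density(ties: int, nodes: int) -> float:
        # Directed network: observed links over all possible links.
        possible = nodes * (nodes - 1)
        return ties / possible if possible else 0.0

    # size = pages, visibility = inlinks, luminosity = outlinks
    print(web_impact_factor(inlinks=420, pages=1500))  # illustrative numbers
    print(network_density(ties=37, nodes=12))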
Abstract:
Recent years have seen increasing acceptance and adoption of parallel processing, both for high-performance scientific computing and for general-purpose applications. This acceptance has been favored mainly by the development of massively parallel processing (MPP) environments and of distributed computing. A common point between distributed systems and MPP architectures is the notion of message passing, which allows communication between processes. A message-passing environment consists basically of a communication library that, acting as an extension of programming languages such as C, C++, and Fortran, allows the development of parallel applications. A fundamental aspect of developing parallel applications is analyzing their performance. Several metrics can be used in this analysis: execution time, efficiency in the use of the processing elements, and scalability of the application with respect to the number of processors or to the size of the problem instance. Establishing models or mechanisms for this analysis can be quite complicated given the parameters and degrees of freedom involved in the implementation of a parallel application. One alternative is the use of tools for collecting and visualizing performance data, which allow the user to identify bottlenecks and sources of inefficiency in an application. Efficient visualization requires identifying and collecting data about the execution of the application, a stage called instrumentation. This work presents, first, a study of the main techniques used to collect performance data, and then a detailed analysis of the main available tools that can be used on Beowulf-cluster parallel architectures running Linux on the x86 platform with MPI (Message Passing Interface) communication libraries such as LAM and MPICH. This analysis is validated on parallel applications that train perceptron neural networks using backpropagation. The conclusions show the potential and ease of use of the analyzed tools.
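The speedup and efficiency metrics mentioned above have standard definitions, S(p) = T(1)/T(p) and E(p) = S(p)/p; a minimal sketch computing them from measured run times (the timing values are illustrative):

    # times[p] = wall-clock time with p processes, e.g. from MPI_Wtime.
    times = {1: 960.0, 2: 505.0, 4: 270.0, 8: 155.0}

    t1 = times[1]
    for p, tp in sorted(times.items()):
        speedup = t1 / tp          # S(p) = T(1) / T(p)
        efficiency = speedup / p   # E(p) = S(p) / p, ideal value 1.0
        print(f"p={p}: speedup={speedup:.2f}, efficiency={efficiency:.2f}")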
Abstract:
There is still high demand for quality control in the manufacturing of mechanical parts. This keeps alive the need to inspect final products, from dimensional analysis to chemical composition. This task is usually performed through various destructive and nondestructive methods that verify the integrity of the parts. The results generated by these modern inspection tools, however, often cannot geometrically define the actual damage and therefore cannot be properly displayed on screen in a computing environment. Virtual 3D visualization may help identify damage that would hardly be detected by other methods. There are commercial software packages that address the stages of design and simulation of mechanical parts in order to predict possible damage and reduce the chance of undesirable events. However, the challenge of developing software capable of integrating the various activities of design, product inspection, nondestructive testing results, and damage simulation still needs the attention of researchers. This motivated a methodological study for the implementation of a versatile CAD/CAE computational kernel capable of helping programmers develop software for the design and simulation of mechanical parts under stress. This research presents interesting results obtained with the developed kernel, showing that it was successfully applied to design case studies involving parts with specific geometries, namely mechanical prostheses, heat exchangers, and oil and gas piping. Finally, conclusions are presented regarding the experience of merging CAD and CAE theory to develop the kernel, resulting in a tool adaptable to various applications in the metalworking industry.
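The abstract does not describe the kernel's internal design; a minimal sketch of one plausible core, a mesh structure shared by the CAD (geometry-building) and CAE (analysis-attribute) layers, with all names hypothetical:

    from dataclasses import dataclass, field

    @dataclass
    class Node:
        x: float; y: float; z: float

    @dataclass
    class Element:
        node_ids: tuple[int, ...]   # connectivity into Mesh.nodes
        material_id: int = 0

    @dataclass
    class Mesh:
        # Single structure the CAD layer builds and the CAE layer analyzes.
        nodes: list[Node] = field(default_factory=list)
        elements: list[Element] = field(default_factory=list)
        loads: dict[int, tuple[float, float, float]] = field(default_factory=dict)

        def add_node(self, x, y, z) -> int:
            self.nodes.append(Node(x, y, z))
            return len(self.nodes) - 1

Keeping geometry and analysis attributes on one structure is what lets a single kernel serve design, inspection, and simulation front ends.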
Abstract:
Model-driven strategies have been used to facilitate product customization in the software product line (SPL) context and to generate the source code of the derived products through variability management. Most of these strategies use a UML (Unified Modeling Language)-based model specification. Despite its wide application, UML-based model specification has some limitations: it is essentially graphical, it has deficiencies in precisely describing the semantics of the system architecture, and it produces large models, thus hampering the visualization and comprehension of the system's elements. In contrast, architecture description languages (ADLs) provide graphical and textual support for the structural representation of architectural elements, their constraints, and their interactions. This thesis introduces ArchSPL-MDD, a model-driven strategy in which models are specified and configured using the LightPL-ACME ADL. The strategy is associated with a generic process with systematic activities that enable customized source code to be generated automatically from the product model. The ArchSPL-MDD strategy integrates aspect-oriented software development (AOSD), model-driven development (MDD), and SPL, thus enabling the explicit modeling and modularization of variabilities and crosscutting concerns. The process is instantiated by the ArchSPL-MDD tool, which supports the specification of domain models (the focus of the development) in LightPL-ACME. ArchSPL-MDD uses the Ginga digital TV middleware as a case study. To evaluate the efficiency, applicability, expressiveness, and complexity of the ArchSPL-MDD strategy, a controlled experiment was carried out to compare the ArchSPL-MDD tool with the GingaForAll tool, which instantiates the process of the UML-based GingaForAll strategy. Both tools were used to configure the products of the Ginga SPL and to generate the product source code.
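LightPL-ACME syntax is not shown in the abstract; a generic sketch of the variability-resolution step such a strategy automates, mapping a product configuration onto code fragments (all names hypothetical, not Ginga's actual modules):

    # Hypothetical product configuration: selected features of a Ginga-like SPL.
    features = {"tuner": True, "closed_captions": False, "return_channel": True}

    # Variation points map feature names to code fragments to emit.
    templates = {
        "tuner":           "middleware.install(TunerModule())",
        "closed_captions": "middleware.install(CaptionModule())",
        "return_channel":  "middleware.install(ReturnChannelModule())",
    }

    # Product derivation: keep only fragments whose feature is selected.
    product_code = [code for feat, code in templates.items() if features[feat]]
    print("\n".join(product_code))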
Abstract:
Using formal methods, developers can increase software trustworthiness and correctness, and can concentrate on the functional requirements of the software. However, there is much resistance to adopting this development approach, mainly because of the scarcity of adequate, easy-to-use, useful tools. Developers typically write code and test it; these tests usually consist of executing the program and checking its output against the requirements, which is not always an exhaustive discipline. Using formal methods, on the other hand, one can investigate the system's properties further. Unfortunately, specification languages do not always have tools such as animators or simulators, and sometimes lack friendly graphical user interfaces. Specification languages usually do have a compiler, which normally generates a labeled transition system (LTS). This work proposes an application that provides graphical animation for formal specifications, using the LTS as input. The application initially supports the languages B, CSP, and Z; however, given an LTS in a specified XML format, further languages can be animated. Additionally, the tool displays traces, that is, the choices the user made, as a graphical tree. The intention is to improve the comprehension of a specification by providing information about errors and by animating it, as developers do for programming languages such as Java and C++.
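The XML format for the LTS is defined by the work but not reproduced in the abstract; a minimal sketch of parsing and stepping through a hypothetical layout:

    import xml.etree.ElementTree as ET

    # Hypothetical XML layout (the thesis defines its own format).
    LTS_XML = """
    <lts initial="s0">
      <transition from="s0" label="coin"   to="s1"/>
      <transition from="s1" label="coffee" to="s0"/>
      <transition from="s1" label="tea"    to="s0"/>
    </lts>
    """

    root = ET.fromstring(LTS_XML)
    state = root.get("initial")
    trace = []
    for _ in range(3):  # animate three steps, always taking the first option
        options = [t for t in root.findall("transition") if t.get("from") == state]
        if not options:
            break
        chosen = options[0]
        trace.append(chosen.get("label"))
        state = chosen.get("to")
    print("trace:", " -> ".join(trace))

In the actual tool the user, not the program, would pick among the enabled transitions at each step, and the accumulated choices form the trace tree.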
Abstract:
At the beginning of the 21st century, geology is moving toward new approaches that demand the ability to work with different kinds of information and new tools. Within this context, analog characterization is important for predicting and understanding lateral changes in geometry and facies distribution. In the present work, a methodology was developed for integrating geological and geophysical data from recent transitional deposits, modeling petroleum reservoirs, calculating volumes, and assessing the uncertainties associated with those volumes. For this purpose, planialtimetric and geophysical (ground-penetrating radar) surveys were carried out in three areas of the Parnaíba River. With this information, it was possible to visualize the overlap of different estuarine channels and to delimit the channel geometry (width and thickness). For three-dimensional visualization and modeling, two of the main reservoir-modeling software packages were used. These studies were performed with the collected parameters and data from two reservoirs. The first was built from Potiguar Basin well data available in the literature, corresponding to the Açu IV unit; in the second case, a real database from the North Sea was used. In the reservoir-modeling procedures, different workflows were created, generating five study cases with their respective volume calculations. An analysis was then carried out to quantify the uncertainties in the geological modeling and their influence on the calculated volume. This analysis was designed to test the effect of the generating seed and of the use of analog data in model construction.
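The abstract does not state the volume formula; the standard volumetric equation, STOOIP = GRV × NTG × φ × (1 − Sw) / Bo, combined with Monte Carlo sampling, is one common way to quantify uncertainty in such volumes. A sketch under that assumption (all ranges illustrative):

    import random

    # Standard volumetric equation (not given in the abstract).
    def stooip(grv, ntg, phi, sw, bo):
        return grv * ntg * phi * (1.0 - sw) / bo

    random.seed(42)  # the "generating seed" the analysis varies
    volumes = []
    for _ in range(10_000):
        volumes.append(stooip(
            grv=random.uniform(4e8, 6e8),    # gross rock volume, m3
            ntg=random.uniform(0.5, 0.8),    # net-to-gross
            phi=random.uniform(0.15, 0.25),  # porosity
            sw=random.uniform(0.2, 0.4),     # water saturation
            bo=random.uniform(1.1, 1.3),     # formation volume factor
        ))

    volumes.sort()
    p10, p50, p90 = (volumes[int(f * len(volumes))] for f in (0.1, 0.5, 0.9))
    print(f"P10={p10:.3e}  P50={p50:.3e}  P90={p90:.3e} m3")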
Abstract:
Logic courses represent a pedagogical challenge, and the recorded number of failures and dropouts in them is often high. Among other difficulties, students face a cognitive overload when trying to understand logical concepts in a meaningful way. Computational learning tools are resources that help both to alleviate cognitive overload and to allow practical experimentation with theoretical concepts. The present study proposes an interactive tutorial, named TryLogic, aimed at teaching how to solve logical conjectures either by proofs or by refutations. The tool was developed from the architecture of TryOcaml, using the ProofWeb web interface to communicate with the Coq proof assistant. The goals of TryLogic are: (1) to present a set of lessons for applying heuristic strategies to solve problems in propositional logic; (2) to organize the exposition of concepts related to natural deduction and propositional semantics in sequential steps; (3) to provide interactive tasks to the students. The present study also: presents our implementation of a formal system for refutation; describes the integration of our infrastructure with the Moodle virtual learning environment through the IMS Learning Tools Interoperability specification; presents the conjecture generator used for the proving and refuting tasks; and, finally, evaluates the learning experience of logic students through the conjecture-solving task combined with the use of TryLogic.
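The thesis's conjecture generator is not described in the abstract; a minimal sketch of the idea, generating random propositional formulas and routing tautologies to the proof task and non-tautologies to the refutation task (a hypothetical design, not the actual implementation):

    import itertools, random

    VARS = ["p", "q", "r"]

    def random_formula(depth=2):
        if depth == 0 or random.random() < 0.3:
            return random.choice(VARS)
        op = random.choice(["and", "or", "->"])
        return (op, random_formula(depth - 1), random_formula(depth - 1))

    def evaluate(f, env):
        if isinstance(f, str):
            return env[f]
        op, a, b = f
        va, vb = evaluate(a, env), evaluate(b, env)
        return {"and": va and vb, "or": va or vb, "->": (not va) or vb}[op]

    def is_tautology(f):
        return all(evaluate(f, dict(zip(VARS, vals)))
                   for vals in itertools.product([False, True], repeat=len(VARS)))

    f = random_formula()
    # Tautologies go to the proof task; non-tautologies to the refutation task.
    print(f, "=> prove" if is_tautology(f) else "=> refute")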
Abstract:
This work presents the results of a study on the digital mapping of analogs of the fluvial oil reservoirs of the Açu Formation. After regional reconnaissance in the southern edge of the Potiguar Basin, an area of 150 km² west of the city of Assu was selected. In this area, outcrops were chosen for digital mapping, and from field data and remote sensing the depositional architecture of the fluvial deposits, named here a coarse meandering fluvial system, was established. Within the deposits, three fluvial cycles were identified, separated by fifth-order bounding surfaces. These cycles are predominantly sandy, with fining-upward sequences capped by floodplain deposits. Within the sandy channel-fill levels, smaller cycles were characterized, normally incomplete, constituted by braided sandy bodies and fourth-order bounding surfaces. In the mapped area, an outcrop with good exposure was chosen, where typical channel-fill deposits could be observed, and it was there that the digital mapping was carried out. Several techniques and tools were used in this outcrop, integrating sedimentological, altimetric (GPS, total station), LIDAR (Light Detection and Ranging), high-resolution digital photomosaic, and internal-geometry (ground-penetrating radar) data sets. The GoCAD® software was used for data integration, interpretation, and visualization. The final product of the digital outcrop mapping was a photorealistic model of part of the cliff, since the reflectors expected in the radargrams were absent. A portion of an oblique-accretion bar was modeled from a 200 × 200 m GPR grid in the alluvial Assu River, a probable recent analog. From the internal-geometry data, the three-dimensional sedimentary architecture was developed, in which sand-sheet deposits and several hierarchies of braided channels could be characterized. Finally, simulations of the sedimentary geometries and architectures of the Potiguar Basin fluvial reservoirs were carried out with the PetBool software, in order to assess the program's capacity for simulations with large numbers of conditioning wells. In total, 45 simulations were run, in which run time and the number of channels increased with the number of conditioning wells. Deformation of the meanders was detected when the dimensions of the simulated domain were changed; this problem arises from the relationship between the simulated domain and the meander width.
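PetBool's algorithm is not described in the abstract; a minimal sketch of generic object-based (Boolean) channel simulation with well conditioning by rejection, which also suggests why run time grows with the number of conditioning wells (more wells mean more rejected realizations):

    import random

    # Hypothetical 1D cross-section: (x position, 1 = well intersects a channel).
    wells = [(120.0, 1), (340.0, 0), (610.0, 1)]

    def place_channel(domain=1000.0, width=(40.0, 120.0)):
        x = random.uniform(0.0, domain)
        w = random.uniform(*width)
        return (x - w / 2, x + w / 2)

    def honors_wells(channels):
        for x, must_hit in wells:
            hit = any(lo <= x <= hi for lo, hi in channels)
            if hit != bool(must_hit):
                return False
        return True

    random.seed(0)
    attempts = 0
    while True:  # reject realizations until every well is honored
        attempts += 1
        realization = [place_channel() for _ in range(4)]
        if honors_wells(realization):
            break
    print(f"accepted after {attempts} attempts: {realization}")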