68 results for "Simulação por computador" (computer simulation)

at the Universidade Federal do Rio Grande do Norte (UFRN)


Relevance:

30.00%

Publisher:

Abstract:

Currently there is still a high demand for quality control in the manufacturing of mechanical parts. This keeps alive the need to inspect final products, from dimensional analysis to chemical composition. This task is usually performed through various nondestructive and destructive methods that ensure the integrity of the parts. The results generated by these modern inspection tools often cannot geometrically define the real damage and therefore cannot be properly displayed on a computer screen. Virtual 3D visualization may help identify damage that would hardly be detected by other methods. Some commercial software packages seek to address the stages of design and simulation of mechanical parts in order to predict possible damage and diminish potentially undesirable events. However, the challenge of developing software capable of integrating the various activities of design, product inspection, non-destructive testing results and damage simulation still needs the attention of researchers. This was the motivation to conduct a methodological study on the implementation of a versatile CAD/CAE computational kernel capable of helping programmers develop software for the design and simulation of mechanical parts under stress. This research presents results obtained with the developed kernel, showing that it was successfully applied to design case studies involving parts with specific geometries, namely mechanical prostheses, heat exchangers, and oil and gas piping. Finally, conclusions are presented regarding the experience of merging CAD and CAE theories to develop the kernel so as to produce a tool adaptable to various applications in the metalworking industry.
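As an illustration of what such a kernel's programming interface might look like, the sketch below unifies part geometry and a simple stress check behind one abstraction; all class names, methods and values are hypothetical, not the developed kernel's actual API.

```python
# Hypothetical sketch of a CAD/CAE-style kernel interface: geometry classes
# expose the data an analysis routine needs. Not the actual kernel's API.
import math
from abc import ABC, abstractmethod
from dataclasses import dataclass

@dataclass
class Material:
    name: str
    yield_strength: float  # Pa

class Part(ABC):
    """Base class unifying design geometry (CAD) and analysis (CAE)."""
    def __init__(self, material: Material):
        self.material = material

    @abstractmethod
    def cross_section_area(self) -> float:
        """Cross-sectional area (m^2) used by the stress check below."""

    def axial_stress(self, load_n: float) -> float:
        """Uniform axial stress (Pa) under a tensile load (N)."""
        return load_n / self.cross_section_area()

    def is_safe(self, load_n: float, safety_factor: float = 2.0) -> bool:
        return self.axial_stress(load_n) * safety_factor <= self.material.yield_strength

class Pipe(Part):
    """Example geometry: oil and gas piping, one of the cited case studies."""
    def __init__(self, material: Material, outer_d: float, wall_t: float):
        super().__init__(material)
        self.outer_d, self.wall_t = outer_d, wall_t

    def cross_section_area(self) -> float:
        inner_d = self.outer_d - 2 * self.wall_t
        return math.pi / 4 * (self.outer_d**2 - inner_d**2)

steel = Material("API 5L X65 (illustrative)", yield_strength=450e6)
pipe = Pipe(steel, outer_d=0.168, wall_t=0.011)
print(f"stress at 1 MN: {pipe.axial_stress(1e6) / 1e6:.0f} MPa; safe: {pipe.is_safe(1e6)}")
```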

Relevance:

20.00%

Publisher:

Abstract:

LINS, Filipe C. A. et al. Modelagem dinâmica e simulação computacional de poços de petróleo verticais e direcionais com elevação por bombeio mecânico. In: CONGRESSO BRASILEIRO DE PESQUISA E DESENVOLVIMENTO EM PETRÓLEO E GÁS, 5. 2009, Fortaleza, CE. Anais... Fortaleza: CBPDPetro, 2009.

Relevance:

20.00%

Publisher:

Abstract:

This thesis describes and demonstrates a concept developed to facilitate the use of thermal simulation tools during the building design process. Despite the impact of architectural elements on building performance, some influential decisions are frequently based solely on qualitative information. Even though such design support is adequate for most decisions, the designer will eventually have doubts concerning the performance of some design decisions. These situations require some kind of additional knowledge to be properly approached. The concept of designerly ways of simulating focuses on the formulation and solution of design dilemmas, which are doubts about the design that cannot be fully understood or solved without quantitative information. The concept intends to combine the analytical power of computer simulation tools with the capacity for synthesis of architects. Three types of simulation tools are considered: solar analysis, thermal/energy simulation and CFD. Design dilemmas are formulated and framed according to the architect's process of reflection about performance aspects. Throughout the thesis, the problem is investigated in three fields: professional, technical and theoretical. This approach to distinct parts of the problem aimed to i) characterize different professional categories with regard to their design practice and use of tools, ii) investigate previous research on the use of simulation tools, and iii) draw analogies between the proposed concept and concepts developed or described in previous works on design theory. The proposed concept was tested on eight design dilemmas extracted from three case studies in the Netherlands. The three investigated processes are houses designed by Dutch architectural firms. Relevant information and criteria for each case study were obtained through interviews and conversations with the architects involved. The practical application, despite its success in the research context, allowed the identification of some limitations to the applicability of the concept, concerning the architects' need for technical knowledge and the current stage of evolution of simulation tools.

Relevance:

20.00%

Publisher:

Abstract:

This Master's thesis makes a comparative study between internal air temperature data simulated with the thermal application DesignBuilder 1.2 and data recorded in loco with a HOBO® Temp Data Logger in a Social Housing Prototype (HIS) located at the Central Campus of the Federal University of Rio Grande do Norte (UFRN). The prototype was designed and built following thermal comfort strategies recommended for the local climate, using cellular concrete panels, by Construtora DoisA, a collaborator in the research project REPESC, Rede de Pesquisa em Eficiência Energética de Sistemas Construtivos (Research Network on Energy Efficiency of Construction Systems), an integral part of the Habitare program. The methodology examined the problem, reviewed the bibliography, and analyzed the major aspects related to computer simulation of the thermal performance of buildings, such as the climate characterization of the region under study and users' thermal comfort demands. DesignBuilder 1.2 was used as the simulation tool, theoretical alterations were made to the prototype, and the results were compared with the thermal comfort parameters adopted, based on the area's current technical literature. The comparative studies were analyzed through graphical outputs for a better understanding of air temperature amplitudes and thermal comfort conditions. The data used to characterize the external air temperature were obtained from the Test Reference Year (TRY) defined for the study area (Natal-RN). The author also performed comparative studies against data recorded in 2006, 2007 and 2008 at the Davis Precision Station weather station, located at the Instituto Nacional de Pesquisas Espaciais (INPE-CRN, National Institute of Space Research), in an area neighboring UFRN's Central Campus. The conclusions drawn from the comparisons between the computer simulations and the records obtained from the studied prototype point out that simulating naturally ventilated buildings is quite a complex task, owing to the application's limitations and mainly to the complexity of airflow phenomena, the influence of the comfort conditions of the surrounding areas, and the climate records. Lastly, regarding the use of DesignBuilder 1.2 in the present study, one may conclude that it is a good tool for computer simulation, although it needs some adjustments to improve the reliability of its use. Continued research is needed, considering the occupancy of the prototype by users as well as the thermal loads of the equipment, in order to check sensitivity.
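A minimal sketch of the core comparison the study performs, simulated against logged internal air temperatures, using mean bias error and RMSE as summary statistics; the arrays below are placeholder data, not the thesis measurements.

```python
# Compare simulated hourly air temperatures (e.g., exported from
# DesignBuilder) against logged ones (e.g., from a HOBO data logger).
import numpy as np

simulated = np.array([27.1, 27.8, 28.9, 30.2, 31.0, 30.4, 29.1, 28.0])  # °C
measured  = np.array([26.8, 27.5, 29.3, 30.9, 31.6, 30.7, 29.0, 27.7])  # °C

error = simulated - measured
mbe  = error.mean()                 # mean bias error: systematic offset
rmse = np.sqrt((error**2).mean())   # RMSE: overall disagreement
print(f"MBE = {mbe:+.2f} °C, RMSE = {rmse:.2f} °C")
```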

Relevance:

20.00%

Publisher:

Abstract:

Natural ventilation is the most important passive strategy for providing thermal comfort in hot and humid climates, and a significant low-energy strategy. However, a naturally ventilated building requires more attention during architectural design than a conventional building with air-conditioning systems, and the results are less reliable. Therefore, this thesis focuses on software tools and methods to predict natural ventilation performance from the point of view of the architect, with limited resources and limited knowledge of fluid mechanics. A typical prefabricated building was modelled because of its simplified geometry, low cost and occurrence on the local campus. Firstly, the study emphasized the use of computational fluid dynamics (CFD) software to simulate the air flow outside and inside the building. A series of approaches was developed to make the simulations feasible, at some cost to the fidelity of the results. Secondly, the results of the CFD simulations were used as input to an energy tool to simulate the thermal performance under different air renewal rates. Thirdly, the resulting temperatures were assessed in terms of thermal comfort. Complementary simulations were carried out to detail the analyses. The results show the potential of these tools; however, the discussion of the simplifications in the approaches, the limitations of the tools and the level of knowledge of the average architect is the major contribution of this study.
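A sketch of the CFD-to-energy-tool hand-off described above, assuming illustrative values: a CFD-derived air change rate is converted into the sensible ventilation heat flow a thermal model would use.

```python
# Sensible heat exchanged by ventilation for a CFD-derived air change rate.
# All input values are illustrative, not results from the thesis.
RHO_AIR = 1.2     # air density, kg/m^3
CP_AIR  = 1005.0  # specific heat of air, J/(kg·K)

def ventilation_heat_flow(ach: float, volume_m3: float,
                          t_in: float, t_out: float) -> float:
    """Sensible heat removed by ventilation (W) at a given air change rate."""
    mass_flow = RHO_AIR * ach * volume_m3 / 3600.0  # kg/s
    return mass_flow * CP_AIR * (t_in - t_out)

# e.g., a 120 m^3 room at 30 °C, outdoor air at 27 °C, 15 air changes/hour
print(f"{ventilation_heat_flow(15, 120.0, 30.0, 27.0):.0f} W")
```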

Relevance:

20.00%

Publisher:

Abstract:

An unfolding method for linear intercept and section area distributions was implemented for structures with spherical grains. Although the unfolding routine depends on the grain shape, structures with spheroidal grains can also be treated by this routine; grains of non-spheroidal shape can be treated only as an approximation. A software tool was developed in two parts: the first calculates the probability matrix, and the second uses this matrix and minimizes the chi-square statistic. The results are presented with any number of size classes, as required. The probability matrix was determined by means of linear intercept and section area distributions created by computer simulation. Using curve fitting, the probability matrix for spheres of any size could be determined. Two kinds of tests were carried out to prove the efficiency of the technique. The theoretical tests represent ideal cases, and the software was able to recover the proposed grain size distribution exactly. In the second test, a structure was simulated in the computer and images of its slices were used to produce the corresponding linear intercept and section area distributions, which were then unfolded. This test better approximates reality, and its results show deviations from the real size distribution caused by statistical fluctuations. The unfolding of the linear intercept distribution works perfectly, but the unfolding of the section area distribution does not, owing to a failure in the chi-square minimization: the minimization method uses a matrix inversion routine, and the matrix generated by this procedure cannot be inverted. Another minimization method must be used.
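The failing step is the chi-square minimization via matrix inversion. One standard alternative, not necessarily the one the author ultimately adopted, is to pose the unfolding as a non-negative least-squares problem, sketched below with a made-up 3x3 probability matrix.

```python
# Unfold a measured section/intercept distribution without inverting the
# probability matrix P, where P[i][j] is the probability that a grain in
# size class j produces a section in class i. P here is made up.
import numpy as np
from scipy.optimize import nnls

P = np.array([[0.70, 0.20, 0.05],
              [0.25, 0.60, 0.25],
              [0.05, 0.20, 0.70]])

true_grains = np.array([100.0, 50.0, 20.0])
measured = P @ true_grains                                # ideal measurement
measured += np.random.default_rng(0).normal(0, 1.0, 3)    # statistical noise

# Non-negative least squares: min ||P x - b|| subject to x >= 0
unfolded, residual = nnls(P, measured)
print("unfolded class counts:", np.round(unfolded, 1),
      "residual:", round(residual, 2))
```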

Relevance:

20.00%

Publisher:

Abstract:

This work focuses on the creation and application of dynamic simulation software for studying the structure of hard metal (WC-Co). The hardware used to increase GPU capacity was a GeForce 9600 GT with the PhysX chip, created to make games more realistic. The software simulates the three-dimensional carbide structure inside a cubic box, where the tungsten carbide (WC) grains are modeled as triangular prisms and truncated triangular prisms. The program proved effective in verification tests, ranging from the calculation of parameter measures to its capacity to increase the number of particles simulated dynamically. It was possible to investigate both the mean values and the distributions of the stereological parameters used to characterize the carbide structure through cutting planes. Based on the cutting planes of the analyzed structures, we investigated the linear intercepts, the area intercepts and the perimeters of the sections of the intercepted grains, as well as the binder phase of the structure, by calculating the mean value and distribution of the free path. Since the literature shows, almost by consensus, that the distribution of linear intercepts is lognormal, suggesting that the grain size distribution is also lognormal, a routine was added to the program that made a more detailed investigation of this issue possible. We observed that it is possible, for certain values of the parameters that define the shape and size of the prismatic grains, to obtain linear intercept distributions that approach a lognormal shape. Across the simulations performed, we observed that the distribution curves of the linear and area intercepts, as well as the section perimeters, are consistent with studies of these parameters based on static computer simulation.
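A sketch of the lognormality check described above, using synthetic stand-in data: fit a lognormal distribution to a sample of linear intercept lengths and test the fit.

```python
# Fit a lognormal to intercept lengths and check goodness of fit with a
# KS test. The sample is synthetic, standing in for intercepts measured
# on cutting planes through the simulated carbide structure.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
intercepts = rng.lognormal(mean=0.5, sigma=0.4, size=2000)  # stand-in data (µm)

shape, loc, scale = stats.lognorm.fit(intercepts, floc=0)   # fix loc at 0
ks = stats.kstest(intercepts, 'lognorm', args=(shape, loc, scale))
print(f"sigma={shape:.3f}, median={scale:.3f} µm, KS p-value={ks.pvalue:.3f}")
```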

Relevance:

20.00%

Publisher:

Abstract:

Until the early 1990s, the simulation of fluid flow in oil reservoirs essentially used the numerical technique of finite differences. Since then, there has been great development in streamline-based simulation technology, so that nowadays it is used in several cases and can represent the physical mechanisms that influence fluid flow, such as compressibility, capillarity and gravitational segregation. Streamline-based flow simulation is a tool that can greatly help in waterflood project management, because it provides important information not available through traditional finite-difference simulation and shows, in a direct way, the influence between injector and producer wells. This work applies a methodology published in the literature for optimizing water injection projects to a model of a Brazilian Potiguar Basin reservoir with a large number of wells. This methodology considers changes to injection well rates over time, based on information made available by streamline simulation: it reduces the injection rates of less efficient wells and increases those of more efficient wells. In the proposed model, the methodology was effective. The optimized alternatives presented higher oil recovery combined with a lower water injection volume, showing better efficiency and, consequently, lower costs. Considering the wide use of water injection in oil fields, this positive outcome is important, because it shows a case study in which increased oil recovery was achieved simply through a better distribution of water injection rates.
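A minimal sketch of the reallocation rule described above: shift water from less efficient to more efficient injectors while keeping the total injection volume constant. The proportional weighting scheme and numbers are illustrative, not the published methodology's exact formula.

```python
# Redistribute a fixed total injection rate in proportion to per-injector
# efficiency (e.g., offset oil produced per water injected, as obtained
# from streamline allocation). All values are illustrative.
def reallocate_rates(rates, efficiencies):
    """Redistribute the same total rate, weighting each well by efficiency."""
    total = sum(rates)
    weight_sum = sum(e * r for e, r in zip(efficiencies, rates))
    return [total * (e * r) / weight_sum for e, r in zip(efficiencies, rates)]

rates = [500.0, 500.0, 500.0]   # m^3/day per injector
effs  = [0.8, 0.5, 0.2]         # dimensionless injector efficiency
new_rates = reallocate_rates(rates, effs)
print([round(r, 1) for r in new_rates], "total:", round(sum(new_rates), 1))
# -> [800.0, 500.0, 200.0] total: 1500.0 (volume conserved)
```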

Relevance:

20.00%

Publisher:

Abstract:

In Brazil and around the world, oil companies are looking for, and expect the development of, new technologies and processes that can increase the oil recovery factor in mature reservoirs in a simple and inexpensive way. Recent research has developed a new process called Gas Assisted Gravity Drainage (GAGD), classified as a gas injection IOR (improved oil recovery) method. The process, which is undergoing pilot testing in the field, has been extensively studied through physical scale models and laboratory core floods, owing to its high oil recoveries relative to other gas injection IOR methods. It consists of injecting gas at the top of a reservoir through horizontal or vertical injector wells and displacing the oil, taking advantage of the natural gravity segregation of the fluids, toward a horizontal producer well placed at the bottom of the reservoir. To study this process, a homogeneous reservoir and a multi-component fluid model with characteristics similar to light oils from Brazilian fields were built in a compositional simulator in order to optimize the operational parameters. The process was simulated in GEM (CMG, 2009.10). The operational parameters studied were the gas injection rate, the type of injected gas, and the locations of the injector and producer wells; the presence of a water drive was also studied. The results showed that the maximum vertical spacing between the two wells produced the maximum oil recovery in GAGD, and that the highest injection rate yielded the highest recovery factors. This parameter controls the velocity of the injected gas front and determines whether or not gravitational forces dominate the oil recovery process. Natural gas performed better than CO2, and the presence of an aquifer in the reservoir had little influence on the process. The economic analysis found that injecting natural gas is more economically beneficial than injecting CO2.
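The abstract notes that the injection rate determines whether gravity dominates the displacement. One common way to quantify that balance, not necessarily the criterion used in the thesis, is a gravity number comparing buoyancy to viscous forces, sketched below with illustrative values.

```python
# Gravity number N_g = k * Δρ * g / (µ_o * v): the ratio of buoyancy to
# viscous forces. N_g >> 1 suggests a gravity-dominated (stable,
# GAGD-like) displacement. All input values are illustrative.
def gravity_number(k_m2: float, drho: float, mu_pa_s: float,
                   v_m_s: float, g: float = 9.81) -> float:
    return k_m2 * drho * g / (mu_pa_s * v_m_s)

k    = 500e-15   # permeability: 500 mD in m^2
drho = 700.0     # oil-gas density difference, kg/m^3
mu   = 2e-3      # oil viscosity, Pa·s
v    = 1e-6      # gas-front velocity, m/s (set by the injection rate)
print(f"N_g = {gravity_number(k, drho, mu, v):.2f}")
```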

Relevance:

20.00%

Publisher:

Abstract:

This work presents the development of a model and computer simulation of a sucker rod pumping system. The model takes into account the well geometry, the flow through the tubing, the dynamic behavior of the rod string and an induction motor model. The rod string was modeled using lumped (concentrated) parameters, allowing systems of ordinary differential equations to be used to simulate its behavior.
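A minimal sketch of the lumped-parameter idea under illustrative values: the rod string reduced to a single damped spring-mass element driven by the polished-rod motion and integrated as an ODE system. The actual model couples many elements with the well geometry, tubing flow and the motor model.

```python
# Single lumped element of a rod string: M x'' + C x' + K x = K u(t),
# where u(t) is the surface (polished rod) position. Parameters are
# illustrative, not from the dissertation.
import numpy as np
from scipy.integrate import solve_ivp

M, K, C = 1500.0, 8.0e4, 400.0        # mass (kg), stiffness (N/m), damping (N·s/m)
OMEGA, STROKE = 2 * np.pi / 6.0, 1.5  # pumping unit: 6 s period, 1.5 m stroke

def surface_position(t):
    """Simplified harmonic approximation of the pumping-unit motion."""
    return 0.5 * STROKE * (1 - np.cos(OMEGA * t))

def rod_ode(t, y):
    x, v = y                          # downhole position and velocity
    force = K * (surface_position(t) - x) - C * v
    return [v, force / M]

sol = solve_ivp(rod_ode, (0.0, 30.0), [0.0, 0.0], max_step=0.01)
late = sol.y[0][sol.t > 20]           # discard the start-up transient
print(f"downhole stroke: {np.ptp(late):.2f} m (surface stroke {STROKE} m)")
```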

Relevance:

20.00%

Publisher:

Abstract:

Oil production and exploration techniques have evolved in recent decades in order to increase fluid flow rates and optimize the use of the required equipment. The Electric Submersible Pumping (ESP) lift method is based on an electric downhole motor that drives a centrifugal pump to transport the fluids to the surface. ESP is an option that has been gaining ground among artificial lift methods due to its ability to handle large liquid flow rates in onshore and offshore environments. The performance of a well equipped with an ESP system is intrinsically related to the operation of the centrifugal pump, which has the function of converting the motor power into head. In this work, a computational model was developed to analyze the three-dimensional flow in a centrifugal pump used in Electric Submersible Pumping. Using the commercial program ANSYS® CFX®, initially with water as the working fluid, the geometry and simulation parameters were defined in order to approximate the flow inside the channels of the pump impeller and diffuser. Three different geometry conditions were initially tested to determine which was most suitable for solving the problem. After choosing the most appropriate geometry, three mesh conditions were analyzed, and the obtained values were compared with the experimental head curve provided by the manufacturer. The results approached the experimental curve, and the simulation time and model convergence were satisfactory considering that the problem involves numerical analysis. After the tests with water, oil was used in the simulations, and the results were compared with a methodology used in the petroleum industry to correct for viscosity. In general, for the models with water and with oil, the single-phase results were consistent with the experimental curves; as three-dimensional computational models, they are a preliminary step toward the analysis of two-phase flow inside the channels of centrifugal pumps used in ESP systems.
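As a small illustration of the reduction step implied above, turning a simulated pressure rise into head for comparison with the manufacturer's curve, the sketch below uses the standard relation H = Δp / (ρ g); the pressure rise and fluid densities are illustrative, not values from the study.

```python
# Head (meters of pumped-fluid column) from the pressure rise across one
# pump stage, the way a CFD result would be reduced for curve comparison.
G = 9.81  # gravitational acceleration, m/s^2

def head_m(delta_p_pa: float, rho_kg_m3: float) -> float:
    """Head in meters of fluid column for a given stage pressure rise."""
    return delta_p_pa / (rho_kg_m3 * G)

# water vs a lighter oil at the same (illustrative) stage pressure rise
for fluid, rho in (("water", 998.0), ("oil", 880.0)):
    print(f"{fluid}: {head_m(55_000.0, rho):.2f} m per stage")
```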

Relevance:

20.00%

Publisher:

Abstract:

The present study provides a methodology that gives a predictive character to computer simulations based on detailed models of the geometry of a porous medium. We use the software FLUENT to investigate the flow of a viscous Newtonian fluid through a random fractal medium, a simplified two-dimensional disordered porous medium representing a petroleum reservoir. This fractal model is formed by obstacles of various sizes whose size distribution function follows a power law, the exponent of which defines the fractal fragmentation dimension Dff of the model, characterizing the fragmentation process of the obstacles. The obstacles are randomly placed in a rectangular channel. The modeling process incorporates modern concepts and scaling laws to analyze the influence of the heterogeneity found in the porosity and permeability fields, so as to characterize the medium in terms of its fractal properties. This procedure allows us to numerically analyze the measured permeability k and drag coefficient Cd, and to propose power-law relationships for these properties across various modeling schemes. The purpose of this research is to study the variability introduced by these heterogeneities, where the velocity field and other details of the viscous fluid dynamics are obtained by numerically solving the continuity and Navier-Stokes equations at the pore level, and to observe how the fractal fragmentation dimension of the model affects its hydrodynamic properties. Two classes of models were considered: models with constant porosity (MPC) and models with varying porosity (MPV). The results allowed us to find numerical relationships between the permeability, the drag coefficient and the fractal fragmentation dimension of the medium. Based on these numerical results, we proposed scaling relations and algebraic expressions involving the relevant parameters of the phenomenon. Analytical equations were determined for Dff as a function of the geometrical parameters of the models, and we also found that the permeability and the drag coefficient are inversely proportional to each other. The difference in behavior is most striking in the MPV class of models: the fact that the porosity varies in these models is an additional factor that plays a significant role in the flow analysis. Finally, the results proved satisfactory and consistent, demonstrating the effectiveness of the methodology for all the applications analyzed in this study.
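A sketch of how obstacle sizes could be drawn for such a fractal model: inverse-CDF sampling from a bounded power law whose exponent is tied to a fragmentation dimension Dff. The exponent convention, size bounds and Dff value here are illustrative assumptions, not the study's exact construction.

```python
# Sample obstacle sizes from a bounded power law p(r) ∝ r^-(dff+1),
# a common form for fragmentation-generated size distributions.
import numpy as np

def powerlaw_sizes(n, r_min, r_max, dff, rng):
    """Inverse-CDF sampling of p(r) ∝ r^-(dff+1) on [r_min, r_max]."""
    u = rng.random(n)
    a, b = r_min ** (-dff), r_max ** (-dff)
    return (a - u * (a - b)) ** (-1.0 / dff)

rng = np.random.default_rng(7)
sizes = powerlaw_sizes(10_000, r_min=0.5, r_max=20.0, dff=1.6, rng=rng)
print(f"min={sizes.min():.2f}, max={sizes.max():.2f}, mean={sizes.mean():.2f}")
```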

Relevance:

20.00%

Publisher:

Abstract:

Petroleum evaluation consists of analyzing oil using different methodologies, following international standards, to determine its chemical and physicochemical properties, contaminant levels, composition and, especially, its ability to generate derivatives. Many of these analyses consume a lot of time and large amounts of sample and supplies, and require organized logistics for transportation, scheduling and the professionals involved. Looking for alternatives that optimize the evaluation and enable the use of new technologies, seven different centrifuged Brazilian oil samples previously characterized by Petrobras were analyzed by thermogravimetry in the 25-900 °C range, using heating rates of 5, 10 and 20 °C per minute. From the experimental data, characterization correlations were developed that provided: simulated true boiling point (TBP) curves; a comparison of the generated fractions with standard cuts in the appropriate temperature ranges; an approach for obtaining the Watson characterization factor; and a comparison of the micro carbon residue formed. The results showed a good chance of reproducing the simulated TBP curve from thermogravimetry, taking into account the composition, density and other oil properties. The proposed correlations for the experimental characterization factor and carbon residue agreed with the Petrobras characterizations, showing that thermogravimetry can be used as a tool in oil evaluation, given its quick analysis, its accuracy, and its minimal requirements in terms of samples and consumables.
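For reference, the Watson characterization factor mentioned above is conventionally K_W = T_b^(1/3) / SG, with T_b the mean average boiling point in degrees Rankine and SG the specific gravity. The sketch below computes it for illustrative inputs, the boiling point standing in for one read off a simulated TBP curve.

```python
# Watson characterization factor from mean average boiling point and
# specific gravity. Input values are illustrative.
def watson_k(t_b_celsius: float, specific_gravity: float) -> float:
    """Watson K from boiling point (°C) and SG at 60/60 °F."""
    t_rankine = (t_b_celsius + 273.15) * 1.8  # convert to degrees Rankine
    return t_rankine ** (1.0 / 3.0) / specific_gravity

# e.g., a crude with a 350 °C mean average boiling point and SG 0.87
print(f"K_W = {watson_k(350.0, 0.87):.2f}")
```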

Relevance:

20.00%

Publisher:

Abstract:

The present study analyzes the applicability of computer technology as a mediator in English-language teaching at the Centro Federal de Educação Tecnológica do Rio Grande do Norte (CEFET/RN). The object of study was the use of computers in the teaching of English by four groups at the institution. The research was methodologically grounded in a case study, adopting a qualitative and quantitative approach of an interpretative-reflexive character. We draw on a review of the literature dealing with the use of computer technology in the classroom, aiming at a new educational practice in keeping with current conceptions of the technological education in which we live. We also use a framework for pedagogical action, seeking to support a practice that generates knowledge through interaction and aims at the education of reflexive and critical subjects. To carry out this study, we used structured instruments, such as interviews with teachers and students, in addition to observations of the daily classroom routine, in order to obtain the data needed for analysis. During this study, we observed that the computer, as a pedagogical support instrument in English-language teaching, has acted as a mediator of the teaching-learning process. The results show that computer use has increasingly been adopted by other language teachers at the institution. The conclusions confirm the hypothesis presented at the beginning of the work and show that teachers are responsible for forming thinking, reflexive and critical subjects. To that end, they need to be prepared to face situations in which they can bring their pedagogical practice into tune with technological advances, consequently providing an effective technological education.

Relevance:

20.00%

Publisher:

Abstract:

This study investigates the training and cognitive practice of teachers in a basic education school that adopted the One Computer per Student (OCS) Project in its school routine. Its relevance consists in providing directions for the continuation of the Project's training activities and guiding teachers in their pedagogical practices using the one-to-one laptop model. The thesis defended is that teacher education for the social use of digital media (especially the laptops of the UCA Project) opens space for establishing new sociotechnical relationships, new social and professional practices, new identity components, and a process of reflexivity and reconstruction of the knowledge needed to teach. We reaffirm the importance of reflexivity and of the appropriation of digital culture for the better development of teaching practice using Information and Communication Technologies (ICTs), focusing on the social and professional uses of the technology. The study is qualitative in nature and consists of procedural tracking based on principles of ethnographic research. The procedures and methodological tools used were intensive observation of school environments, documentary analysis, focus groups, semi-structured questionnaires and semi-structured individual interviews. The research was carried out in a public school in the city of Parnamirim, RN. The sample comprises 17 teachers from elementary school I and II, Youth and Adult Education and high school, who went through the UCA training process and introduced the laptops into their teaching. The research corpus is structured from the messages gathered during data collection and is analyzed according to principles of Content Analysis, as specified by Laurence Bardin (2011). Studies by Tardif (2000; 2011), Pimenta (2009), Gorz (2004; 2005), Giddens (1991), Dewey (1916), Bourdieu (1994; 1999) and Freire (1996; 2005), among others, were taken as theoretical references. The analysis indicates a process of reconstruction and revision of the knowledge needed to teach and work in digital culture, guided by the experience of the subjects investigated. The reconstructed knowledge is revealed through a categorization process: the groups of "technical knowledge", "didactic-methodological knowledge" and "professionalization knowledge" were built on the assumption of the appropriation of digital culture in the educational context. The analysis confirms the emergence of new forms of sociability as teachers acquire other ways of acting on and thinking about ICTs, despite an environment adverse to shared reflexivity among the teachers. Based on the concept of appropriation present in the data analysis, it also reveals the construction of meanings of belonging and the transformation of individuals along social paths through the interweaving of teaching practice with digital culture. Finally, it emphasizes the importance of ICT training that goes beyond instrumentation, that is, beyond what we call "technical knowledge", taking as its structural basis shared reflection and openness to resignification and the reconstruction of new knowledge and practices, truly allowing teachers to live an experience capable of providing sociotechnical transformations of their relationships.