130 results for Acoustic architecture - Computer simulation
Abstract:
This paper examines, through case studies, the organization of the production process of architectural projects in architecture offices in the city of Natal, specifically in relation to building projects. The specifics of the design process in architecture, and the production of projects in a professional field in Natal, are studied in light of theories of design and of its production process. The survey, in its different phases, was conducted between March 2010 and September 2012 and aimed to identify, understand, and comparatively analyze, by mapping the design process, the organization of the production of building projects in two offices in Natal, also examining the relationships among their agents during the process. The project was based on desk and exploratory research, adopting, for both, data-collection tools such as forms, questionnaires, and interviews. With the specific aim of mapping the design process, we adopted a technique that obtains information directly from the collaborating agents involved in the production process: each agent recorded, daily, during or at the end of the workday, the tasks performed in an individual virtual agenda. The data collected allowed for the identification of the organizational structure of each office, its hierarchy, the responsibilities of the agents, and the tasks they performed during the two months of monitoring at each office. The research findings were based on analyses of the data collected in the two offices and on comparative studies of the results of these analyses. The end result was a diagnostic evaluation of the level of organization, together with proposed solutions aimed at improving both the organization of the process and the relationships between the agents under the lens analyzed.
Abstract:
This essay studies contemporary architecture in João Pessoa, the capital of Paraíba State, analyzing the use of steel and aluminum from 1990 to 2002, a period in which a significant increase in this use is seen. As references, architectural analysis and Engel's structural definitions were used to examine the 40 built buildings that make up the researched universe, gathering material to understand contemporary production, especially that which uses metal as a structural and aesthetic-formal element.
Abstract:
A conceptual discussion of architectural type and its role in theory and practice supports the construction of an analytical tool for recognizing the typological evolution of hospital architecture in Western societies. The same tool is applied to analyze the typological evolution of hospital architecture in Natal, Brazil, through a sample of eighteen hospitals built in the city since the beginning of the 20th century. The conclusion is that the typological evolution in Natal closely follows the Western one, except for a few singularities that can be explained by local social and economic development.
Abstract:
This master's thesis aims to assess the influence of design decisions on the energy performance of hotel buildings. The research is based on the integration of field study and computer simulation. Firstly, a detailed field study was carried out to identify the characteristics of hotels in Natal, Rio Grande do Norte. The items assessed include occupancy, lighting and equipment densities, types of air conditioning, and total and monthly energy consumption, among others. A second, more comprehensive field study, covering a larger number of buildings, was carried out to identify the range of occurrence of the architectural variables. A base case was modelled in VisualDOE, based on the first field study. A first set of simulations was then run to explore the sensitivity of the energy consumption to each variable. The analysis of these results informed a second set of simulations, which combined the most influential variables. The results of 384 models were assessed, and the impacts of the design decisions were quantified. The study discusses tendencies and recommendations, as well as the method's advantages and disadvantages.
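The combinatorial second set of simulations can be sketched as follows; the variable names and levels are hypothetical (chosen so that, as in the thesis, 384 combinations result), not the study's actual parameters:

```python
from itertools import product

# Hypothetical design variables and levels; 4 * 4 * 4 * 3 * 2 = 384
# combinations, matching the number of models reported above.
variables = {
    "window_to_wall_ratio": [0.2, 0.4, 0.6, 0.8],
    "lighting_density_w_m2": [8, 12, 16, 20],
    "equipment_density_w_m2": [5, 10, 15, 20],
    "glazing": ["single", "double", "low-e"],
    "shading": ["none", "overhang"],
}

names = list(variables)
# One dict per simulation case, covering every combination of levels.
cases = [dict(zip(names, combo)) for combo in product(*variables.values())]
print(len(cases))  # 384
```

Each entry of `cases` would then parameterize one simulation model.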
Abstract:
An unfolding method for linear intercept and section area distributions was implemented for structures with spherical grains. Although the unfolding routine depends on the grain shape, structures with spheroidal grains can also be treated by it; grains of non-spheroidal shape can be treated only approximately. A two-part software package was developed: the first part calculates the probability matrix, and the second part uses this matrix and minimizes the chi-square. The results can be presented with any number of size classes, as required. The probability matrix was determined from linear intercept and section area distributions created by computer simulation; using curve fitting, the probability matrix for spheres of any size could be determined. Two kinds of tests were carried out to prove the efficiency of the technique. The theoretical tests represent ideal cases, and the software was able to recover the proposed grain size distribution exactly. In the second test, a structure was simulated in the computer and images of its slices were used to produce the corresponding linear intercept and section area distributions, which were then unfolded. This test better simulates reality, and the results show deviations from the real size distribution caused by statistical fluctuations. The unfolding of the linear intercept distribution works perfectly, but the unfolding of the section area distribution does not, due to a failure in the chi-square minimization: the minimization method uses a matrix inversion routine, and the matrix generated by this procedure cannot be inverted. Another minimization method must be used.
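As a sketch of the unfolding idea, and of a minimization route that avoids a failed matrix inversion, the probability matrix below is derived analytically from the chord-length density p(l) = 2l/D² of a sphere of diameter D (the thesis fitted its matrix from simulated structures instead), and the folded system is solved by least squares:

```python
import numpy as np

n = 5                                  # number of size / intercept classes
edges = np.linspace(0.0, 1.0, n + 1)   # class boundaries (arbitrary units)
D = edges[1:]                          # sphere diameters: upper class edges

# P[i, j] = probability that a random chord through a sphere of diameter
# D[j] falls into intercept class i, from p(l) = 2*l / D**2.
P = np.zeros((n, n))
for j in range(n):
    upper = np.minimum(edges[1:], D[j])
    lower = np.minimum(edges[:-1], D[j])
    P[:, j] = (upper**2 - lower**2) / D[j]**2

f_true = np.array([0.1, 0.2, 0.4, 0.2, 0.1])   # assumed size distribution
g = P @ f_true                                  # folded intercept distribution

# Least-squares unfolding sidesteps explicit inversion of an
# ill-conditioned matrix inside a chi-square minimizer.
f_rec, *_ = np.linalg.lstsq(P, g, rcond=None)
print(np.allclose(f_rec, f_true))  # True
```

In the ideal (noise-free) case the proposed size distribution is recovered exactly, mirroring the theoretical tests described above.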
Abstract:
This work focuses on the creation and application of dynamic simulation software to study the hard metal (WC-Co) structure. The hardware used to increase computing capacity was a GeForce 9600 GT GPU with the PhysX chip, created to make games more realistic. The software simulates the three-dimensional carbide structure inside a cubic box, where the tungsten carbide (WC) grains are modeled as triangular prisms and truncated triangular prisms. The program proved effective in validation tests, ranging from calculations of parameter measures to its capacity to increase the number of particles simulated dynamically. It was possible to investigate both the mean values and the distributions of the stereological parameters used to characterize the carbide structure through cutting planes. Based on the cutting planes of the analyzed structures, we investigated the linear intercepts, the area intercepts, and the perimeters of the sections of the intercepted grains, as well as the binder phase of the structure, by calculating the mean value and the distribution of the free path. Since the literature shows, almost consensually, that the distribution of the linear intercepts is lognormal, this suggests that the grain size distribution is also lognormal. Thus, a routine was added to the program that made a more detailed investigation of this issue possible. We observed that, for certain values of the parameters that define the shape and size of the prismatic grains, the distribution of the linear intercepts approaches the lognormal shape. Over a number of simulations, we observed that the distribution curves of the linear and area intercepts, as well as of the section perimeters, are consistent with studies based on static computer simulation of these parameters.
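The lognormality check mentioned above can be illustrated with a minimal sketch: sample intercepts from an assumed lognormal law (the parameters are illustrative, not the thesis's data) and recover its parameters from the moments of the log-transformed lengths, which are normal exactly when the lengths are lognormal:

```python
import numpy as np

rng = np.random.default_rng(0)
# Hypothetical lognormal parameters for linear intercepts (arbitrary units).
mu, sigma = 0.0, 0.4
intercepts = rng.lognormal(mu, sigma, size=100_000)

# For a lognormal sample, log(l) is normal, so its sample mean and
# standard deviation estimate mu and sigma directly.
log_l = np.log(intercepts)
mu_hat, sigma_hat = log_l.mean(), log_l.std()
print(abs(mu_hat - mu) < 0.01 and abs(sigma_hat - sigma) < 0.01)  # True
```

A departure of the log-transformed histogram from a normal shape would signal that the intercept distribution only approximates lognormality, as found for some prismatic grain shapes.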
Abstract:
Until the early 1990s, the simulation of fluid flow in oil reservoirs basically used the numerical technique of finite differences. Since then, simulation technology based on streamlines has developed considerably, so that nowadays it is used in several cases and can represent the physical mechanisms that influence fluid flow, such as compressibility, capillarity, and gravitational segregation. Streamline-based flow simulation is a tool that can significantly aid waterflood project management, because it provides important information not available through traditional finite-difference simulation and shows, in a direct way, the influence between injector and producer wells. This work presents the application of a methodology published in the literature for optimizing water injection projects to the model of a Brazilian Potiguar Basin reservoir with a large number of wells. The methodology changes injection well rates over time, based on information available through streamline simulation: it reduces injection rates in less efficient wells and increases them in more efficient ones. In the proposed model, the methodology was effective: the optimized alternatives presented higher oil recovery with a lower injected water volume, showing better efficiency and, consequently, reduced costs. Considering the wide use of water injection in oil fields, this positive outcome is important, because it shows a case study in which increased oil recovery was achieved simply through a better distribution of water injection rates.
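The reallocation rule of the methodology (cut inefficient injectors, boost efficient ones, at constant total volume) can be sketched as below; the efficiency values are hypothetical, standing in for the per-injector efficiencies that streamline simulation provides:

```python
import numpy as np

# Hypothetical injector efficiencies from streamline diagnostics: the
# fraction of each well's injected water that supports oil production.
efficiency = np.array([0.65, 0.40, 0.80, 0.25, 0.55])
rates = np.full(5, 500.0)   # current injection rates, m3/day per well

total = rates.sum()
# Redistribute the same total volume in proportion to efficiency: the
# least efficient wells are cut back, the most efficient are boosted.
new_rates = total * efficiency / efficiency.sum()
print(new_rates.round(1))
```

Repeating this step over time, as efficiencies evolve, gives the rate schedule whose optimized alternatives recovered more oil with less injected water.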
Abstract:
In Brazil and around the world, oil companies seek new technologies and processes that can increase the oil recovery factor in mature reservoirs in a simple and inexpensive way. Recent research has developed a process called Gas Assisted Gravity Drainage (GAGD), classified as a gas injection IOR method. The process, which is undergoing pilot testing in the field, has been extensively studied through physically scaled models and laboratory core floods, owing to its high oil recoveries compared with other gas injection IOR methods. It consists of injecting gas at the top of a reservoir through horizontal or vertical injector wells, displacing the oil, with the help of the natural gravity segregation of the fluids, toward a horizontal producer well placed at the bottom of the reservoir. To study this process, a homogeneous reservoir and a multi-component fluid model with characteristics similar to those of light-oil Brazilian fields were built in a compositional simulator, in order to optimize the operational parameters. The process was simulated in GEM (CMG, 2009.10). The operational parameters studied were the gas injection rate, the type of injected gas, and the locations of the injector and producer wells; the presence of a water drive was also studied. The results showed that the maximum vertical spacing between the two wells yields the maximum oil recovery in GAGD, and that the largest injection rate produced the largest recovery factors. This parameter controls the speed of the injected gas front and determines whether or not the gravitational force dominates the oil recovery process. Natural gas performed better than CO2, and the presence of an aquifer in the reservoir had little influence on the process. The economic analysis found that injecting natural gas is more economically beneficial than injecting CO2.
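Whether gravity dominates the displacement, as the text discusses for the injection rate, is commonly diagnosed by the ratio of gravitational to viscous forces; a minimal sketch with illustrative (not the thesis's) property values:

```python
# Ratio of gravitational to viscous forces for a gas-injection
# displacement; values below are illustrative only.
k = 1e-12          # permeability, m^2 (~1 Darcy)
delta_rho = 600.0  # oil-gas density difference, kg/m^3
g = 9.81           # gravitational acceleration, m/s^2
mu_o = 2e-3        # oil viscosity, Pa.s
v = 1e-6           # characteristic frontal (Darcy) velocity, m/s

gravity_number = k * delta_rho * g / (mu_o * v)
print(gravity_number > 1)  # True: gravity-dominated in this sketch
```

Raising the injection rate raises the frontal velocity v, lowering this ratio, which is consistent with the observation that the rate determines whether gravity segregation dominates the process.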
Abstract:
This work presents the development of a model and computer simulation of a sucker rod pumping system. The system takes into account the well geometry, the flow through the tubing, the dynamic behavior of the rod string, and an induction motor model. The rod string was modeled using concentrated (lumped) parameters, allowing systems of ordinary differential equations to be used to simulate its behavior.
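A minimal sketch of the lumped-parameter idea: one rod-string element reduced to a mass-spring-damper driven by the surface stroke and integrated as an ordinary differential equation (semi-implicit Euler). All parameter values are illustrative; the thesis's model additionally couples well geometry, tubing flow, and an induction-motor model:

```python
import numpy as np

# Illustrative lumped parameters for a single rod-string element.
m = 500.0      # lumped rod mass, kg
k = 4.0e4      # rod stiffness, N/m
c = 800.0      # damping (friction with fluid/tubing), N.s/m
omega = 2 * np.pi * 0.1   # pumping speed, rad/s (6 strokes/min)
amp = 1.5                 # surface stroke amplitude, m

dt, t_end = 1e-3, 60.0
x, v = 0.0, 0.0           # downhole displacement and velocity
for step in range(int(t_end / dt)):
    t = step * dt
    u = amp * np.sin(omega * t)          # imposed surface motion
    a = (k * (u - x) - c * v) / m        # Newton's second law
    v += a * dt                          # semi-implicit Euler update
    x += v * dt

print(abs(x) < 2 * amp)  # True: downhole motion stays bounded
```

Chaining many such elements gives the system of ODEs for the full rod string mentioned in the abstract.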
Abstract:
Oil production and exploration techniques have evolved in recent decades in order to increase fluid flow and optimize how the required equipment is used. The Electric Submersible Pumping (ESP) lift method is based on an electric downhole motor that drives a centrifugal pump, transporting the fluids to the surface. ESP is an option that has been gaining ground among artificial lift methods due to its ability to handle large liquid flow rates in both onshore and offshore environments. The performance of a well equipped with an ESP system is intrinsically related to the operation of the centrifugal pump, since it is the pump that converts the motor power into head. In the present work, a computer model was developed to analyze the three-dimensional flow in a centrifugal pump used in Electric Submersible Pumping. Using the commercial program ANSYS® CFX®, initially with water as the working fluid, the geometry and simulation parameters were defined in order to approximate the flow inside the channels of the pump's impeller and diffuser. Three different geometry conditions were initially tested to determine which was most suitable for the problem. After choosing the most appropriate geometry, three mesh conditions were analyzed and the computed values were compared with the experimental head curve provided by the manufacturer. The results approached the experimental curve, and the simulation time and model convergence were satisfactory considering that the studied problem involves numerical analysis. After the tests with water, oil was used in the simulations and the results were compared with a methodology used in the petroleum industry to correct for viscosity.
In general, for the models with water and oil, the single-phase results were coherent with the experimental curves, and the three-dimensional computer models serve as a preliminary evaluation for the analysis of two-phase flow inside the channels of centrifugal pumps used in ESP systems.
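The industry viscosity-correction step mentioned above typically rescales the water-test flow and head by empirical correction factors (in the spirit of the Hydraulic Institute procedure); a sketch with purely illustrative factor values:

```python
# Sketch of viscosity correction for pump performance: the water-test
# point is rescaled by empirical flow and head factors. The factor
# values are illustrative, not from any standard's tables.
def viscous_performance(q_water, h_water, c_q=0.95, c_h=0.90):
    """Return (flow, head) estimated for the viscous fluid."""
    return q_water * c_q, h_water * c_h

q_visc, h_visc = viscous_performance(100.0, 50.0)
print(q_visc, h_visc)  # 95.0 45.0
```

In practice the factors depend on viscosity, flow rate, and head; CFD results such as those above can then be checked against the corrected curve.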
Abstract:
The present study provides a methodology that gives a predictive character to computer simulations based on detailed models of the geometry of a porous medium. We use the software FLUENT to investigate the flow of a viscous Newtonian fluid through a random fractal medium that simplifies a two-dimensional disordered porous medium representing a petroleum reservoir. This fractal model is formed by obstacles of various sizes whose size distribution follows a power law, the exponent of which is defined as the fractal fragmentation dimension Dff of the model, characterizing the fragmentation process of these obstacles. The obstacles are randomly placed in a rectangular channel. The modeling process incorporates modern concepts, such as scaling laws, to analyze the influence of the heterogeneity found in the porosity and permeability fields, so as to characterize the medium in terms of its fractal properties. This procedure makes it possible to analyze numerically the measured permeability k and drag coefficient Cd, and to propose power-law relationships for these properties under various modeling schemes. The purpose of this research is to study the variability introduced by these heterogeneities: the velocity field and other details of the viscous fluid dynamics are obtained by numerically solving the continuity and Navier-Stokes equations at the pore level, and we observe how the fractal fragmentation dimension of the model affects its hydrodynamic properties. Two classes of models were considered: models with constant porosity, MPC, and models with varying porosity, MPV. The results allowed us to find numerical relationships between the permeability, the drag coefficient, and the fractal fragmentation dimension of the medium. Based on these numerical results, we propose scaling relations and algebraic expressions involving the relevant parameters of the phenomenon.
In this study, analytical equations were determined for Dff as a function of the geometrical parameters of the models. We also found that the permeability and the drag coefficient are inversely proportional to one another. The difference in behavior is most striking in the MPV class of models: the fact that the porosity varies in these models is an additional factor that plays a significant role in the flow analysis. Finally, the results proved satisfactory and consistent, which demonstrates the effectiveness of the methodology for all the applications analyzed in this study.
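The power-law size distribution that defines Dff can be sketched as follows: obstacle sizes are drawn by inverse-transform sampling from P(S > s) = (s/s_min)^(-Dff), and the fragmentation dimension is recovered with the Pareto maximum-likelihood estimator. Dff = 1.6 and s_min = 1 are assumed values for illustration only:

```python
import numpy as np

rng = np.random.default_rng(1)
d_ff, s_min, n = 1.6, 1.0, 50_000   # assumed exponent, cutoff, sample size

# Inverse-transform sampling: if U ~ Uniform(0,1), then
# S = s_min * U**(-1/Dff) has P(S > s) = (s/s_min)**(-Dff).
sizes = s_min * rng.uniform(size=n) ** (-1.0 / d_ff)

# Pareto maximum-likelihood estimate of the exponent.
d_hat = n / np.log(sizes / s_min).sum()
print(round(d_hat, 1))  # 1.6
```

Such sampled sizes would populate the rectangular channel with obstacles before the pore-level Navier-Stokes solution is computed.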
Abstract:
Petroleum evaluation consists of analyzing crude oil through different methodologies, following international standards, to determine its chemical and physicochemical properties, contaminant levels, and composition, and especially its ability to generate derivatives. Many of these analyses consume a lot of time and large amounts of sample and supplies, and require organized transportation logistics, scheduling, and professionals. Looking for alternatives that optimize the evaluation and enable the use of new technologies, seven samples of different centrifuged Brazilian oils, previously characterized by Petrobras, were analyzed by thermogravimetry in the 25-900 °C range using heating rates of 5, 10, and 20 °C per minute. From the experimental data, characterization correlations were developed that provided: simulated true boiling point (TBP) curves; a comparison of the generated fractions with standard cuts in the corresponding temperature ranges; an approach for obtaining the Watson characterization factor; and a comparison of the micro carbon residue formed. The results showed a good chance of reproducing the simulated TBP curve from thermogravimetry, taking into account the composition, density, and other oil properties. The proposed correlations for the experimental characterization factor and carbon residue agreed with the Petrobras characterizations, showing that thermogravimetry can be used as a tool in oil evaluation because of its quick analysis and accuracy, and because it requires minimal amounts of sample and consumables.
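The Watson characterization factor approached in the thesis is conventionally defined as K = Tb^(1/3) / SG, with the mean average boiling point Tb in degrees Rankine and SG the specific gravity at 60 °F; a sketch with illustrative input values, not the thesis's data:

```python
# Watson characterization factor: K = Tb**(1/3) / SG, Tb in degrees
# Rankine, SG the specific gravity at 60 degF.
def watson_k(tb_kelvin, specific_gravity):
    tb_rankine = tb_kelvin * 1.8   # K -> degrees Rankine
    return tb_rankine ** (1.0 / 3.0) / specific_gravity

# Illustrative crude: Tb = 600 K, SG = 0.85.
k = watson_k(600.0, 0.85)
print(round(k, 1))  # 12.1
```

Higher K values indicate more paraffinic crudes, lower values more aromatic ones, which is what makes K useful as a quick characterization alongside the TBP curve.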
Abstract:
The present study analyzes the applicability of computational technology as a mediator in the teaching of English at the Centro Federal de Educação Tecnológica do Rio Grande do Norte (CEFET/RN). The object of study was the use of the computer in the teaching of English to four groups at the institution. The research was methodologically grounded in a case study, adopting a qualitative and quantitative approach of an interpretative-reflexive character. We draw on a review of the literature that deals with the use of computational technology in the classroom, aiming at a new educational practice consistent with the current reality of technological education. We also use a framework for pedagogical action, seeking to offer support for a practice that generates knowledge through interaction, aiming at the education of reflexive and critical subjects. To carry out this study, we used structured instruments, such as interviews with teachers and students, in addition to observations of daily classroom life, in order to obtain the data necessary for analysis. During the study, we observed that the use of the computer, as a pedagogical support instrument in the teaching of English, has acted as a mediator of the teaching-learning process. The results show that the use of the computer has become an increasingly common practice among the institution's language teachers. The conclusions confirm the hypothesis presented at the beginning of the work and show that teachers are responsible for forming thinking, reflexive, and critical subjects. For that, they need to be prepared to face situations in which they can bring their pedagogical practice into tune with technological advances, thereby providing an effective technological education.
Abstract:
This study investigates teacher training and the cognitive practice of teachers in a basic education school that adopted the One Computer per Student (UCA) Project in its school routine. Its relevance consists in providing directions for the continuation of training activities in the Project and guiding teachers in their pedagogical practices using the one-to-one laptop model. The thesis defended is that teacher education for the social use of digital media (especially the laptops from the UCA Project) opens space for establishing new sociotechnical relationships, new social and professional practices, new identity components, and a process of reflexivity and reconstruction of teaching knowledge. We reaffirm the importance of reflexivity and of the appropriation of digital culture for the better development of teaching practice using Information and Communication Technologies (ICTs), focusing on the social and professional uses of the technology. The study is qualitative in nature and consists of procedural tracking based on the principles of ethnographic research. The procedures and methodological tools used were: intensive observation of school environments, documentary analysis, a focus group, semi-structured questionnaires, and semi-structured individual interviews. The research was conducted in a public school in the city of Parnamirim, RN. The sample comprises 17 teachers from elementary school I and II, Youth and Adult Education, and high school, who underwent the UCA training process and incorporated the laptops into their teaching. The research corpus is structured from the messages built up during data collection and is analyzed according to the principles of Content Analysis, as specified by Laurence Bardin (2011). The theoretical framework draws on studies by Tardif (2000; 2011), Pimenta (2009), Gorz (2004; 2005), Giddens (1991), Dewey (1916), Bourdieu (1994; 1999), and Freire (1996; 2005), among others.
The analysis indicates a process of reconstruction and revision of the knowledge needed to teach and work in digital culture, this knowledge being guided by the experience of the subjects investigated. The reconstructed knowledge is revealed through a categorization process: the groups labelled "technical knowledge", "didactic-methodological knowledge", and "professionalization knowledge" were built on the assumption of the appropriation of digital culture in the educational context. The analysis confirms the emergence of new forms of sociability as teachers acquire other ways of acting on and thinking about ICTs, despite an environment adverse to shared reflexivity among them. It also reveals, based on the concept of appropriation present in the data analysis, the construction of meanings of belonging and the transformation of individuals along social routes, from the interweaving of teaching practice with digital culture. Finally, it emphasizes the importance of training for the use of ICTs that goes beyond instrumentation, in other words, beyond what we call "technical knowledge", and that takes as its structural basis shared reflection and openness to resignification and to the reconstruction of new knowledge and practices, truly allowing teachers to live an experience capable of providing sociotechnical transformations of their relationships.