75 results for Traffic : Pollution : Simulation
Abstract:
The increasing demand for Internet data traffic in wireless broadband access networks requires both the development of efficient, novel wireless broadband access technologies and the allocation of new spectrum bands for that purpose. The introduction of a great number of small cells in cellular networks, allied to the complementary adoption of Wireless Local Area Network (WLAN) technologies in unlicensed spectrum, is one of the most promising concepts to meet this demand. One alternative is the aggregation of Industrial, Scientific and Medical (ISM) unlicensed spectrum to licensed bands, using wireless networks defined by the Institute of Electrical and Electronics Engineers (IEEE) and the Third Generation Partnership Project (3GPP). While IEEE 802.11 (Wi-Fi) networks are aggregated to Long Term Evolution (LTE) small cells via LTE/WLAN Aggregation (LWA), in proposals like Unlicensed LTE (LTE-U) and Licensed Assisted Access (LAA) the LTE air interface itself is used for transmission on the unlicensed band. Wi-Fi technology is widespread and operates in the same 5 GHz ISM spectrum bands as the LTE proposals, which may degrade performance due to the coexistence of both technologies in the same spectrum bands. Besides, there is the need to improve Wi-Fi operation to support scenarios with a large number of neighboring Overlapping Basic Service Set (OBSS) networks, i.e. dense deployments with a large number of Wi-Fi nodes. It has long been known that overall Wi-Fi performance falls sharply as the number of Wi-Fi nodes sharing the channel increases; therefore, mechanisms to increase its spectral efficiency are needed. This work is dedicated to the study of coexistence between different wireless broadband access systems operating in the same unlicensed spectrum bands, and of how to solve the coexistence problems via distributed coordination mechanisms. The problem of coexistence between different networks (i.e. LTE and Wi-Fi) and the problem of coexistence between networks of the same technology (i.e. multiple Wi-Fi OBSSs) are analyzed both qualitatively and quantitatively via system-level simulations, and the main issues to be faced are identified from these results. From that, distributed coordination mechanisms are proposed and evaluated via system-level simulations, both for the inter-technology and the intra-technology coexistence problem. Results indicate that the proposed solutions provide significant gains when compared to the situation without distributed coordination.
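The sharp performance collapse with the number of contending nodes mentioned above can be illustrated with a toy slotted-contention model. This is a generic sketch, not the thesis's system-level simulator; the fixed per-slot transmit probability is an assumption:

```python
# Toy slotted-contention model: each of n nodes transmits in a slot
# with probability p; a slot succeeds iff exactly one node transmits.
def success_probability(n: int, p: float) -> float:
    # P(exactly one of n nodes transmits) = n * p * (1 - p)^(n - 1)
    return n * p * (1.0 - p) ** (n - 1)

if __name__ == "__main__":
    p = 0.1  # assumed fixed transmit probability per node per slot
    for n in (2, 10, 50, 100):
        # Success probability peaks near n = 1/p and then collapses.
        print(n, round(success_probability(n, p), 4))
```

With a fixed access probability, the fraction of useful slots drops steeply once the number of contenders grows, which is the qualitative effect motivating the coordination mechanisms studied in the thesis.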
Abstract:
LINS, Filipe C. A. et al. Modelagem dinâmica e simulação computacional de poços de petróleo verticais e direcionais com elevação por bombeio mecânico. In: CONGRESSO BRASILEIRO DE PESQUISA E DESENVOLVIMENTO EM PETRÓLEO E GÁS, 5. 2009, Fortaleza, CE. Anais... Fortaleza: CBPDPetro, 2009.
Abstract:
This thesis aims to describe and demonstrate a concept developed to facilitate the use of thermal simulation tools during the building design process. Despite the impact of architectural elements on the performance of buildings, some influential decisions are frequently based solely on qualitative information. Even though such design support is adequate for most decisions, the designer will eventually have doubts concerning the performance of some design decisions. These situations require some kind of additional knowledge to be properly approached. The concept of designerly ways of simulating focuses on the formulation and solution of design dilemmas, which are doubts about the design that cannot be fully understood or solved without quantitative information. The concept intends to combine the analytical power of computer simulation tools with the capacity for synthesis of architects. Three types of simulation tools are considered: solar analysis, thermal/energy simulation and CFD. Design dilemmas are formulated and framed according to the architect's process of reflection about performance aspects. Throughout the thesis, the problem is investigated in three fields: professional, technical and theoretical. This approach to distinct parts of the problem aimed to i) characterize different professional categories with regard to their design practice and use of tools, ii) review preceding research on the use of simulation tools and iii) draw analogies between the proposed concept and concepts developed or described in previous works on design theory. The proposed concept was tested on eight design dilemmas extracted from three case studies in the Netherlands. The three investigated processes are houses designed by Dutch architectural firms. Relevant information and criteria for each case study were obtained through interviews and conversations with the architects involved.
The practical application, despite its success in the research context, allowed the identification of some limitations to the applicability of the concept, concerning the architects' need for technical knowledge and the current stage of evolution of simulation tools.
Abstract:
This Master's dissertation makes a comparative study between internal air temperature data simulated with the thermal analysis application DesignBuilder 1.2 and data recorded on site with HOBO® Temp data loggers, in a Social Housing Prototype (HIS) located at the Central Campus of the Federal University of Rio Grande do Norte (UFRN). The prototype was designed and built following thermal comfort strategies recommended for the local climate, using cellular concrete panels, by Construtora DoisA, a collaborator of the research project REPESC Rede de Pesquisa em Eficiência Energética de Sistemas Construtivos (Research Network on Energy Efficiency of Construction Systems), part of the Habitare program. The methodology carefully examined the problem and reviewed the bibliography, analyzing the major aspects related to computer simulation of the thermal performance of buildings, such as the climate characterization of the region under study and the users' thermal comfort demands. DesignBuilder 1.2 was used as the simulation tool, and theoretical alterations were made to the prototype and then compared with the thermal comfort parameters adopted, based on the area's current technical literature. The comparative studies were analyzed through graphical outputs for a better understanding of air temperature amplitudes and thermal comfort conditions. The data used to characterize the external air temperature were obtained from the Test Reference Year (TRY) defined for the study area (Natal-RN). The author also performed comparative studies with data recorded in 2006, 2007 and 2008 at the Davis Precision Station weather station, located at the Instituto Nacional de Pesquisas Espaciais INPE-CRN (National Institute of Space Research), in an area neighboring UFRN's Central Campus.
The conclusions drawn from the comparative studies between the computer simulations and the records obtained at the studied prototype point out that simulating naturally ventilated buildings is quite a complex task, due to the application's limitations, owed mainly to the complexity of air flow phenomena, the influence of comfort conditions in the surrounding areas, and climate records. Lastly, regarding the use of DesignBuilder 1.2 in the present study, one may conclude that it is a good tool for computer simulation; however, it needs some adjustments to improve the reliability of its use. Continued research is needed, considering the occupancy of the prototype by users as well as the thermal loads of the equipment, in order to check sensitivity.
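Simulated-versus-measured comparisons of this kind are commonly summarized with error metrics such as the mean bias error and the root-mean-square error. A minimal sketch follows; the temperature series are illustrative, not the dissertation's data:

```python
import math

def mean_bias_error(simulated, measured):
    # Average signed difference: positive values mean the simulation over-predicts.
    return sum(s - m for s, m in zip(simulated, measured)) / len(measured)

def rmse(simulated, measured):
    # Root-mean-square error between the two series.
    return math.sqrt(sum((s - m) ** 2 for s, m in zip(simulated, measured)) / len(measured))

# Hypothetical hourly indoor air temperatures (°C).
simulated = [29.1, 30.4, 31.0, 30.2]
measured = [28.6, 30.0, 31.5, 30.1]
print(mean_bias_error(simulated, measured), rmse(simulated, measured))
```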
Abstract:
Noise pollution degrades the quality of the environment and is one of the most common environmental problems in big cities. The acoustic study of a complex urban environment needs to consider the contribution of various noise sources. Accordingly, computational models for mapping and predicting the acoustic scene become important, because they enable calculations, analyses and reports, allowing a satisfactory interpretation of the results. The studied area is the neighborhood of Lagoa Nova, a central area of the city of Natal, which will undergo major changes in its urban space due to the urban mobility projects planned for the area around the stadium and the consequent changes in urban form and traffic. Thus, this study aims to evaluate the noise impact caused by road and morphological changes around the Arena das Dunas stadium in the neighborhood of Lagoa Nova, through on-site measurements and mapping with the computational model SoundPLAN, for the year 2012 and for the predicted evolution of the acoustic scenario in 2017. For this analysis, a first acoustic map was built based on the current acoustic diagnosis of the neighborhood (physical mapping, classified vehicle counts and sound pressure level measurements), and the noise prediction was built taking into account the modifications planned for traffic, urban form and mobility works in the study area. This study concludes that the sound pressure levels for both 2012 and 2017 exceed the limits of current legislation. The noise prediction shows numerous changes in the acoustic scene: the planned urban mobility works will improve traffic flow and thus reduce the sound pressure level where the interventions are expected.
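Noise mapping tools such as SoundPLAN combine the contributions of multiple sources at a receiver by energetic (logarithmic) summation of sound pressure levels. A minimal sketch of that summation, with illustrative source levels:

```python
import math

def combine_spl(levels_db):
    # Energetic sum of sound pressure levels:
    # L_total = 10 * log10( sum_i 10^(L_i / 10) )
    return 10.0 * math.log10(sum(10.0 ** (L / 10.0) for L in levels_db))

# Two equal 70 dB sources combine to about 73 dB (a +3 dB increase),
# not 140 dB, because levels add energetically, not arithmetically.
print(round(combine_spl([70.0, 70.0]), 1))
```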
Abstract:
Natural ventilation is the most important passive strategy to provide thermal comfort in hot and humid climates, and a significant low-energy strategy. However, a naturally ventilated building requires more attention during architectural design than a conventional building with air conditioning systems, and the results are less reliable. Therefore, this thesis focuses on software tools and methods to predict natural ventilation performance from the point of view of the architect, with limited resources and limited knowledge of fluid mechanics. A typical prefabricated building was modelled due to its simplified geometry, low cost and occurrence on the local campus. Firstly, the study emphasized the use of computational fluid dynamics (CFD) software to simulate the air flow outside and inside the building. A series of approaches was developed to make the simulations feasible, at some cost to the fidelity of the results. Secondly, the results of the CFD simulations were used as input to an energy tool to simulate the thermal performance under different air renewal rates. Thirdly, the resulting temperatures were assessed in terms of thermal comfort. Complementary simulations were carried out to detail the analyses. The results show the potential of these tools; however, the discussion concerning the simplifications of the approaches, the limitations of the tools and the level of knowledge of the average architect is the major contribution of this study.
Abstract:
This work focuses on the creation and application of dynamic simulation software to study the hard metal (WC-Co) structure. The technology used to increase the hardware capacity was a GeForce 9600 GT GPU along with the PhysX chip, created to make games more realistic. The software simulates the three-dimensional carbide structure in a cubic box, where the tungsten carbide (WC) grains are modeled as triangular prisms and truncated triangular prisms. The program proved effective in verification tests, ranging from the calculation of parameter measures to its capacity to increase the number of dynamically simulated particles. It was possible to investigate both the mean values and the distributions of the stereological parameters used to characterize the carbide structure through cutting planes. Based on the cutting planes of the analyzed structures, we investigated the linear intercepts, the area intercepts and the perimeter of the sections of the intercepted grains, as well as the binder phase of the structure, by calculating the mean value and the distribution of the mean free path. Since the literature shows almost consensually that the distribution of the linear intercepts is lognormal, which suggests that the grain size distribution is also lognormal, a routine was developed in the program that made a more detailed investigation of this issue possible. We have observed that, under certain values of the parameters defining the shape and size of the prismatic grains, it is possible to obtain distributions of the linear intercepts that approach the lognormal shape. Across the simulations performed, we have observed that the distribution curves of the linear and area intercepts, as well as of the perimeter of the sections, are consistent with studies using static computer simulation of these parameters.
Abstract:
Until the early 1990s, the simulation of fluid flow in oil reservoirs basically used the numerical technique of finite differences. Since then, there has been great development in streamline-based simulation technology, so that nowadays it is used in several cases and can represent the physical mechanisms that influence fluid flow, such as compressibility, capillarity and gravitational segregation. Streamline-based flow simulation is a tool that can greatly help in waterflood project management, because it provides important information not available through traditional finite-difference simulation and shows, in a direct way, the relationship between injector and producer wells. This work presents the application of a methodology published in the literature for optimizing water injection projects to the model of a Brazilian Potiguar Basin reservoir with a large number of wells. This methodology changes injection well rates over time, based on information available through streamline simulation: it reduces injection rates in wells of lower efficiency and increases injection rates in the more efficient wells. In the proposed model, the methodology was effective. The optimized alternatives presented higher oil recovery associated with a lower injected water volume, showing better efficiency and, consequently, cost reduction. Considering the wide use of water injection in oil fields, this positive outcome is important, because it shows a case study in which increased oil recovery was achieved simply through a better distribution of water injection rates.
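The reallocation idea described above, cutting rates at low-efficiency injectors and raising them at high-efficiency ones while keeping the total injected volume constant, can be sketched as follows. The efficiency values and the proportional weighting rule are assumptions for illustration, not the published methodology's exact formula:

```python
def reallocate_rates(rates, efficiencies):
    # Redistribute the same total injection volume in proportion to
    # rate * efficiency, where efficiency is a streamline-derived
    # measure of oil produced per unit of water injected.
    total = sum(rates)
    weight = sum(r * e for r, e in zip(rates, efficiencies))
    return [total * (r * e) / weight for r, e in zip(rates, efficiencies)]

rates = [1000.0, 1000.0, 1000.0]  # m3/day per injector (hypothetical)
effs = [0.8, 0.5, 0.2]            # per-injector efficiencies (hypothetical)
new_rates = reallocate_rates(rates, effs)
print([round(r) for r in new_rates], round(sum(new_rates)))
```

The total injected volume is preserved while the most efficient injector gains rate at the expense of the least efficient one, which is the qualitative behavior the cited methodology exploits.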
Abstract:
In Brazil and around the world, oil companies are looking for, and expect the development of, new technologies and processes that can increase the oil recovery factor in mature reservoirs in a simple and inexpensive way. Recent research has developed a new process called Gas Assisted Gravity Drainage (GAGD), classified as a gas injection IOR method. The process, which is undergoing pilot testing in the field, is being extensively studied through physical scale models and laboratory core floods, due to its high oil recoveries in relation to other gas injection IOR methods. It consists of injecting gas at the top of a reservoir through horizontal or vertical injector wells and displacing the oil, taking advantage of the natural gravity segregation of the fluids, toward a horizontal producer well placed at the bottom of the reservoir. To study this process, a homogeneous reservoir and a multi-component fluid model with characteristics similar to light oils from Brazilian fields were modeled in a compositional simulator, in order to optimize the operational parameters. The process was simulated in GEM (CMG, 2009.10). The operational parameters studied were the gas injection rate, the type of injected gas and the locations of the injector and producer wells. The presence of a water drive in the process was also studied. The results showed that the maximum vertical spacing between the two wells yielded the maximum oil recovery in GAGD, and that the largest injection rates produced the largest recovery factors. This parameter controls the velocity of the injected gas front and determines whether or not the gravitational force dominates the oil recovery process. Natural gas performed better than CO2, and the presence of an aquifer in the reservoir had little influence on the process. The economic analysis found that injecting natural gas is more economically beneficial than injecting CO2.
Abstract:
Efforts in research and development of new technologies to reduce the emission levels of pollutant gases into the atmosphere have intensified in the last decades. In this context, the modern systems of electronic engine management, new automotive catalysts and the use of renewable fuels, which contribute to reducing the environmental impact, can be highlighted. The purpose of this study was a comparative analysis of gas emissions from an automotive vehicle operating with different fuels: natural gas, AEHC or gasoline. To execute the experimental tests, a flex-fuel vehicle was installed on a chassis dynamometer equipped with a gas analyzer and other complementary accessories, according to the standard guidelines for emission and safety procedures. Tests were performed according to NBR 6601 and NBR 7024, which define the urban and road driving cycles, respectively. Besides the analysis of the exhaust gases in the discharge pipe, before and after the catalyst, using the suction probe of the gas analyzer to simulate the vehicle in urban and road traffic, fuel characterization tests were also performed. The final results were conclusive in indicating leaded gasoline as the fuel that contributed most to pollutant emissions into the atmosphere, and regular gasoline as the fuel that contributed least.
Abstract:
This work presents the development of a model and computer simulation of a sucker rod pumping system. The system takes into account the well geometry, the flow through the tubing, the dynamic behavior of the rod string and the use of an induction motor model. The rod string was modeled using concentrated parameters, allowing the use of systems of ordinary differential equations to simulate its behavior.
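A concentrated-parameter rod string reduces to a chain of mass-spring-damper elements governed by ordinary differential equations. A minimal one-element sketch with semi-implicit Euler integration follows; the parameter values and the fixed top boundary condition are illustrative, not the work's model:

```python
def simulate_rod_element(m, k, c, x_top, steps=20000, dt=1e-4):
    # One lumped rod element: m * x'' = -k * (x - x_top) - c * x'
    # where x_top is the imposed (here fixed) polished-rod position,
    # k the element stiffness and c a viscous damping coefficient.
    x, v = 0.0, 0.0
    for _ in range(steps):
        a = (-k * (x - x_top) - c * v) / m
        v += a * dt  # semi-implicit Euler: update velocity first...
        x += v * dt  # ...then position, for better stability
    return x

# With damping, the element settles toward the imposed top position.
print(round(simulate_rod_element(m=50.0, k=4.0e4, c=800.0, x_top=1.0), 3))
```

In the actual system the top position would follow the surface pumping-unit motion and several such elements would be chained, but the integration scheme is the same.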
Abstract:
The Potengi river estuary is located in the region of Natal (RN, Brazil), which comprises a population of approximately 1,000,000 inhabitants. Besides the dominant urban presence, the estuary has fragments of mangrove forest. The objective of this study is to determine the aliphatic hydrocarbons found in the bottom sediments of this estuary, identifying their levels, distribution and possible origins through diagnostic ratios, indexes and comparison of the results with the local anthropic and natural characteristics. The samples were obtained according to a plan that allowed sampling of the estuary up to 12 km upstream from its mouth. Thirty-six stations were selected, grouped into 12 cross sections along the course of the river and spaced on average 1 km apart. Each section consisted of three stations: the right margin, the deepest point and the left margin. The n-alkane hydrocarbons from C10 to C36, the isoprenoids pristane and phytane, the unresolved complex mixture (UCM) and the total resolved hydrocarbons were analyzed by gas chromatography. N-alkanes, pristane, phytane and UCM were detected only at some stations; at the others, the concentration was below the detection limit of the analytical method (0.1 mg/kg), preventing analysis to determine the origin of the material found. Using different parameters, the results show that the estuary receives inputs of both petrogenic and biogenic hydrocarbons, featuring a mixture of sources and relatively impacted portions. Based on the characteristics and activities found in the region, it is possible to affirm that petrogenic sources related to oil products enter the estuary via urban runoff or boat traffic, boat washing and fueling. As for the biogenic source, the predominant origin was terrestrial, characterized by vascular plants, indicating the contribution of mangrove vegetation.
The presence of hydrocarbon pollution at specific points in the estuary was evident; therefore, the adoption of actions aimed at interrupting or, at least, mitigating the sources potentially capable of discharging petrogenic hydrocarbons into the studied estuary is recommended.
Abstract:
Oil production and exploration techniques have evolved in the last decades in order to increase fluid flow rates and optimize how the required equipment is used. The basic operation of the Electric Submersible Pumping (ESP) lift method is the use of an electric downhole motor to drive a centrifugal pump and transport the fluids to the surface. Electric Submersible Pumping is an option that has been gaining ground among artificial lift methods due to its ability to handle large liquid flow rates in onshore and offshore environments. The performance of a well equipped with an ESP system is intrinsically related to the operation of the centrifugal pump, which has the function of turning the motor power into head. In the present work, a computer model to analyze the three-dimensional flow in a centrifugal pump used in Electric Submersible Pumping has been developed. Using the commercial program ANSYS® CFX®, initially with water as the working fluid, the geometry and simulation parameters were defined in order to approximate the flow inside the channels of the pump impeller and diffuser. Three different geometry conditions were initially tested to determine which is most suitable for solving the problem. After choosing the most appropriate geometry, three mesh conditions were analyzed and the obtained values were compared with the experimental characteristic head curve provided by the manufacturer. The results approached the experimental curve, and the simulation time and model convergence were satisfactory considering that the studied problem involves numerical analysis. After the tests with water, oil was used in the simulations, and the results were compared with a methodology used in the petroleum industry for viscosity correction.
In general, for the models with water and oil, the single-phase results were coherent with the experimental curves, and these three-dimensional computer models serve as a preliminary evaluation for the analysis of two-phase flow inside the channels of centrifugal pumps used in ESP systems.
Abstract:
The present study provides a methodology that gives a predictive character to computer simulations based on detailed models of the geometry of a porous medium. We use the software FLUENT to investigate the flow of a viscous Newtonian fluid through a random fractal medium that idealizes a two-dimensional disordered porous medium representing a petroleum reservoir. This fractal model is formed by obstacles of various sizes, whose size distribution function follows a power law whose exponent is defined by the fractal fragmentation dimension Dff of the model, characterizing the fragmentation process of these obstacles. The obstacles are randomly placed in a rectangular channel. The modeling process incorporates modern concepts, such as scaling laws, to analyze the influence of the heterogeneity found in the porosity and permeability fields, in such a way as to characterize the medium in terms of its fractal properties. This procedure allows us to numerically analyze measurements of the permeability k and the drag coefficient Cd, and to propose power-law relationships for these properties over various modeling schemes. The purpose of this research is to study the variability introduced by these heterogeneities, where the velocity field and other details of the viscous fluid dynamics are obtained by numerically solving the continuity and Navier-Stokes equations at the pore level, and to observe how the fractal fragmentation dimension of the model affects its hydrodynamic properties. Two classes of models were considered: models with constant porosity (MPC) and models with varying porosity (MPV). The results allowed us to find numerical relationships between the permeability, the drag coefficient and the fractal fragmentation dimension of the medium. Based on these numerical results, we have proposed scaling relations and algebraic expressions involving the relevant parameters of the phenomenon.
In this study, analytical equations were determined for Dff as a function of the geometrical parameters of the models. We also found that the permeability and the drag coefficient are inversely proportional to each other. The difference in behavior is most striking in the MPV class of models: the fact that the porosity varies in these models is an additional factor that plays a significant role in the flow analysis. Finally, the results proved satisfactory and consistent, which demonstrates the effectiveness of the methodology for all applications analyzed in this study.
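Obstacle sizes following a truncated power-law (fragmentation) distribution can be drawn by inverse-transform sampling. A minimal sketch follows; the size range and the Dff value are illustrative, and the specific generator is an assumption, not the study's construction:

```python
import random

def sample_power_law(rmin, rmax, dff, n, seed=42):
    # Draw n obstacle sizes with cumulative number N(>r) ~ r^(-dff),
    # i.e. pdf ~ r^-(dff + 1), truncated to [rmin, rmax], using
    # inverse-transform sampling of the truncated CDF.
    rng = random.Random(seed)
    beta = (rmin / rmax) ** dff
    sizes = []
    for _ in range(n):
        u = rng.random()
        sizes.append(rmin * (1.0 - u * (1.0 - beta)) ** (-1.0 / dff))
    return sizes

sizes = sample_power_law(rmin=1.0, rmax=100.0, dff=1.6, n=10000)
# Small obstacles dominate, as expected for a fragmentation process.
print(min(sizes) >= 1.0, max(sizes) <= 100.0)
```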
Abstract:
Petroleum evaluation consists of analyzing oil using different methodologies, following international standards, in order to know its chemical and physicochemical properties, contaminant levels, composition and, especially, its ability to generate derivatives. Many of these analyses consume a lot of time and large amounts of samples and supplies, and need organized transportation logistics, scheduling and the involved professionals. Looking for alternatives that optimize the evaluation and enable the use of new technologies, seven samples of different centrifuged Brazilian oils, previously characterized by Petrobras, were analyzed by thermogravimetry in the 25-900 °C range using heating rates of 5, 10 and 20 °C per minute. With the experimental data obtained, characterization correlations were performed, providing: the generation of simulated true boiling point (TBP) curves; the comparison of the generated fractions with appropriate standard cuts in temperature ranges; an approach to obtain the Watson characterization factor; and the comparison of the micro carbon residue formed. The results showed a good chance of reproducing the simulated TBP curve from thermogravimetry, taking into account the composition, density and other oil properties. The proposed correlations for the experimental characterization factor and carbon residue followed the Petrobras characterizations, showing that thermogravimetry can be used as a tool in oil evaluation because of its quick analysis and accuracy, and because it requires a minimum amount of samples and consumables.
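The link between a thermogravimetric curve and a simulated TBP curve is essentially cumulative: the mass fraction lost up to each temperature is read as the fraction distilled at that temperature. A minimal sketch of this idea, with hypothetical TG data points (the actual correlations in the work also account for composition and density):

```python
def simulated_tbp(temperatures, masses):
    # Convert a TG curve (remaining sample mass vs temperature) into a
    # cumulative distilled-fraction curve:
    #   fraction(T) = (initial mass - mass at T) / initial mass
    m0 = masses[0]
    return [(t, (m0 - m) / m0) for t, m in zip(temperatures, masses)]

# Hypothetical TG points: temperature (°C) vs remaining mass (mg).
curve = simulated_tbp([25, 200, 400, 600, 900], [10.0, 9.0, 6.0, 3.0, 1.0])
print(curve)
```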