842 results for "Simulação de reservatórios" (reservoir simulation)
Abstract:
This work analyzes the integration of thermal-energetic simulation into the early stages of the design process, based on six practical cases. It aims to schematize the integration process, identifying the contributions of thermal-energetic analyses at each design phase and the parameters with the highest impact on building performance. The simulations were run in DesignBuilder, a tool built on the validated EnergyPlus engine, chosen for its flexible and user-friendly graphical interface for modeling and output assessment, including parametric simulation to compare design alternatives. The six case studies comprise three architectural and three retrofit projects, in which the author ran the simulations either as a consultant or as a designer. The cases were selected based on the designers' commitment to achieving performance goals and on their willingness to share the process from the early pre-design analyses, making it possible to schematize the whole process and to support design decisions with quantifications, including energy targets. Integrating thermal-energetic performance analyses is feasible from the early stages, except when only a short time is available to run the simulations. The simulation contributions are most important during the sketch and detailing phases; the pre-design phase can instead be assisted by reliable bioclimatic guidelines. Every case study had two dominant design variables governing overall performance. These variables differ according to the building characteristics and always coincide with the local bioclimatic strategies. The earlier an alternative is introduced, the better it is adapted to the design. Simulation proved useful to convince the architects with evidence, to quantify the cost-benefit and payback period for the retrofit designer, and to allow the simulator to confirm the desired result and report the performance to the client.
Abstract:
The city of Natal has significant daylight availability, although its use is not systematically explored in school architecture. In this context, this research aims to define procedures for analyzing daylight performance in school design in Natal-RN. The method comprises Visible Sky Factor (VSF) analysis, simulation, and assessment of the results. The annual variation of daylight behavior requires dynamic simulation as the data procedure. The classrooms were modeled in SketchUp, simulated in Daysim, and the results were assessed in Microsoft Excel spreadsheets. The classrooms measure 7.20 m x 7.20 m, with window-to-wall ratios (WWR) of 20%, 40% and 50%, and with different shading devices: standard horizontal overhang, sloped overhang, standard horizontal overhang with side view protection, standard horizontal overhang with a dropped edge, standard horizontal overhang with three horizontal louvers, double standard horizontal overhang, and double standard horizontal overhang with three horizontal louvers, plus a light shelf in half of the models with WWR of 40% and 50%. The data were organized in spreadsheets using two Useful Daylight Illuminance (UDI) intervals: between 300 lux and 2000 lux, and between 300 lux and 3000 lux. The simulations used the 2009 weather file for the city of Natal-RN. The graphical outputs are illuminance curves, isolines of UDI between 300 lux and 2000 lux, and tables with the incidence of glare and of UDI between 300 lux and 3000 lux. The best UDI 300-2000 lux performance was found in Phase 1 for models with WWR of 20%, and in Phase 2 for models with WWR of 40% and 50% with a light shelf. The best UDI 300-3000 lux performance was found in Phase 1 for models with WWR of 20%, and of 40% with a light shelf, and in Phase 2 for models with WWR of 40% and 50% with a light shelf.
The outputs show that daylight quality depends mainly on the efficacy of the shading system in avoiding glare, which determines daylight discomfort. The bioclimatic recommendation of large openings with partial shading (leaving an opening with direct sunlight) resulted in illuminance levels above the acceptable upper threshold. Increasing the shaded percentage (from 73% to 91%) in medium-size openings (WWR 40% and 50%) reduced or eliminated glare without compromising the daylight zone depth (7.20 m). The passive zone was determined for classrooms with satisfactory daylight performance, and a daylight-zone-depth rule of thumb was calculated as the ratio between daylight zone depth and window height for the different opening sizes. The ratio ranged from 1.54 to 2.57 for WWR of 20%, 40% and 50%, respectively. Glare in the passive area was reduced or eliminated with a light shelf or with awning-window shading.
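The UDI intervals used in these analyses can be computed directly from an hourly illuminance series; a minimal Python sketch (the hourly values below are hypothetical illustrations, not the Daysim output format):

```python
def udi_fractions(illuminances_lux, low=300.0, high=2000.0):
    """Fraction of occupied hours in which the illuminance falls below,
    within, and above the useful band [low, high] lux."""
    n = len(illuminances_lux)
    below = sum(1 for e in illuminances_lux if e < low) / n
    useful = sum(1 for e in illuminances_lux if low <= e <= high) / n
    above = sum(1 for e in illuminances_lux if e > high) / n
    return below, useful, above

# Hypothetical hourly illuminances for one sensor point:
hours = [150, 420, 900, 1800, 2500, 3100, 1200, 640]
below, useful, above = udi_fractions(hours, 300, 2000)  # 0.125, 0.625, 0.25
```

The same function with `high=3000.0` gives the second interval reported in the abstract.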
Abstract:
The hospital is a place of complex actions, where several activities for serving the population are performed: medical appointments, exams, surgeries, emergency care, and admission to wards and ICUs. These activities are intertwined with the anxiety, impatience, despair and distress of patients and their families, issues involving the emotional balance both of the professionals who care for them and of the people under their care. The healthcare crisis in Brazil worsens every year and today constitutes a major problem for private hospitals. The number of patients arriving at emergency departments increases progressively, while the supply of hospital beds does not grow in the same proportion, causing overcrowding, declines in the quality of care delivered to patients, a drain of health professionals, and difficulty in managing the beds. This work presents a study that seeks to build an alternative tool to support bed management in a private hospital. It also seeks to identify potential bottlenecks or deficiencies and, from them, to propose flow changes that increase service capacity, reducing costs without compromising the quality of the services provided. The tool used was discrete-event computational simulation, and the study identifies the main parameters to be considered for a proper modeling of this system. The study took as reference the admissions unit of a private hospital, under a current scenario in which its wards are at saturation occupancy. The proposed reallocation of beds aims to meet the growing demand for surgeries and hospital admissions observed by the current administration.
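The core bed-occupancy logic behind such a discrete-event model can be sketched with a priority queue of discharge times; a minimal Python illustration (the arrival times and lengths of stay below are hypothetical, where a real model would draw them from fitted distributions):

```python
import heapq

def simulate_beds(n_beds, arrivals, stays):
    """Discrete-event sketch: each patient occupies one bed for a given stay.
    arrivals: sorted arrival times; stays: matching lengths of stay.
    Returns (admitted, turned_away)."""
    discharges = []                 # min-heap of discharge times of occupied beds
    admitted = turned_away = 0
    for t, los in zip(arrivals, stays):
        while discharges and discharges[0] <= t:
            heapq.heappop(discharges)            # bed freed before this arrival
        if len(discharges) < n_beds:
            heapq.heappush(discharges, t + los)  # occupy a bed until t + los
            admitted += 1
        else:
            turned_away += 1                     # no bed available: diverted
    return admitted, turned_away

# Hypothetical tight demand on 2 beds:
print(simulate_beds(2, [0, 1, 2, 3, 10], [5, 5, 5, 5, 1]))  # (3, 2)
```

Varying `n_beds` in such a loop is how a bed-reallocation scenario can be compared against the current configuration.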
Abstract:
The heavy fraction of crude oil can be used for numerous purposes, e.g. to obtain lubricating oils. In this context, many researchers have been studying alternatives for separating crude oil components, among which molecular distillation can be mentioned. Molecular distillation is a forced-evaporation technique distinct from the other conventional processes in the literature. It can be classified as a special case of distillation under high vacuum, with pressures reaching extremely low values on the order of 0.1 Pa. The evaporation and condensation surfaces must be separated by a distance of the order of magnitude of the mean free path of the evaporated molecules, so that evaporated molecules easily reach the condenser along an unobstructed route, which is desirable. Thus, the main contribution of this work is the simulation of falling-film molecular distillation for crude oil mixtures. The crude oil was characterized using UniSim® Design R430 and Aspen HYSYS® V8.5. The characterization results were processed in Microsoft® Excel® spreadsheets to calculate the physicochemical (thermodynamic and transport) properties of the residue of an oil sample. Based on these estimated properties and on boundary conditions suggested in the literature, the temperature and concentration profile equations were solved through the implicit finite difference method using Visual Basic® for Applications (VBA) for Excel®. The resulting temperature profile was consistent with that reported in the literature, with a slight deviation in its initial values because the studied oil is lighter than that of the literature; the concentration profiles were also effective, showing that the concentration of the more volatile components decreases, and that of the less volatile components increases, along the length of the evaporator.
Consistent with the transport phenomena present in the process, the velocity profile rises to a peak and then decreases, and the film thickness decreases, both as functions of the evaporator length. It is concluded that the simulation code in VBA is a final product of this work that can be applied to the molecular distillation of petroleum and other similar mixtures.
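The implicit finite-difference approach mentioned here can be illustrated generically (in Python rather than VBA): a backward step for a diffusion-type profile equation across the film, marched along the evaporator length, with simplified Dirichlet boundaries. This is a sketch under assumed conditions, not the authors' actual model or property set:

```python
def solve_tridiagonal(a, b, c, d):
    """Thomas algorithm for a tridiagonal system:
    a = sub-diagonal, b = main diagonal, c = super-diagonal, d = RHS."""
    n = len(d)
    cp, dp = [0.0] * n, [0.0] * n
    cp[0], dp[0] = c[0] / b[0], d[0] / b[0]
    for i in range(1, n):
        m = b[i] - a[i] * cp[i - 1]
        cp[i] = c[i] / m if i < n - 1 else 0.0
        dp[i] = (d[i] - a[i] * dp[i - 1]) / m
    x = [0.0] * n
    x[-1] = dp[-1]
    for i in range(n - 2, -1, -1):
        x[i] = dp[i] - cp[i] * x[i + 1]
    return x

def march_profile(T0, T_wall, alpha, dx, dz, n_steps):
    """Implicit finite-difference march of dT/dz = alpha * d2T/dx2
    across the film (x), along the evaporator length (z).
    Dirichlet boundaries: heated wall at T_wall; film surface held fixed."""
    T = list(T0)
    n = len(T)
    r = alpha * dz / dx ** 2
    for _ in range(n_steps):
        a = [-r] * n
        b = [1.0 + 2.0 * r] * n
        c = [-r] * n
        d = list(T)
        b[0] = b[-1] = 1.0
        a[0] = c[0] = a[-1] = c[-1] = 0.0
        d[0] = T_wall            # heated evaporator wall
        d[-1] = T[-1]            # film surface kept at its current value
        T = solve_tridiagonal(a, b, c, d)
    return T
```

Being implicit, each march step solves a tridiagonal system instead of updating pointwise, which keeps the scheme stable for large steps along the evaporator.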
Abstract:
Reservoirs are artificial ecosystems, intermediate between rivers and lakes, with different morphological and hydrological characteristics, that can provide many important benefits to society. However, the use of this water for human consumption, livestock watering, leisure, irrigated agriculture and fish farming directly increases the nutrient loading of aquatic environments and contributes to the acceleration of eutrophication. Furthermore, global climate models predict a higher occurrence of extreme events such as floods and severe droughts, which will create hydrological stresses in lakes. These events are already observed in the semiarid Northeast: the drought of 2012-2014 was the worst in 60 years, according to the National Water Agency (ANA). Thus, this study aimed to evaluate the water quality of semiarid tropical sources, identifying temporal patterns during periods with extreme hydrological events (floods and severe droughts). The results showed that the Gargalheiras and Cruzeta reservoirs presented significant changes in limnological variables between rainy and severe-drought periods, with better conditions in most water quality variables in the rainy season, and higher nutrient concentrations and higher electrical conductivity in the severe-drought season, indicating a decline in quality. However, the reservoirs behaved differently during severe drought. While Gargalheiras showed behavior typical of the region, with high algal biomass indicating worsening eutrophication, Cruzeta showed a collapse of total phytoplankton biomass, evidenced by the decrease in chlorophyll-a concentrations. This occurred because the shallow depth and the proximity to the sediment facilitated the resuspension of inorganic solids and, consequently, produced a turbid water column and light limitation.
In addition, the different behaviors of the two reservoirs indicate that the responses of these environments to problems such as extreme events must take into account factors such as the regional climate, the size and depth of the reservoir, and the characteristics of the basin.
Abstract:
The success achieved by thermal recovery methods in heavy oils prompted studies on the use of electromagnetic waves as heat-generating sources in oil reservoirs. This heat generation occurs through three different processes, according to the frequency range used: electromagnetic induction heating, resistive heating, and dielectric heating, the latter also known as radiation heating. This study was based on computational simulations of oil reservoirs with characteristics similar to those found in the sedimentary basins of the Brazilian Northeast. All cases were simulated using the STARS software from CMG (Computer Modelling Group, version 2012.10). Some simulations took into account the inclusion, by fracturing, of electrically sensitive particles in certain sectors of the reservoir model. The purpose of this work is to use electromagnetic induction heating as a recovery method for heavy oil and to verify the influence of these particles on the reservoir model used. Comparative analyses were made combining electromagnetic induction heating, hydraulic fracturing, and water injection under different configurations of the reservoir model. It was found that fracturing the injection well, so that electromagnetic heating occurs in the same well where water is injected, produced a considerable increase in the recovery factor and in cumulative oil production compared to models in which hydraulic fracturing occurred in the production well and water was injected in the injection well. This is due to the in-situ generation of steam in the reservoir.
Abstract:
In the oil industry, pipelines are commonly used to transport production fluids over long distances. Pipeline maintenance relies on several inspection tools, among which the most widely used are pipeline inspection gauges, popularly known as PIGs. Among the variants on the market, the instrumented PIG has particular relevance: through its numerous onboard sensors, it can detect faults or potential failures along the inspected line. Despite its versatility, the instrumented PIG suffers from speed variations, which impair the readings of its embedded sensors. Since the PIG moves at the speed of the production fluid, one way to control its speed is to control the flow through pressure control: either reducing the produced flow rate in the ducts themselves, at the cost of overall production, or using a restrictive element (valve) installed on the PIG. The flow-rate/pressure-drop characteristic of orifice-plate restrictive elements is usually derived from the ideal energy equation (Bernoulli's equation), with the losses subsequently corrected through experimental tests. Thus, in order to control the flow of fluid passing through the PIG, a solenoid-actuated valve shutter was developed. This configuration allows easy control and stabilization of the flow adjustment, with a consequent response in the pressure drop between the upstream and downstream sides of the restriction. A test bench was assembled to determine the flow coefficients, composed of a duct with an internal diameter of four inches, a set of shutters arranged on a plate, and pressure gauges for measuring the pressure drop during the tests.
The line was pressurized and, from the measured pressure drop, it was possible to draw a curve characterizing the flow coefficient of the control valve prototype and to simulate its operation in a mock-up, resulting in a PIG speed reduction of approximately 68%.
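The ideal-equation-plus-correction approach described here amounts to extracting a discharge coefficient from bench data via Q = Cd·A·sqrt(2·ΔP/ρ); a minimal Python sketch (the bore, flow rate, and pressure drop values are hypothetical, for illustration only):

```python
import math

def discharge_coefficient(q_m3s, area_m2, dp_pa, rho=1000.0):
    """Cd from the ideal (Bernoulli) orifice relation
    Q = Cd * A * sqrt(2 * dP / rho); losses are absorbed into Cd."""
    q_ideal = area_m2 * math.sqrt(2.0 * dp_pa / rho)
    return q_m3s / q_ideal

# Hypothetical bench point: restriction bore in a 4-inch duct.
d = 0.05                             # bore [m] (assumed)
A = math.pi * d ** 2 / 4.0
cd = discharge_coefficient(q_m3s=0.003, area_m2=A, dp_pa=20_000, rho=1000.0)
```

Repeating this over several pressurization points yields the flow-coefficient curve of the valve prototype mentioned in the abstract.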
Abstract:
The teaching of the lumbar puncture (LP) technique with simulators is not well systematized in the curricula of medical schools. Studies show that simulator training supports the learning of technical skills and the acquisition and retention of knowledge, improves learners' self-confidence, and enables transfer to clinical practice. In this study we introduce simulated LP training into the medical course at the Universidade Federal do Rio Grande do Norte, evaluating the experience in both quantitative terms (performance on standardized tests) and qualitative terms (students' perception of the method and of the teaching-learning process). The study was conducted in two phases. In the first phase, practical LP training was introduced in the 3rd year of medical school. Seventy-seven students were trained in small groups, guided by a checklist developed on the Objective Structured Assessment of Technical Skill (OSATS) model; at this point they knew they were not under performance evaluation. They were also asked whether they had had prior opportunities to perform an LP on patients. At the end of the first phase, the students evaluated the training in the following areas: teaching technique, simulator realism, time available per group, number of participants per group, and relevance to medical practice. In the second phase, two years later, 18 students trained in the first phase performed a new LP on the mannequin simulator, and their performance was evaluated with the same training checklist in order to verify retention of the technique. In addition, they answered a multiple-choice test on practical aspects of the LP technique. Each participant received individual feedback on their performance at the end of their participation in the study. In the first phase we found that only 4% of students had performed a lumbar puncture on a patient by the 3rd year.
The LP training with a mannequin simulator was considered relevant, and the teaching method was positively evaluated. In the second phase, all participants successfully performed the lumbar puncture on the mannequin simulator, completing most of the checklist steps in a reasonable time, which suggests that they would be able to perform the procedure on a patient.
Abstract:
Variable reluctance motors have been increasingly used as an alternative for variable-speed and high-speed drives in many industrial applications, owing to advantages such as simplicity of construction, robustness, and low cost. The most common applications in recent years are in aeronautics, electric and hybrid vehicles, and wind power generation. This work explores the theory, operation, design procedures, and analysis of a variable reluctance machine. An iterative design methodology is introduced and used to design a 1.25 kW prototype. Two methods are used for the analysis of the machine, an analytical method and finite element simulation, and the results obtained by both are compared. The finite element results are used to determine the inductance profiles and the torque of the prototype. Magnetic saturation is examined visually and numerically at four critical points of the machine. The data collected in the simulation allow the verification of the design and of the operating limits of the prototype. Moreover, the behavior of the output quantities (inductance, torque, and magnetic saturation) is analyzed under variation of the physical dimensions of the motor. Finally, a multiobjective optimization of the switched reluctance machine design using Differential Evolution and Genetic Algorithms is proposed. The optimized variables are the rotor and stator pole arcs, and the goals are to maximize the average torque, the average torque per unit copper loss, and the average torque per unit core volume. The initial and optimized designs are then compared.
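The Differential Evolution side of such a search can be illustrated with a minimal single-objective DE/rand/1/bin loop in Python. The objective below is a stand-in quadratic over hypothetical pole-arc bounds, not the authors' torque model, and a true multiobjective run would add Pareto ranking on top of this skeleton:

```python
import random

def differential_evolution(f, bounds, pop_size=20, F=0.8, CR=0.9,
                           gens=100, seed=1):
    """Minimal DE/rand/1/bin minimizer: f maps a parameter vector to a cost."""
    rng = random.Random(seed)
    dim = len(bounds)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    cost = [f(x) for x in pop]
    for _ in range(gens):
        for i in range(pop_size):
            a, b, c = rng.sample([j for j in range(pop_size) if j != i], 3)
            j_rand = rng.randrange(dim)
            trial = []
            for j in range(dim):
                if rng.random() < CR or j == j_rand:
                    lo, hi = bounds[j]
                    v = pop[a][j] + F * (pop[b][j] - pop[c][j])
                    trial.append(min(max(v, lo), hi))   # clip to bounds
                else:
                    trial.append(pop[i][j])
            f_trial = f(trial)
            if f_trial <= cost[i]:                      # greedy selection
                pop[i], cost[i] = trial, f_trial
    best = min(range(pop_size), key=cost.__getitem__)
    return pop[best], cost[best]

# Stand-in objective: "maximize average torque" cast as minimizing a
# quadratic with a hypothetical optimum at pole arcs (30.0, 22.5) degrees.
obj = lambda x: (x[0] - 30.0) ** 2 + (x[1] - 22.5) ** 2
best_x, best_cost = differential_evolution(obj, [(20.0, 40.0), (15.0, 30.0)])
```

In a real design loop, `f` would call the analytical machine model (or a surrogate of the finite element results) rather than a closed-form quadratic.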
Abstract:
In this work two two-dimensional systems, with distinct characteristics and applications, are considered. First, the adsorption of the transition metals (TM) Fe, Co, Mn and Ru is studied on extended defects formed by graphene grain boundaries. In pristine graphene, the hollow site of the carbon hexagon is the most stable for TM adsorption, and the Dirac cone in the electronic structure of graphene is maintained in the presence of the TM. For the grain boundaries considered, the TM adsorb more strongly at grain-boundary sites than on pristine graphene. From the energy barrier values, diffusion channels for the TM are observed along the grain boundaries, indicating the possible formation of TM nanolines in graphene. For the first stage of the nanolines, greater stability is observed for the system with higher TM concentration, due to TM-TM interactions. Also, owing to the magnetic moment of the TM, the nanolines are magnetized. In their most stable configurations the systems are metallic; for Fe in particular, the band structure indicates an anisotropic spin current. In a second study, the retention capacity of the clay minerals kaolinite (KAO) and montmorillonite (MMT) for the metallic contaminants Cd and Hg is considered. From the adsorption energies of the contaminants in the clay minerals, an increase in stability with increasing contaminant concentration was observed, due to the Cd-Cd and Hg-Hg interactions. It was also observed that KAO has a stronger interaction between monolayers than MMT. Hence, for the adsorption of contaminants in the natural form of KAO and MMT, the latter has a better retention capacity, owing to the smaller net work required for contaminant intercalation. However, when the clay minerals are modified with molecules that increase the spacing between monolayers, there is an optimal condition at which contaminant adsorption is more stable in the KAO system than in MMT.
In the Langmuir adsorption model, for the clay minerals at the optimal monolayer spacing, the retention capacity for Cd and Hg in the KAO system is 21% greater than in the MMT system. Also, in the X-ray Absorption Near Edge Spectroscopy (XANES) for the K edge of Cd and Hg, a positive shift of the absorption edge is found as the monolayer spacing decreases. This result indicates a possible way to determine the concentration of adsorbed contaminants relative to unadsorbed ones, from the decomposition of the experimental XANES into the obtained spectra.
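The Langmuir model invoked above has a simple closed form, q = q_max·K·C / (1 + K·C); a short Python sketch with parameters chosen only to illustrate a 21% capacity ratio (the real q_max and K values come from the computed adsorption energies, not from this example):

```python
def langmuir(c, q_max, k):
    """Langmuir isotherm: adsorbed amount q at equilibrium concentration c."""
    return q_max * k * c / (1.0 + k * c)

# Hypothetical parameters: identical affinity k, capacities differing by 21%.
c = 0.5                              # equilibrium concentration (arbitrary units)
q_kao = langmuir(c, q_max=1.21, k=4.0)
q_mmt = langmuir(c, q_max=1.00, k=4.0)
ratio = q_kao / q_mmt                # 1.21: the k*c factors cancel exactly
```

With equal K the coverage factor cancels, so the retention ratio reduces to the ratio of the saturation capacities, which is the quantity the abstract reports.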
Abstract:
Composite (pooled) clinical tests can be applied to save resources when the goal is to classify a population (identify all infected individuals) in which the prevalence rate is low, even though pooling reduces the reliability of the classification. In this context, the present work compares the performance of several classification methodologies (individual assays, Dorfman's methodology, hierarchical algorithms, and array-based tests with and without a master pool), namely their relative cost (expected number of tests to classify each individual) and their probability of misclassification (measured by the specificity and sensitivity of each methodology). The usual simulation techniques (performed with the statistical software R) were applied to populations under distinct scenarios, using different prevalence rates, several group sizes, and various levels of sensitivity and specificity. Throughout this work it was assumed that pooling the blood samples (creating the composite sample) does not affect the probability of misclassification (absence of the dilution effect), as verified in many qualitative analyses (presence or absence of infection). The simulations show that pooled tests can only be recommended in cases with low prevalence rates and low probabilities of misclassification, and it is possible to identify the most suitable methodology for each case as a function of its prevalence rate, sensitivity, and specificity. Moreover, whenever the prevalence rate, the sensitivity, and the specificity are known (or at least reasonable estimates are available), simulations can be carried out to identify the most suitable methodology and thus find a balance between the cost and the reliability of the classification.
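The relative cost of Dorfman's methodology can be checked against its closed form, 1/k + 1 - (1 - p)^k, by simulation; a minimal Python sketch in the spirit of the R experiments described above (perfect sensitivity and specificity assumed, parameters hypothetical):

```python
import random

def dorfman_cost(p, k, n_groups=200_000, seed=7):
    """Monte Carlo estimate of the expected tests per individual under
    Dorfman pooling: one pooled test per group of k; if the pool is
    positive, all k members are retested individually (perfect tests)."""
    rng = random.Random(seed)
    tests = 0
    for _ in range(n_groups):
        infected = any(rng.random() < p for _ in range(k))
        tests += 1 + (k if infected else 0)
    return tests / (n_groups * k)

cost = dorfman_cost(p=0.01, k=10)
exact = 1 / 10 + 1 - (1 - 0.01) ** 10    # closed form, about 0.196
```

At 1% prevalence with groups of 10, about 0.2 tests per individual suffice, an 80% saving over individual testing; extending the loop with sensitivity and specificity parameters reproduces the misclassification trade-off studied in the work.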
Abstract:
LINS, Filipe C. A. et al. Modelagem dinâmica e simulação computacional de poços de petróleo verticais e direcionais com elevação por bombeio mecânico. In: CONGRESSO BRASILEIRO DE PESQUISA E DESENVOLVIMENTO EM PETRÓLEO E GÁS, 5. 2009, Fortaleza, CE. Anais... Fortaleza: CBPDPetro, 2009.
Abstract:
Dissertation (Master's)