1000 results for Defect simulation
Abstract:
The distribution and mobilization of fluids in a porous medium depend on capillary, gravity, and viscous forces. In oil fields, enhanced oil recovery processes involve changing the balance and relative importance of these forces to increase the oil recovery factor. In the gas-assisted gravity drainage (GAGD) process, it is important to understand the physical mechanisms that mobilize oil through the interaction of these forces. For this reason, several authors have developed laboratory physical models and GAGD core floods to study the performance of these forces through dimensionless groups. Those models showed conclusive results; however, numerical simulation models had not been used for this type of study. Therefore, the objective of this work is to study the performance of the capillary, viscous, and gravity forces in the GAGD process and their influence on the oil recovery factor through a 2D numerical simulation model. To analyze the interplay of these forces, dimensionless groups reported in the literature were used, such as the Capillary Number (Nc), the Bond Number (Nb), and the Gravity Number (Ng), in order to determine the effectiveness of each force relative to the others. The results obtained from the numerical simulation were also compared with those reported in the literature. The results showed that, before the breakthrough time, the lower the injection flow rate, the more oil recovery is increased by the capillary force, and, after the breakthrough time, the higher the injection flow rate, the more oil recovery is increased by the gravity force. A good agreement was found between the results obtained in this research and those published in the literature. The simulation results indicated that, before gas breakthrough, higher oil recoveries were obtained at lower Nc and Nb and, after gas breakthrough, higher oil recoveries were obtained at lower Ng. The numerical models are consistent with the results reported in the literature.
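For reference, the definitions of these dimensionless groups vary between authors; the minimal sketch below (in Python, with assumed, purely illustrative property values, not data from this work) uses common forms in which Nc compares viscous to capillary forces, Nb compares gravity to capillary forces, and Ng compares gravity to viscous forces.

    # Sketch: common forms of the dimensionless groups used in GAGD studies.
    # Exact definitions vary by author; property values below are assumed.

    def capillary_number(u_darcy, mu_oil, sigma):
        """Viscous/capillary force ratio: Nc = u * mu / sigma."""
        return u_darcy * mu_oil / sigma

    def bond_number(delta_rho, g, k, sigma):
        """Gravity/capillary force ratio, with permeability as the squared
        characteristic length: Nb = delta_rho * g * k / sigma."""
        return delta_rho * g * k / sigma

    def gravity_number(k, delta_rho, g, mu_oil, u_darcy):
        """Gravity/viscous force ratio: Ng = k * delta_rho * g / (mu * u)."""
        return k * delta_rho * g / (mu_oil * u_darcy)

    # Example in SI units: 1 ft/day Darcy velocity, 5 cP oil, 20 mN/m IFT,
    # 700 kg/m3 gas-oil density contrast, 500 mD permeability.
    u = 0.3048 / 86400            # m/s
    mu = 5e-3                     # Pa.s
    sigma = 0.020                 # N/m
    drho = 700.0                  # kg/m3
    k = 500e-15                   # m2 (500 mD)

    print(capillary_number(u, mu, sigma))          # ~8.8e-7
    print(bond_number(drho, 9.81, k, sigma))       # ~1.7e-7
    print(gravity_number(k, drho, 9.81, mu, u))    # ~0.19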
Abstract:
Natural gas primary-processing platforms such as that of the Mexilhão Field (PMXL-1) in the Santos Basin, where monoethylene glycol (MEG) is used to inhibit the formation of hydrates, present operational problems caused by salt scaling in the MEG recovery unit. A bibliographic survey and analysis of salt-solubility data in mixed solvents, namely water and MEG, indicate that experimental reports are available only for a relatively restricted number of the ionic species present in the produced water, such as NaCl and KCl. The aim of this study was to develop a method for calculating salt solubilities in mixed-solvent systems, namely NaCl or KCl in aqueous mixtures of MEG. The calculation method extends the Pitzer model, with Lorimer's approach, to aqueous systems containing a salt and another solvent (MEG). The Python language, in the Eclipse Integrated Development Environment (IDE), was used to build the computational applications. The results indicate the feasibility of the proposed calculation method for a systematic series of solubility data for the salts (NaCl or KCl) in aqueous mixtures of MEG at various temperatures. Moreover, the tool developed in Python proved to be suitable for parameter-estimation and simulation purposes.
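As an illustration of the parameter-estimation workflow only (not of the Pitzer/Lorimer equations themselves), the sketch below fits a simple empirical log-solubility model to hypothetical data points with SciPy; the functional form and the numbers are assumptions, not results from the study.

    # Sketch: parameter estimation for a hypothetical empirical solubility
    # model; data points are illustrative, not measured values.
    import numpy as np
    from scipy.optimize import curve_fit

    def log_solubility(x_meg, a, b, c):
        # Hypothetical form: ln(solubility) as a quadratic in the
        # salt-free MEG mass fraction.
        return a + b * x_meg + c * x_meg**2

    # Hypothetical NaCl solubility (mol/kg solvent) vs MEG mass fraction.
    x = np.array([0.0, 0.2, 0.4, 0.6, 0.8])
    s = np.array([6.1, 5.0, 4.0, 3.1, 2.3])

    params, cov = curve_fit(log_solubility, x, np.log(s))
    print("fitted parameters:", params)
    print("predicted solubility at 50% MEG:",
          np.exp(log_solubility(0.5, *params)))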
Abstract:
This work analyzes the integration of thermal-energy simulation into the design process during the early design stages, based on six practical cases. It aims to schematize the integration process, identifying the contributions of the thermal-energy analyses at each design phase and identifying the parameters with the highest impact on building performance. The simulations were run in the DesignBuilder energy tool, which uses the validated EnergyPlus engine. This tool was chosen for its flexible and user-friendly graphic interface for modeling and output assessment, including parametric simulation to compare design alternatives. The six case studies comprise three architectural and three retrofit projects, in which the author ran the simulations either as a consultant or as a designer. The case studies were selected based on the designers' commitment to achieving performance goals and on their willingness to share the process from the early pre-design analyses, which allowed schematizing the whole process and supporting the design decisions with quantifications, including energy targets. The integration of thermal-energy performance analyses is feasible from the early stages, except when only a short time is available to run the simulations. The simulation contributions are more important during the sketch and detailing phases. The pre-design phase can be assisted by reliable bioclimatic guidelines. It was verified that every case study had two dominant design variables governing the overall performance; these variables differ according to the building characteristics and always coincide with the local bioclimatic strategies. The earlier an alternative is evaluated, the easier it is to incorporate it into the design. The use of simulation proved very useful: to demonstrate results and convince the architects; to quantify the cost-benefit and payback period for the retrofit designer; and to allow the simulation analyst to confirm the desired result and report the performance to the client.
Abstract:
The city of Natal has significant daylight availability, although its use is not systematically explored in school architecture. In this context, this research aims to establish procedures for the analysis of daylight performance in school design in Natal-RN. The method of analysis comprises the Visible Sky Factor (VSF) assessment, the simulations, and the analysis of the results. The annual variation of daylight behavior requires the adoption of dynamic simulation as the data procedure. The classrooms were modelled in SketchUp, simulated in the Daysim program, and the results were assessed by means of spreadsheets in Microsoft Excel. The classroom dimensions are 7.20 m x 7.20 m, with window-to-wall ratios (WWR) of 20%, 40%, and 50%, and with different shading devices: standard horizontal overhang, sloped overhang, standard horizontal overhang with side view protection, standard horizontal overhang with a dropped edge, standard horizontal overhang with three horizontal louvers, double standard horizontal overhang, and double standard horizontal overhang with three horizontal louvers, plus the use of a light shelf in half of the models with WWR of 40% and 50%. The data were organized in spreadsheets with two UDI intervals: between 300 lux and 2000 lux and between 300 lux and 3000 lux. The simulation was performed with the 2009 weather file for the city of Natal-RN. The graphical outputs are illuminance curves, isolines of UDI between 300 lux and 2000 lux, and tables with the index of glare occurrences and the UDI between 300 lux and 3000 lux. The best UDI300-2000lux performance was found for: Phase 1, models with WWR of 20%; Phase 2, models with WWR of 40% and 50% with light shelf. The best UDI300-3000lux performance was found for: Phase 1, models with WWR of 20% and 40% with light shelf; and Phase 2, models with WWR of 40% and 50% with light shelf. The outputs show that daylight quality depends mainly on the efficacy of the shading system in avoiding glare occurrence, which determines daylight discomfort. The bioclimatic recommendation of large openings with partial shading (an opening with direct sunlight) resulted in illuminance levels higher than the acceptable upper threshold. Increasing the shading percentage (from 73% to 91%) in medium-sized openings (WWR 40% and 50%) reduced or eliminated glare occurrence without compromising the daylight zone depth (7.20 m). The passive zone was determined for classrooms with satisfactory daylight performance, and the daylight-zone-depth rule of thumb was calculated as the ratio between the daylight zone depth and the window height for the different opening sizes; the ratio ranged from 1.54 to 2.57 for WWR of 20%, 40%, and 50%, respectively. There was a reduction or elimination of glare in the passive area with the light shelf or with awning-window shading.
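For context, the UDI metric used above is simply the fraction of occupied hours in which the illuminance at a sensor point falls within a target band; the minimal sketch below assumes an hourly illuminance series (such as a Daysim output) is already loaded as an array, and the numbers shown are hypothetical.

    # Sketch: Useful Daylight Illuminance (UDI) as the fraction of occupied
    # hours within a given illuminance band; input data are hypothetical.
    import numpy as np

    def udi_fraction(lux, low=300.0, high=2000.0):
        """Fraction of hours whose illuminance lies in [low, high] lux."""
        lux = np.asarray(lux, dtype=float)
        return np.mean((lux >= low) & (lux <= high))

    # Hypothetical illuminance series for a few occupied hours (lux).
    hours = np.array([150., 420., 900., 1800., 2600., 3200., 500.])
    print("UDI 300-2000 lx:", udi_fraction(hours))              # ~0.57
    print("UDI 300-3000 lx:", udi_fraction(hours, high=3000))   # ~0.71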
Abstract:
The hospital is a place of complex actions, where several activities for serving the population are performed, such as medical appointments, exams, surgeries, emergency care, and admission to wards and ICUs. These activities are intertwined with the anxiety, impatience, despair, and distress of patients and their families, issues involving emotional balance both for the professionals who provide services and for the people cared for by them. The healthcare crisis in Brazil is getting worse every year and today constitutes a major problem for private hospitals. The number of patients arriving at emergency departments increases progressively and, in contrast, the supply of hospital beds does not grow in the same proportion, causing overcrowding, declines in the quality of care delivered to patients, a drain of health professionals, and difficulty in managing the beds. This work presents a study that seeks to create an alternative tool to support the management of beds in a private hospital. It also seeks to identify potential issues or deficiencies and, from that, make changes in the patient flow to increase service capacity, thus reducing costs without compromising the quality of the services provided. The tool used was discrete-event computational simulation, with the aim of identifying the main parameters to be considered for a proper modeling of this system. This study took as reference the admissions unit of a private hospital in its current scenario, in which the occupancy rate of its rooms is at saturation level. The proposed reallocation of beds aims to meet the growing demand for surgeries and hospital admissions observed by the current administration.
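To make the discrete-event approach concrete, the sketch below shows a minimal bed-occupancy model written with the SimPy library; the arrival rate, length of stay, and number of beds are assumed values for illustration, not the hospital's data.

    # Sketch: minimal discrete-event model of bed occupancy (SimPy).
    import random
    import simpy

    ARRIVAL_MEAN_H = 2.0      # mean hours between admissions (assumed)
    STAY_MEAN_H = 72.0        # mean length of stay in hours (assumed)
    N_BEDS = 40
    refused = admitted = 0

    def patient(env, beds):
        global refused, admitted
        # Admit only if a bed is immediately free; otherwise count a refusal.
        if beds.count < beds.capacity:
            with beds.request() as req:
                yield req
                admitted += 1
                yield env.timeout(random.expovariate(1.0 / STAY_MEAN_H))
        else:
            refused += 1

    def arrivals(env, beds):
        while True:
            yield env.timeout(random.expovariate(1.0 / ARRIVAL_MEAN_H))
            env.process(patient(env, beds))

    env = simpy.Environment()
    beds = simpy.Resource(env, capacity=N_BEDS)
    env.process(arrivals(env, beds))
    env.run(until=24 * 365)   # simulate one year
    print(f"admitted={admitted} refused={refused}")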
Abstract:
The heavy fraction of crude oil can be used for numerous purposes, e.g. to obtain lubricating oils. In this context, many researchers have been studying alternatives for the separation of crude oil components, among which molecular distillation may be mentioned. Molecular distillation is a forced-evaporation technique different from other conventional processes in the literature. It can be classified as a special case of distillation under high vacuum, with pressures that reach extremely low values, of the order of 0.1 Pa. The evaporation and condensation surfaces must be separated by a distance of the order of magnitude of the mean free path of the evaporated molecules, so that the evaporated molecules easily reach the condenser, finding a route without obstacles, which is desirable. Thus, the main contribution of this work is the simulation of falling-film molecular distillation for crude oil mixtures. The crude oil was characterized using UniSim® Design R430 and Aspen HYSYS® V8.5. The results of this characterization were used in Microsoft® Excel® spreadsheets to calculate the physicochemical (thermodynamic and transport) properties of the residue of an oil sample. Based on these estimated properties and on boundary conditions suggested by the literature, the equations for the temperature and concentration profiles were solved by the implicit finite-difference method using the Visual Basic® (VBA) programming language for Excel®. The temperature profile proved consistent with that reproduced in the literature, with a slight deviation in its initial values because the oil studied is lighter than that of the literature. The concentration profiles were also consistent, showing that the concentration of the more volatile components decreases and that of the less volatile components increases along the evaporator length. In agreement with the transport phenomena present in the process, the velocity profile tends to increase to a peak and then decrease, while the film thickness decreases, both as functions of the evaporator length. It is concluded that the simulation code in the Visual Basic® (VBA) language is a final product of this work that can be applied to the molecular distillation of petroleum and other similar mixtures.
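The study implements the implicit finite-difference scheme in VBA; as a generic illustration of the same numerical idea, the sketch below advances a 1-D diffusion-type equation one implicit (backward-Euler) step in Python, with arbitrary, assumed property values rather than the thesis data.

    # Sketch: one implicit finite-difference step for du/dt = alpha * d2u/dx2
    # with fixed boundary values; property values are arbitrary.
    import numpy as np

    def implicit_step(u, alpha, dx, dt, u_left, u_right):
        """Solve the tridiagonal backward-Euler system for one time step."""
        n = len(u)
        r = alpha * dt / dx**2
        A = np.zeros((n, n))
        b = u.copy()
        A[0, 0] = A[-1, -1] = 1.0          # Dirichlet boundary rows
        b[0], b[-1] = u_left, u_right
        for i in range(1, n - 1):
            A[i, i - 1] = -r
            A[i, i] = 1.0 + 2.0 * r
            A[i, i + 1] = -r
        return np.linalg.solve(A, b)

    # Example: film initially at 400 K, hotter wall on one side.
    u = np.full(21, 400.0)
    for _ in range(50):
        u = implicit_step(u, alpha=1e-7, dx=1e-4, dt=0.01,
                          u_left=450.0, u_right=400.0)
    print(u.round(1))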
Abstract:
The constant need for new sources of renewable energy is driving increasing investment in this area. Among other sources, wind power has become prominent. It is important to promote the improvement of the technologies involved in wind turbine topologies, seeking alternatives that enhance the achieved performance despite the irregularity of the wind speed. This study presents a new speed-control system, here applied to wind turbines: the Electromagnetic Frequency Regulator (EFR). One of the most used devices in some topologies is the mechanical gearbox which, along with a short service life, is often a source of noise and defects. The EFR does not need these transmission boxes, representing a technological advance; it uses an adapted induction machine in which the stator becomes mobile, rigidly coupled to the turbine shaft. In the topology used in this study, the EFR also makes it possible to dispense with electronic converters to establish the coupling between the generator and the electrical grid. For the same reason, it offers the possibility of alternating-current generation with constant voltage and frequency where there is no electrical grid. Being responsible for the mechanical speed control of the generator, the EFR can be useful in other transmission systems in which controlling the output mechanical speed is the objective. In addition, the EFR operates through the combination of two inputs, one mechanical and the other electrical. This multiplies the possibilities of application, because it enables a synergistic coupling between different energy sources and, for that reason, allows the various energy sources involved to be decoupled from the grid, with the synchronous generator responsible for the connection of the system to the electrical grid, simplifying the control strategies for the power injected into it. Experimental and simulation results for a wind turbine are presented throughout this study, validating the proposal with respect to the efficiency of the system's speed control under different wind conditions.
Abstract:
In the oil industry, oil and gas pipelines are commonly used to transport production fluids over long distances. Pipeline maintenance relies on several tools, among which the most commonly used are pipeline inspection gauges, popularly known as PIGs. Among the variants existing in the market, the instrumented PIG has significant relevance: through the numerous sensors embedded in the equipment, it can detect faults or potential failures along the inspected line. Despite its versatility, the instrumented PIG suffers from speed variations, which impair the readings of its embedded sensors. Considering that the PIG moves driven by the production fluid, one way to control its speed is to control the fluid flow through pressure control, either by reducing the flow rate of the produced fluid, with a consequent reduction of the overall production in the pipeline itself, or by using a restrictive element (valve) installed on it. The flow-rate/pressure-drop characteristic of orifice-plate restrictive elements is usually derived from the ideal energy equation (Bernoulli's equation), and the losses are then corrected through experimental tests. Thus, with the objective of controlling the fluid flow passing through the PIG, a solenoid-actuated valve shutter was developed. This configuration allows easy control and stabilization of the flow adjustment, with a consequent response in the pressure drop between upstream and downstream of the restriction. A test bench was assembled for a better definition of the flow coefficients, composed of a duct with an internal diameter of four inches, a set of shutters arranged on a plate, and pressure gauges to measure the pressure drop in the tests. The line was pressurized and, based on the pressure drop, it was possible to draw a curve characterizing the flow coefficient of the control-valve prototype and to simulate its operation on a mock-up, resulting in a PIG speed reduction of approximately 68%.
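As a worked illustration of that starting point, the sketch below applies the ideal-orifice relation Q = Cd·A·sqrt(2·ΔP/ρ) and then inverts it to recover a discharge coefficient from a measured flow/pressure-drop pair; the orifice size, pressure drop, and fluid below are assumed numbers, not the bench data.

    # Sketch: ideal orifice-flow relation and discharge-coefficient estimate;
    # the test point below is hypothetical.
    import math

    def orifice_flow(cd, d_orifice, dp, rho):
        """Volumetric flow (m3/s) through an orifice of diameter d_orifice (m)
        for a pressure drop dp (Pa) in a fluid of density rho (kg/m3)."""
        area = math.pi * d_orifice**2 / 4.0
        return cd * area * math.sqrt(2.0 * dp / rho)

    def discharge_coefficient(q, d_orifice, dp, rho):
        """Invert the relation to estimate Cd from a measured (q, dp) pair."""
        area = math.pi * d_orifice**2 / 4.0
        return q / (area * math.sqrt(2.0 * dp / rho))

    # Hypothetical point: 30 mm opening, 50 kPa drop, water.
    q = orifice_flow(cd=0.61, d_orifice=0.030, dp=50e3, rho=1000.0)
    print(q)                                               # ~4.3e-3 m3/s
    print(discharge_coefficient(q, 0.030, 50e3, 1000.0))   # recovers 0.61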
Abstract:
In the last 16 years, a segment of independent producers focused on onshore basins and shallow waters has emerged in Brazil. Among the challenges faced by these companies is the development of fields whose projects have a low net present value (NPV). The objective of this work was to study, using reservoir simulation, the best technical and economic option to develop an oil field in the Brazilian Northeast. Real geology, reservoir, and production data were used to build the geological and simulation models. Because no PVT analysis was available, distillation test data, known as true boiling point (TBP) data, were used to create a fluid model and generate the PVT data. After the history match, four development scenarios were simulated: the extrapolation of production without new investments, the conversion of a producing well to immiscible gas injection, the drilling of a vertical well, and the drilling of a horizontal well. As a result, from the financial point of view, gas injection is the alternative with the lowest added value, but it may become viable if there are environmental or regulatory restrictions on flaring or venting into the atmosphere the gas produced from this field or from neighboring accumulations. The recovery factors achieved by drilling the vertical and the horizontal well are similar, but the horizontal well is a production-acceleration project; therefore, the incremental cumulative production discounted at the company's minimum attractive rate of return is higher. Depending on the Brent crude oil price and on the drilling cost, this option can be technically and financially viable.
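To make the economic comparison explicit, the sketch below discounts scenario cash flows at a minimum attractive rate of return and ranks them by NPV; the rate and the cash-flow figures are hypothetical, not the field's data.

    # Sketch: ranking development scenarios by net present value (NPV);
    # cash flows and discount rate are assumed, illustrative numbers.
    def npv(rate, cash_flows):
        """NPV of yearly cash flows; cash_flows[0] is at time zero
        (e.g. drilling cost as a negative value)."""
        return sum(cf / (1.0 + rate) ** t for t, cf in enumerate(cash_flows))

    mar = 0.10  # assumed minimum attractive rate of return
    scenarios = {
        "no new investment": [0.0, 1.2, 1.0, 0.8, 0.6],      # MM US$/year
        "vertical well":     [-2.0, 1.5, 1.3, 1.1, 0.9],
        "horizontal well":   [-3.5, 2.8, 2.2, 1.4, 0.8],
    }
    for name, flows in scenarios.items():
        print(f"{name}: NPV = {npv(mar, flows):.2f} MM US$")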
Abstract:
The teaching of the lumbar puncture (LP) technique with a simulator is not well systematized in the curricula of medical schools. Studies show that simulator training supports the learning of technical skills and the acquisition and retention of knowledge, improves learner self-confidence, and enables transfer to clinical practice. In this study we intended to introduce simulated LP training in the medical course at the Universidade Federal do Rio Grande do Norte, evaluating the experience from quantitative aspects (performance on standardized tests) and qualitative aspects (students' perception of the method and of the teaching-learning process). The study was conducted in two phases. In the first phase, practical LP training was introduced in the 3rd year of medical school. Seventy-seven students were trained in small groups, guided by a checklist developed on the Objective Structured Assessment of Technical Skill (OSATS) model; at that moment, they knew they were not under performance evaluation. They were also asked whether they had previously had the opportunity to perform an LP on patients. At the end of the first phase, the students evaluated the training in the following areas: teaching technique, simulator realism, time available per group, number of participants per group, and relevance to medical practice. In the second phase, two years later, 18 students trained in the first phase performed a new LP on the mannequin simulator, and their performance was evaluated through the same checklist used in the training, in order to verify the retention of the technique. In addition, they answered a multiple-choice test about practical aspects of the LP technique. Each participant received individual feedback on their performance at the end of their participation in the study. In the first phase of the study, we found that only 4% of the students had performed a lumbar puncture on patients up to the 3rd year. The training of the LP technique with the simulator mannequin was considered relevant, and the teaching method was highly rated. In the second phase, all participants were successful in performing the lumbar puncture on the mannequin simulator, complying with most of the steps in a reasonable time, suggesting that they would be able to perform the procedure on a patient.
Abstract:
Variable reluctance motors have been increasingly used as an alternative for variable-speed and high-speed drives in many industrial applications, due to advantages such as simplicity of construction, robustness, and low cost. The most common applications in recent years are related to aeronautics, electric and hybrid vehicles, and wind power generation. This work explores the theory, operation, design procedures, and analysis of a variable reluctance machine. An iterative design methodology is introduced and used to design a 1.25 kW prototype. Two methods are used for the analysis of the machine: an analytical method and finite-element simulation. The results obtained by both methods are compared. The finite-element simulation results are used to determine the inductance and torque profiles of the prototype. The magnetic saturation is examined visually and numerically at four critical points of the machine. The data collected in the simulation allow the verification of the design and of the operating limits of the prototype. Moreover, the behavior of the output quantities (inductance, torque, and magnetic saturation) is analyzed under variation of the physical dimensions of the motor. Finally, a multi-objective optimization of the switched reluctance machine design using Differential Evolution algorithms and Genetic Algorithms is proposed. The optimized variables are the rotor and stator pole arcs, and the goals are to maximize the average torque, the average torque per unit of copper losses, and the average torque per unit of core volume. The initial and optimized designs are then compared.
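As an illustration of how such a pole-arc optimization can be set up, the sketch below scalarizes the three objectives with a weighted sum and runs SciPy's differential evolution; the surrogate torque/loss/volume model and the weights are hypothetical placeholders, not the machine's finite-element data.

    # Sketch: weighted-sum pole-arc optimization with differential evolution;
    # the objective model below is a placeholder surrogate, not FEA results.
    import numpy as np
    from scipy.optimize import differential_evolution

    def objectives(arcs):
        beta_s, beta_r = arcs              # stator and rotor pole arcs (deg)
        # Hypothetical surrogate peaking near 30/32 degrees.
        torque = 100.0 - ((beta_s - 30.0) ** 2 + (beta_r - 32.0) ** 2)
        copper_loss = 50.0 + 0.5 * beta_s
        core_volume = 1.0 + 0.02 * beta_r
        return torque, torque / copper_loss, torque / core_volume

    def cost(arcs, w=(0.4, 0.3, 0.3)):
        t, t_per_loss, t_per_vol = objectives(arcs)
        # Negate: differential_evolution minimizes, we want to maximize.
        return -(w[0] * t + w[1] * 100.0 * t_per_loss + w[2] * 10.0 * t_per_vol)

    bounds = [(20.0, 40.0), (22.0, 42.0)]   # assumed feasible pole-arc ranges
    result = differential_evolution(cost, bounds, seed=1)
    print("optimal arcs (deg):", np.round(result.x, 2))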
Abstract:
Pooled (composite) clinical tests can be applied with the aim of saving resources when the goal is to classify a population (identify all infected individuals) in which the prevalence rate is low, even though pooling reduces the reliability of the classification. In this context, the present work aims to compare the performance of several classification methodologies (individual testing, Dorfman's methodology, hierarchical algorithms, and array-based tests with and without a master pool), namely their relative cost (expected number of tests per classified individual) and the probability of misclassification (measured by the specificity and the sensitivity of each methodology). The usual simulation techniques (carried out with the R statistical software) were applied to populations under different scenarios, using different prevalence rates, several group sizes, and several levels of sensitivity and specificity. Throughout this work it was assumed that pooling the blood samples (creating the composite sample) does not affect the probability of misclassification (absence of the dilution effect), as verified in many qualitative analyses (presence or absence of infection). The simulations performed show that composite tests can only be recommended in cases with low prevalence rates and low probabilities of classification errors, and it is possible to identify the most suitable methodology for each case as a function of its prevalence rate, sensitivity, and specificity. Moreover, whenever the prevalence rate, the sensitivity, and the specificity are known (or at least reasonable estimates are available), simulations can be performed to identify the most suitable methodology and, in this way, find a balance between the cost and the reliability of the classification.
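The study itself used R; purely as an illustration of the kind of Monte Carlo comparison described, the sketch below estimates the expected number of tests per individual and the observed sensitivity and specificity of Dorfman's two-stage pooling under the no-dilution assumption, with arbitrary example parameters.

    # Sketch: Monte Carlo estimate of Dorfman two-stage pooling performance
    # (cost per individual, observed sensitivity/specificity), no dilution.
    import numpy as np

    rng = np.random.default_rng(0)

    def dorfman(prevalence, group_size, sens, spec, n_groups=20_000):
        tests = 0
        tp = fn = tn = fp = 0
        for _ in range(n_groups):
            status = rng.random(group_size) < prevalence
            pool_pos = (rng.random() < sens) if status.any() else (rng.random() > spec)
            tests += 1
            if pool_pos:
                tests += group_size          # retest everyone individually
                for s in status:
                    pos = (rng.random() < sens) if s else (rng.random() > spec)
                    tp += s and pos; fn += s and not pos
                    fp += (not s) and pos; tn += (not s) and not pos
            else:
                fn += status.sum(); tn += group_size - status.sum()
        n = n_groups * group_size
        return tests / n, tp / max(tp + fn, 1), tn / max(tn + fp, 1)

    cost, se, sp = dorfman(prevalence=0.01, group_size=10, sens=0.95, spec=0.99)
    print(f"tests per person={cost:.3f}  sens={se:.3f}  spec={sp:.3f}")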
Abstract:
In recent years we have witnessed the proliferation of distributed electricity generation, mostly from renewable energy sources, implying a natural restructuring of the existing electrical grids, from generation to the final consumer. Constant concerns with guaranteeing quality of service, and even environmental concerns, demand that grids be operated ever more efficiently, aiming at the integration of emerging technologies such as energy storage systems. The investment in energy from renewable sources, namely solar and wind, represents an increasingly present form of electricity generation, whose great drawback is the intermittency to which these sources are subject, preventing full advantage from being taken of all the potential they provide. Energy storage systems currently exist that make it possible to optimize grid behavior. This dissertation presents an approach to some of these systems, with the main objective of demonstrating the optimization potential of electricity generation and distribution systems using energy storage, in isolated and interconnected grids. A study of the dynamic behavior of a grid under several fault scenarios, with and without energy storage, is also carried out. To this end, the basis of this work consisted of becoming familiar with a tool of great potential for the dynamic simulation of electrical grids, used by prestigious energy groups worldwide, in which the test grid was implemented and the simulations of the study were performed.
Abstract:
LINS, Filipe C. A. et al. Modelagem dinâmica e simulação computacional de poços de petróleo verticais e direcionais com elevação por bombeio mecânico. In: CONGRESSO BRASILEIRO DE PESQUISA E DESENVOLVIMENTO EM PETRÓLEO E GÁS, 5. 2009, Fortaleza, CE. Anais... Fortaleza: CBPDPetro, 2009.