936 results for Scheduler simulator


Relevance:

10.00%

Publisher:

Abstract:

This work proposes a model to investigate the use of a cylindrical antenna in the thermal recovery of high-viscosity oil through electromagnetic radiation. The antenna has a simple geometry, of an adapted dipole type, and can be modelled using Maxwell's equations. Wavelet transforms are used as basis functions and applied in conjunction with the method of moments to obtain the current distribution in the antenna. The electric field, power and temperature distributions are carefully calculated in order to analyze the antenna as an electromagnetic heating source. The energy performance is analyzed based on thermo-fluid dynamic simulations at field scale, through an adaptation of the Steam, Thermal and Advanced Processes Reservoir Simulator (STARS) by the Computer Modelling Group (CMG). The proposed model and the numerical results obtained are stable and show good agreement with results reported in the specialized literature.
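
As a rough companion to the heating analysis above, the following Python sketch illustrates the kind of energy balance involved: volumetric dissipation q = ½σ|E|² raising the temperature of the surrounding medium. All field values, material properties and the 1/r field decay are assumptions for illustration, not results from the work.

# Minimal sketch (assumed values, not from the thesis): volumetric heating of a
# lossy oil-bearing medium by a local electric field, ignoring heat conduction.
import numpy as np

sigma = 0.05                 # effective electrical conductivity of the medium, S/m (assumed)
rho, cp = 950.0, 2000.0      # bulk density (kg/m^3) and specific heat (J/kg.K) (assumed)

r = np.linspace(0.5, 5.0, 50)   # radial distance from the antenna, m
E = 100.0 / r                   # toy field decay |E| ~ 1/r, V/m (illustrative only)

q = 0.5 * sigma * E**2          # time-averaged dissipated power density, W/m^3
dt, steps = 60.0, 24 * 60       # one day of heating in 1-minute steps
T = np.full_like(r, 40.0)       # initial reservoir temperature, degC (assumed)
for _ in range(steps):
    T += q * dt / (rho * cp)    # lumped energy balance, conduction neglected

print(f"After 1 day: {T[0]:.1f} degC at 0.5 m, {T[-1]:.1f} degC at 5 m")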

Relevance:

10.00%

Publisher:

Abstract:

In the Electrical Submersible Pump (ESP) artificial lift method, energy is transmitted to the bottom of the well through a flat electric cable, where it is converted into mechanical energy by a subsurface motor connected to a centrifugal pump. The pump transfers energy to the fluid in the form of pressure, bringing it to the surface. In this method the subsurface equipment is basically divided into pump, seal and motor. The main function of the seal is to protect the motor, preventing the motor oil from being contaminated by the produced oil, which would lead to motor burnout. Over time the seal wears out and contamination of the motor oil begins, causing it to lose its insulating characteristics. This work presents the design of a magnetic sensor capable of detecting contamination of the insulating oil used in the ESP artificial lift method. The objective of this sensor is to generate an alarm signal at the moment contamination of the insulating oil appears, enabling the implementation of predictive maintenance. The prototype was designed to work in harsh conditions, reaching a depth of 2000 m and temperatures up to 150°C. Simulation software was used to define the mechanical and electromagnetic variables. Field experiments were performed to validate the prototype. The final tests, performed in an ESP system with a 62 HP motor, showed good reliability and fast response of the prototype.
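
The sketch below illustrates, in Python, the alarm logic described above in its simplest possible form: flagging the first sensor reading that drifts beyond a calibrated clean-oil baseline. The function name, baseline value and tolerance are hypothetical; the actual prototype's electromagnetic measurement chain is far more involved.

# Minimal sketch (hypothetical names and thresholds): raise an alarm when a
# sensor reading drifts beyond a calibrated clean-oil baseline.
def contamination_alarm(readings, baseline, tolerance=0.05):
    """Return the index of the first sample deviating more than `tolerance`
    (fractional) from the clean-oil baseline, or None if no alarm is raised."""
    for i, value in enumerate(readings):
        if abs(value - baseline) / baseline > tolerance:
            return i
    return None

# Example: simulated sensor trace where contamination starts at sample 6
trace = [1.00, 1.01, 0.99, 1.00, 1.02, 1.01, 1.10, 1.18, 1.25]
print(contamination_alarm(trace, baseline=1.00))   # -> 6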

Relevance:

10.00%

Publisher:

Abstract:

Many of the hydrocarbon reserves in the world consist of heavy oils (°API between 10 and 20). Moreover, several heavy oil fields are mature and thus pose great challenges for the oil industry. Among the thermal methods used to recover these resources, steamflooding has been the main economically viable alternative. The latent heat carried by the steam heats the reservoir, reducing oil viscosity and facilitating production. This method has many variations and has been studied both theoretically and experimentally (in pilot projects and in full field applications). In order to increase oil recovery and reduce steam injection costs, the injection of an alternative fluid has been used in three main ways: alternately with steam, co-injected with steam, and after steam injection has been interrupted. The main objective of these injection schemes is to reduce the amount of heat supplied to the reservoir, using cheaper fluids while maintaining the same oil production levels. This work discusses the use of carbon dioxide, nitrogen, methane and water as alternative fluids to steam. The analyzed parameters were oil recovery and net cumulative oil production. The reservoir simulation model corresponds to an oil reservoir of 100 m x 100 m x 28 m, on a Cartesian coordinate system (x, y and z directions). It is a semi-synthetic model with some reservoir data similar to those found in the Brazilian Potiguar Basin. All studied cases were run using the STARS simulator from CMG (Computer Modelling Group, version 2009.10). It was found that waterflooding after steam injection interruption achieved the highest net cumulative oil compared to the injection of the other fluids. Moreover, it was observed that steam and alternative fluids, whether co-injected or injected alternately, did not increase project profitability compared with steamflooding.
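
A minimal Python sketch of the screening arithmetic implied above, ranking injection schemes by net cumulative oil (produced oil minus injected steam expressed as an equivalent oil volume). The volumes and the steam/oil cost ratio are made up for illustration and are not the study's results.

# Minimal sketch (illustrative numbers only): ranking injection schemes by a
# simple net cumulative oil figure, Np_net = Np - steam injected (CWE) expressed
# as an equivalent oil volume, as is common in steamflood screening studies.
cost_ratio = 1.0 / 15.0   # assumed: 1 m^3 of oil pays for ~15 m^3 of steam (CWE)

scenarios = {                      # (cumulative oil m^3, cumulative steam CWE m^3)
    "steam only":              (52_000, 310_000),
    "steam then water":        (55_000, 180_000),
    "steam + CO2 co-injected": (50_000, 250_000),
}
for name, (np_oil, steam_cwe) in scenarios.items():
    net = np_oil - cost_ratio * steam_cwe
    print(f"{name:25s} net cumulative oil = {net:,.0f} m^3")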

Relevance:

10.00%

Publisher:

Abstract:

A significant fraction of the hydrocarbon reserves in the world is formed by heavy oils. Among the thermal methods used to recover these resources, steamflooding has been one of the main economically viable alternatives. In Brazil, this technology is widely used by Petrobras in Northeast fields. The latent heat carried by the steam heats the oil in the reservoir, reducing its viscosity and facilitating production. In recent years, an alternative increasingly used by the oil industry to improve the efficiency of this mechanism has been the addition of solvents. When co-injected with steam, the vaporized solvent condenses in the cooler regions of the reservoir and mixes with the oil, creating a low-viscosity zone between the steam and the heavy oil. The mobility of the displaced fluid is then improved, resulting in an increase in oil recovery. To better understand this improved oil recovery method and investigate its applicability in reservoirs with properties similar to those found in the Potiguar Basin, a numerical study was carried out to analyze the influence of some operational parameters (steam injection rate, injected solvent volume and solvent type) on oil recovery. Simulations were performed in STARS ("Steam, Thermal, and Advanced Processes Reservoir Simulator"), a CMG ("Computer Modelling Group") program, version 2009.10. It was found that adding solvents to the injected steam not only anticipated the arrival of the heated oil bank at the producer well, but also increased oil recovery. Lower cold water equivalent volumes were required to achieve the same oil recoveries obtained from the models that injected only steam. Furthermore, much of the injected solvent was produced along with the oil from the reservoir.
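
The sketch below merely enumerates a full-factorial design over the three operational parameters mentioned, as one might do before submitting cases to a thermal simulator. The specific rates, fractions and solvent names are assumed values, not those of the study.

# Minimal sketch (hypothetical parameter levels): enumerating the full-factorial
# combinations of the operational parameters studied, to be fed case by case to
# a thermal reservoir simulator.
from itertools import product

steam_rates   = [12.5, 25.0, 50.0]        # t/day (assumed levels)
solvent_fracs = [0.05, 0.10, 0.15]        # volume fraction co-injected (assumed)
solvents      = ["C5", "C7", "C9"]        # light hydrocarbon solvents (assumed)

runs = [
    {"steam_rate": q, "solvent_frac": f, "solvent": s}
    for q, f, s in product(steam_rates, solvent_fracs, solvents)
]
print(f"{len(runs)} simulation cases")   # 27 cases for this 3x3x3 design
print(runs[0])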

Relevance:

10.00%

Publisher:

Abstract:

The transmission of ultraviolet radiation at wavelengths between 250 and 360 nm through the hair coat and epidermis of cattle was determined in the laboratory, using skin samples from recently slaughtered animals. The amount of radiation transmitted through the coat depends on its colour and also on its structural characteristics (coat thickness; hair length, diameter, number and inclination), which define the mean path of a photon through the hair mass (L). The highest transmission is provided by white coats with high L values, whereas black coats generally present null or very low transmission. The less pigmented the epidermis, the greater the transmission of radiation through its surface. The best protection is provided by black coats with a low L value over an equally black epidermis; however, given the heating caused by the absorption of thermal radiation (in Holstein cows the temperature of the black patches reaches 44.1°C while that of the white patches is 37.7°C), the ideal combination for tropical environments is a white coat with a low L value over a black epidermis, a combination rarely found in animals of European breeds. An alternative would be a black coat with a low L value. Red animals present high UV transmission through both the epidermis and the coat and are not recommended for tropical environments. However, a Holstein cow was observed with isolated areas of black epidermis covered with white hair, which may open prospects for selecting more adequate epidermis and coat combinations in cattle of European breeds.

Relevance:

10.00%

Publisher:

Abstract:

This work was carried out at the Faculdade de Ciências Agrárias e Veterinárias - Unesp - Jaboticabal with the objective of evaluating the influence of the pasture sampling method on the in situ degradability of the dry matter and fibrous fraction of Marandu grass harvested in the dry seasons of 2003 and 2005. The experiment was set up in a randomized block design with split plots and three replicates, represented by the sampled paddocks. In the plots, five forage sampling methods were evaluated (the metallic quadrat method; evaluation using extrusa from Nellore cattle; evaluation using extrusa from crossbred cattle (Red Angus x Nellore); hand-plucking simulating grazing by Nellore cattle; and hand-plucking simulating grazing by crossbred cattle (Red Angus x Nellore)), and the subplots consisted of the sampling years, 2003 and 2005. The ruminal kinetics fractions were determined: soluble fraction A, potentially degradable insoluble fraction B, degradation rate Kd, potential degradation (DP) and non-degradable fraction C of the DM, as well as the potential degradation (DP), non-degradable fraction C and degradation rate Kd of the NDF and ADF. According to the results obtained, the metallic quadrat method underestimated the degradation characteristics of the grass. The simulated grazing methods were very similar to the extrusa methods; however, it was the experience of the person performing the simulated grazing that ensured efficient sampling, as shown by the data obtained in 2003 and 2005.
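
For reference, the fractions cited above are normally tied together by the classical first-order in situ degradation model, p(t) = A + B(1 - e^(-Kd·t)), with potential degradation DP = A + B and non-degradable fraction C = 100 - DP. The short Python sketch below evaluates that model with illustrative parameter values (not the estimates reported in the work).

# Minimal sketch (illustrative parameter values): first-order in situ degradation
# model, p(t) = A + B*(1 - exp(-Kd*t)); DP = A + B and C = 100 - DP.
import numpy as np

A, B, Kd = 22.0, 48.0, 0.035    # %, %, 1/h (assumed, not the paper's estimates)
t = np.array([0, 6, 12, 24, 48, 72, 96], dtype=float)   # incubation times, h

p = A + B * (1.0 - np.exp(-Kd * t))   # disappeared dry matter, %
DP = A + B
C = 100.0 - DP
print(f"DP = {DP:.1f}%  C = {C:.1f}%")
print(np.round(p, 1))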

Relevance:

10.00%

Publisher:

Abstract:

Controlling erosion processes is a recognized need in several branches of engineering. The earthworks required to build highway slopes, for example, may result in cut or fill slopes vulnerable to surface erosion. Among the technological alternatives for erosion control, the use of geosynthetics stands out as a potential solution. This application is under intensive development in some countries, such as the USA. In Brazil, the specification of geosynthetics for erosion control is limited by the lack of characterization of these products and of national standards, the manufacturers' catalogues being the only technical source of information. In this context, the objective of this work is to build an apparatus and develop test methods for the characterization and evaluation of geosynthetics used to control surface erosion, based on ASTM D7101. Besides a rainfall simulator, the apparatus comprises a test bench formed by a runoff ramp, a support table and soil cores. Using the bench, tests were performed to evaluate the operation of the apparatus and the performance of a geomat in reducing the surface erosion rate. The tests were run with rainfall intensities of 100 ± 4 mm/h and 150 ± 4 mm/h for 30 minutes, with readings every 5 minutes. The results obtained in the tests without the geomat showed marked soil loss during the simulated rainfalls, with a tendency of linear growth of the cumulative soil loss as a function of test time. In the tests performed with the geomat, the protective action of the geosynthetic was observed, with a reduction of about 90% in the cumulative soil loss for all rainfall intensities used.
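
A small Python sketch of the bookkeeping behind the result above: accumulating the 5-minute soil-loss readings of a 30-minute test and computing the reduction provided by the geomat relative to bare soil. The readings are made up; only the calculation is illustrated.

# Minimal sketch (made-up readings): cumulative soil loss over a 30-minute
# simulated rainfall and the reduction obtained with the erosion-control geomat.
bare   = [110, 128, 131, 140, 152, 160]   # g of dry soil per 5-min reading (assumed)
geomat = [ 12,  13,  14,  15,  14,  16]   # same test with the geomat installed (assumed)

loss_bare   = sum(bare)
loss_geomat = sum(geomat)
reduction = 100.0 * (1.0 - loss_geomat / loss_bare)
print(f"Cumulative loss: {loss_bare} g (bare) vs {loss_geomat} g (geomat)")
print(f"Reduction: {reduction:.0f}%")   # on the order of the ~90% reported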

Relevance:

10.00%

Publisher:

Abstract:

The progressing cavity pump (PCP) artificial lift system is one of the main lift systems used in the oil production industry. As the application of this artificial lift method grows, knowledge of its dynamic behaviour, the application of automatic control and the development of specialist systems for equipment selection become increasingly useful. This work presents tools for dynamic analysis, control techniques and a specialist system for selecting lift equipment for this artificial lift technology. The PCP artificial lift system consists of a progressing cavity pump installed downhole at the end of the production tubing. The pump consists of two parts, a stator and a rotor, and is set in motion by the rotation of the rotor, transmitted through a rod string installed in the tubing. The surface equipment generates and transmits the rotation to the rod string. First, the development of a complete mathematical dynamic model of the PCP system is presented. This model is simplified for use under several conditions, including steady state, for sizing PCP equipment such as the pump, rod string and drive head. The model is used to implement a computer simulator able to support system analysis and to behave as a well under control, allowing control algorithms to be tested and developed. Next, control techniques are applied to the PCP system to optimize the pumping speed so as to achieve both productivity and durability of the downhole components. The mathematical model is linearized so that conventional control techniques can be applied, including observability and controllability analysis of the system and the development of design rules for a PI controller. Stability conditions are stated for the operating point of the system. A fuzzy rule-based control system is then developed from the PI controller, using an inference machine based on Mamdani operators. Fuzzy logic is also applied to develop a specialist system that selects PCP equipment. The simulation techniques developed and the linearized model were used in an actual well where a control system was installed, consisting of a pump intake pressure sensor, an industrial controller and a variable speed drive. The PI and fuzzy controllers were applied to optimize the operation of the simulated and actual wells, and the results were compared. The simulated and actual open-loop responses were compared to validate the simulation. A case study was carried out to validate the equipment selection specialist system.
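
As an illustration of the control layer described above, the Python sketch below runs a discrete PI loop that adjusts pump speed to hold the intake pressure at a setpoint, against a deliberately simplified linear well response. The gains, limits and well parameters are assumptions; this is not the thesis' model or controller.

# Minimal sketch (toy linear well model, assumed gains): a discrete PI loop that
# adjusts the PCP rotation speed to hold the pump intake pressure at a setpoint.
kp, ki, dt = 2.0, 0.4, 1.0       # proportional/integral gains and sample time (assumed)
setpoint = 30.0                  # target intake pressure, kgf/cm^2 (assumed)

pressure, integral = 60.0, 0.0   # initial intake pressure and integrator state
for _ in range(60):
    error = pressure - setpoint                  # positive error -> speed up the pump
    integral += error * dt
    speed = min(400.0, max(50.0, 100.0 + kp * error + ki * integral))   # rpm, clamped
    # toy inflow/drawdown response: equilibrium pressure falls linearly with speed
    pressure += dt * (-0.1 * (pressure - (90.0 - 0.3 * speed)))

print(f"speed = {speed:.0f} rpm, intake pressure = {pressure:.1f} kgf/cm^2")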

Relevance:

10.00%

Publisher:

Abstract:

The two-dimensional periodic structures called frequency selective surfaces have been widely investigated because of their filtering properties. Similar to filters that work in the traditional radiofrequency bands, such structures can behave as band-stop or band-pass filters, depending on the elements of the array (patches or apertures, respectively), and can be used in a variety of applications, such as radomes, dichroic reflectors, waveguide filters, artificial magnetic conductors and microwave absorbers. To provide high-performance filtering at microwave bands, electromagnetic engineers have investigated various types of periodic structures: reconfigurable frequency selective screens, multilayered selective filters, and periodic arrays printed on anisotropic dielectric substrates or composed of fractal elements. In general, there is no closed-form solution that leads directly from a desired frequency response to the corresponding device; thus, the analysis of its scattering characteristics requires the application of rigorous full-wave techniques. Moreover, because of the computational cost of using a full-wave simulator to evaluate the scattering variables of a frequency selective surface, many electromagnetic engineers still resort to a trial-and-error process until a given design criterion is achieved. As this procedure is laborious and strongly dependent on the designer, optimization techniques are required to design practical periodic structures with the desired filter specifications. Some authors have employed neural networks and natural optimization algorithms, such as genetic algorithms and particle swarm optimization, for frequency selective surface design and optimization. The objective of this work is to carry out a rigorous study of the electromagnetic behaviour of periodic structures, enabling the design of efficient devices for the microwave band. To this end, artificial neural networks are used together with natural optimization techniques, allowing the accurate and efficient investigation of various types of frequency selective surfaces in a simple and fast manner, becoming a powerful tool for the design and optimization of such structures.
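
The sketch below illustrates the optimization side of the approach with a tiny particle swarm searching a single geometric parameter of a toy resonance formula. A trained neural network would replace the closed-form surrogate used here; the formula, bounds and PSO settings are assumptions for illustration only.

# Minimal sketch (toy surrogate in place of the ANN): particle swarm optimization
# tuning a patch length L (mm) so a rough half-wavelength resonance estimate hits
# a 10 GHz target.
import random

def resonant_freq_ghz(L_mm, eps_eff=2.2):
    c = 299.792458                             # mm*GHz (speed of light)
    return c / (2.0 * L_mm * eps_eff ** 0.5)   # rough textbook estimate, not the thesis model

target = 10.0
cost = lambda L: (resonant_freq_ghz(L) - target) ** 2

# very small PSO: 12 particles, 1-D search space 5..25 mm
n, w, c1, c2 = 12, 0.7, 1.5, 1.5
x = [random.uniform(5, 25) for _ in range(n)]
v = [0.0] * n
pbest = x[:]
gbest = min(x, key=cost)
for _ in range(100):
    for i in range(n):
        r1, r2 = random.random(), random.random()
        v[i] = w * v[i] + c1 * r1 * (pbest[i] - x[i]) + c2 * r2 * (gbest - x[i])
        x[i] = min(25.0, max(5.0, x[i] + v[i]))
        if cost(x[i]) < cost(pbest[i]):
            pbest[i] = x[i]
    gbest = min(pbest, key=cost)
print(f"L = {gbest:.2f} mm -> f = {resonant_freq_ghz(gbest):.2f} GHz")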

Relevance:

10.00%

Publisher:

Abstract:

Internet applications such as media streaming, collaborative computing and massively multiplayer games are on the rise. This leads to the need for multicast communication, but group communication support based on IP multicast has unfortunately not been widely adopted, due to a combination of technical and non-technical problems. Therefore, a number of different application-layer multicast schemes have been proposed in the recent literature to overcome these drawbacks. In addition, these applications often behave as both providers and clients of services, being called peer-to-peer applications, and their participants come and go very dynamically. Thus, server-centric architectures for membership management have well-known problems related to scalability and fault tolerance, and even traditional peer-to-peer solutions need some mechanism that takes the members' volatility into account. The idea of location awareness is to distribute the participants in the overlay network according to their proximity in the underlying network, allowing better performance. In this context, this thesis proposes an application-layer multicast protocol, called LAALM, which takes the actual network topology into account in the assembly process of the overlay network. The membership algorithm uses a new metric, IPXY, to provide location awareness through the processing of local information, and was implemented using a distributed, shared and bi-directional tree. The algorithm also includes a sub-optimal heuristic to minimize the cost of the membership process. The protocol was evaluated in two ways. First, through a simulator developed in this work, in which the quality of the distribution tree was evaluated by metrics such as out-degree and path length. Second, real-life scenarios were built in the ns-3 network simulator, in which the protocol's network performance was evaluated by metrics such as stress, stretch, time to first packet and group reconfiguration time.
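
A minimal Python sketch of the location-aware membership idea: each joining peer attaches to the nearest already-joined peer whose out-degree is still below a fan-out limit. The one-dimensional coordinates stand in for the IPXY proximity metric and are invented for the example; LAALM's actual tree maintenance is richer than this.

# Minimal sketch (hypothetical metric values): location-aware parent selection
# for an overlay distribution tree with a bounded out-degree.
MAX_OUT_DEGREE = 3

def join(tree, children, distance, new_peer):
    """tree: {peer: parent}; children: {peer: [children]}; distance(a, b) -> cost."""
    candidates = [p for p in tree if len(children[p]) < MAX_OUT_DEGREE]
    parent = min(candidates, key=lambda p: distance(p, new_peer))
    tree[new_peer] = parent
    children[parent].append(new_peer)
    children[new_peer] = []
    return parent

# toy 1-D "network coordinates" standing in for underlay proximity
coords = {"root": 0, "a": 3, "b": 9, "c": 4, "d": 10}
dist = lambda p, q: abs(coords[p] - coords[q])
tree, children = {"root": None}, {"root": []}
for peer in ["a", "b", "c", "d"]:
    print(peer, "->", join(tree, children, dist, peer))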

Relevance:

10.00%

Publisher:

Abstract:

This thesis presents a new robust adaptive controller structure applied to mobile robots (surface mobile robots) with nonholonomic constraints. It acts on both the dynamics and the kinematics of the robot and is split into two distinct parts. The first part controls the robot dynamics, using variable structure model reference adaptive controllers. The second part controls the robot kinematics, using a position controller whose objective is to make the robot reach any point in the Cartesian plane. The kinematic controller is based only on information about the robot configuration. A decoupling method is adopted to transform the linear model of the mobile robot, a multiple-input multiple-output system, into two decoupled single-input single-output systems, thus reducing the complexity of designing the controller for the mobile robot. A variable structure model reference adaptive controller is then applied to each of the resulting systems. One of these controllers is responsible for the robot position and the other for the heading angle, using reference signals generated by the position controller. To validate the proposed structure, simulated and experimental results using the differential drive mobile robots of a robot soccer kit are presented. The simulator reproduces the main characteristics of the real physical system, such as noise and non-linearities like dead zone and saturation. The experimental results were obtained through a C++ program applied to the robot soccer kit of the Microrobot team at LACI/UFRN. The simulated and experimental results are presented and discussed at the end of the text.
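
The Python sketch below shows a conventional go-to-goal position controller over unicycle kinematics, the kind of outer loop that would generate the reference signals mentioned above. Gains and the goal point are assumed values; the variable structure adaptive controllers themselves are not reproduced here.

# Minimal sketch (standard unicycle kinematics, assumed gains): a simple
# go-to-goal position controller generating linear/angular velocity commands.
import math

kv, kw, dt = 0.8, 2.0, 0.05          # linear/angular gains and time step (assumed)
x, y, theta = 0.0, 0.0, 0.0          # robot pose (m, m, rad)
gx, gy = 1.0, 0.8                    # goal in the Cartesian plane (m, assumed)

for _ in range(400):
    rho = math.hypot(gx - x, gy - y)                       # distance to goal
    if rho < 0.01:
        break
    alpha = math.atan2(gy - y, gx - x) - theta             # heading error
    alpha = math.atan2(math.sin(alpha), math.cos(alpha))   # wrap to [-pi, pi]
    v, w = kv * rho, kw * alpha                            # velocity commands
    x += v * math.cos(theta) * dt                          # integrate unicycle kinematics
    y += v * math.sin(theta) * dt
    theta += w * dt

print(f"final pose: x={x:.2f} m, y={y:.2f} m (goal 1.00, 0.80)")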

Relevance:

10.00%

Publisher:

Abstract:

This work analyzes the behaviour of the gas flow of plunger lift wells producing to well testing separators on offshore production platforms, aiming at a technical procedure to estimate the gas flow rate during the slug production period. The motivation for this work arose from the prospect of PETROBRAS equipping some wells with the plunger lift method in the Ubarana offshore field, located off the coast of Rio Grande do Norte State, where the produced fluids are measured in well testing separators on the platform. The plunger lift artificial lift method is used when the available reservoir energy is not high enough to overcome all the load losses needed to lift the oil from the bottom of the well to the surface continuously. This method consists, basically, of a free piston acting as a mechanical interface between the formation gas and the produced liquids, greatly increasing the well's lifting efficiency. A pneumatic control valve mounted on the flow line controls the cycles. When this valve opens, the plunger starts to move from the bottom of the well to the surface, lifting all the oil and gas above it until it reaches the well test separator, where the fluids are measured. The well test separator is used to measure all the volumes produced by the well during a certain period of time called a production test. In most cases, separators are designed to measure stabilized flow, that is, reasonably constant flow, by means of electronic level and pressure controllers (PLC) and the assumption of a steady pressure inside the separator. With plunger lift wells, the liquid and gas flows at the surface are cyclical and unstable, which causes slugs inside the separator, mainly in the gas phase, and introduces significant errors in the measurement system (e.g. overrange errors). The gas flow analysis proposed in this work is based on two mathematical models used together: i) a plunger lift well model proposed by Baruzzi [1], with later modifications made by Bolonhini [2] to build a plunger lift simulator; ii) a two-phase (gas + liquid) separator model derived from the three-phase (gas + oil + water) separator model proposed by Nunes [3]. Based on these models and on field data collected from the well test separator of the PUB-02 platform (Ubarana field), it was possible to demonstrate that the separator output gas flow can be estimated, with reasonable precision, from the control signal of the Pressure Control Valve (PCV). Several models from the MATLAB® System Identification Toolbox were analyzed to evaluate which one best fits the data collected in the field. For model validation, the AIC criterion was used, as well as a variant of the cross-validation criterion. The ARX model showed the best fit to the data, and so a recursive algorithm (RARX) was also evaluated with real-time data. The results were quite promising, indicating the viability of estimating the output gas flow rate of a plunger lift well producing to a well test separator from the built-in information of the PCV control signal.
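
As a sketch of the identification step described above, the Python fragment below fits a first-order ARX model by least squares to synthetic data, with u standing in for the PCV control signal and y for the separator gas flow. The "true" coefficients and noise level are invented for the example.

# Minimal sketch (synthetic data, not field data): fitting a first-order ARX model
# y[k] = a*y[k-1] + b*u[k-1] + e[k] by ordinary least squares.
import numpy as np

rng = np.random.default_rng(0)
N = 500
u = rng.uniform(0, 1, N)                      # stand-in for the PCV opening signal
y = np.zeros(N)
for k in range(1, N):                         # "true" system used to generate the data
    y[k] = 0.85 * y[k-1] + 0.6 * u[k-1] + 0.02 * rng.standard_normal()

Phi = np.column_stack([y[:-1], u[:-1]])       # regressors [y[k-1], u[k-1]]
a_hat, b_hat = np.linalg.lstsq(Phi, y[1:], rcond=None)[0]
print(f"a = {a_hat:.3f} (true 0.85), b = {b_hat:.3f} (true 0.60)")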

Relevance:

10.00%

Publisher:

Abstract:

Some approaches take advantage of unused computational resources in Internet nodes - users' machines. In recent years, peer-to-peer (P2P) networks have been gaining momentum, mainly due to their support for scalability and fault tolerance. However, current P2P architectures present some problems, such as node overhead due to message routing, a great number of node reconfigurations when the network topology changes, routing of traffic inside a specific network even when the traffic is not directed to a machine in that network, and the lack of a relationship between the proximity of nodes in the P2P overlay and their proximity in the IP network. Although some architectures use information about node distances in the IP network, they rely on methods that require dynamic information. In this work we propose a P2P architecture that addresses the problems mentioned above. It is composed of three parts. The first part is a basic P2P architecture, called SGrid, which maintains a relationship between the position of nodes in the P2P network and their position in the IP network; it assigns adjacent key regions to nodes of the same organization. The second part is a protocol called NATal (Routing and NAT application layer) that extends the basic architecture in order to remove from the nodes the responsibility of routing messages. The third part consists of a special kind of node, called LSP (Lightware Super-Peer), which is responsible for maintaining the P2P routing table. In addition, this work also presents a simulator that validates the architecture and a module of the NATal protocol to be used in Linux routers.
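
A minimal Python sketch of SGrid's key assignment idea as described above: carving a circular identifier space into contiguous regions so that nodes of the same organization receive adjacent key ranges. Organization and node names, and the key-space size, are hypothetical.

# Minimal sketch (hypothetical layout): contiguous key regions per organization,
# so nodes of the same organization hold adjacent ranges of the identifier space.
KEY_SPACE = 2 ** 16

orgs = {                       # organization -> its nodes (made-up names)
    "org-A": ["a1", "a2", "a3"],
    "org-B": ["b1", "b2"],
    "org-C": ["c1", "c2", "c3"],
}
total_nodes = sum(len(nodes) for nodes in orgs.values())
region = KEY_SPACE // total_nodes

assignment, start = {}, 0
for org, nodes in orgs.items():           # nodes of one org get consecutive regions
    for node in nodes:
        assignment[node] = (start, start + region - 1)
        start += region

for node, (lo, hi) in assignment.items():
    print(f"{node}: keys {lo}..{hi}")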

Relevance:

10.00%

Publisher:

Abstract:

The monitoring of patients in hospitals is usually done either manually or in a semi-automated way, in which the members of the healthcare team must constantly visit the patients to ascertain their health condition. This procedure, however, compromises the quality of the monitoring, since the shortage of physical and human resources in hospitals tends to overwhelm the healthcare team, preventing its members from visiting patients with adequate frequency. Given this, many works in the literature propose alternatives aimed at improving this monitoring through the use of wireless networks. In these works, the network is intended only for the data traffic generated by medical sensors, and there is no possibility of allocating it to the transmission of data from applications running on the user stations already present in the hospital. In hospital automation environments, however, this is a drawback, considering that the data generated by such applications can be directly related to the patient monitoring being conducted. Thus, this thesis defines Wi-Bio as a communication protocol aimed at the establishment of IEEE 802.11 networks for patient monitoring, capable of enabling the harmonious coexistence of the traffic generated by medical sensors and by user stations. The formal specification and verification of Wi-Bio were carried out through the design and analysis of Petri net models. Its validation was performed through simulations with the Network Simulator 2 (NS2) tool. The NS2 simulations were designed to portray a real patient monitoring environment corresponding to a floor of the nursing wards sector of the University Hospital Onofre Lopes (HUOL), located in Natal, Rio Grande do Norte. Moreover, in order to verify the feasibility of Wi-Bio with respect to the wireless network standards prevailing in the market, the test scenario was also simulated under a perspective in which the network elements used the HCCA access mechanism described in the IEEE 802.11e amendment. The results confirmed the validity of the designed Petri nets and showed that Wi-Bio, in addition to presenting superior performance compared to HCCA on most of the analyzed items, was also able to promote an efficient integration between the data generated by medical sensors and by user applications on the same wireless network.
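
Since the protocol is verified with Petri nets, the Python sketch below illustrates the basic firing rule such models rely on, using a two-transition toy net (not the Wi-Bio nets themselves): a transition fires only when every input place holds enough tokens, consuming and producing tokens accordingly.

# Minimal sketch (toy net, not the Wi-Bio models): the Petri net firing rule.
marking = {"idle": 1, "channel_free": 1, "sending": 0, "done": 0}

transitions = {                         # name -> (input places, output places)
    "start_tx":  ({"idle": 1, "channel_free": 1}, {"sending": 1}),
    "finish_tx": ({"sending": 1},                 {"done": 1, "channel_free": 1}),
}

def enabled(name):
    pre, _ = transitions[name]
    return all(marking[p] >= n for p, n in pre.items())

def fire(name):
    pre, post = transitions[name]
    for p, n in pre.items():
        marking[p] -= n
    for p, n in post.items():
        marking[p] += n

for t in ["start_tx", "finish_tx"]:
    if enabled(t):
        fire(t)
print(marking)   # {'idle': 0, 'channel_free': 1, 'sending': 0, 'done': 1}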

Relevance:

10.00%

Publisher:

Abstract:

Image segmentation is one of the image processing problems that deserves special attention from the scientific community. This work studies unsupervised clustering and pattern recognition methods applicable to medical image segmentation. Methods based on Natural Computing have proved very attractive for such tasks and are studied here as a way to verify their applicability to medical image segmentation. This work implements the following methods: GKA (Genetic K-means Algorithm), GFCMA (Genetic FCM Algorithm), PSOKA (PSO and K-means based Clustering Algorithm) and PSOFCM (PSO and FCM based Clustering Algorithm). In addition, as a way to evaluate the results produced by the algorithms, clustering validity indexes are used as quantitative measures. Visual and qualitative evaluations are also carried out, mainly using data provided by the BrainWeb brain simulator as ground truth.
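
As a simplified stand-in for the clustering step evaluated in the work, the Python sketch below segments a synthetic intensity image into three classes with plain K-means (Lloyd iterations). The class means, noise level and initialization are assumptions; the GKA/PSOKA variants add evolutionary or swarm search on top of this core.

# Minimal sketch (synthetic image, plain K-means): clustering voxel intensities
# into three tissue classes.
import numpy as np

rng = np.random.default_rng(1)
# synthetic "slice": three tissue classes with different mean intensities + noise
image = np.concatenate([rng.normal(m, 8, 2000) for m in (40, 110, 180)])

k = 3
centers = np.quantile(image, [0.1, 0.5, 0.9])       # data-driven initial centroids
for _ in range(20):                                 # Lloyd iterations
    labels = np.argmin(np.abs(image[:, None] - centers[None, :]), axis=1)
    centers = np.array([image[labels == j].mean() for j in range(k)])

print("estimated class means:", np.round(centers, 1))   # close to [40, 110, 180]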