76 results for Discrete Simulation with Continuous Components
Abstract:
The present study provides a methodology that gives a predictive character to computer simulations based on detailed models of the geometry of a porous medium. The FLUENT software was used to investigate the flow of a viscous Newtonian fluid through a random fractal medium, a simplified two-dimensional disordered porous medium representing a petroleum reservoir. The fractal model is formed by obstacles of various sizes whose size-distribution function follows a power law, the exponent of which is defined as the fractionation fractal dimension Dff of the model and characterizes the fragmentation process of these obstacles. The obstacles are randomly placed in a rectangular channel. The modeling process incorporates modern scaling-law concepts to analyze the influence of the heterogeneity found in the porosity and permeability fields, so as to characterize the medium in terms of its fractal properties. This procedure allows a numerical analysis of the permeability k and the drag coefficient Cd, and the proposal of power-law relationships for these properties under various modeling schemes. The purpose of this research is to study the variability introduced by these heterogeneities: the velocity field and other details of the viscous fluid dynamics are obtained by numerically solving the continuity and Navier-Stokes equations at the pore level, and the effect of the fractionation fractal dimension on the hydrodynamic properties is observed. Two classes of models were considered: models with constant porosity (MPC) and models with varying porosity (MPV). The results allowed us to find numerical relationships between the permeability, the drag coefficient, and the fractionation fractal dimension of the medium. Based on these numerical results, scaling relations and algebraic expressions involving the relevant parameters of the phenomenon are proposed.
In this study, analytical equations were also derived for Dff as a function of the geometric parameters of the models. We further found that the permeability and the drag coefficient are inversely proportional to one another. The difference in behavior is most striking in the MPV class of models; that is, the varying porosity in these models is an additional factor that plays a significant role in the flow analysis. Finally, the results proved satisfactory and consistent, demonstrating the effectiveness of the proposed methodology for all applications analyzed in this study.
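The power-law size distribution that defines Dff can be sketched numerically. In the sketch below the size range, the exponent value, and the estimator are illustrative assumptions, not parameters taken from the thesis: obstacle sizes are drawn from a Pareto law with cumulative count N(>s) ~ s^(-Dff), and the exponent is recovered with the maximum-likelihood (Hill) estimator.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_obstacle_sizes(n, d_ff, s_min=1.0):
    """Inverse-transform sampling of a Pareto law: N(>s) ~ s**(-d_ff).
    s_min and d_ff are illustrative, not thesis parameters."""
    u = rng.random(n)
    return s_min * u ** (-1.0 / d_ff)

def estimate_dff(sizes, s_min=1.0):
    """Maximum-likelihood (Hill) estimator of the power-law exponent."""
    return len(sizes) / np.sum(np.log(sizes / s_min))

sizes = sample_obstacle_sizes(20000, d_ff=1.6)
print(round(estimate_dff(sizes), 2))  # close to the true exponent 1.6
```

With 20000 samples the Hill estimator recovers the exponent to about one percent, which is the kind of check one would run before placing the obstacles in the channel geometry.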
Abstract:
Petroleum evaluation consists of analyzing crude oil by different methodologies, following international standards, to determine its chemical and physicochemical properties, contaminant levels, and composition, and especially its ability to generate derivatives. Many of these analyses consume a lot of time, require large amounts of sample and supplies, and depend on organized logistics for transportation, scheduling, and the professionals involved. Looking for alternatives that optimize the evaluation and enable the use of new technologies, seven centrifuged samples of different Brazilian oils, previously characterized by Petrobras, were analyzed by thermogravimetry over the 25-900 °C range using heating rates of 5, 10, and 20 °C per minute. From the experimental data, characterization correlations were developed that provided: the generation of simulated true boiling point (TBP) curves; a comparison of the generated fractions with the appropriate standard cuts over temperature ranges; an approach for obtaining the Watson characterization factor; and a comparison of the micro carbon residue formed. The results showed a good possibility of reproducing the simulated TBP curve from thermogravimetry, taking into account the composition, density, and other oil properties. The proposed correlations for the experimental characterization factor and carbon residue followed the Petrobras characterizations, showing that thermogravimetry can be used as a tool in oil evaluation because of its quick analysis, its accuracy, and its minimal requirements for samples and consumables.
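The Watson characterization factor mentioned above has a standard closed form, Kw = Tb^(1/3) / SG, with Tb the mean average boiling point in degrees Rankine and SG the specific gravity at 60 °F. A minimal sketch, with input values that are illustrative and not taken from the thesis samples:

```python
def watson_k(mean_boiling_point_c, specific_gravity):
    """Watson characterization factor K_w = T_b**(1/3) / SG,
    with T_b the mean average boiling point in degrees Rankine."""
    t_rankine = (mean_boiling_point_c + 273.15) * 1.8
    return t_rankine ** (1.0 / 3.0) / specific_gravity

# Illustrative values only (not from the thesis): a mid-range crude.
print(round(watson_k(370.0, 0.87), 2))  # ≈ 12.07
```

Values near 12 indicate an intermediate paraffinic/naphthenic base, which is the kind of classification the thermogravimetric correlations aim to reproduce.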
Abstract:
The increase in ultraviolet (UV) radiation at the surface, the high incidence of non-melanoma skin cancer (NMSC) on the coast of Northeast Brazil (NEB), and the reduction of total ozone were the motivations for the present study. The overall objective was to identify and understand the variability of UV radiation, expressed as the UV Index, in the capitals of the east coast of the NEB, and to fit stochastic models to the UV Index time series in order to make predictions (interpolations) and forecasts/projections (extrapolations), followed by trend analysis. The methodology consisted of applying multivariate analysis (principal component analysis and cluster analysis), the Predictive Mean Matching method for filling gaps in the data, autoregressive distributed lag (ADL) models, and the Mann-Kendall test. The ADL modeling consisted of parameter estimation, diagnostics, residual analysis, and evaluation of the quality of the predictions and forecasts via the mean squared error and the Pearson correlation coefficient. The results indicated that the annual variability of UV in the capital of Rio Grande do Norte (Natal) has a feature in the months of September and October consisting of a stabilization/reduction of the UV Index because of the greater annual concentration of total ozone; the increased amount of aerosol during this period contributes to this event with lesser intensity. The application of cluster analysis to the east coast of the NEB showed that this event also occurs in the capitals of Paraíba (João Pessoa) and Pernambuco (Recife). Extreme UV events in the NEB were analyzed for the city of Natal and were associated with the absence of cloud cover and total ozone levels below the annual average; they do not occur over the entire region because of the uneven spatial distribution of these variables.
The ADL(4,1) model, fitted with UV Index and total ozone data for the period 2001-2012, produced a projection/extrapolation for the next 30 years (2013-2043) indicating, by the end of that period, an increase in the UV Index of approximately one unit, should total ozone maintain the downward trend observed in the study period.
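The Mann-Kendall trend test used in the study is nonparametric and rests on the S statistic, the sum of the signs of all pairwise differences in the series; a minimal sketch (without the tie correction used in full implementations) is:

```python
import numpy as np
from math import sqrt

def mann_kendall_s(x):
    """Mann-Kendall S statistic: sum of signs of all pairwise differences."""
    x = np.asarray(x, dtype=float)
    s = 0.0
    for i in range(len(x) - 1):
        s += np.sign(x[i + 1:] - x[i]).sum()
    return s

def mann_kendall_z(x):
    """Normal approximation of the test statistic (no tie correction)."""
    n = len(x)
    s = mann_kendall_s(x)
    var = n * (n - 1) * (2 * n + 5) / 18.0
    if s > 0:
        return (s - 1) / sqrt(var)
    if s < 0:
        return (s + 1) / sqrt(var)
    return 0.0

# A strictly increasing series gives the maximum S = n*(n-1)/2.
print(mann_kendall_s(range(10)))  # 45.0
```

A positive Z beyond the chosen significance threshold flags an upward trend, which is how a downward total-ozone trend such as the one discussed above would be detected (with negative Z).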
Abstract:
Farming of marine shrimp is growing worldwide, and Litopenaeus vannamei is the most widely cultivated species. Shrimp is an attractive food for its nutritional value and sensory aspects, making the maintenance of these attributes throughout storage, which takes place largely under freezing, essential. The aim of this research was to evaluate quality characteristics of L. vannamei shrimp during frozen storage and to verify the effect of adding rosemary (Rosmarinus officinalis). Considering the reuse of shrimp processing wastes, total carotenoid analyses were conducted on L. vannamei shrimp waste and on the flour obtained after drying. Monthly physicochemical and sensory analyses were carried out on shrimp stored at −28.3 ± 3.8 °C for 180 days. Samples were placed in polyethylene bags and categorized as whole shrimp (WS), peeled shrimp (PS), and PS with 0.5% dehydrated rosemary (RS). TBARS, pH, total carotenoid, and sensory Quantitative Descriptive Analysis (QDA) measurements were carried out. Total carotenoid analysis was conducted on fresh wastes and processed flour (day 0) and after 60, 120, and 180 days of frozen storage. After 180 days, RS had lower pH (p = 0.001) and TBARS (p = 0.001) values and higher carotenoids (p = 0.003), while WS showed higher carotenoid losses. Sensory analysis showed that WS was firmer, although rancid taste and smell were perceived with greater intensity (p = 0.001). Rancid taste was detected in RS only at 120 days, at significantly lower intensity (p = 0.001) than in WS and PS. Fresh wastes had 42.74 μg/g of total carotenoids and processed flour 98.51 μg/g. After 180 days of frozen storage, total carotenoids were significantly lower than at day 0 (p < 0.05). The addition of rosemary can improve the sensory quality of frozen shrimp and reduce nutritional losses during storage. Shrimp wastes and L. vannamei shrimp flour showed considerable astaxanthin content; however, losses of this pigment were observed during storage.
Abstract:
This is a descriptive, exploratory, prospective study with a quantitative approach, performed at the Medical Regulation Center of SAMU/Natal, aiming to identify the level of professional satisfaction of the members of the nursing team working at SAMU/Natal, and to verify the degree of importance attributed by these professionals to each of the components of professional satisfaction: autonomy, interaction, professional status, work requirements, organizational rules, and remuneration. The population comprised 60 professionals, with data collected from January to February 2005. We used an instrument translated into Portuguese and validated by Lino (1999), the Professional Satisfaction Rate (PSR). The results show a slight predominance of females (54.9%); ages between 36 and 45 years (60.8%); married (58.8%); 82.4% with children, 30.8% of whom were aged between 5 and 9. Regarding education, 78.4% were nursing technicians and 21.6% nurses, graduated for 11 to 15 years (17.5%). Of the 11 nurses, 9 (81.8%) reported having a specialization; 29.4% of the team had been working for 11 to 15 years in the urgency area; 58.8% had worked for more than 2 years at SAMU; and 72.6% of the team members had fixed work schedules. There was homogeneity in the work shifts: 41.2% on the day shift and 53% on the night shift. Regarding the reason for working at SAMU, 64% chose to work in the service, and among these 76.3% predominantly performed direct patient care; 96.1% liked and were satisfied to work in the service. Regarding remuneration, 90.9% of the nurses reported receiving 5 to 10 minimum wages; 70% of the technicians reported receiving 2 to 5 minimum wages; and 50.1% reported receiving no additional benefit.
The analysis of the PSR through Cronbach's alpha coefficient resulted in a value of 0.94, and through Kendall's tau coefficient in 0.87, demonstrating it to be a trustworthy instrument for measuring the level of professional satisfaction of the SAMU nursing team in our environment. As for the level of importance attributed to the components of professional satisfaction, we identified that the nursing team considered the Autonomy component the most important, followed by Remuneration, Interaction, Work Requirements, Organizational Rules, and Professional Status. Regarding the current level of professional satisfaction, we identified that they were most satisfied with Professional Status, followed by Autonomy, Interaction, Remuneration, Work Requirements, and Organizational Rules. The real professional satisfaction level, calculated statistically, however, shows these professionals to be most satisfied with Autonomy, followed by Remuneration, Interaction, Work Requirements, Professional Status, and Organizational Rules. The PSR in our work was 8.6, indicating that the SAMU Natal nursing team has little satisfaction in its work environment.
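The internal-consistency measure reported above, Cronbach's alpha, is computed from the item variances and the variance of the total score: alpha = k/(k−1) · (1 − Σ item variances / total-score variance). A minimal sketch with toy Likert-type data (invented for illustration, not the thesis data):

```python
import numpy as np

def cronbach_alpha(scores):
    """Cronbach's alpha for an (n_respondents x n_items) score matrix."""
    scores = np.asarray(scores, dtype=float)
    k = scores.shape[1]
    item_vars = scores.var(axis=0, ddof=1).sum()
    total_var = scores.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_vars / total_var)

# Toy Likert-type responses: 5 respondents x 3 items (not the thesis data).
data = [[4, 5, 4], [3, 3, 4], [5, 5, 5], [2, 3, 2], [4, 4, 5]]
print(round(cronbach_alpha(data), 2))  # 0.92
```

Values above roughly 0.9, such as the 0.94 reported for the PSR, are conventionally read as high internal consistency.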
Abstract:
The progressing cavity pump (PCP) artificial lift system is a major lift system used in the oil production industry. As applications of this artificial lift method grow, knowledge of its dynamic behavior, the application of automatic control, and the development of expert systems for equipment selection and design become more useful. This work presents tools for dynamic analysis, control techniques, and an expert system for selecting lift equipment for this artificial lift technology. The PCP artificial lift system consists of a progressing cavity pump installed downhole at the end of the production tubing. The pump consists of two parts, a stator and a rotor, and is set in motion by the rotation of the rotor, transmitted through a rod string installed in the tubing; the surface equipment generates and transmits the rotation to the rod string. First, the development of a complete mathematical dynamic model of the PCP system is presented. This model is simplified for use in several conditions, including steady state, for sizing PCP equipment such as the pump, rod string, and drive head. The model is used to implement a computer simulator able to help in system analysis and to operate as a virtual well with a controller, allowing the testing and development of control algorithms. Next, control techniques are applied to the PCP system to optimize pumping velocity, seeking both productivity and durability of the downhole components. The mathematical model is linearized so that conventional control techniques can be applied, including observability and controllability analysis, and design rules for a PI controller are developed. Stability conditions are stated for the operating point of the system. A fuzzy rule-based control system is developed from the PI controller using an inference machine based on Mamdani operators. Fuzzy logic is also applied to develop the expert system that selects PCP equipment. The developed simulation techniques and the linearized model were applied to an actual well where a control system is installed.
This control system consists of a pump intake pressure sensor, an industrial controller, and a variable speed drive. The PI and fuzzy controllers were applied to optimize simulated and actual well operation, and the results were compared. The simulated and actual open-loop responses were compared to validate the simulation. A case study was carried out to validate the equipment selection expert system.
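The PI control loop described above can be sketched in discrete time. The gains, sample time, limits, and the first-order plant below are illustrative assumptions standing in for the linearized PCP model, not values derived in the thesis:

```python
class PIController:
    """Discrete PI controller, u[k] = Kp*e[k] + Ki*Ts*sum(e), with a simple
    anti-windup clamp on the integral term. Gains are illustrative."""
    def __init__(self, kp, ki, ts, u_min=0.0, u_max=500.0):
        self.kp, self.ki, self.ts = kp, ki, ts
        self.u_min, self.u_max = u_min, u_max
        self.integral = 0.0

    def update(self, setpoint, measurement):
        error = setpoint - measurement
        self.integral += self.ki * self.ts * error
        self.integral = max(self.u_min, min(self.u_max, self.integral))
        u = self.kp * error + self.integral
        return max(self.u_min, min(self.u_max, u))

# Drive a first-order plant (a stand-in for pump intake pressure dynamics)
# toward a setpoint of 10; integral action removes the steady-state error.
ctrl = PIController(kp=2.0, ki=0.5, ts=0.1)
y = 0.0
for _ in range(2000):
    u = ctrl.update(10.0, y)
    y += 0.1 * (0.5 * u - y)   # Euler step of y' = 0.5*u - y
print(round(y, 2))  # 10.0
```

In the actual system the manipulated variable would be the variable-speed-drive frequency and the measurement the pump intake pressure, as the sensor list above suggests.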
Abstract:
Given the current need of industry to integrate data originating from several sources in the production process and to transform them into useful information for decision making, there is an ever greater demand for information visualization systems that support this functionality. On the other hand, because of the high competitiveness of the market, a common practice nowadays is the development of industrial systems with characteristics of modularity, distribution, flexibility, scalability, adaptability, interoperability, reusability, and web access. These characteristics provide extra agility and make it easier to adapt to the frequent changes in market demand. Based on the arguments above, this work specifies a component-based architecture, with the corresponding development of a system based on that architecture, for the visualization of industrial data. The system was conceived to supply on-line information and, optionally, historical information on variables originating from the production process. This work shows that the component-based architecture developed possesses the necessary requirements for obtaining a robust, reliable, and easily maintained system, in agreement with industrial needs. The architecture also allows components to be added, removed, or updated at run time, through a web-based component manager, further streamlining the process of adapting and updating the system.
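The run-time add/remove/update behavior described above can be sketched as a minimal component registry. All names here are invented for illustration; the actual architecture is component-based middleware with a web manager, not this toy:

```python
class ComponentRegistry:
    """Minimal sketch of run-time management of visualization components:
    components can be added, removed, or hot-swapped while the system runs."""
    def __init__(self):
        self._components = {}

    def add(self, name, component):
        self._components[name] = component

    def remove(self, name):
        self._components.pop(name, None)

    def update(self, name, component):
        self._components[name] = component   # hot-swap at run time

    def render_all(self, value):
        # Each registered component renders the current process value.
        return {name: c(value) for name, c in self._components.items()}

reg = ComponentRegistry()
reg.add("gauge", lambda v: f"gauge:{v}")
reg.add("trend", lambda v: f"trend:{v}")
reg.remove("trend")          # removed without stopping the system
print(reg.render_all(42))    # {'gauge': 'gauge:42'}
```

The point of the sketch is the lifecycle, not the rendering: the registry never requires a restart, which is the property the abstract attributes to the web component manager.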
Abstract:
The exponential growth in radio frequency (RF) applications is accompanied by great challenges, such as more efficient use of the spectrum and the design of new architectures for multi-standard receivers, or software defined radio (SDR). The key challenge in designing an SDR architecture is the implementation of a wide-band receiver that is reconfigurable and offers low cost, low power consumption, a high level of integration, and flexibility. As a new SDR design solution, a direct demodulator architecture based on five-port technology, or multi-port demodulator, has been proposed. However, the use of the five-port junction as a direct-conversion receiver requires an I/Q calibration (or regeneration) procedure in order to generate the in-phase (I) and quadrature (Q) components of the transmitted baseband signal. In this work, we propose to evaluate the performance of a blind calibration technique, requiring no knowledge of training or pilot sequences in the transmitted signal, based on independent component analysis, for the regeneration of the I/Q components in five-port downconversion, exploiting the statistical properties of the three output signals.
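The I/Q regeneration problem can be sketched with a linearized model of the three five-port outputs. The mixing matrix, DC offsets, and signal statistics below are invented for illustration (a real five-port also has nonlinear power-detector terms); with a *known* calibration the recovery is plain least squares, whereas the thesis's contribution is doing it *blindly* with ICA, with no training sequence:

```python
import numpy as np

rng = np.random.default_rng(1)

# QPSK-like baseband: independent in-phase and quadrature streams.
n = 1000
i_sig = rng.choice([-1.0, 1.0], n)
q_sig = rng.choice([-1.0, 1.0], n)

# Hypothetical linearized five-port model: each of the three output
# voltages combines I, Q, and a DC offset. A and dc are invented.
A = np.array([[1.0, 0.2], [0.3, 1.1], [0.8, -0.7]])
dc = np.array([0.5, 0.4, 0.6])
outputs = i_sig[:, None] * A[:, 0] + q_sig[:, None] * A[:, 1] + dc

# With known calibration (e.g. from a training sequence), regeneration
# reduces to least squares over the three outputs.
sol, *_ = np.linalg.lstsq(A, (outputs - dc).T, rcond=None)
i_hat, q_hat = sol
print(np.allclose(i_hat, i_sig), np.allclose(q_hat, q_sig))  # True True
```

The blind ICA approach replaces the known A and dc with unmixing coefficients estimated purely from the statistical independence of I and Q.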
Abstract:
The objective of the present work is to develop a model for simulating electrical energy networks in transient and steady states, using the ATP software (Alternative Transients Program), capable of joining two distinct themes present in classical network planning methodology: short-circuit analysis and load-flow theory. Beyond that, using a tool for relay simulation, this work intends to use the newly developed model to investigate the influence of transient phenomena on the operation of protection relays and to calibrate the company's protection relays. To test the model, relays currently installed at COSERN were used.
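Overcurrent relay calibration of the kind mentioned above typically works with standard inverse-time characteristics. As a hedged illustration (the thesis does not specify which curves or settings were used), the IEC standard-inverse trip time t = TMS · k / ((I/Ip)^α − 1), with the standard constants k = 0.14 and α = 0.02, can be sketched as:

```python
def iec_inverse_time(current, pickup, tms, k=0.14, alpha=0.02):
    """IEC standard-inverse overcurrent trip time in seconds:
    t = TMS * k / ((I/Ip)**alpha - 1). The settings below are
    illustrative, not the COSERN relay settings."""
    ratio = current / pickup
    if ratio <= 1.0:
        return float("inf")   # below pickup: the relay never trips
    return tms * k / (ratio ** alpha - 1.0)

# A fault of 2000 A seen by a relay with 400 A pickup and TMS = 0.1:
print(round(iec_inverse_time(2000.0, 400.0, 0.1), 2))  # ≈ 0.43 s
```

Comparing such nominal trip times against the relay behavior under ATP-simulated transients is exactly the kind of study the model enables.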
Abstract:
This work presents simulation results for an identification platform compatible with the INPE Brazilian Data Collection System, modeled with SystemC-AMS. SystemC-AMS is a library of C++ classes dedicated to the simulation of heterogeneous systems, offering powerful resources to describe models in the digital, analog, and RF domains, as well as mechanical and optical ones. The designed model was divided into four parts. The first block takes into account the satellite's orbit, necessary to correctly model the propagation channel, including the Doppler effect, attenuation, and thermal noise. The identification block detects the satellite's presence; it is composed of a low-noise amplifier, band-pass filter, power detector, and logic comparator. The controller block is responsible for enabling the RF transmitter when the presence of the satellite is detected; it was modeled as a Petri net, owing to the asynchronous nature of the system. The fourth block is the RF transmitter unit, which performs the modulation of the information in BPSK ±60°; it is composed of an oscillator, mixer, adder, and amplifier. The whole system was simulated simultaneously. The results are being used to specify system components and to elaborate testbenches for design verification.
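The Doppler contribution of the orbit block can be sketched with the narrowband approximation Δf = −f0·v_r/c. The carrier used below is an assumption (the Brazilian Data Collection System uplink operates near 401.65 MHz), and the radial velocity is an illustrative LEO value, not a figure from the thesis:

```python
C = 299_792_458.0  # speed of light, m/s

def doppler_shift(carrier_hz, radial_velocity_ms):
    """Narrowband Doppler shift: delta_f = -f0 * v_r / c
    (v_r > 0 when the satellite recedes from the transmitter)."""
    return -carrier_hz * radial_velocity_ms / C

# LEO satellite approaching at 7 km/s radial velocity, ~401.65 MHz carrier:
print(round(doppler_shift(401.65e6, -7000.0)))  # ≈ 9378 Hz
```

A shift of roughly ±10 kHz across a pass is what the channel model must apply to the transmitted BPSK signal for the identification block to be tested realistically.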
Abstract:
Nowadays, as market competition requires products of better quality and a constant search for cost savings and better use of raw materials, research into more efficient control strategies becomes vital. In Natural Gas Processing Units (NGPUs), as in most chemical processes, quality control is accomplished through the composition of the products. However, chemical composition analysis has a long measurement time, even when performed by instruments such as gas chromatographs. This fact hinders the development of control strategies that could provide a better process yield. Natural gas processing is one of the most important activities in the petroleum industry, and the main economic product of an NGPU is liquefied petroleum gas (LPG). LPG is ideally composed of propane and butane; in practice, however, its composition includes contaminants such as ethane and pentane. In this work, an inferential system using neural networks is proposed to estimate the ethane and pentane mole fractions in LPG and the propane mole fraction in the residual gas. The goal is to provide the values of these estimated variables every minute using a single multilayer neural network, making it possible to apply inferential control techniques in order to monitor LPG quality and to reduce the propane loss in the process. To develop this work, an NGPU composed of two distillation columns, a deethanizer and a debutanizer, was simulated in the HYSYS® software. The inference is performed from the process variables of the PID controllers present in the instrumentation of these columns. To reduce the complexity of the inferential neural network, the statistical technique of principal component analysis is used to decrease the number of network inputs, thus forming a hybrid inferential system. This work also proposes a simple strategy for correcting the inferential system in real time, based on measurements from chromatographs that may exist in the process under study.
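The PCA input-reduction step can be sketched as follows. The data below are synthetic stand-ins for the PID-loop process variables (the real column instrumentation data is not published in the abstract): 12 correlated signals driven by 3 latent factors, projected onto their leading principal components via SVD.

```python
import numpy as np

rng = np.random.default_rng(3)

# Synthetic stand-in for process variables: 200 samples of 12 signals
# driven by 3 latent factors plus a little measurement noise.
latent = rng.normal(size=(200, 3))
mixing = rng.normal(size=(3, 12))
x = latent @ mixing + 0.01 * rng.normal(size=(200, 12))

def pca_reduce(data, n_components):
    """Project data onto its leading principal components via SVD,
    returning the scores and the explained-variance ratio."""
    centered = data - data.mean(axis=0)
    _, s, vt = np.linalg.svd(centered, full_matrices=False)
    explained = (s[:n_components] ** 2).sum() / (s ** 2).sum()
    return centered @ vt[:n_components].T, explained

reduced, explained = pca_reduce(x, 3)
print(reduced.shape, round(explained, 3))  # (200, 3) with ~99.9% variance
```

Feeding the 3 scores instead of the 12 raw signals into the neural network is the "hybrid inferential system" idea: a smaller network with essentially the same information content.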
Abstract:
Blind Source Separation (BSS) refers to the problem of estimating original signals from observed linear mixtures with no knowledge about the sources or the mixing process. Independent Component Analysis (ICA) is a technique mainly applied to the BSS problem, and among the algorithms that implement it, FastICA is a high-performance iterative algorithm of low computational cost that uses nongaussianity measures based on higher-order statistics to estimate the original sources. The great number of applications where ICA has been found useful reflects the need to implement this technique in hardware, and the natural parallelism of FastICA favors its implementation on digital hardware. This work proposes the implementation of FastICA on a reconfigurable hardware platform to assess its viability for blind source separation problems, more specifically in a hardware prototype embedded in a Field Programmable Gate Array (FPGA) board for the monitoring of beds in hospital environments. The implementation is carried out with Simulink models, and its synthesis is done through the DSP Builder software from Altera Corporation.
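The FastICA fixed-point iteration being mapped to hardware can be sketched in software. The sources, mixing matrix, and nonlinearity choice below are illustrative assumptions; the sketch whitens a two-channel mixture and runs the classic one-unit update w ← E[z·g(wᵀz)] − E[g′(wᵀz)]·w with g = tanh:

```python
import numpy as np

rng = np.random.default_rng(7)

# Two nongaussian sources (invented for illustration), linearly mixed.
n = 5000
s = np.vstack([np.sign(rng.normal(size=n)) * rng.random(n) ** 2,
               rng.uniform(-1, 1, n)])
x = np.array([[1.0, 0.6], [0.4, 1.0]]) @ s

# Whitening: zero mean, identity covariance.
x = x - x.mean(axis=1, keepdims=True)
d, e = np.linalg.eigh(np.cov(x))
z = e @ np.diag(d ** -0.5) @ e.T @ x

def fastica_one_unit(z, iters=100):
    """One-unit FastICA fixed point with g(u) = tanh(u)."""
    w = rng.normal(size=z.shape[0])
    w /= np.linalg.norm(w)
    for _ in range(iters):
        wz = w @ z
        w = (z * np.tanh(wz)).mean(axis=1) - (1 - np.tanh(wz) ** 2).mean() * w
        w /= np.linalg.norm(w)
    return w

w = fastica_one_unit(z)
y = w @ z  # one recovered source, up to sign and scale
corr = max(abs(np.corrcoef(y, s[0])[0, 1]), abs(np.corrcoef(y, s[1])[0, 1]))
print(round(corr, 2))  # correlation with the best-matching true source
```

The per-sample multiply-accumulate structure of the update is what gives the algorithm its natural parallelism for an FPGA datapath.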
Abstract:
This work proposes a hardware architecture, described in VHDL, developed to embed an Artificial Neural Network (ANN) of the Multilayer Perceptron (MLP) type. The intent is that, with this architecture, ANN applications can easily embed several different MLP network topologies in the industrial field. The MLP topology in which the architecture is configured is defined by a simple, specific data input (instructions) that determines the number of layers and of Perceptrons in the network. To support several MLP topologies, datapath components and a controller were developed to execute these instructions. Thus, a user defines a group of previously known instructions that determine the ANN characteristics, and the system guarantees the MLP execution through the neural processors (Perceptrons), the datapath components, and the controller that were developed. On the other hand, the biases and weights must be static: the ANN to be embedded must have been trained previously, off-line. Knowledge of the system's internal characteristics or of the VHDL language is not required of the user. A reconfigurable FPGA device was used to implement, simulate, and test the whole system, allowing application to several real everyday problems.
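The computation the architecture executes, an MLP forward pass with static, offline-trained parameters, can be sketched as follows. The topology, weights, and activation choice are illustrative assumptions, not the hardware's actual configuration format:

```python
import numpy as np

def mlp_forward(x, layers):
    """Forward pass of an MLP with pre-trained, static weights and biases,
    mirroring the offline-training constraint of the architecture.
    `layers` is a list of (weights, biases) pairs, one per layer."""
    a = np.asarray(x, dtype=float)
    for i, (w, b) in enumerate(layers):
        z = w @ a + b
        # Sigmoid on hidden layers, identity on the output layer.
        a = 1.0 / (1.0 + np.exp(-z)) if i < len(layers) - 1 else z
    return a

# Illustrative 2-3-1 topology with fixed, invented parameters.
layers = [(np.array([[0.5, -0.2], [0.1, 0.8], [-0.3, 0.4]]), np.zeros(3)),
          (np.array([[1.0, -1.0, 0.5]]), np.array([0.1]))]
out = mlp_forward([1.0, 2.0], layers)
print(out.shape)  # (1,)
```

In the hardware, the list of (weights, biases) pairs corresponds to the instruction stream that configures the Perceptron processors, while the loop body is what the datapath and controller execute.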
Abstract:
The main purpose of this work is to develop an environment that allows the HYSYS® chemical process simulator to communicate with sensors and actuators on a Foundation Fieldbus industrial network. The environment is considered a hybrid resource, since it has a real portion (the industrial network) and a simulated one (the process), with all measurement and control signals being real. It is possible to reproduce the dynamics of different industrial processes without requiring any physical modification of the network, enabling the simulation of situations that exist in a real industrial environment; this feature attests to the environment's flexibility. In this work, a distillation column is simulated in HYSYS® with all of its variables measured and controlled by Foundation Fieldbus devices.
Abstract:
This dissertation describes the implementation of a WirelessHART network simulation module for Network Simulator 3 (ns-3), aiming at its adoption both in current network research and in industry. To validate the module, tests were implemented for attenuation, packet error rate, information transfer success rate, and battery duration per station.
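The attenuation test mentioned above typically rests on a path-loss model. As a hedged illustration (the abstract does not state which propagation model the module uses), the common log-distance model PL(d) = PL(d0) + 10·n·log10(d/d0) can be sketched as:

```python
from math import log10

def log_distance_path_loss(d, d0=1.0, pl0=40.0, n=2.7):
    """Log-distance path loss in dB: PL(d) = PL(d0) + 10*n*log10(d/d0).
    pl0 and the exponent n are illustrative indoor-industrial values,
    not parameters taken from the ns-3 module being described."""
    return pl0 + 10.0 * n * log10(d / d0)

# Received power for a 0 dBm WirelessHART transmitter at 10 m:
print(round(0.0 - log_distance_path_loss(10.0), 1))  # -67.0 dBm
```

Comparing received power against the radio's sensitivity threshold is then what drives the packet-error-rate and transfer-success statistics the module's tests measure.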