33 results for "Modelagem de processos" (process modeling)
Abstract:
The composition of petroleum may change from well to well, and the resulting characteristics significantly influence the refined products. It is therefore important to characterize the oil in order to know its properties and route it adequately for processing. Since petroleum is a multicomponent mixture, the use of synthetic mixtures representative of oil fractions provides a better understanding of the real mixture's behavior. Characterization is usually obtained through correlations of easily measured physico-chemical properties, such as density, specific gravity, viscosity, and refractive index. In this work, new measurements of density, specific gravity, viscosity, and refractive index were obtained for the following binary mixtures: n-heptane + hexadecane, cyclohexane + hexadecane, and benzene + hexadecane. These measurements were performed at low pressure and at temperatures in the range 288.15 K to 310.95 K, and the data were applied in the development of a new method of oil characterization. Furthermore, a series of density measurements at high pressure and temperature was performed for the binary mixture cyclohexane + n-hexadecane, over pressures from 6.895 to 62.053 MPa and temperatures from 318.15 to 413.15 K. Based on these experimental data for compressed liquid mixtures, a thermodynamic model was proposed using the Peng-Robinson equation of state (EOS). The EOS was modified with a volume-translation (scaling) term, employing a relatively small number of parameters. The results were satisfactory, demonstrating accuracy not only for density but also for the isobaric thermal expansion and isothermal compressibility coefficients. This thesis aims to contribute scientifically to the technological problem of refining heavy oil fractions.
This problem was treated in two steps: characterization, and the search for processes that can produce streams of economic interest, such as solvent extraction at high pressure and temperature. In order to determine phase-equilibrium data under these conditions, conceptual designs of two new experimental apparatuses were developed. These devices consist of variable-volume cells together with an analytical static device. This thesis thus contributes to the characterization of hydrocarbon mixtures and to the development of equilibrium cells operating at high pressure and temperature, both focused on the technological problem of refining heavy oil fractions.
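The volume-translation idea can be illustrated with a short pure-component sketch: solve the Peng-Robinson cubic for the liquid compressibility root and subtract a constant shift c from the molar volume. The critical constants below (cyclohexane) and the zero default shift are illustrative assumptions, not the parameters fitted in the thesis.

```python
import math
import numpy as np

R = 8.314  # J/(mol K)

def pr_liquid_density(T, P, Tc, Pc, omega, c=0.0):
    """Liquid molar density (mol/m3) from the Peng-Robinson EOS with an
    optional volume-translation term c (m3/mol)."""
    a = 0.45724 * R**2 * Tc**2 / Pc
    b = 0.07780 * R * Tc / Pc
    kappa = 0.37464 + 1.54226 * omega - 0.26992 * omega**2
    alpha = (1.0 + kappa * (1.0 - math.sqrt(T / Tc)))**2
    A = a * alpha * P / (R * T)**2
    B = b * P / (R * T)
    # PR EOS in compressibility form:
    # Z^3 - (1-B) Z^2 + (A - 3B^2 - 2B) Z - (AB - B^2 - B^3) = 0
    roots = np.roots([1.0, -(1.0 - B), A - 3*B**2 - 2*B, -(A*B - B**2 - B**3)])
    z_liq = min(r.real for r in roots if abs(r.imag) < 1e-10 and r.real > B)
    v = z_liq * R * T / P - c       # translated molar volume
    return 1.0 / v

# cyclohexane at 318.15 K and 6.895 MPa (c = 0: untranslated PR)
rho = pr_liquid_density(318.15, 6.895e6, Tc=553.8, Pc=4.08e6, omega=0.212)
```

A fitted shift c would move the predicted density toward the measured value without affecting vapor-liquid equilibrium calculations, which is the usual appeal of volume translation.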
Abstract:
Natural gas, although basically composed of light hydrocarbons, also contains contaminant gases such as CO2 (carbon dioxide) and H2S (hydrogen sulfide). H2S, which commonly occurs in oil and gas exploration and production activities, damages oil and natural gas pipelines; consequently, its removal yields an important reduction in operating costs. It is also essential to consider the better quality of the oil to be processed in the refinery, resulting in economic, environmental, and social benefits. All these facts demonstrate the need to develop and improve hydrogen sulfide scavengers. Currently, the oil industry uses several processes for hydrogen sulfide removal from natural gas. However, these processes produce amine derivatives that can damage distillation towers, clog pipelines through the formation of insoluble precipitates, and generate residues with great environmental impact. It is therefore of great importance to obtain a stable system, in an inorganic or organic reaction medium, able to remove hydrogen sulfide without forming by-products that affect the quality and cost of the natural gas processing, transport, and distribution steps. Seeking to study, evaluate, and model the mass transfer and kinetics of hydrogen sulfide removal, this study used an absorption column packed with Raschig rings, in which natural gas contaminated with H2S passed through an aqueous solution of inorganic compounds acting as a stagnant liquid, the contaminant gas being absorbed by the liquid phase. The absorption column was coupled to an H2S detection system interfaced with a computer. The data and the model equations were fitted by the least-squares method, modified by Levenberg-Marquardt.
In this study, in addition to water, the following solutions were used: sodium hydroxide, potassium permanganate, ferric chloride, copper sulfate, zinc chloride, potassium chromate, and manganese sulfate, all at low concentrations (≈10 ppm). These solutions were chosen to evaluate the interplay between physical and chemical absorption parameters, and to seek better mass-transfer coefficients, as in mixing reactors and absorption columns operating in counterflow. In this context, the evaluation of H2S removal arises as a valuable procedure for the treatment of natural gas and the destination of process by-products. The study of the obtained absorption curves makes it possible to determine the predominant mass-transfer step in the processes involved, the volumetric mass-transfer coefficients, and the equilibrium concentrations. A kinetic study was also performed. The results showed that the H2S removal kinetics is fastest for NaOH. Since the study was performed at low concentrations of chemical reagents, it was possible to observe the effect of secondary reactions for the other chemicals, especially in the case of KMnO4, whose by-product, MnO2, also acts in the H2S absorption process. In addition, CuSO4 and FeCl3 demonstrated good efficiency in H2S removal.
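The absorption-curve fitting step can be sketched with a first-order approach-to-equilibrium model solved by Levenberg-Marquardt least squares (SciPy's `curve_fit` with `method="lm"`). The model form, the equilibrium concentration, and the kLa value below are hypothetical placeholders, not the thesis data.

```python
import numpy as np
from scipy.optimize import curve_fit

def absorbed(t, c_eq, kla):
    """Liquid-phase H2S concentration for absorption into a stagnant
    liquid: first-order approach to the equilibrium concentration c_eq."""
    return c_eq * (1.0 - np.exp(-kla * t))

t = np.linspace(0.0, 60.0, 31)            # time, min
rng = np.random.default_rng(0)
c_true = absorbed(t, 8.5, 0.12)           # assumed c_eq (ppm) and kLa (1/min)
c_obs = c_true + rng.normal(0.0, 0.05, t.size)   # synthetic noisy "measurements"

# Levenberg-Marquardt fit of (c_eq, kLa) to the absorption curve
popt, _ = curve_fit(absorbed, t, c_obs, p0=[5.0, 0.05], method="lm")
c_eq_fit, kla_fit = popt
```

The fitted kLa is the volumetric mass-transfer coefficient the abstract refers to; comparing fits across the different salt solutions would reproduce the kind of kinetic ranking reported (NaOH fastest).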
Abstract:
Nonionic surfactants in aqueous solution have the property of separating into two phases: a dilute phase, with low surfactant concentration, and a surfactant-rich phase called the coacervate. The application of this kind of surfactant in extraction processes from aqueous solutions has been increasing over time, which implies the need for knowledge of the thermodynamic properties of these surfactants. In this study, cloud points were determined for polyethoxylated surfactants of the nonylphenol family (ethoxylation degrees 9.5, 10, 11, 12, and 13), the octylphenol family (10 and 11), and polyethoxylated lauryl alcohol (6, 7, 8, and 9), varying the degree of ethoxylation. The cloud point was determined by observing the turbidity of the solution while heating at a ramp of 0.1 °C/min; for the pressure studies, a high-pressure cell (maximum 300 bar) was used. The experimental data for the studied surfactants were described with the Flory-Huggins, UNIQUAC, and NRTL models, and the influence of NaCl concentration and system pressure on the cloud point was studied. The latter parameter is important for oil-recovery processes in which surfactant solutions are used at high pressures, while the NaCl effect, by shifting cloud points toward room temperature, makes it possible to run processes without temperature control. The numerical method used to adjust the parameters was Levenberg-Marquardt. For the Flory-Huggins model, the fitted parameters were the enthalpy of mixing, the entropy of mixing, and the aggregation number; for the UNIQUAC and NRTL models, interaction parameters aij with a quadratic temperature dependence were adjusted. The obtained parameters fitted the experimental data well, with RMSD < 0.3%.
The results showed that both the ethoxylation degree and the pressure increase the cloud point, whereas NaCl decreases it.
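A minimal sketch of the Flory-Huggins side of the modeling: the spinodal condition of the mixing free energy gives the chi parameter at the phase boundary, and an assumed linear chi(T) (hypothetical coefficients, chosen to mimic LCST-type cloud points near room temperature) converts it into a temperature. The actual fits used the enthalpy and entropy of mixing and the aggregation number.

```python
import numpy as np

def chi_spinodal(phi, n_seg):
    """Flory-Huggins spinodal condition d2(G_mix/RT)/dphi2 = 0 solved
    for chi: chi = 0.5 * (1/(n*phi) + 1/(1 - phi))."""
    return 0.5 * (1.0 / (n_seg * phi) + 1.0 / (1.0 - phi))

def cloud_point_T(phi, n_seg, a=0.2, b=1.5e-3):
    """Invert an assumed linear chi(T) = a + b*T (chi grows on heating,
    giving LCST-type behaviour; a, b are hypothetical) for the boundary T."""
    return (chi_spinodal(phi, n_seg) - a) / b

phi = np.linspace(0.01, 0.6, 300)        # surfactant volume fraction
T = cloud_point_T(phi, n_seg=50)         # boundary temperature, K
phi_c = 1.0 / (1.0 + np.sqrt(50))        # analytic critical composition
T_min = T.min()                          # lowest cloud point on the curve
```

The minimum of the computed curve sits at the analytic critical composition, which is a quick consistency check on this kind of model before fitting real cloud-point data.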
Abstract:
Digital Elevation Models (DEM) are numerical representations of a portion of the earth's surface. Among the several factors which affect the quality of a DEM, special attention should be given to the input data and to the choice of the interpolating algorithm. On the other hand, several numerical models are used nowadays to characterize nearshore hydrodynamics and morphological changes in coastal areas, whose validation is based on field data collection. Independently of the complexity of the physical processes being modeled, little attention has been given to the bathymetric interpolation built into the numerical models of each specific application. Therefore, this study aims to investigate and quantify the influence of the bathymetry, as obtained by a DEM, on a hydrodynamic circulation model of a coastal stretch off the coast of the State of Rio Grande do Norte, Northeast Brazil. This coastal region is characterized by strong hydrodynamic and littoral processes, resulting in a very dynamic morphology with shallow coastal bathymetry. Important economic activities, such as oil exploitation and production, fisheries, salt ponds, shrimp farms, and tourism, also bring impacts upon the local ecosystems and themselves influence the local hydrodynamics. This makes the region one of the most important for the development of the State, but also enhances the possibility of serious environmental accidents. As the hydrodynamic model, SisBaHiA® (Sistema Básico de Hidrodinâmica Ambiental, Environmental Hydrodynamics System) was chosen, for it has been successfully employed at several locations along the Brazilian coast. This model was developed by the Coastal and Oceanographical Engineering Group of the Ocean Engineering Program at the Federal University of Rio de Janeiro.
Several interpolating methods were tested for the construction of the DEM, namely Natural Neighbor, Kriging, Triangulation with Linear Interpolation, Inverse Distance to a Power, Nearest Neighbor, and Minimum Curvature, all implemented within the software Surfer®. The bathymetry used as reference for the DEM was obtained from nautical charts provided by the Brazilian Hydrographic Service of the Brazilian Navy and from a field survey conducted in 2005. Changes in flow velocity and free-surface elevation were evaluated under three aspects: a spatial view along three profiles perpendicular to the coast and one profile parallel to the coast; a temporal view at three central nodes of the grid during 30 days; and a hodograph analysis of the U and V velocity components over different tidal cycles. Small, negligible variations in sea-surface elevation were identified. However, the differences in flow magnitude and direction were significant, depending on the DEM.
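One of the tested gridding methods, Inverse Distance to a Power, reduces to a few lines; the soundings below are toy values, not the surveyed bathymetry.

```python
import numpy as np

def idw(xy_known, z_known, xy_query, power=2.0, eps=1e-12):
    """Inverse Distance to a Power interpolation: the estimate at each
    query point is a weighted mean of the soundings, weights 1/d^power."""
    d = np.linalg.norm(xy_query[:, None, :] - xy_known[None, :, :], axis=2)
    w = 1.0 / (d + eps) ** power
    return (w * z_known).sum(axis=1) / w.sum(axis=1)

# toy soundings: x, y in km, z = depth in m (hypothetical values)
pts = np.array([[0, 0], [1, 0], [0, 1], [1, 1]], float)
depth = np.array([5.0, 8.0, 6.0, 9.0])
z = idw(pts, depth, np.array([[0.5, 0.5]]))[0]   # grid node at the center
```

IDW is an exact interpolator (it honors the soundings) but produces the characteristic "bull's-eye" pattern around data points, one of the behaviors that make the gridded DEMs, and hence the modeled flow fields, differ.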
Abstract:
This work presents a brief discussion of methods for estimating the parameters of the Generalized Pareto distribution (GPD). The following techniques are addressed: Moments (moments), Maximum Likelihood (MLE), Biased Probability Weighted Moments (PWMB), Unbiased Probability Weighted Moments (PWMU), Minimum Density Power Divergence (MDPD), Median (MED), Pickands (PICKANDS), Maximum Penalized Likelihood (MPLE), Maximum Goodness-of-fit (MGF), and the Maximum Entropy (POME) technique, the focus of this manuscript. By way of illustration, Generalized Pareto fits were made for a sequence of intraplate earthquakes which occurred in the city of João Câmara, in the northeastern region of Brazil, which was monitored continuously for two years (1987 and 1988). It was found that MLE and POME were the most efficient methods, presenting essentially the same mean squared errors. Based on a threshold of 1.5 degrees, the seismic risk for the city was estimated, along with the return levels for earthquakes of intensity 1.5°, 2.0°, 2.5°, 3.0° and for the most intense earthquake ever recorded in the city, which occurred in November 1986 with a magnitude of about 5.2.
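The MLE route can be sketched with SciPy: fit a GPD to threshold excesses and plug the estimates into the standard peaks-over-threshold return-level formula. The shape, scale, threshold, and exceedance rate below are synthetic placeholders, not the João Câmara values.

```python
import numpy as np
from scipy.stats import genpareto

rng = np.random.default_rng(42)
xi_true, sigma_true, u = 0.1, 0.5, 1.5    # assumed shape, scale, threshold
excess = genpareto.rvs(xi_true, scale=sigma_true, size=2000, random_state=rng)

# MLE fit of the GPD to the threshold excesses (location fixed at 0)
xi_hat, _, sigma_hat = genpareto.fit(excess, floc=0)

def return_level(m, rate, u, xi, sigma):
    """m-observation return level for a peaks-over-threshold model with
    exceedance rate `rate`: x_m = u + (sigma/xi) * ((m*rate)^xi - 1)."""
    return u + (sigma / xi) * ((m * rate) ** xi - 1.0)

# e.g. the level exceeded on average once in 100 years of daily data,
# assuming 5% of observations exceed the threshold (both values invented)
x100 = return_level(m=100 * 365, rate=0.05, u=u, xi=xi_hat, sigma=sigma_hat)
```

The same synthetic-data exercise, repeated for each estimator in the list, is how the mean squared errors used to rank the methods are typically obtained.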
Abstract:
Nowadays, the importance of using software processes is already consolidated and is considered fundamental to the success of software development projects. Large and medium software projects demand the definition and continuous improvement of software processes in order to promote the productive development of high-quality software. Customizing and evolving existing software processes to address the variety of scenarios, technologies, cultures, and scales is a recurrent challenge in the software industry. It involves adapting software process models to the reality of the projects, and it must also promote the reuse of past experiences in the definition and development of software processes for new projects. The adequate management and execution of software processes can bring better quality and productivity to the produced software systems. This work explores the use and adaptation of consolidated software product line techniques to manage the variabilities of software process families. To achieve this aim: (i) a systematic literature review is conducted to identify and characterize variability management approaches for software processes; (ii) an annotative approach for the variability management of software process lines is proposed and developed; and finally (iii) empirical studies and a controlled experiment assess and compare the proposed annotative approach against a compositional one. One study, a comparative qualitative study, analyzed the annotative and compositional approaches from different perspectives, such as modularity, traceability, error detection, granularity, uniformity, adoption, and systematic variability management. Another study, a comparative quantitative one, considered internal attributes of the specification of software process lines, such as modularity, size, and complexity.
Finally, a controlled experiment evaluated the usage effort and the understandability of the investigated approaches when modeling and evolving specifications of software process lines. The studies provide evidence of several benefits of the annotative approach, and of its potential for integration with the compositional approach, to assist the variability management of software process lines.
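The annotative idea can be sketched as presence conditions attached to process elements: deriving a variant keeps every element whose condition is satisfied by the selected features. The process fragment and feature names below are invented for illustration and are not the thesis's notation.

```python
# Each process element carries a presence condition: None means mandatory,
# otherwise a set of features that must all be selected (hypothetical names).
process_line = [
    ("Plan iteration",     None),
    ("Write user stories", {"agile"}),
    ("Write SRS document", {"plan_driven"}),
    ("Code review",        {"quality_gate"}),
    ("Deploy",             None),
]

def derive_variant(elements, features):
    """Keep mandatory elements and those whose required features are selected."""
    return [name for name, req in elements
            if req is None or req <= features]

variant = derive_variant(process_line, {"agile", "quality_gate"})
```

In a compositional approach the same variability would instead live in separate composable fragments; the annotative style keeps one annotated model, which is the modularity/traceability trade-off the studies compare.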
Abstract:
The software development processes proposed by the most recent approaches in Software Engineering make use of models, with UML proposed as the standard modeling language. The user interface is an important part of the software and has a fundamental role in its usability. Unfortunately, standard UML does not offer appropriate resources for modeling user interfaces. Some proposals have already tried to solve this problem: some authors have been using models in the development of interfaces (Model-Based Development), and some extensions to UML have been elaborated. But none of them considers the theoretical perspective of semiotic engineering, which holds that, through the system, the designer should be able to communicate to the user what the user can do and how to use the system itself. This work presents Visual IMML, a UML profile that emphasizes the aspects of semiotic engineering. The profile is based on IMML, a declarative textual language. Visual IMML aims to improve the specification process by using a visual (diagram-based) modeling language. It proposes a new set of modeling elements (stereotypes) specifically designed for the specification and documentation of user interfaces, considering the aspects of communication, interaction, and functionality in an integrated manner.
Abstract:
This dissertation presents a model-driven, integrated approach to the variability management, customization, and execution of software processes. Our approach is founded on the principles and techniques of software product lines and model-driven engineering. Model-driven engineering provides support for specifying software processes and transforming them into workflow specifications. Software product line techniques allow the automatic variability management of process elements and fragments. Additionally, in our approach, workflow technologies enable process execution in workflow engines. In order to evaluate the approach's feasibility, we have implemented it using existing model-driven engineering technologies. The software processes are specified using the Eclipse Process Framework (EPF). The automatic variability management of software processes has been implemented as an extension of an existing product derivation tool. Finally, the ATL and Acceleo transformation languages are adopted to transform EPF processes into jPDL workflow language specifications, enabling the deployment and execution of software processes in the JBoss BPM workflow engine. The approach is evaluated through the modeling and modularization of the project management discipline of the Open Unified Process (OpenUP).
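A toy model-to-text transformation in the spirit of the EPF-to-jPDL step: a minimal process model is serialized into a jPDL-style workflow definition. Element names are simplified sketches; the real ATL/Acceleo transformations operate over full metamodels.

```python
# A tiny "process model": a name plus an ordered list of tasks
# (discipline and task names are illustrative).
process = {
    "name": "OpenUP-ProjectManagement",
    "tasks": ["Plan Project", "Monitor and Control Project", "Assess Results"],
}

def to_jpdl(model):
    """Serialize the model as a jPDL-like process definition, chaining
    each task to the next and ending in an end-state."""
    lines = [f'<process-definition name="{model["name"]}">']
    lines.append(f'  <start-state><transition to="{model["tasks"][0]}"/></start-state>')
    for cur, nxt in zip(model["tasks"], model["tasks"][1:] + ["end"]):
        lines.append(f'  <task-node name="{cur}"><transition to="{nxt}"/></task-node>')
    lines.append('  <end-state name="end"/>')
    lines.append('</process-definition>')
    return "\n".join(lines)

xml = to_jpdl(process)
```

Keeping the transformation mechanical like this is what lets a derived process variant be deployed to a workflow engine without hand-editing.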
Abstract:
This thesis presents the results of the application of the SWAN (Simulating WAves Nearshore) third-generation numerical model, which simulates the propagation and dissipation of sea-wave energy, on the northern continental shelf of Rio Grande do Norte, in order to determine the wave climate, calibrate and validate the model, and assess its potential and limitations for the region of interest. After validation of the wave climate, the results were integrated with information on the submarine relief and on the plan-view morphology of the beaches and barrier-island systems. In the second phase, the objective was to analyze the evolution of the waves and their interaction with the shallow seabed along three transverse profiles oriented from N to S, distributed along the longitudes X = 774000 W, 783000 W, and 800000 W. Subsequently, directional wave and wind values were extracted for all months between November 2010 and November 2012, to analyze the impact of these forcings on the study area and thus understand the behavior of the morphological variations according to annual temporal variability. Based on the modeling results and their integration with correlated data, including the planimetric variations of the Soledade and Minhoto beach systems and the Ponta do Tubarão and Barra do Fernandes barrier-island systems, the following conclusions were obtained: SWAN could reproduce and determine the wave climate on the northern continental shelf of RN, the results showing a trend similar to the measured temporal variations of significant height (Hs, m) and mean wave period (Tmed, s); however, the parametric statistics were low for the estimates of the maximum values in most of the analyzed periods, compared with data from PT 1 and PT 2 (measurement points), with alternation of significant wave heights, at times overestimated, and occasional overlap of swell episodes.
By analyzing the spatial distribution of the wave climate and its interaction with the underwater compartmentalization, it was concluded that wave propagation interacts with the seafloor, with changes in significant height wherever the waves meet seafloor features (beachrocks, symmetric and asymmetric longitudinal dunes, paleochannels, among others) in the outer, middle, and inner shelf. Finally, it is concluded that the study of stability areas allows identification of the most unstable regions, confirming that the greatest range of variation indicates greater instability and consequent sensitivity to the hydrodynamic processes operating in the coastal region, with positive or negative variation, especially at the Ponta do Tubarão and Barra do Fernandes barrier-island systems, which are more susceptible to wave impacts, as evidenced by shoreline retreat.
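The wave-seafloor interaction that modulates significant height can be illustrated with linear-theory shoaling (no refraction or breaking): solve the dispersion relation by Newton iteration and scale the deep-water height by the group-velocity ratio. This is textbook linear theory, far simpler than SWAN's spectral physics.

```python
import math

g = 9.81  # m/s^2

def wavenumber(T, h):
    """Solve the linear dispersion relation w^2 = g k tanh(k h) by Newton
    iteration, starting from Eckart's explicit approximation."""
    w = 2.0 * math.pi / T
    k = (w * w / g) / math.sqrt(math.tanh(w * w * h / g))
    for _ in range(50):
        f = g * k * math.tanh(k * h) - w * w
        df = g * math.tanh(k * h) + g * k * h / math.cosh(k * h) ** 2
        k -= f / df
    return k

def shoaled_height(H0, T, h):
    """Wave height at depth h for a deep-water height H0 and period T,
    via the shoaling coefficient Ks = sqrt(Cg0 / Cg)."""
    k = wavenumber(T, h)
    n = 0.5 * (1.0 + 2.0 * k * h / math.sinh(2.0 * k * h))
    cg = n * math.sqrt(g / k * math.tanh(k * h))   # local group speed
    cg0 = g * T / (4.0 * math.pi)                  # deep-water group speed
    return H0 * math.sqrt(cg0 / cg)

Hs_5m = shoaled_height(1.0, 10.0, 5.0)   # 1 m, 10 s wave over a 5 m shoal
```

Heights grow over shallow features such as beachrocks and dune fields, which is the qualitative behavior the modeled Hs maps show where waves cross the shelf compartments.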
Abstract:
The area between the Galinhos and São Bento do Norte beaches, located on the northern coast of the State of Rio Grande do Norte, is subject to intense and constant littoral and aeolian transport processes, causing erosion, alterations in the sediment balance, and modifications of the shoreline. Beyond these natural factors, human interference is intense in the surroundings due to the nearby Guamaré Petroliferous Pole, the largest onshore oil producer in Brazil. Given these characteristics, the MAMBMARE and MARPETRO projects were organized with the main objective of performing geo-environmental monitoring of the coastal areas of northern RN. Since a bulky amount of data exists for the study area, such as multitemporal geologic and geophysical data, hydrodynamic measurements, multitemporal remote sensing images, and thematic maps, among others, it is extremely important to elaborate a Geographic Database (GD), one of the main components of a Geographic Information System (GIS), to store this information and allow access by researchers and users. The first part of this work consisted of elaborating a GD to store the data for the area between the cities of Galinhos and São Bento do Norte. The main goal was to use the potential of GIS as a decision-support tool in the environmental monitoring of this region, a valuable target for oil exploration, salt companies, and shrimp farms. The collected data were stored as a virtual library to assist decision making, with the results presented as digital thematic maps, tables, and reports, useful as a data source for preventive planning and as guidelines for future research, in both regional and local contexts. The second stage of this work consisted of elaborating Oil-Spill Environmental Sensitivity Maps.
These maps, based on the Environmental Sensitivity Index Maps for Oil Spills developed by the Ministry of the Environment, are cartographic products that supply full information for decision making, contingency planning, and assessment in case of an oil-spill incident in any area. They represent the sensitivity of the areas to oil spills through basic data such as geology, geomorphology, oceanography, socio-economics, and biology. Parameters such as hydrodynamic data, sampling data, coastal type, declivity of the beach face, types of resources at risk (biological, economic, human, or cultural), and land use are some of the essential information used in the elaboration of environmental sensitivity maps. Using the available data, it was possible to develop sensitivity maps of the study area on different dates (June/2000 and December/2000) and to perceive a difference in the generated sensitivity indices: the area in December proved more sensitive to oil than in June, because the hydrodynamic data (wave and tide energy) allowed a faster natural cleaning in June. The use of GIS for sensitivity maps proved to be a powerful tool, since it was possible to manipulate geographic data accurately and to elaborate more precise maps with a higher level of detail for the study area, which presented a medium index (3 to 4) along the shoreline and a high index (10) in the mangrove areas, which are highly vulnerable to oil spills.
Abstract:
Coordenação de Aperfeiçoamento de Pessoal de Nível Superior - CAPES
Abstract:
Brazil is currently one of the largest fruit producers worldwide, with most of its production consumed fresh or as juice or pulp. It is important to highlight that the fruit production chain suffers many losses, due mainly to climate, as well as to storage, transportation, season, market, etc. It is known that the pulp and fruit processing industry usually obtains a yield of 50% (in mass), with the remainder discarded as waste. However, since most of this waste has a high nutrient content, it can be used to generate added-value products. In this context, drying plays an important role as an alternative process to valorize the wastes generated by the fruit industry. Despite the advantages of this technique, issues such as its higher power demand and limited thermal efficiency should be addressed. Therefore, the control of the main variables of the drying process is quite important in order to find operational conditions that produce a final product within the target specification at a lower power cost. Mathematical models can be applied to this process as a tool to determine the best conditions. The main aim of this work was to evaluate the drying behaviour of a guava industrial pulp waste in a batch convective tray dryer, both experimentally and through mathematical modeling. In the experimental study, the drying over a group of trays, as well as the power consumption, was assayed as a response to the operational conditions (temperature, drying air flow rate, and solid mass). The results allowed identifying the most significant variables of the process. On the other hand, the phenomenological mathematical model was validated and allowed following the moisture and temperature profiles in the solid and gas phases in every tray. Simulation results showed the most favorable procedure to obtain the minimum processing time as well as the lowest power demand.
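The moisture-profile idea can be illustrated with the simplest thin-layer (Lewis) drying model, far simpler than the thesis's phenomenological multi-tray model; the drying constant below is a hypothetical value.

```python
import math

def moisture_ratio(t, k):
    """Lewis thin-layer model: MR = (M - Me)/(M0 - Me) = exp(-k t),
    with Me the equilibrium moisture content."""
    return math.exp(-k * t)

def drying_time(mr_target, k):
    """Time to reach a target moisture ratio: t = -ln(MR)/k."""
    return -math.log(mr_target) / k

# assumed drying constant for guava pulp waste at a fixed air condition
k = 0.015                     # 1/min (hypothetical)
t90 = drying_time(0.10, k)    # minutes to remove 90% of the free moisture
```

Because k grows with air temperature and velocity, repeating this calculation across operating conditions gives the kind of time/energy trade-off that the simulations explore.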
Abstract:
In recent decades, analogue modelling has been used in geology to improve our knowledge of how geological structures nucleate, how they grow, and what the controlling factors are in such processes. The use of this tool in the oil industry, to support seismic interpretation and mainly the search for structural traps, has helped disseminate it in the literature. Nowadays, physical modelling has a wide field of application, from landslides to granite emplacement along shear zones. In this work, we use physical modelling to study the influence of mechanical stratification on the nucleation and development of faults and fractures in a context of orthogonal and conjugate oblique basins. To simulate a mechanical stratigraphy, we used different materials with distinct physical properties, such as gypsum powder, glass beads, dry clay, and quartz sand. Some experiments were run with a PIV (Particle Image Velocimetry) system, an instrument that records the movement of the particles at each deformation step. Two series of experiments were studied: i) Series MO: we tested the development of normal faults in a basin orthogonal to the extension direction. Experiments were run varying the materials and strata thickness, and some were done with syntectonic sedimentation. We registered differences in the nucleation and growth of faults in layers with different rheological behavior. The gypsum powder layer behaves more competently, generating a great number of high-angle fractures. These fractures evolve into faults that exhibit a steeper dip than where they cross less competent layers, like the quartz sand. This competent layer exhibits faulted blocks arranged in a typical domino style.
Cataclastic breccias developed along the faults affecting the competent layers and showed different evolutionary histories, depending on the deforming stratigraphic sequence; ii) Series MOS2: normal faults were analyzed in conjugate sub-basins (oblique to the extension direction) developed in sequences with and without rheological contrast. In the experiments with rheological contrast, two important grabens developed along the faulted margins, differing from the sub-basins without mechanical stratigraphy. Both experiments developed oblique fault systems and, in the area of sub-basin intersection, fault traces became very curved.
Abstract:
Fluorescent proteins are an essential tool in many fields of biology, since they allow us to watch the development of structures and dynamic processes of cells in living tissue with the aid of fluorescence microscopy. Optogenetics is another technique currently in wide use in neuroscience. In general, this technique activates or deactivates neurons by irradiating cells that carry light-sensitive ion channels with certain wavelengths, and it can be used together with fluorescent proteins. This dissertation has two main objectives. Initially, we study the interaction of light radiation with mouse brain tissue as applied in optogenetic experiments. In this step, we model absorption and scattering effects using mouse brain tissue characteristics and Kubelka-Munk theory, for specific wavelengths, as a function of the light penetration depth (distance) within the tissue. Furthermore, we model temperature variations using the finite element method to solve Pennes' bioheat equation, with the aid of the COMSOL Multiphysics Modeling Software 4.4, in which we simulate light-stimulation protocols typically used in optogenetics. Subsequently, we develop computational algorithms to reduce the exposure of neurons to the light radiation necessary for visualizing their emitted fluorescence. At this stage, we describe the image processing techniques developed for fluorescence microscopy to reduce the exposure of brain samples to the continuous light responsible for fluorochrome excitation. The developed techniques are able to track, in real time, a region of interest (ROI) and replace the fluorescence emitted by the cells with a virtual mask, produced by overlaying the tracked ROI on previously stored fluorescence information, preserving cell location independently of the exposure time to fluorescent light.
In summary, this dissertation intends to investigate and describe the effects of light radiation in brain tissue, within the context of Optogenetics, in addition to providing a computational tool to be used in fluorescence microscopy experiments to reduce image bleaching and photodamage due to the intense exposure of fluorescent cells to light radiation.
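The Kubelka-Munk part of the light-tissue modeling can be sketched with the two-flux transmittance formula; the scattering and absorption coefficients below are assumed round numbers for mouse cortex at ~473 nm, not fitted values from the dissertation.

```python
import math

def km_transmittance(z, S, K):
    """Kubelka-Munk two-flux transmittance through a slab of thickness z
    (mm), with scattering S and absorption K coefficients (1/mm):
    T = b / (a sinh(bSz) + b cosh(bSz)), a = 1 + K/S, b = sqrt(a^2 - 1)."""
    a = 1.0 + K / S
    b = math.sqrt(a * a - 1.0)
    return b / (a * math.sinh(b * S * z) + b * math.cosh(b * S * z))

# assumed coefficients for mouse cortex at ~473 nm (illustrative only)
S, K = 11.2, 0.1

# depth (mm) at which less than half of the incident light remains,
# scanned in 1 um steps
half_depth = next(z / 1000 for z in range(1, 5000)
                  if km_transmittance(z / 1000, S, K) < 0.5)
```

With strongly scattering parameters like these, half of the light is lost within roughly a tenth of a millimetre, which is why penetration depth is a central constraint when planning optogenetic stimulation.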
Abstract:
Advanced Oxidation Processes (AOP) are techniques involving the formation of the hydroxyl radical (HO•), which has a high organic-matter oxidation rate. The industrial application of these processes has been increasing due to their capacity to degrade recalcitrant substances that cannot be completely removed by traditional effluent treatment processes. In the present work, phenol degradation by the photo-Fenton process, based on the addition of H2O2, Fe2+, and luminous radiation, was studied. An experimental design was developed to analyze the effect of the phenol, H2O2, and Fe2+ concentrations on the degraded fraction of total organic carbon (TOC). The experiments were performed in a batch parabolic photochemical reactor with 1.5 L of capacity. Samples of the reaction medium were collected at different reaction times and analyzed in a Shimadzu TOC analyzer (TOC-VWP). The results showed a negative effect of the phenol concentration and positive effects of the two other variables on the degraded TOC fraction. A statistical analysis of the experimental design showed that the hydrogen peroxide concentration was the most influential variable on the TOC fraction degraded at 45 minutes, and generated a model with R² = 0.82, which predicted the experimental data with low precision. The Visual Basic for Applications (VBA) tool was then used to build a neural network model and a photochemical database. This model presented R² = 0.96 and precisely predicted the response data used for testing. These results indicate the possible industrial application of the developed tool, mainly for its simplicity, low cost, and ease of access.
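The factorial-design analysis can be sketched by computing main effects from coded +/-1 factor levels; the response values below are synthetic, constructed only to reproduce the reported signs (negative phenol effect, positive H2O2 and Fe2+ effects), not the measured TOC fractions.

```python
import numpy as np

# coded levels (-1/+1) of a 2^3 factorial design for
# [phenol], [H2O2], [Fe2+] (run order is illustrative)
X = np.array([[s1, s2, s3] for s1 in (-1, 1)
                           for s2 in (-1, 1)
                           for s3 in (-1, 1)], float)

# synthetic TOC degraded fraction consistent with the reported trends
y = 0.55 - 0.08 * X[:, 0] + 0.15 * X[:, 1] + 0.05 * X[:, 2]

def main_effects(X, y):
    """Main effect of each factor: mean response at the high level
    minus mean response at the low level."""
    return np.array([y[X[:, j] > 0].mean() - y[X[:, j] < 0].mean()
                     for j in range(X.shape[1])])

effects = main_effects(X, y)
```

With these synthetic responses the H2O2 effect dominates, mirroring the statistical conclusion in the abstract; on real data the same computation is the first step before fitting the regression model with its R².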