999 results for "Modelagem de processos" (Process modeling)
Abstract:
In this research we investigated how new technologies can support the design and manufacturing of furniture in small manufacturers in the state of Rio Grande do Norte. Google SketchUp, a 3D modeling tool, is built in such a way that its internal structures are open and can be accessed through SketchUp's Ruby API by programs written in the Ruby language (plugins). Using the concepts of so-called Group Technology and the flexibility of adding new functionality to this software, we created a Methodology for the Modeling of Furniture, a Coding System and a plugin for Google's tool that implements the developed Methodology. As a result, the following facilities are available: the user may create and reuse the library's models repeatedly; reports of material costs in the manufacturing process are provided; and detailed drawings are generated, achieving better integration between furniture design and the manufacturing process.
Abstract:
Nonionic surfactants are composed of substances whose molecules do not ionize in solution. The solubility of these surfactants in water is due to the presence of functional groups with a strong affinity for water. When these surfactants are heated, two liquid phases form, evidenced by the turbidity phenomenon. This study aimed to determine experimentally the cloud-point (turbidity) temperatures of nonylphenol-polyethoxylated surfactants and then to perform thermodynamic modeling, considering the Flory-Huggins model and an empirical solid-liquid equilibrium (SLE) model. The cloud point was determined by the visual method (Inoue et al., 2008). The experimental methodology consisted of preparing synthetic solutions of 0.25%, 0.5%, 1%, 2%, 3%, 4%, 5%, 6%, 7%, 8%, 9%, 10%, 12.5%, 15%, 17% and 20% surfactant by weight. The nonionic surfactants were selected according to their degree of ethoxylation (9.5, 10, 11, 12 and 13). During the experiments the solutions were homogenized and the bath temperature was gradually increased while the turbidity of the solution was checked visually (Inoue et al., 2003). These cloud-point temperature data were used to feed the evaluated models and to obtain thermodynamic parameters for the nonylphenol-polyethoxylated surfactant systems. The models can then be used in phase-separation processes, facilitating the extraction of organic solvents, and thus serve as quantitative and qualitative parameters. The solid-liquid equilibrium (SLE) model best represented the experimental data.
Abstract:
The composition of petroleum may change from well to well, and the resulting characteristics significantly influence the refined products. Therefore, it is important to characterize the oil in order to know its properties and route it adequately for processing. Since petroleum is a multicomponent mixture, the use of synthetic mixtures that are representative of oil fractions provides a better understanding of the behavior of the real mixture. Characterization is usually obtained through correlations of easily measured physico-chemical properties, such as density, specific gravity, viscosity and refractive index. In this work, new measurements of density, specific gravity, viscosity and refractive index were obtained for the following binary mixtures: n-heptane + hexadecane, cyclohexane + hexadecane, and benzene + hexadecane. These measurements were performed at low pressure and at temperatures in the range 288.15 K to 310.95 K, and the data were applied in the development of a new method of oil characterization. Furthermore, a series of density measurements at high pressure and temperature were performed for the binary mixture cyclohexane + n-hexadecane, covering pressures from 6.895 to 62.053 MPa and temperatures from 318.15 to 413.15 K. Based on these experimental data for compressed liquid mixtures, a thermodynamic model was proposed using the Peng-Robinson equation of state (EOS). The EOS was modified with a volume-scaling approach employing a relatively small number of parameters. The results were satisfactory, demonstrating accuracy not only for density data but also for the isobaric thermal expansion and isothermal compressibility coefficients. This thesis aims to contribute scientifically to the technological problem of refining heavy oil fractions.
This problem was treated in two steps: characterization, and the search for processes that can produce streams of economic interest, such as solvent extraction at high pressure and temperature. In order to determine phase-equilibrium data under these conditions, conceptual designs of two new experimental apparatuses were developed. These devices consist of variable-volume cells together with an analytical static device. Therefore, this thesis contributes to the characterization of hydrocarbon mixtures and to the development of equilibrium cells operating at high pressure and temperature, focused on the technological problem of refining heavy oil fractions.
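The volume-translated Peng-Robinson density calculation described above can be sketched as follows. This is a minimal illustration, not the thesis's fitted model: the critical constants, molar mass and volume-shift value c below are rough literature-style numbers for n-hexadecane chosen for demonstration, and a single Peneloux-style constant shift stands in for the thesis's volume scaling.

```python
import numpy as np

R = 8.314  # universal gas constant, J/(mol K)

def pr_liquid_volume(T, P, Tc, Pc, omega, c=0.0):
    """Liquid molar volume (m^3/mol) from the Peng-Robinson EOS,
    with a constant volume-translation parameter c."""
    a = 0.45724 * R**2 * Tc**2 / Pc
    b = 0.07780 * R * Tc / Pc
    kappa = 0.37464 + 1.54226 * omega - 0.26992 * omega**2
    alpha = (1.0 + kappa * (1.0 - np.sqrt(T / Tc)))**2
    A = a * alpha * P / (R * T)**2
    B = b * P / (R * T)
    # Peng-Robinson cubic in the compressibility factor Z
    roots = np.roots([1.0, -(1.0 - B), A - 3*B**2 - 2*B, -(A*B - B**2 - B**3)])
    Z = min(r.real for r in roots if abs(r.imag) < 1e-8 and r.real > B)
    return Z * R * T / P - c  # smallest physical root -> liquid; shift the volume

# Illustrative (not fitted) constants for n-hexadecane
Tc, Pc, omega, M = 723.0, 1.40e6, 0.717, 0.2264  # K, Pa, -, kg/mol
v = pr_liquid_volume(318.15, 6.895e6, Tc, Pc, omega, c=3.0e-5)
rho = M / v  # density in kg/m^3
```

The shift c only translates the volume, so it improves density predictions without changing the phase-equilibrium results of the unmodified EOS.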
Abstract:
Natural gas, although basically composed of light hydrocarbons, also contains contaminant gases such as CO2 (carbon dioxide) and H2S (hydrogen sulfide). H2S, which commonly occurs in oil and gas exploration and production activities, damages oil and natural gas pipelines; consequently, removing hydrogen sulfide yields an important reduction in operating costs. It is also essential to consider the better quality of the oil to be processed in the refinery, resulting in economic, environmental and social benefits. All these facts demonstrate the need to develop and improve hydrogen sulfide scavengers. Currently, the oil industry uses several processes for hydrogen sulfide removal from natural gas. However, these processes produce amine derivatives that can damage distillation towers, clog pipelines through the formation of insoluble precipitates, and generate residues with great environmental impact. It is therefore very important to obtain a stable system, in inorganic or organic reaction media, able to remove hydrogen sulfide without forming by-products that affect the quality and cost of the natural gas processing, transport and distribution steps. Seeking to study, evaluate and model the mass transfer and kinetics of hydrogen sulfide removal, this study used an absorption column packed with Raschig rings, in which natural gas contaminated with H2S passed through an aqueous solution of inorganic compounds acting as a stagnant liquid, the contaminant gas being absorbed by the liquid phase. The absorption column was coupled to an H2S detection system interfaced with a computer. The data and the model equations were solved by the least-squares method, modified by Levenberg-Marquardt.
In this study, in addition to water, the following solutions were used: sodium hydroxide, potassium permanganate, ferric chloride, copper sulfate, zinc chloride, potassium chromate and manganese sulfate, all at low concentrations (~10 ppm). These solutions were used to evaluate the interplay between physical and chemical absorption parameters, or even to obtain a better mass-transfer coefficient, as in mixing reactors and absorption columns operating in counterflow. In this context, the evaluation of H2S removal emerges as a valuable procedure for the treatment of natural gas and the disposal of process by-products. The study of the obtained absorption curves makes it possible to determine the predominant mass-transfer stage in the processes involved, the volumetric mass-transfer coefficients, and the equilibrium concentrations. A kinetic study was also performed. The results showed that the H2S removal kinetics is fastest for NaOH. Since the study was performed at low concentrations of chemical reagents, it was possible to check the effect of secondary reactions for the other chemicals, especially in the case of KMnO4, whose by-product, MnO2, also acts in the H2S absorption process. In addition, CuSO4 and FeCl3 also showed good efficiency in H2S removal.
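The curve-fitting step can be illustrated with a minimal sketch, assuming a simple first-order absorption model C(t) = Ceq(1 - exp(-kLa·t)). The data below are synthetic, and SciPy's Levenberg-Marquardt solver (`method='lm'`) stands in for the modified Levenberg-Marquardt routine used in the study.

```python
import numpy as np
from scipy.optimize import least_squares

def absorbed(params, t):
    """First-order gas absorption: C(t) = Ceq * (1 - exp(-kLa * t))."""
    ceq, kla = params
    return ceq * (1.0 - np.exp(-kla * t))

# Synthetic H2S concentration data (concentration vs minutes), standing in
# for the column measurements; the "true" parameters here are invented.
t = np.linspace(0.0, 60.0, 30)
rng = np.random.default_rng(0)
c_obs = absorbed([8.5, 0.12], t) + rng.normal(0.0, 0.05, t.size)

# Levenberg-Marquardt fit of Ceq and the volumetric coefficient kLa
fit = least_squares(lambda p: absorbed(p, t) - c_obs,
                    x0=[5.0, 0.05], method='lm')
ceq_fit, kla_fit = fit.x
```

The fitted Ceq plays the role of the equilibrium concentration and kLa that of the volumetric mass-transfer coefficient discussed in the abstract.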
Abstract:
Nonionic surfactants in aqueous solution have the property of separating into two phases: a dilute phase, with low surfactant concentration, and a surfactant-rich phase called the coacervate. The application of this kind of surfactant in extraction processes from aqueous solutions has been increasing over time, which calls for knowledge of the thermodynamic properties of these surfactants. In this study, the cloud points of polyethoxylated surfactants from the nonylphenol-polyethoxylated family (ethoxylation degrees 9.5, 10, 11, 12 and 13), the octylphenol-polyethoxylated family (10 and 11) and polyethoxylated lauryl alcohol (6, 7, 8 and 9) were determined, varying the degree of ethoxylation. The cloud point was determined by observing the turbidity of the solution while heating at a ramp of 0.1 °C/min; for the pressure studies, a high-pressure cell (maximum 300 bar) was used. The experimental data for the studied surfactants were fitted with the Flory-Huggins, UNIQUAC and NRTL models to describe the cloud-point curves, and the influence of NaCl concentration and system pressure on the cloud point was studied. The latter parameter is important for oil-recovery processes, in which surfactant solutions are used at high pressures, while the NaCl effect shifts cloud points closer to room temperature, making it possible to run processes without temperature control. The numerical method used to adjust the parameters was Levenberg-Marquardt. For the Flory-Huggins model, the adjusted parameters were the mixing enthalpy, the mixing entropy and the aggregation number; for the UNIQUAC and NRTL models, interaction parameters aij were adjusted using a quadratic dependence on temperature. The parameters fitted the experimental data well (RMSD < 0.3%).
The results showed that both the ethoxylation degree and the pressure increase the cloud points, whereas NaCl decreases them.
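A much-simplified sketch of the Flory-Huggins side of such a fitting procedure is shown below. It is not the authors' model: the spinodal condition is used here as a stand-in for the full cloud-point (binodal) calculation, χ is assumed linear in temperature, and the chain length N and all parameter values are invented for illustration; only the Levenberg-Marquardt fitting step mirrors the text.

```python
import numpy as np
from scipy.optimize import least_squares

N = 50.0  # effective surfactant chain length (illustrative assumption)

def chi_spinodal(phi):
    """Flory-Huggins spinodal value of chi at surfactant volume fraction phi."""
    return 0.5 * (1.0 / (N * phi) + 1.0 / (1.0 - phi))

def t_cloud(params, phi):
    """Cloud-point temperature assuming chi(T) = a + b*T (LCST-type clouding)."""
    a, b = params
    return (chi_spinodal(phi) - a) / b

phi = np.array([0.01, 0.02, 0.05, 0.10, 0.15, 0.20])
# Synthetic cloud-point data generated from "true" parameters plus noise
t_obs = t_cloud([0.3, 0.002], phi) + np.random.default_rng(1).normal(0.0, 0.5, phi.size)

# Levenberg-Marquardt adjustment of the two chi(T) parameters
fit = least_squares(lambda p: t_cloud(p, phi) - t_obs, x0=[0.2, 0.001], method='lm')
a_fit, b_fit = fit.x
```

With χ increasing in T, heating drives the solution past the stability limit, which is why turbidity appears on heating for these surfactants.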
Abstract:
Digital Elevation Models (DEM) are numerical representations of a portion of the Earth's surface. Among the several factors that affect the quality of a DEM, attention should be paid to the input data and the choice of the interpolating algorithm. On the other hand, several numerical models are used nowadays to characterize nearshore hydrodynamics and morphological changes in coastal areas, whose validation is based on field data collection. Regardless of the complexity of the physical processes being modeled, little attention has been given to the bathymetric interpolation built into the numerical models of each specific application. Therefore, this study aims to investigate and quantify the influence of the bathymetry, as obtained from a DEM, on a hydrodynamic circulation model of a coastal stretch off the coast of the State of Rio Grande do Norte, Northeast Brazil. This coastal region is characterized by strong hydrodynamic and littoral processes, resulting in a very dynamic morphology with shallow coastal bathymetry. Important economic activities, such as oil exploitation and production, fisheries, salt ponds, shrimp farms and tourism, also impact the local ecosystems and themselves influence the local hydrodynamics. This makes the region one of the most important for the development of the State, but also raises the possibility of serious environmental accidents. As the hydrodynamic model, SisBaHiA® - Environmental Hydrodynamics System ("Sistema Básico de Hidrodinâmica Ambiental") was chosen, for it has been successfully employed at several locations along the Brazilian coast. This model was developed by the Coastal and Oceanographical Engineering Group of the Ocean Engineering Program at the Federal University of Rio de Janeiro.
Several interpolating methods were tested for the construction of the DEM, namely Natural Neighbor, Kriging, Triangulation with Linear Interpolation, Inverse Distance to a Power, Nearest Neighbor and Minimum Curvature, all implemented in the software Surfer®. The bathymetry used as reference for the DEM was obtained from nautical charts provided by the Brazilian Hydrographic Service of the Brazilian Navy and from a field survey conducted in 2005. Changes in flow velocity and free-surface elevation were evaluated under three aspects: a spatial view along three profiles perpendicular to the coast and one profile parallel to the coast; a temporal view at three central nodes of the grid over 30 days; and a hodograph analysis of the U and V velocity components over different tidal cycles. Variations in sea surface elevation were small and negligible. However, the differences in flow magnitude and velocity direction were significant, depending on the DEM.
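As a concrete example of one of the gridding methods compared above, here is a minimal Inverse Distance to a Power (IDW) interpolator. The toy depth samples are invented, and real DEM construction in Surfer® involves search radii and anisotropy options omitted from this sketch.

```python
import numpy as np

def idw(xy_known, z_known, xy_query, power=2.0):
    """Inverse Distance to a Power interpolation of scattered depth samples."""
    # Pairwise distances: query points along axis 0, data points along axis 1
    d = np.linalg.norm(xy_query[:, None, :] - xy_known[None, :, :], axis=2)
    d = np.maximum(d, 1e-12)  # avoid division by zero on exact data hits
    w = 1.0 / d**power
    return (w * z_known).sum(axis=1) / w.sum(axis=1)

# Toy bathymetric samples: (x, y) positions and depths in meters
pts = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
depth = np.array([5.0, 7.0, 6.0, 8.0])
z = idw(pts, depth, np.array([[0.5, 0.5]]))  # equidistant query -> plain average
```

Because the weights depend only on distance, IDW never extrapolates beyond the sampled depth range, which is one reason the gridding methods above can yield noticeably different DEMs from the same soundings.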
Abstract:
This work briefly discusses methods for estimating the parameters of the Generalized Pareto distribution (GPD). The following techniques are addressed: Moments (MOM), Maximum Likelihood (MLE), Biased Probability Weighted Moments (PWMB), Unbiased Probability Weighted Moments (PWMU), Minimum Density Power Divergence (MDPD), Median (MED), Pickands (PICKANDS), Maximum Penalized Likelihood (MPLE), Maximum Goodness-of-Fit (MGF) and the Maximum Entropy (POME) technique, the focus of this manuscript. By way of illustration, the Generalized Pareto distribution was fitted to a sequence of intraplate earthquakes that occurred in the city of João Câmara, in the northeastern region of Brazil, which was monitored continuously for two years (1987 and 1988). It was found that MLE and POME were the most efficient methods, presenting essentially the same mean squared errors. Based on a magnitude threshold of 1.5, the seismic risk for the city was estimated, as well as the return levels for earthquakes of magnitude 1.5, 2.0, 2.5, 3.0 and for the most intense earthquake ever registered in the city, which occurred in November 1986 with a magnitude of about 5.2.
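The MLE fitting and return-level estimation can be sketched with SciPy's `genpareto`. The exceedance data below are synthetic stand-ins for the João Câmara catalog, and the threshold and parameter values are illustrative only.

```python
import numpy as np
from scipy.stats import genpareto

threshold = 1.5  # magnitude threshold, as in the study

# Synthetic magnitude excesses over the threshold (illustrative, not the
# actual João Câmara catalog); c < 0 gives a bounded-tail GPD
excess = genpareto.rvs(c=-0.1, scale=0.4, size=500,
                       random_state=np.random.default_rng(42))

# Maximum-likelihood fit with the location parameter fixed at zero
c_hat, _, scale_hat = genpareto.fit(excess, floc=0)

def return_level(m):
    """Magnitude exceeded on average once every m threshold exceedances."""
    return threshold + genpareto.ppf(1.0 - 1.0 / m, c=c_hat, scale=scale_hat)

rl_100 = return_level(100)
```

The same fitted shape and scale feed both the risk estimate (tail probability above a given magnitude) and the return levels quoted in the abstract.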
Abstract:
Nowadays, the importance of using software processes is already consolidated and is considered fundamental to the success of software development projects. Large and medium software projects demand the definition and continuous improvement of software processes in order to promote the productive development of high-quality software. Customizing and evolving existing software processes to address the variety of scenarios, technologies, cultures and scales is a recurrent challenge in the software industry. It involves adapting software process models to the reality of the projects, and it must also promote the reuse of past experiences when defining and developing software processes for new projects. Adequate management and execution of software processes can bring better quality and productivity to the produced software systems. This work explored the use and adaptation of consolidated software product line techniques to manage the variabilities of software process families. To achieve this aim: (i) a systematic literature review was conducted to identify and characterize variability management approaches for software processes; (ii) an annotative approach for the variability management of software process lines was proposed and developed; and finally (iii) empirical studies and a controlled experiment assessed and compared the proposed annotative approach against a compositional one. The first, a comparative qualitative study, analyzed the annotative and compositional approaches from different perspectives, such as modularity, traceability, error detection, granularity, uniformity, adoption, and systematic variability management. The second, a comparative quantitative study, considered internal attributes of software process line specifications, such as modularity, size and complexity.
Finally, a controlled experiment evaluated the usage effort and understandability of the investigated approaches when modeling and evolving software process line specifications. The studies provide evidence of several benefits of the annotative approach, and of its potential for integration with the compositional approach, to assist the variability management of software process lines.
Abstract:
The software development processes proposed by the most recent approaches in Software Engineering rely on the use of models, and UML was proposed as the standard modeling language. The user interface is an important part of the software and is fundamental to improving its usability. Unfortunately, standard UML does not offer appropriate resources for modeling user interfaces. Several proposals have been made to solve this problem: some authors have been using models in the development of interfaces (Model-Based Development), and some extensions to UML have been elaborated. But none of them considers the theoretical perspective of semiotic engineering, which holds that, through the system, the designer should be able to communicate to the user what the user can do and how to use the system itself. This work presents Visual IMML, a UML Profile that emphasizes the aspects of semiotic engineering. The Profile is based on IMML, a declarative textual language. Visual IMML aims to improve the specification process by using a visual (diagram-based) modeling language. It proposes a new set of modeling elements (stereotypes) specifically designed for the specification and documentation of user interfaces, considering communication, interaction and functionality in an integrated manner.
Abstract:
This dissertation presents a model-driven and integrated approach to the variability management, customization and execution of software processes. Our approach is founded on the principles and techniques of software product lines and model-driven engineering. Model-driven engineering supports the specification of software processes and their transformation into workflow specifications. Software product line techniques allow the automatic variability management of process elements and fragments. Additionally, in our approach, workflow technologies enable process execution in workflow engines. In order to evaluate the feasibility of the approach, we have implemented it using existing model-driven engineering technologies. The software processes are specified using the Eclipse Process Framework (EPF). The automatic variability management of software processes has been implemented as an extension of an existing product derivation tool. Finally, the ATL and Acceleo transformation languages are adopted to transform EPF processes into jPDL workflow language specifications, enabling the deployment and execution of software processes in the JBoss BPM workflow engine. The approach is evaluated through the modeling and modularization of the project management discipline of the Open Unified Process (OpenUP).
Abstract:
This thesis presents the results of applying the SWAN (Simulating WAves Nearshore) numerical model, a third-generation model that simulates the propagation and dissipation of sea wave energy, to the northern continental shelf of Rio Grande do Norte, in order to determine the wave climate, calibrate and validate the model, and assess its potential and limitations for the region of interest. After validation of the wave climate, the results were integrated with information on the submarine relief and the plan-view morphology of the beach and barrier island systems. In the second phase, the objective was to analyze the evolution of the waves and their interaction with the shallow seabed along three transverse profiles oriented from N to S, distributed along the longitudinal parallels X = 774000-W, 783000-W and 800000-W. Subsequently, the directional wave and wind values were extracted for every month between November 2010 and November 2012, in order to analyze the impact of these forcings on the active zone and thus understand the behavior of the morphological variations according to the year's temporal variability. Based on the modeling results and their integration with correlated data and with the planimetric variations of the Soledade and Minhoto beach systems and the Ponta do Tubarão and Barra do Fernandes barrier island systems, the following conclusions were obtained: SWAN could reproduce and determine the wave climate on the northern continental shelf of RN; the results show a trend similar to the measurements for the temporal variations of significant height (Hs, m) and mean wave period (Tmed, s); however, the parametric statistics were low for the estimates of the maximum values in most of the analyzed periods when compared with data from PT 1 and PT 2 (measurement points), with alternation of significant wave heights, at times overestimated, and occasional superposition of swell episodes.
By analyzing the spatial distribution of the wave climate and its interaction with the submarine compartmentalization, it was concluded that wave propagation interacts with the seafloor, with changes in significant heights whenever the waves interact with seafloor features (beachrocks, symmetric and asymmetric longitudinal dunes, paleochannels, among others) in the outer, middle and inner shelf regions. Finally, it is concluded that the study of stability areas allows the identification of the most unstable regions, confirming that a greater range of variation indicates greater instability and consequent sensitivity to the hydrodynamic processes operating in the coastal region, with positive or negative variation, especially in the Ponta do Tubarão and Barra do Fernandes barrier island systems, which are more susceptible to wave impacts, as evidenced by shoreline retreat.
Abstract:
The area between the Galinhos and São Bento do Norte beaches, located on the northern coast of the state of Rio Grande do Norte, is subject to intense and constant littoral and aeolian transport processes, causing erosion, alterations in the sediment balance and modifications of the shoreline. Beyond these natural factors, human interference is considerable in the surroundings due to the nearby Guamaré Petroliferous Pole, the largest onshore oil producer in Brazil. Given these characteristics, the MAMBMARE and MARPETRO projects were organized with the main objective of carrying out geo-environmental monitoring of coastal areas in the northern portion of RN. A large amount of data is available for the study area, such as multitemporal geologic and geophysical data, hydrodynamic measurements, multitemporal remote sensing images and thematic maps, among others; it is therefore extremely important to build a Geographic Database (GD), one of the main components of a Geographic Information System (GIS), to store this information and make it accessible to researchers and users. The first part of this work consisted of building a GD to store the data for the area between the cities of Galinhos and São Bento do Norte. The main goal was to use the potential of GIS as a decision-support tool in the environmental monitoring of this region, a valuable target for oil exploration, salt companies and shrimp farms. The collected data were stored as a virtual library to assist decision making, with results presented as digital thematic maps, tables and reports, useful as a data source for preventive planning and as guidelines for future research on both regional and local scales. The second stage of this work consisted of elaborating Oil-Spill Environmental Sensitivity Maps.
These maps, based on the Environmental Sensitivity Index Maps for Oil Spills developed by the Ministry of the Environment, are cartographic products that provide full information for decision making, contingency planning and assessment in case of an oil spill incident in the area. They represent the sensitivity of the areas to oil spills through basic data such as geology, geomorphology, oceanography, socio-economics and biology. Parameters such as hydrodynamic data, sampling data, coastal type, beach-face slope, types of resources at risk (biological, economic, human or cultural) and land use are some of the essential information used in elaborating environmental sensitivity maps. Using the available data, it was possible to develop sensitivity maps of the study area for different dates (June 2000 and December 2000) and to perceive a difference in the sensitivity indices generated. The area in December was more sensitive to oil than in June, because the hydrodynamic data (wave and tide energy) allowed faster natural cleaning in June. The use of GIS for sensitivity mapping proved to be a powerful tool, since it was possible to manipulate geographic data accurately and to elaborate more precise maps with a higher level of detail for the study area. The area presented a medium index (3 to 4) along the shore and a high index (10) in the mangrove areas, which are highly vulnerable to oil spills.
Abstract:
Individual-based modeling has been increasingly employed to analyze ecological processes, to develop and evaluate theories, and for wildlife management and conservation purposes. Individual-based models (IBMs) are quite flexible and allow the detailed use of parameters with greater biological meaning, making them more realistic than classical population models, which are more constrained within a rigid mathematical formalism. This article presents and discusses seven reasons for adopting IBMs in simulation studies in Ecology: (1) the inherent complexity of ecological systems, which are not amenable to formal mathematical analysis; (2) population processes are emergent phenomena, resulting from the interactions among their constituent elements (individuals) and between these and the environment; (3) predictive power; (4) Ecology's definitive adoption of an evolutionary view; (5) individuals are discrete entities; (6) interactions are localized in space; and (7) individuals differ from one another.
Abstract:
The mathematical model presented has the following objectives: (1) to simulate the population dynamics of a three-trophic-level host-parasitoid system composed of populations of the Mediterranean fruit fly Ceratitis capitata (Wiedemann), the parasitoid braconid wasp Diachasmimorpha longicaudata (Ashmead) and citrus fruits; (2) to help better understand the main biological and ecological factors governing the population interactions; and (3) to contribute to more efficient biological control programs for the system in question. The methodology was based on formulating systems of difference equations describing the interaction processes of the trophic system. Numerical solutions of these systems of equations and their graphical representation were then produced using Matlab, version 6.1. The biological and ecological data needed to formulate the mathematical equations were provided by specialists in the control of C. capitata and taken from the literature on the biological control of fruit flies in citrus plantations in Brazil, mainly through the use of parasitoid wasps such as D. longicaudata. The simulation results suggest that the proposed model adequately describes the ecological system in question and allows a better understanding of its main biological and ecological characteristics. Consequently, it can help in choosing the manner and timing of parasitoid wasp release for more effective control of C. capitata.
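The structure of such difference-equation models can be illustrated with a minimal Nicholson-Bailey-type host-parasitoid system with host density dependence, written here in Python rather than Matlab. This is a generic textbook sketch, and none of the parameter values below come from the article.

```python
import numpy as np

# Illustrative parameters: host growth rate, carrying capacity (standing in
# for fruit availability), parasitoid searching efficiency, and the number
# of wasps emerging per parasitized host
R, K, a, c = 2.0, 1000.0, 0.005, 1.0

def step(H, P):
    """One generation of a Nicholson-Bailey model with host density dependence."""
    escape = np.exp(-a * P)                               # fraction of hosts not parasitized
    H_next = R * H * escape / (1.0 + (R - 1.0) * H / K)   # Beverton-Holt limitation
    P_next = c * H * (1.0 - escape)                       # parasitized hosts yield new wasps
    return H_next, P_next

H, P = 500.0, 50.0  # initial fly and wasp densities (arbitrary)
for _ in range(200):
    H, P = step(H, P)
```

Iterating the map and varying the initial wasp density P is the discrete-time analogue of testing different release timings and amounts for the parasitoid.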
Abstract:
Defining requirements for the software systems that will support a business is not an easy task, given the dynamics of process change. Requirements elicitation has been done empirically, without the support of systematic methods that guarantee development based on the real objectives of the business. Software engineering lacks methods that make the business modeling and requirements elicitation stages of a system more orderly and methodical. This article presents a software development methodology resulting from the incorporation of activities proposed for business modeling and requirements elicitation, based on a business modeling architecture. These activities make software development more systematic and aligned with the organization's objectives, and can be incorporated into any development methodology based on the UP (Unified Process).